yokawasa / fluent-plugin-azure-loganalytics

Azure Log Analytics output plugin for Fluentd

Home Page: https://rubygems.org/gems/fluent-plugin-azure-loganalytics

License: Apache License 2.0

Ruby 100.00%
azure loganalytics fluentd fluentd-plugin log-analytics ruby

fluent-plugin-azure-loganalytics's Introduction

fluent-plugin-azure-loganalytics

Azure Log Analytics output plugin for Fluentd. The plugin aggregates semi-structured data in real-time and writes the buffered data via HTTPS request to Azure Log Analytics.

fluent-plugin-azure-loganalytics overview

Requirements

fluent-plugin-azure-loganalytics fluentd ruby
>= 0.3.0 >= v0.14.15 >= 2.1
< 0.3.0 >= v0.12.0 >= 1.9

Installation

Installing gems into system Ruby

$ gem install fluent-plugin-azure-loganalytics

Installing gems into td-agent’s Ruby

If you installed td-agent and want to add this custom plugin, use td-agent-gem to install it. td-agent has its own Ruby, so you should install gems into td-agent's Ruby, not the system Ruby:

$ /usr/sbin/td-agent-gem install fluent-plugin-azure-loganalytics

Please also see I installed td-agent and want to add custom plugins. How do I do it?

Configuration

Azure Log Analytics

To start using Log Analytics in the Microsoft Operations Management Suite (OMS), you need to create either an OMS workspace using the OMS website or a Log Analytics workspace using your Azure subscription. Workspaces created either way are functionally equivalent. See the Azure documentation for instructions on creating a workspace.

Once you have the workspace, get the Workspace ID and Shared Key (either the Primary Key or the Secondary Key), which the Log Analytics HTTP Data Collector API needs in order to post data to Log Analytics.
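
For reference, the Data Collector API authenticates each POST with an HMAC-SHA256 signature built from the Workspace ID and Shared Key. Below is a minimal Ruby sketch of that signing scheme as documented by Microsoft; it is illustrative only and is not the plugin's own code (the function and variable names are placeholders):

require 'base64'
require 'openssl'

# Build the Authorization and x-ms-date headers for a Data Collector API POST.
# customer_id is the Workspace ID, shared_key is the Primary/Secondary Key,
# and body is the JSON payload that will be posted to /api/logs.
def build_auth_headers(customer_id, shared_key, body)
  date = Time.now.utc.strftime('%a, %d %b %Y %H:%M:%S GMT')   # RFC 1123 date
  string_to_sign = "POST\n#{body.bytesize}\napplication/json\nx-ms-date:#{date}\n/api/logs"
  hmac = OpenSSL::HMAC.digest('sha256', Base64.decode64(shared_key), string_to_sign)
  signature = Base64.strict_encode64(hmac)
  {
    'Authorization' => "SharedKey #{customer_id}:#{signature}",
    'x-ms-date'     => date,
    'Content-Type'  => 'application/json'
  }
end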

Fluentd - fluent.conf

<match azure-loganalytics.**>
    @type azure-loganalytics
    customer_id CUSTOMER_ID   # Customer ID aka WorkspaceID String
    shared_key KEY_STRING     # The primary or the secondary Connected Sources client authentication key
    log_type EVENT_TYPE_NAME  # The name of the event type. ex) ApacheAccessLog
    endpoint myendpoint
    add_time_field true
    time_field_name mytime
    time_format %s
    localtime true
    add_tag_field true
    tag_field_name mytag
</match>
  • customer_id (required) - Your Operations Management Suite workspace ID

  • shared_key (required) - The primary or the secondary Connected Sources client authentication key

  • log_type (required) - The name of the event type that is being submitted to Log Analytics. log_type only supports alphabetic characters

  • endpoint (optional) - Default: 'ods.opinsights.azure.com'. The service endpoint. You may want to use this parameter for a sovereign cloud, which has a different endpoint from the public cloud

  • time_generated_field (optional) - Default: '' (empty string). The name of the time generated field. Be careful that the value of this field strictly follows the ISO 8601 format (YYYY-MM-DDThh:mm:ssZ). See also this for more details

  • azure_resource_id (optional) - Default:''(empty string) The resource ID of the Azure resource the data should be associated with. This populates the _ResourceId property and allows the data to be included in resource-context queries in Azure Log Analytics (Azure Monitor). If this field isn't specified, the data will not be included in resource-context queries. The format should be like /subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/{resourceProviderNamespace}/{resourceType}/{resourceName}. Please see this for more detail on the resource ID format.

  • add_time_field (optional) - Default: true. This option allows you to insert a time field into each record

  • time_field_name (optional) - Default:time. This is required only when add_time_field is true

  • localtime (optional) - Default: false. The time record is inserted in UTC (Coordinated Universal Time) by default. Set localtime to true to use local time instead. This is valid only when add_time_field is true

  • time_format (optional) - Default: %s. Time format for the inserted time field. The default format is %s, that is, unix epoch time. If you want it to be more human readable, set this to %FT%T%z, for example. This is valid only when add_time_field is true.

  • add_tag_field (optional) - Default: false. This option allows you to insert a tag field into each record

  • tag_field_name (optional) - Default: tag. This is required only when add_tag_field is true

Configuration examples

fluent-plugin-azure-loganalytics adds time and tag attributes to each record if add_time_field and add_tag_field are set to true, respectively. Below are two plugin configurations: the default configuration and a configuration with all options.

(1) Default Configuration (No options)

fluent_1.conf

<source>
    @type tail                         # input plugin
    path /var/log/apache2/access.log   # monitoring file
    pos_file /tmp/fluentd_pos_file     # position file
    format apache                      # format
    tag azure-loganalytics.access      # tag
</source>

<match azure-loganalytics.**>
    @type azure-loganalytics
    customer_id 818f7bbc-8034-4cc3-b97d-f068dd4cd658
    shared_key ppC5500KzCcDsOKwM1yWUvZydCuC3m+ds/2xci0byeQr1G3E0Jkygn1N0Rxx/yVBUrDE2ok3vf4ksCzvBmQXHw==(dummy)
    log_type ApacheAccessLog
</match>

(2) Configuration with All Options

fluent_2.conf

<source>
    @type tail                         # input plugin
    path /var/log/apache2/access.log   # monitoring file
    pos_file /tmp/fluentd_pos_file     # position file
    format apache                      # format
    tag azure-loganalytics.access      # tag
</source>

<match azure-loganalytics.**>
    @type azure-loganalytics
    customer_id 818f7bbc-8034-4cc3-b97d-f068dd4cd658
    shared_key ppC5500KzCcDsOKwM1yWUvZydCuC3m+ds/2xci0byeQr1G3E0Jkygn1N0Rxx/yVBUrDE2ok3vf4ksCzvBmQXHw==(dummy)
    log_type ApacheAccessLog
    azure_resource_id /subscriptions/11111111-1111-1111-1111-111111111111/resourceGroups/otherResourceGroup/providers/Microsoft.Storage/storageAccounts/examplestorage
    add_time_field true
    time_field_name mytime
    time_format %FT%T%z
    localtime true
    add_tag_field true
    tag_field_name mytag
</match>

(3) Configuration with Typecast filter

Add the typecast filter when you want to cast field types. In the example below, the field types of code and size are cast by the typecast filter. fluent_typecast.conf

<source>
    @type tail                         # input plugin
    path /var/log/apache2/access.log   # monitoring file
    pos_file /tmp/fluentd_pos_file     # position file
    format apache                      # format
    tag azure-loganalytics.access      # tag
</source>

<filter **>
    @type typecast
    types host:string,user:string,method:string,path:string,referer:string,agent:string,code:integer,size:integer
</filter>

<match azure-loganalytics.**>
    @type azure-loganalytics
    customer_id 818f7bbc-8034-4cc3-b97d-f068dd4cd658
    shared_key ppC5500KzCcDsOKwM1yWUvZydCuC3m+ds/2xci0byeQr1G3E0Jkygn1N0Rxx/yVBUrDE2ok3vf4ksCzvBmQXHw==(dummy)
    log_type ApacheAccessLog
    add_time_field true
    time_field_name mytime
    time_format %FT%T%z
    localtime true
    add_tag_field true
    tag_field_name mytag
</match>

[note] You need to install fluent-plugin-filter_typecast for the sample configuration above:

gem install fluent-plugin-filter_typecast

(4) Configuration with CSV format as input and specific field type as output

If you want to send logs generated with a known delimiter (such as a comma or semicolon) to Log Analytics, you can use fluentd's csv format together with the keys/types properties. This works with any log; here it is shown with a custom Nginx log. fluent_csv.conf

Suppose your log is formatted as below in /etc/nginx/conf.d/log.conf:

log_format appcustomlog '"$time_iso8601";"$hostname";$bytes_sent;$request_time;$upstream_response_length;$upstream_response_time;$content_length;"$remote_addr";$status;"$host";"$request";"$http_user_agent"';

And this log format is activated through /etc/nginx/conf.d/virtualhost.conf:

server {
	...
	access_log /var/log/nginx/access.log appcustomlog;
	...
}

You can use the following configuration for the source to tail the log file and format it with the proper field types.

<source>
  @type tail
  path /var/log/nginx/access.log
  pos_file /var/log/td-agent/access.log.pos
  tag nginx.accesslog
  format csv
  delimiter ;
  keys time,hostname,bytes_sent,request_time,content_length,remote_addr,status,host,request,http_user_agent
  types time:time,hostname:string,bytes_sent:float,request_time:float,content_length:string,remote_addr:string,status:integer,host:string,request:string,http_user_agent:string
  time_key time
  time_format %FT%T%z
</source>

<match nginx.accesslog>
    @type azure-loganalytics
    customer_id 818f7bbc-8034-4cc3-b97d-f068dd4cd658
    shared_key ppC5500KzCcDsOKwM1yWUvZydCuC3m+ds/2xci0byeQr1G3E0Jkygn1N0Rxx/yVBUrDE2ok3vf4ksCzvBmQXHw==(dummy)
    log_type NginxAcessLog
    time_generated_field time
    time_format %FT%T%z
    add_tag_field true
    tag_field_name mytag
</match>

Sample inputs and expected records

The expected output record for a sample input looks like this:

Sample Input (apache access log)

124.211.152.156 - - [10/Dec/2016:05:28:52 +0000] "GET /test/foo.html HTTP/1.1" 200 323 "-" "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/54.0.2840.99 Safari/537.36"

Output Record

The output record for the sample input can be seen in the Log Analytics portal like this:

fluent-plugin-azure-loganalytics output image

Sample Input (nginx custom access log)

"2017-12-13T11:31:59+00:00";"nginx0001";21381;0.238;20882;0.178;-;"193.192.35.178";200;"mynginx.domain.com";"GET /mysite/picture.jpeg HTTP/1.1";"Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/63.0.3239.84 Safari/537.36"

Output Record

Part of the output record for the sample input can be seen in the Log Analytics portal like this, with fields of type _s (string) or _d (double):

fluent-plugin-azure-loganalytics output image

Tests

Running test code (using System rake)

$ git clone https://github.com/yokawasa/fluent-plugin-azure-loganalytics.git
$ cd fluent-plugin-azure-loganalytics

# edit CONFIG params of test/plugin/test_azure_loganalytics.rb
$ vi test/plugin/test_azure_loganalytics.rb

# run test
$ rake test

Running test code (using td-agent's rake)

$ git clone https://github.com/yokawasa/fluent-plugin-azure-loganalytics.git
$ cd fluent-plugin-azure-loganalytics

# edit CONFIG params of test/plugin/test_azure_loganalytics.rb
$ vi test/plugin/test_azure_loganalytics.rb

# run test 
$ /opt/td-agent/embedded/bin/rake test

Creating package, running and testing locally

$ rake build
$ rake install:local

# running fluentd with your fluent.conf
$ fluentd -c fluent.conf -vv &

# send test apache requests for testing plugin ( only in the case that input source is apache access log )
$ ab -n 5 -c 2 http://localhost/test/foo.html

Data Limits

As described in the Azure Monitor Data Collection API documentation, there are some constraints on the data posted to the Azure Monitor Data Collection API. Here are the relevant constraints:

  • Max payload size: 30 MB
  • Max field value size: 32 KB
  • Max number of characters for each field name: 500

Please note that the plugin checks the max payload size before it posts to the API (>= 0.7.0); however, it does not check the max field value size or the max number of characters for each field name.
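
Because the plugin only checks the payload size, here is a hedged Ruby sketch of how you might pre-split a large batch of records yourself before handing it to the output. The function name and limit constant are illustrative, not the plugin's internals:

require 'json'

# Split an array of record hashes into groups whose serialized JSON arrays
# each stay under the 30 MB Data Collector API payload limit.
MAX_PAYLOAD_BYTES = 30 * 1024 * 1024

def split_into_payloads(records)
  chunks, current, size = [], [], 2                      # 2 bytes for the enclosing "[]"
  records.each do |record|
    record_size = JSON.generate(record).bytesize + 1     # +1 for the separating comma
    if !current.empty? && size + record_size > MAX_PAYLOAD_BYTES
      chunks << current
      current, size = [], 2
    end
    current << record
    size += record_size
  end
  chunks << current unless current.empty?
  chunks
end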

Change log

Links

Contributing

Bug reports and pull requests are welcome on GitHub at https://github.com/yokawasa/fluent-plugin-azure-loganalytics.

Copyright

Copyright (c) 2016- Yoichi Kawasaki
License: Apache License, Version 2.0

fluent-plugin-azure-loganalytics's People

Contributors

catweisun, cosmo0920, smira, yokawasa

fluent-plugin-azure-loganalytics's Issues

Check fields/body size not to exceed data limits of Azure Monitor Data Collection API

Enhancement

There are data limits on the Azure Monitor Data Collection API. An enhancement would be to check field/body sizes so they do not exceed the data limits of the Azure Monitor Data Collection API.

Data Limits

There are some constraints on the data posted to the Azure Monitor Data Collection API:

  • Maximum of 30 MB per post to the Azure Monitor Data Collector API. This is a size limit for a single post. If the data from a single post exceeds 30 MB, you should split the data into smaller chunks and send them concurrently.
  • Maximum of 32 KB for field values. If a field value is greater than 32 KB, the data will be truncated.
  • The recommended maximum number of fields for a given type is 50. This is a practical limit from a usability and search experience perspective.
  • A table in a Log Analytics workspace only supports up to 500 columns (referred to as fields in this article).
  • The maximum number of characters for a column name is 500.
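
A hedged sketch of what the per-field check could look like, using the 32 KB field-value limit from the list above; this is a suggestion, not the plugin's implementation, and the names below are placeholders:

# Truncate oversized string values so each field stays within the 32 KB limit.
MAX_FIELD_VALUE_BYTES = 32 * 1024

def truncate_field_values(record)
  record.transform_values do |value|
    if value.is_a?(String) && value.bytesize > MAX_FIELD_VALUE_BYTES
      value.byteslice(0, MAX_FIELD_VALUE_BYTES)
    else
      value
    end
  end
end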

relevant issues

Fix: CVE-2020-8130 Moderate severity

Moderate severity
Vulnerable versions: <= 12.3.2
Patched version: 12.3.3
There is an OS command injection vulnerability in Ruby Rake before 12.3.3, in Rake::FileList, when supplying a filename that begins with the pipe character |.
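
A hedged remediation, assuming the vulnerable rake comes from your own gem environment and is managed with Bundler (otherwise update the gem directly):

# Gemfile: pin rake at or above the patched release
gem 'rake', '>= 12.3.3'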

Support Azure sovereign cloud

Azure sovereign clouds use different endpoints for the Data Collector API, such as https://.ods.opinsights.azure.cn/api/logs?api-version=2016-04-01 for Azure China. It would be better to support a customized endpoint for sovereign cloud environments.

Thanks.
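
With the endpoint parameter described in the Configuration section, a sovereign-cloud setup could look like the following sketch. The ods.opinsights.azure.cn domain is taken from the URL quoted in this issue; the other values are placeholders:

<match azure-loganalytics.**>
    @type azure-loganalytics
    customer_id CUSTOMER_ID
    shared_key KEY_STRING
    log_type MyEventType
    endpoint ods.opinsights.azure.cn   # Azure China Data Collector endpoint (from this issue)
</match>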

Unable to push logs to Log Analytics

Issue description:
Unable to push logs to Log Analytics using the provided sample config and log line. Not sure how to gather more info to troubleshoot further.

Config used:

<source>
    @type tail                         # input plugin
    path /tmp/access.log   # monitoring file
    pos_file /tmp/fluentd_pos_file     # position file
    format apache                      # format
    tag azure-loganalytics.access      # tag
</source>

<match azure-loganalytics.**>
    @type azure-loganalytics
    customer_id foobar   # Customer ID aka WorkspaceID String
    shared_key foobar     # The primary or the secondary Connected Sources client authentication key
    log_type access_FL  # The name of the event type. ex) ApacheAccessLog
    add_tag_field true
    tag_field_name access_FL_tag
</match>

Steps to reproduce the issue:

  • Start docker container with ruby:2.4.1-jessie image
  • Install td-agent
  • Install fluent-plugin-azure-loganalytics via td-agent
  • Apply above config and restart td-agent
  • Append the log line from README to tailed file:
124.211.152.156 - - [10/Dec/2016:05:28:52 +0000] "GET /test/foo.html HTTP/1.1" 200 323 "-" "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/54.0.2840.99 Safari/537.36"

What's the expected result?
Log gets sent to Azure Log Analytics

What's the actual result?
Output from /var/log/td-agent/td-agent.log:

2017-08-28 07:02:11 +0000 [fatal]: #0 Exception occured in posting to DataCollector API:
2017-08-28 07:02:11 +0000 [warn]: #0 failed to flush the buffer. retry_time=8 next_retry_seconds=2017-08-28 07:02:11 +0000 chunk="557cacc5a24ffe7d37bd7548ca8352bb" error_class=NoMethodError error="undefined method `+@' for #<String:0x007f24e4c48028>"
2017-08-28 07:02:11 +0000 [warn]: #0 suppressed same stacktrace

Does this plugin support json format log_type jsonABT

I am trying to forward k8s logs to Azure Log Analytics with the following settings and it's not working. Can you help?

<source>
    @type tail                         # input plugin
    path /tmp/access.log               # monitoring file
    pos_file /tmp/fluentd_pos_file     # position file
    format json                        # format
    tag azure-loganalytics.access      # tag
</source>

<match azure-loganalytics.**>
    @type azure-loganalytics
    customer_id foobar          # Customer ID aka WorkspaceID String
    shared_key foobar           # The primary or the secondary Connected Sources client authentication key
    log_type jsonON             # The name of the event type. ex) ApacheAccessLog
    add_tag_field true
    tag_field_name json_ON_tag
</match>

Trying to add plugin within the docker container

Per this fluentd documentation, I'm trying to build the official container with your plugin added.

# fluentd/Dockerfile
# inspired by: https://docs.fluentd.org/container-deployment/docker-compose

FROM arm64v8/fluentd:v1.16.0-1.0
USER root
RUN apk --no-cache add \ 
	gcc \
	libffi-dev \
	make \
	ruby-dev && \
	gem install --no-document rake fluent-plugin-elasticsearch fluent-plugin-azure-loganalytics && \
	apk del ruby-dev
USER fluent

However, when I do, I get the following errors. I don't know anything about ruby and find it very difficult to find any clarity about it online despite being someone with programming experience (but not an active programmer).

Any chance you can help?

Here's the relevant log:

#0 20.88 Successfully installed rake-13.0.6
#0 20.88 Successfully installed faraday-net_http-3.0.2
#0 20.88 Successfully installed faraday-2.7.4
#0 20.88 Successfully installed excon-0.99.0
#0 20.88 Successfully installed faraday-excon-2.1.0
#0 20.88 Successfully installed multi_json-1.15.0
#0 20.88 Successfully installed elastic-transport-8.2.1
#0 20.88 Successfully installed elasticsearch-api-8.7.0
#0 20.88 Successfully installed elasticsearch-8.7.0
#0 20.88 Successfully installed fluent-plugin-elasticsearch-5.3.0
#0 20.88 Successfully installed netrc-0.11.0
#0 20.88 Successfully installed mime-types-data-3.2023.0218.1
#0 20.88 Successfully installed mime-types-3.4.1
#0 20.88 Building native extensions. This could take a while...
#0 21.93 ERROR:  Error installing fluent-plugin-azure-loganalytics:
#0 21.93        ERROR: Failed to build gem native extension.
#0 21.93
#0 21.93     current directory: /usr/lib/ruby/gems/3.1.0/gems/unf_ext-0.0.8.2/ext/unf_ext
#0 21.93 /usr/bin/ruby -I /usr/lib/ruby/3.1.0 extconf.rb
#0 21.93 checking for -lstdc++... *** extconf.rb failed ***
#0 21.93 Could not create Makefile due to some reason, probably lack of necessary
#0 21.93 libraries and/or headers.  Check the mkmf.log file for more details.  You may
#0 21.93 need configuration options.
#0 21.93
#0 21.93 Provided configuration options:
#0 21.93        --with-opt-dir
#0 21.93        --without-opt-dir
#0 21.93        --with-opt-include
#0 21.93        --without-opt-include=${opt-dir}/include
#0 21.93        --with-opt-lib
#0 21.93        --without-opt-lib=${opt-dir}/lib
#0 21.93        --with-make-prog
#0 21.93        --without-make-prog
#0 21.93        --srcdir=.
#0 21.93        --curdir
#0 21.93        --ruby=/usr/bin/$(RUBY_BASE_NAME)
#0 21.93        --with-static-libstdc++
#0 21.93        --without-static-libstdc++
#0 21.93        --with-stdc++-dir
#0 21.93        --without-stdc++-dir
#0 21.93        --with-stdc++-include
#0 21.93        --without-stdc++-include=${stdc++-dir}/include
#0 21.93        --with-stdc++-lib
#0 21.93        --without-stdc++-lib=${stdc++-dir}/lib
#0 21.93        --with-stdc++lib
#0 21.93        --without-stdc++lib
#0 21.93 /usr/lib/ruby/3.1.0/mkmf.rb:490:in `try_do': The compiler failed to generate an executable file. (RuntimeError)
#0 21.93 You have to install development tools first.
#0 21.93        from /usr/lib/ruby/3.1.0/mkmf.rb:583:in `try_link0'
#0 21.93        from /usr/lib/ruby/3.1.0/mkmf.rb:601:in `try_link'
#0 21.93        from /usr/lib/ruby/3.1.0/mkmf.rb:819:in `try_func'
#0 21.93        from /usr/lib/ruby/3.1.0/mkmf.rb:1062:in `block in have_library'
#0 21.93        from /usr/lib/ruby/3.1.0/mkmf.rb:989:in `block in checking_for'
#0 21.93        from /usr/lib/ruby/3.1.0/mkmf.rb:354:in `block (2 levels) in postpone'
#0 21.93        from /usr/lib/ruby/3.1.0/mkmf.rb:324:in `open'
#0 21.93        from /usr/lib/ruby/3.1.0/mkmf.rb:354:in `block in postpone'
#0 21.93        from /usr/lib/ruby/3.1.0/mkmf.rb:324:in `open'
#0 21.93        from /usr/lib/ruby/3.1.0/mkmf.rb:350:in `postpone'
#0 21.93        from /usr/lib/ruby/3.1.0/mkmf.rb:988:in `checking_for'
#0 21.93        from /usr/lib/ruby/3.1.0/mkmf.rb:1057:in `have_library'
#0 21.93        from extconf.rb:6:in `<main>'
#0 21.93
#0 21.93 To see why this extension failed to compile, please check the mkmf.log which can be found here:
#0 21.93
#0 21.93   /usr/lib/ruby/gems/3.1.0/extensions/aarch64-linux-musl/3.1.0/unf_ext-0.0.8.2/mkmf.log
#0 21.93
#0 21.93 extconf failed, exit code 1
#0 21.93
#0 21.93 Gem files will remain installed in /usr/lib/ruby/gems/3.1.0/gems/unf_ext-0.0.8.2 for inspection.
#0 21.93 Results logged to /usr/lib/ruby/gems/3.1.0/extensions/aarch64-linux-musl/3.1.0/unf_ext-0.0.8.2/gem_make.out
#0 21.96 10 gems installed
------
failed to solve: process "/bin/sh -c apk add \tgcc \tlibffi-dev \tmake \truby-dev && \tgem install --no-document rake fluent-plugin-elasticsearch fluent-plugin-azure-loganalytics && \tapk del ruby-full" did not complete successfully: exit code: 1

Question: Enable debug logging

Hello,

This is more of a question than an issue. I would like to know how to enable DEBUG level logging in this plugin.

In the fluentd.conf file, at root level, I've put

<system>
    log_level debug
    workers 4
</system>
...

And I can see the DEBUG for the output plugins like the HTTP one, but nothing for the fluent-plugin-azure-loganalytics type.

Would you be able to give me a hint about this?

Thanks!

Fails to Push to Azure After a Period of Time

Hello, I seem to be running into an issue which I believe may be related to the Azure Log Analytics plugin I am using. It seems that after a period of time I start receiving the following errors in my Fluentd logs:

{"time":"FT05:21:05+00:00","level":"fatal","message":"Exception occured in posting to DataCollector API: ","worker_id":0}

At that point it seems that logs are no longer pushed to Azure Log Analytics. I am curious whether you have seen this issue before and know of a fix. I appreciate any help you can provide. At this point I simply have to restart fluentd constantly to get it pushing logs to Azure again.

fluentd 0.14.21

URI encoding error with 0.4.1

[fatal]: Exception occured in posting to DataCollector API: 'bad URI(is not URI?):

Exception when pushing JSON to Log Analytics.

Error configuring buffer with copy output

Hello,

Thanks for your work on integrating fluentd with Azure Insights. I'm trying to configure the output to go to Azure Insights (while adjusting the buffer settings) and to stdout (for debugging purposes), but shipping to Azure Insights fails with this configuration:

<match system.**>
  @type copy

  <store>
    @type azure-loganalytics

    customer_id XXX
    shared_key XXX
    log_type onedrive

    <buffer time>
      timekey 1m
      timekey_wait 1m
    </buffer>
  </store>
  <store>
    @type stdout
  </store>
</match>

The error I'm getting is:

2019-08-29 15:31:02 +0000 [error]: #0 fluent/log.rb:362:error: error on output thread error_class=NoMethodError error="undefined method `-' for nil:NilClass"
  2019-08-29 15:31:02 +0000 [error]: #0 plugin_helper/thread.rb:78:block in thread_create: /var/lib/gems/2.3.0/gems/fluentd-1.7.0/lib/fluent/plugin/buffer.rb:460:in `block in dequeue_chunk'
  2019-08-29 15:31:02 +0000 [error]: #0 plugin_helper/thread.rb:78:block in thread_create: /usr/lib/ruby/2.3.0/monitor.rb:214:in `mon_synchronize'
  2019-08-29 15:31:02 +0000 [error]: #0 plugin_helper/thread.rb:78:block in thread_create: /var/lib/gems/2.3.0/gems/fluentd-1.7.0/lib/fluent/plugin/buffer.rb:453:in `dequeue_chunk'
  2019-08-29 15:31:02 +0000 [error]: #0 plugin_helper/thread.rb:78:block in thread_create: /var/lib/gems/2.3.0/gems/fluentd-1.7.0/lib/fluent/plugin/output.rb:1086:in `try_flush'
  2019-08-29 15:31:02 +0000 [error]: #0 plugin_helper/thread.rb:78:block in thread_create: /var/lib/gems/2.3.0/gems/fluentd-1.7.0/lib/fluent/plugin/output.rb:1428:in `flush_thread_run'
  2019-08-29 15:31:02 +0000 [error]: #0 plugin_helper/thread.rb:78:block in thread_create: /var/lib/gems/2.3.0/gems/fluentd-1.7.0/lib/fluent/plugin/output.rb:458:in `block (2 levels) in start'
  2019-08-29 15:31:02 +0000 [error]: #0 plugin_helper/thread.rb:78:block in thread_create: /var/lib/gems/2.3.0/gems/fluentd-1.7.0/lib/fluent/plugin_helper/thread.rb:78:in `block in thread_create'
2019-08-29 15:31:02 +0000 [warn]: #0 fluent/log.rb:342:warn: thread exited by unexpected error plugin=Fluent::Plugin::AzureLogAnalyticsOutput title=:flush_thread_0 error_class=NoMethodError error="undefined method `-' for nil:NilClass"
2019-08-29 15:31:02 +0000 [error]: #0 fluent/log.rb:362:error: unexpected error error_class=NoMethodError error="undefined method `-' for nil:NilClass"
  2019-08-29 15:31:02 +0000 [error]: #0 fluent/supervisor.rb:551:block in run_worker: /var/lib/gems/2.3.0/gems/fluentd-1.7.0/lib/fluent/plugin/buffer.rb:460:in `block in dequeue_chunk'
  2019-08-29 15:31:02 +0000 [error]: #0 fluent/supervisor.rb:551:block in run_worker: /usr/lib/ruby/2.3.0/monitor.rb:214:in `mon_synchronize'
  2019-08-29 15:31:02 +0000 [error]: #0 fluent/supervisor.rb:551:block in run_worker: /var/lib/gems/2.3.0/gems/fluentd-1.7.0/lib/fluent/plugin/buffer.rb:453:in `dequeue_chunk'
  2019-08-29 15:31:02 +0000 [error]: #0 fluent/supervisor.rb:551:block in run_worker: /var/lib/gems/2.3.0/gems/fluentd-1.7.0/lib/fluent/plugin/output.rb:1086:in `try_flush'
  2019-08-29 15:31:02 +0000 [error]: #0 fluent/supervisor.rb:551:block in run_worker: /var/lib/gems/2.3.0/gems/fluentd-1.7.0/lib/fluent/plugin/output.rb:1428:in `flush_thread_run'
  2019-08-29 15:31:02 +0000 [error]: #0 fluent/supervisor.rb:551:block in run_worker: /var/lib/gems/2.3.0/gems/fluentd-1.7.0/lib/fluent/plugin/output.rb:458:in `block (2 levels) in start'
  2019-08-29 15:31:02 +0000 [error]: #0 fluent/supervisor.rb:551:block in run_worker: /var/lib/gems/2.3.0/gems/fluentd-1.7.0/lib/fluent/plugin_helper/thread.rb:78:in `block in thread_create'
2019-08-29 15:31:02 +0000 [error]: #0 fluent/log.rb:362:error: unexpected error error_class=NoMethodError error="undefined method `-' for nil:NilClass"
  2019-08-29 15:31:02 +0000 [error]: #0 fluent/supervisor.rb:732:main_process: suppressed same stacktrace

I'm quite convinced it's my configuration, as I can see logs shipping without the copy plugin ...

Exception when sending post data to Log Analytics

When using fluentd v1.11-1 and the latest plugin version (0.7.0), we are facing an issue when sending data to Log Analytics for ingestion.

Basically, we are sending a JSON like this one:

{
   "anonymous_uid":"45645645645645",
   "event_date":"2021-05-11T04:00:00",
   "country_code":"ZZ",
   "country_name":"Unknown",
   "web_privacy":"PRIVATE",
   "info_level":"PERSONAL",
   "timezone":"america/new_york"
}

And we are getting this error from this line https://github.com/yokawasa/fluent-plugin-azure-loganalytics/blob/master/lib/fluent/plugin/out_azure-loganalytics.rb#L98:

2021-05-11 08:39:32 +0000 [fatal]: #3 [loganalyticis_custom_endpoint] Exception occured in posting to DataCollector API: 'undefined method `response' for #<SocketError:0x000055b025a4ee28>
Did you mean?  respond_to?', data=> [{"anonymous_uid":"45645645645645", "event_date":"2021-05-11T04:00:00", "country_code":"ZZ", "country_name":"Unknown", "web_privacy":"PRIVATE", "info_level":"PERSONAL", "timezone":"america/new_york"}]

This error happens from time to time, not on every request.

We are also seeing that when an HTTP 503 status code from Log Analytics occurs, instead of being handled at line 100 (https://github.com/yokawasa/fluent-plugin-azure-loganalytics/blob/master/lib/fluent/plugin/out_azure-loganalytics.rb#L100), it is caught by the general exception handler at line 104 (https://github.com/yokawasa/fluent-plugin-azure-loganalytics/blob/master/lib/fluent/plugin/out_azure-loganalytics.rb#L104) as follows:

Exception occured in posting to DataCollector API: '503 Service Unavailable'

Could you take a look?

Thanks!

'undefined method `code' for nil:NilClass'

Fluentd version 1.15.2 (c32842297ed2c306f1b841a8f6e55bdd0f1cb27f)

Fluentd config:

<source>
@type tail
path /log
pos_file /log.pos
read_from_head true
tag azure
<parse>
  @type apache2
</parse>
</source>

<match azure>
    @type azure-loganalytics
    customer_id xxxx
    shared_key xxxx
    log_type  ApacheAccessLog
</match>

The config is based on the example config in this plugin's README, and the log is the example log provided by Fluentd in the apache2 parser section. This is to simulate the scenario of sending Apache logs to Azure Log Analytics.
Sample log:
192.168.0.1 - - [28/Feb/2013:12:00:00 +0900] "GET / HTTP/1.1" 200 777 "-" "Opera/12.0"
However, I got the error listed below; it seems to be a Ruby error. Did I configure it wrong? I tried using Fluent Bit to send a similar log and Azure received it.

#0 Exception occured in posting to DataCollector API: 'undefined method `code' for nil:NilClass', data=>[{"host":"192.168.0.1","user":null,"method":"GET","path":"/","code":200,"size":777,"referer":null,"agent":"Opera/12.0","time":"1362020400"},{"host":"192.168.0.1","user":null,"method":"GET","path":"/","code":200,"size":777,"referer":null,"agent":"Opera/12.0","time":"1362020400"},{"host":"192.168.0.1","user":null,"method":"GET","path":"/","code":200,"size":777,"referer":null,"agent":"Opera/12.0","time":"1362020400"},{"host":"192.168.0.1","user":null,"method":"GET","path":"/","code":200,"size":777,"referer":null,"agent":"Opera/12.0","time":"1362020400"}]

time field added as string

If I add a time field to the output:

add_time_field true
time_field_name xtime
time_format %Y%m%d-%H:%M:%S
localtime true

it's added as a string in Log Analytics. How can I change this to type timestamp?
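
One hedged approach, mirroring configuration example (4) above: emit the time field in ISO 8601 and point time_generated_field at it, so Log Analytics treats the value as the record's generated timestamp rather than a custom string column. The field names below are placeholders:

<match azure-loganalytics.**>
    @type azure-loganalytics
    customer_id CUSTOMER_ID
    shared_key KEY_STRING
    log_type MyEventType
    add_time_field true
    time_field_name xtime
    time_format %FT%T%z          # ISO 8601 so the value can be parsed as a datetime
    time_generated_field xtime   # use this field as the time generated field
</match>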
