
fluent-plugin-elasticsearch's Issues

Sending logs silently dies

I have following configuration:

<match apache.**>
  type elasticsearch
  logstash_format true
  host my-es-host.com
  port 80
  index_name fluentd
  type_name fluentd
  logstash_prefix apache
  buffer_chunk_limit 10m
  buffer_queue_limit 64
  flush_interval 5s
</match>

For some reason it sometimes just stops sending any apache logs to my ES cluster, without any errors or anything. After I restart fluentd it works fine again, but I expect it to fail again eventually.

Is there any way to debug such situations? Any ideas as to why it happens at all?
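One way to make such stalls visible (a sketch; assumes a fluentd version that ships the in_monitor_agent plugin) is to expose buffer and retry counters over HTTP:

```
<source>
  type monitor_agent
  bind 0.0.0.0
  port 24220
</source>
```

curl http://localhost:24220/api/plugins.json then reports buffer_queue_length, buffer_total_queued_size, and retry_count per plugin, which shows whether chunks are silently piling up. Running td-agent with -vv also raises log verbosity.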

Specify ElasticSearch index template

I'm using fluentd with the in_syslog plugin and elasticsearch plugin to get syslog into elasticsearch, with a kibana frontend.

One of the problems I'm having, though, is that the fields are analyzed in elasticsearch, so when I add a terms dashboard in kibana to give me, say, the top-10 hostnames, hostnames with dashes in them are broken up. So mysql-test-01 comes across as three hostnames: mysql, test, and 01.

Logstash got around this issue by making a "raw" version of several fields that is set to not-analyzed upon creation, so that you can run your dashboards against that instead.

More information here: http://www.elasticsearch.org/blog/logstash-1-3-1-released/

With syslog messages going into ES with this plugin, I'm finding that I'd like to have a "raw" or non-analyzed host (hostname) field and ident field (gives me the application). Unfortunately right now both of those fields are analyzed and it's messing with our dashboards.
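Until the plugin supports templates directly, one workaround (a sketch against the ES 0.90/1.x API; the template name fluentd_raw and the field list are placeholders) is to register an index template by hand that adds a not_analyzed raw sub-field, the same trick logstash 1.3.1 uses:

```
curl -XPUT 'http://localhost:9200/_template/fluentd_raw' -d '{
  "template": "logstash-*",
  "mappings": {
    "_default_": {
      "properties": {
        "host": { "type": "multi_field", "fields": {
          "host": { "type": "string", "index": "analyzed" },
          "raw":  { "type": "string", "index": "not_analyzed" } } },
        "ident": { "type": "multi_field", "fields": {
          "ident": { "type": "string", "index": "analyzed" },
          "raw":   { "type": "string", "index": "not_analyzed" } } }
      }
    }
  }
}'
```

The template only affects indexes created after it is registered, so dashboards pick up the raw fields from the next daily index onward.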

Creation time of new indexes is wrong

Hi,

I use td-agent to receive logs from nginx and then forward them to ES. Here is my td-agent.conf:

<source>
  type forward
  port 24224
  bind 0.0.0.0
</source>

<match access>
  type elasticsearch
  host 10.92.0.4
  port 9200
  logstash_format true
</match>

And here are my indexes:

#ls -al 
drwxr-xr-x 8 elasticsearch elasticsearch 4096 Oct  6 06:58 logstash-2013.10.06
drwxr-xr-x 8 elasticsearch elasticsearch 4096 Oct  7 06:58 logstash-2013.10.07
drwxr-xr-x 8 elasticsearch elasticsearch 4096 Oct  8 06:59 logstash-2013.10.08
drwxr-xr-x 8 elasticsearch elasticsearch 4096 Oct  9 06:58 logstash-2013.10.09
drwxr-xr-x 8 elasticsearch elasticsearch 4096 Oct 10 06:58 logstash-2013.10.10

Everything works fine, but the creation time of each new index is not correct. I think it should be at 00:00 or 00:01 (the beginning of a day), but these indexes were created at around 06:58 or 06:59.

My timezone is GMT+07.

The nginx servers have lots of users visiting at all times, so I am sure there are always logging events at 00:00 and later.

How can I fix this? I've read the docs but found nothing about it.

Hope you can show me the correct way.

BRs.
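The 06:58-06:59 timestamps are consistent with UTC-based index naming: with logstash_format the plugin derives the index date from the event time in UTC, so in GMT+7 the index date rolls over at 07:00 local time. A small Ruby sketch of the arithmetic (the times are illustrative):

```ruby
require 'time'

# An event logged at 06:59 local time in GMT+7 still belongs to the
# previous day's index when the date is taken in UTC.
event_time_local = Time.new(2013, 10, 7, 6, 59, 0, "+07:00")
utc_index   = "logstash-" + event_time_local.getutc.strftime("%Y.%m.%d")
local_index = "logstash-" + event_time_local.strftime("%Y.%m.%d")
utc_index   # => "logstash-2013.10.06"
local_index # => "logstash-2013.10.07"
```

Some plugin versions expose a utc_index option to switch index naming to local time instead.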

Error with Patron gem install

Hi,

When using td-agent, installing fluent-plugin-elasticsearch will fail:

# /usr/lib64/fluent/ruby/bin/fluent-gem install fluent-plugin-elasticsearch --no-ri --no-rdoc
Building native extensions.  This could take a while...
ERROR:  Error installing fluent-plugin-elasticsearch:
        ERROR: Failed to build gem native extension.

        /usr/lib64/fluent/ruby/bin/ruby extconf.rb
checking for curl-config... yes
checking for rb_thread_blocking_region()... yes
creating Makefile
......
session_ext.c:727: error: for each function it appears in.)
session_ext.c:730: error: ‘CURLPROXY_SOCKS4A’ undeclared (first use in this function)
session_ext.c:731: error: ‘CURLPROXY_SOCKS5_HOSTNAME’ undeclared (first use in this function)
make: *** [session_ext.o] Error 1

Gem files will remain installed in /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/patron-0.4.18 for inspection.
Results logged to /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/patron-0.4.18/ext/patron/gem_make.out

I discovered the workaround from here: https://groups.google.com/forum/#!topic/heroku/bjO4EuBg3Y8

Following that, this works:

# /usr/lib64/fluent/ruby/bin/fluent-gem install patron -v0.4.9 --no-ri --no-rdoc
Fetching: patron-0.4.9.gem (100%)
Building native extensions.  This could take a while...
Successfully installed patron-0.4.9
1 gem installed

# /usr/lib64/fluent/ruby/bin/fluent-gem install fluent-plugin-elasticsearch --no-ri --no-rdoc
Fetching: elasticsearch-transport-0.4.11.gem (100%)
Fetching: elasticsearch-api-0.4.11.gem (100%)
Fetching: elasticsearch-0.4.11.gem (100%)
Fetching: fluent-plugin-elasticsearch-0.3.0.gem (100%)
Successfully installed elasticsearch-transport-0.4.11
Successfully installed elasticsearch-api-0.4.11
Successfully installed elasticsearch-0.4.11
Successfully installed fluent-plugin-elasticsearch-0.3.0
4 gems installed

Something needs to be fixed with the patron gem, or a version needs to be pinned.
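Pinning could live in the gemspec, e.g. something like spec.add_dependency 'patron', '0.4.9' (a suggested constraint, not the plugin's actual gemspec). A sketch of how an exact pin behaves:

```ruby
require 'rubygems'

# Hypothetical pin: an exact requirement keeps the broken 0.4.18 out
# while still allowing the version that builds against older libcurl.
pin = Gem::Requirement.new("= 0.4.9")
pin.satisfied_by?(Gem::Version.new("0.4.9"))   # => true
pin.satisfied_by?(Gem::Version.new("0.4.18"))  # => false
```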

Impossible to install on CentOS7

Trying to install the plugin on a CentOS7 box, I end up with the following trace

[root@td ~]# cat /etc/redhat-release
CentOS Linux release 7.0.1406 (Core)
[root@td ~]# uname -a
Linux td 3.10.0-123.el7.x86_64 #1 SMP Mon Jun 30 12:09:22 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux
[root@td ~]# ruby --version
ruby 2.0.0p353 (2013-11-22) [x86_64-linux]
[root@td ~]# gem --version
2.0.14
[root@td ~]# rpm -qa | grep -e curl -e openssl
openssl-libs-1.0.1e-34.el7_0.3.x86_64
openssl-devel-1.0.1e-34.el7_0.3.x86_64
libcurl-7.29.0-19.el7.x86_64
python-pycurl-7.19.0-17.el7.x86_64
openssl-1.0.1e-34.el7_0.3.x86_64
curl-7.29.0-19.el7.x86_64
libcurl-devel-7.29.0-19.el7.x86_64
[root@td ~]# /opt/td-agent/embedded/bin/fluent-gem install fluent-plugin-elasticsearch
Building native extensions.  This could take a while...
ERROR:  Error installing fluent-plugin-elasticsearch:
        ERROR: Failed to build gem native extension.

    /opt/td-agent/embedded/bin/ruby extconf.rb
checking for curl-config... yes
checking for rb_thread_blocking_region()... *** extconf.rb failed ***
Could not create Makefile due to some reason, probably lack of necessary
libraries and/or headers.  Check the mkmf.log file for more details.  You may
need configuration options.

Provided configuration options:
        --with-opt-dir
        --with-opt-include
        --without-opt-include=${opt-dir}/include
        --with-opt-lib
        --without-opt-lib=${opt-dir}/lib
        --with-make-prog
        --without-make-prog
        --srcdir=.
        --curdir
        --ruby=/opt/td-agent/embedded/bin/ruby
        --with-curl-dir
        --without-curl-dir
        --with-curl-include
        --without-curl-include=${curl-dir}/include
        --with-curl-lib
        --without-curl-lib=${curl-dir}/lib
/opt/td-agent/embedded/lib/ruby/2.1.0/mkmf.rb:456:in `try_do': The compiler failed to generate an executable file. (RuntimeError)
You have to install development tools first.
       from /opt/td-agent/embedded/lib/ruby/2.1.0/mkmf.rb:541:in `try_link0'
        from /opt/td-agent/embedded/lib/ruby/2.1.0/mkmf.rb:556:in `try_link'
       from /opt/td-agent/embedded/lib/ruby/2.1.0/mkmf.rb:742:in `try_func'
        from /opt/td-agent/embedded/lib/ruby/2.1.0/mkmf.rb:1027:in `block in have_func'
       from /opt/td-agent/embedded/lib/ruby/2.1.0/mkmf.rb:918:in `block in checking_for'
        from /opt/td-agent/embedded/lib/ruby/2.1.0/mkmf.rb:351:in `block (2 levels) in postpone'
       from /opt/td-agent/embedded/lib/ruby/2.1.0/mkmf.rb:321:in `open'
        from /opt/td-agent/embedded/lib/ruby/2.1.0/mkmf.rb:351:in `block in postpone'
       from /opt/td-agent/embedded/lib/ruby/2.1.0/mkmf.rb:321:in `open'
        from /opt/td-agent/embedded/lib/ruby/2.1.0/mkmf.rb:347:in `postpone'
       from /opt/td-agent/embedded/lib/ruby/2.1.0/mkmf.rb:917:in `checking_for'
        from /opt/td-agent/embedded/lib/ruby/2.1.0/mkmf.rb:1026:in `have_func'
       from extconf.rb:47:in `<main>'

extconf failed, exit code 1

Gem files will remain installed in /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/patron-0.4.18 for inspection.
Results logged to /opt/td-agent/embedded/lib/ruby/gems/2.1.0/extensions/x86_64-linux/2.1.0/patron-0.4.18/gem_make.out
[root@td ~]#
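The mkmf message ("The compiler failed to generate an executable file ... You have to install development tools first") points at a missing C toolchain rather than at the plugin itself. On CentOS 7 the usual fix (commands assumed from the error, not taken from this report) would be:

```
yum -y groupinstall "Development Tools"
yum -y install libcurl-devel
/opt/td-agent/embedded/bin/fluent-gem install fluent-plugin-elasticsearch
```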

type_name based on tag_key

I have a match pattern: traffic.**, so it matches traffic.logs and traffic.metrics

It would be nice to set the type_name to logs or metrics ....
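One way this could look in Ruby (a sketch of the requested behavior, not current plugin code; type_from_tag is a hypothetical helper):

```ruby
# Derive a type name from the last tag component, assuming tags like
# "traffic.logs" and "traffic.metrics"; fall back to a default type.
def type_from_tag(tag, fallback = "fluentd")
  part = tag.split(".").last
  part.nil? || part.empty? ? fallback : part
end

type_from_tag("traffic.logs")    # => "logs"
type_from_tag("traffic.metrics") # => "metrics"
```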

ClassCastException

So I am using the dstat plugin to get system data, then the elasticsearch plugin to put it into elasticsearch, and Kibana to visualize it. The issue I am having is that when I try to plot the value of dstat.total cpu usage.hiq, or any of the dstat.total.* fields, they come up as strings. I checked my _mappings and they all came out as string.

{
  "logstash-2015.02.05" : {
    "mappings" : {
      "fluentd" : {
        "properties" : {
          "@timestamp" : {
            "type" : "date",
            "format" : "dateOptionalTime"
          },
          "dstat" : {
            "properties" : {
              "dsk/total" : {
                "properties" : {
                  "read" : {
                    "type" : "string"
                  },
                  "writ" : {
                    "type" : "string"
                  }
                }
              },
              "net/total" : {
                "properties" : {
                  "recv" : {
                    "type" : "string"
                  },
                  "send" : {
                    "type" : "string"
                  }
                }
              },
              "total cpu usage" : {
                "properties" : {
                  "hiq" : {
                    "type" : "string"
                  },
                  "idl" : {
                    "type" : "string"
                  },
                  "siq" : {
                    "type" : "string"
                  },
                  "sys" : {
                    "type" : "string"
                  },
                  "usr" : {
                    "type" : "string"
                  },
                  "wai" : {
                    "type" : "string"
                  }
                }
              }
            }
          },
          "hostname" : {
            "type" : "string"
          }
        }
      }
    }
  }
}

The error I am getting from Elasticsearch is:

ClassCastException[org.elasticsearch.index.fielddata.plain.PagedBytesIndexFieldData cannot be cast to org.elasticsearch.index.fielddata.IndexNumericFieldData]

How can I define custom mappings?
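Custom mappings aside, another angle (a sketch, not plugin functionality; numify is a hypothetical helper and the field names mirror the dstat record above) is to cast the numeric strings before they reach Elasticsearch, so that dynamic mapping infers numeric types for a fresh index. Note that already-created indexes keep their string mappings:

```ruby
# Recursively convert numeric-looking string values to floats so that
# Elasticsearch's dynamic mapping picks a numeric type.
def numify(value)
  case value
  when Hash   then value.each { |k, v| value[k] = numify(v) }
  when String then value =~ /\A-?\d+(\.\d+)?\z/ ? value.to_f : value
  else value
  end
end

record = {
  "dstat" => {
    "total cpu usage" => { "hiq" => "0.0", "usr" => "3.5" },
    "net/total" => { "recv" => "512.0" }
  },
  "hostname" => "web-01"
}
numify(record)
record["dstat"]["total cpu usage"]["usr"] # => 3.5
record["hostname"]                        # => "web-01" (unchanged)
```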

an error in td-agent.log i don't understand related to elasticsearch plugin

2013-08-05 14:19:01 +0000 [warn]: temporarily failed to flush the buffer. next_retry=2013-08-05 14:20:58 +0000 error_class="NoMethodError" error="undefined method `merge!' for #<String:0x007fd766382008>" instance=70281580213200
  2013-08-05 14:19:01 +0000 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.1.2/lib/fluent/plugin/out_elasticsearch.rb:43:in `block in write'
  2013-08-05 14:19:01 +0000 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.36/lib/fluent/plugin/buf_memory.rb:62:in `feed_each'
  2013-08-05 14:19:01 +0000 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.36/lib/fluent/plugin/buf_memory.rb:62:in `msgpack_each'
  2013-08-05 14:19:01 +0000 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.1.2/lib/fluent/plugin/out_elasticsearch.rb:41:in `write'
  2013-08-05 14:19:01 +0000 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.36/lib/fluent/buffer.rb:288:in `write_chunk'
  2013-08-05 14:19:01 +0000 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.36/lib/fluent/buffer.rb:272:in `pop'
  2013-08-05 14:19:01 +0000 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.36/lib/fluent/output.rb:292:in `try_flush'
  2013-08-05 14:19:01 +0000 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.36/lib/fluent/output.rb:119:in `run'

Part of my config (i obfuscated the hostname) :

    <store>
      type elasticsearch
      host ec2-*-*-*-*.compute-1.amazonaws.com
      port 9200
      index_name fluentd
      type_name fluentd
      logstash_format true
    </store>

PS: my config works without the elasticsearch store.

Pushing data to ES in integer format

I am trying to use kibana to search the logs, but if a field is stored as a string I cannot do range comparisons. Is there a way to store the content as integers whenever possible?
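One possibility (a sketch; availability depends on the fluentd version, and the path, format, and field names are placeholders) is to cast fields at parse time with the tail input's types parameter, so numeric values arrive at the output plugin as integers rather than strings:

```
<source>
  type tail
  path /var/log/app/access.log
  format /(?<code>\d+) (?<size>\d+)/
  types code:integer,size:integer
  tag app.access
</source>
```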

custom timestamp format - milliseconds are zeroed

Hi,
I'm trying to insert a custom @timestamp value into records, but I need milliseconds, so I'm using this format: "@timestamp":"2014-05-16T10:43:52.207-04:00". In Kibana, however, it's shown without milliseconds (just as 10:43:52.000).

Checked mappings:

          "@timestamp" : {
            "type" : "date",
            "format" : "dateOptionalTime"
          },

so instead of dateOptionalTime there should perhaps be dateTime (as in http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/mapping-date-format.html).

any idea?
Thanks
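Two things are worth checking here: the mapping format, and the fact that fluentd event time is second-resolution in these versions, so the milliseconds must travel inside the record itself rather than as the event time. Producing a millisecond-precision ISO 8601 stamp in Ruby (a sketch; the time shown is the UTC equivalent of the example above):

```ruby
require 'time'

# %L formats milliseconds; a Rational seconds value avoids float rounding.
t = Time.utc(2014, 5, 16, 14, 43, Rational(52_207, 1000))
stamp = t.strftime("%Y-%m-%dT%H:%M:%S.%L%:z")
stamp # => "2014-05-16T14:43:52.207+00:00"
```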

setup on local host

I am setting up fluentd and elasticsearch on a local VM in order to try the fluentd and ES stack.

OS: centos (recent)

[root@localhost data]# cat /etc/redhat-release 
CentOS release 6.5 (Final)

I have elasticsearch up and running on localhost (I used it with logstash with no issues):

[root@localhost data]# curl -X GET http://localhost:9200/
{
  "status" : 200,
  "name" : "Simon Williams",
  "version" : {
    "number" : "1.2.1",
    "build_hash" : "6c95b759f9e7ef0f8e17f77d850da43ce8a4b364",
    "build_timestamp" : "2014-06-03T15:02:52Z",
    "build_snapshot" : false,
    "lucene_version" : "4.8"
  },
  "tagline" : "You Know, for Search"
}

I have installed td-agent following the installation notes from the fluentd website.
I am using this configuration file:

<source>
  type tail
  path /tmp/data/log
  pos_file /tmp/data/log.pos
  format /^(?<host>[^ ]*) [^ ]* (?<user>[^ ]*) \[(?<time>[^\]]*)\] "(?<method>\S+)(?: +(?<path>[^ ]*) +\S*)?" (?<code>[^ ]*) (?<size>[^ ]*)(?: "(?<referer>[^\"]*)" "(?<agent>[^\"]*)")?/
  time_format %d/%b/%Y:%H:%M:%S %z
  tag front.nginx.access
</source>

<match front.nginx.access>
  type elasticsearch
  host localhost
  port 9200
  index_name fluentd
  type_name nginx
  include_tag_key

  # buffering                                                                                                                                                
  buffer_type file
  buffer_path /tmp/fluentd/buffer/
  flush_interval 10s
  buffer_chunk_limit 16m
  buffer_queue_limit 4096
  retry_wait 15s
</match>

Here is the start-up log:

2014-07-24 13:39:58 +0200 [info]: starting fluentd-0.10.50
2014-07-24 13:39:58 +0200 [info]: reading config file path="/etc/td-agent/td-agent.conf"
2014-07-24 13:39:58 +0200 [info]: gem 'fluent-mixin-config-placeholders' version '0.2.4'
2014-07-24 13:39:58 +0200 [info]: gem 'fluent-mixin-plaintextformatter' version '0.2.6'
2014-07-24 13:39:58 +0200 [info]: gem 'fluent-plugin-elasticsearch' version '0.3.1'
2014-07-24 13:39:58 +0200 [info]: gem 'fluent-plugin-flume' version '0.1.1'
2014-07-24 13:39:58 +0200 [info]: gem 'fluent-plugin-mongo' version '0.7.3'
2014-07-24 13:39:58 +0200 [info]: gem 'fluent-plugin-parser' version '0.3.4'
2014-07-24 13:39:58 +0200 [info]: gem 'fluent-plugin-rewrite-tag-filter' version '1.4.1'
2014-07-24 13:39:58 +0200 [info]: gem 'fluent-plugin-s3' version '0.4.0'
2014-07-24 13:39:58 +0200 [info]: gem 'fluent-plugin-scribe' version '0.10.10'
2014-07-24 13:39:58 +0200 [info]: gem 'fluent-plugin-td' version '0.10.20'
2014-07-24 13:39:58 +0200 [info]: gem 'fluent-plugin-td-monitoring' version '0.1.2'
2014-07-24 13:39:58 +0200 [info]: gem 'fluent-plugin-webhdfs' version '0.2.2'
2014-07-24 13:39:58 +0200 [info]: gem 'fluentd' version '0.10.50'
2014-07-24 13:39:58 +0200 [info]: using configuration file: <ROOT>
  <source>
    type tail
    path /tmp/data/log
    pos_file /tmp/data/log.pos
    format /^(?<host>[^ ]*) [^ ]* (?<user>[^ ]*) \[(?<time>[^\]]*)\] "(?<method>\S+)(?: +(?<path>[^ ]*) +\S*)?" (?<code>[^ ]*) (?<size>[^ ]*)(?: "(?<referer>[^\"]*)" "(?<agent>[^\"]*)")?/
    time_format %d/%b/%Y:%H:%M:%S %z
    tag front.nginx.access
  </source>
  <match front.nginx.access>
    type elasticsearch
    host localhost
    port 9200
    index_name fluentd
    type_name nginx
    include_tag_key 
    buffer_type file
    buffer_path /tmp/fluentd/buffer/
    flush_interval 10s
    buffer_chunk_limit 16m
    buffer_queue_limit 4096
    retry_wait 15s
  </match>
</ROOT>
2014-07-24 13:39:58 +0200 [info]: adding source type="tail"
2014-07-24 13:39:58 +0200 [info]: adding match pattern="front.nginx.access" type="elasticsearch"
2014-07-24 13:39:58 +0200 [info]: following tail of /tmp/data/log

I get that error:

2014-07-24 13:40:00 +0200 [warn]: temporarily failed to flush the buffer. next_retry=2014-07-24 13:40:13 +0200 error_class="Elasticsearch::Transport::Transport::Errors::ServiceUnavailable" error="[503] " instance=70247139359260
  2014-07-24 13:40:00 +0200 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-transport-0.4.11/lib/elasticsearch/transport/transport/base.rb:132:in `__raise_transport_error'
  2014-07-24 13:40:00 +0200 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-transport-0.4.11/lib/elasticsearch/transport/transport/base.rb:227:in `perform_request'
  2014-07-24 13:40:00 +0200 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-transport-0.4.11/lib/elasticsearch/transport/transport/http/faraday.rb:20:in `perform_request'
  2014-07-24 13:40:00 +0200 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-transport-0.4.11/lib/elasticsearch/transport/client.rb:92:in `perform_request'
  2014-07-24 13:40:00 +0200 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-api-0.4.11/lib/elasticsearch/api/actions/ping.rb:19:in `ping'
  2014-07-24 13:40:00 +0200 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.3.1/lib/fluent/plugin/out_elasticsearch.rb:46:in `client'
  2014-07-24 13:40:00 +0200 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.3.1/lib/fluent/plugin/out_elasticsearch.rb:103:in `send'
  2014-07-24 13:40:00 +0200 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.3.1/lib/fluent/plugin/out_elasticsearch.rb:98:in `write'
  2014-07-24 13:40:00 +0200 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.50/lib/fluent/buffer.rb:296:in `write_chunk'
  2014-07-24 13:40:00 +0200 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.50/lib/fluent/buffer.rb:276:in `pop'
  2014-07-24 13:40:00 +0200 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.50/lib/fluent/output.rb:310:in `try_flush'
  2014-07-24 13:40:00 +0200 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.50/lib/fluent/output.rb:132:in `run'

Running tcpdump on port 9200, I get nothing:

tcpdump -x -X -i any 'port 9200'

mapping

I have loaded a custom mapping template in my elasticsearch for indexes named logstash*.

I have mapped the field pow_1 as float.

{
  "logstash_per_index": {
    "order": 0,
    "template": "logstash*",
    "settings": {
      "index.store.compress.stored": "true",
      "index.cache.field.type": "soft"
    },
    "mappings": {
      "_default_": {
        "_source": {
          "compress": true
        },
        "properties": {
          "@fields": {
            "dynamic": true,
            "path": "full",
            "properties": {
              "errnum": {
                "type": "integer"
              }
            },
            "type": "object"
          },
          "@pow_1": {
            "type": "float"
          },
          "@message": {
            "index": "analyzed",
            "type": "string"
          },
          "@source": {
            "index": "not_analyzed",
            "type": "string"
          },
          "@tags": {
            "index": "not_analyzed",
            "type": "string"
          },
          "@type": {
            "index": "not_analyzed",
            "type": "string"
          },
          "@source_host": {
            "index": "not_analyzed",
            "type": "string"
          },
          "@timestamp": {
            "index": "not_analyzed",
            "type": "date"
          },
          "@source_path": {
            "index": "not_analyzed",
            "type": "string"
          }
        },
        "_all": {
          "enabled": false
        }
      }
    },
    "aliases": {
    }
  }
}

But in the indexes I can see two mappings: _default_ (with my mapping) and another one called fluentd.

{
  "logstash-2014.07.24": {
    "mappings": {
      "_default_": {},
      "fluentd": {}
    }
  }
}

And the problem is...

I have two fields in the mapping called fluentd:

@pow_1 as float
pow_1 as string

{
  "logstash-2014.07.24": {
    "mappings": {
      "_default_": {
        "_all": {
          "enabled": false
        },
        "_source": {
          "compress": true
        },
        "properties": {
          "@fields": {
            "dynamic": "true",
            "properties": {
              "errnum": {
                "type": "integer"
              }
            }
          },
          "@message": {
            "type": "string"
          },
          "@pow_1": {
            "type": "float"
          },
          "@source": {
            "type": "string",
            "index": "not_analyzed"
          },
          "@source_host": {
            "type": "string",
            "index": "not_analyzed"
          },
          "@source_path": {
            "type": "string",
            "index": "not_analyzed"
          },
          "@tags": {
            "type": "string",
            "index": "not_analyzed"
          },
          "@timestamp": {
            "type": "date",
            "format": "dateOptionalTime"
          },
          "@type": {
            "type": "string",
            "index": "not_analyzed"
          }
        }
      },
      "fluentd": {
        "_all": {
          "enabled": false
        },
        "_source": {
          "compress": true
        },
        "properties": {
          "@fields": {
            "dynamic": "true",
            "properties": {
              "errnum": {
                "type": "integer"
              }
            }
          },
          "@message": {
            "type": "string"
          },
          "@pow_1": {
            "type": "float"
          },
          "@source": {
            "type": "string",
            "index": "not_analyzed"
          },
          "@source_host": {
            "type": "string",
            "index": "not_analyzed"
          },
          "@source_path": {
            "type": "string",
            "index": "not_analyzed"
          },
          "@tags": {
            "type": "string",
            "index": "not_analyzed"
          },
          "@timestamp": {
            "type": "date",
            "format": "dateOptionalTime"
          },
          "@type": {
            "type": "string",
            "index": "not_analyzed"
          },
          "application": {
            "type": "string"
          },
          "date": {
            "type": "string"
          },
          "host": {
            "type": "string"
          },
          "message": {
            "type": "string"
          },
          "pow_1": {
            "type": "string"
          }
        }
      }
    }
  }
}

Who creates this mapping?
Can I change this mapping?
Is there a solution for this?

Thanks

get type from record if present, fallback to type_name

I'm storing different logs in elasticsearch and I want to utilize the type field to separate them, so I can have different mappings for each log format.

I already took care that the _type field is in the record, but this plugin ignores it and uses the type_name param instead.

Would it be possible to first check if there already is a _type field in the record and use that instead of type_name?
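A sketch of the requested lookup (resolve_type is a hypothetical helper, not plugin code):

```ruby
# Prefer an explicit _type carried in the record, falling back to the
# configured type_name; delete the field so it isn't indexed as data.
def resolve_type(record, type_name)
  record.delete("_type") || type_name
end

resolve_type({ "_type" => "apache", "msg" => "x" }, "fluentd") # => "apache"
resolve_type({ "msg" => "x" }, "fluentd")                      # => "fluentd"
```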

Json Encoding problems

Hi uken,
when fluentd tries to flush some mail logs to your plugin, it fails with this message:

2013-12-04 07:30:24 +0100 [warn]: temporarily failed to flush the buffer. next_retry=2013-12-04 07:30:25 +0100 error_class="JSON::GeneratorError" error="source sequence is illegal/malformed utf-8" instance=9518360
  2013-12-04 07:30:24 +0100 [warn]: /var/lib/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.1.4/lib/fluent/plugin/out_elasticsearch.rb:59:in `to_json'

The string to encode is something like "Hai già provato..." with lowercase a-grave or other italian characters.

After some investigation I found that fluentd forces the encoding to ASCII-8BIT; as a consequence, when you call the to_json method, the encoding fails.

To fix it, I changed these lines:
bulk_message << meta.to_json
bulk_message << record.to_json

to:
bulk_message << Yajl::Encoder.encode(meta)
bulk_message << Yajl::Encoder.encode(record)

in out_elasticsearch.rb

Is this a correct fix?

Thanks!
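The failure can be reproduced without fluentd. Yajl is one fix; another (assuming the bytes really are valid UTF-8, as they are for these Italian log lines) is to re-tag the string before serializing:

```ruby
require 'json'

# fluentd hands the output plugin strings tagged ASCII-8BIT even when
# the bytes are valid UTF-8; the json gem can refuse to encode them.
raw   = "Hai gi\xC3\xA0 provato".dup.force_encoding(Encoding::ASCII_8BIT)
fixed = raw.dup.force_encoding(Encoding::UTF_8)
fixed.valid_encoding?           # => true
JSON.generate("msg" => fixed)   # serializes cleanly
```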

Why so slow?

I did some tests on a tiny vagrant box with fluentd + elasticsearch by using this plugin.

It seems every log event sent to fluentd needs roughly 20 seconds to be written into elasticsearch, whereas writing to a file only takes a few seconds. Sometimes it's even worse.

This is my fluentd configuration:

# Source
<source>
  type forward
  port 24224
</source>

# Output
<match devops.log.**>
  type elasticsearch
  index_name devops
  type_name log
  include_tag_key true
</match>

Am I doing something wrong, or is there something I can do to improve it?

Or is it just because the hardware is too weak?
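A likely explanation is buffering rather than raw indexing speed: a buffered output collects records and flushes them on an interval (60 seconds by default in fluentd of this era), so an event can wait up to a minute before being written. A sketch of shortening that (the value is illustrative, not a production recommendation):

```
<match devops.log.**>
  type elasticsearch
  index_name devops
  type_name log
  include_tag_key true
  flush_interval 1s
</match>
```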

Feature: Allow setting of transport timeout

It looks to me like there's no way to override the default transport (patron) timeout (5s). Assuming I'm not overlooking this, or a clever way to do it without modifying your plugin, I think this would mean adding a config option and passing it to your call to Elasticsearch::Client.new to include something like:

transport_options: { request: { timeout: @timeout } }

What do you think? Apologies if I'm just overlooking a way to set this some other way.

Thanks.

License missing from gemspec

RubyGems.org doesn't report a license for your gem. This is because it is not specified in the gemspec of your last release.

via e.g.

spec.license = 'MIT'
# or
spec.licenses = ['MIT', 'GPL-2']

Including a license in your gemspec is an easy way for rubygems.org and other tools to check how your gem is licensed. As you can imagine, scanning your repository for a LICENSE file or parsing the README, and then attempting to identify the license or licenses is much more difficult and more error prone. So, even for projects that already specify a license, including a license in your gemspec is a good practice. See, for example, how rubygems.org uses the gemspec to display the rails gem license.

There is even a License Finder gem to help companies/individuals ensure all gems they use meet their licensing needs. This tool depends on license information being available in the gemspec. This is an important enough issue that even Bundler now generates gems with a default 'MIT' license.

I hope you'll consider specifying a license in your gemspec. If not, please just close the issue with a nice message. In either case, I'll follow up. Thanks for your time!

Appendix:

If you need help choosing a license (sorry, I haven't checked your readme or looked for a license file), GitHub has created a license picker tool. Code without a license specified defaults to 'All rights reserved', denying others all rights to use of the code.
Here's a list of the license names I've found and their frequencies

p.s. In case you're wondering how I found you and why I made this issue, it's because I'm collecting stats on gems (I was originally looking for download data) and decided to collect license metadata, too, and make issues for gemspecs not specifying a license as a public service :). See the previous link or my blog post about this project for more information.

more information in connection error log

Recently I received the fluentd log message below, but it doesn't display any host or port information.

My elasticsearch servers run as a cluster (6 servers), and I don't know which server has the problem.

The cluster doesn't have a problem; when I restart fluentd, it works.

td-agent.conf

<match apache.access>
  type elasticsearch
  hosts x.x.x.x:10200,x.x.x.y:10200,x.x.x.z:10200,x.x.x.a:10200,x.x.x.b:10200,x.x.x.c:10200
  type_name access-log
  logstash_format true
  logstash_prefix logstash
  utc_index false
  request_timeout 20s

  # buffer
  buffer_type file
  buffer_path /var/log/td-agent/buffer/apache-access-elasticsearch.*.buffer
  buffer_chunk_limit 8m
  buffer_queue_limit 10000
  flush_interval 60
  retry_limit 17
</match>

fluent log

2014-08-12 07:53:45 +0900 [warn]: temporarily failed to flush the buffer. next_retry=2014-08-12 10:17:52 +0900 error_class="Faraday::ConnectionFailed" error="couldn't connect to host" instance=69894834792040
  2014-08-12 07:53:45 +0900 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/patron-0.4.18/lib/patron/session.rb:223:in `handle_request'
  2014-08-12 07:53:45 +0900 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/patron-0.4.18/lib/patron/session.rb:223:in `request'
  2014-08-12 07:53:45 +0900 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/faraday-0.9.0/lib/faraday/adapter/patron.rb:33:in `call'
  2014-08-12 07:53:45 +0900 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/faraday-0.9.0/lib/faraday/rack_builder.rb:139:in `build_response'
  2014-08-12 07:53:45 +0900 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/faraday-0.9.0/lib/faraday/connection.rb:377:in `run_request'
  2014-08-12 07:53:45 +0900 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-transport-0.4.11/lib/elasticsearch/transport/transport/http/faraday.rb:21:in `block in perform_request'
  2014-08-12 07:53:45 +0900 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-transport-0.4.11/lib/elasticsearch/transport/transport/base.rb:187:in `call'
  2014-08-12 07:53:45 +0900 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-transport-0.4.11/lib/elasticsearch/transport/transport/base.rb:187:in `perform_request'
  2014-08-12 07:53:45 +0900 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-transport-0.4.11/lib/elasticsearch/transport/transport/http/faraday.rb:20:in `perform_request'
  2014-08-12 07:53:45 +0900 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-transport-0.4.11/lib/elasticsearch/transport/client.rb:92:in `perform_request'
  2014-08-12 07:53:45 +0900 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-api-0.4.11/lib/elasticsearch/api/actions/ping.rb:19:in `ping'
  2014-08-12 07:53:45 +0900 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.4.0/lib/fluent/plugin/out_elasticsearch.rb:50:in `client'
  2014-08-12 07:53:45 +0900 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.4.0/lib/fluent/plugin/out_elasticsearch.rb:108:in `send'
  2014-08-12 07:53:45 +0900 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.4.0/lib/fluent/plugin/out_elasticsearch.rb:103:in `write'
  2014-08-12 07:53:45 +0900 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.50/lib/fluent/buffer.rb:296:in `write_chunk'
  2014-08-12 07:53:45 +0900 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.50/lib/fluent/buffer.rb:276:in `pop'
  2014-08-12 07:53:45 +0900 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.50/lib/fluent/output.rb:310:in `try_flush'
  2014-08-12 07:53:45 +0900 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.50/lib/fluent/output.rb:132:in `run'

Cannot install 0.3.0

Hello,

Trying to install the latest version with:

/usr/lib64/fluent/ruby/bin/fluent-gem install fluent-plugin-elasticsearch

Getting this error:

ERROR:  Error installing fluent-plugin-elasticsearch:
    ERROR: Failed to build gem native extension.

        /usr/lib64/fluent/ruby/bin/ruby extconf.rb
checking for curl-config... no
checking for main() in -lcurl... no
*** extconf.rb failed ***
Could not create Makefile due to some reason, probably lack of
necessary libraries and/or headers.  Check the mkmf.log file for more
details.  You may need configuration options.

Provided configuration options:
    --with-opt-dir
    --without-opt-dir
    --with-opt-include
    --without-opt-include=${opt-dir}/include
    --with-opt-lib
    --without-opt-lib=${opt-dir}/lib
    --with-make-prog
    --without-make-prog
    --srcdir=.
    --curdir
    --ruby=/usr/lib64/fluent/ruby/bin/ruby
    --with-curl-dir
    --without-curl-dir
    --with-curl-include
    --without-curl-include=${curl-dir}/include
    --with-curl-lib
    --without-curl-lib=${curl-dir}/lib
    --with-curllib
    --without-curllib
extconf.rb:39:in `<main>':   Can't find libcurl or curl/curl.h (RuntimeError)

OS is Amazon Linux.

libcurl is installed:
libcurl-7.33.0-1.41.amzn1.x86_64

No issue with previous version.

What do you guys think?

"no implicit conversion of nil into String" error

I keep getting these errors in the log file.

2015-01-22 15:27:35 -0800 [warn]: temporarily failed to flush the buffer. next_retry=2015-01-22 15:27:32 -0800 error_class="TypeError" error="no implicit conversion of nil into String" instance=70364703686720
  2015-01-22 15:27:35 -0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/patron-0.4.18/lib/patron/session.rb:223:in `handle_request'
  2015-01-22 15:27:35 -0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/patron-0.4.18/lib/patron/session.rb:223:in `request'
  2015-01-22 15:27:35 -0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/faraday-0.9.1/lib/faraday/adapter/patron.rb:33:in `call'
  2015-01-22 15:27:35 -0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/faraday-0.9.1/lib/faraday/rack_builder.rb:139:in `build_response'
  2015-01-22 15:27:35 -0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/faraday-0.9.1/lib/faraday/connection.rb:377:in `run_request'
  2015-01-22 15:27:35 -0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/elasticsearch-transport-1.0.6/lib/elasticsearch/transport/transport/http/faraday.rb:21:in `block in perform_request'
  2015-01-22 15:27:35 -0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/elasticsearch-transport-1.0.6/lib/elasticsearch/transport/transport/base.rb:187:in `call'
  2015-01-22 15:27:35 -0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/elasticsearch-transport-1.0.6/lib/elasticsearch/transport/transport/base.rb:187:in `perform_request'
  2015-01-22 15:27:35 -0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/elasticsearch-transport-1.0.6/lib/elasticsearch/transport/transport/http/faraday.rb:20:in `perform_request'
  2015-01-22 15:27:35 -0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/elasticsearch-transport-1.0.6/lib/elasticsearch/transport/client.rb:111:in `perform_request'
  2015-01-22 15:27:35 -0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/elasticsearch-api-1.0.6/lib/elasticsearch/api/actions/ping.rb:19:in `ping'
  2015-01-22 15:27:35 -0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-elasticsearch-0.7.0/lib/fluent/plugin/out_elasticsearch.rb:62:in `client'
  2015-01-22 15:27:35 -0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-elasticsearch-0.7.0/lib/fluent/plugin/out_elasticsearch.rb:168:in `send'
  2015-01-22 15:27:35 -0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-elasticsearch-0.7.0/lib/fluent/plugin/out_elasticsearch.rb:161:in `write'
  2015-01-22 15:27:35 -0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.10.55/lib/fluent/buffer.rb:296:in `write_chunk'
  2015-01-22 15:27:35 -0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.10.55/lib/fluent/buffer.rb:276:in `pop'
  2015-01-22 15:27:35 -0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.10.55/lib/fluent/output.rb:311:in `try_flush'
  2015-01-22 15:27:35 -0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.10.55/lib/fluent/output.rb:132:in `run'

environment:

  • Mac OSX 10.9.5
  • fluentd version 0.10.55
  • fluent-plugin-elasticsearch version 0.7.0
  • elasticsearch version 1.4.1
  • running on localhost, port 9200
  • a local Logstash instance has no trouble connecting and writing to it over HTTP
  • debug output seems to suggest that file input gets parsed without issues

config

####
## Output descriptions:
##
<match tb.logs>
  type elasticsearch
  host localhost
  port 9200
  logstash_format true
  logstash_prefix fluentd
</match>
...
## File input
## read apache logs continuously and tags td.apache.access
<source>
  type tail
  format apache
  path /Users/tanya/logs
  tag tb.logs
  read_from_head true
</source>

Target index does not use the @timestamp property present in the record

If logstash_format is true, the target index's date is computed from the current time rather than from the @timestamp property of the document. This creates a problem when the day of the document's @timestamp differs from the day of the current time.

For example, if the document's timestamp is 2014-09-22T22:59:00Z and the record was sent to fluentd on 2014-09-23T00:01:00Z, the index will be index-name-2014.09.23 which is incorrect. The target index's date should be computed from the document's timestamp (if available).
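A minimal Ruby sketch of the behavior being requested (the function name and signature are illustrative, not the plugin's actual code): prefer the record's own @timestamp when computing the index date.

```ruby
require 'time'

# Illustrative sketch, not the plugin's code: derive the logstash-style
# index name from the record's own @timestamp when present, falling back
# to the ingestion time otherwise.
def target_index(record, prefix = 'logstash', fallback_time = Time.now.utc)
  ts = record.key?('@timestamp') ? Time.parse(record['@timestamp']).utc : fallback_time
  "#{prefix}-#{ts.strftime('%Y.%m.%d')}"
end

target_index({ '@timestamp' => '2014-09-22T22:59:00Z' })
# => "logstash-2014.09.22", even if the record arrives on 2014-09-23
```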

No data sent to elasticsearch

I want to set up a default chain of applications: fluentd/elasticsearch/kibana. However, fluent-plugin-elasticsearch doesn't seem to send any data to elasticsearch. I hope you can help me out.

I'm running Linux Mint (debian based). Elasticsearch/kibana works. I have been able to test that using this information:

http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/docs-bulk.html

Which on my system results in:

$ curl -s -XPOST localhost:9200/_bulk --data-binary @requests; echo

{"took":434,"errors":false,"items":[{"index":{"_index":"test","_type":"type1","_id":"1","_version":1,"status":201}}]}

I have attached some related config/log files. The fluentd log file is silent (after the startup output), also the elasticsearch.log is silent.

I don't know how to troubleshoot any further, all help is greatly appreciated.

Thanks in advance!

content-type in Net::HTTP::Post

Hi uken,
when sending json data to elasticsearch like this:

request = Net::HTTP::Post.new("/_bulk")

the Net::HTTP library tries to escape the request body, so Elasticsearch sometimes receives non-UTF-8 data and logs an org.elasticsearch.common.jackson.core.JsonParseException

To fix this I added a content-type and charset to the Post call:

request = Net::HTTP::Post.new("/_bulk",{"content-type"=>"application/json; charset=utf-8"})

Bye,
Luigi

HTTP and HTTPS auth support

I have an elasticsearch instance exposed to the Internet, to be used by Kibana.

I have secured elasticsearch with https://github.com/sonian/elasticsearch-jetty

Now, I'm trying to connect this plugin with the elasticsearch.

Following http://rubydoc.info/gems/elasticsearch-transport, I have changed the file out_elasticsearch.rb like this...

diff out_elasticsearch.rb out_elasticsearch.rb.org
11,13d10
<   config_param :user, :string, :default => nil
<   config_param :password, :string, :default => nil
<   config_param :scheme, :string, :default => nil
57c54
<         [{host: @host, port: @port, user: @user, password: @password }]

---
>        [{host: @host, port: @port }]

This proof of concept works.

Could you add support for HTTP auth and for the SSL scheme?

Would you prefer a pull request?

Thanks.

Error 400

Hi, I tried this plugin with Elasticsearch on the same PC and it worked fine, but now that I've specified an external host, the requests seem to be forwarded yet the receiving server responds with error 400.

It could be something like:

400 Bad Request
'json' or 'msgpack' parameter is required

# nginx log:
79.16.x.x - - [24/Jul/2013:22:07:05 +0200] "POST /_bulk HTTP/1.1" 400 58 "-" "curl/7.22.0 (i686-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3"

Does fluent-plugin-elasticsearch convert to UTF-8 before creating json?

I have an error with sending some data to elasitcsearch using this plugin, some examples of the error:

Caused by: org.elasticsearch.common.jackson.core.JsonParseException: Invalid UTF-8 start byte 0xa3
Caused by: org.elasticsearch.common.jackson.core.JsonParseException: Invalid UTF-8 middle byte 0x6d
Caused by: org.elasticsearch.common.jackson.core.JsonParseException: Invalid UTF-8 middle byte 0x20

From the research I have done ( http://stackoverflow.com/questions/13830346/jackson-json-parser-invalid-utf-8-start-byte ; http://stackoverflow.com/questions/24629013/jmeter-invalid-utf-8-middle-byte ) it could be that the JSON being sent by the plugin is not UTF-8, and JSON should always be UTF-8, as I am sure you know.

Could you confirm that fluent-plugin-elasticsearch converts to UTF-8 before crafting the json request?
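For reference, and without speaking for the plugin's internals, here is one common Ruby idiom for scrubbing a string to valid UTF-8 before JSON serialization (the helper name is made up for illustration):

```ruby
# Illustrative helper, not the plugin's code: replace invalid or
# unconvertible bytes so the result is always valid UTF-8 before it is
# serialized to JSON.
def to_valid_utf8(str)
  str.encode('UTF-8', 'ASCII-8BIT', invalid: :replace, undef: :replace, replace: '?')
end

to_valid_utf8("caf\xA3".b)  # the stray 0xA3 byte becomes "?"
```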

Thanks.

cannot install on AWS EC2

Hey,
I need some help with why the plugin failed to install. Below is the printout; your help is very much appreciated.

[root@ip-10-178-115-44 td-agent]# /usr/lib64/fluent/ruby/bin/fluent-gem install fluent-plugin-elasticsearch
Building native extensions.  This could take a while...
ERROR:  Error installing fluent-plugin-elasticsearch:
    ERROR: Failed to build gem native extension.

        /usr/lib64/fluent/ruby/bin/ruby extconf.rb
checking for curl-config... yes
checking for rb_thread_blocking_region()... *** extconf.rb failed ***
Could not create Makefile due to some reason, probably lack of
necessary libraries and/or headers.  Check the mkmf.log file for more
details.  You may need configuration options.

Provided configuration options:
    --with-opt-dir
    --without-opt-dir
    --with-opt-include
    --without-opt-include=${opt-dir}/include
    --with-opt-lib
    --without-opt-lib=${opt-dir}/lib
    --with-make-prog
    --without-make-prog
    --srcdir=.
    --curdir
    --ruby=/usr/lib64/fluent/ruby/bin/ruby
    --with-curl-dir
    --without-curl-dir
    --with-curl-include
    --without-curl-include=${curl-dir}/include
    --with-curl-lib
    --without-curl-lib=${curl-dir}/lib
/usr/lib64/fluent/ruby/lib/ruby/1.9.1/mkmf.rb:381:in `try_do': The compiler failed to generate an executable file. (RuntimeError)
You have to install development tools first.
    from /usr/lib64/fluent/ruby/lib/ruby/1.9.1/mkmf.rb:461:in `try_link0'
    from /usr/lib64/fluent/ruby/lib/ruby/1.9.1/mkmf.rb:476:in `try_link'
    from /usr/lib64/fluent/ruby/lib/ruby/1.9.1/mkmf.rb:619:in `try_func'
    from /usr/lib64/fluent/ruby/lib/ruby/1.9.1/mkmf.rb:894:in `block in have_func'
    from /usr/lib64/fluent/ruby/lib/ruby/1.9.1/mkmf.rb:790:in `block in checking_for'
    from /usr/lib64/fluent/ruby/lib/ruby/1.9.1/mkmf.rb:284:in `block (2 levels) in postpone'
    from /usr/lib64/fluent/ruby/lib/ruby/1.9.1/mkmf.rb:254:in `open'
    from /usr/lib64/fluent/ruby/lib/ruby/1.9.1/mkmf.rb:284:in `block in postpone'
    from /usr/lib64/fluent/ruby/lib/ruby/1.9.1/mkmf.rb:254:in `open'
    from /usr/lib64/fluent/ruby/lib/ruby/1.9.1/mkmf.rb:280:in `postpone'
    from /usr/lib64/fluent/ruby/lib/ruby/1.9.1/mkmf.rb:789:in `checking_for'
    from /usr/lib64/fluent/ruby/lib/ruby/1.9.1/mkmf.rb:893:in `have_func'
    from extconf.rb:47:in `<main>'


Gem files will remain installed in /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/patron-0.4.18 for inspection.
Results logged to /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/patron-0.4.18/ext/patron/gem_make.out

Always receive a 400 Bad Request when trying to use the plugin with Logsene

I'm trying to push logs to an Elasticsearch compatible application called Logsene. See https://sematext.atlassian.net/wiki/display/PUBLOGSENE/Index+Events+via+Elasticsearch+API

The problem is that I always receive a 400 error when pushing using fluent plugin. I can cURL the same log message and it works fine.

I have cloned the repo and built the gem using the latest code from GitHub. I've tried a variety of configurations, with and without HTTPS, for example:

<match>
  type elasticsearch
  host logsene-receiver.sematext.com
  port 80
  path /**MYSECRETKEY**/fluentd
  index_name **MYSECRETKEY**
  type_name fluentd
</match>

and

<match>
  type elasticsearch
  hosts http://logsene-receiver.sematext.com:80/**MYSECRETKEY**/fluentd
  index_name **MYSECRETKEY**
  type_name fluentd
</match>

I always get an error like the following:

2014-09-02 21:04:59 +0000 [warn]: temporarily failed to flush the buffer. next_retry=2014-09-02 21:09:39 +0000 error_class="Elasticsearch::Transport::Transport::Errors::BadRequest" error="[400] " instance=70175379032060
  2014-09-02 21:04:59 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-transport-0.4.11/lib/elasticsearch/transport/transport/base.rb:132:in `__raise_transport_error'
  2014-09-02 21:04:59 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-transport-0.4.11/lib/elasticsearch/transport/transport/base.rb:227:in `perform_request'
  2014-09-02 21:04:59 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-transport-0.4.11/lib/elasticsearch/transport/transport/http/faraday.rb:20:in `perform_request'
  2014-09-02 21:04:59 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-transport-0.4.11/lib/elasticsearch/transport/client.rb:92:in `perform_request'
  2014-09-02 21:04:59 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-api-0.4.11/lib/elasticsearch/api/actions/ping.rb:19:in `ping'
  2014-09-02 21:04:59 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.4.0/lib/fluent/plugin/out_elasticsearch.rb:55:in `client'
  2014-09-02 21:04:59 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.4.0/lib/fluent/plugin/out_elasticsearch.rb:138:in `send'
  2014-09-02 21:04:59 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.4.0/lib/fluent/plugin/out_elasticsearch.rb:133:in `write'
  2014-09-02 21:04:59 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.50/lib/fluent/buffer.rb:296:in `write_chunk'
  2014-09-02 21:04:59 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.50/lib/fluent/buffer.rb:276:in `pop'
  2014-09-02 21:04:59 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.50/lib/fluent/output.rb:310:in `try_flush'
  2014-09-02 21:04:59 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.50/lib/fluent/output.rb:132:in `run'

specify multiple elasticsearch hosts

It should be possible to push to multiple elasticsearch servers, similar to the built-in forward output:

<server>
   host elasticsearch1
</server>
<server>
   host elasticsearch2
</server>
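For what it's worth, the plugin's `hosts` parameter (used in the Logsene report elsewhere in this tracker) accepts a comma-separated list of nodes, which may already cover this. A hedged sketch, with illustrative hostnames:

```
<match **>
  type elasticsearch
  # comma-separated list of nodes; hostnames here are illustrative
  hosts elasticsearch1:9200,elasticsearch2:9200
  logstash_format true
</match>
```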

Need to pass timeout parameter to Patron adapter to avoid duplicate records in elasticsearch

The default timeout value in /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/patron-0.4.18/lib/patron/session.rb is 5 seconds, but sometimes that is not enough to completely read the bulk response from elasticsearch, so log records are sent to elasticsearch again and again. This leads to duplicate records in elasticsearch.

2014-07-01 16:35:23 +0400 [warn]: temporarily failed to flush the buffer. next_retry=2014-07-01 16:34:33 +0400 error_class="Faraday::TimeoutError" error="Operation timed out after 5000 milliseconds with 0 bytes received" instance=70263784889780
  2014-07-01 16:35:23 +0400 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/patron-0.4.18/lib/patron/session.rb:223:in `handle_request'
  2014-07-01 16:35:23 +0400 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/patron-0.4.18/lib/patron/session.rb:223:in `request'
  2014-07-01 16:35:23 +0400 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/faraday-0.9.0/lib/faraday/adapter/patron.rb:33:in `call'
  2014-07-01 16:35:23 +0400 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/faraday-0.9.0/lib/faraday/rack_builder.rb:139:in `build_response'
  2014-07-01 16:35:23 +0400 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/faraday-0.9.0/lib/faraday/connection.rb:377:in `run_request'
  2014-07-01 16:35:23 +0400 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-transport-1.0.2/lib/elasticsearch/transport/transport/http/faraday.rb:21:in `block in perform_request'
  2014-07-01 16:35:23 +0400 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-transport-1.0.2/lib/elasticsearch/transport/transport/base.rb:187:in `call'
  2014-07-01 16:35:23 +0400 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-transport-1.0.2/lib/elasticsearch/transport/transport/base.rb:187:in `perform_request'
  2014-07-01 16:35:23 +0400 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-transport-1.0.2/lib/elasticsearch/transport/transport/http/faraday.rb:20:in `perform_request'
  2014-07-01 16:35:23 +0400 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-transport-1.0.2/lib/elasticsearch/transport/client.rb:102:in `perform_request'
  2014-07-01 16:35:23 +0400 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-api-1.0.2/lib/elasticsearch/api/actions/bulk.rb:91:in `bulk'
  2014-07-01 16:35:23 +0400 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.3.0/lib/fluent/plugin/out_elasticsearch.rb:102:in `send'
  2014-07-01 16:35:23 +0400 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.3.0/lib/fluent/plugin/out_elasticsearch.rb:90:in `write'
  2014-07-01 16:35:23 +0400 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.45/lib/fluent/buffer.rb:296:in `write_chunk'
  2014-07-01 16:35:23 +0400 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.45/lib/fluent/buffer.rb:276:in `pop'
  2014-07-01 16:35:23 +0400 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.45/lib/fluent/output.rb:310:in `try_flush'
  2014-07-01 16:35:23 +0400 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.45/lib/fluent/output.rb:132:in `run'
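For reference, the plugin exposes a `request_timeout` option (it appears in other configurations in this tracker), which is the usual way to give slow bulk responses more time. A hedged sketch:

```
<match **>
  type elasticsearch
  host 127.0.0.1
  port 9200
  # raise the HTTP timeout passed down to the transport adapter
  request_timeout 15s
</match>
```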

fluent-plugin-elasticsearch, geoip, and typecast together are slow

Hi uken:
I find that data shows up very slowly in elasticsearch-head when I use fluent-plugin-elasticsearch, geoip, and typecast. My config file is:

<source>
  type tail
  path /opt/realtimesearch/elasticsearch-0.90.5/s3log/access/clickstream-access.log
  pos_file /opt/realtimesearch/elasticsearch-0.90.5/s3log/access/clickstream-access.log.pos
  tag geoip.access
  format /^(?<remote_intranet>([0-9a-z\.,\s\%20]*,(\%20|\s)*)*)(?<remote_internet>[0-9a-z\. ]*) (?<host>[^ ]*) (?<user>[^ ]*) \[(?<time>[^\]]*)\] "(?<method>\S+)(?: +(?<path>[^\"]*) +\S*)?" (?<code>[^ ]*) (?<size>[^ ]*) "(?<referer>[^\"]*)" "(?<agent>[^\"]*)" (?<uid_set>[^ ]*) (?<uid_got>[^ ]*) (?<customer_id_cookie>[^ ]*) (?<ga_cookie>[^ ]*) (?<request_time>[^ ]*) (?<cookie_TestType>[^ ]*) (?<ABTest>[^ ]*)$/
  time_format %d/%b/%Y:%H:%M:%S %z
</source>

<match geoip.access>
    type geoip
    geoip_lookup_key         remote_internet
    enable_key_city          geoip_city
    enable_key_latitude      geoip_lat
    enable_key_longitude     geoip_lon
    enable_key_country_code  geoip_country
    enable_key_region        geoip_region
    remove_tag_prefix    geoip.access
    add_tag_prefix       typecast.access
    flush_interval       1s
</match>

<match typecast.access>
  type typecast
  item_types code:integer,size:integer,request_time:float
  prefix es
</match>

<match es.typecast.access>
  type elasticsearch
  host 127.0.0.1
  port 9200
  logstash_prefix logstash
  logstash_format true
  type_name fluentd
  index_name fluentd
  flush_interval 5s
</match>

If I want to use a buffer plugin, how can I do it? Could you give me an example in my config file? I don't know whether the buffer plugin goes before or after <match es.typecast.access>. Thanks!
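For reference, a hedged sketch: in fluentd, buffering belongs to the buffered output plugin itself, so buffer parameters go inside the <match es.typecast.access> block rather than in a separate stage (the limits below are illustrative):

```
<match es.typecast.access>
  type elasticsearch
  host 127.0.0.1
  port 9200
  logstash_format true
  # buffer options are configured on the output plugin itself
  buffer_type memory
  buffer_chunk_limit 8m
  buffer_queue_limit 64
  flush_interval 5s
</match>
```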

temporarily failed to flush the buffer. next_retry

Trace:

      2014-11-03 09:04:40 +0000 [warn]: temporarily failed to flush the buffer. next_retry=2014-11-03 09:06:42 +0000 error_class="NameError" error="undefined local variable or method `log' for #<Fluent::ElasticsearchOutput:0x007f8daf934c28>" instance=70123258002580
      2014-11-03 09:04:40 +0000 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.6.0/lib/fluent/plugin/out_elasticsearch.rb:66:in `client'
      2014-11-03 09:04:40 +0000 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.6.0/lib/fluent/plugin/out_elasticsearch.rb:164:in `send'
      2014-11-03 09:04:40 +0000 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.6.0/lib/fluent/plugin/out_elasticsearch.rb:157:in `write'
      2014-11-03 09:04:40 +0000 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.41/lib/fluent/buffer.rb:292:in `write_chunk'
      2014-11-03 09:04:40 +0000 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.41/lib/fluent/buffer.rb:272:in `pop'
      2014-11-03 09:04:40 +0000 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.41/lib/fluent/output.rb:305:in `try_flush'
      2014-11-03 09:04:40 +0000 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.41/lib/fluent/output.rb:131:in `run'

Config:

<match feedback>
  type elasticsearch
  type_name c_feedback
  logstash_format true
  flush_interval 10s
  host endpoint
  port 9200
  logstash_prefix stats
  include_tag_key true
  id_key request_id
  buffer_type memory
  tag_key for
  num_threads 2
</match>

This issue has existed since version 0.5.1. I tried upgrading to 0.6.0 in hopes of fixing it; still no luck.

Setup:
td-agent-1.1.18-0.x86_64
fluent-plugin-elasticsearch version 0.6.0. Although issue existed from version 0.5.1.

Allow daily splitting of index even if not logstash

I might be using fluentd to pass some JSON-formatted data to ES, but it's definitely not in syslog/logstash format (which wouldn't make any sense anyway).

So I did a quick hack to allow splitting; it would be nice if this were done in a more adjustable way. I don't want indexes split at UTC, but other people might:

    config_param :split_daily, :bool, :default => false

    if @split_daily
      target_index = @index_name + Time.at(time).strftime("%Y-%m-%d")
    else
      target_index = @index_name
    end

Plugin locks up completely when accidentally sending a string log message

Hi, I had some buggy code which was sending preformatted strings (rather than a JSON dict) to td-agent. td-agent was fine with this, but a single bad message causes the elasticsearch plugin to stop logging anything: it basically causes our sitewide central logging system to never log anything again until td-agent is restarted.

Just needs a spot of exception handling I guess. If you need more information, just say!
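A sketch of the kind of guard being asked for (the helper name is illustrative, not the plugin's code): coerce non-hash records into a hash so one bad event cannot wedge the whole buffer flush.

```ruby
# Illustrative guard, not the plugin's code: wrap anything that is not
# already a hash so downstream record.has_key? calls cannot raise.
def safe_record(record)
  record.is_a?(Hash) ? record : { 'message' => record.to_s }
end

safe_record('oops, a preformatted string')
# => {"message"=>"oops, a preformatted string"}
```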

Log trace:

2014-07-06 09:46:58 +0000 [warn]: temporarily failed to flush the buffer. next_retry=2014-07-06 14:53:37 +0000 error_class="NoMethodError" error="undefined method `has_key?' for #<String:0x007f3d74cb6390>" instance=69951065878960
  2014-07-06 09:46:58 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.3.1/lib/fluent/plugin/out_elasticsearch.rb:71:in `block in write'
  2014-07-06 09:46:58 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.50/lib/fluent/plugin/buf_memory.rb:62:in `feed_each'
  2014-07-06 09:46:58 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.50/lib/fluent/plugin/buf_memory.rb:62:in `msgpack_each'
  2014-07-06 09:46:58 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.3.1/lib/fluent/plugin/out_elasticsearch.rb:69:in `write'
  2014-07-06 09:46:58 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.50/lib/fluent/buffer.rb:296:in `write_chunk'
  2014-07-06 09:46:58 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.50/lib/fluent/buffer.rb:276:in `pop'
  2014-07-06 09:46:58 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.50/lib/fluent/output.rb:310:in `try_flush'
  2014-07-06 09:46:58 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.50/lib/fluent/output.rb:132:in `run'

(there are later log traces where it continues to try to process the same message every hour or so, failing each time)

Does the @timestamp field work?

According to the documentation, we can use a custom time field, but it seems not to be working.

By default, when inserting records in logstash format, @timestamp is dynamically created with the time at log ingestion. If you'd like to use a custom time. Include an @timestamp with your record
{"@timestamp":"2014-04-07T000:00:00-00:00"}

The JSON I posted to fluentd:

{"locale": "kokr", "count": 6, "_type": "dodo.summary.real", "@timestamp": "2014-11-23T000:00:00-00:00", "package": "com.iconnect.launcher.theme.apink"}

but the time of the fluentd request was added to the elasticsearch @timestamp field instead.

Mapping, tokenization and index

This is a question rather than a bug.
Let's say I have the following output coming from stdout:

{
"remote": "89.85.14.146",
"city": "saint-hubert"
}

Then I would like to insert it into ES. However, the 'city' field has to be defined as index: 'not_analyzed' in elasticsearch; otherwise it gets tokenized into 'saint' and 'hubert', which is not my expected behavior.

According to ES documentation

"To create a mapping, you will need the Put Mapping API, or you can add multiple mappings when you create an index."

Ideally, I'd like to do it using the fluentd plugin configuration. Is that possible?
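The usual approach sits outside the plugin: create an index template in Elasticsearch before the index exists, so matching indices pick up the mapping automatically. A minimal ES 1.x sketch (the template name and index pattern are assumptions; only the `city` field comes from the question):

```
curl -XPUT localhost:9200/_template/keep_city_raw -d '{
  "template": "logstash-*",
  "mappings": {
    "_default_": {
      "properties": {
        "city": { "type": "string", "index": "not_analyzed" }
      }
    }
  }
}'
```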

No data sent to elasticsearch

I want to set up a default chain of applications: fluentd/elasticsearch/kibana. However, fluent-plugin-elasticsearch doesn't seem to send any data to elasticsearch. I hope you can help me out.

I'm running Linux Mint (debian based). Elasticsearch/kibana works. I have been able to test that using this information:

http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/docs-bulk.html

Which on my system results in:

$ curl -s -XPOST localhost:9200/_bulk --data-binary @requests; echo

{"took":434,"errors":false,"items":[{"index":{"_index":"test","_type":"type1","_id":"1","_version":1,"status":201}}]}

And the following in elasticsearch.log:

[2014-09-08 09:09:26,472][DEBUG][cluster.service ] [Karl Malus] processing [update-mapping [test][type1] / node [RlqapoBORZGN9ZJPhAfgXA], order [1]]: execute
[2014-09-08 09:09:26,478][DEBUG][cluster.metadata ] [Karl Malus] [test] update_mapping type1 with source [{"type1":{"properties":{"field1":{"type":"string"}}}}]
[2014-09-08 09:09:26,479][DEBUG][cluster.service ] [Karl Malus] cluster state updated, version [10], source [update-mapping [test][type1] / node [RlqapoBORZGN9ZJPhAfgXA], order [1]]
[2014-09-08 09:09:26,479][DEBUG][cluster.service ] [Karl Malus] publishing cluster state version 10
[2014-09-08 09:09:26,480][DEBUG][cluster.service ] [Karl Malus] set local cluster state to version 10
[2014-09-08 09:09:26,481][DEBUG][river.cluster ] [Karl Malus] processing [reroute_rivers_node_changed]: execute
[2014-09-08 09:09:26,481][DEBUG][river.cluster ] [Karl Malus] processing [reroute_rivers_node_changed]: no change in cluster_state
[2014-09-08 09:09:26,486][DEBUG][cluster.service ] [Karl Malus] processing [update-mapping [test][type1] / node [RlqapoBORZGN9ZJPhAfgXA], order [1]]: done applying updated cluster_state (version: 10)
[2014-09-08 09:09:26,486][DEBUG][cluster.action.index ] [Karl Malus] successfully updated master with mapping update: index [test], indexUUID [dT30Z6KjSWy6yvZ38491kg], type [type1] and source [{"type1":{"properties":{"field1":{"type":"string"}}}}]

And also data is shown in Kibana.

My fluent.conf:

## Read Apache access log
<source>
  type tail
  #format apache2
  format /^(?<hostname>[^:]+):(?<port>[[:digit:]]+) (?<hostip>[^ ]+) -(?<user>[^-]+)- \[(?<time>[^\]]+)\] "(?<method>[[:alpha:]]+) (?<path>[^ ]+) (?<httpversion>[^ ]+)" (?<returncode>[[:digit:]]+) (?<responsesize>[[:digit:]]*) "(?<referer>[^\"]*)" "(?<agent>[^\"]+)"?$/
  time_format %d/%b/%Y:%H:%M:%S %z
  path /var/log/apache2/other_vhosts_access.log
  pos_file /var/log/fluentd/local_apache_access.pos
  tag local.apache.access
</source>

# Send everything to elasticsearch
# https://github.com/uken/fluent-plugin-elasticsearch

<match **>
  type copy
  <store>
    type elasticsearch
    host 127.0.0.1
    port 9200
    #index_name fluentd
    #type_name fluentd
    #logstash_format true
    #logstash_prefix mylogs
    #logstash_dateformat %Y.%m.%d
    #utc_index false
    #request_timeout 5s
    #include_tag_key true
    #tag_key tag
    #id_key request_id
    #buffer_type memory
    #flush_interval 60
    #retry_limit 17
    #retry_wait 1.0
    #num_threads 1
  </store>
  <store>
    type file
    path /var/log/fluentd/fluentd.dmp
  </store>
</match>

And the fluentd.dmp file contains:

2014-09-08T09:09:46+02:00 local.apache.access {"hostname":"thinkpad","port":"80","hostip":"127.0.0.1","user":" ","method":"GET","path":"/kibana-3.1.0/config.js","httpversion":"HTTP/1.1","returncode":"200","responsesize":"1296","referer":"http://127.0.0.1/kibana-3.1.0/index.html","agent":"Mozilla/5.0 (X11; Linux x86_64; rv:32.0) Gecko/20100101 Firefox/32.0"}
2014-09-08T09:09:46+02:00 local.apache.access {"hostname":"thinkpad","port":"80","hostip":"127.0.0.1","user":" ","method":"GET","path":"/kibana-3.1.0/app/dashboards/guided.json?1410160186689","httpversion":"HTTP/1.1","returncode":"200","responsesize":"7017","referer":"http://127.0.0.1/kibana-3.1.0/index.html","agent":"Mozilla/5.0 (X11; Linux x86_64; rv:32.0) Gecko/20100101 Firefox/32.0"}
2014-09-08T09:09:51+02:00 local.apache.access {"hostname":"thinkpad","port":"80","hostip":"::1","user":" ","method":"OPTIONS","path":"","httpversion":"HTTP/1.0","returncode":"200","responsesize":"125","referer":"-","agent":"Apache/2.4.6 (Debian) PHP/5.5.6-1 (internal dummy connection)"}
2014-09-08T09:09:52+02:00 local.apache.access {"hostname":"thinkpad","port":"80","hostip":"::1","user":" ","method":"OPTIONS","path":"*","httpversion":"HTTP/1.0","returncode":"200","responsesize":"125","referer":"-","agent":"Apache/2.4.6 (Debian) PHP/5.5.6-1 (internal dummy connection)"}
2014-09-08T09:09:53+02:00 local.apache.access {"hostname":"thinkpad","port":"80","hostip":"::1","user":" ","method":"OPTIONS","path":"*","httpversion":"HTTP/1.0","returncode":"200","responsesize":"125","referer":"-","agent":"Apache/2.4.6 (Debian) PHP/5.5.6-1 (internal dummy connection)"}

The fluentd log file is silent (after the startup output), also the elasticsearch.log is silent.

I don't know how to troubleshoot any further, all help is greatly appreciated.

Thanks in advance!

error_class="ThreadError" error="current thread not owner"

Problem:

My fluentd setup seems to work just fine, except that data is only flushed to elasticsearch when the fluentd service is restarted. I tried with Ruby 1.9 and 2.0, with the same results. The next_retry field in the logs suggests that the worker thread(s) for flushing do not execute until SIGHUP is received by fluentd.
The whole setup is running in Docker containers.

Only thing i could find related to the problem is: http://stackoverflow.com/questions/24701230/fluentd-does-not-flush-any-data-to-elastic-search-but-does-flush-upon-shutdown

Versions i tried it with:

ruby 1.9.3p547 (2014-05-14 revision 45962) [x86_64-linux]
ruby 2.0.0p481 (2014-05-08 revision 45883) [x86_64-linux]

Config:

<source>
    type tail
    format json
    time_key time
    time_format %Y-%m-%dT%T.%LZ
    path /var/lib/docker/containers/4c8e53463ec86870aeb1694bf0255c78315b0274d582524a30f43b4bc7553ed1/4c8e53463ec86870aeb1694bf0255c78315b0274d582524a30f43b4bc7553ed1-json.log
    pos_file /var/lib/docker/containers/4c8e53463ec86870aeb1694bf0255c78315b0274d582524a30f43b4bc7553ed1/4c8e53463ec86870aeb1694bf0255c78315b0274d582524a30f43b4bc7553ed1-json.log.pos
    tag docker.container.elasticsearch
    rotate_wait 5
    read_from_head true
  </source>
  <match docker.container.collector>
    type elasticsearch
    host es1
    port 9200
    index_name fluentd
    type_name collector
    logstash_format true
    request_timeout 15s
    flush_interval 3s
    retry_limit 17
    retry_wait 1s
  </match>

-vv Log from SIGHUP until restart:

2014-10-08 13:47:42 +0000 [info]: plugin/in_tail.rb:475:initialize: following tail of /var/lib/docker/containers/4c8e53463ec86870aeb1694bf0255c78315b0274d582524a30f43b4bc7553ed1/4c8e53463ec86870aeb1694bf0255c78315b0274d582524a30f43b4bc7553ed1-json.log
2014-10-08 13:48:49 +0000 [debug]: fluent/supervisor.rb:306:block in install_supervisor_signal_handlers: fluentd supervisor process get SIGHUP
2014-10-08 13:48:49 +0000 [info]: fluent/supervisor.rb:307:block in install_supervisor_signal_handlers: restarting
2014-10-08 13:48:49 +0000 [debug]: fluent/supervisor.rb:432:block in install_main_process_signal_handlers: fluentd main process get SIGTERM
2014-10-08 13:48:49 +0000 [debug]: fluent/supervisor.rb:435:block in install_main_process_signal_handlers: getting start to shutdown main process
2014-10-08 13:48:49 +0000 [info]: fluent/engine.rb:237:run: shutting down fluentd
2014-10-08 13:48:49 +0000 [warn]: fluent/output.rb:344:rescue in try_flush: temporarily failed to flush the buffer. next_retry=2014-10-08 13:47:47 +0000 error_class="ThreadError" error="current thread not owner" instance=28378900
  2014-10-08 13:48:49 +0000 [warn]: /usr/lib64/ruby/1.9.1/monitor.rb:246:in `mon_check_owner'
2014-10-08 13:48:49 +0000 [warn]: fluent/output.rb:344:rescue in try_flush: temporarily failed to flush the buffer. next_retry=2014-10-08 13:47:47 +0000 error_class="ThreadError" error="current thread not owner" instance=28391600
  2014-10-08 13:48:49 +0000 [warn]: /usr/lib64/ruby/1.9.1/monitor.rb:195:in `mon_exit'
2014-10-08 13:48:49 +0000 [warn]: fluent/output.rb:344:rescue in try_flush: temporarily failed to flush the buffer. next_retry=2014-10-08 13:47:47 +0000 error_class="ThreadError" error="current thread not owner" instance=28476300
  2014-10-08 13:48:49 +0000 [warn]: /usr/lib64/ruby/1.9.1/monitor.rb:246:in `mon_check_owner'
  2014-10-08 13:48:49 +0000 [warn]: /usr/lib64/ruby/site_ruby/1.9.1/rubygems/core_ext/kernel_require.rb:123:in `ensure in require'
  2014-10-08 13:48:49 +0000 [warn]: /usr/lib64/ruby/site_ruby/1.9.1/rubygems/core_ext/kernel_require.rb:123:in `require'
  2014-10-08 13:48:49 +0000 [warn]: /usr/lib64/ruby/1.9.1/monitor.rb:195:in `mon_exit'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/faraday-0.9.0/lib/faraday/utils.rb:254:in `default_uri_parser'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/faraday-0.9.0/lib/faraday/utils.rb:246:in `URI'
  2014-10-08 13:48:49 +0000 [warn]: /usr/lib64/ruby/site_ruby/1.9.1/rubygems/core_ext/kernel_require.rb:123:in `ensure in require'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/faraday-0.9.0/lib/faraday/connection.rb:309:in `url_prefix='
  2014-10-08 13:48:49 +0000 [warn]: /usr/lib64/ruby/1.9.1/monitor.rb:246:in `mon_check_owner'
  2014-10-08 13:48:49 +0000 [warn]: /usr/lib64/ruby/site_ruby/1.9.1/rubygems/core_ext/kernel_require.rb:123:in `require'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/faraday-0.9.0/lib/faraday/connection.rb:77:in `initialize'
  2014-10-08 13:48:49 +0000 [warn]: /usr/lib64/ruby/1.9.1/monitor.rb:195:in `mon_exit'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/faraday-0.9.0/lib/faraday/utils.rb:254:in `default_uri_parser'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/elasticsearch-transport-1.0.5/lib/elasticsearch/transport/transport/http/faraday.rb:43:in `new'
  2014-10-08 13:48:49 +0000 [warn]: /usr/lib64/ruby/site_ruby/1.9.1/rubygems/core_ext/kernel_require.rb:123:in `ensure in require'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/faraday-0.9.0/lib/faraday/utils.rb:246:in `URI'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/elasticsearch-transport-1.0.5/lib/elasticsearch/transport/transport/http/faraday.rb:43:in `block in __build_connections'
  2014-10-08 13:48:49 +0000 [warn]: /usr/lib64/ruby/site_ruby/1.9.1/rubygems/core_ext/kernel_require.rb:123:in `require'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/faraday-0.9.0/lib/faraday/connection.rb:309:in `url_prefix='
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/elasticsearch-transport-1.0.5/lib/elasticsearch/transport/transport/http/faraday.rb:36:in `map'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/faraday-0.9.0/lib/faraday/utils.rb:254:in `default_uri_parser'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/faraday-0.9.0/lib/faraday/connection.rb:77:in `initialize'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/elasticsearch-transport-1.0.5/lib/elasticsearch/transport/transport/http/faraday.rb:36:in `__build_connections'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/faraday-0.9.0/lib/faraday/utils.rb:246:in `URI'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/elasticsearch-transport-1.0.5/lib/elasticsearch/transport/transport/http/faraday.rb:43:in `new'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/elasticsearch-transport-1.0.5/lib/elasticsearch/transport/transport/base.rb:32:in `initialize'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/faraday-0.9.0/lib/faraday/connection.rb:309:in `url_prefix='
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/elasticsearch-transport-1.0.5/lib/elasticsearch/transport/transport/http/faraday.rb:43:in `block in __build_connections'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.5.1/lib/fluent/plugin/out_elasticsearch.rb:47:in `new'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/faraday-0.9.0/lib/faraday/connection.rb:77:in `initialize'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/elasticsearch-transport-1.0.5/lib/elasticsearch/transport/transport/http/faraday.rb:36:in `map'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.5.1/lib/fluent/plugin/out_elasticsearch.rb:47:in `client'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/elasticsearch-transport-1.0.5/lib/elasticsearch/transport/transport/http/faraday.rb:43:in `new'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/elasticsearch-transport-1.0.5/lib/elasticsearch/transport/transport/http/faraday.rb:36:in `__build_connections'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.5.1/lib/fluent/plugin/out_elasticsearch.rb:161:in `send'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/elasticsearch-transport-1.0.5/lib/elasticsearch/transport/transport/http/faraday.rb:43:in `block in __build_connections'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/elasticsearch-transport-1.0.5/lib/elasticsearch/transport/transport/base.rb:32:in `initialize'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.5.1/lib/fluent/plugin/out_elasticsearch.rb:154:in `write'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/elasticsearch-transport-1.0.5/lib/elasticsearch/transport/transport/http/faraday.rb:36:in `map'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.5.1/lib/fluent/plugin/out_elasticsearch.rb:47:in `new'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/fluentd-0.10.53/lib/fluent/buffer.rb:296:in `write_chunk'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/elasticsearch-transport-1.0.5/lib/elasticsearch/transport/transport/http/faraday.rb:36:in `__build_connections'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.5.1/lib/fluent/plugin/out_elasticsearch.rb:47:in `client'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/fluentd-0.10.53/lib/fluent/buffer.rb:276:in `pop'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/elasticsearch-transport-1.0.5/lib/elasticsearch/transport/transport/base.rb:32:in `initialize'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.5.1/lib/fluent/plugin/out_elasticsearch.rb:161:in `send'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/fluentd-0.10.53/lib/fluent/output.rb:311:in `try_flush'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.5.1/lib/fluent/plugin/out_elasticsearch.rb:47:in `new'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.5.1/lib/fluent/plugin/out_elasticsearch.rb:154:in `write'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/fluentd-0.10.53/lib/fluent/output.rb:132:in `run'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.5.1/lib/fluent/plugin/out_elasticsearch.rb:47:in `client'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/fluentd-0.10.53/lib/fluent/buffer.rb:296:in `write_chunk'
2014-10-08 13:48:49 +0000 [warn]: fluent/output.rb:385:rescue in before_shutdown: before_shutdown failed error="current thread not owner"
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.5.1/lib/fluent/plugin/out_elasticsearch.rb:161:in `send'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/fluentd-0.10.53/lib/fluent/buffer.rb:276:in `pop'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.5.1/lib/fluent/plugin/out_elasticsearch.rb:154:in `write'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/fluentd-0.10.53/lib/fluent/output.rb:311:in `try_flush'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/fluentd-0.10.53/lib/fluent/buffer.rb:296:in `write_chunk'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/fluentd-0.10.53/lib/fluent/output.rb:132:in `run'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/fluentd-0.10.53/lib/fluent/buffer.rb:276:in `pop'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/fluentd-0.10.53/lib/fluent/output.rb:311:in `try_flush'
  2014-10-08 13:48:49 +0000 [warn]: /usr/local/lib64/ruby/gems/1.9.1/gems/fluentd-0.10.53/lib/fluent/output.rb:132:in `run'
2014-10-08 13:48:49 +0000 [info]: plugin/out_elasticsearch.rb:63:client: Connection opened to Elasticsearch cluster => {:host=>"es1", :port=>9200, :scheme=>"http"}
2014-10-08 13:48:50 +0000 [info]: plugin/out_elasticsearch.rb:63:client: Connection opened to Elasticsearch cluster => {:host=>"es1", :port=>9200, :scheme=>"http"}
/usr/local/lib64/ruby/gems/1.9.1/gems/fluentd-0.10.53/lib/fluent/engine.rb:298:in `join': deadlock detected (fatal)
        from /usr/local/lib64/ruby/gems/1.9.1/gems/fluentd-0.10.53/lib/fluent/engine.rb:298:in `block in shutdown'
        from /usr/local/lib64/ruby/gems/1.9.1/gems/fluentd-0.10.53/lib/fluent/engine.rb:288:in `each'
        from /usr/local/lib64/ruby/gems/1.9.1/gems/fluentd-0.10.53/lib/fluent/engine.rb:288:in `shutdown'
        from /usr/local/lib64/ruby/gems/1.9.1/gems/fluentd-0.10.53/lib/fluent/engine.rb:238:in `run'
        from /usr/local/lib64/ruby/gems/1.9.1/gems/fluentd-0.10.53/lib/fluent/supervisor.rb:464:in `run_engine'
        from /usr/local/lib64/ruby/gems/1.9.1/gems/fluentd-0.10.53/lib/fluent/supervisor.rb:135:in `block in start'
        from /usr/local/lib64/ruby/gems/1.9.1/gems/fluentd-0.10.53/lib/fluent/supervisor.rb:250:in `call'
        from /usr/local/lib64/ruby/gems/1.9.1/gems/fluentd-0.10.53/lib/fluent/supervisor.rb:250:in `main_process'
        from /usr/local/lib64/ruby/gems/1.9.1/gems/fluentd-0.10.53/lib/fluent/supervisor.rb:225:in `block in supervise'
        from /usr/local/lib64/ruby/gems/1.9.1/gems/fluentd-0.10.53/lib/fluent/supervisor.rb:224:in `fork'
        from /usr/local/lib64/ruby/gems/1.9.1/gems/fluentd-0.10.53/lib/fluent/supervisor.rb:224:in `supervise'
        from /usr/local/lib64/ruby/gems/1.9.1/gems/fluentd-0.10.53/lib/fluent/supervisor.rb:128:in `start'
        from /usr/local/lib64/ruby/gems/1.9.1/gems/fluentd-0.10.53/lib/fluent/command/fluentd.rb:160:in `<top (required)>'
        from /usr/lib64/ruby/site_ruby/1.9.1/rubygems/core_ext/kernel_require.rb:53:in `require'
        from /usr/lib64/ruby/site_ruby/1.9.1/rubygems/core_ext/kernel_require.rb:53:in `require'
        from /usr/local/lib64/ruby/gems/1.9.1/gems/fluentd-0.10.53/bin/fluentd:6:in `<top (required)>'
        from /usr/local/bin/fluentd:23:in `load'
        from /usr/local/bin/fluentd:23:in `<main>'
2014-10-08 13:48:50 +0000 [info]: fluent/supervisor.rb:240:supervise: process finished code=256
2014-10-08 13:48:50 +0000 [error]: fluent/supervisor.rb:138:start: fluentd main process died unexpectedly. restarting.
2014-10-08 13:48:50 +0000 [info]: fluent/supervisor.rb:223:supervise: starting fluentd-0.10.53
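For what it's worth, the `current thread not owner` ThreadError in the trace above is how Ruby's Monitor reacts when a lock taken in one thread is released from a different one. A minimal reproduction of just that error class — this is not the plugin's code, only the underlying Monitor behavior:

```ruby
require 'monitor'

mon = Monitor.new
mon.mon_enter  # lock now held by the main thread

err = nil
Thread.new do
  begin
    mon.mon_exit  # a different thread tries to release the lock
  rescue ThreadError => e
    err = e
  end
end.join
mon.mon_exit  # the owning thread releases normally

puts err.class  # prints: ThreadError
```

That pattern fits a shutdown path where buffer flushing runs on a thread other than the one that entered the monitor.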

Use Fluentd Tag as _type

Is it possible to configure the plugin so that the tag Fluentd assigns to an event is used as the `_type` passed to Elasticsearch?

Example:

<match my.logs>
  type elasticsearch
  logstash_format true
  include_tag_key true
  tag_key _type
</match>

The record inserted into elasticsearch would be:

{
  "_index" : "logstash-2014.07.10",
  "_type" : "my.logs",
  "_id" : "G3He_H6qShdsfa2Zx5NAzbw",
  "_source" : {"message":"This is a Test"}
}
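As far as I can tell, `include_tag_key`/`tag_key` only copy the tag into the document body (`_source`), not into the bulk metadata, so naming the key `_type` would not change the mapping type. A sketch of that behavior (not the plugin's actual code):

```ruby
# Sketch of what include_tag_key/tag_key appear to do: copy the tag
# into the record body (_source), leaving the bulk metadata untouched.
def apply_tag_key(record, tag, include_tag_key:, tag_key:)
  record = record.dup
  record[tag_key] = tag if include_tag_key
  record
end

rec = apply_tag_key({ 'message' => 'This is a Test' }, 'my.logs',
                    include_tag_key: true, tag_key: '_type')
puts rec['_type']  # prints: my.logs
```

So with the config above you would get a `_type` *field* inside `_source`, while the document's actual type stays whatever `type_name` is set to.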

error with recently upgraded 0.5.0: the scheme http does not accept registry part: :9200MYHOST1

With the config:

<match this.*>
  type forest
  subtype elasticsearch
  <template>
    buffer_chunk_limit 5m
    buffer_path /var/log/td-agent/buffer/output_${tag}.*
    buffer_queue_limit 1000
    buffer_type file
    flush_interval 2s
    hosts MYHOST1,MYHOST2
    scheme http
    port 9200
    include_tag_key true
    logstash_format true
    logstash_prefix ${tag}
    num_threads 1
    request_timeout 120s
    retry_limit 300
    retry_wait 5s
    tag_key tag
  </template>
</match>

I'm seeing a massive amount of errors...

2014-10-06 21:43:31 +0000 [warn]: temporarily failed to flush the buffer. next_retry=2014-10-07 00:18:27 +0000 error_class="URI::InvalidURIError" error="the scheme http does not accept registry part: :9200MYHOST1 (or bad hostname?)" instance=70105317242240
2014-10-06 21:43:31 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/1.9.1/uri/generic.rb:213:in `initialize'
2014-10-06 21:43:31 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/1.9.1/uri/http.rb:84:in `initialize'
2014-10-06 21:43:31 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/1.9.1/uri/common.rb:214:in `new'
2014-10-06 21:43:31 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/1.9.1/uri/common.rb:214:in `parse'
2014-10-06 21:43:31 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/1.9.1/uri/common.rb:747:in `parse'
2014-10-06 21:43:31 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/1.9.1/uri/common.rb:994:in `URI'
2014-10-06 21:43:31 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/faraday-0.9.0/lib/faraday/utils.rb:246:in `call'
2014-10-06 21:43:31 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/faraday-0.9.0/lib/faraday/utils.rb:246:in `URI'
2014-10-06 21:43:31 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/faraday-0.9.0/lib/faraday/connection.rb:309:in `url_prefix='
2014-10-06 21:43:31 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/faraday-0.9.0/lib/faraday/connection.rb:77:in `initialize'
2014-10-06 21:43:31 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-transport-0.4.11/lib/elasticsearch/transport/transport/http/faraday.rb:42:in `new'
2014-10-06 21:43:31 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-transport-0.4.11/lib/elasticsearch/transport/transport/http/faraday.rb:42:in `block in __build_connections'
2014-10-06 21:43:31 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-transport-0.4.11/lib/elasticsearch/transport/transport/http/faraday.rb:35:in `map'
2014-10-06 21:43:31 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-transport-0.4.11/lib/elasticsearch/transport/transport/http/faraday.rb:35:in `__build_connections'
2014-10-06 21:43:31 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-transport-0.4.11/lib/elasticsearch/transport/transport/base.rb:32:in `initialize'
2014-10-06 21:43:31 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.5.0/lib/fluent/plugin/out_elasticsearch.rb:47:in `new'
2014-10-06 21:43:31 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.5.0/lib/fluent/plugin/out_elasticsearch.rb:47:in `client'
2014-10-06 21:43:31 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.5.0/lib/fluent/plugin/out_elasticsearch.rb:161:in `send'
2014-10-06 21:43:31 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.5.0/lib/fluent/plugin/out_elasticsearch.rb:154:in `write'
2014-10-06 21:43:31 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.50/lib/fluent/buffer.rb:296:in `write_chunk'
2014-10-06 21:43:31 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.50/lib/fluent/buffer.rb:276:in `pop'
2014-10-06 21:43:31 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.50/lib/fluent/output.rb:310:in `try_flush'
2014-10-06 21:43:31 +0000 [warn]: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.50/lib/fluent/output.rb:132:in `run'

Any ideas?
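The trace suggests the client ends up building a URL like `http://:9200MYHOST1`, i.e. the comma-separated `hosts` value and the `port` get glued together before URI parsing — though that it happens exactly this way is only an inference from the error message. The parse failure itself is easy to reproduce:

```ruby
require 'uri'

# A non-numeric "port" makes the authority component invalid, so URI
# rejects the URL. This only reproduces the parse error from the trace;
# how the plugin builds the string is a guess.
err = begin
  URI.parse('http://:9200MYHOST1')
  nil
rescue URI::InvalidURIError => e
  e
end
puts err.class  # prints: URI::InvalidURIError
```

As a workaround, specifying each host with its own port (`hosts MYHOST1:9200,MYHOST2:9200`) and dropping the separate `port` line may avoid the bad concatenation.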

Connection timed out

2014-04-23 09:45:27 +0800 [warn]: before_shutdown failed error="Connection timed out - connect(2)"
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/1.9.1/net/http.rb:763:in `initialize'
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/1.9.1/net/http.rb:763:in `open'
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/1.9.1/net/http.rb:763:in `block in connect'
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/1.9.1/timeout.rb:55:in `timeout'
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/1.9.1/timeout.rb:100:in `timeout'
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/1.9.1/net/http.rb:763:in `connect'
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/1.9.1/net/http.rb:756:in `do_start'
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/1.9.1/net/http.rb:745:in `start'
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/1.9.1/net/http.rb:1285:in `request'
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/faraday-0.9.0/lib/faraday/adapter/net_http.rb:80:in `perform_request'
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/faraday-0.9.0/lib/faraday/adapter/net_http.rb:39:in `call'
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/faraday-0.9.0/lib/faraday/request/url_encoded.rb:15:in `call'
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/faraday-0.9.0/lib/faraday/rack_builder.rb:139:in `build_response'
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/faraday-0.9.0/lib/faraday/connection.rb:377:in `run_request'
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-transport-0.4.11/lib/elasticsearch/transport/transport/http/faraday.rb:21:in `block in perform_request'
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-transport-0.4.11/lib/elasticsearch/transport/transport/base.rb:187:in `call'
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-transport-0.4.11/lib/elasticsearch/transport/transport/base.rb:187:in `perform_request'
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-transport-0.4.11/lib/elasticsearch/transport/transport/http/faraday.rb:20:in `perform_request'
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-transport-0.4.11/lib/elasticsearch/transport/client.rb:92:in `perform_request'
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/elasticsearch-api-0.4.11/lib/elasticsearch/api/actions/bulk.rb:81:in `bulk'
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.3.0/lib/fluent/plugin/out_elasticsearch.rb:96:in `send'
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-elasticsearch-0.3.0/lib/fluent/plugin/out_elasticsearch.rb:91:in `write'
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.45/lib/fluent/buffer.rb:296:in `write_chunk'
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.45/lib/fluent/buffer.rb:276:in `pop'
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.45/lib/fluent/plugin/buf_memory.rb:87:in `block in before_shutdown'
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/1.9.1/monitor.rb:211:in `mon_synchronize'
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.45/lib/fluent/plugin/buf_memory.rb:83:in `before_shutdown'
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.45/lib/fluent/output.rb:382:in `before_shutdown'
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.45/lib/fluent/output.rb:152:in `block in run'
  2014-04-23 09:45:27 +0800 [warn]: <internal:prelude>:10:in `synchronize'
  2014-04-23 09:45:27 +0800 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.45/lib/fluent/output.rb:151:in `run'
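When before_shutdown fails with a connect timeout like this, the first thing worth ruling out is plain network reachability from the fluentd host to Elasticsearch. A small helper for that check — the host and port in the last line are placeholders:

```ruby
require 'net/http'

# Returns true if an HTTP GET to host:port answers within the timeout,
# false on refusal, timeout, unreachable host, or DNS failure.
def es_reachable?(host, port, timeout = 5)
  Net::HTTP.start(host, port, open_timeout: timeout, read_timeout: timeout) do |http|
    http.get('/').is_a?(Net::HTTPResponse)
  end
rescue Errno::ECONNREFUSED, Errno::ETIMEDOUT, Errno::EHOSTUNREACH,
       Net::OpenTimeout, Net::ReadTimeout, SocketError
  false
end

puts es_reachable?('localhost', 9200)
```

If this returns false from the fluentd machine, the problem is firewalling or routing rather than the plugin.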
