
logstash-filter-fix_protocol's Introduction

FIX Protocol Logstash Filter

A LogStash filter plugin for FIX Message parsing

Given a FIX log file that looks like this:

2015-08-26 23:08:38,096 FIX.4.2:DUMMY_INC->ANOTHER_INC: 8=FIX.4.2�9=184�35=F�34=2�49=ANOTHER_INC�50=DefaultSenderSubID�52=20150826-23:08:38.094�56=DUMMY_INC�1=DefaultAccount�11=clordid_of_cancel�41=151012569�54=1�55=ITER�60=20250407-13:14:15�167=FUT�200=201512�10=147�
2015-08-31 17:48:20,890 FIXT.1.1:DUMMY_INC->ANOTHER_INC: 8=FIXT.1.1�9=140�35=W�34=2�49=DUMMY_INC�52=20150831-17:48:20.890�56=ANOTHER_INC�22=99�48=.AQUA-W�262=golden_path_test�268=1�269=3�270=640754�272=20150831�273=17:48:20.882�10=070�
2015-08-31 20:48:26,536 FIXT.1.1:DUMMY_INC->ANOTHER_INC: 8=FIXT.1.1�9=189�35=W�34=5�49=DUMMY_INC�52=20150831-20:48:26.535�56=ANOTHER_INC�22=99�48=ITRZ21�262=req_A�268=2�269=0�270=0.01005�271=10�272=20150831�273=20:48:26.514�269=1�270=0.0101�271=2�272=20150831�273=20:48:26.514�10=123�

The FIX Message filter plugin can read the FIX log as an input and turn it into something like this:

[Screenshot: the parsed FIX message rendered as structured Logstash fields]

Installation

$ /opt/logstash/bin/plugin install logstash-filter-fix_protocol

Plugin Configuration

Setting                  Input type        Required  Default Value
fix_message              string/variable   Yes       "message"
data_dictionary_path     string            Yes       "/PATH/TO/YOUR/DD"
session_dictionary_path  string            No        nil

fix_message

  • value type is a string
  • required

Should be the actual FIX message string passed to the filter. You will typically need a separate filter, such as grok, to parse each log line and extract the FIX string into its own field (see the sample configs below).

data_dictionary_path

  • value type is a string
  • required

Should be the absolute path to your data dictionary xml file.

session_dictionary_path

  • value type is a string
  • Not required

Should be the absolute path to your session dictionary xml file for FIX versions >= 5.0. Note that if you do not set this but are using FIX 5.0, the filter will still work, but admin messages won't be parsed correctly and you'll lose data: the filter ignores key-value pairs that it can't parse.

Sample Config File

Note: For FIX < 5.0, simply omit the session_dictionary_path.

input {
  file {
    path => "/PATH/TO/YOUR/FIX-MESSAGE.log"
    start_position => "beginning"
  }
}
filter {
  grok {
    match => ["message","%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:fix_session}: %{GREEDYDATA:fix_string}"]
  }
  fix_protocol {
    fix_message => fix_string
    session_dictionary_path => "/PATH/TO/FIX/5.0/SESSION/DICTIONARY/FIX.xml"
    data_dictionary_path => "/PATH/TO/FIX/5.0/DATA/DICTIONARY/FIX.xml"
  }
}
output {
  stdout { codec => rubydebug }
}

Sample Config File For Multiple FIX Versions

input {
  file {
    path => "/path/to/fix.log"
    start_position => "beginning"
  }
}
filter {
  grok {
    match => ["message","%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:fix_session}: %{GREEDYDATA:fix_string}"]
  }
  if [message] =~ "=FIX.4.2" {
    fix_protocol {
      fix_message => fix_string
      data_dictionary_path => "/path/to/datadict/FIX42.xml"
    }
  }
  if [message] =~ "=FIX.4.4" {
    fix_protocol {
      fix_message => fix_string
      data_dictionary_path => "/path/to/datadict/FIX44.xml"
    }
  }
  if [message] =~ "=FIX.5.0" {
    fix_protocol {
      fix_message => fix_string
      data_dictionary_path => "/path/to/datadict/FIX50.xml"
    }
  }
}
output {
  stdout { codec => rubydebug }
}


Notice that we're using the grok filter to extract a fix_string field from a hypothetical FIX message log file, then passing that field to our filter as fix_message. You can see this behavior emulated in our specs.
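For illustration only, here is a plain-Ruby stand-in for that grok pattern applied to an abbreviated version of the first sample line above (the regex is a hand-rolled approximation of %{TIMESTAMP_ISO8601}/%{GREEDYDATA}, and \x01 stands in for the SOH delimiter; it is not how the plugin itself works):

  # Plain Ruby, not Grok: roughly how the pattern splits a log line into
  # timestamp, fix_session and fix_string (payload abbreviated, \x01 = SOH).
  line = "2015-08-26 23:08:38,096 FIX.4.2:DUMMY_INC->ANOTHER_INC: " \
         "8=FIX.4.2\x019=184\x0135=F\x0134=2\x0149=ANOTHER_INC\x0156=DUMMY_INC\x0110=147\x01"

  timestamp, fix_session, fix_string = line.match(/\A(\S+ \S+) (\S+): (.+)\z/m).captures

  puts timestamp    # => 2015-08-26 23:08:38,096
  puts fix_session  # => FIX.4.2:DUMMY_INC->ANOTHER_INC
  puts fix_string   # => the SOH-delimited payload, 8=FIX.4.2...10=147 (SOH bytes are non-printing)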

Development Environment

To get set up quickly, we recommend using Vagrant with the Ansible provisioning available in this source repository.

Setup with Vagrant

With Vagrant installed, run:

vagrant up

Manual Setup (OSX)

  • rvm install jruby
  • rvm use jruby
  • bundle install
  • brew install logstash

Then, run bin/console for an interactive prompt that will allow you to experiment.

To release a new version, update the version number in logstash-filter-fix_protocol.gemspec, and then run bundle exec rake release to create a git tag for the version, push git commits and tags, and push the .gem file to rubygems.org.

Note: If you get an error message about metadata, you'll need to update RubyGems to 2.0 or newer: run gem update --system

Running Tests

$ ./bin/rspec spec

Logstash 2.x vs 5.x

Remove any installed versions of logstash and install your desired version.

After you've completed the 'Manual change' or 'Ansible provisioning change' below, follow the instructions under 'Development Logstash Installation'.

Manual change:

Change the version number in lib/logstash/filters/version.rb

module Logstash
  VERSION = '2.x'
end

Ansible provisioning change:

Change the version number in provision/group_vars/all.yml

logstash_version: 5.x # -> 2.x

Run vagrant provision:

vagrant provision

Development Logstash Installation

  1. Add the filter to your installation of LogStash

    # /opt/logstash/Gemfile
    #...
    gem "logstash-output-kafka"
    gem "logstash-input-http_poller"
    gem "logstash-filter-fix_protocol", :path => "/PATH/TO/YOUR/FORK"
  2. Install the filter plugin

    $ /opt/logstash/bin/plugin install --no-verify
    
  3. Start your Logstash installation with a LogStash configuration file.

    $ /opt/logstash/bin/logstash -f /PATH/TO/logstash.conf
    

Contributing

Contributions are welcome! Please see the Contribution Guidelines for details.

Connamara Systems

FIX Message Logstash Filter is maintained and funded by Connamara Systems, llc.

The names and logos for Connamara Systems are trademarks of Connamara Systems, llc.

Licensing

FIX Message Logstash Filter is Copyright © 2016 Connamara Systems, llc.

This software is available under the Apache license and a commercial license. Please see the LICENSE file for the terms specified by the Apache license. The commercial license offers more flexible licensing terms compared to the Apache license, and includes support services. Contact us for more information on the Connamara commercial license, what it enables, and how you can start commercial development with it.

This product includes software developed by quickfixengine.org (http://www.quickfixengine.org/). Please see the QuickFIX Software LICENSE for the terms specified by the QuickFIX Software License.

logstash-filter-fix_protocol's People

Contributors

cbusbey, daino3, kkozel, kmalyar-tic


logstash-filter-fix_protocol's Issues

Support for multiple Fix protocols

Hi,

I'm relatively new to FIX log parsing. I was asked to look into it. We get messages in 4.2/4.4/5.0. I have the data dictionaries, but I don't want to just hack up an XML document.

Is there a way to set up filters in logstash to support each type?

Add configuration for adding FIX tag to output hash keys

current:

{
  "BeginString" => "FIX.4.2"
  "MsgType" => "NewOrderSingle"
}

could add a configuration to add tag numbers:

  # config
  fix_protocol {
    fix_message => fix_string
    include_tags => true # new config
    session_dictionary_path => "/vagrant/spec/fixtures/FIXT11.xml"
    data_dictionary_path => "/vagrant/spec/fixtures/FIX50SP1.xml"
  }

Which would generate new hash keys:

{
  "BeginString (8)" => "FIX.4.2"
  "MsgType (35)" => "NewOrderSingle"
}
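A minimal plain-Ruby sketch of what the proposed include_tags option could produce (the TAG_NUMBERS lookup here is illustrative only; in the plugin the tag-to-name mapping would come from the data dictionary):

  # Sketch of the proposed include_tags behavior; the tag lookup table is
  # illustrative only -- the real mapping would come from the data dictionary.
  TAG_NUMBERS = { "BeginString" => 8, "MsgType" => 35 }.freeze

  def with_tag_numbers(fields)
    fields.each_with_object({}) do |(name, value), out|
      tag = TAG_NUMBERS[name]
      out[tag ? "#{name} (#{tag})" : name] = value
    end
  end

  p with_tag_numbers("BeginString" => "FIX.4.2", "MsgType" => "NewOrderSingle")
  # => {"BeginString (8)"=>"FIX.4.2", "MsgType (35)"=>"NewOrderSingle"}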

Seeing FIX parse failure on OrderCancelRejects

8=FIX.4.2|9=189|35=9|34=65|49=KOD|52=20160607-16:37:28.909|56=KFK|57=4444|1=kozel|11=1_1465317407913511019_c|37=NONE|39=4|41=1_1465317407913511019|58=Too late to cancel (1_1465317407913511019)|102=0|434=1|10=185|

vagrant up fails

Note the part about ant not being installed. I tried installing ant, but it failed for other reasons.

TASK: [rvm_io.rvm1-ruby | Install rubies] ************************************* 
failed: [default] => (item={'cmd': ['/home/vagrant/.rvm/bin/rvm', 'jruby-1.7.4', 'do', 'true'], 'end': '2016-02-26 16:44:22.054822', 'stderr': 'Unknown ruby string (do not know how to handle): jruby-1.7.4.\nRuby jruby-1.7.4 is not installed.', 'stdout': u'', 'item': u'jruby-1.7.4', 'changed': False, 'rc': 2, 'failed': False, 'warnings': [], 'delta': '0:00:00.379883', 'invocation': {'module_name': u'command', 'module_args': u'/home/vagrant/.rvm/bin/rvm jruby-1.7.4 do true'}, 'stdout_lines': [], 'failed_when_result': False, 'start': '2016-02-26 16:44:21.674939'}) => {"changed": true, "cmd": ["/home/vagrant/.rvm/bin/rvm", "install", "jruby-1.7.4"], "delta": "0:03:09.566558", "end": "2016-02-26 16:47:32.185184", "item": {"changed": false, "cmd": ["/home/vagrant/.rvm/bin/rvm", "jruby-1.7.4", "do", "true"], "delta": "0:00:00.379883", "end": "2016-02-26 16:44:22.054822", "failed": false, "failed_when_result": false, "invocation": {"module_args": "/home/vagrant/.rvm/bin/rvm jruby-1.7.4 do true", "module_name": "command"}, "item": "jruby-1.7.4", "rc": 2, "start": "2016-02-26 16:44:21.674939", "stderr": "Unknown ruby string (do not know how to handle): jruby-1.7.4.\nRuby jruby-1.7.4 is not installed.", "stdout": "", "stdout_lines": [], "warnings": []}, "rc": 127, "start": "2016-02-26 16:44:22.618626", "warnings": []}
stderr: Unknown ruby string (do not know how to handle): jruby-1.7.4.
Unknown ruby string (do not know how to handle): jruby-1.7.4.
Unknown ruby string (do not know how to handle): jruby-1.7.4.
Unknown ruby string (do not know how to handle): jruby-1.7.4.
Unknown ruby string (do not know how to handle): jruby-1.7.4.
Unknown ruby string (do not know how to handle): jruby-1.7.4.
Unknown ruby string (do not know how to handle): jruby-1.7.4.
Unknown ruby string (do not know how to handle): jruby-1.7.4.
Unknown ruby string (do not know how to handle): jruby-1.7.4.
Unknown ruby string (do not know how to handle): jruby-1.7.4.
No binary rubies available for: centos/7/x86_64/jruby-1.7.4.
Continuing with compilation. Please read 'rvm help mount' to get more information on binary rubies.
$JAVA_HOME was empty, setting up JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.71-2.b15.el7_2.x86_64, if it fails try setting JAVA_HOME to something sane and try again.
Unknown ruby string (do not know how to handle): jruby-1.7.4.
From git://github.com/jruby/jruby
 * branch            master     -> FETCH_HEAD
Error running '__rvm_ant jar',
showing last 15 lines of /home/vagrant/.rvm/log/1456505116_jruby-1.7.4/ant.jar.log
[2016-02-26 16:47:32] __rvm_ant
__rvm_ant () 
{ 
    \ant "$@" || return $?
}
current path: /home/vagrant/.rvm/src/jruby-1.7.4
PATH=/sbin:/bin:/usr/sbin:/usr/bin:/home/vagrant/.rvm/bin
command(2): __rvm_ant jar
++ ant jar
/home/vagrant/.rvm/scripts/functions/support: line 383: ant: command not found
++ return 127
stdout: Searching for binary rubies, this might take some time.
Checking requirements for centos.
Installing requirements for centos.
Installing required packages: patch.......
Requirements installation successful.
Cloning from git://github.com/jruby/jruby.git, this may take a while depending on your connection.
HEAD is now at 35323f6 [Truffle] Fix bad usage of capacity.
Already up-to-date.
git checkout 1.7.4
Copying from repo to src path...
jruby-1.7.4 - #ant jar.

FATAL: all hosts have already failed -- aborting

PLAY RECAP ******************************************************************** 
           to retry, use: --limit @/Users/chris/main.retry

default                    : ok=20   changed=14   unreachable=0    failed=1   

Ansible failed to complete successfully. Any error output should be
visible above. Please fix these errors and try again.

filter crashes on malformed FIX messages

Given the following malformed FIX message:

8=FIX.4.09=5235=034=24garbled9=TW52=20160302-18:29:1056=ISLD10=0

Filter crashes with

Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash. {"exception"=>#<ArgumentError: Constructor invocation failed: Bad tag format: For input string: "4garbled9" in 8=FIX.4.09=5235=034=24garbled9=TW52=20160302-18:29:1056=ISLD10=0>, "backtrace"=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.1.1/lib/logstash/filters/fix_message.rb:17:in `initialize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.1.1/lib/logstash/filters/fix_protocol.rb:38:in `filter'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/filters/base.rb:151:in `multi_filter'", "org/jruby/RubyArray.java:1613:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/filters/base.rb:148:in `multi_filter'", "(eval):67:in `filter_func'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/pipeline.rb:256:in `filter_batch'", "org/jruby/RubyArray.java:1613:in `each'", "org/jruby/RubyEnumerable.java:852:in `inject'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/pipeline.rb:254:in `filter_batch'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/pipeline.rb:212:in `worker_loop'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/pipeline.rb:190:in `start_workers'"], :level=>:error}
ArgumentError: Constructor invocation failed: Bad tag format: For input string: "4garbled9" in 8=FIX.4.09=5235=034=24garbled9=TW52=20160302-18:29:1056=ISLD10=0
     initialize at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.1.1/lib/logstash/filters/fix_message.rb:17
         filter at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.1.1/lib/logstash/filters/fix_protocol.rb:38
   multi_filter at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/filters/base.rb:151
           each at org/jruby/RubyArray.java:1613
   multi_filter at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/filters/base.rb:148
    filter_func at (eval):67
   filter_batch at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/pipeline.rb:256
           each at org/jruby/RubyArray.java:1613
         inject at org/jruby/RubyEnumerable.java:852
   filter_batch at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/pipeline.rb:254
    worker_loop at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/pipeline.rb:212
  start_workers at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/pipeline.rb:190

Example message comes out of FIX acceptance tests for quickfix/n/j/go
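For context, the filter already tags some unparseable events with _fix_parse_failure (see the parse-failure report further down this page). Below is a standalone sketch of that degrade-gracefully pattern, with a toy parser standing in for quickfix and a plain hash standing in for the Logstash event; it is not the plugin's actual code:

  # Standalone sketch (not the plugin's code): a toy parser raises on a bad tag,
  # and the caller tags the event instead of letting the exception escape.
  module ToyFixParser
    def self.parse(fix_string)
      fix_string.split("\x01").map { |pair|
        tag, value = pair.split("=", 2)
        raise ArgumentError, "Bad tag format: #{tag.inspect}" unless tag =~ /\A\d+\z/
        [tag, value]
      }.to_h
    end
  end

  event = { "fix_string" => "8=FIX.4.0\x019=52\x0135=0\x01garbled9=TW\x0110=0\x01" }
  begin
    event.merge!(ToyFixParser.parse(event["fix_string"]))
  rescue ArgumentError => e
    (event["tags"] ||= []) << "_fix_parse_failure"
    warn "Could not parse FIX message: #{e.message}"
  end
  p event["tags"]   # => ["_fix_parse_failure"]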

Unclear filter logging

Get the following running logstash on log produced from

https://github.com/quickfixgo/quickfix/blob/master/_test/definitions/server/fix40/14a_BadField.def

Not really clear what the WARNING message is about

Logstash startup completed
********
WARNING: Could not correctly parse 
{
  "BeginString": "FIX.4.0",
  "BodyLength": 52,
  "MsgSeqNum": 2,
  "MsgType": "Heartbeat",
  "SenderCompID": "TW",
  "SendingTime": "20160303-17:46:32",
  "TargetCompID": "ISLD",
  "[#<LogStash::Filters::DataDictionary:0x2fbff2ab @file=#<File:/home/vagrant/src/github.com/quickfixgo/quickfix/spec/FIX40.xml>>, #<LogStash::Filters::DataDictionary:0x2fbff2ab @file=#<File:/home/vagrant/src/github.com/quickfixgo/quickfix/spec/FIX40.xml>>]": "HI",
  "CheckSum": "089"
}
Message: undefined method `start_with?' for #<Array:0x59d3104f>
********
********
WARNING: Could not correctly parse 
{
  "BeginString": "FIX.4.0",
  "BodyLength": 50,
  "MsgSeqNum": 3,
  "MsgType": "Heartbeat",
  "SenderCompID": "TW",
  "SendingTime": "20160303-17:46:32",
  "TargetCompID": "ISLD",
  "[#<LogStash::Filters::DataDictionary:0x2fbff2ab @file=#<File:/home/vagrant/src/github.com/quickfixgo/quickfix/spec/FIX40.xml>>, #<LogStash::Filters::DataDictionary:0x2fbff2ab @file=#<File:/home/vagrant/src/github.com/quickfixgo/quickfix/spec/FIX40.xml>>]": "HI",
  "CheckSum": "221"
}
Message: undefined method `start_with?' for #<Array:0x1af68016>
********
********
WARNING: Could not correctly parse 
{
  "BeginString": "FIX.4.0",
  "BodyLength": 51,
  "MsgSeqNum": 4,
  "MsgType": "Heartbeat",
  "SenderCompID": "TW",
  "SendingTime": "20160303-17:46:32",
  "TargetCompID": "ISLD",
  "[#<LogStash::Filters::DataDictionary:0x2fbff2ab @file=#<File:/home/vagrant/src/github.com/quickfixgo/quickfix/spec/FIX40.xml>>, #<LogStash::Filters::DataDictionary:0x2fbff2ab @file=#<File:/home/vagrant/src/github.com/quickfixgo/quickfix/spec/FIX40.xml>>]": "HI",
  "CheckSum": "013"
}
Message: undefined method `start_with?' for #<Array:0x138963e8>
********
********
WARNING: Could not correctly parse 
{
  "BeginString": "FIX.4.0",
  "BodyLength": 53,
  "MsgSeqNum": 5,
  "MsgType": "Heartbeat",
  "SenderCompID": "TW",
  "SendingTime": "20160303-17:46:32",
  "TargetCompID": "ISLD",
  "[#<LogStash::Filters::DataDictionary:0x2fbff2ab @file=#<File:/home/vagrant/src/github.com/quickfixgo/quickfix/spec/FIX40.xml>>, #<LogStash::Filters::DataDictionary:0x2fbff2ab @file=#<File:/home/vagrant/src/github.com/quickfixgo/quickfix/spec/FIX40.xml>>]": "HI",
  "CheckSum": "119"
}
Message: undefined method `start_with?' for #<Array:0x76eb4c83>
********
{
          "message" => "8=FIX.4.0\u00019=56\u000135=A\u000134=1\u000149=TW\u000152=20160303-17:46:32\u000156=ISLD\u000198=0\u0001108=2\u000110=219\u0001",
         "@version" => "1",
       "@timestamp" => "2016-03-03T17:46:38.305Z",
             "path" => "/home/vagrant/src/github.com/quickfixgo/quickfix/_test/tmp/FIX.4.0-ISLD-TW.messages.current.log",
             "host" => "localhost.localdomain",
             "type" => "fix40",
      "BeginString" => "FIX.4.0",
       "BodyLength" => 56,
        "MsgSeqNum" => 1,
          "MsgType" => "Logon",
     "SenderCompID" => "TW",
      "SendingTime" => "20160303-17:46:32",
     "TargetCompID" => "ISLD",
    "EncryptMethod" => 0,
       "HeartBtInt" => 2,
         "CheckSum" => "219"
}
{
          "message" => "8=FIX.4.0\u00019=56\u000135=A\u000134=1\u000149=ISLD\u000152=20160303-17:46:32\u000156=TW\u000198=0\u0001108=2\u000110=219\u0001",
         "@version" => "1",
       "@timestamp" => "2016-03-03T17:46:38.306Z",
             "path" => "/home/vagrant/src/github.com/quickfixgo/quickfix/_test/tmp/FIX.4.0-ISLD-TW.messages.current.log",
             "host" => "localhost.localdomain",
             "type" => "fix40",
      "BeginString" => "FIX.4.0",
       "BodyLength" => 56,
        "MsgSeqNum" => 1,
          "MsgType" => "Logon",
     "SenderCompID" => "ISLD",
      "SendingTime" => "20160303-17:46:32",
     "TargetCompID" => "TW",
    "EncryptMethod" => 0,
       "HeartBtInt" => 2,
         "CheckSum" => "219"
}
{
         "message" => "8=FIX.4.0\u00019=52\u000135=0\u000134=2\u000149=TW\u000152=20160303-17:46:32\u000156=ISLD\u0001999=HI\u000110=089\u0001",
        "@version" => "1",
      "@timestamp" => "2016-03-03T17:46:38.321Z",
            "path" => "/home/vagrant/src/github.com/quickfixgo/quickfix/_test/tmp/FIX.4.0-ISLD-TW.messages.current.log",
            "host" => "localhost.localdomain",
            "type" => "fix40",
     "BeginString" => "FIX.4.0",
      "BodyLength" => 52,
       "MsgSeqNum" => 2,
         "MsgType" => "Heartbeat",
    "SenderCompID" => "TW",
     "SendingTime" => "20160303-17:46:32",
    "TargetCompID" => "ISLD",
        "CheckSum" => "089"
}
{
         "message" => "8=FIX.4.0\u00019=78\u000135=3\u000134=2\u000149=ISLD\u000152=20160303-17:46:32\u000156=TW\u000145=2\u000158=Invalid tag number (999)\u000110=086\u0001",
        "@version" => "1",
      "@timestamp" => "2016-03-03T17:46:38.321Z",
            "path" => "/home/vagrant/src/github.com/quickfixgo/quickfix/_test/tmp/FIX.4.0-ISLD-TW.messages.current.log",
            "host" => "localhost.localdomain",
            "type" => "fix40",
     "BeginString" => "FIX.4.0",
      "BodyLength" => 78,
       "MsgSeqNum" => 2,
         "MsgType" => "Reject",
    "SenderCompID" => "ISLD",
     "SendingTime" => "20160303-17:46:32",
    "TargetCompID" => "TW",
       "RefSeqNum" => 2,
            "Text" => "Invalid tag number (999)",
        "CheckSum" => "086"
}
{
         "message" => "8=FIX.4.0\u00019=50\u000135=0\u000134=3\u000149=TW\u000152=20160303-17:46:32\u000156=ISLD\u00010=HI\u000110=221\u0001",
        "@version" => "1",
      "@timestamp" => "2016-03-03T17:46:38.323Z",
            "path" => "/home/vagrant/src/github.com/quickfixgo/quickfix/_test/tmp/FIX.4.0-ISLD-TW.messages.current.log",
            "host" => "localhost.localdomain",
            "type" => "fix40",
     "BeginString" => "FIX.4.0",
      "BodyLength" => 50,
       "MsgSeqNum" => 3,
         "MsgType" => "Heartbeat",
    "SenderCompID" => "TW",
     "SendingTime" => "20160303-17:46:32",
    "TargetCompID" => "ISLD",
        "CheckSum" => "221"
}
{
         "message" => "8=FIX.4.0\u00019=76\u000135=3\u000134=3\u000149=ISLD\u000152=20160303-17:46:32\u000156=TW\u000145=3\u000158=Invalid tag number (0)\u000110=219\u0001",
        "@version" => "1",
      "@timestamp" => "2016-03-03T17:46:38.323Z",
            "path" => "/home/vagrant/src/github.com/quickfixgo/quickfix/_test/tmp/FIX.4.0-ISLD-TW.messages.current.log",
            "host" => "localhost.localdomain",
            "type" => "fix40",
     "BeginString" => "FIX.4.0",
      "BodyLength" => 76,
       "MsgSeqNum" => 3,
         "MsgType" => "Reject",
    "SenderCompID" => "ISLD",
     "SendingTime" => "20160303-17:46:32",
    "TargetCompID" => "TW",
       "RefSeqNum" => 3,
            "Text" => "Invalid tag number (0)",
        "CheckSum" => "219"
}
{
         "message" => "8=FIX.4.0\u00019=51\u000135=0\u000134=4\u000149=TW\u000152=20160303-17:46:32\u000156=ISLD\u0001-1=HI\u000110=013\u0001",
        "@version" => "1",
      "@timestamp" => "2016-03-03T17:46:38.324Z",
            "path" => "/home/vagrant/src/github.com/quickfixgo/quickfix/_test/tmp/FIX.4.0-ISLD-TW.messages.current.log",
            "host" => "localhost.localdomain",
            "type" => "fix40",
     "BeginString" => "FIX.4.0",
      "BodyLength" => 51,
       "MsgSeqNum" => 4,
         "MsgType" => "Heartbeat",
    "SenderCompID" => "TW",
     "SendingTime" => "20160303-17:46:32",
    "TargetCompID" => "ISLD",
        "CheckSum" => "013"
}
{
         "message" => "8=FIX.4.0\u00019=77\u000135=3\u000134=4\u000149=ISLD\u000152=20160303-17:46:32\u000156=TW\u000145=4\u000158=Invalid tag number (-1)\u000110=012\u0001",
        "@version" => "1",
      "@timestamp" => "2016-03-03T17:46:38.324Z",
            "path" => "/home/vagrant/src/github.com/quickfixgo/quickfix/_test/tmp/FIX.4.0-ISLD-TW.messages.current.log",
            "host" => "localhost.localdomain",
            "type" => "fix40",
     "BeginString" => "FIX.4.0",
      "BodyLength" => 77,
       "MsgSeqNum" => 4,
         "MsgType" => "Reject",
    "SenderCompID" => "ISLD",
     "SendingTime" => "20160303-17:46:32",
    "TargetCompID" => "TW",
       "RefSeqNum" => 4,
            "Text" => "Invalid tag number (-1)",
        "CheckSum" => "012"
}
{
         "message" => "8=FIX.4.0\u00019=53\u000135=0\u000134=5\u000149=TW\u000152=20160303-17:46:32\u000156=ISLD\u00015000=HI\u000110=119\u0001",
        "@version" => "1",
      "@timestamp" => "2016-03-03T17:46:38.325Z",
            "path" => "/home/vagrant/src/github.com/quickfixgo/quickfix/_test/tmp/FIX.4.0-ISLD-TW.messages.current.log",
            "host" => "localhost.localdomain",
            "type" => "fix40",
     "BeginString" => "FIX.4.0",
      "BodyLength" => 53,
       "MsgSeqNum" => 5,
         "MsgType" => "Heartbeat",
    "SenderCompID" => "TW",
     "SendingTime" => "20160303-17:46:32",
    "TargetCompID" => "ISLD",
        "CheckSum" => "119"
}
{
         "message" => "8=FIX.4.0\u00019=79\u000135=3\u000134=5\u000149=ISLD\u000152=20160303-17:46:32\u000156=TW\u000145=5\u000158=Invalid tag number (5000)\u000110=119\u0001",
        "@version" => "1",
      "@timestamp" => "2016-03-03T17:46:38.325Z",
            "path" => "/home/vagrant/src/github.com/quickfixgo/quickfix/_test/tmp/FIX.4.0-ISLD-TW.messages.current.log",
            "host" => "localhost.localdomain",
            "type" => "fix40",
     "BeginString" => "FIX.4.0",
      "BodyLength" => 79,
       "MsgSeqNum" => 5,
         "MsgType" => "Reject",
    "SenderCompID" => "ISLD",
     "SendingTime" => "20160303-17:46:32",
    "TargetCompID" => "TW",
       "RefSeqNum" => 5,
            "Text" => "Invalid tag number (5000)",
        "CheckSum" => "119"
}
{
         "message" => "8=FIX.4.0\u00019=45\u000135=5\u000134=6\u000149=TW\u000152=20160303-17:46:32\u000156=ISLD\u000110=234\u0001",
        "@version" => "1",
      "@timestamp" => "2016-03-03T17:46:38.327Z",
            "path" => "/home/vagrant/src/github.com/quickfixgo/quickfix/_test/tmp/FIX.4.0-ISLD-TW.messages.current.log",
            "host" => "localhost.localdomain",
            "type" => "fix40",
     "BeginString" => "FIX.4.0",
      "BodyLength" => 45,
       "MsgSeqNum" => 6,
         "MsgType" => "Logout",
    "SenderCompID" => "TW",
     "SendingTime" => "20160303-17:46:32",
    "TargetCompID" => "ISLD",
        "CheckSum" => "234"
}
{
         "message" => "8=FIX.4.0\u00019=45\u000135=5\u000134=6\u000149=ISLD\u000152=20160303-17:46:32\u000156=TW\u000110=234\u0001",
        "@version" => "1",
      "@timestamp" => "2016-03-03T17:46:38.327Z",
            "path" => "/home/vagrant/src/github.com/quickfixgo/quickfix/_test/tmp/FIX.4.0-ISLD-TW.messages.current.log",
            "host" => "localhost.localdomain",
            "type" => "fix40",
     "BeginString" => "FIX.4.0",
      "BodyLength" => 45,
       "MsgSeqNum" => 6,
         "MsgType" => "Logout",
    "SenderCompID" => "ISLD",
     "SendingTime" => "20160303-17:46:32",
    "TargetCompID" => "TW",
        "CheckSum" => "234"
}

Cannot Start Logstash w/ Fix Protocol Filter

Followed installation instructions and results in inability to load logstash config:

[[email protected] ~]$ tail -f /var/log/logstash/logstash.log
{:timestamp=>"2016-04-04T08:20:29.668000-0700", :message=>"fetched an invalid config", :config=>"input {\n file {\n path => "/motif/logs/tradeexecution.console.log"\n start_position => "beginning"\n }\n}\nfilter {\n grok {\n match => ["message","%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:fix_session}: %{GREEDYDATA:fix_string}"]\n }\n fix_protocol {\n fix_message => fix_string\n }\n}\noutput {\n amazon_es {\n hosts => ["search-fix-logs-ht5lnwidn2x4e6g7czxxqdbrw4.us-east-1.es.amazonaws.com"]\n region => "us-east-1"\n index => "uat-fix-logs-%{+YYYY.MM.dd}"\n\t aws_access_key_id => 'ACCESS_KEY'\n aws_secret_access_key => 'SECRET_KEY'\n }\n}\n\n", :reason=>"Couldn't find any filter plugin named 'fix_protocol'. Are you sure this is correct? Trying to load the fix_protocol filter plugin resulted in this error: uninitialized constant ActiveSupport::Autoload", :level=>:error}

Parsing 552 <NoSides> incorrectly

Hi @daino3 ,

So we are still chugging away. We have encountered an issue. It looks like for tag 552 it is pulling a bunch of inappropriate tags into a nested group when it shouldn't

eg:

http://www.quickfixengine.org/FIX44.xml
[screenshot: the NoSides (552) group definition from FIX44.xml]

But in the actual parsing of a message:
[screenshot: the parsed message, with unrelated tags nested under NoSides]

the actual fix message that caused this:
"8=FIX.4.4\u00019=458\u000135=AE\u000134=1613\u000149=blablabla-blablabla\u000156=blablabla-blablabla\u000152=20170831-18:56:45.294\u0001571=LSZUFPPS49391601999999795\u0001568=tcr-2017-08-30-17-05-23\u0001150=F\u000139=2\u000117=blablabla\u0001570=N\u000155=USD/MXN\u0001167=FOR\u000138=6000000.000000\u00016054=106931160.000000\u000144=17.821860\u000132=blablabla.000000\u000131=99.99\u0001194=99.99\u0001195=0.0000\u000175=20170831\u00016215=SP\u000160=20170831-18:56:45.294\u0001552=1\u000154=1\u000137=blablabla\u0001453=1\u0001448=blablabla\u0001447=D\u0001452=11\u00011=blablabla blablabla blablabla-blablabla NA\u000115=USD\u000140=D\u0001487=0\u000164=20170905\u000110=002\u0001"

Any chance you can help us out?

So far it's an awesome plugin!

Regards,
Kelvin

Configuration for flattening nested groups/components

Feature proposal. @cbusbey and/or @kkozel - would like your thoughts.

It's debatable whether folks want a parsed FIX message to mimic the structure of the data dictionary. If saving to Elasticsearch or other schema-less DBs, this presents a search issue. If piped to a relational DB, the nested fields must be handled with logic or stored as strings.

so instead of:

            "PartyIDSource" => "PROPRIETARY_CUSTOM_CODE",
            "NoPartySubIDs" => [
                {
                    "PartySubIDType" => 4,
                        "PartySubID" => "27"
                },
                 {
                    "PartySubIDType" => 4000,
                        "PartySubID" => "25906"
                }
            ]

You would see:

"PartyIDSource" => "PROPRIETARY_CUSTOM_CODE",
"PartySubIDType" => 4,
"PartySubID" => "27"
"PartySubIDType" => 4000,
"PartySubID" => "25906"

This is made possible in Ruby by comparing hash keys by identity:

h1 = {}
h1.compare_by_identity
h1["a"] = 1
h1["a"] = 2
p h1 # => {"a"=>1, "a"=>2}

Although I'm not sure how MongoDB or Elasticsearch would treat duplicate fields. It's worth investigating, or incrementing fields like PartySubIDType_1 => 4, PartySubIDType_2 => 4000 (see the sketch below).
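A minimal plain-Ruby sketch of that suffix-increment variant, using the PartySubID example above (flatten_groups is a hypothetical helper, not part of the plugin; note that the NoPartySubIDs count key itself is dropped in this sketch):

  # Illustrative only: flatten repeating groups into top-level keys with a
  # 1-based counter suffix. The group's count key (NoPartySubIDs) is dropped.
  def flatten_groups(fields)
    out = {}
    fields.each do |name, value|
      if value.is_a?(Array)
        value.each_with_index do |group, i|
          group.each { |k, v| out["#{k}_#{i + 1}"] = v }
        end
      else
        out[name] = value
      end
    end
    out
  end

  parsed = {
    "PartyIDSource" => "PROPRIETARY_CUSTOM_CODE",
    "NoPartySubIDs" => [
      { "PartySubIDType" => 4,    "PartySubID" => "27" },
      { "PartySubIDType" => 4000, "PartySubID" => "25906" }
    ]
  }

  p flatten_groups(parsed)
  # => {"PartyIDSource"=>"PROPRIETARY_CUSTOM_CODE",
  #     "PartySubIDType_1"=>4, "PartySubID_1"=>"27",
  #     "PartySubIDType_2"=>4000, "PartySubID_2"=>"25906"}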

Support for logstash 6.x and 7.x needed

I'm trying to upgrade ELK 5.5.2 to ELK 6.8 or ELK 7.1, but I get the error messages shown below when running ./bin/logstash -t -f config/logstash.conf with Logstash 7.1.
error message:

[2019-06-18T17:59:33,192][ERROR][logstash.plugins.registry] Tried to load a plugin's code, but failed. {:exception=>#<LoadError: no such file to load -- i18n/core_ext/string/interpolate>, :path=>"logstash/filters/fix_protocol", :type=>"filter", :name=>"fix_protocol"}
[2019-06-18T17:59:33,202][FATAL][logstash.runner          ] The given configuration is invalid. Reason: Couldn't find any filter plugin named 'fix_protocol'. Are you sure this is correct? Trying to load the fix_protocol filter plugin resulted in this error: no such file to load -- i18n/core_ext/string/interpolate
[2019-06-18T17:59:33,211][ERROR][org.logstash.Logstash    ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit

I've already installed fix_protocol plugin with ./bin/logstash-plugin install logstash-filter-fix_protocol, and got result Validating logstash-filter-fix_protocol Installing logstash-filter-fix_protocol Installation successful

So I think it is a problem related to a version incompatibility.

I'd appreciate it if anyone can add a pull request to this plugin. Thanks.

Move FIX Message parsing to java-land

Create a java library to parse a fix message in Java, and then - similar to quickfix-jruby - include the .jar file in a ruby gem. Finally, use that gem here.

Then, we can delete all the parsing logic in FIX Message class and call:

# lib/logstash/filters/fix_message.rb
def to_hash
  JSON.parse(self.to_json)
end

Run entire FIX log through FIX filter via cukes

  • Copy over fix message log from AIX
  • Add logstash input configuration to read from fix message log file
  • Add Grok to filter FIX message log file before passing onto our filter
  • Add logstash output configuration to write to a file
  • Add cukes to boot logstash filter with input / output configuration + using FIX filter
  • Assert against output file

Plugin stopping pipeline in logstash v8.9.0

Upgrading to logstash v8.9.0 (opensearch flavour) and using OpenJDK17, the plugin seems to work but crashes on some FIX messages.

Environment

Linux REDHAT
logstash 8.9.0
jruby 9.3.10.0 (2.6.8) 2023-02-01 107b2e6697 OpenJDK 64-Bit Server VM 17.0.4.1+1 on 17.0.4.1+1 +indy +jit [x86_64-linux]
java 17.0.4.1 (Eclipse Adoptium)
jvm OpenJDK 64-Bit Server VM / 17.0.4.1+1

Version installed is the latest (0.3.3)

$ ./bin/logstash-plugin list --installed logstash-filter-fix_protocol
logstash-filter-fix_protocol

$ cat ./vendor/bundle/jruby/2.6.0/gems/logstash-filter-fix_protocol-0.3.3/logstash-filter-fix_protocol.gemspec
# coding: utf-8
lib = File.expand_path('../lib', __FILE__)
$LOAD_PATH.unshift(lib) unless $LOAD_PATH.include?(lib)
require 'logstash/filters/version'

Gem::Specification.new do |s|
  s.name          = "logstash-filter-fix_protocol"
  s.version       = "0.3.3"
  s.authors       = ["Connamara Systems"]
  s.email         = ["[email protected]"]

  s.summary       = "FIX Protocol Logstash Filter"
  s.description   = "Put your financial application logs to work with logstash FIX filtering"
  s.homepage      = "https://github.com/connamara/logstash-filter-fix_protocol"
  s.licenses      = ['Apache License (2.0)']

  s.files         = Dir['lib/**/*','vendor/**/*','*.gemspec','*.md','CONTRIBUTORS','Gemfile','LICENSE.txt','QUICKFIX_LICENSE.txt','NOTICE.TXT', 'spec/**/*', 'features/**/*']

  s.test_files    = s.files.grep(%r{^(spec|features)/})

  s.require_paths = ["lib"]

  s.metadata = { "logstash_plugin" => "true", "logstash_group" => "filter" }

  if Logstash::VERSION == '5.x'
    s.add_runtime_dependency 'logstash-core', '>= 5.0.0'
  elsif Logstash::VERSION == '2.x'
    s.add_runtime_dependency 'logstash-core', '>= 2.0.0.beta2', '< 3.0.0'
  else
    raise "Invalid Logstash::VERSION - should be 2x or 5x located in `/lib/logstash/filters/version`"
  end
  s.add_runtime_dependency "logstash-input-generator"
  s.add_runtime_dependency "activesupport"
  s.add_runtime_dependency "quickfix-jruby", '~> 1.6', '>= 1.6.5'

  s.add_development_dependency "logstash-devutils"
  s.add_development_dependency "bundler", "~> 1.8"
  s.add_development_dependency "rake", "~> 10.0"
  s.add_development_dependency "rspec"
  s.add_development_dependency "pry"
end

Log

[2023-08-25T16:22:22,799][ERROR][logstash.javapipeline    ][fix-decoding] Pipeline worker error, the pipeline will be stopped
  {
    :pipeline_id=>"fix-decoding",
    :error=>"Cannot invoke \"quickfix.Group.setField(quickfix.StringField)\" because \"group\" is null",
    :exception=>Java::JavaLang::NullPointerException,
    :backtrace=>[
        "quickfix.Message.checkFieldValidation(Message.java:662)",
        "quickfix.Message.parseGroup(Message.java:633)",
         "quickfix.Message.parseBody(Message.java:559)",
         "quickfix.Message.parse(Message.java:470)",
         "quickfix.Message.fromString(Message.java:453)",
         "quickfix.Message.<init>(Message.java:102)", "rubyobj.LogStash.Filters.FixMessage.<init>(/.../logstash/vendor/bundle/jruby/2.6.0/gems/logstash-filter-fix_protocol-0.3.3/lib/logstash/filters/fix_message.rb:8)",
         "rubyobj.LogStash.Filters.FixMessage.<init>(/..../logstash/vendor/bundle/jruby/2.6.0/gems/logstash-filter-fix_protocol-0.3.3/lib/logstash/filters/fix_message.rb:8)",
         "jdk.internal.reflect.GeneratedConstructorAccessor89.newInstance(Unknown Source)",
         "java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)",
         "java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499)",
         "java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:480)",
         "org.jruby.java.proxies.ConcreteJavaProxy$NewMethodReified.call(ConcreteJavaProxy.java:303)",
         "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:238)",
         "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:234)",
         "...logstash.vendor.bundle.jruby.$2_dot_6_dot_0.gems.logstash_minus_filter_minus_fix_protocol_minus_0_dot_3_dot_3.lib.logstash.filters.fix_protocol.RUBY$method$filter$0(/.../logstash/vendor/bundle/jruby/2.6.0/gems/logstash-filter-fix_protocol-0.3.3/lib/logstash/filters/fix_protocol.rb:46)",
         "org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:165)",
         "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:185)",
         "org.jruby.ir.targets.indy.InvokeSite.fail(InvokeSite.java:278)",
         "...logstash.filters.base.RUBY$method$do_filter$0(/.../logstash/logstash-core/lib/logstash/filters/base.rb:159)",
         "org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:165)",
         "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:185)",
         "org.jruby.ir.targets.indy.InvokeSite.fail(InvokeSite.java:278)",
         "...logstash.filters.base.RUBY$block$multi_filter$1(/.../logstash/logstash-core/lib/logstash/filters/base.rb:178)",
         "org.jruby.runtime.CompiledIRBlockBody.yieldDirect(CompiledIRBlockBody.java:151)",
         "org.jruby.runtime.BlockBody.yield(BlockBody.java:106)", "org.jruby.runtime.Block.yield(Block.java:188)",
         "org.jruby.RubyArray.each(RubyArray.java:1865)",
         "...logstash.filters.base.RUBY$method$multi_filter$0(/.../logstash/filters/base.rb:175)",
         "org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:165)",
         "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:185)",
         "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:218)",
         "org.logstash.config.ir.compiler.FilterDelegatorExt.doMultiFilter(FilterDelegatorExt.java:128)",
         "org.logstash.config.ir.compiler.AbstractFilterDelegatorExt.lambda$multiFilter$0(AbstractFilterDelegatorExt.java:133)",
         "org.logstash.instrument.metrics.timer.ConcurrentLiveTimerMetric.time(ConcurrentLiveTimerMetric.java:47)",
         "org.logstash.config.ir.compiler.AbstractFilterDelegatorExt.multiFilter(AbstractFilterDelegatorExt.java:133)",
         "org.logstash.generated.CompiledDataset3.compute(Unknown Source)",
         "org.logstash.generated.CompiledDataset11.compute(Unknown Source)",
         "org.logstash.generated.CompiledDataset3.compute(Unknown Source)",
         "org.logstash.config.ir.CompiledPipeline$CompiledUnorderedExecution.compute(CompiledPipeline.java:347)",
         "org.logstash.config.ir.CompiledPipeline$CompiledUnorderedExecution.compute(CompiledPipeline.java:341)",
         "org.logstash.execution.ObservedExecution.lambda$compute$0(ObservedExecution.java:17)",
          "org.logstash.execution.WorkerObserver.lambda$observeExecutionComputation$0(WorkerObserver.java:39)",
         "org.logstash.instrument.metrics.timer.ConcurrentLiveTimerMetric.time(ConcurrentLiveTimerMetric.java:47)",
         "org.logstash.execution.WorkerObserver.lambda$executeWithTimers$1(WorkerObserver.java:50)",
         "org.logstash.instrument.metrics.timer.ConcurrentLiveTimerMetric.time(ConcurrentLiveTimerMetric.java:47)",
         "org.logstash.execution.WorkerObserver.executeWithTimers(WorkerObserver.java:50)",
         "org.logstash.execution.WorkerObserver.observeExecutionComputation(WorkerObserver.java:38)",
         "org.logstash.execution.ObservedExecution.compute(ObservedExecution.java:17)",
         "org.logstash.execution.WorkerLoop.abortableCompute(WorkerLoop.java:113)",
         "org.logstash.execution.WorkerLoop.run(WorkerLoop.java:86)",
         "jdk.internal.reflect.GeneratedMethodAccessor33.invoke(Unknown Source)",
         "java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)",
         "java.base/java.lang.reflect.Method.invoke(Method.java:568)",
         "org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(JavaMethod.java:442)",
         "org.jruby.javasupport.JavaMethod.invokeDirect(JavaMethod.java:306)",
         "org.jruby.java.invokers.InstanceMethodInvoker.call(InstanceMethodInvoker.java:32)",
       "...logstash.java_pipeline.RUBY$block$start_workers$1(/.../logstash/logstash-core/lib/logstash/java_pipeline.rb:304)",
         "org.jruby.runtime.CompiledIRBlockBody.callDirect(CompiledIRBlockBody.java:141)",
         "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:64)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:58)",
         "org.jruby.runtime.Block.call(Block.java:143)", "org.jruby.RubyProc.call(RubyProc.java:309)",
         "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:107)",
         "java.base/java.lang.Thread.run(Thread.java:833)"],
    :thread=>"#<Thread:0x139a6016@/.../logstash/logstash-core/lib/logstash/java_pipeline.rb:134 sleep>"
}

Logstash crashing when parsing execution reports

Hi,

Perhaps you have experienced the following ... When processing execution reports through the filter, logstash crashes and restarts. Initially I thought this might be due to my FIX messages, but testing with your example FIX caused the same issue.

Thus far I have successfully processed the following message types (All FIX 4.2):
MsgType: "Heartbeat"
MsgType: "TestRequest"
MsgType: "Logon"
MsgType: "ResendRequest"
MsgType: "SequenceReset"

log-file output:
[2017-07-14T15:31:30,778][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2017-07-14T15:31:30,793][DEBUG][logstash.inputs.file ] writing sincedb (delta since last write = 1500042690)
[2017-07-14T15:31:30,801][DEBUG][logstash.pipeline ] filter received {"event"=>{"path"=>"/usr/share/logstash/spiderlogs/spider.test.fix.messages.log", "@timestamp"=>2017-07-14T14:31:30.789Z, "@Version"=>"1", "host"=>"uk1pvrpt01", "message"=>"8=FIX.4.4\u00019=0996\u000135=8\u000149=FOOO\u000156=BAR\u000134=1505\u0001128=BAZ\u0001115=BLAH\u000152=20170620-19:22:00\u000160=20170620-19:22:00\u0001150=F\u000131=9999.11111\u0001151=0\u0001541=20171109\u000132=150000000\u0001662=1.179999948\u0001423=4\u0001663=4\u000164=20170621\u00015975=FOOBAR:20170620:1073:6\u00016=9999.11111\u000137=FOOBAR:20170620:1072:6\u000138=150000000\u0001218=14.2591147256\u0001548=FOOBAR:20170620:1073:6\u000139=2\u0001159=0\u0001669=9999.11111\u0001699=912796MB2\u0001460=6\u0001761=1\u0001223=0\u000114=925000000\u000115=USD\u000175=20170620\u0001106=TREASURY BILL \u000117=FOOBAR:20170620:1072:6:12\u0001167=TBILL\u000148=912796KX6\u0001198=7771:20170620:1072:6\u0001470=US\u0001381=123456789\u000122=1\u000154=2\u00017014=21\u000155=[N/A]\u0001118=123456789\u0001453=4\u0001448=FOOBAR\u0001447=D\u0001452=1\u0001448=JDR4:282205\u0001447=D\u0001452=11\u0001802=2\u0001523=27\u0001803=4\u0001523=25906\u0001803=4000\u0001448=ACME CORPORATION\u0001447=D\u0001452=13\u0001448=BBUNNY:13785105\u0001447=D\u0001452=36\u000110=001\u0001"}}
[2017-07-14T15:31:30,805][DEBUG][logstash.filters.grok ] Running grok filter {:event=>2017-07-14T14:31:30.789Z uk1pvrpt01 8=FIX.4.4^A9=0996^A35=8^A49=FOOO^A56=BAR^A34=1505^A128=BAZ^A115=BLAH^A52=20170620-19:22:00^A60=20170620-19:22:00^A150=F^A31=9999.11111^A151=0^A541=20171109^A32=150000000^A662=1.179999948^A423=4^A663=4^A64=20170621^A5975=FOOBAR:20170620:1073:6^A6=9999.11111^A37=FOOBAR:20170620:1072:6^A38=150000000^A218=14.2591147256^A548=FOOBAR:20170620:1073:6^A39=2^A159=0^A669=9999.11111^A699=912796MB2^A460=6^A761=1^A223=0^A14=925000000^A15=USD^A75=20170620^A106=TREASURY BILL ^A17=FOOBAR:20170620:1072:6:12^A167=TBILL^A48=912796KX6^A198=7771:20170620:1072:6^A470=US^A381=123456789^A22=1^A54=2^A7014=21^A55=[N/A]^A118=123456789^A453=4^A448=FOOBAR^A447=D^A452=1^A448=JDR4:282205^A447=D^A452=11^A802=2^A523=27^A803=4^A523=25906^A803=4000^A448=ACME CORPORATION^A447=D^A452=13^A448=BBUNNY:13785105^A447=D^A452=36^A10=001^A}
[2017-07-14T15:31:30,814][DEBUG][logstash.filters.grok ] Event now: {:event=>2017-07-14T14:31:30.789Z uk1pvrpt01 8=FIX.4.4^A9=0996^A35=8^A49=FOOO^A56=BAR^A34=1505^A128=BAZ^A115=BLAH^A52=20170620-19:22:00^A60=20170620-19:22:00^A150=F^A31=9999.11111^A151=0^A541=20171109^A32=150000000^A662=1.179999948^A423=4^A663=4^A64=20170621^A5975=FOOBAR:20170620:1073:6^A6=9999.11111^A37=FOOBAR:20170620:1072:6^A38=150000000^A218=14.2591147256^A548=FOOBAR:20170620:1073:6^A39=2^A159=0^A669=9999.11111^A699=912796MB2^A460=6^A761=1^A223=0^A14=925000000^A15=USD^A75=20170620^A106=TREASURY BILL ^A17=FOOBAR:20170620:1072:6:12^A167=TBILL^A48=912796KX6^A198=7771:20170620:1072:6^A470=US^A381=123456789^A22=1^A54=2^A7014=21^A55=[N/A]^A118=123456789^A453=4^A448=FOOBAR^A447=D^A452=1^A448=JDR4:282205^A447=D^A452=11^A802=2^A523=27^A803=4^A523=25906^A803=4000^A448=ACME CORPORATION^A447=D^A452=13^A448=BBUNNY:13785105^A447=D^A452=36^A10=001^A}
[2017-07-14T15:31:30,859][ERROR][logstash.pipeline ] Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash. {"exception"=>"undefined method to_hash' for #<Java::Quickfix::Group:0x2327e6e9>", "backtrace"=>["/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.3.1/lib/logstash/filters/fix_message.rb:87:in field_map_to_hash'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.3.1/lib/logstash/filters/fix_message.rb:72:in field_map_to_hash'", "org/jruby/RubyRange.java:476:in each'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/activesupport-4.1.16/lib/active_support/core_ext/range/each.rb:7:in each_with_time_with_zone'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.3.1/lib/logstash/filters/fix_message.rb:69:in field_map_to_hash'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.3.1/lib/logstash/filters/fix_message.rb:25:in to_hash'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.3.1/lib/logstash/filters/fix_protocol.rb:52:in filter'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:145:in do_filter'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:164:in multi_filter'", "org/jruby/RubyArray.java:1613:in each'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:161:in multi_filter'", "/usr/share/logstash/logstash-core/lib/logstash/filter_delegator.rb:43:in multi_filter'", "(eval):382:in initialize'", "org/jruby/RubyArray.java:1613:in each'", "(eval):379:in initialize'", "org/jruby/RubyProc.java:281:in call'", "(eval):184:in filter_func'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:383:in filter_batch'", "org/jruby/RubyProc.java:281:in call'", "/usr/share/logstash/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:238:in each'", "org/jruby/RubyHash.java:1342:in each'", "/usr/share/logstash/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:237:in each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:382:in filter_batch'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:363:in worker_loop'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:330:in start_workers'"]}
[2017-07-14T15:31:31,045][FATAL][logstash.runner ] An unexpected error occurred! {:error=>#<NoMethodError: undefined method to_hash' for #<Java::Quickfix::Group:0x2327e6e9>>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.3.1/lib/logstash/filters/fix_message.rb:87:in field_map_to_hash'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.3.1/lib/logstash/filters/fix_message.rb:72:in field_map_to_hash'", "org/jruby/RubyRange.java:476:in each'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/activesupport-4.1.16/lib/active_support/core_ext/range/each.rb:7:in each_with_time_with_zone'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.3.1/lib/logstash/filters/fix_message.rb:69:in field_map_to_hash'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.3.1/lib/logstash/filters/fix_message.rb:25:in to_hash'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.3.1/lib/logstash/filters/fix_protocol.rb:52:in filter'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:145:in do_filter'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:164:in multi_filter'", "org/jruby/RubyArray.java:1613:in each'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:161:in multi_filter'", "/usr/share/logstash/logstash-core/lib/logstash/filter_delegator.rb:43:in multi_filter'", "(eval):382:in initialize'", "org/jruby/RubyArray.java:1613:in each'", "(eval):379:in initialize'", "org/jruby/RubyProc.java:281:in call'", "(eval):184:in filter_func'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:383:in filter_batch'", "org/jruby/RubyProc.java:281:in call'", "/usr/share/logstash/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:238:in each'", "org/jruby/RubyHash.java:1342:in each'", "/usr/share/logstash/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:237:in each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:382:in filter_batch'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:363:in worker_loop'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:330:in start_workers'"]}
[2017-07-14T15:31:43,360][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"fb_apache", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x5296f15c @module_name="fb_apache", @Directory="/usr/share/logstash/modules/fb_apache/configuration">}
[2017-07-14T15:31:43,412][DEBUG][logstash.runner ] -------- Logstash Settings (* means modified) ---------

Conf file:
input {
  file {
    path => "/usr/share/logstash/spiderlogs/spider.*"
    start_position => "beginning"
  }
}
filter {
  grok {
    match => ["message","%{GREEDYDATA:fix_string}"]
  }
  if [fix_string] =~ "=FIX.4.2" {
    fix_protocol {
      fix_message => fix_string
      data_dictionary_path => "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.3.1/spec/fixtures/FIX42.xml"
    }
  }
  if [fix_string] =~ "=FIX.4.3" {
    fix_protocol {
      fix_message => fix_string
      data_dictionary_path => "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.3.1/spec/fixtures/FIX43.xml"
    }
  }
  if [fix_string] =~ "=FIX.4.4" {
    fix_protocol {
      fix_message => fix_string
      data_dictionary_path => "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.3.1/spec/fixtures/FIX44.xml"
    }
  }
  if [fix_string] =~ "=FIX.5.0" {
    fix_protocol {
      fix_message => fix_string
      session_dictionary_path => "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.3.1/spec/fixtures/FIXT11.xml"
      data_dictionary_path => "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.3.1/spec/fixtures/FIX50SP1.xml"
    }
  }
  else {
    drop {}
  }
}
output {
  elasticsearch {
    hosts => "127.0.0.1:9200"
    index => "logstash-betafix-index"
  }
  stdout { codec => rubydebug }
}

Please excuse my lack of ruby/logstash knowledge ... Any assistance would be gratefully received.

Thanks

Issue parsing ExecTransType in rspec tests

The spec runs against the FIX50SP1 data dictionary at the moment and fails to parse ExecTransType, which doesn't exist in that version.

WARNING: Could not correctly parse 8=FIX.4.29=24035=834=649=DUMMY_INC52=20150826-23:10:17.74456=ANOTHER_INC57=Firm_B1=Inst_B6=011=15101256917=ITRZ1201508261_2420=022=831=101032=537=ITRZ1201508261_1238=539=240=241=best_buy44=101154=155=ITRZ160=20150826-23:10:15.547150=2151=010=227
{
  "BeginString": "FIX.4.2",
  "BodyLength": "240",
  "MsgSeqNum": "6",
  "MsgType": "ExecutionReport",
  "SenderCompID": "DUMMY_INC",
  "SendingTime": "20150826-23:10:17.744",
  "TargetCompID": "ANOTHER_INC",
  "TargetSubID": "Firm_B",
  "Account": "Inst_B",
  "AvgPx": 0.0,
  "ClOrdID": "151012569",
  "ExecID": "ITRZ1201508261_24",
  "[#<LogStash::Filters::DataDictionary:0x5af8bb51 @file=#<File:/Users/andrewpage/Desktop/logstash-filter-fix_message/spec/fixtures/FIX50SP1.xml>>, #<LogStash::Filters::DataDictionary:0x799ed4e8 @file=#<File:/Users/andrewpage/Desktop/logstash-filter-fix_message/spec/fixtures/FIXT11.xml>>]": "0",
  "SecurityIDSource": "EXCHANGE_SYMBOL",
  "LastPx": 1010.0,
  "LastQty": 5.0,
  "OrderID": "ITRZ1201508261_12",
  "OrderQty": 5.0,
  "OrdStatus": "FILLED",
  "OrdType": "LIMIT",
  "OrigClOrdID": "best_buy",
  "Price": 1011.0,
  "Side": "BUY",
  "Symbol": "ITRZ1",
  "TransactTime": "20150826-23:10:15.547",
  "ExecType": "2",
  "LeavesQty": 0.0,
  "CheckSum": "227"
}

install from windows

Hi,

I'm trying to install from Windows and receive this message:

C:\elastic\logstash-7.9.0\bin>logstash-plugin install /c:/elastic/logstash-filter-fix_protocol/logstash-filter-fix_protocol
ERROR: Something went wrong when installing /c:/elastic/logstash-filter-fix_protocol/logstash-filter-fix_protocol, message: certificate verify failed

How to solve?

Issue with iterative IF statements

I have an issue with iterative IF statements. I want to use a different FIX XML for different messages, the same as the example provided, but I am getting the errors below. Logstash is crashing with these errors. Everything works perfectly when I only use one FIX XML and take out the if statements. Would anyone please help?

Errors :

[2017-10-20T15:15:29,635][INFO ][logstash.filters.fixprotocol] Using version 0.1.x filter plugin 'fix_protocol'. This plugin isn't well supported by the community and likely has no maintainer.
[2017-10-20T15:15:29,851][ERROR][logstash.agent ] Cannot create pipeline {:reason=>"Constructor invocation failed: does not have a 'required' attribute"}

My Logstash config:
......
if "FIX" in [message] {
grok { patterns_dir => [ "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-5.3.0/patterns" ]
match => ["message","%{TIMESTAMP_ISOFIX:timestamp} %{GREEDYDATA:fix_session} - >?<? %{GREEDYDATA:fix_string}"]
}

if "vendor1" in [message] {
fix_protocol { fix_message => fix_string
data_dictionary_path => "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.3.2/FIX.xml"
}
if "vendor2" in [message] {
fix_protocol { fix_message => fix_string
data_dictionary_path => "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.3.2/FIXXML/FIX44.xml"
}
}
}
..........

Thank you.

Filter is unable to parse FIX logs

Hi,

I'm just playing around with the filter. I tried to install it and ran it over sample FIX statements. I have tried a number of ways but am still getting a '_fix_parse_failure' exception.

Could you please help me in this regard :

My conf is something like this :

input {
  stdin {}
}
filter {
  grok {
    match => ["message","%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:fix_session}: %{GREEDYDATA:fix_string}"]
  }
  fix_protocol {
    fix_message => fix_string
    session_dictionary_path => "C:/logstash-2.3.2_win/bin/FIXT11.xml"
    data_dictionary_path => "C:/logstash-2.3.2_win/bin/FIX42.xml"
  }
}
output {
  stdout { codec => rubydebug }
}

and the runtime looks like this :

C:\logstash-2.3.2_win\bin>logstash -f fix.conf
io/console not supported; tty will not be manipulated
Settings: Default pipeline workers: 4
Pipeline main started
2015-08-26 23:08:38,096 FIX.4.2:DUMMY_INC->ANOTHER_INC: 8=FIX.4.2�9=184�35=F�34=2�49=ANOTHER_INC�50=DefaultSenderSubID�52=20150826-23:08:38.094�56=DUMMY_INC�1=DefaultAccount�11=clordid_of_cancel�41=151012569�54=1�55=ITER�60=20250407-13:14:15�167=FUT�200=201512�10=147�
Received an event that has a different character encoding than you configured. {:text=>"2015-08-26 23:08:38,096 FIX.4.2:DUMMY_INC->ANOTHER_INC: 8=FIX.4.2\xFD9=184\xFD35=F\xFD34=2\xFD49=ANOTHER_INC\xFD50=DefaultSenderSubID\xFD52=20150826-23:08:38.094\xFD56=DUMMY_INC\xFD1=DefaultAccount\xFD11=clordid_of_cancel\xFD41=151012569\xFD54=1\xFD55=ITER\xFD60=20250407-13:14:15\xFD167=FUT\xFD200=201512\xFD10=147\xFD\r", :expected_charset=>"UTF-8", :level=>:warn}
{
        "message" => "2015-08-26 23:08:38,096 FIX.4.2:DUMMY_INC->ANOTHER_INC: 8=FIX.4.2\xFD9=184\xFD35=F\xFD34=2\xFD49=ANOTHER_INC\xFD50=DefaultSenderSubID\xFD52=20150826-23:08:38.094\xFD56=DUMMY_INC\xFD1=DefaultAccount\xFD11=clordid_of_cancel\xFD41=151012569\xFD54=1\xFD55=ITER\xFD60=20250407-13:14:15\xFD167=FUT\xFD200=201512\xFD10=147\xFD\r",
       "@version" => "1",
     "@timestamp" => "2016-05-10T17:04:15.724Z",
           "host" => "localhost",
      "timestamp" => "2015-08-26 23:08:38,096",
    "fix_session" => "FIX.4.2:DUMMY_INC->ANOTHER_INC",
     "fix_string" => "8=FIX.4.2\xFD9=184\xFD35=F\xFD34=2\xFD49=ANOTHER_INC\xFD50=DefaultSenderSubID\xFD52=20150826-23:08:38.094\xFD56=DUMMY_INC\xFD1=DefaultAccount\xFD11=clordid_of_cancel\xFD41=151012569\xFD54=1\xFD55=ITER\xFD60=20250407-13:14:15\xFD167=FUT\xFD200=201512\xFD10=147\xFD\r",
           "tags" => [
        [0] "_fix_parse_failure"
    ]
}


I am currently using Logstash 2.3.2. Could you please help me understand where exactly I have gone wrong?
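One thing worth noting about the output above: FIX tags are delimited by the SOH control character (0x01). When a sample message is copied out of a README or a terminal, that delimiter tends to arrive as a printable placeholder byte (the \xFD shown in the event), which the parser cannot split on and which also triggers the "different character encoding" warning because it is not valid UTF-8. A minimal sketch that at least silences the charset warning, assuming the input is effectively single-byte Latin-1 (an assumption), would set the codec explicitly:

input {
  stdin {
    # decode stdin as Latin-1 so the non-UTF-8 delimiter bytes are not
    # reported as an encoding mismatch (assumes single-byte input)
    codec => plain { charset => "ISO-8859-1" }
  }
}

Even with the charset set, the filter can only parse the message if fix_string contains the real SOH delimiter rather than a pasted placeholder character.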

Logstash crashes with FieldNotFound

Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash. {"exception"=>quickfix.FieldNotFound: 386, index=3, "backtrace"=>["quickfix.FieldMap.getGroup(quickfix/FieldMap.java:583)", "java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:497)", "LogStash::Filters::FixMessage.field_map_to_hash(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.1.3/lib/logstash/filters/fix_message.rb:64)", "LogStash::Filters::FixMessage.field_map_to_hash(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.1.3/lib/logstash/filters/fix_message.rb:64)", "org.jruby.RubyRange.each(org/jruby/RubyRange.java:479)", "RUBY.each_with_time_with_zone(/opt/logstash/vendor/bundle/jruby/1.9/gems/activesupport-4.1.14.2/lib/active_support/core_ext/range/each.rb:7)", "LogStash::Filters::FixMessage.field_map_to_hash(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.1.3/lib/logstash/filters/fix_message.rb:63)", "LogStash::Filters::FixMessage.field_map_to_hash(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.1.3/lib/logstash/filters/fix_message.rb:63)", "LogStash::Filters::FixMessage.to_hash(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.1.3/lib/logstash/filters/fix_message.rb:26)", "LogStash::Filters::FixMessage.to_hash(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.1.3/lib/logstash/filters/fix_message.rb:26)", "LogStash::Filters::FixProtocol.filter(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.1.3/lib/logstash/filters/fix_protocol.rb:49)", "LogStash::Filters::FixProtocol.filter(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.1.3/lib/logstash/filters/fix_protocol.rb:49)", "LogStash::Filters::Base.multi_filter(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/filters/base.rb:151)", "LogStash::Filters::Base.multi_filter(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/filters/base.rb:151)", "org.jruby.RubyArray.each(org/jruby/RubyArray.java:1613)", "LogStash::Filters::Base.multi_filter(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/filters/base.rb:148)", "LogStash::Filters::Base.multi_filter(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/filters/base.rb:148)", "LogStash::Pipeline.cond_func_13((eval):357)", "LogStash::Pipeline.cond_func_13((eval):357)", "org.jruby.RubyArray.each(org/jruby/RubyArray.java:1613)", "LogStash::Pipeline.cond_func_13((eval):354)", "LogStash::Pipeline.cond_func_13((eval):354)", "RUBY.filter_func((eval):171)", "LogStash::Pipeline.filter_batch(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/pipeline.rb:256)", "LogStash::Pipeline.filter_batch(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/pipeline.rb:256)", "org.jruby.RubyArray.each(org/jruby/RubyArray.java:1613)", "org.jruby.RubyEnumerable.inject(org/jruby/RubyEnumerable.java:852)", "LogStash::Pipeline.filter_batch(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/pipeline.rb:254)", "LogStash::Pipeline.filter_batch(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/pipeline.rb:254)", "RUBY.worker_loop(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/pipeline.rb:212)", 
"RUBY.start_workers(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/pipeline.rb:190)", "java.lang.Thread.run(java/lang/Thread.java:745)"], :level=>:error}
Exception in thread "[base]>worker0" quickfix.FieldNotFound: 386, index=3
    at quickfix.FieldMap.getGroup(quickfix/FieldMap.java:583)
    at java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:497)
    at LogStash::Filters::FixMessage.field_map_to_hash(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.1.3/lib/logstash/filters/fix_message.rb:64)
    at LogStash::Filters::FixMessage.field_map_to_hash(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.1.3/lib/logstash/filters/fix_message.rb:64)
    at org.jruby.RubyRange.each(org/jruby/RubyRange.java:479)
    at RUBY.each_with_time_with_zone(/opt/logstash/vendor/bundle/jruby/1.9/gems/activesupport-4.1.14.2/lib/active_support/core_ext/range/each.rb:7)
    at LogStash::Filters::FixMessage.field_map_to_hash(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.1.3/lib/logstash/filters/fix_message.rb:63)
    at LogStash::Filters::FixMessage.field_map_to_hash(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.1.3/lib/logstash/filters/fix_message.rb:63)
    at LogStash::Filters::FixMessage.to_hash(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.1.3/lib/logstash/filters/fix_message.rb:26)
    at LogStash::Filters::FixMessage.to_hash(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.1.3/lib/logstash/filters/fix_message.rb:26)
    at LogStash::Filters::FixProtocol.filter(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.1.3/lib/logstash/filters/fix_protocol.rb:49)
    at LogStash::Filters::FixProtocol.filter(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.1.3/lib/logstash/filters/fix_protocol.rb:49)
    at LogStash::Filters::Base.multi_filter(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/filters/base.rb:151)
    at LogStash::Filters::Base.multi_filter(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/filters/base.rb:151)
    at org.jruby.RubyArray.each(org/jruby/RubyArray.java:1613)
    at LogStash::Filters::Base.multi_filter(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/filters/base.rb:148)
    at LogStash::Filters::Base.multi_filter(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/filters/base.rb:148)
    at LogStash::Pipeline.cond_func_13((eval):357)
    at LogStash::Pipeline.cond_func_13((eval):357)
    at org.jruby.RubyArray.each(org/jruby/RubyArray.java:1613)
    at LogStash::Pipeline.cond_func_13((eval):354)
    at LogStash::Pipeline.cond_func_13((eval):354)
    at RUBY.filter_func((eval):171)
    at LogStash::Pipeline.filter_batch(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/pipeline.rb:256)
    at LogStash::Pipeline.filter_batch(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/pipeline.rb:256)
    at org.jruby.RubyArray.each(org/jruby/RubyArray.java:1613)
    at org.jruby.RubyEnumerable.inject(org/jruby/RubyEnumerable.java:852)
    at LogStash::Pipeline.filter_batch(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/pipeline.rb:254)
    at LogStash::Pipeline.filter_batch(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/pipeline.rb:254)
    at RUBY.worker_loop(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/pipeline.rb:212)
    at RUBY.start_workers(/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/pipeline.rb:190)
    at java.lang.Thread.run(java/lang/Thread.java:745)

Will track down the message that caused the issue.
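For context: tag 386 is NoTradingSessions, a repeating-group count field. A quickfix.FieldNotFound thrown from FieldMap.getGroup usually means the count field announces more group entries than the engine actually built, which can happen when the loaded data dictionary does not declare the group inside the message type being parsed. If that is the cause here (an assumption, since the offending message is not shown), the missing declaration in a QuickFIX data dictionary would look roughly like this:

<!-- sketch: inside the relevant <message> definition of the data dictionary -->
<group name="NoTradingSessions" required="N">
  <field name="TradingSessionID" required="N"/>
</group>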

Fatal Exception on Nested Groups/SubGroups

Hi @daino3 ,

We have a FIX 4.4 message that uses group and then subgroup tags, e.g.:

<field number='453' name='NoPartyIDs' type='NUMINGROUP'/>
and
<field number='802' name='NoPartySubIDs' type='NUMINGROUP'/>

It crashes when it encounters another group beyond the first level, it seems:

[2017-06-21T11:06:35,571][ERROR][logstash.pipeline ] Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash. {"exception"=>"undefined method `to_hash' for #<Java::Quickfix::Group:0x3478b854>", "backtrace"=>["/home/kmalyar/tmp/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.3.1/lib/logstash/filters/fix_message.rb:87:in `field_map_to_hash'", "/home/kmalyar/tmp/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.3.1/lib/logstash/filters/fix_message.rb:72:in `field_map_to_hash'", "org/jruby/RubyRange.java:479:in `each'", "/home/kmalyar/tmp/logstash/vendor/bundle/jruby/1.9/gems/activesupport-4.1.16/lib/active_support/core_ext/range/each.rb:7:in `each_with_time_with_zone'", "/home/kmalyar/tmp/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.3.1/lib/logstash/filters/fix_message.rb:69:in `field_map_to_hash'", "/home/kmalyar/tmp/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.3.1/lib/logstash/filters/fix_message.rb:25:in `to_hash'", "/home/kmalyar/tmp/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-fix_protocol-0.3.1/lib/logstash/filters/fix_protocol.rb:52:in `filter'", "/home/kmalyar/tmp/logstash/logstash-core/lib/logstash/filters/base.rb:145:in `do_filter'", "/home/kmalyar/tmp/logstash/logstash-core/lib/logstash/filters/base.rb:164:in `multi_filter'", "org/jruby/RubyArray.java:1613:in `each'", "/home/kmalyar/tmp/logstash/logstash-core/lib/logstash/filters/base.rb:161:in `multi_filter'", "/home/kmalyar/tmp/logstash/logstash-core/lib/logstash/filter_delegator.rb:43:in `multi_filter'", "(eval):211:in `initialize'", "org/jruby/RubyArray.java:1613:in `each'", "(eval):203:in `initialize'", "org/jruby/RubyProc.java:281:in `call'", "(eval):142:in `filter_func'", "/home/kmalyar/tmp/logstash/logstash-core/lib/logstash/pipeline.rb:370:in `filter_batch'", "org/jruby/RubyProc.java:281:in `call'", "/home/kmalyar/tmp/logstash/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:224:in `each'", "org/jruby/RubyHash.java:1342:in `each'", "/home/kmalyar/tmp/logstash/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:223:in `each'", "/home/kmalyar/tmp/logstash/logstash-core/lib/logstash/pipeline.rb:369:in `filter_batch'", "/home/kmalyar/tmp/logstash/logstash-core/lib/logstash/pipeline.rb:350:in `worker_loop'", "/home/kmalyar/tmp/logstash/logstash-core/lib/logstash/pipeline.rb:317:in `start_workers'"]}

Have you run into this before?

Thanks!
Kelvin
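The "undefined method `to_hash' for Java::Quickfix::Group" in the trace suggests that field_map_to_hash in plugin version 0.3.1 walks the first level of repeating groups but does not recurse when a group entry itself contains another quickfix Group. The tags quoted above belong to the standard FIX 4.4 Parties component, where NoPartySubIDs (802) nests inside NoPartyIDs (453); in a QuickFIX data dictionary that nesting looks roughly like this (standard FIX 4.4 field names, shown only to illustrate the structure that appears to trigger the crash):

<group name="NoPartyIDs" required="N">
  <field name="PartyID" required="N"/>
  <field name="PartyIDSource" required="N"/>
  <field name="PartyRole" required="N"/>
  <!-- the second-level group is what the filter appears to trip over -->
  <group name="NoPartySubIDs" required="N">
    <field name="PartySubID" required="N"/>
    <field name="PartySubIDType" required="N"/>
  </group>
</group>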

Unable to vagrant up: Update RubyGems

ansible 1.9.3
Vagrant 1.7.4
VB 5.0.14

vagrant up from master results in:

TASK: [Main | Update RubyGems] ************************************************
failed: [default] => {"changed": true, "cmd": "sudo -iu gem update --system \"2.5.2\"", "delta": "0:00:00.006842", "end": "2016-02-26 17:12:06.159130", "rc": 1, "start": "2016-02-26 17:12:06.152288", "warnings": []}
stderr: sudo: unknown user: gem
sudo: unable to initialize policy plugin
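The failing command explains the error: with sudo -iu, the word immediately after -iu is taken as the login user, so "gem" is being treated as a username and the real target user is missing, most likely because the task interpolates a user variable that is empty on this box (a guess). Roughly:

# what the task runs now: "gem" is parsed as the user to switch to
sudo -iu gem update --system "2.5.2"           # -> sudo: unknown user: gem

# what was presumably intended: a real user, then the gem command
# ("vagrant" below is a hypothetical placeholder, not taken from the playbook)
sudo -iu vagrant gem update --system "2.5.2"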
