
Comments (4)

robcowart commented on July 25, 2024

That is a change that can be made. However, I am curious: what ifIndex values are you receiving? Have you confirmed that they actually exist when you do a MIB walk?

Strictly speaking, ifIndex is a 32-bit signed integer, so you are really limited to 31 bits for a valid value. Any device using values larger than that technically has a bug. That said, I am all for trying to fix such device issues where possible.
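For reference, a minimal sketch of that range check in plain Python (not ElastiFlow code); IF-MIB (RFC 2863) restricts ifIndex to 1..2147483647, i.e. 31 usable bits. The sample values are taken from the interface list and Logstash warnings quoted later in this thread:

```python
# Minimal sketch: check whether reported SNMP interface indexes fall inside
# the valid ifIndex range. IF-MIB (RFC 2863) restricts ifIndex to
# 1..2147483647, i.e. the positive range of a signed 32-bit integer.

IFINDEX_MAX = 2**31 - 1  # 2147483647

def is_valid_ifindex(value: int) -> bool:
    """Return True if value can be a legitimate ifIndex."""
    return 1 <= value <= IFINDEX_MAX

# 1 and 10101 appear in the switch's interface list; the larger two are
# examples of the values Logstash is rejecting.
for v in (1, 10101, 2887062808, 4261412864):
    print(v, "valid" if is_valid_ifindex(v) else "out of ifIndex range")
```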


joakimlemb commented on July 25, 2024

The ifIndex IDs mentioned in the Logstash logs do not appear on the switch:

1    Up    Vlan1    Vl1        53    1000000000
5179    Up    StackPort1    StackPort1        
5180    Up    StackSub-St1-1    StackSub-St1
5181    Up    StackSub-St1-2    StackSub-St1
5182    Up    StackPort2    StackPort2        
5183    Up    StackSub-St2-1    StackSub-St2
5184    Up    StackSub-St2-2    StackSub-St2
10101    Down    GigabitEthernet1/0/1    
10102    Down    GigabitEthernet1/0/2    
10103    Down    GigabitEthernet1/0/3    
10104    Down    GigabitEthernet1/0/4    
10105    Up    GigabitEthernet1/0/5    
10106    Down    GigabitEthernet1/0/6  
10107    Up    GigabitEthernet1/0/7    
10108    Down    GigabitEthernet1/0/8  
10109    Down    GigabitEthernet1/0/9  
10110    Down    GigabitEthernet1/0/10 
10111    Up    GigabitEthernet1/0/11   
10112    Down    GigabitEthernet1/0/12 
10113    Up    GigabitEthernet1/0/13   
10114    Down    GigabitEthernet1/0/14 
10115    Down    GigabitEthernet1/0/15 
10116    Down    GigabitEthernet1/0/16 
10117    Down    GigabitEthernet1/0/17 
10118    Down    GigabitEthernet1/0/18 
10119    Down    GigabitEthernet1/0/19 
10120    Down    GigabitEthernet1/0/20 
10121    Down    GigabitEthernet1/0/21 
10122    Down    GigabitEthernet1/0/22 
10123    Up    GigabitEthernet1/0/23   
10124    Up    GigabitEthernet1/0/24   
10125    Down    GigabitEthernet1/0/25 
10126    Down    GigabitEthernet1/0/26 
10127    Down    GigabitEthernet1/0/27 
10128    Down    GigabitEthernet1/0/28 
10129    Down    GigabitEthernet1/0/29 
10130    Up    GigabitEthernet1/0/30   
10131    Down    GigabitEthernet1/0/31 
10132    Down    GigabitEthernet1/0/32 
10133    Down    GigabitEthernet1/0/33 
10134    Down    GigabitEthernet1/0/34 
10135    Down    GigabitEthernet1/0/35 
10136    Up    GigabitEthernet1/0/36   
10137    Down    GigabitEthernet1/0/37 
10138    Down    GigabitEthernet1/0/38 
10139    Down    GigabitEthernet1/0/39 
10140    Down    GigabitEthernet1/0/40 
10141    Down    GigabitEthernet1/0/41 
10142    Down    GigabitEthernet1/0/42 
10143    Down    GigabitEthernet1/0/43 
10144    Down    GigabitEthernet1/0/44 
10145    Up    GigabitEthernet1/0/45   
10146    Down    GigabitEthernet1/0/46 
10147    Down    GigabitEthernet1/0/47 
10148    Up    GigabitEthernet1/0/48   
10149    Down    GigabitEthernet1/0/49 
10150    Down    GigabitEthernet1/0/50 
10151    Down    GigabitEthernet1/0/51 
10152    Down    GigabitEthernet1/0/52 
10601    Up    GigabitEthernet2/0/1    
10602    Down    GigabitEthernet2/0/2  
10603    Down    GigabitEthernet2/0/3  
10604    Down    GigabitEthernet2/0/4  
10605    Down    GigabitEthernet2/0/5  
10606    Down    GigabitEthernet2/0/6  
10607    Down    GigabitEthernet2/0/7  
10608    Down    GigabitEthernet2/0/8  
10609    Down    GigabitEthernet2/0/9  
10610    Down    GigabitEthernet2/0/10 
10611    Down    GigabitEthernet2/0/11 
10612    Down    GigabitEthernet2/0/12 
10613    Down    GigabitEthernet2/0/13 
10614    Down    GigabitEthernet2/0/14 
10615    Down    GigabitEthernet2/0/15 
10616    Down    GigabitEthernet2/0/16 
10617    Down    GigabitEthernet2/0/17 
10618    Down    GigabitEthernet2/0/18 
10619    Down    GigabitEthernet2/0/19 
10620    Down    GigabitEthernet2/0/20 
10621    Down    GigabitEthernet2/0/21 
10622    Down    GigabitEthernet2/0/22 
10623    Down    GigabitEthernet2/0/23 
10624    Down    GigabitEthernet2/0/24 
10625    Down    GigabitEthernet2/0/25 
10626    Down    GigabitEthernet2/0/26 
10627    Down    GigabitEthernet2/0/27 
10628    Down    GigabitEthernet2/0/28 
14001    Up        Null0    Nu0        
14002    Down    FastEthernet0    Fa0

But we can clearly see in the Logstash logs that it tries to index some very high values:

[2018-02-13T14:52:59,951][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"elastiflow-2018.02.13", :_type=>"logs", :_routing=>nil}, 2018-02-13T13:52:59.000Z %{host} %{message}], :response=>{"index"=>{"_index"=>"elastiflow-2018.02.13", "_type"=>"logs", "_id"=>"AWGPcKYAHRuydoSWUCfE", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [netflow.output_snmp]", "caused_by"=>{"type"=>"json_parse_exception", "reason"=>"Numeric value (2887062808) out of range of int\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@15853497; line: 1, column: 94]"}}}}}
[2018-02-13T14:53:02,802][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"elastiflow-2018.02.13", :_type=>"logs", :_routing=>nil}, 2018-02-13T13:53:02.000Z %{host} %{message}], :response=>{"index"=>{"_index"=>"elastiflow-2018.02.13", "_type"=>"logs", "_id"=>"AWGPcLEkHRuydoSWUCgI", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [netflow.output_snmp]", "caused_by"=>{"type"=>"json_parse_exception", "reason"=>"Numeric value (2533359616) out of range of int\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@82d2e05; line: 1, column: 92]"}}}}}
[2018-02-13T14:53:02,802][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"elastiflow-2018.02.13", :_type=>"logs", :_routing=>nil}, 2018-02-13T13:53:02.000Z %{host} %{message}], :response=>{"index"=>{"_index"=>"elastiflow-2018.02.13", "_type"=>"logs", "_id"=>"AWGPcLEkHRuydoSWUCgU", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [netflow.input_snmp]", "caused_by"=>{"type"=>"json_parse_exception", "reason"=>"Numeric value (4261412864) out of range of int\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@6d9cf4fc; line: 1, column: 480]"}}}}}
[2018-02-13T14:53:02,803][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"elastiflow-2018.02.13", :_type=>"logs", :_routing=>nil}, 2018-02-13T13:53:02.000Z %{host} %{message}], :response=>{"index"=>{"_index"=>"elastiflow-2018.02.13", "_type"=>"logs", "_id"=>"AWGPcLEkHRuydoSWUCgb", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [netflow.output_snmp]", "caused_by"=>{"type"=>"json_parse_exception", "reason"=>"Numeric value (4012146348) out of range of int\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@728b2e66; line: 1, column: 92]"}}}}}
[2018-02-13T14:53:02,803][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"elastiflow-2018.02.13", :_type=>"logs", :_routing=>nil}, 2018-02-13T13:53:02.000Z %{host} %{message}], :response=>{"index"=>{"_index"=>"elastiflow-2018.02.13", "_type"=>"logs", "_id"=>"AWGPcLEkHRuydoSWUCgc", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [netflow.output_snmp]", "caused_by"=>{"type"=>"json_parse_exception", "reason"=>"Numeric value (4033195029) out of range of int\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@65655048; line: 1, column: 92]"}}}}}
[2018-02-13T14:53:16,613][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"elastiflow-2018.02.13", :_type=>"logs", :_routing=>nil}, 2018-02-13T13:53:16.000Z %{host} %{message}], :response=>{"index"=>{"_index"=>"elastiflow-2018.02.13", "_type"=>"logs", "_id"=>"AWGPcOc1HRuydoSWUCoL", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [netflow.input_snmp]", "caused_by"=>{"type"=>"json_parse_exception", "reason"=>"Numeric value (2365587456) out of range of int\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@29ffb62c; line: 1, column: 475]"}}}}}
[2018-02-13T14:53:16,613][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"elastiflow-2018.02.13", :_type=>"logs", :_routing=>nil}, 2018-02-13T13:53:16.000Z %{host} %{message}], :response=>{"index"=>{"_index"=>"elastiflow-2018.02.13", "_type"=>"logs", "_id"=>"AWGPcOc1HRuydoSWUCoM", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [netflow.input_snmp]", "caused_by"=>{"type"=>"json_parse_exception", "reason"=>"Numeric value (2542600192) out of range of int\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@1d914f25; line: 1, column: 471]"}}}}}
[2018-02-13T14:53:16,613][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"elastiflow-2018.02.13", :_type=>"logs", :_routing=>nil}, 2018-02-13T13:53:16.000Z %{host} %{message}], :response=>{"index"=>{"_index"=>"elastiflow-2018.02.13", "_type"=>"logs", "_id"=>"AWGPcOc1HRuydoSWUCoP", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [netflow.output_snmp]", "caused_by"=>{"type"=>"json_parse_exception", "reason"=>"Numeric value (2365587456) out of range of int\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@79b106a9; line: 1, column: 94]"}}}}}

I'm not sure what conclusion to draw from this. Could it be a problem with the NetFlow decoder in Logstash?
I know that the 2960X switches use NetFlow-Lite, which might use a slightly different format: https://www.cisco.com/c/en/us/td/docs/switches/lan/catalyst2960x/software/15-2_2_e/fnf/configuration_guide/b_fnf_1522e_2960x_cg/b_fnf_32se_3850_cg_chapter_010.html
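As a quick sanity check (plain Python, purely illustrative): every rejected input_snmp/output_snmp value above fits in an unsigned 32-bit field but overflows a signed 32-bit integer, which is exactly the range an Elasticsearch integer-mapped field enforces. One of them even decodes to a private IPv4 address, which may hint that the codec is reading these fields from the wrong offset:

```python
# Illustrative check: the rejected input_snmp/output_snmp values fit in an
# unsigned 32-bit field but overflow a signed 32-bit integer (the range an
# Elasticsearch "integer" mapping enforces).
import ipaddress

INT32_MAX = 2**31 - 1
UINT32_MAX = 2**32 - 1

for v in (2887062808, 2533359616, 4261412864, 4012146348, 2365587456):
    print(f"{v}: signed32={v <= INT32_MAX}, unsigned32={v <= UINT32_MAX}, "
          f"as IPv4={ipaddress.ip_address(v)}")

# All print signed32=False, unsigned32=True. 2887062808 happens to decode to
# 172.21.21.24 (a private address), which may hint that the decoder is
# pulling these values from the wrong bytes of the NetFlow-Lite records.
```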

We are also seeing the netflow.in_pkts and netflow.in_bytes values overflow the long type.

[2018-02-13T15:28:42,023][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"elastiflow-2018.02.13", :_type=>"logs", :_routing=>nil}, 2018-02-13T14:28:41.000Z %{host} %{message}], :response=>{"index"=>{"_index"=>"elastiflow-2018.02.13", "_type"=>"logs", "_id"=>"AWGPkVWHHRuydoSWURnZ", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [netflow.in_pkts]", "caused_by"=>{"type"=>"json_parse_exception", "reason"=>"Numeric value (15492960188970243883) out of range of long (-9223372036854775808 - 9223372036854775807)\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@23ce958c; line: 1, column: 125]"}}}}}
[2018-02-13T15:28:42,023][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"elastiflow-2018.02.13", :_type=>"logs", :_routing=>nil}, 2018-02-13T14:28:41.000Z %{host} %{message}], :response=>{"index"=>{"_index"=>"elastiflow-2018.02.13", "_type"=>"logs", "_id"=>"AWGPkVWHHRuydoSWURna", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [netflow.in_bytes]", "caused_by"=>{"type"=>"json_parse_exception", "reason"=>"Numeric value (12399839240809884375) out of range of long (-9223372036854775808 - 9223372036854775807)\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@10dc7d12; line: 1, column: 421]"}}}}}
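Both rejected counters are larger than a signed 64-bit long but smaller than 2^64, so they would only be representable as unsigned 64-bit values; counters that large are not plausible, which again suggests the records may be decoded incorrectly. A quick check in plain Python:

```python
# Quick check: the rejected in_pkts/in_bytes values exceed the signed 64-bit
# range (Elasticsearch's "long" type) but are below 2**64, so they only make
# sense as unsigned 64-bit integers -- and counters that large are implausible.
INT64_MAX = 2**63 - 1   # 9223372036854775807
UINT64_MAX = 2**64 - 1

for v in (15492960188970243883, 12399839240809884375):
    print(v, "fits long:", v <= INT64_MAX, "fits uint64:", v <= UINT64_MAX)
# Both print: fits long: False  fits uint64: True
```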


robcowart commented on July 25, 2024

You might want to look at this thread...
https://supportforums.cisco.com/t5/lan-switching-and-routing/netflow-v-s-netflow-lite/td-p/1892027

In particular...

Our team worked directly with Cisco and the nProbe engineer on the NetFlow-Lite project. NetFlow-Lite requires the nProbe or nBox to compile the sampled packets into traditional NetFlow or IPFIX.

I don't believe that the Logstash Netflow codec supports NetFlow-Lite. The developer/maintainer of the codec provides really good support, so you might want to open an issue there, send him a PCAP of the flow records, and see if he will add support.

Once the codec adds support, I can make any necessary adjustments to ElastiFlow.
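If it helps to confirm what the switch is actually exporting before opening that issue, here is a rough sketch in plain Python (the UDP port 2055 is an assumption; use whatever port your exporter is configured to send to). NetFlow v5/v9 and IPFIX exports all start with a 16-bit version field (5, 9, or 10 respectively), so printing it shows which format arrives:

```python
# Rough sketch, not part of the codec: print the export version of each flow
# datagram to see what the 2960X actually sends. NetFlow v5/v9 and IPFIX all
# begin with a 16-bit version number (5, 9, or 10).
import socket
import struct

PORT = 2055  # assumption -- match the UDP port your exporter sends to

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", PORT))

while True:
    data, (src, _) = sock.recvfrom(65535)
    if len(data) >= 2:
        (version,) = struct.unpack("!H", data[:2])
        print(f"{src}: {len(data)} bytes, export version {version}")
```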


joakimlemb commented on July 25, 2024

Thanks for looking into it. I'm closing this since it's not a problem with ElastiFlow itself.

