Logstash and Cisco ASA v9 NetFlow

Has anyone got this working?

I can see some work has been done on the netflow.yaml file in this regard.

I've tried adding the changes to my netflow.yaml, but I keep getting this error:

no matching template for flow id 263
no matching template for flow id 257

I can see others have had the same problem in the forums here, but no clear solution with a copy of a Logstash conf file that solved it. If anyone has it running, please share.

Additional info:

I have verified that the Cisco ASA is sending templates every other minute, and I let Logstash run for more than 5 minutes to see if that fixed it.

My current config:

input {
  udp {
    port => 2055
    codec => netflow {
      definitions => "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-netflow-2.0.3/lib/logstash/codecs/netflow/netflow.yaml"
      versions => [5, 9]
    }
  }
}
output {
  elasticsearch { hosts => ["localhost"] }
  stdout { codec => rubydebug }
}

I'm in the exact same boat, if you have any ideas. I'm using Logstash 2.2 and version 2.3.0 of the codec.

I have it working. Jorrit sent me some of his files, and I then copy/pasted the files he had amended from GitHub into the netflow plugin itself, overwriting the existing ones. Jorrit's setup is tested on Logstash 1.5.5; I have it running on the newest version. However, I'm currently examining an approximately 1.5-minute delay in the data being displayed in Kibana. I can forward the files Jorrit sent me, and you can then copy/paste the changed files from GitHub.

If it's not too much to ask, could you please tell me which files you replaced in 2.3.0 and where they reside in the Git repo? Did you need to recompile the gem for it to take effect? And to be clear, we're talking Logstash 2.2 and netflow-codec 2.3.0.

Thanks for the help!

Here is the link to the GitHub repo.

Choose the raw view for each file and copy/paste the content into new files named the same as the ones on GitHub.

Save the files in a folder.

Look at the paths within the netflow codec folder structure to locate the original files, then replace the originals with the ones you created.

There is an automated way to do this; I just couldn't get it to work.
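For reference, the automated route would be roughly the following sketch. The paths are assumptions for a standard /opt/logstash install of Logstash 2.x and a local checkout of the patched codec; adjust both to your environment.

```shell
# Sketch only: build the patched codec into a gem, then install it with
# Logstash 2.x's bundled plugin manager (bin/plugin).
cd /path/to/logstash-codec-netflow        # local checkout with the patched files
gem build logstash-codec-netflow.gemspec  # produces logstash-codec-netflow-<version>.gem
/opt/logstash/bin/plugin install ./logstash-codec-netflow-2.0.3.gem
```

Installing from a local .gem file this way avoids hand-copying files into the vendor bundle directory.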

If you give me an email address that accepts attachments, I can zip the whole thing up and send it to you. You then just have to delete the contents of the codec folder (not the folder itself) and unpack the zip file into the now-empty codec folder.

Set your network device to export NetFlow template frames every 1 minute, then restart Logstash. It will take until the next template frame is sent from the network device for Logstash to learn the proper format.
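On a Cisco ASA, the template refresh interval can be lowered with something like the following sketch (the interface name and collector address are placeholders; adapt to your NSEL setup):

```
! Hypothetical ASA flow-export config; names and addresses are placeholders.
flow-export destination inside 10.0.0.1 2055
flow-export template timeout-rate 1
! timeout-rate is in minutes, so templates are resent every minute.
```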

Currently I'm getting a bit of delay in Kibana, approximately 1.5 minutes, but it works with Logstash 2.2.0 and everything else.

Thank you so much, sir, for the help. I believe I did this right, but I am still getting these errors:

{:timestamp=>"2016-03-01T12:37:53.535000-0500", :message=>"No matching template for flow id 256", :level=>:warn}
{:timestamp=>"2016-03-01T12:37:54.172000-0500", :message=>"No matching template for flow id 257", :level=>:warn}
Here are the positions of the files in the netflow-codec build directory:

./
./lib
./lib/logstash
./lib/logstash/codecs
./lib/logstash/codecs/netflow
./lib/logstash/codecs/netflow/netflow.yaml
./lib/logstash/codecs/netflow/util.rb
./lib/logstash/codecs/netflow/netflow.yaml.bak
./lib/logstash/codecs/netflow.rb.bak
./lib/logstash/codecs/netflow.rb
./lib/logstash/codecs/netflow.rb.rej
./lib/logstash/codecs/netflow.rb.orig
./lib/logstash/codecs/netflow9_test_valid_cisco_asa_data.dat
./lib/logstash/codecs/netflow9_test_valid_cisco_asa_tpl.dat
./CHANGELOG.md
./CONTRIBUTORS
./Gemfile
./LICENSE
./NOTICE.TXT
./README.md
./logstash-codec-netflow.gemspec
./24.patch
./logstash-codec-netflow.gemspec.rej
./spec
./spec/codecs
./spec/codecs/netflow_spec.rb
./logstash-codec-netflow.gemspec.orig
./logstash-codec-netflow-2.0.3.gem

Thanks!
Tim

Did you verify that the network device is sending the template frame every other minute? You might need to have your network people change the config if you don't have access.

Send me an email I can reply to and I'll pack the whole thing for you; that way you can get it working. I can't make out whether you copied the files to their right locations. I'm on CET, so I'll be offline for a while. Gotta sleep.

Please send it to tbw@theyarewatching.us

Appreciate the help. I am the network guy, so I can modify the settings. Is the timing issue related only to Cisco devices? I've got multiple vendors that send flow data to a Riverbed flow gateway, and I'm not sure I can even adjust the timings on the other platforms. What function does the one-minute export provide us? Thanks again!

I completely removed and re-added the entire plugin directory, and made sure to move all the updated files to the correct locations. The errors have changed, but I'm still not there.
:timestamp=>"2016-03-02T14:24:49.502000-0500", :message=>"Failed action. ", :status=>400, :action=>["index", {:_id=>nil, :_index=>"flowstash-2016.03.02", :_type=>"netflow", :_routing=>nil}, #<LogStash::Event:0x1513e9f2 @metadata_accessors=#<LogStash::Util::Accessors:0x2b053785 @store={}, @lut={}>, @cancelled=false, @data={"@timestamp"=>"2016-03-02T19:28:05.000Z", "netflow"=>{"version"=>9, "flow_seq_num"=>57079014, "flowset_id"=>257, "conn_id"=>178999322, "ipv4_src_addr"=>"10.170.117.49", "l4_src_port"=>0, "input_snmp"=>18, "ipv4_dst_addr"=>"0.0.0.0", "l4_dst_port"=>54395, "output_snmp"=>9997, "protocol"=>0, "icmp_type"=>6, "icmp_code"=>2, "xlate_src_addr_ipv4"=>"27.15.0.0", "xlate_dst_addr_ipv4"=>"0.0.0.0", "xlate_src_port"=>59, "xlate_dst_port"=>2730, "fw_event"=>73, "fw_ext_event"=>60866, "event_time_msec"=>10376078872824435712, "in_permanent_bytes"=>768, "ingress_acl_id"=>"0000980a-ab501a0a-aa753200", "egress_acl_id"=>"00001200-000000d4-7c270d00", "username"=>"\x06\x02\e\x0F\x00\x00\x00\x00\x00\x00\x00<\n\xAAI\xF1\xC2\x8F\xFF<\xC2\x90"\xD4\x00\x00\x00\x03\x00\x00\x00\x98\n\xABT;\n\xAAu1\x00\x00\x00/\x00\x00\x00\x00\xC2\x9D'\r\x00\x06\x02\x1A\x0F\x00\x00\x00\x00\x00\x00\x00;"}, "@version"=>"1", "type"=>"netflow", "host"=>"10.170.73.244"}, @metadata={}, @accessors=#<LogStash::Util::Accessors:0x7f6b0694 @store={"@timestamp"=>"2016-03-02T19:28:05.000Z", "netflow"=>{"version"=>9, "flow_seq_num"=>57079014, "flowset_id"=>257, "conn_id"=>178999322, "ipv4_src_addr"=>"10.170.117.49", "l4_src_port"=>0, "input_snmp"=>18, "ipv4_dst_addr"=>"0.0.0.0", "l4_dst_port"=>54395, "output_snmp"=>9997, "protocol"=>0, "icmp_type"=>6, "icmp_code"=>2, "xlate_src_addr_ipv4"=>"27.15.0.0", "xlate_dst_addr_ipv4"=>"0.0.0.0", "xlate_src_port"=>59, "xlate_dst_port"=>2730, "fw_event"=>73, "fw_ext_event"=>60866, "event_time_msec"=>10376078872824435712, "in_permanent_bytes"=>768, "ingress_acl_id"=>"0000980a-ab501a0a-aa753200", "egress_acl_id"=>"00001200-000000d4-7c270d00", 
"username"=>"\x06\x02\e\x0F\x00\x00\x00\x00\x00\x00\x00<\n\xAAI\xF1\xC2\x8F\xFF<\xC2\x90"\xD4\x00\x00\x00\x03\x00\x00\x00\x98\n\xABT;\n\xAAu1\x00\x00\x00/\x00\x00\x00\x00\xC2\x9D'\r\x00\x06\x02\x1A\x0F\x00\x00\x00\x00\x00\x00\x00;"}, "@version"=>"1", "type"=>"netflow", "host"=>"10.170.73.244"}, @lut={"type"=>[{"@timestamp"=>"2016-03-02T19:28:05.000Z", "netflow"=>{"version"=>9, "flow_seq_num"=>57079014, "flowset_id"=>257, "conn_id"=>178999322, "ipv4_src_addr"=>"10.170.117.49", "l4_src_port"=>0, "input_snmp"=>18, "ipv4_dst_addr"=>"0.0.0.0", "l4_dst_port"=>54395, "output_snmp"=>9997, "protocol"=>0, "icmp_type"=>6, "icmp_code"=>2, "xlate_src_addr_ipv4"=>"27.15.0.0", "xlate_dst_addr_ipv4"=>"0.0.0.0", "xlate_src_port"=>59, "xlate_dst_port"=>2730, "fw_event"=>73, "fw_ext_event"=>60866, "event_time_msec"=>10376078872824435712, "in_permanent_bytes"=>768, "ingress_acl_id"=>"0000980a-ab501a0a-aa753200", "egress_acl_id"=>"00001200-000000d4-7c270d00", "username"=>"\x06\x02\e\x0F\x00\x00\x00\x00\x00\x00\x00<\n\xAAI\xF1\xC2\x8F\xFF<\xC2\x90"\xD4\x00\x00\x00\x03\x00\x00\x00\x98\n\xABT;\n\xAAu1\x00\x00\x00/\x00\x00\x00\x00\xC2\x9D'\r\x00\x06\x02\x1A\x0F\x00\x00\x00\x00\x00\x00\x00;"}, "@version"=>"1", "type"=>"netflow", "host"=>"10.170.73.244"}, "type"]}>>], :response=>{"create"=>{"_index"=>"flowstash-2016.03.02", "_type"=>"netflow", "_id"=>"AVM4yjXaI89Q4Aylxu29", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [netflow.event_time_msec]", "caused_by"=>{"type"=>"json_parse_exception", "reason"=>"Numeric value (10376078872824435712) out of range of long (-9223372036854775808 - 9223372036854775807)\n at [Source: org.elasticsearch.common.io.stream.InputStreamStreamInput@7f6aa5ef; line: 1, column: 477]"}}}}, :level=>:warn}
{:timestamp=>"2016-03-02T14:24:50.377000-0500", :message=>"Template length doesn't fit cleanly into flowset", :template_id=>257, :template_length=>139, :record_length=>1219, :level=>:warn}
{:timestamp=>"2016-03-02T14:24:51.374000-0500", :message=>"Template length doesn't fit cleanly into flowset", :template_id=>257, :template_length=>139, :record_length=>1325, :level=>:warn}
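On the first warning: the mapper_parsing_exception happens because the decoded event_time_msec value exceeds the range of Elasticsearch's signed 64-bit long field, which a quick check confirms:

```shell
# Elasticsearch stores numeric fields mapped as "long" in a signed 64-bit
# integer, whose maximum is 2**63 - 1 = 9223372036854775807. The decoded
# event_time_msec (10376078872824435712) is larger, so indexing fails.
python -c 'print(10376078872824435712 > 2**63 - 1)'   # prints True
```

That out-of-range value itself suggests the field is being decoded from the wrong offset in the record, which fits the "Template length doesn't fit cleanly into flowset" warnings that follow.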

Further, it looks as if the ASA flows might be getting imported correctly now, but my regular Cisco router flows are now the ones that are broken!

I've noticed two things: Logstash seems to somewhat work importing my flows (though not completely), but since I applied the patches from your Git repo it crashes after a couple of hours. I'm waiting to catch the error, but before it crashes I'm still seeing a lot of these:
{:timestamp=>"2016-03-04T10:32:03.898000-0500", :message=>"Template length doesn't fit cleanly into flowset", :template_id=>256, :template_length=>94, :record_length=>52, :level=>:warn}

This appears to be the exception that causes the flow imports to stop completely:
{:timestamp=>"2016-03-03T16:19:47.533000-0500", :message=>"Exception in inputworker", "exception"=>#<NoMethodError: undefined method `value' for nil:NilClass>, "backtrace"=>[
"/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-netflow-2.0.3/lib/logstash/codecs/netflow/util.rb:62:in `get'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/bindata-2.2.0/lib/bindata/array.rb:208:in `each'",
"org/jruby/RubyArray.java:1613:in `each'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/bindata-2.2.0/lib/bindata/array.rb:208:in `each'",
"org/jruby/RubyEnumerable.java:752:in `collect'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-netflow-2.0.3/lib/logstash/codecs/netflow/util.rb:62:in `get'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/bindata-2.2.0/lib/bindata/primitive.rb:111:in `sensible_default'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/bindata-2.2.0/lib/bindata/base_primitive.rb:142:in `_value'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/bindata-2.2.0/lib/bindata/primitive.rb:103:in `do_num_bytes'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/bindata-2.2.0/lib/bindata/struct.rb:253:in `sum_num_bytes_below_index'",
"org/jruby/RubyRange.java:479:in `each'",
"org/jruby/RubyEnumerable.java:852:in `inject'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/bindata-2.2.0/lib/bindata/struct.rb:250:in `sum_num_bytes_below_index'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/bindata-2.2.0/lib/bindata/struct.rb:246:in `sum_num_bytes_for_all_fields'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/bindata-2.2.0/lib/bindata/struct.rb:144:in `do_num_bytes'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/bindata-2.2.0/lib/bindata/base.rb:174:in `num_bytes'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-netflow-2.0.3/lib/logstash/codecs/netflow.rb:202:in `decode_netflow9'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-netflow-2.0.3/lib/logstash/codecs/netflow.rb:89:in `decode'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/bindata-2.2.0/lib/bindata/array.rb:208:in `each'",
"org/jruby/RubyArray.java:1613:in `each'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/bindata-2.2.0/lib/bindata/array.rb:208:in `each'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-netflow-2.0.3/lib/logstash/codecs/netflow.rb:85:in `decode'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-udp-2.0.3/lib/logstash/inputs/udp.rb:96:in `inputworker'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-udp-2.0.3/lib/logstash/inputs/udp.rb:73:in `udp_listener'"], :level=>:error}


Full text of the error where flow imports stop (the forum won't let me paste it all): http://pastebin.com/uR47TmLW
Thanks for the help!
Tim

I am trying to build the gem using this command: sudo gem build logstash-codec-netflow.gemspec

ERROR:  While executing gem ... (NoMethodError)
    undefined method `metadata=' for #<Gem::Specification:0x7f9db0e93450>

I am using Logstash 2.2. Has anyone run into this issue?

ruby --version
ruby 1.8.7 (2011-06-30 patchlevel 352) [x86_64-linux]
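That `metadata='` error is most likely the system Ruby: Gem::Specification#metadata did not exist in the RubyGems that ships with Ruby 1.8.7. One workaround, sketched here under the assumption of a standard /opt/logstash install (whose bundled JRuby carries a modern RubyGems), is to build with Logstash's own Ruby instead:

```shell
# Build the gem with Logstash's bundled JRuby rather than the old system
# Ruby 1.8.7, whose RubyGems predates Gem::Specification#metadata.
cd /path/to/logstash-codec-netflow
/opt/logstash/vendor/jruby/bin/jruby -S gem build logstash-codec-netflow.gemspec
```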

I then downloaded the gem file from here:

https://rubygems.org/gems/logstash-codec-netflow/versions/2.0.5

I installed the plugin, but I'm still getting the same "no matching template" issue:

No matching template for flow id 258 {:level=>:warn}

Another suggestion was to get the template info by parsing a tcpdump capture with Wireshark.
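Capturing the export traffic for Wireshark can be done with tcpdump along these lines (the interface name and port 2055 are assumptions matching the config earlier in the thread):

```shell
# Capture NetFlow export packets to a pcap for offline analysis.
# Assumes exports arrive on UDP 2055 via interface eth0; adjust both.
tcpdump -i eth0 -w netflow.pcap 'udp port 2055'
```

Opening netflow.pcap in Wireshark then decodes the NetFlow v9 packets as the CFLOW protocol, including the template flowsets.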

So here is what I get


Packet 330 from /Users/od9655/Downloads/netflow.pcap

  • No.: 331
  • Time: 8.267551
  • Source: 172.16.201.xx
  • Destination: 10.1.52.xxx
  • Protocol: CFLOW
  • Length: 1500
  • Info: total: 28 (v9) records Obs-Domain-ID= 768 [Data-Template:258] [Data:258]

Cisco NetFlow/IPFIX
Version: 9
Count: 28
SysUptime: 285007.1539335464 seconds
Timestamp: May 10, 2016 11:29:09.000000000 PDT
CurrentSecs: 1462904949
FlowSequence: 1763496
SourceId: 768
FlowSet 1 [id=0] (Data Template): 258
FlowSet 2 [id=258] (27 flows)

How can I update the netflow.yaml file based on the info I have?

258:
- 1500
- :flows
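For reference, entries in the stock netflow.yaml map NetFlow *field type* IDs to a length, a decoder, and a field name; flowset/template IDs like 258 are not defined there, since templates are learned at runtime from the template flowsets. A hypothetical entry of the expected shape (the ID, length, and name below are illustrative only):

```yaml
# Hypothetical netflow.yaml entry: maps NetFlow field type 1 (IN_BYTES)
# to a 4-byte unsigned integer exposed as :in_bytes. Field type IDs come
# from the template records, not from the flowset ID seen in Wireshark.
1:
- 4
- :uint32
- :in_bytes
```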