Hi,
I'm running into some serious issues parsing a relatively large Nmap scan with the logstash-codec-nmap plugin. I have a ~1.8 MB XML file containing the results of the scan that I want to import, but the import fails with both the stdin and the HTTP input.
input {
  stdin {
    codec => nmap {
      emit_hosts => true
      emit_ports => true
      emit_traceroute_links => true
      emit_scan_metadata => true
    }
  }
}

output {
  elasticsearch {
    hosts => [ "127.0.0.1:9200" ]
    index => "nmapdebug"
    codec => "json"
  }
}
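For completeness, I'm feeding the file into Logstash roughly like this (the config file name is just a placeholder for the configuration above):

# pipe the raw Nmap XML into the stdin input
cat foobar.xml | /usr/share/logstash/bin/logstash -f nmap-stdin.conf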
When using the above configuration, Logstash fails with the following error:
[INFO ] 2018-08-09 16:35:41.861 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9603}
[ERROR] 2018-08-09 16:35:42.093 [[main]<stdin] pipeline - A plugin had an unrecoverable error. Will restart this plugin.
Pipeline_id:main
Plugin: <LogStash::Inputs::Stdin codec=><LogStash::Codecs::Nmap emit_hosts=>true, emit_ports=>true, emit_traceroute_links=>true, emit_scan_metadata=>true, id=>"466103d9-1a18-4549-b34e-f016f96cdc41", enable_metric=>true>, id=>"10bfae584e3d4f5f26d5926b2455d9f9fa5068a0b3bad04ff4cd88d41af5d737", enable_metric=>true>
Error: undefined method `attributes' for nil:NilClass
Exception: NoMethodError
Stack: /usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-codec-nmap-0.0.21/lib/logstash/codecs/nmap.rb:45:in `decode'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-stdin-3.2.6/lib/logstash/inputs/stdin.rb:38:in `run'
/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:512:in `inputworker'
/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:505:in `block in start_input'
After some debugging, I found that the XML file is somehow being split into large chunks before it reaches the codec, so the ruby-nmap gem fails because each chunk on its own is not valid XML, which in turn produces the error above. Even making sure the entire XML file is on a single line does not resolve the issue. With smaller scans (and therefore smaller XML files) the codec works perfectly; the problem only occurs with larger files.
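The single-line attempt was roughly this (same placeholder config as above), and it fails with the same NoMethodError:

# strip all newlines so the whole XML document arrives as one line
tr -d '\n' < foobar.xml | /usr/share/logstash/bin/logstash -f nmap-stdin.conf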
Because of this, I switched to the HTTP input plugin with the following configuration:
input {
  http {
    host => "127.0.0.1"
    port => 34568
    codec => nmap {
      emit_hosts => true
      emit_ports => true
      emit_traceroute_links => true
      emit_scan_metadata => true
    }
  }
}

output {
  elasticsearch {
    hosts => [ "127.0.0.1:9200" ]
    index => "nmapdebug"
    codec => "json"
  }
}
I then used curl to upload the XML file. This works with smaller files, but with larger files curl times out before the entire file has been processed, so only part of the results ends up in Elasticsearch.
$ cat foobar.xml | curl -H "x-nmap-target: foobar" http://localhost:34568 --data-binary @-
curl: (52) Empty reply from server
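In case it's relevant, a variant with a generous client-side timeout would look roughly like the command below (the 600-second value is an arbitrary guess on my part), but I suspect the limit is on the Logstash side rather than in curl:

# same upload, but give curl up to 10 minutes before it gives up
curl --max-time 600 -H "x-nmap-target: foobar" --data-binary @foobar.xml http://localhost:34568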
I'm basically looking for a way to parse the entire Nmap XML file with Logstash and logstash-codec-nmap, but so far I haven't been able to get it to work.
Any help is highly appreciated.