Unless your XML file has a single line, you have to use a multiline codec to join the lines of the file into a single event. This is the third time the topic of parsing XML files has come up this week.
So... this replaces the contents of the report_host_start field with the contents of the report_host_start field. What are you really trying to accomplish here?
I'm trying to extract the important data from my .nessus file, such as the report name, host name, severity, and so on. I've updated my config file to include the multiline codec, but now it seems no data is being sent to Elasticsearch. Below is my new config file.
You'll want to set the codec's auto_flush_interval option to something quite low. That'll make sure that Logstash stops waiting for the next <NessusClientData_v2> line that'll never come.
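Putting the two suggestions together, a file input with a multiline codec might look like the sketch below. The path is a placeholder, and the pattern assumes the file's root element is <NessusClientData_v2>; adjust both to your environment.

```conf
input {
  file {
    path => "/path/to/scan.nessus"
    start_position => "beginning"
    codec => multiline {
      # Join every line that is NOT the opening root element
      # onto the previous event, so the whole file becomes one event.
      pattern => "<NessusClientData_v2>"
      negate => true
      what => "previous"
      # Flush the pending event after 1 second of inactivity instead of
      # waiting forever for another line matching the pattern.
      auto_flush_interval => 1
      # The default max_lines is 500, which a .nessus file easily exceeds.
      max_lines => 100000
    }
  }
}
```

Without auto_flush_interval the codec holds the accumulated lines until it sees the next line matching the pattern, which for a single-document XML file never arrives.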
I tried setting store_xml to true and set the target, but the field declared in the xpath is still not showing in Kibana. Below are the current settings. I will send you my .nessus sample.
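For reference, an xml filter that extracts those fields could look roughly like this. The XPath expressions assume the usual .nessus layout (Report, ReportHost, and ReportItem elements with name and severity attributes); verify them against the actual file before relying on them.

```conf
filter {
  xml {
    source => "message"
    # Parse the whole document into the "doc" field.
    store_xml => true
    target => "doc"
    # xpath takes pairs of [expression, destination field].
    # Each destination field will be an array of matches.
    xpath => [
      "/NessusClientData_v2/Report/@name", "report_name",
      "/NessusClientData_v2/Report/ReportHost/@name", "host_name",
      "/NessusClientData_v2/Report/ReportHost/ReportItem/@severity", "severity"
    ]
  }
}
```

Note that xpath destination fields are arrays, so a report with many hosts or items yields multi-valued fields; that is often the point at which people split the event per ReportHost instead.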
Sorry, I don't know how to view the output from stdout { codec => rubydebug }, but I will paste the JSON output from Kibana. Below is one of the outputs from Kibana.
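To see that output, one way is to add a stdout output alongside the existing elasticsearch output and run Logstash in the foreground; the events are then printed to the terminal (or to the Logstash log when running as a service). A minimal sketch, reusing the host from the logs below:

```conf
output {
  # Print every event in a human-readable form for debugging.
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["http://192.168.1.152:9200"]
  }
}
```

When Logstash runs as a service, stdout usually ends up in the service's log (e.g. via journalctl on systemd systems), so the rubydebug events can be read from there instead.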
a) I've sent the sample .nessus file to your message box via Dropbox (if this is what you mean by the input document).
b) Do you mean an event from the Logstash log? (Sorry, I'm not sure.)
[2018-02-23T07:52:00,695][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elastic:xxxxxx@192.168.1.152:9200/]}}
[2018-02-23T07:52:00,696][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://elastic:xxxxxx@192.168.1.152:9200/, :path=>"/"}
[2018-02-23T07:52:00,967][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://elastic:xxxxxx@192.168.1.152:9200/"}
[2018-02-23T07:52:01,081][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>nil}
[2018-02-23T07:52:01,120][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-02-23T07:52:01,138][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-02-23T07:52:01,302][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-02-23T07:52:01,536][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//192.168.1.152:9200"]}
[2018-02-23T07:52:21,211][INFO ][logstash.pipeline ] Pipeline started succesfully {:pipeline_id=>"main", :thread=>"#<Thread:0x7e4df78@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:246 run>"}
[2018-02-23T07:52:21,835][INFO ][logstash.agent ] Pipelines running {:count=>1, :pipelines=>["main"]}
I've been able to change the log level from info to debug in the logstash.yml config file. Here is the log from when the service started until the XML parsing stopped.
The debug logs are not interesting. The output from a stdout { codec => rubydebug } output is.