```
input {
  file {
    path => "/usr/share/logstash/bin/myXML.xml"
    start_position => "beginning"
  }
}

filter {
  # I DON'T KNOW WHAT TO PUT HERE
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    user => "elastic"
    password => "changeme"
  }
}
```
I am confused as to how I should write my config file with the filters and so on. The XML file will have hundreds of fields (hence the ". . ."), and some have sub-fields (an object-oriented way of encapsulating data within other data, like a class in Java). Is there a way to dynamically parse the XML file so I don't have to manually define each field and its contents?
OK, my bad, I didn't see your configuration.
With your configuration, Logstash will read each line as a new event. To fix that, use the multiline codec. It aggregates multiple lines into a single log event; in this case it will turn the whole XML file into one event. There are plenty of samples around multiline, please go through them. As I am on a mobile device, I am unable to give you the exact config.
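A minimal sketch of what that multiline input could look like (the pattern here is an assumption: it treats every line that does not start with the XML declaration as a continuation of the previous event, so the whole file becomes one event):

```
input {
  file {
    path => "/usr/share/logstash/bin/myXML.xml"
    start_position => "beginning"
    # Assumes the file begins with an XML declaration; all other lines
    # are folded into the previous event, yielding one event per file.
    codec => multiline {
      pattern => "^<\?xml"
      negate => true
      what => "previous"
    }
  }
}
```

Since the last event in a file has nothing following it to trigger a flush, you may also need the codec's `auto_flush_interval` option so the aggregated event is actually emitted.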
Secondly, once we have aggregated the lines into a single XML event, use the ruby filter.
Below is a recursive way of iterating over all elements in an XML document:

```
code => "
  require 'nokogiri'
  # Walk every node in the tree, depth-first
  def iterative(ele)
    ele.children.each { |tempNode| iterative(tempNode) }
  end
  xml_doc = Nokogiri::XML.parse(event.get('xml-data'))
  iterative(xml_doc.root)
"
```
The above is the sample we used to parse XML and do some inline masking on the data. It should give you some insight into XML processing. If you don't have to do much manipulation on the XML, I would suggest Badger's solution rather than this.
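To see the same recursive walk outside Logstash, here is a standalone sketch using Ruby's built-in REXML in place of Nokogiri (so it runs without extra gems); the sample document and the `element_names` helper are hypothetical stand-ins for your real XML:

```ruby
require 'rexml/document'

# Hypothetical nested document standing in for the real XML file.
SAMPLE = <<~XML
  <order id="1">
    <customer><name>Ada</name></customer>
    <items><item sku="a"/><item sku="b"/></items>
  </order>
XML

# Recursively collect every element name, depth-first, mirroring the
# iterative() walk from the ruby filter above.
def element_names(ele, acc = [])
  acc << ele.name
  ele.elements.each { |child| element_names(child, acc) }
  acc
end

doc = REXML::Document.new(SAMPLE)
names = element_names(doc.root)
puts names.inspect
# prints ["order", "customer", "name", "items", "item", "item"]
```

The same traversal would let you dynamically discover fields and sub-fields instead of hard-coding each one.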