I'm using Logstash to index my XML file data into Elasticsearch. My question is how to store the data so that it can be searched quickly:
1) Should I put all the file data in one field?
2) Should I add a separate field for each tag of my XML file? (If so, there are a lot of tags in my XML files; is there a way to create the fields automatically?)
Have a look at Logstash's xml filter. It'll create a field for each tag in your XML document. Indexing the XML data in a single field is most likely a bad idea.
Hi Magnus,
I have already looked into it, and I'm able to create fields using XPath, but I need to manually write an XPath expression for each and every tag in the Logstash config.
I have hundreds of XML files with hundreds of tags, so manually writing XPath expressions for each file isn't feasible.
Is there a way to make Logstash automatically create fields and assign data from the XML file?
Here is a snapshot of my XML file.
Then don't use the xpath option. Set store_xml to true (that's the default anyway) and set target to indicate where you want to store the results.
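A minimal filter sketch along those lines might look like this (field names here are placeholders; "message" assumes the raw XML ends up in the default event field, and "parsed" is an arbitrary target name):

```
filter {
  xml {
    source    => "message"   # field holding the raw XML string
    store_xml => true        # default: parse the whole document, no xpath needed
    target    => "parsed"    # each XML tag becomes a subfield under [parsed]
  }
}
```

With store_xml enabled, the filter walks the whole document and creates a field per tag automatically, so no per-tag XPath expressions are required.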
Please don't post screenshots if it's text that can be copied and pasted.
Thank you!
It solved my problem.
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.