Hello,
I am completely new to Elasticsearch. I have successfully set it up and explored some test data, but fair warning: I am a newbie.
I am trying to index an XML log file through Logstash and I am lost as to how to proceed. For information, I am using Windows.
Here is the structure of the XML file I am trying to parse.
<robot>
  <suite>
    <test>
      <kw>
        <doc>Some text I want to index</doc>
        <arguments>
          <arg>Some other text I want to index</arg>
        </arguments>
      </kw>
    </test>
  </suite>
</robot>
My entire XML is thus contained in the <robot> tag.
Here is my configuration file:
input {
  file {
    path => "pathtomyxml/file.xml"
    start_position => "beginning"
    sincedb_path => "NUL"
    type => "xml"
  }
}
filter {
  xml {
    source => "message"
    store_xml => false
    xpath => [
      "//robot/suite/test/kw/doc/text()", "doc_field",
      "//robot/suite/test/kw/arguments/arg/text()", "arg_field"
    ]
  }
}
output {
  stdout {
    codec => dots
  }
  elasticsearch {
    index => "myxml-logs"
  }
}
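I also wondered whether the file input reads my XML line by line instead of as one single event, which might explain the garbage I see. I experimented with adding a multiline codec to the input so that everything up to the next <robot> tag is merged into one event (I am not sure the pattern below is right for my file, so please correct me):

```
input {
  file {
    path => "pathtomyxml/file.xml"
    start_position => "beginning"
    sincedb_path => "NUL"
    type => "xml"
    # Merge all lines that do not start with <robot into the previous event,
    # so the whole document arrives as one "message" for the xml filter
    codec => multiline {
      pattern => "^<robot"
      negate => true
      what => "previous"
    }
  }
}
```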
My goal is to store the text contained in the <doc> and <arg> elements in two separate fields.
This is what I tried based on my understanding; however, I have several questions:
- I do not really understand the source => "message" setting that I put in. It seems pretty standard, but what does it mean? The Elastic documentation was not clear enough for me.
- Could anyone point out what I am doing wrong? When I run this, I get complete nonsense when I check the index in Kibana.
Thank you in advance !