Hi Experts,
I'm trying a simple use case with ELK to get started.
The use case: I have an XML file like the one below
======================
<stations version="2.0">
	<station>
		<id>2</id>
		<name>Irving</name>
		<installed>true</installed>
	</station>
	<station>
		<id>3</id>
		<name>DFW</name>
		<installed>true</installed>
	</station>
</stations>
======================
I want to extract the id and name fields and their values from the above XML and send them to Elasticsearch.
Once that's done, I want to query the station with id=2 from Kibana.
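To be concrete, once the fields are extracted I'd expect documents roughly like the one below in the stations1 index, and I'd query them from Kibana's Dev Tools with something like the search shown (the flat field name id is my assumption about how the xpath result should be stored):

```json
{ "id": "2", "name": "Irving" }
```

```json
GET stations1/_search
{
  "query": { "match": { "id": "2" } }
}
```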
For that, I created the config file below:
================
input {
  file {
    path => "/var/tmp/tests/input.xml"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => multiline {
      pattern => ""
      negate => true
      what => "previous"
    }
  }
}
filter {
  xml {
    store_xml => false
    source => "message"
    target => "doc"
    xpath => [
      "/stations/station/id/text()", "id",
      "/stations/station/name/text()", "name"
    ]
  }
}
output {
  stdout {
    codec => rubydebug
  }
  file {
    codec => "json"
    path => "/var/tmp/logstash_out.log"
  }
  elasticsearch {
    hosts => "localhost:9200"
    index => "stations1"
    codec => "json"
    #document_type => "xmlfiles"
  }
}
================
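As a sanity check outside Logstash, I verified the extraction itself with a small Python sketch (standard library ElementTree, which only supports a limited XPath subset, so the expressions differ slightly from the filter's), and it pulls out exactly the id/name pairs I expect:

```python
import xml.etree.ElementTree as ET

XML = """<stations version="2.0">
	<station> <id>2</id> <name>Irving</name> <installed>true</installed> </station>
	<station> <id>3</id> <name>DFW</name> <installed>true</installed> </station>
</stations>"""

root = ET.fromstring(XML)
# ElementTree's findall/findtext cover this simple structure without full XPath.
records = [
    {"id": s.findtext("id"), "name": s.findtext("name")}
    for s in root.findall("station")
]
print(records)
# → [{'id': '2', 'name': 'Irving'}, {'id': '3', 'name': 'DFW'}]
```

So the XPath logic seems fine; the problem appears to be on the Logstash side.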
Logstash is processing it and forwarding it to Elasticsearch.
But I'm facing a couple of problems:
- Logstash is not filtering the installed line out of the output. Moreover, it's still emitting the content in XML form only; the id and name fields from the xpath expressions don't appear in the event.
Below is the console debug message:
================
[DEBUG] 2019-09-26 21:31:10.428 [[main]>worker2] xml - Event after xml filter {:event=>#<LogStash::Event:0x5dc66c49>}
{
    "@timestamp" => 2019-09-26T21:31:10.306Z,
      "@version" => "1",
       "message" => "\t<station>\n\t\t<id>3</id>\n\t\t<name>DFW</name>\n\t\t<installed>true</installed>\n\t</station>\n",
          "tags" => [
        [0] "multiline"
    ],
          "path" => "/var/tmp/tests/input.xml",
          "host" => "elk"
}
================
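For comparison, what I was hoping to see in the rubydebug output (my assumption; the xml filter's xpath option stores results as arrays) is something like:

```
{
    "@timestamp" => 2019-09-26T21:31:10.306Z,
      "@version" => "1",
            "id" => ["3"],
          "name" => ["DFW"],
          "path" => "/var/tmp/tests/input.xml",
          "host" => "elk"
}
```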
Because of this, in the Kibana GUI I can't see any field with key id, so I can't do the search based on the id.
Appreciate your help.
Thanks
Ram