Hello All,
Based on advice from an earlier post, I created a config that splits my XML into multiple events. It looks like this:
input {
  file {
    path => "/mnt/abc/*.xml"
    start_position => "beginning"
  }
}
filter {
  if [message] =~ /<BATCH.*$/ {
    xml {
      source => "message"
      target => "theXML"
      store_xml => true
    }
    split {
      field => "[theXML][PANEL][0][DUT]"
    }
    split {
      field => "[theXML][PANEL][0][DUT][GROUP][0]"
    }
    split {
      field => "[theXML][PANEL][0][DUT][GROUP][0][GROUP]"
    }
    split {
      field => "[theXML][PANEL][0][DUT][GROUP][0][GROUP][TEST]"
    }
  }
  else {
    drop { }
  }
}
output {
  stdout {
  }
  elasticsearch {
    hosts => ["10.10.10.10:9200"]
    index => "abc"
    http_compression => true
  }
}
And here's part of my XML (line breaks added for readability; normally it's a single line):
<?xml version="1.0" encoding="UTF-8"?>
<BATCH TIMESTAMP="2019-03-04T07:28:42.208+01:00">
  <FACTORY NAME="PARIS"/>
  <PRODUCT NAME="PRODUCT"/>
  <REFS SEQ_REF=""/>
  <PANEL ID="1" COMMENT="">
    <DUT ID="1111" COMMENT="">
      <GROUP NAME="Main">
        <GROUP NAME="First_Test_Group">
          <TEST NAME="Test_1"/>
          <TEST NAME="Test_1"/>
          <TEST NAME="Test_1"/>
          <TEST NAME="Test_1"/>
        </GROUP>
        <GROUP NAME="Main_2"/>
      </GROUP>
    </DUT>
  </PANEL>
</BATCH>
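For context, the TEST elements sit four levels deep (BATCH/PANEL/DUT/GROUP/GROUP). Here's a quick sketch using Python's standard library just to illustrate that nesting — note that the Logstash xml filter itself uses Ruby's XmlSimple, which represents repeated child elements as arrays (that's why my split fields contain the [0] indices):

```python
# Illustration only: inspect the nesting of the sample BATCH document
# using Python's standard-library XML parser.
import xml.etree.ElementTree as ET

sample = """<?xml version="1.0" encoding="UTF-8"?>
<BATCH TIMESTAMP="2019-03-04T07:28:42.208+01:00">
  <FACTORY NAME="PARIS"/>
  <PRODUCT NAME="PRODUCT"/>
  <REFS SEQ_REF=""/>
  <PANEL ID="1" COMMENT="">
    <DUT ID="1111" COMMENT="">
      <GROUP NAME="Main">
        <GROUP NAME="First_Test_Group">
          <TEST NAME="Test_1"/>
          <TEST NAME="Test_1"/>
          <TEST NAME="Test_1"/>
          <TEST NAME="Test_1"/>
        </GROUP>
        <GROUP NAME="Main_2"/>
      </GROUP>
    </DUT>
  </PANEL>
</BATCH>"""

root = ET.fromstring(sample)  # root is the BATCH element
# TEST elements live under BATCH/PANEL/DUT/GROUP/GROUP
tests = root.findall("./PANEL/DUT/GROUP/GROUP/TEST")
print(len(tests))  # -> 4
```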
Here's the stdout output:
http://oneclickpaste.com/3401/
Here's one record as viewed in Kibana:
Logstash correctly generated 5 events and sent them to Elasticsearch, but each event is tagged with _split_type_failure, and I'm not sure what problems that might cause later on. What am I doing wrong in my config? What should I change so that the data is parsed properly?
Thanks in advance.