I've got two files, an XML file and a CSV file. When I ingest the data from the CSV file, I want to add fields (via something like add_field) taken from the XML file.
Here's what my XML looks like:
<Root>
  <Date>07.31.2015</Date>
  <Customer>
    <Name>John</Name>
    <ID>12345</ID>
  </Customer>
</Root>
And here is my CSV:
column1,column2,column3
example1,example2,example3
example4,example5,example6
Now I want to pull Date and ID from the XML file and add those fields to each row of the CSV when I pull it into Elasticsearch, so the indexed documents look something like this:
{
"_index" : "indx-2.0",
"_type" : "logfile",
"_id" : "abcdefg",
"_score" : 1.0,
"_source":{"message":["column1,column2,column3"],"Date":"07.31.2015", "CID": "12345"(etc)}
}
{
"_index" : "indx-2.0",
"_type" : "logfile",
"_id" : "abcdefh",
"_score" : 1.0,
"_source":{"message":["example1,example2,example3"],"Date":"07.31.2015", "CID": "12345"(etc)}
}
{
"_index" : "indx-2.0",
"_type" : "logfile",
"_id" : "abcdefi",
"_score" : 1.0,
"_source":{"message":["example4,example5,example6"],"Date":"07.31.2015", "CID": "12345"(etc)}
}
Is this kind of thing possible using Logstash, and what would the .conf file look like?
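For reference, here's a rough sketch of the kind of .conf I'm picturing. The file path, Elasticsearch host, and the hard-coded Date/CID values are just placeholders; the part I can't figure out is how to get those two values out of the XML file automatically instead of hard-coding them:

input {
  file {
    # placeholder path -- adjust to the real CSV location
    path => "/path/to/mydata.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    columns => ["column1", "column2", "column3"]
  }
  mutate {
    # hard-coded placeholders: these are the values I actually
    # want to read from the XML file instead
    add_field => {
      "Date" => "07.31.2015"
      "CID"  => "12345"
    }
  }
}

output {
  elasticsearch {
    # hosts/host option name depends on the Logstash version
    hosts => ["localhost:9200"]
    index => "indx-2.0"
  }
}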