Dynamic Mapping CSV file

Hello,

I have a CSV file which I need to import into Elasticsearch.

For now it is solved with grok:

grok {
  match => { "message" => "%{NUMBER:count}?;(?[^\d]*(\d+))(?[a-zA-Z]+)(?[0-9]{4});%{TIME:time}?;[...]" }
}

Unfortunately, the CSV file's header changes every day, so this static grok approach will not work.

Is there an easy way to use the header of the CSV for the mapping? Could you please give me an example?

Best Regards,
Christian

There's no easy way to do this in Logstash itself, unfortunately. You may want to follow https://github.com/elastic/logstash/issues/2088
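As a workaround outside Logstash, one option is a small pre-processing script that reads the header row of each day's file and emits one JSON document per row, keyed by whatever column names the header contains. This is a hedged sketch of that idea (not a Logstash feature); the filename `data.csv` and the `;` delimiter are assumptions based on the grok pattern above:

```python
import csv
import json

def csv_to_docs(path, delimiter=";"):
    """Yield one dict per CSV row, keyed by the column names
    found in the file's first (header) line."""
    with open(path, newline="") as f:
        reader = csv.DictReader(f, delimiter=delimiter)
        for row in reader:
            yield dict(row)

if __name__ == "__main__":
    # Print each row as a JSON document, ready for bulk indexing.
    for doc in csv_to_docs("data.csv"):
        print(json.dumps(doc))
```

Because `csv.DictReader` takes the field names from the first line of the file, the mapping adapts automatically when the header changes from day to day.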

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.