Load database CSV

Hi guys!

I have a relational database and I want to migrate its data into indices in Elasticsearch. I have created 3 indices using datasets, but now I have a problem: how can I load a list using CSV?

I have this:

  • Config:

filter {
  csv {
    columns => ["code1", "code2", "name", "uris"]
    separator => "|"
    remove_field => ["message", "host", "@version", "@timestamp", "_all"]
  }
}

I know that it is incorrect, but I don't know how to do it properly.

CSV:

000-000001|0-1|JOAQUIN|[eur.com,su.com]

[eur.com,su.com] --> I want to load a list of uris, where the values are eur.com and su.com
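One possible approach (a sketch, not tested against your data): strip the surrounding brackets with a mutate gsub, then split the field on commas so that uris becomes an array of strings instead of a single string. The column names are taken from your config above:

  filter {
    csv {
      columns => ["code1", "code2", "name", "uris"]
      separator => "|"
    }
    mutate {
      # remove the literal [ and ] around the uris value
      gsub => ["uris", "[\[\]]", ""]
    }
    mutate {
      # turn "eur.com,su.com" into ["eur.com", "su.com"]
      split => { "uris" => "," }
    }
  }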

I will be grateful if you can help me.

Thanks!

Why is that incorrect?

I moved the thread to #logstash

First of all, thanks warkilm for answering me. If I load it like that, I will get one field named uris that only has one value: [eur.com,su.com]

I would like to get this structure:

"uris": [
  { "name": "eur.com" },
  { "name": "su.com" }
]
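If you need each entry as an object with a name key rather than a plain string, one option (a hedged sketch, assuming uris has already been split into an array of strings) is a ruby filter that wraps each value:

  filter {
    ruby {
      # wrap each uri string in a { "name" => ... } object,
      # e.g. ["eur.com", "su.com"] becomes
      # [{ "name" => "eur.com" }, { "name" => "su.com" }]
      code => '
        uris = event.get("uris")
        if uris.is_a?(Array)
          event.set("uris", uris.map { |u| { "name" => u } })
        end
      '
    }
  }

Whether you also need a nested mapping for uris in your index depends on how you plan to query it.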

Thanks!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.