Parse a CSV file with grok in Logstash

Hello!
How can I parse a string that is enclosed in double quotes with grok in Logstash?

"Kaathadimattam, Balacola Post, NEAR Siva Tea Factory, Ooty, 643203 Ooty, India – Great location -",ooty,India,.....
thank you.

What do you want to do with it? You could use a second csv filter to parse it into multiple named fields, or a mutate+split filter to parse it into an array. There are many other things you could do with a ruby filter. What exactly do you want to end up with?
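For example, assuming the quoted text currently ends up in a field called address (just a placeholder name), the two options would look roughly like this:

```
filter {
  # Option 1: a second csv filter splits the field into named columns
  csv {
    source  => "address"
    columns => ["street", "area", "city"]   # placeholder column names
  }

  # Option 2: mutate+split turns the same field into an array instead
  # mutate {
  #   split => { "address" => "," }
  # }
}
```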

At the beginning I imported a CSV file into Elasticsearch by configuring a logstash.config file, but when viewing the data I noticed that each record was stored in a single field. So I thought I would parse the fields with the Logstash grok filter to structure my data properly. Is that how we are supposed to work with Elasticsearch?
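Concretely, is a pipeline along these lines the intended way to import a CSV? (The path, column names, and index name below are only placeholders, not my real ones.)

```
input {
  file {
    path => "/path/to/my_data.csv"      # placeholder path
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    columns   => ["address", "city", "country"]   # placeholder column names
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "my_csv_index"             # placeholder index name
  }
}
```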

If you can help me:
I parsed the data using the Grok Debugger and it worked fine; I got the structure I wanted. But when I viewed my data in Kibana's Discover, the structure was not the same... I do not understand where the problem is.
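To give an idea of what I tested (not my exact pattern, just a simplified sketch with placeholder field names):

```
filter {
  grok {
    # match the double-quoted part, then the remaining comma-separated values
    match => { "message" => "\"%{DATA:address}\",%{DATA:city},%{GREEDYDATA:rest}" }
  }
}
```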
Details are attached

(Attachment test_grok.txt is missing)

(Attachment logstash_spanish.config is missing)

(Attachment my_data_in_kibana.txt is missing)
