Loading a Fixed-Length CSV File with Logstash

Hi,
I'm new to Elasticsearch. I have installed Elasticsearch and Logstash, and now I need to import a fixed-length CSV file and index the data into Elasticsearch.
Can you please help me with the configuration?

Thanks in advance.

Use a file input to read the file, a csv filter to process each line, and an elasticsearch output to send the resulting data to ES.
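For a delimiter-separated file, a minimal pipeline could look like the sketch below. The file path, column names, and Elasticsearch address are placeholders, not values from your setup:

```
input {
  file {
    path => "/path/to/your/file.csv"   # placeholder path
    start_position => "beginning"
  }
}

filter {
  csv {
    separator => ","
    columns => ["col1", "col2", "col3"]   # placeholder column names
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]   # adjust to your ES address
  }
}
```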

Thanks for the reply. Can you please help me separate columns that have a fixed length?

So there's no column separator at all? Then it's not really CSV. I'm not sure Logstash has a plugin to process fixed-length columns without delimiters, but a ruby filter can certainly do it.

For further help we'll need to see an example.
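Just to illustrate the idea, the slicing logic inside a ruby filter could look something like this. The field names and offsets here are hypothetical until we know your actual record layout:

```ruby
# Sketch: slice a fixed-width line into fields by character offset.
# Field names and offsets below are hypothetical examples only.
def parse_fixed_width(line)
  {
    "field_a" => line[0, 2],  # first 2 characters
    "field_b" => line[2, 6],  # next 6 characters
  }
end
```

In an actual Logstash config this would go inside a `ruby` filter's `code` option, writing each slice back onto the event with `event.set`.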

Thanks for the reply. Here is my example:

Input data in the file:
01KIRANV1990
03GIRISH1985

Output columns:
Id: first two digits
Name: characters 3 to 8 (the next 6 characters)
Year: the last 4 characters

ID NAME Year

01 KIRANV 1990
03 GIRISH 1985

So... no delimiters at all then? You could use a grok filter for this, e.g. with this expression:

^(?<id>..)(?<name>.{6})(?<year>.{4})
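In a pipeline, that would sit in a grok filter along these lines. The assumption here is that each record arrives in the default `message` field:

```
filter {
  grok {
    match => { "message" => "^(?<id>..)(?<name>.{6})(?<year>.{4})" }
  }
}
```

With your sample input, `01KIRANV1990` would then produce `id` = `01`, `name` = `KIRANV`, and `year` = `1990`.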

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.