How to load data from large Parquet files (500M records) into Elasticsearch using Logstash

How can I load data from large Parquet files containing around 500 million records into Elasticsearch using Logstash?

I am not aware of any plugin that supports the Parquet format directly, but there seems to be a workaround available here. Another option might be the Elasticsearch-Hadoop connector, though that would bypass Logstash entirely.
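One common workaround (a sketch of the general idea, not necessarily the one linked above) is to convert each Parquet file to newline-delimited JSON, which Logstash can ingest natively. A minimal sketch, assuming the `pyarrow` library is available; the function names here are my own, not part of any plugin:

```python
import json


def rows_to_ndjson(rows):
    """Serialize an iterable of dicts to newline-delimited JSON (NDJSON)."""
    # default=str handles non-JSON types such as timestamps
    return "".join(json.dumps(row, default=str) + "\n" for row in rows)


def parquet_to_ndjson(parquet_path, out_path, batch_size=10_000):
    """Stream a Parquet file to NDJSON in batches to keep memory bounded.

    Assumes pyarrow is installed; imported lazily so the helper above
    stays usable without it.
    """
    import pyarrow.parquet as pq

    pf = pq.ParquetFile(parquet_path)
    with open(out_path, "w") as out:
        # iter_batches avoids loading all 500M records at once
        for batch in pf.iter_batches(batch_size=batch_size):
            out.write(rows_to_ndjson(batch.to_pylist()))
```

With files this large, you would run the conversion per file (or per partition) and point a Logstash `file` or `s3` input at the resulting NDJSON output.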

My Parquet files are in an S3 repository and I am using the S3 input plugin. I just want to know how to import the data into Elasticsearch using Logstash.
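For context: the S3 input plugin reads objects as line-oriented text, so it cannot parse Parquet directly. If the files were first converted to newline-delimited JSON in the bucket, a pipeline could look roughly like this (bucket name, prefix, hosts, and index are placeholders; credentials are omitted):

```
input {
  s3 {
    bucket => "my-bucket"        # placeholder
    region => "us-east-1"        # placeholder
    prefix => "exports/ndjson/"  # placeholder
    codec  => "json"             # one JSON document per line
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]  # placeholder
    index => "my-index"                 # placeholder
  }
}
```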

As far as I know, no Parquet codec exists for Logstash, so that may require developing one.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.