Loading dumped, unparsed Elasticsearch data into Logstash

I have data in Elasticsearch that was sent there by Filebeat without being parsed in Logstash. I also have a dump of this data made with Elasticdump (indices and mappings as JSON files).

I need to parse these logs in Logstash. Is there a way to batch-process data that already exists in Elasticsearch?

My current data flow looks like this:
Filebeat -> Kafka/ZooKeeper -> Logstash -> Elasticsearch -> Kibana

I tried running a Kafka producer and sending the JSON documents I already have, but each document arrived as a single string in the "message" field, and the other important fields were lost.
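(For what it's worth, that symptom usually means the input isn't decoding the payload as JSON. A minimal sketch of a Kafka input with a json codec, which parses each record into top-level event fields instead of leaving it in "message"; the broker address and topic name here are assumptions:

    input {
      kafka {
        bootstrap_servers => "localhost:9092"   # assumed broker address
        topics => ["filebeat-logs"]             # hypothetical topic name
        codec => json                           # decode each record as JSON into event fields
      }
    }

With the default plain codec, the whole record lands in "message" as one string, which matches what I saw.)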

Is there a way to ingest them into my cluster while parsing them again (backfilling)? Or a way to use Logstash to parse batch data that already exists in Elasticsearch?

Thanks :)

There's an elasticsearch input plugin for Logstash that you can use: it reads documents back out of Elasticsearch, so you can run them through your filters and index the parsed results.
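A minimal pipeline sketch using that input to re-parse existing documents and write them to a new index. The hosts, index pattern, and grok pattern are assumptions you'd adapt to your data:

    input {
      elasticsearch {
        hosts => ["localhost:9200"]               # assumed cluster address
        index => "filebeat-*"                     # hypothetical pattern matching the unparsed indices
        query => '{ "query": { "match_all": {} } }'
        docinfo => true                           # keep the original index/id under @metadata
      }
    }

    filter {
      grok {
        # hypothetical pattern; replace with one that matches your log format
        match => { "message" => "%{COMBINEDAPACHELOG}" }
      }
    }

    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        # write to a new index so the unparsed source data stays intact
        index => "parsed-%{[@metadata][_index]}"
      }
    }

Writing to a separate index rather than overwriting in place lets you verify the parsed output in Kibana before deleting the originals.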
