Transferring and indexing data as JSON

Hello, we have a data server that contains XML and/or JSON data. We want to transform this data into JSON, then transfer and index it into Elasticsearch. After a quick search, it looks like Filebeat could be used for this.

I think Filebeat is meant for shipping logs into Elasticsearch or Logstash; does it also work for indexing data? Are there modules for managing the indexing process, i.e. harvesting and transforming the data into JSON, transferring it from the source servers to the Elasticsearch servers, handling indexing and reindexing, etc.? Is ILM (index lifecycle management) a good choice for this?

Yes, Filebeat can read any text file, and you can add whatever Filebeat processors or Elasticsearch ingest processors you need to transform the data however you want.
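For the JSON side, a minimal filebeat.yml sketch could look roughly like this (the paths, host, and index name are placeholders, not from this thread; check the Filebeat docs for your version, since newer releases favor the filestream input with an ndjson parser):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /data/export/*.json        # placeholder path to the JSON source files
    json.keys_under_root: true     # lift parsed JSON fields to the top level of the event
    json.add_error_key: true       # add error.message when a line fails to parse

processors:
  - drop_fields:                   # optional cleanup of Filebeat metadata
      fields: ["agent", "ecs", "input"]
      ignore_missing: true

output.elasticsearch:
  hosts: ["http://localhost:9200"] # placeholder Elasticsearch endpoint
  index: "mydata-%{+yyyy.MM.dd}"   # custom index also needs matching setup.template settings
```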

Thank you, I will test it.

Hi @Wonder_Garance, just to add a bit more:

There is a new XML processor in Filebeat; you can read about it here.

This is a brand-new feature, so keep that in mind...
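If you want to try that route, a sketch of what the decode_xml processor configuration might look like is below (the field names are illustrative, and since the processor is new, double-check the exact option names in the current Filebeat docs):

```yaml
processors:
  - decode_xml:
      field: message         # field containing the raw XML string
      target_field: xml      # parsed structure is stored under this key
      ignore_missing: true   # skip events that do not have the field instead of failing
      ignore_failure: false  # surface parse errors so bad documents are visible
```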

There is also a very mature and powerful XML filter/parser in Logstash (here), but that requires running Logstash, with an architecture like this:

Source -> Filebeat -> Logstash -> Elasticsearch

There is no Elasticsearch ingest processor that parses XML today.
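As a rough illustration of that Source -> Filebeat -> Logstash -> Elasticsearch layout, a Logstash pipeline using the xml filter could look something like this (the port, host, index name, and target field are assumptions, not from this thread):

```
input {
  beats {
    port => 5044                        # Filebeat ships events to this port
  }
}

filter {
  xml {
    source => "message"                 # raw XML string read by Filebeat
    target => "doc"                     # parsed XML is stored under this field
    force_array => false                # keep single elements as values rather than arrays
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]  # placeholder Elasticsearch endpoint
    index => "xml-data-%{+YYYY.MM.dd}"
  }
}
```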

@stephenb Thanks a lot, that's exactly what I was looking for.
