Where can we apply business logic in ELK? For instance, I don't want to index logs just the way they are; I want to add some metadata to each row. (Assume the logs are in CSV.)

Elasticsearch is mostly for querying data. However, I am stuck on the part where I want to ingest some logs with business logic applied, so that the data is already transformed by the time it reaches Elasticsearch. Can we do this while ingesting it using Logstash?

Hey,

There are several solutions to this; Logstash is one of them. Elasticsearch also features a so-called ingest node, which allows you to configure a pipeline: a set of steps/processors that change your JSON before it is indexed. That might help you as well!
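
For illustration, here is a minimal sketch of what such an ingest pipeline could look like, created through the REST API with Python's requests library. The pipeline name add-metadata, the field names, and the unsecured localhost:9200 endpoint are all assumptions for the example, not from this thread:

```python
import requests

# A hypothetical ingest pipeline that adds metadata to every incoming
# document before it is indexed. "set" is a built-in ingest processor.
pipeline = {
    "description": "Add metadata to each log row",
    "processors": [
        {"set": {"field": "environment", "value": "production"}},
        {"set": {"field": "ingested_at", "value": "{{_ingest.timestamp}}"}},
    ],
}

# Assumes Elasticsearch runs locally without security enabled.
resp = requests.put(
    "http://localhost:9200/_ingest/pipeline/add-metadata",
    json=pipeline,
)
resp.raise_for_status()

# Index a document through the pipeline via the ?pipeline= parameter;
# the two processors run before the document is stored.
requests.post(
    "http://localhost:9200/logs/_doc?pipeline=add-metadata",
    json={"message": "user logged in", "user": "jane"},
)
```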

--Alex

Thanks @spinscale. Actually I just realized I am getting the data in XLS format, which is not supported by Logstash. Either I can write a Python script to convert the XLS to JSON and let Logstash apply the business logic, or I can apply the business logic in the Python script (too much memory usage, not ideal) and push to Elasticsearch directly. Which one do you think is suitable for 12 MB of data? I think both approaches would perform about the same.

If you have to write a tool that converts the Excel data anyway, maybe do everything in one take? 12 MB of data does not sound like much, so I guess you're fine there...

If memory is a concern, do not load the whole file into memory, but read chunks, send them to Elasticsearch, then read the next...
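
As a rough sketch of that "everything in one take" approach with chunked reads: the snippet below streams rows from the spreadsheet, applies the business logic per row, and bulk-indexes in batches. It assumes the file is actually modern .xlsx (openpyxl only reads .xlsx; for legacy .xls you would need a reader like xlrd instead) and that the official elasticsearch Python client is installed; the index name, field names, and file path are made up for illustration:

```python
from openpyxl import load_workbook
from elasticsearch import Elasticsearch
from elasticsearch.helpers import streaming_bulk

es = Elasticsearch("http://localhost:9200")  # assumes a local, unsecured cluster

def generate_actions(path):
    # read_only mode streams rows lazily instead of loading the whole
    # sheet into memory.
    wb = load_workbook(path, read_only=True)
    ws = wb.active
    rows = ws.iter_rows(values_only=True)
    header = next(rows)  # first row holds the column names
    for row in rows:
        doc = dict(zip(header, row))
        # Business logic goes here, e.g. adding metadata to each row.
        doc["environment"] = "production"
        yield {"_index": "logs", "_source": doc}
    wb.close()

# streaming_bulk sends documents in batches (500 here) and yields a
# per-document result, so memory stays bounded regardless of file size.
for ok, result in streaming_bulk(es, generate_actions("logs.xlsx"), chunk_size=500):
    if not ok:
        print("failed:", result)
```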

Thank you! This helps. 🙂
