How can I parse logs that are already indexed in Elasticsearch with Logstash?

Hello!

How can I parse logs that are already indexed? For example, one of my log files contains a lot of information, and new ideas keep coming up for extracting that information in different ways for different use cases.

My question is whether Logstash can read data that is already indexed and create new fields from it. We have about 4 years of data.

For example, in Splunk you can always create new fields through its GUI by extracting them from data that is already indexed.

You could use an elasticsearch input and an elasticsearch output, preserving the index name and document id from the docinfo metadata. Alternatively, if you are just adding new fields, use an elasticsearch input, write out a file using the bulk and update APIs, and then use curl to POST that into elasticsearch. This thread has some discussion of that.
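
A minimal sketch of the first approach (the hosts, index pattern, and grok pattern are placeholders to adapt; the @metadata paths assume the plugin's classic default of docinfo_target => "@metadata", whereas recent versions with ECS compatibility enabled nest them under [@metadata][input][elasticsearch]):

```
input {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "my-logs-*"
    # docinfo exposes the source index and document id under @metadata
    docinfo => true
  }
}
filter {
  # Whatever new parsing you want to apply. This grok pattern is
  # just an illustration; adapt it to your log format.
  grok {
    match => { "message" => "%{IPORHOST:client_ip} %{WORD:http_method}" }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # Write each event back to the index it came from, with the same id,
    # so the original document is overwritten rather than duplicated.
    index => "%{[@metadata][_index]}"
    document_id => "%{[@metadata][_id]}"
  }
}
```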

Hello Badger!

Actually, we don't use Elasticsearch directly for searches; instead we use Kibana, and we want the new fields from the old logs to appear there.

Do I have to do the same thing?

Maybe. Kibana has support for scripted fields, which are evaluated when the data is fetched from elasticsearch. That might be enough for what you want to do. But note that they get evaluated every time a fetch occurs, and there is a cost to that.
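
As a toy example, a scripted field is just a short Painless expression defined in the index pattern settings and evaluated per document at query time. Assuming a hypothetical numeric field called bytes, a kilobytes field could be:

```
// Guard against documents where the field is missing,
// then convert bytes to kilobytes.
doc['bytes'].size() == 0 ? 0 : doc['bytes'].value / 1024.0
```

Every search that includes the field runs this script against each matching document, which is the per-fetch cost mentioned above.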

If scripted fields are not powerful enough for your use case then yes, you need to reindex.
