I'm having some problems with Elasticsearch. Keeping it short: I need to read my logs and normalize them into various indexes.
My logs are stored both in an Elasticsearch index and in a relational database, and the other indexes I want to feed are also in Elasticsearch.

At first I thought of using Logstash for this. I would keep feeding my Elasticsearch event store as I do today (e.g., getting data from the DB) and have a Logstash pipeline watch the data arriving at the event store in Elasticsearch and output the normalized data into my other index(es). The problem is that the Logstash plugin for reading data from Elasticsearch doesn't seem to have a built-in way to track what has already been read, the way the JDBC input does with tracking_column.

Is there a way to simulate the tracking_column behaviour in the Elasticsearch input plugin, or should I go with another solution? If so, do you guys have a suggestion?
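For reference, the JDBC behaviour I'd like to reproduce looks roughly like this, where the plugin persists the last seen value between runs and exposes it as `:sql_last_value` (the driver and connection details here are just placeholders):

```
input {
  jdbc {
    jdbc_driver_class      => "org.postgresql.Driver"                  # placeholder driver
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/logs"  # placeholder connection
    jdbc_user              => "logstash"
    # Only fetch rows newer than the last value seen on the previous run.
    statement              => "SELECT * FROM events WHERE id > :sql_last_value"
    use_column_value       => true
    tracking_column        => "id"        # the column whose last value is persisted between runs
    schedule               => "* * * * *" # check for new rows every minute
  }
}
```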
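To make the question concrete, the closest workaround I can think of is scheduling the elasticsearch input with a time-window query and writing idempotently by document id, roughly like the sketch below (the index names, the `@timestamp` field and the one-minute window are just examples; this is a time-based approximation, not real cursor tracking):

```
input {
  elasticsearch {
    hosts          => ["localhost:9200"]
    index          => "event-store"
    # Pull only documents newer than the last scheduled run; the window
    # has to match the schedule, so this only approximates tracking_column
    # instead of persisting a cursor.
    query          => '{ "query": { "range": { "@timestamp": { "gte": "now-1m" } } } }'
    schedule       => "* * * * *"           # run every minute
    docinfo        => true
    docinfo_target => "[@metadata][doc]"    # keep the source _id around
  }
}

filter {
  # normalization logic would go here (mutate, grok, etc.)
}

output {
  elasticsearch {
    hosts       => ["localhost:9200"]
    index       => "normalized-logs"
    # Reusing the source _id makes re-processing idempotent, so any overlap
    # between the window and the schedule overwrites instead of duplicating.
    document_id => "%{[@metadata][doc][_id]}"
  }
}
```

The obvious weakness is that documents arriving late, after the window has moved on, get missed entirely, which is exactly why I'd prefer something like tracking_column.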