Lookup during query time

In Splunk we can do a lookup at query time:

```
lookup sn_instances ip AS ip, node AS port OUTPUTNEW name AS nodename
```

Is it possible to do something similar in Elasticsearch?

I know about Logstash filters [csv, jdbc, etc.], which can transform the data in Logstash. But once the data is indexed, can we do any additional transformations if we want to?

My use case is: I am ingesting data with a simple schema that contains only timestamp and message fields. The message field contains an IP and a port. Using a Painless script I can extract these two fields, and using the two extracted fields I want to do a lookup [against a CSV or DB ...] that returns a readable name for that combination. Please let me know how to achieve this functionality.
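The extraction step described above might look something like the following search-time runtime field. This is only a hedged sketch: the index name `logs`, the runtime field name `client_ip`, and the assumption that `message` is mapped with a `keyword` subfield are all hypothetical, and runtime fields require a reasonably recent Elasticsearch release:

```json
GET logs/_search
{
  "runtime_mappings": {
    "client_ip": {
      "type": "keyword",
      "script": {
        "source": "String ip = grok('%{IP:ip}').extract(doc['message.keyword'].value)?.ip; if (ip != null) emit(ip);"
      }
    }
  },
  "fields": ["client_ip"]
}
```

Note this only extracts the field at query time; it does not perform the lookup against an external CSV or database, which is the part of the question that remains open.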


No, this is something you typically do at index time when working with Elasticsearch. Please see this blog post for further details.
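For illustration, one index-time approach in more recent Elasticsearch versions (7.5+) is the enrich processor, which joins incoming documents against a lookup index in an ingest pipeline. A hedged sketch, where the lookup index `sn_instances`, the policy and pipeline names, and the field names are all hypothetical:

```json
PUT /_enrich/policy/sn-instances-policy
{
  "match": {
    "indices": "sn_instances",
    "match_field": "ip",
    "enrich_fields": ["name"]
  }
}

POST /_enrich/policy/sn-instances-policy/_execute

PUT /_ingest/pipeline/sn-lookup
{
  "processors": [
    {
      "enrich": {
        "policy_name": "sn-instances-policy",
        "field": "ip",
        "target_field": "nodename"
      }
    }
  ]
}
```

Documents indexed through the `sn-lookup` pipeline would then carry the looked-up name in `nodename`, so no lookup is needed at query time.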

Thank you Christian.
But below is my use case.
The application writes its log entries to a specific log file.
That is, a single log file contains many formats; the format depends on the developer, and we have no control over it.
Since the log has many formats, we can't apply any single grok filter at index time.

In this case, the only option is to extract the required fields at query time and then do the lookup.
Can we do the lookup in a Painless script, or somewhere else?


Not that I know of. Doing all of this work at query time would be very slow and is generally not the best way to work with Elasticsearch, in my opinion. If you want to continue doing analysis this way, which is what Splunk was designed for, why not stick with Splunk?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.