In Splunk, we can do a lookup at query time.
Example:
lookup sn_instances ip AS ip, node AS port OUTPUTNEW name AS nodename
Is it possible to do something similar in Elastic?
I know about Logstash filters [csv, jdbc, etc.] with which we can transform the data in Logstash. But once the data is indexed, can we apply any additional transformations?
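For example, the kind of ingest-time lookup I mean can be sketched with the Logstash translate filter. This is only a rough sketch; the dictionary path and field names are made up, and older versions of the plugin use field/destination instead of source/target:

filter {
  mutate {
    # build a single lookup key from the two fields (assumes ip and port already exist)
    add_field => { "ip_port" => "%{ip}:%{port}" }
  }
  translate {
    source          => "ip_port"                    # field whose value is looked up
    target          => "nodename"                   # field that receives the match
    dictionary_path => "/etc/logstash/lookup.csv"   # hypothetical CSV of key,value rows
    fallback        => "unknown"                    # written when no entry matches
  }
}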
My use case is: I am ingesting data with a simple schema that contains only timestamp and message fields. The message field contains an IP and a port; using a Painless script I can extract these two fields, and with them I want to do a lookup [against a CSV or DB ...] that returns a readable name for that combination. Please let me know how to achieve this functionality.
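For reference, the extraction part I have in mind looks roughly like this as a search-time runtime mapping with Painless grok (available in recent Elasticsearch versions). The index name, field names, and the grok pattern are assumptions about my data, and it assumes message has a keyword subfield with doc values:

GET logs/_search
{
  "runtime_mappings": {
    "ip": {
      "type": "keyword",
      "script": "String ip = grok('%{IP:ip}:%{POSINT:port}').extract(doc['message.keyword'].value)?.ip; if (ip != null) emit(ip);"
    },
    "port": {
      "type": "keyword",
      "script": "String port = grok('%{IP:ip}:%{POSINT:port}').extract(doc['message.keyword'].value)?.port; if (port != null) emit(port);"
    }
  },
  "fields": ["ip", "port"]
}

These runtime fields exist only at search time and are not persisted in the index.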
Thank you, Christian.
But below is my use case.
The application generates log entries in a specific log file.
That is, a single log file contains many formats; it depends on the developer, and we have no control over the format of the log.
As the log has many formats, we can't apply any grok filter at index time.
In this case, the only option is to extract the required fields at query time and do the lookup.
Can we do the lookup in a Painless script, or somewhere else?
Not that I know of. Doing all this work at query time will be very slow and is generally not the best way to work with Elasticsearch in my opinion. If you want to continue doing analysis this way, which is what Splunk was designed for, why not stick with Splunk?