Comparing fields of log files with different indexes in Kibana

I have two different log files in Elasticsearch, stored as two different indexes, I1 and I2, which I view in Kibana. I want to compare these two indexes (files) and produce a result based on field matching. Basically, for each row, if the entry in a field of I1 matches any entry from the entire file of I2, then that matched expression should be added as a field on that particular row in I1.
I am new to Elasticsearch and its environment. I tried doing it via DSL queries but had no luck. Please help me do this. Thank you.

I don't think you can do that easily OOTB.

Maybe you can scroll over the whole i1 index (fetch all documents) and, one by one, run a query against i2 to see if it matches, but this is going to be slow for sure.
You may be able to do that with Logstash, using an elasticsearch input and an elasticsearch filter.
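Very roughly, the pipeline could look like the sketch below (the hosts, index names, and field names such as DATA and pattern are just assumptions, adjust them to your setup). It reads every document from i1, runs a lookup against i2 for each event, and copies the matching pattern document's field onto the event. Note that the lookup query is a Lucene query string, so special characters in DATA may need escaping, and it matches on analyzed terms rather than doing a strict substring comparison:

```
input {
  # Read all documents from the i1 index (hosts/index are assumptions)
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "i1"
    query => '{ "query": { "match_all": {} } }'
  }
}

filter {
  # For each i1 event, query i2; if a document matches, copy its
  # "pattern" field into a new "matched_pattern" field on the event
  elasticsearch {
    hosts  => ["localhost:9200"]
    index  => "i2"
    query  => "pattern:(%{[DATA]})"   # Lucene query string; term-based match, not substring
    fields => { "pattern" => "matched_pattern" }
  }
}

output {
  # Write the enriched events to a new index (name is an assumption)
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "i1-enriched"
  }
}
```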

I tried doing it by fetching documents one by one, but it becomes slow and does not serve the purpose.

Could you tell me how it can be done with a Logstash config file?

If speed is an issue here, I don't think Logstash will solve it for this specific use case.

Doing lookups is a slow task.

OK, but I still need a way to do it with Logstash and its filters.

Can you tell us a bit more about the use case? What does the data in the 2 indices represent? How frequently are data in the index you are looking up against being added to or updated? What is the size of the index you are looking up against? Do you expect to always get a match?

The data in index1 is a log file that has a field "DATA", and index2 holds a set of patterns against which I want to match the DATA field in i1.
The log files contain approximately 7-8K records, and the pattern file has a total of 14 patterns to be matched.

Can you provide an example of the patterns and the log entries? If the index you are looking up against is quite static, you may be able to create a file based on the data and use the translate filter plugin, which supports strict key-value lookups as well as regular expressions. This would avoid the network round trips and might be more efficient.
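If you go that route, a rough sketch of the filter could look like the following (the dictionary path, field names, and fallback value are assumptions; on older versions of the plugin the options are named field/destination instead of source/target, and the precise regex-matching semantics are described in the translate filter docs for your version). The dictionary is a YAML file with one entry per pattern, the key being a regular expression and the value being whatever you want written into the new field:

```
filter {
  # Look up the event's DATA field against a local file holding the 14 patterns.
  # Example dictionary entry in patterns.yml (key is a regex, value is the label to add):
  #   "some pattern regex": "label_to_add"
  translate {
    source          => "DATA"                        # field to test
    target          => "matched_pattern"             # field added on a match
    dictionary_path => "/etc/logstash/patterns.yml"  # path is an assumption
    regex           => true                          # treat dictionary keys as regular expressions
    fallback        => "no_match"                    # value written when nothing matches
  }
}
```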

The patterns and logs are confidential, so I have made some changes.
The following shows the kind of patterns and the log file field "DATA":

Pattern : Error -103 creating case for service tag
DATA : Response: ***, type=----, req#=register_request, code=2, msg=Invalid value specified for SourceHeader.ClientHostName

If any of the 14 patterns exists in any record's "DATA" field, then the matched pattern should show up in I1 as another field on that record.

Any help would bring me closer to completing my project.

I do not understand the sample data you provided. Can you provide an example of each record type and describe how the match is done and what data you are looking to add to the processed event?
