Help in capturing E2E timestamp from raw logs

Hi All,

I am new to ELK and am trying to build a real-time monitoring framework that captures the end-to-end (E2E) timestamp of an async Logback application.
The ask is to capture the timestamps corresponding to a unique correlation ID for the start and end of processing, as logged in a distributed log file system.
I need some help finding the right direction to get this implemented.

It sounds like this use case would benefit from the aggregate filter plugin: Aggregate filter plugin | Logstash Reference [8.12] | Elastic

Specifically example 1 in the document sounds similar to what you're trying to do.
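To make that concrete, here is a minimal sketch of an aggregate filter along the lines of example 1. The field names (`correlation_id`, `log_message`) and the "started"/"finished" markers are assumptions; adjust them to whatever your parsed log events actually contain.

```conf
# Sketch only: field names and message patterns are assumptions,
# adapt them to your parsed log fields.
filter {
  if [log_message] =~ "Processing started" {
    aggregate {
      task_id    => "%{correlation_id}"
      code       => "map['start_time'] = event.get('@timestamp').to_f"
      map_action => "create"
    }
  }
  if [log_message] =~ "Processing finished" {
    aggregate {
      task_id     => "%{correlation_id}"
      code        => "event.set('e2e_seconds', event.get('@timestamp').to_f - map['start_time'])"
      map_action  => "update"
      end_of_task => true
      timeout     => 120   # discard the map if no end event arrives in time
    }
  }
}
```

The `task_id` is what ties the start and end events together: both events carry the same correlation ID, so they resolve to the same in-memory map, and the end event can compute the elapsed time.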

@strawgate Thank you for your response. The example pathway looks promising; I will try implementing it within the test ecosystem I have.
The only doubt/confusion I have is that each ID is a unique value randomly generated within the application. How will I be able to compare these random values and then aggregate the timestamps?

Also, one more question: in Linux I was searching within the logs using a regexp and selecting a specific group, which yielded the relevant data. How can I achieve this in Elastic?

Hi, you can use a grok filter to parse the lines using regex.
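In grok, named captures play the role of the regex groups you were selecting in Linux: each `%{PATTERN:name}` (or inline `(?<name>...)`) becomes a field on the event. A rough sketch, where the pattern and field names are assumptions about what your log lines look like:

```conf
# Sketch only: the line format shown here is an assumption.
# Example line: 2024-01-15T10:00:00,123 [worker-1] corrId=3f2a... Processing started
filter {
  grok {
    match => {
      "message" => "%{TIMESTAMP_ISO8601:log_time} \[%{DATA:thread}\] corrId=%{UUID:correlation_id} %{GREEDYDATA:log_message}"
    }
  }
}
```

Each named field (`log_time`, `correlation_id`, `log_message`) can then be used by later filters, such as aggregate.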

With aggregate you can use any field parsed from the log; the ID is just an example, but you could use an IP address or anything else.

Thank you so much for the responses; I am implementing this as advised :slightly_smiling_face:

Happy to help!

Feel free to share your progress and any issues you run into along the way!


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.