Approach to combining log data

What is the best approach to combining log data from two distinct types of logs?

Use Case: One log contains Events that occurred from time A to B. A second log contains user actions from time C to D, where there may or may not be overlap in the times.

Example: the Event log has 3 events occurring at 1am, 2am, and 3am. (The real timestamps are down to the millisecond; these are simplified.)
The User Action log has user actions at 1:30, 2:30 and 3:30.

I want to associate the user actions at 1:30 and 2:30 to the events.
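Client-side, the association itself is simple once both sets of timestamps are in hand. Here is a minimal sketch (not tied to Elasticsearch) that pairs each user action with the most recent event at or before it; the one-hour cutoff window is an assumption for illustration:

```python
from bisect import bisect_right
from datetime import datetime, timedelta

def associate(events, actions, window=timedelta(hours=1)):
    """Pair each action with the most recent event at or before it.

    An action is only paired if that event falls within `window`
    (the window size is an assumed cutoff, not from the original post).
    """
    events = sorted(events)
    pairs = {}
    for action in actions:
        i = bisect_right(events, action) - 1  # index of last event <= action
        if i >= 0 and action - events[i] <= window:
            pairs[action] = events[i]
    return pairs
```

With the example data (events at 1:00, 2:00, 3:00 and actions at 1:30, 2:30, 3:30), the 1:30 action pairs with the 1am event and the 2:30 action with the 2am event; whether 3:30 is paired depends on the window you choose.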

As a novice here, I created two distinct indexes for these and quickly learned that the concept of a join, as in SQL, is not present.

I looked at has_child and the join field type, but they do not solve the problem directly, as far as I can tell.

Since the logs can arrive independently (e.g. one might show up the next day, depending on connectivity), I was planning to write an app with logic that looks into the User Action index, goes hunting in the Event Log index for an event that occurred in that time range, and then deletes the User Action document after it has been placed.
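The "hunting in a time range" step maps naturally onto an Elasticsearch range query. A sketch of building that query body in Python follows; the field name `@timestamp` and the 60-minute lookback are assumptions about the event mapping, not something stated in the post:

```python
from datetime import datetime, timedelta

def event_window_query(action_time, window_minutes=60):
    """Build an Elasticsearch query-DSL body (as a plain dict) that finds
    the most recent event in the window before a user action.

    Assumptions: events carry an '@timestamp' date field, and a
    60-minute lookback window is appropriate.
    """
    start = action_time - timedelta(minutes=window_minutes)
    return {
        "query": {
            "range": {
                "@timestamp": {
                    "gte": start.isoformat(),
                    "lte": action_time.isoformat(),
                }
            }
        },
        "sort": [{"@timestamp": {"order": "desc"}}],  # newest event first
        "size": 1,  # only the closest preceding event is wanted
    }
```

The same dict translates directly into a NEST `DateRangeQuery` if you stay in .NET; the shape of the DSL is identical.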

That sounds like a dubious approach. I already use NEST to write the data, as the logs were structured in a format that was challenging to crack with Logstash, whereas in .NET I could crack it quickly because I had libraries that did that work.

Also, Kibana will be the consumer of the data, if that is of value.

Is there a good way to solve this problem?

One way is to use similar index patterns, e.g. logstash-events-* and logstash-actions-*, and then set up an index pattern of logstash-* that lets you query both.
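The wildcard pattern works because Elasticsearch (and Kibana index patterns) resolve `*` as simple glob matching over index names. A small sketch of how the shared prefix covers both index families, using Python's `fnmatch` to stand in for the server-side resolution; the dated index names are hypothetical examples:

```python
from fnmatch import fnmatch

def covered_by_pattern(index_name, pattern="logstash-*"):
    """Return True if an index name would be matched by the wildcard
    pattern, the same glob-style matching Elasticsearch applies to
    index expressions."""
    return fnmatch(index_name, pattern)

# Hypothetical daily indices from both logs
indices = [
    "logstash-events-2020.05.01",
    "logstash-actions-2020.05.01",
    "metrics-2020.05.01",
]
matched = [i for i in indices if covered_by_pattern(i)]
# matched contains both logstash-* indices but not metrics-*
```

A single search against `logstash-*` then returns documents from both logs, which can be sorted on a shared timestamp field and correlated client-side or in Kibana.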

We are working on other ways to better implement this, though, such as entity-centric indexing patterns.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.