How to execute dynamic queries with visualisations

Hi,

I am new to ELK and I am facing an issue creating visualisations.

We have two different sources (log files) from which we read logs and store the triggeredTime and pickedTime of an application event in Elasticsearch. triggeredTime and pickedTime do not belong to a single document, but both entries contain the same eventId.

For example:
Document1 is like

{
  "eventId": "EV123",
  "status": "TRIGGERED",
  "triggeredTime": "2021.08.11 12:01:02:0536"
}

Document2 is like

{
  "eventId": "EV123",
  "status": "PICKED",
  "pickedTime": "2021.08.11 12:02:03:0456"
}

I have created a data table visualisation like the one below:

eventId | status | triggeredTime | pickedTime

Now I want one more column in the data table, inQueueTime, which tells me how long that particular event has been sitting in the pending queue.

inQueueTime = pickedTime - triggeredTime
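For the example documents above, that would be roughly 61 seconds for EV123 (12:02:03:0456 minus 12:01:02:0536).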

How can I create this custom column in the data table so that it gives me the inQueueTime of an event on the fly?
OR
Is there any other visualisation I can use to calculate inQueueTime and create a dashboard out of it?

Hey! If I understand correctly, you have created one index pattern which has documents with different fields. Correct? Is it a time-based index pattern? Moreover, can you also share your data table configuration and your Kibana version?

Yes, it is a time-based index pattern with documents having different fields. My Kibana version is 7.5.1.

For the data table I am simply performing terms aggregations and splitting the rows, for example on siteAddress and appName in the config below. Likewise we also have pickedTime and triggeredTime, but they belong to different documents because they come from different sources (log files) in Elasticsearch.

"aggs": {
"2": {
"terms": {
"field": "siteAddress.keyword",
"order": {
"_key": "desc"
},
"size": 5
},
"aggs": {
"3": {
"terms": {
"field": "appName.keyword",
"order": {
"_key": "desc"
},
"size": 5
},

If you had all the information per document you could use scripted fields to do this calculation, but as it is right now I don't think that this is possible.
I think there is another Discuss post about it: Calculating one field from different documents
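Just to make the scripted-field idea concrete: if both timestamps ever did land on the same document, a one-line Painless script would do it. The sketch below uses a made-up index name (my-events-*) and assumes triggeredTime and pickedTime are mapped as date fields; on 7.x, doc['field'].value.millis gives epoch milliseconds (newer versions use .toInstant().toEpochMilli()).

GET my-events-*/_search
{
  "query": {
    "bool": {
      "filter": [
        { "exists": { "field": "triggeredTime" } },
        { "exists": { "field": "pickedTime" } }
      ]
    }
  },
  "script_fields": {
    "inQueueTimeMs": {
      "script": {
        "lang": "painless",
        "source": "doc['pickedTime'].value.millis - doc['triggeredTime'].value.millis"
      }
    }
  }
}

The same expression could be saved as a scripted field on the index pattern, but with the two timestamps split across two documents there is nothing for it to subtract, which is exactly the problem here.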

Wouldn't this be possible using the new Lens functions?

No, still the same problem. The user wants a calculation between two different docs.

@Stratoula_Kalafateli @Felix_Roessel
Can we achieve this using a transform to create an entity-centric index? But again the question remains the same: how do we join two separate documents to calculate the difference between two fields in a transform?

Just to add to the discussion, I've recently been looking into options for doing calculations across events/documents. I've come across the following options, although I don't have a wealth of direct experience in any of them.

  1. Logstash Aggregate filter - Can piece together related events and save a single event after the final event has been detected (or timed out). Has some scaling concerns.
  2. Logstash Elasticsearch filter and Elasticsearch output - Logstash can query Elasticsearch for a previously ingested event/document, use its fields to calculate something new, and then update the original document.
  3. Elasticsearch Transforms - After events have been ingested, transform them into an entity-centric index using the Transforms feature. I'm not sure how much delay you can expect from this post-processing, but Transforms can run in continuous mode, so in theory it can be fairly minimal (see the sketch after this list).
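For option 3, here is roughly what such a transform could look like for the documents in this thread. Treat it as a sketch only: the transform and index names are made up, and although I believe bucket_script is allowed inside a pivot, double-check that against the Transforms docs for your version (7.5 uses the _transform API).

PUT _transform/event-inqueue-time
{
  "source": { "index": "my-events-*" },
  "dest": { "index": "events-by-eventid" },
  "pivot": {
    "group_by": {
      "eventId": { "terms": { "field": "eventId.keyword" } }
    },
    "aggregations": {
      "triggeredTime": { "max": { "field": "triggeredTime" } },
      "pickedTime": { "max": { "field": "pickedTime" } },
      "inQueueTimeMs": {
        "bucket_script": {
          "buckets_path": {
            "picked": "pickedTime",
            "triggered": "triggeredTime"
          },
          "script": "params.picked - params.triggered"
        }
      }
    }
  }
}

Because only one document per eventId carries each timestamp, the max aggregations simply pick up whichever value exists, and since date aggregations resolve to epoch milliseconds the difference comes out in milliseconds. Adding a sync block on your timestamp field lets it run in continuous mode, and the data table (or Lens) can then be built on the new entity-centric index with inQueueTimeMs as an ordinary numeric field.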

Anyone, feel free to correct me on anything. There's also some good discussion here:

I wanted to add an additional note for option #2 I listed. Using the Elasticsearch filter plugin also has scale concerns. If events are being processed in parallel, then you might not be able to guarantee that the "previous" event is already in Elasticsearch to be queried and updated.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.