How can we determine the execution time between events named START and END? We have an msecs field to calculate the millisecond difference between two rows. I tried creating a scripted field, but it is not working as expected.

Hi @Raja_Kushwah, welcome to the community!

At query time: Is that exec key a unique ID for the transaction? If so, you can do this at query time with min and max aggregations on the msecs field, then a bucket_script pipeline aggregation to calculate the difference.

It would look something like this (put in your own field names):

POST /dataset/_search
{
  "query": {
    "bool": {
      "must": [
        {
          "match": {
            "execkey": "1413124234i234y2"
          }
        }
      ]
    }
  },
  "size": 0,
  "aggs": {
    "all_matching_docs": {
      "filters": {
        "filters": {
          "all": {
            "match_all": {}
          }
        }
      },
      "aggs": {
        "min_timestamp": {
          "min": {
            "field": "msecs"
          }
        },
        "max_timestamp": {
          "max": {
            "field": "msecs"
          }
        },
        "total_elapsed_time": {
          "bucket_script": {
            "buckets_path": {
              "min_time": "min_timestamp",
              "max_time": "max_timestamp"
            },
            "script": "params.max_time - params.min_time"
          }
        }
      }
    }
  }
}
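
The computed difference then comes back in the response under the named filters bucket, roughly like this (illustrative values, and assuming msecs is mapped as a numeric field):

{
  "aggregations": {
    "all_matching_docs": {
      "buckets": {
        "all": {
          "doc_count": 2,
          "min_timestamp": { "value": 1599735281676 },
          "max_timestamp": { "value": 1599735281695 },
          "total_elapsed_time": { "value": 19 }
        }
      }
    }
  }
}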

Here is a whole set of query-time examples that you can look at and try. The POST /dataset/_doc requests below index a few sample documents, and the queries that follow run against them. You can also get stats such as min, max, and averages over a set of transactions.

GET /dataset/_search?pretty

POST /dataset/_doc
{
  "datasetId" : "abc",
  "processName" : "process1",
  "startTimestamp": 100,
  "endTimestamp": 115,
  "duration": 15
}

POST /dataset/_doc
{
  "datasetId" : "abc",
  "processName" : "process2",
  "startTimestamp": 200,
  "endTimestamp": 250,
  "duration": 50
}


POST /dataset/_doc
{
  "datasetId" : "abc",
  "processName" : "process2",
  "startTimestamp": 300,
  "endTimestamp": 450,
  "duration": 150
}

POST /dataset/_doc
{
  "datasetId" : "def",
  "processName" : "process1",
  "startTimestamp": 1000,
  "endTimestamp": 1500,
  "duration": 500
}

POST /dataset/_doc
{
  "datasetId" : "def",
  "processName" : "process2",
  "startTimestamp": 2000,
  "endTimestamp": 3000,
  "duration": 1000
}


POST /dataset/_doc
{
  "datasetId" : "def",
  "processName" : "process2",
  "startTimestamp": 3000,
  "endTimestamp": 4500,
  "duration": 1500
}

#
# Query 1
#
POST /dataset/_search
{
  "size": 0,
  "query": {
    "bool": {
      "must": [
        { "match": { "datasetId": "abc" }}
      ]
    }
  },
  "aggs": {
    "min_timestamp": {
      "min": {
        "field": "startTimestamp"
      }
    },
    "max_timestamp": {
      "max": {
        "field": "endTimestamp"
      }
    }
  }
}

#
# Query 2
#
POST /dataset/_search
{
  "query": {
    "bool": {
      "must": [
        {
          "match": {
            "datasetId": "abc"
          }
        }
      ]
    }
  },
  "size": 0,
  "aggs": {
    "all_matching_docs": {
      "filters": {
        "filters": {
          "all": {
            "match_all": {}
          }
        }
      },
      "aggs": {
        "min_timestamp": {
          "min": {
            "field": "startTimestamp"
          }
        },
        "max_timestamp": {
          "max": {
            "field": "endTimestamp"
          }
        },
        "total_elapsed_time": {
          "bucket_script": {
            "buckets_path": {
              "min_time": "min_timestamp",
              "max_time": "max_timestamp"
            },
            "script": "params.max_time - params.min_time"
          }
        }
      }
    }
  }
}

# Query 3  Shows Total Processing Time plus stats
POST /dataset/_search
{
  "size": 0,
  "query": {
    "bool": {
      "must": [
        {
          "match": {
            "datasetId": "abc"
          }
        }
      ]
    }
  },
  "aggs": {
    "duration": {
      "stats": {
        "field": "duration"
      }
    }
  }
}

#
# Query 4  Duration stats per datasetId
#
POST /dataset/_search
{
  "size" : 0,
  "aggs": {
    "data_set_aggs": {
      "terms": {
        "field": "datasetId.keyword"
      },
      "aggs": {
        "duration": {
          "stats": {
            "field": "duration"
          }
        }
      }
    }
  }
}


#
# Query 5  Elapsed time per datasetId
#
POST /dataset/_search
{
  "size": 0,
  "aggs": {
    "data_set_aggs": {
      "terms": {
        "field": "datasetId.keyword"
      },
      "aggs": {
        "min_timestamp": {
          "min": {
            "field": "startTimestamp"
          }
        },
        "max_timestamp": {
          "max": {
            "field": "endTimestamp"
          }
        },
        "total_elapsed_time": {
          "bucket_script": {
            "buckets_path": {
              "min_time": "min_timestamp",
              "max_time": "max_timestamp"
            },
            "script": "params.max_time - params.min_time"
          }
        }
      }
    }
  }
}

Hi @Raja_Kushwah,

You can also achieve this using the Logstash filter plugin called "elapsed", which calculates the execution time between two events that share a unique transaction ID.

Please refer to the documentation for more details:

https://www.elastic.co/guide/en/logstash/current/plugins-filters-elapsed.html

Below is a sample Logstash config file for this approach.


input{
	beats{
		port => 5044
	}
}

filter{
	grok{
		match => ["message", "%{TIME:timestamp:date} %{GREEDYDATA:Thread} %{WORD:LoggingLevel}  %{WORD:RouteName} - (?<logmessage>(.|\r|\n)*)"]
	}
	json{
		source => "logmessage"
		target => "doc"
	}

	mutate {
		add_field => {"trxnId" => "%{[doc][transactionId]}"}
		add_tag => ["%{[doc][messageType]}"]
	}

	elapsed {
		unique_id_field => "trxnId"
		start_tag => "SourceRequest"
		end_tag => "SourceResponse"
		new_event_on_match => false
	}

}
output{
	elasticsearch{
		hosts => ["http://ES_HOST:ES_PORT"]
		index => "index-%{+YYYY.MM.dd}"
	}
}

In the above example, SourceRequest and SourceResponse are the start and end events of the transaction.
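
With new_event_on_match => false, the plugin adds its computed fields to the end event rather than emitting a new one. Per the plugin docs, a matched end event should come out roughly like this (the field values here are hypothetical):

{
  "trxnId": "abc123",
  "tags": ["SourceResponse", "elapsed", "elapsed_match"],
  "elapsed_time": 0.019,
  "elapsed_timestamp_start": "2020-09-10T10:54:41.676Z"
}

Note that elapsed_time is in seconds, so multiply by 1000 if you want milliseconds for your visualizations.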

Regards,
Rakesh Chhabria


Thank you so much, this is working for me.

@stephenb Is it possible to calculate the time difference between two rows and add a difference field to our data while ingesting the records?

My data looks like this:

{
  "id": "077e94c0-f354-11ea-899d-213fb1217173",
  "datetime": "2020-09-10T10:54:41.676Z",
  "msecs": "1599735281676",
  "threadid": "qtp1215694472-268"
},
{
  "id": "07817af0-f354-11ea-899d-213fb1217173",
  "datetime": "2020-09-10T10:54:41.695Z",
  "msecs": "1599735281695",
  "threadid": "qtp1215694472-268"
}

I want to calculate the difference between the msecs values of consecutive events (here 1599735281695 - 1599735281676 = 19 ms) and add a new field to the record holding that time difference, so we can build our visualization on the difference field.

I am able to determine the msecs difference with the query above, but the execution key will be dynamic. Please also guide me on how I can visualize this using an ES query.

Hi @Raja_Kushwah
To do what you are asking, you could use the method @rakesh15 described above.

Or you could use an ingest pipeline, which would be a bit more complex.
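
An ingest pipeline only sees one document at a time, so the start and end timestamps would first have to land on a single document (for example via the elapsed filter above, or the Logstash aggregate filter). As a minimal sketch, assuming hypothetical start_msecs and end_msecs fields on one document, a script processor could then compute the difference at ingest:

PUT _ingest/pipeline/compute-elapsed
{
  "description": "Sketch: compute elapsed ms from hypothetical start/end fields on one doc",
  "processors": [
    {
      "script": {
        "source": "ctx.elapsed_ms = Long.parseLong(ctx.end_msecs) - Long.parseLong(ctx.start_msecs)"
      }
    }
  ]
}

You would then index with POST /dataset/_doc?pipeline=compute-elapsed, and the resulting elapsed_ms field could be used directly in visualizations.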

It is unclear what visualization you're trying to create. Depending on what you are trying to show, the time difference may or may not need to be calculated ahead of time, although I suspect it will.

I would try the method above first.
