I have assigned a unique ID to each pair of lines. Now I need to calculate the transaction time in seconds between the datetime values on those two lines, and finally display a line chart showing the average transaction time over time.
Is there any way to do this without modifying the Logstash config?
Elasticsearch and Kibana generally expect you to work with a single document at a time, but there are a few options:
Without changing your Logstash pipeline, you could try the transforms feature of Elasticsearch, since each pair of lines shares a unique ID. You could set up the transform this way:
Group by the unique ID
Take the minimum and maximum timestamps and store each in its own field
You can even calculate the duration (max minus min) with a Painless script inside the transform
This would let you visualize the duration using Kibana.
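As a rough sketch of those steps, a continuous transform definition might look like the following (run from Kibana Dev Tools). The index name `my-logs-*`, destination index, and the field names `transaction_id` and `@timestamp` are assumptions here; substitute your own. Note the `bucket_script` works on the millisecond values of the min/max date aggregations, so dividing by 1000 gives seconds:

```
PUT _transform/transaction-duration
{
  "source": { "index": "my-logs-*" },
  "dest": { "index": "transaction-durations" },
  "pivot": {
    "group_by": {
      "txn_id": { "terms": { "field": "transaction_id" } }
    },
    "aggregations": {
      "start": { "min": { "field": "@timestamp" } },
      "end": { "max": { "field": "@timestamp" } },
      "duration_seconds": {
        "bucket_script": {
          "buckets_path": { "min": "start", "max": "end" },
          "script": "(params.max - params.min) / 1000"
        }
      }
    }
  }
}
```

Once the transform has populated the destination index, you can build an ordinary Kibana line chart over `duration_seconds` with an average metric.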
Another option might be to use Vega to build a query that groups by the unique ID and performs the same min/max calculation described above, entirely at query time. This is probably harder than the transform approach.
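A rough Vega-Lite sketch of that idea, again with `my-logs-*`, `transaction_id`, and `@timestamp` as assumed names, and with the terms bucket `size` capped (a terms aggregation won't return every ID on a large index):

```
{
  "$schema": "https://vega.github.io/schema/vega-lite/v4.json",
  "data": {
    "url": {
      "index": "my-logs-*",
      "body": {
        "size": 0,
        "aggs": {
          "by_id": {
            "terms": { "field": "transaction_id", "size": 1000 },
            "aggs": {
              "start": { "min": { "field": "@timestamp" } },
              "end": { "max": { "field": "@timestamp" } }
            }
          }
        }
      }
    },
    "format": { "property": "aggregations.by_id.buckets" }
  },
  "transform": [
    {
      "calculate": "(datum.end.value - datum.start.value) / 1000",
      "as": "duration_seconds"
    }
  ],
  "mark": "line",
  "encoding": {
    "x": { "field": "start.value", "type": "temporal", "title": "Start time" },
    "y": { "field": "duration_seconds", "type": "quantitative", "title": "Duration (s)" }
  }
}
```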
Transforms were introduced in 7.2, and have been significantly improved in later versions. 7.9 is the most recent version of the stack that has been released.