How to maintain the order of logs

Hi, I have a script which brings in log files, and I am pushing them into Elasticsearch via Logstash; that all appears fine. However, when searching over these logs I am finding it difficult to keep them in the order in which they appeared in the log file. I thought about adding the line number of each log entry, but it looks like I can't access that in Logstash.

I do have the timestamp, and it looks like this (see below), but as you can see, a lot of the entries share the exact same time down to the millisecond:

2016-06-14 13:53:42.5851 I 12 INFO
2016-06-14 13:53:42.5851 I 12 INFO
2016-06-14 13:53:42.5851 I 12 INFO
2016-06-14 13:53:42.5851 I 12 INFO
2016-06-14 13:53:42.7403 I 5 INFO
2016-06-14 13:53:42.7403 I 5 INFO
2016-06-14 13:53:42.7403 I 5 INFO
2016-06-14 13:53:42.7564 I 5 INFO

I am also facing a similar issue. Are there any solutions or workarounds?

If they have exactly the same timestamp, you may need to include a line number or offset in the file when you index the data and include this when sorting.
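One way to do that (a sketch, not something from the original posts) is a Logstash `ruby` filter that numbers events as they pass through. Note this only preserves file order if Logstash runs with a single pipeline worker (`-w 1`), and the counter resets whenever Logstash restarts; the field name `line_number` is made up for illustration:

```
filter {
  ruby {
    # Keep a running counter and tag each event with it.
    # Assumes the Logstash 5.x+ event API (event.set).
    init => "@line = 0"
    code => "
      @line += 1
      event.set('line_number', @line)
    "
  }
}
```

The `line_number` field can then be used as the secondary sort key alongside the timestamp.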


Thanks Christian, it works.

N1k, how did you get it to work? How did you get the offset? Is it done in Logstash or afterwards?

I got the offset in Logstash after filtering.
One thing I am concerned about, though: when the source log file gets rotated, the offset will start from 0 again, so how can we recover the order? I need to check on this.


Use the timestamp as primary sort key and the file offset as secondary?
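In query terms, that two-level sort might look like the following (the index pattern and the field names `@timestamp` and `offset` are assumptions; use whatever your pipeline actually indexes):

```
GET /logs-*/_search
{
  "sort": [
    { "@timestamp": { "order": "asc" } },
    { "offset":     { "order": "asc" } }
  ]
}
```

Documents with identical timestamps are then tie-broken by their position in the source file.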


That dual sorting will work for me. Does anyone have an example of using the file offset as part of the Logstash configuration?

How did you get the offset in Logstash (in each event)?
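For what it's worth: if the logs are shipped with Filebeat rather than read directly by the Logstash `file` input, each event already carries the byte offset within the source file (an `offset` field in Filebeat 1.x/5.x, `log.offset` in 6.x+), so nothing extra is needed on the Logstash side. A minimal sketch, assuming Filebeat 5.x and a hypothetical log path:

```
# filebeat.yml (Filebeat 5.x syntax; later versions use filebeat.inputs)
filebeat.prospectors:
  - input_type: log
    paths:
      - /var/log/myapp/*.log   # hypothetical path

output.logstash:
  hosts: ["localhost:5044"]
```

Each event then includes `offset` (and `source`, the file path), which can serve as the secondary sort key after the timestamp.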