Splunk equivalent for _index_earliest and _index_latest

(shahzad) #1

I am currently running a test VM to see if we can migrate from Splunk to the ELK stack. I've got the data into Elasticsearch and I can see it in Kibana. My next challenge is to run a few of the searches we run in Splunk. One of them returns all the logs Splunk has indexed in the last 5 minutes (or whatever time you specify). How can I do this in Kibana? I've looked at the documentation but couldn't find anything.

(Mark Walkom) #2

Have a look at https://www.elastic.co/guide/en/kibana/current/discover.html#set-time-filter

(shahzad) #3

Mark thanks for providing the link.

I'm afraid it doesn't answer my question. The set time filter applies to the timestamps of the events, whereas I want to search against the time the events were indexed.
For example, Elasticsearch is indexing yesterday's logs right now. I want to search for the data that was indexed in the last 5 minutes, and that search should return these events even though their timestamps are from yesterday. I hope my question makes sense.

(Mark Walkom) #4

Oh right, well then you need a field that records the indexing time so you can filter on it.
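One way to sketch this is to copy the ingestion time into its own field before anything rewrites @timestamp. This is only an illustrative fragment: the field name indexed_at is my own choice, and it assumes a mutate filter version that supports the copy option.

```conf
filter {
    # Preserve the ingestion time before a later date filter
    # rewrites @timestamp with the event's own time.
    # "indexed_at" is an arbitrary field name for this example.
    mutate {
        copy => { "@timestamp" => "indexed_at" }
    }
}
```

In Kibana you could then filter on indexed_at with a range query (for example, indexed_at:[now-5m TO now] in Lucene syntax) while @timestamp still reflects the event time.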

(shahzad) #5

Thanks for replying back Mark.

I believe I haven't configured my indexing properly. My @timestamp values are different from the dates in the events that have been indexed; @timestamp is showing the time the events were indexed.
I've added the date filter to match the time from the events and restarted Logstash, but that didn't work.

filter {
    date {
            match => ["date","YYYY-MM-dd HH:mm:ss"]
    }

    csv {
            columns => ["date","field2","field3","field4"]
            separator => ","
    }
}

(Christian Dahlqvist) #6

It looks like you are trying to apply the date filter to a field that you have not yet extracted. Try putting the csv filter before the date filter.
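Concretely, the reordered pipeline might look something like this (a sketch reusing the column names from the config above, so the csv filter extracts the date field before the date filter tries to parse it):

```conf
filter {
    # Extract the fields first, so "date" exists...
    csv {
        columns => ["date","field2","field3","field4"]
        separator => ","
    }
    # ...then parse it into @timestamp.
    date {
        match => ["date","YYYY-MM-dd HH:mm:ss"]
    }
}
```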

(shahzad) #7

Hi Christian, that has fixed the problem. I can see in logstash.log that it's reading the date, but it's complaining about the formatting, so I hope you'll be able to help once again.

{:timestamp=>"2016-10-18T10:43:38.739000+0100", :message=>"Failed parsing date from field", :field=>"date", :value=>"2016-10-03 08:48:11.0", :exception=>"Invalid format: \"2016-10-03 08:48:11.0\" is malformed at \".0\"", :config_parsers=>"YYYY-MM-dd HH:mm:ss", :config_locale=>"default=en_US", :level=>:warn}

This is what I have as my match stanza:

match => ["date","YYYY-MM-dd HH:mm:ss"]

Logs are in this time format:

2016-10-03 23:41:39.0

(Christian Dahlqvist) #8

As the error message indicates, there is a .0 at the end that does not match your date pattern. You may want to clean the data up before applying the date filter.
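An alternative to cleaning the data would be to teach the date filter about the fractional second. The date filter accepts Joda-style patterns, where S denotes fractions of a second, and the match option can list several patterns to try in order. An untested sketch along those lines:

```conf
filter {
    date {
        # Try the fractional-second variant first (matches "2016-10-03 08:48:11.0"),
        # then fall back to the plain pattern for values without a fraction.
        match => ["date", "YYYY-MM-dd HH:mm:ss.S", "YYYY-MM-dd HH:mm:ss"]
    }
}
```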
