Replacing @timestamp correctly

Hi there!

I have a problem and hope that someone could help me with it:
I'm trying to import a file via Filebeat into Logstash, where it is processed and then indexed into Elasticsearch.
So far this has worked fine!

For the @timestamp field I want to use a timestamp extracted from the logfiles.
In the Logstash output I can see that the @timestamp field is indeed set to the extracted value.

Now my problem is that as soon as I create the index pattern in Kibana with @timestamp as the time filter field, no results are shown. If I leave the time filter empty, the lines from the indexed file are shown, complete with the @timestamp value from the logfile.

I have tried multiple things and have searched for a solution for some hours now, but I can't quite make it work.

Could someone please help me out here? Thanks!

logstash.conf:

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    break_on_match => false
    match => {
      "message" => ["%{LOGLEVEL:loglevel}", "%{TIMESTAMP_ISO8601:event_timestamp}"]
      "source"  => ["%{GREEDYDATA}\\%{GREEDYDATA:filename}.log"]
    }
  }

  mutate {
    lowercase => [ "filename" ]
  }

  date {
    match => ["event_timestamp", "ISO8601"]
  }
}

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    index => "%{[filename]}"
  }
  #stdout { codec => rubydebug }
}

Logstash Output:

{
           "loglevel" => "INFO",
           "@version" => "1",
               "tags" => [
        [0] "beats_input_codec_plain_applied"
    ],
             "offset" => 306,
            "message" => "INFO  2019-01-08 00:26:00.422 Lorem Ipsum dolor sit amet",
             "source" => "m:\\Logs\\logfile.2019-01-08.01.log",
               "host" => {
            ...]
    },
              "input" => {
        "type" => "log"
    },
           "filename" => "logfile.2019-01-08.01",
         "prospector" => {
        "type" => "log"
    },
               "beat" => {
         "version" => "6.5.4",
    },
         "@timestamp" => 2019-01-08T00:26:00.422Z,
    "event_timestamp" => "2019-01-08 00:26:00.422"
}

Kibana without time filter:
[screenshot: Kibana Discover without the time filter]

I may be misunderstanding what you're trying to do. As I read your description, I thought you wanted the @timestamp field to end up with the value from event_timestamp, which is pulled out of the message field. But your sample output shows that happening already. Can you describe what's wrong with the sample output, or give an example of what you do want it to look like?

The @timestamp is replaced as far as I can tell, that's correct!
My problem is that when I try to create an index pattern for this file and use the "time filter" in Kibana, no results are shown in "Discover".
If I don't use the "time filter", there are entries in the "Discover" tab.
Since this problem only occurred after changing the @timestamp value through Logstash, I thought it fit the Logstash category here.

Examples:

  1. Create new index pattern with time filter
     [screenshots omitted]

  2. Create index pattern without time filter
     [screenshots: 2019-02-01 19_28_42-Kibana, 2019-02-01 19_28_49-Kibana]


Thanks!

Ah, now I understand, sorry... this is more of a Kibana question, I suppose. With the time filter in place, make sure the time range you have selected in the Discover section actually includes data in the index; your events carry January 2019 timestamps, so the selected range needs to cover that period. In your included images I can't see what yours was set to. Here's a snip that shows what I'm talking about, in the upper-right corner of the page:

In that case, you would only see documents with a @timestamp value that was within the past hour.

Okay, changing the date range was the issue, thank you very much!
I had overlooked this setting...
Have a nice weekend :slight_smile:

Hi jdmcalee
I have a maillog file like this:

Jan 29 13:41:20 twofive25 dovecot: imap-login: Login: user=testatmail104@twofive25.com, method=PLAIN, rip=192.168.100.45, lip=192.168.100.125, mpid=19060, session=

I have tried many ways to parse the timestamp from

Jan 29 13:41:20

but I can't get it to work.

I have also tried this:

Parse Syslog with Logstash Grok Filter and Mapping to Elasticsearch | by Oğuzhan Karacüllü | Medium

Would you please help me solve it?

I would use dissect to get the timestamp from that.

    dissect { mapping => { "message" => "%{ts} %{+ts} %{+ts} %{}" } }
    date { match => [ "ts", "MMM dd HH:mm:ss" ] }
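
One hedged aside: syslog-style dates pad single-digit days with an extra space ("Jan  9 13:41:20"), which would throw off the dissect mapping above (the time would land in the wrong field) and would not match the "MMM dd" date pattern. If your dovecot logs can ever contain such dates, a sketch of one way to handle it is to grab the timestamp with grok's stock SYSLOGTIMESTAMP pattern, which tolerates the padding, and give the date filter both patterns to try in order:

    grok {
      # SYSLOGTIMESTAMP matches "Jan 29 13:41:20" and "Jan  9 13:41:20" alike
      match => { "message" => "^%{SYSLOGTIMESTAMP:ts}" }
    }
    date {
      # two-digit day first, then the space-padded single-digit form
      match => [ "ts", "MMM dd HH:mm:ss", "MMM  d HH:mm:ss" ]
    }

dissect is still the cheaper option when the fields sit at fixed positions, which is why I would reach for it first here.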

Thanks, I got it.
I have an additional question: I'm monitoring the duration of email sessions, so how can I get the duration from the Login time to the Logout time?

Use an elapsed filter, or if that does not work, an aggregate filter. A minimal sketch of the elapsed approach is below.
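
This sketch assumes you tag the Login and Logout events yourself and that both lines carry the dovecot session id; the grok patterns and the session_id field name are illustrative assumptions, not taken from your actual log format, so adjust them to your lines:

    filter {
      # Tag logins/logouts and extract a correlation id.
      # These patterns are assumptions - adapt them to your real messages.
      grok {
        match => { "message" => "imap-login: Login: .*session=<%{DATA:session_id}>" }
        add_tag => [ "login" ]
        tag_on_failure => []
      }
      grok {
        match => { "message" => "Logged out.*session=<%{DATA:session_id}>" }
        add_tag => [ "logout" ]
        tag_on_failure => []
      }

      # elapsed pairs each "login" event with the next "logout" event that
      # shares the same session_id and adds an elapsed_time field (seconds).
      elapsed {
        start_tag => "login"
        end_tag => "logout"
        unique_id_field => "session_id"
        timeout => 86400
      }
    }

If your Logout lines do not contain the session id, you would need some other value shared by both events (for example the user plus the mpid from your sample line) as the unique_id_field.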

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.