Get the right timestamp for old log files

So I just basically write target => "@timestamp" ? Or do I just skip the "@timestamp" part?

You don't have to set target since it defaults to @timestamp.
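If you wanted to be explicit anyway, the same behavior could be spelled out like this (a sketch; "ISO8601" here is just a placeholder pattern, and target => "@timestamp" simply restates the documented default):

date {
   # target is redundant here; "@timestamp" is the default
   match => ["mytimestamp", "ISO8601"]
   target => "@timestamp"
}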

So the date filter makes my "mytimestamp" field go to the original "@timestamp" field by default? The reason I'm curious is that it didn't do that by default for me. Maybe I wrote something wrong in my date filter?

date {

   match => ["mytimestamp", "dd/MMM/YYYY:HH:mm:ss +SSSS"]

   }

So the date filter makes my "mytimestamp" field go to the original "@timestamp" field by default?

Yes. That's what I said.

The reason I'm curious is that it didn't do that by default for me. Maybe I wrote something wrong in my date filter?

Possibly yes. For example, "+SSSS" doesn't look right. Are you sure you don't mean "ZZZ", for parsing a timezone offset? The date filter will cry out in the logs if it's not able to parse a timestamp (as you saw when you still had locale => "sv" in place).

The reason the "+SSS" is there is that I'm trying to parse "15:42:51.541", haha! I don't know if there is a pattern for milliseconds.

EDIT: Sorry that was all wrong. I'm trying to parse this => 22/Jun/2015:02:37:17 +0000.

I actually don't know what the last zeros stand for.

That's the timezone offset, which ZZZ should be able to parse.

Should it be something like this?

 match => ["mytimestamp", "dd/MMM/YYYY:HH:mm:ss +ZZZZ"]

The plus sign is part of the offset and will be parsed. As I said "ZZZ" should be fine. "ZZZZ" might also work. I'm not sure if there's any difference between them. The Joda-Time documentation isn't amazingly clear on this.

Hi, I've tried them both and unfortunately neither of them seems to work. I keep getting this message:

:message=>"Failed parsing date from field", :field=>"mytimestamp", :value=>"22/Jun/2015:23:59:51", :exception=>java.lang.IllegalArgumentException: Invalid format: "22/Jun/2015:23:59:51" is too short, :level=>:warn}

This states that the format is too short, so my best guess is that the match under the date filter doesn't match the custom grok filter.

Here is what it looks like.

grok {
   type => "nexus-log"
   patterns_dir => "./config-dir/patterns"
   match => [
      "message", "\b\w+\b\s/nexus/content/repositories/(?<repositories>[^/]+)",
      "message", "(?<mytimestamp>%{MONTHDAY}/%{MONTH}/%{YEAR}:%{HOUR}:%{MINUTE}:%{SECOND})"
   ]
}
date {
   match => ["mytimestamp", "dd/MMM/YYYY:HH:mm:ss +ZZZZ"]
}

Yes, the grok expression doesn't include the timezone offset in the mytimestamp field.

Is there a certain format for the timezone offset that I can include in my custom filter?

ISO8601_TIMEZONE should work.
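For reference, the stock grok patterns file defines it (if I remember the definition correctly) roughly as:

ISO8601_TIMEZONE (?:Z|[+-]%{HOUR}(?::?%{MINUTE}))

Note that the leading + or - is matched by the pattern itself, so it shouldn't be repeated in front of %{ISO8601_TIMEZONE} in your expression.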

I tried it and got this error message.

:message=>"Failed parsing date from field", :field=>"mytimestamp", :value=>"22/Jun/2015:00:02:32 +0000", :exception=>java.lang.IllegalArgumentException: Invalid format: "22/Jun/2015:00:02:32 +0000" is malformed at "0000", :level=>:warn}

This is what the config looks like right now:

grok {
   type => "nexus-log"
   patterns_dir => "./config-dir/patterns"
   match => [
      "message", "\b\w+\b\s/nexus/content/repositories/(?<repositories>[^/]+)",
      "message", "(?<mytimestamp>%{MONTHDAY}/%{MONTH}/%{YEAR}:%{HOUR}:%{MINUTE}:%{SECOND} +%{ISO8601_TIMEZONE})"
   ]
}
date {
   match => ["mytimestamp", "dd/MMM/YYYY:HH:mm:ss +ZZZZ"]
}

What does "malformed" mean?

It seems ZZZ and ZZZZ weren't acceptable after all, but Z and ZZ work.

And again, the plus sign is part of the timezone offset, both the one recognized by ISO8601_TIMEZONE and the one parsed by Z or ZZ. Don't include it in either place. I've verified that the following filters work with the example input you've provided:

filter {
  grok {
    match => [
      "message",
      "(?<mytimestamp>%{MONTHDAY}/%{MONTH}/%{YEAR}:%{HOUR}:%{MINUTE}:%{SECOND} %{ISO8601_TIMEZONE})"
    ]
  }
  date {
    match => ["mytimestamp", "dd/MMM/YYYY:HH:mm:ss Z"]
  }
}

Yes, it works. Logstash, Elasticsearch, and Kibana all start without any error messages, although the standard "@timestamp" still shows today's date. I'm going to check everything to make sure it works as it's supposed to. Very strange.

Okay, so I have been toying around a little with the Logstash configuration. Apparently when I skip the "+" sign in front of "Z" I don't get any error messages, although Kibana 4 doesn't recognize my custom field "mytimestamp". When I have the "+" sign in front of "Z" it recognizes my custom field, but I keep getting error messages saying it has an invalid format.

Use the @timestamp field in Kibana and make sure it's correctly populated with the timestamp from your log. Do not try to get Kibana to use mytimestamp.

That is exactly the problem I'm having. The standard @timestamp is populated with the time at which I'm indexing the log instead of with the timestamp from the events in the log itself. How do I make sure it's populated with the time from the log itself? I thought you had to use the date filter and somehow make Kibana use my own timestamp as the standard one.

I'm going to try the "mutate" filter.

With the configuration you've shown the @timestamp field should be populated with the date from your log entries and everything should be fine. If you can produce a minimal example that exhibits the problem it'll be possible to help out.

The following example shows what I did to test the grok and date filter combination, and it works. So you're obviously doing something differently that causes things to break. I suggest you start from this example and step by step turn it into what you actually have and observe when it breaks.

$ cat data
22/Jun/2015:00:02:32 +00:00
$ cat test.config 
input { stdin { codec => plain } }
output { stdout { codec => rubydebug } }
filter {
  grok {
    match => [
      "message",
      "(?<mytimestamp>%{MONTHDAY}/%{MONTH}/%{YEAR}:%{HOUR}:%{MINUTE}:%{SECOND} %{ISO8601_TIMEZONE})"
    ]
  }
  date {
    match => ["mytimestamp", "dd/MMM/YYYY:HH:mm:ss Z"]
  }
}
$ /opt/logstash/bin/logstash -f test.config < data
{
        "message" => "22/Jun/2015:00:02:32 +00:00",
       "@version" => "1",
     "@timestamp" => "2015-06-22T00:02:32.000Z",
           "host" => "redacted",
    "mytimestamp" => "22/Jun/2015:00:02:32 +00:00"
}