Invalid format for timestamp field


#1

Hi there,

I'm currently facing an issue with the timestamp field after upgrading Logstash from 1.5 to 2.0.

The data I'm trying to parse is:

    "timestamp"=>"14/Nov/2015:10:47:15 +0100",

And the date filter looks like this:

    date {
        match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
    }

And this is the error I get:

        "error"=>{
            "type"=>"mapper_parsing_exception",
            "reason"=>"failed to parse [timestamp]",
            "caused_by"=>{
                "type"=>"illegal_argument_exception",
                "reason"=>"Invalid format: \"14/Nov/2015:10:47:15 +0100\" is malformed at \"/Nov/2015:10:47:15 +0100\""}}}}, :level=>:warn}

I already double-checked the config and cannot find any mistake.

Does anybody have any good advice?

Thanks very much.


(Magnus Bäck) #2

Logstash uses the locale when parsing names of months. What's the system's locale? If it's anything other than English, you should add locale => "en" to your date filter.
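To illustrate why the locale matters, here is a standalone Python sketch (not part of Logstash itself): abbreviated month names like "Nov" only parse when the active locale's month names are English.

```python
from datetime import datetime

# "%b" matches abbreviated month names in the current locale, which is why
# a non-English system locale can make "Nov" unparseable -- the same effect
# that locale => "en" works around in the Logstash date filter.
ts = datetime.strptime("14/Nov/2015:10:47:15 +0100", "%d/%b/%Y:%H:%M:%S %z")
print(ts.isoformat())  # 2015-11-14T10:47:15+01:00
```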


#3

Hi,

OK, I already checked it; it's set to LANG=en_US.UTF-8.
I also found your suggestion in another thread, and setting the locale in the config did not work for me.


#4

I was able to solve the issue. It had absolutely nothing to do with the Logstash config or the parsing of the timestamp.

Deleting the data from Elasticsearch and just processing new events (after updating ELK to the current version) fixed it for me.


#5

I have exactly the same issue. But the solution from alex22 doesn't work for me.

Here is my config:

    filter {
        if [type] == "apache-access" {
            grok {
                match => { "rawmsg" => "%{COMBINEDAPACHELOG}" }
            }

            date {
                locale => "en"
                match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
            }
        }
    }

This is the error from logstash:

    "error"=>{
      "type"=>"mapper_parsing_exception",
      "reason"=>"failed to parse [timestamp]",
      "caused_by"=>{
        "type"=>"illegal_argument_exception",
        "reason"=>"Invalid format: \"02/Feb/2016:08:22:19 +0100\" is malformed at \"/Feb/2016:08:22:19 +0100\""
      }
    }

I use Logstash 2.1.1 and Elasticsearch 2.1.1. I've cleaned the Elasticsearch index as suggested by alex22, but this doesn't help in my case.

Searching the internet, I've found a lot of similar issues. These issues were solved by adding the locale to the date filter, but in my case that doesn't help. There is also a difference in the error message. The issue that could be solved with the locale has an error message of the form "is malformed at "Feb/2016:08:22:19 +0100"", which indicates that the month couldn't be parsed. But my error message indicates that the date couldn't be parsed at the leading slash.

Any help would be appreciated!


(Magnus Bäck) #6

The timestamp field has, for some reason, been mapped as a date in Elasticsearch, but the timestamp string parsed from the HTTP log doesn't follow that pattern. Since you'll be parsing the timestamp into the @timestamp field anyway, you might as well delete the timestamp field, which should get rid of the error above.

    date {
      locale => "en"
      match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
      remove_field => ["timestamp"]
    }
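The mismatch can be sketched outside Elasticsearch (illustrative Python only; Elasticsearch itself parses dates with Joda-Time patterns, not Python): a field mapped as date expects an ISO-8601-style string, which the Apache-style timestamp is not, while the pattern used in the date filter parses it fine.

```python
from datetime import datetime

apache_ts = "02/Feb/2016:08:22:19 +0100"

# An Apache-style timestamp is not ISO 8601, so a strict ISO parser rejects
# it -- roughly what a date-mapped field in Elasticsearch does here.
try:
    datetime.fromisoformat(apache_ts)
except ValueError:
    print("not ISO 8601:", apache_ts)

# The same string parses fine with the pattern the date filter uses.
parsed = datetime.strptime(apache_ts, "%d/%b/%Y:%H:%M:%S %z")
print(parsed.isoformat())  # 2016-02-02T08:22:19+01:00
```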

#7

Thanks, magnusbaeck! This fixed the issue.

