Elasticsearch failed to parse date field with logstash

Here is the Logstash rubydebug output:

{
     "@timestamp" => "2015-06-01T07:56:39.581Z",
        "message" => "2015-06-01 15:56:39,213 [vert.x-eventloop-thread-0] INFO  hawkeyes.rtds.monitor.MetricsVerticle  - Metrics of RTDS:\n{DB Worker={max=2674, min=1, total=2728, count=3, avg=909.3333333333}, Realtime Cacher={max=2349, min=1000, total=2349, count=1, avg=2349}}",
       "@version" => "1",
           "tags" => [
        [0] "multiline",
        [1] "rtds_log",
        [2] "vertx"
    ],
           "type" => "rtds",
           "host" => "hawkeyesTest",
           "path" => "/var/log/rtds/rtds.log",
           "date" => "2015-06-01 15:56:39,213",
    "thread_name" => "vert.x-eventloop-thread-0",
      "log_level" => "INFO",
       "verticle" => "hawkeyes.rtds.monitor.MetricsVerticle",
        "content" => "Metrics of RTDS:"
}

But elasticsearch raises an error:

[2015-06-01 15:56:29,906][DEBUG][action.bulk              ] [Watcher] [logstash-2015.06.01][2] failed to execute bulk item (index) index {[logstash-2015.06.01][rtds][AU2uHzNIK8Y8pKwZT0pi], source[{"@timestamp":"2015-06-01T15:56:24.560+08:00","message":"2015-06-01 15:56:24,213 [vert.x-eventloop-thread-0] INFO  hawkeyes.rtds.monitor.MetricsVerticle  - Metrics of RTDS:\n{DB Worker={max=2674, min=1, total=2728, count=3, avg=909.3333333333}, Realtime Cacher={max=2349, min=1000, total=2349, count=1, avg=2349}}","@version":"1","tags":["multiline","rtds_log","vertx"],"type":"rtds","host":"hawkeyesTest","path":"/var/log/rtds/rtds.log","date":"2015-06-01 15:56:24,213","thread_name":"vert.x-eventloop-thread-0","log_level":"INFO","verticle":"hawkeyes.rtds.monitor.MetricsVerticle","content":"Metrics of RTDS:"}]}
org.elasticsearch.index.mapper.MapperParsingException: failed to parse [date]
	at org.elasticsearch.index.mapper.core.AbstractFieldMapper.parse(AbstractFieldMapper.java:416)
	at org.elasticsearch.index.mapper.object.ObjectMapper.serializeValue(ObjectMapper.java:709)
	at org.elasticsearch.index.mapper.object.ObjectMapper.parse(ObjectMapper.java:500)
	at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:542)
	at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:491)
	at org.elasticsearch.index.shard.service.InternalIndexShard.prepareCreate(InternalIndexShard.java:410)
	at org.elasticsearch.action.bulk.TransportShardBulkAction.shardIndexOperation(TransportShardBulkAction.java:432)
	at org.elasticsearch.action.bulk.TransportShardBulkAction.shardOperationOnPrimary(TransportShardBulkAction.java:149)
	at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction.performOnPrimary(TransportShardReplicationOperationAction.java:512)
	at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction$1.run(TransportShardReplicationOperationAction.java:419)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
Caused by: org.elasticsearch.index.mapper.MapperParsingException: failed to parse date field [2015-06-01 15:56:24,213], tried both date format [dateOptionalTime], and timestamp number with locale []
	at org.elasticsearch.index.mapper.core.DateFieldMapper.parseStringValue(DateFieldMapper.java:620)
	at org.elasticsearch.index.mapper.core.DateFieldMapper.innerParseCreateField(DateFieldMapper.java:548)
	at org.elasticsearch.index.mapper.core.NumberFieldMapper.parseCreateField(NumberFieldMapper.java:236)
	at org.elasticsearch.index.mapper.core.AbstractFieldMapper.parse(AbstractFieldMapper.java:406)
	... 12 more
Caused by: java.lang.IllegalArgumentException: Invalid format: "2015-06-01 15:56:24,213" is malformed at " 15:56:24,213"
	at org.elasticsearch.common.joda.time.format.DateTimeFormatter.parseMillis(DateTimeFormatter.java:754)
	at org.elasticsearch.index.mapper.core.DateFieldMapper.parseStringValue(DateFieldMapper.java:614)
	... 15 more

logstash version: 1.4.2
elasticsearch version: 1.4.5

Here is my logstash grok:

filter {
    if [type] == "rtds" {
        grok {
            match => { "message" => 
                "%{TIMESTAMP_ISO8601:date} \[(?<thread_name>.+?)\] (?<log_level>\w+)\s+(?<verticle>.+?)  -\s*(?<content>.*)"
            }
        }

        if [log_level] == "ERROR" or "Exception" in [message] {
            throttle {
                after_count => 2
                key => "%{content}"
                add_tag => "throttled"
            }
        }
    }
}

That doesn't look like a date format that ES understands.
Why not use a date filter to convert it?
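The general shape looks like this (a minimal sketch, assuming the grokked field is called date as in your filter; the match pattern still has to be chosen to fit the actual timestamp):

```
filter {
    date {
        # "ISO8601" is a built-in shortcut; swap in an explicit
        # Joda-style pattern if the timestamp doesn't parse with it.
        match => [ "date", "ISO8601" ]
    }
}
```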

Thanks, but I don't know how to do that. Would you please show me an example?
Best regards.

The docs are very helpful - https://www.elastic.co/guide/en/logstash/current/plugins-filters-date.html - but let us know if you still have problems!

Hmm, strange. The date format is ISO 8601, but Logstash fails to parse it.

Here is my new config:

input {
    file {
        type => "rtds"
        path => "/var/log/rtds/rtds.log"
        tags => [ "rtds_log", "vertx" ]
        codec => multiline {
            pattern => "^%{TIMESTAMP_ISO8601}"
            negate => true
            what => "previous"
        }
    }
}

filter {
    if [type] == "rtds" {
        grok {
            match => { "message" => 
                "%{TIMESTAMP_ISO8601:date} \[(?<thread_name>.+?)\] (?<log_level>\w+)\s+(?<verticle>.+?)  -\s*(?<content>.*)"
            }
        }

        date {
            match => [ "date", "ISO8601" ]
            target => "date"
        }

        if [log_level] == "ERROR" or "Exception" in [message] {
            throttle {
                after_count => 2
                key => "%{content}"
                add_tag => "throttled"
            }
        }
    }
}

But the rubydebug shows:

Failed parsing date from field {:field=>"date", :value=>"2015-06-03 10:57:16,072", :exception=>java.lang.IllegalArgumentException: Invalid format: "2015-06-03 10:57:16,072" is malformed at ",072", :level=>:warn}
{
     "@timestamp" => "2015-06-03T02:57:17.442Z",
        "message" => "2015-06-03 10:57:16,072 [vert.x-eventloop-thread-0] INFO  hawkeyes.rtds.monitor.MetricsVerticle  - Metrics of RTDS:\n{DB Worker={max=4223, min=1000, total=4223, count=1, avg=4223}, Realtime Cacher={max=4518, min=1000, total=4518, count=1, avg=4518}}",
       "@version" => "1",
           "tags" => [
        [0] "multiline",
        [1] "rtds_log",
        [2] "vertx"
    ],
           "type" => "rtds",
           "host" => "hawkeyesTest",
           "path" => "/var/log/rtds/rtds.log",
           "date" => "2015-06-03 10:57:16,072",
    "thread_name" => "vert.x-eventloop-thread-0",
      "log_level" => "INFO",
       "verticle" => "hawkeyes.rtds.monitor.MetricsVerticle",
        "content" => "Metrics of RTDS:\n{DB Worker={max=4223, min=1000, total=4223, count=1, avg=4223}, Realtime Cacher={max=4518, min=1000, total=4518, count=1, avg=4518}}"
}
Failed parsing date from field {:field=>"date", :value=>"2015-06-03 10:57:21,071", :exception=>java.lang.IllegalArgumentException: Invalid format: "2015-06-03 10:57:21,071" is malformed at ",071", :level=>:warn}
{
     "@timestamp" => "2015-06-03T02:57:21.469Z",
        "message" => "2015-06-03 10:57:21,071 [vert.x-eventloop-thread-0] INFO  hawkeyes.rtds.monitor.MetricsVerticle  - Metrics of RTDS:\n{DB Worker={max=4223, min=1000, total=4223, count=1, avg=4223}, Realtime Cacher={max=4518, min=1000, total=4518, count=1, avg=4518}}",
       "@version" => "1",
           "tags" => [
        [0] "multiline",
        [1] "rtds_log",
        [2] "vertx"
    ],
           "type" => "rtds",
           "host" => "hawkeyesTest",
           "path" => "/var/log/rtds/rtds.log",
           "date" => "2015-06-03 10:57:21,071",
    "thread_name" => "vert.x-eventloop-thread-0",
      "log_level" => "INFO",
       "verticle" => "hawkeyes.rtds.monitor.MetricsVerticle",
        "content" => "Metrics of RTDS:\n{DB Worker={max=4223, min=1000, total=4223, count=1, avg=4223}, Realtime Cacher={max=4518, min=1000, total=4518, count=1, avg=4518}}"
}
Failed parsing date from field {:field=>"date", :value=>"2015-06-03 10:57:26,071", :exception=>java.lang.IllegalArgumentException: Invalid format: "2015-06-03 10:57:26,071" is malformed at ",071", :level=>:warn}
{
     "@timestamp" => "2015-06-03T02:57:26.486Z",
        "message" => "2015-06-03 10:57:26,071 [vert.x-eventloop-thread-0] INFO  hawkeyes.rtds.monitor.MetricsVerticle  - Metrics of RTDS:\n{DB Worker={max=4223, min=1000, total=4223, count=1, avg=4223}, Realtime Cacher={max=4518, min=1000, total=4518, count=1, avg=4518}}",
       "@version" => "1",
           "tags" => [
        [0] "multiline",
        [1] "rtds_log",
        [2] "vertx"
    ],
           "type" => "rtds",
           "host" => "hawkeyesTest",
           "path" => "/var/log/rtds/rtds.log",
           "date" => "2015-06-03 10:57:26,071",
    "thread_name" => "vert.x-eventloop-thread-0",
      "log_level" => "INFO",
       "verticle" => "hawkeyes.rtds.monitor.MetricsVerticle",
        "content" => "Metrics of RTDS:\n{DB Worker={max=4223, min=1000, total=4223, count=1, avg=4223}, Realtime Cacher={max=4518, min=1000, total=4518, count=1, avg=4518}}"
}
Failed parsing date from field {:field=>"date", :value=>"2015-06-03 10:57:31,072", :exception=>java.lang.IllegalArgumentException: Invalid format: "2015-06-03 10:57:31,072" is malformed at ",072", :level=>:warn}

This log file is written by log4j.
And here is log4j.properties:

log4j.rootLogger=DEBUG, Console, R

log4j.appender.Console=org.apache.log4j.ConsoleAppender
log4j.appender.Console.layout=org.apache.log4j.PatternLayout
log4j.appender.Console.layout.ConversionPattern=%d [%t] %-5p %c %x - %m%n

log4j.appender.R=org.apache.log4j.RollingFileAppender
log4j.appender.R.File=rtds.log
log4j.appender.R.MaxFileSize=10000KB
log4j.appender.R.MaxBackupIndex=10
log4j.appender.R.layout=org.apache.log4j.PatternLayout
log4j.appender.R.layout.ConversionPattern=%d [%t] %-5p %c %x - %m%n

log4j.logger.org.vertx.level=INFO
log4j.logger.com.hazelcast.level=WARN
log4j.logger.io.netty.util.internal.PlatformDependent.level=WARN
log4j.logger.com.zaxxer=WARN
log4j.logger.org.mongodb=WARN

I found that %d in the log4j layout.ConversionPattern means ISO8601, which log4j renders as yyyy-MM-dd HH:mm:ss,SSS.

The date filter's ISO8601 format doesn't like the comma that separates seconds and milliseconds. It needs to be a period there. If you can't or don't want to change your logging format, use a custom ISO8601 date pattern. Let yourself be inspired by what Logstash uses when you say ISO8601:

This limitation has been reported here:

Ah. But Elasticsearch is a Java program, and it still fails to parse this date format. Why is Elasticsearch interested in the "date" field at all? It seems to treat "date" as a date rather than as a string.

I don't understand this question. We suggest that you use the date filter to parse your date field into the @timestamp field. Elasticsearch will happily accept the format of the latter field and if you use Kibana everything will work fine there too.

Final filter config:

        date {
            match => [ "date", "yyyy-MM-dd HH:mm:ss,SSS" ]
            target => "date"
            timezone => "PRC"
        }

It works for me. target must be set to "date"; otherwise the raw string stays in the field and Elasticsearch raises the errors shown in the first post.
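As a sanity check on the timezone setting, the local-to-UTC shift the date filter performs can be reproduced directly. A Python sketch, using a fixed UTC+8 offset since PRC (China Standard Time) has no DST:

```python
from datetime import datetime, timedelta, timezone

# China Standard Time (what timezone => "PRC" resolves to): UTC+8, no DST.
cst = timezone(timedelta(hours=8))

# Parse the log4j timestamp and mark it as local CST time.
local = datetime.strptime("2015-06-03 10:57:16,072",
                          "%Y-%m-%d %H:%M:%S,%f").replace(tzinfo=cst)

# Logstash stores timestamps in UTC, so 10:57 local becomes 02:57 UTC,
# matching the @timestamp values in the rubydebug output above.
utc = local.astimezone(timezone.utc)
print(utc.isoformat())  # 2015-06-03T02:57:16.072000+00:00
```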