_dateparsefailure when shipping logs from Docker via GELF

Hey there!

I am trying to ship logs from Docker to logstash via the gelf input plugin:

input { gelf {} }
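
(That is just the defaults; spelled out, and assuming the default GELF UDP port 12201 that the Docker gelf log driver is pointed at, it is equivalent to something like this:)

input {
  gelf {
    host => "0.0.0.0"   # listen on all interfaces (the default)
    port => 12201       # default GELF UDP port, i.e. what the Docker gelf-address log-opt targets
  }
}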

This is what I am getting in my logs:

{:timestamp=>"2016-06-02T09:14:00.434000+0000", :message=>"Failed parsing date from field", :field=>"@timestamp", :value=>"2016-06-02T09:14:00.000Z", :exception=>"cannot convert instance of class org.jruby.RubyObject to class java.lang.String", :config_parsers=>"yyyy-MM-dd HH:mm:ss,SSS", :config_locale=>"default=en_US", :level=>:warn}
{
           "version" => "1.1",
              "host" => "default",
             "level" => 6,
          "@version" => "1",
        "@timestamp" => "2016-06-02T09:14:00.000Z",
       "source_host" => "192.168.99.100",
           "message" => "{\"@name\": \"test\", \"@timestamp\": \"2016-06-02 09:14:00,000\", \"@levelname\": \"INFO\", \"@source_host\": \"2c3ccf23e729\", \"duration\": 0.37261}",
           "command" => "bin/app",
      "container_id" => "2c3ccf23e729ab7ebc7e586e3068dd2eeccac3f1250cda012ddd601249891a45",
    "container_name" => "localdev_test_1",
           "created" => "2016-06-02T07:41:57.018434389Z",
          "image_id" => "sha256:db3aab57f09f0e64f39895232afc31f266865edb95521cd38a3cc0c068e71311",
        "image_name" => "test/container-name:latest",
               "tag" => "json",
              "tags" => [
        [0] "_dateparsefailure"
    ]
}

I have tried adding a custom date filter:

date {
  match => ["@timestamp",
            "ISO8601",
            "yyyy-MM-dd HH:mm:ss.SSS",
            "yyyy-MM-dd' 'HH:mm:ss.SSS",
            "yyyy-MM-dd'T'HH:mm:ss.SSSZ",
            "yyyy-MM-dd'T'HH:mm:ss.SSSSSS'+0000'"]
  timezone => 'UTC'
}

But that does not change the errors I am getting. Curiously, the warning still shows :config_parsers=>"yyyy-MM-dd HH:mm:ss,SSS"; I would expect to see my own patterns there as well.

Any ideas? Because I am at a complete loss here...

Version information

  • logstash (2.0.0)
  • logstash-input-gelf (2.0.2)
  • logstash-filter-date (2.0.2)

Hi,
I tried your example; you can see my results here:

Pipeline main started
2016-06-02T09:14:00.434000+0000
{
    "@timestamp" => 2016-06-02T09:14:00.434Z,
      "@version" => "1",
          "host" => "skywalker",
       "message" => "2016-06-02T09:14:00.434000+0000"
}
foo
Failed parsing date from field {:field=>"message", :value=>"foo", :exception=>"Invalid format: \"foo\"", :config_parsers=>"ISO8601,yyyy-MM-dd HH:mm:ss.SSS,yyyy-MM-dd' 'HH:mm:ss.SSS,yyyy-MM-dd'T'HH:mm:ss.SSSZ,yyyy-MM-dd'T'HH:mm:ss.SSSSSS'+0000'", :config_locale=>"default=en_US", :level=>:warn}
{
    "@timestamp" => 2016-06-02T12:41:33.413Z,
      "@version" => "1",
          "host" => "skywalker",
       "message" => "foo",
          "tags" => [
        [0] "_dateparsefailure"
    ]
}
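
For reference, the pipeline behind that run is essentially a minimal stdin-to-stdout setup with your patterns applied to the message field, roughly:

input { stdin {} }

filter {
  date {
    match => ["message",
              "ISO8601",
              "yyyy-MM-dd HH:mm:ss.SSS",
              "yyyy-MM-dd' 'HH:mm:ss.SSS",
              "yyyy-MM-dd'T'HH:mm:ss.SSSZ",
              "yyyy-MM-dd'T'HH:mm:ss.SSSSSS'+0000'"]
    timezone => 'UTC'
  }
}

output { stdout { codec => rubydebug } }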

I think what happens here is that you're using @timestamp as the field the date filter reads its value from. @timestamp is a reserved field in Logstash, so I suggest you use another name for the incoming timestamp. As you can see from my example above, parsing works just as expected with your rules, so the issue is not related to your expressions.
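
A rough sketch of what I mean (the field name app_timestamp is just a placeholder, use whatever your application actually emits; the pattern matches the comma-separated milliseconds I see in your message field):

filter {
  date {
    # read from a non-reserved field instead of @timestamp;
    # app_timestamp is a placeholder name for illustration
    match    => ["app_timestamp", "yyyy-MM-dd HH:mm:ss,SSS"]
    target   => "@timestamp"   # still write the parsed result into @timestamp
    timezone => 'UTC'
  }
}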

hope it helps,

-purbon