Hey there!
I am trying to ship logs from Docker to Logstash via the gelf input plugin:
input { gelf {} }
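For context, the containers ship their logs with Docker's gelf log driver, started roughly like this (the gelf-address below is just a placeholder for wherever Logstash is listening in my local setup):

docker run \
    --log-driver=gelf \
    --log-opt gelf-address=udp://<logstash-host>:12201 \
    --log-opt tag=json \
    test/container-name:latest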
This is what I am getting in my logs:
{:timestamp=>"2016-06-02T09:14:00.434000+0000", :message=>"Failed parsing date from field", :field=>"@timestamp", :value=>"2016-06-02T09:14:00.000Z", :exception=>"cannot convert instance of class org.jruby.RubyObject to class java.lang.String", :config_parsers=>"yyyy-MM-dd HH:mm:ss,SSS", :config_locale=>"default=en_US", :level=>:warn}
{
    "version" => "1.1",
    "host" => "default",
    "level" => 6,
    "@version" => "1",
    "@timestamp" => "2016-06-02T09:14:00.000Z",
    "source_host" => "192.168.99.100",
    "message" => "{\"@name\": \"test\", \"@timestamp\": \"2016-06-02 09:14:00,000\", \"@levelname\": \"INFO\", \"@source_host\": \"2c3ccf23e729\", \"duration\": 0.37261}",
    "command" => "bin/app",
    "container_id" => "2c3ccf23e729ab7ebc7e586e3068dd2eeccac3f1250cda012ddd601249891a45",
    "container_name" => "localdev_test_1",
    "created" => "2016-06-02T07:41:57.018434389Z",
    "image_id" => "sha256:db3aab57f09f0e64f39895232afc31f266865edb95521cd38a3cc0c068e71311",
    "image_name" => "test/container-name:latest",
    "tag" => "json",
    "tags" => [
        [0] "_dateparsefailure"
    ]
}
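For readability, this is the JSON the application itself writes, i.e. the content of the message field above:

{
    "@name": "test",
    "@timestamp": "2016-06-02 09:14:00,000",
    "@levelname": "INFO",
    "@source_host": "2c3ccf23e729",
    "duration": 0.37261
}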
I have tried adding a custom date filter:
date {
    match => [
        "@timestamp",
        "ISO8601",
        "yyyy-MM-dd HH:mm:ss.SSS",
        "yyyy-MM-dd' 'HH:mm:ss.SSS",
        "yyyy-MM-dd'T'HH:mm:ss.SSSZ",
        "yyyy-MM-dd'T'HH:mm:ss.SSSSSS'+0000'"
    ]
    timezone => 'UTC'
}
But that does not change the errors I am getting. Curiously, the warning still reports :config_parsers=>"yyyy-MM-dd HH:mm:ss,SSS"; I would expect to see my own patterns there instead.
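For reference, the whole pipeline I am testing with looks roughly like this (the stdout output with the rubydebug codec is only there so I can see the events shown above):

input {
    gelf {}
}

filter {
    date {
        # the match patterns and timezone from the snippet above
    }
}

output {
    stdout { codec => rubydebug }
}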
Any ideas? Because I am at a complete loss here...
Version information:
- logstash (2.0.0)
- logstash-input-gelf (2.0.2)
- logstash-filter-date (2.0.2)