Hello,
I'm trying to parse a date field with the date filter so I can use it as the "Time Filter field name" when creating an index pattern in Kibana.
I already have a "date" field in a JSON-format log line, like this:
{"project":"sth","date":"2018-12-07 17:12:54,332","level":"INFO ","etc.":"etc."}
Then I tried to parse "date" from the shell:
echo 'LOG ABOVE' | bin/logstash -e 'input { stdin { codec => json } } filter { date { match => [ "log_time", "ISO8601", "YYYY-MM-dd HH:mm:ss,SSS" ] target => "logdate" } }'
The result was:
[2018-12-07T17:36:23,959][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-12-07T17:36:24,567][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.3.2"}
[2018-12-07T17:36:27,521][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-12-07T17:36:27,623][INFO ][logstash.inputs.stdin ] Automatically switching from json to json_lines codec {:plugin=>"stdin"}
[2018-12-07T17:36:27,673][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x262ccbba run>"}
[2018-12-07T17:36:27,749][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>}
[2018-12-07T17:36:28,286][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
{
"thread" => "some string",
"project" => "some string",
"class" => "some string",
"date" => "2018-12-07 17:12:54,332",
"host" => "some string",
"requestId" => "some string",
"message" => "some string",
"level" => "INFO ",
"@version" => "1",
"@timestamp" => 2018-12-07T09:36:27.786Z
}
[2018-12-07T17:36:28,532][INFO ][logstash.pipeline ] Pipeline has terminated {:pipeline_id=>"main", :thread=>"#<Thread:0x262ccbba run>"}
which is not what I expected.
The filter should match "2018-12-07 17:12:54,332", convert it to a date type instead of a string, and add a new field named "logdate".
Is there anything wrong?
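For reference, this is roughly the filter I assumed should produce "logdate". One thing I'm unsure about: my command above matches a field called "log_time", but the event only contains "date" — so both the field name and the pattern below are my guesses at what should be correct:

```
filter {
  date {
    # "date" is the field actually present in the event above;
    # the pattern mirrors "2018-12-07 17:12:54,332" (Joda-style,
    # comma before the milliseconds). Both are my assumptions.
    match  => [ "date", "yyyy-MM-dd HH:mm:ss,SSS" ]
    target => "logdate"
  }
}
```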