Hi,
I am learning Logstash in order to index my log files into Elasticsearch. For now I am testing with this config file:
input {
  file {
    type => "accounting"
    path => ["/root/logstash-tests/mini-test/accounting.txt"]
    start_position => "beginning"
  }
}

filter {
  if [type] == "accounting" {
    grok {
      match => { "message" => "%{DATA:qname}:%{DATA:exechost}:%{DATA:group}:%{DATA:owner}:%{DATA:job_name}:%{INT:jobid}:%{DATA:account}:%{INT:priority}:%{INT:submission_time}:%{INT:start_time}:%{INT:end_time}:%{INT:failed}:%{INT:exit_status}$" }
    }
    date {
      match => [ "submission_time", "UNIX_MS" ]
      target => "@timestamp"
    }
  }
}

output {
  stdout { codec => rubydebug }
}
which gives me this output:
[root@logs-test mini-test]# /opt/logstash/bin/logstash agent -f test.conf
Logstash startup completed
{
            "message" => "gpu.q:pgi02.mydomain.com:group22:user22:arrayjob.sh:3511939:sge:0:1428589483565:1429783559251:1429785864500:0:0",
           "@version" => "1",
         "@timestamp" => "2015-04-09T14:24:43.565Z",
               "type" => "accounting",
               "host" => "logs-test",
               "path" => "/root/logstash-tests/mini-test/accounting.txt",
              "qname" => "gpu.q",
           "exechost" => "pgi02.mydomain.com",
              "group" => "group22",
              "owner" => "user22",
           "job_name" => "arrayjob.sh",
              "jobid" => "3511939",
            "account" => "sge",
           "priority" => "0",
    "submission_time" => "1428589483565",
         "start_time" => "1429783559251",
           "end_time" => "1429785864500",
             "failed" => "0",
        "exit_status" => "0"
}
The date filter correctly converts the submission_time field into a date and stores it in @timestamp, but I have two other fields in UNIX_MS format (start_time and end_time) that I would also like to become date type fields when the events are inserted into Elasticsearch. How should I modify my config file so that start_time and end_time also end up as date fields in Elasticsearch? Could you point me to an example?
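From my reading of the date filter docs, I suspect the answer involves its target option (which defaults to @timestamp), adding one extra date filter per field. Something like this untested sketch, inside the same conditional:

# hypothetical addition, per my reading of the docs: reuse the date
# filter, but point target at the field itself instead of @timestamp
date {
  match => [ "start_time", "UNIX_MS" ]
  target => "start_time"
}
date {
  match => [ "end_time", "UNIX_MS" ]
  target => "end_time"
}

Would that overwrite the string fields with proper date values so Elasticsearch maps them as dates, or do I also need to change something in the index mapping?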
Thanks in advance for your help.