Works fine here (see below). Are you sure your filters are being run? Your grok filter should either a) succeed and produce both a sys_timestamp and a syslog_data field, or b) fail and tag the event with _grokparsefailure. Right now it appears to produce only a sys_timestamp field, and that doesn't make sense.
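To verify which of the two cases you're hitting, you can branch on the failure tag that grok adds. This is a minimal sketch, not part of your original config; the conditional output below just dumps failed events so you can inspect them:

```
filter {
  grok {
    match => [
      "message",
      "%{TIMESTAMP_ISO8601:sys_timestamp} %{GREEDYDATA:syslog_data}"
    ]
  }
}
output {
  # Events that failed the grok match carry the _grokparsefailure tag
  if "_grokparsefailure" in [tags] {
    stdout { codec => rubydebug }
  }
}
```

If nothing shows up in either branch, the filter block isn't being applied to your events at all.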
$ cat test.config
input { stdin { } }
output { stdout { codec => rubydebug } }
filter {
  grok {
    match => [
      "message",
      "%{TIMESTAMP_ISO8601:sys_timestamp} %{GREEDYDATA:syslog_data}"
    ]
  }
  date {
    match => [ "sys_timestamp", "ISO8601" ]
    timezone => "Europe/Rome"
  }
}
$ echo '2017-10-26T14:37:06.540286+02:00 some-data' | /opt/logstash/bin/logstash -f test.config
Settings: Default pipeline workers: 8
Pipeline main started
{
          "message" => "2017-10-26T14:37:06.540286+02:00 some-data",
         "@version" => "1",
       "@timestamp" => "2017-10-26T12:37:06.540Z",
             "host" => "lnxolofon",
    "sys_timestamp" => "2017-10-26T14:37:06.540286+02:00",
      "syslog_data" => "some-data"
}
Pipeline main has been shutdown
stopping pipeline {:id=>"main"}