I can't replace Logstash's @timestamp with the timestamp from a CSV log file and send it to Elasticsearch.
In the debugger everything works.
When I change the output to Elasticsearch it reports that it works, but after I create the index pattern in Kibana I don't see anything in Discover.
If I delete the date filter everything works correctly, but I need to change the timestamp.
Elastic Stack version: 7.10.1
Java: OpenJDK 14 2020-03-17
Logstash configuration file:
input {
  file {
    path => "C:/Users/vladi/Desktop/Elastic_Stack/*Z.csv"
    start_position => "beginning"
    sincedb_path => "NULL"
  }
}
filter {
  csv {
    separator => ","
    skip_header => "true"
    columns => [ "timestamp", "number" ]
    remove_field => [ "path", "host", "@version" ]
  }
  mutate {
    convert => { "number" => "integer" }
  }
  date {
    match => [ "timestamp", "YYYY-MM-dd'T'HH'_'mm'_'ss.SSS'Z'" ]
    timezone => "UTC"
    add_field => { "debug" => "timestampMatched" }
    remove_field => [ "timestamp" ]
  }
}
output {
  stdout { codec => rubydebug }
}
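When I switch the output to Elasticsearch, the output block looks roughly like this (the host and index name here are placeholders, not my exact values):

```
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "csv-logs"
  }
}
```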
I get:
{
      "number" => 1232,
     "message" => "2020-12-10T09_24_13.986Z,1232\r",
  "@timestamp" => 2020-12-10T09:24:13.986Z,
       "debug" => "timestampMatched"
}
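As a sanity check outside Logstash, the CSV timestamp string does parse with an equivalent pattern, for example in Python (my own quick sketch, not part of the pipeline; note the underscores instead of colons between the time parts, matching the Logstash pattern `YYYY-MM-dd'T'HH'_'mm'_'ss.SSS'Z'`):

```python
from datetime import datetime

# Parse the CSV timestamp with a strptime pattern equivalent to the
# Logstash date filter pattern; %z accepts the trailing "Z" as UTC.
raw = "2020-12-10T09_24_13.986Z"
ts = datetime.strptime(raw, "%Y-%m-%dT%H_%M_%S.%f%z")
print(ts)  # 2020-12-10 09:24:13.986000+00:00
```

So the pattern itself seems correct, which matches what I see in the rubydebug output above.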
When I change the output to Elasticsearch, after creating the index pattern and opening Discover I get an error:
How can I solve it?