Adding a date{} config makes Logstash stop working

I added a date{} block inside filter {}, but then the logs are no longer passed to Elasticsearch (no new logs appear in Kibana). The logs come back if I remove the date config.
This is the date config I added:

date {
     match => ["ts" , "UNIX"]
     target => "ts"
}

The 'ts' field is an integer representing a Unix timestamp.
This is the error log:

[WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-vm-metrics-2020.12.21", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x7179472d>], :response=>{"index"=>{"_index"=>"logstash-vm-metrics-2020.12.21", "_type"=>"doc", "_id"=>"SThbhHYBTmkvgweLvH9p", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [ts] of type [long] in document with id 'SThbhHYBTmkvgweLvH9p'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"For input string: \"2020-12-21T08:12:48.000Z\""}}}}}

There is no error when outputting to the console; it only fails when indexing to ES.
I have no clue why it failed.

So the date filter successfully parsed the value of [ts] and replaced it with "2020-12-21T08:12:48.000Z". However, because you previously indexed some documents without the date filter, elasticsearch expects [ts] to be a 'long'. In elasticsearch a field has to have the same type on every document: it cannot be a long on some documents and a date on others.

You could change the name of the target field in the date filter, or create a new index, in which case elasticsearch will auto-detect that the field is a date.
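As a sketch of the rename option: pointing target at a different field leaves the original [ts] long untouched, so it no longer conflicts with the existing mapping (the field name ts_date below is just an illustrative choice, not from your config):

date {
     match => ["ts", "UNIX"]
     target => "ts_date"
}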

I have deleted the index and let the logs be imported again, but the 'ts' field is still treated as a number rather than a date (only the year 2020 is parsed), and I cannot change it in Kibana.

Do you have a template?

What do you mean by template?

I am referring to index templates.
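An index template lets you pin the mapping of [ts] to date so it does not depend on auto-detection when the index is recreated. A rough sketch, assuming a 6.x-style legacy template to match the "doc" type in your error log (the template name and index pattern below are illustrative, not from the thread):

PUT _template/logstash-vm-metrics
{
  "index_patterns": ["logstash-vm-metrics-*"],
  "mappings": {
    "doc": {
      "properties": {
        "ts": { "type": "date" }
      }
    }
  }
}

The template only applies to indices created after it is installed, so the index would need to be deleted and the logs reimported once more.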

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.