Could not index event to Elasticsearch, failed to parse [timestamp]

Good evening,

I'm pushing JSON content from Filebeat to Logstash and I'm receiving a 'failed to parse [timestamp]' error. I don't understand what is wrong but, to be fair, I'm pretty green with this technology.

Sample Log Message (At the Producer):
{"@version":"1","fields":{"environment":"development","appname":"someApp"},"trace_id":"4edbb090","source":"/var/log/someApp-json.log","message":"Packet dump:\n*** Sending to 172.160.100.1 port 49068 ....\n\nPacket length = 20\n05 28 00 14 d6 f1 a3 52 32 40 5a 2d e9 d3 f5 9b\na0 10 1c b5\nCode: Accounting-Response\nIdentifier: 40\nAuthentic: <214><241><163>R2@Z-<233><211><245><155><160><16><28><181>\nAttributes:\n","time":1580159784,"timestamp":"Mon Jan 27 21:16:24 2020","host":"server1.net","priority":"DEBUG","@timestamp":"2020-01-27T21:16:26.623Z","beat":{"type":"filebeat","version":"6.2.4","hostname":"server1.net","ip_address":"204.246.14.249","name":"server1.net"},"prospector":{"type":"log"},"offset":17641557,"tags":["someApp-dev01","json-log","beats_input_codec_plain_applied"]}

Sample Error Message:
[2020-01-27T16:18:28,192][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>[ "index" , {:_id=>nil, :_index=> "someApp-2020.01.27" , :_type=> "doc" , :_routing=>nil}, #<LogStash::Event:0x24a7a1e5>], :response=>{"index"=>{"_index"=>"someApp-2020.01.27", "_type"=>"doc", "_id"=>"4hzM528BhwJu0nMGDxHt", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [timestamp]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"Mon Jan 27 16:18:24 2020\""}}}}}

I did some research and found that JSON documents don't support 'date' as a type, but I'm still at a loss. I figured it may simply be that I didn't format 'timestamp' properly, but I've tried so many things and burned a lot of time on something simple I may be overlooking.
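For reference (a sketch, not from the thread; Python's strptime directives stand in for the Joda-style pattern used by the Logstash date filter), the raw timestamp string does parse cleanly once an explicit format is supplied, which is why Elasticsearch's default date formats reject it without one:

```python
from datetime import datetime

# Hypothetical check: "%a %b %d %H:%M:%S %Y" is the strptime equivalent of
# the Logstash date-filter pattern "EEE MMM dd HH:mm:ss YYYY".
raw = "Mon Jan 27 21:16:24 2020"  # the "timestamp" field from the sample event
parsed = datetime.strptime(raw, "%a %b %d %H:%M:%S %Y")
print(parsed.isoformat())  # 2020-01-27T21:16:24
```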

Filter

filter {
  if [data_log_type] == "na-radiator-dev01" {
    date {
      # Tue Jan 28 14:48:54 2020
      locale => "en"
      match => [ "timestamp", "EEE MMM dd HH:mm:ss YYYY" ]
      target => "timestamp"
    }
    useragent {
      source => "user_agent"
      target => "user_agent_parsed"
    }
  }
}

Template

{
  "template":"na-radiator-template",
  "index_patterns": ["na-radiator-*"],
  "settings":{
    "index.mapper.dynamic":false,
    "index.number_of_replicas":1,
    "index.refresh_interval":"5s",
    "index.number_of_shards":1
  },
  "mappings":{
    "doc":{
      "_all":{
        "enabled":false
      },
      "properties":{
        "data_log_type":{
          "type":"keyword"
        },
        "type":{
          "type":"keyword"
        },
        "date":{
          "type":"date",
          "locale":"en",
          "format":"EEE MMM dd HH:mm:ss YYYY"
        }
      }
    }
  }
}

Your template specifies a format for date, but not for timestamp.

Your date filter does not appear to have parsed the timestamp field, since it is still in the original format, and does not look like a LogStash::Timestamp.
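To make the relationship concrete (a sketch in Python, not from the thread): a successful date filter turns the string into a LogStash::Timestamp, which serializes as ISO 8601 in UTC when the event is indexed, and that form is what Elasticsearch's default date mapping expects:

```python
from datetime import datetime, timezone

raw = "Mon Jan 27 16:18:24 2020"  # the value from the error message
# Parse with the custom pattern, then render the ISO-8601 form a
# LogStash::Timestamp would produce (assuming the source time is UTC).
dt = datetime.strptime(raw, "%a %b %d %H:%M:%S %Y").replace(tzinfo=timezone.utc)
print(dt.strftime("%Y-%m-%dT%H:%M:%S.%f")[:-3] + "Z")  # 2020-01-27T16:18:24.000Z
```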

Thanks for the reply, Badger.

You were correct, I needed to define the field correctly in the template. I wasn't quite understanding the relationship, but now I do.

Filter

filter {
  # "tags" is an array, so test membership rather than equality
  if "someApp" in [tags] {
    date {
      match => [ "timestamp", "EEE MMM dd HH:mm:ss YYYY" ]
      target => "timestamp"
    }
    useragent {
      source => "user_agent"
      target => "user_agent_parsed"
    }
  }
}

Template

{
  "template":"someApp-template",
  "index_patterns": ["someApp-*"],
  "settings":{
    "index.mapper.dynamic":false,
    "index.number_of_replicas":1,
    "index.refresh_interval":"5s",
    "index.number_of_shards":1
  },
  "mappings": {
    "doc": {
      "_all": {
        "enabled":false
      },
      "properties": {
        "data_log_type":{
          "type":"keyword"
        },
        "type": {
          "type":"keyword"
        },
        "timestamp": {
          "type":"date",
          "format": "EEE MMM dd HH:mm:ss YYYY"
        }
      }
    }
  }
}
