I am having trouble trying to get this timestamp to match, any ideas?
[20200609 084347]
That looks like "[yyyyMMdd HHmmss]" to me.
(Just a side note: in general, your chances of getting help increase if "I am having trouble" is followed by "I have tried X and Y, but they did not work," so people have more details and know that you are putting in effort.)
It doesn't seem to work. I get:
Jun 09 11:53:50 mon-01 logstash[24891]: [2020-06-09T11:53:50,421][WARN ][logstash.outputs.elasticsearch][main][49e1af4fd928e2ce9f2190100e0f6d6671d7d8d7be6cdbf3f87f31b0add83914] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"var_log_2020.06.09", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0x5ece5487>], :response=>{"index"=>{"_index"=>"var_log_2020.06.09", "_type"=>"_doc", "_id"=>"9uKAmHIBkgu1dtKtFSNM", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [log_timestamp] of type [date] in document with id '9uKAmHIBkgu1dtKtFSNM'. Preview of field's value: '20200609 115349'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [20200609 115349] with format [strict_date_optional_time||epoch_millis]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"Failed to parse with all enclosed parsers"}}}}}}
What does your Logstash config look like? Did you try to process this field with a date filter with the correct pattern? If you send it to ES with its original format, it makes sense that ES rejects the event because [strict_date_optional_time||epoch_millis] is expected.
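For example, a date filter along these lines should do it (just a sketch; the field name log_timestamp and the exact layout are assumptions based on your log sample):

date {
  match => [ "log_timestamp", "yyyyMMdd HHmmss" ]
  target => "@timestamp"
}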
So from Logstash I am of course sending the data to Elasticsearch. Here is my grok filter pattern:
patterns/common
DATESTAMP_LOG %{YEAR}%{MONTHNUM}%{MONTHDAY} %{HOUR}%{MINUTE}%{SECOND}
conf.d/15-app-filters
match => { "message" => "\[%{DATESTAMP_LOG:log_timestamp}\] %{LOGLEVEL:loglevel} %{WORD:thread_name} %{NUMBER:application_runtime_ms}ms \(*%{WORD:user_name}\)* - %{GREEDYDATA:message}" }
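Put together, the grok stage looks roughly like this (a sketch; the patterns_dir path and the surrounding filter block are assumptions, since only the match line is shown above):

filter {
  grok {
    # assumed location of the custom pattern file that defines DATESTAMP_LOG
    patterns_dir => ["/etc/logstash/patterns"]
    match => { "message" => "\[%{DATESTAMP_LOG:log_timestamp}\] %{LOGLEVEL:loglevel} %{WORD:thread_name} %{NUMBER:application_runtime_ms}ms \(*%{WORD:user_name}\)* - %{GREEDYDATA:message}" }
    # the pattern captures back into "message", so allow grok to overwrite it
    overwrite => [ "message" ]
  }
}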
As I said: you need to apply the date filter.
Oh sorry, my bad, I forgot to paste this:
date {
  match => [ "log_timestamp", "yyyyMMdd HHmmss" ]
  target => "@timestamp"
  timezone => "Europe/Berlin"
  add_field => { "debug" => "timestampMatched" }
}
You will also need to add that format to your Elasticsearch mapping to address this error.
This same format?
yyyyMMdd HHmmss
I added this:
"log_timestamp": {
"type": "date",
"index": true,
"format": "yyMMdd HHmmss",
"ignore_malformed": false,
"doc_values": true,
"store": false
},
With this configuration @timestamp should contain the parsed date, but log_timestamp still contains the original format. ES won't accept the field unless it uses one of the formats that have been configured in the mapping.
Edit: Ah, okay. You adjusted the mapping.
Edit 2: But you forgot two 'y's.
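For reference, the corrected mapping entry, with the four-digit year matching the date filter pattern, would look like this (same settings as above, only the format string changes):

"log_timestamp": {
  "type": "date",
  "index": true,
  "format": "yyyyMMdd HHmmss",
  "ignore_malformed": false,
  "doc_values": true,
  "store": false
},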
Oh yes, thank you Jenni.