By default, the date filter stores the parsed result in the @timestamp field. If you want it stored elsewhere, set the filter's target parameter.
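For example, something along these lines (the field name and date format here are only placeholders for whatever your data actually contains):

filter {
  date {
    # Parse a hypothetical string field and store the result in its own field
    # instead of overwriting @timestamp.
    match  => ["some_date_field", "yyyy-MM-dd HH:mm:ss"]
    target => "some_date_field_parsed"
  }
}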
As suggested, I have added another date filter with the target parameter to the logstash.conf file.
If "ReleaseDate" is null, those logs are not indexed. Because of that, only a few records are indexed and the remaining records are not, but I want all of the records to be indexed.
If certain messages are dropped, I doubt it's related to a "null" ReleaseDate field (do you mean a missing field?), but you can use a conditional to apply the date filter only when the field is set:
if [ReleaseDate] {
  date {
    ...
  }
}
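In full, that might look something like this (the format string is only a guess at what your ReleaseDate values actually look like):

if [ReleaseDate] {
  date {
    # Parse ReleaseDate only when the field exists, and keep the result
    # in a separate field so @timestamp is untouched.
    match  => ["ReleaseDate", "yyyy-MM-dd HH:mm:ss"]
    target => "release_date"
  }
}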
If you still have problems please provide a complete example that exhibits your problem.
I have a very similar question about having multiple dates in my data.
I have a Start_Date_Time and an End_Date_Time. If I use the date plugin to push both of them into @timestamp, and use the target option to store them as date/times in the data instead of strings (thanks for that tip, since it was what I was looking for), which @timestamp goes into Elasticsearch, or do they both go in as two different @timestamps? I'm assuming the last one in the config would be the one stored, since later filters typically overwrite earlier ones; however, I want to be sure before changing the Logstash files I've inherited, since this could really muck up my data if I get it wrong. Thanks in advance!
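For reference, here's a rough sketch of what I mean; the formats and target names are just placeholders, not my actual config:

filter {
  date {
    # Parse the start time into its own field rather than @timestamp.
    match  => ["Start_Date_Time", "yyyy-MM-dd HH:mm:ss"]
    target => "start_date_time_parsed"
  }
  date {
    # Parse the end time into its own field as well.
    match  => ["End_Date_Time", "yyyy-MM-dd HH:mm:ss"]
    target => "end_date_time_parsed"
  }
}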