I'm facing an issue with an empty date field. My data is ingested from a CSV file, and some rows have an empty date field. I already made a condition to handle it,
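something along these lines (a simplified sketch of the kind of condition; the field path is the one from the error below):

```
filter {
  # if the CSV row left the date empty, remove the field entirely
  # so Elasticsearch never sees an empty string for a date mapping
  if [data][listOrder][tglTerima] == "" {
    mutate {
      remove_field => [ "[data][listOrder][tglTerima]" ]
    }
  }
}
```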
But I keep getting an error like this:
response=>{"index"=>{"_index"=>"pgd-2023.03.28", "_type"=>"_doc", "_id"=>"PphGJ4cB90Y937f_h7LS", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [data.listOrder.tglTerima] of type [date] in document with id 'PphGJ4cB90Y937f_h7LS'. Preview of field's value: ''", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"cannot parse empty date"}}}}}
I've even tried a ruby filter, but it still doesn't work.
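The ruby attempt was essentially this (again a sketch, assuming the same field path):

```
filter {
  ruby {
    # drop the field via the event API when it is nil or blank
    code => '
      v = event.get("[data][listOrder][tglTerima]")
      event.remove("[data][listOrder][tglTerima]") if v.nil? || v.to_s.strip.empty?
    '
  }
}
```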
This error is from Elasticsearch. The record is sent from Logstash to Elasticsearch, which looks at the index mapping for the field data.listOrder.tglTerima and sees that it requires a date. The mapping parser throws an error saying, in effect, "you sent me the field data.listOrder.tglTerima, but the value was empty, so I can't accept this record," and shoots that error back to Logstash.
So you either need to change the mapping on your index for that field to allow for malformed data, or remove that field in Logstash before the event is sent if you don't need it.
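For the mapping route, a sketch of the idea, using the index name from your error (in practice you would put this in an index template so each new daily index picks it up; also note that whether ignore_malformed catches empty strings can depend on the Elasticsearch version, so dropping the field in Logstash is the more robust fix):

```
PUT pgd-2023.03.28/_mapping
{
  "properties": {
    "data": {
      "properties": {
        "listOrder": {
          "properties": {
            "tglTerima": {
              "type": "date",
              "ignore_malformed": true
            }
          }
        }
      }
    }
  }
}
```

With ignore_malformed enabled, documents with an unparseable value in that field are still indexed; the bad value is simply not indexed for that field.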
Briefly, this error occurs when Elasticsearch cannot parse a date field because the value is empty. Elasticsearch expects a valid date value for date fields and cannot parse an empty string. To resolve the issue, either ensure the date field has a valid value or drop the field from the event when it is empty.
My goal in creating such a condition is to cope with imperfect incoming data, so that logs with incomplete data can still make it into Elasticsearch. This configuration will go into production, so I can't guarantee that the data sent to Elasticsearch is well formed.
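So rather than listing every field, I'm considering a generic cleanup in Logstash that strips empty-string values under data before the event is sent (a sketch; it assumes the problematic empty values can appear anywhere under data):

```
filter {
  ruby {
    # recursively remove empty-string hash values under [data]
    # so dirty fields never reach the date parser
    code => '
      clean = lambda do |obj|
        case obj
        when Hash
          obj.each_value { |v| clean.call(v) }
          obj.reject! { |_k, v| v.is_a?(String) && v.strip.empty? }
        when Array
          obj.each { |v| clean.call(v) }
        end
      end
      d = event.get("data")
      if d.is_a?(Hash)
        clean.call(d)
        event.set("data", d)
      end
    '
  }
}
```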