How to handle missing values for a date datatype while ingesting data into Elasticsearch

There are a few empty values for a date-datatype field in a CSV file, so while ingesting the data through Kibana (uploading the CSV file), I am getting the error below.

Some documents could not be imported

1445 out of 45436 documents could not be imported. This could be due to lines not matching the Grok pattern.

Failed documents

1 of 86

234: field [eol] not present as part of path [eol]

{"message":"SJADGFF,Installed,vmware photon os 3,Linux container,not_found,Vmware Photon,,"}

324: field [eol] not present as part of path [eol]

{"message":"HGSFHJF,Installed,vmware photon os 3,Linux container,not_found,Vmware Photon,,"}

546: field [eol] not present as part of path [eol]

{"message":"JJADSDVN,Installed,vmware photon os 3,Linux container,not_found,Vmware Photon,,"}

Basically, the documents with empty values for the EOL field (date datatype) are not being ingested into Elasticsearch.

Any help, please?

Hi,

You can use the null_value parameter in the mapping to hard-code a date value when a field's value is null.

The null_value parameter replaces explicit null values with the specified value so that the field can still be indexed and searched.
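As a minimal sketch of what that mapping could look like: the index name my-index, the field name eol (taken from the error messages above), and the placeholder date 2099-12-31 are all assumptions you would replace with your own values. Note that null_value only applies to explicit JSON nulls, not to fields that are absent from the document.

```json
PUT my-index
{
  "mappings": {
    "properties": {
      "eol": {
        "type": "date",
        "null_value": "2099-12-31"
      }
    }
  }
}
```

With this mapping, a document indexed with `"eol": null` is stored and searchable as if its eol were 2099-12-31, while the original null is kept in the `_source`.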

Hi,

Thanks for your response. Where do you want me to make these changes?

I am uploading the CSV file directly in Kibana. Do you want me to keep the blank values as they are in the CSV file, or replace the blank values with null_value in the CSV file?

Could you please explain how to proceed in a bit more detail?

Hi,

When uploading the file in Kibana, after clicking Import, open the advanced section, where you can customize the mappings. Add the above mapping for the fields that will have empty values.
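The advanced section of Kibana's file upload expects the mappings object itself rather than a full index-creation request, so the edit could look roughly like the sketch below. The eol field comes from the error messages in this thread; any other field names and types shown by the mappings editor should be kept as Kibana generated them, and 2099-12-31 is just an assumed placeholder date.

```json
{
  "properties": {
    "eol": {
      "type": "date",
      "null_value": "2099-12-31"
    }
  }
}
```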

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.