Logstash Fails to Index Events Due to Field Type

Hello,

I have events that are not indexing into Elasticsearch when passing through Logstash. The reason is that sometimes a field like "remoteAddress" (of type ip) contains no value.

```
...worker18","logEvent":{"message":"Could not index event to Elasticsearch.","status":400,

...:{"type":"document_parsing_exception","reason":"[1:1390] failed to parse field [remoteAddress] of type [ip] in document with id 'Wiag6pABiALJHf7sVsMt'. Preview of field's value: ''","caused_by":{"type":"illegal_argument_exception","reason":"'' is not an IP string literal."}}}}}}
```

I know this has to do with the mappings I had set up. I mapped remoteAddress as type ip. The issue is that not all events have an IP value. If there's no IP value, it's just " ".

How would I go about fixing this? I still want the field to be of type ip. Also, I don't understand why one field would cause the whole event to fail to ingest.

Thanks
Logstash: 8.14.1

That's just the way it is. If a field cannot be mapped, the document is not indexed.

You will need to clean up your data in Logstash. There is an example of using a ruby filter to delete empty fields here; a sketch of the idea follows below.
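Something along these lines should work as a starting point. This is an untested sketch, and it only inspects top-level fields; nested fields would need a recursive walk:

```
filter {
  ruby {
    # Remove any top-level field whose value is an empty or
    # whitespace-only string, so typed fields such as
    # remoteAddress never reach the ip mapper with "".
    code => '
      event.to_hash.each do |name, value|
        event.remove(name) if value.is_a?(String) && value.strip.empty?
      end
    '
  }
}
```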


The default behaviour when you have a mapping conflict is to reject the entire document. You can change this a little using the ignore_malformed parameter, either on individual fields or for all fields in the index.
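For example, the field-level form looks like this (the index name is illustrative):

```
PUT my-logs
{
  "mappings": {
    "properties": {
      "remoteAddress": {
        "type": "ip",
        "ignore_malformed": true
      }
    }
  }
}
```

To apply it to all fields instead, set `index.mapping.ignore_malformed: true` in the index settings. The field stays mapped as ip; malformed values are simply not indexed for that field.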

This will make Elasticsearch ignore the conflicting field but still index the others. Keep in mind that this does not work for conflicts on object fields, e.g. when the mapping says the field is an object but the document provides a scalar.

This blog post also has more insight on this.

The ignore_malformed parameter has been configured by default in the built-in index templates since 8.9.

If you use a custom template, you need to add it yourself.

But I would say that it is better to remove the empty fields. You can do that in Logstash using a ruby filter (as sketched above) or on the Elasticsearch side using an ingest pipeline with a script processor, for example:
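A minimal sketch of the ingest pipeline approach. The pipeline name is arbitrary, and this version only checks top-level string fields; the real integration pipelines do this recursively:

```
PUT _ingest/pipeline/drop-empty-fields
{
  "description": "Drop top-level fields whose value is an empty or blank string",
  "processors": [
    {
      "script": {
        "lang": "painless",
        "source": "ctx.values().removeIf(v -> v instanceof String && v.trim().isEmpty())"
      }
    }
  ]
}
```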

For example, all Elastic Agent integrations have a final script processor that does this. You could copy that processor into a custom ingest pipeline, and then configure your templates to call this ingest pipeline when sending data to Elasticsearch.
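For instance, an index template can point matching indices at the pipeline via the index.default_pipeline setting (template name and index pattern here are placeholders, and the pipeline name matches the sketch above):

```
PUT _index_template/my-logs-template
{
  "index_patterns": ["my-logs-*"],
  "template": {
    "settings": {
      "index.default_pipeline": "drop-empty-fields"
    }
  }
}
```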
