Failover Mapping

Hey, we are using Elasticsearch primarily to index our logging data. Unfortunately, due to the many systems we are maintaining, we can't really guarantee that every "device" is logging its events the same way.

For example: We are using the field name 'srcip' in every event that contains a source IP address. In our Elasticsearch cluster I've created a static mapping for this field to parse it as "ip". That's perfect for 99% of our log events. Unfortunately, the Windows events we are indexing via:

nxlog > syslog-ng (to hdd) > logstash json parser (from hdd) > elasticsearch

do not always contain an IP address in that field, even though Windows has added it. Sometimes it is only a dash '-', and in that case Elasticsearch logs an error and the document is not indexed. Like so:

"stacktrace": ["org.elasticsearch.index.mapper.MapperParsingException: failed to parse field [srcip] of type [ip] in document with id 'F9_XOXABfPOu07_eHMGA'. Preview of field's value: '-'"

So my question is whether it is possible to define a kind of "failover mapping". A solution for the above example would be that the content of srcip ("-") is indexed as a string whenever the ip field mapper fails.
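For reference, the static mapping in question would look roughly like this (the field name `srcip` is from the post above; the index name `logs` is a placeholder):

```json
PUT logs
{
  "mappings": {
    "properties": {
      "srcip": { "type": "ip" }
    }
  }
}
```

With this mapping, any document whose `srcip` value is `"-"` is rejected with the MapperParsingException shown above.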

PS: The same problem applies to several other sources from which we are collecting log data. Windows was only an example.

Hi @tomx1,

You can't fail over to another type, but you can use ignore_malformed so the event still indexes properly with its other fields. See https://www.elastic.co/guide/en/elasticsearch/reference/current/ignore-malformed.html

You will still have the contents of the broken field in the source if you need to inspect it.
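A sketch of what that mapping change could look like, again assuming the field name `srcip` from the original post (the index name `logs` is a placeholder):

```json
PUT logs
{
  "mappings": {
    "properties": {
      "srcip": {
        "type": "ip",
        "ignore_malformed": true
      }
    }
  }
}
```

A document with `srcip: "-"` is then accepted; the malformed value is simply not indexed for that field, but it remains visible in `_source`.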
