ECS issues

Hi,

We have recently upgraded from 7.x to 8.x, which overall has been OK. We have a number of feeds via Logstash that were set up under 7.x and are still indexing correctly because their indices were created under 7.x. However, we are now trying to add a new feed which is identical but collects events from a different source, and we are receiving a message advising that the event cannot be indexed due to a mismatch with the ECS format.

Previously, we have just been able to send data to a new index and Elasticsearch mapped the fields for us. Is there any way we can avoid having to match the ECS format when sending events via Logstash? A setting in the Elasticsearch output, perhaps?

Whilst I can see the benefits of ECS and we will certainly head that way, we just don't have time to create the Logstash filters to comply right now.

Thanks.

It'd be useful if you shared the full message, along with an example of the mapping and the data.

[2022-04-04T04:44:00,345][WARN ][logstash.outputs.elasticsearch][adv-jdbc-logging][975684571fa15924835a86b188c57248bf02f343b57de03f9760d526ba1ef891] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"jdbc-prod-auth", :routing=>nil}, {"@version"=>"1", "site"=>"https://xxxx.xxxxx.app/", "user"=>"jonp@xxxx.com", "sourceip"=>"xxx.xxx.xxx.xxx", "windowsidentity"=>"IIS APPPOOL\\xxx.xxxxxx.app", "logid"=>42, "@timestamp"=>2022-04-04T04:44:00.148794Z, "issuccess"=>false, "loggedondate"=>2022-04-04T04:43:56.110Z, "tags"=>["JDBC-PROD-AUTHLOG"], "useragent"=>"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.84 Safari/537.36"}], :response=>{"index"=>{"_index"=>"jdbc-prod-auth-000001", "_id"=>"Q23j8n8BOqpPxiVrC4DH", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [user] tried to parse field [user] as object, but found a concrete value"}}}}

The issue appears to be that in ECS, user is an object, whereas in our data it's a simple text field. My question is: is there any way to avoid having to use ECS or custom templates? Elasticsearch has always just indexed the data in the past.

Thanks.

In ECS, user.* is a top-level object with leaf fields underneath, like user.name. I suspect your data is using user as a leaf field, causing ES to throw this exception.
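
If you do want to keep the ECS layout rather than change the mapping, one option is to move the flat fields under the corresponding ECS objects in Logstash before the output. This is only a minimal sketch; the field names come from the log line above, and the ECS target fields (user.name, source.ip, user_agent.original) are my assumptions about where that data belongs:

filter {
  mutate {
    # Move leaf fields under the ECS object fields they appear to correspond to
    rename => {
      "user"      => "[user][name]"
      "sourceip"  => "[source][ip]"
      "useragent" => "[user_agent][original]"
    }
  }
}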

In Logstash 8.x, all pipelines use ECS compatibility mode by default, but the setting can be disabled: ECS in Logstash | Logstash Reference [8.1] | Elastic.
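
For example, ECS compatibility can be switched off for a single pipeline in pipelines.yml (or globally via pipeline.ecs_compatibility in logstash.yml). A rough sketch, with the pipeline id taken from your log line and the config path invented for illustration:

- pipeline.id: adv-jdbc-logging
  path.config: "/etc/logstash/conf.d/jdbc-auth.conf"
  pipeline.ecs_compatibility: disabled

Individual plugins also accept the setting, e.g. in the output block (host assumed for the example):

output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    index => "jdbc-prod-auth"
    ecs_compatibility => disabled
  }
}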

Thanks for this reply; that is exactly what the problem was. Instead of disabling compatibility, I created a new mapping to match my events.
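
For anyone who hits the same thing, here is roughly what that looks like as a custom index template. The template name, index pattern, and field types below are assumptions based on the event shown in the log line earlier, not the exact mapping used:

PUT _index_template/jdbc-prod-auth
{
  "index_patterns": ["jdbc-prod-auth*"],
  "template": {
    "mappings": {
      "properties": {
        "user":         { "type": "keyword" },
        "sourceip":     { "type": "ip" },
        "useragent":    { "type": "keyword" },
        "site":         { "type": "keyword" },
        "issuccess":    { "type": "boolean" },
        "logid":        { "type": "long" },
        "loggedondate": { "type": "date" }
      }
    }
  }
}

With an explicit mapping like this in place, user is mapped as a plain keyword field, so it no longer collides with the ECS user object mapping.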

I assume it would be better to comply with the ECS where possible?

There is a lot to be gained from structuring your data to a standard schema, and the Elastic security and observability solutions both take advantage of ECS-mapped data: Elastic Common Schema: Normalizing your data with ECS | Elastic.

But I can't say it's always better to comply with ECS. How to structure your data depends on your specific needs and uses.
