Logstash elasticsearch output - data_stream_auto_routing

I'm a bit confused about the data_stream_auto_routing option when using the elasticsearch output to write to data streams in Logstash.

According to the documentation:

Automatically routes events by deriving the data stream name using specific event fields with the %{[data_stream][type]}-%{[data_stream][dataset]}-%{[data_stream][namespace]} format.

My output config:

output {
  stdout {}
  elasticsearch {
    hosts => ["${ES_HOSTS}"]    
    user => "${ES_USER}"
    password => "${ES_PASS}"
    ssl => "true"
    data_stream => "true"   
    data_stream_auto_routing => "true"
  }
}

So I was under the assumption that when logging from different sources, each source could send its own namespace via data_stream.namespace and Logstash would route the output to different data streams, i.e. logs-generic-app1, logs-generic-app2, and so forth...
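For example (a hypothetical event, assuming the data_stream fields arrive as a real nested object and app1 is just a placeholder namespace), an event like this should be routed to the logs-generic-app1 data stream:

{
  "message": "hello from app1",
  "data_stream": {
    "type": "logs",
    "dataset": "generic",
    "namespace": "app1"
  }
}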

Instead, it seems that all events are being sent to logs-generic-default, and an error is thrown as well:

 "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Mapper for [data_stream.namespace] conflicts with existing mapper:\n\tCannot update parameter [value] from [default] to [app1]"}}}}

Am I completely misunderstanding how this is supposed to work?

Never mind... Figured it out. For future reference:

This only works when data_stream is a real nested object in the Logstash event, meaning

  "data_stream": { "dataset": "some_set", "namespace": "namespace" } 

The incoming fields were dotted top-level fields (data_stream.namespace: something), so Logstash completely ignored them for routing, while Elasticsearch treated them as nested fields...
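A minimal fix, assuming the incoming fields really are top-level fields with literal dots in their names (e.g. a field called data_stream.namespace), is to rename them into a proper nested object before the elasticsearch output, for example with a mutate filter:

filter {
  mutate {
    # Turn the dotted top-level fields into a nested [data_stream] object
    # so the output plugin can derive the data stream name from them.
    rename => {
      "[data_stream.type]"      => "[data_stream][type]"
      "[data_stream.dataset]"   => "[data_stream][dataset]"
      "[data_stream.namespace]" => "[data_stream][namespace]"
    }
  }
}

With the fields renamed like this, data_stream_auto_routing picks up %{[data_stream][type]}-%{[data_stream][dataset]}-%{[data_stream][namespace]} as expected.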
