Use field name in datastream namespace or dataset

I'm trying to use an event field with the Logstash output to Elasticsearch, but I can't seem to get this to work. My configuration is as follows:

output {
  elasticsearch {
    hosts => "es01"
    data_stream => "true"
    data_stream_namespace => "%{[ticket_number]}"
  }
}

The resulting index has the name:


I'm not sure whether dynamic fields are supported or not. Maybe the syntax is incorrect?

I tried a few different approaches too; the following didn't work:


The output does not sprintf the data stream name, but you can use the data_stream_auto_routing option to tell the output to reference fields on the event for parts of the stream name.
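A minimal sketch of that approach (the field values here are illustrative, not from the thread): set the [data_stream] fields on the event, and the output will route on them when auto routing is enabled.

```
filter {
  mutate {
    # With data_stream_auto_routing enabled, these event fields determine
    # the target stream; this example would route to logs-mydataset-mynamespace.
    add_field => {
      "[data_stream][type]" => "logs"
      "[data_stream][dataset]" => "mydataset"
      "[data_stream][namespace]" => "mynamespace"
    }
  }
}

output {
  elasticsearch {
    hosts => "es01"
    data_stream => "true"
    data_stream_auto_routing => "true"
  }
}
```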

Hey @Badger - Thanks for the source code link, I was looking for this and couldn't find it.

I've tried your suggestion with data_stream_auto_routing, but I receive an error. My configuration looks like this:

filter {
  # set ticket_number based on grok filter
  grok {
    match => {
      "path" => "%{GREEDYDATA}/data/%{GREEDYDATA:[ticket_number]}/%{GREEDYDATA:[node_name]}/logs/%{GREEDYDATA}\.log"
    }
  }
  mutate {
    replace => { "data_stream.dataset" => "%{ticket_number}" }
  }
}

output {
  elasticsearch {
    hosts => "es01"
    data_stream => "true"
    data_stream_auto_routing => "true"
  }
}

The error I receive is as follows:

[2022-04-13T10:48:54,447][WARN ][logstash.outputs.elasticsearch][debug-logs][b1d4c5c8928af2ea4daa905af544039dc325d7cc8c34d4e82fd26795b0a83810] Could not index event to Elasticsearch. {
    "message"=>"[system/00000000] Checkpoint triggered by \"Store copy\" @ txId: 73 checkpoint started...",
    ...
    "data_stream.dataset"=>"123456", "path"=>"/home/logstash/data/123456/node1/logs/mylog.log", "loglevel"=>"INFO", "@timestamp"=>2022-04-04T10:21:16.563Z, "data_stream"=>{"type"=>"logs", "dataset"...
    "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [data_stream.dataset] of type [constant_keyword] in document with id 'spKKIoABpJlrM6S2W1GG'. Preview of field's value: '123456'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"[constant_keyword] field [data_stream.dataset] only accepts values that are equal to the value defined in the mappings [generic], but got [
Looking at the above I see the following: _index=>"logs-generic-default", which tells me that the index has already been created somehow, and this event is destined for an index called logs-generic-default rather than logs-123456-default.

I reviewed the logs prior to the errors and see that maybe this could be something to do with ecs_compatibility and the out-of-the-box templates... but I'm not sure. Here is what I noticed in the logs:

[2022-04-13T10:48:50,847][INFO ][logstash.outputs.elasticsearch][mylog-logs] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
[2022-04-13T10:48:50,888][WARN ][deprecation.logstash.filters.grok][mylog-logs] Relying on default value of `pipeline.ecs_compatibility`, which may change in a future major release of Logstash. To avoid unexpected changes when upgrading Logstash, please explicitly declare your desired ECS Compatibility mode.
[2022-04-13T10:48:50,934][INFO ][logstash.outputs.elasticsearch][mylogs-logs] Installing Elasticsearch template {:name=>"logstash"}

Do I need to set some additional configuration to make this work? I'm not sure what I am missing.

Logstash does not use the same syntax as Kibana and Elasticsearch: it can distinguish between a field with a . in its name and a nested field. Use [data_stream][dataset].
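Applied to the mutate filter from the config above, that field-reference syntax looks like this (a sketch; `data_stream.dataset` in quotes would instead create a top-level field whose name contains a literal dot):

```
mutate {
  # [data_stream][dataset] addresses the dataset key nested inside the
  # data_stream object, which is what the output's routing expects.
  replace => { "[data_stream][dataset]" => "%{ticket_number}" }
}
```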

That worked a treat :+1:

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.