Dynamic naming of data streams

Hello,

I'm currently trying to simplify my pipelines, but unfortunately the sprintf format for field references (as described in Field References Deep Dive | Logstash Reference [7.14] | Elastic) does not work in the data stream options of the elasticsearch output.

Am I doing something wrong here, or is it just not possible?

Below you'll see the current settings of the pipeline:

input {
  pipeline { address => "pipeline-name" }
}

filter {
  grok {
    match => {"[log][path]" => "%{GREEDYDATA}/%{WORD:logfile}\.log" }
  }
}

output {
  elasticsearch {
    ssl => true
    cacert => "/etc/logstash/ssl/ca.crt"
    hosts => ["node001:9200", "node002:9200", "node003:9200"]
    user => "USER"
    password => "PASSWORD"
    data_stream => true
    data_stream_type => "logs"
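    # these sprintf references are emitted literally (see the EDIT below)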
    data_stream_dataset => "%{[fields][ingest][data_origin]}.%{[logfile]}"
    data_stream_namespace => "prod"
  }
}

For now I'm using an if/else construct with about 8-9 cases, which I'm quite unhappy about.
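
To illustrate, the output section looks roughly like this (the dataset names here are made-up placeholders, not my real ones, and each branch repeats the same connection settings):

output {
  # one branch per log type; connection settings are duplicated in each
  if [logfile] == "access" {
    elasticsearch {
      ssl => true
      cacert => "/etc/logstash/ssl/ca.crt"
      hosts => ["node001:9200", "node002:9200", "node003:9200"]
      user => "USER"
      password => "PASSWORD"
      data_stream => true
      data_stream_type => "logs"
      data_stream_dataset => "myapp.access"   # hard-coded per case
      data_stream_namespace => "prod"
    }
  } else if [logfile] == "error" {
    elasticsearch {
      ssl => true
      cacert => "/etc/logstash/ssl/ca.crt"
      hosts => ["node001:9200", "node002:9200", "node003:9200"]
      user => "USER"
      password => "PASSWORD"
      data_stream => true
      data_stream_type => "logs"
      data_stream_dataset => "myapp.error"    # hard-coded per case
      data_stream_namespace => "prod"
    }
  }
  # ... six or seven more branches like these
}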

Maybe someone here can share their thoughts.

EDIT: The data stream that gets created comes out like this:

logs-%{[fields][ingest][data_origin]}.%{[logfile]}-prod

Kind Regards,
Marcus

I see no indication in the code that it sprintfs the dataset name, so that is the outcome I would expect.
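
One thing that might be worth a try, though I haven't tested it myself: the output has a data_stream_auto_routing option (enabled by default) that derives the stream name from the event's own [data_stream] fields when they are present, and those you can populate with sprintf in a mutate filter. A minimal sketch, assuming that precedence behavior:

filter {
  mutate {
    # add_field values are sprintf'd, unlike the data_stream_dataset option
    add_field => {
      "[data_stream][type]"      => "logs"
      "[data_stream][dataset]"   => "%{[fields][ingest][data_origin]}.%{[logfile]}"
      "[data_stream][namespace]" => "prod"
    }
  }
}

output {
  elasticsearch {
    ssl => true
    cacert => "/etc/logstash/ssl/ca.crt"
    hosts => ["node001:9200", "node002:9200", "node003:9200"]
    user => "USER"
    password => "PASSWORD"
    data_stream => true
    # no data_stream_dataset here; with auto-routing enabled, the event's
    # [data_stream] fields take precedence over the data_stream_* settings
  }
}

Note that if either source field is missing from an event you'd be back to a literal %{...} in the stream name, so you'd still want a guard for that case.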

Thanks for taking the time to clarify this. Then I'll stick with the if/else construct for now.
