Use ingest pipelines with output.kafka specified

Hello there, is it possible to use an ingest pipeline in Filebeat without a direct connection to Logstash (or Elasticsearch)? In our scenario we have specified output.kafka (Filebeat -> Kafka -> Logstash -> Elasticsearch), but we need to make some changes to the log fields and structure before a log is sent from Filebeat to the Kafka topic. It is also worth mentioning that we use an autodiscover setup for our Kubernetes environment. We made a custom module and ingest pipeline, but it doesn't seem to work. From what I read in the documentation, I understand that it is not possible for Filebeat to use ingest pipelines to transform logs. Am I right, or am I missing something? Is there any option for advanced log transformation in Filebeat? Thank you

You can use a conditional in your final Logstash output to specify the ingest pipeline. In the sample below, the event must have a tag and a module name, which get passed to the "pipeline" option.

else if "use_ingest" in [tags] and [agent][module] {
  elasticsearch {
    hosts => [{{ ES_http }}]
    cacert => "/etc/logstash/certs/https_interm.cer"
    user => "{{ elastic.user }}"
    password => "{{ elastic.pass }}"
    sniffing => false
    manage_template => false
    pipeline => "%{[@metadata][beat]}-%{[@metadata][version]}-%{[agent][module]}-%{[fileset][name]}-pipeline"
    ilm_enabled => true
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{[fields][app_id]}-%{[fields][campus]}"
  }
}

Well, thanks for your reply, I appreciate it. But I'm not sure this is what I'm looking for (or maybe I'm just misinterpreting your answer). We don't want to make any changes in Logstash or Elasticsearch. Those components are shared by many teams in our corporation, so we don't want to make any unnecessary changes just because we decided to use Filebeat. Since those components are maintained by another squad, every change request is a long-running corporate process. Our idea was simply to use Filebeat to read logs from Docker containers, transform their structure or fields as we want, and send them to a Kafka topic. That's it. We thought using custom Filebeat modules and ingest pipelines was the right way to do that, but now it seems we have to find another solution.
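For transformations done entirely inside Filebeat, before the event ever reaches Kafka, Filebeat processors in filebeat.yml are the usual option, since ingest pipelines only ever run on an Elasticsearch node. A minimal sketch below (the field names, team label, Kafka host, and topic are placeholders, not values from this thread):

```yaml
filebeat.autodiscover:
  providers:
    - type: kubernetes
      templates:
        - config:
            - type: container
              paths:
                - /var/log/containers/*.log

processors:
  # rename / drop / add fields before the event is published
  - rename:
      fields:
        - from: "message"
          to: "event.original"
      ignore_missing: true
  - drop_fields:
      fields: ["agent.ephemeral_id"]
      ignore_missing: true
  - add_fields:
      target: "labels"
      fields:
        team: "my-team"

output.kafka:
  hosts: ["kafka:9092"]
  topic: "my-logs"
```

For more advanced restructuring there is also the `script` processor, which lets you transform events with Javascript; that may be closer to what an ingest pipeline would have done.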


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.