Pipeline with id [y] does not exist

Hello,

I've found multiple topics about this very issue, but I don't really see a solution. I'm also having a little difficulty understanding it all, as I'm fairly new at this.

I have each of the ELK components running in Docker on Alpine Linux, all at version 7.16.3. I have numerous hosts using Filebeat to feed logs into Logstash, and everything is working fine, except that all of the data ends up in the message field only. So I am experimenting with Filebeat modules, the iptables module in this case.

The Filebeat node I am experimenting with is running version 7.16.4. Here is its filebeat.yml (comments removed):

output.logstash:
  hosts: ["10.1.1.31:5044"]

processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~

Verification that the iptables filebeat module is enabled:

# ./filebeat -c ./filebeat.yml modules list | head
Enabled:
iptables

modules.d/iptables.yml (comments removed):

- module: iptables
  log:
    enabled: true
    var.input: "file"
    var.paths: ["/var/log/iptables.log"]

/var/log/iptables.log is created by rsyslogd, which is configured to match the firewall log lines by the prefix I'm using.
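For reference, the rsyslog rule is roughly the following (a sketch; the "IPTABLES:" prefix comes from my iptables LOG rules, and the snippet filename is just an example):

# /etc/rsyslog.d/10-iptables.conf -- route kernel lines carrying the LOG prefix to their own file
:msg, contains, "IPTABLES:" /var/log/iptables.log
& stop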

Let's fire it up!

./filebeat -c ./filebeat.yml -e

Nothing unusual on STDERR from that. However, I don't see the expected data in Kibana. The following shows up in the Logstash logs:

[2022-02-08T01:35:51,938][WARN ][logstash.outputs.elasticsearch][main][1ce34e7b02ff1dc873969a447ed719029b89cf2450da2d03f9b094543240b4ef] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-7.16.4-2022.02.08", :routing=>nil, :pipeline=>"filebeat-7.16.4-iptables-log-pipeline"}, {"message"=>"2022-02-08T01:35:48.042301+00:00 alpine-fb-test kernel: [ 7947.747187] IPTABLES: IN=eth0 OUT= MAC=00:0c:29:46:97:06:00:52:56:32:f8:c1:08:00 SRC=10.1.1.224 DST=10.1.1.38 LEN=60 TOS=0x00 PREC=0x00 TTL=64 ID=28993 DF PROTO=TCP SPT=46412 DPT=12351 WINDOW=64240 RES=0x00 SYN URGP=0 ", "agent"=>{"id"=>"9ed8531f-e3ca-447e-8282-988006213834", "hostname"=>"alpine-fb-test", "name"=>"alpine-fb-test", "ephemeral_id"=>"00623e0f-bc4a-403a-bea5-421bbfe41317", "type"=>"filebeat", "version"=>"7.16.4"}, "event"=>{"dataset"=>"iptables.log", "module"=>"iptables", "timezone"=>"+00:00"}, "log"=>{"offset"=>187085, "file"=>{"path"=>"/var/log/iptables.log"}}, "@version"=>"1", "fileset"=>{"name"=>"log"}, "input"=>{"type"=>"log"}, "service"=>{"type"=>"iptables"}, "ecs"=>{"version"=>"1.12.0"}, "@timestamp"=>2022-02-08T01:35:50.833Z, "tags"=>["iptables", "forwarded", "beats_input_codec_plain_applied"]}], :response=>{"index"=>{"_index"=>"filebeat-7.16.4-2022.02.08", "_type"=>"_doc", "_id"=>nil, "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"pipeline with id [filebeat-7.16.4-iptables-log-pipeline] does not exist"}}}}

Going by the other threads, I try to check whether that pipeline exists (am I doing this right?):

curl -XGET "http://10.1.1.31:9200/_ingest/pipeline/filebeat-7.16.4-iptables-log-pipeline"

{}
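If I have the API right, a wildcard query like the one below should list any Filebeat pipelines that have been loaded, which would confirm whether anything was set up at all:

curl -XGET "http://10.1.1.31:9200/_ingest/pipeline/filebeat-*?pretty"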

I am not really sure how to proceed from here. If you've read this far, thanks!

It looks like the Elasticsearch output in your Logstash configuration is setting the pipeline option, and it is telling Elasticsearch to use an ingest pipeline that does not exist. What does your Elasticsearch output configuration look like?

# attempt to get expected iptables filebeat module fields populated
# https://www.elastic.co/guide/en/logstash/7.0/use-ingest-pipelines.html

output {
  if [@metadata][pipeline] {
    elasticsearch {
      hosts => "http://172.17.0.2:9200"
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
      pipeline => "%{[@metadata][pipeline]}"
    }
  } else {
    elasticsearch {
      hosts => "http://172.17.0.2:9200"
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    }
  }
}

Did you run the setup while using the Elasticsearch output in Filebeat?

I think you need to switch Filebeat to the Elasticsearch output, run the setup, and after that you can change back to the Logstash output.
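Something along these lines should load just the module pipelines without permanently reconfiguring anything (a sketch; the -E overrides and the Elasticsearch host are assumptions you'd adjust for your setup):

# temporarily disable the logstash output and point setup at elasticsearch
./filebeat -c ./filebeat.yml setup --pipelines --modules iptables \
  -E 'output.logstash.enabled=false' \
  -E 'output.elasticsearch.hosts=["10.1.1.31:9200"]'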

Ah HAH!! That was covered in the very instructions I linked, and I hadn't done it. Things are working now, this is really exciting!! THANKS!!
