Default ingest pipeline not working

Hi, I'm pretty new to Elasticsearch, so sorry if this is a newbie question.
I want to change the ingest pipeline and the index for my Cisco IOS module. I tried changing them in the filebeat.yml file. My config so far:

filebeat.inputs:
- type: log
  enabled: false
  paths:
    - /var/log/*.log
    #- c:\programdata\elasticsearch\logs\*
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false
  #reload.period: 10s
setup.template.settings:
  index.number_of_shards: 1
setup.kibana:
  host: "kibana:5601"
output.elasticsearch:
  hosts: ["node-1:9200"]
  indices:
    - index: "cisco-beat-%{+yyyy.MM}"
      when.contains:
        event.module: "cisco"
  protocol: "https"
  ssl.certificate: "/etc/filebeat/filebeat/filebeat.crt"
  ssl.key: "/etc/filebeat/filebeat/filebeat.key"
  ssl.certificate_authorities:
    - /etc/filebeat/ca/ca.crt
  username: "elastic"
  password: "xxxxxxxxxxxxxxxxx"
  pipeline: geoip-info
  pipelines:
    - pipeline: "ciscosbeatspipeline"
      when.contains:
        event.module: "cisco"
processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~

I created my ingest pipeline (very simple) via the Dev Console:

PUT _ingest/pipeline/ciscosbeatspipeline
{
  "description": "cisco ios ingest pipeline",
  "processors": [
    {
      "dissect": {
        "field": "log.original",
        "pattern": "%{}: %{DeviceName}:"
      }
    },
    {
      "set": {
        "field": "host.name",
        "value": "{{DeviceName}}"
      }
    },
    {
      "remove": {
        "field": "DeviceName"
      }
    }
  ]
}

Then I changed the default pipeline setting for my index. Here is my current index setting:

GET /cisco-beat-*/_settings
{
  "cisco-beat-2020.09" : {
    "settings" : {
      "index" : {
        "number_of_shards" : "1",
        "provided_name" : "cisco-beat-2020.09",
        "default_pipeline" : "ciscosbeatspipeline",
        "creation_date" : "1599297439325",
        "number_of_replicas" : "1",
        "uuid" : "xuLvwqnLTHiZIQ6yKICRaQ",
        "version" : {
          "created" : "7080099"
        }
      }
    }
  }
}
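
For reference, setting index.default_pipeline on an existing index can be done with an update-settings request like this (a sketch using the index name shown above; the exact call I used may have differed slightly):

PUT /cisco-beat-2020.09/_settings
{
  "index.default_pipeline": "ciscosbeatspipeline"
}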

But my data isn't being processed by the ingest pipeline. I checked the pipeline with the Test Pipeline feature in Stack Management -> Ingest Node Management and it works fine (the host.name field changes).
I'm running a single-node ELK 7.8 cluster.
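
For reference, the same test can also be done from the Dev Console with the simulate API, something like this (the sample message below is made up, just shaped to match the dissect pattern):

POST _ingest/pipeline/ciscosbeatspipeline/_simulate
{
  "docs": [
    {
      "_source": {
        "log": {
          "original": "000123: SW-CORE-01: %SYS-5-CONFIG_I: Configured from console by console"
        }
      }
    }
  ]
}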

Did you change enabled from false to true?

Your filebeat input will not work until enabled is set to true.

I don't think that's the problem. Filebeat can receive my Cisco logs and send them to Elasticsearch with the correct index ("cisco-beat-%{+yyyy.MM}"), but it can't send them to the pipeline.

Yeah, you are right, you are using the Cisco module; sorry, I saw that later. The enabled setting in the input does not affect your case.

So, can you share what you are receiving in Elasticsearch? What fields does your document have?

I'm not sure whether Filebeat or the Cisco module outputs anything in the log.original field. Share a sample document of what is being indexed.
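
For example, something like this returns one indexed document to look at (using the index pattern from your config):

GET /cisco-beat-*/_search
{
  "size": 1
}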

Here is a sample document.
Also, I created a very simple pipeline (just a set processor) and I still didn't get results, so I don't think there is a problem with the ingest pipeline itself.

After reading this post I changed final_pipeline instead of default_pipeline and it worked!
It looks like default_pipeline is not applied when the docs come directly from Filebeat and it's not a re-index or a Watcher index action, most likely because Filebeat already sets a pipeline on the request (in my config, the geoip-info and conditional pipelines), and a request-level pipeline overrides the index's default_pipeline, while final_pipeline still runs afterwards. I hope they add a warning about this in the documentation.
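
For anyone hitting the same issue, the change itself is just an update-settings request along these lines (a sketch using the index and pipeline names from above):

PUT /cisco-beat-2020.09/_settings
{
  "index.final_pipeline": "ciscosbeatspipeline"
}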

