Filebeat to Elasticsearch's ingest node in Kubernetes

I followed the running-on-kubernetes guide and everything worked fine. Then I wanted to split the message field using Elasticsearch's ingest node, so I created three pipelines to handle the original Docker container messages. But after I created them, no new logs are being inserted into Elasticsearch any more.

So my question is: what do I need to configure to make the logging setup work properly again?

Here is my filebeat.yml:

    filebeat.config:
        inputs:
          # Mounted `filebeat-inputs` configmap:
          path: ${path.config}/inputs.d/*.yml
          # Reload inputs configs as they change:
          reload.enabled: false
        modules:
          path: ${path.config}/modules.d/*.yml
          # Reload module configs as they change:
          reload.enabled: false

    # To enable hints based autodiscover, remove `filebeat.config.inputs` configuration and uncomment this:
    #filebeat.autodiscover:
    #  providers:
    #    - type: kubernetes
    #      hints.enabled: true
    processors:
      - add_cloud_metadata:

    cloud.id: ${ELASTIC_CLOUD_ID}
    cloud.auth: ${ELASTIC_CLOUD_AUTH}

    output:
      elasticsearch:
        hosts: ['${ELASTICSEARCH_HOST:elasticsearch}:${ELASTICSEARCH_PORT:9200}']
        # username: ${ELASTICSEARCH_USERNAME}
        # password: ${ELASTICSEARCH_PASSWORD}
        pipelines:          
          - pipeline: "nginx"
            when.contains:
              kubernetes.container.name: "nginx-"
          - pipeline: "java"
            when.contains:
              kubernetes.container.name: "java-"              
          - pipeline: "default"  
            when.contains:
              kubernetes.container.name: ""         
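As a side note, the pipelines referenced in the `pipelines` section above can be fetched with the standard get-pipeline API, to confirm they were actually created on the cluster (it accepts a comma-separated list of pipeline IDs):

    GET /_ingest/pipeline/java,nginx,default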

Here are the pipelines I declared:


    PUT /_ingest/pipeline/java
    {
      "description": "[0]java [1]nginx [last] catch-all rule",
      "processors": [
        {
          "grok": {
            "field": "message",
            "patterns": [
              "\\[%{LOGLEVEL:level}\\s+?\\]\\[(?<date>\\d{4}-\\d{2}-\\d{2}\\s\\d{2}:\\d{2}:\\d{2},\\d{3})\\]\\[(?<thread>[A-Za-z0-9/-]+?)\\]\\[%{JAVACLASS:class}\\]\\[(?<msg>[\\s\\S]*?)\\]\\[(?<stack>.*?)\\]"
            ]
          }
        },
        {
          "remove": {
            "field": "message"
          }
        }
      ]
    }

    PUT /_ingest/pipeline/nginx
    {
      "description": "[0]java [1]nginx [last] catch-all rule",
      "processors": [
        {
          "grok": {
            "field": "message",
            "patterns": [
              "%{IP:client} - - \\[(?<date>.*?)\\] \"(?<method>[A-Za-z]+?) (?<url>.*?)\" %{NUMBER:statuscode} %{NUMBER:duration} \"(?<refer>.*?)\" \"(?<user-agent>.*?)\""
            ]
          }
        },
        {
          "remove": {
            "field": "message"
          }
        }
      ]
    }

    PUT /_ingest/pipeline/default
    {
      "description": "[0]java [1]nginx [last] catch-all rule",
      "processors": []
    }
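For reference, a pipeline can be exercised against a sample document with the simulate API before real traffic is sent through it. The log line below is a made-up example in the format the java grok pattern expects (the class name `com.example.Demo` is just a placeholder):

    POST /_ingest/pipeline/java/_simulate
    {
      "docs": [
        {
          "_source": {
            "message": "[INFO ][2019-01-01 12:00:00,123][main][com.example.Demo][started][]"
          }
        }
      ]
    }

If the pattern does not match, the simulate response reports a grok parse failure, which would also cause real documents sent through the pipeline to be rejected.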

Thanks for any help :slight_smile:
