How to parse additional Kubernetes pods with JSON-format logs?

I have an ELK stack with Filebeat as the agent. It works great for collecting logs from existing pods.
Here are the Helm values for Filebeat:

filebeatConfig:
  filebeat.yml: |
    logging.level: error
    filebeat.autodiscover:
      providers:
        - type: kubernetes
          node: ${NODE_NAME}
          hints.enabled: true
          hints.default_config:
            type: container
            paths:
              - /var/log/containers/*${data.kubernetes.container.id}.log
            
    output.logstash:
      hosts: 'logstash-logstash.logging.svc.cluster.local:5044'

    setup.template:
      name: "k8s"
      pattern: "k8s-*"
      enabled: false

    setup.ilm.enabled: false

Now I have a new Kubernetes deployment of Falco, whose logs are in JSON format, and I want to ship them to the ELK stack, but I am not able to with this config. I can't figure out the right way to do it.

I tried the following, which didn't work:

            type: container
            paths:
              - /var/log/containers/*${data.kubernetes.container.id}.log
            json.keys_under_root: true
            json.add_error_key: true
            json.message_key: message
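
Since hints.enabled is true in my config, I'm also wondering whether per-pod annotations are the better route. A minimal sketch of what I have in mind, going by my reading of the hints-based autodiscover docs (the falco names match my deployment; json.message_key: output is my assumption, since Falco seems to put the rendered message in an output field):

apiVersion: apps/v1
kind: Deployment
metadata:
  name: falco
spec:
  template:
    metadata:
      annotations:
        # These hints should map to the json.* input options for this pod only.
        co.elastic.logs/json.keys_under_root: "true"
        co.elastic.logs/json.add_error_key: "true"
        # Assumption: Falco's rendered message lives in the "output" field.
        co.elastic.logs/json.message_key: "output"

That way the JSON parsing would only apply to the Falco pod instead of every container matched by hints.default_config.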

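Alternatively, would a conditional template in the autodiscover section work? A rough sketch, assuming my Falco pods carry an app: falco label (as I understand it, a matching template takes precedence over hints.default_config for those pods):

filebeat.autodiscover:
  providers:
    - type: kubernetes
      node: ${NODE_NAME}
      hints.enabled: true
      hints.default_config:
        type: container
        paths:
          - /var/log/containers/*${data.kubernetes.container.id}.log
      templates:
        # Assumption: the Falco pods are labeled app: falco.
        - condition:
            equals:
              kubernetes.labels.app: falco
          config:
            - type: container
              paths:
                - /var/log/containers/*${data.kubernetes.container.id}.log
              # Decode each log line as JSON instead of plain text.
              json.keys_under_root: true
              json.add_error_key: true
              json.message_key: output

Any pointers on which of these is the intended approach would be appreciated.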