7.7: broken Filebeat pipeline for Nginx access logs

I just upgraded my stack to 7.7.
Now I see that I have two pipelines for Nginx access logs:

  1. filebeat-7.7.0-nginx-access-default
    description: "Pipeline for parsing Nginx access logs. Requires the geoip and user_agent plugins."

  2. filebeat-7.7.0-nginx-ingress_controller-pipeline
    description: "Pipeline for parsing Nginx ingress controller access logs. Requires the geoip and user_agent plugins."

What is that new "ingress_controller-pipeline"?

And now here is what I see in the Kibana "Discover":

event.dataset:   nginx.ingress_controller
message:   noty.propovednik.com 2a01:4f8:161:7181::2 - - [13/May/2020:06:19:54 +0000] "GET /index.php/res/Public/%D0%93/%D0%93%D0%BE%D1%81%D0%BF%D0%BE%D0%B4%D1%8C%20%D0%B4%D0%B0%D0%B9%20%D0%B6%D0%B8%D0%B7%D0%BD%D1%8C%20%D0%B4%D1%83%D1%88%D0%B5%20%D0%BC%D0%BE%D0%B5%D0%B9/Public/S/Public/O/Oh,%20What%20a%20Change.nwc HTTP/1.1" 301 178 "-" "Mozilla/5.0 (compatible; MJ12bot/v1.4.8; http://mj12bot.com/)"
error.message:   Provided Grok expressions do not match field value: [noty.propovednik.com 2a01:4f8:161:7181::2 - - [13/May/2020:06:19:54 +0000] \"GET /index.php/res/Public/%D0%93/%D0%93%D0%BE%D1%81%D0%BF%D0%BE%D0%B4%D1%8C%20%D0%B4%D0%B0%D0%B9%20%D0%B6%D0%B8%D0%B7%D0%BD%D1%8C%20%D0%B4%D1%83%D1%88%D0%B5%20%D0%BC%D0%BE%D0%B5%D0%B9/Public/S/Public/O/Oh,%20What%20a%20Change.nwc HTTP/1.1\" 301 178 \"-\" \"Mozilla/5.0 (compatible; MJ12bot/v1.4.8; http://mj12bot.com/)\"]
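The error becomes easier to see with a toy version of the pattern matching. The message has the virtual host name prepended before the client address, which the standard combined log format does not expect. A minimal sketch in Python, using a simplified regex that only approximates the combined format (this is my own illustration, not Filebeat's actual grok expression):

```python
import re

# Rough approximation of the Nginx "combined" log format
# (an assumption for illustration -- not Filebeat's exact grok).
COMBINED = re.compile(
    r'^(?P<remote_addr>\S+) - (?P<user>\S+) \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d+) (?P<bytes>\d+)'
)

# Shortened version of the line from the question: note the
# virtual-host name before the client address.
line = ('noty.propovednik.com 2a01:4f8:161:7181::2 - - '
        '[13/May/2020:06:19:54 +0000] "GET /index.php HTTP/1.1" 301 178 '
        '"-" "Mozilla/5.0 (compatible; MJ12bot/v1.4.8; http://mj12bot.com/)"')

# The leading hostname makes the match fail, just like the grok does.
print(COMBINED.match(line))                     # -> None

# Strip the hostname and the rest is plain combined format.
print(COMBINED.match(line.split(' ', 1)[1]))    # -> a match object
```

So the line itself is fine; it just has an extra `$host` field in front, and the pipeline that picked it up expects a different layout.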

Hm, maybe it's not broken.

Looks like the ingress controller is a Kubernetes thing:

But I don't have anything to do with Kubernetes; I just have a standalone Ubuntu web server.
Looks like the nginx.ingress_controller fileset got auto-enabled during the upgrade somehow?

Looks like it was caused by this MR:

It can be disabled like this:

- module: nginx
  ingress_controller:
    enabled: false

IMHO, ingress_controller should have been disabled by default, or at least not automatically enabled during an upgrade.

Hi Slavik,

I have some problems with the update; two of them are in that post, but I'm not having any luck with the answers.

I'm very interested in your update. Do you use

add_kubernetes_metadata:

After updating from 7.3 to 7.7, I cannot use this piece:

        - add_kubernetes_metadata:
            in_cluster: true
            host: ${NODE_NAME}
            matchers:
            - logs_path:
                logs_path: "/var/log/containers/"

How did you solve it?

Thank you very much

There was an unexpected regression in 7.7.0; a possible workaround is to disable the default matchers:

        - add_kubernetes_metadata:
            in_cluster: true
            host: ${NODE_NAME}
            default_matchers.enabled: false
            matchers:
            - logs_path:
                logs_path: "/var/log/containers/"

I have opened a PR to revert to pre-7.7.0 behaviour: https://github.com/elastic/beats/pull/18818