Nginx logs with Filebeat and parsing with a pipeline

Hello, I have Filebeat installed on a server that has Docker containers running inside it. One of those containers is an nginx container, and I need to parse its logs so I can view them properly in Elastic.

I have configured Filebeat to use a pipeline with this:

 pipelines:
   - pipeline: "filebeat-8.6.1-nginx-access-pipeline"
     when.contains:
        container.labels.com_docker_swarm_service_name: "nginx"
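
For context, a `when.contains`-routed pipeline like this lives under the Elasticsearch output in `filebeat.yml`. A minimal sketch, assuming the default `output.elasticsearch` output (the host value here is a placeholder, not from the original post):

```yaml
output.elasticsearch:
  # Placeholder endpoint; replace with your Elasticsearch host
  hosts: ["https://elasticsearch:9200"]
  pipelines:
    # Route nginx container events to the ingest pipeline in Elasticsearch
    - pipeline: "filebeat-8.6.1-nginx-access-pipeline"
      when.contains:
        container.labels.com_docker_swarm_service_name: "nginx"
```

Events that match no condition in the `pipelines` list are indexed without any pipeline, so only the nginx container's logs go through the grok processing.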

And I have created in the pipeline, the grok pattern:

"%{IP:client_ip} - - \\[%{HTTPDATE:timestamp}\\] \"(?:%{WORD:method} %{URIPATHPARAM:request}(?:\\?%{DATA:query_params})? %{DATA:http_version})\" %{NUMBER:status_code} %{NUMBER:response_size} \"(?:%{DATA:referrer})\" \"(?:%{DATA:user_agent})\" \"%{IP:forwarded_ip}\" \\[%{WORD:cache_status}\\] %{NUMBER:response_time:float}"
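
For reference, a grok pattern like the one above would sit inside an ingest pipeline definition roughly like this (a sketch; the `description` text is my own, and the pipeline name is taken from the Filebeat config above):

```
PUT _ingest/pipeline/filebeat-8.6.1-nginx-access-pipeline
{
  "description": "Parse nginx access logs",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "%{IP:client_ip} - - \\[%{HTTPDATE:timestamp}\\] \"(?:%{WORD:method} %{URIPATHPARAM:request}(?:\\?%{DATA:query_params})? %{DATA:http_version})\" %{NUMBER:status_code} %{NUMBER:response_size} \"(?:%{DATA:referrer})\" \"(?:%{DATA:user_agent})\" \"%{IP:forwarded_ip}\" \\[%{WORD:cache_status}\\] %{NUMBER:response_time:float}"
        ]
      }
    }
  ]
}
```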

If I try the pattern in the Grok Debugger in Kibana's Dev Tools, I get a success with this example trace:

10.0.1.36 - - [12/Nov/2024:14:58:14 +0000] "GET /actuator/prometheus HTTP/1.1" 403 9 "-" "Elastic-Metricbeat/8.6.1 (linux; amd64; 14f2f8d585f8c380945feee789771bd782cd6b2d; 2023-01-24 13:30:23 +0000 UTC)" "172.18.0.1" [-] 0.083

But in Discover, when I look at some traces, I see there are grok pattern errors.

Why do the traces fail with errors if the Grok Debugger says it's OK?

Please post the complete configuration of the failing pipeline, including your grok pattern.

Testing with the _simulate API in the Kibana Dev Tools is typically the best way to debug this.
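
For example, in Dev Tools you could run the stored pipeline against the exact trace from the question (a sketch; the pipeline name matches the Filebeat config above):

```
POST _ingest/pipeline/filebeat-8.6.1-nginx-access-pipeline/_simulate
{
  "docs": [
    {
      "_source": {
        "message": "10.0.1.36 - - [12/Nov/2024:14:58:14 +0000] \"GET /actuator/prometheus HTTP/1.1\" 403 9 \"-\" \"Elastic-Metricbeat/8.6.1 (linux; amd64; 14f2f8d585f8c380945feee789771bd782cd6b2d; 2023-01-24 13:30:23 +0000 UTC)\" \"172.18.0.1\" [-] 0.083"
      }
    }
  ]
}
```

Unlike the Grok Debugger, this exercises the pipeline exactly as stored in Elasticsearch, so any failure shows up the same way it does in Discover.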

Curious why you're not trying the nginx integration?
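
For Docker containers, the usual route to the module/integration is hints-based autodiscover. A minimal sketch, assuming you can add a `co.elastic.logs/module: nginx` label to the nginx service:

```yaml
filebeat.autodiscover:
  providers:
    - type: docker
      # Read co.elastic.logs/* container labels to decide how to parse each container
      hints.enabled: true
```

With that in place, Filebeat applies the nginx module's own ingest pipeline to the labeled container's logs, instead of a hand-written grok pattern.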

Thanks, I was able to fix the error: I changed the nginx log format and now it works. Thanks!

Hi, that's because nginx runs inside a Docker container and the logs are being harvested by Filebeat.
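
A minimal sketch of how Filebeat typically harvests Docker container logs from the host (the path is Docker's default `json-file` location; adjust if your daemon logs elsewhere):

```yaml
filebeat.inputs:
  - type: container
    # Docker's default per-container log location on the host
    paths:
      - /var/lib/docker/containers/*/*.log
```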