How to process these Nginx logs? Grok error

Hello!

I'm pretty new to ELK, but I have a theoretical and practical background in cyber security. As a hobby project I'm trying to set up a simple SIEM / log collector at home to see how the ELK stack is configured.

Installing ELK on an Ubuntu 20.04 VM (IP: 192.168.3.30) was easy and it is running smoothly. Winlogbeat on my Windows 11 PC is set up and forwarding logs. These are all ingested fine.

My main Linux VM (192.168.3.3) runs a Docker container with Nginx Proxy Manager. It writes Nginx access and error logs to a location on disk, and I want to forward these to Elasticsearch. Forwarding works:

  • I installed Filebeat on the Linux VM (192.168.3.3).
  • I specified the log file location in /etc/filebeat/modules.d/nginx.yml.
  • Logs are forwarded, but in Kibana I see "Provided Grok expressions do not match field value", so Elasticsearch does not index all the right fields.
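For reference, the module config I'm describing looks roughly like this (the paths are examples; they depend on where the Nginx Proxy Manager container writes its logs on the host):

```yaml
# /etc/filebeat/modules.d/nginx.yml — example paths, adjust to your setup
- module: nginx
  access:
    enabled: true
    var.paths: ["/path/to/npm/data/logs/proxy-host-*_access.log"]
  error:
    enabled: true
    var.paths: ["/path/to/npm/data/logs/proxy-host-*_error.log"]
```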

I have no idea where to start to fix this error. Below is a line of the access log, which is my main focus (mydomain.nl substituted for privacy reasons):

[17/Oct/2021:09:29:34 +0000] - 200 200 - GET https cloud.mydomain.nl "/apps/theming/icon?v=2" [Client 89.205.225.142] [Length 6795] [Gzip -] [Sent-to nextcloud_app_1] "Mozilla/5.0 (Linux; Android 12) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/94.0.4606.71 Mobile Safari/537.36" "-"

From what I can see, the access log is formatted slightly differently from a default Nginx access log. I believe that by default the line starts with the client IP, not the timestamp, and I think that's why I'm seeing the Grok error. How can I configure the filter to index my log lines successfully?

That's right: this log format is not the default, and the Nginx module in Filebeat expects logs in the default format to work.

The Nginx module uses an ingest pipeline in Elasticsearch to parse the messages, so you will need to edit the ingest pipeline used by this module so it can parse your format.

You can do this directly in Kibana: go to Stack Management > Ingest Node Pipelines, where you will find the ingest pipeline being used.
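You can also work from Kibana Dev Tools. Something along these lines should let you find the pipeline and test changes against your sample line before saving them (the exact pipeline name includes your Filebeat version, so the one below is only an example):

```
# List the Nginx pipelines installed by Filebeat (names are version-dependent)
GET _ingest/pipeline/filebeat-*-nginx-*

# Dry-run a pipeline against a sample log line (example pipeline name)
POST _ingest/pipeline/filebeat-7.15.0-nginx-access-pipeline/_simulate
{
  "docs": [
    {
      "_source": {
        "message": "[17/Oct/2021:09:29:34 +0000] - 200 200 - GET https cloud.mydomain.nl \"/apps/theming/icon?v=2\" [Client 89.205.225.142] [Length 6795] [Gzip -] [Sent-to nextcloud_app_1] \"Mozilla/5.0\" \"-\""
      }
    }
  ]
}
```

The simulate response shows either the extracted fields or the grok failure, which makes it much easier to iterate on the pattern.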

Use the current grok pattern as a starting point to build yours.
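As a rough sketch only: a grok processor along these lines should match your sample line. The field names are my guesses at sensible ECS-style names, and I'm assuming (from the Nginx Proxy Manager format) that the second number is the response status; verify both against your actual config before relying on it:

```json
{
  "grok": {
    "field": "message",
    "patterns": [
      "\\[%{HTTPDATE:nginx.access.time}\\] %{NOTSPACE} %{NUMBER} %{NUMBER:http.response.status_code} %{NOTSPACE} %{WORD:http.request.method} %{WORD:url.scheme} %{NOTSPACE:url.domain} \"%{DATA:url.original}\" \\[Client %{IP:source.address}\\] \\[Length %{NUMBER:http.response.body.bytes}\\] \\[Gzip %{NOTSPACE}\\] \\[Sent-to %{NOTSPACE:nginx.access.upstream}\\] \"%{DATA:user_agent.original}\" \"%{DATA:http.request.referrer}\""
    ]
  }
}
```

The unnamed %{NOTSPACE} and %{NUMBER} tokens swallow the fields I couldn't identify (the "-" placeholders and the first status-like number); name them once you know what your log format writes there.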

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.