Grok filter pattern for nginx

Hello Team,

I am using ELK 6.4.0 and Filebeat 6.4.0. Currently I am sending only my application logs to Elasticsearch using Filebeat and parsing them via Logstash. Now we also want to send the nginx logs (error and access) to Elasticsearch, but I am not sure how to add a grok pattern for nginx to my current filter.

Please find my current Logstash configuration:

input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate_authorities => ["/etc/pki/tls/ca.crt"]
    ssl_certificate => "/etc/pki/tls/server.crt"
    ssl_key => "/etc/pki/tls/server.key"
    ssl_verify_mode => "peer"
    tls_min_version => "1.2"
  }
}
filter {
  grok {
    match => {
      "message" => [
        "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}",
        "\I\,\s\[(?<date-time>[\d\-\w\:\.]+)\s\#(?<pid>\d+)\]\s+(?<loglevel>\w+)\s\-+\s\:\s\[(?<request-id>[\d\w\-]+)\]\s(?<method>[\w\s]+)\s\"(?<path>[\w\/\.]+)\"\s(?<mlp-message>.*)",
        "\I\,\s\[(?<date-time>[\d\-\w\:\.]+)\s\#(?<pid>[\d]+)\]\s\s(?<loglevel>[\w]+)\s\--\s\:\s\[(?<request-id>[\d\-\w]+)\]\s(?:[cC]urrent\s)?[dD]evice[\s:]+(?<device-id>[\w\s\:]+)",
        "\I\,\s\[(?<date-time>[\d\-\w\:\.]+)\s\#(?<pid>\d+)\]\s+(?<loglevel>\w+)\s\-+\s\:\s\[(?<request-id>[\d\w\-]+)\]\s(?<mlp-message>.*)",
        "\w\,\s\[(?<date-time>[\w\-\:\.]+)\s\#(?<pid>\d+)\]\s+(?<loglevel>\w+)\s(?<mlp-message>.*)"
      ]
    }
    add_field => {
      "received_at" => "%{@timestamp}"
      "received_from" => "%{host}"
    }
  }
}
output {
  elasticsearch {
    hosts => ["xyz:9200"]
    sniffing => true
    manage_template => false
#    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

Can you please help me add a grok pattern for nginx?

Note: the current filter has the default syslog pattern and a pattern for my application.

Thanks in advance.

In your Filebeat configuration, set fields that indicate the type of log you have. Then add conditionals to your Logstash configuration to choose between different filters.

https://www.elastic.co/guide/en/logstash/current/event-dependent-configuration.html#conditionals
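For example, a minimal sketch (the field name `log_type` and its values are illustrative, not required names):

```yaml
# Filebeat: tag each input with a custom field
filebeat.inputs:
- type: log
  paths:
    - /var/apps/mobilock/shared/log/production.log
  fields:
    log_type: application
- type: log
  paths:
    - /var/log/nginx/access.log
  fields:
    log_type: nginx_access
```

```
# Logstash: branch on that field
filter {
  if [fields][log_type] == "nginx_access" {
    # nginx-specific grok here
  } else {
    # your existing syslog/application grok here
  }
}
```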

Hello Magnus,

Thank you for your response.

Can you please elaborate a little bit more?

Currently I have a log-type input set up in Filebeat. Please refer to the config snippet below:

#=========================== Filebeat inputs =============================

filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.

- type: log

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
#    - /var/log/*.log
    - /var/apps/mobilock/shared/log/production.log

Can you please give me a small example? I am running out of ideas here, because I have never used a conditional-based filter.

Thank you.

Can you please elaborate a little bit more?

Here's an example that sets a field named mycustomvar: elasticsearch - Generating filebeat custom fields - Stack Overflow

Can you please give me a small example? I am running out of ideas here, because I have never used a conditional-based filter.

The documentation I linked to earlier contains several examples. For example,

if [action] == "login" {
  mutate { remove_field => "secret" }
}

shows how to run a mutate filter only if the action field has a particular value.
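Applied to your case, a sketch might look like this (assuming you set a custom field such as `log_type` in Filebeat; the field name is illustrative). nginx's default "combined" access log format matches the stock COMBINEDAPACHELOG pattern:

```
filter {
  if [fields][log_type] == "nginx_access" {
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
  }
}
```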

Hello Magnus,

Thank you for providing useful info.

I have one last question: can we use the Filebeat nginx module to send the logs to Elasticsearch?

Thanks.

Can we use the Filebeat nginx module to send the logs to Elasticsearch?

Yes, most likely.
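If you go that route, enabling the module and loading its index template and dashboards looks roughly like this (a sketch; exact paths depend on how Filebeat is installed):

```shell
filebeat modules enable nginx
filebeat setup              # loads the index template and Kibana dashboards
systemctl restart filebeat
```

Note that the module parses the logs with an Elasticsearch ingest pipeline, so those events typically go straight to Elasticsearch rather than through your Logstash grok filter.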


Hello Magnus,

Thank you for your support.

I have enabled the Filebeat nginx module and have started getting the nginx logs on the Kibana dashboard.

But I am facing another issue: I am getting the nginx logs in Kibana, but the Filebeat nginx dashboard is not showing any data. Please refer to the screenshots below.

(screenshots: Selection_024, Selection_025)

I am also using a Filebeat prospector for our application. Does that have any impact?

I suggest you ask questions about Filebeat in the Filebeat category.

Sure. Thanks :slight_smile:

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.