Unable to use filters with grok expression

Hello Team,

I need to use grok expressions in Logstash (ELK) to parse the message data.

I have the ELK stack configured with Docker Compose, and Filebeat running in a Docker container on the client server.

I have the following filter configuration using grok expressions:

input {
  beats {
    port => 5044
  }
}

filter {
  if [path] == "/var/log/apache2/access.log" {
    grok {
      match => { "message" => "%{IP:client_ip} \| %{DATA:syslog_timestamp} \| %{WORD:method} \| %{DATA:unknow} \| %{DATA:xyz} \| %{DATA:code} \| %{DATA:unno} \| %{DATA:byte} \| %{DATA:port} \| %{GREEDYDATA:syslog_message}" }
    }
  }

  if [path] == "/var/log/apache2/error.log" {
    grok {
      match => { "message" => "%{DATA:timestamp} \| %{DATA:Loglevel} \| %{DATA:requet} \| %{DATA:url} \| %{GREEDYDATA:syslog_message}" }
    }
  }

  if [path] == "/var/log/apache2/request.log" {
    grok {
      match => { "message" => "%{DATA:timestamp} \+0000%{DATA:unknow} \[0\]%{DATA:method} <-%{DATA:code} %{DATA:httpcode} %{GREEDYDATA:message} %{GREEDYDATA:responsetime}" }
    }
  }
}

output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
  }
}
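On the tagging point: grok does not add a tag on success by default, which may be why nothing shows up in Discover. A minimal sketch of making matches visible with the standard `add_tag` and `tag_on_failure` grok options (the tag names here are invented for illustration):

```
filter {
  if [path] == "/var/log/apache2/access.log" {
    grok {
      match => { "message" => "%{IP:client_ip} \| %{GREEDYDATA:syslog_message}" }
      add_tag        => ["apache_access_parsed"]      # invented tag name, added only on a successful match
      tag_on_failure => ["_grokparsefailure_access"]  # replaces the default _grokparsefailure tag
    }
  }
}
```

Filtering Discover on either tag then shows whether the pattern is matching at all.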

My filebeat.yml is as follows:

filebeat.config:
  modules:
    path: ${path.config}/modules.d/*.yml
    reload.enabled: false

processors:
  - add_cloud_metadata: ~

filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - "/var/log/apache2/*.log"
    exclude_files: ['.gz$']
    json.message_key: log
  - type: log
    enabled: true
    paths:
      - "/var/log/aem/*.log"
    exclude_files: ['.gz$']
    json.message_key: log
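A common alternative to comparing file paths in Logstash is to label each Filebeat input and branch on that label instead; a sketch using the standard `fields` and `fields_under_root` options (the `logtype` field name is made up for illustration):

```
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - "/var/log/apache2/access.log"
    fields:
      logtype: apache_access      # invented label; test `if [logtype] == "apache_access"` in Logstash
    fields_under_root: true       # put logtype at the event root instead of under [fields]
```

This avoids depending on the exact field under which Filebeat ships the path.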

When I bring up the ELK stack and start Filebeat, I do not see the tags in the Discover section of the Kibana dashboard, and I get the following messages in the Logstash logs:

[2022-01-18T07:43:53,689][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (7.16.2) {:es_version=>7}
[2022-01-18T07:43:53,690][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2022-01-18T07:43:53,773][WARN ][logstash.outputs.elasticsearch][main] Configuration is data stream compliant but due backwards compatibility Logstash 7.x will not assume writing to a data-stream, default behavior will change on Logstash 8.0 (set `data_stream => true/false` to disable this warning)
[2022-01-18T07:43:53,773][WARN ][logstash.outputs.elasticsearch][main] Configuration is data stream compliant but due backwards compatibility Logstash 7.x will not assume writing to a data-stream, default behavior will change on Logstash 8.0 (set `data_stream => true/false` to disable this warning)
[2022-01-18T07:43:53,783][WARN ][deprecation.logstash.filters.grok][main] Relying on default value of `pipeline.ecs_compatibility`, which may change in a future major release of Logstash. To avoid unexpected changes when upgrading Logstash, please explicitly declare your desired ECS Compatibility mode.
[2022-01-18T07:43:53,823][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
[2022-01-18T07:43:53,906][WARN ][deprecation.logstash.filters.grok][main] Relying on default value of `pipeline.ecs_compatibility`, which may change in a future major release of Logstash. To avoid unexpected changes when upgrading Logstash, please explicitly declare your desired ECS Compatibility mode.
[2022-01-18T07:43:53,927][WARN ][deprecation.logstash.filters.grok][main] Relying on default value of `pipeline.ecs_compatibility`, which may change in a future major release of Logstash. To avoid unexpected changes when upgrading Logstash, please explicitly declare your desired ECS Compatibility mode.
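The repeated deprecation warnings above can be addressed by declaring ECS compatibility explicitly, as the log message asks. On Logstash 7.x this can be done per grok filter via its `ecs_compatibility` option (shown with `disabled` here; `v1` is the other mode, and the pattern below is only a placeholder):

```
filter {
  grok {
    ecs_compatibility => disabled
    match => { "message" => "%{GREEDYDATA:raw}" }   # placeholder pattern for illustration
  }
}
```

It can also be set pipeline-wide with `pipeline.ecs_compatibility` in logstash.yml rather than on each plugin.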

Here is the logstash.conf that I am using with the Logstash pipeline:

input {
  beats {
    port => 5044
    type => "log"
  }
}

filter {

if ([log][file][path] == "/var/log/apache2/request.log") {
  grok {
    match => { "message" => "%{IP:client_ip} \| %{DATA:syslog_timestamp} \| %{WORD:method} \| %{DATA:unknow} \| %{DATA:xyz} \| %{DATA:code} \| %{DATA:unno} \| %{DATA:byte} \| %{DATA:port} \| %{GREEDYDATA:syslog_message}" }
  }
}

if ([log][file][path] == "/var/log/apache2/access.log") {
  grok {
    match => { "message" => "%{DATA:timestamp} \| %{DATA:Loglevel} \| %{DATA:requet} \| %{DATA:url} \| %{GREEDYDATA:syslog_message}" }
  }
}

if ([log][file][path] == "/var/log/apache2/error.log") {
  grok {
    match => { "message" => "%{DATA:timestamp} \+0000%{DATA:unknow} \[0]%{DATA:method} <-%{DATA:code} %{DATA:httpcode} %{GREEDYDATA:message} %{GREEDYDATA:responsetime}" }
  }
}
}

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    user => "elastic"
    password => "*************"
  }
}
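As the repeated warning in the logs suggests, setting `data_stream` explicitly on the elasticsearch output silences it; a sketch of the same output block (the password is shown as an environment-variable placeholder, not the real value):

```
output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    user => "elastic"
    password => "${ES_PWD}"    # placeholder; substitute the real secret
    data_stream => false       # keep the classic index behavior on Logstash 7.x
  }
}
```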

Please provide your suggestions/input on how to fix this issue.

Thank you in advance!
