Grok not parsing

logstash.conf

# Sample Logstash configuration for creating a simple
# Beats -> Logstash -> Elasticsearch pipeline.

input {
  beats {
    port => 5044
    type = "test"
  }
}
filter {
  if [type] == "log" {
    grok {
      match => {
        "message" => "%{DATE_US:date}\s*%{TIME:time}\s*\\t%{DATA:proc}\s*\(%{DATA:procid}\)\s*\\t%{DATA:tid}\\t%{DATA:area}\s*\\t%{DATA:category}\s*\\t%{DATA:eventid}\s*\\t%{DATA:level}\s*\\t%{DATA:message}\\t%{GREEDYDATA:corellation}$"
      }
      remove_field => ["message"]
    }
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    #user => "elastic"
    #password => "changeme"
  }
}

filebeat.yml

filebeat.inputs:
- type: log
  id: my-filestream-id
  enabled: true
  paths:
    - C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\LOGS\SHRP-10*.log

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

setup.template.settings:
  index.number_of_shards: 1

output.logstash:
  hosts: ["172.18.10.236:5044"]

processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~

example logs

09/15/2023 06:58:32.29 \tw3wp.exe (0x1E40)                       \t0x53CC\tSharePoint Foundation         \tMonitoring                    \tb4ly\tHigh    \tLeaving Monitored Scope: (PostAuthenticateRequestHandler) Время выполнения=0.996825596874434; CPU Milliseconds=0; Число запросов SQL=2; Parent=Request (GET:https://172.18.2.39/)\ta592daa0-7bf6-b069-a8b6-2cbff679f9ad
09/14/2023 13:09:35.61 \tDistributedCacheService.exe (0x2130)    \t0x1A3C\tSharePoint Foundation         \tMonitoring                    \tb4ly\tMedium  \tLeaving Monitored Scope: (Updating child collection file cache.) Время выполнения=0.91805628854773; CPU Milliseconds=0; Число запросов SQL=1; Parent=Invalidating object collection file cache according to change type\t

Can you tell me where my error is? Everything works fine in the Grok Debugger, but in Logstash it doesn't even remove the message field. Thanks

Welcome to the community!

Quickest solution: change the conditional after filter to:

if [input][type] == "log" {
grok {
    match => { "message" => "%{DATE_US:date}\s*%{TIME:time}\s*\\t%{DATA:proc}\s*\(%{DATA:procid}\)\s*\\t%{DATA:tid}\\t%{DATA:area}\s*\\t%{DATA:category}\s*\\t%{DATA:eventid}\s*\\t%{DATA:level}\s*\\t%{DATA:message}\\t%{GREEDYDATA:corellation}" }
	overwrite => [ "message" ]
  }
}

If you don't use overwrite, the message field ends up as an array holding both the original line and the value parsed by grok.
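For example, without overwrite the event would contain something like this (illustrative, values shortened from your sample logs):

  "message" => [
      [0] "09/15/2023 06:58:32.29 \tw3wp.exe (0x1E40) ... \ta592daa0-7bf6-b069-a8b6-2cbff679f9ad",
      [1] "Leaving Monitored Scope: (PostAuthenticateRequestHandler) Время выполнения=0.996825596874434; CPU Milliseconds=0; Число запросов SQL=2; Parent=Request (GET:https://172.18.2.39/)"
  ]

With overwrite => [ "message" ], only the parsed value in [1] is kept.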

What you can change:
Filebeat:

  • Use type: filestream, the log input type is deprecated, see the snippet below
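
A minimal filestream input, reusing the id and path from your filebeat.yml, could look like this:

filebeat.inputs:
- type: filestream
  id: my-filestream-id
  enabled: true
  paths:
    - C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\LOGS\SHRP-10*.log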

Logstash:

  • You don't need type on the beats input
  • Improve the grok: capture date+time in a single field, and the $ at the end is not needed
  • Convert the timestamp with the date filter
  • Additionally, parse the message with the csv plugin

input {
  beats {
    port => 5044
    enrich => "none"
  }  
}

filter {

  if [input][type] == "log" {
    grok {
      # Capture date and time together into a temporary metadata field
      match => { "message" => "%{DATESTAMP:[@metadata][time]}\s*\\t%{DATA:proc}\s*\(%{DATA:procid}\)\s*\\t%{DATA:tid}\\t%{DATA:area}\s*\\t%{DATA:category}\s*\\t%{DATA:eventid}\s*\\t%{DATA:level}\s*\\t%{DATA:message}\\t%{GREEDYDATA:corellation}" }
      overwrite => [ "message" ]
    }
  }

  # Set @timestamp from the parsed log time, e.g. 09/15/2023 06:58:32.29
  date {
    match => ["[@metadata][time]", "MM/dd/yyyy HH:mm:ss.SS"]
  }

  # Split the message on "; " into separate columns
  csv {
    source => "message"
    separator => "; "
    columns => ["msg1","msg2","msg3","msg4"]
  }

}

output {
     file { path => "C:/path/sharep_%{+YYYY-MM-dd}.txt" }

    stdout {
        codec => rubydebug{}
    }
}
