Grok filter not working with Filebeat

Hi experts.
I have already set up the ELK stack and it is working normally, but I am having a tough time setting up log shipping with Filebeat and adding custom fields (via a grok filter) in Logstash.

My sample log lines:

[3/2/16 2:27:08:529 EST] 0000e9b7 SystemOut     O INFO  - XYZ100354GG32.42994.01.MKULKHS.00232 - 2016/03/02-07:27:08,529 UTC - ccsas2270.abc3.mywo.com - User authenticated: MKULKHS
[3/4/16 0:23:49:652 EST] 0000003d SystemOut     O INFO  - None - 2016/03/04-05:23:49,652 UTC - ccsas2270.abc3.mywo.com - The connection to the pool with ID ahdlof1_PoolD has been tested successfully. Primary Address Host is 19.116.204.156 and the Primary Address Port is 36520

My Filebeat config:

filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - C:\Users\rty\Desktop\INvestigationlogs\FGA\WAS_LOGS\*.log
  document_type: JVM_logs
  include_lines: ['\W*((?i)INFO(?-i))\W*']
  multiline.pattern: '^\W[0-9]{1,2}\W[0-9]{1,2}\W[0-9]{2,4}'
  multiline.negate: true
  multiline.match: after
  multiline.timeout: 1m
  backoff: 5m

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false
setup.template.settings:
  index.number_of_shards: 3
tags: ["service-X", "web-tier"]
fields:
 env: Test_staging
output.logstash:
  hosts: ["192.156.3.10:5043"]

My Logstash conf:

input {
  beats {
    #host => "192.156.3.10"
    port => "5043"
  }
}

filter {
  if [type] == "JVM_logs" and "multiline" in [tags] {
    grok {
      patterns_dir => ["C:/ELK6.2/logstash-6.2.2/vendor/bundle/jruby/2.3.0/gems/logstash-patterns-core-4.1.2/patterns/grok-patterns"]
      match => [ "message", "%{SYSLOG5424SD:DATETIME}" ]
    }
    date {
      match => ["timestamp", "dd/mm/yyyy:HH:mm:ss:SSS"]
    }
  }
}

output {
  stdout { codec => rubydebug }
  if "_grokparsefailure" not in [tags] {
    elasticsearch {
      hosts => ["192.156.3.10:9200"]
      manage_template => false
    }
  }
}

With the above configs I can send multiline logs to Elasticsearch and visualise them in Kibana, but adding a grokked field like DATETIME, getting it into Elasticsearch, and visualising it in Kibana is where I am stuck. I am sure I am missing some key trick; if anyone can guide me, it would really help.

Thanks

Please post here if you are getting any error. As per my understanding, you are not able to get the field named "DATETIME" for which you are using grok, right?

As I can see, it is document_type that is set to JVM_logs, not type.

Probably because of this if condition, where type is not JVM_logs, your grok filter was not executed. If that's the case, try replacing type with document_type in the if condition.
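
i.e. something along these lines (a sketch built from your filter above):

filter {
  if [document_type] == "JVM_logs" and "multiline" in [tags] {
    grok {
      match => [ "message", "%{SYSLOG5424SD:DATETIME}" ]
    }
  }
}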

I am not getting any error, but it seems to me like the filter is not getting applied at all. I had already tried making the change you suggested before I posted this query:

if [document_type] == "JVM_logs" and "multiline" in [tags] {

Still didn't see any added fields in Kibana.

Can you help further in any way?

Comment out your elasticsearch output and show us what the stdout { codec => rubydebug } output produces.
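
That is, while debugging, run with something like this (your own output block with the elasticsearch part disabled):

output {
  stdout { codec => rubydebug }
  # elasticsearch {
  #   hosts => ["192.156.3.10:9200"]
  #   manage_template => false
  # }
}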

Hi Magnus,

Thanks for coming over to help out! Here is the output from my Filebeat log.
Hope this is what you asked me to look for?

  "@timestamp": "2018-03-07T20:59:43.681Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "doc",
    "version": "6.2.2"
  },
  "beat": {
    "name": "INDENBG01NB",
    "hostname": "INDENBG01NB",
    "version": "6.2.2"
  },
  "source": "C:\\Users\\rty\\Desktop\\INvestigationlogs\\FNA\\WAS_LOGS\\team_User_stdout.log",
  "offset": 184,
  "message": "[3/2/16 2:27:08:529 EST] 0000e9b7 SystemOut     O INFO  - WGC1036PGG32.42994.01.vsivaub.00232 - 2016/03/02-07:27:08,529 UTC - ccsas2270.abc3.mywo.com - User authenticated: vsivaub",
  "tags": [
    "service-X",
    "web-tier"
  ],
  "prospector": {
    "type": "log"
  },
  "fields": {
    "env": "Test_staging"
  }
}
2018-03-07T21:59:43.681+0100	DEBUG	[publish]	pipeline/processor.go:275	Publish event: {

document_type is nowhere appearing in your output. Try adding a field "doc_type: JVM_logs", set "fields_under_root: true", and then in Logstash try if [doc_type] =~ "JVM_logs".

filebeat:
  prospectors:
    -
      paths:
        - xxxxxxxx
      fields:
        env: Test_staging
        doc_type: JVM_logs
      fields_under_root: true
      multiline:
        pattern: xxxxxxxxx
        negate: true
        match: after

Try this and post any errors.
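
On the Logstash side, the conditional plus a date filter that actually fits your sample lines could look roughly like this. An untested sketch: the grok pattern is written against the sample lines at the top of this topic, and America/New_York is only my assumption for the EST zone they show.

filter {
  if [doc_type] =~ "JVM_logs" {
    grok {
      # capture the leading bracketed timestamp, e.g. "[3/2/16 2:27:08:529 EST]"
      match => { "message" => "^\[(?<DATETIME>[0-9/]+ [0-9:]+) (?<TZ>[A-Z]+)\]" }
    }
    date {
      # Joda pattern for "3/2/16 2:27:08:529"; the parsed value goes to @timestamp by default
      match => ["DATETIME", "M/d/yy H:mm:ss:SSS"]
      timezone => "America/New_York"
    }
  }
}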

Clearly you don't have a top-level type field, so your conditional will never be true. You either need to adjust the Logstash configuration to match reality, or change reality by adjusting the Filebeat configuration.
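
For example, the rubydebug output above shows the events do carry fields.env and your tags, so a conditional keyed on those would actually fire. A sketch reusing your original grok:

filter {
  # condition on fields the events actually contain (per the rubydebug output);
  # with fields_under_root: true this would be [env] instead of [fields][env]
  if [fields][env] == "Test_staging" and "web-tier" in [tags] {
    grok {
      match => [ "message", "%{SYSLOG5424SD:DATETIME}" ]
    }
  }
}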
