Can custom created logs from Logstash be shipped to ES using Filebeat?


(Nandha Krishna) #1

I'm not able to grok a custom log from Logstash to ES using Filebeat. Every time I create a grok pattern, I only see the fields that are default in Filebeat. The fields that I grok out of the message are not seen.


(Andrew Kroh) #2

Can you please provide more information about your setup and the logs you are trying to grok?

What error are you seeing?

Please share the config files you are using, sample log messages, and software versions (Filebeat, Logstash, and Elasticsearch).


(Andrew Kroh) #4

The message you are trying to grok is JSON, so it would be easiest to use a json filter:

filter {
  json {
    source => "message"
  }
}
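Conceptually, the json filter parses the JSON text in the message field and merges the resulting keys into the event as top-level fields. A rough sketch of that behavior in Python (illustrative only, not Logstash internals; the event dict and field names are hypothetical):

```python
import json

def apply_json_filter(event):
    """Mimic a Logstash json filter with source => "message":
    parse the message field as JSON and merge its keys
    into the event as top-level fields."""
    parsed = json.loads(event["message"])
    event.update(parsed)
    return event

# Hypothetical log event as it arrives from Filebeat
event = {"message": '{"level": "info", "user": "alice"}'}
apply_json_filter(event)
# event now has "level" and "user" as top-level fields
```

This is why no grok pattern is needed for JSON logs: the structure is already there, it just has to be decoded.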

If you are reading the log file using Filebeat, you could do the JSON decoding there instead of using a Logstash filter. See https://www.elastic.co/guide/en/beats/filebeat/current/configuration-filebeat-options.html#config-json

filebeat.prospectors:
- paths: [mylog.json]
  json.keys_under_root: true
  json.add_error_key: true
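With that config, a log line in mylog.json such as the following (field names are hypothetical):

```
{"level": "info", "user": "alice", "msg": "login ok"}
```

would be indexed with level, user, and msg as top-level fields alongside Filebeat's default fields (because of json.keys_under_root), and json.add_error_key adds an error field to the event when a line fails to parse as JSON.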

(Nandha Krishna) #5

Thank you so much for the quick reply! Just to confirm: this means I should use the json filter instead of grok to parse the data, right?


(Andrew Kroh) #6

Correct, replace the grok filter with the json filter.

