Can custom-created logs from Logstash be shipped to ES using Filebeat?

I'm not able to grok a custom log from Logstash to ES using Filebeat. Every time I create a grok pattern, I only see the fields that Filebeat adds by default; the fields I grok out of the message do not appear.

Can you please provide more information about your setup and the logs you are trying to grok?

What error are you seeing?

Please share the config files you are using, sample log messages, and software versions (Filebeat, Logstash, and Elasticsearch).

The message you are trying to grok is JSON, so it would be easiest to use a json filter.

filter {
  json {
    source => "message"
  }
}

If you are reading the log file with Filebeat, you could do the JSON decoding there instead of using the Logstash filter. See https://www.elastic.co/guide/en/beats/filebeat/current/configuration-filebeat-options.html#config-json

filebeat.prospectors:
- paths: [mylog.json]
  json.keys_under_root: true
  json.add_error_key: true
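
If Filebeat handles the decoding, it can also ship the events straight to Elasticsearch instead of going through Logstash. A minimal filebeat.yml along those lines might look like the sketch below; the Elasticsearch host is an assumption for illustration.

filebeat.prospectors:
- paths: [mylog.json]
  json.keys_under_root: true   # put the decoded JSON fields at the top level of the event
  json.add_error_key: true     # add an error field when a line cannot be decoded

output.elasticsearch:
  hosts: ["localhost:9200"]    # assumed host, point this at your cluster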

Thank you so much for the quick reply! Just a question: this means that I should use the json filter instead of the grok filter to parse the data, right?

Correct, replace the grok filter with the json filter.
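
For reference, a minimal pipeline with the json filter sitting where the grok filter used to be might look like the sketch below. The beats port and the Elasticsearch host are assumptions, so adjust them to your environment.

input {
  beats {
    port => 5044                    # assumed port that Filebeat ships to
  }
}

filter {
  json {
    source => "message"             # parse the JSON string in the message field into event fields
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]     # assumed Elasticsearch host
  }
}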
