How to create a custom fields file in Filebeat to preprocess logs before sending them to Elasticsearch

My Log Format:

Time Hostname Severity Program Message

My log line example:

2018-02-01T21:11:47+05:30 freebsd notice sshd Exiting on signal 15

I want Filebeat to ship these logs to Elasticsearch (without Logstash) so that they are stored in Elasticsearch with the same fields: Time, Hostname, Severity, Program, and Message.


Instead, they get stored in Elasticsearch with fields like offset, prospector.type, beat.name, beat.hostname, etc.


I understand I need to change the fields.yml file, but I don't understand how to do that for my requirement, i.e.

Elastic Search should store data as:

xyz.date xyz.hostname xyz.severity xyz.program xyz.message

This is not possible with Filebeat alone, as it doesn't have parsing capabilities (yet).

However, you can configure an ingest pipeline in Elasticsearch that does this field extraction for you, and have Filebeat send the events through that pipeline.

Take a look at the ingest node documentation to understand how to create an ingest pipeline. Add a Grok Processor to the pipeline to parse the message and create the new fields.
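
For your log format, a pipeline with a single grok processor could look roughly like this. This is only a sketch (run from the Kibana Dev Tools console); the pipeline name my_pipeline and the xyz.* field names are placeholders matching your example:

PUT _ingest/pipeline/my_pipeline
{
  "description": "Parse 'Time Hostname Severity Program Message' lines",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "%{TIMESTAMP_ISO8601:xyz.date} %{HOSTNAME:xyz.hostname} %{WORD:xyz.severity} %{DATA:xyz.program} %{GREEDYDATA:xyz.message}"
        ]
      }
    }
  ]
}

If you also want the parsed timestamp to set @timestamp, you can add a date processor to the pipeline after the grok processor.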

Now you only need to add the pipeline name to the prospector configuration in filebeat.yml to have the logs pre-processed in the Elasticsearch ingest node:

filebeat.prospectors:
- type: log
  paths: [...]
  pipeline: my_pipeline
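
If you want to check the pipeline before pointing Filebeat at it, you can run a sample line through the simulate API (again using the illustrative pipeline name from above):

POST _ingest/pipeline/my_pipeline/_simulate
{
  "docs": [
    {
      "_source": {
        "message": "2018-02-01T21:11:47+05:30 freebsd notice sshd Exiting on signal 15"
      }
    }
  ]
}

The response shows the document as it would be indexed, with the xyz.* fields extracted.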

If you prefer not to use an ingest pipeline, you can also develop your own Filebeat module to parse your custom log files. See the Beats Developer Guide, which contains a section on Creating a New Filebeat Module.
