2018-02-01T21:11:47+05:30 freebsd notice sshd Exiting on signal 15
I want Filebeat to send these logs to Elasticsearch directly (without Logstash), so that each entry is stored in Elasticsearch with the same structure: Time, Hostname, Severity, Program and Message.
Instead, the events are indexed with fields such as offset, prospector.type, beat.name, beat.hostname, etc.
I understand I need to change the fields.yml file, but I don't understand how to do that for my requirement.
This is not possible with Filebeat alone, as it doesn't have parsing capabilities (yet).
However, you can configure an ingest pipeline in Elasticsearch that does this field extraction for you, and have filebeat send the events through this pipeline.
Take a look at the ingest node documentation to understand how to create an ingest pipeline. Add a Grok Processor to the pipeline to parse the message and create the new fields.
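For the sample line above, the pipeline could look roughly like this (a minimal sketch: the pipeline name syslog-pipeline and the exact grok pattern are assumptions based on that one sample line, so adjust them to your real log format):

PUT _ingest/pipeline/syslog-pipeline
{
  "description": "Example pipeline: extract timestamp, hostname, severity and program from syslog-style lines (adjust the grok pattern to your format)",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "%{TIMESTAMP_ISO8601:timestamp} %{HOSTNAME:hostname} %{WORD:severity} %{WORD:program} %{GREEDYDATA:message}"
        ]
      }
    }
  ]
}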
Now you only need to add the pipeline name to the prospector configuration in filebeat.yml to have the logs pre-processed in the Elasticsearch ingest node:
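For example, with Filebeat 6.x a log prospector can reference the pipeline directly (a sketch only; the log path is a placeholder and syslog-pipeline is the assumed pipeline name from the example above):

filebeat.prospectors:
  - type: log
    paths:
      - /var/log/messages        # placeholder, point this at your actual log file
    pipeline: syslog-pipeline    # name of the ingest pipeline created above

Alternatively, if all events should go through the same pipeline, you can set it once under output.elasticsearch.pipeline instead of per prospector.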
If you prefer not to use an ingest pipeline, you can also develop your own Filebeat module to parse your custom log files. See the Beats Developer Guide, which contains a section on Creating a New Filebeat Module.