Filebeat indexing on Windows

Hi,

I'm new to the Elastic Stack and came across Filebeat when trying to find a way to push custom log files from Windows through to Elasticsearch.

I've got Filebeat installed and have the logs going to AWS Elasticsearch, but I'm not sure how to index my custom log file so that it's more searchable in Kibana.

My log file is space delimited like this:

"date" "starttime" "endtime" "client" "successful", etc

When my log file arrives in Elasticsearch, the entire line appears in the "message" field. How can I index the log file so that each column shows up as a separate field once it gets through to Elasticsearch?

Probably a basic question, but I'm just starting my journey.

Thanks!

Filebeat is just the data shipper; it does not parse any content. For parsing you can use either Logstash or Elasticsearch Ingest Node. The Filebeat modules, for example, configure and use Elasticsearch Ingest Node automatically on startup.
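If you go the Ingest Node route, a minimal pipeline for a space-delimited line like yours might look something like this (just a sketch; the pipeline name `my_custom_log` and the field names are guesses based on your example columns, and the dissect processor needs a reasonably recent Elasticsearch version — on older versions a grok processor does the same job):

```
PUT _ingest/pipeline/my_custom_log
{
  "description": "Parse space-delimited custom log lines",
  "processors": [
    {
      "dissect": {
        "field": "message",
        "pattern": "\"%{date}\" \"%{starttime}\" \"%{endtime}\" \"%{client}\" \"%{successful}\""
      }
    }
  ]
}
```

Dissect splits the "message" field against the literal pattern, so each quoted column ends up as its own field on the document.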

Thanks for the information.

I've just looked at the Filebeat modules and can see that there are some standard ones, but would I need to create my own to parse my custom log file?

Can you create your own on Windows? The developer guide only has instructions for Linux:

https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-modules-devguide.html

Would you recommend using Logstash for my use case?

Filebeat modules are just sets of configurations for Filebeat and Elasticsearch. You don't need to use them; you can just as well create an ingest pipeline in Elasticsearch yourself and configure Filebeat to send the logs to that pipeline.
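As a rough sketch, the Filebeat side is then just a matter of pointing the Elasticsearch output at that pipeline in filebeat.yml (the paths and endpoint below are placeholders, and depending on your Filebeat version the input section may be called filebeat.prospectors instead of filebeat.inputs):

```
filebeat.inputs:
  - type: log
    paths:
      - 'C:\logs\myapp\*.log'   # hypothetical Windows log path

output.elasticsearch:
  hosts: ["https://my-aws-es-endpoint:443"]   # placeholder endpoint
  pipeline: my_custom_log                     # route events through the ingest pipeline
```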

You can simulate/create pipelines in Elasticsearch using any REST client or the Elasticsearch developer console (e.g. from within Kibana).
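For example, assuming the sketch pipeline from earlier in the thread, you could test it against a sample line from the Kibana console without indexing anything (the sample values here are made up):

```
POST _ingest/pipeline/my_custom_log/_simulate
{
  "docs": [
    {
      "_source": {
        "message": "\"2018-01-15\" \"09:00:01\" \"09:00:05\" \"client42\" \"true\""
      }
    }
  ]
}
```

The response shows the document as it would be indexed, so you can check that each column lands in its own field before wiring Filebeat up.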
