Send custom logs to Elasticsearch with a predefined list of fields

Disclaimer: I find the Filebeat documentation very confusing. For anything more involved than sending a plain log line to Elasticsearch, it does not explain much; it just lists the options and leaves me to work out which ones I need and what values to set them to.

Situation:

  • Windows server with Filebeat installed
  • Filebeat has access to Elasticsearch and Kibana (no security configured, so access is simple)
  • A custom log with one JSON object per line
  • A description of the fields is available in both JSON and YAML

Up until now we just dumped each line of JSON as a message into Elasticsearch and it worked, because everything was mapped as a keyword. Now, however, I need some fields to be mapped as numbers and dates.
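
For reference, this is the kind of input I have in mind to decode each JSON line into separate event fields instead of one message string (the input id and Windows path are just placeholders here, and it assumes a Filebeat version that has the filestream input):

filebeat.inputs:
  - type: filestream
    id: performance-tests                # placeholder id
    paths:
      - 'C:\logs\performance\*.json'     # placeholder path to the custom log
    parsers:
      - ndjson:
          target: ""                     # put the decoded JSON keys at the event root
          add_error_key: true            # mark events whose line fails to parse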

I think I need an index template, so I have defined one in Kibana.
I assume Filebeat needs the name of this index template in order to load it (?).

Please advise

Hi @Johannnnnn

In Filebeat you specify the index that events are written to in Elasticsearch. If that index name matches the pattern configured in your index template, then whenever a new index is created (for example, when the first event of a day is written) the template is used to generate that new index.
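
As a rough sketch of how those pieces fit together (the host and the dated index name below are illustrative, not taken from your setup):

output.elasticsearch:
  hosts: ["http://localhost:9200"]                      # placeholder host
  index: "performance-functional-test-%{+yyyy.MM.dd}"   # must match the template pattern

setup.template.name: "performance-tests"
setup.template.pattern: "performance-functional-test*"
setup.ilm.enabled: false    # with ILM enabled, a custom index name would be ignored

Note that when you customise the index name like this, Filebeat also requires setup.template.name and setup.template.pattern to be set.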

Hi Mitch. This is the current setup of my filebeat.yml:

# ======================= Elasticsearch template setting =======================

setup.template.overwrite: true
setup.template.name: "performance-tests"
setup.template.pattern: "performance-functional-test*"
setup.ilm.enabled: false

Do I understand correctly that, by setting overwrite to true, Filebeat will overwrite any mapping I set under Index Management > Index templates > template > Mappings?
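
My impression from the docs is that it would: on setup, Filebeat re-uploads the template it generates itself under that name, replacing whatever mappings I edited in Kibana. If that is right, a sketch like this should keep Filebeat away from a template that is maintained in Kibana:

# Sketch: manage the template entirely in Kibana and stop Filebeat
# from loading or overwriting it during setup.
setup.template.enabled: false

Alternatively, since we already have the field descriptions in YAML, setup.template.fields could point Filebeat at our own fields file so that the template it loads contains our mappings instead of the defaults.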
