Create indices dynamically with Filebeat by configuring filebeat.yml

Hi guys,

I would like to create indices dynamically with Filebeat and ship the data directly to Elasticsearch, without using Logstash (too heavy for my use case), just by configuring filebeat.yml.

I have a CSV file whose underlying data I would like to visualise in Kibana. The index name should be created dynamically based on the values in the "state" column.
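For context, the columns are fid, date, city, state, population; a couple of rows for illustration (values made up):

fid,date,city,state,population
1,2020-12-01,Springfield,Illinois,114000
2,2020-12-01,Portland,Oregon,654000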

Steps I have done so far:

  1. Import a small part of the CSV file into the File Data Visualiser in order to:
  • automatically create the ingest pipeline in Elasticsearch (test_dataset-pipeline)
  • extract the Filebeat configuration snippet for filebeat.yml
  2. Paste the Filebeat configuration snippet into filebeat.yml on my local machine.
  3. Configure the path and the dynamic index rule in filebeat.yml.

The inputs and output configuration from filebeat.yml:

filebeat.inputs:
  - type: log
    paths:
      - 'path/test_dataset.csv'
    exclude_lines: ['^"?fid"?,"?date"?,"?city"?,"?state"?,"?population"?']

output.elasticsearch:
  hosts: ["localhost:9200"]
  pipeline: "test_dataset-pipeline"
  index: "index-%{[state]}-%{+yyyy.MM.dd}"

# as far as I understand, the custom index setting is ignored while ILM is enabled
setup.template.enabled: false
setup.ilm.enabled: false

After starting the Filebeat app on Linux I get the following message in an endless loop:

ERROR [publisher_pipeline_output] pipeline/output.go:180 failed to publish events: temporary bulk send failure

What should I configure so that the Elasticsearch indices are populated dynamically based on a CSV column?
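My current guess (happy to be corrected): the index format string is resolved by Filebeat itself, before the event ever reaches the ingest pipeline in Elasticsearch, so the state field probably has to be extracted on the Filebeat side first. A minimal sketch of what I mean, assuming the column order from the header and no quoted commas inside the values (the dissect tokenizer is my assumption, not tested):

filebeat.inputs:
  - type: log
    paths:
      - 'path/test_dataset.csv'
    exclude_lines: ['^"?fid"?,"?date"?,"?city"?,"?state"?,"?population"?']

processors:
  # split the raw CSV line into top-level fields on the Filebeat side,
  # so that [state] already exists when the output resolves the index name
  - dissect:
      tokenizer: "%{fid},%{date},%{city},%{state},%{population}"
      field: "message"
      target_prefix: ""

output.elasticsearch:
  hosts: ["localhost:9200"]
  pipeline: "test_dataset-pipeline"
  index: "index-%{[state]}-%{+yyyy.MM.dd}"

One thing I am not sure about: index names must be lowercase, so state values like "New York" would presumably still need to be lowercased (and the space replaced) before they can be used in the index name.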

P.S.
Dynamic indices are successfully created in Kibana for the following rule (agent.version, unlike state, already exists on the event when Filebeat publishes it):
index: "index-%{[agent.version]}-%{+yyyy.MM.dd}"

Resulting index name in Kibana:

  • index-7.10.1-2020.12.23
