Multiple indexes not being created for multiple log files

Hello,

I've been trying to create multiple indexes for multiple log files that get created within a server. I'm quite new to filebeat so wanted to know what I might be doing wrong within the code. The use case requires me to use filebeat to capture and send logs to elasticsearch.

Can someone please help. I'm currently testing the below code with filebeat 8.x

# ============================== Filebeat inputs ===============================
filebeat.inputs:
- type: filestream
  id: test-index1-logs
  enabled: true
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - {{ log_path_index1 }}
  fields:
    name: "index1"

- type: filestream
  id: test-index2-logs
  enabled: true
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - {{ log_path_index2 }}
  fields:
    name: "index2"

# ======================= Elasticsearch template setting =======================
setup.template.enabled: true
setup.template.name: "test-%{[fields.name]}"
setup.template.pattern: "test-%{[fields.name]}-*"
setup.template.settings:
  index.number_of_shards: 1
  #index.codec: best_compression
  #_source.enabled: false

# ---------------------------- Elasticsearch Output ----------------------------
output.elasticsearch:
  enabled: true
  allow_older_versions: true
  # Array of hosts to connect to.
  #hosts: ["localhost:9200"]
  index: "test-%{[fields.name]}-%{+yyyy.MM.dd}"

Can someone please help with this?

Hi @developer_cloud , welcome to the Elastic community. The configuration file looks good to me. Are you getting any errors?

You can try the debugging steps and check whether you are getting any errors.

Please paste any errors or warnings you see.

Hi @ashishtiwari1993 , I checked through the logs at /var/log/filebeat/* and found an error stating "Connection marked as failed because the onConnect callback failed: error loading template: error creating template instance: key not found". I believe this is happening for fields.name. I'm currently using filebeat 8.x. Could there be any version-related conflicts causing this issue?
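One possible cause of this "key not found" error: the setup.template.* settings are resolved when Filebeat runs setup, before any events exist, so event-field references like %{[fields.name]} cannot be expanded there. A common workaround is to keep the template name static and route events to different indices with conditionals in the output instead. A minimal sketch (the host is a placeholder):

```yaml
setup.template.name: "test"        # static, no event-field references
setup.template.pattern: "test-*"

output.elasticsearch:
  hosts: ["localhost:9200"]        # placeholder host
  indices:
    # Route each event by the custom field set on its input.
    - index: "test-index1-%{+yyyy.MM.dd}"
      when.equals:
        fields.name: "index1"
    - index: "test-index2-%{+yyyy.MM.dd}"
      when.equals:
        fields.name: "index2"
```

The dynamic %{[fields.name]} reference stays only in the per-event index names, where event fields are available.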

Yes, that could be the reason. May I ask what versions of Filebeat and Elasticsearch you are using?

I was using filebeat 8.x. However, when I switched back to filebeat 7.x, I did not encounter this error. Even then, the indices were not being created. Not sure what I might be doing wrong.

What does it show when you try to load the template manually?

Overwriting ILM policy is disabled. Set setup.ilm.overwrite: true for enabling.
Index setup finished.
Loading dashboards (Kibana must be running and reachable)
Skipping loading dashboards, Error importing Kibana dashboards: fail to import the dashboards in Kibana: Error importing directory /usr/share/filebeat/bin/kibana: No directory /usr/share/filebeat/bin/kibana/7
Setting up ML using setup --machine-learning is going to be removed in 8.0.0. Please use the ML app instead.
See more: Machine Learning in the Elastic Stack [8.13] | Elastic
It is not possble to load ML jobs into an Elasticsearch 8.0.0 or newer using the Beat.
Exiting: 1 error: Error setting up ML for apache_ecs: 10 errors: ; ; ; ; ; ; ; ; ;

filebeat version: 7.17.21

It seems your parsing is also breaking. Could you try specifying a static index name and template? That would verify whether it's a parsing issue or not.
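A static configuration for this test could look like the sketch below (the "test-static" name is a placeholder, not from the thread):

```yaml
# No event-field references anywhere: if this loads cleanly,
# the earlier failure was a parsing/field-expansion issue.
setup.template.name: "test-static"
setup.template.pattern: "test-static-*"

output.elasticsearch:
  index: "test-static-%{+yyyy.MM.dd}"
```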

Could you elaborate on that?
Currently I'm starting filebeat and then executing the commands below:
./filebeat test config -c /etc/filebeat/filebeat.yml #checking any config errors
./filebeat test output -c /etc/filebeat/filebeat.yml #check for connectivity
./filebeat setup -c /etc/filebeat/filebeat.yml

Also, I'm currently executing this on filebeat version 7.17. Not sure if multi indexing is functional in this version.

Your filebeat and the rest of the Elastic stack (elasticsearch, kibana, etc.) should be on the same version. I would recommend doing a fresh installation and first trying to read a file without changing the yaml file. Run a simple file-reading example; once that succeeds, change the configuration according to your use case.
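A minimal single-input configuration for that smoke test might look like this sketch (the path and host are placeholders):

```yaml
filebeat.inputs:
- type: filestream
  id: smoke-test
  enabled: true
  paths:
    - /var/log/test/*.log       # placeholder path

output.elasticsearch:
  hosts: ["localhost:9200"]     # placeholder host
```

With no setup.template.* or custom index settings, Filebeat falls back to its default template and index naming, which isolates the multi-index configuration as the variable under test.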

Hi @ashishtiwari1993, I synced my filebeat version to the version of the Elastic stack and ran filebeat with the default configuration, changing only the log file path. This works fine for one log file. However, when I try the same while adding one more input, I'm not getting anything.

Could you share your multi input configurations?

@ashishtiwari1993 I'm not getting errors in /var/log/filebeat either. Not sure why the indices are not being created.

I updated the ILM policy setup to false and this seemed to work; previously the logs were only being sent to filebeat-7.17.* without the index name being applied. However, the problem with multi indexing still exists.

got it. Thanks for your help and time @ashishtiwari1993 :blush:


@ashishtiwari1993 , just a quick question: is there a way to attach the multiple indices created to a single ILM policy via filebeat.yml? Currently setup.ilm.enabled is set to false. When set to true, it attaches a different index. The indices created via the above filebeat.yml do not get attached to the ILM policy. Any insights on this?

@developer_cloud everything you need is controlled via the ILM policy. Here is what worked for me with a custom index name:

  1. Create an ILM policy. For example, I created an index test-index-1 and assigned it the policy test-policy-1.
  2. Added the index name in filebeat like below:

output.elasticsearch.index: "test-index-1"
setup.template.name: "filebeat"
setup.template.pattern: "filebeat"

Which means it will use all fields from the filebeat template only.
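Filebeat can also manage the policy attachment itself. A sketch of the relevant setup.ilm settings, assuming the policy test-policy-1 from step 1 already exists in Elasticsearch:

```yaml
setup.ilm.enabled: true
setup.ilm.policy_name: "test-policy-1"   # existing policy to attach
setup.ilm.rollover_alias: "test-index-1" # alias Filebeat writes through
setup.ilm.overwrite: false               # don't replace the policy on setup
```

Note that in 7.x, when ILM is enabled, Filebeat writes through the rollover alias and ignores output.elasticsearch.index, which would explain why enabling it "attaches a different index" than the one named in the output.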
