Multiple Filebeat instances appear as one beat in Kibana X-Pack monitoring

We have 9 Filebeat instances running, none with a `name` specified, so each defaults to its hostname.
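(For context, this is the top-level setting in filebeat.yml that we leave unset; something like the following, with a made-up value, would override the hostname default.)

```yaml
# filebeat.yml: top-level "name" setting (value is illustrative)
name: "filebeat-app-01"
```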

Visualising `beat.name` shows all 9 sending events.

However, only 2 beats appear in Kibana under the Monitoring section (latest version of Kibana). When clicking through, the name of the beat in the header sometimes changes, suggesting they are being arbitrarily merged somehow.

Can anyone advise?

Each Filebeat instance creates its unique ID the first time it is started. The ID is stored in ${path.data}/meta.json. When Filebeat is restarted, it reuses the ID stored in this file.
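The file is tiny; depending on the Beats version it looks roughly like this (the UUID here is a placeholder):

```json
{"uuid": "f3b6c9d2-1234-4abc-8def-000000000000"}
```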

A common issue is users copying the data directory between Filebeat instances. All Beats store operational state in the data directory, and it must be unique per instance. Never share or copy this directory between instances.
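If you run more than one instance on the same host, the usual way to keep their state separate is to point each at its own data path in filebeat.yml. A minimal sketch, with illustrative paths:

```yaml
# filebeat.yml for instance 1 (path is illustrative; give each
# instance a different directory)
path.data: /var/lib/filebeat-instance1
```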

Check the contents of the meta.json files. A quick fix is to remove them; Filebeat will create new ones upon restart.
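A minimal sketch, assuming a Linux-style layout with the data directory at /var/lib/filebeat (check your startup logs for the real path):

```sh
# Inspect the ID this instance is using
cat /var/lib/filebeat/meta.json

# Quick fix: with filebeat stopped, remove the file; a new UUID is
# generated the next time filebeat starts
rm /var/lib/filebeat/meta.json
```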

Thanks @steffens, unfortunately that doesn't seem to be the case. I checked the projects in source control, and there's no meta.json anywhere in the repositories. From the docs, it looks like the default data path on Windows is relative to the main executable (these are Azure App Services deployed from CI).

Is there anything else I can do to diagnose?

There must be a meta.json somewhere: it's created the first time you start Filebeat. It's not in source control because it's created at runtime.

Filebeat normally prints its directories on startup. You can also ask it to print the initial configuration to the debug log by running it with -d "config"; that initial debug message should give you a hint where the ${path.data} directory is.
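For example, on Windows from the Filebeat install directory (the invocation is an assumption based on your setup):

```powershell
# -e logs to stderr, -d "config" enables the config debug selector;
# the output includes the resolved paths, among them path.data
.\filebeat.exe -e -d "config"
```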

Given that you deploy from CI, I wonder whether the meta.json file is already being created by the CI. When packaging Filebeat into containers or zips, the data directory should be cleared out first.
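A sketch of that cleanup as a packaging step, assuming the bundle layout below (all names and paths are illustrative):

```sh
# Remove any state a build-agent test run may have left behind, so each
# deployed instance generates its own UUID on first start
rm -rf ./filebeat/data

# Then package the cleaned bundle for deployment
zip -r filebeat-bundle.zip ./filebeat
```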
