Filebeat on ECK ingesting multiple copies of the same file

I have ECK deployed on a cluster with 10 nodes. There is a large file that I want to ingest into Elasticsearch. The file sits on a network drive mounted onto every machine, so each node can access it at the same path. When I run Filebeat as a DaemonSet, one instance runs on every node, and because all the instances share the same configuration, each one reads the file and sends the same data to the same index. How can I make only one instance read the file? Any advice would be appreciated.
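For reference, the relevant part of the shared configuration has roughly this shape (I am sketching it with a filestream input; the path is illustrative):

```yaml
# Shared input that every Filebeat instance in the DaemonSet picks up.
# The path is illustrative; it points at the network mount that looks
# identical on all 10 nodes.
filebeat.inputs:
  - type: filestream
    id: shared-network-file
    paths:
      - /mnt/shared/large-file.log
```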

I don't think you can. If you have multiple Filebeats reading the same file, you will get duplicates: each Filebeat keeps its own local registry of read offsets, so the instances have no way of knowing what the others have already shipped.

In this case you need to have just one Filebeat reading from that file.

But this means I cannot run Filebeat as a DaemonSet on Kubernetes, right? All the instances it creates would have the exact same configuration, and hence they would all read the same file.

I do not use Kubernetes, but, as mentioned, you need to have just one instance of Filebeat reading from that file.

Can you not configure the DaemonSet to run on just one node? From this example it seems to be possible to do something like that; a sketch of what it could look like is below.
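Something along these lines might work, assuming ECK's Beat custom resource manages the DaemonSet; the node name, Filebeat version, mount path, and Elasticsearch reference are all illustrative assumptions:

```yaml
apiVersion: beat.k8s.elastic.co/v1beta1
kind: Beat
metadata:
  name: filebeat-shared-file
spec:
  type: filebeat
  version: 8.14.0                  # illustrative version
  elasticsearchRef:
    name: elasticsearch            # name of your Elasticsearch resource
  config:
    filebeat.inputs:
      - type: filestream
        id: shared-network-file
        paths:
          - /mnt/shared/large-file.log   # illustrative path to the shared file
  daemonSet:
    podTemplate:
      spec:
        # Restrict scheduling to a single node so only one
        # Filebeat instance ever reads the shared file.
        nodeSelector:
          kubernetes.io/hostname: node-1   # hypothetical node name
        containers:
          - name: filebeat
            volumeMounts:
              - name: shared
                mountPath: /mnt/shared
                readOnly: true
        volumes:
          - name: shared
            hostPath:
              path: /mnt/shared    # host path of the network mount
```

That said, pinning a DaemonSet to one node is a bit of a workaround; the same Beat resource also accepts a `deployment` with `replicas: 1` instead of `daemonSet`, which expresses "exactly one instance" more directly.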