Hi,
I’ve been using the ELK stack to capture logs from our Kubernetes cluster. I had Filebeat set up on the cluster and everything was working fine until I updated from 6.7 to 7.3 (I also updated the Docker image that the Filebeat DaemonSet was pulling from).
When I initially made the update, all the logs ended up being pushed into a single index, whereas before they were split by day and namespace. This is the configuration I have for the Elasticsearch output and the index template:
output.elasticsearch:
  hosts: ['<url>']
  index: "filebeat-%{[kubernetes.namespace]:default}-%{[agent.version]}-%{+yyyy.MM.dd}"

setup.template:
  name: "filebeat-%{[kubernetes.namespace]:default}"
  pattern: "filebeat-%{[kubernetes.namespace]:default}-*"
However, since then the logs have stopped showing up altogether, including in the newly created index. I've searched and haven't found any other indices matching the pattern filebeat-*. I can't see any errors in the Filebeat DaemonSet logs on Kubernetes, so I'm wondering where the output is actually going, as it is clearly not following the configuration above.
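For reference, when I say I searched for indices, I mean something along these lines in the Kibana Dev Tools console:

  GET _cat/indices/filebeat-*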
On a separate note, what would be the easiest way to send logs from a React web app into Elastic? Currently we send them to Sentry using POST requests. If we were to use Filebeat, I assume the options would be either to store the logs locally, set Filebeat up locally and have it read them, or to send them over a UDP/TCP socket and have Filebeat listen there (rough sketch below). I've seen other implementations that use Logstash as an intermediary, but I'd rather not set that up for now.
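For the socket option, the kind of thing I had in mind is Filebeat's tcp input, roughly like this (the host and port are just placeholders, not something we run today):

  filebeat.inputs:
    - type: tcp
      # placeholder listener; something server-side would forward the app's
      # newline-delimited log messages here, each line becoming one event
      host: "0.0.0.0:9000"

Does that sound like a sensible approach, or is there a more direct way to get browser logs into Elasticsearch?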
Thanks,
Nathanael