I have Logstash, Elasticsearch and Kibana inside docker-compose, and they communicate with each other fine. Filebeat, which is installed outside Docker, is responsible for fetching data from a file and sending it to Logstash. Filebeat is harvesting the data correctly, because I can print the data to the console. It can also send data to Logstash when Logstash is installed directly on the VM rather than in Docker. My Logstash input is the beats plugin on port 5044, and my Filebeat config file filebeat.yml has:
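For reference, a minimal sketch of the two sides of this pairing (the log path is a placeholder; host/port values are assumptions based on the setup described above):

```
# Logstash pipeline config (e.g. logstash.conf)
input {
  beats {
    port => 5044
  }
}
```

```yaml
# filebeat.yml
filebeat.inputs:
  - type: log
    paths:
      - /path/to/your/file.log

output.logstash:
  hosts: ["localhost:5044"]
```

Since Filebeat runs outside Docker, the compose file also has to publish 5044 on the Logstash service so the host can reach it, e.g.:

```yaml
# docker-compose.yml, under the logstash service
ports:
  - "5044:5044"
```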
I even ran `sudo filebeat setup --template -E output.logstash.enabled=false -E 'output.elasticsearch.hosts=["localhost:9200"]'` and loaded the index template.
Oh, it seems Logstash and Filebeat are connected after all. I just removed my filters and found that Logstash had received 2 lines of the data that was sent, but the remaining data never reached Logstash. I suppose it's lost somewhere in between. What should I do now? Filebeat is sending all the data correctly, because if I use stdout as the output then everything is printed to the console, but it's not reaching Logstash.
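For anyone debugging the same way: the console output I used for this test is configured like below. Note that Filebeat allows only one output at a time, so `output.logstash` has to be commented out while this is active:

```yaml
# filebeat.yml -- temporary debug output
output.console:
  pretty: true
```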
It's so weird: Logstash always receives just 2 lines of data, the start of my file. I cleared the file and it still shows those earlier starting lines. How can it access them even after they were deleted from the file? Is it Filebeat that sends the data again and again? But if Filebeat's output is stdout it behaves normally, so why does this happen only when Logstash is inside Docker?
I had changed the queue type to persisted. When I changed it back to the default value, everything works fine. I think there wasn't enough memory to use a persistent queue.
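For reference, this setting lives in logstash.yml; the default is the in-memory queue. The persisted queue buffers events on disk, which would also explain why the old starting lines kept reappearing even after I cleared the source file — they were presumably still sitting in the on-disk queue (the size value below is just an example):

```yaml
# logstash.yml
queue.type: memory        # default; events are held in RAM only
# queue.type: persisted   # what I had before; events are buffered on disk
# queue.max_bytes: 1024mb # disk cap that applies when using the persisted queue
```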