Hi, I am using Logstash to parse my logs. My architecture is as follows: Filebeat ships logs from the server to Logstash, Logstash parses them and sends them to Elasticsearch, and the data in Elasticsearch is viewed with Kibana. One of my clients said he can generate about 500 MB of logs in a day, so to test whether my ELK stack can handle that volume I wrote the bash script shown below:
while true; do
  dt=$(date '+%d/%m/%Y %H:%M:%S')
  echo "$dt local.ERROR: $dt" >> testing.log
done
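To sanity-check the generation rate before pointing Filebeat at the file, a variant of the script can write lines until a target size is reached. This is only a sketch under my own assumptions: the file name `testing.log` matches the script above, and the 64 KiB target is a placeholder for a quick smoke test that would be scaled toward 500 MB for the real run.

```shell
#!/bin/sh
# Sketch: append timestamped ERROR lines until the file reaches a
# target size. 64 KiB here is an assumed smoke-test value; scale it
# toward 500 MB to approximate the client's daily volume.
target_bytes=$((64 * 1024))
logfile="testing.log"
: > "$logfile"                      # start from an empty file
while [ "$(wc -c < "$logfile")" -lt "$target_bytes" ]; do
  dt=$(date '+%d/%m/%Y %H:%M:%S')
  echo "$dt local.ERROR: $dt" >> "$logfile"
done
wc -c "$logfile"                    # report the final size
```

Timing this loop gives a rough lines-per-second figure, which helps decide how long the generator must run to produce the full 500 MB.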
The contents of this testing.log file are shipped to Logstash. But even after I stopped the script, logs were still being printed on the console 12 hours or so later. What does this mean? Furthermore, the timestamps on those logs were old, which suggests Logstash is queuing the logs somewhere. Does Logstash queue logs? If yes, what can I do to remove these logs from the queue?