Pull ES data and generate application logs

Hi Team,

My setup: I have 50+ application servers that generate a huge volume of logs. Filebeat ships them to multiple Logstash nodes, which then index into one ES node (actually a 6-node ES cluster: 5 data nodes and 1 client node).

In the Logstash config there are two output plugins: one is Elasticsearch and the other is file. We suspect that writing every event to a file takes a long time, because Filebeat queues up log files on the application servers. When we temporarily disabled the file output, the Filebeat backlog dropped drastically.

We store only 1 month of application data in ES. For general auditing we need to keep years of data on tape. Since the ES + file combination causes problems in our setup, can we push application logs only to ES, then later extract them from ES, regenerate the same kind of logs into a text file, and archive that? Is this possible? Please let me know whether this is a valid approach, or suggest any better ideas.

You can use Logstash to do this; just set up a separate config/pipeline to extract things.
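For what it's worth, a minimal sketch of such a separate extraction pipeline might look like the following, assuming the `elasticsearch` input plugin and the `file` output plugin are available; the host, index name, and file path are placeholders you would replace with your own values:

```
# archive.conf -- rough sketch of a standalone extraction pipeline (untested)
input {
  elasticsearch {
    hosts => ["es-client:9200"]              # assumption: your client node address
    index => "applog-2016.06.01"             # hypothetical day-wise index name
    query => '{ "query": { "match_all": {} } }'
  }
}

output {
  file {
    path => "/archive/applog-2016.06.01.txt" # hypothetical archive location
    codec => line { format => "%{message}" } # emit just the original log line
  }
}
```

You would run this as a separate Logstash instance (or pipeline) on demand, so it never sits in the hot ingest path.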

Hi Warkolm / all,

Sorry, I forgot to describe the setup completely. I use Filebeat -> Logstash -> ES. In the Logstash output section, each event is first posted to ES and then passed to the file plugin, which writes it back out to a text file.

So every event runs through both outputs before the next event is picked up, and this operation is too costly. What I want instead is to push my logs only to ES, store them for 15-20 days, then generate a single file per day-wise index, keep that text file, and archive it for years.
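One way to sketch that day-wise extraction (a rough idea, not a tested config; the index pattern, host, and paths are placeholders, and the `range` query cut-off is an assumption based on your 15-20 day window):

```
# day-wise archive sketch: pull everything older than 15 days and
# let the file output's date sprintf split events into one file per day
input {
  elasticsearch {
    hosts => ["es-client:9200"]   # assumption: your client node
    index => "applog-*"           # hypothetical index pattern
    query => '{ "query": { "range": { "@timestamp": { "lt": "now-15d" } } } }'
  }
}

output {
  file {
    # %{+YYYY.MM.dd} is expanded from each event's @timestamp,
    # so events land in a separate file per day automatically
    path => "/archive/applog-%{+YYYY.MM.dd}.txt"
    codec => line { format => "%{message}" }
  }
}
```

After the files are written you can tar/compress them for tape, then drop the old indices from ES.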

Any suggestions on this, please?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.