Is storing log files in Elasticsearch a best practice?

I have the following scenario:

One system runs a lot of applications and generates logs. Every time an application finishes, I send its log to Elasticsearch.

There are two choices here:

  1. Every application first writes its log to the file system, and Filebeat then ships the log to Elasticsearch. This way we keep two copies of each application's log.
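For the first choice, a minimal Filebeat configuration sketch might look like the following. The log path, input `id`, and Elasticsearch host are assumptions for illustration, not values from the original question:

```yaml
# filebeat.yml (sketch) -- ships per-application log files to Elasticsearch.
filebeat.inputs:
  - type: filestream
    id: app-logs            # hypothetical input id
    paths:
      - /var/log/myapp/*.log  # hypothetical log location

output.elasticsearch:
  hosts: ["http://localhost:9200"]  # assumed local cluster
```

Filebeat keeps a registry of read offsets, so it can resume after restarts and retry on delivery failures, which is one of the main advantages of this option.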

  2. Applications do not write logs to the file system at all; each one sends its log directly to Elasticsearch via the REST API. How does this design compare to choice 1? Will it cause me any trouble? What is the recommended way?
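For the second choice, the application would build an NDJSON body and POST it to the Elasticsearch `_bulk` REST endpoint itself. The sketch below only constructs the payload; the index name `app-logs` and the sample log record are assumptions, and the actual HTTP call is left as a comment:

```python
import json

def build_bulk_payload(index, docs):
    """Return an NDJSON _bulk body: one action line plus one source line per document."""
    lines = []
    for doc in docs:
        lines.append(json.dumps({"index": {"_index": index}}))  # action metadata line
        lines.append(json.dumps(doc))                           # document source line
    return "\n".join(lines) + "\n"  # _bulk requires a trailing newline

payload = build_bulk_payload("app-logs", [
    {"app": "job-42", "level": "INFO", "message": "finished"},  # hypothetical record
])
print(payload, end="")
# The payload would then be POSTed to http://localhost:9200/_bulk with the
# header Content-Type: application/x-ndjson (e.g. via requests.post(...)).
```

Note that with this approach the application itself must handle retries and buffering; if the POST fails and the log exists nowhere else, that log is gone, which is exactly the loss risk discussed below.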

The second option saves disk space on the machine, but it makes it harder to tell whether you are losing logs (a misconfigured Logstash pipeline, network issues, authentication issues, and so on).

You should probably go for the first option and rotate files with logrotate or a custom script to save disk space.
That way, if there are problems with the ingestion pipeline, you can replay it from the files and compare the raw information on the client with what was actually indexed.
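A minimal logrotate sketch for this setup could look like the following. The log path and retention values are assumptions chosen for illustration:

```
# /etc/logrotate.d/myapp (sketch) -- hypothetical path and retention
/var/log/myapp/*.log {
    daily
    rotate 7          # keep one week of rotated files
    compress
    delaycompress     # leave the newest rotated file uncompressed so the
                      # shipper can finish reading it before it is gzipped
    missingok
    notifempty
}
```

`delaycompress` matters here: if Filebeat is still reading a file when it is rotated, compressing it immediately could cut off the tail of the log before it is shipped.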
