Where can I find logs on a Linux machine?

I want to find where all logs are stored on my ELK Linux box.

I want to perform some tasks on the logs, like tar and zip.

- /var/log/log1.log
- /var/log/nova/log2.log

I want to see where they are stored on the Linux machine; I do not want to view them in Horizon.
Also, I want only log1.log, so is that saved as a separate file?

I'm not sure what you are asking. /var/log is where "most" "default" logging gets stored. I say most because an application can choose to log wherever it wants, and it can also be configured to log wherever the admin wants it to. To find all logs on a machine, use the find command. To archive logs, you can use a custom script, or use logrotate to rotate the logs and compress them.

This question is more suited for your local Linux administration forum and is not really related to Logstash. If you are asking where the ELK logs are, they are wherever you configure them to be. The default locations are /var/log/elasticsearch, /var/log/logstash, and /var/log/kibana, but again they can be sent anywhere via the /etc/sysconfig/(logstash,elasticsearch,kibana) configs.
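The two steps above (find the logs, then archive them) could be sketched like this. The temp directory stands in for /var/log so the example is self-contained; the file names are just illustrative:

```shell
#!/bin/sh
# Sketch: locate *.log files and bundle them into one tarball.
# LOGDIR here is a throwaway demo directory; in practice you would
# point the find commands at /var/log (or wherever your app logs).
set -e

LOGDIR=$(mktemp -d)
touch "$LOGDIR/log1.log"
mkdir -p "$LOGDIR/nova"
touch "$LOGDIR/nova/log2.log"

# 1) Find every regular *.log file under the tree
find "$LOGDIR" -type f -name '*.log'

# 2) Feed the matches to tar and compress them into one archive
find "$LOGDIR" -type f -name '*.log' \
    | tar -czf /tmp/logs-backup.tar.gz -T -

# Inspect the archive contents
tar -tzf /tmp/logs-backup.tar.gz
```

logrotate can do the same rotation-plus-compression automatically via the `compress` directive in its per-application config files.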

Sorry, I will explain it better.

I have Filebeat sending logs to Logstash, which then sends them to Elasticsearch.

So these logs come from remote machines to the local machine and are then processed to be shown in Kibana. My question is: since they are coming in, they might be stored somewhere before being processed. What is that location?

For example, I am fetching /var/log/XYZ from a client machine using Filebeat.
It goes to Logstash and then to Elasticsearch, so where can I find this log on the ELK Linux machine?


They are not stored on the LS host unless you've configured it to output to stdout for debugging or configured it to do otherwise. The LS host parses and filters the logs it gets from Filebeat and pushes them to ES as fast as it can process them. The logs are then stored in ES, in the indexes you've specified, if my understanding of LS is not faulty in this matter.

I assume from your reply that you got the filebeat -> logstash -> elasticsearch pipeline working. I didn't see any reply on your other topic.


From what you are saying, the logs are not stored on LS.
They are stored on ES, so can you tell me where I can find them on ES? My LS, ES, and Kibana are all on the same machine, i.e. the ELK box.

Our requirement: in filebeat.yml we have


  • /var/log/XYZ.log
  • /var/log/nova/abc.log

So from this, I just want the XYZ log from all the Filebeats on all client machines so that I can tar them.
Regarding the abc log, I want it to keep working as it does now: it goes to LS, then to ES, and then to Kibana.

I want the logs (XYZ) from all machines to be gathered on either LS or ES so that I can combine them in a folder, tar them, and mail them. This is our requirement.
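One way to get that on the LS host is Logstash's file output plugin, which can write a plain-text copy of matching events to disk alongside the normal Elasticsearch output. A hedged sketch, not your actual pipeline: the field name `[source]` (newer Beats versions use `[log][file][path]` instead), the paths, and the host are all assumptions:

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]    # assumed single-node ELK box
  }

  # Write a plain-text copy of events that came from the XYZ log,
  # so they can later be tarred and mailed from this directory.
  if [source] =~ /XYZ/ {
    file {
      path  => "/var/log/logstash/xyz-%{+YYYY.MM.dd}.log"
      codec => line { format => "%{message}" }
    }
  }
}
```

This keeps the abc log flowing to ES/Kibana untouched while accumulating the XYZ lines in one directory on the LS host.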

Can you please tell me where I can find these logs?

On the elasticsearch host run:
ps auxf|grep elastic

You should see the -Des.default.path.data=""

This is where your indexes are kept. There should be a directory with the name of your cluster, and within it are the files that Elasticsearch keeps for all the logs you sent to it to index. If you want to curate them, you should install Curator (https://github.com/elastic/curator) and use it to curate your indexes.
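If ps doesn't show the flag, there are a couple of other places to probe. The paths below are the usual RPM/DEB package defaults, so treat them as assumptions rather than your actual layout:

```shell
#!/bin/sh
# Try to read path.data from the running process, then fall back to
# the common package defaults. All paths here are typical, not certain.

# 1) Look for the path.data property on the Java command line
ps auxww 2>/dev/null | grep -o 'Des[^ ]*path\.data=[^ ]*' | head -1

# 2) Check the config file many package installs use
grep -h 'path.data' /etc/elasticsearch/elasticsearch.yml 2>/dev/null

# 3) Default data directory on most RPM/DEB installs
ls -d /var/lib/elasticsearch 2>/dev/null \
    || echo "/var/lib/elasticsearch not present on this machine"
```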

Thank you Michael.

I looked into this and found the path /var/lib/elasticsearch/.
Also, the data inside all the folders is in a different, unreadable format, so I think that is why you are suggesting I use Curator.

I searched all those folders, and other folders too, to find the same log file that is on the client, but I guess it is not available on ES or LS.

One question: so these files are not stored as-is (the way they exist on the client side) anywhere on the ELK box, since I cannot locate them anywhere?

The log data in Elasticsearch is stored in Lucene-formatted indexes for searching. ELK does not store a "copy" of your log file. Your logs are parsed and stored in the Elasticsearch indexes; the original log file stays on the client where it was read. Filebeat reads the log file and sends the lines to Logstash, which parses them and forwards them to Elasticsearch, which indexes and stores them in its own formatted index files. It does not replicate the log file.
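If you ever do need the original lines back, the way to get them is to query Elasticsearch rather than to look for a file on disk. A hedged sketch: the `logstash-*` index pattern, the `source` field name, and port 9200 are the common defaults, not something confirmed from your setup:

```shell
#!/bin/sh
# Ask Elasticsearch for events whose source field mentions XYZ.
# logstash-* is the default index pattern; 9200 the default HTTP port.
curl -s 'http://localhost:9200/logstash-*/_search?q=source:XYZ&size=10' \
    || echo "Elasticsearch is not reachable on this machine"
```

Each hit in the JSON response carries the parsed fields, including the original line in its `message` field, which you could redirect to a file and tar.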

Thank you for explaining it to me; now I get it.