Can I check an Elasticsearch index in the file system?


I wanted to know if there is a way for me to look at the logs of the different indices directly on my server, without going through Kibana.

According to my research there should be a data folder in /var/lib/elasticsearch. Unfortunately I did not find it at first, so I wrote a script to list the biggest folders on my server, since my logs had to be stored somewhere:

11G     /
5,9G    /var
5,6G    /var/lib
5,5G    /var/lib/elasticsearch/indices
5,5G    /var/lib/elasticsearch
4,1G    /usr
2,9G    /usr/share
1,2G    /usr/share/elasticsearch
1,2G    /usr/lib
1,1G    /var/lib/elasticsearch/indices/XrcdQJShQZemMvPSxjJp-A/0/index
1,1G    /var/lib/elasticsearch/indices/XrcdQJShQZemMvPSxjJp-A/0
1,1G    /var/lib/elasticsearch/indices/XrcdQJShQZemMvPSxjJp-A
1,1G    /var/lib/elasticsearch/indices/9aN2eLEvSH2m8OnGWWKf6g/0/index
1,1G    /var/lib/elasticsearch/indices/9aN2eLEvSH2m8OnGWWKf6g/0
1,1G    /var/lib/elasticsearch/indices/9aN2eLEvSH2m8OnGWWKf6g
859M    /usr/share/elasticsearch/modules
737M    /usr/share/kibana
733M    /var/lib/elasticsearch/indices/bfMQvXWbQyW-Eb41ZxyMEA/0
733M    /var/lib/elasticsearch/indices/bfMQvXWbQyW-Eb41ZxyMEA
714M    /var/lib/elasticsearch/indices/ZFswCF8UR2m03GRnrKrMpw/0 

So I guess this is where my logs are stored, but I don't understand much about how the data is stored. Is there a way for me to read my logs directly from these files?

The data path is correct: /var/lib/elasticsearch

I'm on Elasticsearch 8.7, by the way.

Thank you for your time.

You should never go directly to the file system, nor change anything there. Always use the APIs.

If you do not want to use Kibana you can query Elasticsearch directly through the REST API, but using Kibana is generally easier.

Why do you avoid Kibana?

It's not that I want to avoid Kibana; I do use it. What I actually want is to send a log to another server whenever a document is added to a certain index in Elasticsearch, so I thought of using Filebeat, for example, to retrieve the index's logs at the source.

It might be easier to capture that before it is indexed, e.g. using Logstash, and send it to other downstream systems at that point. Otherwise you may need to query for new data periodically using a custom script.
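To illustrate the Logstash route: a pipeline can write to several outputs at once, so the same event goes both to Elasticsearch and to another downstream. A minimal sketch, where the Beats port, hosts, and the TCP destination are hypothetical placeholders:

```
input {
  beats {
    port => 5044
  }
}

output {
  # Index into Elasticsearch as usual
  elasticsearch {
    hosts => ["https://localhost:9200"]
  }
  # ...and also forward a copy to another system, e.g. over TCP as JSON lines
  tcp {
    host => "other-server.example"
    port => 5000
    codec => json_lines
  }
}
```

Every event that enters the pipeline is sent to every output block, so no extra duplication step is needed.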

Unfortunately I cannot capture them earlier, since these are logs generated by alerts!

Would you have any tips on what to use for the script? I am not even able to figure out where to retrieve my data from x)

I am not sure I understand what the issue is. Could you please clarify?

Can you see the data you are looking for through Kibana?

Sorry for the misunderstanding. Yes, I can access my logs of any type in Kibana. I set up alerts in Kibana via the Security menu, and configured them so that when an alert fires, a log is generated and indexed. The problem now is that I want to send these alert logs elsewhere.

Then run a query periodically, through a script, against the index where you indexed the alerts.

Have a look at the available language clients and pick the one that works for you. You can also create a shell script that uses curl to run queries against Elasticsearch.


You cannot directly open the binary files that Elasticsearch (Lucene) creates. You need to use the APIs.


Thanks @Christian_Dahlqvist and @warkolm

I followed your advice and wrote a bash script, which I run with crontab; it makes a request with curl. Here is the beginning of the script, which simply retrieves the rule name of the log:

curl -u elastic:... --cacert /certificat_elk.crt \
  -XPOST "https://localhost:9200/alert-test/_search?pretty=true" \
  -H "Content-Type: application/json" \
  -d '{"query": {"range": {"@timestamp": {"gt": "now-1m"}}}}' \
  | jq -r '.hits.hits[]."_source".rule_name' >> /mylog/mylogfile.log

Now I just have to retrieve additional fields to build a proper log line, but the principle stays the same. Thank you!
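For pulling several fields out of each hit, jq can build the whole log line in one pass with `@tsv`. A small sketch against a hypothetical `_search` response trimmed to one hit (the `host.name` field is just an example; the fields in your alert documents may differ):

```shell
# Hypothetical _search response reduced to a single hit, to demonstrate the jq filter only
RESPONSE='{"hits":{"hits":[{"_source":{"@timestamp":"2023-05-01T12:00:00Z","rule_name":"ssh-bruteforce","host":{"name":"web-01"}}}]}}'

# Emit one tab-separated line per hit: timestamp, rule name, host name
echo "$RESPONSE" | jq -r '.hits.hits[]._source | [."@timestamp", .rule_name, .host.name] | @tsv'
```

With `-r`, `@tsv` prints plain tab-separated values, so the output of the pipeline can be appended to a log file exactly like the single-field version above.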

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.