I wanted to know if there is a way to look at the logs of the different indices directly on my server, without going through Kibana.
From what I read, there should be a data folder in /var/lib/elasticsearch, but unfortunately I did not find it, so I wrote a script to look for the biggest folders on my server, since my logs had to be stored somewhere:
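For reference, a minimal sketch of that kind of search can be done with `du` alone (the `/var/lib` starting point and the depth are assumptions; adjust to your layout, and note Elasticsearch's data directory is set by `path.data` in elasticsearch.yml):

```shell
# List the largest directories up to 2 levels below /var/lib,
# staying on the current filesystem (-x) and sorting by size in KiB.
du -x -d 2 /var/lib 2>/dev/null | sort -rn | head -n 10
```

Running it as root avoids most permission-denied noise, which the `2>/dev/null` otherwise hides.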
So I guess this is where my logs are stored, but I don't understand much about how they are stored. Is there a way for me to read my logs directly from here?
It's not that I want to avoid Kibana; I do use it. What I actually want is to send a log to another server whenever a certain index in Elasticsearch has a log added, so I thought I could, for example, use Filebeat to retrieve the logs of that index at the source.
It might be easier to capture that before it is indexed, e.g. using Logstash, and send it to the other downstream systems at that point. Otherwise you may need to query for new data periodically using a custom script.
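As a rough illustration of the Logstash route, a pipeline can fan the same event out to Elasticsearch and to another server at ingest time. This is only a sketch under assumptions: the Beats port, the Elasticsearch host, and the destination URL are all placeholders for your setup.

```
input {
  beats { port => 5044 }          # e.g. events shipped by Filebeat
}

output {
  # Keep indexing into Elasticsearch as before
  elasticsearch { hosts => ["localhost:9200"] }

  # Also forward each event to another server as JSON over HTTP
  http {
    url         => "https://other-server.example/ingest"   # hypothetical endpoint
    http_method => "post"
    format      => "json"
  }
}
```

Both outputs receive every event, so nothing changes for Kibana while the copy goes downstream.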
Sorry for the misunderstanding. Yes, I have access to my logs of any type in Kibana. I set up alerts in Kibana via the Security menu and configured them so that when an alert fires, a log is generated and indexed. Now the problem is that I want to send these alert logs elsewhere.
Then run a query periodically through a script against the index where you indexed the alerts.
Have a look at the available language clients and pick the one that works for you. You can also create a shell script that uses curl to run queries against Elasticsearch.
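A minimal curl-based sketch of that kind of polling script might look like this. The index name, host, and polling window are assumptions (here tuned for a cron job every 5 minutes); adjust them, and add authentication if your cluster is secured:

```shell
#!/bin/bash
# Poll Elasticsearch for alert documents indexed in the last 5 minutes.
# ES_HOST and INDEX are hypothetical; replace with your own values.
ES_HOST="http://localhost:9200"
INDEX="alert-logs"

curl -s -X GET "$ES_HOST/$INDEX/_search" \
  -H 'Content-Type: application/json' \
  -d '{
    "query": {
      "range": { "@timestamp": { "gte": "now-5m" } }
    }
  }'
```

Piping the response through `jq` makes it easy to pull out individual fields before forwarding them elsewhere.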
I followed your advice and wrote a bash script, which I run via crontab, making the request with curl. Here is the beginning of the script, which simply retrieves the name of the log: