Where or how can I find the raw logs in ELK?

Hello,

I want to have the raw logs exactly as they are on the client machine.
ELK is my server machine: Elasticsearch, Logstash and Kibana all run on the same machine.
Machine1 is my client machine, sending logs through Filebeat.

To explain further, suppose we are sending the following logs from Machine1:
paths:

  - /var/log/log1.log
  - /var/log/nova/log2.log

How can I get these logs in ELK? I found a related folder:
/var/lib/elasticsearch/elasticsearch/nodes/0/indices/filebeat-2016.05.05
I searched all the folders there, but I could not find the raw logs as they exist on the client side.

My requirement is to get those logs exactly as they are on the client machine, before any indexing or filtering is done by Elasticsearch, so that I have the logs from all servers and can write a script to combine them and mail them.

I got a suggestion to use Curator. If it can actually reproduce the same logs as on the client machine, please let me know whether that is possible, and whether it would require any scripting or could be done with the Curator CLI alone.

TIA

When Logstash processes log files, it does so line by line, and the resulting events are then indexed into Elasticsearch as separate documents. The raw logs are therefore not stored anywhere in the ELK stack.

If you have metadata for each record, e.g. the file name, host, and offset within the file, you could retrieve all entries from a particular log file with a query and then sort them in order to recreate the file. I am, however, not sure whether Logstash supports adding this type of metadata in the file input plugin.
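Since Filebeat is doing the shipping here, the documents should already carry that metadata (the source, offset and beat.hostname fields in Filebeat's default output of that era), so something along the following lines might work. This is only a rough sketch using the elasticsearch-py client; the index pattern, host name, file path and output file name are placeholders, and whether the term filters match depends on how those fields are mapped in your index template.

# Minimal sketch, assuming Filebeat's default fields ("source" = file path,
# "offset" = byte offset, "beat.hostname" = client host), elasticsearch-py
# installed, and Elasticsearch reachable on localhost:9200.
from elasticsearch import Elasticsearch, helpers

es = Elasticsearch("http://localhost:9200")

query = {
    "query": {
        "bool": {
            "filter": [
                {"term": {"beat.hostname": "machine1"}},    # client machine
                {"term": {"source": "/var/log/log1.log"}},  # original file
            ]
        }
    },
    "sort": [{"offset": "asc"}],  # put the lines back into file order
}

# scan() pages through every matching document; preserve_order keeps the sort.
hits = helpers.scan(es, index="filebeat-*", query=query, preserve_order=True)

# Reassemble an approximation of the raw file from the "message" field,
# then write it out so a separate script can combine and mail the files.
with open("machine1-log1.log", "w") as out:
    for hit in hits:
        out.write(hit["_source"]["message"] + "\n")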

Why? Setting up the Elastic stack to process the logs seems like a massive waste if this is all you are going to do with them.

You are correct that this is not something usually done, but the reason behind it is that ELK is doing its job of log monitoring.

We do not want to interrupt that process. It is just that I am not sure whether the Kibana dashboard gives me a way to download all the logs of a specific log file so that they can then be mailed. We want to provide the logs to a different team, and although I tried Kibana with a Chrome plugin, that is not a proper way to do it.

There can be many logs, varying up to thousands of lines per log, from hundreds of client machines.

So if you have a suggestion or know of a way to do this, please let me know.

Thanks in advance.

That still doesn't explain why. If the data is in Elasticsearch and visible in Kibana, then the other teams can simply search for it; there is no need to email thousands of log files.

If all you are after is a way to collect logs, then use rsyslog.
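For example, a minimal rsyslog sketch could forward the raw files to a central box and keep one file per host. The server name, port, file paths and tags below are placeholders, and the exact directives depend on your rsyslog version:

# On Machine1 (e.g. /etc/rsyslog.d/forward.conf): read the files and forward over TCP
module(load="imfile")
input(type="imfile" File="/var/log/log1.log" Tag="log1")
input(type="imfile" File="/var/log/nova/log2.log" Tag="nova-log2")
*.* @@elk-server.example.com:514

# On the central server: accept the stream and write one raw file per host
module(load="imtcp")
input(type="imtcp" port="514")
$template RemoteLogs,"/var/log/remote/%HOSTNAME%/%programname%.log"
*.* ?RemoteLogs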