I created an ELK setup (Elasticsearch, Logstash, Kibana, and Filebeat) for a production environment. It contains 1 server and 5 Filebeat clients. Our developers would like to see some log files from their applications. I used the system module, because no other module is close to their application. The Logs view in Kibana shows the paths to their log files, but no content. I checked the syslog and auth.log files — same story: only the message field, no content. Maybe Elasticsearch works like this? Can you clarify this, please? Or do I have to change something?
Below is what I can see in the Logs stream for one of my Filebeat clients:
So you have collected 475 million log entries. As a next step:
Can you add one of your Filebeat configs?
Are you getting the logs from your application? The "awsapi02" search probably isn't what you want. What happens if you search for not event.dataset:system.log?
The part with /var/log/awsapi/localhost.log looks good. Change the filter not in the highlights but in the KQL field (where it says awsapi02 in your screenshot, top left).
Using properly formatted code was just a general statement. The Filebeat config looked ok.
And the query should be not event.dataset:system.syslog, sorry. I want to exclude the syslog logs so we can see your own logs, though there are multiple ways to do that. Alternatively, filter on log.file.path for the files you're interested in.
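As a sketch, either of these KQL queries typed into the search bar should narrow things down (the exact path prefix is an assumption based on your earlier screenshot):

```
not event.dataset : "system.syslog"
```

```
log.file.path : "/var/log/awsapi/*"
```

The first hides the syslog entries so your own application logs stand out; the second keeps only entries whose source file sits under that directory.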
I don't think /var/log/awsapi/server.log_yyyy-MM-ddTHH-mm-ss will do much, since this should be replaced by the actual timestamp. In your Filebeat config this should probably be /var/log/awsapi/server.log_*.
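For reference, a minimal sketch of what that input section of filebeat.yml might look like with the wildcard (the second path is taken from your localhost.log mention; adjust to your actual files):

```yaml
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      # Glob matches the rotated files, e.g. server.log_2023-01-01T00-00-00
      - /var/log/awsapi/server.log_*
      - /var/log/awsapi/localhost.log
```

That way each newly rotated server.log_&lt;timestamp&gt; file is picked up automatically instead of Filebeat looking for a literal file named with the pattern.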
Can you refresh the page and make sure your search is being applied? Because if there are no search hits, it should look like this: