Analyse the log files

Hi,

I just finished setting up ELK on a log server (one folder for each new issue) and edited my filebeat.yml file to point at the log folder. The next step is to create an index so the logs can be easily searched. Do I need to set up a different filebeat.yml for every new issue, or can I give the path of the root folder so that it automatically creates the index and shows me the results in Kibana? Can you please guide me on how to do this?

Filebeat supports wildcards, so if you have a directory hierarchy you can tell Filebeat to monitor all those files with a single filename pattern.

My Hierarchy is

/log/A/1.log, 2.log, 3.log
/log/B/1.log, 2.log, 3.log

Can you please point me to the documentation or give me an example? I am a newbie to this framework.

Standard wildcard patterns apply, so /log/*/*.log should work for you.

https://www.elastic.co/guide/en/beats/filebeat/current/configuration-filebeat-options.html#_paths
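For reference, a minimal filebeat.yml sketch that picks up both folders with a single glob (the exact top-level key depends on your Filebeat version: newer releases use filebeat.inputs, older ones use filebeat.prospectors; the Elasticsearch host below is just a placeholder):

filebeat.inputs:
  - type: log
    # one glob covers /log/A/*.log, /log/B/*.log and any future issue folder
    paths:
      - /log/*/*.log

output.elasticsearch:
  hosts: ["localhost:9200"]   # replace with your Elasticsearch host:port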

Where do I specify the index, so that I can search for A or B?

Are you going to use Filebeat with Logstash or will you send directly to Elasticsearch?
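For what it's worth, if you ship straight to Elasticsearch you usually don't need a separate index per folder: each event already carries the path of the file it came from (the source field in older Filebeat releases, log.file.path in newer ones), so you can filter on /log/A/ or /log/B/ directly in Kibana. If you want an explicit marker instead, one option is to define one input per folder and attach a custom field; a sketch of that idea, where the field name "issue" is just an example:

filebeat.inputs:
  - type: log
    paths:
      - /log/A/*.log
    fields:
      issue: A        # searchable in Kibana as fields.issue:A by default
  - type: log
    paths:
      - /log/B/*.log
    fields:
      issue: B        # searchable in Kibana as fields.issue:B by default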
