Configure multiple log locations in Filebeat

Hi!

Let's say I need to send logs from a few directories to Elasticsearch, like this:

/var/log/logfolder1/server.log
/var/log/logfolder2/server.log
/var/log/logfolder3/server.log
/var/log/logfolder4/server.log

What should my filebeat.yml look like? Should it be:

filebeat.inputs:
- type: filestream
  id: id1
  enabled: true
  paths:
    - /var/log/logfolder1/server.log

- type: filestream
  id: id2
  enabled: true
  paths:
    - /var/log/logfolder2/server.log

- type: filestream
  id: id3
  enabled: true
  paths:
    - /var/log/logfolder3/server.log

or should just one - type: filestream input be used?

Do you need to have a specific id for each of them?

Not sure, I'm new to ELK... I thought it was needed to identify and define the logs in Kibana later.

Perhaps look at the paths setting and what you can do with it.

A list of glob-based paths that will be crawled and fetched. All patterns supported by Go Glob are also supported here. For example, to fetch all files from a predefined level of subdirectories, the following pattern can be used: /var/log/*/*.log. This fetches all .log files from the subfolders of /var/log.
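So one input with a glob would cover all four directories. A minimal sketch (the id value here is just a placeholder, and logfolder* matches the folder names from your post):

filebeat.inputs:
- type: filestream
  id: all-server-logs
  enabled: true
  paths:
    # one glob instead of four explicit path entries
    - /var/log/logfolder*/server.log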

Hi!

Thank you for the link.

I understand that I can configure it this way. The question is how to figure out later, in Kibana, which line belongs to which log file on the Filebeat host. That's the reason I assigned different IDs; I hoped it would help later on.

I don't use Logstash for now... Filebeat sends data directly to Elasticsearch.

If you want to maintain specific tagging like that, then you will need three separate filestream inputs, as in your example.
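For example (a sketch based on the paths from your post; the tags values are hypothetical and optional, but they give you an extra field to filter on in Kibana):

filebeat.inputs:
- type: filestream
  id: id1
  enabled: true
  paths:
    - /var/log/logfolder1/server.log
  tags: ["logfolder1"]   # optional per-input marker

- type: filestream
  id: id2
  enabled: true
  paths:
    - /var/log/logfolder2/server.log
  tags: ["logfolder2"]

# ...and likewise for logfolder3 and logfolder4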

Filebeat includes the log path and name in each event by default.

IDs are fine too!

"log": {
      "file": {
        "path": "/var/log/jamf.log"
      },
      "offset": 11171602
    },
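In Kibana Discover you can then filter on that field with a KQL query, e.g. (the path value is just an example):

log.file.path : "/var/log/logfolder1/server.log"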

Thank you, gents!

I believe we can close the discussion.

Topics auto-close on their own after 28 days...