I am collecting logs from other servers on a central syslog server using rsyslog.
The result is a directory tree whose sub-directories are named after the IP address of the server the logs came from.
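For reference, the rsyslog side looks roughly like the following. The path `/var/log/remote` and the template name are illustrative, not my exact setup:

```
# /etc/rsyslog.d/remote.conf -- illustrative sketch, actual paths differ
# Write each remote host's logs into a sub-directory named after its IP.
template(name="PerHostFile" type="string"
         string="/var/log/remote/%fromhost-ip%/%programname%.log")

ruleset(name="remote") {
    action(type="omfile" dynaFile="PerHostFile")
}

module(load="imudp")
input(type="imudp" port="514" ruleset="remote")
```

So the files end up under paths like `/var/log/remote/192.0.2.10/sshd.log` (example IP).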
I have Filebeat installed on the receiving server and have verified that it collects the local logs just fine; however, no matter what I do, Filebeat starts running but doesn't ingest any of the logs under the subdirectories I specified.
The file names under the subdirectories are unchanged from when they were pulled over to the destination directory, so I am not sure what is going on.
I am not sure why it is not working and need some help with this; a working example configuration would be especially helpful.
I also need to make sure that these logs are loaded into Elasticsearch identified not only by the IP but also by the hostname of the server they originated from.
My expectation was that Filebeat would load the files once I pointed var.paths at their location, but nothing is happening.
Note: I set the path in filebeat.yml as well as var.paths in system.yml and auditd.yml.
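This is roughly what I have in the system module config; the base path is illustrative, and the glob is my attempt to match every per-IP subdirectory:

```yaml
# modules.d/system.yml -- a sketch of what I tried, base path is illustrative
- module: system
  syslog:
    enabled: true
    var.paths:
      # one glob level per IP-named sub-directory
      - /var/log/remote/*/messages
      - /var/log/remote/*/*.log
  auth:
    enabled: true
    var.paths:
      - /var/log/remote/*/secure
```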
The perfect solution would sweep across all the subdirectories, load all the files, and set the correct IP and hostname on every document created for a particular host.
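I imagine something along these lines could work, using Filebeat processors to pull the IP out of the file path and then do a reverse lookup for the hostname. This is only a sketch under my assumed `/var/log/remote` layout, and it assumes a resolver that can answer reverse queries for these IPs:

```yaml
# filebeat.yml -- sketch only; field names and paths are assumptions
processors:
  # Extract the per-host IP from the directory component of the file path.
  - dissect:
      tokenizer: "/var/log/remote/%{source.ip}/%{file.basename}"
      field: "log.file.path"
      target_prefix: ""
  # Reverse-resolve that IP to a hostname (requires a working PTR record).
  - dns:
      type: reverse
      fields:
        source.ip: source.hostname
```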
I really would prefer to use Beats and the built-in templates, mappings, etc. rather than go down the road of using Logstash and having to cobble together custom templates and mappings.
I would have expected a working example of using Filebeat to ingest logs deposited by rsyslog on a centralized server to exist already, but I have had particular difficulty finding one.
Any information you can provide would be helpful.
Awaiting your reply
Thanks in advance