Filebeat and multiple files

Hi. I have a requirement to pull in multiple files from the same host, but in Logstash they need to follow different input/filter/output paths.
I was going to set up two Filebeat instances on this Unix host, but that doesn't seem very efficient.

Is there another 'easier' way to do this?


Why not use document_type or fields, and then filter on those in Logstash?
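A minimal sketch of that approach, assuming the type values local_syslog and syslog are set per prospector in Filebeat (the filter bodies and index names here are illustrative placeholders, not from the thread):

```
input {
  beats {
    port => 5044
  }
}

filter {
  if [type] == "local_syslog" {
    # filters for the rsyslog file go here
  } else if [type] == "syslog" {
    # filters for the other syslog paths go here
  }
}

output {
  if [type] == "local_syslog" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "local_syslog-%{+YYYY.MM.dd}"
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "logs-%{+YYYY.MM.dd}"
    }
  }
}
```

With this layout both files flow through one Filebeat and one Logstash pipeline, and the type field decides which filter and output branch each event takes.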

Hi. I set up another prospector in /etc/filebeat/filebeat.yml and can see it in Kibana.

Below is what I did in the config file:

prospectors:
  # Each - is a prospector. Below are the prospector specific configurations
  - paths:
      - /var/log/rsyslog.log.file
    input_type: rsyslog.log01
  - paths:
      - /var/log/*.log
      - /var/log/syslog
      - /var/log/apt/*
    input_type: syslog

and here is an extract of the Kibana JSON:
"_index": "logs-2016.08.07",
"_type": "logs-log02",
"_id": "AVZkjypyu9iLKWAc6aqV",
"_score": null,
"_source": {
  "message": "2016-08-07T20:30:54.641225+10:00 137: AP:e8b7.48de.05fb: *Aug 7 10:30:53.529: %WIDS-6-ENABLED: IDS Signature is loaded and enabled",
  "@version": "1",
  "@timestamp": "2016-08-07T10:31:59.273Z",
  "path": "/var/log/rsyslog.log.file",
  "host": "log02",
  "type": "logs-log02"
}

Can I somehow name the index 'local_syslog' so I can create an index pattern for it?

As documented here, input_type supports only two possible values: log and stdin. Try using document_type or fields (as previously suggested by warkolm).
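Along those lines, a corrected version of the prospector config above might look like the following (the document_type names are illustrative; the Elasticsearch index name itself is chosen on the Logstash output side, not in Filebeat):

```yaml
prospectors:
  - paths:
      - /var/log/rsyslog.log.file
    input_type: log              # only "log" or "stdin" are valid here
    document_type: local_syslog  # becomes the "type" field on each event
  - paths:
      - /var/log/*.log
      - /var/log/syslog
      - /var/log/apt/*
    input_type: log
    document_type: syslog
```

Events from the first prospector then arrive in Logstash with type set to local_syslog, which a conditional can use to route them to their own filters and their own index.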

Thanks, that worked.

This topic was automatically closed after 21 days. New replies are no longer allowed.