I have 100 log files and I want to read those files, add some fields to them, and send them to Logstash.
If I create, say, 100 different prospectors all in one config file, it becomes very difficult to manage. Therefore, I want to know if there is any functionality in Filebeat to include multiple config files inside the main config file, something like rsyslog does?
Filebeat supports glob patterns for configuring multiple files in one prospector. The `paths` setting is an array, so it can take multiple glob patterns.
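For example, a minimal sketch (the paths below are made up for illustration):

```yaml
filebeat:
  prospectors:
    -
      paths:
        # one prospector can watch many files via globs;
        # both patterns feed the same prospector
        - /var/log/myapp/*.log
        - /var/log/otherapp/*/*.log
      document_type: log
```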
Do you plan to add the very same fields to all files?
What does your directory/naming layout look like?
No fields in all the log files will be different. I will write the data into Kafka from Filebeat; say, for 100 logs I want to create 100 different topics in Kafka. So the Filebeat config file will become very difficult to understand, as everything will be written in the same config file. I want to break one big config file into multiple small config files, similar to what happens in rsyslog.
With no fields being different, you will need only one prospector.
How exactly is your directory laid out, and how do you plan to select the Kafka topic name?
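If the fields really are identical, a single prospector with one broad glob can cover all the directories. A minimal sketch, assuming a layout like the one shown later in this thread (paths and the field value are illustrative):

```yaml
filebeat:
  prospectors:
    -
      paths:
        # one glob matching every per-host log directory
        - /rescue/dataLogging/*/*/*.log
      document_type: log
      fields:
        type: common_logdata   # hypothetical shared field
      fields_under_root: true
```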
All Beats support multiple `-c` flags for combining multiple configurations. Plus, Filebeat can load additional prospector configurations via the `filebeat.config_dir` setting.
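For example, you can pass several files on the command line (e.g. `filebeat -c main.yml -c extra.yml`), or point the main config at a directory of prospector configs. A minimal sketch of the latter; the `conf.d` path is made up:

```yaml
# main filebeat.yml
filebeat:
  prospectors:
    -
      paths:
        - /var/log/base/*.log
  # load additional prospector configs from every .yml file in this directory
  config_dir: /etc/filebeat/conf.d
```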
Sorry, by that statement I meant: no, all the log files have different patterns. They are all my custom application log files.
Also, below is the Filebeat config file where I read from multiple paths and create a Kafka topic for each of the log files.
```yaml
filebeat:
  prospectors:
    -
      paths:
        - /rescue/dataLogging/1.2.3.4/App__access/*.log
      document_type: log
      multiline:
        pattern: '^[JFMASOND]'
        negate: true
        match: after
      fields:
        type: abc14_logdata
      fields_under_root: true
    ############################ file 2 #########################
    -
      paths:
        - /rescue/dataLogging/2.3.4.5/Log_data_check/*.log
      document_type: log
      fields:
        type: xyz14_logdata_2
      fields_under_root: true
    ############################ file 3 #########################
    -
      paths:
        - /rescue/dataLogging/3.4.5.6/TAG/*.log
      document_type: log
      fields:
        type: def32_logdata
      fields_under_root: true
    ########################## file 4 ###################
    # ... and so on ...

##################### output ############################
output:
  kafka:
    hosts: ["192.168.0.1:9092"]
    topic: "%{[type]}"
    partition.round_robin:
      reachable_only: false
```
My problem: if I have 100 files, do I need to write all 100 paths right here in one config file? It will be so messy.
If you're using 100 different topics for these 100 different paths, and without me seeing a general rule to filter on (the `output.kafka.topics` setting supports conditionals), then you have to create one prospector per file/directory.
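The conditional form of the setting looks roughly like this; the topic names and the `contains` rule are made-up placeholders, just to show the shape:

```yaml
output:
  kafka:
    hosts: ["192.168.0.1:9092"]
    topic: "default_logs"        # fallback when no rule below matches
    topics:
      - topic: "access_logs"     # hypothetical topic name
        when:
          contains:
            type: "access"       # hypothetical routing rule on the type field
```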
As I said already, Filebeat can load additional prospector configurations via `filebeat.config_dir`. There is no need to have only one config file; you can have one config file per prospector.
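A sketch of how that could look for the config above; the file name and `conf.d` path are assumptions. Note that the Filebeat docs of that era say each file in `config_dir` still needs the full `filebeat.prospectors` hierarchy, even though only the prospector part is read:

```yaml
# /etc/filebeat/conf.d/app_access.yml  (hypothetical file)
filebeat:
  prospectors:
    -
      paths:
        - /rescue/dataLogging/1.2.3.4/App__access/*.log
      document_type: log
      fields:
        type: abc14_logdata
      fields_under_root: true
```

With the main config pointing `config_dir` at `/etc/filebeat/conf.d` (as sketched earlier), each of the 100 prospectors can live in its own small file.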