Hi,
I am monitoring a log directory with Filebeat. This directory has several log files. How can I send a specific log file to a specific index in Elasticsearch when going through Logstash?
Filebeat settings:
filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - C:\DCAlogs\dca.process-*.log

output.logstash:
  hosts: ["localhost:5044"]
Example of log file names:
dca.process-aTob.log
dca.process-errorlist.log
dca.process-test_log.log
In my current flow, they will all go to the same index.
I'm a noob to the ELK world, but I think best practice would be to save each in its own index.
How can I do that?
In Logstash you can configure the index name to be dynamic, based on the event's contents. See the Set up Logstash docs. The @metadata field is published by Filebeat with the event: it contains the beat name, a type field, and the beat version. But you are free to use any field you want.
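A minimal sketch of a Logstash output that builds the index name from @metadata (the hosts value and date pattern are assumptions; adjust to your setup):

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # index name built from fields Filebeat ships in @metadata,
    # e.g. "filebeat-6.2.4-2018.05.01"
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}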
In Logstash you can also filter on tags or fields. E.g. the source field contains the filename. But you can also set up a different prospector per file type and add tags or extra fields, then use those to choose the index name in Logstash, as in the sketch below.
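A sketch of the per-prospector approach, assuming a hypothetical custom field named log_type (by default Filebeat nests custom fields under fields.*):

On the Filebeat side:

filebeat.prospectors:
- type: log
  paths:
    - C:\DCAlogs\dca.process-errorlist.log
  fields:
    log_type: errorlist
- type: log
  paths:
    - C:\DCAlogs\dca.process-aTob.log
  fields:
    log_type: atob

output.logstash:
  hosts: ["localhost:5044"]

And on the Logstash side:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # one index per prospector, e.g. "dca-errorlist-2018.05.01"
    index => "dca-%{[fields][log_type]}-%{+YYYY.MM.dd}"
  }
}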
Hi,
I don't know how to access the source field. I've tried:
mutate {
  add_field => { "filename" => [source] }
}

and

mutate {
  add_field => {[fields][source]}
}

and

mutate {
  add_field => {"%{[fields][source]}"}
}
Nothing seems to work.
Can you please provide an example of creating an index with a filename reference, or even better, adding the file name as a new field?
thanks,
David
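For reference, a sketch of syntax that should work here, assuming Filebeat 6.x, where the full path is shipped in the top-level source field (not [fields][source]); the field names filename and filename_only are just illustrations:

filter {
  # copy the full path Filebeat reports (e.g. C:\DCAlogs\dca.process-aTob.log)
  # into a new field
  mutate {
    add_field => { "filename" => "%{source}" }
  }
  # or keep only the file name itself by stripping the directory part
  grok {
    match => { "source" => "(?<filename_only>[^\\/]+)$" }
  }
  # Elasticsearch index names must be lowercase
  mutate {
    lowercase => ["filename_only"]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # one index per source file, e.g. "dca-dca.process-atob.log-2018.05.01"
    index => "dca-%{filename_only}-%{+YYYY.MM.dd}"
  }
}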