I am trying to send various types of logs through Filebeat -> Logstash -> Elasticsearch -> Kibana.
I used the fields option in Filebeat with a variable log_type and assigned it a different value based on the type of log before sending it to Logstash. In the output section of the Logstash pipeline that sends the data to Elasticsearch, I am unable to use those fields set in Filebeat to build the index name.
Can someone show a sample of how this can be done, please?
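For context, the relevant part of my filebeat.yml looks roughly like the sketch below (the paths and the log_type values are placeholders, assuming a recent Filebeat version that uses filebeat.inputs):

    filebeat.inputs:
      - type: log
        paths:
          - /var/log/apache/*.log   # placeholder path
        fields:
          log_type: apache          # custom field shipped with every event
      - type: log
        paths:
          - /var/log/app/*.log      # placeholder path
        fields:
          log_type: app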
filter {
  grok {
    match => ["message", "%{TIMESTAMP_ISO8601:timestamp}"]
  }
  date {
    match => ["timestamp", "ISO8601"]
  }
}
output {
  elasticsearch {
    hosts => ["xxxx"]
    index => "%{[@metadata][fields]}-%{[@metadata][log_type]}" # HOW TO REFER THE FIELDS FROM FILEBEAT TO CREATE A SEPARATE INDEX?
  }
}
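As a point of reference, here is a minimal sketch of an output block that references the Filebeat field, assuming the log_type value defined under fields: in filebeat.yml (without fields_under_root: true) arrives in Logstash as [fields][log_type]:

    output {
      elasticsearch {
        hosts => ["xxxx"]
        # custom Filebeat fields live under [fields] unless fields_under_root: true is set,
        # in which case the reference would be %{[log_type]} instead
        index => "%{[fields][log_type]}-%{+YYYY.MM.dd}"
      }
    }

The %{+YYYY.MM.dd} suffix just splits each log_type into daily indices; drop it if a single index per type is enough.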
All but the first of these are writing to an ILM alias, but you have to define the ILM parts yourself, including creating the initial empty index manually first.
This is an Ansible template, so consider {{ vars }} "sanitized". The logic for [agent][module] and [fileset][module] is to accommodate breaking changes (thanks, Elastic).
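If it helps, a minimal sketch of that manual bootstrap step, assuming a rollover alias named my-logs (the index name, alias, and policy are placeholders, not taken from my template):

    PUT my-logs-000001
    {
      "aliases": {
        "my-logs": {
          "is_write_index": true
        }
      }
    }

The Logstash elasticsearch output then points at the alias via the ilm_enabled, ilm_rollover_alias, ilm_pattern, and ilm_policy options instead of a hard-coded index name.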