Multiple Filebeat inputs with Logstash output

There might be many issues. The most obvious ones I see are:

 if [tag] == "TE" {
   ...
}

The field [tag] does not even exist, so the grok filter will never be executed because of this condition.

else  if [tag] == "TMRS" {
    ...
}

Same issue: [tag] does not exist.

But you are writing tags in Filebeat. Changing the condition to if "TE" in [tags], or to if [fields][log_type] == "te", should fix the conditionals (see the sketch below).
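A minimal sketch of the corrected filter block, assuming your Filebeat inputs tag events with "TE" and "TMRS"; the grok patterns are placeholders, keep your own:

filter {
  # match against the tags list written by Filebeat, not a non-existent [tag] field
  if "TE" in [tags] {
    grok {
      # placeholder pattern; put your original TE pattern here
      match => { "message" => "%{GREEDYDATA:te_raw}" }
    }
  } else if "TMRS" in [tags] {
    grok {
      # placeholder pattern; put your original TMRS pattern here
      match => { "message" => "%{GREEDYDATA:tmrs_raw}" }
    }
  }
}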

output {
  elasticsearch {
    hosts => ["192.168.0.159:9200"]
    manage_template => false
    index => "%{tag}-index"
  }
}

The index setting is wrong because:

  • the field tag does not exist
  • the syntax is wrong; for field access it should be index => "%{[tag]}-index"
  • if you were to use tags instead, it would not be a string but a list, producing an invalid index name

As you already have fields.log_type configured in Filebeat, I assume you want:

output {
  elasticsearch {
    hosts => ["192.168.0.159:9200"]
    manage_template => false
    index => "%{[fields][log_type]}-index"
  }
}

Please note that index names without a timestamp are not recommended; you will have a hard time deleting old data when you are about to run out of disk space.
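For example, Logstash's sprintf date syntax gives you daily indices that are easy to drop individually (a sketch based on the output above; the sample index name is illustrative):

output {
  elasticsearch {
    hosts => ["192.168.0.159:9200"]
    manage_template => false
    # daily indices such as te-2024.01.15 can be deleted one by one when disk runs low
    index => "%{[fields][log_type]}-%{+YYYY.MM.dd}"
  }
}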

Since you already have log_type, I don't see why you also need to configure tags in Filebeat, but it doesn't really hurt to do so. For reference, see the sketch below.
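This is a sketch of what I assume your Filebeat inputs look like with fields.log_type set per input; the paths are hypothetical:

filebeat.inputs:
  - type: log
    paths:
      - /var/log/te/*.log     # hypothetical path
    fields:
      log_type: te
  - type: log
    paths:
      - /var/log/tmrs/*.log   # hypothetical path
    fields:
      log_type: tmrs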

No idea about the grok and kv filters; I'm no grok debugger myself. But have you tried a grok debugger, like the one that comes with Kibana?