I'm using the file input to ingest consul-template logs:
input {
  file {
    type => "consul-template"
    path => ["/var/log/consul-template.log"]
    add_field => { "name" => "consul-template" }
  }
}
Every log message gets tagged with _grokparsefailure -- but I'm not using grok anywhere. What could be causing that?
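For context, as far as I can tell the _grokparsefailure tag is only ever added by a grok filter whose patterns didn't match, so I went looking for stray filter files in conf.d. A sketch of the kind of guard that would stop a shared grok filter from tagging unrelated events (the `syslog` type and pattern here are hypothetical, just for illustration):

```
filter {
  # Only run grok on the event types it was written for;
  # an unconditional grok tags every non-matching event
  # with _grokparsefailure.
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGLINE}" }
    }
  }
}
```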
The rest of the pipeline on the host with the logs:
output {
  rabbitmq {
    host => "rabbitmq.service.ops.consul"
    port => {{rabbitmq_port}}
    user => "{{elk_rabbitmq_logstash_user}}"
    password => "{{elk_rabbitmq_logstash_password}}"
    vhost => "logstash"
    exchange => "logs_json"
    exchange_type => "direct"
    durable => true
  }
}
The pipeline on the ingest node:
input {
  rabbitmq {
    host => "localhost"
    port => {{rabbitmq_port}}
    user => "{{elk_rabbitmq_logstash_user}}"
    password => "{{elk_rabbitmq_logstash_password}}"
    vhost => "logstash"
    exchange => "logs_json"
    queue => "logs_json"
    threads => 3
    codec => json
  }
}
output {
  elasticsearch {
    hosts => "{{ elasticsearch_consul_service }}"
    workers => 6
    manage_template => true
    template => "{{logstash_es_templates_dir}}/es-logstash-mappings-template.json"
    template_name => "logstash-vimana"
    template_overwrite => true
  }
}
Could it be the codec => json in the rabbitmq input?
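My understanding (which may be wrong) is that the json codec tags parse failures with _jsonparsefailure, not _grokparsefailure, so a filter like this on the ingest node should reveal whether the codec is the problem (the `parse_status` field name is just something I made up for this check):

```
filter {
  # The json codec marks events it couldn't decode with
  # _jsonparsefailure; if that tag never appears, the codec
  # probably isn't what is adding _grokparsefailure.
  if "_jsonparsefailure" in [tags] {
    mutate { add_field => { "parse_status" => "bad_json" } }
  }
}
```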