Filebeat not sending any logs

Hello,

My Filebeat instance is not sending any logs and isn't giving me any indication of why.

Here is my filebeat.yml:

filebeat.prospectors:
- type: log
  enabled: false
  tags: ["DOCKER"]
  paths:
    - /var/log/messages
  fields:
    cliente: "TJSP"
    servidor: "DES"
  fields_under_root: true
  exclude_lines: ["STDOUT","STDERR"]

output.logstash:
  # The Logstash hosts
  hosts: ["xxx.xx.x.xx:5043"]

logging.level: debug
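
(Side note for anyone debugging the same symptom: as far as I know, Filebeat 6.x ships test subcommands that validate the config file and the connection to the configured Logstash output. They won't explain missing events by themselves, but they rule out syntax and connectivity problems. The -c path assumes the package-install layout shown in the logs below and the default filebeat.yml name.)

filebeat test config -c /etc/filebeat/filebeat.yml
filebeat test output -c /etc/filebeat/filebeat.yml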

And here are the logs it's giving me:

2018-01-03T09:59:46-02:00 DBG  [log] Disable stderr logging
2018-01-03T09:59:46-02:00 INFO Home path: [/usr/share/filebeat] Config path: [/etc/filebeat] Data path: [/var/lib/filebeat] Logs path: [/var/log/filebeat]
2018-01-03T09:59:46-02:00 DBG  [beat] Beat metadata path: /var/lib/filebeat/meta.json
2018-01-03T09:59:46-02:00 INFO Metrics logging every 30s
2018-01-03T09:59:46-02:00 INFO Beat UUID: 6a02bb1f-aee5-4d3d-8b03-e79f01d06aed
2018-01-03T09:59:46-02:00 INFO Setup Beat: filebeat; Version: 6.1.0
2018-01-03T09:59:46-02:00 DBG  [beat] Initializing output plugins
2018-01-03T09:59:46-02:00 DBG  [processors] Processors:
2018-01-03T09:59:46-02:00 DBG  [publish] start pipeline event consumer
2018-01-03T09:59:46-02:00 INFO Beat name: server2225
2018-01-03T09:59:46-02:00 INFO filebeat start running.
2018-01-03T09:59:46-02:00 INFO Registry file set to: /var/lib/filebeat/registry
2018-01-03T09:59:46-02:00 INFO Loading registrar data from /var/lib/filebeat/registry
2018-01-03T09:59:46-02:00 INFO States Loaded from registrar: 0
2018-01-03T09:59:46-02:00 WARN Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
2018-01-03T09:59:46-02:00 INFO Loading Prospectors: 1
2018-01-03T09:59:46-02:00 INFO Starting Registrar
2018-01-03T09:59:46-02:00 DBG  [cfgfile] Checking module configs from: /etc/filebeat/modules.d/*.yml
2018-01-03T09:59:46-02:00 DBG  [cfgfile] Number of module configs found: 0
2018-01-03T09:59:46-02:00 INFO Loading and starting Prospectors completed. Enabled prospectors: 0
2018-01-03T09:59:46-02:00 INFO Config reloader started
2018-01-03T09:59:46-02:00 DBG  [cfgfile] Scan for new config files
2018-01-03T09:59:46-02:00 DBG  [cfgfile] Number of module configs found: 0
2018-01-03T09:59:46-02:00 INFO Loading of config files completed.
2018-01-03T10:00:16-02:00 INFO Non-zero metrics in the last 30s: beat.info.uptime.ms=30199 beat.memstats.gc_next=4194304 beat.memstats.memory_alloc=1290880 beat.memstats.memory_total=3102368 filebeat.harvester.open_files=0 filebeat.harvester.running=0 libbeat.config.module.running=0 libbeat.config.reloads=1 libbeat.output.type=logstash libbeat.pipeline.clients=0 libbeat.pipeline.events.active=0 registrar.states.current=0
2018-01-03T10:00:46-02:00 INFO Non-zero metrics in the last 30s: beat.info.uptime.ms=29999 beat.memstats.gc_next=4194304 beat.memstats.memory_alloc=1304496 beat.memstats.memory_total=3115984 filebeat.harvester.open_files=0 filebeat.harvester.running=0 libbeat.config.module.running=0 libbeat.pipeline.clients=0 libbeat.pipeline.events.active=0 registrar.states.current=0
2018-01-03T10:01:16-02:00 INFO Non-zero metrics in the last 30s: beat.info.uptime.ms=30000 beat.memstats.gc_next=4194304 beat.memstats.memory_alloc=1318656 beat.memstats.memory_total=3130144 filebeat.harvester.open_files=0 filebeat.harvester.running=0 libbeat.config.module.running=0 libbeat.pipeline.clients=0 libbeat.pipeline.events.active=0 registrar.states.current=0
2018-01-03T10:01:46-02:00 INFO Non-zero metrics in the last 30s: beat.info.uptime.ms=30000 beat.memstats.gc_next=4194304 beat.memstats.memory_alloc=1327984 beat.memstats.memory_total=3139472 filebeat.harvester.open_files=0 filebeat.harvester.running=0 libbeat.config.module.running=0 libbeat.pipeline.clients=0 libbeat.pipeline.events.active=0 registrar.states.current=0

And here is my logstash.conf, just so you know I'm not filtering out any logs:

input {
    beats {
        port => "5043"
    }
}
filter {
  if "DOCKER" not in [tags] {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:log_timestamp}%{SPACE}?%{LOGLEVEL:loglevel:string}%{SPACE}?\[(?<java_class>([A-z]|\.|[0-9])*)\]%{SPACE}?(?<message>(.|\r|\n)*)" }
      overwrite => ["message"]
    }
    date {
      match => [ "log_timestamp", "yyyy-MM-dd HH:mm:ss,SSS", "yyyy-MM-d HH:mm:ss,SSS", "yyyy-MM-d HH:mm:ss.SSS"]
    }
    grok {
      match => { "message" => "\.(?<java_exception>([A-z]|[0-9])*Exception):(.|\r|\n)*" }
    }
     mutate {
      remove_field => ["log_timestamp", "beat", "@version", "prospector"]
      remove_tag => ["beats_input_codec_plain_applied", "_grokparsefailure" ]
    }
  } else {
    mutate {
      add_field => { "message" => "Erro ao transformar log, por favor veja o campo 'log'" }
    }
    grok {
      match => { "log" => "%{TIME:log_timestamp}%{SPACE}?%{LOGLEVEL:loglevel:string}%{SPACE}?\[(?<java_class>([A-z]|\.|[0-9])*)\]%{SPACE}?(?<message>(.|\r|\n)*)" }
      overwrite => ["message"]
      add_tag => ["JBOSS4"]
    }
    grok {
      match => { "message" => "\.(?<java_exception>([A-z]|[0-9])*Exception):(.|\r|\n)*" }
    }
     mutate {
      remove_field => ["log_timestamp", "beat", "@version", "prospector", "stream", "time"]
      remove_tag => ["beats_input_codec_plain_applied", "beats_input_raw_event" ]
    }
  }
}
output {
    elasticsearch {
      hosts => [ "localhost:9200" ]
    }
}
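
(In case it helps: a sketch of how I'd check whether events are reaching Logstash at all, assuming a stock Logstash 6.x install. Temporarily adding a stdout output prints every event the beats input receives, so you can tell whether the problem is on the Filebeat side or inside the pipeline; running bin/logstash -f logstash.conf --config.test_and_exit also confirms the config parses.)

output {
    # temporary, debugging only: dump every incoming event to the console
    stdout { codec => rubydebug }
    elasticsearch {
      hosts => [ "localhost:9200" ]
    }
}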

I'm a bit lost on where to look for the reason nothing is coming through. Any help would be much appreciated.

How can I delete this thread?

Just as I posted this, I saw the

enabled: false

Holy, I must be blind
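
For anyone who lands here later: with that flag set to false the prospector never starts, which is exactly what the log shows ("Enabled prospectors: 0") and why the harvester counters stay at zero. A trimmed sketch of the prospector above with the flag flipped:

filebeat.prospectors:
- type: log
  enabled: true      # was false; with false the prospector never starts
  paths:
    - /var/log/messages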
