Filebeat is not sending logs to Logstash

I'm trying to read Tomcat log files with Filebeat, send them to Logstash, format them there, and forward them to Elasticsearch.

Problem: I'm pretty sure the logs never reach Logstash, because they never appear on Logstash's stdout output.

versions:
filebeat: 7.10.0
logstash: 7.10.0
elasticsearch: 7.10.0
kibana: 7.10.0

filebeat.yml:

filebeat.config:
  modules:
    path: ${path.config}/modules.d/*.yml
    reload.enabled: false

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - "/opt/logs/tomcat/catalina.out"
  multiline.type: pattern
  multiline.pattern: '^[[:space:]]+(at|\.{3})[[:space:]]+\b|^Caused by:'
  multiline.negate: false
  multiline.match: after

#----------------------------- Logstash output --------------------------------
output.logstash:
  enabled: true
  hosts: ["http://logstash:5144"]

setup.kibana:
  host: "http://kibana:5601"

logstash.conf:

input {
  beats {
    port => 5144
    type => tomcat
  }
}

filter {
  grok {
    patterns_dir => ["./patterns"]
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{THREAD:thread} %{LOGLEVEL:level} %{JAVALOGMESSAGE:message}" }
  }

  date {
    match => [ "timestamp" , "yyyy-MM-dd HH:mm:ss, Z" ]
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    #user => "elastic"
    #password => "changeme"
  }
  stdout { codec => rubydebug }
}
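
One way to rule the filter in or out independently of Filebeat is to run it against a stdin input in a throwaway pipeline (a sketch; test.conf is a hypothetical file name, and the filter block is copied unchanged from above):

input { stdin { } }

filter {
  # the grok/date block from logstash.conf above, unchanged
  grok {
    patterns_dir => ["./patterns"]
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{THREAD:thread} %{LOGLEVEL:level} %{JAVALOGMESSAGE:message}" }
  }

  date {
    match => [ "timestamp" , "yyyy-MM-dd HH:mm:ss, Z" ]
  }
}

output { stdout { codec => rubydebug } }

Pasting a sample catalina.out line into bin/logstash -f test.conf then shows immediately whether it parses or comes back tagged _grokparsefailure.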

Filebeat logs:

2020-11-18T13:58:38.733Z	INFO	instance/beat.go:299	Setup Beat: filebeat; Version: 7.10.0
2020-11-18T13:58:38.735Z	INFO	[publisher]	pipeline/module.go:113	Beat name: 6897ba41241e
2020-11-18T13:58:38.737Z	WARN	beater/filebeat.go:178	Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
2020-11-18T13:58:38.737Z	INFO	[monitoring]	log/log.go:118	Starting metrics logging every 30s
2020-11-18T13:58:38.738Z	INFO	instance/beat.go:455	filebeat start running.
2020-11-18T13:58:38.739Z	INFO	memlog/store.go:119	Loading data file of '/usr/share/filebeat/data/registry/filebeat' succeeded. Active transaction id=0
2020-11-18T13:58:38.739Z	INFO	memlog/store.go:124	Finished loading transaction log file for '/usr/share/filebeat/data/registry/filebeat'. Active transaction id=0
2020-11-18T13:58:38.739Z	WARN	beater/filebeat.go:381	Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
2020-11-18T13:58:38.740Z	INFO	[registrar]	registrar/registrar.go:109	States Loaded from registrar: 0
2020-11-18T13:58:38.740Z	INFO	[crawler]	beater/crawler.go:71	Loading Inputs: 1
2020-11-18T13:58:38.741Z	INFO	log/input.go:157	Configured paths: [/opt/logs/tomcat/catalina.out]
2020-11-18T13:58:38.741Z	INFO	[crawler]	beater/crawler.go:141	Starting input (ID: 16654394220544737438)
2020-11-18T13:58:38.742Z	INFO	[crawler]	beater/crawler.go:108	Loading and starting Inputs completed. Enabled inputs: 1
2020-11-18T13:58:38.742Z	INFO	cfgfile/reload.go:164	Config reloader started
2020-11-18T13:58:38.742Z	INFO	cfgfile/reload.go:224	Loading of config files completed.

Logstash logs:

OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.jruby.ext.openssl.SecurityHelper (file:/tmp/jruby-1/jruby7971926842528329702jopenssl.jar) to field java.security.MessageDigest.provider
WARNING: Please consider reporting this to the maintainers of org.jruby.ext.openssl.SecurityHelper
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Sending Logstash logs to /usr/share/logstash/logs which is now configured via log4j2.properties
[2020-11-18T13:59:13,734][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.10.0", "jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc OpenJDK 64-Bit Server VM 11.0.8+10 on 11.0.8+10 +indy +jit [linux-x86_64]"}
[2020-11-18T13:59:17,262][INFO ][org.reflections.Reflections] Reflections took 60 ms to scan 1 urls, producing 23 keys and 47 values 
[2020-11-18T13:59:17,883][WARN ][deprecation.logstash.outputs.elasticsearch] Relying on default value of `pipeline.ecs_compatibility`, which may change in a future major release of Logstash. To avoid unexpected changes when upgrading Logstash, please explicitly declare your desired ECS Compatibility mode.
[2020-11-18T13:59:18,641][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elasticsearch:9200/]}}
[2020-11-18T13:59:18,869][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://elasticsearch:9200/"}
[2020-11-18T13:59:18,929][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2020-11-18T13:59:18,932][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-11-18T13:59:19,010][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://elasticsearch:9200"]}
[2020-11-18T13:59:19,109][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
[2020-11-18T13:59:19,192][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-11-18T13:59:19,394][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["/usr/share/logstash/pipeline/logstash-tomcat.conf", "/usr/share/logstash/pipeline/logstash.conf"], :thread=>"#<Thread:0x4757915b run>"}
[2020-11-18T13:59:20,511][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>1.11}
[2020-11-18T13:59:20,537][INFO ][logstash.inputs.beats    ][main] Beats inputs: Starting input listener {:address=>"0.0.0.0:5144"}
[2020-11-18T13:59:20,552][INFO ][logstash.inputs.beats    ][main] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2020-11-18T13:59:20,561][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2020-11-18T13:59:20,698][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-11-18T13:59:20,707][INFO ][org.logstash.beats.Server][main][5c13331495d6e734f58a9a52e7694c6e3030f5c9ed203cdd14585cb543632189] Starting server on port: 5044
[2020-11-18T13:59:20,707][INFO ][org.logstash.beats.Server][main][d981b5044e1ad829a2782aefd44208b4d423d52c2a23c15729cd5867eade2306] Starting server on port: 5144
[2020-11-18T13:59:21,034][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
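
The last log line shows the Logstash API endpoint listening on port 9600; its pipeline statistics are another way to check whether any events from Filebeat arrive at all:

curl -s 'http://localhost:9600/_node/stats/pipelines?pretty'

If events.in for the main pipeline stays at 0, nothing is reaching Logstash and the problem sits on the Filebeat side.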

Everything runs in Docker, and all the .yml and .conf files are mounted into the containers, as are the log directories.

I've run out of ideas; any help would be much appreciated.

I found the problem: the input path was wrong, which also explains why the Filebeat log above never shows a "Harvester started" line for catalina.out.

It was:

paths:
  - "/opt/logs/tomcat/catalina.out"

but it should have been:

paths:
    - "/usr/share/filebeat/opt/logs/tomcat/catalina.out"

because the host log directory is mounted into the Filebeat container, so Filebeat has to use the path inside the container.
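
For context, a mount along these lines produces that layout (service name and host path are illustrative, not copied from my setup):

services:
  filebeat:
    image: docker.elastic.co/beats/filebeat:7.10.0
    volumes:
      # host path on the left, container path on the right;
      # filebeat.yml must reference the container path
      - /opt/logs/tomcat:/usr/share/filebeat/opt/logs/tomcat:ro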

But now I get a "_grokparsefailure" tag.

My logs look like:

2020-11-19 11:34:40,260  [thread0-exec-1] WARN  org.springframework.web.servlet.PageNotFound - Request method 'GET' not supported

It should match this filter from my logstash.conf:

filter {
  grok {
    patterns_dir => ["./patterns"]
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{THREAD:thread} %{LOGLEVEL:level} %{JAVALOGMESSAGE:message}" }
  }

  date {
    match => [ "timestamp" , "yyyy-MM-dd HH:mm:ss, Z" ]
  }
}

together with the custom pattern THREAD \[.*\] from the patterns_dir.
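
Comparing the pattern against the sample line, two things stand out: the line has double spaces (after the timestamp and after the level) where the pattern expects single ones, and the timestamp ends in comma plus milliseconds, which ", Z" does not describe. A variant like this should match (msg is just an illustrative field name, chosen so grok does not append a second value to the existing message field):

filter {
  grok {
    patterns_dir => ["./patterns"]
    # \s+ tolerates the double spaces in the sample line
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}\s+%{THREAD:thread}\s+%{LOGLEVEL:level}\s+%{JAVALOGMESSAGE:msg}" }
  }

  date {
    # "2020-11-19 11:34:40,260" is comma + milliseconds, no timezone
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
  }
}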

But I guess this question belongs in a new post.
