Hi, for some reason the remote server that has Filebeat set up on it is unable to send logs to my ELK server, which is running Logstash. I ran filebeat -e -d "publish,logstash"
and got the following warnings/errors:
2020-10-22T06:27:21.311Z INFO [publisher] pipeline/module.go:113 Beat name: choco-server
2020-10-22T06:27:21.311Z WARN beater/filebeat.go:178 Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
2020-10-22T06:27:21.311Z INFO instance/beat.go:450 filebeat start running.
2020-10-22T06:27:21.312Z INFO memlog/store.go:119 Loading data file of '/var/lib/filebeat/registry/filebeat' succeeded. Active transaction id=0
2020-10-22T06:27:21.317Z INFO [monitoring] log/log.go:118 Starting metrics logging every 30s
2020-10-22T06:27:21.332Z INFO memlog/store.go:124 Finished loading transaction log file for '/var/lib/filebeat/registry/filebeat'. Active transaction id=1527
2020-10-22T06:27:21.332Z WARN beater/filebeat.go:381 Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
2020-10-22T06:27:21.332Z INFO [registrar] registrar/registrar.go:109 States Loaded from registrar: 10
2020-10-22T06:27:21.332Z INFO [crawler] beater/crawler.go:71 Loading Inputs: 1
2020-10-22T06:27:21.334Z INFO log/input.go:157 Configured paths: [/var/log/*.log]
2020-10-22T06:27:21.334Z INFO [crawler] beater/crawler.go:141 Starting input (ID: 11204088409762598069)
2020-10-22T06:27:21.336Z INFO log/harvester.go:299 Harvester started for file: /var/log/cloud-init-output.log
.
2020-10-22T06:27:51.479Z ERROR [logstash] logstash/async.go:280 Failed to publish events caused by: read tcp 10.0.35.4:38002->10.0.35.5:5044: i/o timeout
2020-10-22T06:27:51.480Z DEBUG [logstash] logstash/async.go:172 2048 events out of 2048 events sent to logstash host 10.0.35.5:5044. Continue sending
2020-10-22T06:27:51.480Z INFO [publisher] pipeline/retry.go:219 retryer: send unwait signal to consumer
2020-10-22T06:27:51.480Z INFO [publisher] pipeline/retry.go:223 done
2020-10-22T06:27:51.480Z DEBUG [logstash] logstash/async.go:128 close connection
2020-10-22T06:27:51.480Z ERROR [logstash] logstash/async.go:280 Failed to publish events caused by: write tcp 10.0.35.4:38002->10.0.35.5:5044: use of closed network connection
2020-10-22T06:27:51.480Z DEBUG [logstash] logstash/async.go:128 close connection
2020-10-22T06:27:51.480Z INFO [publisher] pipeline/retry.go:219 retryer: send unwait signal to consumer
2020-10-22T06:27:51.480Z INFO [publisher] pipeline/retry.go:223 done
2020-10-22T06:27:53.481Z ERROR [publisher_pipeline_output] pipeline/output.go:180 failed to publish events: write tcp 10.0.35.4:38002->10.0.35.5:5044: use of closed network connection
2020-10-22T06:27:53.481Z INFO [publisher_pipeline_output] pipeline/output.go:143 Connecting to backoff(async(tcp://10.0.35.5:5044))
2020-10-22T06:27:53.481Z DEBUG [logstash] logstash/async.go:120 connect
2020-10-22T06:27:53.481Z INFO [publisher] pipeline/retry.go:219 retryer: send unwait signal to consumer
2020-10-22T06:27:53.481Z INFO [publisher] pipeline/retry.go:223 done
2020-10-22T06:27:53.483Z INFO [publisher_pipeline_output] pipeline/output.go:151 Connection to backoff(async(tcp://10.0.35.5:5044)) established
2020-10-22T06:27:53.525Z DEBUG [logstash] logstash/async.go:172 2048 events out of 2048 events sent to logstash host 10.0.35.5:5044. Continue sending
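In case the config matters, the Filebeat side is essentially the stock Logstash output pointing at the host from the logs above. This is a sketch from memory, not the full file; everything beyond the paths and hosts lines (which match the log output above) is default/assumed:

```yaml
# filebeat.yml (relevant sections only; sketch, not the complete file)
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /var/log/*.log        # matches "Configured paths" in the log above

output.logstash:
  hosts: ["10.0.35.5:5044"]   # the Logstash host/port shown in the errors above
```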
Any guidance here? I'm banging my head against the wall right now.
[Edit - Added the command-line output as text, as that may be easier to read.]
And just in case, here's some Logstash output after running sudo ./bin/logstash --path.settings /etc/logstash -f /etc/logstash/conf.d/beats.conf --config.reload.automatic:
[2020-10-22T06:47:26,180][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://0.0.0.0:9200/]}}
[2020-10-22T06:47:26,470][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://0.0.0.0:9200/"}
[2020-10-22T06:47:26,537][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2020-10-22T06:47:26,543][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-10-22T06:47:26,631][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//0.0.0.0:9200"]}
[2020-10-22T06:47:26,724][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
[2020-10-22T06:47:26,817][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, "pipeline.sources"=>["/etc/logstash/conf.d/beats.conf"], :thread=>"#<Thread:0x60feee16 run>"}
[2020-10-22T06:47:26,863][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-10-22T06:47:28,142][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>1.31}
[2020-10-22T06:47:28,258][INFO ][logstash.inputs.beats ][main] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2020-10-22T06:47:28,311][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2020-10-22T06:47:28,504][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-10-22T06:47:28,667][INFO ][org.logstash.beats.Server][main][fafbc78e9db27f6a29354f4105e4e92ad72510e14769fe64be7c46b26a83cf0d] Starting server on port: 5044
[2020-10-22T06:47:28,976][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
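For reference, the beats.conf referenced in the command is basically a plain Beats-in / Elasticsearch-out pipeline. This is a sketch with the port and hosts taken from the startup log above; any filter section is omitted here:

```
# /etc/logstash/conf.d/beats.conf (sketch; filters omitted)
input {
  beats {
    port => 5044              # matches "Starting server on port: 5044" above
  }
}
output {
  elasticsearch {
    hosts => ["0.0.0.0:9200"] # matches the Elasticsearch pool URL in the log above
  }
}
```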