Filebeat harvester started but not sending logs to Logstash / Kibana

I'm certainly missing something in my configuration. I'm running a full Docker setup and can access Kibana and view indexes, including some logs I manually sent to Logstash. I'm now working on adding Filebeat. The Docker logs seem to show it connects to Logstash, and I even get "Harvester started" for my log file, plus additional log lines when I update the file. However, the "Non-zero metrics in the last 30s" output shows nothing about my logs, and nothing is making it to the Logstash / Kibana index view. I have only configured a basic input/output in my filebeat.yml, as follows:

filebeat.config:
  modules:
    path: ${path.config}/modules.d/*.yml
    reload.enabled: false

filebeat.autodiscover:
  providers:
    - type: docker
      hints.enabled: true

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /projects/dealseal/oppty-re-server/log/2021/1/*.log
    
output.logstash:
  enabled: true
  hosts: ["logstash:5044"]
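When events seem to be harvested but never published, turning up Filebeat's own logging can show where they stall. A sketch of extra filebeat.yml settings (the selector names are standard Filebeat logging options, but worth double-checking against your version):

```yaml
# Hypothetical debug settings -- remove once the issue is found.
logging.level: debug
# Limit the debug output to harvesting and publishing activity.
logging.selectors: ["harvester", "publish"]
```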


I have not done anything in my logstash.yml other than:

http.host: "0.0.0.0"
xpack.monitoring.elasticsearch.hosts: [ "http://elasticsearch:9200" ]

## X-Pack security credentials
xpack.monitoring.enabled: true
xpack.monitoring.elasticsearch.username: elastic
xpack.monitoring.elasticsearch.password: changeme

Any ideas why my logs would not be sent to Logstash correctly?

What do the Filebeat logs show?


What does your Logstash config look like? You need to create a pipeline with at least a beats input listening on port 5044 and an elasticsearch output, as Logstash does nothing without a pipeline configuration. It might be easier to have Filebeat write directly to Elasticsearch.
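If you do go the direct route, the Filebeat side is just a different output section. A minimal sketch, assuming the same Elasticsearch host and credentials used elsewhere in this thread (note Filebeat supports only one output at a time, so output.logstash would need to be removed):

```yaml
# filebeat.yml -- replaces the output.logstash section
output.elasticsearch:
  hosts: ["http://elasticsearch:9200"]
  username: "elastic"
  password: "changeme"
```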


My logstash.conf file is below. I added the filter to handle an error, which allowed a bunch of Docker logs to flow through, but that did not solve my problem with the log file from Filebeat. I'm also confused about whether "beats" and "filebeat" are the same thing / input.

input {
	beats {
		port => 5044
	}

	tcp {
		port => 5000
	}
}

## Add your filters / logstash plugins configuration here
filter {
	mutate { replace => { "[host]" => "[host][name]" } }
}
output {
	elasticsearch {
		hosts => "elasticsearch:9200"
		user => "elastic"
		password => "changeme"
		ecs_compatibility => disabled
	}
}

notice: Harvester started for file: /projects/dealseal/oppty-re-server/log/2021/1/debug-2021-1-2.log

I looked at your log, but it ends right after Filebeat finally connects to Logstash. What do the logs look like after that? Are there new log events being written to the harvested file?


I added logs to my gist post as a comment. You know, it started reporting in. Maybe I fixed it by adding mutate { replace => { "[host]" => "[host][name]" } }, which was failing for Docker logs. I'm going to mark this answered; thanks for everyone's support.


Yes. Since there was no if / conditional logic around that mutate, if it failed on every event then all the events would be lost... if I am understanding correctly.
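For reference, wrapping the mutate in a conditional keeps a filter that only applies to some events from touching the rest. A sketch (the bare [host] existence check is an assumption about which events carry that field):

```
filter {
	# Only rewrite [host] when the field exists, so events
	# without it pass through the pipeline untouched.
	if [host] {
		mutate { replace => { "[host]" => "[host][name]" } }
	}
}
```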

It also looked like very few events were being harvested.