I just tried building the Docker images (make release) from GitHub, on both master and the new 7.9.0 tag.
When I then run Filebeat with this config:
filebeat.inputs:
- type: http_endpoint
  enabled: true
  listen_address: 127.0.0.1
  listen_port: 8080

#======================= Logging from filebeat itself =========================
logging:
  to_files: true
  json: false
  level: info
  files:
    path: /usr/share/filebeat/logs/
    name: filebeat.log
    keepfiles: 7
    permissions: 0644

#================================ Outputs =====================================
output.kafka:
  # initial brokers for reading cluster metadata
  hosts: ${ENVIRONMENT_FILEBEAT_HOSTS}
  ssl:
    enabled: true
    certificate_authorities:
      - "/etc/pki/filebeat/elk-kafka-ca.cer"
    verification_mode: full
    supported_protocols: [TLSv1.2]
    # Certificate for SSL client authentication
    certificate: "/etc/pki/filebeat/elk-kafka-private.cer"
    # Client certificate key (private key)
    key: "/etc/pki/filebeat/elk-kafka-private.key"
    key_passphrase: ${ENVIRONMENT_FILEBEAT_PWD}
    # renegotiation: never
  topic: ${ENVIRONMENT_FILEBEAT_TOPIC}
Filebeat fails with:
2020-08-19T08:49:46.642Z INFO instance/beat.go:299 Setup Beat: filebeat; Version: 7.9.0
2020-08-19T08:49:46.646Z INFO [publisher] pipeline/module.go:113 Beat name: r9432a5f928-apim-v2-7c48dcf9d-trqjp
2020-08-19T08:49:46.647Z WARN beater/filebeat.go:178 Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
2020-08-19T08:49:46.647Z INFO instance/beat.go:450 filebeat start running.
2020-08-19T08:49:46.648Z INFO [monitoring] log/log.go:118 Starting metrics logging every 30s
2020-08-19T08:49:46.650Z INFO memlog/store.go:119 Loading data file of '/usr/share/filebeat/data/registry/filebeat' succeeded. Active transaction id=0
2020-08-19T08:49:46.650Z INFO memlog/store.go:124 Finished loading transaction log file for '/usr/share/filebeat/data/registry/filebeat'. Active transaction id=0
2020-08-19T08:49:46.650Z WARN beater/filebeat.go:381 Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
2020-08-19T08:49:46.650Z INFO [registrar] registrar/registrar.go:108 States Loaded from registrar: 0
2020-08-19T08:49:46.650Z INFO [crawler] beater/crawler.go:71 Loading Inputs: 1
2020-08-19T08:49:46.650Z INFO beater/crawler.go:148 Stopping Crawler
2020-08-19T08:49:46.650Z INFO beater/crawler.go:158 Stopping 0 inputs
2020-08-19T08:49:46.650Z INFO beater/crawler.go:178 Crawler stopped
2020-08-19T08:49:46.650Z INFO [registrar] registrar/registrar.go:131 Stopping Registrar
2020-08-19T08:49:46.650Z INFO [registrar] registrar/registrar.go:165 Ending Registrar
2020-08-19T08:49:46.738Z INFO [registrar] registrar/registrar.go:136 Registrar stopped
2020-08-19T08:49:46.740Z INFO [monitoring] log/log.go:153 Total non-zero metrics {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":10,"time":{"ms":16}},"total":{"ticks":40,"time":{"ms":50},"value":40},"user":{"ticks":30,"time":{"ms":34}}},"handles":{"limit":{"hard":1048576,"soft":1048576},"open":8},"info":{"ephemeral_id":"6519a089-e4ce-4a97-a6be-0f8ec70cc23e","uptime":{"ms":395}},"memstats":{"gc_next":7539280,"memory_alloc":4650336,"memory_total":11295032,"rss":31948800},"runtime":{"goroutines":10}},"filebeat":{"harvester":{"open_files":0,"running":0}},"libbeat":{"config":{"module":{"running":0}},"output":{"type":"kafka"},"pipeline":{"clients":0,"events":{"active":0}}},"registrar":{"states":{"current":0}},"system":{"cpu":{"cores":4},"load":{"1":3.09,"15":2.56,"5":2.69,"norm":{"1":0.7725,"15":0.64,"5":0.6725}}}}}}
2020-08-19T08:49:46.740Z INFO [monitoring] log/log.go:154 Uptime: 397.102732ms
2020-08-19T08:49:46.740Z INFO [monitoring] log/log.go:131 Stopping metrics logging.
2020-08-19T08:49:46.740Z INFO instance/beat.go:456 filebeat stopped.
2020-08-19T08:49:46.740Z ERROR instance/beat.go:951 Exiting: Failed to start crawler: starting input failed: Error while initializing input: Error creating input. No such input type exist: 'http_endpoint'
Exiting: Failed to start crawler: starting input failed: Error while initializing input: Error creating input. No such input type exist: 'http_endpoint'
According to the docs at https://www.elastic.co/guide/en/beats/filebeat/7.9/filebeat-input-http_endpoint.html, this config should be correct.
The config works if we remove the http_endpoint input section and make Filebeat read from a file instead.
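For reference, this is roughly the file-based input variant that does start successfully with the same logging and Kafka output sections (the path below is a placeholder for illustration, not our actual file):

```yaml
filebeat.inputs:
- type: log
  enabled: true
  paths:
    # placeholder path, not the one from our deployment
    - /var/log/example/*.log
```

With only this input section swapped in, Filebeat starts the crawler and ships events to Kafka as expected.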