I want to send Filebeat log data directly to Kafka. I have commented out the Logstash/Elasticsearch output and set the Kafka output as below, but after that my Filebeat instance is not starting. I am getting this error message:

```
Failed to start Filebeat sends log files to Logstash or directly to Elasticsearch
```
```yaml
#--------------------------- KAFKA ----------------------------------------------
output.kafka:
enabled:true
hosts: ["127.0.0.1:9092"]
topic: log
```
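For reference, YAML is sensitive to spacing and indentation: settings under `output.kafka:` must be indented, and a key needs a space after the colon (`enabled: true`, not `enabled:true`). A minimal sketch of the same section with conventional formatting, using the host and topic from the post:

```yaml
#--------------------------- KAFKA ----------------------------------------------
output.kafka:
  enabled: true
  hosts: ["127.0.0.1:9092"]
  topic: log
```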
Welcome to our community!

Can you paste your Filebeat logs?

Please format your code/logs/config using the </> button, or markdown-style backticks. It makes things easier to read, which helps us help you.
Thanks for the reply. Here is the /var/log/filebeat/filebeat log:
```
2020-08-09T13:06:46.572-0400 INFO kafka/log.go:53 producer/broker/0 starting up
2020-08-09T13:06:46.572-0400 INFO kafka/log.go:53 producer/broker/0 state change to [open] on logs/0
2020-08-09T13:06:46.572-0400 INFO kafka/log.go:53 producer/leader/logs/0 selected broker 0
2020-08-09T13:06:46.572-0400 INFO kafka/log.go:53 producer/leader/logs/0 state change to [flushing-3]
2020-08-09T13:06:46.572-0400 INFO kafka/log.go:53 producer/leader/logs/0 state change to [normal]
2020-08-09T13:06:46.832-0400 INFO kafka/log.go:53 Connected to broker at 100.97.62.105:9092 (registered as #0)
2020-08-09T13:06:47.093-0400 INFO kafka/log.go:53 producer/broker/0 state change to [closing] because EOF
2020-08-09T13:06:47.094-0400 INFO kafka/log.go:53 Closed connection to broker 100.97.62.105:9092
2020-08-09T13:06:47.094-0400 INFO kafka/log.go:53 producer/leader/logs/0 state change to [retrying-3]
2020-08-09T13:06:47.094-0400 INFO kafka/log.go:53 producer/leader/logs/0 abandoning broker 0
2020-08-09T13:06:47.094-0400 INFO kafka/log.go:53 producer/broker/0 shut down
2020-08-09T13:06:47.194-0400 INFO kafka/log.go:53 client/metadata fetching metadata for [logs] from broker 100.97.62.105:9092
2020-08-09T13:06:47.456-0400 INFO kafka/log.go:53 producer/broker/0 starting up
2020-08-09T13:06:47.456-0400 INFO kafka/log.go:53 producer/broker/0 state change to [open] on logs/0
2020-08-09T13:06:47.456-0400 INFO kafka/log.go:53 producer/leader/logs/0 selected broker 0
2020-08-09T13:06:47.456-0400 INFO kafka/log.go:53 producer/leader/logs/0 state change to [flushing-3]
2020-08-09T13:06:47.456-0400 INFO kafka/log.go:53 producer/leader/logs/0 state change to [normal]
2020-08-09T13:06:47.494-0400 INFO beater/filebeat.go:437 Stopping filebeat
2020-08-09T13:06:47.494-0400 INFO crawler/crawler.go:139 Stopping Crawler
2020-08-09T13:06:47.494-0400 INFO crawler/crawler.go:149 Stopping 1 inputs
2020-08-09T13:06:47.494-0400 INFO input/input.go:149 input ticker stopped
2020-08-09T13:06:47.494-0400 INFO input/input.go:167 Stopping Input: 8669900041677695518
2020-08-09T13:06:47.494-0400 INFO log/harvester.go:272 Reader was closed: /var/log/dmesg. Closing.
2020-08-09T13:06:47.494-0400 INFO log/harvester.go:272 Reader was closed: /var/log/messages. Closing.
2020-08-09T13:06:47.494-0400 INFO crawler/crawler.go:165 Crawler stopped
2020-08-09T13:06:47.495-0400 INFO registrar/registrar.go:356 Stopping Registrar
2020-08-09T13:06:47.495-0400 INFO registrar/registrar.go:282 Ending Registrar
2020-08-09T13:06:47.501-0400 INFO [monitoring] log/log.go:149 Total non-zero metrics {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":490,"time":{"ms":495}},"total":{"ticks":1510,"time":{"ms":1519},"value":1510},"user":{"ticks":1020,"time":{"ms":1024}}},"info":{"ephemeral_id":"cf0eecef-2c56-4978-9baf-0a3ff454e68b","uptime":{"ms":155366}},"memstats":{"gc_next":4993408,"memory_alloc":5979264,"memory_total":151842768,"rss":19914752}},"filebeat":{"events":{"active":25,"added":32,"done":7},"harvester":{"closed":4,"open_files":0,"running":0,"started":4},"input":{"log":{"files":{"truncated":2}}}},"libbeat":{"config":{"module":{"running":0}},"output":{"events":{"active":21,"batches":112,"failed":773,"total":794},"type":"kafka"},"outputs":{"kafka":{"bytes_read":18800,"bytes_write":119657}},"pipeline":{"clients":0,"events":{"active":21,"filtered":11,"published":21,"retry":789,"total":32}}},"registrar":{"states":{"cleanup":1,"current":4,"update":7},"writes":{"success":8,"total":8}},"system":{"cpu":{"cores":2},"load":{"1":0.59,"15":0.46,"5":0.45,"norm":{"1":0.295,"15":0.23,"5":0.225}}}}}}
2020-08-09T13:06:47.501-0400 INFO [monitoring] log/log.go:150 Uptime: 2m35.36951196s
2020-08-09T13:06:47.501-0400 INFO [monitoring] log/log.go:127 Stopping metrics logging.
2020-08-09T13:06:47.501-0400 INFO instance/beat.go:373 filebeat stopped.
```
There's nothing there that suggests a problem. It started, connected to Kafka, and then something asked it to stop running.
Yes, I suspect something with the Kafka output. If I comment it out and enable Elasticsearch as the output, it works fine. It only fails with Kafka, and a "failed to start" service message also appears when restarting.
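One thing worth ruling out is basic reachability: the config points at `127.0.0.1:9092`, but the earlier log shows the broker registering itself as `100.97.62.105:9092`, and Filebeat must be able to reach the address the broker advertises. A rough sketch of a TCP check, assuming bash with `/dev/tcp` support; `check_broker` is a hypothetical helper, and the host/port values are placeholders to replace with your own:

```shell
#!/usr/bin/env bash
# Rough TCP reachability check for a Kafka listener (hypothetical helper;
# replace host/port with the address your broker actually advertises).
check_broker() {
  local host="$1" port="$2"
  # /dev/tcp is a bash builtin pseudo-device; the connect either succeeds
  # within the timeout or the command exits non-zero.
  if timeout 3 bash -c "exec 3<>/dev/tcp/${host}/${port}" 2>/dev/null; then
    echo "reachable"
  else
    echo "unreachable"
  fi
}

# The config points at 127.0.0.1:9092, but the log shows the broker
# registering as 100.97.62.105:9092 -- check both from the Filebeat host.
check_broker 127.0.0.1 9092
```

If the advertised address is unreachable from the Filebeat host, the fix is usually on the broker side (its `advertised.listeners` setting), not in Filebeat.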
Well, what do the logs look like when you enable it?
You mean enable Elasticsearch?

After enabling Elasticsearch as the output:
```
2020-08-10T04:19:00.098-0400 INFO instance/beat.go:544 Home path: [/usr/share/filebeat] Config path: [/etc/filebeat] Data path: [/var/lib/filebeat] Logs path: [/var/log/filebeat]
2020-08-10T04:19:00.101-0400 INFO instance/beat.go:551 Beat UUID: e638f53f-84fc-47ea-875d-354872f82bef
2020-08-10T04:19:00.101-0400 INFO [seccomp] seccomp/seccomp.go:93 Syscall filter could not be installed because the kernel does not support seccomp
2020-08-10T04:19:00.101-0400 INFO [beat] instance/beat.go:768 Beat info {"system_info": {"beat": {"path": {"config": "/etc/filebeat", "data": "/var/lib/filebeat", "home": "/usr/share/filebeat", "logs": "/var/log/filebeat"}, "type": "filebeat", "uuid": "e638f53f-84fc-47ea-875d-354872f82bef"}}}
2020-08-10T04:19:00.101-0400 INFO [beat] instance/beat.go:777 Build info {"system_info": {"build": {"commit": "34b4e2cc75fbbee5e7149f3916de72fb8892d070", "libbeat": "6.4.0", "time": "2018-08-17T22:20:20.000Z", "version": "6.4.0"}}}
2020-08-10T04:19:00.101-0400 INFO [beat] instance/beat.go:780 Go runtime info {"system_info": {"go": {"os":"linux","arch":"amd64","max_procs":2,"version":"go1.10.3"}}}
2020-08-10T04:19:00.105-0400 INFO [beat] instance/beat.go:784 Host info {"system_info": {"host": {"architecture":"x86_64","boot_time":"2020-08-05T09:20:14-04:00","containerized":true,"hostname":"rhel74","ips":["127.0.0.1/8","::1/128","100.64.24.211/22","2607:f2b1:f000:24:250:56ff:fe1e:a275/64","fe80::250:56ff:fe1e:a275/64"],"kernel_version":"3.10.0-693.17.1.el7.x86_64","mac_addresses":["00:50:56:1e:a2:75"],"os":{"family":"","platform":"rhel","name":"Red Hat Enterprise Linux Server","version":"7.4 (Maipo)","major":7,"minor":4,"patch":0,"codename":"Maipo"},"timezone":"EDT","timezone_offset_sec":-14400,"id":"05136768af6046b0b100160cf1bdd62e"}}}
2020-08-10T04:19:00.106-0400 INFO [beat] instance/beat.go:813 Process info {"system_info": {"process": {"capabilities": {"inheritable":null,"permitted":["chown","dac_override","dac_read_search","fowner","fsetid","kill","setgid","setuid","setpcap","linux_immutable","net_bind_service","net_broadcast","net_admin","net_raw","ipc_lock","ipc_owner","sys_module","sys_rawio","sys_chroot","sys_ptrace","sys_pacct","sys_admin","sys_boot","sys_nice","sys_resource","sys_time","sys_tty_config","mknod","lease","audit_write","audit_control","setfcap","mac_override","mac_admin","syslog","wake_alarm","block_suspend"],"effective":["chown","dac_override","dac_read_search","fowner","fsetid","kill","setgid","setuid","setpcap","linux_immutable","net_bind_service","net_broadcast","net_admin","net_raw","ipc_lock","ipc_owner","sys_module","sys_rawio","sys_chroot","sys_ptrace","sys_pacct","sys_admin","sys_boot","sys_nice","sys_resource","sys_time","sys_tty_config","mknod","lease","audit_write","audit_control","setfcap","mac_override","mac_admin","syslog","wake_alarm","block_suspend"],"bounding":["chown","dac_override","dac_read_search","fowner","fsetid","kill","setgid","setuid","setpcap","linux_immutable","net_bind_service","net_broadcast","net_admin","net_raw","ipc_lock","ipc_owner","sys_module","sys_rawio","sys_chroot","sys_ptrace","sys_pacct","sys_admin","sys_boot","sys_nice","sys_resource","sys_time","sys_tty_config","mknod","lease","audit_write","audit_control","setfcap","mac_override","mac_admin","syslog","wake_alarm","block_suspend"],"ambient":null}, "cwd": "/", "exe": "/usr/share/filebeat/bin/filebeat", "name": "filebeat", "pid": 17774, "ppid": 1, "seccomp": {"mode":"disabled"}, "start_time": "2020-08-10T04:18:59.580-0400"}}}
2020-08-10T04:19:00.106-0400 INFO instance/beat.go:273 Setup Beat: filebeat; Version: 6.4.0
2020-08-10T04:19:00.107-0400 INFO elasticsearch/client.go:163 Elasticsearch url: http://100.102.128.20:9200
2020-08-10T04:19:00.108-0400 INFO pipeline/module.go:98 Beat name: rhel74
2020-08-10T04:19:00.108-0400 WARN [cfgwarn] beater/filebeat.go:78 DEPRECATED: prospectors are deprecated, Use `inputs` instead. Will be removed in version: 7.0.0
2020-08-10T04:19:00.109-0400 INFO [monitoring] log/log.go:114 Starting metrics logging every 30s
2020-08-10T04:19:00.109-0400 INFO instance/beat.go:367 filebeat start running.
2020-08-10T04:19:00.110-0400 INFO registrar/registrar.go:134 Loading registrar data from /var/lib/filebeat/registry
2020-08-10T04:19:00.110-0400 INFO registrar/registrar.go:141 States Loaded from registrar: 4
2020-08-10T04:19:00.110-0400 INFO crawler/crawler.go:72 Loading Inputs: 1
2020-08-10T04:19:00.110-0400 WARN [cfgwarn] input/config.go:42 DEPRECATED: input_type input config is deprecated. Use type instead. Will be removed in version: 6.0.0
2020-08-10T04:19:00.111-0400 INFO log/input.go:138 Configured paths: [/var/log/dmesg* /var/log/messages*]
2020-08-10T04:19:00.111-0400 INFO input/input.go:114 Starting input of type: log; ID: 8669900041677695518
2020-08-10T04:19:00.111-0400 INFO crawler/crawler.go:106 Loading and starting Inputs completed. Enabled inputs: 1
2020-08-10T04:19:00.122-0400 INFO log/harvester.go:251 Harvester started for file: /var/log/dmesg
2020-08-10T04:19:00.124-0400 INFO log/harvester.go:251 Harvester started for file: /var/log/messages
2020-08-10T04:19:01.642-0400 INFO elasticsearch/client.go:708 Connected to Elasticsearch version 6.6.0
2020-08-10T04:19:01.905-0400 INFO template/load.go:129 Template already exists and will not be overwritten.
```
No, Kafka, that's the one with the issue?
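One more thing that may be worth trying: the repeated `state change to [closing] because EOF` right after a successful connect in the Kafka log is a symptom often associated with a protocol-version mismatch between the Beats Kafka client and the broker. The kafka output has a `version` setting for pinning the Kafka protocol version Filebeat assumes; a hedged sketch, where `"1.0.0"` is only a placeholder to replace with the broker's actual version:

```yaml
output.kafka:
  enabled: true
  hosts: ["127.0.0.1:9092"]
  topic: log
  version: "1.0.0"   # assumption: set this to the Kafka version the broker runs
```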