Hi Steffens
The error occurs every time I run Filebeat manually, and it keeps repeating:
2017/04/24 09:28:53.480521 log_file.go:84: DBG End of file reached: /var/log/messages; Backoff now.
2017/04/24 09:28:53.492880 spooler.go:88: DBG Flushing spooler because of timeout. Events flushed: 0
2017/04/24 09:28:58.474975 log_file.go:84: DBG End of file reached: /var/log/secure.1; Backoff now.
2017/04/24 09:28:58.475012 log_file.go:84: DBG End of file reached: /var/log/secure; Backoff now.
2017/04/24 09:28:58.475063 log_file.go:84: DBG End of file reached: /var/log/messages.1; Backoff now.
2017/04/24 09:28:58.493962 spooler.go:88: DBG Flushing spooler because of timeout. Events flushed: 0
2017/04/24 09:29:01.551592 sync.go:53: DBG connect
2017/04/24 09:29:01.552068 single.go:140: ERR Connecting error publishing events (retrying): dial tcp 192.168.10.237:5044: getsockopt: connection refused
2017/04/24 09:29:01.552614 single.go:156: DBG send fail
2017/04/24 09:29:03.482003 log_file.go:84: DBG End of file reached: /var/log/messages; Backoff now.
2017/04/24 09:29:03.482023 prospector.go:197: DBG Run prospector
2017/04/24 09:29:03.482033 prospector_log.go:73: DBG Start next scan
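The "connection refused" seems to come straight from the TCP dial, so I would expect the same failure outside Filebeat as well. As a sanity check, something like this minimal Go snippet (host and port copied from the log line above; the timeout value is just my own choice) should reproduce it:

package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// Attempt the same TCP connection Filebeat makes when publishing to Logstash.
	conn, err := net.DialTimeout("tcp", "192.168.10.237:5044", 5*time.Second)
	if err != nil {
		// If nothing is listening on 5044 (or a firewall rejects the SYN),
		// this should print the same "connection refused" as the Filebeat log.
		fmt.Println("dial failed:", err)
		return
	}
	defer conn.Close()
	fmt.Println("connected to", conn.RemoteAddr())
}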
I also ran a tcpdump trace on the ELK machine, and it captured no traffic at all from this server:
root@elk:~# tcpdump src host 192.168.10.199 and port 5044
tcpdump: verbose output suppressed, use -v or -vv for full protocol decode
listening on eth0, link-type EN10MB (Ethernet), capture size 262144 bytes
And this is a tcpdump on the sending server:
tcpdump: verbose output suppressed, use -v or -vv for full protocol decode
listening on eth0, link-type EN10MB (Ethernet), capture size 96 bytes
11:32:28.997490 IP 192.168.10.199.47155 > 192.168.10.237.lxi-evntsvc: S 4202664451:4202664451(0) win 5840 <mss 1460,sackOK,timestamp 448758197 0,nop,wscale 7>
11:32:29.999713 IP 192.168.10.199.47156 > 192.168.10.237.lxi-evntsvc: S 4209729800:4209729800(0) win 5840 <mss 1460,sackOK,timestamp 448759199 0,nop,wscale 7>
11:32:32.001726 IP 192.168.10.199.47157 > 192.168.10.237.lxi-evntsvc: S 4202283528:4202283528(0) win 5840 <mss 1460,sackOK,timestamp 448761201 0,nop,wscale 7>
11:32:36.003808 IP 192.168.10.199.47158 > 192.168.10.237.lxi-evntsvc: S 4215919907:4215919907(0) win 5840 <mss 1460,sackOK,timestamp 448765203 0,nop,wscale 7>
11:32:44.005915 IP 192.168.10.199.50387 > 192.168.10.237.lxi-evntsvc: S 4239114558:4239114558(0) win 5840 <mss 1460,sackOK,timestamp 448773205 0,nop,wscale 7>
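Since the sender only shows retransmitted SYNs and the ELK side captures nothing, I suspect the packets are dropped or rejected before Logstash ever sees them. To separate "Logstash not listening" from "traffic not arriving", a throwaway listener on port 5044 on the ELK machine could stand in for Logstash (assuming the port is free; this is only a test stub, not part of any real setup), and then the dial above could be retried from the sender:

package main

import (
	"fmt"
	"log"
	"net"
)

func main() {
	// Throwaway TCP listener standing in for Logstash on the beats port,
	// just to check whether connections from the sender reach this host at all.
	ln, err := net.Listen("tcp", ":5044")
	if err != nil {
		// "address already in use" here would mean something IS listening on 5044.
		log.Fatal(err)
	}
	defer ln.Close()
	fmt.Println("listening on", ln.Addr())

	for {
		conn, err := ln.Accept()
		if err != nil {
			log.Fatal(err)
		}
		fmt.Println("accepted connection from", conn.RemoteAddr())
		conn.Close()
	}
}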
This is the output after manually starting Filebeat:
[ServerName ~]# /root/admin/filebeat-linux-386 -c /etc/filebeat/filebeat.yml -e -d ''
2017/04/24 09:32:23.930961 beat.go:327: INFO Home path: [/root/admin] Config path: [/root/admin] Data path: [/root/admin/data] Logs path: [/root/admin/logs]
2017/04/24 09:32:23.930990 beat.go:352: INFO Beat metadata path: /root/admin/data/meta.json
2017/04/24 09:32:23.931068 beat.go:334: INFO Beat UUID: 5ad83c6d-6972-458c-8fc0-934e9be0976c
2017/04/24 09:32:23.931083 beat.go:212: INFO Setup Beat: filebeat; Version: 6.0.0-alpha1-gite288cf6
2017/04/24 09:32:23.931094 processor.go:44: DBG Processors:
2017/04/24 09:32:23.931108 beat.go:218: DBG Initializing output plugins
2017/04/24 09:32:23.931177 logstash.go:92: INFO Max Retries set to: 3
2017/04/24 09:32:23.931237 outputs.go:107: INFO Activated logstash as output plugin.
2017/04/24 09:32:23.931249 publish.go:158: DBG Create output worker
2017/04/24 09:32:23.931289 publish.go:191: INFO Publisher name: ServerName
2017/04/24 09:32:23.931402 async.go:63: INFO Flush Interval set to: 1s
2017/04/24 09:32:23.931415 async.go:64: INFO Max Bulk Size set to: 2048
2017/04/24 09:32:23.931424 async.go:72: DBG create bulk processing worker (interval=1s, bulk size=2048)
2017/04/24 09:32:23.931501 modules.go:93: ERR Not loading modules. Module directory not found: /root/admin/module
2017/04/24 09:32:23.931615 metrics.go:23: INFO Metrics logging every 30s
2017/04/24 09:32:23.931726 beat.go:255: INFO filebeat start running.
2017/04/24 09:32:23.931819 registrar.go:83: INFO Registry file set to: /root/admin/data/registry
2017/04/24 09:32:23.931898 registrar.go:104: INFO Loading registrar data from /root/admin/data/registry
2017/04/24 09:32:23.932169 registrar.go:115: INFO States Loaded from registrar: 10
2017/04/24 09:32:23.932231 crawler.go:38: INFO Loading Prospectors: 1
2017/04/24 09:32:23.932392 processor.go:44: DBG Processors:
2017/04/24 09:32:23.932453 prospector.go:96: DBG File Configs: [/var/log/secure /var/log/messages*]
2017/04/24 09:32:23.932520 prospector_log.go:46: DBG exclude_files:
2017/04/24 09:32:23.932594 state.go:64: DBG New state added for /var/log/secure.1
2017/04/24 09:32:23.932667 state.go:64: DBG New state added for /var/log/secure.2
2017/04/24 09:32:23.932723 state.go:64: DBG New state added for /var/log/messages.3
2017/04/24 09:32:23.932793 state.go:64: DBG New state added for /var/log/messages.4
2017/04/24 09:32:23.932848 state.go:64: DBG New state added for /var/log/messages.2
2017/04/24 09:32:23.932909 state.go:64: DBG New state added for /var/log/secure
2017/04/24 09:32:23.932968 state.go:64: DBG New state added for /var/log/secure.3
2017/04/24 09:32:23.933028 state.go:64: DBG New state added for /var/log/secure.4
2017/04/24 09:32:23.933086 state.go:64: DBG New state added for /var/log/messages
2017/04/24 09:32:23.933144 state.go:64: DBG New state added for /var/log/messages.1
2017/04/24 09:32:23.933196 prospector_log.go:67: INFO Prospector with previous states loaded: 10
2017/04/24 09:32:23.933303 prospector.go:138: INFO Starting prospector of type: log; id: 11581658867755187697
2017/04/24 09:32:23.933392 crawler.go:58: INFO Loading and starting Prospectors completed. Enabled prospectors: 1
2017/04/24 09:32:23.933455 spooler.go:63: INFO Starting spooler: spool_size: 2048; idle_timeout: 5s
2017/04/24 09:32:23.932627 registrar.go:148: INFO Starting Registrar
2017/04/24 09:32:23.932631 sync.go:41: INFO Start sending events to output
2017/04/24 09:32:23.933600 prospector_log.go:73: DBG Start next scan
And this is the Filebeat configuration:
###################### Filebeat Configuration Example #########################

#=========================== Filebeat prospectors =============================
filebeat.prospectors:
  # Paths that should be crawled and fetched. Glob based paths.
  - paths:
      - /var/log/secure*
      - /var/log/messages*
    document_type: syslog

#================================ Outputs =====================================

#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["192.168.10.237:5044"]
  index: 'filebeat'

logging.level: debug
logging.to_files: true
logging.to_syslog: false
logging.files:
  path: /var/log/filebeat
  name: filebeat.log
  keepfiles: 7