Tomcat log not detected after changing the path in filebeat.yml

(Mohamed Ibrahim) #1

My configuration looks like this:

filebeat:
  prospectors:
    -
      paths:
        - /home/mohamed/localhost_access_log.2015-12-20.txt
      input_type: syslog
  registry_file: /var/lib/filebeat/registry

output:
  elasticsearch:
    enabled: false
    hosts: ["localhost:9200"]
  logstash:
    hosts: [""]
    tls:
      certificate_authorities: ["/etc/pki/tls/certs/logstash-forwarder.crt"]

logging:
  files:
    rotateeverybytes: 10485760 # = 10MB

I changed the path to the new location to start monitoring it, but filebeat still sends logs from /var/log/*log.
Any help?
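For reference, a minimal filebeat 1.x prospector for Tomcat access logs might look like the sketch below. The /var/log/tomcat7 path and the glob are assumptions (point them at your actual Tomcat log directory); note that "syslog" is not a valid filebeat input_type, plain log files use "log":

```yaml
filebeat:
  prospectors:
    -
      paths:
        # Glob so rotated daily access logs are picked up too.
        # /var/log/tomcat7 is a guess; use your real Tomcat log dir.
        - /var/log/tomcat7/localhost_access_log.*.txt
      # Plain log lines; "syslog" is not a filebeat input type.
      input_type: log
```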

(Mohamed Ibrahim) #2

I simply need to monitor Tomcat logs.
I installed following this tutorial.

(Mark Walkom) #3

Did you restart filebeat after changing the path?
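(Filebeat only reads filebeat.yml at startup, so config edits take effect after a restart. On a Debian/Ubuntu package install of filebeat 1.x this would typically be the following command; the service name is assumed from the standard package:)

```shell
sudo service filebeat restart
```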

(Mohamed Ibrahim) #4

Yes, and I restarted Logstash and Elasticsearch as well.

(ruflin) #5

Can you run filebeat with the -e -d "*" flags and provide the output? In addition, it would be good to know which version of filebeat you are using and your OS.

(Mohamed Ibrahim) #7

root@mohamed:~# /etc/filebeat/filebeat.yml -e -d '*'
/etc/filebeat/filebeat.yml: line 4: filebeat:: command not found
/etc/filebeat/filebeat.yml: line 6: prospectors:: command not found
/etc/filebeat/filebeat.yml: line 8: -: command not found
/etc/filebeat/filebeat.yml: line 14: paths:: command not found
/etc/filebeat/filebeat.yml: line 15: -: command not found
/etc/filebeat/filebeat.yml: line 33: input_type:: command not found
/etc/filebeat/filebeat.yml: line 59: scan_frequency:: command not found
/etc/filebeat/filebeat.yml: line 132: registry_file:: command not found
/etc/filebeat/filebeat.yml: line 149: output:: command not found
/etc/filebeat/filebeat.yml: line 152: elasticsearch:: command not found
/etc/filebeat/filebeat.yml: line 153: enabled:: command not found
/etc/filebeat/filebeat.yml: line 158: hosts:: command not found
/etc/filebeat/filebeat.yml: line 231: logstash:: command not found
/etc/filebeat/filebeat.yml: line 233: hosts:: command not found
/etc/filebeat/filebeat.yml: line 246: tls:: command not found
/etc/filebeat/filebeat.yml: line 248: certificate_authorities:: command not found
/etc/filebeat/filebeat.yml: line 294: shipper:: command not found
/etc/filebeat/filebeat.yml: line 332: logging:: command not found
/etc/filebeat/filebeat.yml: line 343: files:: command not found
/etc/filebeat/filebeat.yml: line 352: rotateeverybytes:: command not found

(ruflin) #8

Below is the command. Above, you tried to execute your config file itself as a script, which is why the shell reported each YAML key as "command not found".

filebeat -c /etc/filebeat/filebeat.yml -e -d "*"

(Mohamed Ibrahim) #11

mohamed@mohamed:~$ filebeat -c /etc/filebeat/filebeat.yml -e -d "*"
2016/01/04 08:16:26.958219 beat.go:97: DBG Initializing output plugins
2016/01/04 08:16:26.958249 geolite.go:24: INFO GeoIP disabled: No paths were set under output.geoip.paths
2016/01/04 08:16:26.958297 client.go:244: DBG ES Ping(url=http://localhost:9200, timeout=1m30s)
2016/01/04 08:16:26.958593 client.go:249: DBG Ping request failed with: Head http://localhost:9200: dial tcp getsockopt: connection refused
2016/01/04 08:16:26.958612 outputs.go:111: INFO Activated elasticsearch as output plugin.
2016/01/04 08:16:27.037861 outputs.go:111: INFO Activated logstash as output plugin.
2016/01/04 08:16:27.037901 publish.go:198: DBG create output worker: 0x0, 0xc8200c8eb0
2016/01/04 08:16:27.037969 publish.go:198: DBG create output worker: 0x0, 0x0
2016/01/04 08:16:27.037993 publish.go:235: DBG No output is defined to store the topology. The server fields might not be filled.
2016/01/04 08:16:27.038037 publish.go:249: INFO Publisher name: mohamed
2016/01/04 08:16:27.038183 async.go:95: DBG create bulk processing worker (interval=1s, bulk size=50)
2016/01/04 08:16:27.038225 async.go:95: DBG create bulk processing worker (interval=1s, bulk size=200)
2016/01/04 08:16:27.038269 beat.go:107: INFO Init Beat: filebeat; Version: 1.0.1
2016/01/04 08:16:27.038915 beat.go:133: INFO filebeat sucessfully setup. Start running.
2016/01/04 08:16:27.038944 registrar.go:66: INFO Registry file set to: /var/lib/filebeat/registry
2016/01/04 08:16:27.038964 registrar.go:76: INFO Loading registrar data from /var/lib/filebeat/registry
2016/01/04 08:16:27.038997 spooler.go:44: DBG Set idleTimeoutDuration to 5s
2016/01/04 08:16:27.039018 crawler.go:38: DBG File Configs: [/home/mohamed/localhost_access_log.2015-12-20.txt]
2016/01/04 08:16:27.039036 prospector.go:128: DBG Set ignore_older duration to 24h0m0s
2016/01/04 08:16:27.039052 prospector.go:128: DBG Set scan_frequency duration to 10s
2016/01/04 08:16:27.039065 prospector.go:128: DBG Set backoff duration to 1s
2016/01/04 08:16:27.039078 prospector.go:128: DBG Set max_backoff duration to 10s
2016/01/04 08:16:27.039092 prospector.go:128: DBG Set partial_line_waiting duration to 5s
2016/01/04 08:16:27.039104 crawler.go:58: DBG Waiting for 1 prospectors to initialise
2016/01/04 08:16:27.039125 prospector.go:141: DBG Harvest path: /home/mohamed/localhost_access_log.2015-12-20.txt
2016/01/04 08:16:27.039154 prospector.go:207: DBG scan path /home/mohamed/localhost_access_log.2015-12-20.txt
2016/01/04 08:16:27.039176 prospector.go:219: DBG Check file for harvesting: /home/mohamed/localhost_access_log.2015-12-20.txt
2016/01/04 08:16:27.039191 prospector.go:273: DBG Start harvesting unknown file: /home/mohamed/localhost_access_log.2015-12-20.txt
2016/01/04 08:16:27.039206 prospector.go:289: DBG Fetching old state of file to resume: /home/mohamed/localhost_access_log.2015-12-20.txt
2016/01/04 08:16:27.039224 prospector.go:306: DBG Skipping file (older than ignore older of 24h0m0s, 351h2m55.007260138s): /home/mohamed/localhost_access_log.2015-12-20.txt
2016/01/04 08:16:27.039239 prospector.go:207: DBG scan path /home/mohamed/localhost_access_log.2015-12-20.txt
2016/01/04 08:16:27.039254 prospector.go:219: DBG Check file for harvesting: /home/mohamed/localhost_access_log.2015-12-20.txt
2016/01/04 08:16:27.039268 prospector.go:341: DBG Update existing file for harvesting: /home/mohamed/localhost_access_log.2015-12-20.txt
2016/01/04 08:16:27.039280 prospector.go:383: DBG Not harvesting, file didn't change: /home/mohamed/localhost_access_log.2015-12-20.txt
2016/01/04 08:16:27.039295 crawler.go:65: DBG No pending prospectors. Finishing setup
2016/01/04 08:16:27.039308 crawler.go:78: INFO All prospectors initialised with 0 states to persist
2016/01/04 08:16:27.039321 registrar.go:83: INFO Starting Registrar
2016/01/04 08:16:27.039346 filebeat.go:122: INFO Start sending events to output
2016/01/04 08:16:27.039411 spooler.go:77: INFO Starting spooler: spool_size: 1024; idle_timeout: 5s
2016/01/04 08:16:29.539524 spooler.go:97: DBG Flushing spooler because of timemout. Events flushed: 0
^C2016/01/04 08:16:34.698914 service.go:27: DBG Received sigterm/sigint, stopping
2016/01/04 08:16:34.698938 registrar.go:129: INFO Stopping Registrar
2016/01/04 08:16:34.698963 registrar.go:93: INFO Ending Registrar
2016/01/04 08:16:34.698979 registrar.go:142: DBG Write registry file: /var/lib/filebeat/registry
2016/01/04 08:16:34.699058 registrar.go:147: ERR Failed to create tempfile (/var/lib/filebeat/ for writing: open /var/lib/filebeat/ permission denied
2016/01/04 08:16:34.699073 beat.go:143: INFO Cleaning up filebeat before shutting down.
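(Two issues are visible in this output. First, the prospector skips the file because its modification time is ~351 hours old, beyond the default ignore_older of 24h. Second, the registry cannot be written because filebeat was started as the user mohamed, who has no write access to /var/lib/filebeat. A possible fix for the latter is below; the ownership target is an assumption, and running filebeat as root also works:)

```shell
sudo chown -R mohamed:mohamed /var/lib/filebeat
```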

(Mohamed Ibrahim) #12

Many thanks. The problem was with the file: it was a static file, and no changes had been made to it. When I copied a new file over the old one, the data came through.
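(For anyone landing here later: the debug output above shows the file was skipped as "older than ignore older of 24h0m0s". Copying a new file over the old one gave the destination a fresh modification time, which put it back inside that window. A self-contained sketch of the effect, assuming GNU coreutils, with temp files standing in for the real log:)

```shell
# Copying a file refreshes the destination's mtime, which is what
# brought the stale log back inside filebeat's ignore_older window.
src=$(mktemp)
dst=$(mktemp)
echo '127.0.0.1 - - [20/Dec/2015] "GET /"' > "$src"
touch -d '15 days ago' "$dst"       # simulate a log older than 24h
old_mtime=$(stat -c %Y "$dst")
cp "$src" "$dst"                    # overwrite, as done in the thread
new_mtime=$(stat -c %Y "$dst")
echo $(( new_mtime > old_mtime ))   # 1: the file now looks "new" again
rm -f "$src" "$dst"
```

Alternatively, raising ignore_older in filebeat.yml (e.g. ignore_older: 720h) lets filebeat pick up older files without touching them.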

(ruflin) #13

Good to hear you were able to solve the problem.

(Mohamed Ibrahim) #14

Thanks for your support :slight_smile:
