Trying to capture IIS logs in close to real time, but Kibana seems to only show old logs (the day before and older). Can anyone point me in the right direction to getting close to real-time logs displayed in Kibana? OS is Windows Server 2008
The filebeat config above looks quite off with its indentation. Is that just a copy/paste issue?
So today you see yesterday's logs in Kibana? Does that mean that, over the long term, all logs eventually show up? Do Kibana, Logstash, Filebeat, and the server itself use the same timezone?
Yes, the strange formatting was just a copy/paste issue. I checked Kibana today, and the most recent Filebeat IIS logs are from midnight; nothing newer.
Since you process the timestamp, can you check whether IIS is perhaps using the wrong time zone? Can you post a log line here? Also, can you verify that all machines are on UTC?
Run filebeat with -e -d "publish" and check whether Filebeat pushes the most recent events.
Run Logstash with a stdout or file output and check whether it writes the most recent events correctly.
It would also help to first use a minimal LS config and see if all lines show up as expected and then dig deeper into the LS config to see if there is a potential problem.
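A minimal Logstash config along those lines might look like the sketch below (the port 5044 and the rubydebug codec are assumptions; adjust to your setup):

```
# Minimal pipeline for debugging: receive from Filebeat, print every event.
input {
  beats {
    port => 5044
  }
}

output {
  # rubydebug prints each event with all fields, including @timestamp,
  # so you can see immediately whether recent events are arriving.
  stdout {
    codec => rubydebug
  }
}
```

If the most recent IIS lines show up here with correct timestamps, the problem is downstream (Elasticsearch/Kibana or the time filter); if not, dig into Filebeat.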
With today being 6/17 and executing filebeat with -e -d "publish", the first results displayed were from 6/14.
2016/06/17 15:10:35.307882 geolite.go:24: INFO GeoIP disabled: No paths were set under output.geoip.paths
2016/06/17 15:10:35.307882 logstash.go:106: INFO Max Retries set to: 3
2016/06/17 15:10:35.316883 outputs.go:126: INFO Activated logstash as output plugin.
2016/06/17 15:10:35.316883 publish.go:232: DBG Create output worker
2016/06/17 15:10:35.316883 publish.go:274: DBG No output is defined to store the topology. The server fields might not
be filled.
2016/06/17 15:10:35.317883 publish.go:288: INFO Publisher name: SERVER
2016/06/17 15:10:35.321884 async.go:78: INFO Flush Interval set to: 1s
2016/06/17 15:10:35.322884 async.go:84: INFO Max Bulk Size set to: 2048
2016/06/17 15:10:35.322884 async.go:92: DBG create bulk processing worker (interval=1s, bulk size=2048)
2016/06/17 15:10:35.322884 beat.go:147: INFO Init Beat: filebeat; Version: 1.2.3
2016/06/17 15:10:35.323884 beat.go:173: INFO filebeat sucessfully setup. Start running.
2016/06/17 15:10:35.323884 registrar.go:68: INFO Registry file set to: C:\ProgramData\filebeat\registry
2016/06/17 15:10:35.325884 registrar.go:80: INFO Loading registrar data from C:\ProgramData\filebeat\registry
2016/06/17 15:10:35.333885 prospector.go:133: INFO Set ignore_older duration to 24h0m0s
2016/06/17 15:10:35.333885 prospector.go:133: INFO Set close_older duration to 1h0m0s
2016/06/17 15:10:35.334885 prospector.go:133: INFO Set scan_frequency duration to 10s
2016/06/17 15:10:35.334885 prospector.go:90: INFO Invalid input type set:
2016/06/17 15:10:35.334885 prospector.go:93: INFO Input type set to: log
2016/06/17 15:10:35.334885 prospector.go:133: INFO Set backoff duration to 1s
2016/06/17 15:10:35.334885 prospector.go:133: INFO Set max_backoff duration to 10s
2016/06/17 15:10:35.334885 prospector.go:113: INFO force_close_file is disabled
2016/06/17 15:10:35.334885 prospector.go:143: INFO Starting prospector of type: log
2016/06/17 15:10:35.334885 spooler.go:77: INFO Starting spooler: spool_size: 2048; idle_timeout: 5s
2016/06/17 15:10:35.658917 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC2\u_ex160614.log.1
2016/06/17 15:10:35.660918 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC4\u_ex160615.log.1
2016/06/17 15:10:35.660918 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC4\u_ex160613.log.1
2016/06/17 15:10:35.659918 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC4\u_ex160616.log.1
2016/06/17 15:10:35.660918 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC4\u_ex160612.log.1
2016/06/17 15:10:35.661918 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC1\u_ex160616.log.1
2016/06/17 15:10:35.662918 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC2\u_ex160617.log
2016/06/17 15:10:35.662918 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC3\u_ex160616.log.1
2016/06/17 15:10:35.662918 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC2\u_ex160616.log.1
2016/06/17 15:10:35.662918 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC4\u_ex160614.log.1
2016/06/17 15:10:35.664918 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC1\u_ex160617.log
2016/06/17 15:10:35.664918 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC2\u_ex160613.log.1
2016/06/17 15:10:35.665918 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC2\u_ex160615.log.1
2016/06/17 15:10:35.666918 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC3\u_ex160612.log.1
2016/06/17 15:10:35.671919 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC3\u_ex160617.log
2016/06/17 15:10:35.671919 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC1\u_ex160614.log.1
2016/06/17 15:10:35.671919 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC2\u_ex160612.log.1
{:timestamp=>"2016-06-21T08:17:39.851000-0600", :message=>"Received an event that has a different character encoding than you configured.", :text=>"2W\\u0000\\u0000\\u0000\\u00012J\\u0000\\u0000\\u0000\\u0001\\u0000\\u0000\\u0001\\x9E{\\\"@metadata\\\":{\\\"type\\\":\\\"log\\\",\\\"beat\\\":\\\"filebeat\\\"},\\\"input_type\\\":\\\"log\\\",\\\"message\\\":\\\"2016-06-19 00:00:00 W3SVC3 SERVER 10.0.0.1 POST /API/Check - 80 - 10.0.0.2 Job - 200 0 0 558 185 368\\\",\\\"type\\\":\\\"log\\\",\\\"count\\\":1,\\\"fields\\\":null,\\\"beat\\\":{\\\"hostname\\\":\\\"server\\\",\\\"name\\\":\\\"server\\\"},\\\"@timestamp\\\":\\\"2016-06-21T13:01:01.449Z\\\",\\\"source\\\":\\\"C:\\\\\\\\inetpub\\\\\\\\logs\\\\\\\\LogFiles\\\\\\\\W3SVC3\\\\\\\\u_ex160619.log\\\",\\\"offset\\\":304}2W\\u0000\\u0000\\u0000\\u00012J\\u0000\\u0000\\u0000\\u0001\\u0000\\u0000\\u0001\\x9E{\\\"@metadata\\\":{\\\"type\\\":\\\"log\\\",\\\"beat\\\":\\\"filebeat\\\"},\\\"input_type\\\":\\\"log\\\",\\\"source\\\":\\\"C:\\\\\\\\inetpub\\\\\\\\logs\\\\\\\\LogFiles\\\\\\\\W3SVC3\\\\\\\\u_ex160619.log\\\",\\\"offset\\\":304,\\\"message\\\":\\\"2016-06-19 00:00:00 W3SVC3 server 10.0.0.1 POST /API/Check - 80 - 10.0.0.2 Job - 200 0 0 558 185 368\\\",\\\"type\\\":\\\"log\\\",\\\"count\\\":1,\\\"fields\\\":null,\\\"beat\\\":{\\\"hostname\\\":\\\"server\\\",\\\"name\\\":\\\"server\\\"},\\\"@timestamp\\\":\\\"2016-06-21T13:01:01.449Z\\\"}2W\\u0000\\u0000\\u0000\\u00012J\\u0000\\u0000\\u0000\\u0001\\u0000\\u0000\\u0001\\x9E{\\\"@metadata\\\":{\\\"type\\\":\\\"log\\\",\\\"beat\\\":\\\"filebeat\\\"},\\\"input_type\\\":\\\"log\\\",\\\"message\\\":\\\"2016-06-19 00:00:00 W3SVC3 server 10.0.0.1 POST /API/Check - 80 - 10.0.0.2 Job - 200 0 0 558 185 
368\\\",\\\"type\\\":\\\"log\\\",\\\"count\\\":1,\\\"fields\\\":null,\\\"beat\\\":{\\\"hostname\\\":\\\"server\\\",\\\"name\\\":\\\"server\\\"},\\\"@timestamp\\\":\\\"2016-06-21T13:01:01.449Z\\\",\\\"source\\\":\\\"C:\\\\\\\\inetpub\\\\\\\\logs\\\\\\\\LogFiles\\\\\\\\W3SVC3\\\\\\\\u_ex160619.log\\\",\\\"offset\\\":304}", :expected_charset=>"UTF-8", :level=>:warn}
There definitely seems to be something wrong with the encoding. What is the encoding of the files you are reading? I suspect these files are not UTF-8. If they are not, you have to set the encoding accordingly in Filebeat.
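In Filebeat 1.x the encoding is set per prospector. A sketch for the IIS paths from the log output above (the encoding value is an assumption; check your files' actual encoding first, e.g. with a hex editor or PowerShell):

```
filebeat:
  prospectors:
    - paths:
        - C:\inetpub\logs\LogFiles\*\*.log
      # Assumed value: IIS W3C logs are UTF-8 by default, but change this
      # to match what the files actually contain (e.g. utf-16le).
      encoding: utf-8
```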
Filebeat is expected to communicate with the Logstash beats input, not the TCP input. Filebeat talks to Logstash over the lumberjack protocol, which is what the beats input implements. The data sent by Beats is JSON, and the beats input knows this and decodes it.
Filebeat is where you need to set the correct encoding type for your log file.
In your Logstash config, setting type to IIS on the beats input will have no effect, because type is already set by Filebeat (see Logstash's docs for type). To set the type to IIS, use the document_type config setting in Filebeat.
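Putting that together, a hedged sketch of the Filebeat side (paths taken from the logs above; the rest follows the Filebeat 1.x prospector layout):

```
filebeat:
  prospectors:
    - paths:
        - C:\inetpub\logs\LogFiles\*\*.log
      # Sets the "type" field on each event; Logstash conditionals and
      # grok filters can then match on [type] == "iis".
      document_type: iis
```

On the Logstash side you would then branch on the type field in your filter section (if [type] == "iis" { grok { ... } }) rather than forcing it on the input.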
Thanks Andrew - marked as answer. In logstash I have type set to IIS because I have other logs coming in with different grok expressions. But you are correct, in filebeat I do have document_type set to IIS.