Filebeat on Windows 2012 R2, Error Sending logs to Logstash


(Smit Jain) #1

Team,

I am using Filebeat > Logstash > Elasticsearch > Kibana to parse and analyse logs, mainly Java stack traces and other Tomcat logs.

Here is the YML for Filebeat:
filebeat:
  prospectors:
    -
      paths:
        - C:\logs\OCR\abbyy.log
      #input_type: log
      #document_type: UAT_OCR
      exclude_lines: [".+DEBUG"]
      multiline:
        pattern: ".+(ERROR|INFO)"
        negate: true
        match: after
      fields:
        app_name: OCR
        environment: UAT
      fields_under_root: true
      #force_close_files: true
      scan_frequency: 10s
      #close_older: 2h
  spool_size: 2048
  publish_async: true

output:
  logstash:
    host: "10.0.64.14"
    port: 5044
    index: filebeat
    timeout: 5
    reconnect_interval: 3
    bulk_max_size: 2048

shipper:
  tags: ["ABBYY_Engine", "UAT_OCR"]
  queue_size: 1000

### Enable logging of the filebeat
logging:
  level: warning
  to_files: true
  files:
    path: c:\logs\
    name: mybeat.log
    rotateeverybytes: 20485760 # = 20MB
    keepfiles: 7
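For what it's worth, the two regexes in the config above can be sanity-checked outside Filebeat. A small Python sketch (the sample log lines are invented, modelled on the Tomcat output later in this thread; Filebeat matches patterns as an unanchored search, so `re.search` is used):

```python
import re

# Patterns copied from the filebeat.yml above.
exclude_pattern = re.compile(r".+DEBUG")           # exclude_lines
multiline_pattern = re.compile(r".+(ERROR|INFO)")  # multiline.pattern (negate: true, match: after)

# Invented sample lines modelled on the Tomcat log format in this thread.
samples = [
    "2016-06-09 17:12:59,402 INFO  http-apr-8080-exec-6 com.example.Service - starting",
    "java.lang.NullPointerException",                 # stack-trace line, no level
    "\tat com.example.Service.run(Service.java:42)",  # stack-trace continuation
    "2016-06-09 17:13:00,241 DEBUG http-apr-8080-exec-6 org.apache.pdfbox - parsed",
]

for line in samples:
    # With negate: true and match: after, a line NOT matching the multiline
    # pattern is appended to the previous event.
    starts_event = bool(multiline_pattern.search(line))
    excluded = bool(exclude_pattern.search(line))
    print(f"starts_event={starts_event!s:<5} excluded={excluded!s:<5} {line[:50]}")
```

Note that the DEBUG line does not match the multiline pattern either, so with `negate: true` it would be appended to the preceding INFO/ERROR event rather than starting its own event.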

Problem: Filebeat is at times unable to send logs to Logstash; sometimes it starts shipping, but sometimes it doesn't.
However, if I use "test.log" as a prospector and save the output locally on disk via the config below, it works well.

Writing events to a local file to check the output. I have tried the "file" and "logstash" outputs one by one.

output:
  file:
    path: c:\logs\
    filename: filebeat
    rotate_every_kb: 100000
    number_of_files: 7

Also, things mostly run when I am using the command line:

.\filebeat.exe -c filebeat.yml -e -v

@Filebeat Team, kindly assist with the correct config for Windows.
The log file "abbyy.log" is rotated at every 30 MB of size.

I am not very sure how to use the attributes below and how they will function with Filebeat on Windows:

  • close_older

  • ignore_older

  • Logging: this is also not working on Windows.
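My current understanding of the first two (please correct me if wrong), with illustrative values:

```yaml
filebeat:
  prospectors:
    -
      paths:
        - C:\logs\OCR\abbyy.log
      # close_older: release the open file handle once the file has not
      # changed for this duration. On Windows an open handle can block the
      # application from rotating (renaming/deleting) the log file, so it
      # presumably should be shorter than the rotation interval.
      close_older: 10m
      # ignore_older: skip files whose modification time is older than this
      # window; 0 disables the check.
      ignore_older: 24h
```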


(Mark Walkom) #2

What do you mean by "the Filebeat is not able to send logs to logstash at times, some times it start running shipping but sometimes it doesn't"?

What are you seeing? Is there something in the logs?


(Smit Jain) #3

@warkolm when I try the command-line option, Filebeat ships to Logstash only those files which are not opened by any other application. But when I start it as a service, it doesn't ship any logs.

I don't understand why it's NOT shipping logs for the files which are open. The log files are rotated every time the size reaches 20 MB.


(ruflin) #4

Can you start filebeat with the -e -d "*" options and check if it gives you some indication of what goes wrong? If not, you can post the output here.


(Smit Jain) #5

@ruflin the -d option did show the desired output on the console, but filebeat is not parsing the whole file. The file is around 17 MB with 1400000 lines, and it started parsing from line 125000 when it should be parsing the full log file.

Here is the output without the -d option:
PS C:\Filebeat> .\filebeat.exe -c filebeat.yml -e -v
2016/06/09 12:29:21.881312 geolite.go:24: INFO GeoIP disabled: No paths were set under output.ge
2016/06/09 12:29:21.882298 logstash.go:106: INFO Max Retries set to: 3
2016/06/09 12:29:21.883309 outputs.go:126: INFO Activated logstash as output plugin.
2016/06/09 12:29:21.883309 publish.go:288: INFO Publisher name: examplehost
2016/06/09 12:29:21.886298 async.go:78: INFO Flush Interval set to: 1s
2016/06/09 12:29:21.886298 async.go:84: INFO Max Bulk Size set to: 2048
2016/06/09 12:29:21.886298 beat.go:147: INFO Init Beat: filebeat; Version: 1.2.3
2016/06/09 12:29:21.887309 beat.go:173: INFO filebeat sucessfully setup. Start running.
2016/06/09 12:29:21.887309 registrar.go:68: INFO Registry file set to: C:\Filebeat.filebeat
2016/06/09 12:29:21.887309 registrar.go:80: INFO Loading registrar data from C:\Filebeat.filebe
2016/06/09 12:29:21.888282 prospector.go:133: INFO Set ignore_older duration to 0
2016/06/09 12:29:21.888282 prospector.go:133: INFO Set close_older duration to 1h0m0s
2016/06/09 12:29:21.888282 prospector.go:133: INFO Set scan_frequency duration to 5s
2016/06/09 12:29:21.889306 prospector.go:93: INFO Input type set to: log
2016/06/09 12:29:21.889306 prospector.go:133: INFO Set backoff duration to 1s
2016/06/09 12:29:21.889306 prospector.go:133: INFO Set max_backoff duration to 10s
2016/06/09 12:29:21.889306 prospector.go:113: INFO force_close_file is disabled
2016/06/09 12:29:21.890296 prospector.go:143: INFO Starting prospector of type: log
2016/06/09 12:29:21.890296 crawler.go:78: INFO All prospectors initialised with 1 states to pers
2016/06/09 12:29:21.890296 registrar.go:87: INFO Starting Registrar
2016/06/09 12:29:21.891309 publish.go:88: INFO Start sending events to output
2016/06/09 12:29:21.891309 log.go:113: INFO Harvester started for file: C:\logs\OCR\example.log
2016/06/09 12:29:21.889306 spooler.go:77: INFO Starting spooler: spool_size: 2048; idle_timeout:
2016/06/09 12:29:24.740291 publish.go:104: INFO Events sent: 241
2016/06/09 12:29:24.747273 registrar.go:162: INFO Registry file updated. 1 states written.
2016/06/09 12:29:33.549204 publish.go:104: INFO Events sent: 1356
2016/06/09 12:29:33.552179 registrar.go:162: INFO Registry file updated. 1 states written.
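Side note on the partial read: the "Registry file set to: C:\Filebeat.filebeat" and "Loading registrar data" lines above suggest filebeat resumed from a previously saved byte offset rather than the start of the file. A rough Python sketch of that idea; the JSON layout and field names here are an assumption modelled on the Filebeat 1.x registry, not its exact format (the offset value is taken from the debug log in this thread):

```python
import json

# Hypothetical registry content; Filebeat 1.x keeps a JSON map from file
# path to its last harvested state (field names here are an assumption).
registry_json = r'{"C:\\logs\\OCR\\example.log": {"offset": 17224636}}'

registry = json.loads(registry_json)
for path, state in registry.items():
    # A resuming harvester seeks to `offset` and reads onward, so lines
    # before that byte position are never shipped again.
    print(f"{path}: resume reading at byte {state['offset']}")

# Deleting the registry file while filebeat is stopped forces a full
# re-read of every file from offset 0.
```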


(Smit Jain) #6

While using the "-d" option the output is quite long, so I cannot post it all here on the forum.
PS C:\Filebeat> .\filebeat.exe -c filebeat.yml -e -v -d "*"
2016/06/09 12:35:18.412653 beat.go:135: DBG Initializing output plugins
2016/06/09 12:35:18.413701 geolite.go:24: INFO GeoIP disabled: No paths were set under output.geoip.paths
2016/06/09 12:35:18.413701 logstash.go:106: INFO Max Retries set to: 3
2016/06/09 12:35:18.413701 client.go:100: DBG connect
2016/06/09 12:35:18.415697 outputs.go:126: INFO Activated logstash as output plugin.
2016/06/09 12:35:18.415697 publish.go:232: DBG Create output worker
2016/06/09 12:35:18.415697 publish.go:274: DBG No output is defined to store the topology. The server fields might not be filled.
2016/06/09 12:35:18.416697 publish.go:288: INFO Publisher name: examplehost
2016/06/09 12:35:18.418699 async.go:78: INFO Flush Interval set to: 1s
2016/06/09 12:35:18.419697 async.go:84: INFO Max Bulk Size set to: 2048
2016/06/09 12:35:18.419697 async.go:92: DBG create bulk processing worker (interval=1s, bulk size=2048)
2016/06/09 12:35:18.419697 beat.go:147: INFO Init Beat: filebeat; Version: 1.2.3
2016/06/09 12:35:18.420698 beat.go:173: INFO filebeat sucessfully setup. Start running.
2016/06/09 12:35:18.420698 registrar.go:68: INFO Registry file set to: C:\Filebeat.filebeat
2016/06/09 12:35:18.420698 registrar.go:80: INFO Loading registrar data from C:\Filebeat.filebeat
2016/06/09 12:35:18.421697 spooler.go:44: DBG Set idleTimeoutDuration to 5s
2016/06/09 12:35:18.421697 crawler.go:38: DBG File Configs: [C:\logs\OCR\example.log]
2016/06/09 12:35:18.421697 prospector.go:133: INFO Set ignore_older duration to 0
2016/06/09 12:35:18.421697 prospector.go:133: INFO Set close_older duration to 1h0m0s
2016/06/09 12:35:18.422697 prospector.go:133: INFO Set scan_frequency duration to 5s
2016/06/09 12:35:18.422697 prospector.go:93: INFO Input type set to: log
2016/06/09 12:35:18.422697 prospector.go:133: INFO Set backoff duration to 1s
2016/06/09 12:35:18.422697 prospector.go:133: INFO Set max_backoff duration to 10s
2016/06/09 12:35:18.423697 spooler.go:77: INFO Starting spooler: spool_size: 2048; idle_timeout: 5s
2016/06/09 12:35:18.423697 prospector.go:113: INFO force_close_file is disabled
2016/06/09 12:35:18.423697 crawler.go:58: DBG Waiting for 1 prospectors to initialise
2016/06/09 12:35:18.423697 prospector.go:143: INFO Starting prospector of type: log
2016/06/09 12:35:18.424697 prospector.go:161: DBG exclude_files: []
2016/06/09 12:35:18.424697 prospector.go:261: DBG scan path C:\logs\OCR\example.log
2016/06/09 12:35:18.424697 prospector.go:275: DBG Check file for harvesting: C:\logs\OCR\example.log
2016/06/09 12:35:18.424697 registrar.go:175: DBG Same file as before found. Fetch the state.
2016/06/09 12:35:18.425649 prospector.go:345: DBG Start harvesting unknown file: C:\logs\OCR\example.log
2016/06/09 12:35:18.425649 prospector.go:397: DBG Resuming harvester on a previously harvested file: C:\logs\OCR\example.log
2016/06/09 12:35:18.425649 crawler.go:71: DBG Registrar will re-save state for C:\logs\OCR\example.log
2016/06/09 12:35:18.426697 crawler.go:65: DBG No pending prospectors. Finishing setup
2016/06/09 12:35:18.426697 crawler.go:78: INFO All prospectors initialised with 1 states to persist
2016/06/09 12:35:18.426697 registrar.go:87: INFO Starting Registrar

Rest in the next post.


(Smit Jain) #7
2016/06/09 12:35:18.426697 publish.go:88: INFO Start sending events to output
2016/06/09 12:35:18.422697 service_windows.go:49: DBG  Windows is interactive: true
2016/06/09 12:35:18.427697 prospector.go:261: DBG  scan path C:\logs\OCR\example.log
2016/06/09 12:35:18.427697 prospector.go:275: DBG  Check file for harvesting: C:\logs\OCR\example.log
2016/06/09 12:35:18.428646 registrar.go:175: DBG  Same file as before found. Fetch the state.
2016/06/09 12:35:18.428646 prospector.go:418: DBG  Update existing file for harvesting: C:\logs\OCR\example.log
2016/06/09 12:35:18.428646 prospector.go:465: DBG  Not harvesting, file didn't change: C:\logs\OCR\example.log
2016/06/09 12:35:18.428646 log.go:270: DBG  harvest: "C:\\logs\\OCR\\example.log" position:17224636 (offset snapshot:0)
2016/06/09 12:35:18.429713 log.go:113: INFO Harvester started for file: C:\logs\OCR\example.log
2016/06/09 12:35:18.429713 util.go:20: DBG  full line read
2016/06/09 12:35:18.429713 log.go:197: DBG  Drop line as it does match one of the exclude patterns2016-06-09 17:12:59,402 INFO  http-apr-8080-exec-6 com.blackbox.ids.example.services.impl.DataExtraction892TemplateService - File name : Row_24 Field Value :
2016-06-09 17:13:00,241 DEBUG http-apr-8080-exec-6 org.apache.pdfbox.pdfparser.PDFObjectStreamParser - parsed=COSObject{2, 0}
2016-06-09 17:13:00,241 DEBUG http-apr-8080-exec-6 org.apache.pdfbox.pdfparser.PDFObjectStreamParser - parsed=COSObject{12, 0}
2016-06-09 17:13:00,241 DEBUG http-apr-8080-exec-6 org.apache.pdfbox.pdfparser.PDFObjectStreamParser - parsed=COSObject{3, 0}
2016-06-09 17:13:00,241 DEBUG http-apr-8080-exec-6 org.apache.pdfbox.pdfparser.PDFObjectStreamParser - parsed=COSObject{4, 0}
2016-06-09 17:13:00,241 DEBUG http-apr-8080-exec-6 org.apache.pdfbox.pdfparser.PDFObjectStreamParser - parsed=COSObject{5, 0}
2016-06-09 17:13:00,241 DEBUG http-apr-8080-exec-6 org.apache.pdfbox.pdfparser.PDFObjectStreamParser - parsed=COSObject{9, 0}
2016-06-09 17:13:00,241 DEBUG http-apr-8080-exec-6 org.apache.pdfbox.pdfparser.PDFObjectStreamParser - parsed=COSObject{11, 0}
2016/06/09 12:35:18.433698 util.go:20: DBG  full line read
2016/06/09 12:35:18.433698 log.go:197: DBG  Drop line as it does match one of the exclude patterns2016-06-09 17:13:08,847 INFO  http-apr-8080-exec-6 com.blackbox.ids.example.server.DefaultexampleResource - Unloading Engine..
2016-06-09 17:13:08,847 DEBUG http-apr-8080-exec-6 org.apache.cxf.phase.PhaseInterceptorChain - Invoking handleMessage on interceptor org.apache.cxf.interceptor.OutgoingChainInterceptor@1a788b
2016-06-09 17:13:08,847 DEBUG http-apr-8080-exec-6 org.apache.cxf.interceptor.OutgoingChainInterceptor - Interceptors contributed by bus: []
2016-06-09 17:13:08,848 DEBUG http-apr-8080-exec-6 org.apache.cxf.interceptor.OutgoingChainInterceptor -

(Smit Jain) #8

The problem still exists; Logstash doesn't receive any logs from Filebeat.


(Steffen Siering) #9

Totally missing any logs from the logstash output here. Instead, the log contains Drop line as it does match one of the exclude patterns.
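Those drop messages suggest a likely explanation: multiline joining is applied before exclude_lines (the dropped "line" in the debug output above is an INFO line plus the DEBUG lines that follow it). Because the multiline pattern `.+(ERROR|INFO)` with `negate: true` appends every non-INFO/ERROR line, including DEBUG lines, to the previous event, the joined INFO event ends up containing "DEBUG" and is then dropped by `.+DEBUG`. A small Python simulation of that order of operations (my reconstruction, not Filebeat's actual code; sample lines are shortened versions of the ones in this thread):

```python
import re

multiline_pattern = re.compile(r".+(ERROR|INFO)")  # negate: true, match: after
exclude_pattern = re.compile(r".+DEBUG")           # exclude_lines

raw_lines = [
    "2016-06-09 17:12:59,402 INFO  http-apr-8080-exec-6 com.example.Service - File name : Row_24",
    "2016-06-09 17:13:00,241 DEBUG http-apr-8080-exec-6 org.apache.pdfbox - parsed=COSObject{2, 0}",
    "2016-06-09 17:13:08,847 INFO  http-apr-8080-exec-6 com.example.Resource - Unloading Engine..",
]

# Step 1: multiline joining. With negate: true / match: after, a line that
# does NOT match the pattern is appended to the previous event.
events = []
for line in raw_lines:
    if multiline_pattern.search(line) or not events:
        events.append(line)        # starts a new event
    else:
        events[-1] += "\n" + line  # glued onto the previous event

# Step 2: exclude_lines runs on the joined event. The first INFO event now
# contains a DEBUG line, so ".+DEBUG" finds a match and the whole event
# (including the INFO line) is dropped.
kept = [e for e in events if not exclude_pattern.search(e)]

print(f"{len(events)} events after multiline, {len(kept)} kept after exclude_lines")
```

If this reading is right, the INFO events that get lost are exactly the ones followed by DEBUG output, which would explain why Logstash receives so little.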


(system) #10

This topic was automatically closed after 21 days. New replies are no longer allowed.