Winlogbeat file output -> logstash -> Kibana

Hi, I'm having trouble getting Logstash to parse a Winlogbeat output file that I created on another computer. Additional info: the computer that generates the Winlogbeat file has no LAN or WAN connection, so it cannot connect to Logstash directly.

Below is the section of winlogbeat.yml that writes the output file, which will then be copied to the ELK server:

output.file:
  path: "C:/tmp/winlogbeat"
  filename: winlogbeat
  #rotate_every_kb: 10000
  #number_of_files: 7
  #permissions: 0600

Here is my Logstash config:

input {
  beats {
    port => 5044
    #ssl => true
    #ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    #ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
  file {
    path => "/home/user/dummy/*"
    start_position => "beginning"
  }
}

filter {
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

The problem I'm facing is that the Winlogbeat events are not showing up in Kibana. Any help would be welcome, thanks.

PS: The Winlogbeat index pattern and dashboard templates are already loaded on the ELK server.

Could you share the debug logs of your Winlogbeat instance?

Sure, here are my Winlogbeat logs from the log folder. The log file was created without any problem, but when I copy it to my ELK server, Logstash doesn't seem to load it.

2018-02-16T18:25:45.779+0800 INFO instance/beat.go:468 Home path: [E:\Winlogbeat] Config path: [E:\Winlogbeat] Data path: [E:\Winlogbeat\data] Logs path: [E:\Winlogbeat\logs]
2018-02-16T18:25:45.991+0800 INFO instance/beat.go:475 Beat UUID: cbc082e6-3a82-4070-add1-e5eccd273d42
2018-02-16T18:25:45.991+0800 INFO instance/beat.go:213 Setup Beat: winlogbeat; Version: 6.2.1
2018-02-16T18:25:45.992+0800 WARN instance/metrics_other.go:8 Metrics not implemented for this OS.
2018-02-16T18:25:45.997+0800 INFO fileout/file.go:76 Initialized file output. path=C:\Users\user\Desktop\winlogbeat-6.2.1-windows-x86_64\tmp\winlogbeat max_size_bytes=10485760 max_backups=7 permissions=-rw-------
2018-02-16T18:25:46.027+0800 INFO pipeline/module.go:76 Beat name: DESKTOP-HIIDDG6
2018-02-16T18:25:46.028+0800 INFO beater/winlogbeat.go:56 State will be read from and persisted to E:\Winlogbeat\data.winlogbeat.yml
2018-02-16T18:25:46.062+0800 INFO instance/beat.go:301 winlogbeat start running.
2018-02-16T18:25:46.062+0800 INFO [monitoring] log/log.go:97 Starting metrics logging every 30s
2018-02-16T18:25:46.157+0800 WARN beater/eventlogger.go:87 EventLog[Security] Open() error. No events will be read from this source. Access is denied.
2018-02-16T18:25:47.509+0800 INFO beater/eventlogger.go:56 EventLog[Windows PowerShell] successfully published 7 events
2018-02-16T18:25:48.861+0800 INFO beater/eventlogger.go:56 EventLog[Application] successfully published 100 events
2018-02-16T18:25:48.861+0800 INFO beater/eventlogger.go:56 EventLog[System] successfully published 102 events
2018-02-16T18:26:16.064+0800 INFO [monitoring] log/log.go:124 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"libbeat":{"config":{"module":{"running":0}},"output":{"events":{"acked":209,"batches":2,"total":209},"type":"file","write":{"bytes":208561}},"pipeline":{"clients":3,"events":{"active":0,"published":209,"total":209},"queue":{"acked":209}}},"msg_file_cache":{"ApplicationHits":83,"ApplicationMisses":18,"ApplicationSize":18,"SystemHits":75,"SystemMisses":27,"SystemSize":27,"Windows PowerShellHits":6,"Windows PowerShellMisses":1,"Windows PowerShellSize":1},"published_events":{"Application":100,"System":102,"Windows PowerShell":7,"total":209},"uptime":"{"server_time":"2018-02-16T10:26:16.0638814Z","start_time":"2018-02-16T10:25:45.58109Z","uptime":"30.4827914s","uptime_ms":"30482791"}"}}}

My objective is this:
Winlogbeat (creates the output file) -> copy the file to a USB drive -> paste it into a folder on the ELK server -> Logstash loads the file into Elasticsearch and Kibana

It looks to me like you are missing a JSON decode step in Logstash. The data in the file is JSON, so you need to apply a json codec.
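For context: Winlogbeat's file output writes one JSON document per line (newline-delimited JSON), which is why a json codec on the file input works. A minimal Python sketch of what the codec does for each line; the sample event below is a hypothetical, heavily trimmed version of a real Winlogbeat event:

```python
import json

# One simplified line from a Winlogbeat file output (hypothetical sample;
# real events carry many more fields).
line = '{"@timestamp": "2018-02-16T10:25:47.509Z", "beat": {"name": "DESKTOP-HIIDDG6"}, "log_name": "System"}'

# This is essentially what Logstash's json codec does per line:
# decode the JSON text into a structured event with addressable fields.
event = json.loads(line)
print(event["beat"]["name"])
print(event["log_name"])
```

Without the codec, Logstash's file input treats each line as an opaque string in the `message` field, so none of the Winlogbeat fields are available for indexing.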

@andrewkroh It works like a charm, thanks a lot! I tested it twice by deleting the index with curl -XDELETE and then adding the Winlogbeat files again; Logstash detected the file change and loaded it into Elasticsearch and Kibana without a problem.

input {
  file {
    path => "/home/user/dummy/*"
    codec => "json"
    start_position => "beginning"
    tags => ['windows', 'eventlog', 'dc']
  }
}

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.