I am new to Filebeat and Logstash and am feeling overwhelmed.
My goal is to ingest .json files with Filebeat, ship them to Logstash, and finally index them into Elasticsearch.
I am currently working on my local dev machine running Windows 7.
For right now I am just trying to take the files from Filebeats and output them to the console.
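If I'm reading the Filebeat 5.x docs correctly, a minimal filebeat.yml for this test would look something like the sketch below (the paths are from my setup, and output.console is what I believe prints events to stdout instead of shipping them to Logstash):

```yaml
# Read every .json file in the data directory (path is from my setup)
filebeat.prospectors:
  - input_type: log
    paths:
      - c:\Elasticsearch5\elasticsearch\data\*.json

# For now, just print events to the console instead of shipping them.
# Filebeat 5.x only allows one output at a time, so the logstash output
# section would have to be commented out.
output.console:
  pretty: true
```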
I have been all over the internet looking at different documents and have had no success!
I'm considering going back to feeding the Elasticsearch API from a console app!
Thanks for the response. I am not getting any log files in the C:\Program Files\Filebeat\logs directory.
I have the logging.level set to debug in my filebeat.yml file.
```yaml
#================================ Logging =====================================
# Sets log level. The default log level is info.
# Available log levels are: critical, error, warning, info, debug
logging.level: debug

# At debug level, you can selectively enable logging only for some components.
# To enable all selectors use ["*"]. Examples of other selectors are "beat",
# "publish", "service".
logging.selectors: ["*"]
```
I updated the Logging section:
```yaml
#================================ Logging =====================================
# Sets log level. The default log level is info.
# Available log levels are: critical, error, warning, info, debug
logging.level: debug

# At debug level, you can selectively enable logging only for some components.
# To enable all selectors use ["*"]. Examples of other selectors are "beat",
# "publish", "service".
logging.selectors: ["*"]

logging.files:
  path: C:\Program Files\Filebeat\logs
  name: filebeat.log
  keepfiles: 7
```
I restarted Filebeat with these new settings and nothing appeared in the directory.
I disabled Windows Firewall and nothing changed. No logs, and it doesn't appear that anything is happening in Logstash.
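For reference, my understanding is that the Logstash side of this test only needs a minimal beats input plus a stdout output, something like the sketch below (5044 is the default Beats port, which I'm assuming matches the hosts setting in filebeat.yml):

```
input {
  beats {
    port => 5044
  }
}

output {
  # Pretty-print each received event to the Logstash console
  stdout {
    codec => rubydebug
  }
}
```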
Here's the log from my Filebeat console window. Anything useful?

```
logp.go:219: INFO Metrics logging every 30s
beat.go:267: INFO Home path: [c:\Program Files\Filebeat] Config path: [c:\Program Files\Filebeat] Data path: [c:\Program Files\Filebeat\data] Logs path: [c:\Program Files\Filebeat\logs]
beat.go:177: INFO Setup Beat: filebeat; Version: 5.1.1
processor.go:43: DBG Processors:
beat.go:183: DBG Initializing output plugins
logstash.go:90: INFO Max Retries set to: 3
outputs.go:106: INFO Activated logstash as output plugin.
publish.go:234: DBG Create output worker
publish.go:276: DBG No output is defined to store the topology. The server fields might not be filled.
publish.go:291: INFO Publisher name: AARONB64WIN7
async.go:63: INFO Flush Interval set to: 1s
async.go:64: INFO Max Bulk Size set to: 2048
async.go:72: DBG create bulk processing worker (interval=1s, bulk size=2048)
beat.go:207: INFO filebeat start running.
service_windows.go:51: DBG Windows is interactive: true
registrar.go:85: INFO Registry file set to: c:\Program Files\Filebeat\data\registry
registrar.go:106: INFO Loading registrar data from c:\Program Files\Filebeat\data\registry
registrar.go:131: INFO States Loaded from registrar: 3
sync.go:41: INFO Start sending events to output
registrar.go:230: INFO Starting Registrar
crawler.go:34: INFO Loading Prospectors: 1
spooler.go:63: INFO Starting spooler: spool_size: 2048; idle_timeout: 5s
prospector_log.go:41: DBG exclude_files: []
state.go:64: DBG New state added for c:\Elasticsearch5\elasticsearch\data\10.json
state.go:64: DBG New state added for c:\Elasticsearch5\elasticsearch\data\20.json
state.go:64: DBG New state added for c:\Elasticsearch5\elasticsearch\data\d2000.json
prospector_log.go:57: INFO Prospector with previous states loaded: 3
prospector.go:69: DBG File Configs: [c:\Elasticsearch5\elasticsearch\data*.json]
crawler.go:46: INFO Loading Prospectors completed. Number of prospectors: 1
crawler.go:61: INFO All prospectors are initialised and running with 3 states to persist
crawler.go:56: DBG Starting prospector 0
prospector.go:111: INFO Starting prospector of type: log
prospector_log.go:62: DBG Start next scan
prospector_log.go:212: DBG Check file for harvesting: c:\Elasticsearch5\elasticsearch\data\ABC.json
prospector_log.go:245: DBG Update existing file for harvesting: c:\Elasticsearch5\elasticsearch\data\ABC.json, offset: 0
prospector_log.go:254: DBG Resuming harvesting of file: c:\Elasticsearch5\elasticsearch\data\ABC.json, offset: 0
log.go:256: DBG Setting offset for file based on seek: c:\Elasticsearch5\elasticsearch\data\ABC.json
log.go:241: DBG Setting offset for file: c:\Elasticsearch5\elasticsearch\data\ABC.json. Offset: 0
prospector_log.go:212: DBG Check file for harvesting: c:\Elasticsearch5\elasticsearch\data\DEF.json
log.go:84: INFO Harvester started for file: c:\Elasticsearch5\elasticsearch\data\ABC.json
prospector_log.go:245: DBG Update existing file for harvesting: c:\Elasticsearch5\elasticsearch\data\DEF.json, offset: 0
log_file.go:84: DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\ABC.json; Backoff now.
prospector_log.go:254: DBG Resuming harvesting of file: c:\Elasticsearch5\elasticsearch\data\DEF.json, offset: 0
log.go:256: DBG Setting offset for file based on seek: c:\Elasticsearch5\elasticsearch\data\DEF.json
log.go:241: DBG Setting offset for file: c:\Elasticsearch5\elasticsearch\data\DEF.json. Offset: 0
prospector_log.go:83: DBG Prospector states cleaned up. Before: 3, After: 3
log.go:84: INFO Harvester started for file: c:\Elasticsearch5\elasticsearch\data\DEF.json
prospector_log.go:99: DBG Remove state for file as file removed: c:\Elasticsearch5\elasticsearch\data\d2000.json
log_file.go:84: DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\DEF.json; Backoff now.
log_file.go:84: DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\ABC.json; Backoff now.
log_file.go:84: DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\DEF.json; Backoff now.
log_file.go:84: DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\ABC.json; Backoff now.
log_file.go:84: DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\DEF.json; Backoff now.
spooler.go:89: DBG Flushing spooler because of timeout. Events flushed: 6
client.go:128: DBG No events to publish
sync.go:68: DBG Events sent: 6
registrar.go:269: DBG Processing 6 events
state.go:109: DBG State removed for c:\Elasticsearch5\elasticsearch\data\d2000.json because of older: 0s
registrar.go:255: DBG Registrar states cleaned up. Before: 3, After: 2
registrar.go:292: DBG Write registry file: c:\Program Files\Filebeat\data\registry
registrar.go:317: DBG Registry file updated. 2 states written.
```
I ran the following and it output the log below:

```
c:\Program Files\Filebeat>filebeat -c filebeat.yml -d "*" -v -once
DBG Disable stderr logging
INFO Metrics logging every 30s
INFO Home path: [c:\Program Files\Filebeat] Config path: [c:\Program Files\Filebeat] Data path: [c:\Program Files\Filebeat\data] Logs path: [c:\Program Files\Filebeat\logs]
INFO Setup Beat: filebeat; Version: 5.1.1
DBG Processors:
DBG Initializing output plugins
INFO Max Retries set to: 3
INFO Activated logstash as output plugin.
DBG Create output worker
DBG No output is defined to store the topology. The server fields might not be filled.
INFO Publisher name: AARONB64WIN7
INFO Flush Interval set to: 1s
INFO Max Bulk Size set to: 2048
DBG create bulk processing worker (interval=1s, bulk size=2048)
INFO filebeat start running.
DBG Windows is interactive: true
INFO Registry file set to: c:\Program Files\Filebeat\data\registry
INFO Loading registrar data from c:\Program Files\Filebeat\data\registry
INFO States Loaded from registrar: 3
INFO Starting Registrar
INFO Start sending events to output
INFO Loading Prospectors: 1
INFO Starting spooler: spool_size: 2048; idle_timeout: 5s
DBG exclude_files: []
DBG New state added for c:\Elasticsearch5\elasticsearch\data\ABC1.json
DBG New state added for c:\Elasticsearch5\elasticsearch\data\DEF2.json
DBG New state added for c:\Elasticsearch5\elasticsearch\data\GHI3.json
INFO Prospector with previous states loaded: 3
DBG File Configs: [c:\Elasticsearch5\elasticsearch\data*.json]
INFO Loading Prospectors completed. Number of prospectors: 1
INFO All prospectors are initialised and running with 3 states to persist
DBG Starting prospector 0
INFO Starting prospector of type: log
INFO Running filebeat once. Waiting for completion ...
DBG Start next scan
DBG Check file for harvesting: c:\Elasticsearch5\elasticsearch\data\ABC11.json
DBG Update existing file for harvesting: c:\Elasticsearch5\elasticsearch\data\ABC11.json, offset: 0
DBG Resuming harvesting of file: c:\Elasticsearch5\elasticsearch\data\ABC11.json, offset: 0
DBG Setting offset for file based on seek: c:\Elasticsearch5\elasticsearch\data\ABC11.json
DBG Setting offset for file: c:\Elasticsearch5\elasticsearch\data\ABC11.json. Offset: 0
DBG Check file for harvesting: c:\Elasticsearch5\elasticsearch\data\DEF22.json
INFO Harvester started for file: c:\Elasticsearch5\elasticsearch\data\ABC11.json
DBG Update existing file for harvesting: c:\Elasticsearch5\elasticsearch\data\DEF22.json, offset: 0
DBG Resuming harvesting of file: c:\Elasticsearch5\elasticsearch\data\DEF22.json, offset: 0
DBG Setting offset for file based on seek: c:\Elasticsearch5\elasticsearch\data\DEF22.json
DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\ABC11.json; Backoff now.
DBG Setting offset for file: c:\Elasticsearch5\elasticsearch\data\DEF22.json. Offset: 0
DBG Check file for harvesting: c:\Elasticsearch5\elasticsearch\data\GHI33.json
INFO Harvester started for file: c:\Elasticsearch5\elasticsearch\data\DEF22.json
DBG Update existing file for harvesting: c:\Elasticsearch5\elasticsearch\data\GHI33.json, offset: 0
DBG Resuming harvesting of file: c:\Elasticsearch5\elasticsearch\data\GHI33.json, offset: 0
DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\DEF22.json; Backoff now.
DBG Setting offset for file based on seek: c:\Elasticsearch5\elasticsearch\data\GHI33.json
DBG Setting offset for file: c:\Elasticsearch5\elasticsearch\data\GHI33.json. Offset: 0
DBG Prospector states cleaned up. Before: 3, After: 3
INFO Harvester started for file: c:\Elasticsearch5\elasticsearch\data\GHI33.json
DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\GHI33.json; Backoff now.
DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\ABC11.json; Backoff now.
DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\DEF22.json; Backoff now.
DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\GHI33.json; Backoff now.
DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\ABC11.json; Backoff now.
DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\DEF22.json; Backoff now.
DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\GHI33.json; Backoff now.
DBG Flushing spooler because of timeout. Events flushed: 6
DBG No events to publish
DBG Events sent: 6
DBG Processing 6 events
DBG Registrar states cleaned up. Before: 3, After: 3
DBG Write registry file: c:\Program Files\Filebeat\data\registry
DBG Registry file updated. 3 states written.
DBG Received sigterm/sigint, stopping
```
@magnusbaeck Continuing my research... are there certain properties that have to be included in the .json file, or added to the "log", for Logstash to pick it up?
I saw something about @version and @timestamp.
My JSON was as basic as possible: {"ID":1, "Name":"Aaron"}
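One thing I noticed while reading the Filebeat docs: by default each line is shipped as an opaque string in the message field, and the prospector's json.* options apparently have to be set for Filebeat itself to decode the JSON. My reading of the 5.x options (untested sketch; paths are from my setup):

```yaml
filebeat.prospectors:
  - input_type: log
    paths:
      - c:\Elasticsearch5\elasticsearch\data\*.json
    # Decode each line as JSON and lift its keys to the top level of the event
    json.keys_under_root: true
    # Add an error key to the event if a line fails to parse as JSON
    json.add_error_key: true
```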