Logstash does not appear to be receiving files from Filebeat

I am new to Filebeat and Logstash and am feeling overwhelmed.
My goal is to ship .json files through Filebeat to Logstash and finally on to Elasticsearch.
I am currently working on my local dev machine running Windows 7.
For now I am just trying to get the files from Filebeat into Logstash and output them to the console.

Here is my filebeat.yml file:

filebeat.prospectors:
- input_type: log
  paths:
    - c:\Elasticsearch5\elasticsearch\data*.json

output.logstash:
  hosts: ["localhost:5043"]
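
Since YAML is indentation-sensitive and easy to break, one quick sanity check (a suggestion, not something from the original exchange) is Filebeat 5.x's -configtest flag, which parses the config and exits without shipping anything:

C:\Program Files\Filebeat>filebeat -configtest -c filebeat.yml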

As I put .json files into that directory I see them get picked up in my Filebeat console window when I run:

C:\Program Files\Filebeat>filebeat -e -c filebeat.yml -d "Publish"

Here is my filebeat.conf file that I use with Logstash:

input {
  beats {
    port => 5043
    codec => "json"
  }
}

output {
  stdout { codec => rubydebug }
}
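
For reference, once events show up on stdout, the last step toward the Elasticsearch goal would be adding an elasticsearch output next to (or in place of) stdout. A minimal sketch, assuming a default local Elasticsearch on port 9200; the index name here is a made-up example:

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "json-import-%{+YYYY.MM.dd}"
  }
}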

Finally, here is my logstash.yml file:

node.name: logstash_node_aaron_122016
config.reload.automatic: true
path.logs: c:\Elasticsearch5\log\logstash

I fire up LogStash using:

C:\Elasticsearch5\logstash-5.0.2>.\bin\logstash -f .\config\filebeat.conf --debug
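
One check worth noting (not from the original thread): Logstash 5.x can validate a pipeline file without starting it, which helps separate config errors from connectivity problems:

C:\Elasticsearch5\logstash-5.0.2>.\bin\logstash -f .\config\filebeat.conf --config.test_and_exit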

I have been all over the internet looking at different documents and have had no success!
I'm considering going back to using the ES API fed by a console app!

Thank you!
-Aaron

If Filebeat doesn't send anything to Logstash there should be clues in the Filebeat log file.

Thanks for the response. I am not getting any log files in the C:\Program Files\Filebeat\logs directory.
I have the logging.level set to debug in my filebeat.yml file.

What does the whole logging section look like? When posting make sure you preserve the indentation.

#================================ Logging =====================================

# Sets log level. The default log level is info.
# Available log levels are: critical, error, warning, info, debug
logging.level: debug
# At debug level, you can selectively enable logging only for some components.
# To enable all selectors use ["*"]. Examples of other selectors are "beat",
# "publish", "service".
logging.selectors: ["*"]

Okay... so there's nothing in your config about logs being sent to C:\Program Files\Filebeat\logs? Is that specified on the command line?

No, I didn't specify it because it said in the documentation it would go to that folder by default:
https://www.elastic.co/guide/en/beats/filebeat/current/directory-layout.html

I updated the Logging section:
#================================ Logging =====================================

# Sets log level. The default log level is info.
# Available log levels are: critical, error, warning, info, debug
logging.level: debug
# At debug level, you can selectively enable logging only for some components.
# To enable all selectors use ["*"]. Examples of other selectors are "beat",
# "publish", "service".
logging.selectors: ["*"]
logging.files:
  path: C:\Program Files\Filebeat\logs
  name: filebeat.log
  keepfiles: 7

I restarted Filebeat with these new settings and nothing appeared in the directory.

I was going through this document and tried telnet on port 5043

telnet localhost:5043
Connecting To localhost:5043...Could not open connection to the host, on port 23: Connect failed

Would this be part of the problem?

That would certainly explain why Logstash isn't getting anything. Note, though, that the Windows telnet syntax separates host and port with a space, not a colon; given localhost:5043 as a single argument, telnet treats the whole string as a hostname and falls back to the default port 23, which is exactly why the error message mentions port 23. You really need to fix the logging too.
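
Two quick checks from the Logstash machine (suggestions beyond what the thread covered): netstat shows whether anything is listening on the Beats port at all, and telnet, with host and port separated by a space, tests the actual connection:

netstat -an | findstr "5043"
telnet localhost 5043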

I disabled Windows Firewall and nothing changed. No logs, and nothing appears to be happening in Logstash.
Here's the log from my Filebeat console window. Anything useful?

c:\Program Files\Filebeat>filebeat -e -c filebeat.yml -d "*"

logp.go:219: INFO Metrics logging every 30s
beat.go:267: INFO Home path: [c:\Program Files\Filebeat] Config path: [c:\Program Files\Filebeat] Data path: [c:\Program Files\Filebeat\data] Logs path: [c:\Program Files\Filebeat\logs]
beat.go:177: INFO Setup Beat: filebeat; Version: 5.1.1
processor.go:43: DBG Processors:
beat.go:183: DBG Initializing output plugins
logstash.go:90: INFO Max Retries set to: 3
outputs.go:106: INFO Activated logstash as output plugin.
publish.go:234: DBG Create output worker
publish.go:276: DBG No output is defined to store the topology. The server fields might not be filled.
publish.go:291: INFO Publisher name: AARONB64WIN7
async.go:63: INFO Flush Interval set to: 1s
async.go:64: INFO Max Bulk Size set to: 2048
async.go:72: DBG create bulk processing worker (interval=1s, bulk size=2048)
beat.go:207: INFO filebeat start running.
service_windows.go:51: DBG Windows is interactive: true
registrar.go:85: INFO Registry file set to: c:\Program Files\Filebeat\data\registry
registrar.go:106: INFO Loading registrar data from c:\Program Files\Filebeat\data\registry
registrar.go:131: INFO States Loaded from registrar: 3
sync.go:41: INFO Start sending events to output
registrar.go:230: INFO Starting Registrar
crawler.go:34: INFO Loading Prospectors: 1
spooler.go:63: INFO Starting spooler: spool_size: 2048; idle_timeout: 5s
prospector_log.go:41: DBG exclude_files: []
state.go:64: DBG New state added for c:\Elasticsearch5\elasticsearch\data\10.json
state.go:64: DBG New state added for c:\Elasticsearch5\elasticsearch\data\20.json
state.go:64: DBG New state added for c:\Elasticsearch5\elasticsearch\data\d2000.json
prospector_log.go:57: INFO Prospector with previous states loaded: 3
prospector.go:69: DBG File Configs: [c:\Elasticsearch5\elasticsearch\data*.json]
crawler.go:46: INFO Loading Prospectors completed. Number of prospectors: 1
crawler.go:61: INFO All prospectors are initialised and running with 3 states to persist
crawler.go:56: DBG Starting prospector 0
prospector.go:111: INFO Starting prospector of type: log
prospector_log.go:62: DBG Start next scan
prospector_log.go:212: DBG Check file for harvesting: c:\Elasticsearch5\elasticsearch\data\ABC.json
prospector_log.go:245: DBG Update existing file for harvesting: c:\Elasticsearch5\elasticsearch\data\ABC.json, offset: 0
prospector_log.go:254: DBG Resuming harvesting of file: c:\Elasticsearch5\elasticsearch\data\ABC.json, offset: 0
log.go:256: DBG Setting offset for file based on seek: c:\Elasticsearch5\elasticsearch\data\ABC.json
log.go:241: DBG Setting offset for file: c:\Elasticsearch5\elasticsearch\data\ABC.json. Offset: 0
prospector_log.go:212: DBG Check file for harvesting: c:\Elasticsearch5\elasticsearch\data\DEF.json
log.go:84: INFO Harvester started for file: c:\Elasticsearch5\elasticsearch\data\ABC.json
prospector_log.go:245: DBG Update existing file for harvesting: c:\Elasticsearch5\elasticsearch\data\DEF.json, offset: 0
log_file.go:84: DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\ABC.json; Backoff now.
prospector_log.go:254: DBG Resuming harvesting of file: c:\Elasticsearch5\elasticsearch\data\DEF.json, offset: 0
log.go:256: DBG Setting offset for file based on seek: c:\Elasticsearch5\elasticsearch\data\DEF.json
log.go:241: DBG Setting offset for file: c:\Elasticsearch5\elasticsearch\data\DEF.json. Offset: 0
prospector_log.go:83: DBG Prospector states cleaned up. Before: 3, After: 3
log.go:84: INFO Harvester started for file: c:\Elasticsearch5\elasticsearch\data\DEF.json
prospector_log.go:99: DBG Remove state for file as file removed: c:\Elasticsearch5\elasticsearch\data\d2000.json
log_file.go:84: DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\DEF.json; Backoff now.
log_file.go:84: DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\ABC.json; Backoff now.
log_file.go:84: DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\DEF.json; Backoff now.
log_file.go:84: DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\ABC.json; Backoff now.
log_file.go:84: DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\DEF.json; Backoff now.
spooler.go:89: DBG Flushing spooler because of timeout. Events flushed: 6
client.go:128: DBG No events to publish
sync.go:68: DBG Events sent: 6
registrar.go:269: DBG Processing 6 events
state.go:109: DBG State removed for c:\Elasticsearch5\elasticsearch\data\d2000.json because of older: 0s
registrar.go:255: DBG Registrar states cleaned up. Before: 3, After: 2
registrar.go:292: DBG Write registry file: c:\Program Files\Filebeat\data\registry
registrar.go:317: DBG Registry file updated. 2 states written.

Okay, I got the logs working. I had -e in the command, which was disabling the file output:
https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-command-line.html
That's what I get for copying and pasting!

I ran the following and it output the log below:
c:\Program Files\Filebeat>filebeat -c filebeat.yml -d "*" -v -once

DBG Disable stderr logging
INFO Metrics logging every 30s
INFO Home path: [c:\Program Files\Filebeat] Config path: [c:\Program Files\Filebeat] Data path: [c:\Program Files\Filebeat\data] Logs path: [c:\Program Files\Filebeat\logs]
INFO Setup Beat: filebeat; Version: 5.1.1
DBG Processors:
DBG Initializing output plugins
INFO Max Retries set to: 3
INFO Activated logstash as output plugin.
DBG Create output worker
DBG No output is defined to store the topology. The server fields might not be filled.
INFO Publisher name: AARONB64WIN7
INFO Flush Interval set to: 1s
INFO Max Bulk Size set to: 2048
DBG create bulk processing worker (interval=1s, bulk size=2048)
INFO filebeat start running.
DBG Windows is interactive: true
INFO Registry file set to: c:\Program Files\Filebeat\data\registry
INFO Loading registrar data from c:\Program Files\Filebeat\data\registry
INFO States Loaded from registrar: 3
INFO Starting Registrar
INFO Start sending events to output
INFO Loading Prospectors: 1
INFO Starting spooler: spool_size: 2048; idle_timeout: 5s
DBG exclude_files: []
DBG New state added for c:\Elasticsearch5\elasticsearch\data\ABC1.json
DBG New state added for c:\Elasticsearch5\elasticsearch\data\DEF2.json
DBG New state added for c:\Elasticsearch5\elasticsearch\data\GHI3.json
INFO Prospector with previous states loaded: 3
DBG File Configs: [c:\Elasticsearch5\elasticsearch\data*.json]
INFO Loading Prospectors completed. Number of prospectors: 1
INFO All prospectors are initialised and running with 3 states to persist
DBG Starting prospector 0
INFO Starting prospector of type: log
INFO Running filebeat once. Waiting for completion ...
DBG Start next scan
DBG Check file for harvesting: c:\Elasticsearch5\elasticsearch\data\ABC11.json
DBG Update existing file for harvesting: c:\Elasticsearch5\elasticsearch\data\ABC11.json, offset: 0
DBG Resuming harvesting of file: c:\Elasticsearch5\elasticsearch\data\ABC11.json, offset: 0
DBG Setting offset for file based on seek: c:\Elasticsearch5\elasticsearch\data\ABC11.json
DBG Setting offset for file: c:\Elasticsearch5\elasticsearch\data\ABC11.json. Offset: 0
DBG Check file for harvesting: c:\Elasticsearch5\elasticsearch\data\DEF22.json
INFO Harvester started for file: c:\Elasticsearch5\elasticsearch\data\ABC11.json
DBG Update existing file for harvesting: c:\Elasticsearch5\elasticsearch\data\DEF22.json, offset: 0
DBG Resuming harvesting of file: c:\Elasticsearch5\elasticsearch\data\DEF22.json, offset: 0
DBG Setting offset for file based on seek: c:\Elasticsearch5\elasticsearch\data\DEF22.json
DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\ABC11.json; Backoff now.
DBG Setting offset for file: c:\Elasticsearch5\elasticsearch\data\DEF22.json. Offset: 0
DBG Check file for harvesting: c:\Elasticsearch5\elasticsearch\data\GHI33.json
INFO Harvester started for file: c:\Elasticsearch5\elasticsearch\data\DEF22.json
DBG Update existing file for harvesting: c:\Elasticsearch5\elasticsearch\data\GHI33.json, offset: 0
DBG Resuming harvesting of file: c:\Elasticsearch5\elasticsearch\data\GHI33.json, offset: 0
DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\DEF22.json; Backoff now.
DBG Setting offset for file based on seek: c:\Elasticsearch5\elasticsearch\data\GHI33.json
DBG Setting offset for file: c:\Elasticsearch5\elasticsearch\data\GHI33.json. Offset: 0
DBG Prospector states cleaned up. Before: 3, After: 3
INFO Harvester started for file: c:\Elasticsearch5\elasticsearch\data\GHI33.json
DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\GHI33.json; Backoff now.
DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\ABC11.json; Backoff now.
DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\DEF22.json; Backoff now.
DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\GHI33.json; Backoff now.
DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\ABC11.json; Backoff now.
DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\DEF22.json; Backoff now.
DBG End of file reached: c:\Elasticsearch5\elasticsearch\data\GHI33.json; Backoff now.
DBG Flushing spooler because of timeout. Events flushed: 6
DBG No events to publish
DBG Events sent: 6
DBG Processing 6 events
DBG Registrar states cleaned up. Before: 3, After: 3
DBG Write registry file: c:\Program Files\Filebeat\data\registry
DBG Registry file updated. 3 states written.
DBG Received sigterm/sigint, stopping

@magnusbaeck continuing my research... Are there certain properties that have to be included in the .json file OR added to the "log" for Logstash to pick it up?
I saw something about @version and @timestamp.

My JSON was as basic as possible:
{"ID":1, "Name":"Aaron"}

Are there certain properties that have to be included in the .json file OR added to the "log" for Logstash to pick it up?

No.
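
Logstash adds @version and @timestamp to every event on its own, and the beats input contributes its own metadata, so a plain object like yours is fine. As a rough sketch (not actual output from this setup), an event from that file should print on stdout something like:

{
            "ID" => 1,
          "Name" => "Aaron",
      "@version" => "1",
    "@timestamp" => <event time>,
    <plus Beats fields such as beat, source, offset, and type>
}

One thing to double-check: Filebeat treats each line of a file as one event, so each JSON object must sit on its own line, and the final line must end with a newline before Filebeat will ship it.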

This seems more like a Filebeat problem now so I suggest moving this thread to that category or repost there to get the attention of the Beats folks.
