Logstash didn't receive Filebeat data

Hi,

I configured Filebeat on my local macOS machine and Logstash on a VM with a public IP. But I don't see the Filebeat client logs coming through: not in the Logstash debug logs, and not in Kibana either.
Logstash uses beats as the input and lumberjack as the output. When I send local files through the file input on Logstash, they show up in Kibana successfully.

On the Filebeat side, when I switch the output from logstash to file, I can see logs show up in the configured output path.
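For that test I used the file output, roughly like this (the path here is just an example; any writable directory works):

```yaml
output.file:
  # Write events as JSON lines into this directory (example path)
  path: "/tmp/filebeat"
  filename: filebeat.out
```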

I don't see any errors in either the Filebeat or the Logstash debug session.

filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - /var/log/filebeat/*.log


#============================= Filebeat modules ===============================

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: false

  # Period on which files under path should be checked for changes
  #reload.period: 10s

#==================== Elasticsearch template setting ==========================

setup.template.settings:
  index.number_of_shards: 3
  #index.codec: best_compression
  #_source.enabled: false

#================================ General =====================================

# The name of the shipper that publishes the network data. It can be used to group
# all the transactions sent by a single shipper in the web interface.
name: "box_relay_shipper"
env: "dev"

#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["9.30.248.40:5044"]
  ssl.certificate_authorities: ["/certs/logstash.crt"]


#================================ Logging =====================================

# Sets log level. The default log level is info.
# Available log levels are: critical, error, warning, info, debug
logging:
  level: debug
  selectors: ["*"]
  to_files: true
  files:
    path: /var/log/filebeat
    name: beat.log
    keepfiles: 7
    rotateeverybytes: 10485760 # 10 MB

Here's my Logstash input:

input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/certs/logstash.crt"
    ssl_key => "/certs/logstash.key"
  }
}

On the Filebeat side I ran the test below. There are no complaints about SSL, so I think it can connect to Logstash fine, but I could be wrong...

➜  filebeat-6.1.3-darwin-x86_64 git:(feature_logstash_poc) ✗ sudo ./filebeat test config
Config OK
➜  filebeat-6.1.3-darwin-x86_64 git:(feature_logstash_poc) ✗ sudo ./filebeat test output
logstash: 9.30.248.40:5044...
  connection...
    parse host... OK
    dns lookup... OK
    addresses: 9.30.248.40
    dial up... OK
  TLS...
    security: server's certificate chain verification is enabled
    handshake... OK
    TLS version: TLSv1.2
    dial up... OK
  talk to server... OK

Any thoughts?

Looks good from what I can see. I'd recommend adding some debugging output to Filebeat and Logstash.

For Filebeat, add -e -d "publish,logstash" to the command line.

For Logstash, add -vvv if you are executing it at the console.

If Logstash receives the messages, it should print them to the console when executed with -vvv.
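For example, assuming you run each from its install directory (paths and config file names here are illustrative):

```shell
# Filebeat: -e logs to stderr, -d enables the "publish" and "logstash" debug selectors
sudo ./filebeat -c filebeat.yml -e -d "publish,logstash"

# Logstash: -f points at your pipeline config, -vvv turns on debug-level verbosity
bin/logstash -f logstash.conf -vvv
```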

What does the output configuration look like in Logstash? See our recommendations at https://www.elastic.co/guide/en/beats/filebeat/6.1/logstash-output.html

Hi Tudor,

Thanks for your suggestions.

On the Logstash side, with -vvv, all I see is this:

Pipeline main started {:file=>"logstash/agent.rb", :line=>"491", :method=>"start_pipeline"}
Pushing flush onto pipeline {:level=>:debug, :file=>"logstash/pipeline.rb", :line=>"458", :method=>"flush"}
...

Here's the log from the Filebeat side:

➜ filebeat-6.1.3-darwin-x86_64 git:(feature_logstash_poc) ✗ sudo ./filebeat -c filebeat.yml -e --d "publish,logstash"
Password:
2018/02/01 17:04:52.615522 beat.go:436: INFO Home path: [/Users/tracywan/work/BoxRelay/filebeat-6.1.3-darwin-x86_64] Config path: [/Users/tracywan/work/BoxRelay/filebeat-6.1.3-darwin-x86_64] Data path: [/Users/tracywan/work/BoxRelay/filebeat-6.1.3-darwin-x86_64/data] Logs path: [/Users/tracywan/work/BoxRelay/filebeat-6.1.3-darwin-x86_64/logs]
2018/02/01 17:04:52.615546 metrics.go:23: INFO Metrics logging every 30s
2018/02/01 17:04:52.615659 beat.go:443: INFO Beat UUID: 53057e74-34bc-4e56-91ab-1823fc94c8eb
2018/02/01 17:04:52.615683 beat.go:203: INFO Setup Beat: filebeat; Version: 6.1.3
2018/02/01 17:04:52.616393 logger.go:18: DBG [publish] start pipeline event consumer
2018/02/01 17:04:52.616417 module.go:76: INFO Beat name: box_relay_shipper
2018/02/01 17:04:52.616845 beat.go:276: INFO filebeat start running.
2018/02/01 17:04:52.616911 registrar.go:88: INFO Registry file set to: /Users/tracywan/work/BoxRelay/filebeat-6.1.3-darwin-x86_64/data/registry
2018/02/01 17:04:52.616976 registrar.go:108: INFO Loading registrar data from /Users/tracywan/work/BoxRelay/filebeat-6.1.3-darwin-x86_64/data/registry
2018/02/01 17:04:52.617186 registrar.go:119: INFO States Loaded from registrar: 17
2018/02/01 17:04:52.617224 filebeat.go:261: WARN Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
2018/02/01 17:04:52.617239 crawler.go:48: INFO Loading Prospectors: 1
2018/02/01 17:04:52.617270 registrar.go:150: INFO Starting Registrar
2018/02/01 17:04:52.618650 prospector.go:87: INFO Starting prospector of type: log; ID: 10432274863013843907
2018/02/01 17:04:52.618915 crawler.go:82: INFO Loading and starting Prospectors completed. Enabled prospectors: 1
2018/02/01 17:04:52.618957 reload.go:127: INFO Config reloader started
2018/02/01 17:04:52.619066 reload.go:219: INFO Loading of config files completed.
2018/02/01 17:05:22.615193 metrics.go:39: INFO Non-zero metrics in the last 30s: beat.info.uptime.ms=30001 beat.memstats.gc_next=4194304 beat.memstats.memory_alloc=1377632 beat.memstats.memory_total=2918568 filebeat.events.added=1 filebeat.events.done=1 filebeat.harvester.open_files=0 filebeat.harvester.running=0 libbeat.config.module.running=0 libbeat.config.reloads=1 libbeat.output.type=logstash libbeat.pipeline.clients=1 libbeat.pipeline.events.active=0 libbeat.pipeline.events.filtered=1 libbeat.pipeline.events.total=1 registrar.states.current=17 registrar.states.update=1 registrar.writes=1
2018/02/01 17:05:52.614940 metrics.go:39: INFO Non-zero metrics in the last 30s: beat.info.uptime.ms=30001 beat.memstats.gc_next=4194304 beat.memstats.memory_alloc=1421592 beat.memstats.memory_total=2962528 filebeat.harvester.open_files=0 filebeat.harvester.running=0 libbeat.config.module.running=0 libbeat.pipeline.clients=1 libbeat.pipeline.events.active=0 registrar.states.current=17
2018/02/01 17:06:22.613560 metrics.go:39: INFO Non-zero metrics in the last 30s: beat.info.uptime.ms=29999 beat.memstats.gc_next=4194304 beat.memstats.memory_alloc=1452544 beat.memstats.memory_total=2993480 filebeat.harvester.open_files=0 filebeat.harvester.running=0 libbeat.config.module.running=0 libbeat.pipeline.clients=1 libbeat.pipeline.events.active=0 registrar.states.current=17

The output configuration in Logstash sends data to Elasticsearch through a customized lumberjack plugin. I tested it by sending logs directly from Logstash, and I can see the data show up in Kibana.

mtlumberjack {
  "hosts" => ["xxxxx. bluemix.net"]
  "port" => 9091
  "tenant_id" => "our tenant id"
  "tenant_password" => "our pass word"
}

The Filebeat logs seem to suggest that no events are created. That could simply be because there are no "new" events: Filebeat stores the offset of each file in the registry file (/Users/tracywan/work/BoxRelay/filebeat-6.1.3-darwin-x86_64/data/registry). For experimentation, if you want Filebeat to read the files again, you can delete the registry file and start Filebeat again.
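For example (the data path comes from your Filebeat log output above; stop Filebeat before removing the file):

```shell
# Delete the registry so Filebeat forgets stored offsets and re-reads files from the start
sudo rm /Users/tracywan/work/BoxRelay/filebeat-6.1.3-darwin-x86_64/data/registry

# Start Filebeat again; it rebuilds the registry as it harvests
sudo ./filebeat -c filebeat.yml -e
```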

Do you really want Filebeat to only read its own log files (/var/log/filebeat/*.log)? That seems to be the only path configured.


I ran cat on the registry file:
[{"source":"/opt/ibm/wlp/output/defaultServer/logs/messages.log","offset":2633,"timestamp":"2018-01-31T14:20:47.845439-08:00","ttl":-2,"type":"log","FileStateOS":{"inode":8593601013,"device":16777220}},{"source":"/opt/ibm/wlp/output/defaultServer/FileNet/tracy/P8_trace.log","offset":1282,"timestamp":"2018-01-31T14:20:47.845992-08:00","ttl":-2,"type":"log","FileStateOS":{"inode":8593600609,"device":16777220}},{"source":"/var/log/fsck_hfs.log","offset":6632,"timestamp":"2018-01-31T15:29:47.692424-08:00","ttl":-2,"type":"log","FileStateOS":{"inode":8591435052,"device":16777221}},{"source":"/var/log/displaypolicyd.stdout.log","offset":119,"timestamp":"2018-01-31T15:29:47.686712-08:00","ttl":-2,"type":"log","FileStateOS":{"inode":8590699300,"device":16777221}},{"source":"/var/log/ecprint.log","offset":94,"timestamp":"2018-01-31T15:29:47.686747-08:00","ttl":-2,"type":"log","FileStateOS":{"inode":8592059817,"device":16777221}},{"source":"/var/log/hfs_convert.log","offset":4642,"timestamp":"2018-01-31T15:29:47.689854-08:00","ttl":-2,"type":"log","FileStateOS":{"inode":8590699264,"device":16777221}},{"source":"/var/log/wifi.log","offset":191271,"timestamp":"2018-01-31T15:29:48.555387-08:00","ttl":-2,"type":"log","FileStateOS":{"inode":8593560039,"device":16777221}},{"source":"/var/log/corecaptured.log","offset":1720936,"timestamp":"2018-01-31T15:29:53.009902-08:00","ttl":-2,"type":"log","FileStateOS":{"inode":8590700760,"device":16777221}},{"source":"/var/log/fsck_apfs.log","offset":29865,"timestamp":"2018-01-31T15:29:48.127108-08:00","ttl":-2,"type":"log","FileStateOS":{"inode":8590699340,"device":16777221}},{"source":"/var/log/install.log","offset":2495426,"timestamp":"2018-01-31T15:29:53.009895-08:00","ttl":-2,"type":"log","FileStateOS":{"inode":8589935151,"device":16777221}},{"source":"/var/log/jamf.log","offset":409511,"timestamp":"2018-01-31T15:29:49.127767-08:00","ttl":-2,"type":"log","FileStateOS":{"inode":8590699938,"device":16777221}},{"source":"/var/log/wifi-01-31
-2018__09:40:00.650.log","offset":1946579,"timestamp":"2018-01-31T15:29:53.009899-08:00","ttl":-2,"type":"log","FileStateOS":{"inode":8593643037,"device":16777221}},{"source":"/var/log/acroUpdaterTools.log","offset":11416,"timestamp":"2018-01-31T15:29:47.694559-08:00","ttl":-2,"type":"log","FileStateOS":{"inode":8591290631,"device":16777221}},

I just want to do a test, so I'm using that path as the only log path.

I deleted the registry and restarted Filebeat. Same result:

ixin.rb", :line=>"154", :method=>"config_init"}
config LogStash::Inputs::Beats/@client_inactivity_timeout = 60 {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"154", :method=>"config_init"}
starting agent {:level=>:info, :file=>"logstash/agent.rb", :line=>"213", :method=>"execute"}
starting pipeline {:id=>"main", :level=>:info, :file=>"logstash/agent.rb", :line=>"487", :method=>"start_pipeline"}
Settings: Default pipeline workers: 8
log4j java properties setup {:log4j_level=>"DEBUG", :level=>:debug, :file=>"logstash/logging.rb", :line=>"89", :method=>"setup_log4j"}
Beats inputs: Starting input listener {:address=>"0.0.0.0:5044", :level=>:info, :file=>"logstash/inputs/beats.rb", :line=>"160", :method=>"register"}
Starting pipeline {:id=>"main", :pipeline_workers=>8, :batch_size=>125, :batch_delay=>5, :max_inflight=>1000, :level=>:info, :file=>"logstash/pipeline.rb", :line=>"188", :method=>"start_workers"}
Pipeline main started {:file=>"logstash/agent.rb", :line=>"491", :method=>"start_pipeline"}
Pushing flush onto pipeline {:level=>:debug, :file=>"logstash/pipeline.rb", :line=>"458", :method=>"flush"}
...

Filebeat

➜ filebeat-6.1.3-darwin-x86_64 git:(feature_logstash_poc) ✗ sudo ./filebeat -c filebeat.yml -e --d "publish,logstash"
2018/02/01 17:30:12.897174 beat.go:436: INFO Home path: [/Users/tracywan/work/BoxRelay/filebeat-6.1.3-darwin-x86_64] Config path: [/Users/tracywan/work/BoxRelay/filebeat-6.1.3-darwin-x86_64] Data path: [/Users/tracywan/work/BoxRelay/filebeat-6.1.3-darwin-x86_64/data] Logs path: [/Users/tracywan/work/BoxRelay/filebeat-6.1.3-darwin-x86_64/logs]
2018/02/01 17:30:12.897237 metrics.go:23: INFO Metrics logging every 30s
2018/02/01 17:30:12.897303 beat.go:443: INFO Beat UUID: 53057e74-34bc-4e56-91ab-1823fc94c8eb
2018/02/01 17:30:12.897318 beat.go:203: INFO Setup Beat: filebeat; Version: 6.1.3
2018/02/01 17:30:12.897992 logger.go:18: DBG [publish] start pipeline event consumer
2018/02/01 17:30:12.898014 module.go:76: INFO Beat name: box_relay_shipper
2018/02/01 17:30:12.898412 beat.go:276: INFO filebeat start running.
2018/02/01 17:30:12.898502 registrar.go:88: INFO Registry file set to: /Users/tracywan/work/BoxRelay/filebeat-6.1.3-darwin-x86_64/data/registry
2018/02/01 17:30:12.898574 registrar.go:108: INFO Loading registrar data from /Users/tracywan/work/BoxRelay/filebeat-6.1.3-darwin-x86_64/data/registry
2018/02/01 17:30:12.898808 registrar.go:119: INFO States Loaded from registrar: 17
2018/02/01 17:30:12.898836 filebeat.go:261: WARN Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
2018/02/01 17:30:12.898849 crawler.go:48: INFO Loading Prospectors: 1
2018/02/01 17:30:12.898893 registrar.go:150: INFO Starting Registrar
2018/02/01 17:30:12.899159 prospector.go:87: INFO Starting prospector of type: log; ID: 10432274863013843907
2018/02/01 17:30:12.899287 crawler.go:82: INFO Loading and starting Prospectors completed. Enabled prospectors: 1
2018/02/01 17:30:12.899374 reload.go:127: INFO Config reloader started
2018/02/01 17:30:12.899980 reload.go:219: INFO Loading of config files completed.
2018/02/01 17:30:42.910268 metrics.go:39: INFO Non-zero metrics in the last 30s: beat.info.uptime.ms=30002 beat.memstats.gc_next=4194304 beat.memstats.memory_alloc=1353040 beat.memstats.memory_total=2927272 filebeat.events.added=1 filebeat.events.done=1 filebeat.harvester.open_files=0 filebeat.harvester.running=0 libbeat.config.module.running=0 libbeat.config.reloads=1 libbeat.output.type=logstash libbeat.pipeline.clients=1 libbeat.pipeline.events.active=0 libbeat.pipeline.events.filtered=1 libbeat.pipeline.events.total=1 registrar.states.current=17 registrar.states.update=1 registrar.writes=1
2018/02/01 17:31:12.916644 metrics.go:39: INFO Non-zero metrics in the last 30s: beat.info.uptime.ms=30000 beat.memstats.gc_next=4194304 beat.memstats.memory_alloc=1390840 beat.memstats.memory_total=2965072 filebeat.harvester.open_files=0 filebeat.harvester.running=0 libbeat.config.module.running=0 libbeat.pipeline.clients=1 libbeat.pipeline.events.active=0 registrar.states.current=17

BTW, I don't think I've ever seen any events... that could be the problem.

Can you post the registry file again (after deleting it and the last run)?

➜ filebeat-6.1.3-darwin-x86_64 git:(feature_logstash_poc) ✗ sudo cat /Users/tracywan/work/BoxRelay/filebeat-6.1.3-darwin-x86_64/data/registry
Password:
[{"source":"/opt/ibm/wlp/output/defaultServer/logs/messages.log","offset":2633,"timestamp":"2018-01-31T14:20:47.845439-08:00","ttl":-2,"type":"log","FileStateOS":{"inode":8593601013,"device":16777220}},{"source":"/opt/ibm/wlp/output/defaultServer/FileNet/tracy/P8_trace.log","offset":1282,"timestamp":"2018-01-31T14:20:47.845992-08:00","ttl":-2,"type":"log","FileStateOS":{"inode":8593600609,"device":16777220}},{"source":"/var/log/fsck_hfs.log","offset":6632,"timestamp":"2018-01-31T15:29:47.692424-08:00","ttl":-2,"type":"log","FileStateOS":{"inode":8591435052,"device":16777221}},{"source":"/var/log/displaypolicyd.stdout.log","offset":119,"timestamp":"2018-01-31T15:29:47.686712-08:00","ttl":-2,"type":"log","FileStateOS":{"inode":8590699300,"device":16777221}},{"source":"/var/log/ecprint.log","offset":94,"timestamp":"2018-01-31T15:29:47.686747-08:00","ttl":-2,"type":"log","FileStateOS":{"inode":8592059817,"device":16777221}},{"source":"/var/log/hfs_convert.log","offset":4642,"timestamp":"2018-01-31T15:29:47.689854-08:00","ttl":-2,"type":"log","FileStateOS":{"inode":8590699264,"device":16777221}},{"source":"/var/log/wifi.log","offset":191271,"timestamp":"2018-01-31T15:29:48.555387-08:00","ttl":-2,"type":"log","FileStateOS":{"inode":8593560039,"device":16777221}},{"source":"/var/log/corecaptured.log","offset":1720936,"timestamp":"2018-01-31T15:29:53.009902-08:00","ttl":-2,"type":"log","FileStateOS":{"inode":8590700760,"device":16777221}},{"source":"/var/log/fsck_apfs.log","offset":29865,"timestamp":"2018-01-31T15:29:48.127108-08:00","ttl":-2,"type":"log","FileStateOS":{"inode":8590699340,"device":16777221}},{"source":"/var/log/install.log","offset":2495426,"timestamp":"2018-01-31T15:29:53.009895-08:00","ttl":-2,"type":"log","FileStateOS":{"inode":8589935151,"device":16777221}},{"source":"/var/log/jamf.log","offset":409511,"timestamp":"2018-01-31T15:29:49.127767-08:00","ttl":-2,"type":"log","FileStateOS":{"inode":8590699938,"device":16777221}},{"source":"/var/log/wifi-01-31
-2018__09:40:00.650.log","offset":1946579,"timestamp":"2018-01-31T15:29:53.009899-08:00","ttl":-2,"type":"log","FileStateOS":{"inode":8593643037,"device":16777221}},{"source":"/var/log/acroUpdaterTools.log","offset":11416,"timestamp":"2018-01-31T15:29:47.694559-08:00","ttl":-2,"type":"log","FileStateOS":{"inode":8591290631,"device":16777221}},{"source":"/var/log/alf.log","offset":0,"timestamp":"2018-01-31T15:29:47.687413-08:00","ttl":-2,"type":"log","FileStateOS":{"inode":8590699664,"device":16777221}},{"source":"/var/log/appfirewall.log","offset":0,"timestamp":"2018-01-31T15:29:47.687526-08:00","ttl":-2,"type":"log","FileStateOS":{"inode":8590712052,"device":16777221}},{"source":"/var/log/system.log","offset":2458317,"timestamp":"2018-01-31T15:29:52.221923-08:00","ttl":-2,"type":"log","FileStateOS":{"inode":8593639234,"device":16777221}},{"source":"/var/log/filebeat/beat.log","offset":659,"timestamp":"2018-02-01T09:30:12.89921-08:00","ttl":-1,"type":"log","FileStateOS":{"inode":8593671781,"device":16777221}}]

Never mind, I think I finally got it after deleting the registry again.

["beats_input_codec_plain_applied"]}, :level=>:debug, :file=>"(eval)", :line=>"14", :method=>"filter_func"}
filter received {:event=>{"message"=>"2018-01-31T15:45:22-08:00 INFO Beat UUID: 53057e74-34bc-4e56-91ab-1823fc94c8eb", "@version"=>"1", "@timestamp"=>"2018-02-01T17:38:29.856Z", "source"=>"/var/log/filebeat/beat.log", "offset"=>659, "prospector"=>{"type"=>"log"}, "beat"=>{"name"=>"box_relay_shipper", "hostname"=>"tracys-mbp-3.usca.ibm.com", "version"=>"6.1.3"}, "host"=>"tracys-mbp-3.usca.ibm.com", "tags"=>["beats_input_codec_plain_applied"]}, :level=>:debug, :file=>"(eval)", :line=>"14", :method=>"filter_func"}
output received {:event=>{"message"=>"2018-01-31T15:45:22-08:00 DBG [log] Disable stderr logging", "@version"=>"1", "@timestamp"=>"2018-02-01T17:38:29.856Z", "source"=>"/var/log/filebeat/beat.log", "offset"=>60, "prospector"=>{"type"=>"log"}, "beat"=>{"hostname"=>"tracys-mbp-3.usca.ibm.com", "version"=>"6.1.3", "name"=>"box_relay_shipper"}, "host"=>"tracys-mbp-3.usca.ibm.com", "tags"=>["beats_input_codec_plain_applied"]}, :level=>:debug, :file=>"(eval)", :line=>"19", :method=>"output_func"}
output received {:event=>{"message"=>"2018-01-31T15:45:22-08:00 INFO Home path: [/Users/tracywan/work/BoxRelay/filebeat-6.1.3-darwin-x86_64] Config path: [/Users/tracywan/work/BoxRelay/filebeat-6.1.3-darwin-x86_64] Data path: [/Users/tracywan/work/BoxRelay/filebeat-6.1.3-darwin-x86_64/data] Logs path: [/Users/tracywan/work/BoxRelay/filebeat-6.1.3-darwin-x86_64/logs]", "@version"=>"1", "@timestamp"=>"2018-02-01T17:38:29.856Z", "source"=>"/var/log/filebeat/beat.log", "offset"=>391, "prospector"=>{"type"=>"log"}, "beat"=>{"hostname"=>"tracys-mbp-3.usca.ibm.com", "version"=>"6.1.3", "name"=>"box_relay_shipper"}, "host"=>"tracys-mbp-3.usca.ibm.com", "tags"=>["beats_input_codec_plain_applied"]}, :level=>:debug, :file=>"(eval)", :line=>"19", :method=>"output_func"}
output received {:event=>{"message"=>"2018-01-31T15:45:22-08:00 DBG [beat] Beat metadata path: /Users/tracywan/work/BoxRelay/filebeat-6.1.3-darwin-x86_64/data/meta.json", "@version"=>"1", "@timestamp"=>"2018-02-01T17:38:29.856Z", "source"=>"/var/log/filebeat/beat.log", "offset"=>523, "prospector"=>{"type"=>"log"}, "beat"=>{"version"=>"6.1.3", "name"=>"box_relay_shipper", "hostname"=>"tracys-mbp-3.usca.ibm.com"}, "host"=>"tracys-mbp-3.usca.ibm.com", "tags"=>["beats_input_codec_plain_applied"]}, :level=>:debug, :file=>"(eval)", :line=>"19", :method=>"output_func"}
output received {:event=>{"message"=>"2018-01-31T15:45:22-08:00 INFO Metrics logging every 30s", "@version"=>"1", "@timestamp"=>"2018-02-01T17:38:29.856Z", "prospector"=>{"type"=>"log"}, "beat"=>{"name"=>"box_relay_shipper", "hostname"=>"tracys-mbp-3.usca.ibm.com", "version"=>"6.1.3"}, "offset"=>580, "source"=>"/var/log/filebeat/beat.log", "host"=>"tracys-mbp-3.usca.ibm.com", "tags"=>["beats_input_codec_plain_applied"]}, :level=>:debug, :file=>"(eval)", :line=>"19", :method=>"output_func"}
output received {:event=>{"message"=>"2018-01-31T15:45:22-08:00 INFO Beat UUID: 53057e74-34bc-4e56-91ab-1823fc94c8eb", "@version"=>"1", "@timestamp"=>"2018-02-01T17:38:29.856Z", "source"=>"/var/log/filebeat/beat.log", "offset"=>659, "prospector"=>{"type"=>"log"}, "beat"=>{"name"=>"box_relay_shipper", "hostname"=>"tracys-mbp-3.usca.ibm.com", "version"=>"6.1.3"}, "host"=>"tracys-mbp-3.usca.ibm.com", "tags"=>["beats_input_codec_plain_applied"]}, :level=>:debug, :file=>"(eval)", :line=>"19", :method=>"output_func"}
Pushing flush onto pipeline {:level=>:debug, :file=>"logstash/pipeline.rb", :line=>"458", :method=>"flush"}
Pushing flush onto pipeline {:level=>:debug, :file=>"logstash/pipeline.rb", :line=>"458", :method=>"flush"}

I saw those logs go to Logstash.

I think I spoke too early. I still don't see the logs reach Kibana... does anything jump out from the debug log?

Hmm, I'm not sure I understand your output configuration. What's mtlumberjack, a custom output plugin?

You can add a simple stdout output with the rubydebug codec to check which messages reach the Logstash output, for example:

output {
  stdout {
    codec => rubydebug { metadata => true }
  }
}

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.