Hi all,
I have tons of nginx logs that I want to ship to Elasticsearch through Filebeat => Logstash.
However, I want to do it slowly, so I don't overwhelm Elasticsearch, and I want to run the import only once.
But it doesn't seem to work: as the log below shows, Filebeat reports "Enabled prospectors: 0" and shuts down immediately without publishing anything.
My configuration for the nginx module is:
- module: nginx
  # Access logs
  access:
    enabled: true
    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    var.paths: []
    prospector:
      type: stdin
      harvester_limit: 4
      exclude_lines: ['HEAD \/']
  # Error logs
  error:
    enabled: false
My main configuration:
filebeat.shutdown_timeout: 0s
output.logstash:
  hosts: ["127.0.0.1:5062"]
  ssl.certificate_authorities: ["/etc/pki/tls/certs/logstash-forwarder.crt"]
processors:
  - add_locale: ~
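For comparison, what I'm trying to do with the module override is, as far as I understand it, roughly equivalent to a plain stdin prospector configured directly in filebeat.yml (a sketch, assuming Filebeat 6.x syntax; I have not verified that the module maps to exactly this):

```yaml
# Minimal standalone stdin prospector (Filebeat 6.x syntax):
# read events from stdin instead of files, skipping HEAD requests.
filebeat.prospectors:
- type: stdin
  exclude_lines: ['HEAD \/']
```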
Then I ran:
$ cat access.log | filebeat -c ./filebeat.yml -e -v -once
It shows:
2017/12/10 14:54:26.808119 beat.go:426: INFO Home path: [/usr/share/filebeat] Config path: [/d/logs/filebeat] Data path: [/d/logs/filebeat] Logs path: [/d/logs/filebeat]
2017/12/10 14:54:26.808210 beat.go:433: INFO Beat UUID: 4aa09c1a-dd07-44c1-aecb-96b80886ac13
2017/12/10 14:54:26.808211 metrics.go:23: INFO Metrics logging every 30s
2017/12/10 14:54:26.808231 beat.go:192: INFO Setup Beat: filebeat; Version: 6.0.1
2017/12/10 14:54:26.808761 module.go:80: INFO Beat name: eva00
2017/12/10 14:54:26.809032 beat.go:260: INFO filebeat start running.
2017/12/10 14:54:26.809085 registrar.go:88: INFO Registry file set to: /d/logs/filebeat/registry
2017/12/10 14:54:26.809113 registrar.go:108: INFO Loading registrar data from /d/logs/filebeat/registry
2017/12/10 14:54:26.810152 registrar.go:119: INFO States Loaded from registrar: 143
2017/12/10 14:54:26.810174 filebeat.go:260: WARN Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
2017/12/10 14:54:26.810192 crawler.go:44: INFO Loading Prospectors: 1
2017/12/10 14:54:26.810275 registrar.go:150: INFO Starting Registrar
2017/12/10 14:54:26.812281 crawler.go:78: INFO Loading and starting Prospectors completed. Enabled prospectors: 0
2017/12/10 14:54:26.812304 reload.go:128: INFO Config reloader started
2017/12/10 14:54:26.812328 filebeat.go:272: INFO Running filebeat once. Waiting for completion ...
2017/12/10 14:54:26.812335 filebeat.go:274: INFO All data collection completed. Shutting down.
2017/12/10 14:54:26.812341 crawler.go:105: INFO Stopping Crawler
2017/12/10 14:54:26.812347 crawler.go:115: INFO Stopping 0 prospectors
2017/12/10 14:54:26.813489 reload.go:259: INFO Starting 1 runners ...
2017/12/10 14:54:26.813504 prospector.go:103: INFO Starting prospector of type: stdin; id: 8303358740364277103
2017/12/10 14:54:26.813514 reload.go:220: INFO Loading of config files completed.
2017/12/10 14:54:26.813521 reload.go:223: INFO Dynamic config reloader stopped
2017/12/10 14:54:26.813531 crawler.go:131: INFO Crawler stopped
2017/12/10 14:54:26.813582 signalwait.go:76: INFO Continue shutdown: All enqueued events being published.
2017/12/10 14:54:26.813598 registrar.go:210: INFO Stopping Registrar
2017/12/10 14:54:26.813608 registrar.go:165: INFO Ending Registrar
2017/12/10 14:54:26.813623 harvester.go:207: INFO Harvester started for file: -
2017/12/10 14:54:26.813666 forwarder.go:35: INFO Prospector outlet closed
2017/12/10 14:54:26.816539 metrics.go:51: INFO Total non-zero values: beat.memstats.gc_next=4194304 beat.memstats.memory_alloc=1944824 beat.memstats.memory_total=3829088 filebeat.harvester.closed=1 filebeat.harvester.open_files=-1 filebeat.harvester.running=0 filebeat.harvester.started=1 libbeat.config.module.running=1 libbeat.config.module.starts=1 libbeat.config.reloads=1 libbeat.output.type=logstash libbeat.pipeline.clients=0 libbeat.pipeline.events.active=0 registrar.states.current=143 registrar.writes=1
2017/12/10 14:54:26.816562 metrics.go:52: INFO Uptime: 16.994577ms
2017/12/10 14:54:26.816569 beat.go:268: INFO filebeat stopped.