I am prepping for the Elasticsearch certification exam and am trying to load the server log data provided in the Elasticsearch Engineer I and II courses into my local cluster. I have three 6.6.1 nodes running on my laptop, with the following settings in elasticsearch.yml:
cluster.name: my_cluster
node.max_local_storage_nodes: 3
xpack.security.enabled: true
discovery.zen.minimum_master_nodes: 2
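In case the cluster itself is relevant, this is roughly how I check that all three nodes are up and that the training user (the same one filebeat authenticates with below) can reach the cluster; host and credentials match the filebeat config further down:

# list the nodes (expecting three) and check overall cluster health
curl -u training:heresmypassword "http://localhost:9200/_cat/nodes?v"
curl -u training:heresmypassword "http://localhost:9200/_cluster/health?pretty"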
I'm using the following filebeat 6.6.1 config to load the data into my cluster:
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /path/to/data/elastic_blog_curated_access_logs_server1/*.log

processors:
  - decode_json_fields:
      fields: ['message']
      target: ''
      overwrite_keys: true
  - drop_fields:
      fields: ["message", "prospector", "beat", "source", "offset"]

setup.template.enabled: false

output.elasticsearch:
  hosts: ["localhost:9200"]
  index: "logs_server1"
  document_type: "_doc"
  bulk_max_size: 1000
  user: "training"
  password: "heresmypassword"
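For reference, this is roughly how I sanity-check the config file and the Elasticsearch output before starting filebeat (both subcommands exist in filebeat 6.x; run from the filebeat install directory):

# validate the YAML syntax and option names
./filebeat test config -c filebeat.yml
# test the connection and credentials against localhost:9200
./filebeat test output -c filebeat.yml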
I run filebeat by executing ./filebeat -c filebeat.yml. Whenever I spin filebeat up, I receive the following in the logs:
2019-10-03T07:08:48.433-0400 INFO instance/beat.go:616 Home path: [/path/to/filebeat] Config path: [/path/to/filebeat] Data path: [/path/to/filebeat/data] Logs path: [/path/to/filebeat/logs]
2019-10-03T07:08:48.438-0400 INFO instance/beat.go:623 Beat UUID: 47b21207-302f-475b-94f1-3c15146e44b8
2019-10-03T07:08:48.438-0400 INFO [beat] instance/beat.go:936 Beat info {"system_info": {"beat": {"path": {"config": "/path/to/filebeat", "data": "/path/to/filebeat/data", "home": "/path/to/filebeat", "logs": "/path/to/filebeat/logs"}, "type": "filebeat", "uuid": "47b21207-302f-475b-94f1-3c15146e44b8"}}}
2019-10-03T07:08:48.438-0400 INFO [beat] instance/beat.go:945 Build info {"system_info": {"build": {"commit": "928f5e3f35fe28c1bd73513ff1cc89406eb212a6", "libbeat": "6.6.1", "time": "2019-02-13T16:12:11.000Z", "version": "6.6.1"}}}
2019-10-03T07:08:48.439-0400 INFO [beat] instance/beat.go:948 Go runtime info {"system_info": {"go": {"os":"darwin","arch":"amd64","max_procs":4,"version":"go1.10.8"}}}
2019-10-03T07:08:48.440-0400 INFO [beat] instance/beat.go:952 Host info {"system_info": {"host": {"architecture":"x86_64","boot_time":"2019-09-12T05:50:47.425492-04:00","name":"my_laptop_name","ip":[.......],"kernel_version":"18.7.0","mac":[.......],"os":{"family":"darwin","platform":"darwin","name":"Mac OS X","version":"10.14.6","major":10,"minor":14,"patch":6,"build":"18G95"},"timezone":"EDT","timezone_offset_sec":-14400,"id":"......."}}}
2019-10-03T07:08:48.441-0400 INFO [beat] instance/beat.go:981 Process info {"system_info": {"process": {"cwd": "/path/to/filebeat", "exe": "./filebeat", "name": "filebeat", "pid": 32874, "ppid": 25428, "start_time": "2019-10-03T07:08:48.307-0400"}}}
2019-10-03T07:08:48.441-0400 INFO instance/beat.go:281 Setup Beat: filebeat; Version: 6.6.1
2019-10-03T07:08:48.441-0400 INFO elasticsearch/client.go:165 Elasticsearch url: http://localhost:9200
2019-10-03T07:08:48.444-0400 INFO [publisher] pipeline/module.go:110 Beat name: beat-name
2019-10-03T07:08:48.454-0400 INFO [monitoring] log/log.go:117 Starting metrics logging every 30s
2019-10-03T07:08:48.454-0400 INFO instance/beat.go:403 filebeat start running.
2019-10-03T07:08:48.454-0400 INFO registrar/registrar.go:134 Loading registrar data from /path/to/filebeat/data/registry
2019-10-03T07:08:48.462-0400 INFO registrar/registrar.go:141 States Loaded from registrar: 402
2019-10-03T07:08:48.462-0400 INFO crawler/crawler.go:72 Loading Inputs: 1
2019-10-03T07:08:48.644-0400 INFO log/input.go:138 Configured paths: [/path/to/data/elastic_blog_curated_access_logs_server1/*.log]
2019-10-03T07:08:48.644-0400 INFO input/input.go:114 Starting input of type: log; ID: 3080209040326827450
2019-10-03T07:08:48.644-0400 INFO crawler/crawler.go:106 Loading and starting Inputs completed. Enabled inputs: 1
2019-10-03T07:09:18.461-0400 INFO [monitoring] log/log.go:144 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":98,"time":{"ms":98}},"total":{"ticks":260,"time":{"ms":260},"value":260},"user":{"ticks":162,"time":{"ms":162}}},"info":{"ephemeral_id":"6af10b82-f593-4286-b461-86446b4c0b55","uptime":{"ms":30063}},"memstats":{"gc_next":4194304,"memory_alloc":2372312,"memory_total":25198288,"rss":21200896}},"filebeat":{"events":{"added":134,"done":134},"harvester":{"open_files":0,"running":0}},"libbeat":{"config":{"module":{"running":0}},"output":{"type":"elasticsearch"},"pipeline":{"clients":1,"events":{"active":0,"filtered":134,"total":134}}},"registrar":{"states":{"current":402,"update":134},"writes":{"success":134,"total":134}},"system":{"cpu":{"cores":4},"load":{"1":1.6304,"15":2.3618,"5":2.2231,"norm":{"1":0.4076,"15":0.5905,"5":0.5558}}}}}}
The last monitoring message just keeps repeating every 30 seconds, filebeat never exits, and no data is ever loaded into my cluster. Notably, the metrics report all 134 events as filtered ("filtered":134, "total":134) and nothing as published.
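For completeness, this is how I'm checking whether anything reached the target index (logs_server1 is the index name from the output config above, using the same training user):

# does the index exist, and does it contain any documents?
curl -u training:heresmypassword "http://localhost:9200/_cat/indices/logs_server1?v"
curl -u training:heresmypassword "http://localhost:9200/logs_server1/_count?pretty"

Any ideas why this is happening?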