Prospector ticker stopped

Hi,
I am facing this problem:
2019-04-03T20:42:54.626+0530 INFO log/harvester.go:216 Harvester started for file: /home/vankata/190_APS_QUALIFICATION/kalyan_elk_logs/free_data_11_09_02_30.txt
2019-04-03T20:42:54.626+0530 INFO prospector/prospector.go:121 Prospector ticker stopped
2019-04-03T20:42:54.626+0530 INFO log/prospector.go:411 Scan aborted because prospector stopped.
2019-04-03T20:42:54.626+0530 INFO prospector/prospector.go:121 Prospector ticker stopped
2019-04-03T20:42:54.626+0530 INFO prospector/prospector.go:138 Stopping Prospector: 8728499415371259904
2019-04-03T20:42:54.626+0530 INFO prospector/prospector.go:121 Prospector ticker stopped
2019-04-03T20:42:54.626+0530 INFO log/harvester.go:216 Harvester started for file: /home/vankata/190_APS_QUALIFICATION/kalyan_elk_logs/free_data_11_09_10_40.txt

Filebeat version: filebeat-6.2.4
I have around 124 log files in the input directory. This was working fine; all I changed is the filter on the Logstash side.

Note: I tried to clear old registry entries using the clean_* options (clean_inactive / clean_removed), but that did not work.

Is it a good idea to delete the registry file (/opt/nokia/backup/filebeat/registry) and try again?
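
For context, my understanding is that clean_* are not commands but per-prospector options in filebeat.yml (and deleting the registry file while Filebeat is stopped would simply reset all file states, so everything is read again from the beginning). What I tried is roughly the sketch below; the glob and the durations are illustrative, not my exact values:

filebeat.prospectors:
  - type: log
    paths:
      - /home/vankata/190_APS_QUALIFICATION/kalyan_elk_logs/*.txt   # illustrative glob
    ignore_older: 48h       # example value
    clean_inactive: 72h     # example value; must be greater than ignore_older + scan_frequency
    clean_removed: true     # drop registry entries for files deleted on disk
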
Also, can the warning below be ignored if I am using the Logstash output?

2019-04-11T10:08:25.594+0530 WARN beater/filebeat.go:261 Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
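
For what it's worth, the text of the warning itself says it can be ignored when Logstash pipelines are used. My output section looks roughly like this (a re-typed sketch, so exact values may differ):

output.logstash:
  hosts: ["10.153.148.199:5044"]   # Logstash beats input; re-typed, may differ
# No output.elasticsearch section is configured, which is why Filebeat cannot
# load the Ingest Node pipelines for modules; events go to Logstash instead.
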

What version of Filebeat are you running?

@pierhugues, the above problem is solved.
Filebeat version: filebeat-6.2.4
Logstash version: logstash-6.2.4

When I run the Filebeat binary I see the error below in the log, and the index is not being created in Kibana. Could you please help me with this?

2019-04-15T10:44:50.842+0530 DEBUG [harvester] log/harvester.go:447 Setting offset for file based on seek: /home/vankata/190_APS_QUALIFICATION/kalyan_elk_logs/new_working_dir/Process_heap_09_09_23_40.txt
2019-04-15T10:44:50.842+0530 DEBUG [harvester] log/harvester.go:433 Setting offset for file: /home/vankata/190_APS_QUALIFICATION/kalyan_elk_logs/new_working_dir/Process_heap_09_09_23_40.txt. Offset: 0
2019-04-15T10:44:50.842+0530 DEBUG [harvester] log/harvester.go:348 Update state: /home/vankata/190_APS_QUALIFICATION/kalyan_elk_logs/new_working_dir/Process_heap_09_09_23_40.txt, offset: 0
2019-04-15T10:44:50.842+0530 INFO log/prospector.go:411 Scan aborted because prospector stopped.
2019-04-15T10:44:50.842+0530 DEBUG [prospector] log/prospector.go:168 Prospector states cleaned up. Before: 237, After: 237, Pending: 0
2019-04-15T10:44:50.842+0530 INFO log/harvester.go:216 Harvester started for file: /home/vankata/190_APS_QUALIFICATION/kalyan_elk_logs/new_working_dir/Process_heap_09_09_23_40.txt
^C^C^C^C^C^C2019-04-15T10:45:12.565+0530 INFO [monitoring] log/log.go:124 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":330,"time":337},"total":{"ticks":1320,"time":1330,"value":1320},"user":{"ticks":990,"time":993}},"info":{"ephemeral_id":"6e565251-92bb-446f-b673-a5d6c1f69dea","uptime":{"ms":120011}},"memstats":{"gc_next":22612144,"memory_alloc":15827312,"memory_total":59980512,"rss":1110016}},"filebeat":{"events":{"active":3,"added":3},"harvester":{"open_files":90,"running":90,"started":2},"prospector":{"log":{"files":{"truncated":1}}}},"libbeat":{"config":{"module":{"running":0}},"pipeline":{"clients":1,"events":{"active":4117,"retry":1048}}},"registrar":{"states":{"current":288}},"system":{"load":{"1":0,"15":0.05,"5":0.01,"norm":{"1":0,"15":0.0021,"5":0.0004}}}}}}
2019-04-15T10:45:28.676+0530 ERROR pipeline/output.go:74 Failed to connect: dial tcp 10.153.148.199:5044: i/o timeout
2019-04-15T10:45:28.676+0530 DEBUG [logstash] logstash/async.go:94 connect
^C^C2019-04-15T10:45:42.565+0530 INFO [monitoring] log/log.go:124 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":330,"time":338},"total":{"ticks":1360,"time":1368,"value":1360},"user":{"ticks":1030,"time":1030}},"info":{"ephemeral_id":"6e565251-92bb-446f-b673-a5d6c1f69dea","uptime":{"ms":150010}},"memstats":{"gc_next":22954272,"memory_alloc":11490968,"memory_total":60236984,"rss":2330624}},"filebeat":{"harvester":{"open_files":90,"running":90}},"libbeat":{"config":{"module":{"running":0}},"pipeline":{"clients":1,"events":{"active":4117,"retry":1000}}},"registrar":{"states":{"current":288}},"system":{"load":{"1":0,"15":0.05,"5":0.01,"norm":{"1":0,"15":0.0021,"5":0.0004}}}}}}

Config part:

  1. Filebeat is installed on a separate client machine.
  2. The ELK stack is on a different machine, acting as the server.
  3. The ELK server was reachable from the client before running the Filebeat binary.
  4. Logstash was in listening mode on port 5044 (a rough sketch of the relevant filebeat.yml follows below).
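
In case the configuration helps, the relevant parts of my filebeat.yml look roughly like this (a re-typed sketch, not the exact file; the glob is illustrative):

filebeat.prospectors:
  - type: log
    paths:
      - /home/vankata/190_APS_QUALIFICATION/kalyan_elk_logs/new_working_dir/*.txt   # illustrative glob
output.logstash:
  hosts: ["10.153.148.199:5044"]   # same host/port as in the i/o timeout error above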

Please let me know if you need the full configuration details from the Filebeat and Logstash side.

Closing this as I found the solution!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.