I'm trying to ship logs with Filebeat, parse them through Logstash, and index them into Elasticsearch. To get the pipeline working first, I haven't added the grok filter yet, but events stop flowing and Filebeat logs the error below.
I'm on version 6.x of the stack.
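For reference, this is roughly my configuration, a minimal sketch with no grok filter yet. The file path and the Logstash port match the log output below; the Elasticsearch host is an assumption for my local setup.

filebeat.yml:

    filebeat.inputs:    # named filebeat.prospectors on 6.x releases before 6.3
    - type: log
      paths:
        - /var/log/SDP/events/EventLogFile.txt*
    output.logstash:
      hosts: ["127.0.0.1:5044"]

Logstash pipeline (beats input passed straight to Elasticsearch, no filters):

    input {
      beats {
        port => 5044
      }
    }
    output {
      elasticsearch {
        hosts => ["127.0.0.1:9200"]
      }
    }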
2019-05-15T13:20:33.679-0500 INFO log/harvester.go:216 Harvester started for file: /var/log/SDP/events/EventLogFile.txt.0
2019-05-15T13:20:33.725-0500 ERROR logstash/async.go:235 Failed to publish events caused by: write tcp 127.0.0.1:23054->127.0.0.1:5044: write: connection reset by peer
2019-05-15T13:20:34.726-0500 ERROR pipeline/output.go:92 Failed to publish events: write tcp 127.0.0.1:23054->127.0.0.1:5044: write: connection reset by peer
2019-05-15T13:20:43.620-0500 INFO [monitoring] log/log.go:124 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":670,"time":675},"total":{"ticks":6380,"time":6391,"value":6380},"user":{"ticks":5710,"time":5716}},"info":{"ephemeral_id":"7baa6f1d-01fd-4c42-b4b9-7867e6cf2083","uptime":{"ms":1740008}},"memstats":{"gc_next":12478016,"memory_alloc":9147128,"memory_total":1016521112,"rss":10252288}},"filebeat":{"events":{"added":43656,"done":43656},"harvester":{"open_files":2,"running":2,"started":1}},"libbeat":{"config":{"module":{"running":0}},"output":{"events":{"acked":43655,"batches":23,"failed":2048,"total":45703},"read":{"bytes":132},"write":{"bytes":3462886,"errors":1}},"pipeline":{"clients":1,"events":{"active":0,"filtered":1,"published":43655,"retry":4096,"total":43656},"queue":{"acked":43655}}},"registrar":{"states":{"current":11,"update":43656},"writes":23},"system":{"load":{"1":1.39,"15":0.67,"5":0.72,"norm":{"1":0.029,"15":0.014,"5":0.015}}}}}}