Not able to discover Logstash data in Kibana

I am using version 7.3 for Elasticsearch, Kibana, and Logstash, and I have an APM agent installed on my application.

I am trying to set up Filebeat and ship its input through Logstash to Elasticsearch.

Below is the log from Filebeat. Everything runs on the same machine. I see an error in the log: "Failed to publish events caused by: write tcp ...", as shown below.

Why am I not able to see the Logstash data in Kibana's Discover? It worked when the Filebeat output was Elasticsearch; when I changed the output to Logstash, Discover stopped showing new data.
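For context, the Logstash side of this setup presumably looks something like the minimal pipeline below. This is a sketch based on the defaults visible in the Logstash log further down (a beats input on 5044 and an elasticsearch output installing the logstash template); the conf file name, Elasticsearch host, and index name are assumptions.

  # beats-pipeline.conf (file name assumed): receive from Filebeat, write to local Elasticsearch
  input {
    beats {
      port => 5044                            # matches hosts: ["localhost:5044"] in filebeat.yml
    }
  }
  output {
    elasticsearch {
      hosts => ["http://localhost:9200"]      # assumed: Elasticsearch on the same machine
      index => "logstash-%{+YYYY.MM.dd}"      # default-style daily index
    }
  }

If events do reach Elasticsearch along this path, Discover only shows them once an index pattern matching logstash-* exists in Kibana, which is different from the filebeat-* pattern used when Filebeat wrote to Elasticsearch directly.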

My filebeat.yml

filebeat.inputs:
- type: log
  paths:
    - C:/STS/workspace/v17base/v17.0.2workspace/xyz_xstore/log/retmonEvent.out

output.logstash:
  hosts: ["localhost:5044"]
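A quick way to confirm that this file parses and that Filebeat can actually reach Logstash on port 5044 is Filebeat's built-in test commands. The invocation below is a sketch for Windows (paths assumed; run from the Filebeat install directory):

  .\filebeat.exe test config -c filebeat.yml    # checks that the YAML is valid
  .\filebeat.exe test output -c filebeat.yml    # attempts a connection to localhost:5044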

Filebeat log:

2019-11-26T16:38:53.797-0800 INFO [monitoring] log/log.go:145 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":62},"total":{"ticks":124,"value":124},"user":{"ticks":62}},"handles":{"open":236},"info":{"ephemeral_id":"edd6d598-9ecc-48d6-bcf1-2ed479a55ae8","uptime":{"ms":570057}},"memstats":{"gc_next":4973312,"memory_alloc":2615048,"memory_total":10292864,"rss":4096},"runtime":{"goroutines":23}},"filebeat":{"harvester":{"open_files":0,"running":0}},"libbeat":{"config":{"module":{"running":0}},"pipeline":{"clients":1,"events":{"active":0}}},"registrar":{"states":{"current":1}}}}}
2019-11-26T16:39:23.797-0800 INFO [monitoring] log/log.go:145 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":62},"total":{"ticks":124,"value":124},"user":{"ticks":62}},"handles":{"open":236},"info":{"ephemeral_id":"edd6d598-9ecc-48d6-bcf1-2ed479a55ae8","uptime":{"ms":600058}},"memstats":{"gc_next":4973312,"memory_alloc":2680488,"memory_total":10358304},"runtime":{"goroutines":23}},"filebeat":{"harvester":{"open_files":0,"running":0}},"libbeat":{"config":{"module":{"running":0}},"pipeline":{"clients":1,"events":{"active":0}}},"registrar":{"states":{"current":1}}}}}
2019-11-26T16:39:33.873-0800 INFO log/harvester.go:253 Harvester started for file: C:\STS\workspace\v17base\v17.0.2workspace\xyz_xstore\log\retmonEvent.out
2019-11-26T16:39:34.879-0800 ERROR logstash/async.go:256 Failed to publish events caused by: write tcp [::1]:59554->[::1]:5044: wsasend: An existing connection was forcibly closed by the remote host.
2019-11-26T16:39:36.072-0800 ERROR pipeline/output.go:121 Failed to publish events: write tcp [::1]:59554->[::1]:5044: wsasend: An existing connection was forcibly closed by the remote host.
2019-11-26T16:39:36.072-0800 INFO pipeline/output.go:95 Connecting to backoff(async(tcp://localhost:5044))
2019-11-26T16:39:36.072-0800 INFO pipeline/output.go:105 Connection to backoff(async(tcp://localhost:5044)) established
2019-11-26T16:39:53.796-0800 INFO [monitoring] log/log.go:145 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":78,"time":{"ms":16}},"total":{"ticks":140,"time":{"ms":16},"value":140},"user":{"ticks":62}},"handles":{"open":244},"info":{"ephemeral_id":"edd6d598-9ecc-48d6-bcf1-2ed479a55ae8","uptime":{"ms":630058}},"memstats":{"gc_next":8164416,"memory_alloc":7120008,"memory_total":15139464,"rss":3317760},"runtime":{"goroutines":27}},"filebeat":{"events":{"added":55,"done":55},"harvester":{"open_files":1,"running":1,"started":1}},"libbeat":{"config":{"module":{"running":0}},"output":{"events":{"acked":54,"batches":3,"failed":28,"total":82},"read":{"bytes":12},"write":{"bytes":3587,"errors":1}},"pipeline":{"clients":1,"events":{"active":0,"filtered":1,"published":54,"retry":56,"total":55},"queue":{"acked":54}}},"registrar":{"states":{"current":1,"update":55},"writes":{"success":3,"total":3}}}}}
2019-11-26T16:40:23.796-0800 INFO [monitoring] log/log.go:145 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":78},"total":{"ticks":140,"value":140},"user":{"ticks":62}},"handles":{"open":248},"info":{"ephemeral_id":"edd6d598-9ecc-48d6-bcf1-2ed479a55ae8","uptime":{"ms":660057}},"memstats":{"gc_next":8101760,"memory_alloc":4071848,"memory_total":16772584,"rss":815104},"runtime":{"goroutines":27}},"filebeat":{"events":{"added":3,"done":3},"harvester":{"open_files":1,"running":1}},"libbeat":{"config":{"module":{"running":0}},"output":{"events":{"acked":3,"batches":1,"total":3},"read":{"bytes":6},"write":{"bytes":622}},"pipeline":{"clients":1,"events":{"active":0,"published":3,"total":3},"queue":{"acked":3}}},"registrar":{"states":{"current":1,"update":3},"writes":{"success":1,"total":1}}}}}
2019-11-26T16:40:53.796-0800 INFO [monitoring] log/log.go:145 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":78},"total":{"ticks":140,"value":140},"user":{"ticks":62}},"handles":{"open":246},"info":{"ephemeral_id":"edd6d598-9ecc-48d6-bcf1-2ed479a55ae8","uptime":{"ms":690056}},"memstats":{"gc_next":8101760,"memory_alloc":4153528,"memory_total":16854264,"rss":8192},"runtime":{"goroutines":27}},"filebeat":{"harvester":{"open_files":1,"running":1}},"libbeat":{"config":{"module":{"running":0}},"pipeline":{"clients":1,"events":{"active":0}}},"registrar":{"states":{"current":1}}}}}
2019-11-26T16:41:23.797-0800 INFO [monitoring] log/log.go:145 Non-zer

What do you see in the logstash logs?

[2019-11-26T16:25:15,841][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, :thread=>"#<Thread:0x361eeeb5 run>"}
[2019-11-26T16:25:15,878][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1, "index.lifecycle.name"=>"logstash-policy", "index.lifecycle.rollover_alias"=>"logstash"}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-11-26T16:25:16,270][INFO ][logstash.inputs.beats ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2019-11-26T16:25:16,280][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"main"}
[2019-11-26T16:25:16,351][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-11-26T16:25:16,361][INFO ][org.logstash.beats.Server] Starting server on port: 5044
[2019-11-26T16:25:16,705][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-11-26T16:29:21,783][INFO ][org.logstash.beats.BeatsHandler] [local: 0.0.0.0:5044, remote: 0:0:0:0:0:0:0:1:59481] Handling exception: An existing connection was forcibly closed by the remote host
[2019-11-26T16:29:21,784][WARN ][io.netty.channel.DefaultChannelPipeline] An exceptionCaught() event was fired, and it reached at the tail of the pipeline. It usually means the last handler in the pipeline did not handle the exception.
java.io.IOException: An existing connection was forcibly closed by the remote host
at sun.nio.ch.SocketDispatcher.read0(Native Method) ~[?:1.8.0_221]
at sun.nio.ch.SocketDispatcher.read(Unknown Source) ~[?:1.8.0_221]
at sun.nio.ch.IOUtil.readIntoNativeBuffer(Unknown Source) ~[?:1.8.0_221]
at sun.nio.ch.IOUtil.read(Unknown Source) ~[?:1.8.0_221]
at sun.nio.ch.SocketChannelImpl.read(Unknown Source) ~[?:1.8.0_221]
at io.netty.buffer.PooledUnsafeDirectByteBuf.setBytes(PooledUnsafeDirectByteBuf.java:288) ~[netty-all-4.1.30.Final.jar:4.1.30.Final]
at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:1128) ~[netty-all-4.1.30.Final.jar:4.1.30.Final]
at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:347) ~[netty-all-4.1.30.Final.jar:4.1.30.Final]
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:148) ~[netty-all-4.1.30.Final.jar:4.1.30.Final]
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:644) ~[netty-all-4.1.30.Final.jar:4.1.30.Final]
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:579) ~[netty-all-4.1.30.Final.jar:4.1.30.Final]
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:496) ~[netty-all-4.1.30.Final.jar:4.1.30.Final]
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:458) [netty-all-4.1.30.Final.jar:4.1.30.Final]
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:897) [netty-all-4.1.30.Final.jar:4.1.30.Final]
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) [netty-all-4.1.30.Final.jar:4.1.30.Final]
at java.lang.Thread.run(Unknown Source) [?:1.8.0_221]
