Error: Address already in use

Hi

I have installed Logstash in a Debian container which has been spun up using --link elasticsearch:elasticsearch.
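For context, the container was started roughly like this (a sketch from memory; the image tag and the other run options are assumptions, only the --link part is certain):

    # hypothetical docker run; the real image/options may differ
    docker run -it --name logstash --link elasticsearch:elasticsearch debian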

  1. /config-dir/first-pipeline.conf:

    input {
      tcp {
        port => 8083
      }
    }
    output {
      elasticsearch { hosts => ["172.18.0.10:9200"] }
    }

  2. Running curl 172.18.0.10:9200 shows me:

    {
      "name" : "3-cPlh9",
      "cluster_name" : "elasticsearch",
      "cluster_uuid" : "46Upu7ARTO2OPXufnqZhyg",
      "version" : {
        "number" : "5.6.12",
        "build_hash" : "cfe3d9f",
        "build_date" : "2018-09-10T20:12:43.732Z",
        "build_snapshot" : false,
        "lucene_version" : "6.6.1"
      },
      "tagline" : "You Know, for Search"
    }

  3. Running netstat -a | egrep 'Proto|LISTEN' does NOT show 9600 as occupied.

  4. When running /usr/share/logstash/bin/logstash -f /config-dir/first-pipeline.conf --path.settings /etc/logstash/ I keep seeing the following errors (the commands from steps 2-4 are gathered into one block after this log output):
    [2018-09-26T10:34:46,840][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
    [2018-09-26T10:34:47,239][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.4.1"}
    [2018-09-26T10:34:48,494][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>64, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
    [2018-09-26T10:34:48,836][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://172.18.0.10:9200/]}}
    [2018-09-26T10:34:48,845][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://172.18.0.10:9200/, :path=>"/"}
    [2018-09-26T10:34:49,066][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://172.18.0.10:9200/"}
    [2018-09-26T10:34:49,112][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>5}
    [2018-09-26T10:34:49,137][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//172.18.0.10:9200"]}
    [2018-09-26T10:34:49,153][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
    [2018-09-26T10:34:49,214][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
    [2018-09-26T10:34:49,233][INFO ][logstash.inputs.tcp ] Starting tcp input listener {:address=>"0.0.0.0:8083", :ssl_enable=>"false"}
    [2018-09-26T10:34:49,429][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x116c1c34 run>"}
    [2018-09-26T10:34:49,463][ERROR][logstash.pipeline ] A plugin had an unrecoverable error. Will restart this plugin.
    Pipeline_id:main
    Plugin: <LogStash::Inputs::Tcp port=>8083, id=>"ae37c7b04e7975c9ba81d4fec53f179a3758587c980eac0f8fd700ee72a25a65", enable_metric=>true, codec=><LogStash::Codecs::Line id=>"line_03daa7c8-0d06-42d6-aa51-4c7ede7f2006", enable_metric=>true, charset=>"UTF-8", delimiter=>"\n">, host=>"0.0.0.0", mode=>"server", proxy_protocol=>false, ssl_enable=>false, ssl_verify=>true, ssl_key_passphrase=><password>, tcp_keep_alive=>false>
    Error: Address already in use
    Exception: Java::JavaNet::BindException
    Stack: sun.nio.ch.Net.bind0(Native Method)
    sun.nio.ch.Net.bind(sun/nio/ch/Net.java:433)
    sun.nio.ch.Net.bind(sun/nio/ch/Net.java:425)
    sun.nio.ch.ServerSocketChannelImpl.bind(sun/nio/ch/ServerSocketChannelImpl.java:223)
    io.netty.channel.socket.nio.NioServerSocketChannel.doBind(io/netty/channel/socket/nio/NioServerSocketChannel.java:128)
    io.netty.channel.AbstractChannel$AbstractUnsafe.bind(io/netty/channel/AbstractChannel.java:558)
    io.netty.channel.DefaultChannelPipeline$HeadContext.bind(io/netty/channel/DefaultChannelPipeline.java:1283)
    io.netty.channel.AbstractChannelHandlerContext.invokeBind(io/netty/channel/AbstractChannelHandlerContext.java:501)
    io.netty.channel.AbstractChannelHandlerContext.bind(io/netty/channel/AbstractChannelHandlerContext.java:486)
    io.netty.channel.DefaultChannelPipeline.bind(io/netty/channel/DefaultChannelPipeline.java:989)
    io.netty.channel.AbstractChannel.bind(io/netty/channel/AbstractChannel.java:254)
    io.netty.bootstrap.AbstractBootstrap$2.run(io/netty/bootstrap/AbstractBootstrap.java:364)
    io.netty.util.concurrent.AbstractEventExecutor.safeExecute(io/netty/util/concurrent/AbstractEventExecutor.java:163)
    io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(io/netty/util/concurrent/SingleThreadEventExecutor.java:403)
    io.netty.channel.nio.NioEventLoop.run(io/netty/channel/nio/NioEventLoop.java:463)
    io.netty.util.concurrent.SingleThreadEventExecutor$5.run(io/netty/util/concurrent/SingleThreadEventExecutor.java:858)
    io.netty.util.concurrent.FastThreadLocalRunnable.run(io/netty/util/concurrent/FastThreadLocalRunnable.java:30)
    java.lang.Thread.run(java/lang/Thread.java:748)
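To make the checks easier to reproduce, these are the exact commands from steps 2-4, run inside the Logstash container:

    # step 2: confirm Elasticsearch answers on the linked address
    curl 172.18.0.10:9200
    # step 3: list listening TCP sockets (9600 is not shown as occupied)
    netstat -a | egrep 'Proto|LISTEN'
    # step 4: start Logstash with the pipeline shown above
    /usr/share/logstash/bin/logstash -f /config-dir/first-pipeline.conf --path.settings /etc/logstash/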

Thanks in advance for ANY help with this subject.
Albert.

You can't listen on port 8083 in more than one place. Is some other process already listening on that port? Do you have two inputs in your Logstash configuration that use port 8083?
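A quick way to check both possibilities from inside the container (a sketch; netstat -p needs net-tools, and ss or lsof work just as well if they are installed):

    # which process, if any, is already bound to 8083
    netstat -tlnp | grep 8083
    # alternatives: ss -lntp | grep 8083   or   lsof -i :8083

    # look for duplicate tcp inputs across the config locations mentioned here
    grep -rn 8083 /config-dir /etc/logstash

If another Logstash instance (for example, one started as a service by the package install) is already running the same pipeline, that would also explain the bind failure.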

Hi magnusbaeck,

My bad - thanks for the prompt response

Albert
