Docker logstash container not listening on port 5044

The Logstash container runs, but its ports are not open.
When I run docker port logstash it yields no output, while docker port elasticsearch and docker port kibana do return mappings.
When I telnet to the host on port 5044 or 9600, the connection is refused.
I also notice that in the docker ps output the Logstash container is missing 0.0.0.0:5044->5044/tcp; it only shows 5044/tcp.

$ sudo netstat -pnlt
Active Internet connections (only servers)
Proto Recv-Q Send-Q Local Address     Foreign Address   State    PID/Program name
tcp        0      0 0.0.0.0:22        0.0.0.0:*         LISTEN   1124/sshd
tcp6       0      0 :::9200           :::*              LISTEN   3800/docker-proxy-c
tcp6       0      0 :::22             :::*              LISTEN   1124/sshd
tcp6       0      0 :::5601           :::*              LISTEN   6430/docker-proxy-c

$ docker ps
26884c45549a docker.elastic.co/logstash/logstash:6.3.2 "/usr/local/bin/do..." 18 minutes ago Up 7 minutes 5044/tcp, 9600/tcp logstash
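
For completeness, the bindings can also be inspected directly. On a container whose ports are only exposed (not published), docker port prints nothing and the Ports map has no host bindings; the inspect output below is roughly what that looks like, though the exact formatting varies by Docker version:

$ docker port logstash
$ docker inspect --format '{{.NetworkSettings.Ports}}' logstash
map[5044/tcp:[] 9600/tcp:[]]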

Docker version: 1.13
Compose file version: 2
I am using the official Elasticsearch, Logstash, and Kibana 6.3.2 images.

When I start the stack with docker-compose up -d, all containers come up fine.

The Logstash logs show:

[2018-08-19T10:16:51,155][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.3.2"}
[2018-08-19T10:16:52,523][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch hosts=>[http://elasticsearch:9200], bulk_path=>"/_xpack/monitoring/_bulk?system_id=logstash&system_api_version=2&interval=1s", manage_template=>false, document_type=>"%{[@metadata][document_type]}", sniffing=>false, id=>"51f66c9ec66feb8fd59be1157c335c1ddc5fb856d2b248254ae6add289cea7b7", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_83f81cee-ae98-417c-a32a-d7406332f0da", enable_metric=>true, charset=>"UTF-8">, workers=>1, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, action=>"index", ssl_certificate_verification=>true, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2018-08-19T10:16:53,181][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>".monitoring-logstash", "pipeline.workers"=>1, "pipeline.batch.size"=>2, "pipeline.batch.delay"=>50}
[2018-08-19T10:16:53,543][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-08-19T10:16:53,787][INFO ][logstash.inputs.beats    ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2018-08-19T10:16:53,812][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x4b57804c@/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:48 sleep>"}
[2018-08-19T10:16:53,908][INFO ][org.logstash.beats.Server] Starting server on port: 5044
[2018-08-19T10:16:54,079][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elasticsearch:9200/]}}
[2018-08-19T10:16:54,086][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://elasticsearch:9200/, :path=>"/"}
[2018-08-19T10:16:54,269][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://elasticsearch:9200/"}
[2018-08-19T10:16:54,314][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-08-19T10:16:54,317][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-08-19T10:16:54,335][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://elasticsearch:9200"]}
[2018-08-19T10:16:54,416][INFO ][logstash.licensechecker.licensereader] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elasticsearch:9200/]}}
[2018-08-19T10:16:54,416][INFO ][logstash.licensechecker.licensereader] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://elasticsearch:9200/, :path=>"/"}
[2018-08-19T10:16:54,420][WARN ][logstash.licensechecker.licensereader] Restored connection to ES instance {:url=>"http://elasticsearch:9200/"}
[2018-08-19T10:16:54,424][INFO ][logstash.licensechecker.licensereader] ES Output version determined {:es_version=>6}
[2018-08-19T10:16:54,425][WARN ][logstash.licensechecker.licensereader] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-08-19T10:16:54,505][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>".monitoring-logstash", :thread=>"#<Thread:0x6221a636 sleep>"}
[2018-08-19T10:16:54,545][INFO ][logstash.agent           ] Pipelines running {:count=>2, :running_pipelines=>[:main, :".monitoring-logstash"], :non_running_pipelines=>[]}
[2018-08-19T10:16:54,556][INFO ][logstash.inputs.metrics  ] Monitoring License OK
[2018-08-19T10:16:54,798][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
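
So inside the container the Beats input on 5044 and the API endpoint on 9600 both start fine. A quick way to confirm that from the host, assuming curl is available inside the official Logstash image, is to query the API through docker exec:

$ docker exec logstash curl -s http://localhost:9600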

docker-compose.yml:

version: '2'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:6.3.2
    container_name: elasticsearch
    ports: ['9200:9200']
    networks: ['stack']
    volumes:
      - 'es_data:/usr/share/elasticsearch/data'

  kibana:
    image: docker.elastic.co/kibana/kibana:6.3.2
    container_name: kibana
    ports: ['5601:5601']
    networks: ['stack']
    depends_on: ['elasticsearch']

  logstash:
    image: docker.elastic.co/logstash/logstash:6.3.2
    container_name: logstash
    networks: ['stack']
    depends_on: ['elasticsearch']

volumes:
  es_data:
    driver: local

networks:
  stack:
    driver: bridge


I solved the problem by adding ports: ['5044:5044'] under the logstash service in docker-compose.yml.

Without a ports mapping, Compose only exposes the port inside the Docker network; the mapping is required to publish (bind) the port on the host.

New docker-compose.yml:

version: '2'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:6.3.2
    container_name: elasticsearch
    ports: ['9200:9200']
    networks: ['elkstack']
    volumes:
      - 'es_data:/usr/share/elasticsearch/data'

  kibana:
    image: docker.elastic.co/kibana/kibana:6.3.2
    container_name: kibana
    ports: ['5601:5601']
    networks: ['elkstack']
    depends_on: ['elasticsearch']

  logstash:
    image: docker.elastic.co/logstash/logstash:6.3.2
    volumes:
      - '/path/to/config/:/usr/share/logstash/pipeline/'
    container_name: logstash
    ports: ['5044:5044']
    networks: ['elkstack']
    depends_on: ['elasticsearch']

volumes:
  es_data:
    driver: local

networks:
  elkstack:
    driver: bridge
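
After recreating the containers with docker-compose up -d, the mapping should be visible: docker ps should now list 0.0.0.0:5044->5044/tcp for the logstash container, telnet to port 5044 should connect, and docker port should print something along these lines:

$ docker port logstash
5044/tcp -> 0.0.0.0:5044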

