Elasticsearch installed with Docker does not have a password

I have just followed the Docker installation guide and have Elasticsearch up and running, but anyone can add documents to it.

I am confused about how to secure it. Is authentication only pre-configured in the paid version?

What are the correct steps to add a simple password to the free version running in Docker?

Here is how I'm setting that up with Docker Compose (docker-compose.yml):

---
version: '3'
services:

  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:$ELASTIC_VERSION
    environment:
      - bootstrap.memory_lock=true
      - discovery.type=single-node
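      # ELASTIC_PASSWORD sets the bootstrap password for the built-in elastic user;
      # xpack.security.enabled turns authentication on (included in the free basic license since 6.8 / 7.1)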
      - ELASTIC_PASSWORD=$ELASTIC_PASSWORD
      - xpack.security.enabled=$ELASTIC_SECURITY
    ulimits:
      memlock:
        soft: -1
        hard: -1
    ports:
      - 9200:9200
    networks: ['stack']

  kibana:
    image: docker.elastic.co/kibana/kibana:$ELASTIC_VERSION
    environment:
      - ELASTICSEARCH_USERNAME=elastic
      - ELASTICSEARCH_PASSWORD=$ELASTIC_PASSWORD
    ports: ['5601:5601']
    networks: ['stack']
    links: ['elasticsearch']
    depends_on: ['elasticsearch']

networks:
  stack: {}

The .env file is:

ELASTIC_VERSION=7.14.0
ELASTIC_SECURITY=true
ELASTIC_PASSWORD=changeme
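
Once the stack is up, you can check that the password is actually enforced. Here is a minimal sketch with the Python requests library, assuming the defaults above (localhost:9200, user elastic, password changeme):

# Verify that anonymous requests are rejected and authenticated ones succeed.
# Assumes the compose stack above is running with the .env defaults.
import requests

r = requests.get("http://localhost:9200")
print(r.status_code)  # expected: 401 once security is enabled

r = requests.get("http://localhost:9200", auth=("elastic", "changeme"))
print(r.status_code)  # expected: 200
print(r.json()["cluster_name"])

The same check from the command line is simply curl -u elastic:changeme http://localhost:9200.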

It works in a similar way with plain docker run: pass the same environment variables. :wink:


Thanks for the help. The docs are a little confusing around this topic.

Does this work with version 5.6? I realised I need an older version to work with the mongo connector, and it doesn't seem to set the password for tag 5.6.16.

Maybe with a commercial license? I'm not sure.

But you should not be using any 5.x version anymore; upgrade to the latest 6.8 at the very least, or better, install 7.14.0.

And anyway, we no longer provide commercial licenses for 5.x versions as they are not supported.

I have a MongoDB database that is around 1 TB in size and I want to use Elasticsearch to create an index for it, but all the guides I have found refer to connectors that don't work with recent versions.

Do you have any advice?

You need to send your existing data to Elasticsearch.
That means:

  • Read from the database (SELECT * FROM TABLE, or the MongoDB equivalent)
  • Convert each record to a JSON document
  • Send the JSON document to Elasticsearch, preferably using the _bulk API.

Logstash can help for that (have a look at MongoDB Logstash Integration [Solved]). But I'd recommend modifying the application layer if possible and sending data to Elasticsearch in the same "transaction" as you are sending your data to the database.
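
If you want to do it yourself, here is a minimal sketch of that loop using the official Python clients (pymongo and elasticsearch). The connection details, the database/collection/index names and the changeme password are placeholders based on the example above; adapt them to your setup:

# Read every document from MongoDB, convert it to JSON-friendly values
# and send it to Elasticsearch through the _bulk API.
from elasticsearch import Elasticsearch
from elasticsearch.helpers import bulk
from pymongo import MongoClient

mongo = MongoClient("mongodb://localhost:27017")
collection = mongo["mydb"]["mycollection"]

es = Elasticsearch("http://localhost:9200", http_auth=("elastic", "changeme"))

def actions():
    for doc in collection.find({}):
        # Reuse the MongoDB _id as the Elasticsearch _id so re-runs overwrite
        # documents instead of duplicating them.
        doc_id = str(doc.pop("_id"))
        # Other BSON types (dates, Decimal128, ...) may also need converting
        # to JSON-friendly values before indexing.
        yield {"_index": "mycollection", "_id": doc_id, "_source": doc}

indexed, errors = bulk(es, actions(), chunk_size=1000)
print(f"indexed {indexed} documents")

For a 1 TB collection you will probably want to parallelise this (the helpers module also has a parallel_bulk variant) and think about how to keep the two stores in sync afterwards, which is where the application-layer approach above comes in.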

I shared most of my thoughts there: Advanced Search for Your Legacy Application - -Xmx128gb -Xms128gb

Also have a look at this "live coding" recording.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.