Is setting up the ELK stack on Docker intentionally this complicated?

I've been dealing with this for a couple of hours now and followed half a dozen tutorials.

I am not a noob; I've set up a thousand servers, and I run my own stack at home with a dozen containers.

But this? Impossible.

Does anyone have a minimal viable configuration to run this on my own server to analyze my Docker container logs?

Welcome!

What did you try so far?
Did you manage to start Elasticsearch and Kibana successfully?

Can't even paste the link to the tutorials.

I gave up.

Forget it.


I imagine this is intentional, to push users toward your cloud subscription.

I don't see any other reason.

It's hard to help if you don't tell us what the problem is.

I imagine this is intentional, to push users toward your cloud subscription.

No. It's not. Even if you run on cloud, I will ask the same questions. What is your current status? Do you have an Elasticsearch / Kibana instance running yet?

Anyway, in case you haven't successfully started your Elastic cluster yet, here is a simple guide; the first thing it tells you to run is:

curl -fsSL https://elastic.co/start-local | sh

Not sure if you did that yet or not...

This doesn't even give you a docker-compose file.

I like my volumes in the same folder as the compose file; this sets them up globally as named Docker volumes.
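What I mean is roughly this in the compose file, instead of a named volume (the image tag and folder here are just placeholders, not from the start-local setup):

services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.17.0
    volumes:
      # bind mount next to the compose file; the folder must be writable by uid 1000
      - ./esdata:/usr/share/elasticsearch/data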

I followed the "Getting started" guide on this same website, where there is a link to a GitHub repo with a docker-compose file and a Logstash config file.

I ran your script; the containers are running, but it's inaccessible on localhost port 5601.

With a few tweaks I got it running, but it won't let me log in:

You do not have permission to access the requested page

Either go back to the previous page or log in as a different user.

I've used u: kibana_system, and the password is in .env. There is no other username variable in the env file or the docker-compose ...

The password is printed when you start the script.
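Also note that kibana_system is the service account Kibana itself uses to talk to Elasticsearch; the browser login is the elastic user. If the printed password has already scrolled away, you can reset it from inside the container, for example (the container name here is an assumption, adjust it to yours):

docker exec -it elasticsearch /usr/share/elasticsearch/bin/elasticsearch-reset-password -u elastic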

But if you want finer control over everything, you can follow the installation guide from Install Elasticsearch with Docker | Elasticsearch Guide [8.17] | Elastic

See, needlessly complicated ... you close the shell too soon and it's gone, start from scratch.

Do you want to run it for production? For testing?

If the latter, then I personally think that running curl -fsSL https://elastic.co/start-local | sh is not that complicated. It starts one Elasticsearch cluster, one Kibana instance, and prints an API key and a password.

And then, open http://localhost:5601



Yes, thanks, I'm in now.

They could've put the username and password in the .env since it already exists.

I want to run it on my home server, where I keep Home Assistant, Nextcloud, Pi-hole, a mail server, and a couple of others.

I would like to monitor logs from my Docker containers.

So I guess production.

I also disabled the premium features and reverted the license to Basic in the settings.
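For shipping the container logs themselves, one common route is Filebeat's container input; a minimal filebeat.yml sketch would look roughly like this (the host, credentials and paths are assumptions to adapt):

filebeat.inputs:
  - type: container
    paths:
      # default location of Docker's json-file logs on the host
      - /var/lib/docker/containers/*/*.log

output.elasticsearch:
  hosts: ["https://localhost:9200"]
  username: "elastic"
  password: "${ELASTIC_PASSWORD}"
  # with a self-signed cert you may also need ssl.certificate_authorities

If Filebeat itself runs as a container, that log directory needs to be mounted read-only into it.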