Hi all, I've been doing a number of installs toward getting a fully working ELK system, and I've not yet gotten to the L part. What I'm trying to end up with is a MySQL db search setup that is accessible over the LAN.
I have a dedicated machine running Ubuntu 22.04 Server and I'm installing with apt. (I used tarballs first but decided the Elastic repo might be easier.) Several reinstalls later, and after thinking I had it, it seems I don't, and I'm not sure what's going wrong. For example, I installed Elasticsearch following the exact steps except for replacing localhost with the server IP, and when I try
    curl -X GET 'http://:9200'
the result is (52) Empty reply from server.
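(For reference, the localhost-to-IP change I mean is the bind address in /etc/elasticsearch/elasticsearch.yml; as far as I understand, it amounts to something like the lines below, with a placeholder address rather than my real one.)

    # bind to the LAN interface instead of the default localhost
    network.host: <server-ip>
    http.port: 9200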
Using https results in curl not being able to verify the legitimacy of the server.
I thought a CA was created automatically during the install and that HTTPS would be operational?
At one point I was able to use Kibana. I tried to use it to add MySQL access, but eventually discovered that Logstash is needed for that. Which is fine, but it's an example of my lack of certainty that I'm barking up the right tree.
I've wiped the system and started over a few times to ensure I have a clean build each time. But I'm starting to think the docs might not spell out exactly how to accomplish this, and of course I could be clueless about any and all parts.
To recap, I'm looking for the instructions that will get me a dedicated ELK server that is accessible across the LAN. Sorry for what might simply be my stupidity on the subject. I'm really excited to see how Elasticsearch can be used from PHP to search the MySQL data (the MySQL server may not even be on the same machine).
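For what it's worth, the MySQL part is a later step; going by the JDBC input examples in the Logstash docs, I expect the pipeline to end up roughly in this shape (every host, credential, and path below is a placeholder, and I haven't tried any of it yet):

    input {
      jdbc {
        # MySQL JDBC driver jar, downloaded separately
        jdbc_driver_library => "/usr/share/logstash/mysql-connector-j.jar"
        jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
        jdbc_connection_string => "jdbc:mysql://<mysql-host>:3306/<database>"
        jdbc_user => "<mysql-user>"
        jdbc_password => "<mysql-password>"
        statement => "SELECT * FROM <table>"
        schedule => "*/5 * * * *"   # poll every 5 minutes
      }
    }
    output {
      elasticsearch {
        hosts => ["https://<elasticsearch-ip>:9200"]
        user => "elastic"
        password => "<password>"
        cacert => "/etc/logstash/http_ca.crt"   # a readable copy of the Elasticsearch CA
        index => "mysql-data"
      }
    }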
That was correct... the certs are created, but they are self-signed certs (perhaps look that up / read about it), so in order to use HTTPS you have to either ignore verification or provide the CA (Certificate Authority) to validate the cert...
So curl would need -k, which says don't verify the cert.
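Something along these lines (swap in your server's IP; the -u elastic part will prompt for the elastic password, and the cert path shown is the usual location for a deb/apt install of 8.x):

    # skip certificate verification entirely
    curl -k -u elastic 'https://<your-server-ip>:9200'

    # or verify against the CA generated during install
    curl --cacert /etc/elasticsearch/certs/http_ca.crt -u elastic 'https://<your-server-ip>:9200'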
The HTTPS query works, but the cert command does not. It gives curl error 77, which points to either a path or a permissions problem.
My install is the defaults (from apt), which made /etc/elasticsearch assigned to the group elasticsearch, which I'm guessing falls under the elastic superuser.
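These are the sorts of checks I ran on the directory and the cert file (the path is the default the apt install created):

    ls -ld /etc/elasticsearch/
    ls -l /etc/elasticsearch/certs/http_ca.crt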
Running it with --cacert as the elastic user should work, as that is the superuser...
The instructions talk about the CA fingerprint, but I've not been able to locate it, either in the logs or in the initial output with the authentication information.
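From what I can tell in the docs, the fingerprint can also be computed from the CA cert itself with openssl, something like this (I haven't verified it on my box yet):

    openssl x509 -fingerprint -sha256 -noout -in /etc/elasticsearch/certs/http_ca.crt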
OK, fair enough. I was using the commands you listed, but with my IP and path, which returned curl error 77, and I then shared that the path is correct and that, permissions-wise, that doesn't seem to be the problem either.
Perhaps in the future: it really helps those of us trying to answer the questions if you show the actual command and the actual message that follows it, not just a verbal description of it.
Hmm, interesting. I guess when there are a lot of messages, the thread isn't fully read before the last one is answered. I'm used to thinking about it differently, but I'll adopt your suggestion. Thanks!
It's all good. All I'm saying is that it makes it much easier for those of us trying to answer questions if you post the exact commands you ran, followed by the exact error or return message... not a paraphrased version of it.
That leads to much clearer communication and quicker answers.
That holds even when we give you the commands to run. You may be very skilled, but we often have folks who don't run what we ask and then give us a different error message...
I answer hundreds of questions on this forum. These are just suggestions to help those of us who are answering them.
I figure you'll be using the Elastic Stack more, and we are super glad you're here. Just trying to help for when you may need a little bit more of it.
I'll take a list of commands and outputs any day of the week.
Oh no, I was not criticizing you, simply reflecting on what you said and how I had not looked at it quite that way. I've seen in various places how you can tell, just from the questions being asked, that a thread hasn't been fully read. So yes, being explicit here will make it easier to sort out typos and whatever else.
Thank you for caring enough to explain what I see as very valuable feedback!