I am enrolled in "Elasticsearch Engineer 1 (On-Demand)" via my company account. There are several problems with the Strigo environment, and no link to the dataset ZIP is provided, so I am looking for support.
Let's start with Strigo:
At the moment, the entire lab environment no longer starts. I have tried on macOS with Firefox 80.0.1 (all plugins disabled) as well as with Chrome 85.0.4183.102. It now shows:
"Something went wrong. We have encountered a technical issue while setting up your lab. Please contact your event host to resolve this issue."
Before that, it shows the "Preparing machine" spinner and then ends with the error message. Apparently I cannot reset it myself, so it is stuck.
The very first time it ran fine, and I could SSH into the other servers. Then the SSH connections stopped working: connecting to server1/2/3 just hangs without ever returning to the console (maybe a timeout?). I tried removing the already accepted host keys in .ssh, but that didn't change anything. Pinging the IPs behind the server names gets no replies, but that might be firewall-related.
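For anyone hitting the same symptom, the checks described above can be sketched like this (the hostname server1 and the user elastic are assumptions based on the lab setup; adjust to your environment):

```shell
#!/bin/sh
# Diagnostic sketch for the SSH hangs described above.
# "server1" and the user "elastic" are assumed lab names; adjust as needed.

# Forget the previously accepted host key so a changed key cannot block us.
ssh-keygen -R server1 2>/dev/null

# Try the connection with a short timeout instead of letting it hang;
# BatchMode prevents an interactive password prompt from stalling the script.
if ssh -o ConnectTimeout=5 -o BatchMode=yes elastic@server1 hostname 2>/dev/null; then
  ssh_result="reachable"
else
  ssh_result="unreachable"
fi
echo "server1 is $ssh_result over SSH"

# Ping the IP behind the name; no reply may only mean ICMP is filtered.
ping -c 3 server1 2>/dev/null || echo "no ping reply (possibly firewall-related)"
```

If the SSH attempt times out while the ping works, the firewall or the instance itself is the more likely culprit than your local SSH configuration.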
So I turned to the documentation on the virtual server, specifically the point:
"Running labs on your local machine". It says: "The ZIP file that you can download from the Elastic training portal contains the lab instructions in the form of HTML files, as well as the datasets. You can use these to recreate the lab environment on your own machine. This page is provided to help you get the Elastic Stack up and running locally."
Specifically, it mentions a blogs.csv, a blogs_csv.conf and an elastic_blog_curated_access_logs.tar.
I saw a dataset ZIP file on server1 during my one successful connection (back when the console worked). All I want is to replicate the Elastic config on my own machine, since Strigo is totally unreliable.
Now when I look into the course outline, there is no ZIP file to download.
So, two questions:
1. Where is the ZIP with the datasets?
2. If that cannot be provided, can someone fix the broken Strigo machine?
I have to say I am a bit perplexed by the instability of the Strigo setup. I sure hope the exam works better than the current preparation course.
Hi Marcus,
thank you very much for reaching out to us. This is a new issue that seems to be affecting many of our users in the last few days/hours, even though we have not applied any updates. We are wondering whether there is an issue with Strigo or, more specifically, with the way AWS serves their instances.
We are currently investigating and will let you know what we find. The short-term solution we can offer is to replace your lab environment, which means things will work again (at least initially). Unfortunately, you will lose any work you have done. Let me know if you want to follow this path.
Hi Marcus,
we don't provide the datasets for the on-demand lab environment, as the environment itself is a bit more complex (and enhances the experience) than just having the datasets. The mention you found in the instructions is a bug that we plan to fix in the next few days. Sorry about that.
I am facing similar issues to those described above: I can access the entry server, but when I try to SSH to any of the other servers, it times out.
My second attempt was to set up the whole thing on my laptop, but, as mentioned above, I can't find the necessary ZIP file anywhere. I started the training last Monday (14-09-2020).
I am having trouble accessing the Strigo environment. I have restarted my laptop, deactivated popup blockers, cleared cookies, and tried a fresh browser. I registered for the Elasticsearch Engineer 1 course and still have the same problem: only the loading screen appears, and I have been waiting for over 2 hours. Please advise how to resolve this issue.
If switching browsers does not help, then script blockers or ad blockers are worth looking at; they are known to cause issues, and the blank screen you see in Chrome is often caused by such extensions. Try disabling them, for example by opening the page in a private or incognito window.
If that does not help, your network may be blocking some functionality. A proxy or VPN could be the cause. Try accessing the lab via another network and disabling your VPN/proxy.
I can now access the environment, but Kibana is not working. Elasticsearch is started and responds fine, but when I try to start Kibana from a fresh tar extraction, I get:
log [15:39:04.736] [fatal][root] { Error: Request Timeout after 30000ms
at /home/elastic/kibana-7.3.1-linux-x86_64/node_modules/elasticsearch/src/lib/transport.js:362:15
at Timeout. (/home/elastic/kibana-7.3.1-linux-x86_64/node_modules/elasticsearch/src/lib/transport.js:391:7)
at ontimeout (timers.js:436:11)
at tryOnTimeout (timers.js:300:5)
at listOnTimeout (timers.js:263:5)
at Timer.processTimers (timers.js:223:10)
status: undefined,
displayName: 'RequestTimeout',
message: 'Request Timeout after 30000ms',
body: undefined,
isBoom: true,
isServer: true,
data: null,
output:
{ statusCode: 503,
payload:
{ statusCode: 503,
error: 'Service Unavailable',
message: 'Request Timeout after 30000ms' },
headers: {} },
reformat: [Function],
[Symbol(SavedObjectsClientErrorCode)]: 'SavedObjectsClient/esUnavailable' }
log [15:39:04.739] [debug][server] stopping server
I can curl the Elasticsearch service fine on localhost, server1, and the given Amazon instance hostname; they all respond on port 9200. The node name of the instance is node1.
Kibana was started with kibana --host=0.0.0.0, as per the lab instructions.
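For reference, the reachability checks described above can be reproduced with something like the following (server1 is the lab's hostname; the loop tolerates hosts that do not resolve):

```shell
#!/bin/sh
# Check which hosts actually answer on port 9200 before pointing Kibana at one.
# "server1" is an assumed lab hostname; add the Amazon instance name as needed.
results=""
for host in localhost server1; do
  if curl -s -m 5 "http://$host:9200" >/dev/null 2>&1; then
    results="$results $host:ok"
  else
    results="$results $host:down"
  fi
done
echo "Elasticsearch reachability:$results"

# Once you know which host answers, start Kibana bound to all interfaces and
# pointed explicitly at that instance (-e sets the Elasticsearch URI):
# ./bin/kibana --host=0.0.0.0 -e http://localhost:9200
```

If localhost answers but server1 does not (or vice versa), that mismatch is exactly the kind of thing that produces the "Request Timeout after 30000ms" error shown above.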
Hi Marcus,
thanks for reaching out. Can you move this discussion to another thread? It makes it easier for other users to find solutions to their problems later.
I think the issue might be that Kibana is trying to reach Elasticsearch on a different host than the one it actually runs on ("server1" vs. "localhost"). Which curl command did you run, and from which server, to check Elasticsearch?
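One way to rule out such a mismatch is to point Kibana explicitly at the instance that answered your curl check, for example in config/kibana.yml (a minimal sketch; localhost is an assumption, use server1 if that is where Elasticsearch runs):

```yaml
# config/kibana.yml -- point Kibana at the Elasticsearch node that
# answered your curl check.
server.host: "0.0.0.0"
elasticsearch.hosts: ["http://localhost:9200"]
```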