Starting Fscrawler with SSL error

I wonder if this is now a permissions issue.

btw, you don't need to be the root user to run the curl command. Elasticsearch usually runs as a non-root user; maybe after your tidy-up some file or directory has ended up owned by root.

See:

$ ls -ld /var/lib/elasticsearch/
drwxr-s--- 5 elasticsearch elasticsearch 4096 Mar  4 21:28 /var/lib/elasticsearch/

so this whole tree is owned by the "elasticsearch" user (this is Ubuntu)

$ sudo find /var/lib/elasticsearch/ /var/log/elasticsearch/ \! -user elasticsearch -ls
$

no output, so everything in those trees is owned by the elasticsearch user.

$ sudo find /var/lib/elasticsearch/ -user elasticsearch -type f | wc
   2878    2878  206668
$ sudo find /var/log/elasticsearch/ -user elasticsearch -type f | wc
     47      47    2160
$
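If those find commands do turn up anything owned by root (mine found nothing), a minimal fix sketch, assuming the standard Debian/Ubuntu package layout and the elasticsearch user/group, is to stop the service, hand the trees back, and re-check:

$ sudo systemctl stop elasticsearch
$ sudo chown -R elasticsearch:elasticsearch /var/lib/elasticsearch/ /var/log/elasticsearch/
$ sudo find /var/lib/elasticsearch/ /var/log/elasticsearch/ \! -user elasticsearch -ls
$ sudo systemctl start elasticsearch

(systemctl is an assumption on my side; adjust if you start elasticsearch some other way.)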

The error you have now has been seen before, e.g.

and there it's not clear what was wrong, but the summary was:

This means that your cluster is not in a state where it can read/write data. I cannot tell you what caused that, but it almost certainly has nothing to do with your password setup - the root of the problem is that security cannot do its job if it cannot read and write data from the cluster.

and the suggestion was to check the cluster.initial_master_nodes setting. As you want just a single-node cluster, then

cluster.initial_master_nodes: []

should be fine.
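For reference, a minimal sketch of the relevant bits of /etc/elasticsearch/elasticsearch.yml for a single node (the cluster/node names here are placeholders I made up):

cluster.initial_master_nodes: []   # as suggested above for a single-node cluster
cluster.name: my-cluster           # placeholder
node.name: node-1                  # placeholder
network.host: 127.0.0.1

After a restart you can check that the cluster actually comes up healthy (assuming security/TLS is enabled on port 9200, hence https and the elastic user):

$ curl -k -u elastic 'https://localhost:9200/_cluster/health?pretty'

and look for "status" : "green" (or "yellow" on a single node if some indices have replicas configured).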