Hi @krezno. I'm not aware of any off-the-shelf way to write from Spark to Logstash. It might be worth trying es-hadoop to write directly to your ECK cluster (with es.nodes.wan.only set to true) to get a sense of how good or bad the performance really is, though.
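If it helps, here's a minimal sketch of the settings I mean. The service hostname is a hypothetical placeholder, not something from your cluster, and the actual Spark write is only shown in a comment since it needs a live cluster:

```python
# Sketch of es-hadoop connector settings for writing through a single
# proxied endpoint such as ECK's HTTP service.
# "quickstart-es-http.default.svc" is a placeholder hostname.
es_conf = {
    "es.nodes": "quickstart-es-http.default.svc",
    "es.port": "9200",
    # WAN-only mode: talk only to the address(es) in es.nodes and skip
    # node discovery, since discovered pod IPs won't be reachable
    # from where Spark runs.
    "es.nodes.wan.only": "true",
}

# In a Spark job these options would be passed to the connector, e.g.:
# df.write.format("org.elasticsearch.spark.sql").options(**es_conf).save("my-index")
```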
I'm not really familiar with ECK and it's been a while since I used Kubernetes, but I assume the problem is that Elasticsearch is exposed as a service at a single URL, so node discovery does you no good since none of the discovered nodes are reachable from outside the cluster. Is that right, or are you running into other problems?
If you do try es-hadoop, keep in mind that it will see your whole Elasticsearch cluster as a single node, so your Hadoop or Spark jobs will fail if that one node gets "blacklisted" due to write failures. We see this problem with customers using load balancers as well. One workaround is to list the same node several times in es.nodes, as I described here.
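To make that workaround concrete, here's a rough sketch, assuming es-hadoop's comma-separated es.nodes list; the hostname is again a placeholder:

```python
# Hedged sketch: es-hadoop tracks failures per entry in es.nodes, so
# repeating the single load-balanced endpoint gives the connector several
# entries to fall back on before the whole job is failed.
# The hostname below is a hypothetical placeholder.
endpoint = "quickstart-es-http.default.svc:9200"
es_conf = {
    # Same endpoint listed three times, comma-separated.
    "es.nodes": ",".join([endpoint] * 3),
    "es.nodes.wan.only": "true",
}
```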