Running Spark on k8s and scheduling

Hello,

I have a k8s environment in Rancher with 2 hosts for H.A. I’ve launched a Spark cluster from the catalog.

The cluster automatically creates a load balancer whose external endpoints are the IPs of the 2 hosts. However, the load balancer service itself only runs on one host.

How can I configure the load balancer to be scheduled globally, with certain host labels, so that it runs on at least 2 hosts? I’m unsure whether this is something to configure in Rancher or in the k8s templates.

Any tips appreciated. Thanks!

After the service has been launched, if you click Edit on the Load Balancer, you should be able to change the scale to however many hosts you want it running on. But if something were to happen to the LB, it would default back to 1.

If you are interested in being able to handle scale for LBs, you can look into K8s ingress and its scale support, which we added in our v1.1.0-dev4 release.

http://docs.rancher.com/rancher/latest/en/kubernetes/ingress/
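In case it helps, here is a minimal sketch of what such an Ingress resource could look like. It assumes a hypothetical `spark-master` service exposing port 8080 and a placeholder hostname, and uses the `extensions/v1beta1` API version from that release line; the Rancher-specific scale and scheduling options for the backing LB are the ones described in the docs linked above, so check there for the exact annotation keys.

```yaml
# Minimal sketch of an Ingress that Rancher's controller would back with an LB.
# The hostname, service name (spark-master) and port (8080) are placeholders.
apiVersion: extensions/v1beta1
kind: Ingress
metadata:
  name: spark-ingress
  # Rancher-specific options (e.g. scale / scheduling of the backing LB) would
  # go here as annotations -- see the docs linked above for the exact keys.
  annotations: {}
spec:
  rules:
  - host: spark.example.com
    http:
      paths:
      - path: /
        backend:
          serviceName: spark-master
          servicePort: 8080
```

Once created with `kubectl create -f`, the ingress controller should bring up the corresponding load balancer, and its scale can then be managed as described in those docs.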