Jenkins Kubernetes plugin with Rancher cluster

Hi,

I’m new to k8s. I built an on-prem cluster with Rancher 2.x (it makes standing up a k8s cluster really simple). This cluster will run the Jenkins slave agents for an existing Jenkins master.

When I set up a single node acting as both master and worker and use the Jenkins Kubernetes plugin with the jnlp Docker image, everything works fine.
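
For context, this is roughly how I verify the agent pod in the single-node setup (the `jenkins` namespace is just an example from my environment; `jnlp` is the plugin’s default container name):

```
# List the agent pods the Kubernetes plugin spins up, and see which node they land on
kubectl get pods -n jenkins -o wide

# Tail the JNLP container log of an agent pod to confirm it connects back to the Jenkins master
kubectl logs -f <agent-pod-name> -c jnlp -n jenkins
```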

However, when I test with multiple nodes (master on one machine, worker on another), the slave pod scheduled on the worker node can no longer connect to my Jenkins master (it gets a 404). I’m not sure whether I missed some configuration needed for the worker node to work correctly; I have no idea what’s happening here.
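
To see what the agent sees, I can run a throwaway pod pinned to the worker node and hit the Jenkins master URL directly (the node name, host and port below are placeholders for my environment):

```
kubectl run curl-test --rm -it --restart=Never \
  --image=curlimages/curl \
  --overrides='{"apiVersion":"v1","spec":{"nodeName":"<worker-node-name>"}}' \
  --command -- curl -sv http://<jenkins-master-host>:8080/login
```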

I did some more troubleshooting. From a pod on the master node, pinging my Jenkins master resolves to the correct IP, but from a pod on the worker node it resolves to a weird IP, something like 72.167.191.69. Pinging any host from that node, even google.com, always returns this same IP. It seems like there’s an outbound rule or something interfering.
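
This is roughly the kind of check I ran, using a throwaway pod forced onto the worker node (node name and hostname are placeholders; the kube-dns label may differ depending on the DNS provider):

```
# Resolve the Jenkins master hostname from a pod pinned to the worker node
kubectl run dns-test --rm -it --restart=Never \
  --image=busybox:1.36 \
  --overrides='{"apiVersion":"v1","spec":{"nodeName":"<worker-node-name>"}}' \
  -- nslookup <jenkins-master-host>

# Same pod, but dump the resolver config the pod actually gets
kubectl run dns-test --rm -it --restart=Never \
  --image=busybox:1.36 \
  --overrides='{"apiVersion":"v1","spec":{"nodeName":"<worker-node-name>"}}' \
  -- cat /etc/resolv.conf

# Check that the cluster DNS pods are healthy
kubectl -n kube-system get pods -l k8s-app=kube-dns
```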