New Loadbalancer stays in Initializing

I cloned my old LB agent because I added SSL termination to the load balancer. Now the load balancer does not turn green:

Any ideas?

BTW, the system is fully working and I have no issues in the log.

Can you use the API to find the state of the service? Degraded is a state that is UI driven, so I need more details on what’s going on in the API.
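A rough sketch of querying the service state through the Rancher v1 API follows; the server URL, API key pair, and service name are placeholders, not values from this thread, and the exact endpoint path may differ depending on your Rancher version:

```shell
# Placeholders: substitute your Rancher server URL, API key pair,
# and the load balancer service's name.
RANCHER_URL="https://rancher.example.com"
ACCESS_KEY="xxxx"
SECRET_KEY="yyyy"
SERVICE_NAME="mylb"

# Fetch the service and print its state and healthState fields.
curl -s -u "${ACCESS_KEY}:${SECRET_KEY}" \
  "${RANCHER_URL}/v1/services?name=${SERVICE_NAME}" |
python3 -c '
import json, sys
svc = json.load(sys.stdin)["data"][0]
print(svc["state"], svc.get("healthState"))'
```

The `healthState` field is what distinguishes an unhealthy service from one the UI merely renders as degraded.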

@denise @zauberertz It means that the initial health check failed for the LB service. In a multi-host setup, health checks for the LB instance are configured on the network agent(s) running on hosts other than the one where the LB instance runs. In most cases, the Initializing state indicates that the network agent on host A can't reach the LB instance running on host B. To check that, log in to the network agent(s) and try to establish a TCP connection to [LB instance ip address]:42 (42 is the port on which the LB listens for health-check requests).
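One minimal way to run that TCP check from inside a network agent is bash's `/dev/tcp` pseudo-device, which avoids depending on `nc` being installed; the IP below is a placeholder for your LB instance's address:

```shell
# LB_IP is a placeholder -- substitute the LB instance's IP address.
LB_IP="10.42.0.5"

# Opening /dev/tcp/<host>/<port> attempts a TCP connection; an exit
# status of 0 means the health-check port answered.
if timeout 3 bash -c "exec 3<>/dev/tcp/${LB_IP}/42" 2>/dev/null; then
  echo "port 42 reachable -- health checks can pass"
else
  echo "port 42 unreachable -- would explain the Initializing state"
fi
```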

In the next Rancher release, it will be possible to check which host reports the health check failure using the Rancher API:

@denise Meanwhile, the load balancer turned green and stayed in that state. The process took about 24 hours.

What if network agents cannot reach each other? Do we need to deactivate and activate again?

@Swaroop_Kundeti Network agents not reaching each other indicates a problem with IPsec connectivity between hosts. Make sure security groups/firewalls allow access on UDP ports 500/4500 on all your hosts.
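A quick local check on each host is whether anything is actually listening on those IPsec ports; this uses stock `ss` (flags: `-l` listening, `-u` UDP, `-n` numeric), not Rancher-specific tooling:

```shell
# UDP 500 carries IKE; UDP 4500 carries IPsec NAT traversal.
# If nothing is listening here, the tunnel can't come up regardless
# of what the firewall allows.
ss -lun | grep -E ':(500|4500)\s' || echo "no IPsec listeners found"
```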

@alena Thank you. It looks like the network agents are not connected, though all TCP and UDP ports (0-65535) are open within the subnet.

@Swaroop_Kundeti Are you able to exec into one network agent and ping the other network agent on the other host? If not, then cross-host communication isn't working.
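That check looks roughly like this from either host; the container name and peer IP below are placeholders you would get from `docker ps` and the other agent's managed-network (10.42.x.x) address:

```shell
# Placeholders: AGENT is your network agent container's name,
# PEER_IP is the 10.42.x.x address of the agent on the other host.
AGENT="Network-Agent"
PEER_IP="10.42.100.7"

# Three pings with a 2-second per-packet timeout; a nonzero exit
# status means cross-host communication is broken.
docker exec "$AGENT" ping -c 3 -W 2 "$PEER_IP" \
  && echo "cross-host ping OK" \
  || echo "cross-host ping FAILED -- check IPsec/firewall rules"
```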