Thanks for your feedback. I think I need to provide more details. As mentioned, I have a MySQL database running in Kubernetes on top of RancherOS.
I do have kubectl on my local laptop, and I executed
kubectl port-forward percona-percona-598df6598c-db9nx 3306:3306 -n percona
Forwarding from 127.0.0.1:3306 -> 3306
Forwarding from [::1]:3306 -> 3306
in my local shell, not on the node the database is running on. The kubectl command blocks the console, so it looks like the port-forward is working as expected. At least I assume that is what this command does?
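In case it helps with debugging, this is the sanity check I could run from a second local terminal while the port-forward is still active (user name and password below are placeholders for my real credentials):

```shell
# In a second terminal on the laptop, while kubectl port-forward is running.
# First check that something is listening on the forwarded port:
nc -vz 127.0.0.1 3306

# Then try an actual MySQL handshake through the forward.
# --protocol=TCP forces a TCP connection instead of the local socket:
mysql -h 127.0.0.1 -P 3306 --protocol=TCP -u myuser -p -e "SELECT 1;"
```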
As I do not want to make the database publicly available, I mapped the port to localhost on the node. What I actually want to do now is connect MySQL Workbench via SSH to the node and access the database on 127.0.0.1:3306 through the SSH tunnel. I can see in MySQL Workbench that the SSH connection is working, but it cannot connect to the database.

For debugging purposes I checked that my user, password, and permissions for the database are OK: I opened a shell into the database pod via the Rancher UI and, as expected, I was able to access the database with that user and password.

My next try was to connect to the node directly via SSH (rather than via MySQL Workbench) and run a mysql client on the node against port 3306, to make sure that things work as expected on the server side. Unfortunately, even when connected to the node like this, I am not able to connect to the MySQL database. Hence, it seems like the kubectl port-forward does not work as expected. Do you have any idea how I could debug this further and where the source of the problem could be?
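For reference, these are the checks I would run on the node over SSH (assuming ss and a mysql client are installed there; the pod name is the one from my cluster, and myuser is a placeholder):

```shell
# On the node, via SSH: is anything actually listening on port 3306?
# (Does the forward started on my laptop even bind here on the node,
# or only on the laptop itself?)
ss -tlnp | grep 3306

# If nothing listens, try starting the forward on the node itself,
# in the background:
kubectl port-forward percona-percona-598df6598c-db9nx 3306:3306 -n percona &

# ...and then test against it from the same node:
mysql -h 127.0.0.1 -P 3306 --protocol=TCP -u myuser -p -e "SELECT 1;"
```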
Thanks for your help!