RKE2 - Rancher cluster version mismatch after manual upgrade

Hi, I manually upgraded my nodes from v1.26.10-rke2r2 to v1.27.10-rke2r1. I then added a new node running v1.27.10-rke2r1, but under Cluster management → Config → RKE2 Options → Kubernetes version I only see the options shown in the attached picture. Everything works, but on the new node (named euler) only, a job named apply-rke2-worker-plan-on-euler-with-c70b04748a3dc799ade3 keeps reappearing in the cattle-system namespace with this in its logs:

[INFO]  Waiting for all master nodes to be upgraded to version
+ K8S_IMAGE_TAG=v1.26.10-rke2r2
+ '[' v1.27.10-rke2r1 '==' v1.26.10-rke2r2 ]
+ info 'Waiting for all master nodes to be upgraded to version '
+ echo '[INFO] ' 'Waiting for all master nodes to be upgraded to version '
+ sleep 5
+ continue
+ true
+ all_updated=true
+ kubectl get nodes '--selector=node-role.kubernetes.io/master' -o json
+ jq -r '.items[].status.nodeInfo.kubeletVersion'
+ sort -u
+ tr + -
+ MASTER_NODE_VERSION=v1.27.10-rke2r1
+ '[' -z v1.27.10-rke2r1 ]
+ bash /bin/semver-parse.sh v1.26.10-rke2r2 k8s
+ K8S_IMAGE_TAG=v1.26.10-rke2r2
+ '[' v1.27.10-rke2r1 '==' v1.26.10-rke2r2 ]
+ info 'Waiting for all master nodes to be upgraded to version '
[INFO]  Waiting for all master nodes to be upgraded to version
+ echo '[INFO] ' 'Waiting for all master nodes to be upgraded to version '
+ sleep 5
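The trace above can be condensed into a minimal sketch (my reading of the log, not the actual Rancher plan script): the job parses the target tag recorded in the worker plan, compares it to the kubelet version reported by the master nodes, and loops while they differ. Since the plan still carries v1.26.10-rke2r2 while the masters already run v1.27.10-rke2r1, the check can never pass:

```shell
# Simplified sketch of the wait loop seen in the job log (an assumption,
# not the real semver-parse.sh / plan script shipped by Rancher).
K8S_IMAGE_TAG="v1.26.10-rke2r2"        # target tag still baked into the plan
# In the real job this value comes from:
#   kubectl get nodes --selector=node-role.kubernetes.io/master -o json \
#     | jq -r '.items[].status.nodeInfo.kubeletVersion' | sort -u | tr + -
MASTER_NODE_VERSION="v1.27.10-rke2r1"  # what the masters actually report
if [ "$MASTER_NODE_VERSION" = "$K8S_IMAGE_TAG" ]; then
  all_updated=true
  echo "[INFO]  All master nodes upgraded to $K8S_IMAGE_TAG"
else
  # real script: sleep 5 and retry forever
  echo "[INFO]  Waiting for all master nodes to be upgraded to version $K8S_IMAGE_TAG"
fi
```

So the job is not broken as such; it is waiting on a target version that no node will ever report again.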

Everything works, but how do I set the Kubernetes version to the correct one when the installed version is not offered in the list?
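For context: if this is a v2-provisioned RKE2 cluster, my understanding (an assumption on my part, not verified against the docs) is that the target version lives in the cluster object's spec.kubernetesVersion, which might be editable directly when the UI dropdown does not list the installed version. Something like:

```yaml
# clusters.provisioning.cattle.io object (name and namespace are examples)
apiVersion: provisioning.cattle.io/v1
kind: Cluster
metadata:
  name: my-cluster          # hypothetical cluster name
  namespace: fleet-default
spec:
  kubernetesVersion: v1.27.10+rke2r1  # spec uses the '+' form; node labels
                                      # use '-' (note the 'tr + -' in the log)
```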