We’re currently running a CI/CD pipeline that looks as follows:
Git-Repo -> Jenkins -> “docker build” -> “docker push” to local registry -> “docker run” integration tests -> API call to the Rancher server that deploys the image onto the platform
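For context, the Jenkins job boils down to roughly the following. This is just a sketch: the registry host, image name, test command, and Rancher endpoint are all placeholders, and with the default `DRY_RUN=1` it only records the commands instead of executing them (set `DRY_RUN=0` on the build agent to run them for real).

```shell
#!/bin/sh
# Sketch of the pipeline stages described above.
# REGISTRY, IMAGE, the test script, and the Rancher URL are hypothetical.
set -e
REGISTRY="registry.example.local:5000"   # placeholder local registry
IMAGE="my-service"                       # placeholder image name
TAG="${GIT_COMMIT:-latest}"

run() {
  # Record each command; only execute it when DRY_RUN=0.
  CMDS="$CMDS $*;"
  echo "+ $*"
  if [ "${DRY_RUN:-1}" = "0" ]; then "$@"; fi
}

run docker build -t "$REGISTRY/$IMAGE:$TAG" .
run docker push "$REGISTRY/$IMAGE:$TAG"
run docker run --rm "$REGISTRY/$IMAGE:$TAG" ./run-integration-tests.sh
# Deploy by calling the Rancher server's API (endpoint is illustrative).
run curl -s -X POST \
  -u "$RANCHER_ACCESS_KEY:$RANCHER_SECRET_KEY" \
  "https://rancher.example.local/v1/services/<service-id>/?action=upgrade"
```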
Each image build is triggered by a webhook from the Git repo, so we’re building roughly 10–20 images per day.
These images get pushed to a private/local registry (v2). As you can imagine, this setup eats up disk space pretty quickly.
Now my question: what’s currently the best way to solve this problem? Ideally it would cover removing old images from the registry as well as from the Rancher server and the Rancher hosts.
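To make it concrete, this is roughly the kind of cleanup I’d like to automate. Again only a sketch: the registry host, repo, and tag are placeholders, `<digest>` stands for the value of the `Docker-Content-Digest` header from the first request, it assumes the registry was started with `REGISTRY_STORAGE_DELETE_ENABLED=true`, and the default `DRY_RUN=1` only records the commands.

```shell
#!/bin/sh
# Sketch of the cleanup I'm after; REGISTRY/REPO/TAG are placeholders.
set -e
REGISTRY="registry.example.local:5000"
REPO="my-service"
TAG="some-old-tag"

run() {
  # Record each command; only execute it when DRY_RUN=0.
  CMDS="$CMDS $*;"
  echo "+ $*"
  if [ "${DRY_RUN:-1}" = "0" ]; then "$@"; fi
}

# 1) Resolve the manifest digest for the tag, then DELETE it via the v2 API.
run curl -sI -H "Accept: application/vnd.docker.distribution.manifest.v2+json" \
  "http://$REGISTRY/v2/$REPO/manifests/$TAG"   # read Docker-Content-Digest
run curl -s -X DELETE "http://$REGISTRY/v2/$REPO/manifests/sha256:<digest>"

# 2) Reclaim the deleted blobs on the registry host
#    (registry should be idle or read-only while this runs).
run docker exec registry bin/registry garbage-collect \
  /etc/docker/registry/config.yml

# 3) On each Rancher host, drop unused images older than a week.
run docker image prune --all --force --filter "until=168h"
```

Is something along these lines reasonable, or is there a better established way?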
Thanks in advance for any ideas.