I have an Omnibus installation of GitLab on my local server, and on another server a gitlab-runner that performs the CI tasks.
In my pipeline I have a job that checks code quality using Docker-in-Docker with Code Climate.
The problem is that it is very slow (about 20 minutes) because every run downloads and extracts the Code Climate Docker images from scratch.
I tried to configure the GitLab CI cache, but nothing changed.
That sounds more like a Docker issue than a runner configuration issue. Can you check your script and make sure you’re not pruning images after your check? “docker pull” should check locally first (unless the Code Climate image really is updated that often).
I’d try this by hand first, as the runner user, to make sure Docker works.
Yes, you are right, but the docker pull is performed inside a Docker container. If the runner doesn’t cache the downloaded images, then as soon as the job finishes the container is deleted, losing all the downloaded Docker images, right?
If I execute the job directly on my operating system (and not using Docker-in-Docker) there is no problem.
These are the commands that I execute in the gitlab-ci.yml:
So every time the runner starts, it performs docker pull codeclimate/codeclimate, and since nothing has been kept from the last job execution, it has to download all the layers again.
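A minimal sketch of what such a Docker-in-Docker job could look like in .gitlab-ci.yml — the stage name, image tags, and exact script lines here are illustrative assumptions, not the actual configuration:

```yaml
# Sketch only: a Docker-in-Docker code-quality job.
# Stage name and image tags are assumptions.
code_quality:
  stage: test
  image: docker:latest
  services:
    - docker:dind
  script:
    # The dind service starts with an empty image store every job,
    # so this pull has to download all layers again each run.
    - docker pull codeclimate/codeclimate
    - docker run --rm
        -v /var/run/docker.sock:/var/run/docker.sock
        -v "$PWD":/code
        codeclimate/codeclimate analyze
```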
My current cache section is like this (I’m not sure if this is working or not):
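For reference, a typical cache section looks roughly like the sketch below (the key and paths are illustrative). Note that the GitLab CI cache can only persist paths inside the project's build directory, which is why it has no effect on the dind daemon's image store:

```yaml
# Sketch: GitLab CI cache only covers paths under the project
# workspace, so Docker image layers stored by the dind service
# are out of its reach.
cache:
  key: "$CI_COMMIT_REF_SLUG"
  paths:
    - .cache/
```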
Ok, I missed the fact that you’re running the job itself in a Docker instance. You would have to create a volume for the runner’s Docker so that it can cache your images; /var/lib/docker/overlay2 probably needs to live on the host rather than inside the container.
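One common way to do that, assuming a Docker executor, is to add a host volume for /var/lib/docker in the runner’s config.toml, so the dind service can reuse previously pulled layers across jobs. A sketch (paths and section layout are the usual defaults, not necessarily this setup):

```toml
# Sketch of /etc/gitlab-runner/config.toml for a Docker executor.
# Adding /var/lib/docker to volumes lets the docker:dind service
# keep its image store between jobs instead of starting empty.
[[runners]]
  executor = "docker"
  [runners.docker]
    privileged = true
    volumes = ["/cache", "/var/lib/docker"]
```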
BTW, does this actually work??
variables:
  DOCKER_DRIVER: overlay2
I’d bet this is a Docker daemon setting that can’t be manipulated this way, but I have not tested it.
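If the environment-variable route turns out not to work, the storage driver can also be passed explicitly on the dind service’s command line using the extended services syntax — a sketch, assuming a docker:dind service entry:

```yaml
# Sketch: passing the storage driver directly to the dind daemon
# instead of relying on the DOCKER_DRIVER variable being picked up.
services:
  - name: docker:dind
    command: ["--storage-driver=overlay2"]
```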