I have a curious question: when I run a pipeline that pulls a Docker image, is the work done in that image cached? I know the Docker image itself is cached. I am saving an artifact but have no cache: declared. Is everything fresh when I run the pipeline again?
If you have a pipeline job running on Docker and the previous pipeline stage produced an artifact, then the artifact will be available in your Docker stage.
If you are explicitly caching dependencies in your .gitlab-ci.yml file, the cached paths will also be available to your Docker job.
Otherwise, any work done in a Docker container in a previous stage of the pipeline will be lost when that stage completes.
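To make the distinction concrete, here is a minimal sketch of a .gitlab-ci.yml that declares both an artifact and a cache (the job names, image, and paths are illustrative, not taken from the question above):

```yaml
stages:
  - build
  - test

build:
  stage: build
  image: node:20          # illustrative image
  script:
    - npm ci
    - npm run build
  artifacts:
    paths:
      - dist/             # passed automatically to later stages of this pipeline
  cache:
    key: "$CI_COMMIT_REF_SLUG"
    paths:
      - node_modules/     # restored on the *next* pipeline run

test:
  stage: test
  image: node:20
  script:
    - ls dist/            # the artifact from the build stage is available here
```

Without the cache: block, node_modules/ would be rebuilt from scratch on every pipeline run; the artifact only flows forward within a single pipeline, as described above.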
Still in the context of the Docker image cache, I’ve been trying that strategy inside a pipeline, but I can’t seem to make any gain.
I can see the image being pulled, built, and then pushed to the GitLab CI Docker registry, but the build time stays the same (about 27 seconds) no matter how many times I rerun the pipeline, even though the tag hasn’t changed (I can confirm this from the job console).
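For context, the usual layer-caching strategy in a Docker-in-Docker job looks roughly like this (a sketch with illustrative names; the exact snippet being tried isn’t shown in the post):

```yaml
build-image:
  image: docker:24
  services:
    - docker:24-dind
  script:
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    # Pull the previously pushed image so its layers exist locally (ignore a first-run miss)
    - docker pull "$CI_REGISTRY_IMAGE:latest" || true
    # Reuse those layers for any Dockerfile steps whose inputs are unchanged
    - docker build --cache-from "$CI_REGISTRY_IMAGE:latest" -t "$CI_REGISTRY_IMAGE:latest" .
    - docker push "$CI_REGISTRY_IMAGE:latest"
```

One common pitfall with this pattern: when BuildKit is the builder, images do not carry cache metadata by default, so --cache-from finds nothing to reuse. In that case the image needs to be built with inline cache metadata (e.g. passing --build-arg BUILDKIT_INLINE_CACHE=1) before --cache-from can produce a speedup on subsequent runs.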