What I Want to Do
Using GitLab Runner via Docker-in-Docker image:
- Build an image from a Dockerfile in my repository
- Run numerous tests against that Docker image
My `.gitlab-ci.yml`:

```yaml
before_script:
  - cd /builds/user

stages:
  - build
  - test

build_image:
  stage: build
  script:
    - cd Repository
    - docker info
    - docker build -t image .

test_one:
  stage: test
  script:
    - docker run -v Repository:/opt/src image -m unittest source/tests/test_one.py

test_two:
  stage: test
  script:
    - docker run -v Repository:/opt/src image -m unittest source/tests/test_two.py
```
And the runner's `config.toml`:

```toml
concurrent = 2

[[runners]]
  name = "GitLab Runner"
  url = "https://gitlab.com/ci"
  token = "[REDACTED]"
  tls-ca-file = ""
  executor = "docker"
  [runners.docker]
    image = "gitlab/dind:latest"
    privileged = true
    volumes = ["/cache"]
```
Obviously, I'm running into the issue that the image built in `build_image` no longer exists by the time `test_one` runs, so I get the following:
```
$ docker run -v Repository:/opt/src image -m unittest source/tests/test_one.py
Unable to find image 'image:latest' locally
Pulling repository docker.io/library/image
docker: Error: image library/image not found.
```
I've read in a few places that artifacts can supposedly be used to pass the results of the build between stages; however, that just doesn't seem to work here.
I've tried the following in my `.gitlab-ci.yml`:

```yaml
build_image:
  stage: build
  script:
    - cd Repository
    - docker info
    - docker build -t image .
  artifacts:
    untracked: true
    paths:
      - /var/lib/docker
```
Unfortunately, I’m still unable to find/use the built Docker image.
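One variant I've been considering (untested; the tag and tarball names below are just placeholders) is exporting the built image to a tarball with `docker save`, passing that file as a regular artifact, and restoring it with `docker load` in each test job:

```yaml
build_image:
  stage: build
  script:
    - cd Repository
    - docker build -t image .
    - docker save -o image.tar image   # export the image to a plain file
  artifacts:
    paths:
      - Repository/image.tar

test_one:
  stage: test
  script:
    - docker load -i Repository/image.tar   # restore the image in this job
    - docker run -v Repository:/opt/src image -m unittest source/tests/test_one.py
```

This would avoid rebuilding in every test job, though saving and loading a large image between jobs presumably has its own cost.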
Has anyone been able to do this? I'd like to split the tests into separate jobs (so they run concurrently), but rebuilding the image every time in `before_script:` is absolutely brutal.