Obviously, I'm running into the issue where the build steps in build_image no longer exist in test_one, so I get the following.
$ docker run -v Repository:/opt/src image -m unittest source/tests/test_one.py
Unable to find image 'image:latest' locally
Pulling repository Docker
docker: Error: image library/image not found.
I've read in a few places that artifacts could perhaps be used to store the results of the build between stages; however, that just doesn't seem to work.
Unfortunately, I'm still unable to find or use the built Docker image.
Question
Has anyone been able to do this? I'd like to split the tests into separate jobs (to make them concurrent), but rebuilding the image every time in before_script: is absolutely brutal.
Since version 8, GitLab CI is fully integrated into GitLab itself (no separate website anymore). What I did was create a Docker image and use it in GitLab CI. Your Dockerfile can be based on jessie/ubuntu or another image, meaning a lot is already done for you.
For example, a Dockerfile could be (based on the official PHP docker image):
# Based on PHP image
FROM php:5.6-fpm
MAINTAINER Melroy van den Berg "melroy@melroy.org"
ENV PATH "/sbin:/bin:/usr/sbin:/usr/bin:/usr/local/sbin:/usr/local/bin:$PATH"
# Copy some stuff
COPY . /var/www/html/
# Install some stuff (refresh the package lists first, install non-interactively)
RUN apt-get update && apt-get install -y npm
To build it, execute:
sudo docker build .
Where . is the location of the Dockerfile; in this case, the same folder.
I advise you to use a name and tag, though. The syntax is name:tag. The tag is optional, but useful for versioning.
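For example, to build with a name and tag in one step (the name myimage and tag 1.0 here are just placeholders):

```shell
# Build the Dockerfile in the current directory and tag the result
# as myimage:1.0 so it can be referenced later by name.
sudo docker build -t myimage:1.0 .
```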
You can use this image when for example registering your runner.
sudo gitlab-runner register
With the following info:
Host: http://yourdomain.com/ci
Token: xxxx (see admin/runners)
Description: Some text
Tags: <optional>
Executor: docker
Docker Image: latest:1.0
Notice: the Docker image name 'latest:1.0'. Docker will get the image called 'latest' with the tagged version 1.0 locally.
HINT: Like you said, you want to split into separate jobs. Within your .gitlab-ci.yml, you can define as many jobs as you want. Optionally, you can use runner tags to execute a specific job on a specific runner.
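As a sketch (the job names, test paths, and image name below are made up), a .gitlab-ci.yml with several jobs in the same stage — jobs in one stage run concurrently when enough runners are available — could look like:

```yaml
# Use the image you built and registered the runner with
image: myimage:1.0

stages:
  - test

# Both jobs belong to the "test" stage, so they can run in parallel
test_one:
  stage: test
  script:
    - python -m unittest source/tests/test_one.py

test_two:
  stage: test
  script:
    - python -m unittest source/tests/test_two.py
```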
Hi @intelliphant, I suspect you're not able to run your built image in the test stage because your artifacts bundle from build doesn't actually contain the image.
I'm trying to do the exact same thing you've described, but with gitlab.com CI. I'm not sure if you're hosting GitLab yourself. Browsing the artifacts on the GitLab UI from my build stage reveals it's empty, and downloading the zip turns out to be a mere 22 bytes in size. So, no image there. I think it's because my base Docker container running on a shared runner is not run as privileged, so I cannot access /var/lib/docker. But that's just a wild guess.
Successfully built 20a8c2f94639
Uploading artifacts...
WARNING: /var/lib/docker: no matching files
I guess you could try changing the Docker image directory as described here, but I haven't tried this yet.
@danger89 This may be a solution, but I don't think it answers @intelliphant's question. For me, this could mean building my image and pushing to a registry before every push to GitLab, which could get cumbersome and time-consuming during development. Maybe I have it wrong though.
On further investigation, I believe I'm not able to access /var/lib/docker not because of a non-privileged environment, but because that directory is hidden in the Docker VM. On OSX, it is hidden in the xhyve VM.
If youâre trying to access the volume data, I think the usual way is to launch another container: docker run -v test:/test -it ubuntu:16.04 bash will get a shell with the volume data visible in /test.
Not sure how to make that work for this use case so far…
@gustavjf It's a great solution, but it's costly: network and disk I/O activity increase dramatically. It would be good to have some sort of command to execute stages on the same agent.
So I guess creating an artifact is the 'right' way to do this?
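If artifacts are the route, one workaround I've seen is docker save / docker load: export the built image to a tarball inside the project directory (artifacts can only pick up files under it, which is why /var/lib/docker never worked), then load it back in later stages. A sketch, with made-up stage, job, and image names, assuming the runner can reach a Docker daemon:

```yaml
build_image:
  stage: build
  script:
    - docker build -t myimage:ci .
    # Artifacts can only include files inside the project directory,
    # so export the image to a tarball there instead of /var/lib/docker
    - docker save myimage:ci > myimage.tar
  artifacts:
    paths:
      - myimage.tar

test_one:
  stage: test
  script:
    # Restore the image from the artifact, then run the tests in it
    - docker load < myimage.tar
    - docker run myimage:ci python -m unittest source/tests/test_one.py
```

The tarball can be large, so this trades the rebuild time for artifact upload/download time.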
I was thinking the other possibility could be to use the GitLab container registry and push the built container during build, then pull that down in the following stages. Youâd probably need to use commit refs for the tag, otherwise you could run into issues if you have multiple pipelines running in parallel.
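A sketch of that registry approach (job names are made up; the variables used are GitLab's predefined CI variables in current versions, and the commit SHA keeps parallel pipelines from clobbering each other's tags):

```yaml
build_image:
  stage: build
  script:
    - docker login -u gitlab-ci-token -p "$CI_JOB_TOKEN" "$CI_REGISTRY"
    # Tag with the commit SHA so each pipeline gets a unique image
    - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHA" .
    - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHA"

test_one:
  stage: test
  script:
    - docker login -u gitlab-ci-token -p "$CI_JOB_TOKEN" "$CI_REGISTRY"
    # Pull the exact image this pipeline built, then run the tests
    - docker pull "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHA"
    - docker run "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHA" python -m unittest source/tests/test_one.py
```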