Docker build --> no space left on device

Hi,

I have a multi-stage pipeline that first builds two Docker images in sequence and then generates a website.
The problem is that the second Docker image needs to download 4GB of files, and before the download finishes the build fails with “no space left on device”.
The pipeline is here: https://gitlab.com/nexs-metabolomics/INM/inm-booklet/-/jobs/2593435061
though it seems to require a login even though I have public pipelines enabled.

The thing is, if I run df -h inside the container before copying my 4GB of data, it reports 6.4G free. So I don’t see why it would fail.

The relevant stage from the GitLab CI config:

docker-files-build-master:
  image: docker:latest
  stage: dock-files
  services:
    - docker:dind
  before_script:
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" $CI_REGISTRY
  script:
    - df -h
    - docker build --pull -t "$CI_REGISTRY_IMAGE/files" --file "CI/Dockerfile-files" --build-arg erda_key=${erda_key} .
    - docker push "$CI_REGISTRY_IMAGE/files:latest"
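
One caveat worth flagging here: the df -h in script: runs inside the docker:latest job container, while docker build writes its layers into the docker:dind service’s storage, so the two views of free space can diverge. A sketch of extra script: lines to inspect and reclaim the daemon-side space (all stock Docker CLI commands):

  script:
    - df -h
    - docker info --format '{{ .DockerRootDir }}'   # where dind actually stores image layers
    - docker system df                              # daemon-side usage, images, containers, volumes
    - docker system prune -af                       # reclaim anything unused before the big build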

The Dockerfile:

FROM registry.gitlab.com/nexs-metabolomics/inm/inm-booklet/base:latest

ARG erda_key   # value supplied via --build-arg

##### Copy mzML files #########################################################
RUN df -h
COPY ./scripts/copy_files.R /scripts/copy_files.R
RUN R -e "source('/scripts/copy_files.R')"
###############################################################################
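
Also note that everything a RUN step writes, temporary files included, counts against the free space while the layer is being built. If copy_files.R stages the downloads somewhere before moving them into place, peak usage can sit well above the final 4GB. A sketch, assuming a hypothetical /tmp staging directory (substitute whatever the script really uses); the cleanup only reclaims space because it runs inside the same RUN:

# Download and tidy up in a single layer, so staged archives are never
# baked into the image (/tmp is an assumption, not taken from the script)
RUN R -e "source('/scripts/copy_files.R')" \
    && rm -rf /tmp/*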

Any ideas?

I tried removing a large Linux package that isn’t needed, but that didn’t seem to free up any space.
I badly need to get this working.
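
For reference: deleting files in a later layer never shrinks an image, because the bytes still live in the earlier layer and the deletion only adds a whiteout entry on top. Space is saved only when the files are created and removed within the same layer, or removed from the base image itself. A sketch with a hypothetical package name and helper script:

# Frees nothing: the package's files still exist in the base image's layers
RUN apt-get purge -y some-big-package

# Frees space: install, use and purge within a single layer
# (some-big-package and use_the_package.sh are placeholders)
RUN apt-get update \
    && apt-get install -y some-big-package \
    && /scripts/use_the_package.sh \
    && apt-get purge -y some-big-package \
    && rm -rf /var/lib/apt/lists/*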

Is this a shared runner or your own private runner? If shared, take a look at this: What are Resource limits for CI jobs on GitLab.com? RAM? CPUs

Whilst that is an old post, it does mention the instance has 25GB (that said, only 16GB was available). Someone recently opened an issue here: Shared runners running out of disk space (#1146) · Issues · GitLab.com / GitLab Infrastructure Team / production · GitLab, which also mentions no-space-left failures.

As per the forum post, you may wish to disable shared runners and run your own private runner. That way you have more control over it and can increase the available disk space by resizing the disk of the GitLab Runner VM.

If you are already running your own runner, then chances are you need to resize the disk to give your job enough room to finish.
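
For what it’s worth, registering a private runner is quick once the gitlab-runner package is installed. A sketch using the classic registration-token flow (newer GitLab releases have moved to authentication tokens, so check gitlab-runner register --help); --docker-privileged is needed for docker:dind builds:

gitlab-runner register \
  --non-interactive \
  --url "https://gitlab.com/" \
  --registration-token "YOUR_TOKEN" \
  --executor docker \
  --docker-image docker:latest \
  --docker-privileged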

This is on a shared runner. I saw that post and others; it’s more that I can’t get the math to add up.
The runner reports a size of 20.5GB (on /) with 13.3G free before the build.
The Docker image I start from is ~5GB, yet inside the container it says 6.4G is free out of 21G total. Even with 6.4G, I don’t understand why downloading 4GB should run me out of space.

Sounds to me like the filesystem underneath is using more than you think. You mentioned 13.3GB free before the build; then, after the 5GB Docker image is pulled, you say 6.4GB is left. That means your 5GB image has consumed 6.9GB of space on the runner (13.3 − 6.4) for whatever reason.

I notice similar things copying files from one Linux system to another: despite both being ext4, the space used seems to vary.
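
If you want to pin down where the space actually goes, the stock Docker CLI can break usage down per image and per layer:

docker system df -v    # size of every image, container and volume known to the daemon
docker history registry.gitlab.com/nexs-metabolomics/inm/inm-booklet/base:latest    # bytes added by each layer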

Either way, it looks like you will need to disable shared runners and run your own, which can have far more space. I run my own runner; these are my specs:

root@gitlab-runner:~# cat /proc/cpuinfo  | grep -ic proc
2

root@gitlab-runner:~# free
               total        used        free      shared  buff/cache   available
Mem:         4025628      211316     3333128         548      481184     3592980
Swap:        8388604           0     8388604

root@gitlab-runner:~# df -h
Filesystem      Size  Used Avail Use% Mounted on
/dev/vda1        40G   13G   25G  35% /

2 CPUs, 4GB RAM, 40GB disk. The added bonus: I can increase the resources if I need to, and I don’t have to worry about CPU/RAM/disk limits on shared runners.

Thanks. Yes, I wanted to avoid having to set up my own runner, but now I have and it was much easier than I imagined. All seems good now.
