Sharing data/storage between gitlab-runner and its host

Hello.

I am trying to run a CI/CD that is going to use a big chunk of data for its tests.

The data in question is about 170 GB in size and is already on the machine hosting gitlab-runner (using the Docker executor).

I’d like to know how I can mount the folder containing this data inside a job of my pipeline.

I know you can use docker volumes, but is it configurable from within a gitlab-ci pipeline?

Or maybe I could mount a filesystem like sshfs from within gitlab-runner?

Or should I rather use gitlab-runner with the shell executor?

Any help or ideas would be appreciated, thank you.


I know this is a very old topic, so this is for people who arrive here like me 🙂

First, see: Advanced configuration | GitLab

  1. You need to define a bind-mounted volume in the config.toml of your gitlab-runner:
[runners.docker]
  volumes = ["/path/to/bind/from/host:/path/to/bind/in/container:rw"]

(or ro for read-only mode)
This volume will be available to the container that executes the content of gitlab-ci.yml.
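For context, here is a minimal sketch of where that volumes entry sits in a full config.toml. The runner name, URL, token and image below are placeholders, not values from this thread:

```toml
# Sketch of /etc/gitlab-runner/config.toml for a Docker-executor runner.
# name, url, token and image are hypothetical placeholders.
[[runners]]
  name = "my-docker-runner"
  url = "https://gitlab.example.com/"
  token = "REDACTED"
  executor = "docker"
  [runners.docker]
    image = "alpine:latest"
    # format: host path : container path : mode (rw or ro)
    volumes = ["/path/to/bind/from/host:/path/to/bind/in/container:rw"]
```

Note that gitlab-runner watches config.toml and reloads it on change, so a restart should not be needed after editing it.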

  2. If you choose to run pure docker-in-docker, you have to make sure this volume is also passed to the container you launch from the gitlab-ci.yml script:
variables:
    ASSETS_DIR: /a/path/to/files

test:
  stage: test
  script:
    - # blah blah blah 
    - docker run -v "$ASSETS_DIR":"$ASSETS_DIR" "$THE_IMAGE" /bin/bash a_script.sh

Regards.