How to run pytest with fixtures that spin up docker containers?

I am a bit at a loss as to how to best use GitLab to run automated tests that use docker. I’ve read a lot of resources about running a docker image and the importance of docker:dind, but I still can’t figure out the best way to set up my pipeline.

I have a Python/Django project which runs some automated tests using pytest (both for Django, and simply Python scripts). Most tests execute fine with a python image. Here is my .gitlab-ci.yml

    stages:
      - unit-test

    unit-test:
      stage: unit-test
      image: python:3.7
      script:
        - pip install --no-cache-dir -r requirements.txt
        - pytest -sv utilities/emulated/tests/

However, one of the tests uses a fixture which uses the Python docker package in order to spin up test services as separate containers.

    import docker
    import pytest

    @pytest.fixture
    def run_emulated_container():
        """Build and run one or more standalone test containers.
        Clean all of them up after the fixture is used."""
        container_list = []
        client = docker.from_env()
        client.images.build(path="emulated", dockerfile="dockerfile_standalone",
                            tag=EMULATED_TAG, rm=True, nocache=True)

        def _run_emulated_container(bind_port):
            container_ports = {
                '{!s}/tcp'.format(INNER_PORT): bind_port
            }
            emulated_container = client.containers.run(EMULATED_TAG, name="emulated_test",
                                                       detach=True, ports=container_ports)
            container_list.append(emulated_container)
            return emulated_container

        yield _run_emulated_container

        for container in container_list:
            container.remove(force=True)
The container does nothing special. It just binds to a port and listens for connections. The test procedure spins up one or more such containers that bind to different ports on the host machine and then tests these emulated services.
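For illustration, the emulated service can be as simple as a socket that binds a port and echoes back what it receives; this is a minimal stdlib sketch of that behaviour (function names are mine, not from the project):

    import socket
    import threading

    def start_emulated_service(port=0):
        """Bind and listen like the emulated container does. With port=0 the
        OS picks a free port; ask the socket via getsockname()."""
        srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("127.0.0.1", port))
        srv.listen(1)
        return srv

    def echo_one_client(srv):
        """Accept a single connection and echo its bytes back."""
        conn, _addr = srv.accept()
        with conn:
            conn.sendall(conn.recv(1024))

A test would start one of these (or, in CI, the real container), connect to the bound port, and assert on the response.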

Locally, simply running the commands I posted in the .gitlab-ci.yml script executes all tests successfully. However, this specific test fails when it tries to initialize the fixture at `client = docker.from_env()` (I guess it cannot connect to the docker daemon):

    E   docker.errors.DockerException: Error while fetching server API version:
        ('Connection aborted.', FileNotFoundError(2, 'No such file or directory'))

    /usr/local/lib/python3.7/site-packages/docker/api/ DockerException

I tried using docker:dind as a service in the .gitlab-ci.yml but it didn’t help. Any idea how I can configure the containers to detect and connect to the docker daemon? Thank you for your time!

P.S. I’m using the free public GitLab runners.

For anyone interested: I managed to fix this, and all tests now pass.

Essentially, I created a custom testing image that has the Python version I require plus the docker tools. Then I set environment variables to give this container the proper docker daemon host (named docker_dind here) and the host of the spun-up utility containers (which is also docker_dind, because that’s where they bind their ports).
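To keep the same tests runnable both locally and on the runner, the host resolution can be wrapped in small helpers. This is a sketch of the pattern, not code from the project; only the DOCKER_HOST and CONTAINERIZED_HOST variable names come from the pipeline, the helper names are mine:

    import os

    def docker_client_host():
        """Where docker.from_env() will connect: docker-py honours
        DOCKER_HOST and falls back to the local unix socket when unset."""
        return os.environ.get("DOCKER_HOST", "unix:///var/run/docker.sock")

    def service_host():
        """Where the spun-up test containers are reachable: on the runner
        this is the dind service; locally it is localhost."""
        return os.environ.get("CONTAINERIZED_HOST", "localhost")

Tests then connect to `service_host()` plus the bind port instead of hard-coding localhost.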

The custom testing image

    FROM python:3.7

    RUN apt-get update \
        && apt-get install -y --no-install-recommends \
        curl wget \
        ca-certificates \
        apt-transport-https \
        gnupg-agent \
        lsb-release

    # Install docker
    RUN curl -fsSL https://download.docker.com/linux/debian/gpg | \
        gpg --dearmor -o /usr/share/keyrings/docker-archive-keyring.gpg
    RUN echo "deb [arch=amd64 signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] \
        https://download.docker.com/linux/debian $(lsb_release -cs) stable" | \
        tee /etc/apt/sources.list.d/docker.list > /dev/null
    RUN apt-get update \
        && apt-get install -y --no-install-recommends \
        docker-ce docker-ce-cli

    # Install python dependencies (may be unneeded, since requirements.txt is also installed)
    RUN pip install --no-cache-dir \
        six \
        docker

    CMD ["/bin/bash"]

The updated .gitlab-ci.yml (the custom image is currently built and uploaded manually, but this could also be added to the CI pipeline):

    stages:
      - unit-test

    unit-test:
      stage: unit-test
      image: <custom-test-image>  # the image built from the Dockerfile above
      services:
        - name: docker:20.10-dind
          alias: docker_dind
      variables:
        DOCKER_HOST: tcp://docker_dind:2375
        CONTAINERIZED_HOST: docker_dind
      script:
        - pip install --no-cache-dir -r requirements.txt
        - pytest -sv utilities/emulated/tests/
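One gotcha with detached containers behind dind: the test can try to connect before the service inside the container is actually listening. A small readiness poll avoids flaky failures; this helper is my own sketch, not part of the original fixture:

    import socket
    import time

    def wait_for_port(host, port, timeout=30.0):
        """Poll until a TCP port accepts connections; return True when it
        does, or False if the timeout elapses first."""
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            try:
                with socket.create_connection((host, port), timeout=1.0):
                    return True
            except OSError:
                time.sleep(0.5)
        return False

Calling `wait_for_port(service_host, bind_port)` right after the fixture starts a container makes the subsequent test steps deterministic.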