CI Docker-in-Docker (dind) not verifying certificates - confusion about where and how to install certificates; cannot locate any documentation

Hi,

I am running pytest in a .gitlab-ci.yml job. The CI job uses a docker:dind service to start docker-compose services. I have been using the CI build template provided for the Docker executor.

This has mostly got me up and running: I can start docker-compose from within the CI test job container. Pytest runs within the CI test job and successfully spawns the docker-compose services. This works from both a specific runner and a shared runner.

However, both the specific runner and the shared runner fail when I run a test, from the CI test job, that requires SSL certificate verification against an nginx container. The nginx container is started by a docker-compose process triggered from within the CI test job.

I have tried exporting the SSL_CERT_FILE environment variable to point at the self-signed certificate in the CI test job, in addition to running update-ca-certificates in before_script. I am using an Alpine Linux based image for the CI build. In both cases I receive SSL CERTIFICATE_VERIFY_FAILED errors.
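In outline, the relevant part of my .gitlab-ci.yml looks like this (the image tag, certificate path and file name are illustrative, only the shape matters):

```yaml
test:
  image: alpine:3.8          # illustrative Alpine-based test image
  services:
    - docker:dind
  variables:
    DOCKER_HOST: tcp://docker:2375
  before_script:
    # Copy the self-signed root CA into the trust store, rebuild the
    # bundle, and point Python's ssl module at the result.
    - cp certs/myrootca.crt /usr/local/share/ca-certificates/
    - update-ca-certificates
    - export SSL_CERT_FILE=/etc/ssl/certs/ca-certificates.crt
  script:
    - pytest
```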

Is there any documentation or other resource offering guidance on how to achieve this with a solution that works on both specific and shared runners? Where should the self-signed certificate be added so that it is trusted and can be verified: in docker:dind, in the CI test job, or in the image used by the gitlab-runner?

Can anyone offer any resources or guidance to clear up the confusion about how to achieve this on GitLab?

Kind regards

dcs3spp

Updated 20/12/2018

I have included an image of the architecture reflecting my understanding of what is occurring.

dind-certs-arch

  • The pytest image represents the gitlab CI job.
  • The tusd, minio and listener images are the docker-compose services. These are started by the pytest image; I believe they run from within docker:dind.
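For context, the compose file declares the services above; a minimal sketch, assuming a TLS-terminating nginx in front (images, ports and paths are illustrative, my actual file differs):

```yaml
version: "3"
services:
  nginx:
    image: nginx:alpine
    ports:
      - "1081:443"                      # TLS endpoint the test verifies against
    volumes:
      - ./certs:/etc/nginx/certs:ro     # server cert signed by the root CA
  tusd:
    image: tusproject/tusd
  minio:
    image: minio/minio
```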

I have installed the root CA certificate in the docker images marked with a red dot, including a custom image for docker:dind.

The following environment variables are exported from within the pytest CI build image to reference the root CA:

  • REQUESTS_CA_BUNDLE
  • SSL_CERT_FILE

I have to set both environment variables since tus-py-client uses both the requests library and http.client.
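The split can be shown in a few lines: http.client goes through Python's ssl module, which honours SSL_CERT_FILE, while requests honours REQUESTS_CA_BUNDLE. A small sketch (the bundle path is the Alpine/Debian default produced by update-ca-certificates):

```python
import os
import ssl

# tus-py-client uploads via http.client, which uses the ssl module;
# requests maintains its own CA setting. Set both to the same bundle.
bundle = "/etc/ssl/certs/ca-certificates.crt"
os.environ["SSL_CERT_FILE"] = bundle       # read by ssl / http.client
os.environ["REQUESTS_CA_BUNDLE"] = bundle  # read by requests

# The ssl module reports which environment variable it consults:
paths = ssl.get_default_verify_paths()
print(paths.openssl_cafile_env)  # → SSL_CERT_FILE
```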

When the CI build runs I still receive a CERTIFICATE_VERIFY_FAILED error. Any ideas?

Update
If I issue openssl s_client -connect docker:1081 -CApath /etc/ssl/certs from within the CI test job, to attempt verification of the certificate, I receive verify error:num=21:unable to verify the first certificate and verify error:num=20:unable to get local issuer certificate.

I have tried starting the docker-compose services on a separate Ubuntu 16.04 bare-metal instance with the root certificate installed using update-ca-certificates. I then issued openssl s_client -connect docker:1081 -CApath /etc/ssl/certs, which verified successfully after adding a 127.0.0.1 docker entry to /etc/hosts. So the server certificate and root CA appear to be OK.
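For anyone hitting the same errors: as I understand it, error 21 (unable to verify the first certificate) means the chain stops at a certificate that is not trusted locally, and error 20 (unable to get local issuer certificate) means the issuer could not be found in the store OpenSSL is searching. A quick way to check that a CA file works as a trust anchor, independent of any running server, is openssl verify (throwaway self-signed CA shown; file names are illustrative):

```shell
# Create a throwaway self-signed CA (illustrative names)
openssl req -x509 -newkey rsa:2048 -nodes -keyout ca.key -out ca.crt \
    -subj "/CN=example-test-ca" -days 1 2>/dev/null

# Passed via -CAfile, a self-signed certificate acts as its own trust
# anchor, so verification succeeds:
openssl verify -CAfile ca.crt ca.crt
# → ca.crt: OK

# Without -CAfile the default store is searched; since this CA is not
# in it, verification fails - the situation inside my CI job.
```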

Why am I receiving verify error numbers 20 and 21 when connecting to the specific docker-compose service running within docker:dind?

Update 21/12/2018
Fixed. In the image for the CI test job I placed the custom CA certificate directly in /usr/local/share/ca-certificates. Previously I had placed it in /usr/local/share/ca-certificates/subfolder, and from there it was not appended to the /etc/ssl/certs/ca-certificates.crt bundle when update-ca-certificates ran. Following this, I reverted to the stock docker:dind image in .gitlab-ci.yml and the certificates were still verified successfully, so there is no need to install the root CA certificate in docker:dind.
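The working layout can be sketched as a Dockerfile fragment for the CI test image (base image and certificate file name are illustrative):

```dockerfile
FROM alpine:3.8
RUN apk add --no-cache ca-certificates
# The certificate must sit directly in /usr/local/share/ca-certificates;
# in my setup, a certificate placed in a subfolder there was not appended
# to /etc/ssl/certs/ca-certificates.crt by update-ca-certificates.
COPY myrootca.crt /usr/local/share/ca-certificates/myrootca.crt
RUN update-ca-certificates
```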

A clean git clone in the development environment also fixed some issues with requirements.txt for pip install, and the tests now pass in the build environment. The build environment closely matches the dev environment!
