Hi, I have been using GitLab CI for years, but now I need to step up and do something new, and I am not finding a way.
I currently have a build pipeline that compiles a Docker image; a following job takes this image and installs our software, using the Postgres service, as a test before release; and after this job succeeds, a last job builds an image with the production tag.
All of this works nicely, but now, in the third step (building the production image), I want to include in the image the database backup generated by step 2, which installs the software into a database. Our software running in step 2 already has a dump-and-package tool, and the whole process already runs and generates a zip file.
The second job, which installs the software, runs with our image directly. If I instead did a `docker run` and specified a volume, I could capture this file, but since this job is already running inside the image, I did not find a way to grab the file and preserve it.
I read something in the reference manual about the /builds folder, and in theory it is mounted as a volume into the container. I tried to save the result to this folder from inside the container, but without success; it is not recognized inside the running container.
before_script:
  - mkdir /builds/$RUNNER_TOKEN_KEY/$CONCURRENT_ID/$NAMESPACE/$PROJECT_NAME/backup
  - mkdir /builds/$RUNNER_TOKEN_KEY/$CONCURRENT_ID/$NAMESPACE/$PROJECT_NAME/imagehash
script:
  - >
    myprogram -d $DB_NAME
    -l pt_BR
    -i $MODULE
    --db_host postgres
    --db_user $POSTGRES_USER
    --db_password $POSTGRES_PASSWORD
    -b /builds/$RUNNER_TOKEN_KEY/$CONCURRENT_ID/$NAMESPACE/$PROJECT_NAME/backup/reference.zip
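If it matters, the documented way to refer to that folder seems to be the predefined `$CI_PROJECT_DIR` variable (the full path where the repository is cloned and the job runs), instead of rebuilding the path by hand. A sketch of how I understand it would look in my job (the job name is a placeholder, the options are from our tool):

```yaml
install_job:
  services:
    - postgres
  script:
    # write the dump inside the checkout directory via the predefined variable
    - mkdir -p "$CI_PROJECT_DIR/backup"
    - >
      myprogram -d $DB_NAME -l pt_BR -i $MODULE
      --db_host postgres
      --db_user $POSTGRES_USER
      --db_password $POSTGRES_PASSWORD
      -b "$CI_PROJECT_DIR/backup/reference.zip"
```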
I assumed this would be recognized as the project folder, so I set up the folder inside the container with the information I found. I also set the job's cache policy as
cache:
  untracked: true
However, at the end of the job it reports that no untracked files were found.
I would be grateful if someone could point me to the most appropriate way to mount a volume or cache inside a container so that the file is preserved for the next job.
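From my reading of the reference manual, artifacts (rather than cache) seem to be the mechanism meant for passing files between jobs, so I was expecting something along these lines to work. This is only a sketch of my understanding, with placeholder job and stage names, and with the artifact path given relative to the project directory:

```yaml
install_test:
  stage: test
  services:
    - postgres
  script:
    - mkdir -p "$CI_PROJECT_DIR/backup"
    # options shortened; this is the same myprogram call, writing the dump here
    - myprogram -d $DB_NAME -b "$CI_PROJECT_DIR/backup/reference.zip"
  artifacts:
    paths:
      - backup/reference.zip

build_production:
  stage: release
  dependencies:
    - install_test
  script:
    # backup/reference.zip should be restored into the checkout directory
    # before this job runs, so the Docker build context can include it
    - docker build -t $PROJECT_NAME:production .
```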