Caching vs downloading artifacts from UI to enable file sharing across pipelines

I have a `.gitlab-ci.yml` file with the following structure:

```yaml
stages:
  # docker related stages
  - build
  - compile
```

After all the docker related stages and jobs, I essentially run a few commands in the docker container to:

  1. do a clean build from scratch with meson
  2. compile the meson project using the files from step 1
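The two steps above look roughly like this in the job scripts (job names and the `builddir` directory name are simplified placeholders, not my exact config):

```yaml
build:
  stage: build
  script:
    # step 1: configure a fresh build directory with meson
    - meson setup builddir

compile:
  stage: compile
  script:
    # step 2: compile using the files generated in step 1
    - meson compile -C builddir
```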

I do not want the clean build from scratch to be executed every time. But I do want the code to be compiled whenever there’s a change in the source code. The compile command uses files generated by the initial build job.

Right now I am using a global cache to make the generated files available to the compile command. But would it be better to download artifacts from the UI instead? Sharing artifacts between different pipeline runs doesn't seem possible.
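For reference, my current global cache setup looks roughly like this (the key and path are simplified; `builddir` is a placeholder for my actual build directory):

```yaml
cache:
  key: meson-build
  paths:
    - builddir/
```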