Does publishing on gitlab-pages remove old files?

I have a `.gitlab-ci.yml` similar to this:

    pages:
      script:
        - touch report.html
        - touch public/index.html
        - cp report.html public/${CI_COMMIT_SHORT_SHA}.html
      artifacts:
        name: results
        paths:
          - report.html
          - public

What I experienced is that only the latest result is available (the current commit's `.html`). I suppose publishing a new `public` directory removes the previous content. Is there a way to publish the "new" content and keep the "old"?


You’ll need to find a way to persist the old content. One way of exposing it is the artifacts setting, which makes the files available in the GitLab CI/CD web interface, and going on from there.

I would toy with the idea of uploading the created file from the job into the Git repository under a new name, for example `report-$(date +%s)` or similar. The step can be added to the script and involve a `curl` call, re-using a CI access token from the environment variables.
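A rough sketch of such an upload step using GitLab's Repository Files API. This assumes a project access token with `write_repository` scope stored in a masked CI/CD variable (called `GITLAB_TOKEN` here — the predefined `CI_JOB_TOKEN` usually cannot push commits); `CI_API_V4_URL` and `CI_PROJECT_ID` are predefined GitLab CI variables:

```shell
# Build a unique file name per run, e.g. report-1700000000.html
TS=$(date +%s)
TARGET="report-${TS}.html"

# Create the file on master via the Repository Files API.
# --data-urlencode "content@report.html" reads and URL-encodes the
# report file; GitLab docs recommend URL-encoding the file path in
# the URL as well (relevant if it contains slashes or dots).
curl --request POST \
  --header "PRIVATE-TOKEN: ${GITLAB_TOKEN}" \
  --data-urlencode "branch=master" \
  --data-urlencode "content@report.html" \
  --data-urlencode "commit_message=Add ${TARGET}" \
  "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/repository/files/${TARGET}"
```

Note this creates a real commit on `master`, which is exactly why the tag rule below matters.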

One thing to be aware of: if you upload and commit this to git master, the CI will trigger again and possibly result in an endless loop. To prevent this scenario, define a rule to only run the deployment when a tag is pushed.

We use that for our blog at

    rules:
      - if: '$CI_COMMIT_TAG != null'
        when: always




Thanks for the reply. So I assume there is no way to publish GitLab Pages in a non-destructive way? I would rather avoid creating commits in the repository.

Regards, Michal.


GitLab Pages uses what’s provided by the Git repository on each CI/CD pipeline execution; that’s by design. If you want to persist the report history and have it available, you need to store it somewhere. The Git repository itself would be my first approach, and it should not be destructive in any way. Other options include putting the reports in an S3 bucket and fetching them from there on the next run. There are probably many other ways to cache this; I would only recommend keeping it simple.
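If committing back to the repository is off the table, one variant worth trying (an untested sketch, not something from this thread) is letting the runner cache carry `public/` between pipeline runs, so each run adds its report to whatever the cache still holds:

```yaml
pages:
  cache:
    key: pages-history        # one shared cache key across pipelines
    paths:
      - public/
  script:
    - mkdir -p public         # first run: cache is empty
    - cp report.html public/${CI_COMMIT_SHORT_SHA}.html
  artifacts:
    paths:
      - public
```

Caveat: CI caches are best-effort and can be evicted or miss (e.g. on a different runner), so older reports may silently disappear; an S3 bucket remains the more reliable store.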