One of my repositories on GitLab.com is showing 9 GB of “storage” when I go to the ‘Project Overview’ page. It says Files: 311 MB, Storage: 9 GB.
I am not sure where these 9 GB come from! I have done a lot of CI testing on this project, which also created a lot of artifacts. But I deleted the pipelines using the REST API.
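For reference, roughly what I ran (a sketch only: the project ID and token below are placeholders, and the command is printed instead of executed so it can be reviewed first):

```shell
#!/bin/sh
# Sketch of the pipeline cleanup. Real endpoints:
#   GET    /projects/:id/pipelines               (list pipelines)
#   DELETE /projects/:id/pipelines/:pipeline_id  (delete one pipeline)
PROJECT_ID="12345"   # placeholder; use your own project ID
API="https://gitlab.com/api/v4/projects/${PROJECT_ID}/pipelines"

# Print the delete command for one pipeline rather than running it,
# so the calls can be checked before anything is deleted.
delete_cmd() {
  echo "curl --request DELETE --header 'PRIVATE-TOKEN: <token>' ${API}/$1"
}

delete_cmd 678   # 678 is a placeholder pipeline ID
```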
After deleting them and waiting a day, the 9 GB are still showing.
After that I ran “Housekeeping” in Settings -> General, but that also didn’t change the 9 GB of storage. At this point I don’t know what to do, and I fear that I soon won’t be able to push to my repository because of the 10 GB storage limit per repository.
I also exported the project to check the Git LFS storage, but the exported zip is only 300 MB. I know the export does not contain the CI data. So I think GitLab somehow did not pick up that my pipelines got deleted. Can somebody help and tell me how I can force a cleanup of all CI data?
Unfortunately I have no solution, only an emergency strategy which I still have to test.
You could export the project from GitLab, which gives you a zip with all settings but without the artifacts. Then move your original repo to another URL/name, and import a new GitLab project from the zip. That should remove the artifacts. From then on, either always set “expire_in” on your artifacts, or don’t use artifacts at all until this is fixed.
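For the “expire_in” part, a minimal `.gitlab-ci.yml` sketch (the job name, script, and paths are made up for illustration) that sets an expiry so new artifacts don’t pile up:

```yaml
# Hypothetical job showing artifacts with an expiry.
build:
  script:
    - make build        # placeholder build command
  artifacts:
    paths:
      - dist/           # placeholder output directory
    expire_in: 1 week   # artifacts are scheduled for deletion after a week
```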
Thanks for your quick response. Which artifacts can be given an ‘expire_in’? Do you mean the images in the Container Registry, or the artifacts from the CI/CD jobs?
Regards
Peter
I performed some tests. Despite the ‘expire_in’, the project storage is not released (at least according to the UI)…
Apart from exporting and re-importing the project, there seems to be no way to free the storage.
Greetings
Peter
@robinryf Same results here, even after deleting pipelines. The GitLab issue tracker has tens of thousands of issues. Has anyone found a relevant one so we can track this? I’ve done some searching but can’t find an issue that precisely describes this problem. I too would like to escalate, or at least make sure GitLab is aware of the issue.
Any update on how to solve this issue? CI/CD keeps consuming storage on the repo. The actual size of all the files in my repo is 5.3 MB, but the storage consumed by CI is 18 GB! @ushandelucca got some news?
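In case it helps anyone compare numbers: the same counter the UI shows is exposed by the API as `statistics.job_artifacts_size`. A sketch (project ID and token are placeholders; the command is printed rather than run):

```shell
#!/bin/sh
# Sketch: fetch the project's storage counters from the API.
PROJECT_ID="12345"   # placeholder; use your own project ID
URL="https://gitlab.com/api/v4/projects/${PROJECT_ID}?statistics=true"

# Printed instead of executed; the JSON response contains
# statistics.job_artifacts_size in bytes, which should match the UI figure.
echo "curl --header 'PRIVATE-TOKEN: <token>' '${URL}'"
```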
In my project I changed the pipeline so that I don’t need artifacts anymore. Since then my storage usage has not grown any further, but now the whole build runs as a single job.
The workaround of exporting, deleting and re-importing the project seems to be the only solution at the moment.
Has somebody tried what happens when you go over the 10 GB on GitLab.com? I think once you exceed that size you can’t push to the repository anymore. Does this restriction use the “real” size or the wrong size shown in the repository information?
I’m having the same problem. I deleted all jobs and pipelines, but job_artifacts_size keeps reporting more than 7 GB, and every pipeline takes up more space that is not reclaimed after expiration (I have expire_in: 1 day). I don’t know what to do.
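One thing I have not tried yet: erasing jobs individually, which is supposed to remove their artifacts and logs as well. A sketch (the IDs and token are placeholders; the command is printed, not executed):

```shell
#!/bin/sh
# Sketch: erase a single job via POST /projects/:id/jobs/:job_id/erase,
# which removes that job's artifacts and log.
PROJECT_ID="12345"   # placeholder project ID
JOB_ID="9876"        # placeholder job ID
URL="https://gitlab.com/api/v4/projects/${PROJECT_ID}/jobs/${JOB_ID}/erase"

# Printed so the call can be reviewed before actually erasing anything.
echo "curl --request POST --header 'PRIVATE-TOKEN: <token>' ${URL}"
```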
There’s a somewhat old (two-year-old) issue reporting that the artifacts-size statistics calculation was, and still is, broken. That seems to be the current root of these storage-size quirks.