Just trying to weigh up my options for Git hosting. I've been using Bitbucket and GitLab on and off, like a distro hopper.
What I need to know is the maximum repo size for a free user, plus the LFS storage allowance. I intend to use it for Unreal Engine projects.
I asked pretty much the same question over at Bitbucket and was told: "In Bitbucket Cloud you have a limitation of 2GB per repo. If you enable LFS, you'll get 1GB for your LFS files for free."
So how does GitLab compare to that?
EDIT: According to this page… the max repo size on GitLab.com is 10GB. Does that include LFS?
I would also like to know about a self-hosted (on-premises) GitLab Community Edition instance.
What should the maximum size limit for a project be? Can it be 15GB or more? And does a project that large cause issues when restoring a whole GitLab instance backup with the "gitlab-backup restore" command?
Looking forward to your help and recommendations.
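For context, the restore flow I'm testing is the standard documented one for Omnibus installs (the backup name below is a placeholder, not my actual file):

```shell
# Stop the processes that are connected to the database,
# but leave the rest of GitLab running so the rake task works.
sudo gitlab-ctl stop puma
sudo gitlab-ctl stop sidekiq

# Restore from a backup archive in /var/opt/gitlab/backups.
# BACKUP is the archive filename minus the trailing "_gitlab_backup.tar".
sudo gitlab-backup restore BACKUP=<timestamp_and_version>

# Bring everything back up and sanity-check the instance.
sudo gitlab-ctl restart
sudo gitlab-rake gitlab:check SANITIZE=true
```

The target instance has to be the same GitLab version and edition as the one the backup was taken from, which is what I'm doing.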
For a self-managed instance, you can set it to whatever you want. If you're worried about how large projects get, you can set a global limit and then override it for individual projects.
In terms of backup and restore, that mostly depends on your infrastructure setup. There is no specific guideline. I can tell you that we do have users who run GitLab with very large 100+GB projects, but I believe most of these are on high availability (HA) setups.
Thanks for the quick response, Cynthia. But here I was talking about Community Edition, which doesn't have global or per-project limits, so is there an alternate way? One more thing: what do you mean by "infrastructure setup"? We have an approx. 90+GB backup dump. When we try to restore it on a new instance with the same version/edition, the largest projects (10+GB) are not getting restored.
Sorry for the late reply. If you haven't already, I suggest starting a new thread with details on your issue, as backups are unrelated to the original question about project repository limits.