I am interested in installing GitLab with LFS to serve as a data repository for a university research lab, and am trying to decide on the best way forward. I have used GitLab as a user, but this is the first time I’ve been asked to deploy and manage a GitLab installation.
- I have GitLab 13.6.2-ee installed via Omnibus on an Ubuntu 18 machine.
- I have an existing network-mounted Synology NAS on which I want all our repos to sit. The NAS consists of two drive bays in a mirrored arrangement for backup purposes. The main purpose of this NAS was to store data, so I can rearrange its configuration as needed.
- I am aware that some Synology models can deploy GitLab via Docker, but my particular model doesn’t support this.
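For reference, the NAS is currently mounted on the GitLab host over NFS, roughly like this (server name, export path, and mount options are placeholders for our actual setup, not a recommendation):

```
# /etc/fstab on the GitLab host -- placeholder server and paths
# mount options are my guess; the GitLab NFS docs discuss hard vs. soft mounts
synology-nas:/volume1/gitlab-data  /mnt/gitlab-data  nfs  defaults,hard,nofail  0 0
```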
I was wondering:
- I saw from the NFS documentation (https://docs.gitlab.com/ee/administration/nfs.html#soft-mount-option) that NFS support will end in GitLab 14. Instead of keeping the NAS on the network, should I move it so it’s plugged directly into my workstation? What is the recommended option in this regard?
- I know GitLab has its own backup best practices. Should I keep my NAS in its mirrored arrangement, or break the mirror and rely on GitLab’s backup setup instead?
- We will likely be working with large text files (~500 MB each) and video files (1–2 GB each). I know that even with git-lfs it’s not recommended to put large files into git, but I haven’t found a suitable alternative. I mainly want a central version of all our data files so lab users aren’t overwriting each other’s updates (i.e., if they clean some data and want to re-upload); I’m less concerned about file history. Is using git for this purpose a bad idea?
- If my GitLab installation breaks, is there an easy way to extract the contents of my repos (i.e., by accessing the NAS directly) while I fix the installation? Or perhaps set up an auto-export of the master branch?
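On the backup question: the built-in mechanism I’d be relying on is the Omnibus backup task, scheduled from cron along these lines (the schedule is just an example):

```
# crontab on the GitLab host -- nightly application backup at 02:00
# CRON=1 suppresses progress output; backups land in /var/opt/gitlab/backups by default
0 2 * * * /opt/gitlab/bin/gitlab-backup create CRON=1
```

My understanding is that this covers repositories and the database, but gitlab.rb and gitlab-secrets.json have to be backed up separately.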
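On the large-files question, for concreteness: what I have in mind is routing those file types through LFS with a `.gitattributes` like the following (the extensions are just examples standing in for our data and video formats):

```
# .gitattributes -- what `git lfs track` writes for the patterns we'd use
*.csv filter=lfs diff=lfs merge=lfs -text
*.mp4 filter=lfs diff=lfs merge=lfs -text
```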
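On the last question, the auto-export I was imagining is essentially a cron-driven mirror of each repo into a plain directory on the NAS, so the data stays readable with stock git even while GitLab is down. A minimal sketch (the function name is mine; the source would be each repo’s real clone URL):

```shell
#!/bin/sh
# Keep a bare mirror of one repo in a plain directory.
# Usage: mirror_repo <clone-url-or-path> <destination-dir>
mirror_repo() {
    src="$1"
    dest="$2"
    if [ -d "$dest" ]; then
        # Mirror already exists: fetch all refs, pruning ones deleted upstream
        git -C "$dest" remote update --prune
    else
        # First run: create the bare mirror clone
        git clone --mirror "$src" "$dest"
    fi
}
```

The resulting bare mirror can then be cloned or archived directly while the GitLab installation is being repaired.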