GitLab server backup strategy

This is not a problem, but a question.

Backing up our GitLab server results in very large tar files.

We have a TSM backup of the whole machine on which GitLab is installed (a Docker installation).

Since this is a Docker-based installation, all GitLab-relevant data is stored in a single directory, /srv/docker/gitlab/.
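
For context, such a setup typically mounts that host directory into the container. A rough sketch of a start command follows; the image tag, container name, hostname, and port mappings are illustrative assumptions, not our actual values:

# Illustrative only: image, container name, hostname and ports are assumptions.
docker run --detach \
  --name gitlab \
  --hostname gitlab.example.com \
  --publish 443:443 --publish 80:80 --publish 2222:22 \
  --volume /srv/docker/gitlab/config:/etc/gitlab \
  --volume /srv/docker/gitlab/logs:/var/log/gitlab \
  --volume /srv/docker/gitlab/data:/var/opt/gitlab \
  gitlab/gitlab-ce:latest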

My question is the following:

If we ever have to restore our system, will it be enough to just import that data? By data I mean repositories, the database, artifacts, and everything else GitLab needs.

I'm not sure exactly what you mean, but it sounds like "just make a copy of a directory instead of doing a backup". The big thing you can miss there is a consistent copy of the main GitLab PostgreSQL database, plus the Postgres database belonging to Mattermost if you're using it: copying the database files while the server is running does not give you a restorable snapshot. A copy of a filesystem directory is NOT a backup of your server's state.
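
By contrast, the built-in backup task dumps the database into the archive. On a Docker-based installation it can be run inside the container; a minimal sketch, assuming the container is named "gitlab" and the config volume lives under /srv/docker/gitlab/config/:

# Run the built-in backup task inside the running container (the container
# name "gitlab" is an assumption). This dumps PostgreSQL and bundles
# repositories, uploads, artifacts, etc. into one tar under the backup path.
docker exec -t gitlab gitlab-rake gitlab:backup:create

# Configuration and secrets are NOT part of that tar; copy them separately
# (the host path below assumes the volume mapping sketched earlier).
cp /srv/docker/gitlab/config/gitlab.rb /srv/docker/gitlab/config/gitlab-secrets.json /path/to/safe/location/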

Instead of doing something that doesn't work, why not restrict the backups to contain only what you need? There are command-line options to back up only the data you want:

sudo gitlab-rake gitlab:backup:create SKIP=uploads,artifacts,builds

https://docs.gitlab.com/omnibus/settings/backups.html
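
The restore side uses the matching rake task. A rough sketch of the documented sequence follows; the BACKUP= value is a placeholder for the real archive timestamp, and on a Docker installation each command would be wrapped in docker exec:

# Stop the services that talk to the database first (the web service is
# called unicorn instead of puma on older GitLab versions).
sudo gitlab-ctl stop puma
sudo gitlab-ctl stop sidekiq

# Restore from a specific archive in the backup directory; TIMESTAMP_VERSION
# is a placeholder for the real file name prefix.
sudo gitlab-rake gitlab:backup:restore BACKUP=TIMESTAMP_VERSION

# Bring everything back up and run the built-in consistency check.
sudo gitlab-ctl restart
sudo gitlab-rake gitlab:check SANITIZE=true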

Bonus tip: if your backups are huge, your GitLab instance may eventually fill its disk. Consider using some tooling to expire or delete old artifacts if those are wasting a lot of space for you; one option is sketched below.
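
For example, on recent GitLab versions the artifacts of old jobs can be deleted through the REST API; a hedged sketch where the host, project ID, job ID, and token are all placeholders:

# Delete the artifacts of a single old job (endpoint available in recent
# GitLab versions); all values in angle brackets are placeholders.
curl --request DELETE \
  --header "PRIVATE-TOKEN: <your_access_token>" \
  "https://gitlab.example.com/api/v4/projects/<project_id>/jobs/<job_id>/artifacts"

Newly created artifacts can also be given an expiry directly in the job definition with artifacts:expire_in, so they are cleaned up automatically.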

Thank you for your reply! That is exactly what I meant. I will go with a proper GitLab backup and be done with it.