Hi Everyone,
I had an idea to create a separate server with a newer version and transfer the project from the old one.
I want to keep all the versioning history from each of the old repositories.
Would anyone know how I could transfer all the history from a very old version of GitLab to the newest one?
Does your saying “the project” imply that there’s only one project on this GitLab server?
Then setting up a new server and transferring it might indeed be the easier option. But you also use the plural in “each of the old repositories”.
Normally the way would be to find the upgrade path from the version you currently have and then follow that.
With a GitLab version that old (6-7 years I would guess), you’ll probably also have to upgrade the underlying OS along the way. You’ll need to say which OS that is if you want any help that covers that part.
A simple export of a project will contain the entire git history, so you shouldn’t lose anything if you go that way. But that export-import will be a per-project operation, so it quickly becomes more time-consuming than the regular upgrade path. My guess would be that the break-even point, with a version of GitLab as old as that, is somewhere in the (low) double-digit number of repositories.
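For reference, a sketch of what that export-import could look like through the GitLab REST API. Note the caveat: the project export API only exists in newer GitLab versions (roughly 10.6 and later, if I remember correctly), so on a server as old as yours you may have to use the export button in the web UI instead. All URLs, the token, and the project ID below are made-up placeholders:

```shell
OLD="https://old-gitlab.example.com"   # placeholder: your old server
NEW="https://new-gitlab.example.com"   # placeholder: your new server
TOKEN="glpat-xxxxxxxx"                 # placeholder: token with API scope
ID=42                                  # placeholder: numeric project ID

# 1. Schedule the export on the old server
curl --request POST --header "PRIVATE-TOKEN: $TOKEN" \
     "$OLD/api/v4/projects/$ID/export"

# 2. Poll until "export_status" in the response is "finished"
curl --header "PRIVATE-TOKEN: $TOKEN" \
     "$OLD/api/v4/projects/$ID/export"

# 3. Download the archive (it contains the full git history)
curl --header "PRIVATE-TOKEN: $TOKEN" --output "project_$ID.tar.gz" \
     "$OLD/api/v4/projects/$ID/export/download"

# 4. Import the archive on the new server
curl --request POST --header "PRIVATE-TOKEN: $TOKEN" \
     --form "path=myproject" --form "file=@project_$ID.tar.gz" \
     "$NEW/api/v4/projects/import"
```

Repeat per project, which is why this scales worse than the upgrade path once you have many repositories.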
Thanks for the reply, this information will help me progress.
I have certainly made it a little bit confusing in my question.
I have multiple projects that I would have to move to the new server. I also have other people working on these projects, and I would not want to introduce any downtime for them.
So what would be your recommendation based on the above?
As @iwalker says, you can’t totally avoid downtime (with a single server), and when you have to upgrade across as many versions as you do, it might be better to just announce a day or a couple of evenings where people should not expect GitLab to be available.
You can minimize (but not eliminate) downtime by following the zero-downtime procedure (badly named, as it does cause downtime), but then you’ll have to go through every minor version, so you’re looking at many times as many upgrades (and you were already looking at quite a few), and between each upgrade you’ll have to let background migrations complete (that’s not specific to this procedure as such, but normally all upgrades are done in the foreground). So it’s harder work for you, and basically every upgrade will still give (a minimum of) downtime; your users might well prefer more downtime concentrated over a shorter period.
I agree with all of you that I cannot avoid the downtime if I only have one server.
However, the plan I have in mind now is to run a completely new server (a separate VM) and leave the old server as a backup (in case things go south).
This way I won’t need to go through the endless chain of upgrades from 9.5.0 to 16…
The only thing I am worried about is keeping the history of the repositories when moving to the new server. Although, I have seen some magic that can be done by changing the git remote origin in Git Bash.
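Concretely, the “magic” I mean is just re-pointing an existing clone at the new server (hypothetical paths and URLs below):

```shell
# Hypothetical local clone; adjust the path to your own project.
cd /path/to/local/clone

# See where origin currently points (the old server)
git remote -v

# Point origin at the same project on the new server
git remote set-url origin https://new-gitlab.example.com/mygroup/myproject.git

# Verify, then fetch from the new server
git remote get-url origin
git fetch origin
```

Of course this only redirects an existing clone; the project and its history still have to exist on the new server first.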
As I’ve already said, if you go with exporting + importing you should get all git history transferred, so you won’t lose anything there. (I don’t know if there are other kinds of history you care about.)
But even with your plan there will effectively be downtime, from a moment before you export the projects on the old server (they need to be in a non-changing state) until the import is done on the new one. People might still be able to access the projects on the old server in the meantime, but any changes they make there will be lost, which in general causes more complaints than downtime does.
In summary:
So downtime is unavoidable
The work for you will be significant no matter what
The path of upgrading is better known (except that we don’t know which OS your server runs, which might cause additional challenges), and guaranteed not to lose any data