We’re investigating migrating from gitolite to GitLab. One big outstanding issue in our GitLab POC is git pre-receive hooks. On our current git server we have a library of pre-receive hooks that run on Linux, and we expect to be able to migrate them to GitLab.
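For context, our hooks follow git’s standard pre-receive protocol: git feeds the hook one “old-sha new-sha refname” line per updated ref on stdin, and a non-zero exit rejects the push. A minimal sketch in Python (the toy “no deleting protected branches” policy is just an illustration, not one of our actual hooks):

```python
# Git's pre-receive protocol: one "old-sha new-sha refname" line per
# updated ref arrives on stdin; exiting non-zero rejects the whole push.

def parse_updates(lines):
    """Parse pre-receive stdin lines into (old, new, ref) tuples."""
    return [tuple(line.strip().split(" ", 2)) for line in lines]

def check(updates, protected=("refs/heads/main",)):
    """Toy policy: reject deletion of protected branches (all-zero new sha)."""
    zero = "0" * 40
    for old, new, ref in updates:
        if new == zero and ref in protected:
            return False, f"deleting {ref} is not allowed"
    return True, "ok"

# In a real hook:
#   import sys
#   ok, msg = check(parse_updates(sys.stdin))
#   print(msg, file=sys.stderr)
#   sys.exit(0 if ok else 1)
```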
The problem is that our hooks need a complete Linux environment, not the pared-down, minimalist Gitaly docker image that we believe GitLab on Kubernetes will provide. Our hook environment requires:
1. Our collection of scripts (ideally a git clone updated regularly via a “git pull”)
2. Certain Perl libraries
3. Certain Linux packages
It is not clear that these would be present in the default Gitaly docker image. We could build our own Gitaly docker image, but this would (a) make upgrading GitLab much more difficult if we had to rebuild our custom image for every upgrade, and (b) perhaps create a bootstrapping issue: how can you boot up GitLab if you need GitLab’s own container registry to fetch your own version of the Gitaly docker image?
For item (1), perhaps we could simply mount or git-clone our collection of scripts into the docker image. But that solution does not apply to (2) and (3).
In our brainstorming, we wondered whether we could run our own docker image that also mounts the git repo and exposes a socket to the Gitaly image, so that Gitaly calls out to our docker image for the pre-receive hooks. But we have no idea whether that is possible.
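To make the idea concrete, here is a sketch of what the Gitaly-side shim might look like, with everything about the sidecar (its hostname, port, and reply convention) being our own assumptions rather than anything GitLab provides: a thin pre-receive hook that forwards the update lines to a policy sidecar over TCP and relays its verdict back to git.

```python
# Hypothetical thin shim installed as the pre-receive hook in the Gitaly
# container. It forwards the hook's stdin to a policy sidecar (address and
# reply convention are assumptions) and relays the verdict back to git.
import socket

SIDECAR_ADDR = ("hook-sidecar", 9000)  # assumed sidecar service name and port

def relay(payload: bytes, addr=SIDECAR_ADDR) -> bytes:
    """Send the hook's stdin to the sidecar and return its full reply."""
    with socket.create_connection(addr, timeout=10) as conn:
        conn.sendall(payload)
        conn.shutdown(socket.SHUT_WR)  # signal end-of-request to the sidecar
        chunks = []
        while chunk := conn.recv(4096):
            chunks.append(chunk)
    return b"".join(chunks)

def accepted(reply: bytes) -> bool:
    """Assumed convention: a reply starting with b'ok' accepts the push."""
    return reply.startswith(b"ok")

# In the real shim:
#   import sys
#   reply = relay(sys.stdin.buffer.read())
#   sys.stderr.write(reply.decode())
#   sys.exit(0 if accepted(reply) else 1)
```

The sidecar would be our fat image with the Perl libraries and Linux packages installed, which sidesteps rebuilding Gitaly itself; whether Gitaly/GitLab actually supports installing such a shim is exactly the question we’re asking.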
(Could we rewrite everything in bash? Possibly, though the usual reason to move a script from bash to a language like Python/Perl/Ruby is that you need the richer data structures those languages provide. Writing our hooks in a less suitable environment would likely make them buggier and less maintainable. Additionally, beyond the standard awk etc., we’d still likely need who knows what other specialized tools to handle JSON/XML/HTTP/… that are probably not in the Gitaly image.)
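As one illustration of that data-structure gap (the report format and quota numbers here are invented for the example): the kind of check that is a few lines in Python with the standard `json` module is genuinely painful to do robustly in bash/awk.

```python
# Invented example: a hook check that walks a JSON report of branch sizes.
# Trivial in Python; awkward and fragile when attempted with bash/awk.
import json

def branches_over_quota(report_json, limit_mb=50):
    """Return the refs whose reported pack size exceeds limit_mb."""
    report = json.loads(report_json)
    return [b["ref"] for b in report["branches"] if b["size_mb"] > limit_mb]
```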
Any tips/hacks you could provide would be greatly appreciated.