bmulej
1
Hi,
we have performance issues with a large repository.
The problematic ones are the shell and Docker runners that run on Windows.
I would like some help, documentation, or information on what the runner does before it even starts a job.
That is, everything before the before_script statement. Is it a clone/fetch, some git clean, a checkout, anything else…
We would like to understand this underlying process so we can prepare a better solution for our development team.
Kind regards, Bostjan
snim2
2
Hi @bmulej
One thing that might speed things up for you is to tweak the Git shallow clone depth. I have one repo where I set:
variables:
  # Use a shallow clone depth to speed up the CI build.
  # See: https://docs.gitlab.com/ee/ci/large_repositories/
  GIT_DEPTH: 10
There are a number of other ideas on the large repositories page that might help you.
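For example, that page also covers variables that control how the runner prepares the working copy before your job runs. Treat the values below as a rough sketch rather than a recommendation, since the right strategy and clean flags depend on your repository:

variables:
  # Re-use the existing working copy and fetch changes,
  # instead of cloning from scratch on every job.
  GIT_STRATEGY: fetch
  # Control how aggressively the runner cleans the working
  # directory between jobs (these flags are only an example).
  GIT_CLEAN_FLAGS: -ffdx
  # Shallow fetch, as above.
  GIT_DEPTH: 10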
The other thing you might want to do is to change what that repository caches in the runners, if you find that extracting the cache takes a long time.
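If the cache does turn out to be the bottleneck, a narrower cache definition might look something like this. The key and paths here are only placeholders, since I don't know what your jobs actually produce:

cache:
  # One cache per branch; the key is just an example.
  key: "$CI_COMMIT_REF_SLUG"
  # Cache only the directories that are expensive to rebuild
  # (these paths are placeholders for your own project).
  paths:
    - .gradle/
  # Jobs that only consume the cache can use the pull policy,
  # so they skip uploading it again at the end.
  policy: pull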
Good luck!