I am running a Python script in CI on GitLab.com.
What are the resource limits for CI jobs on GitLab.com?
- max RAM?
- max CPUs?
- max disk space?
Hi @westshawn and welcome to the GitLab Community Forums.
If you’re using the GitLab.com autoscaling shared runners, I believe the runners have 1 CPU, 4 GB RAM, and 22 GB of storage (16 GB available).
You also have the option to install GitLab Runner on your own infrastructure and use it to run your CI jobs. With a self-hosted runner there is essentially no resource limit beyond your own hardware, since you are no longer sharing resources.
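If you go that route, you direct jobs to your own runner with tags in .gitlab-ci.yml. A minimal sketch, assuming you registered the runner with a (hypothetical) my-runner tag and that my_script.py is your script:

heavy-job:
  tags:
    - my-runner            # must match a tag you set when registering your self-hosted runner
  script:
    - python my_script.py  # placeholder for your actual workload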
Awesome!
Thank You!!
Hi @gitlab-greg, is there any evidence, such as official docs, for these resource limits?
Thx, Martin
All your CI/CD jobs run on n1-standard-1 instances with 3.75GB of RAM, CoreOS and the latest Docker Engine installed. Instances provide 1 vCPU and 25GB of HDD disk space.
Source: GitLab.com settings | GitLab
You can also confirm these stats in real time by adding a few commands to a GitLab CI job.
For example:
image: "ubuntu:latest"
shared-runner-stats:
script:
- free -m | grep -v "Swap" # RAM
- df -h| grep -E "Filesystem|overlay" # storage
- lscpu | grep -E "^CPU\(s\)" # CPUs
The output should look something like:
$ free -m | grep -v "Swap"
              total        used        free      shared  buff/cache   available
Mem:           3693         481        2221         199         991        2790
$ df -h | grep -E "Filesystem|overlay"
Filesystem      Size  Used Avail Use% Mounted on
overlay          22G  4.6G   17G  22% /
$ lscpu | grep -E "^CPU\(s\)"
CPU(s):    1
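Since the original question mentions a Python script, the same checks can also be done from inside the script itself using only the standard library. A minimal sketch (Linux-only, since it reads /proc/meminfo):

import os
import shutil

# CPUs visible to the process
print("CPUs:", os.cpu_count())

# Total RAM: the first line of /proc/meminfo is "MemTotal: <n> kB" (Linux-only)
with open("/proc/meminfo") as f:
    mem_kib = int(f.readline().split()[1])
print(f"RAM: {mem_kib / 1024:.0f} MiB")

# Disk space on the filesystem holding the current working directory
usage = shutil.disk_usage(".")
print(f"Disk: {usage.free / 2**30:.1f} GiB free of {usage.total / 2**30:.1f} GiB")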
I thought we could use half of an n2-standard-2 instance now, but that was not correct.
It looks like the current limit is 8 GB (most likely actually GiB) of RAM for the default small Linux runners.
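For jobs that need more than the small runner provides, GitLab.com also offers larger hosted Linux runners that you select with tags. A minimal sketch; the exact tag names and sizes may change, so verify them against the current GitLab docs:

bigger-job:
  tags:
    - saas-linux-medium-amd64  # hosted runner size tag; check current docs for available sizes
  script:
    - free -m                  # should report more RAM than the default small runner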