I’m working on a project with a lot of PNG files (up to 100K) and I need to optimize them with tools like optipng (stripping metadata, re-compression, color type changes…)
The optimization can take several hours when that many files are involved.
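Since each file is independent, the hours-long runtime can be cut down by running the optimizer on several files in parallel. A minimal sketch (the `optimize_pngs` helper name is mine; it assumes GNU `find`/`xargs` and takes the optimizer command as a parameter, so it works with optipng, pngcrush, etc.):

```shell
#!/usr/bin/env bash
# Hypothetical helper: run a PNG optimizer over every .png under a
# directory, N files at a time in parallel. The optimizer command is
# passed as trailing arguments.
optimize_pngs() {
    local dir="$1" jobs="$2"
    shift 2
    # "$@" is now the optimizer command, e.g.: optipng -o2 -strip all
    # -print0 / -0 keeps filenames with spaces safe; -n 1 gives each
    # invocation one file; -P runs $jobs invocations concurrently.
    find "$dir" -type f -name '*.png' -print0 |
        xargs -0 -r -n 1 -P "$jobs" "$@"
}
```

Usage would look like `optimize_pngs assets/ 8 optipng -o2 -strip all` to use 8 parallel jobs.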
I’m wondering what’s the best way to automate this task with GitLab?
I’ve already tried several approaches:
Git pre-commit hook
Committed files are optimized at commit time, but it’s a blocking operation, so it isn’t ideal, especially when a lot of PNGs are committed at once and the hook takes a long time.
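For reference, the hook logic is roughly the following (a sketch, with the body extracted into a function; in a real setup it would live in `.git/hooks/pre-commit`, and the `PNG_OPTIMIZER` variable is an assumed knob of mine, defaulting to optipng):

```shell
#!/usr/bin/env bash
# Hypothetical pre-commit hook logic: optimize each staged .png in
# place, then re-stage it so the optimized bytes are what the commit
# actually records.
set -euo pipefail

optimize_staged_pngs() {
    local optimizer="${PNG_OPTIMIZER:-optipng -quiet -strip all}"
    # List staged .png files (added/copied/modified), NUL-separated
    # to be safe with spaces in filenames.
    git diff --cached --name-only --diff-filter=ACM -z -- '*.png' |
    while IFS= read -r -d '' f; do
        $optimizer "$f"   # optimize in place
        git add -- "$f"   # re-stage the optimized file
    done
}
```

This is exactly why the commit blocks: the hook has to finish optimizing every staged file before the commit is recorded.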
GitLab CI + Build artifacts:
A deploy job does the compression and sends the optimized files back to the server as build artifacts.
The drawback of this workflow is that the optimized files never make it back into the repo, so they have to be optimized again and again at each release.
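The job in question is essentially this (a sketch; the job name, stage, image, and `expire_in` value are all illustrative, not my actual config):

```yaml
# Hypothetical .gitlab-ci.yml job: optimize PNGs and expose them as
# build artifacts for later stages / download.
optimize-pngs:
  stage: build
  image: alpine:latest
  before_script:
    - apk add --no-cache optipng
  script:
    - find . -type f -name '*.png' -exec optipng -quiet -strip all {} +
  artifacts:
    paths:
      - "**/*.png"
    expire_in: 1 week
```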
GitLab CI + commit/push optimized files
A deploy job does the compression, commits the optimized files, and pushes them back to the remote repo.
This doesn’t work out of the box, since runners have read-only access to the repo.
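One workaround I’ve seen suggested, though I haven’t validated it, is to give the job write access explicitly: create a project access token with the `write_repository` scope, store it in a masked CI variable (here assumed to be named `PNG_PUSH_TOKEN`), and push over HTTPS with it. A sketch, with `[skip ci]` in the commit message to keep the job from re-triggering itself:

```yaml
# Hypothetical job: commit optimized PNGs back using a project access
# token. PNG_PUSH_TOKEN is an assumed masked CI/CD variable; the other
# variables (CI_SERVER_HOST, CI_PROJECT_PATH, CI_COMMIT_REF_NAME) are
# predefined by GitLab.
optimize-and-push:
  stage: deploy
  script:
    - find . -type f -name '*.png' -exec optipng -quiet -strip all {} +
    - git config user.email "ci@example.com"
    - git config user.name "CI PNG optimizer"
    - git add -A
    - |
      if ! git diff --cached --quiet; then
        git commit -m "Optimize PNGs [skip ci]"
        git push "https://oauth2:${PNG_PUSH_TOKEN}@${CI_SERVER_HOST}/${CI_PROJECT_PATH}.git" "HEAD:${CI_COMMIT_REF_NAME}"
      fi
```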
Does anyone know of good alternative workflows for such a need?