Automated workflow to optimize PNG files

Hi all,

I’m working on a project with a lot of PNG files (up to 100K) and I need to optimize them with tools like optipng (strip metadata, re-compress, change the color type…)

The optimization task can take several hours when that many files are involved.
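For reference, the batch run itself can at least be parallelised so those hours shrink with core count. A minimal sketch, assuming optipng is on the PATH (the directory name is a placeholder):

```shell
# Optimize every PNG under a directory, one optipng process per CPU core.
# -o2 is a moderate optimization level; -strip all removes metadata chunks.
optimize_tree() {
    find "$1" -name '*.png' -print0 \
        | xargs -0 -r -n 1 -P "$(nproc 2>/dev/null || echo 4)" \
            optipng -o2 -strip all
}

# Usage (directory name is an assumption):
# optimize_tree assets
```

With `-P` set to the core count, xargs keeps one optipng process per CPU, which matters at the 100K-file scale.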

I’m wondering what’s the best way to automate this task with GitLab?

I already tried different approaches:

Git pre-commit hook

Committed files are optimized at commit time, but it is a blocking operation, so it’s not ideal, especially when a lot of PNGs are committed since the hook takes a long time.
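Such a hook might look like the following sketch (assuming optipng is on the PATH): it optimizes only the PNGs staged in the current commit and re-stages them so the commit contains the optimized versions.

```shell
# Sketch of .git/hooks/pre-commit (make it executable).
optimize_staged_pngs() {
    # Only the added/copied/modified PNGs staged in this commit
    staged=$(git diff --cached --name-only --diff-filter=ACM -- '*.png')
    [ -z "$staged" ] && return 0
    printf '%s\n' "$staged" | while IFS= read -r f; do
        # Optimize in place, then re-stage the optimized file
        optipng -o2 -strip all "$f" && git add "$f"
    done
}

optimize_staged_pngs
```

This is exactly where the blocking problem shows: `git commit` waits for every optipng run to finish.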

GitLab CI + build artifacts

A deploy job does the compression and sends the optimized files back to the server as build artifacts.

The drawback of this workflow is that the optimized files never make it back into the repo, so they have to be optimized again and again at each release.
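For reference, the artifacts variant can be sketched like this in .gitlab-ci.yml (the job name, stage and `assets` path are assumptions):

```yaml
optimize:
  stage: deploy
  script:
    - find assets -name '*.png' -exec optipng -o2 -strip all {} +
  artifacts:
    paths:
      - assets/
    expire_in: 1 week
```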

GitLab CI + commit/push optimized files

A deploy job does the compression, commits the optimized files and pushes them back to the remote repo.

This doesn’t work since runners have read-only access to the repo.
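The job’s own token is indeed read-only, but if your GitLab version supports access tokens with the write_repository scope, pushing back becomes possible. A sketch, assuming such a token is stored in a masked CI variable named PUSH_TOKEN (the `[skip ci]` marker avoids re-triggering the pipeline, and the `HEAD:` refspec is needed because the runner checks out a detached HEAD):

```yaml
optimize:
  stage: deploy
  script:
    - find assets -name '*.png' -exec optipng -o2 -strip all {} +
    - git add assets
    - git commit -m "Optimize PNGs [skip ci]" || echo "nothing to commit"
    - git push "https://oauth2:${PUSH_TOKEN}@${CI_SERVER_HOST}/${CI_PROJECT_PATH}.git" "HEAD:${CI_COMMIT_BRANCH}"
```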

Does anyone know of good alternative workflows for such a need?

Actually I found a good alternative:

GitLab Triggers

I created a cron job on the server that triggers a pipeline every night with a GitLab API call.
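The trigger call is a POST to the pipeline triggers API. A sketch (the host, PROJECT_ID and TRIGGER_TOKEN are placeholders; the token comes from the project’s CI/CD settings under Pipeline triggers):

```shell
# Trigger a pipeline on the main branch, passing a NIGHTLY variable so
# the pipeline can tell it was started by the nightly cron.
trigger_nightly() {
    curl --fail --silent --request POST \
        --form "token=${TRIGGER_TOKEN}" \
        --form "ref=main" \
        --form "variables[NIGHTLY]=true" \
        "https://gitlab.example.com/api/v4/projects/${PROJECT_ID}/trigger/pipeline"
}

# Example crontab entry, every night at 02:00:
# 0 2 * * * TRIGGER_TOKEN=xxx PROJECT_ID=42 /usr/local/bin/nightly-trigger.sh
```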

The only drawback of this method is that I need to distinguish in my gitlab-ci.yml between a regular build and a nightly build, since I don’t want to optimize assets on every build: the processing time is too long.
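One way to make that distinction is a variable set only by the trigger call. A sketch, assuming the trigger passes NIGHTLY=true (on older GitLab versions, `only: triggers` can serve the same purpose as `rules:`):

```yaml
optimize:
  stage: deploy
  script:
    - find assets -name '*.png' -exec optipng -o2 -strip all {} +
  rules:
    - if: '$NIGHTLY == "true"'
```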

I hope that a nightly build feature will be integrated directly into GitLab in the future.