GitLab CI workflow, multiple projects and Sphinx


We have a project here that an intern is working on.

The basic idea: use the Sphinx documentation generator to build documentation from a GitLab wiki, then upload the result to a web server.

Sphinx is a Python-based documentation generator that produces a static HTML site.

After some meddling with the Sphinx setup, it now works like this:

  1. Clone the GitLab wikis you need (each wiki is its own Git repository).
  2. Run a handful of scripts (e.g. deleting pages that are not in the index, so they won’t show up in the Sphinx search).
  3. Convert the cloned wiki’s .md files (those included in the index) to .rst, which enables the Sphinx lightbox/thumbnail extension. We can then mix .md (for the easy stuff like headings) with .rst features such as the nice info and warning boxes.
  4. Finally, upload everything to a web server (at a given URL) via ncftp in .gitlab-ci.yml.
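For context, the steps above could be sketched as a pipeline like this (the script names, image, and FTP variables are placeholders, not the actual files):

```yaml
# .gitlab-ci.yml in the generator project — illustrative sketch only;
# the helper script names and CI/CD variables are hypothetical.
stages:
  - build
  - deploy

build_docs:
  stage: build
  image: python:3.11
  script:
    - pip install sphinx
    # each GitLab wiki is a Git repo at <project-path>.wiki.git
    - git clone "https://gitlab.example.com/group/project.wiki.git" wiki
    - ./scripts/prune_unindexed_pages.sh wiki   # drop pages missing from the index
    - ./scripts/md_to_rst.sh wiki               # convert indexed .md files to .rst
    - sphinx-build -b html wiki public
  artifacts:
    paths:
      - public

deploy_docs:
  stage: deploy
  script:
    # NCFTP_* and TARGET_DIR would be CI/CD variables in the project settings
    - ncftpput -R -u "$NCFTP_USER" -p "$NCFTP_PASS" "$NCFTP_HOST" "$TARGET_DIR" public
```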

This works fine.

However, we have multiple projects in GitLab, and I want this documentation generator to work like a service.

Let’s say the Sphinx generator lives in project A, but I’m working on the wiki of project D. From project D, I want to call the “service” in project A to do everything for me.

I want to tell the service:

  1. Which GitLab wiki to document (my current one)
  2. The index.rst (the index defines the menu tree)
  3. Which URL to upload to (everything lives on the same Apache web server), plus maybe login credentials (though these could be made global)
  4. Maybe a different logo for the wiki (not that important).
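Those four inputs map naturally onto CI variables that a downstream pipeline receives. With GitLab’s multi-project pipelines, project D could trigger project A roughly like this (all variable names are my own invention, and `group/project-a` is a placeholder path):

```yaml
# .gitlab-ci.yml in project D — triggers the generator in project A
build_wiki_docs:
  variables:
    WIKI_REPO: "$CI_PROJECT_PATH.wiki"        # 1. which wiki to document
    INDEX_FILE: "index.rst"                   # 2. the index/menu tree
    UPLOAD_URL: "docs.example.com/project-d"  # 3. where to upload
    WIKI_LOGO: "logo-d.png"                   # 4. optional logo
  trigger:
    project: group/project-a
    branch: main
```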

How can I build such a service across multiple GitLab projects?

A simple solution would be to create a companion project for every project that needs documentation, but that doesn’t seem like the right way: with 40 projects I’d end up with 80 projects … bad!

Any ideas? There are multi-project pipelines, the GitLab API, and Git submodules … but I’m not sure where to start and couldn’t find a guide or existing solution for this kind of problem.
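Two concrete starting points: GitLab’s multi-project pipelines let a job in one project trigger another project’s pipeline and pass variables along, and the pipeline-trigger API does the same thing over HTTP on any tier. For the API route, you’d create a trigger token in project A (under Settings → CI/CD → Pipeline triggers) and call it from a job in project D; a sketch (the project ID 12345, the TRIGGER_TOKEN variable, and the variable names are placeholders):

```yaml
# .gitlab-ci.yml in project D — calls project A's pipeline-trigger API.
# 12345 would be project A's numeric ID; TRIGGER_TOKEN a CI/CD variable.
trigger_docs:
  script:
    - >
      curl --request POST
      --form "token=${TRIGGER_TOKEN}"
      --form "ref=main"
      --form "variables[WIKI_REPO]=${CI_PROJECT_PATH}.wiki"
      --form "variables[UPLOAD_URL]=docs.example.com/project-d"
      "https://gitlab.example.com/api/v4/projects/12345/trigger/pipeline"
```

Project A’s jobs then read `WIKI_REPO`, `UPLOAD_URL`, etc. as ordinary CI variables, so a single generator project can serve all 40 wikis.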