I have a pipeline that builds a .zip file; that is what the customer should get.
When I create a tag like 0.1.0, I would expect to be able to configure the .yml to produce the .zip and provide some way to upload it.
Unfortunately, this seems to be a very big thing to accomplish without hacking around the .yml.
The cleanest solution, for me, was to use the GitLab release-cli, but either I am using it wrong or I misunderstood what it does.
release_job:
  stage: release
  image: registry.gitlab.com/gitlab-org/release-cli:latest
  needs:
    - job: generate_description
      artifacts: true
  rules:
    - if: $CI_COMMIT_TAG  # Run this job when a tag is created manually
  script:
    - >
      release-cli create --name "v$CI_COMMIT_TAG" --description $EXTRA_DESCRIPTION
      --tag-name $CI_COMMIT_TAG --ref $CI_COMMIT_SHA
      --assets-link '{"name":"matrixregistration-$CI_COMMIT_TAG","url":"https://gitlab.com/olze/matrixregistration/-/jobs/artifacts/$CI_COMMIT_TAG/download?job=production","link_type":"other"}'
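One thing I am unsure about in that script line is the quoting: the --assets-link JSON is wrapped in single quotes, and in the shell, variables inside single quotes are not expanded, so the link name and URL would contain the literal text $CI_COMMIT_TAG rather than the tag. A small sketch of the difference (the tag value is just an example):

```shell
# Single vs. double quotes in the shell: $CI_COMMIT_TAG is only expanded
# inside double quotes.
CI_COMMIT_TAG=0.1.0
single='{"name":"matrixregistration-$CI_COMMIT_TAG"}'
double="{\"name\":\"matrixregistration-$CI_COMMIT_TAG\"}"
echo "$single"   # the literal text $CI_COMMIT_TAG survives
echo "$double"   # the tag value is substituted
```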
I don't get any errors, so I would expect some kind of link somewhere to download that built binary. But there is just none; it's only the source I can download.
How do others publish their releases?
You want to download an artifact from a previous build, publish it somewhere, and get the link… whenever you create a tag manually? I'm not familiar with release-cli; I tend to use the Python API for this type of thing. I think you can accomplish your task with something like this… (You might need to install Python and the gitlab library on the Docker image; I'm not sure what's on the image you're using.)
import gitlab
import sys
import zipfile
import argparse

arg_parser = argparse.ArgumentParser(add_help=False)
arg_parser.add_argument('commit_sha', help='GitLab variable CI_COMMIT_SHA')

def main(argv):
    args = arg_parser.parse_args()
    commit_sha = args.commit_sha

    api = gitlab.Gitlab('https://gitlab.com/', private_token='<your private token>')
    api.auth()

    pipeline_job = None
    project = api.projects.get("<name of project relative to https://gitlab.com/>")
    if project:
        pipelines = project.pipelines.list()
        for pipeline in pipelines:
            if pipeline.sha == commit_sha:
                jobs = pipeline.jobs.list()
                for job in jobs:
                    if job.stage == '<name of stage that built .zip>':
                        pipeline_job = job
                        break
                break

    if pipeline_job:
        job = project.jobs.get(pipeline_job.id, lazy=True)
        file_name = '__artifacts.zip'
        with open(file_name, 'wb') as f:
            job.artifacts(streamed=True, action=f.write)
        zip = zipfile.ZipFile(file_name)
        zip.extractall()
        # upload to Google Cloud Storage, S3 bucket and print link
        # also can reupload as artifact attached to this pipeline using extract path

if __name__ == '__main__':
    main(sys.argv)
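For reference, a minimal sketch of how a script like this might be wired into .gitlab-ci.yml, assuming it is saved as download_and_publish.py in the repo and run on a stock Python image (the job name, stage name, and file name are made up; the token placeholder in the script still needs to be filled in, e.g. from a CI variable):

```yaml
publish_job:
  stage: publish
  image: python:3.11
  rules:
    - if: $CI_COMMIT_TAG
  script:
    - pip install python-gitlab
    - python download_and_publish.py "$CI_COMMIT_SHA"
```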
So in order to do a regular CI job (build, test, publish, release), you use another Docker image inside the build declaration (gitlab-ci.yaml) which has Python and the gitlab library installed, just to be able to publish the just-built artifact? That sounds even more hacky than the GitLab release-cli command line.
Thanks for your help, but I'm afraid that does not sound like a solution to me.