"simple" python project to be deployed, how?


Setup: 5 VMs running gitlab-runner with Docker.

In my team we have some small Python projects, some of which contain only a single Python file and possibly a YAML file for config.

From time to time we work on these as a team, and they need to be deployed to a single server inside our network.
Another project, which sees more work but only contains configuration files, is deployed through an SSH command.

In that project we do this:

  variables:
    DOCKER_DRIVER: overlay

  image: gitlab.internal.company/dockerproxy/dependency_proxy/containers/alpine:3.12

  before_script:
    - apk --update add git openssh-client bash --no-cache
    - eval $(ssh-agent -s)
    - bash -c 'ssh-add <(echo "$SSH_PRIVATE_KEY")'
    - mkdir ~/.ssh
    - echo -e "Host server01\n\tHostName\n\tUser root" > ~/.ssh/config
    - echo -e "Host *\n\tStrictHostKeyChecking no\n\n" >> ~/.ssh/config
    - echo -e "nameserver\nsearch internal.company\n" > /etc/resolv.conf

  deploy:
    stage: deploy
    environment:
      name: production
    script:
      - ssh root@server01 "cd /srv && git checkout master && git pull origin master && exit"
    only:
      - master

I don’t like this method: it’s super insecure to have an SSH key authorized for root@ on that specific server and to keep the private key in a CI variable inside my project.

I know I can use a deploy token to get read access to my repo(s), but how can I easily trigger something on a remote server?
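For the read-access part, a deploy token goes into the HTTPS clone URL as basic-auth credentials. A minimal sketch in Python (the host, group/project and token values here are hypothetical; GitLab generates the token username, e.g. "gitlab+deploy-token-1", and shows the secret once at creation):

```python
import subprocess

def build_clone_url(host, token_user, token, repo):
    """HTTPS clone URL with the deploy token as basic-auth credentials."""
    return f"https://{token_user}:{token}@{host}/{repo}.git"

def clone(url, dest):
    # --depth 1 keeps the checkout small for deploy-only clones
    subprocess.run(["git", "clone", "--depth", "1", url, dest], check=True)

if __name__ == "__main__":
    # hypothetical values for illustration only
    url = build_clone_url("gitlab.internal.company",
                          "gitlab+deploy-token-1", "s3cr3t",
                          "myteam/myproject")
    clone(url, "/opt/myproject-v2")
```

With a token scoped to read_repository only, leaking it exposes read access to one repo instead of root on the server.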

I am currently thinking about a small “API” deployed on the server that could be triggered with curl. It would receive a version and project name, then use a deploy token to download/clone the repo or an artifact,
put it in e.g. /opt with the version number, and update a symlink.
So it ends up as e.g.

/opt/project -> /opt/project-v2
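The idea above could be sketched with only the standard library: a tiny HTTP handler that clones the requested tag via a deploy token and then flips the symlink atomically. Everything here is an assumption to illustrate the shape, the host name, the /opt layout, the query parameters, and the environment variables holding the token:

```python
import os
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

GITLAB_HOST = "gitlab.internal.company"              # assumption: your GitLab host
TOKEN_USER = os.environ.get("DEPLOY_TOKEN_USER", "") # deploy token username
TOKEN = os.environ.get("DEPLOY_TOKEN", "")           # deploy token secret
BASE_DIR = "/opt"

def clone_version(group, project, version, base=BASE_DIR):
    """Clone the given tag into <base>/<project>-<version>; skip if already there."""
    dest = os.path.join(base, f"{project}-{version}")
    if not os.path.isdir(dest):
        url = f"https://{TOKEN_USER}:{TOKEN}@{GITLAB_HOST}/{group}/{project}.git"
        subprocess.run(
            ["git", "clone", "--depth", "1", "--branch", version, url, dest],
            check=True,
        )
    return dest

def switch_symlink(base, project, version):
    """Atomically point <base>/<project> at <base>/<project>-<version>."""
    target = os.path.join(base, f"{project}-{version}")
    link = os.path.join(base, project)
    tmp = link + ".tmp"
    if os.path.lexists(tmp):
        os.remove(tmp)
    os.symlink(target, tmp)
    os.replace(tmp, link)  # rename over the old symlink, atomic on POSIX

class DeployHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # expects e.g. POST /deploy?group=myteam&project=myproject&version=v2
        qs = parse_qs(urlparse(self.path).query)
        try:
            group = qs["group"][0]
            project = qs["project"][0]
            version = qs["version"][0]
            clone_version(group, project, version)
            switch_symlink(BASE_DIR, project, version)
            self.send_response(200)
        except (KeyError, subprocess.CalledProcessError):
            self.send_response(400)
        self.end_headers()

if __name__ == "__main__":
    # bind to localhost only; put auth/TLS in front before exposing it further
    HTTPServer(("127.0.0.1", 8080), DeployHandler).serve_forever()
```

The symlink flip via a temporary name plus rename means a reader of /opt/project always sees either the old or the new version, never a missing link mid-deploy.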

We don’t use Ansible or Chef, so I can’t trigger something with those, nor do we use Kubernetes (yet).