Hi,
We have 5 VMs running gitlab-runner with Docker.
In my team we have some small Python projects; they may contain only a single Python file and possibly a YAML file for configuration.
From time to time we need to work on these as a team, and they need to be deployed to a single server inside our network.
Another project, which sees more active work, contains only configuration files and is deployed through an SSH command.
In that project we do this:
```yaml
variables:
  DOCKER_DRIVER: overlay
  GIT_STRATEGY: none

image: gitlab.internal.company/dockerproxy/dependency_proxy/containers/alpine:3.12

before_script:
  - apk --update add git openssh-client bash --no-cache
  - eval $(ssh-agent -s)
  - bash -c 'ssh-add <(echo "$SSH_PRIVATE_KEY")'
  - mkdir -p ~/.ssh
  - echo -e "Host server01\n\tHostName 10.100.10.47\n\tUser root" > ~/.ssh/config
  - echo -e "Host *\n\tStrictHostKeyChecking no\n\n" >> ~/.ssh/config
  - echo -e "nameserver 10.2.1.10\nsearch internal.company\n" > /etc/resolv.conf

deploy_production:
  stage: deploy
  environment:
    name: production
  script:
    - ssh root@server01 "cd /srv && git checkout master && git pull origin master && exit"
  only:
    - master
```
I don’t like this method: it’s very insecure to have an SSH key authorized for root@ on that specific server and to store the private key in a CI variable in my project.
I know I can use a deploy token to get read-only access to my repo(s), but how can I easily trigger something on a remote server?
I am currently thinking about a small “API” deployed on the server that could be triggered with curl. It would receive a version and a project name, then use a deploy token to download/clone the repo or artifact, put it e.g. in /opt with the version number, and point a symlink at it.
So it would end up like:

```
/opt/project-v2
/opt/project -> /opt/project-v2
```
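To make the idea concrete, here is a minimal sketch of such a deploy endpoint in Python. Everything in it is an assumption for illustration: the port, the `/team/` group path, the `gitlab+deploy-token-1` username, the `DEPLOY_TOKEN` environment variable, and the convention that the version is a git tag/branch. It is a sketch, not a hardened implementation (no authentication on the endpoint, for one).

```python
# Hypothetical sketch of the small deploy "API" described above.
# All names, paths, and the token handling are assumptions, not an existing tool.
import os
import re
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs, urlparse

DEPLOY_ROOT = "/opt"
GITLAB_HOST = "gitlab.internal.company"
DEPLOY_TOKEN_USER = "gitlab+deploy-token-1"        # assumed deploy token username
DEPLOY_TOKEN = os.environ.get("DEPLOY_TOKEN", "")  # assumed env var on the server

NAME_RE = re.compile(r"^[A-Za-z0-9_.-]+$")  # reject path tricks like "../evil"

def target_paths(project: str, version: str) -> tuple[str, str]:
    """Validate inputs and return (versioned_dir, symlink) under DEPLOY_ROOT."""
    if not (NAME_RE.match(project) and NAME_RE.match(version)):
        raise ValueError("bad project or version name")
    versioned = os.path.join(DEPLOY_ROOT, f"{project}-{version}")
    link = os.path.join(DEPLOY_ROOT, project)
    return versioned, link

def deploy(project: str, version: str) -> str:
    """Clone the tagged version next to the older ones, then flip the symlink."""
    versioned, link = target_paths(project, version)
    url = f"https://{DEPLOY_TOKEN_USER}:{DEPLOY_TOKEN}@{GITLAB_HOST}/team/{project}.git"
    subprocess.run(
        ["git", "clone", "--depth", "1", "--branch", version, url, versioned],
        check=True,
    )
    # Switch atomically: create the new symlink under a temp name, then rename
    # over the old one, so /opt/project never points at a half-deployed tree.
    tmp = link + ".tmp"
    os.symlink(versioned, tmp)
    os.replace(tmp, link)
    return versioned

class DeployHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        qs = parse_qs(urlparse(self.path).query)
        try:
            deploy(qs["project"][0], qs["version"][0])
        except Exception as exc:
            self.send_response(400)
            self.end_headers()
            self.wfile.write(str(exc).encode())
            return
        self.send_response(200)
        self.end_headers()

# To run on the server:
#   HTTPServer(("127.0.0.1", 8043), DeployHandler).serve_forever()
```

A pipeline job could then trigger it with something like `curl -X POST "http://server01:8043/deploy?project=myproj&version=v2"`, so the runner never needs SSH access to the box at all.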
We don’t use Ansible or Chef, so I can’t trigger anything with those, nor do we use Kubernetes (yet).