I ran into an issue when running a CI/CD job that deploys to Kubernetes on GKE.
Here is my .gitlab-ci.yml:
variables:
  KUBE_CONTEXT: datran22/k8s-connection:k8s-connection
  AGENT_ID: 1077575

stages:
  - build
  - deploy

build_image:
  image: docker
  stage: build
  services:
    - docker:dind
  script:
    - docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY
    - docker build -t $CI_REGISTRY/datran22/k8s-data/sample:v1 .
    - docker push $CI_REGISTRY/datran22/k8s-data/sample:v1
    - echo "Image built"

deploy_project:
  stage: deploy
  allow_failure: true
  image:
    name: bitnami/kubectl:1.21
    entrypoint: ['']
  script:
    - kubectl config view
    - kubectl config get-contexts
    - kubectl config use-context "$KUBE_CONTEXT"
    - kubectl get pods
    - kubectl get nodes -o wide
    - echo "Deployment LOGIN-APP TO K8S"
    - kubectl apply -f $CI_PROJECT_DIR/k8s-file/.
    - kubectl get pod
    - kubectl get svc
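In case it is relevant: my understanding (which may be wrong) is that for the `datran22/k8s-connection:k8s-connection` context to appear in the job's kubeconfig at all, the agent project has to authorize this project for CI access in its agent configuration file. This is what I believe that file should look like, with the project path below being my assumption:

```yaml
# .gitlab/agents/k8s-connection/config.yaml (in the datran22/k8s-connection project)
# Assumption: without this ci_access grant, the agent's context is not
# injected into CI jobs of other projects, so kubectl sees an empty kubeconfig.
ci_access:
  projects:
    - id: datran22/k8s-data
```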
Here is the log from the GitLab runner:
Running with gitlab-runner 16.3.0~beta.108.g2b6048b4 (2b6048b4)
on green-4.saas-linux-small-amd64.runners-manager.gitlab.com/default ntHFEtyX, system ID: s_8990de21c550
feature flags: FF_USE_IMPROVED_URL_MASKING:true, FF_RESOLVE_FULL_TLS_CHAIN:false
Preparing the "docker+machine" executor
00:08
Using Docker executor with image bitnami/kubectl:1.21 ...
Pulling docker image bitnami/kubectl:1.21 ...
Using docker image sha256:75f379a39fe70239f824e0a4ae8d16fb7883cd6b69dd1bc7965d43ee97ab790b for bitnami/kubectl:1.21 with digest bitnami/kubectl@sha256:bba32da4e7d08ce099e40c573a2a5e4bdd8b34377a1453a69bbb6977a04e8825 ...
Preparing environment
00:02
Running on runner-nthfetyx-project-51447527-concurrent-0 via runner-nthfetyx-s-l-s-amd64-1698033050-3d726380...
Getting source from Git repository
00:01
Fetching changes with git depth set to 20...
Initialized empty Git repository in /builds/datran22/k8s-data/.git/
Created fresh repository.
Checking out 9970d422 as detached HEAD (ref is main)...
Skipping Git submodules setup
$ git remote set-url origin "${CI_REPOSITORY_URL}"
Executing "step_script" stage of the job script
00:00
Using docker image sha256:75f379a39fe70239f824e0a4ae8d16fb7883cd6b69dd1bc7965d43ee97ab790b for bitnami/kubectl:1.21 with digest bitnami/kubectl@sha256:bba32da4e7d08ce099e40c573a2a5e4bdd8b34377a1453a69bbb6977a04e8825 ...
$ kubectl config view
apiVersion: v1
clusters: null
contexts: null
current-context: ""
kind: Config
preferences: {}
users: null
$ kubectl config get-contexts
CURRENT NAME CLUSTER AUTHINFO NAMESPACE
$ kubectl config use-context "$KUBE_CONTEXT"
error: no context exists with the name: "datran22/k8s-connection:k8s-connection"
Cleaning up project directory and file based variables
00:01
ERROR: Job failed: exit code 1
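As shown above, `kubectl config view` prints an empty config (`clusters: null`, `contexts: null`), so the context is missing before `use-context` even runs. As a sanity check, this is the debug step I was thinking of adding to the `deploy_project` script (my understanding is that GitLab sets KUBECONFIG for the job only when an agent context is actually available to it; please correct me if that is wrong):

```yaml
# Hypothetical debug lines for the deploy_project job's script section:
- echo "KUBECONFIG=$KUBECONFIG"          # empty if no agent kubeconfig was injected
- kubectl config get-contexts || true    # list whatever contexts the job received
```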
(Screenshot: status of the agent)
Please give me some advice on this issue. Many thanks for your help.