Background: I have 8-9 private ClusterIP Spring-based microservices in a GKE cluster. All of the microservices have integration tests bundled with them. I am using Bitbucket, with Maven as the build tool.
All of the microservices talk to each other via REST calls with URLs of the form http://<service-name>:8080/rest/api/fetch
Requirement: I have a testing environment ready with all the Docker images up on the GKE test cluster. I want that as soon as I merge code to master for service-A, the pipeline should deploy the image to test-env and run the integration test cases. If the test cases pass, it should deploy to the QA environment; otherwise it should roll the image of service-A back to the previous one.
Issue: On every code merge to master, I am able to run the JUnit test cases of service-A, build its Docker image, push it to GCR and deploy it on the test-env cluster. But how can I trigger the integration test cases after the deployment, and roll back to the previously deployed image if the integration test cases fail? Is there any way?
TIA
If you use GitLab CI/CD you can break the pipeline into stages as follows:
stages:
  - compile
  - build
  - test
  - push
  - review
  - deploy
where you compile the code in the first stage, then build the Docker images from it in the next, and then pull the images and run them to do all your tests (including the integration tests).
Here is a mockup of how it could look:
compile-stage:
  stage: compile
  script:
    - echo 'Compiling Application'
    # - bash my compile script
  # Compile artifacts can be used in the build stage.
  artifacts:
    paths:
      - out/dist/dir
    expire_in: 1 week
build-stage:
  stage: build
  script:
    - docker build . -t "${CI_REGISTRY_IMAGE}:testversion"   ## Dockerfile should make use of out/dist/dir
    - docker push "${CI_REGISTRY_IMAGE}:testversion"
test-stage1:
  stage: test
  script:
    - docker run ${CI_REGISTRY_IMAGE}:testversion bash unit_test.sh
test-stage2:
  stage: test
  script:
    - docker run -d ${CI_REGISTRY_IMAGE}:testversion
    - ./integration_test.sh
## You will only push the latest image if the build passes all the tests.
push-stage:
  stage: push
  script:
    - docker pull ${CI_REGISTRY_IMAGE}:testversion
    - docker tag ${CI_REGISTRY_IMAGE}:testversion ${CI_REGISTRY_IMAGE}:latest
    - docker push ${CI_REGISTRY_IMAGE}:latest
## An app will be deployed on staging only if it has passed all the tests.
## The idea of CI/CD is generally that you run all automated tests before even deploying to staging; staging can then be used for user acceptance and quality assurance tests, etc.
deploy-staging:
  stage: review
  environment:
    name: review/$CI_BUILD_REF_NAME
    url: https://$CI_ENVIRONMENT_SLUG.$KUBE_INGRESS_BASE_DOMAIN
    on_stop: stop_review
  only:
    - branches
  script:
    - kubectl apply -f deployments.yml
## The deployment to the production environment is manual and happens only when a version tag is pushed.
deploy-production:
  stage: deploy
  environment:
    name: prod
    url: https://$CI_ENVIRONMENT_SLUG.$KUBE_INGRESS_BASE_DOMAIN
  only:
    - tags
  script:
    - kubectl apply -f deployments.yml
  when: manual
I hope the above snippet helps you. If you want to learn more about deploying microservices on GKE using GitLab CI/CD, read this.
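To cover the rollback part of the question, here is a minimal, hedged sketch of how the deploy-to-test / integration-test / rollback flow could be expressed as extra GitLab CI jobs; the stage names, the deployment name service-a, integration_test.sh and deployments.yml are assumptions, not part of the snippet above. The key piece is when: on_failure, which makes a job run only if a job in an earlier stage failed:
stages:
  - deploy-test
  - integration-test
  - rollback

deploy-test-env:
  stage: deploy-test
  script:
    - kubectl apply -f deployments.yml              # roll out the new service-A image to the test cluster
    - kubectl rollout status deployment/service-a   # block until the rollout succeeds or times out

integration-test:
  stage: integration-test
  script:
    - ./integration_test.sh                         # a non-zero exit code fails the job

rollback-test-env:
  stage: rollback
  when: on_failure                                  # runs only when a job in an earlier stage failed
  script:
    - kubectl rollout undo deployment/service-a     # revert service-A to the previously deployed image
If the rollback job runs, the pipeline as a whole still counts as failed, so nothing is promoted to QA in that run.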
You can create different steps for each part:
pipelines:
  branches:
    BRANCH_NAME:
      - step:
          script:
            - BUILD
      - step:
          script:
            - DEPLOY
      - step:
          script:
            - First set of JUnit tests
      - step:
          script:
            - Run integration tests (here you can add the rollback if they fail; see the sketch below)
      - step:
          script:
            - Upload to QA
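To make the rollback concrete, a minimal sketch of how the last two steps could look in bitbucket-pipelines.yml (the step names, the deployment name service-a, integration_test.sh and qa-deployments.yml are assumptions):
- step:
    name: Integration tests on test-env
    script:
      # If the tests fail, undo the rollout and still fail the step so the QA step never runs.
      - ./integration_test.sh || (kubectl rollout undo deployment/service-a && exit 1)
- step:
    name: Deploy to QA
    script:
      - kubectl apply -f qa-deployments.yml   # runs only if every previous step succeeded
Bitbucket Pipelines stops at the first failing step, so the QA step is skipped automatically when the integration tests fail.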
You should probably check out what Tekton offers. The Tekton Pipelines project provides Kubernetes-style resources for declaring CI/CD-style pipelines.
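For illustration only, a minimal Tekton Task that could run the Maven integration test suite as one step of such a pipeline (the task name, container image and Maven profile are assumptions):
apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: integration-tests
spec:
  steps:
    - name: run-integration-tests
      image: maven:3.8-openjdk-11        # assumed Maven image
      script: |
        # A failing test fails the TaskRun and therefore the surrounding PipelineRun.
        mvn -B verify -Pintegration-tests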
There are many ways you can do it. From the above information it's not clear which CI tool you are using.
Let's say you are using Bamboo: you can create a task for this and include it in the SDLC process. The task can mostly be a Bamboo script or an Ansible script.
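As a rough illustration of the Ansible variant (the playbook structure, the deployment name service-a and integration_test.sh are assumptions), the tasks could run the tests and roll back on failure:
- name: Run integration tests against test-env
  ansible.builtin.command: ./integration_test.sh
  register: it_result
  ignore_errors: true

- name: Roll service-A back if the tests failed
  ansible.builtin.command: kubectl rollout undo deployment/service-a
  when: it_result.rc != 0

- name: Fail the play so the pipeline stops before QA
  ansible.builtin.fail:
    msg: Integration tests failed; service-A was rolled back.
  when: it_result.rc != 0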
You could also create a separate shell script to run the integration test suite after deployment.
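A minimal sketch of such a script, assuming kubectl access to the test cluster from the pipeline and a deployment named service-a (both assumptions):
#!/usr/bin/env bash
set -euo pipefail

DEPLOYMENT="service-a"    # assumed name of the deployment in the test cluster

# Wait until the newly deployed image is rolled out and ready.
kubectl rollout status "deployment/${DEPLOYMENT}" --timeout=300s

# Run the integration test suite; roll back and fail the pipeline if it breaks.
if ./integration_test.sh; then
    echo "Integration tests passed; the pipeline can promote the image to QA."
else
    echo "Integration tests failed; rolling ${DEPLOYMENT} back."
    kubectl rollout undo "deployment/${DEPLOYMENT}"
    exit 1
fi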