What is the advantage of building a new Docker image on each deploy instead of just updating the application?

10/26/2018

I am migrating a Node.js application to Kubernetes on GCP. In CI tutorials, I see the updated application being copied into a new Docker image, which is then pushed to GCR.

Uploading an image is slow compared to updating only the code, so what exactly is gained by pushing a new image containing the application?

-- Emerson MS
continuous-integration
docker
google-container-registry
kubernetes

3 Answers

10/26/2018

You are missing the whole Docker philosophy and the concept of immutable infrastructure. Docker and other container-based technologies were originally adopted to address the "matrix from hell" shown below.

[image: the matrix from hell]

[image: the solution]

Entire books have been written to answer the question of why to ship images rather than copy code. The short answer is: use Docker images, and address the slowness with optimizations such as minimal images, minimal layers, and caching.

Minimal docker images
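To illustrate the layer-caching point, here is a minimal sketch of a Node.js Dockerfile that copies the dependency manifests before the application code, so the dependency layers stay cached across builds (the base image, file names, and entry point are assumptions, not taken from the answers above):

```dockerfile
# Small base image keeps the layers you push minimal
FROM node:18-alpine

WORKDIR /app

# Copy only the dependency manifests first, so this layer and the
# npm install layer below are cached until the manifests change
COPY package.json package-lock.json ./
RUN npm ci --omit=dev

# Copy the application code last; on a code-only change, only this
# layer (and the ones after it) is rebuilt and re-pushed
COPY . .

CMD ["node", "server.js"]
```

With this ordering, a typical deploy that touches only application code uploads just the final code layer, not the dependency layers.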

-- Ijaz Ahmad Khan
Source: StackOverflow

10/26/2018

The philosophy of Docker is simple - layers are reusable [1]. As long as the layers have not changed, they are reused across images. As long as you keep your application's layer as the last few, the base layers can be reused, keeping the number of layers pushed to a minimum. You should consider using multi-stage builds to minimise shipping build-stage dependencies with your container. Hasura.io has an excellent post[2] on using multi-stage builds for NodeJS apps effectively.
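As a hedged sketch of what such a multi-stage build might look like for a Node.js app (the stage name, file names, and `npm run build` script are assumptions; see the Hasura post for a fuller treatment):

```dockerfile
# Build stage: has dev dependencies and build tooling
FROM node:18-alpine AS build
WORKDIR /app
COPY package.json package-lock.json ./
RUN npm ci
COPY . .
RUN npm run build

# Runtime stage: only production dependencies and the build output,
# so build-stage tooling never ships in the final image
FROM node:18-alpine
WORKDIR /app
COPY package.json package-lock.json ./
RUN npm ci --omit=dev
COPY --from=build /app/dist ./dist
CMD ["node", "dist/server.js"]
```

The final image contains only the second stage's layers, which keeps the pushed image small.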

  1. https://www.infoworld.com/article/3077875/linux/containers-101-docker-fundamentals.html
  2. https://blog.hasura.io/an-exhaustive-guide-to-writing-dockerfiles-for-node-js-web-apps-bbee6bd2f3c4
-- Aditya Sundaramurthy
Source: StackOverflow

10/26/2018

The image should be pushed every time to ensure each version is tagged according to the version of code it contains.
You can overcome the slowness of pushing the entire image every time by structuring your image so that the application code is copied in as late as possible in the build.
That way the big layers already exist in the registry and you won't have to push them each time.
Check out this guide for creating efficient docker images.
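For example, a CI step along these lines tags each build with the commit it contains before pushing (the project and image names here are hypothetical):

```shell
# Tag the image with the exact code version it contains
TAG=$(git rev-parse --short HEAD)
docker build -t gcr.io/my-project/my-app:"$TAG" .

# Only layers missing from the registry are uploaded; with the code
# copied in last, that is usually just the code layer
docker push gcr.io/my-project/my-app:"$TAG"
```

Kubernetes can then reference the image by that tag, so every deployed version is traceable back to a commit.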

-- Yaron Idan
Source: StackOverflow