Monolith Docker application with webpack

9/11/2019

I am running my monolith application in a Docker container on Kubernetes (GKE).

The application has Python and Node dependencies, and uses webpack to build the front-end bundle.

We have implemented CI/CD, and it takes around 5-6 minutes to build and deploy a new version to the Kubernetes cluster.

The main goal is to reduce the build time as much as possible. The Dockerfile I have written is multi-stage.

Webpack takes most of that time generating the bundle. I am already using a high-spec worker to build the Docker image.

To reduce the time further, I tried the Kaniko builder.

Issue:

Docker's layer caching works perfectly for the Python code. But whenever a JS or CSS file changes, the bundle has to be regenerated, and instead of generating a new bundle the build reuses the cached layer.

Is there any way to choose between building a new bundle and using the cache by passing some value to the Dockerfile?
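
One pattern I came across is a cache-busting build argument declared just before the bundle step (splitting npm install and the bundle build into separate RUN instructions, so only the bundle layer is invalidated); the CACHE_BUST name and my-app tag below are arbitrary, and I am not sure this is the right approach:

# A changed CACHE_BUST value causes a cache miss on the RUN instructions below
ARG CACHE_BUST=1
RUN npm run sass && npm run build

docker build --build-arg CACHE_BUST=$(date +%s) -t my-app .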

Here is my Dockerfile:

FROM python:3.5 AS python-build
WORKDIR /app
COPY requirements.txt ./
RUN pip install -r requirements.txt  &&\
    pip3 install Flask-JWT-Extended==3.20.0
ADD . /app

FROM node:10-alpine AS node-build
WORKDIR /app
COPY --from=python-build ./app/app/static/package.json app/static/
COPY --from=python-build ./app ./
WORKDIR /app/app/static
RUN npm cache verify && npm install && npm install -g --unsafe-perm node-sass && npm run sass  && npm run build


FROM python:3.5-slim
COPY --from=python-build /root/.cache /root/.cache
WORKDIR /app
COPY --from=node-build ./app ./
RUN apt-get update -yq \
    && apt-get install curl -yq \
    && pip install -r requirements.txt
EXPOSE 9595
CMD python3 run.py
-- Harsh Manvar
docker
google-cloud-platform
google-kubernetes-engine
kubernetes
webpack

1 Answer

9/11/2019

I would suggest creating separate build pipelines for your base Docker images, since the npm and pip requirements don't change that often. This will improve the speed enormously, by removing the round trips to the npm and pip registries from every build.

Use a private Docker registry (the official registry image, or something like VMware Harbor or Sonatype Nexus OSS).

You store those builder images in your registry and reuse them whenever something in the project changes.
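
For example, publishing a builder image could look like this (registry.example.com is a placeholder for your registry host):

docker tag python-builder:YOUR_TAG registry.example.com/python-builder:YOUR_TAG
docker push registry.example.com/python-builder:YOUR_TAG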

Something like this:

First Docker builder // python-builder:YOUR_TAG (gitrev, date, etc.)

docker build --no-cache -t python-builder:YOUR_TAG -f Dockerfile.python.build .

FROM python:3.5 
WORKDIR /app
COPY requirements.txt ./
RUN pip install -r requirements.txt  &&\
    pip3 install Flask-JWT-Extended==3.20.0

Second Docker builder // js-builder:YOUR_TAG (gitrev, date, etc.)

docker build --no-cache -t js-builder:YOUR_TAG -f Dockerfile.js.build .  

FROM node:10-alpine
WORKDIR /app
COPY app/static/package.json /app/app/static/
WORKDIR /app/app/static
RUN npm cache verify && npm install && npm install -g --unsafe-perm node-sass 

Your application's multi-stage build:

docker build --no-cache -t app_delivery:YOUR_TAG -f Dockerfile.app .  

FROM python-builder:YOUR_TAG AS python-build
# Nothing to do here: the pip dependencies are already baked into this image in the other build process

FROM js-builder:YOUR_TAG AS node-build
# ADD your JS/CSS source files here (only the ones npm needs to build the bundle)
RUN npm run sass && npm run build

FROM python:3.5-slim
# Your original clean app
COPY . /app
# COPY --from=python-build <only the files installed by the pip command>
WORKDIR /app
# COPY --from=node-build <only the bundle files generated by npm>
RUN apt-get update -yq \
    && apt-get install curl -yq \
    && pip install -r requirements.txt
EXPOSE 9595
CMD python3 run.py

A question: why do you install curl and run the pip install -r requirements.txt command again in the final Docker image? Running apt-get update and install on every build without cleaning the apt cache (/var/cache/apt) also produces a bigger image.
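
If curl really is needed at runtime, a common pattern is to install it and remove the apt lists in the same layer, for example:

RUN apt-get update -yq \
    && apt-get install -yq --no-install-recommends curl \
    && rm -rf /var/lib/apt/lists/* /var/cache/apt/*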

As a suggestion, use the docker build command with the --no-cache option to avoid stale cached results:

docker build --no-cache -t your_image:your_tag -f your_dockerfile .

Remarks:

You'll have 3 separate Dockerfiles, as listed above. Build Docker images 1 and 2 only when your python-pip or node-npm requirements change; otherwise keep them fixed for your project. When a dependency requirement does change, rebuild the builder image involved and point the multi-stage build at the latest tag, as sketched below.
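
A rough sketch of that rule as a CI shell step (the paths, tag scheme, and Dockerfile names are assumptions based on the layout above):

# Rebuild the builder images only when the dependency manifests changed
if git diff --name-only HEAD~1 -- requirements.txt app/static/package.json | grep -q .; then
    TAG=$(git rev-parse --short HEAD)
    docker build --no-cache -t python-builder:$TAG -f Dockerfile.python.build .
    docker build --no-cache -t js-builder:$TAG -f Dockerfile.js.build .
fi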

This way each build compiles only the source code of your project (CSS, JS, Python), and you also get reproducible builds.

To optimize your environment and simplify copying files across the multi-stage builders, try using a virtualenv for the Python build.
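
A minimal sketch of that idea, assuming the dependencies go into /opt/venv so the final stage copies a single directory instead of relying on pip's cache:

FROM python:3.5 AS python-build
# Install the dependencies into an isolated virtualenv
RUN python -m venv /opt/venv
ENV PATH="/opt/venv/bin:$PATH"
COPY requirements.txt ./
RUN pip install -r requirements.txt

FROM python:3.5-slim
# Copy the whole virtualenv; no second pip install needed in the final image
COPY --from=python-build /opt/venv /opt/venv
ENV PATH="/opt/venv/bin:$PATH"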

-- madduci
Source: StackOverflow