PPL2021 — Docker Orchestration — Tupperware For Your Code

Photo by Dominik Lückmann on Unsplash

Deploying your program to the server is hard, but with Docker, it’s easier

Have you ever wondered how your code runs on the server? I remember it clearly, the first time I initialised my first project. I had to install Python first, then pip, then all the dependencies, then the database, and let’s not forget the NPM libraries if you use them for your front-end; the list goes on and on. You can’t expect the server to have the same programs and libraries as your machine, and setting all of this up again is both time-consuming and resource-consuming.

What is Docker?

Docker is a platform built on container technology. Yes, you heard that right: it’s basically a container, like your mom’s Tupperware. Docker will ease your development pain by bundling all your dependencies and software libraries in one place, so you can deploy your software on any computer (or server) without installing every dependency again and again, which is exhausting.

VM-based technology vs Container-based clouds. Source: A Hybrid Genetic Programming Hyper-Heuristic Approach for Online Two-level Resource Allocation in Container-based Clouds — Scientific Figure on ResearchGate. Available from: https://www.researchgate.net/figure/Container-based-cloud-vs-VM-based-cloud_fig1_334695896 [accessed 2 May, 2021]

Why Docker?

The main advantage of using Docker for your project is fewer errors. As I said before, Docker saves all your libraries and dependencies. There are so many cases where your code runs well on your local machine, but when you deploy the application it becomes a mess, because you forgot to include a dependency in your Gradle file, requirements file, XML file, or one of the many other configuration files out there.
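As a quick illustration (not from the original article), one common way to avoid the forgotten-dependency problem in a Python project is to pin every installed package explicitly before building an image:

```shell
# record the exact version of every installed package, so any other
# machine (or container) can reproduce the same environment
pip freeze > requirements.txt

# restore the same environment elsewhere with:
#   pip install -r requirements.txt
```

The Docker image can then install from this one file, instead of relying on whatever happens to be installed on the host.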

Photo by Tim Gouw on Unsplash

Docker Architecture

Source: https://docs.docker.com/get-started/overview/
The basic workflow against the architecture above is: build an image, push it to a registry, then run a container from it.
# build the image
docker build -t registry.docker.python:3.8 .
# push the image
docker push registry.docker.python:3.8
# run your container, publishing a port (the 8000:8000 mapping here is an example)
docker run -it -p 8000:8000 registry.docker.python:3.8

Setting up the environment

Please note that because we are using Heroku as our server, we can’t use docker-compose; Heroku already runs its own containers. Also, because we use Heroku’s PostgreSQL, we don’t have to declare which database port to listen on. But even though we are not using docker-compose, we still have to set up the Django environment. Because we already have our requirements written down, all Docker has to do is make sure Python and pip are installed.
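For context, a requirements file for a Django-on-Heroku project typically looks something like the following; the packages and pinned versions below are illustrative assumptions, not the project’s actual file:

```text
Django==3.2
gunicorn==20.1.0
psycopg2-binary==2.8.6
```

With this in place, the Dockerfile only needs Python and pip; everything else comes from `pip install -r requirements.txt`.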


Because Heroku already runs its own containers, all we have to do in Dockerfile.web is set up the environment and make sure the Heroku container performs the usual Django routine: makemigrations, migrate, and then runserver. Below is our Dockerfile.web. Then all you have to do is create a heroku.yml to tell the Heroku server to use this configuration every time it builds a new container for our project.
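A minimal sketch of what such a Dockerfile.web could look like; the base image, working directory, and project layout are assumptions, not the team’s actual configuration:

```dockerfile
# Dockerfile.web — sketch only; base image and layout are assumptions
FROM python:3.8-slim

WORKDIR /app

# install the dependencies already listed in requirements.txt
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# the Django routine: makemigrations, migrate, then runserver
# ($PORT is injected by Heroku at run time)
CMD python manage.py makemigrations && \
    python manage.py migrate && \
    python manage.py runserver 0.0.0.0:$PORT
```

And a matching heroku.yml that points Heroku at this file:

```yaml
# heroku.yml — tells Heroku to build the web process from Dockerfile.web
build:
  docker:
    web: Dockerfile.web
```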

Our GitLab CI job builds and pushes the image to the registry on every push to master:
docker-build-master:
  # Official docker image.
  image: docker:latest
  stage: build
  services:
    - docker:dind
  before_script:
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" $CI_REGISTRY
  script:
    - docker build --pull -t "$CI_REGISTRY_IMAGE" .
    - docker push "$CI_REGISTRY_IMAGE"
  only:
    - master

Benefits of using Docker

  • Community Powered — Sure, some features require you to pay (Docker’s servers cost money to run, after all), but if your project is small and you don’t need too many images or dependencies, the free Docker plan is good enough.
  • Sharing is caring — As I said in the point above, because Docker is community-powered, every time you publish a public image on Docker Hub, other people can use it too.
  • Less Error — The point of using a container is to minimise dependency errors, so your code runs the same on your local computer, on the server, on the user’s device, and on your teammate’s machine.

Undergraduate Student Majoring in Computer Science