Python Django with Docker and Gitlab CI

In this article I will describe how we set up Gitlab CI to run tests for a Django project. But first, a couple of words about the tools we were using.

For a project I was specifically asked to build an API using Python Django. So, my first starting point was to google “django cookiecutter”, which immediately brought me to this amazing cookiecutter project. What I am going to demonstrate here is how to quickly set up the project (for the sake of completeness) and use Gitlab Continuous Integration to automatically run unit tests and linters, generate documentation, build a container and release it.

Setup the project

We start by initiating the project using the mentioned cookiecutter project, although you can also use another cookiecutter or build on your existing project; you will probably need to make some alterations here and there. Here is a small list of prerequisites:

  • you have docker installed locally
  • you have Python installed locally
  • you have a Gitlab account and you can push using ssh keys

Now, install cookiecutter and generate the project:

pip install "cookiecutter>=1.4.0"
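The install step alone does not create the project; generating it is done by pointing cookiecutter at the template repository (the URL below is the cookiecutter-django template this article builds on):

```shell
# generate the project skeleton from the cookiecutter-django template;
# cookiecutter will prompt for the project name and the other options
cookiecutter https://github.com/cookiecutter/cookiecutter-django
```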

Provide the options in any way you like, and the Django project will be created. Type y when asked to include Docker (because that is why we are here!).

Walk through the options for the Django cookiecutter

Enter the project, create a git repo and push it there:

cd my_django_api
git init
git add .
git commit -m "first awesome commit"
git remote add origin [email protected]:jwdobken/my-django-api.git
git push -u origin master

Obviously replace my-django-api with your project name and jwdobken with your own Gitlab account name.

You can read here how to develop locally with Docker. It is something I do with all my projects of any type; the development and production environments are more alike, and it has been years since I worked with something like virtual environments, and I am not missing it!
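As a sketch of that local workflow (assuming the cookiecutter defaults, where the local stack is defined in local.yml and the Django dev server listens on port 8000):

```shell
# build the images for the local stack and start it
docker-compose -f local.yml build
docker-compose -f local.yml up

# run one-off commands, e.g. migrations, inside the django service
docker-compose -f local.yml run --rm django python manage.py migrate
```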

Add a test environment

Make a test environment by copying the local environment:

cp local.yml test.yml
cp requirements/local.txt requirements/test.txt
cp -r compose/local compose/test

In compose/test/django/Dockerfile change requirements/local.txt to requirements/test.txt. You can make more alterations to the test environment later.
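Before wiring the test environment into CI, it is worth a sanity check that the copied stack builds and runs locally (a sketch, assuming the service names from the cookiecutter defaults):

```shell
# build the test stack and run the test suite once
docker-compose -f test.yml build
docker-compose -f test.yml run --rm django pytest
```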

The Gitlab-CI file

Finally we get to the meat. Here is the .gitlab-ci.yml file:

image: docker:latest

services:
  - docker:dind

variables:
  DOCKER_HOST: tcp://docker:2375
  DOCKER_DRIVER: overlay2
  # image tags in the Gitlab registry (standard Gitlab CI pattern)
  CONTAINER_TEST_IMAGE: $CI_REGISTRY_IMAGE:$CI_COMMIT_REF_SLUG
  CONTAINER_RELEASE_IMAGE: $CI_REGISTRY_IMAGE:latest

stages:
  - test
  - build
  - release

test:
  stage: test
  image: tiangolo/docker-with-compose
  script:
    - docker-compose -f test.yml build
    # - docker-compose -f test.yml run --rm django pydocstyle
    - docker-compose -f test.yml run --rm django flake8
    - docker-compose -f test.yml run django coverage run -m pytest
    - docker-compose -f local.yml run --rm django coverage html
    - docker-compose -f local.yml run --rm django /bin/sh -c "cd docs && apk add make && make html"
    - docker-compose -f local.yml run django coverage report
  coverage: "/TOTAL.+ ([0-9]{1,3}%)/"
  artifacts:
    paths:
      - htmlcov
      - docs/_build
    expire_in: 5 days

build:
  stage: build
  script:
    - docker login -u gitlab-ci-token -p $CI_BUILD_TOKEN $CI_REGISTRY
    - docker build -t $CONTAINER_TEST_IMAGE -f compose/production/django/Dockerfile .
    - docker push $CONTAINER_TEST_IMAGE

release:
  stage: release
  script:
    - docker login -u gitlab-ci-token -p $CI_BUILD_TOKEN $CI_REGISTRY
    - docker pull $CONTAINER_TEST_IMAGE
    - docker tag $CONTAINER_TEST_IMAGE $CONTAINER_RELEASE_IMAGE
    - docker push $CONTAINER_RELEASE_IMAGE
  only:
    - master

pages:
  stage: release
  script:
    - mkdir -p public/coverage
    - mv htmlcov/* public/coverage
    - mkdir -p public/docs
    - mv -v docs/_build/html/* public/docs
  artifacts:
    paths:
      - public
    expire_in: 30 days
  only:
    - master

The test stage builds the container stack in the test environment, runs the linter (flake8) and the unit tests, copies the html coverage report and captures the total coverage. We also misuse the test build to generate the Sphinx documentation, for which we need to install Make.

The build stage builds the production container and pushes it to the Gitlab container registry.

The release stage pulls the build container and tags it as the latest release before pushing it to the container registry.

The pages job publishes the test coverage and documentation artifacts with Gitlab Pages.

Push your code to Gitlab where you should find a running pipeline.

the pipeline is running

the pipeline has successfully finished

In the container registry of the project you can find two images: the latest master image and the latest release image. The page itself explains how to pull images from here to anywhere.
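Pulling an image from the project registry to any machine then looks roughly like this (the registry path below is an assumption based on the example account and project name used earlier):

```shell
# authenticate against the Gitlab registry, then pull the release image
docker login registry.gitlab.com
docker pull registry.gitlab.com/jwdobken/my-django-api:latest
```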


Gitlab supports badges on the repo page to surface specific information. On the Gitlab project page, go to Settings, then Badges. Here you can add the following badges:

Pipeline status of the master branch:

Test coverage and report:


Note that the URL of Gitlab Pages, for the test coverage report and documentation, is not straightforward. Replace your username with a group name if you work in a group. In the case of a subgroup, provide the full path. Usually I end up with a bit of trial-and-error; this article explains most of it.
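As an illustration of the URL scheme (the placeholders %{project_path} and %{default_branch} are expanded by Gitlab itself; the Pages link is an assumption based on the example username and project name used above):

```
Badge image for the pipeline status:
https://gitlab.com/%{project_path}/badges/%{default_branch}/pipeline.svg

Badge image for the test coverage:
https://gitlab.com/%{project_path}/badges/%{default_branch}/coverage.svg

Link to the published coverage report on Gitlab Pages:
https://jwdobken.gitlab.io/my-django-api/coverage/
```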

status badges shown on the repo page


Finally, I highly recommend checking the existence and quality of your docstrings using pydocstyle. Add the following line to requirements/test.txt and requirements/local.txt in the Code quality section:

pydocstyle==3.0.0

Add the following lines to setup.cfg to configure pydocstyle (the match pattern excludes the numbered migration files; note that the setting must live in a [pydocstyle] section to take effect):

[pydocstyle]
match = (?!\d{4}_).*\.py

And finally add the following line to .gitlab-ci.yml in the script section of the test stage (just after the build):

- docker-compose -f test.yml run --rm django pydocstyle

Be warned that the project does not comply with pydocstyle by default, so you will have to complete the code with docstrings to pass the test again.


We now have a fresh Django project with a neat CI pipeline on Gitlab for automated unit tests, documentation and container image release. You can later include Continuous Deployment to the pipeline; I left it out of the scope, because it depends too much on your production environment. You can read more about Gitlab CI here.

Currently the pipeline is quite slow, mainly caused by the build of the images. The running time can be reduced by caching dependencies.

There is a soft size restriction (10GB) for the registry, as part of the repository size limit. Therefore, when the number of images increases, you will probably need to archive old images manually.


Thanks for reading! If you liked this post, share it with all of your programming buddies!
