Docker-first Python development

In this article, we'll be talking about how to start using Docker for Python development.

I've always been a bit annoyed at how difficult it can be to avoid shipping test code and dependencies with Python applications. A typical build process might look something like:

  1. create a virtual environment
  2. install service dependencies
  3. install test dependencies
  4. run tests
  5. package up code and dependencies into an RPM.
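As a rough sketch, that flow amounts to something like this (file names and the packaging step are illustrative, not the exact commands from any particular project):

```shell
python3 -m venv venv                   # 1. create a virtual environment
. venv/bin/activate
pip install -r requirements.txt        # 2. install service dependencies
pip install -r test_requirements.txt   # 3. install test dependencies (now mixed in)
pytest tests                           # 4. run the tests
# 5. package code + venv into an RPM (e.g. via a packaging tool like fpm)
```

By step 5, the venv being packaged contains both sets of dependencies, which is exactly the intermingling problem described next.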

At this point, my service dependencies and test dependencies are intermingled in the virtual environment. To detangle them, I now have to do something like destroy the venv and create a new one, reinstalling the service dependencies.

Regardless of the packaging method, I don't want to pull down dependencies when I deploy my service.

At Twilio, we are in the process of embracing container-based deployments. Docker containers are great for Python services, as you no longer have to worry about multiple Python versions or virtual environments. You just use an image with exactly the version of Python your service needs and install your dependencies directly into the system.

One thing I've noticed is that while many services are built and packaged as Docker images, few use exclusively Docker-based development environments. Virtual environments and pyenv .python-version files abound!

I recently started writing a new Python service with the knowledge that this would be exclusively deployed via containers. This felt like the right opportunity to go all in on containers and build out a strategy for Docker-first localdev. I set out with the following goals:

  1. don't ship tests and test dependencies with the final image
  2. tests run as part of the Docker build
  3. failing tests will fail the build
  4. IDE (PyCharm) integration

A bit of research (aka Googling) suggested that multi-stage builds might be useful in this endeavor. Eventually I ended up with a Dockerfile that looks something like this:

FROM python:3 as builder
WORKDIR /app
COPY requirements.txt ./
RUN pip install -r requirements.txt
COPY src ./src

FROM builder as tests
COPY test_requirements.txt ./
RUN pip install -r test_requirements.txt
COPY tests ./tests
RUN pytest tests

FROM builder as service
COPY docker-entrypoint.sh ./
ENTRYPOINT ["./docker-entrypoint.sh"]
EXPOSE 3000
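The service stage assumes a docker-entrypoint.sh sitting next to the Dockerfile. The article doesn't show its contents, but a minimal sketch might look like this (the src.app module path is a placeholder for your service's actual entry module):

```shell
#!/bin/sh
set -e
# Any one-time startup work (migrations, config templating) would go here.
# exec replaces the shell so the service runs as PID 1 and receives signals
# (SIGTERM from `docker stop`) directly.
exec python -m src.app
```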

When building an image from this Dockerfile, Docker will build three images, one for each of the FROM statements in the Dockerfile. If you've worked with Dockerfiles before, you know that statement ordering is critical for making efficient use of layer caching, and multi-stage builds are no different. Docker builds each of the stages in the order they are defined. All of the intermediate stages are ephemeral; only the last image is output by the build process.

In this case, the first stage (builder) builds an image with all the service dependencies and code. The second stage (tests) installs the test requirements and test code, and runs the tests. If the tests pass, the build process continues on to the next stage; if the tests fail, the entire build fails. This ensures that only images with passing tests are built! Finally, the last stage (service) builds on top of our builder image, adding the entrypoint script, defining the entrypoint command and exposing port 3000.
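Putting the stages together, one command drives the whole pipeline; because RUN pytest tests sits inside the build, a failing suite aborts it (the image tag here is illustrative):

```shell
# Builds every stage in order. If pytest exits non-zero in the tests
# stage, docker build exits non-zero and nothing gets tagged, so only
# images with passing tests ever get built.
docker build -t my-service .
```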

So how did I do wrt the initial goals?

  1. don't ship tests and test dependencies with the final image ✓
  2. tests run as part of the Docker build ✓
  3. failing tests will fail the build ✓
  4. IDE (PyCharm) integration ❌

I've met most of the goals, but what about the actual development experience? If I open up PyCharm and import my source code, it complains that I have unsatisfied dependencies :( Fortunately PyCharm Professional has the ability to select a python interpreter from inside a Docker image! Cool, but I have to build the image before I can use its interpreter. But thanks to goal #3, if my tests are failing, I can't build my image...

Lucky for us, we can tell docker build to build one of our intermediate stages explicitly via the --target flag, stopping the build after the desired stage. Now if I run docker build --target builder ., I can select the interpreter from the builder image.

Uh oh! The builder image doesn't include my test dependencies! Of course, that's the whole point of the builder image. Let's add another stage we can use for running and debugging our tests.

FROM python:3 as builder
WORKDIR /app
COPY requirements.txt ./
RUN pip install -r requirements.txt
COPY src ./src

FROM builder as localdev
COPY test_requirements.txt ./
RUN pip install -r test_requirements.txt
COPY tests ./tests
CMD ["pytest", "tests"]

FROM localdev as tests
RUN pytest tests

FROM builder as service
COPY docker-entrypoint.sh ./
ENTRYPOINT ["./docker-entrypoint.sh"]
EXPOSE 3000

With the localdev stage, I can build an image with all my service and test code and dependencies. I can even make the localdev container run the tests by default when the container is run. By using the interpreter from this image, I can now debug my failing tests.
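Concretely, building and exercising the localdev stage might look like this (image tags are illustrative):

```shell
# Stop the build after the localdev stage; tests are never executed here,
# so a failing suite doesn't block getting an image to debug with.
docker build --target localdev -t my-service:localdev .

# Since localdev's default command runs the test suite, running the
# container runs the tests.
docker run --rm my-service:localdev
```

Pointing PyCharm at the interpreter inside my-service:localdev gives the IDE both the service and test dependencies.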

Let's take a look again at the initial goals:

  1. don't ship tests and test dependencies with the final image ✓
  2. tests run as part of the Docker build ✓
  3. failing tests will fail the build ✓
  4. IDE (PyCharm) integration ✓

Hooray!

Except there's one thing still bothering me: changes to the service code trigger a reinstallation of our test dependencies. Yuck! Let's take another whack at our Dockerfile:

FROM python:3 as base
WORKDIR /app
COPY requirements.txt ./
RUN pip install -r requirements.txt

FROM base as builder
COPY src ./src

FROM base as localdev
COPY test_requirements.txt ./
RUN pip install -r test_requirements.txt
COPY src ./src
COPY tests ./tests
CMD ["pytest", "tests"]

FROM localdev as tests
RUN pytest tests

FROM builder as service
COPY docker-entrypoint.sh ./
ENTRYPOINT ["./docker-entrypoint.sh"]
EXPOSE 3000

Ok, that seems pretty complicated, so here's a graph of our image topology:

  python:3 ──► base ──► builder ──► service
                └─────► localdev ──► tests

I don't love that the builder and localdev stages both copy over the source directory, but the real question is: does this still meet our initial goals while avoiding excessive re-installs of test dependencies? It seems to work pretty well. Because the requirements files are copied and installed before any source code, Docker's layer caching means dependencies are only re-installed when a requirements file actually changes.

Originally published by Jeremy Moore at dev.to

