A Delicious Dozen Docker Terms You Need to Know


In this article I’ll share a dozen additional terms from the Docker ecosystem that you need to know.

“Everything You Need To Know About Docker” is a series of articles (six so far) that explains different parts of *Docker* in a very simple and straightforward way. Here are the parts so far:

In Part 1 of this series we explored the conceptual landscape of Docker containers. We discussed the reasons Docker containers are important and several ways to think about them. And we made one into a pizza 🍕.

Docker Ecosystem Terms

I’ve broken Docker terms into two categories for easier mental model creation: Essentials and Scaling. Let’s hit the eight essentials first.

Docker Essentials

**Docker Platform** is Docker’s software that provides the ability to package and run an application in a container on any Linux server. Docker Platform bundles code files and dependencies. It promotes easy scaling by enabling portability and reproducibility.

**Docker Engine** is the client-server application. The Docker company divides the Docker Engine into two products. [Docker Community Edition (CE)](https://docs.docker.com/install/) is free and largely based on open source tools. It’s probably what you’ll be using. Docker Enterprise comes with additional support, management, and security features. Enterprise is how the Docker firm keeps the lights on.

Engine makes things run

**Docker Client** is the primary way you’ll interact with Docker. When you use the Docker Command Line Interface (CLI), you type a command into your terminal that starts with docker. Docker Client then uses the Docker API to send the command to the Docker Daemon.
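For example (assuming Docker is installed locally), each of these commands is issued through the Docker Client, which sends it over the Docker API to the Daemon:

```shell
docker version             # show client and server (Daemon) versions
docker pull alpine         # the Daemon downloads the alpine image
docker run alpine echo hi  # the Daemon creates a container and runs it
docker ps -a               # list the containers the Daemon is managing
```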

Diagram from the Docker docs

**Docker Daemon** is the Docker server that listens for Docker API requests. The Docker Daemon manages images, containers, networks, and volumes.

**Docker Volumes** are the best way to store the persistent data that your apps consume and create. We’ll have more to say about Docker Volumes in Part 5 of this series. Follow me to make sure you don’t miss it.
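As a quick sketch (the volume and file names here are invented for illustration), a named volume outlives any single container that mounts it:

```shell
# Create a named volume and write a file into it from one container.
docker volume create app-data
docker run --rm -v app-data:/data alpine sh -c 'echo hello > /data/greeting.txt'

# A brand-new container mounting the same volume still sees the file.
docker run --rm -v app-data:/data alpine cat /data/greeting.txt
```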

Volumes

A **[Docker Registry](https://hub.docker.com/)** is the remote location where Docker images are stored. You push images to a registry and pull images from a registry. You can host your own registry or use a provider’s registry. For example, AWS and Google Cloud have registries.

**Docker Hub** is the largest registry of Docker images. It’s also the default registry. You can find images and store your own images on Docker Hub for free.
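A minimal push/pull sketch, assuming Docker is installed (the my-user/alpine-demo repository name is hypothetical; substitute your own Docker Hub account):

```shell
docker pull alpine                         # pull from Docker Hub, the default registry
docker tag alpine my-user/alpine-demo:1.0  # retag the image under your account
docker login                               # authenticate to Docker Hub
docker push my-user/alpine-demo:1.0        # push the image to your repository
```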

Hubs and spokes

A **[Docker Repository](https://docs.docker.com/docker-hub/repos/)** is a collection of Docker images with the same name and different tags. The *tag* is the identifier for the image.

Usually a repository holds different versions of the same image. For example, *python* is the name of the most popular official Docker image repository on Docker Hub. *python:3.7-slim* refers to the version of the image with the *3.7-slim* tag in the *python* repository. You can push a repository or a single image to a registry.
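Tags look like this on the command line (assuming Docker is installed):

```shell
docker pull python:3.7-slim  # the image with the 3.7-slim tag in the python repository
docker pull python           # no tag given, so Docker assumes the latest tag
docker images python         # list the local images pulled from that repository
```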

Now let’s look at Docker terms related to scaling multiple Docker containers.

Scaling Docker

The following four concepts relate to using multiple containers at once.

**Docker Networking** allows you to connect Docker containers together. Connected Docker containers can be on the same host or on multiple hosts. For more information on Docker networking, see this post.
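A small sketch, assuming Docker is installed (the network and container names are made up): on a user-defined bridge network, containers can reach each other by name.

```shell
docker network create my-net                           # create a user-defined bridge network
docker run -d --name web --network my-net nginx        # start a container on that network
docker run --rm --network my-net alpine ping -c 1 web  # another container reaches it by name
```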

Docker Bridge Network

**Docker Compose** is a tool that makes it easier to run apps that require multiple Docker containers. Docker Compose lets you move commands into a docker-compose.yml file for reuse. The Docker Compose command line interface (CLI) makes it easier to interact with your multi-container app. Docker Compose comes free with your installation of Docker.
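As a hedged sketch, here’s what a hypothetical docker-compose.yml for a two-container app might look like (the service names and port numbers are invented for illustration): a web service built from a local Dockerfile plus a Redis cache. Running docker-compose up would then start both containers together.

```yaml
version: "3"
services:
  web:
    build: .             # build the image from the Dockerfile in this directory
    ports:
      - "5000:5000"      # map host port 5000 to container port 5000
  redis:
    image: redis:alpine  # pull a ready-made image from Docker Hub
```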

**Docker Swarm** is a product to orchestrate container deployment. The official Docker tutorial has you using Docker Swarm in its fourth section. I would suggest you not spend time on Docker Swarm unless you have a compelling reason to do so.

Bee swarm

**Docker Services** are the different pieces of a distributed app. From the docs:

Docker services allow you to scale containers across multiple Docker Daemons and make Docker Swarms possible.

*Originally published by Jeff Hale at https://towardsdatascience.com*

There you have it: a dozen delicious Docker terms you should know.

Recap

Here are one-line explanations to help you keep these dozen terms straight.

Basics

Platform — the software that makes Docker containers possible

Engine — client-server app (CE or Enterprise)

Client — handles Docker CLI so you can communicate with the Daemon

Daemon — Docker server that manages key things

Volumes — persistent data storage

Registry — remote image storage

Docker Hub — default and largest Docker Registry

Repository — collection of Docker images, e.g. Alpine

Scaling

Networking — connect containers together

Compose — time saver for multi-container apps

Swarm — orchestrates container deployment

Services — containers in production

Because we’re keeping with food metaphors, and everyone loves a baker’s dozen, we have one more related term for you: Kubernetes.

One more donut with extra icing and sprinkles

**Kubernetes** automates deployment, scaling, and management of containerized applications. It’s the clear winner in the container orchestration market. Instead of Docker Swarm, use Kubernetes to scale up projects with multiple Docker containers. Kubernetes isn’t an official part of Docker; it’s more like Docker’s BFF.

I have a whole series on Kubernetes in the works. Kubernetes is pretty awesome.

Now that you know the conceptual landscape and common terms I suggest you try out Docker.

Baking with Docker

If you haven’t worked with Docker before, it’s time to get in the kitchen and make something!

Docker runs locally on Linux, Mac, and Windows. If you’re on a Mac or Windows machine, install the latest stable version of Docker Desktop here. As a bonus, it comes with Kubernetes. If you’re installing Docker elsewhere, go here to find the version you need.
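Once the install finishes, you can sanity-check it from a terminal:

```shell
docker --version        # confirm the client is installed
docker run hello-world  # the Daemon pulls a tiny test image and runs it
```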

After you have Docker installed, do the first two parts of the Docker tutorial. Then meet back here for more Docker fun. In the next four parts of this series we’ll dive into Dockerfiles, Docker images, the Docker CLI, and dealing with data. Follow me to make sure you don’t miss the adventure.

