Kaustav Hazra

Docker Tutorial | Introduction to Docker and Containers

Unravel the mysteries of Docker and see how containers make creating and deploying applications easier for developers.

We have been hearing a lot about Docker and containers; they are said to be the next big thing in the world of technology. I decided to explore the enigma that is Docker, and I must say it is really impressive.

Introduction

What is Docker?

Whenever we have to install software, we have to take care of a lot of things. There are many different versions of software available for different operating systems and their different versions. You have to go through the documentation, choose the correct fit for your needs, and then run the executable file. Even after that, you may need to complete some other steps before you are able to use that software. Docker runs containers, which contain the software plus the dependencies that the software requires to run. Just use a docker run command with the name of the image that you want to install, and your software runs in its own container, using its own resources. You do not have to worry about which version of the software suits your operating system. I will demonstrate this with an example of a MongoDB installation.

What Is the Importance of Docker for Developers?

Developers can now simply write their code and create an image. This image will contain all the tools needed for the application to run. The image simply needs to be deployed on a production machine, even one with no other software installed, and the application will run exactly as it did on the development machine.

So How Do We Use Docker?

Once we have installed Docker on our system, we go to Docker Hub or some other registry and search for the software that we want to install. We can then run a PowerShell command, docker run imageName, and the software is ready for use.
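
For example, here is a minimal sketch of that workflow. The nginx image and the container name are purely illustrative choices, not something taken from this article:

# Search Docker Hub for an image (nginx is only an example)
docker search nginx

# Run it; Docker pulls the image automatically if it is not already present locally
docker run -d --name my-nginx nginx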

Difference Between Containers and Docker

These two terms are often used interchangeably, but they mean different things. Containers are self-contained processes which include the running software along with its dependencies. Containers have existed in Linux for a long time, but they were not widely used until recently. According to the official Docker website, “Docker is a platform for developers and sysadmins to develop, deploy, and run applications with containers.”

Advantages of Containers

As mentioned above, running containers simplifies the process of running software and applications. Suppose you have an ASP.NET application. A developer can create an image of the working application. This image will contain the application, the ASP.NET framework, and the dependencies. Now this image can be deployed as a container on a production machine that needs no other software installed. Whatever is needed for the application to run is present in the container. The container will run the same on all systems, so you will no longer have issues like an application running on one machine but failing on another.

Containers and Virtual Machines

Containers and virtual machines might look the same, but they are quite different. A container holds only the tools that the application needs and shares the host operating system's kernel with other containers. Virtual machines, on the other hand, each have their own fully independent operating system. Since containers do not carry a full-fledged operating system of their own, they are much lighter than virtual machines.

Docker Engine

The Docker website describes the Docker Engine as a client-server application.

A Docker Engine consists of a client and a server. We interact with the server using the Docker CLI, which is the client. The client talks to the server through the Docker REST API. The server, or Docker daemon, is responsible for running the containers. When the user types in a command from the Docker CLI, like docker run imagename, the request is received by the Docker daemon. The daemon will search for the image locally and, if found, run it as a container. Think of an image as an executable file. If the image is not found locally, the daemon will search for it in a registry, download it, and then run it as a container.
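
A quick way to see this client-server split for yourself is the docker version command, which prints a Client section for the CLI and a Server section for the Docker daemon:

# Shows the versions of both halves of the Docker Engine: the CLI client and the daemon
docker version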

Installing Docker

Now let’s start exploring Docker practically. You need the Windows 10 Professional or Enterprise edition with at least 4 GB of RAM to install Docker. Since I didn’t have Windows 10 Professional, I created a virtual machine in Azure. Here are the steps:

Go to the Azure portal and click on “Virtual machines.”

Choose a Windows 10 Professional machine. Not all VM sizes support nested virtualization, so I selected a VM of size D2s_v3. Selecting a size that supports nested virtualization is essential for running Docker inside the VM.
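
If you prefer the Azure CLI over the portal, the creation step looks roughly like the sketch below. This is only an illustrative outline: the resource group, VM name, credentials, and especially the Windows 10 Pro image URN are placeholders that you would need to verify (for example with az vm image list) for your own subscription and region.

# List Windows 10 marketplace images to find the exact URN of a Pro SKU
az vm image list --publisher MicrosoftWindowsDesktop --all --output table

# Create the VM with a size that supports nested virtualization (D2s_v3 here)
az vm create `
  --resource-group my-docker-rg `
  --name docker-lab-vm `
  --image <windows-10-pro-image-urn> `
  --size Standard_D2s_v3 `
  --admin-username azureuser `
  --admin-password <a-strong-password>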

Also make sure that the inbound and outbound port rules allow RDP connections, or else you might not be able to connect to the VM over RDP.

If you are trying to access Azure from your office, you might run into issues and may need to contact your system administrator to open up these ports. Once our VM is up and running, we need to install Docker: download Docker for Windows from the official Docker website and run the installer.

Once you have installed Docker for Windows, it will ask you to enable Hyper-V. Click “OK” and let the system restart.
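
Docker normally enables this for you, but if the prompt does not appear, the same features can be turned on manually from an elevated PowerShell window. This is a standard Windows command rather than anything specific to this article:

# Enable the Hyper-V and Containers features, then reboot (run PowerShell as Administrator)
Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V -All
Enable-WindowsOptionalFeature -Online -FeatureName Containers
Restart-Computer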

By default, Linux containers are enabled. You can switch between Linux and Windows containers by clicking on the whale icon in the system tray.

Go to PowerShell, type docker run hello-world, and press Enter. You should see the message “Hello from Docker!”, which means Docker has installed correctly.

The hello-world output also lists the steps Docker took to produce that message: the client contacted the daemon, and the daemon pulled the image from Docker Hub and ran it as a container. This is exactly the flow we talked about earlier.

If you are trying to run Docker from your workplace, you might face some proxy-related issues. You can set your proxy by navigating to Docker’s Settings.

Demo: Running Your ASP.NET Application as a Container

Create a new ASP.NET Core MVC project in Visual Studio 2017. While creating it, make sure you have the “Enable Docker Support” option checked.

I call my app “aspnetapp.” Once you enable Docker support, a file called Dockerfile is created in Solution Explorer.

Replace the existing code in this file with the following piece of code:

FROM microsoft/dotnet:sdk AS build-env
WORKDIR /app
# Copy csproj and restore as distinct layers
COPY *.csproj ./
RUN dotnet restore
# Copy everything else and build
COPY . ./
RUN dotnet publish -c Release -o out
# Build runtime image
FROM microsoft/dotnet:aspnetcore-runtime
WORKDIR /app
COPY --from=build-env /app/out .
ENTRYPOINT ["dotnet", "aspnetapp.dll"]
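
One optional addition, not part of the original walkthrough, is a .dockerignore file next to the Dockerfile so that local build output is not sent to the daemon and copied into the image by the COPY . ./ step. A minimal version might look like this:

# .dockerignore - keep local build output and Visual Studio metadata out of the build context
bin/
obj/
.vs/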

Go to PowerShell and navigate to the project directory. Once there, run the command:

docker build -t aspnetapp .

Once the image is built, run the following command:

docker run -d -p 8080:80 --name myaspnetapp aspnetapp

Once this is successful, go to localhost:8080 to see the app running.

So what happened here? The Dockerfile provides the information needed to create an image. For example, it says that the final image should be built from the microsoft/dotnet:aspnetcore-runtime base image. An image for the application is created when you run the build command. If you run docker images in PowerShell, you will see the image listed.

When you run the image using the docker run command, it runs this image as a container, where “myaspnetapp” is the container name and “aspnetapp” is the image name. The -p 8080:80 option maps port 8080 on the host to port 80 inside the container. So, when you navigate to localhost:8080, you reach your containerized application. You can check all the running containers using the command docker ps.

For further information on this demo, refer to the official Docker documentation.

Developers can create their images and upload them to a repository. These images can then be run on production machines whenever the application needs to be deployed.

Now let’s check out how we can run MongoDB as a container.

Demo: Installing MongoDB

Traditional Way of Installing Software

Now let’s install MongoDB using the traditional method. If we head over to its documentation, it lists the steps required for installing MongoDB, including downloading and running the executable, stepping through the installer, and so on. Installing MongoDB this way is a lengthy process.

Now let’s see how Docker simplifies this process.

Running MongoDB as a Container

Go to Docker Hub and search for MongoDB.

Before we run the command, click on the whale icon, go to Settings -> Daemon, and set the experimental flag to “True.”

Once you are done, Docker will restart. You can then type in the following command in PowerShell or command prompt.

docker run --name some-mongo -d mongo:4.1

Here, Docker runs a container with the name “some-mongo.” You can give it any other name you please. “mongo” is the image name and “4.1” is its version, or tag. The -d flag runs the container in the background.
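
Note that this command does not publish MongoDB's port to the host, which is fine here because we will connect from inside the container. If you wanted to reach the server from tools on the host instead, a variation (not the command we use in this demo) would add a port mapping:

# Optional variation: publish MongoDB's default port so host clients can connect
docker run --name some-mongo -d -p 27017:27017 mongo:4.1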

The output of our docker run command says that a newer image has been downloaded, meaning Docker pulled it from Docker Hub because it was not found locally.

Since we started the container with the -d flag, it is already running in the background. Let’s check its logs:

docker logs some-mongo

We will get a message “Waiting for connections on port 27017,” which means our server is up and running.

Now run the following command to open a Mongo shell inside the container:

docker exec -it some-mongo mongo

Then type in the command:

show dbs

This shows that we have not created any databases of our own yet on this server. We can now proceed with other MongoDB commands from here.
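
For example, here is a small, illustrative sequence you can type at the same prompt; the database and collection names are made up for this demo:

// Switch to (and implicitly create) a test database, insert one document, then list databases again
use testdb
db.people.insertOne({ name: "docker-demo" })
show dbs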

So we see that Docker greatly simplifies the process of installing software.

What Happens Behind the Scenes?

Your operating system can be divided into two major portions: the kernel and userspace. The kernel has control over the hardware. Everything other than the kernel (our applications, OS utilities, and the libraries those applications require) falls under userspace. Userspace accesses the hardware through the kernel.

Traditionally, when we install software, we simply install the application and use the libraries already present in userspace. With the containerization approach, when an image is created it contains the application plus the userspace libraries it needs to run. Thus, the application becomes largely independent of the resources that the host operating system provides, apart from the shared kernel.

Docker Commands

  • docker run imagename: This will run the specified image. This is equivalent to running an executable when installing software the traditional way.

  • docker --help: This will list all the Docker commands available to you.

  • docker ps: This will list the currently running containers.

  • docker ps -a: This will list all the running and the exited containers.

  • docker stop containername: This will stop the running container. docker ps -a will still list the container (as stopped), but docker ps will not.

  • docker rm containername: This will remove the container. This is like uninstalling software in the traditional sense. Neither docker ps -a nor docker ps will list it, since the container has been removed.

  • docker images: This will list the downloaded images. Images are like the executables, as far as traditional methods of installing software are concerned. (A short walkthrough combining these commands follows this list.)
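
Putting these commands together, here is a short, illustrative lifecycle using the hello-world image from earlier; the container name "demo" is an arbitrary choice:

docker run --name demo hello-world   # run the image as a container (it prints its message and exits)
docker ps -a                         # the exited "demo" container shows up here, but not in docker ps
docker rm demo                       # remove the container
docker images                        # the hello-world image itself is still downloaded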

For more Docker commands, see the official Docker CLI documentation.

Upload an Image to Docker Hub

In the first demo above, we created an image for ASP.NET Core which was stored locally. We will now have a look at how to upload images to Docker Hub. First, you need to create a free account on Docker Hub. Then create a repository.

Log in to Docker Hub from PowerShell, as shown below.

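Logging in is done with the docker login command; with no arguments it targets Docker Hub and prompts for the username and password of the account you just created:

# Log in to Docker Hub (you will be prompted for your credentials)
docker login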

Then tag the image and push it to your repository, as sketched below.

In the docker tag command, aspnetapp is the local image name, followed by your username/repository:tag.
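
A sketch of what those commands look like; the username, repository name, and tag below are placeholders for your own Docker Hub account and repository:

# Tag the local image with your Docker Hub repository, then push it
docker tag aspnetapp <your-dockerhub-username>/aspnetapp:v1
docker push <your-dockerhub-username>/aspnetapp:v1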

Once the image is pushed to Docker Hub, you can log in there and see the image in your browser.

I hope that this article has taken you one step closer to unraveling the mysteries of Docker. Feel free to reach out to me in case you want to discuss further.

#docker
