
Managing the Docker Container Lifecycle

Docker is a container platform for developing, shipping, and running applications inside containers. You can deploy many containers at the same time on a single host. Containers are lightweight and start quickly because, unlike virtual machines, they do not carry the overhead of a hypervisor: they run directly in the host's kernel.

In this article, we will look at managing the lifecycle of a Docker container with its most important commands.

Starting a Docker container

This command is used to start a new container from a single image:

docker run 

For example, the following runs a Node.js container in the foreground and tells it to run a Bash shell:

docker run -it --name my_example node bash

The purpose of the --name flag is to give the container an easy-to-remember name.
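
From another terminal, you can confirm that the container is up. This is just a quick check, assuming the my_example container started above is still running:

docker ps --filter name=my_example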

Another example, this time with a port mapping:

docker container run -d --name web_latest -p 8080:8080  web:latest
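
To check on the detached container, you can inspect its published ports and logs. This is a small sketch assuming the web_latest container from the command above is running:

docker port web_latest
docker logs web_latest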

Pausing a container

The command below suspends all processes in a container:

docker pause <container-id or container-name>
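
To resume a paused container, there is a matching unpause command. For example, using the web_latest container from earlier (assuming it was the one paused):

docker unpause web_latest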

Restarting a container

You can restart a Docker container like this:

docker restart <container-id or container-name>
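
For example, restarting the container from earlier and giving it 5 seconds to shut down gracefully before it is killed; the -t value here is just an illustration:

docker restart -t 5 web_latest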

Stopping a container

You can stop a running Docker container with this command:

docker stop <container-id or container-name>
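
For example, stopping the web_latest container and then confirming that it now shows an Exited status:

docker stop web_latest
docker ps -a --filter name=web_latest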

Removing a container

To remove a stopped (exited) container, run:

docker rm <container-id or container-name>
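
For example, removing the stopped container from above, or force-removing it even while it is still running (the -f flag should be used with care):

docker rm web_latest       # remove a stopped container
docker rm -f web_latest    # or force-remove it even if it is still running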

If you want to remove all running and stopped containers on your system at once, without passing IDs or names, use the following command:

docker container rm $(docker container ls -aq) -f

Note that the previous command is destructive, and you should be very careful when typing it.
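
A less drastic alternative, if you only want to clean up stopped containers and leave running ones untouched, is the prune subcommand:

docker container prune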

Mikel Okuneva

Ever Wondered Why We Use Containers In DevOps?

At some point we’ve all said the words, “But it works on my machine.” It usually happens during testing or when you’re trying to get a new project set up. Sometimes it happens when you pull down changes from an updated branch.

Every machine has different underlying states depending on the operating system, other installed programs, and permissions. Getting a project to run locally could take hours or even days because of weird system issues.

The worst part is that this can also happen in production. If the server is configured differently than what you’re running locally, your changes might not work as you expect and cause problems for users. There’s a way around all of these common issues using containers.

What is a container

A container is a piece of software that packages code and its dependencies so that the application can run in any computing environment. They basically create a little unit that you can put on any operating system and reliably and consistently run the application. You don’t have to worry about any of those underlying system issues creeping in later.

Although containers were already used in Linux for years, they became more popular in recent years. Most of the time when people are talking about containers, they’re referring to Docker containers. These containers are built from images that include all of the dependencies needed to run an application.
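
As a rough illustration, an image is typically described by a Dockerfile that lists those dependencies. The file below is a minimal, hypothetical example for a Node.js app; the base image tag, file names, and port are made up for illustration:

FROM node:18
WORKDIR /app
COPY . .
RUN npm install
CMD ["node", "index.js"]

Building the image and running a container from it could then look like:

docker build -t my-app .
docker run -d -p 3000:3000 my-app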

When you think of containers, virtual machines might also come to mind. They are very similar, but the big difference is that containers virtualize the operating system instead of the hardware. That’s what makes them so easy to run on all of the operating systems consistently.

What containers have to do with DevOps

We know odd things happen when you move code from one computing environment to another, and the same issue shows up when moving code between the different environments in our DevOps process. You don't want to have to deal with system differences between staging and production. That would require more work than it should.

Once you have an artifact built, you should be able to use it in any environment from local to production. That’s the reason we use containers in DevOps. It’s also invaluable when you’re working with microservices. Docker containers used with something like Kubernetes will make it easier for you to handle larger systems with more moving pieces.
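
As a sketch of that flow, you might build an image once, tag it, push it to a registry, and then pull the exact same image in any environment. The registry address and image name here are placeholders:

docker build -t my-app:1.0 .
docker tag my-app:1.0 registry.example.com/my-app:1.0
docker push registry.example.com/my-app:1.0

# Later, on staging or production:
docker pull registry.example.com/my-app:1.0
docker run -d registry.example.com/my-app:1.0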

#devops #containers #containers-devops #devops-containers #devops-tools #devops-docker #docker #docker-image

Iliana Welch

Docker Tutorial for Beginners 8 - Build and Run C++ Applications in a Docker Container

Docker is an open platform that lets you package, develop, run, and ship software applications in different environments using containers.
In this course we will learn how to write Dockerfiles, work with the Docker Toolbox, work with Docker Machine, use Docker Compose to fire up multiple containers, work with Docker Kitematic, push images to Docker Hub, pull images from a Docker registry, and push stacks of servers to Docker Hub.
We will also cover how to install Docker on Mac.
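
As a quick taste of the C++ workflow, one common approach is to compile and run a program inside a throwaway container using the official gcc image. This is only a sketch; the image tag and the hello.cpp file name are assumptions:

docker run --rm -v "$PWD":/src -w /src gcc:12 g++ -o hello hello.cpp
docker run --rm -v "$PWD":/src -w /src gcc:12 ./hello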

#docker tutorial #c++ #docker container #docker #docker hub #devopstools

Iliana Welch

Docker Explained: Docker Architecture | Docker Registries

Following the second video about Docker basics, in this video I explain Docker architecture and the different building blocks of the Docker engine: the Docker client, the API, and the Docker daemon. I also explain what a Docker registry is, and I finish the video with a demo illustrating how to use Docker Hub.
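
As a minimal example of working with a registry, you can pull and run a public image from Docker Hub, and log in before pushing your own. The user and image names below are placeholders:

docker pull hello-world
docker run --rm hello-world
docker login
docker push <your-dockerhub-user>/my-image:latest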

In this video lesson you will learn:

  • What is Docker Host
  • What is Docker Engine
  • Learn about Docker Architecture
  • Learn about Docker client and Docker Daemon
  • Docker Hub and Registries
  • Simple demo to understand using images from registries

#docker #docker hub #docker host #docker engine #docker architecture #api

August Murray

Docker Swarm: Container Orchestration Using Docker Swarm

Introduction

A swarm consists of multiple Docker hosts that run in swarm mode and act as managers (to manage membership and delegation) and workers (which run swarm services). A given Docker host can be a manager, a worker, or perform both roles.

When Docker is running in swarm mode, you can still run standalone containers on any of the Docker hosts participating in the swarm, as well as swarm services. A key difference between standalone containers and swarm services is that only swarm managers can manage a swarm, while standalone containers can be started on any daemon.
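
To make the difference concrete, a swarm service is created through a manager and scheduled across the swarm, while a standalone container runs on whichever daemon you start it on. The nginx image and the names here are only for illustration:

docker service create --name web --replicas 3 -p 8080:80 nginx
docker run -d --name web_standalone -p 8081:80 nginx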

In this demonstration, we will see how to configure a Docker swarm and how to perform basic tasks.

Pre-requisites

  1. For our demonstration, we will be using CentOS 7.
  2. We will be using 3 machines for our lab: 1 machine as the swarm manager node and 2 as swarm worker nodes. These servers have the following IP details:

192.168.33.76 managernode.unixlab.com
192.168.33.77 workernode1.unixlab.com
192.168.33.78 workernode2.unixlab.com

  3. The memory should be at least 2 GB and there should be at least 2 CPU cores on each node.
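
With the hosts above in place, the basic setup looks roughly like the following sketch. On the manager node, initialise the swarm, advertising the manager's IP from the table above:

docker swarm init --advertise-addr 192.168.33.76

The output of this command prints a docker swarm join command containing a worker join token. Run it on each worker node, pointing at the manager; the token below is a placeholder:

docker swarm join --token <worker-join-token> 192.168.33.76:2377

Back on the manager, you can verify that all three nodes are part of the swarm:

docker node ls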

#docker #containers #container-orchestration #docker-swarm

Docker Architecture Overview & Docker Components [For Beginners]

If you have recently come across the world of containers, it's probably not a bad idea to understand the underlying elements that work together to offer containerisation benefits. But before that, there's a question you may ask: what problem do containers solve?

After building an application in a typical development lifecycle, the developer sends it to the tester for testing purposes. However, since the development and testing environments are different, the code fails to work.

Now, predominantly, there are two solutions to this – either you use a Virtual Machine or a containerised environment such as Docker. In the good old times, organisations used to deploy VMs for running multiple applications.

So, why did they start adopting containerisation over VMs? In this article, we will provide detailed answers to all such questions.

#docker containers #docker engine #docker #docker architecture