Docker is a tool for running your applications in containers. Containers package all of the code and dependencies your app needs into one image that runs the same way on every machine.
What is Docker?
Docker is similar in concept to a virtual machine, except that it is much more lightweight. Rather than running an entirely separate operating system (which carries huge overhead), Docker runs containers that share the host operating system's kernel and only virtualize at the software level.
Docker Engine runs on Linux, Windows and macOS, and supports Linux and Windows for Docker containers. The exact flavor of Linux doesn't really matter; most Linux distributions run on the same kernel and differ only in their user software. Docker can install this user software in the container, so you can run a CentOS container on Ubuntu. You can't, however, run a FreeBSD container on Ubuntu, because the kernels are different.
The Docker container image contains only what your app needs. If your app uses nginx and Node.js, the container image will contain them, but you won't be burdened with all the other userland apps you would normally find on Linux.
Why is Docker so useful?  Docker brings the same kind of version control and packaging that tools such as Git and NPM provide to your server software. Since your container is a single image, it is very easy to keep track of different versions of your container. And since everything is included, managing all your dependencies becomes much easier.
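As a sketch of that versioning workflow (the image name and version tags here are hypothetical, and these commands need a running Docker daemon), tags let you keep several versions of an image side by side and roll back at will:

```shell
# Build the image and tag it with an explicit version
docker build -t myapp:1.0.0 .

# A later build gets a new tag; the old version stays available
docker build -t myapp:1.1.0 .

# List every version of the image stored locally
docker images myapp

# Roll back simply by running the older tag
docker run myapp:1.0.0
```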
With Docker, your development environment will be exactly the same as your production environment, and exactly the same as everyone else's development environment, eliminating the "but it's broken on my machine!" problem.
If you want to add another server to your cluster, you don't have to worry about reconfiguring that server and reinstalling all the dependencies you need. Once you've built a container, you can share the container file with anyone, and they can easily run your app with a few commands. Docker also makes running multiple servers very easy, especially with orchestration engines like Kubernetes and Docker Swarm.
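Sharing a built image with another machine can be as simple as the following sketch (the image name is hypothetical, and a Docker daemon is needed on both machines):

```shell
# Save a built image to a single portable file
docker save myapp:latest -o myapp.tar

# On the new server, load the file and run it -- no
# dependency installation or reconfiguration required
docker load -i myapp.tar
docker run -d myapp:latest
```

Pushing to a registry with docker push, then pulling with docker pull, achieves the same thing over the network.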
Docker also helps you organize your code as you split it into separate services. Suppose you have a web server you are using for your app. You probably have a lot of things installed on that server: an nginx web server for hosting static content, probably a database to store some stuff on the backend, maybe an API server running on Express.js too. Ideally, you would split these into separate applications running on separate servers, but development can get messy.
Docker helps clean this up: you can package your web server and run it with an nginx container, package your API server and run it with a Node.js container, and package your database and run it in its own container (though that may not be the best idea, it is possible). You can take these three Docker containers and run them all on the same machine. If you need to switch servers, it's as easy as migrating those containers to a new server. If you need to scale, you can move one of those containers to a new server or deploy it across a cluster of servers.
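Running those three services as separate containers on one host might look like this sketch (the API image name and port choices are assumptions; a Docker daemon is required):

```shell
# Static content served by the official nginx image
docker run -d --name web -p 80:80 nginx

# API server from a hypothetical Node.js image you built yourself
docker run -d --name api -p 3000:3000 my-express-api

# Database in its own container
docker run -d --name db -p 5432:5432 postgres
```

Each container can then be stopped, moved, or scaled independently of the others.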
Docker can also save you money if you want to run multiple apps on one VPS. If each app has different dependencies, it's very easy for your server to get messy, like a Thanksgiving plate with everything mixed together. Docker lets you run multiple separate containers with, for example, separate versions of PHP, like a high school lunch tray with everything separated.
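For instance, two apps pinned to different PHP versions can coexist on the same host without conflict (container names and host ports here are arbitrary; the php images are official Docker Hub images):

```shell
# Two apps, two PHP versions, zero conflicts on one machine
docker run -d --name legacy-app -p 8081:80 php:7.4-apache
docker run -d --name new-app -p 8082:80 php:8.2-apache
```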
How do you use Docker?
In production there are many services to host Docker containers, including AWS ECS, Azure Container Instances, DigitalOcean Docker Droplets and many others. If your provider doesn't offer managed Docker hosting, you can always install it on your VPS yourself.
In development, Docker containers are easy to run and require only a few commands. To get started, you need to install the Docker engine on your host operating system. For Windows and macOS, you can use Docker Desktop, but for Linux, you need to install the Docker community edition from your package manager. For Debian-based distributions like Ubuntu, that would be:
sudo apt-get install docker.io
With both installation methods, you should now be able to access Docker from the command line. To verify that it works, you can run the following:
docker run hello-world
Docker should pull this tutorial image from Docker Hub, an online repository of many useful container images. You can use many of these images as bases for your own apps.
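A few more commands for working with Docker Hub images, as a quick sketch (these also require the Docker daemon to be running):

```shell
# Download an image from Docker Hub without running it
docker pull nginx

# See which images are stored locally
docker images

# Search Docker Hub from the command line
docker search node
```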
Let's create a simple web server based on nginx. nginx offers an image on Docker Hub that we can use as a starting point. Create a new folder to store the files and cd into it:
mkdir ~/dockertest && cd ~/dockertest
Any changes we make to the base nginx image, we will make with a Dockerfile. Dockerfiles are like makefiles for containers; they define which commands to run when Docker builds the new image with your changes. The Dockerfile is simply called Dockerfile, with no extension. Create this file with touch Dockerfile, open it in a text editor, and paste this in:
FROM nginx
COPY html /usr/share/nginx/html
The first line is a Docker command that tells Docker to base this image on the Hub's nginx image. The second line is another command that copies a directory from this local directory (~/dockertest/html) into the Docker image, in this case replacing the HTML directory for nginx.
You can run plenty of commands in Dockerfiles. For example, if your app needs to install dependencies, you can do something like RUN cd src/ && npm install. Everything your app needs to install and start up is defined in the Dockerfile.
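Putting those pieces together, a Dockerfile for a hypothetical Node.js app might look like this sketch (the base image tag, paths, and entry point are all assumptions about your project layout):

```Dockerfile
# Sketch for a hypothetical Node.js app
FROM node:18

# Copy the source code into the image
COPY src/ /app/src/
WORKDIR /app/src

# Install dependencies at build time, so the image ships ready to run
RUN npm install

# Command the container runs on start
CMD ["node", "index.js"]
```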
We have not yet created the ./html directory, so go ahead and run:

mkdir html && touch html/index.html

to make the directory and create the HTML entry point. Open index.html and paste in some dummy HTML:
Hello from nginx, inside Docker! Inside your computer?
Now we are ready to build our image. Make sure you are at the root of the project (in ~/dockertest, not in the html folder) and run:
docker build -t dockertest .
The dot at the end indicates that we're using the current directory as the build context. Docker should find the Dockerfile and get to work. It should only take a few seconds, and when it's done you can run the image with:
docker run --name DockerTest -p 8080:80 -d dockertest
This will start a new container called DockerTest from the "dockertest" image we built. The -p flag binds a local port to a port in the container; in this case, nginx's standard HTTP port (port 80) is bound to port 8080 on your local machine. Open localhost:8080 in your web browser and you should see nginx serving your page.
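You can also check on the container from the command line, along these lines (again assuming a running Docker daemon):

```shell
# Fetch the page without a browser
curl http://localhost:8080

# See the running container and its port mapping
docker ps

# Stop and remove the container when you're done
docker stop DockerTest && docker rm DockerTest
```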
If you wanted further configuration, you could edit nginx's configuration files by adding COPY nginx.conf /etc/nginx/nginx.conf to the Dockerfile and writing your own configuration file. This is more cumbersome than editing the configuration file directly, because you have to rebuild the image with each edit, but for the added benefit of running the same container in development and in production, it's a fairly fair trade-off.
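The extended Dockerfile would then look something like this sketch (the contents of nginx.conf are up to you; it just has to sit next to the Dockerfile):

```Dockerfile
FROM nginx

# Replace the default configuration with your own
COPY nginx.conf /etc/nginx/nginx.conf

# Copy the static content as before
COPY html /usr/share/nginx/html
```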
If you would like a more in-depth tutorial on networking, deployment, and containerizing existing applications, we recommend reading this guide.