Building and Managing Applications with Docker: A Practical Guide - Week Sixteen

Shubham Murti · Oct 2 · Dev Community

Hello Community!

This week, I embarked on a deep dive into Docker, a vital tool for containerization. Guided by Docker’s official “Get Started” guide, I gained hands-on experience with Docker’s core concepts, from building simple applications in containers to managing more complex setups using Docker Compose. In this blog post, I will share what I’ve learned, focusing on Docker’s installation, creating custom images, and orchestrating multi-container applications.


1. What is Docker and Why Use It?

Docker is a platform that allows developers to build, ship, and run applications in isolated environments known as containers. Containers package an application with all its dependencies, ensuring it runs consistently across different environments. This solves the classic “it works on my machine” problem, making Docker a game-changer for cloud-based projects and microservices.


One of Docker’s biggest strengths is its ability to make workflows more efficient. Each component of an application runs independently, improving scalability and reducing dependency conflicts. As I worked through the guide, I was particularly impressed by how seamlessly applications could be deployed across multiple environments.


2. Installing Docker

The first step on my Docker journey was installation. Since I’m using Windows, I installed Docker Desktop, which bundles Docker Engine with a user-friendly interface. The process was straightforward, thanks to Docker’s comprehensive documentation. However, it’s important to ensure that virtualization is enabled on your machine.

Docker Desktop comes with a built-in tutorial, which helped me familiarize myself with basic Docker commands like docker run and docker build. These commands quickly became essential tools as I progressed.


3. Docker Images vs. Containers

A key concept I learned is the distinction between Docker images and containers:

  • Docker Images are blueprints that define what’s inside a container. They are read-only templates containing application code, libraries, and dependencies.

  • Containers are the actual running instances of images. Multiple containers can be created from a single image, with each running independently in an isolated environment.


4. My First Docker Container

To test my Docker setup, I ran the famous hello-world container:

docker run hello-world

This command pulled the “hello-world” image from Docker Hub, created a container from that image, and executed it. Seeing the successful output gave me confidence that Docker was set up correctly on my machine.

5. Building a Custom Docker Image

The next step in my Docker journey was creating a custom image by writing a Dockerfile. This simple text file contains the commands to assemble an image. Here’s an example where I built a basic web server using Python's Flask framework:

# Use an official Python runtime as a parent image
FROM python:3.8-slim-buster

# Set the working directory
WORKDIR /app

# Copy the current directory contents into the container at /app
COPY . /app

# Install required packages
RUN pip install -r requirements.txt

# Document that the container listens on port 80 (published at run time with -p)
EXPOSE 80

# Define environment variable
ENV NAME=DockerApp

# Run the application
CMD ["python", "app.py"]

To build the image, I used the command:

docker build -t my-python-app .

Running the custom image in a container was a great learning experience. It showed me how Docker makes it easy to package applications and their dependencies into a reproducible unit.
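The Dockerfile above copies the project into the image and runs app.py, but the guide leaves that file implicit. Here is a minimal sketch of what it might contain; the route and the greeting are assumptions, chosen to mirror the ENV NAME line in the Dockerfile:

```python
import os

from flask import Flask

app = Flask(__name__)

@app.route("/")
def hello():
    # Read the NAME variable set by the Dockerfile's ENV instruction,
    # falling back to a default when run outside the container.
    name = os.environ.get("NAME", "DockerApp")
    return f"Hello from {name}!"

if __name__ == "__main__":
    # Listen on all interfaces on port 80, matching the EXPOSE 80 line.
    app.run(host="0.0.0.0", port=80)
```

With this file and a requirements.txt listing flask next to the Dockerfile, the docker build command above produces a self-contained, runnable image.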

6. Multi-Container Applications with Docker Compose

While working with a single container is useful, most real-world applications consist of multiple services. Docker Compose allows you to define and run multi-container applications with ease. I created a docker-compose.yml file to run a web application alongside a database:

version: "3"
services:
  web:
    image: my-python-app
    ports:
      - "5000:80"
  database:
    image: postgres
    environment:
      POSTGRES_USER: user
      POSTGRES_PASSWORD: pass
      POSTGRES_DB: mydb

With the command docker-compose up, I was able to spin up both containers, and they worked together seamlessly. This is where Docker truly shines, simplifying the orchestration of complex applications.
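Compose also places both services on a shared default network, where each container is reachable by its service name. A hedged sketch of how the web service might assemble its Postgres connection string (the environment-variable names and defaults simply mirror the docker-compose.yml above; no real connection is attempted here):

```python
import os

def postgres_dsn() -> str:
    """Build a connection string for the `database` service from docker-compose.yml."""
    user = os.environ.get("POSTGRES_USER", "user")
    password = os.environ.get("POSTGRES_PASSWORD", "pass")
    db = os.environ.get("POSTGRES_DB", "mydb")
    # Inside the Compose network, the service name `database` resolves
    # to the Postgres container's address.
    host = "database"
    return f"postgresql://{user}:{password}@{host}:5432/{db}"

print(postgres_dsn())
```

A driver such as psycopg2 could then consume this DSN; that dependency is assumed rather than shown here.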

7. Managing Containers, Images, and Volumes

As I worked more with Docker, I learned how to manage various containers, images, and volumes:

  • To list running containers: docker ps
  • To stop and remove containers: docker stop and docker rm
  • To list and remove images: docker images and docker rmi

Docker volumes were crucial for persistent data storage. By creating and attaching volumes, I could ensure that database data wasn’t lost when a container was removed. This added layer of durability is essential for any application that deals with persistent data.
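For the Compose setup above, that persistence can be declared directly in docker-compose.yml by attaching a named volume to the database service; the volume name db-data is an illustrative choice, and the mount path is Postgres's standard data directory:

```yaml
services:
  database:
    image: postgres
    volumes:
      - db-data:/var/lib/postgresql/data

volumes:
  db-data:
```

With this in place, docker-compose down and up again leaves the database contents intact, since the volume outlives the container.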

Closure

This week’s journey into Docker has been enlightening. I now have a solid grasp of containerization and its advantages for cloud-based environments. Docker has simplified my development process and will undoubtedly play a significant role in future cloud projects. With Docker, I’m better equipped to create, deploy, and manage containers efficiently, whether for simple applications or multi-service environments using Docker Compose.

Stay tuned for more updates next week!

Shubham Murti — Aspiring Cloud Security Engineer | Weekly Cloud Learning!

Let’s connect: LinkedIn, Twitter, GitHub
