Building a Full-Stack Application with Docker Compose

Nikolai Main - Oct 19 - Dev Community

In this project, I created a full-stack container setup using Docker Compose, a tool that lets you define and run multi-container applications using YAML files. This not only highlighted Docker Compose's utility but also introduced me to Express and React.

I defined a frontend application built with React, a backend API with Express.js, and a PostgreSQL database. While I haven't fully grasped how this setup is beneficial in a production environment, it's brilliant for developing an application.

For context, the application is a company directory that provides create, read, update, and delete (CRUD) functionality for managing personnel, departments, and locations.

Docker Compose


Tech Used

  • Docker
  • Postgres
  • JavaScript
    • Express.js
    • React

Useful Commands

General:

  • docker logs <container name> - displays the current logs for a given container.
  • -f: "follow" - prints logs as they are received.

Compose Specific:

  • docker compose up/down - starts/stops the services defined in the Compose file.
  • -v: used with compose down - removes the associated volumes as well.
  • --build: used with compose up - rebuilds the images before starting the containers.
  • docker compose watch - builds and runs the containers in 'watch mode'.

Database

I first created a database with PostgreSQL which turned out to be a relatively straightforward process.

Relevant notes with regard to the volumes section:

  • The first line defines a volume - a location to store persistent data. If a volume with the name db_data doesn't exist, it will be created, and any data written to /var/lib/postgresql/data - the location within the container that stores PostgreSQL data - is then stored in the db_data volume.
  • The second line mounts the local ./database/init directory into the container's /docker-entrypoint-initdb.d/ directory, which contains the initial SQL scripts used to populate the database.

  db:
    image: postgres:latest
    container_name: my-postgres-db
    environment:
      POSTGRES_USER: myuser
      POSTGRES_PASSWORD: mypassword
      POSTGRES_DB: mydatabase
    ports:
      - "5433:5432"
    volumes:
      - db_data:/var/lib/postgresql/data
      - ./database/init:/docker-entrypoint-initdb.d/

To make sure everything loaded correctly, I used docker logs my-postgres-db to check the container logs. Once I saw that there were no errors, I connected to the database with psql and ran a few SELECT queries to confirm the data was there.


Backend

Next, I created the backend with Express.js. Although it's a relatively new topic for me, it was easy enough to pick up at a basic level. My Compose service looked like this:

Relevant notes:

  • The context section of the service tells Docker where to find the Dockerfile needed to build the container image.
  • I used Compose's watch feature, which is similar to nodemon for Node.js applications - it automatically rebuilds the container when code changes are detected, which is very useful in development.
  • The two actions defined in the watch section watch for changes to my index.js file, which contains all of my HTTP methods, and to my package.json file, in case any new modules are installed.

    build:
      context: ./api
    container_name: my-express-app
    ports:
      - "4000:4000"
    depends_on:
      - db
    develop:
      watch:
        - action: rebuild
          path: ./api
          target: index.js
          ignore:
            - node_modules/
        - action: rebuild
          path: package.json

The Dockerfile for the Express.js app looks like this. In essence, it copies the relevant content from my directory, installs the necessary dependencies, and, when the container starts, executes node index.js to launch the Express.js server.

FROM node:18
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 4000
CMD ["node", "index.js"]

With the database and a means to communicate with it set up, I began creating the ~12 methods required to make the application function correctly. To aid in testing the API's functionality, I opened two command lines: one to run curl commands and another to view the container logs.
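
To give a flavour of what those methods look like, here is a minimal sketch of a couple of CRUD routes. This is illustrative only, not the project's actual code: the personnel table and its name and department_id columns are assumptions, while the connection details come from the Compose file above.

// Illustrative sketch only - table and column names are assumptions.
const express = require('express');
const { Pool } = require('pg');

const app = express();
app.use(express.json());

// Connection details match the db service in the Compose file; "db" is the
// service name, which Compose makes resolvable on the shared network.
const pool = new Pool({
  host: 'db',
  port: 5432,
  user: 'myuser',
  password: 'mypassword',
  database: 'mydatabase',
});

// Read all personnel records
app.get('/personnel', async (req, res) => {
  try {
    const { rows } = await pool.query('SELECT * FROM personnel');
    res.json(rows);
  } catch (err) {
    res.status(500).json({ error: err.message });
  }
});

// Create a new personnel record
app.post('/personnel', async (req, res) => {
  const { name, departmentId } = req.body;
  try {
    const { rows } = await pool.query(
      'INSERT INTO personnel (name, department_id) VALUES ($1, $2) RETURNING *',
      [name, departmentId]
    );
    res.status(201).json(rows[0]);
  } catch (err) {
    res.status(500).json({ error: err.message });
  }
});

// Port matches the "4000:4000" mapping in the Compose service
app.listen(4000, () => console.log('API listening on port 4000'));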


Frontend

Having defined the backend and the database, all that was left was the frontend. I used React for this, mainly because I had heard of it and was aware of how common it is in modern frontend development, but had never experimented with it. Very quickly I realized how easy the whole process is and how useful it is for development. Running npx create-react-app my-app gets you up and running with a basic template and a development server within minutes.

Initially, I planned to define the whole environment with Docker, but it was just as convenient to keep running React separately. Had I used Docker, however, I would have defined a Compose service similar to the one for my API, using watch statements to automatically pick up changes.

Nevertheless, I created the frontend - a very simple UI with a series of controls to switch between the tables in the database and to provide CRUD functionality.
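
As an illustration of the kind of component involved - again, not the project's actual code - a minimal table view might fetch from the API and render rows like this, assuming the backend exposes a /personnel endpoint on port 4000 returning objects with id, name, and department fields:

// Illustrative sketch only - the endpoint and field names are assumptions.
import { useEffect, useState } from 'react';

function PersonnelTable() {
  const [rows, setRows] = useState([]);

  // Fetch the personnel list from the Express API once on mount
  useEffect(() => {
    fetch('http://localhost:4000/personnel')
      .then((res) => res.json())
      .then(setRows)
      .catch(console.error);
  }, []);

  return (
    <table>
      <thead>
        <tr><th>Name</th><th>Department</th></tr>
      </thead>
      <tbody>
        {rows.map((p) => (
          <tr key={p.id}>
            <td>{p.name}</td>
            <td>{p.department}</td>
          </tr>
        ))}
      </tbody>
    </table>
  );
}

export default PersonnelTable;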

The Dockerfile for my frontend looks like this:

Relevant notes:

  • This Dockerfile consists of two stages. First, the build stage, where the React application is compiled into static production files. Second, the stage that creates the image serving those files with Nginx.
  • The reason for doing it in two stages is that it results in a smaller final image, as it only contains the server configuration and the static files.
FROM node:18 AS build
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build

# Stage 2: Serve the application with Nginx
FROM nginx:alpine
COPY --from=build /app/build /usr/share/nginx/html
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]

For completeness' sake, I did add the frontend to my Compose file, allowing the full-stack application to be deployed with a single command. Due to network constraints, I never went as far as making it accessible from the internet, but that is an equally simple process.


    build:
      context: ./frontend
    container_name: frontend-app
    ports:
      - "80:80"
    depends_on:
      - api

Final Notes

While I can't see this setup being too useful in a production environment, it's certainly valuable for development. Being able to spin up and tear down your entire development environment with a single command is incredibly convenient. I've yet to really dip my toes into Kubernetes, but from my limited research, I imagine it's a much better option for running multi-container deployments in production.
