How to dockerize a Node.js back-end app and Postgres

Kaleb Dalla
6 min read · Feb 15, 2022
Docker logo — retrieved from https://e-tinet.com/linux/container-docker/

Hey everyone, how are you all doing?

This post has two main objectives: to explain some core concepts of the Docker technology, and to provide a straightforward tutorial on how to dockerize a back-end application and a database together.

For this tutorial I will be using a Node.js back-end API and Postgres as the database. Before we jump to the hands-on part, I would like to explain some concepts about the Docker technology.

First important question: What is Docker?

Docker is a containerization platform. It enables developers to package applications into containers, isolating their apps from the environment they run in.

And which problems does Docker solve? Have you ever heard the phrase: “It worked on my machine…”? hahahah

Let’s have a look at the architectural image below, which explains the difference between virtual machines and containers.

Architecture difference between VM and Containers — Retrieved from https://www.slideshare.net/Docker/docker-birthday-3-intro-to-docker-slides

This picture illustrates the architectural difference between the two technologies. While a VM emulates a hardware server, which is expensive for the host, the Docker Engine uses the host operating system to run isolated containers, each containing a single application. That makes containers a very lightweight option in comparison with VMs.

So the benefit of using Docker is that a container is much more lightweight than a VM, and it includes everything needed to run the application, which solves the problem of “it worked on my machine but didn’t work on this one”. Moreover, it is a technology that is easy to use and learn.

Ok. Now that we have an overview of Docker, let’s get hands-on and dockerize an application with it. From now on, I’m going to assume that you already have Docker and Postgres installed on your machine.

For this example I’m going to use this repository: https://github.com/aneagoie/smart-brain-api

Feel free to fork it and test it on your machine. This repo is a Node.js API that handles some requests and connects to a Postgres database.

The first thing to do is to add a Dockerfile to your project and configure it. So, let’s do it.

Dockerfile for the smart-brain-api project
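Roughly, the Dockerfile looks like this (the node:16 tag is just an example, so use whichever Node version your project targets; the blank lines matter for the line numbers referenced below):

FROM node:16

WORKDIR /usr/src/smart-brain-api

COPY ./ ./

RUN npm install

CMD ["/bin/bash"]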

In this Dockerfile we are telling Docker what to do in order to start the app. On line 1 we declare which version of Node to use. On line 3, the WORKDIR instruction sets the working directory for any RUN, CMD, ENTRYPOINT, COPY and ADD instructions that follow in the Dockerfile. In our case, we are telling Docker to create the /usr/src/smart-brain-api folder structure and use it in the container as our working directory.

On line 5 we use the COPY instruction, which tells Docker to copy everything in the root directory of our local project into the working directory of the container. Once all the files of our local project have been copied into the container, we can run the npm install command so Node can download and install all the dependencies of the project inside the container, and that is exactly what we are doing on line 7 with the RUN instruction.

Finally, on line 9 we are just asking Docker to execute the command /bin/bash, which will open a terminal for us in the container. This command is optional.

Ok, now that we have set up the Dockerfile for our Node project, let’s do the same for our Postgres database. To do so, let’s create a folder inside our project called postgres, which will be responsible for storing all the files related to the database. Inside this folder, we are going to create our second Dockerfile.

Project structure after the creation of the Dockerfiles
Dockerfile for Postgres
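It looks roughly like this (the postgres:12 tag is an example; /docker-entrypoint-initdb.d is the special folder from which the official Postgres image runs scripts when the database is first initialized):

FROM postgres:12

ADD ./tables/ /docker-entrypoint-initdb.d/tables/
ADD deploy_schemas.sql /docker-entrypoint-initdb.d/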

This second file is simpler than the first one. On line 1 we are telling Docker to use a Postgres image, and on lines 3 and 4 we use the ADD instruction, which copies the files on the left to the path on the right.

The deploy_schemas.sql file
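A sketch of its contents, assuming the table scripts end up in /docker-entrypoint-initdb.d/tables/ as in the Dockerfile above (\i is the psql command that includes and executes another SQL file):

-- Deploy fresh database tables
\i '/docker-entrypoint-initdb.d/tables/users.sql'
\i '/docker-entrypoint-initdb.d/tables/login.sql'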

In the deploy_schemas.sql file we are simply declaring two table schemas, which will be created in Postgres every time this container is initialized.

To finish this section, we have to create the SQL files for the tables that we declared in the deploy_schemas file. Inside the /postgres folder create a folder called /tables, and inside it create the users.sql and login.sql files. Below, you can see the code for both tables.

SQL code for the users and login tables
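The column definitions here are a sketch based on what the smart-brain app stores; adjust them to your own schema:

-- users.sql
BEGIN TRANSACTION;

CREATE TABLE users (
    id serial PRIMARY KEY,
    name VARCHAR(100),
    email text UNIQUE NOT NULL,
    entries BIGINT DEFAULT 0,
    joined TIMESTAMP NOT NULL
);

COMMIT;

-- login.sql
BEGIN TRANSACTION;

CREATE TABLE login (
    id serial PRIMARY KEY,
    hash VARCHAR(100) NOT NULL,
    email text UNIQUE NOT NULL
);

COMMIT;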

Just to explain: the BEGIN TRANSACTION statement marks the starting point of an explicit, local transaction in SQL, and the transaction is finished with the COMMIT statement.

Okay, now we have all the basic stuff configured. So, in order to dockerize these two applications together, we will use a docker-compose file, which allows us to bundle containers together and boot up the whole ecosystem with a single command.

To do that, you’ll need to create a file called docker-compose.yml in your project root.

docker-compose file
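Here is a sketch of the file. The service names, environment variables, and credentials are illustrative; the layout matches the line numbers referenced below:

version: '3.8'

services:
  smart-brain-api:
    container_name: backend
    build: ./
    working_dir: /usr/src/smart-brain-api
    command: npm start
    depends_on:
      - postgres
    ports:
      - "3000:3000"
    volumes:
      - ./:/usr/src/smart-brain-api
    environment:
      POSTGRES_HOST: postgres
      POSTGRES_PORT: 5432
      POSTGRES_USER: admin
      POSTGRES_DB: smart-brain-docker

  postgres:
    container_name: postgres
    build: ./postgres
    ports:
      - "5431:5432"
    environment:
      POSTGRES_USER: admin
      POSTGRES_PASSWORD: secret
      POSTGRES_DB: smart-brain-docker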

The first thing to do is to set the version of the Compose file format that you want to use. In my case I’m using version 3.8, but you can check the latest version in the Docker Compose documentation.

Then we’re going to declare our services, which are the applications that docker-compose is going to orchestrate. On lines 4 and 21 I’m naming the services; you can use whatever names you like on these lines. Then we need to define a container_name for our services, just like on lines 5 and 22.

For the back-end you can also define which Node image you want to use and a command to run in the container. Moreover, you need to declare your working directory so the container knows where to run the command.

A side note: the build tag tells Docker Compose to build the application using the Dockerfile found at the declared path. For the back-end we’re telling it to use the root directory of the project, and for Postgres we’re telling it to use the /postgres folder. When you use this tag you can drop the image tag.

Another important configuration is the ports tag. With this tag you map a port on your machine to a port on the container, so the two can communicate. For the back-end we map port 3000 on your local machine to port 3000 of the container. For Postgres I had to change the mapping a little bit in order to make it work on my Windows machine: I mapped port 5431 on my local machine to port 5432 on the container.

To conclude, I also declared a volumes tag on the back-end service in order to map all the files from my local root directory to the working directory on the container. This allows the container to “listen” for any changes to those files and update as soon as a change happens. Finally, you can use the environment tag to declare environment variables, which is optional.

Now that you have your docker-compose file configured, you can use this command:

docker-compose up --build

It tells Docker to build the services (you need to run it with --build only once; after that you can just use docker-compose up) and then it starts the container services. And that is it!! You have dockerized two applications together using Docker. Now you can start both services with one single command. Cool, isn’t it?

To conclude this tutorial, I added the image below, which shows how to connect to your containerized database using the pgAdmin tool. I struggled a little bit with this, so I decided to include it here.

how to connect to your container database using PgAdmin.
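In short: following the mapping above, you register a new server in pgAdmin with host localhost and port 5431 (the host side of the 5431:5432 mapping), using the user, password, and database name declared under environment in the compose file.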

Oooh boy, that was a lot to take in, right? hahah

I hope this tutorial helps you in some way. Nice coding!!
