Disclaimer
This is not a tutorial of any kind.
I don’t really describe what Docker, Laravel, or ReactJS are, nor do I explain all the configuration files below.
There are many people who write much better than me and have already explained all of this. Your favorite search engine will direct you to them.
I just want to share with you what I’ve learned while trying to put this project into production and the difficulties I’ve encountered along the way.
So, that said, we can start.
The project: Playlist Share
I have been working on a long-term project to share all the music I listen to and organize it on my own server.
Initially, I used Twitter extensively, and later switched to Mastodon. While these platforms are quite usable for day-to-day sharing, retrieving all my listening data for a month or even a year becomes more complicated. It’s possible, but not an easy task. That’s why I decided to develop this application.
Currently, the application looks like this (if you’re into the same music as me, you can search for the two albums in the screenshot - they’re great!):
The stack choice
I had been wanting to give ReactJS a try for a while, so choosing the frontend technology wasn’t too hard.
On the other hand, I hesitated a little with the backend. Initially, I wanted to use PHP, so I started with the Symfony framework, which I was already familiar with. However, I also wanted to step out of my comfort zone. So, in the middle of the project, I made the decision to switch the framework and started using Laravel.
I didn’t regret it at all; I found Laravel to be a lot more intuitive for API development. However, this is a personal preference, as I know some people might argue that Symfony is better. It’s worth noting that Laravel is built on top of many Symfony components, so the two frameworks do have some similarities.
As for the database, I didn’t need a really large database for now, so I went with my usual choice: SQLite. I don’t have too many arguments here, I just love this database since I first used it a long time ago.
Learning Docker
First steps, first mistakes, what a mess
I didn’t know Docker well, but just like ReactJS, I wanted to give it a try. So, here we are, learning Docker and Dockerfile syntax and starting to test it out.
At first, I created a rather messy Docker repository with a lot of `git clone` commands directly in the Dockerfile. In my head, it wasn’t the best approach, but hey! Learning, right?
My initial Dockerfiles looked like this:

Frontend Dockerfile:

```dockerfile
# Use the official Node.js base image
```
Backend Dockerfile:

```dockerfile
# Use the official PHP base image
```
I told you, it was messy.
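To give you an idea, the git-clone-in-Dockerfile approach looks something like this - a reconstruction for illustration, where `<repo_url>` and everything around it is a placeholder rather than my actual setup:

```dockerfile
# Anti-pattern sketch: fetching the application code at image build
# time with git clone (placeholder URL, hypothetical commands).
FROM node:14-alpine
WORKDIR /app

# git is not part of the alpine base image
RUN apk add --no-cache git \
 && git clone <repo_url> .

RUN npm install
EXPOSE 3000
CMD ["npm", "start"]
```

Besides being ugly, this bakes a point-in-time clone into the image and busts the build cache on every change.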
My main issue was the fact that I had two repositories, one for the frontend and one for the backend. I guess that having only one repository would probably make the Docker configuration a lot easier (though I wasn’t entirely sure, just assuming).
Since I didn’t want to merge the two repositories, my initial thought was to create a third repository containing all the Docker configuration and pull all the code I needed with git directly in the Dockerfile. However, I wasn’t sure this was considered good practice with Docker, so I decided to search for another way to accomplish my goal.
Interlude: learning about git submodule
Then, when I was seeking a more elegant solution to my issue, I discovered git submodules.
```shell
git submodule add <repo_url> <target_dir>
```
It was the solution to all my multi-repo issues. A submodule is essentially a pointer from one git repository to a specific commit of another, allowing you to keep multiple repositories for code tracking but deal with only one when it comes to deployment.
Using git submodules allowed me to keep a Dockerfile in each of the frontend and backend repositories without having to deal with messy Dockerfiles like the ones shown above.
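Here is a runnable sketch of that setup, using two throwaway local repos as stand-ins for my real frontend and backend repositories (every name and path here is hypothetical):

```shell
# Sketch: a third "deployment" repo pulling in two app repos as
# submodules. The local repos below are stand-ins for real remotes.
set -e
workdir=$(mktemp -d)
cd "$workdir"

# Stand-ins for the two application repositories
for repo in frontend backend; do
  git init -q "$repo"
  ( cd "$repo" \
    && touch Dockerfile \
    && git add . \
    && git -c user.email=me@example.com -c user.name=me commit -qm init )
done

# The third repository that ties everything together for deployment
git init -q deploy
cd deploy
git -c protocol.file.allow=always submodule add ../frontend frontend
git -c protocol.file.allow=always submodule add ../backend backend
cat .gitmodules   # records each submodule's path and source URL
```

The resulting `.gitmodules` file is what gets committed in the deployment repo: it records each submodule’s path and URL, while the actual code stays in its own repository.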
The only side effect is that I have two more commands to remember:

```shell
git submodule update --init --recursive
git submodule update --remote
```

The first one initializes the git submodules, and the second one pulls their latest version.
Rewriting Dockerfile and docker-compose.yml
Now I can remove all the `git clone` commands from my Dockerfiles, and I’ll keep the third repository for the final docker-compose configuration.
Here is my current frontend Dockerfile:
```dockerfile
FROM node:14-alpine
```
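Fleshed out, a frontend image on that base looks roughly like this - a sketch under my assumptions (a standard Node project layout with `npm start` as the entry point), not the project’s exact file:

```dockerfile
# Sketch: the code now comes from the build context (the submodule
# checkout) instead of a git clone at build time.
FROM node:14-alpine
WORKDIR /app

# Copy the manifests first so the npm install layer stays cached
# between builds that only touch source files
COPY package*.json ./
RUN npm install

COPY . .
EXPOSE 3000
CMD ["npm", "start"]
```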
And my backend Dockerfile:

```dockerfile
FROM php:8.2-cli
```
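For context, a Laravel image on that base might contain something like the following - everything beyond the `FROM` line is my assumption, not the project’s actual file:

```dockerfile
# Sketch of a Laravel backend image (hypothetical beyond the base image).
FROM php:8.2-cli
WORKDIR /app

# Composer, borrowed from the official composer image
COPY --from=composer:2 /usr/bin/composer /usr/local/bin/composer

# PDO driver for SQLite
RUN apt-get update \
 && apt-get install -y --no-install-recommends libsqlite3-dev \
 && docker-php-ext-install pdo_sqlite \
 && rm -rf /var/lib/apt/lists/*

COPY . .
RUN composer install --no-dev --optimize-autoloader

# Laravel refuses to serve requests without an application key
RUN cp .env.example .env && php artisan key:generate

EXPOSE 8000
CMD php artisan migrate --force && php artisan serve --host=0.0.0.0 --port=8000
```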
Don’t forget to include `RUN php artisan key:generate` in your Laravel Dockerfile. If you skip this step, your backend will consistently return a 500 error.
Additionally, I use the `--force` argument with `php artisan migrate` to run the migrations without the interactive confirmation and create the database if it doesn’t exist (since I’m using a SQLite database).
Production time
To put it into production, my first thought was to include the web server directly in the `docker-compose.yml` file. That is good for some cases, especially when this application is the only thing running on the server, but for my usage, it’s not really ideal. I talked about it in my last post.
I keep the nginx service in the `docker-compose.yml` file, but I simply comment out the lines related to it. The current configuration file looks like this:
```yaml
version: '3'
```
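In full, the compose file is along these lines - a reconstruction in which the service names, ports, and the commented-out nginx block are my assumptions:

```yaml
version: '3'

services:
  backend:
    build: ./backend
    # Bind to localhost only: reachable from the host machine,
    # but not from the outside world
    ports:
      - "127.0.0.1:8000:8000"

  frontend:
    build: ./frontend
    ports:
      - "127.0.0.1:3000:3000"

  # Commented out because nginx runs on the host in my setup
  # nginx:
  #   image: nginx:alpine
  #   volumes:
  #     - ./nginx.conf:/etc/nginx/nginx.conf:ro
  #   ports:
  #     - "80:80"
```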
As for the ports lines, binding them to localhost means the host machine can access the ports in question, but they cannot be reached from outside. So, it’s a win-win situation in my case.
Here is the nginx configuration file in case the nginx server is included directly in Docker:
```nginx
events {}
```
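Expanded, such a config might look like this - a sketch in which the routes and upstream ports are my assumptions:

```nginx
events {}

http {
    server {
        listen 80;

        # Use the compose service names, not localhost: inside the
        # nginx container, localhost is the container itself
        location /api/ {
            proxy_pass http://backend:8000/;
            proxy_set_header Host $host;
        }

        location / {
            proxy_pass http://frontend:3000/;
            proxy_set_header Host $host;
        }
    }
}
```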
In the nginx proxy config in Docker, you need to use the container name in the URL (e.g., `http://backend:8000`) instead of localhost, because inside the nginx container, localhost refers to the container itself, not the host. If you mistakenly use localhost, you’ll encounter a nice 502 Bad Gateway error, which can be quite frustrating if you’re not aware of this - like I was - and you might get stuck for hours trying to figure out the issue.
After going through all of this (which took me several days to figure out), the final command line that makes everything work together smoothly (at least for me) is:

```shell
docker-compose up --build
```
And that’s it.
To update the containers, for now, I use:

```shell
docker-compose up --force-recreate --build -d
```
I’m not sure if there’s a better way, but currently, it is sufficient for my needs.
Conclusion (kind of)
It was a cool journey to get here, but that’s just the beginning of it. I still have to learn about CI/CD pipelines, cloud computing, and all the other fascinating and time-consuming tech topics. Development is a non-stop learning process, and that’s why I love developing stuff so much.
There is one online instance with some data in it - it is my personal instance, my development instance, my test instance, and… well, you get it. You can’t test it for now, I’m sorry for that, but I will continue to work on it and eventually get one public instance for you to test.
So, keep developing stuff, sharing knowledge, and, above all else, take care of yourself.