Full-Stack Project: vg-stack (Part 1)


In the web development space, tools and best practices change quite frequently. At one point in the not-so-distant past, the LAMP stack was the go-to approach for launching a web application. These days, PHP is the #8 most popular programming language (according to a company with the name “The Importance Of Being Earnest”), while Python and JavaScript rank above it.

One of the reasons for this change is the growing trend towards microservice architectures. Together with the need for rapid prototyping and the rise of agile development methodologies, Python and other languages have swept in and stolen the LAMP stack’s lunch.

So what is the new stack? A complex question with opinionated answers… Here is a recent community discussion on Hacker News: Go-to web stack today?

But wow, there are so many things to learn and keep up to date with nowadays! To help alleviate this, I’d like to work on a side project using full-stack technologies. In particular, I want to use the following tools to build a full-stack app:

  • Django (backend)
  • Ember (frontend)
  • Docker (development/deployment)
  • Postgres (database)
  • … plus many more! (think git, pipenv, serverless, test-driven development… Maybe type-hints, GraphQL and data visualization too?)

The idea: A web app for keeping track of your video game backlog

No shame: the project is most definitely a copy of The Backloggery, a great site that I use often. However, as far as I know, The Backloggery is built in PHP, probably on a LAMP stack or similar. So I thought I would take a crack at building a small subset of the Backloggery features, on a more modern stack.

I like this idea because it comes with a well-defined problem set. I can already envision the database schema and how to build the backend. But as I mentioned, this project is going to involve the entire stack. That means I gotta learn some frontend JavaScript. In this series of blog posts, I document my experience.
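To make “I can already envision the database schema” a bit more concrete, here is a rough sketch of the kind of record the app will track. This uses a plain dataclass as a stand-in for the eventual Django model; the names and progress states here are hypothetical placeholders, not the final schema.

```python
from dataclasses import dataclass
from enum import Enum


class Status(Enum):
    # Hypothetical progress states, loosely inspired by The Backloggery
    UNFINISHED = "unfinished"
    BEATEN = "beaten"
    COMPLETED = "completed"


@dataclass
class BacklogEntry:
    """A single game in a user's backlog (illustrative, not the real model)."""
    title: str
    platform: str
    status: Status


entry = BacklogEntry("Chrono Trigger", "SNES", Status.UNFINISHED)
print(entry.status.value)  # unfinished
```

The real version will be a Django model with a foreign key to the user, but the shape of the data is essentially this.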

I should say, the reason I picked Ember and not (the elephant in the room) React is that in my day job, we work with Ember and so this is a skill I would like to improve! Besides, we all know Postgres is the real elephant here, right? Let’s move on.

The app: vg-stack

I had to name my repo something!

In all honesty, I like the name because it hints at the purpose of the project (practicing the web stack) while at the same time conjuring a mental image of a bunch of video games stacked on top of each other.

Setting up

For this first part, I’m going to build a backend skeleton using Django and the awesome Django REST framework package. As already mentioned, our database will be the ever-popular PostgreSQL and we are aiming for best practices in all parts of the stack. That includes deployment/development tools and environments! That said, I’m going to assume knowledge of the tools used and explain only the highlights.

Project Structure

In fact, I’m going to skip most of the project setup entirely and just go with a pre-built template! I like William Vincent’s drfx, so let’s go ahead and clone that:

$ git clone https://github.com/wsvincent/drfx vgstack

The template provided comes with a lot of best practices built in. We can build on top of it and get going quickly!


We’re going to install the following extra packages for now: python-dotenv to comply with factor III (Config) of The Twelve-Factor App, psycopg2-binary to use PostgreSQL as our database, black to format our code and factory_boy to improve our test maintenance.

$ pipenv install python-dotenv psycopg2-binary
$ pipenv install --pre --dev black factory_boy

We install black and factory_boy as development dependencies and tell Pipenv to allow the pre-release black package through!
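After these commands, the relevant sections of the Pipfile should look roughly like this (a sketch; exact contents depend on your Pipenv version, and `pipenv install --pre` is what adds the `allow_prereleases` flag):

```toml
[packages]
python-dotenv = "*"
psycopg2-binary = "*"

[dev-packages]
black = "*"
factory_boy = "*"

[pipenv]
allow_prereleases = true
```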

Django Settings

Let’s get started setting up our project.

First, create a .env file for use with python-dotenv.

# vgstack/.env

DEBUG=True
SECRET_KEY=replace-me-with-a-real-secret-key
DATABASE_NAME=postgres
DATABASE_USER=postgres
DATABASE_HOST=database
DATABASE_PORT=5432

Replace the SECRET_KEY with an actual secret key! You can generate one in Python:

from django.core.management.utils import get_random_secret_key

print(get_random_secret_key())

The database env variables can stay as they are, since those settings happily coincide with the default database, user and password in the Postgres Docker Image!
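For the curious, what python-dotenv does is conceptually simple: read KEY=VALUE lines from a file and put them into the process environment. Here is a simplified, stdlib-only sketch of the idea (the real library also handles quoting, variable interpolation and more — `load_env_text` is a made-up name, not the library's API):

```python
import os


def load_env_text(text):
    """Naive KEY=VALUE parser, mimicking the core idea of python-dotenv."""
    parsed = {}
    for line in text.splitlines():
        line = line.strip()
        # Skip blank lines, comments and anything that isn't KEY=VALUE
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        parsed[key.strip()] = value.strip()
        # Like load_dotenv(), don't clobber variables that are already set
        os.environ.setdefault(key.strip(), value.strip())
    return parsed


env = load_env_text("DEBUG=True\n# a comment\nDATABASE_HOST=database")
print(env["DATABASE_HOST"])  # database
```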

Now, replace “drfx” with “vgstack” everywhere it appears in the project, such as in values for ROOT_URLCONF, WSGI_APPLICATION and even the main “drfx” directory!

We also modify our settings file in order to use python-dotenv. In the end, it should look like this:

# vgstack/vgstack/settings.py

from pathlib import Path
import os
from dotenv import load_dotenv

BASE_DIR = Path(__file__).resolve().parent.parent
dotenv_path = BASE_DIR / '.env'
load_dotenv(dotenv_path)

# SECURITY WARNING: don't run with debug turned on in production!
# Environment variables are strings, so compare explicitly
DEBUG = os.getenv("DEBUG") == "True"

# SECURITY WARNING: keep the secret key used in production secret!
SECRET_KEY = os.getenv("SECRET_KEY")

# Allow communication from our Docker container if DEBUG=True
ALLOWED_HOSTS = ["*"] if DEBUG else []

# ...

ROOT_URLCONF = "vgstack.urls"

# ...

WSGI_APPLICATION = "vgstack.wsgi.application"

# Database
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": os.getenv("DATABASE_NAME"),
        "USER": os.getenv("DATABASE_USER"),
        "HOST": os.getenv("DATABASE_HOST"),
        "PORT": os.getenv("DATABASE_PORT"),
        "CONN_MAX_AGE": 7200,
    }
}

# ...

Notice I replaced the os.path calls with the new kid on the block: pathlib. I’m trying to make that a habit!
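As a quick standalone illustration of why pathlib is pleasant: the / operator replaces os.path.join, and the result converts cleanly to a plain string whenever an API still expects one.

```python
import os.path
from pathlib import Path

base_dir = Path("/usr/src/app")
# The / operator joins path segments, replacing os.path.join
dotenv_path = base_dir / ".env"

print(dotenv_path)  # /usr/src/app/.env
# Equivalent to the old-style call
print(str(dotenv_path) == os.path.join("/usr/src/app", ".env"))  # True
```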


Docker

Let’s build a simple Python Dockerfile!

# vgstack/Dockerfile

# pull the Python official base image, slimmed down
FROM python:3.7-slim

# set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1

# set work directory
WORKDIR /usr/src/app

# install dependencies with Pipenv
RUN pip install --upgrade pip && pip install pipenv
COPY ./Pipfile /usr/src/app/Pipfile
RUN pipenv install --system --skip-lock

# copy project over to image
COPY . /usr/src/app/

# specify our entrypoint
ENTRYPOINT ["./docker-entrypoint.sh"]

And a nice docker-compose file to help us run the project and related services.

# vgstack/docker-compose.yml

version: '3.7'

services:
  database:
    image: postgres:11-alpine
    volumes:
      - postgres_data:/var/lib/postgresql/data/

  web:
    build: .
    entrypoint: /usr/src/app/docker-entrypoint.sh
    volumes:
      - ./:/usr/src/app/
    ports:
      - 8000:8000
    env_file: .env
    depends_on:
      - database

volumes:
  postgres_data:
    external: false

We can see that we are persisting the Postgres database via a named Docker volume. We are also taking advantage of the env_file option to load in our previously created file. Finally, we set entrypoint to run a shell script instead of an in-line command.

The following entrypoint script is a modified version of the one found in José Padilla’s notaso project. It attempts to use a PostgreSQL cursor every second, and keeps trying until a connection succeeds. Then, it proceeds to migrate the Django models and runs the development server.


#!/bin/sh
# vgstack/docker-entrypoint.sh

set -e

postgres_ready() {
python manage.py shell << END
import sys
import psycopg2
from django.db import connections
try:
    connections["default"].cursor()
except psycopg2.OperationalError:
    sys.exit(-1)
sys.exit(0)
END
}

until postgres_ready; do
    >&2 echo "==> Waiting for Postgres..."
    sleep 1
done

echo "==> Postgres started!"

python manage.py makemigrations users
python manage.py migrate
python manage.py runserver 0.0.0.0:8000

exec "$@"

Note: Make sure to chmod +x docker-entrypoint.sh!

Sanity check

We can now use docker-compose to build our images and start our development server!

$ docker-compose up --build

Visiting http://localhost:8000 in your browser should show… a 404 page. That’s by design! There’s no home page yet. Pointing the browser at the Django admin (http://localhost:8000/admin/) should show a log in screen, which means all is well! You can create a user by opening a new terminal window and using Python inside the running Docker container.

$ docker-compose exec web python manage.py createsuperuser

You may want to start the containers in detached mode so you don’t have to keep opening terminal windows.

$ docker-compose up -d

Applying a black coat of paint

After all that, it’s nice to make sure our code looks nice and consistent. For that, we can use the opinionated Python code formatter, black.

First, create a pyproject.toml file in the root of the project, in order to customize black’s settings if we so choose.

# vgstack/pyproject.toml

[tool.black]
line-length = 88

And now, format:

$ black .

Looking good.

Up Next

Now that the project is set up with best practices thanks to drfx and we have a nice development workflow using Docker, we can get to work writing the actual bulk of the backend. In the next part, we will write the models, serializers, views and (last but not least) the tests that make up our backend code.


Update (2019-03-14)

A previous version of this post mentioned using Angular for the frontend, as this was what my team was using at the time. I’ve since switched jobs, and I want to focus on using my current team’s frontend tech!