This week we shifted our focus somewhat from the development of our web app to DevOps and deployment. We made this decision after facing some issues with local development and subsequent deployment on the Linode server.

Docker

None of us were very familiar with Docker prior to this, so a lot of research was needed for us to get up to speed.

We realised that the best way to dockerize our project would be to pair a database image with a standard Python image via Docker Compose. At present, our Dockerfile looks like this:

# Start from the official Python 3 base image
FROM python:3
# Send Python output straight to the terminal (useful for container logs)
ENV PYTHONUNBUFFERED=1
WORKDIR /code
# Install dependencies first so Docker can cache this layer
COPY requirements.txt /code/
RUN pip install -r requirements.txt
# Copy the Django project files into the image
COPY . /code/

RUN chmod +x /code/start.sh

# Entrypoint script: applies migrations and starts the server
CMD ["/code/start.sh"]

This builds an image from the standard python:3 base, installs our requirements, and copies the Django project files into a /code directory. We also created a basic entrypoint shell script which applies the Django migrations and starts the server on port 8000.
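As a sketch, assuming Django’s development server is used (a production deployment would typically swap runserver for a WSGI server such as gunicorn), the script looks roughly like this:

#!/bin/sh
# Apply any pending database migrations, then start the server on port 8000
python manage.py migrate
python manage.py runserver 0.0.0.0:8000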

The docker-compose.yml file specifies the database image as follows:

version: "3.9"

services:
  db:
    image: postgres
    environment:
        # the following details have been redacted for the purposes of this blog post
      - POSTGRES_DB=
      - POSTGRES_USER=
      - POSTGRES_PASSWORD=
    volumes:
      - postgres_data:/var/lib/postgresql/data/
  web:
    build: .
    ports:
      - "8000:8000"
    depends_on:
      - db

volumes:
  postgres_data:
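
With both services defined, the whole stack can be built and brought up locally with a single command:

docker-compose up --build

Compose starts the db container before the web container (because of the depends_on entry), and the web container then runs the entrypoint script described above.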

It’s important to note that, in line with our client’s original recommendation and after facing compatibility issues between Docker and MySQL, we have reverted to using PostgreSQL for this project. Fortunately, our existing database schema is fully supported by PostgreSQL, so no changes were needed in that regard.
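To illustrate what the switch looks like on the Django side, here is a sketch of the relevant settings.py fragment, assuming the database credentials are also passed to the web container as environment variables (not shown in the compose snippet above):

import os

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ.get('POSTGRES_DB'),
        'USER': os.environ.get('POSTGRES_USER'),
        'PASSWORD': os.environ.get('POSTGRES_PASSWORD'),
        'HOST': 'db',  # the service name from docker-compose.yml
        'PORT': 5432,
    }
}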

While dockerizing, we also took the opportunity to address our client’s concern regarding the confusing “Interviewe(e/r)” naming convention by replacing “Interviewer” with “Surveyor” and “Interviewee” with “Respondent”.

Continuous Deployment

After much initial research, we came to the conclusion that the easiest (and, conveniently, most commonly implemented) mechanism for continuous integration (CI) and continuous deployment (CD) was GitHub Actions.

While our research had shown this was a great way to automatically run integration tests, it hadn’t answered the biggest question: how were we going to deploy the Docker image to Linode?

Our options as we saw them were:

  1. Use rsync or scp to copy over the latest image as and when we built it (though this isn’t really CD) - see the one-liner sketched after this list.
  2. Write a local script which detected newly built images and sent the latest one to Linode.
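
For illustration, option 1 would have amounted to something like the following (the host and image names here are placeholders):

# Export the image, stream it over SSH, and load it on the server
docker save ourapp:latest | gzip | ssh deploy@our-linode-host 'gunzip | docker load'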

Neither of these options appeared suitable, and both were susceptible to numerous issues:

  • A script that detected local builds would not differentiate between working and failing builds, and would send broken images over to Linode just the same.
  • If two of us were working on two different features at the same time, one of us might not see their changes on Linode at all, as the latest built image might come from the other person - potentially wasting time debugging an issue that does not exist.
  • If two of us were working simultaneously and one had changed the structure of the database, the existing data might be invalidated, breaking the entire application for the other (whose version does not reflect the latest changes).

We weren’t quite sure how to get around these issues until we came across GitHub’s guide on publishing Docker images (https://docs.github.com/en/free-pro-team@latest/actions/guides/publishing-docker-images), which describes hosting the latest Docker image on either Docker Hub or GitHub Packages. This immediately struck us as a much more appropriate solution to our problems.

After investigating both options, we decided to use GitHub Packages, as we realised that publishing to Docker Hub would involve being part of a paid plan.

The guide above walked us through creating a publish.yaml workflow, which we configured to treat our main branch as the production/release version of the Django project. Hence, every time we push or merge a commit to the main branch, GitHub builds our Docker image and hosts it in our repository as a GitHub package.
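
A simplified sketch of such a workflow (the action versions, registry, and image name here are illustrative; our actual publish.yaml differs in its details):

name: Publish Docker image
on:
  push:
    branches: [main]

jobs:
  push_to_registry:
    runs-on: ubuntu-latest
    steps:
      # Check out the repository so the Dockerfile is available
      - uses: actions/checkout@v2
      # Authenticate against the registry using the built-in token
      - uses: docker/login-action@v1
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      # Build the image from our Dockerfile and push it
      - uses: docker/build-push-action@v2
        with:
          context: .
          push: true
          tags: ghcr.io/our-org/our-repo:latest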

This wasn’t easy: none of the three of us had any prior experience with GitHub workflows, or any significant experience with Docker, and it took several days of experimentation before we were able to get it working.

However, even with all of this in place, we still hadn’t resolved our initial issue: how do we get our latest version onto Linode?

Watchtower

While researching how to automatically deploy the latest version of a GitHub package to a Linode server, we came across Watchtower (https://github.com/containrrr/watchtower), a solution to exactly the problem we had: it runs alongside our other containers, polls the registry, and redeploys a container whenever its image changes. After setting up Watchtower on Linode, we tested whether making and committing small changes to our main branch would result in the new image being pulled down and redeployed on Linode, which it did.
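For reference, a minimal Watchtower setup is a single container with access to the Docker socket (the poll interval and container name here are illustrative; pulling from a private registry additionally requires mounting Docker’s credentials):

# Run Watchtower, checking for updated images every 5 minutes
docker run -d \
  --name watchtower \
  -v /var/run/docker.sock:/var/run/docker.sock \
  containrrr/watchtower \
  --interval 300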

As it stands, this is our current CD workflow, which is triggered each time a push is made to the main branch:

  1. GitHub Actions rebuilds the Docker image and publishes it to GitHub Packages.
  2. Watchtower, running on the Linode server, detects the new image and pulls it down.
  3. The web container is recreated from the latest image and redeployed on the server.

This has allowed us to give our client a URL where they can view our progress and provide feedback as required.

Next steps

Now that we have the project dockerized and CD set up, we can concentrate our efforts on development and on adding functionality as per the client requirements. We intend to properly integrate our backend functionality with the work we have completed to date on the frontend. This involves making sure elements can be added to and removed from the database, implementing visualisations on our users’ pages, and making the outward-facing functional elements work.

Also, since we have established that our main branch will serve as the production branch, we will naturally resume a more asynchronous style of development, with each of us working independently on different aspects of the project.