Jan-Lukas Else

Thoughts of an IT expert

Migrate a PostgreSQL Container to a New Major Version

Published in 👨‍💻 Dev
Short link: https://b.jlel.se/s/207
⚠️ This entry is already over one year old. It may no longer be up to date. Opinions may have changed.

A few weeks ago PostgreSQL 11 was released with a few new features and probably also a lot of improvements and bug fixes since the last release. Although I don’t really need the latest version (I only use PostgreSQL as the database for my Nextcloud and Miniflux installations), I wanted to migrate anyway, to keep everything up to date and to benefit from those smaller improvements.

I use Docker and docker-compose for everything I host myself, so I also run PostgreSQL inside a Docker container. But using Docker makes migrating between major versions of PostgreSQL a little more complicated, because the on-disk data format is not compatible between major versions — a new major version won’t simply start on the old data directory. Because I had never done a migration between major versions before, and because there isn’t really a good guide on how to do it, here’s the way I did it after some research:

In my case it was a migration from postgres:10.5-alpine to postgres:11.0-alpine.

The first step is to stop all services that access the database. So if your Nextcloud or Miniflux uses the Postgres database, stop it.
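With docker-compose this can look like the following sketch — nextcloud and miniflux are hypothetical service names from my setup, so use whatever names your own docker-compose.yml defines:

```shell
# Stop the services that talk to the database
# (service names are examples, not part of any default setup)
docker-compose stop nextcloud miniflux

# Check that only the database container is still running
docker-compose ps
```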

Next you create a “backup” of your postgres database. You can do so by using this command:

docker-compose exec -T db-container pg_dumpall -U dbuser > pgdump

Replace db-container with the name of the Postgres container and dbuser with the database user. You should also make sure the file pgdump doesn’t already exist, because your “backup” is written to that file. Depending on the database size, this file can grow very big, so make sure you have enough free storage and be patient if it takes a while.

Now that the dump has finished, it’s time to delete the old container — but first make sure the dump looks correct: use less pgdump to check that the file contains sensible SQL. Then stop and delete the old container (docker ps shows you all running containers, and docker rm lets you delete one). If you use a volume for the database (which you should), delete all content in it (or delete and recreate the volume) to make sure the data files from the old version are gone.
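Put together, the cleanup could look like this sketch — db-container and the volume name pgdata are placeholders for whatever your docker-compose.yml actually defines:

```shell
# Stop and remove the old PostgreSQL container
# (db-container is a placeholder for your service name)
docker-compose stop db-container
docker-compose rm -f db-container

# Remove the old data volume so the new version starts with an empty data directory.
# The volume name is an example — list yours with `docker volume ls` first.
docker volume rm pgdata
```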

Update the postgres version in your docker-compose.yml to the new one and run docker-compose up -d db-container to start the container with the new version.
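The relevant change in docker-compose.yml is just the image tag. A minimal sketch — the service name db-container and the volume name pgdata are assumptions, not part of any default file:

```yaml
services:
  db-container:
    # before: image: postgres:10.5-alpine
    image: postgres:11.0-alpine
    volumes:
      - pgdata:/var/lib/postgresql/data

volumes:
  pgdata:
```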

Finally import the “backup” to the new version:

docker exec -i db-container psql -U dbuser < pgdump

After the import has finished, restart the database container and everything you stopped before the migration (like your Nextcloud). Check that everything works, and if it does, you can finally delete the pgdump file.
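The final restart, again with the hypothetical service names from above:

```shell
# Restart the database container with the new version
docker-compose restart db-container

# Bring the dependent services back up (names are examples)
docker-compose start nextcloud miniflux
```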

This might not be the best solution, especially not for really big databases or if you want zero downtime, but it works for most setups. You could probably pipe the dump directly from the old container into the new one and skip the pgdump file entirely, but that’s another chapter.
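As a sketch of what such piping could look like: this assumes the old container is still running as db-old while the new one is already up as db-new (both hypothetical names on different ports or volumes), so it doesn’t fit the stop-and-replace flow above exactly.

```shell
# Stream the dump from the old container straight into the new one,
# without writing an intermediate pgdump file to disk.
# db-old, db-new and dbuser are placeholders for your own names.
docker exec db-old pg_dumpall -U dbuser | docker exec -i db-new psql -U dbuser
```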

I really hope this helped you. Most of it is based on the comments on this GitHub issue.

“Backup” is probably the wrong word; it’s technically a SQL script that recreates the state of your databases.

