Postgres and Go series, part 1: postgres tests setup in Go

CI and local friendly postgres tests in Go

This is the first post in the series Postgres and Go. By the end, we will have a working setup for Postgres tests in Go, both locally and in CI, using docker and docker-compose.

The next posts in the series will cover these topics:

  • Libraries to work effectively with postgres in Go: the trio of pgx, squirrel, and scany. No need for a magical ORM!
  • Isolating DB layer from the business logic in Go. Make DB layer swappable and easy to change.
  • Environment-based database migrations using goose in Go.
Note: Even though the code examples are in Go, this post focuses on setting up postgres tests via docker and docker-compose. I hope the guide is useful even if you don't use Go, since the concepts transfer easily to other languages.

To get updates on the next posts in the series, feel free to subscribe to the blog, no spam guaranteed.

What we will achieve in the end

  1. Set up postgres locally through docker, to be used in our Go tests.
  2. Verify the DB setup with a simple Go test.
  3. Automate the whole setup with a Makefile.
  4. Use docker-compose to run postgres tests in Go in any CI/CD pipeline (e.g. Jenkins).


When an engineer clones the project for the first time and executes [install dependencies] && [run tests], it should just work. The problem comes with the db setup though: if it's not automated, the person spends at least a few hours setting up the db locally, applying db migrations, etc.

Sometimes the team is tempted to switch off the database test suite in CI (e.g. Jenkins, CircleCI), because a db test setup in CI can be clunky. The result: the db test suite doesn't run in CI, not everyone remembers to run it locally, and eventually the db test suite rots.

The worst case scenario: the database layer is simply not tested. Bugs are discovered in sandbox or production, the feedback loop slows down, and productivity drops. Engineers are afraid to touch and refactor db layer code, because they don't know if it will still work, oops.

Another approach might be to use some in-memory db that replicates postgres, but this doesn't give us 1:1 feature parity with postgres, and you will end up with more trouble and debugging than necessary.
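For example, an upsert relying on postgres's ON CONFLICT clause (the users table here is hypothetical, purely for illustration) is exactly the kind of statement an in-memory substitute may reject or handle differently:

```sql
-- postgres-specific upsert; many in-memory lookalikes don't support this syntax
insert into users (id, name) values (1, 'alice')
on conflict (id) do update set name = excluded.name;
```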

Let’s get our hands dirty

Spin up postgres docker container for tests

docker run --name myapp-postgres-test -p 5432:5432 \
-e POSTGRES_USER=postgres -e POSTGRES_PASSWORD=pass -e POSTGRES_DB=testdb \
-d postgres:12.6

This command does the following:

  • We give a name to our docker container with the flag --name <container name>.
  • Flag -p 5432:5432 maps the container's postgres port to the same port on the host.
  • If the postgres image is not present locally, docker will fetch it.
  • We pass the db credentials for the postgres container in as environment variables via the -e flags.

In my case I pinned postgres to 12.6, because that's the version used in the sandbox/production environments of my application. To get the latest version, simply omit the :12.6 part from the command above, or pin to the version you run.

When you want to stop the container, execute:

docker stop myapp-postgres-test

To run the container again after it stopped, execute:

docker start myapp-postgres-test

Of course, feel free to replace myapp with the name of your actual app.

Now we can connect to our postgres with the following connection uri: postgres://postgres:pass@localhost:5432/testdb

Connecting to postgres in our Go tests

In my examples I use go version 1.16.

Let’s start with the simple “hello world” db test in Go.


mkdir testapp && cd testapp && touch db_test.go

Let's initialize the go.mod file and get pgxpool as a dependency to establish the db connection:

go mod init testapp && go get github.com/jackc/pgx/v4/pgxpool

My go.mod looks like this:

module testapp

go 1.16

require github.com/jackc/pgx/v4 v4.10.1 // indirect

Our db_test.go is the following:

package testapp

import (
	"context"
	"fmt"
	"log"
	"os"
	"testing"

	"github.com/jackc/pgx/v4/pgxpool"
)

func TestDB(t *testing.T) {
	var err error

	pgHost := "localhost"
	if host := os.Getenv("PGHOST"); host != "" {
		pgHost = host
	}

	dbPool, err := pgxpool.Connect(context.Background(),
		fmt.Sprintf("postgres://%v:%v@%v:%v/%v", "postgres", "pass", pgHost, 5432, "testdb"),
	)
	if err != nil {
		t.Fatal("Fatal error while connecting to postgres: ", err)
	}
	defer dbPool.Close()

	// test db connection
	var greeting string
	err = dbPool.QueryRow(context.Background(), "select 'Hello, world!'").Scan(&greeting)
	if err != nil {
		t.Fatal("Error while making test select statement in postgres: ", err)
	}

	if greeting != "Hello, world!" {
		t.Fatal("Error on simple Postgres smoke test, incorrect result, got: ", greeting)
	}

	log.Println("Successfully connected to postgres!")
}

Let’s run go test:

go test
2021/03/22 10:52:43 Successfully connected to postgres!
ok  	testapp	0.107s

Bingo! So what do we have so far:

  1. A one-liner to spin up a postgres db as a docker container exclusively for tests.
  2. An example go test that connects to our db and does a simple ‘Hello world’ interaction to verify that all works as expected.

Automate with Makefile

Let's put all the commands for dealing with our docker test db into a Makefile:

touch Makefile 

Makefile will have the following targets:

.PHONY: pgTestSetup \
		pgTestStart \
		pgTestStop \
		install \
		test

pgTestSetup:
	@ docker help > /dev/null 2>&1 || (echo "Please install docker." && exit 1)
	docker run --name myapp-postgres-test -p 5432:5432 \
	-e POSTGRES_USER=postgres -e POSTGRES_PASSWORD=pass -e POSTGRES_DB=testdb \
	-d postgres:12.6 || echo "Postgres container for tests already set up"

pgTestStart:
	docker start myapp-postgres-test || (make pgTestSetup)

pgTestStop:
	docker stop myapp-postgres-test

install: pgTestSetup
	go mod download

test: pgTestStart
	go test ./...

Why do we do this? When a new member joins the team, all he or she has to do to run the tests successfully is:

git clone <myapp> && cd myapp && make install && make test

That's it. The whole db docker setup runs automatically and the tests pass.

Note: Treat it as a design smell when a lot of manual steps are required after you 'git clone' the project. The local setup should ideally be just one command, e.g. 'make install'.

Using docker-compose to have postgres tests setup in CI/CD pipeline

This will be simpler than you think. First, let's see what our docker-compose.yml file looks like, and then we will go through it step by step.

touch docker-compose.yml:

version: '3'
services:
  app:
    build:
      context: .
      dockerfile: Dockerfile.test
    container_name: myapp_tests
    environment:
      PGHOST: myapp_postgres
    depends_on:
      - myapp-postgres
  myapp-postgres:
    image: postgres:12.6
    container_name: myapp_postgres
    environment:
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=pass
      - POSTGRES_DB=testdb

Key things to notice:

  1. Here we orchestrate 2 services: app and myapp-postgres. Both have the container_name property, whose value is also the host name containers use to reach each other. This is exactly why we pass PGHOST: myapp_postgres: our app connects to the postgres container via the host specified in the container_name property of myapp-postgres.
  2. Property depends_on is important here: it tells docker-compose to first spin up myapp-postgres, and only afterwards proceed with the app.
  3. Dockerfile.test has setup to run actual tests. Let’s take a look at the Dockerfile.test next.

touch Dockerfile.test:

FROM golang:1.16-alpine as builder

RUN apk update && \
    apk add git make pkgconf gcc libc-dev openssl

WORKDIR /app

COPY go.mod ./
COPY go.sum ./
RUN go mod download

COPY . .

# download test deps
RUN go get -t ./...

ENTRYPOINT go test ./...

Our Dockerfile.test simply prepares everything needed to run the tests during docker build, and runs them when the container spins up, which is why go test ./... sits in the ENTRYPOINT.

Now we have everything ready to actually run tests with docker-compose setup. Run:

docker-compose build && docker-compose up --abort-on-container-exit

Flag --abort-on-container-exit is important here: once our tests finish and the app container exits, docker-compose stops the postgres container too and exits with the exit code of the container that stopped first (our tests), so the CI step fails when tests fail.

It would be nice to clean up the containers after the tests run, so the last command is:

docker-compose down

And of course we put it all into the Makefile to make our life easier in the future:

.PHONY: ciTest
ciTest:
	docker-compose build
	docker-compose up --abort-on-container-exit
	docker-compose down

Now all we have to do in our CI/CD pipeline is run make ciTest.
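As an illustration, a minimal CI job (GitHub Actions syntax here, purely as an example; the post targets any CI system with docker and docker-compose available) just invokes that target:

```yaml
# illustrative fragment of a GitHub Actions workflow;
# any CI runner with docker and docker-compose can do the same
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - run: make ciTest
```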

Wrap up

In this blog post we focused mostly on the docker and docker-compose setup. There are other things to consider when working with postgres in Go:

  • How to do migrations.
  • Which Go libraries to use with Postgres (tip: the pgx, scany, and squirrel combo works wonders).
  • Tactics to isolate DB layer from the business logic.

I will cover these questions in the next posts in the series.

And if you are new to Go, pick up a good book on the language, or head to the legendary A Tour of Go to get going.

And by the way, I work with Go and postgres at Starship, where we build self-driving delivery robots, so if you feel like taking on a new engineering challenge, feel free to contact me or apply.

Thanks for reading.

To get updates on the next posts in the series, feel free to subscribe to the blog, no spam guaranteed.
