Next level Go testing
Go makes it very easy to write tests. In fact, the test tool is built into the standard toolchain, and you can just run go test to run your tests, without installing extra dependencies or anything else. The testing package is part of the standard library, and its use, I'm happy to observe, is pretty widespread.
As you're writing your service implementations in Go, hopefully your test coverage is growing with time. With that larger scope, the tests take longer to run as well. Hopefully you use service integrations and integration tests to cover the important parts of your services. Sooner or later, you figure out that there are cases where integration tests, and the coupling to various public services, become restrictive for CI or development use.
Integration testing
I'm a big believer in integration testing. Its benefits may not be directly observed by some, but with LTS (long term support) releases, having integration tests is a very good idea, as you obviously want to upgrade your services over time as well. If you want to switch from MySQL 5.7 to 8.0 (or even to PostgreSQL), you want to be reasonably sure that your service will continue to work, and that you detect issues and update the implementation as required.
One example where integration testing has been useful for me lately is detecting additions to the MySQL reserved keywords: I have a database deployment that used a rank column. While the word was fine to use up to and including MySQL 5.7, it has become a reserved word in MySQL 8.0. The integration test caught this issue, while a mock is not able to do that.

RANK (R); added in 8.0.2 (reserved) - via Keywords and Reserved Words for MySQL 8.0
While mocking is an extension of unit testing, motivated by the prohibitive cost that an integration test might imply, doing integration testing today is significantly easier than it used to be. With the advancement of Docker, and with Docker-first CIs like Drone CI, we can declare our services in the CI test suite. Let's look at a MySQL service that I have defined:
services:
  - name: crust-db
    image: percona:8.0
    ports:
      - 3306
    environment:
      MYSQL_ROOT_PASSWORD: bRxJ37sJ6Qu4
      MYSQL_DATABASE: crust
      MYSQL_USER: crust
      MYSQL_PASSWORD: crust
That's literally all we need today to spin up a database alongside our tests and builds. While in the past this may have meant an always-on database instance that you had to manage somewhere, today the doors are open to declaring basically all your service requirements within the CI framework you work in.
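To give a sense of how a test might use this, here's a minimal sketch that opens a connection to the service declared above. The crust-db hostname and credentials come straight from the CI config; the package name, the test itself and the CRUST_DB_DSN override are placeholders of mine, not part of the original setup.

package crust_test

import (
    "database/sql"
    "os"
    "testing"

    _ "github.com/go-sql-driver/mysql"
)

func TestDatabaseAvailable(t *testing.T) {
    // In the CI environment the service is typically reachable by its name;
    // allow a local override so the same test works outside of CI.
    dsn := os.Getenv("CRUST_DB_DSN")
    if dsn == "" {
        dsn = "crust:crust@tcp(crust-db:3306)/crust"
    }

    db, err := sql.Open("mysql", dsn)
    if err != nil {
        t.Fatalf("opening database: %v", err)
    }
    defer db.Close()

    if err := db.Ping(); err != nil {
        t.Fatalf("pinging database: %v", err)
    }
}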
"Go and integration tests: simple with Drone CI #golang" via @TitPetric
Click to Tweet
I'm digressing here a bit, but the wisdom is: if you can avoid mocks for some things, especially services under your control, do consider writing integration tests instead. You don't need to resort to projects like gomock or moq from the get-go. Mocking everything is not sensible (e.g. net.Conn doesn't need mocking; it's simple enough to create your own client/server inside your tests, living entirely in memory).
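To illustrate the net.Conn point, here's a minimal sketch using net.Pipe from the standard library, which hands you both ends of an in-memory connection; the echo behaviour is just an example of mine:

package echo_test

import (
    "net"
    "testing"
)

func TestEcho(t *testing.T) {
    // net.Pipe gives us a synchronous, in-memory client/server pair,
    // so there's no need to mock net.Conn or open a real socket.
    client, server := net.Pipe()
    defer client.Close()
    defer server.Close()

    // A trivial "server" that echoes back whatever it reads.
    go func() {
        buf := make([]byte, 64)
        n, err := server.Read(buf)
        if err != nil {
            return
        }
        server.Write(buf[:n])
    }()

    if _, err := client.Write([]byte("ping")); err != nil {
        t.Fatal(err)
    }

    buf := make([]byte, 64)
    n, err := client.Read(buf)
    if err != nil {
        t.Fatal(err)
    }
    if got := string(buf[:n]); got != "ping" {
        t.Errorf("expected %q, got %q", "ping", got)
    }
}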
In fact, there's also a middle ground between integration tests and mocks, where you write a fake implementation of a simple external service like Redis. You still won't catch all the nuances of the real thing, but having to satisfy only the small interface you actually use shrinks the implementation surface so much that you can realistically implement the behaviour of the API subset you rely on.
Testing surface area
I'm working on a project which currently has 53 test files; 28 of them are integration tests that require an external service (like the database above). You may not always be dealing with a full environment, or perhaps you're just interested in a small subset of tests spread out over your project, and you want to be able to run those (and only those).
Looking at the testing package API surface, we notice that there's a Short() function available, which reacts to the -short option when running go test. This allows us to skip some tests when we only want to run a subset of them:
func TestTimeConsuming(t *testing.T) {
    if testing.Short() {
        t.Skip("skipping test in short mode.")
    }
    ...
}
On paper, this means that you could skip integration tests if you’re running in short mode. But the motivation, even from the example above, is that this should be used for skipping tests where duration is an important factor - effectively, this should apply only to benchmarks.
So, considering that you need to enable benchmarks explicitly with the -bench argument anyway, you may be wondering whether one benchmark or another even compares in slowness. Go is already smart enough to limit the amount of time each benchmark runs by default, so unless you're tweaking that somehow, it's entirely in your hands - and using short mode and benchmarks in tandem, for me, just doesn't make sense.
Effectively, the short test flag should not be used to skip integration tests. Its intent is to speed up the build, but having it do so with code and personal judgement as to which tests count as short or long is prohibitive. Emphasis: either you run all the benchmarks or you don't. The short flag doesn't give us the flexibility we need as our test suite grows; we want a more declarative way to choose what kind of tests we run.
A better way
Now, conventional wisdom says "just run all tests". But being one of those engineers who actually reads up on how other people deal with the issue, asks questions and looks for established practice leads you to some better solutions to problems that aren't unique to you.
In 2014, and later in 2016, Peter Bourgon wrote two brilliant long form articles that should be a go-to reference for people who are moving into implementing real services and outgrowing very basic implementations.
In the 2014 writeup, Peter suggests using build tags to introduce a valuable testing idiom:
Package testing is geared around unit testing, but for integration tests, things are a little trickier. The process of spinning up external services typically depends on your integration environment, but we did find one nice idiom to integrate against them. Write an integration_test.go, and give it a build tag of integration. Define (global) flags for things like service addresses and connect strings, and use them in your tests.
Effectively, Peter suggests marking your integration tests using Go build tags. If you need a separate environment where those tests should run, you just pass -tags=integration to go test.
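A minimal sketch of that idiom could look like the following; the build tag and the global flag are the idiom itself, while the flag name, its default value and the test body are assumptions of mine:

// +build integration

package store_test

import (
    "database/sql"
    "flag"
    "testing"

    _ "github.com/go-sql-driver/mysql"
)

// The connect string is a global flag, so each integration environment can
// point the tests at whatever database it has available.
var dsn = flag.String("dsn", "crust:crust@tcp(crust-db:3306)/crust", "database connect string")

func TestStoreIntegration(t *testing.T) {
    db, err := sql.Open("mysql", *dsn)
    if err != nil {
        t.Fatalf("opening database: %v", err)
    }
    defer db.Close()

    if err := db.Ping(); err != nil {
        t.Fatalf("pinging database: %v", err)
    }
    // ... exercise the real queries against the real schema here
}

Running go test -tags=integration ./... picks such files up; without the tag, they aren't even compiled.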
This makes complete sense - while my integration tests on this project take a minute or so, I know of projects where they take hours. Those projects might have very special dedicated testing setups, so that you don’t test the provisioning of those services as well - they are just available in a testing environment.
I was interested to know what changed in his opinion between 2014 and 2016. If anything, the author digresses into how various non-standard-library testing packages become their own DSLs (domain specific languages). But, experience being a good teacher, he alludes to testing an http.Client, noting that you don't want to test the HTTP transport the request comes in on, or the path on disk you are writing files to.
You’re supposed to focus on business logic in unit tests, and with integration tests you’re validating the functionality of the integrated service, and not necessarily how that integration is implemented by the standard library or third party package.
"Go testing: which one are right for you - unit or integration tests? #golang" via @TitPetric
Click to Tweet
The corner cases
It's common to integrate your app against third party services, and perhaps the integration test also needs to validate that the application responses still make sense - API deprecation being a real thing. As such, Peter's write-up begs for a little improvement.
You can't always rely on the API you're using; will it stay as-is for years to come? Nobody wants you to create a bunch of GitHub users and organisations to test your webhook endpoints and integrations, but that doesn't mean you don't occasionally need to do just that.
A recent example would be a larger deprecation of Bitbucket APIs due to GDPR. The deprecation was announced about a year ago, started in October, and is scheduled to obsolete various APIs and returned data at the end of April 2019, probably wreaking havoc on the various CI integrations out there.
With that in mind, I extended Peter's recommendations like so:
- // +build unit - a test that doesn't require any service,
- // +build integration - a mandatory tag to test against our services,
- // +build external - testing against third party and public services,
- // +build integration,external - testing against our and public services,
- // +build !integration,external - testing exclusively against the public services,
- // +build unit integration - no service required, provides functions for integrations
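As a concrete illustration of how the combined tags behave, a test file that needs both our own services and a public one would start like this (the package and test names are placeholders):

// +build integration,external

package webhooks_test

import "testing"

// This file is only compiled when BOTH tags are passed, e.g.
// go test --tags="integration external" ./messaging/webhooks/...
// With the CI invocation further below (--tags="unit integration") it's skipped.
func TestGithubWebhookDelivery(t *testing.T) {
    // ... talks to our database and to the real GitHub API
}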
Our tests usually fall into the unit, integration or external categories, or a combination of those. We definitely want to skip external tests in our CI jobs for obvious reasons, but they are invaluable when we're debugging a related issue in development. Often we need to pinpoint only a specific test in a specific package, so running something like the following makes sense:
go test --tags="integration external" ./messaging/webhooks/...
Depending on your build tags, this may run all integration and external tests on a subset of your codebase, skipping unit tests, or it may just run that one test that's both an integration and an external test. Either way, you're focused on a package implementation, and particularly on the subset of tests within that package that match the provided tags.
"Go testing: A practical method to run integration tests on demand #golang" via @TitPetric
Click to Tweet
For CI jobs, the scope is fixed:
go test --tags="unit integration" ./...
So you test integrations to the full extent, as well as the full package scope. We're skipping external and integration AND external tests, which could cause our CI builds to fail without it being an issue with our build. GitHub or Bitbucket might just be having a bad day - looking at their respective status pages, that happens about once per month.
So, basically, in addition to marking some tests as integration, we want to mark others as unit and external, so we can skip them as needed in the development environment. Nobody likes to run the full test suite and figure out that it's failing just because GitHub has issues. But having the option for development and debugging purposes is invaluable.
Testing tests
When refactoring tests, you usually only find out when you run them that some symbol or other isn't there anymore and your tests don't compile. A good approach to resolve this is to test just the compilation step of the tests. A few things come into play:
- Skipping tests can be done by filling out the -run argument to go test. You can run go test -run=^$ ./... and it will effectively compile your complete test suite and skip all tests. This is great for long-running CI jobs, as it's effectively a compile-time check that all your tests are runnable. However, this will still run your TestMain functions.
- Go 1.10 introduced the -failfast flag. If your tests are failing but you have a pretty large test suite, there's going to be a lot of output between the error/failure and the point where the remaining tests finish and notify you of the failure. With this option you can optimize this a bit, at the cost of possibly failing tests later on in your test suite. It's the difference between testing everything and reporting all errors, or testing only until the first error is found.
- The -failfast flag doesn't have any effect across ./...: if one of the packages fails, due to a compile error for example, it will continue testing against the remaining detected packages.
These are basically just hacks around the golang/go#15535 issue, which effectively means we can't just check that tests compile the way we would with go build.
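As a rough sketch, a CI step built on these workarounds could combine a normal build with a no-op test run - the first command compiles the packages themselves, the second also compiles every test file without running a single test. The step layout is an assumption of mine, not something the issue prescribes:

go build ./...
go test -run=^$ ./...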
"Go testing: compile-time check your tests without running them #protip #golang" via @TitPetric
Click to Tweet
Public vs. private test APIs
Ideally, you'll resort to black-box testing for your packages. This means that your package might be named store, and your tests would live in the package store_test. This can also solve a dependency problem for you: for example, net/http depends on net/url, so a test in net/url that needs net/http would create an import cycle. Having url_test and http_test packages solves this issue.
Additionally, there are a few common rules that can apply for any codebase:

- if you're doing internal tests, suffix your files with _internal_test.go,
- if you're doing black-box tests, your file should just have the _test.go suffix.
Particularly, for something named store, you could have:

- store.go - the main package (package store)
- store_test.go - the black box tests (package store_test)
- store_internal_test.go - the internal tests (package store)
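To make the split concrete, here's a minimal sketch of the two test flavours; the Store type, the New constructor, the parseDSN helper and the import path are all hypothetical:

// store_test.go - black-box tests only see the exported API
package store_test

import (
    "testing"

    "example.com/project/store"
)

func TestNew(t *testing.T) {
    s, err := store.New("crust:crust@tcp(crust-db:3306)/crust")
    if err != nil {
        t.Fatal(err)
    }
    if s == nil {
        t.Fatal("expected a store instance")
    }
}

The internal tests live alongside the implementation and can reach unexported symbols directly:

// store_internal_test.go - internal tests share the package name
package store

import "testing"

func TestParseDSN(t *testing.T) {
    if _, err := parseDSN("crust:crust@tcp(crust-db:3306)/crust"); err != nil {
        t.Fatal(err)
    }
}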
There are a few notable examples of what you can do with this. In a talk by Mitchell Hashimoto, on the subject of "Advanced Testing in Go", he advocates Testing as a public API:
- Newer HashiCorp projects have adopted the practice of making a “testing.go” or “testing_*.go” files.
- These are part of the package, itself (unlike regular test files). These are exported APIs for the sole purpose of providing mocks, test harnesses, helpers, etc.
- Allows other packages to test using our package without reinventing the components needed to meaningfully use our package in a test.
The issue I have with this (which might not be a very relevant issue) is that ANY public API change would require some sort of compatibility promise. While that in itself may be acceptable, it's definitely not the norm. In most cases, it might be more appropriate if these test functions were scoped only to the tests in the project.
I'd just add the functions to store_internal_test.go - any public symbols defined in *_test.go files are still available in the package, but only accessible to tests. When your application is compiled, it's not going to pull in anything that you declare in test files. And if you ever change your mind and need to make some of them public, you just move the code into a testing.go, and don't need to change a single line of your tests.
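A minimal sketch of such a test-only helper, reusing the hypothetical Store from above:

// store_internal_test.go
package store

import "testing"

// NewTestStore is exported, but since it's declared in a _test.go file it is
// only compiled for this package's tests - including the black-box tests in
// package store_test - and never ships with the package itself.
func NewTestStore(t *testing.T) *Store {
    t.Helper()
    s, err := New("crust:crust@tcp(crust-db:3306)/crust")
    if err != nil {
        t.Fatalf("creating test store: %v", err)
    }
    return s
}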
"Go testing: Should you have a public API to provide testing utilities? #golang" via @TitPetric
Click to Tweet
The principle suggested can also be used to expose some private symbols from the package, for use in the black-box tests. While I can't seem to find a strong use case that would justify this approach, apart from the import cycle issue above, it's possible to expose internals from your package which are then available to your tests only. You're effectively mixing internal tests and black-box tests if you go down this path, and I would advise against that. Internals will change, and your tests will be more fragile as a result.
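If you do go down that path, the usual shape is a tiny export file inside the package itself; a minimal sketch, again assuming the hypothetical parseDSN:

// export_test.go
package store

// ParseDSN re-exports the unexported parseDSN for the black-box tests in
// package store_test. It only exists while testing, but it couples those
// tests to an internal symbol - which is exactly the fragility mentioned above.
var ParseDSN = parseDSN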
There are very few examples of this in the wild, but here are a few:

- API Testing - Swagger - exposes/wraps functionality provided by a private function for black box tests,
- Separate _test package - alludes to exporting/mocking via additional files (no linked example),
- Export unexported method for test - a poor example of exporting a private function for tests.
Effectively, one could write an internal test that achieves the same thing in most cases. I'm not particularly advocating that such _internal_test.go files should expose internals to black-box tests, but I see the sense in using them to provide utility entities that may one day become part of your public package API. It's still a step too far for my taste, but it all comes down to your requirements. If you don't want to release some sort of API before a given date or release, this is a way to implement it as a public package API without actually publishing it for use outside of tests. To each their own.
While I have you here...
It would be great if you buy one of my books:
- Go with Databases
- Advent of Go Microservices
- API Foundations in Go
- 12 Factor Apps with Docker and Go
Feel free to send me an email if you want to book my time for consultancy/freelance services. I'm great at APIs, Go, Docker, VueJS and scaling services, among many other things.