Testing with Python (part 2): moving to pytest
Stripping excuses for not testing until I realize I just suck
Summary
pytest is a gem of a framework that removes many of our objections to eating our testing vegetables, while making the whole ordeal more productive, more flexible, and faster to execute.
Porting the previous article's code to pytest gives us:
import pytest
from the_code_to_test import add


@pytest.fixture()
def setup_and_tear_down():
    print('This is run before each test')
    yield
    print('This is run after each test')


def test_add_integers(setup_and_tear_down):
    result = add(1, 2)
    assert result == 3
    result = add(1, -2)
    assert result == -1


def test_add_strings(setup_and_tear_down):
    result = add("1", "2")
    assert result == "12"


def test_add_floats(setup_and_tear_down):
    result = add(0.1, 0.2)
    assert result == pytest.approx(0.3)


def test_add_mixed_types(setup_and_tear_down):
    with pytest.raises(TypeError):
        add(1, "2")
A shorter, more expressive, yet feature-equivalent test suite.
To run it, we can call pytest -s the_tests.py and get a test report as before.
Time to practice gratitude
Most people I've met agree that testing is a chore. When you need to do something that's good for you but that you don't want to do, you can either increase your discipline or decrease the friction.
To me, pytest is the ultimate form of the second strategy: it makes all kinds of testing easier, which means everyone is more likely to do it.
For this reason alone, you should use pytest if you can, and it will be the foundation of the teaching for the rest of this series.
But pytest doesn't just make tests less painful, it also makes them:
more productive;
more flexible;
faster to run.
It's a gem in the Python world. No, scratch that, it's a gem in the programming world. In my experience, very few languages have anything remotely as good.
Migrating our code to pytest
Let's take the final code of our previous article.
The files don't change, we still have:
basic_project
├── the_code_to_test.py
└── the_tests.py
"the_code_to_test.py" contains:
def add(a, b):
    return a + b
And "the_tests.py":
import unittest
from the_code_to_test import add


class TestAddFunction(unittest.TestCase):

    def setUp(self):
        # Anything you attach to self here is available
        # in other tests
        print('This is run before each test')

    def tearDown(self):
        print('This is run after each test')

    def test_add_integers(self):
        result = add(1, 2)
        self.assertEqual(result, 3)
        result = add(1, -2)
        self.assertEqual(result, -1)

    def test_add_strings(self):
        result = add("1", "2")
        self.assertEqual(result, "12")

    def test_add_floats(self):
        result = add(0.1, 0.2)
        self.assertAlmostEqual(result, 0.3)

    def test_add_mixed_types(self):
        with self.assertRaises(TypeError):
            add(1, "2")
Let’s convert that to pytest.
First, you need to create a venv, activate it, and pip install pytest inside it. Don't use a global pytest, and don't use the pytest from your distribution packages or installed from anywhere else: it's a recipe for pain. As usual, if you don't know how to use pip and venv, we have an article for this.
Now, let's replace the content of "the_tests.py" with:
import pytest
from the_code_to_test import add


@pytest.fixture()
def setup_and_tear_down():
    print('This is run before each test')
    yield
    print('This is run after each test')


def test_add_integers(setup_and_tear_down):
    result = add(1, 2)
    assert result == 3
    result = add(1, -2)
    assert result == -1


def test_add_strings(setup_and_tear_down):
    result = add("1", "2")
    assert result == "12"


def test_add_floats(setup_and_tear_down):
    result = add(0.1, 0.2)
    assert result == pytest.approx(0.3)


def test_add_mixed_types(setup_and_tear_down):
    with pytest.raises(TypeError):
        add(1, "2")
Placing ourselves once again at the root of the "basic_project" directory, we can run the pytest test runner to execute them all. I get:
pytest the_tests.py
================= test session starts ================
platform linux -- Python 3.10.13, pytest-8.1.1, pluggy-1.4.0
rootdir: /path/to/basic_project
plugins: anyio-3.7.1
collected 4 items
the_tests.py ....
[100%]
================= 4 passed in 0.01s ================
The output will vary a bit because of your platform, your directory's absolute path, your software versions, and sometimes because you have a few pytest options already set without knowing it. That was my case: I heavily customize pytest's output and use plugins, so I had to disable them for the purpose of this article.
Now let's explain what we just did.
In pytest, a test is just a function with a name starting with test. You don't need a class, you just need a particular name. Also, there is no need for assertStuff() methods, it uses the regular assert keyword:
def test_add_integers(setup_and_tear_down):
    result = add(1, 2)
    assert result == 3
    result = add(1, -2)
    assert result == -1
That's not normal Python behavior, though. To get this result, pytest actually performs a lot of magic behind the scenes. I'm not a big fan of magic, but I've come to tolerate this one, because the trade-off is so huge: you can use regular Python operators like ==, !=, >, >=, in or is, and assert will make them part of the test.
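For instance, all of these assertions would work in a pytest test and read like regular Python (an illustrative example, not part of our project):

def test_assorted_assertions():
    numbers = [1, 2, 3]
    total = sum(numbers)
    assert total == 6         # equality
    assert total != 0         # inequality
    assert total >= 3         # comparison
    assert 2 in numbers       # membership
    assert total is not None  # identity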
In fact, it will do more than that, because in case of failure, it will analyze the expression and give you clues on what went wrong. Let's say I write the test with a wrong assertion:
def test_add_integers(setup_and_tear_down):
    result = add(1, 2)
    assert result == 2
    result = add(1, -2)
    assert result == -1
Running pytest once more will give me:
pytest the_tests.py
================= test session starts ================
platform linux -- Python 3.10.13, pytest-8.1.1, pluggy-1.4.0
rootdir: /path/to/basic_project
plugins: anyio-3.7.1
collected 4 items
the_tests.py F... [100%]
================= FAILURES =================
_________________ test_add_integers _________________
setup_and_tear_down = None
def test_add_integers(setup_and_tear_down):
result = add(1, 2)
> assert result == 2
E assert 3 == 2
the_tests.py:14: AssertionError
...
We get some very clear indications:
The test that didn't succeed is in the file "the_tests.py", at line 14, due to an assertion failing.
This assertion was checking result == 2.
Turns out result is 3.
I still haven't explained what the heck this setup_and_tear_down parameter is, though. But I need a whole section for that.
Fixtures, pytest's secret weapon
Being expressive is already a strong asset, but pytest really shines when it comes to setup and tear down. Once again, it uses dark magic to provide it, which is not something you want everywhere, but in this case the price is worth it.
You can mark any generator with the @fixture decorator (we have an article on those):
@pytest.fixture()
# The name doesn't have to be "setup_and_tear_down", you can name it as
# you want. And you can have as many fixtures as you want.
def setup_and_tear_down():
    print('This is run before each test')
    yield
    print('This is run after each test')
By itself, it does nothing.
But if you take any test, and add a parameter with the same name as this fixture, like this:
# The name must be "setup_and_tear_down", we named the fixture that way
def test_add_integers(setup_and_tear_down):
    ...
Then pytest will automatically call the setup_and_tear_down() generator when the test test_add_integers() runs.
This is, of course, not typical Python behavior at all, and it's the magic I talked about: pytest links a fixture function to a parameter bearing the same name, and puts them together when a test runs. It's weird. And it has a name: it's called "dependency injection".
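To demystify it a little, here is a rough idea of what pytest does under the hood. This is only a simplified sketch of the mechanism, with a plain generator standing in for the fixture, not pytest's actual implementation:

import inspect


# A plain generator playing the role of our fixture, for the sketch only
def setup_and_tear_down():
    print('This is run before each test')
    yield
    print('This is run after each test')


# A toy registry mapping fixture names to their generator functions
FIXTURES = {"setup_and_tear_down": setup_and_tear_down}


def run_test(test_function):
    # Look at the names of the test's parameters...
    parameter_names = inspect.signature(test_function).parameters
    # ...create the generators registered under those names...
    generators = [FIXTURES[name]() for name in parameter_names]
    # ...advance each one up to its yield, collecting the yielded values...
    values = [next(generator) for generator in generators]
    try:
        # ...call the test with those values...
        test_function(*values)
    finally:
        # ...then resume the generators so their tear down code runs,
        # even if the test failed
        for generator in generators:
            next(generator, None)

Calling run_test(test_add_integers) would print the setup message, run the test, then print the tear down message. The real pytest does a lot more than this (error reporting, caching, scopes, plugins...), but the general idea is the same.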
Before we see a full example of how that works, let's break down how pytest will use it:
it will call setup_and_tear_down() until it yields;
it will THEN run the test_add_integers() test, passing it the result of the yield;
it will next resume the generator, after the yield (even if the test has failed).
This is a bit abstract, but practically that means:
Any code of the fixture before the line containing yield is executed before the test. It's like the self.setUp() method with unittest.
Any value on the right side of the yield is passed as a parameter to the test. It's like attaching something to self with unittest.
Any code of the fixture after the line containing yield is executed after the test. It's like the self.tearDown() method with unittest.
You probably want an example to illustrate this wall of text :)
Let's add a new fixture and assertion to the code, plus deactivate the setup and tear down on 2 tests. I'll also add some line breaks for clarity:
import pytest
import random
from the_code_to_test import add


# we have two fixtures
@pytest.fixture()
def random_number():
    yolo = random.randint(0, 10)
    yield yolo
    print(f"\nWe tested with {yolo}")


@pytest.fixture()
def setup_and_tear_down():
    print("\nThis is run before each test")
    yield
    print("\nThis is run after each test")


# this test uses them both
def test_add_integers(setup_and_tear_down, random_number):
    result = add(1, 2)
    assert result == 3
    result = add(1, -2)
    assert result == -1
    # The result of the fixture is used in the test
    assert add(0, random_number) >= 0


# this test only uses one
def test_add_strings(setup_and_tear_down):
    result = add("1", "2")
    assert result == "12"


# those tests don't use any fixture
def test_add_floats():
    result = add(0.1, 0.2)
    assert result == pytest.approx(0.3)


def test_add_mixed_types():
    with pytest.raises(TypeError):
        add(1, "2")
Now, if I run the tests:
pytest the_tests.py
================= test session starts ================
platform linux -- Python 3.10.13, pytest-8.1.1, pluggy-1.4.0
rootdir: /path/to/basic_project
plugins: anyio-3.7.1
collected 4 items
the_tests.py ....
[100%]
================= 4 passed in 0.01s ================
Wait… Nothing prints!!
It's a typical pytest trap, and every beginner gets caught by it: pytest captures stdout by default. We need to instruct it not to do so, with the -s option:
pytest the_tests.py -s
================= test session starts ================
platform linux -- Python 3.10.13, pytest-7.3.0, pluggy-1.0.0
rootdir: /path/to/basic_project
plugins: django-4.5.2, clarity-1.0.1
collected 4 items
the_tests.py
This is run before each test
.
We tested with 10
This is run after each test
This is run before each test
.
This is run after each test
..
================= 4 passed in 0.01s ================
We can now clearly see the fixture setup_and_tear_down being called for 2 tests, running its code before and after each of them. We can also see the fixture random_number being called once.
pytest comes with a load of command line flags; let's make it more verbose with -v to see the name of each test individually instead of dots:
pytest the_tests.py -s -v
================= test session starts ================
platform linux -- Python 3.10.13, pytest-7.3.0, pluggy-1.0.0
rootdir: /path/to/basic_project
plugins: django-4.5.2, clarity-1.0.1
collected 4 items
the_tests.py::test_add_integers
This is run before each test
PASSED
We tested with 3
This is run after each test
the_tests.py::test_add_strings
This is run before each test
PASSED
This is run after each test
the_tests.py::test_add_floats PASSED
the_tests.py::test_add_mixed_types PASSED
================= 4 passed in 0.01s ================
The fixture system is very flexible, allowing you to combine setup and tear down, share them, activate/deactivate one on a whim...
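To give just one example of that flexibility, and it's a feature we won't need in the rest of this article: a fixture declared with autouse=True runs around every test automatically, without any test having to request it:

import pytest


@pytest.fixture(autouse=True)
def always_run():
    # Runs before every test in the file, even if no test asks for it
    print("\nSetting things up")
    yield
    # Runs after every test in the file
    print("\nCleaning things up")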
In fact, you can even use a fixture... in a fixture. This is allowed:
@pytest.fixture()
def random_number():
    yolo = random.randint(0, 10)
    yield yolo
    print(f"\nWe tested with {yolo}")


# if setup_and_tear_down runs, it runs random_number
@pytest.fixture()
def setup_and_tear_down(random_number):
    print("\nThis is run before each test")
    yield random_number + 1
    print("\nThis is run after each test")
This fixture system is better than the old unittest way of doing setup and tear down because:
You have only one place where you define all of it.
You can choose which tests use the setup and tear down, instead of forcing it on all the tests.
You can reuse your fixtures across all your tests, not just in one class (see the conftest.py sketch after this list).
You can mix several fixtures in the same test by using several parameters.
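Speaking of reuse: to share fixtures between several test files, you can put them in a file named conftest.py next to your tests. pytest picks it up automatically, and any test in that directory (or below) can request its fixtures by name, without importing anything. Here is a minimal sketch, with a made-up fixture name:

# conftest.py, sitting next to your test files
import pytest


@pytest.fixture()
def fake_database():
    # Made-up setup: build some state before each test that asks for it
    database = {"users": ["alice", "bob"]}
    yield database
    # Made-up tear down: clean it up afterwards
    database.clear()

Any test function in the same directory can then simply declare a fake_database parameter to receive that dictionary.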
But you also noticed we had to pass a few flags to make pytest behave the way we want. That's an important point with this tool: it's powerful, yet it comes with quite a few knobs.
In the next article, we'll go deeper into pytest's configuration, how to organize your project tree, deal with imports, use pytest plugins, and so on.
Once you are fluent in pytest, we will use it to explore real-life testing use cases, in a few weeks.