import pytest
from contextlib import contextmanager
from functools import wraps

# TODO: Make it possible to pass standard pytest.fixture args here too, like scope, params etc.
def pytest_contextfixture(fn):
    ctxmgr = contextmanager(fn)

    @pytest.fixture
    @wraps(fn)
    def actual_fixture(request):
        ctxinst = ctxmgr()
        # TODO: Proper exception propagation?
        request.addfinalizer(lambda: ctxinst.__exit__(None, None, None))
        return ctxinst.__enter__()
    return actual_fixture

@pytest_contextfixture
def my_fixture():
    # Test setup, possibly with nested with-blocks, can be performed here.
    yield 1234
    # Teardown can be performed here.

def test_something(my_fixture):
    assert my_fixture == 1234
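The wiring that pytest_contextfixture relies on can be exercised without a pytest run. This is a minimal sketch, using a hypothetical FakeRequest stand-in for pytest's request object (not a real pytest class):

```python
from contextlib import contextmanager

# Hypothetical stand-in for pytest's `request` object, with just enough
# behaviour to exercise the setup/teardown wiring shown above.
class FakeRequest:
    def __init__(self):
        self.finalizers = []

    def addfinalizer(self, fn):
        self.finalizers.append(fn)

    def teardown(self):
        # pytest runs finalizers in reverse (LIFO) order.
        for fn in reversed(self.finalizers):
            fn()

events = []

@contextmanager
def my_fixture():
    events.append('setup')
    yield 1234
    events.append('teardown')

request = FakeRequest()
ctxinst = my_fixture()
request.addfinalizer(lambda: ctxinst.__exit__(None, None, None))
value = ctxinst.__enter__()       # setup runs, fixture value is produced
assert value == 1234 and events == ['setup']

request.teardown()                # finalizer resumes the generator past the yield
assert events == ['setup', 'teardown']
```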
@pelme Not sure about the nested context managers - wouldn't their teardown execute too early? In any case, you should specify the dependency fixture in the signature, otherwise we have no way to sort/group tests by fixture and scope.
@ronny I wouldn't like to introduce a second, parallel way to write fixture functions in the core. It doesn't really add much except a different syntax.
@RonnyPfannschmidt Yeah, it would be nicer, but then the order of @contextmanager and @pytest.fixture would be important. I think it could support both variants... i.e. support decorating "real" context managers too, like your example and also something like
@pytest.fixture
class Foo(object):
    def __enter__(self): pass
    def __exit__(self, *a, **kw): pass
But it could also support decorating a generator; in that case it would be wrapped with contextmanager().
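A sketch of how a single decorator could accept both forms, dispatching on whether it was given a generator function or a "real" context-manager class (the helper name as_contextmanager is hypothetical, not part of pytest):

```python
import inspect
from contextlib import contextmanager

def as_contextmanager(obj):
    """Hypothetical helper: return a context-manager factory for either a
    generator function or a class implementing __enter__/__exit__."""
    if inspect.isgeneratorfunction(obj):
        # Generator function: wrap it so calling it yields a context manager.
        return contextmanager(obj)
    if hasattr(obj, '__enter__') and hasattr(obj, '__exit__'):
        # A "real" context-manager class: calling it produces an instance.
        return obj
    raise TypeError('expected a generator function or context-manager class')

# Generator-style fixture body:
def gen_fixture():
    yield 42

# Class-style fixture body:
class ClassFixture:
    def __enter__(self):
        return 'hello'
    def __exit__(self, *exc):
        return False

with as_contextmanager(gen_fixture)() as value:
    assert value == 42
with as_contextmanager(ClassFixture)() as value:
    assert value == 'hello'
```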
@hpk42 No, the teardown/exit executes in the right order. I have packaged this gist up at https://github.com/pelme/pytest-contextfixture with tests that prove this too!
Yes, using context managers like this breaks fixture ordering, but in my use case I am not really concerned about that.
To give a more concrete example: The context managers I am using are mostly mock patchers, where I only use function scope.
i.e.
@pytest.fixture
def some_object_to_test(request):
    an_object_to_test = Something()
    with mock.patch('some_module', 'foo'):
        with mock.patch.object(Something, 'bar', 'a faked value'):
            yield an_object_to_test

def test_a(some_object_to_test):
    pass

def test_b(some_object_to_test):
    pass

def test_c(some_object_to_test):
    pass
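On the teardown-order question raised above: the nested with-blocks stay open for the whole time the generator is suspended at the yield, and unwind in LIFO order only when __exit__ is called. A minimal sketch outside pytest:

```python
from contextlib import contextmanager

events = []

@contextmanager
def tracked(name):
    events.append('enter ' + name)
    yield name
    events.append('exit ' + name)

@contextmanager
def fixture_with_nesting():
    # Nested with-blocks: both managers stay open while the test runs.
    with tracked('outer'):
        with tracked('inner'):
            yield 'value'

ctx = fixture_with_nesting()
ctx.__enter__()                    # setup: both managers entered
assert events == ['enter outer', 'enter inner']

events.append('test body runs')    # the nested managers are still active here

ctx.__exit__(None, None, None)     # teardown: inner exits before outer
assert events == ['enter outer', 'enter inner',
                  'test body runs', 'exit inner', 'exit outer']
```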
Today, I am doing something like this (which indeed works, but it feels a bit awkward):
@contextlib.contextmanager
def setup_some_object_to_test():
    an_object_to_test = Something()
    with mock.patch('some_module', 'foo'):
        with mock.patch.object(Something, 'bar', 'a faked value'):
            yield an_object_to_test

def test_a():
    with setup_some_object_to_test() as some_object_to_test:
        pass

def test_b():
    with setup_some_object_to_test() as some_object_to_test:
        pass

def test_c():
    with setup_some_object_to_test() as some_object_to_test:
        pass
Those mocks are not really suitable for reuse as standalone fixtures, and most of them are defined in the local test modules themselves.
More general fixtures should be written as modular fixtures in their own right.
There might be a better solution altogether, but I hope this shows my reasoning. :)
Can we just have normal fixtures support the context-manager protocol?
Then it would become