(This gist is pretty old; I've written up my current approach to the Pyramid integration on this blog post, but that blog post doesn't go into the transactional management, so you may still find this useful.)
Fast and flexible unit tests with live Postgres databases and fixtures
I've created a Pyramid scaffold which integrates Alembic, a migration tool, with the standard SQLAlchemy scaffold. (It also configures the Mako template system, because I prefer Mako.)
I am also using PostgreSQL for my database. PostgreSQL supports nested transactions. This means I can set up the tables at the beginning of the test session, then start a transaction before each test and roll it back after the test; in turn, this means my tests operate in the same environment I expect to use in production, yet remain fast.
I based my approach on sontek's blog post, but I've refined it a bit.
I use py.test for unit tests, but to the best of my knowledge a similar approach will work with nose: you create a nose plugin to load the ini file from a command-line option, with begin and finalize methods to do the table setup and teardown; likewise, you set up the test-wrapping transaction and tear it down in setup/teardown methods on your BaseTest class. I chose py.test because I like certain conveniences it provides, but I've tried most of this setup with nose as well.
Since we will be executing the tests against a live database, we need a connection URL with which to configure SQLAlchemy. Since tests often involve other aspects of application configuration, I've found it most convenient to copy the production.ini file to test.ini and point it at my test database.
The first part of my py.test configuration is a plugin which adds a command-line option to load the ini file. Plugins in py.test implement hook functions, and while they can be part of packages, configured with setuptools entry points, py.test also looks for files named conftest.py in the test directories. So I went with that for simplicity while figuring things out.
```python
def pytest_addoption(parser):
    parser.addoption("--ini", action="store", metavar="INI_FILE",
                     help="use INI_FILE to configure SQLAlchemy")
```
This adds the ini file option; I then added an entry to setup.cfg so it would automatically be used.
```ini
[pytest]
addopts = --ini=test.ini
```
Next, a handful of py.test fixtures. Py.test uses 'fixture' more broadly than I was used to; anything you might put in a setup/teardown method can be a py.test fixture.
```python
@pytest.fixture(scope='session')
def appsettings(request):
    config_uri = os.path.abspath(request.config.option.ini)
    setup_logging(config_uri)
    return get_appsettings(config_uri)
```
Session scope means that this will be a singleton for the life of the test session. It gets the ini file passed in on the command line, sets up logging, and parses the application settings, using functions from pyramid.paster.
```python
@pytest.fixture(scope='session')
def sqlengine(request, appsettings):
    engine = engine_from_config(appsettings, 'sqlalchemy.')
    DBSession.configure(bind=engine)
    Base.metadata.create_all(engine)

    def teardown():
        Base.metadata.drop_all(engine)

    request.addfinalizer(teardown)
    return engine
```
Then I create the tables, and make arrangements for them to be torn down at the end of the test session. Because the function takes an argument named appsettings, py.test injects the result of the appsettings fixture.
```python
@pytest.fixture()
def dbtransaction(request, sqlengine):
    connection = sqlengine.connect()
    transaction = connection.begin()
    DBSession.configure(bind=connection)

    def teardown():
        transaction.rollback()
        connection.close()
        DBSession.remove()

    request.addfinalizer(teardown)
    return connection
```
The last fixture in conftest.py does not have session scope; it will be executed every time the fixture is required. This creates a transaction using the raw SQLAlchemy connection, outside the scope of the ORM's DBSession (where the transaction manager handles transactions). Since the DBSession doesn't know about this transaction, it can happily create, commit, or roll back transactions without this wrapper transaction ever being committed. At the end of the test, we roll back this transaction and close the connection.
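The reason inner commits can't leak rows out of the wrapper is the same reason savepoints work: finishing an inner unit of work still leaves the outer transaction open. A stdlib sqlite3 sketch of that behavior (PostgreSQL savepoints behave analogously; none of this is the conftest.py code above):

```python
import sqlite3

conn = sqlite3.connect(":memory:", isolation_level=None)  # manual transaction control
conn.execute("CREATE TABLE book (title TEXT)")
conn.execute("BEGIN")                      # wrapper transaction (the fixture's job)
conn.execute("SAVEPOINT sp1")              # stands in for the ORM's own transaction
conn.execute("INSERT INTO book VALUES ('Dune')")
conn.execute("RELEASE SAVEPOINT sp1")      # an inner 'commit'
# the inner commit succeeded, but the wrapper transaction is still open...
conn.execute("ROLLBACK")                   # ...so teardown discards everything
print(conn.execute("SELECT count(*) FROM book").fetchone()[0])  # 0
```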
The next piece of the puzzle is a BaseTest class. py.test will happily work with raw test functions, but nearly every unit test I write will need the wrapper transaction to avoid putting data in the database, so I find it convenient to have a BaseTest class which requires that fixture. It also executes the transaction.abort() function on teardown; this is the transaction manager module's abort() function, not a method called on the wrapper transaction.
```python
@pytest.mark.usefixtures("dbtransaction")
class BaseTest(object):

    def setup_method(self, method):
        self.config = testing.setUp()

    def teardown_method(self, method):
        transaction.abort()
        testing.tearDown()
```
So far, so good! Now we have speedy tests, even if they make SQL changes.
However, often you want to test code which expects records to already be in the database. You could create rows and insert them in each test, but that gets tedious. Instead, there's the notion of data fixtures. In a previous project, I had a simple homegrown way of handling that, which was getting less simple with every update. I went looking for a better approach and found Kumar McMillan's fixture library. It hasn't had a release since 2010, when SQLAlchemy 0.5 was new, but it seemed to be a pretty solid library.
EDIT: I've since discovered the factory_boy library which is actually maintained. I will look into revising this post to use factory_boy.
The fixture library lets you define DB fixtures using just a few classes which get turned into SQLAlchemy ORM instances on demand. Here's an example from the website.
```python
class AuthorData(DataSet):
    class frank_herbert:
        first_name = "Frank"
        last_name = "Herbert"

    class brian_herbert(frank_herbert):
        first_name = "Brian"


class BookData(DataSet):
    class dune:
        title = "Dune"
        author = AuthorData.frank_herbert

    class sudanna:
        title = "Sudanna Sudanna"
        author = AuthorData.brian_herbert
```
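A DataSet is just nested classes with plain attributes, inheritance included (that's how brian_herbert picks up his last name). A minimal, hypothetical sketch of how such declarations can be read back with stdlib introspection; this is not the fixture library's actual implementation:

```python
class AuthorData:  # plain class here; the real library subclasses DataSet
    class frank_herbert:
        first_name = "Frank"
        last_name = "Herbert"

    class brian_herbert(frank_herbert):
        first_name = "Brian"


def rows(dataset):
    """Yield (row_name, columns) for each inner class of a DataSet-style class."""
    for name, value in vars(dataset).items():
        if isinstance(value, type):
            cols = {}
            # walk the MRO base-first so inherited columns appear,
            # with subclasses overriding their parents
            for klass in reversed(value.__mro__):
                cols.update((k, v) for k, v in vars(klass).items()
                            if not k.startswith("_"))
            yield name, cols


data = dict(rows(AuthorData))
print(data["brian_herbert"])  # {'first_name': 'Brian', 'last_name': 'Herbert'}
```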
However, fixture's SQLAlchemy integration assumes it can be permitted to manage sessions on its own and freely commit transactions. This doesn't work with the wrapper-transaction system, so I wrote a different integration which uses the DBSession provided by the SQLAlchemy scaffold.
```python
from fixture.loadable import EnvLoadableFixture
from fixture.style import NamedDataStyle


class LoadOnlyFixture(EnvLoadableFixture):
    def __init__(self, session=None, **kw):
        EnvLoadableFixture.__init__(self, **kw)
        self.session = session

    def commit(self):
        pass

    def rollback(self):
        pass

    class Medium(EnvLoadableFixture.Medium):
        def visit_loader(self, loader):
            self.session = loader.session

        def save(self, row, column_vals):
            obj = self.medium()
            for c, val in column_vals:
                setattr(obj, c, val)
            self.session.add(obj)
            return obj

        def clear(self, obj):
            pass


dbfixture = LoadOnlyFixture(
    env=models,
    style=NamedDataStyle(),
    session=models.DBSession,
)
```
This integration assumes that every fixture can be constructed by instantiating the ORM class with no arguments, setting each attribute specified in the fixture, and adding the object to the specified session. It could just as easily pass all the attributes as keyword arguments instead. The NamedDataStyle maps a fixture named AuthorData to an ORM class named Author; there are other mapping styles, and you can write your own.
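The name mapping itself is trivial; roughly this (a hypothetical helper for illustration, not the library's code):

```python
def dataset_to_model_name(dataset_name, suffix="Data"):
    """Roughly what a NamedDataStyle-like convention does: 'AuthorData' -> 'Author'."""
    if not dataset_name.endswith(suffix):
        raise ValueError("expected a name ending in %r" % (suffix,))
    return dataset_name[:-len(suffix)]


print(dataset_to_model_name("AuthorData"))  # Author
print(dataset_to_model_name("BookData"))    # Book
```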
In order to use these fixtures, you need only call dbfixture.data(BookData); the fixture system will take care of instantiating these objects and all their dependencies.

Well, that's not quite true. dbfixture.data() returns a FixtureData object, which controls the lifecycle of the fixtures. You need to call .setup() on this object before you use the fixtures and call .teardown() afterward. Even though my LoadOnlyFixture doesn't actually clear the fixtures from the database (they will be cleared anyway when the wrapper transaction rolls back), you still have to call .teardown() so the fixture library knows what is and isn't loaded.
You can also use the FixtureData object as a context manager for the with statement, but I find that most or all of my test method would have to be inside the with statement, and that's just a lot of indentation. So instead I added support for the fixtures to BaseTest.
```python
@pytest.mark.usefixtures("dbtransaction")
class BaseTest(object):

    def setup_method(self, method):
        self._fixture_state = None
        self.config = testing.setUp()

    def add_fixtures(self, *datasets):
        self._fixture_state = dbfixture.data(*datasets)
        self._fixture_state.setup()
        models.DBSession.flush()

    def teardown_method(self, method):
        transaction.abort()
        testing.tearDown()
        if self._fixture_state is not None:
            self._fixture_state.teardown()
            self._fixture_state = None
```
This way I can simply call self.add_fixtures(BookData) from my test method and everything works perfectly. I added a similar add_patcher method to handle the mock library's patches; see the files in this gist for the full source.