Full Disclosure: I'm a member of the AVA team
I should start by saying there are lots of reasons to choose AVA, and I don't think speed is (necessarily) the most important one. Other good reasons include:
- Babel is built in. You get to write your tests in ES2015 right out of the box, with no config.
- An API designed around ES2015 async goodness. With first-class support for Promises, Generators, Observables, and the new `async function`/`await` syntax, AVA offers many ways to test async APIs succinctly and in a manner that is easy to understand.
- power-assert is just awesome, and AVA comes with support for it built in.
For async tests that perform blocking operations (i.e. network/disk/database access), AVA is almost certainly going to outperform other test runners. Let's consider a test suite with the following characteristics:
- There are 100 tests.
- Each test needs to perform a single disk operation.
- Each disk operation takes 10 milliseconds.
Now let's simulate those conditions in Mocha and AVA:
```javascript
// mocha.js
var delay = require('delay');

for (var i = 0; i < 100; i++) {
  it('test' + i, function () {
    return delay(10);
  });
}
```
```javascript
// ava.js
import test from 'ava';
import delay from 'delay';

for (var i = 0; i < 100; i++) {
  test('test' + i, () => {
    return delay(10);
  });
}
```
The results:
```
ava --verbose ava.js   0.54s user 0.07s system 106% cpu  0.564 total
mocha mocha.js         0.25s user 0.05s system  21% cpu  1.394 total
```
Increasing to 1,000 tests:
```
ava --verbose ava.js   0.82s user 0.10s system 110% cpu  0.829 total
mocha mocha.js         0.87s user 0.23s system   8% cpu 12.280 total
```
The simulation above greatly oversimplifies the real world. Hammering the disk with 100 or 1,000 simultaneous requests would almost certainly increase read times, so AVA's performance advantage is exaggerated here. That said, if any of your tests perform slow async operations, AVA can potentially make them much faster.
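To see where that wall-clock gap comes from, here is a minimal stand-alone sketch (plain Node, no test runner involved; the `delay` helper is just a stand-in for a 10 ms disk operation). Running 100 ten-millisecond waits one at a time takes roughly a second, while starting them all at once takes roughly as long as the single slowest wait:

```javascript
// Minimal sketch (not AVA or Mocha internals) of why concurrent scheduling
// wins when each test spends its time waiting on I/O.
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// 100 "tests" that each wait 10 ms, like the simulated disk operation above.
const tests = Array.from({ length: 100 }, () => () => delay(10));

async function runSerially(fns) {
  const start = Date.now();
  for (const fn of fns) await fn(); // one at a time, Mocha-style
  return Date.now() - start;
}

async function runConcurrently(fns) {
  const start = Date.now();
  await Promise.all(fns.map((fn) => fn())); // all at once, AVA-style
  return Date.now() - start;
}

const results = (async () => ({
  serial: await runSerially(tests),
  concurrent: await runConcurrently(tests),
}))();

results.then(({ serial, concurrent }) => {
  console.log(`serial: ~${serial} ms, concurrent: ~${concurrent} ms`);
});
```

The serial run is bounded by the *sum* of the waits, the concurrent run by the *longest* wait, which is the whole story behind the numbers above.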
In addition to running tests concurrently within a given test file, AVA also forks a new process for each test file and runs those processes concurrently. This can also offer significant performance improvements, though realizing the full advantage does require that you are strategic about splitting your tests across multiple files. I recently wrote up this comparison of synchronous test performance for the rest of the AVA team. The highlights:
- AVA (power-assert included): `ava   4.83s user 0.54s system 396% cpu 1.353 total`
- Mocha + power-assert: `mocha  2.52s user 0.33s system  99% cpu 2.878 total`
- Mocha: `mocha  1.14s user 0.24s system  92% cpu 1.484 total`
AVA has run my tests as fast as or faster than Mocha for nearly every test suite I have converted, with two exceptions:
- Really small test suites.
AVA has a much larger dependency graph than most other test runners. Consequently, it takes a little longer to start up. If your test suite consists of just a dozen tests in a single file, it's unlikely AVA will be faster. Even so, as stated at the top of this post, I think there are plenty of other good reasons to choose AVA.
- Watch mode.
AVA currently lacks first-class support for file change watching. Some users have experimented with nodemon, but the performance isn't great. The good news is that we plan to integrate first-class file watching support soon. Done correctly, this should also help to mitigate the cost of AVA's startup time.
With a few exceptions, AVA is generally able to speed up your tests. Where it doesn't, the extra goodies like power-assert and the ability to use ES2015 syntax in your tests without any config still make it a compelling option.
The AVA team is committed to making AVA screaming fast. If switching to AVA doesn't provide a significant performance boost, please file an issue so we can make AVA faster for everyone.
Update: it looks like AVA now offers a built-in Watch Mode.