Koa can actually be quite fast.
In our scenario, at the end of our controller we want to call ctx.json(object, status): a helper that sends out the stringified object with the given status. A naive implementation, using Babel or async-to-gen, looks as follows:
const Koa = require('koa')
const app = new Koa()

// attach the ctx.json helper, then hand control to the next middleware
app.use(async function (ctx, next) {
  ctx.json = function (obj, status = 200) {
    ctx.status = status
    ctx.body = obj
  }
  await next()
})

// the controller, the last handler
app.use(function (ctx) {
  ctx.json({ok: 1})
})

app.listen(3000)
In order to make it run, we need to have babel-cli installed.
$ npm install -g babel-cli
We also need to install a plugin to transform the async-await syntax into syntax Node supports.
The options are:
- To generator (e.g. babel-plugin-transform-async-to-generator)
- To Promise (e.g. fast-async)
For the to-generator route, there is also async-to-gen, which transforms async-await into sensible generator-based functions.
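Roughly, here is what each route does to an async middleware; this is a simplified sketch of the generated code, not the plugins' exact output:

// Option 1, to generator (simplified): the body becomes a generator,
// and a runner (Babel injects a helper like _asyncToGenerator) steps
// through it, awaiting each yielded promise.
function mw (ctx, next) {
  return _asyncToGenerator(function* () {
    ctx.json = function (obj, status = 200) {
      ctx.status = status
      ctx.body = obj
    }
    yield next()
  })()
}

// Option 2, to promise (simplified): the body is flattened into a
// plain promise chain, with no generators involved.
function mw (ctx, next) {
  return Promise.resolve().then(function () {
    ctx.json = function (obj, status = 200) {
      ctx.status = status
      ctx.body = obj
    }
    return next()
  })
}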
Let's take option number 1.
After npm init-ing your project,
$ npm install babel-plugin-transform-async-to-generator --save-dev
$ touch .babelrc
Put this inside .babelrc:
{
  "plugins": ["transform-async-to-generator"]
}
Then we can run our app, index.js, with:
$ babel-node .
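Before benchmarking, a quick sanity check from another terminal; the server should reply with our stringified object:

$ curl http://127.0.0.1:3000
{"ok":1}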
To benchmark our server, we use wrk, with two threads and ten connections for five seconds:
$ ./wrk -t2 -c10 -d5s http://127.0.0.1:3000
On my machine, this naive implementation gives:
Running 5s test @ http://127.0.0.1:3000
  2 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.37ms    4.06ms  63.73ms   95.69%
    Req/Sec     7.10k     1.43k    8.32k    88.12%
  71343 requests in 5.10s, 10.48MB read
Requests/sec:  13989.78
Transfer/sec:      2.05MB
Not bad huh!? How can we improve this?
We heard about the fast-async Babel plugin, which transforms async-await syntax into promises.
Let's try it out:
$ npm install fast-async --save
And put the following lines in the .babelrc file:
{
  "plugins": [
    ["fast-async", {
      "spec": true
    }]
  ]
}
Run babel-node . again and see the benchmark:
Running 5s test @ http://127.0.0.1:3000
  2 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.14ms    1.49ms  16.50ms   92.46%
    Req/Sec     6.13k   646.01     9.75k    84.16%
  61584 requests in 5.10s, 9.04MB read
Requests/sec:  12074.93
Transfer/sec:      1.77MB
Whoa! Surprise: it is slower than the generator-based one. But since this is promise-based, it lets us get help from bluebird.
Let's install bluebird.
$ npm install bluebird --save
And add the following line at the very top of index.js:
const Promise = require('bluebird')
const Koa = require('koa')
...
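Note that const Promise = require('bluebird') only shadows Promise inside that one file. If you want the transformed code everywhere in the process to pick up bluebird (an assumption worth verifying for your setup), you can replace the global instead:

// replace the global Promise so all subsequently created promises,
// including those in transform-generated code, come from bluebird
global.Promise = require('bluebird')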
Let's run our benchmark again; the result is as follows:
Running 5s test @ http://127.0.0.1:3000
  2 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.13ms    1.64ms  15.49ms   91.13%
    Req/Sec     7.03k     1.08k    8.32k    65.00%
  69957 requests in 5.00s, 10.27MB read
Requests/sec:  13988.29
Transfer/sec:      2.05MB
It seems fast-async lies to us.
In my experience, async-to-gen is better. Let's install it; this module ships with the global executable async-node.
$ npm install async-to-gen --global
And we can quickly run index.js:
$ async-node index.js
Let's examine the benchmark result again:
Running 5s test @ http://127.0.0.1:3000
  2 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.06ms    1.29ms   9.96ms   92.19%
    Req/Sec     6.44k   753.18     7.23k    80.00%
  64066 requests in 5.00s, 9.41MB read
Requests/sec:  12811.90
Transfer/sec:      1.88MB
So far, the Babel one seems to be the best. But how about doing it by hand? We can actually build a promise-based middleware ourselves when assigning that ctx.json function. Let's do it.
const Promise = require('bluebird')
const Koa = require('koa')
const app = new Koa()

// hand-rolled promise chain instead of a transpiled async function
app.use(function (ctx, next) {
  return Promise.resolve()
    .then(function () {
      ctx.json = function (obj, status = 200) {
        ctx.status = status
        ctx.body = obj
      }
      return next()
    })
})

app.use(function (ctx) {
  ctx.json({ok: 1})
})

app.listen(3000)
The benchmark shows some improvement, presumably because we skip the generator machinery the transforms insert!
Running 5s test @ http://127.0.0.1:3000
  2 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     0.91ms    1.32ms  18.65ms   91.73%
    Req/Sec     8.28k     1.61k   15.62k    91.09%
  83269 requests in 5.10s, 12.23MB read
Requests/sec:  16327.14
Transfer/sec:      2.40MB
So, what is left? I heard that we can trick the libuv event loop by wrapping res.end inside setImmediate. How do we do that with koa? Well, we can tell koa to skip its respond helper by setting ctx.respond = false. The following is the modified index.js:
const Promise = require('bluebird')
const Koa = require('koa')
const app = new Koa()

// write the response directly on the raw res, deferred via setImmediate
function wrapEnd (res, status = 200, obj = {}) {
  res.writeHead(status, {'Content-Type': 'application/json'})
  res.end(JSON.stringify(obj))
}

app.use(function (ctx, next) {
  // tell koa not to handle the response itself
  ctx.respond = false
  return Promise.resolve()
    .then(function () {
      ctx.json = function (obj, status = 200) {
        setImmediate(wrapEnd, ctx.res, status, obj)
      }
      return next()
    })
})

app.use(function (ctx) {
  ctx.json({ok: 1})
})

app.listen(3000)
And here's the benchmark:
Running 5s test @ http://127.0.0.1:3000
  2 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   825.85us    1.16ms  12.90ms   91.23%
    Req/Sec     9.27k     1.79k   16.99k    87.13%
  93192 requests in 5.10s, 14.04MB read
Requests/sec:  18274.23
Transfer/sec:      2.75MB
Not bad huh?
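One caveat: with ctx.respond = false, Koa no longer finalizes the response, so headers are entirely up to us. A hypothetical tweak worth benchmarking yourself is setting Content-Length explicitly, so Node doesn't fall back to chunked transfer encoding; a sketch:

// hypothetical variant of wrapEnd: stringify once and set
// Content-Length explicitly to avoid chunked transfer encoding
function wrapEnd (res, status = 200, obj = {}) {
  const payload = JSON.stringify(obj)
  res.writeHead(status, {
    'Content-Type': 'application/json',
    'Content-Length': Buffer.byteLength(payload)
  })
  res.end(payload)
}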
The above experiments were done to quickly explore our options for making our HTTP server better. Since Node's HTTP server is implemented in JavaScript, it is slower than, say, the uWS HTTP server; however, that fast HTTP server is not complete yet.
Want to be faster? Write your app in C++ and use proxygen. LOL.
But, well, since our app is mostly IO-bound, the above attempts won't make any difference IF the underlying layers (e.g. cache, database) are slow.