Thoughts on using mocking, spying libraries like sinon for testing
While using sinon extensively, I realized that in most cases it's better to write assertions and define behaviors right inside a fake function:

```js
it('Should list dir contents', function (done) {
  fs.readdir('dir', function (err, list) {
    should.not.exist(err)
    list.should.eql(['a', 'b', 'c'])
    done()
  })
})
```

The same test written in the sinon-suggested style would look like this:

```js
it('Should list dir contents', function (done) {
  var cb = sinon.spy(function () {
    cb.should.be.calledOnce
    should.not.exist(cb.firstCall.args[0])
    cb.firstCall.args[1].should.eql(['a', 'b', 'c'])
    done()
  })
  fs.readdir('dir', cb)
})
```

(Note that fs.readdir is asynchronous, so the spy's assertions have to wait for the callback to fire; asserting right after the fs.readdir call would run before the spy was ever invoked.)

Pretty ugly, isn't it? Yes, we can define a reusable assertion like cb.should.have.value(['a', 'b', 'c']), but there will always be a case where yet another funky assertion is missing, and either way it's less clear.

Let's consider another example:

```js
it('.waterfall()', function (done) {
  async.waterfall([
    function (cb) {
      cb(null, 'one', 'two')
    },
    function (arg1, arg2, cb) {
      arg1.should.equal('one')
      arg2.should.equal('two')
      cb(null, 'three')
    },
    function (arg1, cb) {
      arg1.should.equal('three')
      cb()
    }
  ], done)
})
```

It's pretty nice and clear, but the problem is that we can't be sure all of our assertions actually ran. We can fix this quickly:

```js
it('.waterfall()', function (done) {
  var calls = ''
  async.waterfall([
    function (cb) {
      calls += 'first;'
      cb(null, 'one', 'two')
    },
    function (arg1, arg2, cb) {
      calls += 'second;'
      arg1.should.equal('one')
      arg2.should.equal('two')
      cb(null, 'three')
    },
    function (arg1, cb) {
      calls += 'third;'
      arg1.should.equal('three')
      cb()
    }
  ], function () {
    calls.should.equal('first;second;third;')
    done()
  })
})
```

Now we know exactly what was executed, when, and how many times, and we get a nice error message if something goes wrong. Depending on the task at hand, we can use more elaborate logging, write custom assertions for our tests, and so on. Isn't that better?
