@thinktainer
Created December 16, 2013 14:53
CarolR

We conceived the idea for this hack while reading the list of hardware that was going to be available for the hack day. Among the items was a Raspberry Pi that could very easily be turned into an iBeacon. An appealing thought was that we could combine people coming into range of each other with the festive setting to create a meaningful context. Playing festive tunes in a polyphonic orchestra of users' devices seemed like a perfect fit. We set out by looking at implementations of communication protocols and quickly discovered that Bluetooth LE was not going to fit the bill, as we would have had to:

  • Implement our own communication stack
  • Deploy native apps, as the Bluetooth stack is not exposed via the browser APIs

So with that in mind we focused on our key skills in distributed web applications, applying them to the problem at hand. Pushing the musical notes out via the MIDI protocol was appealing because it had desirable properties:

  • It makes for an interactive experience
  • The required bandwidth is acceptably low
  • There is an abundance of music with permissive rights available
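On the bandwidth point: MIDI encodes each note as a small integer rather than as audio, so the payload per event is tiny. As a quick illustration (not from the original hack), the standard equal-temperament formula converts a MIDI note number into a pitch in hertz:

```javascript
// Convert a MIDI note number (0-127) to its frequency in Hz,
// using the equal-temperament formula with A4 = 440 Hz at note 69.
function midiNoteToFrequency(note) {
	return 440 * Math.pow(2, (note - 69) / 12);
}

console.log(midiNoteToFrequency(69)); // 440 (A4)
console.log(midiNoteToFrequency(60)); // ~261.63 (middle C)
```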

Next up we had to decide on the communication mechanism, and since we wanted to leverage the browser as much as possible, we thought WebSockets would be a really nice way of pushing the notes out to the clients. We looked at applying our core skills again by using Microsoft's SignalR stack, but that seemed like quite a heavy requirement for something that was only going to push out a single type of event (MIDI data), so in the spirit of a 'hack' day we decided to go with something off our beaten track. After a bit of research we settled on Node.js. We have an awesome JavaScript ninja working at pebble{code}, and he pointed us to the excellent socket.io node package.

Without further ado we both went ahead and installed the necessary requirements on our machines, Dan using his trusty Windows 8.1 environment and myself, feeling particularly hacky that day, going with OS X Mavericks. I quickly spun up a Debian VM so we could share the app with the rest of the office.

With the morning gone and a "brainfood platter" later, we had about four hours left until the demo. We found MIDI.js, an awesome JavaScript library that does a lot more than just play MIDI, and it was going to be a huge time saver in developing our MIDI network protocol. In true 'hack day' manner we modified the source code of the library to create a new plug-in that replaced hardware MIDI playback with forwarding the notes on to our WebSockets implementation. The code for publishing these events is just a few lines:

io.sockets.on('connection', function (socket) {
	// Re-broadcast every MIDI event from the conductor to all connected clients
	socket.on('play note', function (data) {
		io.sockets.emit('play note', data);
	});
	socket.on('programChange', function (data) {
		io.sockets.emit('programChange', data);
	});
	socket.on('setVolume', function (data) {
		io.sockets.emit('setVolume', data);
	});
	socket.on('noteOn', function (data) {
		io.sockets.emit('noteOn', data);
	});
	// etc. for the remaining event types
});

The rest was just a matter of taking the network events on the 'musician page' and playing them back on the devices:

var socket = io.connect('/');
socket.on('news', function (data) {
	// Only react to events addressed to this device's channel
	if (data.channel === currentChannel) {
		console.log(data);
		socket.emit('my other event', { my: 'data' });
	}
});

function connectToSocket() {
	// Play back each note event as it arrives over the socket
	socket.on('play note', function (data) {
		console.log(data);
		var note = decode(data);
		playSingle(note);
	});

	// etc. for the remaining event types
}
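The decode helper above unpacks whatever wire format the conductor side emits. The original encoding isn't shown here, but a hypothetical compact scheme (the bit layout below is our own assumption, not the hack's actual format) could pack channel, note number, and velocity into a single integer to keep the payload small:

```javascript
// Hypothetical packing: 4 bits of channel, 7 bits of note, 7 bits of velocity.
// Illustrative assumption only; the hack's actual wire format is not shown.
function encode(channel, note, velocity) {
	return (channel << 14) | (note << 7) | velocity;
}

function decode(packed) {
	return {
		channel: (packed >> 14) & 0x0f,
		note: (packed >> 7) & 0x7f,
		velocity: packed & 0x7f
	};
}

var msg = encode(3, 60, 96);  // middle C on channel 3, velocity 96
console.log(decode(msg));     // { channel: 3, note: 60, velocity: 96 }
```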

After this, to make things more interesting and to highlight the fact that the devices were not all starting playback of the same track at the same time, we decided to split the music into individual parts, as defined in the MIDI files. You can think of it as one device playing the bass line, another playing the melody, and so on. It turned out this was really easy to do, as all we needed was to filter the playback of the MIDI notes in the clients by channel. We added a text field and a button to change the channel on the device, and a little Santa gif to give it the famous 'graphics once-over'. This took us to a little after 5, with the demo a short while away, so we started asking people to connect their phones as instruments. A small letdown was that iPhones seem to be a bit lacking in their support for playing back the MIDI music, so the demo ended up being more of an orchestra of laptops and Android phones. This was less of a problem than it sounds, as pebble is actually awesome and every dev gets a laptop to work on, so there was no shortage of those. In hindsight it was a blessing in disguise: running on laptops, the rendition of a grand piano only sounded half as horrible as it did through mobile phone speakers.
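The channel split described above boils down to a single predicate evaluated against each incoming note event. A minimal sketch, assuming the event carries a channel field as in the relay code earlier:

```javascript
// Each device only plays note events whose MIDI channel matches its own setting.
function shouldPlay(noteEvent, currentChannel) {
	return noteEvent.channel === currentChannel;
}

var deviceChannel = 2; // e.g. this device has been assigned the bass line
console.log(shouldPlay({ channel: 2, note: 36 }, deviceChannel)); // true
console.log(shouldPlay({ channel: 1, note: 72 }, deviceChannel)); // false
```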

We are both really impressed with the maturity and ease of use of the whole JavaScript stack, particularly the Node package manager. We basically got from inception to a deployed audio web app in the course of a working day. Node.js is an awesome technology, and I highly recommend that anyone who hasn't done so yet go and have a play with it.

It was an awesome hack day, and we were quite proud to come away in second place with our little implementation of the distributed device orchestra.
