Modeling a game with melody recognition and machine training using ClojureScript and real instruments

Links

The basic premise behind this started here: https://github.com/kitallis/cacophonica. It breaks down into three parts:

  1. Detect the pitch / frequency from the instrument / input.
  2. Build a system that allows representing music as data from the ground up (https://github.com/kitallis/cacophonica/blob/master/src/cacophonica/melodic/src/melodic/core.clj)
  3. Build a system that matches 1 against 2.

Read below for more details.

Philosophy / Abstract

Being great at a first-person shooter game requires a lot of hand-eye coordination and skill with the mouse. Being great at an MMORPG requires a working knowledge of a vast variety of keyboard shortcuts. Being great at a fighting game requires speed and skill with the gamepad.

The underlying philosophy behind this experiment is to showcase that you can take the input from any stringed instrument (with a pickup) lying around your house and use it to design a game that requires as much skill (if not more) as any other game played with any other kind of input device. By treating the instrument as a game controller, and moving the onus of deriving quality from the playing into the system, we can harness some of the instrument's basic qualities – speed and melody – to make our game more interesting and a lot more unique and fun.

The idea is to go as far away from being a Guitar Hero derivative as possible, and closer to a system that treats the instrument as just another input device. The game should not be about playing guitars or any kind of music; it should be about playing any other regular game. Unlike other Musical Instrument Games™, we want ours to work for anyone, anywhere. So we're going to use a simple semi-electric guitar plugged into the microphone line-in and demonstrate this in the browser using the Web Audio API.

We'll go through a few papers and look at implementing some of their algorithms to detect pitch / notes in near real-time, process the input to make sense of it, run it through a few simple constraints and algorithms to recognise what a melody might be like, and eventually explore how we can make the system intelligent enough to train itself on the player and their style, so that we can generate smarter and harder levels for our game.

Outline

Before we dive deep into the implementation, we're going to discuss music in general. The game is rather a secondary goal for this exercise, and the outcome will serve merely as a demonstration of possibility.

Music Theory

We'll cover the basic building blocks of music from the ground up, starting with:

  • Note
  • Tuning
  • Intervals
  • Temperaments
  • Scales
  • Chords

and how these apply to various traditions of music (Western, contemporary, Carnatic, etc.).
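To make the "music as data" idea concrete, here's a minimal sketch of notes, scales, and chords as plain Clojure data. It's illustrative only; the actual representation in melodic/core.clj may differ, and the helper names are made up.

```clojure
;; Illustrative only; the actual format in melodic/core.clj may differ.
(def chromatic [:C :C# :D :D# :E :F :F# :G :G# :A :A# :B])

;; Whole/half-step pattern of the major scale, in semitones.
(def major-steps [2 2 1 2 2 2 1])

(defn- index-of [coll x]
  (first (keep-indexed (fn [i v] (when (= v x) i)) coll)))

(defn scale
  "Notes of a scale built from `root` by walking `steps` semitones at a time."
  [root steps]
  (let [start (index-of chromatic root)]
    (map #(nth chromatic (mod % 12))
         (reductions + start steps))))

;; (scale :C major-steps) ;=> (:C :D :E :F :G :A :B :C)

;; A chord as a root plus interval offsets (in semitones).
(defn chord [root offsets]
  (let [start (index-of chromatic root)]
    (map #(nth chromatic (mod (+ start %) 12)) offsets)))

;; (chord :C [0 4 7]) ;=> (:C :E :G)   ; major triad
```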

Waves

We'll examine what the fundamental frequencies, harmonics, and frequency-ratio characteristics (the twelfth root of 2 in equal temperament, etc.) of these different styles are like.
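As a concrete example of the twelfth-root-of-2 relationship, here's a tiny ClojureScript sketch of equal-temperament note frequencies and harmonics. The A4 = 440 Hz reference is the usual convention, not something specific to this project.

```clojure
;; Equal temperament: each semitone multiplies frequency by 2^(1/12).
;; `n` is the number of semitones above (or below) the A4 = 440 Hz reference.
(defn note-freq [n]
  (* 440 (js/Math.pow 2 (/ n 12))))

;; (note-freq 0)   ;=> 440        ; A4
;; (note-freq 3)   ;=> ~523.25    ; C5
;; (note-freq -12) ;=> 220        ; A3, one octave down

;; The k-th harmonic of a fundamental is just k * f0.
(defn harmonics [f0 k]
  (map #(* f0 %) (range 1 (inc k))))

;; (harmonics 110 4) ;=> (110 220 330 440)   ; open A string on a guitar
```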

Detection

We'll go through these papers and discuss some of the pros and cons of each of these algorithms. We'll explore how the accuracy of each of them varies based on the kind of signals we receive. Polyphonic sound from any real guitar playing contains a lot of frequency fluctuations that are hard to deal with, so robust pitch detection is still more or less an unsolved problem.
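To fix ideas, here's a naive time-domain autocorrelation detector in ClojureScript. It is not one of the papers' algorithms; real detectors (YIN, MPM, etc.) add normalisation, thresholding and interpolation on top of something like this.

```clojure
;; Naive autocorrelation: find the lag whose self-similarity is highest
;; and convert that lag back into a frequency.
(defn autocorrelate [samples lag]
  (let [n (- (.-length samples) lag)]
    (loop [i 0 acc 0.0]
      (if (< i n)
        (recur (inc i) (+ acc (* (aget samples i) (aget samples (+ i lag)))))
        acc))))

(defn detect-pitch
  "Estimate the pitch (Hz) of a Float32Array of `samples` recorded at
   `sample-rate`, searching lags between `min-lag` and `max-lag`."
  [samples sample-rate min-lag max-lag]
  (let [best-lag (apply max-key #(autocorrelate samples %)
                        (range min-lag (inc max-lag)))]
    (/ sample-rate best-lag)))

;; e.g. a 44.1 kHz buffer, searching roughly 80–1000 Hz:
;; (detect-pitch buf 44100 (int (/ 44100 1000)) (int (/ 44100 80)))
```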

ClojureScript Wrapper

I've been working on a fork of the ClojureScript library called Hum. The original is still small and lacks a few APIs that we require, e.g. the AnalyserNode.
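For context, this is roughly the raw Web Audio interop the wrapper needs to cover; the function names below are illustrative, not Hum's actual API. The AnalyserNode is what hands us the time-domain samples we feed into the pitch detector.

```clojure
;; Raw Web Audio interop, roughly what the wrapper needs to expose.
;; `start-capture!` and `on-samples` are illustrative names, not Hum's API.
(defonce ctx (js/AudioContext.))

(defn start-capture!
  "Connect the guitar (mic / line-in) to an AnalyserNode and repeatedly
   call `on-samples` with a Float32Array of time-domain data."
  [on-samples]
  (-> (.. js/navigator -mediaDevices (getUserMedia #js {:audio true}))
      (.then
       (fn [stream]
         (let [source   (.createMediaStreamSource ctx stream)
               analyser (.createAnalyser ctx)
               buf      (js/Float32Array. (.-fftSize analyser))]
           (.connect source analyser)
           (letfn [(tick []
                     (.getFloatTimeDomainData analyser buf)
                     (on-samples buf)
                     (js/requestAnimationFrame tick))]
             (tick)))))))

;; e.g. feed each buffer into the autocorrelation sketch above:
;; (start-capture! #(detect-pitch % (.-sampleRate ctx) 44 551))
```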

Improvements

Even high-end hardware at this point fails to properly detect chords, since the notes of a chord can land directly on the harmonics of the fundamental frequency. We'll talk about some simple solutions that can potentially make our output better, e.g. putting six piezo pickups, one on each string of the guitar, that can send MIDI data.

Game

We'll devise a simple Fleuch-esque game where a space rover can shoot in a 360° arc around itself to destroy incoming droids. The speed and attack radius of this rover are controlled by how melodically the player plays. Here's a simple shot of what this would look like:

[screenshot: gefleucht]
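The mapping from playing quality to game state could be as simple as the hypothetical sketch below; the 0.0–1.0 melody score and the numbers are made up for illustration.

```clojure
;; Hypothetical mapping from a melody score (0.0–1.0, produced by the
;; detection / training pipeline) to the rover's stats; numbers are made up.
(defn rover-stats [melody-score]
  {:speed         (+ 1.0 (* 4.0 melody-score))    ; 1x–5x base speed
   :attack-radius (+ 40  (* 160 melody-score))})  ; in pixels

;; (rover-stats 0.25) ;=> {:speed 2.0, :attack-radius 80}
```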

Training

We can back this up with a persistent connection to a server that takes all this frequency data and buckets our notes, trying to determine what a good melody might be. This is obviously a very non-deterministic problem, and an objectively correct solution is impossible.

We could start by bucketing our melody engine into good, reasonable and bad templates. Each of these buckets would have a set of constraints that we would run our notes through, for example (a rough sketch follows the list):

  • Pre-feed frequency data from popular music for comparisons.
  • Check for a series of notes played within a certain window frame that usually sound good (based on some simple rules of musical scale).
  • Don't let the player beat the game by merely playing the same nice-sounding notes / scales over and over again.
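One way to express these templates is to treat each bucket as a list of predicates over a window of detected notes. This is entirely illustrative and reuses the hypothetical `scale` / `major-steps` helpers from the Music Theory sketch above.

```clojure
;; Each bucket is a list of predicates over a window of detected notes.
(def templates
  [[:good       [#(every? (set (scale :C major-steps)) %)   ; stays in key
                 #(> (count (distinct %)) 3)]]               ; enough variety
   [:reasonable [#(> (count (distinct %)) 1)]]
   [:bad        [(constantly true)]]])                       ; fallback

(defn classify
  "Return the first bucket whose constraints all hold for `notes`."
  [notes]
  (some (fn [[bucket preds]]
          (when (every? #(% notes) preds) bucket))
        templates))

;; (classify [:C :E :G :B]) ;=> :good
;; (classify [:E :E :E :E]) ;=> :bad   ; same note over and over
```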