<?xml version="1.0" encoding="UTF-8"?>
<!--Xholon Workbook http://www.primordion.com/Xholon/gwt/ MIT License, Copyright (C) Ken Webb, Sat Mar 01 2014 09:31:58 GMT-0500 (EST)-->
<XholonWorkbook>
<Notes><![CDATA[
Xholon
------
Title: HTML5 Audio Synthesis
Description:
Url: http://www.primordion.com/Xholon/gwt/
InternalName: 9275385
Keywords:
My Notes
--------
I'm using this workbook to explore how to synthesize audio in a web browser, using the HTML5 Web Audio API.
I'm starting off with no prior knowledge of this area, so everything here is very tentative.
It's a set of notes combined with a runnable app that generates some sounds.
Keith Peters has written a good tutorial about `Audio Synthesis in JavaScript` (October 28, 2013).
I'm reproducing his code here to help me start thinking about how I can use audio in Xholon.
This app works in Chrome (via webkitAudioContext) and in Firefox (via AudioContext),
and may work in other modern browsers.
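The cross-browser setup boils down to picking whichever constructor the browser provides. A minimal sketch of that pattern (the `typeof` guard is only there so the snippet degrades gracefully outside a browser; the workbook's own version is in postConfigure() below):

```javascript
// Pick whichever AudioContext constructor this browser provides.
// (Guarded so the snippet also degrades gracefully outside a browser.)
var Ctor = (typeof window !== "undefined") &&
    (window.AudioContext || window.webkitAudioContext);
if (Ctor) {
  var context = new Ctor(); // one context per app is usually enough
  console.log("sample rate: " + context.sampleRate);
} else {
  console.log("Web Audio API not available");
}
```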
To run the app and listen to some generated sounds:
--------------------------------------------------
- Click the "clsc" button at the top of this page
- The app will load in a new tab or window
- Adjust the volume on your sound card; low-volume is probably a good idea to start with
- Click the "Step" button 7 times and listen to the sounds (or lack of sound) at each timestep
- Examine the JavaScript behavior lower down on this page to see what should be happening at each time step,
and follow along in Keith Peters' tutorial
The Web Audio API specification has a very Xholon-like structure, including containment and routing.
"The Web Audio API specification developed by W3C describes a high-level JavaScript API for processing and synthesizing audio in web applications. The primary paradigm is of an audio routing graph, where a number of AudioNode objects are connected together to define the overall audio rendering." (wikipedia)
The set of AudioNode objects is contained within an AudioContext object.
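The containment-plus-routing structure is easy to picture with a toy model (plain JavaScript, not the real API): each node records its outgoing connections, the way AudioNode.connect() wires sources to destinations inside one AudioContext.

```javascript
// Toy model of the Web Audio routing graph (NOT the real API):
// each node records its outgoing connections, mimicking AudioNode.connect().
function makeNode(name) {
  return {
    name: name,
    outputs: [],
    connect: function(dest) { this.outputs.push(dest.name); }
  };
}

var osc2 = makeNode("osc2");
var gain = makeNode("gain");
var osc1 = makeNode("osc1");

// Same routing as this workbook's app: osc2 modulates osc1 via gain.
osc2.connect(gain);
gain.connect(osc1);

console.log(osc2.outputs[0] + " -> " + gain.outputs[0]); // gain -> osc1
```

This is the same chain the Graphviz export below draws: osc1, gain, osc2 connected in a line.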
To generate the SVG image at the bottom of this page:
----------------------------------------------------
- Run this app using the "clsc" button at the top of this page
- Right-click the context1:audioContext node in the Composite Structure Hierarchy
- Select Export > Graphviz
- Copy the Graphviz content from the resulting graphviz tab (shown below)
digraph 36 { label="context1"
node [style=filled,fillcolor="#f0f8ff"]
37 [label="osc1"]
37 -> 38;
38 [label="gain"]
38 -> 39;
39 [label="osc2"]
}
- Visit URL http://rise4fun.com/agl/
- Paste the Graphviz content into the rise4fun editor
- Click the rise4fun action button, which will generate the SVG
Resources
---------
http://flippinawesome.org/2013/10/28/audio-synthesis-in-javascript/
http://www.html5rocks.com/en/tutorials/webaudio/intro/
https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/specification.html
http://en.wikipedia.org/wiki/HTML5_Audio
http://www.phy.mtu.edu/~suits/notefreqs.html
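The note frequencies in the scale object further down (and on the notefreqs page above) all come from the equal-temperament formula f = 440 * 2^((n - 69) / 12), where n is the MIDI note number and A4 = 69. A quick check:

```javascript
// Equal-temperament frequency for a MIDI note number (A4 = note 69 = 440 Hz).
function noteFreq(n) {
  return 440 * Math.pow(2, (n - 69) / 12);
}

console.log(noteFreq(60).toFixed(2)); // 261.63 -- middle C, the "C" in the scale object
console.log(noteFreq(69).toFixed(2)); // 440.00 -- the "A"
```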
]]></Notes>
<_-.XholonClass>
<!-- domain objects -->
<PhysicalSystem/>
<AudioBlock/>
<!--
API interfaces defined in the Web Audio API
Note: these should all be part of a new Xholon view-mechanism,
so they can be used in any Xholon app
-->
<!-- "contains an audio signal graph representing connections between AudioNodes" -->
<AudioContext>
<!-- "for rendering/mixing-down (potentially) faster than real-time" -->
<OfflineAudioContext/>
</AudioContext>
<!-- "for working with memory-resident audio assets.
can represent one-shot sounds, or longer audio clips" -->
<AudioBuffer/>
<!-- "represents audio sources, audio outputs, and intermediate processing modules.
AudioNodes can be dynamically connected together in a modular fashion"
to create a routing graph -->
<AudioNode>
<!-- "audio source from an audio, video, or other media element" -->
<MediaElementAudioSourceNode/>
<!-- "audio source from a MediaStream such as live audio input, or from a remote peer" -->
<MediaStreamAudioSourceNode/>
<!-- "generates audio from an AudioBuffer" -->
<AudioBufferSourceNode/>
<!-- "final destination for all rendered audio" -->
<AudioDestinationNode/>
<!-- "audio destination to a MediaStream sent to a remote peer" -->
<MediaStreamAudioDestinationNode/>
<!-- "for generating or processing audio directly in JavaScript" -->
<ScriptProcessorNode/>
<!-- "for explicit gain control" -->
<GainNode/>
<!-- "for common low-order filters such as
Low Pass
High Pass
Band Pass
Low Shelf
High Shelf
Peaking
Notch
Allpass" -->
<BiquadFilterNode/>
<!-- "applies a dynamically adjustable variable delay" -->
<DelayNode/>
<!-- "for spatializing / positioning audio in 3D space" -->
<PannerNode/>
<!-- "for applying a real-time linear effect (such as the sound of a concert hall)" -->
<ConvolverNode/>
<!-- "for use with music visualizers, or other visualization applications" -->
<AnalyserNode/>
<!-- "for accessing the individual channels of an audio stream in the routing graph" -->
<ChannelSplitterNode/>
<!-- "for combining channels from multiple audio streams into a single audio stream" -->
<ChannelMergerNode/>
<!-- "for dynamics compression" -->
<DynamicsCompressorNode/>
<!-- "applies a non-linear waveshaping effect for distortion and other more subtle warming effects" -->
<WaveShaperNode/>
<!-- "an audio source generating a periodic waveform" -->
<OscillatorNode/>
</AudioNode>
<!-- "an event type used with ScriptProcessorNode objects" -->
<AudioProcessingEvent/>
<!-- "for controlling an individual aspect of an AudioNode's functioning, such as volume" -->
<AudioParam/>
<!-- "works with a PannerNode for spatialization" -->
<AudioListener/>
</_-.XholonClass>
<xholonClassDetails>
<!-- ports for visualization of the structure defined by the AudioBlockbehavior -->
<OscillatorNode>
<!--<port name="port" index="0" connector=".[roleName='osc1']/../GainNode"/>-->
<port name="port" index="0" connector="following-sibling::GainNode"/>
</OscillatorNode>
<GainNode>
<port name="port" index="0" connector="following-sibling::OscillatorNode"/>
</GainNode>
<!-- ports for first example from section "1.2. Modular Routing" in the Web Audio API spec -->
<AudioBufferSourceNode>
<port name="port" index="0" connector="../AudioDestinationNode"/>
</AudioBufferSourceNode>
</xholonClassDetails>
<PhysicalSystem>
<AudioBlock/>
<!-- this is a visualization of the structure defined by the AudioBlockbehavior -->
<AudioContext roleName="context1">
<OscillatorNode roleName="osc1"/>
<GainNode roleName="gain"/>
<OscillatorNode roleName="osc2"/>
</AudioContext>
<!-- first example from section "1.2. Modular Routing" in the Web Audio API spec -->
<AudioContext roleName="context2">
<AudioBufferSourceNode roleName="source"/>
<AudioDestinationNode roleName="destination"/>
</AudioContext>
</PhysicalSystem>
<AudioBlockbehavior implName="org.primordion.xholon.base.Behavior_gwtjs"><![CDATA[
var myAudioBlock, context, osc1, gain, osc2;
var $this, position, scale, song, periodicTimer; // used in Creating Music

var beh = {

  postConfigure: function() {
    myAudioBlock = this.cnode.parent();
    $wnd.xh.param("TimeStepInterval", "1000"); // 1 timestep every second (every 1000 milliseconds)
    // make it work with multiple browsers
    try {
      $wnd.AudioContext = $wnd.AudioContext || $wnd.webkitAudioContext;
      context = new $wnd.AudioContext();
    }
    catch(e) {
      alert('Web Audio API is not supported in this browser');
    }
  },

  act: function() {
    var ts = $wnd.xh.param("TimeStep");
    switch(ts) {
    // Creating Audio Nodes
    case "0": this.makeOscillator1(); break;
    // Creating an Oscillator
    case "1": this.makeGain(); break;
    case "2": this.makeOscillator2(); break;
    // Creating Interactivity
    case "3": this.makeInteractivity(); break;
    // turn off the annoying noise
    case "4":
      osc2.stop(0);
      osc1.stop(1);
      break;
    // Creating Music
    case "5": this.makeMusic(); break;
    // stop the annoying music before my cat goes crazy
    case "6":
      clearInterval(periodicTimer);
      break;
    default: break;
    }
  },

  makeOscillator1: function() {
    osc1 = context.createOscillator();
    osc1.frequency.value = 440; // A
    osc1.type = "sine"; // default is "sine"; also "sawtooth", "square", "triangle"
    osc1.connect(context.destination);
    osc1.start(0);
  },

  makeGain: function() {
    // osc2 -> gain -> osc1.frequency; the gain node scales the modulator,
    // so gain.gain here is a modulation depth in Hz, not an output volume
    gain = context.createGain();
    gain.gain.value = 100;
    gain.connect(osc1.frequency);
  },

  makeOscillator2: function() {
    // a slow square wave used as a low-frequency oscillator (LFO) to modulate osc1
    osc2 = context.createOscillator();
    osc2.frequency.value = 1;
    osc2.type = "square";
    osc2.connect(gain);
    osc2.start(0);
  },

  makeInteractivity: function() {
    var w = $wnd.innerWidth;
    var h = $wnd.innerHeight;
    osc1.frequency.value = 400;
    osc2.frequency.value = 5;
    // map mouse Y to the audible frequency, and mouse X to the LFO rate
    $doc.addEventListener("mousemove", function(e) {
      osc1.frequency.value = e.clientY / h * 1000 + 200;
      osc2.frequency.value = e.clientX / w * 30 + 5;
    });
  },

  makeMusic: function() {
    console.log("makeMusic");
    context = new $wnd.AudioContext();
    position = 0;
    // note frequencies in Hz; lowercase letters are the sharps/flats
    scale = {
      C: 261.63,
      d: 277.18,
      D: 293.66,
      e: 311.13,
      E: 329.63,
      F: 349.23,
      g: 369.99,
      G: 392,
      a: 415.30,
      A: 440.00,
      b: 466.16,
      B: 493.88
    };
    song = "GFeFGGG-FFF-Gbb-GFeFGGGGFFGFe---";
    console.log(song);
    $this = this;
    periodicTimer = setInterval(this.play, 1000 / 4); // 4 notes per second
  },

  createOscillator: function(freq) {
    var attack = 10;  // ms to ramp the gain up to full volume
    var decay = 250;  // ms to ramp it back down to silence
    var gain = context.createGain();
    var osc = context.createOscillator();
    gain.connect(context.destination);
    // a simple attack/decay envelope on the gain AudioParam
    gain.gain.setValueAtTime(0, context.currentTime);
    gain.gain.linearRampToValueAtTime(1, context.currentTime + attack / 1000);
    gain.gain.linearRampToValueAtTime(0, context.currentTime + decay / 1000);
    osc.frequency.value = freq;
    osc.type = "square";
    osc.connect(gain);
    osc.start(0);
    // clean up once the note has finished sounding
    setTimeout(function() {
      osc.stop(0);
      osc.disconnect(gain);
      gain.disconnect(context.destination);
    }, decay);
  },

  play: function() {
    console.log("play");
    var note = song.charAt(position);
    var freq = scale[note];
    position += 1;
    if (position >= song.length) {
      position = 0;
    }
    if (freq) { // a "-" in the song has no scale entry, so it acts as a rest
      $this.createOscillator(freq);
    }
  }

}
]]></AudioBlockbehavior>
<SvgClient><Attribute_String roleName="svgUri"><![CDATA[data:image/svg+xml,
<svg xmlns:xlink="http://www.w3.org/1999/xlink" width="58.12" height="172.887" id="svg2" version="1.1" xmlns="http://www.w3.org/2000/svg">
<g transform="translate(29.0598926544189,672.886711120605)">
<!--nodes-->
<rect fill="none" stroke="#000000" stroke-opacity="1" stroke-width="1" x="-27.06" y="-670.887" width="54.12" height="36.296" rx="3.63" ry="3.63" />
<text x="-19.06" y="-645.974" font-family="Arial" font-size="16" fill="#000000">osc1</text>
<rect fill="none" stroke="#000000" stroke-opacity="1" stroke-width="1" x="-27.06" y="-538.296" width="54.12" height="36.296" rx="3.63" ry="3.63" />
<text x="-19.06" y="-513.383" font-family="Arial" font-size="16" fill="#000000">osc2</text>
<rect fill="none" stroke="#000000" stroke-opacity="1" stroke-width="1" x="-27.06" y="-604.591" width="54.12" height="36.296" rx="3.63" ry="3.63" />
<text x="-19.06" y="-579.678" font-family="Arial" font-size="16" fill="#000000">gain</text>
<!--end of nodes-->
<!--Edges-->
<path fill="none" stroke="#000000" stroke-opacity="1" stroke-width="1" d="M 0 -634.591 L 0 -614.591" />
<polygon stroke="#000000" stroke-opacity="1" stroke-width="1" fill="#000000" points="-2.217 -614.591 0 -604.591 2.217 -614.591" />
<path fill="none" stroke="#000000" stroke-opacity="1" stroke-width="1" d="M 0 -568.296 L 0 -548.296" />
<polygon stroke="#000000" stroke-opacity="1" stroke-width="1" fill="#000000" points="-2.217 -548.296 0 -538.296 2.217 -548.296" />
</g>
</svg>
]]></Attribute_String><Attribute_String roleName="setup">${MODELNAME_DEFAULT},${SVGURI_DEFAULT}</Attribute_String></SvgClient>
</XholonWorkbook>