Last active October 10, 2017 02:12
const playSignal = (signal = [], audioContext, sampleRate) => {
  const signalLength = signal.length
  // This node will be used as the source to play the signal
  const node = audioContext.createBufferSource()
  // Create the buffer to assign to the node above. The arguments are:
  // the number of channels, the length of the buffer (the same as the
  // signal's length) and the sample rate.
  // The sample rate is how many samples per second the sound array holds:
  // at a sample rate of 3000, a 2-second signal needs an array of length 6000.
  const buffer = audioContext.createBuffer(1, signalLength, sampleRate)
  // This is the channel data of the buffer; since we created an empty one,
  // it is an array full of zeros
  const data = buffer.getChannelData(0)
  // Write the values from our signal into the buffer...
  for (let i = 0; i < signalLength; i += 1) {
    data[i] = signal[i]
  }
  // ...then assign the buffer to the node.buffer property
  node.buffer = buffer
  // Connect the node to audioContext.destination, which means the speakers.
  // audioContext is the context you need to handle all this fancy
  // Web Audio API stuff :)
  node.connect(audioContext.destination)
  // Now we start to play!
  node.start()
}
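A minimal sketch of how `playSignal` might be called: build a signal array by sampling a sine wave (a 440 Hz test tone and a 44100 Hz sample rate are assumptions for illustration) and hand it to the function. The `AudioContext` guard is there because the Web Audio API only exists in the browser.

```javascript
// Assumed parameters for this sketch: a 2-second 440 Hz tone at 44100 Hz
const sampleRate = 44100
const durationSeconds = 2
const frequency = 440

// Sample the sine wave: one value per sample, so length = sampleRate * seconds
const signal = Array.from(
  { length: sampleRate * durationSeconds },
  (_, i) => Math.sin(2 * Math.PI * frequency * (i / sampleRate))
)

// AudioContext is browser-only; guard so this snippet also loads elsewhere
if (typeof AudioContext !== 'undefined') {
  playSignal(signal, new AudioContext(), sampleRate)
}
```

Note that each sample stays in the `[-1, 1]` range `Math.sin` produces, which is what `AudioBuffer` channel data expects.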