Firefox Developer Tools' Web Audio Editor scratchpad

Moved to

Originally from etherpad

The Web Audio Editor in Firefox Developer Tools is still a work in progress. It needs a lot of love in its current state; once the basic features are working, here are some dream features and future plans.

  • Context View (displays meta information about the AudioContext itself) bug 1007321
  • Display automation over time (toggle between GRAPH view and AUTOMATION view?) bug 1007876
  • Display input/output nodes in a connections panel bug 1007875

Visual Feedback

  • Display feedback when an AudioSourceNode is playing. bug 1006626
  • Display time/frequency domain visualization, VU, for overall sound output bug 1019100
  • Display AudioNodes connected to an AudioParam. bug 986705
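A sketch of how the time/frequency + VU bullet might work in a page today, assuming an AnalyserNode tap; `vuFromSamples` and `meterNode` are made-up helper names, not devtools or Web Audio APIs:

```javascript
// Hypothetical helper (not a devtools API): RMS level of a sample block, in dBFS.
function vuFromSamples(samples) {
  let sum = 0;
  for (let i = 0; i < samples.length; i++) sum += samples[i] * samples[i];
  const rms = Math.sqrt(sum / samples.length);
  return rms > 0 ? 20 * Math.log10(rms) : -Infinity;
}

// Browser-side sketch: tap any node with an AnalyserNode and report its level.
function meterNode(audioCtx, node, onLevel) {
  const analyser = audioCtx.createAnalyser();
  node.connect(analyser);
  const buf = new Float32Array(analyser.fftSize);
  (function poll() {
    analyser.getFloatTimeDomainData(buf);
    onLevel(vuFromSamples(buf));
    requestAnimationFrame(poll);
  })();
  return analyser;
}
```

The same analyser could also feed an FFT display via `getFloatFrequencyData`, so one tap point serves both the time-domain and frequency-domain views.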

Designer Controls

AudioNode controls

  • Start/stop AudioSourceNodes bug 1006624
  • "Tap into" nodes, listen along the way bug 1007887
  • Play AudioBufferSourceNode's buffers (use a new AudioContext/audio element) bug 1007889
  • "Disable/bypass" AudioNodes in a chain bug 1007778
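The disable/bypass bullet could be prototyped page-side by wrapping a node between two gain stages and rerouting around it. A rough sketch under that assumption; `BypassableNode` is an invented name, not a devtools or Web Audio API:

```javascript
// Wrap a node between an input and an output GainNode so it can be
// bypassed by rerouting, without touching the rest of the graph.
class BypassableNode {
  constructor(ctx, node) {
    this.input = ctx.createGain();
    this.output = ctx.createGain();
    this.node = node;
    this.bypassed = false;
    this.input.connect(node);
    node.connect(this.output);
  }
  setBypassed(on) {
    if (on === this.bypassed) return;
    this.input.disconnect();
    if (on) {
      this.node.disconnect();          // route straight past the node
      this.input.connect(this.output);
    } else {
      this.input.connect(this.node);   // restore the original routing
      this.node.connect(this.output);
    }
    this.bypassed = on;
  }
}
```

Because the wrapper relies only on `connect()`/`disconnect()`, the routing logic itself can be exercised with stub nodes outside a browser; a native bypass (see the platform-work section below) would avoid the extra gain stages.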

Global Controls



  • Support multiple AudioContexts on a page (low priority IMO) bug 1006624
  • Render/save out audio output from an AudioContext bug 1008742
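For the render/save bullet, a page-side sketch, assuming the graph can be rebuilt against an OfflineAudioContext; `renderGraph` and `interleave` are hypothetical helpers, not devtools APIs:

```javascript
// Hypothetical helper: interleave two channels for export (e.g. into a WAV writer).
function interleave(left, right) {
  const out = new Float32Array(left.length + right.length);
  for (let i = 0, j = 0; i < left.length; i++) {
    out[j++] = left[i];
    out[j++] = right[i];
  }
  return out;
}

// Browser-only sketch: render the graph faster than real time and collect samples.
function renderGraph(buildGraph, seconds, sampleRate = 44100) {
  const ctx = new OfflineAudioContext(2, seconds * sampleRate, sampleRate);
  buildGraph(ctx); // caller rebuilds its nodes and connects to ctx.destination
  return ctx.startRendering().then((buffer) =>
    interleave(buffer.getChannelData(0), buffer.getChannelData(1)));
}
```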

Platform work needed

  • GC event when a node is GC'd (so we can clean up dead nodes in the graph view) bug 1008497
  • Bypass/skip a node in an audio chain (possible to implement on the actor, but may be better done natively) -- may be necessary for bug 1007778

Edge Cases to Test

  • Buffer with loop


  • Who is going to use this tool? Game developers? Audio hackers?


On Audio Hackers vs Game Devs: this isn't quite right. I think adoption of Web Audio will look more like this:

  1. People will use audio player widgets to replace existing Flash-based audio streaming widgets.
  2. People will create multimedia experiences using audio and UI / visuals, e.g. games, visualizations, demoscene-style stuff, drum machines.

Put a little differently: "is the purpose to stream some audio from the internet continuously, or to trigger short samples from buffers based on events?"

Also, for use case 1, do we think that super-charging <audio> and <video> tag inspection, with some insight into their state, is almost a separate tool? It definitely feels like a separate use case to me.


jsantell commented May 10, 2014

@canuckistani Inspecting what exactly? An <audio> tag? I was kind of getting at this earlier with the WebVTT idea -- specific mini-tools for elements. For an audio player using Web Audio, there are many different ways that could be done, and I think being able to play back buffers for an AudioBufferSourceNode, or a link to the <audio> tag for a MediaElementAudioSourceNode, would be good for debugging the widget use case.

sole commented May 10, 2014

Some ideas off the top of my head:

  • being able to see the event listeners added to nodes / maybe even edit them! (i.e. temporarily disable them with a checkbox?)
  • being able to see scheduled parameter changes (setValueAtTime etc.)
  • once WebMIDI is implemented, what about providing a simple on-screen keyboard that lets people simulate having a real MIDI instrument connected?
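The scheduled-parameter-changes idea could be approximated today by shimming the scheduling methods on an AudioParam-like object and logging each call; this is a sketch with an invented `recordAutomation` helper, not how the devtools actor would necessarily do it:

```javascript
// Methods on AudioParam that schedule automation (per the Web Audio spec).
const SCHEDULING_METHODS = [
  'setValueAtTime',
  'linearRampToValueAtTime',
  'exponentialRampToValueAtTime',
  'setTargetAtTime',
];

// Hypothetical helper: wrap a param's scheduling methods so every
// automation event is recorded before being passed through.
function recordAutomation(param) {
  const log = [];
  for (const name of SCHEDULING_METHODS) {
    if (typeof param[name] !== 'function') continue;
    const original = param[name].bind(param);
    param[name] = (...args) => {
      log.push({ method: name, args });
      return original(...args);
    };
  }
  return log;
}
```

A panel could then render the log as an automation timeline for the GRAPH/AUTOMATION toggle mentioned above.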

sole commented May 10, 2014

Also! Do you think it could be possible to sort of profile audio frames? A bit like how a memory profiler would work, but showing how long each frame actually took to return. I'm mostly thinking about detecting when onaudioprocess takes longer than it should, causing glitches or "audio jank".
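The "audio jank" idea hinges on a simple budget: an onaudioprocess callback has to finish within bufferSize / sampleRate seconds or the output glitches. A sketch of detecting overruns; the helper names are mine, not an existing API:

```javascript
// Real-time budget for one onaudioprocess callback, in milliseconds.
function audioBudgetMs(bufferSize, sampleRate) {
  return (bufferSize / sampleRate) * 1000;
}

// Wrap a handler so overruns are reported (Date.now() for portability;
// performance.now() would be finer-grained in a browser).
function wrapWithJankDetector(handler, bufferSize, sampleRate, onJank) {
  const budget = audioBudgetMs(bufferSize, sampleRate);
  return function (event) {
    const start = Date.now();
    handler(event);
    const elapsed = Date.now() - start;
    if (elapsed > budget) onJank(elapsed, budget);
  };
}
```

For example, a 4096-frame buffer at 44100 Hz gives roughly a 93 ms budget; any callback slower than that is a candidate glitch to surface in the tool.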


jsantell commented May 10, 2014

@sole that's an awesome idea. I wonder if we could somehow hook onaudioprocess into the performance tools once they're done? Definitely a huge pain point, tracking down glitches and pops!

sole commented May 11, 2014

Oh I had this other idea: saving the output of a node to WAV or something similar. Here's sole entering Lalala-land, but consider that this would be like another type of "profile" of the actual output. Maybe you could record it twice and then compare the different runs.
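The WAV export sole describes could sit on a small encoder once the samples are captured. A minimal sketch for mono 16-bit PCM; `encodeWav` is our own helper, not an existing devtools or Web Audio API:

```javascript
// Hypothetical helper: encode mono float samples ([-1, 1]) as a 16-bit PCM WAV file.
function encodeWav(samples, sampleRate) {
  const bytesPerSample = 2;
  const dataSize = samples.length * bytesPerSample;
  const buffer = new ArrayBuffer(44 + dataSize);
  const view = new DataView(buffer);
  const writeString = (offset, s) => {
    for (let i = 0; i < s.length; i++) view.setUint8(offset + i, s.charCodeAt(i));
  };
  writeString(0, 'RIFF');
  view.setUint32(4, 36 + dataSize, true);
  writeString(8, 'WAVE');
  writeString(12, 'fmt ');
  view.setUint32(16, 16, true);                           // fmt chunk size
  view.setUint16(20, 1, true);                            // audio format: PCM
  view.setUint16(22, 1, true);                            // channels: mono
  view.setUint32(24, sampleRate, true);
  view.setUint32(28, sampleRate * bytesPerSample, true);  // byte rate
  view.setUint16(32, bytesPerSample, true);               // block align
  view.setUint16(34, 16, true);                           // bits per sample
  writeString(36, 'data');
  view.setUint32(40, dataSize, true);
  for (let i = 0; i < samples.length; i++) {
    const s = Math.max(-1, Math.min(1, samples[i]));      // clamp to [-1, 1]
    view.setInt16(44 + i * bytesPerSample, s < 0 ? s * 0x8000 : s * 0x7fff, true);
  }
  return new Uint8Array(buffer);
}
```

Recording the same node twice and diffing the two WAV files would give the "compare different runs" profile sole mentions.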


jsantell commented May 11, 2014

Oh man, these are all great ideas. Not sure how we'd do that retroactively, but I think it'd be pretty possible to start/stop recording, much like the profiler, with an OfflineAudioContext like you mentioned... hmm. And yes, lots of these ideas are future dreams, but good to track anyway :)

sole commented May 12, 2014

I had another idea! What about being able to analyse the output of the currently selected node? So for example, suppose I highlight the context destination: I would get a little panel in the inspector with the FFT + wave display + a saturation level indicator. These kinds of visual cues are super useful when you're doing stuff and you're sort of tired and unsure whether your ears are tricking you!

Then you could select another node and see what it was doing: is it the one introducing that annoying high frequency? Etc...


jsantell commented May 12, 2014

@sole nice! I think displaying a time/freq domain view with a VU or something would be great -- combined with bug 1007778, we'd be able to do that for every node along the chain, which would be great for debugging a signal path.

forresto commented Jun 2, 2014

Cool! I'm making a zoomable nodal editor with named ports. I'd be happy to help you use it for this.


jsantell commented Jul 25, 2014

@forresto Just saw your message here! We're currently trying to figure out how best to display connections to AudioParams -- open to any solutions, including the grid/graph you've linked, if it's performant and customizable!
