Equinox Marching.js Tutorial

Marching.js

This demo tutorial will mainly focus on getting started with the marching.js playground, a browser-based tool for live coding with ray marchers. I’ve added some extra information at the beginning on the demoscene, which was an important source of inspiration for marching.js. Here’s a recent article on the demoscene: “Here’s some Sanity, literally, and Amiga Dreams as read for you in French” (CDM Create Digital Music).

Demoscene

The demoscene is a culture that began in the 1980s, when hackers created “cracktros” for the software they cracked… these were short audiovisual demos that showed off the hacker’s skills and often contained shoutouts to other hackers / friends.

Over time, some hackers/artists became more interested in the audiovisual demos than in pirating software, and so the “demoscene” was born: a culture where programmers create audiovisual “demos” that run in realtime, often on severely constrained hardware. The demoscene continues today, with many events held annually around the world (for example, http://atparty-demoscene.net/2019/05/09/party-2019-performers/ and https://2019.cookie.paris)

A technique used in many (but certainly not all) demos is ray marching, a physically informed rendering technique that can produce graphics very different from those of the classic OpenGL rendering pipeline; see Ray Marching and Signed Distance Functions for more information about how ray marching works. Here is a classic demoscene video from 2007 that shows some of these techniques in action:

Chaos Theory by Conspiracy | 64k intro (FullHD 1080p HQ demoscene demo) - YouTube
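At the heart of ray marching is the signed distance function (SDF): given any point in space, it returns the distance to the nearest surface of a shape, and the renderer “marches” each ray forward by that distance until it hits something. Here’s a minimal sketch of the idea in plain JavaScript… the function name and the point object are illustrative, not part of any library:

// signed distance from point p to a sphere of the given radius
// centered at the origin: positive outside, zero on the surface,
// negative inside
const sdSphere = ( p, radius ) =>
  Math.hypot( p.x, p.y, p.z ) - radius

sdSphere( { x:0, y:0, z:2 }, 1 ) // 1… the ray can safely step 1 unit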

These types of effects are a lot of fun! We can experiment with them using a JavaScript library / environment, the marching.js playground, which lets us work at a high level without having to write complex shader programs. These graphics can also be audio-reactive.

Getting started with Marching.js

Let’s start by just putting a sphere on the screen. We can pass a Sphere() to the march() function, and then call .render() on the result. Highlight the code below and hit ctrl+enter to execute it.

march( Sphere() ).render()

There are many more interesting forms to render in marching.js. For example, fractals:

march( Mandelbulb() ).render()

You can get an idea of what’s available by looking through the reference or the various examples in the playground. Behind the scenes, this JavaScript scene description is used to compile a GLSL program that runs on your computer’s GPU and renders the scene. If you’ve used the live coding system hydra, marching.js uses a very similar approach.

One of the classic ray marching techniques is to repeat forms in space… this is a very simple operation in the shader. Let’s create a field of repeating spheres:

march(
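  // Repeat( shape, distance ): tile the sphere every 4 units along each axis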
  Repeat(
    Sphere(),
    4
  )
).render()

Note that the field ends after about ten spheres or so; to reduce rendering time, marching.js only renders out to a certain distance on the horizon. However, if we decrease the radius of the sphere (which defaults to 1) along with the repeat distance, we can fit many more repetitions into the available space.

march(
  Repeat(
    Sphere(.1),
    .4
  )
).render()

At this point the spheres basically repeat all the way to the horizon. Unfortunately this results in a bit of a visual mess; it looks better if we fade them out in the distance using fog.

march(
  Repeat(
    Sphere(.1),
    .4
  )
)
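// fog( amount, color ): a light fog (.1) that fades distant shapes to black, Vec3(0)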
.fog( .1, Vec3(0) )
.render()

There are plenty of other geometries to play with; check out the reference for marching.js, or take a look at the “geometry catalog” found in the drop-down menu of examples / tutorials.

Dynamic rendering and camera movement

In the above examples, we rendered static snapshots of the scene at high quality resolution. For realtime animation work, we’ll need to lower the quality that’s being used and tell marching.js to start rendering, ideally at 60fps.

For example, we can easily animate the distance between our repeated spheres by storing a reference to our Repeat object and then manipulating its .distance property. In order to do this, however, we need to tell marching.js that we want to animate our scene and specify a quality setting. Try the example below with a value of “low” for quality and see how that looks. Then try changing this to “medium” or “high” to see if your computer can handle the higher quality settings.

Although ray marching can produce graphics that are hard to achieve using other techniques, it is unfortunately fairly hard on your GPU, depending on the scene you’re rendering. The main difference between “medium” and “high” is an increase in resolution, so lowering the resolution of your screen/projector will also help a great deal… especially if you have a hidef or “retina” display. In performances you really want to go as low as possible. There are also some other render presets specifically tuned for viewing fractals or voxelized geometries; we’ll discuss these in a bit.

march(
  rpt = Repeat(
    Sphere(.1),
    .4
  )
)
.fog( .1, Vec3(0) )
.render('low')

onframe = time => {
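  // % binds tighter than +, so this is .25 + ((time/10) % 1)…
  // a ramp from .25 to 1.25 that repeats every ten seconds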
  rpt.distance = .25 + time/10 % 1
}

The other addition needed for our animation is the definition of an onframe function. The example above uses JavaScript’s arrow function notation, but it could also be written this way:

onframe = function( time ) {
  rpt.distance = .25 + time/10 % 1
}

In marching.js, all of JavaScript’s default Math functions / constants have been exported into the global namespace, so we can also use them to animate.

onframe = time => {
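  // rpt.distance is a Vec3, so its x / y / z components can be set independently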
  rpt.distance.x = .5 + sin(time) * .25
  rpt.distance.y = .5 + cos(time/2) * .25
  rpt.distance.z = .5 + cos(time/3) * .35
}

The time argument passed to the onframe function is measured in seconds. If you’d rather count frames, you could write:

frame = 0
onframe = () => {
  frame++
  // do something with the frame count here
}
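As a hypothetical sketch of how you might use the count (assuming the rpt object from the earlier example is still in scope, and roughly 60 frames per second):

frame = 0
onframe = () => {
  frame++
  // about once per second, jump to a new repeat distance
  if( frame % 60 === 0 ) rpt.distance = .25 + random() * .75
}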

Camera movement

Assuming we’re dynamically rendering our scene, we can control a virtual camera to fly around and check things out. Just hit Shift+Ctrl+C to enter camera mode, and hit the same keys again to leave it. In camera mode the GUI / code editor is hidden, and the following key commands are enabled. Do this while running the above code for the infinite field of spheres.

WASD - Camera movement: forwards (w), backwards (s), left (a), and right (d). All of these take into account the direction the camera is currently facing.

Arrow keys - Camera orientation.

Ctrl+Shift+. - Stop/start calls to onframe, effectively toggling animation.

It’s a lot of fun to explore fractals in this way. Try this sketch out and then enter camera mode (and try “fractal.med” if your graphics card will handle it):

march(
  rpt = Repeat(
    Mandelbox(),
    4
  )
)
.fog( .5, Vec3(0) )
.render('fractal.low')

Using an FFT to animate

As you might have guessed given the topic of this workshop, we can also use this animation function to read FFT analysis data and assign it to properties. In Chrome, you can specify the audio interface used at chrome://settings/content/microphone?search=microphone. Firefox will ask what you’d like to use as the input when you run FFT.start().

FFT.start()

march(
  r = Repeat(
    Sphere(.1),
    .4
  )
)
.fog( .1, Vec3(0) )
.render(4,true)

onframe = t => {
  r.distance.x = FFT.low
  r.distance.y = FFT.high
  r.distance.z = FFT.mid
}

FFT.windowSize *= 2

We can increase the FFT’s window size to get a smoother response; however, this also makes visualizations slower to respond to transients in the audio signal.
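The raw FFT band values also won’t always sit in a useful range for a given property. A common pattern, sketched here assuming the bands roughly span 0–1, is to add an offset and a scalar so the spheres never fully collapse:

onframe = t => {
  // keep a minimum spacing of .25 and let the FFT add up to .75 more
  r.distance.x = .25 + FFT.low  * .75
  r.distance.y = .25 + FFT.mid  * .75
  r.distance.z = .25 + FFT.high * .75
}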

Texturing & Using GUIs

There are a variety of ways to control the appearance of geometries in terms of lighting; see the “lighting and materials” tutorial in marching.js for more information about many of these. In addition to these methods, we can apply procedural textures to objects; these add visual interest and also provide fun parameters to control via audio input.

The “texture catalog” example shows most of these; however, the reference doesn’t currently list the available textures or their parameters. We can instead learn these parameters by telling marching.js to create a GUI for them.

march(
  Box().texture('rainbow').gui()
).render('low')

Note that the resulting GUI also exposes attributes of the Box for control, including any parameters the box defines and the transform attached to it by default. Play around with the sliders to learn more.

Other textures to try include ‘dots’, ‘checkers’, and ‘cellular’. Once you find interesting texture properties to animate, you can do so by storing a reference to the box you create (in this case the variable b) and then accessing properties through b.texture.propertyName:

march(
  b = Box().texture('cellular').gui()
).render('low')

onframe = t => {
  b.texture.time = t
  b.rotate( t*15,.35,.5,.5 )
}

Transforms and Mirroring

As we’ve seen from the GUI, every geometry (as well as every combinator and domain transformation) has a transform associated with it that can be used to move, rotate, or scale the object. For example, run the top part of the code below and then run each line in the lower half one at a time:

march( 
  m = Mandelbulb().move(0,0,3)
).render('fractal.low')

m.scale(2)
m.rotate(45,1,0,0)
m.move(1,0,1)

The “fractal.low” render preset requires that the camera be fairly close to the fractal. By default the marching.js camera begins at position (0,0,5), so we move the fractal a bit closer to it with a call to .move(0,0,3). The .scale() method scales the object uniformly; non-uniform scaling isn’t reliably correct with this type of rendering, so you can only scale all dimensions at once. The .rotate() method accepts an angle followed by an axis; in the code above we’re rotating 45 degrees around the X axis. .move() is called with absolute XYZ coordinates.
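These transform methods can also be called from within onframe. As a quick sketch (assuming the Mandelbulb scene above is still rendering, and that the “fractal.low” preset renders dynamically, as the camera section above implies):

onframe = time => {
  // rotate( angle, x, y, z ): spin 30 degrees per second around the Y axis
  m.rotate( time * 30, 0, 1, 0 )
}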

This is especially fun when combined with the Mirror() function, which creates copies of an object and distributes them along the X, Y, and Z axes. Note that mirroring only has an effect if the object has first been transformed, since the mirroring occurs around the origin (0,0,0).

Try this:

march( 
  Mirror(
    Mirror(
      Mandelbulb().move(.5,.5,.5)
    ).rotate(45,1,0,0)
  ).move(0,0,1) 
).render('fractal.low')

… and then enable the camera to fly around it. All sorts of fun can be had with mirroring (here’s one example).

Wrap-up

There’s a lot we didn’t get to cover in this tutorial; however, some of the important elements of marching.js are covered in its built-in tutorials. Make sure to look through the “constructive solid geometry” tutorial, and if you know a bit of GLSL, it might be fun to check out the “defining your own GLSL shapes” and “defining procedural textures” tutorials. Feel free to ask questions about marching.js in the gibber channel of the TOPLAP chat or on Twitter (@gibber_cc).

Addendum: Gibber (v2)

Marching.js is built into the new, pre-alpha version of Gibber, which can be found at:

Gibber v2

Mapping in Gibber v2 is fairly simple: you assign an audio object to a visual property. For example:

k = Kick()
k.trigger.seq( 1,1/4 )
s = Sphere().render()
s.radius = k

However, the range of values produced by the audio object is often not appropriate for the mapping you’re trying to create. You can add a bias (offset) and a scalar (multiplier) to control this.

k = Kick()
k.trigger.seq( 1,1/4 )
s = Sphere().render()
s.radius = k
s.radius.offset = .5
s.radius.multiplier = 10

There’s an audiovisual tutorial included in v2 that walks through some other examples of mappings.
