@thejmazz
Created May 9, 2016 22:26
GLSL Notes

Keeping track of key concepts; maybe this will eventually become tutorial style, but for now it reads more like a reference. Based on the shader-school lessons (including some of their code examples).

Spoiler warning: includes solutions to shader-school lessons.

Scalar Types

int life = 42;

bool flag;
bool a, b, c;

// lowp, mediump, highp
mediump float x=1.0, y=-2.0, z;

// +, -, *, /, <, >, <=, >=, ==, !=
z = x + y;

// +=, -=, *=, /=, ++, --
z++;

Vectors

  • boolean vectors: bvec2, bvec3, bvec4
  • integer vectors: ivec2, ivec3, ivec4
  • floating point vectors: vec2, vec3, vec4
precision mediump float;

//Declare a 3D medium precision floating point
//vector with components (1, 2, 3). Note that
//C++-style constructors like vec3 v(...) are
//not valid GLSL.
vec3 v = vec3(1.0, 2.0, 3.0);
//Create a low precision 2D vector with all
//components set to 2
lowp vec2 p = vec2(2.0);
//Unpack the first 3 components from v into q and
//set the last component to 1.0
highp vec4 q = vec4(v, 1.0);
//A 2D boolean vector with true and false
//components
bvec2 foo = bvec2(true, false);
//A 3D integer vector with components 1, 0, -1
ivec3 w = ivec3(1, 0, -1);

Swizzles

xyzw, rgba, stpq:

First component of p  = p.x = p.r = p.s = p[0]
Second component of p = p.y = p.g = p.t = p[1]
Third component of p  = p.z = p.b = p.p = p[2]
Fourth component of p = p.w = p.a = p.q = p[3]
vec4 p = vec4(1, 2, 3, 4);

vec2 q = p.xy;   //q = vec2(1, 2)
vec3 r = p.bgr;  //r = vec3(3, 2, 1)
vec3 a = p.xxy;  //a = vec3(1, 1, 2)

Arithmetic Operations

Applied component-wise:

vec4 a = vec4(1, 2, 3, 4);
vec4 b = vec4(5, 6, 7, 8);

vec4 c = a + b;    //c = vec4(6, 8, 10, 12);
vec4 d = a * 3.0;  //d = vec4(3, 6, 9, 12);
vec4 e = a * b;    //e = vec4(5, 12, 21, 32);

Geometric Functions

  • length(p)
  • distance(a, b)
  • dot(a, b)
  • cross(a, b)
  • normalize(a)
  • faceforward(N, I, Nref) reorients a normal to point away from a surface
  • reflect(I, N) reflects the vector I about the plane with (unit) normal N
  • refract(I, N, eta) applies a refractive transformation to I according to Snell's law

Comparisons

Return a bvec

  • lessThan(a, b)
  • lessThanEqual(a, b)
  • equal(a, b)
  • greaterThan(a, b)
  • greaterThanEqual(a, b)

Boolean Operations

  • any(b), all(b), not(b)

Constants

const highp float PI = 3.14159265359;

Global Precision

// should check for the GL_FRAGMENT_PRECISION_HIGH define before using highp in fragment shaders
precision highp float;

Subroutines

int add(int x, int y) {
  return x+y;
}

void doNothing() {

}

Type qualifiers for procedure arguments:

  • in passes argument by value (default behaviour)
  • inout passes argument by reference
  • out argument uninitialized, but writing to value updates parameter
  • const argument is a constant value
precision mediump float;

void testFunction(in float x, inout float y, out float z, const float w) {
  x += 1.0;
  y += x;
  z = x + y + w;
}

void test() {
  float x=1.0, y=1.0, z=0.0, w=-1.0;

  testFunction(x, y, z, w);

  //Now:
  //  x == 1.0
  //  y == 3.0
  //  z == 4.0
  //  w == -1.0
}

Builtins

  • unit conversion: radians, degrees
  • trigonometry: sin, cos, tan, asin, acos, atan
  • exponential: exp, log, exp2, log2
  • algebra: pow, sqrt, inversesqrt
  • rounding: floor, ceil, fract, mod, step
  • magnitude: abs, sign, min, max, clamp
  • interpolation: mix

Using fract and step to make a periodic 1D grid:

bool inTile(vec2 p, float tileSize) {
  vec2 ptile = step(0.5, fract(0.5 * p / tileSize));
  return ptile.x == ptile.y;
}
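The fract/step trick above is easy to sanity check outside the GPU. A quick JavaScript sketch of the same logic (the helper names here are mine, emulating GLSL's built-ins component-wise):

```javascript
// JS emulation of GLSL's fract and step built-ins
const fract = (x) => x - Math.floor(x);
const step = (edge, x) => (x < edge ? 0.0 : 1.0);

// Same logic as the GLSL inTile above, with p as a 2-element array
function inTile(p, tileSize) {
  const ptile = p.map((c) => step(0.5, fract((0.5 * c) / tileSize)));
  return ptile[0] === ptile[1];
}

console.log(inTile([0.25, 0.25], 1.0)); // both components in the same half-period -> true
console.log(inTile([0.25, 1.25], 1.0)); // opposite half-periods -> false
```

Points whose x and y fall in the same half-period of the tiling land on one color of the checkerboard; points in opposite half-periods land on the other.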

Branching

Very expensive.

if(a < 0.5) {
  //Executed only if a < 0.5
} else {
  //Executed otherwise
}

Loops

The number of loop iterations must be statically bounded: no i<myVar! Use a larger constant bound, then break once the counter reaches myVar. break and continue are supported.

float x = 0.0;
for(int i=0; i<100; ++i) {
  x += float(i);   //Executes 100 times; note the explicit int to float cast
}

int i = 0;
while(i < 10) {
  i = i + 1;
}

Matrices

  • mat2, mat3, mat4
  • column major order, not row major order!
//Create a 2x2 identity matrix.  Note matrix
//constructors are in column major order.
mat2 I = mat2(1.0, 0.0,
              0.0, 1.0);

//Equivalently,
mat2 I = mat2(1.0);

//Matrices can also be constructed by
//giving columns:
vec3 a = vec3(0, 1, 0);
vec3 b = vec3(2, 0, 0);
vec3 c = vec3(0, 0, 4);
mat3 J = mat3(a, b, c);

//Now, written in constructor (column major) order:
//J = mat3(0, 1, 0,
//         2, 0, 0,
//         0, 0, 4);


// Access columns with square brackets
mat3 m = mat3(1.1, 2.1, 3.1,
              1.2, 2.2, 3.2,
              1.3, 2.3, 3.3);

//Read out first column of m
vec3 a = m[0];

//Now: a = vec3(1.1, 2.1, 3.1);


// Scalar addition and vector addition
mat2 m = mat2(1, 2,
              3, 4);
mat2 w = mat2(7, 8,
              9, 10);

//Component wise addition
mat2 h = m + w;

//Now: h = mat2(8,  10,
//              12, 14)

//Scalar multiplication
mat2 j = 2.0 * m;
//Now: j = mat2(2, 4,
//              6, 8)


// Component multiplication
mat2 m = mat2(1, 2,
              3, 4);
mat2 w = mat2(7, 8,
              9, 10);

mat2 q = matrixCompMult(m, w);

//q = mat2( 7, 16,
//         27, 40)


// the * operator has the effect of multiplying matrices and transforming vectors
mat2 m = mat2(1, 2,
              3, 4);

vec2 v = m * vec2(1, 2);  //v = vec2(7, 10)

//In GLSL, switching the order of the arguments is
//equivalent to multiplying by the transpose:
vec2 u = vec2(1, 2) * m;  //u = vec2(5, 11)
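Column-major storage is the usual stumbling block here. A JavaScript sketch of GLSL's mat2 semantics, with the matrix stored as a flat column-major array (helper names are mine):

```javascript
// mat2(1, 2, 3, 4) stores columns (1,2) and (3,4), flattened column-major
const m = [1, 2, 3, 4];

// m * v: linear combination of the columns of m
function matTimesVec(m, v) {
  return [m[0] * v[0] + m[2] * v[1], m[1] * v[0] + m[3] * v[1]];
}

// v * m: dot of v with each column, i.e. transpose(m) * v
function vecTimesMat(v, m) {
  return [m[0] * v[0] + m[1] * v[1], m[2] * v[0] + m[3] * v[1]];
}

console.log(matTimesVec(m, [1, 2])); // [7, 10]
console.log(vecTimesMat([1, 2], m)); // [5, 11]
```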

Fragment Shaders

  • 1:1 relation between fragments and pixels, but ratio can be higher with AA
  • gl_FragColor aliases to gl_FragData[0] which is useful if we are only rendering to the drawing buffer
  • receives gl_FragCoord
    • xy: coordinate of the fragment in pixels, relative to the lower-left corner of the buffer
    • z: depth in [0, 1], 0 close and 1 far
    • w: reciprocal of the homogeneous part of the fragment's position in clip coordinates
  • skip rendering a fragment with discard
  • linear interpolation
void main() {
  gl_FragColor = vec4(1, 0, 0, 1);
}

Uniforms

  • specified from JS
  • broadcast to all executions of a shader
precision highp float;

uniform vec4 foo;

void main() {
  gl_FragColor = foo;
}

Textures

  • 2D array of vectors
  • declared with sampler2D type
  • accessed using texture2D() (the third bias argument is optional; GLSL has no default parameter values)
vec4 texture2D(
  sampler2D texture,  // sampler variable
  vec2 coordinate,    // where to read from the texture, in [0,1]^2
  float bias          // optional: biases the level of detail calculation
);

Draw texture to screen:

precision highp float;

uniform sampler2D image;
uniform vec2 screenSize;

void main() {
  vec2 uv = gl_FragCoord.xy / screenSize;
  gl_FragColor = texture2D(image, uv);
}

Vertex Shaders

  • control geometry
  • executed before fragment shaders
  • a vertex is one of the corners of a primitive
  • primitives are simplices of dimension < 3: points, lines, and triangles
  • primitives drawn by linearly interpolating between their vertices
  • gl_Position controls placement of vertex in clip coordinates

Output a vertex at center of screen:

void main() {
  gl_Position = vec4(0, 0, 0, 1);
}

Attributes

  • per-vertex
  • content, type, and number of attributes customizable
attribute vec2 position;

void main() {
  gl_Position = vec4(position, 0, 1);
}

Varying Variables

  • send data from vertex shader to fragment shader
  • must be float, vec2, vec3, or vec4
  • linearly interpolated across rendered primitive by default

Vertex shader:

attribute vec4 position;

//Declare a varying variable called fragPosition
varying vec4 fragPosition;

void main() {
  gl_Position = position;

  //Set fragPosition variable for the
  //fragment shader output
  fragPosition = position;
}

Fragment shader:

precision highp float;

varying vec4 fragPosition;

void main() {
  gl_FragColor = fragPosition;
}

Rotation

precision highp float;
uniform float theta;
attribute vec2 position;

void main() {
  float c = cos(theta);
  float s = sin(theta);

  mat2 rot = mat2(c, s, -s, c);

  gl_Position = vec4(rot * position, 0, 1.0);
}

Geometry

Projective Coordinates

  • use 4D position vectors so we have a homogeneous coordinate system
  • basic idea of projective geometry: replace points in a space with lines passing through the origin in a space which is one dimension higher
  • lines passing through origin can be parameterized by vectors
  • two vectors [x0, y0, z0, w0] and [x1, y1, z1, w1] identify the same line if they are related by some non-zero scalar multiple t != 0
  • Lines through the origin can be identified with points in a normal Euclidean space by intersecting them with a hyperplane that does not pass through the origin
  • in WebGL this hyperplane is taken to be the solution set of w=1

Intersect line generated by vector v=[0.2, 0.3, 0, 0.1] with hyperplane. I.e., find t such that t*v is in the w=1 hyperplane: 0.1*t=1, so t=10. Thus this line is identified with the 3D point [2, 3, 0].

  • in general, any vector [x, y, z, w] corresponds to the 3D point [x/w, y/w, z/w]
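The worked example above can be sketched directly in JavaScript (the function name is mine; note the division is floating-point, so results are only approximately [2, 3, 0]):

```javascript
// Intersect the line generated by a homogeneous 4-vector [x, y, z, w]
// with the w=1 hyperplane, yielding the 3D point [x/w, y/w, z/w]
function projectToW1(v) {
  const [x, y, z, w] = v;
  if (w === 0) throw new Error("point at infinity: never crosses w=1");
  return [x / w, y / w, z / w];
}

console.log(projectToW1([0.2, 0.3, 0, 0.1])); // ~ [2, 3, 0]
```

Vectors with w=0 never cross the hyperplane, which is exactly the "points at infinity" case described next.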

Points at Infinity

  • "extra points"
  • correspond to lines which do not pass through hyperplane
  • points where w=0
  • simplifies problem of handling degeneracies and certain extraordinary situations
  • act like idealized direction vectors instead of proper points

Clip Coordinates

  • coordinate system in which GPU interprets gl_Position

related to screen coordinates by:

screenColumn = 0.5 * screenWidth  * (gl_Position.x / gl_Position.w + 1.0)
screenRow    = 0.5 * screenHeight * (1.0 - gl_Position.y / gl_Position.w)

similar for depth of vertex:

depth = gl_Position.z / gl_Position.w
  • this rule greatly simplifies the problem of testing if a given point is visible or not
  • all of the drawable geometry is "clipped" against a viewable frustum which is defined by 6 inequalities:
-w < x < w
-w < y < w
 0 < z < w
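The clip-to-screen formulas above translate directly into JavaScript (function name is mine):

```javascript
// Map a clip-space gl_Position [x, y, z, w] to screen pixel coordinates
function clipToScreen(pos, screenWidth, screenHeight) {
  const x = pos[0] / pos[3]; // normalized device x in [-1, 1]
  const y = pos[1] / pos[3]; // normalized device y in [-1, 1]
  return {
    column: 0.5 * screenWidth * (x + 1.0),
    row: 0.5 * screenHeight * (1.0 - y), // rows grow downward
  };
}

console.log(clipToScreen([0, 0, 0, 1], 800, 600)); // { column: 400, row: 300 }
console.log(clipToScreen([1, 1, 0, 1], 800, 600)); // { column: 800, row: 0 }
```

The clip-space origin lands at the center of the screen, and the top-right clip corner (1, 1) lands at the top-right pixel.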

Transformations

  • projective coordinates simplify coordinate transformations
  • any change of reference frame in plane geometry can be encoded as a 4x4 matrix in homogeneous coordinates
  • these matrices used throughout graphics to
    • control position and orientation of camera
    • shape of viewable frustum
    • location of objects within the scene

applying a transformation encoded in a 4x4 matrix m to vector p:

vec4 transformPoint(mat4 transform, vec4 point) {
  return transform * point;
}

concatenation of transformations is equivalent to the product of their matrices:

transformPoint(A, transformPoint(B, p)) == transformPoint(A * B, p)

model-view-projection factorization

4 different coordinate systems:

  • data coordinates: coordinates of the vertices in a model
  • world coordinates: coordinates of objects in the scene
  • view coordinates: unprojected coordinates of the camera
  • clip coordinates: coordinates used by GPU to render all primitives

relationship between these coordinate systems is usually specified using 3 different transformations:

  • model: data -> world. controls location of object in world.

  • view: world -> view. controls position and orientation of camera.

  • projection: view -> device clip. controls whether view is orthographic or perspective, and camera aspect ratio

  • in theory, could pass one matrix which does data -> clip, but factoring into 3 phases can simplify various effects:

    • some lighting operations must be applied in world coordinates
    • billboarding for sprites need to be applied in view coordinate system

applying it (order is important):

precision highp float;
attribute vec3 position;
uniform mat4 model, view, projection;

void main() {
  gl_Position = projection * view * model * vec4(position, 1);
}

Translations

Translating a point v so that the point o moves to the origin:

vec3 translatePoint(vec3 v, vec3 o) {
  return v - o;
}
  • translations are not linear in affine coordinates, but they are in projective homogeneous coordinates -> they can be written as a matrix
highp mat4 translate(highp vec3 p) {
  return mat4(   1,    0,    0, 0,
                 0,    1,    0, 0,
                 0,    0,    1, 0,
              -p.x, -p.y, -p.z, 1);
}
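A JavaScript check that the column-major matrix above really implements the translation (helper names are mine; the mat4 is a flat column-major array, matching GLSL's layout):

```javascript
// Same columns as the GLSL translate(p) above, flattened column-major
function translate(p) {
  return [1, 0, 0, 0,  0, 1, 0, 0,  0, 0, 1, 0,  -p[0], -p[1], -p[2], 1];
}

// Column-major mat4 * vec4
function mat4TimesVec4(m, v) {
  const out = [0, 0, 0, 0];
  for (let row = 0; row < 4; row++)
    for (let col = 0; col < 4; col++)
      out[row] += m[col * 4 + row] * v[col];
  return out;
}

// The homogeneous point o = (2, 3, 4, 1) is carried to the origin:
console.log(mat4TimesVec4(translate([2, 3, 4]), [2, 3, 4, 1])); // [0, 0, 0, 1]
```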

Scaling

vec3 scaleVector(vec3 v, vec3 s) {
  return s * v;
}
highp mat4 scale(highp vec3 p) {
  return mat4(p.x,   0,   0, 0,
                0, p.y,   0, 0,
                0,   0, p.z, 0,
                0,   0,   0, 1);
}

Reflections

vec3 reflectPoint(vec3 p, vec3 n) {
  return p - 2.0 * dot(n, p) * n / dot(n, n);
}
highp mat4 reflection(highp vec3 n) {
  n = normalize(n);
  return mat4(1.0-2.0*n.x*n.x,    -2.0*n.y*n.x,    -2.0*n.z*n.x, 0,
                 -2.0*n.x*n.y, 1.0-2.0*n.y*n.y,    -2.0*n.z*n.y, 0,
                 -2.0*n.x*n.z,    -2.0*n.y*n.z, 1.0-2.0*n.z*n.z, 0,
                            0,               0,               0, 1);
}

Rotations

  • composition of even number of reflections
  • any 3D rotation can be represented as a rotation in a plane about some axis
  • possible because all 3D rotations can be written as the product of exactly two reflections
  • axis of rotation: common line of the plane between the two planes of reflection
  • angle of rotation: angle between two planes

Rodrigues' rotation formula, rotate a point p about the unit axis n with angle theta:

vec3 rotatePoint(vec3 p, vec3 n, float theta) {
  return (
    p * cos(theta) + cross(n, p) *
    sin(theta) + n * dot(p, n) *
    (1.0 - cos(theta))
  );
}
highp mat4 rotation(highp vec3 n, highp float theta) {
  float s = sin(theta);
  float c = cos(theta);
  float oc = 1.0 - c;

  return mat4(
    oc * n.x * n.x + c,       oc * n.x * n.y + n.z * s,   oc * n.z * n.x - n.y * s,   0.0,
    oc * n.x * n.y - n.z * s, oc * n.y * n.y + c,         oc * n.y * n.z + n.x * s,   0.0,
    oc * n.z * n.x + n.y * s, oc * n.y * n.z - n.x * s,   oc * n.z * n.z + c,         0.0,
    0.0,                      0.0,                        0.0,                        1.0
  );
}
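Rodrigues' formula is easy to verify numerically. A JavaScript sketch of the same rotatePoint (helper names are mine), checking that a quarter turn about the z axis carries the x axis to the y axis:

```javascript
const dot = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
const cross = (a, b) => [
  a[1] * b[2] - a[2] * b[1],
  a[2] * b[0] - a[0] * b[2],
  a[0] * b[1] - a[1] * b[0],
];

// Rodrigues' rotation: p about unit axis n by angle theta
function rotatePoint(p, n, theta) {
  const c = Math.cos(theta), s = Math.sin(theta);
  const cr = cross(n, p), d = dot(p, n);
  return [
    p[0] * c + cr[0] * s + n[0] * d * (1 - c),
    p[1] * c + cr[1] * s + n[1] * d * (1 - c),
    p[2] * c + cr[2] * s + n[2] * d * (1 - c),
  ];
}

console.log(rotatePoint([1, 0, 0], [0, 0, 1], Math.PI / 2)); // ~ [0, 1, 0]
```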

Lighting

Nature of Rendering Aside

  • since light waves have high frequency and travel at high speed, their wave-like nature becomes imperceptible -> physical images can be well approximated with geometric objects, replacing waves with "rays"
  • ray is a line perpendicular to the wavefront representing the amount of energy in the wave at some particular frequency, ignoring polarization
  • track amount of energy in red, green, and blue bands
  • ray tracing too slow for realtime

Flat Lighting

  • light reflected from any surface to the detector assumed to be constant
  • some parameter kA determines the colour of each fragment

Vertex shader:

precision mediump float;

attribute vec3 position;
uniform mat4 model, view, projection;
uniform vec3 ambient;

void main() {
  gl_Position = projection * view * model * vec4(position, 1);
}

Fragment shader:

precision mediump float;

uniform mat4 model, view, projection;
uniform vec3 ambient;

void main() {
  gl_FragColor = vec4(ambient,1);
}

Lambertian Diffuse Lighting

  • simple physically based model for matte surfaces (paper, paint, etc.)

  • assume surface will absorb some portion of the incoming radiation, and scatter the rest uniformly

  • basic component of more sophisticated models which often include diffuse component

  • assume the light source is far enough away that all incoming light travels in the same direction d

  • Lambert's cosine law says that the amount of diffuse light emitted from a point on the surface with a normal vector n is proportional to the following weight:

float lambertWeight(vec3 n, vec3 d) {
  return max(dot(n, d), 0.0);
}

Combined with ambient term:

vec3 reflectedLight(vec3 normal, vec3 lightDirection, vec3 ambient, vec3 diffuse) {
  float brightness = dot(normal, lightDirection);
  return ambient + diffuse * max(brightness, 0.0);
}
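The cosine-law clamp is worth seeing with concrete numbers. A JavaScript sketch of lambertWeight (helper names are mine):

```javascript
const dot3 = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];

// Lambert's cosine law, clamped so back-facing surfaces get no light
const lambertWeight = (n, d) => Math.max(dot3(n, d), 0.0);

console.log(lambertWeight([0, 0, 1], [0, 0, 1]));  // 1: surface faces the light
console.log(lambertWeight([0, 0, 1], [0, 0, -1])); // 0: surface faces away
```

The max with 0 is what keeps surfaces facing away from the light dark instead of negatively lit.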

Vertex shader:

precision mediump float;

attribute vec3 position, normal; // data coords

uniform mat4 model, view, projection;
uniform mat4 inverseModel, inverseView, inverseProjection;

varying vec3 fragNormal;

void main() {
  vec4 worldPosition = model * vec4(position, 1.0);
  //w = 0 since normal is a direction; v * M == transpose(M) * v
  vec4 viewNormal = vec4(normal, 0.0) * inverseModel * inverseView;
  fragNormal = normalize(viewNormal.xyz);

  gl_Position = projection * view * worldPosition;
}

Fragment shader:

precision mediump float;

uniform vec3 ambient, diffuse, lightDirection; // lightDirection normalized

varying vec3 fragNormal; // normalized

void main() {
  float brightness = dot(fragNormal, lightDirection);
  vec3 lightColor = ambient + diffuse * max(brightness, 0.0);

  gl_FragColor = vec4(lightColor,1);
}

Transforming Normals

  • normal vectors (unlike points) are dual vectors
  • encode planes parallel to the surface, not directions
  • parallel distance along a normal vector given by the dot product with the normal vector:
float parallelDistance(vec3 surfacePoint, vec3 surfaceNormal, vec3 p) {
  return dot(p - surfacePoint, surfaceNormal);
}
  • if we transform the surface into world coords using model, equivalent to moving the input test point by the inverse of model
  • in order for the above function to remain invariant, the surface normal must be transformed by the inverse transpose of model

Phong Lighting

  • models shiny/specular materials like metals, plastic, etc.
  • scatter light by hard reflections instead of smoothly diffusing outward
  • reflect incoming light about surface normal, project onto view axis, amount of reflected specular light assumed to be proportional to some power of this length
float phongWeight(vec3 lightDirection, vec3 surfaceNormal, vec3 eyeDirection, float shininess) {
  //First reflect light by surface normal
  vec3 rlight = reflect(lightDirection, surfaceNormal);

  //Next find the projected length of the reflected
  //light vector in the view direction
  float eyeLight = max(dot(rlight, eyeDirection), 0.0);

  //Finally exponentiate by the shininess factor
  return pow(eyeLight, shininess);
}
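A JavaScript sketch of phongWeight (helper names are mine; reflect follows GLSL's definition, reflect(I, N) = I - 2 * dot(N, I) * N):

```javascript
const dotP = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];

// GLSL's reflect built-in: I - 2 * dot(N, I) * N
function reflect(I, N) {
  const k = 2 * dotP(N, I);
  return [I[0] - k * N[0], I[1] - k * N[1], I[2] - k * N[2]];
}

function phongWeight(lightDirection, surfaceNormal, eyeDirection, shininess) {
  const rlight = reflect(lightDirection, surfaceNormal);
  const eyeLight = Math.max(dotP(rlight, eyeDirection), 0.0);
  return Math.pow(eyeLight, shininess);
}

// Light shining straight down onto a flat surface reflects straight
// back up into an eye looking straight down -> full specular weight:
console.log(phongWeight([0, 0, -1], [0, 0, 1], [0, 0, 1], 4.0)); // 1
```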

Combined with the ambient and diffuse (Lambert) terms to give the complete lighting model:

light = (ambient + diffuse*lambert + specular*phong)

Vertex shader:

precision mediump float;

attribute vec3 position, normal;

uniform mat4 model, view, projection;
uniform mat4 inverseModel, inverseView, inverseProjection;

varying vec3 fragNormal, fragPosition;

void main() {
  vec4 viewPosition = view * model * vec4(position, 1.0);
  fragPosition = viewPosition.xyz;
  vec4 viewNormal = vec4(normal, 0.0) * inverseModel * inverseView;
  fragNormal = normalize(viewNormal.xyz);

  gl_Position = projection * viewPosition;
}

Fragment shader:

precision mediump float;

uniform vec3 ambient, diffuse, specular, lightDirection;
uniform float shininess;
varying vec3 fragNormal, fragPosition;

void main() {
  // eyeDirection is same as normalized view position
  vec3 eyeDirection = normalize(fragPosition);

  float lambert = dot(fragNormal, lightDirection);

  vec3 reflectlight = reflect(lightDirection, fragNormal);
  float eyeLight = max(dot(reflectlight, eyeDirection), 0.0);
  float phong = pow(eyeLight, shininess);

  vec3 lightColor = ambient + diffuse*lambert + specular*phong;

  gl_FragColor = vec4(lightColor, 1);
}

Point Lights

  • replace direction vector with a ray extending from the point on the surface to the light source:
vec3 lightDirection = normalize(lightPosition - surfacePosition);

Vertex shader:

precision mediump float;

attribute vec3 position, normal;
uniform mat4 model,  view, projection;
uniform mat4 inverseModel, inverseView, inverseProjection;
uniform vec3 lightPosition;
varying vec3 fragNormal, fragPosition, lightDirection;

void main() {
  vec4 viewPosition = view * model * vec4(position, 1.0);
  fragPosition = viewPosition.xyz;  // normalized later, after interpolation

  vec4 viewNormal = vec4(normal, 0.0) * inverseModel * inverseView;
  fragNormal = normalize(viewNormal.xyz);

  vec4 viewLight = view * vec4(lightPosition, 1.0);
  lightDirection = normalize(viewLight.xyz - viewPosition.xyz);

  gl_Position = projection * viewPosition;
}

Fragment shader:

precision mediump float;

uniform vec3 ambient, diffuse, specular;
uniform float shininess;
varying vec3 fragNormal, fragPosition, lightDirection;

void main() {
  vec3 eyeDirection = normalize(fragPosition);

  float lambert = dot(fragNormal, lightDirection);

  vec3 reflectlight = reflect(lightDirection, fragNormal);
  float eyeLight = max(dot(reflectlight, eyeDirection), 0.0);
  float phong = pow(eyeLight, shininess);

  vec3 lightColor = ambient + diffuse*lambert + specular*phong;

  gl_FragColor = vec4(lightColor, 1);
}

Multiple Lights

  • phong lighting model can be extended to support multiple lights by summing up their individual contributions:
vec3 combinedLight(vec3 light0, vec3 light1) {
  return light0 + light1;
}
  • in context of Phong model, ambient light usually factored out, each point light given a separate diffuse and specular component

light.glsl:

//This is the light datatype
struct PointLight {
  vec3 diffuse;
  vec3 specular;
  vec3 position;
  float shininess;
};

//Export the point light data type
#pragma glslify: export(PointLight)

Vertex shader:

precision mediump float;

#pragma glslify: PointLight = require(./light.glsl)

attribute vec3 position, normal;

uniform mat4 model, view, projection;
uniform mat4 inverseModel, inverseView, inverseProjection;
uniform PointLight lights[4];

varying vec3 fragNormal, fragPosition, lightDirection[4];

void main() {
  vec4 viewPosition = view * model * vec4(position, 1.0);
  vec4 viewNormal = vec4(normal, 0.0) * inverseModel * inverseView;

  for(int i=0; i < 4; i++) {
    vec4 viewLight = view * vec4(lights[i].position, 1.0);
    lightDirection[i] = viewLight.xyz - viewPosition.xyz;
  }

  gl_Position = projection * viewPosition;
  fragNormal = viewNormal.xyz;
  fragPosition = viewPosition.xyz;
}

Fragment shader:

precision mediump float;

#pragma glslify: PointLight = require(./light.glsl)

uniform vec3 ambient;
uniform PointLight lights[4];

varying vec3 fragNormal, fragPosition, lightDirection[4];

void main() {
  vec3 eyeDirection = normalize(fragPosition);
  vec3 lightColor = ambient;

  for(int i=0; i < 4; i++) {
    vec3 light = normalize(lightDirection[i]);
    float lambert = dot(fragNormal, light);

    vec3 reflectlight = reflect(light, fragNormal);
    float eyeLight = max(dot(reflectlight, eyeDirection), 0.0);
    float phong = pow(eyeLight, lights[i].shininess);

    lightColor += lambert*lights[i].diffuse + phong*lights[i].specular;
  }

  gl_FragColor = vec4(lightColor, 1);
}

Structs

//Declare a datatype for a point light
struct PointLight {
  vec3 diffuse;
  vec3 specular;
  vec3 position;
  float shininess;
};

//Declare a single point light called light
PointLight light;

//Set the diffuse color of the light to red (1,0,0)
light.diffuse = vec3(1, 0, 0);

Arrays

//Declare an array of 10 point lights
//called "lights"
PointLight lights[10];

//Modify the first light in the array
lights[0].shininess = 100.0;

Non-Photorealistic Rendering

Cel-Shading

  • flatten the colours of an image, look more cartoony
  • start from Lambert diffuse, apply quantization to intensity values
  • for example, round our light value into 1 of 8 buckets:
float celIntensity = ceil(lightIntensity * 8.0) / 8.0;

Gooch Shading

  • good for technical illustrations, makes fine details and contours pop out

Modify lambert in 2 ways:

  1. weights range over [-1, 1] instead of being clamped to 0
  2. instead of interpolating from the diffuse light value to 0, the light color is smoothly interpolated over some other color space

weight for light colour in gooch shading:

float goochWeight(vec3 normal, vec3 lightDirection) {
  return 0.5 * (1.0 + dot(normal, lightDirection));
}

with two colors:

vec3 goochColor(vec3 cool, vec3 warm, float weight) {
  return (1.0 - weight) * cool + weight * warm;
}

GPGPU

Conway's Game of Life

precision highp float;

uniform sampler2D prevState;
uniform vec2 stateSize;

float state(vec2 coord) {
  return texture2D(prevState, fract(coord / stateSize)).r;
}

void main() {
  vec2 vUv = gl_FragCoord.xy / stateSize;
  float s = texture2D(prevState, vUv).r;
  float n = 0.0;

  for (int i = -1; i <= 1; i++)
  for (int j = -1; j <= 1; j++) {
    vec2 coord = fract(vUv + vec2(i, j) / stateSize);
    n += texture2D(prevState, coord).r;
  }

  gl_FragColor.rgb = vec3(n > 3.0+s || n < 3.0 ? 0.0 : 1.0);
  gl_FragColor.a = 1.0;
}
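The ternary at the end encodes the full Game of Life rule: here n counts the whole 3x3 neighborhood including the cell itself, so the thresholds shift by the cell's own state s. A JavaScript sketch of just that rule (function name is mine):

```javascript
// n: sum over the 3x3 neighborhood INCLUDING the cell itself
// s: the cell's current state (0 or 1)
const nextState = (n, s) => (n > 3 + s || n < 3 ? 0 : 1);

console.log(nextState(3, 0)); // dead cell, 3 live neighbors: born -> 1
console.log(nextState(3, 1)); // live cell, 2 live neighbors: survives -> 1
console.log(nextState(5, 1)); // live cell, 4 live neighbors: overcrowded -> 0
console.log(nextState(2, 0)); // dead cell, 2 live neighbors: stays dead -> 0
```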

Primitives

Point Sprites

Vertex shader:

precision highp float;

attribute vec3 position, color;
attribute float size;

uniform mat4 model, view, projection;

varying vec3 fragColor;

void main() {
  gl_Position = projection * view * model * vec4(position, 1.0);
  gl_PointSize = size;
  fragColor = color;
}

Fragment shader:

precision highp float;

varying vec3 fragColor;

void main() {
  if (distance(gl_PointCoord.st, vec2(0.5, 0.5)) > 0.5) {
    discard;
  }

  gl_FragColor = vec4(fragColor, 1);
}

Triangles

Triangle primitives in GLSL get a special property called gl_FrontFacing which tests if they are oriented towards the camera or not.

precision highp float;

uniform vec3 frontColor, backColor;

void main() {
  if (gl_FrontFacing) {
    gl_FragColor = vec4(frontColor, 1);
  } else {
    gl_FragColor = vec4(backColor, 1);
  }
}