TITLE:   WebGL Experiments

CATEGORY:   WebGL, Creative Coding

DATE:   December, 2015

TOOLKITS:   JavaScript, WebGL, GLSL, Three.js, HTML5

COLLABORATORS:   N/A

DESCRIPTION:
In December of 2015 I took a class in NYU's Computer Science department taught by the renowned Ken Perlin.  Perlin is best known for developing the Perlin noise algorithm, a technique for generating procedural textures that grew out of his work on the original Tron film and later earned him an Academy Award for Technical Achievement.  The class served as an introduction to WebGL and to producing hardware-accelerated computer graphics for the web.  We began by writing and implementing our own vertex and fragment shaders in GLSL, before moving on to creating our own WebGL JavaScript library and workflow.

URL:   The experiments are all online and can be found HERE

CODE:   GitHub Repo


FRAGMENT SHADERS:

Starting with fragment shaders is a good introduction to the power of leveraging your computer's GPU to render graphics to the HTML5 canvas.  Because fragment shader code runs for every pixel on the canvas, it's possible to create surprisingly complex visuals and animations from very simple GLSL code.  The experiments presented below all use only four vertices, drawn as two triangles that cover the entire canvas.  Click on any of the images to go to the live version.
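
The sketch below shows roughly how that shared setup looks (the canvas id, uniform names, and colors are illustrative, not the actual experiment code): four vertices drawn as a triangle strip cover the whole canvas, and a fragment shader does all of the visual work once per pixel.

// A minimal sketch, assuming a <canvas id="canvas"> element exists on the page.
// Error checking is omitted for brevity.
const gl = document.getElementById('canvas').getContext('webgl');

const vertexSrc = `
  attribute vec2 aPosition;
  void main() {
    gl_Position = vec4(aPosition, 0.0, 1.0);
  }`;

const fragmentSrc = `
  precision mediump float;
  uniform vec2 uResolution;
  uniform float uTime;
  void main() {
    vec2 uv = gl_FragCoord.xy / uResolution;        // normalized pixel coordinate
    gl_FragColor = vec4(uv, 0.5 + 0.5 * sin(uTime), 1.0);
  }`;

function compile(type, src) {
  const shader = gl.createShader(type);
  gl.shaderSource(shader, src);
  gl.compileShader(shader);
  return shader;
}

const program = gl.createProgram();
gl.attachShader(program, compile(gl.VERTEX_SHADER, vertexSrc));
gl.attachShader(program, compile(gl.FRAGMENT_SHADER, fragmentSrc));
gl.linkProgram(program);
gl.useProgram(program);

// Four vertices forming two triangles (a strip) that span clip space from -1 to 1.
const quad = new Float32Array([-1, -1, 1, -1, -1, 1, 1, 1]);
gl.bindBuffer(gl.ARRAY_BUFFER, gl.createBuffer());
gl.bufferData(gl.ARRAY_BUFFER, quad, gl.STATIC_DRAW);

const aPosition = gl.getAttribLocation(program, 'aPosition');
gl.enableVertexAttribArray(aPosition);
gl.vertexAttribPointer(aPosition, 2, gl.FLOAT, false, 0, 0);

const uResolution = gl.getUniformLocation(program, 'uResolution');
const uTime = gl.getUniformLocation(program, 'uTime');

function draw(timeMs) {
  gl.uniform2f(uResolution, gl.canvas.width, gl.canvas.height);
  gl.uniform1f(uTime, timeMs * 0.001);
  gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);           // 4 vertices, 2 triangles
  requestAnimationFrame(draw);
}
requestAnimationFrame(draw);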


RAYTRACING - DIFFUSE SHADING, SHADOWS, REFLECTIONS, AND PROCEDURAL TEXTURES:

Raytracing works by "shooting" a ray from the position of the viewer, often called the eyepoint, into the scene.  We define geometry, for example a sphere, inside the fragment shader using a GLSL vec4(x, y, z, w) that packs the sphere's center in x, y, and z and its radius in w.  Again, because the fragment shader executes for every pixel in the canvas, we compute for each pixel whether or not its ray intersects the object.  If it does, we can find the surface point on the sphere for that pixel, and from it the surface normal at that point.  The surface normal tells us at what angle the light strikes the surface, which lets us add diffuse shading, shadows, and reflections, and create procedural surface textures.  Click on any of the images to go to the live version.
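
As a minimal illustration of the idea (one sphere, one light, diffuse shading only; not the class's exact code), a fragment shader along these lines intersects each pixel's eye ray with a sphere packed into a vec4 and shades the hit point using its surface normal.

// GLSL fragment shader sketch: single sphere, single directional light.
precision mediump float;
uniform vec2 uResolution;

const vec4 sphere = vec4(0.0, 0.0, -4.0, 1.0);    // center in xyz, radius in w
const vec3 lightDir = vec3(0.577, 0.577, 0.577);  // normalized light direction

// Distance along the ray to the nearest hit, or -1.0 for a miss.
float raySphere(vec3 origin, vec3 dir, vec4 s) {
  vec3 oc = origin - s.xyz;
  float b = dot(oc, dir);
  float c = dot(oc, oc) - s.w * s.w;
  float disc = b * b - c;
  if (disc < 0.0) return -1.0;
  return -b - sqrt(disc);
}

void main() {
  // Map this pixel to a point on the image plane; the ray starts at the eyepoint.
  vec2 uv = (2.0 * gl_FragCoord.xy - uResolution) / uResolution.y;
  vec3 eye = vec3(0.0, 0.0, 0.0);
  vec3 dir = normalize(vec3(uv, -1.0));

  vec3 color = vec3(0.1);                          // background
  float t = raySphere(eye, dir, sphere);
  if (t > 0.0) {
    vec3 point = eye + t * dir;                    // surface point for this pixel
    vec3 normal = normalize(point - sphere.xyz);   // surface normal at that point
    float diffuse = max(0.0, dot(normal, lightDir));
    color = vec3(1.0, 0.4, 0.2) * diffuse;         // Lambertian (diffuse) shading
  }
  gl_FragColor = vec4(color, 1.0);
}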


MATRICES - TRANSFORMATIONS, SPLINES, AND PARAMETRIC GEOMETRY:

OpenGL, the API on which WebGL is based, uses matrix math to perform transformations.  WebGL works in homogeneous coordinates, so all of the coordinate transformation information (translation, rotation, scaling, and perspective) can be placed in a single 4x4 matrix.  Libraries like Three.js have built-in functions for these transformations, but in order to understand the math behind them, we wrote our own set of functions.  We used them to transform vertices in the coordinate space, to build a custom Hermite spline editor, and to generate parametric geometry using an additional UV coordinate system.  Click on any of the images to go to the live version.
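
A rough sketch of what such helper functions can look like (the function names here are illustrative, not the library we actually wrote): 4x4 matrices stored as flat, column-major arrays, the layout WebGL's uniformMatrix4fv expects.

// Translation: the offsets live in the last column of the homogeneous matrix.
function translation(tx, ty, tz) {
  return [1, 0, 0, 0,   0, 1, 0, 0,   0, 0, 1, 0,   tx, ty, tz, 1];
}

// Rotation about the z axis by theta radians.
function rotationZ(theta) {
  const c = Math.cos(theta), s = Math.sin(theta);
  return [c, s, 0, 0,   -s, c, 0, 0,   0, 0, 1, 0,   0, 0, 0, 1];
}

// C = A * B, so applying C is the same as applying B first, then A.
function multiply(a, b) {
  const out = new Array(16).fill(0);
  for (let col = 0; col < 4; col++) {
    for (let row = 0; row < 4; row++) {
      for (let k = 0; k < 4; k++) {
        out[col * 4 + row] += a[k * 4 + row] * b[col * 4 + k];
      }
    }
  }
  return out;
}

// Transform a point [x, y, z], treated as the homogeneous vector [x, y, z, 1].
function transformPoint(m, p) {
  const [x, y, z] = p;
  return [
    m[0] * x + m[4] * y + m[8]  * z + m[12],
    m[1] * x + m[5] * y + m[9]  * z + m[13],
    m[2] * x + m[6] * y + m[10] * z + m[14],
  ];
}

// Example: rotate a vertex 90 degrees about z, then move it up one unit.
const m = multiply(translation(0, 1, 0), rotationZ(Math.PI / 2));
console.log(transformPoint(m, [1, 0, 0]));   // approximately [0, 2, 0]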


CUSTOM SHADERS WITH THREE.JS:

Three.js is a powerful 3D graphics library for the web built on WebGL.  The library ships with plenty of built-in features and standard material shaders, but it is flexible enough to let you write your own.  The image shown below is an experiment that uses standard Three.js geometry, with a custom vertex shader that applies Perlin noise to the vertex positions and a custom fragment shader that produces a procedural texture.  Click on the image to view the experiment live.
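
The sketch below shows the general wiring of a custom ShaderMaterial in Three.js.  For brevity it displaces vertices with layered sine waves rather than real Perlin noise, and the uniform, varying, and geometry choices are illustrative rather than taken from the experiment.

import * as THREE from 'three';

const vertexShader = `
  uniform float uTime;
  varying vec3 vPosition;
  void main() {
    // Push each vertex out along its normal by a pseudo-noise amount.
    float n = sin(position.x * 4.0 + uTime) * sin(position.y * 4.0 + uTime) * 0.15;
    vec3 displaced = position + normal * n;
    vPosition = displaced;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(displaced, 1.0);
  }`;

const fragmentShader = `
  varying vec3 vPosition;
  void main() {
    // Procedural stripes driven by the displaced position.
    float stripes = 0.5 + 0.5 * sin(vPosition.y * 20.0);
    gl_FragColor = vec4(vec3(0.2, 0.6, 1.0) * stripes, 1.0);
  }`;

const material = new THREE.ShaderMaterial({
  uniforms: { uTime: { value: 0 } },
  vertexShader,
  fragmentShader,
});

// Standard Three.js geometry; the custom shaders do all of the shading.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(45, window.innerWidth / window.innerHeight, 0.1, 100);
camera.position.z = 4;

const mesh = new THREE.Mesh(new THREE.SphereGeometry(1, 64, 64), material);
scene.add(mesh);

const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

function animate(timeMs) {
  material.uniforms.uTime.value = timeMs * 0.001;
  renderer.render(scene, camera);
  requestAnimationFrame(animate);
}
requestAnimationFrame(animate);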