Matt DesLauriers

creative developer


Shaping Curves with Parametric Equations


This post explores a technique to render volumetric curves on the GPU — ideal for shapes like ribbons, tubes and rope. The curves are defined by a parametric equation in the vertex shader, allowing us to animate hundreds and even thousands of curves with minimal overhead.

Parametric curves aren’t a novel idea in WebGL; ThreeJS already supports something called ExtrudeGeometry. You can read about some of its implementation details here. This class can be used to extrude a 3D curve or path into a volumetric line, like a 3D tube. However, since the code runs on the CPU and generates a new geometry, it isn’t well suited for animating the curve every frame, let alone several hundred curves.
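To make the comparison concrete, here is a minimal sketch — hypothetical names, plain JavaScript, not code from this post — of evaluating a parametric curve on the CPU, the kind of function you might hand to ExtrudeGeometry:

```javascript
// Evaluate a parametric helix at t in [0, 1] — an example curve.
function helix (t) {
  var angle = t * Math.PI * 2 * 4; // four full twists
  return [
    Math.cos(angle), // x
    Math.sin(angle), // y
    t * 2 - 1        // z, sweeping from -1 to 1
  ];
}

// Sample n points along the curve. Done on the CPU, this means
// rebuilding a geometry every frame — the overhead we want to avoid.
function sampleCurve (curve, n) {
  var points = [];
  for (var i = 0; i < n; i++) {
    points.push(curve(i / (n - 1)));
  }
  return points;
}
```

Moving the evaluation into the vertex shader means only the uniforms (e.g. time) change per frame, while the geometry itself stays static on the GPU.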

Instead, let’s see what we can accomplish with just a vertex shader. The technique presented here has various downsides and isn’t very robust, but it can look great in certain cases and tends to be

Continue reading →


Audiograph is a small project I built in my free time over a two-week period in April. It’s a music visualizer for the 2016 album TRANS by Pilotpriest, rendering in real-time with WebGL and WebAudio.

In this post I’ll explain the process and some discoveries along the way.


After learning about Dolby’s Web Audio challenge at FITC Toronto, I was motivated to build a captivating audio-visual experience with WebGL. With only two weeks left in the challenge, I knew it had to be a small and focused project.

I’ve always been a huge fan of Beeple and his Everydays series. The vivid colours, bloom, film grain, and other lens effects give his work a lot of texture, and I’ve often wondered how to emulate the aesthetic in WebGL.

Select works by Beeple
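The film grain in that aesthetic is often approximated with a cheap hash noise in a fragment shader. Here is a rough sketch of the classic GLSL one-liner, ported to JavaScript purely for illustration — the magic constants are the well-known "sin hash" numbers, not values from this post:

```javascript
// fract(x) as in GLSL: the fractional part of x
function fract (x) {
  return x - Math.floor(x);
}

// Pseudo-random grain value in [0, 1) for a given uv coordinate,
// mirroring the common GLSL one-liner:
//   fract(sin(dot(uv, vec2(12.9898, 78.233))) * 43758.5453)
function grain (u, v) {
  return fract(Math.sin(u * 12.9898 + v * 78.233) * 43758.5453);
}
```

In a shader this value would be blended over the final colour, typically offset by time so the grain animates frame to frame.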


At its core, Audiograph is really just some simple geometry moving toward the camera. As with one of my

Continue reading →

Generative Art with Node.js and Canvas

This post explores a small weekend project that combines Node.js and HTML5 Canvas to create high-resolution generative artwork.

In the browser, the artwork renders in real-time. Tap the canvas below to randomize the seed.

Click here to open the demo in a new tab.

In Node.js, the same rendering code uses node-canvas to output high-resolution PNGs or MP4 videos.

Node Canvas

The node-canvas API is mostly compatible with the HTML5 Canvas API, so the backend code may be familiar to some frontend developers. We have two entry points – browser and Node – but both require() a module that is engine-agnostic and simply operates on the Canvas API.

For example, to draw a red circle in Node and the browser:

module.exports = function (context) {
    // get the Node or Browser canvas
    var canvas = context.canvas;
    var width = canvas.width / 2;
    var height = canvas.height / 2;
    context.beginPath();
    context.arc(width, height, 50, 0, Math.PI * 2);
    context.fillStyle = 'red';
    context.fill();
};

Continue reading →

Debugging Node.js in Chrome DevTools

This post introduces a novel approach to developing, debugging, and profiling Node.js applications within Chrome DevTools.


Recently I’ve been working on a command-line tool, devtool, which runs Node.js programs inside Chrome DevTools.

The recording below shows setting breakpoints within an HTTP server.


This tool builds on Electron to blend Node.js and Chromium features. It aims to provide a simple interface for debugging, profiling, and developing Node.js applications.

You can install it with npm:

npm install -g devtool


In some ways, we can use it as a replacement for the node shell command. For example, running devtool with no arguments opens a REPL:


This will launch a new Chrome DevTools instance with Node.js support:


We can require Node modules, local npm modules, and built-ins like process.cwd(). We also have access to Chrome DevTools functions like copy() and table(

Continue reading →

30 days, 30 demos

This year I decided to try #codevember, a challenge to write a creative experiment for every day of November.

This post is a follow-up (and brain-dump) exploring some of the daily demos and lessons learned.


You can see all the experiments here:


The #codevember challenge calls for CodePen submissions, but I find prototyping much faster with a dedicated development server (budo) and a wide selection of node modules at my fingertips.

After setting up a build/dev script, I was able to iterate quickly with babel for ES2015 and installify to auto-install npm modules as I coded. To inline GLSL snippets, I used glslify.
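That setup might look roughly like this in package.json — a hypothetical sketch, since the post doesn’t show its actual scripts (budo passes browserify transforms after the `--` separator):

```json
{
  "scripts": {
    "start": "budo index.js --live -- -t babelify -t installify -t glslify"
  }
}
```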

In the end, the project racked up over 160 direct dependencies. If nothing else, it is a testament to the ease of rapid prototyping with npm.


I tried to iterate on a few topics over the 30-day period. These features seemed

Continue reading →

Some JavaScript Sketches

It’s been a while since my last blog post, so here’s a look at some small sketches I’ve developed in the last six months.

Most of them use WebGL and/or WebAudio, and are intended to be viewed on desktop Chrome or Firefox. Not all work on mobile. Each explores a single idea, delivered as a sort of “animated real-time artwork.”



desktop only

(demo) - (source)

This was a small audio-reactive sketch that uses soundcloud-badge by Hugh Kennedy. The original idea was to mimic the flow of black ink on paper, but it quickly diverged as I became more interested in projecting a video texture onto a swarm of 3D particles.

The trick here is just to take the projected position of the 3D particle and divide by its w component to get a position in screen space. We can then use that as the uv coordinates into the video texture.

// vertex shader
void main() {
  // classic MVP vertex position
  gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
  vUv = gl_Position.xy / gl_Position.w * 0.5 + 0.5; // divide by w, remap to [0, 1] uv
}

Continue reading →

Material Design on the GPU

One of the things I like about Material Design is that it builds on principles we see in the real world. Depth is used to convey information, light is used to indicate seams, and drop shadows follow convincing behaviours.

Material design is inspired by tactile materials, such as paper and ink. […] Surfaces can have elevation (z-height) and cast shadows on other surfaces to convey relationships.

- Polymer

What if we take these ideas to the extreme, and treat this as a graphics programming exercise?

The GPU could be used to simulate shading and reflections, new algorithms could be conceived that operate more effectively in a 2D domain, and surfaces could react to user interactions in a more tactile and realistic manner.

There is lots of potential in this idea. In this article, I will focus specifically on rendering text, or “paper and ink.”

Text in the Real World



Continue reading →

Drawing Lines is Hard

Twitter: @mattdesl

Drawing lines might not sound like rocket science, but it’s damn difficult to do well in OpenGL, particularly WebGL. Here I explore a few different techniques for 2D and 3D line rendering, and accompany each with a small canvas demo.

Source for demos can be found here:

Line Primitives

WebGL includes support for lines with gl.LINES, gl.LINE_STRIP, and gl.LINE_LOOP. Sounds great, right? Not really. Here are just a few issues with that:

  • Drivers may implement the rendering/filtering slightly differently, and you may not get a consistent render across devices or browsers
  • The maximum line width is driver-dependent. Users running ANGLE, for example, will get a maximum of 1.0, which is pretty useless. On my new Yosemite machine, line width maxes out at about 10.
  • No control over line join or end cap styles
  • MSAA is not
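The driver-dependent width limit is easy to check at runtime. A small sketch, assuming a WebGL context gl already exists:

```javascript
// Returns the widest line the driver will rasterize with gl.LINES.
// Under ANGLE this is typically just 1.0.
function maxLineWidth (gl) {
  var range = gl.getParameter(gl.ALIASED_LINE_WIDTH_RANGE);
  return range[1]; // range is [min, max]
}
```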

Continue reading →

Rapid Prototyping in JavaScript

Twitter: @mattdesl

This is a brief introduction to my current workflow for small, self-contained browser demos and prototypes.

screen cast of a typical prototyping session

Many modules are easy enough to unit test and develop entirely in the console (e.g. with nodemon and tape). This post is not about those modules, but instead, about the ones that are harder to unit test, and harder to visualize in a console alone. Some examples:

  • google-panorama-equirectangular - stitching equirectangular panoramas from Google Street View
  • touch-scroll-physics - integration for bouncing scroll panels and grids
  • glsl-film-grain - a film grain shader
  • perspective-camera - generic 3D perspective camera utilities
  • word-wrapper - word wrapping for custom 2D glyph rendering
  • three-bmfont-text - high quality text rendering in ThreeJS


The aim of this workflow is to

Continue reading →

Motion Graphics for the Web

Lately I’ve been thinking a lot about tooling for the web. Mostly, about how much it sucks, and how far away we are from the aesthetic of other industries, like games, film, and offline motion graphics in general.


screenshot of “Where Things Come From”

Shorts and reels like “Where Things Come From”, we think things’ 2013 Reel, and “Designed by Apple” demonstrate a visual fidelity that seems nearly impossible to replicate with interactive web content. Yet the technology is all there: CSS for common elements and typography, SVG for shapes and strokes, and Canvas/WebGL for more advanced effects.

Why do rich websites and online experiences so often fall short of this aesthetic? There are a few reasons, but I’d say the strongest is that we lack time-based animation tools for HTML5 content. Without these tools, animations are left to developers to implement. And, let’s be

Continue reading →