Matt DesLauriers

creative developer


Leaf Notes – An Interactive Web Toy



I recently launched a small interactive web toy for Tendril, a Toronto-based design and animation studio. You can try it out on their home page. Their site rotates through different web toys, so you may need to reload once or twice to see it.

The experience is simple: brush your mouse across the generative plants to see them blossom and emit musical tones.

This was a really fun project to work on, and I’m very pleased with the outcome. It’s been great to watch the reactions on Twitter and Instagram, including the heartwarming reaction of a four-year-old using the experience on a tablet.

In this post, I’ll explore how I created the web toy alongside the amazing team at Tendril, and discuss some of the technical challenges faced along the way.


For a while now, Tendril has been showcasing different interactive...

Continue reading →

Pen Plotter Art & Algorithms, Part 2

— This post is a continuation of Pen Plotter Art & Algorithms, Part 1.


— Patchwork, printed with AxiDraw, December 2017

In our previous post, we learned to develop some basic prints with penplot, an experimental tool I’m building for my own pen plotter artwork.

In this post, let’s aim for something more challenging, and attempt to develop an algorithm from the ground up. I’m calling this algorithm “Patchwork,” although I won’t claim to have invented it. I’m sure many before me have discovered the same algorithm.

You can find more discussion and images in this Twitter thread, where I first posted about it.

The algorithm we will try to implement works like so:

  1. Start with a set of N initial points.
  2. Select a cluster of points and draw the convex hull that surrounds all of them.
  3. Remove the points contained by the convex hull from our data set.
  4. Repeat the process from step 2.
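The loop above can be sketched in plain JavaScript. This is a minimal interpretation of the steps, not the actual penplot code: the convex hull here comes from Andrew's monotone chain algorithm, and a "cluster" is approximated as the k nearest points to a randomly chosen seed.

```javascript
// Sketch of the "Patchwork" loop; helper names are hypothetical.

function convexHull (points) {
  // Andrew's monotone chain; returns the hull in counter-clockwise order
  var pts = points.slice().sort(function (a, b) {
    return a[0] - b[0] || a[1] - b[1];
  });
  var cross = function (o, a, b) {
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0]);
  };
  var build = function (list) {
    var hull = [];
    list.forEach(function (p) {
      // pop while the last two hull points and p make a non-left turn
      while (hull.length >= 2 && cross(hull[hull.length - 2], hull[hull.length - 1], p) <= 0) {
        hull.pop();
      }
      hull.push(p);
    });
    hull.pop(); // the endpoint is repeated in the other half
    return hull;
  };
  // lower half plus upper half (built from the reversed list)
  return build(pts).concat(build(pts.slice().reverse()));
}

function dist2 (a, b) {
  var dx = a[0] - b[0], dy = a[1] - b[1];
  return dx * dx + dy * dy;
}

function patchwork (points, clusterSize) {
  var polygons = [];
  var remaining = points.slice();
  while (remaining.length > clusterSize) {
    // 2. pick a seed, cluster its k nearest neighbours, and hull them
    var seed = remaining[Math.floor(Math.random() * remaining.length)];
    var cluster = remaining.slice().sort(function (a, b) {
      return dist2(a, seed) - dist2(b, seed);
    }).slice(0, clusterSize);
    polygons.push(convexHull(cluster));
    // 3. remove the clustered points from the data set
    remaining = remaining.filter(function (p) {
      return cluster.indexOf(p) === -1;
    });
  }
  return polygons;
}
```

Rendering is then just a matter of stroking (or hatching) each polygon this returns.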


Continue reading →

Pen Plotter Art & Algorithms, Part 1

— You can find the source code for this blog series here.


Over the last several months, I’ve been looking for ways to produce physical outputs from my generative code. I’m interested in the idea of developing real, tangible objects that are no longer bound by the generative systems that shaped them. Eventually I plan to experiment with 3D printing, laser cutting, CNC milling, and other ways of realizing my algorithms in the real world.

My interest in this began in March 2017, when I purchased my first pen plotter: the AxiDraw V3 by Evil Mad Scientist Laboratories. It’s a fantastic machine, and has opened a whole new world of thinking for me. For those unaware, a pen plotter is a piece of hardware that acts like a robotic arm on which you can attach a regular pen. Software sends commands to the device to raise, reposition, and lower its arm across a 2D surface. With this, the plotter...

Continue reading →

Shaping Curves with Parametric Equations


This post explores a technique to render volumetric curves on the GPU — ideal for shapes like ribbons, tubes, and rope. The curves are defined by a parametric equation in the vertex shader, allowing us to animate hundreds or even thousands of curves with minimal overhead.

Parametric curves aren’t a novel idea in WebGL; ThreeJS already supports something called ExtrudeGeometry. You can read about some of its implementation details here. This class can be used to extrude a 3D curve or path into a volumetric line, like a 3D tube. However, since the code runs on the CPU and generates a new geometry, it isn’t well suited for animating the curve every frame, let alone several hundred curves.
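To make the parametric idea concrete: each vertex only needs a scalar parameter t in [0, 1], and the equation maps t to a 3D position. On the GPU, the vertex shader evaluates that equation per vertex; the CPU sketch below shows the same idea, using a hypothetical helix as the equation rather than any curve from the demo.

```javascript
// Evaluate a parametric equation at N subdivisions along a curve.
// In the GPU version, each vertex carries its own t and the equation
// lives in the vertex shader; this helix is just a stand-in.

function helix (t) {
  var angle = t * Math.PI * 2 * 4; // four full twists
  return [
    Math.cos(angle), // x on the unit circle
    t * 2 - 1,       // y walks from -1 to 1
    Math.sin(angle)  // z on the unit circle
  ];
}

function sampleCurve (equation, subdivisions) {
  var points = [];
  for (var i = 0; i < subdivisions; i++) {
    var t = i / (subdivisions - 1); // t in [0, 1]
    points.push(equation(t));
  }
  return points;
}
```

Animating the curve then only means changing the equation's inputs (e.g. a time uniform), with no geometry rebuilt on the CPU.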

Instead, let’s see what we can accomplish with just a vertex shader. The technique presented here has various downsides and isn’t very robust, but it can look great in certain cases and tends to be...

Continue reading →


Audiograph is a small project I built in my free time over a two-week period in April. It’s a music visualizer for the 2016 album TRANS by Pilotpriest, rendering in real-time with WebGL and WebAudio.

In this post I’ll explain the process and some discoveries along the way.


After learning about Dolby’s Web Audio challenge at FITC Toronto, I was motivated to build a captivating audio-visual experience with WebGL. With only two weeks left in the challenge, I knew it had to be a small and focused project.

I’ve always been a huge fan of Beeple and his Everydays series. The vivid colours, bloom, film grain, and other lens effects give his work a lot of texture, and I’ve often wondered how to emulate the aesthetic in WebGL.

Select works by Beeple


At its core, Audiograph is really just some simple geometry moving toward the camera. As with one of my...

Continue reading →

Generative Art with Node.js and Canvas

This post explores a small weekend project that combines Node.js and HTML5 Canvas to create high-resolution generative artwork.

In the browser, the artwork renders in real-time. Tap the canvas below to randomize the seed.

Click here to open the demo in a new tab.

In Node.js, the same rendering code uses node-canvas to output high-resolution PNGs or MP4 videos.

Node Canvas

The node-canvas API is mostly compatible with the HTML5 Canvas, so the backend code may be familiar to some frontend developers. We have two entry points – browser and node – but both require() a module that is engine-agnostic, and simply operates on the Canvas API.

For example, to draw a red circle in Node and the browser:

module.exports = function (context) {
    // get the Node or Browser canvas
    var canvas = context.canvas;
    var x = canvas.width / 2;
    var y = canvas.height / 2;
    context.fillStyle = 'red';
    context.beginPath();
    context.arc(x, y, canvas.width / 4, 0, Math.PI * 2);
    context.fill();
};

Continue reading →

Debugging Node.js in Chrome DevTools

This post introduces a novel approach to developing, debugging, and profiling Node.js applications within Chrome DevTools.


Recently I’ve been working on a command-line tool, devtool, which runs Node.js programs inside Chrome DevTools.

The recording below shows setting breakpoints within an HTTP server.


This tool builds on Electron to blend Node.js and Chromium features. It aims to provide a simple interface for debugging, profiling, and developing Node.js applications.

You can install it with npm:

npm install -g devtool


In some ways, we can use it as a replacement for the node shell command. For example, we can open a REPL like so:

devtool

This will launch a new Chrome DevTools instance with Node.js support:


We can require Node modules, local npm modules, and built-ins like process.cwd(). We also have access to Chrome DevTools functions like copy() and table(...

Continue reading →

30 days, 30 demos

This year I decided to try #codevember, a challenge to write a creative experiment for every day of November.

This post is a follow-up (and brain-dump) exploring some of the daily demos and lessons learned.


You can see all the experiments here:


The #codevember challenge calls for CodePen submissions, but I find prototyping much faster with a dedicated development server (budo) and a wide selection of node modules at my fingertips.

After setting up a build/dev script, I was able to iterate quickly with babel for ES2015 and installify to auto-install npm modules as I coded. To inline GLSL snippets, I used glslify.
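For reference, that setup boils down to a short package.json script. The entry filename here is hypothetical, but a minimal version might look something like this, with the browserify transforms passed to budo after the `--` separator:

```json
{
  "scripts": {
    "start": "budo index.js --live -- -t babelify -t installify -t glslify"
  }
}
```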

In the end, the project racked up over 160 direct dependencies. If nothing else, it is a testament to the ease of rapid prototyping with npm.


I tried to iterate on a few topics over the 30 day period. These features seemed...

Continue reading →

Some JavaScript Sketches

It’s been a while since a blog post, so here’s a look at some small sketches I’ve developed in the last six months.

Most of them use WebGL and/or WebAudio, and are intended to be viewed on desktop Chrome or Firefox. Not all work on mobile. Each explores a single idea, delivered as a sort of “animated real-time artwork.”



desktop only

(demo) - (source)

This is a small audio-reactive sketch that uses soundcloud-badge by Hugh Kennedy. The original idea was to mimic the flow of black ink on paper, but it quickly diverged as I became more interested in projecting a video texture onto a swarm of 3D particles.

The trick here is just to take the projected (clip-space) position of the 3D particle and divide by its w component to get its position in normalized screen space. We can then use this as our uv coordinates into the video texture.

// vertex shader
varying vec2 vUv;
void main() {
  gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0); // classic MVP vertex position
  vUv = gl_Position.xy / gl_Position.w * 0.5 + 0.5; // perspective divide, remapped from [-1, 1] to [0, 1]
}

Continue reading →

Material Design on the GPU

One of the things I like about Material Design is that it builds on principles we see in the real world. Depth is used to convey information, light is used to indicate seams, and drop shadows follow convincing behaviours.

Material design is inspired by tactile materials, such as paper and ink. […] Surfaces can have elevation (z-height) and cast shadows on other surfaces to convey relationships.

- Polymer

What if we take these ideas to the extreme, and treat this as a graphics programming exercise?

The GPU could be used to simulate shading and reflections, new algorithms could be conceived that operate more effectively in a 2D domain, and surfaces could react to user interactions in a more tactile and realistic manner.

There is a lot of potential in this idea. In this article, I will focus specifically on rendering text, or “paper and ink.”

Text in the Real World



Continue reading →