My friends from Lateral Summer are making a short movie called The Rift.
It's a sci-fi movie and some scenes have screens with unusual user interfaces. The guys asked me to help make them happen. The most troublesome part was a stylized 3D globe. You can see it around 00:48 in the trailer.

A static image of the globe.

I followed this tutorial to make a basic monochrome 3D globe, but then had to go my own way because I needed more:
1. colored bars
2. animated bar growth
3. UI to control the animation

Colored Bars

To color the bars I used the excellent RainbowVis-JS library. You give it an array of colors and a range defined by two numbers, and then you can ask for an interpolated color mapped to any number in that range. For example, if your range is from 1 to 200 and you have five colors, it's no problem to ask for the color at 15.5.

Three.js lets you set a color for each vertex of your geometry. That gives smoothly colored bars.
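The interpolation idea can be sketched without the library itself. This is not RainbowVis-JS's actual code, just a minimal version of what "map a number in a range onto a list of colors" boils down to; the function names here are made up:

```javascript
// Linearly blend two "#rrggbb" colors; t goes from 0 (first) to 1 (second).
function lerpColor(hexA, hexB, t) {
  var a = parseInt(hexA.slice(1), 16);
  var b = parseInt(hexB.slice(1), 16);
  var r  = Math.round(((a >> 16) & 255) * (1 - t) + ((b >> 16) & 255) * t);
  var g  = Math.round(((a >> 8) & 255) * (1 - t) + ((b >> 8) & 255) * t);
  var bl = Math.round((a & 255) * (1 - t) + (b & 255) * t);
  return '#' + ('00000' + ((r << 16) | (g << 8) | bl).toString(16)).slice(-6);
}

// Map `value` from [min, max] onto the color list and blend the two
// neighbouring colors -- roughly what RainbowVis-JS does for you.
function colorAt(value, min, max, colors) {
  var pos = (value - min) / (max - min) * (colors.length - 1);
  var i = Math.min(Math.floor(pos), colors.length - 2);
  return lerpColor(colors[i], colors[i + 1], pos - i);
}

// e.g. the color at 15.5 in a 1..200 range over five colors:
colorAt(15.5, 1, 200, ['#0000ff', '#00ff00', '#ffff00', '#ff8800', '#ff0000']);
```

Each bar then gets the resulting colors assigned per vertex (in the three.js API of that era, via the face's vertex color array), so the gradient is interpolated smoothly along the bar.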

Animated Bars Growth

My 3D globe was built from quite a lot of cubes, so I had to merge them into one THREE.Geometry object to make everything run fast. This made animating the globe harder because I had to use morph targets. I had no idea how to use them, but digging into the three.js examples helped a lot, and after several hours and half of my hair pulled out I could morph my procedurally generated geometry like a boss.

Super-short tutorial on morphing

1. Build your base THREE.Geometry object.
2. Build another THREE.Geometry object with the same number of vertices.
3. Put the vertices of the second object into the morph targets of the first one:

baseObject.morphTargets.push({name:"target0", vertices: anotherObject.vertices});

Make a mesh out of the baseObject with morphTargets enabled.

total = new THREE.Mesh(baseObject, new THREE.MeshBasicMaterial({ vertexColors: THREE.VertexColors, morphTargets: true }));

Yay! You can morph now.

total.morphTargetInfluences[0] = 0.7;
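If you're wondering what that influence value actually does, here is the math behind it, sketched without three.js (this is the standard morph-target blend, not the engine's actual code): each rendered vertex is the base vertex moved toward the corresponding target vertex by the influence factor.

```javascript
// Blend one vertex between the base geometry and a morph target.
// influence 0 gives the base shape, 1 gives the target, 0.7 is 70% morphed.
function morphVertex(base, target, influence) {
  return {
    x: base.x + (target.x - base.x) * influence,
    y: base.y + (target.y - base.y) * influence,
    z: base.z + (target.z - base.z) * influence
  };
}

// A bar vertex at height 0 morphing toward height 10, at influence 0.7:
var v = morphVertex({ x: 0, y: 0, z: 0 }, { x: 0, y: 10, z: 0 }, 0.7);
```

Animating the bars is then just a matter of driving `morphTargetInfluences[0]` from 0 to 1 over time.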

UI To Control The Animation

I started with a very simple user interface using dat.GUI library.
It's a minimalistic GUI library from Google, outstanding in its simplicity. It was a nice fit, but it wasn't suitable for animation. So I tried to integrate an existing JavaScript timeline-style animation component, but it was outdated, buggy and hard to use for any real work.

Then I had an idea: what if I could control the animation from the outside through OSC (Open Sound Control)? Then it would be possible to use any of the existing OSC-compatible timelines (Duration, Vezér).

To make this happen I took node-webkit, which lets you build desktop applications on top of JavaScript, HTML5 and node.js. The funky thing is that it combines the browser and node.js environments into one powerful super weapon.

There is a ready-to-use node.js library called node-osc. It makes OSC communication simple and... doesn't work out of the box with the latest node-webkit on OS X. I had to use version 0.8.5 and cast a magic spell.

Why magic? The problem is that some modules are binary and have to be recompiled for your node-webkit runtime's architecture. There is a tool for this in the form of an npm module, so it's easy to install:

npm install nw-gyp

In my case the offender was the binpack module, and I had to cast this spell inside the module's folder:

nw-gyp rebuild --target=0.8.5

I also had some problems parsing OSC messages but the dynamic nature of node-webkit's runtime and access to the console made it simple to hack the format on the fly.
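The handling side mostly boils down to dispatching on the message's address. node-osc delivers each message as an array with the address first, then the arguments; the addresses and handlers below are made up for illustration, not the actual ones from the app:

```javascript
// Hypothetical handler table keyed by OSC address.
var handlers = {
  '/globe/influence': function (value) {
    // in the real app: total.morphTargetInfluences[0] = value;
    return value;
  },
  '/globe/record': function (on) {
    return !!on;
  }
};

// Dispatch one OSC-style message: [address, arg1, arg2, ...]
function dispatch(msg) {
  var handler = handlers[msg[0]];
  return handler ? handler.apply(null, msg.slice(1)) : undefined;
}
```

With a table like this, adding a new controllable parameter is just one more entry, and unknown addresses are silently ignored.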

In the end I had an app with some beautiful graphics in it controllable from outside. Perfect!

But that wasn't the end. When my friend tried to screengrab the animation there were a lot of frame drops.
I had to find a way to render the animation to disk.

Rendering To Disk

Thank God node-webkit lets you use both client-side JS libraries and the node.js API. Hello, fs.writeFile! It worked like a charm for a test string I plugged in, but I needed real image data grabbed from a WebGL canvas.

Saving One Frame

I tried to get an image from the canvas using canvas.toDataURL, but it didn't work.
To make it work I had to initialize the three.js renderer like this:

renderer = new THREE.WebGLRenderer({antialias:true, preserveDrawingBuffer: true});

preserveDrawingBuffer is the key.

It took me some time to figure out how to save a PNG. It wasn't obvious how to convert the data URL I could grab from the canvas into binary image data. I figured it out by googling, trying and failing until I succeeded.

// slice(22) strips the "data:image/png;base64," prefix (22 characters)
var image = renderer.domElement.toDataURL('image/png').slice(22);
fs.writeFile('aFrame.png', new Buffer(image, 'base64'));

Another problem was solved. The next one was harder. Somehow I had to record the incoming animation data and then render and save it to disk in a non-realtime manner.

Caching, Rendering and Saving

I added one more OSC parameter to start/stop animation data recording. Then I added a shortcut to start saving the frames to disk. I had to clean up my animation code because it wasn't taking into account the frequency of OSC messages, which I had to use to control the frame rate of the animation (think FPS).
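The recording side can be sketched in a few lines. These names are hypothetical, not from the actual app; the point is that since Duration sends OSC messages at a steady rate, the sample index in the recorded array doubles as the frame number during offline playback:

```javascript
var recorded = [];
var isRecording = false;

// Toggled by the extra OSC parameter mentioned above.
function setRecording(on) {
  isRecording = on;
  if (on) recorded = []; // start a fresh take
}

// Called once per incoming OSC animation message.
function onAnimationValue(value) {
  if (isRecording) recorded.push(value);
}
```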

I also had to program a completely different animation loop, or should I say a render-and-save loop, which used the recorded animation data instead of listening to incoming OSC messages. It also compensated for the delay caused by the write-to-disk operation, starting to render the next frame only when the current frame had been saved to disk.

The key to success here is requestAnimationFrame.

Remember that we already have the animation data recorded. The algorithm for rendering and saving is simple; "then" means the next step happens asynchronously as a callback from the previous operation. It's doable because requestAnimationFrame takes a callback, and so does fs.writeFile.

  1. updateAndRender currentFrame
  2. requestAnimationFrame then saveImage
  3. saveImage then requestAnimationFrame then increase currentFrame and go to 1

This doesn't include the end-of-loop condition, for the sake of simplicity and to demonstrate the core idea.
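The steps above can be sketched as a callback chain. `renderFrame` and `saveFrame` are hypothetical stand-ins for the real three.js render call and the fs.writeFile call, and `requestAnimationFrameStub` calls back immediately so the sketch runs in plain node; in the browser the real requestAnimationFrame waits for the next paint, which is exactly what makes step 2 work:

```javascript
// Stand-in for the browser's requestAnimationFrame (which defers to the
// next paint); here it just invokes the callback so the sketch is runnable.
function requestAnimationFrameStub(cb) { cb(); }

function renderAndSaveAll(frames, renderFrame, saveFrame, done) {
  var current = 0;
  (function step() {
    if (current >= frames.length) return done(); // the omitted end condition
    renderFrame(frames[current]);                // 1. updateAndRender currentFrame
    requestAnimationFrameStub(function () {      // 2. then...
      saveFrame(current, function () {           // 3. saveImage, then...
        current += 1;
        requestAnimationFrameStub(step);         // ...increase currentFrame, go to 1
      });
    });
  })();
}
```

Because each frame's save callback triggers the next render, the disk write can take as long as it likes without dropping frames.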

Overall I had a lot of problems, but the result was worth it. I learned a lot and worked out a general approach to rendering canvas-based motion graphics to disk.

You can see the code at https://github.com/Nek/Rift/blob/master/js/app/main.js

If you're on a Mac you can also try it in action but it's a multistep process:
1. download and unzip node-webkit v0.8.5
http://dl.node-webkit.org/v0.8.5/node-webkit-v0.8.5-osx-ia32.zip
2. download packaged Rift Globe https://www.dropbox.com/s/wnogs6wby09ykz3/RiftGlobe.nw
3. download Duration timeline sequencer http://www.duration.cc/
4. download and unzip a preset for the sequencer https://www.dropbox.com/s/gzniwwtwfxnexhk/RiftCurves.zip
5. run Duration and open the preset which can be done by clicking on the text in the upper left corner of Duration's window
6. run Rift Globe on node-webkit (by double clicking on it)
7. hit play inside Duration and observe node-webkit's window

All praise, insults, questions and criticism are welcome!