Last year (2017) I launched Space Rocks, a WebXR demo that re-imagines Atari’s Asteroids as an immersive, Web-based virtual reality experience. Alongside the site’s launch I also released the source code I used to create it. This technical deep dive explains how I built this WebXR game—and how you can make one too.
I wrote Space Rocks to be easy to tinker with right from the browser’s JavaScript console: No closures concealing an entire code base—inspect and inject as you please. I wrote Space Rocks to be read: No pre-processors. No transpilers. No minifiers, no uglifiers… The code is legible right from your browser’s View Source command. I wrote Space Rocks to make sense: Lots of code comments to explain what I’m aiming for, reasonably named variables, aligned equal signs, and even ASCII diagrams.
You can load up the live site itself at https://spacerocks.moar.io or download the source code from https://github.com/moar-tech/spacerocks. Follow along with the descriptions here and don’t hesitate to jump in with your questions or comments both below and on GitHub.
Atari’s Asteroids
I was too young to experience Atari in its prime. I first encountered their 1979 classic, Asteroids, at a summer camp in the early 1990s — an old arcade cabinet sat in the corner of a community room. I was immediately hooked: Those bright laser vector graphics that left ghostly light trails against the glass. The minimal, one-two heartbeat soundtrack increasing in tempo as the action intensified. The infinite motion of objects in frictionless outer space. (And the related skill of piloting a careening ship back to a controlled near-stillness.) A quarter century later, Space Rocks is my very simple ode to Asteroids.
There’s a lot to translate, of course. Asteroids was a single-color, 2D vector game. How do you stay true to the original when the medium is now a full-color, stereoscopic 3D presentation? I was tempted to go for glowing wire-frame visuals but that felt a bit like using a distressed typewriter font to fake old-school cut-and-paste zines — all surface and no backbone. Does Space Rocks have conceptual backbone? Well, that’s an entirely different discussion. What I do know is this: With some JavaScript know-how, a little Three.js knowledge, and access to 6DoF hardware, you too can have fun building simple WebXR games.
It’s not the first time I’ve been inspired by Atari. My fascination with Pong has been well articulated in the forms of Browser Pong (2009) and later as Roll It (2013)—which began as a Nintendo Wii-like update to Browser Pong before being retooled as a Skeeball-like game. Some day I’ll finally pay tribute to my favorite of them all, Tempest. (Stay tuned.)
Three.js
I knew I wanted to recreate some of my favorite Asteroids elements in virtual reality. Because my creative medium of choice has long been the World Wide Web, I needed a code library that could both juggle 3D graphics and also push those graphics to a pair of stereoscopic virtual reality goggles — like the HTC Vive, Oculus Rift, or Windows Mixed Reality headsets. Mozilla’s A-Frame, which internally uses Three.js, is a great choice for creating WebXR experiences. From prior projects, however, I’d become accustomed to using Three.js directly—and that’s the route I chose for Space Rocks. The following examples all rely on working directly with Three.
Standard Three boilerplate involves setting up a renderer, virtual camera, main scene, and so on. Our Three boilerplate for Space Rocks resides in scripts/moar/three.js and exports the relevant variables to the global scope via the M.three object, making them easily accessible to us across the app:
Object.assign( M.three, { renderer, camera, scene, world, render })
In fact, all of our Moar boilerplate code lives in the /scripts/moar/ folder and populates the M global object for convenience. If you’re looking for an easy way to get up and running with WebVR and Three.js you may wish to copy that folder and the /scripts/third-party/ folder into your own blank project as a useful starting point. The M global object will have all the goods you need to start building something similar in scale to Space Rocks.
WebXR hand controllers for Three.js
I created VRController with my colleagues at Google’s Data Arts Team (DAT) in early 2017 to make supporting 3DoF and 6DoF hand controllers in WebVR a snap. (We then used VRController to power Dance Tonite, an ever-changing virtual reality dance collaboration between LCD Soundsystem and their fans; directed by Jonathan Puckey and Roel Wouters.) VRController supports hand controllers for Oculus Rift + Touch, HTC Vive, Windows Mixed Reality, Google Daydream, Samsung GearVR, and similar devices.
Just include VRController.js in your Three.js-based WebXR project and call THREE.VRController.update() from within your animation loop. When VRController discovers an available hand controller via the Gamepad API it will emit a global event labeled 'vr controller connected' and pass a reference to the controller instance through that event. That controller instance is actually an extended THREE.Object3D, which means you can add it to your scene, attach objects and models to it, and so on.
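In practice that boilerplate can be as small as the following sketch. (The scene variable, the placeholder sphere, and the animationLoop name here are stand-ins for your own setup, not part of VRController itself.)
window.addEventListener( 'vr controller connected', function( event ){
    const controller = event.detail//  An extended THREE.Object3D.
    scene.add( controller )
    controller.add( new THREE.Mesh(//  Attach whatever visual you like.
        new THREE.SphereGeometry( 0.05, 16, 16 ),
        new THREE.MeshBasicMaterial({ color: 0x00CCFF })
    ))
})
function animationLoop(){
    THREE.VRController.update()//  Polls the Gamepad API for controllers.
    //  …then do your other per-frame work and render the scene.
}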
Are you trying to track down a bug or just peek under the hood? Enter THREE.VRController.inspect() into your JavaScript console for a full overview of what’s connected, what buttons and axes are available, and their current state.
Attaching visuals to the controllers
In “Space Rocks” your “arm” is a cylindrical object with a plasma thruster at the elbow and a photon cannon in place of a hand. Your current score sits just off your right hand, and your remaining lives meter just off your left hand.
Ok, but how do we use VRController in Space Rocks? Let’s begin with the visuals that we’ll attach to each controller. Rather than create a spaceship hull that encompasses the player’s body, I decided to make the player’s arms themselves the means of both propulsion and asteroid destruction. Each arm hull is composed of a few simple shapes which I begin to construct near line 115 in /scripts/player.js:
/*
LEFT RIGHT
lives score
♥ ♥ ♥ 12345
╭───────╮ cannon ╭───────╮
│ ▪ │ cannon pointlight │ ▪ │
├───────┤ ├───────┤
│ │ │ │
│ │ hull │ │
│ │ │ │
│ │ │ │
├───────┤ engine ├───────┤
│ │ │ │
╲_____╱ ╲_____╱
↓ engine exhaust ↓
*/
const hull = new THREE.Mesh(
    new THREE.CylinderGeometry(
        player.arms.radii,
        player.arms.radii,
        0.2,
        7
    ),
    new THREE.MeshPhongMaterial({
        color: 0x999999,
        specular: 0xCCCCCC,
        shininess: 70
    })
…
Once we’re done building the models for the player’s arms we can store them as player.arms.left and player.arms.right respectively. With these pieces ready, let’s listen for the 'vr controller connected' event near line 516 in /scripts/player.js:
addEventListener( 'vr controller connected', function( event ){
    const controller = event.detail
    controller.standingMatrix = M.three.renderer.vr.getStandingMatrix()
    controller.head = M.three.camera
    M.three.scene.add( controller )
…
If supporting an array of hand controller models from different manufacturers were simple, then at this point we could reliably query controller.getHandedness() to see if we need to attach the visual model for player.arms.left or player.arms.right. But simple it is not. In some instances certain controllers will occasionally return an empty handedness string upon connection. Vive controllers can actually swap hands after some initial controller movement based on their relative position to the headset — which is a very neat feature, but it does complicate things. To handle all of these behaviors we’ll first ask for a valid handedness string; if there is one we can immediately attach the relevant arm model, but either way we’ll listen for subsequent 'hand changed' events:
let side = controller.getHandedness()
if( side === 'left' || side === 'right' ) attachArm( side )
controller.addEventListener( 'hand changed', function( event ){
    side = event.hand
    attachArm( side )
})
Our attachArm function near line 563 in /scripts/player.js assigns a few convenience variables for internal use, but it is effectively the same as this simplified version:
const attachArm = function( side ){
    controller.add( player.arms[ side ])
}
So at this point we’ve got visual models for both arms, are listening for available controllers, and as soon as those controllers report either 'left' or 'right' handedness we’ll attach the visual model to that controller instance. Phew! That was a lot! But what about interaction?
Handling buttons
VRController wraps the Web Gamepad API, which handles the discovery and polling of gamepad-like devices. A Gamepad instance has standard properties — one of which is a buttons array. Each object within the buttons array has three properties: value <Number>, touched <Boolean>, and pressed <Boolean>. This is very useful information! But it can be confusing when trying to support various controller models from different manufacturers that were made for different purposes. For example, what if we decide to add 3DoF support to Space Rocks? We still want our cannon to fire off photon bolts when a user hits a button — so let’s look at supporting Vive controllers (6DoF) alongside Daydream controllers (3DoF) as a test case.
The Vive has several buttons, and the one that seems best suited for a shooting action is the trigger. A Vive controller’s trigger happens to be buttons[1] in the Gamepad instance’s buttons array. Let’s say we have a Gamepad instance called gamepad representing the Vive’s controller; we could poll for the pressed value of buttons[1] within our update loop. If its value is true, we run our cannon fire routine.
function myUpdateLoop(){
    if( gamepad.buttons[ 1 ].pressed ) fire()
}
But we’re not using the raw Gamepad API — we’re using VRController — and that calls for a slightly different syntax:
function myUpdateLoop(){
    if( controller.getButton( 1 ).isPressed ) fire()
}
VRController provides explicit support for several popular controller models by way of meticulous code comments about how the device functions and proper names as strings for each button. This means we can re-write the above as:
function myUpdateLoop(){
    if( controller.getButton( 'trigger' ).isPressed ) fire()
}
Look at how that almost reads like a real sentence! But maybe we only want to call our fire routine when the trigger is initially pressed and not for every single frame afterward that our user continues to hold the trigger before releasing. VRController handles this for us by emitting button events on the controller instance. Instead of polling in our update loop as above, we can try this:
controller.addEventListener( 'trigger press began', fire )
Ok, now we’re really getting somewhere, right? Daydream, however, does not have a trigger button. In fact, it only has one single button — and that’s the thumbpad! So for Daydream we’d either need to poll for gamepad.buttons[0].pressed, poll a VRController instance for controller.getButton( 'thumbpad' ).isPressed, or listen for the 'thumbpad press began' event. And now we have a new problem: If we’re just listening for thumbpad presses that could mean both the Daydream’s thumbpad press (intentional) and the Vive’s thumbpad press (unintentional). I suppose we could check the gamepad.id string or controller.style string within the thumbpad event listener and only call fire() if it’s a Daydream controller, but that’s not so elegant, is it?
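For illustration, that not-so-elegant check might look something like this. (A hypothetical sketch; the 'daydream' style string is an assumption about what VRController reports, so verify it against your own console output.)
controller.addEventListener( 'thumbpad press began', function(){
    //  Only fire for Daydream; Vive users rest their thumbs here all the time.
    if( controller.style === 'daydream' ) fire()
})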
This is where VRController’s last bit of button magic comes into play — the concept of a primary button. For supported controllers like Vive or Daydream that’s pretty easy to determine. Vive has a trigger and it feels like the primary button. Daydream only has a single button so it is the primary button by default. (For unsupported / unknown controllers VRController will make an educated guess as to what should be labeled primary.) So we can replace any of the above with this single line:
controller.addEventListener( 'primary press began', fire )
VRController in Space Rocks
In Space Rocks our controller’s grip buttons fire up the engines. We catch the 'grip touch began' controller event on line 693 of /scripts/player.js and set the engine rotation in motion — the user can see their engine exhaust ports spin as they are propelled forward.
controller.addEventListener( 'grip touch began', function(){
    engine.rotationVelocity = 0.15
})
That’s neat, but where’s the code to actually move the player? Because engine thrust needs to be applied continuously as long as the player is holding the grips — and not just upon the initial 'touch began' event — I decided to use the controller’s updateCallback option. Just assign a function to this property and it will be executed at the end of each VRController.update() call. Here on line 666 of /scripts/player.js we can see this in action:
controller.updateCallback = function(){
    if( controller.getButton( 'grip' ).isTouched )
        engineThrust()
}
Multi-channel haptic feedback
Some hand controllers, like the current models for Oculus Touch or Vive, contain haptic actuators — vibrating motors — that can pulse or buzz to provide haptic feedback for our player. For example, is the user backhanding a tennis ball across the net at someone? The moment that virtual tennis racket connects with the virtual ball would be a good time to employ some haptic feedback; to make the controller rumble for a fraction of a second, thereby allowing the user to “feel” that contact.
I came up with the idea of haptic channels while building Space Rocks and then merged this functionality into VRController. The raw Gamepad API already supports vibrating actuators, so let’s begin there and then work our way up to multi-channel haptics using VRController. Here we have a Gamepad instance and if it has any actuators we’ll buzz the first one:
if( gamepad.hapticActuators &&
    gamepad.hapticActuators[ 0 ]){
    gamepad.hapticActuators[ 0 ].pulse( intensity, duration )
}
For reference, intensity is a floating-point value between 0 and 1 inclusive, and duration is a number of milliseconds. hapticActuators is an array, but I’ve yet to see a hand controller that has more than one actuator. For our tennis example, the above functionality covers everything we need: The user hits something in the virtual world and we buzz their hand controller in response. Perfect.
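If you find yourself sprinkling that actuator check around your code base, a small helper keeps it in one place. This is just a hypothetical convenience function, not part of the Gamepad API or VRController:
const buzz = function( gamepad, intensity, duration ){
    //  Only pulse if this gamepad actually has a haptic actuator.
    if( gamepad.hapticActuators && gamepad.hapticActuators[ 0 ]){
        gamepad.hapticActuators[ 0 ].pulse( intensity, duration )
    }
}
//  The moment racket meets ball:
//  buzz( gamepad, 0.7, 80 )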
Set and Wait
Ok, but Space Rocks is a bit more complicated. What happens when you’re revving your plasma engines while firing off photon bolts in deep space? Slinging that bolt from the rotating cannon head is going to hit you with a sudden recoil while the low rumbling hum of the engine continues underneath. And what about the wind-down of that rotating cannon head as internal friction slowly returns it to a standstill?
First, let’s look at the underlying engine hum. When the user squeezes their controller grips we create a haptic channel on our controller instance called 'engine rumble' and set its intensity to 20% like so:
controller.setVibe( 'engine rumble' ).set( 0.2 )
The act of selecting a haptic channel with setVibe() automatically creates that channel if it does not already exist. The name of that channel is whatever string you pass to setVibe(). Notice how duration is not being specified. VRController will rumble at that intensity forever—or until you issue a new intensity command. That’s as easy as selecting the same haptic channel again by name:
controller.setVibe( 'engine rumble' ).set( 0 )
We can also create a queue of haptic channel commands. Let’s say when you first engage the engine there’s a moment of intense shuddering before it settles down into its normal hum. Perhaps that initial shudder lasts a second and a half. Here’s how we might describe that using haptic channels:
controller.setVibe( 'engine rumble' )
.set( 0.8 )
.wait( 1500 )
.set( 0.2 )
And just for fun, perhaps it takes an eye-blink of a moment after releasing the controller grips for the engine to disengage:
controller.setVibe( 'engine rumble' ).wait( 250 ).set( 0 )
Under the hood, VRController is keeping track of time via window.performance.now() to know when in the future each change in vibration intensity ought to take place.
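That bookkeeping is roughly equivalent to keeping a per-channel queue of timestamped intensity commands. The sketch below is not VRController’s actual code, just the general idea:
const channel = { intensity: 0, queue: [] }
const setIntensityLater = function( intensity, delay ){
    //  Store an absolute timestamp for when this intensity should take effect.
    channel.queue.push({ intensity, at: window.performance.now() + delay })
}
const updateChannel = function(){
    //  Call this from an update loop; apply and discard any commands whose time has arrived.
    const now = window.performance.now()
    channel.queue = channel.queue.filter( function( command ){
        if( command.at <= now ){
            channel.intensity = command.intensity
            return false//  Drop the command once it has been applied.
        }
        return true
    })
}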
Important note: Selecting the haptic channel will automatically erase its queue of future events. Why might this be desirable? Imagine the haptic behavior we described above. Now imagine a user engages the engine by squeezing their controller grips, but after one second decides to release the grips. If the event queue was not automatically scrubbed what they might experience is 1.25 seconds of 80% vibration intensity, followed by a quarter second of haptic silence, then followed by 20% vibration intensity that lasts into perpetuity. That wouldn’t feel like an engine kicking on, then shutting down. It would feel like a mistake.
Multiple haptic channels
The above pattern of selecting a haptic channel and then applying set or wait is convenient, but using multiple channels is where VRController’s approach to haptics really shines.
Let’s say you’ve got that engine rumbling and now the user is pulling the trigger on their controller to fire a photon bolt. How do you plan to keep track of where the engine’s at in terms of its vibration intensity queue, and also apply the recoil of the cannon? And what about the rotation of the cannon head — which we want to spin at full intensity immediately, then wind down over time? With VRController you don’t have to worry about it.
Your engine’s already rumbling. Here’s what to add to your trigger-press routine:
controller.setVibe( 'cannon recoil' )
.set( 0.8 )
.wait( 20 )
.set( 0.0 )
controller.setVibe( 'cannon rotation' )
.set( 0.2 )
And for the wind-down you might add something like this to your trigger-release routine:
controller.setVibe( 'cannon rotation' )
.wait( 500 ).set( 0.10 )
.wait( 500 ).set( 0.05 )
.wait( 500 ).set( 0.00 )
You could of course make a much more granular wind-down, but I’ve found even the above coarse degree of detail gets me close enough to the haptic expression I’m looking for.
So if your engine’s humming along at 20% intensity, your cannon recoil hits at 80% intensity, and the cannon’s rotation adds 20% on top of that… Well that’s 120% intensity — and that doesn’t make sense. Thankfully, VRController sums the intended aggregate intensities at each moment and automatically caps the total at 100%, then sends that pulse command to the Gamepad instance. All the gory details are handled for you so you can focus on what really matters to you: making your VR experience feel just right.
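Conceptually, the summing and capping works something like the sketch below. This is an illustration only, not VRController’s implementation, and it assumes you already have a gamepad reference in scope:
const channels = {
    'engine rumble':   0.2,
    'cannon recoil':   0.8,
    'cannon rotation': 0.2
}
//  Sum every channel's current intensity, then cap the total at 100%.
const total = Math.min( 1, Object.keys( channels ).reduce( function( sum, name ){
    return sum + channels[ name ]
}, 0 ))
if( gamepad.hapticActuators && gamepad.hapticActuators[ 0 ]){
    gamepad.hapticActuators[ 0 ].pulse( total, 16 )//  Roughly one frame at 60fps.
}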
Haptic channels in Space Rocks
So where do we use haptic channels in Space Rocks? All over /scripts/player.js. Check out how we buzz the controller when the cannon fires on line 641: We jump to 80% intensity on fire, then almost immediately we come back down to zero — just as described above. On the very next line we add to that same cannon rotation rumble which will cruise at 15% intensity until our user releases the trigger. For more examples just search for setVibe within /scripts/player.js.
Tasks—and task lists
In Space Rocks a “task” is literally just a function—there’s nothing ground-breaking about that. But task lists are something worth writing about. A TaskList[] is an array for storing queued functions for later execution. The list itself includes handy methods like find(), before(), and after() for specifying the order of the tasks in the queue. For example, consider this list of functions and a TaskList[]:
const
a = function(){ console.log( 'Apple' )},
b = function(){ console.log( 'Banana' )},
c = function(){ console.log( 'Carrot' )},
tasks = new TaskList()
The following will output Banana, Apple, Carrot:
tasks
.add( a )
.add( b ).before( a )
.add( c )
.run()
Where is my particular task in this stack of tasks? A task’s index can be found by passing a reference to the function itself. The following will output 2:
console.log( tasks.find( c ))
Similarly, asking for a specific task index will return the task itself. In this case the output will be function(){ console.log( 'Carrot' )}:
console.log( tasks.find( 2 ))
With this idea of a TaskList[] in place we can really start cooking. Space Rocks divides code execution into three main categories: 1. Code that’s executed immediately upon being read by the interpreter (normal JavaScript execution), 2. Code that should execute once all content has loaded (via the standard DOMContentLoaded event), and 3. Code intended to be executed repeatedly in an update or animation loop. The first category is of course handled by the JavaScript interpreter itself as it evaluates the code. We can handle the latter two with a TaskList[] each, defined by this accompanying boilerplate code located in scripts/moar/tasks.js:
if( this.M === undefined ) this.M = {}
M.tasks = {
    setups: new TaskList(),
    setup: function(){
        M.tasks.setups.run().clear()
        M.tasks.update()
    },
    updates: new TaskList(),
    update: function( t ){
        M.tasks.updates.run( t )
    }
}
document.addEventListener( 'DOMContentLoaded', M.tasks.setup )
Any functions added to M.tasks.setups[] will run during Space Rocks’ setup phase—that is, once the DOM content has finished loading. Upon completion, the setups task list will clear its queue of functions and then call M.tasks.update(). (The t argument is for passing a time delta between frames so that movement and animation can remain time-based rather than frame-number-based.)
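Here’s a hypothetical usage sketch, assuming the Moar boilerplate above has been loaded:
M.tasks.setups.add( function(){
    //  Runs once after DOMContentLoaded, then the setups list is cleared.
    console.log( 'Setup complete.' )
})
M.tasks.updates.add( function( timeDelta ){
    //  Runs every animation frame; timeDelta is the seconds between frames.
    console.log( 'Seconds since last frame:', timeDelta )
})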
You might notice that there are no further calls to M.tasks.update(). This is because we make subsequent update calls from within Three’s built-in animate() method from /scripts/moar/three.js like so:
let timePrevious
const render = function( timeNow ){
    if( timePrevious === undefined ) timePrevious = timeNow
    const timeDelta = ( timeNow - timePrevious ) / 1000
    timePrevious = timeNow
    M.tasks.update( timeDelta )
    renderer.render( scene, camera )
}
renderer.animate( render )
Why pass our update function into Three’s animate() method instead of just calling Three’s renderer.render() from within our own update loop? As of Three r87 the renderer handles toggling between window.requestAnimationFrame() and vrDisplay.requestAnimationFrame(). All we have to do is call animate() and Three will worry about which request to make and some other underlying details.
You can inspect Space Rocks’ two task lists live from your browser’s JavaScript console with M.tasks.setups.inspect() and M.tasks.updates.inspect().
Modes
There are only 29 lines of actual code in Mode.js—or fewer, depending on how you count them—yet these lines establish the flexible framework that Space Rocks’ main.js is built upon. Modes are mutually exclusive; we can only be in one mode at a time. What mode are we in now? Check Mode.current in the console. What other modes are available? Check Mode.all instead. Modes are named and have optional setup(), update(), and teardown() methods. We just call Mode.run() in our update loop and it will call the current mode’s update() method each time it’s invoked.
Let’s say we’re currently in mode Apple. We can switch to mode Banana by calling Mode.switchTo('Banana'). This switchTo() function will automatically call Apple’s teardown(), set Mode.current equal to the Banana mode object, then call Banana’s setup(). Afterwards Mode.run() will call Banana’s update() method.
Creating modes is this easy:
new Mode({
    name: 'Apple',
    setup: function(){ console.log( 'Hey', Mode.current )},
    update: function(){ Mode.switchTo( 'Banana' )},
    teardown: function(){ console.log( 'Bye', Mode.current )}
})
new Mode({
    name: 'Banana',
    setup: function(){ console.log( 'Howdy', Mode.current )},
    update: function(){ console.log( 'Ok' )}
})
And our output would be:
> Hey Apple
> Bye Apple
> Howdy Banana
> Ok
> Ok
> Ok
…
Modes make it trivial for Space Rocks to switch between routines like “game play” and “game over” or to handle waiting for a player’s hand controllers to become available to the Gamepad API. To see all of Space Rocks’ modes in action, check out /scripts/main.js starting from line 233.
Tasks AND modes?
Tasks and modes sound really similar—why use both? Well, tasks are the individual instrument players while modes are the orchestra conductors. Look how simple it is to describe game play as a mode in /scripts/main.js:
new Mode({
    name: 'game play',
    setup: function(){
        Rock.all.destroy()
        level.number = 0
        player.reset()
        player.enableEngines()
    },
    update: function(){
        if( Rock.all.length === 0 ) level.create()
        if( player.lives < 1 ) Mode.switchTo( 'game over' )
    },
    teardown: function(){
…
That’s it. Because the tasks for Rocks, Bolts, etc. are already doing their job via M.tasks.update() we need only issue a few commands to describe the game play itself. In fact, this mode’s update function is just two lines long:
- If there are no rocks left, make a new level.
- If there are no lives left, switch to “game over.”
Wrapping space
The original Asteroids is a two-dimensional game. Rather than a side-scroller that simply extends beyond the edges of the screen, Asteroids wraps around the edges back onto itself: Objects exiting one screen edge then enter from the opposite screen edge. Space wraps! If you’re trying to shoot an asteroid flying off the top of your screen but you’re just not close enough to aim with precision, you might do well to turn around and fire off the bottom of the screen in order to hit it.
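In two dimensions that wrap is just a boundary check on each axis. Here’s a quick sketch, assuming a hypothetical ship object with x and y coordinates and known screen dimensions:
//  If the ship drifts past one edge, teleport it to the opposite edge.
if( ship.x > screenWidth  ) ship.x -= screenWidth
if( ship.x < 0            ) ship.x += screenWidth
if( ship.y > screenHeight ) ship.y -= screenHeight
if( ship.y < 0            ) ship.y += screenHeight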
But how does that translate into virtual reality? If we think of the player’s head position as the origin point in our three-dimensional space, then we can define distances from that origin point. Near the top of scripts/main.js I define some radii from the player’s head (our Three.js camera rig) in the global settings object—this is the essence of our wrappable universe:
…
radiusFogBegins: 250,
radiusFogEnds: 295,
radiusWrap: 300,
radiusStarsBegin: 300,
radiusStarsEnd: 400,
…
The standard unit in virtual reality is meters. Here we see that fog begins to obscure the scene at 250 meters from the camera. By 295 meters the fog is fully opaque and therefore objects beyond this distance are invisible. Objects further than 300 meters will wrap to the opposite end of the universe—this is our “modulo magic” that creates the illusion of infinite space while always keeping the player in the action. No matter how far our player travels they will never leave this known universe, nor will the objects they encounter. (And in the far, unreachable background beyond 300 meters away, we set our stars.)
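For reference, here’s one way those radii could drive the scene fog in Three.js. The fog color below is an assumption for illustration and is not pulled from the Space Rocks source:
M.three.scene.fog = new THREE.Fog(
    0x000000,                 //  Fog color (assumed here).
    settings.radiusFogBegins, //  250 meters: fog starts to obscure.
    settings.radiusFogEnds    //  295 meters: fully opaque; beyond this, invisible.
)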
We’ve a few components that need wrappable behavior: Bolts, Jumbotrons, and Rocks. Using classical inheritance patterns we might define a Wrappable class which Bolt, Jumbotron, and Rock could inherit from. But just globbing on functionality via prototypes can be more fun. Here in scripts/main.js, line 47 we define a wrap method in the global scope:
function wrap(){
    const worldPosition = this.getWorldPosition()
    if( worldPosition.distanceTo( player.position ) >= settings.radiusWrap ){
        this.position.copy( this.parent.worldToLocal(
            worldPosition
            .sub( player.position )
            .normalize()
            .multiplyScalar( settings.radiusWrap * -1 )
            .add( player.position )
        ))
        return true
    }
    return false
}
Now adding wrappable functionality to any component is easy. For example, here’s how I made Rocks wrappable in scripts/Rock.js, line 72:
Rock.prototype.wrap = wrap
That’s it. All instances of Rock are now wrappable. And if you choose to code your own space objects you can easily make them wrappable too.
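As a hypothetical example, imagine adding your own Comet objects to the mix. The class below is invented for illustration, but the wrappable part works exactly the same way:
function Comet(){
    THREE.Object3D.call( this )
    //  Build your comet mesh and add it to this object here.
}
Comet.prototype = Object.create( THREE.Object3D.prototype )
Comet.prototype.constructor = Comet
Comet.prototype.wrap = wrap
Comet.prototype.update = function( timeDelta ){
    //  Move the comet here, then snap it back into the known universe if needed.
    this.wrap()
}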
To be continued?
Hey—thanks for reading this far. I could go on about Quicktext, Rock formation, particle reuse in Explosions, and more. I mean, just check out this custom exhaust geometry and its explanatory ASCII diagram:
BASE GEOMETRY EXTENDED GEOMETRY
R0 R1 R2 R3 R4 …… R200
─────────··········─────────────────────
╱ ╱ ╲ ╱ ╱ ╱ ╱ ╲
│ │ │ │ │ │ │ │
╲ ╲ ╱ ╲ ╲ ╲ ╲ ╱
─────────··········─────────────────────
But this feels like a good place to call time for now. Drop some comments to let me know what’s been useful or to ask new questions. Long live WebXR.