Stewart Smith
VRController

Support VR hand controllers for Oculus, Vive, Windows Mixed Reality, Daydream, GearVR, and more by adding VRController to your existing Three.js-based WebVR project. (Note: I ended support for this open-source library in 2018 when I left Google for Unity Labs and took a break from Web-based XR.)

VRController wraps the Web Gamepad API and smooths over gamepad browser quirks. It emits a controller instance (an extended THREE.Object3D) upon gamepad discovery, handles controller updates for position and orientation (including 3DOF rigs via the OrientationArmModel), and watches for changes in axes and button states, emitting corresponding events on the controller instance.

VRController includes explicit support for Oculus Rift + Touch, HTC Vive, Windows Mixed Reality motion controllers, and Google Daydream, and implicit support for Samsung GearVR and similar devices. Is your company developing new hand controllers? Send one my way and I’ll add support for it. 😉

VRController is compatible with Three.js r87 which is the first version to use the new renderer.vr object, and was originally submitted to Three.js as pull request #10991 on Saturday, 11 March 2017. Note: that pull request is no longer maintained.





VRController in use

Dance Tonite

VRController powers Google’s Dance Tonite WebXR music video for LCD Soundsystem, facilitating support for a wide range of VR hand controllers with minimal fuss. Browse the Dance Tonite source code on GitHub, or read Jonathan Puckey’s technical writeup for Google’s Web Developer blog.


Space Rocks

Kicking it up a notch, VRController also powers my personal project Space Rocks—a tribute to Atari’s 1979 classic, Asteroids. Space Rocks makes extensive use of VRController’s “multi-channel haptic feedback” (see below), and is also open-source, so you can dive into the Space Rocks code repository on GitHub to see exactly how the parts are stitched together. For a gentle overview, read the Space Rocks Technical Deep Dive.



Multi-channel haptic feedback

Some hand controllers, like the current models for Oculus Touch or Vive, contain haptic actuators—vibrating motors—that can pulse or buzz to provide haptic feedback for our VR user. For example, is the user backhanding a virtual tennis ball across the net at someone? The moment their virtual tennis racket connects with the virtual ball would be a good time to employ some haptic feedback; to make the controller rumble for a fraction of a second, thereby allowing the user to “feel” that contact.

I came up with the idea of haptic channels while building the game Space Rocks and then merged this functionality into VRController. For an in-depth look at how multi-channel haptic feedback can cut your haptic development time down to nothing, read my Space Rocks Technical Deep Dive. In the meantime, here’s the TLDR: Look how simple it is to describe concurrent channels of haptic activity. Imagine you’re on a ship with a giant rotating cannon. Let’s start with a mild engine rumble. (By referencing the “engine rumble” channel we are creating it if it does not already exist.)


controller.setVibe( 'engine rumble' )
    .set( 0.10 )//  10% intensity until we say otherwise.

Now imagine adding this to the trigger-press routine for that rotating cannon (with some serious kickback):


controller.setVibe( 'cannon recoil' )
    .set( 0.80 )//  80% intensity.
    .wait(  20 )//  Wait a fraction of a second.
    .set( 0.00 )//  Then stop the kickback.

controller.setVibe( 'cannon rotation' )
    .set( 0.20 )//  20% intensity until we say otherwise.

And add this to your trigger-release routine to make your rotating cannon gradually slow to rest:


controller.setVibe( 'cannon rotation' )
    .wait( 500 ).set( 0.10 )//  After half a second, drop to 10%.
    .wait( 500 ).set( 0.05 )//  Half a second later, drop to 5%.
    .wait( 500 ).set( 0.00 )//  Half a second later, kill it completely.

Now you’re cooking with gas.



Get started with VRController

Let’s get you up and running right away. You can run VRController locally in your own project (which requires a tiny bit of coding), or try out the live demo which doesn’t require any fiddling with files at all. Let’s get to it.


Requirements

  1. Virtual Reality rig with 3DOF or 6DOF controllers such as the Oculus Rift + Touch, HTC Vive, a Windows Mixed Reality rig, Google Daydream, Samsung GearVR, or similar devices.
  2. WebVR-capable browser. For the latest list of browsers that support WebVR—as well as download and setup instructions—see WebVR Rocks.
  3. Working knowledge of Three.js.

Try it now!

Already on a VR rig with a WebVR-capable browser? Just point your browser to https://stewdio.github.io/THREE.VRController/ to experience this code in action.




Easily add VRController to your project

  1. Add our VRController.js file to your existing Three.js project and use our index.html example file as your guide for the following steps.
  2. Add a THREE.VRController.update() function call to your animation loop.
  3. Add a listener for the "vr controller connected" global event. This is how you will receive the controller object instance—which is an extended THREE.Object3D. This means you can add it to your scene, attach meshes to it, and so on.
  4. When you receive the controller object instance, you must give it some additional information depending on the type of controller. For 6DOF (room-scale) rigs you must provide a standing matrix, easily obtained from your WebGLRenderer instance in Three.js r87 and above: controller.standingMatrix = renderer.vr.getStandingMatrix(). For 3DOF (seated) rigs you must provide a reference to the camera so the controller can use the headset’s live position and orientation to guess where it ought to be: controller.head = camera. There’s no penalty for providing the controller instance with both standingMatrix and head properties, as we do in the example and in the sketch after this list.
  5. Explore the available touch, press, and trackpad events by assigning THREE.VRController.verbosity = 1. You’ll now see a flood of verbose comments in the JavaScript console as you interact with your controller. To access controllers directly from the console explore the THREE.VRController.controllers object. To get a snapshot of all controller data try THREE.VRController.inspect().
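
Putting steps 2 through 4 together, here is a minimal sketch that follows the pattern of the bundled index.html example. It assumes a scene, a camera, and a WebVR-enabled WebGLRenderer (r87 or newer) already exist in scope; the 'primary press began' event name and the render-loop call (renderer.animate() here, renderer.setAnimationLoop() in later Three.js versions) are illustrative, so check the example file for the exact wiring in your setup.


//  Minimal setup sketch. Assumes scene, camera, and a WebVR-enabled
//  renderer (THREE.WebGLRenderer, r87+) already exist, and that
//  VRController.js is included after Three.js.

window.addEventListener( 'vr controller connected', function( event ){

    var controller = event.detail//  An extended THREE.Object3D.
    scene.add( controller )

    //  6DOF rigs want a standing matrix; 3DOF rigs want the camera.
    //  Supplying both is harmless.
    controller.standingMatrix = renderer.vr.getStandingMatrix()
    controller.head = camera

    //  Button and axis events are emitted on the controller itself.
    controller.addEventListener( 'primary press began', function(){

        //  Respond to a primary button or trigger press here.
    })
})

//  Poll the Gamepad API once per frame.
function animate(){

    THREE.VRController.update()
    renderer.render( scene, camera )
}
renderer.animate( animate )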

Run locally

For security reasons you can’t run a WebVR experience by just dragging the index file onto a browser tab. You have to run an actual server. The easiest way to do this on your own desktop machine is to start a simple Python server. Open up a command line prompt, navigate to wherever you’ve stored this code package, then type the following command depending on the version of Python you have installed.


  • Python 2: python -m SimpleHTTPServer 8000
  • Python 3: python3 -m http.server 8000 (or py -m http.server 8000 on Windows)

In your browser you can now navigate to http://localhost:8000/ to see the demo running locally. You can shut down the local server by returning to the command line and hitting Control + C.


Notes on Chromium’s Gamepad API

If you’re building WebVR experiences and targeting the WebVR build of Chromium you may want to read my Medium post about its quirky behavior and how VRController compensates for it: WebVR controllers and Chromium’s Gamepad API.


Made with love

Everything in VRController is very clearly documented right in the code. I know that if this toolkit doesn’t materially make the process of creating your next WebVR project easier, you won’t use it. (And selfishly, this is to make my future project development easier too!) Here’s just one example of what you’ll find inside:


//  VIVE THUMBPAD
//  Both a 2D trackpad and a button. Its Y-axis is “Goofy” -- in
//  contrast to Daydream, Oculus, Microsoft, etc.
//
//              Top: Y = +1
//                   ↑
//    Left: X = -1 ←─┼─→ Right: X = +1
//                   ↓
//           Bottom: Y = -1
//
//  Vive is the only goofy-footed y-axis in our support lineup so to
//  make life easier on you WE WILL INVERT ITS AXIS in the code above.
//  This way YOU don’t have to worry about it. 

You’ll also discover a level of detail here that can only come from meticulously testing each and every controller in each and every WebVR-supporting browser. I didn’t just read the docs, I inspected everything. Some of it was documented. 99% of it wasn’t. (I feel like just by discovering the different activation and deactivation thresholds for specific hardware, I got to know a little something about the personalities of each different hardware team.) I did all of this so you don’t have to. You should be focussed on crafting the feel of your VR experience—not digging through hand controller support quirks!


//  VIVE TRIGGER
//  Has very interesting and distinct behavior on Chromium:
//  the threshold for entering a pressed state (0.55) is higher
//  than the threshold for leaving it (0.45) -- a bit of hysteresis.
//
//  Chromium
//  if( value >  0.00 ) isTouched = true else isTouched = false
//  if( value >= 0.55 ) isPressed = true   UPON ENGAGING
//  if( value <  0.45 ) isPressed = false  UPON RELEASING
//
//  Firefox
//  if( value >= 0.10 ) isTouched = isPressed = true
//  if( value <  0.10 ) isTouched = isPressed = false
//  --------------------------------------------------------------
//  value:     Analog 0 to 1.
//  isTouched: Duplicates isPressed in FF, independent in Chrome.
//  isPressed: Corresponds to values as listed above.
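
To make that hysteresis concrete, here is a small hypothetical sketch in plain JavaScript. It illustrates the thresholds quoted above; it is not VRController’s internal code.


//  Hypothetical illustration of threshold hysteresis -- not VRController
//  internals. A higher engage threshold (0.55) and a lower release
//  threshold (0.45) keep a noisy analog value from flickering between
//  pressed and released.

var isPressed = false

function updatePressedState( value ){

    if( !isPressed && value >= 0.55 ) isPressed = true //  Engage.
    else if( isPressed && value < 0.45 ) isPressed = false//  Release.
    return isPressed
}

//  A value hovering near 0.5 keeps whatever state it entered last:
//  0.50 → released, 0.56 → pressed, 0.50 → still pressed, 0.44 → released.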

Thank you

Every good piece of VRController stands on the shoulders of the Web Gamepad API and Three.js. Thank you, Brandon Jones at Google, for your work as Gamepad API spec editor and for co-creating the WebXR API. And thank you, Ricardo Cabello (AKA “Mr. Doob”) and contributors, for creating and maintaining Three.js.

A very special shoutout to another WebXR spec editor: I wouldn’t be able to support Windows Mixed Reality motion controllers without the generous and infectious enthusiasm of Nell Waliczek at Microsoft. Nell took an interest in my little project and sent me an Acer headset with WMR hand controllers—months ahead of their consumer release—so that I could add support for them here. Not only that, but she gave me some gentle nudges when she discovered subtle bugs in VRController; Nell made this project better. Thank you, Nell.