“Dance Tonite” is an ever-changing virtual reality dance collaboration between LCD Soundsystem and their fans in celebration of the band’s single “Tonite” from their album American Dream. Created by Studio Puckey, Moniker, Google Data Arts Team, et al.
Immerse yourself in a moving party of abstract dancers—a vibrant palette of cone faces and rod hands bouncing to the beat in a series of sprightly-hued rooms. Spectate from your laptop or a seated VR headset experience, as the spherical cursor traverses the length of Tonite’s thumping track. Jump up and dance along by using your room-scale VR headset and hand controllers. Every virtual dancer you encounter is the recording of a real-life reveler—just like yourself. Record your own moves and submit them for inclusion in the next public compilation of the video. And what’s more, you can record up to twenty loops of yourself in the same virtual room, instantly transforming your solo performance into a full-on dance party.
Our “Dance Tonite” experience segments LCD Soundsystem’s “Tonite” into virtual rooms, each one representing about eight seconds of the song. That means it takes about eight seconds for our yellow, spherical “cursor” to fly across each room.
The “Point of View” (POV) camera allows you to view the experience from the perspective of either the moving spherical cursor, or any of the dancers.
Within a seated VR experience (like Google Cardboard, Daydream, Samsung GearVR, etc.), or on a non-VR platform, the yellow ball becomes your virtual camera view. (Not in VR? Just click or tap on the ball to use it as your virtual camera.) Ride along with the ball through our series of rooms and dancing LCD Soundsystem fans—and don’t forget to turn your head to look all around you as you glide onward. You can also click or tap on the virtual dancers in order to jump to their own point of view. That’s right—see the experience from the eyes of the dancer that lived it.
The “Dance Tonite” dance recording tutorial.
All of the choreography you see here was recorded by real fans, like yourself, using room-scale VR devices like the HTC Vive or Oculus Rift. These devices use headset and controller tracking to reflect your physical movements in your virtual environment. Dance Tonite transforms your room-scale VR setup into a DIY motion capture tool. Record your dance and submit it for future inclusion in the experience. One is the loneliest number, so record up to 20 loops of yourself to fill your room with movement.
WebVR is a new technology that enables your web browser—the thing you’re reading this on right now—to deliver high quality virtual reality content right to your phone or VR headset. (On a phone? Try Google Cardboard or Samsung GearVR.) With WebVR there’s nothing to install or download because the VR experience is just a website. And because it’s just the Web, all you need to get started (or to share with your friends) is a URL: https://tonite.dance
The making of Dance Tonite
As Jeff Nusz puts it, beautiful things can happen when worlds collide. Twenty years ago LCD Soundsystem mashed together electronic dance music and punk rock—an unlikely pairing that yielded undeniable fun, humanity, and an honesty that still rings true today. That spirit of fusion and recombination was in the air in what we can now recognize as a still post-9/11 period: a search for novel cultural moorings and anchors for a new, uncertain era. I recall my Williamsburg days circa 2004–2006; loft party soundtracks shifting between openers like Metric and Yeah Yeah Yeahs, to “recent classics” like The Faint, or pre-fascist / Katrina-era Kanye, trance-inducing Ladytron, life-affirmers The Go! Team, peak-of-talents Outkast, and Union Pool regulars TV on the Radio. And then there was LCD Soundsystem. James Murphy and company felt like they could be our roommates in some factory squat on North 11th; folks we might be deep in sidewalk conversation with, drinks in hand, as some nameless face shoulder-taps the lot of them that it was time for their set; best hop on the mic before the party died out. (And hey studio kids, don’t forget to keep those mouse pads securely taped to the drum heads for that patented DFA thwump!)
While we can’t claim to hold a candle to the significance of our musical partners in this experience, we can at least say we collectively went out on a limb to fuse together another unlikely mashup: The indie dance spirit of LCD Soundsystem stirred with the experimental (and nerdy) technology of browser-based virtual reality. Our fusion experience was created by artists Jonathan Puckey and Moniker, in collaboration with Google’s Data Arts Team—a specialized team within Google that explores the ongoing dialog between artists and emerging technologies. Some of the main characters are featured in this “behind the scenes” video:
Our “Making of Dance Tonite” video, which I entirely missed the filming of due to a flight delay.
Sadly, my morning flight from New York to San Francisco was severely delayed on the day of “behind the scenes” filming—one of the very few times that the otherwise excellent Virgin America ever disappointed me. As a result, I arrived at Google’s SF office overlooking the old Ferry Building just as the “making of” film crew were packing up. I’d missed my opportunity to appear in the documentation. I would have loved to gush at least a sentence about VRController, or my WebVR co-presence experiments that dovetailed into recording the dances. No matter. This project was a treat to work on, and the Google I/O debut and launch party still lay ahead. (Also, this video may make it appear as if we were all physically together during this project’s evolution. The truth is we were working across nine time zones from various offices and homes; remote collaboration is indeed possible and those who claim otherwise are ignorant or worse.)
- Music. LCD Soundsystem.
- Directors. Jonathan Puckey, Moniker, Google Data Arts Team.
- Concept. Studio Puckey, Moniker.
- Production. Sabah Kosoy, Alisha Jaeger.
- Developers. Jonathan Puckey, Jeff Nusz, David van Gelder de Neufville, Stewart Smith, Michael Chang, Gianluca Martini.
- Typography. Studio Puckey, Marius Schwarz.
- Typeface. Lars by Bold Decisions.
- Studio Puckey. Jonathan Puckey, David van Gelder de Neufville, Marius Schwarz.
- Moniker. Roel Wouters.
- Google Data Arts Team. Jeff Nusz, Sabah Kosoy, Alisha Jaeger, Stewart Smith, Michael Chang, Gianluca Martini, and special guest Christian Haas, with support from David Krinsky, David Rappaport, Eric Davich, and Jack O’Connell.
- “Dance Tonite” open-source code repository on GitHub.
- “‘Dance Tonite’ in WebVR” by Jonathan Puckey for Google’s Web Developer blog.
- “‘Dance Tonite’, an ever-changing VR collaboration by LCD Soundsystem and fans” by Jeff Nusz, Google Data Arts Team for Google’s “The Keyword” blog.
- “Dance Tonite” on Google’s “Experiments with Google” gallery.
- “Dance Tonite” case study on Studio Puckey website.
- “Dance Tonite” case study on Studio Moniker website.
My own path to WebVR and Dance Tonite
In the autumn of 2015, I departed Yahoo to return to Google’s Creative Lab. I was a Web guy at heart, and became obsessed with the emerging field of Web-based virtual reality (WebVR). The browser API for WebVR was not yet standardized; in those early days building and testing WebVR content meant asking Chrome engineer Brandon Jones for a custom build of Chromium. (Toji, you are a goddamn hero.) I was already very familiar with Three.js and this naturally became my 3D engine of choice for WebVR experimentation. (Obligatory shoutout to the patient and kind-hearted Ricardo Cabello, AKA Mr. Doob, and all Three.js contributors.) I’m also indebted to Boris Smus for his WebVR scaffolding code and functionality shims.
With software in place, the missing piece was hardware. A good friend who was eager to support my professional interests had offered to purchase tickets for us to attend Unity’s Vision Summit held in Los Angeles, February 2016. (“Friends are God’s apology for relations.”) We were beyond lucky when during his conference address, Valve’s Gabe Newell announced that Vision Summit attendees would receive a free HTC Vive “Pre” device—and that these “Pre” units would ship imminently, even ahead of the final consumer product’s availability. That meant I’d soon have access to the absolute latest in high-end consumer VR gear and could get to work on building demos straight away—provided I had the right sort of computer, of course. Upon returning to Brooklyn I ordered my first Windows PC since the one I had built for myself the summer before college, back in 1999. (And I would soon discover that using Windows was even more horrendous than I had remembered it.)
My multiplayer WebVR worlds of Peg People
Undeterred by the WebVR API’s instability, I set to work furiously cranking out browser-based demos for my HTC Vive Pre, Samsung GearVR, Google Cardboard, and eventually Google’s Daydream View. Creating these one-offs was fun, but I already had my eye on multiplayer experiences where visitors could jump from world to world together by engaging the very bedrock of the Web—the hyperlink.
A colleague scoffed that “co-presence” was difficult; that top university researchers were slowly developing protocols for multiplayer virtual reality in the hopes of making it real someday. This sounded like absolute rubbish to me. (How hard could it be to bounce a few floating point values for position and orientation across a server?) That weekend I wrote my own crude multiplayer server built on Node and Socket.io that did just that. (And yes, it turns out that multiplayer is indeed very difficult. Wow, is the devil in those details.) Within a few iterations I had three separate 3D “worlds” (each accessible by its own URL destination) that users could visit via their 2D computer screens within a first-person-shooter view, or by smartphone within a “magic window” AR view, or as an immersive seated VR experience, or as a full-on room-scale VR experience.
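That weekend server is long gone, but its core idea was relaying a handful of floats per user. A minimal sketch of the concept might look like this (the seven-float layout and helper names are my own illustration, not the project’s actual code):

```javascript
// Pack a user's head pose (position plus orientation quaternion) into
// seven floats: compact enough to broadcast many times per second.
function packPose( position, quaternion ) {
  return new Float32Array([
    position.x, position.y, position.z,
    quaternion.x, quaternion.y, quaternion.z, quaternion.w
  ])
}

// Unpack a received buffer back into position and orientation.
function unpackPose( buffer ) {
  return {
    position:   { x: buffer[ 0 ], y: buffer[ 1 ], z: buffer[ 2 ]},
    quaternion: { x: buffer[ 3 ], y: buffer[ 4 ], z: buffer[ 5 ], w: buffer[ 6 ]}
  }
}

// A Socket.io relay then does little more than re-emit each incoming
// buffer to every other connected client, e.g.:
//   io.on( 'connection', ( socket ) =>
//     socket.on( 'pose', ( buf ) =>
//       socket.broadcast.emit( 'pose', socket.id, buf )))
```

The devilish details mentioned above live elsewhere: send rate, interpolation between received samples, and coping with late or dropped packets.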
A peek inside “Summer World”—one of my WebVR demo worlds. Here we can see that my name is “Indigo Salmon.” I’ve met with Lime Caribou to my right, and Ocean Blue Manatee in the left background. In the distance more folks are arriving at Summer World via the teleporters—the lights that extend upward into space. And in the far distance, the city.
Users found themselves represented as simple “peg people”—peg bodies sans hands or feet, with a rotatable spherical head containing eyes and a nose to indicate gaze direction. Each user was assigned a randomly chosen name composed of a color and animal, that hung as a floating label above their heads. This name was retained as they teleported across worlds, making it easy to spot friends coming and going. There was only a single control mechanic: tap the primary button on your platform to glide forward in whatever direction you’re currently facing. Teleporter pads sprinkled across the landscape would either transport you within a world, or bounce you via hyperlink to a new one as your peg body passed over it. On some platforms, like the Vive, a new page load would annoyingly drop you out of immersive mode. But on platforms like Cardboard, the reappearance of the browser’s address bar had only minimal impact on the experience. It was rag-tag, but I’d done it: I’d built a functioning multiplayer, browser-based, immersive virtual reality demo with graceful degradation from a 6DOF immersive experience down to a plain old laptop screen experience.
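As a rough sketch of that single glide mechanic (my own simplified math, using yaw and pitch angles rather than the demo’s actual Three.js camera vectors), moving forward along the gaze direction might look like:

```javascript
// Convert a gaze direction (yaw and pitch, in radians) into a unit
// forward vector, using the usual spherical-to-Cartesian conversion.
function forwardFromYawPitch( yaw, pitch ) {
  return {
    x: -Math.sin( yaw ) * Math.cos( pitch ),
    y:  Math.sin( pitch ),
    z: -Math.cos( yaw ) * Math.cos( pitch )
  }
}

// While the primary button is held, step the user forward along
// whatever direction they are currently facing.
function glide( position, yaw, pitch, speed ) {
  const dir = forwardFromYawPitch( yaw, pitch )
  return {
    x: position.x + dir.x * speed,
    y: position.y,  //  Hold altitude so the peg body stays grounded.
    z: position.z + dir.z * speed
  }
}
```

Holding the y coordinate fixed keeps the peg body on the ground plane even when the user is looking up or down.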
Herding cats: During a Friday “TGIF” meeting at Google Creative Lab in early 2016, I invited colleagues to test drive my WebVR demo. The majority of the approximately 60 participants were present in person, but some joined remotely as well. Photograph, Varathit “Tu” Uthaisri.
It was time to give this thing a real kick in the tires. During one of our regular Friday “TGIF” meetings, I invited everyone at Creative Lab to jump into this virtual universe together. My audience of about 60 coworkers joined primarily via their smartphones, opting for the “magic window” AR experience. A few folks suffered from crashed browsers as my server fed too much data to their phones at once. (Oops.) But for the most part it worked; an important stepping stone for WebVR—or at least for my own journey within it. I could livestream the positions and orientations of everyone’s heads to each other so that we could “see” each other in the virtual world. (And this was in the first half of 2016—very early days for the medium.)
I made a subsequent iteration purely for the Vive that focused on hand controllers. Hands were represented by “flashlights” that would leave a fading history trail of light, allowing users to write vanishing messages in the air to each other, or to just leave super cool light trails when carving around the map at high speed. This was how I learned to record and play back a user’s position and orientation data, rather than simply live stream it. I eked out every shortcut imaginable in order to make the process efficient: serializing the translation / rotation / scale matrices, stripping out the scale values (as there was no need for them in my context), experimenting with frequency of broadcast, and tweening those received values.
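A stripped-down sketch of that record-and-tween approach (scalar values only, with names of my own invention; the project’s real recorder serializes full transforms and lives in the open-source repository):

```javascript
// Record-and-play-back for pose data: store sparse timestamped
// samples, then tween between neighboring samples on playback so a
// low recording or broadcast frequency still yields smooth motion.
function createRecorder() {
  const frames = []
  return {
    record( time, value ) {
      frames.push({ time, value })
    },
    //  Linearly interpolate the recorded value at an arbitrary time.
    sample( time ) {
      if( time <= frames[ 0 ].time ) return frames[ 0 ].value
      const last = frames[ frames.length - 1 ]
      if( time >= last.time ) return last.value
      let i = 1
      while( frames[ i ].time < time ) i ++
      const a = frames[ i - 1 ]
      const b = frames[ i ]
      const t = ( time - a.time ) / ( b.time - a.time )
      return a.value + ( b.value - a.value ) * t
    }
  }
}
```

A real pose recorder applies this per component, using spherical interpolation (slerp) for rotation quaternions rather than plain linear blending.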
I thought I was breaking ground. Creative Lab’s directors were less impressed: here was a medium that was expensive to develop for, had only a sliver of a user base, and had yet to evolve working business models. It was time for me to find a home where my enthusiasm for WebVR was shared rather than maligned. That home turned out to be Google’s Data Arts Team—a group once organizationally housed within Creative Lab, but now attached to Google’s hardware division. And that division was eager for viral content to promote its newly released Daydream View hardware. Finally, alignment.
VRController for Three.js
By the time I joined Google’s Data Arts Team to work on WebXR-related things, I had already begun experimenting with abstracting away the Web Gamepad API and VR hand controller quirks that I was quickly running up against. My work eventually resulted in the open-source VRController toolkit for Three.js. We naturally used VRController to power Dance Tonite’s multi-platform hand controller support. I also used VRController for my personal project, Space Rocks, which pushed me to invent multi-channel haptic feedback routines that I then merged into VRController so that everyone could take advantage of these intuitive and easily configurable haptic commands. To learn more, see the VRController project entry or read the Space Rocks Technical Deep Dive. In the meantime, here’s a TLDR:
Supporting a primary button across platforms (usually, but not always, a device’s “trigger” button) is as simple as this:
if( controller.getButton( 'primary' ).isPressed ) fire()
Or even this:
controller.addEventListener( 'primary press began', fire )
And adding haptic routines is as simple as naming a haptic channel and sending it a list of vibration intensity commands:
controller.setVibe( 'cannon recoil' )
.set( 0.80 )  // 80% intensity.
.wait( 20 )   // Wait a fraction of a second.
.set( 0.00 )  // Then stop the kickback.
Seeking director Michel Gondry
Google’s Data Arts Team has a rich history of collaborating with musicians on technology-driven music videos. We knew we were aiming to make a great browser-based virtual reality music video experience—but that required locking down a musical act, as well as a director that could add cultural relevance to a project that might otherwise be viewed derivatively as clunky tech company marketing. We were very excited about the prospect of working with LCD Soundsystem—one of our mutually favorite bands. We were overjoyed when they agreed to let us use their then-unreleased track for our collaboration.
Meanwhile, we were still in search of a director. One hero that we had eyes on was a personal favorite of mine: French filmmaker, Michel Gondry. I had grown up adoring his music videos and had actually met him ten years prior during a cover shoot for Res Magazine’s July 2006 issue. The shoot was conducted in the Brooklyn loft of Juliette Cezzar—a friend and talented graphic designer who had recently secured Res (and the recurring work of designing its print issues) as a client. The night beforehand we’d helped photographer Autumn de Wilde assemble some of her playful set pieces. (Juliette and Autumn would later collaborate on the book “Elliott Smith”—a collection of Autumn’s photographs as well as collected stories and interviews memorializing the singer. Autumn’s daughter, Arrow, has since grown up to be a bit of a rockstar herself.)
Michel Gondry, wearing a blue felt suit, winds a rubber-band airplane during a Res Magazine photo shoot in Williamsburg, Brooklyn, April 2006. Left to right: the back of a wardrobe assistant, the left arm and hand of Res co-editor Sue Apfelbaum, a seated Arrow de Wilde looking onward, filmmaker Michel Gondry, designer Juliette Cezzar, photographer Autumn de Wilde, and photo assistant. Photograph, Stewart Smith.
Print issues of Res Magazine were accompanied by DVDs of the video work featured within. I counted myself undeservedly fortunate when my music video for Grandaddy’s “Jed’s Other Poem” appeared in Res’ Volume 9 collection of DVDs. Coincidentally, LCD Soundsystem’s “Daft Punk Is Playing at My House” had featured among the previous run of Res’ Volume 8 collection—and that LCD video gently references Michel Gondry’s visual works, no less. Perhaps some star-fated alignment of myself, LCD Soundsystem, and Michel Gondry was slowly manifesting? (Spoiler: It was not.)
From my obsession with Gondry’s Palm Pictures collection, I knew that Michel used to have recurring childhood nightmares about his hands becoming abnormally large—something that would later become the inspiration for his Foo Fighters “Everlong” music video. Taking this as a cue, I created a quick functioning prototype: A user could don their VR headset, pick up their hand controllers, and see photorealistic human hands where their actual hands were located. (I attempted to find a 3D model of hands that looked similar to Gondry’s—which is to say, a slender white man’s hands.) With a flick of the controller’s thumb sticks, these virtual hands would scale up to abominable size—and were just as easily soothed back to normality. I was delighted to include this in a package of assets that accompanied our Data Arts Team proposal to collaborate. I doubt Gondry ever tried it out, however. I got the sense it just wasn’t the right moment for him to be receptive to my clumsy non-verbal attempt at emotional connection. Perhaps another time.
Meanwhile, we were already working with Jonathan Puckey and Roel Wouters—and it was soon obvious all around that with this duo our recipe for success had been in hand all along. The proof is in the results.
Our launch party for Dance Tonite was epic. We premiered our work to excited attendees participating in Google’s massive annual developer conference, Google I/O. (This was my second project-unveiling at I/O, my first being Sundar Pichai’s debut of Chrome Racer during his keynote in 2013.) We held the party in the enormous ballroom tent adjacent to the Shoreline Amphitheater grounds, with the virtual rooms from the Dance Tonite experience realized as actual physical rooms where attendees could don VR headsets and record their dances. The center of the event space contained a spectacularly long ball pit that raucous revelers made full use of.
VRScout’s coverage of Dance Tonite’s launch party during Google I/O 2017 in Mountain View, California.
One guest that I was particularly excited about was Kent Bye (Voices of VR) who kept the party atmosphere in full swing with his enthusiastic participation. (Thanks Kent! And lovely to meet you again at SXSW a year or two later.) You can catch some of his moves in the VRScout video above. In his post-party writeup, VRScout’s Jonathan Nafarrete gushed “Unlike the more subdued VR demos I’ve grown accustomed to at tech conferences, this was an all out dance party, and likely the most fun you can have in any web VR experience.” Thank you, sir. Getting users out of their seats and moving has been my jam since Roll It.
For the Google I/O launch party we implemented our own cult-like dress code. Nothing says “team spirit” like a group of under-slept VR nerds in yellow Vans.
While this launch party in May unveiled Dance Tonite to Google I/O attendees and press, we waited to make the experience publicly available until August in order to align with LCD Soundsystem’s album and single release schedule. Dance Tonite went live on August 22nd to a wave of positive press previews.
LCD Soundsystem, live
The cherry on top of this entire experience was the free LCD Soundsystem concert at the Shoreline Amphitheater that closed out the multi-day Google I/O conference. (Originally live-streamed from this YouTube link.) I’m sure there must have been some discussion; some skepticism on the band’s part about implicitly embracing a giant tech company by collaborating on our VR music video, and then going further by performing at this conference. I don’t know how to properly answer or unpack that, but I do know that as an individual person working on this creative project with a small group of other individuals, I felt so privileged and thankful to be in this audience and to be on the receiving end of this band’s energy. You played my two favorites, “All My Friends”, “New York, I Love You But You’re Bringing Me Down”—as well as my new runner up, “American Dream.” Thank you, thank you, thank you.
A clip of “All My Friends” from LCD Soundsystem’s performance at Google I/O 2017 as captured by YouTube user Ernest.