By Paul White and Antony Stevens.
The First Climb
Fall 2013, Oculus DK1 + Razer Hydra
My journey into VR locomotion began with the sunsetting Razer Hydra in late 2013. An early motion controller system tracked by a low-power magnetic field, the Hydra was originally designed as a peripheral for flat PC gaming. But for some of us, it was also an unlikely hero—the Hydra was the first big key to unlocking presence in virtual reality.
It was the era of the DK1, the first of the Oculus Rift prototypes available to Kickstarter backers, which offered only rotational head tracking in those first days of VR's rebirth. Without positional tracking of the head or hands, player movement in VR projects was either bound to the analogue sticks or omitted entirely. These were the standards and limitations of the time; VR as we know it today did not yet exist.
I was working on Exploration School, an early tech demo for our built-for-VR adventure game "The Gallery." My challenge was to use the Hydra to mimic the motions of climbing a wall without using control sticks—just reach out and grab it. It sounds straightforward now, but during those early days of VR we thought it could never be done with the tech.
Holding the wired Hydra, you would reach out with your hand and press a button to capture the position of that arm on a surface. Any motion you made next would be countered and represented in game through our body persistence. If you pulled your arm down, the camera and your in-game body would counter that movement and move upward; if you raised your arm, your position would counter and you would climb down. It felt intuitive, all tech considered.
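The trick is a simple counter-offset: latch the hand's position on grab, then move the camera rig opposite any subsequent hand motion. Below is a minimal sketch of that idea, assuming hand positions reported in rig-local tracking space (as the Hydra's were); the names are illustrative, not from The Gallery's code.

```python
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    def __sub__(self, o):
        return Vec3(self.x - o.x, self.y - o.y, self.z - o.z)

class ClimbRig:
    """Counter-motion climbing: the world stays put, the rig moves."""
    def __init__(self):
        self.rig_pos = Vec3()      # camera rig root in world space
        self.anchor = None         # hand position latched at grab time
        self.rig_at_grab = None

    def on_grab(self, hand_local: Vec3):
        # Button press: capture the hand's position "on" the surface.
        self.anchor = hand_local
        self.rig_at_grab = self.rig_pos

    def on_release(self):
        self.anchor = None

    def update(self, hand_local: Vec3):
        if self.anchor is None:
            return
        # Counter the hand's motion: pulling the hand down raises the
        # rig (camera and body) by the same amount, as if the grab
        # point were fixed to the wall.
        delta = hand_local - self.anchor
        self.rig_pos = self.rig_at_grab - delta
```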
VR devs all around were experimenting with anything and everything, from climbing to flying to roller coasters, but there was no substantial test audience. Motion sickness was a concern internally, but there weren’t enough headsets in the wild to know how widespread its effect was. We knew what artificial movement felt like to us and other developers, but there was no way to know what was working and what wasn’t for various sensitivities.
When we brought Exploration School to public events, we gave players the best advice we had: “Don’t look down.”
The Bigger Picture
Spring 2014, Oculus DKHD + Razer Hydra
Those first two years saw many VR developers building single-room projects—playboxes with no need for travel or locomotion. The Oculus Rift, for all intents and purposes, was a seated experience.
Our project, The Gallery, was a larger world that needed exploration, with terrain that was organic and rugged. We wanted realism where you could walk around, look at things, and feel alive in a world. VR was predominantly blocky at the time (both graphically and otherwise), and walking with the analogue stick felt like your body was a cart behind you, changing direction to chase after you each time you turned your head. It all felt unnatural.
Tank move was one alternative. This method allowed your head to deviate from the direction you were moving, so you could pan your view around an environment completely decoupled from your body direction. Think of your head as a swiveling neck cannon, while your body is driven on tracks and controlled by a joystick. It was a fitting abstraction.
Tank move was better because it meant you could look around while you locomoted. It was also worse because of vestibular disconnect—motion sickness caused by your brain perceiving directional movement through your eyes (the headset), without physical motion detected by your inner ear (the real one). Decoupling head movement from the body ultimately decoupled stomach contents from the body as well.
More important than the freedom to look around was the freedom to move around, and we knew that the positional tracking features of the upcoming DK2 (and experimental hardware from Valve) would help dictate movement. In the meantime, we wanted to get ahead of the curve and start building for the future that VR was heading toward. Using heuristic spine modeling and a simulated height, I was able to turn the single, rotational tracking point of the DK1 into two positional tracking points: head and root.
With that inferred root, we then had the approximate location of the player's torso in relation to their head, and could adjust their body avatar accordingly. We could distinguish natural displacements, from the player crouching into a tent to peering over a balcony at the distant world around them.
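As a rough illustration, orientation plus a configured standing height is enough to infer both points. The sketch below hangs the eyes off a fixed neck pivot and drops the root to the floor beneath it; the offsets, constants, and rotation conventions are assumptions for illustration, not our production spine model.

```python
import math

NECK_TO_EYE = (0.0, 0.10, 0.08)   # assumed local offset, metres
SIM_HEIGHT = 1.75                 # assumed standing height, metres

def rotate(v, yaw, pitch):
    """Pitch about X, then yaw about Y (right-handed, Y up)."""
    x, y, z = v
    y, z = (y * math.cos(pitch) - z * math.sin(pitch),
            y * math.sin(pitch) + z * math.cos(pitch))
    x, z = (x * math.cos(yaw) + z * math.sin(yaw),
            -x * math.sin(yaw) + z * math.cos(yaw))
    return (x, y, z)

def estimate_head_and_root(yaw, pitch):
    """Turn one rotational tracking point into two positional ones."""
    # Fixed neck pivot at the simulated height: rotation-only tracking
    # still tells us how the head swings around that pivot.
    neck = (0.0, SIM_HEIGHT - NECK_TO_EYE[1], 0.0)
    off = rotate(NECK_TO_EYE, yaw, pitch)
    head = (neck[0] + off[0], neck[1] + off[1], neck[2] + off[2])
    # The torso root sits on the floor beneath the neck pivot.
    root = (neck[0], 0.0, neck[2])
    return head, root
```

Pitching the head forward displaces the inferred eye point relative to the root, which is the kind of signal that lets a body avatar separate a crouch from a lean.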
In the end, the feature never made it in. Everything was about to change anyway.
The Great Divide: Comfort and Realism
Summer 2014, Oculus DK2 + Razer Hydra
VR devkits were being released to the public in droves, now with positional tracking, and people were getting motion sick. The moment you put on a headset, comfort became an uphill battle. Using your hands and body, standing up and crouching down—it had all added so much to presence. But it had come at a cost. Any time the camera displaced without the player, it was barf city. And in an exploration game like The Gallery, you couldn't just explore the contents of your chair.
Most locomotion in VR was now split between body cart, tank move, and stick move with yaw rotation. The latter was the worst of the bunch, not only producing artificial forward-backward movement (vection), but also allowing the player to control the camera independent of their head position. Instead of a body cart, your face was the one along for the ride. If motion sickness was going to be the widespread problem it was trending to be, we would need to find a better way.
At any given moment, the human eye is making what are called 'saccadic movements.' Your eyes are constantly dancing, darting around to other things, even though to you the view seems smooth or even still. It's an imperceptible movement—a jump. The turn of a ballerina. This was the basis for VR Comfort Mode.
Rather than continuously rotating the camera over a duration, as yaw rotations do, Comfort Turns are instantaneous. You press a button, or flick the control stick, and the player camera changes its facing direction. And because it’s instantaneous, there's no visual motion for your brain to perceive, and no physical movement for your inner ear to detect—no vestibular disconnect. It goes a long way to mitigating motion sickness, and the option has remained a standard for comfort even in today’s evolved experiences.
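A snap turn needs surprisingly little code: apply the whole rotation in one frame, then wait for the stick to re-center before allowing another. A minimal sketch, with illustrative names and tunable constants (30 degrees is a common step size, not necessarily ours):

```python
import math

SNAP_ANGLE = math.radians(30)   # rotation applied per comfort turn
TRIGGER = 0.7                   # stick deflection that fires a turn
RELEASE = 0.2                   # stick must return below this to re-arm

class ComfortTurn:
    def __init__(self):
        self.rig_yaw = 0.0
        self.armed = True

    def update(self, stick_x: float):
        if self.armed and abs(stick_x) > TRIGGER:
            # Apply the whole rotation in a single frame: no intermediate
            # frames are rendered, so the eyes never see continuous motion.
            self.rig_yaw += math.copysign(SNAP_ANGLE, stick_x)
            self.armed = False
        elif abs(stick_x) < RELEASE:
            # Re-arm only once the stick returns to centre, so holding
            # the stick produces one discrete turn, not a spin.
            self.armed = True
```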
At the same time, we were still trying to make sure moving felt like moving. We began work on a VR obstacle course specifically to experiment with different locomotion styles. I went back to climbing, inspired by geodesic domes, and developed a spherical ladder that you could go all the way around. You would grab a bar and latch on, climbing the ladder until you were looking down at the ground in front of you, and eventually hanging fully upside down on the other side of the dome.
That one never made it in either, but if you’re reading this, NASA, you know who to call.
There’s Always a Lighthouse
Winter 2014, SteamVR + V minus-1
Near the end of 2014, Valve invited us and a few other select developers to a secret summit. It was there that they revealed SteamVR (and what would eventually become the HTC Vive) for the first time. Rather than the heuristic, inside-out tracking points Valve had shown at Dev Days earlier in the year, SteamVR was using real, local points on the HMD tracked within a volumetric space via “Lighthouses.”
SteamVR did something else no other HMD had before: it added hands. Instead of being tethered to a small magnetic box in front of you, controllers could now be tracked spatially by the same Lighthouse hardware your head was. It offered the genuine ability to walk around and touch things in VR. It was simply amazing.
It also threw everything we knew—and every concept we had for our game—for a loop and then out the window.
We had designed climbing around the player being fixed in meatspace (their physical space): when a player grabbed onto a bar, we calculated a fixed local offset from that point. With roomscale, those constraints were gone—you could move anywhere. Now, if the player moved positionally mid-grab, their arm would outstretch and essentially break free from the ladder and the offset. We were calculating a redundant position that the hardware now tracked right out of the box.
Likewise, tank move became irrelevant. You could walk in any direction in meatspace while simultaneously looking around normally. Our whole book on locomotion and body persistence broke overnight. The entire game halted. From that point, we had three months to not only redesign everything we had, but miraculously hit a new benchmark of 90fps for the first public demo of SteamVR at GDC 2015.
The first thing I remember us brainstorming about for the GDC Demo was movement. We had always wanted the player to move, and even though SteamVR had introduced this larger, volumetric space, it was now just a bigger box. Our plan was to utilize the volume itself as a virtual elevator, so that the whole room could move upward—and you with it.
As the GDC Demo concluded to the sweeping score of Jeremy Soule, the elevator rose up, and the walls opened on all sides to reveal a skyline with infinite possibilities and directions to explore.
[Embedded video: https://www.youtube.com/embed/sX1cum0vyWk]
Designing an Experience
Spring 2015, SteamVR + HTC Vive (Developer Edition)
After GDC, people loved roomscale; it was a new feeling for everyone. But we still had a game to build. Whatever we made for roomscale had to play into the full-scale levels we had been designing for The Gallery up to that point.
The new freedoms of positional tracking, artificial locomotion, and variable heights all at once were almost too much. They contradicted the core feeling and pacing of the experience we'd designed. We needed constraints. And a new type of comfort.
We experimented with rapid locomotion prototypes to see what landed. The goal was to align our locomotion with how we wanted the player to perceive the game. More advanced techniques felt superfluous and worked against the tone and feel of our slow-burn exploration. We were also a small team of seven, and we had to consider player fatigue—something we had no metrics on.
How long could players explore inside VR before they felt eye strain or exhaustion? How much mental load, between advanced controls and puzzles, could a player handle on top of that? And that was before even considering nausea.
The Hydra and DK2 had limited players to sitting or standing in one position; SteamVR had the opposite issue: players were literally running into walls. We needed to find a way to redirect players to the center of their physical space, so they had maximum area to work, play, and explore. And, like with the GDC Elevator Demo, we didn't want the player to feel constrained to just their room.
Artificial locomotion—traditional gamepad movement—was the jumping-off point. We experimented with overlaying the play bounds (the ostensible walls of the roomscale volume) as a grid whenever the player moved with the analogue stick. Then we simulated head bobbing like in a traditional FPS.
Body Joystick had you at the center of your room, with any offset direction and distance from that position representing the directional vector and velocity of your movement—you used your body as the controller. Arm Joystick used the motion controller itself like a flight stick (a similar method is known today as ‘Onward-style movement’).
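For Body Joystick, the core mapping is the head's horizontal offset from the room centre, pushed through a deadzone and a speed ramp. A minimal sketch under those assumptions; all constants and names are illustrative:

```python
import math

DEADZONE = 0.15    # metres of offset before movement starts (assumed)
FULL_LEAN = 0.6    # metres of offset for full speed (assumed)
MAX_SPEED = 3.0    # metres per second (assumed)

def body_joystick(head_x: float, head_z: float):
    """Map the head's offset from the room centre to a movement velocity."""
    dist = math.hypot(head_x, head_z)
    if dist < DEADZONE:
        return (0.0, 0.0)          # near centre: the "stick" is at rest
    # Scale speed with how far the player has stepped or leaned out.
    t = min((dist - DEADZONE) / (FULL_LEAN - DEADZONE), 1.0)
    speed = t * MAX_SPEED
    return (head_x / dist * speed, head_z / dist * speed)
```

Arm Joystick swaps the input source: the controller's tilt, rather than the body's offset from centre, feeds roughly the same kind of deadzone-and-ramp mapping.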
[Embedded videos: https://www.youtube.com/embed/YWtAwGeA_ds and https://www.youtube.com/embed/A5tsdRb3PFw]
I was stuck on the concept of grabbing space and came up with VR handlebars. You could reach out and grab, and, as soon as you latched, a virtual handle would appear in that space. It looked like the holodeck; when you grabbed and pulled, the whole world would move toward you. In The Gallery it went over like a fart in a spacesuit, but the technique ultimately found a fitting home in zero-G experiences like Lone Echo.
All of these methods worked against the feeling of our experience. And all of them made you feel like you were going to fall over. If you accelerated instantly, you'd get a lurching in your stomach. If you accelerated gradually, you'd get a different lurching in your stomach.
At the time, the few metrics we did have indicated that artificial locomotion wasn’t working for players. Now, with roomscale, it wasn’t working for our game either.
Blink and You’re There
Summer 2015, SteamVR + HTC Vive (Developer Edition)
One of the first forms of teleportation I worked on was Astral Navigation. You would look up in-game and see a star path between the clouds that represented the layout of the level you were in. Aim to where you wanted to go and, when you looked back down, you’d have teleported to that point in the scene. Impractical, but we were trying to explore the upper limits of what we could do with movement.
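Mechanically it reduces to a small gaze state machine: looking up past a threshold engages the star map and tracks the aimed star; looking back down commits the teleport. A rough sketch of that flow, with hypothetical names and thresholds:

```python
import math

LOOK_UP = math.radians(60)     # pitch above which the star map engages
LOOK_DOWN = math.radians(20)   # pitch below which the teleport commits

class AstralNav:
    def __init__(self, waypoints):
        self.waypoints = waypoints   # e.g. {"tower": (12.0, 0.0, -4.0)}
        self.selection = None

    def update(self, head_pitch: float, aimed_star, teleport):
        """head_pitch > 0 means looking up; aimed_star is whichever
        star the gaze ray currently crosses (or None)."""
        if head_pitch > LOOK_UP:
            # Star map visible: remember the last star the player aimed at.
            if aimed_star in self.waypoints:
                self.selection = aimed_star
        elif head_pitch < LOOK_DOWN and self.selection is not None:
            # Looking back down commits the move to the chosen point.
            teleport(self.waypoints[self.selection])
            self.selection = None
```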
At this point, Valve put out an early photogrammetry scan of their office (a rough point-cloud version similar to the current SteamVR Environment below). In it were various "information" nodes that you could teleport between to navigate the room. Rather than use artificial locomotion to slide around, you could zip from curated point to curated point. We instantly liked it—it felt cool, and it didn’t make you sick.
[Embedded video: https://www.youtube.com/embed/FtLkF8HVEco]
SteamVR hardware had lengthened the tether to the computer, but we still had a cord to fight with. In an exploration game like The Gallery, players were getting wrapped up and tangled when they tried to spin around. Immersion broke as they stopped to fix themselves, their tangled cable, and their orientation in-game. We decided to take Valve's teleportation nodes and augment them to support rotation.
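The core of a rotation-supporting node is pivoting around the player's head rather than the rig origin: yaw the rig so the player's physical facing maps onto the node's desired facing, then position the rig so the head, not the rig, lands on the node. A sketch under those assumptions, in a Y-up, yaw-only world; names and conventions are illustrative:

```python
from dataclasses import dataclass
import math

@dataclass
class Pose:
    x: float
    z: float
    yaw: float   # radians

def teleport_with_rotation(head_local: Pose,
                           node_x: float, node_z: float,
                           node_yaw: float) -> Pose:
    """Return a rig pose that puts the player's head on the node, facing
    node_yaw, no matter where they stand or face in their room."""
    # Yaw the rig so the player's current physical facing lines up
    # with the node's desired facing.
    rig_yaw = node_yaw - head_local.yaw
    # Rotate the head's in-room offset into world space...
    c, s = math.cos(rig_yaw), math.sin(rig_yaw)
    off_x = c * head_local.x + s * head_local.z
    off_z = -s * head_local.x + c * head_local.z
    # ...then place the rig so the head, not the rig origin, lands on
    # the node. The player pivots around themselves, not the room.
    return Pose(node_x - off_x, node_z - off_z, rig_yaw)
```

Because the rig yaw is recomputed on every teleport, a player can arrive facing whatever the level needs without physically spinning, which is what made it a tool against the cable.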
The first question was how best to align players in an open level so they had access to the interactions we predicted they would want.
I prototyped a way for pl