A few weeks ago, I had planned to visit Seattle to tour Valve’s headquarters and preview Half-Life: Alyx. That didn’t happen, but I’ve been playing the game from home, and I got to interview a couple of Valve’s developers via Skype about the breakout VR game, the first new Half-Life game in 13 years, and what could come next.
I spoke to animator Jay Benson and software developer Jason Mitchell, both of whom worked on the Half-Life: Alyx team. Like any current conversation, professional or personal, it started with a round of status updates. “We’re hunkered down,” said Mitchell, when I asked if he and his team were OK. “Yeah, we are in the quarantine zone, God help us,” Benson said, making a half-joking reference to the game, and reality itself.
Here’s the rest of our conversation, edited lightly for clarity.
CNET: The weirdest thing that hits me is that Half-Life: Alyx is set in this futuristic, or retro-future, dystopia, and I’m escaping a dystopian situation… to enter a different dystopia.
Benson: I always find it funny because, when I first came here — the Valve office, I don’t know if you’ve been to the office — it’s pretty high up and you get a good panorama, and one of the views you get is these trees that sort of circle you everywhere you look. And the first thing I felt when I came here was, this is just like in [Half-Life 2] Episode Two.
Do you consider this [to be] a Half-Life game like all the others? Or do you ever imagine a Half-Life 3 in addition to this?
Mitchell: Well, it’s entirely possible that we will visit this IP again. We had a heck of a lot of fun making this one; I wouldn’t be surprised if we wanted to make some more. I mean, obviously, at this point, we’re really still focused on finishing this up and haven’t made any decisions like that yet. But yeah, it’s certainly on the table.
Valve is a big pioneer in VR, but this is Valve’s first major VR game. I was curious how much you thought of what you’re doing in terms of furthering the idea of VR interaction. Or were you just approaching it like, ‘I just want to make the best thing possible’?
Mitchell: I think it’s both. Anytime you have a shift in input/output methods, you have an opportunity as a game designer to go and explore a bunch of really new ideas. VR obviously is that way, with just the super-high fidelity that you get from the tracked headset, but probably even more so from the tracked hands, the six degrees of freedom that you gain. There are just so many more interesting input abilities from a game design perspective. You can go too far. I’m sure you’ve been into stereo 3D movie theaters where they go out of their way to poke you in the eye with a pool cue or some cheesy thing. We don’t want to go there with VR, and make it cheesy and ham-fisted. We really want to make it serve the game as much as possible. The Half-Life universe, with its expectation of a lot of physics gameplay, is a good match for VR. Those two things go really well together.
Do you see limits to VR? Or do you have a dream of where you want people to end up feeling comfortable? In the end, is it like learning a new language? We’re all kind of rethinking these controls. Is there something you’re striving for most, or that you still feel like you’re getting to?
Mitchell: You know, one thing that really developed over the course of the game that we didn’t know that we were going to succeed at, necessarily, was hand presence. Particularly in the early days of VR, it wasn’t clear how to present to someone their hands — in a way we’re tracking these controllers, but, what are your hands? If you think about the Vive wands and where you hold them, it’s sort of down below on the wand, and there’s this big crown up top where all the sensors are. Do you show the user’s hands down where their physical hands are? Do you show the user’s hands up at the ends of the tools? Are they, like, puppeteering some hands? Or are they their hands?
When they pick up a tool, is there a hand holding that tool, or is their hand the tool? How far do you let the player’s virtual hand and physical hand separate? Because they’re going to. Even just a simple situation of pushing your hand through a table or something immovable in the virtual world, clearly the virtual hand is not gonna penetrate — or is it? There’s a question there as well, right? We just explored that whole space. We’ve been super happy with where we are with hand presence. And that’s not just because of Knuckles (the Index controllers). For all the controllers, these questions are there. We couldn’t have predicted necessarily where we would end up on that. I’m sure we can go further in the future.
You’re supporting this on all the different headsets and controllers, right?
Mitchell: Throughout development, we’ve had people playing on all the different headsets. In the very earliest days, of course, we were probably all on Vive and maybe some (Oculus) DK1s. But we have the rest in the mix around the team now. People have gotten used to the conventions that users of a particular type of controller have. Rift users are used to using their grip button to pick things up, and Vive users are used to using the trigger. We’ve kind of figured out all those mappings to be comfortable for people that are used to a particular set of conventions for their devices. We haven’t had too much trouble mapping buttons. I think there might be some squeeze interactions, crushing of some objects, that are really incidental to the gameplay, that we have on some controllers, but not others, but by and large people are able to do everything. We’ve even gone so far as supporting one controller for players that only have the use of one hand.
That was fairly challenging, in terms of finding a comfortable mapping. You get a flashlight, and for a two-handed player, that’s on the off hand; for a right-handed player, it’s on the left hand. For a one-handed player, there’s only one, so we did the work to put that on the single hand. Reloading is a different mechanic, and some puzzles that are really two-handed puzzles were modified. We want everyone to play this: seated, standing, room scale, forward-facing (for Rift 1 players who have to face their sensors). We want everyone in there. So we’ve done the work to make sure the target audience is as large as possible.
Do you have a preferred control system? I think I like Blink the most.
Benson: Obviously, there’s comfort levels. So some people, especially if they’ve not done VR before, they’re going to choose Blink as the most comfortable. At this point, because I’ve played so much, I find all three modes perfectly comfortable, and I’m happy to be in Continuous for hours at a time. But I still find myself using Blink a lot of the time. I find there’s something about Continuous (mode) where I room-scale less, because I always have the ability to make little micro movements through the environment, and something sort of switches in my brain when I go back to Blink or Shift where I remember that, oh yeah, I’m in VR, I have six feet by six feet or whatever to go around in. That’s the magic, a lot of time, peering around the corner and that kind of thing. So I ended up going back to Blink at least half the time or more, even though it’s not really a comfort thing for me. It’s just the fun of the game is better in Blink sometimes.
Mitchell: I feel the same way. It’s been interesting to see. Continuous is the most similar to a desktop or console experience. Your body is moving around on a plane; you’ve removed some degrees of freedom there by projecting your movement downward. And in fact, you can do things like just lock your hand forward, and then do all your micro adjustments with your left thumb-stick, and it’s like playing with a controller or a mouse and keyboard. It takes the sort of pantomiming fun out of the game. I totally agree, it’s been really interesting to see those different play styles evolve. I like Shift because the teleport is convenient, and you never blink out and blink back into existence and have to reassess where you are in space, because you’re kind of always there. But you still retain that room scale, pantomiming, ducking and leaning and bobbing and all this stuff that makes it VR.
How long has the game been in development?
Mitchell: About four years.
Do you imagine moving forward that future VR games will be faster to develop?
Mitchell: It’s always true that, at the end of the game, you’ve gotten really good at making that game, so there’s sort of a temptation to make more. [Half-Life] episodes, I think, came out of that same kind of thinking. I think it’s in our nature as game developers to then go and push the boundaries. With [Half-Life 2] Episode One it was pushing on Alyx’s AI to make her a really compelling companion. But then again, in Left 4 Dead 2 we were able to deliver that in a year. It comes down to, really, what your goals are and how you set your constraints for yourself. We’ve definitely gotten good at VR development, but it would come down to sort of how disciplined we were.
Well, since you mentioned Left 4 Dead… Half-Life: Alyx is a single-player experience. And I was wondering, will you adopt VR multiplayer? What do you think the benefits or challenges are in an esports multiplayer VR experience, since Valve does so much of that outside of VR? Do you feel that’s ready to happen?
Benson: There’s not some active super-plan to add in a multiplayer mode. But one of the things that I found working on the game is that watching someone play Alyx is really, really performative. Especially knowing that a ton of people will end up watching the game on Twitch streams … it’s fun watching someone in single-player just kind of pantomiming around because there’s so much physicality and crashing and ducking and weaving, and all that kind of thing. I’m not saying we’re actively developing a multiplayer title, but whenever I’m watching that, I think the spectator aspect of VR is actually really, really strong. When I’m at home — because we obviously have a VR setup at home — my wife and I will play something, and it kind of becomes a group activity in some weird way where we’ll watch each other, or sometimes the kids will watch because it’s like this weird show that Mum’s putting on. It’s always occurred to me that it’s really interesting as far as a spectator thing.
Mitchell: Yeah, obviously from a product and market-size standpoint with VR, the size that it is right now, single-player is obviously easier because you don’t have to rely upon multiple people having VR headsets and getting together, online or otherwise. But it is kind of amazing when you look at other VR titles, even something as simple and stylized as Rec Room, how much emotion comes through the other players’ simple movements. I definitely think it would be a really fertile ground to go and explore doing a co-op title. Or, multiplayer title. We just aren’t there yet.
I also think of asymmetric multiplayer. You know, many people on PCs or phones playing with a few people who are in VR headsets — introducing that element could be intriguing. I think about it especially as everybody’s at home and we’re asking that question: ‘hey, why are we not all living in VR?’
Mitchell: Yeah, we saw a prototype of a game, a kind of chess-like game. It wasn’t chess, but it was something where people were moving things around on a board, and you could see the other player, a representation of their hands and their head and stuff like that, which gave you a little bit of information about their intention. It was potentially giving away a little bit, like, “Oh yeah, they’re looking over there, that’s the part of the board where they’re thinking about.” And so I was like, “Are you guys discovering that players are head-faking each other in this game, looking at another spot to try to give subterfuge?” And they said, “Yes, we are.” Even with really subtle, abstract representations of each other, people are able to do a lot of really subtle interacting like that, which is kind of amazing. It’ll be exciting when we get to that point in VR.
Benson: When we had like two little snippets of the game that you could play in the VR home, that first day when it launched … it was an Index preorder bonus, but you could invite your friends into it who hadn’t bought the Index. There were all these multiplayer lobbies, with people just being, “Hey, come one, come all, let’s all do Half-Life together.” And so that morning, we jumped into these random lobbies that are full of customers, and it was so cool because you get to have that representation of the hands and the camera, and then they’ve all got these custom avatars with a headcrab on and all this stuff, and you see like 50 people just ransacking Russell’s lab or whatever. They’re really having a multiplayer experience where you see one guy who’s clipped through a wall to see some other part, and they said, “In here, there’s a Voodoo graphics card!” and you see all the avatars move around and follow him … there was an interesting kind of escape roomy sensation to the whole thing.
A lot of the game reminded me of Portal, too. And I wondered, will there be Portal in VR? It seems like the best thing in the world — do you all agree?
Mitchell: I think the player movement would be really tough. I’m not sure you can really play that game first-person. Maybe there’s another version of that that’s not first-person or something. Like Moss, or something, where you manipulate the environment and the character, some NPC goes through, I don’t know. Just brainstorming.
Do you think controller-free hand tracking could be usable for things like these types of games or do you see a lot of limits still, versus a controller?
Mitchell: I don’t have a lot of first-hand experience with that. But I mean, my first gut reaction would be that … the lack of buttons would make that a lower-bandwidth input method. But maybe more organic in some way, I’m not sure. I was actually at an exhibition recently where a Japanese ikebana expert — ikebana is the Japanese flower-arrangement art form — was virtually augmenting his flower arrangements in a performative way that people could see from another virtual camera, and it was interesting, but it was pretty limited. You could kind of just turn on emitting stuff, and then turn it off again with simple gestures. But I’m not sure that he would have had many other degrees of freedom as far as modifying that input besides the binary on/off and then obviously the positional.
Benson: One of the things I found more useful as a player than I realized I would was the haptic element. As an example, in the very first section of the game, there’s a kind of greenhouse, and there are little whiteboard markers where you can draw on the glass. When you move your hand to hit the collision on the glass, your hand sort of desynchronizes from the controller. But we also put in some haptic feedback, a very, very subtle buzz as you’re touching the pen and moving it around, so you don’t have any sort of dysphoria between your hand and the game hand. It’s bridged, partly, by the haptics. And so there’s the clarity-of-inputs issue, like button presses and stuff. But I also think that haptics is doing a bigger job than you think … as you’re creating that clarity between the real world and the game world. I did see a really cool thing with the Oculus hand-tracking stuff, which was really leaning into that natural design stuff that Jason was talking about: it was Jenga, but every time you successfully remove a Jenga piece, your fingers grow slightly longer. Ten moves in, you’ve got these crazy Edward Scissorhands [fingers], and oh my God. I think it’s one of those things, it’s like an Inception, another level deeper of VR design sort of eating its own tail and becoming ever more of this very, very specific, particular design of everything you think about.
One of the things I wondered: do you do anything like that with the player’s hands in Alyx?
Mitchell: We aren’t as extreme as, like, Edward Scissorhands or tentacles or anything, but we definitely do that in subtle ways. The particular tech that we have, we call it a hand pose. And it’s basically just a little volume in space that, when your virtual hand goes there, you blend from whatever your hand is doing, or whatever neutral pose, to the pose that we want. So if you grab onto a door handle, or an object, like if you pick up a matchbox, we have authored hand poses specifically so that you pick it up that way. Or if you grab a thing that has a handle, then you grab it by the handle. We didn’t know necessarily at the beginning that people would be willing to go with the flow in terms of “Oh, my hands are really doing this with the matchbox,” or whatever. But it actually was additive to the experience, to sort of diverge a little bit from the corresponding 100% of the person’s real pose.
Mitchell: I’ve used a prototype set that has eye tracking. And it’s interesting. There was enough latency in the version that I saw that it was going to be tricky to drive rendering optimizations around it — rendering things that are away from the foveated region at a lower fidelity somehow. That was not going to be the obvious first thing to do with it. But using it for control was pretty darn interesting. I played a demo where they were using it for a lock-on, a little bit like the way our gravity gloves work. But very explicitly with the eyes, and it was pretty powerful. So yeah, I’m sure that there are a bunch of really interesting things to do with eye tracking.
Speaking of accessibility, VR headsets have been tough to get lately. And it’s not the sort of thing that people instantly have at home. Could Half-Life: Alyx possibly move to being available as a PC game without VR? Or do you see this as a purely VR experience?
Benson: I think that this experience is so fundamentally designed from the ground up as basically relying on camera presence and hand presence in essentially every single thing that happens. I’m sure that someone’s going to do the mod. I would imagine even if we all sat down to say, “What would a 2D mode be like?” you’d probably eventually just make an entirely different game, because everything is so incredibly entrenched in hand and camera and presence.
Do you have any thoughts on the whole mixed-reality landscape? Like bringing some mixed reality into VR via passthrough cameras — it seems like that’s the growing trend, VR headsets that promise to blend reality into them.
Mitchell: It sounds super exciting. We haven’t done anything concrete with that yet.
Do you feel that working in VR is fundamentally different than working on other PC games, in the sense that it’s driving you to think in different directions as a company? A lot of it becomes an experiential element, almost like a theme park or something a little bit different.
Benson: My experience making stuff in that vein — Half-Life is kind of famous for the stuff you’re describing, where it’s super, super crafted, and we care tons about where the player is looking and for how long — is that it just ended up being this serendipitous thing: all the stuff that VR is good at happened to be really core to Half-Life as a franchise. As an example, one of the things in Half-Life 2 that was so amazing at the time, and still is great actually, was being in the room with another character, like Barney, and them meeting your eye line and just having that really strong sense that they were meeting your gaze correctly. And then if you’re in Russell’s lab in [Half-Life: Alyx], in that early section of the game, he’s talking to you, following your camera around with significantly more complexity, and all the little micro movements, and you can physically move your body around him. And also you’re no longer in a game with a giant physics capsule that blocks you from getting really close to anything. Suddenly you can be much closer — the presence of being observed is something that just happened to be better in VR.
The things that are difficult in VR about creating a cinematic moment were sort of baked into the franchise, and the whole team was already super used to thinking about content in that way. So it was additive, rather than a constraint.
There’s also a familiarity in the game that it builds off of. It’s like a Half-Life game, but in VR.
Mitchell: Yeah, in a lot of ways it was getting the band back together. I mean, most folks on the team had not worked on any of the Half-Life franchise games, but many people had. And we were literally using the same code for many of the AIs and other systems; we really preserved all of the game code, even on into the Source 2 engine, and there’s lore and institutional knowledge there that was totally preserved. You know, the headcrab AI is largely the same … with some changes to make it more modern or more suited to the environments that we have. Other creatures are that way, too. And it’s the same people that remember, “Oh yeah, that’s why we did that thing.” And the same way of working is preserved — oh, hey, I have this puzzle design, let me build it out myself and then go over and grab somebody and have them play it at my desk — all the same sort of little feedback loops. It’s been fun to be a part of kind of getting that muscle going again.