15 Minutes of Fame: Full-body WoW with motion-sensing software

From Hollywood celebrities to the guy next door, millions of people have made World of Warcraft a part of their lives. How do you play WoW? We're giving each approach its own 15 Minutes of Fame.

The boss is enraging at 7% health and you're locked on target, hunched over your keyboard in a white-knuckled frenzy to squeeze every last drop of DPS from your avatar. Finally, the beast succumbs to your assault, and you sit back, exquisitely aware of the tension crumpling your neck and shoulders and radiating into your fingertips. As you pull in a deep, shuddering breath of relief, you wonder if perhaps it might be more natural to simply stand in front of your screen and show the computer, using gestures similar to those of your character, what to do.

Now, you can.

Dr. Skip Rizzo, associate director at the University of Southern California's Institute for Creative Technologies, is head of a research project that's applying the same kind of technology used in the Xbox Kinect to World of Warcraft. The aim of the project, however, is not so much to turn games like WoW into virtual tarantellas of movement and gesture but to make games more accessible to disabled players and to open new avenues for rehabilitation, therapy and even education. The project's Flexible Action and Articulated Skeleton Toolkit (FAAST) middleware integrates full-body control with games and virtual reality applications, using tools like PrimeSensor and the Kinect on the OpenNI framework.



15 Minutes of Fame: Let's start off with the question we're all burning to ask: Can you really play World of Warcraft on a Kinect?

Dr. Rizzo: You can play the game with some simple gestures, but it's not fully evolved so that all the literally thousands of keyboard emulations are in place to be able to play in its full form.

We are a research institute. We are trying to advance 3-D user interaction, and that was the point of that project. I actually had access to the technology underlying the Kinect long in advance of the Kinect release. We worked with PrimeSense, the company that made the camera, and we had an exclusive non-disclosure kind of agreement with them for about six months prior. And so we built the FAAST software around it, which may become less relevant now that Microsoft will be releasing their API.

... I think what is most telling in the whole thing, the YouTube clip and all that, is not so much what we did but the comments that we got to it. That was illuminating -- illuminating to the point where I actually wrote about it in a publication.

This is really the future of a lot of home-based rehabilitation, to create rehab games that require natural body action to interact with the game, as well as using games that people already know and love and trying to find ways that we can specify the body therapy, the body activity to do that. Our mission has been to do that, and we see this as groundbreaking.

I noticed that many of the gestures used are surprisingly similar to the in-game animations of the character avatars.

Somebody that plays WoW all the time and has all their keyboard commands down, they're going to be able to play it much better and much more efficiently that way. It's like playing the piano, essentially -- it really is. It's a procedural skill.

What I would like to see, to really take this project to the next level, is to really see what you could realistically emulate with body action that would make it a fun and compelling way to interact with the game, and then to have competitions where you're competing in WoW in the body action format. So you're not going to be competing against some wizard who's playing the piano on the keyboard -- you know, playing Beethoven's Fifth, essentially ... not these warlords, people who are great at the game -- but you're playing against people who are using body action. And there are defined, standard body actions. And maybe you build the software so people can pick their own body actions, so if they tilt their head one way while they're picking their arm up another way, something different happens.

But open it up so there's almost like a separate league, if you will.

I can see voice becoming integrated into this mix.

Yes. Yes, yes. I don't know if you've seen Avatar Kinect. It tracks your facial expressions and captures your voice. There's a microphone in the Kinect system, so it captures your voice, and you drive a cartoon-like avatar and [do] social chat interaction with each other.

Going back to the physical side, are the individual physical commands defined by you and the system, or is the user able to individualize them?

They will be able to individualize.

... We have taken a very simple game, Space Invaders. In one instance, you can move the missile launcher from right to left by shifting your body from right to left, by leaning, by putting weight on one foot and then the other foot. And you can fire by moving your arm out forward. Now in the next instance, a minute later, we can switch it up so that you have to use your left arm swinging across your body laterally to move the missile launcher and fire with your right arm extended.

So what we're able to do is to specify the gesture -- and it can be any gesture. It could be jumping up and down that would move the thing right or left (although that wouldn't be natural). You could jump up and down to fire. The idea is that the software we built allows us to assign any gesture that can be picked up by the Kinect to emulate what the keyboard actions would be.

So we can do that, and it's just a matter of creating the software that makes a highly usable user interface, where you can select gestures to emulate the keyboard. In theory (and I believe this will become relatively commonplace), it's just like programming a PowerPoint animation. You go through the different steps and make it pop up at a certain time and play at a certain point in the video and end and so on. It's all about the user interface. I don't see any reason why we wouldn't be able to make that software available.
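To picture the kind of mapping Rizzo describes, here's a minimal, hypothetical sketch of a gesture-to-key binding table. The names (GESTURE_BINDINGS, send_key, on_gesture) are purely illustrative and are not drawn from FAAST, OpenNI, or the Kinect SDK; the idea is simply that any recognized body gesture can be rebound, even at runtime, to an emulated key press.

```python
# Illustrative sketch only: not FAAST's actual API or configuration format.
# Maps gesture names (as a skeleton tracker might report them) to keys.
GESTURE_BINDINGS = {
    "lean_left": "a",            # e.g. strafe/move left
    "lean_right": "d",           # e.g. strafe/move right
    "right_arm_forward": "1",    # e.g. fire, or cast the spell on hotkey 1
}

def send_key(key):
    """Stand-in for an OS-level virtual key press sent to the game."""
    print(f"emulating key press: {key}")

def on_gesture(gesture_name):
    """Called whenever the tracker recognizes a bound gesture."""
    key = GESTURE_BINDINGS.get(gesture_name)
    if key is not None:
        send_key(key)

# Re-binding on the fly, as in the Space Invaders example:
# swap "lean left" for "swing left arm across the body".
GESTURE_BINDINGS["left_arm_swing"] = GESTURE_BINDINGS.pop("lean_left")

on_gesture("left_arm_swing")     # emulates "a"
on_gesture("right_arm_forward")  # emulates "1"
```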

So is that the direction your research is going?

We're going in [two directions]: the direction of creating specific, game-like software for physical rehab that we design ourselves, versus the other direction, to take any keyboard-driven game and be able to translate it into a game that can be played with your body.

I've read that you'd like to see this type of technology used to help players become more physically active, perhaps even as part of a weight control plan. I'm not sure that this would have much impact with some games. If you had to "run" to the next area in WoW, for example, you'd be exhausted by the time you arrived.

Sure, but that's just a matter of tweaking. I'm not sure that that game specifically would be ideal for weight loss. I'm thinking more Call of Duty, where there aren't as many complex actions you can do. It's running a little bit, crouching, selecting a weapon, throwing a grenade ... If you could make that really meet the user, I believe that one might engage people.

But as far as some of the fighting that goes on in World of Warcraft, body action that simulates that, it's all doable. It's just a matter of the will and the funding.

How is this sort of thing going to get to the players? Is the technology going to be coming from somebody like you, or will it come from a major manufacturer or game developer, or ...?

I think it'll be the individual game studios. Once Microsoft releases their API -- allegedly the tracking fidelity is about three times faster than what's already out there, so you'll have higher-fidelity tracking for the movement -- I think once they release that open source, there are going to be plenty of games that people are going to want to play that way, just for entertainment and recreation, just like people are buying the Kinect like crazy to play the games that already exist there. I think once people get their hands on these things, you're going to have a lot more independent developers trying to do this.

It's not going to be for everybody. There's a lot of people who are just going to want to sit there. They don't want to exercise. They want to play the game, and they love the game -- and that's fair enough. But there is a segment of the population that will play these games for the exercise and will enjoy the physical activity for some types of games that are well-matched.

Where else is this technology and research headed?

We're taking all the things that we've learned that motivate and engage and compel people to interact with this content, and trying to translate them into something that can be useful for purposes other than pure entertainment. We don't want to eliminate entertainment; the entertainment is what drives people to do it. But let's try to bring along some learning, some education, some physical therapy, some cognitive therapy.

And that's really what the vision of our research group is all about. We know we can change the brain, and we know we can rehabilitate people after significant injuries so they become more functional. But it's hard as hell to get people to do that physical therapy or that cognitive therapy in its traditional form because it's so damn boring.

With so many millions of subscribers, there's a significant population out there already using the game for just that sort of thing. We've interviewed so many players who are disabled, injured, or recovering who use the game as their therapy or as their lifeline to the outside world. It's such an equalizer; they can play the game at their level and get out there as one of the inhabitants of Azeroth.

Yep. I think World of Warcraft, because of its diversity of theory, content, characters, things that go on in it, is ideally suited for this. ... [In many cases], it's not even about the rehab as much as it is the social connectedness. We can get people connected online who wouldn't otherwise have the opportunity to play games against others.

Try it yourself: FAAST is freely downloadable for use with a camera and gaming system such as the Kinect.


"I never thought of playing

WoW like that!" -- and neither did we, until we talked with these players, from Star Trek: Deep Space Nine's Aron "Nog" Eisenberg to an Olympic medalist and a quadriplegic raider. Know someone else we should feature? Email lisa@wowinsider.com.