motion capture

Latest

  • SoftEther's sensor-laden QUMA robot demonstrates poses, intimidates your acting coach (video)

    by Darren Murph
    07.24.2011

    A solution in search of a problem, or a solution to a problem that you were too proud to cop to? SoftEther has just revealed what might be the final blow to Barbie's distinguished career: the sensor-splashed QUMA. So far as we can tell, the human-shaped puppet contains myriad sensors to pick up precise bends and flexes, and then pipes that information to a screen. Aside from showing your team of ballerinas exactly how their routine should look, we're guessing that the real future here is in far more sophisticated tasks -- things like artificial intelligence, major motion pictures and scientific research. As the saying goes, a video's worth a zillion words, so have a peek for yourself just after the break.

  • Aiken Labs brings 9-axis modular motion sensing to Android, we go hands-on (video)

    by Zach Honig
    06.08.2011

    We already had a chance to try out Immersive Motion from Aiken Labs at CES, but now the nine-axis modular sensing system is making its way to Android and other mobile platforms, including iOS and Windows Phone. The more compact battery-powered server brings motion-controlled gaming to mobile environments, capturing position data from matchbox-size modular sensors that you can tape to a wooden sword or Viking helmet for live-action outdoor role-playing, or on your paws and dome during a virtual jam session with friends, for example. The mobile kit includes a pair of wireless sensors with a 50-foot range that you can attach to literally any accessory or appendage, and is expected to sell for about $300 when it ships later this year. You'll also be able to connect up to two smaller wired sensors to each wireless sensor, for about $50 a pop. The kit's price tag makes it cost-prohibitive for all but the most hardcore gamers and devs (there's an SDK available as well), but Aiken hopes to make its flagship product more affordable if it's able to sell the kits in high volume. The tool has applications in other industries as well, including research and Hollywood, where it could be used as a (relatively) low-cost outdoor motion-capture suit. The early version we saw at E3 today is definitely not ready to head to production, but we're still months away from an actual release, giving Aiken some time to improve accuracy, and perhaps find a way to reduce that price. Jump past the break to see how it works. Tim Stevens contributed to this report.

  • EVE Spotlight: An interview with Clear Skies creator Ian Chisholm

    by Brendan Drain
    06.02.2011

    EVE Spotlight is a biweekly feature in which we interview prominent members of EVE Online's player community or development team. Every two weeks, we'll be shining the spotlight on a player or developer who has a significant impact on EVE to highlight the efforts of EVE's most influential people. EVE Online is well-known for its community's awesome cinematic productions, and no film is more renowned than the incredible machinima Clear Skies. Directed by Ian Chisholm, Clear Skies seamlessly merges in-game EVE footage with scenes composed using Valve's Source development kit. The films follow the adventures of captain John Rourke and his crew aboard the Minmatar Tempest class battleship Clear Skies. With more luck than sense, the Clear Skies crew continually finds itself in sticky situations but manages to come out on top. The first Clear Skies film won the award for best long-format film at the 2008 annual Machinima Filmfest, and a second film solidified the series' huge cult following. Clear Skies has even inspired other players like Kyoko Sakoda to produce their own cinematic masterpieces set in the EVE Online universe. The third and probably final film in the Clear Skies series was released earlier this week, absolutely shattering all expectations. In this massive edition of EVE Spotlight, I interview Clear Skies creator Ian Chisholm to find out all about the production of Clear Skies III.

  • Hitman: Absolution using 'Avatar' mo-cap tech, Hollywood actors

    by Ben Gilbert
    05.12.2011

    Hollywood Reporter has managed to sneak in a few questions to IO Interactive game director Tore Blystad about the forthcoming Hitman: Absolution, and got an earful back about the development process. "We've designed a more stylized, more serious, and darker game this time around in both the story line and the visuals," Blystad told HR. Blystad also revealed that his studio employed Giant Studios for the game's motion capture -- the same studio that James Cameron used for Avatar -- and that Keith Carradine (Cowboys & Aliens) and Marsha Thomason (White Collar) will be voicing two main roles. Carradine is Agent 47's antagonist this time around, while Thomason will act as 47's handler. Blystad's betting that the theatrical approach his studio is taking to voice acting and motion capture will dovetail nicely into the next Hitman film. "The hope is that the movie will be going in a similar direction, and then when they both come out they will speak the same language," he said. In Absolution, Agent 47 finds himself on the run from police while simultaneously hunting "his most dangerous contract to date." As promised, more details will be revealed next month at E3.

  • Kinect keeps surgeons on task, Nintendo 3DS might assist optometrists with diagnoses

    by Sean Hollister
    03.21.2011

    The latest generation of gaming gadgets does some nifty tricks, and one of the niftiest it might perform is assisting the realm of medicine. Microsoft's Kinect sounded like a candidate for surgery, and this month real-life surgeons have actually put it to use -- Sunnybrook Hospital in Toronto, Canada rigged the Xbox 360 depth camera to its medical imaging computer. Now, doctors don't have to scrub out to manipulate an MRI scan, or even appoint a peon to the task -- rather, they simply raise their bloodied gloves and dive into the digital imagery with a wave of a dexterous hand. Meanwhile, the American Optometric Association has expanded upon its initial praise of Nintendo's 3DS, saying the autostereoscopic 3D handheld "could be a godsend for identifying kids under 6 who need vision therapy." Though Nintendo's warning labels had originally incited a bit of fear among parents, the organization says that kids who can't experience the 3DS to its full potential may have amblyopia (or other vision disorders) that can be more easily treated the earlier it's caught, though one doctor interviewed by the Associated Press contends that kids with amblyopia may not know what they're missing to begin with -- so don't necessarily expect a panacea, folks.

  • A smattering of topics from Star Wars: The Old Republic

    by Eliot Lefebvre
    03.18.2011

    Fridays mean new updates for fans eagerly waiting on Star Wars: The Old Republic, and while last week saw the kickoff of PAX East and a hands-on demo (which we played), this Friday sees discussion on a wide range of different aspects. For starters, the latest Fan Friday feature has been posted, showing off new concept art and a fansite spotlight as well as a few new Sith avatars for forumgoers. A new developer diary is also available, which discusses cinematic animations, one of the major selling points for the game. Certainly the motion capture helps, but as the diary notes, it's not quite as simple as suiting up some capture actors and getting their raw data. Even for a simple scene, there's some fairly elaborate work necessary to make the animation and overall environment feel convincing. The end of the entry is devoted to several community questions regarding flashpoints, fresh in everyone's mind after the aforementioned demo at PAX East. If you're curious about how loot will be balanced in a dungeon with multiple storyline options, take a look at the full entry to get a clearer picture of how the system will be implemented in Star Wars: The Old Republic.

  • Kinect hacked for home automation, does your mood lighting for you (video)

    by Michael Gorman
    03.11.2011

    Microsoft's Kinect has become quite the hacking hotbed -- the fields of medicine, music, and even shadow puppeteering have all benefitted from the peripheral's incredible versatility. And now, to the delight of home automation nerds everywhere, an enterprising young hacker has rigged a Kinect to automate the lighting in his home. By positioning the camera bar in a corner to track his movements, connecting it to the automation controller, and coding on/off commands, he's able to control the lights throughout his geektastic domicile. The automation logic turns on the lights when he enters a room, localizes them to wherever he's standing, and switches them off when he leaves. One less thing to worry about -- here's hoping a method for spotting our perpetually misplaced keys is in version 2.0. Vid's after the break.
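
    The control logic here amounts to a zone check: map the tracked position to a room, switch that room's lights on, and switch them off once the position leaves the zone. Below is a toy sketch of that idea in Python; the room boundaries, light names and send_command hook are hypothetical stand-ins for whatever automation controller the hack actually talks to, not details from the video.

    ```python
    # Hypothetical zone-based light control driven by a tracked (x, z) floor position.
    ROOMS = {
        "living_room": (0.0, 3.0, 0.0, 4.0),   # x_min, x_max, z_min, z_max in meters
        "kitchen":     (3.0, 6.0, 0.0, 4.0),
    }

    def room_for(x, z):
        """Return the room whose floor zone contains the tracked position, if any."""
        for name, (x0, x1, z0, z1) in ROOMS.items():
            if x0 <= x <= x1 and z0 <= z <= z1:
                return name
        return None

    def update_lights(x, z, current_room, send_command):
        """Turn lights on when a zone is entered and off when the previous one is left."""
        new_room = room_for(x, z)
        if new_room != current_room:
            if current_room is not None:
                send_command(current_room, "off")   # left the old zone
            if new_room is not None:
                send_command(new_room, "on")        # entered a new zone
        return new_room
    ```

    Feeding each skeleton update from the Kinect through update_lights and remembering the returned room is enough to get the on-when-entering, off-when-leaving behavior shown in the clip.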

  • Milo's 'drama director' discusses emotion-capture

    by Griffin McElroy
    02.28.2011

    Milo and Kate -- Lionhead Studios' industry-wowing Kinect demo, first revealed at E3 2009 -- has been in limbo following last September's comment from Microsoft Game Studios boss Phil Spencer, who wasn't "convinced" the product would make it to market. We're still waiting with bated breath for news of some kind of playable implementation, but in the meantime, Milo's so-called "drama director," John Dower, has released a video diary featuring the real Milo (well, the mocap actor) and a behind-the-scenes look at creating the emotion within this would-be virtual boy. Check out the video (which contains a fair share of gameplay footage) after the jump. We warn you, though: It's just as intriguing and exciting as the first time you saw "Project Milo," which might send you into uncontrollable spells of worry that a finished product will never materialize. As always, we'll keep our fingers crossed.

  • Lockheed Martin's CHIL blends motion capture with VR, creates zombie engineers (video)

    by Tim Stevens
    01.28.2011

    Computer-aided design is a great way to build products, but does it let you bust a funky move while wearing some crazy glasses and gloves? Heck no. You need Lockheed Martin's CHIL for that. It's the Collaborative Human Immersive Laboratory: virtual reality goggles and gloves combined with motion capture, letting teams of engineers work together in a virtual space. You can see it in action below, used first for installing polygonal munitions into a rendered version of one of the company's F-35 Joint Strike Fighters, then for doing a little VR tai chi. A Lockheed rep promises that this enables the team to ensure the plane can be more easily and affordably maintained, but we just see it as a high-tech training tool for the company's world-renowned synchronized dance teams.

  • L.A. Noire's amazing MotionScan facial capture system demonstrated (video)

    by Tim Stevens
    12.17.2010

    In gaming, 3D graphics get more powerful, environments get more expansive and enemies get more intelligent, but facial animations still haven't progressed much since Pac-Man chomped his first power pellet in 1980. Finally, there's a major breakthrough, courtesy of Australian company Depth Analysis. It has developed technology called MotionScan, which enables a high-res 3D recreation of a person's face -- capturing not just bits and pieces of facial animation but the actor's entire head, right down to the hairstyle. It's getting its first use in next year's L.A. Noire, a 1940s murder mystery game from Rockstar for PS3 and Xbox 360, and while we don't know enough about the title to say whether it's worth plunking down a pre-order now, after watching the video embedded below it's clear that the bar has been raised.

  • Quantic Dream renovates mocap studio

    by Justin McElroy
    12.13.2010

    A lot of times game journalists complain that December is light on news, with all the big releases for the year behind us. But we've always thought they were just a bunch of Negative Nancys. In an industry as robust and busy as ours, how tough can it be to find something to write about? ... So, umm ... Quantic Dream renovated its motion capture studio. Now it's got like 64 cameras, and some sound-proof curtains and stuff, so that's ... you know, a good number of cameras. So ... We were going to make a Heavy Rain joke in which we suggest it "renovate" its voice capture studio by burning it to the ground. ... You know, because the acting was so bad? But it turns out that this new studio can do voice capture too, so that's not really that funny anymore. So, yeah.

  • Hack turns Kinect into 3D video capture tool

    by Sean Hollister
    11.14.2010

    We all knew this would inevitably happen, but seeing it in action is something else -- the Kinect transformed by the power of open-source drivers into a true 3D video camera for capturing oneself. UC Davis visualization researcher Oliver Kreylos fed the streams from his peripheral's infrared and color cameras into a custom program that interpolated and reconstructed the result, generating a mildly mindblowing 3D virtual reality environment he can manipulate at will. And if it makes him look a little bit like the proficiently penciled protagonists in Take On Me, that's just the cherry on top. Don't miss the videos after the break to see what we're talking about.
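
    At its core, that reconstruction is back-projection: every depth pixel is pushed through a pinhole camera model into 3D space and colored from the registered RGB stream. Here's a minimal sketch of the idea in Python/NumPy, using placeholder intrinsics (fx, fy, cx, cy) rather than the Kinect's real calibration -- a rough illustration, not Kreylos' actual code.

    ```python
    import numpy as np

    def depth_to_point_cloud(depth_mm, rgb, fx=580.0, fy=580.0, cx=320.0, cy=240.0):
        """Back-project a depth image (millimeters) into a colored 3D point cloud.

        depth_mm: (H, W) depth values in mm, 0 where there is no reading.
        rgb:      (H, W, 3) color image already registered to the depth camera.
        fx, fy, cx, cy: placeholder pinhole intrinsics, not a measured calibration.
        """
        h, w = depth_mm.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))    # pixel coordinate grid
        z = depth_mm.astype(np.float32) / 1000.0          # depth in meters
        valid = z > 0
        # Pinhole back-projection: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy
        x = (u - cx) * z / fx
        y = (v - cy) * z / fy
        points = np.stack([x[valid], y[valid], z[valid]], axis=-1)   # (N, 3)
        colors = rgb[valid]                                          # (N, 3)
        return points, colors
    ```

    Render those points from any viewpoint and you get the free-viewpoint effect seen in the videos.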

  • Motion capture game teaches tween girls to avoid the come-ons of boys from the Metaverse

    by Paul Miller
    07.29.2010

    The University of Central Florida has been awarded a $434k grant to develop a video game for tween girls. The game, which is still in development, is apparently designed to allow girls to "practice" rejecting peer pressure from boys for sex and make-outs. Of course, the only natural way to immerse the player in such a game is to put her into a motion capture suit, where her jittery marionette of an avatar is surrounded by the most diabolical 3D models of teenage males obtainable on that $434k budget -- if it were paid to Neal Stephenson in 1992. "It's a place to practice where there aren't any social consequences," says Professor Anne Norris, who is heading the project. But who will protect them from the psychological consequences, Anne? And will there be any escape from the meta-virus? Check out the unintentionally hilarious video after the break.

  • VideoMocap creates 3D animation from any 2D clip (video)

    by Sean Hollister
    07.12.2010

    If you've ever used a video editing program, you might be familiar with the concept of "keyframes," which define the beginning and end of a particular segment. Seeing where you came from and where you're going, the computer interpolates what's in between and creates smooth animation as a result -- the very same technique that students at Texas A&M University use to create motion capture that doesn't require arrays of cameras or ping-pong balls. Dropping the laws of Newtonian physics into their algorithms, Xiaolin Wei and Jinxiang Chai claim to have whipped up a computer program that can turn most any 2D video into simple 3D animation in real time, with just a few keyframes to start out. For instance, in a complex weightlifting segment 310 frames long where the camera panned, tilted and zoomed, animators had only to position eleven joints in thirteen keyframes (and make seven minor adjustments) to get the entire animation to turn out. See it in action after the break, or read their entire SIGGRAPH paper at our more coverage link.
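
    The in-betweening half of that pipeline is easy to illustrate on its own: given poses at a handful of keyframes, intermediate poses are estimated between them. The sketch below assumes plain linear interpolation of 3D joint positions; the actual VideoMocap system layers physics and image-matching constraints on top of this kind of interpolation, which this toy version leaves out.

    ```python
    import numpy as np

    def interpolate_keyframes(key_times, key_poses, query_times):
        """Linearly interpolate joint positions between keyframes.

        key_times:   (K,) sorted frame times of the keyframes (K >= 2)
        key_poses:   (K, J, 3) positions of J joints at each keyframe
        query_times: (T,) frame times to fill in
        Returns a (T, J, 3) array of interpolated poses.
        """
        key_times = np.asarray(key_times, dtype=float)
        key_poses = np.asarray(key_poses, dtype=float)
        out = np.empty((len(query_times),) + key_poses.shape[1:])
        for i, t in enumerate(query_times):
            j = np.searchsorted(key_times, t, side="right") - 1
            j = int(np.clip(j, 0, len(key_times) - 2))        # bracketing segment
            a = (t - key_times[j]) / (key_times[j + 1] - key_times[j])
            a = np.clip(a, 0.0, 1.0)                          # clamp outside the key range
            out[i] = (1 - a) * key_poses[j] + a * key_poses[j + 1]
        return out
    ```

    For the weightlifting example above, key_poses would be a (13, 11, 3) array -- thirteen keyframes of eleven joints -- and query_times the 310 frame indices of the clip.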

  • Ford assembly line uses mo-cap tech to build cars years ahead of time (video)

    by Sean Hollister
    05.27.2010

    Years ago, Ford would have to physically build all the parts for a new vehicle, and only discover afterwards whether it was feasible to have humans assemble the contraption. Now, it relies on the same motion-capture systems used to shoot your favorite 3D movies and games to test the vehicle's construction in virtual reality, years before a single scrap of metal needs to be cut. IDG got to see the system in action at Ford's Assembly Ergonomics Lab in Michigan; you can find their report at the source link and a video after the break.

  • OptiTrack mixes motion capture with a virtual camera for delicious, Avatar-esque results

    by Paul Miller
    03.11.2010

    We knew virtual camera systems were starting to gain traction, particularly in the world of cinema and within James Cameron's little set of toys, but it's pretty wild to see one in action. NaturalPoint is showing off its OptiTrack motion capture system at GDC, a budget-friendly multi-camera setup (if $6k is your idea of budget-friendly), but it also has a prototype of sorts of its upcoming virtual camera system. The camera's orientation and movement are actually tracked in the same way a motion capture suit is, and if you're in the same tracking space as a motion capture actor you can do "real" camera work with a live 3D rendered preview of the action. The shoulder-mounted camera has controls for virtual tracking and dolly moves, along with zoom, and has zero problem delivering that shaky handheld look that's all the rage in visual effects these days. There's no word on how much this will retail for, but despite the fact that we have absolutely zero use for it we totally want one. Check out a video of it in action after the break.
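
    Conceptually, the rig is a tracked rigid body whose pose drives the renderer's virtual camera every frame. Here's a bare-bones sketch of that mapping, assuming the tracker reports a position and a camera-to-world rotation matrix in the capture volume's coordinate frame; the function and conventions are illustrative, not NaturalPoint's API.

    ```python
    import numpy as np

    def view_matrix(position, rotation):
        """Convert a tracked camera pose into a world-to-camera view matrix.

        position: (3,) tracker-reported camera position in the capture volume
        rotation: (3, 3) tracker-reported camera-to-world orientation
        The returned 4x4 matrix can be handed to whatever engine renders the
        live preview of the mocap actors.
        """
        view = np.eye(4)
        view[:3, :3] = rotation.T                 # inverse of a rotation is its transpose
        view[:3, 3] = -rotation.T @ np.asarray(position, dtype=float)
        return view
    ```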

  • L.A. Noire's mocap system claims to set new bar for 3D performances

    by Ben Gilbert
    03.03.2010

    "Traditional motion capture could never bring to life the subtle nuances of the chaotic criminal underworld of L.A. Noire in the same way as MotionScan," claims Team Bondi's Brendan McNamara, commenting on MotionScan. It's the new motion capture technology being employed first by his studio's moody crime game, L.A. Noire. In a press release yesterday, Depth Analysis announced the new tech and touted its many applications in the forthcoming Rockstar title. Allegedly, MotionScan uses "32 high-definition cameras to capture true-to-life three-dimensional performances at up to 30 frames per second," thus allowing for the supposedly "emotional performances" that McNamara says make L.A. Noire "a truly unique and revolutionary game." Aside from the claims of higher quality, the mocap system supposedly has lower operation costs due to a streamlined post-production processing time. With any luck, we'll finally see all this big talk in action -- and compare ir with other performance-capture scenes in games like Uncharted 2, Alan Wake and Heavy Rain -- this September when the game arrives on store shelves.

  • Fun facts about Alan Wake's facial animation

    by Richard Mitchell
    02.26.2010

    CaptiveMotion has announced that it has completed work on the facial animation for Alan Wake. The company provided facial capture services for the developers at Remedy to use in the upcoming thriller. One of the more intriguing aspects of the announcement is that CaptiveMotion's facial capture was actually performed separately from the full-body motion capture used in the game, which was completed six months prior. As a result, CaptiveMotion had to "present the data in a fashion that Remedy could easily integrate it into their pipeline." The capture technology itself is called Embody, and it allows up to 1,600 markers to be mapped on a single motion capture actor. See it in action in the (somewhat bizarre) video after the break.

  • Bayonetta's dancing is really 'sexy'

    by JC Fletcher
    11.24.2009

    Did you think the footage of Bayonetta summoning intricate torture devices was disturbing? Wait until you see this context-free prototype dancing footage, featuring everyone's favorite giant witch cutting a motion-captured rug (after the break). We don't know if it's the fact that the animations have yet to be finalized in this sequence, or if it's the idea of Bayonetta taking a break from her brutal revenge quest to get down, or if it's simply Bayonetta's flagrantly inhuman body proportions, but yikes. What kind of touch-ups did Platinum do after this prototype stage? " ... on Kamiya-san's orders," animator Uchi says, "we accentuated the movements of her waist and butt to make it over-the-top in the sexiness department." You'd have to accentuate the hell out of those movements to go from that to "sexy."

  • LA Noire characters revealed by mocap casting call

    by Andrew Yoon
    10.26.2009

    Rockstar's LA Noire has been shrouded in so much mystery that its very existence has been called into question. A recent casting call for actors not only confirms the game's continued development, but also reveals the many (if not all) characters to be featured in the game. The call asks for "STRONG ACTORS, able to handle LOTS OF dialog" for a cast of supporting characters, comprised mostly of detectives, police officers and the occasional crook. The casting call is looking for a face similar to actors Gary Cooper or Gregory Peck (pictured above) for the protagonist -- Cole Phelps. Co-stars and supporting roles, such as jazz singer Elsa Lichtmann, will only be required to perform for 10 days or less. However, the actor chosen to play Cole Phelps will work for three months, from November to January, to record his performance for the game. It's evident from the large roster of characters and the use of Hollywood actors that LA Noire is intended to be quite the cinematic game. However, with only three months of mocap, it still pales in comparison to the upcoming Heavy Rain -- which demanded nine months of motion capture work. [Via Superannuation; Image Source]