mocap

Latest

  • Joe Skipper / Reuters

    After Math: 'Musked' opportunities

    by 
    Andrew Tarantola
    05.27.2018

    It was a week of near misses and closer hits than the tech industry probably would have wanted. Amazon's Alexa "accidentally" recorded more than a few customers' private conversations, Apple's iPhones turned out to be bendier than anticipated, and that PUBG chicken dinner of yours wound up being harder fought than anybody had previously thought.

  • Vicon

    100 years of motion-capture technology

    by 
    Jessica Conditt
    05.25.2018

    Modern motion-capture systems are the product of a century of tinkering, innovation and computational advances. Mocap was born a lifetime before Gollum hit the big screen in The Lord of the Rings, and ages before the Cold War, Vietnam War or World War II. It was 1915, in the midst of the First World War, when animator Max Fleischer developed a technique called rotoscoping and laid the foundation for today's cutting-edge mocap technology. Rotoscoping was a primitive and time-consuming process, but it was a necessary starting point for the industry. In the rotoscope method, animators stood at a glass-topped desk and traced over a projected live-action film frame-by-frame, copying actors' or animals' actions directly onto a hand-drawn world. The technique produced fluid, lifelike movements that animators couldn't achieve on their own.

  • With 'Siren,' Unreal Engine blurs the line between CGI and reality

    by 
    Edgar Alvarez
    03.22.2018

    Epic Games has been obsessed with real-time motion capture for years, but the company is now trying to take its experiments with the technology one step further. Enter "Siren," a digital personality that it created alongside a few prominent firms in the gaming industry: Vicon, Cubic Motion, 3Lateral and Tencent (which just became a major investor in Ubisoft). The crazy thing about Siren is that she comes to life using live mocap tech, powered by software from Vicon, that captures her performer's body and finger movements and live-streams them into an Unreal Engine project.

  • The real-time motion capture behind ‘Hellblade’

    by 
    Nick Summers
    08.08.2017

    In a makeshift changing room filled with Disney Infinity figures, I strip down to my boxers and pull on a two-part Lycra suit. It feels tight, and the top half shimmies up toward my waistline as soon as I stretch or stand up straight. How anyone is able to act in this thing is a mystery to me. Sheepishly, I gather my belongings and trot back to the motion capture studio that sits at the end of Ninja Theory's offices in Cambridge, England. Inside, a couple of engineers scurry about, prepping cameras and cables. For years, movie and video game studios have used mocap to bring digital characters to life. From detective Cole Phelps in L.A. Noire to the powerful Caesar in Planet of the Apes, the technology has delivered some truly moving, actor-driven performances. Normally, however, motion capture scenes are processed by an animator hours, days or weeks after they've been captured on set. It's a time-consuming process, and one that involves some guesswork. In a sparse, lifeless room, directors are forced to imagine how a take will look in the final sequence. Not so with Ninja Theory. The video game developer has a unique setup that allows Chief Creative Director Tameem Antoniades and his team to preview scenes in real time. Pre-visualisation, or pre-vis, has existed before in the industry, but it's typically limited to body tracking. Full-character modelling is rare, especially at the kind of fidelity Ninja Theory is shooting for with its next game, Hellblade: Senua's Sacrifice.

  • Bjork avatar appears in London via Icelandic mocap

    by 
    Steve Dent
    09.01.2016

    Björk is continuing her "Vulnicura" exhibition in London, but the fact that she's in Iceland didn't stop her from appearing at the Somerset House cultural center. In glorious Björk fashion, she beamed into the press conference as a colorful digital avatar to take questions alongside Somerset House director Jonathan Reekie. Meanwhile, she was back at the Icelandic Media College in Reykjavik wearing a motion-capture suit, her movements transferred to the avatar digitally via Autodesk and Unity tech.

  • 'Hellblade' takes real-time motion capture to the next level

    by 
    Edgar Alvarez
    03.17.2016

    Yesterday, during the Epic Games keynote at GDC 2016, Ninja Theory showed off a live motion capture demo for Hellblade, its upcoming AAA indie title. The results are absolutely stunning. Tameem Antoniades, Ninja Theory's chief creative director, described the real-time animation performance as historic, and people at the event seemed to validate his excitement. Interestingly enough, the game has been renamed Hellblade: Senua's Sacrifice, paying tribute to the main character in this combat-heavy story. But you're probably here for the video, so have at it -- we promise it doesn't disappoint.

  • Watch Jason do what he does best in 'Friday the 13th' game

    by 
    Timothy J. Seppala
    02.26.2016

    A slasher movie is only as good as its big bad, and that goes doubly so for interactive horror flicks. Take the upcoming Friday the 13th game, for example. To ensure that its Jason is as terrifying as possible, the team at Gun Media enlisted Kane Hodder, the man behind the hockey mask in the film franchise's seventh through tenth installments, to terrorize the forever-randy camp counselors. Even though everyone in the video below is wearing a mo-cap suit on a sound stage, it's hard not to cringe when Hodder tests the flexibility of the human leg, arm, neck and, well, just about every appendage he lays hands on.

  • The VR arcade of the future will look something like this

    by 
    Joseph Volpe
    01.24.2016

    I knew I was on solid ground. I knew that even if I misstepped, I wouldn't fall hundreds of feet, plummeting to my death in some CG Egyptian ruin. And yet, I was shaky, desperately reaching out for a handhold to steady myself, unable to calmly place one foot in front of the other as I attempted to cross a chasm bridged by a collection of meager wooden beams.

  • The Chemical Brothers bring Hollywood special effects to dance music

    by 
    Mona Lalwani
    11.25.2015

    I look around at the sea of glowing faces surrounding me in the dark of Randall's Island in New York. There's no fist pumping. Their feet aren't shuffling. Instead, they're looking straight ahead at a large hand-drawn figure on a black screen. The frame, shaped like a human body, is filled with an entangled web of white lines. It appears to stand behind a barricade of light beams that shoot up from the stage. When the rapper Q-Tip's voice booms -- "World, the time has come to galvanize"-- the figure shakes furiously as if trying to break free from its enclosure. With every beat of the iconic Chemical Brothers track, the abstract form pushes back with swift choreographed moves. It struggles for a while before it breaks down the light-built cage and spins freely with the elegance of a trained contemporary dancer.

  • How I turned my Xbox's Kinect into a wondrous motion-capture device

    by 
    Steve Dent
    03.08.2015

    When Microsoft started selling a basic Xbox One package without a Kinect V2 for $100 less, the result was unequivocal: Sales took off. Most gamers can take or leave the ubiquitous depth camera, because it just isn't as useful for gaming as, say, the Wii controller. It is indispensable for certain titles, like Just Dance 2014, Xbox Fitness and Fighter Within. Others, such as Madden NFL 25 and Battlefield 4, can make use of the Kinect 2, but absolutely don't need it. In other words, it's a big bag of meh for gamers and casual users. But recently, my ears perked up when Microsoft released a $50 cable that lets you use the Xbox One's Kinect on a PC.
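    The Kinect's appeal as a budget mocap device is that it streams per-frame 3D positions for the joints of a tracked skeleton. A common first step in turning that stream into usable animation data is computing joint angles from three joint positions, say the elbow angle from shoulder, elbow and wrist. Here's a minimal, generic sketch of that calculation with made-up coordinates; it's plain vector math, not actual Kinect SDK calls:

    ```python
    import math

    def joint_angle(a, b, c):
        """Angle at joint b (in degrees), given 3D positions of joints a, b, c."""
        u = [ai - bi for ai, bi in zip(a, b)]  # vector from b toward a
        v = [ci - bi for ci, bi in zip(c, b)]  # vector from b toward c
        dot = sum(ui * vi for ui, vi in zip(u, v))
        nu = math.sqrt(sum(ui * ui for ui in u))
        nv = math.sqrt(sum(vi * vi for vi in v))
        cos_t = max(-1.0, min(1.0, dot / (nu * nv)))  # clamp for float safety
        return math.degrees(math.acos(cos_t))

    # Made-up skeleton frame: shoulder, elbow and wrist positions in meters.
    shoulder, elbow, wrist = [0.0, 1.4, 0.0], [0.3, 1.1, 0.0], [0.6, 1.4, 0.0]
    print(round(joint_angle(shoulder, elbow, wrist), 1))  # → 90.0
    ```

    Run per frame over the depth camera's skeleton output, a handful of angles like this is enough to drive a simple rigged character.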

  • What you need to know about 3D motion capture

    by 
    Steve Dent
    07.14.2014

    Close your eyes and go back... back in time. Picture Jar Jar Binks or Polar Express, movies that put the "Uncanny Valley" on the map. I know these aren't pleasant memories, but new technology like motion capture (mocap) can be... awkward in its youth. Now, let's forget all that and move forward to a time when the tech started hitting its stride -- from Lord of the Rings' Gollum to Avatar to The Avengers' Hulk. And let's not forget games -- The Last of Us has some of the best mocap done in any medium and Electronic Arts has used the technique since Madden NFL '94. But what is mocap, exactly, and how is it done? Will it ever replace live actors or put 3D animators out of business? To answer all that, let's head back in time 100 years.
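    The optical mocap the explainer goes on to cover ultimately rests on one geometric operation: triangulating a marker's 3D position from two or more calibrated camera views. As a rough illustration (not any vendor's actual pipeline), here's a minimal pure-Python sketch that intersects two camera rays using the standard closest-point-between-lines formula; the camera positions and ray directions are invented example values:

    ```python
    # Two-camera triangulation sketch: each camera contributes a ray
    # (origin + direction) toward a reflective marker; the marker's 3D
    # position is the midpoint of the closest points on the two rays.

    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))

    def triangulate(o1, d1, o2, d2):
        """Midpoint of the shortest segment between rays o1+t*d1 and o2+s*d2."""
        w0 = [a - b for a, b in zip(o1, o2)]
        a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
        d, e = dot(d1, w0), dot(d2, w0)
        denom = a * c - b * b          # zero only if the rays are parallel
        t = (b * e - c * d) / denom    # parameter along ray 1
        s = (a * e - b * d) / denom    # parameter along ray 2
        p1 = [o + t * di for o, di in zip(o1, d1)]
        p2 = [o + s * di for o, di in zip(o2, d2)]
        return [(x + y) / 2 for x, y in zip(p1, p2)]

    # Example: a marker at (1, 2, 3) seen from cameras at the origin and at (5, 0, 0).
    marker = triangulate([0, 0, 0], [1, 2, 3], [5, 0, 0], [-4, 2, 3])
    print([round(v, 6) for v in marker])  # → [1.0, 2.0, 3.0]
    ```

    Real systems do this per marker, per frame, across dozens of cameras, solving a least-squares version of the same problem because noisy rays never intersect exactly.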

  • Watch filmmakers render realistic CG on the fly using $14k of graphics cards

    by 
    Steve Dent
    04.11.2014

    A new short film teaser has taken digital character rendering to a new level, making real-time motion capture a lot easier for animators. While working on "Construct" (see the stunning video after the break) filmmakers captured the movements of real actors in a studio, similar to how James Cameron did for Avatar. Instead of seeing the performer, however, the director saw a ray-traced version of the animated character on his screen. Though heavily pixelated, freezing the scene instantly gave animators a clear idea of the final result, something that can normally take hours in post-production. The system used custom software built on Chaos Group's V-Ray ray tracer, powered by three top-of-the-line NVIDIA K6000 GPUs -- not exactly a home setup. Still, it's not hard to see how such tech could eventually power ultra-realistic gaming, though at $4,500 a pop or so for the graphics cards, we're not there yet.

  • PrioVR full-body mocap suit promises accurate motion tracking in VR gaming

    by 
    Alexis Santos
    01.05.2014

    Sure, Kinect's done a bang-up job of bringing full-body motion tracking closer to the mainstream, but it hasn't exactly fulfilled the dreams of futuristic gaming that Hollywood (and our imaginations) promised. PrioVR, a motion-tracking suit meant for virtual reality games, aims to bring us closer to that future with accurate full-body motion-capture abilities without a camera array in the mix. The demo on hand today was pretty impressive: A rep was decked out in the upper-body suit, complete with Wii nunchuks, playing a first-person shooter. Sensors on his chest, back, head, arms and hands translated his movements to the screen with little latency, showing up on the display in a fraction of a second. We did notice an ever-so-slight choppiness -- which could have more to do with the game engine than the hardware -- but how much it affects gameplay remains to be seen. Though only an upper-body rig was being shown off, a full-body getup promises to capture everything from walking to kicking.
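    Suits like PrioVR skip cameras entirely: each 9-axis sensor fuses accelerometer, gyroscope and magnetometer readings into an orientation, typically a quaternion, and the skeleton is posed by rotating each bone's rest direction by its sensor's orientation. Here's a minimal sketch of that last step with an invented example orientation; this is generic quaternion math, not PrioVR's actual API:

    ```python
    import math

    def quat_rotate(q, v):
        """Rotate 3D vector v by unit quaternion q = (w, x, y, z)."""
        w, qv = q[0], q[1:]
        # Efficient form of q * v * q_conjugate
        t = [2 * (qv[1] * v[2] - qv[2] * v[1]),
             2 * (qv[2] * v[0] - qv[0] * v[2]),
             2 * (qv[0] * v[1] - qv[1] * v[0])]
        cross_qt = [qv[1] * t[2] - qv[2] * t[1],
                    qv[2] * t[0] - qv[0] * t[2],
                    qv[0] * t[1] - qv[1] * t[0]]
        return [v[i] + w * t[i] + cross_qt[i] for i in range(3)]

    # Example: a forearm resting along +x, whose sensor reports a
    # 90-degree rotation about the vertical (z) axis.
    half = math.radians(90) / 2
    sensor = (math.cos(half), 0.0, 0.0, math.sin(half))
    forearm_rest = [1.0, 0.0, 0.0]
    print(quat_rotate(sensor, forearm_rest))  # ≈ (0, 1, 0): forearm now points along +y
    ```

    Chain these rotations parent-to-child down the skeleton and you get a full-body pose with no camera array in sight, which is exactly the pitch here.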

  • See Watch Dogs mocap actors jump down some dubsteps

    by 
    Jordan Mallory
    06.01.2013

    This latest development diary for Ubisoft's current/next-gen hacking/big-brother simulator Watch Dogs focuses on the complex motion capture work that is being done on the game's main character, as well as the various people he brutally assaults while reading their private Facebook messages.

  • Researchers demo 3D face scanning breakthroughs at SIGGRAPH, Kinect crowd squarely targeted

    by 
    Darren Murph
    08.10.2011

    Lookin' to get your Grown Nerd on? Look no further. We just sat through 1.5 hours of high-brow technobabble here at SIGGRAPH 2011, where a gaggle of gurus with IQs far, far higher than ours explained in detail what the future of 3D face scanning would hold. Scientists from ETH Zürich, Texas A&M, Technion-Israel Institute of Technology and Carnegie Mellon University, as well as a variety of folks from Microsoft Research and Disney Research labs, were on hand, with each group revealing a slightly different approach to solving an all-too-similar problem: painfully accurate 3D face tracking. Haoda Huang et al. revealed a highly technical new method that combined marker-based motion capture with 3D scanning in an effort to overcome drift, while Thabo Beeler et al. took a drastically different approach. Those folks relied on a markerless system that used a well-lit, multi-camera setup to overcome occlusion, with anchor frames acting as staples in the success of its capture abilities. J. Rafael Tena et al. developed "a method that not only translates the motions of actors into a three-dimensional face model, but also subdivides it into facial regions that enable animators to intuitively create the poses they need." Naturally, that one's most useful for animators and designers, but the first system detailed is obviously gunning to work on lower-cost devices -- Microsoft's Kinect was specifically mentioned, and it doesn't take a seasoned imagination to see how in-home facial scanning could lead to far more interactive games and augmented reality sessions. The full shebang can be grokked by diving into the links below, but we'd advise you to set aside a few hours (and rest up beforehand).

  • Organic Motion's OpenStage motion capture system grabs 200FPS, no backdrop required (video)

    by 
    Darren Murph
    08.10.2011

    At just under $40,000 for an eight-camera setup, we're hardly in hobbyist territory here, but Organic Motion's new OpenStage 2.0 motion capture system could certainly make do in the average basement. Unlike a few competing solutions shown here at SIGGRAPH, this one actually has no backdrop mandate, and better still, doesn't require you to latch a single sensor onto your subject. The magic lies within the cameras hung above -- kits are sold that contain between eight and 24 cameras, and even the latter can be handled with a single workstation. Multi-person tracking ain't no thang, and while you aren't capturing HD footage here, the high-speed VGA capability enables up to 200 frames per second to be logged. Not surprisingly, the company's aiming this squarely at the animation and medical realms, and should start shipping bundles as early as next month. Looking to take down Pixar? You'll need a lot more than 40 large, but perhaps the video after the break will give you a bit of inspiration.

  • Aiken Labs brings 9-axis modular motion sensing to Android, we go hands-on (video)

    by 
    Zach Honig
    06.08.2011

    We already had a chance to try out Immersive Motion from Aiken Labs at CES, but now the nine-axis modular sensing system is making its way to Android and other mobile platforms, including iOS and Windows Phone. The more compact battery-powered server brings motion-controlled gaming to mobile environments, capturing position data from matchbox-size modular sensors that you can tape to a wooden sword or Viking helmet for live-action outdoor role-playing, or to your paws and dome during a virtual jam session with friends, for example. The mobile kit includes a pair of wireless sensors with a 50-foot range that you can attach to literally any accessory or appendage, and is expected to sell for about $300 when it ships later this year. You'll also be able to connect up to two smaller wired sensors to each wireless sensor, for about $50 a pop. The kit's price tag makes it cost-prohibitive for all but the most hardcore gamers and devs (there's an SDK available as well), but Aiken hopes to make its flagship product more affordable if it's able to sell the kits in high volume. The tool has applications in other industries as well, including research and Hollywood, where it could be used as a (relatively) low-cost outdoor motion-capture suit. The early version we saw at E3 today is definitely not ready to head to production, but we're still months away from an actual release, giving Aiken some time to improve accuracy, and perhaps find a way to reduce that price. Jump past the break to see how it works. Tim Stevens contributed to this report.

  • Hitman: Absolution using 'Avatar' mo-cap tech, Hollywood actors

    by 
    Ben Gilbert
    05.12.2011

    Hollywood Reporter has managed to sneak in a few questions to IO Interactive game director Tore Blystad about the forthcoming Hitman: Absolution, and got an earful back about the development process. "We've designed a more stylized, more serious, and darker game this time around in both the story line and the visuals," Blystad told HR. Blystad also revealed that his studio employed Giant Studios for the game's motion capture -- the same studio that James Cameron used for Avatar -- and that Keith Carradine (Cowboys & Aliens) and Marsha Thomason (White Collar) will be voicing two main roles. Carradine is Agent 47's antagonist this time around, while Thomason will act as 47's handler. Blystad's betting that the theatrical approach his studio is taking to voice acting and motion capture will dovetail nicely into the next Hitman film. "The hope is that the movie will be going in a similar direction, and then when they both come out they will speak the same language," he said. In Absolution, Agent 47 finds himself on the run from police while simultaneously hunting "his most dangerous contract to date." As promised, more details will be revealed next month at E3.

  • David Cage: L.A. Noire mocap tech interesting, but limited

    by 
    Griffin McElroy
    04.01.2011

    Just about every human being that's laid eyes on the eerily accurate visages of L.A. Noire has had roughly the same reaction of shock and awe. Every human being, that is, except for Quantic Dream's David Cage, who spoke to CVG about Team Bondi's mocap process, saying, "I think it's an interesting solution to a problem for now." He later expounded, "Their technique is incredibly expensive and they will never be able to shoot body and face at the same time." Cage revealed that Quantic Dream is leveraging just that kind of tech for an unannounced title, saying, "We see a huge difference between shooting the face and body separately and shooting everything at the same time. Suddenly you've got a real sense of acting that is consistent." Well, there's our first hint about the studio's follow-up to Heavy Rain: Its characters will have both faces and bodies. Not one or the other, as was the case in Quantic's earlier, far more upsetting game, The Disembodied Faces vs. The Disemfaced Bodies.

  • Killzone 3 trailer shows off some stellar voice talent

    by 
    Griffin McElroy
    01.28.2011

    Killzone 3 isn't just built upon a solid foundation of lookin' pretty -- it's also backed up by an impressive voice cast, including The Departed and Sexy Beast actor Ray Winstone, and Malcolm McDowell, who's been in everything. Check out these guys doing some mocap work in the video after the jump.