mitmedialab

Latest

  • MIT's "sixth sense" augmented reality device demonstrated on video

    by Paul Miller
    02.06.2009

    We've got ourselves some video of MIT's new "sixth sense" project, which really helps explain the concept. MIT basically plans to augment reality with a pendant picoprojector: hold up an object at the store and the device blasts relevant information onto it (environmental stats, for instance), which can be browsed and manipulated with hand gestures. The "sixth sense" in question is the internet, which naturally supplies the data, and that can be just about anything -- MIT has shown off the device projecting information about a person you meet at a party onto that actual person (pictured), projecting flight status onto a boarding pass, and even an entire non-contextual interface for reading email or making calls. It's pretty interesting technology that, like many MIT Media Lab projects, makes the wearer look like a complete dork -- if the projector doesn't give it away, the colored finger bands the device uses to detect finger motion certainly will. There are already patents in the works for the technology, which the MIT folks have been working on "night and day" for the past four months, and we're guessing (and hoping) this isn't the last we'll see of this stuff. Video is after the break.
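    For a sense of how the input side might work: the colored finger bands are there so the pendant's camera can track fingertips by color. Below is a minimal, illustrative sketch of that kind of color-marker tracking in Python with OpenCV -- the HSV ranges, webcam setup, and everything else here are our own assumptions, not MIT's actual pipeline.

        # Toy color-marker fingertip tracker -- an illustration only, not MIT's code.
        # Assumes a webcam and OpenCV 4.x; the HSV ranges are made-up examples.
        import cv2
        import numpy as np

        MARKERS = {
            "red":    (np.array([0, 120, 120]),  np.array([10, 255, 255])),
            "yellow": (np.array([20, 120, 120]), np.array([35, 255, 255])),
        }

        cap = cv2.VideoCapture(0)
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
            for name, (lo, hi) in MARKERS.items():
                mask = cv2.inRange(hsv, lo, hi)             # isolate one band color
                contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                               cv2.CHAIN_APPROX_SIMPLE)
                if contours:
                    c = max(contours, key=cv2.contourArea)  # biggest blob = fingertip
                    x, y, w, h = cv2.boundingRect(c)
                    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
                    cv2.putText(frame, name, (x, y - 5),
                                cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 1)
            cv2.imshow("fingertips", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
        cap.release()
        cv2.destroyAllWindows()

    Track those blobs over time and you can map their motion to gestures; the picoprojector handles the output half.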

  • Video: TOFU robot probably tastes like chicken

    by Thomas Ricker
    01.15.2009

    If a Big Bird bender resulted in a bumpin' of nasties with Keepon, well, this would be the genetic result. Meet TOFU, the "squash and stretch" robot with OLED eyes developed by the big brains over at the MIT Media Lab. TOFU applies techniques of social expression long used by 2D animators to see how they translate to robotics. If cute was the goal, then we'd call this project a success -- enslave us now, oh furry overlords of doom. Video after the break.

  • MIT researcher aims to understand language with Human Speechome Project

    by Donald Melanson
    04.24.2008

    It's far from the first time a researcher has enlisted the help of his own family or kids, but MIT researcher Deb Roy's latest endeavor looks to be a bit more ambitious than most, as he's aiming to do nothing short of understanding how children learn language. To do that, Roy and his wife installed 11 video cameras and 14 microphones throughout their house to record just about every moment of their son's first three years. That, obviously, also required a good deal of computing power, which came in the form of a temperature-controlled data-storage room consisting of five Apple Xserves and a 4.4TB Xserve RAID (you can guess why Apple's profiling 'em), along with an array of backup tape drives and robotic tape changers (and an ample supply of other Macs, of course). While the project is obviously still a work in progress, they have apparently already developed some new methods for audio and video pattern recognition, among other things, and it seems they'll have plenty of data to sift through for years to come, with the project expected to churn out some 1.4 petabytes by the end of year three. [Thanks, Jeff]
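    For a sense of scale, here's a quick back-of-the-envelope on that figure -- the 1.4 petabytes and three years come from the project, while the round-the-clock-recording assumption and the arithmetic are ours.

        # Rough sustained data rate for the Speechome rig, using figures from the post.
        # Assumes continuous recording, which likely overstates the real rate a bit.
        total_bytes = 1.4e15                 # ~1.4 petabytes over the whole project
        seconds = 3 * 365 * 24 * 3600        # roughly three years

        rate_MBps = total_bytes / seconds / 1e6
        print(f"Sustained write rate: ~{rate_MBps:.0f} MB/s")       # ~15 MB/s
        print(f"Per camera (11 cams): ~{rate_MBps / 11:.1f} MB/s")  # ~1.3 MB/s each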

  • MIT's Siftables let you juggle your data... for real

    by Joshua Topolsky
    03.15.2008

    The cats and kittens at the MIT Media Lab are always on some next-level type of wackiness, and the Siftables project doesn't break from that trend. The concept seems simple enough: a collection of small, self-contained input / display devices wirelessly link together to form an independent mini-network, or a control system for a PC. The cubes feature OLED screens, a 3-axis accelerometer, Bluetooth, flash memory, and a haptic actuation driver, plus additional ports for attaching other devices. The aim is to create a more natural system for handling and displaying data, though we won't be surprised if this is somehow incorporated into an even more realistic version of Call of Duty. Check out the video after the break to see the little guys in action. [Via OhGizmo!]
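    As a rough illustration of the kind of input those accelerometers afford, here's how one might turn raw 3-axis samples into a simple "shake" event in software -- the threshold, sample values, and function here are invented for the example, not anything from the Siftables firmware.

        # Toy shake detector over 3-axis accelerometer samples (units of g).
        # Thresholds and data are illustrative; not the Siftables implementation.
        import math

        def is_shake(samples, threshold_g=1.8):
            """Return True if any sample's magnitude exceeds the threshold."""
            for x, y, z in samples:
                if math.sqrt(x * x + y * y + z * z) > threshold_g:
                    return True
            return False

        # A resting tile reads roughly 1 g (gravity); a flick spikes well above that.
        resting = [(0.02, -0.01, 0.99)] * 5
        flicked = resting + [(1.3, -0.4, 1.6)]
        print(is_shake(resting))   # False
        print(is_shake(flicked))   # True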

  • The secret life of MIT's Media Lab robots

    by Evan Blass
    08.20.2007

    While it may not have the production values -- and probably not the budget -- of the Pixar-produced Toy Story movies with which it shares a common theme, the stop-motion short "medialab@night" has nevertheless captured our imagination with its clever premise and lovable cast of characters. Just like Buzz, Woody, and that humorous little pig, the high-tech residents of MIT's Media Lab apparently also come to life when no one (except a film crew) is watching, with sensor shoes, pushpin computers, and various other gadgets roaming the halls and causing a bit of mischief. This particular film catches them hacking into the brain of our favorite little Gremlin-esque robot, Leonardo (no relation to director Leonardo Bonanni -- we think), and rewiring him to edit Wikipedia on -- what else -- an OLPC. Check out the full flick after the break, and just remember this warning the next time your Robosapiens and Pleos try using a Dremel to drill into your brain while you sleep... [Via Waziwazi]

  • OLPC will be powered by pulling a string

    by Evan Blass
    07.24.2006

    We've been following Nicholas Negroponte's One Laptop Per Child initiative since back when the machine was still priced below $100, but ever since the hand crank was jettisoned, we've been wondering how they're going to deliver power to the 500MHz device. Enter Squid Labs, an R&D firm chock full of MIT Media Lab grads -- the same lab that Negroponte founded and ran for many years -- with an innovative human-powered generator that works by repeatedly tugging on a string, in a motion similar to firing up a gas-powered lawnmower or snowblower. The team at Squid designed the external generator so that one minute of pulling yields ten minutes of computing, and included an electronic variable motor loading feature so that it can be operated by users of varying strength. Another nice feature of this system is that it can be configured in a number of different ways: users can either hold the device in one hand and pull the string with the other, or clamp it to a desk and work the string with their legs. As long as further testing confirms the design's durability, and a better option doesn't come along, it looks like we'll be seeing classrooms full of string-pulling students when the laptop finally goes into mass production next year. [Via Slashdot]
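    To put that one-minute-for-ten-minutes ratio in perspective, a quick bit of arithmetic -- note that the laptop's power draw and the generator efficiency below are our own illustrative guesses, not Squid Labs' figures.

        # Back-of-the-envelope on the pull-string generator's 1:10 ratio.
        # The power draw and efficiency values are illustrative assumptions.
        laptop_watts = 2.0         # assumed average draw of the low-power laptop
        runtime_s = 10 * 60        # ten minutes of computing per session
        pull_time_s = 60           # one minute of pulling

        energy_needed_J = laptop_watts * runtime_s    # 1200 J to store per session
        efficiency = 0.7                              # assumed generator/battery losses
        mech_power_W = energy_needed_J / efficiency / pull_time_s

        print(f"Energy per session: {energy_needed_J:.0f} J")
        print(f"Average mechanical power while pulling: ~{mech_power_W:.0f} W")  # ~29 W

    In other words, a short burst of fairly light effort per session -- which is exactly what the variable motor loading is meant to keep manageable for users of varying strength.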