MIT Media Lab

Latest

  • MIT Media Lab develops glasses-free HR3D, supports broad viewing angles (video)

    by Zach Honig
    05.04.2011

    We've already seen plenty of glasses-free 3D HDTVs and portable devices, but a promising new technology called HR3D (High-Rank 3D) has hit the prototype phase. Engineers from MIT's Media Lab, who developed the new solution, say that it avoids compromising on screen brightness, resolution, viewing angle, and battery life, and doesn't require those pesky (and pricey) 3D glasses. HR3D uses a pair of layered LCDs to give the illusion of depth, with the top layer (or mask) displaying a variable pattern based on the image below it, so each eye sees a slightly different picture. Nintendo's 3DS uses a similar technique, but with a parallax barrier instead of a second display. The designers constructed the prototype from two Viewsonic VX2265wm displays, removing the LCDs from their housings and pulling off polarizing filters and films. We've yet to go eyes-on with HR3D, so we're a mite skeptical, but tech this promising is worth watching closely, and from every angle.

  • Kinect used to make teleconferencing actually kind of cool (video)

    by Vlad Savov
    04.04.2011

    No matter how hard Skype and others try to convince us otherwise, we still do most of our web communications via text or, if entirely unavoidable, by voice. Maybe we're luddites or maybe video calling has yet to prove its value. Hoping to reverse such archaic views, researchers at the MIT Media Lab have harnessed a Kinect's powers of depth and human perception to provide some newfangled videoconferencing functionality. First up, you can blur out everything on screen but the speaker to keep focus where it needs to be. Then, if you want to get fancier, you can freeze a frame of yourself in the still-moving video feed for when you need to do something off-camera, and to finish things off, you can even drop some 3D-aware augmented reality on your viewers. It's all a little unrefined at the moment, but the ideas are there and well worth seeing. Jump past the break to do just that.

  • Operabots take center stage at MIT Media Lab's 'Death and the Powers' opera

    by Donald Melanson
    03.23.2011

    It already had its premiere in Monaco last year, but composer Tod Machover's new opera, "Death and the Powers," has now finally made it to the United States. Why are we reporting on a new opera (rather than Opera) on Engadget? Well, it just so happens to feature the "Operabots" pictured above, which were developed by MIT's Media Lab. The lab also helped develop some of the opera's other high-tech components, but it seems like the Operabots are the real standout -- they're "semi-autonomous" and freely roam around the stage throughout the opera, acting as a Greek chorus. Not surprisingly, the opera itself also deals with some futuristic subject matter. The Powers of the title is Simon Powers, a "Bill Gates, Walt Disney-type" who decides to upload his consciousness into "The System" before he dies -- hijinks then ensue. Those in Boston can apparently still get tickets for the final performance on March 25th -- after that it moves on to Chicago for four performances between April 2nd and 10th. Head on past the break for a preview.

  • MIT Media Lab gets a multiplicitous new logo (video)

    by Vlad Savov
    03.10.2011

    Logos can be surprisingly divisive things, so the MIT Media Lab has decided to cheat a little bit with its new identity: it won't have just one logo, it'll have 40,000. You heard / read / imagined that right: the new Media Lab logo will simply be the concept of three intersecting "spotlights," composed of three colors, straight lines, three black squares, and a few blending gradients. There's an algorithm behind it all, which is used to generate a unique logo for every new member of staff, meaning that although trademark claims may be a headache to enforce, originality will continue thriving in the Lab for a long time to come. Hit the source link to learn more or leap past the break for a nice video rundown.
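
    Purely to make the "algorithm behind it all" concrete, here's a minimal TypeScript sketch of how a generative identity like this could work: a staff member's name seeds a deterministic random generator, which then picks positions, angles, and palette colors for three "spotlights." Everything below -- the parameter names, the palette, the output format -- is an illustrative assumption, not the Media Lab's actual algorithm.

        // Hypothetical sketch: one unique three-spotlight logo per person, derived
        // deterministically from their name. Not the real Media Lab generator.

        type Spotlight = { x: number; y: number; angle: number; color: string };

        // FNV-1a string hash: turns a name into a 32-bit seed.
        function hashName(name: string): number {
          let h = 0x811c9dc5;
          for (const ch of name) {
            h ^= ch.charCodeAt(0);
            h = Math.imul(h, 0x01000193);
          }
          return h >>> 0;
        }

        // Mulberry32 PRNG: the same seed always yields the same sequence,
        // so each staff member keeps the same logo.
        function mulberry32(seed: number): () => number {
          return () => {
            seed = (seed + 0x6d2b79f5) | 0;
            let t = Math.imul(seed ^ (seed >>> 15), 1 | seed);
            t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
            return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
          };
        }

        const PALETTE = ["#ff0050", "#00c040", "#0060ff", "#ffb400", "#9000c0"]; // placeholder colors

        function logoFor(name: string): Spotlight[] {
          const rand = mulberry32(hashName(name));
          return [0, 1, 2].map(() => ({
            x: Math.round(rand() * 100),     // spotlight origin on a 100x100 canvas
            y: Math.round(rand() * 100),
            angle: Math.round(rand() * 360), // beam direction in degrees
            color: PALETTE[Math.floor(rand() * PALETTE.length)],
          }));
        }

        console.log(logoFor("Ada Lovelace"));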

  • 3D printed concert flute rapidly prototypes sound (video)

    by Sean Hollister
    12.29.2010

    The world's first store for 3D printed goods just opened in Brussels, and while we imagine they've already got a fair selection of prototyped merchandise to choose from, might we suggest they invest in a few production runs of this fabulous new flute? Amit Zoran of the MIT Media Lab -- yes, the same soul who helped dream up a 3D food printer earlier this year -- has now printed a fully-functional concert flute with a minimum of human intervention. Directing an Objet Connex500 3D printer (which can handle multiple materials at the same time) to spit out his CAD design, dollop by tiny dollop, in a single 15-hour run, he merely had to wash off support material, add springs, and assemble four printed pieces to finish the instrument up. The proof of the pudding is in the eating, of course, so how does it sound? Find out for yourself in the video below.

  • Proverbial Wallets make your metaphysical money a little more tangible

    by Tim Stevens
    12.09.2010

    Counting dollars and cents on the checkout counter really makes you feel the weight of every expenditure. Swiping a credit card or waving an NFC device over a sensor? Not so much. Enter the Proverbial Wallets from the Information Ecology group at the MIT Media Lab, three separate devices that use three haptic techniques to curtail your spending. First is the Bumblebee, which buzzes and vibrates whenever money comes into or goes out of your account. Next is Mother Bear, which becomes harder to open as you get closer to your spending goal. Finally, there's Peacock, which swells proudly as your bank balance does the same. Sadly, none of these are actually available yet, but we have a feeling that if they were, they might put a bit of a hurting on our very real and very strict budgets.
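
    For the curious, here's a tiny TypeScript sketch -- entirely hypothetical, since the wallets are physical prototypes rather than a published API -- of the Bumblebee behavior as described: every transaction triggers a buzz, scaled to the size of the amount.

        // Hypothetical sketch of the Bumblebee: buzz on every transaction,
        // with bigger amounts producing longer buzzes.

        type Transaction = { amountCents: number }; // negative = spending, positive = deposit

        function vibrate(durationMs: number): void {
          // Stand-in for driving the wallet's haptic motor.
          console.log(`bzzz for ${durationMs}ms`);
        }

        function onTransaction(tx: Transaction): void {
          // Cap the buzz so a rent payment doesn't rattle the wallet off the table.
          const durationMs = Math.min(2000, Math.abs(tx.amountCents) / 10);
          vibrate(durationMs);
        }

        onTransaction({ amountCents: -4599 }); // a $45.99 purchase -> ~460ms buzz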

  • Kinect hacks: Use Kinect to navigate the web, resize koalas

    by Ludwig Kietzmann
    11.25.2010

    A new Kinect hack -- dubbed "DepthJS" -- allows Microsoft's frequently repurposed camera to interact with a web browser via JavaScript. Its creators, from the MIT Media Lab Fluid Interfaces Group, envision "all sorts of applications that run in the browser," and demonstrate fairly simple website navigation in their video (embedded after the break). Making a fist enables selection, a semi-dismissive swatting motion allows scrolling, and giving it the finger automatically posts a hateful comment on a game review you didn't agree with. (Okay, that last thing isn't true.) Meanwhile, Evoluce, a Munich-based software company, has shown Microsoft Windows 7 applications being controlled through Kinect. The associated video shows multi-touch support, with the user being able to zoom in on images or draw using two hands at once. It also makes resizing pictures of adorable animals very easy, which should come in handy for your bandwidth-limited nature blog.
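
    To give a flavor of the glue code a hack like DepthJS needs, here's a short TypeScript sketch that maps recognized hand gestures onto ordinary browser actions. The event names and payload below are made up for illustration; they are not the actual DepthJS interface.

        // Hypothetical gesture events (not the real DepthJS API) mapped to browser actions.
        type Gesture =
          | { kind: "swipe"; dx: number; dy: number } // open-hand swat -> scroll
          | { kind: "fist"; x: number; y: number };   // closed fist -> select

        function handleGesture(g: Gesture): void {
          if (g.kind === "swipe") {
            // Scroll the page in the direction of the swat.
            window.scrollBy(g.dx, g.dy);
          } else {
            // Treat a fist held at screen coordinates (x, y) as a click on whatever is there.
            const el = document.elementFromPoint(g.x, g.y);
            if (el instanceof HTMLElement) el.click();
          }
        }

        // In the real hack these events would come from the Kinect depth stream; here we fake one.
        handleGesture({ kind: "swipe", dx: 0, dy: 300 });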

  • Kinect hacks let you control a web browser and Windows 7 using only The Force (updated)

    by Thomas Ricker
    11.25.2010

    Hacking the Xbox 360 Kinect is all about baby steps on the way to what could ultimately amount to some pretty useful homebrew. Here's a good example cooked up by some kids at the MIT Media Lab Fluid Interfaces Group who are attempting to redefine the human-machine interactive experience. DepthJS is a system that makes JavaScript talk to Microsoft's Kinect in order to navigate web pages, among other things. Remember, it's not that making wild, arm-waving gestures is the best way to navigate a website, it's just a demonstration that you can. Let's hope that the hacking community picks up the work and evolves it into a multitouch remote control plugin for our home theater PCs. Boxee, maybe you can lend a hand? Update: If you're willing to step outside of the developer-friendly borders of open-source software then you'll want to check out Evoluce's gesture solution based on the company's Multitouch Input Management (MIM) driver for Kinect. The most impressive part is its support for simultaneous multitouch and multiuser control of applications (including those using Flash and Java) running on a Windows 7 PC. Evoluce promises to release software "soon" to bridge Kinect and Windows 7. Until then be sure to check out both of the impressive videos after the break. [Thanks, Leakcim13]

  • MIT's Android optometry app could help you stop squinting all the time (video)

    by Tim Stevens
    07.02.2010

    Remember Bokodes, MIT's tiny replacement for barcodes and the like? Their holographic nature enabled them to represent different information from different angles, and it's this property that allows the tech behind them to be used in a very different and even more useful way: figuring out just how busted your vision is. The Camera Culture team at MIT's Media Lab evolved that tech into a $2 box that refracts the image displayed on a smartphone screen. When combined with an app that displays a set of dots and lines, the user can manipulate the image until things look to be perfectly aligned. Once complete, the app spits out a prescription and you're just a quick trip to your local mall-based eyeglasses joint away from perfect vision. The goal is to make it easier for optometrists in developing countries to quickly and easily find glasses for people, but an app that could save a trip to the doctor's office is a wonderful thing regardless of where you are.
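
    As a rough illustration of the measurement idea (not the Camera Culture group's actual model), the TypeScript sketch below assumes the app records how far the user had to shift the pattern before the dots and lines looked aligned, then converts that offset into a refractive error with a calibration constant that would depend on the screen and the box's optics.

        // Hypothetical conversion from alignment offset to prescription strength.
        const DIOPTERS_PER_PIXEL = 0.05; // placeholder; depends on screen DPI and lens geometry

        function estimateRefractiveError(alignmentOffsetPx: number): number {
          // An eye with no refractive error needs no offset; the further the pattern
          // must be shifted to look aligned, the stronger the prescription.
          return alignmentOffsetPx * DIOPTERS_PER_PIXEL;
        }

        console.log(estimateRefractiveError(-30)); // -1.5, i.e. a mildly nearsighted eye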

  • How do you make teachers angry? Take an app out of the App Store

    by Steve Sande
    04.23.2010

    iPhone, iPod touch, and iPad users are used to hearing about new apps showing up in the App Store. It's when they are taken out of the App Store by Apple that things get interesting. Teachers across the country got a taste of "interesting" last week when Apple removed Scratch Viewer from the App Store. The app is used to display programs that have been written by children in the Scratch programming language, a popular language for teaching kids the basics of computer programming. Scratch was developed by a team at the MIT Media Lab, and the app was written by John McIntosh of Canadian development firm Smalltalk Consulting, Ltd. The Computing Education Blog broke the news and received a number of comments protesting Apple's decision. While Apple is remaining quiet on the subject, McIntosh notes that he's in negotiations with the company. Many bloggers think that Apple's excuse for killing Scratch Viewer is that it violates Section 3.3.1 of the company's policy against apps that interpret or execute code. That's the reason Apple is quashing Adobe Flash-based apps. Mitchel Resnick, who runs the Scratch team at MIT, says that he's "disappointed that Apple decided not to allow a Scratch player on the iPhone or iPad" and hopes that "Apple will reconsider its policies so that more kids can experience the joys of creating and sharing with Scratch." The team is planning on writing Scratch authoring tools for the iPad, but whether those plans come to fruition is up to Apple. [via NYT Gadgetwise Blog]

  • MIT Media Lab's Surround Vision brings virtual reality to a tablet (video)

    by Tim Stevens
    04.09.2010

    Sure, 3D adds a little more dimensionality to your couch-bound viewing experience, but it's far from the truly immersive virtual reality people have been promised for decades. Surround Vision isn't quite VR either, but it's an interesting way of breaking the perception barrier, allowing a viewer to pan around a scene outside the perspective offered by one display. It's a project by Santiago Alfaro, a graduate student at MIT's Media Lab, and relies on a tablet with a compass. For his demo he filmed video from three perspectives and can display the center perspective on the main TV while panning around to the other two with the tablet. It's an interesting idea to bring some aspect of interactivity to the viewing process, but we could see Hollywood turning it into the next big gimmick, with the leading man pointing off screen dramatically and saying "Oh my god, what's that?" before waiting patiently for a few seconds while the audience scrambles to pan around and find the horror. Yeah, we've got your number, Michael Bay. Immersive video demonstration after the break for you to lose yourself in.
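
    Here's a small TypeScript sketch of the panning logic as described -- three filmed perspectives, with the tablet's compass heading deciding which one to show. The thresholds and feed names are assumptions for illustration, not details from Alfaro's demo.

        type Feed = "left" | "center" | "right";

        // Pick a camera angle from the angle between the tablet and the TV.
        function feedForHeading(tabletHeadingDeg: number, tvHeadingDeg: number): Feed {
          // Signed difference, normalized to [-180, 180).
          const delta = ((tabletHeadingDeg - tvHeadingDeg + 540) % 360) - 180;
          if (delta < -20) return "left";   // panned to the left of the screen
          if (delta > 20) return "right";   // panned to the right of the screen
          return "center";                  // roughly facing the TV
        }

        console.log(feedForHeading(350, 20)); // -> "left"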

  • MIT gestural computing makes multitouch look old hat

    by Vlad Savov
    12.11.2009

    Ah, the MIT Media Lab, home to Big Bird's illegitimate progeny, augmented reality projects aplenty, and now three-dimensional gestural computing. The new bi-directional display being demoed by the Cambridge-based boffins performs both multitouch functions that we're familiar with and hand movement recognition in the space in front of the screen -- which we're also familiar with, but mostly from the movies. The gestural motion tracking is done via embedded optical sensors behind the display, which are allowed to see what you're doing by the LCD alternating rapidly (invisible to the human eye, but probably not to human pedantry) between what it's displaying to the viewer and a pattern for the camera array. This differs from projects like Natal, which have the camera offset from the display and therefore cannot work at short distances, but if you want even more detail, you'll find it in the informative video after the break. [Thanks, Rohit]

  • MIT's Affective Intelligent Driving Agent is KITT and Clippy's lovechild (video)

    by Vlad Savov
    10.30.2009

    If we've said it once, we've said it a thousand times: stop trying to make robots into "friendly companions"! MIT must have some hubris stuck in its ears, as its labs are back at it with what looks like Clippy gone 3D, with an extra dash of Knight Rider-inspired personality. What we're talking about here is a dashboard-mounted AI system that collects environmental data, such as local events, traffic and gas stations, and combines it with a careful analysis of your driving habits and style to make helpful suggestions and note points of interest. By careful analysis we mean it snoops on your every move, and by helpful suggestions we mean it probably nags you to death (its own death). Then again, the thing's been designed to communicate with those big Audi eyes, making even our hardened hearts warm just a little. Video after the break.

  • Intelligent Siftables blocks get even more face time

    by Darren Murph
    02.26.2009

    We were captivated when we first came across David Merrill's brilliantly simple Siftables idea last year, and though some time has passed us by, we're no less amazed this go 'round. The MIT graduate student has posted a few more videos and images of his pet project, which aims to utilize computerized tiles to initiate learning and allow Earthlings to "interact with information and media in physical, natural ways that approach interactions with physical objects in our everyday lives." Call us crazy, but we're betting Art Lebedev would totally take these commercial. Check out a music sequencer vid just past the break, and catch the rest of the media in the read link. [Via TED, thanks Kaptix]

  • MIT's "sixth sense" augmented reality device demonstrated on video

    by Paul Miller
    02.06.2009

    We've got ourselves some video of MIT's new "sixth sense" project, which really helps explain the concept. MIT basically plans to augment reality with a pendant picoprojector: hold up an object at the store and the device blasts relevant information onto it (like environmental stats, for instance), which can be browsed and manipulated with hand gestures. The "sixth sense" in question is the internet, which naturally supplies the data, and that can be just about anything -- MIT has shown off the device projecting information about a person you meet at a party on that actual person (pictured), projecting flight status on a boarding pass, along with an entire non-contextual interface for reading email or making calls. It's pretty interesting technology, that, like many MIT Media Lab projects, makes the wearer look like a complete dork -- if the projector doesn't give it away, the colored finger bands the device uses to detect finger motion certainly might. There are patents already in the works for the technology, which the MIT folks have been working on "night and day" for the past four months, and we're guessing (and hoping) this isn't the last we'll see of this stuff. Video is after the break.

  • Video: TOFU robot probably tastes like chicken

    by Thomas Ricker
    01.15.2009

    If a Big Bird bender resulted in a bumpin' of nasties with Keepon, well, this would be the genetic result. Meet TOFU, the "squash and stretch" robot with OLED eyes developed by the big brains over at the MIT Media Lab. TOFU applies techniques of social expression long used by 2D animators to explore their impact on robotics. If cute was the goal then we'd call this project a success -- enslave us now, oh furry overlords of doom. Video after the break.

  • Tools of the trade: Scratch for SL

    by Tateru Nino
    07.14.2008

    MIT Media Lab's Eric Rosenbaum has produced a wonderful little tool called Scratch for Second Life (S4SL). Available for Mac or Windows (but not Linux at present, alas), S4SL allows you to create scripts by assembling simple colorful shapes (a bit like plastic bricks). S4SL is based on MIT's Scratch, and allows you to put together some useful functionality very simply. S4SL isn't going to make you a star creator of scripts overnight, though -- anyone who knows Second Life's LSL scripting language and has a modicum of programming skill can do much more, but that's not the point here.

  • MIT researcher aims to understand language with Human Speechome Project

    by Donald Melanson
    04.24.2008

    It's far from the first time a researcher has enlisted the help of his own family or kids, but MIT researcher Deb Roy's latest endeavor looks to be a bit more ambitious than most, as he's aiming at nothing short of understanding how children learn language. To do that, Roy and his wife installed 11 video cameras and 14 microphones throughout their house to record just about every moment of their son's first three years. That, obviously, also required a good deal of computing power, which came in the form of a temperature-controlled data-storage room consisting of five Apple Xserves and a 4.4TB Xserve RAID (you can guess why Apple's profiling 'em), along with an array of backup tape drives and robotic tape changers (and an ample supply of other Macs, of course). While the project is obviously still a work in progress, they have apparently already developed some new methods for audio and video pattern recognition, among other things, and it seems they'll have plenty of work to sift through for years to come, with the project expected to churn out some 1.4 petabytes of data by the end of year three. [Thanks, Jeff]
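
    For a sense of scale, here's a back-of-the-envelope take (in TypeScript) on that 1.4-petabyte figure, assuming -- purely for the sake of the estimate -- that all 11 cameras recorded around the clock for the full three years.

        // Rough arithmetic only; the real recording schedule surely wasn't 24/7.
        const TOTAL_BYTES = 1.4e15;              // 1.4 petabytes
        const SECONDS = 3 * 365 * 24 * 3600;     // three years of wall-clock time
        const aggregateMBps = TOTAL_BYTES / SECONDS / 1e6;
        const perCameraMBps = aggregateMBps / 11; // ignoring the microphones' (much smaller) share

        console.log(aggregateMBps.toFixed(1));   // ~14.8 MB/s across the whole house
        console.log(perCameraMBps.toFixed(2));   // ~1.35 MB/s per camera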

  • MIT's Siftables let you juggle your data... for real

    by Joshua Topolsky
    03.15.2008

    The cats and kittens at the MIT Media Lab are always on some next-level type of wackiness, and the Siftables project doesn't break from that trend. The concept seems simple enough: a collection of small, self-contained input / display devices wirelessly link together to form an independent mini-network, or a control system for a PC. The cubes feature OLED screens, a 3-axis accelerometer, Bluetooth, flash memory, and a haptic actuation driver, along with additional ports for attaching other devices. The aim is to create a more natural system for handling and displaying data, though we won't be surprised if this is somehow incorporated into an even more realistic version of Call of Duty. Check out the video after the break to see the little guys in action. [Via OhGizmo!]

  • The secret life of MIT's Media Lab robots

    by Evan Blass
    08.20.2007

    While it may not have the production values -- and probably not the budget -- of the Pixar-produced Toy Story movies with which it shares a common theme, the stop-motion short "medialab@night" has nevertheless captured our imagination with its clever premise and lovable cast of characters. Just like Buzz, Woody, and that humorous little pig, the high-tech residents of MIT's Media Lab apparently also come to life when no one (except a film crew) is watching, with sensor shoes, pushpin computers, and various other gadgets roaming the halls and causing a bit of mischief. This particular film catches them hacking into the brain of our favorite little Gremlin-esque robot, Leonardo (no relation to director Leonardo Bonanni -- we think), and rewiring him to edit Wikipedia on -- what else -- an OLPC. Check out the full flick after the break, and just remember this warning the next time your Robosapiens and Pleos try using a Dremel to drill into your brain while you sleep... [Via Waziwazi]