MIT Media Lab

Latest

  • DIY Cellphone has the footprint of an ice cream sandwich, definitely doesn't run ICS (hands-on)

    by Zach Honig
    04.25.2012

    Building your own wireless communications device isn't for the faint of heart, or the law-abiding -- the FCC tends to prefer placing its own stamp of approval on devices that utilize US airwaves, making a homegrown mobile phone an unlikely proposition. That didn't stop a team at the MIT Media Lab from creating such a DIY kit, however. Meet the Do-It-Yourself Cellphone. This wood-based mobile rig, while currently in the prototype phase (where it may indefinitely remain), would eventually ship with a circuit board, a control pad, a fairly beefy antenna and a monochrome LCD. Sounds like it'd be right at home in some kid's garage workshop in the early '80s, not showcased at an MIT open house. The argument here is that people spend more time with their phone than with any other device, so naturally they'd want to build one to their liking. Nowadays, folks expect their pocketable handset not only to place and receive phone calls, but also to store phone numbers, pack a rechargeable battery and, well, in some cases even send and receive email and surf the web -- none of which are available with such a kit. The prototype we saw was fully functional. It could place calls. It could receive calls. There was even Caller ID! The phone does indeed feel homemade, with its laser-cut plywood case and a design that lacks some of the most basic gadget essentials, like a rechargeable battery (or at the very least some provision for replacing the 9-volt inside without unscrewing the case). Audio quality sounded fine, and calls went out and came in without a hitch -- there's a SIM card slot inside, letting you bring the nondescript phone to the carrier of your choice. Does it work? Yes. Is it worth dropping $100-150 in parts to build a jumbo-sized phone with a microscopic feature set? No, there's definitely nothing smart about the DIY Cellphone. 
If you want to throw together your own handset, however, and not risk anyone questioning the legitimacy of your homemade claim, you might want to keep an eye out for this to come to market. The rest of you will find everything you need in the video just past the break. We're just happy to have walked away without any splinters.

  • OLED Display Blocks pack six 128 x 128 panels, we go hands-on at MIT (video)

    by Zach Honig
    04.24.2012

    How do you develop an OLED display that gives a 360-degree perspective? Toss six 1.25-inch panels into a plastic cube, then turn it as you see fit. That's an overly simplistic explanation for the six-sided display on hand at the MIT Media Lab today, which is quite limited in its current form, but could eventually serve an enormous variety of applications. Fluid Interfaces Group Research Assistant Pol Pla i Conesa presented several such scenarios for his Display Blocks, which consist of 128 x 128-pixel OLED panels. Take, for example, the 2004 film Crash, which tells interweaving stories that could be presented simultaneously with such a display -- simply rotate the cube until you land on a narrative you'd like to follow, and the soundtrack will adjust to match. It could also go a long way when it comes to visualizing data, especially in groups -- instead of virtually constructing profiles of MIT applicants, for example, or segments of a business that need to be organized by different parameters, you could assign each to a cube, which can be tossed into an accepted or rejected pile and repositioned as necessary. Imagine having a group of display cubes when it comes time to plan the seating chart for a reception -- each cube could represent one individual, with a color-coded background and a name or photo up top, and different descriptive elements on each side. The same could apply to products at monstrous companies like Samsung or Sony, where executives need to make planning decisions based on product performance, and could benefit greatly from having all of the necessary information for a single gadget listed around each cube. On a larger scale, the cubes could be used to replace walls and floors in a building -- want to change the color of your wallpaper? Just push a new image to the display, and dedicate a portion of the wall to watching television or displaying artwork. 
You could accomplish this with networked single-sided panels as well, but that wouldn't be nearly as much fun. The Media Lab had a working prototype on display today, which demonstrated the size and basic functionality, but didn't have an adjustable picture. Still, it's easy to imagine the potential of such a device, if, of course, it ever becomes a reality. As always, you'll find our hands-on demo just past the break.

  • Droplet and StackAR bring physical interface to virtual experiences, communicate through light (hands-on)

    by Zach Honig
    04.24.2012

    Light-based communication seems to wind throughout the MIT Media Lab -- it is a universal language, after all, since many devices output light, be it with a dedicated LED or a standard LCD, and have the capacity to view and interpret it. One such device, dubbed Droplet, essentially redirects light from one source to another, while also serving as a physical interface for tablet-based tasks. Rob Hemsley, a research assistant at the Media Lab, was on hand to demonstrate two of his projects. Droplet is a compact, self-contained module with an integrated RGB LED, a photodiode and a CR1216 lithium coin battery -- which provides roughly one day of power in the gadget's current early prototype state. Today's demo used a computer-connected HDTV and a capacitive-touch-enabled tablet. Using the TV to pull up a custom Google Calendar module, Hemsley held the Droplet up to a defined area on the display, which then output a series of colors, transmitting data to the module. That data was then pushed to a tablet by placing the Droplet on its display, pulling up the same calendar appointment and providing a physical interface for adjusting the date and time; the change is retained both in the cloud and on the module itself, which also outputs pulsing light as it counts down to the appointment time. StackAR, the second project, functions in much the same way, but instead of outputting a countdown indicator, it displays schematics for a LilyPad Arduino when placed on the tablet, identifying connectors based on a pre-selected program. The capacitive display can recognize orientation, letting you drop the controller in any position on the surface, then outputting a map to match. Like the Droplet, StackAR can also recognize light input, even letting you program the Arduino directly from the tablet by outputting light, effectively simplifying the interface creation process even further. 
You can also add software control to the board, which will work in conjunction with the hardware, bringing universal control interfaces to the otherwise space-limited Arduino. Both projects appear to have incredible potential, but they're clearly not ready for production just yet. For now, you can get a better feel for Droplet and StackAR in our hands-on video just past the break.
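The screen-to-module transfer boils down to encoding bytes as a sequence of colors that the photodiode reads back. Here's a minimal Python sketch of that idea, assuming a hypothetical four-color, two-bits-per-pulse alphabet -- the actual Droplet encoding, timing and error handling aren't documented here:

```python
# Hypothetical four-colour alphabet: each pulse carries two bits.
PALETTE = {0b00: "black", 0b01: "red", 0b10: "green", 0b11: "blue"}
INVERSE = {colour: symbol for symbol, colour in PALETTE.items()}

def byte_to_pulses(value):
    """Split one byte into four 2-bit symbols, most significant pair first."""
    return [PALETTE[(value >> shift) & 0b11] for shift in (6, 4, 2, 0)]

def pulses_to_byte(pulses):
    """Rebuild the byte from the colours a photodiode reported."""
    value = 0
    for colour in pulses:
        value = (value << 2) | INVERSE[colour]
    return value
```

A real module would also need clock recovery (a sync preamble, say) so the photodiode knows where one pulse ends and the next begins.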

  • MIT gets musical with Arduino-powered DrumTop, uses household objects as a source of sound

    by Zach Honig
    04.24.2012

    Everyone's favorite microcontroller has been a boon for hobbyists and advanced amateurs, but it's also found a home among the brilliant projects at MIT's Media Lab, including a groovy instrument called DrumTop. This modern take on the drum pad delivers Arduino-powered interactivity in its simplest form -- hands-on time with ordinary household objects. Simply place a cup, a plastic ball or even a business card on the DrumTop to make your own original music. The prototype on display today includes eight pads, which are effectively repurposed speakers that tap objects placed on top, with a force-sensitive resistor (FSR) recognizing physical pressure and turning it into a synchronized beat. There's also a dial in the center that lets you speed up or slow down the taps, providing an adjustable tempo. DrumTop is more education tool than DJ beat machine, serving to teach youngsters about the physical properties of household objects, be it a coffee mug, a CD jewel case or a camera battery. But frankly, it's a lot of fun for folks of every age. There's no word on when you might be able to take one home, so for now you'll need to join us on our MIT visit for a closer look. We make music with all of these objects and more in the video after the break.
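The core loop is easy to picture: the center dial sets the gap between taps, and each force sensor decides whether its pad has something worth tapping. A toy Python sketch with made-up names, ranges and thresholds -- the actual Arduino firmware isn't public here:

```python
def beat_interval(dial_position, min_bpm=60, max_bpm=240):
    """Map the centre dial (0.0 to 1.0) to the delay between taps, in seconds."""
    bpm = min_bpm + dial_position * (max_bpm - min_bpm)
    return 60.0 / bpm

def pads_to_fire(fsr_readings, threshold=0.1):
    """Return the indices of pads whose force sensor detects an object on top."""
    return [i for i, pressure in enumerate(fsr_readings) if pressure > threshold]
```

On each tick, the firmware would pulse the repurposed speaker under every pad returned by `pads_to_fire`, then wait `beat_interval` seconds before the next tap.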

  • Newsflash uses high-frequency light to transmit data from iPad to smartphone, we go hands-on (video)

    by Zach Honig
    04.24.2012

    MIT's Media Lab is chock-full of cutting-edge tech projects that researchers create, then often license to manufacturers and developers. One such project, called Newsflash, uses high-frequency red and green light to transmit data to the built-in camera on a receiving device -- in this case Samsung's Epic 4G. The concept is certainly familiar, and functions in much the same way as a QR code, generating flashing light that's invisible to the human eye instead of a cumbersome 2D square. In the Media Lab's implementation, an iPad is used to display a static news page with flashing colored bands at the top, representing just a few vertical pixels on the LCD. As the device presents the standard touch experience you're already familiar with, it also broadcasts data that can be read by any camera, but flashes too quickly to be distracting or even noticeable to the naked eye. A Newsflash app then interprets those flashes and displays a webpage as instructed -- either a mobile version with the same content, or a translation of a foreign website. As with most Media Lab projects, Newsflash is simply a concept at this point, but it could one day make its way to your devices. Jump past the break to see it in action.
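Conceptually, the decode side is simple: the phone's camera samples the colored bands each frame, maps every flash to a bit, then reassembles bytes. A deliberately naive Python sketch, assuming one bit per flash (green = 1, red = 0) and perfect frame alignment -- the real Newsflash protocol is surely more robust:

```python
def frames_to_bits(frames):
    """Map each sampled band colour to a bit: green means 1, red means 0."""
    return [1 if colour == "green" else 0 for colour in frames]

def bits_to_text(bits):
    """Pack bits (most significant first) into bytes and decode as characters."""
    chars = []
    for i in range(0, len(bits) - 7, 8):
        byte = 0
        for bit in bits[i:i + 8]:
            byte = (byte << 1) | bit
        chars.append(chr(byte))
    return "".join(chars)
```

In practice the payload would more likely be a short URL than raw text, and the app would have to cope with dropped frames and ambient light.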

  • MIT builds camera that can capture at the speed of light (video)

    by Daniel Cooper
    12.13.2011

    A team from the MIT Media Lab has created a camera with a "shutter speed" of one trillion exposures per second -- enabling it to record light itself traveling from one point to another. Using a heavily modified streak tube (which is normally used to intensify photons into electron streams), the team could snap a single image of a laser pulse as it passed through a soda bottle. In order to create the slow-motion film in the video we've got after the break, the team had to replicate the experiment hundreds of times. The stop-motion footage shows how light bounces through the bottle, collecting inside the opaque cap before dispersing. The revolutionary snapper may have a fast shutter, but the long time it takes to process the images has earned it the nickname of "the world's slowest fastest camera." [Image courtesy of MIT / M. Scott Brauer]

  • MIT's folding CityCar takes a spin on video, still no room for humans

    by Donald Melanson
    08.26.2011

    The MIT Media Lab has been working on a folding, stackable electric vehicle for quite a few years now, but it seems those have at least been fairly productive years, as the so-called CityCar has finally progressed to something resembling a finished prototype. The only problem for those eager to hop into one is that it's a half-sized prototype, which makes accommodating a driver just a tad difficult. It does do a fairly good job of folding itself up, though, and MIT expects a full-size version to go into production in 2013. Interestingly, MIT doesn't necessarily see people actually owning the vehicles themselves, but it would like to see them made available throughout cities -- letting you rent one for a short trip across town, for instance, without having to worry about returning it. Head on past the break to see it on video, courtesy of The Next Web.

  • MIT Media Lab develops glasses-free HR3D, supports broad viewing angles (video)

    by Zach Honig
    05.04.2011

    We've already seen plenty of glasses-free 3D HDTVs and portable devices, but a promising new technology called HR3D (High-Rank 3D) has hit the prototype phase. Engineers from MIT's Media Lab, who developed the new solution, say that it avoids compromising on screen brightness, resolution, viewing angle, and battery life, and doesn't require those pesky (and pricey) 3D glasses. HR3D uses a pair of layered LCDs to give the illusion of depth, with the top layer (or mask) displaying a variable pattern based on the image below it, so each eye sees a slightly different picture. Nintendo's 3DS uses a similar technique, but with a parallax barrier instead of a second display. The designers constructed the prototype from two Viewsonic VX2265wm displays, removing the LCDs from their housings and pulling off polarizing filters and films. We've yet to go eyes-on with HR3D, so we're a mite skeptical, but tech this promising is worth watching closely, and from every angle.

  • Kinect used to make teleconferencing actually kind of cool (video)

    by Vlad Savov
    04.04.2011

    No matter how hard Skype and others try to convince us otherwise, we still do most of our web communications via text or, if entirely unavoidable, by voice. Maybe we're luddites or maybe video calling has yet to prove its value. Hoping to reverse such archaic views, researchers at the MIT Media Lab have harnessed a Kinect's powers of depth and human perception to provide some newfangled videoconferencing functionality. First up, you can blur out everything on screen but the speaker to keep focus where it needs to be. Then, if you want to get fancier, you can freeze a frame of yourself in the still-moving video feed for when you need to do something off-camera, and to finish things off, you can even drop some 3D-aware augmented reality on your viewers. It's all a little unrefined at the moment, but the ideas are there and well worth seeing. Jump past the break to do just that.

  • Operabots take center stage at MIT Media Lab's 'Death and the Powers' opera

    by Donald Melanson
    03.23.2011

    It already had its premiere in Monaco last year, but composer Tod Machover's new opera, "Death and the Powers," has now finally made it to the United States. Why are we reporting on a new opera (rather than Opera) on Engadget? Well, it just so happens to feature the "Operabots" pictured above, which were developed by MIT's Media Lab. The lab also helped develop some of the opera's other high-tech components, but it seems like the Operabots are the real standout -- they're "semi-autonomous" and freely roam around the stage throughout the opera, acting as a Greek chorus. Not surprisingly, the opera itself also deals with some futuristic subject matter. The Powers of the title is Simon Powers, a "Bill Gates, Walt Disney-type" who decides to upload his consciousness into "The System" before he dies -- hijinks then ensue. Those in Boston can apparently still get tickets for the final performance on March 25th -- after that it moves on to Chicago for four performances between April 2nd and 10th. Head on past the break for a preview.

  • MIT Media Lab gets a multiplicitous new logo (video)

    by Vlad Savov
    03.10.2011

    Logos can be surprisingly divisive things, so the MIT Media Lab has decided to cheat a little bit with its new identity: it won't have just one logo, it'll have 40,000. You heard / read / imagined that right, the new Media Lab logo will simply be the concept of three intersecting "spotlights," composed of three colors, straight lines, three black squares, and a few blending gradients. There's an algorithm behind it all, which is used to generate a unique logo for every new member of staff, meaning that although trademark claims may be a headache to enforce, originality will continue thriving in the Lab for a long time to come. Hit the source link to learn more or leap past the break for a nice video rundown.
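One way such a generative identity can work is to hash each member's name into a seed, then derive the spotlight geometry and palette from it. A Python sketch under assumed details -- the SHA-256 hash, the parameter mapping and using the 40,000-variant count as a modulus are all our own choices, not the Lab's published algorithm:

```python
import hashlib

def logo_variant(member_name, variants=40000):
    """Deterministically pick one of the variants and derive drawing parameters."""
    digest = hashlib.sha256(member_name.encode("utf-8")).digest()
    seed = int.from_bytes(digest[:4], "big") % variants
    # Derive three spotlight angles (degrees) and a colour index from the seed.
    angles = [(seed * prime) % 360 for prime in (17, 31, 47)]
    colour_index = seed % 3
    return seed, angles, colour_index
```

Because the hash is deterministic, the same member always gets the same logo, while nearly every new name lands on a fresh variant.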

  • 3D printed concert flute rapidly prototypes sound (video)

    by Sean Hollister
    12.29.2010

    The world's first store for 3D printed goods just opened in Brussels, and while we imagine they've already got a fair selection of prototyped merchandise to choose from, might we suggest they invest in a few production runs of this fabulous new flute? Amit Zoran of the MIT Media Lab -- yes, the same soul who helped dream up a 3D food printer earlier this year -- has now printed a fully functional concert flute with a minimum of human intervention. Directing an Objet Connex500 3D printer (which can handle multiple materials at the same time) to spit out his CAD design, dollop by tiny dollop, in a single 15-hour run, he merely had to wash off support material, add springs and assemble four printed pieces to finish the instrument. The proof of the pudding is in the eating, of course, so how does it sound? Find out for yourself in the video below.

  • Proverbial Wallets make your metaphysical money a little more tangible

    by Tim Stevens
    12.09.2010

    Counting dollars and cents on the checkout counter really makes you feel the weight of every expenditure. Swiping a credit card or waving an NFC device over a sensor? Not so much. Enter the Proverbial Wallets from the Information Ecology group at the MIT Media Lab, three separate devices that use three haptic techniques to curtail your spending. First is the Bumblebee, which buzzes and vibrates whenever money comes into or goes out of your account. Next is Mother Bear, which becomes harder to open as you get closer to your spending goal. Finally, there's Peacock, which swells proudly as your bank balance does the same. Sadly, none of these are actually available yet, but we have a feeling that if they were, they might put a bit of a hurting on our very real and very strict budgets.

  • Kinect hacks let you control a web browser and Windows 7 using only The Force (updated)

    by Thomas Ricker
    11.25.2010

    Hacking the Xbox 360 Kinect is all about baby steps on the way to what could ultimately amount to some pretty useful homebrew. Here's a good example cooked up by some kids at the MIT Media Lab Fluid Interfaces Group attempting to redefine the human-machine interactive experience. DepthJS is a system that makes JavaScript talk to Microsoft's Kinect in order to navigate web pages, among other things. Remember, it's not that making wild, arm-waving gestures is the best way to navigate a website, it's just a demonstration that you can. Let's hope that the hacking community picks up the work and evolves it into a multitouch remote control plugin for our home theater PCs. Boxee, maybe you can lend a hand? Update: If you're willing to step outside of the developer-friendly borders of open-source software, then you'll want to check out Evoluce's gesture solution based on the company's Multitouch Input Management (MIM) driver for Kinect. The most impressive part is its support for simultaneous multitouch and multiuser control of applications (including those using Flash and Java) running on a Windows 7 PC. Evoluce promises to release software "soon" to bridge Kinect and Windows 7. Until then, be sure to check out both of the impressive videos after the break. [Thanks, Leakcim13]

  • MIT's Android optometry app could help you stop squinting all the time (video)

    by Tim Stevens
    07.02.2010

    Remember Bokodes, MIT's tiny replacement for barcodes and the like? Their holographic nature enabled them to represent different information from different angles, and it's this property that allows the tech behind them to be used in a very different and even more useful way: figuring out just how busted your vision is. The Camera Culture team at MIT's Media Lab evolved that tech into a $2 box that refracts the image displayed on a smartphone screen. When combined with an app that displays a set of dots and lines, the user can manipulate the image until things look to be perfectly aligned. Once complete, the app spits out a prescription and you're just a quick trip to your local mall-based eyeglasses joint away from perfect vision. The goal is to make it easier for optometrists in developing countries to quickly and easily find glasses for people, but an app that could save a trip to the doctor's office is a wonderful thing regardless of where you are.

  • How do you make teachers angry? Take an app out of the App Store

    by Steve Sande
    04.23.2010

    iPhone, iPod touch and iPad users are used to hearing about new apps showing up in the App Store. It's when they are taken out of the App Store by Apple that things get interesting. Teachers across the country got a taste of "interesting" last week when Apple removed Scratch Viewer from the App Store. The app is used to display programs that have been written by children in the Scratch programming language, a popular language for teaching kids the basics of computer programming. Scratch was developed by a team at the MIT Media Lab, and the app was written by John McIntosh of Canadian development firm Smalltalk Consulting, Ltd. The Computing Education Blog broke the news and received a number of comments protesting Apple's decision. While Apple is remaining quiet on the subject, McIntosh notes that he's in negotiations with the company. Many bloggers think that Apple's rationale for killing Scratch Viewer is that it violates Section 3.3.1 of the company's policy against apps that interpret or execute code -- the same reason Apple is quashing Adobe Flash-based apps. Mitchel Resnick, who runs the Scratch team at MIT, says that he's "disappointed that Apple decided not to allow a Scratch player on the iPhone or iPad" and hopes that "Apple will reconsider its policies so that more kids can experience the joys of creating and sharing with Scratch." The team is planning to write Scratch authoring tools for the iPad, but whether those plans come to fruition is up to Apple. [via NYT Gadgetwise Blog]

  • MIT Media Lab's Surround Vision brings virtual reality to a tablet (video)

    by Tim Stevens
    04.09.2010

    Sure, 3D adds a little more dimensionality to your couch-bound viewing experience, but it's far from the truly immersive virtual reality people have been promised for decades. Surround Vision isn't quite VR either, but it's an interesting way of breaking the perception barrier, allowing a viewer to pan around a scene outside the perspective offered by one display. It's a project by Santiago Alfaro, a graduate student at MIT's Media Lab, and relies on a tablet with a compass. In his demo, he filmed video from three perspectives and is able to display the center perspective on the main TV while panning around to the other two with the tablet. It's an interesting idea to bring some aspect of interactivity to the viewing process, but we could see Hollywood turning it into the next big gimmick, with the leading man pointing off screen dramatically and saying "Oh my god, what's that?" before waiting patiently for a few seconds while the audience scrambles to pan around and find the horror. Yeah, we've got your number, Michael Bay. Immersive video demonstration after the break for you to lose yourself in.
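The compass trick reduces to mapping the tablet's heading onto one of the filmed perspectives. A minimal Python sketch, assuming three feeds and a hypothetical 60-degree window per feed -- Alfaro's actual mapping isn't specified:

```python
def perspective_for_heading(heading_deg, center_deg=0.0, fov_deg=60.0):
    """Pick the left, center or right camera feed from a compass heading."""
    # Normalise the offset from the main screen's direction into [-180, 180).
    delta = (heading_deg - center_deg + 180.0) % 360.0 - 180.0
    if delta < -fov_deg / 2:
        return "left"
    if delta > fov_deg / 2:
        return "right"
    return "center"
```

Turning the tablet past half the window in either direction swaps in the corresponding side feed; small wobbles around the screen's heading stay on the center perspective.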

  • MIT gestural computing makes multitouch look old hat

    by Vlad Savov
    12.11.2009

    Ah, the MIT Media Lab, home to Big Bird's illegitimate progeny, augmented reality projects aplenty, and now three-dimensional gestural computing. The new bi-directional display being demoed by the Cambridge-based boffins performs both multitouch functions that we're familiar with and hand movement recognition in the space in front of the screen -- which we're also familiar with, but mostly from the movies. The gestural motion tracking is done via embedded optical sensors behind the display, which are allowed to see what you're doing by the LCD alternating rapidly (invisible to the human eye, but probably not to human pedantry) between what it's displaying to the viewer and a pattern for the camera array. This differs from projects like Natal, which have the camera offset from the display and therefore cannot work at short distances, but if you want even more detail, you'll find it in the informative video after the break. [Thanks, Rohit]

  • MIT's Affective Intelligent Driving Agent is KITT and Clippy's lovechild (video)

    by Vlad Savov
    10.30.2009

    If we've said it once, we've said it a thousand times, stop trying to make robots into "friendly companions!" MIT must have some hubris stuck in its ears, as its labs are back at it with what looks like Clippy gone 3D, with an extra dash of Knight Rider-inspired personality. What we're talking about here is a dashboard-mounted AI system that collects environmental data, such as local events, traffic and gas stations, and combines it with a careful analysis of your driving habits and style to make helpful suggestions and note points of interest. By careful analysis we mean it snoops on your every move, and by helpful suggestions we mean it probably nags you to death (its own death). Then again, the thing's been designed to communicate with those big Audi eyes, making even our hardened hearts warm just a little. Video after the break.

  • Intelligent Siftables blocks get even more face time

    by Darren Murph
    02.26.2009

    We were captivated when we first came across David Merrill's brilliantly simple Siftables idea last year, and though some time has passed us by, we're no less amazed this go 'round. The MIT graduate student has posted a few more videos and images of his pet project, which aims to utilize computerized tiles to initiate learning and allow Earthlings to "interact with information and media in physical, natural ways that approach interactions with physical objects in our everyday lives." Call us crazy, but we're betting Art Lebedev would totally take these commercial. Check out a music sequencer vid just past the break, and catch the rest of the media in the read link. [Via TED, thanks Kaptix]