CSAIL
Ever-changing memory could lead to faster processors
Virtually every central processor in your devices uses a tiered set of memory caches to speed things up by fetching commonly used data. But it's not very efficient -- in trying to accommodate everything, it's rarely the fastest at anything. MIT's CSAIL researchers want to fix that. They've developed a cache system (appropriately named Jenga) that creates new cache structures on the spot to optimize for a specific app. Because Jenga knows the physical location of each memory bank, it can calculate how to store data to reduce travel time (and thus lag) as much as possible, even if that means changing the hierarchy. Whether an app would benefit from multiple cache levels or one gigantic cache, the system can provide it.
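To get a feel for the placement idea, here's a minimal sketch, not Jenga's actual allocator: assume made-up bank latencies and capacities, and greedily fill the closest banks first so the bulk of an app's data sits where access is cheapest.

```python
# Toy sketch: place an app's working set across physical memory banks,
# filling the lowest-latency banks first. Bank names, latencies and sizes
# are invented for illustration -- this is not Jenga's real algorithm.

BANKS = {            # bank -> (access latency in cycles, capacity in MB)
    "bank_near_0": (10, 4),
    "bank_near_1": (12, 4),
    "bank_far_0": (25, 16),
    "bank_far_1": (40, 16),
}

def place(working_set_mb):
    """Greedily allocate a working set to banks ordered by access latency."""
    allocation = {}
    remaining = working_set_mb
    for bank, (latency, capacity) in sorted(BANKS.items(), key=lambda kv: kv[1][0]):
        if remaining <= 0:
            break
        used = min(remaining, capacity)
        allocation[bank] = used
        remaining -= used
    return allocation

if __name__ == "__main__":
    # A small working set fits entirely in the nearby banks; a large one
    # spills into the farther, slower banks.
    print(place(6))    # {'bank_near_0': 4, 'bank_near_1': 2}
    print(place(30))   # spreads across near and far banks
```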
MIT already has your flying car in miniature form
Some drones fly, others drive. Those that can do both, however, can reach places other machines can't, making them ideal for search and rescue -- or package delivery. That's why a team from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) has developed a fleet of autonomous drones with both rotors and wheels, giving them the ability to avoid obstacles on the ground and to go underneath overhead obstructions. Just imagine a machine that can fly to a disaster zone and then drive through the gaps of collapsed buildings to search for survivors.
MIT teaches machines to learn from each other
There are two typical ways to train a robot today: you can have it watch repeated demonstrations of what you want it to do, or you can program its movements directly using motion-planning techniques. But a team of researchers from MIT's CSAIL has developed a hybridized third option that will enable robots to transfer skills and knowledge between themselves. It's no Skynet, but it's a start.
MIT's app only needs a second to teach you a new language
You know the seconds and minutes you waste waiting for the elevator to arrive, for a friend to reply to an IM or for a website to load? Researchers at MIT CSAIL believe you can put them to good use, so they created a series of apps called WaitSuite that make the most of those idle moments by helping you learn a new language. The tools can test your vocabulary without whisking you away to another app. For instance, if you're chatting with a friend, a flashcard asking you about a word in the language you're learning will pop up within the IM itself. If you're waiting for a website to load, the card will appear within the browser.
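The core trick is just filling a wait with a micro-task. Here's a toy sketch of that pattern in Python, with a simulated page load standing in for the real thing (the deck and timings are invented, and WaitSuite itself isn't built this way):

```python
# Toy sketch of "learn while you wait": while a slow task (here a simulated
# page load) is pending, quiz the user with flashcards. Illustration only.

import asyncio
import random

FLASHCARDS = [("gato", "cat"), ("perro", "dog"), ("libro", "book")]  # made-up deck

async def slow_page_load():
    await asyncio.sleep(3)          # stand-in for a real network request
    return "<html>page</html>"

async def flashcard_quiz(stop: asyncio.Event):
    while not stop.is_set():
        word, answer = random.choice(FLASHCARDS)
        print(f"While you wait: what does '{word}' mean? (answer: {answer})")
        await asyncio.sleep(1)

async def main():
    stop = asyncio.Event()
    quiz = asyncio.create_task(flashcard_quiz(stop))
    page = await slow_page_load()   # the wait being filled with vocabulary
    stop.set()
    await quiz
    print("Page loaded:", page)

asyncio.run(main())
```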
MIT finds an easy way to control robots with your brain
A team from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) wanted robots to be a more natural extension of our bodies. See, you'd usually have to issue vocal or very specific mental commands to control machines. But the method the CSAIL team developed works simply by reading your brain and detecting if you've noticed an error as the robot performs its tasks.
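As a rough illustration of the error-detection loop (not CSAIL's actual EEG classifier), imagine thresholding a short brain-signal window right after the robot acts and correcting course when the signal looks like an "oops" response:

```python
# Toy sketch of EEG-based error detection: after the robot acts, examine a short
# window of (simulated) EEG samples and flag an error-related response if the
# mean amplitude crosses a threshold. Feature and threshold are illustrative only.

import random

ERROR_THRESHOLD_UV = 5.0   # made-up amplitude threshold in microvolts

def read_eeg_window(error_felt: bool, n_samples: int = 100):
    """Simulate a post-action EEG window; a real system streams from a headset."""
    baseline = 8.0 if error_felt else 0.0
    return [baseline + random.gauss(0, 2.0) for _ in range(n_samples)]

def noticed_error(window) -> bool:
    """Very crude detector: high average amplitude ~ error-related response."""
    return sum(window) / len(window) > ERROR_THRESHOLD_UV

def robot_step(action: str, operator_disagrees: bool):
    window = read_eeg_window(error_felt=operator_disagrees)
    if noticed_error(window):
        print(f"Operator flagged '{action}' as wrong -- switching to the other choice.")
    else:
        print(f"No error signal -- continuing with '{action}'.")

robot_step("sort item into left bin", operator_disagrees=False)
robot_step("sort item into left bin", operator_disagrees=True)
```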
MIT's 'Super Smash Bros.' AI can compete with veteran players
For expert players, most video game AI amounts to little more than target practice -- especially in fighting games, where it rarely accounts for the subtleties of human behavior. At MIT, though, researchers have developed a Super Smash Bros. Melee AI that should make even seasoned veterans sweat a little. The CSAIL team trained a neural network to fight by handing it the coordinates of game objects and giving it incentives to play in ways that should secure a win. The result is an AI brawler that has largely learned to fight on its own -- and is good enough to usually prevail over players ranked in the top 100 worldwide.
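The incentive part is classic reinforcement learning: the agent sees coordinates, tries actions and keeps whatever earns reward. A stripped-down sketch on a one-dimensional "spacing game" (invented here; the real agent is a neural network on actual Melee state) shows the idea:

```python
# Toy reward-driven learner: the state is just the distance to the opponent,
# and rewards teach it to close the gap and attack at the right range.

import random
from collections import defaultdict

ACTIONS = ["approach", "retreat", "attack"]
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.1

def step(distance, action):
    """Simplified game: attacking at distance 1 lands a hit (+1), otherwise whiffs (-1)."""
    if action == "approach":
        return max(1, distance - 1), 0.0
    if action == "retreat":
        return min(5, distance + 1), 0.0
    return distance, (1.0 if distance == 1 else -1.0)   # attack

q = defaultdict(float)   # (distance, action) -> learned value

for _ in range(2000):
    distance = random.randint(1, 5)
    for _ in range(10):
        if random.random() < EPSILON:
            action = random.choice(ACTIONS)          # explore
        else:
            action = max(ACTIONS, key=lambda a: q[(distance, a)])  # exploit
        nxt, reward = step(distance, action)
        best_next = max(q[(nxt, a)] for a in ACTIONS)
        q[(distance, action)] += ALPHA * (reward + GAMMA * best_next - q[(distance, action)])
        distance = nxt

# The learned policy: approach until distance 1, then attack.
for d in range(1, 6):
    print(d, max(ACTIONS, key=lambda a: q[(d, a)]))
```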
MIT demos smartwatch app that detects emotions
A conversation is never just about the words we speak; it's about our tone, volume, body language, gaze and everything in between. But the signals we send out can sometimes be misinterpreted, or ignored, by people who struggle to understand non-verbal communication. That's what prompted researchers at MIT to develop software that could take the ambiguity out of what people say and what they do.
MIT: Carpooling services could replace most NYC cabs
Researchers have proved mathematically what you probably already knew: Carpooling services are more efficient, less polluting and less costly than traditional taxis. Using data from three million New York City taxi rides, a team from MIT's CSAIL found that just 3,000 vehicles from services like UberPOOL and Lyft Line could replace NYC's 14,000-strong cab fleet. What's more, they'd cut congestion threefold, barely affect travel times, and you'd only have to wait an average of 2.7 minutes for a ride.
ICYMI: A new form of whale communication, found
Today on In Case You Missed It: A new Marine Mammal Science publication found that humpback whales slap the surface of the water to communicate with one another, although what they're actually saying is still a mystery. Meanwhile, MIT's CSAIL lab created a CAD-like program for designing UAVs; the best part is that the software lets you test your creation virtually to see if it would fly in real life. The Tesla coil video by SmarterEveryDay is pretty great, and for fun you may want to watch the Turkish satellite heading up to space. As always, please share any interesting tech or science videos you find by using the #ICYMI hashtag on Twitter for @mskerryd.
AI can create videos of the future
Loads of devices can preserve moments on camera, but what if you could capture situations that were about to happen? It's not as far-fetched as you might think. MIT CSAIL researchers have crafted a deep learning algorithm that can create videos showing what it expects to happen in the future. After extensive training (2 million videos), the AI system generates footage by pitting two neural networks against each other. One creates the scene by determining which objects are moving in still frames. The other, meanwhile, serves as a quality check -- it determines whether videos are real or simulated, and the artificial video is a success when the checker AI is fooled into thinking the footage is genuine.
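That generator-versus-checker recipe is easier to see on something much smaller than video. The sketch below runs the same two-player loop on plain numbers, with a toy generator and discriminator standing in for the deep networks:

```python
# Toy sketch of the adversarial setup on 1-D numbers instead of video frames:
# a generator maps noise to a sample, a discriminator scores real vs. generated,
# and each nudges its parameters against the other. Illustration only -- the
# real system trains deep networks over millions of videos.

import math
import random

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

def real_sample():
    # "Real data": numbers drawn around 3.0 (stand-in for real footage statistics).
    return random.gauss(3.0, 0.5)

# Generator: x = a*z + b, with noise z ~ N(0, 1).
a, b = 1.0, 0.0
# Discriminator: D(x) = sigmoid(w*x + c), the probability that x is real.
w, c = 0.0, 0.0
LR = 0.05

for _ in range(5000):
    z = random.gauss(0.0, 1.0)
    x_fake = a * z + b
    x_real = real_sample()

    # Discriminator update: push D(real) up and D(fake) down.
    d_real, d_fake = sigmoid(w * x_real + c), sigmoid(w * x_fake + c)
    grad_w = -(1 - d_real) * x_real + d_fake * x_fake
    grad_c = -(1 - d_real) + d_fake
    w -= LR * grad_w
    c -= LR * grad_c

    # Generator update: make the discriminator call the fake sample "real".
    d_fake = sigmoid(w * x_fake + c)
    grad_x = -(1 - d_fake) * w          # d(-log D(x_fake)) / d x_fake
    a -= LR * grad_x * z
    b -= LR * grad_x

print(f"generator samples are now centered near {b:.2f} (real data is centered at 3.0)")
```

The checker succeeds when it can still tell the two apart; the generator succeeds when it can't, which is exactly the "fooled into thinking the footage is genuine" condition above.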
MIT makes neural nets show their work
Turns out, the inner workings of neural networks really aren't any easier to understand than those of the human brain. But thanks to research coming out of MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL), that could soon change. They've devised a means of making these digital minds not just provide the correct answer, classification or prediction, but also explain the rationale behind their choices. And with this ability, researchers hope to bring a new weapon to bear in the fight against breast cancer.
MIT's Foundry software is the 'Photoshop of 3D printing'
Because 3D-printed materials aren't the most functional, 3D printers' output has largely been limited to prototyping in the past. That should change in the near future with devices like MIT's own MultiFab, which can print up to 10 different materials at a time, but that still doesn't solve the problem of how to design such complex objects. That's where Foundry, a new program created by MIT's Computer Science and Artificial Intelligence Laboratory, comes in.
ICYMI: The NYPL's book train and better-bouncing 'bots
Today on In Case You Missed It: The New York Public Library will unveil a brand new "book train" at its Bryant Park branch that will ferry research materials up 11 floors from a subterranean storage vault to a newly refurbished reading room. Also, MIT's CSAIL lab has developed a 3D-printed, "tuneable" shock absorber that can protect anything from autonomous drones to cellular phones against violent impacts. Finally, we bring you the mesmerizing aerial ballet that is the world indoor skydiving championships. As always, please share any interesting tech or science videos you find by using the #ICYMI hashtag on Twitter for @mskerryd.
MIT's shock-absorbing robots are safer and more precise
Soft robots aren't just about speed and grace... they should be safer, too. To prove that point, MIT's CSAIL has developed bouncing robots whose 3D-printed soft skins act as shock absorbers. The technique revolves around printing a "programmable viscoelastic material" where every aspect of the skin (which includes solids, liquids and a rubber-like substance called TangoBlack+) is tuned to the right level of elasticity. The robot can give way where it needs to, but remain solid otherwise. As a result, it can bounce around without taking damage, and land four times more precisely than it would with an inflexible surface.
MIT 'radio' uses wireless signals to identify emotions
You can lie to your partner, your best friends and even your mom, but you can't lie to EQ-Radio. It's a device out of MIT's Computer Science and Artificial Intelligence Lab (CSAIL) that can tell how you truly feel by bouncing wireless signals off your body. Yep, you don't need to be connected to the device with ECG patches and wires. EQ-Radio has algorithms that can extract your heartbeat from the signals your body reflects. It then analyzes each heartbeat and compares it to your previous measurements.
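The "compare each beat to your history" step can be sketched very roughly like this, with invented intervals, features and baselines (EQ-Radio's real pipeline first has to recover the heartbeat from reflected RF, which is the hard part):

```python
# Toy sketch: given inter-beat intervals (seconds between heartbeats), compute
# simple features and pick the closest previously recorded emotional baseline.
# The numbers and categories are made up for illustration.

from statistics import mean, pstdev

# Hypothetical per-person baselines: emotion -> (mean interval, interval variability)
BASELINES = {
    "calm":  (0.90, 0.05),
    "happy": (0.75, 0.08),
    "angry": (0.65, 0.03),
    "sad":   (0.95, 0.02),
}

def features(intervals):
    return mean(intervals), pstdev(intervals)

def classify(intervals):
    m, s = features(intervals)
    # Nearest baseline in (mean, variability) space.
    return min(BASELINES, key=lambda e: (BASELINES[e][0] - m) ** 2 + (BASELINES[e][1] - s) ** 2)

print(classify([0.64, 0.66, 0.65, 0.63, 0.66]))   # -> 'angry' with these made-up numbers
```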
MIT's smarter routers promise to fight crowded networks
MIT hates overcrowded networks just as much as you do, and its CSAIL division has made two breakthroughs that could clear up the data pipes. To begin with, it's developing programmable routers that can still keep up with bandwidth-heavy services like streaming video. Instead of trying to create an elaborate rule system for deciding which data packets get through (which could bog a router down or consume a lot of chip space), researchers broke things down into simple computing elements that could handle a wide range of tasks. You'd only have to combine different elements to achieve the intended effect, which could help networks adapt to new conditions -- that hot new mobile game might not cause chaos.
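The composable-elements idea is easy to sketch: one small piece computes a rank for each packet, another releases packets in rank order, and swapping the rank piece changes the whole policy. The element names and packet fields below are illustrative, not the team's actual router primitives:

```python
# Toy sketch of composing small scheduling elements instead of hard-coding one
# policy: a rank element scores each packet, a queue element releases the
# lowest-ranked packet next, and different combinations give different schedulers.

import heapq
from dataclasses import dataclass
from itertools import count

@dataclass
class Packet:
    flow: str
    size: int

def rank_by_flow_priority(pkt, priorities):
    """Rank element: smaller rank = scheduled sooner."""
    return priorities.get(pkt.flow, 10)

def rank_smallest_first(pkt, _priorities):
    """Alternative rank element: favor small packets."""
    return pkt.size

class RankQueue:
    """Queue element: always releases the lowest-ranked packet next."""
    def __init__(self, rank_fn, priorities=None):
        self.rank_fn, self.priorities = rank_fn, priorities or {}
        self.heap, self.tie = [], count()

    def push(self, pkt):
        heapq.heappush(self.heap, (self.rank_fn(pkt, self.priorities), next(self.tie), pkt))

    def pop(self):
        return heapq.heappop(self.heap)[2]

# Same queue element, different rank element = a different scheduler.
packets = [Packet("backup", 1500), Packet("game", 200), Packet("video_call", 800)]

q1 = RankQueue(rank_by_flow_priority, priorities={"video_call": 0, "game": 1, "backup": 5})
for p in packets:
    q1.push(p)
print([q1.pop().flow for _ in range(3)])   # ['video_call', 'game', 'backup']

q2 = RankQueue(rank_smallest_first)
for p in packets:
    q2.push(p)
print([q2.pop().size for _ in range(3)])   # [200, 800, 1500]
```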
ICYMI: One Pen to rule all and video you can manipulate
Today on In Case You Missed It: Researchers from MIT's Computer Science and Artificial Intelligence Lab formulated a way to interact with existing videos, so they can prod and move objects within the frame. The Cronzy Pen samples colors from anything, anywhere, and mixes its own ink to match any shade. It's on IndieGoGo now, so good luck scoring your crowdfunded thing. If you want to check out video of Jupiter's moon Io, the Washington Post explained it all well, and the charming paper craft animations from yelldesign are here. As always, please share any interesting tech or science videos you find by using the #ICYMI hashtag on Twitter for @mskerryd.
New MIT tech lets you mess with objects in pre-recorded video
Despite everything going on within the frame, videos are still a passive experience to observe. You can't reach in and mess with the objects you're watching -- until now. An MIT researcher has pioneered new technology that lets you "touch" recorded things, which are simulated to respond as if you'd fiddled with them in the real world.
Scientists create glasses-free 3D for the movie theater
Watching glasses-free 3D on a TV is no longer an outlandish concept, but the same can't be said for movie theaters. How are you supposed to create the same parallax effect for everyone, whether they're up front or way in the back? Researchers at MIT CSAIL and Israel's Weizmann Institute of Science finally have a practical answer. Their Cinema 3D tech creates multiple parallax barriers in a single display, using lenses and mirrors to deliver a range of viewing angles across the whole theater. And unlike previous attempts at large-scale glasses-free 3D, you don't have to take a hit to resolution.
Computers learn to predict high-fives and hugs
Deep learning systems can already detect objects in a given scene, including people, but they can't always make sense of what people are doing in that scene. Are they about to get friendly? MIT CSAIL's researchers might help. They've developed a machine learning algorithm that can predict when two people will high-five, hug, kiss or shake hands. The trick is to have multiple neural networks predict different visual representations of people in a scene and merge those guesses into a broader consensus. If the majority foresees a high-five based on arm motions, for example, that's the final call.
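Here's a toy version of that voting step, with stub "networks" returning canned guesses instead of learned visual representations:

```python
# Toy sketch of merging several models' guesses about an upcoming greeting:
# each stub model votes and the majority (plurality) wins. The real system
# merges learned visual representations, not strings.

from collections import Counter
import random

GREETINGS = ["high-five", "hug", "kiss", "handshake"]

def make_stub_model(bias):
    """Stand-in for one trained network, biased toward one prediction."""
    def predict(frame):
        return bias if random.random() < 0.7 else random.choice(GREETINGS)
    return predict

models = [make_stub_model("high-five"), make_stub_model("high-five"), make_stub_model("hug")]

def predict_greeting(frame):
    votes = Counter(model(frame) for model in models)
    return votes.most_common(1)[0][0]          # the consensus call

print(predict_greeting(frame="video frame at t+1s"))   # usually 'high-five'
```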