mitcsail

Latest

  • MIT CSAIL

    Computers learn to predict high-fives and hugs

    by Jon Fingas
    06.21.2016

    Deep learning systems can already detect objects in a given scene, including people, but they can't always make sense of what people are doing in that scene. Are they about to get friendly? MIT CSAIL's researchers might help. They've developed a machine learning algorithm that can predict when two people will high-five, hug, kiss or shake hands. The trick is to have multiple neural networks predict different visual representations of people in a scene and merge those guesses into a broader consensus. If the majority foresees a high-five based on arm motions, for example, that's the final call.
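    The consensus step is essentially an ensemble vote. Below is a minimal sketch of that idea in Python; the action labels, the number of networks and the simple majority-vote scheme are assumptions for illustration, not the team's actual model.

    ```python
    from collections import Counter

    def consensus_prediction(per_network_predictions):
        """Merge independent guesses into one call by majority vote.

        per_network_predictions: a list of action labels, one per neural
        network (e.g. one network keyed on arm motion, another on body
        pose). The labels and networks here are hypothetical.
        """
        votes = Counter(per_network_predictions)
        action, count = votes.most_common(1)[0]
        return action, count / len(per_network_predictions)

    # Example: three of four hypothetical networks foresee a high-five.
    predicted, agreement = consensus_prediction(
        ["high-five", "high-five", "hug", "high-five"]
    )
    print(f"{predicted} ({agreement:.0%} agreement)")
    ```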

  • ICYMI: Underwater robot snake, Earth's ocean saving and more

    by Kerry Davis
    04.21.2016

    Today on In Case You Missed It: Princeton researchers discovered that ocean currents can move most anything around the globe within 10 years, which could help replenish dying ecosystems... and also spread pollution. Norwegian engineers came up with a mechanical snake for underwater inspection and simple repair jobs near oil drills. And Harvard wants to encourage kids to program with a new robot that can be used by everyone from kindergartners to high schoolers. Once that's conquered, the answer is clearly to build MIT's open-source duckcar. As always, please share any great tech or science videos you find by using the #ICYMI hashtag on Twitter for @mskerryd.

  • ICYMI: Obstacle-avoiding UAV, smartwatch whys and more

    by Kerry Davis
    11.04.2015

    Today on In Case You Missed It: A new wearable screen that runs Android while strapped to your wrist is out, but we can't help but make fun of it. MIT's Computer Science and Artificial Intelligence Lab created a UAV that can fly through a forest safely with an obstacle avoidance algorithm. And a prototype object-building gun lets users whip together large-scale 3D designs using run-of-the-mill packing tape.

  • Team MIT's robot lost the DARPA challenge but won over the crowd

    by Mona Lalwani
    06.12.2015

    At the DARPA Robotics Challenge last week, a robot drove in on a red UTV. The vehicle slowly came to a halt on the obstacle course as it reached the door of a simulated disaster building. The driver, a six-foot-two Atlas humanoid, sat motionless for many minutes. About half a dozen researchers wearing blue "TEAM MIT" vests looked on, like anxious parents waiting for their child to pick up the pace in a crucial race. When their robot eventually turned its body to get out of the vehicle, it shook uncontrollably for a few seconds before it leaped out of the car and fell flat on its face. The crowd collectively gasped and a loud aww rippled through the stands at Fairplex in Pomona, California. In that moment, one of the front-runners in the race became the underdog.

  • Inefficient? MIT's new chip software doesn't know the meaning of the word

    by Daniel Cooper
    12.12.2011

    Would you rather have a power-hungry cellphone that could software-decode hundreds of video codecs, or a hyper-efficient system-on-chip that only processes H.264? These are the tough decisions mobile designers have to make, but perhaps not for much longer. MIT's Computer Science and Artificial Intelligence Laboratory has developed a solution that could spell the end for inefficient devices. Myron King and Nirav Dave have expanded Arvind's BlueSpec software so engineers can tell it what outcomes they need and it'll decide on the most efficient design -- printing out hardware schematics in Verilog and software in C++. If this outcome-oriented system becomes widely adopted, we may never need to worry about daily recharging again: good, because we'll need that extra power to juice our sporty EV. [Image courtesy of MIT / Melanie Gonick]
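    To give a feel for the outcome-oriented idea, here's a toy sketch in Python (not BlueSpec, and not the actual tool's output) of picking the most efficient design that still meets a stated requirement; every candidate and power figure below is made up for illustration.

    ```python
    # Toy illustration of outcome-oriented design selection. The candidates
    # and power figures are hypothetical, not real hardware data.
    CANDIDATES = [
        {"name": "software decoder",      "codecs": {"h264", "vp8", "mpeg2"}, "power_mw": 900},
        {"name": "dedicated H.264 block", "codecs": {"h264"},                 "power_mw": 120},
        {"name": "hybrid decoder",        "codecs": {"h264", "vp8"},          "power_mw": 350},
    ]

    def pick_design(required_codecs):
        """Return the lowest-power candidate that still decodes every required codec."""
        feasible = [c for c in CANDIDATES if required_codecs <= c["codecs"]]
        return min(feasible, key=lambda c: c["power_mw"]) if feasible else None

    print(pick_design({"h264"})["name"])         # the dedicated block wins on efficiency
    print(pick_design({"h264", "vp8"})["name"])  # a broader requirement forces the hybrid
    ```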

  • Kinect finally fulfills its Minority Report destiny (video)

    by Vlad Savov
    12.09.2010

    Not to denigrate the numerous fine hacks that Kinect's undergone since its launch, but it's always nice to see the professionals come in and shake things up a little. A crew from MIT's brain labs has put together a hand detection system on Microsoft's ultra-versatile cam, which is sophisticated enough to recognize the position of both your palms and fingers. Just as a demonstration, they've tied that good stuff up to a little picture-scrolling UI, and you won't be surprised to hear that it's the closest thing to Minority Report's interactive gesture-based interface that we've seen yet. And it's all achieved with a freaking console peripheral. Video after the break.
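    For a sense of how depth-based hand tracking like this can work, here's a rough sketch of one common approach: segment the pixels nearest the camera and take their centroid. It is not the MIT group's actual pipeline, and the depth band used here is an assumption.

    ```python
    import numpy as np

    def find_hand(depth_frame, hand_depth_range=(400, 800)):
        """Locate the rough centroid of the nearest blob in a depth frame.

        depth_frame: 2D array of per-pixel distances in millimetres, as a
        Kinect-style sensor reports them. hand_depth_range is a hypothetical
        band where an outstretched hand sits, closer to the camera than the body.
        """
        near, far = hand_depth_range
        mask = (depth_frame >= near) & (depth_frame <= far)
        if not mask.any():
            return None
        ys, xs = np.nonzero(mask)
        return int(xs.mean()), int(ys.mean())  # (x, y) pixel centroid of the hand region

    # Example on synthetic data: a "hand" patch 500 mm away against a 2 m background.
    frame = np.full((480, 640), 2000)
    frame[200:260, 300:360] = 500
    print(find_hand(frame))  # roughly (329, 229)
    ```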