Carnegie Mellon University

Latest

  • Carnegie Mellon smart headlight prototype blacks out raindrops for clearer view of the road

    by Steve Dent
    07.04.2012

    Researchers from Carnegie Mellon have developed a prototype smart headlight that blots out individual drops of rain or snow -- improving vision by up to 90 percent. Built from an off-the-shelf Viewsonic DLP projector, a quad-core Intel Core i7 PC and a GigE Point Grey Flea3 camera, the Rube Goldberg-esque system starts by imaging raindrops as they arrive at the top of its view. The signal then goes to a processing unit, which uses a predictive model developed by the team to estimate each drop's path toward the road. Finally, the projector -- sharing the camera's optical path via a beamsplitter, much like modern digital 3D rigs -- transmits a beam with dark voids matching those predicted paths. The result? Light never hits the falling particles, and the cumulative effect is the illusion of a nearly precipitation-free road view -- at least in the lab. So far the whole process takes about a hundredth of a second (13 ms), but the researchers say that in an actual car, with many more drops to handle, it would need to run roughly ten times faster. At that speed, 90 percent of the light 13 feet in front of the headlights would pass through; even at just triple the current speed, drivers would get a 70 percent better view. To see if this tech has a snowflake's chance of making it out of the lab, go past the break for all the videos.
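
    For the curious, here's a minimal sketch of that predict-and-mask idea -- the resolution, latency, fall speed and function names below are placeholder assumptions of ours, not values or code from the CMU prototype:

```python
import numpy as np

# Minimal sketch of the predict-and-mask idea (not CMU's actual code).
# Assumes drops fall straight down at a roughly constant, known speed.

FRAME_H, FRAME_W = 480, 640      # projector/camera resolution (assumed)
LATENCY_S = 0.013                # end-to-end system latency (~13 ms)
FALL_SPEED_PX_S = 9000           # apparent drop speed in pixels/second (assumed)
DROP_RADIUS_PX = 3               # how big a void to carve per drop (assumed)

def headlight_mask(drop_positions):
    """Return a light mask with dark voids where drops are predicted to be.

    drop_positions: list of (row, col) detections near the top of the frame.
    """
    mask = np.ones((FRAME_H, FRAME_W), dtype=np.uint8)   # 1 = light on
    dy = int(FALL_SPEED_PX_S * LATENCY_S)                 # predicted fall during the latency window
    for row, col in drop_positions:
        r = min(FRAME_H - 1, row + dy)                    # predicted position when the beam arrives
        r0, r1 = max(0, r - DROP_RADIUS_PX), min(FRAME_H, r + DROP_RADIUS_PX + 1)
        c0, c1 = max(0, col - DROP_RADIUS_PX), min(FRAME_W, col + DROP_RADIUS_PX + 1)
        mask[r0:r1, c0:c1] = 0                            # carve a void around the drop
    return mask

# Example: two drops detected near the top of the frame
print(headlight_mask([(5, 100), (8, 320)]).sum())
```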

  • Carnegie Mellon researchers develop robot that takes inventory, helps you find aisle four

    by Alexis Santos
    06.30.2012

    Fed up with wandering through supermarket aisles in an effort to cross that last item off your shopping list? Researchers at Carnegie Mellon University's Intel Science and Technology Center for Embedded Computing have developed a robot that could ease your pain and help store owners keep items in stock. Dubbed AndyVision, the bot is equipped with a Kinect sensor, image processing and machine learning algorithms, 2D and 3D images of products, and a floor plan of the shop in question. As the mechanized worker roams around, it determines whether items are running low or out of stock and whether they've been shelved incorrectly. Employees then receive the data on iPads, while a public display updates an interactive map with product information for shoppers to peruse. The automaton is currently meandering through CMU's campus store, but it's expected to wheel out to a few local retailers for testing sometime next year. Head past the break to catch a video of the automated inventory clerk at work.
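
    As a rough illustration (not AndyVision's actual software), the restocking logic boils down to comparing what the robot sees in each shelf slot against the store's planogram; every slot and product name below is hypothetical:

```python
# Toy sketch of the shelf-audit step an AndyVision-style system needs (names hypothetical):
# compare the vision system's detections per shelf slot against the store's planogram.

planogram = {               # expected product per shelf slot (from the store floor plan)
    ("aisle-4", "slot-1"): "cereal-oats",
    ("aisle-4", "slot-2"): "cereal-bran",
    ("aisle-5", "slot-1"): "pasta-penne",
}

detections = {              # what the robot's camera/classifier reported on this pass
    ("aisle-4", "slot-1"): "cereal-oats",
    ("aisle-4", "slot-2"): None,            # nothing recognized: likely out of stock
    ("aisle-5", "slot-1"): "cereal-bran",   # wrong item: misplaced stock
}

def audit(planogram, detections):
    alerts = []
    for slot, expected in planogram.items():
        found = detections.get(slot)
        if found is None:
            alerts.append((slot, f"restock {expected}"))
        elif found != expected:
            alerts.append((slot, f"misplaced: found {found}, expected {expected}"))
    return alerts

for slot, msg in audit(planogram, detections):
    print(slot, msg)    # the sort of report that would be pushed to employees' iPads
```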

  • New shear touch technology lets you skip a double-tap, push your device around (video)

    by Jon Fingas
    05.11.2012

    Most every touchscreen on the market today can only register your finger input as coordinates; that's fine for most uses, but it leads to a lot of double-taps and occasionally convoluted gestures. A pair of researchers at Carnegie Mellon University, Chris Harrison and Scott Hudson, have suggested that shear touch might be a smarter solution. Instead of gliding over fixed glass, your finger could handle secondary tasks by pushing in a specific direction, or simply pushing harder, on a sliding display. Among the many examples of what shear touch could do, the research duo has raised the possibility of skipping through music by pushing left and right, or scrolling more slowly through your favorite website with a forceful dragging motion. The research is still a long way from producing a shipping device, although the study's partial funding from a Microsoft doctoral fellowship hints at one direction the technology might go. You can take a peek at the future in a video after the jump -- just don't expect a tablet-based Van Gogh this soon. [Thanks, Chris]
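
    To make the idea concrete, here's a hedged sketch of how a shear vector might be dispatched to actions like track skipping and slow scrolling; the thresholds and function names are our own assumptions, not anything from the Harrison/Hudson paper:

```python
# Toy dispatcher for shear input (thresholds and names are assumptions for illustration):
# the display reports how far and how hard the finger pushes the glass sideways,
# and the UI maps that vector to a secondary action.

SHEAR_THRESHOLD = 0.6   # normalized lateral displacement needed to count as a "push"

def handle_shear(dx, dy, pressure):
    """dx, dy: lateral displacement of the display surface (-1..1); pressure: 0..1."""
    if abs(dx) >= SHEAR_THRESHOLD and abs(dx) > abs(dy):
        return "next_track" if dx > 0 else "previous_track"
    if abs(dy) >= SHEAR_THRESHOLD:
        # a harder push means slower, finer scrolling, as in the slow-scroll example above
        speed = max(0.1, 1.0 - pressure)
        return f"scroll_{'down' if dy > 0 else 'up'}@{speed:.1f}x"
    return "plain_touch"

print(handle_shear(0.8, 0.1, 0.3))   # -> next_track
print(handle_shear(0.0, 0.7, 0.9))   # -> slow scroll down
```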

  • Google Sky Map boldly explores open source galaxy

    by Andrew Munchbach
    01.21.2012

    Via its Research Blog, Google has announced the donation of the Sky Map project to the open source community. Originally developed by Googlers during their "20% time," the stellar application was launched in 2009 to showcase the sensors in first-generation Android handsets. Nearly three years and over 20 million downloads later, Sky Map's code will be donated to the people -- with Carnegie Mellon University taking the reins on further development through "a series of student projects." Hit the source link for the official announcement and a bit of nostalgia from Google.

  • Carnegie Mellon robot jumps up, jumps up and glides down (video)

    by Joseph Volpe
    09.10.2011

    We can handle the imaginary terror of UFOs and nightmarish flying mammals. But robots that can jump like a human and then glide like a colugo? Now you're just filling Mr. Spielberg with even more sci-fi, end-of-days fodder. Carnegie Mellon researchers Matthew Woodward and Metin Sitti have crafted a prototype jumping and gliding bot at the university's NanoRobotics Lab that springs into action using a pair of human knee-like joints. The automated hijinks don't end there either, as the duo's invention then spreads its legs to catch some air and glide on back to terra firma. The project isn't just some bit of engineering whimsy; the team plans to adapt this tech for use in "unstructured terrain" -- i.e. non-level, wargadget territory. For now, this lord of the leaping gliders can reach comfortable human-sized heights of up to six feet. Give it some time, however, and we're sure this lil' android'll give Superman a bound for his money. Click on past the break for a real-world demo.

  • Intel places $30 million bet on the cloud, opens two new labs at Carnegie Mellon

    by Joseph Volpe
    08.04.2011

    Have you nerds heard? The cloud is the word, and Intel's ready to put its bank account where the industry's buzzing mouth is. Investing $30 million over a span of five years, the company has partnered with Carnegie Mellon University to open two new Intel Science and Technology Centers. The academic research labs will zero in on cloud and embedded computing research, providing open source innovations that tackle mass data analytics, real-time information service distribution and refinements to a future cloud-connected lifestyle. Curious as to what this brain collective has up its sleeves? Imagine wearing a pair of Intel-powered glasses that overlays data linked to the people and objects you see. Not the Minority Report type? Alright, then consider its proposed intelligent car of the future, capable of recommending "routing, retail, dining, and entertainment" options tailored to passenger profiles and real-world conditions. Whether you're ready or not, this is the future, folks -- one big, passive scoop of computer-generated coddling. Hit the break for the full PR, and Peter Griffin's take on our sponsored tomorrow. [Image credit via Popular Science]

  • Carnegie Mellon researchers use photo-tagging to violate privacy, prove nothing social is sacred

    by Joseph Volpe
    08.01.2011

    Some people never forget a face and the same, it seems, can be said for the internet. With some off-the-shelf facial recognition software, a connection to the cloud and access to social networking data, Carnegie Mellon University researchers have proved that tagging can be the everyman's gateway to privacy violation. Using a specially designed, AR-capable mobile app, Prof. Alessandro Acquisti and his team conducted three real-world trials of the personal info mining tech, successfully identifying pseudonymous online daters and campus-strolling college students via Facebook. In some cases, the application was even able to dredge up the students' social security digits and personal interests -- from their MySpace pages, we assume. Sure, the study's findings could have you running for the off-the-grid hills (not to mention the plastic surgeon), but it's probably best you just pay careful attention to that digital second life. Full PR after the break.
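
    The matching step at the heart of such a system can be illustrated with a toy example: compare a face embedding from a street photo against embeddings harvested from publicly tagged photos and pick the closest identity. The random vectors below are stand-ins; a real pipeline would get its embeddings from off-the-shelf facial recognition software, and none of this is the CMU team's code:

```python
import numpy as np

# Toy illustration of the matching step only (embeddings are random placeholders).
rng = np.random.default_rng(0)
gallery = {name: rng.normal(size=128) for name in ["alice", "bob", "carol"]}
probe = gallery["bob"] + rng.normal(scale=0.05, size=128)   # noisy re-capture of "bob"

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

best = max(gallery, key=lambda name: cosine(probe, gallery[name]))
print(best)   # identities above some similarity threshold get linked to the street photo
```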

  • Google acquires PittPatt, wants to know you on a face-to-face basis

    by Joseph Volpe
    07.23.2011

    Google's quietly pitter-pattering its acquisitive ways back into the controversial realm of facial recognition technology. To do that, the company busted out its oversized wallet to fold Pittsburgh-based PittPatt into the Mountain View borg. Founded by a trio of PhDs from Carnegie Mellon University, this three-man outfit specializes in the sort of object recognition software you've come to know as "tagging." Is this a reversal of the Don't Be Evil tech giant's prior waffling on the dubious visioning tech, or just another massive weapon in its social networking crusade against Facebook? We'd err on the side of both, although the company's new employees aren't exactly playing their cards for us to see. A brief statement on the triumvirate's site makes vague mention of "computer vision technology" being core to Google's products and points to the tech's planned integration in photo, video and mobile applications. So, basically, expect to see Picasa, Goggles, YouTube and Google+ watch you as you flaunt your internet celebrity ways to that front-facing camera.

  • Carnegie Mellon researchers develop world's smallest biological fuel cell

    by Donald Melanson
    06.21.2011

    Cars and other vehicles may be the first thing that springs to mind at the mention of fuel cells, but the technology can of course also be used for plenty of other devices big and small, and a team of researchers at Carnegie Mellon University is now looking to take them to a few new places that haven't been possible so far. To that end, they've developed what they claim is the world's smallest biological fuel cell, which is the size of a single human hair and "generates energy from the metabolism of bacteria on thin gold plates in micro-manufactured channels." That, they say, could make it ideal for use in places like deep ocean environments where batteries are impractical -- or possibly in electronic devices with some further refinements, where they could potentially store more energy than traditional batteries in the same space. The university's full press release is after the break.

  • Vibratron plays impossible music with ball bearings, is your new master (video)

    by Jesse Hicks
    04.26.2011

    First they came for Jeopardy!, then they came for our vibraphones. We still own baseball, but the "humans only" list has grown one shorter now that the Carnegie Mellon Robotics Club has birthed Vibratron, a robotic vibraphone. Vibratron's Arduino Mega controls 30 solenoid gates that drop steel balls onto the vibraphone's keys, producing notes; an Archimedes screw recycles the bearings, turning them once more into sweet, sweet music. We should also note that Vibratron doesn't put decent, salt-of-the-earth vibraphonists out of work. That cacophony in the video is "Circus Galop," written for two player pianos and impossible for humans to perform -- and still pretty hard for humans to listen to. See, Vibratron is here to help you, fellow humans. At least for now. Click the video above to get acquainted.
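
    The interesting engineering wrinkle is timing: each gate has to release its bearing early enough for the ball to fall onto the key exactly on the beat. Here's a toy sketch of that scheduling step, with the drop height and note timings invented for illustration rather than taken from the Robotics Club's build:

```python
import math

# Toy ball-drop scheduler (numbers are assumptions, not Vibratron's specs):
# subtract the free-fall time from each note onset to get the gate release time.

DROP_HEIGHT_M = 0.30          # gate-to-key drop height (assumed)
G = 9.81
FALL_TIME_S = math.sqrt(2 * DROP_HEIGHT_M / G)

def gate_schedule(notes):
    """notes: list of (onset_seconds, key_index) -> list of (release_seconds, key_index)."""
    return [(onset - FALL_TIME_S, key) for onset, key in notes]

score = [(1.00, 12), (1.05, 14), (1.10, 16)]   # a rapid passage no human could sustain
for release, key in gate_schedule(score):
    print(f"open gate {key} at t={release:.3f}s")
```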

  • NC State and CMU develop velocity-sensing shoe radar, aim to improve indoor GPS routing

    by Darren Murph
    12.01.2010

    The world at large owes a good bit to Maxwell Smart, you know. Granted, it's hard to directly link the faux shoe phone to the GPS-equipped kicks that are around today, but the lineage is certainly apparent. The only issue with GPS in your feet is how it reacts when you waltz indoors, which is to say, not at all. In the past, most routing apparatuses have used inertial measurement units (IMUs) to track motion, movement and distance once GPS reception is lost indoors, but those have proven poor at spotting the difference between a slow gait and an outright halt. Enter NC State and Carnegie Mellon University, which have worked in tandem to develop a prototype shoe radar specifically designed to sense velocity. Within the shoe, a radar is attached to a diminutive navigational computer that "tracks the distance between your heel and the ground; if that distance doesn't change within a given period of time, the navigation computer knows that your foot is stationary." Hard to say when Nike will start testing these out in the cleats worn by football players, but after last week's abomination of a spot (and subsequent botching of a review by one Ron Cherry) during the NC State-Maryland matchup, we're hoping it's sooner rather than later.
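
    In rough terms, the stance-detection rule is simple: if the heel-to-ground range barely changes over a short window, declare the foot stationary and let the navigation filter zero out its velocity drift. The window size and tolerance below are our assumptions, not the researchers' numbers:

```python
# Toy version of the stationary-foot check described above (window/tolerance assumed).

WINDOW = 5           # number of consecutive radar samples to inspect
TOLERANCE_M = 0.002  # how much range change still counts as "not moving"

def is_stationary(ranges):
    """ranges: most recent heel-to-ground distances in meters, newest last."""
    recent = ranges[-WINDOW:]
    return len(recent) == WINDOW and max(recent) - min(recent) <= TOLERANCE_M

samples = [0.051, 0.050, 0.051, 0.050, 0.051, 0.050]
print(is_stationary(samples))   # True -> apply a zero-velocity update to the IMU estimate
```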

  • NELL machine learning system could easily beat you at Trivial Pursuit

    by Joseph L. Flatley
    10.12.2010

    If you had told us fifteen years ago that some day, deep in the bowels of Carnegie Mellon University, a supercomputer cluster would scan hundreds of millions of Web pages, examine text patterns, and teach itself about the Ramones, we might have believed you -- we were into some far-out stuff back then. But this project is about more than the make of Johnny's guitar (Mosrite) or the name of the original drummer (Tommy). NELL, or the Never-Ending Language Learning system, constantly surfs the Web and classifies everything it scans into specific categories (such as cities, universities, and musicians) and relations. One example The New York Times cites: Peyton Manning is a football player (category). The Indianapolis Colts is a football team (category). By scanning text patterns, NELL can infer with high probability that Peyton Manning plays for the Indianapolis Colts -- even if it has never read that Mr. Manning plays for the Colts. But sports and music factoids aside, the system is not without its flaws. For instance, when Internet cookies were categorized as baked goods, "[i]t started this whole avalanche of mistakes," according to researcher Tom M. Mitchell. Apparently, NELL soon "learned" that one could delete pastries (the mere thought of which is sure to give us night terrors for quite some time). Luckily, human operators stepped in and corrected the thing, and now it's back on course, accumulating data and giving researchers insights that might someday lead to a true semantic web.
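
    Here's a toy, single-pattern version of the kind of inference NELL runs at web scale -- the regex, sentences and seed sets are invented for illustration and are nothing like the real system's learned patterns:

```python
import re

# Toy pattern-based relation extraction: known category members plus a recurring text
# pattern let the system propose a relation instance it never read stated explicitly.

athletes = {"Peyton Manning"}
teams = {"Indianapolis Colts"}
pattern = re.compile(
    r"(?P<athlete>[A-Z][\w. ]+?) (?:plays for|is the quarterback of) the (?P<team>[A-Z][\w ]+)"
)

corpus = [
    "Peyton Manning is the quarterback of the Indianapolis Colts.",
    "Cookies are best deleted before they go stale.",   # the kind of sentence that misled NELL
]

facts = set()
for sentence in corpus:
    for m in pattern.finditer(sentence):
        if m["athlete"] in athletes and m["team"] in teams:
            facts.add((m["athlete"], "plays_for", m["team"]))

print(facts)   # {('Peyton Manning', 'plays_for', 'Indianapolis Colts')}
```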

  • GM shows off sensor-laden windshield, new heads-up display prototype

    by Darren Murph
    03.18.2010

    Heads-up displays are undoubtedly novel, and downright useful in the right circumstances. Trouble is, few of these prototypes ever make it beyond the lab, and we're stuck using these same two eyeballs to experience the world around us. General Motors is evidently tired of the almosts, and it's now working in concert with Carnegie Mellon University and the University of Southern California in order to concoct one of the most advanced HUD systems that we've seen -- particularly in the automotive world. Setting out to create "enhanced vision systems," GM's R&D team has created a windshield packed with visible and infrared cameras along with internal optics that keep a close eye on the driver's retinas. In the images and video below (hit the 'Read More' link for the real action), you'll see a solution that utilizes lasers in order to highlight road edges, speed limit signs and all sorts of other vital bits of data during a fog-filled commute. Best of all? We're told that some of these technologies "could end up in GM vehicles in the near-term future." Granted, the Volt was supposed to set sail already, but we suppose we'll give 'em the benefit of the doubt.

  • Skinput: because touchscreens never felt right anyway (video)

    by Vlad Savov
    03.02.2010

    Microsoft looks to be on a bit of a hot streak with innovations lately, and though this here project hasn't received much hype (yet), we'd say it's one of the most ingenious user interface concepts we've come across. Skinput is based on an armband straddling the wearer's biceps and detecting the small vibrations generated when the user taps the skin of their arm. Due to differences in bone density, tissue mass and muscle size, unique acoustic signatures can be identified for particular parts of the arm or hand (including fingers), allowing people to literally control their gear by touching themselves. The added pico projector is there just for convenience, and we can totally see ourselves using this by simply memorizing the five input points (the current maximum, at 95.5 percent accuracy), particularly since the band works even if you're running. Make your way past the break to see Tetris played in a whole new way.
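
    The classification step can be sketched as a nearest-signature lookup: each of the five input points gets a trained acoustic signature, and a new tap is assigned to whichever signature it most resembles. The feature vectors below are random placeholders, not real bio-acoustic data or Skinput's actual classifier:

```python
import numpy as np

# Toy nearest-signature classifier standing in for Skinput's trained model.
rng = np.random.default_rng(1)
locations = ["thumb", "index", "middle", "ring", "pinky"]
centroids = {name: rng.normal(loc=i, scale=0.1, size=8) for i, name in enumerate(locations)}

def classify(signature):
    """Return the input point whose trained signature is closest to this tap."""
    return min(centroids, key=lambda name: np.linalg.norm(signature - centroids[name]))

tap = centroids["ring"] + rng.normal(scale=0.05, size=8)   # a new, slightly noisy tap
print(classify(tap))   # -> 'ring'
```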

  • Surfacescapes puts Dungeons & Dragons on Surface, makes your d20 obsolete (video)

    by Tim Stevens
    10.20.2009

    We've seen some fancy applications for Microsoft's Surface, the touchable, strokable, caressable computing device/big-ass table, but not a single one has made us twitter in nerdy glee like Surfacescapes. Created by a team at Carnegie Mellon University, it's an implementation of Dungeons & Dragons in 3D, something that has of course been done dozens and dozens of times before, but this is different. Way different. It brilliantly brings the tabletop style of play to Surface, with players moving real figurines over virtual battlefields, rolling virtual d20s and d6s to deal real damage to digital dire wolves and the like -- opponents that can move and attack automatically. Sure, it takes some of the imagination out of the experience, but it'll also make re-rolling your character a heck of a lot easier -- not to mention eliminating the dungeonmaster's folder of magic, mystery, and crudely drawn maps.

  • Carnegie Mellon's robotic snake stars in a glamour video

    by Nilay Patel
    07.12.2009

    We've been pretty into Carnegie Mellon's modular snake robots for a while now, and seeing as it's a relatively sleepy Sunday we thought we'd share this latest video of snakebots just basically crawling all over the place and getting crazy. Bots like these have been getting some serious military attention lately, so watching these guys wriggle into any damn spot they please is at once awesome and terrifying. Or maybe it's just the music. Video after the break. [Thanks, Curtis]

  • Robot Hall of Fame expands to include Da Vinci, Terminator, Roomba

    by Darren Murph
    05.11.2009

    Forget those "sporting" Halls of Fame -- the real HOF is right here. Since 2003, the Robot Hall of Fame has been honoring robots and creators at an exhibit in Pittsburgh, Pennsylvania, and now we're seeing the latest handful of noteworthy creatures take their rightful place in history. For those unaware, the Robot HOF is maintained by Carnegie Mellon University and the Carnegie Science Center, and an international jury of researchers, writers, and designers has just selected five new bots to join the cast: Mars rovers Spirit and Opportunity, the T-800 Terminator (yes, that Terminator), the Da Vinci surgical system, iRobot's Roomba and 'Huey, Dewey, and Louie' from the 1972 sci-fi flick Silent Running. Could you have imagined a more fitting five? If so, sound off below!

  • CMU researchers control microbots with mini magnets

    by Darren Murph
    05.07.2009

    Pardon the alliteration, but we're excited about the proposition here. For years -- millenniums, even -- scientists have been trying to figure out how to manipulate minuscule devices with magnets, and at long last, we've got a breakthrough in the field. Metin Sitti, an associate professor of mechanical engineering at Carnegie Mellon University, is credited with creating a new control technique that could allow microscopic machines to "one day deliver drugs directly to a sickly cell or a tumor." Essentially, the diminutive bots glide across a glass surface covered with a grid of metal electrodes, and you're just a click away (it's the Read link, just so you know) from seeing a live demonstration on how they can be used to "anchor one or more microbots while allowing others to continue to move freely around the surface." Good times.
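
    A toy model of that selective-control trick: a global magnetic step nudges every bot, but any bot sitting on an energized electrode stays anchored, so only the un-anchored ones actually move. Coordinates, names and the step size are invented for illustration, not taken from Sitti's setup:

```python
# Toy model of anchoring some microbots while a global field moves the rest.

bots = {"A": (0, 0), "B": (2, 1), "C": (4, 3)}
anchored = {"B"}                       # electrodes under bot B are switched on

def apply_field(bots, anchored, step=(1, 0)):
    moved = {}
    for name, (x, y) in bots.items():
        if name in anchored:
            moved[name] = (x, y)               # held in place by electrostatic anchoring
        else:
            moved[name] = (x + step[0], y + step[1])
    return moved

print(apply_field(bots, anchored))   # A and C advance, B stays put
```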

  • Pyuuun palm-sized robot keeps tabs on you, delivers beverages

    by Joseph L. Flatley
    03.02.2009

    If Hans Moravec of the Robotics Institute at Carnegie Mellon University is right, we only have a good twenty to thirty years left before robots evolve into a new type of artificial species. As we wait for the inevitable robot apocalypse, we've already begun to see lots of little robotic guys pop into our lives, whether they're sweeping the floor, giving us something to hug, or bringing us a cup of tea. In addition to its miniature waitstaff ability, Pyuuun, Robo-Engine's "LifeLog Robot," is equipped with eight sensors (including brightness, movement, collision, sound, distance, temperature, slope and infrared) and can be programmed to monitor an area, collecting various data (such as keeping an eye on a temperature-sensitive workspace) and reporting back to you (or your robot overlords) via WiFi. With a 12-volt battery that promises six hours of use on a single charge, the utility of this bad boy is only limited by your imagination -- and its ¥300,000 (about $3,090) price tag. Video after the break.
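
    As a rough sketch of that monitor-and-report behavior, here's the sort of rule a LifeLog-style robot might run each polling cycle; the thresholds and field names are our own assumptions, not Robo-Engine's firmware:

```python
# Toy monitoring loop: sample the sensors, keep a rolling log, and flag anything the
# owner should hear about over WiFi (thresholds and field names are assumptions).

TEMP_LIMIT_C = 26.0   # assumed ceiling for a temperature-sensitive workspace

def check(reading):
    """reading: dict of sensor values from one polling cycle."""
    alerts = []
    if reading.get("temperature_c", 0) > TEMP_LIMIT_C:
        alerts.append(f"temperature high: {reading['temperature_c']} C")
    if reading.get("movement"):
        alerts.append("movement detected")
    return alerts

log = []
for reading in [{"temperature_c": 24.8, "movement": False},
                {"temperature_c": 27.3, "movement": True}]:
    log.append(reading)                      # the robot's rolling "life log"
    for alert in check(reading):
        print("send over WiFi:", alert)      # report back to the owner (or overlords)
```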

  • Caterpillar and CMU team up to create world's largest robotic monster truck

    by Laura June Dziuban
    11.07.2008

    We're always hearing about some fantastical, nigh-mythical creation that Carnegie Mellon University is in the midst of cobbling together from spare parts, crazy ideas, and pure, simple genius, so maybe we shouldn't be frothing over the new robotic truck they've partnered up with Caterpillar to create, but this one promises to be the "world's largest." Adapting software CMU used in the DARPA Urban Challenge, the team hopes to end up with fully automated, 700-ton mining trucks capable of moving at up to 42 miles per hour. The trucks would theoretically reduce costs, increase productivity, and save lives. The Frankenstein-ed vehicles will boast GPS, laser range finders to identify large obstacles, video equipment, and a "robotic driver." The scientists somewhat predictably foresee some (as of now) rather far-fetched consumer applications in cars and trucks over the "next five to ten years," but we're taking that with a few salt grains for now. The trucks aren't ready quite yet, but we hear their arrival is imminent, and we can only imagine that somewhere in the world, Grave Digger is crying to himself. Update: We've changed the title to reflect the accurate arrangement, which is a teaming up of CMU and Caterpillar, not DARPA. Thanks to the commenter who pointed that out.