Carnegie Mellon

Latest

  • A Tour of Astrobotic Technology's lunar rover lab at Carnegie Mellon (video)

    by 
    Brian Heater
    10.22.2012

Things are buzzing late Monday afternoon at Carnegie Mellon's Planetary Robotics Lab Highbay. Outside, in front of the garage door-like entrance, a trio of men fills up a kiddie pool with a garden hose. Just to their left, an Enterprise rent-a-truck backs up and a handful of students raise two metal ramps up to its rear in order to drive a flashy rover up inside. I ask our guide, Jason Calaiaro, what the vehicle's final destination is. "NASA," he answers, simply. "We have a great relationship with NASA, and they help us test things." Calaiaro is the CIO of Astrobotic Technology, an offshoot of the school that was founded a few years back, thanks to Google's Lunar X Prize announcement. And while none of the handful of vehicles the former student showcases were made specifically with the government space agency in mind, given the company's history of contractual work, we could well see them receive the NASA stamp of approval in the future. Asked to take us through the project, Calaiaro tells us, quite confidently, that the trio of vehicles behind us are set to "land on the moon in 2015" -- an ambitious goal, and one that's still roughly three years out.

  • Robot Hall of Fame voting begins for class of 2012, Johnny 5 learns where BigDogs sit

    by 
    Terrence O'Brien
    08.21.2012

It's that time again: time for Carnegie Mellon to roll out the red carpet and welcome the crème de la crème of the robotics world into its halls. Since 2003 the school has been selecting the best of the best and inducting them into the Robot Hall of Fame. Past honorees have included everything from LEGO Mindstorms to the Terminator. This year's list of nominees is no less impressive, with celebrity bots Johnny 5 and WALL-E pitted against each other in the entertainment category, while NASA's Robonaut takes on the PR2 and BigDog under the banner of research bots. There will also be two other inductees awarded a spot in the hall, one in the consumer and education category and one in the industrial and service category. Best of all, for the first time ever, Carnegie Mellon is letting the public vote on the inductees. And, while PETMAN was snubbed yet again, he's not letting that get him down -- the Boston Dynamics biped just keeps on struttin'. Hit up the source link to cast your vote before the September 30th deadline and check back on October 23rd to see who's granted a podium speech.

  • Google fetes 40 years of Landsat with new timelapse videos of Earth

    by 
    Steve Dent
    07.24.2012

Compared to Landsat, which has been beaming photos of our planet since 1972, Mountain View is a cartographic newb. But Google Earth drove geospatial interest into the stratosphere when it launched in 2005 and, with a billion downloads and counting, the company is well placed to celebrate 40 years of Landsat imagery. To do that, it has collaborated with the US Geological Survey and Carnegie Mellon to create a collection of timelapse videos ranging from seasonal snowcover changes across North America to Amazon deforestation. Though the search giant is gradually shifting from relatively low-res 100-feet-per-pixel Landsat imagery to 8-feet-per-pixel SPOT Image maps, its Google Earth Engine was used to process the vast archive and make it available to the public. To watch a video of the history of the grand dame of satellite imagery and its liaison with Google, head after the break -- or check the source for all the timelapse goodness.

  • Carnegie Mellon smart headlight prototype blacks out raindrops for clearer view of the road

    by 
    Steve Dent
    07.04.2012

Researchers from Carnegie Mellon have developed a prototype smart headlight that blots out individual drops of rain or snow -- improving vision by up to 90 percent. Built from an off-the-shelf ViewSonic DLP projector, a quad-core Intel Core i7 PC and a GigE Point Grey Flea3 camera, the Rube Goldberg-esque rig starts by imaging raindrops as they arrive at the top of its view. A processing unit then applies a predictive model developed by the team to estimate each drop's path toward the road. Finally, the projector -- aligned with the camera via a beamsplitter, much like modern digital 3D rigs -- transmits a beam with voids matching those predicted paths. The result: light never hits the falling particles, producing the illusion of a nearly precipitation-free road view -- at least in the lab. So far, the whole process takes about 13 milliseconds, but the researchers say that in an actual car, with many more drops to track, it would have to be about ten times quicker. That would allow 90 percent of the light located 13 feet in front of the headlights to pass through; even at just triple the speed, drivers would get a 70 percent better view. To see if this tech has a snowflake's chance of making it out of the lab, go past the break for all the videos.
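    The predict-then-blank loop described above can be sketched in a few lines. This is a toy illustration under a constant-velocity assumption -- the function names, grid size and velocities are invented for the example, not taken from the CMU system.

```python
# Toy sketch of the smart headlight's core idea: extrapolate each detected
# drop's path down the frame, then blank the projector pixels along it.

def predicted_path(x, y, vx, vy, steps, width, height):
    """Extrapolate a drop's (x, y) positions over the next few frames,
    assuming constant velocity."""
    path = []
    for _ in range(steps):
        x, y = x + vx, y + vy
        if 0 <= x < width and 0 <= y < height:
            path.append((round(x), round(y)))
    return path

def light_mask(drops, width=8, height=8, steps=3):
    """1 = pixel lit, 0 = pixel blanked because a drop will pass through it."""
    mask = [[1] * width for _ in range(height)]
    for (x, y, vx, vy) in drops:
        for (px, py) in predicted_path(x, y, vx, vy, steps, width, height):
            mask[py][px] = 0
    return mask

# One drop entering at column 2, falling straight down one row per frame:
# the cells it will occupy go dark while the rest of the beam stays lit.
mask = light_mask([(2, 0, 0, 1)])
print([row[2] for row in mask])
```

    The real system's challenge, as the article notes, is doing this detect-predict-blank cycle fast enough that the voids line up with drops that are still falling.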

  • Carnegie Mellon researchers develop robot that takes inventory, helps you find aisle four

    by 
    Alexis Santos
    06.30.2012

    Fed up with wandering through supermarket aisles in an effort to cross that last item off your shopping list? Researchers at Carnegie Mellon University's Intel Science and Technology Center in Embedded Computing have developed a robot that could ease your pain and help store owners keep items in stock. Dubbed AndyVision, the bot is equipped with a Kinect sensor, image processing and machine learning algorithms, 2D and 3D images of products and a floor plan of the shop in question. As the mechanized worker roams around, it determines if items are low or out of stock and if they've been incorrectly shelved. Employees then receive the data on iPads and a public display updates an interactive map with product information for shoppers to peruse. The automaton is currently meandering through CMU's campus store, but it's expected to wheel out to a few local retailers for testing sometime next year. Head past the break to catch a video of the automated inventory clerk at work.
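    The stock check described above boils down to comparing what the robot sees on each shelf with the store's floor plan. Here's a minimal sketch of that comparison; the shelf names, product data and report strings are all invented for illustration.

```python
# Toy sketch of an inventory check: flag shelves that are empty or
# that hold items the floor plan doesn't expect there.

floor_plan = {"aisle4-shelf2": "cereal", "aisle4-shelf3": "oatmeal"}
observed = {"aisle4-shelf2": [], "aisle4-shelf3": ["cereal", "oatmeal"]}

def shelf_report(floor_plan, observed):
    """Return a per-shelf status: out of stock, misplaced items, or ok."""
    report = {}
    for shelf, expected in floor_plan.items():
        seen = observed.get(shelf, [])
        if not seen:
            report[shelf] = f"{expected}: out of stock"
        elif any(item != expected for item in seen):
            report[shelf] = "misplaced items present"
        else:
            report[shelf] = "ok"
    return report

print(shelf_report(floor_plan, observed))
```

    In the real system the "observed" side would come from the Kinect imagery and recognition models; the report is what would land on the employees' iPads.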

  • New shear touch technology lets you skip a double-tap, push your device around (video)

    by 
    Jon Fingas
    05.11.2012

Most every touchscreen on the market today registers your finger input only as coordinates; that's fine for most uses, but it leads to a lot of double-taps and occasionally convoluted gestures. A pair of researchers at Carnegie Mellon University, Chris Harrison and Scott Hudson, have suggested that shear touch might be a smarter solution. Instead of gliding over fixed glass, your finger could handle secondary tasks by pushing in a specific direction, or simply pushing harder, on a sliding display. Among the many examples of what shear touch could do, the research duo has raised the possibility of skipping through music by pushing left and right, or scrolling more slowly through your favorite website with a forceful dragging motion. The academic paper is still far from a shipping device, although a Microsoft doctoral fellowship's partial funding of the study indicates one direction the technology might go. You can take a peek at the future in a video after the jump -- just don't expect a tablet-based Van Gogh this soon. [Thanks, Chris]
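    The music-skipping and slow-scrolling examples above amount to mapping a 2D shear displacement to an action. A minimal sketch, with the threshold and action names assumed for illustration (they're not from the paper):

```python
# Toy mapping from a shear displacement (dx, dy), in arbitrary units,
# to a secondary action; small displacements fall back to a plain touch.

def shear_action(dx, dy, threshold=0.3):
    """Map a lateral shear of the display surface to an action."""
    if abs(dx) < threshold and abs(dy) < threshold:
        return "plain_touch"          # not enough shear: ordinary tap
    if abs(dx) >= abs(dy):
        return "next_track" if dx > 0 else "previous_track"
    return "slow_scroll_down" if dy > 0 else "slow_scroll_up"

print(shear_action(0.8, 0.1))   # strong push right: skip forward
print(shear_action(0.1, 0.05))  # barely any shear: plain touch
```

    The point of the technique is exactly this extra channel: position says *where*, shear says *what else*.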

  • Arduino-powered glove brings real sound effects to your make believe gun show (video)

    by 
    Daniel Cooper
    10.24.2011

The days of air-punching invisible Daleks and making your own sound effects are over: a team from Carnegie Mellon's Human-Computer Interaction course has built a glove that does it all for you. The Augmented Hyper-Reality Glove can identify uppercuts and karate chops using flex and tilt sensors, then play the accompanying sound effect through an Arduino-powered Adafruit wave shield. We can see some potential downsides -- flirtatious finger-gun fusillades accompanied by the sound of cannon fire might just ruin your date. If you're undaunted by such social faux pas, see the toy your inner child always wanted in action after the break.

  • Carnegie Mellon robot jumps up, jumps up and glides down (video)

    by 
    Joseph Volpe
    09.10.2011

    We can handle the imaginary terror of UFOs and nightmarish, flying mammals. But, robots that can jump like a human and then glide like a colugo? Now you're just filling Mr. Spielberg with even more sci-fi, end of days fodder. Carnegie Mellon researchers Matthew Woodward and Metin Sitti have crafted a prototype jumping and gliding bot at the university's NanoRobotics Lab that springs into action using a pair of human knee-like joints. The automated hi-jinks don't end there either, as the duo's invention then spreads its legs to catch some air and glide on back to terra firma. The project isn't just some bit of engineering whimsy; the team plans to adapt this tech for use in "unstructured terrain" -- i.e. non-level, wargadget territory. For now, this lord of the leaping gliders can reach comfortable human-sized heights of up to six feet. Give it some time, however, and we're sure this lil' android'll give Superman a bound for his money. Click on past the break for a real world demo.

  • Intel places $30 million bet on the cloud, opens two new labs at Carnegie Mellon

    by 
    Joseph Volpe
    08.04.2011

Have you nerds heard? The cloud is the word, and Intel's ready to put its bank account where the industry's buzzing mouth is. Investing $30 million over a span of five years, the company has partnered with Carnegie Mellon University to open two new Intel Science and Technology Centers. The academic research labs will laser in on cloud and embedded computing research, providing open source innovations that tackle mass data analytics, real-time information service distribution and refinements to a future, cloud-connected lifestyle. Curious as to what this brain collective has up its sleeves? Imagine wearing a pair of Intel-powered glasses that overlays data linked to the people and objects you see. Not the Minority Report type? Alright, then consider its proposed intelligent car of the future, capable of recommending "routing, retail, dining, and entertainment" options tailored to passenger profiles and real-world conditions. Whether you're ready or not, this is the future, folks -- one big, passive scoop of computer-generated coddling. Hit the break for the full PR, and Peter Griffin's take on our sponsored tomorrow. [Image credit: Popular Science]

  • Carnegie Mellon researchers develop world's smallest biological fuel cell

    by 
    Donald Melanson
    06.21.2011

Cars and other vehicles may be the first thing that springs to mind at the mention of fuel cells, but the technology can of course be used in plenty of other devices big and small. A team of researchers at Carnegie Mellon University is now looking to take it to a few new places that haven't been possible so far: they've developed what they claim is the world's smallest biological fuel cell, which is the size of a single human hair and "generates energy from the metabolism of bacteria on thin gold plates in micro-manufactured channels." That, they say, could make it ideal for use in places like deep ocean environments where batteries are impractical -- or, with some further refinement, in electronic devices, where the cells could potentially store more energy than traditional batteries in the same space. The university's full press release is after the break.

  • Carnegie Mellon's GigaPan Time Machine brings time-lapse to panoramas

    by 
    Donald Melanson
    04.22.2011

We've already seen GigaPan technology used for plenty of impressive panoramas, but some researchers from Carnegie Mellon University have now gone one step further with their so-called "GigaPan Time Machine" project. Thanks to the magic of HTML5 and some time-consuming (but automated) photography, you can now "simultaneously explore space and time" right in your web browser -- that is, zoom in and around a large-format panorama that also happens to be a time-lapse video. If you don't feel like exploring yourself, you can also jump straight to some highlights -- like the construction of the Hulk statue at the CMU Carnival pictured above. Hit up the source link below to try it out -- just make sure you're in either Chrome or Safari, as they're the only compatible browsers at this time.

  • NC State and CMU develop velocity-sensing shoe radar, aim to improve indoor GPS routing

    by 
    Darren Murph
    12.01.2010

    The world at large owes a good bit to Maxwell Smart, you know. Granted, it's hard to directly link the faux shoe phone to the GPS-equipped kicks that are around today, but the lineage is certainly apparent. The only issue with GPS in your feet is how they react when you waltz indoors, which is to say, not at all. In the past, most routing apparatuses have used inertial measurement units (IMUs) to track motion, movement and distance once GPS reception is lost indoors, but those have proven poor at spotting the difference between a slow gait and an outright halt. Enter NC State and Carnegie Mellon University, who have worked in tandem in order to develop a prototype shoe radar that's specifically designed to sense velocity. Within the shoe, a radar is attached to a diminutive navigational computer that "tracks the distance between your heel and the ground; if that distance doesn't change within a given period of time, the navigation computer knows that your foot is stationary." Hard to say when Nike will start testing these out in the cleats worn by football players, but after last week's abomination of a spot (and subsequent botching of a review by one Ron Cherry) during the NC State - Maryland matchup, we're hoping it's sooner rather than later.
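    The quoted stationarity test above -- "if that distance doesn't change within a given period of time, the navigation computer knows that your foot is stationary" -- is simple enough to sketch directly. The function name and tolerance below are illustrative assumptions, not the researchers' code.

```python
# Toy version of the shoe radar's zero-velocity check: the foot counts as
# planted if heel-to-ground readings barely vary across a sample window.

def foot_is_stationary(heel_distances_cm, tolerance_cm=0.5):
    """Return True if the heel-to-ground distance stays within
    tolerance_cm over the sampled window, i.e. the foot is planted."""
    if not heel_distances_cm:
        return False
    return max(heel_distances_cm) - min(heel_distances_cm) <= tolerance_cm

# A planted foot: readings hover around 1 cm above the ground.
print(foot_is_stationary([1.0, 1.1, 0.9, 1.0]))
# Mid-stride: the heel lifts and swings away.
print(foot_is_stationary([1.0, 4.2, 9.8, 15.0]))
```

    This is the distinction an inertial unit struggles with: a slow gait and a full stop look similar in acceleration, but very different in heel-to-ground distance.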

  • Humans wearing radios could form massive wireless networks of the future

    by 
    Laura June Dziuban
    11.03.2010

Researchers at Queen's University Belfast, in Northern Ireland, are studying how to create an infrastructure out of human beings interconnected by wearing sensors, gateways and radios, resulting in a "body-to-body" network. Because human beings are so easy to come by, the networks could potentially be massive as well as high in bandwidth. The team is now studying how human bodies and movement can affect radio signals, and the general operations of body area networks, which aren't new. Concurrent research is being done at Carnegie Mellon to study how thousands of sensors can communicate with each other effectively. Long term, actual functioning body-to-body wireless networks could render cellular base stations unnecessary in heavily populated areas. Of course, that's all well into the future, but hit up the source for more details.

  • CMU's first ever robot census: 547 (and counting)

    by 
    Joseph L. Flatley
    10.18.2010

    Sure, we've seen an incredible amount of cool tech from Carnegie Mellon (usually on our way to a kegger on Beeler St.), but you might wonder exactly how many robots they have on campus. Well, maybe you don't -- but a first year doctoral student named Heather Knight does. A recent transplant from MIT, she's counted 547 robots so far -- but since these guys are all over campus, from the Robotics Institute to the theater and art departments, getting an accurate head count might take a while. But the project most likely won't stop there: upon completion of the university-wide project, Knight would like to see a nationwide census take place. We only hope this happens before it's too late. Update: The CMU Robot Census form is available here.

  • NELL machine learning system could easily beat you at Trivial Pursuit

    by 
    Joseph L. Flatley
    10.12.2010

If you had told us fifteen years ago that some day, deep in the bowels of Carnegie Mellon University, a supercomputer cluster would scan hundreds of millions of web pages, examine text patterns, and teach itself about the Ramones, we might have believed you -- we were into some far-out stuff back then. But this project is about more than the make of Johnny's guitar (Mosrite) or the name of the original drummer (Tommy). NELL, or the Never-Ending Language Learning system, constantly surfs the web and classifies everything it scans into specific categories (such as cities, universities, and musicians) and relations. One example The New York Times cites: Peyton Manning is a football player (category). The Indianapolis Colts is a football team (category). By scanning text patterns, NELL can infer with high probability that Peyton Manning plays for the Indianapolis Colts -- even if it has never read that Mr. Manning plays for the Colts. But sports and music factoids aside, the system is not without its flaws. For instance, when internet cookies were categorized as baked goods, "[i]t started this whole avalanche of mistakes," according to researcher Tom M. Mitchell. Apparently, NELL soon "learned" that one could delete pastries (the mere thought of which is sure to give us night terrors for quite some time). Luckily, human operators stepped in and corrected the thing, and now it's back on course, accumulating data and giving researchers insights that might someday lead to a true semantic web.
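    The Peyton Manning example above can be caricatured in code: a candidate relation is accepted only when the arguments have the right categories *and* enough independent text patterns link them -- and a bad category label (cookies as baked goods) poisons everything downstream. All the data and names here are invented for illustration; NELL's actual inference is far richer.

```python
# Toy sketch of category-constrained, pattern-supported relation inference.

categories = {
    "Peyton Manning": "athlete",
    "Indianapolis Colts": "sports_team",
    "cookie": "baked_good",  # the kind of mislabel that caused trouble
}

# (subject, connecting text pattern, object) tuples "read" from the web.
evidence = [
    ("Peyton Manning", "plays for", "Indianapolis Colts"),
    ("Peyton Manning", "quarterback of", "Indianapolis Colts"),
    ("Peyton Manning", "signed with", "Indianapolis Colts"),
]

def infer_plays_for(subject, obj, evidence, min_patterns=2):
    """Accept plays_for(subject, obj) only if the type constraints hold
    and at least min_patterns distinct patterns support the pair."""
    if categories.get(subject) != "athlete":
        return False
    if categories.get(obj) != "sports_team":
        return False
    patterns = {p for s, p, o in evidence if s == subject and o == obj}
    return len(patterns) >= min_patterns

print(infer_plays_for("Peyton Manning", "Indianapolis Colts", evidence))
print(infer_plays_for("cookie", "Indianapolis Colts", evidence))
```

    The second call shows why the cookie mislabel mattered: once a category is wrong, the type constraints that are supposed to block bad inferences start letting them through elsewhere.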

  • Carnegie Mellon's robot snakes converge into creepy hand-like wargadget

    by 
    Joseph L. Flatley
    07.28.2010

President Eisenhower, in his famous farewell speech in 1961, warned against the acquisition of unwarranted influence by the "military industrial complex." If he had given those remarks some fifty years later, he might have worked academia into the phrase -- especially if he knew about the snakes! Certainly one of the more viscerally unnerving wargadgets we've encountered over the last few years, the creepy-crawly automatons of the Carnegie Mellon Robotics Institute are a big hit at the U.S. Army Research Laboratory, where three of 'em have been arrayed onto a circular base to form the Robotic Tentacle Manipulator, a hand that could be used for opening doors or handling IEDs, possibly while mounted on the iRobot Warrior. The "opening a door" problem, as it is called, has perplexed the field of robotics for quite some time now -- and it might one day be solved using technology like this. Until then, it looks like doorknobs are still the terrorist's best friend.

  • GM shows off sensor-laden windshield, new heads-up display prototype

    by 
    Darren Murph
    03.18.2010

Heads-up displays are undoubtedly novel, and downright useful in the right circumstances. Trouble is, few of these prototypes ever make it beyond the lab, and we're stuck using these same two eyeballs to experience the world around us. General Motors is evidently tired of the almosts, and it's now working in concert with Carnegie Mellon University and the University of Southern California in order to concoct one of the most advanced HUD systems that we've seen -- particularly in the automotive world. Setting out to create "enhanced vision systems," GM's R&D team has created a windshield packed with visible-light and infrared cameras along with internal optics that keep a close eye on the driver's retinas. In the images and video below (hit the 'Read More' link for the real action), you'll see a solution that utilizes lasers in order to highlight road edges, speed limit signs and all sorts of other vital bits of data during a fog-filled commute. Best of all? We're told that some of these technologies "could end up in GM vehicles in the near-term future." Granted, the Volt was supposed to set sail already, but we suppose we'll give 'em the benefit of the doubt.

  • Carnegie Mellon student shows that 64 pixels is enough for Mario (video)

    by 
    Tim Stevens
    03.12.2010

    There are 2,073,600 pixels in a 1080p TV, yet Carnegie Mellon student Chloe Fan has blown our minds by showing that you only need 64 of them to have a little fun with Super Mario Bros. She wired an Arduino to an 8 x 8 LED matrix through a breadboard, then scaled the first level of the game down to a resolution that makes the 160 x 144 resolution Game Boy look positively high def. The controls are similarly simplified: one button to move Mario (the slightly more orange dot) right, and a second to jump. She also wired up a separate board to play the game's theme song, as you can see in the embed below, but be aware: the video ends before the theme song does, meaning you'll be humming it to yourself all day long.
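    The "scale the level down" step above amounts to collapsing a larger tile map onto an 8 x 8 grid: an LED lights if any source tile in its patch is solid. A minimal sketch (the tiny map and function are invented for illustration; the actual project drives the matrix from an Arduino):

```python
# Toy downscaler: collapse a tile map to an 8 x 8 LED grid, lighting a
# cell whenever any tile in the patch it covers is solid.

def downscale(level, out_w=8, out_h=8):
    """Assumes the level's dimensions divide evenly by the output size."""
    in_h, in_w = len(level), len(level[0])
    sy, sx = in_h // out_h, in_w // out_w  # source patch per LED
    return [
        [1 if any(level[gy * sy + y][gx * sx + x]
                  for y in range(sy) for x in range(sx)) else 0
         for gx in range(out_w)]
        for gy in range(out_h)]

# A 16 x 16 map whose bottom two rows are solid ground.
level = [[0] * 16 for _ in range(14)] + [[1] * 16 for _ in range(2)]
print(downscale(level)[7])  # the bottom LED row
```

    On the hardware side, each row of this grid would become one byte shifted out to the 8 x 8 LED matrix every frame.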

  • Skinput: because touchscreens never felt right anyway (video)

    by 
    Vlad Savov
    03.02.2010

    Microsoft looks to be on a bit of a hot streak with innovations lately, and though this here project hasn't received much hype (yet), we'd say it's one of the most ingenious user interface concepts we've come across. Skinput is based on an armband straddling the wearer's biceps and detecting the small vibrations generated when the user taps the skin of his arm. Due to different bone densities, tissue mass and muscle size, unique acoustic signatures can be identified for particular parts of the arm or hand (including fingers), allowing people to literally control their gear by touching themselves. The added pico projector is there just for convenience, and we can totally see ourselves using this by simply memorizing the five input points (current maximum, 95.5 percent accuracy), particularly since the band works even if you're running. Make your way past the break to see Tetris played in a whole new way.
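    The "unique acoustic signatures" idea above is, at heart, a classification problem: match an incoming vibration's features to the closest stored signature. A toy nearest-neighbor sketch -- the three-number feature vectors and location names are invented; the real system used much richer acoustic features.

```python
# Toy tap classifier: pick the stored signature nearest to the sample.

import math

# Pretend feature "signatures" learned for three tap points on the arm.
signatures = {
    "wrist":   [0.9, 0.2, 0.1],
    "forearm": [0.4, 0.7, 0.3],
    "elbow":   [0.1, 0.3, 0.9],
}

def classify_tap(sample):
    """Return the tap location whose signature is nearest (Euclidean)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(signatures, key=lambda loc: dist(signatures[loc], sample))

print(classify_tap([0.85, 0.25, 0.15]))  # closest to the wrist signature
```

    The 95.5 percent accuracy figure quoted above is for five such input points; more locations means more signatures crowding the feature space.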

  • Carnegie Mellon's robotic snake stars in a glamour video

    by 
    Nilay Patel
    07.12.2009

We've been pretty into Carnegie Mellon's modular snake robots for a while now, and seeing as it's a relatively sleepy Sunday we thought we'd share this latest video of snakebots just basically crawling all over the place and getting crazy. Bots like these have been getting some serious military attention lately, so watching these guys wriggle into any damn spot they please is at once awesome and terrifying. Or maybe it's just the music. Video after the break. [Thanks, Curtis]