carnegiemellon

Latest

  • Arduino-powered glove brings real sound effects to your make-believe gun show (video)

    by Daniel Cooper
    10.24.2011

    The days of air-punching invisible Daleks and making your own sound effects are over: a team from Carnegie Mellon's Human-Computer Interaction course has built a glove that does it all for you. The Augmented Hyper-Reality Glove can identify uppercuts and karate chops using flex and tilt sensors and play the accompanying sound effect using an Arduino-powered Adafruit wave shield. We can see some potential downsides -- flirtatious finger-gun fusillades accompanied by the sound of cannon fire might just ruin your date. If you're undaunted by such social faux pas, see the toy your inner child always wanted in action after the break.
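    The post doesn't include the team's code, but the gesture-to-sound idea is easy to sketch. Here's a toy Python illustration of the flex-and-tilt thresholding logic -- the real build runs Arduino firmware with the Adafruit wave shield, and every sensor name, threshold and sound file below is an invented placeholder.

    ```python
    # Toy sketch of the glove's gesture logic (not the team's Arduino firmware).
    # Sensor thresholds and sound file names are illustrative assumptions.

    def classify_gesture(flex, tilt_deg):
        """Map a raw flex reading (0-1023, like an Arduino analog read) and a
        tilt angle to a gesture label, mimicking the flex/tilt-sensor approach."""
        if flex > 700 and tilt_deg < -30:      # fist clenched, arm swinging up
            return "uppercut"
        if flex < 300 and abs(tilt_deg) < 15:  # open hand, roughly level
            return "karate_chop"
        if flex > 700 and abs(tilt_deg) < 15:  # fist held level: finger-gun stance
            return "finger_gun"
        return None

    SOUNDS = {  # hypothetical clips sitting on the wave shield's SD card
        "uppercut": "punch.wav",
        "karate_chop": "chop.wav",
        "finger_gun": "blast.wav",
    }

    def on_sensor_update(flex, tilt_deg, play):
        gesture = classify_gesture(flex, tilt_deg)
        if gesture:
            play(SOUNDS[gesture])

    # Example: feed in a fake reading and "play" the clip by printing its name.
    on_sensor_update(flex=850, tilt_deg=-45, play=print)  # -> punch.wav
    ```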

  • Carnegie Mellon robot jumps up, jumps up and glides down (video)

    by Joseph Volpe
    09.10.2011

    We can handle the imaginary terror of UFOs and nightmarish flying mammals. But robots that can jump like a human and then glide like a colugo? Now you're just filling Mr. Spielberg with even more sci-fi, end-of-days fodder. Carnegie Mellon researchers Matthew Woodward and Metin Sitti have crafted a prototype jumping and gliding bot at the university's NanoRobotics Lab that springs into action using a pair of human knee-like joints. The automated hijinks don't end there either, as the duo's invention then spreads its legs to catch some air and glide on back to terra firma. The project isn't just some bit of engineering whimsy; the team plans to adapt this tech for use in "unstructured terrain" -- i.e. non-level, wargadget territory. For now, this lord of the leaping gliders can reach comfortable human-sized heights of up to six feet. Give it some time, however, and we're sure this lil' android'll give Superman a bound for his money. Click on past the break for a real-world demo.

  • Intel places $30 million bet on the cloud, opens two new labs at Carnegie Mellon

    by Joseph Volpe
    08.04.2011

    Have you nerds heard? The cloud is the word, and Intel's ready to put its bank account where the industry's buzzing mouth is. Investing $30 million over a span of five years, the company has partnered with Carnegie Mellon University to open two new Intel Science and Technology Centers. The academic research labs will laser in on cloud and embedded computing research, providing open source innovations that tackle mass data analytics, real-time information service distribution and refinements to a future, cloud-connected lifestyle. Curious as to what this brain collective has up its sleeves? Imagine wearing a pair of Intel-powered glasses that overlays data linked to the people and objects you see. Not the Minority Report type? Alright, then consider its proposed intelligent car of the future, capable of recommending "routing, retail, dining, and entertainment" options tailored to passenger profiles and real-world conditions. Whether you're ready or not, this is the future, folks -- one big, passive scoop of computer-generated coddling. Hit the break for the full PR, and Peter Griffin's take on our sponsored tomorrow. [Image credit: Popular Science]

  • Carnegie Mellon researchers develop world's smallest biological fuel cell

    by Donald Melanson
    06.21.2011

    Cars and other vehicles may be the first thing that springs to mind at the mention of fuel cells, but the technology can of course also be used for plenty of other devices big and small, and a team of researchers at Carnegie Mellon University is now looking to take it to a few new places that haven't been possible so far. To that end, they've developed what they claim is the world's smallest biological fuel cell, which is the size of a single human hair and "generates energy from the metabolism of bacteria on thin gold plates in micro-manufactured channels." That, they say, could make it ideal for use in places like deep ocean environments where batteries are impractical -- or possibly, with some further refinement, in electronic devices, where it could potentially pack more energy than traditional batteries into the same space. The university's full press release is after the break.

  • Carnegie Mellon's GigaPan Time Machine brings time-lapse to panoramas

    by Donald Melanson
    04.22.2011

    We've already seen GigaPan technology used for plenty of impressive panoramas, but some researchers from Carnegie Mellon University have now gone one step further with their so-called "GigaPan Time Machine" project. Thanks to the magic of HTML5 and some time-consuming (but automated) photography, you can now "simultaneously explore space and time" right in your web browser -- that is, zoom in and around a large-format panorama that also happens to be a time-lapse video. If you don't feel like exploring yourself, you can also jump straight to some highlights -- like the construction of the Hulk statue at the CMU Carnival pictured above. Hit up the source link below to try it out -- just make sure you're in either Chrome or Safari, as they're the only compatible browsers at this time.

  • NC State and CMU develop velocity-sensing shoe radar, aim to improve indoor GPS routing

    by Darren Murph
    12.01.2010

    The world at large owes a good bit to Maxwell Smart, you know. Granted, it's hard to directly link the faux shoe phone to the GPS-equipped kicks that are around today, but the lineage is certainly apparent. The only issue with GPS in your feet is how it reacts when you waltz indoors, which is to say, not at all. In the past, most routing apparatuses have used inertial measurement units (IMUs) to track motion, movement and distance once GPS reception is lost indoors, but those have proven poor at spotting the difference between a slow gait and an outright halt. Enter NC State and Carnegie Mellon University, who have worked in tandem to develop a prototype shoe radar that's specifically designed to sense velocity. Within the shoe, a radar is attached to a diminutive navigational computer that "tracks the distance between your heel and the ground; if that distance doesn't change within a given period of time, the navigation computer knows that your foot is stationary." Hard to say when Nike will start testing these out in the cleats worn by football players, but after last week's abomination of a spot (and subsequent botching of a review by one Ron Cherry) during the NC State - Maryland matchup, we're hoping it's sooner rather than later.
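    The quoted rule is essentially a stationary-foot detector: if the heel-to-ground range stops changing, the foot is planted, and that flag is the kind of thing a navigation filter can use to rein in IMU drift. Below is a minimal Python sketch of that rule under stated assumptions -- the window length, tolerance and sample values are invented for illustration, not taken from the researchers' system.

    ```python
    # Minimal sketch of the quoted rule: if the heel-to-ground distance doesn't
    # change over a short window, declare the foot stationary. Window size and
    # tolerance are illustrative assumptions.

    from collections import deque

    class StanceDetector:
        def __init__(self, window=5, tolerance_cm=0.5):
            self.readings = deque(maxlen=window)   # recent radar ranges, in cm
            self.tolerance_cm = tolerance_cm

        def update(self, heel_range_cm):
            """Feed one radar range sample; return True while the foot is planted."""
            self.readings.append(heel_range_cm)
            if len(self.readings) < self.readings.maxlen:
                return False
            return max(self.readings) - min(self.readings) < self.tolerance_cm

    detector = StanceDetector()
    samples = [12.0, 8.1, 3.2, 0.4, 0.4, 0.5, 0.4, 0.4, 0.5]  # one fake footfall
    for t, r in enumerate(samples):
        if detector.update(r):
            print(f"sample {t}: foot stationary -- a good moment to cancel drift")
    ```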

  • Humans wearing radios could form massive wireless networks of the future

    by Laura June Dziuban
    11.03.2010

    Researchers at Queen's University Belfast in Northern Ireland are studying how to create an infrastructure out of human beings interconnected by wearing sensors, gateways and radios, resulting in a "body-to-body" network. Because human beings are so easy to come by, the networks could potentially be massive as well as high in bandwidth. The team is now studying how human bodies and movement can affect radio signals, and the general operation of body area networks, which aren't new. Concurrent research is being done at Carnegie Mellon to study how thousands of sensors can communicate with each other effectively. Long term, actual functioning body-to-body wireless networks could render cellular base stations unnecessary in heavily populated areas. Of course, that's all well into the future, but hit up the source for more details.

  • CMU's first ever robot census: 547 (and counting)

    by Joseph L. Flatley
    10.18.2010

    Sure, we've seen an incredible amount of cool tech from Carnegie Mellon (usually on our way to a kegger on Beeler St.), but you might wonder exactly how many robots they have on campus. Well, maybe you don't -- but a first-year doctoral student named Heather Knight does. A recent transplant from MIT, she's counted 547 robots so far -- but since these guys are all over campus, from the Robotics Institute to the theater and art departments, getting an accurate head count might take a while. The project most likely won't stop there, either: once the university-wide count is complete, Knight would like to see a nationwide census take place. We only hope this happens before it's too late. Update: The CMU Robot Census form is available here.

  • NELL machine learning system could easily beat you at Trivial Pursuit

    by Joseph L. Flatley
    10.12.2010

    If fifteen years ago you had told us that some day, deep in the bowels of Carnegie Mellon University, a supercomputer cluster would scan hundreds of millions of Web pages, examine text patterns, and teach itself about the Ramones, we might have believed you -- we were into some far-out stuff back then. But this project is about more than the make of Johnny's guitar (Mosrite) or the name of the original drummer (Tommy). NELL, or the Never-Ending Language Learning system, constantly surfs the Web and classifies everything it scans into specific categories (such as cities, universities, and musicians) and relations. One example The New York Times cites: Peyton Manning is a football player (category). The Indianapolis Colts is a football team (category). By scanning text patterns, NELL can infer with a high probability that Peyton Manning plays for the Indianapolis Colts -- even if it has never read that Mr. Manning plays for the Colts. But sports and music factoids aside, the system is not without its flaws. For instance, when Internet cookies were categorized as baked goods, "[i]t started this whole avalanche of mistakes," according to researcher Tom M. Mitchell. Apparently, NELL soon "learned" that one could delete pastries (the mere thought of which is sure to give us night terrors for quite some time). Luckily, human operators stepped in and corrected the thing, and now it's back on course, accumulating data and giving researchers insights that might someday lead to a true semantic web.
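    To make the Manning-and-Colts example concrete, here's a toy Python illustration of the pattern-plus-category idea the article describes: known category members, a handful of text patterns that tend to express "plays for," and a check that both arguments have the right categories (which is exactly the safeguard the cookie incident broke). This is not NELL's actual code -- the patterns, sentence and category table are made up.

    ```python
    # Toy illustration of NELL-style pattern-based relation inference.
    # Categories and patterns below are invented examples, not NELL's data.

    import re

    CATEGORIES = {
        "Peyton Manning": "athlete",
        "Indianapolis Colts": "sports_team",
    }

    # Patterns that tend to express plays_for(athlete, team).
    PLAYS_FOR_PATTERNS = [
        r"{athlete},? quarterback of the {team}",
        r"{athlete} signed with the {team}",
        r"the {team} drafted {athlete}",
    ]

    def infer_plays_for(sentence, athlete, team):
        """Return True if any known pattern matches, suggesting plays_for."""
        if CATEGORIES.get(athlete) != "athlete" or CATEGORIES.get(team) != "sports_team":
            return False   # the category check that keeps cookies out of the baked goods
        for pattern in PLAYS_FOR_PATTERNS:
            regex = pattern.format(athlete=re.escape(athlete), team=re.escape(team))
            if re.search(regex, sentence, flags=re.IGNORECASE):
                return True
        return False

    text = "Peyton Manning, quarterback of the Indianapolis Colts, threw for 300 yards."
    print(infer_plays_for(text, "Peyton Manning", "Indianapolis Colts"))  # True
    ```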

  • Carnegie Mellon's robot snakes converge into creepy hand-like wargadget

    by Joseph L. Flatley
    07.28.2010

    President Eisenhower, in his famous farewell speech in 1961, warned against the acquisition of unwarranted influence by the "military industrial complex." If he had given those remarks some fifty years later, he might have worked academia into the phrase -- especially if he knew about the snakes! Certainly one of the more viscerally unnerving wargadgets we've encountered over the last few years, the creepy-crawly automatons of the Carnegie Mellon Robotics Institute are a big hit at the U.S. Army Research Laboratory, where three of 'em have been arrayed onto a circular base to form the Robotic Tentacle Manipulator, a hand that could be used for opening doors or handling IEDs, possibly while mounted on the iRobot Warrior. The "opening a door" problem, as it is called, has perplexed the field of robotics for quite some time now -- and it might one day be solved using technology like this. Until then, it looks like doorknobs are still the terrorist's best friend.

  • GM shows off sensor-laden windshield, new heads-up display prototype

    by Darren Murph
    03.18.2010

    Heads-up displays are undoubtedly novel, and downright useful in the right circumstances. Trouble is, few of these prototypes ever make it beyond the lab, and we're stuck using these same two eyeballs to experience the world around us. General Motors is evidently tired of the almosts, and it's now working in concert with Carnegie Mellon University and the University of Southern California in order to concoct one of the most advanced HUD systems that we've seen -- particularly in the automotive world. Setting out to create "enhanced vision systems," GM's R&D team has created a windshield packed with visible and infrared cameras along with internal optics that keep a close eye on the driver's retinas. In the images and video below (hit the 'Read More' link for the real action), you'll see a solution that uses lasers to highlight road edges, speed limit signs and all sorts of other vital bits of data during a fog-filled commute. Best of all? We're told that some of these technologies "could end up in GM vehicles in the near-term future." Granted, the Volt was supposed to set sail already, but we suppose we'll give 'em the benefit of the doubt.

  • Carnegie Mellon student shows that 64 pixels is enough for Mario (video)

    by Tim Stevens
    03.12.2010

    There are 2,073,600 pixels in a 1080p TV, yet Carnegie Mellon student Chloe Fan has blown our minds by showing that you only need 64 of them to have a little fun with Super Mario Bros. She wired an Arduino to an 8 x 8 LED matrix through a breadboard, then scaled the first level of the game down to a resolution that makes the 160 x 144 resolution Game Boy look positively high def. The controls are similarly simplified: one button to move Mario (the slightly more orange dot) right, and a second to jump. She also wired up a separate board to play the game's theme song, as you can see in the embed below, but be aware: the video ends before the theme song does, meaning you'll be humming it to yourself all day long.
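    Just to show how little information 64 pixels need to carry, here's a toy Python rendering of the idea: the level becomes a strip of 8-pixel-tall columns, and Mario is a single brighter dot moved by two inputs. The actual build is an Arduino sketch driving an LED matrix; the terminal output and the "level" data below are stand-ins we made up.

    ```python
    # Toy terminal rendering of the 8x8 concept (the real project is Arduino
    # code driving an LED matrix). The level layout here is invented.

    LEVEL = [  # each string is one 8-pixel-tall column; '#' = a lit ground/block LED
        ".......#", ".......#", ".......#", "....#..#",
        "....#..#", ".......#", "......##", "......##",
    ]

    def render(window_start, mario_col, mario_row):
        """Print an 8x8 window with Mario shown as 'M' (the 'more orange' dot)."""
        cols = LEVEL[window_start:window_start + 8]
        for row in range(8):
            line = ""
            for c, col in enumerate(cols):
                line += "M" if (c == mario_col and row == mario_row) else col[row]
            print(line)
        print()

    render(window_start=0, mario_col=1, mario_row=6)  # standing on the ground row
    render(window_start=0, mario_col=2, mario_row=4)  # one jump press later
    ```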

  • Skinput: because touchscreens never felt right anyway (video)

    by Vlad Savov
    03.02.2010

    Microsoft looks to be on a bit of a hot streak with innovations lately, and though this here project hasn't received much hype (yet), we'd say it's one of the most ingenious user interface concepts we've come across. Skinput is based on an armband straddling the wearer's biceps and detecting the small vibrations generated when the user taps the skin of his arm. Due to different bone densities, tissue mass and muscle size, unique acoustic signatures can be identified for particular parts of the arm or hand (including fingers), allowing people to literally control their gear by touching themselves. The added pico projector is there just for convenience, and we can totally see ourselves using this by simply memorizing the five input points (current maximum, 95.5 percent accuracy), particularly since the band works even if you're running. Make your way past the break to see Tetris played in a whole new way.
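    The core trick is that each tap location produces a distinguishable acoustic signature, so recognizing the input point is a small classification problem. Below is a toy Python nearest-centroid classifier standing in for whatever trained classifier the real armband uses; the two "features" per tap and all the numbers are invented for illustration.

    ```python
    # Toy nearest-centroid stand-in for Skinput's tap classifier. The real
    # system classifies acoustic signatures from an armband of vibration
    # sensors; the feature vectors below are invented.

    import math

    # Pretend each tap yields two features (say, dominant frequency and amplitude),
    # and we've recorded an average signature for five input points on the arm.
    CENTROIDS = {
        "thumb":   (180.0, 0.9),
        "index":   (220.0, 0.7),
        "wrist":   (140.0, 1.2),
        "forearm": (100.0, 1.5),
        "elbow":   ( 80.0, 1.8),
    }

    def classify_tap(features):
        """Return the input point whose stored signature is closest to this tap."""
        return min(CENTROIDS, key=lambda name: math.dist(features, CENTROIDS[name]))

    print(classify_tap((215.0, 0.75)))  # -> 'index'
    ```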

  • Carnegie Mellon's robotic snake stars in a glamour video

    by Nilay Patel
    07.12.2009

    We've been pretty into Carnegie Mellon's modular snake robots for a while now, and seeing as it's a relatively sleepy Sunday we thought we'd share this latest video of snakebots just basically crawling all over the place and getting crazy. Bots like these have been getting some serious military attention lately, so watching these guys wriggle into any damn spot they please is at once awesome and terrifying. Or maybe it's just the music. Video after the break. [Thanks, Curtis]

  • CMU researchers control microbots with mini magnets

    by Darren Murph
    05.07.2009

    Pardon the alliteration, but we're excited about the proposition here. For years -- millenniums, even -- scientists have been trying to figure out how to manipulate minuscule devices with magnets, and at long last, we've got a breakthrough in the field. Metin Sitti, an associate professor of mechanical engineering at Carnegie Mellon University, is credited with creating a new control technique that could allow microscopic machines to "one day deliver drugs directly to a sickly cell or a tumor." Essentially, the diminutive bots glide across a glass surface covered with a grid of metal electrodes, and you're just a click away (it's the Read link, just so you know) from seeing a live demonstration on how they can be used to "anchor one or more microbots while allowing others to continue to move freely around the surface." Good times.
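    The clever part is selective addressing: a global actuation step tries to move every bot, but any bot sitting over an energized electrode stays anchored, so the rest can be steered independently. Here's a toy Python simulation of just that bookkeeping -- the grid, positions and single-axis "step" are our own simplification, not the researchers' control scheme or physics.

    ```python
    # Toy simulation of selective addressing: a global actuation pulse nudges
    # every microbot, but bots over an energized electrode hold their position.
    # Grid coordinates and the one-cell step are illustrative assumptions.

    def step_bots(bots, anchored_cells, dx=1):
        """Advance each unanchored bot by dx cells along x; anchored bots hold still."""
        return [
            (x, y) if (x, y) in anchored_cells else (x + dx, y)
            for (x, y) in bots
        ]

    bots = [(0, 0), (0, 1), (0, 2)]   # three bots on the glass electrode grid
    anchored = {(0, 1)}               # energize the electrode under the middle bot

    for _ in range(3):                # three global actuation pulses
        bots = step_bots(bots, anchored)
    print(bots)  # -> [(3, 0), (0, 1), (3, 2)]: the anchored bot stayed behind
    ```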

  • Carnegie Mellon morphs 'pop-up buttons' onto multi-touch display

    by Thomas Ricker
    04.28.2009

    While attempts to add feedback to touchscreen displays via vibration and audible tones are laudable, they're nothing compared to the tactile euphoria felt at the press of a well-designed button. Still, many of us are willing to sacrifice tactility in order to maximize display sizes on our pocketable or portable devices. Now researchers at Carnegie Mellon have developed touch-sensitive displays with physical buttons that "pop out" from the surface. CMU's prototypes pump air through geometric-shaped holes to create concave or convex "buttons" on a screen covered with a semi-transparent latex -- IR sensors and cameras detect finger placement while a projector casts images (like numbers and graphics) onto the display. It can even sense press force by monitoring changes in air pressure. Sure, it all sounds overly cumbersome until you see the technology demonstrated. For that you can travel to Pittsburgh to count the rivers or just hit the read links below for the video.
    Read -- Video
    Read -- Technology Review
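    The press-force trick follows from basic pneumatics: a finger squeezing an inflated "button" raises the chamber pressure above its inflation baseline, and that rise can be read as force. Below is a toy Python sketch of that conversion; the baseline, calibration constant and threshold are invented for illustration, not values from the CMU prototype.

    ```python
    # Toy sketch of the press-force idea: a finger pressing a pneumatic "button"
    # compresses the chamber, so a rise above the inflation baseline reads as
    # force. All constants below are invented placeholders.

    BASELINE_KPA = 12.0        # chamber pressure with the button popped up, untouched
    KPA_PER_NEWTON = 0.8       # hypothetical calibration: pressure rise per newton
    PRESS_THRESHOLD_KPA = 0.5  # ignore tiny fluctuations

    def read_press(pressure_kpa):
        """Return an estimated press force in newtons, or None if nothing is pressing."""
        delta = pressure_kpa - BASELINE_KPA
        if delta < PRESS_THRESHOLD_KPA:
            return None
        return delta / KPA_PER_NEWTON

    for sample in (12.1, 12.9, 14.4):
        print(read_press(sample))   # None, then ~1.1 N, then 3.0 N
    ```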

  • Sensor-laden footballs / gloves could run referees right out of work

    by Darren Murph
    12.19.2008

    Dr. Priya Narasimhan, a professor at Carnegie Mellon University, doesn't intend to put a single NFL referee out of work, but there's no doubt that the technology she's tinkering with could indeed have that effect. The prof and her students are developing sensor-laden footballs and gloves, both of which could eventually tell in real time whether a ball bounced off the ground before being caught or whether a player actually had possession of a ball whilst being piled upon after a fumble. Currently, she's had zero luck persuading a college or professional team to help her experiment further, and we can sort of see why. We mean, it's nice to get every call right in theory, but what fun would sport be without the all-important "Ref, you suck!" chant? [Thanks, Freddy]
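    The article doesn't say which sensors the ball carries, but one plausible way to flag a ground bounce is the shape of the deceleration spike: turf tends to produce one sharp, brief jolt, while hands cushion the ball over a longer interval. Here's a toy Python sketch of that heuristic -- the thresholds and sample traces are entirely invented, purely to illustrate the kind of real-time call such a ball might make.

    ```python
    # Toy heuristic, not Narasimhan's method: distinguish a ground bounce from
    # a catch by the sharpness of the deceleration spike. Numbers are invented.

    def classify_impact(accel_g):
        """accel_g: list of acceleration magnitudes (in g) sampled around an impact."""
        peak = max(accel_g)
        duration = sum(1 for a in accel_g if a > 5.0)   # samples above 5 g
        if peak > 40.0 and duration <= 2:
            return "ground bounce -- incomplete"
        return "caught (or at least cushioned by hands)"

    print(classify_impact([1, 2, 55, 3, 1]))         # sharp, brief spike -> bounce
    print(classify_impact([1, 8, 14, 12, 9, 6, 2]))  # broad, softer spike -> catch
    ```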

  • Caterpillar and CMU team up to create world's largest robotic monster truck

    by Laura June Dziuban
    11.07.2008

    We're always hearing about some fantastical, nigh-mythical creation that Carnegie Mellon University is in the midst of cobbling together from spare parts, crazy ideas, and pure, simple genius, so maybe we shouldn't be frothing over the new robotic truck they've partnered up with Caterpillar to create, but this one promises to be the "world's largest." Adapting software CMU used in the DARPA Urban Challenge, the team hopes to end up with fully automated, 700-ton mining trucks capable of moving at up to 42 miles per hour. The trucks would theoretically reduce costs, increase productivity, and save lives. The Frankenstein-ed vehicles will boast GPS, laser range finders to identify large obstacles, video equipment, and a "robotic driver." The scientists somewhat predictably foresee some (as of now) rather far-fetched consumer applications in cars and trucks over the "next five to ten years," but we're taking that with a few salt grains for now. The trucks aren't ready quite yet, but we hear their arrival is imminent, and we can only imagine that somewhere in the world, Grave Digger is crying to himself. Update: We've changed the title to reflect the accurate arrangement, which is a teaming up of CMU and Caterpillar, not DARPA. Thanks to the commenter who pointed that out.

  • Carnegie Mellon brings adhesive arms to the burgeoning pillbot scene

    by Paul Miller
    08.06.2008

    We've seen plenty of pill bots in our day -- most of them dumb little swallowable cameras -- but Carnegie Mellon University isn't messing around with this stuff. The nerds over there have built a remote-controlled pillbot with small, adhesive arms that allow it to grip onto internal surfaces. That is to say, your internal surfaces. The pill can view damaged areas, deliver drugs and might eventually be outfitted with a small laser for cauterizing internal wounds. Yes, we just said lasers. [Via Hack a Day]
    Read -- Controlling a Gut Bot's Position
    Read -- Creepy action video

  • Waalbot, the wall-climbing, gecko-footed robot

    by Joshua Fruhlinger
    06.12.2008

    Robots do all sorts of things, but climbing walls seems to be a goal engineers can't live without. Meet the Waalbot, which is fitted with gecko-like micro-fiber feet that can stick to vertical and ceiling surfaces. Its wheeled legs each carry three feet, rotating to cruise over surface curvature and even grab onto adjoining walls at right angles. Unlike other wall-climbing robots, this one is small and light, and given its diminutive size, its payload capacity varies depending on the surface type and the size of its feet. Using a PIC microcontroller, wireless controls and batteries, Waalbot is designed to be completely autonomous and untethered. [Via Engineering TV]