Georgia Tech

Latest

  • Georgia Tech models swimming, cargo-carrying nanobots

    by Jon Fingas
    08.07.2012

    The nanobot war is escalating. Not content to let Penn State's nanospiders win the day, Georgia Tech has answered back with a noticeably less creepy blood-swimming robot model of its own, whose look is more that of a fish than any arachnid this time around. It still uses material changes to exert movement -- here exposing hydrogels to electricity, heat, light or magnetism -- but Georgia Tech's method steers the 10-micron trooper to its destination using far more innocuous-sounding flaps. The researchers' aims are as benign as ever: either delivering drugs or building minuscule structures piece by piece. The catch is that rather important word "model" from earlier: Georgia Tech has only a scientifically viable design to work from and needs someone to build it. Should someone step up, there's a world of potential from schools of tiny swimmers targeting exactly what ails us.

  • Vibrating glove gives piano lessons, helps rehab patients regain finger sensation and motor skills

    by Alexis Santos
    07.18.2012

    We've seen a good number of electronic gloves before, and now researchers at Georgia Tech have devised one to rehabilitate patients who suffer from paralyzing spinal cord injuries while teaching them how to tickle the ivories. Christened Mobile Music Touch, the black mitt pairs with a keyboard and cues individual fingers with vibrations to play notes. The handgear also buzzes constantly for several hours to stimulate recovery while users go about their day, similar to another yellowjacket-developed solution. After treatment, some patients could pick up objects and feel textures they hadn't been able to -- especially remarkable since, according to the university, little improvement is typically seen a year after injuries are sustained. Folks who learned to play the piano with the device also experienced better results than those who went without it. Project leader Dr. Tanya Markow believes the rehab's success could stem from renewed activity in brain pathways that otherwise lie dormant. For the full skinny, head past the break for the press release and a video of the gauntlet in action. [Thanks, Timothy]

  • Georgia Tech scientists developing biology-inspired system to give robot eyes more human-like motion

    by Brian Heater
    07.07.2012

    Having difficulty getting your robot parts to work as planned? Turn to nature -- or better yet, look inside yourself. After all, where better to find inspiration than the humans that the machines will one day enslave, right? Researchers at Georgia Tech have been working to develop a system for controlling cameras in robots that works much like human muscle. Says Ph.D. candidate Joshua Schultz: "The actuators developed in our lab embody many properties in common with biological muscle, especially a cellular structure. Essentially, in the human eye muscles are controlled by neural impulses. Eventually, the actuators we are developing will be used to capture the kinematics and performance of the human eye." The team recently showed off its work at the IEEE International Conference on Biomedical Robotics and Biomechatronics in Rome. When fully developed, the researchers anticipate that the piezoelectric system could be used for MRI-based surgery, rehabilitation and research of the human eye.

  • Georgia Tech's Shimi robot wants to rock with you all night, rock the night away

    by Brian Heater
    06.27.2012

    Shimi certainly has the makings of a viral video hit, but its creators at Georgia Tech want you to know that there's more to the dancing robot than just a pretty face. The "interactive musical buddy," designed by the school's Center for Music Technology, is a one-foot-tall smartphone-enabled "docking station with a brain." Shimi has a whole slew of functionality, using the phone's face-detection to track listeners and better position its speakers. Users can also clap out a beat, which the 'bot will use to pull a matching song from the phone's playlist, playing the track and, naturally, dancing to the beat. Forthcoming functionality includes the ability for users to shake their heads or wave a hand to affect Shimi's song choices. Google I/O attendees will get the opportunity for a closer look at Shimi this week in San Francisco. In the meantime, check out a couple of videos of the robot doing its thing after the break.
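    The clap-to-playlist trick described above can be sketched in a few lines: average the gaps between claps to get a tempo, then pick the library track with the closest BPM. This is an illustration only -- Shimi's actual software isn't public, so the function names and the toy library below are invented:

```python
def estimate_bpm(clap_times):
    """Estimate tempo (beats per minute) from a list of clap timestamps in seconds."""
    if len(clap_times) < 2:
        raise ValueError("need at least two claps")
    intervals = [b - a for a, b in zip(clap_times, clap_times[1:])]
    mean_interval = sum(intervals) / len(intervals)
    return 60.0 / mean_interval

def pick_track(bpm, library):
    """Pick the (title, bpm) pair whose tempo is closest to the clapped tempo."""
    return min(library, key=lambda track: abs(track[1] - bpm))

# hypothetical playlist metadata
library = [("slow ballad", 70), ("pop tune", 120), ("drum and bass", 170)]
claps = [0.0, 0.5, 1.0, 1.5]   # one clap every 0.5 s -> 120 BPM
print(pick_track(estimate_bpm(claps), library)[0])
```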

  • The new stars of reggae are nothing like the old ones

    by Jamie Rigg
    06.17.2012

    Earthly music just ain't enough for reggae / rock band Echo Movement. In search of extraterrestrial inspiration, they hooked up with researchers at Georgia Tech's Sonification Lab, which specializes in turning ugly numbers into beautiful music. Using data from NASA's Kepler telescope and its search for Earth II, SonLab generated "sequences of sonified musical pitches" from fluctuations in a star's brightness (meet Kepler 4665989). Echo Movement got their loop on and composed a harmony from the sequences, adding a tremolo effect from another star's pattern for a softer sound. Unfortunately, the finished track isn't out until September, but in the meantime you can hear the six-second celestial hook at the source link -- just don't blame us if you get pangs of Nokia-stalgia. Also, if you want to imagine how Echo Movement might use the sample, we've embedded one of their rarer songs -- one that doesn't involve Spider-Man's girlfriend -- after the break.
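    The core of this kind of sonification is simple: scale each brightness sample into a pitch range. The sketch below is a toy illustration, not SonLab's actual pipeline; the "light curve" (a dip as a planet transits the star) and the MIDI note range are made up:

```python
def sonify(brightness, low_note=60, high_note=72):
    """Map brightness samples onto a MIDI pitch range (C4..C5 by default)."""
    lo, hi = min(brightness), max(brightness)
    span = (hi - lo) or 1.0          # avoid dividing by zero on a flat curve
    notes = []
    for b in brightness:
        frac = (b - lo) / span       # 0.0 at the dimmest sample, 1.0 at the brightest
        notes.append(round(low_note + frac * (high_note - low_note)))
    return notes

# toy light curve: the star dims slightly as a planet passes in front of it
curve = [1.00, 1.00, 0.99, 0.97, 0.99, 1.00, 1.00]
print(sonify(curve))
```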

  • Georgia Tech develops poultry deboning robot / chicken nightmare machine

    by Brian Heater
    06.01.2012

    If you have any chickens in the house, you're going to want to keep them away from the computer. Georgia Tech researchers have developed the device that will haunt their feathered dreams. The prototype Intelligent Cutting and Deboning System has a built-in 3D vision system to help it cut and debone a chicken. The robot uses collected data and custom algorithms to reduce bone fragments and increase yield on birds, whilst ensuring that no fowl will ever get a full night's sleep again. The school has begun testing the system, as evidenced by the unfortunate bird picture above. Press release after the break, if you're not too chicken.

  • Georgia Tech's BrailleTouch is a Braille writer for iPhone

    by Steve Sande
    02.20.2012

    Researchers from the School of Interactive Computing at Georgia Tech have developed a prototype iPhone app called BrailleTouch that should revolutionize texting and data entry for visually impaired iPhone users. The app places six keys -- three for each hand -- on the sides of the iPhone screen in landscape orientation. Individuals who are familiar with Braille (which uses a grid of six raised dots) can then form the letters by touching the screen and receive audio feedback to confirm that the desired letter has been typed. There are also gestures for adding spaces and deleting incorrect letters. BrailleTouch has been shown to be up to six times faster than other texting solutions for the visually impaired, with speeds up to 32 words per minute at 92 percent accuracy. While the app is currently iOS-only (there is an iPad version as well), the developers plan to make BrailleTouch available for all smartphone platforms. The video below shows BrailleTouch in action, and you can read other details of the system in the press release from Georgia Tech.

    GEORGIA TECH DEVELOPS BRAILLE-LIKE TEXTING APP

    ATLANTA - Feb. 17, 2012 - Imagine if smartphone and tablet users could text a note under the table during a meeting without anyone being the wiser. Mobile gadget users might also be enabled to text while walking, watching TV or socializing without taking their eyes off what they're doing. Georgia Tech researchers have built a prototype app for touch-screen mobile devices that is vying to be a complete solution for texting without the need to look at a mobile gadget's screen. "Research has shown that chorded, or gesture-based, texting is a viable solution for eyes-free written communication in the future, making obsolete the need for users to look at their devices while inputting text on them," said Mario Romero, Postdoctoral Fellow in the School of Interactive Computing (IC) and the project's principal investigator.
The free open-source app, called BrailleTouch, incorporates the Braille writing system used by the visually impaired. It has been conceived as a texting tool for any of the millions of smartphone users worldwide. Early studies with visually impaired participants proficient in Braille typing have demonstrated that users can input at least six times the number of words per minute when compared to other research prototypes for eyes-free texting on a touch screen. Users reach up to 32 words per minute with 92 percent accuracy with the prototype app for the iPhone. "We are currently designing a study to formally evaluate BrailleTouch through both quantitative and qualitative methods," said Caleb Southern, an IC graduate student. "We will measure the typing speed and accuracy of visually impaired users and capture the feedback from study participants in areas such as comfort, ease of use and perceived value." For sighted users, the research team is exploring how BrailleTouch could be a universal eyes-free mobile texting app that replaces soft QWERTY keyboards and other texting technologies. "BrailleTouch is an out-of-the-box solution that will work with smartphones and tablets and allow users to start learning the Braille alphabet in a few minutes," said Romero. "It also reduces the need for expensive proprietary Braille keyboard devices, which typically cost thousands of dollars." The researchers have designed BrailleTouch to address the limitations of soft keyboards, which do not provide tactile feedback, as well as physical keyboards, which often use small and numerous fixed buttons. BrailleTouch is the only iPhone app in existence that uses a six-finger chording process that replicates the traditional Braille keyboard. The app uses a gesture-based solution by turning the iPhone's touchscreen into a soft-touch keyboard programmed for Braille and requiring only six keys, making it a practical solution for the limited screen real estate on smartphones.
The key feature of the BrailleTouch technology is the use of the six-key configuration so that the keyboard fits on the screen and users keep their fingers in a relatively fixed position while texting. This design allows users to hold their device with the screen facing away from them -- cradling the device with their palms or pinkies and thumbs -- and to type with a majority of their fingers, identical to typing Braille on a standard keyboard. The team behind BrailleTouch is led by Romero and IC Professor Gregory Abowd, co-principal investigator. Former IC affiliate Brian Frey conceived the original idea and developed the first prototype and Southern created an improved design. They are conducting usability studies together with James Clawson, a Ph.D. candidate in IC, and Kate Rosier, a master's graduate in Digital Media and bachelor's graduate in Computational Media. The research group has developed iPhone and iPad versions of BrailleTouch and is currently working on Android versions. The app recently won the MobileHCI 2011 competition for design at the MobileHCI conference in Stockholm, Sweden. BrailleTouch will be demonstrated at the Abilities Expo-Atlanta 2012, taking place Feb. 17-19 at the Georgia World Congress Center. A video of BrailleTouch in action is available at the following link: http://www.youtube.com/watch?v=rIEO1bUFHsI This project was supported in part by the Rehabilitation Engineering Research Center for Wireless Technologies (Wireless RERC), which is funded by the National Institute on Disability and Rehabilitation Research (NIDRR), United States Department of Education, under grant number H133E110002.
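    The six-key chording idea is easy to illustrate: each chord is the set of Braille dots held down, looked up against the standard Braille alphabet. The sketch below covers only the first ten letters and is not BrailleTouch's actual code:

```python
# Braille cell dots are numbered 1-3 down the left column, 4-6 down the right.
BRAILLE = {
    frozenset({1}): "a",
    frozenset({1, 2}): "b",
    frozenset({1, 4}): "c",
    frozenset({1, 4, 5}): "d",
    frozenset({1, 5}): "e",
    frozenset({1, 2, 4}): "f",
    frozenset({1, 2, 4, 5}): "g",
    frozenset({1, 2, 5}): "h",
    frozenset({2, 4}): "i",
    frozenset({2, 4, 5}): "j",
}

def decode_chord(pressed):
    """Translate one six-key chord (the set of keys held at once) into a letter."""
    return BRAILLE.get(frozenset(pressed), "?")

word = [{1, 2}, {1}, {1, 4, 5}]   # chords for b, a, d
print("".join(decode_chord(c) for c in word))
```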

  • Georgia Tech researchers turn an iPhone into a Braille writer with BrailleTouch app

    by Michael Gorman
    02.18.2012

    It wasn't all that long ago that we saw a student turn a tablet into a Braille writer, and now some researchers from Georgia Tech have done the same thing for smaller touchscreens, too. The Yellow Jackets produced a prototype app, called BrailleTouch, that has six keys to input letters using the Braille writing system and audio to confirm each letter as it's entered. To use the app, you simply turn the phone face down, hold it in landscape mode and start typing. As you can see above, it's currently running on an iPhone, but the researchers see it as a universal eyes-free texting app for any touchscreen. Early studies with people proficient in Braille writing show that typing on BrailleTouch is six times faster than other eyes-free texting solutions -- up to 32 words per minute at 92 percent accuracy. Skeptical of such speeds? Check out the PR and video of the app in action after the break.

  • Researchers use inkjet acumen to create wireless explosive sensor from paper

    by Amar Toor
    10.31.2011

    Meet Krishna Naishadham and Xiaojuan (Judy) Song. They're researchers at the Georgia Institute of Technology, and those little devices they're holding may one day save you from an explosive device. This petite prototype is actually a paper-like wireless sensor that was printed using basic inkjet technology, developed by professor Manos Tentzeris. Its integrated lightweight antenna allows the sensor to link up with communication devices, while its functionalized carbon nanotubes enable it to pick up on even the slightest traces of ammonia -- an ingredient common to most IEDs. According to Tentzeris, the trick to such inkjet printing lies in the development of "inks" that can be deposited at relatively low temperatures. These inks, laced with silver nanoparticles, can then be uniformly distributed across paper-based components using a process called sonication. The result is a low-cost component that can adhere to just about any surface. The wireless sensor, meanwhile, requires comparatively low amounts of power, and could allow users to detect bombs from a safe distance. Naishadham says his team's device is geared toward military officials, humanitarian workers or any other bomb sniffers in hazardous situations, though there's no word yet on when it could enter the market. To find out more, careen past the break for the full PR.

  • Georgia Tech spies on nearby keyboards with iPhone 4 accelerometer, creates spiPhone

    by Joe Pollicino
    10.21.2011

    Ever plopped your cellular down next to your laptop? According to Georgia Tech researchers, that common scenario could let hackers record almost every sentence you type, all thanks to your smartphone's accelerometer. They've achieved the feat with an impressive 80 percent accuracy using an iPhone 4, and are dubbing the program they've developed spiPhone. (Although the group initially had fledgling trials with an iPhone 3GS, they discovered the 4's gyroscope aided in data reading.) If the software gets installed onto a mobile device, it can use the accelerometer to sense vibrations within three inches, in degrees of "near or far and left or right," allowing it to statistically guess the words being written -- so long as they have three or more letters. It does this by recording pairs of keystrokes and checking them against dictionaries with nearly 58,000 words to come up with the most likely results. The group has also done the same with the phone's mics (which they say sample data at a whopping 44,000 times per second, versus the accelerometer's 100), but note that it's a less likely option given the usual need for some form of user permission. Furthermore, they explained that the accelerometer data rate is already mighty slow, and if phone makers reduced it a bit more, spiPhone would have a hard time doin' its thing. The good news? Considering the strict circumstances needed, these researchers think there's a slim chance that this kind of malware could go into action easily. Looks like our iPhone and MacBook can still be close friends... for now. You'll find more details at the links below.

  • Proof of concept: iPhone captures keystrokes via 'thump phreaking'

    by Chris Rawson
    10.19.2011

    Researchers at Georgia Tech have worked up a proof-of-concept demonstration of using an iPhone 4's accelerometer as a keylogger. After setting the iPhone near a computer keyboard, the device's built-in accelerometer and gyroscope were able to decipher entire sentences "with up to 80 percent accuracy." Similar keyloggers have already been developed using microphones, which sample vibrations far more frequently than accelerometers. However, nearly all phone operating systems ask a user's permission before granting applications access to the built-in microphone, which limits the utility of a keylogger. Apps don't currently ask for users' permission for access to accelerometers and gyroscopes, which raises the remote possibility of iPhones or other accelerometer-equipped devices spying on keyboard inputs without users being the wiser. "The way we see this attack working is that you, the phone's owner, would request or be asked to download an innocuous-looking application, which doesn't ask you for the use of any suspicious phone sensors," said Henry Carter, one of the project's researchers. "Then the keyboard-detection malware is turned on, and the next time you place your phone next to the keyboard and start typing, it starts listening." The keylogger software works by detecting key pairs -- detecting individual key presses turned out to be too difficult and unreliable -- and by comparing paired accelerometer events against a built-in dictionary, the software can decipher keypresses with startling accuracy. Our own Mike Rose has coined "thump phreaking" to refer to this spying technique (after Van Eck phreaking, which uses CRT or LCD emissions to reconstruct the screen image) and it's as apt a term as any for what this software does. It must be mentioned that this is only a proof of concept and not an actual attack that's out in the wild. 
The researchers themselves admit that this keylogger was difficult to build, and it's easily defeated by something as simple as moving your iPhone more than three inches away from the keyboard. That having been said, the technique is very James Bondian, and I wouldn't be at all surprised if something similar to this turns up in a forthcoming spy thriller or Batman movie.
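    The key-pair-plus-dictionary idea both articles describe can be sketched roughly: characterize each consecutive pair of keys as left/right and near/far, then keep only the dictionary words whose pair features match the observation. The feature model below is invented for illustration -- the researchers' real classifier works on raw accelerometer readings, and here we pretend those features were recovered perfectly:

```python
# Left half of a QWERTY keyboard (a crude assumption for the sketch).
LEFT = set("qwertasdfgzxcvb")

def side(ch):
    return "L" if ch in LEFT else "R"

# Rough column index per key, used only to judge "near" vs "far".
COL = {c: i for i, c in enumerate("qwertyuiop")}
COL.update({c: i for i, c in enumerate("asdfghjkl")})
COL.update({c: i for i, c in enumerate("zxcvbnm")})

def features(word):
    """Feature sequence for a word: (side, side, near/far) per key pair."""
    out = []
    for a, b in zip(word, word[1:]):
        dist = "near" if abs(COL[a] - COL[b]) <= 2 else "far"
        out.append((side(a), side(b), dist))
    return out

def guess(observed, dictionary):
    """Return the dictionary words whose pairwise features match the observation."""
    return [w for w in dictionary if features(w) == observed]

dictionary = ["canoe", "hello", "troll"]
print(guess(features("hello"), dictionary))
```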

  • New program makes it easier to turn your computer into a conversational chatterbox

    by Amar Toor
    09.05.2011

    We've already seen how awkward computers can be when they try to speak like humans, but researchers from North Carolina State and Georgia Tech have now developed a program that could make it easier to show them how it's done. Their approach, outlined in a recently published paper, would allow developers to create natural language generation (NLG) systems twice as fast as currently possible. NLG technology is used in a wide array of applications (including video games and customer service centers), but producing these systems has traditionally required developers to enter massive amounts of data, vocabulary and templates -- rules that computers use to develop coherent sentences. Lead author Karthik Narayan and his team, however, have created a program capable of learning how to use these templates on its own, thereby requiring developers to input only basic information about any given topic of conversation. As it learns how to speak, the software can also make automatic suggestions about which information should be added to its database, based on the conversation at hand. Narayan and his colleagues will present their study at this year's Artificial Intelligence and Interactive Digital Entertainment conference in October, but you can dig through it for yourself, at the link below.

  • Where IPs go to die: a theoretical look at the belly of the online beast

    by Joseph Volpe
    08.15.2011

    The key to a secure online world of tomorrow? Why, that would be an internet that spends a bit more time padding its waistline at the protocol buffet. Researchers at the Georgia Institute of Technology have developed an evolutionary model, dubbed EvoArch, that simulates a survival-of-the-IP-fittest battle for the interweb's belly. Separated into six distinct layers, the top-to-bottom structure -- specific applications, application protocols, transport protocols, network protocols, data-link protocols and physical layer protocols -- reveals a fiercely competitive middle tier that often sees newer, non-specialized competition cannibalized in favor of an older, more dominant framework. The team created the theoretical model as a guideline for "architects of the future Internet... to increase the number of protocols in these middle layers," thus protecting the web from potential security vulnerabilities. Despite these proposed layer variances, however, further simulations of the model only churned out more midriff-slimming eventualities. It seems our dear internet is destined for a damned-if-you-do, damned-if-you-don't hourglass-shaped evolution. Full PR after the break.
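    The competitive dynamic EvoArch captures can be caricatured in a few lines: within a layer, a protocol dies when a higher-value rival serves most of the same users, so general-purpose middle layers thin out while specialized edge layers stay diverse. Every name and number below is invented; the actual EvoArch model is far richer than this:

```python
def prune_layer(protocols, overlap_threshold=0.6):
    """Keep a protocol only if no higher-value rival shares most of its users.
    Each protocol is a (name, value, users) tuple, where users is a set."""
    survivors = []
    for name, value, users in protocols:
        doomed = any(
            other_value > value
            and len(users & other_users) / len(users) >= overlap_threshold
            for _, other_value, other_users in protocols
        )
        if not doomed:
            survivors.append(name)
    return survivors

# Middle "transport" layer: everything overlaps heavily, so one winner emerges.
transport = [
    ("proto-A", 0.9, {"web", "mail", "video"}),
    ("proto-B", 0.5, {"web", "mail", "chat"}),
    ("proto-C", 0.4, {"web", "video", "chat"}),
]
# Application layer: specialized users barely overlap, so everyone survives.
apps = [
    ("http", 0.8, {"web"}),
    ("smtp", 0.6, {"mail"}),
    ("rtp",  0.5, {"video"}),
]
print(prune_layer(transport), prune_layer(apps))
```

The asymmetry in the output is the hourglass in miniature: the general middle collapses to a single dominant protocol while the specialized edge keeps all of its species.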

  • Vibrating glove prototype lets you hold on to that feeling

    by Brian Heater
    08.05.2011

    The secret to increasing tactile sensation? Good vibes, man. Georgia Tech scientists have unveiled a prototype glove that improves its wearer's sense of touch by adding vibration. The gloves add physical "white noise," heightening sensitivity in the user's fingertips. The whole thing is still in the early stages of testing, but the glove's inventors believe it might someday find real-world applications among people in occupations that require a good deal of manual dexterity and those with medical conditions that have dulled the feeling in their hands.

  • Robots for Humanity help around the house, scratch your itch (video)

    by Joseph Volpe
    07.14.2011

    Robots for Humanity? That certainly doesn't jibe with our notion of the upcoming cyborg apocalypse. And it shouldn't, considering this joint effort's noble aim is to assist the disabled with the everyday household chores most of us take for granted. The project, a collaboration between Willow Garage and Georgia Tech's Healthcare Robotics Lab, has been working with stroke victim Henry Evans to develop custom UIs that give him mastery of the human-assistive PR2 robot. These tailor-made, head-tracking interfaces have allowed the mute quadriplegic to partially shave his face and even scratch a previously unreachable ten-year itch -- all with the helping claw of the friendly bot. It's a compassionate use of cybernetic tech we're used to seeing come out of Japan, and a welcome assist for disabled communities everywhere. Click past the break for a video demo of Henry and his robotic pal.

  • Georgia Tech engineers pull energy out of atmospheric hat, go on electromagnetic scavenger hunt

    by Joseph Volpe
    07.11.2011

    Mankind's about to plunge into the depths of a wireless sensor-powering ether binge -- braincell-annihilating vapors not included. Spearheaded by Georgia Institute of Technology professor Manos Tentzeris and his engineering team, this ambient energy-scavenging tech harnesses electromagnetic frequencies in the 100MHz - 15GHz range -- anything from your FM car radio to radar -- and converts them into a usable DC power source. So, it's free energy -- kind of. The cheap, self-powering paper or flexible polymer-based sensors are created using standard inkjet printers and Tentzeris' "unique in-house recipe" of circuit-building silver nanoparticles. Current testing hasn't yet yielded enough wattage to power your PS3 Slim, but it could soon with the help of supercapacitors and future solar cell integration. Imagine clothing embedded with health-monitoring biometric sensors, airport security run by something other than aloof TSA agents, or even spoilage-aware drink cartons -- milk that tells you when it's gone sour. The invisible radio band-charged possibilities are endless, but with storage still in the microwatt to one-milliwatt range, it's more concept than reality for now.
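    For a ballpark sense of why harvested power sits down in the microwatt range, the standard free-space (Friis) link equation is enough. The transmitter power, distance and unity antenna gains below are assumptions for illustration, not figures from the Georgia Tech work:

```python
import math

C = 3e8  # speed of light, m/s

def friis_received_power(p_tx_watts, freq_hz, distance_m, g_tx=1.0, g_rx=1.0):
    """Free-space estimate of the power arriving at a harvesting antenna."""
    wavelength = C / freq_hz
    return p_tx_watts * g_tx * g_rx * (wavelength / (4 * math.pi * distance_m)) ** 2

# Hypothetical numbers: a 50 kW FM transmitter at 100 MHz, harvester 5 km away.
microwatts = friis_received_power(50e3, 100e6, 5e3) * 1e6
print(f"{microwatts:.0f} uW available")   # on the order of a hundred microwatts
```

Even next to a powerful FM station, the arriving power is roughly a tenth of a milliwatt before conversion losses, which is exactly the regime the article describes.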

  • Rescue robots map and explore dangerous buildings, prove there's no 'I' in 'team' (video)

    by Amar Toor
    05.17.2011

    We've seen robots do some pretty heroic things in our time, but engineers from Georgia Tech, the University of Pennsylvania and Cal Tech have now developed an entire fleet of autonomous rescue vehicles, capable of simultaneously mapping and exploring potentially dangerous buildings -- without allowing their egos to get in the way. Each wheeled bot measures just one square foot in size, carries a video camera capable of identifying doorways, and uses an on-board laser scanner to analyze walls. Once gathered, these data are processed using a technique known as simultaneous localization and mapping (SLAM), which allows each bot to create maps of both familiar and unknown environments, while constantly recording and reporting its current location (independently of GPS). And, perhaps best of all, these rescue Roombas are pretty team-oriented. Georgia Tech professor Henrik Christensen explains: "There is no lead robot, yet each unit is capable of recruiting other units to make sure the entire area is explored. When the first robot comes to an intersection, it says to a second robot, 'I'm going to go to the left if you go to the right.'" This egalitarian robot army is the spawn of a research initiative known as the Micro Autonomous Systems and Technology (MAST) Collaborative Technology Alliance Program, sponsored by the US Army Research Laboratory. The ultimate goal is to shrink the bots down even further and to expand their capabilities. Engineers have already begun integrating infrared sensors into their design and are even developing small radar modules capable of seeing through walls. Roll past the break for a video of the vehicles in action, along with full PR.
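    The "you go right, I'll go left" recruitment can be sketched as an assignment problem: give each robot a distinct frontier (unexplored branch) so the total travel distance is minimized. This brute-force version is illustrative only -- the MAST teams' actual coordination logic isn't described in enough detail here to reproduce:

```python
from itertools import permutations

def assign_frontiers(robots, frontiers):
    """Assign each robot a distinct frontier, minimizing total travel distance.
    Brute force over permutations is fine for a handful of robots."""
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    best, best_cost = None, float("inf")
    for perm in permutations(frontiers, len(robots)):
        cost = sum(dist(r, f) for r, f in zip(robots, perm))
        if cost < best_cost:
            best, best_cost = perm, cost
    return dict(zip(range(len(robots)), best))

# Two robots at an intersection: left and right corridors as frontiers.
robots = [(0.0, 0.0), (1.0, 0.0)]
frontiers = [(-5.0, 0.0), (6.0, 0.0)]   # left branch, right branch
print(assign_frontiers(robots, frontiers))
```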

  • Sand-swimming robot gets vertical manipulation via doorstop-shaped head (video)

    by Christopher Trout
    05.11.2011

    So it looks like a half-stuffed sock -- and it is, sort of -- but this sandfish-inspired search and rescue robot has the potential to change the way machines maneuver through disaster zones. Playing off its previous endeavors, a team of Georgia Tech researchers has designed a wedge-shaped head to manipulate the vertical movement of its sand-swimming invention through "complex dirt and rubble environments." By mimicking the pointy snout of the sandfish lizard, and attaching it to the body of its robot -- which sports seven servo-powered segments stuffed in a latex sock and sheathed by a spandex "swimsuit" -- the team found that subtle changes in the positioning of the robot's head made for drastic differences in vertical movement. When it was placed flat on the horizontal plane, the robot descended; when it was inclined above seven degrees, it ascended. For now, the robotic sandfish has been relegated to swimming in a sea of tiny yellow balls, but it's slated to dive into a pool of debris in the name of research soon. You can check out a rather dry description of the project in the video after the break.

  • Robots learn to march / spell, still not capable of love (video)

    by Brian Heater
    05.07.2011

    Here's hoping there's more than a few military-style marches standing between us and a complete robotic takeover. If not, we've got some dire news: these are not simply miniature Roombas as they may appear, but 15 so-called Khepera bots capable of spelling out GRITS (for Georgia Robotics and Intelligent Systems) to demonstrate grad student Edward Macdonald's Master's thesis for the department. The diminutive robots aren't told where to go in the letters -- instead, they determine their spots via a control algorithm, positioning themselves relative to their fellow rolling machines, so that if one is removed from the equation, they quickly reform the letter without it. Fortunately, they haven't learned to spell "KILL." Yet. Get to know your new robotic overlords a little bit better in the video after the break. [Thanks, Ted]
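    The relative-positioning behavior described above resembles classic formation control: each robot nudges itself toward the right offset from its peers, with no robot ever told an absolute goal, so the formation survives the loss of any one member. A sketch of that idea (not Macdonald's actual algorithm):

```python
def step(positions, targets, gain=0.5):
    """One decentralized update: each robot moves to reduce the error in its
    offsets to every other robot. Only relative offsets from the formation
    spec are used; no robot knows an absolute goal position."""
    n = len(positions)
    new = []
    for i in range(n):
        dx = dy = 0.0
        for j in range(n):
            if i == j:
                continue
            # desired offset from robot j, taken from the formation spec
            want = (targets[i][0] - targets[j][0], targets[i][1] - targets[j][1])
            have = (positions[i][0] - positions[j][0], positions[i][1] - positions[j][1])
            dx += want[0] - have[0]
            dy += want[1] - have[1]
        new.append((positions[i][0] + gain * dx / n, positions[i][1] + gain * dy / n))
    return new

# Three robots converging onto a diagonal stroke of a letter.
targets = [(0.0, 0.0), (1.0, 1.0), (2.0, 2.0)]
pos = [(2.0, 0.0), (0.0, 0.0), (1.0, 3.0)]
for _ in range(100):
    pos = step(pos, targets)
```

After a hundred iterations the robots sit in the target shape up to a rigid translation -- remove one robot and the remaining offsets still pull the others back into formation.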

  • Google gives Georgia Tech $1 million to build a benchmark for the open internet

    by Tim Stevens
    03.22.2011

    You can benchmark the cycles of your CPU, power of your GPU, speed of your internet connection, and a myriad of other seemingly important things. However, there's one missing benchmark that could make all those seem rather frivolous: the openness of your connection. Google wants one and has just awarded Georgia Tech a $1 million grant over two years (with a possible $500k bonus for a third year) to come up with a benchmark capable of detecting just how neutral your net is. When ready, it'll look for any artificial throttling that's been set in place and will also check for evidence of digital censorship. No word on when an early version might see release, but hopefully it comes before we need to start paying extra for the ability to download non-ISP-approved content.
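    A benchmark like this ultimately boils down to comparing how different kinds of traffic fare on the same path. The crude statistical check below is only meant to illustrate the idea -- a real tool would control for congestion, routing and time of day, and the throughput numbers here are fabricated:

```python
from statistics import mean, stdev

def looks_throttled(baseline_mbps, suspect_mbps, z=3.0):
    """Flag the suspect flow when its mean throughput falls more than z
    standard deviations below the baseline flow's mean."""
    mu, sigma = mean(baseline_mbps), stdev(baseline_mbps)
    return mean(suspect_mbps) < mu - z * sigma

generic = [48.1, 50.3, 49.6, 51.0, 49.2]    # throughput to a neutral server
p2p_like = [11.8, 12.6, 12.1, 11.9, 12.4]   # same path, BitTorrent-shaped traffic
print(looks_throttled(generic, p2p_like))
```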