Georgia Tech

Latest

  • Facebook's AI team expands post-grad courses for Black and Latinx students

    by Daniel Cooper
    10.22.2020

    The aim is to improve diversity and remove biases from AI.

  • SlothBot watches over the Atlanta Botanical Garden

    A robot sloth will (very slowly) survey endangered species

    by Jon Fingas
    06.18.2020

    A robot sloth will keep watch on animals and plants to protect them -- and its slowness is an advantage.

  • Tiny vibration-powered robots could repair your body from the inside

    by Jon Fingas
    07.16.2019

    There are many challenges to developing robots that could operate within your body, not the least of which is finding a power source -- you can't exactly strap a big battery on them. That might not be an issue thanks to Georgia Tech researchers. They've developed minuscule "bristle-bots" that move by tapping vibration from a variety of sources, whether it's ultrasound or a nearby speaker. The trick was to mate a tiny piezoelectric actuator to a 3D-printed polymer body whose bristle-like legs are angled to move in specific directions in a resonant response to vibrations.

  • Keeping your digital life safe in the age of surveillance

    by Jon Turi
    10.12.2014

    Like it or not, people are after your data. Whether it's for advertising, national security or other nefarious purposes, you're leaving a trail of digital breadcrumbs for anyone to follow. But there's a growing arsenal of affordable tools to help protect your privacy both digitally and physically. In this week's Rewind, we take a look at this age of surveillance and some of the more approachable gadgets designed to help fight back against prying eyes.

  • Kids with disabilities can teach this robot how to play 'Angry Birds'

    by Edgar Alvarez
    07.10.2014

    As a way to help children dealing with cognitive and motor-skill disabilities, researchers from Georgia Tech have developed a rehabilitation tool that pairs a robot and an Android tablet. To demonstrate this system in action, the research team used Angry Birds to let kids teach the humanoid how to play Rovio's popular game. Essentially, the robot is smart enough to learn by simply watching each move the child makes while flinging those birds toward the iconic green pigs. "The robot is able to learn by watching because it knows how interaction with a tablet app is supposed to work," writes project leader Ayanna Howard, a professor at Georgia Tech. "It recognizes that a person touched here and ended there, then deciphers the information that is important and relevant to its progress."

  • Access4Kids input device allows disabled children to control touch-centric tablets (video)

    by Darren Murph
    12.11.2012

    The innovation world at large has been crafting ways for handicapped individuals to interact with computers for years on end, but the issue of tablets has created another predicament entirely. How do you enable someone to masterfully control a touch-centric device, when the mere act of touching is a challenge? Ayanna Howard, professor of electrical and computer engineering at Georgia Tech, and graduate student Hae Won Park have created Access4Kids, which is described as a "wireless input device that uses a sensor system to translate physical movements into fine-motor gestures to control a tablet." In essence, it enables individuals with limited mobility to pinch and swipe, and the group has had success thus far with providing greater accessibility to flagship programs like Facebook and YouTube. Moreover, custom-built apps for therapy and science education are cropping up, with the existing prototype utilizing a trio of force-sensitive resistors that measure pressure and convert it into a signal that instructs the tablet. A child can wear the device around the forearm or place it on the arm of a wheelchair and hit the sensors or swipe across the sensors with his or her fist, providing an entirely new level of interaction for those with cerebral palsy, traumatic brain injury, spina bifida and muscular dystrophy. The goal? Once it's honed, to get it out of the lab and made "into a commercial product." Head on past the break for a video look.
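The three-resistor pipeline described above — pressure readings converted into a signal the tablet understands — can be pictured roughly like this. This is an illustrative sketch only, not Access4Kids code: the thresholds and gesture mappings are invented for the example.

```python
# Illustrative sketch (not Access4Kids firmware): map readings from three
# force-sensitive resistors to coarse tablet gestures. A hard hit on a
# single sensor becomes a "tap"; pressure rolling across the sensors in
# order becomes a "swipe". The threshold is a made-up value.

TAP_THRESHOLD = 0.6  # normalized pressure needed to register a hit


def classify(frames):
    """frames: list of (s0, s1, s2) pressure samples, each in [0, 1]."""
    # Record which sensors crossed the threshold, in the order they first fired.
    fired = []
    for s0, s1, s2 in frames:
        for i, v in enumerate((s0, s1, s2)):
            if v >= TAP_THRESHOLD and i not in fired:
                fired.append(i)
    if fired == [0, 1, 2]:
        return "swipe-right"
    if fired == [2, 1, 0]:
        return "swipe-left"
    if len(fired) == 1:
        return "tap"
    return "none"


print(classify([(0.9, 0.0, 0.0)]))  # tap
print(classify([(0.8, 0.1, 0.0), (0.1, 0.8, 0.1), (0.0, 0.1, 0.8)]))  # swipe-right
```

The point of the coarse mapping is that a user with limited fine-motor control only needs to hit or sweep across large sensors with a fist, while the tablet still receives precise pinch/swipe-style input.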

  • Georgia Tech receives $900,000 grant from Office of Naval Research to develop 'MacGyver' robot

    by James Trew
    10.12.2012

    Robots come in many flavors. There's the subservient kind, the virtual representative, the odd one with an artistic bent, and even robo-cattle. But, typically, they all hit the same roadblock: they can only do what they are programmed to do. Of course, there are those that possess some AI smarts, too, but Georgia Tech wants to take this to the next level and build a 'bot that can interact with its environment on the fly. The project hopes to give machines deployed in disaster situations the ability to find objects in their environment for use as tools, such as placing a chair to reach something high, or building bridges from debris. The idea builds on previous work where robots learned to move objects out of their way, and involves developing an algorithm that allows them to identify items and assess their usefulness as tools. This would be backed up by some programming to give the droids a basic understanding of rigid body mechanics and how to construct motion plans. The Office of Naval Research's interest comes from potential future applications, working side-by-side with military personnel out on missions, which, along with the iRobot 110, forms the early foundations for the cyber army of our childhood imaginations.

  • Insert Coin: Shimi iPhone robot is ready to dance its way out of the lab, into your heart

    by Brian Heater
    09.11.2012

    As soon as we saw Georgia Tech's Shimi, we wanted to know how many sleepless nights we'd have to spend waiting for one to proudly display on our desk. And really, that's the whole idea behind the iPhone-enabled dancing robot: bringing some sophisticated robotic concepts to the consumer in an adorable little package. Now the wall-eyed "first musically intelligent robotic speaker dock" has hit Kickstarter, ready to dance its way into reality. When finished, Shimi will feature six-watt speakers on either side of its face and five motors that allow it to dance and turn its head to the best position for optimal listening. Shimi does the latter via facial recognition software, tracking you around the room. The 'bot can also respond to verbal requests like "look at me" and "play Justin Bieber" (their suggestion, not ours). In the future, its creators will be offering up apps for gaming, telepresence and the like, as well as an SDK for developers. The Kickstarter page has a decidedly lofty $100,000 goal to hit by October 10th. Pledge $129 or more, and you get a Shimi of your very own. Check out a video of the 'bot and its creators after the break.

  • Georgia Tech develops self-charging battery that marches to the owner's beat

    by Jon Fingas
    08.19.2012

    One of the last times we saw the concept of a self-recharging battery, it was part of a high-minded Nokia patent whose ideas still haven't seen the light of day. Researchers at Georgia Tech are more inclined to put theory into practice. Starting from a regular lithium-ion coin battery, the team has replaced the usual divider between electrodes with a polyvinylidene difluoride film whose piezoelectric nature produces a charging action inside that gap through just a little pressure, with no outside voltage required to make the magic happen. The developers have even thumbed their noses at skeptics by very literally walking the walk -- slipping the test battery under a shoe sole gives it a proper dose of energy with every footstep. At this stage, the challenge mostly involves ramping up the maximum power through upgrades such as more squeezable piezoelectrics. Georgia Tech hasn't progressed so far as to have production plans in mind; it's nonetheless close enough that we could see future forms of wearable computing that rarely need an electrical pick-me-up.

  • Georgia Tech models swimming, cargo-carrying nanobots

    by Jon Fingas
    08.07.2012

    The nanobot war is escalating. Not content to let Penn State's nanospiders win the day, Georgia Tech has answered back with a noticeably less creepy blood-swimming robot model of its own, whose look is more that of a fish than any arachnid this time around. It still uses material changes to exert movement -- here exposing hydrogels to electricity, heat, light or magnetism -- but Georgia Tech's method steers the 10-micron trooper to its destination through far more innocuous-sounding flaps. Researchers' goals are still as benign as ever: either to deliver drugs or to build minuscule structures piece by piece. The catch is that rather important mention of a "model" from earlier: Georgia Tech only has a scientifically viable design to work from and needs someone to build it. Should someone step up, there's a world of potential from schools of tiny swimmers targeting exactly what ails us.

  • Vibrating glove gives piano lessons, helps rehab patients regain finger sensation and motor skills

    by Alexis Santos
    07.18.2012

    We've seen a good number of electronic gloves before, and now researchers at Georgia Tech have devised one to rehabilitate patients who suffer from paralyzing spinal cord injuries while teaching them how to tickle the ivories. Christened Mobile Music Touch, the black mitt pairs with a keyboard and cues individual fingers with vibrations to play notes. The handgear also buzzes constantly for several hours to stimulate recovery while users go about their day, similar to another yellowjacket-developed solution. After treatment, some patients could pick up objects and feel textures they hadn't been able to -- especially remarkable since, according to the university, little improvement is typically seen a year after injuries are sustained. Folks who learned to play the piano with the device also experienced better results than those who did without it. Project leader Dr. Tanya Markow believes that the rehab's success could be caused by renewed brain activity that sometimes lies dormant. For the full skinny, head past the break for the press release and a video of the gauntlet in action. [Thanks, Timothy]

  • Georgia Tech scientists developing biology-inspired system to give robot eyes more human-like motion

    by Brian Heater
    07.07.2012

    Having difficulty getting your robot parts to work as planned? Turn to nature -- or better yet, look inside yourself. After all, where better to find inspiration than the humans that the machines will one day enslave, right? Researchers at Georgia Tech have been working to develop a system for controlling cameras in robots that functions much like human muscle. Says Ph.D. candidate Joshua Schultz, "The actuators developed in our lab embody many properties in common with biological muscle, especially a cellular structure. Essentially, in the human eye muscles are controlled by neural impulses. Eventually, the actuators we are developing will be used to capture the kinematics and performance of the human eye." The team recently showed off their work at the IEEE International Conference on Biomedical Robotics and Biomechatronics in Rome. When fully developed, they anticipate that the piezoelectric system could be used for MRI-based surgery, rehabilitation and research of the human eye.

  • Georgia Tech's Shimi robot wants to rock with you all night, rock the night away

    by Brian Heater
    06.27.2012

    Shimi certainly has the makings of a viral video hit, but its creators at Georgia Tech want you to know that there's more to the dancing robot than just a pretty face. The "interactive musical buddy," designed by the school's Center for Music Technology, is a one-foot-tall smartphone-enabled "docking station with a brain." Shimi has a whole slew of functionality, using the phone's face-detection to track listeners and better position its speakers. Users can also clap out a beat, which the 'bot will use to pull a matching song from the phone's playlist, playing the track and, naturally, dancing to the beat. Forthcoming functionality includes the ability for users to shake their heads or wave a hand to affect Shimi's song choices. Google I/O attendees will get the opportunity for a closer look at Shimi this week in San Francisco. In the meantime, check out a couple of videos of the robot doing its thing after the break.

  • The new stars of reggae are nothing like the old ones

    by Jamie Rigg
    06.17.2012

    Earthly music just ain't enough for reggae / rock band Echo Movement. In search of extraterrestrial inspiration, they hooked up with researchers at Georgia Tech's Sonification Lab, which specializes in turning ugly numbers into beautiful music. Using data from NASA's Kepler telescope and its search for Earth II, SonLab generated "sequences of sonified musical pitches" from fluctuations in a star's brightness (meet Kepler 4665989). Echo Movement got their loop on and composed a harmony from the sequences, adding a tremolo effect from another star's pattern for a softer sound. Unfortunately, the finished track isn't out until September, but in the meantime you can hear the six-second celestial hook at the source link -- just don't blame us if you get pangs of Nokia-stalgia. Also, if you want to imagine how Echo Movement might use the sample, we've embedded one of their rarer songs -- that doesn't involve Spider-Man's girlfriend -- after the break.

  • Georgia Tech develops poultry deboning robot / chicken nightmare machine

    by Brian Heater
    06.01.2012

    If you have any chickens in the house, you're going to want to keep them away from the computer. Georgia Tech researchers have developed the device that will haunt their feathered dreams. The prototype Intelligent Cutting and Deboning System has a built-in 3D vision system to help it cut and debone a chicken. The robot uses collected data and custom algorithms to help reduce bone fragments and increase yield on birds, whilst ensuring that no fowl will ever get a full night's sleep again. The school has begun testing the system, as evidenced by the unfortunate bird picture above. Press release after the break, if you're not too chicken.

  • Georgia Tech's BrailleTouch is a Braille writer for iPhone

    by Steve Sande
    02.20.2012

    Researchers from the School of Interactive Computing at Georgia Tech have developed a prototype iPhone app called BrailleTouch that should revolutionize texting and data entry for visually impaired iPhone users. The app places six keys -- three for each hand -- on the sides of the iPhone screen in landscape orientation. Individuals who are familiar with Braille (which uses a grid of six raised dots) can then form the letters by touching the screen and receive audio feedback to confirm that the desired letter has been typed. There are also gestures for adding spaces and deleting incorrect letters. BrailleTouch has been shown to be up to six times faster than other texting solutions for the visually impaired, with speeds up to 32 words per minute at 92 percent accuracy. While the app is currently iOS-only (there is an iPad version as well), the developers plan to make BrailleTouch available for all smartphone platforms. The video below shows BrailleTouch in action, and you can read other details of the system in the press release from Georgia Tech.

    GEORGIA TECH DEVELOPS BRAILLE-LIKE TEXTING APP

    ATLANTA - Feb. 17, 2012 - Imagine if smartphone and tablet users could text a note under the table during a meeting without anyone being the wiser. Mobile gadget users might also be enabled to text while walking, watching TV or socializing without taking their eyes off what they're doing. Georgia Tech researchers have built a prototype app for touch-screen mobile devices that is vying to be a complete solution for texting without the need to look at a mobile gadget's screen. "Research has shown that chorded, or gesture-based, texting is a viable solution for eyes-free written communication in the future, making obsolete the need for users to look at their devices while inputting text on them," said Mario Romero, Postdoctoral Fellow in the School of Interactive Computing (IC) and the project's principal investigator.
The free open-source app, called BrailleTouch, incorporates the Braille writing system used by the visually impaired. It has been conceived as a texting tool for any of the millions of smartphone users worldwide. Early studies with visually impaired participants proficient in Braille typing have demonstrated that users can input at least six times the number of words per minute when compared to other research prototypes for eyes-free texting on a touch screen. Users reach up to 32 words per minute with 92 percent accuracy with the prototype app for the iPhone. "We are currently designing a study to formally evaluate BrailleTouch through both quantitative and qualitative methods," said Caleb Southern, an IC graduate student. "We will measure the typing speed and accuracy of visually impaired users and capture the feedback from study participants in areas such as comfort, ease of use and perceived value." For sighted users, the research team is exploring how BrailleTouch could be a universal eyes-free mobile texting app that replaces soft QWERTY keyboards and other texting technologies. "BrailleTouch is an out-of-the-box solution that will work with smartphones and tablets and allow users to start learning the Braille alphabet in a few minutes," said Romero. "It also reduces the need for expensive proprietary Braille keyboard devices, which typically cost thousands of dollars." The researchers have designed BrailleTouch to address the limitations of soft keyboards, which do not provide tactile feedback, as well as physical keyboards, which often use small and numerous fixed buttons. BrailleTouch is the only iPhone app in existence that uses a six-finger chording process that replicates the traditional Braille keyboard. The app uses a gesture-based solution by turning the iPhone's touchscreen into a soft-touch keyboard programmed for Braille and requiring only six keys, making it a practical solution for the limited screen real estate on smartphones.
The key feature of the BrailleTouch technology is the use of the six-key configuration so that the keyboard fits on the screen and users keep their fingers in a relatively fixed position while texting. This design allows users to hold their device with the screen facing away from them -- cradling the device with their palms or pinkies and thumbs -- and to type with a majority of their fingers, identical to typing Braille on a standard keyboard. The team behind BrailleTouch is led by Romero and IC Professor Gregory Abowd, co-principal investigator. Former IC affiliate Brian Frey conceived the original idea and developed the first prototype and Southern created an improved design. They are conducting usability studies together with James Clawson, a Ph.D. candidate in IC, and Kate Rosier, a master's graduate in Digital Media and bachelor's graduate in Computational Media. The research group has developed iPhone and iPad versions of BrailleTouch and is currently working on Android versions. The app recently won the MobileHCI 2011 competition for design at the MobileHCI conference in Stockholm, Sweden. BrailleTouch will be demonstrated at the Abilities Expo-Atlanta 2012, taking place Feb. 17-19 at the Georgia World Congress Center. A video of BrailleTouch in action is available at the following link: http://www.youtube.com/watch?v=rIEO1bUFHsI This project was supported in part by the Rehabilitation Engineering Research Center for Wireless Technologies (Wireless RERC), which is funded by the National Institute on Disability and Rehabilitation Research (NIDRR), United States Department of Education, under grant number H133E110002.
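The six-key chording scheme the press release describes maps directly onto standard Braille cells: each simultaneous set of pressed keys corresponds to a set of raised dots, which selects a letter. A minimal sketch of such a decoder (illustrative only, not BrailleTouch's actual code; only letters a-j are shown):

```python
# Sketch of chorded Braille decoding. Dot numbering follows the standard
# Braille cell: dots 1-2-3 run down the left column, 4-5-6 down the right.
# A chord is the set of dots pressed at the same time.

BRAILLE_LETTERS = {
    frozenset({1}): "a",
    frozenset({1, 2}): "b",
    frozenset({1, 4}): "c",
    frozenset({1, 4, 5}): "d",
    frozenset({1, 5}): "e",
    frozenset({1, 2, 4}): "f",
    frozenset({1, 2, 4, 5}): "g",
    frozenset({1, 2, 5}): "h",
    frozenset({2, 4}): "i",
    frozenset({2, 4, 5}): "j",
}


def decode_chord(pressed_dots):
    """Map a set of simultaneously pressed dot positions to a letter."""
    return BRAILLE_LETTERS.get(frozenset(pressed_dots), "?")


# Dots 1 and 2 pressed together produce the letter "b"
print(decode_chord({1, 2}))  # b
```

Because the decoder keys on dot *sets* rather than key positions, the six touch zones can sit anywhere on screen -- which is what lets BrailleTouch put three under each hand with the phone facing away from the user.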

  • Georgia Tech researchers turn an iPhone into a Braille writer with BrailleTouch app

    by Michael Gorman
    02.18.2012

    It wasn't all that long ago that we saw a student turn a tablet into a Braille writer, and now some researchers from Georgia Tech have done the same thing for smaller touchscreens, too. The Yellow Jackets produced a prototype app, called BrailleTouch, that has six keys to input letters using the Braille writing system and audio to confirm each letter as it's entered. To use the app, you simply turn the phone face down, hold it in landscape mode and start typing. As you can see above, it's currently running on an iPhone, but the researchers see it as a universal eyes-free texting app for any touchscreen. Early studies with people proficient in Braille writing show that typing on BrailleTouch is six times faster than other eyes-free texting solutions -- up to 32 words per minute at 92 percent accuracy. Skeptical of such speeds? Check out the PR and video of the app in action after the break.

  • Researchers use inkjet acumen to create wireless explosive sensor from paper

    by Amar Toor
    10.31.2011

    Meet Krishna Naishadham and Xiaojuan (Judy) Song. They're researchers at the Georgia Institute of Technology, and those little devices they're holding may one day save you from an explosive device. This petite prototype is actually a paper-like wireless sensor that was printed using basic inkjet technology, developed by professor Manos Tentzeris. Its integrated lightweight antenna allows the sensor to link up with communication devices, while its functionalized carbon nanotubes enable it to pick up on even the slightest traces of ammonia -- an ingredient common to most IEDs. According to Tentzeris, the trick to such inkjet printing lies in the development of "inks" that can be deposited at relatively low temperatures. These inks, laced with silver nanoparticles, can then be uniformly distributed across paper-based components using a process called sonication. The result is a low-cost component that can adhere to just about any surface. The wireless sensor, meanwhile, requires comparatively low amounts of power, and could allow users to detect bombs from a safe distance. Naishadham says his team's device is geared toward military officials, humanitarian workers or any other bomb sniffers in hazardous situations, though there's no word yet on when it could enter the market. To find out more, careen past the break for the full PR.

  • Georgia Tech spies on nearby keyboards with iPhone 4 accelerometer, creates spiPhone

    by Joe Pollicino
    10.21.2011

    Ever plopped your cellular down next to your laptop? According to Georgia Tech researchers, that common scenario could let hackers record almost every sentence you type, all thanks to your smartphone's accelerometer. They've achieved the feat with an impressive 80 percent accuracy using an iPhone 4, and are dubbing the program they've developed spiPhone. (Although the group initially had fledgling trials with an iPhone 3GS, they discovered the 4's gyroscope aided in data reading.) If the software gets installed onto a mobile device, it can use the accelerometer to sense vibrations within three inches, in degrees of "near or far and left or right," allowing it to statistically guess the words being written -- so long as they have three or more letters. It does this by recording pairs of keystrokes and putting them against dictionaries with nearly 58,000 words to come up with the most likely results. The group has also done the same with the phone's mics (which they say sample data at a whopping 44,000 times per second vs. the accelerometer's 100), but note that it's a less likely option given the usual need for some form of user permission. Furthermore, they explained that the accelerometer data rate is already mighty slow, and if phone makers reduced it a bit more, spiPhone would have a hard time doin' its thing. The good news? Considering the strict circumstances needed, these researchers think there's a slim chance that this kind of malware could go into action easily. Looks like our iPhone and MacBook can still be close friends... for now. You'll find more details at the links below.
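The key-pair dictionary trick described above can be sketched in a few lines: each consecutive pair of keystrokes is reduced to a coarse relation (roughly "left or right" half of the keyboard, "near or far" apart), and a word is guessed by finding dictionary entries whose own relation sequences match. This is an illustrative reconstruction, not the researchers' spiPhone code; the keyboard split and distance threshold here are invented.

```python
# Toy version of a key-pair dictionary attack. Instead of vibration data,
# we compute each key pair's coarse relation directly from QWERTY geometry;
# a real attack would classify accelerometer readings into the same labels.

KEY_POS = {c: (x, y)
           for y, row in enumerate(["qwertyuiop", "asdfghjkl", "zxcvbnm"])
           for x, c in enumerate(row)}


def relation(a, b, near=2.0):
    """Label a consecutive key pair: L/R half of the keyboard + N/F distance."""
    (x1, y1), (x2, y2) = KEY_POS[a], KEY_POS[b]
    side = "L" if x2 < 5 else "R"          # which half the second key lands in
    dist = ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
    return side + ("N" if dist <= near else "F")


def signature(word):
    return [relation(a, b) for a, b in zip(word, word[1:])]


def guess(observed, dictionary):
    """Return dictionary words whose pair-relation sequence matches."""
    return [w for w in dictionary if signature(w) == observed]


words = ["cat", "car", "can", "dog"]
print(guess(signature("cat"), words))  # ['cat', 'car']
```

Note how the coarse labels leave ambiguity ("cat" and "car" collide), which is why the real attack only *statistically* guesses words and needs a large dictionary to rank candidates.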

  • Proof of concept: iPhone captures keystrokes via 'thump phreaking'

    by Chris Rawson
    10.19.2011

    Researchers at Georgia Tech have worked up a proof-of-concept demonstration of using an iPhone 4's accelerometer as a keylogger. After setting the iPhone near a computer keyboard, the device's built-in accelerometer and gyroscope were able to decipher entire sentences "with up to 80 percent accuracy." Similar keyloggers have already been developed using microphones, which sample vibrations far more frequently than accelerometers. However, nearly all phone operating systems ask a user's permission before granting applications access to the built-in microphone, which limits the utility of a keylogger. Apps don't currently ask for users' permission for access to accelerometers and gyroscopes, which raises the remote possibility of iPhones or other accelerometer-equipped devices spying on keyboard inputs without users being the wiser. "The way we see this attack working is that you, the phone's owner, would request or be asked to download an innocuous-looking application, which doesn't ask you for the use of any suspicious phone sensors," said Henry Carter, one of the project's researchers. "Then the keyboard-detection malware is turned on, and the next time you place your phone next to the keyboard and start typing, it starts listening." The keylogger software works by detecting key pairs -- detecting individual key presses turned out to be too difficult and unreliable -- and by comparing paired accelerometer events against a built-in dictionary, the software can decipher keypresses with startling accuracy. Our own Mike Rose has coined "thump phreaking" to refer to this spying technique (after Van Eck phreaking, which uses CRT or LCD emissions to reconstruct the screen image) and it's as apt a term as any for what this software does. It must be mentioned that this is only a proof of concept and not an actual attack that's out in the wild. 
The researchers themselves admit that this keylogger was difficult to build, and it's easily defeated by something as simple as moving your iPhone more than three inches away from the keyboard. That having been said, the technique is very James Bondian, and I wouldn't be at all surprised if something similar to this turns up in a forthcoming spy thriller or Batman movie.