This post was created by a member of the Public Access community. It has not been edited for accuracy or truthfulness and does not reflect the opinions of Engadget or its editors.

Giving Dogs A Voice

Image credit: IEEE Standards Association, @ieeesa

By: Thad Starner, Professor of Computing at the Georgia Institute of Technology and a Technical Lead on Google Glass

A dog walks up to a woman and says "Follow me. My owner needs your help." The woman stops and stares at the dog, disbelieving what she has just heard. Fortunately, the service dog is trained for this situation, and tugs on his wearable computing vest again, causing it to repeat "Follow me. My owner needs your help." The dog trots back to its owner with the woman following. Soon she discovers that she has been part of a Georgia Tech experiment to determine which pre-recorded messages are the most effective for service dogs requesting help from strangers.

Facilitating Interactions for Dogs with Occupations (FIDO) is a collection of projects by my colleague Professor Melody Jackson and me that uses wearable computers to help man's best friend communicate. Service dogs currently alert their owners or caregivers to incipient seizures for people with epilepsy, or to fainting due to dangerously low blood sugar for those with diabetes. While the owner of a diabetes dog might know that her dog stands on its hind legs against her when alerting to low blood sugar, a stranger would probably think the dog was being overly friendly. Wearable computers can empower service dogs to express themselves better to humans. To provide an alert, the diabetes dog tugs on an elastic tug sensor mounted on his FIDO vest, which triggers a pre-recorded message designed to recruit help from humans in the vicinity.
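As a rough illustration, the tug-to-speech loop can be sketched as a threshold-plus-debounce routine. Everything here (class name, sensor scale, thresholds) is invented for illustration; the actual FIDO vest hardware and firmware are not described in this post:

```python
import time

# Hypothetical constants -- the real vest's sensor range and timing are not public.
TUG_THRESHOLD = 0.6      # normalized stretch beyond which we count a deliberate tug
DEBOUNCE_SECONDS = 2.0   # ignore re-triggers while the message is still playing

class TugVest:
    """Minimal sketch of a vest that speaks when its stretch sensor is tugged."""

    def __init__(self, play_message):
        # play_message is a callback, e.g. audio playback through the vest speaker
        self.play_message = play_message
        self._last_trigger = -DEBOUNCE_SECONDS

    def on_sensor_reading(self, stretch, now=None):
        """Feed one normalized stretch sample (0.0 = slack, 1.0 = fully extended).

        Returns True if this sample triggered the pre-recorded message.
        """
        now = time.monotonic() if now is None else now
        if stretch >= TUG_THRESHOLD and now - self._last_trigger >= DEBOUNCE_SECONDS:
            self._last_trigger = now
            self.play_message("Follow me. My owner needs your help.")
            return True
        return False
```

The debounce matters because a trained dog may tug repeatedly (as in the opening anecdote, where the dog tugs again when the stranger hesitates); the vest should repeat the message, not stack overlapping playbacks.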

In more complex scenarios, the dog may select from one of several different inputs to better communicate with long-term human partners. For example, bomb and drug detection dogs are trained to sit or lie down when they discover a substance of interest. Wearable computers can enable these dogs to specify what type of substance they discovered. Tugging on a sensor on the left side of a FIDO vest might indicate an unstable peroxide bomb, while biting a sensor on the other side might indicate a stable compound, like gunpowder. Including a GPS unit allows such service dogs to work at a distance: when the dog triggers the wearable, the dog's location is automatically sent to the handler. Furthermore, depending on the type of bomb the dog signals, the handler might decide to recall the dog to avoid accidentally detonating the device. A FIDO vest can include a speaker so that the handler can give the dog verbal commands remotely. Or, for situations when silence is needed, such as with military dogs, vibration motors in the vest might provide the dog with tactile, rather than verbal, commands.
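A handler-side report from such a vest might look something like the sketch below: distinct vest inputs map to distinct alert types, stamped with the dog's GPS position. The sensor names, message strings, and coordinate format are all assumptions for illustration; only the idea of the mapping comes from the project description above:

```python
from dataclasses import dataclass

# Hypothetical mapping from vest input to what the dog is signaling.
ALERTS = {
    "left_tug": "unstable peroxide-based explosive",
    "right_bite": "stable compound (e.g. gunpowder)",
}

@dataclass
class DetectionAlert:
    """One alert event sent from the dog's vest to the handler."""
    sensor: str   # which vest input the dog activated
    lat: float    # dog's position from the vest's GPS unit
    lon: float

    def handler_message(self):
        substance = ALERTS.get(self.sensor, "unknown substance")
        return f"Dog alert: {substance} at ({self.lat:.5f}, {self.lon:.5f})"

# Example: dog tugs the left-side sensor while working at a distance.
alert = DetectionAlert("left_tug", 33.77562, -84.39628)
print(alert.handler_message())
```

Distinguishing alert types at the protocol level is what lets the handler make the recall decision the text describes: an "unstable" alert might prompt an immediate remote recall command, while a "stable" alert might not.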

In the field of Human-Computer Interaction (HCI), we talk about the affordances perceived by the user. For example, when presented with a door knob, humans try to turn it, push it, and pull it (among other actions) until we determine how to open the door. What affordances does a dog perceive when most door knobs are not even reachable? Because computer interfaces were designed for humans, we know little about how to create appropriate affordances for canine computing. The FIDO project explores different affordances to see which are most intuitive (and most trainable) for dogs.

Certainly, using a mouse to control the ubiquitous Windows, Icons, Menus, and Pointer (WIMP) interface seems to have little future: dogs have no fingers to click the left mouse button! Dogs also seem to have little need for the kind of in-depth computer use that humans require. Instead, FIDO vests are designed for quick communicative actions. FIDO has examined tugs using stretch sensors, nose touches using conductive textiles, nose swipes using proximity sensors, and bites using capacitive, resistive, and pressure sensors, to name a few. In the non-wearable domain, FIDO has examined nose touchscreens that let service dogs summon help when at home. Surprisingly, when presented with multiple icons to touch in sequence to dial 911, the dogs invented swiping and multi-touch (nose plus a front paw) gestures in order to hit the icons more quickly and get their reward faster! Our canine participants have repeatedly discovered better interaction techniques in our testing, often leading us to better interfaces than we originally conceived.

Embedding sensors in dog collars provides yet another means of communicating with dogs. Normally, hearing assistance dogs lead their owners to the source of a noise, whether it is the doorbell or a baby crying. What should the dog do in the case of a tornado alarm? One option is to train the dog to make a specific gesture, such as spinning in place, to indicate such alarms. If the owner is nearby, the gesture is obvious, but if the owner is outside or in a different room, motion sensors in the collar can detect the gesture and trigger a vibration-based alert on the owner's mobile phone or smartwatch. Similarly, such a system could be used remotely by search and rescue or bomb and drug detection dogs to indicate when they have found the desired target.
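One simple way to spot a trained "spin in place" gesture in collar motion data is to integrate the gyroscope's yaw rate and flag a full rotation. This is a toy sketch with synthetic samples, assuming a 10 Hz sampling rate; the actual FIDO gesture recognizer is not described in the post:

```python
def detect_spin(yaw_rates_dps, dt=0.1, full_turn_deg=360.0):
    """Return True if the integrated yaw rotation completes a full turn.

    yaw_rates_dps: gyroscope yaw-rate samples in degrees/second
    dt: sampling interval in seconds (10 Hz assumed here)
    """
    total = 0.0
    for rate in yaw_rates_dps:
        total += rate * dt
        if abs(total) >= full_turn_deg:
            return True  # would trigger a vibration alert on the owner's phone/watch
    return False

# Synthetic data: a dog spinning at ~180 deg/s for 2.5 s vs. ordinary fidgeting.
spin = [180.0] * 25                      # sustained rotation in one direction
fidget = [40.0, -35.0, 20.0, -25.0] * 5  # back-and-forth head turns that cancel out
print(detect_spin(spin), detect_spin(fidget))   # True False
```

Integrating the signed rate (rather than counting raw motion) is what separates a deliberate full spin from ordinary back-and-forth movement, which accumulates little net rotation.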

Collar-based motion sensors might help pets, too. Just as wearable fitness trackers help humans understand when they are being too sedentary, a dog's fitness tracker might help owners discover that too little exercise leads to undesirable behaviors, like shoe chewing, when they are not at home. Some pet owners might go further and train their pets to communicate with gestures. As with every technology, though, canine computing must strike a balance between useful information and information overload. Personally, there are only so many "Squirrel!" alerts I need from my pet throughout the day.

About the Author

Thad Starner is a wearable computing pioneer who has worn a computer with a head-up display as part of his daily life since 1993. He will speak on this topic at the annual SXSW Conference and Festival, 10-19 March 2017; the session, Not Your Mama's Wearables, is part of the IEEE Tech for Humanity Series at SXSW. Dr. Starner is a Professor of Computing at the Georgia Institute of Technology and a Technical Lead on Google Glass. He is a founder of the annual ACM International Symposium on Wearable Computers, now in its 21st year, and has produced over 450 papers and presentations on his work. He is an inventor on over 90 United States utility patents awarded or in process. For over two decades, his work has appeared in national and international public forums, including CBS's 60 Minutes and 48 Hours, ABC's Nightline, PBS's NewsHour, CNN, the BBC, National Geographic, The New York Times, New Scientist, and The Wall Street Journal.
