Nonverbal Communication

Latest

  • Ask Engadget: best tablet, software and case for a child with a developmental disorder?

    by Daniel Cooper, 07.14.2012

    We know you've got questions, and if you're brave enough to ask the world for answers, here's the outlet to do so. This week's Ask Engadget inquiry comes to us from Kim, who wants to help her four-year-old play games and communicate at home. If you're looking to send in an inquiry of your own, drop us a line at ask [at] engadget [dawt] com.

    "I'm looking at getting a tablet for my four-year-old, developmentally delayed daughter. She uses an iPad with her therapist to play games and say what she wants, but I'm not sure if I should get the same for the home. Is there an Android alternative that's as useful in our special circumstances, and is it compatible with a sturdy, water-proof case for it?"

    For our two cents, we'd say that in such circumstances, continuity is probably an important factor to take into account. However, we're also aware that some of the specialist apps run into the hundreds of dollars, which few can easily afford, so let's turn the question over to our community. Do you have experience in the area, or perhaps you've already been in this situation? Either way, why not share what you know?

  • Cambridge developing 'mind reading' computer interface with the countenance of Charles Babbage (video)

    by Joseph L. Flatley, 12.23.2010

    For years now, researchers have been exploring ways to create devices that understand the nonverbal cues we take for granted in human-to-human interaction. One of the more interesting projects we've seen of late is led by Professor Peter Robinson at the University of Cambridge's Computer Laboratory, who is working on what he calls "mind-reading machines," which can infer people's mental states from their body language. The hope is that, by analyzing faces, gestures and tone of voice, machines could be made more helpful (hell, we'd settle for "less frustrating"). Peep the video after the break to see Robinson using a traditional (and annoying) satnav device, versus one that features both the Cambridge "mind-reading" interface and a humanoid head modeled on that of Charles Babbage. "The way that Charles and I can communicate," Robinson says, "shows us the future of how people will interact with machines." Next stop: uncanny valley!
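
    Since the post only hints at how such a system might work, here's a minimal, purely hypothetical sketch (in Python) of the multimodal idea: combine rough scores from face, gesture and voice analysis into a guess at the user's mental state. The score names, weights and thresholds below are invented for illustration and aren't taken from the Cambridge project.

      from dataclasses import dataclass

      @dataclass
      class ModalityScores:
          face_frustration: float   # e.g. output of a facial-expression classifier, 0..1
          gesture_agitation: float  # e.g. from body-pose tracking, 0..1
          voice_tension: float      # e.g. from prosody analysis, 0..1

      def infer_state(s: ModalityScores) -> str:
          # Blend the three channels; the weights and threshold are arbitrary.
          frustration = (0.5 * s.face_frustration
                         + 0.3 * s.voice_tension
                         + 0.2 * s.gesture_agitation)
          return "frustrated" if frustration > 0.6 else "calm"

      # A satnav could use the inferred state to, say, simplify its directions.
      print(infer_state(ModalityScores(0.8, 0.4, 0.7)))  # -> "frustrated"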

  • TalkTorque robot gets day job as creepy museum guide, TalkTorque 2 is now the future (video)

    by Tim Stevens, 12.23.2010

    As if there weren't enough Greys flying around in saucers and conducting strange experiments on us at night, a team at Tsukuba University went ahead and created their own. Two of them, as a matter of fact. It started with TalkTorque, a short, white bot with swoopy arms and head, designed to aid research into non-verbal communication. That poor guy is old news now, relegated to guide duty at the school's Groupware Lab. TalkTorque 2 has come along with slightly refined looks and a chunky collar containing a trio of motion- and range-sensing cameras to help the thing figure out who it should be talking to. Of course, it still has no mouth, so the "talking" will be in broad arm gestures, which it will surely use to guide you to its ship's examination chamber. There's a video of that communication technique below, along with some dramatized footage of the TalkTorque 2 in action.

  • DARPA working on "Silent Talk" telepathic communication for soldiers

    by Laura June Dziuban, 05.14.2009

    We're no strangers to crazy DARPA projects around here, but this one especially strikes our fantastic fancy. The agency's researchers are currently undertaking a project -- called Silent Talk -- to "allow user-to-user communication on the battlefield without the use of vocalized speech through analysis of neural signals." That's right: they're talking about telepathy. Using an EEG to read brain waves, DARPA is going to attempt to analyze "pre-speech" thoughts, then transmit them to another person. The plan is to first map each person's EEG patterns to his or her individual words, then see if those patterns are common to all people. If they are, the team will move on to developing a way to transmit those patterns to another person. Dream big, that's what we always say!
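
    To make the first step of that plan a bit more concrete, here's a minimal, hypothetical sketch (Python, using scikit-learn) of mapping per-user EEG feature vectors to a small vocabulary of words. The vocabulary, feature dimensions and data are all invented; real EEG decoding would involve far more signal processing than this.

      import numpy as np
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      VOCAB = ["advance", "hold", "retreat"]          # toy battlefield vocabulary

      # Pretend each trial is a flattened EEG feature vector (channels x band powers)
      # recorded while one user silently "says" a known word.
      rng = np.random.default_rng(0)
      X_train = rng.normal(size=(90, 64))             # 90 labeled trials, 64 features
      y_train = rng.integers(0, len(VOCAB), size=90)  # index of the intended word

      # Fit a per-user classifier: the "map EEG patterns to words" step.
      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
      clf.fit(X_train, y_train)

      # Decode a fresh reading; transmitting the result is a separate problem.
      new_trial = rng.normal(size=(1, 64))
      print("decoded word:", VOCAB[int(clf.predict(new_trial)[0])])

    DARPA's second question -- whether such patterns are common to all people -- would amount to testing a classifier trained on one user against recordings from another.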