cognitive

Latest

  • IBM's Watson cognitive computer has whipped up a cookbook

    by Jon Fingas
    04.12.2015

    IBM's Watson learning computer system isn't just content with making the occasional meal -- it has a whole slew of recipes lined up. The tech company is launching Cognitive Cooking with Chef Watson, a cookbook based on Watson's knack for combining food in a way that produces unique (and typically tasty) flavors. There are only about 65 foodstuffs in the mix, but they're considered "greatest hits" that should work well in real life. Just be prepared to do more grocery shopping than usual when the book arrives on April 14th, since IBM's machine tends to choose ingredients that you probably don't have in the pantry.

  • Intel Labs measures cognitive workload of distracted drivers, we go eyes-on with the demo (video)

    by Nicole Lee
    06.26.2013

    Many studies have shown that any kind of distracted driving is a bad thing, but Intel wanted to take a closer look at our driving behavior to determine whether we could avoid it in the first place. Paul Crawford, a senior research scientist in Intel's Interaction and Experience Research Lab, sought to do just that with a comprehensive investigation that aims to understand not just where drivers are looking, but how they're thinking. By doing this, Intel hopes to alert the driver to any mental warning signs before he or she even gets behind the wheel. At a recent Research @ Intel event in San Francisco, Crawford used a racing-game setup to demonstrate both visual and mental diversions with eye-tracking software and a functional near-infrared spectrometer headband. The latter gauges the metabolic activity and cognitive workload of the brain under different driving conditions, which in this case fluctuated between a peaceful drive and a high-speed chase. Crawford also threw a few questions and mathematical problems at the test subject to complicate matters. As you might expect, the brain was highly active during the more challenging scenarios and less so during the calm ones. Crawford told us he hopes the findings will point to ways we can optimize our environmental conditions and task loads so that we can perform better, not just when driving but in everyday tasks as well. To see the demo in action and hear Crawford's words for yourself, check out the video after the break. Michael Gorman contributed to this report.

  • IBM's cognitive computing chip functions like a human brain, heralds our demise (video)

    by Amar Toor
    08.18.2011

    After having created a supercomputer capable of hanging with Jeopardy's finest, IBM has now taken another step toward human-like artificial intelligence, with an experimental chip designed to function like a real brain. Developed as part of a DARPA project called SyNAPSE (Systems of Neuromorphic Adaptive Plastic Scalable Electronics), IBM's so-called "neurosynaptic computing chip" features a silicon core capable of digitally replicating the brain's neurons, synapses and axons. To achieve this, researchers took a dramatic departure from the conventional von Neumann computer architecture, which links internal memory and a processor with a single data channel. This structure allows for data to be transmitted at high, but limited rates, and isn't especially power efficient -- especially for more sophisticated, scaled-up systems. Instead, IBM integrated memory directly within its processors, wedding hardware with software in a design that more closely resembles the brain's cognitive structure. This severely limits data transfer speeds, but allows the system to execute multiple processes in parallel (much like humans do), while minimizing power usage. IBM's two prototypes have already demonstrated the ability to navigate, recognize patterns and classify objects, though the long-term goal is to create a smaller, low-power chip that can analyze more complex data and, yes, learn. Scurry past the break for some videos from IBM's researchers, along with the full press release.
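    For readers curious what "digitally replicating the brain's neurons, synapses and axons" can look like in software terms, here's a minimal, purely illustrative sketch of a leaky integrate-and-fire network in Python -- a toy model under our own assumptions, not IBM's chip or DARPA's SyNAPSE design; the neuron count, constants and random weight matrix are made up for demonstration.

        # Toy leaky integrate-and-fire network -- illustrative only, not IBM's design.
        import numpy as np

        rng = np.random.default_rng(0)

        N = 64            # number of neurons (arbitrary)
        dt = 1.0          # time step, ms
        tau = 20.0        # membrane time constant, ms
        v_thresh = 1.0    # spike threshold
        v_reset = 0.0     # post-spike reset potential

        # Random synaptic weights: entry [i, j] couples presynaptic neuron j
        # to postsynaptic neuron i, loosely standing in for "synapses".
        weights = rng.normal(0.0, 0.1, size=(N, N))

        v = np.zeros(N)                    # membrane potentials
        spikes = np.zeros(N, dtype=bool)   # which neurons fired last step

        for step in range(200):
            external = rng.normal(0.05, 0.02, size=N)      # outside drive
            recurrent = weights @ spikes.astype(float)     # input from last spikes

            # Leaky integration: decay toward rest, accumulate input.
            v += (dt / tau) * (-v) + external + recurrent

            # Crossing the threshold produces a spike and a reset.
            spikes = v >= v_thresh
            v[spikes] = v_reset

    The point of the sketch is simply that memory (the weights) and computation (the update loop) are intertwined and every neuron updates in the same pass -- the kind of parallelism the article contrasts with the von Neumann split between memory and processor.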

  • Study finds casual gaming can help cognition

    by Griffin McElroy
    05.27.2010

    East Carolina University's Psychophysiology Lab recently published some promising findings from a study on the effect casual games can have on the cognitive abilities of older players. According to Gamasutra, the study, which has been running for almost six months and counting, has measured certain mental functions of 40-some participants over the age of 50 as they've played various PopCap games in half-hour chunks over the duration of the study. Researchers have found that even this semi-regular play (like, really, who plays a PopCap game for just thirty minutes) has improved participants' cognitive response times by 87 percent, in addition to increasing their executive functioning by a whopping 215 percent. So, what does that mean? The group conducting the study explained that these findings could show casual games (and, in all likelihood, "so-called 'hardcore' video games") to be effective mental exercise for the elderly, or for those who suffer from dementia and Alzheimer's. That's really great news, since our grandmother has probably played enough Zuma that she can now move things around with her mind.

  • Is my iPhone making me dumber?

    by Chris Rawson
    05.12.2010

    I love my iPhone. It goes with me everywhere. Leaving my house without my iPhone would feel just as unnatural as leaving without my pants (although I'd probably get fewer stares). There are so many things my iPhone is able to do that it's become an indispensable part of my daily life... and that's actually beginning to worry me. Sometimes I wonder if my iPhone is making me dumber. This didn't start with my iPhone; it started with my first cell phone (and the only phone I owned before the iPhone), a monstrous Sanyo SCP-7200. Suddenly, once I was able to store all of my friends' and family members' phone numbers in my phone and dial them with just a couple of button presses, I became virtually incapable of remembering their phone numbers on my own. That was just the beginning of my cognitive downslide, though. Since getting my iPhone, it seems like it's been getting exponentially worse. Read on to find out how the iPhone may be damaging my brain. Hint: it's not the cell phone radiation.

  • A cognitive look at World of Warcraft

    by Mike Schramm
    09.22.2008

    The Human-Computer Interaction Design group at Indiana University seems interesting -- they're apparently working on the human/computer interface, both studying what's already being done between humans and technology and thinking of new ways for the two to interact. And they're concerned with abstracts, not specifics -- they look not at which buttons are being pressed, but at why, and at how the software informs you what to do next. One of the students in a class there has written up a cognitive account of what it's like to play World of Warcraft, which is a look at the game strictly through sense perception. Even if you're an experienced player, it's interesting to see the game in a new light like this -- rather than talk about the lore or the mechanics of gameplay, the writeup is all about the sights and sounds of the game, and how Blizzard's overall design clues you in to what can and can't be done in Azeroth. There's probably lots more work that could be done on this as well -- lots of games, including World of Warcraft, use design elements like colors and lighting to nonverbally clue you in on the next door to go through or where to send your attention during a scene or fight. Most of the group's other cognitive accounts are about actual UI design, but there are many, many things left for those studying user interfaces to mine out of the way video games express themselves to the user.

  • DARPA developing threat sensing binoculars

    by Darren Murph
    04.12.2007

    The night-vision thing has definitely been done a time or two before, but DARPA's latest initiative is looking beyond the darkness as it hopes to create a set of binoculars that can actually detect threats and warn soldiers of impending death. Taking a note from Star Wars, the jokingly dubbed "Luke's Devices" is actually considered a "cognitive technology threat warning system," and utilizes brain monitoring to bring attention to spikes in activity before the person can actually realize he / she has noticed something awry. Among the gizmos that'll bring this all together are "neurally-based target detection signatures, ultra-low power analog / digital hybrid signal processing electronics, wide-angle optics, large pixel-count digital imagers, and cognitive visual processing algorithms." Yeah, sounds pretty complicated to us too, but unlike snazzy concepts we've seen before, the gurus behind these goggles reportedly hope to have prototypes ready for battle in just a few years. [Via Wired]