SRI International's AIC began its ARPA-sponsored research in 1966 -- the same year the center was founded -- with the goal of developing an artificially intelligent mobile robot that could make its way in the world without having to trouble its human counterparts. Shakey stood nearly six feet tall and four feet wide (if you include its sensor extensions) and resembled a photocopier on wheels with a camera perched on top. It didn't go in for anthropomorphism; it was strictly the basics. The focus was on thought processes and scene analysis, and complicated appendages would have been an unnecessary distraction from the already challenging tasks at hand. While a primary goal was independent freedom of movement, the initial model's range was restricted by a tether until its radio link was installed late in 1968. Shakey also got a brain upgrade in 1969, with a new computer and revamped software that expanded both its range and its abstract problem-solving abilities.
Shakey's "head" was decked out with a movable vidicon television camera and an optical rangefinder as its primary sensory apparatus. A head-mounted antenna supported a full-duplex radio link for real-time communication with its host computer: one channel was dedicated to telemetry, while the second carried the video signal. An onboard control logic system filled the robot's midsection and routed incoming commands to the appropriate vehicle systems, including the drive motors and the camera's settings and tilt angle. Shakey's lower regions were studded with whip-like touch sensors called "cat whiskers," along with a push bar. The whiskers signaled contact with an object, and the push bar measured pressure when the robot engaged with an obstacle.
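As a rough illustration of the setup described above, here's a minimal sketch of how a control logic layer like Shakey's might route commands to subsystems and report touch-sensor contact. Everything here (the class, command names, and thresholds) is hypothetical for the sake of the example, not SRI's actual design or code.

```python
# Hypothetical sketch -- not SRI's actual software. Models an onboard
# control logic layer that routes named commands from the radio link
# to the right subsystem, plus simple contact sensing from the
# "cat whisker" and push-bar inputs.

class ControlLogic:
    def __init__(self):
        # Each subsystem is a handler keyed by command name.
        self.handlers = {
            "drive": self._drive,
            "tilt_camera": self._tilt_camera,
        }
        self.log = []

    def dispatch(self, command, value):
        # Route a command received over the radio link to its subsystem.
        if command not in self.handlers:
            raise ValueError(f"unknown command: {command}")
        return self.handlers[command](value)

    def _drive(self, distance_ft):
        self.log.append(f"drive {distance_ft} ft")
        return self.log[-1]

    def _tilt_camera(self, degrees):
        self.log.append(f"tilt camera {degrees} deg")
        return self.log[-1]


def contact_report(whiskers, push_bar_pressure, threshold=1.0):
    """Whiskers indicate that the robot touched something; the push bar
    reports how hard it is pressing against an obstacle."""
    return {
        "touched": any(whiskers),
        "blocked": push_bar_pressure >= threshold,
    }


robot = ControlLogic()
robot.dispatch("drive", 3)
robot.dispatch("tilt_camera", 15)
print(robot.log)
print(contact_report([False, True], push_bar_pressure=0.2))
```

The dispatcher pattern keeps the vehicle systems independent of the radio link, which mirrors the division of labor in the original design: the host computer did the planning, and the onboard logic just routed low-level commands.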
SRI International's work with Shakey was a major step forward in artificial intelligence and robotics, but it came to an end in 1972. The lab continued its AI work in other areas and eventually circled back to the robot form factor in 1984 with a follow-up project called Flakey. This robot had improved visual algorithms that allowed it to identify and follow individuals, paired with the DECIPHER speech-recognition system for processing and responding to verbal commands. Other SRI projects have since taken spatial awareness and navigation further by using coordinated sets of robots for distributed tasks, like the Centibots of the early aughts.
Shakey figures out how to access and move a block located on the raised platform.
The work that began with projects like Shakey has certainly matured over the last few decades. Groups from MIT and Stanford University have joined Ford to explore algorithms for predicting current and future pedestrian and traffic movement, as well as extending sensor range to peer around obstacles like lane-hogging trucks. Recent initiatives by the DOT's National Highway Traffic Safety Administration aim to install vehicle-to-vehicle (V2V) communication systems, similar to the way data was distributed among the Centibots. Meanwhile, Google's Advanced Technology and Projects (ATAP) group is going one step further by taking spatial sensors out of vehicles and putting them right into your smartphone with Project Tango. Having a pocket-sized device that's spatially aware of the world around it could go a long way toward helping us keep our toes safe on those late-night trips to the bathroom.
[Image credits: Sven Wahlstrom (lead image, triple exposure, Shakey's head); SRI International (box moving sequence)]