Intel CTO Justin Rattner just described a future where your devices know more about you -- not just "where you are," but "where you're going," to use his words. Intel's working on a context-aware API that uses not only physical smartphone and tablet sensors (like accelerometers and GPS) but also "soft sensors" including social networks and personal preferences to infer what you're doing
and what you like, and deliver these inferences to a "context engine" that can cater to your tastes. It's presently being tested in an app from travel guide publisher Fodor's, running on a Compal MID, that dynamically delivers restaurant and tourism suggestions based on these factors, and also in a social cloud service (demoed on a prototype tablet) that can show you what your friends are up to on the go (using game-like avatars!). Rattner told us that the API itself is not quite like the typical experiments out of Intel Labs -- while there aren't presently plans to make it publicly available, he said the context engine was built to commercial software standards specifically
so it could become a real product should the technology pan out. In other words, Intel just might be agreeing to do all the heavy lifting for a new generation of apps. How sweet.
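To make the idea concrete: the fusion Rattner describes -- hard sensor data plus soft signals feeding a context engine that ranks suggestions -- might look something like this toy sketch in Python. Every name, threshold, and scoring rule below is our own invention for illustration, not Intel's actual API:

```python
# Hypothetical context engine: fuse "hard" sensor readings (GPS-derived speed)
# with "soft" signals (personal preferences, friends' check-ins) to rank
# restaurant suggestions. Purely illustrative -- not Intel's API.
from dataclasses import dataclass, field


@dataclass
class Context:
    speed_mps: float                                    # hard sensor: GPS/accelerometer
    liked_cuisines: set = field(default_factory=set)    # soft sensor: preferences
    friend_checkins: set = field(default_factory=set)   # soft sensor: social network


def suggest(context, venues):
    """Rank nearby venues by taste and social signals; return nothing
    if the user appears to be moving too fast to stop anywhere."""
    if context.speed_mps > 10:  # inference: probably driving, don't suggest
        return []

    def score(venue):
        s = 0
        if venue["cuisine"] in context.liked_cuisines:
            s += 2  # weight personal taste more heavily
        if venue["name"] in context.friend_checkins:
            s += 1  # small boost where friends have checked in
        return s

    ranked = sorted(venues, key=score, reverse=True)
    return [v["name"] for v in ranked if score(v) > 0]


ctx = Context(speed_mps=1.2,
              liked_cuisines={"ramen"},
              friend_checkins={"Noodle Bar"})
venues = [{"name": "Burger Hut", "cuisine": "burgers"},
          {"name": "Noodle Bar", "cuisine": "ramen"}]
print(suggest(ctx, venues))  # → ['Noodle Bar']
```

The interesting design point is the gating step: a hard-sensor inference (speed) can veto soft-sensor suggestions entirely, which is the kind of cross-signal reasoning a context engine exists to do.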
Intel's context-aware presentation and Fodor's travel app at IDF 2010
Intel's context-aware experiments, eyes-on