That slab of plastic and glass in your pocket might be called a smartphone, but Google is hoping to make the applications running on it even smarter. The folks in Mountain View hope to achieve that by giving apps access to contextual data like the time of day, where you are, what you're doing, the weather and whether you have headphones plugged in. Oh, and whether any Physical Web devices (beacons) are nearby. A post on the Google Developers blog says that combining that data would allow an app to, say, suggest a playlist when you plug in headphones and go for a run.
That works because the new framework takes that recipe, or "fence" in Google's parlance, and can use it to ping an app even when it isn't open. Hence Spotify triggering some workout jams when you're out for a jog. Maybe future implementations could trigger WebMD to open when you hit the local pizza buffet for the third time in a week. You know, to remind you that maybe you're not making the healthiest decision for lunch.
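To make the "fence" idea concrete: conceptually, a fence is just a condition over a snapshot of contextual signals, and fences can be combined so an app only gets pinged when all of them hold. Here's a minimal toy sketch of that idea in plain Java; the class and signal names are invented for illustration and this is not the real Google API, which lives in Google Play services.

```java
import java.util.function.Predicate;

// Toy model of an Awareness-style "fence": a predicate over a snapshot of
// contextual signals. All names here are illustrative, not Google's API.
public class FenceSketch {
    // A snapshot of two of the contextual signals the article mentions.
    public record Context(boolean headphonesPluggedIn, boolean running) {}

    // Individual fences, one per signal.
    public static final Predicate<Context> headphoneFence = c -> c.headphonesPluggedIn();
    public static final Predicate<Context> runningFence = c -> c.running();

    // Fences compose: only fire when headphones are in AND the user is running.
    public static final Predicate<Context> workoutFence = headphoneFence.and(runningFence);

    public static void main(String[] args) {
        Context jogging = new Context(true, true);
        Context deskWork = new Context(true, false);
        System.out.println(workoutFence.test(jogging));   // fence fires: suggest a playlist
        System.out.println(workoutFence.test(deskWork));  // fence stays quiet
    }
}
```

The real framework adds the crucial piece this sketch omits: the system evaluates the fence in the background and wakes the app when it starts or stops matching, rather than the app polling for it.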
On a more serious note, TechCrunch writes that this could also have your camera app open and waiting when you step outside, based on the number of nature snapshots you've taken. What's more, weather info could be baked into a photo's metadata, so you could search Google Photos for pictures taken only on hot summer days, for instance. Developers can sign up for early API access right now, but it isn't clear when users will see apps supporting the feature.
For all the latest news and updates from Google I/O 2016, follow along here.