
Google is using AI to curate personalized news for smart speakers

An open audio news standard for the Assistant could be huge.

Every morning, without fail, I ask my closest smart speaker to play NPR. That's convenient, but it's only slightly better than turning on an old-school radio. Clearly, there's much more that a web-connected, voice-controlled speaker could do. And it seems Google has the same idea: the company announced today that it's developing an open audio news standard for the Assistant. When you ask your Google Home (or any other Assistant-powered device) to play the news, it'll automatically generate a playlist of stories based on your interests, using the same AI technology that powers Google News.

Hopefully, that will be more useful than jumping into whatever your favorite station happens to be playing at the moment. The briefing starts with short stories under two minutes, then moves on to meatier pieces up to 15 minutes long. And if you don't like what's playing, you can always ask the Assistant to skip to the next story. Google has partnered with media companies like CNBC, The New York Times, and The Washington Post to fill out its library of audio news, and it's also putting out a call for other English-language publishers to join.

Back in August, Google announced a simpler news feature for the Assistant, but that didn't have the benefit of AI recommendations, and there was no way for publishers to contribute content unless Google had already included them. Amazon, meanwhile, has a similar feature for Alexa devices, dubbed Flash Briefing, which brings together news, weather, and other tidbits of information.

By creating an open audio news standard, Google is aiming to go a step beyond Amazon, instead of merely following in its footsteps. But its usefulness will really depend on more publishers climbing aboard and Google's AI selecting stories you actually want to hear.