After being beaten to the punch by Microsoft, its primary rival, Google plans to add conversational AI to its flagship Search product, CEO Sundar Pichai told The Wall Street Journal in an interview. "Will people be able to ask questions to Google and engage with LLMs [large language models] in the context of search? Absolutely," he said. Google has already said it would integrate LLMs into search, but this is the first time the company has announced plans for conversational features.
The move isn't unexpected, particularly after Microsoft released a version of its Bing search engine powered by OpenAI's ChatGPT technology. However, Google's implementation would potentially have more impact, considering its 93.4 percent worldwide share of the search market. Pichai added that he saw AI chat as a way to expand Google's search business rather than as a threat to it. "The opportunity space, if anything, is bigger than before," he told the WSJ.
Pichai didn't reveal a timeline for chat AI search, but it's clear that Google lags behind Microsoft. OpenAI's release of ChatGPT prompted Google to declare a "code red," as it saw the AI as an existential threat to its core business. That concern proved warranted: Microsoft (which owns a large chunk of OpenAI) soon released a version of Bing Search powered by OpenAI's latest GPT-4 model that gave it some uncanny abilities.
Google released its own conversational AI, called Bard, strictly as a chat product on a standalone site and not in Search. However, it clearly lagged behind ChatGPT, even displaying incorrect answers in a Twitter ad. Pichai recently said Google would soon switch to a more "capable" language model in an effort to close the gap.
While Google is cutting jobs in an effort to achieve Pichai's goal of becoming 20 percent more productive, the company is accelerating work on new AI products. To be more efficient, it plans to allow more collaboration between divisions like Google Brain and DeepMind, its two primary AI units. "Expect a lot more, stronger collaboration, because some of these efforts will be more compute-intensive, so it makes sense to do it at a certain scale together," he said.