
Google unveils its multilingual, code-generating PaLM 2 language model

The large language model is now svelte enough to run locally on mobile devices.


Google has stood at the forefront of many of the tech industry's AI breakthroughs in recent years, Zoubin Ghahramani, Vice President of Google DeepMind, declared in a blog post, asserting that the company's work in foundation models is "the bedrock for the industry and the AI-powered products that billions of people use daily." On Wednesday, Ghahramani and other Google executives took the Shoreline Amphitheater stage to show off the company's latest and greatest large language model, PaLM 2, which now comes in four sizes able to run locally on everything from mobile devices to server farms.

PaLM 2, obviously, is the successor to Google's existing PaLM model that, until recently, powered its experimental Bard AI. "Think of PaLM as a general model that can then be fine-tuned to achieve particular tasks," he explained during a call with reporters earlier in the week. "For example: health research teams have fine-tuned PaLM with medical knowledge to help answer questions and summarize insights from a variety of dense medical texts." Ghahramani also noted that PaLM was "the first large language model to perform at an expert level on the US medical licensing exam."

Bard now runs on PaLM 2, which offers improved multilingual, reasoning, and coding capabilities, according to the company. The language model has been trained far more heavily on multilingual texts than its predecessor, covering more than 100 languages with improved understanding of cultural idioms and turns of phrase.

It is equally adept at generating programming code in Python and JavaScript. The model has also reportedly demonstrated "improved capabilities in logic, common sense reasoning, and mathematics," thanks to extensive training data from "scientific papers and web pages that contain mathematical expressions."

Even more impressive is that Google was able to spin the base PaLM 2 system off into a family of differently sized models dubbed Gecko, Otter, Bison and Unicorn.

"We built PaLM to to be smaller, faster and more efficient, while increasing its capability," Ghahramani said. "We then distilled this into a family of models in a wide range of sizes so the lightest model can run as an interactive application on mobile devices on the latest Samsung Galaxy." In all, Google is announcing more than two dozen products that will feature PaLM capabilities at Wednesday's I/O event

This is a developing story. Please check back for updates.