Image credit: Patrick Daxenbichler via Getty Images

Microsoft's OpenAI supercomputer has 285,000 CPU cores, 10,000 GPUs

It's one of the five fastest systems in the world.

Last year, Microsoft invested $1 billion in OpenAI, a company co-founded by Elon Musk that focuses on developing human-friendly artificial intelligence. Today at the Build 2020 developer conference, we're seeing the first results of that investment: Microsoft announced that it has developed an Azure-hosted supercomputer built expressly for testing OpenAI's large-scale artificial intelligence models.

While we've seen plenty of AI implementations focused on single tasks, like recognizing specific objects in images or translating between languages, a new wave of research centers on massive models that can perform multiple tasks at once. As Microsoft notes, that could include moderating game streams or potentially generating code after exploring GitHub. If they pan out, these large-scale models could make AI far more useful for consumers and developers alike.

The OpenAI supercomputer is powered by 285,000 CPU cores and 10,000 GPUs, all linked by speedy 400 gigabit per second network connections. And while Microsoft didn't reveal specific performance figures, the company says the machine would rank among the top five on the TOP500 list of publicly disclosed supercomputers.
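Microsoft hasn't said what software actually runs on the machine, but to give a sense of how training is typically spread across thousands of GPUs, here's a minimal data-parallel sketch in PyTorch. The stand-in model, the launcher, and the environment variables are our own assumptions for illustration, not confirmed details of OpenAI's setup.

import os

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # One copy of this process runs per GPU; a launcher such as torchrun
    # sets the RANK, WORLD_SIZE and LOCAL_RANK environment variables.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Stand-in model; a real large-scale network would also be sharded
    # across GPUs, not just replicated.
    model = torch.nn.Linear(1024, 1024).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

    for step in range(10):
        inputs = torch.randn(32, 1024, device=local_rank)
        loss = model(inputs).sum()
        optimizer.zero_grad()
        loss.backward()  # gradients are averaged across every GPU here
        optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()

Launched with one process per GPU (for example, torchrun --nproc_per_node=8 train.py), each replica trains on a different slice of the data while fast interconnects, like this system's 400Gbps links, keep the gradient exchange from becoming the bottleneck.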

At this point, it's unclear how, exactly, OpenAI will take advantage of such a powerful system, but we can at least expect the results to be interesting. The non-profit is best known for developing an algorithm that could write convincing fake news, as well as for showing that even bots learn to cheat while playing hide-and-seek.

Maybe OpenAI will take a cue from Microsoft and develop something like its Turing models for natural language generation, a large-scale AI implementation that's powering things like real-time caption generation in Teams. It's backed by 17 billion parameters for understanding language, a particularly impressive number when competing models clocked in at around 1 billion parameters last year. Microsoft also announced that it's making the Turing models open source, so developers will soon be able to use them for their own language processing needs.
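To get a feel for why 17 billion parameters calls for hardware like this, a bit of back-of-the-envelope Python (our own arithmetic, assuming standard 16-bit or 32-bit floating-point weights, not figures Microsoft has published) shows how much memory the raw weights alone would occupy:

# Rough memory math for a 17-billion-parameter model. These are
# illustrative numbers based only on the parameter count above.
params = 17_000_000_000

for precision, bytes_per_param in [("fp16", 2), ("fp32", 4)]:
    gigabytes = params * bytes_per_param / 1e9
    print(f"{precision} weights alone: ~{gigabytes:.0f} GB")

# Output:
# fp16 weights alone: ~34 GB
# fp32 weights alone: ~68 GB

Training multiplies that footprint several times over with gradients and optimizer state, which is why a model this size gets split across many GPUs rather than squeezed onto one.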

Check out all of our Build 2020 news here!

