
Core Values: What's next for NVIDIA?

Core Values is our new monthly column from Anand Shimpi, Editor-in-chief of AnandTech. With over a decade of experience poring over the latest in chip developments, he's here to explain how things work and why our tech is the way it is.



I remember the day AMD announced it was going to acquire ATI. NVIDIA told me that its only competitor just threw in the towel. What a difference a few years can make.

The last time NVIDIA was this late to a major DirectX transition was seven years ago, and the company just quietly confirmed we won't see its next-generation GPU, Fermi, until Q1 2010. If AMD's manufacturing partner TSMC weren't having such a terrible time making 40nm chips, I'd say AMD would be gobbling up market share like a fat kid. By the time NVIDIA gets its entire stack of DX11 hardware out of the gate, AMD will be a quarter away from putting out newly refreshed GPUs.

Things aren't much better on the chipset side either -- for all intents and purposes, the future of NVIDIA's chipset business in the PC space is dead. NVIDIA recently announced that it won't be pursuing any chipsets for Intel's Core i3, i5, or i7 processors until its various legal disputes with Intel are resolved, and beyond that, it doesn't really make sense to be a third-party chipset vendor anymore. Both AMD and Intel are more than capable of doing chipsets in-house, and the only form of differentiation comes from the integrated graphics core -- so why not just sell cheap discrete GPUs for OEMs to use alongside Intel chipsets instead?

Even Ion is going to be short-lived. NVIDIA's planning to mold an updated graphics chip into a new chipset for the next-gen Atom processor, but Pine Trail brings the memory controller and graphics onto the CPU and leaves NVIDIA out in the cold once again.

Let's see: no competitive GPUs, no future chipset business. This isn't looking good so far -- but the one thing I've learned from writing about these companies for the past 12 years is that the future's never quite what it seems. Chances are, NVIDIA's going to look a lot different in the future because of two things: Tesla and Tegra.



Tesla is NVIDIA's high performance computing (HPC) business, with customers in the seismic, financial, medical and academic markets. The workloads are things most of us would never come close to running -- searching for oil deposits, detecting breast cancer. These markets also have the sort of extremely data-parallel workloads that run really well on GPUs, which are very good at working on a lot of data at the same time. A single high end GPU easily has hundreds of execution units that can run in parallel, while a single quad-core CPU may only have a dozen or so. Through C for CUDA, NVIDIA started enabling these markets to port their applications (or parts of them) from x86 CPUs to NVIDIA GPUs.
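
To give a sense of what that porting actually looks like, here's a minimal C for CUDA sketch -- my own illustration, not NVIDIA sample code -- that adds two million-element arrays. The key idea: instead of one CPU loop walking through the data, every element gets its own GPU thread, and the hardware runs thousands of them at once.

    #include <stdio.h>
    #include <cuda_runtime.h>

    // The kernel: each GPU thread computes exactly one output element.
    __global__ void vector_add(const float *a, const float *b, float *out, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // this thread's element
        if (i < n)                                      // guard the tail end
            out[i] = a[i] + b[i];
    }

    int main(void)
    {
        const int n = 1 << 20;                  // about a million elements
        const size_t bytes = n * sizeof(float);

        // Fill two input arrays on the CPU.
        float *h_a = (float *)malloc(bytes), *h_b = (float *)malloc(bytes);
        float *h_out = (float *)malloc(bytes);
        for (int i = 0; i < n; i++) { h_a[i] = 1.0f; h_b[i] = 2.0f; }

        // Copy them to GPU memory.
        float *d_a, *d_b, *d_out;
        cudaMalloc(&d_a, bytes); cudaMalloc(&d_b, bytes); cudaMalloc(&d_out, bytes);
        cudaMemcpy(d_a, h_a, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(d_b, h_b, bytes, cudaMemcpyHostToDevice);

        // Launch enough 256-thread blocks to cover all n elements.
        int threads = 256;
        int blocks = (n + threads - 1) / threads;
        vector_add<<<blocks, threads>>>(d_a, d_b, d_out, n);

        // Copy the result back and spot-check it.
        cudaMemcpy(h_out, d_out, bytes, cudaMemcpyDeviceToHost);
        printf("out[0] = %.1f\n", h_out[0]);    // prints 3.0

        cudaFree(d_a); cudaFree(d_b); cudaFree(d_out);
        free(h_a); free(h_b); free(h_out);
        return 0;
    }

Swap the one-line body of that kernel for something like a seismic filter or an image-comparison step and you've got the basic shape of a Tesla workload: the same simple program, replicated across an enormous pile of data.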

NVIDIA also decided to make its GPU architectures much more flexible, a decision that resulted in the G80 chip at the heart of the GeForce 8800 GTX. At the same time, NVIDIA began investing in programming languages to make writing for its flexible GPUs much easier, and it fed HPC feedback into its GPU design cycle: the GT200 was more HPC-friendly than the G80, and Fermi is even more HPC-friendly than GT200.

NVIDIA believes there's roughly $1 billion to be made in these HPC markets over the next 24 months, and although it's only made a bit over $10 million so far, the company thinks that Fermi is going to be the turning point for Tesla revenue. Let's be realistic, though: at its peak, NVIDIA used to pull in around $1 billion in a single quarter. Tesla alone won't be enough for NVIDIA, not at those numbers.


Tegra is the big one.

Tegra is NVIDIA's SoC brand -- as we talked about last time, the smartphones we love reading about are based on highly integrated SoCs (systems on a chip). That's a CPU, GPU, some other specialized processing, memory/storage and maybe even a modem. Tegra is built almost entirely from NVIDIA-developed technology -- and like everything else in the smartphone space, it's based on ARM, which means NVIDIA won't be dependent on x86 CPUs that will soon have integrated GPUs.

While Tesla depends on NVIDIA's continued development of high end GPUs, Tegra does not. The architectures have very little in common with the desktop chips; they just need to be efficient and low power. They are completely separate designs from what's on NVIDIA's video cards. If push comes to shove, Tegra has enough upside to let NVIDIA exit the PC business entirely and just make SoCs. Marvell told me that the market for ARM-based SoCs is expected to grow to around five billion chips per year -- if NVIDIA can capture a sizable portion of that market, we're easily talking a couple of billion dollars per year (even a 10 percent share at, say, a few dollars per chip works out to well over a billion annually). Tegra could be just as big as NVIDIA's GPU business today.

But going from zero to significant market share in the SoC space is difficult. The established players there are companies like Marvell, Samsung and Qualcomm. Even Intel, unlikely as that sounds, looks like an underdog in that market.

Although Tegra got a lot of attention with the Zune HD, it's based on an older ARM11 core with the usual general purpose performance shortcomings -- and it doesn't necessarily look so hot next to other performance-oriented SoCs that have moved to ARM's Cortex-A8. That said, NVIDIA plans on updating its Tegra SoCs once a year, similar to its GPU update cycle. Given the slow pace of progress we've seen in the SoC space, there's room for NVIDIA's approach to do well -- update Tegra annually and it may end up being fast enough to raise a few eyebrows.

Looking at it this way, the biggest threat to NVIDIA today doesn't come from Intel or AMD, but rather from Imagination Technologies, whose graphics cores are heavily used by Apple and Samsung in smartphones -- and NVIDIA believes its strengths as a GPU maker on the PC side will give it the advantage here. I'm willing to give NVIDIA the benefit of the doubt, but we had better see a big splash in 2010 with Tegra and Tegra 2.

So. Will NVIDIA remain a high end GPU maker for PCs, will it see success in HPC, or will it move entirely to the application processor/SoC space? That future is at least a few years away, and as AMD has already shown us, a lot can happen in a few years. A lot more than I could predict, at least.

[Zune HD image courtesy of iFixit]


Anand Shimpi is CEO and Editor-in-chief of AnandTech. Contact him at anand AT anandtech DOT com or on Twitter at @anandshimpi. Views expressed here are his own.