Instead of trying to build the biggest and most powerful video card on the market, AMD aimed at the low end with the Radeon RX 480. But that doesn't make it any less exciting than NVIDIA's recent powerhouse GeForce GTX 1080 and 1070 GPUs. AMD's pitch for the RX 480 is simple: It's a $200 card that's VR ready. That's huge, especially since the current batch of GPUs that meet minimum VR specs cost around $350. I'll admit I was skeptical when AMD announced the RX 480 at Computex last month. But after putting one through its paces over the past few days, I feel like Han Solo in The Force Awakens. It's true. All of it.
Gallery: AMD Radeon RX 480 | 8 Photos
To be fair, AMD did prime the pump a bit by sending me the 8GB version of the RX 480. That version of the card will retail for around $239, a bit more than the $200 price of the 4GB model. There will be some performance differences between the two cards, but they likely won't be significant in most of today's games. Still, AMD admits the 8GB version is a better bet if you want to future-proof your system for forthcoming titles.
Compared to the last AMD card I tested -- the mammoth R9 Fury X -- the RX 480 is elegant in its simplicity. It's basically a black box with some classy dimpling on the front and a single fan. It's based on AMD's new Polaris architecture, which is built on a 14nm FinFET (a type of 3D transistor) process. That means the chip itself is significantly smaller than those built on the company's previous 28nm design, which debuted back in 2011. Polaris' small size makes it more power efficient, and it also lets AMD reach higher clock speeds than ever before (1,120MHz, with boost speeds up to 1,266MHz).
Installing the RX 480 was like installing any other GPU: Plug it into a PCI Express slot and connect additional power (in this case, a single 6-pin PSU cable). I hooked a 4K monitor into one of the three DisplayPort connections (there's also an HDMI port) and installed AMD's latest drivers, and I was ready to start gaming. It wasn't long before I forgot I was testing a $240 video card in my rig, which otherwise consists of a 4GHz Core i7-4790K CPU, 16GB of 2,400MHz DDR3 RAM and a 512GB Crucial MX100 SSD on an ASUS Z97-A motherboard.
| Card | 3DMark (Standard / Extreme / Ultra) | 3DMark 11 (Extreme) |
| --- | --- | --- |
| AMD Radeon RX 480 | 10,279 / 5,146 / 2,688 | X4,588 |
| NVIDIA GeForce GTX 1080 | 15,859 / 9,316 / 5,021 | X9,423 |
| AMD R9 Fury X | 13,337 / 7,249 / 3,899 | X6,457 |
In most of the 3DMark tests, the RX 480 scored around half as well as the GTX 1080. That's actually quite impressive, considering that the 1080 costs upward of $600. Notably, the RX 480 was also slightly faster than comparable benchmarks from NVIDIA's GTX 970, which still costs more than $300 today (and was previously the bare minimum you needed for VR).
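For a rough sense of value, it's worth dividing those scores by street price. The snippet below is a back-of-the-envelope sketch of my own (not a formal metric), using the Extreme-preset 3DMark scores above, the RX 480's $239 price and a $600 figure for the GTX 1080:

```python
# Back-of-the-envelope value comparison: Extreme-preset 3DMark score
# divided by street price, using the figures quoted in this review.
cards = {
    "Radeon RX 480 (8GB)": {"score": 5_146, "price": 239},
    "GeForce GTX 1080":    {"score": 9_316, "price": 600},
}

for name, card in cards.items():
    points_per_dollar = card["score"] / card["price"]
    print(f"{name}: {points_per_dollar:.1f} 3DMark points per dollar")
# → Radeon RX 480 (8GB): 21.5 3DMark points per dollar
# → GeForce GTX 1080: 15.5 3DMark points per dollar
```

By that crude measure, the RX 480 delivers roughly 40 percent more benchmark points per dollar than the GTX 1080.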
| Card | 4K benchmark 1 (avg fps) | 4K benchmark 2 (avg fps) |
| --- | --- | --- |
| AMD Radeon RX 480 | 20 | 25 |
| NVIDIA GeForce GTX 1080 | 43 | 48 |
| AMD R9 Fury X | 35 | 38 |
I knew from the get-go that this card wouldn't be much of a 4K contender, and while the results I got weren't playable, I'm still surprised at how well it did compared to the GTX 1080 and the R9 Fury X. What really impressed me, though, was the RX 480's 1440p performance with maxed-out settings. It managed to reach nearly 60 frames per second in most titles, which has been my PC gaming goal for the past few years. What you lose out in resolution compared to 4K, you get back in overall smoother performance (and the ability to use more-elaborate graphical settings).
| Card | 1440p test 1 (fps) | 1440p test 2 (fps) | 1440p test 3 (fps) | 1440p test 4 (fps) |
| --- | --- | --- | --- | --- |
| AMD Radeon RX 480 | 43 | 45 | 58 | 60 |
| NVIDIA GeForce GTX 1080 | N/A | N/A | N/A | N/A |
| AMD R9 Fury X | N/A | 70 | N/A | N/A |
The RX 480 also cleaned up well in 1080p gaming, but that's no surprise. If you're buying a new video card today, though, you're far better off aiming for the 1440p milestone (even if you don't have a compatible monitor yet).
When it comes to real-world performance, the RX 480 felt as fluid as the GTX 1080 when playing Overwatch at 1440p with all graphics settings at their maximum. It never dipped below 60 fps, even when things got incredibly hectic. These days, that's all I ask for in a video card. With the new Doom, it hovered between 55 and 60 fps, which is still commendable given how demanding that game can be. It didn't fare as well with The Witcher 3, getting around 43 fps, but that's also a game that eats GPUs for breakfast.
As for VR, the RX 480 delivered a solid experience without much slowdown. It didn't matter if I was dogfighting in Eve: Valkyrie, exploring alien worlds in Farlands or platforming in Lucky's Tale. I kept a particular eye out for stuttering or anything that could lead to motion sickness but couldn't detect any major issues. AMD wasn't lying: This is a VR-ready card. There's a chance that the 4GB version of the RX 480 could have some issues dealing with virtual reality, but given the speeds I saw with traditional games, even that card should be able to handle basic VR requirements (pumping out a 1,200-by-1,080 resolution at 90 fps).
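A quick way to see why that result is plausible: the raw pixel throughput of the VR baseline quoted above lands close to the 1440p/60 workload the card already sustains. The sketch below is my own simplification, and it ignores the supersampling and reprojection overhead real VR pipelines add, so it understates the true load:

```python
# Raw pixel throughput: pixels the GPU must render each second.
# Ignores VR supersampling/reprojection, so this understates the real load.
def pixels_per_second(width, height, fps, eyes=1):
    return width * height * fps * eyes

vr_baseline = pixels_per_second(1200, 1080, 90, eyes=2)  # per-eye panel, 90 Hz
qhd_60 = pixels_per_second(2560, 1440, 60)               # 1440p at 60 fps
uhd_60 = pixels_per_second(3840, 2160, 60)               # 4K at 60 fps

print(f"VR baseline: {vr_baseline:>11,} px/s")  # 233,280,000
print(f"1440p/60:    {qhd_60:>11,} px/s")       # 221,184,000
print(f"4K/60:       {uhd_60:>11,} px/s")       # 497,664,000
```

In raw pixels per second, the VR baseline sits just above 1440p/60 and well under 4K/60, which lines up with the benchmarks: the card comfortably clears the first two workloads and struggles with the third.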
Temperature-wise, the RX 480 idled around 35°C and reached 69°C while benchmarking and gaming. Its fan was normally quiet, but when things heated up it was definitely audible. Since it's a small fan, it's whinier and higher pitched than the larger fans you'll find on most video cards. That might be annoying for some, but it never bugged me in the middle of gaming sessions.
As with the GTX 1080 and 1070, there simply isn't anything else that can compete with the Radeon RX 480 in the budget video card market. Last year's cards all cost more and offer less performance. The real problem is deciding between the $200 4GB model and the $239 8GB version. For peace of mind (and a likely smoother VR experience), I'd recommend springing for the additional memory. AMD will also offer cheaper Polaris cards, the RX 460 and 470, but those are meant for e-sports titles and less-demanding systems.
In the end, AMD has successfully delivered on its promise of making a VR-ready card that everyone can afford. And what's most intriguing is that NVIDIA doesn't yet have a viable budget competitor. The door is wide open for AMD to redefine what a low-end GPU can do.