
Thursday, April 2, 2015

Graphics Card Guide, April 2015

I mentioned working on some other articles, so this is going to be my first real non-cryptocurrency article in some time. Given my background, it's only fitting to start with graphics cards. So you're thinking about buying a new graphics card (GPU), but not for cryptocurrency mining or anything serious -- you want to use your PC for something useful! You want to play games.

I enjoy games as well, and with a long history of benchmarking and testing games, I've been exploring this realm for a while. But what's different is that I'm not going to try and benchmark everything under the sun on a regular basis -- I have a separate site for that sort of thing. Instead, I want to talk about the bigger picture and focus on what's important and what ends up being marketing fluff.

Use your PC as intended: for gaming!


The first question when considering a new graphics card is pretty simple: what brand of GPU do you want to use? While some might toss out Intel as an option, let's be clear that Intel processor graphics are at best weak little things for the time being; they might run games at low to medium detail and 1366x768 or 1600x900 -- maybe even 1920x1080 -- but in terms of graphical prowess the latest models are roughly the equivalent of an Xbox 360. Which was a great gaming system and had some really fun titles, but the hardware was state of the art about ten years ago. (Yes, ten and not nine -- even when it launched, the core hardware in the Xbox 360 was already about a year or two old.) So the question of which GPU brand to buy is really a choice between AMD and NVIDIA.

There are lots of people out there who like to turn this into a "David and Goliath" story, or at least a "root for the underdog" mindset. And I like having competition to drive innovation as much as the next guy. But let's be clear: both companies are large corporations with a goal of making a profit, and they only care about their users insofar as those users help them make profits. NVIDIA is certainly healthier right now and AMD is struggling in a variety of areas, but buying substandard hardware just to try and help a company survive is not sustainable. With that said, let's talk frankly for a bit about what each company offers -- I'm not pulling punches, but neither am I trying to exaggerate the negatives.


A Frank Look at AMD's GPUs

Starting with AMD, they've had some great GPUs over the years, and they still have plenty of potential. If Bitcoin hadn't happened, AMD might have already fallen; the past few years of cryptocurrency mania certainly helped their bottom line. What's particularly ironic is that as things progressed, it became clear that AMD wasn't actually vastly superior to NVIDIA hardware for mining; rather, the programmers behind the various algorithms simply found it easier to get AMD hardware running, and it took two years before the subject was revisited. Proper CUDA miners eventually made a lot of headway -- perhaps they required more skill to code, but that's a different topic. Today, with the most optimal GPU miners for each architecture, NVIDIA hardware actually does reasonably well versus AMD: it wins some algorithms and loses others, much like in the gaming arena.

But games are what really matters, and for some the ideal is to get as much gaming performance as possible for the lowest price possible. AMD definitely delivers on that front, with many price points favoring the various Radeon GPUs compared to their GeForce competition. If you're not going out and buying new games on the day of release (or pre-ordering games), there are very few instances where AMD's GPUs fail to deliver a good gaming experience. And in terms of bang for the buck, most of the time they'll come out on top. I'll get into the specifics a bit more below, but for now I'll say that AMD GPUs generally perform very well for their price.

So what's the catch? In a word: drivers. Much of the time when you hear people ranting about drivers, you can dismiss the commentary as the ravings of a fanboy, but I've been using GPUs long enough to remember when NVIDIA first coined the term "GPU" (back with the GeForce 256). Over the past 20 years I've owned hardware from 3dfx, Cirrus Logic, Trident, ATI, NVIDIA, and others, and the one constant has been this: drivers matter. You can have the best hardware on the planet, but if the drivers don't work right, it won't matter. And as the creator of DirectX once stated, "The drivers are always broken." True words, and the only real question is just how badly they're broken.

AMD's drivers can be a bit bipolar on the desktop, with some releases working well only to be followed by a release where many "fixed" problems reappear. This happens with NVIDIA as well, but in my experience it doesn't occur as often. And as someone who purchased well over $2000 worth of AMD GPUs during the past two years (for cryptocurrency), I'm not anti-AMD or an NVIDIA fanboy; in fact the first NVIDIA GPU I've purchased in a while was the GTX 970, but I digress....

The real issue with AMD's drivers is this: they're bloated pigs. I remember when AMD first switched to using a .NET interface, and we all complained at the time that it just slowed everything down -- not necessarily for gaming performance, but for the driver install times, load times, and general UI performance. Well, things got better, but then they got worse again and to this day I feel AMD's reliance on .NET for the Catalyst Control Center is at best misguided, and at worst idiotic. When I have to do a clean install of Windows 7 on a new system (because I'm still not sold on Windows 8 and will do my best to just skip it and go straight to Windows 10) and I have an AMD GPU, it's super irritating to have to get .NET up and running as one of the first steps.

A great example of the issues is the install process for AMD drivers. When it works right, it takes several minutes to complete -- on a top-of-the-line system with a solid state drive. Use a hard drive and/or a slower processor and the driver installation routine can take 15+ minutes. That's bad enough, but then there are times when the install process simply breaks. When that happens you have to uninstall the drivers, use a utility like Display Driver Uninstaller to really remove all the cruft AMD's uninstall process leaves floating around, and then reinstall the drivers (and hope it works properly this time). Even on a fast PC, that process can take upwards of 20 minutes, never mind the frustration involved. On a slower PC, it could be an hour or more. It's unconscionable that AMD continues to release drivers that can't even properly uninstall; some serious optimizations to their whole driver system are long overdue, but I fear AMD has plenty of other problems keeping them busy.

There's a separate topic related to notebooks and GPU drivers, which I won't get into much here. Suffice it to say that as bad as AMD's drivers can be on the desktop at times, on notebooks it's far worse. Toss in their Enduro Technology and you can easily end up with a malfunctioning system. I helped a friend with his Alienware M17x R4 and 7970M recently. The initial reinstall of Windows 7 went okay, but somewhere along the way .NET misbehaved and the whole system became essentially non-functional. So I reinstalled everything from scratch and got it all working properly. Then, when he took the PC home, Windows Update told him there were new drivers for his AMD GPU. He updated, like a fool... and the graphics drivers broke, with a BSOD for good measure. [Insert screams of frustration....]

Besides the drivers, AMD has one other real problem right now: power requirements and noise. I don't know what it is about the blowers on AMD's high-end GPUs (7950/7970 and now R9 290/290X), but they can be terribly loud. We all but crucified NVIDIA for similar noise levels back when they launched the GeForce FX, calling the cards a blow dryer. (Can you believe that was more than 11 years ago!?) Today, the R9 290X is every bit as bad, sometimes worse, but we live with it because it's fast and priced competitively. Part of the reason the GPU is loud is that the R9 series can simply use quite a bit of power compared to NVIDIA's Kepler and Maxwell cards, but the real truth of the matter is that NVIDIA has blowers that are simply better designed. A GeForce GTX 980 under 100% load isn't silent by any means, but it's a lot quieter than AMD's cards.

The "solution" is to use open air coolers rather than blowers on AMD GPUs, but that's potentially problematic as well. I understand these GPUs weren't necessarily intended to run 24/7 computational workloads, but I have three blower-style Sapphire 7950/7970 cards and three open air Sapphire Dual-X 7950 cards. The fans on all three Dual-X cards failed within a year, and I ended up replacing them with jury-rigged 120mm case fans simply because that proved more effective. The blowers meanwhile kept going until I finally shut them down (about six months ago, if you're wondering).

Of course, even prior to Bitcoin I had fans fail on other AMD GPUs, including a 5870 and a 4870X2. Those were blower coolers as well, so a blower is by no means a guarantee of longevity, but they did last a few years. Anyway, I'm not sure AMD has really modified their blower design since the 4870 era, but they need to. Seriously, just clone NVIDIA's GTX 770/780 cooler already!


What About NVIDIA?

Okay, enough complaining about AMD; what about NVIDIA? Without rehashing both sides of the story, the reality is that NVIDIA simply offers the better platform right now. Maxwell 2.0 GPUs are efficient and fast, they use less power, NVIDIA's drivers tend to be easier to install and more reliable (and are updated more frequently), and there are other tangential benefits as well. For example, NVIDIA still has PhysX, and when used properly (e.g. in the Batman Arkham series) the technology is very welcome. The GameWorks libraries are also useful, providing ways for developers to make their games look better without investing as much time and effort. And NVIDIA has been pushing new technologies like G-SYNC (which AMD countered with FreeSync).

It doesn't end there. Overall, I've found SLI to be generally more reliable and easier to live with than CrossFire -- it works more often on recent releases, basically. And on the mobile side of things, NVIDIA has all but killed off any real competition from AMD. The last true update to AMD's mobile GPUs came with the HD 7970M, launched way back in April 2012 -- almost three years ago. Since then, the HD 8970M was just a very small bump in clock speeds (50MHz) while the "new" R9 M290X is literally the same GPU as HD 8970M. The higher performance R9 M295X meanwhile is only fit for use in All-In-One systems (specifically, the latest iMac).

By comparison, NVIDIA launched the GTX 680M using the Kepler GK104 in June 2012, and it was generally equal to or faster than the 7970M. The GTX 780M followed in May 2013, a higher clocked version of the same GK104 GPU with more CUDA cores enabled and a substantial increase to GDDR5 clocks. The GTX 880M in March 2014 increased the core clocks on GK104 by around 20%, and then in October 2014 NVIDIA dropped the GM204 hammer with the GTX 980M, delivering more than a 30% performance increase on average compared to the GTX 880M. All told, in the same time frame AMD has increased mobile GPU performance by about 6% at the top while NVIDIA has nearly doubled its performance from GTX 680M to GTX 980M.
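
To put those numbers together, here's a quick back-of-the-envelope sketch in Python. The 880M and 980M gains are the rough figures quoted above; the 780M's gain over the 680M is my own ballpark assumption, so treat the output as illustrative rather than measured:

```python
# Cumulative mobile GPU scaling, GTX 680M -> GTX 980M (illustrative).
gains = {
    "GTX 780M over 680M": 1.25,  # assumption: more cores + faster GDDR5
    "GTX 880M over 780M": 1.20,  # ~20% higher core clocks on GK104
    "GTX 980M over 880M": 1.30,  # GM204, 30%+ faster on average
}

cumulative = 1.0
for step, factor in gains.items():
    cumulative *= factor
    print(f"{step}: x{factor:.2f} (running total: x{cumulative:.2f})")

# AMD over the same period: 7970M -> 8970M -> R9 M290X, roughly +6% total.
print(f"AMD: x1.06 vs. NVIDIA: x{cumulative:.2f}")
```

Multiply those three steps out and you land at roughly 1.95x, which is where the "nearly doubled" figure comes from.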

There's a cost associated with all of this, of course -- and I mean that quite literally. NVIDIA's most popular GPUs right now are the GTX 960 and the GTX 970. The GTX 960 delivers roughly the same performance as AMD's R9 280/R9 285, but its limited RAM could be a problem: the 2GB version (which I would recommend against) is priced well, but the extra $40 to get 4GB is a bit much. The GTX 970 meanwhile does battle with the R9 290/290X, typically holding on to a slight performance lead at less taxing settings but falling behind as the quality and resolution increase. Overall, however, I'd call 960 vs. 285 and 970 vs. 290X close to a tie. The problem comes in going beyond that level of performance.

NVIDIA has their GTX 980 and GTX Titan X for those who want the fastest possible performance no matter the cost. Across a large collection of results, the 980 is only about 15% faster than the 970 (and by extension the R9 290X), and yet it costs 67% more. Ouch! And if that's not bad enough, the Titan X is only about 30% faster than the 980 on average, but it costs 82% more. If you prefer another comparison, the Titan X is about 50% faster than the R9 290X but it costs three times as much. Again: Ouch! NVIDIA is leading the performance charts and they're happy to reap the financial rewards.
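
If you want to see just how steep the diminishing returns are, here's a minimal sketch using the approximate figures above. The prices are illustrative street prices (chosen to match the quoted percentages), with performance normalized so the GTX 970 = 1.00:

```python
# Approximate April 2015 street prices and relative performance,
# normalized to GTX 970 = 1.00. Illustrative figures, not benchmarks.
cards = {
    "R9 290X":     (330, 1.00),  # roughly ties the 970 overall
    "GTX 970":     (330, 1.00),
    "GTX 980":     (550, 1.15),  # ~15% faster, ~67% more expensive
    "GTX Titan X": (999, 1.50),  # ~30% faster than the 980
}

for name, (price, perf) in cards.items():
    print(f"{name}: {1000 * perf / price:.2f} performance per $1000")
# The 970 lands around 3.0, the 980 near 2.1, and the Titan X at 1.5 --
# each step up the ladder buys roughly a third less performance per dollar.
```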

And that is why we need AMD to stick around, and I for one truly hope that when AMD releases R9 380X (or whatever they end up calling their next halo GPU), they'll be able to bring some much needed competition to the high-end GPU market. Unfortunately, much like AMD's APUs are simply no match for Intel's CPUs (the fastest Kaveri APU, the A10-7850K, ends up being about 10% slower than the i3-4330 across a wide selection of benchmarks, while the i7-4790K is a punishing ~115% faster than the A10), the fastest AMD GPU at present is unable to top NVIDIA's best.

Price Comparison and Recommendations

That brings us to the summary of GPUs, pricing, and performance. I also happen to like looking at "bang for the buck" -- not just as a function of GPU pricing alone, but with a broader look at the cost of a complete system. First, let's start with a summary of the current online prices of the various contenders -- and I'll limit things to recent AMD and NVIDIA offerings, so we're looking at AMD GCN 1.1/1.2 parts (Radeon R9/R7) and NVIDIA 700/900 series Maxwell parts.

GPU Pricing Summary, April 2015
[Pricing table: AMD card | Price | NVIDIA card]

That gives you the rundown of pricing, and as is often the case, higher prices generally bring better performance, albeit with diminishing returns. There are also a few cases where paying more is actually a step back in performance; for instance, the R9 295X2 is slightly slower than an R9 290X CrossFire build, and the GTX Titan X is on average slower than 970 SLI and 290X CrossFire (provided the games you play properly support dual GPUs). Let's take the above information and plot it against performance, which I've collected from a number of trusted sources. The result is approximate FPS per dollar spent on the GPU:

[Chart: approximate FPS per dollar of GPU cost]

There's nothing particularly shocking about this chart: the more you spend on a GPU, the lower your returns (in FPS per dollar). A pure "bang for the buck" approach would thus suggest not getting anything more than the R7 260X -- which isn't a particularly fast GPU, really. Similarly, spending the money on a GTX 980 looks like a very poor investment. But while this chart makes sense from one perspective, it doesn't account for total system cost. So as a secondary perspective on bang for the buck, I've factored in the price of the rest of the system, using two baselines: a midrange $750 system (not counting the GPU) and a high-end $1250 system.

[Chart: approximate FPS per total dollar, including a $750 or $1250 base system]

Factoring system price into the equation changes things quite a bit. Now, if you're buying a high-end gaming system there's actually a lot to be said for buying multiple GPUs. The best overall value happens to be GTX 970 SLI, which not coincidentally is exactly what my own PC is running these days (when I'm not testing other hardware). The R9 290/290X CF are more or less equally sensible from a performance perspective, and then there's a moderate drop as you lose performance while increasing price if you buy a dual-GPU card like the R9 295X2. Finally, there's a sizeable drop in terms of return on investment when moving from the top-tier GPUs like the GTX 970/980 and R9 290/290X to less performant options.
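
For the curious, the adjustment is straightforward: divide performance by the total cost of the box rather than the GPU alone. Here's a sketch with illustrative numbers (the $750 and $1250 base-system figures are the ones used above; the GPU prices and performance ratios are rough stand-ins, not measured results):

```python
# Value = relative performance / (base system + GPU price).
def value(perf: float, gpu_price: float, system_price: float) -> float:
    """Relative performance per total dollar spent on the system."""
    return perf / (system_price + gpu_price)

# (price, performance) -- illustrative, normalized to GTX 970 = 1.00.
gpus = {
    "R7 260X":     (120, 0.40),
    "GTX 970":     (330, 1.00),
    "GTX 970 SLI": (660, 1.80),  # assumes ~80% scaling from the 2nd card
}

for system in (750, 1250):
    print(f"--- ${system} base system ---")
    for name, (price, perf) in gpus.items():
        print(f"{name}: {1000 * value(perf, price, system):.2f} per $1000")
```

Considered in isolation the R7 260X wins easily, but once you're paying for the rest of a gaming system anyway, the incremental cost of a faster GPU (or a second one) is a much smaller share of the total -- which is why 970 SLI comes out on top.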

The bottom line, then, is that when shopping for a new system (with gaming as a use case), it's important to determine how much you want to spend and then shop accordingly. This chart also explains why pre-built OEM systems are such a bad value when it comes to gaming: they frequently max out at relatively mediocre GPUs while putting far more money into the CPU/system side of the equation. But even with all the above information, there's still the question of what sort of settings you're really chasing -- there's not much sense buying a GTX Titan X over a GTX 970 if you only plan to run at 1920x1080, for example.

Final Recommendations By Price Range

I'll wrap things up with this final table, breaking down the recommendations by price with a short note on what sort of settings the GPU(s) will handle well. There will be games that need less GPU power as well as titles that need more, but in my book it doesn't hurt to spend a bit extra if possible. Also note that while NVIDIA has pretty much revealed all their cards, we're still waiting to see what AMD has in store. When AMD finally launches their next round of GPUs, we may see pricing on some of the faster NVIDIA parts come down.

GPU Recommendations, April 2015
Category | Price | GPU | Target Settings
Entry Level | $100-$125 | R7 260X | 1920x1080 at Medium to High
Mainstream | $150-$200 | GTX 960 or R9 285 | 1920x1080 at High to Ultra
High-End | $250-$350 | GTX 970 or R9 290X | 2560x1440 at High to Ultra
Enthusiast | $500+ | GTX 970 SLI | 2560x1440 at Ultra, or 4K at Medium/High

Just a few final notes about the recommendations. While AMD is quite competitive at the lower price tiers, things get more questionable at the enthusiast level. Technically R9 290X CrossFire is a reasonable alternative to GTX 970 SLI, but in my experience the fan noise of two 290X cards in CrossFire makes it hard to recommend. The exception to that rule is if you're really serious about 4K gaming: NVIDIA is competitive and often class-leading at most other settings, but at 4K AMD's R9 290X can often spoil the party. Even so, I'd hold off buying an R9 290X until the next generation of AMD GPUs is revealed, though that probably won't happen until June or thereabouts.
