
How to Buy a Graphics Card? 10 Things To Consider

A discrete graphics card is a must for games, but will boost the performance of other applications like content production software as well. You can find one that matches your needs and your budget as long as you keep in mind these 10 simple things.

Let's face it: Most people who buy such a peripheral for a non-business PC are doing it to play games. Oh sure, discrete graphics cards may confer additional benefits with video playback or transcoding (and, increasingly, Web browsing), but who are we fooling? The main reason discrete graphics remains such a huge and fiercely competitive market is that, when it comes to playing major titles like Mass Effect 3, Batman: Arkham City, or The Elder Scrolls V: Skyrim - as opposed to games from casual-oriented companies like PopCap or Big Fish - integrated graphics, even the drastically improved varieties you find on the latest CPUs from AMD and Intel, just don't cut it.

The ugly truth about buying a discrete card can essentially be summed up in five sad words: The more expensive, the better. There's almost no way around the fact that the more money you're willing to drop on a graphics card, the better your gaming experience will be. If you can spend £300 or more, you're going to be buying yourself an outstanding gaming experience, even if the rest of your computer isn't so special.

That said, there are a few additional things you'll want to keep in mind when selecting a graphics card. All of these points are subsidiary to the issue of cost, but they may help you make a better buying decision if you'd rather not spend as much on a single card as you would on a complete computer system.

AMD or Nvidia?

Would you believe that this question just doesn't matter that much? I didn't think so. Like "AMD or Intel?", "Windows or Mac?", and "Desktop or laptop?", it's a question that inspires intense - and often nonsensical - debate among each side's fans. We're not going to lie to you: There are substantial differences in the technologies in AMD's and Nvidia's graphics chipsets, and if you're really picky, you may have a very good reason for choosing one over the other. (We'll touch on this later.) The truth, however, is that only serious, detail-obsessed gamers are going to be able to discern a difference in appearance between a game running on an AMD card and one running on a comparable product from Nvidia. Most people are going to pay more attention to how realistic a game looks and whether it stutters during play than the specific types of anti-aliasing, physics processing, and multimonitor technologies that are employed. If you find a card at a price you can afford, and reviews say it does well, go with it. Most of the other stuff is just gravy - and often lumpy at that. (Note: Several years ago, AMD bought and absorbed graphics card maker ATI. All AMD's cards now carry AMD branding, but 5000-series and earlier models may still have ATI emblazoned on them somewhere.)

The GPU

A graphics card's graphics processing unit (GPU) is what determines its video capabilities, and many GPUs have names that are fairly arcane and unintuitive if you don't follow the business closely. But a good rule of thumb is that the higher the number in the GPU's name, the more recent and more powerful it is. For example, in Nvidia's newest 600 series, the top-of-the-line (and most expensive) card is the GeForce GTX 690. If you want something more affordable, but still unquestionably powerful, go for a GTX 680 or a GTX 670. On the other hand, the strongest card in AMD's current generation is the Radeon HD 7970, followed closely by the 7950, the 7870, the 7850, and so on down. But the company's most powerful card overall is, as of this writing, the dual-GPU Radeon HD 6990, which was unleashed in 2011. That's the only significant exception at this point: Otherwise, within each company's catalog, a card with a higher number in its name is almost invariably a superior performer.
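The rule of thumb above can be sketched in a few lines of code. This is a toy illustration, not a real comparison tool; the card names are from the article, and a genuine ranking needs benchmarks (the dual-GPU Radeon HD 6990 exception shows why the rule isn't absolute):

```python
# Toy illustration of the naming rule of thumb: within one vendor's
# current lineup, a higher model number generally means a faster card.
# Real performance comparisons require benchmarks, not name parsing.

def model_number(name):
    # Pull the numeric portion out of a card name like "GTX 680".
    return int("".join(ch for ch in name if ch.isdigit()))

nvidia_600_series = ["GTX 670", "GTX 690", "GTX 680"]
fastest = max(nvidia_600_series, key=model_number)
print(fastest)  # GTX 690
```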

Memory and Clock Speed

Ultimately, graphics cards aren't much different from CPUs - a lot of the same rules apply, including those about memory and clock speed. They have their own collections of both, although the specific values of either will usually be less immediately important than when you're choosing a system processor. It's possible for two graphics cards of the same type to have different amounts of memory - the GTX 580, Nvidia's single-GPU flagship from the last generation, is available with both 1.5GB and 3GB of memory - and the one with more memory will tend to perform better (especially at high resolutions) and cost more. Likewise, some cards might use GDDR5 memory, which is faster and more expensive than the DDR3 and GDDR3 memory other cards use. Lower-end video cards sometimes have faster clock speeds to compensate for the power they lack in other areas, but that usually won't translate to increased real-world performance. You only need to pay close attention to these specs if you want to fine-tune your purchase; in most cases, the name of the GPU will tell you everything you need to know about the card's capabilities.
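To get a feel for why higher resolutions demand more video memory, here is a rough back-of-the-envelope sketch. The per-pixel size and buffer count are illustrative assumptions, not a vendor formula; real games also store textures, geometry, and anti-aliasing buffers, so actual memory use runs far higher than the frame buffers alone:

```python
# Rough estimate of frame-buffer memory at a given resolution.
# Assumes 4 bytes per pixel (32-bit color) and triple buffering;
# both figures are illustrative assumptions for this sketch.

def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

print(round(framebuffer_mb(1920, 1080), 1))  # a 1080p monitor
print(round(framebuffer_mb(2560, 1600), 1))  # a 30-inch panel
```

Even at 2,560 by 1,600 the frame buffers themselves need well under 100MB; it's everything else a game keeps in video memory that makes 1.5GB-versus-3GB cards meaningful.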

Card Size

Back when all computers were two-foot-tall towers, no one thought much about graphics cards hogging a lot of space. But because PCs now come in a range of sizes, you may not be able to use every card in every system you want to build. The more powerful a card, the longer it's going to be, and the less likely it is to fit into a microATX case, or even a smaller ATX one. At 12.5 inches, the aforementioned Radeon HD 6990 is the longest card you can buy, but some higher-end AMD and Nvidia cards come within an inch of that. If you're building a smaller computer, or you want to upgrade the video in a minitower you already have, be aware that your card choices may be limited.

DirectX Version

Graphics cards support different versions of Microsoft's DirectX collection of application programming interfaces (APIs), which offer different graphical and processing features. Again, the specific details will matter only to hard-core gamers, but the higher the version of DirectX a card supports, the more realistic games that use it will look - and the more demanding they will be for your computer to run. The highest-level version as of spring 2012 is DirectX 11 (DX11), and cards that can run it will also be able to run the full feature sets of games written for DX10 or DX9, but DX9 or DX10 hardware won't be able to display all the effects of DX11 games (assuming they run at all). A game's packaging or its developer's website will tell you what version of DirectX it uses - always check to make sure your hardware matches up. All the latest AMD and Nvidia cards support DX11, so if you buy new you won't have a problem.

Output Ports

The most common way to connect your monitor to your video card is with a DVI port, a trapezoid-shaped jack that's usually white. Just about all graphics cards today will have one of these, and most will have two, in case you want to hook up more than one display. There are two types: single-link, which can drive monitors at resolutions up to 1,920 by 1,200, and dual-link, which can drive displays as large as 2,560 by 1,600. But there are other jacks out there. You may also find an HDMI or mini HDMI port, for outputting to an HDTV or similar device. DisplayPort is an up-and-coming standard that promises support for resolutions higher than DVI's 2,560-by-1,600 maximum; there's also a smaller version with the same functionality, called Mini DisplayPort. The particular selection of ports on your card depends on a number of factors, ranging from how powerful the card is to the manufacturer's whims, but you'll almost always find a dual-link DVI port, accompanied by at least an HDMI or DisplayPort jack of either the full- or reduced-size variety. Your monitor probably accepts a couple of different connection types, but it's worth checking both it and the card you're considering before you buy, just to make sure everything works as soon as you've installed the card.
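A quick sketch of the DVI compatibility check described above. The limits are the common nominal maximums at 60Hz for each link type, not exact bandwidth math, and the function names are just for illustration:

```python
# Check whether a monitor's resolution fits a DVI link type.
# The figures are the commonly quoted 60Hz limits, not a
# precise TMDS bandwidth calculation.

DVI_LIMITS = {
    "single-link": (1920, 1200),
    "dual-link": (2560, 1600),
}

def fits(link, width, height):
    max_w, max_h = DVI_LIMITS[link]
    return width <= max_w and height <= max_h

print(fits("single-link", 2560, 1600))  # False: needs dual-link
print(fits("dual-link", 2560, 1600))    # True
```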

Internal Power Connectors

Graphics cards are serious business, and if you want the strongest ones on the market, simply plugging them into one of your computer's PCI Express (PCIe) x16 expansion slots isn't enough. Almost all high-performance cards - and increasingly the more powerful midrange cards - require dedicated connections to your system's power supply as well. These jacks, of which there may be one or two, with either six or eight pins each, will almost always be found on the edge of the video card farthest from the I/O bracket. If you don't connect them to the proper cables from your power supply, your computer won't boot. Also, make sure your power supply can handle the card: It's not uncommon for lower-wattage power supplies to lack enough PCIe power cables to drive higher-end graphics cards. For their more demanding cards, AMD and Nvidia will list the minimum power supply you'll need - take this recommendation seriously!

Power Consumption

For no other component in your computer is the thermal design power (TDP) spec as important as it is for graphics cards. Many of the top-end models really suck up the juice - AMD's and Nvidia's dual-GPU behemoths, the Radeon HD 6990 and the GeForce GTX 690, carry TDPs of 375 and 300 watts, respectively. And that's before you factor in any of your system's other hardware: A full high-end system with the 6990 installed used about 425 watts under full graphics load, and one with the GTX 690 about 414 watts. Do some quick math, and if it looks like you'll come close to tapping out your power supply, either choose a different card or buy a new and larger power supply (especially if you think you might want to add still more components, like additional graphics cards, in the future).
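The "quick math" can be sketched as a simple headroom check. The component wattages below are illustrative estimates, not measured figures, and the 80 percent loading ceiling is a common rule of thumb rather than a hard specification:

```python
# Rough PSU headroom check. All component wattages here are
# illustrative estimates; the 0.8 loading factor is a common
# rule of thumb to leave margin for aging and load spikes.

def psu_ok(psu_watts, component_watts, max_load=0.8):
    total = sum(component_watts.values())
    return total <= psu_watts * max_load, total

build = {
    "gpu (TDP)": 300,      # e.g. a GeForce GTX 690-class card
    "cpu": 95,             # typical desktop CPU TDP
    "motherboard/ram": 50, # rough estimate
    "drives/fans": 30,     # rough estimate
}

ok, total = psu_ok(650, build)
print(total, ok)  # 475 True: fits within a 650W unit's margin
```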

Multiple-Card Setups

If your motherboard has more than one PCIe x16 expansion slot, chances are you can add a second, third, or even fourth graphics card to your computer to supercharge its graphics performance (provided your power supply is beefy enough, that is). AMD and Nvidia each have a technology that lets you link two or more cards and have the computer recognize them as one powerful graphics solution. (AMD's is called CrossFireX; Nvidia's is called Scalable Link Interface, usually abbreviated SLI.) On almost every motherboard on the market, you can't mix and match AMD and Nvidia cards - and within one brand, you'll generally want two cards using exactly the same GPU, often even from the same manufacturer. There are a few exceptions, but that's information only die-hard gamers and other enthusiasts will genuinely care about.

Special Features

It's with all the little extras that AMD and Nvidia distinguish themselves from each other. Since 2009, AMD has made a name for itself with its Eyefinity technology, which simplifies setting up and running as many as five or six monitors from a single graphics card - effectively giving you the ability to create your own digital wall. Nvidia has a version of this as well (called Nvidia Surround), but is better known for PhysX, a technology it purchased a few years ago that makes it easier for in-game objects to behave according to real-world physics, and 3D Vision, which lets you play games in stereoscopic 3D, provided you have the proper hardware (a special 120Hz monitor and Nvidia's 3D Vision kit). These technologies don't come for free: Controlling several monitors, rendering complex 3D animations, and making glass and fabric look ultra-realistic all demand a lot more rendering work, so they can slow even the more powerful video cards to a crawl. These are fun features, but they're not yet universally practical for regular use by ordinary users with most single-card setups.

Published under license from Ziff Davis, Inc., New York, All rights reserved. Copyright © 2012 Ziff Davis, Inc.