AMD vs Nvidia: 2nd gen DirectX 11 Battle of the GPUs

June 16, 2011

CrossFire vs SLI; Eyefinity vs 3D Surround.

Before I get deeper into how today’s tests will be conducted, let’s get an understanding of how a PC gamer plays games. According to Steam’s latest hardware survey (May 2011), the most widely used resolution is 1920×1080, or 1080p as the HDTV manufacturers like to call it. Keeping that in mind, for the single-monitor tests I used our resident BenQ G2400W LCD monitor, which supports a maximum resolution of 1920×1200.

For those who want to enjoy a multi-monitor setup, three seems to be the ideal number of monitors, as it’s not as expensive as a six-monitor setup, nor is it as space-consuming. For that purpose, I used three LG W2363D-PU monitors, each supporting a 1080p resolution. Once all three of them were hooked up, they gave a total resolution of 5760×1080. Note that running a triple-monitor setup on AMD cards is far from straightforward. For three monitors to work in a CrossFire setup, all three must be connected to the first GPU. It’s either that, or no CrossFire. Since most AMD cards come with mini-DisplayPorts alongside at most two DVI ports per card, you need a mini-DP to DVI or mini-DP to DP adapter. Mind you, this has to be a powered (active) adapter; a passive adapter that may be included in the retail box isn’t enough. You need something like the adapter pictured below. Basically, you cannot run a triple-monitor setup on AMD cards out of the box if your monitors don’t have DisplayPort.
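The combined desktop resolution quoted above is just the panels arranged side by side: widths add up, height stays the same. A quick sketch of that arithmetic (the function name is my own, not anything from AMD or Nvidia tooling):

```python
def spanned_resolution(width: int, height: int, count: int) -> tuple[int, int]:
    """Combined resolution of `count` identical panels placed side by side,
    as in an Eyefinity / Surround spanned display group."""
    return (width * count, height)

# Three 1080p panels, as used for the triple-monitor tests in this article.
w, h = spanned_resolution(1920, 1080, 3)
print(f"{w}x{h}")  # -> 5760x1080
```

Note this is roughly 3x the pixel count of a single 1080p screen, which is why triple-monitor results scale so differently from the single-monitor charts.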

Since the AMD HD 6990 comes with only one DVI port and four mini-DisplayPorts, and since we couldn’t source a powered mini-DP to DP adapter in time for this article to go online, no triple-monitor tests were conducted on that card.

Since 3D is still pretty relevant, I thought I would give these cards a run for their money in 3D modes; and by “these cards” I mostly mean Nvidia’s, because of their substantial support for 3D Vision. AMD’s 3D support is paltry in comparison, although this is all set to change with the launch of their new octa-core CPU and “Gaming Evolved” program. For this purpose, I tested Crysis 2 running in 3D, which is why Crysis 2 is the only game whose standard single-monitor test was run at 1920×1080 instead of 1920×1200.

Coming down to the actual testing methodology now, below are the details of settings used for the benchmarks:

  • 3DMark 11 – Performance preset; 1280×720
  • 3DMark Vantage – Performance preset; 1280×1024
  • StarCraft II – Ultra Settings on all; NoAA/NoAF; 1920×1200
  • Just Cause 2  – Highest Settings; 8xAA/16xAF; 1920×1200
  • Far Cry 2 – Ultra High Settings; 8xAA/16xAF; 1920×1200
  • Unigine Heaven – Highest Settings (except Tessellation being Normal); 4xAA/16xAF; 1920×1200
  • Metro 2033 – Very High Settings; 4xAA/16xAF; 1920×1200
  • Lost Planet 2 – Highest Settings; 8xAA/Default AF; 1920×1200
  • Aliens vs. Predator – Highest Settings; 1xAA/Default AF; 1920×1200
  • Crysis 2 – Extreme Settings; 1920×1080

For the triple-monitor setup, we used the exact same settings as the ones stated above, except that all the benchmarks were run at 5760×1080 instead of 1920×1080/1200. Likewise, the 3D tests of Crysis 2 used the same settings and resolution, except the game was now running in 3D.

Our testbed comprised the following:

  • CPU – Intel Core i7-2600K @ 3.4GHz
  • RAM – G.Skill Ripjaws X 4GB DDR3-1600
  • Motherboard – Gigabyte P67A-UD7
  • HDD – Western Digital VelociRaptor 300GB
  • PSU – Cooler Master 1200W Silent Pro Gold
  • GPUs:
    • Nvidia GTX 550Ti
    • Nvidia GTX 560
    • Nvidia GTX 560 Ti
    • Nvidia GTX 570
    • Nvidia GTX 580
    • Nvidia GTX 590
    • AMD HD 6790
    • AMD HD 6850
    • AMD HD 6870
    • AMD HD 6950
    • AMD HD 6970
    • AMD HD 6990

The latest version of Windows 7 Ultimate 64-bit with Service Pack 1 was used. Drivers for all Nvidia cards were Forceware 270.61 (except the GTX 560, for which 270.48 was used); for all AMD cards, Catalyst 11.4 was used.

In the table below you will see the specs and pricing of each of the above-mentioned cards in more detail. All of these cards were run at stock speeds; any overclocked variants we had were downclocked to match factory speeds.

Now, with all the formalities out of the way, let’s get to what we have all been waiting for: bar charts!



From auditing to editing, I now test and analyze the latest gadgets and games instead of the latest financial statements. Both jobs are equally intense and rewarding. When I'm not burning up hardware in the name of science, you'll find me nuking in DOTA 2 or engineering in TF2.


  • Aequitas

    Would you suggest getting a 6950 or a GTX560 Ti? Its for a Single Monitor Setup.

    • Taimoor Hafeez

      Both have nearly identical performance. 560Ti 1GB costs $250 whereas HD 6950 1GB costs around $230, with 2GB version costing $280. If you’re planning to use the card for video editing as well, then get the 2GB HD 6950.

      Personally I’d go for the GTX 560 Ti because it can overclock very nicely, and at its performance level an extra 1GB (when comparing to 2GB HD 6950) won’t make a tangible difference in games.

    • Alex

Definitely the 6950, as it offers better performance and generally runs cooler as well.

  • Simarills

    6990 WILL work with ONE passive DVI to mini DP in the mix, most come with one active and one passive…mine did and it works fine.

  • Blaa

60 fps is almost acceptable for me. I do ‘notice’ when fps drops below around 80, as I am sure many others can as well, and yes, my monitor will support that.

  • Luay

    Loved the chart! Thanks for summing up what will be (hopefully) the last of the 40 nm chips generation.


  • Footman

    Interesting review. I actually just sold a pair of 2GB 6950s, which I tried to run in CrossFire with triple monitors but ran into too many issues, as well as substantial noise. I ended up with a pair of reference EVGA 560 Tis in SLI: triple monitor was easy, support was better for widescreen gaming, and the solution is silent in my opinion; I have a HAF-X and am unable to hear the 560 Tis over the case fans at full load, which was not the case with the 6950s. But for me the biggest win in going back to the green team was getting transparency supersampling to work. If you retest, you will find that the AMD equivalent does not work, and therefore IMO Nvidia provides a superior image.

    • Adam Garner

      Great read, fellas. I actually just sold a pair of Gainward 1GB GTX 560 Tis, which I tried to run in SLI, but after unforgivable heat and noise issues (which would have required an aftermarket cooler to solve) and numerous graphics instabilities, I decided to try the red team. I purchased two 1GB Sapphire HD 6950s and I haven’t had any issues since. The coolers are running quiet and frosty and I have not had a single graphics glitch. I am very impressed with these AMD / ATI cards.

    • anubis44

      You’re either an nVidia employee or on crack. You’ve just sold two good cards and bought two crappy ones for multi-monitor gaming.

      The 6950 custom cards like the Sapphire Flex edition, Dirt 3 edition, MSI Twin Frozr II & III, Asus DirectCU, etc. are all at least as quiet as any GTX 560 Ti on the market, and with 2GB you can actually turn on 4x AA or more without running out of video memory, unlike with any 1GB video card from either camp.

      And for the cretin who wrote the review, saying multi-monitor gaming was hard with AMD cards: actually, multi-monitor is much harder with nVidia, because you HAVE to have two cards AND an SLI-capable motherboard. With AMD, you can buy just one 6950 card and be up and running with a $25 miniDP to DVI adapter, and the Sapphire Flex editions will do Eyefinity right out of the box. Finally, virtually any motherboard with at least two PCI-E slots will probably support CrossFire, but nVidia, in their greed, has only licensed their SLI technology for a small subset of boards out there. So who’s really making it harder to do multi-monitor gaming?

      • Footman

        Damn, you are rude… Obviously you have an opinion, as do we all; keep it civil.

        I obviously have experience with a pair of 2GB 6950s as well as my current pair of 560 Tis, so I am well positioned to comment on my particular situation. When I game, I want to game with transparency supersampling; AMD’s adaptive AA does not work, therefore in my opinion I get a better image using the 560s…

        Obviously there are people reporting the opposite of my experience; what can I say, to each their own. Just keep a civil tongue in your head if you have an opinion.

        • Joe

          MLAA (morphological anti-aliasing) is actually the better option… and AMD does that in spades, plus gets a 30% improvement in MLAA via the Catalyst 11.8 preview driver. Plus, you traded 4GB of VRAM for 2GB for multi-monitor gaming. There went your headroom for future titles at high res with AA enabled.

          Also everything anubis said is true.

    • Joe

      Really? Because I have two Sapphire 6950 2GBs (OC’d to 840/1325) in a Lancool PC-K62, and the only time I hear them is at POST. Running Unigine at max with 4xAA and max tessellation, the fan on card one maxes out at 38% and I hear nothing. You must have super hearing.

  • Gelf54

    It’s disappointing that the SLI 3D performance on page 12 is barely improved over a single card. Driver issue?
