Can your graphics card handle it?
Now that one of the most anticipated PC games of the decade is finally out, we’ve decided to find out just how capable some of the graphics cards sitting in our labs really are. There are some surprises and some disappointments for sure, but these benchmarks provide a real-world analysis of whether you need to splurge on a new graphics card for this DX9-based game.
While the shader models lean heavily on your graphics card, some physics- and particle-based effects are entirely dependent on your CPU. During our tests, each graphics card ran with all settings at “Ultra”. The resolution was set to 1920×1200 with no forced anti-aliasing on either the Nvidia or ATI cards.
Our testbed comprises an Intel Core i7 965 EE CPU @ 3.2GHz with 3GB of Corsair DDR3-1333 RAM on a Gigabyte X58A-UD9 motherboard. No component was overclocked. The ATI cards were running on Catalyst 10.7, and the Nvidia cards on ForceWare 258.96, during our testing. StarCraft II was patched to version 1.0.2.
The test is based on a replay file of a Terran vs. Zerg multiplayer match. While the whole match lasts just over 34 minutes, we selected the segment between 26:01 and 27:19, which features a particularly brutal showdown among Vikings, Thors, Siege Tanks, Hydralisks, Zerglings, Brood Lords, and more. In other words, we deliberately picked a fairly “heavy” scenario, so in typical play the game should generally perform a bit better than the numbers posted above.
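For the curious, the average fps figure for a benchmark window like this is just frames rendered divided by elapsed time. A minimal sketch of that arithmetic, where the frame count is purely hypothetical (FRAPS reports the real number over its capture period):

```python
def window_seconds(start, end):
    """Convert "MM:SS" replay timestamps to an elapsed duration in seconds."""
    def to_s(t):
        m, s = t.split(":")
        return int(m) * 60 + int(s)
    return to_s(end) - to_s(start)

# Our benchmark segment: 26:01 to 27:19 of the replay.
duration = window_seconds("26:01", "27:19")   # 78 seconds

# Hypothetical frame count for illustration only.
frames_rendered = 4992
avg_fps = frames_rendered / duration          # 64.0 fps

print(f"{duration}s window, {avg_fps:.1f} fps average")
```

The same division is all FRAPS is doing under the hood when it reports an average for a timed benchmark run.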
1024x600 @ Low
While we were testing the above cards, we also had a few machines lying around the office that we thought we’d give a whirl. One of them was a netbook, the HP Mini 201 running Windows 7 Starter. It came with an Intel Atom N450 CPU at 1.66GHz and an integrated Intel GMA 3150 graphics controller, with 256MB of its 2GB of RAM shared as video memory. With all settings on Low and a fullscreen resolution of 1024×600, FRAPS showed an average of 8fps. (Yes, we can already see the next bunch of netbook ads touting that you can play StarCraft II on one.)
1360x768 @ High/Ultra (Low Shaders)
Next up was an HP Pavilion dm4. This shiny new laptop packed a Core i5 at 2.27GHz, 3GB of RAM, and switchable graphics that toggle between the Intel GMA HD and an ATI Mobility Radeon HD 5450 with 512MB of onboard memory. We had quite an interesting experience with this one. At a resolution of 1360×768 with everything set to High, we got an average of 20fps. However, with Shaders turned down to Low and everything else set to High/Ultra, the average jumped to 40fps.
1920x1200 @ Ultra
Finally, we had another desktop that’s used in the office. This one has a Core i7 860 @ 2.80GHz, 4GB of RAM, and an ATI Radeon HD 5970 with 2GB of memory. It averaged 64fps, thanks to the dual-GPU setup on the 5970 (2x 5850).
So there you have it: StarCraft II will run on just about anything, and the better the hardware, the smoother it goes. The shader settings play a huge role in how good the game looks, bringing its DX9-era graphics up to an acceptably modern standard. If you have a low-end card, lowering the Shader setting will significantly boost your performance at the cost of some eye candy.