Half Life 2 GPU Roundup Part 2 - Mainstream DX8/DX9 Battle
by Anand Lal Shimpi on November 19, 2004 6:35 PM EST - Posted in GPUs
Final Words
Valve has done an incredible job of making Half Life 2 playable on just about any graphics platform sold over the last couple of years. While our first guide was more of an upgrade guide, telling you which card to upgrade to, Part 2 tells you more about where your graphics card stands today.
We found that as far as DirectX 9 support goes, if you've got a Radeon 9600XT you are in very good shape: the game is quite playable at 1024 x 768, and if you want higher frame rates, 800 x 600 works just fine as well. If you want a low cost upgrade, a GeForce 6600GT AGP would be a good way of smoothing things out at 1280 x 1024. Even owners of the Radeon X300 will find that their performance is relatively decent, albeit at 800 x 600. Slower cards like the Radeon 9550 and the X300SE may be better served running in DirectX 8 mode instead.
If you've got an NV3x part, your Half Life 2 performance isn't too bad so long as you stay far away from the DX9 codepath; as a DX8 solution, the NV3x GPUs do just fine, and there's actually no reason to upgrade unless you want better image quality, since the frame rates they provide are pretty high to begin with. The same can be said of the GeForce4; we found the GeForce4 to run Half Life 2 extremely well in DX8 mode, and the image quality is quite good. Be warned: if you are upgrading from a GeForce4, you will want to go for something no slower than the Radeon 9700; otherwise you will get an increase in image quality but a decrease in frame rate.
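Incidentally, you don't have to rely on the game's auto-detection to get a particular codepath; the Source engine exposes a launch option for forcing one. As a quick sketch based on Valve's documented -dxlevel switch (treat the exact values as something to double-check against Valve's documentation rather than as gospel), you would add one of the following to the game's shortcut or Steam launch options:

hl2.exe -dxlevel 81   (force the DirectX 8.1 path - the sensible choice for GeForce4 Ti and NV3x cards)
hl2.exe -dxlevel 90   (force the full DirectX 9 path - appropriate for Radeon 9600/9700-class cards and up)

The corresponding console variable is mat_dxlevel. Keep in mind that forcing -dxlevel reportedly resets some of your video settings, so it's best applied once and then removed from the launch options.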
In the end, we hope these two guides give you a good idea of how powerful your current graphics card is and what your upgrade path should be if you want higher frame rates or better image quality. The next step is to find out how powerful a CPU you will need, and that will be the subject of the third installment in our Half Life 2 performance guides. Stay tuned...
62 Comments
abakshi - Sunday, November 21, 2004
Just a note - the last graph on page 7 seems to be a bit messed up -- the GeForce 6200 is shown at 82.3 FPS - higher than all of the other cards - while the data chart and line graph show it at 53.9 FPS.
KrikU - Sunday, November 21, 2004
Why can't we see benchmarks with AA & AF enabled on mainstream graphics cards? HL2 uses an engine that is largely CPU limited, so AA & AF tests are really welcome! I'm playing with a Ti4400 (o/c to Ti4600 speeds) with 2x AA & 2x AF! This is the first new game where I can use these image quality enhancements with my card!
T8000 - Sunday, November 21, 2004
Half Life 2 seems to be designed around the Radeon 9700. Because Valve seems to have made certain promises to ATI, they were not allowed to optimize any GeForce for DX9.
This also shows with the GF6200, which should be close to the R9600 but is not, due to the optimized Radeon 9700 codepath.
Luckily, Valve was hacked, preventing this game from messing up the marketplace. Now, almost any card can play it, and Nvidia may even be tempted to release a patch in their driver to undo Valve's DX9 R9700 cheats and make the game do DX9 the right way for FX owners, without sacrificing any image quality. Just to prove Valve wrong.
draazeejs - Sunday, November 21, 2004
Well, I like HL2 a lot, much more so than the pitch-black, ugly-fuzzy textures of D3. But honestly, to me it looks exactly like Far Cry, engine-wise. Is there any difference? Respect to the level designers of HL2; no other game nowadays comes even close to that sort of detail and scenery. Also, I think the physics of the people and faces, and the AI, are by far superior. And the Raven-yard is much more scary than the whole of D3 :)))
kmmatney - Sunday, November 21, 2004
[sarcasm] Oh, and have fun running those DX games on other platforms without emulation. [/sarcasm] Obviously, this game isn't meant for other platforms, and that's fine by me. I think the original Half-Life had an OpenGL option, but it sucked (at least on my old Radeon card). In general, OpenGL has always been a pain, dating back to the old miniGL driver days. In my experience, when playing games that had either a DX or OpenGL option, the DX option has usually been more reliable. It could be because I usually have ATI-based cards...
kmmatney - Sunday, November 21, 2004
I didn't mean that DX literally "looks" better than OpenGL; I meant that it seems to be more versatile. Here's a game that can be played comfortably over several generations of video cards. You have to buy a new one to play D3 at a decent resolution. The HL2 engine seems to have room to spare in terms of using DX9 features, so the engine can be further enhanced in the future. I would think this game engine would be preferred over the Doom3 engine.
oneils - Sunday, November 21, 2004
#15, Steam's site (under "updates") indicates that the stuttering is due to a sound problem, and that they are working on a fix. Hopefully this will help you.
vladik007 - Saturday, November 20, 2004
" I'm missing words to how pathetic that is. "1st my post was no.2 NOT no.3.
2nd unlike many people i dont have time to work on my personal computers all the time. IF i dont upgrade this holliday season , i'll possibly have to wait until summer vacation. And you dont see nforce4 out now , do you ?
3rd No it's not pathetic to follow something that's never failed me. Ever heard of satisfied customer ? Well Abit has always treated me very well , RMA proccess , crossshiping , bios updates , good support on official forums ... etc Why on earth should i change ?
4th got it ?
moletus - Saturday, November 20, 2004
I really would like to see some ATI 8500-9200 results too...
Pannenkoek - Saturday, November 20, 2004
#18: How a game will look depends on which features of the video cards are used, and on the art. It's not Direct3D vs. OpenGL; the video cards are the limiting factor. Doom III is just too dark, and that's because of an optimization used in the shadowing. ;-)
#26: Surely you mean "#2"; I'm all for AMD. Not that my current computer isn't pathetic compared with what's around nowadays...