F.E.A.R. GPU Performance Tests: Setting a New Standard
by Josh Venning on October 20, 2005 9:00 AM EST - Posted in GPUs
Soft Shadows Performance
Please refer back to our earlier section on soft shadows to learn why (aside from abysmal performance) we recommend against enabling soft shadows. Upon selecting the option in FEAR to enable soft shadows, a dialog box pops up to inform the gamer that soft shadows are a high end option which will only run well on heavy hitting graphics hardware. It is very true that you need high end hardware to run the game with soft shadows, but we just don't like the feature.
With soft shadows enabled, the game takes a very significant performance hit. You can see that the 7800 GTX and GT become borderline unplayable at 1600x1200, while the rest of the cards' framerates drop off quite abruptly. The X800 GT is only really playable at the absolute lowest resolution, and the X1300 PRO isn't really playable at all. At 37 fps, the 6600 GT does very well at 800x600; while that is a low resolution by other games' standards, FEAR still looks impressive there. 640x480 leaves something to be desired, but 800x600 doesn't do a bad job in a pinch. Even so, with a card like the 6600 GT, it is especially desirable to disable soft shadows and run at a higher resolution instead.
117 Comments
LocutusX - Thursday, October 20, 2005 - link
I have FEAR, and have been playing it for the past day or so ("sick day" from work). I can't believe AnandTech would consider it good-looking on non-cutting-edge hardware where you have to turn the details down. Have you actually played the game for more than 5 minutes? Performance and graphics quality in the later levels are CRAP if you're using mostly medium settings, which is a NECESSITY if you're using a slow X800 part or anything worse (think X800XL).
For the level of graphics you get, the performance of FEAR is unacceptable. Chronicles of Riddick looked much better, and performed slightly better, on my system. That's an OPENGL game on ATI hardware! Significant, no?
BTW I also just tried Quake4... much much better performance than FEAR, and the indoor sequences look better by comparison (since I can afford to increase details in Q4, because the D3 engine actually runs pretty decently on ATI hardware with the most up-to-date drivers and CatalystAI enabled).
Jackyl - Thursday, October 20, 2005 - link
LOL. He thinks the X800XL is "slow"! A few months ago, everybody here was raving about the X800XL as the best price/performance card, one that actually beat a lot of higher end Nvidia cards. Ugly on medium textures? Go play the first DOOM or Wolfenstein 3D, and then come back and say medium textures are ugly in FEAR. Some people just won't be satisfied. It's people like you, paying $400-600 for a graphics card, who cause prices to inflate. I really can't wait until silicon hits the limit where they can't reduce size anymore and Moore's Law goes obsolete. Then the engineers will have to actually OPTIMIZE the hardware and drivers instead of just cranking out more raw GPU power. I'm really sick of upgrading, and a lot of my friends stopped upgrading their systems two generations ago. They just gave up.
Tell me something... Everyone sure talks big on here about wanting to upgrade their cards. But why is it that when I go to a game store, there are barely any PC games on the shelves? I don't think a lot of people are buying PC games today, even though ATI and Nvidia would like to say otherwise. The shelves are totally full of console games instead.
LocutusX - Thursday, October 20, 2005 - link
Jackyl, thanks for your totally useless post: "LOL. He thinks X800XL is "slow"!"
Yes, compared to the 7800GTX - which apparently is what you NEED for Fear to run at a decent rate AND look good - the X800XL is slow.
"Ugly on medium textures? Go play the first DOOM or Wolfenstein 3D, and then come back and say Medium textures are ugly on Fear."
Not necessary to go back 10 years, chico. Anyone who has played Doom 3 (August 04), Half-Life 2 (November 04) or even Far Cry (March 04) will agree that medium textures are "ugly" in FEAR, although some may not use such strong language.
"It's people like you that pay $400-600 for a graphics card, that is causing prices to inflate."
Uh, no, genius, I paid less than $200 for my X800Pro at a fire sale. And then I overclocked the sh!t out of it, so now it's a little bit faster than a stock X800XL.
"Then the engineers will have to actually OPTIMIZE the hardware and drivers, instead of just cranking out more raw GPU power."
Uh, that's precisely the point of my post - sorta. FEAR runs on a horribly unoptimized, perhaps even poorly written, engine. In my opinion, it is unacceptable that an X800XL-class card should have so much trouble with it.
So, what exactly was the point of your post anyways?
Pannenkoek - Thursday, October 20, 2005 - link
On the aesthetics of FEAR: it would have been kind of the article writers to include screenshots to underline their judgement. As one who has seen Unreal 2 rendered by a lousy GF2, I can understand the parent poster's point. Also, thanks for listening to my request for the absence of subjective opinions on the "playability" of a game in benchmarks...
Kegh - Thursday, October 20, 2005 - link
I played through the demo and thought the graphics were pretty good (considering my setup: 9700 Pro, AMD 2500). But more to your point... I have never been a big fan of Monolith's LithTech engine; every game or demo I have played that used it always feels clunky, the controls always seem "off," and the engine's performance is generally not on par with the other 3D game engines available. To be fair, I haven't played enough of this game to bash the current engine that much, and once the game goes on sale I will probably pick it up. But not until after I buy a new system. :)
michal1980 - Thursday, October 20, 2005 - link
Come on now, high end with no SLI? Maybe the 7800 GT/GTX x2 could really shine for the first time?
Jackyl - Thursday, October 20, 2005 - link
Tried the demo with my 9800 Pro 128MB, 2.2GHz 64-bit proc, 1GB DDR. I ran the game at 1024x768, no soft shadows and no AF/AA, medium textures, and it still ran great on my system. It also still looked great with medium textures and ran smoothly. I'm not sure why the X800 GT got such a low framerate. Because of high textures? Maybe I'll try that on my card tonight too.
Jedi2155 - Friday, October 21, 2005 - link
At first I was very skeptical that my friend's system could handle it, but it worked great and was perfectly smooth @ 1024x768 with medium details. And he's only running an AXP 1800+ with a Radeon 9800 Pro. So it can still work pretty well on older systems.
xsilver - Thursday, October 20, 2005 - link
High end textures absolutely kill the 9800 Pro - killed mine anyways ;) If you let it autodetect the settings it should run smoothly. All the tests here were on max settings except for the aforementioned soft shadows and AA/aniso.
With your settings in the demo (I had basically the same setup), while the frames were good, it still hitches when there are a lot of enemies/action going on.
It would just be good for AT to test it, to compare apples with apples :)
Jackyl - Thursday, October 20, 2005 - link
Actually, I had ATI's 2x AF turned on in the drivers.