Quote:
Originally posted by GCNaddict@5 January 2004 - 01:19
I must point out that that game is optimized for ATI only.
All the (good) games, such as UT2003 and Tron 2.0, are optimized for both cards, and the results are almost equal in those benchmarks.
Yet again, xbitlabs screwed up.
I can't trust xbitlabs' tests anymore.
What are you talking about?
Quote:
I've gone over this a million times. Half-Life 2 is not the only game the FX cards do poorly in. Everyone refers to HL2 simply because it is one of, if not the, most popular and highly anticipated games to come along in a long while, and it makes heavy use of DX9. If you ever get a chance to ask Gabe Newell, founder of Valve Software, I'm sure he would tell you about the tricks nvidia was willing to pull, and why he himself owns a 9800.
If you've been following the graphics scene at all this past year, you would know that the architecture nvidia used in its FX line of products was not developed around the DX9 standard. They chose to develop their own code because, at the time, ATI was not in the picture: an nvidia card was in roughly 80% - 90% of all gaming computers, and because of this they thought Microsoft would cave in to their demands. However, ATI emerged out of nowhere and released the 9700 Pro, which proved to be nearly twice as fast as the GeForce 4 Tis. Since the core in the 9700 was developed around MS's DX9 standard, MS decided not to go with nvidia's code. Nvidia did not show up at any of the meetings and so was out of the loop on what was happening. By the time they figured it out, it was already too late; fixing it would have meant going back to the drawing board. Rather than do that and waste countless amounts of time and money, they decided to release what they had and let their PR machine keep doing what it does best.

This is when the optimizations came in. They knew their cards' performance would be lacking, so they made "optimizations" to their drivers: decreasing image quality, removing pieces of scenery, and so on. They were basically forced to do this to keep sales going while they wrote a compiler that would convert standard DX9 shader code into their own instruction set. That compiler was introduced with the "Forceware" drivers. Even so, there are still optimizations in place, and the image quality isn't on par with ATI's Radeons, as Futuremark's last patch to 3DMark03 showed.
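To make those "optimizations" concrete, here's a toy C sketch of the sort of thing being described: a driver recognizing a specific application and quietly demoting shader math from full to partial precision before it ever reaches the GPU. Every name in it (the struct, the function, the executable check) is made up for illustration; this is not real nvidia driver code.

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical model of a driver-side shader "optimizer". */
typedef struct {
    char op[8];      /* e.g. "dp3", "mul", "add" */
    int  precision;  /* 32 = full precision, 16 = partial */
} ShaderInstr;

static void optimize_shader(ShaderInstr *prog, int n, const char *app)
{
    /* App detection: only rewrite shaders for recognized titles. */
    if (strcmp(app, "3dmark03.exe") != 0)
        return;

    /* Demote everything to FP16: faster on the hardware,
       lower image quality on screen. */
    for (int i = 0; i < n; i++)
        if (prog[i].precision == 32)
            prog[i].precision = 16;
}

int main(void)
{
    ShaderInstr prog[] = { {"dp3", 32}, {"mul", 32}, {"add", 32} };
    optimize_shader(prog, 3, "3dmark03.exe");
    for (int i = 0; i < 3; i++)
        printf("%s fp%d\n", prog[i].op, prog[i].precision);
    return 0;
}
```

The point is that the application never asked for lower precision; the substitution happens silently in the driver, which is exactly why Futuremark's patch (which defeated the detection) exposed the difference.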
Yes, there are other aspects involved as well. The NV30 (nvidia's FX core) can only use FP16 or FP32; it cannot do FP24 like ATI's R300. And in FP32 mode its performance drops significantly: the design just isn't capable of producing high fps at full precision. This is partly why Valve had to write a special backend specifically for the FX cards to get decent fps in HL2, while ATI's cards run it perfectly fine without any special code. The same will be true of every current and future DX9 game.
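To see what the precision gap actually means, here's a minimal C sketch (assuming a normal-range input; it truncates the mantissa rather than rounding, and ignores denormals, NaN, and infinity) that round-trips a number through FP16 the way a partial-precision shader register would:

```c
#include <stdio.h>
#include <stdint.h>
#include <string.h>

/* Round-trip a 32-bit float through IEEE 754 half precision.
   Simplified on purpose: FP16 keeps only 10 of FP32's 23 mantissa
   bits, so we just throw away the low 13 bits. */
static float roundtrip_fp16(float x)
{
    uint32_t bits;
    memcpy(&bits, &x, sizeof bits);

    uint32_t sign = bits & 0x80000000u;
    uint32_t exp  = bits & 0x7F800000u;  /* biased exponent, kept in place */
    uint32_t frac = bits & 0x007FFFFFu;  /* 23-bit mantissa */

    uint32_t out = sign | exp | (frac & ~0x1FFFu);  /* keep top 10 bits */
    memcpy(&x, &out, sizeof x);
    return x;
}

int main(void)
{
    /* A coordinate on a large surface: the integer part eats the
       mantissa, and the small fractional step is what gets lost. */
    float v = 4096.37f;
    printf("FP32: %f\n", v);                  /* prints 4096.370117 */
    printf("FP16: %f\n", roundtrip_fp16(v));  /* prints 4096.000000 */
    return 0;
}
```

The fraction that FP32 keeps is completely gone at FP16. On screen, errors like that show up as banding and shimmering, which is why the special FX backend had to pick carefully where partial precision was acceptable, while FP24 on the Radeons is enough precision everywhere at full speed.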
Nvidia has no one to blame but themselves. I'm sure their next product, the NV40, will fix these problems, as they wouldn't dare commit the same mistakes twice. But until then, my advice is to stay away from the FX cards. If you must go with nvidia, get something from the GeForce 4 Ti line; otherwise get a Radeon 9500 or above. And next time, GCNaddict, I would suggest you do a bit more research before mouthing off about things you don't understand. Sorry if I sound harsh.