PDA

View Full Version : Nvidia Gets Raped Again!



adamp2p
01-04-2004, 09:36 PM
I thought I would post these results because, you must admit, they speak for themselves. All of us know that Xbitlabs.com is an unbiased source of data regarding hardware, so I thought these results would aid some of the newer members of the board when deciding upon purchasing a new graphics card. Nothing is new here, of course, we already knew that ATi is the clear leader when it comes to rendering DX9 code.

But here are some results from a new DX9 game called: Lost Oblivion in Chernobyl (http://www.xbitlabs.com/articles/editorial/display/stalker.html)

http://www.xbitlabs.com/images/editorial/stalker/escape_1024_pure.gif

http://www.xbitlabs.com/images/editorial/stalker/escape_1024_4x8x.gif

http://www.xbitlabs.com/images/editorial/stalker/escape_1280_pure.gif

http://www.xbitlabs.com/images/editorial/stalker/escape_1280_4x8x.gif

http://www.xbitlabs.com/images/editorial/stalker/escape_1600_pure.gif

http://www.xbitlabs.com/images/editorial/stalker/escape_1600_4x8x.gif

No comment necessary. The verdict is clear! ATi rulez (for now, that is)!

L0rD_S0tH
01-04-2004, 09:44 PM
Well, I'm at least glad I went with the FX 5700 Ultra rather than the 9600 XT for my new card, since the 5700 beat the 9600 in all the benchmarks. Were I to have enough money for a 9800 or 5950, I'd definitely have to go with ATI though.

Pitbul
01-04-2004, 09:54 PM
Well, actually Nvidia really did not get raped as you exaggeratingly put it, since the 5950 never really went below 60 FPS, so that is decent for the highest Nvidia card. But yes, the 9800 XT does show extremely significant leads over nVidia, of course except with the 9600 XT and FX 5700. But I'm gonna get a 9800 XT so it doesn't matter to me. :01:

adamp2p
01-04-2004, 10:11 PM
It does seem that the 5700 U is indeed living up to the "ultra" name.
However, I neglected to mention a very important point brought up in the conclusion:

If the game uses Pixel Shader 2.0, Nvidia is assed out; however:



In case a game heavily uses Pixel Shaders 2.0, the GeForce FX is likely to stay behind competing RADEON solutions. But if an application requires powerful Vertex Shader processors, performance provided by the RADEON 9600-series will hardly be enough to beat the GeForce FX 5700 and the latter will be champ of the mainstream market.


So those of you buying Nvidia, you had better hope that the games you anticipate playing with your new FX use a similar engine to Chernobyl! I sure hope you guys don't plan on playing HL2 with decent framerates (especially since HL2 will be heavily Pixel Shader 2.0 intensive!) :smilie4: :01:

adamp2p
01-04-2004, 10:15 PM
Originally posted by Pitbul@4 January 2004 - 22:54
Well, actually Nvidia really did not get raped as you exaggeratingly put it, since the 5950 never really went below 60 FPS, so that is decent for the highest Nvidia card. But yes, the 9800 XT does show extremely significant leads over nVidia, of course except with the 9600 XT and FX 5700. But I'm gonna get a 9800 XT so it doesn't matter to me. :01:
Pitbul, your signature disappoints me, especially since AMD is the current leader in ALL benchmarks and is leading the CPU wars as of 2004 Q1. ;)

BiG_aL
01-04-2004, 10:36 PM
Good thing I haven't changed my mind then from a 9600XT to a 5700U...
I'll stick with the 9600XT seeing as it comes with HL2 for free... WOOT

Mad Cat
01-04-2004, 10:42 PM
Originally posted by adamp2p@4 January 2004 - 22:15
Originally posted by Pitbul@4 January 2004 - 22:54
Well, actually Nvidia really did not get raped as you exaggeratingly put it, since the 5950 never really went below 60 FPS, so that is decent for the highest Nvidia card. But yes, the 9800 XT does show extremely significant leads over nVidia, of course except with the 9600 XT and FX 5700. But I'm gonna get a 9800 XT so it doesn't matter to me. :01:
Pitbul, your signature disappoints me, especially since AMD is the current leader in ALL benchmarks and is leading the CPU wars as of 2004 Q1. ;)
Haha, Pitbul got raped.

And AMD is a third world company that produces rip-offs of real American cards (you have to have read my funny post a while back to understand this. If you didn't, then disregard it).

GCNaddict
01-05-2004, 12:19 AM
Originally posted by adamp2p@4 January 2004 - 21:36
I thought I would post these results because, you must admit, they speak for themselves. All of us know that Xbitlabs.com is an unbiased source of data regarding hardware, so I thought these results would aid some of the newer members of the board when deciding upon purchasing a new graphics card. Nothing is new here, of course, we already knew that ATi is the clear leader when it comes to rendering DX9 code.

But here are some results from a new DX9 game called: Lost Oblivion in Chernobyl (http://www.xbitlabs.com/articles/editorial/display/stalker.html)

http://www.xbitlabs.com/images/editorial/stalker/escape_1024_pure.gif

http://www.xbitlabs.com/images/editorial/stalker/escape_1024_4x8x.gif

http://www.xbitlabs.com/images/editorial/stalker/escape_1280_pure.gif

http://www.xbitlabs.com/images/editorial/stalker/escape_1280_4x8x.gif

http://www.xbitlabs.com/images/editorial/stalker/escape_1600_pure.gif

http://www.xbitlabs.com/images/editorial/stalker/escape_1600_4x8x.gif

No comment necessary. The verdict is clear! ATi rulez (for now, that is)!
i must point out that that game is optimized for ATI only

all the (good) games such as UT2003 and Tron 2.0 are optimized for both cards, and the results are almost equal in those benchmarks

yet again, xbitlabs screwed up

i cannot trust xbitlabs in tests anymore

Pitbul
01-05-2004, 12:51 AM
Originally posted by Mad Cat@4 January 2004 - 15:42
Originally posted by adamp2p@4 January 2004 - 22:15
Originally posted by Pitbul@4 January 2004 - 22:54
Well, actually Nvidia really did not get raped as you exaggeratingly put it, since the 5950 never really went below 60 FPS, so that is decent for the highest Nvidia card. But yes, the 9800 XT does show extremely significant leads over nVidia, of course except with the 9600 XT and FX 5700. But I'm gonna get a 9800 XT so it doesn't matter to me. :01:
Pitbul, your signature disappoints me, especially since AMD is the current leader in ALL benchmarks and is leading the CPU wars as of 2004 Q1. ;)
Haha, Pitbul got raped.

And AMD is a third world company that produces rip-offs of real American cards (you have to have read my funny post a while back to understand this. If you didn't, then disregard it).
Can you point out how I got raped, Mad Cat? As regards the AMD/Intel situation, I don't like Intel 'cause they're an American company, that's so cliché, get a life and a brain. And as for AMD leading, yes, there's always one company slightly above the rest; it's called marketing competition. Soon Intel will release their 64-bit and even the field, possibly taking it over again, then AMD will, and back and forth. Are you saying that just because one company has the fastest of a product, everyone must like them? Are you that childish? If that were true, people would be bouncing around for everything. And your tedious points to nitpick and not mind your own business disappoint me. It's quite sad to see you avoid a topic by moving it to another, a topic which you created, from me commenting on your over-exaggerations and saying nVidia got, quote unquote, "Raped", then you moving it to my sig. Very mature. :rolleyes:

adamp2p
01-05-2004, 01:05 AM
Originally posted by GCNaddict@5 January 2004 - 01:19


i must point out that that game is optimized for ATI only

all the (good) games such as UT2003 and Tron 2.0 are optimized for both cards, and the results are almost equal in those benchmarks

yet again, xbitlabs screwed up

i cannot trust xbitlabs in tests anymore
What are you talking about?

Once again, I will repost what you obviously don't know or understand:


I've gone over this a million times. Half Life 2 is not the only game the FX cards do poorly in. Everyone refers to HL2 simply because it is one of the most, if not the most, popular and highly anticipated games to come in a long while. It also heavily uses DX9. If you ever get a chance to ask Gabe Newell, founder of Valve software, I'm sure he would tell you about the tricks nvidia was willing to go through with and why he himself owns a 9800.

If you've been following the graphics scene at all the past year, you would know nvidia's architecture used in their FX line of products was not developed around the DX9 standard. They chose to develop their own code because at the time ATI was not in the picture. An nvidia card was in about 80% - 90% of all gaming computers, and because of this they thought Microsoft would cave in to their demands. However, ATI emerged out of nowhere and released the 9700 Pro, which proved to be nearly twice as fast as the Geforce 4 Tis. Since the core in these 9700s was developed around MS's DX9 standard, MS decided not to go with nvidia's code. Nvidia did not show up at any of the meetings and was thus not in the loop of what was happening. By the time they figured it out, it was already too late and would have required they go back to the drawing board. Rather than do that and waste countless amounts of time and money, they just decided to release what they had and use their PR machine to continue what it did best. This is when the optimizations came. They knew their card's performance would be rather lacking, so they decided to make optimizations to their drivers by decreasing image quality, removing aspects of scenery, etc. They were basically forced to do this in order to keep sales going while they wrote a compiler which would ultimately convert the DX9 code into their own. This compiler was introduced with the 'Forceware' drivers. However, there are still optimizations, and the image quality isn't on par with that of ATI's Radeons, as was shown with Futuremark's last patch to 3DMark03.

Yes, there are other aspects involved as well. The NV30 (nvidia's FX core) only utilizes either FP16 or FP32. It cannot do FP24 like ATI's R300. In FP32 its performance drops a significant amount. The design just isn't capable of producing high fps in FP32 mode. This is partly the reason why Valve had to write a special backend specifically for the FX cards in order to get decent fps in HL2. ATI's cards run it perfectly fine without any special code. It is the same for all current and future DX9 games.

Nvidia has no one to blame but themselves. I'm sure their next product, the NV40, will fix these problems, as they wouldn't dare commit the same mistakes twice. But until then, my advice would be to stay away from the FX cards. If you must go with nvidia then get a Geforce 4 Ti line, otherwise get a Radeon 9500 or above. And next time, GCNaddict, I would suggest you do a bit more research before mouthing off about what you do not know. Sorry if I sound harsh.
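Side note, to put some rough numbers on that FP16/FP24/FP32 point: here is a quick back-of-the-envelope sketch of my own (it only assumes the standard mantissa widths of each format, nothing from the xbitlabs benchmarks):

# Rough precision comparison of the shader float formats mentioned above.
# Mantissa widths: FP16 = 10 bits, ATI's FP24 = 16 bits, FP32 = 23 bits.
# Relative rounding error per operation is roughly 2^-(mantissa_bits + 1),
# which is why dropping the FX cards to FP16 partial precision can cause
# visible banding, while running them at full FP32 costs a lot of speed.
formats = {
    "FP16 (NV3x partial precision)": 10,
    "FP24 (ATI R300)": 16,
    "FP32 (NV3x full precision)": 23,
}
for name, mantissa_bits in formats.items():
    eps = 2.0 ** -(mantissa_bits + 1)
    print(f"{name}: ~{eps:.1e} relative rounding error per op")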

Mad Cat
01-05-2004, 01:16 AM
Originally posted by Pitbul@5 January 2004 - 00:51
Originally posted by Mad Cat@4 January 2004 - 15:42

Originally posted by adamp2p@4 January 2004 - 22:15
Originally posted by Pitbul@4 January 2004 - 22:54
Well, actually Nvidia really did not get raped as you exaggeratingly put it, since the 5950 never really went below 60 FPS, so that is decent for the highest Nvidia card. But yes, the 9800 XT does show extremely significant leads over nVidia, of course except with the 9600 XT and FX 5700. But I'm gonna get a 9800 XT so it doesn't matter to me. :01:
Pitbul, your signature disappoints me, especially since AMD is the current leader in ALL benchmarks and is leading the CPU wars as of 2004 Q1. ;)
Haha, Pitbul got raped.

And AMD is a third world company that produces rip-offs of real American cards (you have to have read my funny post a while back to understand this. If you didn't, then disregard it).
Can you point out how I got raped, Mad Cat? As regards the AMD/Intel situation, I don't like Intel 'cause they're an American company, that's so cliché, get a life and a brain. And as for AMD leading, yes, there's always one company slightly above the rest; it's called marketing competition. Soon Intel will release their 64-bit and even the field, possibly taking it over again, then AMD will, and back and forth. Are you saying that just because one company has the fastest of a product, everyone must like them? Are you that childish? If that were true, people would be bouncing around for everything. And your tedious points to nitpick and not mind your own business disappoint me. It's quite sad to see you avoid a topic by moving it to another, a topic which you created, from me commenting on your over-exaggerations and saying nVidia got, quote unquote, "Raped", then you moving it to my sig. Very mature. :rolleyes:
I just said it because it was the title of the post really.

Oh yes, and I thought that Intel's 64-bit chips were going to take a long while to come.

I couldn't care less what your CPU preference was. It's just that I'm stupid and tired. I make rash decisions on what to post, you see?

GCNaddict
01-05-2004, 02:54 AM
Originally posted by adamp2p@5 January 2004 - 01:05
Originally posted by GCNaddict@5 January 2004 - 01:19


i must point out that that game is optimized for ATI only

all the (good) games such as UT2003 and Tron 2.0 are optimized for both cards, and the results are almost equal in those benchmarks

yet again, xbitlabs screwed up

i cannot trust xbitlabs in tests anymore
What are you talking about?

Once again, I will repost what you obviously don't know or understand:


I've gone over this a million times. Half Life 2 is not the only game the FX cards do poorly in. Everyone refers to HL2 simply because it is one of the most, if not the most, popular and highly anticipated games to come in a long while. It also heavily uses DX9. If you ever get a chance to ask Gabe Newell, founder of Valve software, I'm sure he would tell you about the tricks nvidia was willing to go through with and why he himself owns a 9800.

If you've been following the graphics scene at all the past year, you would know nvidia's architecture used in their FX line of products was not developed around the DX9 standard. They chose to develop their own code because at the time ATI was not in the picture. An nvidia card was in about 80% - 90% of all gaming computers, and because of this they thought Microsoft would cave in to their demands. However, ATI emerged out of nowhere and released the 9700 Pro, which proved to be nearly twice as fast as the Geforce 4 Tis. Since the core in these 9700s was developed around MS's DX9 standard, MS decided not to go with nvidia's code. Nvidia did not show up at any of the meetings and was thus not in the loop of what was happening. By the time they figured it out, it was already too late and would have required they go back to the drawing board. Rather than do that and waste countless amounts of time and money, they just decided to release what they had and use their PR machine to continue what it did best. This is when the optimizations came. They knew their card's performance would be rather lacking, so they decided to make optimizations to their drivers by decreasing image quality, removing aspects of scenery, etc. They were basically forced to do this in order to keep sales going while they wrote a compiler which would ultimately convert the DX9 code into their own. This compiler was introduced with the 'Forceware' drivers. However, there are still optimizations, and the image quality isn't on par with that of ATI's Radeons, as was shown with Futuremark's last patch to 3DMark03.

Yes, there are other aspects involved as well. The NV30 (nvidia's FX core) only utilizes either FP16 or FP32. It cannot do FP24 like ATI's R300. In FP32 its performance drops a significant amount. The design just isn't capable of producing high fps in FP32 mode. This is partly the reason why Valve had to write a special backend specifically for the FX cards in order to get decent fps in HL2. ATI's cards run it perfectly fine without any special code. It is the same for all current and future DX9 games.

Nvidia has no one to blame but themselves. I'm sure their next product, the NV40, will fix these problems, as they wouldn't dare commit the same mistakes twice. But until then, my advice would be to stay away from the FX cards. If you must go with nvidia then get a Geforce 4 Ti line, otherwise get a Radeon 9500 or above.
I'm not saying nVidia is better! nVidia stinks! But for all I know, u have only been showing tests which gave nVidia the advantage (HL2 barely supports nVidia)

so if u want to show how ATI killed nVidia, at least be fair

_John_Lennon_
01-05-2004, 03:14 AM
Originally posted by GCNaddict@4 January 2004 - 21:54
, u have only been showing tests which gave nVidia the advantage (HL2 barely supports nVidia).


Who says that the games have to cater to the Video card makers? I think it should be the other way around.

adamp2p
01-05-2004, 04:19 AM
Originally posted by GCNaddict@5 January 2004 - 03:54
Originally posted by adamp2p@5 January 2004 - 01:05
Originally posted by GCNaddict@5 January 2004 - 01:19


i must point out that that game is optimized for ATI only

all the (good) games such as UT2003 and Tron 2.0 are optimized for both cards, and the results are almost equal in those benchmarks

yet again, xbitlabs screwed up

i cannot trust xbitlabs in tests anymore
What are you talking about?

Once again, I will repost what you obviously don't know or understand:


I've gone over this a million times. Half Life 2 is not the only game the FX cards do poorly in. Everyone refers to HL2 simply because it is one of the most, if not the most, popular and highly anticipated games to come in a long while. It also heavily uses DX9. If you ever get a chance to ask Gabe Newell, founder of Valve software, I'm sure he would tell you about the tricks nvidia was willing to go through with and why he himself owns a 9800.

If you've been following the graphics scene at all the past year, you would know nvidia's architecture used in their FX line of products was not developed around the DX9 standard. They chose to develop their own code because at the time ATI was not in the picture. An nvidia card was in about 80% - 90% of all gaming computers, and because of this they thought Microsoft would cave in to their demands. However, ATI emerged out of nowhere and released the 9700 Pro, which proved to be nearly twice as fast as the Geforce 4 Tis. Since the core in these 9700s was developed around MS's DX9 standard, MS decided not to go with nvidia's code. Nvidia did not show up at any of the meetings and was thus not in the loop of what was happening. By the time they figured it out, it was already too late and would have required they go back to the drawing board. Rather than do that and waste countless amounts of time and money, they just decided to release what they had and use their PR machine to continue what it did best. This is when the optimizations came. They knew their card's performance would be rather lacking, so they decided to make optimizations to their drivers by decreasing image quality, removing aspects of scenery, etc. They were basically forced to do this in order to keep sales going while they wrote a compiler which would ultimately convert the DX9 code into their own. This compiler was introduced with the 'Forceware' drivers. However, there are still optimizations, and the image quality isn't on par with that of ATI's Radeons, as was shown with Futuremark's last patch to 3DMark03.

Yes, there are other aspects involved as well. The NV30 (nvidia's FX core) only utilizes either FP16 or FP32. It cannot do FP24 like ATI's R300. In FP32 its performance drops a significant amount. The design just isn't capable of producing high fps in FP32 mode. This is partly the reason why Valve had to write a special backend specifically for the FX cards in order to get decent fps in HL2. ATI's cards run it perfectly fine without any special code. It is the same for all current and future DX9 games.

Nvidia has no one to blame but themselves. I'm sure their next product, the NV40, will fix these problems, as they wouldn't dare commit the same mistakes twice. But until then, my advice would be to stay away from the FX cards. If you must go with nvidia then get a Geforce 4 Ti line, otherwise get a Radeon 9500 or above.
I'm not saying nVidia is better! nVidia stinks! But for all I know, u have only been showing tests which gave nVidia the advantage (HL2 barely supports nVidia)

so if u want to show how ATI killed nVidia, at least be fair

And next time, GCNaddict, I would suggest you do a bit more research before mouthing off about what you do not know. Sorry if I sound harsh.

sparsely
01-05-2004, 04:57 AM
computer games?

they make those?!

:o

adamp2p
01-05-2004, 05:23 AM
Originally posted by Sparsely@5 January 2004 - 05:57
computer games?

they make those?!

:o
:lol:

DWk
01-05-2004, 05:49 AM
Originally posted by adamp2p@4 January 2004 - 18:05

I've gone over this a million times. Half Life 2 is not the only game the FX cards do poorly in. Everyone refers to HL2 simply because it is one of the most, if not the most, popular and highly anticipated games to come in a long while. It also heavily uses DX9. If you ever get a chance to ask Gabe Newell, founder of Valve software, I'm sure he would tell you about the tricks nvidia was willing to go through with and why he himself owns a 9800.

If you've been following the graphics scene at all the past year, you would know nvidia's architecture used in their FX line of products was not developed around the DX9 standard. They chose to develop their own code because at the time ATI was not in the picture. An nvidia card was in about 80% - 90% of all gaming computers, and because of this they thought Microsoft would cave in to their demands. However, ATI emerged out of nowhere and released the 9700 Pro, which proved to be nearly twice as fast as the Geforce 4 Tis. Since the core in these 9700s was developed around MS's DX9 standard, MS decided not to go with nvidia's code. Nvidia did not show up at any of the meetings and was thus not in the loop of what was happening. By the time they figured it out, it was already too late and would have required they go back to the drawing board. Rather than do that and waste countless amounts of time and money, they just decided to release what they had and use their PR machine to continue what it did best. This is when the optimizations came. They knew their card's performance would be rather lacking, so they decided to make optimizations to their drivers by decreasing image quality, removing aspects of scenery, etc. They were basically forced to do this in order to keep sales going while they wrote a compiler which would ultimately convert the DX9 code into their own. This compiler was introduced with the 'Forceware' drivers. However, there are still optimizations, and the image quality isn't on par with that of ATI's Radeons, as was shown with Futuremark's last patch to 3DMark03.

Yes, there are other aspects involved as well. The NV30 (nvidia's FX core) only utilizes either FP16 or FP32. It cannot do FP24 like ATI's R300. In FP32 its performance drops a significant amount. The design just isn't capable of producing high fps in FP32 mode. This is partly the reason why Valve had to write a special backend specifically for the FX cards in order to get decent fps in HL2. ATI's cards run it perfectly fine without any special code. It is the same for all current and future DX9 games.

Nvidia has no one to blame but themselves. I'm sure their next product, the NV40, will fix these problems, as they wouldn't dare commit the same mistakes twice. But until then, my advice would be to stay away from the FX cards. If you must go with nvidia then get a Geforce 4 Ti line, otherwise get a Radeon 9500 or above. And next time, GCNaddict, I would suggest you do a bit more research before mouthing off about what you do not know. Sorry if I sound harsh.
lol it's kinda funny adam how u post the same thing and just include a different name..... really shows your creativity and your skills as a copy-paste guy ;)

BTW..... I guess you like dissing on nvidia... but I think the "raped" status you gave it isn't just a bit harsh.... but is something dumb.

and another thing.... talking about benchmarking with a game which isn't out -hl2- (or doesn't even have an actual DEMO out) is a little.... stupid.... how do you base your results, exactly?

adamp2p
01-05-2004, 06:00 AM
Well, if you had been a bit *sharper* (no offense; I can tell from your content and grammar usage that you must be a regular Einstein ;) ), you might have noticed that I mentioned it was in fact a Neowinian who first said this to another Neowinian:


I've gone over this a million times. Half Life 2 is not the only game the FX cards do poorly in. Everyone refers to HL2 simply because it is one of the most, if not the most, popular and highly anticipated games to come in a long while. It also heavily uses DX9. If you ever get a chance to ask Gabe Newell, founder of Valve software, I'm sure he would tell you about the tricks nvidia was willing to go through with and why he himself owns a 9800.

If you've been following the graphics scene at all the past year, you would know nvidia's architecture used in their FX line of products was not developed around the DX9 standard. They chose to develop their own code because at the time ATI was not in the picture. An nvidia card was in about 80% - 90% of all gaming computers, and because of this they thought Microsoft would cave in to their demands. However, ATI emerged out of nowhere and released the 9700 Pro, which proved to be nearly twice as fast as the Geforce 4 Tis. Since the core in these 9700s was developed around MS's DX9 standard, MS decided not to go with nvidia's code. Nvidia did not show up at any of the meetings and was thus not in the loop of what was happening. By the time they figured it out, it was already too late and would have required they go back to the drawing board. Rather than do that and waste countless amounts of time and money, they just decided to release what they had and use their PR machine to continue what it did best. This is when the optimizations came. They knew their card's performance would be rather lacking, so they decided to make optimizations to their drivers by decreasing image quality, removing aspects of scenery, etc. They were basically forced to do this in order to keep sales going while they wrote a compiler which would ultimately convert the DX9 code into their own. This compiler was introduced with the 'Forceware' drivers. However, there are still optimizations, and the image quality isn't on par with that of ATI's Radeons, as was shown with Futuremark's last patch to 3DMark03.

Yes, there are other aspects involved as well. The NV30 (nvidia's FX core) only utilizes either FP16 or FP32. It cannot do FP24 like ATI's R300. In FP32 its performance drops a significant amount. The design just isn't capable of producing high fps in FP32 mode. This is partly the reason why Valve had to write a special backend specifically for the FX cards in order to get decent fps in HL2. ATI's cards run it perfectly fine without any special code. It is the same for all current and future DX9 games.

Nvidia has no one to blame but themselves. I'm sure their next product, the NV40, will fix these problems, as they wouldn't dare commit the same mistakes twice. But until then, my advice would be to stay away from the FX cards. If you must go with nvidia then get a Geforce 4 Ti line, otherwise get a Radeon 9500 or above. And next time, DirtyLarry, I would suggest you do a bit more research before mouthing off about what you do not know. Sorry if I sound harsh.

Note the name in bold; I keep changing the name in bold.

...Look, DWK, I am presently speaking with my brother in NY and I don't have time to lecture to you... I think you should wisen up a bit if you want to be respected around here, in my opinion... it's not worth my time to compose a unique response to your myopic comments... ;) <_<

DWk
01-05-2004, 02:40 PM
oh boohoo :ghostface:

btw... if you think that having respect means just knowing things about ONE thing (ati), and bashing on its direct competitor (nvidia)...gl with the rest of your life

you try to "lecture to me" by telling me to "wisen up"....yet all you can do is bash against a brand that isnt what you put it for...

wisen up, homey..


ps: nice job on taking another man's words and making them ur own (not to mention repeating them so constantly....) this really shows your true "wisdom" ;)

GCNaddict
01-05-2004, 05:57 PM
Originally posted by DWk@5 January 2004 - 14:40
oh boohoo :ghostface:

btw... if you think that having respect means just knowing things about ONE thing (ati), and bashing on its direct competitor (nvidia)...gl with the rest of your life

you try to "lecture to me" by telling me to "wisen up"....yet all you can do is bash against a brand that isnt what you put it for...

wisen up, homey..


ps: nice job on taking another man's words and making them ur own (not to mention repeating them so constantly....) this really shows your true "wisdom" ;)
good call

@Adam: I am a fan of ati (as they created the "flipper" chip in the GameCube) but I stand equal with both companies: they both have advantages in their own games (ATI : Half Life 2 :: nVidia : Doom3 [neither is out yet]). Next time, don't be so biased against one company. you're being like the loser who thot that having an AMD Processor is a sign of a h4x0r :lol:

look, if ur going to bash some company, at least use ur own words!

on a sidenote, if u didn't understand what I meant by "the loser who thot that having an AMD Processor is a sign of a h4x0r," ask mad cat, he'll enlighten u

adamp2p
01-05-2004, 07:57 PM
Originally posted by DWk@5 January 2004 - 15:40
oh boohoo :ghostface:

btw... if you think that having respect means just knowing things about ONE thing (ati), and bashing on its direct competitor (nvidia)...gl with the rest of your life

you try to "lecture to me" by telling me to "wisen up"....yet all you can do is bash against a brand that isnt what you put it for...

wisen up, homey..


ps: nice job on taking another man's words and making them ur own (not to mention repeating them so constantly....) this really shows your true "wisdom" ;)
Like I told you before, I am stupid. ;)

adamp2p
01-05-2004, 07:58 PM
Originally posted by GCNaddict@5 January 2004 - 18:57
Originally posted by DWk@5 January 2004 - 14:40
oh boohoo :ghostface:

btw... if you think that having respect means just knowing things about ONE thing (ati), and bashing on its direct competitor (nvidia)...gl with the rest of your life

you try to "lecture to me" by telling me to "wisen up"....yet all you can do is bash against a brand that isnt what you put it for...

wisen up, homey..


ps: nice job on taking another man's words and making them ur own (not to mention repeating them so constantly....) this really shows your true "wisdom" ;)
good call

@Adam: I am a fan of ati (as they created the "flipper" chip in the GameCube) but I stand equal with both companies: they both have advantages in their own games (ATI : Half Life 2 :: nVidia : Doom3 [neither is out yet]). Next time, don't be so biased against one company. you're being like the loser who thot that having an AMD Processor is a sign of a h4x0r :lol:

look, if ur going to bash some company, at least use ur own words!

on a sidenote, if u didn't understand what I meant by "the loser who thot that having an AMD Processor is a sign of a h4x0r," ask mad cat, he'll enlighten u
You and DWK... :frusty:

GCNaddict
01-05-2004, 08:01 PM
Originally posted by adamp2p@5 January 2004 - 19:58
Originally posted by GCNaddict@5 January 2004 - 18:57
Originally posted by DWk@5 January 2004 - 14:40
oh boohoo :ghostface:

btw... if you think that having respect means just knowing things about ONE thing (ati), and bashing on its direct competitor (nvidia)...gl with the rest of your life

you try to "lecture to me" by telling me to "wisen up"....yet all you can do is bash against a brand that isnt what you put it for...

wisen up, homey..


ps: nice job on taking another man's words and making them ur own (not to mention repeating them so constantly....) this really shows your true "wisdom" ;)
good call

@Adam: I am a fan of ati (as they created the "flipper" chip in the GameCube) but I stand equal with both companies: they both have advantages in their own games (ATI : Half Life 2 :: nVidia : Doom3 [neither is out yet]). Next time, don't be so biased against one company. you're being like the loser who thot that having an AMD Processor is a sign of a h4x0r :lol:

look, if ur going to bash some company, at least use ur own words!

on a sidenote, if u didn't understand what I meant by "the loser who thot that having an AMD Processor is a sign of a h4x0r," ask mad cat, he'll enlighten u
You and DWK... :frusty:
shut your shit (I love saying that) and stop defending ur ass when u screw up! be like atividia who ran away when he shat on himself

adamp2p
01-05-2004, 08:16 PM
I have not sworn allegiance to this particular board, nor do I plan to. I see that you have somehow associated some sort of success with post count here. That is very interesting, and I have seen it happen before.

People with no direction in their lives often latch onto any little thing they can because they find meaning and purpose in it. Great; so you have found direction for your life posting on some dinky internet message board. :lol:

Virtualbody1234
01-05-2004, 08:20 PM
This has turned into a bashing each other session. Please stop.

DWk
01-05-2004, 08:54 PM
10-4 VB....

however I have to say one more thing.... adam, what makes u so good in ur life so u can come and tell any1 here what is your assumption of every1's life?

facts first, homey

GCNaddict
01-05-2004, 10:21 PM
Originally posted by Virtualbody1234@5 January 2004 - 20:20
This has turned into a bashing each other session. Please stop.
i agree

Originally posted by DWk@5 January 2004 - 20:54
10-4 VB....

however I have to say one more thing.... adam, what makes u so good in ur life so u can come and tell any1 here what is your assumption of every1's life?

facts first, homey
i agree with that too :lol:

adamp2p
01-05-2004, 11:03 PM
Originally posted by GCNaddict@5 January 2004 - 01:19


all the (good) games such as UT2003 and Tron 2.0 are optimized for both cards, and the results are almost equal in those benchmarks


http://www.xbitlabs.com/images/video/evga-5950ultra/tron.gif

A picture speaks a thousand words.

:)


The graphics card from EVGA gives up before RADEON 9800 XT in this test, although the game itself was developed under the NVIDIA’s slogan “The way it’s meant to be played” (i.e. the game is sharpened specifically for NVIDIA’s GPUs). There is no great speed difference between “quality” and “fast” anisotropic filtering modes, as there are no large amounts of “heavy” textures. Unlike Unreal Tournament 2003, it doesn’t require high texturing speed.

source (http://www.xbitlabs.com/articles/video/display/evga-5950ultra_8.html)

http://www.xbitlabs.com/images/video/evga-5950ultra/tron-screen.jpg

Oh, and you will love this one: the only way that the 5950 U can even compete with the 9800 XT is when the 5950 U is overclocked to the "XTreme." I love that. So what were you saying again about Tron and UT? I think I missed it. Could you clarify please?

:lol:

http://www.xbitlabs.com/images/video/evga-5950ultra/ut-screen.jpg

http://www.xbitlabs.com/images/video/evga-5950ultra/ut.gif

:)


Both cards enjoy a certain performance gain (25-30%) when we use “fast” rather than “quality” AF. The graphics memory bandwidth influences the results in higher resolutions more, so we have a smaller gain there, about 20-25%.

Our extreme overclocking of the EVGA e-GeForce FX 5950 Ultra provides a nice additional performance gain of 15-20%. As a result, this card is the winner in this test, although when working at nominal frequencies it goes neck and neck with the RADEON.

:lol:

"Hold on, there; are you saying that the 9800 XT beats the 5950 U at nominal frequencies and the only way the 5950 U delivers faster framerates than the XT is when the 5950 U is overclocked and the XT is not?"

"Yes." :) ;) :D :P :huh: