
View Full Version : Nvidia Accused Of Cheating Again?



Pitbul
04-23-2004, 01:37 PM
http://www.theinquirer.net/?article=15502

No difference, or nothing much noticeable, between PS 2.0 and PS 3.0?


"ATI IS MAKING hay while the sun shines after an article at Driverheaven more or less accused Nvidia of cheating. We referenced the web site article earlier today in our hardware roundup.
It's also worth looking here for background on this matter.

We said that what we saw when we were in Geneva with Nvidia was a clear difference between the code running the PS 3.0 path and a reference system running PS 2.0 code.

ATI says that CryTek's representative told it that what Nvidia showed us in Geneva was a 2.0/3.0 path versus 1.1 path.

The key message is that Shader model 3.0 and 2.0 look exactly the same, the ATI representative added.

In a developers' mail that we also received, it added that there is a possibility that the 3.0 path might be slightly faster in some cases on some hardware - an obvious reference to Nvidia.

ATI also claims that its hardware will run faster than Nvidia's anyway, adding: "It's pretty much impossible to make a sm3.0 game look noticeably different to a sm2.0 game, which is why Nvidia was comparing the 2.0/3.0 path with a 1.1 path."

A CryTek representative responded on this matter with this answer: "Was Nvidia showing SM3.0 vs. SM2.0 or SM1.1?" He replied to his own question by saying that Nvidia was showing 3.0/2.0 vs. 1.1.

So the ball is now in Nvidia's half of the court, and I am the Bosnian net over which the balls are flying.
"

Methinks someone just got bitten in the ass for trying to twist words ::Cough:: nVidia and Crytek ::Cough::

kaiweiler
04-23-2004, 01:53 PM
haha sucks for nVidia!
another step forward for ATI though
I think within the next little bit, a lot of nVidia users will be switching to the almighty ATI! :01:

adamp2p
04-23-2004, 03:13 PM
Originally posted by kaiweiler@23 April 2004 - 05:53
haha sucks for nVidia!
another step forward for ATI though
I think within the next little bit, a lot of nVidia users will be switching to the almighty ATI! :01:
But do you know the numbers? 4 out of 5 high-end graphics cards ($150-$400 USD) sold are ATi cards. In other words, Nvidia sells significantly more low-end product than ATi does.

Another thing: ATi has higher profit margins, so they earn more per card sold than Nvidia. That only makes sense, because Nvidia has to use much higher-clocked DDR2/DDR3 memory (on cards like the 5700 Ultra) to compete with ATi's current offerings. For example, the current leader in graphics performance is the 9800XT, which uses DDR1 and still beats the much higher-clocked 5950 Ultra.

:rolleyes:

adamp2p
04-23-2004, 03:30 PM
here is the article that compares the images and shows nvidia cheating (http://www.driverheaven.net/articles/driverIQ/)

Skillian
04-23-2004, 04:53 PM
Well, that's pretty much confirmed that the R420 won't support PS 3.0 then ;)

atiVidia
04-23-2004, 04:53 PM
lol i care about framerates.

plus we can never really know who the true winner is until games supporting the hardware are released and both cards are benchmarked under DX9.0c



also: it could be a driver issue. nvidia has always had problems with drivers...

lynx
04-23-2004, 05:03 PM
Let's see if I've got this right.

100+ fps, a speed which the human eye can't process, and they spot 1 frame which is out of spec.

Oh, and who are the accusers? Oh yes, ATI.

Do me a favour, it's absolute bullshit.

RGX
04-23-2004, 05:59 PM
Originally posted by atiVidia@23 April 2004 - 16:53
lol i care about framerates.

plus we can never really know who the true winner is until games supporting the hardware are released and both cards are benchmarked under DX9.0c



also: it could be a driver issue. nvidia has always had problems with drivers...
Personally, I care about image quality. I can get quake to run at 500 FPS on an MX440 FFS. :D

adamp2p
04-23-2004, 07:29 PM
Originally posted by lynx@23 April 2004 - 09:03
Let's see if I've got this right.

100+ fps, a speed which the human eye can't process, and they spot 1 frame which is out of spec.

Oh, and who are the accusers? Oh yes, ATI.

Do me a favour, it's absolute bullshit.
Sorry to correct you lynx, but I have to:

Driverheaven is the one who published these results.

If you take a moment (actually a few minutes) to read and interact with the entire article, including the flash demos and uncompressed images, you will see several instances where Nvidia's hardware produced images that are still not up to par with what currently available ATi hardware can produce.

What can we conclude from this? Well, a couple of things, that's for sure.

What does this mean in a game? Well, for one thing, if you are getting 100 frames per second on a card that cost you over $400 USD, I am sure you would want every frame to be high quality.

In general, higher framerates of lower-quality images do not equate to a better gaming experience (unless you are colorblind). So what? Well, here is the real issue: when ATi releases its next-generation hardware in a couple of weeks, and it can produce framerates equal to Nvidia's offerings (possibly a few fps above or below), folks will be sure to be looking at image quality.

Nvidia has been hammered recently by the gaming community ([H]ardOCP, Anandtech, etc.) over its hardware producing image quality worse than ATi's. So you can be sure folks will be looking and comparing this again.
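For anyone wondering what an image-quality comparison like Driverheaven's actually measures: at bottom it is a per-pixel diff of a card's rendered frame against a reference render. A minimal sketch of the idea in Python - the frame data, tolerance value, and function name here are made up for illustration, not taken from the article:

```python
# Per-pixel comparison of a rendered frame against a reference
# rasterizer frame -- the basic idea behind image-quality articles.
# Frames are lists of rows of (R, G, B) tuples, values 0-255.

def diff_frames(frame, reference, tolerance=8):
    """Return the fraction of pixels whose largest per-channel
    error exceeds `tolerance` (out of 255)."""
    assert len(frame) == len(reference), "frames must match in size"
    bad = total = 0
    for row, ref_row in zip(frame, reference):
        for (r, g, b), (rr, rg, rb) in zip(row, ref_row):
            total += 1
            if max(abs(r - rr), abs(g - rg), abs(b - rb)) > tolerance:
                bad += 1
    return bad / total

# Two tiny 2x2 "frames": one pixel is rendered noticeably darker
# than the reference, the other three match exactly.
reference = [[(200, 200, 200), (10, 10, 10)],
             [(90, 40, 40), (255, 255, 255)]]
rendered  = [[(200, 200, 200), (10, 10, 10)],
             [(60, 40, 40), (255, 255, 255)]]

print(diff_frames(rendered, reference))  # 1 of 4 pixels off -> 0.25
```

The real articles use full screenshots and flash overlays rather than a numeric score, but the question being asked is the same: how far, and where, does the card's output drift from the reference rasterizer.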

Pitbul
04-23-2004, 08:21 PM
Originally posted by lynx@23 April 2004 - 10:03
Let's see if I've got this right.

100+ fps, a speed which the human eye can't process, and they spot 1 frame which is out of spec.

Oh, and who are the accusers? Oh yes, ATI.

Do me a favour, it's absolute bullshit.
LMAO! How many FPS the eye can see depends on the person. Different people can see the difference between different framerates, but that's irrelevant anyway.

Driverheaven is making the claims, not ATI, you jerkass. The only thing ATI has said so far is that there is little if any difference in picture quality between PS 2.0 and PS 3.0, and so far we haven't seen any proof either way because we haven't seen a side-by-side picture of a game running PS 2.0 and PS 3.0.

hehe your whole post screams out "nVidia Fanboy"

atiVidia
04-23-2004, 08:23 PM
Originally posted by adamp2p@23 April 2004 - 14:29
Originally posted by lynx@23 April 2004 - 09:03
Let's see if I've got this right.

100+ fps, a speed which the human eye can't process, and they spot 1 frame which is out of spec.

Oh, and who are the accusers? Oh yes, ATI.

Do me a favour, it's absolute bullshit.
Sorry to correct you lynx, but I have to:

Driverheaven is the one who published these results.

If you take a moment (actually a few minutes) to read and interact with the entire article, including the flash demos and uncompressed images, you will see several instances where Nvidia's hardware produced images that are still not up to par with what currently available ATi hardware can produce.

What can we conclude from this? Well, a couple of things, that's for sure.

What does this mean in a game? Well, for one thing, if you are getting 100 frames per second on a card that cost you over $400 USD, I am sure you would want every frame to be high quality.

In general, higher framerates of lower-quality images do not equate to a better gaming experience (unless you are colorblind). So what? Well, here is the real issue: when ATi releases its next-generation hardware in a couple of weeks, and it can produce framerates equal to Nvidia's offerings (possibly a few fps above or below), folks will be sure to be looking at image quality.

Nvidia has been hammered recently by the gaming community ([H]ardOCP, Anandtech, etc.) over its hardware producing image quality worse than ATi's. So you can be sure folks will be looking and comparing this again.
sorry to correct you adam, but Driverheaven is practically paid by ATI to fu<k Nvidia up the @ss.

and don't wisecrack your bull in my face trying to win more ATI converts. let the companies do that, not the customers.

hell, we haven't seen any reviews of ATI's new cards yet, have we???

HELL NO! so keep your smart-sh_t talking to yourself and stop posting one-sided topics in order to win ATI converts, you low-life ATI employee!


anyone who advertises for a company is effectively an employee of that company, even if they are not being paid. they are bringing in profit for the company, and thus are employees.

if you want to advertise and not get flamed by other topic surfers, do it for BOTH major companies, not just the one you *probably* work for.

now L-T-T-F-A, please.


oh yeah, and also: BULL SH_T! frames which are not up to par with ATI's current solutions??? your eye has been trained, if not hypnotized, to only accept the screens coming from ATI's cards. hell, we could just as easily accuse ATI of having screens that are off compared to Nvidia's. HELL, WE CAN SAY NVIDIA IS BETTER than ATI because several of its distributors offer KICKASS LIFETIME WARRANTIES (24/7, 365 days from BFGtech, amazing rebate offers, and cards which are 20 bucks under retail).

btw, people who play online generally set everything they can to the lowest (1600x1200 UT2004 with no FSAA and no AF, and I've got a kickass record). people play their games online nowadays. they need high framerates so they can slaughter their opponents without lag.

lol, I'd love to play my games at 2048 x 1536



oh yes, and like lynx says, I am an nVidia fanboy, because ATI can't f__king wire their hardware properly to work on every goddamned machine, unlike Nvidia cards, which many people, if not everyone, have no hardware problems with.

and another thing: I see nothing wrong with anything run on ATI's or on nVidia's hardware. and don't try calling my eye untrained. it's probably better trained than the ATI-hijacked eye you're running on. I must admit that the FX series really did suck, but I find no flaws in quality whatsoever with either nVidia or ATI (counting the newest cards).

adamp2p
04-23-2004, 08:39 PM
Did I hear something? Nope. That's what I thought....

lynx
04-23-2004, 08:45 PM
Originally posted by Pitbul@23 April 2004 - 20:21
Originally posted by lynx@23 April 2004 - 10:03
Let's see if I've got this right.

100+ fps, a speed which the human eye can't process, and they spot 1 frame which is out of spec.

Oh, and who are the accusers? Oh yes, ATI.

Do me a favour, it's absolute bullshit.
LMAO! How many FPS the eye can see depends on the person. Different people can see the difference between different framerates, but that's irrelevant anyway.

Driverheaven is making the claims, not ATI, you jerkass. The only thing ATI has said so far is that there is little if any difference in picture quality between PS 2.0 and PS 3.0, and so far we haven't seen any proof either way because we haven't seen a side-by-side picture of a game running PS 2.0 and PS 3.0.

hehe your whole post screams out "nVidia Fanboy"
The human eye can actually process about 7 (SEVEN) frames per second. Of course we need video with more frames than that, so that a new image is ready when the eye is ready, otherwise we detect flicker. But even then the eye can't detect any difference above about 30fps. Learn something about basic human physiology before spouting crap.

As for your nVidia fanboy comment, I've only ever had one nVidia card (my current one) but I've had plenty of ATI cards in the past. As atiVidia says, Driverheaven aren't exactly impartial, so who's the fanboy now?

And in any case, look PROPERLY at the 3DMark03 test 2 images and you will see that there are parts which only the nVidia card reproduces correctly; there are bits of the image missing from the ATI image and from the RASTER GRAPHICS IMAGE. But the raster graphics image is supposedly perfect, so how can there be parts missing? Trick of the light perhaps? Tricks by Driverheaven more likely.

I repeat my claim - it is all utter bullshit.

adamp2p
04-23-2004, 10:10 PM
Originally posted by lynx@23 April 2004 - 12:45

I repeat my claim - it is all utter bullshit.
We will just have to wait and see about that, no?

Dray_04
04-23-2004, 11:42 PM
lol

funny conversation....

ppl argue'n over which company is better. :lol:

I can't say that one company is better than the other cause quite honestly I don't know... and neither do you guys... unless you are an official who has been hired by both companies.

bigdawgfoxx
04-24-2004, 12:47 AM
lets just wait for the cards..and benchmarks from tomshardware or anandtech :)

adamp2p
04-24-2004, 01:23 AM
Originally posted by bigdawgfoxx@23 April 2004 - 16:47
lets just wait for the cards..and benchmarks from tomshardware or anandtech :)
My personal favorite sites for hardware review are (in order of preference, of course):

#1) Anandtech
#2) [H]ardOCP
#3) Tom's Hardware

:)

atiVidia
04-24-2004, 03:30 AM
Originally posted by adamp2p@23 April 2004 - 20:23
Originally posted by bigdawgfoxx@23 April 2004 - 16:47
lets just wait for the cards..and benchmarks from tomshardware or anandtech :)
My personal favorite sites for hardware review are (in order of preference, of course):

#1) Anandtech
#2) [H]ardOCP
#3) Tom's Hardware

:)
what happened to Xbit Labs? you were bragging about how they wrote the best reviews not too long ago... are their reviews too fair for you to accept? <_<

yes, you heard the wind b_tching at you earlier. the wind is an nvidia fanboy...


d00d, no marketing on the forums: it's not a smart business practice...



that will be one of the first things to change when atiVidia is formed: a corp. with image quality + framerates rolled into good business tactics.

lynx
04-24-2004, 11:46 AM
Personally I have no preference between the companies, both have their strengths and weaknesses.

What I DO object to is reviews which are inexcusably biased towards one manufacturer. If they want to produce an advertisement let them do that and be honest about it, rather than trying to slag off the opposition through a third party.

I haven't heard any comments about the areas missing from both the raster image and the ATI image - why is the same bit missing from both images? The nVidia image shows extra detail (which is quite extensive) that they couldn't have just made up. Alternatively, we could be looking at different frames, which would make the comparison meaningless. The poor IQ claimed by the article may be exactly the same on the other two sources if we were looking at the same frame.

lynx
05-10-2004, 10:49 PM
Trilinear Filtering - Colored Mipmaps

Here's what Microsoft has to say about it:

Quote: "The DX9 reference rasterizer does not produce an ideal result for level-of-detail computation for isotropic filtering, the algorithm used in NV40 produces a higher quality result. Remember, our API is constantly evolving as is graphics hardware, we will continue to improve all aspects of our API including the reference rasterizer."

Source (http://www.tomshardware.com/graphic/20040504/ati-x800-35.html)

Anyone still going to claim that Nvidia were cheating?
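For context on what that Microsoft quote means: trilinear filtering blends the two nearest mipmap levels, chosen by a level-of-detail (LOD) computation, and the reference rasterizer and NV40 simply approximate that LOD differently, so pixel-for-pixel mismatches against the reference are not automatically cheating. A rough Python sketch of the textbook computation - the footprint value and function names are illustrative assumptions, not the actual NV40 or refrast algorithm:

```python
import math

def mip_lod(footprint_texels):
    """Level-of-detail for isotropic filtering: log2 of how many
    texels the pixel's footprint covers along its longest axis."""
    return max(0.0, math.log2(footprint_texels))

def trilinear_weights(lod):
    """Trilinear filtering blends the two nearest mip levels;
    returns (lower_level, upper_level, blend_fraction)."""
    lower = math.floor(lod)
    return lower, lower + 1, lod - lower

# A pixel whose footprint spans ~6 texels sits between mip level 2
# (4 texels) and mip level 3 (8 texels), closer to level 3.
lod = mip_lod(6.0)
low, high, frac = trilinear_weights(lod)
print(low, high, round(frac, 3))
```

Because the LOD is a real number that hardware only approximates, two correct implementations can land on slightly different blend fractions for the same pixel, which is exactly why colored-mipmap screenshots can differ without either vendor "cheating".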

Marius24
05-10-2004, 11:04 PM
Originally posted by lynx@10 May 2004 - 22:57
Trilinear Filtering - Colored Mipmaps

Here's what Microsoft has to say about it:

Quote: "The DX9 reference rasterizer does not produce an ideal result for level-of-detail computation for isotropic filtering, the algorithm used in NV40 produces a higher quality result. Remember, our API is constantly evolving as is graphics hardware, we will continue to improve all aspects of our API including the reference rasterizer."

Source (http://www.tomshardware.com/graphic/20040504/ati-x800-35.html)

Anyone still going to claim that Nvidia were cheating?
I have no idea what the big deal is about these two companies. I can hardly tell the difference between 60 and 100 fps, so I have no favorite. I can play any game and still be satisfied with it whatever the performance of my (shit) system. I'm not biased to either company after owning hardware from both. However, I found that my Nvidia card was less hassle (maybe because it was a simple card). :)

Just thought i would share my thoughts. :lol:

atiVidia
05-10-2004, 11:13 PM
i say nvidia is less of a hassle cuz their cards actually work with my system!

Illuminati
05-10-2004, 11:34 PM
Originally posted by Marius24@11 May 2004 - 00:12
Originally posted by lynx@10 May 2004 - 22:57
Trilinear Filtering - Colored Mipmaps

Here's what Microsoft has to say about it:

Quote: "The DX9 reference rasterizer does not produce an ideal result for level-of-detail computation for isotropic filtering, the algorithm used in NV40 produces a higher quality result. Remember, our API is constantly evolving as is graphics hardware, we will continue to improve all aspects of our API including the reference rasterizer."

Source (http://www.tomshardware.com/graphic/20040504/ati-x800-35.html)

Anyone still going to claim that Nvidia were cheating?
I have no idea what the big deal is about these two companies. I can hardly tell the difference between 60 and 100 fps, so I have no favorite. I can play any game and still be satisfied with it whatever the performance of my (shit) system. I'm not biased to either company after owning hardware from both. However, I found that my Nvidia card was less hassle (maybe because it was a simple card). :)

Just thought I would share my thoughts. :lol:
Those thoughts echo my own :)