Page 2 of 3 - Results 11 to 20 of 23

Thread: Nvidia Accused Of Cheating Again?

#11
atiVidia
Join Date: Dec 2003
Posts: 1,522
Originally posted by adamp2p @ 23 April 2004 - 14:29
(quoting lynx @ 23 April 2004 - 09:03)
Let's see if I've got this right.

100+ fps, a speed which the human eye can't process, and they spot 1 frame which is out of spec.

Oh, and who are the accusers? Oh yes, ATI.

Do me a favour, it's absolute bullshit.
Sorry to correct you lynx, but I have to:

Driverheaven is the one who published these results.

If you take a moment (actually a few minutes) to read and interact with the entire article including the flash demos, and uncompressed images, you will see several instances where Nvidia's hardware produced images that are still not up to par with what currently available ATi hardware can produce.

What can we conclude from this? Well, a couple of things, that's for sure.

What this means in a game? Well for one thing, if you are getting 100 frames per second on a card that cost you over $400 USD, I am sure you would want every frame to be high quality.

In general, getting higher framerates of lower quality images does not equate to a better gaming experience (unless you are colorblind). So what? Well here is the real issue: when ATi releases its next generation hardware in a couple of weeks, and it can produce framerates at or equal (and possibly above or below by a few fps) the Nvidia offerings, folks will be sure to be looking at the image quality.

Nvidia has been hammered recently by the gaming community ([H]ardOCP, Anandtech, etc.) due to its hardware producing image quality less than ATi's hardware. So you can be sure folks will be looking and comparing this again.
    sorry to correct you adam, but driver heaven is practically paid by ATI to fu<k nvidia up the @ss.

    and dont wisecrack your bull in my face trying to achieve more ati converts. let the companies do that, not the customers.

    hell, we havent seen any reviews from ati yet, have we???

HELL NO! so keep your smart-sh_t talking to yourself and stop posting one-sided topics in order to achieve ATI converts you low-lifed ATI employee!


    any1 who advertises for any company is an employee of that company, even if they are not being paid. they are bringing in profit for the company, and thus are employees.

    if u want to advertise and not be raped by other topic surfers, do it for BOTH major companies, not just the one who you *probably* work for.

    now L-T-T-F-A, please.


oh yea and also, BULL SH_T! frames which are not up to par with ATI's current solutions??? ur eye has been trained, if not hypnotized, to only accept the screens coming from ATI's cards. hell we can easily accuse ATI of having screens that are off compared to nvidia. HELL WE CAN SAY NVIDIA IS BETTER than ATI because several of the distributors offer KICKASS, LIFETIME WARRANTIES (24/7 365 days from BFGtech, amazing rebate offers, and cards which are 20 bucks under retail).

btw ppl who play online generally set anything they can to the lowest (1600x1200 ut2004 with no FSAA and no ASF and ive got a kickass record.) ppl play their games online nowadays. they need high framerates so they can slaughter their opponents w/o lag.

    lol id love to play my games in 2048 x 1536



oh, yes, and like lynx, i am an nVidia fanboy because ATI cant f__king wire their hardware properly to work on every goddamned machine, unlike nvidia cards which many ppl, if not every1, have no problems with in terms of hardware.

and another thing, i see nothing wrong with anything run on ATI's or on nVidia's hardware. and dont try calling my eye untrained. its prolly better trained than the ati-hijacked eye ur running on. i must admit that the fx series really did suck balls, but i find no flaws in quality whatsoever with either nVidia or ATI (counting the newest cards.)

#12
Did I hear something? Nope. That's what I thought....

#13
lynx
Join Date: Sep 2002
Location: Yorkshire, England
Posts: 9,759
Originally posted by Pitbul @ 23 April 2004 - 20:21
(quoting lynx @ 23 April 2004 - 10:03)
Let's see if I've got this right.

100+ fps, a speed which the human eye can't process, and they spot 1 frame which is out of spec.

Oh, and who are the accusers? Oh yes, ATI.

Do me a favour, it's absolute bullshit.
LMAO!!!, how many FPS the eye can see depends on the person. Different people can see the difference between different FPS, but still irrelevant.

Driverheaven is making the claims, not ATI you jerkass, the only thing ATI has said so far is that there is not much if any difference between the picture quality between PS2.0 and PS3.0, and so far we haven't seen any proof of this being true or false because we haven't seen a side by side picture of a game running PS2.0 and PS3.0.

hehe your whole post screams out "nVidia Fanboy"
The human eye can actually process about 7 (SEVEN) frames per second. Of course we need videos with more than that so that there is a new image ready when the eye is ready, otherwise we detect flicker. But even then the eye can't detect any difference above about 30fps. Learn something about basic human physiology before spouting crap.

As for your nVidia fanboy comment, I've only ever had one nVidia card (my current one) but I've had plenty of ATI cards in the past. As atiVidia says, Driverheaven aren't exactly impartial, so who's the fanboy now?

And in any case, look PROPERLY at the 3DMark03 test 2 images and you will see that there are parts which only the nVidia card reproduces correctly; there are bits of the image missing on the ATI image and ON THE RASTER GRAPHICS IMAGE. But the raster graphics image is supposedly perfect, so how can there be parts missing? Trick of the light perhaps? Tricks by Driverheaven more likely.
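
For anyone who would rather check this than argue about it, the published screenshots can be diffed pixel by pixel instead of eyeballed. Below is a minimal sketch in Python using the Pillow library, assuming the reference-rasterizer and card screenshots are saved locally (the file names are placeholders, not ones from the article):

[CODE]
# Minimal sketch: per-pixel diff of two screenshots, e.g. the DX9
# reference rasterizer's frame vs. a card's frame. File names are
# hypothetical placeholders.
from PIL import Image, ImageChops

ref = Image.open("refrast_frame.png").convert("RGB")
card = Image.open("card_frame.png").convert("RGB")

assert ref.size == card.size, "screenshots must be the same resolution"

diff = ImageChops.difference(ref, card)   # absolute per-channel difference
bbox = diff.getbbox()                     # None means the images are identical

if bbox is None:
    print("Images are identical")
else:
    # Count pixels that differ by more than a small threshold, so image
    # compression noise isn't mistaken for missing geometry.
    threshold = 8
    mismatched = sum(1 for px in diff.getdata() if max(px) > threshold)
    total = ref.size[0] * ref.size[1]
    print(f"{mismatched} of {total} pixels differ noticeably, in region {bbox}")
    diff.save("diff_frame.png")           # inspect the differences visually
[/CODE]

A non-empty bounding box pins down exactly where two images disagree, which settles "missing bits" claims more reliably than arguing over what the eye can or can't see.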

    I repeat my claim - it is all utter bullshit.
    .
    Political correctness is based on the principle that it's possible to pick up a turd by the clean end.

#14
    Originally posted by lynx@23 April 2004 - 12:45

    I repeat my claim - it is all utter bullshit.
    We will just have to wait and see about that, no?

#15
Poster
Join Date: Oct 2003
Posts: 1,065
    lol

    funny conversation....

ppl argue'n over which company is better.

I can't say that one company is better than the other cause quite honestly i dont know... and neither do you guys... unless you are an official... who has been hired by both companies.

#16
bigdawgfoxx
Join Date: Apr 2003
Location: Texas
Age: 35
Posts: 3,821
    lets just wait for the cards..and benchmarks from tomshardware or anandtech
AMD 4200 X2 @ 2.65GHz, ASRock 939-VSTA
    1.75GB PC3200, 2 X 160GB Seagate w/ 8MB Buffer
    HIS Radeon X800 Pro, Antec Super Lanboy Aluminum

#17
    Originally posted by bigdawgfoxx@23 April 2004 - 16:47
    lets just wait for the cards..and benchmarks from tomshardware or anandtech
    My personal favorite sites for hardware review are (in order of preference, of course):

    #1) Anandtech
    #2) [H]ardOCP
#3) Tom's Hardware


#18
atiVidia
Join Date: Dec 2003
Posts: 1,522
Originally posted by adamp2p @ 23 April 2004 - 20:23
(quoting bigdawgfoxx @ 23 April 2004 - 16:47)
lets just wait for the cards..and benchmarks from tomshardware or anandtech
My personal favorite sites for hardware review are (in order of preference, of course):

#1) Anandtech
#2) [H]ardOCP
#3) Tom's Hardware

    what happened to xbit labs? u were bragging about how they wrote the best reviews not too long ago... are their reviews too fair for u to accept?

    yes u heard the wind b_tching at u earlier. the wind is an nvidia fanboy...


    d00d, no marketing on the forums: its not a smart business practice...



    that will be 1 of the first things to change when atiVidia is formed: a corp. w/imageQ+framerates rolled into good business tactics.

#19
lynx
Join Date: Sep 2002
Location: Yorkshire, England
Posts: 9,759
    Personally I have no preference between the companies, both have their strengths and weaknesses.

    What I DO object to is reviews which are inexcusably biased towards one manufacturer. If they want to produce an advertisement let them do that and be honest about it, rather than trying to slag off the opposition through a third party.

I haven't heard any comments about the areas missing on both the raster image and the ATI image: why is the same bit missing from both images? The nvidia image shows extra detail (which is quite extensive) which they couldn't have just made up. Alternatively we could be looking at different frames, which means that comparison is meaningless. The poor IQ claimed by the article may be exactly the same on the other two sources if we were looking at the same frame.
    .
    Political correctness is based on the principle that it's possible to pick up a turd by the clean end.

#20
lynx
Join Date: Sep 2002
Location: Yorkshire, England
Posts: 9,759
    Trilinear Filtering - Colored Mipmaps
Here's what Microsoft has to say about it:

    Quote: "The DX9 reference rasterizer does not produce an ideal result for level-of-detail computation for isotropic filtering, the algorithm used in NV40 produces a higher quality result. Remember, our API is constantly evolving as is graphics hardware, we will continue to improve all aspects of our API including the reference rasterizer."
    Source
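
For context, the "level-of-detail" being computed here is the number trilinear filtering uses to pick two adjacent mipmap levels and blend between them; coloured-mipmap tests tint each level a different colour, so even small LOD differences show up as shifted colour bands. Below is a minimal sketch of the standard textbook isotropic LOD calculation in Python, purely as an illustration (it is not the exact reference-rasterizer or NV40 algorithm, which the quote says differ):

[CODE]
import math

def mip_lod(dudx, dvdx, dudy, dvdy, tex_width, tex_height):
    """Textbook isotropic LOD: log2 of the larger screen-space texel
    footprint. Texture-coordinate derivatives are per screen pixel.
    This is an illustration, not the refrast's or NV40's exact formula."""
    dx = math.hypot(dudx * tex_width, dvdx * tex_height)  # texels per pixel step in x
    dy = math.hypot(dudy * tex_width, dvdy * tex_height)  # texels per pixel step in y
    rho = max(dx, dy)
    return max(0.0, math.log2(rho))       # LOD 0 = base (largest) mip level

def trilinear_levels(lod):
    """Trilinear filtering blends the two mip levels around the LOD."""
    lower = math.floor(lod)
    blend = lod - lower                   # weight toward the smaller mip
    return int(lower), int(lower) + 1, blend

# Example: a 512x512 texture minified so each pixel covers ~3 texels
lod = mip_lod(0.006, 0.0, 0.0, 0.006, 512, 512)   # ~log2(3.07) = ~1.62
print(trilinear_levels(lod))                       # (1, 2, ~0.62)
[/CODE]

Small differences in how that LOD value is rounded or biased move the mip transition bands around, which is exactly the kind of discrepancy the coloured-mipmap screenshots were showing.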

    Anyone still going to claim that Nvidia were cheating?
    .
    Political correctness is based on the principle that it's possible to pick up a turd by the clean end.

