
View Full Version : Ati Schoolz Nvidia



adamp2p
05-04-2004, 03:20 PM
source (http://www.hardocp.com/article.html?art=NjExLDEw)
Best Playable IQ:
http://www.hardocp.com/images/articles/1083564189888Adk70te_10_1_l.gif (click for full size view)


The above chart gives you a quick sense of just how powerful these video cards are when it comes to gaming in the real world. Above you can see the IQ settings that we found playable on our test system.

There is a very obvious pattern here: 1600x1200 with AA and AF on the X800XT. Only in one game did we have to play at 1280x1024, and that was only because the game, NFS: Underground, did not support 1600x1200 resolution. Otherwise every single game was playable at 1600x1200 with AA and AF! We find this to be nothing short of incredible; frankly, we did not think we would see this kind of video card power until possibly later this year.

The Radeon X800Pro comes in second in the pack; it did achieve 1600x1200 with AA and AF in a couple of games, with the rest being playable at 1280x1024 with AA and AF.

The GeForce 6800Ultra came in third from the top in our testing, as there was only one game it could run at 1600x1200 with AA and AF. The rest played best at 1280x1024 or 1280x960 with AA and AF. In the case of the new FarCry game that many folks are buzzing about, only 1024x768 MediumAA/4XAF was playable. That said, NVIDIA gave us a newer set of drivers that, according to NVIDIA, improve FarCry frames per second by 20% by working out some Z-culling issues. These new v61.11 drivers were also saddled with IQ bugs, however, and we did not feel comfortable testing with them, as we would not have suggested them for your gaming experience.


Comparing IQ Technology:

Looking at the Anti-Aliasing and Anisotropic Filtering image quality between the X800 series and the GeForce 6800Ultra, we find them to be very comparable. There is one difference, though. The X800 is so powerful that 6XAA is actually a usable Anti-Aliasing setting on the X800 series, whereas the comparable 8XAA on the 6800Ultra is basically not usable; it is too demanding in terms of performance because it is a super-sampling + multi-sampling technique.
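The performance gap between pure multisampling and a supersampling hybrid comes down to how often the pixel shader runs. Here is a back-of-the-envelope sketch of that difference; the 2x supersample factor for a hybrid 8x mode is an illustrative assumption, not a figure from the article:

```python
# Rough shading-cost comparison between pure multisampling (MSAA), which
# runs the pixel shader once per pixel, and a hybrid mode that combines
# supersampling with multisampling, which shades every supersample.
# The factors below are illustrative assumptions, not vendor specs.

def shader_invocations(width, height, supersample_factor=1):
    """Pixel-shader executions per frame.

    Pure MSAA: supersample_factor == 1 (one shade per pixel, extra
    coverage samples are nearly free in shading terms).
    Hybrid SSAA+MSAA: shading work scales with the supersample factor.
    """
    return width * height * supersample_factor

# 1600x1200 with multisampling only: shading cost is per-pixel.
msaa_cost = shader_invocations(1600, 1200)

# A hypothetical 8x hybrid built as 2x supersampling x 4x multisampling
# doubles the shading (and framebuffer) work relative to pure MSAA.
hybrid_cost = shader_invocations(1600, 1200, supersample_factor=2)

print(hybrid_cost // msaa_cost)  # the hybrid shades twice as many fragments
```

This is why a 6x pure-multisampling mode can stay playable while an 8x hybrid mode at the same resolution is not: the extra samples in the hybrid mode each pay full shading cost.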

The only shader quality differences we noticed were in FarCry, where the X800 series provides much better image quality. Compared to the 9800XT, the X800 series has identical AA, AF and shader quality.

Temporal AA is an interesting feature, and one that you will simply have to judge for yourself. Some may like it, some may not. Just keep in mind that it enables VSYNC when it is turned on, and if the framerate drops below 60fps it shuts off. Therefore, to get the most benefit out of it, you need to keep the framerate very high; the closer it is to your refresh rate, the less shimmering you will notice. We are happy that ATI is evolving new techniques for improving image quality and providing potentially better AA image quality with no performance hit.
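The behaviour described above can be sketched as a frame-rate guard around alternating sample patterns. This is an illustrative sketch, not ATI's actual driver logic; the 60fps cutoff comes from the paragraph above, while the pattern offsets are made-up values:

```python
# Minimal sketch of temporal AA as described above: the driver alternates
# between two MSAA sample-pattern sets on successive frames (so two
# 2-sample patterns can approximate 4x AA over time), but falls back to a
# single static pattern when the frame rate drops below a cutoff, since
# slow alternation shows up as visible shimmering.
# The pattern offsets here are hypothetical values for illustration.

PATTERN_A = [(0.25, 0.25), (0.75, 0.75)]   # sample offsets, even frames
PATTERN_B = [(0.75, 0.25), (0.25, 0.75)]   # sample offsets, odd frames
FPS_CUTOFF = 60.0                           # below this, temporal AA shuts off

def sample_pattern(frame_index, current_fps):
    """Pick this frame's MSAA sample offsets under temporal AA."""
    if current_fps < FPS_CUTOFF:
        return PATTERN_A                    # temporal AA disabled: static pattern
    return PATTERN_A if frame_index % 2 == 0 else PATTERN_B
```

At high frame rates the pattern alternates every frame, so successive frames blend into a higher effective sample count; a slow frame pins the pattern, trading the extra quality for a stable image.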


Dueling Shaders:

So here we are at the question we know everyone will be thinking about: what about Shader Model 3.0 versus Shader Model 2.0? Right now it isn't even exposed on the GeForce 6800Ultra, and won't be until DX9.0c is out. There isn't much to say except that both will look the same this year unless we see games with true displacement mapping implemented. Technically the X800 series, going all the way back to the Radeon 9700 series, supports point-sampled displacement maps via N-Patches, while the 6800Ultra supports filtered displacement mapping via vertex texture lookups in VS 3.0. But it all depends on what developers use in games; if developers never use it, it is a useless feature. SM3.0 is an extension of SM2.0 and adds mostly performance enhancements, as well as possibly easier programming for developers. Again, it all depends on what the developers do, and we'll stress again: SM3.0 versus SM2.0 image quality is really a non-issue currently, and probably will be for this year's next-gen games.


The Bottom Line:

The X800Pro should be selling in major retail online outlets today, May 4th, at an MSRP of $399. The X800XT Platinum Edition will begin shipping to retailers by May 21st and will sell at an MSRP of $499. As usual, those MSRPs will deflate fairly quickly, provided the supply ATI assures us exists actually materializes.

When it comes right down to it, the X800Pro matches or beats the GeForce 6800Ultra in game performance and IQ. Compare the price, performance, and IQ of the ATI Radeon X800Pro with the GeForce 6800Ultra, and the X800Pro definitely stacks up as the better value. NVIDIA is today launching an even higher-clocked Ultra, no doubt in response to the X800XT-PE. We have yet to even see a retail 6800Ultra from any of the partners, so currently I would consider these a non-issue. Also, a lower-clocked 6800GT is being introduced, but this card is simply not currently going to play in the same ballpark as the X800 series from ATI.

The Radeon X800XT Platinum Edition goes even further and burns through these games like a hot knife through butter, besting NVIDIA’s 6800Ultra by an easily noticeable real-world margin. If the flagship is what you want, be assured the ATI Radeon X800XT Platinum Edition was the top performer allowing us to play today’s demanding games at higher resolution and quality settings than any 3D graphics accelerator we have ever experienced.

Our Radeon X800 Series VPU Technology article is here for those of you that wish to know more about what is in the silicon.

While we realize that it is not likely that you will own two X800s, both are worthy of our "Must Have" award. The X800Pro and X800XT-Platinum Edition can both easily be called Must Have [Hardware] by the discerning gamer.

http://www.hardocp.com/images/articles/1083564189888Adk70te_10_2.gif

kaiweiler
05-04-2004, 03:27 PM
w00t!
That's what most of us would have guessed anyway, no?
I know I figured ATI would kick nvidia's ass in everything, and I was right :)

johnboy27
05-04-2004, 03:57 PM
On another board I post at, one of the members posted a link to some other article on hardocp that showed Nvidia coming out on top. So I quickly posted your link in the thread so he could look at it and see if he was still correct. His response was:
"From what I can see, they're just trading off 'wins' back and forth, with a fairly hefty amount of editorializing by Kyle (no surprise there...)."
Of course he has an MSI FX5900XT in his computer. Funny how all the Nvidia fans will look right at the results and then say that somebody is basically lying about them.
So where's ATIVidia at anyway? LOL

Mad Cat
05-04-2004, 04:14 PM
You have been saying no a lot, no?

lynx
05-04-2004, 04:15 PM
Originally posted by johnboy27@4 May 2004 - 16:05
On another board I post at, one of the members posted a link to some other article on hardocp that showed Nvidia coming out on top. So I quickly posted your link in the thread so he could look at it and see if he was still correct. His response was:
"From what I can see, they're just trading off 'wins' back and forth, with a fairly hefty amount of editorializing by Kyle (no surprise there...)."
Of course he has an MSI FX5900XT in his computer. Funny how all the Nvidia fans will look right at the results and then say that somebody is basically lying about them.
So where's ATIVidia at anyway? LOL
Isn't that exactly what you did when someone posted a link showing Nvidia on top? At least his response was honest; all you ATI fanboys seem to be blinkered into believing only the reviews you want to see.

Ati may have come out with a better card, but many of us think it is better to wait until someone comes out with a fact based review before making up our minds.

Mad Cat
05-04-2004, 04:23 PM
These things aren't even benchmarks, either. It's what HardOCP says is a playable resolution and amount of quality extras.

johnboy27
05-04-2004, 04:54 PM
Originally posted by lynx@4 May 2004 - 17:23
Originally posted by johnboy27@4 May 2004 - 16:05
On another board I post at, one of the members posted a link to some other article on hardocp that showed Nvidia coming out on top. So I quickly posted your link in the thread so he could look at it and see if he was still correct. His response was:
"From what I can see, they're just trading off 'wins' back and forth, with a fairly hefty amount of editorializing by Kyle (no surprise there...)."
Of course he has an MSI FX5900XT in his computer. Funny how all the Nvidia fans will look right at the results and then say that somebody is basically lying about them.
So where's ATIVidia at anyway? LOL
Isn't that exactly what you did when someone posted a link showing Nvidia on top? At least his response was honest; all you ATI fanboys seem to be blinkered into believing only the reviews you want to see.

Ati may have come out with a better card, but many of us think it is better to wait until someone comes out with a fact-based review before making up our minds.
The reason I posted the link is because the one he posted didn't even have any testing between the two, just what they thought by looking at the two cards and the specs. As for the comment about Nvidia fans, it was supposed to be sarcasm. I like ATI; I really have no experience with Nvidia (well, a little with friends' computers), but I have always had ATI and have never had a problem with them. On another note, I really am not too concerned about the newer cards because I won't be buying one; I can't justify paying so much for something I won't really use to its full potential.

adamp2p
05-04-2004, 08:43 PM
Conclusion Tom's Hardware


We haven't even really gotten over NVIDIA's highly impressive introduction of its GeForce 6800 Ultra, and here ATi is already hitting back hard. Thanks to its performance advantage when using anisotropic filtering, the Radeon X800 XT Platinum Edition shows its rivals who's boss in this discipline without noticeably sacrificing image quality. Even the much less expensive X800 Pro with its 12 pipes can beat the GeForce 6800 Ultra in some game tests. The price difference of about $100 will certainly be an argument that could win over a number of undecided buyers. When quality-enhancing features like FSAA and anisotropic filtering aren't enabled, however, it is often the GeForce 6800 Ultra that takes first place. Thanks to Temporal AA, though, ATi has a good solution even to this "problem". At any rate, most gamers would be loath to do without anisotropic filtering when using cards of this caliber anyway. But keep in mind that neither of the cards can be termed "slow". We're talking about differences at very high levels!

In our opinion, the most impressive thing about this card is how little effort ATi needed to reach the performance we saw here. The power consumption of the X800 XT is about the same as that of its predecessors in 3D applications. Additionally, the cards require only one auxiliary power connector and don't need an especially potent power supply like the GeForce 6800 Ultra does. Even the cooler has shrunk a bit, reducing the card's overall weight and ensuring that it would fit even into a mini-ITX case.

The trouble is, there are also drawbacks to the fact that only little effort had to be put into this design. Technologically, ATi's 3D architecture has fallen behind that of NVIDIA, and it is now the green guys that can claim to have the more innovative chip and can rally support for new features. Although the X800 cards can now process longer and therefore more complex shader programs than the R9800XT, they are still limited to 24-bit floating-point precision and ShaderModel 2.0. It remains to be seen whether the GeForce 6800 Ultra, with its support for ShaderModel 3.0 and 32-bit floating-point precision, will enjoy any tangible performance advantages in practice. For now, ATi's shader quality definitely gives no grounds for complaint. And not to forget 3Dc, which can improve the game experience. Nonetheless the R420 only seems to be a temporary solution. Already, the R480 is rearing its head in the roadmaps, and there's a good possibility that ATi may just introduce the R5xx series instead. And after so much speculation and conspiracy, this author can't help but find himself humming an eerie little melody. You know - the theme from that mystery series with the X in the name...

Until then, the new performance leader is ATi's Radeon X800 XT Platinum Edition. Let's just hope that ATi stays true to its word and that these cards will be available in more than just homeopathic doses. Even if only the slightly slower non-Platinum XT versions actually make it to the market in high numbers, these should still be able to keep the GeForce 6800 Ultra in check. Then again, NVIDIA is not just sitting around twiddling its thumbs - it plans to launch the GeForce 6800 Ultra Extreme, which will be offered by a number of NVIDIA partners, Gainward and XFX among them. It remains to be seen at what price point these cards will be sold, as the "normal" GF 6800 Ultra's price tag of $499 already makes it just as expensive as the X800 XT Platinum Edition. The direct competitor to the X800 Pro in the $399 market segment will be the new GeForce 6800 GT.

Illuminati
05-04-2004, 10:45 PM
This thing happens every few months - ATI brings out cards which outperform nVidia's, nVidia brings out cards that outperform ATI's a couple of months later, and the cycle repeats a few months after that. It used to be every six months a few years back but, as with Moore's Law and today's processor technologies, the cycle is speeding up faster than originally thought :)

And if I remember, this same bloody thread happened a few months ago as well by the same bloody person. :huh: See you again in a few months, adam :rolleyes:

bigdawgfoxx
05-04-2004, 11:24 PM
Yeah, since ATI and Nvidia release new cards like every couple of months <_<

RGX
05-05-2004, 12:28 AM
Originally posted by Illuminati@4 May 2004 - 22:53
This thing happens every few months - ATI brings out cards which outperform nVidia's, nVidia brings out cards that outperform ATI's a couple of months later, and the cycle repeats a few months after that. It used to be every six months a few years back but, as with Moore's Law and today's processor technologies, the cycle is speeding up faster than originally thought :)

And if I remember, this same bloody thread happened a few months ago as well by the same bloody person. :huh: See you again in a few months, adam :rolleyes:
In case you haven't noticed, these are next-generation cards, marking a huge step up from the current models available today. They are new cores, R420 and NV40, and represent the basis for this year's cards. These are not your average step up, and are worth reporting on.

bigdawgfoxx
05-05-2004, 12:31 AM
Originally posted by RGX@4 May 2004 - 18:36
Originally posted by Illuminati@4 May 2004 - 22:53
This thing happens every few months - ATI brings out cards which outperform nVidia's, nVidia brings out cards that outperform ATI's a couple of months later, and the cycle repeats a few months after that. It used to be every six months a few years back but, as with Moore's Law and today's processor technologies, the cycle is speeding up faster than originally thought :)

And if I remember, this same bloody thread happened a few months ago as well by the same bloody person. :huh: See you again in a few months, adam :rolleyes:
In case you haven't noticed, these are next-generation cards, marking a huge step up from the current models available today. They are new cores, R420 and NV40, and represent the basis for this year's cards. These are not your average step up, and are worth reporting on.
Exactly! ;)

adamp2p
05-05-2004, 01:28 AM
Conclusion xbitlabs.com (http://www.xbitlabs.com/articles/video/display/r420-2_35.html)


So, the new RADEON X800 graphics processors from ATI Technologies proved themselves extremely powerful rivals to the high end of NVIDIA's GeForce 6-series graphics products.

The top-of-the-line $499 RADEON X800 XT appeared to be faster than its main competitor - the GeForce 6800 Ultra - in a plethora of applications where it was natural to expect it: games that make broad use of complex geometry and loads of math-intensive pixel shaders. Additionally, the new graphics processors from ATI Technologies also gain a performance advantage over the rival NVIDIA solution when full-scene antialiasing and anisotropic filtering are switched on - thanks to the new HyperZ HD technology, which maximizes the efficiency of memory bandwidth utilization, and to a high-performance anisotropic filtering method.

A slightly less speedy flavour of the R420 - the RADEON X800 PRO - which has only 12 pixel pipelines and is clocked at lower speeds, undoubtedly demonstrates an excellent performance rise over the previous-generation RADEON 9800 XT and GeForce FX 5950 Ultra hardware. But the final conclusion about this one should still be put on hold, as NVIDIA has not finalized the specification of its $399 product. Funnily enough, the RADEON X800 PRO is expected to be available in retail instantly, making the choice at the $399 price point pretty tricky, as the actual performance comparison with the competing solution from NVIDIA has yet to see the light of day.

Regrettably for the Markham, Ontario-based company, due to some drawbacks with texturing efficiency in the new VPUs from ATI, the NVIDIA GeForce 6800 Ultra manages to beat its rival in games where high fillrate and rapid texturing are important.

Furthermore, NVIDIA still has some trumps in its hand. Firstly, the company's GeForce 6800 Ultra GPU is able to calculate up to 32 Z/stencil values per pass; therefore, games that heavily use Z or stencil buffers for generating dynamic shadows will have plenty of chances to run faster on NVIDIA's hardware. Secondly, game developers may eventually implement Shader Model 3.0 in their titles for performance optimization, which will also boost the speed of NVIDIA's latest hardware, which supports Shader Model 3.0 - a capability that seems to be trimmed on ATI's RADEON X800 XT and X800 PRO.

Unfortunately for NVIDIA, right now there are no games that actually use Shader Model 3.0, and there is also no DirectX 9.0c to actually switch on SM 3.0 support. With that said, we should probably let time have the last word in the cruel battle between the R420 and NV40 technologies, but based on the current benchmark numbers we believe that the RADEON X800 XT clearly outpunches its competitor in applications that use shaders 2.0/2.x and are available today.

NVIDIA down for the count (yet again, no?)

kaiweiler
05-05-2004, 01:35 AM
:lol: Poor nVidia
I wonder how long it'll be until they realise that they won't ever beat ATI lol

abu_has_the_power
05-05-2004, 01:42 AM
ATI IS THE BEST! Hands down.

Here's why it's better than nvidia:

http://www.gamespot.com/features/6095215/index.html

look at the comparison chart

delphin460
05-05-2004, 02:09 AM
Originally posted by kaiweiler@5 May 2004 - 01:43
:lol: Poor nVidia
I wonder how long it'll be until they realise that they won't ever beat ATI lol
Ya really wanna hope they don't give up and submit; the advance in graphics technology is pushed by these two trading blows.

If nvidia gives up the push to be best, and ati gets basically 100% market share of the performance card world, forget newer stuff quick, and forget cheap prices.

lynx
05-05-2004, 02:22 AM
Trouble is people see reports like "version 62.11 drivers are broken so ignore those benchmarks" and forget that there is probably a small bug which, when fixed, will show that Nvidia is competing on just about the same level as Ati (better in some respects, poorer in others).

I fail to understand why they insist on comparing Ati's newest card running with Ati's newest drivers against Nvidia's newest card running with out-of-date drivers; it just doesn't make sense.

Certainly, Nvidia should be criticised for not getting their drivers right in time for the launch of the new card, but Ati had some time to seek out those bugs while their own product was still under wraps. Don't assume that there are no bugs in Ati's drivers, it may be that they just haven't been found yet, but then again Ati haven't made the leap forward into PS 3.0, that too could explain the difference.

adamp2p
05-05-2004, 02:39 AM
Originally posted by lynx@4 May 2004 - 18:30
Trouble is people see reports like "version 62.11 drivers are broken so ignore those benchmarks" and forget that there is probably a small bug which, when fixed, will show that Nvidia is competing on just about the same level as Ati (better in some respects, poorer in others).

I fail to understand why they insist on comparing Ati's newest card running with Ati's newest drivers against Nvidia's newest card running with out-of-date drivers; it just doesn't make sense.

Certainly, Nvidia should be criticised for not getting their drivers right in time for the launch of the new card, but Ati had some time to seek out those bugs while their own product was still under wraps. Don't assume that there are no bugs in Ati's drivers, it may be that they just haven't been found yet, but then again Ati haven't made the leap forward into PS 3.0, that too could explain the difference.
Excuse me, but what did you just say?

About Shader Model 3.0? Do you know HOW LONG it takes MICROSOFT to deliver products out of BETA? Obviously not! Pixel Shader 3.0 IS NOT SUPPORTED BY A SINGLE GAME OR BY MICROSOFT (and probably will not be until late this summer or early fall, and by then ATI can get ahead and develop a new solution that could easily outperform the 6800 series). Obviously you have not done your reading on the topic. The 6800 series is all NVIDIA has on the horizon until late next year. In other words, ATI has (yet) ANOTHER opportunity to best NVIDIA.

The real issue is performance in TODAY'S games. I offer you an anecdote along the same train of thought:

It's great to hear about a beautiful woman, but bedding her is an entirely different story.

A good marketing team is one that truly believes that their product is superior. It appears to me (I don't know about you) that Nvidia believes that their product will be superior in future games. That's just like a woman telling me that she looks okay today, but after her nosejob and breast augmentation she'll be even better. I just don't see the world that way.

:lol: >_<

lynx
05-05-2004, 02:51 AM
I actually meant that Nvidia have moved forward to enabling PS 3.0 support, which could explain why they've got bugs in their drivers.

But the other thing is that PS 3.0 needs hardware support, which Ati simply haven't developed. So if you go out and buy an Ati card now, will you be going out to buy another in a few months' time? At $400-500 a time I don't think that's very likely, but if PS 3.0 delivers the improvements it supposedly brings, it will be Ati who are in the shade.

To use your analogy, Nvidia are making sure they look pretty good today AND tomorrow. Ati can do the same for today, but they want more money from you to make sure they look ok tomorrow.

Still, if you've got that much money to burn I suppose it doesn't matter to you; the rest of us have to live in the real world.

adamp2p
05-05-2004, 03:16 AM
Originally posted by lynx@4 May 2004 - 18:59
I actually meant that Nvidia have moved forward to enabling PS 3.0 support, which could explain why they've got bugs in their drivers.

But the other thing is that PS 3.0 needs hardware support, which Ati simply haven't developed. So if you go out and buy an Ati card now, will you be going out to buy another in a few months' time? At $400-500 a time I don't think that's very likely, but if PS 3.0 delivers the improvements it supposedly brings, it will be Ati who are in the shade.

To use your analogy, Nvidia are making sure they look pretty good today AND tomorrow. Ati can do the same for today, but they want more money from you to make sure they look ok tomorrow.

Still, if you've got that much money to burn I suppose it doesn't matter to you; the rest of us have to live in the real world.
Most hard-core gamers DO NOT live in the real world, we live in a virtual world, and we influence the hardware industry with our loud voices.

Most of us have a job and can afford to spend $500-1000 a year on VPUs. How can we rationalize this? Well, let us think for a moment. If I spend 3 hours a day gaming, that adds up to 1095 hours of gaming per year, and at that rate it is less than three dollars a day that we spend to quench our thirst for the highest-performance VPU money can buy. That shows that what you define "the real world" to be may not be mine.

"Who needs beautiful women?" They all have the same machinery, right? A Pus*y is a pus*y, right? For you, maybe, but for me, NO!
You have fun with your big-legged woman, and I'll stick with "high performance women," okay?

More is better than less. Better is better than worse. More power is better than less power. It doesn't get simpler than that.

You stick with your Yugo while I drive a Ferrari, okay?

adamp2p
05-05-2004, 04:05 AM
a picture tells a thousand words... (http://www.hardocp.com/image.html?image=MTA4MzU2NDE4OTg4OEFkazcwdGVfNF80X2wuanBn)

bigdawgfoxx
05-05-2004, 04:10 AM
niceeee

How can I play that video? I dont wana get quicktime really...

adamp2p
05-05-2004, 04:12 AM
Originally posted by bigdawgfoxx@4 May 2004 - 20:18
niceeee

How can I play that video? I dont wana get quicktime really...
get the quicktime ALTERNATIVE here (http://home.hccnet.nl/h.edskes/mirror.htm) free

:)

lynx
05-05-2004, 09:42 AM
Originally posted by adamp2p@5 May 2004 - 04:13
a picture tells a thousand words... (http://www.hardocp.com/image.html?image=MTA4MzU2NDE4OTg4OEFkazcwdGVfNF80X2wuanBn)
So far all you've shown is videos and pictures and impressions, no hard facts.

A picture which is garbage not only tells you about the dump, it tells you that the people who took it inhabit the dump.

adamp2p
05-05-2004, 01:46 PM
Originally posted by lynx@5 May 2004 - 01:50
Originally posted by adamp2p@5 May 2004 - 04:13
a picture tells a thousand words... (http://www.hardocp.com/image.html?image=MTA4MzU2NDE4OTg4OEFkazcwdGVfNF80X2wuanBn)
So far all you've shown is videos and pictures and impressions, no hard facts.

A picture which is garbage not only tells you about the dump, it tells you that the people who took it inhabit the dump.
That screenshot comparison is a fact! You draw your own conclusions!!!

:frusty:

j4y3m
05-05-2004, 02:49 PM
Originally posted by adamp2p@5 May 2004 - 04:13
a picture tells a thousand words... (http://www.hardocp.com/image.html?image=MTA4MzU2NDE4OTg4OEFkazcwdGVfNF80X2wuanBn)
"The ATi cards perform better in that game." is only eight words. :blink:

I'm not sure if it's a game though. :P

adamp2p
05-05-2004, 03:33 PM
Originally posted by j4y3m@5 May 2004 - 06:57
Originally posted by adamp2p@5 May 2004 - 04:13
a picture tells a thousand words... (http://www.hardocp.com/image.html?image=MTA4MzU2NDE4OTg4OEFkazcwdGVfNF80X2wuanBn)
"The ATi cards perform better in that game." is only eight words. :blink:

I'm not sure if it's a game though. :P
http://www.driverheaven.net/reviews/6800x8.../conclusion.htm (http://www.driverheaven.net/reviews/6800x800pro/conclusion.htm)

Conclusion


Let's start with the Geforce 6800 Ultra: it's one hell of a fast card. There are some tests/game engines where the 6800 Ultra just streaks ahead of the X800 Pro at 1600x1200, such as Painkiller and Prince of Persia. Testing wasn't without issue, though, which was strange even for a reference-design board. Zero Hour had the texture issues, Shadermark wouldn't run, and Max Payne wouldn't work at maximum settings. In our opinion the Zero Hour bug will be easily fixed; in the case of Max Payne, however, it may be that the maximum settings of 8xAA/16xAF will never work at 1600x1200. This does raise some concerns over using this mode in future games: will it become more unusable as the use of video memory increases, or will you perhaps require the 512MB 6800 Ultra to fully utilise it? If it's a case of the latter, the question becomes whether it will be usable from a playability point of view. Our "Maxing it out" section showed that 8xAA and 16xAF just didn't provide playable frame rates in the games we tested. Looking at the AA/AF results overall shows that weak AA/AF performance may be an inherent design issue in the NV40. The hit taken when enabling 4xAA/8xAF really hampered our card when compared to the R420 card. Even in benchmarks where the 6800 Ultra was well ahead of the R420 without AA/AF, enabling both allowed the R420 to match or pass the 6800 Ultra in performance. The move to rotated-grid multi-sampling has, however, resulted in much improved IQ over the last generation of Geforce product, and it's now much harder to choose which brand of video card has the best IQ.

The X800 Pro was a surprise to us in terms of just how fast it was. Considering it has a four-pipeline deficit against the 6800 Ultra, the card still keeps up and in some cases surpasses the Geforce. Testing went by without issue; all games we tested ran with no display concerns and the card was completely stable. It seems that the drivers are mature enough, even at this stage, that you can go out and buy an R420-based product and games will "just work".

Where the NV40 core takes a major performance hit when enabling Anti-Aliasing and Anisotropic Filtering, the R420 core has no such issues. The performance penalty on the R420 is minimal, resulting in games which maintain playable frame rates at higher image quality than even the top-end NV40 model. The fact that the X800 Pro manages to outscore the 6800 Ultra with AA/AF enabled in most cases further reinforces the excellent AA/AF performance. It was also nice to see the X800 Pro maintaining playability in Farcry, one of the most demanding games available, even at 1600x1200 with 4xAA and 16xAF enabled. Our sample was an excellent overclocker, and provided your card's manufacturer uses decent memory there is no reason why you couldn't achieve the same levels of performance when overclocking - an additional bonus.

There have been some rumours circulating over the past week about the clock speeds for the 6800 Ultra. We asked Nvidia about this and they confirmed that the 6800 Ultra reference speeds are still 400MHz core and 550MHz memory in 3D use; however, partner companies can choose to clock their boards higher if they wish. Based on our overclocking experience it would seem that a 50MHz increase on both core and memory is about the maximum most partners will aim for. This will certainly improve results; however, it will only make the 6800 Ultra more competitive with the X800 Pro. The X800XT (read our review here) will still be a fair amount faster than the 6800 Ultra when AA/AF is used. This is the real deciding factor in our view. Yes, both the X800 Pro and 6800 Ultra are fast, and yes, they give good image quality by default, but with cards this fast you really need to enable at least 4xAA and 8xAF to get the full benefits of your purchase. At these settings there really is no competition. The X800 Pro is a clear winner.

adamp2p
05-05-2004, 03:34 PM
http://www.anandtech.com/video/showdoc.html?i=2044&p=22

Final Words


QUOTE
I don't think anyone thought the race would be this close after what has been going on over the past couple years with ATI and NVIDIA. Clearly, both camps have had their wins and losses, but it is safe to say that ATI comes out on top when it comes to DX9 and PS 2.0 performance, NVIDIA leads the way in OpenGL performance, and NV40 and R420 split the difference when it comes to DX8 (and older) style games. Even though we haven't yet seen the performance numbers from NVIDIA's 6850 Ultra part, it is likely that there will be a price premium that goes along with that performance. On top of that, the 6850 is really just an overclocked 6800 Ultra part. We will take a further look at the issue when we are finally able to run some numbers.

It is very clear that both NVIDIA and ATI have strong offerings. With better competition in the marketplace, and NVIDIA differentiating themselves by offering a richer feature set (which doesn't necessarily translate into value unless developers start producing games that use those features), consumers will be able to make a choice without needing to worry about sacrificing real performance. Hopefully we will be able to say the same about image quality when we get done with our testing in that area as well.

Of course, we are still trying to gather all the pieces that explain why we are seeing the numbers we are seeing. The problem is that the amount and level of information we are able to gather reflects how the API maps to the hardware rather than how the hardware actually does things.

The two rather large issues we have encountered when trying to talk about hardware from the software's perspective are the following: it is easy to get lost when looking at tasks from slightly different perspectives or angles of attack, and looking at two architectures that are designed to accomplish similar tasks obfuscates the characteristics of the underlying architectures. We are very happy that both NVIDIA and ATI have started opening up and sharing more about their architectures with us, and hopefully the next round of products will see even further development of this type of relationship.

There is one final dilemma we have on our hands: pricing. From the performance numbers from both this generation and the previous generation, it doesn't seem like prices can stay where they are. As we get a better feel for the coming market with the 12x1 NVIDIA offering, and other midrange and budget offerings from both NVIDIA and ATI, there will be so much overlap in price, performance, and generation without a very large gap in functionality that it might not make sense to spend more money to get something newer. Of course, we will have to wait and see what happens in that area, but depending on what the test results for our 6850 Ultra end up looking like, we may end up recommending that NVIDIA push their prices down slightly (or shift around a few specs) in order to keep the market balanced. With ATI's performance on par in older games and slightly ahead in newer games, the beefy power supply requirement, two-slot solution, and sheer heat generated by NV40 may be too much for most people to take the NVIDIA plunge. The bottom line is the consumer here, and it's good news all around.

adamp2p
05-05-2004, 03:37 PM
http://www.gamers-depot.com/hardware/video...ti/x800/005.htm (http://www.gamers-depot.com/hardware/video_cards/ati/x800/005.htm)


Getting games to really push these cards can be quite a challenge; we had to start enabling a lot more features, like anisotropic filtering and anti-aliasing, so the games wouldn't be more CPU-bound than GPU-bound.

After looking at the benchmark results, it's not hard to conclude which company has the faster GPU. What may be hard to decide, however, is whether you believe you'll keep whatever video card you buy long enough for SM3.0 to be an issue. Even so, the NV40 effectively rules itself out of many end-user PCs, especially those of Small Form Factor owners.

Secondly, the cooler-running R420 is a lot more forgiving in a wide variety of hardware configurations; this alone will be a major reason why ATI has a greater chance of landing major design wins with OEMs.

3Dc, while cool, is not enough of a breakthrough technology to get us ultra-excited about the X800. It's the sheer, unadulterated power of these cards, which amounts to more fluid games, that fuels our lust for faster graphics. Keep all the fancy, unused features if it means giving up horsepower. If ATI can prove it can build a high run-rate of X800XT cards, then the high end will surely be clinched by them. The part that gets tricky for ATI is the mid-range; it's where a lot more money is made and also where NVIDIA can be extremely aggressive with cards like the 6800GT.

Only time will tell if leaving out SM3.0 in favor of raw power was a poor choice by ATI. Even though we've heard mixed reviews, you can be assured that if NVIDIA can sell enough NV40s, publishers will start being more forceful in making sure games support it. After all, both developers and publishers want their respective games looking and playing their best on a wide variety of PCs.

If you want a hotter card that requires more power and doesn't perform as well in most benchmarks, just for the sake of SM3.0, then go ahead and snag a 6800 Ultra. If, however, you want the most insanely powerful graphics solution for today's games, one that runs cooler and works with "normal" power supplies, then ATI has your meal ticket.

adamp2p
05-05-2004, 03:43 PM
http://www.bjorn3d.com/_preview.php?articleID=457&pageID=725

Performance Conclusion


If you were a bit disappointed in the leap from the Radeon 9700 Pro to the Radeon 9800 Pro, or from the Radeon 9800 Pro to the Radeon 9800XT, then you will be extremely happy with the performance leap of the x800 XT. The 8 extra rendering pipelines, as well as the increased VPU and memory speeds, really help the x800 XT to simply crush the Radeon 9800XT. If you could play at 4xAA and 8xAF on the Radeon 9800XT and get good framerates, you can now play at 6xAA and 16xAF and still get higher performance. Impressive? Definitely. Since I couldn't test the 12-pipeline x800 Pro in time for this preview, I cannot say too much about its performance other than that, in theory, it will still perform really well compared to the Radeon 9800XT.

IQ at a minimal cost


A pixel and texel fillrate of up to 8.4 Gigapixels/sec, up to 37 GB/sec of raw bandwidth, and a 12- or 16-pipeline architecture are the key features when it comes to X800 performance. Due to its sophisticated 0.13-micron low-k dielectric process, very efficient GDDR3 memory interface and superior pixel shader architecture (12 or 16 pixel pipelines and 6 vertex units), the RADEON X800 easily doubles (in the case of the X800 XT Platinum Edition) the performance of its high-end predecessor, the RADEON 9800XT.
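The headline numbers above follow directly from pipeline count, clock speed, and bus width. A quick sanity check; the 525 MHz core and 575 MHz GDDR3 clocks used here are assumed figures chosen to reproduce the quoted specs, not values taken from this preview:

```python
# Sanity-checking the quoted X800 XT figures from pipeline count, clock
# speed, and bus width. The exact clocks are assumptions for illustration.

def pixel_fillrate_gpix(pipelines: int, core_mhz: float) -> float:
    """Peak pixel fillrate in Gpixels/s: one pixel per pipeline per clock."""
    return pipelines * core_mhz / 1000.0

def bandwidth_gb(mem_mhz: float, bus_bits: int) -> float:
    """Raw bandwidth in GB/s for DDR memory: two transfers per clock
    across a bus_bits-wide interface."""
    return mem_mhz * 2 * (bus_bits // 8) / 1000.0

print(pixel_fillrate_gpix(16, 525))  # 8.4 -> the "8.4 Gigapixels/sec"
print(bandwidth_gb(575, 256))        # 36.8 -> the "up to 37 GB/sec"
```

The same formulas with 12 pipelines give the lower end of the "12 or 16 pipeline" range quoted for the X800 Pro.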

There is no doubt in our minds that the new RADEON X800 from ATI has just raised the image quality bar. By introducing Temporal Antialiasing, ATI brings in huge quality enhancements while keeping the performance cost at essentially zero. This is truly a step forward when it comes to programmable antialiasing architecture. As with the R3xx design, the X800 offers full trilinear texture filtering by default, along with up to 16x Anisotropic Filtering. When combined, the video output is phenomenal, with the performance hit kept minimal, and in some situations non-existent, because of the X800's superior fillrate.
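The idea behind Temporal Antialiasing can be sketched in a few lines. This is an illustrative model with hypothetical sample offsets, not ATI's actual sample positions: the driver alternates between two multisample patterns on even and odd frames, so at high frame rates the eye averages consecutive frames into an effectively doubled sample count at no extra per-frame cost.

```python
# Illustrative sketch of temporal AA: alternate 2-sample patterns per frame
# so two consecutive frames together cover 4 distinct sub-pixel offsets.
# The offsets below are hypothetical, chosen only to show the mechanism.

PATTERN_EVEN = [(0.25, 0.25), (0.75, 0.75)]  # 2x pattern for even frames
PATTERN_ODD = [(0.75, 0.25), (0.25, 0.75)]   # mirrored pattern for odd frames

def sample_offsets(frame: int):
    """Return the sub-pixel sample offsets used on this frame."""
    return PATTERN_EVEN if frame % 2 == 0 else PATTERN_ODD

# Each frame still pays for only 2 samples, but any two consecutive frames
# cover 4 distinct positions -- the "effective 4x from 2x cost" trick.
covered = set(sample_offsets(0)) | set(sample_offsets(1))
print(len(covered))  # 4
```

This also shows the technique's catch: the averaging only happens in the viewer's eye, so it needs a consistently high frame rate to avoid visible shimmer.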


Conclusion


With the x800 Pro and XT Platinum Edition, ATI has brought out a pair of impressive products. If there is something one could complain about, it is that the cards really do not have any exciting new features and that they basically are still an R300 on crack. I don't think the lack of PS3.0 will affect this generation, though, and the performance increase itself is enough to have any serious gamer wanting one. Any price-conscious gamers should take a closer look at the Radeon x800 Pro, which should perform excellently while being a lot more affordable.

The x800 Pro should be out as you read this at a suggested retail price of $399, and the x800 XT Platinum Edition will be out on the 21st of May at a suggested retail price of $499.

j4y3m
05-05-2004, 03:47 PM
I'm not denying the X800 is better, just that that picture doesn't say much about the cards at all. :unsure:

lynx
05-05-2004, 04:36 PM
Originally posted by adamp2p@5 May 2004 - 13:54
Originally posted by lynx@5 May 2004 - 01:50
Originally posted by adamp2p@5 May 2004 - 04:13
a picture tells a thousand words... (http://www.hardocp.com/image.html?image=MTA4MzU2NDE4OTg4OEFkazcwdGVfNF80X2wuanBn)
So far all you've shown is videos and pictures and impressions, no hard facts.

A picture which is garbage not only tells you about the dump, it tells you that the people who took it habituate the dump.
That screenshot comparison is a fact! You draw your own conclusions!!!

:frusty:
As far as I can tell it could be a comparison of screen resolution. It tells me nothing else.


http://www.driverheaven.net/reviews/6800x8.../conclusion.htm (http://www.driverheaven.net/reviews/6800x800pro/conclusion.htm) Uses beta drivers for the X800 card, but older drivers (60.72) for the 6800. Hardly a fair comparison, yet the 6800 competes well and often beats (I think they said trounces) the X800.


http://www.anandtech.com/video/showdoc.html?i=2044&p=22 A much more honest comparison; the 6800 Ultra almost keeps up with Ati's flagship Platinum Edition. Can't wait to see the results for Nvidia's flagship 6850 Ultra.


http://www.gamers-depot.com/hardware/video...ti/x800/005.htm (http://www.gamers-depot.com/hardware/video_cards/ati/x800/005.htm) Seems like a reasonable review. The 6800 Ultra seems about on a par with the X800 Pro (actually it beats it more times than it is beaten), and sometimes beats the X800 PE. Again, I'd like to see those tests repeated with the 6850 Ultra.


http://www.bjorn3d.com/_preview.php?articleID=457&pageID=725 I fail to see why they make any comparisons with features on the Nvidia cards when they haven't even bothered to test them. Sheer bias.

It's a shame you pointed to the conclusions rather than the start of the comparisons, because I don't really see that the conclusions drawn are totally justified. For example, how can driverheaven use a word like "trounces" to describe the way the 6800 beats the X800 (even with old drivers) and still come to the conclusion that the X800 Pro is a clear winner?

Of those reviews, only anandtech's is really objective, and like me they want to see the 6850 results.

Btw, you really must do some remedial study for your math. The cost of your cards works out at about $1 per hour, not per day.

adamp2p
05-05-2004, 08:55 PM
Originally posted by lynx@5 May 2004 - 08:44
[quoted above]
Why won't you just admit it? I really don't know either... (http://www.neowin.net/forum/index.php?act=ST&t=163813&f=8&view=findpost&p=2063903)

bigdawgfoxx
05-05-2004, 11:53 PM
Yeah, after all those conclusions..Im gona have to say the ATI card is once again the leader... but theres just so many charts showing the 6800 winning, and the x800 winning...its hard to decide. And all this bullshit about "ohhh well the drivers arent that good yet"...thats bullshit...release your card with decent drivers or dont release them.

Thats like releasing a new Ferrari with bad gear ratios...

adamp2p
05-06-2004, 12:09 AM
Originally posted by bigdawgfoxx@5 May 2004 - 16:01
Yeah, after all those conclusions..Im gona have to say the ATI card is once again the leader... but theres just so many charts showing the 6800 winning, and the x800 winning...its hard to decide. And all this bullshit about "ohhh well the drivers arent that good yet"...thats bullshit...release your card with decent drivers or dont release them.

Thats like releasing a new Ferrari with bad gear ratios...
To be honest with you, I don't plan on upgrading until the next generation platforms are released (i.e. PCI Express etc.). So if somehow Nvidia can convince me to cross over in the next two months, they will have my dollar.

But if things stay the way they are now, ATi is the way to go, that's for sure, judging from what is being reported at this time by the entire Internet journalist community...

There is no argument that at this time ATi is the performance leader. However, this may no longer be the case by the time PCI Express and the next-gen platforms are released...

Take care,

Adam

Dray_04
05-06-2004, 12:13 AM
yeah but that will set you back a few $$$ if you're talking about upgrading to a PCI Express mobo plus G card....


anyhoo i was going to ask how much the R9800XT will be (retail) after these cards get released (X800 pro etc.)

Does anybody know if they will be shipped to new zealand any time soon... i cant WAIT!

adamp2p
05-06-2004, 12:15 AM
Originally posted by Dray_04@5 May 2004 - 16:21
yeah but that will set you back a few $$$ If your talking about upgrading to a PCI Express mobo plus G card....


anyhoo i was going to ask how much the R9800XT will be (retail) after these cards gets released (X800 pro etc.)

Does anybody know if they will be shipped to new zealand any time soon... i cant WAIT!
I have a lot of money in the bank right now (I got hit by a moving car and got a substantial settlement)... so I plan on dumping about $2000 USD on a new system this summer...

:)

atiVidia
05-06-2004, 12:25 AM
ok listen up:

when dx9.0c comes out and allows ps3, and games start to support 32bit floats, ATI will fall onto its knees ;)

ati fanboy :P

tesco
05-06-2004, 12:27 AM
Originally posted by atiVidia@5 May 2004 - 19:33
ok listen up:

when dx9.0c comes out and allows ps3, and games start to support 32bit floats, ATI will fall onto its knees ;)

ati fanboy :P
but by then ati will have released a new card anyway :(

Dray_04
05-06-2004, 12:57 AM
Is someone going to answer my question

how much will the R9800XT be?!

tesco
05-06-2004, 12:58 AM
Originally posted by Dray_04@5 May 2004 - 20:05
Is someone going to answer my question

how much will the R9800XT be?!
cheaper than now :P

RGX
05-06-2004, 01:00 AM
Originally posted by atiVidia@6 May 2004 - 00:33
ok listen up:

when dx9.0c comes out and allows ps3, and games start to support 32bit floats, ATI will fall onto its knees ;)

ati fanboy :P
1. Pixel Shader 3.0, as far as I have seen from screenshots, is hardly any different to Pixel Shader 2.0.

2. When games start to support 32-bit floats, ATi will have brought out a new card to support them. Until that day, ATi destroys nVIDIA in the benchmarks, AGAIN.

Go out and buy your precious 6800 if you like, you're welcome to, but you'll always know that there is a card out there at a similar price that has better image quality and framerates. I really hope that PS 3.0 is worth it, because as far as I can see, ATi flat out won this round. The image quality of the 6800 compared to the ATi is horrible... I don't care how fast my game is running if it looks like that.

adamp2p
05-06-2004, 01:23 AM
Originally posted by RGX@5 May 2004 - 17:08
[quoted above]
RGX, you are UNSTOPPABLE! Lay it down.

This whole fanboi thing really gets me! I mean, do you REALLY think that if NVIDIA was WINNING in EVERY gaming benchmark (as ATI is right now), I would not be concluding that NVIDIA is the leader? Nonsense. I agree with the journalists, especially when they ALL are saying the same thing!!!

:devil:

adamp2p
05-06-2004, 01:24 AM
Originally posted by RGX@5 May 2004 - 17:08
[quoted above]
And you are going to need a new power supply too, so add that to the bill as well!

Dray_04
05-06-2004, 01:59 AM
wow thanks rossco... <_<


:P

tesco
05-06-2004, 02:07 AM
Originally posted by Dray_04@5 May 2004 - 21:07
wow thanks rossco... <_<


:P
:smartass:

atiVidia
05-06-2004, 02:11 AM
im running along with the lh beta crew (finally)

9.0c should be out by the summer, with Pixelshader 3.
games using PS3 will show up before q4 ends. this includes (possibly) half life 2.

adam, please dont quad post, and please dont spit BS into our faces.

Dray_04
05-06-2004, 02:12 AM
Originally posted by ROSSCO_2004@6 May 2004 - 14:15
Originally posted by Dray_04@5 May 2004 - 21:07
wow thanks rossco... <_<


:P
:smartass:
:box:

tesco
05-06-2004, 02:17 AM
Originally posted by Dray_04@5 May 2004 - 21:20
Originally posted by ROSSCO_2004@6 May 2004 - 14:15
Originally posted by Dray_04@5 May 2004 - 21:07
wow thanks rossco... <_<


:P
:smartass:
:box:
oh yeah!
:blow: im magical :) HA! lol.

wanna take this outside? :boxing: do you? huh? huh? do you? huh? huh? come on! lets go! ill kick your ass.


:lol: :lol:

atiVidia
05-06-2004, 02:20 AM
atividia v. rossco
:box:

:angry: :angry: :angry: :angry: :angry:
Why did I reach this page instead to see my posted message ?

Possible Reasons:

* You already posted your message (double post)
* You are using a script to post messages
* You hit the reload button but your message was already posted

If you think this is an error please contact our administrators and they will look into this problem. Please make sure to check the forum you posted to see if your message is not already there.

The Administration
:angry: :angry: :angry: :angry: :angry:

tesco
05-06-2004, 02:23 AM
Originally posted by atiVidia@5 May 2004 - 21:28
atividia v. rossco
[quoted above]
:'( why does everyone hate me? :(


:lol:

atiVidia
05-06-2004, 02:30 AM
cuz ur being a pussified machoman :lol:

anyhow: here (http://www.vr-zone.com/?i=769)

is that what i think i see??? a 6800ultra with only 1 power connector??????????



wowowowowowow lolol i love it&#33;

amphoteric88
05-06-2004, 02:32 AM
Originally posted by atiVidia@6 May 2004 - 03:38
cuz ur being a pussified machoman :lol:

anyhow: here (http://www.vr-zone.com/?i=769)

is that what i think i see??? a 6800ultra with only 1 power connector??????????



wowowowowowow lolol i love it&#33;

Wow, only 900 euros :o

atiVidia
05-06-2004, 02:36 AM
thats expensive as hell lol


anyways: here (http://www.firingsquad.com/hardware/geforce_6800_ultra_extreme/)


d00d the OCed 68kGT beats the 68kUltraExtreme!!! and its 1 slot!!! and it uses only 1 power connector!!! if it works with 4xagp im gettin it! otherwise im volt moddin my board to support it.


curiously, my board does support agp8x bandwidth but due to voltages, only qualifies as agp 4x :huh:

amphoteric88
05-06-2004, 02:38 AM
Originally posted by atiVidia@6 May 2004 - 03:44
thats expensive as hell lol


anyways: here (http://www.firingsquad.com/hardware/geforce_6800_ultra_extreme/)


d00d the OCed 68kGT beats the 68kUltraExtreme&#33;&#33;&#33; and its 1 slot&#33;&#33;&#33; and it uses only 1 power connector&#33;&#33;&#33; if it works with 4xagp im gettin it&#33; otherwise im volt moddin my board to support it.


curiously, my board does support agp8x bandwidth but due to voltages, only qualifies as agp 4x :huh:
Why not just get the better card (i.e. the X800)?

tesco
05-06-2004, 02:41 AM
Originally posted by amphoteric88@5 May 2004 - 21:46
[quoted above]
have you even read the rest of the posts in this thread? atividia is one of the ones that doesnt like ati and is fighting adam that its not better than the 6800.

amphoteric88
05-06-2004, 02:43 AM
Originally posted by ROSSCO_2004@6 May 2004 - 03:49
have you even read the rest of the posts in this thread? atividia is one of the ones that doesnt like ati and is fighting adam that its not better than the 6800.
Of course I read the thread. I'm asking why atividia (the apparent nVidia fanboy) just won't get the Radeon. From the reviews, it seems that the X800 is a better card, so why can't he see logic and get that card?

tesco
05-06-2004, 02:46 AM
Originally posted by amphoteric88@5 May 2004 - 21:51
[quoted above]
well, first of all he had a bad experience with ati: the card's BIOS wouldn't get along with his motherboard's BIOS or something like that, so that put him off ati (their cards tend to have problems like that a lot).

there are other reasons I can think of for why each card is better than the other, but he may not believe those... but i know he had that one bad experience though...

amphoteric88
05-06-2004, 02:49 AM
Originally posted by ROSSCO_2004@6 May 2004 - 03:54
well, first of all he had a bad experience with ati, the card&#39;s bios wouldnt get along with his bios or something like that, so that put him off from ati (they tend to have problems like that a lot).

there are other reasons I can think of for why each card is better than teh other, but he may not beleive those...but i know he had that one bad experience though...
Isn't that a bit of a stupid reason for not liking a company?

"A Ford Focus once ran into me, so I'll never buy a Ford"...

tesco
05-06-2004, 03:10 AM
Originally posted by amphoteric88@5 May 2004 - 21:57
[quoted above]
well its more like, "i owned a ford, and it wouldnt start. So im not buying a ford again." and you also hear all over that many people have had problems with their ford car not starting, but you can get a honda which always starts, so then you switch over to honda without looking back... although he may look back because i think ati fixed those problems.

amphoteric88
05-06-2004, 03:17 AM
Originally posted by ROSSCO_2004@6 May 2004 - 04:18
although he may look back becuase i think ati fixed those problems.
I don't think he'll look back because he's obviously an ignorant nVidia fanboy

atiVidia
05-06-2004, 03:46 AM
Originally posted by ROSSCO_2004@5 May 2004 - 22:18
[quoted above]
rofl u nailed my logic perfectly :D

yea thats about right. actually, i had the same problem with more than 1 card. the 9600, 9600pro, aiw9600, and the 9600xt. i was a HUGE ati fanboy then (just cuz they seemed cool) but i really got mad at them cuz they couldnt solve my boot issues with their card. so i switched cards and BAM its fixed...


i might grab an x800, but if i up the mobo in my SFF to one that does 8X agp (or even pciE) well... nvidia is still the way to go.

atiVidia
05-06-2004, 03:48 AM
Originally posted by amphoteric88@5 May 2004 - 22:25
[quoted above]
oh cmon! i may be ignorant but not an nvidia fanboy!!!


wait i think i messed up there... this is what i meant:

oh cmon! i may be an nvidia fanboy but not ignorant!!!

amphoteric88
05-06-2004, 03:55 AM
Originally posted by atiVidia@6 May 2004 - 04:56
[quoted above]
Need I say more? ;)

adamp2p
05-06-2004, 04:41 AM
Originally posted by amphoteric88@5 May 2004 - 20:03
Originally posted by atiVidia@6 May 2004 - 04:56
Originally posted by amphoteric88@5 May 2004 - 22:25
Originally posted by ROSSCO_2004@6 May 2004 - 04:18
although he may look back becuase i think ati fixed those problems.
I don't think he'll look back because he's obviously an ignorant nVidia fanboy
oh cmon! i may be ignorant but not an nvidia fanboy!!!


wait i think i messed up there... this is what i meant:

oh cmon! i may be an nvidia fanboy but not ignorant!!!
Need I say more? ;)
amphoteric88:

It's been almost a year since I saw you post! You are the character who taught me how to screenshot "print screen..." before that I never knew that key existed...

:P

amphoteric88
05-06-2004, 04:44 AM
Originally posted by adamp2p@6 May 2004 - 05:49
amphoteric88:

It's been almost a year since I saw you post! You are the character who taught me how to screenshot "print screen..." before that I never knew that key existed...

:P
B)

Mad Cat
05-06-2004, 07:34 AM
Game devs really would be the biggest idiots in the world if they started forcing Pixel Shader 3 (as they have started to do with Pixel Shader 2 in recent games) when at least half of the high-performance market would be running X800s, never mind the lower-end market.

lynx
05-06-2004, 10:03 AM
Originally posted by bigdawgfoxx@6 May 2004 - 00:01
Yeah, after all those conclusions..Im gona have to say the ATI card is once again the leader... but theres just so many charts showing the 6800 winning, and the x800 winning...its hard to decide. And all this bullshit about "ohhh well the drivers arent that good yet"...thats bullshit...release your card with decent drivers or dont release them.

Thats like releasing a new Ferrari with bad gear ratios...
So why are Ati running on beta drivers? :P

atiVidia
05-06-2004, 11:23 AM
http://www.hardwareanalysis.com/content/article/1711/

RGX
05-06-2004, 12:51 PM
Honestly, no one cares. As I said before, if you want to go out and waste your money on a slower card just because it has one crappy feature, you're welcome to, but don't start trying to convince us that it's the sensible thing to do.

atiVidia
05-06-2004, 04:51 PM
im just trying to balance adams reviews with mine!


lets pretend that i was adam and nVidia was the way to go (for most of u). and i post 3 reviews showing how nvidia creams ati. adam, who is an ati fanboy decides to add 3 counter reviews to balance it out. u cant say that hes trying to convince us to go otherwise. hes balancing the reviews!


same here.

amphoteric88
05-06-2004, 05:14 PM
Originally posted by atiVidia@6 May 2004 - 16:59
im just trying to balance adams reviews with mine!


lets pretend that i was adam and nVidia was the way to go (for most of u). and i post 3 reviews showing how nvidia creams ati. adam, who is an ati fanboy decides to add 3 counter reviews to balance it out. u cant say that hes trying to convince us to go otherwise. hes balancing the reviews!


same here.
How many times do you have to be told? Adam isn't an ati fanboy!

RGX
05-06-2004, 05:18 PM
Originally posted by atiVidia@6 May 2004 - 16:59
im just trying to balance adams reviews with mine!


lets pretend that i was adam and nVidia was the way to go (for most of u). and i post 3 reviews showing how nvidia creams ati. adam, who is an ati fanboy decides to add 3 counter reviews to balance it out. u cant say that hes trying to convince us to go otherwise. hes balancing the reviews!


same here.
So if that's reversed, you are the nVIDIA fanboy trying to balance out factual information with reviews saying otherwise?

Way to go owning yourself. :rolleyes:

adamp2p
05-06-2004, 05:31 PM
Originally posted by atiVidia@6 May 2004 - 08:59
im just trying to balance adams reviews with mine!


lets pretend that i was adam and nVidia was the way to go (for most of u). and i post 3 reviews showing how nvidia creams ati. adam, who is an ati fanboy decides to add 3 counter reviews to balance it out. u cant say that hes trying to convince us to go otherwise. hes balancing the reviews!


same here.
There is agreement everywhere! (http://www.neowin.net/forum/index.php?act=ST&t=164144&f=13&view=findpost&p=2066376)

to atividia (http://www.neowin.net/forum/index.php?act=ST&t=164144&f=13&view=findpost&p=2065958)

in general... (http://www.neowin.net/forum/index.php?act=ST&t=164144&f=13&view=findpost&p=2066061)

atiVidia
05-06-2004, 05:51 PM
Originally posted by adamp2p@6 May 2004 - 12:39
Originally posted by atiVidia@6 May 2004 - 08:59
im just trying to balance adams reviews with mine!


lets pretend that i was adam and nVidia was the way to go (for most of u). and i post 3 reviews showing how nvidia creams ati. adam, who is an ati fanboy decides to add 3 counter reviews to balance it out. u cant say that hes trying to convince us to go otherwise. hes balancing the reviews!


same here.
There is agreement everywhere! (http://www.neowin.net/forum/index.php?act=ST&t=164144&f=13&view=findpost&p=2066376)

to atividia (http://www.neowin.net/forum/index.php?act=ST&t=164144&f=13&view=findpost&p=2065958)

in general... (http://www.neowin.net/forum/index.php?act=ST&t=164144&f=13&view=findpost&p=2066061)
@ adam, i can easily get myself a 6800gt (which will beat the x800pro btw, and even beat the x800XT after an OC) for 250 (that's a 150 discount :))

so im not affecting any of you. plus, 9.0c (as i stated earlier) is in the last beta stage (release candidate 4) and will most likely be out before the end of the year. as i have said before, many games coming out by the end of this year are already switching to 32-bit floating point. sure, in 2 years ps3 will be out, but even when nvidia and ati release their next cards, users of nvidia cards won't need to spend another 500 bucks, as their cards already support the new features :P

don't try to counter this argument cuz saying that nvidia is not prepared for the future is false.


look, i'll give the x800 a shot and if i get the same issue i will switch, ok? u happy now?????? geez!

RGX
05-06-2004, 06:06 PM
Originally posted by atiVidia@6 May 2004 - 17:59
look ill give the x800 a shot and if i get the same issue i will switch ok? u happy now?????? geez!
You're welcome. Personally I prefer my games to have anti-aliasing, AF and working shaders without a huge performance drop, but that's just me I guess....

atiVidia
05-06-2004, 06:44 PM
it prolly is as i can have the same performance without any issues :)

and i can play ut2004 at 2048x1536 (unlike u :))

amphoteric88
05-06-2004, 06:47 PM
Originally posted by atiVidia@6 May 2004 - 18:52
and i can play ut2004 at 2048x1536 (unlike u :))
Gayest. Argument. Ever.

RGX
05-06-2004, 06:55 PM
Originally posted by amphoteric88@6 May 2004 - 18:55
Originally posted by atiVidia@6 May 2004 - 18:52
and i can play ut2004 at 2048x1536 (unlike u :))
Gayest. Argument. Ever.
Agreed. TBH, atividia, talking to you or even reading your inane, nonsensical and frankly retarded posts is a waste of my time and your effort. Trying to reason with you is like ramming my head into a cow in the hope that it will learn Einstein, and so to this pointless argument I say goodbye.

Have fun with your 10,000-megavolt-drawing chunk of silicon.

adamp2p
05-06-2004, 07:41 PM
Originally posted by RGX@6 May 2004 - 11:03
Originally posted by amphoteric88@6 May 2004 - 18:55
Originally posted by atiVidia@6 May 2004 - 18:52
and i can play ut2004 at 2048x1536 (unlike u :))
Gayest. Argument. Ever.
Agreed. TBH, atividia talking to you or even reading your inane, non-sensical and frankly retarded posts is a waste of my time and your effort. Trying to reason with you is like ramming my head into a cow in the hope that it will learn Einstein, and so to this pointless argument I say goodbye.

Have fun with your 10,000 megavolt-drawing chunk of silicon.
;) :lol: :) :unsure:

johnboy27
05-06-2004, 07:43 PM
Hey adam, did you ever feel like you are
http://www.rock-it-land.com/southwestvoodoo/deadhorse1.gif
I think you are on this one. Nvidia fans will never admit defeat.

atiVidia
05-06-2004, 08:09 PM
eew that pic is just sick <_<

kaiweiler
05-06-2004, 08:12 PM
:lol: :lol:
I think it's hilarious!
but then again i hate horses more than anything...

Mad Cat
05-06-2004, 08:14 PM
Why can't the X800 play UT2004 at 2048x1536?

atiVidia
05-06-2004, 08:15 PM
nvidia's drivers can force games to run at resolutions higher than they otherwise allow. ut2004 has a limit of 1600x1200; nvidia can force it to 2048x1536 whereas ati can't.

Mad Cat
05-06-2004, 08:19 PM
Originally posted by atiVidia@6 May 2004 - 20:23
nvidia drivers can force games to play at resolutions higher than they can otherwise. ut2004 has a limit of 1600x1200. nvidia can force it to 2048x1536 whereas ati cant.
That's no argument. I bet any number of registry hacks/fixes can do the same, never mind game tweaks.
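For what it's worth, UT2004's 1600x1200 cap is only a menu limit; the engine reads the fullscreen viewport straight from its INI file. A sketch of the tweak, from memory of UT2004.ini (treat the section and key names as an assumption, and check your own INI before editing):

```ini
; UT2004.ini -- request a resolution above the in-game menu's 1600x1200 cap
[WinDrv.WindowsClient]
FullscreenViewportX=2048
FullscreenViewportY=1536
```

This works (or fails) identically on either vendor's card, since the game simply requests that mode from the driver.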

atiVidia
05-06-2004, 09:13 PM
will it be at a decent fps?


id like to see that.

@rgx: if i was trying to reason with you, this thread would not have hit 6 pages. stop trolling :) if it's a waste of your time, then WHY ARE YOU DOING IT???


:teehee:

johnboy27
05-06-2004, 09:27 PM
Originally posted by kaiweiler@6 May 2004 - 21:20
:lol: :lol:
I think it&#39;s hilarious&#33;
but then again i hate horses more then anything...
Must just be us crazy Nova Scotians, I guess, because I find it funny as hell also. :clap:

RGX
05-06-2004, 09:34 PM
Originally posted by atiVidia@6 May 2004 - 21:21
will it be at a decent fps?


id like to see that.

@rgx: if i was trying to reason with you, this thread would not have hit 6 pages. stop trolling :) if its a waste of your time, then WHY ARE YOU DOING IT???


:teehee:
I had stopped. It was a waste of my time because you were an idiot. I am not trolling, I am stating my opinion. You are stating yours, and although in practice it is wrong, you are welcome to it. I also have the right to argue with that opinion, however. This argument was and still is a waste of my time, but a direct reply requires a direct response.

atiVidia
05-06-2004, 09:38 PM
Originally posted by RGX@6 May 2004 - 16:42
Originally posted by atiVidia@6 May 2004 - 21:21
will it be at a decent fps?


id like to see that.

@rgx: if i was trying to reason with you, this thread would not have hit 6 pages. stop trolling :) if its a waste of your time, then WHY ARE YOU DOING IT???


:teehee:
I had stopped. It was a waste of my time because you were an idiot. I am not trolling, I am stating my opinion. You are stating yours, and although in practice it is wrong, you are welcome to it. I also have the right to argue with that opinion however. This argument was and still is a wast of my time, but a direct reply requires a direct response.
i coulda sworn u were trolling.


also the 6800gt (when OC'd) can beat an x800xt OC'ed, and takes less power than many other cards.


so ur argument is false.






i wont be surprised if someone ends up asking to close this thread or move it to the lounge :P

RGX
05-06-2004, 09:40 PM
Originally posted by atiVidia@6 May 2004 - 21:46
i coulda sworn u were trolling.

also the 6800gt (when OCd, can beat an x800xt OCed) takes less power than many other cards.

so ur argument is false.

i wont be surprised if someone ends up asking to close this thread or move it to the lounge :P
1) Please post a link to benchmark scores of an X800XT OC'ed in direct comparison to a 6800GT, also OC'ed.

2) Please post the power draw from the cards also.

I would be very interested to see this information.

adamp2p
05-06-2004, 09:42 PM
Originally posted by johnboy27@6 May 2004 - 11:51
Hey adam, did you ever feel like you are
http://www.rock-it-land.com/southwestvoodoo/deadhorse1.gif
I think you are on this one. Nvidia fans will never admit defeat.
That's great John! :lol:

Did you make that yourself?

adamp2p
05-06-2004, 09:52 PM
xbitlabs poll (http://www.xbitlabs.com/misc/poll_display/59.html)

neowin poll (http://www.neowin.net/forum/index.php?showtopic=164144)

;)

tesco
05-06-2004, 09:57 PM
Originally posted by adamp2p@6 May 2004 - 17:00
xbitlabs poll (http://www.xbitlabs.com/misc/poll_display/59.html)

neowin poll (http://www.neowin.net/forum/index.php?showtopic=164144)

;)
nvidia is getting killed! :lol:

I prefer the xbit labs poll:

"Come on, they both are too expensive..." - 524 (40%)

add my vote to that one.

atiVidia
05-07-2004, 01:19 AM
Originally posted by Firing Squad
in addition to the X800 XT Platinum Edition, ATI also offers the X800 PRO. The X800 PRO differs from the XT Platinum Edition in the number of pipelines (which has been reduced from 16 in the XT to 12 in the PRO) and the clocks, which are reduced to 475MHz core and 900MHz memory. Both boards sport 256-bit memory interfaces with 256MB of GDDR3 memory, and the same 160 million transistor count.

Modders will be disappointed to hear that ATI has physically disabled four of the X800 PRO's pipelines, so you won't be able to enable all sixteen pipelines in the core via software solutions, including flashing to an X800 XT Platinum Edition BIOS. Since the rendering core is split up into independent quad pipes, some pipes can be disabled without affecting the rest of the chip. This will also help reduce manufacturing costs. If a manufacturing defect occurs in one of the 16 pipes, the entire block can be disabled and the card can be sold as a 12-pipe PRO board.

It's rumored that ATI may eventually provide an 8-pipe X800 SE card; since the architecture can operate in 4, 8, 12, and 16 pipeline configurations this is certainly possible.


adam u were wrong about the soft-mod thing :P
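The quad-level binning Firing Squad describes can be sketched as a toy model. The 4/8/12/16 granularity comes from the article; the mapping of pipe counts to product names is my assumption, purely for illustration:

```python
# Toy model of the quad-pipe binning described above: a 16-pipe die is
# built from four independent quads; any quad containing a defective
# pipe is fused off, and the part ships at the largest working config.
def bin_die(defective_pipes):
    """Map a set of defective pipe indices (0-15) to a sellable config."""
    quads = [range(q * 4, q * 4 + 4) for q in range(4)]
    working = sum(4 for quad in quads
                  if not any(p in defective_pipes for p in quad))
    # Hypothetical product name per pipe count (my labels, not ATI's).
    names = {16: "X800 XT", 12: "X800 PRO", 8: "X800 SE (rumored)",
             4: "(salvage)", 0: "(scrap)"}
    return working, names[working]

print(bin_die(set()))     # flawless die -> (16, 'X800 XT')
print(bin_die({5}))       # one bad pipe kills its quad -> (12, 'X800 PRO')
print(bin_die({1, 14}))   # defects in two quads -> (8, 'X800 SE (rumored)')
```

This is also why flashing an XT BIOS onto a PRO can't help: the disabled quad is fused off in hardware, not merely hidden by software.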

bigdawgfoxx
05-07-2004, 01:50 AM
What is the 6800GT?

X800 and X800 Pro both have 12 pipes, with the X800 having a slightly slower clock, correct?

Edit: Yeah :P haha

adamp2p
05-07-2004, 02:05 AM
Originally posted by bigdawgfoxx@6 May 2004 - 17:58
What is the 6800GT?

At least ATI has an X800, X800 Pro, and an X800XT. All Nvidia has is the 6800Ultra.

X800 and X800 Pro both have 12 Pipes with the X800 having a little bit slower clock correct?
Actually, Nvidia has pretty much the same standard 8, 12, 16 pipeline strategy as ATi.

bigdawgfoxx
05-07-2004, 02:09 AM
what are the 12- and 16-pipe cards?

I know the Ultra is 16, but which one is 12? and what is the GT?

tesco
05-07-2004, 02:24 AM
Originally posted by bigdawgfoxx@6 May 2004 - 21:17
what are the 12 and 16 cards?

I know the ultra is 16 but which one is 12? and what is the GT?
found this on another forum:

6800 Ultra (16x1 @ 400MHz 256bit 256MB 1.1GHz memory)
6800 GT (16x1 @ 350MHz 256bit 256MB 1GHz memory)
6800 (12x1 @ 325MHz 256bit 128~256MB 700MHz memory)

so it looks like the gt is like the x800pro...
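Running the numbers in that list gives a feel for the theoretical gaps. These are just the textbook formulas (fill rate = pipes x core clock; bandwidth = effective memory rate x bus width), not measured results, and they assume the listed memory speeds are effective DDR rates:

```python
# Theoretical pixel fill rate and memory bandwidth from the specs above.
def fill_rate_mpix(pipes, core_mhz):
    """Peak fill rate in Mpixels/s: one pixel per pipe per clock."""
    return pipes * core_mhz

def bandwidth_gbs(mem_mts, bus_bits=256):
    """Peak bandwidth in GB/s: transfers/s x bus width in bytes."""
    return mem_mts * (bus_bits // 8) / 1000

cards = {
    "6800 Ultra": (16, 400, 1100),   # pipes, core MHz, effective mem MT/s
    "6800 GT":    (16, 350, 1000),
    "6800":       (12, 325, 700),
}
for name, (pipes, core, mem) in cards.items():
    print(f"{name}: {fill_rate_mpix(pipes, core)} Mpix/s, "
          f"{bandwidth_gbs(mem):.1f} GB/s")
# e.g. the Ultra works out to 6400 Mpix/s and 35.2 GB/s,
# the plain 6800 to 3900 Mpix/s and 22.4 GB/s.
```

On paper the GT sits much closer to the Ultra than to the plain 6800, which fits the "gt is like the x800pro" reading.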

bigdawgfoxx
05-07-2004, 02:32 AM
Alright thanks

I cant wait for the cards to be in stores lol

tesco
05-07-2004, 02:35 AM
Originally posted by bigdawgfoxx@6 May 2004 - 21:40
Alright thanks

I cant wait for the cards to be in stores lol
ya, once they start to get some real benchmarks on all the cards, not stupid pictures of what resolution each card can run at <_< lol

atiVidia
05-07-2004, 02:37 AM
http://www.pricewatch.com/1/37/6017-1.htm

have fun ;)

adamp2p
05-07-2004, 02:45 AM
Originally posted by atiVidia@6 May 2004 - 18:45
http://www.pricewatch.com/1/37/6017-1.htm

have fun ;)
I am not upgrading until PCI Express is released!

see this! (http://www.neowin.net/forum/index.php?act=ST&t=163813&f=8&view=findpost&p=2071142)

adamp2p
05-07-2004, 03:04 AM
http://www.pcstats.com/articleview.cfm?articleid=1578&page=8

http://www.pcstats.com/i/v3pctats_75T.gif

Benchmarks: Far Cry, Conclusions


DirectX 9.0b-compatible Graphics make Far Cry a true test of DX9 compatible videocards. Based on the CryENGINE, Far Cry boasts real-time editing, bump mapping, static lights, network system, integrated physics system, shaders, shadows, and a dynamic music system. In the absence of either DOOM III or Half-Life 2, Far Cry has quickly become a single-player hit. Not only does it feature immersive graphics, but its story is actually somewhat entertaining.


Despite a recent patch that supposedly accelerated nVidia's shader routines, ATI's new X800 XT dominates the overall performance picture, though at 1024x768 the GeForce 6800 Ultra comes in on top. At 1024x768 and 1280x1024, processor performance looks like it's limiting the X800 XT, while the GeForce 6800 Ultra begins dropping off after 1280x1024. By 1600x1200, the flagship Radeon X800 XT has a 21 percent lead on the GeForce 6800 Ultra.

Though Far Cry's graphical intensity shrinks the gap between the GeForce 6800 Ultra and Radeon X800 to just slightly over 1 FPS at 1600x1200, lower resolutions still favor ATI by up to 14 percent. The Radeon 9800 XT fares just as well, beating nVidia's GeForce FX 5950 Ultra at each resolution, though only slightly.


Conclusions

nVidia's new GeForce 6800 Ultra is, without question, a massive improvement over the GeForce FX 5950 Ultra that came before it. And, at least from first glance, it's pretty clear that this time nVidia has the performance numbers to back it up.

Nvidia's new architecture introduces a lot of potential beyond what we saw with NV3x. To begin, it features a massively parallel 3D pipeline for achieving up to 8x higher pixel shading (a historic weakness of NV40's predecessor) and up to 2x better vertex shading performance.

Moreover, the architecture supports Shader Model 3.0, part of Microsoft's upcoming DirectX 9.0c. Among the improvements there, displacement mapping, vertex texture fetching, and FP32 shader precision are perhaps the most notable enhancements. The DX9 subset also introduces longer vertex and pixel shader programs, dynamic flow control, and geometry instancing as well, though.

But even more relevant is the nVidia GeForce 6800's position versus ATI's recently announced Radeon X800 XT, formerly referred to as R420. The Radeon X800 XT does lack in a few areas - it doesn't support the Shader Model 3.0 specification, and in turn, it won't do FP32 precision. At the same time, however, it is a bit faster than the GeForce 6800 Ultra in what could be construed as the most demanding metrics, consumes a single AGP slot, and it doesn't come with a 480W power supply recommendation.

In response to this, nVidia is reportedly readying a limited-edition version of the GeForce 6800 Ultra bearing an Extreme suffix. Performance considerations aside, the Extreme version of the nVidia Geforce 6800 will be even more expensive than the 6800 Ultra, and purportedly sparsely available. The card's relevance in relation to its competition remains to be seen, but when it is released, we'll do our best to bring you a review of it, so you can judge for yourself.

So, although nVidia has clearly taken strides to improve upon its previous architecture, and while it holds Shader Model 3.0 and FP32 precision over ATI's head, the nVidia GeForce 6800 Ultra fell just short of ATI's Radeon X800 XT video card in most of the benchmarks we've shown you - especially at higher resolutions with all the eye candy turned on. For luscious gaming, with killer image quality at the highest resolutions, it's clear that the Geforce FX 5950 Ultra has been significantly surpassed by the new Geforce 6800 Ultra. The differences between the nVidia Geforce 6800 Ultra and the ATI Radeon X800 XT are closer, but it is apparent that nVidia's NV40 still has some catching up to do!



;)
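The "21 percent lead" in the quoted conclusion is just a relative frame-rate ratio; with made-up fps values (the excerpt doesn't reprint the raw numbers), the arithmetic looks like:

```python
# Relative lead of card A over card B, as used in review summaries:
# lead % = (fps_a - fps_b) / fps_b * 100.
def percent_lead(fps_a, fps_b):
    return (fps_a - fps_b) / fps_b * 100

# Hypothetical 1600x1200 Far Cry numbers, chosen only to illustrate the math.
print(round(percent_lead(48.4, 40.0), 1))  # -> 21.0
```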

RGX
05-07-2004, 06:19 PM
Originally posted by RGX@6 May 2004 - 21:48
1) Please post a link to benchmark scores of an X800XT OC'ed in direct comparison to a 6800GT, also OC'ed.

2) Please post the power draw from the cards also.

I would be very interested to see this information.
Still waiting...

atiVidia
05-07-2004, 07:30 PM
wait til mid june and ill confirm it for ya ;)

RGX
05-07-2004, 07:38 PM
Originally posted by atiVidia@7 May 2004 - 19:38
wait til mid june and ill confirm it for ya ;)
So basically, you lied. You are basing your "facts" on information that isn't available yet. Well done.

*Claps*

atiVidia
05-07-2004, 08:17 PM
thank you (i bow)


im waiting til mid june to confirm them for myself. u get urself an x800xt, and me a 6800gt, and we'll OC them and bench

RGX
05-07-2004, 09:01 PM
I don't doubt it will probably be faster. I think I could make an MX440 faster than the X800XT with a month and almost unlimited resources. All I am saying is you are spreading bullshit as fact. Moron.

Mad Cat
05-07-2004, 11:23 PM
Originally posted by atiVidia@7 May 2004 - 20:25
thank you (i bow)


im waiting til mid june to confirm them for myself. u get urself an x800xt, and me a 6800gt, and well OC them and bench
Yes, let's all run totally unfair tests on which to base opinions. Get two identical systems: the same batch number on each chip, RAM, etc., with the only variable being the card. Even better, use the same system and only replace the card each time.

Your idea is like pitting a tiger against a mouse with 4 broken legs.

3RA1N1AC
05-07-2004, 11:52 PM
Originally posted by Mad Cat@7 May 2004 - 15:31
Your idea is like pitting a tiger against a mouse with 4 broken legs.
http://www.tultw.com/pics/speedy0129.JPG

bigdawgfoxx
05-07-2004, 11:58 PM
I think the X800XT would beat the 6800GT... it's not even the top nvidia card.

atiVidia
05-08-2004, 12:58 AM
Originally posted by bigdawgfoxx@7 May 2004 - 19:06
I think the X800XT would beat the 6800GT...its not even the top nvidia card.
6800GT OC'ed... it's worth a try

tesco
05-08-2004, 03:59 AM
Originally posted by atiVidia@7 May 2004 - 20:06
Originally posted by bigdawgfoxx@7 May 2004 - 19:06
I think the X800XT would beat the 6800GT...its not even the top nvidia card.
6800gtOCed... its worth a try
and cheaper too...

johnboy27
05-08-2004, 07:14 AM
Originally posted by atiVidia@8 May 2004 - 02:06
Originally posted by bigdawgfoxx@7 May 2004 - 19:06
I think the X800XT would beat the 6800GT...its not even the top nvidia card.
6800gtOCed... its worth a try
Get over it, man. You have absolutely no way of knowing whether or not that card will be a good overclocker.
Why can't you just face the fact that the X800 series of cards appears to be the better choice right now? Sure, Nvidia will get the drivers right for their cards shortly, I'm sure, and then they will perform better (maybe better than the ATI, maybe not). You sit here and say "No, I am not an Nvidia fanboy, blah blah blah..." but you just keep trying to plug the card when it is quite obvious that "right now" ATI is leading the race. Accept it. Your arguments about "yeah, well I am gonna overclock this and it will be even faster than yours will ever be, blah blah blah..." just sound like kids on the playground in elementary school going "Yeah, well my dad is bigger than your dad, blah blah blah." Get over the fanboyism (I don't think it's a word... but oh well :lol: ).
I have a coworker who has been just like you for a long time: Nvidia has been the only way to go for him, and he wouldn't think of getting an ATI. Now he has been reading all the reviews and he is starting to come around. He now respects both ATI and Nvidia equally. He prefers Nvidia, but knows that when the time comes for him to get a new card, he will give ATI a chance. Whoever is leading the pack at the time is whose card he'll be buying.

bigdawgfoxx
05-08-2004, 02:36 PM
Yeah ATI, you say you're not an nvidia fanboy, but you are acting just like one.. give it up. As of right now, ATI is best... you cannot deny that.

RGX
05-08-2004, 02:41 PM
Originally posted by bigdawgfoxx@8 May 2004 - 14:44
Yeah ATI, you say your not an nvidia fanboy, but you are acting just like one..give it up. As of right now, ATI is best...you cannot deny that.
Exactly. The 6800GT probably will be faster OC'ing, because nvidia will go crazy over the fact they got their asses kicked AGAIN (although the 6800 is still a good card).

bigdawgfoxx
05-08-2004, 02:46 PM
Couldn't you overclock the ATI card also?

Mad Cat
05-08-2004, 03:26 PM
'scuse me, but isn't the 6800GT the worst of the 6800 litter?

I also heard about a 6800 Ultra Extreme; jesus, why can't they think up any good names...?

EDIT: And atividia, what are you on about nVidia being better overclockers?

Was it not ATi that just broke the Futuremark record with an overclocked X800?
Strange how we don't hear that about the nvidias.. :P

tesco
05-08-2004, 03:31 PM
Originally posted by Mad Cat@8 May 2004 - 10:34
'scuse me, but isn't the 6800GT the worst of the 6800 litter?

I also heard about a 6800 Ultra Extreme, jesus, why can't they think up any good names...?

EDIT: And atividia, what are you on about nVidia better overclockers?

Was it not ATi that just broke the Futuremark record with an overclocked X800?
Strange how we don't hear that about the nvidias.. :P
no, the GT is in the middle... it goes Ultra, then GT, then plain old 6800.

I guess the Ultra Extreme ( :lol: ) will be at the top of the list, above the Ultra.