Hey adam, did you ever feel like you are
http://www.rock-it-land.com/southwes...deadhorse1.gif
I think you are on this one. Nvidia fans will never admit defeat.
eew that pic is just sick <_<
:lol: :lol:
I think it's hilarious!
but then again i hate horses more than anything...
Why can't the X800 play UT2004 at 2048x1536?
nvidia drivers can force games to play at resolutions higher than they can otherwise. ut2004 has a limit of 1600x1200. nvidia can force it to 2048x1536 whereas ati cant.
That's no argument. I bet any number of registry hacks/fixes can do the same, never mind game tweaks.
Quote:
Originally posted by atiVidia@6 May 2004 - 20:23
nvidia drivers can force games to play at resolutions higher than they can otherwise. ut2004 has a limit of 1600x1200. nvidia can force it to 2048x1536 whereas ati cant.
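For what it's worth, UT2004's resolution limit isn't hard-coded either — it can be raised in the game's own config file, no driver trick needed. A hedged sketch (section and key names as I remember them from UT2004.ini in the game's System folder, so treat them as assumptions):

```ini
; UT2004.ini — assumed to live in the game's System folder
[WinDrv.WindowsClient]
FullscreenViewportX=2048
FullscreenViewportY=1536
```

Whether the card then renders that resolution at a playable framerate is a separate question, of course.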
will it be at a decent fps?
id like to see that.
@rgx: if i was trying to reason with you, this thread would not have hit 6 pages. stop trolling :) if its a waste of your time, then WHY ARE YOU DOING IT???
:teehee:
Must just be us crazy Nova Scotians, I guess, because I find it funny as hell also. :clap:
Quote:
Originally posted by kaiweiler@6 May 2004 - 21:20
:lol: :lol:
I think it's hilarious!
but then again i hate horses more then anything...
I had stopped. It was a waste of my time because you were an idiot. I am not trolling, I am stating my opinion. You are stating yours, and although in practice it is wrong, you are welcome to it. I also have the right to argue with that opinion, however. This argument was and still is a waste of my time, but a direct reply requires a direct response.
Quote:
Originally posted by atiVidia@6 May 2004 - 21:21
will it be at a decent fps?
id like to see that.
@rgx: if i was trying to reason with you, this thread would not have hit 6 pages. stop trolling :) if its a waste of your time, then WHY ARE YOU DOING IT???
:teehee:
Quote:
Originally posted by RGX@6 May 2004 - 16:42
Quote:
Originally posted by atiVidia@6 May 2004 - 21:21
will it be at a decent fps?
id like to see that.
@rgx: if i was trying to reason with you, this thread would not have hit 6 pages. stop trolling :) if its a waste of your time, then WHY ARE YOU DOING IT???
:teehee:
I had stopped. It was a waste of my time because you were an idiot. I am not trolling, I am stating my opinion. You are stating yours, and although in practice it is wrong, you are welcome to it. I also have the right to argue with that opinion, however. This argument was and still is a waste of my time, but a direct reply requires a direct response.
i coulda sworn u were trolling.
also the 6800gt (when OCd, can beat an x800xt OCed) takes less power than many other cards.
so ur argument is false.
i wont be surprised if someone ends up asking to close this thread or move it to the lounge :P
Quote:
Originally posted by atiVidia@6 May 2004 - 21:46
i coulda sworn u were trolling.
also the 6800gt (when OCd, can beat an x800xt OCed) takes less power than many other cards.
so ur argument is false.
i wont be surprised if someone ends up asking to close this thread or move it to the lounge :P
1) Please post a link to benchmark scores of an X800XT OC'ed in direct comparison to a 6800GT, also OC'ed.
2) Please post the power draw from the cards also.
I would be very interested to see this information.
That's great John! :lol:
Quote:
Originally posted by johnboy27@6 May 2004 - 11:51
Hey adam, did you ever feel like you are
http://www.rock-it-land.com/southwes...deadhorse1.gif
I think you are on this one. Nvidia fans will never admit defeat.
Did you make that yourself?
nvidia is getting killed! :lol:
Quote:
I prefer the xbit labs poll
add my vote to that one.
Quote:
Come on, they both are too expensive... - 524 (40%)
adam u were wrong about the soft-mod thing :P
Quote:
Originally posted by Firing Squad
In addition to the X800 XT Platinum Edition, ATI also offers the X800 PRO. The X800 PRO differs from the XT Platinum Edition in the number of pipelines (which has been reduced from 16 in the XT to 12 in the PRO) and the clocks, which are reduced to 475MHz core and 900MHz memory. Both boards sport 256-bit memory interfaces with 256MB of GDDR3 memory, and the same 160 million transistor count.
Modders will be disappointed to hear that ATI has physically disabled four of the X800 PRO's pipelines, so you won’t be able to enable all sixteen pipelines in the core via software solutions, including flashing to an X800 XT Platinum Edition BIOS. Since the rendering core is split up into independent quad pipes, some pipes can be disabled without affecting the rest of the chip. This will also help reduce manufacturing costs. If a manufacturing defect occurs in one of the 16 pipes, the entire block can be disabled and the card can be sold as a 12-pipe PRO board.
It’s rumored that ATI may eventually provide an 8-pipe X800 SE card; since the architecture can operate in 4, 8, 12, and 16 pipeline configurations, this is certainly possible.
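The pipeline counts and clocks quoted above translate directly into theoretical peak pixel fillrate: one pixel per pipeline per clock, so pipes × core MHz. A quick sketch of that arithmetic (the XT Platinum Edition's widely reported 520MHz core clock is an assumption on my part, not stated in the quote):

```python
def pixel_fillrate_mpixels(pipelines: int, core_mhz: int) -> int:
    """Theoretical peak fillrate in Mpixels/s: one pixel per pipeline per clock."""
    return pipelines * core_mhz

# X800 XT PE: 16 pipes at (reportedly) 520MHz -> 8320 Mpixels/s
print(pixel_fillrate_mpixels(16, 520))
# X800 PRO: 12 pipes at the quoted 475MHz -> 5700 Mpixels/s
print(pixel_fillrate_mpixels(12, 475))
```

This is why disabling a quad (4 pipes) is a meaningful product segmentation lever: the PRO gives up roughly a third of the XT's theoretical fillrate before clock differences are even counted.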
What is the 6800GT?
X800 and X800 Pro both have 12 Pipes with the X800 having a little bit slower clock correct?
Edit: Yeah :P haha
Actually, Nvidia has pretty much the same standard 8, 12, 16 pipeline strategy as ATi.
Quote:
Originally posted by bigdawgfoxx@6 May 2004 - 17:58
What is the 6800GT?
At least ATI has an X800, X800 Pro, and an X800XT. All Nvidia has is the 6800Ultra.
X800 and X800 Pro both have 12 Pipes with the X800 having a little bit slower clock correct?
what are the 12 and 16 cards?
I know the ultra is 16 but which one is 12? and what is the GT?
found this on another forum:
Quote:
Originally posted by bigdawgfoxx@6 May 2004 - 21:17
what are the 12 and 16 cards?
I know the ultra is 16 but which one is 12? and what is the GT?
so it looks like the gt is like the x800pro...
Quote:
6800 Ultra (16x1 @ 400MHz 256bit 256MB 1.1GHz memory)
6800 GT (16x1 @ 350MHz 256bit 256MB 1GHz memory)
6800 (12x1 @ 325MHz 256bit 128~256MB 700MHz memory)
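Those memory clocks can be sanity-checked with the usual peak-bandwidth formula: bus width in bytes × effective memory transfer rate. A small sketch using the figures from the list above (the figures are the quoted specs, not measurements):

```python
def mem_bandwidth_gb_s(bus_bits: int, effective_mhz: int) -> float:
    """Peak memory bandwidth in GB/s: (bus_bits / 8) bytes per transfer x transfers per second."""
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

print(mem_bandwidth_gb_s(256, 1100))  # 6800 Ultra: 35.2 GB/s
print(mem_bandwidth_gb_s(256, 1000))  # 6800 GT:    32.0 GB/s
print(mem_bandwidth_gb_s(256, 700))   # 6800:       22.4 GB/s
```

The plain 6800's 700MHz memory is the biggest spec gap in the lineup: it gives up over a third of the Ultra's bandwidth, far more than the clock or pipeline differences alone suggest.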
Alright thanks
I cant wait for the cards to be in stores lol
ya, once they start to get some real benchmarks on all the cards, not stupid pictures of what resolution each card can run <_< lol
Quote:
Originally posted by bigdawgfoxx@6 May 2004 - 21:40
Alright thanks
I cant wait for the cards to be in stores lol
http://www.pricewatch.com/1/37/6017-1.htm
have fun ;)
I am not upgrading until PCI Express is released!
Quote:
Originally posted by atiVidia@6 May 2004 - 18:45
http://www.pricewatch.com/1/37/6017-1.htm
have fun ;)
see this!
http://www.pcstats.com/articleview.c...id=1578&page=8
http://www.pcstats.com/i/v3pctats_75T.gif
Benchmarks: Far Cry, Conclusions
;)
Quote:
DirectX 9.0b-compatible Graphics make Far Cry a true test of DX9 compatible videocards. Based on the CryENGINE, Far Cry boasts real-time editing, bump mapping, static lights, network system, integrated physics system, shaders, shadows, and a dynamic music system. In the absence of either DOOM III or Half-Life 2, Far Cry has quickly become a single-player hit. Not only does it feature immersive graphics, but its story is actually somewhat entertaining.
Despite a recent patch that supposedly accelerated nVidia's shader routines, ATI's new X800 XT dominates the overall performance picture, though at 1024x768 the Geforce 6800 Ultra comes in on top. At 1024x768 and 1280x1024, processor performance looks like it's limiting the X800 XT, while the GeForce 6800 Ultra begins dropping off after 1280x1024. By 1600x1200, the flagship Radeon X800 XT has a 21 percent lead on the Geforce 6800 Ultra.
Though Far Cry's graphical intensity shrinks the gap between the GeForce 6800 Ultra and Radeon X800 to just slightly over 1 FPS at 1600x1200, lower resolutions still favor ATI by up to 14 percent. The Radeon 9800 XT fares just as well, beating nVidia's GeForce FX 5950 Ultra at each resolution, though only slightly.
Conclusions
nVidia's new GeForce 6800 Ultra is, without question, a massive improvement over the GeForce FX 5950 Ultra that came before it. And, at least from first glance, it's pretty clear that this time nVidia has the performance numbers to back it up.
Nvidia's new architecture introduces a lot of potential beyond what we saw with NV3x. To begin, it features a massively parallel 3D pipeline for achieving up to 8x higher pixel shading (a historic weakness of NV40's predecessor) and up to 2x better vertex shading performance.
Moreover, the architecture supports Shader Model 3.0, part of Microsoft's upcoming DirectX 9.0c. Among the improvements there, displacement mapping, vertex texture fetching, and FP32 shader precision are perhaps the most notable enhancements. The DX9 subset also introduces longer vertex and pixel shader programs, dynamic flow control, and geometry instancing.
But even more relevant is the nVidia GeForce 6800's position versus ATI's recently announced Radeon X800 XT, formerly referred to as R420. The Radeon X800 XT does lack in a few areas - it doesn't support the Shader Model 3.0 specification, and in turn, it won't do FP32 precision. At the same time, however, it is a bit faster than the GeForce 6800 Ultra in what could be construed as the most demanding metrics, consumes a single AGP slot, and it doesn't come with a 480W power supply recommendation.
In response to this, nVidia is reportedly readying a limited-edition version of the GeForce 6800 Ultra bearing an Extreme suffix. Performance considerations aside, the Extreme version of the nVidia Geforce 6800 will be even more expensive than the 6800 Ultra, and purportedly sparsely available. The card's relevance in relation to its competition remains to be seen, but when it is released, we'll do our best to bring you a review of it, so you can judge for yourself.
So, although nVidia has clearly taken strides to improve upon its previous architecture, and while it holds Shader Model 3.0 and FP32 precision over ATI's head, the nVidia GeForce 6800 Ultra fell just short of ATI's Radeon X800 XT video card in most of the benchmarks we've shown you - especially at higher resolutions with all the eye candy turned on. For luscious gaming, with killer image quality at the highest resolutions, it's clear that the Geforce FX 5950 Ultra has been significantly surpassed by the new Geforce 6800 Ultra. The differences between the nVidia Geforce 6800 Ultra and the ATI Radeon X800 XT are closer, but it is apparent that nVidia's NV40 still has some catching up to do!
Quote:
Originally posted by RGX@6 May 2004 - 21:48
1) Please post a link to benchmark scores of an X800XT OC'ed in direct comparison to a 6800GT, also OC'ed.
2) Please post the power draw from the cards also.
I would be very interested to see this information.
Still waiting...
wait til mid june and ill confirm it for ya ;)
So basically, you lied. You are basing your "facts" on information that isn't available yet. Well done.
Quote:
Originally posted by atiVidia@7 May 2004 - 19:38
wait til mid june and ill confirm it for ya ;)
*Claps*
thank you (i bow)
im waiting til mid june to confirm them for myself. u get urself an x800xt, and me a 6800gt, and we'll OC them and bench
I don't doubt it will probably be faster. I think I could make an MX440 faster than the X800XT with a month and almost unlimited resources. All I am saying is you are spreading bullshit as fact. Moron.
Yes, let's all run totally unfair tests on which to base opinions. Get 2 perfectly identical systems, the same batch number on each chip, RAM, etc., the only variable being the card. Even better, use the same system and only replace the card each time.
Quote:
Originally posted by atiVidia@7 May 2004 - 20:25
thank you (i bow)
im waiting til mid june to confirm them for myself. u get urself an x800xt, and me a 6800gt, and well OC them and bench
Your idea is like pitting a tiger against a mouse with 4 broken legs.
http://www.tultw.com/pics/speedy0129.JPG
Quote:
Originally posted by Mad Cat@7 May 2004 - 15:31
Your idea is like pitting a tiger against a mouse with 4 broken legs.
I think the X800XT would beat the 6800GT...its not even the top nvidia card.
Quote:
Originally posted by bigdawgfoxx@7 May 2004 - 19:06
I think the X800XT would beat the 6800GT...its not even the top nvidia card.
6800gtOCed... its worth a try
Quote:
Originally posted by atiVidia@7 May 2004 - 20:06
Quote:
Originally posted by bigdawgfoxx@7 May 2004 - 19:06
I think the X800XT would beat the 6800GT...its not even the top nvidia card.
and cheaper too...
6800gtOCed... its worth a try
Quote:
Originally posted by atiVidia@8 May 2004 - 02:06
Quote:
Originally posted by bigdawgfoxx@7 May 2004 - 19:06
I think the X800XT would beat the 6800GT...its not even the top nvidia card.
Get over it, man. You have absolutely no way of knowing whether or not that card will be a good overclocker.
Why can't you just face the fact that the X800 series of cards appears to be the better choice right now? Sure, Nvidia will get the drivers right for their cards shortly, I am sure, and then they will perform better (maybe better than the ATI, maybe not). You sit here and say "No, I am not an Nvidia fanboy. Blah, blah, blah..." but you just keep trying to plug the card when it is quite obvious that "right now" ATI is leading the race. Accept it. Your arguments about "yeah, well I am gonna overclock this and it will be even faster than yours will ever be, blah blah blah..." just sound like kids on the playground in elementary school going "Yeah, well my dad is bigger than your dad, blah blah blah." Get over the fanboyism (I don't think it's a word... but oh well :lol: ).
I have a coworker who has been just like you for a long time: Nvidia has been the only way to go for him, and he wouldn't think of getting an ATI. Now he has been reading all the reviews, and he is starting to come around. He now respects both ATI and Nvidia equally. He prefers Nvidia, but knows that when the time comes for him to get a new card, he "will" give ATI a chance. Whoever is leading the pack at the time is whose card he'll be buying.
Yeah ATI, you say you're not an nvidia fanboy, but you are acting just like one... give it up. As of right now, ATI is best... you cannot deny that.
Exactly. The 6800GT probably will be faster OC'ing, because nvidia will go crazy over the fact they got their asses kicked AGAIN (although the 6800 is still a good card).
Quote:
Originally posted by bigdawgfoxx@8 May 2004 - 14:44
Yeah ATI, you say you're not an nvidia fanboy, but you are acting just like one... give it up. As of right now, ATI is best... you cannot deny that.
Couldn't you overclock the ATI card also?
'scuse me, but isn't the 6800GT the worst of the 6800 litter?
I also heard about a 6800 Ultra Extreme, jesus, why can't they think up any good names...?
EDIT: And atividia, what are you on about nVidia better overclockers?
Was it not ATi that just broke the Futuremark record with an overclocked X800?
Strange how we don't hear that about the nvidias.. :P
no, GT is in the middle... it goes ultra, then gt, then plain old 6800.
Quote:
Originally posted by Mad Cat@8 May 2004 - 10:34
'scuse me, but isn't the 6800GT the worst of the 6800 litter?
I also heard about a 6800 Ultra Extreme, jesus, why can't they think up any good names...?
EDIT: And atividia, what are you on about nVidia better overclockers?
Was it not ATi that just broke the Futuremark record with an overclocked X800?
Strange how we don't hear that about the nvidias.. :P
I guess the ultra extreme ( :lol: ) will be at the top of the list, before the ultra.