
Thread: Decisions

  1. #41
    BANNED
    Join Date
    Jul 2003
    Location
    Guatemala
    Posts
    4,044
    Originally posted by adamp2p@1 February 2004 - 21:13
Now, do I have to explain to you what "FULL PRECISION" is? Is that the way you want your video games to be rendered? Is it any wonder that the entire FX line's image quality suffers in newer DX9 games that are based on pixel shader 2.0?
    How would you know? Your video card (MSI Geforce4 MX440 64MB DDR 8x AGP) does not use a single pixel shader. So are you speaking out of your ass or for the needs of gamers? I don't think so.
    What do you mean by that? I find that extremely offensive and argumentative. So, if a mod actually EXISTS here (VB1234), step in and DO YOUR JOB.

Second, does it actually matter what card I have when it comes to whether I know about this stuff or not? I deal with nVidia cards a lot - I have used the FX5200, FX5600, and FX5900, and YES, I do know what I'm talking about, so please don't give me this BS about "speaking out of my ass".

    VB1234, you either step in here, or I'm gonna report this.

2. #42
Originally posted by Mad Cat@2 February 2004 - 09:25
    Originally posted by DWk@2 February 2004 - 03:34
    Originally posted by adamp2p@1 February 2004 - 18:48
Originally posted by KinXen@1 February 2004 - 17:27
9600xt for $155...

    NO, LOOK:

9600 XT with Half Life 2 coupon, $150 after mail-in rebate

    Here

So after the game, it will only cost you $100 USD.


Then again, you could get an FX5900 for $160, and it's one of the best cards around. It was 200 dollars like 2 months ago... price is down, time to buy.

EDIT - btw, I still don't understand why ATI's prices still haven't gone down....
Adam, using a long-outdated test that used older drivers for one of the cards isn't really fair.

I've heard about the picture quality thing also... from what I've heard ATi beats nVidia in that.
Mad Cat, like I said, the issue still remains that the Nvidia cards revert to running DX9 at less than full precision.

    Both ATi and NVIDIA run DX8.1 just fine. Do I have to say this again?

I am going to defer to the summary from my friend Anova, as he summarizes it very effectively:

I've gone over this a million times. Half Life 2 is not the only game the FX cards do poorly in. Everyone refers to HL2 simply because it is one of the most, if not the most, popular and highly anticipated games to come along in a long while. It also heavily uses DX9. If you ever get a chance to ask Gabe Newell, founder of Valve software, I'm sure he would tell you about the tricks nvidia was willing to go through with and why he himself owns a 9800.

If you've been following the graphics scene at all the past year, you would know that the architecture nvidia used in their FX line of products was not developed around the DX9 standard. They chose to develop their own code because at the time ATI was not in the picture. An nvidia card was in about 80% - 90% of all gaming computers, and because of this they thought Microsoft would cave in to their demands. However, ATI emerged out of nowhere and released the 9700 Pro, which proved to be nearly twice as fast as the Geforce 4 TIs. Since the core in these 9700s was developed around MS's DX9 standard, MS decided not to go with nvidia's code. Nvidia did not show up at any of the meetings and were thus not in the loop of what was happening. By the time they figured it out it was already too late and would have required they go back to the drawing board. Rather than do that and waste countless amounts of time and money, they just decided to release what they had and use their PR machine to continue what it did best. This is when the optimizations came. They knew their cards' performance would be rather lacking, so they made optimizations to their drivers by decreasing image quality, removing aspects of scenery, etc. They were basically forced to do this in order to keep sales going while they wrote a compiler which would ultimately convert the DX9 code into their own. This compiler was introduced with the 'forceware' drivers. However, there are still optimizations, and the image quality isn't on par with that of ATI's Radeons, as was shown with Futuremark's last patch to 3DMark03.

Yes, there are other aspects involved as well. The NV30 (nvidia's FX core) only utilizes either FP16 or FP32; it cannot do FP24 like ATI's R300. In FP32, its performance drops a significant amount. The design just isn't capable of producing high fps in FP32 mode. This is partly the reason why Valve had to write a special backend specifically for the FX cards in order to get decent fps in HL2. ATI's cards run it perfectly fine without any special code. It is the same for all current and future DX9 games.

Nvidia has no one to blame but themselves. I'm sure their next product, the NV40, will fix these problems, as they wouldn't dare commit the same mistakes twice. But until then, my advice would be to stay away from the FX cards. If you must go with nvidia, then get a card from the Geforce 4 Ti line; otherwise get a Radeon 9500 or above.
Mad Cat, need I continue? The discussion is supposed to be geared at being informative. The issues brought up during the Half-Life 2 benchmarks are still here today. What does that mean? Well, I will explain: for one, note that the FX line of cards is, on paper, more powerful, yet slower in DX9 games than its ATi counterparts. I will not explain this again; I will refer you to the quote above, but this time I ask you to actually read it twice.
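To put a number on the FP16/FP32 point in the quote above, here is a minimal sketch using NumPy's float16 and float32 as stand-ins for the shader precisions under discussion. This illustrates rounding behaviour only, not the FX hardware itself; FP24 (the R300's format) has no standard software equivalent, so it is not modeled.

[CODE]
import numpy as np

# Sketch: accuracy lost by a specular-style lighting term at FP16 vs FP32.
# NumPy's float16/float32 stand in for the GPU shader formats; FP24 (ATI
# R300) has no standard software equivalent and is not modeled here.

def specular_term(n_dot_h, exponent, dtype):
    """Evaluate pow(N.H, exponent), a common lighting term, at a given precision."""
    x = dtype(n_dot_h)
    return float(x ** dtype(exponent))

reference = specular_term(0.9137, 32.0, np.float64)  # high-precision baseline

for name, dtype in (("FP32", np.float32), ("FP16", np.float16)):
    approx = specular_term(0.9137, 32.0, dtype)
    print(f"{name}: value={approx:.6f}  abs error={abs(approx - reference):.2e}")
[/CODE]

The toy numbers only show how quickly error grows at lower precision; the hardware's speed at each precision, the other half of Anova's point, is not captured here.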

3. #43
    Originally posted by DWk@2 February 2004 - 09:44

    What do you mean by that? I find that extremely offensive and argumentative. So, if a mod actually EXISTS here (VB1234), step in and DO YOUR JOB.

Second, does it actually matter what card I have when it comes to whether I know about this stuff or not? I deal with nVidia cards a lot - I have used the FX5200, FX5600, and FX5900, and YES, I do know what I'm talking about, so please don't give me this BS about "speaking out of my ass".

VB1234, you either step in here, or I'm gonna report this.
What's wrong with you, kid? Can't you stand up for yourself, dude? Call for your mommy all you want!

    This thread is aimed at advising a member to make a wise choice when choosing a graphics card.

In your signature you have a card listed that does not feature pixel shaders. How can you claim that, through your own experience, you can advise somebody to make a wise purchase when the entire DX9 codepath uses pixel shaders? How can you consider yourself worthy of contributing advice to a member of this forum when you do not even own a DX9 card? Are you trying to fool us by claiming that you have "all this experience" with NVIDIA's entire line of offerings when you do not even own a DX9 card?

I declare you an NVIDIA fanboy who does not even own a decent DX9 NVIDIA card.

4. #44
    BANNED
    Join Date
    Jul 2003
    Location
    Guatemala
    Posts
    4,044
Originally posted by adamp2p@2 February 2004 - 15:16
Originally posted by DWk@2 February 2004 - 09:44

    What do you mean by that? I find that extremely offensive and argumentative. So, if a mod actually EXISTS here (VB1234), step in and DO YOUR JOB.

Second, does it actually matter what card I have when it comes to whether I know about this stuff or not? I deal with nVidia cards a lot - I have used the FX5200, FX5600, and FX5900, and YES, I do know what I'm talking about, so please don't give me this BS about "speaking out of my ass".

VB1234, you either step in here, or I'm gonna report this.
What's wrong with you, kid? Can't you stand up for yourself, dude? Call for your mommy all you want!

    This thread is aimed at advising a member to make a wise choice when choosing a graphics card.

In your signature you have a card listed that does not feature pixel shaders. How can you claim that, through your own experience, you can advise somebody to make a wise purchase when the entire DX9 codepath uses pixel shaders? How can you consider yourself worthy of contributing advice to a member of this forum when you do not even own a DX9 card? Are you trying to fool us by claiming that you have "all this experience" with NVIDIA's entire line of offerings when you do not even own a DX9 card?

I declare you an NVIDIA fanboy who does not even own a decent DX9 NVIDIA card.
Adam, do you own an XGI card? Do you own/use a P4? Do you own and use an nVidia DX9 card?

I don't think so. So why does it matter whether I own a card or not?

Time to report this, since nothing has been done about it.

5. #45
Keikan
    Join Date
    Apr 2003
    Location
    Edmonton (Not Enfield)
    Age
    35
    Posts
    3,743
    ok people

1. We're talking about the ATI 9600 series only
2. I asked if it's worth getting the XT over the Pro
3. I can't internet shop anywhere.
4. I'm not getting the Asus A7N8X-X anymore; I'm getting the Gigabyte GA-7N400-L1
    Ohh noo!!! I make dribbles!!!

6. #46
bigdawgfoxx
Big Dawg
    Join Date
    Apr 2003
    Location
    Texas
    Age
    36
    Posts
    3,821
    Originally posted by jasonmog@1 February 2004 - 18:53
Perfect video card for the best price! http://www.pricewatch.com/1/37/5113-1.htm

    Geforce FX 5200 128mb

Sure there are $300 cards better than it, but... THERE ARE NO GAMES THAT REQUIRE ANY $300 VIDEO CARD. I'm sick of arguing with people over which card is the best, and the fact is you DON'T need the best video card to play hit games. Unless you're render-farming, putting together CGI movies, or dealing with millions of polygons per inch, you are well off with this fine piece of graphics acceleration right here. I put these in every computer I build and they run like a charm. Not just computers for myself, but for my clients as well. Ati is to Nvidia as Intel is to AMD. The only difference is the price. You will NOT notice more than a few fps of difference with any high-end video card these days. Trust me.
I see this guy has not come back... I hope he stays gone. Good lord, what a moron. A few FPS? More like 100 FPS from the 5200 to the 9800XT. The 5200 is a piece of crap... obviously he is a newcomer and hasn't seen all the horrible threads about 5200s... haha, we need a pinned topic on those things! Nvidia is alright and the high end is pretty good... but the 5200 just sucks.

    Chill out guys
AMD 4200 X2 @ 2.65Ghz, ASRock 939-VSTA
    1.75GB PC3200, 2 X 160GB Seagate w/ 8MB Buffer
    HIS Radeon X800 Pro, Antec Super Lanboy Aluminum

7. #47
    BANNED
    Join Date
    Jul 2003
    Location
    Guatemala
    Posts
    4,044
Again, it depends on what game you're gonna play.

8. #48
    I actually hope that nvidia releases good cards...as you know, if only one company has good offerings, then we the consumers will pay the price for that. We need healthy competition in order to ensure low prices.

    It looks like the next generation is going to be good for us:

    NV40 and R420 memory secrets revealed
    Read it here
IT’S NOT OFTEN that rival graphics chip firms Nvidia and ATI use the same marchitectural tactics. But, this time around, it seems they don’t have any other choice. Nvidia’s upcoming NV40 and ATI’s R420 both support the DDR 1, GDDR 2 and GDDR 3 memory types, but both companies will be sticking with GDDR 2, at least at first.
    The reason is simple: DDR 1 is just too slow to support the latest-generation graphics chippery in high resolutions, with fancy FSAA and Anisotropic filtering. Also, DDR 1 has a clock limit of 1GHz which is very hard to crank up further. DDR 2, of course, is nothing more than DDR 1 that can run at more than 1GHz, given a set of different commands.

    Since both companies’ current chips use frequencies that are very close to that 1000MHz barrier, this means that neither has any choice other than to move to DDR II, or GDDR 2, as the suits would have us call it.

    GDDR 2 was sampled in Q3 2003 by Samsung and rated at 600 to 800MHz -- effectively 1200MHz to 1600MHz. Insiders have told us that Nvidia received 10,000 memory chips back in Q4 last year to prepare prototypes of its NV40 boards. We also learned that NV40 has 16 memory chips on board. Nvidia is aiming at a frequency of 750MHz -- or 1500MHz effectively -- but this depends on PCB quality and the number of layers. The first NV40 silicon-powered prototypes are currently meandering through the offices of special, beloved Nvidia partners, we are given to understand.

    GDDR 3 may, in theory, be one of the options on the market but, if you ask around in knowledgeable circles, you will learn that this memory is in early sample stage and so neither Nvidia nor ATI could get enough chips for Q2 retail availability of the cards, however big their muscles.

    It is expected that GDDR 3 will be ready by Q3 2004 so you might expect that the planned NV45 and the next ATI chip (R450 - R480?) will use this memory.

Micron is the only signed-up member of the Dramurai to have GDDR3 memory specifications on their site. There, the company suggests that Q1 will be a good time for sampling, and my guess is that they won't be ready for production before Q3. Clock speeds for both GDDR 2 and 3 will be set in the range from 600 to 800MHz - effectively 1200 to 1600 MHz.

    It’s interesting to see that 800MHz GDDR2 SDRAM has a latency of an incredible 1.25ns.

Both the NV40 and R420 cards' memory interfaces are 256-bit, and by current estimates this means that a card using 600MHz to 800MHz GDDR 2 memory would have between a majestic 37.5 GB/s and a magnificent 50 GB/s of raw bandwidth.

    We await their appearance with unabated breath.
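For what it's worth, the article's bandwidth range can be reproduced in a few lines. A sketch, assuming the article's conventions: a 256-bit bus (32 bytes per transfer), DDR doubling the effective clock, and "GB/s" computed as decimal MB/s divided by 1024, which is how 600-800MHz lands exactly on 37.5-50 GB/s.

[CODE]
# Reproducing the raw-bandwidth figures from the article above.
# Assumes a 256-bit bus, double data rate, and the mixed decimal/binary
# "GB/s" convention (MB/s divided by 1024) common in coverage of the era.

BUS_WIDTH_BITS = 256

def raw_bandwidth_gbs(memory_clock_mhz):
    """Raw bandwidth in article-style GB/s for a DDR memory clock in MHz."""
    effective_mhz = memory_clock_mhz * 2            # double data rate
    mb_per_s = effective_mhz * BUS_WIDTH_BITS / 8   # megabytes per second
    return mb_per_s / 1024                          # article-style GB/s

for clock in (600, 800):
    print(f"{clock} MHz GDDR 2 -> {raw_bandwidth_gbs(clock):.1f} GB/s")
# 600 MHz -> 37.5 GB/s; 800 MHz -> 50.0 GB/s, matching the article's range.
[/CODE]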

9. #49
Keikan
    Join Date
    Apr 2003
    Location
    Edmonton (Not Enfield)
    Age
    35
    Posts
    3,743
Games, I dunno, depends what games I see as interesting and download them

    but probably new games
    Ohh noo!!! I make dribbles!!!

10. #50
bigdawgfoxx
Big Dawg
    Join Date
    Apr 2003
    Location
    Texas
    Age
    36
    Posts
    3,821
I got a $25 gift certificate for Best Buy.. what should I buy? What's a good game for that much or a lil more, or what else could I buy? lol
AMD 4200 X2 @ 2.65Ghz, ASRock 939-VSTA
    1.75GB PC3200, 2 X 160GB Seagate w/ 8MB Buffer
    HIS Radeon X800 Pro, Antec Super Lanboy Aluminum
