Here's a rip from an Xbitlabs.com article I thought some of you guys might be interested in:
To tell the truth, I tend to consider the Aquamark3 test suite not so much a synthetic graphics card benchmark as an excellent demonstration of the potential of Massive Development's new gaming engine.
The test consists of nine episodes, each demonstrating one of the technologies used by the engine. Despite that, there are no significant differences between the nine scenes, which I personally regard as a drawback, because it makes the test not "synthetic" enough.
I'll just copy this from what I wrote at Beyond 3D:
I was so confused by this comment from AT:
AnandTech wrote: "In fact, NVIDIA has flipped the tables on ATI in the midrange segment and takes the performance crown with a late round TKO. It was a hard fought battle with many ties, but in the games where the NV36 based card took the performance lead, it lead with the style of a higher end card."
That I tabulated my own results:
WITHOUT AA / ANISO
5700 wins 10 times
9600 XT wins 6
Where the 5700 won, it won on average by 15%
Where the 9600 won, it won on average by 17%
WITH AA / ANISO
5700 wins 6 times
9600 wins 6 times
Where the 5700 won, it won on average by 23%
Where the 9600 won, it won on average by 54%
There certainly is ZERO justification for saying something like: "but in the games where the NV36 based card took the performance lead, it lead with the style of a higher end card."
That characteristic belongs to ATI, not nVidia.
Another way to look at it: What percentage FPS difference is required to declare a "clear winner?"
Let's say that with less than a 10% difference, the cards are tied. In this case:
WITHOUT AA / ANISO
5700 wins 6 tests
9600 wins 4 tests
When the 5700 wins, it's by an average of 22%
When the 9600 wins, it's by an average of 22%
WITH AA / ANISO
5700 wins 4 tests
9600 wins 6 tests
When the 5700 wins, it's by an average of 33%
When the 9600 wins, it's by an average of 54%
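For anyone who wants to redo this on their own numbers, here's a quick sketch of the tally logic I'm describing: any game where the FPS gap is under a chosen threshold counts as a tie, otherwise the faster card takes a win and we record its percentage margin (relative to the slower card). The FPS pairs below are made-up illustration values, not the review's actual results.

```python
def tally(results, tie_threshold=0.10):
    """results: list of (fps_5700, fps_9600) pairs for each benchmark.

    Returns ({card: (win_count, avg_win_margin_pct)}, tie_count).
    Margin is relative to the slower card, e.g. 60 vs 50 fps = 20%.
    """
    wins = {"5700": [], "9600": []}
    ties = 0
    for fps_a, fps_b in results:
        margin = abs(fps_a - fps_b) / min(fps_a, fps_b)
        if margin < tie_threshold:
            ties += 1          # under the threshold: call it a tie
        elif fps_a > fps_b:
            wins["5700"].append(margin)
        else:
            wins["9600"].append(margin)
    summary = {}
    for card, margins in wins.items():
        avg = sum(margins) / len(margins) if margins else 0.0
        summary[card] = (len(margins), round(avg * 100, 1))
    return summary, ties

# Hypothetical example numbers, just to show the mechanics:
example = [(60, 50), (45, 44), (30, 42), (55, 70)]
print(tally(example))
# -> ({'5700': (1, 20.0), '9600': (2, 33.6)}, 1)
```

Raising `tie_threshold` is exactly what flips a "many ties" reading into a "clear winner" reading, which is why the choice of cutoff matters so much for how a review like AT's gets worded.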
Taking this fact into account, I would like to offer you the average performance results for all nine benchmarks, because it doesn't make much sense to analyze each of them separately. The scenes' level of detail was set to the maximum throughout the entire test session:
Conclusion: As is known, the DirectX 9.0 Pixel Shader performance of NVIDIA GeForce FX graphics processors is not as high as that of ATI RADEON 9500/9700/9800 chips. There are several reasons for that. First of all, lower Pixel Shader performance is the price you have to pay for the higher flexibility of the GeForce FX architecture.
So, the announcement of the NVIDIA GeForce FX 5950 Ultra didn't help NVIDIA become the leader in the High-End gaming market. However, no one had actually expected that to happen: a "slightly overclocked" GeForce FX 5900 Ultra can hardly compete with the ATI RADEON 9800 XT in new games that use DirectX 9 Pixel Shaders. Nevertheless, the ATI RADEON 9800 XT remains the performance leader in contemporary games, and today's test session really proves this point. Taking into account that the recommended price of the ATI RADEON 9800 XT is $499, exactly the same as the price of the NVIDIA GeForce FX 5950 Ultra, there shouldn't be any questions about the choice of a High-End graphics card any more.
Read the whole article here.