Creator, I wouldn't say that 3DMark2000 is deliberately biased towards nVidia. I'd say it's just a very poorly designed benchmark, bearing no resemblance whatsoever to ANY real game, which (totally by accident) happens to favour nVidia cards.
Lemme explain why.
1) For starters, it uses huge numbers of polygons. Even the medium quality "game" tests push ludicrously high polygon counts compared to any actual game, existing or currently in the making.
2) It uses very little texturing power. Most of the rendering is done in vertex lighting mode, which means it's mostly a single-texturing application. (That's why, for example, it has put the S3 Savage2000 at an artificial disadvantage vs the GeForce SDR.) Most actual modern games (e.g., Q3A, to quote another T&L enabled title) use lightmaps and a ludicrous amount of multi-texturing. According to John Carmack (you may have heard of him), the Q3A engine can use up to 8-10 textures per polygon. I've put some rough numbers on that just below.
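Just to illustrate the gap, here's a quick back-of-envelope sketch. All the numbers (resolution, overdraw, texel size, filtering taps) are my own ballpark assumptions, not anything measured from 3DMark2000 or Q3A; the point is only how fast texture traffic scales with the number of texture layers per pixel.

```python
# Rough sketch with assumed numbers: texture read traffic per frame for a
# vertex-lit, single-textured scene vs. a lightmapped, multi-textured one.

RESOLUTION = 1024 * 768        # pixels per frame (assumed)
OVERDRAW = 2.5                 # average depth complexity (assumed)
BYTES_PER_TEXEL = 2            # 16-bit textures (assumed)
BILINEAR_TAPS = 4              # texel fetches per textured pixel (bilinear)

def texture_bytes_per_frame(layers):
    """Bytes of texture data read per frame for a given number of layers per pixel."""
    return RESOLUTION * OVERDRAW * layers * BILINEAR_TAPS * BYTES_PER_TEXEL

single = texture_bytes_per_frame(1)   # vertex-lit, single texture (3DMark-style)
multi = texture_bytes_per_frame(3)    # base + lightmap + detail pass (Q3A-style)

print(f"single-texture: {single / 2**20:.1f} MB/frame")
print(f"multi-texture:  {multi / 2**20:.1f} MB/frame")
```

Even with only three layers instead of Carmack's 8-10, the multi-textured scene reads three times the texture data per frame. A benchmark that mostly does one layer simply never exercises that.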
3) The default test runs in 16-bit colour. Which, combined with the mostly single-texture use, doesn't even come close to stressing the memory bandwidth even on an SDR card, much less on a DDR one. (That's why, for example, the GeForce SDR and the MX score so high, when in practice you'll get a lot fewer fps with them in Q3A.)
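Same kind of rough arithmetic for the framebuffer side. The overdraw figure is again an assumption of mine; the ~2.7 GB/s and ~4.8 GB/s figures are the theoretical memory bandwidths of the GeForce SDR and DDR. Texture reads are deliberately left out here so you can see what the colour and Z buffers alone cost at 16-bit vs 32-bit:

```python
# Rough sketch with assumed numbers: colour + Z buffer traffic per frame at
# 16-bit vs 32-bit, held against GeForce SDR / DDR theoretical bandwidth.

WIDTH, HEIGHT = 1024, 768
OVERDRAW = 2.5                        # assumed average depth complexity

def framebuffer_mb_per_frame(colour_bytes, z_bytes):
    """Colour writes plus Z read/write per frame, in MB (texture reads ignored)."""
    pixels = WIDTH * HEIGHT * OVERDRAW
    return (pixels * colour_bytes + pixels * z_bytes * 2) / 2**20

bandwidth_mb_s = {"GeForce SDR": 2.7 * 1024, "GeForce DDR": 4.8 * 1024}

for label, cbytes, zbytes in [("16-bit colour", 2, 2), ("32-bit colour", 4, 4)]:
    mb = framebuffer_mb_per_frame(cbytes, zbytes)
    caps = ", ".join(f"{card}: ~{bw / mb:.0f} fps"
                     for card, bw in bandwidth_mb_s.items())
    print(f"{label}: {mb:.0f} MB/frame -> bandwidth cap at {caps}")
```

At 16-bit with a single texture layer, even the SDR card's bandwidth ceiling sits way above anything the fill rate can deliver, so the benchmark never shows the SDR/DDR gap that 32-bit, multi-textured games like Q3A expose.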
So basically, again, it's just a lousy artificial benchmark which doesn't even come close to reflecting any game reality. It stresses polygon processing, including pushing the polygons over the AGP bus, WAAAAAY more than the texturing.
For example, according to 3DMark2000, I was MUCH better off with a lower CPU speed but a higher, overclocked AGP bus. Strangely enough, no actual game ever showed the same anomaly. Go figure.
------------------
Moraelin -- the proud member of the Idiots' Guild