something creepy about 3dmark2001

drzaius

New Member
#1
i can't quite put my finger on it but it's really weird. no system out there (that i've seen) has supported all the tests and monstered them. i've seen a Tbird 1.2 w/GF2 Ultra get ~3400, then 3800 fully tweaked (new drivers mainly), and someone got 4000 with a P4 1.5 w/GF2 Ultra, but i haven't seen any GF3 marks (i'd like to see one with a P4). wonder why i haven't seen a single benchmark run with a GF3.
haven't seen one with a Kyro II either; that would be nice to see.

also some other interesting thoughts i've seen are:
1) fillrate is nothing. the difference between 800x600x16 and 1024x768x32 (for me at least) was about 100 marks (1365 to 1269), and it wouldn't surprise me if the difference between a GF2 GTS (200/166) and a GF2 Ultra (250/220) turned out to be just as small.

2) current TnL is only the beginning of what we're going to see. what's the highest number of triangles a card can do today? 30-35 million? that will be nothing by the end of this year alone. i'm talking 100-150 million (probably).

3) pixel shaders: what the hell are they? i don't know exactly, but from what i've read they're basically small programs the card runs on every pixel it draws, and i do think they'll be the talk of the town the way fillrate always has been. the better the pixel shaders, the higher the mark (and the prettier the picture).

4) vertex shaders: honestly i have no idea what these are (same idea as above, apparently, except the little program runs per vertex instead of per pixel), but i liked how all the Neos walked around and shot each other.
anyways, again something that will take over the role fillrate used to play.

5) why can't i play the nature game? it looked really, really good in the demo and in the 2001 trailer movie.

OHHHH that's right, it's something only a GF3 can do.

and do i remember correctly that when the GF3 was released for the Mac, one of the demos John Carmack showed was this very demo... how long ago was that? 3-4 weeks ago? funny how this demo was pretty much ready back then, but for some reason was held back till yesterday (march 13 2001).
and yet we still don't have any GF3 marks for this.

just something that i've been thinkin' about.

*edit* score of a 1.2GHz Tbird

[This message has been edited by drzaius (edited 03-14-2001).]
 
rarraflled

New Member
#2
So are you saying there is a conspiracy between MadOnion and NVIDIA? Is this their plot to justify that users should go out and buy a nice new GeForce3 so they can be ready for the "games of the future"? Did everyone notice the advertisement by Falcon NW during the bench? Is NVIDIA trying to make the 6-month production cycle profitable? In one year are we expected to buy a $1000 (US) video card just to run 3DMark2002?


Sorry just had to get that out!


------------------
Celymine [email protected]
Asus CUSL2
Crucial PC133 128megs
MSI Geforce2PRO 64megs
Windows ME
 
drzaius

New Member
#4
By rarraflled:
So are you saying there is a conspiracy between MadOnion and NVIDIA? Is this their plot to justify that users should go out and buy a nice new GeForce3 so they can be ready for the "games of the future"? Did everyone notice the advertisement by Falcon NW during the bench? Is NVIDIA trying to make the 6-month production cycle profitable? In one year are we expected to buy a $1000 (US) video card just to run 3DMark2002?
in a roundabout way, yeah.

but also, looking at this, you can see what to expect within the year. when 3DMark2000 came out it supported TnL, which at the time only the GeForce1 could do, and since then that TnL part has gotten stronger and more things have been added.
so by the end of this year we'll have really powerful pixel/vertex shaders, and then of course something else after that to improve our 3d gaming experience.

------------------
people that are slow are easy to pass, it's people who drive fast that are hard.
 
outside looking in

Bah. Erm... Eh? Bleh!
#5
A buddy-buddy situation between MadOnion and NVidia is nothing new... which is why I smile when my Radeon beats GTS and GTS Pros, and makes the EMBM test look like butter (sorry, too much SNL).

I too wonder why the Nature scene runs smooth in the demo but won't run, period, in the bench. Probably a line of code that does something like "search for GF3; none found; skip test".
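
For what it's worth, under DX8 it wouldn't even have to look for "GF3" by name; a caps query is enough. Something like this (just a sketch in C++ against the DirectX 8 interfaces, not MadOnion's actual code):

    // Not MadOnion's code -- a guess at how a test could be gated on
    // DX8 hardware pixel shader support, which is what Nature needs.
    #include <d3d8.h>

    bool CanRunNatureTest(IDirect3D8* d3d)
    {
        D3DCAPS8 caps;
        if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
            return false;

        // Right now only the GF3 reports a hardware pixel shader version,
        // so a check like this skips everybody else.
        return caps.PixelShaderVersion >= D3DPS_VERSION(1, 0);
    }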

------------------
Those who fear the facts will forever try to discredit the fact finders. - Daniel C. Dennett
 
rarraflled

New Member
#6
Here is another thought. Have we seen any games made with the graphics engine MadOnion is using? If so, what are they? Why not use an engine that is or will be used in real games? Are they just using the bench to license their engine out to game makers? How many people are going to go and buy a GeForce3 just to run this bench? Why not just put a frikin CPU socket on the vidcard? Maybe add the option of adding memory too! Just think, instead of buying a new card you'd just buy the chip and some memory for it!
I love ranting


------------------
Celymine [email protected]
Asus CUSL2
Crucial PC133 128megs
MSI Geforce2PRO 64megs
Windows ME
 
aug1516

New Member
#8
Maybe this is a stupid question, but what is that "Point Sprites" test all about? It just looked like some stupid horse statue with bad graphics going around in circles.
 
PeODB

New Member
#9
Well, I think we all need new CPUs more than vid cards for this benchmark. The score only increases by about 300 marks going from 1024x768 32-bit down to 640x480 32-bit. That's less than half the pixels to fill per frame, so if fillrate were the limit the score should roughly double, but it doesn't. The scores look better with faster CPUs.
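
Quick back-of-the-envelope numbers (mine, not MadOnion's) to show why that scaling looks CPU-limited:

    1024 x 768 = 786,432 pixels per frame
     640 x 480 = 307,200 pixels per frame
    786,432 / 307,200 = roughly 2.5x fewer pixels to fill

If the game tests were really fillrate-bound, dropping to 640x480 should buy something like 2.5x the framerate, not a few hundred marks.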
 
Gomez Addams

New Member
#10
I am rather curious about how the score is arrived at. We see the results of all the tests that were completed, but how is the final result computed?
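
I haven't found MadOnion's actual formula, but the fillrate/polygon/feature tests don't appear to move the number, so my guess is it's basically a weighted sum of the game-test framerates. Purely as an illustration (the structure and weights here are made up, not anything published by MadOnion):

    // Hypothetical illustration only -- not MadOnion's published formula.
    // Assumes each game test reports an average framerate in fps and
    // gets some fixed weight.
    double GuessedCompositeScore(const double gameFps[], const double weight[], int numTests)
    {
        double score = 0.0;
        for (int i = 0; i < numTests; ++i)
            score += gameFps[i] * weight[i];   // bigger weight = more influence on the mark
        return score;
    }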


------------------
Friends don't let friends buy P4s.
 
netdude

New Member
#12
I ran it at 1024x768 (1GHz T-Bird / GeForce2 GTS) and got 2910 or something like that. The Nature scene looked great at 1024x768 as well.
 
nodnod

Dismember
#14
I don't know about you guys, but the Nature game ran fine on my 16-color CGA video card; maybe you guys need to upgrade!
Yeah, right!

------------------
/////////////////////
Don't Tread On Me
\\\\\\\\\\\\\\\\\\\\\
 
JollyRoger

New Member
#15
16 colours was EGA; CGA was 4 colours. Take it from someone who remembers those days.

------------------
AHHH, I think my computer's got a Virus!

Oh no, that's Windows 98
 
drzaius

New Member
#16
posted by Notorious AGD:
Well, I think we all need new CPUs more than vid cards for this benchmark. The score only increases by about 300 marks going from 1024x768 32-bit down to 640x480 32-bit. That's less than half the pixels to fill per frame, so if fillrate were the limit the score should roughly double, but it doesn't. The scores look better with faster CPUs.
i think this is because most of the DX8 work is done in software unless you have a card that can do it in hardware (i.e. a Radeon or GF3). so increasing the CPU speed without changing the video card will still raise the score, because the CPU can get through the DX8 work faster, but a DX8-capable video card can do it faster still and relieve the CPU of those tasks. so really you do want a new video card, just not a GF3, well, not right now (a Radeon would be a good intermediate choice).
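
to picture what "relieve the CPU" means: a DX8 app normally checks the device caps and only asks for hardware vertex processing if the card can actually do it. a rough sketch in C++ against the DirectX 8 interfaces (my own, not anything out of 3DMark):

    #include <d3d8.h>

    // Decide where the vertex work (T&L, vertex shaders) should run.
    DWORD PickVertexProcessing(IDirect3D8* d3d)
    {
        D3DCAPS8 caps;
        if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
            return D3DCREATE_SOFTWARE_VERTEXPROCESSING;

        // Cards with hardware T&L take this load off the CPU; everything else
        // falls back to Direct3D's software vertex pipeline, i.e. the CPU does it.
        if (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT)
            return D3DCREATE_HARDWARE_VERTEXPROCESSING;
        return D3DCREATE_SOFTWARE_VERTEXPROCESSING;
    }

that flag gets handed to CreateDevice, so on a non-DX8 card all that work lands on the CPU, which is exactly why a faster CPU helps so much here.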

------------------
people that are slow are easy to pass, it's people who drive fast that are hard.
 
SexyMF

Multiphasic
#17
Disappointed. 3DMark2001 runs like a pig.

Anyone notice that as soon as there's a light source, framerates drop to about a quarter?

If a 1000MHz PIII with a GTS card can't maintain 30 FPS at decent quality, well...

As for the results, they're totally screwed up unless you can run all the tests; it's nowhere near a benchmark.

And definitely, what's with showing a Nature demo scene but then 'can't do it' in the benchmark?

Someone please run 3DMark2001 on a supercomputer and record it to an AVI file.
 
Todd a

New Member
#18
MadOnion based it around the GeForce3, which they have had for quite a while for testing. The program is also very CPU intensive. Performance will suck without a 1.33GHz Athlon and a GeForce3. I wonder if they made it SSE2 enhanced for the P4?

------------------
The COMPUTER is your FRIEND!
Happiness is mandatory.
 
23skidoo

New Member
#19
I've another riddle: why are the fillrate scores half or less of the 3DMark2000 scores, and why have the polygon scores increased over 3DMark2000? Shouldn't they be the same, since they attempt to find the theoretical maximum for each system? Or are we seeing the actual fillrate of the card rather than the theoretical fillrate? Or am I the only one who had low fillrate scores?

Also, on my system, an 866 T-Bird with a [email protected]/378, I gain 500 points going from the default 3DM2001 bench to 640x480x16-bit with 16-bit textures: 3100 to 3600. Pretty shabby when some are getting over 4500 with Radeons and Ultras.

BTW, it looks like the Radeon rules: almost 5000 for one system in the comparison list. I did see a P4 system with a Radeon card (I think; it may have been the GF comparison) but it was well down from all the Athlons, so SSE2 optimization may not exist in 3DMark2001, or it can't match the Athlon FPU. The Athlon was on top in all the comparisons I did, but I could never get a complete list; MadOnion's servers must be swamped. Very interesting benchmark, however.

[This message has been edited by 23skidoo (edited 03-18-2001).]
 
BladeRunner

Silent & Cool.....
#20
If that is the case then it will be even more meaningless than the previous MadOnion creations as a current D3D graphics benchmark. If it is overly CPU intensive it won't mirror real-world game performance, unless the GF3 really will be significantly faster when used with a high-clock P4.

Whatever, it would appear pretty pointless at the moment and only good for your own tweaking or comparisons to systems of exactly the same spec. There was me thinking they might have learned something from 3DMark2000's GeForce and P3 bias.
 
