UM, if you were to scroll to the top and actually READ what I said, I said it WILL do 2048x1536x16. Did you ever happen to notice the last number? That 16 refers to 16-bit color (65k colors). They seem to feel that in Quake 3, even plain 2048x1536 at 16-bit would be unachievable for the GTS2.
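For what it's worth, the 16-bit vs. 32-bit distinction matters a lot at that resolution. Here is a rough back-of-envelope sketch of the framebuffer sizes involved; the actual memory footprint also depends on double/triple buffering, z-buffer depth, and driver overhead, which this ignores:

```python
# Back-of-envelope framebuffer size at 2048x1536, 16-bit vs. 32-bit color.
# (Ignores z-buffer, double buffering, and driver overhead.)
width, height = 2048, 1536
pixels = width * height  # 3,145,728 pixels

MB = 1024 * 1024
size_16bit = pixels * 2 / MB  # 2 bytes per pixel at 16-bit
size_32bit = pixels * 4 / MB  # 4 bytes per pixel at 32-bit

print(f"16-bit: {size_16bit:.1f} MB per color buffer")  # 6.0 MB
print(f"32-bit: {size_32bit:.1f} MB per color buffer")  # 12.0 MB
```

So at 16-bit the card pushes half the color data per frame, which is why the claim is far more plausible than the same resolution at 32-bit.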
And no, I don't usually get 16 players in one scene; the rooms are a little smaller than that. On a couple of maps that are outdoors, yes, and it doesn't dip below 25 FPS.
Mine is not stock; mine is at 150/350, and I'm using the 5.14 drivers.
Here is a GTS2 with the 5.16 drivers. Notice the almost 100% difference between 16-bit and 32-bit. The one above was just an example; this one is current. Now it's your turn, you do the math...
Furthermore, I'm running on a K6-3 460, you know, a little wimpy CPU, and I'm playing at 1600x1200 with very playable framerates (note: not just playable, but VERY playable), 32-bit color, 32-bit textures, geometry on FULL! If this doesn't impress anyone, then the hell with you. That's what this topic is 4/5 about: trying to tell people this is still a rocking card for the money. I can get a DDR for $220 right now, and that's a good price versus $600. The other 1/5 (which has been dragged through the gutter and back) is the fact that I wish I had a monitor that did 2048x1536 to give it a shot.
I am well aware of the limited applicability of synthetic tests to real life performance in games and other apps. What I am saying is that, in my experience, most performance claims made for video cards (particularly fillrates) are theoretical at best and can't even be met in synthetic tests, let alone a real game.
I find it a novel concept (close to their claims) and somewhat astonishing that the GF2 was able to hit 1.2 gigatexels/s on a synthetic test written by a third party. They have certainly raised the performance bar.
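To put that 1.2 gigatexels/s in context, here is a quick sketch of the GeForce2 GTS's theoretical texel fillrate from its published specs (200 MHz core, 4 pixel pipelines, 2 texture units per pipeline); the comparison against the measured figure is my own arithmetic, not from the benchmark itself:

```python
# Theoretical texel fillrate of the GeForce2 GTS from its published specs.
core_mhz = 200              # core clock in MHz
pipelines = 4               # pixel pipelines
tex_units_per_pipe = 2      # texture units per pipeline

mtexels = core_mhz * pipelines * tex_units_per_pipe  # megatexels/s
gtexels = mtexels / 1000
print(f"Theoretical peak: {gtexels} gigatexels/s")          # 1.6
print(f"1.2 GT/s measured = {1.2 / gtexels:.0%} of theory") # 75%
```

Hitting 75% of the theoretical peak on a third-party synthetic test is unusually close for cards of this era, which is the point being made above.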
I apologize for cluttering your thread, Deanril. The GF DDR seems to be a very capable card. When do you plan to upgrade your CPU, and what will be your chip of choice?
Um, I am planning on a DDR board, either a Duron or a T-Bird, and 192 MB of DDR RAM. I guess I have a little waiting to do, but that's OK, I'll wait; this will be the upgrade worth waiting for.
I will keep my GeForce DDR for another six months and see what I can get then. With the new drivers and S3TC, it performs SO much better than when I bought it that it's like getting a whole new video card.
Did you guys understand my point? Or am I being vague or something? I did say 2048x1536x16-bit. I don't understand how nobody can see my point. Oh well, take care, GOMEZ...
I'm sorry, Chaos0, let me answer your question...
Yes, the G400 and the ATI both have a HARDWARE DVD decoder built right into the card, whereas a GeForce uses software "soft DVD," so yes, you are correct. (I know for sure about the ATI; I'm not positive on the G400.)
As far as 2D goes, the GeForce's 2D is crisp and very fast. Actually, it all depends on the card manufacturer and the filters used, but overall the GeForce's 2D is crisp at all supported resolutions, supports high refresh rates, and has a 350 MHz RAMDAC that's very quick. So I disagree that most GeForces have bad 2D; they have very good, if not some of the best, 2D quality I've seen.
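That 350 MHz RAMDAC figure also puts a rough ceiling on refresh rates at high resolutions. A minimal sketch, assuming roughly 25% of the pixel clock is lost to horizontal/vertical blanking overhead (the exact factor depends on the monitor timings, so treat these numbers as ballpark):

```python
# Rough refresh-rate ceiling implied by a 350 MHz RAMDAC.
# The 1.25x blanking-overhead factor is an assumption; real timings vary.
RAMDAC_HZ = 350_000_000
BLANKING = 1.25

def max_refresh(width, height):
    """Approximate maximum refresh rate (Hz) at a given resolution."""
    return RAMDAC_HZ / (width * height * BLANKING)

print(f"1600x1200: ~{max_refresh(1600, 1200):.0f} Hz")  # ~146 Hz
print(f"2048x1536: ~{max_refresh(2048, 1536):.0f} Hz")  # ~89 Hz
```

So even at 2048x1536, a 350 MHz RAMDAC leaves room for a flicker-free refresh rate, which supports the high-resolution argument above.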
There, I took the heatsink off my head and upgraded to a water-cooled Peltier system, lol...
*shrug* Some suckers just won't give up under any amount of reasoning, so why bother? Sure, mate. Go buy a few more GeForces while you're at it. No one said fandom had to be cheap, nor, for that matter, rational.
Let me get just a few things straight:
1) Quake3 is good and fine, but it's not the only game out there. Tell me how well that card performs in, say, Unreal Tournament. Or Half-Life. Or are you trying to tell me you're never ever gonna play anything but Q3, because it supports that T&L crap?
2) First we were told that 16-bit is soooo passé, and 32-bit is the only way to go. We were told that the TNTs were better than the faster Voodoo3 because they show better images. (In most cases that was false because of the TNT's crappier texturing unit, anyway. But let's believe them.)
Now it's the exact opposite. Now we're told, hey look, the GeForce2 is really fast in 16-bit colour. And, oh, anti-aliasing doesn't really matter either. Hey, who needs better image quality?
Am I the only one who smells shameless marketing at work here? How come last year quality was more important, and now all of a sudden it's not?
3) If you want a different kind of benchmark, try running with FSAA enabled. I'm not even asking for the whole nine yards, just the minimal 2x FSAA. THEN tell me what kind of performance penalty the GeForce takes from that.
Or if you can't be bothered, just point your browser to www.firingsquad.com and have a read. Let's just say that with FSAA enabled, the V5 5500 spanks the GF2 at every resolution above 640x480 and at all three quality settings. In spite of not having T&L. Go figure.
4) That test you mentioned about running at 2048x1536 was on a non-overclocked Voodoo. So until we know how well that one overclocks, there is no point in comparing it to a heavily overclocked GF2. It's like comparing a Celeron 500@750 to a non-overclocked P3 and deciding that the Celeron is faster than the P3.
Moraelin -- the proud member of the Idiots' Guild
Here we go! That GTS2 is NOT overclocked; where in the hell do you pull that BS from? Secondly, the Voodoo5 5500 spanks the GTS2 in FSAA at anything over 640x480? Maybe so, good point, but as I stated above, I don't care for FSAA. I'd much rather have high resolution, in which the GTS2 DEFINITELY kicks the holy crap out of the V5 5500.
What in the hell happened to my thread????
It went from praising my DDR to this GTS2 vs. V5 5500...
This is my GeForce DDR (the original, one year old) that I spoke of in this thread, and it does do 2048x1536x16 in Quake 3. I finally bought a monitor that goes that high. So re-read this, refresh your mind and your argument, and CONCEDE, because I win.