GEFORCE 2 finally revealed

morphious

New Member
#1
GeForce 2 looks amazing. It's way better than the Voodoo4 or 5 and the GeForce. And what happened to the NV15? I haven't seen anything on it from nVidia yet. Well, we can only wait.
 
BladeRunner

Silent & Cool.....
#2
Well, from the benchmarks I've seen so far it is around 30% faster at 1024x768 and above.
That's not really much to get excited about.
The move to .18 micron silicon is the best news, as that might mean the GPU can stay below flashpoint!

Also, the Voodoo 5 is still in alpha state and the drivers are nowhere near release spec; any hardware site worth its salt that has tested them will make this clear. I expect there to be very little between the GeForce 2 and the V5 5500 in final guise. I just hope the V5 6000 is significantly faster than both.

BladeRunner
 
LaoChe

New Member
#3
A BADASS new feature of the GF2GTS is pixel shading! If game programmers use this feature, a game will look TOTALLY different on a GF2GTS than on any other card! However, I'm still very much looking forward to the Voodoo5 6000 and the retail drivers for the Voodoo5 5500. What I really want to see are UT benchmarks between a GF256, a GF2GTS, the Voodoo5s, and the Voodoo3s. I want to see how much a GF2GTS or a Voodoo5 would improve the picture quality and FPS in UT compared to my Voodoo3 3000.



Here is an example of what pixel-shading can do to a game.
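For a rough idea of the math behind it: per-pixel ("dot3") lighting evaluates the lighting equation at every texel instead of once per vertex. The sketch below is just the idea, not the actual GTS register-combiner setup, and all the input values are made up for illustration.

```python
# Toy sketch of per-pixel ("dot3") lighting -- the idea behind pixel
# shading, NOT the GTS register-combiner API. All inputs are made-up
# example values.

def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def dot3_shade(normal, light_dir, base_color):
    """Shade one pixel: clamp(N . L) scales the texture color."""
    n = normalize(normal)      # per-pixel normal, e.g. from a normal map
    l = normalize(light_dir)
    intensity = max(0.0, sum(a * b for a, b in zip(n, l)))
    return tuple(int(c * intensity) for c in base_color)

# Per-vertex lighting computes intensity 3 times per triangle; per-pixel
# shading recomputes it for every covered texel, so bump detail shows up
# even on one big flat polygon.
print(dot3_shade((0.2, 0.3, 0.9), (0.0, 0.0, 1.0), (200, 160, 120)))
```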


[This message has been edited by LaoChe (edited 04-26-2000).]
 
Moraelin

New Member
#4
I think the GeForce 2 _is_ the NV15.

But basically, I'm pretty much unimpressed. Compared to the Voodoos, yeah, it'll probably be better, but even against 3dfx's beta drivers, not by much. BUT... compared to what ATI says their next card will do, it doesn't really sound that hot.

First of all, even on paper it can only do 25 million triangles, while ATI claims 30 million. And that's only as long as the triangles are attached to each other in a very well-defined way (long strips, where each new vertex completes a triangle)... otherwise it drops to about 8 million, since independent triangles cost three vertices each. Basically, it doesn't do 25 million _triangles_ per second, it does 25 million _vertices_ per second. There's a difference.
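To put numbers on that (the 25 million figure is NVIDIA's; the strip-vs-independent split is my reading of it):

```python
# Why "25 million triangles/sec" depends on how triangles are submitted.

VERTICES_PER_SEC = 25_000_000          # NVIDIA's quoted setup rate

# Long triangle strip: after the first two vertices, every new vertex
# completes one triangle, so triangles/sec ~= vertices/sec.
strip_tris = VERTICES_PER_SEC

# Independent triangles: three vertices each.
independent_tris = VERTICES_PER_SEC / 3

print(f"strips:      ~{strip_tris / 1e6:.0f} M triangles/s")
print(f"independent: ~{independent_tris / 1e6:.1f} M triangles/s")  # ~8.3M -- the "about 8 million"
```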

Second, from those benchmarks it looks like it's _very_ far from the promised 1.6 gigatexels. At a wild guess, it chokes on memory bandwidth, like the GeForce 256 did with SDR. Simple maths on memory timings would have shown that there just wasn't room for that 4x increase; heck, there wasn't even room for a 2x increase. Until someone makes QDR, those 1.6 gigatexels will likely exist only on paper and in marketing hype. And no, there won't be any QDR available any time soon. (On the other hand, if ATI actually makes a memory access optimization that works, it could easily gain some serious ground there.)
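Here's that sanity check in numbers. The clock figures are the commonly quoted GTS specs (200 MHz core, 166 MHz DDR on a 128-bit bus); treat them as assumptions:

```python
# Peak fill rate vs. peak memory bandwidth for the GTS (assumed specs).

CORE_HZ = 200e6
PIPES = 4
TEXELS_PER_PIPE = 2            # two textures per pipe per clock
MEM_HZ = 166e6                 # DDR: two transfers per memory clock
BUS_BYTES = 128 // 8           # 128-bit bus

texel_rate = CORE_HZ * PIPES * TEXELS_PER_PIPE   # 1.6 Gtexels/s, the marketing number
pixel_rate = CORE_HZ * PIPES                     # 800 Mpixels/s
peak_bw = MEM_HZ * 2 * BUS_BYTES                 # ~5.3 GB/s, zero-overhead ceiling

# Minimum traffic per 32-bit pixel actually drawn: Z read + Z write +
# colour write = 12 bytes, before fetching a single texel.
needed = pixel_rate * 12

print(f"claimed texel rate:        {texel_rate / 1e9:.1f} Gtexels/s")
print(f"peak memory bandwidth:     {peak_bw / 1e9:.1f} GB/s")
print(f"needed, ignoring textures: {needed / 1e9:.1f} GB/s")   # already above the ceiling
```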

Third, the features really don't look that impressive.

Briefly, yeah, they'll probably get a bunch of people to buy it based on lots of hype, and a lot less actual substance. Then again, what else is new?


------------------
Moraelin -- the proud member of the Idiots' Guild
 
Todd a

New Member
#5
The card does look very impressive on a PIII 800/i820/DDR with fast writes enabled. On an Athlon 1GHz/ASUS K7V/133, make sure you get the new BIOS for the K7V.

Without fast writes and 800MHz Rambus memory, the GeForce2 GTS seems to struggle, which gives the PIII an advantage at the higher resolutions. Check out the article yourself:
http://www.sharkyextreme.com/hardware/articles/nvidia_geforce2_gts_guide/4.shtml

The Voodoo5 5500 is not doing so well. If the release drivers are not a lot better, they might go under. Their stock is under $10 a share, and I think nVidia's is at $80+.

I'd like to see how it performs on a K6-2 450 system (and also on the Spitfire and Thunderbird). AMD was demoing their DDR chipsets today. I wonder what fast writes would be like with 133MHz DDR memory?
 
Noctem

New Member
#6
All I can say is: look out, 3dfx. The Q3 benches I saw at Anand's show the GeForce2 demolishing the V5 5500. I'm kinda skeptical about them pulling that much performance out of driver optimization.

The GTS looks impressive from where I sit, but I think I'll wait for the NV-20 since my Anni Pro is doing fine for now.
 
Moraelin

New Member
#7
Even with fast writes and everything, let's do some simple maths. That 1.6 gigatexel figure is based on the assumption that each of those 4 units can actually be kept going on and on and on, like the Energizer bunny, doing two textures per clock.

Now let's say we're running in 32 bit colour.

A) Let's take the worst-case scenario first, where everything is drawn back to front. Two textures per unit mean two memory reads to get that pixel, plus one memory read from the Z-buffer, plus one memory write to draw the pixel, plus one memory write to update the Z-buffer. (If you also have transparency/translucency effects, add one read to fetch the old pixel; but let's assume there are no transparent textures.) That's five memory operations per cycle per texturing unit. Multiply by 4 units and you get 20 memory accesses per clock. Times 32 bits, that's 640 bits moved per clock.

640 bits per clock on a 128-bit bus? Dream on. And real-life memory has at least a 6-cycle penalty for a page miss, which will happen a lot when reading textures. The memory writes aren't that fast, either.

B) Even in the best possible scenario, things don't look much better. As in: all the drawing is done front to back, there are no transparencies, and you're looking straight at a wall, so everything behind the first layer gets rejected right away. (And that assumes the card or game is actually smart enough to optimize drawing for this situation, which remains to be seen.) It's not exactly the typical situation in a game, unless your only purpose in life is to view walls up close, but let's pretend it happens. And it only kicks in after the first layer of pixels has been drawn, as per the previous scenario. Even so, it's still at least one Z-buffer read per texturing unit. Now it's 4 operations times 32 bits, which is 128 bits moved per clock, on a 128-bit bus. That fits quite nicely, but you'd need some very ideal memory to actually get it: memory that can do one operation per GPU clock, and that just doesn't exist.
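The same accounting as a few lines of arithmetic (assuming, generously, that DDR on a 128-bit bus could move 256 bits per core clock if memory and core ran 1:1, which they don't):

```python
# Bits moved per core clock in scenarios A and B vs. what the bus can do.

BITS = 32      # 32-bit colour and 32-bit Z
PIPES = 4

# A) back to front: 2 texture reads + Z read + Z write + colour write
bits_a = (2 + 1 + 1 + 1) * PIPES * BITS     # 640 bits/clock

# B) everything Z-rejected after the first layer: one Z read per pipe
bits_b = 1 * PIPES * BITS                   # 128 bits/clock

BUS_BEST_CASE = 128 * 2   # 128-bit DDR, generously assuming 1:1 clocks

print(f"A: {bits_a} bits/clock vs {BUS_BEST_CASE} available")   # over 2x short
print(f"B: {bits_b} bits/clock vs {BUS_BEST_CASE} available")   # fits only on ideal memory
```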

C) So far I've been assuming that all the textures and triangles are in the card's memory and _nothing_ needs to be transferred over the AGP bus. I.e., not just with fast writes, but even with divine intervention on the AGP bus, it'll still fall a lot short of that advertised number.

D) Note that the above calculations are already _very_ generous. E.g., I've been blissfully ignoring bus arbitration issues. To actually move that many bits per clock, the 128-bit bus would have to act like 4 independent 32-bit buses with separate address and control lines. Furthermore, the memory would have to be quad-ported so that reads from the same chip don't wait for each other. In practice, the situation would be a lot less nice.

E) I've also been ignoring the fact that the screen refresh itself needs to read that memory, too. At high resolutions and high refresh rates, this eats some of the memory bandwidth as well. At, say, 1280x1024 in 32-bit colour with a 75 Hz refresh rate, that's 384 megabytes per second eaten just by scanout. It's not much compared to DDR bandwidth, but it's there.
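That scanout figure is straightforward to check:

```python
# Memory traffic from screen refresh alone (RAMDAC scanout).

W, H = 1280, 1024
BYTES_PER_PIXEL = 4       # 32-bit colour
REFRESH_HZ = 75

scanout = W * H * BYTES_PER_PIXEL * REFRESH_HZ
print(f"{scanout / 2**20:.0f} MiB/s")   # ~375 MiB/s (~393 MB/s decimal) -- the post's ballpark
```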

Briefly: the GeForce 2 can't possibly achieve that advertised gigatexel rate, and that's it. It'll exist only in the marketing hype, not in your computer. So wth, go buy it. We really need to support falsehood in advertising, you know.


------------------
Moraelin -- the proud member of the Idiots' Guild


[This message has been edited by Moraelin (edited 04-27-2000).]
 
morphious

New Member
#8
I would like to add something to Moraelin's letter. I just saw a pic of the Voodoo 4 and 5, and wow, that thing is huge. You will be lucky to fit it in your case; it is like a foot long. It has like 5 fans, and it looks like a piece of crap that will melt your computer if one of the fans doesn't work (also, it doesn't have any heatsinks).
 
iixus

New Member
#9
Mor:

"Briefly, yeah, they'll probably get a bunch of people to buy it based on lots of hype, and a lot less actual substance. Then again, what else is new?"

Will you EVER admit that there are better cards than a Voodoo3?

{{{LONG LIVE GEFORCE DDR-DVI}}}

And I thought this new card would have DDR... would it make sense not to, after it's already in their current generation of cards? That would seem like a downgrade...
 
Secrom

Senior Dismember
#10
3dfx seems to be in quite some trouble, or even drowning in sh!t.

Everybody knows they are far too late (damn, do they have a discount on Valium or what?). They already were with the V3. As for their reasons, I don't know, and I think most gamers don't care anyway. It's be there or be square.

It seems that customer loyalty is rotting. The products are not that exciting anymore (cf. the delays), and not that much more price-competitive either.

And then there are all these acquisitions, which must have crushed their cash flow.
I would not like to be the CEO.

From a product point of view, I'd say they only have two aces left if they want to keep a bright image: drivers and (potentially) the V5 6000, the main issue being price.
Drivers, because from what I have heard the GTS will use the Detonator, already highly optimised, whereas 3dfx still has plenty of headroom.
As I see it, they are definitely going to take the challenger's seat, and we'll see the same story as with the V3 line: the top sellers will be the El Cheapos, bought by those who want cost-effective upgrades. Although those buyers could go for a GeForce instead.
Maybe then their V4 and V5 PCI boards, for mobos with integrated chipsets.

Damn, there are so many factors, I'm getting lost (and I am tired out, which doesn't help).
A headache-generating situation. Hard times for 3dfx either way.
And ATI is closing in...

I am really disappointed (and that's a V3 lover who's talking).
 
Secrom

Senior Dismember
#11
Oh, also, they are apparently phasing out Glide. Whether for a better version or simply for OpenGL, I don't know.
Any opinions?

All these threads comparing cards on the basis of questionable benchmarks are a bit unfair, though. "Questionable" includes beta drivers, platform choices and so on.

Anyway, everybody seemed to know the V5 5500 would mostly (unfortunately?) stand against the GeForce, so why complain now that it cannot make it against the GTS?
 
BladeRunner

Silent & Cool.....
#12
Where do all you GeForce lovers suddenly appear from? It beats me why you are all so fanatical about what has been (and still is) a troublesome product that should not have been released in the beta form it was.

The GTS will be nothing special over the GeForce: up to 30% faster above 1024x768, and it will still suck at Glide games. Where is the Glide support, Nvidia? It is, and has been, open code for a while now.

I'm not a Voodoo troll either, but how come this is the only card I can truly say I've been disappointed with?

Wait until the Voodoos are out, or you may all end up eating your words.
 
Moraelin

New Member
#13
Umm... iixus, exactly what is your problem, anyway? I never said the Voodoo3s are better. I mean, hell, if you want to argue with me, at least argue about something I've said, not about some fiction.


Yes, the V3 has been outgunned for quite a while now, by just about any other card. I still think it offers better bang per buck than a GeForce, though. But then, every single card out there offers better bang per buck than a GeForce, and a lot of them offer better bang than a V3. So basically I wouldn't advise anyone to buy a V3 nowadays. A TNT2 or a G400 will likely give better performance at a similar price.

But then, if they already have a V3, I wouldn't advise them to upgrade, either. At least not right now. As I've said, for playing most games it should be more than enough. And I repeat: for _playing_ the game, not for bragging about some fps rate your eye can't even tell apart.

------------------
Moraelin -- the proud member of the Idiots' Guild
 
Todd a

New Member
#14
I must admit the card was very impressive. The Voodoo5 really struggles at lower resolutions (of course, I never use those lower resolutions) and did not show great results in FSAA (but I would probably limit that to racing games anyway). If they can get some driver improvements out by June, I might consider it.

Otherwise it will be the GeForce2. The only problem is that they have hamstrung the card with memory that's too slow. They should have tried for a) faster DDR, b) a 256-bit data path to memory, or c) dual-channel memory (see the quick comparison below). If you up the colour to 32-bit, you get very little performance difference on the GTS. Also, even with a 2.5-times-faster geometry engine, the GTS shows little improvement in DMZG or the new Evolva game.
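A rough peak-bandwidth comparison of those three options, with the baseline figures assumed to be 166 MHz DDR on a 128-bit bus:

```python
# Rough peak bandwidth for the three memory fixes suggested above.

def gbps(bus_bits, mem_mhz, channels=1, ddr=True):
    """Peak GB/s: clock x transfers/clock x bus width x channels."""
    return mem_mhz * 1e6 * (2 if ddr else 1) * (bus_bits / 8) * channels / 1e9

print(f"GTS as shipped (128-bit, 166 MHz DDR): {gbps(128, 166):.1f} GB/s")
print(f"a) faster DDR  (128-bit, 200 MHz DDR): {gbps(128, 200):.1f} GB/s")
print(f"b) 256-bit bus (166 MHz DDR):          {gbps(256, 166):.1f} GB/s")
# c) has the same peak as b), just reached with two channels instead of
# one wider bus.
print(f"c) dual-channel 128-bit, 166 MHz DDR:  {gbps(128, 166, channels=2):.1f} GB/s")
```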

Maybe by June there will be a few cards out there from vendors willing to spend a little more for faster DDR memory. The Radeon 256 is supposed to use 400MHz DDR.

The core on the GTS is VERY impressive, though, with lower wattage and heat, and overclocking to almost 250MHz on pre-production chips... WOW!!! The DDR still tops out at about 350-370MHz.
 
LaoChe

New Member
#15
I'm with Blade! The GeForce has problems playing UT, and that is my FAVORITE game! I don't care how fast it is or how much better the other games look; if UT looks bad or doesn't play well, then I don't want it! Hopefully the new drivers for the V5 5500 will help A LOT. The thing that has me worried is that when 4X FSAA is turned on, performance drops like a rock. I really like the way FSAA improves the look of the game, but it might take the V5 6000 to make 4X FSAA play well.
 
Moraelin

New Member
#16
Oh yeah, I almost forgot. I stand by what I've said: there is no way in hell for them to actually deliver that 1.6 gigatexel fill rate. In fact, there's no way in hell they'll deliver even _half_ of what they promise. Even with DDR, the memory bandwidth just isn't enough for that, plain and simple. And you can see it in that measly 30% speed improvement in the benchmarks, instead of the promised 4x higher performance.

I.e., they're selling hype, not substance.

If you think they are NOT actually selling hype and snake oil, care to explain what obvious point I overlooked? I mean, wth, do they have four-ported RAM on that board? Did they make a time machine and bring QDR RAM back from a few years in the future? Or what?

------------------
Moraelin -- the proud member of the Idiots' Guild
 
Tonka

New Member
#17
I don't understand why some of you are complaining about the GeForce and UT. I have an Asus GeForce DDR, and UT runs beautifully. I think it might be your system and not the GeForce card itself. I wish I could be of more help, but that is just my opinion.
 
LaoChe

New Member
#18
Sorry Tonka, I don't have a GeForce (I have a Voodoo3 3000), but A LOT of people are complaining about the GeForce and UT. That is why I am kind of hesitant about getting one of the new GeForces. I know the new Voodoos will have no problems with UT, since Glide is what they do best. That is why I'm kinda leaning toward the Voodoos.
 
Huge

Why am I still up?
#19
Umm...did I read this right???

GeForce2 Presentation [08:41 pm]
84 Comments / Threads - Steve Gibson
I got a hold of the presentation that was given at the recent WinHEC show announcing the GeForce2-GTS. It's a pretty lengthy presentation that was a huge 15+ meg file but I hacked it down to quite a bit less. You can check out the entire presentation here. Just click the image itself and it will take you to the next slide. Keep in mind this card will be on shelves within the week.

http://www.shugashack.com Goodbye, TNT2U!




[This message has been edited by Huge (edited 04-28-2000).]
 
Huge

Why am I still up?
#20
LaoChe, with the new 5.xx drivers I can get 50+ fps with my TNT2U card. It's weird how some people get good performance with nVidia, yet others get utter crap.
 
