GEFORCE 2 finally revealed


Thread: GEFORCE 2 finally revealed


  1. #1
    Join Date
    Dec 1999
    Location
    whitby/ cobourg, ont, canada
    Posts
    1,263

    GEFORCE 2 finally revealed

    The GeForce 2 looks amazing. It's way better than the Voodoo4 or 5 and the original GeForce. And what happened to the NV15? I haven't seen anything on it from nVidia yet. Well, we can only wait.
    duron 800@900 - asus a7v - 640mb pc133 - radeon 8500le 128mb - sb live - 16x dvd-rom - 48x24x48 burner

  2. #2
    Join Date
    Aug 1999
    Location
    Norwich, UK
    Posts
    3,790
    Well, from the benchmarks I've seen so far, it is around 30% faster at and above 1024 x 768.
    That's not really much to get excited about.
    The move to .18 micron silicon is the best news, as that might mean the GPU can stay below flashpoint!

    Also, the Voodoo 5 is still in alpha state and the drivers are not near release spec; any hardware site worth its salt that has tested them will make this clear. I expect there to be very little between the GeForce 2 and the V5 5500 in final guise. I just hope the V5 6000 is significantly faster than both.

    BladeRunner

  3. #3
    Join Date
    Aug 1999
    Posts
    3,425
    A BAD *** new feature with the GF2GTS is pixel-shading! If game programmers use this feature, a game will look TOTALLY different on a GF2GTS than on any other card! However, I'm still very much looking forward to the Voodoo5 6000 and the retail drivers for the Voodoo5 5500. What I really want to see are UT benchmarks between a GF256, GF2GTS, Voodoo5s, and Voodoo3s. I want to see how much a GF2GTS or Voodoo5 would improve the picture quality and FPS in UT compared to my Voodoo3 3000.


    Here is an example of what pixel-shading can do to a game.

    [This message has been edited by LaoChe (edited 04-26-2000).]

  4. #4
    Join Date
    Feb 2000
    Location
    Deutschland
    Posts
    936
    I think the GeForce 2 _is_ the NV15.

    But basically, I'm pretty much unimpressed. Compared to the Voodoos, yeah, it'll probably be better. But even with 3dfx's beta drivers: not by much. BUT... compared to what ATI says their next card will do, it doesn't really sound that hot.

    First of all, even on paper, it can only do 25 million triangles, while ATI claims 30 million. And that's only as long as the triangles are attached to each other in a very well-defined way (long strips); otherwise it drops to about 8 million. Basically, it doesn't do 25 million _triangles_, it does 25 million vertices per second. There's a difference.
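
    To put rough numbers on that distinction, here's a back-of-envelope sketch in Python (the 25 million figure is the claim quoted above, not a measured number):

        # Why "25 million triangles/s" really means 25 million vertices/s.
        VERTEX_RATE = 25_000_000  # claimed peak, vertices per second

        # In a long triangle strip, each new vertex adds one new triangle,
        # so the triangle rate approaches the vertex rate.
        strip_tris = VERTEX_RATE             # ~25 M triangles/s, best case

        # Independent (unshared) triangles cost 3 vertices each.
        independent_tris = VERTEX_RATE // 3  # ~8.3 M triangles/s

        print(f"strips:      {strip_tris / 1e6:.1f} M triangles/s")
        print(f"independent: {independent_tris / 1e6:.1f} M triangles/s")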

    Second, from those benchmarks it looks like it's _very_ far from the promised 1.6 gigatexels. At a wild guess, it chokes on the memory bandwidth, like the GeForce 256 did with SDR. Simple maths on the memory timings would have shown that there just wasn't room for that 4x increase. Heck, there wasn't even room for a 2x increase. Until someone makes QDR, those 1.6 gigatexels will likely exist only on paper and in marketing hype. And, no, there won't be any QDR available any time soon. (On the other hand, if ATI actually makes a memory access optimization that works, it could easily gain some serious ground there.)
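
    And the same kind of sketch for the bandwidth side (the 200 MHz core and 166 MHz DDR on a 128-bit bus are my assumptions based on the published GTS specs, not measurements):

        # Rough sanity check of the 1.6 gigatexel claim against memory bandwidth.
        core_clock = 200e6                  # Hz (assumed GTS core clock)
        texel_rate = core_clock * 4 * 2     # 4 pipes x 2 texels/clock = 1.6e9/s

        mem_clock = 166e6                   # Hz; DDR moves data twice per clock
        bus_bytes = 128 // 8                # 128-bit bus = 16 bytes per transfer
        mem_bw = mem_clock * 2 * bus_bytes  # ~5.3e9 bytes/s peak

        # Even a single bare 32-bit texel fetch per texel would need more:
        needed = texel_rate * 4             # 6.4e9 bytes/s, before Z or framebuffer
        print(f"texel rate:    {texel_rate / 1e9:.1f} Gtexels/s")
        print(f"memory peak:   {mem_bw / 1e9:.1f} GB/s")
        print(f"fetches alone: {needed / 1e9:.1f} GB/s")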

    Third, the features really don't look that impressive.

    Briefly, yeah, they'll probably get a bunch of people to buy it based on lots of hype, and a lot less actual substance. Then again, what else is new?

    ------------------
    Moraelin -- the proud member of the Idiots' Guild

  5. #5
    Join Date
    Dec 1998
    Location
    Grand Haven, Michigan, USA
    Posts
    11,332
    The card does look very impressive on a PIII 800/i820/DDR with fast writes enabled. On an Athlon 1GHz/ASUS K7V/PC133, make sure you get the new BIOS for the K7V.

    Without fast writes and 800 MHz Rambus memory, the GeForce2 GTS seems to struggle. This gives the PIII an advantage at the higher resolutions. Check out the article yourself:
    http://www.sharkyextreme.com/hardwar..._guide/4.shtml

    The Voodoo5 5500 is not doing so well. If the release drivers are not a lot better, they might go under. Their stock is at under $10 a share, and I think nVidia's is at $80+.

    I'd like to see how it performs on a K6-2 450 system (and also the Spitfire and Thunderbird). AMD was demoing their DDR chipsets today. I wonder what fast writes would be like with 133 MHz DDR memory?
    AMD Phenom II x4 945 3Ghz | ASUS M4A77TD | 2X WD 1TB SATA 2 hard drive | 2x2GB Corsair XMS3 | nVidia GeForce 8800 GTS | ATI TV Wonder Theater Pro 550 | Antec P-160 case | Antec 650w Earth Watts | LG Blu-ray Super Drive | LG DVD RW | Windows 7 Pro

  6. #6
    Join Date
    Jan 2000
    Location
    Huntsville, AL
    Posts
    104
    All I can say is: look out, 3dfx. The Q3 benches I saw at Anand's show the GeForce2 demolishing the V5 5500. I'm kinda skeptical about them pulling that much performance out of driver optimization.

    The GTS looks impressive from where I sit, but I think I'll wait for the NV-20 since my Anni Pro is doing fine for now.
    _________________________
    "Give a man a fish and he will eat for a day,
    Teach a man to fish and his wife will divorce
    him, get the house, the kids, the boat, his rods
    and reels, and he will learn to drink..."

  7. #7
    Join Date
    Feb 2000
    Location
    Deutschland
    Posts
    936
    Even with fast writes and everything, let's do some simple maths. That 1.6 gigatexel figure is based on the assumption that each of those 4 units can actually be kept going on and on and on, like the Energizer bunny, doing two textures per clock.

    Now let's say we're running in 32 bit colour.

    A) Let's take a worst case scenario first, where everything is drawn back to front. Two textures per unit mean two memory reads to get that pixel. Plus one memory read from the Z-Buffer, plus one memory write to draw the pixel, plus a memory write to update the Z-Buffer. (If you also have transparency/translucency effects, add one read to get the old pixel. But let's assume we have no transparent textures.) That's five memory operations per cycle, and per texturing unit. Now multiply this by 4 units, and you get 20 memory accesses per clock. Times 32 bits, that's 640 bits moved per clock.

    640 bits per clock on a 128-bit bus? Dream on. And real-life memory will have at least a 6-cycle penalty for a page miss, which will happen a lot when reading textures. The memory writes aren't that fast, either.

    B) Even in the best possible scenario, things aren't looking that much better. As in: all the drawing is done front to back, there are no transparencies, and you're looking straight at a wall, so everything gets occluded right away. (And assuming the card or game is actually smart enough to optimize drawing for this situation. That remains to be seen.) It's not quite the typical situation in a game, unless your only purpose in life is to view walls up close, but let's pretend it happens. And it only happens after the first layer of pixels has been drawn, as per the previous scenario. But even so, it's still at least one read per texturing unit for the Z-buffer. Now it's 4 operations times 32 bits, which is 128 bits moved per clock. On a 128-bit bus. It fits quite nicely, but you'd need some very ideal memory to actually get it. As in: memory that can do one operation per GPU clock, and again, that just doesn't exist.

    C) So far I've been assuming that all the textures and triangles are in the card's memory, and _nothing_ needs to be transferred over the AGP bus. I.e., not only with fast writes, but even with divine intervention on the AGP bus, it'll still fall a lot short of that advertised number.

    D) Note that the above calculations have already been _very_ generous. E.g., I've been blissfully ignoring bus usage issues. To actually get that number of bits per clock, the 128 bit bus would have to be able to act like 4 independent 32 bit busses, with separate address and control lines. Furthermore, the memory would have to be quad-ported so reads from the same chip don't wait for each other. In practice, the situation would be a lot less nice.

    E) I've also been ignoring the fact that the screen refresh itself needs to read the memory, too. At least at high resolutions and high refresh rates, this eats some of the memory bandwidth as well. At, say, 1280x1024 in 32-bit colour, with a 75 Hz refresh rate, that's roughly 375 megabytes per second eaten just by that. It's not much compared to DDR bandwidth, but it's there.
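
    Putting scenarios A, B and E into code, with the same assumptions as above (none of these are measured figures):

        BYTES = 4  # 32-bit colour, 32-bit Z

        # A) back to front: 2 texture reads + Z read + Z write + colour write,
        #    per texturing unit, times 4 units, times 32 bits
        worst = 5 * 4 * 32
        print(f"worst case: {worst} bits per clock on a 128-bit bus")

        # B) front to back, fully occluded: just one Z read per unit
        best = 1 * 4 * 32
        print(f"best case:  {best} bits per clock on a 128-bit bus")

        # E) refresh overhead at 1280x1024, 32-bit colour, 75 Hz
        refresh = 1280 * 1024 * BYTES * 75
        print(f"refresh alone: {refresh / 2**20:.0f} MB/s")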

    Briefly: the GeForce 2 can't possibly achieve those advertised gigatexels, and that's it. They'll exist just in the marketing hype, not in your computer. So wth, go buy it. We really need to support falsehood in advertising, you know.

    ------------------
    Moraelin -- the proud member of the Idiots' Guild


    [This message has been edited by Moraelin (edited 04-27-2000).]

  8. #8
    Join Date
    Dec 1999
    Location
    whitby/ cobourg, ont, canada
    Posts
    1,263
    I would like to add something to Moraelin's letter. I just saw a pic of the Voodoo 4 and 5, and wow, that thing is huge. You will be lucky to get it in your case; it is like a foot long. It has like 5 fans and looks like a piece of crap that will melt your computer if one of the fans doesn't work (also, it doesn't have any heatsinks).
    duron 800@900 - asus a7v - 640mb pc133 - radeon 8500le 128mb - sb live - 16x dvd-rom - 48x24x48 burner

  9. #9
    Join Date
    Feb 2000
    Location
    NY
    Posts
    257
    Mor:

    "Briefly, yeah, they'll probably get a bunch of people to buy it based on lots of hype, and a lot less actual substance. Then again, what else is new?"

    Will you EVER admit that there are better cards than a Voodoo 3?

    {{{LONG LIVE GEFORCE DDR-DVI}}}

    And I thought this new card would have DDR... would it make sense not to, after it is in their current generation of cards? It would seem like a downgrade...

  10. #10
    Join Date
    Feb 2000
    Location
    Half the way between a PC screen and the scary real life...
    Posts
    886
    3dfx seems quite in trouble, or even drowning in sh!t.

    Everybody knows they are far too late (damn, do they have a discount on Valium or what?). They already were with the V3. As for their reasons, I don't know, and I think most gamers don't care anyway. It's be there or be square.

    It seems that customer loyalty is rotting. The products are not that exciting anymore (cf. the delays), and not that much more price-competitive either.

    And then there are all these acquisitions, which must have crushed their cash flow.
    I would not like to be the CEO.

    From a product point of view, I'd say they only have two aces left if they want to keep a bright image: drivers and (potentially) V5 6000, the main issue being price.
    Drivers, because from what I have heard the GTS will use the Detonator, already highly optimised, whereas 3dfx still has plenty of headroom.
    As I see it, they are definitely going to take the challenger's seat, and we'll see the same story as with the V3 line: the top-selling products will be the El Cheapos, bought by those who want cost-effective upgrades. Although those buyers could go for a GeForce instead.
    Then maybe their V4 and V5 PCI boards, for mobos with integrated chipsets.

    Damn, there are so many factors, I'm getting lost (and I am tired out, that doesn't help).
    Headache generating situation. But hard times for 3dfx anyway.
    And ATI is closing in...

    I am really disappointed (and that's a V3 lover who's talking).
    Main rig: Athlon64 x2 4200+ | Asrock 939N68PV-GLAN | 4x512MB PC3200 | 120GB Seagate | Samsung DVDRW SH-S203D | VX550W PSU | WinXP Home SP3 / Ubuntu 7.10
    Headache generator: K6-2 500 | FIC VA503+ | 2x64MB PC133 | Voodoo 3 2000 PCI | SB32 ISA PnP | 15,2MB IBM | Xubuntu 8.04 / DeLiLinux 0.7.2
    Lappy: Compaq EVO N115 | Mobile Duron 1GHz | 384MB PC133 | VIA KM133 | Xubuntu 8.04
    A big thanks to MS's Vista for making me look into Linux distros!

  11. #11
    Join Date
    Feb 2000
    Location
    Half the way between a PC screen and the scary real life...
    Posts
    886
    Oh, also, they are apparently phasing out Glide. Whether for a better version or simply to go with OpenGL, I don't know.
    Any opinion?

    All these threads comparing cards on the basis of questionable benchmarks are a bit unfair, though. Questionable includes beta drivers, platform choices and so on.

    Anyway, everybody seemed to know the V5 5500 would mostly (unfortunately?) stand against the GeForce, so why complain now that it cannot make it against the GTS?
    Main rig: Athlon64 x2 4200+ | Asrock 939N68PV-GLAN | 4x512MB PC3200 | 120GB Seagate | Samsung DVDRW SH-S203D | VX550W PSU | WinXP Home SP3 / Ubuntu 7.10
    Headache generator: K6-2 500 | FIC VA503+ | 2x64MB PC133 | Voodoo 3 2000 PCI | SB32 ISA PnP | 15,2MB IBM | Xubuntu 8.04 / DeLiLinux 0.7.2
    Lappy: Compaq EVO N115 | Mobile Duron 1GHz | 384MB PC133 | VIA KM133 | Xubuntu 8.04
    A big thanks to MS's Vista for making me look into Linux distros!

  12. #12
    Join Date
    Aug 1999
    Location
    Norwich, UK
    Posts
    3,790
    Where do all you GeForce lovers suddenly appear from? It beats me why you are all so fanatical about what has been (and still is) a troublesome product that should not have been released in the beta form it was.

    The GTS will be nothing special over the GeForce: up to 30% above 1024 x 768, and it will still suck at Glide games. Where is the Glide support, Nvidia? It is, and has been, open code for a while now.

    I'm not a Voodoo troll either, but this is the only card I can truly say I've been disappointed with.

    Wait until the Voodoos are out or you may all end up eating your words.

  13. #13
    Join Date
    Feb 2000
    Location
    Deutschland
    Posts
    936
    Umm... Iixus, exactly what is your problem, anyway? I never said the Voodoo 3 is better. I mean, hell, if you want to argue with me, at least argue about something I've said, not about some fiction.

    Yes, the V3 has been outgunned for quite a while now, by just about any other card. I still think it offers better bang per buck than a GeForce, though. But then, every single card out there offers better bang per buck than a GeForce, and a lot of them offer better bang than a V3. So basically I wouldn't advise anyone to buy a V3 nowadays. A TNT2 or a G400 will likely give better performance at a similar price.

    But then, if they already have a V3, I wouldn't advise them to upgrade, either. At least not right now. As I've said, for playing most games it should be more than enough. And I repeat: for _playing_ the game, not for bragging about some fps rate your eye can't even perceive.

    ------------------
    Moraelin -- the proud member of the Idiots' Guild

  14. #14
    Join Date
    Dec 1998
    Location
    Grand Haven, Michigan, USA
    Posts
    11,332
    I must admit the card was very impressive. The Voodoo5 really struggles at lower resolutions (of course, I never use those lower resolutions) and did not show great results with FSAA (but I would probably limit that to racing games anyway). If they can get some driver improvements out by June, I might consider it.

    Otherwise it will be the GeForce2. The only problem is that they have hamstrung the card with memory that is too slow. They should have tried for a) faster DDR, b) a 256-bit data path to memory, or c) dual-channel memory. If you up the colour to 32-bit, you get very little performance difference on the GTS. Also, even with a 2.5 times faster geometry engine, the GTS shows little improvement in DMZG or their new GTS Evolva game.
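
    For comparison, here is a quick Python sketch of what those three options would buy in peak bandwidth (the 166 MHz DDR baseline is my assumption based on the shipping GTS spec, and the upgraded figures are hypothetical):

        def bandwidth(clock_hz, bus_bits, channels=1, ddr=True):
            """Peak memory bandwidth in bytes per second."""
            transfers = clock_hz * (2 if ddr else 1)
            return transfers * (bus_bits // 8) * channels

        base = bandwidth(166e6, 128)             # shipping GTS: ~5.3 GB/s
        faster_ddr = bandwidth(200e6, 128)       # a) e.g. 200 MHz DDR: ~6.4 GB/s
        wide_bus = bandwidth(166e6, 256)         # b) 256-bit path: ~10.6 GB/s
        dual_channel = bandwidth(166e6, 128, 2)  # c) dual channel: ~10.6 GB/s

        for name, bw in [("base", base), ("faster DDR", faster_ddr),
                         ("256-bit", wide_bus), ("dual channel", dual_channel)]:
            print(f"{name:12s} {bw / 1e9:.1f} GB/s")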

    Maybe in June there will be a few cards out there from makers willing to spend a little more for faster DDR memory. The Radeon 256 is supposed to use 400 MHz DDR.

    The core on the GTS is VERY impressive, though: lower wattage and heat, and overclocking to almost 250 MHz with pre-production chips... WOW!!! The DDR still tops out at about 350-370 MHz.
    AMD Phenom II x4 945 3Ghz | ASUS M4A77TD | 2X WD 1TB SATA 2 hard drive | 2x2GB Corsair XMS3 | nVidia GeForce 8800 GTS | ATI TV Wonder Theater Pro 550 | Antec P-160 case | Antec 650w Earth Watts | LG Blu-ray Super Drive | LG DVD RW | Windows 7 Pro

  15. #15
    Join Date
    Aug 1999
    Posts
    3,425
    I'm with Blade! The GeForce has problems playing UT, and that is my FAVORITE game! I don't care how fast it is or how much better the other games look; if UT looks bad or doesn't play well, then I don't want it! Hopefully the new drivers for the V5 5500 will help a LOT. The thing that has me worried is that when 4X FSAA is turned on, performance drops like a rock. I really like the way FSAA improves the look of the game, but it might take the V5 6000 to make 4X FSAA play well.
