First GeForceFX 5800 Ultra benchies start to pop up...


  1. #1
    Join Date
    Apr 2000
    Location
    UK
    Posts
    8,466

    GeForceFX 5800 Ultra released

    http://www.tecchannel.de/hardware/1109/4.html

    Not sure how long that will be up since it's breaching NDA, so here are some copies of the benchmarks:
    [benchmark screenshots from the TecChannel article, no longer available]
    Terratec has disclosed their European pricing...

    €659 = £437.89 for the 5800 Ultra
    €579 = £384.74 for the 5800

    They are both over twice the price of a Radeon 9700NP. Oh dear...

    MuFu.

  2. #2
    Join Date
    Oct 1999
    Location
    Alaska
    Posts
    1,595
    Damn! These are the early driver stages too. If they can increase the performance another 10-20% within the next couple of months I might just decide to sell a kidney...

  3. #3
    Join Date
    Jan 2000
    Location
    Plymouth, England
    Posts
    1,399
    What an anti-climax. Even with the driver increases it looks like it will only be enough to level the playing field against the R350. I can't believe this FX thingy has the gall to hog two card slots and not even bother to raise its game. For the same price you could have two Radeon 9700NPs hogging the slots in SLI mode.

    Bennyboy.
    If you've got your money for nothing, who cares if the chicks are free!

  4. #4
    Join Date
    Oct 1999
    Location
    Alaska
    Posts
    1,595
    Give it some time. I've seen performance jump 30, 40, even 50% after initial tests with alpha and beta drivers.

  5. #5
    Join Date
    May 2000
    Location
    us
    Posts
    1,411
    Still, I am not impressed. Maybe if it had come out 3 months earlier. Granted, there is room for improvement... but it had better improve a hell of a lot to justify its price.

    But then again, these benchmarks could possibly be fakes too.

  6. #6
    Join Date
    Jul 1999
    Location
    Kentucky, USA
    Posts
    893
    It would be funny if Nvidia started losing sales like 3dfx did, and ATI ended up buying them out.

    If they keep producing crap like this, I can see it happening.
    Intel P4C 3.0Ghz
    Asus P4P800-E Deluxe
    2x256mb Corsair Value Select PC3200
    eVGA e-GeForce 6800
    WD 120gb SE
    IBM 60GXP 60gb
    Creative Labs Audigy (free)
    Toshiba SD-M1402 (12x dvd/40x cd)
    Lite-On 52x24x52 ($15 retail )
    Pioneer A06 DVD+/-RW
    ThermalTake Silent Purepower 420W PSU

  7. #7
    Join Date
    Nov 2000
    Location
    Connecticut
    Posts
    6,459
    Bah, that card's got nuthin' on mine. Looks like I won't have to trade in my 9700 for a GeForce FX after all...
    My Guitar Hero 2 Xbox Live Ranking
    Main Rig (click this link for specs)
    Certified in: Nothing at all!
    Certifiable: 100% hands-on, real-world experience

  8. #8
    Join Date
    Jul 2002
    Posts
    1,230
    Talk about a disappointment. Without FSAA it barely outperforms the already old 9700 Pro, and with FSAA it's actually slower. And even if they squeezed another 10% or so with driver optimizations, it would _still_ be slower in 4x FSAA mode.

    I'm most definitely not impressed, especially at that price. I'd probably buy that kind of previous-generation performance for 200 bucks, but most certainly not 659.

    Let's see if ATI can get the R350 out the door in time.

  9. #9
    Join Date
    Aug 2002
    Location
    Winnipeg, Canada
    Posts
    778
    That's nuts for that price! And it needs its own Molex connector.
    Main
    AMD Athlon 64 3000+ 512 cache
    Asus K8V
    Samsung 1gig (2) 512 ddr400 timings 200mhz 2.5-3-3-5
    Zalman CNPS7000A-ALCU
    Radeon 9800PRO
    Audigy2
    Logitech z640
    Pioneer 107d
    Toshiba dvd 16x40
    BARRACUDA SATA 120GB,300GB
    ENERMAX 450w PSU
    Windows w2k Pro

  10. #10
    Join Date
    Jul 1999
    Location
    Kentucky, USA
    Posts
    893
    Hell, this gives ATI even more time to release whatever it is they want to release and make it twice as good. Why put out another product when the current one kills Nvidia's new card? To me that just hurts yourself. And if they do put anything else out, I don't think it should be anything but a core/memory speed increase.

    Maybe soon Nvidia will make these 128-bit DDR-II cards their high-end value line, then put 256-bit DDR-II on the cards and make those their high-end high-end cards. What idiots Nvidia were for going with 128-bit DDR-II. Surely they realized it? Hell, anyone here could have told them not to do that; it's common sense. Is there just a shortage of it, or did they just like 128-bit better?

  11. #11
    Join Date
    Apr 2000
    Location
    UK
    Posts
    8,466
    Reviews:

    http://www.hardocp.com/article.html?art=NDIx

    http://www.anandtech.com/video/showdoc.html?i=1779&p=1

    http://www.extremetech.com/article2/...,846356,00.asp

    http://www.tomshardware.com/graphic/20030127/index.html

    I suggest you check out the [H] review first - good stuff from Brent as usual.

    Interesting that despite the noise, the toasty operating temperature, mediocre overclocking potential and ridiculous cost, it probably still doesn't take the performance crown from the 400MHz+ "custom" 9700 Pros that can be had for a lot less and come with passive cooling.

    This is pretty funny (from B3D):

    Originally posted by Dave H:
    We knew Tom Pabst was good for something: FXFlow sound clips :!:

    GFFX 5800 Ultra boot sequence--fan revs up at full speed then drops to 2D settings.

    GFFX 5800 Ultra running 3dMark--fan goes from "quiet" speed to full speed, then slowly revs down once the benchmark ends.

    Radeon 9700 Pro boot sequence--for comparison. Same mic settings as previous two.

    It actually sounds exactly like a dustbuster. I know: I have a dustbuster. The resemblance is uncanny.
    MuFu.

  12. #12
    Join Date
    Jul 1999
    Location
    Kentucky, USA
    Posts
    893
    Well, HardOCP's review differs a whole lot from Anandtech's review.

  13. #13
    Join Date
    Apr 2000
    Location
    UK
    Posts
    8,466
    Originally posted by Ruffian
    What idiots Nvidia were for going with 128-bit DDR-II. Surely they realized it? Hell, anyone here could have told them not to do that; it's common sense. Is there just a shortage of it, or did they just like 128-bit better?
    You can keep trace density on the PCB the same and double the bandwidth by switching from DDR-I to DDR-II on a 128-bit bus, hence making PCB design logistics slightly more manageable.

    The problem is that despite that, the NV30 uses a twelve layer, 128-bit PCB and it retails at a price much higher than most people are willing to pay. The 9700 Pro uses an 8 layer board, has a 256-bit memory bus and costs a lot less. Not to mention the fact that it's been available for months now.
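    MuFu's bus-width point boils down to simple arithmetic. A quick sketch (the specs below are the commonly quoted launch figures, not taken from this thread, so treat them as assumptions):

    ```python
    # Peak memory bandwidth = (bus width in bytes) x (effective data rate).
    # Assumed launch specs: NV30 = 128-bit bus, 500 MHz DDR-II (1 GHz effective);
    # Radeon 9700 Pro = 256-bit bus, 310 MHz DDR (620 MHz effective).

    def bandwidth_gb_s(bus_bits: int, effective_mhz: float) -> float:
        """Peak memory bandwidth in GB/s (1 GB = 10^9 bytes)."""
        return bus_bits / 8 * effective_mhz * 1e6 / 1e9

    nv30 = bandwidth_gb_s(128, 1000)  # 16.0 GB/s despite the "1 GHz" headline
    r300 = bandwidth_gb_s(256, 620)   # ~19.8 GB/s from slower but wider memory
    print(f"NV30: {nv30:.1f} GB/s vs 9700 Pro: {r300:.1f} GB/s")
    ```

    So the 256-bit board wins on raw bandwidth even with a much lower memory clock, which is the whole story behind the PCB trade-off.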

    MuFu.

  14. #14
    Join Date
    Jul 2002
    Posts
    1,230
    One thing worth noting is that 1 GHz memory may sound impressive -- which I suppose is why Nvidia's marketroids throw that number around -- but 1 GHz on a 128-bit bus is the same bandwidth as 500 MHz on a 256-bit bus. Remembering that DDR means double data rate, a 256-bit card only needs 250 MHz memory to match the GF FX memory bandwidth. Needless to say, even the non-Pro 9700 has more memory bandwidth than that. The Pro actually has almost 25% more memory bandwidth.
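    Those equivalences are easy to sanity-check; a minimal sketch (the helper `bandwidth` is hypothetical, just illustrating the arithmetic):

    ```python
    # Bandwidth scales with bus_width x effective clock, so "1 GHz on 128 bits"
    # is exactly "500 MHz on 256 bits", and a 250 MHz DDR clock (x2 effective)
    # on a 256-bit bus matches it too.

    def bandwidth(bus_bits, clock_mhz, ddr=False):
        effective = clock_mhz * 2 if ddr else clock_mhz
        return bus_bits / 8 * effective  # bytes/cycle x MHz (relative units)

    gffx = bandwidth(128, 500, ddr=True)          # 500 MHz DDR-II on 128-bit
    assert gffx == bandwidth(256, 500)            # = 500 MHz SDR on 256-bit
    assert gffx == bandwidth(256, 250, ddr=True)  # = 250 MHz DDR on 256-bit
    ```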

    Another thing I find sort of interesting is that the 500 MHz GPU barely outperforms a 325 MHz GPU. Heck, even the increase over a 4600 doesn't seem to match the GPU speed increase. A 500 MHz GPU should have performed _much_ faster, but somehow it doesn't. At a very wild guess, either Nvidia took lessons from Intel and made a GPU that's actually slower per clock than a GF4, or we're seeing precisely that it's held back by memory bandwidth.

    I.e., I wouldn't expect a whole lot of black magic to happen with driver optimizations. Unless some miracle happens -- e.g., it really has a Kyro-style tiled renderer inside that someone simply hasn't enabled yet -- it's very likely that most (or even all) of the memory access optimization tricks are already in effect. If they weren't, it would be trailing behind the 9700 Pro.

    And that's not even taking into account that Nvidia's FSAA, as I expected, still looks like crap. Judging by those pics, I'm guessing it would take at least 16x AA to get AA out of a GFFX comparable to what I'm getting on the 9700 Pro. At which point, I doubt that any kind of driver optimization would keep the frame rate anywhere near playable.

    To be fair though, at another wild guess, where it could still be competitive is 16-bit rendering, especially since a 9700 Pro can't even do AA in 16-bit modes. But either way, _if_ that 500 MHz GPU really is held back by memory bandwidth, then it should perform one hell of a lot faster in 16-bit colour. (Though why someone would buy a €659 graphics card to play in 16-bit is another good question.)

  15. #15
    Join Date
    Aug 2002
    Location
    39.4834 N 87.3261 W
    Posts
    658
