There is no 9700 Pro 256MB planned, AFAIK. There's a 9800 Pro 256MB and a 9800 Pro 256MB w/DDR-II in the works. The benefits won't be too great initially - have a look at the performance of the forthcoming GeForceFX 5900 Ultra (256MB) compared to the non-Ultra (128MB) to get a feel for things. They should be launched in about 1 1/2 weeks' time and the price of the Ultra is probably going to be $499.
I am told that one of the nVidia tech demos they are going to use to launch the 5900 Ultra currently uses 344MB of textures (!!!). They are working frantically to get the size down, because it's very impressive (an abandoned gas station with a bunch of snazzy, real-time lighting effects) but runs very slowly at the moment due to the AGP fetches it forces.
I think some UT2003 maps already use close to 128MB of textures, and the likes of Doom III, STALKER, HL2 etc. probably will as well when configured for highest quality. It's not as if these games won't run without 256MB of RAM; they'll just run very slowly.
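For a rough feel of how texture sets reach those sizes, here's a back-of-envelope footprint estimate (Python; the texture count and dimensions are hypothetical, just to illustrate the arithmetic):

```python
def texture_mb(width, height, bytes_per_texel=4, mipmaps=True):
    """Approximate VRAM footprint of one uncompressed texture, in MB.
    A full mipmap chain adds roughly one third on top of the base level."""
    size = width * height * bytes_per_texel
    if mipmaps:
        size = size * 4 // 3  # 1 + 1/4 + 1/16 + ... ~= 4/3
    return size / (1024 * 1024)

# Hypothetical level: one hundred 512x512 32-bit textures with mipmaps.
total = 100 * texture_mb(512, 512)
print(f"~{total:.0f} MB")  # ~133 MB - already most of a 128MB card
```

Texture compression (S3TC/DXT, which most of these engines support) cuts that by a factor of 4-8, which is part of why current games still fit on 128MB cards at all.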
I want one of them there disco machines. All those pretty lights turn me all the way on. But on to the topic at hand: I have an Asylum GF4 Ti4200 128MB card and I have not really seen that much of a difference between it and a GF3 Ti200 card.
Originally posted by MuFu There is no 9700 Pro 256MB planned, AFAIK. There's a 9800 Pro 256MB and a 9800 Pro 256MB w/DDR-II in the works. The benefits won't be too great initially - have a look at the performance of the forthcoming GeForceFX 5900 Ultra (256MB) compared to the non-Ultra (128MB) to get a feel for things. They should be launched in about 1 1/2 weeks' time and the price of the Ultra is probably going to be $499.
As far as I know, the stuff nVidia is calling DDR2 is not the same thing as what JEDEC is calling DDR2. I think the JEDEC DDR2 is supposed to be quad-pumped memory, with data bits being sent 4 times per cycle (DDR sends 2 per cycle). The stuff nVidia uses is supposed to just be tweaked DDR that can clock a bit faster and has lower latency (or something like that); it still does 2 transfers per cycle like DDR1. ATI has a similar memory to nVidia's DDR2, but they are calling it GDDR-II or something like that. I wish JEDEC would just call it QDR for Quad (instead of Double).
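Taking the post's descriptions at face value (two transfers per cycle for DDR vs. four for a "quad" design), the bandwidth difference is easy to put in numbers. The clock and bus width below are illustrative, not the specs of any real card:

```python
def bandwidth_gb_s(clock_mhz, transfers_per_cycle, bus_width_bits):
    """Peak memory bandwidth in GB/s for a given configuration."""
    bytes_per_transfer = bus_width_bits / 8
    return clock_mhz * 1e6 * transfers_per_cycle * bytes_per_transfer / 1e9

# Same hypothetical 300MHz clock on a 256-bit bus:
ddr  = bandwidth_gb_s(300, 2, 256)  # DDR: two transfers per cycle
quad = bandwidth_gb_s(300, 4, 256)  # quad-pumped: four per cycle

print(f"DDR : {ddr:.1f} GB/s")   # 19.2 GB/s
print(f"Quad: {quad:.1f} GB/s")  # 38.4 GB/s
```

Of course, peak figures like these ignore latency, which is the other half of the trade-off mentioned above.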
It would help unconfuse things a bit, but every manufacturer has to have "brand recognition" - that's where they make sure the consumer sees a difference between them and their competitors.
Which is why we see that ATI is way better atm.
I still remember reading an article back when my brother got his 8MHz IBM AT and there was a 12MHz one out already. The article said that the speed of computers would top out at about 30-35MHz since the electricity could not travel any faster down the circuits. Man, that makes me laugh looking back at it.
Now admittedly the benefit of this faster hardware is not as much as it used to be. Back in the old days games ran really slowly on most computers (like Quake 2) and it took years for the hardware to catch up in the way of CPU speed and graphics card power. I remember upgrading my system and running Quake 2 on my K6-2 350 and Voodoo3 200, and being amazed that I could run 1024x768 and it looked pretty smooth. This game came out in the days of the P75 and a Voodoo1. And benchmarks like crusher still brought my system to its knees. It was not until I got my Duron [email protected] that I had completely conquered the game performance-wise (still with my Voodoo3). As time went by, though, the hardware grew faster than the software. My Duron [email protected] did not run so well on my K6-2 450, but ran 45FPS at 1024x768. With my Duron I could do 75FPS, but had to lower the resolution to 800x600. Then I got my GeForce2 GTS and I could run 1024x768x32 at 75FPS.
Things were really looking up. Then I got my GeForce4 Ti 4200 about the time RtCW came out and I could run that game without a hitch on my new AthlonXP 1600+. Well, now cards like the Radeon 9800 Pro and 3GHz CPUs are out and virtually no game can bring these systems to their knees, even with FSAA and AF turned on. Game developers are falling farther and farther behind. Even Doom 3 will likely run on just about any computer made within two years of its launch that does not use integrated graphics. That game only uses DX8.1 and DX 9.0a is already out. It will likely take at least 2-3 more years before games start to catch up to the hardware from today, and look where the hardware will be in 2-3 years: think 5-6GHz or faster CPUs. Dual cores will be a possibility by then. 2-4GB of RAM will be more common. Graphics cards with 1GHz cores, half a gig of memory, 16 fully pipelined texture units, and 50-100GB/s of data throughput. Real-time cinematic features would be a reality by then, but the games to take advantage of all this power would likely still be 3-5 years out. Sad, but true.
A lot to look forward to, I must say. I can't wait until games look like the Final Fantasy movie, and I mean the whole game; even games like Tropico and SimCity looking like that would be sweet. And games like Half-Life with characters looking almost like real people... can't wait!