16 pipes in NV40?

Oxblood

New Member
#1
Hi,

About a week ago I read on theinquirer.net that NV40, NVIDIA's next chip, will have 16 pipelines for processing.

When I read this I waited around in the video cards board for a good thread to get started about it :)

However, I don't think anyone made mention of it, and I was surprised.

anywho:

Is this true?

What does this mean for ATI? Are 8 pipelines completely inadequate against 16?
 
grover

Expert
#2
Depends on the pipeline architecture... but judging by the similar pipeline performance in the present cards? Yeah, if true, it's probably close to 50% faster than an 8x1 card! But the die size would make them really expensive... of course, with Doom3 packaged with new nvidia cards, that would offset the cost a bit ;)
 
wrathchild_67

Hidden Member
#3
Yeah, MuFu posted about this about a week ago, but the thread mysteriously vanished within minutes. [cue ghost sounds]
 
Todd a

New Member
#4
The pipelines give you the basic fill rate of a card, but other things like memory bandwidth and hardware features like PS and VS can restrict the speed quite a bit (and AA and AF too).
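To put rough numbers on that (purely illustrative - the clocks, bus width and per-pixel costs below are made-up examples, and it ignores compression, caching, overdraw and so on):

```python
# Back-of-the-envelope: why raw pipeline fill rate can be capped by memory bandwidth.
# All figures are example values, not real specs for any card.

def theoretical_fill_rate(pipes, core_mhz):
    """Pixels per second the pipelines could write, ignoring everything else."""
    return pipes * core_mhz * 1e6

def bandwidth_limited_fill_rate(mem_mhz, bus_bits, bytes_per_pixel=12):
    """Pixels per second the memory can feed, assuming DDR signalling and
    roughly 12 bytes of traffic per pixel (4B colour write + 8B Z read/write)."""
    bytes_per_sec = mem_mhz * 1e6 * 2 * (bus_bits // 8)
    return bytes_per_sec / bytes_per_pixel

eight_pipe   = theoretical_fill_rate(8, 500)          # hypothetical 8x1 @ 500MHz
sixteen_pipe = theoretical_fill_rate(16, 500)         # hypothetical 16x1 @ 500MHz
mem_cap      = bandwidth_limited_fill_rate(500, 256)  # hypothetical 500MHz DDR, 256-bit

print(f"8x1 raw fill:    {eight_pipe / 1e9:.1f} Gpixel/s")
print(f"16x1 raw fill:   {sixteen_pipe / 1e9:.1f} Gpixel/s")
print(f"memory can feed: {mem_cap / 1e9:.1f} Gpixel/s")
# Doubling the pipes doubles the raw figure, but the memory line doesn't move,
# so a 16-pipe card won't get anywhere near 2x in bandwidth-heavy situations.
```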

I keep getting a lot of mixed info. See, most of the sites that talk about the 16 pipelines only say NV4x. The problem is the NV45 is supposed to come out later this year and was supposed to have 16. I'm wondering if people are getting these two cards mixed up. I did hear mention of something like 220M transistors, so it could be true.
 
MuFu

Moderator
#5
Sorry about the disappearing thread - wanted to check with a couple of people who spoke to David Kirk about it first.

16x1/32x0
~475 core
600MHz GDDR3
PS/VS 3.0

The estimated power consumption is pretty scary - 150W max. :eek:
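Quick sums on those figures, for anyone curious (the 256-bit bus is just an assumption on my part, not confirmed):

```python
# Rough theoretical numbers from the rumoured specs above.
pipes, core_mhz   = 16, 475
mem_mhz, bus_bits = 600, 256   # bus width assumed, not part of the rumour

fill_rate = pipes * core_mhz * 1e6                 # pixels per second
bandwidth = mem_mhz * 1e6 * 2 * (bus_bits // 8)    # GDDR3 is DDR: 2 transfers per clock

print(f"fill rate: {fill_rate / 1e9:.1f} Gpixel/s")   # ~7.6 Gpixel/s
print(f"bandwidth: {bandwidth / 1e9:.1f} GB/s")       # ~38.4 GB/s
```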

MuFu.
 
ThreeOnTheTree

New Member
#6
That is pretty unfortunate if they consume that much power. I think with Intel pushing to increase the TDP of Socket T, we will already be experiencing some power supply deficiencies. I'd imagine an NV40 paired with a 3.6GHz Prescott will consume so much power that you will probably need a quality 500W PSU as a bare minimum. Then there is the whole thermal management issue. BTX should help, but it looks as though the large fans will have to be pretty powerful to vent 200-250W of heat. It will be like running a server from your bedroom.
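Rough budget, just to illustrate the point (every wattage below is a ballpark guess, not a measurement):

```python
# Very rough full-system power budget - all values are illustrative guesses.
components_watts = {
    "3.6GHz Prescott CPU": 115,   # guess at load draw
    "NV40 card":           150,   # the rumoured maximum draw
    "motherboard + RAM":    50,
    "drives + fans":        40,
}
total = sum(components_watts.values())
print(f"estimated load: ~{total}W")   # ~355W
# PSUs are least efficient and least stable near their rated limit, so you want
# plenty of headroom above this - hence a quality 500W unit as a bare minimum.
```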
 
MuFu

Moderator
#7
Well, it'll need a couple of Molex feeds and AGP power from a fat PSU, or have its own power supply ("Voodoo Volts" anybody?).

MuFu.
 
Todd a

New Member
#8
Man, that would suck. I read that even the 550W Antec power supplies really did not put out much above 500W. Today's power supplies will really struggle. If it does use 150W, it is going to make the dustbuster cooling look mild compared to what this beast will need.
 
grover

Expert
#9
Originally posted by MuFu
Well, it'll need a couple of Molex feeds and AGP power from a fat PSU, or have its own power supply ("Voodoo Volts" anybody?).

MuFu.
The GeForce FX already requires both AGP and Molex power. But that's the direction it's going: with video cards turning into miniature PCs in their own right, their power consumption is going up as fast as CPU consumption. And heat... 150W is just ridiculous! I mean, how the hell are we supposed to cool that? There's no way that's true - it would never make it to production if it were.
 
MuFu

Moderator
#10
I presume 150W is the 3-rail draw. It'll probably thermally dissipate about 100-120W.

MuFu.
 
Todd a

New Member
#11
That is still insane. Of course, some of that is likely from the memory, so it might be 100W for the GPU. That will still need a mighty heatsink to keep cool. I still have images of the FX Flow cooler on the original GeForce FX 5800 Ultras, the funny pictures of people using them as leaf blowers and hair dryers, and articles saying they sounded like a vacuum cleaner inside the case when they powered up. :D
 
ThreeOnTheTree

New Member
#12
I think the real challenge in the future will not be to get the best bang for your buck, but rather the best bang for your watt! :)