Ever wondered where 3dfx, nVidia, ATI and Matrox are heading? What features and performance can we expect from their upcoming products? Never really got the hang of all those 3D buzzwords? This article features an in-depth explanation of the basics of 3D acceleration and discusses the features of 3D cards now and in the future, all in an effort to paint a picture of where consumer video cards are heading.
"FSAA is in the 3D spotlight at the moment, but the crowd will die down soon. This is a feature that will fade into the background." Perhaps the most ill-considered prophecy I have seen in a long time. As someone who has been using a V5 for 3 weeks and has watched Nvidia scramble to get FSAA implemented properly in its new drivers, your statement is only worthy of comment due to the large number of newbies who may read your article and assume you have some sort of clue about video cards. With the current V5 (or GTS) you are pretty much limited to FSAA in the 640x480 to 1024x768 resolutions, depending on the game. And FSAA is not just a substitute for high res; in fact the opposite is true: high res is no substitute for FSAA, as even 1600x1200 (which I suspect a majority of monitors out there cannot even support) does not eliminate the pixel crawling and popping that become so noticeable and ugly once you are used to the superior 3D image quality of FSAA. Of course your mileage may vary according to the game; for example, in UT I do not find much advantage, yet in Q3A FSAA makes a significant difference, and for sims the difference is literally night and day. In the future, with the V5 6000 and other next-gen cards with a lot of horsepower, both high res and FSAA will be possible and of course preferable. I sincerely doubt that any high-end gaming-oriented cards will be taken seriously in the future if they do not support FSAA; currently the best FSAA available is from 3dfx.
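For readers new to the term: the FSAA being argued over here is (on the V5) supersampling, which renders the scene at a higher internal resolution and filters it down, turning jagged edges into intermediate coverage values. A minimal illustrative sketch (the function name and grid values are made up for the example, not from any real driver):

```python
# Minimal 2x2 ordered-grid supersampling: render at twice the resolution,
# then average each 2x2 block down to one screen pixel (illustrative only).
def downsample_2x2(hi_res):
    """hi_res is a 2H x 2W grid of grayscale values; returns an H x W grid."""
    H, W = len(hi_res) // 2, len(hi_res[0]) // 2
    out = []
    for y in range(H):
        row = []
        for x in range(W):
            block = (hi_res[2*y][2*x] + hi_res[2*y][2*x+1] +
                     hi_res[2*y+1][2*x] + hi_res[2*y+1][2*x+1])
            row.append(block / 4)   # box filter: softens jagged edges
        out.append(row)
    return out

# A hard diagonal edge at 2x resolution...
edge = [[0, 0, 0, 0],
        [1, 0, 0, 0],
        [1, 1, 0, 0],
        [1, 1, 1, 0]]
# ...becomes intermediate coverage values at screen resolution.
print(downsample_2x2(edge))   # [[0.25, 0.0], [1.0, 0.25]]
```

Those fractional values are why edges stop "crawling" under FSAA even at resolutions where raw rendering still shows stair-stepping.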
You know, I've seen some damn long descriptions of where people live, and I was just wondering how freakin' long of a location they actually allow you to write in here. Looks like it's quite a bit. Oh well, if the space is here I'll use it!!! :)
Intelligent move not to start taking sides in the Intel vs. AMD debate until near the end of the article.
Otherwise, I would have stopped reading much sooner.
The solution to the problem of life is found in the diminishing of that problem.
IMO, Mr. Derek Smart is a hypocrite: "Only someone who is either (a) lying, (b) ashamed of their products, or (c) just plain ashamed, would hesitate to give out some simple and straightforward information." - Derek Smart, Ph.D.
"The Voodoo3's main competitor was the TNT 2 from nVidia. On the fill rate front the TNT 2 produced a 250 Mpixel/sec fill rate (300 Mpixel/sec for the TNT2 Ultra), but we have to remember that the TNT (TwiN Texel) architecture doubles the texel fill rate to a healthy 500 Mtexel/sec."
No. That's 250/300 Mpixel/sec, or 250/300 Mtexel/sec. The TNT / TNT2 has two pipelines that can either render two single-textured pixels or one dual-textured pixel (2 texels) in a single pass.
TwiN Texel is only a name - it signifies that the TNT was the first Nvidia chip with multiple texture/pixel pipelines. If you remember, after the Voodoo2 unleashed single-pass multitexturing upon us, having a single texture pipeline (like the Riva 128) suddenly became bad.
"The Voodoo 3 range gives a 286 Mpixel/sec fill rate on the 2000, with the 3000 offering 333 Mpixel/sec and finally the 3500 pumping out 366 Mpixel/sec."
Wrong again. The Voodoo 3 can do dual-texture multitexturing in a single pass, but it can only produce a single pixel per clock - those numbers are texel rates, not pixel rates.
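The fill-rate arithmetic the commenters are arguing about boils down to clock speed times pipeline/TMU counts. A quick sketch (clock figures here are the commonly cited ones from memory, so treat them as approximate):

```python
def fill_rates(clock_mhz, pipelines, tmus_per_pipe):
    """Return peak (Mpixels/sec, Mtexels/sec) for a simple pipeline layout."""
    pixel_rate = clock_mhz * pipelines                  # one pixel per pipe per clock
    texel_rate = clock_mhz * pipelines * tmus_per_pipe  # textures applied per clock
    return pixel_rate, texel_rate

# TNT2: two pipelines with one texture unit each, ~125 MHz (~150 for the Ultra).
# Dual-texturing pairs the pipes, halving pixel output - texel rate stays put.
print(fill_rates(125, pipelines=2, tmus_per_pipe=1))   # (250, 250)

# Voodoo3 2000: a single pipeline with two TMUs at ~143 MHz, so the article's
# "286 Mpixel/sec" is really the texel rate.
print(fill_rates(143, pipelines=1, tmus_per_pipe=2))   # (143, 286)
```

Plugging in ~166 and ~183 MHz for the Voodoo3 3000 and 3500 reproduces the 333 and 366 figures the article quotes, which shows they were texel rates all along.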
You know, if someone is going to offer their "expert" opinion around here, they should at least have a firm grounding in what they're talking about.
I was a little dismayed at your viewpoint: you go back as far as the Voodoo Graphics (NOT the Voodoo 1, child), but you neglect to mention that before the release of the TNT, there was no defined winner in the 3D graphics market. 3dfx was too expensive, and everyone else was slower than the monstrous Voodoo2. The market was a warzone with heavy competition from:
ATI: Rage IIc, Rage Pro
Rendition: v1000, v2x00
PowerVR: PCX1, PCX2
Number 9: Revolution 3D
3dlabs: Permedia, Permedia 2
S3: Savage 3D
and of course...
Nvidia: NV1, Riva 128, Riva 128ZX
3dfx: Voodoo Graphics, Voodoo Rush, Voodoo 2
You might be too young to remember, but initially the TNT was as expensive as the GF2 was on release. It quickly fell in price, though, dragging Voodoo2 prices down with it. For the first time people could really afford a high-performance 3D card, and sales of competitors' cards dwindled.
This grand introduction of competition into the high-performance accelerator market had several immediate effects. Rendition dropped out of the graphics market even though their RRedline Multimedia Accelerator (RMA) was very promising, and Intel found that their once high-performing, cheap i740 could no longer compete overnight. The long-standing effects have been few, however; now that the graphics market has stabilized we are back to high prices for high performance, only now we have even fewer companies to choose from...go figure.
If you're going to take a look back at the beginnings of the PC graphics market, the least you could have done was be a bit more thorough and accurate. Your viewpoint is that Nvidia was nothing before the TNT, and that the early graphics world revolved entirely around Voodoo Graphics...how wrong you are.
What I don't get is why this article didn't say anything about S3?!?! Or anything on S3TC (unless it did, I don't remember).
Also (if I remember correctly) it didn't really go into any depth about what's actually coming (did it?). It was just an overview of what's been happening, and what features he thought were going to stick around. I guess to each their own.
people that drive slow are easy to pass, it's people who drive fast that provide a challenge.
Ohhh, almost forgot: the Matrox G100 and G200, although the G200 was really the only one cut out for gaming. The G100 was released a few months before the G200, and was deliberately cut down feature-wise in preparation for the G200's release.
A few other things I neglected to mention: I originally intended to put only real 3D accelerators on that list (hence no ViRGE or Mystique 220), so I should flag a few unmentionables:
Nvidia: NV1, used only in the Diamond Edge 3D; a terrible POS in 3D, but it included sound.
ATI: Rage IIc, precursor to the Rage Pro, missing a few essential features like bilinear filtering; lukewarm DirectX support.
NEC/PowerVR: PCX1, the stupidest little screwup. This was released EARLY to get a jump on the market, but the idiots announced that the PCX2, with support for bilinear filtering, would be out in a mere 3 months. Needless to say, it did not sell well.
I would like to take a moment to apologize for my spelling errors; I am a full-size keyboard veteran who must type on a notebook keyboard during the day.
I would also like to ask: WHY was this article written? The content HWC provides is already so terrible that we do not need another pointless 15-page ramble. I only come here for the forums and occasionally the hardware REVIEWS. So why bother writing something that is simply a waste of people's time, with less information than anyone could glean from any one of the hundreds of other "previews of what's next" on other sites?
One little side note about the Voodoo 1 chipset: the Voodoo 1 was SLI-capable. Only one manufacturer took advantage of it, Quantum3D. They had a board with 2 complete Voodoo chipsets in an SLI configuration. Of course, the board was maaaad expensive (~$600 back in the day), and I don't know a single person who was going to pay for that.
"Then the Z-Buffer is used to calculate which objects need to be textured and which don’t. This approach cuts back on wasted rendering and bandwidth. The feature is similar to the hidden surface removal method that the PowerVR used."
One and the same company. You might have mentioned the Neon 250 <sniff>
Also, I don't believe there is a z-buffer as such; it's all done on the embedded "tile", saving z-buffer bandwidth as well.
This was a problem with the original PCX1 and PCX2, but one of the touted benefits of the PVRSG (what eventually powered the Neon 250) was that it was supposed to supply a Z-buffer for all but PowerVR SGL-powered games. Did they ever live up to that promise? I know the Kyro does.
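For anyone lost in the Z-buffer talk above: a Z-buffer stores the depth of the nearest surface seen so far at each pixel, so fragments that lose the depth test can be thrown away before any texturing work is spent on them (PowerVR's tile approach keeps this test on-chip per tile instead of in card memory). A toy sketch, with all names and values invented for illustration:

```python
# Toy z-buffer: keep the nearest fragment per pixel and skip texturing
# for anything that fails the depth test (all names here are illustrative).
import math

W, H = 4, 4
z_buffer = [[math.inf] * W for _ in range(H)]   # start infinitely far away
color    = [[None] * W for _ in range(H)]

def draw_fragment(x, y, depth, shade):
    """Depth-test first; only shade/texture fragments that survive."""
    if depth < z_buffer[y][x]:      # smaller depth = closer to the camera
        z_buffer[y][x] = depth
        color[y][x] = shade         # the expensive texturing step goes here

draw_fragment(1, 1, depth=5.0, shade="far wall")
draw_fragment(1, 1, depth=2.0, shade="near box")   # closer: overwrites
draw_fragment(1, 1, depth=9.0, shade="hidden")     # rejected by the z-test
print(color[1][1])   # near box
```

A traditional card does this test against a full-screen buffer in video memory; a tiler like the Kyro resolves it inside a small on-chip tile, which is why the external Z-buffer (and its bandwidth cost) can be optional.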