September 4th, 1999, 09:41 PM
I am totally confused with 3d cards.
Can anyone with a lot of patience please explain what the terms for 3d cards mean?
like fill rate, bilinear, trilinear, voxel, texel, twin texel, anisotropic filtering, procedural textures, and so on...
if my Creative Riva TNT 16 AGP card has a 250 MHz DAC chip and is capable of
180 million pixels/sec peak fill rate
36 billion operations/sec pixel processing pipeline
that means I should see 180 million pixels on my screen per second.
So at 800x600, which is 480,000 pixels, running at 60 fps would only draw 28,800,000 pixels per second - far below the quoted 180 million.
So why the heck do I get < 10 fps in Unreal at 800x600 in OpenGL or D3D mode?
Can anyone please explain this to me?
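The arithmetic in the question above can be checked in a few lines. This is just a sanity check of the numbers quoted in the post (the 180 million pixels/sec figure is the quoted spec, not a measurement):

```python
# Back-of-the-envelope check of the fill-rate math in the post above.
# All numbers are the quoted TNT specs, not measured values.

peak_fill_rate = 180_000_000   # quoted peak: single-textured pixels/sec

width, height = 800, 600
pixels_per_frame = width * height   # 480,000 pixels
target_fps = 60

pixels_needed = pixels_per_frame * target_fps
print(f"Pixels needed at {target_fps} fps: {pixels_needed:,}")        # 28,800,000
print(f"Fraction of peak fill rate: {pixels_needed / peak_fill_rate:.0%}")  # 16%
```

So 60 fps at 800x600 would use only about a sixth of the quoted peak fill rate, which is exactly why the observed < 10 fps looks so puzzling at first glance.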
September 4th, 1999, 11:21 PM
Since your card is a TNT, its core is only clocked at, I think, 90 or 100 MHz. Also, Unreal is a very demanding game that relies heavily on CPU speed. For example, if you get 10 fps on a 200 MHz Pentium, going up to a 400 MHz+ CPU will get you more performance, say 30-50 fps.
I hope that helped you a little.
September 5th, 1999, 01:24 AM
But you also have to consider that Unreal is native Glide and just supports the other APIs, and since Glide only runs on Voodoo cards, that's where you're supposed to get the better performance.
September 5th, 1999, 08:49 PM
The problem is that I have a Pentium II 400 MHz (no Celeron or cut-down cache) and 64 MB of 10 ns SDRAM at 100 MHz, plus one of the best motherboards around: the Abit BX6, which the reviews rate among the best.
The chip is cooled by an OEM fan, so it's not overheating.
The hard disk is a Seagate 13 GB Ultra ATA/66, although it isn't running at that and is running as plain IDE. Nero 184.108.40.206 says its speed is about 5 MB/s.
I would have thought that on such a system it should run really well.
I should mention that in my gameplay everything was turned on: volumetric lighting, fog, reflections, etc...
Does anyone know if it runs with Unified's Glide2x drivers?
September 5th, 1999, 09:11 PM
Creative Labs released a small utility called Unified, which is basically a set of Glide2x drivers.
I tried to use it with Rollcage and it didn't work. The Tripex Winamp plugin didn't work either, since both need Glide3x drivers.
The beta seems to have come to a halt, since the next version was supposed to be released by June!!
I think the people who write drivers for Creative are doing a hell of a crap job.
My friend just paid £170 UK (~$270) for Creative's Riva TNT2 Ultra 32 AGP, and he gets black lines on his screen, crashes very often, and can't even get 15 fps in Unreal with the 2.25f patch.
What's going on?
September 6th, 1999, 01:45 AM
First of all, F117, what you are expecting is an ideal situation. Ideally, the video card can deliver 180 Mpixels/s. But you need to realize some things.
That's 180 million single-textured pixels. The TNT displays dual-textured pixels at 90 Mpixels/s. Most games apply multiple textures per pixel, which must be combined before the pixel is actually written to the screen. Unreal is no exception: it often takes two or more DUAL-textured passes just to display one pixel.
Next up: traditional rendering architectures have a little problem called overdraw. They draw EVERYTHING, including objects whose view is obstructed by other objects or effects closer to you. Depending on the game, this can make the video card waste time drawing 2-4 times the number of pixels actually needed for a single frame. The average number of pixels calculated per pixel that ends up visible is called the overdraw rating. For a game like Unreal, with tons of effects, the overdraw rating is about 3.
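Putting the multi-texturing cost and the overdraw rating together gives a rough ceiling estimate. This is only an illustration built from the factors quoted in this post (half-rate dual texturing, two passes, overdraw of 3), not a benchmark:

```python
# Rough estimate of the TNT's effective frame-rate ceiling in Unreal,
# combining multi-texturing cost and overdraw as described above.
# The factors are the ones quoted in the post, not measured values.

peak_single_textured = 180_000_000  # pixels/sec, single-textured (quoted spec)
dual_texture_factor = 2             # dual-textured pixels render at half rate
passes_per_pixel = 2                # Unreal often needs 2 dual-textured passes
overdraw = 3                        # approximate overdraw rating for Unreal

effective_rate = peak_single_textured / (dual_texture_factor * passes_per_pixel * overdraw)

pixels_per_frame = 800 * 600
max_fps = effective_rate / pixels_per_frame
print(f"Effective fill rate: {effective_rate:,.0f} visible pixels/sec")  # 15,000,000
print(f"Theoretical fps ceiling at 800x600: {max_fps:.1f}")              # 31.2
```

Driver and CPU overhead then pull the real number well below this theoretical ceiling, which is consistent with the < 10 fps reported at the top of the thread.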
Next: there is a hardware limitation on NVIDIA's Riva TNT/TNT2. The way it manages the textures you see on screen conflicts glaringly with the way Unreal wants to manage them. The result is much lower performance, and since it's a hardware problem, there is no fix.
Last but not least, you have to consider the actual software. The OpenGL drivers may be good, but nothing is perfect; they add a little to the overhead. Also, because Epic wrote Unreal as Glide-only, and because their programmers suck, the OpenGL version is terrible. The Glide version is much more optimized.
...And that's why Unreal sucks on your system but plays smoothly on any system with a Voodoo2/3.
September 6th, 1999, 10:27 AM
Most of the statistics on a graphics card are lab-simulated. The 180 Mpixels/s represents how fast the card could just fill pixels: the same color, filling the screen over and over, and nothing else. No real workload would ever do this.
The 36 billion operations/sec figure would be just as theoretical. The card can draw, say, an image constructed of bare triangles, rendering a 20-sided object over and over with no colors, textures, or lighting... Nothing does that either.
Now combine them: draw millions of triangles and apply colors, textures, and lighting... and hope your CPU can feed the information to the graphics card fast enough.
The RAMDAC determines what resolutions and color depths you can display in 2D mode, and at what refresh rate. Pretty straightforward.
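The refresh-rate side of that relationship can be sketched numerically. The ~32% blanking overhead below is a common rule of thumb, not a figure from this thread, so treat the results as approximations:

```python
# Approximate maximum refresh rate a RAMDAC can drive at a given resolution.
# Rule of thumb (assumption): ~32% of the pixel clock is spent on
# horizontal/vertical blanking rather than visible pixels.

def max_refresh_hz(ramdac_mhz: float, width: int, height: int,
                   blanking_overhead: float = 1.32) -> float:
    """Upper-bound refresh rate (Hz) for a given RAMDAC pixel clock."""
    return ramdac_mhz * 1_000_000 / (width * height * blanking_overhead)

# The TNT in the first post quotes a 250 MHz DAC:
print(f"{max_refresh_hz(250, 1024, 768):.0f} Hz at 1024x768")    # ~241 Hz
print(f"{max_refresh_hz(250, 1600, 1200):.0f} Hz at 1600x1200")  # ~99 Hz
```

So a fast RAMDAC matters mainly at high resolutions, where the pixel clock, not the 3D pipeline, becomes the limit on refresh rate.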
Most of the other features deal with image quality or speed and are of no use unless the software you are running supports them.
Overdraw is another issue. Some near-future cards are trying to prevent overdraw, but the performance hit might be more than the performance gain. With the geometry engine the NV10 has, this might look more attractive, since its fill rate did not jump very much. Drivers and support by games will make a huge difference with this card.