Creator of RivaTuner determines both ATI and Nvidia are cheating.
Unwinder, the maker of RivaTuner, has determined the following:
I really reverse-engineered both Detonator and Catalyst and found application detection mechanisms in each of the drivers. I created scripts to prevent the drivers from detecting D3D applications (the scripts block pixel/vertex shader and command line/window text checksum calculation in the Detonator, and texture pattern/pixel shader code detection in the Catalyst).
Blocking the application detection code caused a dramatic performance drop in 3DMark2001/Nature on both NV (74->42 FPS) and ATI (66->42 FPS) boards. IQ in this test also changed on both systems. Do I have to continue?
NVAntiDetector also caused significant performance drops in other D3D benchmarks (e.g. UT2003); the 3DMark2003 score on NV35 dropped even more than with the 330 patch (this info comes from my tester, and I cannot confirm it because I don't have an NV35 sample).
A review containing details and benchmarks is being prepared for publication on Digit-Life now.
You need details and facts? No problem. You're free to follow our discussion on iXBT and comment on the texture pattern detection in the D3DDP2OP_TEXBLT token handler, and on the pixel shader per-byte comparisons in the D3DDP2OP_CREATEPIXELSHADER token handler in the latest Catalyst. Prove to me that it's not application detection; prove to me that blocking this code doesn't cause changes in performance and IQ.
You're the real specialist, and you perfectly understand what the D3DDP2OP_XXX tokens are, so it's not a problem for you at all, right?
My trust in NVIDIA and ATI PR is almost zero now. Both of them seem to use the same benchmark 'optimization' techniques, but NVIDIA promotes them as 'application specific optimizations' while ATI simply tried to appear innocent; both have been fooling us for a long time. 3DMark2001/Nature was the de facto standard for estimating PS performance, but both IHVs show distorted benchmark results by altering the rendering code. And it's very sad.
ATi and nVidia are just trying to give us what we apparently want: better scores on benchmarks that don't matter. Before things like 3DMark, BAPCo, Sandra, etc., we used to buy video cards based on user feedback, visual quality, and gaming results. It's kinda like Sammy Sosa's corked bat. He claims it was for a better show in batting practice. What a great response! He was just trying to give us what we want, a great show! Things can be spun in any direction, and money has a way of making them turn the right way. ATi, nVidia, Apple, AMD, Intel, Microsoft--they all do it. How else do you get people to replace something that isn't even obsolete yet?
Give what you cannot keep to gain what you cannot lose.
I don't mind optimizations for games as long as they are not designed for a pre-recorded path used for benchmarking and they do not decrease image quality. Drivers can take commands and break them down at the driver level so the card can process them faster; that's one example of a good optimization. Bad ones clip the image based on a pre-recorded path or decrease the Z or color depth of images or textures. Changing things like LOD is also bad.
The 330 patch got rid of most of the "bad" optimizations in 3DMark2003.
AMD Phenom II x4 945 3Ghz | ASUS M4A77TD | 2X WD 1TB SATA 2 hard drive | 2x2GB Corsair XMS3 | nVidia GeForce 8800 GTS | ATI TV Wonder Theater Pro 550 | Antec P-160 case | Antec 650w Earth Watts | LG Blu-ray Super Drive | LG DVD RW | Windows 7 Pro
Now we have to wait and see if the people on the top of the Online Result Browser at Futuremark are honest and can stand to lose a few points. Breaking 10k in 3Dmark03 won't be happening for a long while now.
Mighty damn quiet in this thread. When there is proof of ATI cheating, only 2 or 3 people respond. But search over the last several months at the Nvidia cheating threads; hell, some have 4 or 5 pages, not just 2 or 3 people. Like I said before, mighty damn quiet when it applies to ATI. Where did all of the experts with so much to share go?
That is funny!
- Antec Nine Hundred
- Q6600 G0 Stepping @ 3.0GHz
- Gigabyte GA-P35-DQ6
- Corsair XMS2 (4x1GB) PC2-8500
- eVGA SuperClocked GeForce GTX 560 Ti
- PC P&C Silencer 750 Quad
- Enhance Technology Quadrapack Q14SS
- HP Smart Array P400 SAS/SATA Controller w/ 512MB BBWC
- 4 x 72g 15,000rpm 2.5inch SAS Drives @ RAID5
- 2 x 750g 7,200rpm 3.5inch SATA Drives @ RAID1
- 2 x LG DVD±R Burners
- Hanns·G HH-281HPB 28" Display
- Windows 7 Ultimate 64-bit w/ SP1
Well, we endured a lot a few years back with the original Radeon and the Quake3 benchmark. Even Nvidia was part of the game, and was really offended that ATI was cheating. Seems like what goes for one doesn't always go for the other.
So there is no benchmark left to be trusted. The medium is the message: vendors with the resources can afford to assign employees to manipulating whatever benchmark program is currently in use, because the income their products generate makes it worthwhile. This can change for the better.
Originally posted by wrathchild_67: Now we have to wait and see if the people on the top of the Online Result Browser at Futuremark are honest and can stand to lose a few points.
Unwinder posted this at B3D:
Finally, I'd like to make some comments about the current test results. NVAntiDetector hurts NV performance much more than ATIAntiDetector hurts ATI results. Currently, ATIAntiDetector affects performance in both 3DMark2001 and 2003 (the performance drop in 2003 is similar to the result of installing the 330 patch).
NVAntiDetector caused performance drops in a lot of 3D applications, including UT2003, CodeCreatures, AquaMark, etc. The performance drop in 3DMark2003 is not comparable to the 330 results; the results are way *lower*, so it seems like FM missed some detections:
I hope the Digit-Life article comes soon. I'm pretty curious to see some more game benchmarks with the nV/ATi optimizations turned off. There are a few examples of reviews where custom timedemos, new games, etc. are now being used, and the results are quite a bit different from those when the 5900 Ultra was launched: