Optimizations: a word that causes havoc amongst the graphics card enthusiast community. Ever since Nvidia were found to be optimizing for 3DMark03, people have really started to care about what their graphics card and drivers are doing behind the scenes.

Numerous image quality and performance comparisons have been published on the internet. I think it's fair to say that because of the strength of end-user feeling on this matter, companies had to become more open about what exactly was being optimized. Around the time the Geforce 6 series was launched, Nvidia began giving users the option to disable texture/detail optimizations in their drivers, which was firstly a surprise and secondly a wise move.
It meant that whatever the user's needs, they were catered for. ATI's stance was that the optimizations in their drivers always produced the best image quality possible and therefore did not need to be disabled. My own feeling was that this was a bad move. I'm all for having the optimizations available in the driver, even for having them turned on by default; however, not giving the end user the option to disable them wasn't ideal. As a reviewer this also causes issues and ties your hands a little. For example, on the 6800 Ultra and X800 XT PE you can run just about anything at maximum settings with optimizations off and still get great performance. We proved that in our Gainward CoolFX review: we disabled the optimizations and the card still performed well compared to an X800 XT PE with optimizations on. However, this wasn't an apples-to-apples comparison (which wasn't really the point, and many people missed that). At the time I was writing that article I was also pretty sure it would only be a matter of time until ATI offered the ability to disable the optimizations in their driver; after all, comments in the online press and on forums were still raising this as a concern, and ATI are always willing to listen to and act on customer feedback.

So that brings us to today: we now have a beta build of an upcoming ATI driver which features the Catalyst AI functionality (Catalyst Control Centre only). The basic idea behind Catalyst AI is that it gives the end user the option to completely disable the optimizations in ATI's driver should they feel it's necessary.

For those of you interested in specifics, Catalyst AI uses ATI's Texture Analyzer technology (R9600 series and R4xx series) to optimize performance in any game/3D application. What about support for older boards? ATI have responded with the following: "We fully support CATALYST A.I. on all R3XX hardware. In fact the R3XX series of products will see an even larger performance boost in Doom 3 than the RADEON X series of products."
ATI believe that in doing this they maintain correct image quality and in some cases can even improve IQ. Cat AI works by analyzing individual textures as they are loaded in order to choose the best and fastest way for them to be displayed. The settings available to the user are Off, Standard and Advanced; by default the driver is set to Standard. In most cases Standard should be sufficient, as it uses less CPU overhead than Advanced, and it's up to the end user to decide which option works best on their system in each particular game or application. There should be no IQ difference between the two settings; however, in games with frequent texture loads, or on a slower system, the extra computations may cancel out the performance increases gained by using the Advanced algorithm.
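As a purely illustrative sketch of the idea described above (Catalyst AI's actual analysis is proprietary, so the detail metric, threshold and path names here are our own assumptions, not ATI's algorithm), the per-texture decision can be pictured as a simple dispatch at texture-load time:

```python
# Illustrative only: Catalyst AI's real per-texture analysis is
# proprietary. The detail metric, threshold and path names below
# are assumptions made purely to picture the idea.
def choose_filter_path(detail_score, threshold=0.25):
    """Pick a filtering path for a texture as it is loaded.

    A high detail score suggests that reduced filtering would be
    visible, so the full-quality path is kept; otherwise a cheaper
    path is assumed safe for image quality.
    """
    if detail_score > threshold:
        return "full_trilinear"
    return "optimized_trilinear"

print(choose_filter_path(0.4))   # full_trilinear
print(choose_filter_path(0.1))   # optimized_trilinear
```

The key point is that the choice is made per texture rather than globally, which is how ATI can claim performance gains without a blanket drop in filtering quality.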

As well as the texture optimization algorithm, there is a second aspect to Catalyst AI: application-specific optimizations and tweaks. Examples of these application-specific items are forcing anti-aliasing off in the driver for Splinter Cell or Prince Of Persia, because AA doesn't work in those titles. ATI have informed us that they will never specifically detect a synthetic benchmark with Catalyst AI optimizations; however, some benchmarks may see improved scores because they use game engines that have improvements within the driver. ATI have also guaranteed that they will only optimize where they can do so without any reduction in image quality.

Just to be clear, disabling Catalyst AI disables application-specific optimizations, bug fixes and generic optimizations alike.

So there you have it: quite an interesting feature. In the near future we will be re-publishing our Gainward CoolFX article showing how the card compares to the X800 XT PE when both have optimizations disabled. For now we have a selection of tests and IQ shots showing how a reference 6800 Ultra compares to the reference X800 XT PE with optimizations enabled and disabled.

Test system:

Reference Design Nvidia Geforce 6800 Ultra
Reference Design ATI Radeon X800 XT PE

AMD Athlon 64 FX-53 (Socket 939)
2GB PC3200 DDR
80GB Samsung 7,200rpm SATA drive with 8MB cache
Ultra X-Connect 500W PSU

Windows XP SP2
DirectX 9.0c
.net framework 1.1
Catalyst 4.1x beta
Forceware 66.31 WHQL
Fraps 2.3.0
Counter-Strike Source
Colin McRae Rally 2005
Thief 3

As with all of our reviews, a clean install was performed for each card's test configuration. All tests were run three times and the middle result of the three is detailed below. All in-game options were set to their maximum settings and the latest game patches were applied.
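Taking the middle result of three runs is simply the median, which discards a single unusually fast or slow run; a minimal sketch of the selection:

```python
def middle_of_three(runs):
    """Return the middle (median) of three benchmark results,
    discarding the single highest and single lowest run."""
    assert len(runs) == 3
    return sorted(runs)[1]

# Example: three FPS results from repeated runs of the same test
print(middle_of_three([61.2, 58.9, 60.4]))  # 60.4
```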


Of all our tests, Doom 3 shows the largest change in performance for ATI's drivers; this is mainly due to the additional game-specific optimizations within the driver. These include tweaks such as replacing the lighting shader, which is based on a look-up table, with a mathematically precise lighting shader that not only significantly improves performance but also renders a more mathematically correct scene. As far as performance goes there is really nothing out of the ordinary here: the 6800 Ultra continues to dominate in Doom 3.

The latest build of the Source engine (not Counter-Strike: Source) shows a change of just over 7% on the Geforce. The Radeon, despite having application-specific optimizations as well as texture analysis optimizations, shows a very small drop, and even with optimizations disabled it continues to show great performance, outscoring the optimized 6800 Ultra by 31% and the non-optimized 6800 Ultra by 36.25%.
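For reference, the percentage leads quoted throughout are computed relative to the Geforce score. The FPS figures below are hypothetical stand-ins (the real numbers are in our charts), chosen only to show the arithmetic:

```python
# Hypothetical FPS figures chosen only to illustrate the arithmetic;
# the actual results are in our benchmark charts.
radeon_no_opts = 109.0      # assumed X800 XT PE, optimizations off
geforce_opts = 83.2         # assumed 6800 Ultra, optimizations on
geforce_no_opts = 80.0      # assumed 6800 Ultra, optimizations off

def pct_lead(a, b):
    """Percentage by which score a outscores score b."""
    return (a - b) / b * 100.0

print(f"{pct_lead(radeon_no_opts, geforce_opts):.2f}%")      # 31.01%
print(f"{pct_lead(radeon_no_opts, geforce_no_opts):.2f}%")   # 36.25%
```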

Thief 3 again shows a larger drop on the Geforce when optimizations are disabled, nearly twice that of the X800 XT PE. On the performance front the X800 XT PE takes the crown by a hefty margin, and really it doesn't matter performance-wise whether you have the optimizations on or off: you're still going to find the game perfectly playable on ATI's card. Non-optimized vs non-optimized shows a difference of 28% between the two cards.

We thought we'd throw an upcoming game engine into the mix as well. Colin McRae Rally 2004 has been used by us for a while now in articles, and the 2005 version is shaping up well. In the build that we have there is quite a performance difference between the two cards; however, this may get closer by the time the game is released (we'd still expect ATI to have a reasonable lead, though). On the optimizations front, the difference between optimizations on (Standard) and off on the Radeon is very small: only a 1.4% change between the two settings. When the optimizations are disabled on the Geforce, the change is over 10%. Comparing the two un-optimized scores shows a difference of 41% between the two cards, in favour of ATI. It's also worth noting that when the optimizations were disabled on the 6800 Ultra, the awful stuttering/throttling bug that affects some 6800 series cards appeared. This was the only test which exhibited stuttering for us (though not the only one we've ever seen), and only when optimizations were disabled. It did make us wonder whether the stutter/throttle bug that 6800 users are reporting is actually related to Nvidia's optimizations not functioning correctly in some games. This is of course just a theory, but something that others may want to investigate.

UT2004 (above): a 10.5% difference in figures for ATI.
In this test we look at the third and final game engine that ATI optimize for (Doom 3, Source and UT are the application-specific optimized engines). Again the improvements gained from using Catalyst AI are visible, with performance increases across the board.

UT2003 ATI optimizations off

UT2003 ATI optimizations on

As you can see from the synthetic UT2003 benchmark, the gains possible with Catalyst AI and the UT engine are quite large; however, when testing in actual gameplay these aren't quite so noticeable, due to the use of bots in our test and the CPU workload they cause.


Image Quality
For those of you interested in the IQ of the two cards at the two settings, we have a selection of screenshots. Please note that they are high-resolution BMPs of several megabytes each, so those on dial-up might want to give them a miss.

Doom 3 ATI optimizations off

Doom 3 ATI optimizations on

Doom 3 Nvidia optimizations off

Doom 3 Nvidia optimizations on


CS Source ATI optimizations off

CS Source ATI optimizations on

CS Source Nvidia optimizations off

CS Source Nvidia optimizations on


Thief 3 ATI optimizations off

Thief 3 ATI optimizations on

Thief 3 Nvidia optimizations off

Thief 3 Nvidia optimizations on

All of the above screenshots were taken at 1600x1200 with 4xAA, 16xAF and maximum in-game quality. The optimized shots were taken using the driver default values; the non-optimized shots were taken with all optimization options disabled in both drivers. (NOTE: application-specific optimizations remain enabled in the Nvidia drivers.)

I'll leave you to draw your own conclusions on IQ (and discuss them in our forum), because really it's a personal preference thing. However, it's worth commenting on a couple of aspects of the Source screenshots. Firstly, the 6800 Ultra screenshots have a really bad distortion at the base of each shot. This appears to be a driver bug in 66.31 (and a rather strange one for a WHQL driver) and does not appear while the engine is actually running. When comparing screenshots you should try to ignore this, as well as the water at the base of the tunnel, as that is different in every shot. Pay close attention to the walls, items and floors for comparisons.

The first thing that strikes me about the results we gathered is how small the performance increases achieved by ATI's optimizations are. On one hand, the gain seems so small that it's not really worth talking about; so small, in fact, that it could just be down to the natural margin of error in some cases. On the other hand, it means that ATI's optimized setting is in fact much closer to Nvidia's non-optimized results than everyone thought up to this point, as far as comparison articles go. With a margin so small, it almost made us consider leaving our review comparing Nvidia without optimizations to ATI with optimizations as it stands, and not updating it. Overall, though, it's great that ATI now offer the user control over optimizations, even if it's not worth their while disabling them. Looking at the IQ shots, this certainly seems to be the case: there is no difference in IQ visible to the end user between Off and Standard, yet you get some excellent tweaks, like the improved calculations (and therefore performance increases) in Doom 3.

It does seem to us that Nvidia's optimizations are aimed more at gaining raw performance across the board, whereas ATI's are aimed at improving all aspects of the gaming experience. (We'd welcome detailed information on application-specific optimizations from Nvidia, though, if it shows they are improving things in the same way.) It's also worth mentioning that ATI give you the option to disable all optimizations, whereas when we disabled the Nvidia optimizations it was the texture/detail optimizations only. Application-specific optimizations cannot be disabled in the Nvidia drivers and are therefore inflating the Nvidia results. It would be nice to see Nvidia take steps to allow all optimizations to be turned off in future drivers. As things stand we still can't get an apples-to-apples comparison between competing cards; however, we are much closer than before, and a true like-for-like comparison will remain out of reach as long as Nvidia don't allow the option to disable all optimizations.

ATI have converted me. Up until this driver I wanted this option: the ability to run without optimizations, to run what I felt was the better, more correct and nicer-looking non-optimized version. Now I can see that doing so would be detrimental to my gaming experience, and so I'll be sticking with Catalyst AI "on" in future... still nice to have the option, though.

