# AMD Catalyst 10.12 Performance Analysis



## W1zzard (Dec 18, 2010)

In this article we will investigate how much the HD 4870, HD 5870, GTX 285 and GTX 480 have gained over the lifetime of their driver releases. We also put this in contrast to what the latest AMD Catalyst 10.12 driver update can deliver.



----------



## bear jesus (Dec 21, 2010)

Thank you so much for this. Hopefully it will cause a few fewer complaints about what driver version is used in reviews. I know it won't, but at least this is the perfect thing to link to when people do start complaining.


----------



## dj-electric (Dec 21, 2010)

Wow w1zz this is brilliant!


----------



## HXL492 (Dec 21, 2010)

Found this article absolutely excellent and helpful!


----------



## Benetanegia (Dec 21, 2010)

Excellent review. It finally puts an end to the driver improvements myth.


----------



## Sasqui (Dec 21, 2010)

Really interesting that the 5870 saw a 3% gain from 10.11 to 10.12, but the 6870 didn't see squat.

Hope we see more of these comparisons in the future!  Must have been a lot of time spent to do all the tests and compile results.  Thanks W1zzard!


----------



## 20mmrain (Dec 21, 2010)

I am also glad someone did this, so possibly AMD will get off their high horse and actually see that there are really no benefits coming from certain drivers.

That might be a way to take this review that was never intended, but to me it is a perk.

Besides that thought pattern, I like this review for the fact that it gives people info on how good a driver is before they upgrade. If these types of reviews were put into effect with every CCC release... it would take a lot of the guesswork out of the whole situation.

Thanks W1zz


----------



## bear jesus (Dec 21, 2010)

20mmrain said:


> AMD will get off their high horse and actually see that there are really no benefits coming from certain drivers.



The only problem I have with that comment is that the improvements are normally listed as game-specific, and many of those games are not benchmarked here/on most sites, or even played much anymore in some cases.

What I take from this article is that there are improvements between driver versions, but on average the improvements are small enough, and game-specific enough, that it's not worth anyone complaining about what driver version is used in a review.


----------



## human_error (Dec 21, 2010)

Thank you for doing this review w1zz. It's very interesting to see how small the differences are with new drivers over a long period. 

I would love to see this become an annual feature where we can see the changes in performance of the top cards for the past couple of generations from each vendor with the different drivers, to help prove or disprove the myths about magical performance boosts people claim cards will get with new drivers.

I wouldn't mind seeing some SLI/CFX comparisons as well, so we can see whether scaling is getting better over time (maybe it will help encourage AMD and NVIDIA to actually make some improvements).


----------



## bear jesus (Dec 21, 2010)

human_error said:


> I would love to see this become an annual feature where we can see the changes in performance of the top cards for the past couple of generations from each vendor with the different drivers, to help prove or disprove the myths about magical performance boosts people claim cards will get with new drivers.



It would be nice to see how performance differed between the drivers used for the release review and the drivers available when the next generation launched, as I think that would give a great picture of the real improvement over the time a high-end card actually is the current high end.


----------



## yogurt_21 (Dec 21, 2010)

Benetanegia said:


> Excellent review. It finally puts an end to the driver improvements myth.



Yeah, 5-6% from launch drivers isn't really that much, considering how many drivers came in between and the length of time.

It's certainly not enough to change card performance placements.

Though I will say, if you've got a GTX 480 with 197.17 and you're playing Bad Company 2... you need to update your drivers lol.

None of the others were that dramatic.


----------



## mrg666 (Dec 21, 2010)

Thank you! I always check Techpowerup for the GPU benchmarks and this one is a classic.


----------



## lucassp (Dec 21, 2010)

XVID = DivX (old version) = MPEG-4 SP/ASP, so both can be decoded by newer ATi and nVidia chips.
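A small sketch of that equivalence (the fourcc set below is an illustrative subset I'm assuming, not an exhaustive list): several fourccs all label MPEG-4 Part 2 SP/ASP bitstreams, which is why one hardware decoder path can handle both XviD and old DivX.

```python
# Common fourccs that all identify MPEG-4 Part 2 SP/ASP bitstreams.
# Illustrative subset -- not an exhaustive list.
MPEG4_ASP_FOURCCS = {"XVID", "DIVX", "DX50", "FMP4"}

def same_asp_family(fourcc_a: str, fourcc_b: str) -> bool:
    """True if both fourccs belong to the MPEG-4 SP/ASP family."""
    return {fourcc_a.upper(), fourcc_b.upper()} <= MPEG4_ASP_FOURCCS

print(same_asp_family("xvid", "DX50"))   # XviD vs DivX 5 -> True
print(same_asp_family("xvid", "H264"))   # different codec family -> False
```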


----------



## Black Panther (Dec 21, 2010)

Great review! I was a tad disappointed not seeing the 5970 though...


----------



## Wyverex (Dec 21, 2010)

This is one of the most awesome _reviews_ I ever saw. Thank you, W1zz!
Very useful


----------



## 63jax (Dec 21, 2010)

very good article. thank you!


----------



## Super XP (Dec 21, 2010)

Very interesting indeed. I didn't think drivers made that much of a difference. It does seem as though CCC 9.x had better performance than CCC 9.12 for the HD 4870s. CCC 10.12a seems to have picked up some performance for the HD 6970 card(s) and brings it on par with the GTX 580 in some games.
Like always, great review once again, W1zz.


----------



## dir_d (Dec 21, 2010)

Thank You w1zz this review is great.


----------



## Dice (Dec 21, 2010)

Very interesting!
A bit frustrating for me, though, as all the recent drivers for my 5870 that actually increase performance (10.10, 10.11 & 10.12) cause terrible stuttering when watching TV in WMC (live or recorded). Anyone else had this issue?


----------



## T3RM1N4L D0GM4 (Dec 21, 2010)

W1z is magic and this review is what we all ever wanted!

No more myths plz...


----------



## razaron (Dec 21, 2010)

Interesting review. It amuses me how the results for less popular games are so erratic.


----------



## avatar_raq (Dec 21, 2010)

A much needed review. Thanks W1zz! Hope you can add CF and SLI scaling in the future as well as eyefinity/surround resolutions.


----------



## HellasVagabond (Dec 21, 2010)

Benetanegia said:


> Excellent review. It finally puts an end to the driver improvements myth.



How do you figure? ATI still has their Low Quality setting as Quality and their Quality as High Quality... and I really doubt they are changing this any time soon.


----------



## Mussels (Dec 21, 2010)

HellasVagabond said:


> How do you figure ? ATI still has their Low Quality setting as Quality and their Quality as High Quality....And i really doubt they are changing this any time soon.



And on the latest NVIDIA drivers, their HQ setting doesn't even work. What's your point?


----------



## HellasVagabond (Dec 21, 2010)

Looking at your specs, Mussels, I really don't know how you can tell what the new NVIDIA drivers do or don't do, but I am not going to start a stupid flame. The proof is out there for anyone who wants to know the truth behind the performance numbers ATI has been getting lately; that's all that matters to me.


----------



## Zubasa (Dec 21, 2010)

HellasVagabond said:


> The proof is out there for anyone who wants to know the truth behind the performance numbers ATI has been getting lately, that's all that matters to me.


If you make that point, you'd better back it up with a link to such proof.
Telling people to look for evidence to prove your point is not convincing at all.


----------



## HellasVagabond (Dec 21, 2010)

I'm not into promoting other sites. Many people in here (who usually keep quiet for reasons unknown) know what I am talking about, and Google does too, for those who want to search for "ATI Image Quality Optimizations".

Just thought I should say a couple of words about the "performance" numbers from ATI lately, that's all; no need to start a flame.

PS: They make great cards, but I really hate it when either company does something below the radar to gain performance, beyond the usual game optimizations which both companies do.


----------



## damric (Dec 21, 2010)

Great work, as usual.

By the way, I'm enjoying that 3DMark 11 Advanced that I won in your contest.


----------



## DannibusX (Dec 21, 2010)

Black Panther said:


> Great review! I was a tad disappointed not seeing the 5970 though...



I was too, until I realized it wouldn't exactly be fair comparing a dual-GPU card to single ones. Maybe when NVIDIA puts one out we'll see some 5970 comparisons.


----------



## 20mmrain (Dec 21, 2010)

bear jesus said:


> The only problem i have with that comment is that all the improvements are normally listed as game specific and many improvements are not games that are bench marked here/most sites or even played as much anymore in some cases.
> 
> From this article what i take is that there are improvements between driver versions but on average the improvements are low enough and also game specific enough for it not to be worth anyone complaining about what driver version is used in a review.



You are right... I guess don't mind my comment so much; I have just been really frustrated with ATI's drivers and malfunctions as of late.

And my comment was more of a compliment to W1zz and a suggestion at the same time.

Again, just voicing frustration!

But still, you are right: it does show that driver improvements are more game-specific, and complaining about which driver is used is not worth the complaint.


----------



## bear jesus (Dec 21, 2010)

20mmrain said:


> You are right.... I guess don't mind my comment so much I have been just really frustrated with ATI's drivers and malfunctions as of late.
> 
> And my comment was just more of a compliment to Wizz and a suggestion at the same time.
> 
> ...



Yeah, I can understand that, although I have been lucky and had no ATI/AMD-specific issues.


----------



## trickson (Dec 21, 2010)

Another GREAT review, W1zzard! I only wish you had done it with the HD 5770 as well. This must not be a very good card, I guess.


----------



## Magikherbs (Dec 21, 2010)

Thank you W1zzard 

.. I needed that !


----------



## Lionheart (Dec 21, 2010)

Good review


----------



## Steevo (Dec 21, 2010)

In comprehensive testing, ATI image quality was found to be superior to NVIDIA's in 3D on current-generation cards; specifically, LOD on ATI had to be DECREASED to 70% to match NVIDIA.


----------



## Bobington (Dec 21, 2010)

Shouldn't you, as due diligence, point out the bugs that were reintroduced in 10.12? That's make or break for some system configurations.


----------



## Super XP (Dec 21, 2010)

Truthfully, both ATI and NVIDIA have strengths and weaknesses in regards to image quality. Looking at the past several years, I personally have to admit ATI won more video card IQ tests than NVIDIA, but by a small margin. What I've learned is that NVIDIA tends to produce better contrast, which in turn results in a cleaner gaming experience; contrast is, or was, ATI's weak point, and images were slightly darker in game. Though overall ATI did lean toward better IQ in other video quality aspects.
Today it's rare to see any difference between the two. Some games look better on ATI's Radeons and others look good on NVIDIA's GeForces. It all depends on the game.

What I say is HAPPY GAMING!!!!


----------



## HellasVagabond (Dec 21, 2010)

Actually, I tend to disagree on the game part. True, in video quality some tests put ATI in front by a small margin; others, however, do not.

In games, however, the truth is far away, and the comment made by Steevo above is science fiction to say the least. To my knowledge the new GTX 4/5 series outperform everything ATI has to date in terms of IQ in games, but of course if Steevo or anyone else has a specific test with the latest drivers that shows ATI is ahead in D3D/OpenGL IQ, I would love to see it.


----------



## crazyeyesreaper (Dec 21, 2010)

Unless of course you go into CCC and manually set the image quality sliders, omg so hard. That said, having used an 8800 GTS 640, 9800 GT, GTX 470, ATI 4870 X2, 5850 and 6970s, image quality in games is at the point where you will not notice it, so my only reaction is that it's kind of childish to go "OMG this vendor's quality is suxxors because their drivers drop quality". It's a game both camps have been playing since the dawn of their creation; let it go, buy what works for you, and just let it go. I'd have bought 2x GTX 570s if NVIDIA had decent mobos on the AMD side, but 2 choices isn't really good enough, so I went ATI again.

To W1zzard: thanks for the review.


----------



## pantherx12 (Dec 21, 2010)

HellasVagabond said:


> How do you figure ? ATI still has their Low Quality setting as Quality and their Quality as High Quality....And i really doubt they are changing this any time soon.



Who in the damn hell actually uses the quality slider anyway?

Having used both ATI and NV cards, I have to say that the colours on ATI cards are better, and for me that makes a huge difference in image quality; plus it always seems a bit sharper with ATI installed.






+1 for running this type of review each time a new gen comes out, if it's not too much hassle.


----------



## Benetanegia (Dec 21, 2010)

It's not about whether it looks better or not, I think. The trilinear and anisotropic "optimizations" in the latest ATI drivers are real and are very well documented by some sites. The default image quality has been degraded to improve performance, and that's a fact, even if it's not noticeable in games at all, or only on some occasions and by some people. The improvement for the HD 5xxx cards that the latest drivers (since the HD 68xx release) bring is mostly due to this optimization alone.

Now, no one is discussing whether these optimizations are legit or not, whether they should be made or not, or whether they degrade picture quality in games or not. As long as most people find them useful or don't care at all, it's all well. But just calling things what they are: the performance improvements in the latest drivers come from lowering IQ.

It's like Crysis 2: it will most probably run better, and most people will think it looks the same or better, but the fact is that technically it will most probably be inferior to Crysis 1. Most of what will make Crysis 2 run better on our machines will be lowered IQ, not real optimizations.

* Can someone honestly believe that after 14 months on the shelves, it's now, with Cat 10.11 and 10.12, that the HD 5870 gets improvements? All while the newly released cards don't improve?


----------



## Steevo (Dec 21, 2010)

HellasVagabond said:


> Actually i tend to disagree on the game part. True in video quality some tests put ATI infront by a small margin, others however do not.
> 
> However in games the truth is just far away and the comment made by Steevo above is just science fiction to say the least. To my knowledge the new GTX4/5 series outperform everything ATI has to date in terms of IQ in games but of course if Steevo or anyone else has any specific test to show us with latest drivers that show that ATI is ahead in D3D/OpenGL IQ i would love to see it.



http://www.rage3d.com/articles/catalyst_2011_image_quality_investigation/index.php?p=4


Science fiction, done in the form of actual research and testing.


----------



## Wrigleyvillain (Dec 21, 2010)

super xp said:


> it all depends on the game.



+1


----------



## HellasVagabond (Dec 21, 2010)

Benetanegia, whatever you say won't change a thing for people who don't like to listen, because they think that what they own is always the best, and that's a fact. Now, there are always other reasons as well, but that's another story.

Steevo, did you even bother to read the entire review? Obviously not, since in the end they admit NVIDIA has better overall IQ when applying the right settings in their drivers. However, for some "weird" reason we don't see them mentioning the IQ cost / performance issue......


----------



## HTC (Dec 21, 2010)

pantherx12 said:


> +1 for running this type of review each time a new gen comes out, if it's not too much hassle.



+1 to this, *but only after at least 2 driver versions for the new cards are out*


----------



## Steevo (Dec 22, 2010)

HellasVagabond said:


> Benetanegia whatever you say won't change a thing to people that don't like to listen because they think that what they own is always the best and that's a fact. Now there are always other reasons as well but that's another story.
> 
> Steevo did you even bother to read the entire review ? Obviously not since in the end they admit NVIDIA has a better overal IQ when applying the right settings with their drivers. However for some " weird " reason we don't see them mentioning the IQ cost / Performance issue......



"Image Quality Conclusion

Both companies offer good default image quality, which we believe is directly comparable. Performance and Quality testing and comparisons should be performed at like to like settings. Currently that default is 'Quality' for both companies, which is probably not the optimal solution for high performance enthusiast products. Perhaps there ought to be different default levels for driver settings, depending on the graphics product. While that sounds like a problematic scheme to propose, it's not too far away from what is already in place; different GPU cores have different codepaths, which perform different optimizations for the same external driver control panel settings."

ATI has sharper images, no angle dependency.

Let me guess, you like 1080p high-def on your 480i TV....


----------



## pantherx12 (Dec 22, 2010)

HellasVagabond said:


> Benetanegia whatever you say won't change a thing to people that don't like to listen because they think that what they own is always the best and that's a fact. Now there are always other reasons as well but that's another story.
> 
> Steevo did you even bother to read the entire review ? Obviously not since in the end they admit NVIDIA has a better overal IQ when applying the right settings with their drivers. However for some " weird " reason we don't see them mentioning the IQ cost / Performance issue......



I see you've edited the post, but I was going to say that regardless of it being Rage3D, you can just look at the pictures and come to your own conclusions; it's what I'm doing right now, yay!

*edit* Seems you didn't read it all either: they don't state that one is better, they say AMD has a few glitches in the filtering technique and could do with more options. (That's the only thing that could possibly be misconstrued as NV having better quality; if anything, the review implies it's down to what the user's eyes find best.)


----------



## HellasVagabond (Dec 22, 2010)

Panther, pictures represent the settings you use, and we have no way of knowing the real settings used in them. Also, we have to think about why they "forgot" to mention the IQ trick ATI has been pulling as of late to get up to a 10% performance increase, which for me is important since it feels like they think of the end user as ignorant. Of course, it's not the first time ATI has cut IQ details here and there to gain performance, but again, that's another story.

Finally, no, I don't trust results from an ATI fan forum that carries the name of one of ATI's first 3D cards (which were a fail, btw; for example, black water in Tomb Raider for all those who used them back then), just as I would not trust results from a forum named Riva128. Makes sense, I think, correct?

Steevo, stop cutting pieces and go to where they say that ATI could do better.


----------



## pantherx12 (Dec 22, 2010)

HellasVagabond said:


> Also we have to think about why they "forgot" to mention the IQ trick ATI has been pulling as of late to get up to a 10% performance increase, which for me is important since it feels like they think of the end user as ignorant.





I thought it was common knowledge that both companies do this fairly often.

Normally when the competitor releases a new card.

As for not knowing the real settings used: fair enough, paranoid, but fair enough. You have a 580 and a 5970, though; download the various tests they used and run them yourself, that way you know for sure.

By the by, I just read the 570 review on that website; they don't seem to have a bias (that affects the review) at all, none of the tests are skewed in any way, and they even give the 570 5 stars and were impressed by it and the 580's performance.


----------



## HellasVagabond (Dec 22, 2010)

No, using the low quality setting as Quality and the quality setting as High Quality is not something both companies do (true, you can hardly see the differences, but that does not change the fact that they are there).

Now, 10% is not something huge, I agree with some people on that, but imagine what would happen if from every test we have seen lately of both the 6870/6970, and even the less powerful 6850/6950, people were to deduct 5-10%; what then?

The entire world was impressed with the 570/580; who could say otherwise, and why?

Steevo, no, what they are stating are typical optimizations, not the IQ trick; or, how should I put it, they are not really giving the right "emphasis" to it. Better now?


----------



## Steevo (Dec 22, 2010)

HellasVagabond said:


> Panther pictures represent settings you use and we have no way of knowing the real settings used in them. Also we have to think about why they "forgot" to mention the IQ trick ATI has been pulling as of late to get up to a 10% performance increase, which for me is important since it feels like they think of the end user as ignorant. Of course it's not the first time ATI has cut IQ details from here and there to gain performance but again thats another story.
> 
> Finally no i don't trust results from an ATI fan forum that carries the name of one of ATIs first 3d cards ( which were a fail btw, for example black waters in tomb raider for all those who used them back then ) as i would not trust results from a forum if it had the name Riva128. Makes sense i think correct ?
> 
> Speedo stop cutting pieces and go where they say that ATI could do better



"As part of the new driver released for the HD 6800, there were some changes to the image quality options available. Gone were the Narrow and Wide Tent MSAA modes, leaving only the standard Xox and excellent, but performance sapping, Edge Detect MSAA modes. Changes to the Catalyst Application
Intelligence (AI) slider appeared as well, with a new Texture Quality slider replacing Cat AI standard or advanced. The new default settings for Texture Quality is Quality, which is not the highest quality mode - there are some optimizations performed, that AMD states shouldn't visibly affect image quality but can increase performance."


That link, you know, the one I posted; if you scroll down, you know, with your mouse wheel.

Perhaps you can't, as you have an old mouse and don't know about that yet.


----------



## pantherx12 (Dec 22, 2010)

High Quality = no optimisations
Quality = AMD-suggested optimisations
Performance = lots of optimisations

I think you're taking the term "quality" a bit too literally here; it doesn't seem to be an issue to me.


----------



## HellasVagabond (Dec 22, 2010)

I can't force you to accept some things, but since you like looking at pictures, check online for the various pictures (pointing out the IQ trick) showing off Quality on NVIDIA cards and Quality on ATI cards in games, and judge for yourself.

When I select Quality on either the 5970 or the 580, I expect the same level of IQ; otherwise we can't really compare anything that way.


----------



## pantherx12 (Dec 22, 2010)

HellasVagabond said:


> I can't force you to accept some things, but since you like looking at pictures check online for several pictures ( pointing out the IQ trick ) showing off Quality with NVIDIA cards and Quality with ATI cards in Games and judge for yourself.
> 
> When i check quality on either the 5970 or the 580 i expect the same level of IQ otherwise we can't really compare anything that way.



"quality" is the optimised setting for BOTH companies. ( basically that "trick" you keep mentioning will not run when you switch optimisations off, I.E high quality setting in CCC )

You want to compare "high quality" dude  ( or the nv equivilant) for a true apples to apples image quality comparison : ] ( otherwise it's down to what the company feels does not detract from user experiance, which is basically the companies "opinion" which means nothing to anyone)


----------



## Benetanegia (Dec 22, 2010)

Steevo said:


> "there are some optimizations performed, that AMD states shouldn't visibly affect image quality but can increase performance."



That's all that matters for this thread, guys. I already said that the resulting IQ is irrelevant, mainly because it's highly subjective. The optimizations are there, however, and according to the above quote even AMD stated that the optimizations are there and that they increase performance. What else is there to say?

The optimizations are there and increase performance. People find this completely legit and useful.

Nothing prevents NVIDIA from using the same optimizations and gaining 5-10% performance in some future drivers, that's all.


----------



## pantherx12 (Dec 22, 2010)

When W1zz does reviews, do you think he switches off things like IQ optimisations?


----------



## HellasVagabond (Dec 22, 2010)

Panther, have you even searched the web for what I am talking about? If you had, you would see that even the High Quality mode is not really high quality; please do, and then I will be happy to hear what you have to say. As for W1zz, I think he leaves the cards at default CCC settings, as do most people reviewing cards.

Benetanegia, if NVIDIA were to do this, we would probably have 10 threads with accusations, flames, and stories about the end of the world and how they are the reason.

Anyway, no time to argue, and this was not my intention in the first place. Just saying my opinion; back to work now.


----------



## pantherx12 (Dec 22, 2010)

No, High Quality mode just switches off optimisations; to get the rest of the quality settings up, you have to turn them up.

As for searching: naww, I haven't. It's you making the statement; post some links :]

I wouldn't know where to look.

Oh, I should note I'm talking about the Catalyst A.I. slider; as I said, I don't think anyone but casual gamers actually uses the global IQ slider. (Turning Catalyst A.I. off in old versions of CCC does the same thing as turning it to High Quality now.) The global slider doesn't adjust the Catalyst A.I. setting (it just changes AA, AF, and things like that).

And who cares about them; they can't see the difference.

*By casual I mean people who have a gaming PC but have no idea how it works.

Honestly, when there's an off switch for the image quality tricks, I don't think it matters at all, one bit, except in benchmarks. (Where they should be off 100% of the time.)

Upsetting to see you use the term "argue"; I thought this was a friendly discussion :[

In fact, I don't disagree with you: there are optimisations going on, but you can switch them off, which is the important thing to consider.

So saying NVIDIA or AMD has better image quality is, well, silly really; they're about the same when no optimisations are on. Trust me, I know. I could quickly swap to an NV card now and do some tests just in case my memory is failing me, but I'm fairly certain.


----------



## bear jesus (Dec 22, 2010)

Some of the comments in here make me wonder how long certain people have been using computers. Both companies have used optimizations that have definitely negatively affected image quality over the years, and both companies have used optimizations that have not changed image quality enough to be noticeable. Just as each company has beaten the other at some point, it keeps swapping which company is complaining at which over what optimizations they have used.

Can't we just get over it and accept that both companies do this stuff, and that at the moment it hardly affects quality yet increases performance? It's really to be expected if you have paid attention to computer graphics over the past decade and a half or so.


----------



## HellasVagabond (Dec 22, 2010)

Most people do benchmarks with default settings, if not all, meaning the Quality setting for both companies.

However, when the IQ is lower on ATI than on NVIDIA, even if not noticeable by most people, I hardly call that fair.

So when we have seen several hundred benchmarks since the launch of the 6xxx series without people mentioning this, I really think the end user is not getting all the facts, and in the end that is simply wrong. As I said, a 5-10% performance increase is not huge, but deduct that from all benchmarks of the 6xxx series so far and many things change.
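A toy sketch of the arithmetic behind that claim (the card names and scores below are invented, and the 10% figure is just the upper bound mentioned above): in a close matchup, deducting a claimed optimization gain can flip which card leads a benchmark table.

```python
# Made-up scores for a close matchup, purely to illustrate the argument.
scores = {"HD 6970 (Quality)": 100.0, "GTX 570 (Quality)": 96.0}

adjusted = dict(scores)
adjusted["HD 6970 (Quality)"] *= 0.90  # strip a claimed 10% optimization gain

before = max(scores, key=scores.get)
after = max(adjusted, key=adjusted.get)
print(before, "->", after)  # with these numbers, the leader changes
```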

Anyways, I can't continue, too much work; I just wanted to say a couple of things. Whether or not some people decide to accept them, or look into them further, is really up to them.

Cheers


----------



## bear jesus (Dec 22, 2010)

HellasVagabond said:


> Most people do benchmarks with default settings, if not all, meaning quality setting in both companies.
> 
> When however the IQ is lower, even not noticeable by most people, when it comes to ATI than the one in NVIDIA i hardly call that fair.
> 
> ...




I have to ask: have you only started using computers recently? I would assume not, from the Tomb Raider comment you made before, so can I ask: were you posting like this in forums when NVIDIA was doing this with noticeably degraded image quality?

As I said before, both companies have done this many times over the years; anyone who has been gaming for a while should know this and understand the effects of it, thus this whole exchange of posts is pointless.


----------



## HellasVagabond (Dec 22, 2010)

When NV was doing similar things, ATI was also doing similar things; everyone knew it, everyone mentioned it, no one denied it, and we all knew what to expect.

However, now is now, and since only ATI is doing it, I think we should underline it, especially in light of their new 6xxx series, don't you?

Anyways, like I said, I don't have time to follow threads; I just said what I thought on the drivers matter, that is all.


----------



## OneCool (Dec 22, 2010)

Looks like AMD gave the 5870 one last bang with the 10.12.

What is that, about a 3-4% performance gain overall? Not bad.
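For what it's worth, an "overall" gain figure like that is usually an average of per-game ratios rather than a single number. A minimal sketch, with purely hypothetical FPS values (not W1zzard's actual data), using a geometric mean so no one game dominates:

```python
from math import prod  # Python 3.8+

# Hypothetical per-game FPS for one card on two driver versions
# (illustrative numbers only -- not data from the article).
old = {"Bad Company 2": 60.0, "Crysis": 30.0, "Metro 2033": 25.0}
new = {"Bad Company 2": 63.0, "Crysis": 30.6, "Metro 2033": 25.5}

# Geometric mean of the per-game ratios -> one "overall" gain figure.
ratios = [new[g] / old[g] for g in old]
overall = prod(ratios) ** (1.0 / len(ratios)) - 1.0
print(f"overall gain: {overall:.1%}")  # ~3% for these made-up numbers
```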


----------



## Mussels (Dec 22, 2010)

And as I said earlier, NVIDIA is doing the same shit.

It's a pointless argument; who cares if it's not 100% of what they can achieve in quality, as long as the end user can't tell the difference?


----------



## HellasVagabond (Dec 22, 2010)

Well, if they did it in their most recent (few days ago) drivers, I see nothing wrong with that; it's just countering what ATI has been doing for almost 2 months now, if not longer.

Of course, from what I see in that other thread, you are taking a whole different approach, saying that NV did this to make their cards seem faster in light of the new 6xxx series... Seriously??? Obviously you are rooting for ATI all the way, so yes, this is pointless, I agree.


----------



## AsRock (Dec 22, 2010)

Cool, wish you did this more often. It must take some time installing all those drivers over and over.

I am hoping that you did not just reuse old benches for the older cards, due to game patches fixing stuff..

MASSIVE thanks.... Hope to see this done again.


----------



## Fatal (Dec 22, 2010)

Nice review, W1zz. I installed the 10.12s and will try them out.


----------



## Super XP (Dec 22, 2010)

It all boils down to which company can allow us users to jack up image quality to the MAX while at the same time maintaining great performance. Today I see this happening with Radeon cards.

In the past that was not possible, I believe. The more you increased IQ settings, the more it sucked your performance away (except in games such as Half-Life 2 and Far Cry).


----------



## HellasVagabond (Dec 22, 2010)

Super XP said:


> It all boils down to the company that can allow us users to jack up Image Quality to the MAX while at the same time maintaining great performance. Today I see this happening with Radeon cards.
> 
> In the past that was not possible, I believe. The more you increase IQ settings, the more it sucks your performance away (except for games such as Half-Life 2 and Far Cry).



I see a much better implementation of AA and AF (at least when it comes to performance cost) with the new GTX 580 at high resolutions than I have seen with any other card on the market, up until today that is.

In any case, AA/AF will always hit performance, until of course we reach a time when max AA / max AF is the default and another quality setting takes their place.


----------



## codiown (Dec 22, 2010)

For my 5970, 10.12 WHQL is the first useful driver with nice performance since 10.5a, even though there's an annoying bug where the system sometimes hangs for a few seconds when the monitor comes back from sleep mode or when you press Alt+F4.


----------



## wahdangun (Dec 22, 2010)

Wow, I thought this thread was about AMD driver improvements? By the way, both companies use some sort of optimization, and at least AMD admits it. And if you don't like it at all, you can just change the option from Quality to High Quality.


----------



## Magikherbs (Dec 22, 2010)

Isn't the "enhance app setting" AA option in the Nvidia control panel, the same as Catalyst A.I. in CCC ?


----------



## trickson (Dec 22, 2010)

The one thing I would like to see ATI fix is the speed limit of the GPU. They lock you at 960 MHz, and this sucks! nVidia doesn't lock you to any specific clock speed on your GPU, but ATI sure does. I have seen this for the longest time and it really pisses me off. They fix everything and nothing, all the time!


----------



## bear jesus (Dec 22, 2010)

trickson said:


> The one thing I would like to see ATI fix is the speed limit of the GPU. They lock you at 960 MHz, and this sucks! nVidia doesn't lock you to any specific clock speed on your GPU, but ATI sure does. I have seen this for the longest time and it really pisses me off. They fix everything and nothing, all the time!



I assume you are talking about the maximum selectable clocks in the Overdrive tab of the Catalyst Control Center. If so, you should just use different software, like MSI Afterburner, to set your clocks... assuming they would be stable that high.


----------



## Steevo (Dec 22, 2010)

90% of users will never exceed the clocks listed in CCC Overdrive; for those that will, apps like Afterburner, Trixx, and BIOS editing tools are the superior method of getting the clocks they want and can achieve.


----------



## erocker (Dec 22, 2010)

trickson said:


> The one thing I would like to see ATI fix is the speed limit of the GPU. They lock you at 960 MHz, and this sucks! nVidia doesn't lock you to any specific clock speed on your GPU, but ATI sure does. I have seen this for the longest time and it really pisses me off. They fix everything and nothing, all the time!



Nvidia doesn't have a built-in overclocking utility; you have to use a 3rd-party one. When using a 3rd-party overclocking utility with ATi cards, clocks can be adjusted until the card blows up.


----------



## brandonwh64 (Dec 22, 2010)

I installed 10.12 and the new beta CCC, but I am going to uninstall and put 10.12 with the old CCC back, because the new CCC sucks IMO; there are fewer features.


----------



## trickson (Dec 22, 2010)

Well, the main problem for me and my setup is that MSI Afterburner and EVGA Precision just don't work. I cannot get them to work, and it may be because I have the Windows 7 64-bit OS. Third-party software for nVidia cards has always worked great, but now that I have ATI and CCC it just seems to suck. I wish they would fix this for advanced users and give us a good overclocking tool that has all the bells and whistles we advanced users want and need.


----------



## Steevo (Dec 22, 2010)

Try advanced view.


----------



## erocker (Dec 22, 2010)

trickson said:


> Well, the main problem for me and my setup is that MSI Afterburner and EVGA Precision just don't work. I cannot get them to work, and it may be because I have the Windows 7 64-bit OS. Third-party software for nVidia cards has always worked great, but now that I have ATI and CCC it just seems to suck. I wish they would fix this for advanced users and give us a good overclocking tool that has all the bells and whistles we advanced users want and need.



Is your card an ATi reference design? How does MSI Afterburner not work? It's definitely not a Windows 7 x64 issue. Don't bother using CCC if you want to clock past what it gives you; ATi caps the OC limit obviously to protect people from burning up their cards.


----------



## trickson (Dec 22, 2010)

Steevo said:


> Try advanced view.



I have.


----------



## trickson (Dec 22, 2010)

erocker said:


> Is your card an ATi reference design? How does MSI Afterburner not work? It's definitely not a Windows 7 x64 issue. Don't bother using CCC if you want to clock past what it gives you; ATi caps the OC limit obviously to protect people from burning up their cards.



I do not know; they are XFX HD 5770 cards, both the same. Well, it locks my computer up and then I have to hard boot. Should I just install the drivers and not CCC, then?


----------



## erocker (Dec 22, 2010)

trickson said:


> I do not know; they are XFX HD 5770 cards, both the same. Well, it locks my computer up and then I have to hard boot. Should I just install the drivers and not CCC, then?



Go into CCC and reset everything to default. Untick Overdrive and manual fan control. Download the latest MSI Afterburner and install it. You will have to go into Afterburner.cfg and change "EnableATiUnofficialOverclocking" from 0 to 1. Start the program up and give it a go.
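The one-line edit described here amounts to flipping a single value in Afterburner's config file. A sketch only: the key name is taken from the post itself, while the section header (`[ATIADLHAL]`) and the exact filename are assumptions that vary between Afterburner versions, so match against your own file.

```ini
; Afterburner.cfg, per the post above (commonly MSIAfterburner.cfg in the
; Afterburner install folder). Section name is an assumption for this
; version; flip the value from 0 to 1 to unlock unofficial ATI overclocking.
[ATIADLHAL]
EnableATiUnofficialOverclocking = 1
```

Restart Afterburner after saving the file so the setting is picked up.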


----------



## trickson (Dec 22, 2010)

erocker said:


> Go into CCC and reset everything to default. Untick Overdrive and manual fan control. Download the latest MSI Afterburner and install it. You will have to go into Afterburner.cfg and change "EnableATiUnofficialOverclocking" from 0 to 1. Start the program up and give it a go.



OK, I will give this a try. Thank you.


----------



## trickson (Dec 22, 2010)

Well, that fucked my computer up really bad! It locked up, dumped my desktop, and just fucked it all up. Now I do not know what to do; I think I have to reinstall Windows 7. It is just a mess now!


----------



## Super XP (Dec 22, 2010)

brandonwh64 said:


> I installed 10.12 and the new beta cat but i am going to uninstall and put 10.12 and old cats back cause that new CCC sucks IMO, there is less features


I don't agree; I like the new layout. This is a beta version, so hopefully we will see a lot more settings to play with.


----------



## dir_d (Dec 22, 2010)

trickson said:


> Well, that fucked my computer up really bad! It locked up, dumped my desktop, and just fucked it all up. Now I do not know what to do; I think I have to reinstall Windows 7. It is just a mess now!



I think there's something wrong with your overclock. If you are sure it's stable, turn off C1E or anything else like it; it causes a lot of problems with ATI cards.


----------



## crazyeyesreaper (Dec 22, 2010)

He might have to disable the ultra-low power state (ULPS) in the registry. I'm sure everyone here has forgotten that anyone who has it turned on will have MSI Afterburner lock up when overclocking with dual cards from ATI.
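The ULPS toggle is commonly exposed as an `EnableUlps` registry value under the display-adapter class key. A minimal .reg sketch, assuming the standard display class GUID and a `0000` subkey (your adapters may sit under `0001`, `0002`, and so on; search the class key for every `EnableUlps` value, set each to 0, and back up the registry first):

```
Windows Registry Editor Version 5.00

; Disable the Ultra Low Power State (ULPS) for the adapter under subkey 0000.
; The subkey number is an assumption; on dual-GPU setups, enumerate the class
; key for every "EnableUlps" value and zero each one.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

A reboot is needed before the change takes effect.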


----------



## trickson (Dec 22, 2010)

Wow, I guess I will just have to stick with what I have. It ain't much, but at least I do not lose my desktop or files! Man, what a job getting it all back up and running!


----------



## crazyeyesreaper (Dec 22, 2010)

I know the feeling, trickson. Afterburner didn't like my setup at all either. After jumping through the hoops, I actually flashed my cards to have no real OC limit and just used Afterburner to up the voltages; it worked wonders for me.


----------



## niko084 (Dec 23, 2010)

Definitely cool to see such an in depth comparison between driver versions.

Although most results show such a small difference that it could even be standard run-to-run fluctuation, some of course show obvious gains.
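The point about fluctuation can be made concrete: a driver "gain" only means something if it exceeds the bench's normal run-to-run variance. A minimal sketch with invented FPS numbers and an assumed 2% noise threshold (every figure here is hypothetical, not from the review):

```python
# Hypothetical FPS results for one card under two driver versions.
# All numbers are invented for illustration only.
old_driver = {"Game A": 60.1, "Game B": 45.3, "Game C": 88.0}
new_driver = {"Game A": 60.8, "Game B": 45.1, "Game C": 91.5}

NOISE = 0.02  # assume ~2% run-to-run variance on this bench

def gain(before, after):
    """Relative change from `before` to `after`."""
    return (after - before) / before

for game in old_driver:
    g = gain(old_driver[game], new_driver[game])
    verdict = "real gain" if g > NOISE else "within noise"
    print(f"{game}: {g:+.1%} ({verdict})")
```

With these made-up numbers, Game C's +4.0% clears the threshold, while Game A's +1.2% and Game B's -0.4% land within noise.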


----------



## $ReaPeR$ (Dec 24, 2010)

This review was a great idea, Wizzard.  Keep up the great work!


----------



## Wshlist (Dec 25, 2010)

Thumbs up for doing this test; it's amazing that others don't do this sort of thing much.
I was already curious what the 10.12a drivers did for the HD 69xx cards in practice, but I could not find any info before this. It's interesting that they seem to do very little; not quite what I expected, but very elucidating.

Thanks for the testing.


----------



## RejZoR (Dec 25, 2010)

I got awful image tearing in BioShock 2 with MLAA, with or without V-Sync, so I went back to Cat 10.10e. Also, textures weren't as sharp as I wanted, even with all optimizations disabled.
Though I was using many settings that were not meant to be seen in CCC.
I will try again today to see how it works. It's strange that MLAA causes tearing, though, considering it's applied very late in the rendering pipeline, so it shouldn't interfere with anything.


----------



## WarEagleAU (Dec 26, 2010)

I too never thought, or saw, that drivers made much of a difference. They seem to have done so here, though not by a huge margin. Nice to see some improvement.


----------



## SolidEther (Jan 7, 2011)

*10.12 and CCC2*

I use a Sapphire Radeon HD 4870 X2, and I have used it for about a year and some odd months (since a short time after it was released). The only time it ran oddly or slightly unstable was when using Catalyst drivers near launch. The driver disc included with Sapphire cards usually has worthless, premature drivers that may not function at all (like the Sapphire HD 2600 Pro 512 MB AGP; it has PCI-e drivers included, I believe). Both cards work perfectly now that I have found the right drivers; the AGP HD 2600 just uses the AGP hotfix, and it always works great.

I have used 10.9a since its release as a hotfix for the dual-GPU cards (HD 4870 X2, HD 4850 X2, and 5970). It works beautifully, but I wanted to try 10.12 with the new CCC2, and it worked. The graphics weren't as smooth, though; they looked more aliased than on 10.9a no matter what the settings. I wanted the 10.12s to work, and I like CCC2. I want AMD to release an individual CCC2 download so I could install it alongside the 10.9a driver and see if they mesh well. I will try 11.0 on release, but I think 10.9a is perfect for gameplay in all the games I play (all the CODs, AVP, SC2, DAO and all its DLCs, Spore, Chronicles of Riddick: Assault on Dark Athena, Far Cry 2, HAWX, DOOM 3 and RoE, Sins of a Solar Empire). The only problem is that when overclocking on 10.9a, CCC won't downclock the memory back to 2D speeds, so I have to use RivaTuner to OC. 10.12 had a noticeable framerate drop at times and some stuttering. I run all options at highest detail with 4-8x AA and Quality Adaptive AA on a 1920x1080 16:9 LCD (an LG Flatron W2240T, the same brand used in this TechPowerUp review's test, only smaller), and my rig is pretty juiced (see specs if you care), so I am pretty sure I can compare faults in the drivers. I also always clean-install/remove drivers with Driver Cleaner Pro and a cleaned registry.

Drivers 10.12, 10.9a, and 10.8b do fix the problem of getting regular 3D speeds on BOTH GPUs during 3D mode and dropping BOTH GPUs back to 2D mode after the game exits, which is one of the most annoying and frustrating issues with finding proper drivers for dual-GPU cards. As it is, PC Gamer lists the HD 4870 X2 as the 5th best card on their HOTTEST HARDWARE thermometer page as of their Feb 2011 issue. Drivers allow it to be that way; without the right driver, the card only runs one GPU in 3D in half of the games, making it a bigger, hotter HD 4870. I've installed DX11 already; when more DirectX 11 titles release and make more use of the fancy tessellation, morphological AA, and other stuff, I will get a 5970.


----------

