# HIS Radeon HD 6970 2 GB



## W1zzard (Dec 14, 2010)

AMD's new Radeon HD 6970 comes with an improved shader architecture that promises more performance from fewer transistors while consuming less power. Another interesting aspect is the inclusion of a power limiting system that ensures maximum performance when needed and avoids damage to the card.

*Show full review*


----------



## alexsubri (Dec 15, 2010)

o m g ... I am disappointed as well! The 580 def caught AMD off guard! If the 6990 can't pull it off, I'm grabbing a 580. Still waiting to see the 6950 and 6970 CrossFire review. The only upside is price and tessellation.


----------



## v12dock (Dec 15, 2010)

:shadedshu... Nothing more to say


----------



## Jamborhgini313 (Dec 15, 2010)

No wonder they delayed it...GTX 580 KILLED IT


----------



## v12dock (Dec 15, 2010)

570 beats it and is cheaper.... ?


----------



## jasper1605 (Dec 15, 2010)

wtstalin said:


> the comparison card is the 570



which is $50 cheaper.  I do like that this card has 2gb of memory so it could fuel my eyefinity, but I definitely am sad that the performance is not quite up to what was expected.  

Oh well, more reason to save money instead of spending it!


----------



## Jamborhgini313 (Dec 15, 2010)

wtstalin said:


> i had to make an account to post this
> of course the 580 killed it
> it costs significantly more money
> do you not understand how this works?
> the comparison card is the 570



So? 5970 cost $100 more than GTX 580 and they were comparable. These 2 are the flagships from each company


----------



## blu3flannel (Dec 15, 2010)

After all the hype, that was somewhat disappointing. Ah well, such is life.


----------



## HXL492 (Dec 15, 2010)

Yes, the 69xx may be a disappointment, but Nvidia got a head start anyway. The GTX 480 was already fast but lacked the efficiency of most cards, so all Nvidia had to do was improve the architecture. AMD, on the other hand, needed to make a card faster than the 5870 and didn't know where to start. The GTX 580 is an improvement on the GTX 480, but at least the 6970 was an improvement over the 5870. To sum up, I was very disappointed, as I heard the 6970 was meant to be at least 10-20% faster than the GTX 480.


----------



## Volkszorn88 (Dec 15, 2010)

In terms of war, can we still fall back on the 5970? :/


----------



## wolf (Dec 15, 2010)

Unfortunately not as fast as I had hoped; the GTX 570 is looking to be my next card.


----------



## KainXS (Dec 15, 2010)

Well, you have to look at it like this: the GTX 580 is basically the same architecture as the 480, which has been around for about 7 months, and the drivers for it are pretty well optimized from the looks of it.

The 6970, though, is a different design than the other cards, so the drivers aren't very mature; even a few days ago drivers were released that gave it a boost.

It's still too early to call.

glad i don't have the money to buy anything right now so i can wait and see.


might buy a 570 later though

that was the quickest ban i ever saw.


----------



## Goodman (Dec 15, 2010)

10.11?
Thx! for the review btw


----------



## DonInKansas (Dec 15, 2010)

I bet Catalyst 11.1 addresses some of the issues like Bluray playback; we'll see what mature drivers can do for the card.


----------



## Goodman (Dec 15, 2010)

10.12 gives a nice performance boost, or so I heard


----------



## W1zzard (Dec 15, 2010)

you heard wrong. check the amd release notes. also no way to retest all the cards in 1 day. 10.12 was released 30 hours ago.


----------



## AndreiD (Dec 15, 2010)

Great review as always, but the card seriously disappoints.
The only thing left for AMD is to price the cards lower, because at these prices, everyone will just buy the 570 or some future 560.


----------



## W1zzard (Dec 15, 2010)

DonInKansas said:


> I bet Catalyst 11.1 addresses some of the issues like Bluray playback; we'll see what mature drivers can do for the card.



the bluray power problem has existed since hd 6800 release


----------



## Buster_Jack (Dec 15, 2010)

v12dock said:


> 570 beats it and is cheaper.... ?



Not really! Heck, the 6970 beats the GTX 570 in most of the tests and gets real close to the GTX 580, man. Go back and check the review again!
It even beats the GTX 570 in Unigine Heaven 2.1.

I'm not saying the card isn't a disappointment, I'm just saying it's not a complete disappointment.


----------



## mdsx1950 (Dec 15, 2010)

Bang for the buck FTW!

With the new drivers the cards will perform better for sure.


----------



## Animalpak (Dec 15, 2010)

Performance was not the goal, for sure.


----------



## v12dock (Dec 15, 2010)

Buster_Jack said:


> Not realy! Heck the 6970 beats the gtx570 in most the tests and gets real close to the gtx580 man. Go back and check the review again!
> It even beats the gtx570 in Unigine heaven 2.1.
> 
> Im not saying the card isn't a disappointment, im just saying it's not a complete disappointment.








http://tpucdn.com/reviews/HIS/Radeon_HD_6970/images/perfrel.gif

All I care about


----------



## CDdude55 (Dec 15, 2010)

Pretty fail card for a high-end single-GPU solution compared to other cards currently on the market. They still haven't fully gotten above the performance level of even the older 480. My high expectations have been destroyed.

But hey, I'll get one when they get a bit cheaper, and with driver improvements it should be a bit better by then.


----------



## Volkszorn88 (Dec 15, 2010)

At this point, it's just best to buy a 5970.


----------



## HTC (Dec 15, 2010)

W1zzard said:


> also no way to retest all the cards in 1 day. 10.12 was released 30 hours ago.



 Official AMD Radeon 6000 Series Discussion Thread



W1zzard said:


> you heard wrong. check the amd release notes.



Really?


----------



## Maban (Dec 15, 2010)

It's a step forward for ATI but still unimpressive.


----------



## Buster_Jack (Dec 15, 2010)

v12dock said:


> http://tpucdn.com/reviews/HIS/Radeon_HD_6970/images/perfrel.gif
> 
> All I care about



It beats the GTX 570 in CoD4, Crysis, Dirt 2, F1, Riddick, Clear Sky, and even UT3, where ATI cards do real bad, plus excellent tessellation performance...

It's the second best card for a 30" monitor:
http://tpucdn.com/reviews/HIS/Radeon_HD_6970/images/perfrel_2560.gif


----------



## Melvis (Dec 15, 2010)

It performs well in certain games and not so good in others; maybe with better drivers it might be a contender? Otherwise it's not a GTX 580 killer, but not far off it, still nothing I would jump at as of yet. 4870 X2 FTW!!!

Oh, and the heat? That's a bit high for my liking.


----------



## wolf (Dec 15, 2010)

Buster_Jack said:


> It beats the gtx570 in cod4, Crysis, Drit2, F1, Riddick, Clear Sky, and even UT3 where ATI cards do real bad, plus excellent tessellation performance...
> 
> It's the second best card for a 30" monitor:
> 
> http://tpucdn.com/reviews/HIS/Radeon_HD_6970/images/perfrel_2560.gif



which less than 2% of people use. 1080p is fast becoming the most popular res.


----------



## Buster_Jack (Dec 15, 2010)

wolf said:


> which less than 2% of people use. 1080p is fast becoming the most popular res.


Remember in some tests like 3DMark Vantage and Far Cry 2 ATI cards do real bad, and that's where the average performance level closes up with that of the GTX 570. Otherwise, if the price goes down to around $350, the card might become the best card to buy.


----------



## wolf (Dec 15, 2010)

Buster_Jack said:


> http://tpucdn.com/reviews/HIS/Radeon_HD_6970/images/perfrel_1920.gif



What's your point? It's two percent faster than a GTX 480 or 570; my point is Nvidia trumped them this time, at least so far.

And don't get me wrong, I'd love to see drivers work magic for these cards too; it will only mean price drops on NV cards, making everyone's life better.


----------



## Benetanegia (Dec 15, 2010)

Wow. Really, really disappointing. They don't even OC a damn. I didn't even expect this.

After I read that the final clocks were going to be 880 MHz, a strange clock instead of the typical rounded clocks that AMD has always used, I was 100% sure that after the clock bump/BIOS change it would trade blows with one Nvidia card. I just thought it would be the 580, but no, it's the 570, and that's with both at stock. When overclocked, the HD 6970 (at least the one W1zz got) is hopeless: 4% OC, pff.

And 90°C, wow. I wouldn't have thought AMD could do this wrong after Cypress, honestly. Every advantage they had is gone. Compared to the GTX 570, perf/watt is the same, because power consumption and performance are the same, while thermals and OC are much worse...

The future is not so bright for AMD. They do have a small die-area advantage, but looking at how the HD 6950 is just as fast as the 5870, Nvidia will probably put a lot of pressure on with the GTX 560 and maybe even beat the HD 6950 (bear in mind GTX 460 @ 850 MHz ~= HD 5870, now add 15% more shaders), with a significantly smaller chip and a vastly cheaper to produce card. IF that happens, it would be the first time that's happened in the "high-end" since the 7900 GTX!

I was disappointed by GF110, because it was nothing more than GF100 done right, as opposed to the 1.5x GF104 that I was anticipating (576 SP). And now this. Well, good news for me then. I bought a very cheap Gigabyte GTX 460 OC a couple of months ago, and in the last week I've been worried thinking I did wrong*. I'm happy now knowing I am not missing anything, but my little enthusiast heart is broken.

*I was not going to buy any of these cards, I only buy 150-euro cards now, but the fear was that if they were very fast, prices of faster cards (Barts, GTX 470...) would drop drastically. Or that the soon-to-be-launched GTX 560 would have to be sold at 150-200 euros or so to compete. But with the HD 6950 being what it is, Nvidia can release it for 220 euros and we would even have to thank them for the good deal. Damn it.
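For what it's worth, the shader arithmetic behind that GTX 560 guess is easy to sketch; the 336 count is the GTX 460's public spec, while the +15% bump and the `projected_shaders` name are just my own assumptions, not confirmed figures:

```python
# Rough sketch of the "GTX 460 + 15% shaders" guess above.
# 336 is the GTX 460's shader count (GF104 with one SM disabled);
# the 15% bump is only my speculation about a GTX 560 part.
gtx460_shaders = 336
projected_shaders = round(gtx460_shaders * 1.15)
print(projected_shaders)  # 386, i.e. roughly a fully enabled GF104 (384 SP)
```

So the guess lands almost exactly on a fully enabled GF104-class chip, which is why a faster, fully enabled midrange part seems plausible to me.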


----------



## mdsx1950 (Dec 15, 2010)

Have to see how these cards scale in CF. People say it can beat GTX 580 SLI, so I'm very eager to see. Hopefully using the new 10.12 drivers.



wolf said:


> which less than 2% of people use. 1080p is fast becoming the most popular res.



I'm proudly a part of that 2% lol


----------



## wolf (Dec 15, 2010)

mdsx1950 said:


> I'm proudly a part of that 2% lol



So you should be, I'd love a 30" monitor at 2560x1600...


----------



## Kantastic (Dec 15, 2010)

If ANYONE continues to bitch about the old drivers I will personally shove my foot up your cord.

The card was disappointing to say the least, but great review as always W1z, thanks!


----------



## Volkszorn88 (Dec 15, 2010)

I'll wake up and realize that this was all one bad nightmare. Everything will be fine when well rested.


----------



## KainXS (Dec 15, 2010)

I did not even notice it was running 90°C.

It's more like AMD's GF100; AMD's next-gen cards might be great, but it's just like Nvidia when the 480 launched, a test card.

As for the drivers, it's like AMD hired a ninja to stab itself in the back.

Damn, I'm gonna say... :shadedshu

I did not expect this


----------



## AndreiD (Dec 15, 2010)

What will newer drivers do? Drivers can't add that much performance anyway so idk why so many people are bitching about them.
I've yet to see a driver which adds 10% performance to a card. If the card turned out crappy, it will be the same with new drivers too.
I'm sure drivers will add 200 more sp so it can beat the 580.
[/rage against people crying about drivers]


----------



## theonedub (Dec 15, 2010)

I guess price drops on NVIDIA's lineup are out of the question now, no?


----------



## Goodman (Dec 15, 2010)

AndreiD said:


> What will newer drivers do? Drivers can't add that much performance anyway so idk why so many people are bitching about them.
> I've yet to see a driver which adds 10% performance to a card. If the card turned out crappy, it will be the same with new drivers too.
> I'm sure drivers will add 200 more sp so it can beat the 580.
> [/rage against people crying about drivers]



I've seen some good performance boosts with the latest drivers on my 5750 through the year; it may not be much, but it is there.


----------



## cherubrock (Dec 15, 2010)

Can someone please clarify why some performance numbers vary between when you reviewed the card, and when you're comparing against them? Specifically, when you reviewed the 6850 at 1920 on Metro, it got 18.5fps. But when it's compared in your new review, it shows as 13.4fps.

Here's the 6850's review (the 6850's 18.5fps is in blue):
http://tpucdn.com/reviews/ATI/Radeon_HD_6850_CrossFire/images/metro_2033_1920_1200.gif

And here is the 6950's review (the 6850's 13.4fps is at the very top):


----------



## N3M3515 (Dec 15, 2010)

omg... 1920 SPs would have done the trick...
and the 6950 sucks! It should have beaten the HD 5870....
:shadedshu


----------



## Benetanegia (Dec 15, 2010)

cherubrock said:


> Can someone please clarify why some performance numbers vary between when you reviewed the card, and when you're comparing against them? Specifically, when you reviewed the 6850 at 1920 on Metro, it got 18.5fps. But when it's compared in your new review, it shows as 13.4fps.
> 
> Here's the 6850's review (the 6850's 18.5fps is in blue):
> http://tpucdn.com/reviews/ATI/Radeon_HD_6850_CrossFire/images/metro_2033_1920_1200.gif
> ...



All cards are much slower on the second chart; relative performance remains the same. As to the cause, idk, but there are many possible causes: benching in a different game location, tessellation on/off, advanced DoF on/off...


----------



## N3M3515 (Dec 15, 2010)

Looks like a pair of 6950's beats a pair of gtx 570's
Here


----------



## Bjorn_Of_Iceland (Dec 15, 2010)

Wow.. a gtx480 was having this performance.. well, 8 months ago.


----------



## Volkszorn88 (Dec 15, 2010)

Bjorn_Of_Iceland said:


> Wow.. a gtx480 was having this performance.. well, 8 months ago.



We know.


----------



## Bjorn_Of_Iceland (Dec 15, 2010)

Volkszorn88 said:


> We know.



The tessellation performance of 6970 is good though. Not that bad of a card. Just focused on features that are thought to be the future of gaming graphics.


----------



## H82LUZ73 (Dec 15, 2010)

Great review, Wizz. Look on the upside: at least the green cards will hopefully fall in price. Uhm, I'm confused and upset they fell behind after the Barts chips that looked promising... oh well, so many video cards on the market now, whatever should I buy...


----------



## Bjorn_Of_Iceland (Dec 15, 2010)

Just hoped it had a 3DMark 11 bench though... and this thing is hot...


----------



## Jeffredo (Dec 15, 2010)

Wow.  I bet Charlie Demerjian is having a coronary right now trying to figure a way to spin this.


----------



## Benetanegia (Dec 15, 2010)

Jeffredo said:


> Wow.  I bet Charlie Demerjian is having a coronary right now trying to figure a way to spin this.



lol +1


----------



## qubit (Dec 15, 2010)

I'm so disappointed. It can't even beat the hot and bothered last gen GTX 480. :shadedshu It looks like AMD needs to hire some better engineers.

I was hoping that it would be very competitive with the GTX 580, so that the price of this card would come down and we might see a price war. Can't see this happening now.


----------



## Lionheart (Dec 15, 2010)

Words cannot describe how much these cards fail in my eyes... nah, just kidding. The performance isn't exciting, but it looks like CrossFire is where it's at; seems more like a new-architecture experiment to me. Oh well, I'm glad I don't have to wait for the reviews anymore, I can finally decide on what card I want.

Great reviews, Wizz


----------



## DarkOCean (Dec 15, 2010)

Being a new architecture, I bet it will get better and better with new drivers; remember the GTX 480, when it first came out it barely beat a 5870 or even got beaten by it. Anyway, the 6950 looks the better one in terms of efficiency.
28nm for me.


----------



## Red_Machine (Dec 15, 2010)

Oh my god, that's the funniest thing I've seen all month.

Thanks for giving the big thumbs up to the 580, W1zz!


----------



## streetfighter 2 (Dec 15, 2010)

I find it amusing that the w1z didn't use his own utility, Sapphire TRiXX, to increase the voltage.  (I'm sure there are legal/business reasons why, but it's still kinda ironic.)

I know everyone's disappointed that it isn't the performance beast we were all hoping for, but it's considerably cheaper than the GTX 480 and it performs similarly.


----------



## W1zzard (Dec 15, 2010)

streetfighter 2 said:


> I find it amusing that the w1z didn't use his own utility, Sapphire TRiXX, to increase the voltage. (I'm sure there are legal/business reasons why but it's still kinda ironic.)



the simple reason is that i dont have a datasheet for that controller, so i have no clue how to program it, which means i can't change voltage. and no, i have my own awesome internal oc utilities


----------



## Over_Lord (Dec 15, 2010)

blu3flannel said:


> After all the hype, that was somewhat disappointing. Ah well, such is life.



AMD epic disappointment. Hope they don't go the HD 2000 way from here again, man, that would be SUCH A BUMMER.


----------



## W1zzard (Dec 15, 2010)

cherubrock said:


> Can someone please clarify why some performance numbers vary between when you reviewed the card, and when you're comparing against them? Specifically, when you reviewed the 6850 at 1920 on Metro, it got 18.5fps. But when it's compared in your new review, it shows as 13.4fps.



from the review test setup page, which you might have missed: "Benchmark scores in other reviews are only comparable when this exact same configuration is used."

I have added a version number to the test system. right now it's "Test System - VGA Rev. 12", look at that when trying to compare stuff.

for the case of metro 2033: rebench, different level, game patched, new drivers, using benchmark tool, different settings


----------



## Tatty_One (Dec 15, 2010)

streetfighter 2 said:


> I find it amusing that the w1z didn't use his own utility, Sapphire TRiXX, to increase the voltage.  (I'm sure there are legal/business reasons why but it's still kinda ironic.)
> 
> I know everyones disappointed that it isn't the performance beast we were all hoping for, but it's considerably cheaper than the GTX 480 and it performs similarly.



But more expensive than the GTX 570, which also performs similarly, so it's fairly safe to say the GTX 480 is on its way out.  I find it sad that there is not more performance, but to be honest, I didn't expect it; those that do/did just became too embroiled in the hype of discussion threads on "what might be". This is a damn fast card at a decent price. It will have to come down in price a little if AMD wants to make some money on it..... and it will, and when it does plenty will jump on it.

Lastly, there must be one or two very red-faced members who posted in the 6900 series discussion threads who offered up 101 reasons why this card was going to be a GTX 580 killer and accused anyone who disagreed with them of being fanbois. I kept away from the thread for that very reason..... hope is a great thing and we all have it, however reality is very different and often kicks you up the ar*e!!


----------



## LAN_deRf_HA (Dec 15, 2010)

I guess I could stand the $20 price premium over the 570 due to the memory sizes. Though honestly I'm skipping this whole gen. The lack of die shrink has made the improvements so minimal on all cards that unless you didn't own a 5000 or 400 series it's a true waste to "upgrade". I'll pick up a 28/32nm card and cross my fingers we never get shafted by another die shrink hold up again... or maybe hope that we do. Then I don't have to upgrade as often. Not like we'd miss out if improvements slowed, as long as we're getting stuck with console ports.

Side note, again a poor clocking HIS sample. Starting to think there's something off with their cards, as other reviews have pushed well beyond those numbers... not to mention in past multi card reviews here they're normally the weakest clockers.


----------



## R3DF13LD (Dec 15, 2010)

Nice review as always wiz 
btw, it's like a mobo now, eh?
Flashing the BIOS and all sorts of stuff


----------



## pantherx12 (Dec 15, 2010)

Why is everyone still bitching and moaning? This is still an epic card regardless of what brand you prefer.

The price is good as well; it should be the same price as the 570 in the UK, and from the looks of it, after the 10.12 drivers the games that were getting silly low fps should match the 570, so it should just be a feature-set buying choice after that.

Whilst I wanted a 580 beater, this is fine too. Thanks for the review, wiz!


By the way folks, remember there are other reviews on the web! I happen to prefer HardOCP's take on things.

"Both of these new AMD GPUs look to be incredible values. Never before has any GPU delivered so much gaming performance at such a low cost. If you have been waiting for the time to upgrade and are more than a generation back, that upgrade time has now come. The AMD Radeon HD 6950 and AMD Radeon HD 6970 are both tremendous values. "


----------



## pr0n Inspector (Dec 15, 2010)

looks like the 570 is a better deal, considering CUDA.


----------



## Red_Machine (Dec 15, 2010)

pantherx12 said:


> "Both of these new AMD GPUs look to be incredible values. Never before has any GPU delivered so much gaming performance at such a low cost. If you have been waiting for the time to upgrade and are more than a generation back, that upgrade time has now come. The AMD Radeon HD 6950 and AMD Radeon HD 6970 are both tremendous values. "



Sounds like fanboyism to me.  The 580 & 570 are better performers all round, no-one who purely went on performance would recommend the 6900 series.  And the cost?  Not much cheaper than a 580, and more expensive than the better performing 570.


----------



## pantherx12 (Dec 15, 2010)

Red_Machine said:


> Sounds like fanboyism to me.  The 580 & 570 are better performers all round, no-one who purely went on performance would recommend the 6900 series.  And the cost?  Not much cheaper than a 580, and more expensive than the better performing 570.



Having read more than one review, all I can say is the cards are wrecking shit for the price, dude 

Plus new drivers (you can see games having obvious problems that shouldn't be there in wiz's review, for example cards being outperformed by the 5870 and 5850); the average performance graph would actually be in the 6970's favour, not the 570's (not by much). The cards are head to head and as far as I can see are the same price in the UK.

So then it's down to feature sets, and for me the 6970 is the better choice due to crossfire scaling and eyefinity.

But whatever, I see how you post, there always seems a wee bit o' bias towards one side, which is cool, I'm not fussed; just means I won't be willing to say anything more than what I've just said  


For price ref
http://www.overclockers.co.uk/showproduct.php?prodid=GX-026-HS&groupid=701&catid=56&subcat=1752

http://www.overclockers.co.uk/showproduct.php?prodid=GX-032-PV&groupid=701&catid=56&subcat=1010

Both are the cheapest non-sale prices on the same site.

Don't just take what the review says as gospel, google some stuff for yourself as well; the $50 difference is in the US, not here.


----------



## pr0n Inspector (Dec 15, 2010)

10.12 saves the world? 
HWC disagrees.




pantherx12 said:


> For price ref
> http://www.overclockers.co.uk/showproduct.php?prodid=GX-026-HS&groupid=701&catid=56&subcat=1752
> 
> http://www.overclockers.co.uk/showproduct.php?prodid=GX-032-PV&groupid=701&catid=56&subcat=1010
> ...



So why are the UK prices *so* important? Using your logic one could simply say "... is in the UK, not here."


----------



## Red_Machine (Dec 15, 2010)

pantherx12 said:


> So then it's down to feature sets, and for me the 6970 is the better choice due to crossfire scaling and eyefinity.



Does that make up for the lack of PhysX and CUDA?  I'm sorry, but having to buy an extra card just to be able to do PhysX is not on in my opinion.


----------



## _JP_ (Dec 15, 2010)

Great review! 
Sounds disappointing, but I'll wait for mature drivers and AIB revisions.
Now, if AMD wants this card to be more competitive by January (or whenever the GTX 560 launches), they should let AIBs make modifications to the card, pronto (I'm looking at you, ASUS, MSI and Sapphire). Maybe by then the cards will be more competitive, I mean, as competitive as they should have been now.


----------



## TAViX (Dec 15, 2010)

You guys forget one thing. Even AMD stated a couple of months ago that the 6xxx series is *not* a new generation but an evolution of the previous architecture. The new arch is supposed to come next year with improved performance. So yes, I kinda expected this type of performance; I'm not disappointed because we already knew this was coming. 







Red_Machine said:


> Does that make up for the lack of PhysX and CUDA?  I'm sorry, but having to buy an extra card just to be able to do PhysX is not on in my opinion.


Now that's a typical FANBOY post...


----------



## pantherx12 (Dec 15, 2010)

Red_Machine said:


> Does that make up for the lack of PhysX and CUDA?  I'm sorry, but having to buy an extra card just to be able to do PhysX is not on in my opinion.



Yeah, I have a 9600 GT for PhysX and don't run any CUDA apps.



pr0n Inspector said:


> So why are the UK prices *so* important? Using your logic one could simply say "... is in the UK, not here."



I live in the UK, so I base my buying decisions on the price here and not in the US? Makes sense to me.


----------



## toyo (Dec 15, 2010)

I cannot believe this... why did they push this garbage onto the market after all the "we are going to release the best single GPU solution!" stuff? A 480 without CUDA/PhysX? No no no AMD, c'mon guys, what the hell were you thinking?

This is the worst release since the 3800 stuff! 

Damn, I just wish I could join the party at Nvidia right now. I bet it's pretty wild!

Thank you for the review...


----------



## bear jesus (Dec 15, 2010)

I was considering a 6950 or maybe a 6970, but the 6870 is close enough to the 6950 that I'm sure there is hardly a noticeable difference in game between an overclocked 6870 and 6950, so unless future drivers push up the performance a bit I can't really see there being any major reason to upgrade right now 

I still damn Nvidia for not adding the ability to run 3 monitors on one card with the 580; if they had, then they could have had my money.

For now I will wait and see what the 69xx cards can do with more voltage, but unless the 6950, while over-volted, can pull further away from an overclocked/volted 6870, I don't see a new GPU for me any time soon.


----------



## Tatty_One (Dec 15, 2010)

bear jesus said:


> I still damn Nvidia for not adding the ability to run 3 monitors on one card with the 580, if they had then they could have had my money.



I suppose to them it was a cost > demand thing; adding the technology would have brought the price up for all when only a minority would enjoy the benefit..... I am not suggesting that is the right way to go, just that it seems it's the way they decided to go.


----------



## bear jesus (Dec 15, 2010)

Tatty_One said:


> I suppose to them it was a cost > demand thing, adding the technology would have brought the price up for all when only a minority would enjoy the benefit..... i am not suggesting that is the right way to go, just the fact that it seems it's the way they decided to go.



True, I do understand why it was not something that was added, even more so as the 580 is really just a fully working GF100 in a way; adding extra features would have affected the time taken to get the new chips ready, and it would not have been worth any kind of delay just to add triple-monitor compatibility.

Plus I suppose triple monitor is mainly seen as a high-end thing: if you have the money for it, you have the money for another card... even if my 3 monitors combined cost a lot less than a 580 

For now I will hold on to the hope that the 6xx cards from Nvidia support 3 monitors on one card, and keep an eye on the 69xx cards for voltage-adjustment updates and to see how future drivers do with them. But after reading more reviews, I see the 6950 as a viable option for people who do not own 58xx or 68xx cards or similarly powered cards from Nvidia.


----------



## NdMk2o1o (Dec 15, 2010)

Red_Machine said:


> Does that make up for the lack of PhysX and CUDA?  I'm sorry, but having to buy an extra card just to be able to do PhysX is not on in my opinion.



A handful of games use PhysX, and it's an NVIDIA-owned API; why in the hell would it be on an ATI card? If you MUST HAVE PhysX then buy NV, though not a lot of people are that bothered with it.

As for CUDA, if you mainly use your GPU for CUDA you won't be buying a mainstream consumer GPU and would be opting for a workstation card à la Quadro instead. You have two minute moot points; what are you doing in here anyway besides trolling as usual?


----------



## hrvoje (Dec 15, 2010)

For me:
I'm a bit disappointed; I thought it would be much better vs the 5850.
Now it's probably better to buy a 2nd-hand 5850 and CrossFire, rather than upgrade to a 69xx.

btw nice review as always wizz

cheers


----------



## Fourstaff (Dec 15, 2010)

I wonder why most say that this product is a fail. It clearly is a good product at its price point, just that the performance is weaker than expected. Pre GTX 580, most of us believed that the 6970 would beat the GTX 480, and since it's just about as powerful as the GTX 480, it's only justified to say that this product is slightly disappointing, not the epic fail this thread makes it sound like.

Also, at 1920x1200, the 6950 CF scales about 70%, which is a lot better than the 5870 CF.
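By "scales about 70%" I just mean (CF fps ÷ single-card fps) − 1; here's a minimal sketch of that arithmetic, with made-up FPS numbers purely for illustration (the `cf_scaling` helper name is mine, not anything from the review):

```python
# CrossFire scaling: how much average FPS a second card adds over one card.
def cf_scaling(single_fps: float, cf_fps: float) -> float:
    """Return the scaling gain in percent, e.g. 70.0 for +70%."""
    return (cf_fps / single_fps - 1) * 100

# Hypothetical numbers, only to show the arithmetic:
# one card at 50 fps, two cards at 85 fps -> ~70% scaling.
print(round(cf_scaling(50.0, 85.0)))
```

Perfect scaling would be 100%, so ~70% at 1920x1200 is a solid result for a dual-card setup.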


----------



## cdawall (Dec 15, 2010)

I must have missed something in the review, because overall, at the resolution I play at, the card was quite good. Yes, the 570 trades blows with it, big deal; AMD cards always drop in price, and a month or so from now this card will be $250 and a great value. AMD didn't flop; the card did exactly what it's priced to do: be a smidge better than the 570 at high resolution. Heck, it trades a few blows with the 580.

NV fanboys really need to stop saying "oh look, xyz is better" and "this doesn't have CUDA" or "this doesn't have PhysX". No big deal; your card doesn't have audio out, your card doesn't have multi-display, that's comparable. This is stupid and so is the bashing. A simple redo gave 570 performance, so in theory you got beat by a souped-up 5870.


----------



## Red_Machine (Dec 15, 2010)

NdMk2o1o said:


> what are you doing in here anyway besides trolling as usual?



Trolling?  Me?  I'm contributing to a discussion.  Trolling would be like this:

"LOL AMD you FAIL!  GTX 580 is the best and that's a FACT!  NVIDIA FTW!"


----------



## yogurt_21 (Dec 15, 2010)

Performance is right about where I expected it to be for 1920x1200 and 2560x1600, but the lower resolution performance I'm sure will come up in time with drivers. 

So, slower than the 580 but typically faster than the 480; that's where I posted it would perform many times, but people seem to have gotten it into their heads that the 6970 would beat the 580, which just wasn't likely. Both the 5xx and the 6xxx series were bridge series while both sides work on the next big thing and wait for TSMC to get their sh1t together. 

So if you have a GTX 4x0 or a 5xx0 card there really isn't much incentive to go to a GTX 5x0 or 6xx0 card; these are probably more targeted at those who are still on older series or those who just like their rigs to be the latest and greatest. 

So I'm not disappointed at all; the cards performed where I expected them to, and there really isn't a single game that the 6970 can't play at high res and high detail, so what more do you want anyway?


----------



## jpierce55 (Dec 15, 2010)

Not a bad price, but I am disappointed the power efficiency wasn't better for the performance. I don't see why everybody is so upset that it's not the top-level graphics card; it is considerably cheaper than the 580.


----------



## brandonwh64 (Dec 15, 2010)

I had a feeling this is what the outcome of the 6950 and 6970 benches would be. Hopefully they are making the 6990 a beast that blows away all other cards like the 5970 did at release, but only time will tell.


----------



## NdMk2o1o (Dec 15, 2010)

Red_Machine said:


> Trolling?  Me?  I'm contributing to a discussion.  Trolling would be like this:
> 
> "LOL AMD you FAIL!  GTX 580 is the best and that's a FACT!  NVIDIA FTW!"



OK, well, maybe you need to rethink your argument on PhysX and CUDA. I could say NV doesn't even have Eyefinity or Stream. See my point?


----------



## Red_Machine (Dec 15, 2010)

True, CUDA is mostly redundant unless you're folding or GPGPU'ing.  But Eyefinity is only for ultra-enthusiasts, so it's not worth the added price or board complexity in my opinion.  I doubt I will even multi-mon with my setup as it is.


----------



## NdMk2o1o (Dec 15, 2010)

Red_Machine said:


> True, CUDA is mostly redundant unless you're folding or GPGPU'ing.  But EyeFinity is only for ultra-enthusisats, so not worth the added price or board complexity in my opinion.  I doubt I will even multi-mon as it is with my setup.



You can run Eyefinity on a 5650; they are not expensive. Gaming on one is a different story, though there are other uses for Eyefinity aside from gaming.

I couldn't care less about the latest gens from both teams; they seem to be playing catch-up with each other rather than actually doing what's best for the consumer. Again, when I can get a card that gives me double the performance of my current one (GTX 470) for the same £200 I paid, it will be time to upgrade. As it happens, anything along the lines of a GTX 460+ or ATI 5850+ is fine for gaming for at least 1-2 years.


----------



## WhiteLotus (Dec 15, 2010)

Red_Machine said:


> True, CUDA is mostly redundant unless you're folding or GPGPU'ing.  But EyeFinity is only for ultra-enthusisats, so not worth the added price or board complexity in my opinion.  I doubt I will even multi-mon as it is with my setup.



PhysX is only for the ultra-enthusiasts, and almost completely pointless, as only a handful of games use it. Why would ATI pay its rival nVidia for a licence to something that no one gives two squirts about?


I think this review is a little harsh on the card; it's still high-performing, and a reasonable price for what it offers. However, the temps are what get me.


----------



## sneekypeet (Dec 15, 2010)

I'm not trying to start anything, just want a simple answer: can these cards fold worth a fart yet? I was sort of hoping they got some lovin' this time around.


----------



## WhiteLotus (Dec 15, 2010)

sneekypeet said:


> I'm not trying to start anything, just want a simple answer....Can these cards fold worth a fart yet? Was sort of hoping they got some lovin this time around



That's a damn good question; I remember W1zz including F@H scores once upon a time.


----------



## bear jesus (Dec 15, 2010)

Red_Machine said:


> True, CUDA is mostly redundant unless you're folding or GPGPU'ing.  But EyeFinity is only for ultra-enthusisats, so not worth the added price or board complexity in my opinion.  I doubt I will even multi-mon as it is with my setup.




Why thank you, it feels nice to be called an ultra-enthusiast with my £300 triple monitor setup 



sneekypeet said:


> I'm not trying to start anything, just want a simple answer....Can these cards fold worth a fart yet? Was sort of hoping they got some lovin this time around



As far as I know, we are waiting on Stanford to release a new client, not on AMD to release a card that folds better. Cards far better than the 3850/3870 are out, yet there is next to no increase in folding performance.


----------



## newtekie1 (Dec 15, 2010)

cdawall said:


> No big deal your card doesn't have audio out your card doesn't have multi display that's comparable.



Don't have audio out?  You sure about that?

And being able to use 3 monitors with one card isn't all that great a thing when the card isn't powerful enough to actually drive all 3 monitors, so needing 2 cards with nVidia isn't really a drawback.

And the thing you missed in the review is that not everyone plays at the resolution you play at; that is why we are concerned with overall performance, and the card gets judged on overall performance rather than one specific resolution.

And yes, the cards will come down in price, but so will nVidia's cards, so that point is rather moot.


----------



## bear jesus (Dec 15, 2010)

newtekie1 said:


> Don't have audio out?  You sure about that?
> 
> And being able to use 3 monitors with one card isn't all that great of a think when the card isn't powerful enough to actually drive all 3 monitors.  So needing 2 cards with nVidia isn't really a drawback.



I think he was giving random outdated examples just to make a point, not to be exactly accurate.

But I have to say, at the moment I'm gaming at 5040x1050 on a single 6870, which suggests to me that many cards in the Nvidia range have the power, as long as you don't want things like 24xAA.


----------



## cdawall (Dec 15, 2010)

newtekie1 said:


> Don't have audio out?  You sure about that?
> 
> And being able to use 3 monitors with one card isn't all that great of a think when the card isn't powerful enough to actually drive all 3 monitors.  So needing 2 cards with nVidia isn't really a drawback.
> 
> ...



They don't have onboard audio like the ATI cards do, unless something changed while I was on hiatus. Also, a 5650 could easily drive 3 monitors; no, not in games, but for CAD and such that's a great entry-level option.

Also, I can judge at one resolution, as that's what I play at. For reviews, though, I think 1024x768 needs to bite the bullet...


----------



## newtekie1 (Dec 15, 2010)

cdawall said:


> They don't have onboard audio like the ati cards do unless something changed while I was on hiatus. Also a 5650 could easily drive 3 monitors no not in games but for cad and such that's a great entry level thing.
> 
> Also I can judge at one resolution as that's what I play at. However for reviews I think 1024x768 needs to bite the bullet...



Ummm...

Before you go on raging fanboy rants, perhaps you should make sure you know what you're talking about.

And who gives a shit if the HD5650 can drive 3 monitors?  If you aren't gaming on them, just buy two of the cheapest PCI cards you can find and save the money over an HD5650.  It is idiotic to buy a gaming card, even a low-end gaming card, just for 3-monitor support.  People have been driving 3+ monitors on $15 graphics cards for ages, and with the PCI cards you can use more widely available DVI and VGA monitors and not be forced to use a DisplayPort monitor or a DisplayPort adapter.


----------



## Red_Machine (Dec 15, 2010)

cdawall said:


> They don't have onboard audio like the ati cards do unless something changed while I was on hiatus.



They've had it since the 200 series.  At least the ones that _weren't_ based on the G92/94.


----------



## cdawall (Dec 15, 2010)

http://www.newegg.com/Product/Product.aspx?Item=N82E16814131339

Maybe you should do some as well. The HD5450 is $30 with DVI, VGA, HDMI, and Eyefinity; must be tough to find monitors to use with that. And for CAD, what $15 PCI cards are you using? I do 3D, and it kicks my X1270 and single monitor's ass...


newtekie1 said:


> Ummm...
> 
> Before you go on raging fanboy rants, perhaps you should make sure you know what your talking about.
> 
> And who gives a shit if the HD5650 can drive 3 monitors.  If you aren't gaming on them, just buy two of the cheapest PCI cards you can find and save the money over a HD5650.  It is idiotic to buy a gaming card, even a low end gaming card, just for 3 monitor support.  People have been driving 3+ monitors on $15 graphics cards for ages.  And with the PCI cards you can use more available DVI and VGA monitors and not be forced to use a DisplayPort monitor or a DisplayPort adapter.


----------



## newtekie1 (Dec 15, 2010)

cdawall said:


> PowerColor Go! Green AX5450 512MK3-SH Radeon HD 54...
> 
> Maybe you should do some aswell hd5450 $30 with dvi vga hdmi and eyefinity mustbe tough to find monitors to use that and for cad what $15 pci cards are you using I do 3d and it kicks my x1270 and single monitors ass...



Except that card can't drive 3 displays at the same time, because the 3rd display must be on a DisplayPort connector.  Come on, you're touting these technologies and you don't even know how they work or what is required?  Do some research.

From AMD's Site:


			
AMD's Site said:

> To enable more than two displays, additional panels with native DisplayPort™ connectors, and/or certified DisplayPort™ adapters to convert your monitor’s native input to your cards DisplayPort™ or Mini-DisplayPort™ connector(s), are required.



The cheapest card that would work to drive 3 monitors would be this HD5450 for $45.  But again, you have to use a more expensive DP monitor, or a $30 DP adapter to get it to work.


----------



## cdawall (Dec 15, 2010)

newtekie1 said:


> Except that card can't drive 3 displays at the same time, because the 3rd display must be on a DisplayPort connecter.  Come on, you're touting these technologys and you don't even know how they work or what is required?  Do some research.
> 
> From AMD's Site:



Yeah, I went and looked it up; my bad. Poor design IMO... oh well, still no reason the 6970 is a bad deal compared to NV's cards.


----------



## bear jesus (Dec 15, 2010)

POWERCOLOR Active DisplayPort to Single-Link DVI-D...

$27 for an active DisplayPort-to-DVI adapter is not exactly a deal breaker for most, and cheaper adapters are usable for things like VGA or HDMI.


----------



## newtekie1 (Dec 15, 2010)

bear jesus said:


> POWERCOLOR Active DisplayPort to Single-Link DVI-D...
> 
> $27 for an active display port to DVI adapter is not exactly a deal breaker for most, also cheaper adapters are usable with things like vga or hdmi.



It is when the card you're talking about buying is practically the same price.


----------



## bear jesus (Dec 15, 2010)

newtekie1 said:


> It is when the card your talking about buying is practicly the same price.



True but then if cost is that much of an issue would it not be logical to be using VGA monitors and a dirt cheap DP to VGA adapter?


----------



## newtekie1 (Dec 15, 2010)

bear jesus said:


> True but then if cost is that much of an issue would it not be logical to be using VGA monitors and a dirt cheap DP to VGA adapter?



Or forgo the whole thing and just use dirt cheap PCI cards like people have been doing for decades?


----------



## cdawall (Dec 15, 2010)

newtekie1 said:


> It is when the card your talking about buying is practicly the same price.



I agree; unless you have a DisplayPort monitor, Eyefinity on cheap cards is stupid. Most boards I buy have multiple PCI-E slots anyway, so I would just get another 5450 if I wanted multi-display. I can't buy a PCI card; they are a waste, PCI is dying...


----------



## newtekie1 (Dec 15, 2010)

cdawall said:


> I agree unless you have a display port monitor the eyefinity on cheap cards is stupid...most boards are multi pci-e at least what I buy so I would get another 5450 if I wanted multi display I just can't buy a pci card they are a waste pci is dieing...



Very true, even if the board didn't have two PCI-E x16 slots, I think I'd buy two PCI-E cards and just dremel one down to fit in one of the PCI-E x1 slots.


----------



## bear jesus (Dec 15, 2010)

Sorry, I misunderstood; I thought the point being discussed was using GPU-accelerated 3D modeling across multiple monitors, hence the use of a higher-powered GPU.


----------



## crow1001 (Dec 15, 2010)

Excellent and honest review.


----------



## Red_Machine (Dec 15, 2010)

newtekie1 said:


> Very true, even if the board didn't have two PCI-E x16 slots, I think I'd buy two PCI-E cards and just dremel one down to fit in one of the PCI-E x1 slots.



You can modify a PCIe x16 card to fit in a PCIe x1 slot?


----------



## bear jesus (Dec 15, 2010)

Red_Machine said:


> You can modify a PCIe x16 card to fit in a PCIe x1 slot?



Yes, but you have to hurt something: either the card or the PCI-E slot has to be damaged by removing some plastic or PCB.


----------



## Red_Machine (Dec 15, 2010)

Makes sense, I suppose.  People have modded 3dfx cards (which only used AGP 2x) to work with AGP 4x/8x slots.


----------



## animal007uk (Dec 15, 2010)

Can't be bothered to read all the posts, but I like what I see; early drivers and all that, too. This card might get some good improvements later on. It beat the 580 in a few tests at high res, so that's not too bad, and to be honest I never expected this card to be faster; it sits where I thought it would.

I'm going to wait till after Xmas and see what happens with drivers and such, then I might just buy one. Nvidia might be faster, but that doesn't make me want to go buy one for a few extra FPS I can probably gain from a small overclock on an ATI card (oh, sorry, that should be AMD).


----------



## cherubrock (Dec 15, 2010)

cherubrock said:


> Can someone please clarify why some performance numbers vary between when you reviewed the card, and when you're comparing against them? Specifically, when you reviewed the 6850 at 1920 on Metro, it got 18.5fps. But when it's compared in your new review, it shows as 13.4fps.
> 
> Here's the 6850's review (the 6850's 18.5fps is in blue):
> 
> ...



Can someone please answer this question?

I'm not asking for speculation on what you think, or your opinion, etc; I'd like to know the real reason why the same card is showing different FPS on different reviews.


----------



## bear jesus (Dec 15, 2010)

W1zzard said:


> from the review test setup page, which you might have missed: "Benchmark scores in other reviews are only comparable when this exact same configuration is used."
> 
> I have added a version number to the test system. right now it's "Test System - VGA Rev. 12", look at that when trying to compare stuff.
> 
> for the case of metro 2033: rebench, different level, game patched, new drivers, using benchmark tool, different settings



It was answered by w1zzard the writer of the review.


----------



## Red_Machine (Dec 15, 2010)

W1zz answered that earlier.  I think he said the system he used was slightly different for the second review.  And it may have had different drivers, too.


----------



## Gjohnst4 (Dec 15, 2010)

I was hoping this release would drive down 5870 prices, but I figure a lot of people will just hold on to them, considering the boost is not worth the cash. Guess I will have to shell out for crossfire =/


----------



## Josh154 (Dec 15, 2010)

Wow! It seems to me the GTX 570 competes with the 6970, and the 570 is cheaper too! AMD dropped the ball on this one. And as soon as the 265.90 drivers hit desktop GPUs, you just wait!


----------



## Assimilator (Dec 15, 2010)

Oh dear. These aren't bad cards, but after the greatness that was the 5000 series they are quite a disappointment. Looks like my 5850 is going to be with me until the GTX 600/HD 7000 series arrive.



bear jesus said:


> Yes but you have to hurt it, either the card or the pci-e slot has to be damaged by removing some plastic or PCB.



Rather trim the end of the PCI-e slot on the mobo, that way the card will still run at full x16 speed if you put it into a full-length slot later.


----------



## cdawall (Dec 15, 2010)

Assimilator said:


> Oh dear. These aren't bad cards, but after the greatness that was the 5000 series they are quite a disappointment. Looks like my 5850 is going to be with me until the GTX 600/HD 7000 series arrive.
> 
> 
> 
> Rather trim the end of the PCI-e slot on the mobo, that way the card will still run at full x16 speed if you put it into a full-length slot later.



$30 video card or $150 mobo: which do you want to not be able to RMA?


----------



## Nokiacrazi (Dec 15, 2010)

Hmm...Crossfire 6870 or 6950...


----------



## HalfAHertz (Dec 15, 2010)

This release is a hard one to judge properly. On one side, the performance increase compared to last gen is similar to Nvidia's. The big difference, however, is that Nvidia got their increase without a die-size increase while also lowering their TDP.
I do realize that Cayman is supposed to be a stepping stone and will have some downsides, but I really hope we see improvements soon.
From AT's review:


> The bad news is that it means many of AMD’s VLIW5-centric shader compiler tricks are no longer valid; at the start shader compiler performance is going to be worse while AMD learns how to better program a VLIW4 design. The good news is that in time they’re going to learn how to better program a VLIW4 design, meaning there’s the potential for sizable performance increases throughout the lifetime of the 6900 series. That doesn’t mean they’re guaranteed, but we certainly expect at least some improvement in shader performance as the months wear on.
> 
> On that note these VLIW changes do mean that some code is going to have to be rewritten to better deal with the reduction of VLIW width. AMD’s shader compiler goes through a number of steps to try to optimize code, but if kernels were written specifically to organize instructions to go through AMD’s shaders in a 5-wide fashion, then there’s only so much AMD’s compiler can do. Of course code doesn’t have to be written that way, but it is the best way to maximize ILP and hence shader performance.
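The packing problem the quote describes can be sketched with a toy model (my own illustration, not AMD's actual shader compiler): independent ops can share one issue bundle up to the machine width, so a kernel hand-tuned to present five independent ops per clock wastes a slot, and costs an extra cycle, on a 4-wide design.

```python
# Toy model of VLIW instruction packing: independent ops can share a
# bundle (one issue cycle); a dependency or a full bundle forces a new
# one. A deliberate simplification, not AMD's real compiler algorithm.

def pack_bundles(ops, width):
    """ops: list of (name, deps) where deps is a set of earlier op names.
    Greedily fills bundles of at most `width` independent slots."""
    bundles = []
    current, current_names = [], set()
    for name, deps in ops:
        # New bundle if this op depends on one already in the current
        # bundle, or the current bundle has no free slot.
        if deps & current_names or len(current) == width:
            bundles.append(current)
            current, current_names = [], set()
        current.append(name)
        current_names.add(name)
    if current:
        bundles.append(current)
    return bundles

# Five independent multiply-adds fill one 5-wide bundle but need two
# 4-wide bundles -- the kind of VLIW5-tuned kernel the quote says must
# be reorganized for VLIW4.
kernel = [(f"mad{i}", set()) for i in range(5)]
print(len(pack_bundles(kernel, 5)))  # 1 issue cycle on VLIW5
print(len(pack_bundles(kernel, 4)))  # 2 issue cycles on VLIW4
```

The flip side, per the quote, is that once code is organized in 4-wide groups, the narrower design wastes fewer idle slots on kernels that can't sustain 5-way ILP.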


----------



## Robert-The-Rambler (Dec 15, 2010)

*I really don't get this launch*

Now, if this is preparation for doubling performance later next year, then I guess you have to take the PR hit now, when the cards simply don't do much better than the 5800 series and the 5970. The new shader structure will probably be the basis for higher performance in the future, but right now it makes no sense for me to buy an AMD graphics card when there are Nvidia cards with similar performance at a similar price plus PhysX support, which I do use in several games. I wonder if AMD's new way of doing things will lead to higher minimum frame rates; in my experience Nvidia has traditionally had an advantage in smoothness due to higher minimums. The GTX 460 has leaned my allegiance more towards Nvidia, and the experience has been great. I still have a lot of AMD 4800 series cards in play, but my i7s will more than likely be housing Nvidia, possibly in SLI if there is any need for higher performance.


----------



## Steevo (Dec 15, 2010)

I believe this is the first true cooperative effort from AMD/ATI to build a core that is an extension of the CPU for handling more tasks.

Perhaps we will see some increase in performance, but it looks to be clock limited; I'm sure ATI was aiming for much higher clocks but failed to achieve them due to the constraints of 40nm. If it clocked like many 5870s do with a little more voltage, this card would be a huge win.


----------



## HalfAHertz (Dec 15, 2010)

I think that it fails at clocking because of the power limiter


----------



## Super XP (Dec 15, 2010)

Great review once again, Wizzard. Though this time I do have a minor beef with your final thumbs down. I've seen it before in other reviews, but anyway, here it is.
The performance increase IMO is not all that disappointing, and with newer drivers coming down the pipeline we should see further improvements. Overall, the price justifies the performance very nicely indeed. But that's not my beef.

It's not a disappointment to *not* have NV's CUDA or PhysX support; it's not a disadvantage for ATI Radeon owners. What we need is open-standard physics and compute so that we may all benefit. NVIDIA is not willing to work with AMD or Intel on such a thing, so all three will continue their separate ways, with gamers paying the price.

 •*No support for CUDA / PhysX*
If you can omit this line from your future ATI/AMD Radeon reviews it would be nice; if not, no problem.


----------



## horik (Dec 15, 2010)

Good price here in Spain, and the card is available; the 6950 is at 286 €.
http://www.coolmod.com/list/1772/4/2/1/TARJETAS-GRAFICAS.htm


----------



## Benetanegia (Dec 15, 2010)

horik said:


> Good price here in spain      View attachment 39558



Nice price indeed, but from what I see it's the only store carrying those prices. Alternate sells them for 360-380 euros. That's probably a limited deal; it won't last long.


----------



## Black Panther (Dec 15, 2010)

Great review as always, but I'm just not impressed by the card. I was expecting something even better than the 5970, and which would also beat the 580.


----------



## bear jesus (Dec 15, 2010)

Super XP said:


> Great review Wizzard once again. Though this time I do have a minor beef with your final thumbs Down! I’ve seen it B4 with other reviews but anyway here it is.
> The performance increase IMO is not all that disappointing and with newer drivers coming down the pipeline we should see better performance improvements. Overall, The price justifies the performance very nicely indeed. But that’s not my beef.
> 
> It’s not a disappointment to “NOT” have NV’s CUDA or PhysX support, it's not a disadvantage for ATI Radeon owners. What we need is OPEN STANDARD Physics and CUDA so that we may all benefit from it. NVIDIA is not willing to work with AMD or Intel to do such a thing, so all three will continue their separate ways with gamers paying the price.
> ...



My only thought on that is that the Nvidia reviews could say "no Eyefinity support" instead of "Still limited to two active display outputs per card."


----------



## Assimilator (Dec 15, 2010)

cdawall said:


> $30 video card or $150 mobo which do you want to not be able to rma?



Point taken.


----------



## Robert-The-Rambler (Dec 15, 2010)

*PhysX actually has a purpose*



bear jesus said:


> My only thought on that is in the Nvidia reviews it could say no eyefinity support instead of "Still limited to two active display outputs per card"



Eyefinity on the other hand.......


----------



## btarunr (Dec 15, 2010)

Guys, let's not go through a general NV vs. ATI discussion. Save time, save kilobytes on TPU's SSD, help prevent climate change.


----------



## Benetanegia (Dec 15, 2010)

bear jesus said:


> My only thought on that is in the Nvidia reviews it could say no eyefinity support instead of "Still limited to two active display outputs per card"





Robert-The-Rambler said:


> Eyefinity on the other hand.......



I think it's more along the lines of multi-monitor; both brands have it, and the only real advantage* of Eyefinity is that it can do more than 2 outputs from a single card. Considering that one card is not usually fast enough to play new games on 3 monitors**, it's almost a moot one.

On the other hand, PhysX is the only GPU physics engine in actual use so far, and AMD has none, hence it's a clear advantage.

*And afaik Nvidia's Surround has some benefits over Eyefinity, like bezel correction. Correct me if I'm wrong; that's what I hear.
** I'm waiting to see if Cadaveca tries the HD6900 series, but after this review I don't have high expectations.


----------



## qubit (Dec 15, 2010)

Jeffredo said:


> Wow.  I bet Charlie Demerjian is having a coronary right now trying to figure a way to spin this.



Here he is. This is the very last bit of the conclusion:



> How well does it perform? That is a question to be answered in our performance testing and review, something that is being wrapped up as we speak. *The short story is that AMD took the unequivocable high end lead with dual Cypress/Hemlock/HD5970 and hasn't looked back. Even the recent Nvidia GTX580 can't beat that year old card, and the dual Cayman/Antilles/HD6990 are just around the corner. Game over until 28nm late next year.*



http://www.semiaccurate.com/2010/12/14/look-amds-new-cayman6900-architecture/

Hmmm, yeah, right. Still no review up on SemiAccurate... I wonder why?


----------



## bear jesus (Dec 15, 2010)

Benetanegia said:


> I think it's more on the lines of multi monitor, both brands have it and the only real advantage* to Eyefinity is that it can do more than 2 outputs from a single card, and considering that one card is not usually fast enough to play new games on 3 monitors**, it's almost a moot one.
> 
> On the other hand PhysX is the only GPU physics engine in actual use so far and AMD has none and hence it's a clear advantage.
> 
> ...



I think I have been misunderstood; I was joking.

The whole no-CUDA/PhysX thing in AMD/ATI reviews has been explained many times before.

But Eyefinity does have bezel correction.


----------



## qubit (Dec 15, 2010)

AndreiD said:


> What will newer drivers do? Drivers can't add that much performance anyway so idk why so many people are bitching about them.
> I've yet to see a driver which adds 10% performance to a card. If the card turned out crappy, it will be the same with new drivers too.
> I'm sure drivers will add 200 more sp so it can beat the 580.
> [/rage against people crying about drivers]



Yep, +1. That's why they are so much cheaper than the 580; they will never be able to outperform it. This is another HD 2900 lemon.

I've got an HD 2900 XT and an 8800 GTX, and with the latest drivers nvidia still walks all over ATI. Game over.


----------



## Robert-The-Rambler (Dec 15, 2010)

*I must rebuke*



btarunr said:


> Guys, let's not go through a general NV vs. ATI discussion. Save time, save kilobytes on TPU's SSD, help prevent climate change.



This discussion was in direct relation to the review at hand and how it mentioned that not having PhysX support was a disadvantage for an AMD video card.

Don't you know a joke when you hear one!!! Get me outta here!!!!!  Moving on......


----------



## Benetanegia (Dec 15, 2010)

qubit said:


> Here he is. This is the very last bit of the conclusion:
> 
> 
> 
> ...



I don't know what you guys are talking about. Charlie is always right. ALWAYS. You listen to me? OK... well, he is sometimes wrong, but he's right 90% of the time, and for the most... bla bla bla. j/k

Will people ever stop listening to his BS? Probably not. He will never link to articles where he was wrong (he may even edit them; he's done that in the past) and will concentrate on the few in which he was right. He will continue making vague claims with a 50% chance of being right, he will eventually nail it again on a relevant chip, and people will believe him again. :shadedshu

The funny thing is how Fuad has been almost completely right in the last couple of weeks.
It's been a definitive win for Fuad over Charlie.

EDIT: It's funny to read the comments from the drones on SA, btw. Some are acting as if nothing happened, others are claiming bad behavior and bias from ALL the reviewers around the world, and plenty more have even more elaborate and imaginative theories. Charlie = mute. As unreliable as it is, SA surely is a fun site to read.


----------



## Kovoet (Dec 15, 2010)

Think I will stick to my 5970


----------



## Black Panther (Dec 15, 2010)

Kovoet said:


> Think I will stick to my 5970



+1, and also another reason why I personally don't like AMD's current naming scheme: regular PC users would just think _oh look, a *6*970 must be better than a *5*970..._ :shadedshu


----------



## Benetanegia (Dec 15, 2010)

Black Panther said:


> +1 and also for another reason why I don't personally like amd's current naming scheme: regular pc users would just think _oh look a *6*970 would be better than a *5*970..._ :shadedshu



Definitely, the naming scheme does not fit at all now. Pure marketing spin. I remember what the "excuse" or rationalisation was for the HD68xx: "The 8 is the new number for performance parts, while the 9 is for high-end." It made sense because Cayman was supposed to be so much faster than the HD5870 that it would be totally legit to call it a new generation and bump the series number. I never bought that reasoning, even if Cayman had been 50% faster than Cypress, but let's say, fair enough.

Then... what now? How can the new generation naming be justified? How can the new generation be justified? Cayman is a slightly new architecture*; Barts isn't. The performance is not there to justify a generation jump, just like there's none for the GTX5xx.

*The difference between the X1800 and X1900 was even more pronounced, iirc, and didn't warrant a new generation, because things made more sense back then, I suppose.

The names should have been:

Barts Pro: HD5840
Barts XT: HD5860
Cayman Pro: HD5880 (with some reservations)
Cayman XT: HD5890

And right now there's no way of spinning it that would make me think it should have been different. For what it's worth, Cayman wouldn't even deserve the HD5930 and HD5950 names, let alone HD6950/70.


----------



## OneCool (Dec 15, 2010)

Bummer. I really thought it would have been faster than that :shadedshu


Looks like the GTX570 is going to be the sweet spot.


----------



## ERazer (Dec 15, 2010)

Great review, W1zz.

Guess I'm gonna be on the green side this time around (been on the red team since the 3850), or I'll just wait till next generation.


----------



## HTC (Dec 15, 2010)

Kovoet said:


> Think I will stick to my 5970



Really?

Apart from the 5970 being one card instead of two (it still has internal CF), and the 580 not having scaling issues to worry about because it's the most powerful single-GPU card to date, let's look at W1zzard's findings @ 2560x1600, shall we:







In AvP, 6950 CF has 48.5% over 5970 and 71.9% over 580






In Bad Company 2, 6950 CF has 24.8% over 5970 and 62.9% over 580






In Battleforge, 6950 CF has 10% over 5970 and 15% over 580






In Call of Duty 4, 6950 CF has 3.6% over 5970 and 56.3% over 580






In Call of Juarez 2, 6950 CF has 20.8% over 5970 and *-30.7%* over 580






In Crysis, 6950 CF has 36.9% over 5970 and 77.6% over 580






In Dawn of War 2, 6950 CF has 22.3% over 5970 and 75.1% over 580






In Far Cry 2, 6950 CF has 45.4% over 5970 and 36.9% over 580






In HAWX, 6950 CF has 43.6% over 5970 and 33.2% over 580






In Metro 2033, 6950 CF has 35.1% over 5970 and 32.3% over 580






In Riddick: Dark Athena, 6950 CF has 21.1% over 5970 and 51.6% over 580






In STALKER - Clear Sky, 6950 CF has 37% over 5970 and 42.3% over 580






In Supreme Commander 2, 6950 CF has 2.2% over 5970 and 9.6% over 580






In Unreal Tournament 3, 6950 CF has 13.8% over 5970 and 47% over 580






In 3DMark 03, 6950 CF has 21.8% over 5970 and 84.3% over 580






In 3DMark 05, 6950 CF has 6.2% over 5970 and 7.7% over 580






In 3DMark 06, 6950 CF has 5.4% over 5970 and 19.1% over 580






In Unigine Heaven 2.0, 6950 CF has 96.8% over 5970 and 77.9% over 580


6950 CF is $600: that's 3.4% more than a 5970 and 20% more than a 580. But how much better than a 5970 and a 580 is it? Shall we add up?

*Not counting the 3DMark benches AND the Unigine bench: 6950 CF has 26.1% over the 5970 and 41.5% over the 580

Not counting the 3DMark benches only: 6950 CF has 30.8% over the 5970 and 43.9% over the 580

Counting everything: 6950 CF has 27.5% over the 5970 and 42.8% over the 580*

That's 27.5% more performance costing 3.4% more, or 42.8% more costing 20% more: does that sound bad to you? To put it into perspective, the 6970 costs 23.3% more than the 6950 and only brings ~12% better performance.


Unless I made some mistake, I didn't arrive at the same conclusion as W1zzard:







I can't do the same for 6970 CF because W1zzard hasn't put it up yet, and I'm almost sure its relative gains over the 5970 and 580 won't be as good as 6950 CF's, because it costs much more and that little detail will skew everything against 6970 CF (guessing, for now).

*This comparison is only for 2560x1600: for other resolutions 6950 CF will NOT be as good.*


----------



## Benetanegia (Dec 15, 2010)

HTC said:


> That's 27.5% more power costing 3.4% more, or 42.8% more power costing 20% more: does that sound bad to you? To put it into perspective, the 6970 costs 23.3% more than the 6950 and only brings ~12% better performance than the 6950.



Why are you quoting him?
It makes your point moot. It's not 27.5% more power costing 3.4% more or 42.8% more power costing 20% more. For him it's 27.5% faster costing $600 more. He already has the HD5970. 

Other than that, you are disregarding power consumption and noise, especially noise, and also physical space inside the case. Performance scaling is not all that counts.


----------



## HTC (Dec 15, 2010)

Benetanegia said:


> Why are you quoting him?
> It makes your point moot. It's not 27.5% more power costing 3.4% more or 42.8% more power costing 20% more. *For him it's 27.5% faster costing $600 more. He already has the HD5970.*
> 
> Other than that, you are disregarding power consumption and noise, especially noise, and also physical space inside the case. Performance scaling is not all that counts.



You're right, hehe 

As for noise and temps, I can't compare because W1zzard didn't do that part in the review, but you're right about the physical space inside the case.


----------



## [H]@RD5TUFF (Dec 15, 2010)

OMG it's a giant piece of fail. . . .  shocker .. .


----------



## cdawall (Dec 15, 2010)

[H]@RD5TUFF said:


> OMG it's a giant piece of fail. . . .  shocker .. .



as were all of the nvidia based intel boards but someone has to buy them :shadedshu


----------



## MxPhenom 216 (Dec 15, 2010)

mdsx1950 said:


> Bang for the buck FTW!
> 
> With the new drivers the cards will perform better for sure.



yeah, if ati can get their ass into gear with new drivers. lately their drivers have been garbage, and nvidia still has room for improvement in theirs. Oh, and the 570/580 overclock better than these cards as well, which is a big selling point to me.


----------



## Volkszorn88 (Dec 15, 2010)

nvidiaintelftw said:


> yeah, if ati can get their ass into gear with new drivers. lately their drivers have been garbage, and nvidia still has room for improvement in theirs. Oh, and the 570/580 overclock better than these cards as well, which is a big selling point to me.



Anything with nvidia on it is a big selling point for you.


----------



## dr emulator (madmax) (Dec 15, 2010)

nice to see my card's still in 6th place 

well, 6th on the first page


----------



## MxPhenom 216 (Dec 15, 2010)

Volkszorn88 said:


> Anything with nvidia on it is a big selling point for you.





only on cards. basically i see that Nvidia > ati with the unexpected 580


I mean, I've owned an HD5870 and it was okay till ati's drivers started to become garbage! So then I switched to a 470 and haven't had problems yet, and my 470 overclocks way better than my 5870 did, and I love to overclock. It's where I get most of my fun out of new parts before I go into gaming.


----------



## springs113 (Dec 16, 2010)

i don't know if this was mentioned/discussed...but these cards have a dual BIOS...what are the possibilities of this being a cut-down card? because it still seems like the shader count should be more...almost like unlocking a tri-core processor into a quad?

everyone also forgets that the 5970 = 5850 x2 and not 5870 x2...with that being said, let's say for argument's sake: what if the 6990 is a full 6970 (with the assumed 1900+ shaders) and not the actual 6970 that is currently out? reading most of these reviews seems to indicate that amd is hiding/holding something back with this (new) design.

personally i am a little disappointed but i am a little excited too; it feels as if this is a stopgap for the 7000 series. after reading anandtech's review it looks like amd is really holding back.


----------



## johnnyfiive (Dec 16, 2010)

Wow.... I think I speak for 95% of TPU when I say... shitty.

Great review wiz!


----------



## CDdude55 (Dec 16, 2010)

cdawall said:


> as were all of the nvidia based intel boards but someone has to buy them :shadedshu



I both agree and disagree with that: while they were plagued with problems, my 680i board has been solid and stable for around three years now.


----------



## MxPhenom 216 (Dec 16, 2010)

CDdude55 said:


> I both agree and disagree with that: while they were plagued with problems, my 680i board has been solid and stable for around three years now.



my friend had the 790i Ultra PWM by evga and it was an amazing board as well!


----------



## RONX GT (Dec 16, 2010)

Damn, all I can say now is "I CAN'T WAIT FOR THE 7XXX TO SHOW UP".


----------



## Wshlist (Dec 16, 2010)

*OpenCL not tested*

I hate to be negative towards techpowerup, but I'm baffled why there are no benchmarks run for OpenCL, and seeing the GPU-Z shot, you did not even install the OpenCL driver. WTF guys? It is late 2010, you know; why run 3DMark03 but not test the OpenCL performance differences? And I'm pretty sure OpenCL will become more important soon as developers start to employ it; it's an accepted standard that works cross-platform and cross-OS, after all.

And you can even install the AMD OpenCL driver on systems without hardware support, since it will then use the CPU instead, even if that CPU is Intel, btw.

And yes there are some OpenCL benchmarks available now.

As for the results of the test, I too am a bit let down, but at least the price is not in GTX 580 territory, which is way too high for a mere graphics card IMHO. And yeah, the 570 is beating it sometimes, but at the now-standard resolutions of at least 1920 it is about even, I think?
But some reviewers scoff that nobody has the 2000+ resolutions. Since monitors are relatively cheap, I think you should look at it as a way to play multi-monitor, and then the 6970 is a competitor. The problem, though, is that it's probably too weak to run recent games at HD resolution over 3 monitors, so you'd have to reduce the resolution, which is a pity.
What we need is a way to have a central monitor at full HD res and two side monitors that simultaneously run at lower res on the same game (which is coincidentally how real simulators for flight training used to do it: higher res at the focal point of the pilot).

I don't think Crossfire is a fix, BTW: it's too fraught with issues, and it draws insane power and heats up too much, which requires ridiculous amounts of fans or complex, expensive watercooling, or it will drive the case temps to unmanageable levels, all for a buggy experience.


----------



## Hayder_Master (Dec 16, 2010)

great review w1z, as always
expected performance after we saw the old 6850 review, but i only feel happy with the new improvements in tessellation 
can't touch the gtx580 anyway
congrats ATi, your new toy failed


----------



## Fourstaff (Dec 16, 2010)

Wshlist said:


> I hate to be negative towards techpowerup, but I'm baffled why there are no benchmarks run for OpenCL, and seeing the GPU-Z shot, you did not even install the OpenCL driver. WTF guys? It is late 2010, you know; why run 3DMark03 but not test the OpenCL performance differences? And I'm pretty sure OpenCL will become more important soon as developers start to employ it; it's an accepted standard that works cross-platform and cross-OS, after all.
> 
> And you can even install the AMD OpenCL driver on systems without hardware support, since it will then use the CPU instead, even if that CPU is Intel, btw.
> 
> ...



Well, I believe W1zzard has already stated some time ago that unless someone sponsors him some more monitors, we will have to settle for 1920x1200 and 2560x1600. Lots of people still game at 1680x1050 and below; not everybody can afford hardware upgrades all the time. Even Steam, the gamer-centric shop, still reports masses of Intel graphics, which is too weak to power 1920x1080.

As a TPU regular, I have seen a few members with Crossfire and the like, and their opinion is that Crossfire can be a bitch at times but works most of the time, so I believe your criticism of Crossfire is a bit skewed. It does not draw insane power: indeed, if you crossfire two 5770s, the power draw will still be less than a GTX 470 while providing performance within 5% of the GTX 470. 

3 monitors is a luxury very few can have, so it's justified that it's left out, seeing that there will be no competition from Nvidia anyway (they would need to use 2 graphics cards, and if you are not pro-Crossfire, then I would assume you don't like SLI either). 

I am not too sure about the OpenCL, but W1zzard tends to add things bit by bit as he goes along reviewing things, and right now he has all the benches in 3DMark03, so it's a good base for comparison. I would like to have some OpenCL benches up though, of course, once he deems it necessary/desirable. 

W1zzard's reviews might not be the best in the world, they might not be perfect, but they are consistent, which is the most important criterion when you compare things.


----------



## Xaser04 (Dec 16, 2010)

springs113 said:


> everyone also forgets that the 5970 = 5850 x2 and not 5870x2



A HD5970 *IS* HD5870 x 2. It was clocked at HD5850 clocks to keep it within the 300 W TDP PCI-e spec. 

The cooler on the card is capable of dealing with 400 W of heat, so it's easy to see what ATI had in mind...

I am slightly baffled why 1024x768 is still included in these reviews, along with 3DMark03 and 05. All these do is cloud the overall averages. Removing them would reduce the workload without any detrimental effect on the actual review. 

As for Eyefinity or Surround being a luxury... you can pick up 3 22" 1680x1050 monitors for around £300. Whilst I can appreciate why this is not included in the standard reviews, it isn't exactly out of reach for most enthusiasts.


----------



## H82LUZ73 (Dec 16, 2010)

Fourstaff said:


> Well, I believe W1zzard has already stated some time ago that unless someone sponsors him some more monitors, we will have to settle for 1920x1200 and 2560x1600. Lots of people still game at 1680x1050 and below; not everybody can afford hardware upgrades all the time. Even Steam, the gamer-centric shop, still reports masses of Intel graphics, which is too weak to power 1920x1080.
> 
> As a TPU regular, I have seen a few members with Crossfire and the like, and their opinion is that Crossfire can be a bitch at times but works most of the time, so I believe your criticism of Crossfire is a bit skewed. It does not draw insane power: indeed, if you crossfire two 5770s, the power draw will still be less than a GTX 470 while providing performance within 5% of the GTX 470.
> 
> ...



He also does not know that two 6950s in Crossfire beat GTX 570s in SLI and the GTX 480, and that two 6970s in Crossfire are dead even with GTX 580s in SLI. Go read some more reviews, please.


At first I was upset that the cards did not perform, but AMD is a manufacturer that sells cards: why sell one at $599 when you can sell two at $349? They in turn just made it very cheap to run Crossfire, and they also promote the Eyefinity look: two 6950s will cost you the same as one GTX 580, and you get the same performance out of them. And for all we know, these 6970s could get a BIOS update from AMD to enable the 1920 shaders... I'm buying two of them in a week, for the reasons that my board does not do SLI and I like AMD's way of just plugging in my HDMI cable for sound (DTS-HD Master and Dolby TrueHD) through my AV receiver. I like to game and watch Blu-ray on my PC; Nvidia has it also, but you have to run this cable to that...


----------



## W1zzard (Dec 16, 2010)

Wshlist said:


> why there are no benchmarks run for OpenCL



which opencl consumer application would you suggest?


----------



## Mr McC (Dec 16, 2010)

When I bought my 5870, I bought it with the intention of skipping this generation. This review now allows me to smugly affirm that I was correct to do so.

I don't know whether AMD or forum hype is to blame, but certainly everybody was expecting much more from these cards and I can't help feeling that, given the lead established over Nvidia with the 58xx series, the 6xxx series is, for the most part, a missed opportunity.

Thanks for the review Wizz.


----------



## bear jesus (Dec 16, 2010)

Xaser04 said:


> As for Eyefinity or Surround being a luxury... you can pick up 3 22" 1680x1050 monitors for around £300. Whilst I can appreciate why this is not included in the standard reviews, it isn't exactly out of reach for most enthusiasts.



I really have to agree with that, even more so as I'm running 3 1680x1050 monitors that cost about £100 each. I already had one of the exact same kind, so it was a £200 upgrade, and when many people on here are spending more than that on a GPU alone, often even double, I think it is a relatively cheap upgrade.

Also, I have to say it's the best upgrade I have ever done, excluding the 4MB Voodoo card I got in the mid 90's. Trying to play a game with a single monitor now feels like I have one eye closed and the remaining one half closed, which is why I recommend it as the best upgrade to everyone I game with. None of those people are enthusiasts; they just like PC gaming, and most don't have a lot of cash to spare (mainly students), and now many of them are looking at good-value monitors as a future upgrade.

Triple monitor setups may be a luxury, but they are far from enthusiast-only territory when talking about lower-res monitors, even more so when thinking about, say, 3 19" 1280x1024 monitors.

I think as time goes on, triple monitor resolutions will become more of a normal thing in reviews, even if it's only 5040x1050 and/or 5760x1080.


----------



## Wshlist (Dec 16, 2010)

W1zzard said:


> which opencl consumer application would you suggest?



Now come on, read the rest where I say:
"And I'm pretty sure OpenCL will become more important soon as developers start to employ it, it's an accepted standard that works cross platform and cross-OS after all."

And I found a site that has several DirectCompute and OpenCL tests now:
http://www.rage3d.com/reviews/video/amd_hd6970_hd6950_launch_review/index.php?p=7

Which nicely shows that there are some benchmarks out and that interest is growing.


----------



## Wshlist (Dec 16, 2010)

Fourstaff said:


> ...
> 3 monitors is a luxury a very select few can have
> ....



It's amusing that in the same breath people suggest Crossfire, needing 2 expensive cards (and a fat PSU and a high energy bill), but then claim you can't afford 2 more monitors that you can get for half the price of a single GTX 580, for instance. So make up your mind: are the people poor or wealthy?

Monitors go for prices like $150 and lower these days, and it seriously is not the issue it used to be.

As for Crossfire, I heard a LOT of bitching about it, from annoying micro-stutters to not speeding up anything because of lacking profiles, and when I read the driver release notes, AMD adds a good deal of pain themselves with weird bugs. (Note that SLI is also not smooth sailing, but we are talking AMD here.)

And it's a bit strange to retort to my Crossfire remark, which related to running several HD monitors in a modern game, with a reply suggesting low-end cards in Crossfire; that's not going to do anything except give you the Crossfire issues on top of lacking power in that scenario.

Thanks for all the replies though guys and for reading my post


----------



## Fourstaff (Dec 16, 2010)

Wshlist said:


> Now come on, read the rest where I say:
> "And I'm pretty sure OpenCL will become more important soon as developers start to employ it, it's an accepted standard that works cross platform and cross-OS after all."
> 
> And I found a site that has everal directcompute and opencl tests now:
> ...



Don't worry, W1zzard did not jump straight into benchmarking everything in DX11 when it first started, and even now he is slowly easing DX11 into the benchmarks. If he deems OpenCL worthy of his time, then he will do it. Reviewing so many graphics cards while keeping the rowdy forum members happy takes a lot of time, and he needs a life too.



Wshlist said:


> As for crossfire, I heard a LOT of bitching about it, from annoying micro-stutters to not speeding up anything because of lacking profiles, and when I read the driver release notes AMD adds a good deal of pain themselves with weird bugs. (Note that SLI is also not smooth sailing, but we are talking AMD here)



Well, you did not hear from the people whose Crossfire ran right, did you? While it's true that people can afford to get a GTX 580 and stuff like that, if you take a general consensus in the forums (this one, for example), you will notice that most people are running mid-to-high-end graphics cards. If you think the 5770 is rather low end: well, when it was released it was a solid mid-range graphics card. Perhaps I should have used 6870s in Crossfire as the example, because they handily beat the GTX 580 by a good margin and consume less power too.

You might think that I would run Crossfire only with the most expensive cards and so on, but many people forget that Crossfire is a very handy tool if you cannot afford a kickass graphics card in one go. You can get a mid-range, value-for-money graphics card like the 6850 and then add another one in a year or two's time to up your graphics power, and that is a better decision than, for example, getting a GTX 580 in a year's time while continuing to game with your 9800GTX or something in the meantime.


----------



## Wshlist (Dec 16, 2010)

Fourstaff said:


> Don't worry, W1zzard did not jump straight into benchmarking everything in DX11 when it first started, and even now he is slowly easing DX11 into the benchmarks. If he deems OpenCL worthy of his time, then he will do it. Reviewing so many graphics cards while keeping the rowdy forum members happy takes a lot of time, and he needs a life too.



Oh, I'm not seriously mad at him; I'm just reminding him and showing that there is interest from 'the public' in these things. Both AMD and Nvidia have spent a lot of design effort on the compute element of their cards in the last few years; it would seem they consider it significant, and we might respect that a bit.
And there are actually already games that use DirectCompute, I understand.

Oh, one more thing: I thought he didn't install the OpenCL driver since the GPU-Z screenshot had the box unchecked, but that seems to be a bug in GPU-Z, since the other site that did test OpenCL also has that box shown unchecked in GPU-Z. So my bad for misinterpreting that.


----------



## springs113 (Dec 16, 2010)

hey W1zz, what are your thoughts on the dual BIOS? is it possible that there's hidden potential there? also, do you think this sort of radical change, using VLIW4 instead of VLIW5, is amd's way to catch up with this tessellation craze, and that the realization of this (performance-wise) should come next series?

i don't know if you read anandtech's article, but they seem to imply that amd is not letting on some tricks they might have with this series... could it also be because this was originally designed to be made on 28nm and not 40nm? could you elaborate on what kind of impact that could have had?

could that dual BIOS be an unlocker, so to speak?... it just seems like something is missing and can be placed/unlocked at a moment's notice.


----------



## Wshlist (Dec 16, 2010)

Fourstaff said:


> ...
> You might think that I would run Crossfire only with the most expensive cards and so on, but many people forget that Crossfire is a very handy tool if you cannot afford a kickass graphics card in one go. You can get a mid-range, value-for-money graphics card like the 6850 and then add another one in a year or two's time to up your graphics power, and that is a better decision than, for example, getting a GTX 580 in a year's time while continuing to game with your 9800GTX or something in the meantime.
> ...



Fair enough; you do tend to hear from the complainers (but the errors in the release notes do come from AMD). As for Crossfire being an alternative: sure, and many people are content, I'm sure, depending on their setup and favorite games. But my specific remark was about the setup where you use 3 monitors to play a single modern game; at that point you are looking at very much increased use of the GPU and increased demands on the RAM, both in speed and in how much of it is used, and I'm not sure mid-range Crossfire is good for that situation.
And of course the advantage AMD quotes in their slides for the 69xx features is that for use with many monitors you don't need SLI; it works from a single card.


----------



## bear jesus (Dec 16, 2010)

I wonder if the dual BIOS is there so that people trying to flash a 6950 to a 6970 and failing can recover 

I really don't know why they added such a feature; I think some BIOS editing from the master is required *hint, hint* you know you want to take a look, w1zzard  

Hmmm, I don't suppose AMD plans to release a new BIOS or something? Thus the feature would be to make sure people can't easily brick their cards, although I admit if that were intended, I assume we would have heard some rumors by now.


----------



## newtekie1 (Dec 16, 2010)

CDdude55 said:


> I both agree and disagree with that: while they were plagued with problems, my 680i board has been solid and stable for around three years now.



Not to mention the nForce 4 boards were actually pretty damn good for their time, especially compared to the Intel 925/915 offerings of the same era.

The nForce 600 and 700 series were a little lacking compared to the Intel chipsets in overclocking performance, but they still offered some great features, such as x16/x16 slots on the 750i vs. the x8/x8 slots of the P45.


----------



## Wshlist (Dec 16, 2010)

As a general remark, I can tell you that I'm in the market for a new graphics card, something that has some lasting power, so not too low end, and now I'm looking at the HD6970 and GTX570 and am really in a bind. In my neck of the woods I notice the GTX570 is actually quite a bit more expensive than the HD6970, and availability seems to be not that great either if you don't want to pay a lot extra (hence the higher price; the retailers exploit the opportunity).
Both have things that are attractive, and both have direct disadvantages as well as potential ones for the future; it's quite vexing.


----------



## FilipM (Dec 16, 2010)

To me it looks like I'll hold on to my 5870 for a bit longer than expected


----------



## W1zzard (Dec 16, 2010)

Wshlist said:


> Now come on, read the rest where I say:
> "And I'm pretty sure OpenCL will become more important soon as developers start to employ it, it's an accepted standard that works cross platform and cross-OS after all."
> 
> And I found a site that has several DirectCompute and OpenCL tests now:
> ...



so you are saying "test opencl, even though no actual applications exist other than benchmarks?" 

why benchmark something that nobody uses?

does anyone know an actual consumer application that uses opencl or dx compute? (not games)

we had folding in our reviews for a while. nobody said anything about it so i assumed people were not interested in it


----------



## W1zzard (Dec 16, 2010)

springs113 said:


> the dual bios...and is possible that theres hidden potential there



its just 2 bios chips that you can switch via switch. what potential are you looking for, other than the obvious?


----------



## bear jesus (Dec 16, 2010)

I think some people think that something has been locked out, and thus that the switch is to swap between 2 different BIOS chips with different settings, not 2 BIOS chips that are the same.



W1zzard said:


> its just 2 bios chips that you can switch via switch. what potential are you looking for, other than the obvious?



It's the switch to change it between 1536 SPs and 1920, but only once AMD releases the second BIOS that you have to flash yourself  you know you want to make us a 1920SP BIOS, w1zz; it's not harder than overclocking 2GB of RAM into 4GB 

Just so my comment is not confused for something serious, that was a joke


----------



## N3M3515 (Dec 16, 2010)

W1zzard said:


> so you are saying "test opencl, even though no actual applications exist other than benchmarks?"
> 
> why benchmark something that nobody uses?
> 
> ...



What I really don't understand is the 1024x768 results: they don't say anything, and they mess up the performance summary for all resolutions.

Also, if the tests are for $300 cards, why bother putting in 1024x768 and 1280x1024? Who buys a $300 card to play on a 17" monitor? IMHO, for midrange and high end those 2 resolutions don't mean anything.


----------



## Wshlist (Dec 16, 2010)

@N3M3515
I guess lower resolution results, when compared with high resolution ones, can give an indication of the bottlenecks and strengths, since at low resolutions each component can probably run at its fullest, whereas at higher resolutions the RAM speed comes into play and one thing ends up waiting for another to finish.
As parts of a puzzle, those things can be used to deduce information.

As for the remarks to me from W1zzard about OpenCL: you know every damn company is spending billions on developing support for it, even on handsets, from AMD through IBM to Intel for their newest chips. A quarter of the damn transistors on the latest Nvidia and ATI cards are designed for computing; you seriously think that that's all for naught and they are all wrong?
And as I said, it works on OSX and Windows and Linux; you think all the software companies are wrong too? 
Sure, it's early days for applications, but you don't buy a new graphics card every month, at least I don't, so I take OpenCL and also DirectCompute seriously as a thing to consider when purchasing a higher-end card, and I imagine I'm not all alone in that.
That techpowerup focuses on the games is not unreasonable, but surely, while doing all those tests, throwing in some computing test can't harm? Even if it was for just 5% of the consumers, that's still millions of people.


----------



## H82LUZ73 (Dec 16, 2010)

W1zzard said:


> its just 2 bios chips that you can switch via switch. what potential are you looking for, other than the obvious?



Because some cards I've seen in reviews and on manufacturers' web-page specs read 1536 SPs, but in GPU-Z in some of the reviews it says 1620 to 1600, and that is after they overclock them.

Go here: the 2 side by side are the Sapphire, and the one on your right is the XFX; see that the SP count is different on the Sapphire.

http://www.overclockersclub.com/reviews/amd_hd6970_hd6950_review/6.htm

this leads me to think it's:
1. A BIOS update
2. A bug in GPU-Z
3. Dependent on which chip it is, i.e. binned or not binned


----------



## W1zzard (Dec 16, 2010)

those people used the wrong build of gpuz, the one that says "2011" for release date has no support for hd 6900 and calculates the shaders wrong. the one with the correct release date works right


----------



## W1zzard (Dec 16, 2010)

Wshlist said:


> As for the remarks to me from W1zzard about OpenCL: you know every damn company is spending billions on developing support for it, even on handsets, from AMD through IBM to Intel for their newest chips. A quarter of the damn transistors on the latest Nvidia and ATI cards are designed for computing; you seriously think that that's all for naught and they are all wrong?
> And as I said, it works on OSX and Windows and Linux; you think all the software companies are wrong too?
> Sure, it's early days for applications, but you don't buy a new graphics card every month, at least I don't, so I take OpenCL and also DirectCompute seriously as a thing to consider when purchasing a higher-end card, and I imagine I'm not all alone in that.
> That techpowerup focuses on the games is not unreasonable, but surely, while doing all those tests, throwing in some computing test can't harm? Even if it was for just 5% of the consumers, that's still millions of people.



i do take it seriously. if there was an application that 5% of users use and that i could test i'd certainly look into it.


----------



## N3M3515 (Dec 16, 2010)

W1zzard said:


> i do take it seriously. if there was an application that 5% of users use and that i could test i'd certainly look into it.



W1zz, any comments on what i wrote?


----------



## W1zzard (Dec 16, 2010)

N3M3515 said:


> W1zz, any comments on what i wrote?



not really. don't look at those results if you don't like them. they can serve as an indicator of the fps rate at which you see cpu bottlenecks in a particular benchmark


----------



## MxPhenom 216 (Dec 17, 2010)

Mr McC said:


> When I bought my 5870, I bought it with the intention of skipping this generation. This review now allows me to smugly affirm that I was correct to do so.
> 
> I don't know whether AMD or forum hype is to blame, but certainly everybody was expecting much more from these cards and I can't help feeling that, given the lead established over Nvidia with the 58xx series, the 6xxx series is, for the most part, a missed opportunity.
> 
> Thanks for the review Wizz.



yeah, the hype was far too high. 

I kind of chuckled to myself when I checked the reviews during presentations in my psychology class


----------



## Wile E (Dec 17, 2010)

N3M3515 said:


> What I really don't understand is the 1024x768 results: they don't say anything, and they mess up the performance summary for all resolutions.
> 
> Also, if the tests are for $300 cards, why bother putting in 1024x768 and 1280x1024? Who buys a $300 card to play on a 17" monitor? IMHO, for midrange and high end those 2 resolutions don't mean anything.



I just look at the chart for my res. I have a 1920x1200 screen, so the only chart I actually look at in that section is the 1920x1200 relative performance chart. I completely ignore the overall chart.

That said, these results are disappointing. I was hoping for at least trading blows with the 580.

Well, that settles it. 580 will be my next card when money allows.


----------



## Thatguy (Dec 17, 2010)

bear jesus said:


> I think some people think that something has been locked out thus the switch was to swap between 2 different bios chips with different settings and not 2 bios chips that are the same.
> 
> 
> 
> ...



Putting on the tin foil hat for a moment: without a spec sheet on the hardware, hiding something like the RAM size, bus width, and number of SPs is actually very doable. If you had a datasheet with a complete ISA and all the registers, you might be able to figure out if they did hide SPs. 

But that's from a practical perspective: yes, you could in theory disable 2GB of RAM and 400 SPs in the BIOS. 

In theory, and no one would be the wiser.


----------



## Steevo (Dec 17, 2010)

They probably do have a few extras on the die so they hit their target yield while being able to laser-cut bad sections.


If you had the exact widths of the shader sections, the insulating areas, and the number of die traces, you might be able to figure out if and where there was anything hidden. 


However, ATI faces a tough competitor and a set of awesome cards this time around, so why hold back? I expect anywhere from 2-5% improvement through drivers and that is all. 


Who wants to try the BIOS mod first?


----------



## blu3flannel (Dec 17, 2010)

They definitely got a smack in the face when Nvidia released the 5xx series, but this is still a decent price/performance generation.


----------



## Thatguy (Dec 17, 2010)

Steevo said:


> They probably do have a few extras on the die so they hit their target yield while being able to laser-cut bad sections.
> 
> 
> If you had the exact widths of the shader sections, the insulating areas, and the number of die traces, you might be able to figure out if and where there was anything hidden.
> ...



One need only look at the power difference between the 6950 and 6970 and ask one thing: where is all that power going? Then again, TSMC has proven rampantly shitty at 40 nm, and it could just be a really leaky chip.

I suspect they have a new BIOS coming to deal with the two-way DMA changes that are likely bottlenecking the card.


----------



## Wshlist (Dec 17, 2010)

W1zzard said:


> those people used the wrong build of gpuz, the one that says "2011" for release date has no support for hd 6900 and calculates the shaders wrong. the one with the correct release date works right



http://www.techpowerup.com/reviews/HIS/Radeon_HD_6970/32.html

Notice how the OpenCL box is not checked? So you say you never installed the OpenCL driver I guess? As I initially thought.
Thanks for clarifying.

It's starting to seem like you have some sort of hate for OpenCL now though, some personal dislike, what with the rather fervent insistence that it's nothing, ignoring my arguments, and specifically not installing it at all.

I feel like I just spoke a kind word about Assange in the presence of Hillary.


----------



## erocker (Dec 17, 2010)

Wshlist said:


> http://www.techpowerup.com/reviews/HIS/Radeon_HD_6970/32.html
> 
> Notice how the OpenCL box is not checked? So you say you never installed the OpenCL driver I guess? As I initially thought.
> Thanks for clarifying.
> ...



I don't get it. What OpenCL application is there that could be benchmarked? As a 5850 owner I have yet to see any application that needs it.


----------



## W1zzard (Dec 17, 2010)

i love opencl and would love to see applications for it.

i love porn too, but didnt install it and didnt benchmark it.

stop complaining and suggest opencl applications to use


----------



## bear jesus (Dec 17, 2010)

W1zzard said:


> i love opencl and would love to see applications for it.
> 
> i love porn too, but didnt install it and didnt benchmark it.
> 
> stop complaining and suggest opencl applications to use



Such an awesome example.

I agree with suggesting OpenCL apps to use. I've yet to find a need for it, but I'd happily take a look at anything available; I just don't know of anything, so some suggestions would give me a direction to look in.

One thing though: I thought the new folding client that would harness the power of ATI/AMD cards was supposed to use OpenCL. I'm far from sure, but if it does, I'm sure many ATI/AMD card owners would be interested to see it.


----------



## W1zzard (Dec 17, 2010)

folding doesn't use opencl. it hasn't been updated in ages for ATI cards. NVIDIA got the new gpu3 client, ati didnt. as mentioned before, we had folding, no feedback for it, so i dropped it. also it's not too reliable to benchmark


----------



## bear jesus (Dec 17, 2010)

W1zzard said:


> folding doesn't use opencl. it hasn't been updated in ages for ATI cards. NVIDIA got the new gpu3 client, ati didnt. as mentioned before, we had folding, no feedback for it, so i dropped it. also it's not too reliable to benchmark



No, I meant the future client that's coming, the one that was said to solve the poor performance of ATI cards. I thought it was either OpenCL or DirectCompute... I have no idea what I'm talking about 

*edit*
Forgot to say that I agree folding is far from a reliable benchmark unless you could use the exact same work unit every single time.


----------



## nt300 (Dec 17, 2010)

Once again, nice review W1zzard. Just ordered my XFX Radeon HD 6970 2GB; can't wait to blow away zombies with it in Left 4 Dead 2 and the upcoming 3  Can't beat the price/performance; this is priced A LOT better than the HD 5870 was at its release. I also read that AMD is going to release a driver fix that will unlock a performance increase of up to 20%+ in most if not all games. Let's see how it plays out.


----------



## Magikherbs (Dec 18, 2010)

*I demand a rebench with the 10.12a hotfix *

I have found that, regardless of the version (10.2, 10.4, 10.7-12), most mobo/GPU driver updates do not install properly the first time. This applies to AMD chipset/Nvidia configs too, though not as frequently. 
Glitchy startups/shutdowns and lower-than-expected scores are an easy sign of bad SB/GPU driver installs; e.g. the Performance Test 7.0 Graphics 3D Simple and Medium tests score 60-70% lower. For some reason the Complex and DirectX 10 test results are not affected. 
Things get back to normal once I "express uninstall all ATI software" (before the reboot I uninstall ATI Stream SDK, Microsoft Visual C++ 2010 x64, and Catalyst Install Manager, if still present), then reinstall the SB and GPU drivers followed by DirectX June 2010.

My gut tells me the 6970 has not been given a fair shake.

Peace

Edit
Amd Catalyst 10.12a hotfix
http://support.amd.com/us/kbarticles/Pages/AMDCatalyst1012ahotfix.aspx


----------



## Wile E (Dec 18, 2010)

Magikherbs said:


> I have found that, regardless of the version (10.2, 10.4, 10.7-12) most mobo/gpu driver updates do not install properly the first time. This applies to Amd chipset/ Nvidia configs but not as frequent.
> Glitchy startup/shutdowns and lower than expected scores are an easy sign of bad sb/gpu driver installs. eg. Performance Test 7.0 - Graphics 3D Simple and Medium tests score 60-70% lower. For some reason the complex and DirectX 10 test results are not affected.
> Things get back to norms once I "express uninstall all Ati software"(before the reboot, I uninstall ATI Stream SDK, Microsoft Visual C++ 2010x64 and Catalyst Install manager, if still present), then re install the SB and Gpu drivers followed by DirectX June 2010.
> 
> ...


You demand it? Don't you think REQUESTING it is a better way to ask for it? Besides, I'm willing to bet the differences are small.




On an unrelated note, I too would like to see some OpenCL apps, especially ones that accelerate video encoding and don't suck at it like ATI's does.


----------



## W1zzard (Dec 18, 2010)

i demand that there will be less comments about driver versions in vga reviews

a little bit of patience .. i might have something for you in the future to solve the "omgz teh new driverz enable teh sideport and give +456348569845% performanc0rz!1" discussions


----------



## Wile E (Dec 18, 2010)

I was beginning to think sideport was a myth.


----------



## pantherx12 (Dec 18, 2010)

W1zzard said:


> so you are saying "test opencl, even though no actual applications exist other than benchmarks?"
> 
> why benchmark something that nobody uses?
> 
> ...




Just to be a dick: yes, I do. http://en.wikipedia.org/wiki/Mathematica


----------



## Wile E (Dec 18, 2010)

Now, do you have a good bench app for that program?

Seems they have a player version that would be perfect for running a Mathematica GPU bench app.


----------



## W1zzard (Dec 18, 2010)

oh i didnt know that mathematica supported opencl. nice find.

anyone got any kernels they could share? (that have some real world application). especially interested in the stock market prediction ones ^^


----------



## pantherx12 (Dec 18, 2010)

Wile E said:


> Now, do you have a good bench app for that program?
> 
> Seems they have a player version that would be perfect for running a Mathmatica gpu bench app.



I don't myself, I was just answering wizz's indirect question : ]



They've got a free full trial, wiz: http://www.wolfram.com/mathematica/trial/ Not sure if that stock module comes with it though; I'm downloading it now to have a look.


----------



## W1zzard (Dec 18, 2010)

pantherx12 said:


> I'm downloading it now to have a look.



expect to be lost if you haven't used it before


----------



## wahdangun (Dec 18, 2010)

Yeah, I fully support an OpenCL benchmark, or DirectCompute.


Btw wizz, can we have some sort of donation thing for you so you can make an Eyefinity review? 

And do you want to make an HD 6970 CF review?


----------



## W1zzard (Dec 18, 2010)

wahdangun said:


> and do you want to make HD 6970 CF review ?



working on it. german customs people liked the card very much, they had it for a week


----------



## entropy13 (Dec 18, 2010)

W1zzard said:


> working on it. german customs people liked the card very much, they had it for a week



They must have been benching it already.


----------



## Black Hades (Dec 18, 2010)

Benetanegia said:


> [...] Wow. Really really dissapointing. Future is not so bright for AMD. [...]



What's wrong with you people, overly dramatizing everything? What are you, rabid hockey fans?  
The future's bright for everyone. This is a good card, as is the competition's; any differences will be leveled out price-wise in a short time frame. The tables will turn a gen or two from now... big deal.


----------



## qubit (Dec 18, 2010)

W1zzard said:


> working on it. german customs people liked the card very much, they had it for a week



Wow, what did they think of its performance?! Silly sods, it's just a graphics card, lol.


----------



## newtekie1 (Dec 18, 2010)

W1zzard said:


> we had folding in our reviews for a while. nobody said anything about it so i assumed people were not interested in it



I was extremely interested in it.  I just figured you removed it because it was too hard to get an accurate benchmark since the WUs change and vary so much.


----------



## qubit (Dec 18, 2010)

newtekie1 said:


> I was extremely interested in it.  I just figured you removed it because it was too hard to get an accurate benchmark since the WUs change and vary so much.



I remember with the old SETI client from years ago, it was possible to grab one work unit and always use that for benchmark comparisons. Is it possible to do that with Folding? (Even if it needs an awkward workaround).
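The grab-one-WU trick is really just a fixed-workload benchmark harness: pin one deterministic workload and time repeated runs, so the spread reflects the machine rather than the work unit. A minimal Python sketch of the idea, with a stand-in workload (a hypothetical placeholder, not an actual SETI or Folding work unit):

```python
import statistics
import time

def stand_in_work_unit():
    # Stand-in for a fixed work unit: the same deterministic computation
    # every run, so timing differences come from the machine, not the workload.
    return sum(i * i for i in range(200_000))

def bench(workload, runs=5):
    """Time the same workload several times; return (mean, stdev) in seconds."""
    times = []
    for _ in range(runs):
        start = time.perf_counter()
        workload()
        times.append(time.perf_counter() - start)
    return statistics.mean(times), statistics.stdev(times)

mean, stdev = bench(stand_in_work_unit)
print(f"mean {mean:.4f}s, stdev {stdev:.4f}s")
```

With Folding's rotating WUs the workload itself changes between runs, so the spread measures WU variation as much as card performance, which is exactly why it makes a poor review benchmark.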


----------



## wahdangun (Dec 18, 2010)

newtekie1 said:


> I was extremely interested in it.  I just figured you removed it because it was too hard to get an accurate benchmark since the WUs change and vary so much.



Yeah, AMD performance especially was really crap. I think the folding guys don't really want to cure cancer at all; I mean, if they really wanted the power available, why didn't they update the AMD GPU client for ages, leaving it stuck in the HD 2900/HD 3870 era?


----------



## Magikherbs (Dec 18, 2010)

Wile E said:


> You demand it? Don't you think REQUESTING it is a better way to ask for it? Besides, I'm willing to bet the differences are small.
> 
> 
> 
> ...



lol.. if I was serious there would be no smiley 



W1zzard said:


> i demand that there will be less comments about driver versions in vga reviews
> 
> a little bit of patience .. i might have something for you in the future to solve the "omgz teh new driverz enable teh sideport and give +456348569845% performanc0rz!1" discussions



 sweeet !


----------



## newtekie1 (Dec 18, 2010)

wahdangun said:


> Yeah, AMD performance especially was really crap. I think the folding guys don't really want to cure cancer at all; I mean, if they really wanted the power available, why didn't they update the AMD GPU client for ages, leaving it stuck in the HD 2900/HD 3870 era?



Really, the nVidia client hasn't changed much since it was implemented with CUDA and the G80 cards. The GPU3 client was supposed to be an update for both, but they ran into problems on the AMD side, so they released the nVidia client instead of holding up both because one didn't work.

Remember that GPU folding started as ATI-only; it was only because of CUDA that nVidia folding became possible and works so well. And nVidia GPUs are just so much better at folding and GPU computing tasks in general.


----------



## Benetanegia (Dec 18, 2010)

Black Hades said:


> What's wrong with you people overly dramatizing everything. What are you rabid hockey fans?
> Future's bright for everyone. This is a good card, as is the competition's, any differences will be leveled out price-wise in a short time frame. The tables will turn one or two gen from now... big deal.



You didn't get my point. They'll both do "fine", just fine. The future won't be bright for either of them. In the HD3xxx/8800/9800 or GTX2xx/HD4xxx days the situation wasn't bright for anyone. Massive and constant price cuts are not good for anyone in the industry; they're only good for us, the consumers.

When I said it's not going to be so bright for AMD, I meant not as bright as people were anticipating. This performance means Nvidia will probably always have the fastest card and make some extra profit on it (be it the GTX580 or GTX595), while the two compete in every lower segment (I suspect the GTX560 will do exceptionally well against the HD6950). Hence the future is ever so slightly better for Nvidia, not only because of that but also because of their 90+% market share in professional cards, their HPC presence, and the $250-350 million Tegra 2 order from Samsung, if it turns out to be true.

All this means Nvidia doesn't need to be very profitable on every consumer card in their lineup, while AMD desperately needs to make profits, big profits, because they have loans to repay.

People should check their own blind fanboyism and consider the entire picture before calling others out. :shadedshu


----------



## pantherx12 (Dec 18, 2010)

newtekie1 said:


> Really, the nVidia client hasn't changed much since it was implemented with CUDA and the G80 cards. The GPU3 client was supposed to be an update for both, but they ran into problems on the AMD side, so they released the nVidia client instead of holding up both because one didn't work.
> 
> Remember that GPU folding started as ATI-only; it was only because of CUDA that nVidia folding became possible and works so well. And nVidia GPUs are just so much better at folding and GPU computing tasks in general.



Run Milkyway@home: ATI cards do better than NV cards there.

I think you'll find it's mostly support that hampers ATI cards in F@H.

That said, Fermi does have GPGPU heavily in mind, so the 470/570 and up should do better.


----------



## AsRock (Dec 18, 2010)

Red_Machine said:


> Does that make up for the lack of PhysX and CUDA?  I'm sorry, but having to buy an extra card just to be able to do PhysX is not on in my opinion.



In that case it wouldn't matter what AMD did, as no AMD card has CUDA or PhysX and never will, right? So by that logic performance doesn't matter, nor whether it takes more electricity to run.


----------



## AddSub (Dec 18, 2010)

Barely 200+ posts and it's been days? As I recall, when the HD5xxx launched, the AMD fanb... um, "crowd", gloated and whipped themselves into a frenzy, causing 500+ post threads in short order. I guess the nVidia "crowd" just has more class, among other things.


----------



## erocker (Dec 18, 2010)

AddSub said:


> Barely 200+ posts and it's been days? As I recall, when the HD5xxx launched, the AMD fanb... um, "crowd", gloated and whipped themselves into a frenzy, causing 500+ post threads in short order. I guess the nVidia "crowd" just has more class, among other things.



I find that people who profess allegiance to any corporation have no class. What's your point?


----------



## springs113 (Dec 18, 2010)

I second that, Erocker... these guys take things to heart, lol.  I've dealt with both sides of the current GPU world, but I prefer ATI, especially now, because of the HDMI aspect. My last green-team purchase was the 8800GT, which I only used for about a month before putting it back in its box; at that time I got a 3870 and loved the idea of HDMI, only to watch my Xbox HD DVD player with it.

These allegiances are fine, but it crosses the line when one side starts nitpicking every little thing to keep their side relevant.


----------



## pantherx12 (Dec 18, 2010)

AddSub said:


> Barely 200+ posts and it's been days? As I recall, when the HD5xxx launched, the AMD fanb... um, "crowd", gloated and whipped themselves into a frenzy, causing 500+ post threads in short order. I guess the nVidia "crowd" just has more class, among other things.



5870: nearly twice the performance.

6970: not twice the performance.

Not going to be too much fanfare, lol


----------



## Kreij (Dec 18, 2010)

Whether someone likes Nvidia or AMD is inconsequential.
We should all be thankful to both for keeping the competition alive.
If one goes down, we will be left with whatever crap the sole survivor puts on the market.

All of the latest offerings are top notch, and they're adding features with every rendition.
As consumers, we win !!


----------



## Magikherbs (Dec 18, 2010)

erocker said:


> I don't get it. What OpenCL application is there that could be benchmarked? As a 5850 owner I have yet to see any application that needs it.



Hey dude and all the best for the season !

You may want to try this OpenCL bench: 


http://www.xtremesystems.org/forums/showthread.php?t=260942

Peace

As far as I know, this only works with the 10.9 GPU and SB drivers. 
http://support.amd.com/us/gpudownload/windows/previous/10/Pages/integrated.aspx?os=Windows Vista - 64-Bit Edition&rev=10.9#1

OpenCL driver:
http://developer.amd.com/gpu/atistreamsdk/pages/default.aspx

My apologies if this has caused your system to freeze up.


----------



## wahdangun (Dec 18, 2010)

newtekie1 said:


> Really the nvidia client hasn't changed that much since it was implemented with CUDA and the G80 card. The GPU3 client was supposed to be an update for both but they ran into problems with the AMD side so they just released the nvidia client instead of holding up both because one didn't work.
> 
> Remember that GPU folding started with ATI only folding it was only because of CUDA that nVidia folding became possible and works so well. And nVidia GPUs are just so much better at folding and GPU computing tasks in general.



How do you know that? There's no ATI Stream version. And I think that's why we need an OpenCL benchmark, so we can compare them equally.


----------



## newtekie1 (Dec 18, 2010)

wahdangun said:


> How do you know that? There's no ATI Stream version. And I think that's why we need an OpenCL benchmark, so we can compare them equally.



Well, that's a problem AMD decided not to address.  When nVidia provided aid to F@H to get it working on CUDA, ATI didn't provide anything.  I see that argument a lot, that there isn't support for ATI Stream, and the simple reason is that ATI doesn't provide anywhere near the support level nVidia does.

And while OpenCL is great for being universal, being universal also means it isn't as efficient as native CUDA/Stream.


----------



## erocker (Dec 18, 2010)

Magikherbs said:


> My apologies if this has caused your system to freeze up.



Apology accepted.


----------



## Magikherbs (Dec 18, 2010)

erocker said:


> Apology accepted.



Whew!   I got myself too lolz 

Stock GPU, CCC, triple buff, adv AI

800/1250 GPU, stock CCC, triple buff, adv AI

800/1250 GPU, 16xAA 16xAF, triple buff, adv AI

CPU OpenCL


----------



## Wile E (Dec 19, 2010)

springs113 said:


> I second that Erocker...these guys take things to the heart lol.  I have dealt with both sides of the current gpu world but I prefer to use ATIs especially now because of the hdmi aspect, my last green team purchase has been the 8800gt which I only used for about a month before placing it inside its box, at that time I got a 3870 and loved the idea of hdmi only to watch my xbox hddvd player
> 
> These allegiences are fine but it crosses the line when one side starts nitpicking every little thing in order to keep their side relevant.



The current nVidia cards do audio over HDMI as well.


----------



## horik (Dec 19, 2010)

I was thinking about buying this card, but when I found a 6950 at 281€ I took it. From what I saw in the review, the 6950 is close to the GTX 570 at the resolution I usually play at.


----------



## nt300 (Dec 19, 2010)

Wile E said:


> The current nVidia cards do audio over HDMI as well.


They didn't have audio over HDMI for some time, but they've been doing it as of late. ATI was first with full audio + video over HDMI, ever since the HD 2000 series if not earlier. The audio's been getting better and better.


----------



## Steevo (Dec 19, 2010)

Moot point on the audio of current gen.

ATI, superior video playback quality, superior 3D render quality, more stable drivers (by crash report per capita), faster bugfixes as of late, consistent driver updates, overclocker friendly, enthusiast friendly new BIOS anti-f**kup feature, uses industry standards (no proprietary CUDA), and in general has provided more bang for the buck than competition.


----------



## CBRworm (Dec 19, 2010)

You know what would be interesting to add to the card reviews?  Card idle draw with multiple monitors attached.  The 6970 uses the 500/1375 clocks at idle once a second monitor is attached and used in any mode other than "clone", which changes the idle power draw from ~20 watts to ~70 watts.  This is something I experienced with my 5870 as well, although it used about 20 watts less at idle with two displays than the 6970.  Apparently most cards from both manufacturers are affected by this, and as more people use multiple displays this is a measurement that would be useful going forward.

So now it's the old green team vs the new green (formerly red) team?  AMD can switch names, but they should have stuck with a red logo for the video products!


----------



## W1zzard (Dec 19, 2010)

CBRworm said:


> Card idle draw with multiple monitors attached



i have that planned for the next time i rebench power on all cards


----------



## CBRworm (Dec 19, 2010)

That will be much appreciated!


----------



## Wile E (Dec 20, 2010)

nt300 said:


> They didn't have audio over HDMI for some time, but they've been doing it as of late. ATI was first with full audio + video over HDMI, ever since the HD 2000 series if not earlier. The audio's been getting better and better.



I know ATI was first, I bought a 2900XT on launch day instead of an 8800 because of it, but that doesn't matter, all that matters is what's available now. Both have audio over HDMI.





Steevo said:


> Moot point on the audio of current gen.
> 
> ATI, superior video playback quality, superior 3D render quality, more stable drivers (by crash report per capita), faster bugfixes as of late, consistent driver updates, overclocker friendly, enthusiast friendly new BIOS anti-f**kup feature, uses industry standards (no proprietary CUDA), and in general has provided more bang for the buck than competition.



I disagree on playback and render quality. They are both equal. I have also seen a lot more complaints about ATI drivers over the past few months than nVidia's. Just look at how many unfixed Eyefinity problems there are. Sorry, but I just don't believe your crash report claims. Got proof of them?

And who cares if they use open standards when there is nothing out there making use of them? OpenCL is all but useless. At least there are actually some apps written for CUDA, and nVidia actually lends a hand to devs to get stuff working on their cards. Can't say the same for ATI. I'd rather have a well-supported proprietary but free-to-use API than a barely supported open API. If OpenCL actually gains ground, and ATI actually gets off their asses and helps devs get it going on the hardware, I'll change my tune; until then, I'll take CUDA, thanks.


----------



## pantherx12 (Dec 20, 2010)

Protip! CUDA has a two-year head start, guys!

OpenCL needs to offer the same features and support as CUDA before it will be used more commonly.


----------



## Wile E (Dec 20, 2010)

pantherx12 said:


> Protip! CUDA has a two-year head start, guys!
> 
> OpenCL needs to offer the same features and support as CUDA before it will be used more commonly.



I understand that, and when it finally does catch up, i'll consider it as part of my buying process. As it stands tho, it is not a significant feature for the current generation of cards. By the time it catches up, newer stuff will be out.


----------



## HalfAHertz (Dec 20, 2010)

OpenCL runs on *everything*. It provides a truly homogeneous experience across GPUs and CPUs (and soon even APUs). People have already started adding GPUs to HPC supercomputers; just look at the current number one, the one in China. That means if you want to utilize all the resources at the same time, you have to use OpenCL. 
What I'm trying to say is that the fact that OpenCL is dead in the consumer space doesn't mean it's not relevant in the server market...
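The "runs on everything" point is visible in the API itself: one enumeration loop covers CPUs and GPUs from any vendor. A minimal sketch using the pyopencl bindings (an assumption on my part; any OpenCL wrapper works, and the function just returns an empty list when no OpenCL runtime is installed):

```python
def list_compute_devices():
    """Enumerate every OpenCL device on the machine, CPU and GPU alike.

    Returns a list of (platform_name, device_name, device_type) tuples,
    or an empty list if no OpenCL runtime/bindings are available.
    """
    try:
        import pyopencl as cl  # third-party OpenCL bindings (assumed installed)
        platforms = cl.get_platforms()
    except Exception:
        return []  # no bindings or no OpenCL runtime on this machine
    devices = []
    for platform in platforms:
        for dev in platform.get_devices():
            kind = cl.device_type.to_string(dev.type)  # e.g. "CPU" or "GPU"
            devices.append((platform.name, dev.name, kind))
    return devices

for entry in list_compute_devices():
    print(entry)
```

The same kernel source you'd queue on a GPU here can be built for a CPU device from the same list, which is the homogeneous-programming argument in a nutshell.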


----------



## pantherx12 (Dec 20, 2010)

Magikherbs said:


> Hey dude and all the best for the season !
> 
> You may want to try this Open CL bench
> 
> ...



It causes my system to freeze every time; any tips?


----------



## Magikherbs (Dec 20, 2010)

pantherx12 said:


> It causes my system to freeze every time; any tips?



Did you try the 10.9 CCC and SB drivers ?


----------



## pantherx12 (Dec 20, 2010)

Magikherbs said:


> Did you try the 10.9 CCC and SB drivers ?



Link to chipset drivers please, as for 10.9 I can't be bothered to rollback from 10.12 XD


----------



## Magikherbs (Dec 20, 2010)

pantherx12 said:


> Link to chipset drivers please, as for 10.9 I can't be bothered to rollback from 10.12 XD



lol.. they're in the post you quoted  The SB drivers are under the optional tab. I don't use CCleaner etc... btw..  After you 'express uninstall all ATI', say 'no' to the reboot and get rid of ATI Stream SDK and Visual C++ 2010, if present.

I'm back on the 10.12 set again, but this time I installed the CCC drivers first instead of the other way around. The usual startup and restart flickers and blips are no more, heh...

EDIT
Again.. to all that may have suffered a system freeze up and the grief that goes with it, I just want to say... 
http://www.youtube.com/watch?v=oGuSSSbll1w


----------



## AboAl3meer KG21 (Dec 20, 2010)

I'll go with the GTX 580; it's gonna be my next card for sure


----------



## pantherx12 (Dec 20, 2010)

After reading loads of different reviews of the 6900 series, it has to be said performance is ALL over the place lol

The bjorn3d review puts a 6950 up against 480s (in fact, at higher resolutions it sometimes beats them)

According to AMD, a driver should be released sometime in early 2011 which fixes a lot of things.

It still won't be beating the 580, but it will consistently compete with the 570 (whereas at the moment it sometimes epically fails)

Here's hoping AMD aren't BSing!



WIZZ!!!!!!!!!!!! 3DMark11 uses OpenCL and DirectCompute   (Bullet physics uses OpenCL I believe; at the very least it uses OpenCL for ATI cards, as they don't support CUDA PhysX)

There's another one for ya he he


----------



## Steevo (Dec 20, 2010)

Wile E said:


> I know ATI was first, I bought a 2900XT on launch day instead of an 8800 because of it, but that doesn't matter, all that matters is what's available now. Both have audio over HDMI.
> 
> I disagree on playback and render quality. They are both equal. I have also seen a lot more complaints about ATI drivers over the past few months than nVidia's. Just look at how many unfixed Eyefinity problems there are. Sorry, but I just don't believe your crash report claims. Got proof of them?
> 
> And who cares if they use open standards when there is nothing out there making use of them? OpenCL is all but useless. At least there are actually some apps written for CUDA, and nVidia actually lends a hand to devs to get stuff working on their cards. Can't say the same for ATI. I'd rather have a well-supported proprietary but free-to-use API than a barely supported open API. If OpenCL actually gains ground, and ATI actually gets off their asses and helps devs get it going on the hardware, I'll change my tune; until then, I'll take CUDA, thanks.



Playback quality has been tested in a real comparison by W1zz; ATI won.
Render quality has been compared on 5xxx/6xxx series cards vs Nvidia; ATI won. To achieve the same output as Nvidia, we would have to re-implement angle dependency, reduce LOD by 30%, and drop the EQAA and MLAA features ATI has out.
Studies by Microsoft and Steam both show that, per capita, Nvidia has more driver faults than ATI.


You are right, there are apps for CUDA. 13 games use their proprietary PhysX, and even in those games Nvidia's whitepapers do not recommend real-time rendering of certain effects, instead suggesting "precooking" them.

ATI is supporting standardized technology. They supported the beginnings of hardware acceleration and had it first (F@H, the Avivo video transcoder). ATI had tessellation first (Nvidia didn't want to play along); ATI met the DX10 specs (Nvidia didn't want to play along).


All that said, I agree: if ATI and OpenCL don't get moving, I am going green on my next card.


----------



## W1zzard (Dec 20, 2010)

pantherx12 said:


> 3DMark11 uses OpenCL and DirectCompute  (Bullet physics uses OpenCL I believe; at the very least it uses OpenCL for ATI cards, as they don't support CUDA PhysX)



and how is 3dmark11 productive?

it will be added to the benchmarks soon, but i dont think it serves any use as gpgpu computation application to bench


----------



## pantherx12 (Dec 20, 2010)

W1zzard said:


> and how is 3dmark11 productive?
> 
> it will be added to the benchmarks soon, but i dont think it serves any use as gpgpu computation application to bench



Touché, it certainly isn't productive, but how are any of the other things you benchmark productive? 

I would just like to see it added, as it's a step away from PhysX and thus levels the playing field between NV/ATI during benchmarking.  (Although I imagine you ran the other 3DMark tests with PhysX switched off for fairness' sake anyway.)


----------



## Swamp Monster (Dec 20, 2010)

On applications that use OpenCL: Fractron 9000. 
It's a little program that renders nice graphic effects, which you can later save as JPEG images.


----------



## pantherx12 (Dec 20, 2010)

Swamp Monster said:


> On applications that use OpenCL: Fractron 9000.
> It's a little program that renders nice graphic effects, which you can later save as JPEG images.



I think wiz is interested in programs you have to pay for in order for it to be worth his time kinda thing.

Seems that way anyway.


----------



## Steevo (Dec 20, 2010)

He is looking for a benchmark that is repeatable, standard, and clearly defined in measurement.


A DX11-only benchmark will not let someone compare how much of an increase a game will get over their old 4xxx or 2xx cards. Currently, an upgrade from either company is almost a sideways move in performance. Metro is no more playable on a 580 than a 480; likewise, Dirt 2 is no less playable on a 6970 than a 5870.

A few worthless benchmarks are the only place most users would notice any performance increase.


----------



## pantherx12 (Dec 20, 2010)

Steevo said:


> He is looking for a benchmark that is repeatable, standard, and clearly defined in measurement.
> 
> 
> A DX11 only benchmark will not allow someone to compare how much of a increase a game will get compared to their old 4xxx, or 2xx cards. Currently a upgrade from either company is almost a sideways move in performance. Metro is no more playable on a 580 than a 480, likewise dirt2 is no less playable on a 6970 than a 5870.
> ...





I didn't mean a paid-for benchmark; I just meant he seems to want to see more OpenCL support in applications before committing to adding it to his benchmarks (due to him already running so many, I imagine  )


----------



## Swamp Monster (Dec 20, 2010)

pantherx12 said:


> I think W1zzard is only interested in programs you have to pay for, to make it worth his time, kinda thing.
> 
> Seems that way anyway.



I didn't mean it for benchmarks, only for people to know that some apps use OpenCL.

For benchmarks I've been using:
http://www.ngohq.com/graphic-cards/16920-directcompute-and-opencl-benchmark.html
Interesting program, but not suitable for TPU reviews I guess.


----------



## Super XP (Dec 21, 2010)

Wile E said:


> I know ATI was first, I bought a 2900XT on launch day instead of an 8800 because of it, but that doesn't matter, all that matters is what's available now. Both have audio over HDMI.
> 
> I disagree on playback and render quality. They are both equal. I have also seen a lot more complaints about ATI drivers over the past few month than nVidia. Just look at how many unfixed Eyefinity problems there are. Sorry, but I just don't believe your crash report claims. Got proof of them?
> 
> And who cares if they use open standards, when there is nothing out there making use of them? OpenCL is all but useless. At least there are actually some apps written for CUDA, and nVidia actually lends a hand to devs to get stuff working on their cards. Can't say the same for ATI. I'd rather have a well-supported proprietary but free-to-use API than a barely supported open API. If OpenCL actually gains ground, and ATI actually gets off their asses and helps devs get it going on the hardware, I'll change my tune; until then, I'll take CUDA, thanks.


More complaints about ATI drivers? Never have I heard of such a thing as of late. They've been rock solid with a steady stream of wonderful updates.


----------



## pantherx12 (Dec 21, 2010)

Swamp Monster said:


> I didn't mean it for benchmarks, only for people to know that some apps use OpenCL
> 
> For benchmarks I've been using:
> http://www.ngohq.com/graphic-cards/16920-directcompute-and-opencl-benchmark.html
> Interesting program, but not suitable for TPU reviews I guess.



Cool, you should also check out gluxmark; it's an OpenGL/CL benchmark.


----------



## Magikherbs (Dec 21, 2010)

So it's not as much about the drivers as I thought...
This awesome review by W1zzard shows that the GTX 580 is not the hands-down winner some may say it is.

http://www.techpowerup.com/reviews/AMD/Catalyst_10.12_Performance/

Some games and benchmarks clearly favor the Nvidia GPUs. In my view, they should not count. Otherwise you may as well include Folding@Home stats!


----------



## Wile E (Dec 22, 2010)

Steevo said:


> Playback quality has been done in a real test by W1zz, ATI won.
> Render quality has been done on 5XXX, 6XXX series cards VS Nvidia, ATI won. To achieve the same effects as Nvidia we would have to re-implement angle dependency, reduce LOD by 30% and not include the EQAA or MLAA effects ATI has out.
> Studies by Microsoft and Steam both show that per capita Nvidia has more driver faults than ATI.
> 
> ...


Playback quality is not better on ATI. I disable ALL effects in video playback. I want accuracy, not flashy. I don't do "enhancements". They are both equal.

I don't like what I've seen of EQAA or MLAA. I like very light AA.

I said nothing about PhysX. 

I don't care about open standards, I care about what's relevant to me as an end user.

And more driver faults over what span of time, with what hardware? I still don't buy it. A couple of years ago nVidia had really bad drivers, but now it's ATI, and it has been since a couple of months after the 4k series release. (8.11 is where it all went to crap, iirc.)



Super XP said:


> More complaints about ATI Drivers? Never have I heard of such a think as of late. They've been rock solid with a steady stream of wonderful updates.



The updates have been all but useless. Still stupid crappy bugs like scaling. Then there's still the shitty multi-gpu scaling in some games. Borderlands comes to mind for me. It's not exactly like my X2 is a weak card.



Magikherbs said:


> So its not as much about the drivers as I thought. ..
> This awesome review by W1zzard shows that the GTX 580 is not the hands down winner some may say it is.
> 
> http://www.techpowerup.com/reviews/AMD/Catalyst_10.12_Performance/
> ...



You are misunderstanding that review. First, the 580 is faster in most of the benchmarks, second, the performance summary is not comparing card vs card, it is comparing launch driver vs newest driver, grouped. 100% for both 6970 and 580 doesn't mean they are equal, it means that both have the same performance with launch drivers as they do the newest drivers.
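The normalization Wile E describes can be sketched in a couple of lines of Python (the FPS numbers here are made up for illustration, not TPU's actual data):

```python
# Each card is normalized against ITS OWN launch-driver result, so 100%
# means "same performance as on launch drivers", not "the cards are equal".
launch_fps = {"HD 6970": 60.0, "GTX 580": 72.0}   # hypothetical launch-driver FPS
newest_fps = {"HD 6970": 60.0, "GTX 580": 72.0}   # hypothetical newest-driver FPS

relative = {card: 100.0 * newest_fps[card] / launch_fps[card]
            for card in launch_fps}

print(relative)  # both read 100%, yet the 580 is still the faster card
```

So two cards can sit at 100% together in the summary while one of them still wins every absolute-FPS chart in the review.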


----------



## pantherx12 (Dec 22, 2010)

Wile E said:


> I don't care about open standards, I care what relevant to me as an end user.
> 
> .




That's sort of contradicting yourself, dude; open standards = better for the consumer, as they breed competition: all companies need to be competitive with their hardware if the software is open to all.

Whereas when it's proprietary you don't have to be competitive, simply because another company can't use your software. I.e. PhysX could actually be really shitty and we'd have no idea, as we have no real comparison except the CPU at the moment and a few AMD demos.

Open standards are precisely what consumers should feel most strongly about. If everyone pushed open standards then hardware would get better, as companies would really be forced to make better hardware than the other company


----------



## Wile E (Dec 22, 2010)

pantherx12 said:


> That's sort of contradicting yourself, dude; open standards = better for the consumer, proprietary = charge whatever they like.
> 
> Excuse my spelling.
> 
> No spell check XD



It's not contradictory at all. Open standards aren't better for the consumer when nobody adopts them into their software. OpenCL is completely irrelevant to me, because nothing uses it.

So, buy into an open standard with no software, or buy into a free-to-use but proprietary standard that does have software?

As an end user, I know which one I am picking.


----------



## pr0n Inspector (Dec 22, 2010)

Wile E said:


> Playback quality is not better on ATI. I disable ALL effects in video playback. I want accuracy, not flashy. I don't do "enhancements". They are both equal.




This.

I don't understand the fascination with shitty filters, or wide gamut.


----------



## Bjorn_Of_Iceland (Dec 24, 2010)

Steevo said:


> He is looking for a benchmark that is repeatable, standard, and clearly defined in measurement.
> 
> 
> A DX11 only benchmark will not allow someone to compare how much of a increase a game will get compared to their old 4xxx, or 2xx cards. Currently a upgrade from either company is almost a sideways move in performance. Metro is no more playable on a 580 than a 480, likewise dirt2 is no less playable on a 6970 than a 5870.
> ...


OK, so Unigine Heaven doesn't count?


----------



## Magikherbs (Dec 24, 2010)

Wile E said:


> You are misunderstanding that review. First, the 580 is faster in most of the benchmarks, second, the performance summary is not comparing card vs card, it is comparing launch driver vs newest driver, grouped. 100% for both 6970 and 580 doesn't mean they are equal, it means that both have the same performance with launch drivers as they do the newest drivers.



I understand it completely, lol... You seem not to understand my end statement. Games/apps/benchmarks that benefit from CUDA, PhysX and/or high shader speeds will score higher on Nvidia GPUs. I was trying to use Folding@Home as an example.

How many of those results fall into that category?



pantherx12 said:


> That's sort of contradicting yourself dude, open standards = better for the consumer as it breeds competition as all companies need to be competitive with their hardware if the software is open to all.
> 
> Where as when it's propietry you don't have to be competitive simply because another company can't use your software. I.E Phsyx could actually be really shitty and we have no idea if it is as we have no real comparison except CPU at the moment and a few AMD demos.
> 
> Open standards are precisely what consumers should feel most strongly about, if everyone pushed forward open standards then hardware would get better as companies are really forced to make better hardware than the other company



Exactly! Hah... or you may as well own an Apple! LmfreakinAO


----------



## Wile E (Dec 24, 2010)

Magikherbs said:


> I understand it completely, lol... You seem not to understand my end statement. Games/apps/benchmarks that benefit from CUDA, PhysX and/or high shader speeds will score higher on Nvidia GPUs. I was trying to use Folding@Home as an example.
> 
> How many of those results fall into that category?
> 
> ...


The fact that nVidia scores higher due to CUDA, PhysX and especially higher shader speeds is irrelevant. All that matters is end results. If an nVidia card does better on your favorite games, it does better on your favorite games. The reasons why do not matter. And F@H performance is poor on ATI because ATI is not devoting any time to helping Stanford get it running better on their cards. nVidia does. ATI hardware has the potential to do much better, but their software team lets it down.

I'd rather have Windows or OS X than Linux. Windows and OS X are not open, and Linux is, but Linux isn't better from the standpoint of most end users. I actually do own an older iMac, btw. 

Open =/= better. Sometimes proprietary standards are just better for an end user, because they are better supported in the real world.


----------



## bear jesus (Dec 24, 2010)

If CUDA and PhysX ran on ATI/AMD cards, I think that could be compared to something like Windows: it may be closed, but it runs on all hardware, so there's no limit on which company you have to buy a product from. That will never happen, though. Since OpenCL is newer, I'd assume it will be a while until there are anywhere near as many programs as use CUDA, but with time the number will grow. So yes, OpenCL is of little use to us right now, but in the near future it should become more useful.

I think the CUDA SDK came out nearly 4 years ago, yet ATI/AMD released their OpenCL SDK a little over a year ago. Is there any surprise there are more things that use CUDA? How many programs used CUDA within a year of the SDK release?



Wile E said:


> And F@H performance is poor on ATI because ATI is not devoting any time to Stanford to help them get it running better on their cards. nVidia does. ATI hardware has the potential to do much better, but their software team lets it down.



I wish AMD/ATI could see that folding performance is a selling point for high-end cards. Sure, it's not a massive market, but if AMD cards could fold as well as nVidia's with less power usage, then I'm sure many people running their own folding farms would happily look to them for a future upgrade due to reduced power usage/running cost and possibly heat; that is, of course, assuming AMD's architecture could be efficient at folding given good enough software.

I used to fold with my 4870, but the performance was hardly better than people running 3850s, yet it used loads of power and put out so much heat that I just had to give up on it. I only intend to fold once I get an nVidia card in my HTPC, as I have no faith in a useful AMD folding client as things stand right now.


----------



## Wile E (Dec 24, 2010)

bear jesus said:


> If cuda and physx ran on ATI/AMD cards i think that could be compared to something like windows as it may be closed but it runs on all hardware so there is no limit to what company you have to buy a product from, that will never happen so i would assume as openCL is newer there will be a while until there is anywhere near as many programs that use cuda but with time the number will grow so yes openCL is of little use to us right now but in the near future it should become more useful.
> 
> I think the cuda sdk came out near 4 years ago yet ATI/AMD released their openCL sdk a little over a year ago, is there any surprise there is more things that use cuda? how many programs used cuda within a year of the sdk release?


Then compare it to OS X instead. OS X only runs on approved hardware, but it is still better for most end users compared to Linux. Although I have a feeling Steve Jobs is going to ruin that at some point, and move to the walled-garden system used on iPad/iPhone/iPod touch.

And when OpenCL becomes more useful, it will become relevant. By that time, we'll probably have new hardware out, so the point is moot as far as reviews are concerned for the current generation.


bear jesus said:


> I wish AMD/ATI could see folding performance is a selling point for high end cards, sure not a massive market but if AMD cards could fold as well as nvidia but with less power usage then I'm sure many people running their own folding farms would happily look to them for a future upgrade due to reduced power usage/running cost and possibly heat, that is of course assuming AMD's architecture could be efficient at folding given good enough software.
> 
> i used to fold with my 4870 but the performance was hardly better than people running 3850's yet used loads of power and put out so much heat that i just had to give up on it, i tried other software that ATI cards work better with but always struggled to get anything to process so fully gave up and only intend to fold once i get a Nvidia card in my htpc.


I wish ATI would devote more time to the software side of things, period. Not just folding.


----------



## Magikherbs (Dec 24, 2010)

Wile E said:


> The fact that nVidia scores higher due to Cuda, Physx and especially higher shader speeds is irrelevant. All that matters is end results. If an nVidia card does better on your favorite games, it does better on your favorite games. Reasons why do not matter. And F@H performance is poor on ATI because ATI is not devoting any time to Stanford to help them get it running better on their cards. nVidia does. ATI hardware has the potentialto do much better, but their software team lets it down.
> 
> I'd rather have Windows or OS X than Linux. Windows and OS X are not open, and Linux is, but Linux isn't better from the standpoint of most end users. I actually do own an older iMac, btw.
> 
> Open =/= better. Sometimes proprietary standards are just better for an end user, because it is better supported in the real world.



I highly doubt an Nvidia card will score the same with PhysX disabled in 3DMark06/11, NFS Shift, the FFXIV benchmark and others. NFS Shift goes so far as to install PhysX even though my GPU is not Nvidia. ahha.......

If PhysX had never sold out to Nvidia, and stayed neutral, we would not be having this conversation right now lolz 

*pantherx12* is right! "Open standards are precisely what consumers should feel most strongly about, if everyone pushed forward open standards then hardware would get better as companies are really forced to make better hardware than the other company."

We as consumers should demand this.


----------



## Wile E (Dec 24, 2010)

Magikherbs said:


> I highly doubt an Nvidia card will score the same with PhysX disabled in 3DMark06/11, NFS Shift, the FFXIV benchmark and others. NFS Shift goes so far as to install PhysX even though my GPU is not Nvidia. ahha.......
> 
> If PhysX had never sold out to Nvidia, and stayed neutral, we would not be having this conversation right now lolz
> 
> ...


GPU PhysX doesn't even affect those games. On or off makes no difference. If nVidia is faster, it's purely because the card is faster in those games/benches. Most games that use PhysX run it on the CPU, not the GPU, just like Havok does. Go here: http://www.geforce.com/#/GamesandApps and click on the PhysX link. Only 13 games use GPU acceleration.

And you seem to have missed the fact that I was primarily talking about CUDA, not PhysX, anyway. There are no physics engines written in OpenCL to my knowledge, so there is nothing here to compare PhysX against.

Why should we feel strongly about open standards? CUDA is free to use. Open standards haven't made Linux the better choice for most people. You aren't getting all worked up over OpenGL instead of Microsoft's proprietary DirectX. I just want well-supported APIs that cost me nothing as an end user. I don't care who developed them. APIs with no apps are useless.


----------



## pantherx12 (Dec 24, 2010)

Yo Wile E, Bullet physics supports OpenCL (or will do very soon); download their precompiled demos, there are a few GPU-based demos in there.

Also Mathematica uses OpenCL. Mathematica is a program that's 20 years old (although it's had OpenCL support since 2010), and it's one of THE maths research programs.

CUDA had a two-year head start, is all, man. I think most people are going to use the open standard now that it's starting to become a bit more polished; it's stupid not to, lol, it restricts how many systems their software can run on otherwise : ].


You make it sound like CUDA is inherently better (may have spelt that wrong, he he), whereas it's not; it's just more stable, a bit more grown up.

But OpenCL should have a massive growth spurt soon, as it CAN do everything CUDA can, but it can do it running on ANY hardware.

That is a HUUUUUUUUUUUUGEEEEEEEEEE thumbs up when you're a software designer  

Us end users are the least important when it comes to stuff like this, remember that.


Also, Wile E, since you don't work for AMD you can't really comment on what they do behind the scenes. 
Loads of people say "ahh, they don't give dev support!" whereas they just didn't make a song and dance about it; they helped with loads of the launch DX11 titles, but no one knew because there were no AMD logos splattered all over the place. (AvP for example; can't remember the rest, I'll need to watch the video again where all the devs met up.)

I suspect that could be the case with their software development.

You should check out their developer forums; their reps are ALWAYS on them (their avatars have headsets), so they're even helping normal people using their SDKs, not just companies.

(Sure, nvidia does the same, but everyone assumes AMD doesn't.)


I could go there now and get help developing an app that I've designed myself...... if I knew the first thing about coding, that is!


(By the way, don't let my specs fool you, I'm not a fanboy, I like whatever company gives me the best stuff! he he)


----------



## Wile E (Dec 24, 2010)

There have been more than a few devs that specifically mentioned that ATI would not help them, but nVidia would.

And again, just because it's open, does not mean it will gain more support. OpenGL has less support than DirectX, despite being open and cross platform. OS X has more support than Linux, despite being a closed platform. Open is not always better for the consumer.

It will all come down to ease of developing for it. For some reason, open platforms seem to be harder to code for, and many devs find it not worth doing.

Like I said, I don't care about the behind-the-scenes, I only care what I can do with it. Mathematica serves me no purpose. How many Bullet Physics titles have OpenCL acceleration? Again, completely irrelevant to an end user. OpenCL will not be relevant until more things use it.

And look at my specs, I'm no fanboy either.


----------



## pantherx12 (Dec 24, 2010)

I was only saying the fanboy bit to avoid being accused of being one for the long post 

lol

OpenGL is less successful because Microsoft has DirectX and most people use their operating system anyway (i.e. it has a massive market share, so open source has no advantage); no point using OpenGL just to please a tiny market segment (Linux and OS X).

So it's a bit of a different kettle of fish, but I can see what you mean.

And those were just examples to show that people will be using OpenCL and are using OpenCL.

And when something big like Mathematica uses it, that's a good sign of things to come : ]


----------



## Magikherbs (Dec 24, 2010)

Wile E said:


> GPU Physx doesn't even effect those games. On or off makes no difference. if nVidia is faster, it's purely because the card is faster in those games/benches. Most games that use Physx use the CPU, not the gpu, just like havok does. Go here: http://www.geforce.com/#/GamesandApps and click on the Physx link. Only 13 games use gpu acceleration.
> 
> And you seemed to miss the fact that I was primarily talking about CUDA, not Physx anyway. There are no physics engines written in OpenCL to my knowledge, so there is nothing here to compare Physx against.
> 
> Why should we feel strongly about open standards? CUDA is free to use. Open standards hasn't made Linux the better choice for most people. You aren't getting all worked up over OpenGL instead of Microsoft's proprietary DirectX. I just want well supported APIs that costs me nothing as an end user. I don't care who developed them. APIs with no apps are useless.



Why don't I see Need for Speed Shift on that list of PhysX games, lol..

Pardon the slight change of topic from CUDA to PhysX. Maybe I should not have included CUDA in my initial statements, b/c CUDA drivers/code are written by Nvidia and not the software developers.


----------



## wahdangun (Dec 24, 2010)

Wile E said:


> It's not contradictory at all. Open standards aren't better for the consumer when nobody adopts them into their software. OpenCL is completely irrelevant for me, because nothing uses it.
> 
> So, buy into an Open standard with no software, or buy into a free to use, but proprietary standard that does have software?
> 
> As an end user, I know which one I am picking.



But if no one picks up the open standard, then how do you expect it to become mainstream??


----------



## bear jesus (Dec 24, 2010)

As I said, the "cuda sdk came out near 4 years ago yet ATI/AMD released their openCL sdk a little over a year ago"; how about we see how OpenCL is doing in 3 years?


----------



## W1zzard (Dec 24, 2010)

bear jesus said:


> As i said "cuda sdk came out near 4 years ago yet ATI/AMD released their openCL sdk a little over a year ago" how about we see how opencl is doing in 3 years?



http://en.wikipedia.org/wiki/Close_to_Metal around late 2005


----------



## bear jesus (Dec 24, 2010)

W1zzard said:


> http://en.wikipedia.org/wiki/Close_to_Metal around late 2005



Wow, I don't think I have ever been so wrong in my life  My bad.


----------



## W1zzard (Dec 24, 2010)

bear jesus said:


> wow i don't think i have ever been so wrong in my life  my bad



no worries. we're all here to learn.


----------



## bear jesus (Dec 24, 2010)

W1zzard said:


> no worries. we're all here to learn.



Very true  But the silly thing is I poked Google to try and gather the dates I listed; darn internet, I blame it for my knowledge and lack of it 

I would hope that with more time OpenCL will become more used, as my last two cards have been AMD/ATI, so CUDA has been unavailable to me for years now and I would love to be able to put my cards to some useful GPGPU processing.


----------



## Wile E (Dec 25, 2010)

Magikherbs said:


> Why don't I don't see Need for Speed Shift on that list of Physx games lol..
> 
> Pardon the slight change of topic from Cuda to Physx. Maybe, I should not have included Cuda in my initial statements b/c Cuda drivers/code are written by Nvidia and not the software developers.



Because Shift does not use GPU PhysX. It is just CPU PhysX. It does not use the video card for running any of the PhysX calculations at all. Most PhysX games are run on the CPU.


----------



## HalfAHertz (Dec 25, 2010)

OpenGL is not popular anymore because the API is so cumbersome to write for. After the Khronos Group took it over and started slapping "patches" left and right to fill the widening gap between it and DX, things quickly went downhill. Game devs nowadays need to put out a game as quickly as possible, and because writing for DX is quicker and simpler, most people go that way.
   OpenCL, on the other hand, has been kept pretty tight so far. But I think development has slowed to a crawl because it is trying to cover so many different interests (Nvidia, AMD, Intel) and because the big three will never come to an agreement.


----------



## Magikherbs (Dec 25, 2010)

Wile E said:


> Because Shift does not use GPU PhysX. It is just CPU PhysX. It does not use the video card for running any of the PhysX calculations at all. Most PhysX games are run on the CPU.



Then why does Need for Speed Shift install the Nvidia PhysX software regardless of your config? There is no option, it just installs it. heh..

I just remembered how the new Nvidia drivers seem to relieve your CPU of its physics duties. hehh.. How much does that affect things?


----------



## pantherx12 (Dec 25, 2010)

Magikherbs said:


> Then why does Need for Speed Shift install the Nvidia Physx software regardless of your config ?  There is no option, it just installs it. heh..
> 
> I just remembered how the new Nvidia drivers seem to relieve your cpu of its physics duties. hehh.. How much does that affect things ?
> 
> http://img.techpowerup.org/101225/Capture.jpg



It installs it so your CPU can run PhysX....

PhysX isn't something that just runs; it needs to be installed XD


----------



## wahdangun (Dec 27, 2010)

W1zzard said:


> no worries. we're all here to learn.



Btw Wizz, where is the HD 6970 CF review??


----------



## Magikherbs (Dec 27, 2010)

pantherx12 said:


> It installs it so your CPU can run PhysX....
> 
> PhysX isn't something that just runs; it needs to be installed XD



Huh.. I didn't know it could run PhysX without an Nvidia GPU.

My CPU can run physics just fine, lol. I was trying to say that EA Games seems to have some vested interests.


----------



## pantherx12 (Dec 27, 2010)

Loads of games actually run the PhysX software on the CPU; just hardly any use GPU PhysX.

Big list on the nvidia website : ]


----------



## DarthElvis (Dec 31, 2010)

W1zzard said:


> i love porn too, but didnt install it and didnt benchmark it.



Well then you're leaving out a benchmark that way more than 5% of users could use

Back on topic: 10%-15% more performance with a 580 for 30% more cost. Yeah, ATI failed alright

6970 prices have stabilized here to about the same price as the 570, which is pretty much its equal.


----------



## Super XP (Jan 1, 2011)

DarthElvis said:


> Well then you're leaving out a benchmark that way more than 5% of users could use
> 
> Back on topic: 10%-15% more performance with a 580 for 30% more cost. Yeah, ATI failed alright
> 
> 6970 prices have stabilized here to about the same price as the 570, which is pretty much its equal.


Elvis fan I see; yes, he was the KING, oh, and still remains the KING 
ATI's Radeon HD 6970 2GB GDDR5 graphics card currently holds the best price/performance here in Canada, IMO. It all depends on the games you want to play. I also like the FACT that AMD/ATI's HD 6800 and HD 6900 series scale AMAZINGLY in CrossFireX vs. previous-gen graphics cards.


----------



## Wile E (Jan 3, 2011)

Super XP said:


> Elvis fan I see, yes he was the KING, oh and still remains the KING
> ATI's Radeon HD 6970 2GB GDDR5 Graphics Card currently holds the best Price/Performance here in Canada IMO. It all depends on the games you want to play. I also like the FACT AMD/ATI's HD 6800 and HD 6900 series scales AMAZINGLY in CrossfireX vs. previous Gen Graphics Cards.



Where is everybody getting that they scale better in Crossfire than previous generations? I haven't seen this anywhere. It all scales about evenly, including SLI setups, when the cards are of similar capabilities singly.


----------



## bear jesus (Jan 3, 2011)

Wile E said:


> Where is everybody getting that they scale better in Crossfire than previous generations? I haven't seen this anywhere. It all scales about evenly, including SLI setups, when the cards are of similar capabilities singly.



I think everyone is referring to the fact that the 68xx and 69xx cards scale better than past mid/high-end ATI/AMD cards; if you take a look at the TPU reviews of the 68xx and 69xx cards you can see some games/resolutions scale up to 100%, whereas past cards were often well below 80%.

Just as an example, in Crysis at 2560x1600 the 5850 got 16.7 fps and 5850 Crossfire got 25.9 fps, while the 6950 got 20.9 fps and 6950 Crossfire got 41.2 fps.
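Taking those quoted frame rates at face value, the per-game scaling gain they imply can be checked with a quick sketch:

```python
# Crossfire scaling gain from single-card vs. CF frame rates:
# gain = (CF fps / single fps - 1), expressed as a percentage.
def cf_scaling(single_fps: float, cf_fps: float) -> float:
    return 100.0 * (cf_fps / single_fps - 1.0)

# Crysis at 2560x1600, using the numbers quoted in the post above.
hd5850_gain = cf_scaling(16.7, 25.9)  # second 5850 adds roughly 55%
hd6950_gain = cf_scaling(20.9, 41.2)  # second 6950 adds roughly 97%

print(f"5850 CF: +{hd5850_gain:.0f}%, 6950 CF: +{hd6950_gain:.0f}%")
```

So in this one title the 6950 pair is close to perfect scaling, while the 5850 pair gains only about half a card; as the discussion below notes, one game is not a conclusive result.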


----------



## Wile E (Jan 3, 2011)

bear jesus said:


> I think everyone is referring to the fact that the 68xx and 69xx cards scale better than past mid/high-end ATI/AMD cards; if you take a look at the TPU reviews of the 68xx and 69xx cards you can see some games/resolutions scale up to 100%, whereas past cards were often well below 80%.
> 
> Just as an example, in Crysis at 2560x1600 the 5850 got 16.7 fps and 5850 Crossfire got 25.9 fps, while the 6950 got 20.9 fps and 6950 Crossfire got 41.2 fps.



One test is hardly a conclusive result, however.

At 2560x1600, if I did my math right, overall scaling is at 69% for 6950 Crossfire; it's listed at 68% for 5870 Crossfire. I'd call 1% within the margin of error, or even attributable to driver updates since the Crossfire 5870 test (run on a beta between Cat 9.10 and 9.11).


----------



## bear jesus (Jan 3, 2011)

Wile E said:


> One test is hardly a conclusive result, however.
> 
> At 2560x1600, if I did my math right, overall scaling is at 69% for 6850 Crossfire, it's listed at 68% for 5870 Crossfire. I'd call 1% within the margin for error, or even attributable to driver update since the Crossfire 5870 test (run on a beta between Cat 9.10 and 9.11).



I was just picking a result that showed what I was talking about; I know it alone proves nothing. 

With the 6870/50 reviews there was a very long-winded chat in the comments about how the overall scaling results were much lower than they really were according to the numbers in the results, as most people were coming up with over 80% scaling, and after removing a couple of games that had negative scaling/no scaling at all it went into the 90s.

I should probably take a closer look at the 6950 Crossfire review, though, as I admit I only skimmed through it


----------



## Wile E (Jan 3, 2011)

Yeah, I meant 6950. fixed.

And going by the tpu review, the 6800 cards still scale at around 60% at 2560x1600.


----------



## bear jesus (Jan 3, 2011)

Wile E said:


> Yeah, I meant 6950. fixed.
> 
> And going by the tpu review, the 6800 cards still scale at around 60% at 2560x1600.



Are you using the overall result at the end that W1zzard said was wrong, or adding the results together yourself?

I have been referring to the results that forum members got doing the math themselves, as they saw that the overall scaling results at the end were wrong.

Such as this post here

I'm unsure if the issue has been fixed, but this is what W1zzard had to say about them


----------



## Wile E (Jan 3, 2011)

I'm going by the relative performance charts @ 2560x1600. Didn't bother looking at 1920.


----------



## Over50 (Jan 3, 2011)

*just a look at 68xx and 69xx*

ATI/AMD are making new cards to replace some older ones in the market, helping to reduce the cost of the nVidia GTX 460, GTX 470, GTX 480 and GTX 570, then possibly the GTX 580.

The 68xx are more like the 57xx replacements,
while the 69xx are there to compete with the nVidia GTX 570 and GTX 580.

Still, the ATI HD 5870 and HD 5970 are great GPUs.

Here we're seeing ATI/AMD using 2GB over the 1GB and 564MB, also GDDR5.
Not only that, but ATI GPUs are best configured for CrossFireX to get that scaling ratio.

So it depends on your system setup: do you prefer loading up PCIe x16 2.0 slots with large ATI CrossFireX cards, or a single nVidia card, or 4-card SLI?

The ATI 68xx series are throwbacks to mid-level performance, and
the 69xx series are the new high-end performers.

From what I have been reading in reviews and various forums, it only makes sense that ATI/AMD is attempting to keep up with nVidia. Unfortunately, nVidia will always be one step ahead of ATI.

So trying is better than giving up... we should appreciate ATI/AMD's efforts in keeping nVidia
GPUs at somewhat affordable prices. Otherwise we would be paying $500 to $1000 for those top cards all the time. That's competition, rather than one having a fully established monopoly. Thank you ATI!!!! Can't wait till the GTX 580 is marked down some so I can buy it.


----------



## bear jesus (Jan 3, 2011)

Wile E said:


> I'm going by the relative performance charts @ 2560x1600. Didn't bother looking at 1920.



I suck at trying to make a point; I was really not paying attention to what I was linking to resolution-wise, but the results were wrong for every resolution, including 2560x1600.


----------



## Thatguy (Jan 3, 2011)

Over50 said:


> ATI/AMD making new cards to replace some old in the market and helping to reduce cost of nVidia GTX 460, GTX 470, GTX 480 and GTX 570 then possibly GTX 580.
> 
> the 68xx are more like the 57xx replacements.
> While the 69xx are to compete with nVidia GTX 570 and GTX 580.
> ...



What kind of bullshit is that? Right, and the 5xxx cards went unanswered for how long?

Oh, that's right: they didn't.

The 6xxx cards were two different releases. TSMC screwed them on a node, so the 68xx cards were refreshes, while the 69xx cards were a pretty significant architectural change.

I mean, it's such a shame that the 69xx cards kick ass at high resolution. What's nVidia gonna do when they drop the dual-GPU 6990 at the 580's price?

BTW, the 580 is a nice card, but there's no reason to spout bullshit in a forum.

If anything, the market is neck and neck, with one always edging out the other. This round is a win for AMD/ATI on price/perf/watt, and they will again have the single-card crown with the 6990; nVidia can't compete with CrossFire 6970's.

It's such a shame that ATI/AMD just can't compete. Tragic, really.


----------



## Wile E (Jan 3, 2011)

bear jesus said:


> I suck at trying to make a point; I was really not paying attention to what I was linking to resolution-wise, but the results were wrong for every resolution, including 2560x1600.



I calculated using this chart:

http://tpucdn.com/reviews/ASUS/Radeon_HD_6950_CrossFire/images/perfrel_2560.gif


----------



## bear jesus (Jan 3, 2011)

Wile E said:


> I calculated using this chart:
> 
> http://tpucdn.com/reviews/ASUS/Radeon_HD_6950_CrossFire/images/perfrel_2560.gif



My only problem is that I do not know if the script has been fixed; I would hope so by now, but once again, without doing the math, the overall number seems lower than I would expect given the following results at 2560x1600:



| game | single (fps) | dual (fps) |
|:------------|-----:|-----:|
| avp         | 22   | 44   |
| bc2         | 37   | 76   |
| bf2         | 27   | 56   |
| cod4        | 80   | 151  |
| crysis      | 20   | 41   |
| dawn of war | 88   | 170  |
| far cry 2   | 55   | 111  |
| hawx        | 85   | 169  |
| metro       | 14   | 25   |
| riddick     | 39   | 75   |
| stalker     | 26   | 51   |
I know I'm missing a bunch of results, but with so many around 100%, 41% seems a bit low. To be honest, though, I'm too lazy to do the math, so could we just leave it at this: most people are looking at the fact that many games now scale to 100%, whereas they were lower with past AMD/ATI cards.

Whether the relative performance number at the end is right or not, a lot of people mainly look at the large number of perfect-scaling games and thus say the cards scale much better.

I feel we are going a little off topic here, though, so I hope I have explained myself well enough; I can be a quiet bear for now
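For what it's worth, the averaging being argued about here is easy to script. A minimal sketch in Python, using only the single/dual FPS numbers from the table above (game labels kept as in the post):

```python
# Single-card vs CrossFire FPS at 2560x1600, taken from the table above.
results = {
    "avp": (22, 44), "bc2": (37, 76), "bf2": (27, 56),
    "cod4": (80, 151), "crysis": (20, 41), "dawn of war": (88, 170),
    "far cry 2": (55, 111), "hawx": (85, 169), "metro": (14, 25),
    "riddick": (39, 75), "stalker": (26, 51),
}

# Scaling gain per game: how much faster the dual-card setup is
# than a single card (1.0 means perfect 100% scaling).
scaling = {game: dual / single - 1 for game, (single, dual) in results.items()}

for game, s in sorted(scaling.items()):
    print(f"{game:12s} {s:+.0%}")

avg = sum(scaling.values()) / len(scaling)
print(f"{'average':12s} {avg:+.1%}")
```

On these numbers the per-game average comes out around 97%, which fits the "many games scale near 100%" reading rather than the 41% figure from the overall chart.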


----------



## Wile E (Jan 3, 2011)

It's not 41% scaling. A single card is 41% slower than the CrossFire pair, but that works out to approximately 70% scaling for dual cards compared to a single one: 41/59 = .6949 and change.

It's within a few percentage points of that with all cards of recent generations.
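That conversion can be made explicit. A small sketch of the arithmetic being described, with the 59%/100% bars being the chart values implied above (single card at 59% relative to the CrossFire pair):

```python
# Relative-performance charts normalize the fastest setup to 100%.
# If a single card's bar sits at 59% and the CrossFire pair at 100%,
# the gain from adding the second card is the ratio between them.
single_rel = 59.0   # single card's bar on the chart, in %
dual_rel = 100.0    # CrossFire pair's bar, in %

gain = dual_rel / single_rel - 1   # = 41/59
print(f"{gain:.2%}")               # prints "69.49%"
```

"41% slower" and "~70% faster" describe the same gap; they just use different baselines (the pair vs the single card).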


----------



## bear jesus (Jan 3, 2011)

Wile E said:


> It's not 41% scaling, a single is 41% slower than Crossfire, but that is approximately 70% scaling for dual cards compared to single. 41/59 = .6949 and change.
> 
> It's within a few percentage points of that with all cards of recent generations.



I must admit I had never thought of the graphs like that, but now that makes me wonder whether everyone else was wrong when they were working out the scaling in the 68xx card reviews. Although that confuses me as to why w1zzard agreed they were wrong... I think I should be quiet, as I'm just confused now


----------



## pantherx12 (Jan 3, 2011)

Wile E said:


> I calculated using this chart:
> 




That's not calculating dude XD

Look at each game's result instead; those overall performance graphs are useless 

They're for at-a-glance looks at which cards are good buys, essentially (it's card vs card rather than a card's individual results, it seems). It's worth working out the results yourself as well: once you've done that, you can remove the games/benchmarks you find irrelevant to your use (such as older titles/benches etc.) and get a performance figure catered to what you'll be using the card for : ]

Hence me getting a 6870 for future crossfire fun!  Removing the games I didn't play/wasn't interested in, plus the benchmarks, left me with an average scaling of 93% or something mad at HD resolution 

If you check out crazyeyes' 6970 review, I did the individual scaling results for him there (he edited his posts to include them now, so they're easy to find). They're less impressive than the 6870's, but you can see there are some driver issues with certain games/titles not compatible with crossfire or something.


----------



## Wile E (Jan 3, 2011)

Did you do the same for 5800 series, or any nVidia cards?


----------



## pantherx12 (Jan 3, 2011)

Not anywhere on the forum, but I did do the 460 SLI scaling, which is also very good stuff.

I only did the 6xxx cards because the results were so obviously incorrect. I always read through each benchmark and ignore the graphs at the end (except noise and power); at a glance I knew a lot were scoring 90+, so I went and did the math  lol (Wiz said his table, or whatever he uses to average the performance, went wrong for the crossfire results)


----------



## Wile E (Jan 3, 2011)

Well, do the same math on the nVidia and 5800 series cards and post your results here. It would settle the scaling debate once and for all. I'm too lazy. lol.


----------



## pantherx12 (Jan 3, 2011)

Wile E said:


> Well do the same math on the nvidia and 5800 series cards, and post your results here. It would settle the scaling debate once and for all. I'm too lazy. lol.



If you could remind me in 10 hours or so sir, I'm off to work soon


----------



## Wile E (Jan 3, 2011)

I was only kidding, but hey, if you really want to, I will not stop you. lol.


----------



## Lycos (Jan 3, 2011)

*New Catalyst?*

I think the 69xx cards have a huge potential with a new driver. When will the next version of Catalyst be released?


----------



## Over50 (Jan 3, 2011)

Thatguy said:


> What kind of bullshit is that ? right and the 5xxxx cards went unanswered for how long ?
> 
> oh thats right, they didn't.
> 
> ...



Thank you! You said it... it takes ATI/AMD GPUs in CrossFireX to beat a single nVidia GPU.

Please refer to your own comments: "nvidia can't compete with crossfire 6970's" and "the dual gpu 6990 for the 580 price."

You are very contradictory!!!!


----------

