# AMD's Own HD 7970 Performance Expectations?



## btarunr (Dec 20, 2011)

Ahead of every major GPU launch, both NVIDIA and AMD hand reviewers a document known as a Reviewer's Guide, which provides guidelines (suggestions, not instructions) to ensure the new GPUs are given fair testing. In these documents, the two often also state their own performance expectations for the GPUs they're launching, comparing the new GPUs either to previous-generation GPUs of their own brand or to the competitor's. Apparently such a performance comparison between the upcoming Radeon HD 7970 and NVIDIA's GeForce GTX 580, probably part of such a document, leaked to the internet and was re-posted by 3DCenter.org. The first picture below is a blurry screenshot of a graph in which the two GPUs are compared across a variety of tests at a resolution of 2560 x 1600. A Tweakers.net community member recreated that graph in Excel using the same data (second picture below). 

A couple of things here are worth noting. Reviewer's guide performance numbers are almost always exaggerated, so if reviewers get performance results lower than 'expected', they find it abnormal and re-test. It's an established practice both GPU vendors follow. Next, AMD Radeon GPUs are traditionally strong at 2560 x 1600. For that matter, even the performance gap between the Radeon HD 6970 and GeForce GTX 580 narrows a bit at that resolution.



 



*View at TechPowerUp Main Site*


----------



## pantherx12 (Dec 20, 2011)

I hate these types of graphs. Obviously easy to understand if you read them, but at a glance so misleading lol

OVER DOUBLE PERFORM... oh, a 30% improvement in AVP, not double :[ (this would be 30% faster than the 6970 as well, by the by)

Still, looking forward to the TPU review on the 22nd to see if results are anywhere close to this!


----------



## trickson (Dec 20, 2011)

They look so grand. AMD knows how to put on a show, for sure. One thing though, there is a lot to be said about ATI; they do a great job for sure. I love ATI!


----------



## crazyeyesreaper (Dec 20, 2011)

Yea, if the performance is good I'll be grabbing a 7970 and selling my 6950s cheap, just so I have something new to play with.


----------



## btarunr (Dec 20, 2011)

Again, both AMD and NV make their graphs look like that.


----------



## DannibusX (Dec 20, 2011)

Bring the heat, AMD, and you can have around $1,400 out of my pocket.


----------



## trickson (Dec 20, 2011)

btarunr said:


> Again, both AMD and NV make their graphs look like that.



I think it is called marketing.


----------



## btarunr (Dec 20, 2011)

trickson said:


> I think it is called marketing .



You already know that, right? So don't single AMD out to flame.


----------



## erocker (Dec 20, 2011)

Since AMD is a North American company I would think they would use "1.6" rather than "1,6" in the graph. The graph looks a bit generic too, but the results seem to be going along with all of the other rumors.


----------



## trickson (Dec 20, 2011)

btarunr said:


> You already know that, right? So don't single AMD out to flame.



Oh, no way I would think of this! I have to tell you, AMD/ATI has the best GPUs ever. I have always loved them, and I love NVIDIA as well. I just think that ATI has way more bang for the buck than the other.


----------



## btarunr (Dec 20, 2011)

erocker said:


> Since AMD is a North American company I would think they would use "1.6" rather than "1,6" in the graph. The graph looks a bit generic too, but the results seem to be going along with all of the other rumors.



If you read the OP, the second image/graph is a recreation of the first (blurry) one. It's possible that the guy who recreated it is European. The first image/graph uses a period.


----------



## erocker (Dec 20, 2011)

btarunr said:


> If you read the OP, the second image/graph is a recreation of the first (blurry) one. It's possible that the guy who recreated it is European.



Ah, quite right. I quickly moved to the 2nd pic due to the blurriness of the 1st.


----------



## John Doe (Dec 20, 2011)

btarunr said:


> Again, both AMD and NV make their graphs look like that.



Yeah, this isn't new. These graphs are always deceiving. If the card was indeed 1.6 times faster, it'd have to get 100 FPS where a 580 gets 40 FPS... or for example, this chart is also similar. It glorifies the 7950GX2 as the best card, when in reality it was plagued with driver issues, and the X1900 XTX has close performance on itself alone. With a more advanced GPU that can do AA and HDR at the same time, and no dual GPU worries. ATi cards owned at that time.


----------



## DannibusX (Dec 20, 2011)

Both pictures show you what effect alcohol has on you.  The first one is totally this hot babe that you were lucky enough to pick up at the bar and the second one is what comes into focus when you find her penis.


----------



## btarunr (Dec 20, 2011)

erocker said:


> Ah, quite right. I quickly moved to the 2nd pic due to the blurriness of the 1st.








j/k


----------



## trickson (Dec 20, 2011)

I am willing to bet that this is spot on. Maybe even better than what is shown in the graphs.


----------



## trickson (Dec 20, 2011)

DannibusX said:


> Both pictures show you what effect alcohol has on you.  The first one is totally this hot babe that you were lucky enough to pick up at the bar and the second one is what comes into focus when you find her penis.



NOW THAT IS funny. I bet you have done this too!


----------



## bear jesus (Dec 20, 2011)

It looks like they average out to about 45% faster than the 580. If that is right it seems relatively impressive, although I will wait until I have read the reviews before I start being impressed.


----------



## erocker (Dec 20, 2011)

Sorry, I thought I needed 3D glasses to see the first pic properly. 

So, based on the graph, I come up with the 7970 being about 48% faster than the GTX 580 on average.


----------



## dieterd (Dec 20, 2011)

Don't get fanatical about this, because:
1. Remember the Bulldozer graphs?
2. If you have to pay (a major price increase vs. the 6970) for every extra frame, then there is no point to this new GPU generation; you could get that kind of performance increase by just grabbing a 6990 or a 6950 X2 for the same price!


----------



## trickson (Dec 20, 2011)

erocker said:


> Sorry, I thought I needed 3D glasses to see the first pic properly.
> 
> So, based on the graph, I come up with the 7970 being about 48% faster than the GTX 580 on average.



That is what I gather from it. I am willing to bet that is spot on.


----------



## bear jesus (Dec 20, 2011)

erocker said:


> Sorry, I thought I needed 3D glasses to see the first pic properly.
> 
> So, based on the graph, I come up with the 7970 being about 48% faster than the GTX 580 on average.



I like your percentage more than my guess.


----------



## erocker (Dec 20, 2011)

dieterd said:


> just grab 6990 or 6950x2 for same price!



Some would rather have the stability of a single GPU and not have to deal with the occasional dual-gpu issues.

More maths: Using W1zz's reviews for readings, the 7970 gets 11 FPS more in Metro 2033 than the GTX 580 3 GB card.


----------



## b82rez (Dec 20, 2011)

>2011
>Trusting AMD graphs after Bulldozer marketing
>ISHYGDDT


----------



## John Doe (Dec 20, 2011)

dieterd said:


> dont get fanatic about this, because:
> 1. remember Buldozzer graphs?
> 2. if you have to cash (major price increase vs 6970) for every increased frame then it is no use of this new GPU generation and you copuld get that kind of preformance increase and just grab 6990 or 6950x2 for same price!



I wouldn't get any dual-GPU card of this generation. They're to be stayed away from for a few reasons, IMO.


----------



## Hayder_Master (Dec 20, 2011)

Lol, AMD are so tricky. Sure it's better than the GTX 580, but not by this much; they used 2560 x 1600 resolution because it uses more RAM. Lol, they should compare with the GTX 580 3GB, or run all of this at 1920 x 1080.
Nice move AMD, but we got you


----------



## trickson (Dec 20, 2011)

Just because BD was a bust doesn't mean ATI will be. I have yet to see one ATI card let me down.


----------



## John Doe (Dec 20, 2011)

trickson said:


> I have yet to see one ATI card let me down .



These were all nice and dandy (I didn't care about power consumption) till I put them in CrossFire. Being forced to change application names and getting kicked out of Steam servers was frustrating... not to mention the heat in CF. And I had a GT too.


----------



## trickson (Dec 20, 2011)

John Doe said:


> These were all nice and dandy (I didn't care about power consumption) till I put them in CrossFire. Having forced to change application names to get kicked out of Steam servers was frustrating... not to mention the heat in CF. And I had a GT too.
> 
> http://www.buraak.com/wp-content/uploads/2007/06/radeon_hd_2900_xt.jpg



What, the flames on it did not give it away?


----------



## John Doe (Dec 20, 2011)

lol yeah, they were built solid though, with digital power delivery and such. I actually really liked them until they started giving issues in CF. ATi Tray Tools was nice for IQ tweaks. You can still DL it from Ray Adams' SkyDrive, AFAIK. 

I think that was the only "bad" ATi/AMD GPU. nVidia has had more disappointing GPUs than AMD has (*cough 590/570 cough*). AMD was the only one that provided more than sufficient power phases, with Volterras.


----------



## pantherx12 (Dec 20, 2011)

John Doe said:


> Yeah, this isn't new. These graphs are always deceiving. If the card was indeed 1.6 times faster, it'd have to get 100 FPS where a 580 gets 40 FPS...




Maths fail, dude; 1.6 times 40 would be 64.

Considering 2 x 40 is only 80, he he
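As a quick sanity check of that arithmetic, here is a two-line sketch using John Doe's hypothetical 40 FPS GTX 580 baseline (not a measured result):

```python
# "X times faster" sanity check: a speedup multiplies the baseline frame rate.
def scaled_fps(baseline_fps, speedup):
    """FPS of a card that runs `speedup` times as fast as the baseline."""
    return baseline_fps * speedup

gtx580_fps = 40.0  # hypothetical figure from the post above
print(scaled_fps(gtx580_fps, 1.6))  # 1.6x -> 64.0 FPS, nowhere near 100
print(scaled_fps(gtx580_fps, 2.0))  # even a full doubling is only 80.0 FPS
```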


----------



## BazookaJoe (Dec 20, 2011)

*Rant in 3, 2, 1 .... 

Yeah - ya know what? I've been reading articles on AMD/ATI's "EXPECTATIONS" for the last 5 years now and seen nothing but soggy half-arsed failure after soggy half-arsed failure that ONLY EVER achieves reasonable frame rates by REMOVING GRAPHIC PROCESSING from your games - reducing textures, texture processing and filtering against your will and without your permission , at driver level.

Forcing FSAA so be disabled with no way to enable it - and other cheap underhanded Bull-SCHYYTE.

I have to rename all my game exe's so I can at least use *AA, BECAUSE THEIR ASS-HAT SOLUTION TO "IMPROVING PERFORMANCE" IS TO JUST FORCE DISABLE (IN THE DRIVER) ANY QUALITY IMPROVING SETTINGS* , and I'm sick of it - this company has failed - it cant compete and it must stop *LYING* to its customers, because no matter WHAT result it claims to gets, it only gets it BECAUSE it skewed the results, the environment, AND the benchmark itself just to get them and it will never live up to that in real life without DISABLING half your options before you play any game.


----------



## theJesus (Dec 20, 2011)

pantherx12 said:


> Maths fail dude, 1.6 times 40 would be 64.
> 
> Considering 2 x 40 is only 80 he he


Damn, beat me to it


----------



## John Doe (Dec 20, 2011)

pantherx12 said:


> Maths fail dude, 1.6 times 40 would be 64.
> 
> Considering 2 x 40 is only 80 he he



No, it's not a math fail. I thought of it as 260% instead of 160%.


----------



## pantherx12 (Dec 20, 2011)

John Doe said:


> No, it's not a math fail. I thought of it as 260% instead of 160%.



That is still a maths fail.

At the very least it's a reading comprehension fail.

Either way you should just admit the fault perhaps laugh it off and move on.





BazookaJoe said:


> I have to rename all my game exe's so I can at least use *AA, BECAUSE THEIR ASS-HAT SOLUTION TO "IMPROVING PERFORMANCE" IS TO JUST FORCE DISABLE (IN THE DRIVER) *


... Erm... what?

I've never had this problem EVER.

You should probably change all the sliders in Catalyst to quality instead of performance, as this disables any optimisations.


----------



## BazookaJoe (Dec 20, 2011)

pantherx12 said:


> I've never had this problem EVER



Unless you NEVER play games panther - that is flat out impossible.

Skyrim is the best recent example.

After a "Performance" upgrade from AMD - FSAA was simply REMOVED at driver level - no matter what you set in the game or your Catalyst driver there is NO AA anymore - and the whole gfx goes to schyte with creepy crawly aliased edges all over the screen - but if you simply Alt-F4, find tesv.exe, rename it to quake.exe, and run the game again? BAM, all processing works perfectly again.

It is a very well known fact that AMD/ATI have been cheating in their driver using EXE name detection for YEARS, deleting textures and grass in games like Far Cry to reduce load on their struggling hardware, even flat out cheating in FurMark - manipulating settings to affect the benchmark - as covered in an article published on this TPU forum a few years ago.

(Edit : the Stock cooler design was so inferior that if you actually maxed the card EVEN AT STOCK CLOCKS it could physically destroy itself)

YES the idea of "Per-Game Optimizations" started as a good idea - to boost compatibility with various games, and both nVidia and AMD/ATI Use it - the fact was that it was there for COMPATIBILITY - not to cover up how lame ass rubbish your useless gfx cards are by force disabling peoples *PERFECTLY COMPATIBLE* quality settings just to get a better frame rate.

That is just plain fraud - and in my limited personal opinion (one not representative of TechPowerUp or its administration), based on my personal experience with AMD/ATI products, I believe AMD/ATI are frauds - and anyone new to the game should take their "Claims" and "Expectations" with a large tablespoon of salt.


----------



## Patriot (Dec 20, 2011)

*Temp Fix*

http://www.hardocp.com/article/2011/12/19/amd_catalyst_121_preview_profiles_performance

New controls to choose how those application profiles are configured...
So instead of "lying" - hardcoding temporary fixes in the driver that reduce IQ but fix other issues - users can now tweak to their hearts' content...

and... it's Nvidia who coded a profile for FurMark to keep their cards from dying while running it...

sigh... :shadedshu


----------



## pantherx12 (Dec 20, 2011)

BazookaJoe said:


> Unless you NEVER play games panther - that is flat out impossible.
> 
> Skyrim is the best recent example.
> 
> After a "Performance" Upgrade from AMD - Fsaa was simply REMOVED at driver level - no matter what you set in the game or your catalyst driver there is NO AA anymore - and the whole gfx goes to schyte with creepy crawly aliased edges allover the screen - but if you simply Alt-f4, find tesv.exe, Rename to quake.exe, and run the game again? BAM all processing now works perfectly again.





Thing is, it works fine for me. Are you sure you have looked at your Catalyst settings?

More specifically, "Catalyst A.I." It may say it's just for textures in the description, but setting it to high quality instead of performance/quality seems to disable all optimisations.

And as far as I know it has always done this.

"Catalyst A.I: Catalyst A.I. allows users to determine the level of 'optimizations' the drivers enable in graphics applications. These optimizations are graphics 'short cuts' which the Catalyst A.I. calculates to attempt to improve the performance of 3D games without any noticeable reduction in image quality. In the past there has been a great deal of controversy about 'hidden optimizations', where both Nvidia and ATI were accused of cutting corners, reducing image quality in subtle ways by reducing image precision for example, simply to get higher scores in synthetic benchmarks like 3DMark. "


----------



## Lionheart (Dec 20, 2011)




----------



## cdawall (Dec 20, 2011)

BazookaJoe said:


> Unless you NEVER play games panther - that is flat out impossible.
> 
> Skyrim is the best recent example.
> 
> ...



*BOTH COMPANIES DO THAT*

Thanks for playing though.


----------



## dieterd (Dec 20, 2011)

I don't know about you, but my major concern is the price tag, and from what I've heard it's going to be $500+!! When a new generation comes out it should NOT be the case that if performance is 150%, the price is 150% too!!! If it worked like that, the HD 6970 at release should have cost 5x (or more) what the HD 2900 XT did at its release, but it's more like 1:1. So when I wait for a new generation, I expect a performance increase with little to no price increase, like Intel yesterday with the Ivy Bridge chart: prices 1:1, performance 1.3:1 vs. Sandy. I waited for Bulldozer and guess what: barely any performance increase and a major price increase vs. Phenom II, so it's like 1.1:1 performance and 1.3:1 price. I am waiting for the HD 79xx, and the best I can expect is something like 1.5:1 price and 1.5:1 performance, and that "1.5" performance is from AMD PR charts with "up to" attached, so it will be way lower. Of course the "1.5" price could also be a hike, but I have a bad feeling about this...


----------



## laszlo (Dec 20, 2011)

Till I see a review, all I can say is Mr. Papermaster at AMD has started working


----------



## Benetanegia (Dec 20, 2011)

^^This sums up my opinion.


----------



## theJesus (Dec 20, 2011)

My opinion of it (and all other GPUs) is that it does absolutely nothing until I see W1z's numbers.


----------



## Aceman.au (Dec 20, 2011)

I wonder what Nvidia will change to compete with this.


----------



## Completely Bonkers (Dec 20, 2011)


Just to point out the obvious, according to Moore's Law, AMD should be pumping out much more than 30% improvement.


----------



## runevirage (Dec 20, 2011)

Doesn't the graph just look like someone whipped it up in MS Word? You'd expect an internal or PR slide to have more embellishments.


----------



## entropy13 (Dec 20, 2011)

On average it's 147% for the HD 7970 (with the GTX 580 as the 100%). Which means it's just behind the HD 6990 (152%). If those figures are accurate. 

Now let's assume it's actually lower by 15 percentage points (132%)...that puts it behind a GTX 590 (140%) but still some way ahead of the HD 5970 (124%).
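The averaging behind figures like these can be sketched in a few lines; the per-game ratios below are made-up placeholders, since the leaked graph's exact values aren't listed in this thread:

```python
# Average per-game speedups normalized to the GTX 580 (= 100%), the way
# "relative performance" summaries are usually built.
# These ratios are hypothetical placeholders, NOT the leaked graph's data.
hd7970_vs_gtx580 = [1.30, 1.45, 1.60, 1.50, 1.50]  # one entry per game

average_pct = 100.0 * sum(hd7970_vs_gtx580) / len(hd7970_vs_gtx580)
print(f"HD 7970 average: {average_pct:.0f}% of GTX 580")  # 147% with these numbers
```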


----------



## John Doe (Dec 20, 2011)

Patriot said:


> and...its Nvidia who coded a profile for Furrmark to keep their cards from dying running it...
> 
> sigh... :shadedshu



Nope. You're spreading fanboyism and garbage. OCP first came out with the 400 series cards, not the 500. The reason they wrote it was to tame the monster that was the 480, for people who didn't know what they were doing. The card was built to withstand those temps. Those who did know what they were doing disabled it.


----------



## cdawall (Dec 20, 2011)

John Doe said:


> Nope. You're spreading fanboyism and garbage. OCP first came out in 400 series cards, not 500. The reason they wrote it was to tame the monster that was the 480; for people who didn't know what they're doing. The card was built to withstand those temps. Those that did knew what they're doing disabled it.



So, capt smart, do you know how to fully disable OCP? I'll give you a hint: break out the soldering iron. Nvidia coded their drivers in the same manner AMD is being chastised for. Both companies adjust video quality in specific games, benchmarks, and apps. That's the nature of the beast; it is and always will be a competition between the two.


----------



## John Doe (Dec 20, 2011)

cdawall said:


> So capt smart do you know how to fully disable OCP? I'll give you a hint break out the soldering iron.



I've an IPC 610 certification and have recapped a 1250W unit.


----------



## cdawall (Dec 20, 2011)

John Doe said:


> I've an IPC 610 certification and have recapped a 1250W unit.



And? IPC 610 means you can perform:

- Hardware installation
- Soldering criteria, including lead-free connections
- Solder requirements for connecting to terminals
- Soldered connection requirements for plated-through holes
- Surface mounting criteria for chip components, leadless and leaded chip carriers
- Swaged hardware and heatsink requirements of mechanical assemblies
- Component mounting criteria for DIPs, socket pins and card edge connectors
- Jumper wire assembly requirements
- Solder fillet dimensional criteria for all major SMT component groups
- Soldering anomalies, such as tombstoning, dewetting, voiding and others
- Criteria for component damage, laminate conditions, cleaning and coating

I am a few steps beyond that at my current job... Recapping a PSU is also pretty idiot-proof, unless you can't follow the +'s and -'s on a PCB. You still didn't answer my question: have you personally ever bypassed the OCP on anything? I have a couple of mobos and GPUs, not to mention CPUs, hardmodded.


----------



## John Doe (Dec 20, 2011)

cdawall said:


> I am a few steps beyond that at my current job...Recapping a PSU is also pretty idiot proof unless you can't follow + and -'s on a PCB. You still didn't answer my question. Have you personally ever bypassed the OCP on anything? I have a couple of mobo's and GPU's not mention CPU's hardmodded.



You don't get the point here; I haven't because I don't have the need to. I'm not a benchmarker. I can OC and game all the way. And no, it's not idiot-proof to recap a Real Power 1250. Enhance heatsinks are a huge PITA to desolder.


----------



## cdawall (Dec 20, 2011)

John Doe said:


> You don't get the point here, I haven't because I don't have the need to. I'm not a benchmarker. I can OC and game all the way. And no, it's not idiot proof to recap a Real Power 1250. Enhance heatsinks are a huge PITA to desolder.



Not terrible if the entire unit is baked.


----------



## John Doe (Dec 20, 2011)

cdawall said:


> Not terrible if the entire unit is baked.



It had a bunch of unknown caps on the secondary, actually. That was why I went into it. A blue one that looked like Asia-X, a purple one and lol... I replaced them with Teapo's.


----------



## cdawall (Dec 20, 2011)

John Doe said:


> It had a bunch of unknown caps on the secondary, actually. That was why I went into it. A blue one that looked like Asia-X, a purple one and lol... I replaced them with Teapo's.



I don't blame you. Odd that Enhance would cheap out like that, IMO; normally they are pretty good about using good brand caps. Oh well, guess the industry has its woes thanks to the economy.


----------



## John Doe (Dec 20, 2011)

cdawall said:


> I don't blame you odd that Enhance would cheap out like that IMO. Normally they are pretty good about using good brand caps. Oh well guess the industry has its woes thanks to the economy.



Yeah, $1-5 saved per unit adds up to much more on a larger scale. At times, some other quality OEMs do this as well.


----------



## (FIH) The Don (Dec 20, 2011)

Shut up bitches, it's a GPU thread, not the "I have the biggest soldering iron" thread.



Oh yeah, and whoever mentions ATI all the time: it DOES NOT EXIST!!! It's AMD now and has been for a veeeeeerrrrrrrrrrrrrryyyyyyyyyyy loooooooooooooong time.


----------



## swirl09 (Dec 20, 2011)

Not sure where this thread is currently going ^^

Back on topic, without any bizarre maths or trying to take a swipe at what is obviously PR (Hi, this happens.... k?):

Going back 6+ months, the estimates were that the shrink would produce in the region of a 40% performance bump. Even if AMD cherry-picked these results (lol "if", of course they did, so what?), it's still showing an increase a little shy of one and a half times that of the respectable GTX 580.

As for whoever commented they "cheated" by using higher res's than 1080p for their tests... UH HUH?! Why are you spending serious money on a GPU if you're only running 1080p? These are top-tier cards meant for top-tier res's. On that note, I find it hilarious that some sites still test using rigs with GPU solutions alone costing well above a grand, and yet don't go higher than 1920x1200 in their testing suite? THAT makes *no* sense.

My only concern now is how long it will take for nV to get their answer out. Last time I checked they mentioned well into Q1 2012; they're unlikely to be early, and given there has been a recent wave of new beefed-up 580's coming out, I think that's a good indication we're a while off seeing it :/  Well, when it does arrive I'll be happy - pick the better card and couple it with an IB and /happynewrigtime


----------



## cdawall (Dec 20, 2011)

(FIH) The Don said:


> shut up bitches, its a gpu thread, not the i have the biggest soldering iron thread.
> 
> 
> 
> oh yeah and whoever mentions ATI all the time, it DOES NOT EXIST!!! its AMD now and has been for a veeeeeerrrrrrrrrrrrrryyyyyyyyyyy loooooooooooooong time.



My VGA card still says ATi on it.







So bugger off


----------



## phanbuey (Dec 20, 2011)

swirl09 said:


> Not sure where this thread is currently going ^^
> 
> Back on topic, without any bizarre maths or trying to take a swipe at what is obviously PR (Hi, this happens.... k?)
> 
> ...



I think using lower resolutions in reviews makes sense, which is why they do it. These are not "top tier cards meant for top tier res's", since no single-GPU card can max most of the new games at acceptable FPS. 1920x1200 allows you to use max settings and get a fluid experience.

Either way, I have seen these graphs from AMD so many times that I know enough not to even waste my time looking at them. They are always way off. There is not one graph they have released like that which is actually true. Just because it is "PR" also doesn't make it right - I hope both companies get flamed for releasing crap like this.


----------



## Super XP (Dec 20, 2011)

If this is true, then AMD take my Bloody Money already 



> *HD 7970 up to 60 percent faster than GTX 580*
> http://www.fudzilla.com/graphics/item/25278-hd-7970-up-to-60-percent-faster-than-gtx-580


----------



## phanbuey (Dec 20, 2011)

It's the same graph lol... and what the hell is 1x MSAA? Ah, googled... it exists... some random performance settings on that chart.


----------



## Nesters (Dec 20, 2011)

These graphs... Every time I see a bar twice as long as the comparison's and something like "up to 50%", I know it's AMD.


----------



## Velvet Wafer (Dec 20, 2011)

Ah, these incredibly misleading graphs again, from which you can't even derive actual performance, as most times they are just heavily biased


----------



## trickson (Dec 20, 2011)

(FIH) The Don said:


> shut up bitches, its a gpu thread, not the i have the biggest soldering iron thread.
> 
> 
> 
> oh yeah and whoever mentions ATI all the time, it DOES NOT EXIST!!! its AMD now and has been for a veeeeeerrrrrrrrrrrrrryyyyyyyyyyy loooooooooooooong time.



Bull shit! If ATI did not exist then why the hell is it on the fucking box? When I go to the store I see ATI, NOT AMD/ATI, on the box! So shut up bitch


----------



## BarbaricSoul (Dec 20, 2011)

I can wait two more days for Wizz's review to tell us the actual difference between the current top-tier cards and the 7970. Personally, I'm hoping the 7970 will be the motivation I need to replace my 5870 CrossFireX set-up.


----------



## phanbuey (Dec 20, 2011)

BarbaricSoul said:


> I can wait two more days for Wizz's review to tell us the actual difference between the current top tier cards and the 7970. Personally I hoping the 7970 will be the motivation I need to replace my 5870 crossfireX set-up.



Seriously... I can't either. I think that is why this fake graph pisses me off so much... usually we have some leaked 3rd-party benches from China or something... not some crap graph from AMD that doesn't even list FPS. :/

Come on people... NDAs are meant to be disregarded


----------



## Benetanegia (Dec 20, 2011)

phanbuey said:


> Seriously... i cant either - I think that is why this fake graph pisses me off so much... usually we have some leaked 3rd party benches from China or something... not some crap graph from AMD that doesnt even list FPS. :/
> 
> Come on people... NDA's are meant to be disregarded



Yes, but don't reviewers usually get cards 3-4 weeks before launch, so as to have time to review properly and fix any problems that could arise? I'm sure that with the sudden release-day change reviewers only got cards 1 or 2 weeks ago. So not too much time for proper testing. But that's something only W1zz could tell us.


----------



## cadaveca (Dec 20, 2011)

From my own sample "procurement", you might have several weeks... you might only have 3 days, or even less. That's going to change with every product, no matter who makes it.

Ideally, yes, reviewers would get a few weeks, but that's not always possible.

All I know is, no 7970XTX cards at my house, yet.


----------



## bear jesus (Dec 20, 2011)

cadaveca said:


> From my own sample "procurment", you might have several weeks..you might only have 3 days, or even less. That's going to change with every product, no matter who makes it.
> 
> Ideally, yes, reviewers would get a few weeks, but that's not always possible.
> 
> All i know is, no 7970XTX cards at my house, yet.



That's because I stole your mail


----------



## (FIH) The Don (Dec 20, 2011)

trickson said:


> Bull shit ! If ATI did not exist then why the hell is it on the fucking box ? When I go to the store I see ATI NOT AMD?ATI on the box ! So shut up bitch



ATI does NOT exist ANYMORE, do you understand??????

it's A-M-D now, get it? :shadedshu


----------



## Benetanegia (Dec 20, 2011)

cadaveca said:


> From my own sample "procurment", you might have several weeks..you might only have 3 days, or even less. That's going to change with every product, no matter who makes it.
> 
> Ideally, yes, reviewers would get a few weeks, but that's not always possible.
> 
> All i know is, no 7970XTX cards at my house, yet.



Oh yeah, I forgot to post what I was actually trying to point out: the usual Chinese leaks might not have happened because said Chinese sources didn't get a card this time around, or haven't yet.


----------



## phanbuey (Dec 20, 2011)

(FIH) The Don said:


> ATI does NOT exist ANYMORE, do you understand??????
> 
> its A-M-D now, get it? :shadedshu



I am sure the guys at the GFX division refer to corporate as "Those A**hats from AMD".


----------



## (FIH) The Don (Dec 20, 2011)

phanbuey said:


> I am sure the guys at the  GFX Division, refer to corporate as "Those A**hats from AMD".





Still doesn't change the fact that it's AMD; ATi is dead.

Gonna drop it here.

And I saw what you did there lol


----------



## phanbuey (Dec 20, 2011)

I can see Ben's point - they prolly pushed the release date up last moment.  I envision it like this:

AMD VP: "OH S**T! Christmas is on the 25TH!?!?! When the hell did that happen? Suzy - quick, get Steve on the phone - I need that damn 7970 OUT NOW!"

Suzy: "But sir, the reviewers don't even have the cards yet"

AMD VP: "DAMMIT SUZY! I don't pay you to THINK! Get Bob from marketing to make one of his famous graphs, then leak it on the internet."


----------



## Super XP (Dec 20, 2011)

trickson said:


> Bull shit ! If ATI did not exist then why the hell is it on the fucking box ? When I go to the store I see ATI NOT AMD?ATI on the box ! So shut up bitch


Umm, Rory (AMD's CEO) is bringing back the ATI logo and making a Discrete Graphics Card Division once again.


----------



## cadaveca (Dec 20, 2011)

Super XP said:


> Umm, Rory (AMD's CEO) is bringing back the ATI logo and making a Discrete Graphics Card Division once again.






WHUT?


----------



## BazookaJoe (Dec 20, 2011)

cdawall said:


> *BOTH COMPANIES DO THAT*
> 
> Thanks for playing though.



""Per-Game Optimizations" started as a good idea - to boost compatibility with various games, *and both nVidia and AMD/ATI Use it* "

Reading can be very difficult. Sorry you weren't a winner this time.

Thank YOU for playing.


----------



## Tenxu24 (Dec 20, 2011)

*This graph may be true or may be false; be water, my friends!*

It all depends on benchmarks once some expert actually has the HD 7970 in hand. I personally believe it to be true for three reasons: first, the 1000 MHz GPU design; second, the memory speed; and third, the 3 gigabytes of video memory.


----------



## phanbuey (Dec 20, 2011)

According to quantum mechanics, the graph can be true and false at the same time.


----------



## Recus (Dec 20, 2011)

Tenxu24 said:


> 1000 MHz GPU design



AMD didn't mention that in any slide.


----------



## Patriot (Dec 20, 2011)

John Doe said:


> Nope. You're spreading fanboyism and garbage. OCP first came out in the 400 series cards, not the 500. The reason they wrote it was to tame the monster that was the 480, for people who didn't know what they were doing. The card was built to withstand those temps. Those that did know what they were doing disabled it.



http://www.techpowerup.com/forums/archive/index.php/t-139617.html
http://www.overclock.net/t/929152/have-you-killed-a-570-no-recent-deaths-buy-some-570s

Nope, my memory is correct. Thanks for playing...
Nvidia has OCP; ATI/AMD has PowerTune...

And the 580 is just a fully working 480, btw...


----------



## brandonwh64 (Dec 20, 2011)

phanbuey said:


> I can see Ben's point - they prolly pushed the release date up at the last moment. I envision it like this:
> 
> AMD VP: "OH S**T! Christmas is on the 25TH!?!? When the hell did that happen? Suzy - quick, get Steve on the phone - I need that damn 7970 OUT NOW!"
> 
> ...



It would be badass to have someone make a video of this and do a full reenactment.


----------



## (FIH) The Don (Dec 20, 2011)

brandonwh64 said:


> It would be badass to have someone make a video of this and do a full reenactment.



Would love to see that too, instead of the same old Hitler/Untergang video clip.


----------



## erocker (Dec 20, 2011)

Lots of stuff (benchmarks, etc.) on page 14: http://www.rage3d.com/board/showthread.php?p=1336788334#post1336788334


----------



## phanbuey (Dec 20, 2011)

The cast would have to be good:
AMD VP = Alec Baldwin
Suzy = Liz Lemon/ Tina Fey
Bob from Marketing = Tracy Morgan - "These cards are PHAT, they is at least 1.3x the speed of the other cards - I made this graph to show the relationship."


----------



## dir_d (Dec 20, 2011)

Not gonna lie... looks disappointing for the money. Looks like I might pick up two 7xxx cards based on VLIW4. I don't use the GPGPU features, so the extra money for that performance is not worth it for me.


----------



## erocker (Dec 20, 2011)

dir_d said:


> looks disappointing for the money.



What's disappointing for you? Performance looks to be pretty good, and there has been no confirmed price yet. It looks like AMD has a card that handily beats Nvidia's top single-GPU card... of course they are going to want to milk a bit more money out of it.


----------



## Crap Daddy (Dec 20, 2011)

Those are benchmarks made by AMD; they look good, but we'll have to see non-biased benchies at different resolutions to get a clearer picture. At 25x16 it destroys the 580, as it should: it is a high-end card and it will be priced accordingly. The interesting thing is the 7950 compared to the GTX 580 - if it performs better by a fair margin and costs less, then NV should lower the price on their top card(s).


----------



## phanbuey (Dec 20, 2011)

^^ or if it is unlockable... that would be awesome


----------



## cadaveca (Dec 20, 2011)

erocker said:


> It looks like AMD has a card that handily beats Nvidia's top single GPU card..



The 7970 better damn well beat nVidia's last gen, and by a sizeable amount, too. This isn't the CPU division... AMD's GPUs in all forms compete directly with nVidia's cards, and most are going to expect these cards to be at least the same as a 6990 in terms of performance.

The specifications listed above hint that the 7970 is NOT double the performance of a 6970, so yeah, it's disappointing. With that said, there's very little reason for 6990 users to upgrade, and if the rumoured prices of $499 are correct, I see very little reason to purchase the 7970 at all. It's not like current games even push the 6950, never mind the 6970.

Besides, it's fail because I cannot afford $500 for a GPU.


----------



## Crap Daddy (Dec 20, 2011)

All the info we have so far suggests a price higher than $500, and it will certainly retail above MSRP for the first month at least. So they should be much better than 2x 6950 to be worth considering. I personally don't care; no game pushes my card at my resolution, and I don't see one in the foreseeable future (a game that I might be interested in), but it's interesting to follow the developments. Then again, spending in excess of 500 Euro, which is what it will cost over here, just to game is not justifiable. If this equipment helps you with your work then yes, but I don't know if this GPU is good for anything else. (Maybe somebody will correct me.)


----------



## phanbuey (Dec 20, 2011)

Crap Daddy said:


> All the info we have so far suggests a price higher than $500, and it will certainly retail above MSRP for the first month at least. So they should be much better than 2x 6950 to be worth considering. I personally don't care; no game pushes my card at my resolution, and I don't see one in the foreseeable future (a game that I might be interested in), but it's interesting to follow the developments. Then again, spending in excess of 500 Euro, which is what it will cost over here, just to game is not justifiable. If this equipment helps you with your work then yes, but I don't know if this GPU is good for anything else. (Maybe somebody will correct me.)



If the price is that high, then I agree - it will not be worth it - but usually these prices do not stay so high.

And they will need a 79xx product priced aggressively, because when Kepler from Nvidia comes out, if you already have a 79xx you are less likely to upgrade... thus hurting the sales of the competition.

If, however, the price is too high... then most people will want to wait for Kepler, either to go that route or to see if the price drops when there is competition in the space.

If I were them, I would release the 7970 at the same price as the 580, then release the 7950 at the same price as the cheapest 6970 and make it unlockable. That would flood the market with 79xx parts and seriously hurt Nvidia.

IMO, I agree that the 7950 will be the product to watch.


----------



## devguy (Dec 20, 2011)

Will the 7950 be released/reviewed on the 22nd as well?  Or just the 7970?


----------



## phanbuey (Dec 20, 2011)

devguy said:


> Will the 7950 be released/reviewed on the 22nd as well?  Or just the 7970?



Should be both, if history is any indicator.


----------



## Solaris17 (Dec 20, 2011)

This graph looks hardcore faked.

First: why is it blurry? The picture does not indicate that it was taken by a camera - no lines, glass, or splotches.

Second: the lettering looks tampered with. It could be that someone attempted to blur it on purpose, but why in such a way? There are far easier ways to blur a photo. Not to mention the graphs look doubled.

Third: the areas around the lettering look brighter white to me. Why? If a photo was blurred, this should not be the case.

Someone said that this graph looks accurate according to rumors, but that's assuming the graph is legitimate. The graph could too easily be, and IMO probably is, fake. Therefore, of course it would match the current performance rumors.


----------



## badtaylorx (Dec 20, 2011)

Wow, if those DH leaks are spot on (like the BD ones) then Nvidia is gonna have a rough go of it for a spell.


----------






## Casecutter (Dec 20, 2011)

I would say AMD/ATI can justify a $500 MSRP... remember, TSMC bumped the 28nm wafer cost, and on top of that both AMD and Nvidia get fewer fully functioning top-shelf chips per wafer. Then, if the performance is anything close to what's being presented, that affects pricing relative to the competition. The last factor is the worst: strong demand and not enough inventory, which by all accounts will be a big motivator for retailers/e-tailers to set pricing above MSRP, because there are enough folks willing to pay the inflated price for a couple of months anyway.

At $500... Nvidia has no reason to worry about lowering the 580's price (the 40nm cost is fixed); they might adjust a little, and AIBs will offer more custom solutions, but Nvidia has no reason to run out like the building's on fire. They'll hold and play their cards, because they know they'll be in the same situation when Kepler starts showing.


----------



## phanbuey (Dec 20, 2011)

Casecutter said:


> I would say AMD/ATI can justify a $500 MSRP... remember, TSMC bumped the 28nm wafer cost, and on top of that both AMD and Nvidia get fewer fully functioning top-shelf chips per wafer. Then, if the performance is anything close to what's being presented, that affects pricing relative to the competition. The last factor is the worst: strong demand and not enough inventory, which by all accounts will be a big motivator for retailers/e-tailers to set pricing above MSRP, because there are enough folks willing to pay the inflated price for a couple of months anyway.
> 
> At $500... Nvidia has no reason to worry about lowering the 580's price (the 40nm cost is fixed); they might adjust a little, and AIBs will offer more custom solutions, but Nvidia has no reason to run out like the building's on fire. They'll hold and play their cards, because they know they'll be in the same situation when Kepler starts showing.



Yeah, but we also don't know the margins on the cards or the production volumes... even if the 28nm cost is higher, if the margins are squishy enough to strategically price the card, they should do so. After all, it won't help to keep a static % profit if it means you don't benefit from an early launch.

They'll want to cut into a nice chunk of Nvidia's market share, and, you're right, they won't do it with a $500 card. It would be unfortunate if they couldn't produce enough units to take advantage of what they have, but scarcity would be the only way the price would stay that high.


----------



## air_ii (Dec 21, 2011)

Solaris17 said:


> This graph looks hardcore faked.
> 
> First. Why is it blurry? The Picture does not indicate that it was taken by a camera. No lines or glass or splotches.
> 
> ...



It's not a fake. It's from AMD's reviewer's guide (as in the link below):

http://www.megaupload.com/?d=H86KSABC

EDIT:
And all the slides as well:
http://www.megaupload.com/?d=37TFTYDX

Links originally supplied by -The_Mask- over at B3D forum.


----------

