# AMD's Answer to GeForce GTX 700 Series: Volcanic Islands



## btarunr (May 8, 2013)

GPU buyers can breathe a huge sigh of relief that AMD isn't fixated on next-generation game consoles, and that the late-2013 launch of its next GPU generation comes with good reason. The company is building a new GPU micro-architecture from the ground up. Codenamed "Volcanic Islands," with members named after famous islands along the Pacific Ring of Fire, the new GPU family sees AMD rearranging the component hierarchy within the GPU in a big way. 

Over the past three GPU generations, which used the VLIW5, VLIW4, and Graphics Core Next SIMD architectures, the component hierarchy was essentially untouched. According to an early block diagram of one of the GPUs in the series, codenamed "Hawaii," AMD will designate parallel and serial computing units. Serial cores based on either of the two architectures AMD is licensed to use (x86 and ARM) could handle part of the graphics processing load. The stream processors of today make up the GPU's parallel processing machinery. 


We can't make out the text in the rather blurry block diagram, but we're fairly convinced that if it's authentic, AMD is making some big changes. Another reason for AMD's delay could be the silicon fab process. "Tahiti," as implemented on the Radeon HD 7970 GHz Edition, already has a high thermal envelope. AMD doesn't want the 28 nm process to restrict its next-generation architecture development, and is holding out till the 20 nm process is in place at TSMC. The fab set Q4 as its tentative bulk manufacturing date for the process.

The source that leaked the block diagram also posted specifications of the chip codenamed "Hawaii," which appears to be the flagship part:
- 20 nm silicon fab process
- 4096 stream processors
- 16 serial processor cores
- 4 geometry engines
- 256 TMUs
- 64 ROPs
- 512-bit GDDR5 memory interface

*View at TechPowerUp Main Site*


----------



## lobsterrock (May 8, 2013)

This sounds awesome, but NVIDIA still has almost half a year of newly branded GPUs. That's going to be a huge cash cow for them. I know the '700 series' is mostly going to be rebrands, but people will still eat that up.


----------



## LAN_deRf_HA (May 8, 2013)

I was under the impression the 700 series isn't that far off. This sounds more like it'd compete with the 800 series.


----------



## btarunr (May 8, 2013)

lobsterrock said:


> This sounds awesome, but NVIDIA still has almost half a year of newly *rebranded* GPUs. That's going to be a huge cash cow for them. I know the '700 series' is mostly going to be rebrands, but people will still eat that up.



ftfy


----------



## nikolaj1651 (May 8, 2013)

They say it has "16 serial processor cores," and the NVIDIA 690 only has 2? Gonna be interesting.


----------



## RCoon (May 8, 2013)

I'm just going to ignore processor releases this year and next, as my systems really don't need the power. I am, however, interested in converting from CrossFire to a single AMD card, provided this flagship product keeps up with my 120 Hz needs. NVIDIA haven't taken any of my money since I got 3 GTX 570s, and never will until they get off their pedestal.


----------



## renz496 (May 8, 2013)

nikolaj1651 said:


> They say it has "16 serial processor cores," and the NVIDIA 690 only has 2? Gonna be interesting.



I think you misunderstood the 16 serial cores when you compared them to the 2 GPUs in a single GTX 690 package. Do you think AMD wants to fit 16 GPU cores on one PCB, each with 4K stream processors? This thing looks more like an APU (combining cores that are good at serial tasks with cores that are good at parallel tasks), though in this case both kinds of cores will handle graphics work. So is this AMD's response to NVIDIA's 700 series? It's like, 'hey, we have something new next year, so you should hold out for it instead of getting NVIDIA's rebrands (the 700 series)'.


----------



## nikolaj1651 (May 8, 2013)

renz496 said:


> I think you misunderstood the 16 serial cores when you compared them to the 2 GPUs in a single GTX 690 package. Do you think AMD wants to fit 16 GPU cores on one PCB, each with 4K stream processors? This thing looks more like an APU (combining cores that are good at serial tasks with cores that are good at parallel tasks), though in this case both kinds of cores will handle graphics work. So is this AMD's response to NVIDIA's 700 series? It's like, 'hey, we have something new next year, so you should hold out for it instead of getting NVIDIA's rebrands (the 700 series)'.



Ahh I see, I misunderstood, sorry.


----------



## Mathragh (May 8, 2013)

Wow wut, with 16 x86 cores in there, it sounds like this will be capable of a whole lot more than just graphics, or even GPU computing. 

This could have huge ramifications, such as people modding Linux/Windows to run solely on these chips, completely bypassing the current schism in the memory hierarchy, and much, much more.


On a more current note: I wonder how this complete redesign will influence driver development.


With AMD going this route, it sounds like they're going the same way as NVIDIA, who are also rumoured to include ARM cores in one of their next major revisions. After reading this, I've got the feeling that AMD will get to market first with their version, like they traditionally get to market first on a new process node.

Edit:

I'm wondering what part of the graphics workload is actually better off being performed on serial cores? Perhaps a part of the pipeline currently handled by the CPU via drivers?


----------



## buggalugs (May 8, 2013)

Hmmm, so the option is pay a fortune for a 780 mid-year, or wait until the end of the year for AMD... I think I'll wait.


----------



## RCoon (May 8, 2013)

buggalugs said:


> Hmmm, so the option is pay a fortune for a *gimped-renamed-Titan*, or wait until the end of the year for AMD... I think I'll wait.



fix'd


----------



## jigar2speed (May 8, 2013)

WTF are 16 serial processor cores?


----------



## cdawall (May 8, 2013)

jigar2speed said:


> WTF are 16 serial processor cores?



The chip is going to be broken down to include x86 or ARM cores within it for additional serial processing. I would guess ARM for power savings, but that is nothing more than a guess. God knows the top model with 4096 stream processors and 16 CPU-esque cores will use some power.


----------



## Pandora's Box (May 8, 2013)

Sorry, but if this is indeed a new architecture I will be going with NVIDIA. AMD has barely got the 7xxx series working properly, and there's still the microstutter issue.


----------



## Mathragh (May 8, 2013)

Pandora's Box said:


> Sorry, but if this is indeed a new architecture I will be going with NVIDIA. AMD has barely got the 7xxx series working properly, and there's still the microstutter issue.



Perhaps those serial processors are exactly what will alleviate the current stuttering issue.


----------



## RCoon (May 8, 2013)

Pandora's Box said:


> Sorry, but if this is indeed a new architecture I will be going with NVIDIA. AMD has barely got the 7xxx series working properly, and there's still the microstutter issue.



Because a company fully aware of a memory issue, and fixing that memory issue, is then going to carry the same memory approach that caused it from the previous architecture over to their new architecture? Sure, it might happen, or the issue might persist to a lesser degree, but you'd be pretty ignorant to assume it would carry over to a new architecture and base your buying decision on that alone. By all means, though, go NVIDIA if they offer you a suitable product.


----------



## lobsterrock (May 8, 2013)

btarunr said:


> ftfy



Yeah, that's what I meant to say; I guess I just forgot to add the re- part. But at least it'll mean the 680 is getting cheaper, right?


----------



## RCoon (May 8, 2013)

lobsterrock said:


> Yeah, that's what I meant to say; I guess I just forgot to add the re- part. But at least it'll mean the 680 is getting cheaper, right?



I thought that about the 5xx series when all these new cards came out. They're all still pushing the high £200s and mid £300s to this day in the UK.


----------



## amdftw (May 8, 2013)

Pandora's Box said:


> Sorry, but if this is indeed a new architecture I will be going with NVIDIA. AMD has barely got the 7xxx series working properly, and there's still the microstutter issue.



Sorry, but AMD's single 7970 card has better frametimes than NV's GTX 680...
http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-Dissected-Full-Details-Capture-based-Graphics-Performance-Testin


----------



## Prima.Vera (May 8, 2013)

btarunr said:


> 20 nm silicon fab process
> 4096 stream processors
> 16 serial processor cores
> 4 geometry engines
> ...



lol.

4096 stream processors??? That's double a 7970 GPU.
512-bit GDDR5??? Prepare for some records on the bandwidth.

I think AMD is preparing a monster here, and I won't be far off in saying that this GPU will be almost as powerful as a 7990 card. 
A new 5870 on the horizon?? (Compared to the 4870X2, that is...)
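For scale, memory bandwidth is just bus width times effective per-pin data rate. A quick sketch, assuming the same 6 Gbps-effective GDDR5 as the 7970 GHz Edition (the leak doesn't give memory clocks, so that rate is an assumption):

```python
# GDDR5 bandwidth sketch; the 6.0 Gbps effective rate is an assumption,
# borrowed from the HD 7970 GHz Edition, not from the leak.
def gddr5_bandwidth_gbs(bus_bits: int, effective_gbps: float) -> float:
    """GB/s = (bus width in bits / 8 bits per byte) * per-pin rate in Gbps."""
    return bus_bits / 8 * effective_gbps

tahiti = gddr5_bandwidth_gbs(384, 6.0)  # HD 7970 GHz Edition: 288 GB/s
hawaii = gddr5_bandwidth_gbs(512, 6.0)  # rumored 512-bit bus, same memory clock

print(tahiti, hawaii)  # 288.0 384.0
```

Even at unchanged memory clocks, the wider bus alone is a third more bandwidth; any memory-clock bump on top of that would indeed set records.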


----------



## alwayssts (May 8, 2013)

What, did y'all seriously think Maxwell was the only one heading in this direction?

64-bit ARM cores (A57) are the new black.  It's all about decreasing the dependence/latency associated with modern systems.  You can call it an APU, but it's really just enhancing the capabilities of GPGPU beyond the GPU being a dumb processor that gets thrown parallel workloads by a CPU.  It needs its own brain(s) to go with the brawn.  These advancements plus shared memory = huge improvements.  

I wonder if they will scale this (in the sense that 1 CPU block would go with 4 ROPs and 4 Compute Units)... I bet they do, because that would make a ton of sense (and be pretty efficient, I reckon).  Be it one ARM block + 1 such GPU block, or 2 (say, x86 blocks + 2 GPU blocks, which sounds an awful lot like the supposed specs of Kaveri).

Also, I think we all kind of figured we'd see a 512-bit bus, considering bandwidth is a seriously huge (and limiting) factor, even now.  Ofc we all plan on seeing GDDR6 at some point, but no word of production has been spoken afaik.

Figuring 4096 cores and 512-bit/7 Gbps on the current architecture... that works out to around 975 MHz-ish; at 8000 MHz memory, closer to 1100 MHz.  That is not accounting for the CPU cores, and perhaps very likely a more robust cache structure.  Obviously with the CPU cores, at the bare minimum the cache will need to be larger/faster to feed them... I'm just throwing it out there, assuming they are supplemented in conjunction with one another, or more or less bolted on to the current GPU spec.  

Sounds plausible, as while the 1.9x density improvement is possible and the touted 1.3x speed improvement at the same voltage may be approximately correct (and hence a chip of this spec lines up with the size needed to accompany a fast 512-bit bus), 20nm will likely target a lower nominal voltage.  IE, where 28nm may have been rated at .85v/850mhz but in reality ran 1.05-1.175v (and up to 1.3v) with yields settling between 850-1175mhz, this will probably be closer to .85-1.05v in reality (up to 1.175v), with yields starting around 1000-1100mhz.  Given the typical ~10% clock cushion per voltage step... it lines up.

So... in essence... AMD is doing the exact same thing as NVIDIA is with Maxwell/Denver, only (conceivably) earlier and in a more flexible manner.


----------



## Prima.Vera (May 8, 2013)

However, on AMD's official webpage the specs are more realistic, and actually look more like a 7970 rebrand...
Hmmm... wtf?!?

http://www.amd.com/us/products/desktop/graphics/8000/pages/8000-series.aspx#2

Edit:

Or is AMD talking about the 9000 series??


----------



## Mathragh (May 8, 2013)

Prima.Vera said:


> However, on AMD's official webpage the specs are more realistic, and actually look more like a 7970 rebrand...
> Hmmm... wtf?!?
> 
> http://www.amd.com/us/products/desktop/graphics/8000/pages/8000-series.aspx#2
> ...



The big "OEM" at the top of the page gives it away. That 8000 series was apparently just to keep OEMs happy, and is just a rebrand of 7000-series cards.


----------



## lilhasselhoffer (May 8, 2013)

Wow, this turned into AMD versus Nvidia fanboyism relatively quickly.


AMD: Please deliver a new architecture that can alleviate the micro-stuttering, at a price point this side of reasonable.  The 7970 was good, but Nvidia kinda ate your lunch with the 6xx series by having fewer issues on the driver side.

Nvidia: A rebranding of 6xx cards to the 7xx series might fool some people.  The 6xx series is nice, but the pricing is still unreasonable.  Look away from the Titan, and deliver a 7xx series without massive mark-ups and minimal performance increases.

All GPU producers: Resist the urge to rush to market.  We will wait four months until you can secure reasonably priced chips and a decent step forward on features.  Just make sure whatever you release can actually be utilized.  Two or three games make my current 6950 cry at moderate (1920x1080) resolution; I'm not paying more than $250 for an improvement in the two of those games I actually play.



My opinion on the architecture change is simple.  Include x86-style cores for the express purpose of doing physics processing without using CPU resources.  An octet of slow-speed cores doesn't sound like much, but they could do everything that PhysX on a CPU does.  AMD offloading their shiny new TressFX to a separate on-board processor means better performance while being transparent to the user.  If the APU is adding a GPU to a CPU, then this move must be the opposite approach to the same goal: a decent SoC.


----------



## Fluffmeister (May 8, 2013)

So a year away then still, oh well.


----------



## xenocide (May 8, 2013)

amdftw said:


> Sorry, but AMD's single 7970 card has better frametimes than NV's GTX 680...
> http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-Dissected-Full-Details-Capture-based-Graphics-Performance-Testin



I might save that link for whenever someone recommends a user goes Crossfire...


----------



## badtaylorx (May 8, 2013)

Oh no... I buy cards from both camps... but only on the "even" numbered series.


Not sure why... just how I've been doing it since been pimpin'!!!
4850 to 460 then SLI, to 6970 then CrossFire, to 670 now SLI...

But this one could get expensive...


----------



## ST-Cyclone921R (May 8, 2013)

I resampled the diagram and tried to read the module tags; couldn't read them all, but here is what I could:







https://twitter.com/ST_Cyclone/status/332119099121479681/photo/1

The ARM Cortex co-processor and ECC RAM are interesting, but what blew me away was the HyperTransport links!  Pleeeeease tell me this will help alleviate the PCI-E bottleneck!


----------



## Slacker (May 8, 2013)

Getting off topic here, but Hawaii isn't part of the Pacific Ring of Fire. It ain't a composite volcano, so why name it that? AMD needs to hire some people who really know their stuff.


----------



## Jorge (May 8, 2013)

Since AMD has plenty of excellent GPU cards available right now, the Q4 '13 release of the new series should be just fine, IMO. Only extremists buy $300+ GPU cards. The majority of consumers, including enthusiasts, don't spend over ~$300 on a GPU card.

It appears to me that AMD is really ramping up the good products that they have had in development for years, be it GPU or CPU/APU models. This is nice to see and good for consumers.


----------



## MxPhenom 216 (May 8, 2013)

amdftw said:


> Sorry, but AMD's single 7970 card has better frametimes than NV's GTX 680...
> http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-Dissected-Full-Details-Capture-based-Graphics-Performance-Testin



I might be blind, but I'm not seeing it, and I just checked every graph. They are nearly identical.


----------



## OnePostWonder (May 8, 2013)

Question for W1zzard or anyone else who might know:

Will a micro-architecture change like this dramatically affect the approach you have to take for supporting this in GPU-Z?


----------



## Casecutter (May 8, 2013)

As I said 5 days ago:


Casecutter said:


> I honestly don't think AMD is being "lethargic or has console-fixation" (more banter), but is saying (in an Italian accent) *No please, I insist you's go first... *AMD has gone first the last two times and Nvidia's just played "one-upmanship". I sense AMD would rather wait and see what Nvidia does; then they could go all "4870" on them.



This is more interesting than I thought it would get; if this is true, it's going to be a stagnant 6 months.  And I might be wrong!  Nvidia might... might wise up and drop all their GTX 7XX prices to grab sales before this information really gets traction.   

If this is the course AMD is on, they have been working on this for more than a year...  But it's hard to know how well this can be pulled off on 20 nm.  Had they said 28 nm I would've seen this as simple, but to complicate it on a TSMC shrink is another matter. 

But you can kind of feel how the camps are liking or disliking this 20 nm.
http://www.xbitlabs.com/news/other/..._Is_Ahead_of_Its_Own_20nm_Roadmap_Report.html

http://www.extremetech.com/computin...y-with-tsmc-claims-22nm-essentially-worthless



Slacker said:


> Getting off topic here, but Hawaii isn't part of the Pacific Ring of Fire. It ain't a composite volcano, so why name it that? AMD needs to hire some people who really know their stuff.


The code name is just "Volcanic Islands."  Not sure who else is actually fixating on "Pacific Ring of Fire"; maybe just btarunr at this point... The Hawaiian Islands were formed by volcanoes, last I heard.


----------



## Sihastru (May 8, 2013)

Relax guys, "16 serial processor cores" (or maybe a better term would be "16 serial and parallel processing cores") is just something similar to NVIDIA's "SMX unit" (or "streaming multiprocessor unit"). It has nothing to do with x86 or ARM.

For example, the GTX 680 has 8 of these "processing cores," the GTX 690 has 16, and the GeForce TITAN has 14 enabled (from a total of 15).

All the other bling-bling is reserved for the "professional" FirePro series and will most likely be disabled on the mainstream Radeon series.

The good news is that it looks like a card that could compete against the GTX 690. The bad news is that it's not actually going to compete against the GTX 690, but against something new and much more powerful: NVIDIA's next (next) gen.


----------



## BigMack70 (May 8, 2013)

Cool... if they do this and fix their stuttering issues, consumers win big time. Right now consumers are kinda getting screwed at the $1k price mark. 

I honestly prefer this 2-year refresh cycle - I'd rather see substantive improvements every 2-3 years than incremental improvements every 12-16 months.


----------



## FrustratedGarrett (May 8, 2013)

Looks impressive, on paper. If they manage to pull this off by December and price the 2X-7970 performer at ~$400 MSRP, then AMD will have a big winner on their hands.


----------



## ST-Cyclone921R (May 8, 2013)

Sihastru said:


> Relax guys, "16 serial processor cores" (or maybe a better term would be "16 serial and parallel processing cores") is just something similar to NVIDIA's "SMX unit" (or "streaming multiprocessor unit"). It has nothing to do with x86 or ARM.
> 
> For example, the GTX 680 has 8 of these "processing cores," the GTX 690 has 16, and the GeForce TITAN has 14 enabled (from a total of 15).
> 
> ...



Emmm... not really...

NVIDIA's term "SMX" is their marketing name for blocks of stream processors (shader units, ALUs, etc.): a building block with its own front end (rasterizer) and back end (ROPs). And those are massively parallel processors. 

The serial units above are, as the name says, serial. AMD has two "serial" architecture licenses, x86 and ARM, and it's not hard to guess which they opted for.


----------



## BigMack70 (May 8, 2013)

FrustratedGarrett said:


> Looks impressive, on paper. If they manage to pull this off by December and price the 2X-7970 performer at ~$400 MSRP, then AMD will have a big winner on their hands.



Did I miss something about performance being at 2x-7970 levels? I would be really surprised if they pulled that off... it would mean they have a chip ~40%+ faster than the Titan. Don't get me wrong, I would love to see that, but it sounds a little extreme to me.

I would expect them to do something more like match the Titan, i.e. 7970 + 40% performance.


----------



## Mathragh (May 8, 2013)

BigMack70 said:


> Did I miss something about performance being at 2x7970 levels? I would be really surprised if they pulled that off... it would mean that they have a chip ~40%+ faster than the Titan. Don't get me wrong, I would love to see that, but that sounds a little extreme to me.
> 
> I would expect them to do something more like match the Titan, i.e. 7970 + 40% performance.



Pure theoretical peak performance would be around 2x the 7970 on the parallel execution units alone (the 7970 has 2048 shaders; this chip has 4096). On top of that, I suppose all the other changes in the architecture are there to further improve performance, not decrease it, resulting in over 2 times the 7970's performance.
Add in some overhead, and we're back at 2x 7970 (with a major grain of salt).


Sooo, 2x 7970 doesn't sound that far off (as said, with a major grain of salt).
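The back-of-the-envelope above is just shader-count scaling at equal clocks. A minimal sketch; the 925 MHz clock is the stock HD 7970's and its carry-over to the new chip is an assumption:

```python
# Theoretical FP32 peak = shaders * clock * 2 ops/cycle (fused multiply-add).
def peak_gflops(shaders: int, clock_ghz: float) -> float:
    return shaders * clock_ghz * 2

hd7970 = peak_gflops(2048, 0.925)  # ~3.79 TFLOPS, the stock HD 7970
hawaii = peak_gflops(4096, 0.925)  # rumored chip at the same (assumed) clock

print(hawaii / hd7970)  # 2.0: double the shaders, double the theoretical peak
```

Of course, real-world performance rarely tracks theoretical peak one-to-one, which is where the "major grain of salt" comes in.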


----------



## BigMack70 (May 8, 2013)

Mathragh said:


> Pure theoretical peak performance would be around 2x the 7970 on the parallel execution units alone (the 7970 has 2048 shaders; this chip has 4096). On top of that, I suppose all the other changes in the architecture are there to further improve performance, not decrease it, resulting in over 2 times the 7970's performance.
> Add in some overhead, and we're back at 2x 7970 (with a major grain of salt).
> 
> 
> Sooo, 2x 7970 doesn't sound that far off (as said, with a major grain of salt).



Interesting. Well, I guess we'll know in a few months! At least now I know what timeframe I need to sell off my 7970s in.


----------



## Casecutter (May 8, 2013)

Just wondering: could this be on GlobalFoundries' 20 nm?  A year ago the process was "crowning"; might AMD have been working on this with them?  AMD really just needs a card to offset the Titan LE at the high end; the rest of the Southern Islands 7XXX lineup still rivals the rebranded GTX 770 on down.  That could mean AMD has time to nurture the relationship with the new partner.


----------



## FrustratedGarrett (May 8, 2013)

BigMack70 said:


> Did I miss something about performance being at 2x-7970 levels? I would be really surprised if they pulled that off... it would mean they have a chip ~40%+ faster than the Titan. Don't get me wrong, I would love to see that, but it sounds a little extreme to me.
> 
> I would expect them to do something more like match the Titan, i.e. 7970 + 40% performance.



Actually, they would have a product that's (2x / 1.3x ≈ 1.54) ~54% faster than Titan.
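The arithmetic behind that number, with both multipliers being the thread's assumptions rather than measurements:

```python
# Relative-performance sketch; both ratios are rumors/assumptions, not benchmarks.
hawaii_vs_7970 = 2.0  # assumed: double the HD 7970's performance
titan_vs_7970 = 1.3   # assumed: Titan ~30% faster than the HD 7970

hawaii_vs_titan = hawaii_vs_7970 / titan_vs_7970
print(round(hawaii_vs_titan, 2))  # 1.54, i.e. roughly 54% faster than Titan
```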


----------



## BigMack70 (May 8, 2013)

FrustratedGarrett said:


> Actually, they would have a product that's (2x / 1.3x ≈ 1.54) ~54% faster than Titan.



I was just thinking of the regular 7970, not the GHz Edition.


----------



## Mathragh (May 8, 2013)

This might not be so new, btw; someone on the tweakers.net forums showed that this picture has actually been floating around the web for quite a while now:

http://www.flickr.com/photos/8091090@N02/8006861851/in/photostream/

So it could either be really old news, or a fake.


----------



## TheHunter (May 8, 2013)

Nice this looks like a perfect upgrade.


There is never enough powaah!! Yes, even at 1080p and in newer titles. 



Although I'm a bit confused about this part:

16 serial processor units
4 geometry engines

So in reality it will have 16*4 = 64 CUs?
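A rough answer, under the assumption (not confirmed by the leak) that the chip keeps GCN's 64 stream processors per compute unit:

```python
# CU-count sketch; 64 SPs per CU is the Tahiti/GCN figure, assumed to carry over.
STREAM_PROCESSORS = 4096
SP_PER_CU = 64
GEOMETRY_ENGINES = 4

compute_units = STREAM_PROCESSORS // SP_PER_CU
print(compute_units)                      # 64 CUs in total
print(compute_units // GEOMETRY_ENGINES)  # 16 per geometry engine, if split evenly
```

So 64 CUs falls out of the shader count alone; whether the "16 serial processor cores" map onto that grouping is pure speculation.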




Casecutter said:


> Just wondering: could this be on GlobalFoundries' 20 nm?  A year ago the process was "crowning"; might AMD have been working on this with them?



I read once that AMD has 3 fabs to choose from, so there won't be any delays because of it.


----------



## tastegw (May 8, 2013)

Looking great so far; looks like Godzilla is coming.  If it's better than my Titan, I'll be selling when it comes out.


----------



## TheHunter (May 8, 2013)

Of course it will be better than Titan; Titan looks like a joke compared to this.


----------



## drdeathx (May 8, 2013)

Pandora's Box said:


> Sorry, but if this is indeed a new architecture I will be going with NVIDIA. AMD has barely got the 7xxx series working properly, and there's still the microstutter issue.



That's ludicrous until you see the performance, IMO. Looks like this may blow Nvidia out of the water.


----------



## W1zzard (May 8, 2013)

TheHunter said:


> Of course it will be better than Titan; Titan looks like a joke compared to this.



You can buy Titan today, and it's tested to be fast and stable, with decent power consumption, low fan noise, and no coil noise.

This slide might be a pipedream of some Chinese forum user who's laughing his ass off now.


----------



## TheHunter (May 8, 2013)

True, but what if you're not in a hurry? Why cash out an absurd $1000 if a ~$600 GPU later stomps all over it? 


IMO, VI vs. GK110 is like GK110 vs. GF110, if not an even bigger gap.


----------



## m1dg3t (May 8, 2013)

*Raising the bar, AGAIN!*

Goddamn you, ATi/AMD!  Hopefully I can still fetch a pretty penny for my 5870 when these drop, so it would make transitioning a little easier... 

5870 - Out of 775 sys. and for sale or donate to relative, or just add to my GPU collection 
7950 - Out of 1155 sys. and into 775 sys.
8950? - Into 1155 sys.

Sounds good! 

GFX makers: give me a MoBo-style GFX solution! One board, with swappable GPUs!


----------



## TheoneandonlyMrK (May 8, 2013)

Wow, compute-cored GFX is around the corner, nice. Whilst the pic may be a hoax, it does all make perfect sense. IMHO those serial cores will either be a new special DP shader sub-unit aimed at serial compute, or something much more interesting like Jaguar cores perhaps, but they are not at all like an SMX unit btw (can't remember who said that); they will be separate and special. I think the bus/fabric that binds it all will be very, very interesting too.


----------



## Mathragh (May 8, 2013)

W1zzard said:


> This slide might be a pipedream of some Chinese forum user who's laughing his ass off now.



Aye, all that's new with this pic is the fact that it's a negative of the original, apparently.


----------



## Casecutter (May 8, 2013)

With word on the street of Apple looking at A7 chip production at TSMC, perhaps this is why AMD and GloFo might be worth reflecting on. Plus, demand for 20 nm parts was expected to be high, at least from what TSMC and others were saying.   While we know Nvidia signed with TSMC last December for 20 nm with Maxwell in mind, when you search "AMD TSMC 20nm" the information trail is really cold.


----------



## FrustratedGarrett (May 8, 2013)

theoneandonlymrk said:


> Wow, compute-cored GFX is around the corner, nice. Whilst the pic may be a hoax, it does all make perfect sense. IMHO those serial cores will either be a new special DP shader sub-unit aimed at serial compute, or something much more interesting like Jaguar cores perhaps, but they are not at all like an SMX unit btw (can't remember who said that); they will be separate and special. I think the bus/fabric that binds it all will be very, very interesting too.



The small custom ARM CPUs might be used for scheduling complex workloads with thousands of threads, and performing some heavy-duty in-order or serial processing. 

The specs do make sense, but like W1zzard said, it could be fabricated news. I hope it's not, cuz BF4 is coming out later this year and I want to upgrade to a worthy card that can max BF4 out on a 120 Hz 2560x1440 display. 

*On another note*: are we getting any 8-bit, backlight-strobed displays this year with a 120 Hz refresh rate? I mean, with those 8-bit panels rated at 6 ms pixel transition times, it's perfectly possible with a strobing backlight to achieve perfectly smooth and clear animation that surpasses that of 120 Hz TN displays with 2-3 ms pixel response ratings.


----------



## TheHunter (May 8, 2013)

Casecutter said:


> With word on the street of Apple looking at A7 chip production at TSMC, perhaps this is why AMD and GloFo might be worth reflecting on. Plus, demand for 20 nm parts was expected to be high, at least from what TSMC and others were saying.   While we know Nvidia signed with TSMC last December for 20 nm with Maxwell in mind, when you search "AMD TSMC 20nm" the information trail is really cold.





> Bring on 2014: Volcanic Islands
> 
> After Sea Islands reaches the end of the line, AMD will introduce perhaps the most important GPU family of them all - Volcanic Islands. Volcanic Islands (VI) will go head to head against NVIDIA Maxwell and second-gen Xeon Phi architecture.
> 
> Manufactured at 20nm Gate-Last process, this will be the first GPU family which AMD should be able to manufacture in Common Platform Alliance as well as its long-standing foundry partner, TSMC. Thus, AMD will have the choice between TSMC GigaFab Hsinchu/Taichung, IBM East Fishkill, GlobalFoundries in New York and Dresden or Samsung in Austin. The manufacturing flexibility will be of paramount importance, for Volcanic Islands GPU architecture will represent the pinnacle of system integration between the CPU and GPU.



http://vr-zone.com/articles/amd-nex...aled-2013-2014-2015-gpus-get-names/17154.html


----------



## 15th Warlock (May 8, 2013)

W1zzard said:


> You can buy Titan today, and it's tested to be fast and stable, with decent power consumption, low fan noise, and no coil noise.
> 
> This slide might be a pipedream of some Chinese forum user who's laughing his ass off now.



Exactly; it seems like Titan has been getting a bad rap from a lot of ppl here lately, calling it a crappy, high-leakage GPU when it's anything but. Yes, the price is outrageous, but the performance and low power usage are there. 

Anyways, this thread is not even about Titan, so moving on... it seems like AMD is shooting for a GPU with a more general-purpose focus. If these cores are ARM cores, they could use a cut-down version of this GPU for mobile devices; depending on power consumption, this might actually be the breakthrough AMD needs to get their designs into more mobile devices. 

Let's not forget that the green team is also an ARM licensee, and they already have parts that couple A15 cores with Kepler cores in the pipeline. GPU designs take approx. 24 mos. to reach the market, so they may be working on a similar design using Maxwell cores as well, but at this point this is all speculation.


----------



## AlB80 (May 8, 2013)

It's not a GPU pic!!! It's an APU pic!!!


----------



## TheoneandonlyMrK (May 8, 2013)

15th Warlock said:


> Exactly; it seems like Titan has been getting a bad rap from a lot of ppl here lately, calling it a crappy, high-leakage GPU when it's anything but. Yes, the price is outrageous, but the performance and low power usage are there.
> 
> Anyways, this thread is not even about Titan, so moving on.... it seems like AMD is shooting for a GPU with more general purpose focus, if these cores are ARM cores they will use a cut down version of this GPU for mobile devices, depending on power consumption, this might actually be the breakthrough AMD needs to get their designs into more mobile devices
> 
> Let's not forget that the green team is also an ARM licensee, and they already have parts that couple A15 cores with Kepler cores in the pipeline, GPU designs take over aprox 24 mos. to reach the market, so they may be working on a similar design using Maxwell cores as well, but at this point this is all speculation


I think that's what NV are working towards with Project Denver (an ARM 64/32-bit CPU) and Project Parker (Denver with a Maxwell GPU), as Jen-Hsun himself stated, but they're optimistically aiming at 2015 with that, and it also unifies all system memory.
AMD are beating many to the ball with this tech, and given their APU, GPU, and SoC achievements they are certainly one to bet on, IMHO.


----------



## HumanSmoke (May 8, 2013)

Slacker said:


> Getting off topic here, but Hawaii isn't part of the Pacific Ring of Fire. It ain't a composite volcano, so why name it that? AMD needs to hire some people that really know their stuff.


With marketing? Why start now?
These are the same people who named a Southern Islands GPU after a Northern Hemisphere island archipelago (Cape Verde)

AMD ain't care.



Casecutter said:


> This is more interesting than I thought it would get. If this is true, it's going to be a stagnant 6 months.  And I might be wrong!  Nvidia might... might wise up and drop all their GTX 7XX prices to grab sales before this information really gets traction.


If this information _gets_ traction, how many 7970/7990/7950s do you think AMD are going to sell? Do you really think that a purported AMD slide (and let's face it, don't you find it just a little bit suspicious that this surfaces just as the GTX 700 hype machine gains momentum?) is going to headshot Nvidia's sales but leave AMD untouched?
You do remember AMD Osborning themselves with the Trinity hype, which resulted in the company recently taking a write-down on Llano inventory?  


Casecutter said:


> Just wondering could this be on GlobalFoundries 20nm?


AFAIA, GloFo's 20nm HKMG so far is of the low-power (LPM) variety; 20nm TSV isn't slated for volume production until 2014.

Still, the "late-2013" launch could mean just about anything (and I'm presuming the original time frame originates from James Prior's addition at the end of this piece): tape-out, risk wafer debug/validation, volume wafer starts laid down, shipping for revenue, or even... retail availability. But if it's on 20nm, then the signs are that it would be TSMC.


----------



## D007 (May 8, 2013)

amdftw said:


> Sorry, but AMD's single 7970 card has better frame times than NV's GTX 680...
> http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-Dissected-Full-Details-Capture-based-Graphics-Performance-Testin





MxPhenom 216 said:


> I might be blind, but I'm not seeing it, and I just checked every graph. They are nearly identical.



Not to mention it's a GHz edition he's clocking against. All that graph shows is just how terrible CrossFire is. Look at what you get when you SLI two GTX 680s vs. CrossFire.

Also, if you are going to compare an ATI superclocked edition in a versus comparison, compare it to an NVIDIA superclocked edition, 
not an NVIDIA standard model.


----------



## erocker (May 8, 2013)

HumanSmoke said:


> Osborning



I haven't heard that term before. I like it!


----------



## HumanSmoke (May 8, 2013)

erocker said:


> I haven't heard that term before. I like it!



Easier to fit into a sentence than "Osborne effect".


----------



## TheoneandonlyMrK (May 8, 2013)

HumanSmoke said:


> With marketing? Why start now?
> These are the same people who named a Southern Islands GPU after a Northern Hemisphere island archipelago (Cape Verde)
> 
> AMD ain't care.
> ...



TSV at any size is just chip stacking, and I'd say Maxwell/Volta will get to use it first. I doubt AMD are daft enough to do a complete refresh of the architecture, including bus changes and major IP insertion, drop the process size as well, and then add several more complications (first-time-out TSV) on top, all IMHO.


----------



## librin.so.1 (May 8, 2013)

Mathragh said:


> This might not be so new btw, someone at tweakers.net forums showed that this picture has actually been floating around the webs quite a while now:
> 
> http://www.flickr.com/photos/8091090@N02/8006861851/in/photostream/
> 
> So it could either actually be really old news, or a fake.



*IF* this thing is not fake, it looks much more like an SoC than a GPU.
Especially so because those serial processing modules bear an uncanny resemblance to Bulldozer/Piledriver cores...

EDIT: OH WAIT this thread had a second page *derp*

EDIT #2: OH TEH WAIT AGAIN this much more resembles a ramped-up version of the PS4 :|


----------



## cdawall (May 8, 2013)

drdeathx said:


> That's ludicrous until you see the performance, IMO. Looks like this may blow Nvidia out of the water.



I would hope the next generation of cards performs better than the current generation. 

As for manufactured news, I don't care; it bumped the stock up another $0.25.


----------



## HumanSmoke (May 8, 2013)

theoneandonlymrk said:


> I doubt AMD are daft enough to do a complete refresh of the architecture, including bus changes and major IP insertion, drop the process size as well, and then add several more complications (first-time-out TSV) on top, all IMHO.


Well, Southern Islands accomplished most of that.
Bus change...256-bit bus width to 384
IP insertion...VLIW4  to GCN
Process node...40nm to 28nm
TSV....I was under the impression that TSMC had already demonstrated viability under 28nm (CoWoS) over a year ago. TSV looks to be an extrapolation of the process.


----------



## D4S4 (May 8, 2013)

Looks like soon enough graphics cards will be gaming PCs all on their own.

Edit: also, take a look at the new PlayStation hardware, kinda similar? I know little about programming, but I imagine porting to a system that has the CPU and system memory separated by a comparatively slow, high-latency PCIe bus could be a hassle. Now, what if you had 8 GB of memory and some CPU-like cores on the card?


----------



## Casecutter (May 8, 2013)

Wasn't it Southern Islands for the 7XXX series, then Sea Islands, with Volcanic Islands due against Maxwell?  Well, it looks like Sea Islands is gone and the roadmap has moved up.


----------



## TheoneandonlyMrK (May 8, 2013)

HumanSmoke said:


> Well, Southern Islands accomplished most of that.
> Bus change...256-bit bus width to 384
> IP insertion...VLIW4  to GCN
> Process node...40nm to 28nm
> TSV....I was under the impression that TSMC had already demonstrated viability under 28nm (CoWoS) over a year ago. TSV looks to be an extrapolation of the process.



TSV was first done by IBM 1-2 years ago, but it is just a chip-stacking tech, which also conveniently enables odd stacks like 20nm on (45-90nm); it isn't directly tied to lithographic size. And by bus I didn't mean adding more memory controllers; I meant the internal inter-IP buses, which will also have grown a vast amount, I'd imagine.
I'm sorry, though, I should have said: I do agree with your original point that AMD will be waiting on the 20nm bulk FinFET process, or SOI gate-last, I forget the actual process names.


----------



## drdeathx (May 8, 2013)

cdawall said:


> I would hope the next generation cards perform better than the current generation of cards.
> 
> As for manufactured news I don't care bumped the stock up another $.25



Agreed CD


----------



## esrever (May 8, 2013)

Casecutter said:


> Wasn't it Southern Islands for the 7XXX series, then Sea Islands, with Volcanic Islands due against Maxwell?  Well, it looks like Sea Islands is gone and the roadmap has moved up.



Sea Islands is a refresh of Southern Islands. It also includes some new chips like Bonaire and Mars. There may be more cards coming by the end of the year, especially for mobile. CI isn't gone; it's just going to be delayed a bit and merged with existing products. VI will be more than a year away, from the looks of it.


----------



## Casecutter (May 8, 2013)

HumanSmoke said:


> don't you find it just a little bit suspicious that this surfaces just as the GTX 700 hype machine gains momentum?)


We know this is Spy vs. Spy!  Nvidia has held off on the 7XX news as long as they could and have worked down inventories without letting prices tumble.  AMD has been holding something extremely "tight-fisted" ever since that "no new card" revelation back in February.  I take all rumors with a "pinch of salt", even saying in my first post on the topic…


Casecutter said:


> If this is the course AMD is on...


 
Sure, it's released (by whoever) to rain on Nvidia, that we know…  AMD will still sell the cards they've projected based on releasing this news, or debunk it if it's not truly going to help their sales. Even if it's totally false, they wouldn't necessarily care unless it hurts their sales figures.  I think Nvidia has to take a hard look at this and decide: if it's true, either take the sales at lower profit, or hope it doesn't pan out and goes south for AMD; if it's not true, show little or no change in current price points. 

If AMD really did "out" this info this early, they played it wrong!  I would've waited the month until Nvidia started releasing cards like the GTX 780/GTX 770 for review, with MSRPs set, then fired off a "technical white paper" on what they will have; that would've been the right timing.

Heck, for all we know Nvidia outed this information, and some parts of it carry some truth.  Better for them to get it out now than letting AMD pull it out and rain on their release.  We just never know who's behind the curtain and what the strings actually mean.


----------



## esrever (May 8, 2013)

doesn't seem right.


----------



## erocker (May 8, 2013)

esrever said:


> http://abload.de/img/volcanic-islands26iur9.jpg
> doesn't seem right.



Yeah it looks like it could be that "system on a chip" that I remember reading in the news somewhere, not a GPU... But I know very little about architecture.


----------



## Casecutter (May 8, 2013)

esrever said:


> CI isn't gone; it's just going to be delayed a bit and merged with existing products. VI will be _more than a year away_, from the looks of it.


I might say some re-spins and/or re-badging of Southern Islands parts is probably still viable into 2014. Although, I'm not sure why you're saying VI is "more than a year away"; the article is projecting (speculating, alleging) the 4th quarter of this year.  I'm not saying your opinion can't still be valid, but working from this, AMD may have moved its roadmap up.


----------



## esrever (May 8, 2013)

Casecutter said:


> I might say some re-spins and/or re-badging of Southern Islands parts is probably still viable into 2014. Although, I'm not sure why you're saying VI is "more than a year away"; the article is projecting (speculating, alleging) the 4th quarter of this year.  I'm not saying your opinion can't still be valid, but working from this, AMD may have moved its roadmap up.


If they want to hit 20nm with VI, it won't be by the end of the year. From what I read online, TSMC will bring 20nm products to market in early 2014. AMD is likely to have a product-line refresh Q4 this year, but it's unlikely to be VI.


----------



## Casecutter (May 8, 2013)

esrever said:


> If they want to hit 20nm with VI, it won't be by the end of the year. From what I read online, TSMC will bring 20nm products to market in early 2014. AMD is likely to have a product-line refresh Q4 this year, but it's unlikely to be VI.



Odd, that's not what I'm reading... 
"TSMC would then be able to begin volume production at the end of the second quarter and ramp 20-nm production in the second half of 2013 a Focus Taiwan report said referencing unnamed sources."
http://www.eetimes.com/design/programmable-logic/4411189/TSMC-expected-to-begin-20-nm-line-early
http://www.xbitlabs.com/news/other/..._Is_Ahead_of_Its_Own_20nm_Roadmap_Report.html



Casecutter said:


> But you can kind of feel how the camps are liking or disliking this 20nm.
> 
> http://www.extremetech.com/computing...ally-worthless


----------



## the54thvoid (May 8, 2013)

Mass hysteria from what looks like a slide for another product.  Ho hum, not going to bother getting excited.  Still expecting well into 2014 to be the next GPU generation's release year.  At least by that time the next-gen consoles should be pushing a few more game developers into new territory for visual fidelity, so the new GPUs can flex some mighty muscle.


----------



## esrever (May 8, 2013)

Casecutter said:


> Odd that's not what I'm reading...
> "TSMC would then be able to begin volume production at the end of the second quarter and ramp 20-nm production in the second half of 2013 a Focus Taiwan report said referencing unnamed sources."
> http://www.eetimes.com/design/programmable-logic/4411189/TSMC-expected-to-begin-20-nm-line-early
> http://www.xbitlabs.com/news/other/..._Is_Ahead_of_Its_Own_20nm_Roadmap_Report.html



They moved it up for Apple, which probably means low volume for everyone else at first. I don't really expect 20nm GPU parts this year, but maybe a paper launch in December like they did with the 7970.


----------



## FrustratedGarrett (May 8, 2013)

esrever said:


> They moved it up for Apple, which probably means low volume for everyone else at first. I don't really expect 20nm GPU parts this year, but maybe a paper launch in December like they did with the 7970.



They'll probably fabricate the GPUs using GloFo's 28nm HPP or LPH, which are used for the Kaveri and Temash SoCs. GloFo's 28nm HPP is rumored to consume ~25% less power at the same transistor performance (frequency) than TSMC's 28nm HP, which is what Nvidia (not AMD) uses for its current-gen graphics chips.


----------



## RoostieJDio (May 8, 2013)

Looks more like an Opteron-flavoured APU than a GPU.


----------



## TheoneandonlyMrK (May 8, 2013)

erocker said:


> Yeah it looks like it could be that "system on a chip" that I remember reading in the news somewhere, not a GPU... But I know very little about architecture.



After looking at a better pic, it looks like the PS4's APU to me, possibly next-Xbox-ish, but VI is doubtful, as it's also a fully capable APU with a GCN GPU and x86, plus an ARM security core, and it shows what appears to be fused-off HyperTransport (which, if enabled, would allow multi-socket server cards, etc.). Interesting, though, as this is where GPUs are eventually going.

Also, I counted 1024 shader units in the array, though it could obviously break down further. My mind may explode: 16 serial processors, WTF.


----------



## mastrdrver (May 9, 2013)

Does everyone realize that this picture is about a year old and was posted by the same person about a year ago?


----------



## HumanSmoke (May 9, 2013)

esrever said:


> http://abload.de/img/volcanic-islands26iur9.jpg
> doesn't seem right.


I think the DDR3 memory controllers might be a small giveaway, but you never know... maybe that's just for the enthusiast level. Hopefully they don't skimp on the EDO RAM for the mainstream cards. 


Casecutter said:


> Heck for all we know Nvidia out'd this information and some parts of it carry some truth....


Really? Nvidia "leak" supposed AMD next-gen GPU that based on the specs would likely crush everything in existence. 
The tinfoil is for a hat...not an entire bodysuit.


----------



## TheoneandonlyMrK (May 9, 2013)

HumanSmoke said:


> I think the DDR3 memory controllers might be a small giveaway, but you never know...maybe that's just for the enthusiast level. Hopefully they don't skimp on the EDO RAM for the mainstream cards



Whilst I don't disagree on the pic, IMHO this is similar to what you can expect just after the GCN2 refresh, mostly because there are surely bins yet to be sold, and Southern Islands taped out a while ago, AFAIK.


----------



## xtremesv (May 9, 2013)

This doesn't change a thing. AMD is incapable of delivering a competitive GPU on the 28 nm process the way Nvidia can (not talking about pricing, though). Instead of developing a more efficient GPU to fit current technology, they must wait for others to do their job. I know, perhaps I'm being too hard.

AMD thinks Southern Islands is still very competitive considering current games, and they may be right; nonetheless, enthusiasts and geeks always expect the next big weapon in the endless AMD vs. Nvidia war.


----------



## cdawall (May 9, 2013)

xtremesv said:


> This doesn't change a thing. AMD is incapable of delivering a competitive GPU on the 28 nm process the way Nvidia can (not talking about pricing, though). Instead of developing a more efficient GPU to fit current technology, they must wait for others to do their job. I know, perhaps I'm being too hard.
> 
> AMD thinks Southern Islands is still very competitive considering current games, and they may be right; nonetheless, enthusiasts and geeks always expect the next big weapon in the endless AMD vs. Nvidia war.



Why bother continuing with 28nm tech when 20nm is just around the corner?


----------



## sergionography (May 9, 2013)

Looks nothing like a 4096-core part; more like a 16 CU × 64-core part (1024 shaders) with 8 CPU cores.
Xbox SoC, maybe? 1024 GCN cores and 8 Jaguar cores, with a 512-bit bus to maximize bandwidth on the slow DDR3 memory? That seems the most likely case, since it's clearly an SoC, distinguishing between parallel and serial compute units.

Or who knows, maybe it's Kaveri, with AMD upping the performance level of APUs by making much larger chips with more goodies (I think this because showing every 2 CPU cores as one module suggests it's based on Steamroller, since Jaguar pairs L2 cache per 4 cores, not per 2).
An 8-core APU, and maybe a 10-12-core FX on Steamroller? Sure, bring it on, why not.
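FWIW, the CU math above is easy to check. A quick sketch: the 64-SPs-per-CU figure is standard GCN; everything else is just the rumored numbers, nothing confirmed:

```python
# Sanity-check the rumored shader counts (all figures speculative).
# GCN groups 64 stream processors (SPs) into one compute unit (CU).
SP_PER_CU = 64

def stream_processors(compute_units: int) -> int:
    """Total stream processors for a GCN part with the given CU count."""
    return compute_units * SP_PER_CU

print(stream_processors(16))  # 1024 -- what the diagram appears to show
print(4096 // SP_PER_CU)      # 64 -- CUs a real 4096-SP "Hawaii" would need
```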


----------



## H82LUZ73 (May 9, 2013)

I have to agree with the guys saying APU/console parts. I have seen May 24-28 as the reveal date for either the Xbox 720 or the PS4 (but they will not be out until Oct-Nov this year, to cash in on the Christmas buyers). Like I have been saying, AMD is in a holding pattern waiting for those consoles to come out. Why release a new PC GPU card and have the so-called new consoles behind again?


----------



## Ebo (May 9, 2013)

I don't care how it performs, I'm still going to buy two for CrossFire.

Sam


----------



## Aquinus (May 9, 2013)

I think everyone should remember that Sea Islands is supposed to come first. This is a long way down the road. I don't think we'll see this until at least next year at the earliest. I could be wrong, and I would be more than happy to be wrong.


----------



## Crap Daddy (May 9, 2013)

Aquinus said:


> I think everyone should remember that Sea Islands is supposed to come first. This is a long ways down the road. I don't think we'll see this until at least next year at the earliest. I could be wrong and I would be more than happy to be wrong.



Well, of course. There are two options for AMD: either launch Sea Islands as soon as they are ready with the Tahiti respin, which will probably bring a performance increase similar to what NV will offer with the GTX 780, 770 and the rest, or launch Volcanic on 20nm next year.
I can't imagine a 20nm monster launched this year. And that slide has nothing to do with a Volcanic GPU.


----------



## jihadjoe (May 9, 2013)

Prima.Vera said:


> lol.
> 
> 4096 stream processors??? That's double that of a 7970 GPU
> 512-bit GDDR5??? Prepare for some bandwidth records
> ...



Specs actually remind me of the 2900XT, but we all know how that turned out.
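The bandwidth bragging in the quote is simple arithmetic. A quick sketch, assuming a 6.0 Gbps effective GDDR5 data rate (the rumor only gives the bus width, so the clock is an assumption):

```python
# Theoretical memory bandwidth = bus width (in bytes) * effective per-pin data rate.
# The 6.0 Gbps rate is an assumption; the rumor only specifies the bus width.
def bandwidth_gb_s(bus_bits: int, effective_gbps: float) -> float:
    """Peak bandwidth in GB/s for a given bus width and per-pin data rate."""
    return bus_bits / 8 * effective_gbps

print(bandwidth_gb_s(512, 6.0))  # 384.0 -- the rumored 512-bit bus
print(bandwidth_gb_s(384, 6.0))  # 288.0 -- Tahiti's 384-bit bus, for comparison
```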


----------



## cdawall (May 9, 2013)

jihadjoe said:


> Specs actually remind me of the 2900XT, but we all know how that turned out.



It was fast, but a bit high on the power-consumption side. Same thing with the current 7970s. AMD isn't exactly the power-friendly group. That being said, the 3xx0 series was pretty awesome, as was the 4xx0 series.


----------



## BigMack70 (May 9, 2013)

I'm not sure why the 7xxx series is getting blasted for power consumption... when it released, it was lauded as being quite power efficient. The fact that Nvidia managed to one-up them on power efficiency doesn't mean that AMD is terrible on power. We're not talking about the GTX 480 here.


----------



## librin.so.1 (May 9, 2013)

BigMack70 said:


> We're not talking about the GTX 480 here.



...the flagship GPU of the 400 series that had to have one of its SMs disabled due to being too power-hungry. ¬____¬


----------



## Casecutter (May 9, 2013)

cdawall said:


> Same thing with the current 7970's. AMD isn't exactly the power friendly



Go blow that smoke somewhere else.  When you look at power usage in actual gaming and average it, the 7970 uses only 3-4% more watts than the GTX 680 while more often sitting 5-7% off the GTX 680's performance.  True, the GHz versions drew more power on synthetic tests (not much fun playing synthetic tests), but in gaming the GHz edition actually came in 7 W lower than a GTX 680.  In real-world gaming, a GHz edition will in no way change what you pay on your power bill; it might, dare I say, cost you less...

http://www.hardocp.com/article/2012/03/22/nvidia_kepler_gpu_geforce_gtx_680_video_card_review/13

http://www.hardocp.com/article/2012/10/30/xfx_double_d_hd_7970_ghz_edition_video_card_review/10
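To put the wattage argument in perspective, here's a rough sketch of what a small gaming-load delta actually costs per year. All inputs are illustrative assumptions, not figures from either review:

```python
# Yearly electricity cost of a wattage difference between two cards.
# All inputs below are illustrative assumptions, not measured figures.
def yearly_cost_usd(delta_watts: float, hours_per_day: float, usd_per_kwh: float) -> float:
    """Cost per year of drawing `delta_watts` extra for `hours_per_day`."""
    kwh_per_year = delta_watts / 1000 * hours_per_day * 365
    return round(kwh_per_year * usd_per_kwh, 2)

# A 10 W gaming-load gap, 3 hours/day, at $0.12/kWh:
print(yearly_cost_usd(10, 3, 0.12))  # 1.31 -- pocket change either way
```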


----------



## sergionography (May 9, 2013)

Vinska said:


> ...the flagship GPU of the 400 series that had to have one of its SMs disabled due to being too power-hungry. ¬____¬



Kinda like GK110, you mean (Titan), with one unit disabled after a year in the making,
and then the 780 coming out with two units disabled, lol.
Nvidia is just horrible with new process nodes; they never seem to get the hang of them. They are the ones always complaining about yields and whatnot, simply because their engineers fail to work with TSMC's fabs, which Nvidia has been using for over 10 years.
AMD on that front is miles ahead, always bringing excellent chips right when the fab spins up. Gotta appreciate that about AMD, and if this Volcanic Islands thing is coming out this year, then it only further proves my point (though I still believe that image has nothing to do with Volcanic Islands or the rumor itself).


----------



## Apocolypse007 (May 9, 2013)

Hello everyone. I haven't posted here in ages, though I still scan the articles from time to time. Anyway, here is a more clear image with text you can actually read on the specs:


----------



## librin.so.1 (May 9, 2013)

Apocolypse007 said:


> [...] here is a more clear image with text you can actually read on the specs:



It's nice and all, but too bad this was already posted 26 posts ago (post #74).
Here, have a cookie.


----------



## Apocolypse007 (May 9, 2013)

Vinska said:


> It's nice and all, but too bad this was already posted 26 posts ago (post #74).
> Here, have a cookie.



I'm sorry, I didn't notice. Like I said, I tend to only cruise the news articles here these days and didn't see all 100+ replies (only the first page or so of replies ends up on the news article).


----------



## NeoXF (May 9, 2013)

Vinska said:


> It's nice and all, but too bad this was already posted 26 posts ago (post #74).
> Here, have a cookie.



Cool story bro... and you're trolling him... why?


----------



## Hilux SSRG (May 9, 2013)

"The fab set Q4 as its tentative bulk manufacturing date for the process."

So consumers will be able to purchase limited quantities in Q4 2013, with open availability in Q1 2014?  If so, then what will AMD launch between now and then?


----------



## HumanSmoke (May 9, 2013)

Casecutter said:


> Go blow that smoke somewhere else.  When you look at power usage in actual gaming and average it, the 7970 uses only 3-4% more watts than the GTX 680 while more often sitting 5-7% off the GTX 680's performance.  True, the GHz versions drew more power on synthetic tests (not much fun playing synthetic tests), but in gaming the GHz edition actually came in 7 W lower than a GTX 680.  In real-world gaming, a GHz edition will in no way change what you pay on your power bill; it might, dare I say, cost you less...
> http://www.hardocp.com/article/2012/03/22/nvidia_kepler_gpu_geforce_gtx_680_video_card_review/13
> http://www.hardocp.com/article/2012/10/30/xfx_double_d_hd_7970_ghz_edition_video_card_review/10


Well, something must have happened between those old tests and the newer ones at [H]OCP...

I don't doubt that the power usage cost is negligible for the likely user bases of the cards, but I don't see the 7970GE being more frugal in power consumption than the 680 either...except in a very small minority of games.

Anyhow, I'd say all bets are off with any new architectures on new processes. I'm pretty sure that no one would have predicted the perf-per-watt difference between Fermi and Kepler, so there is no reason why the same can't be said for SI > VI


----------



## TheoneandonlyMrK (May 9, 2013)

HumanSmoke said:


> Well, something must have happened between those old tests and the newer ones at [H]OCP...
> http://www.hardocp.com/images/articles/1361915444mQwWwGD0oO_9_1.png
> 
> I don't doubt that the power usage cost is negligible for the likely user bases of the cards, but I don't see the 7970GE being more frugal in power consumption than the 680 either...except in a very small minority of games.
> ...



The slight efficiency you're claiming for Kepler over Fermi, whilst real, is in part due to the lack of compute power and its advanced clock and power gating. It isn't that special if you understand the tech and the way Nvidia aimed GK104 so specifically at gaming, and it isn't out of AMD's reach at all.


----------



## Mathragh (May 9, 2013)

IMHO, a big part of Kepler's efficiency is the result of their new power tuning, which can keep the voltage a lot closer to the optimum than previous generations, resulting in a lower voltage needed for specific clocks, and thus lower power consumption and heat.


----------



## HumanSmoke (May 9, 2013)

theoneandonlymrk said:


> The slight efficiency you're claiming for Kepler over Fermi, whilst real, is in part due to the *lack of compute power* and its advanced clock and power gating. It isn't that special if you understand the tech and the way Nvidia aimed GK104 so specifically at gaming, and it isn't out of AMD's reach at all.


Really? 
I was comparing apples to apples: GF100/GF110 to GK110. 

GK110 is the Kepler µarchitecture, isn't it?


----------



## cadaveca (May 9, 2013)

HumanSmoke said:


> Well, something must have happened between those old tests and the newer ones at [H]OCP...
> http://www.hardocp.com/images/articles/1361915444mQwWwGD0oO_9_1.png
> 
> I don't doubt that the power usage cost is negligible for the likely user bases of the cards, but I don't see the 7970GE being more frugal in power consumption than the 680 either...except in a very small minority of games.
> ...



My ASUS 7970 Matrix @ 1200/1650 pulls less than 250 W when gaming. In Furmark those numbers might be possible, but otherwise they seem incredibly high to me. My 7950s don't pull over 200 W either; more like 150 W.

I also doubt their full system, minus the VGA, pulled only 63 W as they report. Just saying. Feel free to check ANY of my motherboard reviews to find more realistic numbers for system power consumption. I'd almost say that [H]ardOCP's reviewer there didn't really test anything.

The test setup is listed as a 2500K @ 4.8 GHz. Average power consumption of such a CPU is around 150 W in Prime95, and about 90 W in gaming. It's impossible for the CPU, fans, and drives to total only 63 W. Just saying. Their numbers are way off. I'd subtract at least 75 W from each of those listed figures. Even the NVIDIA numbers are suspect.
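A quick sketch of that sanity check. The wall reading, PSU efficiency, and GPU draw below are illustrative assumptions for the sake of the arithmetic, not numbers from the [H] review:

```python
# Back-of-envelope check on an implied non-GPU system draw.
# The wall reading, PSU efficiency, and GPU draw are illustrative assumptions.
def non_gpu_draw(wall_watts: float, psu_efficiency: float, gpu_watts: float) -> float:
    """DC-side system draw minus the GPU, accounting for PSU losses."""
    return round(wall_watts * psu_efficiency - gpu_watts, 1)

# 380 W at the wall, a 90%-efficient PSU, and a 250 W GPU leave:
print(non_gpu_draw(380, 0.90, 250))  # 92.0 -- roughly what an overclocked 2500K alone uses in games
```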


----------



## TheoneandonlyMrK (May 9, 2013)

HumanSmoke said:


> Really?
> I was comparing apples to apples, GF100/GF110 to GK 110.
> GK 110 is Kepler µarchitecture isn't it ?



Yeah, yeah, GK110's power efficiency is way beyond AMD's... not, IMHO. Am I missing your point here?


----------



## HumanSmoke (May 10, 2013)

theoneandonlymrk said:


> Yeah, yeah, GK110's power efficiency is way beyond AMD's... not, IMHO. Am I missing your point here?


Yeah, you probably are. What I said earlier was:


> I'm pretty sure that no one would have predicted the perf-per-watt difference between Fermi and Kepler, so there is no reason why the same can't be said for SI > VI


So, if Nvidia can improve on efficiency between one µarch and another, then the same holds true for AMD from SI (Southern Islands) to VI (Volcanic Islands). I made no comparison between the two vendors regarding what might/could eventuate. The only comparison was in the earlier part of the post: pointing out to Casecutter the vagaries of what passes for "power usage under load", even within tests carried out by the same site.
If I'm comparing µarch to µarch, then I would generally compare the analogue of each architecture's GPUs. GF100/GF110 and GK110 are similar in die size, placement within the product-stack hierarchy, and feature set. 


cadaveca said:


> I also doubt their full system, minus VGA pulled only 63W as they report. Just saying. Feel free to check ANY of my motherboard reviews to find more realistic numbers for system power consumption. I'd almost say that [H]ardOCP's reviewer there didn't test anything, really.


I don't claim the [H]ardOCP figures are definitive either; they really can't be, with the variance between tests conducted only a few months apart. I only used the [H]ardOCP result because Casecutter was using the same source for his initial argument.


----------



## NeoXF (May 10, 2013)

theoneandonlymrk said:


> Yea yeah gk110s power efficiency is way beyond amd not imho am I missing your point here ?.



GK110 with one unit shut down still consumes at least 20 W more than a 7970 GHz and gets trashed by it in anything compute-related. And it isn't that much faster in a good number of games; heck, it's slower in DiRT Showdown, Tomb Raider (under some circumstances) and some other titles.

...and that's considering they had more than a year to get it right (LMAO @ the people saying they didn't release it because GK104 was "good enough").

nVidia might have marginally better power consumption at the high end, but that is all; nothing special about it. Looking at the lower-end chips, AMD actually has the wider advantage, with nVidia the one fielding the higher-TDP cards.


----------



## 15th Warlock (May 10, 2013)

NeoXF said:


> GK110 with one unit shut down still consumes at least 20 W more than a 7970 GHz and gets trashed by it in anything compute-related. And it isn't that much faster in a good number of games; heck, it's slower in DiRT Showdown, Tomb Raider (under some circumstances) and some other titles.
> 
> ...and that's considering they had more than a year to get it right (LMAO @ the people saying they didn't release it because GK104 was "good enough").
> 
> nVidia might have marginally better power consumption at the high end, but that is all; nothing special about it. Looking at the lower-end chips, AMD actually has the wider advantage, with nVidia the one fielding the higher-TDP cards.



Why persist in spreading misinformation? This is from a previous thread about the GTX 780:

From W1zzard's own GTX Titan review you can find in the TPU website:

*Power consumption:*

The 7970 GHz is beaten by Titan in power-consumption efficiency in _every single scenario_.

*Relative performance (average of every single 3D benchmark on every resolution):*

GTX Titan beats the 7970 GHz at every single resolution. Now for Tomb Raider, this is from W1zzard's review of the 7990:


The GTX Titan is faster than the 7970 GHz at every resolution in that particular game. You may counter that the 7990 is faster (and it is), but that's not even the point; dunno about DiRT Showdown, but if what you say is true (W1zzard doesn't even test cards using that game), then it's probably the only scenario where the 7970 GHz beats the Titan...

EDIT: Oh wait, I found these benchmarks using DiRT Showdown at Anand's:


Only in one scenario does the 7970 "beat" Titan (if you call 0.9 FPS beating it).

EDIT 2: As for the 7970 "trashing" Titan in compute performance, the theoretical max double-precision (FP64) throughput of the 7970 is 1.08 TFLOPs, whereas Titan's is 1.3 TFLOPs. But don't take it from me; this is (once again) from Anandtech, an analysis of Titan's compute performance by Rahul Garg, a Ph.D. specializing in the field of parallel computing and GPGPU technology:




























Out of all compute tests performed, only in SystemCompute benchmark Titan is beat by 7970GHz, in all other benchmarks Titan leaves 7970 in the dust... I exactly wouldn't call that "trashing"
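For what it's worth, "theoretical max" FP64 figures like those come from the usual peak-throughput formula: shaders × clock × 2 FLOPs per FMA × the chip's FP64 rate. A minimal sketch, assuming the commonly published clocks and ratios for these two cards (Tahiti runs FP64 at 1/4 rate; Titan runs it at 1/3 rate but lowers its clock to roughly 725 MHz in full-speed FP64 mode, which is where the ~1.3 TFLOPs figure comes from):

```python
# Peak FP64 throughput sketch: shaders * clock (GHz) * 2 FLOPs/FMA * FP64 rate.
# Clocks and ratios are published specs, not measurements.

def fp64_tflops(shaders, clock_ghz, fp64_rate):
    """Theoretical peak double-precision throughput in TFLOPs."""
    return shaders * clock_ghz * 2 * fp64_rate / 1000.0

# HD 7970 GHz Edition: 2048 SPs at 1.05 GHz, FP64 at 1/4 the FP32 rate
hd7970ghz = fp64_tflops(2048, 1.05, 1 / 4)   # ~1.08 TFLOPs

# GTX Titan: 2688 cores at 1/3 FP64 rate; clock assumed ~725 MHz in FP64 mode
titan = fp64_tflops(2688, 0.725, 1 / 3)      # ~1.3 TFLOPs

print(round(hd7970ghz, 2), round(titan, 2))
```

The same formula with Titan's normal 837 MHz+ clocks gives ~1.5 TFLOPs, which is why both figures show up in reviews depending on which mode is assumed.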


----------



## NeoXF (May 10, 2013)

15th Warlock said:


> [snip]



I don't really care for your cherry-picked reviews TBH...

Also, I do want to remind everyone that TPU does not hold the absolute truth in regards to GPU reviews, you know?


----------



## librin.so.1 (May 10, 2013)

Time for some lulz.
From the very same W1zz's review:

*Wahaha~!*

*(chart screenshot)*

*Wahahaha~!*


----------



## 15th Warlock (May 10, 2013)

NeoXF said:


> I don't really care for your cherry-picked reviews TBH...
> 
> Also, I do want to remind everyone that TPH does not hold the absolute truth in regards to GPU reviews, you know?



Cherry-picked? This is the TPU forums you're posting at; what's more appropriate than W1zzard's own reviews?

Not only that, but every scenario presented completely contradicts the claims you made. I'm not cherry-picking anything; I'm actually posting every single test result. You mention TR and DiRT... and now I'm the one cherry-picking?

You know, it doesn't really matter. If even showing you all the results (including an analysis from a Ph.D., no less) won't convince you, then nothing will. If that's how you feel about this card in particular, you're entitled to your opinions...

Moving on...

EDIT: Just saw Vinska's reply, and I'm the one cherry-picking, right...? I presented the condensed results for every single resolution in every single game... but I can see this will drag on forever. It doesn't really matter, you guys win, OK 

Peace


----------



## librin.so.1 (May 10, 2013)

15th Warlock said:


> EDIT: Just saw Vinska's reply, and I'm the one cherry picking, right...? I presented the condensed results for every single resolution in every single game... but this can drag on forever I see, it doesn't really matter, you guys win, OK
> 
> Peace



yeah, it seems like cherry picking, but my point actually was:
Take these reviews with a *HUGE grain of salt*.

If You take a better look at W1zz's review: in Sleeping Dogs, the 7970 [GE] had almost twice [!] the fps at 5760x1080 compared to 2560x1600. And at 1920x1200 it was slightly behind 5760x1080.
Similar situation with AC3: at 5760x1080 it ran significantly faster than at 2560x1600, and at 1920x1200 it had pretty much the same framerates as at 5760x1080.

If that doesn't spell out the phrase "_something is fishy with this review_", then I don't know what would.

EDIT: I pointed out these things in the review discussion thread (there was a similar thing with the 7990, too). But no one seemed to care at all. Yet, I would LOVE to get an explanation or _even a guess_ to WTH is wrong here (as something obviously is).


----------



## ne6togadno (May 10, 2013)

ok, fanboy wars will never end, so can't we finish with the dick size contest and get back on topic?
what i see in the cleaned-up pictures (thx apocolypes) looks more like a Piledriver-based APU (maybe for a next-gen console) than a discrete GPU. it was exciting at the beginning, but at a closer look it's more likely fake news.


----------



## Aquinus (May 10, 2013)

ne6togadno said:


> ok fanboy wars will never end so cant we finish with dick size contest and go back to topic.



Disagreements and conversation about said topic does not qualify as fanaticism so I recommend not clumping everyone together and calling them "fan boys" when you're just adding to the noise by saying this.

I think we can all agree that the Titan is a powerful card at the cost of some extra moolah, where the 7970 provides somewhat less performance for considerably less moolah. Whether or not the 700-series cards will be more like Titan, we don't know, but what I will say is that regardless of what NVidia has up their sleeves, AMD is working on something as well.

I think everyone should calm down and acknowledge that NVidia and AMD are both very good companies that produce quality hardware. If you disagree with me, then maybe you're the one being a fanatic, and I'll challenge you to design a better GPU if you're going to keep bashing people who are doing things that most here can probably only dream of.


----------



## NeoXF (May 10, 2013)

15th Warlock said:


> (including studies from a Ph.D no less)



OK, now I know you are joking... How the Hell is that relevant to anything discussed here? What, doctors or highly educated or top-positioned people can't be biased, wrong, or just simply... mistaken? If anything, they are more prone to ego mistakes and corruption. But then again, that's some pure generalization to make a point right there.

I can find any number of reviews where the Titan hovers at 35W or more over the 7970 GHz, as well as benches showing it beaten by a bunch of frames in DiRT:S, Tomb Raider and probably some other not-so-known titles, as well as having the Radeon breathing down its neck or tying with it in Sleeping Dogs, Far Cry 3: Blood Dragon, Metro 2033, AvP 2010, Sniper Elite V2, Max Payne 3 and some games at 4K... As for compute... don't get me started.
And so could you, very probably (heck, you just did)... so I don't really care, I just hate seeing an overpriced piece of late hardware praised for things that aren't even true.




Aquinus said:


> Disagreements and conversation about said topic does not qualify as fanaticism so I recommend not clumping everyone together and calling them "fan boys" when you're just adding to the noise by saying this.



What I was about to reply to him. LOL
Thanks.


Edit: So, more on-topic. Like the GF 7900 GTX to 8800 GTX (DX9.0c to DX10) and the Radeon HD 4870 to HD 5870 (DX10 to DX11), I see this Volcanic Islands card as another huge jump in performance... but tied to the jump from HD to UHD more than anything else, like an API upgrade, because let's face it, none of today's cards cut it for 3840x2160 gaming (not that it's here yet anyway)... I dislike multi-monitor setups so much that multi-GPU and its subsequent issues are a non-issue for me from the get-go.


----------



## d1nky (May 10, 2013)

did you see the review with the 4k resolutions? on a single card setup the 7970 performed quite well, but titan did come out on top. and tbh, like people said before, the hardware is performing well on both sides and it's up to the software to catch up.

and is that volcanic islands diagram real?


----------



## FrustratedGarrett (May 10, 2013)

15th Warlock said:


> From W1zzard's own GTX Titan review you can find in the TPU website:
> 
> *Power consumption:*
> http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_Titan/images/power_multimon.gif
> ...



Well, the power consumption figures from Tech Report show a different picture:

The 7970 GHz Edition consumes less than 10 watts more than the GTX 680 at full load, but it consumes less power than the 680 at idle and up to 11 watts less when the display is off, and that makes it more power efficient than the 680.




15th Warlock said:


> As for the 7970 "trashing" Titan in compute performance, the theoretical max double precision performance (FP64) for 7970 is 1.08TFLOPs whereas Titan's is 1.3TFLOPs, but don't take it from me, this is (once again) from Anandtech, an analysis of Titan's compute performance by Rahul Garg, a Ph D. specializing in the field of parallel computing and GPGPU technology:
> 
> Out of all compute tests performed, only in SystemCompute benchmark Titan is beat by 7970GHz, in all other benchmarks Titan leaves 7970 in the dust... I exactly wouldn't call that "trashing"



I wouldn't trust Anandtech, and I certainly think they are biased in favor of Intel and against AMD. 

In the compute tests from Tom's Hardware and Tech Report, or even Hexus, you get a completely different picture. The 7970 does trash even the dual-GPU 690 and blow it out of the water when it comes to shader performance in GPGPU.


----------



## ne6togadno (May 10, 2013)

Aquinus said:


> Disagreements and conversation about said topic does not qualify as fanaticism so I recommend not clumping everyone together and calling them "fan boys" when you're just adding to the noise by saying this.



so you are telling me that titan vs 7970 fits nicely under "AMD's Answer to GeForce GTX 700 Series: Volcanic Islands"? 
english isn't my native language and i don't pretend to understand it perfectly, but obviously i understand it even worse than i thought. seems no one wants to (or can) comment on what is shown in the picture, so let's share more "test results" that favor "my graphics card".
anyway, the discussion went too far from the topic to be useful. gl with the diagram comparison.
the truth is out there


----------



## cadaveca (May 10, 2013)

Vinska said:


> Similar situation with AC3 - on 5760x1080 it run significantly faster compared to 2560x1600 and 1920x1200 had pretty much the same framerates as 5760x1080
> 
> If that doesn't spell out the phrase "_something is fishy with this review_", then I don't know what would.
> 
> EDIT: I pointed out these things in the review discussion thread (there was a similar thing with the 7990, too). But no one seemed to care at all. Yet, I would LOVE to get an explanation or _even a guess_ to WTH is wrong here (as something obviously is).





Only AMD can answer why this is the case. I tested myself and get the same results as W1zz does. AMD said their memory management is broken/sub-optimal, and that's how it's noticed...Eyefinity.

Also, Eyefinity doesn't actually draw every single pixel on the side monitors in the same ratio/aspect as on the primary monitor, due to the fish-eye effect. So although the resolution of the monitors is 5760x1080/1200, the workload may not actually be that many pixels, depending on the app.


Do keep in mind that W1zz used to write ATITool, and writes other AMD-specific clocking apps. Best I can tell, he really doesn't care who is faster, and has no agenda...notice we don't have ads here except on the front page. TPU is not a site driven by the opportunity to make money doing reviews...we all just provide the numbers, and you decide who you like based on the results. Because anyone can replicate our tests, in every review. For me, I actually hope you do test and check our numbers... I know you'll find you get the same results.


----------



## 15th Warlock (May 10, 2013)

Vinska said:


> yeah, it seems like cherry picking, but my point actually was:
> Take these reviews with a *HUGE grain of salt*.
> 
> If You take a better look at W1zz's review, on Sleeping Dogs, the 7970 [GE] had almost twice [!] the fps on 5760x1080 when compared to 2560x1600. And on 1920x1200 it was slightly behind 5760x1080.
> ...



That's interesting, have you tried sending a PM to W1zzard with your findings? I'm sure he'll appreciate it and change the charts accordingly 



Aquinus said:


> Disagreements and conversation about said topic does not qualify as fanaticism so I recommend not clumping everyone together and calling them "fan boys" when you're just adding to the noise by saying this.



Couldn't agree more, thank you very much for your post. In the almost nine years I've been visiting this forum, this is the first time I've been labeled a fanboy. Truth is, I had never before made a post with so many charts to try and get my point across, and all for naught LOL 




NeoXF said:


> OK, now I know you are joking... How the Hell is that relevant to anything ever discussed here... What, Doctors or highly educated or top positioned people can't be biased, wrong or just simply... mistaken? If anything, they are more prone to ego mistakes and corruption. But then again, that's some pure generalization to make a point right there.



The guy's field of study is parallel computing and GPGPU technology; there's not much room for bias in that field. Then again, like I said before, you're entitled to your opinion, and you have already mentioned your mistrust of doctors in general (and science by extension?!), so there's no point in trying to convince you otherwise, right?

I guess we can all agree that at this point speculating on the performance of graphics cards that are yet to be released is pointless, as there's no evidence whatsoever to back these claims. All we can do is wait and see; no point in fighting to try and show the world who has the biggest e-peen 

It's all good, like I said, this could drag on forever; perhaps it's better to move on, for the sake of this thread


----------



## Casecutter (May 10, 2013)

HumanSmoke said:


> Well, something must have happened between those old tests and the newer ones at [H]OCP...


Yeah, if you read what you posted, [H] say they're using "real gaming" and recorded the *highest* value in each... not an average over the section it took to complete! Sure, a 7970 might peak for a millisecond; is that what they mean by the "highest" value?  

[H] don't tell us the games used now, but we can hopefully figure out the 5 [H] used in that new review, which are different than the earlier 5. [H] dropped Batman and The Witcher (which used 11% more watts than the 7970 GHz), and that has moved the data against the GHz Edition. Also, in most of the titles the 7970 GHz provides more FPS versus a 680, so we'd logically anticipate more power usage. Even in Sleeping Dogs [H] had to use the lower 1920x res to get more FPS. 

Going back to an average of what a card requires to complete the run-through of each game, then taking those five games, adding them together and dividing by 5, is more real-world any way you slice it.
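The averaging scheme described above (mean power over each game's full run-through, then the mean of those per-game averages across all five titles) can be sketched like this; the game names and wattage samples are made up for illustration:

```python
# Hypothetical sketch: average power per run-through, then average the
# per-game means across five titles. All numbers are invented.

runs = {
    "game1": [231, 244, 239, 250],   # watt samples over one run-through
    "game2": [198, 210, 205, 202],
    "game3": [260, 255, 270, 265],
    "game4": [180, 190, 185, 188],
    "game5": [220, 230, 225, 228],
}

# mean power over each game's run-through
per_game_avg = {g: sum(w) / len(w) for g, w in runs.items()}

# then the mean of those five per-game averages
overall = sum(per_game_avg.values()) / len(per_game_avg)

print(per_game_avg["game1"], overall)
```

Unlike a single "highest reading," this kind of figure isn't dominated by one transient spike in one title.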



15th Warlock said:


> Why persist in spreading misinformation? This is from a previous thread about the GTX780:
> 
> From W1zzard's own GTX Titan review you can find in the TPU website:



That's only one game, Crysis 2, on a specific run-through. Sure, it looks good by that one data point, but it's hardly telling the whole story when various titles have their own average power usage over a long period of play. Sure, if all you play is "Crysis 2 at 1920x1200, Extreme profile, representing a typical gaming power draw. *Highest single reading during the test*", and you limit your play to that one small run-through each time, then you can abide by that one point of data.


----------



## Recus (May 10, 2013)

sergionography said:


> kinda like gk110 you mean(titan) with one unit disable after a year of making
> and then the 780 comming out with 2 units disabled lol
> nvidia is just horrible with new process nodes, they never seem to get the hang of that, they are the ones always complaining about yield and what not simply because their engineers fail to work according to tsmc's fabs which nvidia has been using for over 10 years
> amd on that front is miles ahead, always bringing excellent chips right when the fab spins, gotta appreciate that from amd, and if this volcanic islands thing is coming out this year then it only further proves my point (though i still believe that image has nothing to do with volcanic islands or the rumor itself)



Except AMD regularly feeds their fans fake marketing slides and lies, such as HD 2000 (DDR4), HD 7000 (XDR2), HD 9000 (20 nm).

Sincerely, *A*l*M*ost*D*ead


----------



## W1zzard (May 10, 2013)

Casecutter said:


> Highest single reading during the test



that's for the "peak" graph. "average" represents the average you are looking for. all other sites that i know use a single reading for their power consumption measurements and dont disclose details, some even list full system power

even today, crysis 2 is still a great choice for power consumption testing


----------



## 15th Warlock (May 10, 2013)

W1zzard said:


> that's for the "peak" graph. "average" represents the average you are looking for. all other sites that i know use a single reading for their power consumption measurements and dont disclose details, some even list full system power
> 
> even today, crysis 2 is still a great choice for power consumption testing



Precisely, no other reviewer tests hardware so thoroughly, under most conceivable scenarios, like you do, and clearly discloses all the environmental factors influencing the results.

And I agree, Crysis 2 maxed out can still stress most hardware configurations out there and make even the fastest system break a sweat; it's as good a test tool as any other game out there.


----------



## d1nky (May 10, 2013)

Recus said:


> Except AMD regularly feeding their fans with fake marketing slides, and lies such as HD 2000 (DDR4), HD 7000 (XRD2), HD 9000 (20 nm).
> 
> Sincerely, *A*l*M*ost*D*ead



really? what is it about a bloody graphics/cpu brand that creates conflicts?! in my life it's girls that create conflicts!


----------



## MxPhenom 216 (May 10, 2013)

Recus said:


> Except AMD regularly feeding their fans with fake marketing slides, and lies such as HD 2000 (DDR4), HD 7000 (XRD2), HD 9000 (20 nm).
> 
> Sincerely, *A*l*M*ost*D*ead



  :shadedshu


----------



## RejZoR (May 10, 2013)

W1zzard said:


> that's for the "peak" graph. "average" represents the average you are looking for. all other sites that i know use a single reading for their power consumption measurements and dont disclose details, some even list full system power
> 
> even today, crysis 2 is still a great choice for power consumption testing



Also CS:GO and Trackmania (even in the track editor). The only two games that make my otherwise silent graphics card spin its fans like crazy. So high, in fact, that I had to create my own fan curve and sacrifice some thermals in order to keep it quiet along with the rest of the system.


----------



## xorbe (May 10, 2013)

RejZoR said:


> Also CS:GO and Trackmania Unlimited (even in track editor). The only two games that make my otherwise silent graphic card spin its fans like crazy. So crazy high that i had to create my own fan curve and sacrifice some thermals in order to keep it quiet along with the rest of the system.



Are your fps readings in the stratosphere?  You might try a frame rate limiter.


----------



## TheoneandonlyMrK (May 10, 2013)

Recus said:


> Except AMD regularly feeding their fans with fake marketing slides, and lies such as HD 2000 (DDR4), HD 7000 (XRD2), HD 9000 (20 nm).
> 
> Sincerely, *A*l*M*ost*D*ead



Useless post there, dude, and largely balls too. XDR2 for the 7### series was a rumour, and the 8### series isn't fully out yet, so suggesting the 9### series isn't going to 20 nm is jumping the gun in the extreme, as it probably will be 20 nm, as could a v2 of Sea Islands. Lame AMD bashing, try harder.
I think it's quite clear to most that the pic is of an APU, so if this were VI then AMD would just be making scalable APUs for everything and most situations, and it's too soon for that IMHO. But I'd welcome a mythical GPU like that pic x2, because damn, those things would fold well; even Intel's Phi would look a bit weak on double precision compared to that spec of chip.


----------



## Casecutter (May 10, 2013)

W1zzard said:


> that's for the "peak" graph. "average" represents the average you are looking for


Stand corrected, thank you.

So with Crysis 2 the GHz furnishes 9.8% more performance while requiring 28% more watts than a GTX 680. If several other titles showed that same trend it could be accepted as a veritable result, but one data point is not definitive proof. 

Meanwhile, Titan has a 35% performance increase while using basically the same power as the GHz, so (even with the one data point) Titan appears to have some demonstrable efficiency. Now, can the Titan LE (with 2 SMX fused off) provide something approaching that performance/watt?
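The perf/watt comparison above is just a ratio of ratios; a quick sketch using the percentages quoted in this thread (one data point, as noted, not a full benchmark set):

```python
# Sketch of the performance-per-watt arithmetic above, using the
# percentages quoted in this thread (single data point).

def rel_perf_per_watt(perf_ratio, power_ratio):
    """Efficiency relative to the baseline card (1.0 = equal)."""
    return perf_ratio / power_ratio

# 7970 GHz vs GTX 680: +9.8% performance for +28% power
ghz_vs_680 = rel_perf_per_watt(1.098, 1.28)    # ~0.86, i.e. ~14% worse perf/watt

# Titan vs 7970 GHz: +35% performance at roughly equal power
titan_vs_ghz = rel_perf_per_watt(1.35, 1.00)   # 1.35, i.e. 35% better perf/watt

print(round(ghz_vs_680, 2), round(titan_vs_ghz, 2))
```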


----------



## cdawall (May 11, 2013)

Recus said:


> Except AMD regularly feeding their fans with fake marketing slides, and lies such as HD 2000 (DDR4), HD 7000 (XRD2), HD 9000 (20 nm).
> 
> Sincerely, *A*l*M*ost*D*ead



You might want to tell their stockholders, because they believe something is going well. The stock is sitting at almost $4 a share now, which is almost double from the start of the month.


----------



## HumanSmoke (May 11, 2013)

Casecutter said:


> Yeah, if you read what you posted, [H] say they're using "real gaming" and recorded the *highest* value in each... not an average over the section it took to complete! Sure, a 7970 might peak for a millisecond; is that what they mean by the "highest" value?


Nice hypothesis 
A couple of observations:
1. How do you know that the maximum measurement is a peak of *a millisecond*? And 
2. Isn't it conceivable that the other cards are being measured the exact same way... so in fact the GTX 680's full-load power measurement could also be a transient peak of *a millisecond* duration?


Casecutter said:


> [H] don't tell us the games used now, but we can hopefully figure out the 5 [H] used in that new review....


I was told by another guy, who holds AMD to be the one true god, that they have it on good authority that the 7970 GE was tested with Crysis 2, Metro 2033, BF3 multiplayer, Furmark, and 3DMark, while the GTX 680 was tested with Solitaire, Minesweeper, Tetris, Farmville and The Lost Titans. If true (and I'm assured it is) that could account for the discrepancy. If so, then the world-wide conspiracy against AMD does indeed cover the entire planet!
Xbit (difference of 78W in Metro 2033)
HT4U (difference of 76W - gaming benchmarks)
PCGH (difference of 73W in Battlefield: Bad Company 2)
Hardware.info (difference of 65W in Metro 2033)
Hexus (difference of 61W in Far Cry 3)
PC Perspective (difference of 61W in BF3)
SweClockers (difference of 50W - application not specified)
Lab501 (difference of 47W in Crysis 2)
Hardware Canucks (difference of 45W in Unigine Valley bench)
TechPowerUp (difference of 43W in Crysis 2)
HotHardware (difference of 40W - application not specified)
TechSpot (difference of 38W in Crysis 3)
Hardware France (difference of 42W in Anno 2070 and 36W in BF3)
Bit-tech (difference of 35W in Unigine Heaven bench)
ComputerBase (difference of 35W in AC3)
Anandtech (difference of 24W in BF3)
HardwareLUXX (difference of 13W - application not specified)
Tech Report (difference of 8W - application not specified)

By my count, that takes in Northern, Central, Western, and Eastern Europe, North America, and Australia.


----------



## d1nky (May 11, 2013)

HumanSmoke said:


> I was told by another guy who holds AMD to be the one true god, that they have it on good authority that the 7970GE was tested with Crysis 2, Metro 2033, BF3 multiplayer, Furmark, and 3DMark while the GTX 680 was tested with Solitaire, Minesweeper, Tetris, Farmville and The Lost Titans



its late....... was that sarcasm?!


----------



## Super XP (May 11, 2013)

This new series should be called AMD MONSTER HD 8970. Can't wait to see these babies in action. The HD 8900 series is my next upgrade.


----------



## pjl321 (May 11, 2013)

I will be upgrading from a very old system but want to wait for either Volcanic Islands or Maxwell for a graphics card.

So the question is: what is faster, my Radeon HD 4870 512 MB or the Intel HD 4600 on the 4770K?

Next question: as I am pretty sure Intel will not have managed to compress an entire high-end(ish) video card from 2008 into a few transistors on a CPU, can I still use the OpenCL power of the Haswell chip if I have my old discrete card plugged in? I remember in early Sandy Bridge reviews you could only use QuickSync if you had a monitor plugged into Intel's video outputs.


----------



## Super XP (May 11, 2013)

Personally I think the HD 4870 is the faster card, though I believe it supports up to DX10.1, whereas the Intel supports DirectX 11.1.
Discrete graphics won't choke in games like integrated graphics do. The Intel one does 4K video well, but for games the AMD takes it. Can somebody else advise?


----------



## HTC (May 11, 2013)

Super XP said:


> This new series should be called AMD MONSTER HD 8970. Can't wait to see these baby's in action. *The HD 8900 Series is my next upgrade.*



That's some blind faith you have, dude!

Wouldn't it be more sensible to see some reviews BEFORE making that decision?


----------



## Recus (May 11, 2013)

cdawall said:


> You might want to tell their stock holder because they believe something is going well. Sitting at almost $4 a share now which is almost double from the start of the month.



Because of this http://blogs.barrons.com/techtraderdaily/2013/05/01/amd-spikes-13-on-heavy-volume/.


----------



## TheoneandonlyMrK (May 11, 2013)

Super XP said:


> Personally I think the HD 4870 is the faster card, though I believe it supports upto DX10.1 where as the Intel supports Direct X 11.1.
> Discrete graphics won't choke in games like integrated. The Intel one does 4K video well but for games the AMD takes it. Can somebody else advise



You have it right there, mate.
I use a hybrid PhysX card that needs to have a monitor attached; btw, I just use the second input on a single monitor. I can't see them both, but I don't actually use the second input, and it still enables the features I want. The pjl321 dude could surely use this same tactic on his Intel graphics output. 

One day this phone will pay for all these mess-ups.

Odd, double post; I edited, sorry.


----------



## W1zzard (May 11, 2013)

HumanSmoke said:


> How do you know that the maximum measurement is the peak of a millisecond



nobody tests graphics card power consumption with millisecond resolution. 

most editors use cheap killawatts that take a reading every 1-2 seconds. some even slower/people just look and memorize the highest number. 

a handful of sites use proper measuring devices. i'm running at 12 samples per second, which in my opinion is a good compromise between accuracy and speed.

it could be interesting to look at power consumption with sub-microsecond resolution to observe the effects of power limiting systems, but spending a few k just for that doesn't seem to be worth it.
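The point about sample rate can be illustrated with a toy power trace: a fast logger (say 12 samples/s) catches a short spike that a meter showing one instantaneous reading every couple of seconds sleeps right through. A sketch with made-up numbers, not real measurements:

```python
# Hypothetical sketch: why sample rate matters for the "peak" figure.
# One minute of made-up draw at 12 samples/s: steady ~240 W with one
# brief ~0.25 s spike to 290 W.

trace = [240.0] * 720
trace[300:303] = [290.0, 290.0, 290.0]   # the transient

# fast logger: sees every sample
fast_avg = sum(trace) / len(trace)
fast_peak = max(trace)

# slow meter: one instantaneous reading every 2 s (every 24th sample)
slow_readings = trace[::24]
slow_peak = max(slow_readings)

# the spike barely moves the average, but the slow meter's "peak"
# misses it entirely
print(fast_peak, slow_peak, round(fast_avg, 1))
```

This is also why an average over the whole run is far less sensitive to how the meter samples than a single "highest reading" is.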


----------



## xorbe (May 14, 2013)

W1zzard said:


> nobody tests graphics card power consumption with millisecond resolution.
> 
> most editors use cheap killawatts that take a reading every 1-2 seconds.



Right, but that number that's updated every 1-2 seconds ... was it just an "instantaneous" (perhaps "1 ms" effectively) reading, or a true average of the last 1-2 seconds?


----------

