# AMD Dual-GPU Radeon HD 7990 to Launch in Q1 2012, Packs 6 GB Memory



## btarunr (Dec 23, 2011)

Even 12 months ago, an Intel Nehalem-powered gaming PC with 6 GB of system memory was considered high-end. Now there's already talk of a graphics card taking shape that has that much memory. On Thursday this week, AMD launched its Radeon HD 7970 graphics card, which features its newest 28 nm "Tahiti" GPU and 3 GB of GDDR5 memory across a 384-bit wide memory interface. All along, AMD had plans to make a dual-GPU graphics card that uses two of these GPUs to give you a CrossFire-on-a-stick solution. AMD codenamed this product "New Zealand". We are now learning that codename "New Zealand" will carry the intuitive-sounding market name Radeon HD 7990, and that it is headed for a Q1 2012 launch.

This means that the Radeon HD 7990 should arrive before April 2012. Tests show that Tahiti has superior energy efficiency compared to the previous-generation "Cayman" GPU, even as it increases performance. From a technical standpoint, a graphics card featuring two of these Tahiti GPUs, running with specifications matching those of the single-GPU HD 7970, looks workable. Hence, there is talk of 6 GB of total graphics memory (3 GB per GPU).

One can also expect the fruition of AMD's new ZeroCore technology. This technology powers down the GPU to zero draw when the monitor is blanked (idling); in CrossFire setups, it completely powers down every GPU other than the primary one to the zero state when the system is not running graphics-heavy applications. This means that the idle, desktop, and Blu-ray playback power draw of the HD 7990 will be nearly equal to that of the HD 7970, which is already impressive.
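The ZeroCore behaviour described above can be sketched as a simple state decision. State names and logic here are illustrative only, not AMD's actual driver implementation:

```python
def gpu_power_state(is_primary: bool, display_blanked: bool, heavy_3d_load: bool) -> str:
    """Pick a power state per the ZeroCore description: secondary GPUs drop to
    near-zero whenever no graphics-heavy work is running; the primary GPU only
    does so when the display is blanked (long idle)."""
    if heavy_3d_load:
        return "active"
    if not is_primary:
        return "zerocore"  # CrossFire partner GPUs power down fully
    return "zerocore" if display_blanked else "idle"

# Dual-GPU card sitting on the desktop: one GPU idles, the other is effectively
# off, which is why the HD 7990's idle draw should track the HD 7970's.
print(gpu_power_state(True, False, False), gpu_power_state(False, False, False))
```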





*View at TechPowerUp Main Site*


----------



## JrRacinFan (Dec 23, 2011)

Just wondering something, why 6GB? I mean, it's nice but what right now would utilize it?


----------



## Esse (Dec 23, 2011)

There's my little country


----------



## de.das.dude (Dec 23, 2011)

Probably good at 4GB. 6GB looks like overkill


----------



## btarunr (Dec 23, 2011)

In multi-GPU setups, applications only see the memory of a single GPU, which is then mirrored to the memory of partner GPUs.
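That mirroring is why the "6 GB" headline number overstates what software can address. A minimal sketch of the arithmetic (illustrative only, not driver code):

```python
def usable_vram_mb(per_gpu_mb: int, num_gpus: int) -> dict:
    """In AFR-style CrossFire each GPU keeps a full copy of the frame
    resources, so the application-visible pool is one GPU's memory,
    not the sum printed on the box."""
    return {
        "advertised_mb": per_gpu_mb * num_gpus,  # the marketing total
        "usable_mb": per_gpu_mb,                 # mirrored across partner GPUs
    }

print(usable_vram_mb(3072, 2))  # HD 7990: 6144 MB advertised, 3072 MB usable
```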


----------



## pantherx12 (Dec 23, 2011)

JrRacinFan said:


> Just wondering something, why 6GB? I mean, it's nice but what right now would utilize it?



Because it's a dual GPU, so in reality it's only 3gb usable.

And eyefinity with several 4k monitors would use up 3gb easily : ]


----------



## Sasqui (Dec 23, 2011)

Memory availability sounds almost like RAID 0

The power savings are brilliant (obvious but brilliant); like the later CPUs, cores not used are cores that don't eat up much/any power. Nom Nom Nom Nom...


----------



## JrRacinFan (Dec 23, 2011)

pantherx12 said:


> Because it's a dual GPU, so in reality it's only 3gb usable.
> 
> And eyefinity with several 4k monitors would use up 3gb easily : ]



Nah, I know that. What I am saying is, with all the not-so-grand ports we are getting as PC gamers, and not many games really utilizing over 1.5GB, I don't see the use.


----------



## pantherx12 (Dec 23, 2011)

JrRacinFan said:


> Nah, I know that. What I am saying is, with all the not-so-grand ports we are getting as PC gamers, and not many games really utilizing over 1.5GB, I don't see the use.



I dunno, quite a few can use 1.5 if given the chance : ]

and that's just at HD.

I get slowdowns when I turn round in Skyrim because it uses more than my 1GB.

and I only play @ 1680x1050.

Hate to use this line, but it's an enthusiast card; the more over the top the better 


Also, as GPGPU becomes more popular, having a lot of RAM on your card could become very useful.


----------



## TheMailMan78 (Dec 23, 2011)

btarunr said:


> Even 12 months ago, an Intel Nehalem-powered gaming PC with 6 GB of system memory was considered high-end. Now there's already talk of a graphics card taking shape that has that much memory. On Thursday this week, AMD launched its Radeon HD 7970 graphics card, which features its newest 28 nm "Tahiti" GPU and 3 GB of GDDR5 memory across a 384-bit wide memory interface. All along, AMD had plans to make a dual-GPU graphics card that uses two of these GPUs to give you a CrossFire-on-a-stick solution. AMD codenamed this product "New Zealand". We are now learning that codename "New Zealand" will carry the intuitive-sounding market name Radeon HD 7990, and that it is headed for a Q1 2012 launch.
> 
> This means that the Radeon HD 7990 should arrive before April 2012. Tests show that Tahiti has superior energy efficiency compared to the previous-generation "Cayman" GPU, even as it increases performance. From a technical standpoint, a graphics card featuring two of these Tahiti GPUs, running with specifications matching those of the single-GPU HD 7970, looks workable. Hence, there is talk of 6 GB of total graphics memory (3 GB per GPU).
> 
> ...



That thing will cost more than a supercharger on my 4.6 Mustang. I'll pass.


----------



## Trackr (Dec 23, 2011)

The question here is not why AMD chose 6GB. It's why AMD didn't choose 4GB like the HD 6990.

The reason is, the HD 6970 has a 256-bit memory bus. This allows for 2048MB.

The GTX 580 has a memory bus of 384-bit. This allows for 1536MB.

So AMD had to choose between 1.5GB per GPU or 3GB per GPU.

And I think 6GB is great.

Besides, they will have to make the price competitive to stay above the HD 6990, so the extra RAM likely won't mean a higher price.
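Trackr's bus-width point can be checked with a little arithmetic: GDDR5 chips use 32-bit interfaces, so capacity comes in steps of whole chips. A sketch (the chip densities are the common ones of the era, not a spec):

```python
def capacity_mb(bus_width_bits: int, chip_gbit: int) -> int:
    """Capacity = number of 32-bit GDDR5 chips x density per chip."""
    chips = bus_width_bits // 32           # one chip per 32-bit channel
    return chips * chip_gbit * 1024 // 8   # Gbit -> MB per chip

print(capacity_mb(384, 1))  # 1536 MB, e.g. GTX 580
print(capacity_mb(256, 2))  # 2048 MB, e.g. HD 6970
print(capacity_mb(384, 2))  # 3072 MB, e.g. HD 7970
```

So a 384-bit Tahiti lands on 1.5 GB or 3 GB per GPU, never 2 GB, which is why the dual card totals 6 GB.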


----------



## bogie (Dec 23, 2011)

*Green Goblin?*

All quiet on the Green front! 

lol


----------



## buggalugs (Dec 23, 2011)

If idle power savings are really 75%, it's a game changer.


----------



## RejZoR (Dec 23, 2011)

People are complaining over the 6GB thing. Please read the news again. It doesn't say anywhere that the cards have 6GB of VRAM. It just says that a system with 6GB of RAM (regular system RAM, that is!) was considered high end until recently.
However, it does say that it comes with 3GB of GDDR5 VRAM.


----------



## ensabrenoir (Dec 23, 2011)

JrRacinFan said:


> Just wondering something, why 6GB? I mean, it's nice but what right now would utilize it?


90% of our rigs aren't fully utilized/overkill.... This will fit right in. We all need to push for better software though


----------



## TheMailMan78 (Dec 23, 2011)

JrRacinFan said:


> Just wondering something, why 6GB? I mean, it's nice but what right now would utilize it?



Run Battlefield 3 or F1 maxed out on three 30" monitors and tell me the same thing.


----------



## specks (Dec 23, 2011)

Cool! Straight to the big guns


----------



## Platibus (Dec 23, 2011)

rejzor said:


> People are complaining over the 6GB thing. Please read the news again. It doesn't say anywhere that the cards have 6GB of VRAM. It just says that a system with 6GB of RAM (regular system RAM, that is!) was considered high end until recently.
> However, it does say that it comes with 3GB of GDDR5 VRAM.


↓


> ...From a technical standpoint, a graphics card featuring two of these Tahiti GPUs, running with specifications matching those of the single-GPU HD 7970, looks workable. Hence, there is talk of 6 GB of total graphics memory (3 GB per GPU).


----------



## bear jesus (Dec 23, 2011)

At 5040x1050 many games use between 1.5GB and 1.8GB; the highest I have seen is 2 GB in Aliens vs Predator, but my 6970 runs out of GPU power, so I think at a res higher than 5040x1050 with a 7990 the 3 GB per GPU could really be put to use.

I look forward to seeing how it performs at Eyefinity resolutions.


----------



## ironwolf (Dec 23, 2011)

Wish I had about ten kidneys to sell to afford one of these, LOL.


----------



## mtosev (Dec 23, 2011)

6GB, OMG. This will be a monster of a card. AMD isn't joking around


----------



## Fx (Dec 23, 2011)

Does the 6990 suffer from micro stutter like CrossFire does?

If not, then I think I am going to start saving for this bad boy.


----------



## claylomax (Dec 23, 2011)

JrRacinFan said:


> not many games really utilizing over 1.5GB



Metro 2033, Stalker CoP, Crysis 2, Cryostasis, Battlefield 3, etc ... they max out 1.5GB playing at 1900 x 1200 with 4xAA on my system.


----------



## radrok (Dec 23, 2011)

TheMailMan78 said:


> Run Battlefield 3 or F1 maxed out on three 30" monitors and tell me the same thing.



I'm going to confirm this...
The HD 6990 CFX's 2GB doesn't work like 8GB. It's always 2GB, even with 4 GPUs, right? Well, anyway, it can't cope with BF3's VRAM needs; I had to remove AA samples.


----------



## Velvet Wafer (Dec 23, 2011)

Finally, 8192x shadows in Skyrim are possible without major lag!


----------



## phanbuey (Dec 23, 2011)

Nice... Kepler needs to show up and push these ridonkeylous prices.


----------



## dieterd (Dec 23, 2011)

Eh, the $1000 mark was cracked like nothing, and I don't think Nvidia will be the one to compete on pricing. But I see none of you really care about prices, so Merry X-mas. I hope we all get only cash under the X-mas tree, because the 28 nm generation has started with a lot of appetite for it!


----------



## the54thvoid (Dec 23, 2011)

TheMailMan78 said:


> That thing will cost more than a supercharger on my 4.6 Mustang. I'll pass.





Trackr said:


> Besides, they will have to make the price competitive to stay above the HD 6990, so the extra RAM likely won't mean a higher price.



Yeah, price.  If the 7970 is going to be £450, the dual card will be *prohibitively* expensive.  I simply cannot justify buying a card that costs £700 plus (which surely it will be?) even though I've bought a £480 cpu.



claylomax said:


> Metro 2033, Stalker CoP, Crysis 2, Cryostasis, Battlefield 3, etc ... they max out 1.5GB playing at 1900 x 1200 with 4xAA on my system.



Is that a Vista thing? I run BF3 at max AA at that res and it doesn't go over that.


----------



## TheMailMan78 (Dec 23, 2011)

the54thvoid said:


> Yeah, price.  If the 7970 is going to be £450, the dual card will be *prohibitively* expensive.  I simply cannot justify buying a card that costs £700 plus (which surely it will be?) even though I've bought a £480 cpu.
> 
> 
> 
> Is that a Vista thing? I run BF3 at max AA at that res and it doesn't go over that.



It doesn't go over that because that's all you have.


----------



## the54thvoid (Dec 23, 2011)

TheMailMan78 said:


> It doesn't go over that because that's all you have.



I use Afterburner and see the mem usage on my G19 in game. It doesn't come close to that. The most I've seen in game (Metro 2033, I think) was 1400 or something. I'm aware my gfx card is 1.5GB.

In fact, I'll go check now


----------



## TheMailMan78 (Dec 23, 2011)

the54thvoid said:


> I use Afterburner and see the mem usage on my G19 in game. It doesn't come close to that. The most I've seen in game (Metro 2033, I think) was 1400 or something. I'm aware my gfx card is 1.5GB.
> 
> In fact, I'll go check now



How many displays?


----------



## Crap Daddy (Dec 23, 2011)

BF3, and I think all these games that need lots of VRAM, use all that's available. On my card, BF3 all maxed out shows 1260-something MB used, so if I had more it would use more.


----------



## dorsetknob (Dec 23, 2011)

It's an incestuous relationship between hardware manufacturers and the software houses:
manufacturers bring equipment to the market,
software houses write software that pushes the hardware to the limit,
so manufacturers revise and improve their hardware,
we all upgrade to the new and latest-spec hardware, and so the software houses produce more demanding software to take advantage, etc. On and on the cycle goes.


----------



## the54thvoid (Dec 23, 2011)

Just one. Just checked. I'm getting 1463MB of usage with ultra settings, 4xMSAA and 16x AF. Motion Blur and Ambient Occlusion off. And that was in a firefight.

Would it climb to 1536MB if fully used, or is there some left as a buffer, i.e. is that my card being maxed?


----------



## Mistral (Dec 23, 2011)

I can just imagine some people trying to play with that card on Windows XP 

Though Photoshop will surely love the obscene amount of memory...


----------



## dj-electric (Dec 23, 2011)

I don't like the whole dual-GPU memory amount marketing...
Because it's simply half true. They can market it just because the chips are there,
but people might think they can use up to 6GB of VRAM.


----------



## TheMailMan78 (Dec 23, 2011)

the54thvoid said:


> Just one. Just checked. I'm getting 1463MB of usage with ultra settings, 4xMSAA and 16x AF. Motion Blur and Ambient Occlusion off. And that was in a firefight.
> 
> Would it climb to 1536MB if fully used, or is there some left as a buffer, i.e. is that my card being maxed?


Run three monitors with that 1.5 GB of memory. It's hardly pushing one now, as you just saw. 


Mistral said:


> I can just imagine some people trying to play with that card on Windows XP
> 
> Though Photoshop will surely love the obscene amount of memory...



It will make no difference in Photoshop. Photoshop uses OpenGL. Different ballgame altogether. Most IGPs are all Photoshop needs for its draw.


----------



## the54thvoid (Dec 23, 2011)

TheMailMan78 said:


> Run three monitors with that 1.5 GB of memory. It's hardly pushing one now, as you just saw.



Nvidia, three monitors? Maybe they'll sort that out by Kepler? 

(I know some AIBs have done it...)


----------



## NdMk2o1o (Dec 23, 2011)

Am I right in saying the recent benches showed the 7970 between 20-30% faster than the GTX 580? 
I would expect performance to increase by at least 10% across the board with mature drivers, meaning 30-40% faster on all counts than a GTX 580.
Slap two of these chips together without lowering core clocks, shaders, etc. à la GTX 590, and this dual-GPU card should be a monster and stomp the 590 by a LOT


----------



## radrok (Dec 23, 2011)

It should also run very, very cool compared to the HD 6990; hell, sometimes the top GPU runs at 92C -.-


----------



## Super XP (Dec 23, 2011)

ensabrenoir said:


> 90% of our rigs aren't fully utilized/overkill.... This will fit right in. We all need to push for better software though


Software is greatly holding back hardware. Something needs to change; something in terms of a nice kick in the aris for those software developers to step it up a notch.

This card is nuts


----------



## radrok (Dec 23, 2011)

That's not entirely true; it's because the resolution standard is stuck at 1080p. Once you go over that, you need such graphics cards.


----------



## Dent1 (Dec 23, 2011)

dorsetknob said:


> It's an incestuous relationship between hardware manufacturers and the software houses:
> manufacturers bring equipment to the market,
> software houses write software that pushes the hardware to the limit,
> so manufacturers revise and improve their hardware,
> we all upgrade to the new and latest-spec hardware, and so the software houses produce more demanding software to take advantage, etc. On and on the cycle goes.



I disagree; it's the opposite. It's because software isn't optimised for the hardware, so the hardware is inefficient and hence runs the software with pure brute force. 

If what you said was correct, why can a sub-$100 CPU and sub-$150 GPU run any game today at max settings?


----------



## pantherx12 (Dec 23, 2011)

Velvet Wafer said:


> Finally, 8192x shadows in Skyrim are possible without major lag!



Ha ha, not whilst the shadows are run by the CPU!


----------



## NdMk2o1o (Dec 23, 2011)

Dent1 said:


> If what you said was correct, why can a sub $100 CPU and sub $150 GPU run any game today at max settings?



BF3, Metro 2033, and Skyrim can all be run on max with a sub-$100 CPU and sub-$150 GPU? I think not. There are a few more, though not many, so we are at a transition period where game devs need to start making more games like those mentioned to take advantage of current hardware. Your statement is untrue; the mid range can't max the top games, at least not the few I mentioned and others. There will be another Far Cry/Crysis/Metro 2033 out soon enough, and so it continues.


----------



## Super XP (Dec 23, 2011)

radrok said:


> That's not entirely true; it's because the resolution standard is stuck at 1080p. Once you go over that, you need such graphics cards.


Absolutely not, I got an 8-Core CPU running at 4.40GHz and a HD 6970 and I still cannot run games like Skyrim and Metro 2033 at MAX PQ. 
Like somebody already said, the software needs to catch up to the hardware. Anybody claiming otherwise is gaming too much in their sleep.


----------



## Dent1 (Dec 23, 2011)

NdMk2o1o said:


> BF3, Metro 2033, and Skyrim can all be run on max with a sub-$100 CPU and sub-$150 GPU? I think not. There are a few more, though not many, so we are at a transition period where game devs need to start making more games like those mentioned to take advantage of current hardware. Your statement is untrue; the mid range can't max the top games, at least not the few I mentioned and others. There will be another Far Cry/Crysis/Metro 2033 out soon enough, and so it continues.



Well, yes. I bought my CPU over 2 years ago for £75 and my GPU for £79.99 (actually bought two for CF). Runs all those games at max settings no problem. A friend has the same setup with a larger monitor and still runs maxed out.


----------



## pantherx12 (Dec 23, 2011)

Super XP said:


> Absolutely not, I got an 8-Core CPU running at 4.40GHz and a HD 6970 and I still cannot run games like Skyrim and Metro 2033 at MAX PQ.
> Like somebody already said, the software needs to catch up to the hardware. Anybody claiming otherwise is gaming too much in their sleep.



Should get the new SKSE performance plugin dude, will be able to max out skyrim easily 

( I can max it out and then some and I'm at 4.3 and using a 6870)


----------



## mechtech (Dec 23, 2011)

Esse said:


> There's my little country



You're telling me, the province of Ontario is larger than that land mass.

And I am more interested to see the 7850!!!


----------



## Super XP (Dec 23, 2011)

pantherx12 said:


> Should get the new SKSE performance plugin dude, will be able to max out skyrim easily
> 
> ( I can max it out and then some and I'm at 4.3 and using a 6870)


Where do we get this, bro?


----------



## radrok (Dec 23, 2011)

Super XP said:


> Absolutely not, I got an 8-Core CPU running at 4.40GHz and a HD 6970 and I still cannot run games like Skyrim and Metro 2033 at MAX PQ.
> Like somebody already said, the software needs to catch up to the hardware. Anybody claiming otherwise is gaming too much in their sleep.



Until consoles get a generation refresh, the only way to increase performance is to buy overkill hardware; while I agree with you on optimization, the need for faster hardware is still there, as I said previously.

3DMark 2011, although not a game, is a fine example of how heavy a full DirectX 11 game could be on GPUs; if all developers implemented all the DX11 libraries, we'd need more than dual Tahiti to cope with the need for hardware.
Metro 2033 itself is a GPU hog; sure, some games are CPU-bound, but those are development flaws.

And a Bulldozer is not an 8-core CPU, because it's like saying that mine is a 12-core CPU... doesn't work that way.


----------



## pantherx12 (Dec 23, 2011)

radrok said:


> And a Bulldozer is not an 8-core CPU, because it's like saying that mine is a 12-core CPU... doesn't work that way.



It's closer to being real cores than hyper-threading, though (I call them half cores).


As I have 8 physical threads (8 integer cores).

However, 2 integer cores share the resources (albeit bumped-up ones) of what would normally be 1 core.

So Bulldozer does 8 threads in parallel, whereas a 4-core/8-thread Intel CPU does 4 threads at a time but in a really efficient way. (Think of it as a really well organised, tightly packed queue.)


Sorry for going off-topic in regards to the original post. Will take the conversation to PM if anyone wants to discuss it.
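The module layout pantherx12 describes can be boiled down to a toy resource tally (the numbers are illustrative; the shared front end and FPU are exactly what makes the "8-core" label contentious):

```python
def bulldozer(modules: int) -> dict:
    """One Bulldozer module = 2 integer cores sharing 1 FPU and front end."""
    return {"integer_cores": modules * 2, "fpus": modules, "threads": modules * 2}

def intel_ht(cores: int) -> dict:
    """One Hyper-Threaded core = 1 full core exposing 2 hardware threads."""
    return {"integer_cores": cores, "fpus": cores, "threads": cores * 2}

print(bulldozer(4))   # FX-style "8-core": 8 integer cores, 4 FPUs, 8 threads
print(intel_ht(4))    # 4-core/8-thread:   4 cores,         4 FPUs, 8 threads
```

Both designs run 8 threads; they differ in how many independent integer pipelines back those threads.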


----------



## radrok (Dec 23, 2011)

I know the Bulldozer Modules architectures but thank you anyway for your kind explanation


----------



## Mega-Japan (Dec 23, 2011)

This card makes me want to go take a vacation to New Zealand and chill at Mt Taranaki.

Anyone with me? Anybody? Yes? Maybe? Fine~


----------



## happita (Dec 23, 2011)

Just like the 6xxx and 5xxx series, the 7990 will most probably be 2x 7950's slapped together vs. 2x 7970's (for power and heat reasons). However, I would be very surprised if they just said screw it and made it 2x 7970's, just for the sake of being top dog. But IMO, they don't need to as they don't have any competition yet. We'll have to see what Kepler brings to the table, and I for one can't wait for that to happen. Hurry up Nvidia!!!


----------



## radrok (Dec 23, 2011)

Super XP said:


> Where do we get this, bro?



Here http://forums.bethsoft.com/topic/1321575-rel-tesv-acceleration-layer/ 



happita said:


> Just like the 6xxx and 5xxx series, the 7990 will most probably be 2x 7950's slapped together vs. 2x 7970's (for power and heat reasons). However, I would be very surprised if they just said screw it and made it 2x 7970's, just for the sake of being top dog. But IMO, they don't need to as they don't have any competition yet. We'll have to see what Kepler brings to the table, and I for one can't wait for that to happen. Hurry up Nvidia!!!



Isn't the 5970 a pair of underclocked 5870s? The HD 6990 is an underclocked 6970, not a 6950, because all the shaders are enabled.


----------



## Velvet Wafer (Dec 24, 2011)

pantherx12 said:


> Ha ha, not whilst the shadows are run by the CPU!



That wasn't my problem... CPU load was fine, but Skyrim did consume 99% of the video memory due to that setting


----------



## happita (Dec 24, 2011)

radrok said:


> Isn't the 5970 a pair of underclocked 5870s? The HD 6990 is an underclocked 6970, not a 6950, because all the shaders are enabled.



Oh damn. Thanks for clarifying that for me. I'm not as smart as I thought I was, haha. I don't know why I always thought it was 2x 50's and not 70's.


----------



## chinmi (Dec 24, 2011)

wogh.... can't wait to crossfire this to play bf3 in max resolution @ 1080p... time to ditch my [going to be] obsolete 6990....


----------



## Super XP (Dec 24, 2011)

radrok said:


> And a Bulldozer is not an 8-core CPU, because it's like saying that mine is a 12-core CPU... doesn't work that way.


Bulldozer's 8-core is hardware, whereas Intel's Hyper-Threading is software. People will agree to disagree in regards to this, because not everybody agrees with AMD's concept of an 8-core CPU. Personally, I think it's as much an 8-core as a Phenom II X6 is a 6-core; it's just configured differently. Now, if AMD would like to prove that this new modular design is the way to go, then by all means: we are all waiting for updated performance figures and applications that should take advantage of this new design.


----------



## radrok (Dec 24, 2011)

While I agree that Intel's Hyper-Threading is a fancy way to address 2 threads to a core, and AMD's Bulldozer is a module that has 2 integer units and 1 FPU, I still don't recognize it as 2 cores per module, even though it is very close.
Let's say 1.5 cores per module is more appropriate...
Still, I'd love to see improvements with the upcoming patch, and I'd surely want to see AMD address the enormous latency on the cache, but this is another cup of tea and I'm going off-topic 

Remaining on thread: I seriously hope that AMD will stick to a single-slot solution for the HD 7990, like the HD 6990


----------



## fullinfusion (Dec 24, 2011)

It's a bullshit money grab imo!!!!


----------



## Lionheart (Dec 24, 2011)

fullinfusion said:


> It's a bullshit money grab!!!!
> 
> trust me i been testing this for some time under different cpu's and such.
> 
> ...



Troll:shadedshu


----------



## grndzro7 (Dec 24, 2011)

*The power can be used even in single monitor setups.*

1080p + 8xSSAA & edge detect + EQAA + morphological filtering + ambient lighting + physics + tessellation + 16x quality/trilinear filtering + massive negative LOD.

Yeah, the power can still be used on single monitor setups.

Besides, this card will stay quite relevant for single monitor setups for years. The very low idle power will save the fan from dust/wear.


----------



## 0xe0 (Dec 24, 2011)

JrRacinFan said:


> Just wondering something, why 6GB? I mean, it's nice but what right now would utilize it?



BF3 can


----------



## micropage7 (Dec 24, 2011)

JrRacinFan said:


> Just wondering something, why 6GB? I mean, it's nice but what right now would utilize it?



Coz it's dual, so it may be assumed every single chip has 3 gigs.


----------



## nt300 (Dec 24, 2011)

radrok said:


> I seriously hope that AMD will stick to single slot solution for the HD 7990 like the HD 6990


Really? The HD 6990 is single slot? I thought I saw a dual-slot air cooler. It would be nice to have a single-slot cooler so you can use your PCIe slots instead of having them taken up by dual-slot coolers.


grndzro7 said:


> Morphological filtering


Every time I enable this my games look like crap. Any idea why?


----------



## Prima.Vera (Dec 24, 2011)

You guys are exaggerating too much. I have a Core 2 Quad and a 5870, and I play *ALL* games on the market today with absolutely NO problems or glitches at MAX DETAIL settings (4xAA, 16xAF at 1080p resolution). Stop being so geeky and OVER-pretentious!


----------



## Yukikaze (Dec 24, 2011)

Super XP said:


> Bulldozer's 8-core is hardware, whereas Intel's Hyper-Threading is software.



Erm. 

What? 

That's just false.


----------



## the54thvoid (Dec 24, 2011)

Prima.Vera said:


> You guys are exaggerating too much. I have a Core 2 Quad and a 5870, and I play *ALL* games on the market today with absolutely NO problems or glitches at MAX DETAIL settings (4xAA, 16xAF at 1080p resolution). Stop being so geeky and OVER-pretentious!



Why are people typing such bullshit?

So you are playing at fewer fps than the GTX 480, and this isn't even a graph for maxed-out details (the TPU graphs from W1zzard below are at 1050 res):

*[TPU benchmark graph: Crysis, 1680x1050]*

Again:

*[TPU benchmark graph]*
It'd be nice if folk stopped lying about their systems' capabilities. There is no way your C2Quad and a 5870 are maxing out all gfx details and playing the above games at decent fps. Unless you have a magical PC.


----------



## pantherx12 (Dec 24, 2011)

radrok said:


> I know the Bulldozer Modules architectures but thank you anyway for your kind explanation



And thank you for the polite response 



fullinfusion said:


> It's a bullshit money grab!!!!
> 
> trust me i been testing this for some time under different cpu's and such.
> 
> ...



... Wait, wut!?

You realise a single 7970 is already right up a 6990's arse, right?

And you realise that in games where CrossFire works, the 7970 has 90%+ scaling?

It completely wrecks two 580s in SLI, let alone a 6990.

(Wrecks the Asus Mars II as well.)

Go read the CrossFire reviews, man 

Until Nvidia release their cards, the 7000 series will be the best on the market.


----------



## radrok (Dec 24, 2011)

nt300 said:


> Really? The HD 6990 is single slot? I thought I saw a dual-slot air cooler. It would be nice to have a single-slot cooler so you can use your PCIe slots instead of having them taken up by dual-slot coolers.
> 
> Every time I enable this my games look like crap. Any idea why?



I didn't explain it well: the HD 6990 is single-slot capable thanks to the location of the DVI and mini-DP connectors; once you put a waterblock on it, it becomes single slot  

Morphological AA is a post-process; it is applied after the frame has been rendered.
It doesn't recognize text, so it smooths the text too, making it blurry
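The blurring radrok describes can be shown with a toy post-process filter: it runs on the finished frame and cannot tell a glyph edge from a geometry edge. A 3-tap box blur stands in here for real morphological AA, which is far smarter but shares that limitation:

```python
def box_blur(row):
    """Blur one scanline of pixel intensities (edges clamped)."""
    out = []
    for i in range(len(row)):
        left = row[max(i - 1, 0)]
        right = row[min(i + 1, len(row) - 1)]
        out.append((left + row[i] + right) / 3)
    return out

text_row = [0.0, 0.0, 1.0, 0.0, 0.0]  # crisp one-pixel glyph stroke
print(box_blur(text_row))             # the stroke smears into its neighbours
```

The filter dims the stroke and bleeds it sideways, which on real text reads as blur.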


----------



## Prima.Vera (Dec 25, 2011)

the54thvoid said:


> Why are people typing such bullshit?
> 
> So you are playing at fewer fps than the GTX 480, and this isn't even a graph for maxed-out details (the TPU graphs from W1zzard below are at 1050 res):
> http://tpucdn.com/reviews/AMD/HD_7970/images/crysis_1680_1050.gif
> ...



You obviously fail to understand that any game that outputs more than 30fps is more than playable. I played both Metro and Crysis and didn't have any problem with that fps. What is your point? Btw, I am not a geek gamer, so use your "bullshit" on someone else. :shadedshu


----------



## qubit (Dec 25, 2011)

I'm waiting for the graphics tech that can gang two GPUs together on one card so that they can both use the whole memory buffer and work as a single logical GPU. This will make the whole thing much more efficient:

- The whole memory buffer can be used, so wasting half of it on mirroring won't be necessary
- The two GPUs will work as one logical double-wide GPU
- Microstuttering should be eliminated
- The card will work exactly like a single-GPU card, but with double the performance, and its full performance will always be available without requiring special profiles for specific games like we have now

Obviously, this kind of functionality won't be available with two physical cards, because the GPUs will be too far apart physically and on different boards.

This will be a true innovation, and whichever company implements it first will wipe the floor with the other one. Well, here's hoping it comes to pass.


----------



## pantherx12 (Dec 25, 2011)

qubit said:


> I'm waiting for the graphics tech that can gang two GPUs together on one card so that they can both use the whole memory buffer and work as a single logical GPU. This will make the whole thing much more efficient:
> 
> - The whole memory buffer can be used, so wasting half of it on mirroring won't be necessary
> - The two GPUs will work as one logical double-wide GPU
> ...
> ...




Didn't the older dual-GPU graphics cards do this no problem?

Back when they didn't really even market them as x2, etc.


----------

