# Radeon R9 295X2 Press Deck Leaked



## btarunr (Apr 3, 2014)

Here are some of the key slides from AMD's press deck (reviewer presentation) for the Radeon R9 295X2 dual-GPU graphics card, ahead of its April 8 launch. The slides confirm specifications that surfaced earlier this week: the card bears the codename "Vesuvius," and carries two 28 nm "Hawaii" GPUs with all 2,816 stream processors per chip enabled, alongside 176 TMUs, 64 ROPs, and a 512-bit wide GDDR5 memory interface per GPU. The two chips are wired to a PLX PEX8747 PCI-Express 3.0 48-lane bridge chip. There's a total of 8 GB of memory on board, 4 GB per GPU. Lastly, clock speeds are revealed: the GPUs are clocked as high as 1018 MHz, and the memory at 5.00 GHz (GDDR5-effective). The total memory bandwidth of the card is hence 640 GB/s.
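For what it's worth, the 640 GB/s figure falls straight out of the bus width and data rate quoted on the slides; a quick sketch of that arithmetic (nothing here beyond the numbers above):

```python
# Back-of-the-envelope check of the memory bandwidth claim, using only
# figures from the slides (bus width, effective data rate, GPU count).
BUS_WIDTH_BITS = 512     # per "Hawaii" GPU
EFFECTIVE_GTPS = 5.0     # GDDR5-effective 5.00 GHz data rate
NUM_GPUS = 2

per_gpu_gbs = BUS_WIDTH_BITS / 8 * EFFECTIVE_GTPS  # GB/s per GPU
total_gbs = per_gpu_gbs * NUM_GPUS

print(per_gpu_gbs)  # 320.0
print(total_gbs)    # 640.0
```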

The Radeon R9 295X2 indeed looks like the card which was pictured earlier this week, by members of the ChipHell tech community. It features an air+liquid hybrid cooling solution, much like the ROG ARES II by ASUS. The cooling solution is co-developed by AMD and Asetek. It features a couple of pump-blocks cooling the GPUs, which are plumbed with a common coolant channel running through a single 120 mm radiator+reservoir unit. A 120 mm fan is included. A centrally located fan on the card ventilates heatsinks that cool the VRM, memory, and the PCIe bridge chip. 



 

 

 




The card draws power from two 8-pin PCIe power connectors, and appears to use a 12-phase VRM to condition power. The VRM appears to consist of CPL-made chokes and DirectFETs by International Rectifier. Display outputs include four mini-DisplayPort 1.2 connectors and a dual-link DVI (digital only). The total board power of the card is rated at 500W, and so AMD is obviously over-drawing power from each of the two 8-pin power connectors. You may need a PSU with strong +12V rails to drive them. Looking at these numbers, we'd recommend at least an 800W PSU for a single-card system, ideally with a single +12V rail design. The card is 30.7 cm long, and its coolant tubes shoot out from its top. AMD expects the R9 295X2 to be at least 60 percent faster than the R9 290X in 3DMark FireStrike (Performance).



 

 

 

 



*View at TechPowerUp Main Site*


----------



## Naito (Apr 3, 2014)

Beast of a card! Certainly offers more kit than nVidia's offering


----------



## Xzibit (Apr 3, 2014)

Just noticed. It's called Project Hydra.


----------



## LAN_deRf_HA (Apr 3, 2014)

Why oh why would you try to cheaply imitate the styling of your competitor? This card looks like a Chinese knockoff of a high end Nvidia card. It's so bizarre they'd go that direction, look at the final product, and then just not see the glaring problem there. I mean they even have the glowing side logo.... It's like they decided grey metal and rivets must be part of a new DX standard.


----------



## hanzawhtet7 (Apr 3, 2014)

WOOHOO! AMD is giving its best shot! Looking forward to it.


----------



## jateruy (Apr 3, 2014)

What a fugly shroud and fan, I would much prefer the original R9 styled cooler instead of this cheapy plastic looking LED fan.


----------



## manofthem (Apr 3, 2014)

Sounds like these gpu hybrid coolers are becoming increasingly popular, but I wouldn't want to bother with it. 

This card deserves a full water block.


----------



## Nordic (Apr 3, 2014)

They should sell these cards with no cooler for cheap.


----------



## qu3becker (Apr 3, 2014)

This card reminds me of the XFX recent cards like this one: 
http://www.newegg.ca/Product/Product.aspx?Item=N82E16814150590
As for me, the card looks okay. At least it's not glossy plastic like the 7990.


----------



## Xzibit (Apr 3, 2014)

Looking at the numbers

295x2 - 11.5 TFLOPS
Titan Z = 8 TFLOPS


----------



## HumanSmoke (Apr 3, 2014)

Xzibit said:


> Looking at the numbers
> 
> 295x2 - 11.5 TFLOPS
> Titan Z = 8 TFLOPS


What a joke.
Point #1: the 295X2 figure assumes continuous operation at 1018 MHz (and the actual number would be 11.467*) - how likely is that scenario?
And since you're cherry-picking a bullet point in isolation, how about this one:
Point #2: FP64..... 295X2: 1.433 TFLOPS (AMD artificially limits double precision to 1/8 of single precision with Radeon Hawaii), which is actually ~25% less than the HD 7990, and *54% less than the Titan Z* if the Z's 8 TF FP32 is correct.
* (1018 MHz × 5632 shaders × 2 ops)
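The footnote arithmetic can be reproduced directly; a minimal sketch (the 1/8 FP64 rate for consumer Hawaii is as stated in the post):

```python
# Peak-throughput arithmetic behind the footnote: 2 FMA ops per clock
# x 5632 total stream processors x 1018 MHz, with FP64 at a 1/8 rate.
CLOCK_HZ = 1018e6
STREAM_PROCESSORS = 5632   # 2816 per Hawaii GPU x 2 GPUs
OPS_PER_CLOCK = 2          # a fused multiply-add counts as two FLOPs

fp32_tflops = OPS_PER_CLOCK * STREAM_PROCESSORS * CLOCK_HZ / 1e12
fp64_tflops = fp32_tflops / 8  # consumer Hawaii's double-precision cap

print(round(fp32_tflops, 3))  # 11.467
print(round(fp64_tflops, 3))  # 1.433
```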


----------



## damric (Apr 3, 2014)

I love posting this pic


----------



## RCoon (Apr 3, 2014)

Overdrawing power from two 8-pins is less than stellar. I'm aware most PSUs from reputable brands can handle it, but it's outside spec. Would it really have hurt to bang on a third 8-pin just for safety? It's not like the people who buy this won't have one available.


----------



## Patriot (Apr 3, 2014)

RCoon said:


> Overdrawing power from two 8pins is less than stellar. I'm aware most psu's from reputable brands can handle it, but it's outside specs. Would it really have hurt just banging on a third 8 pin just for safety? It's not like the people who buy this won't have one available.



PCIE 3 slot can give 300w... why mobo needs multiple 8pins.


----------



## seronx (Apr 3, 2014)

Patriot said:


> PCIE 3 slot can give 300w... why mobo needs multiple 8pins.


PCIe 3.0 Electromechanical.
PCIe slot = 75 watts
PCIe 2x4 = 150 watts

75 + 150 + 150 = 375 watts

The PEG cables can be told to go above and beyond the official specification but not the slot.

75W + 300W + 150W or 300W + 300W with no power coming from the slot.
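Putting those connector limits next to the card's 500 W rating shows the size of the overdraw; a rough sketch (assuming the slot stays at its official 75 W allowance, as described above):

```python
# PCIe electromechanical power budget vs. the 295X2's rated board power.
SLOT_W = 75
EIGHT_PIN_W = 150      # official rating per 2x4 (8-pin) connector
BOARD_POWER_W = 500    # rated total board power

official_budget = SLOT_W + 2 * EIGHT_PIN_W  # 375 W for a dual 8-pin card
overdraw = BOARD_POWER_W - official_budget

# If the slot stays at 75 W, each 8-pin must carry the remainder:
per_8pin_w = (BOARD_POWER_W - SLOT_W) / 2
per_8pin_a = per_8pin_w / 12                # current on the +12 V wires

print(official_budget)       # 375
print(overdraw)              # 125
print(per_8pin_w)            # 212.5
print(round(per_8pin_a, 1))  # 17.7
```

Each 8-pin would sit roughly 40% above its 150 W rating at full board power, which is the overdraw the article refers to.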


----------



## RCoon (Apr 3, 2014)

Patriot said:


> PCIE 3 slot can give 300w... why mobo needs multiple 8pins.


 


btarunr said:


> The total board power of the card is rated at 500W, and so AMD is obviously over-drawing power from each of the two 8-pin power connectors


 
Says in the OP it's going to be overdrawing.


----------



## Patriot (Apr 3, 2014)

seronx said:


> PCIe 3.0 Electromechanical.
> PCIe slot = 75 watts
> PCIe 2x4 = 150 watts
> 
> ...



So it seems... looks like 300w is still max for pcie spec... dual 8pin config not official lol ...  Thanks for the correction.


----------



## HumanSmoke (Apr 3, 2014)

seronx said:


> PCIe 3.0 Electromechanical.
> PCIe slot = 75 watts
> PCIe 2x4 = 150 watts
> 
> ...


Well, that sounds fine and dandy.......what about AMD's own range of 900 (990FX/990X/970) series chipset boards? They aren't specced for PCI-E 3.0 operation are they? That being the case then an overdraw past 75W via the slot isn't a given.


----------



## seronx (Apr 3, 2014)

HumanSmoke said:


> Well, that sounds fine and dandy.......what about AMD's own range of 900 (990FX/990X/970) series chipset boards? They aren't specced for PCI-E 3.0 operation are they? That being the case then an overdraw past 75W via the slot isn't a given.


The PCIe Electromechanical specification is forward and backwards compatible.


----------



## HumanSmoke (Apr 3, 2014)

RCoon said:


> Would it really have hurt just banging on a third 8 pin just for safety? It's not like the people who buy this won't have one available.


Other than thumbing its nose at the PCI-SIG, it might have been a tight fit to accommodate a third power plug. According to the slide deck, the card is 305-307mm long. I believe the PCI-SIG limit is 312mm.


----------



## RCoon (Apr 3, 2014)

HumanSmoke said:


> Well, that sounds fine and dandy.......what about AMD's own range of 900 (990FX/990X/970) series chipset boards? They aren't specced for PCI-E 3.0 operation are they? That being the case then an overdraw past 75W via the slot isn't a given.


 
Kaveri/FM2+ does though right?


----------



## HumanSmoke (Apr 3, 2014)

RCoon said:


> Kaveri/FM2+ does though right?


Yup.


----------



## btarunr (Apr 3, 2014)

NextFX CPUs, which could be FM2+ chips with four Steamroller modules, no IGP, but an integrated PCIe gen 3 root complex, should overcome that. Current AMD 7-series APUs support PCIe gen 3.


----------



## radrok (Apr 3, 2014)

Would've loved to see it as a 3x8-pin, just in case someone wants to abuse it for overclocking; 2x8 doesn't leave much room for power.


----------



## adulaamin (Apr 3, 2014)

I'm glad AMD came out with this card, although if I were ever to buy one I'd ditch that hybrid cooler and build a custom loop. I don't like how it looks, but if it performs well and is priced right then it's a win.


----------



## RCoon (Apr 3, 2014)

radrok said:


> abuse it for overclocking, 2x8 doesn't leave much room for power


 
Could be that they don't want to stress the power assembly too much. I imagine most would want to slap a custom block on this and overclock, but it depends on how strong the VRMs are. Similar to the 780/Titan issue of the VRMs being a little less than ideal for 1300 MHz+ OCs on the reference cards. Granted it was achievable, but too much power through those puppies and they were liable to pop. Not so sure how a dual-chip VRM assembly would deal with the intense TDP, stress, and overall heat. It wouldn't surprise me to see a voltage limit on this.


----------



## Assimilator (Apr 3, 2014)

LAN_deRf_HA said:


> Why oh why would you try to cheaply imitate the styling of your competitor? This card looks like a Chinese knockoff of a high end Nvidia card.



Agreed, it looks like they tried to copy the "premium" styling of the GTX 690/Titan/Titan Z, but the end result ended up looking like a cheap knockoff.


----------



## HumanSmoke (Apr 3, 2014)

adulaamin said:


> I'm glad AMD came out with this card although if I were ever to buy one I'd ditch that hybrid cooler and build a custom loop. I don't like how it looks like but if it performs well and is priced right then it's a win.


Well, unless you have a scarcity of PCIe x16 slots, chances are you'd be much better off with a couple of 290's or 290X's if the $1500 price tag is to be believed. I'm pretty sure you could grab two cards, full-cover blocks, fittings, and a CrossFire bridge if required for less than the price of the 295X2. They will overclock better, and hooking them up to a triple rad gives the option of using less aggressive (noisy) fan profiles.


----------



## RCoon (Apr 3, 2014)

HumanSmoke said:


> Well, unless you have a scarcity of PCIEx16 slots, chances are you'd be much better off with a couple of 290's or 290X's if the $1500 price tag is to believed. I'm pretty sure you could grab two cards, full cover blocks, fittings, and a Crossfire bridge if required for less than the price of the 295X. They will overclock better, and hooking them up to a triple rad gives the option of using less aggressive (noisy) fan profiles.


 
If the 7990 is anything to go by, two single cards appear to be better than one dual-GPU card.


----------



## HumanSmoke (Apr 3, 2014)

RCoon said:


> If the 7990 is anything to go by, two single cards appears to be better than dual GPU single cards.


That's probably a reasonable generalization for most dual-GPU cards.
The 7990 (at initial pricing), this card, and the Titan Z aren't even attempting to achieve price/performance parity with a couple of single cards. At least previous duallies made an effort not to charge a premium over two single-GPU boards. Add in the fact that a duallie is generally either lower-performing or louder/hotter (or both) than two single cards, and I can't really see the attraction for a gaming workload.


----------



## buildzoid (Apr 3, 2014)

RCoon said:


> Could be that they don't want to stress the power assembly too much. I imagine most would want to slap a custom block on this and overclock, but it depends on how strong the VRM's are. Similar to the 780/Titan issue of the VRM's being a little less than ideal for 1300+ OC's on the reference cards. Granted it was achievable, but too much power through those puppies and they were liable to pop. Not so sure how a dual chip VRM assembly would deal with the intense TDP, stress and overall heat. It wouldn't surprise me to see a voltage limit on this.


The VRM on this seems to be a doubled-up, more compact version of the R9 290X VRM, so the GPUs can easily be fed 375 A each, for a 750 A total current allowance. Just don't expect the PCIe 8-pins (30-40 A) to be able to carry that much power (750 A @ 1.5 V = 1125 W = ~94 A @ 12 V), so if you plan to use all of the VRM's capability you should solder on one or two more 6-pin connectors or risk burning something. So VRM-wise you're good, but the dual 8-pins are insufficient. Most quad R9 290X OC attempts I've seen used two 1600 W PSUs, so this card OCed plus an OCed Intel hexa-core/AMD octa-core will need a 1200 W+ PSU.
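The conversion between core-side current and 12 V-side draw above is simple power bookkeeping; a sketch, taking the post's 750 A / 1.5 V figures at face value and assuming lossless conversion:

```python
# Core-side VRM capability expressed as an equivalent +12 V input draw
# (idealized: no conversion losses in the VRM).
VRM_CURRENT_A = 750    # claimed combined allowance for both GPUs
CORE_VOLTAGE_V = 1.5
INPUT_VOLTAGE_V = 12.0

core_watts = VRM_CURRENT_A * CORE_VOLTAGE_V  # what the VRMs could deliver
input_amps = core_watts / INPUT_VOLTAGE_V    # what the 12 V inputs must supply

print(core_watts)            # 1125.0
print(round(input_amps, 2))  # 93.75
```

At 30-40 A per 8-pin, two connectors top out well short of that ~94 A, which is the crux of the point.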


----------



## radrok (Apr 3, 2014)

RCoon said:


> Could be that they don't want to stress the power assembly too much. I imagine most would want to slap a custom block on this and overclock, but it depends on how strong the VRM's are. Similar to the 780/Titan issue of the VRM's being a little less than ideal for 1300+ OC's on the reference cards. Granted it was achievable, but too much power through those puppies and they were liable to pop. Not so sure how a dual chip VRM assembly would deal with the intense TDP, stress and overall heat. It wouldn't surprise me to see a voltage limit on this.



I remember I could overclock the 6990 up to 1.35 V on both cores with Trixx; that yielded me 1100 MHz, which was insane on that card. ATI has always had beefy PCB components - they actually care!


----------



## Recus (Apr 3, 2014)

Xzibit said:


> Looking at the numbers
> 
> 295x2 - 11.5 TFLOPS
> Titan Z = 8 TFLOPS



I wonder where those numbers go?

AMD launches FirePro W9100
NVIDIA Tesla Powers HIV Research Breakthrough
Did Nvidia Just Demo SkyNet on GTC 2014? – Neural Net Based “Machine Learning” Intelligence Explored


----------



## Serpent of Darkness (Apr 3, 2014)

HumanSmoke said:


> Point #2 : FP64.....295X2 : 1.433 TFlops (AMD artificially limits double precision to 1/8 of single precision with Radeon Hawaii) -and which is actually ~25% less than the HD 7990) and* 54% less than Titan Z* if the Z's 8TF FP32 is correct
> * (1018 * 5632 * 2)



About Point #2:
Hagedoorn, Hilbert, AMD Unveils FirePro W9100, Guru3D.com, 3/27/2014.
http://www.guru3d.com/news_story/amd_unveils_firepro_w9100.html

GTX Titan-Z SPP = 8.06 TFLOPs. = 700 MHz per GPU = 2x 6GB VRam = $3,000.00 per unit.
R9-295x SPP = 11.466 TFLOPs. = 1018 MHz per GPU = 2x 4GB VRam = $1,500.00 per unit.

64bit floating point precision comparison:
GTX Titan-Z 64FPP = 2x 1.3468 TFLOPs at stock.
K40 Tesla 64FPP = > 1.4 TFLOPs. = 12 GB VRam = $5,400.00
FirePro W9100 64FPP = 2.56 TFLOPs. = 16 GB VRam = $3,000.00 to $4,500.00
R9-295x 64FPP = 2x 0.716672 TFLOPs.
R9-290x 64FPP = 0.704 TFLOPs.
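The Radeon FP64 rows above can be reproduced from clock × shader count × 2 ops, with the 1/8 FP64 rate for consumer Hawaii taken as the working assumption:

```python
# Reproducing the Radeon FP64 figures above (consumer Hawaii runs FP64
# at 1/8 of the FP32 rate).
def fp64_tflops(clock_mhz, shaders, fp64_rate=1 / 8):
    return clock_mhz * 1e6 * shaders * 2 * fp64_rate / 1e12

r9_290x = fp64_tflops(1000, 2816)           # single Hawaii at 1000 MHz
r9_295x2_per_gpu = fp64_tflops(1018, 2816)  # per GPU on the 295X2

print(round(r9_290x, 6))           # 0.704
print(round(r9_295x2_per_gpu, 6))  # 0.716672
```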


----------



## Sasqui (Apr 3, 2014)

RCoon said:


> If the 7990 is anything to go by, two single cards appears to be better than dual GPU single cards.



What do you want to bet W1z is testing the 290x2 under NDA as we poke around in this thread 

Here's his 290x CF review perf chart (scaling isn't great, but then again who knows what new drivers have done for it):


----------



## Hilux SSRG (Apr 3, 2014)

Sasqui said:


> What do you want to bet W1z is testing the 290x2 under NDA as we poke around in this thread


 
We can only hope!  Can't wait to see the eventual comparos to the TitanZ.


----------



## Casecutter (Apr 3, 2014)

It’s disappointing they used pretty much standard pump/blocks that aren't even independent (the first one heats the water and sends it on to the next). I really had hoped AMD had stepped up their game: engineering something more akin to a vapor chamber that wicks heat quickly from the chip and spreads it over a larger area, so the water block doesn't have such a localized hot spot. Then a single pump, more integrated into the radiator/fan module, with the lines split to send cool water to each block independently. Done right, I think you could get exceptionally good heat transfer from smaller, less restrictive blocks, needing just one pump and perhaps better juggling the radiator/fan requirement, somewhat…

While the overall outward aesthetics are a little blah, and mimic the Nvidia style, I think AMD/Asetek should have taken it to another level of refinement rather than just pulling from what the existing ASUS ROG ARES II used. ASUS, if I remember, made a limited run of 1,000 ARES II units, selling them at $1,500. AMD is probably looking at a minimum of 5x that volume, and yet they can't price it any better.
Not my cup of tea.


----------



## RCoon (Apr 3, 2014)

Sasqui said:


> What do you want to bet W1z is testing the 290x2 under NDA as we poke around in this thread
> 
> Here's his 290x CF review perf chart (scaling isn't great, but then again who knows what new drivers have done for it):



Woah, I didn't realise CrossFire scaling on the 290X was really that bad... I expected a little more than 31% overall...


----------



## Brusfantomet (Apr 3, 2014)

RCoon said:


> Woah, I didn't realise crossfire scaling on the 290X was really that bad... I expected a litte more than 31% overall...



Considering that's averaged over 1600x900, 1920x1080, and 2560x1600, it's not that bad; the scaling goes up a little at 2560x1600, to 47%. Guessing it's even better at 4K.


----------



## RCoon (Apr 3, 2014)

Hilux SSRG said:


> Can't wait to see the eventual comparos to the TitanZ.



I think, and I really want to believe, the 295X2 will beat the Titan Z. The Titan Z will be voltage-locked, and it will be bottlenecked by its air cooler. The AMD card, however, will have a much higher thermal tolerance before throttling (I hope), and will possibly have a higher voltage limit.

All that being said, the Titan Z *is not designed for gaming*, for the 50th time; it's a cheap DP-compute card for home servers. This needs to be stapled to every forum header, title, post, thread, the whole works, until everyone on the internet understands that.


----------



## xorbe (Apr 3, 2014)

RCoon said:


> Woah, I didn't realise crossfire scaling on the 290X was really that bad... I expected a litte more than 31% overall...



Yeah, it needs more pixels to push; it seems CPU-bottlenecked at lower res.


----------



## Suka (Apr 3, 2014)

RCoon said:


> I think, and I really want to believe, the 295X2 will beat the Titan Z. The Titan Z will be voltage locked, and it will be bottlenecked by the air cooler. The AMD however will have a much higher thermal tolerance before throttling (i hope), and will possibly have a higher voltage perimeter.
> 
> All that being said. Titan Z *is not designed for gaming*, for the 50th time, it's a cheap dpcompute card for home servers. This needs to be stapled to every forum header, title, post, thread, the whole works, until everyone on the internet understands that.


Why do you say so? Nvidia themselves say it's for gaming.


----------



## RCoon (Apr 3, 2014)

Suka said:


> Why do you say so? Nvidia themselves says its for gaming.



Watch the press release


----------



## HumanSmoke (Apr 3, 2014)

Suka said:


> Why do you say so? Nvidia themselves says its for gaming.


They also target the professional markets. In fact, the Titan Z was announced at a technology conference, not a gaming event, and the CEO specifically mentioned the professional aspects of the board in his launch introduction.









So if you had a product that was proficient in more than one market segment, why wouldn't you market the products for those markets ?

AMD pushes cryptocurrencies on its gaming blog, are we to assume that Radeons are mining cards only?


> Dedicating more hardware to mining helps increase the likelihood that you can be first to verify a transaction and receive your own coins as a reward. This is why a great GPU, like an AMD Radeon™ R9 Series product, is so important.


Of course not. You market wherever you have an opportunity to sell.


----------



## SIGSEGV (Apr 3, 2014)

RCoon said:


> All that being said. Titan Z *is not designed for gaming*, for the 50th time, it's a cheap dpcompute card for home servers. This needs to be stapled to every forum header, title, post, thread, the whole works, until everyone on the internet understands that.



LOL, I found it very funny...
How much salary do you get from Nvidia?


----------



## HumanSmoke (Apr 4, 2014)

SIGSEGV said:


> LOL, i found it very funny..
> how much salary do you get from nvidia?


LOL I found this very funny...
If Nvidia designed the Titan Z for gaming why would they pay RCoon to say otherwise?

_Logical thought process escapes random forum poster_ - No headline news ever.

*:SMH:*


----------



## Xzibit (Apr 4, 2014)

The 295X2 just has to have better price/performance than the Titan Z/790 = win.

We should know more when the 295X2 launches on the 8th.

Anyone know when the Titan Z will be launched?


----------



## sweet (Apr 4, 2014)

RCoon said:


> I think, and I really want to believe, the 295X2 will beat the Titan Z. The Titan Z will be voltage locked, and it will be bottlenecked by the air cooler. The AMD however will have a much higher thermal tolerance before throttling (i hope), and will possibly have a higher voltage perimeter.
> 
> All that being said. Titan Z *is not designed for gaming*, for the 50th time, it's a cheap dpcompute card for home servers. This needs to be stapled to every forum header, title, post, thread, the whole works, until everyone on the internet understands that.




If you are right about Titan Z, its Geforce tag is just a cunning marketing trick then. Crap for nVidia.


----------



## HumanSmoke (Apr 4, 2014)

Xzibit said:


> 295x2 just has to a better Price/Performance then Titan Z/790 = Win


Well, that just sounds stupid.
Buying high end performance isn't about AMD vs Nvidia, it's about performance, and maybe performance per dollar.
People aren't going to buy the 295X2 because it's "better" than another overpriced card that few would actually buy for gaming. What they are going to compare it to is a comparable solution.

R9 295X2 = $1500

2 x MSI 290X Gaming = $1180

and if you already have an interest in proper watercooling- as a prospective buyer of watercooled $1500 graphics cards should be

2 x PowerColor LCS 290X's = $1400

or if you're not lazy, you can save yourself $46 by doing it yourself

2 x PowerColor PCS+ 290X's = $1140 + 2 EKWB FC blocks $214...Total: $1354

Better overclocking than the 295X2. Better power-load distribution (4 PCI-E inputs versus 2). Fewer LEDs. Lower price.

The reality is that people go where the performance is at this level of expenditure - and performance (and price) resides with combinations of single-GPU boards, not some PR grab for the halo which might (or might not) give midrange card buyers a chubby.

You really sound like someone that has never bought enthusiast grade hardware.


----------



## Xzibit (Apr 4, 2014)

Since you're reverting back to a single-GPU-card argument in a dual-GPU comparison, does that mean you've given up hope already? At least wait for the reviews.

It just has to win the dual-GPU card war, and since Nvidia is selling the Titan Z as a gaming/CUDA-compute card, it will compete with it on a dual-GPU card basis. No matter how much you and others try to defend it or move the goalposts in your heads.

Does anyone hear that? It's HumanSmoke in another AMD thread.


----------



## HumanSmoke (Apr 4, 2014)

Xzibit said:


> Since your reverting back to single gpu card argument in a dual gpu comparison does that mean you've given up hope already


Yes. I think the 295X2 is a waste of time for the enthusiast


Xzibit said:


> At least wait for the reviews.


Why? Even AMD's own propaganda pegs the 295X2 at *60% better than a single card at a 154% higher price.*


Xzibit said:


> It just has to win the dual gpu card war and since Nvidia is selling Titan Z as a Gaming/CUDA Compute card it will compete on a dual GPU card basis with it


Not really. People aren't going to buy the 295X2 for CUDA development, or for Octane, or for any number of CUDA-accelerated productivity apps (of no small importance given the hit-and-miss state of OpenCL support)... and as a gaming head-to-head - who cares, when two single cards are faster and cheaper?


Xzibit said:


> Does anyone hear that?  Its HumanSmoke in another AMD thread.


I've got a long way to go before I reach the lack of understanding and trolling you exhibit, Xzibit, in the Nvidia threads.


----------



## Xzibit (Apr 4, 2014)

Why not pull up the one where you didn't know where the PSU was in a server?

Strange how you only express negative views in AMD threads.



HumanSmoke said:


> I don't like any dual GPU card on principle - not just this one.
> Duallies are usually more problematic, suffer in ability (OC) to two single cards, have more issues with drivers, lower resale, and generally aren't a saving over two separate cards



Hypocrite much?

I'm flattered though really.  Never had a Hobbit pay so much attention to me.


----------



## arbiter (Apr 4, 2014)

So most reviews say the 290(X) uses around 300 W, but AMD is claiming 250 W? Even if a PCIe 3 slot could give 300 W, it's probably not the best idea for AMD to rely on that to power this, and I doubt the "up to" clock will be constant. That's the other thing that sucks about AMD saying you will get "up to" this performance - you'll most likely get less. At least Nvidia says you'll get at least X, plus whatever the card can do after that.

With that said, the Titan Z, I think, was just Nvidia putting a shot across AMD's nose to get them to respond, and they did. It probably won't be long until we see Nvidia fire back with something.


----------



## HumanSmoke (Apr 4, 2014)

Xzibit said:


> Why not pull up the one where you didn't know where the PSU was in a server.


Nah, that's bullshit 


Xzibit said:


> Strange how you only express negative views in AMD threads


Obviously, since I'd prefer two 290X's to a single 295X2 


Xzibit said:


> Hypocrite much?


The quote is actually consistent with what I've been saying. Are you sure you understand what the word hypocrite means?


Xzibit said:


> I'm flattered though really.  Never had a Hobbit pay so much attention to me.


Well done. You are here....where you've always been







arbiter said:


> So most reviews say 290(x) uses around 300watts but AMD is claiming 250watts? If pcie 3 can give 300 watts which probably not best idea for AMD to use that to power this, doubt that up to clock will be constant .


Well, the board is specced for a nominal 375 watts of delivery (slot + 2 x 8-pin), but it will surely pull more than 150 watts per PCI-E 8-pin - as is quite common at the high end.
Even if the board only clocks to its max 1018 MHz, I'd think that any overclocking will pull significantly more current than the PCI specification allows, so any prospective owner might want to check their PSU's power delivery per rail (real or virtual). I really couldn't see this card getting to stable 290X overclocks.


----------



## alwayssts (Apr 4, 2014)

buildzoid said:


> The VRM on this seems to be a doubled up and more compact version of the R9 290X VRM so the GPUs can easily get fed 375A each so 750A total current allowance. Just don't expect the PCIe 8Pins(30A-40A) to be able to carry that much power(750A @ 1.5V = 1125W = 93A @ 12V) so if you do plan to use all of the VRM's capability you should solder on 1 or 2 more 6 pin connectors or risk burning something. So VRM wise your good but the dual 8 pins are insufficient. Most Quad R9 290X OC attempts I've seen included 2 1600W PSUs so this card OCed + OCed intel hexa core/ AMD octa core will need a 1200W+ PSU.



Each wire is rated for 13 A; it's just sandbagged for the PCI-E SIG spec. Three are active in both the 6-pin and 8-pin (the 8-pin has extra grounds). 13 A x 6 @ 12 V = 936 W. I have no idea if the slot can go out of spec, probably not, but still... you could (theoretically) feed a dual-8-pin card 1 kW.
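That wire-level estimate works out as follows; a sketch (the 13 A per-pin figure is the poster's, roughly in line with common connector-pin ratings, but treat it as an assumption):

```python
# Physical capacity of the +12 V wires vs. the official connector rating.
AMPS_PER_WIRE = 13       # per-pin rating claimed above (assumption)
WIRES_PER_CONNECTOR = 3  # three +12 V wires in both 6-pin and 8-pin
CONNECTORS = 2           # the 295X2's two 8-pins
VOLTS = 12

wire_capacity_w = AMPS_PER_WIRE * WIRES_PER_CONNECTOR * CONNECTORS * VOLTS
official_w = 150 * CONNECTORS  # the PCI-SIG rating for two 8-pins

print(wire_capacity_w)  # 936
print(official_w)       # 300
```

So the copper could theoretically carry roughly three times what the spec allows, which is the "over-engineered by a factor of 2-3" point made below.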


----------



## sweet (Apr 4, 2014)

alwayssts said:


> Each wire is rated for 13A, just butchered for the pci-e sig spec.  Three active in both 6 or 8-pin (8-pin has extra grounds).   13Ax6 @ 12v = 936w.  I have no idea if the slot can go out of spec, probably not, but still...you could (theoretically) feed a double 8-pin card 1KW.


Finally, a proper explanation.
An 8+8 card can pull more than just 375 W, as proven with the 7990. Just make sure your PSU is single-rail, or that each 8-pin is on a separate, healthy 30 A rail.
By the way, the 780 Ti is a 6+8 card, and it can pull a lot more than 300 W.


----------



## HumanSmoke (Apr 4, 2014)

alwayssts said:


> I have no idea if the slot can go out of spec.


It can, but it varies with motherboard design. A prime example would be the recent GTX 750/750 Ti reviews that used the reference (no aux power) design for testing. More than a few results (especially those which overclocked) yielded a power draw in excess of 75 watts.




[Source]


----------



## RCoon (Apr 4, 2014)

SIGSEGV said:


> how much salary do you get from nvidia?



Clearly not enough if you saw the car I drive. I'd also demand a free better GPU as a perk of the job. They can keep their shield for all I care.

Is it too much to ask to have a civilised thread without everyone sharpening their pitchforks and getting all potty-mouthed?
I think I preferred it when the worst thing said in one of these threads was "that card looks ugly". I'd almost welcome Jorge at this point (this is a filthy lie).


----------



## alwayssts (Apr 4, 2014)

sweet said:


> Finally a proper explain.
> 8+8 card can pull more than just 375W, as proved with 7990. Just make sure your PSU is single rail, or each 8 pin is on a separated healthy 30A rail.
> By the way, 780ti is a 6+8 card, and it can pull a lot more than just 300W.



Right. There are actually many cards that do it. Before the days of software to limit TDP (so AMD/Nvidia could upsell you PowerTune and the like), many people volt-modded 4850s way out of spec. The same game was played in reverse when Nvidia's 500 line was essentially clocked to the very end of the PCI-E spec for its plugs, and overclocking brought them substantially over. The fact of the matter is, while you could say PCI-E plugs are an evolution of the old 'Molex' connector guidelines, which is all well and good, they are over-engineered by a factor of around 2-3. This card just seems to bring that down to around 2.




HumanSmoke said:


> It can, but varies on motherboards design. A prime example would be the recent GTX 750/750Ti reviews that used the reference (no aux power) design for testing. More than a few results (esp those which overclocked) yielded a power draw in excess of 75 watts.



Very interesting! Thanks for this. So it's a factor of around 2 then - or, if the card is simply limited to that draw (to stay in the 75 W + 75 W spec), even higher.

Well then, I guess you could take 936 W plus at least another 141 W from the slot. When you factor in the VRM rated at (taking his word for it) 1125 W, it gives an idea of what the card was built to withstand (which is insane). It seems engineered for at least 2x spec, which sounds well within reason for anything anybody is realistically going to be able to do with it. I doubt many people have a power supply that could even pull that with a (probably overclocked) system under load, even with an ideal cooling solution.

Waiting for the good ol' XS/coolaler gentlemen to hook one up to a bare-bones cherry-picked system and prove me wrong (while busting some records). That seems to be pretty much what it was built to do.

On a weird side note, the clocks kind of reveal something interesting inherent to the design. First, 1018 MHz is probably where 5 GHz memory runs out of bandwidth to feed the thing. Second, 1018 MHz seems like where 28 nm would run at 1.05 V, as it's in tune with the binning of past products (like 1.218 V for the 1200 MHz 7870, or 1.175 V for the 1150 MHz 7970). It's surely a common voltage/power scaling threshold on all processes, but in this case it's halfway between the 0.9 V spec and the 1.2 V where 28 nm scaling seems to typically end. Surely they aren't running them at 1.05 V and using 1.35 V 5 GHz RAM, but it's interesting that they *could*, and probably conserve a bunch of power, if not even drop down to 0.9 V and a slower memory speed. If I were them I would bring back the UBER AWSUM RAD TOOBULAR switch that toggled between 1.05 V/1.35 V for said clocks and 1.2 V/1.5-1.55 V (because 1.35 V 5 GHz RAM is 7 GHz RAM binned from Hynix/Samsung at that spec). That would actually be quite useful both for the typical person that spends $1000 on a video card (like we all do from time to time) and for those that want the true most out of it.


----------



## xorbe (Apr 4, 2014)

I don't believe that chart up there claiming a 750 Ti pulls 141 W peak -- try again with 100 ms divisions on the measuring hardware.


----------



## Bytales (Apr 4, 2014)

radrok said:


> Would've loved to see it as a 3x8 pin just in case someone wants to abuse it for overclocking, 2x8 doesn't leave much room for power


They don't design something just because someone would love to see it, or because someone else would supposedly like to abuse it.
They were more practical in their design. If you take a look at the PCB, you'll understand why they went with dual 8-pin power connectors: there isn't room for a third, and the card is long as it is.

What I would like to see is how ASUS would design their ARES III. They would probably use a wider PCB and triple 8-pin power plugs. Even so, I don't think there would be room for 8 GB per GPU.


----------



## Bytales (Apr 4, 2014)

HumanSmoke said:


> ...
> The reality is that people go where the performance is at  this level of expenditure- and performance (and price) resides with combinations of single GPU boards, not some PR grab for the halo which might (or might not) give midrange card buyers a chubby.
> 
> You really sound like someone that has never bought enthusiast grade hardware.



The reality is I am interested in the 295X2 because it allows me to use 4 GPUs with only 4 PCI slots, in fact occupying only 2 physical slots on the motherboard. The way it's designed (the outputs on the back are on a single row), a custom-made waterblock would allow a single 295X2 to be made single-slot, thus widening my possibilities and winning me another slot on the motherboard.

That's why I would personally be interested in this card.


----------



## radrok (Apr 4, 2014)

Bytales said:


> They don't design something just because someone would love to see it, or because someone else would supposedly like to abuse it.
> They were more practical in their design. If you take a look at the PCB, you'll understand why they went with dual 8-pin power connectors: there isn't room for a third, and the card is long as it is.
> 
> What I would like to see is how ASUS would design their ARES III. They would probably use a wider PCB and triple 8-pin power plugs. Even so, I don't think there would be room for 8 GB per GPU.



I expect overkill on this kind of GPU, nothing less.


----------



## Blín D'ñero (Apr 4, 2014)

RCoon said:


> I think, and I really want to believe, the 295X2 will beat the Titan Z. The Titan Z will be voltage-locked, and it will be bottlenecked by its air cooler. The AMD card, however, will have a much higher thermal tolerance before throttling (I hope), and will possibly have more voltage headroom.
> 
> All that being said, the Titan Z *is not designed for gaming*, for the 50th time; it's a cheap DP-compute card for home servers. This needs to be stapled to every forum header, title, post, thread, the whole works, until everyone on the internet understands that.


Isn't it? Then why don't you post your comment there (at geforce.com), where they say it "is a gaming monster, built to power the most extreme gaming rigs on the planet", "a serious card built for serious gamers"... etcetera.


----------



## RCoon (Apr 4, 2014)

Blín D'ñero said:


> Isn't it? Then why don't you post your comment there (at geforce.com), where they say it "is a gaming monster, built to power the most extreme gaming rigs on the planet", "a serious card built for serious gamers"... etcetera.



Gaming is not and has never been a Titan's primary purpose (read: a *Titan without DP compute is a 780 Ti*). Nobody reads keynotes, nobody watches release events; people pay attention to nothing and just blurt out what they think before doing any research. I'm sick and tired of saying the same thing over and over again.
Yes, the Titan Z is viable for gaming, and is probably very good at it. But that is not its primary purpose. Please research.
Before anybody says anything: I would never buy a Titan. This isn't "lol you must work for NVidia", this is common sense, because I actually watched the release event where its primary purpose was specifically outlined.
*Cheap DP-compute servers. For the millionth time.*
_"The GTX Titan Z packs two Kepler-class GK110 GPUs with 12 GB of memory and a whopping 5,760 CUDA cores, making it a 'supercomputer you can fit under your desk,' according to Nvidia CEO Jen-Hsun Huang."_

Anybody who buys a Titan Z for gaming probably needs to rethink their life, and apply for the Darwin Award.

In other news, I heard this is an AMD thread regarding the 295X2?


----------



## pr0n Inspector (Apr 4, 2014)

alwayssts said:


> Each wire is rated for 13A, just butchered for the pci-e sig spec.  Three active in both 6 or 8-pin (8-pin has extra grounds).   13Ax6 @ 12v = 936w.  I have no idea if the slot can go out of spec, probably not, but still...you could (theoretically) feed a double 8-pin card 1KW.


16 AWG is rated for 13 A; 18 AWG for 10 A.
But the bottleneck is the connector, not the conductor.
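Those gauge figures translate into per-plug ceilings like so (a quick sketch using the ampacities quoted above; actual connector contacts are typically rated well below the wire itself, which is the point being made):

```python
# Power ceiling of one 8-pin PCIe plug at the wire ratings quoted above.
# Three live 12 V conductors per plug; amps are per conductor.
ampacity = {"16 AWG": 13, "18 AWG": 10}
live_wires, volts = 3, 12
for gauge, amps in ampacity.items():
    print(gauge, live_wires * amps * volts)
# 16 AWG -> 468 W per plug, 18 AWG -> 360 W per plug
```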


----------



## Xzibit (Apr 4, 2014)

New images


----------



## Slizzo (Apr 4, 2014)

Yeah, that design makes sense. While it does suck that the card isn't using a full-cover waterblock, I can see how a rush to market meant they didn't want to wait for a full-coverage block to be designed, and chose this route instead.


----------



## radrok (Apr 4, 2014)

Oh my god I didn't notice, you can make this thing single slot with a waterblock.

Oh my god.


----------



## nem (Apr 5, 2014)

Chiphell did it again. Here is the reality:

http://www.chiphell.com/thread-1002731-2-1.html


----------



## Xzibit (Apr 6, 2014)

AMD posted this teaser/commercial for the 295x2

*AMD's Top Secret Mission*


----------



## Blín D'ñero (Apr 6, 2014)

nem said:


> CHIPHELL did again here is the reality
> 
> [...]


Why does your post contain a hyperlink to itself, instead of a link to the source at Chiphell?


----------



## radrok (Apr 6, 2014)

Can't wait to see power consumption figures; those should be as high as its performance.


----------



## MxPhenom 216 (Apr 6, 2014)

Xzibit said:


> The 295X2 just has to have a better price/performance than the Titan Z/790 = win.
> 
> We should know more when the 295X2 launches on the 8th.
> 
> Anyone know when the Titan Z will be launched?



As far as I'm concerned, the Titan Z is not the 790.


----------



## radrok (Apr 6, 2014)

MxPhenom 216 said:


> As far as I'm concerned, the Titan Z is not the 790.



Agreed, there is literally no point in marketing the Titan Z for gaming.

Nvidia should grow some and get a proper dual 780 Ti on the market, without all the fuss about DP compute and so on: 6 GB per GPU for less than 1,399 USD, and it would have a winner.


----------



## HumanSmoke (Apr 7, 2014)

radrok said:


> Nvidia should grow some and get a proper dual 780 Ti on the market, without all the fuss about DP compute and so on: 6 GB per GPU for less than 1,399 USD, and it would have a winner.


The sites which leaked the advance info regarding the Titan Z seemed convinced that a 780/780 Ti-based GTX 790 dual card was incoming, and that it would be distinct from the Titan Z. My guess is that Nvidia will play out the same scenario as the original GeForce GK110 launch: snag the price-no-object crowd with Titan, then, once the initial sales buzz dies down, launch a more price-friendly 700-series card that AIBs would have more input into.
Nvidia could likely tune the board frequencies for performance once the 295X2 arrives... although it wouldn't surprise me for AMD to counter with a lower-cost, air-cooled dual 290 (no X) either.

I haven't owned a dual board since a GeForce 6600 GT duallie (early-adopter-itis), and I don't see any of these offerings convincing me to return to a dual card.
If all these SKUs come to pass, it seems like both camps are treading water until 20/16FF comes onstream, and the more boards they launch, the less likely it seems that we'll see an early appearance of the new process.


----------



## xorbe (Apr 7, 2014)

The Titan Z would be identical to a 2x 780 Ti card.  They more or less blow a firmware bit to disable high-performance FP64.


----------



## radrok (Apr 7, 2014)

Nvidia locks compute at the hardware level, too.


----------



## MxPhenom 216 (Apr 7, 2014)

radrok said:


> Nvidia locks compute at the hardware level, too.



This. Not to mention, a GPU's compute performance also has something to do with the cache on the GPU, which GeForce cards are stripped of.


----------

