
Radeon R9 295X2 Press Deck Leaked

Since you're reverting back to the single-GPU-card argument in a dual-GPU comparison, does that mean you've given up hope already?
Yes. I think the 295X2 is a waste of time for the enthusiast.
At least wait for the reviews.
Why? Even AMD's own propaganda pegs the 295X2 at 60% better than a single card, at a 154% higher price.
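For what it's worth, the arithmetic behind that claim is easy to sanity-check. A minimal sketch - the 60%/154% figures are the leaked slides' marketing numbers, not measurements:

```python
# Back-of-the-envelope perf-per-dollar check using the slide numbers
# quoted above (marketing figures, not measured results).
single_perf, single_price = 1.00, 1.00   # normalise one 290X to 1.0
dual_perf = single_perf * 1.60           # "60% better than a single card"
dual_price = single_price * 2.54         # "154% higher price" = 2.54x

print(f"295X2 perf per dollar vs one 290X: {dual_perf / dual_price:.2f}")
# ~0.63, i.e. roughly 37% less performance per dollar than a single card
```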
It just has to win the dual-GPU card war, and since Nvidia is selling the Titan Z as a gaming/CUDA compute card, it will compete with it on a dual-GPU basis.
Not really. People aren't going to buy the 295X2 for CUDA development, or for Octane, or for any number of CUDA-accelerated productivity apps (of no small importance given the hit-and-miss state of OpenCL support)... and as a gaming head-to-head, who cares, when two single cards are faster and cheaper?
Does anyone hear that? It's HumanSmoke in another AMD thread.
:D I've got a long way to go before I reach the level of misunderstanding and trolling you manage in the Nvidia threads, Xzibit :clap:
 
Why not pull up the one where you didn't know where the PSU was in a server.

Strange how you only express negative views in AMD threads

I don't like any dual-GPU card on principle - not just this one.
Duallies are usually more problematic: they suffer in overclocking ability compared to two single cards, have more driver issues, see lower resale, and generally aren't a saving over two separate cards.

Hypocrite much?

I'm flattered though really. Never had a Hobbit pay so much attention to me.
 
So most reviews say the 290(X) uses around 300 watts, but AMD is claiming 250 watts? Even if PCI-E can technically deliver 300 watts, it's probably not the best idea for AMD to rely on that to power this, and I doubt the "up to" clock will be constant. That's the other thing that sucks about AMD: they say you will get "up to" this performance, and you'll most likely get less. At least Nvidia says you'll get at least X, plus whatever the card can do after that.

That said, I think the Titan Z was just Nvidia firing a shot across AMD's bow to get them to respond, and they did. It probably won't be long till we see Nvidia fire back with something.
 
Why not pull up the one where you didn't know where the PSU was in a server.
Nah, that's bullshit :shadedshu:
Strange how you only express negative views in AMD threads
Obviously, since I'd prefer two 290X's to a single 295X2 :shadedshu:
Hypocrite much?
The quote is actually consistent with what I've been saying. Are you sure you understand what the word hypocrite means?
I'm flattered though really. Never had a Hobbit pay so much attention to me.
Well done. You are here... where you've always been.

So most reviews say the 290(X) uses around 300 watts, but AMD is claiming 250 watts? Even if PCI-E can technically deliver 300 watts, it's probably not the best idea for AMD to rely on that to power this, and I doubt the "up to" clock will be constant.
Well, the board is specced for a nominal 375 watts of delivery (2 x 8-pin plus the slot), but it will surely pull more than 150 watts per PCI-E 8-pin - as is quite common at the high end.
Even if the board only clocks to its maximum 1018MHz, I'd think that any overclocking will pull significantly more current than the PCI-E specification allows, so any prospective owner might want to check their PSU's power delivery per rail (real or virtual). I really can't see this card reaching stable 290X overclocks.
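As a rough illustration of that per-rail check - a sketch only, where the overclocked board draw is an assumed figure, not anything from the slides:

```python
# Minimal sketch of the per-rail check suggested above. The 450 W
# overclocked-draw figure is purely hypothetical, for illustration.
SLOT_WATTS = 75.0                 # PCI-E slot allowance per spec
board_draw_watts = 450.0          # assumed overclocked board draw
aux_watts = board_draw_watts - SLOT_WATTS      # carried by the two 8-pins
amps_per_8pin = (aux_watts / 2) / 12.0         # per-connector current at 12 V

print(f"~{amps_per_8pin:.1f} A per 8-pin (spec allows 12.5 A / 150 W)")
for rail_amps in (18, 30, 40):                 # common virtual-rail ratings
    verdict = "OK" if rail_amps >= amps_per_8pin else "insufficient"
    print(f"{rail_amps} A rail: {verdict}")
```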
 
The VRM on this seems to be a doubled-up and more compact version of the R9 290X VRM, so the GPUs can easily get fed 375A each - a 750A total current allowance. Just don't expect the PCI-E 8-pins (30A-40A each) to be able to carry that much power (750A @ 1.5V = 1125W = ~94A @ 12V), so if you do plan to use all of the VRM's capability, you should solder on one or two more 6-pin connectors or risk burning something. So VRM-wise you're good, but the dual 8-pins are insufficient. Most quad R9 290X OC attempts I've seen included two 1600W PSUs, so this card OCed plus an OCed Intel hexa-core / AMD octa-core will need a 1200W+ PSU.
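For anyone wanting to re-run that math, a quick sketch - the VRM rating and connector currents are the post's own figures, not confirmed specs:

```python
# Re-running the arithmetic above. The 375 A-per-GPU VRM rating and the
# 30-40 A connector figure are the poster's claims, not confirmed specs.
vrm_amps = 2 * 375                  # 750 A combined at the core voltage
core_volts = 1.5
vrm_watts = vrm_amps * core_volts   # power the VRMs could theoretically feed
amps_at_12v = vrm_watts / 12.0      # what the 12 V input side must supply

print(f"{vrm_watts:.0f} W at the cores -> {amps_at_12v:.1f} A @ 12 V")
# 1125 W -> ~93.8 A; two 8-pins at 30-40 A each give only 60-80 A, hence
# the suggested extra connectors for extreme overclocking.
```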

Each wire is rated for 13A, just butchered for the PCI-SIG spec. Three are active in both the 6- and 8-pin (the 8-pin just adds extra grounds). 13A x 6 @ 12V = 936W. I have no idea if the slot can go out of spec, probably not, but still... you could (theoretically) feed a double 8-pin card 1kW.
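Spelled out, that calculation looks like this (assuming 16 AWG wiring throughout, as the 13A rating implies):

```python
# The wire-level arithmetic from above: three live 12 V conductors per
# plug, each conservatively rated at 13 A (16 AWG).
AMPS_PER_WIRE = 13
LIVE_WIRES_PER_PLUG = 3
PLUGS = 2                                    # dual 8-pin card

raw_watts = AMPS_PER_WIRE * LIVE_WIRES_PER_PLUG * PLUGS * 12
print(f"{raw_watts} W of raw conductor capacity")   # 936 W, ~1 kW as stated
```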
 
Each wire is rated for 13A, just butchered for the PCI-SIG spec. Three are active in both the 6- and 8-pin (the 8-pin just adds extra grounds). 13A x 6 @ 12V = 936W. I have no idea if the slot can go out of spec, probably not, but still... you could (theoretically) feed a double 8-pin card 1kW.
Finally a proper explanation.
An 8+8 card can pull more than just 375W, as proved with the 7990. Just make sure your PSU is single-rail, or that each 8-pin is on a separate, healthy 30A rail.
By the way, the 780 Ti is a 6+8 card, and it can pull a lot more than just 300W.
 
I have no idea if the slot can go out of spec.
It can, but it varies with motherboard design. A prime example would be the recent GTX 750/750 Ti reviews that used the reference (no aux power) design for testing. More than a few results (especially the overclocked ones) yielded a power draw in excess of 75 watts.
[Chart: GTX 750 Ti power draw over a 170-second gaming loop, peaking around 141W]

[Source]
 
how much salary do you get from nvidia?

Clearly not enough, if you saw the car I drive. I'd also demand a better GPU for free as a perk of the job. They can keep their Shield for all I care.

Is it too much to ask to have a civilised thread without everyone sharpening their pitchforks and getting all potty-mouthed?
I think I preferred it when the worst thing said in one of these threads was "that card looks ugly". I'd almost welcome Jorge at this point (this is a filthy lie).
 
Finally a proper explanation.
An 8+8 card can pull more than just 375W, as proved with the 7990. Just make sure your PSU is single-rail, or that each 8-pin is on a separate, healthy 30A rail.
By the way, the 780 Ti is a 6+8 card, and it can pull a lot more than just 300W.

Right. There are actually many cards that do it. Before the days of software TDP limits - which let AMD/Nvidia upsell you PowerTune and the like - many people volt-modded 4850s way out of spec. The same game was played in reverse when Nvidia's 500 line was essentially clocked to the ragged edge of the PCI-E spec for its plugs, and overclocking brought them substantially over. The fact of the matter is that while you could say PCI-E plugs are an evolution of the old 'Molex' connector guidelines, which is all well and good, they are over-engineered by a factor of around 2-3. This card just seems to bring that down to around 2.
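A sketch of that factor-of-2-3 claim - note the card's aux draw below is a hypothetical number, not a measurement:

```python
# Sketch of the 'over-engineered by 2-3x' claim: raw conductor capacity
# per 8-pin versus its PCI-SIG rating. The card's aux draw is assumed.
wire_watts_per_8pin = 13 * 3 * 12    # 468 W of conductor per plug
spec_watts_per_8pin = 150            # PCI-SIG rating per 8-pin

print(f"headroom: {wire_watts_per_8pin / spec_watts_per_8pin:.1f}x spec")  # ~3.1x

card_aux_watts = 470                 # hypothetical heavy-OC draw on the plugs
print(f"margin at {card_aux_watts} W: "
      f"{2 * wire_watts_per_8pin / card_aux_watts:.1f}x")  # ~2.0x
```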


It can, but it varies with motherboard design. A prime example would be the recent GTX 750/750 Ti reviews that used the reference (no aux power) design for testing. More than a few results (especially the overclocked ones) yielded a power draw in excess of 75 watts.

Very interesting! Thanks for this. So it's a factor of around 2 then - and if the card is simply being limited to that power draw (to stay in the 75+75W spec), it's even higher.

Well then, I guess you could take 936 plus at least 141W. When you factor in the VRM rated at (taking his word for it) 1125W, it gives an idea of what the card was built to withstand (which is insane). It seems engineered for at least 2x spec, which sounds well within reason for anything anybody is realistically going to do with it. I doubt many people have a power supply that could even pull that with a (probably overclocked) system under load, even with an ideal cooling solution.
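Putting the thread's numbers side by side - a sketch; every input is a poster's claim or a single review observation, not an official spec:

```python
# Pulling the thread's numbers together. All inputs are posters' figures
# or single review observations, not official specifications.
wire_watts = 936            # conductor capacity of the dual 8-pins (above)
slot_watts_seen = 141       # peak a slot-only 750 Ti was observed drawing
deliverable = wire_watts + slot_watts_seen     # ~1077 W plausibly deliverable

vrm_watts = 1125            # claimed VRM output capability
spec_watts = 2 * 150 + 75   # official rating: two 8-pins plus the slot

print(f"deliverable ~{deliverable} W, VRM ~{vrm_watts} W, spec {spec_watts} W")
print(f"roughly {vrm_watts / spec_watts:.0f}x the official connector spec")
```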

Waiting for the good ol' XS/Coolaler gentlemen to hook one up to a bare-bones, cherry-picked system and prove me wrong (while busting some records). That seems to be pretty much what it was built to do.

On a weird side note, the clocks kind of reveal something interesting inherent to the design. First, 1018MHz is probably where 5GHz memory runs out of bandwidth to feed the thing. Second, 1018MHz seems like where 28nm would run at 1.05V, as it's in tune with the binning of past products (like 1.218V for the 1200MHz 7870, or 1.175V for the 1150MHz 7970). It's surely a common voltage/power scaling threshold on all processes, but in this case halfway between the 0.9V spec and the 1.2V where 28nm scaling seems to typically end. Surely they aren't running them at 1.05V and using 1.35V 5GHz RAM, but it's interesting that they *could* and probably conserve a bunch of power, if not even drop down to 0.9V and a slower memory speed. If I were them I would bring back the UBER AWSUM RAD TOOBULAR switch, toggling between 1.05V/1.35V for said clocks and 1.2V/1.5-1.55V (because 1.35V 5GHz RAM is 7GHz RAM binned from Hynix/Samsung at that spec). That would actually be quite useful both for the typical person that spends $1000 on a video card (like we all do from time to time) and for those that want the true most out of it.
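The intuition here follows the first-order dynamic-power relation P ~ f * V^2. A sketch of the savings being gestured at - the voltages are the poster's guesses, and leakage is ignored:

```python
# First-order CMOS dynamic-power scaling (P ~ f * V^2) behind the voltage
# speculation above. All voltages/clocks are the poster's guesses.
def relative_power(f_mhz, volts, f0_mhz=1018, v0=1.20):
    """Core dynamic power relative to f0 @ v0, ignoring leakage."""
    return (f_mhz / f0_mhz) * (volts / v0) ** 2

print(f"1018 MHz @ 1.05 V: {relative_power(1018, 1.05):.2f}x")  # ~0.77x
print(f" 900 MHz @ 0.90 V: {relative_power(900, 0.90):.2f}x")   # ~0.50x
```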
 
I don't believe that chart up there claiming a 750 Ti pulls 141W peak -- try again with 100ms divisions on the measuring hardware.
 
Would've loved to see it with 3x 8-pin just in case someone wants to abuse it for overclocking; 2x 8-pin doesn't leave much room for power :(
They don't design something just because someone would love to see it, or because someone else would supposedly like to abuse it.
They were more practical in their design. If you took a look at the PCB, you would understand why they went with the dual 8-pin power connectors: there isn't room for a third, and the card is long as it is.

What I would like to see is how ASUS would design their ARES III. They would probably use a wider PCB and triple 8-pin power plugs. Even so, I don't think there would be room for 8 GB per GPU.
 
...
The reality is that people go where the performance is at this level of expenditure - and performance (and price) resides with combinations of single-GPU boards, not some PR grab for the halo which might (or might not) give midrange card buyers a chubby.

You really sound like someone who has never bought enthusiast-grade hardware.

The reality is I am interested in the 295X2 because it allows me to use four GPUs in only four PCI slots, and just two physical slots on the motherboard. The way it's designed (the display connectors on the back are on a single row), a custom-made waterblock would allow a 295X2 to be made single-slot, widening my possibilities and winning me another slot on the motherboard.

That's why I would personally be interested in this card.
 
They don't design something just because someone would love to see it, or because someone else would supposedly like to abuse it.
They were more practical in their design. If you took a look at the PCB, you would understand why they went with the dual 8-pin power connectors: there isn't room for a third, and the card is long as it is.

What I would like to see is how ASUS would design their ARES III. They would probably use a wider PCB and triple 8-pin power plugs. Even so, I don't think there would be room for 8 GB per GPU.

I expect overkill on these kinds of GPUs, nothing less.
 
I think, and I really want to believe, the 295X2 will beat the Titan Z. The Titan Z will be voltage-locked, and it will be bottlenecked by the air cooler. The AMD card, however, will have a much higher thermal tolerance before throttling (I hope), and will possibly have more voltage headroom.

All that being said: the Titan Z is not designed for gaming, for the 50th time; it's a cheap DP compute card for home servers. This needs to be stapled to every forum header, title, post, and thread until everyone on the internet understands it.
Isn't it? Why don't you post your comment then here (@ geforce.com) where they say it "is a gaming monster, built to power the most extreme gaming rigs on the planet.", "a serious card built for serious gamers"... etcetera.
 
Isn't it? Why don't you post your comment then here (@ geforce.com) where they say it "is a gaming monster, built to power the most extreme gaming rigs on the planet.", "a serious card built for serious gamers"... etcetera.

Gaming is not and has never been a Titan's primary purpose (read: a Titan without DP compute is just a 780 Ti). Nobody reads keynotes, nobody watches release events, nobody pays attention to anything; people just blurt out what they think before doing any research. I'm sick and tired of saying the same thing over and over again.
Yes, the Titan Z is viable for gaming, and is probably very good at it. That is not its primary purpose. Please research.
Before anybody says anything: I would never buy a Titan. This isn't "lol you must work for Nvidia", this is common sense, because I actually watched the release event where its primary purpose was specifically outlined.
Cheap DP compute servers. For the millionth time.

"The GTX Titan Z packs two Kepler-class GK110 GPUs with 12GB of memory and a whopping 5,760 CUDA cores, making it a "supercomputer you can fit under your desk," according to Nvidia CEO Jen-Hsun Huang"

Anybody who buys a Titan Z for gaming probably needs to rethink their life, and apply for the Darwin Award.

In other news, I heard this is an AMD thread regarding the 295X2?
 
Each wire is rated for 13A, just butchered for the PCI-SIG spec. Three are active in both the 6- and 8-pin (the 8-pin just adds extra grounds). 13A x 6 @ 12V = 936W. I have no idea if the slot can go out of spec, probably not, but still... you could (theoretically) feed a double 8-pin card 1kW.
16AWG is rated 13A; 18AWG is 10A.
But the bottleneck is the connector, not the conductor.
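That weakest-link point in sketch form - the per-contact rating used here is an assumption for illustration only, not a datasheet value:

```python
# Weakest-link check: conductor versus connector. The per-contact rating
# below is an assumed illustrative figure - check the terminal datasheet.
PIN_RATING_AMPS = 8                      # assumed per-contact limit

for gauge, wire_amps in {"16AWG": 13, "18AWG": 10}.items():
    limit = min(wire_amps, PIN_RATING_AMPS)     # the bottleneck wins
    print(f"{gauge}: wire {wire_amps} A, contact {PIN_RATING_AMPS} A "
          f"-> {limit} A, {limit * 3 * 12} W per plug")
```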
 
Yeah, that design makes sense. While it does suck that it's not using a full-cover waterblock, I can see how a rush to market would mean they didn't want to wait for a full-coverage block to be designed, and chose to go this route.
 
Oh my god I didn't notice, you can make this thing single slot with a waterblock.

Oh my god.
 
CHIPHELL did it again; here is the reality

[...]
Why does your post contain a hyperlink to itself, instead of a link to the Chiphell source?
 
Can't wait to see the power consumption figures; those should be as high as its performance.
 
The 295X2 just has to have better price/performance than the Titan Z/790 = win.

We should know more when the 295X2 launches on the 8th.

Anyone know when the Titan Z will be launched?

As far as I'm concerned, the Titan Z is not the 790.
 
As far as I'm concerned, the Titan Z is not the 790.

Agreed, there is literally no point in marketing the Titan Z for gaming.

Nvidia should grow some and get a proper dual 780 Ti on the market, without all the fuss about DP compute and so on. 6GB per GPU for less than 1399 USD and they would have a winner.
 