Friday, August 23rd 2013
NVIDIA Working on GK110-based Dual-GPU Graphics Card?
The GeForce GTX 295 showed that it's possible to place two GPUs with ludicrously high pin-counts next to each other on a single PCB and, with their thermals under control, even cool them with a 2-slot solution. NVIDIA might be motivated to create such a dual-GPU graphics card based on its top-end GK110 chip to counter AMD's upcoming "Volcanic Islands" GPU family, or so claims a VideoCardz report, citing sources.
The chips on the card needn't be configured, or even clocked, like a GTX Titan; the GTX 780, for example, features just 2,304 of the chip's 2,880 CUDA cores. Speaking of 2,880 CUDA cores, the prospect of NVIDIA developing a single-GPU GeForce product with all of the GK110's streaming multiprocessors enabled, the so-called "Titan Ultra," isn't dead. NVIDIA could turn its attention to such a card if it finds AMD's R9 2xxx series within reach of the GTX Titan.
Source: VideoCardz
43 Comments on NVIDIA Working on GK110-based Dual-GPU Graphics Card?
I kinda imagine the % of people buying the Titan and leaving it at stock is tiny, because let's face it, you gotta be really into it to buy one or two, like me.
That said, it wouldn't have cost Nvidia that much to deliver a proper VRM.
So yes, I'm ranting cause Nvidia always cheaps out on PCB components ;)
If they can release a Titan Ultra that doesn't throttle with all 2,880 cores and clock it at a full gigahertz, it might just get Nvidia by until whenever Maxwell is due. This, however, means the VRMs on the board must support at least 300W-350W to keep the chip from throttling. Nvidia themselves proved their cards were throttled when they released the GTX 770, which consumes more power than the 680 it originally replaced.
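For rough context, here's a back-of-envelope sketch of where a 300W-350W figure could come from, assuming power scales linearly with core count and clock at fixed voltage (a crude simplification; real power also rises roughly with voltage squared, so any voltage bump needed for 1 GHz would push it higher). It scales from the GTX Titan's published specs: 2,688 cores, 837 MHz base clock, 250W TDP.

[CODE]
# Back-of-envelope TDP estimate for a hypothetical "Titan Ultra"
# (all 2,880 cores at 1 GHz), scaled from the GTX Titan's published
# specs. Assumes power scales linearly with core count and clock at
# fixed voltage -- a crude simplification, not a real power model.

TITAN_CORES, TITAN_CLOCK_MHZ, TITAN_TDP_W = 2688, 837, 250
ULTRA_CORES, ULTRA_CLOCK_MHZ = 2880, 1000

ultra_tdp = TITAN_TDP_W * (ULTRA_CORES / TITAN_CORES) \
                        * (ULTRA_CLOCK_MHZ / TITAN_CLOCK_MHZ)
print(f"Estimated TDP: {ultra_tdp:.0f} W")  # ~320 W
[/CODE]

That lands at roughly 320W, consistent with the 300W-350W range above.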
I honestly hope they don't waste their time with this dual-GPU card bullshit; it would be ironic for them to release one after spending months "educating" everyone about runt frames and how much 2x Xfire/SLI setups increase latency, ever since they released the Titan. I don't think anyone here would give two shits about a dual-Titan card for any reason, not even to drool over, since Titans already throttle on their own without sharing a single PCB. Even if I had $5K burning a hole in my pocket, I wouldn't touch any dual-GPU card with a barge pole after seeing how it fucks up frame timing in most games.
If you are talking about reference cards, well, the Volterras on my 6990s didn't have noticeable vdroop either.
I get what you are saying though.
But my point is that even the ones who do want to tweak it can do so without a problem; the PWM is good enough for decent overclocks.
Anyway, to answer in your tone: you'll eventually realize someday that money isn't directly proportional to stupidity.
Goes well with the fact that people who make good money have actually used their brains the right way. :toast:
I'm done since you had to empower your point with an insult :cool:
No insult intended at all; the plain fact is that Titans are completely overpriced and only an idiot would pay the outrageous premium for them. W1z has even said so himself.
And there are plenty of people out there with more money than brains. Just because you have money doesn't mean you're smart.;)
www.overclock.net/t/1421221/gtx780-titan-any-ncp4206-card-vdroop-fix-solid-1-325v
www.techpowerup.com/forums/showpost.php?p=2967267&postcount=50
[YT]sRo-1VFMcbc[/YT]