
NVIDIA GeForce RTX 3090 Founders Edition Potentially Pictured: 3-slot Behemoth!

SLI/CrossFire, even in its heyday, always had support problems: it depended heavily on per-game driver profiles and on the goodwill of game developers.

Then there was the problem of poor scaling. Adding a second GPU could, at best, bring 40% or 50% more performance, but mostly it was well below that. In other words, you paid for a full GPU and got half or a third of its performance out of it, depending on the game. And it got worse the more GPUs you added.
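To make that value math concrete, here is a quick sketch (illustrative normalized numbers, not benchmarks) of what a +40-50% SLI uplift means in performance per dollar:

```python
def perf_per_dollar(perf: float, price: float) -> float:
    """Relative performance divided by relative price."""
    return perf / price

# Baseline: one card, performance 1.0, price 1.0 (normalized).
single = perf_per_dollar(1.0, 1.0)

# SLI pair with a +45% uplift (midpoint of the 40-50% range above):
# performance 1.45 at twice the price.
sli = perf_per_dollar(1.45, 2.0)

print(round(sli / single, 3))  # 0.725 -> roughly 27% worse value per dollar
```

So even at the optimistic end of the scaling range, the pair delivers noticeably less performance per dollar than a single card.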

There were rare cases where it paid off, but as a general rule, after a few years it was better to sell the card and buy one from the new generation, avoiding a lot of hassle. Not to mention the heat, noise, and power consumption that SLI/CrossFire usually caused. Often the GPU on top was constantly throttling because it had no room to breathe, further reducing the performance gain.

I don't think it has anything to do with the price of the RTX 2080 Ti; Nvidia started down this path long before that. The GTX 1060 in 2016, for example, no longer supported it. AMD never embarked on cards over $1,000 and also abandoned dual-GPU.
Dude, you really have NO IDEA about SLI, do you? Where are you getting those numbers? 40-50%? That happened when it first came back with PCIe. By the time we got to the GTX 780 and R9 290X, the improvements were in the 90-100% range. And what you say about temperatures and noise? Totally wrong. Why? Because you rarely had to push both cards to 100%, unlike with a single card, where you have to squeeze all the power out of it. Yep, they consume more power, but that's about it.
I have used SLI and CFX quite a few times. I had a CFX of 4870s, then SLI of GTX 480s, and then CFX of R9 290Xs. Heck, my MSI GT80 still has an SLI of GTX 980Ms.
Of course, in games that failed to properly implement SLI and CFX there was little to no gain, and that is what happened. If the card makers aren't pushing dual (or even triple and quadruple) configurations any more, game companies will not support them.
 
Given the size of the cooler, I wonder how hot the RTX 3xxx series will run. Despite the use of a newer fab (unknown at this point, but surely better than 12 nm), the power requirement is still shooting through the roof. I feel Nvidia went ultra-aggressive to cram in as much as they could. I suspect most of the die space will be taken up by RT, Tensor, and whatever bespoke cores they add in there.

Ultimately, the card with the best value for the performance will still be the hottest-selling one. Cards like the XX80 Ti are not meant for most people, whether due to their requirements or their budget.

I wonder if there are any ITX cases that are big enough to fit this beast.
Big enough ITX? Chances are slim. Able to handle the heat? I think chances are very slim.
 
Well, it depends on the cards you compare. For example, the most common early mid-range Turing cards (1660/1660 Ti) were not bad values compared to the 1060 3GB and 1060 6GB, respectively.

The 1660 cost 10% more than the 1060 3GB but offered 35% more performance at 1080p, in addition to increasing the VRAM to 6GB.

The 1660 Ti cost 12% more (if we ignore the higher Founders Edition price of the 1060), but it had 36% more performance at 1080p than the 1060 6GB.
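Taking the figures quoted above at face value (they are the commenter's numbers, not measured data), the relative value works out like this:

```python
def value_ratio(price_increase: float, perf_increase: float) -> float:
    """Performance-per-dollar of the newer card relative to the older one."""
    return (1 + perf_increase) / (1 + price_increase)

# GTX 1660 vs 1060 3GB: +10% price, +35% performance at 1080p.
print(round(value_ratio(0.10, 0.35), 3))  # 1.227 -> ~23% better value

# GTX 1660 Ti vs 1060 6GB: +12% price, +36% performance at 1080p.
print(round(value_ratio(0.12, 0.36), 3))  # 1.214 -> ~21% better value
```

A ratio above 1.0 means the newer card delivers more performance per dollar, so by these numbers both 16-series cards were modest value improvements.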
Are you focking serious? The 1060 is THREE years older than the 16xx cards. In the old days you could've gotten 100% more performance for the same price over such a period.
 
I guess I'm keeping my 1080 Ti or upgrading to a 2080 Ti, 'cause nothing else will fit in this N1Case.
 
I've been craving another power monster since the 295x2.

Ah yes, the $1,500 face-melting water-cooled beast! The pursuit of performance was fun back then; now people just like to get upset about things they don't intend to buy anyway.
 


Big enough ITX? Chances are slim. Able to handle the heat? I think chances are very slim.

I think size would be the biggest limiting factor. There are plenty of ITX cases that let the GPU draw fresh air through the side panel, so heat isn't an issue.
I've seen small builds using Threadrippers, and those things can pump out some real heat.
 
Ah yes, the $1,500 face-melting water-cooled beast! The pursuit of performance was fun back then; now people just like to get upset about things they don't intend to buy anyway.
Though I have to give credit: AMD managed to cool it with just a 120mm AIO. :eek:

That truly was a beast. I had R9 290s in CrossFire a year ago, and when CF worked, it scaled pretty nicely. But I kind of understand why SLI/CF is yesterday's thing.
 
and then there are pictures of coolers with a 'shroud' on and off...

Nvidia-RTX-3080-Ampere-1000x750.jpg


Like wut
That is the rear of the card, what you're looking at is the backplate. Look at the position of the PCIe slot and I/O bracket. The PCB is obviously behind there, so it's not like it's blocking any airflow. Entirely agree with the rest of the post though.
Are you focking serious? The 1060 is THREE years older than the 16xx cards. In the old days you could've gotten 100% more performance for the same price over such a period.
While you're not entirely wrong, that is also the ever-increasing reality of chipmaking: as time passes, the generational gains shrink. As production nodes near various physical limits, they become more expensive and difficult to make, making bigger chips expensive and low-yielding. As the number of GPU cores increases, memory bandwidth becomes ever more of a bottleneck. As does the rest of the system (hence why the new consoles are NVMe-only and have dedicated decompression hardware to feed their GPUs). So while in the past a three-year wait might have allowed, for example, a 100% increase in CUs/CUDA cores, a small bump in clock speeds, and the same power draw at the same price, that isn't happening any longer. Mind you, that doesn't justify current GPU prices by any stretch of the imagination, but it does explain why generational gains are shrinking. It's a sign of maturing technologies. Hopefully increased competition this generation will drop prices a bit and keep them there, though, as the current midrange and upper-midrange cards are priced where high-end and flagship cards used to be...
 
Are you focking serious? The 1060 is THREE years older than the 16xx cards. In the old days you could've gotten 100% more performance for the same price over such a period.
Probably even more. Think about the 6600 GT (2004) -> 8600 GTS (2007), for example. In the high-end range the boost was even bigger than in those mid-range cards.
 
Ah yes, the $1,500 face-melting water-cooled beast! The pursuit of performance was fun back then; now people just like to get upset about things they don't intend to buy anyway.

Those cards were special, aberrations. This card is a bog-standard card.
 
for the glory!!!!



Normally I would agree with you, but I think those days are gone now. Moving forward after this bump, it will be 5% gains at most on the FPS end, while they improve DLSS, RT, and whatever other new gimmicks they come up with to keep us spending money and not worrying about the 5%.
Ray tracing is new enough that we are about to see generation 1 from one team and gen 2 from Nvidia. It's still very new; generational leaps should be quite good for five years. But 12 years of use out of a 3090? You and him are having a laugh; there's no chance of it remaining useful.

No GPU would; it's not a slur against Nvidia to say so. The real slur would be implying they can't improve adequately in that time.
 
The GTX 480 might have a successor with respect to power draw and heat output.
 
I've been singing the same tune for several weeks: if this power rumor is true (the 3-slot monster suggests it is), AMD doesn't stand a chance of competing with the non-Titan flagship. Unless this silicon is completely borked, how does a new architecture and a die shrink at 300 W+ compare against a new arch on a tweaked process? Remember, the 5700 XT was 45% slower than a 2080 Ti. If Ampere is 50% faster, then AMD needs to be roughly 170% faster than the 5700 XT to compete, since the gains compound. We haven't seen a card come close, from any camp, ever. That said, maybe RTRT performance is where the big increase is... who knows.

So long as AMD's card lands between them and is notably cheaper, it will be a win for everyone. But I just don't think the RDNA2 flagship will be within 15%.
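For what it's worth, the gap compounds. Under the leaked and estimated numbers quoted in this thread (not benchmarks), a quick sketch of the required uplift looks like this:

```python
# Normalize everything to the 2080 Ti = 1.0.
rx_5700xt = 0.55        # "45% slower than a 2080 Ti"
ampere    = 1.0 * 1.50  # hypothetical "+50% over the 2080 Ti"

# How much faster than the 5700 XT must an RDNA2 flagship be to match Ampere?
required_gain = ampere / rx_5700xt - 1.0
print(round(required_gain, 2))  # 1.73 -> about +173% over the 5700 XT
```

In other words, a 45% deficit plus a 50% uplift on the other side multiplies out to well over a doubling, not a simple sum of the two percentages.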
 
Nvidia knows that over the next 12-18 months PC gaming hardware will take a hit; it always does at the start of a new console cycle. As such, they are going to try to cream as much margin off the top as they can.
 
Stupid price, stupid size, stupid power consumption. Pathetically desperate effort from Nvidia to cling to the performance crown no matter what. Again so much time devoted to an ultra niche product only 0.01% of gamers will actually buy despite all the keyboard warriors claiming otherwise.
Desperate? High-end is what moves the market forward.

And get your facts straight: Nvidia's current top model, the RTX 2080 Ti, holds 0.88% of the Steam user base; that's slightly more than, e.g., the RX 5700 XT.

Neither price nor power consumption is confirmed at this point.
 
It would seem a lot of people in this thread are confusing the RTX 3080 leaked a few weeks ago with this new 3090.

3080 leak from July:
1598184321629.png


3090 leak from Friday:
1598184395183.png


No, these are not the same card. Please stop posting leaked images of the 3080 and using it to claim things about the 3090.
 
That beast will not be going into my Dell T3500; it literally won't fit physically. I'm sure it would work, though. Might be time to build a new system... I've been dragging my feet for almost a year anyway. I've settled on Threadripper; just trying to decide which one.
How about getting a proper case for $60 if you're throwing $1,000+ at a GPU?
 
How can it be justified by performance? That is nonsense.
The GTX 1070, for example, at a launch price of $380, gave the same performance as the 980 Ti, if not more, and the 980 Ti had a launch price of $650. It was even on par with the Titan X, which in 2015 had a launch price of $1,000. That was progress. It was also only $50 more than the GTX 970, which it beat by 50% to 60% in various scenarios.

The RTX 2070, on the other hand, was only 30% better than the GTX 1070 and had a launch price of $500 (the FE was even $600, I think). So it provided 30% more performance at a price increase of 31+%. That is no progress; that is a purely linear upgrade with more money being paid.
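Running the launch prices and performance deltas quoted above through the same performance-per-dollar lens (the commenter's figures, not measurements) makes the contrast stark:

```python
def value_ratio(new_price: float, old_price: float, perf_gain: float) -> float:
    """Performance-per-dollar of the newer card relative to the older one."""
    return (1 + perf_gain) * old_price / new_price

# GTX 1070 ($380) vs GTX 970 ($330), +55% performance (midpoint of the
# 50-60% range above): clear value progress.
print(round(value_ratio(380, 330, 0.55), 2))  # 1.35

# RTX 2070 ($500) vs GTX 1070 ($380), +30% performance: value flat.
print(round(value_ratio(500, 380, 0.30), 2))  # 0.99
```

A ratio near 1.0 means you are paying proportionally more for proportionally more performance, i.e. no generational value gain at all.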

They already screwed up the game there. The 3000 series adds a cherry on top.
 
Nvidia should just put the GPU on the front and the CPU on the back of the PCB, and voila, no case needed, just a stand.
 
It would seem a lot of people in this thread are confusing the RTX 3080 leaked a few weeks ago with this new 3090.

3080 leak from July:
View attachment 166438

3090 leak from Friday:
View attachment 166439

No, these are not the same card. Please stop posting leaked images of the 3080 and using it to claim things about the 3090.

Lol, you are asking people to stop speculating in a thread that is entirely speculation. Why? Nobody is claiming anything; there is literally the word "potentially" in the title. Relax and don't take it too seriously.
 
I'll go back to sailing my boat while you guys talk it out.
 
This card is going to be workstation-oriented. I think it's just too big for most systems, plus those power requirements are going to be insane. I don't think all power supplies will play nice with this monster. I'll wait for the 3050 or 3060 series and see what they offer then; something a little more reasonable.

It's cards like this that allow some crazy gaming, don't get me wrong. 4K gaming is wicked. But we've seen where 4K gaming is headed at a dev level with the Xbox and PS5: pseudo-4K is the current future. The 3090 will likely offer a full 4K experience, but at an unreasonable cost. RDNA2 will probably win with cross-platform optimizations on a more pedestrian, market-accessible product that comes in two consoles or your choice of AIB GPUs.
 
I guess I'm keeping my 1080 Ti or upgrading to a 2080 Ti, 'cause nothing else will fit in this N1Case.

It's not like there won't be normal-size cards at a better price to match the 2080 Ti (i.e. a 3070). That's what I'm expecting to put in my Ncase 6.1 too, instead of my 1080, which is still doing fine.
 