
NVIDIA GeForce RTX 3080 Ti Founders Edition

Surely our @W1zzard tested a different scene in the game, but the difference between the 3080 and 6900 XT in his review is 0%, compared to 13% in the HU review.
Probably the test scene. Either way it's not worth spending the money. Send me a 5950X if you want, I'll use it

Is there something wrong with the Cyberpunk 2077 ray-tracing results? The RX 6000 series non-RT vs RT results are the same...
Fixing. The 6000 series shouldn't be included here; I haven't retested them with RT on yet, because I'm locking in a specific patch version for Cyberpunk until I do a full retest
 
The vast majority of consumers aren't going to tweak power limits. IMO it's frankly annoying to have another program running in the background and another source of potential issues.

That's not a problem customers should have to solve, either. This is just like AMD users claiming Vega is power efficient once you undervolt. That's great and all, but it doesn't mean squat to the vast majority of users. Companies should ship products that hit their target markets out of the box. Customers should not have to fiddle with products after the fact; that's for enthusiasts, if they want to spend the extra effort.
One defence in AMD's favour is that they offered not one, but three solutions to that:

1. A hardware switch on all Vega models that toggled between a normal and a quiet BIOS with a lower TDP
2. Set-and-forget presets in the drivers that required no undervolting experience - power saver, default, turbo
3. A full undervolting suite built into the drivers, with arguably more tuning options than even MSI Afterburner - so no annoying extra program running in the background.

So AMD's defaults were "hot and loud", but even people who didn't want to tinker had two viable options for far greater power efficiency. I've been saying it for years: Nvidia needs to offer basic tuning and fan curve control in their drivers. Afterburner is a necessary evil that looks like it belongs in the Windows XP era.
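Since the thread keeps coming back to power limits requiring a separate background app, here is a minimal sketch of doing it from a one-shot script instead, assuming the nvidia-ml-py (pynvml) NVML bindings are installed; the 10% reduction is an arbitrary illustration, not a recommendation, and setting the limit needs admin/root rights.

```python
# Minimal power-limit tuning sketch via NVML, assuming the nvidia-ml-py
# (pynvml) package. Illustrative only; not a replacement for proper
# driver-level tuning controls.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# NVML reports power limits in milliwatts.
current = pynvml.nvmlDeviceGetPowerManagementLimit(handle)
lo, hi = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
print(f"limit: {current / 1000:.0f} W (allowed {lo / 1000:.0f}-{hi / 1000:.0f} W)")

# Drop the limit by ~10% (arbitrary example), clamped to the allowed range.
# Requires elevated privileges; raises an NVMLError otherwise.
target = max(lo, int(current * 0.9))
pynvml.nvmlDeviceSetPowerManagementLimit(handle, target)

pynvml.nvmlShutdown()
```

Run once at startup and nothing stays resident, which sidesteps the "another program in the background" complaint above.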
 
Just as much as I thought, sadly: not much extra performance for a shedload more cash. I mean, $700 to $1200?? Nah...

Don't flame me here, but I think I'd rather take the 6900 XT over the 3080 Ti at this point, just on the numbers. Those new AMD cards are utterly amazing - sold out everywhere or stupidly priced, but still...

And the memory temps as well, and that's just in-game; I'd hate to see what they'll be like when people get these cards mining... You know it'll happen, and it's just a sad fact that it will... Oh well :) I'll stick with the cards I've got... I'm happy regardless! :D
 
One defence in AMD's favour is that they offered not one, but three solutions to that:

1. A hardware switch on all Vega models that toggled between a normal and a quiet BIOS with a lower TDP
2. Set-and-forget presets in the drivers that required no undervolting experience - power saver, default, turbo
3. A full undervolting suite built into the drivers, with arguably more tuning options than even MSI Afterburner - so no annoying extra program running in the background.

So AMD's defaults were "hot and loud", but even people who didn't want to tinker had two viable options for far greater power efficiency. I've been saying it for years: Nvidia needs to offer basic tuning and fan curve control in their drivers. Afterburner is a necessary evil that looks like it belongs in the Windows XP era.

Yes, in that regard AMD is a lot better.
 
An engine designed around Jaguar cores isn't going to utilize 32 threads
Dude, watch any YT video with RivaTuner running on a 5950X and a 3090:
[attachment 202648: RivaTuner overlay screenshot]
Numbers don't lie. And it isn't the only game that takes advantage of more than 16 threads when paired with the most powerful GPUs of today.

Probably the test scene. Either way it's not worth spending the money. Send me a 5950X if you want, I'll use it
@W1zzard, don't take this the wrong way. I deeply respect the work you've done all these years for us PC enthusiasts, so I don't dare tell you what to do with your test system. I'm just suggesting that for today's high-performance GPUs, you need more than an 8-core CPU to avoid bottlenecking them, no matter how fast it is. And as for the test scene, the bottleneck in your system seems to exist in most of the newest games (DX12 and Vulkan titles better utilise more than 16 threads, or more cache).
[attachments: benchmark screenshots]
 
Reviewing this unavailable series is not ethical anymore, let alone the "Fraudy Edition", which never really existed and was always unavailable.
 
One defence in AMD's favour is that they offered not one, but three solutions to that:

1. A hardware switch on all Vega models that toggled between a normal and a quiet BIOS with a lower TDP
2. Set-and-forget presets in the drivers that required no undervolting experience - power saver, default, turbo
3. A full undervolting suite built into the drivers, with arguably more tuning options than even MSI Afterburner - so no annoying extra program running in the background.

So AMD's defaults were "hot and loud", but even people who didn't want to tinker had two viable options for far greater power efficiency. I've been saying it for years: Nvidia needs to offer basic tuning and fan curve control in their drivers. Afterburner is a necessary evil that looks like it belongs in the Windows XP era.

Wasn't the fan curve tuning in AMD's software broken? And isn't Enhanced Sync also broken?

The problem with Vega was that Pascal was in another league in terms of efficiency:
[chart: performance per watt, 2560x1440]

Even if you set the Vega 64 to the power-save BIOS, the GTX 1080 is still 40% more efficient - simply an insurmountable difference.
 
And as for the test scene, the bottleneck in your system seems to exist in most of the newest games (DX12 and Vulkan better utilise more than 16-threads or more cache)
Where do you see the bottleneck in those two games? Just "higher FPS"? That means a lighter test scene, not a bottleneck. A bottleneck is when many results are bunched up against an invisible wall, like in DoS II in my tests, or Death Stranding, and it goes away at higher resolution
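To make the "invisible wall" idea concrete, here is a tiny sketch of that heuristic; the function name, the 3% spread threshold and the FPS numbers are all made up for illustration, not TPU's actual methodology.

```python
# Sketch of the "bunched against an invisible wall" heuristic described
# above: if the fastest cards all land within a few percent of each other,
# the scene is likely CPU-limited. Threshold and data are illustrative.
def looks_cpu_limited(fps_results, top_n=4, max_spread=0.03):
    top = sorted(fps_results, reverse=True)[:top_n]
    spread = (top[0] - top[-1]) / top[0]  # relative spread of the leaders
    return spread < max_spread

# Hypothetical 1080p chart: the top cards pile up near ~144 fps -> CPU wall.
print(looks_cpu_limited([144, 143, 142, 141, 118, 97]))  # True
# Same cards at 4K: results spread out once the GPU is the limiter.
print(looks_cpu_limited([88, 76, 68, 59, 50, 43]))       # False
```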
 
Where do you see the bottleneck in those two games? Just "higher FPS"? That means a lighter test scene, not a bottleneck. A bottleneck is when many results are bunched up against an invisible wall, like in DoS II in my tests, or Death Stranding, and it goes away at higher resolution
The difference between the RX 6900 XT and the RTX 3090 in Death Stranding @ 1080p is +7% in HU's review and -2% in yours. And in Watch Dogs: Legion, the 6900 XT wins by 14% in HU's test but only by 8% in yours. That seems like a clear sign of a bottleneck, imho. Open to other theories, though. I like to learn new things, especially from a pro like you. :lovetpu: :toast:
 
The difference between the RX 6900 XT and the RTX 3090 in Death Stranding @ 1080p is +7% in HU's review and -2% in yours. And in Watch Dogs: Legion, the 6900 XT wins by 14% in HU's test but only by 8% in yours. That seems like a clear sign of a bottleneck, imho. Open to other theories, though. I like to learn new things, especially from a pro like you. :lovetpu: :toast:
Maybe a different definition of bottleneck. I'm sure if you look, you can find other results showing the opposite. I'd say this is first and foremost test scene selection, then test platform differences like memory frequency and timings (they use 32 GB 3200CL14 dual-rank, we use 16 GB 4000CL19 single-rank), and then maybe CPU speed (5800X vs 5950X). I don't think one is more correct than the other, but certainly one will please you more than the other.

This is great - it gives you more data, so you can come to your own conclusions. If we all had the same benchmark results, they'd become meaningless, and we'd all be left having to entertain our audience in other ways to get their attention
 
Maybe a different definition of bottleneck. I'm sure if you look, you can find other results showing the opposite. I'd say this is first and foremost test scene selection, then test platform differences like memory frequency and timings (they use 32 GB 3200CL14 dual-rank, we use 16 GB 4000CL19 single-rank), and then maybe CPU speed (5800X vs 5950X). I don't think one is more correct than the other, but certainly one will please you more than the other.

This is great - it gives you more data, so you can come to your own conclusions. If we all had the same benchmark results, they'd become meaningless, and we'd all be left having to entertain our audience in other ways to get their attention
No problem at all, @W1zzard. I enjoyed our discussion. Just one more clue about HU's test results: they didn't use the SAM feature.
 
we'd all be left having to entertain our audience in other ways to get their attention
Like that time you drank ArctiClean?
 
Wasn't the fan curve tuning in AMD's software broken? And isn't Enhanced Sync also broken?

The problem with Vega was that Pascal was in another league in terms of efficiency:
[chart: performance per watt, 2560x1440]

Even if you set the Vega 64 to the power-save BIOS, the GTX 1080 is still 40% more efficient - simply an insurmountable difference.
Sounds like FUD to me. Perhaps on some partner models with custom boards, fan controllers and vBIOSes, but that's on the partners, and the same issues exist with Nvidia cards where Afterburner can't control the fan speeds properly, or at all.

I have experience with 50+ Vega and Polaris cards, most of them reference models, and I can't say I had a problem with fan speed tuning, other than a couple of partner models whose RPM in the driver didn't exactly match what GPU-Z reported - but you could still tune them faster or slower without problems, which is better than the absolute lack of control in Nvidia's control panel.

Not sure about Enhanced Sync; it seems like a shit idea anyway, so I never used it - why would you not just use FreeSync instead? If you're on an ancient 60 Hz monitor and can't hit 60 fps, it's going to be ugly whether you run vsync on, off, or enhanced.

As for Vega's efficiency, that's not the topic we're discussing, and even the driver/tuning stuff we're talking about is already pretty far OT.
 
I love the peanut gallery throwing out anecdotes about how PC gaming is totally dying and everyone is going to go buy consoles. Yeah, I can't get a GPU right now, so I'm gonna drop high-end GPU money on a console that can't do 4K60/1440p144 at ALL and can barely do 4K30/1440p60 (1080p60 for the PS5, since it can't even do 1440p, LMFAO), in a totally closed environment with no competition, stuck with joystick controls. :rolleyes:

Whatever you're smoking to come up with that argument, you can keep it, cause it's garbage.

Also, daily reminder that the 8800 Ultra launched for the equivalent of $1100 in 2007. Prices go up, prices go down. :roll: LOLCALMDOWN :roll:
That anomaly was because of a lack of competition from AMD, unlike now.
 
*Begins crying*
*Uses tears to fill my EK block and watercool the VRAM with an active backplate*

Yeah, it's a problem, and something the Ti should have resolved. They clearly just reused the existing cooling setups with zero changes.

Yup, I agree, man. Improving the memory cooling would go a long way toward making Ampere a better product overall. Even though AIB designs exist, even overbuilt cards like the Strix have some difficulty with that.

My TUF has been adequate on that front, and the assembly is superb; then again, I keep my computer obsessively clean and I'm using it in a case that has a small hurricane going on - a PC-O11 Air with 12 or so fans installed. Not exactly a quiet system, even though it's whisper quiet to anyone used to blower Vega cards (like me). I swear my old Frontier left my right ear a bit deaf :laugh:
 
A 5-10% performance increase over the 3080 with a >50% MSRP increase. Good job, Nvidia... good job.
They are eliminating the scalpers by scalping themselves.
This should be illegal by all means; they are exploiting a situation that is mainly their fault, and this is directed not only at NVIDIA but at AMD as well.
Let's say the realistic MSRP is $800 and they are making 25% profit - that's $200 per card.
With a $1200 MSRP on the same card they are making $200 + $400 = $600, so by selling one card they make the same profit as selling three cards - probably even more, as producing and distributing fewer cards costs less.
So I don't see any incentive for either NVIDIA or AMD to produce more cards, really, and this is where somebody should sue their asses off.
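For what it's worth, the margin math above works out as stated. A quick sketch using the commenter's own hypothetical figures (the $600 unit cost is implied by the post, not a real BOM number):

```python
# The commenter's hypothetical margin math, spelled out. All figures are
# assumptions from the post above, not actual cost data.
unit_cost = 600        # implied: $800 MSRP at 25% margin leaves $200 profit
realistic_msrp = 800
actual_msrp = 1200

profit_realistic = realistic_msrp - unit_cost   # $200 per card
profit_actual = actual_msrp - unit_cost         # $600 per card = $200 + $400

# One card at $1200 earns the profit of this many cards at $800:
print(profit_actual / profit_realistic)          # 3.0
```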
 
They are eliminating the scalpers by scalping themselves.
This should be illegal by all means; they are exploiting a situation that is mainly their fault, and this is directed not only at NVIDIA but at AMD as well.
Let's say the realistic MSRP is $800 and they are making 25% profit - that's $200 per card.
With a $1200 MSRP on the same card they are making $200 + $400 = $600, so by selling one card they make the same profit as selling three cards - probably even more, as producing and distributing fewer cards costs less.
So I don't see any incentive for either NVIDIA or AMD to produce more cards, really, and this is where somebody should sue their asses off.

The problem is that GPUs aren't essential goods; the market conditions (shipment/freight limitations and delays due to the ongoing pandemic, strained production capacity, overwhelming demand, and increased BOM costs from their suppliers, including TSMC, plus silicon quality binning - the 6900 XT XOC models with the Navi 21 XTXH ASIC come to mind) are favorable to a price hike, and the MSRP is defined by the manufacturer regardless of whether it's high or low. There's also the fact that when you buy a graphics card, you don't only buy the components that go into it - you also buy the ongoing driver development throughout its lifecycle (graphics KMD engineers and QA departments are expensive!) and foot the R&D costs for future generations and their surrounding features (even more expensive).

At the end of the day, AMD, NVIDIA and Intel all have something in common: they're all multi-billion-dollar corporations whose sole purpose is making money, and all of them have been guilty of anti-consumer shenanigans in recent times.
 
What a joke of a price. Double the MSRP for 2 GB of extra VRAM.

And as I suspected, the 3070 Ti is 8 GB of G6X. Makes me glad I went with the regular 3070, because my G6 isn't running at the temperature of the surface of Mercury.
 
So I don't see any incentive for either NVIDIA or AMD to produce more cards, really
Not true. If you produce more, you sell more and make more profit. Even if the price per unit goes down a bit and you sell more, you still make more profit.
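A quick numeric sketch of that counterpoint, with made-up volumes and a made-up price cut (none of these figures come from the thread):

```python
# Made-up numbers illustrating the reply above: even if higher supply
# pushes the selling price down, total profit grows with volume.
unit_cost = 600  # same hypothetical cost as the earlier margin math

# Scarce supply at scalper-level pricing:
print(10_000 * (1200 - unit_cost))   # $6.0M total profit

# Triple the supply at a lower price still earns more overall:
print(30_000 * (900 - unit_cost))    # $9.0M total profit
```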
 
Dang this card is a hungry boi.
 
They are eliminating the scalpers by scalping themselves.
This should be illegal by all means; they are exploiting a situation that is mainly their fault, and this is directed not only at NVIDIA but at AMD as well.
Let's say the realistic MSRP is $800 and they are making 25% profit - that's $200 per card.
With a $1200 MSRP on the same card they are making $200 + $400 = $600, so by selling one card they make the same profit as selling three cards - probably even more, as producing and distributing fewer cards costs less.
So I don't see any incentive for either NVIDIA or AMD to produce more cards, really, and this is where somebody should sue their asses off.
Hi,
Last I read, there was a tariff cost added to the cards. Maybe there are still tariffs on imports to the USA, but I would have thought the current administration would have stopped them by now. Guess not.
 
Not true. If you produce more, you sell more and make more profit. Even if the price per unit goes down a bit and you sell more, you still make more profit.
"NVIDIA (NASDAQ: NVDA) today reported record revenue for the first quarter ended May 2, 2021, of $5.66 billion, up 84 percent from a year earlier and up 13 percent from the previous quarter, with record revenue from the company's Gaming, Data Center and Professional Visualization platforms.May 26, 2021"
They nearly doubled their revenue during the "shortage".
Let's agree to disagree.
 
You misunderstood the argument the prior commenters were making.

The problem is the price increases of GPUs in general and the complete lack of any improvement in the budget market, not temporary pricing during the pandemic. The pandemic is a separate problem that inflates prices across the board.

The pandemic is not forever. What people are worried about is that even when it's gone, there will still be little room for budget options, and it won't change the fact that Nvidia is charging $1,200 for this card. Consoles, on the other hand, will return to their $500 MSRP.

Most people are aware of the drawbacks of consoles; you don't have to point that out. That said, at $500, if Nvidia / AMD completely fail to address the budget market, you can't really blame people for considering a console when Nvidia / AMD aren't even providing products most people can afford. PC elitists seem to forget that the PC market is held up mostly by the budget and midrange segments, where the vast majority of gamers reside. No amount of "Well, PC can do this..." will change the price. If a person can't afford it, they can't buy it; if they think it isn't worth it, they'll spend their money elsewhere.

Speaking of the 8800 Ultra:

"The 8800 Ultra, retailing at a higher price, is identical to the GTX architecturally, but features higher clocked shaders, core and memory. Nvidia later told the media the 8800 Ultra was a new stepping, creating less heat and therefore clocking higher. Originally retailing from $800 to $1000, most users thought the card to be a poor value, offering only 10% more performance than the GTX but costing hundreds of dollars more. Prices dropped to as low as $200 before being discontinued on January 23, 2008."

At the time that card released, it was roundly criticized by the press for being extremely poor value, and that was for a 10% gain at a 30% price increase. The 3080 Ti is a 7% increase for 70% more money. I'm glad you brought it up, because it objectively shows what piss-poor value the 3080 Ti is, even compared to more extreme examples. Mind you, that was still a single overpriced card; Nvidia has been increasing the ASP across their entire GPU stack, not just on a single model.
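Put as a quick perf-per-dollar calculation, using the percentages from the post above ("relative value" here is just performance gained divided by price paid, an illustrative metric, not an established benchmark):

```python
# Performance-per-dollar of the premium card relative to its base model,
# using the percentages quoted above. Purely illustrative arithmetic.
def relative_value(perf_gain, price_increase):
    return (1 + perf_gain) / (1 + price_increase)

# 8800 Ultra vs 8800 GTX: +10% performance for +30% price -> ~0.85x value
print(round(relative_value(0.10, 0.30), 2))
# 3080 Ti vs 3080: +7% performance for +70% price -> ~0.63x value
print(round(relative_value(0.07, 0.70), 2))
```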
People complaining about high GPU prices pre-pandemic seem to have utterly forgotten that this past decade we had:

The competitive HD 7000 series bringing HD gaming performance to the $100 segment
The 290X retailing for under $300 for nearly a year
The RX 480/580/470/570/560 being fantastic budget cards for three years running, well into 2019.

And on the high end, you had:
The $650 980 Ti that was an OC beast
The $700 1080 Ti that became a high-end champion for nearly half a decade
The 3080 launching at just $700 while obliterating 2080 Ti performance
The 6800 XT launching at $650 with more VRAM and better non-RT performance.

The elevated MSRPs and lack of budget cards are a VERY RECENT development that has largely occurred during the pandemic. Thinking these price increases are permanent is hilariously short-sighted. People were forecasting the doom of the PC gaming industry in 2007 too, and within two years the Evergreen GPU lineup from AMD smashed predictions for what could be had on a budget and forced Nvidia's hand, giving us the $500-flagship era.

As for "citation needed", have a review from anandtech for the launch of the 8800 ultra, instead of some garbage wikipedia entry:


No one is forcing you to buy these expensive cards, and their presence will not stop budget cards from existing, any more than the existence of the Bugatti Veyron means you can't buy a Honda Civic. With modern process nodes running at capacity, the priority is going to be on higher-end cards until supply improves and prices drop. Not to mention the continuing supply issues from the pandemic, from labor shortages to electronics supply restrictions: the parts used to make the fans, the capacitors on the cards, etc. are all in short supply.

And no amount of "well, you can't get budget cards right now, but these consoles are only $500" is going to convince people to A. abandon the library of PC games they have already acquired and B. invest in a closed platform that appeals to a different market, especially when C. those consoles are in JUST AS SHORT SUPPLY as GPUs are. Good luck buying a console right now for anywhere near MSRP.

Budget cards will return. Supply will normalize eventually. IDK why people are so insistent that everything has changed permanently forever when the pandemic isn't even over yet and its effects will continue for some time afterwards. Just wait. Go outside and take a break from the PC, or play the literal decades of backwards-compatible games that will run on toaster GPUs. You may find you don't need a high-end GPU to enjoy gaming.
 
The 3080 launching at just $700 while obliterating 2080 Ti performance
Just $700 BECAUSE the 2080 Ti was ridiculously overpriced - that's an illusion. $700 is still overpriced.
The 6800 XT was at $650 BECAUSE the 3080 was at $700 and had DLSS.

Anything from TURING ("turding") has been disgustingly overpriced. You should never EVER take any Turing product as a baseline for comparisons.
 