
NVIDIA GeForce RTX 3080 Ti Founders Edition

Pascal was the last great generation from Nvidia: the 1060 offered unbelievable value for 1080p gaming, the 1070 gave you 980 Ti performance at far less power, the 1080 was great for 1440p, and how can you forget the 1080 Ti, a card so good it blew the industry away, armed with 11 GB of GDDR5X.

Ever since the RTX 2000 series, Nvidia has made no real improvements in performance or power efficiency, just focusing on ray tracing and DLSS. RTX 3000 is even worse: abysmal power efficiency (up to 500 W on some RTX 3090 cards!), lackluster VRAM (aside from the 3060 and 3090), overheating GDDR6X memory, no stock, and pointless SKUs like this 3080 Ti... WTF is going on at Nvidia?!

Just when high-resolution, high-refresh-rate gaming started to become a reality for everyone, Nvidia went full L with the RTX 2000 series. No one wants ray tracing; we want 4K 144 fps gaming. Look how the RTX 3060 promises RTX 2060 Super performance at 170 watts, the 2060 Super gave you GTX 1080 performance at 190 watts, and the GTX 1080 was a 180-watt GPU! NO REAL POWER EFFICIENCY IMPROVEMENTS SINCE 2016, AND THEY CHARGE YOU MORE!

It's maybe not all bad... According to RedGamingTech leaks, Intel's Xe / DG2 dGPU program is shaping up quite well (3070-3080 level of performance) and, more importantly, Raja has allegedly got Intel's support to primarily target the $200-300 mainstream market. Plus no AIB partners, and strict distributor and retailer pricing policies in place, just like with its CPUs. The 2022 dGPU market might look much, much better if the mining craze ends. Granted, RDNA3 and Ampere's successor will be a tier above Intel's offerings performance-wise, but hey, I'll gladly buy a 3070-3080 level GPU for 300 bucks if it has half-decent drivers, instead of what Ngreedia & AMD will try to charge for their new dGPU lineups.
 
The only way we can solve the problem is going back to 360p gaming and sticking with integrated graphics. Just sit closer to the screen so it feels like a 27"/34" panel... for the wide-screeners, just squint your eyes. As long as we have "huge" graphical fidelity ambitions, we will always be robbed by profiteering Ngreed'ism/co. I even regretted purchasing a 1080 Ti initially for £600/£700. Once up and running, it didn't feel like my money's worth, although it did turn out to be a nice investment eventually.
 
I wonder if W1zzard has to get additional insurance with all these cards in his studio????
 
What's the point? Nvidia could package up toenail shavings for a couple thousand and countless idiots would buy it, jacking up the prices for everyone else (for graphics cards, to be clear, not toenails).

I liked PC gaming before the middle-class kids and casuals got interested in it about 5 or 6 years ago. Now they all want the best graphics cards, so they save up a whole month's worth of their McDonald's counter salary to buy one. These people don't have bills or kids.
 
What's the point? Nvidia could package up toenail shavings for a couple thousand and countless idiots would buy it, jacking up the prices for everyone else (for graphics cards, to be clear, not toenails).

I liked PC gaming before the middle-class kids and casuals got interested in it about 5 or 6 years ago. Now they all want the best graphics cards, so they save up a whole month's worth of their McDonald's counter salary to buy one. These people don't have bills or kids.
Wow, so many stupid assumptions.
 
@W1zzard

Thanks for the review!

Is there something wrong with the Cyberpunk 2077 ray-tracing results? The RX 6000 series non-RT vs RT results are the same...

 
Why is everybody so concerned with NVIDIA's MSRP? It's just a meaningless number. They probably didn't want to lower the x80 Ti MSRP compared to the 2080 Ti, so they picked $1,200 to not look bad when they announce the 4080 Ti.

You will not be able to buy the 3080 Ti at that price, probably ever. As much as that sucks for all of us, that's what will happen. Look at what the card offers compared to what's available, and at what price, and make a decision based on that. I tried to go through some options and examples in my conclusion.

Eh, the 2080 Ti was significantly overpriced at $1,200, and its performance wasn't all that impressive. That's one of the key reasons I waited and picked one up used on eBay for just a bit above half its MSRP (with a pre-installed waterblock to boot).

As you mentioned in a later post, though, people do need to stop buying this stuff at these price points or it won't ever go down.
 
PC gaming isn't going away. It just needs a temporary adjustment in our thinking. People want to downplay the conditions wrought by the pandemic, and we have a hell of a mess to clean up going forward, but it will happen.
 
You're giving it a Thumbs up/Pro for being 8nm? As opposed to what?

Personal opinion, but I believe the Samsung 8 nm process isn't the reason these GPUs are so "power inefficient" (as in, hungry). This is most noticeable on the RTX 3070; I would go as far as saying it's quite lean on power for the awesome amount of performance it provides, and I'm quite eager to see w1zz's review and the impact of GDDR6X on the 3070 Ti's power consumption and frametime stability, given that for all we know from the rumor mills, it is coming with a power limit similar to the vanilla 3070's. Being on this node is also positive for yield, as it doesn't have to compete with the numerous other products and orders that require TSMC 7 nm capacity, like AMD's entire product stack. "nm" is just marketing anyway; the actual transistor pitch isn't that small.

The biggest issue to me, so far, is the GDDR6X; it consumes an absolutely insane amount of power. This was measured in the middle of a 3DMark Time Spy Extreme run: even at 61% memory controller load, the MVDDC (memory subsystem) rail is pulling 115 W of the 375 W budget my card has... and there are games and workloads that demand more out of it.



I must say, AMD's Infinity Cache solution to the bandwidth problem is simply ingenious, and downright elegant compared to using hot and hungry PAM4 memory.
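For perspective, here's the back-of-the-envelope split of that reading in a few lines of Python (the 115 W and 375 W figures come from my sensor readout above; nothing else is assumed):

```python
# Back-of-the-envelope split of the reading above:
# 115 W on the MVDDC (memory) rail against a 375 W board power limit.
board_limit_w = 375   # this card's total board power limit
mvddc_w = 115         # GDDR6X + memory controller rail during Time Spy Extreme

mem_share = mvddc_w / board_limit_w
print(f"Memory subsystem share of the budget: {mem_share:.0%}")       # ~31%
print(f"Left for core, fans and losses: {board_limit_w - mvddc_w} W")  # 260 W
```

Roughly a third of the card's entire power budget going to the memory subsystem, before the core even gets fed.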
 
I love the peanut gallery throwing out anecdotes about how PC gaming is totally dying and everyone is going to go buy consoles. Yeah, I can't get a GPU right now, so I'm gonna drop high-end GPU money on a console that can't do 4K60/1440p144 at ALL and can barely do 4K30/1440p60 (1080p60 for the PS5, since it can't even do 1440p, LMFAO), with a totally closed environment, no competition, and joystick controls. :rolleyes:

Whatever you're smoking to come up with that argument, you can keep it, because it's garbage.

Also, daily reminder that the 8800 Ultra launched for the equivalent of about $1,100 back in 2007. Prices go up, prices go down. :roll: LOLCALMDOWN :roll:
 
On the circuit board analysis page it would be nice to have some notes explaining the differences between the 3080 FE, 3080 Ti FE, and 3090 FE - the chokes, VRMs, and memory modules.
Since the cards are so similar, something as simple as "the 3080 has X memory modules, the Ti has two more, and the 90 has them doubled onto the back of the PCB" would be really informative to those reading this first, without the background knowledge.

According to the HWiNFO developer, GDDR6X modules are rated to throttle at around 110°C. They're toasty and consume a lot of power; any 3090 owner will attest to that :D
*Begins crying*
*Uses tears to fill my EK block and watercool the VRAM with an active backplate*


Yeah, it's a problem, and something the Ti should have resolved. They clearly just reused the existing cooling setup with zero changes.
 
Personal opinion, but I believe the Samsung 8 nm process isn't the reason these GPUs are so "power inefficient" (as in, hungry). This is most noticeable on the RTX 3070; I would go as far as saying it's quite lean on power for the awesome amount of performance it provides, and I'm quite eager to see w1zz's review and the impact of GDDR6X on the 3070 Ti's power consumption and frametime stability, given that for all we know from the rumor mills, it is coming with a power limit similar to the vanilla 3070's. Being on this node is also positive for yield, as it doesn't have to compete with the numerous other products and orders that require TSMC 7 nm capacity, like AMD's entire product stack. "nm" is just marketing anyway; the actual transistor pitch isn't that small.

The biggest issue to me, so far, is the GDDR6X; it consumes an absolutely insane amount of power. This was measured in the middle of a 3DMark Time Spy Extreme run: even at 61% memory controller load, the MVDDC (memory subsystem) rail is pulling 115 W of the 375 W budget my card has... and there are games and workloads that demand more out of it.


I must say, AMD's Infinity Cache solution to the bandwidth problem is simply ingenious, and downright elegant compared to using hot and hungry PAM4 memory.

Agree 100%. Even on my 3080 I see the memory hogging power sometimes. You can undervolt the core by a good amount on Ampere and get power consumption down, but the memory will still hog power, to the point that you can sometimes see the memory drawing more than the core in games that aren't loading the core much. AMD's solution was, as you said, pretty great in that regard.
 
I love the peanut gallery throwing out anecdotes about how PC gaming is totally dying and everyone is going to go buy consoles. Yeah, I can't get a GPU right now, so I'm gonna drop high-end GPU money on a console that can't do 4K60/1440p144 at ALL and can barely do 4K30/1440p60 (1080p60 for the PS5, since it can't even do 1440p, LMFAO), with a totally closed environment, no competition, and joystick controls. :rolleyes:

Whatever you're smoking to come up with that argument, you can keep it, because it's garbage.

Also, daily reminder that the 8800 Ultra launched for the equivalent of about $1,100 back in 2007. Prices go up, prices go down. :roll: LOLCALMDOWN :roll:

You misunderstood the argument prior commenters were making.

The problem is the pricing increases of GPUs in general and the complete lack of any improvements in the budget market, not temporary pricing during the pandemic. The pandemic is a separate problem that inflates prices across the board.

The pandemic is not forever. What people are worried about is that even when it's gone, that still leaves little room for budget options, and it won't change the fact that Nvidia is charging $1,200 for this card. Consoles, on the other hand, will return to their $500 MSRP.

Most people are aware of the drawbacks of consoles; you don't have to point them out. That said, at $500, if Nvidia / AMD completely fail to address the budget market, you can't really blame people for considering a console when Nvidia / AMD aren't even providing products most people can afford. PC elitists seem to forget that the PC market is held up mostly by the budget and midrange segments, where the vast majority of gamers reside. No amount of "well, PC can do this..." will change the price. If a person can't afford it, they can't buy it; if they think it isn't worth it, they will spend their money elsewhere.

Speaking of the 8800 Ultra:

"The 8800 Ultra, retailing at a higher price,[clarification needed] is identical to the GTX architecturally, but features higher clocked shaders, core and memory. Nvidia later[when?] told the media the 8800 Ultra was a new stepping,[clarification needed] creating less heat[clarification needed] therefore clocking higher. Originally retailing from $800 to $1000, most users thought the card to be a poor value, offering only 10% more performance than the GTX but costing hundreds of dollars more. Prices dropped to as low as $200 before being discontinued on January 23, 2008."


At the time that card released, it was roundly criticized by the press for being extremely poor value, and that was for a 10% gain at a 30% price increase. The 3080 Ti is a 7% increase for 70% more money. I'm glad you brought it up, because it objectively shows how piss-poor the 3080 Ti's value is even compared to more extreme examples. Mind you, that was still a single overpriced card; Nvidia has been increasing the ASP across its entire GPU stack, not just a single model.
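Quick sanity check on those two comparisons, using the rough percentages quoted above rather than exact review numbers:

```python
# Perf-per-dollar of the halo card relative to the card below it,
# using the rough figures from the post above (not exact review numbers).
def relative_value(perf_gain: float, price_gain: float) -> float:
    """How much perf/$ the halo card keeps vs its cheaper sibling (1.0 = equal)."""
    return (1 + perf_gain) / (1 + price_gain)

print(f"8800 Ultra vs 8800 GTX: {relative_value(0.10, 0.30):.2f}x the perf/$")  # ~0.85x
print(f"3080 Ti vs 3080:        {relative_value(0.07, 0.70):.2f}x the perf/$")  # ~0.63x
```

So even the infamous 8800 Ultra kept about 85% of the GTX's value per dollar, while the 3080 Ti keeps only about 63% of the 3080's.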
 
Pascal was the last great generation from Nvidia: the 1060 offered unbelievable value for 1080p gaming, the 1070 gave you 980 Ti performance at far less power, the 1080 was great for 1440p, and how can you forget the 1080 Ti, a card so good it blew the industry away, armed with 11 GB of GDDR5X.

Ever since the RTX 2000 series, Nvidia has made no real improvements in performance or power efficiency, just focusing on ray tracing and DLSS. RTX 3000 is even worse: abysmal power efficiency (up to 500 W on some RTX 3090 cards!), lackluster VRAM (aside from the 3060 and 3090), overheating GDDR6X memory, no stock, and pointless SKUs like this 3080 Ti... WTF is going on at Nvidia?!

Just when high-resolution, high-refresh-rate gaming started to become a reality for everyone, Nvidia went full L with the RTX 2000 series. No one wants ray tracing; we want 4K 144 fps gaming. Look how the RTX 3060 promises RTX 2060 Super performance at 170 watts, the 2060 Super gave you GTX 1080 performance at 190 watts, and the GTX 1080 was a 180-watt GPU! NO REAL POWER EFFICIENCY IMPROVEMENTS SINCE 2016, AND THEY CHARGE YOU MORE!
I can still remember clearly when consumers and the media were amazed that the GTX 1080 only needed a single 8-pin to deliver flagship performance. Even the GTX 1080 Ti with 8+6-pin was considered power hungry at the time. I thought we were heading in a good direction with the 20, 30, and 40 series and beyond in terms of power efficiency; apparently not :(

The 1060 was a phenomenal card; Nvidia will not be able to beat it with today's ever-increasing MSRPs. The xx60 will reach the xx80's price in the near future, as @RedelZaVedno said here:
It's not meaningless in the long run... just look at the direction MSRPs are headed: GTX 680 = $499 / 780 Ti = $699 / 980 Ti = $649 / 1080 Ti = $699 / 2080 Ti = $999 / 3080 Ti = $1,199... a 140% price hike in 9 years (vs. roughly 18% inflation over the same period). Elevated MSRPs are here to stay even after the mining craze ends.
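Rough math on that MSRP progression, taking the quoted $499 and $1,199 MSRPs and the ~18% cumulative inflation figure at face value:

```python
# Nominal vs inflation-adjusted creep of the x80-Ti-class MSRP,
# taking the quoted MSRPs and ~18% cumulative inflation at face value.
msrp_2012 = 499        # GTX 680
msrp_2021 = 1199       # RTX 3080 Ti
inflation = 0.18       # rough cumulative inflation over the period, per the quote

nominal_hike = msrp_2021 / msrp_2012 - 1
real_hike = msrp_2021 / (msrp_2012 * (1 + inflation)) - 1
print(f"Nominal increase: {nominal_hike:.0%}")          # ~140%
print(f"Inflation-adjusted increase: {real_hike:.0%}")  # ~104%
```

Even after stripping out inflation, the top-end MSRP has roughly doubled in real terms.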
 
I can still remember clearly when consumers and the media were amazed that the GTX 1080 only needed a single 8-pin to deliver flagship performance. Even the GTX 1080 Ti with 8+6-pin was considered power hungry at the time. I thought we were heading in a good direction with the 20, 30, and 40 series and beyond in terms of power efficiency; apparently not :(

The 1060 was a phenomenal card; Nvidia will not be able to beat it with today's ever-increasing MSRPs. The xx60 will reach the xx80's price in the near future, as @RedelZaVedno said here:

Who cares about maximum power consumption when you can tweak the power limit to your liking? Reducing power consumption will increase efficiency, as demonstrated by the mobile GPUs.

In fact, you should be thankful that Nvidia/AMD keep increasing the maximum power limits on their desktop GPUs, because they have to design better VRMs to accommodate the higher limits, and a better VRM means higher VRM efficiency. Say you had a 6-phase VRM with 20 W of power loss at 150 W TGP before; now you have a 10+ phase VRM with only 10 W of loss at the same 150 W TGP.
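Rough math on that hypothetical (the 20 W / 10 W loss figures above are just illustrative, not measurements):

```python
# VRM efficiency for the illustrative figures above:
# same 150 W delivered to the GPU, different conversion losses.
def vrm_efficiency(delivered_w: float, loss_w: float) -> float:
    # Power drawn from the 12 V input = power delivered + conversion loss
    return delivered_w / (delivered_w + loss_w)

print(f"6-phase,   20 W loss: {vrm_efficiency(150, 20):.1%}")  # ~88.2%
print(f"10+ phase, 10 W loss: {vrm_efficiency(150, 10):.1%}")  # ~93.8%
```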

The 1660 Super was a super fine GPU at $230.
 
Great review as usual.

Disappointing that the extra $500 doesn't even get you better thermal pads than the 3080. That card at MSRP was exciting; this one, not so much. Same number of VRM phases as well, although they repositioned one? Looks like very, very limited availability for the FE too. I guess that was to be expected, but this time around it seems even lower, with Best Buy only selling in person at a limited number of stores.
 
And here is why the use of a 5800X is a bottleneck for top-of-the-line GPU reviews nowadays, since some games properly utilise more threads.

Surely our @W1zzard tested somewhere else in the game, but the difference between the 3080 and 6900 XT in his review is 0%, compared to 13% in the HU review.
 
Who cares about maximum power consumption when you can tweak the power limit to your liking? Reducing power consumption will increase efficiency, as demonstrated by the mobile GPUs.

In fact, you should be thankful that Nvidia/AMD keep increasing the maximum power limits on their desktop GPUs, because they have to design better VRMs to accommodate the higher limits, and a better VRM means higher VRM efficiency. Say you had a 6-phase VRM with 20 W of power loss at 150 W TGP before; now you have a 10+ phase VRM with only 10 W of loss at the same 150 W TGP.

The 1660 Super was a super fine GPU at $230.

The vast majority of consumers aren't going to tweak power limits. IMO it's frankly annoying to have another program running in the background and another source of potential issues.

That's not a problem customers should have to solve, either. This is just like AMD users claiming Vega is power efficient once you undervolt. That's great and all, but it doesn't mean squat to the vast majority of users. Companies should ship products that hit their target markets out of the box. Customers should not have to fiddle with products after the fact; that's for enthusiasts, if they want to spend the extra effort.
 
Is anyone here on this forum a reseller and can actually get these cards at MSRP?
 
Personal opinion, but I believe the Samsung 8 nm process isn't the reason these GPUs are so "power inefficient" (as in, hungry). This is most noticeable on the RTX 3070; I would go as far as saying it's quite lean on power for the awesome amount of performance it provides, and I'm quite eager to see w1zz's review and the impact of GDDR6X on the 3070 Ti's power consumption and frametime stability, given that for all we know from the rumor mills, it is coming with a power limit similar to the vanilla 3070's. Being on this node is also positive for yield, as it doesn't have to compete with the numerous other products and orders that require TSMC 7 nm capacity, like AMD's entire product stack. "nm" is just marketing anyway; the actual transistor pitch isn't that small.

The biggest issue to me, so far, is the GDDR6X; it consumes an absolutely insane amount of power. This was measured in the middle of a 3DMark Time Spy Extreme run: even at 61% memory controller load, the MVDDC (memory subsystem) rail is pulling 115 W of the 375 W budget my card has... and there are games and workloads that demand more out of it.


I must say, AMD's Infinity Cache solution to the bandwidth problem is simply ingenious, and downright elegant compared to using hot and hungry PAM4 memory.
I feel GDDR6X is a stop-gap solution until a faster GDDR standard arrives, much like GDDR5X, which never had a future beyond Nvidia's Pascal. As a result of pushing such high clock speeds compared to GDDR6, a lot of power is required. I wasn't sure GDDR6X uses that much power until I noticed the TGP of the RTX 3070 vs the 3070 Ti, and in that case it's only 8x 1 GB GDDR6X modules. When you have 10, 12, or 24 hot and power-hungry modules on board, the power requirement increases drastically. And I do agree that AMD's Infinity Cache is a great way to work around this power requirement and still achieve comparable or better memory bandwidth.
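For what it's worth, a quick look at the official TGPs makes the point (220 W for the 3070 vs 290 W for the 3070 Ti per Nvidia's specs; how much of the gap is GDDR6X versus the extra shaders and clocks is a guess, not a measurement):

```python
# Official TGPs: RTX 3070 (GDDR6) vs RTX 3070 Ti (GDDR6X, 8x 1 GB modules).
# How the extra power splits between memory and the slightly bigger core
# is a guess, not a measurement.
tgp_3070_w = 220
tgp_3070_ti_w = 290
modules = 8

delta_w = tgp_3070_ti_w - tgp_3070_w
print(f"TGP delta: {delta_w} W")
print(f"Upper bound per GDDR6X module if memory took all of it: {delta_w / modules:.1f} W")
```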

As for Samsung's 8 nm, while it is certainly more efficient than what it's replacing, I don't necessarily think it's good. It's been shown that Samsung's 7 nm is not as good as TSMC's 7 nm, not to mention this supposed 8 nm is basically Samsung's refined 10 nm. Most of these RTX 30-series cards run at a fairly conservative clock speed, i.e. around 1.8 GHz, to keep power consumption in check. You can push further into the 1.9 GHz range, but that generally requires a +15% power limit. The saving grace here is probably Nvidia's Ampere architecture with ample memory bandwidth, and less so the 8 nm Samsung node, in my opinion.

The vast majority of consumers aren't going to tweak power limits. IMO it's frankly annoying to have another program running in the background and another source of potential issues.

That's not a problem customers should have to solve, either. This is just like AMD users claiming Vega is power efficient once you undervolt. That's great and all, but it doesn't mean squat to the vast majority of users. Companies should ship products that hit their target markets out of the box. Customers should not have to fiddle with products after the fact; that's for enthusiasts, if they want to spend the extra effort.
Companies ship products that work. Therefore, they ship with settings they deem "safe" to make sure the product works according to spec. They can't possibly test every chip that comes in and provide a custom setting each time.

In my opinion, it's the people who are savvy that will figure out something is not right and will try to fix it, i.e. fiddle with the power limits, etc. People who are not savvy will probably live with it, since while it runs hot, it works.
 
And here is why the use of a 5800X is a bottleneck for top-of-the-line GPU reviews nowadays, since some games properly utilise more threads.
Surely our @W1zzard tested somewhere else in the game, but the difference between the 3080 and 6900 XT in his review is 0%, compared to 13% in the HU review.
Dude, watch any YT video with RivaTuner running on a 5950X and a 3090. An engine designed around Jaguar cores isn't going to utilize 32 threads.
 
Companies ship products that work. Therefore, they ship with settings they deem "safe" to make sure the product works according to spec. They can't possibly test every chip that comes in and provide a custom setting each time.

In my opinion, it's the people who are savvy that will figure out something is not right and will try to fix it, i.e. fiddle with the power limits, etc. People who are not savvy will probably live with it, since while it runs hot, it works.

This is simply not true, given that both AMD (for CPUs and GPUs) and Nvidia have dynamic boost features that give the end user extra performance depending on specific silicon quality and temperature. AMD's dynamic boosting for its CPUs in particular does an excellent job, to the point where manual tuning isn't needed and can actually yield less performance than the automatic boosting system.
 
The 3080 Ti and 3090 have their place... for 4K-5K gaming.

I will keep my 3080 till the 4070/4080 launches in late 2022, though, or wait for refreshes in 2023 if pricing and availability have not normalized by 2H 2022.

Or maybe I will consider a Radeon 7800 XT/8800 XT, if AMD can keep up the pace.
 