NVIDIA GeForce RTX 3080 Founders Edition

Consoles are not quite at the same MSRP everywhere either. For example, the recently announced Xbox Series X pricing is $499 / 499€ / £449 / ¥49,980. The Japanese price is reduced relative to the US because Japan is a difficult market for Xbox, and the EU/UK prices reflect included taxes.

There are a number of reasons why MSRP varies across regions and countries: taxes are a big one, and market positioning is another.
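Quick back-of-the-envelope on the tax point (my own numbers, assuming 20% UK VAT and roughly $1.29 to the pound around the announcement):

```python
# Does the UK Xbox Series X price line up with the US price once VAT is
# stripped out? (Assumed VAT and exchange rates; US MSRPs exclude sales tax.)
uk_price_inc_vat = 449.0   # GBP, includes 20% VAT
vat_rate = 0.20
usd_per_gbp = 1.29         # approximate rate, September 2020

uk_pre_tax_gbp = uk_price_inc_vat / (1 + vat_rate)   # ~374 GBP
uk_pre_tax_usd = uk_pre_tax_gbp * usd_per_gbp        # ~483 USD

print(f"UK pre-VAT: ~£{uk_pre_tax_gbp:.0f} (~${uk_pre_tax_usd:.0f}) vs $499 US MSRP")
```

So once you strip out VAT, the UK price lands within a few per cent of the US one.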

15-31%. The RTX 3080 is considerably more CPU-limited at 1440p.
Yeah, but on consoles the difference is slight; here we are talking about more than a 40% increase over the official MSRP.
 
You just need to shell out more for a silent PC.
 
...if those numbers show anything, it's that there are no CPU bottlenecks at 4K, given that it scales beyond the lower resolutions. Which is exactly what I was pointing out, while @birdie was claiming that this GPU is CPU-limited even at 4K:

Here you go, this is from the 3080 review:
[chart: Borderlands 3 @ 3840×2160]


Here is from the 10900K vs 3900XT comparison:
[chart: Borderlands 3 @ 3840×2160]


This game probably shows the biggest difference, about 10%; other games show that a 10900K or 3900XT can improve the 3080's FPS by a few per cent. Overall, the standard testing rig shows signs of a CPU bottleneck even at 4K.
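A crude way to picture this (a toy model with made-up caps, not review data): observed FPS is roughly the lower of what the CPU can feed and what the GPU can render, and on the standard rig those two caps sit close together at 4K:

```python
# Toy model: observed FPS is capped by whichever of CPU or GPU runs out first.
# All numbers below are hypothetical, purely for illustration.
def observed_fps(cpu_cap: float, gpu_cap: float) -> float:
    return min(cpu_cap, gpu_cap)

cpu_cap = 110.0  # frames/s the CPU can feed; roughly resolution-independent
gpu_caps = {"1080p": 180.0, "1440p": 140.0, "4K": 100.0}  # hypothetical GPU limits

for res, gpu_cap in gpu_caps.items():
    bound = "CPU" if cpu_cap < gpu_cap else "GPU"
    print(f"{res}: {observed_fps(cpu_cap, gpu_cap):.0f} fps ({bound}-bound)")
```

Reality blends more softly around the crossover than a hard min(), which is why a faster CPU can still buy a few per cent even in nominally GPU-bound 4K scenarios.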
 
Here you go, this is from the 3080 review:
[chart: Borderlands 3 @ 3840×2160]


Here is from the 10900K vs 3900XT comparison:
[chart: Borderlands 3 @ 3840×2160]


This game probably shows the biggest difference, about 10%; other games show that a 10900K or 3900XT can improve the 3080's FPS by a few per cent. Overall, the standard testing rig shows signs of a CPU bottleneck even at 4K.
Exactly! I am really not the person you need to be showing this to.

Well, I wouldn't completely put it down to unwillingness.
Getting a PC + display that can run all the bells-and-whistles settings you had at 1080p, but at 1440p @ 120+ or 2160p @ 60, without selling organs has only just started to become feasible.


Let's hope RDNA2 delivers as well in October and this trend continues.
Put it this way: if you're willing to pay $700 for a GPU (no matter the generation) yet unwilling to pay more than ~$300 for your monitor, that is ... wait for it ... unwillingness. There have been GPUs below $700 perfectly capable of handling >60 FPS at 1440p for years, and there have been >60 Hz 1440p monitors at relatively reasonable prices for just as long. And while I'm in no way denying the effect of high refresh rates on perceived visual quality or the gaming experience as a whole, unless one is a die-hard esports gamer I would say 1440p120 beats 1080p240 every time.
 
All the people crying "herp derp it's only whatever % faster than 2080 Ti" are missing the point entirely: RTX 3080 is the first graphics card that gets proportionally faster than its predecessors at higher resolutions. That is nucking futs!

Not to mention that RTX is finally able to render at framerates almost matching rasterised performance (or exceeding it - see Control @ 4K RTX). Yes it still requires DLSS to get there, but the fact that this was literally not even a possibility two years ago (before Turing launch) is just mindblowing. The next generation will almost certainly bring performance parity in RTX and rasterised rendering, and that's insane: we will have gone from "ray-traced rendering is an academic novelty" to "ray-traced rendering is the standard" in probably half a decade.

This is the world's first true 4K graphics card and NVIDIA should be congratulated for it. My only question is, what happens to us non-4K plebs now? Because NVIDIA is building GPUs that are literally too powerful for the rest of our systems!
 
BTW, do you remember the Radeon R9 Fury? It was released with a paltry 4 GB of VRAM, which was strangely enough for AMD fans.
Back with the 200/300 series, 8 GB was important for future-proofing, but for Fury 4 GB was plenty due to HBM? :P
Those who trot out these anecdotes can't even be consistent. People should instead look at thorough reviews; those will reveal any memory problems. And when cards truly run out of VRAM, it is an unplayable nightmare, not just a 5-10% performance drop.
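Rough numbers on why genuinely running out of VRAM is a cliff rather than a slope (approximate public specs, hypothetical spill size):

```python
# Ballpark cost of fetching spilled assets over PCIe every frame.
gddr6x_bw = 760e9     # RTX 3080 memory bandwidth, ~760 GB/s
pcie3_x16_bw = 16e9   # PCIe 3.0 x16, ~16 GB/s per direction

spill_per_frame = 0.5e9  # hypothetical: 0.5 GB of assets needed per frame

extra_ms_vram = spill_per_frame / gddr6x_bw * 1000     # ~0.7 ms
extra_ms_pcie = spill_per_frame / pcie3_x16_bw * 1000  # ~31 ms

print(f"From VRAM: +{extra_ms_vram:.1f} ms/frame; over PCIe: +{extra_ms_pcie:.1f} ms/frame")
```

An extra ~31 ms per frame is the difference between a solid 60 FPS and a slideshow, which is exactly why real VRAM exhaustion shows up as a nightmare and not a 5-10% dip.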
 
Would you rather he wore only an apron? :p
Joking of course.
Although I can't stand the man... this product is mega good. And although I don't game as much as I used to, the latest gen may have me reaching for my wallet.
I will wait for AMD's response, Big Navi, but I honestly can't see them beating, or even coming close to, Nvidia this time.
 
Back with the 200/300 series, 8 GB was important for future-proofing, but for Fury 4 GB was plenty due to HBM? :p
Those who trot out these anecdotes can't even be consistent. People should instead look at thorough reviews; those will reveal any memory problems. And when cards truly run out of VRAM, it is an unplayable nightmare, not just a 5-10% performance drop.

It depends on the game/engine, I think.
Some games/engines handle slight or modest VRAM limitations really well and don't ruin the user experience.
 
A 20GB version of the RTX 3080 will likely be launched in the next two months:

 
All the people crying "herp derp it's only whatever % faster than 2080 Ti" are missing the point entirely: RTX 3080 is the first graphics card that gets proportionally faster than its predecessors at higher resolutions. That is nucking futs!

Not to mention that RTX is finally able to render at framerates almost matching rasterised performance (or exceeding it - see Control @ 4K RTX). Yes it still requires DLSS to get there, but the fact that this was literally not even a possibility two years ago (before Turing launch) is just mindblowing. The next generation will almost certainly bring performance parity in RTX and rasterised rendering, and that's insane: we will have gone from "ray-traced rendering is an academic novelty" to "ray-traced rendering is the standard" in probably half a decade.

This is the world's first true 4K graphics card and NVIDIA should be congratulated for it. My only question is, what happens to us non-4K plebs now? Because NVIDIA is building GPUs that are literally too powerful for the rest of our systems!
While a lot of what you say here is true, the first point needs some moderation: this is mostly down to CPU limitations and/or architectural bottlenecks preventing some games from scaling to higher FPS at lower resolutions, not to 4K performance itself increasing more than lower-resolution performance in a vacuum. Of course we can't get faster CPUs than what exists today, nor can we magically make game engines perform better or have the Ampere architecture re-balance itself for certain games on demand, so it is what it is in terms of performance. But you are arguing as if this were caused by the GPU alone rather than by these contextual factors.

RTX performance with DLSS looks excellent, and I'm definitely looking forward to how this, in combination with RT-enabled consoles, will affect games going forward. The big questions now are not only whether AMD can match Nvidia's RT performance, but also whether they can provide any kind of alternative to DLSS.

Also, this is a perfectly suitable 1440p120 or 1440p144 GPU. No need for 4k, though it's obviously also great for that.

One question, though: going by the meager perf/W increase, was anything beyond die size on 12 nm stopping Nvidia from delivering this kind of performance with Turing?

A 20GB version of the RTX 3080 will likely be launched in the next two months:

Ouch, that's going to piss off a lot of early buyers. Not that there's much reason to be upset (those extra 10 GB will most likely be entirely decorative for the lifetime of the GPU), but people tend not to like being sold "the best product ever, go buy it now" and then having it superseded just a few months later by the same company.
 
What a beast of a card. I hope it drives the prices of all other graphics cards downwards so 1080p people like me get some trickle-down benefits.
 
I second that; I don't use headphones, so quietness is quite important.

More like QUIET important, amirite???

On a more serious note: I don't use headphones either, my PC is less than 1 m from my ears, and I never hear it. Fan stop below a certain temperature threshold on GPUs and PSUs is amazing.
 
I doubt anyone will buy that card for 1080p
I think I will....
But that's only because I'm a weird case:
- gaming on a 1080p@120Hz projector (there are no 4k projectors with >60Hz and/or low input lag yet)
- I love eye-candy. Details at max. Vsync ON.

What I need is a GPU capable of consistent FPS above 120 with max eye candy, Vsync, and RTX on.
The RTX 3080 may still be overkill, but... "will it run Crysis?" :)
 
I prefer to wait for a 20 GB RTX card or AMD's next 16 GB graphics card; when you buy a product at 800 dollars (to keep for 5-6 years), that reassures you it is more or less future-proof. PC games are developed to be suitable for the consoles, and therefore for 16 GB of VRAM, so there is a good chance the developers will not bother optimising and will transpose this straight to the PC maximum settings.
 
OK, now, from the performance summary, let's see if this is more of an upgrade than the 1080 to 2080S was:

1080P:
1080 to 2080S -> +45%
2080S to 3080 -> +30%

1440P:
1080 to 2080S -> +56%
2080S to 3080 -> +43%

4K:
1080 to 2080S -> +60%
2080S to 3080 -> +56%

So the 1080 to 2080S was a better upgrade than 2080S to 3080 at all three resolutions. Interesting!!!
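If you want to check the compounding yourself (uplifts multiply, they don't add), here's a quick sketch using the percentages above:

```python
# Generational uplifts from the performance summary, as multipliers.
uplifts = {
    "1080p": (1.45, 1.30),  # 1080 -> 2080S, 2080S -> 3080
    "1440p": (1.56, 1.43),
    "4K":    (1.60, 1.56),
}

for res, (step1, step2) in uplifts.items():
    total = step1 * step2  # multiply, don't add
    print(f"{res}: total 1080 -> 3080 uplift = +{(total - 1) * 100:.0f}%")
```

That works out to roughly +88% / +123% / +150% for the full 1080-to-3080 jump.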
 
OK, now, from the performance summary, let's see if this is more of an upgrade than the 1080 to 2080S was:
Did you choose the 2080S just so it would illustrate your point?

The main generations: the GTX 1080 is from 2016, the RTX 2080 from 2018, and the 3080 from 2020.
The S was a mid-lifecycle update to Turing. The 1080 got an update as well, but it was more minor and didn't take off.
 
Some people actually like a quiet system. The higher the power draw, the higher the RPM on the PSU fan.
Well, of course, I do as well. But quality hardware can handle that.
My Fractal Ion Platinum PSU is dead quiet even under heavy loads, in a windowless Fractal Define Nano S case.
 
Did you choose the 2080S just so it would illustrate your point?

The main generations: the GTX 1080 is from 2016, the RTX 2080 from 2018, and the 3080 from 2020.
The S was a mid-lifecycle update to Turing. The 1080 got an update as well, but it was more minor and didn't take off.
The current replacement cards, from 2019, are all Supers, so why not? They should have given us this performance in 2018.

Here are the numbers with the 2080 non-Super cards:
1080P:
1080 to 2080 -> +39%
2080 to 3080 -> +35%

1440P:
1080 to 2080 -> +47%
2080 to 3080 -> +51%

4K:
1080 to 2080 -> +50%
2080 to 3080 -> +67%

Almost even until you get to 4K, using the 2080, which was replaced after only 9 months by the 2080 Super cards.
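Sanity check on the arithmetic: both routes compound to the same totals, as they must, since the endpoints are the same cards. At 4K, 1.50 × 1.67 ≈ 2.50 via the 2080 and 1.60 × 1.56 ≈ 2.50 via the 2080S: either way the 3080 lands at roughly 2.5× a GTX 1080.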
 
Finally had time to catch up on this review.

It's impressive, in the same way that the AMD Radeon Fury was impressive - it brute-forced its way past the previous generation but at great cost.

The 3080 is overshadowed by several glaring issues for anyone who has been following:
  • It's not the 90% faster that Nvidia claimed. That's barely realistic even in RTX- and DLSS-optimised titles that have been carefully hand-tuned through Nvidia/developer collaboration, and many new games won't get that preferential treatment.
  • 10 GB of VRAM. Yeah, it's enough for now, but it's not 'plenty', and it won't necessarily be enough for that much longer.
  • That power draw. Even though the cooler is quiet enough, that heat has to go somewhere. Your poor CPU, RAM, motherboard, drives, and armpits are going to pay for it. Whatever's in the XBSX and PS5 can't possibly be as power-hungry, because their designs and power bricks simply aren't enough to handle a 350 W GPU.
  • RDNA2 is looming ominously, and this has failed to meet the overblown claims. Rushed out early to maximise sales at the current price before Big Navi lands? Nvidia must be working closely with game devs and insiders who also have hands-on time with the consoles, so regardless of confidentiality agreements, I'm sure someone has been coerced into leaking info to Nvidia about the RDNA2 in the upcoming consoles.
 
Uh oh, we are comparing the GA102 die to TU104, which is wrong; compare it to TU102: +50% in-game performance for +50% transistors.

Next, per Moore's law, is ~54,000 Mtransistors in 2022.

So this is just another GTX 780: an xx102-based xx80 card. As good as the 3080 might seem, it is not a 5-6 year investment; in 10-12 months it could be matched by a 4070, or even a 4060, if the next gen brings a leap as strong as the 960 to 1060. The 3080 is essentially a glorified 4060 with double the power consumption, and if you buy it now, you either keep dragging it along for years or sell it at a huge loss.
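For reference, the transistor arithmetic behind that (public die specs; the 2022 figure is just a naive doubling extrapolation, nothing confirmed):

```python
# Transistor counts in millions, from public die specs.
tu102 = 18_600  # TU102 (RTX 2080 Ti)
ga102 = 28_300  # GA102 (RTX 3080/3090)

print(f"TU102 -> GA102: +{(ga102 / tu102 - 1) * 100:.0f}% transistors")  # ~+52%

# Naive Moore's-law extrapolation: double roughly every two years.
print(f"Doubling by 2022: ~{ga102 * 2:,} Mtransistors")  # ~56,600, near the ~54,000 above
```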
 
This is the world's first true 4K graphics card and NVIDIA should be congratulated for it. My only question is, what happens to us non-4K plebs now? Because NVIDIA is building GPUs that are literally too powerful for the rest of our systems!

Jack up the render scale and AA, and mine ETH while playing, for us 1080p 144 Hz plebs that want this card, lol...
 
Under the overclocked performance testing, what settings were used in Unigine Heaven?
Custom settings, custom scene, nothing you can reproduce
 
And it went INSTANTLY from "notify me" to "out of stock"... quite literally... I was refreshing the page around release time; 30 seconds before, it still showed the "notify me" button, and then, after sipping my coffee before refreshing again, it showed "out of stock"...

GG Nvidia... I guess the scalpers used their bots again, and the cards will pop up on eBay and co. over the next few days...

Really well done... *sigh*
 