
NVIDIA GeForce RTX 3080 Founders Edition

Great review as usual.

It's weird that they prioritized maximum performance over energy efficiency this time, unlike in previous gens. By going from 270 W to 320 W, they sacrificed 15% energy efficiency for just 4% higher performance.
Kinda pulled an AMD there...
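
For anyone curious how a figure like that shakes out, here's a quick back-of-the-envelope sketch using only the 270 W / 320 W / 4% numbers above (the review's efficiency charts use measured power, so its exact percentage can land a bit differently):

```python
# Rough perf-per-watt comparison using the numbers quoted above:
# 270 W -> 320 W board power for roughly 4% more performance.
def perf_per_watt_change(perf_gain, power_before, power_after):
    """Relative change in performance per watt."""
    eff_before = 1.0 / power_before                # baseline perf normalized to 1.0
    eff_after = (1.0 + perf_gain) / power_after
    return eff_after / eff_before - 1.0

delta = perf_per_watt_change(perf_gain=0.04, power_before=270.0, power_after=320.0)
print(f"Perf/W change: {delta:+.1%}")              # about -12%, in the same ballpark
```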

Agreed. The performance increases come at the expense of excess heat and excess wattage.
There is no real performance-per-watt gain like in previous generations.

I think people should see this. It is from Adored TV. It's 30+ minutes long and it explains a lot about why this card draws so much power, which in turn means it puts out a lot of excess heat.

 
Hmm.. I know the TDP of the 2080 Ti is 250 W, but you can see it using around 273 W (average gaming) according to @W1zzard's charts. The 3080 is rated at 320 W, but it seems to be hovering around 303 W. Maybe NVIDIA is just overshooting their stated specs?
Unlikely. 3080 is probably the worst bin GA102 silicon, but I'd think the FE boards will be better than the average AIB.

(attached chart: Power_PCAT.png)
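
Quick sanity check on those numbers (just the measured-vs-rated figures quoted above, nothing official):

```python
# Measured average gaming power vs. rated board power, as quoted above.
cards = [("RTX 2080 Ti", 273, 250), ("RTX 3080", 303, 320)]
for name, measured, rated in cards:
    print(f"{name}: {measured / rated - 1.0:+.1%} vs. its rated power")
# -> RTX 2080 Ti: +9.2%, RTX 3080: -5.3% (the 3080 stays under its 320 W rating)
```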
 
Monster card, but I'm skipping this gen. We're too far into this generation cycle, plus games are still running on old engines.

90% of the games I've played in the last 12 months have either been extremely well optimized or had moderate requirements at best.
I'd rather just wait for the 4000 series. Ironically, I see a lot of people upgrading because "new game engines are coming soon" and "it has better ray tracing", but by the time both are utilized the 4000 series will be out...... (lol)
 
Unlikely. 3080 is probably the worst bin GA102 silicon, but I'd think the FE boards will be better than the average AIB.

When the original Turing cards launched, the FE models were the ones that failed the most, so I wouldn't put that much faith in them.
Just going with whichever brand has the best RMA practices in your region is the best idea.
 
Wow, very impressive...

I still feel modern graphics hardware is too expensive but versus the previous few generations this is really reasonable.

I hope the 3070 delivers at 1440p what this card delivers at 4K. I may have to buy one, it will be my first Nvidia card since the GTX 970.
 
Nice. Twice the performance of 1080ti will make for a happier 4k120 display.
 
Computerbase.de tested out what is called Asynchronous DLSS, and that improved DLSS performance in Youngblood by another 9%.
So Ampere still has a lot more tricks up its sleeve.
 
3090 here I come. Hope it will be in stock.
 
I'm going to try to get one; the performance uplift vs. the 2080 Ti looks pretty huge in rendering apps like Blender, 60%+ in some cases. I expect VR performance is also much better than the 2080 Ti's.

The power consumption is disappointing though
 
@W1zzard I have to disagree with one of your negative items in the conclusion, "makes little sense for people who don't have 4K 60 Hz gaming". The RTX 3080 only does 92 fps at 1440p in AC Odyssey, for example, and games like AC Odyssey are so much more fun (imo) when you lower settings to reach 144 fps on a 144 Hz 1440p monitor. I understand high refresh isn't for everyone, but I personally enjoy it in games; it enhances the immersion for me. So ideally the RTX 3080 isn't powerful enough for me, but in reality it is, because I will need to turn down fewer settings to achieve my goal :)

that being said, I loved everything else about the review, great stuff! I love the format you use too, so easy to navigate.
 
The performance increase from the 2080 to the 3080 is roughly the same as from the 1080 to the 2080. Where is the "near doubled performance" claimed by the NVIDIA CEO?

And I totally agree with the conclusion that this card is strictly for 4K gamers. It makes no sense for anyone with a 1440p or lower monitor (myself included) due to the huge power consumption.
 
I doubt anyone will buy that card for 1080p
You know some people will. They'll want the absolute highest FPS for their 240hz displays and they'll get it with this card at 1080p.

I have to disagree with one of your negative items in the conclusion, "makes little sense for people who don't have 4K 60 Hz gaming". The RTX 3080 only does 92 fps at 1440p in AC Odyssey, for example, and games like AC Odyssey are so much more fun (imo) when you lower settings to reach 144 fps on a 144 Hz 1440p monitor. I understand high refresh isn't for everyone, but I personally enjoy it in games; it enhances the immersion for me. So ideally the RTX 3080 isn't powerful enough for me, but in reality it is, because I will need to turn down fewer settings to achieve my goal :)
Agreed. I have dual 1440p 144 Hz (set for 120 Hz) displays, and it would seem that settings will still have to be turned down in many games to hit the 120 fps mark.
that being said, I loved everything else about the review, great stuff! I love the format you use too, so easy to navigate.
Agreed here as well. Excellent review that hit all the points that matter to prospective purchasers!
 
Great review as always @W1zzard, the card certainly packs a punch at the more traditional $700 price point. That new cooler does a great job, and congrats on that teardown too!

Looking forward to the 3070 review a bit later down the line, 2080 Ti or a bit better performance at a more comfortable 220W TDP would be very nice indeed.
 
Seems Nvidia's CEO has artistic math skills; still a good card though.
 
Great review and looks like a beast of a card. It looks like my 1080ti is starting to age. I will wait to see what AMD has though as maybe they have some magic.

I am taken aback by some of the negative comments that nit-pick details which ultimately have no bearing on real-world performance. But I guess there will always be those who will complain about anything that Nvidia does.
 
It is a shitty overclocker btw... from https://www.tomshardware.com/news/nvidia-geforce-rtx-3080-review

For the GPU core, while Nvidia specs the nominal boost clock at 1710 MHz, in practice, the GPU boosts quite a bit higher. Depending on the game, we saw sustained boost clocks of at least 1830 MHz, and in some cases, clocks were as high as 1950 MHz. That's not that different from the Turing GPUs, or even Pascal. The real question is how far we were able to push clocks.

The answer: Not far. I started with a modest 50 MHz bump to clock speed, which seemed to go fine. Then I pushed it to 100 MHz and crashed. Through a bit of trial and error, I ended up at +75 MHz as the best stable speed I could hit. That's after increasing the voltage by 100 mV using EVGA Precision X1 and ramping up fan speeds to keep the GPU cool. The result was boost clocks in the 1950-2070 MHz range, typically settling right around the 2GHz mark.

Memory overclocking ended up being far more promising. I started with 250 MHz increments. 250, 500, 750, and even 1000 MHz went by without a problem before my test (Unigine Heaven) crashed at 1250 MHz. Stepping back to 1200 MHz, everything seemed okay. And then I ran some benchmarks.

Remember that bit about EDR we mentioned earlier? It works. 1200 MHz appeared stable, but performance was worse than at stock memory clocks. I started stepping down the memory overclock and eventually ended up at 750 MHz, yielding an effective speed of 20.5 Gbps. I was really hoping to sustain 21 Gbps, in honor of the 21st anniversary of the GeForce 256, but it was not meant to be. We'll include the RTX 3080 overclocked results in the ultra quality charts (we didn't bother testing the overclock at medium quality).

Combined, we ended up with stable performance using the +75 MHz core overclock and +750 MHz GDDR6X overclock. That's a relatively small 4% overclock on the GPU, and a slightly more significant 8% memory overclock, but neither one is going to make a huge difference in gaming performance. Overall performance at 4K ultra improved by about 6%.

So even Nvidia is pushing pretty much to the silicon's limits these days.
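
For anyone wanting to double-check those percentages, a small sketch using the clock figures from the quoted text (the 19 Gbps stock GDDR6X rate is the 3080's published memory spec):

```python
# Sanity check on the percentages in the quoted text.
# 1710 MHz is the 3080's rated boost clock; 19 Gbps is its stock GDDR6X data rate.
core_oc = 75 / 1710            # +75 MHz over the 1710 MHz boost spec
mem_oc = 20.5 / 19.0 - 1.0     # 20.5 Gbps effective vs. 19 Gbps stock
print(f"Core overclock:   {core_oc:.1%}")   # ~4.4% -> the "relatively small 4%"
print(f"Memory overclock: {mem_oc:.1%}")    # ~7.9% -> the "more significant 8%"
```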
 
Theoretically, the RTX 3080 = 2x RTX 2060 Super in SLI; the power consumption, performance, and price are about the same.
 
I have not yet read the TPU review, but I have seen other tech sites' and YouTubers' reviews, and so far I am not disappointed with the raw performance. The RTX 3080 seems to be around 90 to 100% faster than my old GTX 1080 Ti, so a pretty good jump. The power usage puts me off, but what really decides me against getting an RTX 3080 for now is the amount of VRAM. 10 GB will not last a few years into the future for all games at 4K. So because of the VRAM, I will take my chances and wait in the hope of an RTX 3080 20 GB or an RTX 3080 Ti instead. Sure, that will be more expensive, but it might pay off in the long run because more VRAM = better future proofing.

Ampere is impressive in raw performance alone, but the power consumption does ruin the picture a little. I guess that's the downside of all those CUDA cores and the other tech Ampere has. Fortunately, it seems the power target can be lowered from 100% down to as little as 31%, which makes it roughly a 100 watt card, or anywhere in between. I will make good use of the power target slider when I don't need all the GPU power.
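
For anyone who'd rather do that from a script than a slider, here's a minimal sketch using the NVML Python bindings; this assumes the driver exposes power-limit control on your card, and the actual floor/ceiling comes from the VBIOS:

```python
# Minimal sketch: capping the power limit from a script via the NVML Python
# bindings (pip install nvidia-ml-py). Needs admin/root rights, and the
# allowed range depends on the card's VBIOS.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

# NVML reports limits in milliwatts.
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
print(f"Allowed power limit range: {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W")

# Example: cap the card at roughly 220 W, clamped to the allowed minimum.
pynvml.nvmlDeviceSetPowerManagementLimit(handle, max(min_mw, 220_000))

pynvml.nvmlShutdown()
```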
 
I just wish AMD would show something more before October 28th. Since they haven't, I am assuming the best they can do with Big Navi is match the 3080 at most, and possibly not even that. So yeah, terrible marketing on their end. They should have at least released preliminary benches if they were confident of anything more. And unless it can beat a 3080 for the same price, you might as well just go Nvidia at that point due to the higher likelihood of stability. I was only going to consider Big Navi if it came out swinging.
 
Seems Nvidia's CEO has artistic math skills; still a good card though.
Actually, it seems NVIDIA understated the projected performance, as much of the data on offer is well above what NVIDIA stated.

W1zzard, when will an 8K gaming resolution test be available?
8K displays are still extremely expensive and not widely available. Given that we have only just reached 4K gaming viability, 8K is a bit off yet. Don't expect 8K testing until such displays are more widespread.
 