
NVIDIA GeForce RTX 3080 Founders Edition

W1zzard

NVIDIA's new GeForce RTX 3080 "Ampere" Founders Edition is a truly impressive graphics card. Not only does it look fantastic, its performance is better than even the RTX 2080 Ti's. In our RTX 3080 Founders Edition review, we also take a close look at the new cooler, which runs quietly without throttling and features idle fan stop.

 
Oh boy, now you know why Nvidia should be worried & rightly so :D
[Charts: Performance per Watt and Relative Performance at 1920x1080, 2560x1440, and 3840x2160]
 
Great review as always, will be getting one stock permitting!
 
80 °C on a test bench with 20 °C ambient?
 
3080 vs. 5700 = 100% faster at 4K overall. Amazing; finally a true 4K gaming GPU, and the first HDMI 2.1 GPU.
 
Would have liked some memory usage figures. I am not quite convinced it really is a must-have for 4K with hardly any more memory than a high-end card from what, 7 years ago (290X 8GB)? Not much progress in that area.

Had it been at least a 12 GB card, which we know they could have done easily because the bus interfaces are there, it would have piqued my interest enough to buy one (well, a couple of months down the line as the price gouging somewhat diminishes). But as it is, nah.
 
While absolute performance is definitely impressive, the fact that perf/W is not moving whatsoever is ... a bit worrying, frankly. Sure makes it look like Nvidia is pushing this card in a similar manner as AMD did with their Vega cards, and ... that's not very promising. Here's hoping the 3070 behaves more sensibly. But going by this data alone, AMD might have a serious shot at taking the perf/W crown this go around.
 
Not moving? 1080 Ti = 70%, 2080 Ti = 85%, 3080 = 100%
It's moving at 4k, but they're actually behind AMD's past generation (!) and their own (!!) at 1080p and just barely ahead at 1440p. For a new generation with a supposed full node increase, that's pretty weak IMO.
 
It's moving at 4k, but they're actually behind AMD's past generation (!) and their own (!!) at 1080p and just barely ahead at 1440p. For a new generation with a supposed full node increase, that's pretty weak IMO.
That's because lower resolutions are CPU limited.

Edit: Actually, you're making a great point. I'm using the same 303 W typical power consumption value from the power measurements page for all three resolutions, which isn't 100% accurate. Because some games are CPU-limited, power consumption in those games is lower too, which I'm not taking into account.
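Roughly, the effect looks like this (made-up numbers, purely to show the direction of the error, assuming the perf/W score is just average FPS over one power figure):

```python
# Illustration only -- these numbers are made up, not TPU's review data.
FIXED_POWER_W = 303  # single "typical gaming" power figure used for every resolution

# hypothetical 1080p results: (average fps, actual board power in watts)
games_1080p = [
    (240, 300),  # GPU-limited title, card near full power
    (180, 310),
    (160, 200),  # CPU-limited title, GPU partially idle
    (150, 180),  # CPU-limited title
]

avg_fps = sum(fps for fps, _ in games_1080p) / len(games_1080p)

# method described above: one power value applied to all resolutions
ppw_fixed = avg_fps / FIXED_POWER_W

# accounting for per-game power draw instead
avg_power = sum(watts for _, watts in games_1080p) / len(games_1080p)
ppw_actual = avg_fps / avg_power

print(f"with fixed 303 W:    {ppw_fixed:.2f} FPS/W")
print(f"with measured power: {ppw_actual:.2f} FPS/W")  # higher, since CPU-limited games draw less
```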
 
"Huge performance increase over RTX 2080" - 30% is huge - sure....

I am editing this post: I was wrong. The performance uplift is 35 to 60-ish percent, which is significant and in line with what used to be expected for a generational/node advancement, albeit with a power consumption increase.
 
Well, that's the cooling question answered for me. I mean, yes, the card runs HOT like any other reference card does, but when you take into account how much horsepower it has over the 2080 Ti, that is actually pretty good.

@W1zzard

Did you log the CPU thermals while you were running the tests? I'm interested to know how much all the heat from the GPU would have impacted the thermals of other components.
 
That's because lower resolutions are CPU limited.

Edit: Actually, you're making a great point. I'm using the same 303 W typical power consumption value from the power measurements page for all three resolutions, which isn't 100% accurate. Because some games are CPU-limited, power consumption in those games is lower too, which I'm not taking into account.
That makes sense. Also, there's a point to be made about the test suite if enough of it is CPU limited at 1080p for it to skew results in that manner. Time for an update? Besides, if the GPU is idle, waiting for the CPU, it should be consuming a lot less power, no?
 
Looks quite good. Not having extra money in my pocket at the moment will help me not to jump the gun and to wait a little more to see what AMD will have to offer, and then decide on my upgrade. I have to admit that it looks quite tempting though.
 
I just updated :) The issue is not only old games

Correct, as I said, that's not reflected in my perf/W summary scores
Understandable. Guess it might be time to add power draw measurements at each resolution then? And possibly, as an optional extra for GPUs that hit CPU limitations like this one, add tests at relevant resolutions in both GPU- and CPU-limited titles? Of course that adds a lot of work, but that seems to be the nature of all benchmarking: the workload just grows over time. There's also an argument to be made for producing an overall performance graph that excludes CPU-bound games (see the sketch below), though at that point things might get too complicated for the average reader.
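Something like this is what I'm picturing for that last point (completely made-up numbers, and assuming the summary is just a geometric mean of per-game FPS ratios):

```python
# Hypothetical per-game FPS for two cards; not actual review data.
from math import prod

results = [
    ("Game A", 120, 92),
    ("Game B", 144, 110),
    ("Game C", 165, 162),  # almost no gain -> likely CPU-bound
    ("Game D", 98, 75),
]

def geomean(values):
    return prod(values) ** (1 / len(values))

all_ratios = [new / old for _, new, old in results]
gpu_bound  = [r for r in all_ratios if r > 1.05]  # crude cutoff for "not CPU-bound"

print(f"summary over all games:   {geomean(all_ratios):.2%}")
print(f"CPU-bound games excluded: {geomean(gpu_bound):.2%}")
```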
 
So RTX on, same performance loss. DLSS same benefits.
New shader count divided by 2; compared to the 20 series, ~+5% efficiency: 8704 / 2 = 4352 (the 2080 Ti's count) × 1.05. Wow, just amazing. 3070 = 2080 Super: 2944 × 1.05.
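Spelling out that math (the shader counts are the published specs; halving Ampere's count and the ~5% per-shader gain are just the assumption from the post above, not official figures):

```python
# Ampere vs. Turing shader-count comparison, per the reasoning above.
# Dividing by 2 and the ~5% efficiency factor are assumptions, not official data.
ampere = {"RTX 3080": 8704, "RTX 3070": 5888}
turing = {"RTX 2080 Ti": 4352, "RTX 2080 Super": 3072, "RTX 2080": 2944}

for name, count in ampere.items():
    effective = count / 2 * 1.05
    print(f"{name}: {count} / 2 * 1.05 = {effective:.0f} 'Turing-equivalent' shaders")

for name, count in turing.items():
    print(f"{name}: {count} shaders")

# RTX 3080: 8704 / 2 * 1.05 = 4570  (just above the 2080 Ti's 4352)
# RTX 3070: 5888 / 2 * 1.05 = 3091  (about the 2080 Super's 3072)
```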
 
Guess it might be time to add power draw measurements at each resolution then?
Rather, the only solution I can think of is to record power during all testing in all games, to get the proper power average. Not something I'm able to do at the moment.

that excludes CPU-bound games
4K is a good approximation of that. For the 3080, Full HD really is more of an academic resolution; I doubt anyone will buy that card for 1080p
 
I am now thoroughly convinced the RTX 3080 is absolute overkill for my 1440p monitor.
 
The 2080 Ti has more than 64 ROPs. @w1zz
Fixed, lol, this has been wrong in every single review since RTX 2080 Ti FE t.t .. congrats for being the first to notice it. Now I have to figure out how to fix all those reviews
 
Makes little sense for gamers without 4K monitor
I don't know about that. My 1440p@165Hz monitor is telling me it wants an RTX 3080 :)
 
Rather, the only solution I can think of is to record power during all testing in all games, to get the proper power average. Not something I'm able to do at the moment.

4K is a good approximation of that. For the 3080, Full HD really is more of an academic resolution; I doubt anyone will buy that card for 1080p
Don't underestimate people's unwillingness to leave 1080p behind! :rolleyes:

And yeah, automatic and continuous power logging is probably the best way forward for something like that, though that also sounds like a serious investment in terms of equipment. Hope you get the opportunity to expand on this in the future!
 