
NVIDIA GeForce RTX 3080 Founders Edition

Joined
Feb 13, 2012
Messages
523 (0.11/day)
Which 4K graph has made you think so? The RTX 3080's minimum fps is above the RTX 2080 Ti's average. I don't remember the last time we had such a huge leap in performance. How is it "underwhelming"?

Maybe you could share some graphs to prove your PoV?



It's exactly as powerful as advertised, if not better, at 4K, the resolution it was created for. I've never seen so many AMD trolls in one discussion. Also, it costs $700, so I'm not sure about greed. Also, no one forces you to buy it.

Speaking of greed:

[Chart: perfrel_3840_2160.png, relative performance at 3840x2160]

Here you go. 1080ti came out at 700 dollars and performed almost twice as fast as a 980ti that also was 700 dollars. Then the 20 series came out and almost doubled prices, with the 2080 Ti at around $1,200. Everyone is impressed, but in reality Nvidia just shifted back to normal pricing because they are actually expecting competition. What everyone here is concerned about is that the 3080 is a cut-down version of their biggest gaming chip and draws power close to the card's practical power/thermal limit, and at that stage you start running into diminishing returns as you scale performance any higher. In summary, no, the 3080 is not a 2080 successor; it is a 2080 Ti successor, which itself was a cut-down version of a Titan.
 
Joined
May 15, 2014
Messages
235 (0.06/day)
@W1zzard, try running FrameView 1.1 (even if you don't have the PCAT kit). It should give detailed data via NVAPI.
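For anyone who wants to poke at similar telemetry without FrameView, here is a minimal sketch (assuming the pynvml package is installed) that reads board power through NVML, a public relative of NVAPI. It only illustrates the kind of data these tools expose, not what FrameView itself does internally:

```python
# Minimal sketch: read GPU board power via NVML using the pynvml bindings.
# Assumes pynvml is installed (pip install pynvml).
from pynvml import (
    nvmlInit,
    nvmlShutdown,
    nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetPowerUsage,
)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)        # first GPU in the system
    milliwatts = nvmlDeviceGetPowerUsage(handle)  # board power draw in mW
    print(f"GPU power draw: {milliwatts / 1000:.1f} W")
finally:
    nvmlShutdown()
```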
 
Joined
Apr 10, 2020
Messages
504 (0.29/day)
Thought about it, decided against it. Old DX11 engine, badly designed, badly optimized, not a proper simulator, small player base. When they add DX12 I might reconsider it.
I can agree on it being badly optimized at the moment, but not on it having a small player base or not being a proper simulator. Predicted sales are 2.27 million units over the next three years, and third-party devs will soon offer state-of-the-art, highly detailed planes as add-ons through the MS FS 2020 shop. FS2020 is predestined to become the next X-Plane/Prepar3D kind of sim.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,963 (3.72/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Joined
Jan 8, 2017
Messages
9,504 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 MHz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
And the fact that they practically doubled the shader units from the 2080 Ti (not the 2080 Super, mind you), along with 8 more ROPs and additional RT/Tensor cores. I was realistically expecting this thing to exceed 300 W.

That's the catch: the shaders have been doubled, but not the number of SMs. In other words, more execution units now share the same control logic/registers/cache, which are the real power hogs in a chip.

Nvidia has done this before, to a more extreme extent, with Kepler, which had six times the number of shaders per SM versus Fermi. As a result it was one of the most inefficient architectures ever per SM in terms of performance: GK110 was 90% faster than GF110 despite having almost 600% more FP32 units (GF110's shaders did run at a faster clock; still, they are worlds apart).

It's a similar story here: a lot of shading power, but it's used rather inefficiently because a lot of resources are shared. That's why the power consumption looks quite bad when you think about it.
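A quick back-of-the-envelope check of those shaders-per-SM ratios, using the publicly documented FP32 counts per SM for each chip:

```python
# FP32 shader units per SM for the chips discussed above (public spec-sheet
# figures; GF110 is 512 cores across 16 SMs).
fp32_per_sm = {
    "Fermi (GF110)":  32,
    "Kepler (GK110)": 192,
    "Turing (TU102)": 64,
    "Ampere (GA102)": 128,
}
print(fp32_per_sm["Kepler (GK110)"] / fp32_per_sm["Fermi (GF110)"])   # 6.0x, as noted
print(fp32_per_sm["Ampere (GA102)"] / fp32_per_sm["Turing (TU102)"])  # 2.0x
```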
 
Last edited:
Joined
Aug 12, 2019
Messages
2,248 (1.15/day)
Location
LV-426
System Name Custom
Processor i9 9900k
Motherboard Gigabyte Z390 Aorus Master
Cooling corsair h150i
Memory 4x8GB 3200MHz Corsair
Video Card(s) Galax RTX 3090 EX Gamer White OC
Storage 500GB Samsung 970 Evo Plus
Display(s) MSi MAG341CQ
Case Lian Li Pc-011 Dynamic
Audio Device(s) Arctis Pro Wireless
Power Supply 850w Seasonic Focus Platinum
Mouse Logitech G403
Keyboard Logitech G110
AMD or Intel doesn't matter! It's time to upgrade from the 10 series and game on! It's almost a 3x difference in performance for me coming from a 1070 Ti :)
 

ZekeSulastin

New Member
Joined
Nov 27, 2019
Messages
4 (0.00/day)
Sorry to bother you, but did you take a look at either reducing the power limit or manually altering the voltage/frequency curve in Afterburner or similar? computerbase.de reported only a few percent less performance with a 270 W limit, and someone on another site linked me a video of someone adjusting that curve to good effect.

It honestly sounds like Turing, where you can't drop the voltage with the slider but can with the curve editor.
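For the power-limit half of that experiment, here is a minimal sketch using pynvml (assumed installed; requires admin rights) that mimics dragging the power-limit slider down to roughly 270 W. The voltage/frequency curve editor itself has no public API, so that part cannot be scripted this way:

```python
# Hedged sketch: lower the GPU power limit to ~270 W via NVML. This is the
# scripted equivalent of the Afterburner power-limit slider, not the curve
# editor. Needs admin/root privileges to take effect.
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetPowerManagementLimitConstraints,
    nvmlDeviceSetPowerManagementLimit,
)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)
    # Driver-enforced bounds for the limit, both in milliwatts
    lo, hi = nvmlDeviceGetPowerManagementLimitConstraints(handle)
    target = max(lo, min(hi, 270_000))  # clamp 270 W into the allowed range
    nvmlDeviceSetPowerManagementLimit(handle, target)
    print(f"Power limit set to {target / 1000:.0f} W")
finally:
    nvmlShutdown()
```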
 

specopsFI

New Member
Joined
Sep 10, 2019
Messages
12 (0.01/day)
@W1zzard

In the overclocking section, you briefly mention that undervolting is not possible. Can you elaborate on this very important point? Is it prevented on Ampere at a hardware level, or is it just that the necessary tools are not yet available? The inability to undervolt these extremely power-hungry chips would be a serious shortcoming, and I can't believe NVIDIA would remove it for this exact generation after allowing access to the whole voltage/clock curve for so long with previous generations.

Other than that, an excellent article as usual!
 

iO

Joined
Jul 18, 2012
Messages
531 (0.12/day)
Location
Germany
Processor R7 5700x
Motherboard MSI B450i Gaming
Cooling Accelero Mono CPU Edition
Memory 16 GB VLP
Video Card(s) RX 7900 GRE Dual
Storage P34A80 512GB
Display(s) LG 27UM67 UHD
Case none
Power Supply Fractal Ion 650 SFX
Great review as usual.

It's weird that they prioritized maximum performance over energy efficiency this time, unlike in previous gens. By going from 270 W to 320 W, they sacrificed about 15% energy efficiency for just 4% higher performance.
Kinda pulled an AMD there...
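A quick sanity check on that trade-off, taking the quoted +4% performance and the 270 W to 320 W limits at face value (round numbers, so it lands near rather than exactly on the 15% figure):

```python
# Perf-per-watt change for 270 W -> 320 W at +4% performance (quoted above).
perf_gain = 1.04
power_gain = 320 / 270           # ~1.19x more power
perf_per_watt = perf_gain / power_gain
print(f"perf/W change: {(perf_per_watt - 1) * 100:.0f}%")  # ~ -12%
```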
 
Joined
Jun 24, 2020
Messages
93 (0.06/day)
The smaller gains at 1440p and 1080p could be down to the CPU not being fast enough. Could we test again with Zen 3 when it arrives?
 

Cheeseball

Not a Potato
Supporter
Joined
Jan 2, 2009
Messages
2,039 (0.35/day)
Location
Pittsburgh, PA
System Name Titan
Processor AMD Ryzen™ 7 7950X3D
Motherboard ASRock X870 Taichi Lite
Cooling Thermalright Phantom Spirit 120 EVO CPU
Memory TEAMGROUP T-Force Delta RGB 2x16GB DDR5-6000 CL30
Video Card(s) ASRock Radeon RX 7900 XTX 24 GB GDDR6 (MBA)
Storage Crucial T500 2TB x 3
Display(s) LG 32GS95UE-B, ASUS ROG Swift OLED (PG27AQDP), LG C4 42" (OLED42C4PUA)
Case Cooler Master QUBE 500 Flatpack Macaron
Audio Device(s) Kanto Audio YU2 and SUB8 Desktop Speakers and Subwoofer, Cloud Alpha Wireless
Power Supply Corsair SF1000
Mouse Logitech Pro Superlight 2 (White), G303 Shroud Edition
Keyboard Keychron K2 HE Wireless / 8BitDo Retro Mechanical Keyboard (N Edition) / NuPhy Air75 v2
VR HMD Meta Quest 3 512GB
Software Windows 11 Pro 64-bit 24H2 Build 26100.2605
That's the catch: the shaders have been doubled, but not the number of SMs. In other words, more execution units now share the same control logic/registers/cache, which are the real power hogs in a chip.

Nvidia has done this before, to a more extreme extent, with Kepler, which had six times the number of shaders per SM versus Fermi. As a result it was one of the most inefficient architectures ever per SM in terms of performance: GK110 was 90% faster than GF110 despite having almost 600% more FP32 units (GF110's shaders did run at a faster clock; still, they are worlds apart).

It's a similar story here: a lot of shading power, but it's used rather inefficiently because a lot of resources are shared. That's why the power consumption looks quite bad when you think about it.

No doubt that doubling the shader units within the same number of streaming multiprocessors (64/SM in Turing vs. 128/SM in Ampere) would increase power consumption. I don't consider it as efficient as Maxwell>Pascal, per se, but I also don't consider it a complete waste of power. The RT and Tensor cores also add to the weight.

All in all, I believe the slight sacrifice in energy efficiency is justified, as long as it doesn't reach Fury X or 290X levels of wasted power. With this card I can probably play PUBG at 4K 144 Hz with competitive settings, which is a mix of medium and low settings with AA maxed out for the visual clarity that's important for spotting opponents.

TL;DR - This card is overkill for those still gaming at 1080p or 1440p. If you're aiming at UW (3440x)1440p or 4K, this seems to hit the sweet spot.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,963 (3.72/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Joined
May 15, 2014
Messages
235 (0.06/day)
I don't see why the power consumption is that surprising if you look at the specs. Sure, it's on 8 nm now, but that's coming from 12 nm.

Nitpick: TSMC 16FF (12N) -> Samsung 10LPU (8N). 320 W+ is a 30% increase in power for a ~30% increase in perf.

No doubt that doubling the shader units within the same number of streaming multiprocessors (64/SM in Turing vs. 128/SM in Ampere) would increase power consumption. I don't consider it as efficient as Maxwell>Pascal, per se, but I also don't consider it a complete waste of power. The RT and Tensor cores also add to the weight.

It means little when it's dark silicon. It's largely useful for compute/CUDA workloads and at 4K, where each frame becomes more ALU-limited. The increased number and revised partitioning of ROPs help at 4K.

We may be doing GA a disservice, given that RT and denoising should show gains once games and the compiler are better optimised.

GPU-Z already shows the same data, using NVAPI, too

With a nice in-game overlay? :)
 
Last edited:
Joined
Aug 5, 2020
Messages
199 (0.12/day)
System Name BUBSTER
Processor I7 13700K (6.1 GHZ XTU OC)
Motherboard Z690 Gigabyte Aorus Elite Pro
Cooling Arctic Freezer II 360 RGB
Memory 32GB G.Skill Trident Z RGB DDR4 4800MHz 2x16GB
Video Card(s) Asus GeForce RTX 3070 Super Dual OC
Storage Kingston KC 3000 PCIE4 1Tb + 2 Kingston KC 3000 1TB PCIE4 RAID 0 + 4 TB Crucial gen 4 +12 TB HDD
Display(s) Sony Bravia A85 j OLED
Case Corsair Carbide Air 540
Audio Device(s) Asus Xonar Essence STX II
Power Supply Corsair AX 850 Titanium
Mouse Corsair Gaming M65 Pro RGB + Razr Taipan
Keyboard Asus ROG Strix Flare Cherry MX Red + Corsair Gaming K65 lux RGB
Software Windows 11 Pro x64
I've had 1080 SLI for 3 years already and it's great for the 4K experience: a minimum of 70-80 fps at Ultra in most games...
 
Joined
Apr 29, 2014
Messages
4,304 (1.11/day)
Location
Texas
System Name SnowFire / The Reinforcer
Processor i7 10700K 5.1ghz (24/7) / 2x Xeon E52650v2
Motherboard Asus Strix Z490 / Dell Dual Socket (R720)
Cooling RX 360mm + 140mm Custom Loop / Dell Stock
Memory Corsair RGB 16gb DDR4 3000 CL 16 / DDR3 128gb 16 x 8gb
Video Card(s) GTX Titan XP (2025mhz) / Asus GTX 950 (No Power Connector)
Storage Samsung 970 1tb NVME and 2tb HDD x4 RAID 5 / 300gb x8 RAID 5
Display(s) Acer XG270HU, Samsung G7 Odyssey (1440p 240hz)
Case Thermaltake Cube / Dell Poweredge R720 Rack Mount Case
Audio Device(s) Realtek ALC1150 (onboard)
Power Supply Rosewill Lightning 1300Watt / Dell Stock 750 / Brick
Mouse Logitech G5
Keyboard Logitech G19S
Software Windows 11 Pro / Windows Server 2016
Great review, I cannot wait for the RTX 3090 review!

It's interesting how this cooler performs; it seems to be a meaningful improvement over past FE coolers, especially given the power consumption of these cards. I am a little disappointed in the overclocking, though. Overclocking has been meh for a while, but this one has even less headroom than usual. Granted, the memory moved up and shows some decent performance gains, but these cards look like they're already pushed to their limit out of the box, with only minor improvement to come from aftermarket cards.

Still can't wait!
 

ppn

Joined
Aug 18, 2015
Messages
1,231 (0.36/day)
The important increase is in transistors. 28 billion (with about 5 billion of them in broken/disabled units, leaving roughly 23 billion active) is 70% more than the 2080 and 25% more than what the 2080 Ti had, and 25% is the same as the performance increase. On 7 nm DUV this would be a 426 mm² die instead of 628 mm²; the main reason to avoid that is that further shrinks are imminent at this point, and that's what we didn't get, sadly. On 6 nm EUV this would be about 360 mm² with +50% clock speed for the same power. So this is just another Titan: big and powerful, but it will fall inevitably, sometimes in less than 10-12 months. So yeah, $700 is not as good as you think. Except in the moment, and in the moment it is everything.
 

Cheeseball

Not a Potato
Supporter
Joined
Jan 2, 2009
Messages
2,039 (0.35/day)
Location
Pittsburgh, PA
System Name Titan
Processor AMD Ryzen™ 7 7950X3D
Motherboard ASRock X870 Taichi Lite
Cooling Thermalright Phantom Spirit 120 EVO CPU
Memory TEAMGROUP T-Force Delta RGB 2x16GB DDR5-6000 CL30
Video Card(s) ASRock Radeon RX 7900 XTX 24 GB GDDR6 (MBA)
Storage Crucial T500 2TB x 3
Display(s) LG 32GS95UE-B, ASUS ROG Swift OLED (PG27AQDP), LG C4 42" (OLED42C4PUA)
Case Cooler Master QUBE 500 Flatpack Macaron
Audio Device(s) Kanto Audio YU2 and SUB8 Desktop Speakers and Subwoofer, Cloud Alpha Wireless
Power Supply Corsair SF1000
Mouse Logitech Pro Superlight 2 (White), G303 Shroud Edition
Keyboard Keychron K2 HE Wireless / 8BitDo Retro Mechanical Keyboard (N Edition) / NuPhy Air75 v2
VR HMD Meta Quest 3 512GB
Software Windows 11 Pro 64-bit 24H2 Build 26100.2605
Nitpick: TSMC 16FF (12N) -> Samsung 10LPU (8N). 320 W+ is a 30% increase in power for a ~30% increase in perf.



It means little when it's dark silicon. It's largely useful for compute/CUDA workloads and at 4K, where each frame becomes more ALU-limited. The increased number and revised partitioning of ROPs help at 4K.

We may be doing GA a disservice, given that RT and denoising should show gains once games and the compiler are better optimised.



With a nice in-game overlay? :)

Hmm.. I know the TDP of the 2080 Ti is 250 W, but you can see it using around 273 W (average gaming) according to @W1zzard's charts. The 3080 is rated at 320 W, but it seems to hover around 303 W. Maybe NVIDIA is just overshooting their stated specs?
 
Joined
Mar 31, 2012
Messages
862 (0.19/day)
Location
NL
System Name SIGSEGV
Processor AMD Ryzen 9 9950X
Motherboard MSI MEG ACE X670E
Cooling Noctua NF-A14 IndustrialPPC Fan 3000RPM | Arctic P14 MAX
Memory Fury Beast 64 Gb CL30
Video Card(s) TUF 4090 OC
Storage 1TB 7200/256 SSD PCIE | ~ TB | 970 Evo | WD Black SN850X 2TB
Display(s) 27" /34"
Case O11 EVO XL
Audio Device(s) Realtek
Power Supply FSP Hydro TI 1000
Mouse g402
Keyboard Leopold|Ducky
Software LinuxMint
Benchmark Scores i dont care about scores
I hate to say it, but it's kinda weird with all those charts. Hype all the way. LOL.
Unimpressive performance; it feels like they released this card in a hurry. :oops:

Nice review, thanks.
 
Joined
Apr 10, 2020
Messages
504 (0.29/day)
Hardware Unboxed numbers (similar results):

RTX 3080 vs 2080 Ti: 14-game average at 1440p = +21%
RTX 3080 vs 2080 Ti: 14-game average at 4K = +31%

RTX 3080 vs RTX 2080: 14-game average at 1440p = +47%
RTX 3080 vs RTX 2080: 14-game average at 4K = +68%

Very nice gains at 4K and an average generational gain at 1440p (excluding Turing)...


Total system power consumption (at the wall): 523 W
GPU only, measured with NVIDIA's PCAT (Power Capture Analysis Tool) while playing DOOM: 327 W
An 8% performance-per-watt gain over Turing is far from impressive given Ampere moved to a new node.
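That perf-per-watt figure is easy to reconstruct, assuming roughly 273 W average gaming draw for the 2080 Ti (TPU's reading) against the 327 W PCAT number above:

```python
# Perf/W gain of the 3080 over the 2080 Ti at 4K, using the numbers quoted
# above; the 273 W 2080 Ti draw is an assumption taken from TPU's charts.
perf_ratio  = 1.31         # +31% at 4K (Hardware Unboxed, 14-game average)
power_ratio = 327 / 273    # ~1.20x more power
print(f"perf/W gain: {(perf_ratio / power_ratio - 1) * 100:.0f}%")  # ~9%
```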
 
Joined
Aug 2, 2012
Messages
2,017 (0.45/day)
Location
Netherlands
System Name TheDeeGee's PC
Processor Intel Core i7-11700
Motherboard ASRock Z590 Steel Legend
Cooling Noctua NH-D15S
Memory Crucial Ballistix 3200/C16 32GB
Video Card(s) Nvidia RTX 4070 Ti 12GB
Storage Crucial P5 Plus 2TB / Crucial P3 Plus 2TB / Crucial P3 Plus 4TB
Display(s) EIZO CX240
Case Lian-Li O11 Dynamic Evo XL / Noctua NF-A12x25 fans
Audio Device(s) Creative Sound Blaster ZXR / AKG K601 Headphones
Power Supply Seasonic PRIME Fanless TX-700
Mouse Logitech G500S
Keyboard Keychron Q6
Software Windows 10 Pro 64-Bit
Benchmark Scores None, as long as my games runs smooth.
The fan noise levels are enough for me to pass on an FE.

Looks like I will have to find a partner board that fits my Arctic Accelero Xtreme III.
 
Joined
Dec 31, 2009
Messages
19,371 (3.54/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Here you go. 1080ti came out at 700 dollars and performed almost twice as fast as a 980ti that also was 700 dollars
Math, my man. :(

That is 46% faster. You realize that 2x = 100%, right? For example, if card A ran at 100 FPS and card B ran at 146 FPS, card B is 46% faster than card A. If it were "double", it would be 100% faster.
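Spelled out as code, since the ratio-versus-percentage mixup keeps coming up:

```python
# "X% faster" is the FPS ratio minus one, expressed as a percentage.
fps_a, fps_b = 100, 146
print(f"{(fps_b / fps_a - 1) * 100:.0f}% faster")        # 46% faster
print(f"double = {(200 / 100 - 1) * 100:.0f}% faster")   # 100% faster
```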
 
Joined
Oct 17, 2012
Messages
9,781 (2.20/day)
Location
Massachusetts
System Name Americas cure is the death of Social Justice & Political Correctness
Processor i7-11700K
Motherboard Asrock Z590 Extreme wifi 6E
Cooling Noctua NH-U12A
Memory 32GB Corsair RGB fancy boi 5000
Video Card(s) RTX 3090 Reference
Storage Samsung 970 Evo 1Tb + Samsung 970 Evo 500Gb
Display(s) Dell - 27" LED QHD G-SYNC x2
Case Fractal Design Meshify-C
Audio Device(s) on board
Power Supply Seasonic Focus+ Gold 1000 Watt
Mouse Logitech G502 spectrum
Keyboard AZIO MGK-1 RGB (Kaith Blue)
Software Win 10 Professional 64 bit
Benchmark Scores the MLGeesiest
It didn't seem to have that damn adhesive you need to heat up to access the fasteners, at least not like the 1xxx reference cards, unless I missed the picture with that.

The 9xx reference had those damn plastic-type hex screws that stripped if you coughed near them.
 

Raevenlord

News Editor
Joined
Aug 12, 2016
Messages
3,755 (1.23/day)
Location
Portugal
System Name The Ryzening
Processor AMD Ryzen 9 5900X
Motherboard MSI X570 MAG TOMAHAWK
Cooling Lian Li Galahad 360mm AIO
Memory 32 GB G.Skill Trident Z F4-3733 (4x 8 GB)
Video Card(s) Gigabyte RTX 3070 Ti
Storage Boot: Transcend MTE220S 2TB, Kingston A2000 1TB, Seagate IronWolf Pro 14 TB
Display(s) Acer Nitro VG270UP (1440p 144 Hz IPS)
Case Lian Li O11DX Dynamic White
Audio Device(s) iFi Audio Zen DAC
Power Supply Seasonic Focus+ 750 W
Mouse Cooler Master Masterkeys Lite L
Keyboard Cooler Master Masterkeys Lite L
Software Windows 10 x64
At 60 Hz, yes.

My 1440p G-Sync monitor goes up to 165 Hz; I wouldn't say overkill for that. It depends on the game.


True. Mine goes up to 144, so looking at the figures, it may actually make sense for it, especially with future-proofing concerns. I suppose it depends mostly on Cyberpunk 2077's performance, though. Luckily, I have time until we have some information from the competition.
 

Cheeseball

Not a Potato
Supporter
Joined
Jan 2, 2009
Messages
2,039 (0.35/day)
Location
Pittsburgh, PA
System Name Titan
Processor AMD Ryzen™ 7 7950X3D
Motherboard ASRock X870 Taichi Lite
Cooling Thermalright Phantom Spirit 120 EVO CPU
Memory TEAMGROUP T-Force Delta RGB 2x16GB DDR5-6000 CL30
Video Card(s) ASRock Radeon RX 7900 XTX 24 GB GDDR6 (MBA)
Storage Crucial T500 2TB x 3
Display(s) LG 32GS95UE-B, ASUS ROG Swift OLED (PG27AQDP), LG C4 42" (OLED42C4PUA)
Case Cooler Master QUBE 500 Flatpack Macaron
Audio Device(s) Kanto Audio YU2 and SUB8 Desktop Speakers and Subwoofer, Cloud Alpha Wireless
Power Supply Corsair SF1000
Mouse Logitech Pro Superlight 2 (White), G303 Shroud Edition
Keyboard Keychron K2 HE Wireless / 8BitDo Retro Mechanical Keyboard (N Edition) / NuPhy Air75 v2
VR HMD Meta Quest 3 512GB
Software Windows 11 Pro 64-bit 24H2 Build 26100.2605
True. Mine goes up to 144, so looking at the figures, it may actually make sense for it, especially with future-proofing concerns. I suppose it depends mostly on Cyberpunk 2077's performance, though. Luckily, I have time until we have some information from the competition.

I'm gonna throw my guess out there: the RTX 3080 will probably manage around 100 FPS in Cyberpunk 2077 at 4K. This is based on the improvements over the Witcher 3 engine and the fact that they are still optimizing it for the current-gen (PS4/XB1) consoles.
 
Joined
Nov 11, 2016
Messages
3,459 (1.17/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razer DeathAdder V3
Keyboard Razer Huntsman V3 Pro TKL
Software win11
Hmm.. I know the TDP of the 2080 Ti is 250 W, but you can see it using around 273 W (average gaming) according to @W1zzard's charts. The 3080 is rated at 320 W, but it seems to hover around 303 W. Maybe NVIDIA is just overshooting their stated specs?

It's simple to explain:
The 2080 Ti FE's 260 W TDP puts the chip in the lower (more efficient) region of the perf/power curve, while the 3080's 320 W TGP sits at a higher point on that curve.
That means it's easy to overclock the 2080 Ti by simply raising the power limit, while raising the power limit on the 3080 does next to nothing (as every review has pointed out; very similar to the 5700 XT).
It also means that lowering the TGP of the 3080 to a level similar to the 2080 Ti's, like ComputerBase did, will not lower the 3080's performance by much, improving its efficiency if you so require.
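To illustrate why the top of that curve is so steep, here is a toy model assuming the usual first-order rule that dynamic power scales with frequency times voltage squared, and that voltage has to rise roughly in step with clock near the limit, so power grows roughly with the cube of performance. This sketches the shape of the curve, not measured data:

```python
# Toy perf/power curve: power ~ perf^3 near the top of the voltage/frequency
# curve (first-order approximation, normalized so perf 1.0 costs power 1.0).
def power_for_perf(perf: float) -> float:
    return perf ** 3

for perf in (0.95, 1.00, 1.05):
    print(f"perf {perf:.2f} -> relative power {power_for_perf(perf):.2f}")
# perf 0.95 -> relative power 0.86  (give up 5% perf, save ~14% power)
# perf 1.00 -> relative power 1.00
# perf 1.05 -> relative power 1.16  (5% more perf costs ~16% more power)
```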

 