
NVIDIA GeForce RTX 3080 Founders Edition

Joined
Dec 26, 2006
Messages
3,806 (0.58/day)
Location
Northern Ontario Canada
Processor Ryzen 5700x
Motherboard Gigabyte X570S Aero G R1.1 BiosF5g
Cooling Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm
Memory Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A)
Video Card(s) AMD RX 6800 - Asus Tuf
Storage Kingston KC3000 1TB & 2TB & 4TB Corsair MP600 Pro LPX
Display(s) LG 27UL550-W (27" 4k)
Case Be Quiet Pure Base 600 (no window)
Audio Device(s) Realtek ALC1220-VB
Power Supply SuperFlower Leadex V Gold Pro 850W ATX Ver2.52
Mouse Mionix Naos Pro
Keyboard Corsair Strafe with browns
Software W10 22H2 Pro x64
It's amazing that you've just signed up to spoil the launch with your very valuable opinion. Also, remember, we live in a free society, so please stop it with "overpriced". Did NVIDIA put a gun to your head and ask you to buy any of their GPUs? No? Then how on Earth are they overpriced? Also, I agree, "10GB VRAM is very low... for 4K". Except this card features 10GB, and we have close to zero games which actually require more than 8GB of VRAM at 4K. Also read the rest of my comment.

Speaking of "planned obsolescence" due to a lack of VRAM:



See how the GTX 1060 3GB still works relatively OK despite not having enough VRAM even 5 years ago. Yes, it's very slow, in fact 33% slower, but not 2 or 3 times as slow as its 6GB brother. Also, see how both cards are unusable at this resolution.


  • Game developers target the most popular GPUs, and most of them have 8GB of VRAM or even less. Consoles also won't really offer more than 8GB of VRAM because their memory pool is shared between the OS, game code and VRAM, and all of that has to fit into 16GB or less (a rough back-of-the-envelope budget is sketched below). And both consoles feature uber-fast texture streaming off their storage.
  • Also, it's been shown time and again that NVIDIA has superb drivers and their GPUs' performance is not seriously affected even when there's not "enough" VRAM. E.g. Call of Duty: Modern Warfare eats VRAM for breakfast (over 9.5GB of VRAM use at 4K), yet are NVIDIA cards with 6GB of VRAM affected? Not at all. Besides, with RTX IO it all becomes moot.
  • Lastly, by the time 10GB of VRAM is not enough, this GPU's performance will be too low even if it had twice as much VRAM.
Still, you can always buy a future-proof GPU from AMD. BTW, do you remember the Radeon R9 Fury? It was released with a paltry 4GB of VRAM, which was strangely enough for AMD fans.
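As for the budget sketch mentioned in the list above, here's a minimal Python back-of-the-envelope (buffer counts and the console OS reserve are my assumptions, purely for illustration):

```python
# Back-of-the-envelope VRAM math; buffer counts and the console OS
# reserve below are assumptions for illustration, not measured values.
WIDTH, HEIGHT = 3840, 2160   # 4K
BYTES_PER_PIXEL = 4          # 32-bit RGBA

def targets_mib(count: int) -> float:
    """MiB consumed by `count` full-resolution 32-bit render targets."""
    return count * WIDTH * HEIGHT * BYTES_PER_PIXEL / 2**20

print(f"10 render targets at 4K: {targets_mib(10):.0f} MiB")  # ~316 MiB

console_total_gib = 16.0
os_reserve_gib = 2.5         # assumed OS/system reservation
print(f"Console pool left for game code + 'VRAM': "
      f"{console_total_gib - os_reserve_gib:.1f} GiB shared")
# The bulk of VRAM goes to textures and meshes, which engines can
# stream, so a shortfall tends to degrade gracefully rather than hard.
```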

Most people whose budget is in the GTX 1060 range are also people whose monitor budget is in the 1080p range. Also, every game is different; look at The Witcher 3.

If all I had was a 1080p monitor and a small budget, I wouldn't get more than 4GB, and maybe even a 2GB card, since a GPU's life cycle is so short anyway compared to that of a monitor, which is usually 10 years.
 
Joined
Nov 11, 2016
Messages
3,393 (1.16/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
Vega was also on an inferior node & admittedly an inferior uarch, which compounded their problem. But what's clear is that there's still bias against AMD & towards Nvidia: JHH can launch a proverbial turd (remember Fermi?) & still get accolades, while AMD not only has to please the audience but also has to pay them to do so :shadedshu:

RTG just reaps what it has sown; the market will respond when RTG makes a good product like Ryzen.
Vega was too inefficient.
Navi is too expensive (compared to Polaris); the 5500 XT and 5600 XT are like sacrificial pawns.
Not to mention the state of their drivers and hardware compatibility. Well, you can blame AMD for that, because the R&D funding was pooled into CPU development; that was also why Raja left.
 
Joined
Mar 10, 2015
Messages
3,984 (1.13/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
Joined
Oct 22, 2014
Messages
14,055 (3.83/day)
Location
Sunshine Coast
System Name Lenovo ThinkCentre
Processor AMD 5650GE
Motherboard Lenovo
Memory 32 GB DDR4
Display(s) AOC 24" Freesync 1m.s. 75Hz
Mouse Lenovo
Keyboard Lenovo
Software W11 Pro 64 bit
There's no way I'm buying ANYTHING from someone who wears a leather jacket in the kitchen. :kookoo:
Would you rather he wore only an apron? :p
 
Joined
Dec 26, 2006
Messages
3,806 (0.58/day)
Location
Northern Ontario Canada
Processor Ryzen 5700x
Motherboard Gigabyte X570S Aero G R1.1 BiosF5g
Cooling Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm
Memory Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A)
Video Card(s) AMD RX 6800 - Asus Tuf
Storage Kingston KC3000 1TB & 2TB & 4TB Corsair MP600 Pro LPX
Display(s) LG 27UL550-W (27" 4k)
Case Be Quiet Pure Base 600 (no window)
Audio Device(s) Realtek ALC1220-VB
Power Supply SuperFlower Leadex V Gold Pro 850W ATX Ver2.52
Mouse Mionix Naos Pro
Keyboard Corsair Strafe with browns
Software W10 22H2 Pro x64
W1zz, what is the BIOS version for that EVGA Z390 motherboard?
 
Joined
Apr 12, 2013
Messages
7,477 (1.77/day)
RTG just reaps what it has sown; the market will respond when RTG makes a good product like Ryzen.
Oh sure, when AMD fucks up it's their fault, and when they don't it's still their fault :rolleyes:

Let's see: how many GPUs or series (uarchs) do you think they didn't botch in the last 5 years?
Well, you can blame AMD for that, because the R&D funding was pooled into CPU development
For good reason: going EPYC not only helped them stay afloat, but they also now lead the top-end CPU market on virtually all platforms.
 
Joined
Feb 14, 2012
Messages
2,353 (0.51/day)
System Name msdos
Processor 8086
Motherboard mainboard
Cooling passive
Memory 640KB + 384KB extended
Video Card(s) EGA
Storage 5.25"
Display(s) 80x25
Case plastic
Audio Device(s) modchip
Power Supply 45 watts
Mouse serial
Keyboard yes
Software disk commander
Benchmark Scores still running
Looks good, but I am uneasy with that much wattage for a video card. The 3070 is supposed to be 220W? And only 8GB? Why would anyone buy 220W of GPU firepower and only have 8GB?
 
Joined
Jul 3, 2019
Messages
322 (0.17/day)
Location
Bulgaria
Processor 6700K
Motherboard M8G
Cooling D15S
Memory 16GB 3k15
Video Card(s) 2070S
Storage 850 Pro
Display(s) U2410
Case Core X2
Audio Device(s) ALC1150
Power Supply Seasonic
Mouse Razer
Keyboard Logitech
Software 22H2
Too much hype, man, too much hype; nothing can live up to overhype. (OT) CP2077 will have the same fate.

OMG, all those CUDA cores, forgetting the need for a balanced architecture that can feed all those execution units. Remember Vega 64 vs GTX 1080: 4k vs 2.5k ALUs.

It's a good generational jump.

Nvidia shouldn't have pushed it beyond the efficiency curve. Also, 10GB for a 4K card? Yeah, no.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,700 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Do you think a voltage mod, like the shunt mods I have done on many cards, will help with overclocking headroom?
most certainly

Is there any plan to add the Call of Duty series back into reviews any time soon?
No plans, bnet banned me for trying to benchmark the game, and their support didn't want to admit it and chose to ignore my ticket for 3 months.
Always online sucks, because they will force perf-changing patches on you whenever it's most inconvenient.
 
Joined
Jun 19, 2019
Messages
212 (0.11/day)
If it has a manual power limit adjustment range of up to 370 W from a default of 320 W, is it also possible to lower the maximum power consumption, for example to 250 W?
 

Deleted member 185088

Guest
While the performance is impressive (finally a true 4K card), and now that it's been confirmed that it runs 4K 120Hz HDR on LG OLEDs, this combination is just phenomenal.
But the card is way too expensive; the $700 is just fake (I assume it doesn't include tax), and elsewhere it's much more expensive (even in Asia, where the card is built, it can cost up to $1000 or more). Why not use similar pricing to the consoles' MSRP? The PS5 and Series X cost around the same in Europe, North America and Japan.
At the end of the day, great performance ruined by stupid pricing (I'm talking about nVidia's MSRP).
 
Joined
Nov 11, 2016
Messages
3,393 (1.16/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
While the performance is impressive (finally a true 4K card), and now that it's been confirmed that it runs 4K 120Hz HDR on LG OLEDs, this combination is just phenomenal.
But the card is way too expensive; the $700 is just fake (I assume it doesn't include tax), and elsewhere it's much more expensive (even in Asia, where the card is built, it can cost up to $1000 or more). Why not use similar pricing to the consoles' MSRP? The PS5 and Series X cost around the same in Europe, North America and Japan.
At the end of the day, great performance ruined by stupid pricing (I'm talking about nVidia's MSRP).

You are mixing up supply and demand there; retailers just like to jack up the price when stuff is in short supply.
Just wait a few months until prices settle; PS5/XBX prices will be inflated when they first launch too.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,147 (1.27/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
If it has a manual power limit adjustment range of up to 370 W, from a default of 320 W, is it also possible to lower max power consumption, for example to 250W?
There are a couple of links on page 9; it seems that this is indeed possible and the results are promising. One site brought the slider down to an 83% max power limit to match the 270 W draw of a 2080 Ti, and it only resulted in a ~4% performance loss, not bad at all! The hypothesis is that Nvidia had to push the GPU beyond the optimal efficiency sweet spot to hit the performance target they were seeking.
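For anyone wanting to replicate that, here's a minimal sketch using NVIDIA's nvidia-smi CLI (a real tool; the wrapper function names are mine, and it needs admin rights since the driver clamps requests to the card's allowed range):

```python
# Minimal sketch: cap board power via nvidia-smi (needs admin rights).
import subprocess

def set_power_limit(watts: int, gpu: int = 0) -> None:
    """Ask the driver to cap board power for GPU `gpu` at `watts` W."""
    subprocess.run(["nvidia-smi", "-i", str(gpu), "-pl", str(watts)],
                   check=True)

def show_power(gpu: int = 0) -> None:
    """Print the driver's current power readings and limits."""
    subprocess.run(["nvidia-smi", "-i", str(gpu), "-q", "-d", "POWER"],
                   check=True)

if __name__ == "__main__":
    set_power_limit(250)   # e.g. 250 W instead of the 320 W default
    show_power()
```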
 
Joined
Jan 11, 2005
Messages
1,491 (0.21/day)
Location
66 feet from the ground
System Name 2nd AMD puppy
Processor FX-8350 vishera
Motherboard Gigabyte GA-970A-UD3
Cooling Cooler Master Hyper TX2
Memory 16 Gb DDR3:8GB Kingston HyperX Beast + 8Gb G.Skill Sniper(by courtesy of tabascosauz &TPU)
Video Card(s) Sapphire RX 580 Nitro+;1450/2000 Mhz
Storage SSD :840 pro 128 Gb;Iridium pro 240Gb ; HDD 2xWD-1Tb
Display(s) Benq XL2730Z 144 Hz freesync
Case NZXT 820 PHANTOM
Audio Device(s) Audigy SE with Logitech Z-5500
Power Supply Riotoro Enigma G2 850W
Mouse Razer copperhead / Gamdias zeus (by courtesy of sneekypeet & TPU)
Keyboard MS Sidewinder x4
Software win10 64bit ltsc
Benchmark Scores irrelevant for me
If they had produced it at TSMC on 7 nm, it would have had the projected leap and would also run cooler. Price-wise, no matter how I look at it, I'll never pay more than 300 for a card, so good purchase to those who'll buy it!
 
Joined
Aug 9, 2019
Messages
1,668 (0.87/day)
Processor 7800X3D 2x16GB CO
Motherboard Asrock B650m HDV
Cooling Peerless Assassin SE
Memory 2x16GB DR A-die@6000c30 tuned
Video Card(s) Asus 4070 dual OC 2610@915mv
Storage WD blue 1TB nvme
Display(s) Lenovo G24-10 144Hz
Case Corsair D4000 Airflow
Power Supply EVGA GQ 650W
Software Windows 10 home 64
Benchmark Scores Superposition 8k 5267 Aida64 58.5ns
That's because lower resolutions are CPU limited.

Edit: Actually, you're making a great point. I'm using the same 303 W typical power consumption value from the power measurements page for all 3 resolutions, which isn't 100% accurate. Because some games are CPU limited, in those games the power consumption is down too, which I'm not taking into account.
Perhaps all GPU power consumption testing should be done at 4K, so we get a true idea of how much power the card uses when not CPU-limited? :)
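To illustrate the skew, a toy Python calculation with assumed per-resolution figures (not measurements from this review):

```python
# Toy numbers only: shows how the efficiency math changes if power is
# measured per resolution instead of reusing one flat 303 W value.
fps   = {"1080p": 160, "1440p": 120, "4K": 70}   # assumed averages
watts = {"1080p": 255, "1440p": 290, "4K": 315}  # assumed, CPU-limited low res
FLAT_WATTS = 303

for res in fps:
    print(f"{res}: {fps[res] / watts[res]:.2f} fps/W measured vs "
          f"{fps[res] / FLAT_WATTS:.2f} fps/W with the flat value")
```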
 

Deleted member 185088

Guest
You are mixing up supply and demand there; retailers just like to jack up the price when stuff is in short supply.
Just wait a few months until prices settle; PS5/XBX prices will be inflated when they first launch too.
No, I'm not; this is nVidia's MSRP. For instance, they price the 3080 at $699 in the US but $1000+ in Japan; the prices are from their website.
So why didn't they price the cards at $699 or thereabouts everywhere?
The consoles have around the same MSRP in most markets.
 
Joined
Jun 6, 2009
Messages
8 (0.00/day)
Under the overclocked performance testing, what settings were used in Unigine Heaven?

Thanks
 
Joined
Jun 28, 2019
Messages
100 (0.05/day)
It isn't NVIDIA's fault if you can't read a chart; it's 1.9x at the same performance...
Of course I had noticed it; on Tom's Hardware Italy I had also pointed it out to everyone, but they still declared 1.5x at the same power draw! See the picture.
Well, where is this 1.5x in performance per watt?
They declared something false; there is no excuse.
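For what it's worth, both numbers can be computed from the same pair of cards; the ratio depends entirely on where on the curve you compare. A toy sketch with purely illustrative figures:

```python
# Illustrative only: perf/W ratios depend on the comparison point.
def perf_per_watt(fps: float, watts: float) -> float:
    return fps / watts

turing     = perf_per_watt(60, 270)   # e.g. a Turing card at full power
ampere_max = perf_per_watt(75, 320)   # Ampere at full power
ampere_iso = perf_per_watt(60, 145)   # Ampere throttled to Turing's fps

print(f"at full power: {ampere_max / turing:.2f}x perf/W")  # ~1.05x here
print(f"at iso-perf:   {ampere_iso / turing:.2f}x perf/W")  # ~1.86x here
# Marketing quotes the iso-performance point, which sits low on the
# voltage/frequency curve where the new chip is far more efficient.
```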

 
Joined
Feb 3, 2017
Messages
3,726 (1.32/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
No, I'm not; this is nVidia's MSRP. For instance, they price the 3080 at $699 in the US but $1000+ in Japan; the prices are from their website.
So why didn't they price the cards at $699 or thereabouts everywhere?
The consoles have around the same MSRP in most markets.
Consoles are not quite around the same MSRP either. For example, the recently announced price of the Xbox Series X is $499/499€/£449 and 49,980 yen. Compared to the US price, the Japanese price is reduced, as it is a difficult market for Xbox, and the EU/GB prices reflect included taxes.

There are a number of reasons to vary MSRP across different regions or countries - taxes are a big one and marketing position is another.

13-24% performance increase over the 2080 Ti, depending on resolution, with 28% higher TDP. Don't bother selling your 2080 Ti.
15-31%. The RTX 3080 is considerably more CPU limited at 1440p.
 
Joined
Jun 19, 2019
Messages
212 (0.11/day)
There are a couple of links on page 9; it seems that this is indeed possible and the results are promising. One site brought the slider down to an 83% max power limit to match the 270 W draw of a 2080 Ti, and it only resulted in a ~4% performance loss, not bad at all! The hypothesis is that Nvidia had to push the GPU beyond the optimal efficiency sweet spot to hit the performance target they were seeking.

Combine that with undervolting and maybe there's no performance loss ;)
 
Joined
Nov 11, 2016
Messages
3,393 (1.16/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
No, I'm not; this is nVidia's MSRP. For instance, they price the 3080 at $699 in the US but $1000+ in Japan; the prices are from their website.
So why didn't they price the cards at $699 or thereabouts everywhere?
The consoles have around the same MSRP in most markets.

The US and Japan do not include VAT in their listed prices; everywhere else does, LOL. Right now it's tough to avoid sales tax when buying online in the US, so yeah, please add 8-12% VAT depending on where you live.
If you can buy online from Nvidia's website without VAT, go right ahead; it's a mighty good deal.
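The arithmetic, as a quick sketch (VAT rates below are illustrative assumptions, not official figures):

```python
# Illustrative VAT math; rates are assumptions, not official figures.
US_MSRP = 699.0  # pre-tax USD

assumed_vat = {"EU country A": 0.19, "EU country B": 0.25, "UK": 0.20}
for region, rate in assumed_vat.items():
    print(f"{region}: ~${US_MSRP * (1 + rate):.0f} tax-inclusive equivalent")

# 8-12% US sales tax on top of $699 lands at roughly $755-783:
print(f"US with 8-12% sales tax: ${US_MSRP * 1.08:.0f}-${US_MSRP * 1.12:.0f}")
```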
 
Joined
May 2, 2017
Messages
7,762 (2.83/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Just check the relative performance of the 2080 Ti compared to the 3080 across the 3 resolutions and you will see a change:
1080p: 87%
1440p: 81%
4K: 75%
Might as well use DSR to do some 8K testing on this bad boy :D
People with 1080p and 1440p screens might as well use DSR if they have the 3080; otherwise you are just wasting all the performance prowess of the 3080.
...if those numbers show anything, it's that there are no CPU bottlenecks at 4K, given that it scales beyond the lower resolutions. Which is exactly what I was pointing out, while @birdie was claiming that this GPU is CPU limited even at 4K:
quite a lot of games being reviewed are severely CPU limited even at 4K!
Which I then asked for some data demonstrating, as this review fails to show any such bottleneck. And, as GN pointed out in their excellent video review, not all seeming CPU limitations are actual CPU limitations; some examples of poor GPU scaling are down to the architecture or layout of the GPU causing GPU bottlenecks that don't show up as 100% load.
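In fact, the review's own relative numbers make the point; a quick sanity check in Python using the 87/81/75% figures quoted above:

```python
# The 3080's lead grows with resolution, the opposite of what a 4K CPU
# bottleneck would produce. Relative 2080 Ti performance from the review:
rel_2080ti = {"1080p": 0.87, "1440p": 0.81, "4K": 0.75}

for res, rel in rel_2080ti.items():
    print(f"{res}: 3080 is {1 / rel - 1:.0%} faster")
# 1080p: 15%, 1440p: 23%, 4K: 33% -> scaling improves as resolution
# rises, consistent with CPU limits easing at 4K, not binding there.
```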
That's a strange way to think about it; if it is possible to exceed that amount, then it inevitably becomes a limitation.
Only if there is a perceptible difference in graphical quality; without that, the difference is entirely theoretical. And that's the entire point: a lot of games have extremely "expensive" Ultra settings tiers with near-imperceptible or even entirely imperceptible differences in quality. If your benchmark for not having a VRAM limitation is the ability to enable all of these, then your benchmark is problematic. If the thing you're worried about for this GPU is that you might at some point in the future need to lower settings an imperceptible amount to maintain performance, then ... what are you worried about, exactly? Stop promoting graphical quality nocebo effects, please. Because at that point, all you are arguing for is the value/security of knowing your GPU can handle everything set to Ultra, no matter what this actually means for the quality of the game. Which is just silly. Not only is it an impossible target, but it's a fundamentally irrational one.
 
Joined
Jul 26, 2019
Messages
418 (0.22/day)
Processor R5 5600X
Motherboard Asus TUF Gaming X570-Plus
Memory 32 GB 3600 MT/s CL16
Video Card(s) Sapphire Vega 64
Storage 2x 500 GB SSD, 2x 3 TB HDD
Case Phanteks P300A
Software Manjaro Linux, W10 if I have to
I don't know guys. The 2080 Ti is 19% slower on average in these tests. That doesn't seem super impressive to me -- especially compared to the hype.
 