# NVIDIA GeForce RTX 3080 Founders Edition



## W1zzard (Sep 16, 2020)

NVIDIA's new GeForce RTX 3080 "Ampere" Founders Edition is a truly impressive graphics card. Not only does it look fantastic, its performance is also better than even the RTX 2080 Ti's. In our RTX 3080 Founders Edition review, we're also taking a close look at the new cooler, which runs quietly without throttling and features idle fan stop.

*Show full review*


----------



## R0H1T (Sep 16, 2020)

Oh boy, now you know why *Nvidia should be worried & rightly so*


----------



## wolf (Sep 16, 2020)

Great review as always, will be getting one stock permitting!


----------



## mrthanhnguyen (Sep 16, 2020)

80°C on the test bench at 20°C ambient?


----------



## Metroid (Sep 16, 2020)

3080 vs. 5700 = 100% faster at 4K overall. Amazing; finally a true 4K gaming GPU, and the first HDMI 2.1 GPU.


----------



## gridracedriver (Sep 16, 2020)

As I expected: a 10~20% increase in perf/watt and 20~30% more performance than the 2080 Ti, versus the 1.9x declared in NVIDIA's slides. 
I guess they really are worried about the arrival of RDNA2.


----------



## Vya Domus (Sep 16, 2020)

Would have liked some memory usage figures. I am not quite convinced it really is a must-have for 4K with hardly any more memory than a high-end card from, what, 7 years ago (290X 8GB)? Not much progress in that area.

Had it been at least a 12 GB card, which we know they could easily have done because the bus interfaces are there, it would have piqued my interest enough to buy one (well, a couple of months down the line, as the price gouging somewhat diminishes). But as it is, nah.


----------



## Valantar (Sep 16, 2020)

While absolute performance is definitely impressive, the fact that perf/W is not moving whatsoever is ... a bit worrying, frankly. Sure makes it look like Nvidia is pushing this card in a similar manner as AMD did with their Vega cards, and ... that's not very promising. Here's hoping the 3070 behaves more sensibly. But going by this data alone, AMD might have a serious shot at taking the perf/W crown this go around.


----------



## W1zzard (Sep 16, 2020)

Valantar said:


> that perf/W is not moving


Not moving? 1080 Ti = 70%, 2080 Ti = 85%, 3080 = 100%


----------



## Valantar (Sep 16, 2020)

W1zzard said:


> Not moving? 1080 Ti = 70%, 2080 Ti = 85%, 3080 = 100%


It's moving at 4K, but they're actually _behind_ AMD's past generation (!) and their own (!!) at 1080p, and just barely ahead at 1440p. For a new generation with a supposed full node shrink, that's pretty weak IMO.


----------



## W1zzard (Sep 16, 2020)

Valantar said:


> It's moving at 4K, but they're actually _behind_ AMD's past generation (!) and their own (!!) at 1080p, and just barely ahead at 1440p. For a new generation with a supposed full node shrink, that's pretty weak IMO.


That's because lower resolutions are CPU limited. 

Edit: Actually, you're making a great point. I'm using the same 303 W typical power consumption value from the power measurements page for all 3 resolutions, which isn't 100% accurate. Because some games are CPU limited, power consumption in those games is lower too, which I'm not taking into account.
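The mismatch described here can be sketched in a few lines. Only the 303 W figure comes from the review; the per-resolution FPS and power numbers below are made-up placeholders purely to illustrate the effect:

```python
# Sketch of the issue above: dividing every resolution's FPS by one shared
# power figure understates perf/W wherever a game is CPU limited and the
# GPU actually draws less. All FPS and per-run wattages are illustrative.

def perf_per_watt(avg_fps, power_w):
    """Performance per watt, in FPS/W."""
    return avg_fps / power_w

SHARED_POWER = 303  # W, the single "typical gaming" value from the review

runs = {
    "1080p": {"fps": 180.0, "power": 250.0},  # partly CPU limited: lower draw
    "1440p": {"fps": 140.0, "power": 290.0},
    "4K":    {"fps":  85.0, "power": 303.0},  # fully GPU bound
}

for res, r in runs.items():
    summary = perf_per_watt(r["fps"], SHARED_POWER)  # what the charts use
    actual = perf_per_watt(r["fps"], r["power"])     # with per-run power
    print(f"{res}: summary {summary:.3f} FPS/W vs per-run {actual:.3f} FPS/W")
```

At 4K the two numbers coincide, which is why the 4K perf/W chart is the most trustworthy one.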


----------



## iuliug (Sep 16, 2020)

"Huge performance increase over RTX 2080" - 30% is huge - sure....

I am editing this post - I was wrong. Performance is up 35 to 60-ish percent, which is significant and in line with what used to be expected from a generational/node advancement, albeit with a power consumption increase.


----------



## FreedomEclipse (Sep 16, 2020)

Well, that's the cooling question answered for me. I mean, yes, the card runs HOT like any other reference card does, but when you take into account how much horsepower it has over the 2080 Ti, that is actually pretty good.

@W1zzard 

Did you log the CPU thermals while you were running the tests? I'm interested to know how much all the heat from the GPU would have impacted the thermals of other components.


----------



## W1zzard (Sep 16, 2020)

FreedomEclipse said:


> Did you log the CPU thermals while you were running the tests?


I did not, it's watercooled anyway. Could be worth a separate article.


----------



## Valantar (Sep 16, 2020)

W1zzard said:


> That's because lower resolutions are CPU limited.
> 
> Edit: Actually, you're making a great point. I'm using the same 303 W typical power consumption value from the power measurements page for all 3 resolutions, which isn't 100% accurate. Because some games are CPU limited, power consumption in those games is lower too, which I'm not taking into account.


That makes sense. Also, there's a point to be made about the test suite if enough of it is CPU limited at 1080p for it to skew results in that manner. Time for an update? Besides, if the GPU is idle, waiting for the CPU, it should be consuming a lot less power, no?


----------



## W1zzard (Sep 16, 2020)

Valantar said:


> Time for an update?


I just updated. The issue is not only old games.



Valantar said:


> Besides, if the GPU is idle, waiting for the CPU, it should be consuming a lot less power, no?


Correct, as I said, that's not reflected in my perf/W summary scores


----------



## renz496 (Sep 16, 2020)

iuliug said:


> "Huge performance increase over RTX 2080" - 30% is huge - sure....


Did you even read the review?


----------



## mak1skav (Sep 16, 2020)

Looks quite good. Not having extra money in my pocket at the moment will help me not to jump the gun and wait a little longer to see what AMD has to offer before deciding on my upgrade. I have to admit it looks quite tempting though.


----------



## Valantar (Sep 16, 2020)

W1zzard said:


> I just updated. The issue is not only old games.
> 
> Correct, as I said, that's not reflected in my perf/W summary scores


Understandable. Guess it might be time to add power draw measurements at each resolution then? And possibly add - as an optional extra for GPUs that hit CPU limitations, like this one - test in relevant resolutions in both GPU and CPU limited titles? Of course that adds a lot of work, but that seems to be the nature of all benchmarking - the workload just grows over time. There's also an argument to be made for producing an overall performance graph that excludes CPU-bound games, though at that point things might get too complicated for the average reader.


----------



## ppn (Sep 16, 2020)

So: RTX on, same performance loss; DLSS, same benefits.
Divide the new shader count by 2 and compare with the 20 series and you get about +5% efficiency: 8704 Ampere shaders ≈ 4352 Turing shaders (the 2080 Ti's count) × 1.05. Wow, just amazing. 3070 = 2080 Super: 2944 × 1.05.
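The arithmetic in this post boils down to a rule of thumb: halve Ampere's doubled FP32 shader count, credit roughly 5% per-shader gain, and you land near a comparable Turing card. A throwaway sketch of the poster's heuristic (not an official formula):

```python
# The post's heuristic: an Ampere card behaves like a Turing card with half
# the shader count plus ~5% per-shader efficiency. Poster's rule of thumb only.

def turing_equivalent_shaders(ampere_shaders, per_shader_gain=1.05):
    """Map an Ampere FP32 shader count to a rough Turing-equivalent count."""
    return ampere_shaders / 2 * per_shader_gain

# RTX 3080 (8704 shaders) vs. RTX 2080 Ti (4352 shaders)
print(turing_equivalent_shaders(8704))  # ~4570, about 5% above the 2080 Ti
# RTX 3070 (5888 shaders) vs. RTX 2080 Super (2944 shaders)
print(turing_equivalent_shaders(5888))  # ~3091, about 5% above the 2080 Super
```

The review's measured 4K numbers come out ahead of this estimate, so treat it as a lower bound at best.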


----------



## W1zzard (Sep 16, 2020)

Valantar said:


> Guess it might be time to add power draw measurements at each resolution then?


Rather the only solution I can think of is to record power during all testing in all games, to get the proper power average. Not something I'm able to do at the moment.



Valantar said:


> that excludes CPU-bound games


4K is a good approximation of that. For the 3080, Full HD really is more of an academic resolution, I doubt anyone will buy that card for 1080p


----------



## Raevenlord (Sep 16, 2020)

I am now thoroughly convinced the RTX 3080 is absolute overkill for my 1440p monitor.


----------



## W1zzard (Sep 16, 2020)

ppn said:


> 2080 Ti has more than 64 ROPs. @w1zz


Fixed, lol, this has been wrong in every single review since RTX 2080 Ti FE t.t .. congrats for being the first to notice it. Now I have to figure out how to fix all those reviews


----------



## londiste (Sep 16, 2020)

> Makes little sense for gamers without 4K monitor


I don't know about that. My 1440p@165Hz monitor is telling me it wants an RTX 3080


----------



## Valantar (Sep 16, 2020)

W1zzard said:


> Rather the only solution I can think of is to record power during all testing in all games, to get the proper power average. Not something I'm able to do at the moment.
> 
> 4K is a good approximation of that. For the 3080, Full HD really is more of an academic resolution, I doubt anyone will buy that card for 1080p


Don't underestimate people's unwillingness to leave 1080p behind! 

And yeah, automatic and continuous power logging is probably the best way forward for something like that, though that also sounds like a serious investment in terms of equipment. Hope you have the opportunity to expand this in the future though!


----------



## cueman (Sep 16, 2020)

What can I say: it's amazingly fast, and the power draw is okay when you look at the performance.
Truly a 4K GPU, for the first time ever, all the way, easily. And that at 300 W. Excellent!

The numbers are so big that I can't see AMD's RX 6000 series standing a chance; the RTX 3080 FE is almost 2 times (100%) faster! ...and only needs 70 W more power. That means an almost unbeatable task for AMD.

AMD knows now for sure: if the RX 6900 can reach it at all, no way will it be under 400 W, and RX 6000 just can't handle that.


Also expect the OC'd limited-edition AIB versions to draw a lot more power, for example the ASUS Strix Advanced RTX 3080 or MSI RTX 3080 Lightning versions, HOF, and so on.


----------



## birdie (Sep 16, 2020)

People should forget about this card's performance, power efficiency, and price/performance at 1080p and 1440p - this card is *made for 4K*, unless you wanna get 500fps@1080p in your favourite online shooter. Period.

Overall, I'm pleased with what NVIDIA has achieved with this generation, but since I'm not a filthy rich European/US citizen who can afford a high-refresh 4K HDR IPS monitor, I will simply forget about it.

To be honest, over 95% of gamers may simply skip this review, as the card is beyond their budget.

*AMD fans who are now crying wolf about this card's characteristics at lower resolutions (below 4K) should be ignored with prejudice*. They simply don't understand the point of this GPU.


----------



## LuckyX2 (Sep 16, 2020)

There are three separate drivers because power is now split between core, mem and memory controller on Ampere unlike just core and mem previously.


----------



## sergionography (Sep 16, 2020)

Wait... That's it? I'm honestly disappointed this even gets an editors choice award...


----------



## Naito (Sep 16, 2020)

Very powerful, yet I still find that power consumption a bit concerning...


----------



## Durvelle27 (Sep 16, 2020)

Man this card really performs


----------



## Fleurious (Sep 16, 2020)

The non RTX performance is the deciding factor for me and it is on par with what i was expecting.  Unless the 3090 is magical or AMD pulls off a miracle i’ll probably pick up a 3080 to replace my 1070ti.


----------



## Mussels (Sep 16, 2020)

Performance: amazing

Performance per watt: oof


----------



## Chomiq (Sep 16, 2020)

Raevenlord said:


> I am now thoroughly convinced the RTX 3080 is absolute overkill for my 1440p monitor.


Yet at the same time going to 4k60 would be a downgrade from high refresh 1440p.


----------



## Valantar (Sep 16, 2020)

birdie said:


> People should forget about this card performance, power efficiency and price/performance at 1080 and 1440p - this card is *made for 4K* unless you wanna get 500fps@1080p in your favourite online shooter. Period.
> 
> Overall, I'm pleased with what NVIDIA has achieved with this generation but since I'm not a filthy rich European/US citizen who can afford a high refresh 4K HDR IPS monitor, I will simply forget about it.
> 
> ...


There's no doubt that this is first and foremost a 4K GPU - for which it performs admirably and is indeed a decent if relatively unremarkable* generational boost in efficiency - but 1440p high refresh rate gaming falls in the same performance category, and given monitor pricing and availability is going to be relevant for a lot more people. Looking at the results here, 1440p is clearly hitting _a lot_ less CPU limitation than 1080p, yet perf/W is still just barely inching past previous generations from both manufacturers at that resolution. Of course, as @W1zzard said, these numbers are all based off the same power reading (I'm guessing at 4K), and 1440p draw might be a tad lower, but it is likely still quite high.


*per-generation efficiency gains at 4k seem roughly in line with 10xx to 20xx upgrades, which didn't have a full node shrink to rely on for gains.


----------



## V3ctor (Sep 16, 2020)

Is there any way to know about the VRAM usage?
I'm afraid to buy this because I play at 4K, and 10 GB seems very little for future-proofing.


----------



## yeeeeman (Sep 16, 2020)

Performance is nice, but for a 650 mm² GPU it is a bit meh. Anyway, since it is a cut-down die, we'll see how good this gets with the 3090. Also, the high power consumption is no surprise: Samsung is not known for high-efficiency processes, and the 8 nm node NVIDIA is using is just a 10 nm tweak, so not far off TSMC's old 12 nm FFN. For the price, this is a step in the right direction, given that last month you could get 25% worse performance for $1,200 and now you get this for $700-800.


----------



## Kohl Baas (Sep 16, 2020)

Review said:

> In this test, we disabled the DXR/RTX features to ensure a level playing field.

Shouldn't DLSS be disabled too, to get a level playing field?


----------



## Jism (Sep 16, 2020)

The engineering that went into that heatsink, lol. And the PCB/VRM setup. Good job pointing out the memory VRM section vs. the core's.

I guess we're in an era where 4K gaming is finally possible without spending $2k on hardware. Now it's just waiting on AMD. If they score equal at a lower price, AMD has a winner.


----------



## Erazor6000 (Sep 16, 2020)

With this power consumption, I would buy it water cooled.


----------



## W1zzard (Sep 16, 2020)

V3ctor said:


> Is there any way to know about the VRAM usage?
> I'm afraid to buy this because I play at 4K, and 10 GB seems very little for future-proofing.


New consoles are 10 GB for CPU+VRAM, your PC has 16 GB for CPU + 10 GB for VRAM, should be fine. Obviously no way to predict the future


----------



## FreedomEclipse (Sep 16, 2020)

W1zzard said:


> I did not, it's watercooled anyway. Could be worth a separate article.



Well, it seems like Linus also touched on it. He reports about a 10°C rise in CPU temps, though I'm not sure how his test bench is set up. I'm taking it with a pinch of salt.


----------



## Valantar (Sep 16, 2020)

W1zzard said:


> New consoles are 10 GB for CPU+VRAM, your PC has 16 GB for CPU + 10 GB for VRAM, should be fine. Obviously no way to predict the future


New consoles are 13.5GB for CPU+VRAM. 16GB total, 2.5GB reserved for the OS. Your point still stands though. A game requiring < 3.5GB of system memory and >10GB of VRAM sounds unlikely.


----------



## Shatun_Bear (Sep 16, 2020)

Only a *24%* gap between the 2080 Ti and 3080 @ 4K?! Very disappointing considering it uses 10-20% more power depending on the review.



gridracedriver said:


> as expected by me, + 10~20% increase in perf / watt and + 20~30% performance from 2080ti, but which 1.9x declared in the nVidia slides.
> I guess they are really worried about the arrival of RDNA2



HardwareUnboxed found the perf-per-watt increase even worse: 8 percent!


----------



## claylomax (Sep 16, 2020)

RTX 3080 Ti will double your FPS 

4th paragraph in conclusion.


You probably haven't slept in days.


----------



## the54thvoid (Sep 16, 2020)

It's a shot in the arm for the generational price/performance metric. UK price for FE is £649. Online retailers still have the 2080ti at >£1000. Who the hell would buy a 2080ti at those prices? It's a crazy situation. For me, on a 3700X at 1440p, there's not a great reason to upgrade. Impressed though. And why aren't people talking about how much effort has gone into the design? I mean, even those magnetic screw covers... That is thinking outside the box.


----------



## gridracedriver (Sep 16, 2020)

Shatun_Bear said:


> Only a *24%* gap between the 2080 Ti and 3080 @ 4K?! Very disappointing considering it uses 10-20% more power depending on the review.
> 
> 
> 
> HardwareUnboxed found the perf-per-watt increase even worse; 8 percent!


I remember the 1080 outperformed the 980 Ti by +30% with a perf/watt increase of +80%. lol


----------



## birdie (Sep 16, 2020)

It's amazing how much flak NVIDIA receives for the RTX 3080 despite the fact that quite a *lot of the games being reviewed are severely CPU limited even at 4K*! You do *not* draw conclusions about a card's efficiency and performance based off poorly optimized games. I fully expect not a single complaint from AMD fans about the upcoming RX 6900, which *will have the exact same issue*, because no matter how fast you make your GPU, if it's constrained by the CPU, all your improvements are worth nothing.

I'd be extremely glad if @W1zzard created separate charts for performance and performance/power consumption only for the games whose performance scales linearly or near-linearly with resolution.


----------



## Shatun_Bear (Sep 16, 2020)

gridracedriver said:


> I remember the 1080 outperformed the 980 Ti by +30% with a perf/watt increase of +80%. lol



Yep, this is the reason Nvidia held reviews until the day before release. Underwhelming compared to what was sold in their marketing show. Very underwhelming for a node shrink that has a much higher power draw.


----------



## W1zzard (Sep 16, 2020)

claylomax said:


> RTX 3080 Ti will double your FPS
> 
> 4th paragraph in conclusion.
> 
> ...


Fixed, thanks


----------



## Vya Domus (Sep 16, 2020)

W1zzard said:


> New consoles are 10 GB for CPU+VRAM, your PC has 16 GB for CPU + 10 GB for VRAM, should be fine. Obviously no way to predict the future



Thing is though, consoles will use dynamic resolutions, checkerboarding and all sorts of compromises to get there. For someone wanting native 4K with highest settings, it is concerning.



Valantar said:


> A game requiring < 3.5GB of system memory and >10GB of VRAM sounds unlikely.



I don't think anyone is wondering whether or not a game is going to need more than 10 GB at normal settings any time soon, because they won't. Running them at 4K with everything maxed out, however, will most definitely push past 10 GB quite soon.



birdie said:


> I fully expect not a single complain from AMD fans



Dude you are the only guy that keeps talking about AMD here, give it up. You are as cringe worthy as ever with your intentional flaming.


----------



## W1zzard (Sep 16, 2020)

Vya Domus said:


> it is concerning


Not concerning IMO, that's also why I didn't comment on it in the review. Not enough data anyway; let's talk again after Xmas. But history suggests memory is not an issue. Remember 3 GB and 4 GB cards? These are still fine for 1080p; data is on the summary pages.


----------



## mobiuus (Sep 16, 2020)

3080 not so powerfull vs 2080ti as advertised...
typically ngreedia


----------



## birdie (Sep 16, 2020)

Shatun_Bear said:


> Yep, this is the reason Nvidia held reviews until the day before release. Underwhelming compared to what was sold in their marketing show. Very underwhelming for a node shrink that has a much higher power draw.



Which 4K graph has made you think so? The RTX 3080's minimum fps is above the RTX 2080 Ti's average. I don't remember the last time we had such a huge leap in performance. How is it "underwhelming"?

Maybe you could share some graphs to prove your PoV?



DarkStalker said:


> 3080 not so powerfull vs 2080ti as advertised...
> typically ngreedia



It's exactly as powerful as advertised, if not better, at 4K, the resolution it was created for. Never seen so many AMD trolls in one discussion. Also, it costs $700, so not sure about the greed. Also, no one forces you to buy it.

Speaking of greed:


----------



## Valantar (Sep 16, 2020)

birdie said:


> It's amazing how much flak NVIDIA receives for the RTX 3080 despite the fact that quite a *lot of the games being reviewed are severely CPU limited even at 4K*! You do *not* draw conclusions about a card's efficiency and performance based off poorly optimized games. I fully expect not a single complaint from AMD fans about the upcoming RX 6900, which *will have the exact same issue*, because no matter how fast you make your GPU, if it's constrained by the CPU, all your improvements are worth nothing.
> 
> I'd be extremely glad if @W1zzard created separate charts for performance and performance/power consumption only for the games whose performance scales linearly or near-linearly with resolution.


I can't see a single game in the TPU test suite that doesn't show significant scaling at 4k. Are you saying that these are still CPU limited? Based on what? Is there hidden data somewhere showing CPU usage for each GPU tested? Or some other review you've seen showing this that you haven't mentioned?


----------



## M2B (Sep 16, 2020)

So, it's offering a 31.7% performance improvement over the 2080 Ti and a 68% improvement over the 2080 at the same price.
I would've liked to see more of a perf/W improvement, but it is what it is, I guess.
Overall, the performance increase is fine and I'm happy with that.
The most impressive part of this card is probably the cooler; a 2-slot card handling this 320 W beast with relative ease is certainly impressive.

And lastly, it doesn't matter what NVIDIA offers, be it a 70% improvement or a 3x improvement, crybabies will always complain.


----------



## Vya Domus (Sep 16, 2020)

W1zzard said:


> Remember 3 GB and 4 GB cards? These are still fine for 1080p, data is on the summary pages



Maybe, but 1080p is long in the tooth by now so I don't think it's a good example.


----------



## okbuddy (Sep 16, 2020)

beware again: only 30% faster than 2080ti


----------



## Valantar (Sep 16, 2020)

Vya Domus said:


> I don't think anyone is wondering whether or not a game is going to need more than 10 GB any time soon because they wont. Running them at 4K with everything maxed out will, most definitely cause >10GB usage quite soon.


Consoles generally don't run at Ultra settings though, as Ultra settings typically means turning everything up to 11 despite the perceptible quality differences typically dropping off a good deal below that. I mean, if you insist on never ever lowering settings below ultra, I get that this is a concern, but ... who does that? Why? 10GB is not going to be an actual, real-life, relevant limitation for VRAM in the foreseeable future.


----------



## P4-630 (Sep 16, 2020)

Raevenlord said:


> I am now thoroughly convinced the RTX 3080 is absolute overkill for my 1440p monitor.



At 60 Hz, yes.

My 1440p G-Sync monitor goes up to 165 Hz; I wouldn't say overkill for that, it depends on the game.


----------



## Animalpak (Sep 16, 2020)

Good and clear review as usual from TPU. I particularly love the 1080p/1440p/4K comparison because it's everything we need.

For the Ti consumers: wait till they release a Ti version of the 3070 or 3080 and see if they can match the claimed +40% performance... 

Because as a 2080 Ti owner with a 1440p 165 Hz monitor, the RTX 3080 didn't impress me as much as I expected.

Plus, temperatures are high; we are getting GPUs that output tons of heat, more and more. So no improvement there.


----------



## EarthDog (Sep 16, 2020)

W1zzard said:


> Rather the only solution I can think of is to record power during all testing in all games, to get the proper power average. Not something I'm able to do at the moment.


Hwinfo does a great job of this for NV cards.


----------



## Mistral (Sep 16, 2020)

Bloody awesome performance, great job on NVIDIA's part. The RTX hit is still a bit insane without DLSS, but hey...

Is it true, though, that the $699 price is only for the limited amount of Founders Edition cards, and partner models will cost a good deal more? Hope that's not the case.


----------



## renz496 (Sep 16, 2020)

DarkStalker said:


> 3080 not so powerfull vs 2080ti as advertised...
> typically ngreedia


AFAIK NVIDIA only mentioned the 2080 vs. the 3080. In Jensen's presentation, not even once did he say the 3080 will be _x_ faster than the 2080 Ti.


----------



## Valantar (Sep 16, 2020)

Vya Domus said:


> Maybe, but 1080p is long in the tooth by now so I don't think it's a good example.


Isn't it being long in the tooth precisely the point? It's not like the requirements for gaming at 4k are somehow going to change differently than the requirements for 1080p have historically, after all.


----------



## W1zzard (Sep 16, 2020)

EarthDog said:


> Hwinfo does a great job of this for NV cards.


Not at all, you have to do physical measurements, with proper equipment, to actually get usable power results. Also running HWiNFO while you're benching for performance will skew your performance results, especially on cards that approach the CPU limit


----------



## Valantar (Sep 16, 2020)

EarthDog said:


> Hwinfo does a great job of this for NV cards.


Aren't software power readings notoriously unreliable?


----------



## Vya Domus (Sep 16, 2020)

Valantar said:


> Consoles generally don't run at Ultra settings though



Exactly. Current-gen consoles make do with about 5 GB, and yet you don't even need ultra settings or 4K to exceed that on VRAM alone on PCs running the exact same game.


----------



## EarthDog (Sep 16, 2020)

W1zzard said:


> Not at all, you have to do physical measurements, with proper equipment, to actually get usable power results


If all you're after is a relative difference, it works just fine. Aren't these the same sensors used for throttling and clocks? I'd test the difference with equipment before simply dismissing the values in Hwinfo as 'not usable'. 




W1zzard said:


> Also running HWiNFO while you're benching for performance will skew your performance results, especially on cards that approach the CPU limit


EDIT, to your edit after I replied: you don't run HWiNFO while you're benchmarking and gathering results... you run it when testing for power, not FPS... you know this.



Valantar said:


> Aren't software power readings notoriously unreliable?


See above. If he's going for literal accuracy versus a relative difference, I wouldn't do it. That said, everything can be off... some things more than others.

I'll give you a hint: I've tested this with a meter... they are usable results.


----------



## Valantar (Sep 16, 2020)

Vya Domus said:


> Exactly. Current-gen consoles make do with about 5 GB, and yet you don't even need ultra settings or 4K to exceed that on VRAM alone on PCs running the exact same game.


... are there any current gen 4k console games that are VRAM limited? I would think they are rather limited by their terrible CPUs and less terrible but still not that impressive GPUs ...

Again: the question isn't whether it's _possible_ to exceed 10GB of VRAM usage in games, the question is whether 10GB is going to be a noticeable, real-world limitation that significantly impacts performance for this GPU. And I find that extremely unlikely.


----------



## xkm1948 (Sep 16, 2020)

@W1zzard 

First let me take a minute and say DAMN NICE PLOTTING SKILLS for the frame time and IQR page. Your plotting and analysis skills are better than most biomedical publications in some top-tier journals. What did you use? Customized ggplot2?


On topic. Amazing GPU all around. With other board partners charging over $700, I would just buy the FE and call it a day!


----------



## nguyen (Sep 16, 2020)

Valantar said:


> I can't see a single game in the TPU test suite that doesn't show significant scaling at 4k. Are you saying that these are still CPU limited? Based on what? Is there hidden data somewhere showing CPU usage for each GPU tested? Or some other review you've seen showing this that you haven't mentioned?



Just check the relative performance of the 2080 Ti compared to the 3080 across the 3 resolutions and you will see a change:
1080p: 87%
1440p: 81%
4K: 75%
Might as well use DSR to do some 8K testing on this bad boy. 
People with 1080p and 1440p screens might as well use DSR if they have the 3080; otherwise you are just wasting all of its performance prowess.
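Those relative-performance figures can be turned into the 3080's lead at each resolution, which makes the scaling easier to see (percentages taken from the post above):

```python
# Convert "2080 Ti as a % of the 3080" into the 3080's lead per resolution.
rel_perf_2080ti = {"1080p": 87, "1440p": 81, "4K": 75}

for res, pct in rel_perf_2080ti.items():
    lead = 100 / pct - 1  # e.g. 100/75 - 1 = 33.3% lead at 4K
    print(f"{res}: 3080 leads by {lead:.1%}")
```

That works out to roughly 14.9%, 23.5%, and 33.3%; the widening gap as resolution rises is the CPU limit fading away.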


----------



## Vya Domus (Sep 16, 2020)

Valantar said:


> Again: the question isn't whether it's _possible_ to exceed 10GB of VRAM usage in games, the question is whether 10GB is going to be a noticeable, real-world limitation that significantly impacts performance for this GPU.



That's a strange way to think about it, if it is possible to exceed that amount then it inevitably becomes a limitation.


----------



## mobiuus (Sep 16, 2020)

birdie said:


> Which 4K graph has made you think so? RTX 3080 minimum fps is above average for RTX 2080 Ti. I don't remember the last time we've had such a huge leap in performance. How is it "underwhelming"?
> 
> Maybe you could share some graphs to prove your PoV?
> 
> ...


Who's trolling?? I'm on a 1080 Ti...


----------



## W1zzard (Sep 16, 2020)

Vya Domus said:


> Maybe, but 1080p is long in the tooth by now so I don't think it's a good example.


Isn't it exactly a good example for that reason? Assuming similar progress we'll be on 10K resolution in the future, yet the current cards will still be fine for 4K



Valantar said:


> the question isn't whether it's _possible_ to exceed 10GB of VRAM usage in game


Actually... if we assume games are 100 GB, how much texture/level data do you actually need at a given time?



xkm1948 said:


> What did you use? Customized ggplot2?


Excel + VBA


----------



## ZoneDymo (Sep 16, 2020)

That power consumption is extremely disappointing and makes me really not respect this product sadly.



renz496 said:


> AFAIK NVIDIA only mentioned the 2080 vs. the 3080. In Jensen's presentation, not even once did he say the 3080 will be _x_ faster than the 2080 Ti.



Preeeeeetty sure he claimed the 3070 would be better than the 2080 Ti...


----------



## xkm1948 (Sep 16, 2020)

W1zzard said:


> Excel + VBA



Classy~

With your amazing programming skills and data presentation, you could land a fantastically paying job in bioinformatics.


----------



## W1zzard (Sep 16, 2020)

xkm1948 said:


> and data presentation


Thanks, really appreciated. I always try to do my best to present data in a way that's still understandable for the more average reader.


----------



## P4-630 (Sep 16, 2020)

Now waiting for reviews of the Asus 3080 STRIX and the MSI 3080 Gaming X trio....


----------



## TheGuruStud (Sep 16, 2020)

Hahahahha. Oh, man. Poor 2080ti owners that dumped cards for nothing. Simple OC brings it to same perf.


----------



## steen (Sep 16, 2020)

W1zzard said:


> Rather the only solution I can think of is to record power during all testing in all games, to get the proper power average. Not something I'm able to do at the moment.


That was kinda the point of PCAT & FrameView. NVIDIA were concerned that when testing at high frame rates & DLSS, the increased CPU load (esp. on CFL/CML) would scare buyers when only total system power draw was reported.

Overall, the 3080 looks pushed a bit past the efficiency sweet spot of the process.


----------



## RedelZaVedno (Sep 16, 2020)

Oh Happy Day! You buy a GPU and get a 370 W room heater for free. It's mid-September and we're still hitting +30°C outside in central Europe. No way would I game with a 320-370 W GPU in such conditions. The 3080 screams for TSMC's 7 nm EUV node; maybe RDNA2 can surprise?


----------



## xkm1948 (Sep 16, 2020)

Can I ask when I can expect the 3090 review?


----------



## phill (Sep 16, 2020)

Thank you @W1zzard for such an in-depth and amazing review, as always!! 

I can see from the majority of the games in the test suite that my 1080 Ti is around 50% slower than the 3080, which to be honest isn't so disappointing, but now all I need to do is wait to find my preferred card and cooler, buy it, and then be happy for a few weeks until the credit card bill comes through...  

The card is impressive, and the stats speak well. It does go to show that such a high-end GPU gets warm and takes the juice, but we are all enthusiasts, so does it really matter if you're getting the performance you want in the games you play? To me, no. Slap a water block on this bad boy and away we go. That said, I'm going to be very interested in seeing the 3090 review, as that supports SLI and I have a big case that needs filling up, so it might be worth a look at the reviews just to see what the differences are. What's so disappointing about that card is that it's nearly 3 months' mortgage payments... Still, for those that wish to have the power and so on..... 

Thank you @W1zzard again, awesome work. Looking forward to the Strix, Gaming X Trio and FTW card reviews, just like @P4-630


----------



## EarthDog (Sep 16, 2020)

TheGuruStud said:


> Hahahahha. Oh, man. Poor 2080ti owners that dumped cards for nothing. Simple OC brings it to same perf.


I'd love to see ANY 2080 Ti run 25% faster with an ambient-cooled overclock..........


----------



## phill (Sep 16, 2020)

xkm1948 said:


> Can I ask when can I expect 3090 review?


Isn't that one in a week's time??


----------



## W1zzard (Sep 16, 2020)

P4-630 said:


> Now waiting for reviews of the Asus 3080 STRIX and the MSI 3080 Gaming X trio....





xkm1948 said:


> Can I ask when can I expect 3090 review?


Soon


----------



## Vayra86 (Sep 16, 2020)

W1zzard said:


> Thanks, really appreciated. I always try to do my best to present data in a way that's still understandable for the more average reader.



You nailed it. Fantastic work all the way through. The background info and the data.

This is making me very curious where the 3070 might end up. I find the gap with the 2080 Ti... okay, but not jaw-dropping, and the gap with the 3090 won't be either, given the shader difference. It's still a good generational jump, though, just not Pascal-esque, which I did kind of expect. I think the RT weight is killing that vibe, and perhaps the node. Very curious as well what RDNA2 will do now.


----------



## Vya Domus (Sep 16, 2020)

W1zzard said:


> Isn't it exactly a good example for that reason? Assuming similar progress we'll be on 10K resolution in the future, yet the current cards will still be fine for 4K



I don't think it is because we already "took the hit" of rendering at that resolution long ago. 

However, assets and effects haven't exactly been created with 4K rendering in mind until very recently. As you probably know, many engines render certain effects at half/quarter resolution and so on, but when those inevitably get scaled up because of the extra memory and processing power available, so will the memory requirements.


----------



## xkm1948 (Sep 16, 2020)

phill said:


> Isn't that one in a weeks time??




My body is ready for a 3090FE


----------



## W1zzard (Sep 16, 2020)

Vya Domus said:


> As you probably know, many engines render certain effects at half/quarter resolution and so on, but when those inevitably get scaled up because of the extra memory and processing power available, so will the memory requirements.


Exactly, but there's no reason they get scaled up linearly, especially with pixel sizes going down due to higher resolution. Looks like we're having very similar thoughts and arguments and just drawing different conclusions from them  guess we'll know in a few years


----------



## biggermesh (Sep 16, 2020)

Vya Domus said:


> That's a strange way to think about it, if it is possible to exceed that amount then it inevitably becomes a limitation.


Not really, it depends on how the streaming is implemented. Some games fill the vram no matter the amount, what matters is the portion of the assets actually used at any given time.


----------



## RedelZaVedno (Sep 16, 2020)

W1zzard said:


> Soon



Adding *Microsoft Flight Simulator 2020 as a benchmark* into the review would be awesome. It's the "but can it play Crysis" kind of game of 2020


----------



## W1zzard (Sep 16, 2020)

RedelZaVedno said:


> Adding Microsoft Flight Simulator as a benchmark into the review would be awesome.


Thought about it, decided against it. Old DX11 engine, badly designed, badly optimized, not a proper simulator, small playerbase. When they add DX12 I might reconsider it


----------



## renz496 (Sep 16, 2020)

ZoneDymo said:


> That power consumption is extremely disappointing and makes me really not respect this product sadly.
> 
> 
> 
> preeeeeetty sure he claimed the 3070 would be better then the 2080Ti...



Yeah, I remember Jensen saying that the 3070 will be better than the 2080 Ti. But I'm replying to the post saying Nvidia promised the 3080 would be better than the 2080 Ti by a certain amount. Jensen never said that. He only talked about how the 3080 will be double the performance of the 2080 (we know the truth now with this review).


----------



## dayne878 (Sep 16, 2020)

I'm glad to see it has significant performance gains at 1440p over the 2080ti. I game on a 1440p 144hz Asus ROG monitor with G-Sync and so I was happy to see that the 3080 brings the average frame rate up significantly closer to 144 fps.  My CPU is 10700K.

I'll be getting it, either the FE or from another brand, and then I'll sell my 2080ti at a significant price hit (lucky if I get $400 for it probably) and eat the difference between the 3080 and the 2080ti sell price (~$400 assuming I get a 3080 for $800 or less and the 2080ti sells for $400). 

There's also the chance that I'll upgrade to a 4k monitor in the future, so this gives me more "future-proofing."

The only hesitance I have is that I've heard rumors of the 3070 Ti (and the presumed existence of a 3080 Ti down the road) with far more VRAM. But I don't want to wait on hypothetical cards that might not be out for a while, and if they do come out, I expect the 3080 will keep its resale value for a while, and I could always sell it and buy the 3080 Ti down the road (1+ years from now).


----------



## Cheeseball (Sep 16, 2020)

I don't see why the power consumption is that surprising if you look at the specs. Sure, it's on 8nm now, but that's coming from 12nm. Not a _huge_ jump like the 980 Ti's 28nm to the 1080 Ti's 16nm. Same thing with the 1080 Ti to the 2080 Ti.

And the fact that they practically *doubled the shader units* from the 2080 Ti (not the 2080 Super, mind you), along with 8 more ROPs and more RT/Tensor cores. I was realistically expecting this thing to exceed 300W.

EDIT: In addition, I expect the RTX 3070 to beat the RTX 2080 Super, but be slightly below the RTX 2080 Ti. The 2080 Ti still has a huge chunk of ROPs (88 vs 64) to its advantage.


----------



## renz496 (Sep 16, 2020)

and now waiting for more mainstream card release......


----------



## Testsubject01 (Sep 16, 2020)

Valantar said:


> Don't underestimate people's unwillingness to leave 1080p behind!



Well, I wouldn't entirely put it down to unwillingness.
Getting a PC + display that can run all the bells-and-whistles settings you had at 1080p, at 1440p @ 120+ Hz or 2160p @ 60 Hz, without selling organs, has only just started to become feasible.


Let's hope RDNA2 delivers as well in October and this trend continues.


----------



## Frick (Sep 16, 2020)

Better than I thought, and now I'm hoping to snag a cheap 1070.


----------



## sergionography (Sep 16, 2020)

birdie said:


> Which 4K graph has made you think so? RTX 3080 minimum fps is above average for RTX 2080 Ti. I don't remember the last time we've had such a huge leap in performance. How is it "underwhelming"?
> 
> Maybe you could share some graphs to prove your PoV?
> 
> ...





Here you go. The 1080 Ti came out at 700 dollars and performed almost twice as fast as the 980 Ti, which was also 700 dollars. Then the 20-series came out and almost doubled prices, with the 2080 Ti at around 1200 dollars. Everyone is impressed, but in reality Nvidia just shifted back to normal pricing because they are actually expecting competition. What everyone here is concerned about is the fact that the 3080 is a cut-down version of their biggest gaming chip and is drawing power close to the PCIe power/thermal limits, and at this stage you start to run into diminishing returns as you scale performance any higher. But in summary, no, the 3080 is not a 2080 successor; it is a 2080 Ti successor, which was itself a cut-down version of a Titan.


----------



## steen (Sep 16, 2020)

@W1zzard, try running FrameView 1.1 (even if you don't have the PCAT kit). It should give detailed data via NVAPI.


----------



## RedelZaVedno (Sep 16, 2020)

W1zzard said:


> Thought about it, decided against it. Old DX11 engine, badly designed, badly optimized, not a proper simulator, small playerbase. When they add DX12 I might reconsider it


I can agree it's badly optimized at the moment, but not that it has a small player base or that it's not a proper simulator. Predicted sales are 2.27 million units over the next three years, and third-party devs will soon offer state-of-the-art, highly detailed planes as addons through the MS FS 2020 shop. FS2020 is predestined to become the next X-Plane/Prepar3D kind of sim.


----------



## W1zzard (Sep 16, 2020)

steen said:


> @W1zzard, try running FrameView 1.1 (even if you don't have the PCAT kit). It should give detailed data via NVAPI.


GPU-Z already shows the same data, using NVAPI, too



RedelZaVedno said:


> Predicted sales are 2.27 million units over the next three years


Doubt it, we'll see


----------



## Vya Domus (Sep 16, 2020)

Cheeseball said:


> And that fact that they practically *doubled the shader units* from the 2080 Ti (not the 2080 Super, mind you) along with 8 more ROPs and RT/Tensor cores. I was realistically expecting this thing to exceed 300W.



That's the catch, the shaders have been doubled but not the amount of SMs. In other words more execution units now share the same control logic/registers/cache which are the real power hogs in a chip.

Nvidia has done this before with Kepler, to a more extreme extent: it had six times the shaders per SM versus Fermi. As a result it was one of the most inefficient architectures ever per SM in terms of performance. GK110 was 90% faster than GF110 despite having nearly six times the FP32 units (GF110's shaders did run at a faster clock; still, they are worlds apart).

It's a similar story here, a lot of shading power but it's used rather inefficiently because a lot of resources are shared, that's why the power consumption looks quite bad when you think about it.


----------



## Hyderz (Sep 16, 2020)

AMD or Intel doesn't matter! It's time to upgrade from 10th Gen and game on! It's almost a 3x difference in performance for me coming from a 1070 Ti


----------



## ZekeSulastin (Sep 16, 2020)

Sorry to bother you, but did you take a look at either reducing the power limit or manually altering the voltage/freq curve in Afterburner/similar?  computerbase.de reported only a few percent less performance with a 270 W limit, and someone linked me a video of someone changing that curve to good effect on another site. 

It honestly sounds like Turing, where you can't drop the voltage with the slider but can with the curve editor.


----------



## specopsFI (Sep 16, 2020)

@W1zzard 

In the overclocking section, you mention shortly that undervolting is not possible. Can you elaborate on this *very *important point? Is it prevented on Ampere on a hardware level or is it because the necessary tools are not yet available? The inability to undervolt these extremely power hungry chips would be a serious shortcoming, which I can't believe nVidia would exclude for this exact generation, since they have allowed it for so long with previous generations with access to the whole voltage/clock curve.

Other than that, an excellent article as usual!


----------



## iO (Sep 16, 2020)

Great review as usual.

It's weird that they prioritized maximum performance over energy efficiency, unlike in previous gens. By going from 270W to 320W, they sacrificed ~15% energy efficiency for just 4% higher performance.
Kinda pulled an AMD there...
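The trade-off above can be sanity-checked with back-of-the-envelope numbers. This is just a sketch using the approximate figures from the post (270 W vs 320 W, ~4% performance delta); exact results depend on the workload:

```python
# Back-of-the-envelope perf/W comparison using the rough figures quoted above.
def perf_per_watt(perf: float, watts: float) -> float:
    """Relative performance divided by board power."""
    return perf / watts

eff_270 = perf_per_watt(1.00, 270)   # hypothetical 270 W operating point
eff_320 = perf_per_watt(1.04, 320)   # stock 320 W configuration, ~4% faster

# Efficiency given up by pushing the card to 320 W:
loss_pct = (1 - eff_320 / eff_270) * 100
print(f"~{loss_pct:.0f}% lower perf/W at 320 W")  # in the ballpark of the quoted ~15%
```

With these inputs the loss works out to roughly 12%, close to the figure quoted; the exact percentage shifts with the assumed performance delta.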


----------



## okbuddy (Sep 16, 2020)

The 1440p and 1080p gains aren't as big; could the CPU be the bottleneck? Could we test again with the upcoming Zen 3?


----------



## Cheeseball (Sep 16, 2020)

Vya Domus said:


> That's the catch, the shaders have been doubled but not the amount of SMs. In other words more execution units now share the same control logic/registers/cache which are the real power hogs in a chip.
> 
> Nvidia has done this before with Kepler to a more extreme extent, which had six times the amount of shaders per SM versus Fermi. As a result it was one of the most inefficient architectures ever per SM in terms of performance. GK110 was 90% faster than GF110 despite having almost 600% more FP32 units (GF110 shaders did run at a faster clock, still, they are worlds apart).
> 
> It's a similar story here, a lot of shading power but it's used rather inefficiently because a lot of resources are shared, that's why the power consumption looks quite bad when you think about it.



No doubt that doubling the shader units within the same number of streaming multiprocessors (64/SM in Turing vs 128/SM in Ampere) would increase power consumption. If anything, I don't consider it as efficient as Maxwell > Pascal, per se, but I also don't consider it a complete waste of power. I am also considering the fact that the RT and Tensor cores add to the weight.

All in all, I believe the slight sacrifice to energy efficiency is justified, as long as it doesn't get to Fury X or 290X-levels of wasted power. With this card I can probably play PUBG at 4K 144Hz with competitive settings, which is a mix of medium and low settings with AA maxed out for improved visual clarity, which is important for spotting opponents.

*TL;DR* - This card is overkill for those still gaming at 1080p through 1440p. If you're aiming at UW (3440x)1440p or 4K, this seems to hit the sweet spot.


----------



## W1zzard (Sep 16, 2020)

okbuddy said:


> 2k and 1080p not better enough could be CPU not good enough, could we test again with next Zen 3











> NVIDIA GeForce RTX 3080 with AMD Ryzen 3900XT vs. Intel Core i9-10900K
> 
> What's the best processor for NVIDIA's new GeForce RTX 3080? The AMD Ryzen 3900 XT or Intel Core i9-10900K? We dedicate a whole performance review to that question, with 23 games, each tested at 1080p Full HD, 1440p, and 4K Ultra HD.
> 
> www.techpowerup.com


----------



## steen (Sep 16, 2020)

Cheeseball said:


> I don't see why the power consumption is that surprising if you look at the specs. Sure its on 8nm now, but thats coming from 12nm.



Nitpick: TSMC 16FF (12N) -> Samsung 10LPU (8N). 320W+ is a ~30% increase in power for a ~30% increase in perf.



Cheeseball said:


> No doubt that doubling the shader units in the same amount of streaming multiprocessors (64/SM Turing vs 128/SM in Ampere) would increase power consumption. If anything, I don't consider it as efficient as Maxwell>Pascal, per se, but I also don't consider it being a complete waste of power as well. I am also considering the fact that those RT and Tensors cores also add to the weight.



Means little when it's dark silicon. It's largely useful for compute/CUDA workloads and at 4K, where each frame becomes more ALU-limited. The increased number and revised ROP partitions help at 4K.

We may be doing GA a disservice, given RT & denoising should show gains once games/compilers are better optimised.



W1zzard said:


> GPU-Z already shows the same data, using NVAPI, too



With a nice in-game overlay?


----------



## Bubster (Sep 16, 2020)

I've had 1080 SLI for 3 years already and it's great for 4K: minimum 70-80 fps at Ultra in most games...


----------



## GhostRyder (Sep 16, 2020)

Great review, I cannot wait for the RTX 3090 review!

It's interesting how this cooler performs; it seems to be a meaningful improvement over past "FE" coolers, especially with the power consumption of these cards.  I am a little disappointed in the overclocking of these cards; I mean, overclocking has been meh for a while, but it seems like this one is even less than normal.  Granted, the memory moved up and shows some decent performance gains, but it looks like these cards are already pushed to their limit out of the box, with only minor improvement from aftermarket cards.

Still cant wait!


----------



## ppn (Sep 16, 2020)

The important increase is in transistors. 28,000M (5,000M broken/disabled) is 70% more than the 2080 and 25% more than what the 2080 Ti had, and 25% is the same as the performance increase. On 7nm DUV this would be a 426mm² die instead of 628mm²; the main reason to avoid this card is that shrinks are imminent at this point, and that's what we didn't get, sadly. On 6nm EUV this would be 360mm² with +50% clock speed at the same power. So this is just another Titan: big and powerful, but it will inevitably fall, maybe in less than 10-12 months. So yeah, $700 is not as good as you think, except in the moment; in the moment it's everything.


----------



## Cheeseball (Sep 16, 2020)

steen said:


> Nitpick, TSMC 16FF (12N)  -> Samsung 10LPU (8N). 320W+ is a 30% increase in power for ~30% increase in perf.
> 
> 
> 
> ...



Hmm... I know the TDP of the 2080 Ti is 250W, but you can see it using around 273W (average gaming) according to @W1zzard's charts. The 3080 is rated at 320W, but it seems to be hovering around 303W. Maybe NVIDIA is just overshooting their stated specs?


----------



## SIGSEGV (Sep 16, 2020)

I hate to say it, but it's kinda weird with all those charts. Hype all the way. LOL
Unimpressive performance. It feels like they released this card in a hurry. 

nice review. thanks.


----------



## RedelZaVedno (Sep 16, 2020)

*Hardware unboxed numbers (similar results):*

RTX 3080 vs 2080 Ti: 14 Game Average at 1440p = +21 %
RTX 3080 vs 2080 Ti: 14 Game Average at 4K = +31 %

RTX 3080 vs RTX 2080: 14 Game Average at 1440p = +47%
RTX 3080 vs RTX 2080: 14 Game Average at 4K = +68%

Very nice gains at 4K and average generational gain at 1440p (excluding Turing)...  

Total (at the wall) system power consumption: 523W
GPU only, measured with Nvidia's PCAT (Power Capture Analysis Tool) playing DOOM: 327W
8% performance-per-watt gain over Turing -> far from impressive given Ampere moved to a new node


----------



## TheDeeGee (Sep 16, 2020)

The fan noise levels are enough for me to pass on a FE.

Looks like I will have to find a partner board that fits my Arctic Accelero Xtreme 3.


----------



## EarthDog (Sep 16, 2020)

sergionography said:


> Here you go. 1080ti came out at 700 dollars and performed almost twice as fast as a 980ti that also was 700 dollars


Math, my man. 

That is 46% faster. You realize that 2x = 100%, right? For example, if card A ran at 100 FPS and card B ran at 146 FPS, card B is 46% faster than card A. If it were "double", it would be 100% faster.
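The arithmetic above can be sketched in a couple of lines (the 100/146 FPS numbers are the hypothetical example from the post, not benchmark data):

```python
def percent_faster(fps_b: float, fps_a: float) -> float:
    """How much faster card B is than card A, expressed in percent."""
    return (fps_b / fps_a - 1.0) * 100.0

# Card A at 100 FPS, card B at 146 FPS: B is ~46% faster, not "almost double".
gap = percent_faster(146, 100)

# "Double" the performance (200 vs 100 FPS) would be a +100% gap.
double = percent_faster(200, 100)
```

The common mistake is reading a 1.46x ratio as "nearly 2x"; the ratio and the percent gap are different scales.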


----------



## jboydgolfer (Sep 16, 2020)

It didn't seem to have that damn adhesive that you need to heat up to access fasteners like the 1xxx reference cards, at least. Unless I missed the picture with that.

The 9xx reference had those damn plastic-type hex screws that stripped if you coughed near them.


----------



## Raevenlord (Sep 16, 2020)

P4-630 said:


> At 60Hz, yes.
> 
> My 1440p G-Sync monitor goes upto 165Hz, I wouldn't say overkill for that, it depends on the game.




True. Mine goes up to 144, so actually looking at the figures, it may make sense for it. Especially with future-proofing concerns. I suppose it depends mostly on cyberpunk 2077's performance, though. Luckily, I have time until we have some information from the competition.


----------



## Cheeseball (Sep 16, 2020)

Raevenlord said:


> True. Mine goes up to 144, so actually looking at the figures, it may make sense for it. Especially with future-proofing concerns. I suppose it depends mostly on cyberpunk 2077's performance, though. Luckily, I have time until we have some information from the competition.



I'm gonna throw my guess out there: the RTX 3080 probably gets around 100 FPS in Cyberpunk 2077 at 4K. This is based on the improvements over the Witcher 3 engine and the fact that they are still optimizing it for the current-gen (PS4/XB1) consoles.


----------



## nguyen (Sep 16, 2020)

Cheeseball said:


> Hmm... I know the TDP of the 2080 Ti is 250W, but you can see it using around 273W (average gaming) according to @W1zzard's charts. The 3080 is rated at 320W, but it seems to be hovering around 303W. Maybe NVIDIA is just overshooting their stated specs?



It's simple to explain:
The 2080 Ti FE's 260W TDP puts the chip in the lower (more efficient) region of the perf/power curve, while the 3080's 320W TGP sits at a higher point on that curve.
That means it's easy to overclock the 2080 Ti by simply raising the power limit, while raising the power limit on the 3080 does nothing (as every review has pointed out; very similar to the 5700 XT).
It also means lowering the TGP of the 3080 to a level similar to the 2080 Ti's, like Computerbase did, will not lower the 3080's performance by much, improving its efficiency if you so require.


----------



## Icon Charlie (Sep 16, 2020)

iO said:


> Great review as usual.
> 
> It's weird that they prioritized maximum performance over energy efficiency like in previous gens. By going from 270W to 320W, they sacrificed 15% energy efficiency for just 4% higher performance.
> Kinda pulled an AMD there...



Agreed.  The performance increases come at the expense of excess heat and excess wattage.
There is no real wattage-vs-performance gain like in previous generations.

I think people should see this. It is from AdoredTV; it's 30+ minutes long and it explains a lot about why this card draws so much wattage, which in turn throws out a lot of excess heat.


----------



## steen (Sep 16, 2020)

Cheeseball said:


> Hmm... I know the TDP of the 2080 Ti is 250W, but you can see it using around 273W (average gaming) according to @W1zzard's charts. The 3080 is rated at 320W, but it seems to be hovering around 303W. Maybe NVIDIA is just overshooting their stated specs?


Unlikely. 3080 is probably the worst bin GA102 silicon, but I'd think the FE boards will be better than the average AIB.


----------



## tussinman (Sep 16, 2020)

Monster card, but I'm skipping this gen. We're too far into this generation's cycle, and games are still running on old engines.

90% of the games I've played in the last 12 months have either been extremely well optimized or had moderate-at-best requirements.
I'd rather just wait for the 4000 series. Ironically, I see a lot of people upgrading because "new game engines are coming soon" and "it has better ray tracing", but by the time both are utilized, the 4000 series will be out...... (lol)


----------



## nguyen (Sep 16, 2020)

steen said:


> Unlikely. 3080 is probably the worst bin GA102 silicon, but I'd think the FE boards will be better than the average AIB.



When the original Turing launched, the FE models were the ones that failed the most, so I wouldn't put that much faith in them.
Just going with whichever brand has the best RMA practice in your region is the best idea.


----------



## Dudebro-420 (Sep 16, 2020)

I'll wait for the 3070ti


----------



## Zareek (Sep 16, 2020)

Wow, very impressive...

I still feel modern graphics hardware is too expensive but versus the previous few generations this is really reasonable. 

I hope the 3070 delivers at 1440p what this card delivers at 4K. I may have to buy one, it will be my first Nvidia card since the GTX 970.


----------



## TheUn4seen (Sep 16, 2020)

Nice. Twice the performance of the 1080 Ti will make for a happier 4K120 display.


----------



## nguyen (Sep 16, 2020)

Computerbase.de tested what is called Asynchronous DLSS, and that improves DLSS performance in Youngblood by another 9%.
So Ampere still has a lot more tricks up its sleeve.


----------



## mrthanhnguyen (Sep 16, 2020)

3090 here I come. Hope it will be in stock.


----------



## KainXS (Sep 16, 2020)

I'm going to try to get one; the performance uplift looks pretty huge vs the 2080 Ti in rendering apps like Blender, 60%+ in some cases. I expect VR performance is also much better than the 2080 Ti's.

The power consumption is disappointing though


----------



## xkm1948 (Sep 16, 2020)

W1zzard said:


> Soon




What about other brands like ASUS or EVGA? Really curious to see the 3rd party cooling performance versus FE


----------



## kucki (Sep 16, 2020)

@W1zzard base clock +145MHz on 3080?


----------



## SYST3M_OVERIDE (Sep 16, 2020)

What resolution did you use for the RTX & DLSS when testing Metro Exodus and Control at 4K?


----------



## Space Lynx (Sep 16, 2020)

@W1zzard I have to disagree with one of your negative items in the conclusion, "makes little sense for people who don't have 4k 60hz gaming". The RTX 3080 only does 92 fps at 1440p in AC Odyssey, for example, and games like AC Odyssey are so much more fun (imo) when you lower settings to reach 144 fps on a 144hz 1440p monitor. I understand high refresh isn't for everyone, but I personally enjoy it; it enhances the immersion for me. So the RTX 3080 isn't powerful enough for me ideally, but in reality it is, because I will need to turn down fewer settings to achieve my goal 

that being said, I loved everything else about the review, great stuff!  I love the format you use too, so easy to navigate.


----------



## Parn (Sep 16, 2020)

The performance increase from the 2080 to the 3080 is roughly the same as from the 1080 to the 2080. Where is the "nearly doubled performance" claimed by the NV CEO?

And I totally agree with the conclusion that this card is strictly for 4K gamers. It makes no sense for anyone with a 1440p or lower monitor (myself included) due to the huge power consumption.


----------



## lexluthermiester (Sep 16, 2020)

W1zzard said:


> I doubt anyone will buy that card for 1080p


You know some people will. They'll want the absolute highest FPS for their 240hz displays and they'll get it with this card at 1080p.



lynx29 said:


> I have to disagree with one of your negative items in the conclusion, "makes little sense for people who don't have 4k 60hz gaming" - I mean, the RTX 3080 only does 92 fps at 1440p for AC Odyssey for example, and games like AC Odyssey are so much more fun (imo) when you lower settings to reach at 144 fps on a 144hz 1440p monitor. I mean I understand high refresh isn't for everyone, but I personally enjoy it in games. It enhances the immersion to me. (so RTX 3080 isn't powerful enough for me ideally) but in reality it is, cause I will need to turn down less settings to achieve my goal


Agreed. I have dual 1440p 144hz(set for 120hz) displays and it would seem that settings will still have to be turned down for many games to hit the 120fps mark.


lynx29 said:


> that being said, I loved everything else about the review, great stuff! I love the format you use too, so easy to navigate.


Agreed here as well. Excellent review that hit all the points that matter to prospective purchasers!


----------



## Fluffmeister (Sep 16, 2020)

Great review as always @W1zzard , the card certainly packs a punch at the more traditional $700 price point. That new cooler does a great job; congrats on that teardown too!

Looking forward to the 3070 review a bit later down the line, 2080 Ti or a bit better performance at a more comfortable 220W TDP would be very nice indeed.


----------



## TheoneandonlyMrK (Sep 16, 2020)

Seems Nvidia's CEO has artistic math skills. Still a good card though.


----------



## FeelinFroggy (Sep 16, 2020)

Great review, and it looks like a beast of a card.  It looks like my 1080 Ti is starting to age.  I will wait to see what AMD has though; maybe they have some magic.

I am taken aback by some of the negative comments that nitpick about details that ultimately have no bearing on real-world performance.  But I guess there will always be those who will complain about anything Nvidia does.


----------



## Gameslove (Sep 16, 2020)

Thanks for the review! 
Yes, Nvidia made it; overall the 4K performance is amazing. I hope the RTX 3070 can also handle it. 

W1zzard, when will an 8K gaming resolution test be available?


----------



## Jism (Sep 16, 2020)

It is a shitty overclocker btw... from https://www.tomshardware.com/news/nvidia-geforce-rtx-3080-review



> For the GPU core, while Nvidia specs the nominal boost clock at 1710 MHz, in practice, the GPU boosts quite a bit higher. Depending on the game, we saw sustained boost clocks of at least 1830 MHz, and in some cases, clocks were as high as 1950 MHz. That's not that different from the Turing GPUs, or even Pascal. The real question is how far we were able to push clocks.
> 
> The answer: Not far. I started with a modest 50 MHz bump to clock speed, which seemed to go fine. Then I pushed it to 100 MHz and crashed. Through a bit of trial and error, I ended up at +75 MHz as the best stable speed I could hit. That's _after_ increasing the voltage by 100 mV using EVGA Precision X1 and ramping up fan speeds to keep the GPU cool. The result was boost clocks in the 1950-2070 MHz range, typically settling right around the 2GHz mark.
> 
> ...



So even Nvidia is pushing pretty much to the silicon's limits these days.


----------



## Gameslove (Sep 16, 2020)

Theoretically, RTX 3080 = 2x RTX 2060 Super in SLI; if you take power consumption / performance / price, it's the same.


----------



## Tomgang (Sep 16, 2020)

I have not yet read the TPU review, but I have seen other tech pages' and YouTubers' reviews, and so far I am not disappointed with the raw performance. The RTX 3080 seems to be around 90 to 100% faster than my old GTX 1080 Ti, so a pretty good jump. But that power usage puts me off. What really keeps me from getting the RTX 3080 for now, though, is the amount of VRAM. 10 GB for 4K will not last a few years into the future for all games. So because of the VRAM, I will take my chances and wait in the hope of an RTX 3080 20 GB or an RTX 3080 Ti instead. Sure, that will be more expensive, but it might pay off in the long run because more VRAM = better future-proofing.

Ampere is impressive in raw performance alone, but the power consumption does ruin the picture a little. I guess that's the flip side of all these CUDA cores and the other tech Ampere has. Fortunately, I have seen the power target can be lowered from 100% down to as little as 31%, roughly a 100-watt card, and anywhere in between. I will make good use of the power target slider when I don't need all the GPU power.


----------



## Space Lynx (Sep 16, 2020)

I just wish AMD would give us something more before October 28th. I am assuming, since they haven't, that the best they can do with Big Navi is match the 3080 at most, and possibly not even that. So yeah, terrible marketing on their end; they should have at least released preliminary benches if they were confident of anything more. And unless it can beat a 3080 at the same price, you might as well just roll Nvidia at that point due to a higher likelihood of stability. I was only going to consider Big Navi if it came out swinging.


----------



## lexluthermiester (Sep 16, 2020)

theoneandonlymrk said:


> Seems Nvidia's CEO has artistic math's skills, still a good card though.


Actually, it seems Nvidia understated the projected performance, as much of the data on offer is well over what Nvidia stated.



Gameslove said:


> W1zzard, when will available a 8K gaming resolution test?


8K displays are still extremely expensive and not widely available. Given that we have only just reached 4K gaming viability, 8K is a bit off just yet. Don't expect 8K testing until it is more widespread.


----------



## birdie (Sep 16, 2020)

What remains:


Reviews: custom GeForce RTX 3080 designs
GeForce RTX 3080 FE & custom availability: 17 September, 13:00 UTC
GeForce RTX 3090 FE and custom reviews and availability: 24 September, 13:00 UTC
GeForce RTX 3070 FE and custom reviews (TBC) and availability: 15 October, 13:00 UTC



lexluthermiester said:


> 8K displays are still extremely expensive and not widely available. Given that we have only just reached 4K gaming viability, 8K is a bit off just yet.





If you read the comments you might get the impression that almost everyone here has either a 1440p or 4K high refresh monitor, while in reality:





In other words, less than 11% of people use high-res monitors, and just *one out of 44* has a 4K monitor.


----------



## trog100 (Sep 16, 2020)

the 3070 will be the big seller..  2080ti performance for sensible money.. 

this 3080 card seems to be a 4K only card.. as a 1440.. 2080ti owner i dont feel any need to rush out and buy one.. 

good in one sense a tad boring in another.. 

trog


----------



## Foxiol (Sep 16, 2020)

Great review as usual. Not going to get it after all; I shall wait for the Ti version, as I need a few more frames for 4K 60+ FPS in most games. My LG CX6LB 55" OLED 120Hz is asking for more juice than this.

PS: Going for the 3090 is an option but I have to save for a few months if I want that thing. Let's see what happens.


----------



## TheoneandonlyMrK (Sep 16, 2020)

lexluthermiester said:


> Actually, it seems NVidia understated the projected performance as much of the data on offer is well over what NVidia stated.


66% average, per the TPU review. And he used the old too-little-VRAM tactic on his own RTX 2080 in Doom Eternal, such that the card's memory is saturated; with Ultra rather than Extreme textures, the difference is back down to an already impressive 77%. The outliers in a small set of pre-release demos inflated expectations.

Thereby proving that VRAM size can limit performance given two years' progress, while selling you a 10GB 4K card. The guy's a genius, for sure.


----------



## sam_86314 (Sep 16, 2020)

It looks like I've found my next GPU unless AMD has something that competes (though even then, I need Cuda and RTX for 3D rendering).

I'll probably pick it up next year when prices fall. Maybe NVIDIA will get my tax return money.


----------



## erocker (Sep 16, 2020)

Good luck to everyone within the 5-10 minute window tomorrow trying to get one of these!


----------



## Lindatje (Sep 16, 2020)

A disappointment: not the double performance over the 2080 Super that Nvidia promised.
Very high power consumption, and AIB cards are even higher.

RDNA2 in October .....


----------



## Parn (Sep 16, 2020)

Tomgang said:


> 10 gb for 4k will not last a few years in the future for all games. So because of the vram, I will take my chances and wait in the hope of a rtx 3080 20 GB or a rtx 3080 ti in stead. Sure that will be more expensive, but might pay out in the long run because of more vram = better future proofing.



Exactly. With 3090 at 24GB, a mid-generation SUPER refresh with doubled amount of vram is pretty much guaranteed. So 3080S 20GB and 3070S 16GB.


----------



## lexluthermiester (Sep 16, 2020)

birdie said:


> If you read the comments you might get the impression that almost everyone here has either a 1440p or 4K high refresh monitor, while in reality:
> 
> 
> 
> ...


Exactly. For my main gaming system, I am in that 6.59% of 1440p users, but the system I'm typing this comment on is dual 1080p, which is also counted on Steam. And there is a third system in the living room connected to our 4K TV which also has Steam on it. For reference, that accounts for just 3 of the 9 PCs in this home. While I'm well aware that my home is a vast outlier in technology saturation, most of the systems here are still on 1080p and no one is complaining.

Even with this new series of GPUs, few are going to move away from 1080p. That is the one conclusion W1zzard made that I disagree with; I think he underestimates that market sector just a little bit. Lots of people will be buying the 30X0 series to upgrade their 1080p gaming experience, and a massive upgrade they will get!

The thing is, it can safely be said that very few will buy the 3090 for 1080p gaming. Then again, those numbers are still incoming and yet to be publicly seen.


----------



## Colddecked (Sep 16, 2020)

okbuddy said:


> beware again: only 30% faster than 2080ti*



* For 60 percent of the price


----------



## 5150Joker (Sep 16, 2020)

okbuddy said:


> beware again: only 30% faster than 2080ti



Actually less, if you consider it runs at a higher max boost clock than the 2080 Ti in this review. If you put both at the same clocks and assume equal efficiency, knock another 5-6% off the difference between the two. The card is great IF you can get one for $699, but its value drops when you spend above that amount. I've seen ASUS asking $850 + tax for their 3080 Strix, which is just hilariously stupid. Yes, the official price of a new 2080 Ti was $999+, but it's been discontinued and now there are a lot of idiots offloading them for $400-450, which is both ridiculous and a better deal than the 3080.

Also I would like to see modern BR games tested in some fashion using a 1080p 240hz display as a lot of gamers (esp esports) have gone that direction rather than 4K.


----------



## birdie (Sep 16, 2020)

Lindatje said:


> A disappointment, no double performance compared to the 2080Super that Nvidia promised.
> Very high power consumption, aib`s are even higher.
> 
> RDNA2 in October .....



The RTG hype train has been excellent at overpromising until we get reviews and then the hype train derails hard. And the "let's wait" game never fails to disappoint. Too bad most people still prefer overpriced hot garbage with gimmicky tech from NVIDIA as according to Steam HW stats, the overhyped God Lisa Su bestowed RX 5000 series fails to register in the charts standing at 1.13% which is below a single card from the GeForce 20 RTX series, which is the RTX 2060 SUPER at 1.25%. It's amazing and extremely puzzling how and why the comments sections of major tech websites are flooded with AMD fans. Brand loyalty is effing stupid.

And "leaks" like this one are just disgusting: https://videocardz.com/newz/amd-radeon-rx-6900xt-leaks-in-new-photos If AMD had been confident in Big Navi, they wouldn't have tried to steal the thunder from this NVIDIA release.


----------



## lexluthermiester (Sep 16, 2020)

theoneandonlymrk said:


> There by, proving that Vram size Can limit performance given two years progress , while selling you a 10GB 4K card, the guy's a genius for sure.


Ah, but there's the kicker: these are just the cards from NVidia themselves. There are no limits (that have been made public) requiring AIBs to stick to 10GB of RAM. I'm waiting for the 12GB or 16GB models of the 3080 from EVGA. There have always been cards released with more RAM than the reference designs; the GTX 400, 500, 600, 700 & 900 series all had increased-capacity models, and I've had many of them.


----------



## TheoneandonlyMrK (Sep 16, 2020)

birdie said:


> The RTG hype train has been excellent at overpromising until we get reviews and then the hype train derails hard. And the "let's wait" game never fails to disappoint. Too bad most people still prefer overpriced hot garbage with gimmicky tech from NVIDIA as according to Steam HW stats, the overhyped God Lisa Su bestowed RX 5000 series fails to register in the charts standing at 1.13% which is below a single card from the GeForce 20 RTX series, which is the RTX 2060 SUPER at 1.25%. It's amazing and extremely puzzling how and why the comments sections of major tech websites are flooded with AMD fans. *Brand loyalty is effing stupid.*
> 
> And "leaks" like this one is just disgusting: https://videocardz.com/newz/amd-radeon-rx-6900xt-leaks-in-new-photos If AMD had been confident in Big Navi they wouldn't have tried to steal the thunder from this NVIDIA release.


Demonstrated perfectly.


----------



## Lindatje (Sep 16, 2020)

birdie said:


> The RTG hype train has been excellent at overpromising until we get reviews and then the hype train derails hard. And the "let's wait" game never fails to disappoint. Too bad most people still prefer overpriced hot garbage with gimmicky tech from NVIDIA as according to Steam HW stats, the overhyped God Lisa Su bestowed RX 5000 series fails to register in the charts standing at 1.13% which is below a single card from the GeForce 20 RTX series, which is the RTX 2060 SUPER at 1.25%. It's amazing and extremely puzzling how and why the comments sections of major tech websites are flooded with AMD fans. Brand loyalty is effing stupid.
> 
> And "leaks" like this one is just disgusting: https://videocardz.com/newz/amd-radeon-rx-6900xt-leaks-in-new-photos If AMD had been confident in Big Navi they wouldn't have tried to steal the thunder from this NVIDIA release.


And the reason for your rant is...? This is a review of the *Nvidia* 3080, not AMD; the only thing I said is that RDNA2 is coming in October.


----------



## HABO (Sep 16, 2020)

ZekeSulastin said:


> Sorry to bother you, but did you take a look at either reducing the power limit or manually altering the voltage/freq curve in Afterburner/similar?  computerbase.de reported only a few percent less performance with a 270 W limit, and someone linked me a video of someone changing that curve to good effect on another site.
> 
> It honestly sounds like Turing, where you can't drop the voltage with the slider but can with the curve editor.



Yes, the review is bullshit regarding that undervolting; it's possible through that curve setting as usual, but the power savings are not as significant as I thought.


----------



## Krzych (Sep 16, 2020)

EarthDog said:


> I'd love to see ANY 2080Ti run 25% faster with a ambient cooled overclock..........



On ambient cooling you are not getting past 16-17% faster than a stock FE, and that's with a 380W BIOS flashed (so 150% PL vs. stock FE) and a custom loop, unless you want to run at 60 dB. So realistically, on an average setup you are not getting even halfway to a 3080 FE. And Ampere's OC potential is still unknown, since for now we only got 116% PL at 75°C, while this 16-17% increase for the 2080 Ti is at 150% PL and 50°C. Also, the 3080 isn't really a replacement for the 2080 Ti; you didn't buy a $1200+ card to lie to yourself that you can get close to a $700 card. You sell it and get a 3090.


----------



## mouacyk (Sep 16, 2020)

lexluthermiester said:


> Ah, there's the kicker though: These are just cards from NVidia themselves. There are no limits(that have been made public) put on AIB's to require 10GB of RAM limit. I'm waiting for the 12GB or 16GB models of the 3080 from EVGA. There have always been card released with amounts of RAM that go beyond the reference designs. GTX 400s, 500s, 600s, 700s & 900s all had increased capacity RAM models made, I've had many of them.


Since we're throwing out Micron Density specs, why not 13GB or 13.578GB?


----------



## lexluthermiester (Sep 16, 2020)

Lindatje said:


> And the reason of your rant is ....? It's a review of the *Nvidia *3080 and not AMD, only thing I said is that RDNA2 is coming in October.


He was making a comparative point while at the same time offering an opinion on a point W1zzard stated.



mouacyk said:


> Since we're throwing out Micron Density specs, why not 13GB or 13.578GB?


I didn't say anything about "Micron Density specs" at all. I was talking exclusively about on-card memory capacity.


----------



## mouacyk (Sep 16, 2020)

lexluthermiester said:


> I didn't not say anything about "Micron Density specs" at all. I was talking exclusively about about on-card memory capacity.


You seem to have missed the fundamentals that Micron is only producing 8Gb and 16Gb chips, for use with 32-bit memory controllers, of which the 3080 has 10 enabled of the 12 total.  Hence, we got 10GB, and may possibly get 20GB in a future refresh.  If you're making up numbers, please show the math.
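A quick sketch of that chip arithmetic (one GDDR6X chip per active 32-bit controller, using the Micron densities mentioned above):

```python
def vram_gb(active_controllers, chip_density_gbit):
    """One chip per active 32-bit controller; 8 bits per byte."""
    return active_controllers * chip_density_gbit / 8

# RTX 3080: 10 of 12 controllers enabled (320-bit bus)
print(vram_gb(10, 8))   # 10.0 GB with 8 Gb chips (shipping config)
print(vram_gb(10, 16))  # 20.0 GB if a refresh moved to 16 Gb chips
```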


----------



## John Naylor (Sep 16, 2020)

Odd how people can see the same thing and come up with a different set of numbers.

1.   Relative performance based upon average fps:

1080 ==> 2080 = +45%
2080 ==> 3080 = +52% ... not bad, not bad at all .... more than I expected
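(Those generational gains compound multiplicatively rather than adding; a quick sanity check on the two figures above:)

```python
def compound(*gains_pct):
    """Chain generational speedups: +45% then +52% multiply, not add."""
    total = 1.0
    for g in gains_pct:
        total *= 1 + g / 100.0
    return round((total - 1) * 100)

print(compound(45, 52))  # 120 -> a 1080-to-3080 jump of about +120%
```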

2.  35 dBA is a bit loud, but not significantly different than previous FEs ... no opinion until we see AIBs

3.  80°C is a bit hot, 2°C hotter than the 2080 Super ... expected more ... no opinion until we see AIBs

4.  Power consumption ... 30 watts more than the 2080 Ti in average gaming ... only 11%? Expected a lot more for 23% more average fps.

5.  Have 0 concerns about memory. With the rumors about AIB versions with different memory configurations in play, I'm dying to see them compared. Yes, I'm sure folks will be able to crank up settings in the most demanding games and show a performance difference. But where we have seen significant differences before, it's for the most part at, say, 18 vs. 25 fps, where the game is already unplayable. And yes, we will see tools used that purport to show memory usage in excess of what the lower-VRAM offering has. But, as in the past, what we won't see is a significant number of games where the user experience, the way users actually play the game, is altered while the games are actually playable and without doing really, really strange things.

6.  I'll take one exception to the review conclusions: "Makes little sense for gamers without a 4K monitor". To date, I find that I more often enjoy playing at 1440p w/ ULMB than at 4K with Sync. The 3080 managed to hit 120 fps in just under half the games. I won't be investing in a 4K monitor until I see one do 120 Hz w/ ULMB. The 3080 would drive 1440p to 120 Hz in a bit less than 50% of games, and that's enough to justify the purchase.

Nvidia scared of AMD? ... yeah, like the Ravens were scared of the "This is our year" Cleveland Browns last Sunday (Browns went down 38-6). The generational fps increase for nVidia at 1440p was 52%; to match it, AMD has to do 100% better than their 5700 XT and 67% better than the Radeon VII. I wouldn't bet the ranch. I have no doubt that AMD has a substantial improvement coming, more than one could normally expect in a single generation, but this looks like just too big a mountain to climb. I hope I'm wrong.


----------



## Tomgang (Sep 16, 2020)

Parn said:


> Exactly. With 3090 at 24GB, a mid-generation SUPER refresh with doubled amount of vram is pretty much guaranteed. So 3080S 20GB and 3070S 16GB.




I hope the Super/Ti versions come out fast. I have sold my 1080 Ti and am currently suffering on a GTX 1060 6GB. I will wait for AMD Zen 3 and an RTX 3000 card with more VRAM before taking the jump to a new system.


----------



## shk021051 (Sep 16, 2020)

I'll Wait for RDNA2 then decide


----------



## lexluthermiester (Sep 16, 2020)

mouacyk said:


> You seem to have missed the fundamentals that Micron is only producing 8Gb and 16Gb chips, for use with 32-bit memory controllers, of which the 3080 has 10 enabled of the 12 total. Hence, we got 10GB, and may possibly get 20GB in a future refresh. If you're making up numbers, please show the math.


Citation? Don't bother arguing, because neither Micron nor NVidia have declared or disclosed those specific specifications. Additionally, pairing additional RAM dies to a controller of a given width is almost trivial and comes with little performance penalty.


----------



## cellar door (Sep 16, 2020)

@*W1zzard*
Thanks for a great, comprehensive review!

One thing that might be useful to some users could be a table with the decode/encode support, kinda like the one you have for the general specs. Easy to read and look up the differences. Cheers


----------



## Xuper (Sep 16, 2020)

If you go 1080p, you cannot go back to 720p.
If you go 1440p, you cannot go back to 1080p.
If you go 2160p, you cannot go back to 1440p.

That being said, your eyes get excited by the new world but will hurt in front of the old one. So for the next generation of cards, make sure you have enough budget; you can't keep up with the current generation if you're at the high tier. 4K is no joke.


----------



## EatingDirt (Sep 16, 2020)

birdie said:


> The RTG hype train has been excellent at overpromising until we get reviews and then the hype train derails hard. And the "let's wait" game never fails to disappoint. Too bad most people still prefer overpriced hot garbage with gimmicky tech from NVIDIA as according to Steam HW stats, the overhyped God Lisa Su bestowed RX 5000 series fails to register in the charts standing at 1.13% which is below a single card from the GeForce 20 RTX series, which is the RTX 2060 SUPER at 1.25%. It's amazing and extremely puzzling how and why the comments sections of major tech websites are flooded with AMD fans. Brand loyalty is effing stupid.
> 
> And "leaks" like this one is just disgusting: https://videocardz.com/newz/amd-radeon-rx-6900xt-leaks-in-new-photos If AMD had been confident in Big Navi they wouldn't have tried to steal the thunder from this NVIDIA release.


We get it, you don't like AMD and Radeon. You've replied and posted in this thread multiple times ranting about the AMD fanboys that barely exist in this thread.

AMD GPU's are a month and a half off, it's not unreasonable for people to wait and see what performance Big NAVI & RDNA2 brings to the table before spending $700+ on a GPU.


----------



## TheoneandonlyMrK (Sep 16, 2020)

John Naylor said:


> Lots


Still sitting on the fence with no comments ,despite clear disparity between what 2X sounds like verses reality.


----------



## TheEndIsNear (Sep 16, 2020)

I have no brand preference. Right now I have a 2700X with a 5700 XT and a 2K 144Hz FreeSync monitor that does around 60 to 70 fps on Badass settings in Borderlands 3, and it does what I need it to do. But there's also a part of me that says get this, and I'm trying to drown that out, because I have what I need... but I want it!


----------



## W1zzard (Sep 16, 2020)

John Naylor said:


> 6 I'll take 1 exception to the review conclusions "Makes little sense for gamers without a 4K monitor ". To date, I find that I more often enjoy playing better at 1440p w/ ULMB than playing at 4k with Sync. The 3080 managed to hit 120 fps in just under half the games. Won't be investing in a 4k monitor until I get to see it do 120 Hz w/ ULMB. The 3080 would drive it to 120 Hz in a bit less than 50% of games and that's enough to justify the purchase.


Fair point, I updated the entry to "Makes little sense for gamers without a 4K monitor, or 1440p 144+ Hz"


----------



## JRMBelgium (Sep 16, 2020)

Thank you so much for testing with the VII and for testing BFV. All other reviewers stopped testing the Frostbite engine, and for Battlefield players that benchmark is very important.
But since the 3080 is only 27% faster than my VII in BFV, I'm gonna wait for the Big Navi benchmarks and see if it still performs much better than the Nvidia counterparts.
For some reason, Frostbite loves AMD cards.


----------



## md2003 (Sep 16, 2020)

JRMBelgium said:


> Thank you so much for testing with VII and testing of BFV. All other reviewers stopped testing Frostbite engine and for Battlefield players, that benchmark is very important.
> But since the 3080 is only 27% faster than my VII in BFV, I'm gonna wait for the Big Navi benchmarks and see if it still performs much better then the Nvidia counterparts.
> For some reason, frostbite loves AMD cards.


27% at 1080p, you mean? If I am not totally wrong here, BFV is capped at 200 FPS.


----------



## chispy (Sep 16, 2020)

Thank you @W1zzard for an excellent review , job well done !

Boss, I've got a couple of questions. In theory, the limited overclocking headroom is due to the power limits, temps, and low voltage (1.081 V). Do you think a v-mod, like the shunt mods I have done on many cards, will help with overclocking headroom? I'm planning to run this on a water chiller at -21°C for HWBot competitive benchmarking.


----------



## dir_d (Sep 16, 2020)

I think i am going to wait for RDNA2 before i make a decision.


----------



## EarthDog (Sep 16, 2020)

John Naylor said:


> ... yeah like the Ravens were scared of the "This is our year" Cleveland Browns last Sunday ....


I can't escape it here either...     

(Browns Fan... plenty old enough to know the 80s...the good old days).


----------



## 5150Joker (Sep 16, 2020)

W1zzard said:


> Fair point, I updated the entry to "Makes little sense for gamers without a 4K monitor, or 1440p 144+ Hz"



Any chance you will test 240 Hz/1080p with popular BR games (e.g. Warzone)? I know reproducing a test can be hard in a BR, but there are *75 million* of us out there, which is a fair bit more than for the SP games tested on this site and others.



Xuper said:


> If you go 1080p , you can not go back to 720p
> If you go 1440p , you can not go back to 1080p
> If you go 2160p , you can not go back to 1440p
> 
> That's being said , Your eyes is excited to new world but will hurt in front of old world, so for next generation card , make sure you have at least enough budget , you can't keep up with current generation if you're at high tier.4K is no joke.



Not true, I went back to 1080p/240 Hz from 1440p/144 Hz because the image quality improvement vs performance loss wasn't worth it.


----------



## Flow (Sep 16, 2020)

You get used to the resolution quite easily, even going back. Although I wouldn't want to go back from 1440p currently. I tried a 4K 28" monitor for a while, but the frame rates were too low. I also had to enlarge fonts on the Windows desktop, or else they were simply too small for my taste; a larger 4K screen would mitigate that, though.
Graphically it looked very crisp, even the desktop. But my current 1440p monitor is hopefully gonna serve me for many years. 144Hz is pretty smooth if you get the fps to match.


----------



## EzaaaH (Sep 16, 2020)

specopsFI said:


> @W1zzard
> 
> In the overclocking section, you mention shortly that undervolting is not possible. Can you elaborate on this *very *important point? Is it prevented on Ampere on a hardware level or is it because the necessary tools are not yet available? The inability to undervolt these extremely power hungry chips would be a serious shortcoming, which I can't believe nVidia would exclude for this exact generation, since they have allowed it for so long with previous generations with access to the whole voltage/clock curve.
> 
> Other than that, an excellent article as usual!


It seems the primary issue is the pre-release drivers and lack of support in software like Afterburner. Other sites have indicated even the OC Scanner tool doesn't work yet.

This will MOST LIKELY change moving forward. There's no reason to believe Nvidia would now decide to lock out these features. It would give them unnecessary bad PR and lost credibility with enthusiasts.

I don't like how this article said it though. I could be wrong but I'd be very surprised if this was the case.


----------



## efikkan (Sep 16, 2020)

Vya Domus said:


> I don't think anyone is wondering whether or not a game is going to need more than 10 GB any time soon because they wont. Running them at 4K with everything maxed out will, most definitely cause >10GB usage quite soon.





Vya Domus said:


> That's a strange way to think about it, if it is possible to exceed that amount then it inevitably becomes a limitation.


Proper benchmarks will be the proper way to test whether games actually need more than a certain amount of VRAM.
Unless we are talking about custom mods which may cause abnormal VRAM usage, it's very unlikely that 10 GB will be a bottleneck for this card anytime soon. Nvidia has a very good idea of the big titles coming in the next year or so; if this were going to be a problem, they wouldn't release the card with 10GB.

The reality, though, is that the way games commonly use VRAM means they also need more bandwidth and computational performance to utilize more VRAM in a single frame. So by the time games need >10 GB to run top titles in 4K, this card will not be fast enough anyway. Many have made bold predictions about this kind of stuff for years, but unless games start to work in a different way, I would expect other bottlenecks to occur before capacity problems (not accounting for the odd edge case, of course).



xkm1948 said:


> W1zzard said:
> 
> 
> > Excel + VBA
> ...


You would be surprised (or horrified?) if you knew how many business critical processes are implemented using "primitive" tools like Excel and VBA. But even though it might not be an _elegant_ solution, it's often good enough and it fits internal procedures very well, so often it's actually much better than something some highly paid consultants would come up with. Many programmers make fun of this kind of stuff, but sometimes these improvised tools can be very low maintenance and work very well for many people.


----------



## ppn (Sep 16, 2020)

On top of that, the more prevalent RTX becomes, the less memory buffer is needed, since who needs textures anymore when everything is done with rays, or something like that, so I believe.

So in order to be 31% faster than the 2080 Ti with bandwidth that is only 23% higher, the GPU must be 39% better, and that corresponds to 6144 CUDA cores operating as FP32 and 2560 as integer in games on average. The 3070 must be 4352 FP32 and 1536 integer, lagging behind the 2080 Ti by 16%; with 448 GB/s of memory bandwidth, the decision not to use GDDR6X (606 GB/s) is a great loss. I like the idea of a 3060 Ti ASAP.


----------



## Turmania (Sep 16, 2020)

I should be happy, but I'm not! The power consumption makes me sad.


----------



## Searing (Sep 16, 2020)

Raevenlord said:


> I am now thoroughly convinced the RTX 3080 is absolute overkill for my 1440p monitor.



No, it is just right. Jedi Fallen Order at 144 fps exactly. That's a last-gen game!


----------



## efikkan (Sep 16, 2020)

ppn said:


> On top of that the more RTX becomes prevalent the less memory buffer is needed. SInce who needs textures anymore. when everything is done with rays or something like that so i believe.


Textures will not become less useful with raytracing, quite the contrary, as more textures will be needed to represent the various properties of materials.

But raytracing may reduce the need for many render passes, temporary framebuffers etc. used in various advanced lighting techniques.



ppn said:


> So in order to be 31% faster than 2080Ti with bandwidth that is only 23% better, means the GPU must be 39% better and that corresponds to 6144 Cuda operating as FP32 and 2560 Cuda as Integer in games on average. 3070 must be 4352 fp32 1536 Integer lagging behind 2080Ti by 16% for the 448GBs memory the decision not to use GDDR6X 606GBs is a great loss. i like the idea of 3060Ti Asap.


Don't get too caught up in theoretical specs, good benchmarking is what actually matters.
Ampere is a major architectural overhaul, including caches etc. which means you can't extrapolate the performance based on older architectures.


----------



## moproblems99 (Sep 16, 2020)

Well, if the 3080 sells out, I'll try for the 3090. If that sells out, I'll wait for RDNA2. Unless one of the former comes back in stock.


----------



## B-Real (Sep 16, 2020)

I'm just curious about the opinions on this power consumption from those people who in the old days made fun of the 290X or 390X, or even the 480 or Vega cards. What do they say now about the extra 90W it needs compared to the 2080, or about the efficiency increase of the 3080 over the 2080 versus that of the 1080 over the 980?



birdie said:


> People should forget about this card performance, power efficiency and price/performance at 1080 and 1440p - this card is *made for 4K* unless you wanna get 500fps@1080p in your favourite online shooter. Period.
> 
> Overall, I'm pleased with what NVIDIA has achieved with this generation but since I'm not a filthy rich European/US citizen who can afford a high refresh 4K HDR IPS monitor, I will simply forget about it.
> 
> ...



Wut? There are already released games that won't hold a fixed 60 fps in 4K with the 3080. What do you say about the extra 90W of power consumption compared to the 2080? You must have doomed AMD for the consumption of even the RX 480 back in those days. And the next-gen games are only arriving in 2021 and onward.


----------



## Ben_UK (Sep 16, 2020)

Those power consumption figures are absolutely horrible. So much for VGA firms going for more efficient, power-saving architectures. It looks like progress stopped with the 1070/1080 series on that front.


----------



## 5150Joker (Sep 16, 2020)

Ben_UK said:


> Those power consumption figures are absolutely horrible. So much for VGA firms going for more efficient, power saving architectures. It looks liek the progress stopped with the 1070 / 1080 series on that front.



I think it has to do with Samsung rather than them not trying to be more energy efficient.


----------



## Auer (Sep 16, 2020)

If you're truly worried about the power consumption, play games on your phone.


----------



## Ashtr1x (Sep 16, 2020)

Great review. Thanks.

The 3080 for 1080p seems like overkill, but the high-FPS performance difference vs. Pascal is great; I run a Maxwell card, so this is supremely fast for me. For new titles that use more complex engines and newer 9th-gen assets, 1080p at high refresh rates with RT would be much better than dropping to a standard 60 Hz refresh for 4K. 1440p seems apt as the middle ground, but it looks like we need Ryzen 4000 and Intel Rocket Lake to truly push more FPS at 1080p in existing and older titles. I will wait for those, and also for the 3080 Super/Ti with a 14/16GB G6X refresh that is probably inevitable, making it a long-life purchase for 1080p/1440p; 4K with RT takes a massive hit to sub-60 FPS and then needs DLSS too.


As for the design, I do not like that ugly 12-pin connector; it stands out like a sore thumb, and the cabling will look pathetic no matter what you do. Other than that, the card's full-metal design trumps the AIB cards. It's a shame that many AIBs are making crappy-looking GPUs: EVGA of all companies made the ugliest design, and they make a solid Dark-series mobo. ASUS is too stylized vs. their 2080 Strix. I will wait for those reviews too, as well as regarding that weird fan that blows toward the VRM/CPU/memory; too many flaws in the FE for that stunning look, not worth it.

Also a shame that Nvidia dropped NVLink on the 3080, especially given that two 3080s in SLI would reach 3090 pricing and destroy it in performance; the GA102-based 3080 loses NVLink this time.


Much needed for PCMR.


----------



## jellyrole (Sep 16, 2020)

@W1zzard Is there any plan to add the Call of Duty series back into reviews any time soon? Seems like it's getting a lot more attention on PC lately.


----------



## KainXS (Sep 16, 2020)

The only time I think you have to worry about power is in cases like the RX 480 and Vega (for example), where some cards come with core voltage set so high that you hit the power limit and throttle, and you have to undervolt (to reduce power draw) to get more performance. That is the only real case I can think of where it matters.


----------



## Fluffmeister (Sep 16, 2020)

Auer said:


> If youre truly worried about the power consumption, play games on your phone.



Fermi = bad, 290X = good,  295X2 = perfectly acceptable, Vega = underclockers dream.

It's the years they have to wait for the same performance that drive them crazy, I think.


----------



## SN2716057 (Sep 16, 2020)

I can't wait to play RDR2 at 90-ish fps. Although, having seen the tear-down, and considering I'm gonna water cool it, I might wait for one of the board partners.
W1zzard, do you already have some of the board partners' cards?


----------



## B-Real (Sep 16, 2020)

KainXS said:


> The only real time I think you have worry about power is in cases like the RX 480 and Vega(for example)) where some cards come with core voltage that is so high it causes you to hit the power limit and throttle and you have to undervolt(to reduce power draw) to get more performance. That is the only real case I can think of where that matters.


Well if you consider there was a node change, that extra 80-90W consumption can hardly be defended. Just check these 2 graphs:

I know the 980-to-1080 comparison is at 1440p, but 4K shows the same difference (63% instead of 64%).

And regarding performance (the increase between the 980 Ti and 1080 is around 37%, and between the 2080 Ti and 3080 around 31-32%), you should note the OC capabilities of the two cards: the 1080 could be OC'd by 13%, whereas the 3080 can be OC'd by only around 4%. So there is a roughly 10-percentage-point difference between the two.



Fluffmeister said:


> Fermi = bad, 290X = good,  295X2 = perfectly acceptable, Vega = underclockers dream.
> 
> It's the years they have to wait for the same peformance which drives them crazy I think.


I don't see you laughing at the 3080's power consumption or small efficiency gain. I'm definitely sure you did it with Vega or even RX 480.


----------



## Upgrayedd (Sep 16, 2020)

W1zzard said:


> For the 3080, Full HD really is more of an academic resolution, I doubt anyone will buy that card for 1080p


I mean, RDR2 says otherwise. Yeah, it's 100fps, but there are 165Hz 1440p displays. Let's see what full-blown CP2077 thinks of the 3080.


----------



## Fluffmeister (Sep 17, 2020)

B-fucking-real said:
			
		

> I don't see you laughing at the 3080's power consumption or small efficiency gain. I'm definitely sure you did it with Vega or even RX 480.



Hmm? Its performance per watt seems to top the charts at 4K?

Vega was a late 1080 competitor, and Polaris was super juicy too; as those graphs you just posted show, that RX 590 was soooooo bad.


----------



## kings (Sep 17, 2020)

Comparing this with Vega is tremendous intellectual dishonesty. If we take things out of context, they lose their meaning.

Vega's consumption was criticized because of the performance/watt ratio it offered. It needed about 300W to match Nvidia's 170-180W equivalent-performing card. If Vega 64 had humiliated the 1080 Ti, you can be sure consumption would have been the subject of much less talk.

If AMD presents something similar to the 3080 that consumes much less power, you will see the same criticisms that were once made of Vega, just as Intel's CPU power consumption is criticized against Ryzen at the moment.


----------



## WeeRab (Sep 17, 2020)

There's no way i'm buying ANYTHING from someone who wears a leather jacket in the kitchen.


----------



## B-Real (Sep 17, 2020)

Fluffmeister said:


> Hmm? It's performance per watt seems to top the charts at 4k?
> 
> Vega was a late 1080 competitor, and Polaris was super juicy too, as those graphs you just posted show, that RX 590 was soooooo bad.


Have I said that it's not the leader in the chart? Some were trying to defend the 3080's efficiency by saying it leads the chart. I mentioned that, compared to the 980-to-1080 efficiency change, the 3080's gain over the 2080 is nowhere near it (55% vs. 17%). In fact, it's only a little better than the 1080-to-2080 efficiency gain (12%). When the 3070 or 3060 arrive, they will definitely take the lead from the 3080, and judging by the TDPs, I'm assuming even the 3090 will. And don't forget RX 5000 caught up with NV; the 5600 XT even took first place from NV, nearly equalling the leader 1650 in FHD and surpassing it at 1440p. The RX 5700 was also better than its rivals, the 2070 and 2070S, in perf/W. So there is a good chance they bring efficient cards again.

And on a personal note: may I ask you not to play with my name? Thank you!
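The generational efficiency claims being traded here all reduce to a ratio of performance per watt. A quick sketch of the arithmetic (the numbers below are hypothetical round figures, not TPU's measured data):

```python
def perf_per_watt_gain(fps_new, watts_new, fps_old, watts_old):
    """Generational efficiency gain in percent: how much more performance
    per watt the newer card delivers compared to the older one."""
    return ((fps_new / watts_new) / (fps_old / watts_old) - 1) * 100

# Hypothetical example: the new card is 30% faster but draws 320 W vs 250 W,
# so almost all of the performance gain is bought with extra power.
gain = perf_per_watt_gain(130, 320, 100, 250)
print(f"{gain:.1f}% better perf/W")  # -> 1.6% better perf/W
```

This is why a card can top the absolute perf/W chart while still representing only a small generational efficiency step.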


----------



## Bruno_O (Sep 17, 2020)

birdie said:


> People should forget about this card performance, power efficiency and price/performance at 1080 and 1440p - this card is *made for 4K* unless you wanna get 500fps@1080p in your favourite online shooter. Period.



Except for the ridiculously low VRAM size: 10GB is not enough for 4K now and won't be in the future!

About the card: good performance bump, but the VRAM size (to be fixed with a 20GB Ti variant), power usage, and heat are *pathetic*. I have an HTPC, so I do care about thermals, and I'm looking forward to RDNA2 and its promised superior efficiency. If it underperforms, then a 3070 Ti 16GB (being more efficient in theory, as it's 220W) could be the deal.


----------



## mechtech (Sep 17, 2020)

Wow that is a crazy amount of transistors!!!!!

~ 50% to almost 100% hit in performance for RTX wow

Need a relative performance per transistor chart.

For the frame-time charts: for someone with a red/green colour deficiency they are not very friendly. Orange/blue or something with more contrast would be way easier to distinguish.

Nice review.  

I will wait for the 350 cnd$ range/budget cards though.


----------



## gabiroli (Sep 17, 2020)

I think the people who are impressed by this card are blinded by Nvidia.   This card is only impressive because the 20 series was way overpriced....1200 for a 2080ti...meanwhile 2080 is a rehashed 1080ti.
20-30% improvement is still good. over 2080ti especially for the money.  But if the 2080ti was 700$ like how the 1080ti was this card would be irrelevant as well.   0GB VRAM is very low... for 4k
Think about it: the 1080 Ti had 11GB almost 4 years ago. It is enough today... but it won't be enough 2 years from now. It's planned obsolescence. One of the selling features of this GPU is NVCache, which takes advantage of VRAM, which the GPU doesn't have ANY spare of at 4K anyway. All major games will take advantage of it since the PS5 and XBSX will have it too...


----------



## Auer (Sep 17, 2020)

gabiroli said:


> I think the people who are impressed by this card are blinded by Nvidia.   This card is only impressive because the 20 series was way overpriced....1200 for a 2080ti...meanwhile 2080 is a rehashed 1080ti.
> 20-30% improvement is still good. over 2080ti especially for the money.  But if the 2080ti was 700$ like how the 1080ti was this card would be irrelevant as well.   0GB VRAM is very low... for 4k
> think about It 1080ti had 11GB almost 4 years ago.   It is enough today... but it wont be enough 2 years from now.   It's planned Obsolescence.   One of the selling features of this gpu is nvcache which takes advantage of vram.   Which cpu doesn't have ANY SPARE to give at 4k anyways.   All major games will take advantage of it since ps5 and xbsx will have it too...


Very nice of you to make an account and warn the world about Nvidia, Thanks.


----------



## birdie (Sep 17, 2020)

gabiroli said:


> I think the people who are impressed by this card are blinded by Nvidia. This card is only impressive because the 20 series was way overpriced....1200 for a 2080ti...meanwhile 2080 is a rehashed 1080ti.
> 
> 20-30% improvement is still good. over 2080ti especially for the money. But if the 2080ti was 700$ like how the 1080ti was this card would be irrelevant as well. 0GB VRAM is very low... for 4k
> 
> think about It 1080ti had 11GB almost 4 years ago. It is enough today... but it wont be enough 2 years from now. It's planned Obsolescence. One of the selling features of this gpu is nvcache which takes advantage of vram. Which cpu doesn't have ANY SPARE to give at 4k anyways. All major games will take advantage of it since ps5 and xbsx will have it too...



It's amazing you've just signed up to spoil the launch with your very valuable opinion. Also, remember, we live in a free society, so please stop it with "overpriced". Did NVIDIA put a gun against your head and ask you to buy any of their GPUs? No? Then how on Earth are they overpriced? Also, I agree, "*0GB* VRAM is very low... for 4k". Except this card features 10GB and we have close to zero games which actually require more than 8GB of VRAM at 4K. Also read the rest of my comment.

Speaking of _"planned obsolescence"_ due to a lack of VRAM:






See how the GTX 1060 3GB still works relatively OK despite not having enough VRAM even 5 years ago. Yes, it's very slow, in fact 33% slower, but not 2 or 3 times as slow as its 6GB brother. Also, see how _both cards are unusable_ at this resolution.



Bruno_O said:


> except for the ridiculously low VRAM size, 10GB is not enough for 4k now and won't be in the future!
> 
> About the card, good performance bump, but the VRAM size (to be fixed with a 20GB TI variant), power usage and heat are *pathetic*. I have an HTPC so I do care about thermals, looking forward for RDNA2 and its promised superior efficiency. If it under performs, then a 3070 TI 16GB (being more efficient in theory, as it's 220W) could be the deal.



Game developers target the most popular GPUs, and most of them have 8GB of VRAM or even less. Consoles also won't really offer more than 8GB of VRAM, because their memory pool is shared between the OS core, game code, and VRAM, and it all has to fit into 16GB or less. And both consoles feature uber-fast texture streaming off their storage.
Also, it's been shown time and again that NVIDIA has superb drivers and that their GPUs' performance is not seriously affected even when there's not "enough" VRAM. E.g. Call of Duty: Modern Warfare eats VRAM for breakfast (over 9.5GB of VRAM used at 4K), yet are NVIDIA cards with 6GB of VRAM affected? Not at all. Besides, with RTX IO it all becomes moot.
Lastly, by the time 10GB of VRAM is not enough, this GPU's performance will be too low even if it had twice as much VRAM.
Still, you can always buy a future-proof GPU from AMD. BTW, do you remember the Radeon R9 Fury? It was released with a paltry 4GB of VRAM, which was strangely enough for AMD fans.


----------



## wolf (Sep 17, 2020)

@W1zzard , you say undervolting isn't possible, can you elaborate on that? I'm thinking perhaps you refer to the boost curve editor in MSI AB that most would use to undervolt, but is it possible to set the power limit lower than 100%? say 90% which would effectively underclock and undervolt the card.



B-Real said:


> I'm just curious about those people's opinion about the power consumption who in the old days made fun of the 290X or 390X, or even the 480 or Vega cards. What do they say now about the extra 90W power it needs compared to the 2080 or the efficiency increase compared to the 2080 vs. 1080 to the 980?



Ok lets take Vega, the problem is that it was ~1 year late to the party in matching GTX1080 performance, add to that it wanted a LOT more power. If it wanted more power but offered more performance it would have been a lot less of an issue IMO. 

With the 3080, absolutely it wants a lot of power, perhaps alarmingly so for some, but it's the undisputed top dog, totally unmatched for now, *and* it managed to raise the bar in efficiency; they've just ratcheted up the board power to deliver the big gains. Could it be better? Sure. But it's certainly not a repeat of Vega. If AMD swoops in now and beats its perf/watt ratio it might look even worse, but it was still first to the party.


----------



## gabiroli (Sep 17, 2020)

I won't buy an AMD GPU. As a matter of fact, I have a 1080 Ti in my system right now. No matter what AMD brings to the table, I have to buy Nvidia for Moonlight streaming. That doesn't mean I have to like everything Nvidia tries to sell me. I'm not trying to spoil the launch for everyone, and I will be picking up an Nvidia graphics card that has adequate RAM, just not the 3080. Maybe the 3080 Ti. It still doesn't change the fact that this GPU is way overhyped. Just to prove I am not a fanboy: I'll take whichever side has the features and performance I need. For remote streaming, I prefer Moonlight now.





birdie said:


> It's amazing you've just signed up to spoil the launch with your very valuable opinion. Also, remember, we live in a free society, so please stop it with "overpriced". Did NVIDIA put a gun against your head and ask you to buy any of their GPUs? No? Then how on Earth are they overpriced? Also, I agree, "*0GB* VRAM is very low... for 4k". Except this card features 10GB and we have close to zero games which actually require more than 8GB of VRAM at 4K. Also read the rest of my comment.
> 
> 
> 
> ...



You do realise that the reason the PS5 and XSX come with PCIe Gen 4 SSDs is to use some of that storage as RAM and video RAM for games, right?





----------



## birdie (Sep 17, 2020)

wolf said:


> @W1zzard , you say undervolting isn't possible, can you elaborate on that? I'm thinking perhaps you refer to the boost curve editor in MSI AB that most would use to undervolt, but is it possible to set the power limit lower than 100%? say 90% which would effectively underclock and undervolt the card.



Power limiting is possible and has been tested by computerbase.de: https://www.computerbase.de/2020-09/geforce-rtx-3080-test/6/

The card becomes a lot more power efficient once it's limited, which means NVIDIA has clocked it as high as possible to gain extra performance at the expense of efficiency. I expect both the RTX 3090 and 3070 to be more power efficient than the RTX 3080, with the RTX 3070 the most power efficient of the three.



gabiroli said:


> Adequate ram



I'd like to see older cards with "inadequate" VRAM where the lack of it, and not the lack of GPU power, causes them to severely underperform in modern games.

*Edit*: found two, but both games are limited to the idTech Vulkan engine which probably could have been optimized better for high-res textures.


Spoiler: (two benchmark screenshots)
*Edit 2*: here's a very interesting write-up on VRAM usage: https://www.resetera.com/threads/vram-in-2020-2024-why-10gb-is-enough.280976/ Looks like monitoring tools are often reporting incorrect data.


----------



## wolf (Sep 17, 2020)

birdie said:


> Power limiting is possible and has been tested by computerbase.de: https://www.computerbase.de/2020-09/geforce-rtx-3080-test/6/
> 
> The card becomes a lot more power efficient once it's being limited which means NVIDIA has clocked it as high as possible to gain extra performance at the expense of efficiency.



Ooh, many thanks for that link! Good news personally, as I too intend to at least *try* to buy one, and upgrading from a GTX 1080 the performance uplift is so big that I'd be happy to shave a little off the top for a cooler/quieter/less hungry card.


----------



## birdie (Sep 17, 2020)

Actually I've found a review with undervolting: https://www.gpureport.cz/recenze/26...ektivita-undervolting.aspx?article=268&page=9

-50W power consumption for the loss of 1fps in the Division 2 at 4K.



> For my NVIDIA RTX 3080 Founders Edition graphics card, for example, it was 0.806 V at 1800 MHz. With this setup I not only saved a few tens of watts, degrees, and decibels, but, what's more, I lost virtually no performance. This is also evidenced by the following video, in which you will also find instructions on how to perform undervolting on GeForce RTX 3080 graphics cards.


----------



## Stefem (Sep 17, 2020)

gridracedriver said:


> as expected by me, + 10~20% increase in perf / watt and + 20~30% performance from 2080ti, but which 1.9x declared in the nVidia slides.
> I guess they are really worried about the arrival of RDNA2


It isn't NVIDIA's fault if you can't read a chart, it's 1.9x at the same performance...


----------



## Flanker (Sep 17, 2020)

First time I've seen a card so overkill for my needs that doesn't even tickle my upgrade itch. 
Looking forward to single 8-pin cards in this generation and see if there is any magic in perf/watt at 1080/1440p


----------



## nguyen (Sep 17, 2020)

Flanker said:


> First time I've seen a card so overkill for my needs that doesn't even tickle my upgrade itch.
> Looking forward to single 8-pin cards in this generation and see if there is any magic in perf/watt at 1080/1440p



It's easy to kill the performance if you so wish, though; DSR, for example...
Fans of maximum efficiency can reduce the 3080's TGP to something like 180W and still get 2080 Ti performance, though the 3070 can do that too at a 220W TGP and will cost less.


----------



## R0H1T (Sep 17, 2020)

kings said:


> Vega's consumption was criticized, due to the performance/watt ratio offered


Vega was also on an inferior node and, admittedly, an inferior uarch, which compounded their problem. But what's clear is that there's still *bias against AMD* & towards Nvidia; JHH can launch a proverbial turd (remember Fermi?) & still get accolades, while *AMD not only has to please the (gaming) audience but also pay them to do so* 

I'm willing to bet even if AMD comes within 5~15% of Nvidia's performance & perf/W lots of users will still want them to be a lot cheaper!


----------



## sergionography (Sep 17, 2020)

EarthDog said:


> Math my man.
> 
> that is 46% faster. You realize that 2x = 100% right? For example if card A ran at 100 FPS and card B ran at 146 FPS, card B is 46% faster than card A. If it was "double" it would be 100%.


Actually, you are incorrect, my friend. 54% of the 1080 Ti's performance means it performs about half as fast. The gap is *100/54*, not *100 - 54*.
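The two percentages measure in opposite directions, which is where the confusion comes from; a minimal sketch of the arithmetic:

```python
def pct_slower(fps_b, fps_a):
    """How much slower card B is than card A, in percent."""
    return (1 - fps_b / fps_a) * 100

def pct_faster(fps_a, fps_b):
    """How much faster card A is than card B, in percent."""
    return (fps_a / fps_b - 1) * 100

# Card B runs at 54% of card A's performance:
print(round(pct_slower(54, 100), 1))  # -> 46.0 (B is 46% slower than A)
print(round(pct_faster(100, 54), 1))  # -> 85.2 (A is ~85% faster than B)
```

So "46% slower" and "85% faster" describe the same gap; the error is only in mixing up which card is the baseline.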


----------



## Auer (Sep 17, 2020)

R0H1T said:


> Vega was also on an inferior node & admittedly an inferior uarch, that compounded their problem. But what's clear that there's still *bias against AMD* & towards Nvidia, JHH can launch a proverbial turd (remember Fermi?) & still get accolades while *AMD not only has to please the (gaming) audience but also pay them to do so*
> 
> I'm willing to bet even if AMD comes within 5~15% of Nvidia's performance & perf/W lots of users will still want them to be a lot cheaper!


Well, if it's slower it should be cheaper, no?


----------



## R0H1T (Sep 17, 2020)

A lot, i.e. cheaper by more than that 5~15%, incommensurate with their performance.


----------



## mechtech (Sep 17, 2020)

birdie said:


> It's amazing you've just signed up to spoil the launch with your very valuable opinion. Also, remember, we live in a free society, so please stop it with "overpriced". Did NVIDIA put a gun against your head and ask you to buy any of their GPUs? No? Then how on Earth are they overpriced? Also, I agree, "*0GB* VRAM is very low... for 4k". Except this card features 10GB and we have close to zero games which actually require more than 8GB of VRAM at 4K. Also read the rest of my comment.
> 
> *Speaking of "planned obsolescence" due a lack of VRAM:*
> 
> ...



Most people whose budget is in the GTX 1060 range are also people whose monitor budget is in the 1080p range. Also, every game is different; look at The Witcher 3.








The Witcher 3: Performance Analysis (www.techpowerup.com): In this article, we put the GTX Titan X, R9 295X2, GTX 980, R9 290X, GTX 970, R9 290, GTX 960, and R9 285 through The Witcher 3: Wild Hunt at 1600x900, 1920x1080, 2560x1440, and 4K to assess what hardware you need to play this recently released top title.
If all I had was a 1080p and a small budget, I wouldn't get more than 4GB, and maybe even a 2GB card since a GPU's life cycle is so short anyway, compared to that of a monitor which is usually 10 years.


----------



## nguyen (Sep 17, 2020)

R0H1T said:


> Vega was also on an inferior node & admittedly an inferior uarch, that compounded their problem. But what's clear that there's still *bias against AMD* & towards Nvidia, JHH can launch a proverbial turd (remember Fermi?) & still get accolades while *AMD not only has to please the audience but also pay them to do so*



RTG just reaps what it sowed; the market will respond when RTG makes a good product like Ryzen.
Vega was too inefficient.
Navi is too expensive (compared to Polaris), and the 5500 XT and 5600 XT are like sacrificial pawns.
Not to mention the state of their drivers and hardware compatibility. Well, you can blame AMD for that, because the R&D funding was pooled into CPU development; that was also why Raja left.


----------



## moproblems99 (Sep 17, 2020)

nguyen said:


> that was also why Raja left.



We'll find out soon if that matters.


----------



## Caring1 (Sep 17, 2020)

WeeRab said:


> There's no way i'm buying ANYTHING from someone who wears a leather jacket in the kitchen.


Would you rather he wore only an apron?


----------



## mechtech (Sep 17, 2020)

W1zz, what is the BIOS version for that EVGA Z390 motherboard?


----------



## R0H1T (Sep 17, 2020)

nguyen said:


> RTG just reap what they sown, the market will response when RTG make a good product like Ryzen.


Oh sure, when AMD fucks up it's their fault, and when they don't it's still their fault  

Let's see: how many GPUs or series (uarchs) do you think they didn't botch in the last 5 years?


nguyen said:


> well you can blame AMD for that because the *R&D funding was pooled into CPU* developments


For good reason; going *EPYC* not only helped them stay afloat but also has them leading the top-end CPU market on virtually all platforms.


----------



## xorbe (Sep 17, 2020)

Looks good, but I am uneasy with that much wattage for a video card. The 3070 is supposed to be 220W? And only 8GB; why would anyone buy 220W of GPU firepower and only get 8GB?


----------



## z1n0x (Sep 17, 2020)

Too much hype, man, too much hype; nothing can live up to overhype. (OT) CP2077 will suffer the same fate.

OMG, all those CUDA cores, forgetting the need for a balanced architecture that can feed all those execution units. Remember Vega 64 vs GTX 1080: 4K vs 2.5K ALUs.

It's a good generational jump.

Nvidia shouldn't have pushed it beyond the efficiency curve. Also, 10GB for a 4K card? Yeah, no.


----------



## W1zzard (Sep 17, 2020)

chispy said:


> do you think with v.mod like shunt mods i have done on many cards will help with overclocking headroom


most certainly



jellyrole said:


> Is there any plan to add the Call of Duty series back into reviews any time soon?


No plans, bnet banned me for trying to benchmark the game, and their support didn't want to admit it and chose to ignore my ticket for 3 months.
Always online sucks, because they will force perf-changing patches on you whenever it's most inconvenient.


----------



## Dyatlov A (Sep 17, 2020)

If it has a manual power limit adjustment range of up to 370 W, from a default of 320 W, is it also possible to lower max power consumption, for example to 250W?


----------



## Xex360 (Sep 17, 2020)

While the performance is impressive (finally a true 4K card), and now that it's confirmed to run 4K120 HDR on LG OLEDs, this combination is just phenomenal.
But the card is way too expensive; the $700 is just fake (I assume it doesn't include tax), and elsewhere it's much more expensive (even in Asia, where the card is built, it can cost $1000 or more). Why not use pricing similar to the consoles' MSRP? The PS5 and Series X cost around the same in Europe, North America, and Japan.
At the end of the day: great performance ruined by stupid pricing (I'm talking about Nvidia's MSRP).


----------



## nguyen (Sep 17, 2020)

Xex360 said:


> While the performance is impressive finally a true 4k card and now that it's been confirmed that on LG OLEDs it works at 4k120HDR, this combination is just phenomenal.
> But the card is way too expensive, the 700$ is just fake (I assume it doesn't include tax) elsewhere it's much more expensive (even in Asia where the card is built it can cost up to 1000$ or more), why not use a similar pricing as consoles (MSRP), the PS5 and Series X cost around the same in Europe, north America and Japan.
> At the end of the day, great performance ruined by stupid pricing (I'm talking about nVidia MSRP).



You are mixing up supply and demand there; retailers just like to jack up prices when stuff is in short supply.
Just wait a few months for the price to settle; PS5/XBX prices will be inflated when they first launch too.


----------



## wolf (Sep 17, 2020)

Dyatlov A said:


> If it has a manual power limit adjustment range of up to 370 W, from a default of 320 W, is it also possible to lower max power consumption, for example to 250W?


There are a couple of links on page 9; it seems this is indeed possible and the results are promising. One site moved the slider down to an 83% power limit to match the 270W draw of a 2080 Ti, and it only resulted in a ~4% performance loss. Not bad at all! The hypothesis is that Nvidia had to push the GPU beyond its optimal efficiency sweet spot to hit the performance target they were seeking.
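Plugging in the numbers from that example (an 83% power limit, 270 W, roughly 4% performance loss versus the 320 W stock limit) shows how much the perf/W ratio improves; these figures are approximations taken from the linked reviews:

```python
def perf_per_watt_change(perf_ratio, watts_limited, watts_stock):
    """Percent change in perf/W after power limiting.
    perf_ratio is limited performance as a fraction of stock (e.g. 0.96)."""
    return ((perf_ratio / watts_limited) / (1.0 / watts_stock) - 1) * 100

# ~96% of stock performance at 270 W instead of 100% at 320 W:
print(round(perf_per_watt_change(0.96, 270, 320), 1))  # -> 13.8
```

In other words, giving up about 4% of the frames buys roughly a 14% better perf/W, which is consistent with the card being pushed past its efficiency sweet spot at stock.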


----------



## laszlo (Sep 17, 2020)

If they had produced it at TSMC on 7nm, it would have had the projected leap and would run cooler too. Price-wise, no matter how I look at it, I'll never pay more than 300 for a card, so good purchase to those who buy it!


----------



## Taraquin (Sep 17, 2020)

W1zzard said:


> That's because lower resolutions are CPU limited.
> 
> Edit: Actually you're making a great point. I'm using the same 303 W typical power consumption value from the power measurements page on all 3 resolutions, which isn't 100% accurate. Because it's some games are CPU limited, then in those games the power consumption is down, too, which I'm not taking into account


Perhaps all GPU power-consumption testing should be done at 4K, so we get a true idea of how much power a card uses when it's not CPU-limited?


----------



## Xex360 (Sep 17, 2020)

nguyen said:


> You are mixing up supply and demand there, retailers just like to jack up the price when stuffs are in short supply.
> Just wait few months when price settles, PS5/XBX price will be inflated when they first launch too.


No I'm not; this is Nvidia's MSRP. For instance, they price the 3080 at $699 in the US but at $1000+ in Japan; the prices are from their website.
So why didn't they price the cards at $699 or thereabouts everywhere?
The consoles have around the same MSRP in most markets.


----------



## hkpolice2 (Sep 17, 2020)

Under the overclocked performance testing, what settings were used in Unigine Heaven?

Thanks


----------



## gridracedriver (Sep 17, 2020)

Stefem said:


> It isn't NVIDIA's fault if you can't read a chart, it's 1.9x at the same performance...


Of course I had noticed it; on Tom's IT I had also pointed it out to everyone. But they still declared 1.5x at the same power draw! See the picture.
Well, where is this 1.5x in performance/watt?
A false declaration; there is no excuse.


----------



## DuxCro (Sep 17, 2020)

13-24% performance increase over the 2080 Ti, depending on resolution, with 28% higher TDP. Don't bother selling your 2080 Ti.


----------



## londiste (Sep 17, 2020)

Xex360 said:


> No I'm not, this is the MSRP of nVidia, for instance they price the 3080 at 699$ in the US but 1000$+ in Japan the prices are from their website.
> So why they didn't price the cards at 699$ or thereabouts everywhere?
> The consoles are around the same MSRP in most markets.


Consoles are not quite at the same MSRP either. For example, the recently announced price of the Xbox Series X is $499/€499/£449 and ¥49,980. Compared to the US price, the Japanese price is reduced, as it is a difficult market for Xbox, and the EU/GB prices reflect included taxes.

There are a number of reasons to vary MSRP across different regions or countries - taxes are a big one and marketing position is another.



DuxCro said:


> 13-24% performance increase over 2080Ti. Depending on resolution. With 28% higher TDP. Don't bother selling your 2080Ti.


15-31%. The RTX 3080 is considerably more CPU-limited at 1440p.


----------



## Dyatlov A (Sep 17, 2020)

wolf said:


> There are a couple of links on page 9, it seems that this is indeed possible and the results are promising. One site went down the slider to 83% max power limit to match the wattage draw of a 2080Ti at 270w, but it only resulted in a ~4% performance loss, not bad at all! the hypothesis is that Nvidia have had to push the GPU beyond the optimal efficiency sweet spot to hit the performance target they were seeking.



Combine that with undervolting and maybe there's no performance loss at all.


----------



## nguyen (Sep 17, 2020)

Xex360 said:


> No I'm not, this is the MSRP of nVidia, for instance they price the 3080 at 699$ in the US but 1000$+ in Japan the prices are from their website.
> So why they didn't price the cards at 699$ or thereabouts everywhere?
> The consoles are around the same MSRP in most markets.



The US and Japan do not include VAT in their listed prices; everywhere else does, LOL. Right now it's tough to avoid tax when buying online in the US, so yeah, please add 8-12% VAT depending on where you live.
If you can buy online from Nvidia's website without VAT, go right ahead; it's a mighty good deal.


----------



## Valantar (Sep 17, 2020)

nguyen said:


> Just check the relative performance of 2080 Ti compare to 3080 across 3 resolution and you will see a change
> 1080p: 87%
> 1440p: 81%
> 4K:        75%
> ...


...if those numbers show anything, it's that there are no CPU bottlenecks at 4k, given that it scales beyond the lower resolutions. Which is exactly what I was pointing out, while @birdie was claiming that this GPU is CPU limited even at 4k:


birdie said:


> quite a *lot of games being reviewed are severely CPU limited even at 4K*!


Which I then asked for some data demonstrating, as this review fails to show any such bottleneck. And, as GN pointed out in their excellent video review, not all seeming CPU limitations are actual CPU limitations - some examples of poor GPU scaling are down to the architecture or layout of the GPU causing GPU bottlenecks that don't show up as 100% load.


Vya Domus said:


> That's a strange way to think about it, if it is possible to exceed that amount then it inevitably becomes a limitation.


Only if there is a perceptible difference in graphical quality - without that, the difference is entirely theoretical. And that's the entire point: a lot of games have extremely "expensive" Ultra settings tiers with near imperceptible or even entirely imperceptible differences in quality. If your benchmark for not having a VRAM limitation is the ability to enable all of these, then your benchmark is problematic. If the thing you're worried about for this GPU is that you might at some point in the future need to lower settings an imperceptible amount to maintain performance, then ... what are you worried about, exactly? Stop promoting graphical quality nocebo effects, please. Because at that point, all you are arguing for is the value/security of knowing your GPU can handle everything set to Ultra, no matter what this actually means for the quality of the game. Which is just silly. Not only is it an impossible target, but it's a fundamentally irrational one.


----------



## FinneousPJ (Sep 17, 2020)

I don't know guys. The 2080 Ti is 19% slower on average in these tests. That doesn't seem super impressive to me -- especially compared to the hype.


----------



## Xex360 (Sep 17, 2020)

londiste said:


> Consoles are not quire around the same MSRP either. For example, the recently announced price of Xbox Series X is $499/499€/£449 and 49980 yen. Compared to US price Japanese price is reduced as it is a difficult market for Xbox and EU/GB prices reflect included taxes.
> 
> There are a number of reasons to vary MSRP across different regions or countries - taxes are a big one and marketing position is another.
> 
> 15-31%. RTX3080 is considerably more CPU limited on 1440p.


Yeah, but the difference is slight for consoles, while here we are talking about more than a 40% increase over the official MSRP.


----------



## webdigo (Sep 17, 2020)

Auer said:


> If you're truly worried about the power consumption, play games on your phone.



Some people actually like a quiet system. The higher the power draw, the higher the PSU fan's RPM.


----------



## Anymal (Sep 17, 2020)

You just need to pay more for a silent PC.


----------



## nguyen (Sep 17, 2020)

Valantar said:


> ...if those numbers show anything, it's that there are no CPU bottlenecks at 4k, given that it scales beyond the lower resolutions. Which is exactly what I was pointing out, while @birdie was claiming that this GPU is CPU limited even at 4k:



Here you go, this is from 3080 review






Here is from 10900K vs 3900XT





This game probably shows the biggest difference, at 10%; other games show that a 10900K or 3900XT can improve the 3080's FPS by a few percent. Overall, the standard testing rig shows signs of a CPU bottleneck even at 4K.


----------



## Valantar (Sep 17, 2020)

nguyen said:


> Here you go, this is from 3080 review
> 
> 
> 
> ...


Exactly! I am really not the person you need to be showing this to.



Testsubject01 said:


> Well, I wouldn't completely mark it down to unwillingness.
> Getting a PC + Display to enjoy all the bells and whistle settings you had on 1080p at resolutions of 1440p @120+ or 2160 @60 without selling organs, just started to get feasible now.
> 
> 
> Let's hope RDNA2 delivers as well in October and this trend continues.


Put it this way: if you're willing to pay $700 for a GPU (no matter the generation) yet unwilling to pay more than ~$300 for your monitor, that is ... wait for it ... unwillingness. There have been GPUs perfectly capable of handling >60fps @1440p for years below $700, and there have been 1440p>60Hz monitors at relatively reasonable prices for just as long. And while I'm in no way denying the effect of high refresh rates on perceived visual quality or the gaming experience as a whole, unless one is a die-hard esports gamer I would say 1440p120 beats 1080p240 every time.


----------



## Assimilator (Sep 17, 2020)

All the people crying "herp derp it's only whatever % faster than 2080 Ti" are missing the point entirely: RTX 3080 is the first graphics card that gets proportionally faster than its predecessors *at higher resolutions*. That is _nucking futs_!

Not to mention that RTX is finally able to render at framerates almost matching rasterised performance (or exceeding it - see Control @ 4K RTX). Yes it still requires DLSS to get there, but the fact that this was literally not even a possibility *two years ago* (before Turing launch) is just mindblowing. The next generation will almost certainly bring performance parity in RTX and rasterised rendering, and that's insane: we will have gone from "ray-traced rendering is an academic novelty" to "ray-traced rendering is the standard" in probably half a decade.

This is the world's first true 4K graphics card and NVIDIA should be congratulated for it. My only question is, what happens to us non-4K plebs now? Because NVIDIA is building GPUs that are literally too powerful for the rest of our systems!


----------



## efikkan (Sep 17, 2020)

birdie said:


> BTW, do you remember Radeon R9 Fury? Was released with paltry 4GB of VRAM which was strangely enough for AMD fans.


Back with the 200/300 series, 8 GB was important for future-proofing, but for Fury 4 GB was plenty due to HBM? 
Those who make these anecdotes can't even be consistent. People should instead look at thorough reviews, those will reveal any memory problems. And when cards truly run out of VRAM, it will be an unplayable nightmare, not just 5-10% performance drop.


----------



## WeeRab (Sep 17, 2020)

Caring1 said:


> Would you rather he wore only an apron?


Joking, of course.
Although I can't stand the man... this product is mega good. And although I don't game as much as I used to, the latest gen may have me reaching for my wallet.
I'll wait for AMD's response, Big Navi, but I honestly can't see them beating, or even coming close to, Nvidia this time.


----------



## M2B (Sep 17, 2020)

efikkan said:


> Back with the 200/300 series, 8 GB was important for future-proofing, but for Fury 4 GB was plenty due to HBM?
> Those who make these anecdotes can't even be consistent. People should instead look at thorough reviews, those will reveal any memory problems. And when cards truly run out of VRAM, it will be an unplayable nightmare, not just 5-10% performance drop.



It depends on the game/engine I think.
Some games/engines handle the slight or modest VRAM limitations really well and don't ruin the user experience.


----------



## birdie (Sep 17, 2020)

*A 20GB version of the RTX 3080* will likely be launched in the next two months:

NVIDIA 'teases' GeForce RTX 3080 with 20GB memory (videocardz.com): NVIDIA hints that an RTX 3080 with a different memory layout could come out later. It is no secret that NVIDIA is also preparing a 20GB model of the GeForce RTX 3080 graphics card. It has been speculated about for weeks now, and we had no problems confirming this...


----------



## Valantar (Sep 17, 2020)

Assimilator said:


> All the people crying "herp derp it's only whatever % faster than 2080 Ti" are missing the point entirely: RTX 3080 is the first graphics card that gets proportionally faster than its predecessors *at higher resolutions*. That is _nucking futs_!
> 
> Not to mention that RTX is finally able to render at framerates almost matching rasterised performance (or exceeding it - see Control @ 4K RTX). Yes it still requires DLSS to get there, but the fact that this was literally not even a possibility *two years ago* (before Turing launch) is just mindblowing. The next generation will almost certainly bring performance parity in RTX and rasterised rendering, and that's insane: we will have gone from "ray-traced rendering is an academic novelty" to "ray-traced rendering is the standard" in probably half a decade.
> 
> This is the world's first true 4K graphics card and NVIDIA should be congratulated for it. My only question is, what happens to us non-4K plebs now? Because NVIDIA is building GPUs that are literally too powerful for the rest of our systems!


While a lot of what you say here is true, the first point needs some moderation: this is mostly due to CPU limitations and/or architectural bottlenecks preventing some games scaling to higher FPS numbers, not due to 4k performance itself increasing more than lower resolution performance in a vacuum. Of course we can't get faster CPUs than what exists today, nor can we magically make game engines perform better or the Ampere architecture re-balance itself for certain games on demand, so it is what it is in terms of performance - but you are arguing as if this was caused by the GPU alone rather than these contextual factors.

RTX performance with DLSS looks excellent, and I'm definitely looking forward to how this in combination with RT-enabled consoles will affect games going forward. The big questions now become not only whether AMD is able to match Nvidia's RT performance, but if they can provide any type of alternative to DLSS.

Also, this is a perfectly suitable 1440p120 or 1440p144 GPU. No need for 4k, though it's obviously also great for that.

One question though: going by the meager perf/W increases, was anything beyond die sizes on 12nm stopping Nvidia from delivering this kind of performance with Turing?



birdie said:


> *A 20GB version of the RTX 3080* will likely be launched in the next two months:
> 
> 
> 
> ...


Ouch, that's going to piss off a lot of early buyers. Not that there's any reason to - those extra 10GB are most likely going to be entirely decorative for the lifetime of the GPU - but people tend not to like being presented "the best product ever, go buy it now" and then having it superseded in just a few months by the same company.


----------



## Fourstaff (Sep 17, 2020)

What a beast of a card. Hope it drives the prices of all other graphics card downwards so 1080p people like me get some trickle benefits.


----------



## Xex360 (Sep 17, 2020)

webdigo said:


> Some people actually like a quiet system. The higher the power draw, the higher the PSU fan's RPM.


I second that, I don't use headphones so quietness is quite important.


----------



## Assimilator (Sep 17, 2020)

Xex360 said:


> I second that, I don't use headphones so quietness is quite important.



More like QUIET important, amirite???

On a more serious note - I don't use headphones either, my PC is under 1m away from my ears, and I never hear it. Fan stop under a certain temperature threshold for GPUs and PSUs is amazing.


----------



## robal (Sep 17, 2020)

W1zzard said:


> I doubt anyone will buy that card for 1080p


I think I will....
But that's only because I'm a weird case:
- gaming on a 1080p@120Hz projector  (there are no 4k projectors with >60Hz and/or low input lag yet)
- I love eye-candy. Details at max. Vsync ON.

What I need is GPU capable of consistent fps above 120 with max eye candy, Vsync and RTX ON.
RTX-3080 may still be an overkill, but...  "will it run Crysis ?"


----------



## montana013 (Sep 17, 2020)

I prefer to wait for the 20 GB RTX or AMD's next 16 GB graphics card. When you buy an $800 product to keep for five or six years, it's reassuring to know it is more or less future-proof. PC games are developed to suit the consoles, and therefore for 16 GB of VRAM; there is a good chance the developers won't bother and will simply transpose that to the PC's maximum settings.


----------



## lesovers (Sep 17, 2020)

OK, now, from the performance summary, let's see if this is more of an upgrade than 1080 to 2080S was:

*1080P:*
1080 to 2080S -> +45%
2080S to 3080 -> +30%

*1440P:*
1080 to 2080S -> +56%
2080S to 3080 -> +43%

*4K:*
1080 to 2080S -> +60%
2080S to 3080 -> +56%

So 1080 to 2080S was a better upgrade than 2080S to 3080 at all three resolutions. Interesting!


----------



## londiste (Sep 17, 2020)

lesovers said:


> OK, now, from the performance summary, let's see if this is more of an upgrade than 1080 to 2080S was:


Did you choose 2080S just so it would illustrate your point?

The main generations - GTX1080 is from 2016, RTX2080 is from 2018, 3080 is 2020.
S was a mid-lifecycle update to Turing. 1080 got some update as well but that was more minor and didn't take off.


----------



## Auer (Sep 17, 2020)

webdigo said:


> Some people actually like a quiet system. The higher the power draw, the higher the PSU fan's RPM.


Well of course, I do as well. But quality hardware can handle that. 
My Fractal Ion Platinum PSU is dead quiet even under heavy loads, In a windowless Fractal Define Nano S case.


----------



## lesovers (Sep 17, 2020)

londiste said:


> Did you choose 2080S just so it would illustrate your point?
> 
> The main generations - GTX1080 is from 2016, RTX2080 is from 2018, 3080 is 2020.
> S was a mid-lifecycle update to Turing. 1080 got some update as well but that was more minor and didn't take off.


The current replacement cards from 2019 are all Supers, so why not? They should have given us this performance in 2018.

Here are the numbers with the non-Super 2080 cards:
*1080P:*
1080 to 2080 -> +39%
2080 to 3080 -> +35%

*1440P:*
1080 to 2080 -> +47%
2080 to 3080 -> +51%

*4K:*
1080 to 2080 -> +50%
2080 to 3080 -> +67%

Almost even, and ahead at 4K, using the 2080, which was replaced after only nine months by the 2080S Super cards.
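For anyone who wants to check the arithmetic, here it is in a few lines of Python. The performance-index values are illustrative placeholders chosen to reproduce the percentages above, not TechPowerUp's actual averages:

```python
# Generational uplift from average performance indices (GTX 1080 = 100).
# NOTE: the index values are illustrative placeholders, not review data.
perf_index = {
    "GTX 1080": {"1080p": 100, "1440p": 100, "4K": 100},
    "RTX 2080": {"1080p": 139, "1440p": 147, "4K": 150},
    "RTX 3080": {"1080p": 188, "1440p": 222, "4K": 250},
}

def uplift(old: str, new: str, res: str) -> float:
    """Percent gain of `new` over `old` at resolution `res`."""
    return (perf_index[new][res] / perf_index[old][res] - 1) * 100

for res in ("1080p", "1440p", "4K"):
    print(f"{res}: 1080->2080 {uplift('GTX 1080', 'RTX 2080', res):+.0f}%, "
          f"2080->3080 {uplift('RTX 2080', 'RTX 3080', res):+.0f}%")
```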


----------



## Chrispy_ (Sep 17, 2020)

Finally had time to catch up on this review.

It's impressive, in the same way that the AMD Radeon Fury was impressive - it brute-forced its way past the previous generation but at great cost.

The 3080 is overshadowed by several glaring issues for anyone who has been following:

- It's not the 90% faster that Nvidia claimed. That's barely realistic even in RTX- and DLSS-optimised titles that have been carefully hand-tuned through Nvidia/developer collaboration, and many new games won't get that preferential treatment.
- 10GB of RAM. Yeah, it's enough for now, but it's not 'plenty', and it won't necessarily be enough for that much longer.
- That power draw. Even though the cooler is quiet enough, that heat has to go somewhere. Your poor CPU, RAM, motherboard, drives, and armpits are going to pay for it. Whatever's in the XBSX and PS5 can't possibly be as power hungry, because their designs and power bricks simply can't handle a 350W GPU.
- RDNA2 is looming ominously, and this has failed to meet the overblown claims. Rushed out early to maximise sales at the current price before Big Navi lands? Nvidia must be working closely with game devs and insiders who also have hands-on time with the consoles, so regardless of confidentiality agreements, I'm sure someone has been coerced into leaking info to Nvidia about the RDNA2 in the upcoming consoles.


----------



## ppn (Sep 17, 2020)

Uh oh, we are comparing the GA102 die to TU104, which is wrong; compare it to TU102: +50% in-game performance for +50% transistors.

Next is 54,000 Mtr in 2022, per Moore's law.

So this is just another GTX 780: an xx102-based xx80 card. As good as the 3080 might seem, it is not a 5-6 year investment; in 10-12 months it will be replaced by a 4070, or even a 4060, if the next gen brings a leap as strong as 960 to 1060. The 3080 is just a glorified 4060 with double the power consumption, and if you buy it now, you have to keep dragging it along for years or sell it at a huge loss.


----------



## Colddecked (Sep 17, 2020)

Assimilator said:


> This is the world's first true 4K graphics card and NVIDIA should be congratulated for it. My only question is, what happens to us non-4K plebs now? Because NVIDIA is building GPUs that are literally too powerful for the rest of our systems!



Jack up the render scale and AA and mine eth while playing for us 1080p 144hz plebs that want this card lol...


----------



## W1zzard (Sep 17, 2020)

hkpolice2 said:


> Under the overclocked performance testing, what settings were used in Unigine Heaven?


Custom settings, custom scene, nothing you can reproduce


----------



## Arcdar (Sep 17, 2020)

And it went INSTANTLY from "notify me" to "out of stock"... quite literally. I was refreshing the page around release time; 30 seconds beforehand it still showed the "notify me" button, and then, after a sip of my coffee before refreshing again, it showed "out of stock"...

GG Nvidia... I guess scalpers used their bots again, and the cards will pop up on eBay and co. over the next few days...

Really well done... *sigh*


----------



## mouacyk (Sep 17, 2020)

common unicorns


----------



## Chomiq (Sep 17, 2020)

"Now available"


----------



## Arcdar (Sep 17, 2020)

Chomiq said:


> "Now available"
> View attachment 168984



It was like that already a few seconds after release. I guess no human being ever saw the purchase button; only bots did.

I was actually waiting, and from "notify me" to "out of stock" was literally a sip of my coffee... I refreshed, it wasn't ready to buy yet, sipped my coffee, pressed F5, out of stock...


Really not funny from NVidia, whatever the reason for it is.


----------



## Chomiq (Sep 17, 2020)

Arcdar said:


> It was like that already a few seconds after release. I guess no human being ever saw the purchase button; only bots did.
> 
> I was actually waiting, and from "notify me" to "out of stock" was literally a sip of my coffee... I refreshed, it wasn't ready to buy yet, sipped my coffee, pressed F5, out of stock...
> 
> ...


I guess it's a mix of limited supply + miners.


----------



## Testsubject01 (Sep 17, 2020)

Valantar said:


> Put it this way: if you're willing to pay $700 for a GPU (no matter the generation) yet unwilling to pay more than ~$300 for your monitor, that is ... wait for it ... unwillingness. There have been GPUs perfectly capable of handling >60fps @1440p for years below $700, and there have been 1440p>60Hz monitors at relatively reasonable prices for just as long. And while I'm in no way denying the effect of high refresh rates on perceived visual quality or the gaming experience as a whole, unless one is a die-hard esports gamer I would say 1440p120 beats 1080p240 every time.



I mentioned 1440p@120+ and 2160p@60 because that is most likely why someone was willing to go for a 700€+ card (actually much more, given Turing's pricing).
Further, the reference point was 1080p@144+ with all the jazz (panel type, proper HDR, colour space, contrast, backlighting, etc.), which is indeed readily available for ~200-300€.
Add to that a beefy system to max everything at 1080p at those framerates, ~800 to 1000€. All said and done, 1080p goodness for ~1000 to 1500€ (or just a single AIB RTX 2080 Ti).

1440p@120+ and 2160p@60 displays with the same properties run from ~500€ to "the sky is the limit"€; add a system to match, and it's easily an additional 2500€+.
Even with the RTX 2080 Ti, you still had to fiddle with settings to really enjoy those resolutions on such displays in current titles and some from past years.

Pascal didn't have the horsepower; Turing was horridly overpriced (while also still somewhat lacking in performance).
Only now has Ampere (and hopefully RDNA2), with a sufficient performance envelope as well as a more reasonable price point, made it feasible to jump to 1440p@120+ or 2160p@60 without compromising on anything you had at 1080p.

In the past it was just a pricey compromise, a trend which is now changing and hopefully gains more traction; that's why I questioned the term "unwillingness" in regard to 1080p users.
I totally agree that 1080p is getting rather long in the tooth. As for the pace, tech went from 4:3 resolutions to 16:9 720p, then 1080p, with an excursion to ultra-wide and "ultra-ultra"-wide.


----------



## Colddecked (Sep 17, 2020)

Chrispy_ said:


> Finally had time to catch up on this review.
> 
> It's impressive, in the same way that the AMD Radeon Fury was impressive - it brute-forced its way past the previous generation but at great cost.
> 
> ...





Chomiq said:


> I guess it's a mix of limited supply + miners.



It's resellers.


----------



## EarthDog (Sep 17, 2020)

Arcdar said:


> It was like that already a few seconds after release. I guess no human being ever saw the purchase button; only bots did.
> 
> I was actually waiting, and from "notify me" to "out of stock" was literally a sip of my coffee... I refreshed, it wasn't ready to buy yet, sipped my coffee, pressed F5, out of stock...
> 
> ...


I had some in my cart within the first minute. But by the time the cart loaded it was already sold out. I was chasing the tail the entire time, lol.

It's too bad we got shafted by a partner or we would have had one for review. Piiiiiiiiiiiised.


----------



## Dyatlov A (Sep 17, 2020)

Shit, cannot buy...


----------



## Xex360 (Sep 17, 2020)

Assimilator said:


> More like QUIET important, amirite???
> 
> On a more serious note - I don't use headphones either, my PC is under 1m away from my ears, and I never hear it. Fan stop under a certain temperature threshold for GPUs and PSUs is amazing.


I think you confuse quiet and quite. Damn sometimes English is so confusing.
I sit closer to my PC, but if I undervolt my hopeless 5700 it's very "quiet".
Yea fan stop is great, when I'm not using the GPU the case is basically silent.


----------



## Valantar (Sep 17, 2020)

Xex360 said:


> I think you confuse quiet and quite. Damn sometimes English is so confusing.
> I sit closer to my PC, but if I undervolt my hopeless 5700 it's very "quiet".
> Yea fan stop is great, when I'm not using the GPU the case is basically silent.


It was a pun.


----------



## Xex360 (Sep 17, 2020)

Valantar said:


> It was a pun.


I guess what remains of my neurons is in need of sleep.


----------



## Fluffmeister (Sep 17, 2020)

EarthDog said:


> I had some in my cart within the first minute. But by the time the cart loaded it was already sold out. I was chasing the tail the entire time, lol.
> 
> It's too bad we got shafted by a partner or we would have had one for review. Piiiiiiiiiiiised.



Yeah can't even access some online shops, they are getting hammered.


----------



## Assimilator (Sep 17, 2020)

Fluffmeister said:


> Yeah can't even access some online shops, they are getting hammered.



Remember all the AMD fanboys saying NVIDIA is scared of RDNA2? Turns out AMD fanboys aren't the ones killing e-shops trying to buy NVIDIA's latest GPU, whodathunkit?


----------



## Deleted member 24505 (Sep 17, 2020)

i feel sorry for you all


----------



## TheoneandonlyMrK (Sep 17, 2020)

It was worse than I expected - the launch sell-out, that is. Wow.


----------



## MAXLD (Sep 17, 2020)

> "I have to applaud NVIDIA for including a 12-pin to 2x 8-pin adapter with every Founders Edition card as it ensures people can start gaming right away without having to hunt for adapter cables of potentially questionable quality."



For all the positive things this card has, I don't think that's something that deserves any kind of applause... Without nVidia providing an adapter, every single buyer of a reference Ampere card would have bought a solid, useless brick (and a very expensive one at that), because there would have been no cables to connect to it at launch. What would nVidia expect? People buying the card en masse at launch, then having to wait until they could buy, or request from their PSU brands, a completely new cable design that serves only nVidia reference cards? (They'll make them from now on, but mostly for customers with PC-build aesthetics in mind.) The included custom, non-standard adapter is therefore something that is... "expected" (to say the least) to ship with their own new cards.

The custom 12-pin is a problem they created themselves, exclusively to serve their own reference PCB design decisions... It's not an established (or even recent) standard that has shipped with PSUs for some time. It's not even something technically needed for the card to work versus standard PCI-E connectors; it's an FE design decision (and nothing AIBs have any need to adopt). For what it's worth, AIB Ampere cards just proved nVidia didn't need any of that V-shaped PCB stuff, and consequently none of that non-standard 12-pin connector: all they needed was a proper, normal cooler design, a normal PCB, and normal PCI-E connectors. But they did it anyway, because they wanted to (and there's nothing wrong with that, really); they just have to provide the means for buyers to connect and use what they're buying.

That's like applauding a CPU cooler brand for including the proprietary bracket and motherboard backplate that serve their own new retention system... You don't applaud them for that; you expect and demand that they include their completely new custom parts in the box, so you can actually... use the product you just bought from them. I mean, if you're designing your own system to make your product "work" in a way you just made up that isn't compatible with the standard brackets from AMD/Intel, why would you ship your product without the critical custom part that is 100% needed for its most basic function (since there's no other way to get it when the cooler launches)?

(anyway, regardless, great review, I'm just baffled by that part)


----------



## mouacyk (Sep 17, 2020)

theoneandonlymrk said:


> It was worse than I expected, the launch sell out that is, wow.


This hobby is only going to get worse. The market's money/sense ratio keeps increasing. COVID stimulus money doesn't help the situation either (for those who don't NEED it, mind you).


----------



## TheoneandonlyMrK (Sep 17, 2020)

mouacyk said:


> This hobby is only going to get worse. The market's money/sense ratio keeps increasing. COVID stimulus money doesn't help the situation either (for those who don't NEED it, mind you).


I think many would just make the money available - that's what I do, bills and food be damned - but they can't if the cards aren't available.


----------



## KainXS (Sep 17, 2020)

I tried to buy one as soon as they went live; after 10 minutes, all I could hear was










I never even saw the chance to buy it at nvidia's site


----------



## Caring1 (Sep 17, 2020)

Arcdar said:


> It was like that already a few seconds after release. I guess no human being ever saw the purchase button. Only bots did.
> 
> I was actually waiting and from "notify me" to "out of stock" was literally a sip from my coffee .... I refresehd, not Ready to buy yet, supped my coffee, pressed F5, out of stock ….
> 
> ...


On the bright side, you can now wait that little bit longer and buy the 20GB version when it's released.
At least then you won't feel ripped off.


----------



## Athlonite (Sep 17, 2020)

I just knew we here in Gougeland were going to be ripped a *BIG FAT* new one: $1,469.99 NZD to $1,798 NZD. The Asus TUF is the cheapest and the Asus Strix the most expensive; every other AIB partner card sits between those two prices. But it makes no difference: even if you could afford one, tough, they're out of stock.


----------



## Stefem (Sep 17, 2020)

gridracedriver said:


> Of course I had noticed it; on Tom's IT I had also pointed it out to everyone, but still they declared 1.5x at the same power draw! See the picture.
> Well, where is this 1.5x in performance/watt?
> The declaration is false; there is no excuse.


You are still not fully reading the chart: the 3080 is at the end of the green line, at 320W. You extrapolated 1.5x perf/W at the same power as a 2080 Super, but to validate this you would have to lower the 3080's power target and test with Control at 4K.
As with prior architectures, you gain a lot of efficiency just by lowering the power target, even if you don't want to create a custom voltage/frequency curve. An RTX 2080 Ti at the same power as a GTX 1080 is faster than a 1080 Ti at 280W, just to give you a picture; Pascal behaves similarly.
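To put rough numbers on it (the FPS and wattage figures below are illustrative assumptions, not measurements from any review):

```python
# Perf/W at different power targets for the same chip.
# The (fps, watts) pairs are made-up illustrative values, not measured data.
def perf_per_watt(fps: float, watts: float) -> float:
    return fps / watts

turing_stock   = perf_per_watt(60.0, 250.0)   # e.g. a Turing card at stock
ampere_stock   = perf_per_watt(100.0, 320.0)  # a 3080 at its full power target
ampere_limited = perf_per_watt(90.0, 250.0)   # the same 3080 power-limited to 250 W

# At its shipping power target the efficiency gain looks modest...
print(f"Ampere stock vs Turing: {ampere_stock / turing_stock:.2f}x perf/W")
# ...but at iso-power the claimed ratio is plausible.
print(f"Ampere @250W vs Turing: {ampere_limited / turing_stock:.2f}x perf/W")
```

The point being that perf/W claims only mean something at a stated power level; the same silicon produces very different ratios depending on where on the voltage/frequency curve it is run.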


----------



## Melvis (Sep 18, 2020)

Athlonite said:


> I just knew we here in Gougeland were going to be ripped a* BIG FAT *new one $1469.99NZD to $1798NZD the Asus TUF is the cheapest and the Asus Strix is the most expensive every other AIB partner cards is in between those two prices but makes no difference even if you could afford one then tuff they're outta stock



You're right! Here in Aussie land the cards are WAY overpriced: $1300 is the cheapest, and they should be priced more like $900. What a load of BS! That's even more expensive than a brand-new 2080 Ti, lol.


----------



## WeeRab (Sep 18, 2020)

Does it all matter?
 After all...No one can actually buy the thing.
It's all marketing hype at the moment...and I can guarantee it will never be available for £649 - EVER.


----------



## a_ump (Sep 18, 2020)

It's not bad, but "biggest gen performance leap ever"? Nvidia must have forgotten the 7 series to G80. The 8800 GTX was +40-90% in every game at 1600x1200, sometimes more than double the FPS. I expected almost 2x performance, but it came up WAY short. I'm disappointed, based on the false hype Nvidia planted with that quoted statement.


----------



## Fluffmeister (Sep 18, 2020)

WeeRab said:


> Does it all matter?
> After all...No one can actually buy the thing.
> It's all marketing hype at the moment...and I can guarantee it will never be available for £649 - EVER.



Pre-order now, they will honour it:

NVIDIA Graphics Cards | RTX 30 Series | GeForce GTX GPUs (www.ebuyer.com): Check out the brand new RTX 30 Series graphics cards at Ebuyer. We have a huge range of NVIDIA GeForce RTX and GTX GPUs at the best prices you'll find online.


----------



## gridracedriver (Sep 18, 2020)

Stefem said:


> You are still not fully reading the chart: the 3080 is at the end of the green line, at 320W. You extrapolated 1.5x perf/W at the same power as a 2080 Super, but to validate this you would have to lower the 3080's power target and test with Control at 4K.
> As with prior architectures, you gain a lot of efficiency just by lowering the power target, even if you don't want to create a custom voltage/frequency curve. An RTX 2080 Ti at the same power as a GTX 1080 is faster than a 1080 Ti at 280W, just to give you a picture; Pascal behaves similarly.


It is you who isn't reading the graph properly: if we extended the Turing line to 320W, Ampere's perf/watt advantage would be even higher than 1.5x.
They're lying to you, get over it!


----------



## Valantar (Sep 18, 2020)

gridracedriver said:


> It is you who isn't reading the graph properly: if we extended the Turing line to 320W, Ampere's perf/watt advantage would be even higher than 1.5x.
> They're lying to you, get over it!


The problem there? Good luck getting a 2080/2080 Super to draw 320W without running exotic cooling. At that point you'd be pushing clocks so high that you would be _seriously_ skewing any comparison. Both comparisons in the annotated graph above are more or less reasonable, in that they are power-draw and performance combinations reachable with existing silicon without exotic cooling or other modifications, a standard a theoretical 320W 2080/S fails to meet. The 1.9x number is less reasonable than the 1.5x one, though, as we will never, ever see a GA102 chip running at that low a power. Or, you know, maybe we will if Nvidia goes absolutely mental with their mobile chips this time around. But not on the desktop.


----------



## efikkan (Sep 18, 2020)

WeeRab said:


> Does it all matter?
> After all...No one can actually buy the thing.
> It's all marketing hype at the moment...and I can guarantee it will never be available for £649 - EVER.


Just because demand is greater than the supply doesn't mean no one can buy it, nor does it tell how many are shipping.

For the last couple of releases, short supply has been the norm. Most of you have forgotten it now, but both the GTX 1080 and RTX 2080 moved record volumes in their first quarter despite being "sold out" for 2-3 months. Other cards, like Vega 56/64, were mostly sold out for over a year, long after the mining mania, and still barely managed to show up in statistics like the Steam hardware survey, so they didn't sell many cards at all.

Despite the rumor that the MSRP was just there to trick people, even stores like Newegg have at least four partner cards at $699 (out of stock, of course).
If you look around, you can probably find a store that lets you put one on backorder at MSRP. I found several in my country; you can probably find some in yours too.

Don't be tempted to buy one at an inflated price. Put one on backorder or wait, you will get one eventually.


----------



## Flanker (Sep 18, 2020)

After reading the reviews of custom AIB versions, looks like the performance of the exotic looking FE cooler isn't all that special.


----------



## londiste (Sep 18, 2020)

Flanker said:


> After reading the reviews of custom AIB versions, looks like the performance of the exotic looking FE cooler isn't all that special.


How many of the AIB cards are dual-slot?


----------



## Athlonite (Sep 18, 2020)

londiste said:


> How many of the AIB cards are dual-slot?



As far as I know, none are dual-slot; they're all triple-slot solutions.


----------



## Valantar (Sep 18, 2020)

Athlonite said:


> As far as I know none are dual slot they're all triple slot solutions


True, though most would claim they are "2.x slot", which doesn't make much of a difference. It matters in SFF builds though, where a 2.25 or 2.5 slot card might fit but not a 2.75 or 3-slot one.


----------



## Deleted member 24505 (Sep 18, 2020)

any card that is 2.1+ may as well be classed as 3, as it will still nigh on take 3 slots


----------



## Valantar (Sep 18, 2020)

tigger said:


> any card that is 2.1+ may as well be classed as 3, as it will still nigh on take 3 slots


Yes, as I said, it


Valantar said:


> doesn't make much of a difference


outside of SFF builds. Anything exceeding 2 slots obviously blocks access to the third PCIe slot.


----------



## Deleted member 24505 (Sep 18, 2020)

Valantar said:


> Yes, as I said, it
> 
> outside of SFF builds. Anything exceeding 2 slots obviously blocks access to the third PCIe slot.



except i simplified it. any 2.x slot card may as well be three. you was babbling about 2.25/2.75 whatever the eff they are. Any card over 2 slots is a 3 slot card. simple


----------



## Bubster (Sep 18, 2020)

I have been a 4K-only gamer for more than 3 years, and the RTX 3080 has plenty of advantages: RTX IO, Broadcast for streamers, DLSS 2, and RTX On for better image quality (we don't know by how much yet; we need to see more footage to know for sure). RTX has been a gimmick for the last 2 years (remember the BF5 fiasco at the RTX launch), so we'll see if Nvidia has really made it worthwhile. The 3090, on the other hand, is every good boy's dream come true; hopefully it gets cheaper later...


----------



## Stefem (Sep 18, 2020)

gridracedriver said:


> it is you who does not look well at the graph, if we extended the Turing line to 320watt the perf / watt with Ampere would be even higher than 1.5x
> they're lying to you, get over it!


Oh really? Maybe your math skills are really worse than your drawing one but you don't need more, just extend the line like you did earlier... even ignoring the leveling and drop that happen when you keep increasing the power are you seriously suggesting that Turing line on the graph would be like that or "even higher" to quote you?
Maybe in your fantasy world where actual physical laws don't apply...


----------



## xorbe (Sep 18, 2020)

^^ that graph doesn't represent what I saw in the TPU review ...


----------



## Valantar (Sep 18, 2020)

tigger said:


> except i simplified it. any 2.x slot card may as well be three. you was babbling about 2.25/2.75 whatever the eff they are. Any card over 2 slots is a 3 slot card. simple


...except when it isn't, such as SFF cases with limited space. So yes, in terms of ATX and mATX cases and slot availability, you are right, but in ITX cases where GPU space is at a premium, the difference between 2.5 and 2.75 slots could be the difference between fitting your side panel on or not.


----------



## lexluthermiester (Sep 19, 2020)

mouacyk said:


> You seem to have missed the fundamentals that Micron is only producing 8Gb and 16Gb chips, for use with 32-bit memory controllers, of which the 3080 has 10 enabled of the 12 total. Hence, we got 10GB, and may possibly get 20GB in a future refresh.


And we have an answer:

NVIDIA Readies RTX 3060 8GB and RTX 3080 20GB Models

A GIGABYTE webpage meant for redeeming the RTX 30-series Watch Dogs Legion + GeForce NOW bundle lists out eligible graphics cards for the offer, including a large selection of those based on unannounced RTX 30-series GPUs. Among these are references to a "GeForce RTX 3060" with 8 GB of memory...

www.techpowerup.com


----------



## medi01 (Sep 19, 2020)

Assimilator said:


> 4K RTX





Assimilator said:


> still requires DLSS to get there





Assimilator said:


> true 4K



Amazing stuff.
Could you please stop calling upscaled images "4k"?


----------



## Auer (Sep 19, 2020)

medi01 said:


> Amazing stuff.
> Could you please stop calling upscaled images "4k"?


I would call it "Better than 4K"


----------



## the54thvoid (Sep 19, 2020)

I used the DLSS setting in the new Control DLC at 1440p native. It set the render resolution to something weird (960p?). In game, I genuinely couldn't see any difference; it looked like 1440p. It's very impressive.


----------



## Calmmo (Sep 20, 2020)

hmmm..

https://www.reddit.com/r/ZOTAC/comments/iv4j17


----------



## lexluthermiester (Sep 20, 2020)

Calmmo said:


> hmmm..
> 
> __
> https://www.reddit.com/r/ZOTAC/comments/iv4j17


Can you please post information that has more credibility? Some random user on Reddit does not qualify.


----------



## claes (Sep 21, 2020)

lexluthermiester said:


> Can you please post information that has more credibility? Some random user on Reddit does not qualify.


Their American sales rep seems to offer credibility


https://www.reddit.com/r/ZOTAC/comments/iv4j17/_/g5q82zr


----------



## lexluthermiester (Sep 21, 2020)

claes said:


> Their American sales rep seems to offer credibility
> 
> 
> __
> https://www.reddit.com/r/ZOTAC/comments/iv4j17/_/g5q82zr


Fair enough. Not very surprising though. We're talking about base clocks. 1% - 3% is not much to whine about let alone make a mountain out of molehill over, especially when you can use a utility to OC and get that back plus more.


----------



## Caring1 (Sep 21, 2020)

claes said:


> Their American sales rep seems to offer credibility


Hmm, should I listen to people with technical credibility, or ask a sales rep?


----------



## R-T-B (Sep 21, 2020)

lexluthermiester said:


> Can you please post information that has more credibility? Some random user on Reddit does not qualify.



No random user anywhere qualifies.  You need uh... qualifications.  Paper or otherwise.  Hard to find but very useful in debates to use people that have those.  10/10, would recommend.



Caring1 said:


> Hmm, should I listen to people with technical credibility, or ask a sales rep?



Depends, are you looking at getting a job in sales, or asking technical questions about computer parts?

Depending on your answer, one may be more suitable than the other.


----------



## Calmmo (Sep 21, 2020)

lexluthermiester said:


> Can you please post information that has more credibility? Some random user on Reddit does not qualify.


the _*Zotac rep*_ says it was intentional to make their more expensive cards look faster


----------



## gridracedriver (Sep 21, 2020)

Stefem said:


> Oh really? Maybe your math skills are really worse than your drawing one but you don't need more, just extend the line like you did earlier... even ignoring the leveling and drop that happen when you keep increasing the power are you seriously suggesting that Turing line on the graph would be like that or "even higher" to quote you?
> Maybe in your fantasy world where actual physical laws don't apply...
> View attachment 169101


What are you saying? The graph simply claims that at the same wattage, 240 W, Turing generates 60 fps and Ampere 90 fps: +50% performance at the same power. That does not happen in reality; they are lying...
Wake up.
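For what it's worth, the disputed claim is easy to state as arithmetic. A minimal sketch in Python, using only the 60/90 fps at 240 W figures quoted above from Nvidia's slide; whether those figures match measured reality is exactly what is being argued here:

```python
# Sanity-checking the perf-per-watt claim argued above. The 60 fps and
# 90 fps at 240 W figures are read off Nvidia's marketing slide, not
# measured results, so treat this purely as arithmetic.

def perf_per_watt(fps: float, watts: float) -> float:
    """Frames rendered per second for each watt drawn."""
    return fps / watts

turing = perf_per_watt(60, 240)   # 0.25 fps/W
ampere = perf_per_watt(90, 240)   # 0.375 fps/W

# Relative perf/W improvement at the same 240 W power level
gain = ampere / turing - 1
print(f"Claimed perf/W gain at 240 W: {gain:.0%}")  # prints "Claimed perf/W gain at 240 W: 50%"
```

At equal power the slide's numbers do imply +50% perf/W; the objection in this thread is that the shipping 320 W card never operates at that point on the curve.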


----------



## lexluthermiester (Sep 21, 2020)

Calmmo said:


> the _*Zotac rep*_ says it was intentional to make their more expensive cards look faster


Yeah, everyone does that. Try comparing against the reference clocks, then check everyone else out, and we'll talk some more.


----------



## medi01 (Sep 21, 2020)

Calmmo said:


> hmmm..
> 
> __
> https://www.reddit.com/r/ZOTAC/comments/iv4j17


I've always wondered: why wouldn't companies that send cards in for review send cherry-picked, better-than-usual samples?


----------



## Vayra86 (Sep 21, 2020)

Athlonite said:


> As far as I know none are dual slot they're all triple slot solutions



Saw a 3090 blower the other day... so it's possible, if you want to abandon all quality of life.



Stefem said:


> Oh really? Maybe your math skills are really worse than your drawing one but you don't need more, just extend the line like you did earlier... even ignoring the leveling and drop that happen when you keep increasing the power are you seriously suggesting that Turing line on the graph would be like that or "even higher" to quote you?
> Maybe in your fantasy world where actual physical laws don't apply...
> View attachment 169101



I'm not sure it's wise to discuss this graph at all. It puts 'FPS' on the Y axis and 'W' on the X, with no further info, and tells us it massively improved... yeah. Best case, it's an RT-enabled scenario, since the first iteration was utterly shite. I'm sure in some weird niche the info is sound. But useful it most certainly is not.



Bubster said:


> I already have a 1080 SLI 3 years ago and it's great for 4k experience minimum 70 - 80 fps in Ultra in most games...



Not in any games going forward, as Nvidia is dropping SLI profile support entirely within 4 months (see headlines). Maybe they'll push some sort of community effort...?


I gotta say, the more I see this 3080 'mature' into the market, and with the recent higher-capacity VRAM versions announced... I'm not convinced by Ampere just yet. Let's pray for some real competition and movement. There's no real rush; zero content is out that truly needs RT yet...


----------



## lexluthermiester (Sep 22, 2020)

Vayra86 said:


> Maybe they'll push some sort of community effort...?


Very unlikely unless they release the source code for the drivers, or at least the SLI portion.


----------



## claes (Sep 22, 2020)

lexluthermiester said:


> Fair enough. Not very surprising though. We're talking about base clocks. 1% - 3% is not much to whine about let alone make a mountain out of molehill over, especially when you can use a utility to OC and get that back plus more.


I don’t know — at the point when a sales rep admits that it’s a lower-end design, I think that means it’s a lower-end design. Not really news, although bottom-barrel Nvidia cards usually perform on par with FE cards. Still, for once I (partially) agree with @lexluthermiester — 1-3% is practically a margin of error. I’d be remiss not to correct you, though — even when OC’d, the Zotac card performs worse than an overclocked FE, as some reputable reviewers have shown. Unlike the FE's 370 W power limit, the Zotac card is limited to 336 W.

In any case, you're right -- the performance difference is mostly negligible in the real world (unless you overclock). It's not at all unusual for AIBs to release different tiers of cards, and it's not like ASUS or Gigabyte released their high-end cards at launch (or MSI, if there's a Lightning this round). *shrug*

Relatively speaking, the Zotac doesn’t overclock well and is slower than the FE out of the box. It also seems to be missing some components that the FE has. It’s Zotac news for Zotac fans, and a frustration for buyers who just wanted a "reference" PCB. Maybe the delay means they’ll go the extra mile with Amp! this round, but given the 3080's limitations I doubt it.


Caring1 said:


> Hmm, should I listen to people with technical credibility, or ask a sales rep?





R-T-B said:


> No random user anywhere qualifies.  You need uh... qualifications.  Paper or otherwise.  Hard to find but very useful in debates to use people that have those.  10/10, would recommend.
> 
> Depends, are you looking at getting a job in sales, or asking technical questions about computer parts?
> 
> Depending on your answer, one may be more suitable than the other.


If you'll read the source in question before making assumptions, it was reviews like this one that got people upset:

Zotac GeForce RTX 3080 Trinity Review

Zotac's GeForce RTX 3080 Trinity comes with up to five years of warranty and a large triple-slot, triple-fan cooler. In our review, we saw excellent noise levels that are much better than the Founders Edition, and it runs cooler as well. A very solid card for $699.

www.techpowerup.com




Couple that with "issues" like these and you might see why early adopters are a bit miffed:


https://www.reddit.com/r/EKWB/comments/iv1ugg


----------



## Athlonite (Sep 22, 2020)

Vayra86 said:


> Saw a 3090 blower the other day... so it's possible, if you want to abandon all quality of life.




yeah I saw that too unfortunately I think the only thing "Turbo" on that card is going to be the fan and the amount of noise it's going to make when you hit 70+ degrees C


----------



## R-T-B (Sep 22, 2020)

claes said:


> If you'll read the source in question before making assumptions



I was speaking generally.


----------



## Bubster (Sep 22, 2020)

Athlonite said:


> yeah I saw that too unfortunately I think the only thing "Turbo" on that card is going to be the fan and the amount of noise it's going to make when you hit 70+ degrees C


at least you never have to worry about the freezing cold as with the 3090 you get an automatic free heater


----------



## EarthDog (Sep 22, 2020)

Athlonite said:


> yeah I saw that too unfortunately I think the only thing "Turbo" on that card is going to be the fan and the amount of noise it's going to make when you hit 70+ degrees C


@Vayra86 - I heard the blower is good for about .5 bar / 7 psi... will take 6th gen Honda Civic B16a2 motor to 300whp (intercooler required).


----------



## Athlonite (Sep 23, 2020)

Bubster said:


> at least you never have to worry about the freezing cold as with the 3090 you get an automatic free heater



HA LOL have you seen the absurd prices here in New Zealand $1,798.99 NZD for a 3080 you can bet your top dollar the 3090 will be twice that price @ $3,000+NZD


----------



## Caring1 (Sep 24, 2020)

@W1zzard Are all the 3080 reviews done now?
The Temp and noise table in the review here needs updating still.


----------



## Athlonite (Sep 24, 2020)

Athlonite said:


> HA LOL have you seen the absurd prices here in New Zealand $1,798.99 NZD for a 3080 you can bet your top dollar the 3090 will be twice that price @ $3,000+NZD



Well I was right $3500+NZD for the 3090


----------



## Caring1 (Sep 25, 2020)

Athlonite said:


> Well I was right $3500+NZD for the 3090


$2,749 is the cheapest across the ditch in AU, how does that compare with the current exchange rate?


----------



## Athlonite (Sep 25, 2020)

Caring1 said:


> $2,749 is the cheapest across the ditch in AU, how does that compare with the current exchange rate?



Poorly, very poorly: $2,821.22 NZD for one from Oz vs $3,554.99 NZD to buy here. Still waaaaaaaaaaay too expensive.
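To check the exchange arithmetic, a quick sketch; the NZD-per-AUD rate here is implied by the two figures quoted in this exchange, not a live rate:

```python
# Comparing the imported AU price against the local NZ price.
# The exchange rate is implied by the figures quoted above.

price_aud = 2749.00                 # cheapest 3090 in Australia
price_from_oz_nzd = 2821.22         # same card converted to NZD
local_price_nzd = 3554.99           # cheapest 3090 sold in NZ

implied_rate = price_from_oz_nzd / price_aud   # ~1.026 NZD per AUD

# Premium for buying locally instead of importing across the ditch
premium = local_price_nzd / price_from_oz_nzd - 1
print(f"Implied NZD/AUD rate: {implied_rate:.3f}")
print(f"Local premium over the AU price: {premium:.0%}")  # ~26%
```

In other words, the local NZ sticker carries roughly a quarter more cost than simply converting the Australian price.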


----------



## Anymal (Sep 25, 2020)

Hell, we just need 3080 SE, Stock Edition.


----------



## W1zzard (Sep 25, 2020)

Caring1 said:


> @W1zzard Are all the 3080 reviews done now?
> The Temp and noise table in the review here needs updating still.


Updated, thanks for reminding


----------



## lexluthermiester (Sep 26, 2020)

Athlonite said:


> Poorly, very poorly: $2,821.22 NZD for one from Oz vs $3,554.99 NZD to buy here. Still waaaaaaaaaaay too expensive.


If you're not in a hurry, give it a few months. November is going to be a good time to buy: drivers that fix a few issues that have been discovered will be out, and supplies will be restocked.



W1zzard said:


> Updated, thanks for reminding


Is someone at TPU going to do a review on the 3090 or are you focusing on "other" reviews ATM?


----------



## Anymal (Sep 26, 2020)

You mean stock 3090 out of stock?


----------



## SoppingClam (Oct 1, 2020)

I have a problem with my nvidia 3080, while it runs perfectly fine if you don't overclock it tends to live in the 1800-1900mhz range. My issue is the Asus RTX 2080 Ti I also have that I am replacing seems to get much higher scores than what the benchmarks in the review have it as. I'm confused how with a simple power increase it runs between 2070-2130mhz core and 8000mhz mem while never exceeding the 60s. 
I play in 4k with HDR (when available) and the FPS difference so far has a maximum of 9% Port Royal.. while all the other 3Dmark tests have the 2080 Ti slower by 0.04%-9%. Is this because it is more of an 'apples to apples' comparison due to the 2080 Ti running a similar power limit? I guess most reviews must use cards with low power limits and no power limit increase from afterburner or bios update. 
Just find it weird as the 3080 has many more CUDA cores but I suppose it generally runs at a slower speed and memory speed isn't much different after overclock. Yet the 2080 Ti even with the overclock is 100% stable and runs cooler where minor overclocks have caused me issues with the 3080..
Guess I fell for the hype. Might sell the 3080 as it would be a significant upgrade from a non 2080 ti user


----------



## lexluthermiester (Oct 1, 2020)

SoppingClam said:


> I have a problem with my nvidia 3080, while it runs perfectly fine if you don't overclock it tends to live in the 1800-1900mhz range. My issue is the Asus RTX 2080 Ti I also have that I am replacing seems to get much higher scores than what the benchmarks in the review have it as. I'm confused how with a simple power increase it runs between 2070-2130mhz core and 8000mhz mem while never exceeding the 60s.
> I play in 4k with HDR (when available) and the FPS difference so far has a maximum of 9% Port Royal.. while all the other 3Dmark tests have the 2080 Ti slower by 0.04%-9%. Is this because it is more of an 'apples to apples' comparison due to the 2080 Ti running a similar power limit? I guess most reviews must use cards with low power limits and no power limit increase from afterburner or bios update.
> Just find it weird as the 3080 has many more CUDA cores but I suppose it generally runs at a slower speed and memory speed isn't much different after overclock. Yet the 2080 Ti even with the overclock is 100% stable and runs cooler where minor overclocks have caused me issues with the 3080..
> Guess I fell for the hype. Might sell the 3080 as it would be a significant upgrade from a non 2080 ti user


There are many instances of people saying that buying a 3080 to replace a 2080 Ti might yield results you may be less than happy with. There is no doubt that the 3080 beats the 2080 Ti; it might not be by much in some programs, but the 2080 Ti is not the victor in that competition. What you should have bought was the 3090. The 3090 is the replacement for the 2080 Ti and would have granted the general leap in performance you were expecting. As both were in the same price class (about $1,500), the 3090 was the logical choice.


----------



## EarthDog (Oct 1, 2020)

SoppingClam said:


> I have a problem with my nvidia 3080, while it runs perfectly fine if you don't overclock it tends to live in the 1800-1900mhz range. My issue is the Asus RTX 2080 Ti I also have that I am replacing seems to get much higher scores than what the benchmarks in the review have it as. I'm confused how with a simple power increase it runs between 2070-2130mhz core and 8000mhz mem while never exceeding the 60s.
> I play in 4k with HDR (when available) and the FPS difference so far has a maximum of 9% Port Royal.. while all the other 3Dmark tests have the 2080 Ti slower by 0.04%-9%. Is this because it is more of an 'apples to apples' comparison due to the 2080 Ti running a similar power limit? I guess most reviews must use cards with low power limits and no power limit increase from afterburner or bios update.
> Just find it weird as the 3080 has many more CUDA cores but I suppose it generally runs at a slower speed and memory speed isn't much different after overclock. Yet the 2080 Ti even with the overclock is 100% stable and runs cooler where minor overclocks have caused me issues with the 3080..
> Guess I fell for the hype. Might sell the 3080 as it would be a significant upgrade from a non 2080 ti user


At 4K is where you should see the most improvement. At an average of 26% (TPU review of the Asus TUF), that is a generational leap. Yes, a 2080 Ti overclocks more than these 3080s. A 3090 would have given you another 10-15% performance, but @ double the price, the value proposition (I use that term loosely) isn't there. Note according to NV, the 3090 is essentially a Titan replacement (as well as the 3080 being labeled the "flagship" in the same video). That said, I surely wouldn't return/sell the 3080. You can sell the 2080 Ti for $500 and have a 3080 for ~$400. I'd call that an upgrade.

Depending on which benchmarks you ran, the CPU in play may have something to do with the results too. 3DMark likes cores and threads, for example...
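The upgrade math above can be sketched out explicitly. All the inputs ($699 MSRP, ~$500 resale value, ~26% average 4K uplift) come from this post and are estimates, not market data:

```python
# Rough upgrade-value arithmetic using the figures from the post above.
# Resale value and performance uplift are estimates, not market data.

msrp_3080 = 699          # USD, Founders Edition MSRP
resale_2080ti = 500      # assumed used-market value of the 2080 Ti
perf_uplift_pct = 26     # average 4K gain per the TPU review

net_cost = msrp_3080 - resale_2080ti
cost_per_pct = net_cost / perf_uplift_pct

print(f"Net upgrade cost: ${net_cost}")                    # $199
print(f"Cost per 1% of extra 4K perf: ${cost_per_pct:.2f}")
```

Under those assumptions the swap works out to roughly $200 net for a generational jump, which is the "3080 for ~$400" point in a different form.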


----------



## lexluthermiester (Oct 1, 2020)

EarthDog said:


> A 3090 would have given you another 10-15% performance, but @ double the price, the value proposition (I use that term loosely) isn't there.


While that's a good point, the price paid for the 2080 Ti is the same as one would pay for a 3090. They are effectively in the same class for their respective generations.


EarthDog said:


> Note according to NV, the 3090 is essentially a Titan replacement


Sort of. Jensen did call it "Titan class performance", but in reality the real Titan replacement will be a Quadro card. The 3090 is effectively the "Ti" of this generation of GPUs, both when gauged by performance and by price.


----------



## EarthDog (Oct 1, 2020)

lexluthermiester said:


> While that's a good point, the price paid for the 2080 Ti is the same as one would pay for a 3090. They are effectively in the same class for their respective generations.
> 
> Sort of. Jensen did call it "Titan class performance", but in reality the real Titan replacement will be a Quadro card. The 3090 is effectively the "Ti" of this generation of GPUs, both when gauged by performance and by price.


What someone paid a couple of years back has little bearing (in my mind) on what is what today. Price and class are different things as both are moving targets. Clearly NV has dropped prices significantly so it's tough to compare like that.

A 3090 is a Titan replacement. It has (most of) the makings of a Titan, including the inflated price, too much RAM for gaming, and Jensen said so ("Titan class"). It just isn't called one (for marketing reasons). If you forced me to guess, I'd say it is to get better market segmentation. The 3090 today is more gaming-oriented than the Titan RTX, but it is still a crossover card, surely. If you look at price and performance, you'll see that the 3080-to-3090 gap more closely resembles the jump to a Titan than the 2080-to-2080 Ti jump, for example. We see a meager increase in performance (even at 4K) and shedloads of RAM. The difference between the 2080 and the 2080 Ti was almost 25%, whereas 3080 to 3090 is ~15%. Barely a SKU difference. Titan RTX to 2080 Ti was a couple of percent, IIRC.

I'm also thinking that if the 3090 were the Ti, they'd call it the Ti. The Ti of this gen is going to be a 3080 Ti down the pike, similar to the GTX 780 and the 780 Ti that came out later (with the Titan on top of that). They just dropped the Titan name.


----------



## R0H1T (Oct 1, 2020)

I dunno, there's a good chance we might see a *Titan A* (trademarked) some time down the line, especially if AMD can't come close to 3090 levels of performance &/or *RDNA3* cards don't launch within a year.


----------



## SoppingClam (Oct 2, 2020)

The price of a 3090 in Australia is around $3k. As for pricing of 3080s:

I paid $1,259 AUD back in Sept 2018 for my Asus 2080 Ti. Currently, the RTX 3080s are closer to $1,500, as the $699 USD price didn't come here at the exchange-rate equivalent of ~$920 but at $1,100-1,200. Since Nvidia only sent a handful here, the other branded ones start at the $1,400 mark and go right up.

So, an RTX 3090 is closer to 3x the cost of my 2080 Ti. The RRP never seemed to exist.

Now that I have plonked a 366 W BIOS on my card, up from the stock 312 W, it still runs plenty cool. Testing both cards, they now run at about the same power usage and differ by maybe 10%. Then again, against almost all reviews that include the 3080, I get the same or a bit better FPS in general; it's just that the 2080 Ti I am using is 15-25% faster than the ones they are using.

The 3080 is a nice card, but as it is closer to $1,500 and a 3090 is $3,000, the $1,259 I paid prior to the price hike has a happy life. While I now run an i7-10700K @ 5-5.1 GHz, even my old i7-4770K @ 4.4 GHz with the 2080 Ti got close to 9,600 points in Port Royal, when Nvidia and most reviewers have the 2080 Ti maxing out at 8,600. Weird.


----------



## Turmania (Oct 20, 2020)

Lovely card, great performance, and great pricing. Thumbs up to the green team. However, all that with massive power consumption, which I cannot justify. Very hard to justify. This is so unlike Nvidia.


----------



## critofur (Dec 27, 2020)

I got my 5700 XT for $300, and I'm currently running a 1440p @ 144 Hz "gaming monitor". 95% of games are not significantly improved, subjectively, by upgrading to a 3080 on this monitor. But reviews had me convinced that ray tracing was finally worth having with Cyberpunk 2077, and I wanted to experience what gaming at a steady 144 Hz was like.

Ray tracing may be "cool" in CP77, but once I start _playing_ (rather than taking a break to just look around) I stop noticing the improvements it brings.



Turmania said:


> ...However, all that with massive power consumption...


Aside from the $809 price (of the FTW3 Ultra, which is the cheapest I could find in stock), the power consumption was the hardest aspect for me to swallow - if I hadn't read about undervolting, I don't think I could keep this card.

I hope they add excellent undervolting support to Precision X, as MSI Afterburner doesn't seem to work 100% with EVGA's 3rd fan. Also, Afterburner's "Voltage Curve" editor is unintuitive to me; I think they got the MHz/voltage axes backwards, and it's not clear to me how to set the voltage for very low clocks, such as when you're just doing light 2D work like web browsing.


----------



## cst1992 (Jan 2, 2021)

Awesome cooler design. Not a single visible screw and magnetic screw covers, plus two fans routing air two ways - very impressive!

One question though - is the back-panel mesh of any use on such a card? If I'm reading it right, no air should be passing through that mesh anymore (unlike a blower-type card, or even an open-fan card, which has *some* air passing through there).


----------



## Valantar (Jan 2, 2021)

cst1992 said:


> Awesome cooler design. Not a single visible screw and magnetic screw covers, plus two fans routing air two ways - very impressive!
> 
> One question though - is the back panel mesh any use in such a card? If I'm reading it right, no air should be passing through that mesh anymore(unlike a blower-type card or even a fan-based card which has *some* air passing through there).


The fan closest to the rear I/O exhausts pretty much entirely out the I/O shield - it's not like the air can go through the PCB after all, and there's a thick aluminium frame surrounding the fin stack, so the air has nowhere to go except out the back. A tiny amount will go out to the middle fin stacks and into the case, but the openings through the frame for that are very small.


----------

