# NVIDIA GeForce RTX 3060 Ti Founders Edition



## W1zzard (Dec 1, 2020)

NVIDIA's GeForce RTX 3060 Ti comes at an incredible price of $399, yet beats the GeForce RTX 2080 Super in rasterization, and even the RX 6800 in raytracing. In our RTX 3060 Ti review, we found that NVIDIA picked a fantastic cooler that's extremely quiet, yet fits everywhere thanks to its dual-slot design.

*Show full review*


----------



## Flanker (Dec 1, 2020)

That power efficiency is pretty good. Wonder if AMD cards in this perf segment can beat it


----------



## Sovsefanden (Dec 1, 2020)

Good review!

As expected, 12-15% below 3070.

A good 1440p/144Hz card, or 1080p/240-360Hz! Will sell like hotcakes, if stock is good

Looking forward to seeing the 6700 XT vs the 3060 Ti


----------



## Valantar (Dec 1, 2020)

Well, that came out of the blue. Good to see an actual hard launch, and competition seems to be working given the good price/perf of this GPU. Also a sentence nobody would have expected a year or two ago: it's nice to see Nvidia catch up to AMD's efficiency! XD Here's hoping supplies last at least a little.

This makes me very hopeful for the RX 6700 and 6700 XT - the RX 6800 is rather weirdly priced, even if it does beat the 3070 overall, so I'm hoping Nvidia being early with a good contender at $400 forces AMD's hand in making the next tier down a $400 GPU as well. Anything above that would be disappointing, even if it's faster.


----------



## dayne878 (Dec 1, 2020)

Stock will probably be abysmal, but I'm hoping that the sales of 3060ti and 3070 will take some heat off the 3080s so I can eventually snag one in early 2021.


----------



## Sovsefanden (Dec 1, 2020)

Edit


Valantar said:


> Well, that came out of the blue. Good to see an actual hard launch, and competition seems to be working given the good price/perf of this GPU. Also a sentence nobody would have expected a year or two ago: it's nice to see Nvidia catch up to AMD's efficiency! XD Here's hoping supplies last at least a little.
> 
> This makes me very hopeful for the RX 6700 and 6700 XT - the RX 6800 is rather weirdly priced, even if it does beat the 3070 overall, so I'm hoping Nvidia being early with a good contender at $400 forces AMD's hand in making the next tier down a $400 GPU as well. Anything above that would be disappointing, even if it's faster.



The 3070 beats the 6800 on perf per dollar when looking at MSRP pricing, but the 6800 is priced several hundred dollars above its $580 MSRP and nowhere in sight because of no stock, making it an even worse deal (a halo product for now). Several retailers have already said that AMD stock won't become normal until deep into 2021, and this includes the upcoming 6700 series too.

So... easy win for Nvidia. Going Samsung 8nm paid off, it seems; AMD can't deliver chips because TSMC 7nm is oversubscribed at this point.


----------



## Hugis (Dec 1, 2020)

Cheers for all the reviews @W1zzard, you must have been uber busy!
Great reviews as always!
Let's hope the people wanting this card can get it...


----------



## N3M3515 (Dec 1, 2020)

Sovsefanden said:


> Edit
> 
> 
> The 3070 beats the 6800 on perf per dollar when looking at MSRP pricing, but the 6800 is priced several hundred dollars above its $580 MSRP and nowhere in sight because of no stock
> ...



Easy win for nobody because there are none....


----------



## EzioAs (Dec 1, 2020)

Pretty good card overall. It's still relatively pricey compared to mid-range GPUs from 4+ years ago, but compared to the current gen, it's great value. Of course, I'm talking about the announced MSRP here. Who knows how much these cards will go for in the current market.


----------



## Vayra86 (Dec 1, 2020)

Looks decent and balanced, well priced. The first truly good one in this line up.


----------



## AnarchoPrimitiv (Dec 1, 2020)

Flanker said:


> That power efficiency is pretty good. Wonder if AMD cards in this perf segment can beat it



Literally, I have no doubt that they will


----------



## Valantar (Dec 1, 2020)

Sovsefanden said:


> Edit
> 
> 
> The 3070 beats the 6800 on perf per dollar when looking at MSRP pricing, but the 6800 is priced several hundred dollars above its $580 MSRP and nowhere in sight because of no stock, making it an even worse deal (a halo product for now). Several retailers have already said that AMD stock won't become normal until deep into 2021, and this includes the upcoming 6700 series too.
> ...


... so Nvidia is delivering? Where? How? At what prices? Both pricing and availability are about equally bad for both parties, though Nvidia has had slightly longer to fix things due to their earlier launch. I also never said the 6800 beat the 3070 in terms of value, only in terms of overall performance - and my opinion of the 6800's pricing should be rather clear from my stated desire in the post you quoted for this to force 6700-series pricing down, no? You're writing this as if you're arguing against me, yet nothing you're saying actually makes sense in that context. Also, you seem to have missed the part in the conclusion commenting on the MSRP for this also likely being make-believe.

As for 8nm being a good choice for Nvidia - not from what we've seen so far. It's a process they're essentially alone in using, yet they're still not managing to come close to meeting demand. Hopefully this levels out, but so far Samsung 8nm for Nvidia looks exactly like what it is: a choice they were forced to make as it was the only reasonably high end process they could get access to in the relevant time frame. For that it definitely isn't bad as long as yields/wafer output increases, but we still don't know when that will happen.


----------



## mb194dc (Dec 1, 2020)

So just a cheaper 2080 Super, essentially. Is it available to buy in the UK from 14:00 on 02/12? It doesn't seem to be listed anywhere yet today.

Having a look at big UK retailers (OCUK, Scan, Amazon, etc.), there isn't stock of any mainstream graphics cards at all; just super expensive 3090s or 2080 Tis, very low-end cards, and pro cards.

These are strange times indeed.


----------



## AnarchoPrimitiv (Dec 1, 2020)

Sovsefanden said:


> Edit
> 
> 
> The 3070 beats the 6800 on perf per dollar when looking at MSRP pricing, but the 6800 is priced several hundred dollars above its $580 MSRP and nowhere in sight because of no stock, making it an even worse deal (a halo product for now). Several retailers have already said that AMD stock won't become normal until deep into 2021, and this includes the upcoming 6700 series too.
> ...



"AMD can't deliver chips"... you're saying that as if Nvidia can deliver chips, or has been delivering chips? Nvidia is just as bad, and at least AMD has to split the same supply between CPUs, GPUs, and two brand-new, extremely popular consoles. Nvidia has literally zero excuse for its supply shortages, and according to Moore's Law is Dead, who has been correct on just about everything for over a year now, Nvidia has done this purposely to drive prices up. It was a bait and switch: the MSRPs were purposely made attractive to alleviate the anger from Turing, but they purposely held back stock to satisfy AIBs who, at MSRP, would have had razor-thin margins; with short supply, the AIBs have been able to drive prices up and increase their profit margins. Then on top of that, there are news reports from reputable outlets that Nvidia has been selling $100+ million in cards directly to crypto miners.

I don't like taking sides, and while it's abhorrent that both companies are out of stock, AMD's lack of stock at least has an understandable reason, whereas Nvidia seems to be up to something nefarious. AMD, if it could, would absolutely want as much GPU stock as possible to capture market share, so I don't see any reason for the shortage to be "manmade" on their end; whereas with Nvidia, unless Samsung has the worst yields ever known in the history of microprocessors, I can't think of a legitimate excuse for their shortage.


----------



## HenrySomeone (Dec 1, 2020)

Sovsefanden said:


> Edit
> 
> 
> The 3070 beats the 6800 on perf per dollar when looking at MSRP pricing, but the 6800 is priced several hundred dollars above its $580 MSRP and nowhere in sight because of no stock, making it an even worse deal (a halo product for now). Several retailers have already said that AMD stock won't become normal until deep into 2021, and this includes the upcoming 6700 series too.
> ...


So true! Whatever they lost in efficiency by going Samsung, they will more than make up for by actually delivering cards, and next round, when I expect them to go TSMC 5nm, they will be unparalleled once again.


----------



## the54thvoid (Dec 1, 2020)

@W1zzard Nice spread of reviews. Now you can rest and enjoy... No, wait... one more AMD card to do!


----------



## Lycanwolfen (Dec 1, 2020)

2160p? Not even impressed. My 2x 1070 Tis in SLI still spank the newer stuff, unless it's 2x 3090s in SLI. 4K gaming is where it's at, and the problem is that without SLI it's like running in slow motion. I'm pushing my games to around 100 FPS in SLI. Heck, I bet my 660 Tis in SLI would beat the 3060 Ti in 1080p benchmarks. I understand DirectX 12 doesn't support it much, but most games today still run DX11. Imagine if two 3060 Tis were allowed to run SLI: it would be $800-ish for the same performance as a 3080 or better. But Nvidia is only allowing the super rich to afford 4K gaming.


----------



## EzioAs (Dec 1, 2020)

Lycanwolfen said:


> 2160p? Not even impressed. My 2x 1070 Tis in SLI still spank the newer stuff, unless it's 2x 3090s in SLI. 4K gaming is where it's at, and the problem is that without SLI it's like running in slow motion. I'm pushing my games to around 100 FPS in SLI. Heck, I bet my 660 Tis in SLI would beat the 3060 Ti in 1080p benchmarks. I understand DirectX 12 doesn't support it much, but most games today still run DX11. Imagine if two 3060 Tis were allowed to run SLI: it would be $800-ish for the same performance as a 3080 or better. But Nvidia is only allowing the super rich to afford 4K gaming.



.....
To each their own, I guess...


----------



## Vayra86 (Dec 1, 2020)

Lycanwolfen said:


> 2160p? Not even impressed. My 2x 1070 Tis in SLI still spank the newer stuff, unless it's 2x 3090s in SLI. 4K gaming is where it's at, and the problem is that without SLI it's like running in slow motion. I'm pushing my games to around 100 FPS in SLI. Heck, I bet my 660 Tis in SLI would beat the 3060 Ti in 1080p benchmarks. I understand DirectX 12 doesn't support it much, but most games today still run DX11. Imagine if two 3060 Tis were allowed to run SLI: it would be $800-ish for the same performance as a 3080 or better. But Nvidia is only allowing the super rich to afford 4K gaming.



Too bad SLI is history...


----------



## Lycanwolfen (Dec 1, 2020)

Vayra86 said:


> Too bad SLI is history...


Yep, I guess 1080p is what Nvidia wants us to game at. Two years ago 4K and 8K were the dream. My PS5 can do 4K better now.


----------



## renz496 (Dec 1, 2020)

Lycanwolfen said:


> Yep, I guess 1080p is what Nvidia wants us to game at. Two years ago 4K and 8K were the dream. My PS5 can do 4K better now.



Playing at 1080p is better than letting yourself fall into the hardware makers' trap of wanting people to play at 4K and getting them to buy expensive gear for it.


----------



## Lycanwolfen (Dec 1, 2020)

Single cards are fine for 1080p, even 1440p; not much difference. But when you go to 4K, everything changes. Take Battlefield V as an example, with two 2080 Supers in SLI versus a single card: at 1080p both do an average of 120 FPS in the game. When you go to 4K, you notice the difference; the single card averages about 80 to 90 FPS, while in SLI you see the biggest gains, upwards of 140 FPS, because one card is rendering the bottom half of the frame and the other is rendering the top half. I'll stay with my 1070s in SLI till I see something better. Maybe a single-card 4060 or 4070 will do 4K better.



renz496 said:


> Playing at 1080p is better than letting yourself fall into the hardware makers' trap of wanting people to play at 4K and getting them to buy expensive gear for it.


Yeah, might be an issue, since my monitor's native res is 2160p.


----------



## EzioAs (Dec 1, 2020)

Lycanwolfen said:


> My PS5 can do 4k better now.



No, it can't.


----------



## Raendor (Dec 1, 2020)

EzioAs said:


> No, it can't.


The PS5 can't, but the Series X can. It's a shame, because this card costs only a bit less than a whole gaming system that can play games at that same level. The X is around 2080-level performance.


----------



## Nordic (Dec 1, 2020)

That face when your gpu is the lowest product line on the chart. 

I haven't felt the need to upgrade in years and still don't for 1080p gaming. The only thing that makes me want to is that upgrade itch.


----------



## Mistral (Dec 1, 2020)

Seems like an excellent mainstream card. 
So, the average person will be able to buy one some time next year?


----------



## usiname (Dec 1, 2020)

Can I ask what happened with the performance/watt chart? The 6800 was way better than the 1660 Ti, and now it's on the same level.




----------



## RandallFlagg (Dec 1, 2020)

Valantar said:


> ... so Nvidia is delivering? Where? How? At what prices?



Pretty much every major OEM is shipping systems with the 3070, 3080, and some with the 3090. Nvidia is at least supplying the OEMs, and they are clearly opting to sell entire systems to capitalize on the popularity. AMD apparently has nothing to sell, as not even OEMs have the 6800/6800 XT, and most don't have Zen 3 either.

There are literally pages of them on Amazon and most you can get by the end of this week.

Try doing this same search for a 6800 or 6800 XT.

[attached screenshots: Amazon search results and Dell Alienware options]


----------



## Lycanwolfen (Dec 1, 2020)

Raendor said:


> PS5 not, but Series X can. It's a shame, because this card costs a bit less than getting a whole gaming system that can play games on that same level. X is around the same 2080 level performance.


LOL, an MS lover. Oh, I bet I know which new game will come out for it: another Halo. When I have to fix Microsoft's crap all day long, with all the issues they create, I will never buy a single Xbox.


----------



## Hyderz (Dec 1, 2020)

RTX 3070 (sweating profusely) *waves at RTX 3060ti


----------



## mechtech (Dec 1, 2020)

Nice review W1zz.

A few questions/comments.

"NVIDIA is positioning the GeForce RTX 3060 Ti Founders Edition at $399, which is an extremely competitive price. At that price point, it offers price/performance comparable to the GTX 1660 Super, which has the best price/performance ratio of all NVIDIA offerings. "

I'm assuming that was supposed to be 2080 Super?

2nd
For relative performance, were older cards such as the Radeon RX480, etc., tested on the i9-9900K test bed for apples to apples comparison?

Thanks


----------



## Chrispy_ (Dec 1, 2020)

$399 for something approaching a 2080S. Ampere offers theoretical improvements in performance/$ across the whole range.

Except $399 is a fabrication for the next 4 months at least. Scalpers, miners, high-demand, and low supply mean that you're still probably going to be paying $600 for them. When you look at the performance and performance/Watt of the 2070 Super, it's not _that_ different and the 2070S is a $500 card _from 18 months ago._


----------



## Raendor (Dec 1, 2020)

Lycanwolfen said:


> LOL, an MS lover. Oh, I bet I know which new game will come out for it: another Halo. When I have to fix Microsoft's crap all day long, with all the issues they create, I will never buy a single Xbox.


A good product lover, actually. I've had a lot of consoles (PS2, PS3, PS4, PSP, X360, GBA, GBC, Switch, and now a Series X) in addition to PC, so it's ridiculous to read a nonsense comment like yours. The X is a great machine (certainly more attractive to me than the PS5, which I won't even consider until at least a Slim comes out) and it easily beats the 3060 Ti/3070/6800 on the simple value proposition of the power it offers. But you surely must be one of those Sony fanboys salivating over mostly mediocre "exclusives" if anything MS-related causes such a twist in your brain.


----------



## W1zzard (Dec 1, 2020)

usiname said:


> Can I ask what happen with the performance/watt chart?


I switched to Metro at 1440p for high-performance cards, to reduce CPU bottlenecking, especially on AMD, whose drivers have more overhead and are thus more susceptible to it.
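If it helps, the metric itself is simple: average FPS in the test scene divided by average board power during that scene. A quick sketch (all card numbers below are invented, not results from the review):

```python
# Performance per watt = average FPS / average board power (W).
# The FPS and wattage figures here are made up for illustration.
def perf_per_watt(avg_fps: float, avg_power_w: float) -> float:
    return avg_fps / avg_power_w

# two hypothetical cards in the same test scene
print(round(perf_per_watt(95.0, 200.0), 3))   # 200 W card at 95 FPS -> 0.475
print(round(perf_per_watt(110.0, 230.0), 3))  # faster but hungrier card
```

The chart then normalizes these ratios against a reference card, which is why a CPU bottleneck in the test scene drags down the score of every card fast enough to hit it.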



mechtech said:


> I'm assuming that was supposed to be 2080 Super?


Look at the perf/$ chart; the best NV card there is the 1660 Super.



mechtech said:


> For relative performance, were older cards such as the Radeon RX480, etc., tested on the i9-9900K test bed for apples to apples comparison?


Of course. Every rebench I spend like two weeks testing all these cards back-to-back


----------



## mechtech (Dec 1, 2020)

W1zzard said:


> I switched to Metro at 1440p for high-performance cards, to reduce CPU bottlenecking, especially on AMD, whose drivers have more overhead and are thus more susceptible to it.
> 
> 
> Look at the perf/$ chart; the best NV card there is the 1660 Super.
> ...



Thanks - I was looking at the wrong chart.

Is it possible to add the older cards to the game benchmarks for easy comparison?


----------



## W1zzard (Dec 1, 2020)

mechtech said:


> Is it possible to add the older cards to the game benchmarks for easy comparison?


I make the cut at 75% and 125%, or the charts become too busy. Some important SKUs like the 2060/2060S are included because of the shared "60" number.
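Roughly, the inclusion rule could be sketched like this (made-up relative-performance numbers, not the review's data):

```python
# Keep cards within 75%..125% of the reviewed card's relative
# performance, plus SKUs kept for naming reasons (same "60" tier).
# All performance numbers are invented for illustration.
cards = {
    "RTX 3060 Ti": 100, "RTX 3070": 114, "RTX 2080 Super": 97,
    "RTX 2060 Super": 73, "GTX 1660 Super": 55, "RTX 3090": 160,
}
always_keep = {"RTX 2060 Super"}  # same "60" number as the 3060 Ti

shown = [name for name, perf in cards.items()
         if 75 <= perf <= 125 or name in always_keep]
print(shown)  # the 1660 Super and 3090 fall outside the cut
```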


----------



## Tatty_One (Dec 1, 2020)

Pretty impressed TBH, almost identical power consumption at peak gaming as my 2060 Super but with 28% more performance at my res.


----------



## hardcore_gamer (Dec 1, 2020)

Lycanwolfen said:


> My PS5 can do 4k better now.



At 30fps, yes.


----------



## okbuddy (Dec 1, 2020)

the real price is $699


----------



## Khonjel (Dec 1, 2020)

I didn't check the Ray Tracing section of TPU's reviews before today, but my god, it's confusing af. My eyes and my brain were both hurting. And it's just two games at three resolutions. What will happen when more RT games are added?


----------



## ODOGG26 (Dec 1, 2020)

Sovsefanden said:


> Edit
> 
> 
> The 3070 beats the 6800 on perf per dollar when looking at MSRP pricing, but the 6800 is priced several hundred dollars above its $580 MSRP and nowhere in sight because of no stock, making it an even worse deal (a halo product for now). Several retailers have already said that AMD stock won't become normal until deep into 2021, and this includes the upcoming 6700 series too.
> ...


The same point you made about AMD stock normalizing in 2021 has been said about NVIDIA too. So what's your point?


----------



## ShurikN (Dec 1, 2020)

Mistral said:


> Seems like an excellent mainstream card.
> So, the average person will be able to buy one some time next year?


$400 isn't really mainstream.


----------



## r9 (Dec 1, 2020)

Is there a specific time when these go on sale tomorrow?


----------



## nguyen (Dec 1, 2020)

Chrispy_ said:


> $399 for something approaching a 2080S. Ampere offers theoretical improvements in performance/$ across the whole range.
> 
> Except $399 is a fabrication for the next 4 months at least. Scalpers, miners, high-demand, and low supply mean that you're still probably going to be paying $600 for them. When you look at the performance and performance/Watt of the 2070 Super, it's not _that_ different and the 2070S is a $500 card _from 18 months ago._



Better to compare the 3070 to the 2070S and the 3060 Ti to the 2060S, no?


----------



## squallheart (Dec 1, 2020)

Chrispy_ said:


> $399 for something *approaching *a 2080S. Ampere offers theoretical improvements in performance/$ across the whole range.
> 
> Except $399 is a fabrication for the next 4 months at least. Scalpers, miners, high-demand, and low supply mean that you're still probably going to be paying $600 for them. *When you look at the performance and performance/Watt of the 2070 Super, it's not that different and the 2070S* is a $500 card _from 18 months ago._



It's faster than the 2080S; "approaching" is not the right word here.

The 3060 Ti has roughly a 25% improvement in performance/watt over the 2070S. Not that different? I don't understand how you can make a statement like that when the objective data is right there.


----------



## W1zzard (Dec 1, 2020)

r9 said:


> Is there a specific time when these go on sale tomorrow ?


From what I understand 6:00 a.m. PT, 3:00 p.m. CET, 2:00 p.m. UK



Khonjel said:


> I didn't check the Ray Tracing section of TPU's reviews before today, but my god, it's confusing af. My eyes and my brain were both hurting. And it's just two games at three resolutions. What will happen when more RT games are added?


Yeah, it's a lot of data, I'll think of a better presentation for the 2021 reviews


----------



## Sithaer (Dec 1, 2020)

okbuddy said:


> the real price is $699



Where I live, in the limited places where I can see it listed, the price is between $600 and $700, so yeah. It's also being sold on the local second-hand market already.

Funny that the review mentions it's a better deal than a 5600 XT if you can buy it for $400; yeah, not happening anytime soon, if ever.

Other than that, the card itself is pretty solid imo.


----------



## Ja.KooLit (Dec 1, 2020)

So glad I sold my 2080S


----------



## birdie (Dec 1, 2020)

A great card except:

- Ridiculously overpriced, and people are seemingly not concerned at all. Like really? The GTX 970 launch price was $330!
- In power efficiency it doesn't beat Turing 16XX cards, which is ... what? How? I was under the impression that modern GPUs/CPUs can power gate pretty much any part of the chip, which means in games lacking RTRT/DLSS, Ampere cards should have been a _lot more_ power efficient than previous-generation cards.
- Why are we _still_ trailing previous-generation cards in terms of bang for the buck? Is this the new norm?
- This is _not_ a replacement for the wildly successful GTX 1060 6GB. Hopefully NVIDIA will release the RTX 3060 at $299 or lower.

@W1zzard I know you're absolutely busy with reviews, but please start testing NVIDIA cards with a power limit applied, e.g. at 90%, 80% and 70%. NVIDIA has OC'ed the cards to the absolute limit this generation, and by doing so they've made them not as power efficient as the node actually allows.


----------



## SIGSEGV (Dec 1, 2020)

another milking card.


----------



## QUANTUMPHYSICS (Dec 1, 2020)

I wouldn't mind having a 3060...in my laptop. 

If you are a desktop user, you have but two real choices:  3080 or 3090.


----------



## birdie (Dec 1, 2020)

QUANTUMPHYSICS said:


> I wouldn't mind having a 3060...in my laptop.
> 
> If you are a desktop user, you have but two real choices:  3080 or 3090.



I would absolutely mind a 200W beast in my laptop unless you're willing to use it for cooking or torturing.


----------



## Vya Domus (Dec 1, 2020)

As I previously said, horrendously overpriced; GPU pricing has been damaged beyond repair at this point, I'm afraid. Fortunately or unfortunately, it doesn't matter, because there will be none available for the foreseeable future.



AnarchoPrimitiv said:


> Nvidia is just as bad



That's an understatement; it's way worse than just "bad". As far as we know, no one else is using Samsung's 8nm node, so there are only two possibilities: yields are unbelievably bad, or their GPUs are going somewhere else. After all, there have been rumors that they have mostly been sold to mining farms, which seems more believable with each passing day.

So, they either can't make them, or they want to sell them to an entirely different group of customers. Either way it's terrible.


----------



## Vayra86 (Dec 1, 2020)

Chrispy_ said:


> $399 for something approaching a 2080S. Ampere offers theoretical improvements in performance/$ across the whole range.
> 
> Except $399 is a fabrication for the next 4 months at least. Scalpers, miners, high-demand, and low supply mean that you're still probably going to be paying $600 for them. When you look at the performance and performance/Watt of the 2070 Super, it's not _that_ different and the 2070S is a $500 card _from 18 months ago._



Why would you pay $600, though? I really don't follow this logic. GPUs are nonessential; why pay more than you want to?

It is exactly that twisted logic that gave scalping its business case.


----------



## RandallFlagg (Dec 1, 2020)

birdie said:


> I would absolutely mind a 200W beast in my laptop unless you're willing to use it for cooking or torturing.




3060 Ti based laptop concept:

[image attachment]


----------



## mechtech (Dec 1, 2020)

W1zzard said:


> I make the cut at 75% and 125%, or the charts become too busy. Some important SKUs like the 2060/2060S are included because of the shared "60" number.



ok

I guess I will have to do some extrapolation for my RX 480


----------



## ShurikN (Dec 1, 2020)

Sovsefanden said:


> Going Samsung 8nm paid off it seems


Yeah, Samsung's 8nm has been amazing; that's why you have Ampere cards filling up stores and shelves all across the world.


----------



## Vayra86 (Dec 1, 2020)

mechtech said:


> ok
> 
> I guess I will have to do some extrapolation for my RX 480



No, just find its relative performance in the TPU database!


----------



## squallheart (Dec 1, 2020)

birdie said:


> A great card except:
> 
> *Ridiculously overpriced* and people are seemingly not concerned at all. Like really? The GTX 970 launch price was $330!
> In power efficiency it doesn't beat Turing 16XX cards which is ... what? How? I was under the impression that modern GPUs/CPUs can power gate pretty much any part of the chip which means for games lacking RTRT/DLSS Ampere cards must have been a _lot more_ power efficient than previous generation cards.
> ...



You are actually REALLY wrong.

Actually, it's beating the historic average perf/$ in that price range.






(See the "GPU Performance Trajectories" sheet on docs.google.com.)
				




The fact that you compared it to the 970 launching at $330 really doesn't make sense either.

Most people complaining this gen are more hung up on the price attached to the model number than on comparing price/perf at given price points, which annoys me slightly.


----------



## HTC (Dec 1, 2020)

W1zzard said:


> From what I understand 6:00 a.m. PT, 3:00 p.m. CET, 2:00 p.m. UK
> 
> 
> Yeah, it's a lot of data, *I'll think of a better presentation for the 2021 reviews*



If I may make a suggestion: how about showing without RT on one side and with RT on the other, PER GAME, so the without/with difference can be seen more clearly? A third graph below could show DLSS turned on, to compare that to "just RT".


----------



## Vya Domus (Dec 1, 2020)

squallheart said:


> Actually, it's beating the historic average perf/$ in that price range.



That means nothing as long as overall pricing has increased over the previous SKU it replaced; show me a time in history when perf/$ has gone down. This is hardly an achievement. It's the same story with performance/watt: yes, Ampere is more power efficient, but that doesn't change the fact that it peaks at 400 W.


----------



## bobmeix (Dec 1, 2020)

@W1zzard 
Oh, you have added the weight of the cards! Much appreciated!


----------



## mechtech (Dec 1, 2020)

Vayra86 said:


> No just find its relative performance in the TPU database!



Thanks,

The only games I have/play on TPU's extensive list are BL3 and Witcher 3, so I'll try to compare those games specifically.


----------



## W1zzard (Dec 1, 2020)

mechtech said:


> Thanks,
> 
> The only games I have/play on TPU's extensive list are BL3 and Witcher 3, so I'll try to compare those games specifically.


Oh wait, I just realized we dropped the RX 480 from the test group in the last rebench; only the RX 580 now. But you should be able to extrapolate.


----------



## robb (Dec 1, 2020)

Valantar said:


> Well, that came out of the blue. Good to see an actual hard launch, and competition seems to be working given the good price/perf of this GPU. Also a sentence nobody would have expected a year or two ago: it's nice to see Nvidia catch up to AMD's efficiency! XD Here's hoping supplies last at least a little.
> 
> This makes me very hopeful for the RX 6700 and 6700 XT - the RX 6800 is rather weirdly priced, even if it does beat the 3070 overall, so I'm hoping Nvidia being early with a good contender at $400 forces AMD's hand in making the next tier down a $400 GPU as well. Anything above that would be disappointing, even if it's faster.


What do you mean hard launch? These cards aren't even available anywhere yet.


----------



## Chrispy_ (Dec 1, 2020)

Vayra86 said:


> Why would you pay $600, though? I really don't follow this logic. GPUs are nonessential; why pay more than you want to?
> 
> It is exactly that twisted logic that gave scalping its business case.


Which is exactly why I'm not buying one; I don't make up the current market prices, they are what they are and until the supply matches demand these won't sell at MSRP anywhere, so it's not even relevant to the discussion.

IMO if you have an RTX card already, you're good for the moment. If you don't have one just yet, you need to wait or get fleeced, and I can strongly recommend "not getting fleeced" as the better option.


----------



## HenrySomeone (Dec 1, 2020)

ShurikN said:


> Yeah, Samsung's 8nm has been amazing; that's why you have Ampere cards filling up stores and shelves all across the world.


The 3070 at least is quite readily available; not at MSRP, true, but still... The 6800 series, on the other hand, is simply non-existent "in the wild", and by the time that notably changes, an overwhelming majority of people in the market for a new high-end GPU will have already bought an Ampere card.


----------



## Vayra86 (Dec 1, 2020)

mechtech said:


> Thanks,
> 
> The only games I have/play on TPU's extensive list are BL3 and Witcher 3, so I'll try to compare those games specifically.



But will you play those indefinitely? It's generally pretty samey, but new engines do present a different sort of load that works out differently on newer architectures, too. The overall relative performance is where it's really at, I think, and then for your specific resolution (and any higher one you might target). Games get sequels, much like BL3 and TW3 did, and engines get updated. With a new console gen, big changes happen. Point is, those two games only tell you about those two games; their numbers become less relevant as time passes and the engines get outdated.

Bigger datasets generally improve accuracy.


----------



## Valantar (Dec 1, 2020)

RandallFlagg said:


> Pretty much every major OEM is shipping systems with the 3070, 3080, and some with the 3090. Nvidia is at least supplying the OEMs, and they are clearly opting to sell entire systems to capitalize on the popularity. AMD apparently has nothing to sell, as not even OEMs have the 6800/6800 XT, and most don't have Zen 3 either.
> 
> There are literally pages of them on Amazon and most you can get by the end of this week.
> 
> ...


That's a false equivalence. Nvidia's head start from launching their GPUs a month earlier entirely accounts for that difference, especially when you factor in the preferential treatment Nvidia gets (arguably buys, but never mind that; they all pay marketing incentives, after all) from most OEMs in terms of which products get featured and put into which configurations. For now we simply don't have the data to make that kind of comparison. Saying "this product that has been on the market for a month and a half is available in more prebuilts than this one that launched two weeks ago" is an inherently biased comparison.



robb said:


> What do you mean hard launch? These cards aren't even available anywhere yet.


Announcement December 1st, release December 2nd? That's about as hard a launch as you get, at least if you want to be somewhat nice to your partners and give them time to actually get reviews out before sales start.


----------



## RandallFlagg (Dec 1, 2020)

So I'm not getting all these cards and the power draws. 

Looking at this old review of a 980, it had a peak power consumption (not overclocked) of around 185W.  

I remember ~5 years ago, word was nobody really needed a PSU > 600W unless they were doing crossfire or some such, and most people were fine with 450W which was rapidly becoming the norm. 

Now we have 3060 Ti at > 200W peak loads, not OC'd.   

Seems like these cards are made for the 1% of users that have rigs that can meet those specs, common here, but not so common in the wider market.

To make use of something like this 3060 Ti (and all other recently released GPUs) you need:

- A CPU no less than a 3600 or 9600; even then you're very CPU limited on a 3070+ or 6800+, and even this 3060 Ti will get CPU limited quite a bit. If you are on a 2600, or even a 7700K, especially with slower RAM, you'll be very disappointed. Just look at the Tomb Raider benchmarks on the benchmarks forum; those CPUs will make a 3080 perform like a 2070 - 2070 Super.
- A 600W+ power supply, 750W+ for higher cards.

So like, what % of users has this? It's gonna be close to zero. This kind of stuff might push a lot of people towards consoles.

Edit: This is what I'm talking about, ripped from that thread.  A 3800X is 100% CPU limited with a 6800XT on Tomb Raider :


----------



## mahirzukic2 (Dec 1, 2020)

RandallFlagg said:


> So I'm not getting all these cards and the power draws.
> 
> Looking at this old review of a 980, it had a peak power consumption (not overclocked) of around 185W.
> 
> ...


I would actually love that as that would force the prices of the GPUs down, so a win win for the customers.


----------



## evernessince (Dec 1, 2020)

Mistral said:


> Seems like an excellent mainstream card.
> So, the average person will be able to buy one some time next year?



Don't know if I'd call $400 mainstream.  The GTX 960 launched at $200, literally half the price.  That's where the mainstream pricing should be and no amount of inflation can account for a doubling of price.

It is amazing to me that more people are not calling out the pricing.


----------



## Fluffmeister (Dec 1, 2020)

Not bad at all, and it seems to be priced competitively against the likes of the 5700 XT.


----------



## RandallFlagg (Dec 1, 2020)

mahirzukic2 said:


> I would actually love that as that would force the prices of the GPUs down, so a win win for the customers.



Ya, but a potential long-term loss for the PC gaming industry.


----------



## Frick (Dec 1, 2020)

So I assume <€200 GPUs is a thing of the past now.


----------



## neatfeatguy (Dec 1, 2020)

Lycanwolfen said:


> 2160P not even impressed. My 2x 1070Ti's in SLI still spank the newer stuff unless its 2x 3090's in Sli. 4k gaming is where it's at and the problem is without SLi it like running in slow motion. I pushing my games around 100 fps in SLi. Heck I bet my 660 Ti's in SLI would beat the 3060ti in 1080P benchmarks. I understand directx 12 does not support it much but most games today still running dx11. Imagine if two 3060ti's was allowed to run SLi it would be 800.00 ish for same performance of a 3080 or better. But Nvidia is not only allowing the super rich afford 4k gaming.



DX12 supports SLI/Crossfire. Multi-GPU support, however, needs to be configured in DX12 by the developer. If it's not set up, having multiple cards won't do you any good.

DX11/10/9 SLI support is only possible through driver profiles from Nvidia. Nvidia has announced they will no longer support SLI, so even if new games release under DX11, without driver support multiple cards won't mean squat.


----------



## ShurikN (Dec 1, 2020)

Frick said:


> So I assume <€200 GPUs is a thing of the past now.


Gt 1030 replacement incoming at $150


----------



## hardcore_gamer (Dec 1, 2020)

birdie said:


> A great card except:
> 
> Ridiculously overpriced and people are seemingly not concerned at all. Like really? The GTX 970 launch price was $330!



GTX 900 series is an interesting point because it was on 28nm. Per-transistor cost reductions slowed down somewhere around this point. 






Per-chip costs are going up for GPUs (with significant transistor-count increments each generation). I doubt we'll see $300-350 upper mid-range cards ever again.


----------



## Sithaer (Dec 1, 2020)

evernessince said:


> Don't know if I'd call $400 mainstream.  The GTX 960 launched at $200, literally half the price.  That's where the mainstream pricing should be and no amount of inflation can account for a doubling of price.
> 
> It is amazing to me that more people are not calling out the pricing.



I'm also wondering this, yes prices do go up but this is just crazy and not what I would call mainstream at all.

Tho I noticed this trend lately around here, certain ppl calling 4k gaming the new standard and 400-700$ GPUs completely normal priced and affordable to the average user.

While I don't want to be mean or anything but imo those ppl need a reality check and also realize that the world/market is bigger than USA and prices and values can be completely different depending on where ppl live.

I was a part of a Discord server for around 1 year, hardware/gaming related and there were like 300-400 ppl from all over the world and most of the ppl there had budget systems ranging from 750 Ti-s/1030s and other budget cards and the 1060s/RX 580s were already in the minority.
High end users were only a few.

So far the new gen cards _'both AMD and Nvidia_' are nothing but completely overpriced things only for the higher end/richer users.

At this rate are we gonna pay $300+ for an RTX 3050 in my country, if it ever exists? What a joke.


----------



## Chrispy_ (Dec 1, 2020)

evernessince said:


> Don't know if I'd call $400 mainstream.  The GTX 960 launched at $200, literally half the price.  That's where the mainstream pricing should be and no amount of inflation can account for a doubling of price.
> 
> It is amazing to me that more people are not calling out the pricing.



For a start, a Ti isn't the same as the regular card; it has historically been priced at $300, not $200.

GTX 260 core 216 (like a Ti) = $300
GTX 465 (basically a 460Ti) = $280
GTX 560Ti = $290
GTX 660Ti = $300

Inflation from back then means that $300 is about $350 now, and you have to account for not just inflation but also the trade sanctions against China that now hurt pricing too.

So calling them out for pricing these at $400 is baseless. If anything, the cut from 970 to 960 was much greater than typical for that generation so the 960 at $200 was simply priced according to its reduced performance. If you want further evidence of that, the vanilla GTX 760 that came before it was $250 and the 1060 that came after it was also $250. The 960 itself is a pricing anomaly so basing your argument on it alone is a mistake.

Perhaps Nvidia wanted to price the 960 at $250 but they couldn't because the old HD7950 was really close in performance and those were still selling for $160 with 50% more GDDR5 than the 960 too. The same card, rebranded to the R9 280 was also only $200 and had been in the market for a few months already, so there were discounts on that too; The 960 brought competition to the $200 price point but wasn't really a good deal even then because the market price of the competition was lower than the MSRP suggested. Sure, the GTX 960 sold but it wasn't an easy recommendation for people on a strict budget as the 2GB cards were already starting to struggle in some games and 3GB cards with similar performance were available for less; All it had going for it was lower power consumption.
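That inflation adjustment is easy to sanity-check. A minimal sketch, assuming a ~1.8% average US inflation rate over the eight years since the GTX 660 Ti's 2012 launch (the rate is my assumption, not a figure from the post):

```python
# Compound a 2012 launch price forward at an assumed average inflation rate.
price_2012 = 300           # typical x60 Ti launch price cited above
annual_inflation = 0.018   # assumed ~1.8% average US CPI, 2012-2020
years = 8

price_2020 = price_2012 * (1 + annual_inflation) ** years
print(round(price_2020))   # ~346, close to the "$300 is about $350" figure
```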


----------



## RandallFlagg (Dec 1, 2020)

Sithaer said:


> I'm also wondering this, yes prices do go up but this is just crazy and not what I would call mainstream at all.
> 
> Tho I noticed this trend lately around here, certain ppl calling 4k gaming the new standard and 400-700$ GPUs completely normal priced and affordable to the average user.
> 
> ...



Edit: added higher end 10X0 cards -

Really has nothing to do with being in the US. 

Steam HW survey pretty much says it all. I just totaled the percent of users on 20X0 cards + 5700/XT + 5600/XT + 1070/1080 variants, and it is about 21%. That would imply that ~79% of gamers are using a 1660 Ti / 5500 XT or less.

Using that metric - and usually when we talk about midrange it means where half are above this and half are below this - even a 1660 or 5500XT could be considered high end. 

Looks like you have to drop all the way down to 570 / 1050 Ti / 1650 to be middle of the pack.


----------



## Sithaer (Dec 1, 2020)

RandallFlagg said:


> Edit: added higher end 10X0 cards -
> 
> Really has nothing to do with being in the US.
> 
> ...



I get that yea but what I meant is that ppl who comment such things are mainly from richer countries and USA.

Anyway, I digress. It's just what I observed in the past months, and I will never agree that a $400 _'in reality 500-700$'_ card is mainstream or ever will be for most ppl.

There's like a 90% chance that I will buy an RX 5600 XT in January, and that will be my card for the next ~3 years _'I keep my cards for 2-3 years always'_. I was hoping that maybe I could grab the lower-end new gen cards, but seeing these prices, yea, ain't happening.


----------



## dyonoctis (Dec 1, 2020)

Lycanwolfen said:


> Heck I bet my 660 Ti's in SLI would beat the 3060ti in 1080P benchmarks.


Thanks, I had a good laugh


----------



## RandallFlagg (Dec 1, 2020)

Okay here is the big question.

What time do they go up for sale?


----------



## ahmadmob (Dec 1, 2020)

I registered here to give my special thanks to W1zzard for all these fantastic reviews.

Thanks again for the awesome work. I can imagine how hard and time consuming these reviews and charts can be, but know that we are all very grateful for that hard work.

Keep it up please.


----------



## tancabean (Dec 1, 2020)

evernessince said:


> Don't know if I'd call $400 mainstream.  The GTX 960 launched at $200, literally half the price.  That's where the mainstream pricing should be and no amount of inflation can account for a doubling of price.
> 
> It is amazing to me that more people are not calling out the pricing.



Mainstream for me is still $200-$300. But that’s now the domain of x50 parts. Which is fine to be honest as those cards still work well for 1080p. A $400 3060 is definitely not mainstream.


----------



## Fourstaff (Dec 1, 2020)

Nice price/perf if they can sell at $400, but I doubt they will be reaching that soon. Foundry undercapacity coupled with enlarged demand (from miners and analytics) means we will have to get comfortable with elevated prices for the next few years. On the bright side for mainstream gamers, the RX 570 can be had for less than $150. Still plenty powerful for 1080p gaming.



tancabean said:


> Mainstream for me is still $200-$300. But that’s now the domain of x50 parts. Which is fine to be honest as those cards still work well for 1080p. A $400 3060 is definitely not mainstream.


If 1080p is mainstream gaming, then x50 cards are more than capable.


----------



## dicktracy (Dec 1, 2020)

Almost on par with the overpriced 6800XT in Ray Tracing without using DLSS. Ouch!


----------



## Caring1 (Dec 1, 2020)

"NVIDIA's GeForce RTX 3060 Ti comes at incredible pricing of $399 "
It's incredible literally, because no one believes it.


----------



## Mussels (Dec 2, 2020)

This seems like a great replacement for my 1080 8GB, similar wattage and a good chunk more performance.

It's hurting me inside how power hungry modern GPU's are tho


----------



## r9 (Dec 2, 2020)

I'd be fine getting a 2070 Super for $300, but that won't happen any time soon as people are still asking $500 for their old shit. Everybody wants to sell their old crap, use the money to upgrade, and profit $100 on top.


----------



## mechtech (Dec 2, 2020)

Vayra86 said:


> But will you play those indefinitely? Its generally pretty samey, but new engines do present a different sort of load that does work out differently on newer architectures, too. The overall relative performance is where its really at, I think, and then for your specific resolution (and a higher one you might target). Games do get sequels much like BL3 and TW3 did, and engines do get updated. With a new console gen, big changes happen. Point is, those two games only tell you about those two games - their number becomes less relevant as time passes and the engine gets outdated.
> 
> Bigger datasets generally improve accuracy



Won't argue with that, however if I'm gonna sink hard earned cash into a card, it would be based on the games I own/play currently.



W1zzard said:


> Oh wait, I just realized we dropped RX 480 from the test group for last rebench, only RX 580 now. But you should be able to extrapolate
> 
> View attachment 177753View attachment 177754



Now that's service 

Thanks


----------



## evernessince (Dec 2, 2020)

Frick said:


> So I assume <€200 GPUs is a thing of the past now.



If so that's a huge problem.  The vast majority of the PCs I have sold have had GPUs at $200 USD or under.



ShurikN said:


> Gt 1030 replacement incoming at $150



That's for the DDR3 version, gotta pay at least $200 for that GDDR5.



Sithaer said:


> I'm also wondering this, yes prices do go up but this is just crazy and not what I would call mainstream at all.
> 
> Tho I noticed this trend lately around here, certain ppl calling 4k gaming the new standard and 400-700$ GPUs completely normal priced and affordable to the average user.
> 
> ...



For sure the bulk of the market is in the lower price segments.  I don't see this pricing being healthy for the PC platform in general.



tancabean said:


> Mainstream for me is still $200-$300. But that’s now the domain of x50 parts. Which is fine to be honest as those cards still work well for 1080p. A $400 3060 is definitely not mainstream.



The RX 480 / GTX 1060 had 1080p down pretty well years back and the launch price of the RX 480 was $200.  The 1650 is slower and the 1660 is not much faster and priced higher.  That's considering the RX 480 was launched 4 years ago.

The lack of improvement in price to performance is a problem, especially for budget cards.


----------



## candle_86 (Dec 2, 2020)

Chrispy_ said:


> For a start, the Ti isn't the same as the regular card, it's historically been priced at $300 not $200.
> 
> GTX 260 core 216 (like a Ti) = $300
> GTX 465 (basically a 460Ti) = $280
> ...



Real fast: the GTX 465 wasn't faster than a GTX 460; it was a stopgap because the GTX 460's GF104 wasn't ready yet.

Here is the trend over the last decade for price per bracket by card series. Some things stick out: top-end cards have moved from $700 to $1,200, and the midrange has moved from $250 to $499.

I think Nvidia has lost touch. The GTX 1660 was not a good replacement for the GTX 1060 6GB; it was more of a sidegrade, at times maybe 10% better. The real improvement meant moving up a bracket, out of midrange territory and into the range of the high end.

A few oddities to notice: the GeForce 3 was overpriced, but like later Nvidia cards, ATI had nothing to compete with it. And the GTX 200 series is all over the place on launch prices; you can tell when AMD launched the HD 4000 series. So it seems that normally, when Nvidia has competition, prices go down - but not this time.

But going back in time, the GeForce2 Ultra at $500 is only equal to about $750 in today's money, so it's not just inflation.


$2000-3000: GTX Titan Z, GTX Titan V, Titan RTX
$1000-2000: GTX 690, GTX Titan Black, GTX Titan, GTX Titan X, GTX Titan XP, RTX 2080 Ti, RTX 3090
$700-1000: 8800 Ultra, 9800 GX2, GTX 590, GTX 780 Ti
$600-700: 7800 GTX 512, 7950 GX2, 8800 GTX, GTX 780, GTX 980 Ti, GTX 1080 Ti, RTX 2080 Super, RTX 2080, RTX 3080
$500-600: GeForce2 Ultra, GeForce3 Ti 500, GeForce 3, Ti 4800, Ti 4600, FX 5800 Ultra, FX 5900 Ultra, FX 5950 Ultra, 6800 Ultra Extreme, 7900 GTX, GTX 280, GTX 480, GTX 580, GTX 680, GTX 980, GTX 1080, RTX 2070 Super, RTX 2070, RTX 3070
$400-500: 6800 Ultra, 7800 GT, 8800 GTS 640, GTX 295, GTX 260, GTX 670, GTX 770, GTX 1070 Ti, RTX 2060 Super, RTX 3060 Ti
$300-400: GeForce2 Pro, GeForce2 GTS, Ti 200, Ti 4400, FX 5800, FX 5900, 7800 GS, 7950 GT, 7900 GT, 8800 GTS 512, 8800 GTS 320, 9800 GTX, GTX 285, GTX 260 216, GTX 470, GTX 570, GTX 660 Ti, GTX 970, GTX 1070, RTX 2060
$250-300: GeForce2 Ti, 6800 GT, 7900 GS, GTX 275, GTX 465, GTX 560 Ti 448, GTX 1660 Ti
$200-250: Ti 4200, FX 5900 XT, 6800, 8800 GT, 9800 GT, GTX 460, GTX 560 Ti, GTX 660, GTX 760, GTX 960, GTX 1060 6GB, GTX 1660 Super, GTX 1660
$150-200: MX 400, MX, MX 460, FX 5700 Ultra, FX 5700, FX 5600 Ultra, FX 5600, 6800 LE, 6600 GT, 7600 GT, 8800 GS, 8600 GTS, 9600 GT, GTS 250, GTX 460 768MB, GTX 560, GTX 650 Ti Boost, GTX 650 Ti, GTX 750 Ti, GTX 950, GTX 1060 3GB, GTX 1650 Super
$100-150: MX 440, FX 5700 LE, FX 5600 LE, FX 5200 Ultra, 6600, 7600 GS, 8600 GT, 9600 GSO, GT 240, GTS 450, GTX 550 Ti, GTX 650, GTX 750, GTX 1050 Ti, GTX 1050, GTX 1650


----------



## Valantar (Dec 2, 2020)

RandallFlagg said:


> So I'm not getting all these cards and the power draws.
> 
> Looking at this old review of a 980, it had a peak power consumption (not overclocked) of around 185W.
> 
> ...


You're not wrong about the power draws increasing this generation, but a GPU running at ~210W, peaking at ~220W does not require a >600W PSU unless your build is _really_ out there. Most PCs these days have one SSD, _maybe_ one HDD, a few fans, perhaps an AIO pump, and that's it. Combine the GPU power draw with the real-world power draw of a CPU in a matching price range, like the 3600 or 5600X, you have about 300W, add in another 50-70W for the motherboard, RAM, storage and fans, and add another 20% or so for safety and margin for PSU wear. That leaves you with a minimum PSU of ~420-440W. So even a 500W PSU would be _plenty_ and would leave you room for future upgrades too (though statistically the chance of someone moving significantly up in power level from their current GPU when they upgrade is rather unlikely - the far more likely thing is for them to buy a newer GPU in the same tier, with roughly comparable power draw). This of course assumes one buys PSUs of reasonable quality from reliable manufacturers, but that's a given.
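For what it's worth, that arithmetic can be laid out explicitly. A rough sketch using the post's own estimates (the individual wattages are approximations, not measurements):

```python
# PSU sizing estimate for a 3060 Ti class build, per the reasoning above.
gpu_peak_w = 220        # 3060 Ti peak gaming power draw
cpu_w = 80              # real-world gaming draw of a 3600/5600X-class CPU
rest_w = 70             # motherboard, RAM, storage, fans (upper estimate)
margin = 0.20           # ~20% headroom for transients and PSU aging

recommended_w = (gpu_peak_w + cpu_w + rest_w) * (1 + margin)
print(round(recommended_w))  # ~444 W, so a quality 500 W unit is plenty
```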


----------



## dont whant to set it"' (Dec 2, 2020)

I just imagined the Ozzymanreviews bloke saying "Yeah ... Nah , just nah" at 400 buckaroos for a maybe 250 to 275 $ worth.


----------



## nguyen (Dec 2, 2020)

If the standard 3060 comes out with 6GB of VRAM, 10% slower than the 3060 Ti at $330, it's gonna be an instant smash hit that could replace the almighty 1060 very soon.


----------



## medi01 (Dec 2, 2020)

Flanker said:


> That power efficiency is pretty good. Wonder if AMD cards in this perf segment can beat it




So, napkin math:

3070 - 5888 faux shaders (real figure: 2944)
3060Ti - 4864 faux shaders (real figure: 2432)
(note: just because shader can do fp+fp doesn't turn it into 2 shaders)

Drop vs 3070: 17.4%


6800 - 4608 shaders  (6800 is 5%+ faster than 3070)
6700 - 2560 shaders

Drop vs 6800: ~44%
It's rumored to hit 3 GHz; it should still be slower, but much cheaper to produce too.


3060Ti is likely the chip NV hoped to label as 3070 or even 3070Ti, before Lisa hammered Jensen with RDNA2.
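The napkin math above, spelled out (shader counts as listed in the post; the percentages are simple count reductions, not performance predictions):

```python
def drop_pct(full: int, cut: int) -> float:
    """Percent reduction in shader count going from `full` to `cut`."""
    return (full - cut) / full * 100

# NVIDIA "faux" (doubled FP32) shader counts
print(f"3060 Ti vs 3070: {drop_pct(5888, 4864):.1f}%")  # 17.4%
# AMD stream processor counts
print(f"6700 vs 6800:    {drop_pct(4608, 2560):.1f}%")  # 44.4%
```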



W1zzard said:


> ...and even the RX 6800 in raytracing...


I don't think that's a fair assessment of the cards' capabilities, as RT performance varies wildly from Control to Dirt 5, with COD somewhere in the middle - and with only 2 games tested, one (both?) is green-sponsored.







Chrispy_ said:


> $399 for something approaching a 2080S. Ampere offers theoretical improvements in performance/$ across the whole range.


Except 2080s is hard to be called a "fairly priced card".


----------



## Xaled (Dec 2, 2020)

Once again TPU falls into Nvidia's trap and participates in the fraud of announcing an unavailable product at a fake MSRP.
Shame!


----------



## medi01 (Dec 2, 2020)

Xaled said:


> Once again TPU falls in the trap of nvidia and participate in the fraud of announcing an unavailable product at a fake MSRP.
> Shame!


How could TPU control pricing/MSRP pretty please?

And it's not a secret figure either, you can see it with your eyes easily (unlike performance, which is what this site is testing)


----------



## Chomiq (Dec 2, 2020)

r9 said:


> Is there a specific time when these go on sale tomorrow ?


Probably 2PM GMT.


----------



## Xaled (Dec 2, 2020)

medi01 said:


> How could TPU control pricing/MSRP pretty please?
> 
> And it's not a secret figure either, you can see it with your eyes easily (unlike performance, which is what this site is testing)


If the price is fake, then the whole review is invalid or meaningless, because a major part of the conclusion is based on the price/performance of the card - especially when this segment is supposedly a price/performance segment.
Not to mention that the whole Founders Edition has just been fake - a paper launch, only sent to media. No FE has actually been available. They are either not sold at all or sold in very limited numbers, just to avoid getting sued or running into legal problems.


----------



## the54thvoid (Dec 2, 2020)

Pretty much most top-end, or 'cutting-edge' process tech is thin on the ground. The foundries are having problems supplying demand, whether it's Nvidia and Samsung, or AMD and TSMC. The one thing that truly is missing, however, is patience. From most people, it seems.


----------



## W1zzard (Dec 2, 2020)

Xaled said:


> Once again TPU falls in the trap of nvidia and participate in the fraud of announcing an unavailable product at a fake MSRP.
> Shame!


Have you read the last paragraph? I basically say "I think these prices are fake"


----------



## Xaled (Dec 2, 2020)

W1zzard said:


> Have you read the last paragraph? I basically say "I think these prices are fake"


I've definitely read that. And I believe that is not enough. Because not only the prices are fake, but the product itself is fake too. It is an (almost) always out-of-stock product, only available at launch, only for media, with of course, a fake price.


----------



## Valantar (Dec 2, 2020)

Prices are definitely ... not there. Here in Sweden, the MSRP is supposedly SEK 4500, yet the cheapest one I can find (all of which are fully booked by pre-orders, from what I can tell) is at 4800, with most being 5000 or more.

Still, I think @W1zzard did a good job in pointing out this issue. It's not as if TPU can help this.

Still, it makes me wonder what exactly is happening with tech pricing these days. Consoles being a bit more expensive than previously is understandable to a certain degree, and of course the effects are being exacerbated here due to exchange rates fluctuating, but with GPUs in particular ... something must be off, with relatively high MSRPs, yet those still producing 0% margins for partners, all the while there is no supply to speak of. Are these new process nodes _that_ expensive? Are the chips _that_ big? Are yields _that_ bad? Are we in a new, less visible crypto boom? Is TSMC 7nm _that_ squeezed for capacity? Is Covid having a bigger than expected effect on manufacturing and shipping? Is (regular end-user, not miner, datacenter, etc.) demand _that_ much higher than previously?


----------



## EzioAs (Dec 2, 2020)

Valantar said:


> Is Covid having a bigger than expected effect on manufacturing and shipping?



Yes.


----------



## SIGSEGV (Dec 2, 2020)

NVIDIA has full fabs at Samsung yet failed to deliver the products. It's so ironic. 
Well, it's completely weird.  It's a shady business to jack the price up. I don't believe the scalpers.  
I believe they are laughing at their customers right now.


----------



## HenrySomeone (Dec 2, 2020)

Xaled said:


> I've definitely read that. And I believe that is not enough. Because not only the prices are fake, but the product itself is fake too. It is an (almost) always out-of-stock product, only available at launch, only for media, with of course, a fake price.


You're exaggerating wildly; I fully expect the 3060 Ti to be readily available at not much over MSRP in a couple of weeks or less. The main reason being, it only matches the second-best card of the previous generation in performance (the 2080 Super), which means it won't be interesting to owners of the 1080 Ti, 2080, 2080S, 2070S, 2070, or 5700 XT, and likely also not to those with a 2060S, 5700, 2060, 1080, Vega 64, or even a 5600 XT, 1070 Ti, or Vega 56.


----------



## nguyen (Dec 2, 2020)

Well, as flight crew I can chime in on why product costs are going up: shipping costs are rising to accommodate each country's quarantine rules.

Before Covid: daily flights.
Now: 3 cargo flights per week, because crews have to remain in quarantine (14 days) after overseas flights, leaving not enough crew.
So yeah, I have been in quarantine for the past month doing cargo flights. I would assume the same things are happening to other airlines and air crews.
Basically, air crews are being treated as potential overseas virus carriers.

Whoever keeps complaining about no GPU availability can suck it


----------



## ShurikN (Dec 2, 2020)

the54thvoid said:


> Pretty much most top-end, or 'cutting-edge' process tech is thin on the ground. The foundries are having problems supplying demand, whether it's Nvidia and Samsung, or AMD and TSMC. The one thing that truly is missing, however, is patience. From most people, it seems.


It's easy for you to be patient with a 2080ti. Some people are stuck on turds and wanna upgrade in time for CP2077.


----------



## medi01 (Dec 2, 2020)

Caring1 said:


> "NVIDIA's GeForce RTX 3060 Ti comes at incredible pricing of $399 "
> It's incredible literally, because no one believes it.


Craziest part here is that nobody believes an xx60 card is priced at $400 and it's because people think it's too cheap.


----------



## Valantar (Dec 2, 2020)

medi01 said:


> Craziest part here is that nobody believes an xx60 card is priced at $400 and it's because people think it's too cheap.


People don't think it's too cheap - that wording implies that people think they're charging less than it's worth. People don't believe it'll actually be available at that price, due to current market realities. Also, an xx60 Ti card is not an xx60 card.


----------



## tancabean (Dec 2, 2020)

SIGSEGV said:


> NVIDIA has full fabs at Samsung yet failed to deliver the products. It's so ironic.
> Well, it's completely weird.  It's a shady business to jack the price up. I don't believe the scalpers.
> I believe they are laughing at their customer right now.



Who says the problem is at Samsung? There's a whole line of suppliers and vendors involved in getting a GPU onto shelves.


----------



## Vayra86 (Dec 2, 2020)

Valantar said:


> You're not wrong about the power draws increasing this generation, but a GPU running at ~210W, peaking at ~220W does not require a >600W PSU unless your build is _really_ out there. Most PCs these days have one SSD, _maybe_ one HDD, a few fans, perhaps an AIO pump, and that's it. Combine the GPU power draw with the real-world power draw of a CPU in a matching price range, like the 3600 or 5600X, you have about 300W, add in another 50-70W for the motherboard, RAM, storage and fans, and add another 20% or so for safety and margin for PSU wear. That leaves you with a minimum PSU of ~420-440W. So even a 500W PSU would be _plenty_ and would leave you room for future upgrades too (though statistically the chance of someone moving significantly up in power level from their current GPU when they upgrade is rather unlikely - the far more likely thing is for them to buy a newer GPU in the same tier, with roughly comparable power draw). This of course assumes one buys PSUs of reasonable quality from reliable manufacturers, but that's a given.



Quality > wattage. The impact of ripple is getting higher as the peak demands of GPUs go up, and they do. Clean power delivery matters, but a high-wattage unit that's somewhat less impressive in terms of ripple can still end up with lower ripple at the typical load.



candle_86 said:


> real fast the GTX465 wasn't faster than a GTX460, it was the stop gap because GTX460 GF104 wasn't ready yet.
> 
> Here is the Trend over the last decade for price per bracket for which series card, some things stick out Top end cards have move from 700 to 1200, the midrange moved from 250 to 499.
> 
> ...



It's not just inflation. It's a marketplace. Lots of factors are of varying influence; inflation is just a constant one.

Even if the MSRPs are set, we see that external influences do affect price. Currently we have a lot of those factors stacking up. Covid, trade war, Christmas shopping, past gen with lacking competition, and a good product line on both sides today after several years of weak releases.

But another aspect I always see people omitting is the fact that right now the performance delta between the lowest-end and the highest-end card is absolutely friggin massive. It wasn't always like that. You can now buy 1080p-capable cards in the lower regions (as in ultra 60-120 fps capable, which used to be a holy grail just a few years back), and when you hit the midrange, you'll be set for 1440p gaming. Since Pascal, by the way - which also got the first notable price hike per tier. I said back then it was justified and I still believe it is. Pascal offered a major jump, bigger than most generations before it; it's not weird Nvidia wanted to cash in on that - you get what you pay for. Ironically, the current price point of RDNA2 confirms that, too. Pascal was priced right.

Right now people are somehow convinced they MUST game at 4K and then start complaining the cards are so expensive... do we even logic. Similar ideas apply to quality and FPS targets... everyone has to have high refresh rate ultra it seems, even if the wallet isn't deep enough to get there.

Has Nvidia lost touch... meh. Not sure. The stars just didn't align for Ampere for them, I think. They're still leading in many ways, but that leadership is definitely shifting away from gaming and more towards datacenter/enterprise. That began properly with Volta and their attempt to acquire ARM. All they need going from there to feed Geforce is just a trickle down.


----------



## Valantar (Dec 2, 2020)

Vayra86 said:


> Quality > wattage. The impact of ripple is getting higher as the peak demands of GPUs go up, and they do. Clean power delivery matters, but high wattage on a somewhat less impressive unit in terms of ripple, can still end up with lower ripple at the typical load.


Yeah, PSU quality is so much more important than the rated wattage. It's a damn shame that no PSU manufacturer is willing to make good-quality, gold or higher rated 400-500W modular units at affordable prices, as that would fulfill the needs of >90% of users. Sadly these likely wouldn't sell well as too many people still believe in and regurgitate the decade-old silliness of planning for 2x the necessary capacity - which was smart back when you couldn't trust the wattage rating of your PSU, but is a load of rubbish these days. The average gaming PC (something like an i5 or Ryzen 5 + a 1060 or similar) doesn't even hit 300W internal power draw under gaming loads, and likely not even under a power virus load, so people buying 600W+ PSUs for these things is just silly. Of course the CYA approach of "minimum PSU wattage" ratings for components just exacerbates this, as Nvidia and AMD both seem to factor in the most outlandish configurations, worst PSUs, and still add heaps of headroom to these numbers. I've been happily running my i5-2400+RX 570 off of a 350W (fanless, custom AC-DC+PicoPSU) PSU for a year or so, and it doesn't even get close to warm under load, and IIRC it doesn't consume more than ~230W at the wall, including PSU losses, under a torture load - so I've got plenty of headroom for upgrades, even. I'm greatly looking forward to the day when people start buying sensible PSUs for themselves.


----------



## okbuddy (Dec 2, 2020)

MSRP is too low, so the market is adjusting the prices according to performance to boost the econ performance


----------



## ThrashZone (Dec 2, 2020)

W1zzard said:


> Have you read the last paragraph? I basically say "I think these prices are fake"


Hi,
Probably should have been in the first paragraph lol


----------



## W1zzard (Dec 2, 2020)

ThrashZone said:


> Hi,
> Probably should have been in the first paragraph lol


I always write my conclusions following a certain flow; the pricing discussion is at the end. I always thought it makes sense to learn more about the product first and then talk about pricing


----------



## ThrashZone (Dec 2, 2020)

Hi,
It takes a long time usually 4-6 months for prices to settle after a release.



W1zzard said:


> I always write my conclusions following a certain flow; the pricing discussion is at the end. I always thought it makes sense to learn more about the product first and then talk about pricing


Hi,
I've seen you add your opinions in the first paragraph too


----------



## candle_86 (Dec 2, 2020)

Vayra86 said:


> Quality > wattage. The impact of ripple gets bigger as the peak demands of GPUs go up, and they do. Clean power delivery matters, but a high-wattage unit that is somewhat less impressive in terms of ripple can still end up with lower ripple at the typical load.
> 
> 
> 
> ...



Disagree, Pascal was a return to normal where the mid-range was competitive, and then they decided meh. I can think of bigger generational jumps:

6600 GT vs FX 5950 Ultra
7600 GT vs 6800 Ultra
GTX 460 vs GTX 285
And you only had to wait a year between generations


----------



## RandallFlagg (Dec 2, 2020)

Valantar said:


> You're not wrong about the power draws increasing this generation, but a GPU running at ~210W, peaking at ~220W does not require a >600W PSU unless your build is _really_ out there. Most PCs these days have one SSD, _maybe_ one HDD, a few fans, perhaps an AIO pump, and that's it. Combine the GPU power draw with the real-world power draw of a CPU in a matching price range, like the 3600 or 5600X, you have about 300W, add in another 50-70W for the motherboard, RAM, storage and fans, and add another 20% or so for safety and margin for PSU wear. That leaves you with a minimum PSU of ~420-440W. So even a 500W PSU would be _plenty_ and would leave you room for future upgrades too (though statistically the chance of someone moving significantly up in power level from their current GPU when they upgrade is rather unlikely - the far more likely thing is for them to buy a newer GPU in the same tier, with roughly comparable power draw). This of course assumes one buys PSUs of reasonable quality from reliable manufacturers, but that's a given.



You're assuming no one ever OCs anything in their rig, that they all buy Founders Editions, never get a higher-end CPU, aren't using any USB 3.2 \ USB-C devices, have no SATA drives or HDDs, and so on.  Keep in mind these benchmarks are run on stripped-down systems.

A single HDD, for example, can consume 20W.  SSDs are better, but Tom's shows, for example, that using a WD Blue instead of a Samsung 850 can lop 30 minutes of battery life off a laptop.  It's not zero.

For me, I have a USB hub plugged into a USB-C port that I use to charge my phone, keyboard, mouse, and so on.  Separately I have a USB 3.1 external 3TB HDD.  I also have both a SATA SSD and an M.2 (tests only have one M.2), and a pcie wireless/bluetooth card.  This stuff all adds up, I bet there's an extra 50W draw in there, and if I plug in multiple devices to that hub it could be more.  I don't think this is unusual, plenty of folks have much more.  

This is why the rec from Nvidia is to have 650W for a 3070 and 750W for a 3080.    I'm sure Nvidia will rec 550W+ for a 3060 Ti.


----------



## Lizard (Dec 2, 2020)

Chomiq said:


> Probably 2PM GMT.



...until 2:06PM, as it turns out. A whole 6 minutes before it was no longer available in UK/Ireland. 
Same story, different day. What a joke. 

369 GBP/405 EUR as expected.


----------



## Valantar (Dec 2, 2020)

RandallFlagg said:


> You're assuming no one ever OC's anything in their rig, they all buy Founders Editions, they never get a higher end CPU, they aren't using any USB 3.2 \ USB-C devices, no SATA drives or HDD, and so on.  Keep in mind these benchmarks are on stripped down systems.
> 
> A single HDD for example, can consume 20W.  SSDs are better, but Tom's shows for example using a WD Blue vs a Samsung 850 can lop 30 minutes of battery life off a laptop.  It's not zero.


Nope. Please re-read - though there were a few details I left out: a core tenet of this approach is checking _real-world power draws_, in other words _the numbers for the GPU you're planning to buy_. Not generic numbers, not FE numbers (unless you're buying the FE), not total system power numbers. If you plan to OC, obviously factor that in, but the _vast_ majority don't OC, and besides, the 20% total overhead is typically sufficient to account for that. Getting a higher end CPU doesn't make that much of a difference - CPU power draws scale far less than GPU power draws, except for the past two generations of Intel chips, of course (though even those are _far_ from their peak draws while gaming). But again, the 20% headroom accounts for that.

USB devices generally consume little power, and are unlikely to be in heavy use while the PC is under heavy load, like a game being run. The same goes for drives - and as I said, the average gaming PC today has a single SSD and possibly a HDD. HDD peak power draw happens _only_ during spin-up, so the chances of that happening during gaming is ... tiny. In-use power for a 7200rpm 3.5" HDD is <10W. But more importantly: cumulative power numbers adds a lot of invisible headroom. Gaming _never_ stresses both CPU and GPU to their maximum power draw, let alone the rest of the system. So if you have a peak 90W CPU and a peak 150W GPU, you're never going to see 240W from those two components while gaming. Games don't load the whole PC 100%. So in real-world usage scenarios those additional 20% are _already on top of built-in headroom_.


RandallFlagg said:


> For me, I have a USB hub plugged into a USB-C port that I use to charge my phone, keyboard, mouse, and so on.  Separately I have a USB 3.1 external 3TB HDD.  I also have both a SATA SSD and an M.2 (tests only have one M.2), and a pcie wireless/bluetooth card.  This stuff all adds up, I bet there's an extra 50W draw in there, and if I plug in multiple devices to that hub it could be more.  I don't think this is unusual, plenty of folks have much more.


That is definitely above average, if not _uncommon_. As I said, most PC builds these days have a single SSD, and _maybe_ an HDD. Two years ago HDDs were ubiquitous, but not today. Mice and keyboards consume maybe a few watts each - they need to be USB 2.0 compliant, which means 2.5W max, though typically much less unless they have excessively bright RGB. Desktop USB-C ports output a maximum of 15W (5V3A) - that's all the specification allows for without an external PSU. And your HDD again might peak at 20W, but is more likely to be idling at 1-3W or running at 5-10W while the PC is being stressed.



RandallFlagg said:


> This is why the rec from Nvidia is to have 650W for a 3070 and 750W for a 3080.    I'm sure Nvidia will rec 550W+ for a 3060 Ti.


The thing is, even with your additional numbers, you get _nowhere near_ 650W. Not even close. The 3070 is a 220W (240W peak) GPU. Add a ~150W CPU, ~25W for the motherboard and RAM, 20W for a couple of drives, 20W for a few fans and an AIO pump, and another 10W for peripherals, and you get 465W, or 558W with a 20% margin. And again, _that system will never, ever consume 465W_. Never. That's not how PCs work. Every single component is _never_ under full stress at the same time, even for a millisecond, let alone long enough for it to trip the PSU's OCP. And remember, that's with a 150W CPU, not a 95W or 65W one. There is, in other words, plenty of headroom built into these numbers already. For any other 3070 than the FE, exchange 240W in the calculation with its peak power draw. It really isn't hard.
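The napkin math above can be sketched in a few lines. This is a rough sketch using the illustrative wattages from this post (ballpark peaks, not measurements for any specific build), with the 20% margin described here:

```python
# Rough PSU-sizing sketch using the illustrative wattages from this post.
# These are ballpark peaks, not measurements for any specific build.
def recommended_psu_watts(component_peaks_w, margin=0.20):
    """Sum component peak draws, then add a safety/aging margin."""
    total = sum(component_peaks_w.values())
    return total, round(total * (1 + margin))

build = {
    "gpu_rtx3070_peak": 240,  # FE peak power draw
    "cpu": 150,               # a fairly high-end CPU under load
    "motherboard_ram": 25,
    "drives": 20,
    "fans_aio_pump": 20,
    "peripherals": 10,
}

peak, with_margin = recommended_psu_watts(build)
print(peak, with_margin)  # 465 558
```

Even this sum of simultaneous peaks, which a real system never reaches, lands well under 650W.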


----------



## Xaled (Dec 2, 2020)

Vayra86 said:


> Right now people are somehow convinced they MUST game at 4K and then start complaining the cards are so expensive... do we even logic. Similar ideas apply to quality and FPS targets... everyone has to have high refresh rate ultra it seems, even if the wallet isn't deep enough to get there.


Just imagine if Samsung said: 1280x720 is enough for mobile users; if they want higher resolutions, they should pay more and not complain about higher prices.
Or if Qualcomm said the same thing about processors.
Or if Apple said 4" is enough, that people shouldn't convince themselves to use a bigger screen, because bigger screens are more expensive and their production is costly.
Do you see now how poor your point is?


----------



## Chrispy_ (Dec 2, 2020)

Xaled said:


> Just imagine if Samsung said: 1280x720 is enough for mobile users; if they want higher resolutions, they should pay more and not complain about higher prices.
> Or if Qualcomm said the same thing about processors.
> Or if Apple said 4" is enough, that people shouldn't convince themselves to use a bigger screen, because bigger screens are more expensive and their production is costly.
> Do you see now how poor your point is?


I've had a 4K screen for gaming for most of this decade and ignoring the fact that it's been costly to keep it up to date with higher-tier graphics cards, I do not think that many games justify the extra resolution. You can still see polygon edges, 2D shader effects, and plenty of textures aren't really 4K assets even in AAA 2020 titles.

What 4K gives us is less aliasing and slightly crisper fine detail in the distance, but it sure as hell isn't a make-or-break feature, even in ridiculously pretty games like HZD. I prefer to run games at 1080p60 with everything cranked to the max, rather than 4K60 and hope that my hardware can maintain that minimum 16ms frametime for EVERY frame without fail or spend half an hour messing around with graphics settings to see what I need to disable to get stutter-free 4K60 gameplay.

IMO the sweet spot right now is 1440p144Hz and that's what my main desktop runs. 1440p is much closer to 1080p than it is to 4K, but the extra fluidity of 100Hz+ is well worth the drop in resolution.


----------



## candle_86 (Dec 2, 2020)

Valantar said:


> Nope. Please re-read - though there were a few details I left out: a core tenet of this approach is checking _real-world power draws_, in other words _the numbers for the GPU you're planning to buy_. Not generic numbers, not FE numbers (unless you're buying the FE), not total system power numbers. If you plan to OC, obviously factor that in, but the _vast_ majority don't OC, and besides, the 20% total overhead is typically sufficient to account for that. Getting a higher end CPU doesn't make that much of a difference - CPU power draws scale far less than GPU power draws, except for the past two generations of Intel chips, of course (though even those are _far_ from their peak draws while gaming). But again, the 20% headroom accounts for that.
> 
> USB devices generally consume little power, and are unlikely to be in heavy use while the PC is under heavy load, like a game being run. The same goes for drives - and as I said, the average gaming PC today has a single SSD and possibly a HDD. HDD peak power draw happens _only_ during spin-up, so the chances of that happening during gaming is ... tiny. In-use power for a 7200rpm 3.5" HDD is <10W. But more importantly: cumulative power numbers adds a lot of invisible headroom. Gaming _never_ stresses both CPU and GPU to their maximum power draw, let alone the rest of the system. So if you have a peak 90W CPU and a peak 150W GPU, you're never going to see 240W from those two components while gaming. Games don't load the whole PC 100%. So in real-world usage scenarios those additional 20% are _already on top of built-in headroom_.
> 
> ...



Not true, FS2020 will max out both CPU and GPU; it's the new FurMark + Prime95 for torturing your comp


----------



## Xaled (Dec 2, 2020)

Chrispy_ said:


> I've had a 4K screen for gaming for most of this decade and ignoring the fact that it's been costly to keep it up to date with higher-tier graphics cards, I do not think that many games justify the extra resolution. You can still see polygon edges, 2D shader effects, and plenty of textures aren't really 4K assets even in AAA 2020 titles.
> 
> What 4K gives us is less aliasing and fine detail in the distance, but it sure as hell isn't a make-or-break feature and I prefer to run games at 1080p60 with everything cranked to the max, rather than 4K60 and hope that my hardware can maintain that minimum 16ms frametime for EVERY frame without fail or spend half an hour messing around with graphics settings to see what I need to disable to get stutter-free 4K60 gameplay.
> 
> IMO the sweet spot right now is 1440p144Hz and that's what my main desktop runs. 1440p is much closer to 1080p than it is to 4K, but the extra fluidity of 100Hz+ is well worth the drop in resolution.


Anything higher than 720p on mobile isn't noticeable either, but competition and high demand in the phone market made phone hardware develop so quickly that most phones now have specs way beyond what the device actually needs. The same thing would have happened in the PC world; everybody would be using 4K 144Hz monitors by now if the market hadn't been dominated (until recently) by Intel and Nvidia. We've been using the same Intel CPU technology for almost 10 years, and Nvidia has been giving almost the same FPS per dollar for almost five years. These two firms are the only reason PCs are not evolving the way they used to, or the way mobile devices are.
Only CPUs and GPUs are holding back the evolution of the PC. Moore's law has held in almost all other parts: 4K 120Hz monitors now cost no more than 1080p 120Hz did 10 years ago, and the same can be said for HDDs, SSDs, etc.


----------



## Chomiq (Dec 2, 2020)

Didn't even bother to check availability at launch. A lot of prebuilds are listed at various online retailers. They all have "Nvidia 3060 Ti" listed with no AIB partner mentioned, so I guess you don't really know which card you'll get. I guess stores decided to ~~scalp actual customers~~ prevent scalping by focusing on prebuilds.


----------



## QUANTUMPHYSICS (Dec 2, 2020)

I waited on line at Microcenter from 8:10AM till open at 9AM.

There were 20 people ahead of me (at least). 

They snapped up the FE cards, but I managed to get the FTW3.

They had more than enough 3rd party cards including Eagle, Aorus, MSI trio, etc.


----------



## Valantar (Dec 2, 2020)

candle_86 said:


> Not true, FS2020 will max out both CPU and GPU; it's the new FurMark + Prime95 for torturing your comp


Nah - FS2020 is far more taxing than the average game, and it does spawn a lot of threads, but those threads don't all load their cores to 100%, and definitely not with power-hungry workloads. It is - like most games, just to a higher degree - limited by a couple of high-performance threads; it just scales slightly better than average with more cores. It's more important to have a few fast cores than many cores, which speaks against it loading all threads with heavy workloads.


----------



## Chomiq (Dec 2, 2020)

Anyone from Europe can verify if Nvidia has the FE listed on their local site? Here it is not even listed.


----------



## RandallFlagg (Dec 2, 2020)

Valantar said:


> Nah - FS2020 is far more taxing than the average game, and it does spawn a lot of threads, but all of those threads don't load their cores to 100%, and definitely not with power-hungry workloads. It is - like most games, just to a higher degree - limited by a couple of high performance threads, it just scales slightly better than average with more cores. It's more important to have a few fast cores than many cores, which speaks against it loading all threads with heavy workloads.



I've seen this type of thread before and I know what happens.  Some poor slob buys into the napkin math, and a couple of months later there are posts about the system rebooting and funny electrical smells, traced back to the PSU. 

Telling people they're fine with under-rated PSUs based on not having a hard drive, not adding SSDs, not having anything connected to their USB ports, not having a PCIe network adapter and so on assumes they don't and never will have those use cases.  The recommended numbers are there so a person can fully utilize the connectivity in their PC without fear of instability.  Adding, say, WiFi 6 and a backup USB drive should not result in a failing rig and $100 wasted on the wrong PSU.

And that manufacturer rec is this :


----------



## tancabean (Dec 2, 2020)

QUANTUMPHYSICS said:


> I waited on line at Microcenter from 8:10AM till open at 9AM.
> 
> There were 20 people ahead of me (at least).
> 
> ...



Microcenter near me still has a few left in stock. Hopefully that means the scalpers aren’t that interested.


----------



## EarthDog (Dec 2, 2020)

RandallFlagg said:


> I've seen this type of thread before and I know what happens.  Some poor slob buys into the napkin math and a couple months later there are posts about system rebooting and funny electrical smells, traced back to the PSU.
> 
> Telling people they're fine with under-rated PSUs based on not having a hard drive, not adding SSDs, not having anything connected to their USB ports, not having a PCI-e network adapter and so on assumes they don't or will never have those use cases.  The recommended numbers are for a person to be able to fully utilize the connectivity in their PC without fear of instability.  Adding say WiFi 6 and a backup USB drive should not result in a failing rig and $100 wasted on the wrong PSU.
> 
> And that manufacturer rec is this :


You're aware that these recommendations always estimate high, right (they err on the side of caution)? They have to consider that most users have average- to potato-level PSUs, as well as some consideration (not to your level) of peripheral attachment and power use. That said, I thought any Type-A USB ports pull off the 5VSB and not the 12V rail anyway... Type-C may be 12V? Hubs will (Molex or SATA powered).

Also, TPU threads we always tell them QUALITY PSUs... so no, sir, we don't often (ever?) see people come back claiming the PSU we suggested underpowered their rig. You're welcome to show a thread where that happened... maybe there is one, but it certainly isn't a theme as you want to portray.

That chart, lol... I'm running a 10980XE overclocked to 4.6 GHz and a Strix RTX 3080 at stock (320W). I have a mouse/Asus LED keyboard/BT cans/Xbox controller/flight stick/media card reader/charger for the cans/5 total fans/pump/4x RAM/2 M.2/1 SSD/2 HDD... all running on a quality 750W PSU without issue. I pull a bit over 600W FROM THE WALL (so ~550W actual) while gaming. I ran AIDA64 and looped 3DMark without issue as well, peaking around 625W (actual). Most systems could run a quality 550W PSU with a 3060 Ti, even with overclocking. The 3060 Ti draws 100W less than my 3080, and likely so does any CPU you put up against mine while it's overclocked.
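The wall-vs-actual distinction above is just PSU efficiency. A tiny sketch; the ~91% efficiency figure is an assumption for a quality unit at this load level, not a measured value for this PSU:

```python
# Convert a wall (AC) power reading to the approximate DC-side load.
# The efficiency figure is an assumption (a quality unit around 80% load),
# not a measured value for any particular PSU.
def dc_load_watts(wall_watts, efficiency=0.91):
    return wall_watts * efficiency

# "A bit over 600W from the wall" -> roughly 550W of actual DC load
print(round(dc_load_watts(610)))
```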


----------



## RandallFlagg (Dec 2, 2020)

EarthDog said:


> You're aware that these recommendations always estimate high, right (err on the side of caution)? They have to consider that most users have average to potato level PSUs as well as some consideration (not to your level) of peripheral attachment and power use. That said, I thought any Type-A USB ports pull off the 5vSB and not the 12V rail anyway... Type-C may be 12v? Hubs will (molex or sata powered).
> 
> Also, TPU threads we always tell them QUALITY PSUs... so no, sir, we don't often (ever?) see people come back claiming the PSU we suggested underpowered their rig. You're welcome to show a thread where that happened... maybe there is one, but it certainly isn't a theme as you want to portray.
> 
> That chart, lol... I'm running a 10980XE overclocked to 4.6 GHz and a Strix RTX 3080 at stock (320W). I have mouse/asus led keyboard/BT cans/Xbox controller/flight stick, media card reader/charger for cans/5 total fans/pump/4x RAM/2 M.2/1 SSD/2 HDD... I'm running on a quality 750W PSU without issue. I pull a bit over 600W FROM THE WALL (so ~550W actual). I ran AIDA64 and looped 3DMark w/o issue as well pulling around 650W (actual).



A quick search reveals that there are many current threads about PSUs on 3060 Ti and most people are recommending 600W+ for 3060 Ti (mfr rec also).  A few people are saying you can try it with a 550W and worst case have to upgrade.

But if you want to jump into those threads and recommend a 500W psu like the person I was responding to, you go right ahead.


----------



## rtwjunkie (Dec 2, 2020)

Xaled said:


> Once again TPU falls in the trap of nvidia and participate in the fraud of announcing an unavailable product at a fake MSRP.
> Shame!


All W1zz can do is list the MSRP. Should he list a Wishful Thinking price as well? Real prices will depend on the market and will vary by region and sales platform. The one commonality is the MSRP from Nvidia.


----------



## EarthDog (Dec 2, 2020)

RandallFlagg said:


> A quick search reveals that there are many current threads about PSUs on 3060 Ti and most people are recommending 600W+ for 3060 Ti (mfr rec also).  A few people are saying you can try it with a 550W and worst case have to upgrade.
> 
> But if you want to jump into those threads and recommend a 500W psu like the person I was responding to, you go right ahead.


Sorry... you're moving the goal posts a bit.

You said... 





> Some poor slob buys into the napkin math and a couple months later there are posts about system rebooting and funny electrical smells, traced back to the PSU.



..and I replied those threads don't exist as frequently as you say they do (if they even do). That's all. 

FTR, I'd probably run some systems with a 500W PSU and a 3060 Ti, especially at stock speeds... 5600X? Yup... locked Intel chips or i5s... yuppers.


----------



## Valantar (Dec 2, 2020)

EarthDog said:


> You're aware that these recommendations always estimate high, right (err on the side of caution)? They have to consider that most users have average to potato level PSUs as well as some consideration (not to your level) of peripheral attachment and power use. That said, I thought any Type-A USB ports pull off the 5vSB and not the 12V rail anyway... Type-C may be 12v? Hubs will (molex or sata powered).
> 
> Also, TPU threads we always tell them QUALITY PSUs... so no, sir, we don't often (ever?) see people come back claiming the PSU we suggested underpowered their rig. You're welcome to show a thread where that happened... maybe there is one, but it certainly isn't a theme as you want to portray.


Motherboard-side USB-C is still just 5V, AFAIK there's no provision for USB-PD output support from host devices - or there might be, but it's never been implemented. 5V3A is the standard there. GPUs with USB-C seem to provide 9V3A, as that's the VirtualLink spec. I find it a bit odd that they didn't just add 12V3A to that, given the availability of 12V on the GPU, but I guess they didn't deem it necessary.

Other than that: yes, PSU quality is paramount. And quality is _far_ more important than the rated wattage. We've all seen enough "800W" units blow up under 400W loads to know that.


RandallFlagg said:


> I've seen this type of thread before and I know what happens.  Some poor slob buys into the napkin math and a couple months later there are posts about system rebooting and funny electrical smells, traced back to the PSU.
> 
> Telling people they're fine with under-rated PSUs based on not having a hard drive, not adding SSDs, not having anything connected to their USB ports, not having a PCI-e network adapter and so on assumes they don't or will never have those use cases.  The recommended numbers are for a person to be able to fully utilize the connectivity in their PC without fear of instability.  Adding say WiFi 6 and a backup USB drive should not result in a failing rig and $100 wasted on the wrong PSU.
> 
> ...


Again, you don't actually seem to read what I'm writing. I'm putting this in a spoiler tag to spare the rest of the readers here the OT discussion.


Spoiler



a) that table is 100% CYA, representing numbers designed around making it _extremely_ unlikely that someone should blame Nvidia for telling them to buy a too-weak PSU for their GPU. It also factors in people buying utter crap PSUs because they don't know any better. I take for granted that people _do_ know better, as I tell them as much. You should _never_ skimp on your PSU. Period.
b) Who uses PCIe network adapters? In my experience people either buy motherboards with it built in, or use USB adapters. Regardless if it's m.2, PCIe or USB, it doesn't use more than a few watts (it's all the same base hardware anyway).
c) My formula _explicitly_ accounts for storage devices (and fans, pumps, motherboards, RAM, etc.). If you see it as likely you'll add more in the future, feel free to add another 5-20W to your numbers, though the 20% margin makes that safe for most people. After all, most SSDs consume <5W, and most HDDs consume <20W for a couple of seconds during spin-up and ~5-10W in use. That's not a massive addition by any means.

Adding a PCIe WiFi 6 card and an external backup HDD to your PC does essentially nothing to its overall power draw. External bus powered HDDs pull less than 5W (they're mostly 5400rpm 2.5" drives - 3.5" drives need external 12V power after all, and the bus powered ones need to work on 5W-limited USB ports), PCIe network cards are a few watts - let's be generous and say 10. If 15W is the difference between your PSU being stable and not, you're not even close to following my recommendations. Unless, I guess, your calculations told you that you needed a 90W PSU, as that's how low you'd need to go for the 20% margin to be exceeded by that additional load. 

An example: A user has an i5-9400, a stock power GTX 1060, 2x8GB of DDR4-3200 and a 1TB SATA SSD.  Two case fans + a 240mm AIO. With my formula that's 100W (134W load minus 44W idle, + 10W for CPU idle power) + 125W + ~25W for the motherboard and RAM + 5W for the SSD, 20W for four fans and another 5W for the AIO pump = 280W. Add the 20% margin and you're at 336W. That's not a lot at all, and there are no quality PSU options that low, so their closest option is a 450W unit, with a 550W likely costing the same and having better availability (and likely a higher efficiency rating). That's of course complete and utter overkill, and leaves them _tons_ of headroom - the PC will realistically never exceed 50% PSU load! - but that's reality. They can add whatever upgrades they want. But if they want to go SFF with a custom PSU or external power brick, they can also feel safe that that will be entirely sufficient, despite the scaremongering from Nvidia that the 1060 needs a 400W PSU. If, on the other hand, there were high quality 300W PSUs out there, I would say that those are _sufficient_, while cautioning that it's too tight to allow for any upgrades in the future that add noticeable power draw. 350W would be entirely fine. Again, as long as the PSU is of decent quality - but that's a base necessity no matter what.

On the other end, if someone has a higher end build - let's say an i7-9700K, AIB RTX 2060S, 2x8GB of DDR4-3600 and a 1TB SATA SSD, with the same cooling. That's (180W-41W+10W=) 150W + 205W + 30W mobo/RAM + the same 25W for cooling and 5W for an SSD. That adds up to 415W, or 498W with a 20% margin. If they plan to OC, just replace 180W for the CPU with 237W and go from there. They could in other words still get by just fine with a 500-550W PSU at stock, and the 20% margin would let them upgrade to, say, a stock-clocked RTX 2080 (226W according to TPU) without even getting close to the rating of the PSU (still just 436W), or an aftermarket one (likely more like 460W) with still little risk. Why? _Because that PC will never consume 415W under load_. That number is made by adding up the _peak_ power numbers of every single component, a scenario that will _never_ happen in the real world unless you work very hard to make it happen. Games don't stress your CPU to 100% power draw, nor your drives, peripherals, or anything else. In the _vast_ majority of scenarios, games stress your GPU 100%, 1-2 CPU cores 80-100%, and the rest of the system is relatively relaxed, with bursty drive loads and continuous, irrelevantly small loads from RAM and peripherals.

I'm not saying this formula is idiot-proof - nothing is! - but it does the job as long as you know what you're planning and those plans don't include massive upgrades. Which most people don't plan for, or have the funds for.
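The two worked examples above can be sketched with the same formula. All figures are this post's illustrative estimates; note that the 20% margin on the first build works out to 336W:

```python
# The two example builds above, run through the same formula.
# CPU gaming draw is estimated as (system load - system idle + ~10W),
# as in the post; every wattage here is the post's ballpark figure.
def psu_estimate(cpu_w, gpu_w, mobo_ram_w, storage_w, cooling_w, margin=0.20):
    total = cpu_w + gpu_w + mobo_ram_w + storage_w + cooling_w
    return total, round(total * (1 + margin))

# i5-9400 + stock GTX 1060: CPU estimate = 134 - 44 + 10 = 100W
modest = psu_estimate(cpu_w=134 - 44 + 10, gpu_w=125,
                      mobo_ram_w=25, storage_w=5, cooling_w=25)
# i7-9700K + AIB RTX 2060S: CPU estimate = 180 - 41 + 10, rounded to 150W
higher_end = psu_estimate(cpu_w=150, gpu_w=205,
                          mobo_ram_w=30, storage_w=5, cooling_w=25)
print(modest)      # (280, 336)
print(higher_end)  # (415, 498)
```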


----------



## RainingTacco (Dec 2, 2020)

Mussels said:


> This seems like a great replacement for my 1080 8GB, similar wattage and a good chunk more performance.
> 
> It's hurting me inside how power hungry modern GPU's are tho


200W is power hungry? Whaaat. You must be a dinosaur, then.


----------



## Valantar (Dec 2, 2020)

RandallFlagg said:


> A quick search reveals that there are many current threads about PSUs on 3060 Ti and most people are recommending 600W+ for 3060 Ti (mfr rec also).  A few people are saying you can try it with a 550W and worst case have to upgrade.
> 
> But if you want to jump into those threads and recommend a 500W psu like the person I was responding to, you go right ahead.


I certainly wouldn't recommend against it. A 500W PSU would handle that setup perfectly fine as long as it's of decent quality. It would never get even close to 100% load unless it was paired with an overclocked 10700K/10900K or a HEDT CPU.


----------



## RainingTacco (Dec 2, 2020)

Vayra86 said:


> Right now people are somehow convinced they MUST game at 4K



Rightfully so. Just as people expected to game at 1080p 60fps ultra when consoles did 1080p 30fps, now that consoles do 4K 30fps, we expect 4K 60fps to go mainstream.


----------



## RandallFlagg (Dec 2, 2020)

Valantar said:


> Motherboard-side USB-C is still just 5V, AFAIK there's no provision for USB-PD output support from host devices - or there might be, but it's never been implemented. 5V3A is the standard there. GPUs with USB-C seem to provide 9V3A, as that's the VirtualLink spec. I find it a bit odd that they didn't just add 12V3A to that, given the availability of 12V on the GPU, but I guess they didn't deem it necessary.
> 
> Other than that: yes, PSU quality is paramount. And quality is _far_ more important than the rated wattage. We've all seen enough "800W" units blow up under 400W loads to know that.
> 
> ...




You're correct, I do not and will not read that.


----------



## EarthDog (Dec 2, 2020)

RandallFlagg said:


> You're correct, I do not and will not read that.


That's a shame. You might learn something.


----------



## Chrispy_ (Dec 2, 2020)

Chomiq said:


> Anyone from Europe can verify if Nvidia has the FE listed on their local site? Here it is not even listed.


Listed and out of stock as expected.


----------



## RandallFlagg (Dec 2, 2020)

EarthDog said:


> Sorry... you're moving the goal posts a bit.
> 
> You said...
> 
> ...



The reason they don't exist in quantity is that people recommending significantly underpowered PSUs isn't happening.  I'm not going to dig through years of posts to find when it did, but in the recent ones people aren't doing that.  

To note, I'm now repeating myself because you didn't get that from my previous post.  

Also, I did not say they happened frequently (_that's you again constructing straw man arguments_). I said I've seen them before. Frankly, most people have better sense than to say it's OK to get a 450-500W PSU for these types of setups, but it happened here, and yes, that's you defending it and saying pretty much the same thing.

As I stated (now for the third time), people making recommendations in those threads are not recommending PSUs 100W under the manufacturer spec.  

If you want to do that, go ahead.  Valantar is using napkin math to justify saying you can use 450-500W PSUs with 200-240W GPUs.  If that's your thinking too, go for it, man, make those recommendations.  Let's see how that turns out.



EarthDog said:


> That's a shame. You might learn something.



Unlikely. I'm a software engineer in industrial controls. I work around electronics and controls (electrical) engineers all the time. I've been doing this for 25 years.

People should get what the engineers recommend, not what you threw up into this thread.


----------



## RainingTacco (Dec 2, 2020)

How on earth won't a 500W PSU suffice for a 200W GPU, when the whole system will draw at most 400W under a worst-case scenario like Prime95 + FurMark? Even a crappy group-regulated PSU rated at 500W overall will have 400W on the 12V rail. We of course aren't talking about those $10 AliExpress PSUs that offer 250W on 12V out of a "500W" overall rating. We are talking about an entry-level 80+ or 80+ Bronze certified unit. That will do fine at 500W. A good quality 500W PSU will withstand that GPU with ease, delivering 500W on the 12V rail and even more. Nvidia can't account for those worthless $10 PSUs, because in that case they'd have to recommend a 700-900W PSU for such a GPU, since such crappy units can output 300-400W at most on 12V.
It's not up to Nvidia to cover for an imbecilic user choice, which is an improbable edge case affecting less than 1% of users who are braindead enough to buy a $10 PSU for a $400 GPU...


----------



## RandallFlagg (Dec 2, 2020)

RainingTacco said:


> How on earth won't a 500W PSU suffice for a 200W GPU, when the whole system will draw at most 400W under a worst-case scenario like Prime95 + FurMark? Even a crappy group-regulated PSU rated at 500W overall will have 400W on the 12V rail. We of course aren't talking about those $10 AliExpress PSUs that offer 250W on 12V out of a "500W" overall rating. We are talking about an entry-level 80+ or 80+ Bronze certified unit. That will do fine at 500W. A good quality 500W PSU will withstand that GPU with ease, delivering 500W on the 12V rail and even more. Nvidia can't account for those worthless $10 PSUs, because in that case they'd have to recommend a 700-900W PSU for such a GPU, since such crappy units can output 300-400W at most on 12V.
> It's not up to Nvidia to cover for an imbecilic user choice, which is an improbable edge case affecting less than 1% of users who are braindead enough to buy a $10 PSU for a $400 GPU...



This would be my rig with a 10600K and a 3070, assuming no overclocking and leaving power limits intact on the 10600K (fat chance). Also, a RAM OC will draw more power.

Doesn't matter though; at stock, 400 and 450W PSUs would fail.

To use a 500W PSU at all, you'd have to be perfectly balanced on all rails. Not likely.

Unlock the power limits or use OC RAM, or possibly even plug a high-charge-rate phone into a USB-C port, and you're DOA with 500W.

Edit: the 10600K can surge to 180W for 7 seconds even at stock; 125W is the average max over 28s by default, and the power limit for the 7s tau is 225W.
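Intel's tau-based limit behaves roughly like a moving average of power draw, which is why a short surge doesn't trip the long-term limit. A minimal sketch of the idea (a simplified model, not Intel's exact algorithm; `ewma_power_ok`, the sample values, and the 1 s step are illustrative):

```python
# Simplified model of a tau-based power limit: the running (exponentially
# weighted) average of power draw must stay under PL1, so short bursts
# above PL1 are fine as long as the average recovers.

def ewma_power_ok(samples, pl1=125.0, tau=28.0, dt=1.0):
    """Return True if the EWMA of the power samples never exceeds PL1."""
    avg = 0.0
    alpha = dt / tau  # weight of each new 1-second sample
    for power in samples:
        avg = avg * (1 - alpha) + power * alpha
        if avg > pl1:
            return False
    return True

# A 7-second 180 W burst followed by a light load keeps the average under 125 W:
print(ewma_power_ok([180.0] * 7 + [65.0] * 21))   # True
# A sustained 200 W load eventually trips the limit:
print(ewma_power_ok([200.0] * 60))                # False
```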


----------



## RainingTacco (Dec 2, 2020)

If someone is scared, he/she can buy a 550W unit and be safe. Transient power spikes are not to be accounted for when calculating PSU wattage - some GPUs can have a transient spike doubling their power for a few milliseconds, but PSUs can handle that. Only the worst PSUs will trip over such spikes, causing a reset or, if they don't have any meaningful protection, a burnt cap or other piece of electronics.
The recommended PSU wattage on that calculator is complete bollocks - it's for really bad PSUs. The minimum PSU wattage is more realistic.
And who the hell uses a Blu-ray drive in a PC nowadays? That's 30W less. Same for three USB devices besides keyboard and mouse - what for? Are you heating your tea with some USB heater? One USB drive I can understand. I mean, I could stretch that scenario even further by adding more HDDs, but we should talk about typical PCs. A Blu-ray drive is extremely uncommon. 70W for the motherboard? That must be a really high-end motherboard, because I've never seen such power draw from my B450 board - it takes 50W max. And it isn't a Christmas tree, mind you.


----------



## EarthDog (Dec 2, 2020)

lol, I'd gladly run that listed machine on a quality 500W single-rail PSU... I'd even stress test it... and maybe overclock some.

550W leaves proper headroom for quiet operation and expansion... but hell yeah, especially at stock speeds. I don't 'play' P95 and stress tests; so long as it passes them when I overclock, it's on.

Anyway, 3060 Ti thread... that's a 3070 listed, lol


----------



## RandallFlagg (Dec 2, 2020)

EarthDog said:


> lol, I'd gladly run that listed machine on a quality 500W single-rail PSU... I'd even stress test it... and maybe overclock some.
> 
> 550W leaves proper headroom for quiet operation and expansion... but hell yeah, especially at stock speeds. I don't 'play' P95 and stress tests; so long as it passes them when I overclock, it's on.
> 
> Anyway, 3060 Ti thread... that's a 3070 listed, lol



So now we're up to 550W - never mind 450 and 500W?

And the FE 3060 Ti draws 208W (card only) in gaming.

The Strix version of the 3060 Ti draws _*251W in gaming*_. _*31W more than the 3070 in that image.*_

Like I repeatedly noted in the post, the numbers from that calc are for non-OC parts. A lot of parts are OC'd out of the box.


----------



## EarthDog (Dec 2, 2020)

You may want to read my post, bud. Done here though.


----------



## r9 (Dec 2, 2020)

Just got back from Microcenter; they pulled the last one from somewhere for the guy in front of me. They had a 3080 Ti for $750 - too rich for my blood. So I had to make peace with the i7-9700K and an MSI Z390 for $310 with tax.


----------



## Mussels (Dec 3, 2020)

RainingTacco said:


> 200W is power hungry? Whaaat. You must be dinosaur then.



200W for the midrange card? yeah, that's high. Whatt.


----------



## nguyen (Dec 3, 2020)

Mussels said:


> 200W for the midrange card? yeah, that's high. Whatt.



Welcome to the Polaris fan club


----------



## Chrispy_ (Dec 3, 2020)

RandallFlagg said:


>


For an Intel CPU sucking down 125W with VRMs running at 92-94% efficiency, that's about 9W lost to the VRMs, and Intel rates the Z490 chipset at 6W. There really isn't anything else on a board that uses any power. Maybe add one extra watt if the board has particularly stupid amounts of RGB LEDs...
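The ~9W figure is just the conversion-loss arithmetic; a quick sketch (the 93% figure is the midpoint of the efficiency range above, and `vrm_loss` is an illustrative name):

```python
# Power lost in the VRMs when delivering cpu_w at a given conversion efficiency:
# input power = delivered power / efficiency, and the difference is waste heat.

def vrm_loss(cpu_w, efficiency):
    return cpu_w / efficiency - cpu_w

print(round(vrm_loss(125, 0.93), 1))  # ~9.4 W lost delivering 125 W at 93%
```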


----------



## RandallFlagg (Dec 3, 2020)

Chrispy_ said:


> For an Intel CPU sucking down 125W with VRMs running at 92-94% efficiency, that's about 9W lost to the VRMs, and Intel rates the Z490 chipset at 6W. There really isn't anything else on a board that uses any power. Maybe add one extra watt if the board has particularly stupid amounts of RGB LEDs...



There's much more than a 9W *difference* between boards with the same chipset in reviews.

This is a 49W delta, low to high:





A 40W delta under load:


----------



## Vayra86 (Dec 3, 2020)

Xaled said:


> Just imagine if Samsung said: 1280x720 is enough for mobile users, we should convince them if they want more resolutions, they should pay more and do not complain about higher prices
> Or if Qualcomm said same thing about processors
> or if Apple said 4" is enough, people should not convince themselves to use bigger screen, bigger screens are more expensive, their production is costly.
> do you see now how your poor your point is?



It's not poor, it's the self-destructive idea of commerce. I can't help that it's a paradox. The fact is, these products are entirely unnecessary. It's luxury - that is reality for ya. If you get one, lucky you. Nobody is entitled to one, and the fact that people are so eager is up to them and them alone.

It's obvious the salesman won't tell you not to buy. That doesn't make it a poor point. Things get scarce; we're too many people.



Xaled said:


> Anything higher than 720 for mobile is not noticable as well, but competition and high demand in phone market had the phone devices develop so quick and fast even that most phones now have features specs that is way higher than what actually this device need. Same thing would've been done in PC world too, All peoples would've been using 4k 144hz mintors by now if the market wasnt dominated by (until recently) Intel and Nvidia. We've been using the same Intel CPU technology for almost 10 yeras. And Nvidia has been giving almost the same FPS per Dollar for almost five years. These two firms are the only reason that PCs are not evolving the same way they were evolving before  or the same way mobile devices are evolving.
> Only CPUs and GPUs are preventing evoluition of the PC. Moore's law has been present in almost all parts, Monitors, 4k 120hz monitors now are not more expensive that what 1080p 120hz was 10 years ago. Same thing can be said for HDDs, SSDs, etc.



But PCs are evolving. It's just a lot more sensible for the most part.
The mobile market took off and never came down; now they jump from one shitty innovation to the next to keep selling phones, with the better half of those ideas failing within a year or two. But you see the same things that happened on the PC. Your midrange phone is now feature complete; it also costs as much as a high-end phone used to. Shareholders are happy. The phones still die in two to three years tops. PCs are not that quick to die - so what does the industry do to keep a constant flow of demand? They create heavy graphics settings and push higher resolutions. They also push RT.

Demand for that high res and RT is exactly a key point of discussion right now. Do you need it, and does it benefit gaming? Important questions to ask, as much as they are with a phone that has 8 camera lenses. There is a point at which 'desires' become straight-up looney bin material IMO. We're going there, if we haven't arrived already some time ago. Look at the console launch titles: there's barely anything to show for it, and yet we shrug and move on. There is so much to buy you can hardly choose, and we've lost touch with what's truly good content and what's bog-standard filler. Information overload. Too much commerce?

The fact is, Intel never made more than a quad core because there was no demand for it. Even today you can still do quite fine with one in most applications.


----------



## Valantar (Dec 3, 2020)

RandallFlagg said:


> There's much more than 9W *difference* between boards with the same chipset in reviews.
> 
> This is a 49W delta low to high :
> 
> ...



That delta is more likely down to MCE (multi-core enhancement, aka boards allowing multi-core boost to go as high as single-core boost) and boosting behaviour than it is down to motherboard power draw. Motherboards often differ dramatically in how they manage the CPU's boost power and time limits, as well as multi-core boost. What else, exactly, is supposed to be consuming all that power? Unless the board has integrated 10GbE, it's unlikely to have any power-hungry controllers at all - SATA and USB are integrated into the SoC or PCH, at a few watts. (And even 10GbE tops out at about 5W using a modern controller.) Audio consumes next to nothing. I guess RGB could be noticeable, in the 5-10W range if the board has a ton of it, but that is accounted for separately in your table. VRM losses are, as @Chrispy_ said, tiny. So the only reasonable explanation for a delta that big is that one board causes the CPU to boost higher than the others, or feeds the CPU more voltage than the others. There could of course be power losses due to voltage drop in the board, but those are unlikely to be above a handful of watts. So again, accounting for _real-world_ power draw numbers for these components, using reviews to source them, is a safe way to account for this. You call what I do "napkin math" and brag about your engineering chops, yet base your calculations on a calculator using on-paper specs rather than real-world data. Get off your damn high horse, please.


Spoiler: trying to keep the thread at least a bit tidy



Also, please note that the calculator you're quoting quite obviously contradicts your concerns about peripherals, PCIe networking, etc. It even bases itself on _much_ lower power numbers per fan, SSD, HDD etc. than my formula - in other words, my calculations have more built-in headroom than that table does. The only thing I really disagree with in those numbers (aside from the margins added at the end, which are at first sensible, then go into plain silliness) is the ridiculous power draw allotted to the motherboard, particularly when that number doesn't also account for RAM. A high-end, feature-packed 2020 motherboard including 4 sticks of fast RAM is unlikely to consume more than 50W, even including VRM losses when powering a 250W CPU like an i9-10900K. You might see a combined 75W for mobo+RAM with a TB3/10GbE-equipped motherboard when those controllers are under heavy load - though that's highly unlikely to coincide with a heavy CPU+GPU load. Or do you tend to do long-term continuous >1Gbps data transfers while gaming? The same goes for the ODD - which barely 1% of PC builds in 2020 have at all - how often is that going to be running full tilt at the same time as the CPU and GPU are? Are you ripping Blu-rays while gaming? Do you see that as a common use case for a PC?

You don't seem to grasp the crucial point here, which means I have to repeat it yet again: _normal consumer workloads never ever stress every component to 100% at the same time_. It doesn't happen. Period. Games _never_ stress the CPU _and_ GPU to 100%, which means that starting with real-world maximum power draw numbers for each of those _already includes a significant margin_. I mean, just look at the difference between TPU's 10600K review power draw numbers: 162W under CB, 191W under P95, and 383W while gaming. The test setup uses an EVGA 2080 Ti FTW3, which _alone_ consumes 304W average while gaming. That means _the rest of the system is consuming around 79 watts while gaming_. CPU, motherboard, VRMs, SSDs, RAM, USB, PSU efficiency losses, _everything_. Do you see how that leaves _a lot_ of headroom if you account for ~125W for the CPU alone, plus ~50-75W for the rest of the system? That is about 120W of margin just from base component numbers, _before_ my added safety margin. There are of course games that need more CPU power than TW3, but 120W more? Not even close. My calculations for that same setup would end up at ~490W (~125W CPU + 304W GPU + 35W motherboard/RAM + 25W storage and cooling, of course depending on the specific configuration) + 20% margin, or a 590W recommended PSU (550W would be fine, but cutting it a bit close, so below what I would recommend). Yet the real-world gaming numbers for that exact setup are below 400W. And you somehow claim that my calculations are unsafe? Based on what, exactly?

This is, for the record, also a case where Nvidia's recommended PSU numbers align decently, as they recommend a 650W PSU for the 2080 Ti (as does EVGA for that specific OC model), though you could perfectly safely game on this setup with a high quality 500W unit - with more than 100W of headroom! - just don't run furmark+P95 on it. Step down to a less power hungry CPU and/or GPU, and you're looking at a smaller recommended PSU - a Ryzen 5 3600 + 2070S would cut ~45W and ~75W from the base numbers, for example, bringing the recommended PSU down to ~450W including a 20% margin, and real-world power draws would likely be closer to 300W. Heck, I've seen enough people run undervolted 2080 Tis off 400W SFF PSUs to know that is entirely feasible as long as you're comfortable with running on the bleeding edge. Yet for a setup like that you'd be more likely to find people saying "get a 750W PSU just to be safe" (or also the classic "aim for 2x power draw, so get an 800W unit"), than for people to make reasonable recommendations based on actual data.


My calculations are based on real-world data, take the user's intended use case and potential upgrade path into account, and add a safety margin on top of a consciously unrealistic summation of peak power draws. Your preferred method is apparently "don't ask questions, don't look at data, just trust what's on the box, and aim high". I definitely know which method I trust more.
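The sizing method described above can be jotted down in a few lines (a sketch of the poster's napkin math, not an official formula; the function name and the per-component defaults are illustrative):

```python
# Sum real-world per-component peak draws (sourced from reviews, not box
# specs), then add a safety margin on top of that already-pessimistic sum.

def recommend_psu(cpu_w, gpu_w, mobo_ram_w=35, storage_cooling_w=25, margin=0.20):
    base = cpu_w + gpu_w + mobo_ram_w + storage_cooling_w
    return base * (1 + margin)

# The 10600K + 2080 Ti FTW3 example from the post:
print(round(recommend_psu(125, 304)))  # ~587 -> a 590 W unit recommended
```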



Xaled said:


> Anything higher than 720 for mobile is not noticable as well, but competition and high demand in phone market had the phone devices develop so quick and fast even that most phones now have features specs that is way higher than what actually this device need. Same thing would've been done in PC world too, All peoples would've been using 4k 144hz mintors by now if the market wasnt dominated by (until recently) Intel and Nvidia. We've been using the same Intel CPU technology for almost 10 yeras. And Nvidia has been giving almost the same FPS per Dollar for almost five years. These two firms are the only reason that PCs are not evolving the same way they were evolving before  or the same way mobile devices are evolving.
> Only CPUs and GPUs are preventing evoluition of the PC. Moore's law has been present in almost all parts, Monitors, 4k 120hz monitors now are not more expensive that what 1080p 120hz was 10 years ago. Same thing can be said for HDDs, SSDs, etc.


I don't know what kind of eyesight you have, but the difference in text rendering sharpness between a 720p and a 1080p (non-pentile) is _very_ noticeable to my +1/+1.25 eyes _without glasses_. 1080p to 1440p is less noticeable but still there. Pentile subpixel layouts mess everything up, of course, with their significant reduction in subpixel density and their diagonal grid layout making straight lines look fuzzy. While text rendering is a bit of an edge case for sharpness, and the difference when looking at photos or videos is much smaller, text rendering still accounts for a _huge_ amount of the use cases for  a smartphone. The same goes for the difference in perceived smoothness from 60Hz to 90Hz - it's very clearly noticeable even when scrolling through web pages or texts. Does that make either _necessary_? Of course not. 

As for everyone using 4k 144Hz monitors today if it wasn't for the non-competitive markets we've had for the past decade? Sorry, no. We would likely have been further along in performance and performance/$, but monitor prices would still have been far too high. It's more expensive to make denser displays, especially when you start to add features like HDR and direct backlighting. That's unavoidable. That obviously doesn't mean that 4k120/144Hz HDR displays won't come down in price, but for that to happen production and sales volumes need to increase dramatically - which is difficult when most people are happy with 1080p or 1440p. And if you're playing a fast-paced game, it's highly unlikely you'll be able to tell the difference between 4k120 and 1440p120, and even 1080p120 might be a great experience for most people, leaving 4k high refresh monitors a luxury rather than a necessity.


----------



## Vayra86 (Dec 3, 2020)

RainingTacco said:


> Rightfully so. Just as people expected to game at 1080p 60fps ultra when consoles did 1080p 30fps, now consoles do 4k30fps, so we expect 4k60fps going mainstream.



To each his own loss then, I guess. There is nothing 'rightful' about it, because the consoles don't even push native 4K, just as last gen didn't do native 1080p. Internal render res is usually far lower, and most of what you see is made dynamic. The box of tricks consoles use to keep it somehow playable is always a reduction in quality.

Marketing <> Reality.


----------



## Randomoneh (Dec 3, 2020)

Are you continuing to add the performance-per-dollar page to new reviews as a tradition? It's confusing at best and misleading at worst. Either link it to a crawler that checks for availability or don't include it.


----------



## Shatun_Bear (Dec 3, 2020)

$399 compared to Turing seems like a decent price, but it puts the card just $100 cheaper than the next-gen consoles, which outperform it (not talking about on-paper specs; look at an analysis of the settings and performance at which the PS5 and Xbox run AC: Valhalla - equivalent to a 3070).

In fact, the DE PS5 is the same price as this card: a whole console, next-gen controller, super fast SSD/IO... I'm a PC gamer primarily, but I just don't see great value in a best-case $400 card at around 2080S level.



Raendor said:


> PS5 not, but Series X can. It's a shame, because this card costs a bit less than getting a whole gaming system that can play games on that same level. X is around the same 2080 level performance.



The PS5 outperforms the Series X in every next-gen multiplatform game. It's probably faster real-world because of its customizations. And in games, both the Series X and its rival run games the same as a 3070, not a 2080.


----------



## RandallFlagg (Dec 3, 2020)

Well, it's not like Nvidia isn't making cards. It was hard to quantify, but at least on Steam there are now more 3080s showing than 5600 XTs, about the same as the 5700. I would expect the 3070 and so on to start showing up too. This actually reflects significant volume:


----------



## Valantar (Dec 3, 2020)

RandallFlagg said:


> Well, it's not like Nvidia isn't making cards.  It was hard to quantify but at least on Steam, there are now more 3080's showing than 5600XT's, about the same as the 5700.  I would expect the 3070 and so on to start showing up too.  This  actually reflects significant volume :
> 
> View attachment 178013


It would be really interesting if Steam ever reported on the number of respondents for their surveys, and how large a portion that amount is of their daily/weekly/monthly/quarterly active users. Still, that is indeed a significant number - the similarly priced 2080 Super is at .88%, so unless the numbers here have a significant margin of error (which is of course not unlikely) they've definitely sold a significant amount of 3080s.


----------



## RandallFlagg (Dec 3, 2020)

Valantar said:


> It would be really interesting if Steam ever reported on the number of respondents for their surveys, and how large a portion that amount is of their daily/weekly/monthly/quarterly active users. Still, that is indeed a significant number - the similarly priced 2080 Super is at .88%, so unless the numbers here have a significant margin of error (which is of course not unlikely) they've definitely sold a significant amount of 3080s.



Agree one has to be careful about conclusions.  However, I do think Steam is very representative of people who buy their PCs for gaming, and has some weight for occasional gamers.  I also think it is meaningless in the overall market.  

Looking at the people I know (keep in mind I'm 51), I think maybe 10% game on PC outside of, say, Facebook or some such. That may be high, as this article from 2016 has numbers that would amount to well under 5% of PCs having a discrete GPU at all (more like 2%).


----------



## Chrispy_ (Dec 3, 2020)

Valantar said:


> That delta is more likely down to MCE (multi-core enhancement, aka. boards allowing multi-core boost to go as high as single-core boost) and boosting behaviour than it is down to motherboard power draw. Motherboards often differ dramatically in how they manage the CPU's boost power and time limits as well as multi-core boost.


This. The 49W delta is down to whether the BIOS applies an automatic overclock at stock settings or not. I've seen MSI B450 boards that provide an almost 50W overclock to a 105W part out of the box, just using AUTO settings for everything.

Some boards need to be tweaked to run a CPU at stock speeds, others run at stock and need to be tweaked to apply an overclock.

Either way, that 49W delta in power draw numbers comes from the CPU, and it doesn't explain where they're finding the mystery power consumption on the board to justify a 70W motherboard power draw. If that were true, your motherboard would need a sizeable heatsink and fan, similar to the i5 stock cooler, just for itself. Instead, it usually gets puny "decorative" heatsinks that exist more for corporate branding than actual cooling, and usually no fan at all.


----------



## Randomoneh (Dec 4, 2020)

RandallFlagg said:


> Well, it's not like Nvidia isn't making cards.  It was hard to quantify but at least on Steam, there are now more 3080's showing than 5600XT's, about the same as the 5700.  I would expect the 3070 and so on to start showing up too.  This  actually reflects significant volume :
> 
> View attachment 178013



Sorry, what?  

From another person:


> The 980 first showed up in the November 2014 survey with 0.18% market share, 74 days after launch. The 1080 first showed up in the July 2016 survey with 0.26% market share, 65 days after launch. Finally, the 2080 first showed up in December 2018 with 0.21%, 102 days after launch. For comparison the 3080 is at 0.22%, also 74 days after launch.




That would make
  980 grow 0.07% per month
1080 grow 0.12% per month
2080 grow 0.06% per month
3080 grow 0.09% per month

Not an unprecedented number of sales by any stretch of the imagination.
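The per-month figures above follow from dividing each card's first-survey share by the months elapsed since launch; a quick sketch of that arithmetic (30.44 days used as the average month length):

```python
# Normalize each card's first-appearance Steam survey share by months since launch.

cards = {
    "GTX 980":  (0.18, 74),   # (share %, days after launch at first survey)
    "GTX 1080": (0.26, 65),
    "RTX 2080": (0.21, 102),
    "RTX 3080": (0.22, 74),
}

for name, (share, days) in cards.items():
    months = days / 30.44
    print(f"{name}: {share / months:.2f}% per month")
```

Running this reproduces the ~0.07 / 0.12 / 0.06 / 0.09 figures quoted above.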


----------



## Valantar (Dec 4, 2020)

RandallFlagg said:


> Agree one has to be careful about conclusions.  However, I do think Steam is very representative of people who buy their PCs for gaming, and has some weight for occasional gamers.  I also think it is meaningless in the overall market.
> 
> Looking at the people I know (keep in mind I'm 51) I think maybe 10% game on PC outside of say facebook or some such.   That may be high, as this article from 2016 has numbers that would amount to well under 5% of PCs having a discrete GPU at all (more like 2%).


Yeah, gaming is definitely a niche within the overall PC market - that includes enterprise, after all, which is easily 10-20x the size of the gaming market in terms of units shipped. Then again, gaming drives a lot of revenue due to high component costs and performance requirements. But as you say, not _that_ many people game on PCs - even if 70-80% of people below the age of 40 play games with some regularity, at least half of those play on mobile only, and likely 2/3 of the rest are console players. And of course a lot of people playing games on PCs are anything but hardware enthusiasts, making do with whatever they have or whatever pre-built fits their budget. So it's not a massive group, even if it is substantial. As for Steam being representative, I absolutely think it is given the ubiquity of their launcher, but one does have to wonder how representative the selection of systems polled is. The results being consistent over time speaks to a decent degree of reliability, but we still can't know how large the error margins are, sadly. So it's entirely possible that RTX 3080 owners are significantly overrepresented in these statistics, but I don't think it's very likely.


----------



## Pumper (Dec 5, 2020)

Shatun_Bear said:


> $399 compared to Turing seems like a decent price but puts the card just $100 cheaper than the next gen consoles, which outperform it (not talking about on paper specs, look at an analysis of the settings and performance the PS5 and Xbox are running AC:Valhalla; equivalent to a 3070).



How can you compare the console version of Valhalla, which drops the resolution to 1440p on PS5 to maintain framerate close to 60, with a 3070 running native 4K?


----------



## Super XP (Dec 7, 2020)

Thanks to this 3060 Ti release and price target, Nvidia has single-handedly cannibalized its own RTX 3070.
I wonder how 3070 owners feel about this release? I can also see AMD releasing the 6700 XT sooner, at a cost equal to or less than the 3060 Ti.
Moore's Law Is Dead had some info on this strategy.



RandallFlagg said:


> Pretty much every major OEM is shipping systems with the 3070, 3080, and some with the 3090. Nvidia is at least supplying the OEMs, and they are clearly opting to sell entire systems to capitalize on the popularity. AMD apparently has nothing to sell, as not even OEMs have the 6800/6800 XT, and most don't have Zen 3 either.
> 
> There are literally pages of them on Amazon and most you can get by the end of this week.
> 
> ...


Nvidia released Ampere in September 2020. AMD released its RDNA2 lineup in November 2020. Seeing as Nvidia had more than a one-month lead, it takes time to fill OEM channels - not to mention RDNA2 cards are selling like hotcakes. AMD can't make enough of them.



Pumper said:


> How can you compare the console version of Valhalla, which drops the resolution to 1440p on PS5 to maintain framerate close to 60, with a 3070 running native 4K?


And 99.9% of people won't notice a difference in picture quality or the frame rate drops on the PS5 and the Xbox Series X.



Valantar said:


> Yeah, gaming is definitely a niche within the overall PC market - that includes enterprise, after all, which is easily 10-20x the size of the gaming market in terms of units shipped. Then again, gaming drives a lot of revenue due to high component costs and performance requrements. But as you say, not _that_ many people game on PCs - even if 70-80% of people below the age of 40 play games with some regularly, at least half of those play on mobile only, and likely 2/3 of the rest are console players. And of course a lot of people playing games on PCs are anything but hardware enthusiasts, making do with whatever they have or whatever pre-built fits their budget. So it's not a massive group, even if it is substantial. As for Steam being representative, I absolutely think it is given the ubiquity of their launcher, but one does have to wonder how representative the selection of systems polled is. The results being consistent over time speaks to a decent degree of reliability, but we still can't know how large the error margins are, sadly. So it's entirely possible that RTX 3080 owners are significantly overrepresented in these statistics, but I don't think it's very likely.


Despite being a niche, it has grown massively throughout the years. PC gaming becomes more popular with each and every year, and now, with this Covid nonsense, it's become even more popular. Steam has seen record user numbers since this fake pandemic took hold of the world.


----------



## RandallFlagg (Dec 7, 2020)

Super XP said:


> Nvidia released Ampere in September 2020. AMD released its RDNA2 lineup in November 2020. Seeing as Nvidia had more than a one-month lead, it takes time to fill OEM channels - not to mention RDNA2 cards are selling like hotcakes. AMD can't make enough of them.


They were announced in Sept (note this is different from 'release date').

The first 3090s were released early Oct.  The 3080s 2nd half of Oct, and the 3070 released about the same time as the 68XX AMD cards.


----------



## EarthDog (Dec 7, 2020)

RandallFlagg said:


> They were announced in Sept (note this is different from 'release date').
> 
> The first 3090s were released early Oct.  The 3080s 2nd half of Oct, and the 3070 released about the same time as the 68XX AMD cards.


The 3080 was launched (sites had them and reviewed them) on 9/17 and the 3090 on 9/24. They were both available on shelves shortly thereafter, in October. IIRC, availability of one of them was delayed for increased stock. His point was that Nvidia still had plenty of a head start in availability.



Super XP said:


> not to mention RDNA2s are selling out like hotcakes. AMD can't make enough of them.


Nvidia still has this problem.


----------



## RainingTacco (Dec 7, 2020)

Super XP said:


> RDNA2s are selling out like hotcakes. AMD can't make enough of them.


Source?


----------



## EarthDog (Dec 7, 2020)

RainingTacco said:


> Source?


You really need one?


----------



## RainingTacco (Dec 7, 2020)

EarthDog said:


> You really need one?



I don't trust AMD marketing. I don't want info from them, because they are most likely lying.


----------



## EarthDog (Dec 7, 2020)

RainingTacco said:


> I don't trust AMD marketing. I don't want info from them, because they are most likely lying.


Wh.........wow. Ok.

Just go try to buy one and tell us if you can find them in stock. Be it scalpers or miners or the average Joe, these, like NV GPUs, are very difficult to get a hold of. Come on now... use a bit of common sense.


----------



## Valantar (Dec 7, 2020)

RainingTacco said:


> I don't trust AMD marketing. I don't want info from them, because they are most likely lying.


The RX 6800 series being sold out isn't info from AMD, you can look at literally any online store...


----------



## RandallFlagg (Dec 7, 2020)

EarthDog said:


> Wh.........wow. Ok.
> 
> Just go try to buy one and tell us if you can find them in stock. Be it scalpers or miners or the average Joe, these, like NV GPUs, are very difficult to get a hold of. Come on now... use a bit of common sense.



I don't think he's asking the question you think he is.

Here's something that doesn't seem to fit AMD's narrative, noting that none of the large OEMs (Dell, HP, etc.) carry the new AMD GPUs (or CPUs):


"Other vendors said _*the supply of Radeon RX 6800XT cards is so low, they're not even bothering with them until the inventory improves.*_
_*"Radeon 6000-series looks like a paper launch to me," one vendor said. "I hate to say it, but it is because there's just no product."*_

On Nvidia, and noting that in fact you can buy a Dell or HP anytime you want with a new RTX card :


"One theory for the shortage for smaller system builders is that *the big OEMs are getting preferential supply access*. "
 "One promising sign: *Several vendors disclosed they were receiving 30-series cards on an almost-regular schedule. *"


Ryzen 5000, RTX 30-series and Radeon 6000 shortages hit system builders too (www.pcworld.com): "The outsize demand for Ryzen 5000, RTX 30-series GPUs, and AMD Radeon 6000 has hit system vendors as hard as it's hit DIY builders. We look into why."


----------



## Valantar (Dec 7, 2020)

RandallFlagg said:


> I don't think he's asking the question you think he is.
> 
> Here's something that doesn't seem to fit AMDs narrative, and noting that none of the large OEMs (Dell, HP etc) carry the new AMD GPUs (or CPUs) :
> 
> ...


That Nvidia has better OEM availability doesn't really tell us much, given the well established fact that Nvidia has much stronger OEM ties than AMD does. AMD is likely channelling as much inventory as they have to DIY markets, as that's where they have a hope of gaining mindshare, which is where they're the furthest behind. This obviously does nothing to change the fact that there's no GPUs on the market, but at least there's a reasonable explanation for this specific difference.


----------



## EarthDog (Dec 7, 2020)

RandallFlagg said:


> Here's something that doesn't seem to fit AMDs narrative, and noting that none of the large OEMs (Dell, HP etc) carry the new AMD GPUs (or CPUs) :


These cards have been out for 2 weeks(ish)? I don't recall how quickly we saw OEMs with NV cards inside pre-builts...

Looks like they (NV/AMD) are simply going in different directions on OEM support.


----------



## Fluffmeister (Dec 7, 2020)

RainingTacco said:


> I don't trust AMD marketing, i don't want info from them, because they are most likely lying.



Yeah, fact is, despite many saying the 3000 series was a paper launch, AMD has managed to do even worse; hell, some of the bigger online retailers here in the UK have removed the 6800 series cards completely from their stores. It's like they haven't even launched yet (which is kinda true, frankly).


----------



## RandallFlagg (Dec 7, 2020)

EarthDog said:


> These cards have been out for 2 weeks(ish)? I don't recall how quickly we saw OEM's with NV cards inside pre-builts....
> 
> Looks like they (NV/AMD) are simply going in different directions on OEM support..



Yeah, AMD tried to just supply DIY, which is like 1-2% of the market.  They failed.

Nvidia supplied major OEMs, who comprise about 95% of the market, and thus far has been successful in doing that.  I can still buy any of several dozen systems with RTX 3XXX cards, not only off Amazon from third-tier SIs, but straight from players like HP and Dell.  

The fact of the matter is, AMD only had 20% of its wafer allocation from TSMC for their GPU + Zen 3 launches.  Now they are  falling back on 'unprecedented demand' excuses for cover.  

Their situation is clearly a result of being fabless.  That's fine.  I wouldn't blame them for that.  

They should probably just say that, "hey we wanted you to know we have these great CPUs and GPUs so we're doing a limited qty release, but we can't make many right now because we're contractually obligated to Microsoft and Sony."

You know that would be like, honest.  

Instead we get legions of mindless AMD fans regurgitating Lisa Su's lies.  Their use of social media to enlist a million free marketing agents is also disturbing.  Genius?  Perhaps that too, in a sort of corporate Darth Vader way.  

So this is going to continue to unravel.  PS 5/ Xbox are still selling out.  They'll need massive allocation in Q1 again.  AMD is just going to make as much bank as possible with the few PC chips they can make and charge as much as possible.  But in the end, this is going to backfire.


----------



## RainingTacco (Dec 7, 2020)

I mean, there are people who defend the Azor claim with a straight face, like Moore's Law Is Dead. Worse part is that there are gullible people defending Moore's Law Is Dead and Azor xD. AMD fans are high on copium. That's why I want a credible source: data from an actual shop, with real sale numbers and real supply/stock numbers, not marketing babble.


----------



## Super XP (Dec 7, 2020)

RainingTacco said:


> Source?


They are sold out everywhere. As soon as stock comes in, it's gone. So yes, RDNA2 is selling out like hotcakes.
But so is Nvidia's Ampere.



RandallFlagg said:


> I don't think he's asking the question you think he is.
> 
> Here's something that doesn't seem to fit AMDs narrative, and noting that none of the large OEMs (Dell, HP etc) carry the new AMD GPUs (or CPUs) :
> 
> ...


Umm, Nvidia had a paper launch when they released Ampere; they had nothing, absolutely nothing. At least when AMD launched their RX 6800 XT & 6800 they actually had cards, which sold out. I believe AMD underestimated the demand for these GPUs. But to claim AMD's was a paper launch is absolutely false. Nvidia's Ampere only started to show up in volume about 4-6 weeks later.


RainingTacco said:


> I mean there are people who defend the Azor claim with straight face, like Moore's Law. Worse part that there are gullible people defending Moore's Law and Azor xD. AMD fans are high on copium. That's why i want a credible source, a data from actual shop, with real sale numbers, with real provide/stock numbers and not marketing babble.


*Moore's Law Is Dead* already proved his credibility.


----------



## EarthDog (Dec 7, 2020)

Super XP said:


> Umm, Nvidia had a paper launch when they released Ampere, they had nothing, absolutely nothing.


This statement is patently false. Thousands (tens of thousands?) sold out in seconds. The trickle after was small. There was a dead week while many retailers implemented new methods to get these into human hands, and once those were in place, the slow trickle started. They were there.

Remember, AMD had the benefit of hindsight to try and prevent bot sales.


----------



## RainingTacco (Dec 8, 2020)

AMD had just one job: supply shops with 20-30% of what Nvidia supplies [because that's their max customer base for high-end GPUs]. They didn't, and their job was 70% easier than Nvidia's.


----------



## Valantar (Dec 8, 2020)

RandallFlagg said:


> Yeah, AMD tried to just supply DIY which is like 1-2% of the market.  They failed.
> 
> Nvidia supplied major OEMs who comprise about 95% of the market, and thus far have been successful in doing that.  I can still buy any of several dozen systems with RTX 3XXX cards not only off Amazon from 3rd tier SI's, but straight from players like HP and Dell.
> 
> ...


While there's little doubt Nvidia is supplying more GPUs than AMD currently, you're misrepresenting things here. OEM desktops might be 95% of the market (though not for gaming; laptops are 95% of the market, with DIY and prebuilt sharing the rest), but it's not like Nvidia is saturating that market. The reason for there being prebuilts with RTX 3000 GPUs available is rather simple: a lot more people in the market for a new GPU at launch are in the market for _a new GPU_, and not a new system. That you're willing and able to spend $7-800 on a GPU doesn't mean you're willing and able to spend $2-3000 on a pre-built desktop if you can't find your desired GPU for sale.

As for AMD's situations "clearly [being] a result of being fabless" - how does AMD differ from Nvidia there? Last I checked Nvidia doesn't own any fabs either...

There's little doubt that there are _far_ too few GPUs out there to meet demand. That's beyond any doubt. And consoles are likely a part of the reason why, though Zen 3 and everyone else clamoring to get a piece of the TSMC 7nm pie is also probably hurting availability.


----------



## RandallFlagg (Dec 8, 2020)

Valantar said:


> While there's little doubt Nvidia is supplying more GPUs than AMD currently, you're misrepresenting things here. OEM desktops might be 95% of the market (though not for gaming; laptops are 95% of the market, with DIY and prebuilt sharing the rest), but it's not like Nvidia is saturating that market. The reason for there being prebuilts with RTX 3000 GPUs available is rather simple: a lot more people in the market for a new GPU at launch are in the market for _a new GPU_, and not a new system. That you're willing and able to spend $7-800 on a GPU doesn't mean you're willing and able to spend $2-3000 on a pre-built desktop if you can't find your desired GPU for sale.
> 
> As for AMD's situations "clearly [being] a result of being fabless" - how does AMD differ from Nvidia there? Last I checked Nvidia doesn't own any fabs either...
> 
> There's little doubt that there are _far_ too few GPUs out there to meet demand. That's beyond any doubt. And consoles are likely a part of the reason why, though Zen 3 and everyone else clamoring to get a piece of the TSMC 7nm pie is also probably hurting availability.



As for the first paragraph there, I think I can safely assume that Dell ships more Alienwares, and HP more Omens, than there are DIY systems in total.  Both of these companies, based on market share, ship *literally millions* of computers each month.  It would only take a small fraction of a fraction of that to be shipping tens of thousands per month.  And again, the main reason reviewers have historically asked which OEMs are using your chip/GPU is precisely this volume.  Not having such contracts is a big red flag.  

But that's one data point among many.  The 20% allocation of wafers, the Steam survey showing 3080s after just 5-6 weeks, the plethora of 3rd tier SI's like CyberPowerPC, iBuyPower, CLC, and so on supplying RTX 3XXX cards.   You can explain any one of these away sure, but taken as a whole the picture is clear.  There are a ton of RTX cards entering the market.

They're both fabless yes, but AMD chose to stick with the best node they can get and the one that is in the most demand - TSMC.  They have no wiggle room there, and that in large part is because they must get the PS5 / XBox chips out.  

Nvidia's going with Samsung may have looked like a poor choice early on, but they have Samsung's attention and a big chunk of their production.   This is almost certainly why we're seeing so much more RTX in the market and virtually no Zen 3 or 68XX.  

Now to be fair, if we were counting XBox's and Playstations, we'd probably see there is a big chunk of new AMD users out there.  But that's not really this space.

Since they are entirely dependent on TSMC for all of their chips now, they may be unable to get out of that chip sucking console vortex for another quarter or two.  I think that is where this would really backfire on them.


----------



## Raendor (Dec 8, 2020)

Shatun_Bear said:


> $399 compared to Turing seems like a decent price but puts the card just $100 cheaper than the next gen consoles, which outperform it (not talking about on paper specs, look at an analysis of the settings and performance the PS5 and Xbox are running AC:Valhalla; equivalent to a 3070).
> 
> In fact the DE PS5 is the same price as this card, a whole console, next-gen controller, super fast SSD/IO....I'm a PC gamer primarily but I just dont see great value in a best case $400 card around 2080S level.
> 
> ...


1 - Not every
2 - Devs got Xbox kits later and had less time to optimize.
3 - PS5 "customization" is just stripping features they didn't want to pay for.
4 - Ubisoft games are a bad benchmark.


----------



## QUANTUMPHYSICS (Dec 11, 2020)

tancabean said:


> Microcenter near me still has a few left in stock. Hopefully that means the scalpers aren’t that interested.




The profit margin on the 3060 Ti is low. I saw a bunch sitting on eBay for $700. There's not much demand for them. 

The 3070 had more hype, so it sold easier. 

The 3080 was the sweet spot, but I think demand will be very high for the 3080 Ti.


----------



## Valantar (Dec 11, 2020)

QUANTUMPHYSICS said:


> The profit margin on 3060Ti is low. I saw a bunch sitting on Ebay for $700. There's not much demand for them.
> 
> The 3070 had more hype so it sold easier.
> 
> the 3080 was the sweet spot, but I think demand will be very high for the 3080Ti.


Is "availability" on eBay at a 75% markup a sign of there not being much demand? Hardly. More likely the people interested in a $400 GPU are (far) less likely to be able to pay scalper prices than ones interested in a $700 GPU.


----------



## RandallFlagg (Dec 12, 2020)

QUANTUMPHYSICS said:


> The profit margin on 3060Ti is low. I saw a bunch sitting on Ebay for $700. There's not much demand for them.
> 
> The 3070 had more hype so it sold easier.
> 
> the 3080 was the sweet spot, but I think demand will be very high for the 3080Ti.




Naaah.  Plenty of sources have said there was a lot more 3060 Ti supply than 3070 / 3080 / 3090 supply.

Also you can get a 3070 for a tad over $700 right now.  So whatever you saw on ebay as far as a $700 3060 Ti was overpriced, even for a scalper, so yeah they will sit there until an idiot comes along.  3060 Ti's are going for a bit under $600 right now.


----------



## QUANTUMPHYSICS (Dec 12, 2020)

RandallFlagg said:


> Naaah.  Plenty of sources have said there was a lot more 3060 Ti supply than they had 3070 / 3080 / 3090.
> 
> Also you can get a 3070 for a tad over $700 right now.  So whatever you saw on ebay as far as a $700 3060 Ti was overpriced, even for a scalper, so yeah they will sit there until an idiot comes along.  3060 Ti's are going for a bit under $600 right now.
> 
> View attachment 179195




Cyberpunk is helping to move cards. 

I sold my 3060Ti to a friend of my cousin. I only charged him $600 instead of $700. 

My 3070 I sold for around $700


----------



## RandallFlagg (Dec 12, 2020)

QUANTUMPHYSICS said:


> Cyberpunk is helping to move cards.
> 
> I sold my 3060Ti to a friend of my cousin. I only charged him $600 instead of $700.
> 
> My 3070 I sold for around $700



At least here in the States, a $590 3060 Ti is not truly outrageously above full MSRP cost.  You don't pay sales tax on these, so -

A $400 MSRP 3060 Ti + 8.25% sales tax = $433.  So such a card is right at 36% above store cost.  It's not quite as dramatic as some clickbait articles make it with their direct comparison to MSRP. 

I'm also seeing the market depth of sellers > bidders.  This bodes well for prices coming down to normal soon.
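The tax-adjusted markup above can be sketched as a quick calculation (the figures are the examples from this post, not market data):

```python
# Effective scalper markup over real-world store cost, using the post's
# example figures: $400 MSRP, 8.25% sales tax, $590 street price.
# (Street-market sales typically don't collect sales tax.)
msrp = 400.00
tax_rate = 0.0825        # example US sales tax rate from the post
street_price = 590.00    # going eBay price cited above

after_tax_cost = msrp * (1 + tax_rate)      # what an in-store buyer pays
markup = street_price / after_tax_cost - 1  # premium over that real cost

print(f"After-tax store cost: ${after_tax_cost:.2f}")  # $433.00
print(f"Effective markup: {markup:.0%}")               # 36%
```

Comparing against the after-tax cost rather than bare MSRP is what shrinks the headline "nearly 50% over MSRP" figure down to about 36%.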


----------



## TheinsanegamerN (Dec 29, 2020)

Xaled said:


> *Just imagine if Samsung said: 1280x720 is enough for mobile users, we should convince them if they want more resolutions, they should pay more and do not complain about higher prices*
> Or if Qualcomm said same thing about processors
> or if Apple said 4" is enough, people should not convince themselves to use bigger screen, bigger screens are more expensive, their production is costly.
> do you see now how your poor your point is?


Have you just completely missed that phones have moved up to well north of $1000, sometimes north of $1300, as resolutions have increased, and consoomers continue to lap them up? How about the ever-increasing price of Qualcomm 800 series chips? 

What you sarcastically described is EXACTLY WHAT'S BEEN HAPPENING for years now. 


RandallFlagg said:


> Yeah, AMD tried to just supply DIY which is like 1-2% of the market.  They failed.
> 
> Nvidia supplied major OEMs who comprise about 95% of the market, and thus far have been successful in doing that.  I can still buy any of several dozen systems with RTX 3XXX cards not only off Amazon from 3rd tier SI's, but straight from players like HP and Dell.
> 
> ...


The Steam stats don't lie. The RTX 3080 has roughly the same number of users as the RX 5600 XT or 5700 after a few months on the market. Now unless AMD had massive shortages for the entire Navi run that we all missed, that would indicate that those $700 cards are selling quite well, and people are somehow or another getting their hands on them. 

Meanwhile the AMD RX 6000 series doesn't appear at all.


----------



## cst1992 (Jan 4, 2021)

Nice review as always, @W1zzard!
I have one question.
I just got myself one of these, and I noticed the fans are different from any I've seen before. Rather than just a fan with blades, the fan is a disk. Here's what I mean:



As you can see, the blades are joined to one another by a plastic rim.
Why would NVIDIA do that? Does the 20 series also have this design?



RandallFlagg said:


> So whatever you saw on ebay as far as a $700 3060 Ti was overpriced, even for a scalper, so yeah they will sit there until an idiot comes along. 3060 Ti's are going for a bit under $600 right now.


Even in India, where we have high import duties on all hardware, the MRP as printed on the box is INR 49,990, which at current conversion rates is $684. Not to mention the card actually sells in the market for INR ~40,000. So $700 is definitely overpriced (to the point of being illegal) in almost every market.
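For reference, that conversion works out as follows (the ~73 INR/USD exchange rate is my assumption of the approximate early-2021 figure, not an official number):

```python
# Converting the Indian box MRP to USD. The rate below is an assumed
# approximation of early-2021 INR/USD exchange rates.
mrp_inr = 49_990     # MRP printed on the box, per the post
inr_per_usd = 73.1   # assumed exchange rate

mrp_usd = mrp_inr / inr_per_usd
print(f"${mrp_usd:.0f}")  # $684
```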


----------



## RandallFlagg (Jan 4, 2021)

TheinsanegamerN said:


> The steam stats dont lie. The RTX 3080 has roughly the same number of users as the RX 5600xt or 5700 after a few months on the market. Now unless AMD had massive shortages for the entire Navi run that we all missed, that would indicate that those $700 cards are selling quite well, and people are somehow or another getting their hands on them.
> 
> Meanwhile the AMD RX 6000 series doesnt appear at all.



Just to add, the 3080 appears to have doubled its market share in the past month.  There are a lot of them being shipped via HP and Dell, and with Christmas / New Years over they're showing up in the December charts now.  

Navi 2 still missing.  Zen / AMD also taking a big hit in percentage of users.  

AMD had its Q4 allocation at TSMC ~120,000 wafers going to XBox/PS5 and ~30,000 wafers for everything else (Zen 3, RDNA 2, Threadripper, EPYC).  

It does not matter how good the product is, if you don't have it you can't sell it.  That is showing up in the charts.


----------



## cst1992 (Jan 4, 2021)

RandallFlagg said:


> There are a lot of them being shipped via HP and Dell


OT, but I don't like the included cards in these PCs. My cousin got a prebuilt PC from Best Buy and found out the card in it had a downgraded cooler and was underclocked and undervolted. Still, the card hit 80°C under load (it's an RX 5700 XT).


----------



## Mussels (Jan 4, 2021)

cst1992 said:


> Nice review as always, @W1zzard!
> I have one question.
> I just got myself one of these, and I noticed the fans are different than any I've seen before. Rather than being just a fan with blades, the actual fan is a disk. Here's what I mean:
> View attachment 182414
> ...



To guide the airflow straight up, instead of letting it spread out in every direction.

Case fans want airflow to go everywhere to remove hotspots; this fan has the specific goal of pushing air directly up.


----------



## Valantar (Jan 5, 2021)

Mussels said:


> to guide the airflow straight up, instead of spreading out in every direction
> 
> case fans want the airflow to go everywhere to remove hotspots, this fan has a specific goal of directly up


It also helps keep the shape of the fan constant, stopping the tips of the blades from stretching outwards at high speed and touching the surrounding fin stack.


----------



## cst1992 (Jan 5, 2021)

Valantar said:


> It also helps keep the shape of the fan constant, stopping the tips of the blades from stretching outwards at high speed and touching the surrounding fin stack.


I never knew that kind of thing also happens.
Do the fans spin high enough for that?
My card's fans only went up to 1700 RPM at 76°C max.


----------



## Mussels (Jan 5, 2021)

cst1992 said:


> I never knew that kind of thing also happens.
> Do the fans spin high enough for that?
> My card's fans only went till 1700 RPM at 76C max.


well, gigabyte GPU's are known for issues like that - it's not super common


----------



## Valantar (Jan 5, 2021)

cst1992 said:


> I never knew that kind of thing also happens.
> Do the fans spin high enough for that?
> My card's fans only went till 1700 RPM at 76C max.


It's not exactly common, but for high speed fans (>3000 rpm) in tight installations it's not uncommon either. The faster Gentle Typhoon case/radiator fans had outer rings, and that's a 120mm fan - but a very fast, high-powered one. That's the reason for Noctua using an expensive liquid crystal polymer in their NF-A12x25 too - it's stiffer and allows the blades to stretch less, meaning they could design the fans with a much smaller gap between the fan frame and blades, improving performance.

Most fans avoid this issue by simply having a relatively large gap between the fan blade tips and fan frame/heatsink (often 2mm or thereabouts, compared to 0.5mm for the NF-A12x25), though that sacrifices a surprising amount of performance, as that gap creates turbulence along the fan blade tips, increasing noise and reducing airflow.

A ring of course has the downside of dramatically increasing the rotating mass of the fan, necessitating a more powerful motor and more durable bearing, plus a rather fundamental change to how airflow is directed (which can be both good and bad depending on your use case).


----------



## Chrispy_ (Jan 10, 2021)

cst1992 said:


> I never knew that kind of thing also happens.
> Do the fans spin high enough for that?
> My card's fans only went till 1700 RPM at 76C max.


The Noctua NF-A12x25 fan was designed with a new type of reinforced plastic just to combat this issue.

The alternative is to just have sloppier tolerances and have a big gap between the fan blade tips and the frame or heatsink fins, but then air leaks through that gap and reduces the efficiency of the fan blades.






Sterrox® liquid-crystal polymer (LCP) (noctua.at): "Designed in Austria, Noctua's premium cooling components are renowned for their superb quietness, exceptional performance and thoroughgoing quality."


----------

