# AMD is Allegedly Preparing Navi 31 GPU with Dual 80 CU Chiplet Design



## AleksandarK (Jan 23, 2021)

AMD is about to enter the world of chiplets with its upcoming GPUs, just as it has done with the Zen generation of processors. Having launched the Radeon RX 6000 series lineup based on Navi 21 and Navi 22, the company is seemingly not stopping there. To remain competitive, it needs to innovate constantly, which it reportedly is doing once again. According to current rumors, AMD is working on an RDNA 3 GPU design based on chiplets. The chiplet design is supposed to feature two 80 Compute Unit (CU) dies, just like the one found inside the Radeon RX 6900 XT graphics card.

Having two 80 CU dies would bring the total core count to exactly 10240 cores (two times the 5120 cores on a Navi 21 die). Combined with the RDNA 3 architecture, which brings better perf-per-watt compared to the last-generation uArch, the Navi 31 GPU is going to be a compute monster. It isn't exactly clear when we are supposed to get this graphics card; however, it may arrive at the end of this year or the beginning of 2022.
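For reference, the quoted core count follows directly from RDNA's CU layout; a quick sketch of the arithmetic, assuming RDNA 3 keeps RDNA 2's 64 stream processors per CU (not yet confirmed):

```python
# Back-of-envelope core count for the rumored dual-chiplet Navi 31.
# Assumes RDNA 3 keeps RDNA 2's 64 stream processors per CU.
SP_PER_CU = 64          # stream processors per Compute Unit (as on Navi 21)
CUS_PER_DIE = 80        # rumored CU count per chiplet (same as Navi 21)
DIES = 2                # rumored chiplet count

cores_per_die = SP_PER_CU * CUS_PER_DIE   # 5120, matching Navi 21
total_cores = cores_per_die * DIES

print(total_cores)  # -> 10240
```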




*View at TechPowerUp Main Site*


----------



## Dristun (Jan 23, 2021)

So, $1,999 MSRP, $3k+ on the market?


----------



## DeathtoGnomes (Jan 23, 2021)

The Voodoo that you do so welllll!

"paper launch q1 2023!"


----------



## spnidel (Jan 23, 2021)

Dristun said:


> So, $1,999 MSRP, $3k+ on the market?


and a grand total of 5 units manufactured throughout the entire year, only to be sold by scalpers


----------



## AusWolf (Jan 23, 2021)

spnidel said:


> and a grand total of 5 units manufactured throughout the entire year, only to be sold by scalpers


Never mind, even if you could afford it and actually buy it, there's no power supply on this planet that can feed this beast.


----------



## MrGRiMv25 (Jan 23, 2021)

Considering it mentions RDNA 3 as the base for the two chiplets, I doubt it would arrive this year at the earliest; it would probably be late next year?


----------



## rsouzadk (Jan 23, 2021)

This supply MSRP thing is really becoming a fucking meme right now.


----------



## Chrispy_ (Jan 23, 2021)

"We heard you liked stock shortages, so we're going to take the constrained supply you're waiting on and use it to make half as many graphics cards"


----------



## Octopuss (Jan 23, 2021)

Meanwhile in the Czech Republic, for example:


----------



## HD64G (Jan 23, 2021)

5nm could easily make that happen and consume close to 350W at over 2GHz. Especially if combined with HBM(3?) for low latency which might be needed more when using chiplets.


----------



## medi01 (Jan 23, 2021)

DeathtoGnomes said:


> The Voodoo that you do so welllll!
> 
> "paper launch q1 2023!"



Why do people keep mumbling "paper launch" when cards have been available in online stores (e.g. Mindfactory in Germany) for WEEKS?
Now, yeah, pricing is way above MSRP, but so was the 2080 Ti's price up until Ampere came.


----------



## Chrispy_ (Jan 23, 2021)

rsouzadk said:


> This supply MSRP thing is really becoming a fucking meme right now.


I just feel sorry for people who didn't pick up a modern GPU last generation. They're completely boned right now.

It's understandable, too, because Turing was a complete rip-off for the first year on the market, and then when Navi arrived on the scene a year late to the party it was underwhelming AF; the 5700 wasn't even 15% faster than the Vega 56, and even at MSRP, it actually offered lower performance/$. Street pricing on the Vega 56 by that point was way lower than the 5700 series ever reached.

The only "good" thing that happened that entire generation is that AMD rejoining the GPU market after a year of total nothingness brought TU104 down to under $500 in the 2070 Super. That didn't make it good value, but at least it restored price/performance to somewhere that wasn't obscene. Unfortunately Nvidia stopped manufacturing Turing around 9 months after the Super series launched, so if you didn't grab one during that short window you're SOL right now.


----------



## dyonoctis (Jan 23, 2021)

Remember when you could get excited about a future product without wondering if you could get one, but rather when, and how it would perform? I miss those simpler times.


----------



## AusWolf (Jan 23, 2021)

Chrispy_ said:


> I just feel sorry for people who didn't pick up a modern GPU last generation. They're completely boned right now.
> 
> It's understandable, too, because Turing was a complete rip-off for the first year on the market, and then when Navi arrived on the scene a year late to the party it was underwhelming AF; the 5700 wasn't even 15% faster than the Vega 56, and even at MSRP, it actually offered lower performance/$.
> 
> The only "good" thing that happened that entire generation is that AMD rejoining the GPU market after a year of total nothingness brought TU104 down to under $500 in the 2070 Super. That didn't make it good value, but at least it restored price/performance to somewhere that wasn't obscene. Unfortunately Nvidia stopped manufacturing Turing around 9 months after the Super series launched, so if you didn't grab one in that window you're SOL right now.


The other thing about Navi is that it was still available when Turing was already out of stock and Ampere wasn't available yet (as it still isn't). That's how and why I managed to pick up my 5700 XT that will hopefully keep me happy for at least a good year or two.


----------



## Chrispy_ (Jan 23, 2021)

AusWolf said:


> The other thing about Navi is that it was still available when Turing was already out of stock and Ampere wasn't available yet (as it still isn't). That's how and why I managed to pick up my 5700 XT that will hopefully keep me happy for at least a good year or two.


My 5700XT is doing fine work, I bought it at launch and given what a damp squib RTX has been, it's turned out to be an excellent investment.

CP2077's DLSS is the first time I thought the 2070S I also own showed any noteworthy advantage over the 5700XT. Again, DLSS has been a complete non-feature for over two years, because until CP2077 showed up, none of the implementations of it were worth writing home about.

Unfortunately the miners have ruined it all again, I'm seeing 5700-series cards at almost 50% premium over MSRP, which was never that competitive in the first place....


----------



## Robert Bourgoin (Jan 23, 2021)

It'll probably be around $4,000 if you can find one.
Crazy how computer building has gone.


----------



## AusWolf (Jan 23, 2021)

Chrispy_ said:


> My 5700XT is doing fine work, I bought it at launch and given what a damp squib RTX has been, it's turned out to be an excellent investment.
> 
> CP2077's DLSS is the first time I thought the 2070S I also own showed any noteworthy advantage over the 5700XT. Again, DLSS has been a complete non-feature for over two years, because until CP2077 showed up, none of the implementations of it were worth writing home about.
> 
> Unfortunately the miners have ruined it all again, I'm seeing 5700-series cards at almost 50% premium over MSRP, which was never that competitive in the first place....


To be honest, I still don't give a **** about DLSS, and I don't think I ever will. I'd much rather run everything at native resolution without any AA. Ray tracing is a little tempting, but it's still in its early stages even with Ampere. Not worth the premium just yet imo.


----------



## Deleted member 205776 (Jan 23, 2021)

Chrispy_ said:


> I just feel sorry for people who didn't pick up a modern GPU last generation. They're completely boned right now.


My 2070 died in December 2020 and I had a panic attack because this is the worst time for a GPU to die. No iGPU meant no using PC until I got a new GPU. So glad I found a 3070 Gaming X Trio in stock close to MSRP.


----------



## TumbleGeorge (Jan 23, 2021)

How big do you expect the Infinity Cache of this monster to be? 256 or 512 MB... 1 GB?


----------



## AusWolf (Jan 23, 2021)

Alexa said:


> My 2070 died in December 2020 and I had a panic attack because this is the worst time for a GPU to die. No iGPU meant no using PC until I got a new GPU. So glad I found a 3070 Gaming X Trio in stock close to MSRP.


Maybe I'm weird, but I have a 1050 Ti on the shelf just for this purpose.


----------



## Deleted member 205776 (Jan 23, 2021)

AusWolf said:


> Maybe I'm weird, but I have a 1050 Ti on the shelf just for this purpose.


I have no parts lying around other than the ones in my PC, I suppose I should buy some cheapo gpu just for this lol



TumbleGeorge said:


> How big do you expect the Infinity Cache of this monster to be? 256 or 512 MB... 1 GB?


Doubt it's anything above 256MB.


----------



## neatfeatguy (Jan 23, 2021)

Chrispy_ said:


> My 5700XT is doing fine work, I bought it at launch and given what a damp squib RTX has been, it's turned out to be an excellent investment.
> 
> CP2077's DLSS is the first time I thought the 2070S I also own showed any noteworthy advantage over the 5700XT. Again, DLSS has been a complete non-feature for over two years, because until CP2077 showed up, none of the implementations of it were worth writing home about.
> 
> Unfortunately the miners have ruined it all again, I'm seeing 5700-series cards at almost 50% premium over MSRP, which was never that competitive in the first place....



My brother picked up a 5700 XT two Decembers back. He spent about $400 on it. I told him that if he wanted to nearly double his money he could sell it on eBay - they're fetching upwards of $700+. But he doesn't have a spare GPU, so he couldn't play any games...

He's not much on following hardware, so I explained to him that his card would normally only bring around $200 or so at this point on the second-hand market, and that GPUs were hard to find. He started looking around and noticed no one has any new GPUs in stock and that eBay scalping prices were astronomical.



Alexa said:


> I have no parts lying around other than the ones in my PC, I suppose I should buy some cheapo gpu just for this lol
> 
> 
> Doubt it's anything above 256MB.



Keep old hardware. I've got two spare PSUs, spare RAM (well, DDR3, so it wouldn't help if my new build needed RAM to test with) and two GPUs. When I upgrade, I try to keep old working hardware as spare parts for testing, so I can track down any problems that come up in the future.


----------



## damric (Jan 23, 2021)

_10240 cores (two times 5120 cores on Navi 21 die). Combined with the RDNA 3 architecture, which brings better perf-per-watt compared to the last generation uArch, Navi 31 GPU is going to be a compute monster. _

Great I'm sure miners will love it.


----------



## Deleted member 206429 (Jan 23, 2021)

rsouzadk said:


> This supply MSRP thing is really becoming a fucking meme right now.



Joke***  The correct word is joke.

To all the nerds who are crying about GPU shortages, price hikes and out-of-stock items: Grow up. Go outside, find a new hobby, actually talk to people in real life (I know, it's a wild concept), and discover there is more to the world than just Fall Guys and Among Us.


----------



## Turmania (Jan 23, 2021)

AusWolf said:


> Maybe I'm weird, but I have a 1050 Ti on the shelf just for this purpose.


No, you are very clever. I have a GTX 1060 on the shelf for that purpose as well.


----------



## ZoneDymo (Jan 23, 2021)

dyonoctis said:


> Remember when you could get excited about a future product without wondering if you could get one, but rather when, and how it would perform? I miss those simpler times.


Well, you can still wonder how it will perform; there are YouTubers for that, who will get one for review.
I was never in the market for $1000 GPUs and never will be, so it doesn't change much for me whether they are available or not.


----------



## ZoneDymo (Jan 23, 2021)

AusWolf said:


> Maybe I'm weird, but I have a 1050 Ti on the shelf just for this purpose.



Always have backup parts, preferably even a complete backup PC, so you can chuck your main storage drive in there if something is wrong with it.


----------



## Anymal (Jan 23, 2021)

Turmania said:


> No, you are very clever. I have a GTX 1060 on the shelf for that purpose as well.


You are smart. I have a GTX 670 on the shelf for that purpose.


----------



## Vya Domus (Jan 23, 2021)

Chrispy_ said:


> Again, DLSS has been a complete non-feature for over two years, because until CP2077 showed up, none of the implementations of it were worth writing home about.



That’s because CP uses some really heavy post processing and has a very soft image quality anyway.


----------



## Vader (Jan 23, 2021)

AusWolf said:


> To be honest, I still don't give a **** about DLSS, and I don't think I ever will. I'd much rather run everything at native resolution without any AA. Ray tracing is a little tempting, but it's still in its early stages even with Ampere. Not worth the premium just yet imo.


Personally, I'm all about DLSS. The performance gap between an AAA game like CP2077 and the "old but still hella fun" or competitive online games is getting ridiculous.
Having a $500 (or more) GPU can feel like a waste sometimes, especially when those demanding games you bought the card for turn out to be not that fun.
My RX 480 has spent probably 85% of its lifetime playing games that are far below what it can handle.


----------



## Deleted member 205776 (Jan 23, 2021)

DLSS is objectively great.


----------



## Valantar (Jan 23, 2021)

Chrispy_ said:


> "We heard you liked stock shortages, so we're going to take the constrained supply you're waiting on and use it to make half as many graphics cards"


I mean, margins are much higher on high-end GPUs, so why use the silicon to sell two $1000 GPUs when you can use it to sell a single $3000 GPU? Makes perfect sense to me

In all seriousness, I really, really hope the current silicon wafer, packaging and fabrication shortages start sorting themselves out soon. I can't really wrap my head around how we got here - did sales volumes in 2020 for ... silicon, in general? take off by that much? Really? Or has this been years in the making through the consolidation of the silicon lithography industry, and nobody wanted to talk about it?


I mean, it's great if AMD is working on MCM GPUs for RDNA3. Hopefully that isn't just for the high end, but lower-end too - if they could use a single ~150mm2 GPU die across 1, 2 and 3-die configurations that could make for a high volume, low cost product stack. But I need to see it work before I believe it, and then see actual products on actual/virtual shelves before I _really_ believe it.


----------



## kapone32 (Jan 23, 2021)

Alexa said:


> DLSS is objectively great.


It could also be called a gimmick as not even 1% of PC games support it.


----------



## AusWolf (Jan 23, 2021)

Vader said:


> Personally, I'm all about DLSS. The performance gap between an AAA game like CP2077 and the "old but still hella fun" or competitive online games is getting ridiculous.
> Having a $500 (or more) GPU can feel like a waste sometimes, especially when those demanding games you bought the card for turn out to be not that fun.
> My RX 480 has spent probably 85% of its lifetime playing games that are far below what it can handle.


Maybe that's the point. I'm not all about performance or competitiveness. If I can run my games at frame rates that are comfortable for my eyes (which means anywhere between 30 and 60 fps depending on the game), I'm fine.


----------



## Deleted member 205776 (Jan 23, 2021)

kapone32 said:


> It could also be called a gimmick as not even 1% of PC games support it.


Definitely. But I sure was gobsmacked when I first experienced it.


----------



## cueman (Jan 23, 2021)

Ouch. When we look at the incoming NVIDIA and AMD GPUs this year and next, I wonder only one thing:
where do we need those monsters?

You'd need an 8K monitor and top-pick components for them to be worth it.


Well, sure, some people have the cash, but that's well under 5 percent.

Well, competition is always good.


----------



## kruk (Jan 23, 2021)

A very surprising rumor. About two years ago, AMD representatives said it's not really trivial to do MCM for gaming workloads, so one would expect to see this type of design in CDNA first, and only much later in RDNA. It's rumored even NVIDIA postponed their MCM GPU arch. Don't get your hopes up ...


----------



## Deleted member 205776 (Jan 23, 2021)

cueman said:


> Ouch. When we look at the incoming NVIDIA and AMD GPUs this year and next, I wonder only one thing:
> where do we need those monsters?
> 
> You'd need an 8K monitor and top-pick components for them to be worth it.
> ...


What I want is a good sub-$200 GPU again. What happened? That's where most of the money comes from, AMD and NVIDIA.


----------



## Frick (Jan 23, 2021)

vanishs14 said:


> Joke***  The correct word is joke.
> 
> To all the nerds who are crying about GPU shortages, price hikes and out-of-stock items: Grow up. Go outside, find a new hobby, actually talk to people in real life (I know, it's a wild concept), and discover there is more to the world than just Fall Guys and Among Us.



There's a pandemic going on. Also, Among Us is quite fun; it was originally released as a mobile game and can run on basically any PC - the listed minimum specs are a 2 GHz Pentium 4 and a GeForce 510.


----------



## mechtech (Jan 23, 2021)

Chrispy_ said:


> I just feel sorry for people who didn't pick up a modern GPU last generation. They're completely boned right now.
> 
> It's understandable, too, because Turing was a complete rip-off for the first year on the market, and then when Navi arrived on the scene a year late to the party it was underwhelming AF; the 5700 wasn't even 15% faster than the Vega 56, and even at MSRP, it actually offered lower performance/$. Street pricing on the Vega 56 by that point was way lower than the 5700 series ever reached.
> 
> The only "good" thing that happened that entire generation is that AMD rejoining the GPU market after a year of total nothingness brought TU104 down to under $500 in the 2070 Super. That didn't make it good value, but at least it restored price/performance to somewhere that wasn't obscene. Unfortunately Nvidia stopped manufacturing Turing around 9 months after the Super series launched, so if you didn't grab one during that short window you're SOL right now.



Still using my RX 480, and I will continue to do so until a card comes out with around 2x the performance at around the same price I paid for it.


----------



## AusWolf (Jan 23, 2021)

mechtech said:


> Still using my RX 480, and I will continue to do so until a card comes out with around 2x the performance at around the same price I paid for it.


That's my way of thinking too. I've never understood those people who get excited about 10% more performance which is barely noticeable to the human eye.

Actually, I remember upgrading my Radeon X800 XT to a GeForce 7800 GS. One of the worst investments of my life.


----------



## Deleted member 205776 (Jan 23, 2021)

AusWolf said:


> That's my way of thinking too. I've never understood those people who get excited about 10% more performance which is barely noticeable to the human eye.


big number good


----------



## AusWolf (Jan 23, 2021)

Alexa said:


> big number good


It's not about big numbers. I highly doubt anyone can see the difference between 30 and 33 fps, or 100 and 110 fps. That's 10%.


----------



## Deleted member 205776 (Jan 23, 2021)

AusWolf said:


> It's not about big numbers. I highly doubt anyone can see the difference between 30 and 33 fps, or 100 and 110 fps. That's 10%.


Was being sarcastic but okay


----------



## FreedomEclipse (Jan 23, 2021)

Meanwhile at AMD HQ....


----------



## Tatty_One (Jan 23, 2021)

Let's all get back to discussing GPU releases and let the nationalistic off topic chat go away please.


----------



## TumbleGeorge (Jan 23, 2021)

Guess what the performance figure for Navi 31 will be.
2x the cores + the new RDNA3, which will reportedly come with up to +50% combined IPC/power efficiency per gigaflop thanks to new R&D and new lithography?
What will the sum of all parts of the equation be? 3x an RX 6800/6900 XT? It's possible with GDDR6/X, because GDDR7 won't be ready for use before RDNA4 (or whatever next-gen architecture comes beyond RDNA, if RDNA3 is the last of its kind).
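TumbleGeorge's back-of-envelope "equation" above can be sketched numerically. Both inputs are rumor-derived assumptions (the 2x core count and the "up to +50%" efficiency uplift), so this is speculation rather than a projection:

```python
# Hypothetical scaling estimate for Navi 31 vs. Navi 21 (RX 6900 XT).
# Both multipliers are rumored figures from this thread, not measured data.
core_scaling = 2.0        # two 80 CU dies vs. one
perf_per_watt_gain = 1.5  # rumored "up to +50%" RDNA3 efficiency uplift

# If the extra efficiency is spent entirely on performance at the same
# power budget, the naive upper bound is the product of the two factors:
naive_uplift = core_scaling * perf_per_watt_gain
print(f"~{naive_uplift:.1f}x RX 6900 XT")  # -> ~3.0x
```

In practice, multi-die scaling would lose some of that to inter-die communication and memory bandwidth limits, so 3x is a ceiling, not an estimate.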


----------



## Totally (Jan 23, 2021)

Vya Domus said:


> That’s because CP uses some really heavy post processing and has a very soft image quality anyway.



*knock knock* It's the FBI we have some questions that we would like you to answer.


----------



## Fouquin (Jan 23, 2021)

dyonoctis said:


> Remember when you could get excited about a futur product without wondering if you could get them, but rather when and how it will perform ? I miss those simplier times.



Yeah like the PIII-1000 paper launch, Radeon X800 paper launch, GeForce 6800U, X1900, 8800 GTX, GTX 480... We've never had paper launches before.


----------



## dicktracy (Jan 23, 2021)

high latency with CPU was already terrible and now they want high latency on GPU? Get out.


----------



## evernessince (Jan 23, 2021)

dicktracy said:


> high latency with CPU was already terrible and now they want high latency on GPU? Get out.



You do know AMD has the lead in every category CPU-wise right now, correct? The merits of MCM design far outweigh the negatives, which is why both NVIDIA and Intel are trying to do the same. On latency, by the way, the University of Toronto showed it's possible to design a chiplet-based CPU that has lower latency than a monolithic one through the use of an active interposer.


----------



## BSim500 (Jan 23, 2021)

AusWolf said:


> Maybe I'm weird, but I have a 1050 Ti on the shelf just for this purpose.


You're not weird. I've always kept a spare PSU, keyboard & mouse lying around. I have an old 1050 Ti that's been sitting in its box waiting for eBay, but after these past six highly insane months I'm definitely keeping that as a backup too.


----------



## Vya Domus (Jan 23, 2021)

dicktracy said:


> high latency with CPU was already terrible and now they want high latency on GPU? Get out.


So now you pretend you know something about all of this, in addition to posting nonsense fanboy drivel?

I guess you are a troll from top to bottom.


----------



## Valantar (Jan 23, 2021)

dicktracy said:


> high latency with CPU was already terrible and now they want high latency on GPU? Get out.


This isn't MCM like in Ryzen - which by the way has largely overcome its latency issues anyhow. There was a recent AMD patent published about MCM GPUs with passive interposers, though it's also feasible for them to use an active interposer with the Infinity cache on it rather than the die itself - either of those would drastically reduce latencies compared to what we see in MCM-on-substrate Ryzen. It stands to reason that if they're actually bringing this to market, it won't be with a fundamental design flaw of such a magnitude that it kills performance entirely.


----------



## d0x360 (Jan 23, 2021)

MrGRiMv25 said:


> Considering it mentions RDNA 3 as the base for the two chiplets, I doubt it would arrive this year at the earliest; it would probably be late next year?


It will be late this year. It will be priced very well, too. Also, to everyone talking **** about supply... grow up. AMD is providing chips not only for themselves; they also have contracts to make consoles... 5-6 different consoles. 6 if Sony is still ordering both the PS4 and the Pro, 5 if they are only ordering one of them, like Microsoft, who stopped ordering the Xbox One X.

AMD can only buy so much capacity from TSMC, and that capacity is being split between all those consoles and previous-gen CPUs, along with new CPUs and GPUs, all with multiple SKUs.

You wanna know where all the hardware is going? It's not scalpers, it's system builders like Dell or any number of the companies making gaming PCs.

Come back to reality. BTW... AMD is making tons of stuff, but NVIDIA's not. NVIDIA is also letting Samsung's fab sit idle when they could easily produce tons of chips for NVIDIA.

The reason they aren't is because of AMD coming out strong with RDNA2... so strong that NVIDIA basically stopped making 3080s, because the 6800 XT devalues NVIDIA's products at the high end, which is why we will see production ramp up when they deliver the 3080 Ti.


----------



## jamexman (Jan 23, 2021)

spnidel said:


> and a grand total of 5 units manufactured throughout the entire year, only to be sold by scalpers



Quoted for the truth...


----------



## ixi (Jan 23, 2021)

Vya Domus said:


> That’s because CP uses some really heavy post processing and has a very soft image quality anyway.


I can add a few things to what you said.

CP is a trash game.
Why?
 1) the quality is not so good if you look closely
 2) the CPU and GPU are overworked by the graphics, even though they look like crap
 3) the optimisation is equal to 0
 4) the devs who created the visuals are freaking clueless (sorry for saying this)


----------



## dyonoctis (Jan 23, 2021)

Fouquin said:


> Yeah like the PIII-1000 paper launch, Radeon X800 paper launch, GeForce 6800U, X1900, 8800 GTX, GTX 480... We've never had paper launches before.


I wasn't into tech in that era, so I didn't know. But looking at how people are acting, it doesn't feel like what we're living through right now is just "business as usual, move along".


----------



## pantherx12 (Jan 23, 2021)

Sounds like fun been waiting for it since I heard rumours about chiplets for zen.


----------



## medi01 (Jan 23, 2021)

Vader said:


> The performance gap ... is getting ridiculous.


Dropping from 4K to 1440p (2.25 times fewer pixels) boosts performance, who would have thought...  

On "paper launch": in stock for WEEKS on a German online retailer's site (at prices higher than the claimed MSRP, of course):

https://www.reddit.com/r/realAMD/comments/l3e3k1

Nothing "paper" about it, besides, perhaps, the MSRP price, but I was told gamers were fine with that, 2080 Ti, cough.

*Radeon Top 5 Selling Brand Lines!*


RX 6900XT = 460 Units.
RX 6800XT = 260 Units.
RX 5700XT = 220 Units
RX 6800 = 175 Units.
RX 5500XT = 80 Units.


*Nvidia Top 5 Selling Brand Lines!*


RTX 3080 10GB = 530 Units.
RTX 3070 8GB = 410 Units
GTX 1660 Super= 250 Units.
GT 710 = 130 Units.
GTX 1050 TI = 120 Units.


----------



## r9 (Jan 23, 2021)

d0x360 said:


> It will be late this year. It will be priced very well, too. Also, to everyone talking **** about supply... grow up. AMD is providing chips not only for themselves; they also have contracts to make consoles... 5-6 different consoles. 6 if Sony is still ordering both the PS4 and the Pro, 5 if they are only ordering one of them, like Microsoft, who stopped ordering the Xbox One X.
> 
> AMD can only buy so much capacity from TSMC, and that capacity is being split between all those consoles and previous-gen CPUs, along with new CPUs and GPUs, all with multiple SKUs.
> 
> ...



"AMD is making tons of stuff, but NVIDIA's not. NVIDIA is also letting Samsung's fab sit idle when they could easily produce tons of chips for NVIDIA.
The reason they aren't is because of AMD coming out strong with RDNA2... so strong that NVIDIA basically stopped making 3080s, because the 6800 XT devalues NVIDIA's products at the high end"

You can't be older than 10 with that logic. Okay, maybe a slow 12-year-old.


----------



## Valantar (Jan 23, 2021)

medi01 said:


> Dropping from 4K to 1440p (2.25 times fewer pixels) boosts performance, who would have thought...
> 
> On "paper launch": in stock for WEEKS on a German online retailer's site (at prices higher than the claimed MSRP, of course):
> 
> ...


This is, what, three months after launch? Or have we hit four now? And a significant retailer in a country of >80 million inhabitants has less than 1000 RTX 3000 units to sell? Yeah, sorry, that's not a lot. Unless every SKU is in stock at prices at least close to those they were announced at (Covid has increased shipping and distribution costs, so some hikes are expected), that's still very low availability.


----------



## medi01 (Jan 23, 2021)

Valantar said:


> This is, what, three months after launch? Or have we hit four now? And a significant retailer in a country of >80 million inhabitants has less than 1000 RTX 3000 units to sell? Yeah, sorry, that's not a lot. Unless every SKU is in stock at prices at least close to those they were announced at (Covid has increased shipping and distribution costs, so some hikes are expected), that's still very low availability.


The 6900 was released in December.
The situation wasn't much different 3 weeks ago (same site); if anything, GPUs were a tad cheaper.

Compared to pre-current-gen GPUs, Mindfactory was selling roughly the same ballpark of GPU volume weekly.


----------



## Freebird (Jan 24, 2021)

AusWolf said:


> Never mind, even if you could afford it and actually buy it, there's no power supply on this planet that can feed this beast.


Most likely it would be produced on some iteration of 5 nm, probably with lower clocks, thereby drastically reducing power consumption. If done on N5P with reduced clocks, it could draw 50% less power on the same arch.
 --  The N5P technology provides roughly 20% speed improvement, or about 40% reduction in power consumption, compared with the 7nm process technology.




5nm Technology - Taiwan Semiconductor Manufacturing Company Limited

> TSMC's 5nm (N5) Fin Field-Effect Transistor (FinFET) technology successfully entered volume production in the second quarter of 2020 and experienced a strong ramp in the second half of 2020.

www.tsmc.com
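Freebird's figures can be combined into a rough power estimate. The 300 W board power of the RX 6900 XT and TSMC's ~40% N5P power reduction are taken from the thread; naively scaling the whole board power (rather than just the GPU dies) is an assumption, so treat this as a sketch:

```python
# Rough iso-performance power estimate for a dual-die Navi 31 on N5P.
NAVI21_BOARD_POWER_W = 300   # RX 6900 XT total board power on 7 nm
N5P_POWER_REDUCTION = 0.40   # TSMC's ~40% power cut vs. 7 nm at the same speed

per_die_w = NAVI21_BOARD_POWER_W * (1 - N5P_POWER_REDUCTION)  # ~180 W
dual_die_w = 2 * per_die_w

print(dual_die_w)  # -> 360.0
```

This naively doubles everything; shared memory, VRM and fans would not double, so the real figure would likely land somewhat lower, in line with the ~350-400 W guesses elsewhere in the thread.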


----------



## Mussels (Jan 24, 2021)

If the latency is high, they'll throw it out as a compute card for rendering and such, where horsepower is needed, not the lowest latency.


----------



## Prefix (Jan 24, 2021)

AusWolf said:


> Never mind, even if you could afford it and actually buy it, there's no power supply on this planet that can feed this beast.


Actually, it would run fine on a 750 W PSU. The card would use roughly 500 W, and possibly even less if it uses HBM.

The 6900 XT is 300 W, and chiplets for GPUs are really efficient because you don't have to duplicate the RAM and its power draw, and the dies can each be a bit smaller, using less power per core. Considering they will likely use HBM, it could draw much less power.


It could possibly be a 400 W card if it's tuned properly.


----------



## evernessince (Jan 24, 2021)

medi01 said:


> Dropping from 4K to 1440p (2.25 times fewer pixels) boosts performance, who would have thought...
> 
> On "paper launch": in stock for WEEKS on a German online retailer's site (at prices higher than the claimed MSRP, of course):
> 
> ...



Those numbers are extremely bad given we are months out from launch. In fact, they are downright pathetic. That definitely says paper launch to me.


----------



## Nkd (Jan 24, 2021)

MrGRiMv25 said:


> Considering it mentions RDNA 3 as the base for the two chiplets, I doubt it would arrive this year at the earliest; it would probably be late next year?



RedGamingTech has pretty solid sources for AMD and was nearly spot-on about RDNA2. He mentioned it will be 2022 at the earliest, likely the second half. Also, according to his source, the target for the big RDNA3 chip is really high, at 2.5x the performance of the 6900 XT. He says it's going to be a big architectural change when it comes to the geometry engine, and a big leap in ray tracing as well. Watch his latest video on it and his reaction to it, lol. RDNA3 is going to be a monster chip.


----------



## InVasMani (Jan 24, 2021)

AusWolf said:


> Never mind, even if you could afford it and actually buy it, there's no power supply on this planet that can feed this beast.



Based on what evidence, exactly? I wouldn't expect a single-PCB RDNA 3 twin-chip design with the RX 6900 XT's CU count to be worse on power draw than two discrete RDNA 2 RX 6900 XTs. There should be power-optimized refinements from RDNA 2 to RDNA 3, and just because it uses a pair of chips with the same CU count doesn't mean they will be clocked the same on a single PCB; even if they were, and AMD optimized various aspects of it to increase performance and efficiency, it would still draw less power. Even if AMD optimized nothing at all, a single-board design is going to be more efficient in practice.

The more challenging aspect would be cooling it, but they'll probably make it water-cooled, or at least use a 3-slot cooler rather than a 2-slot design. I don't know if they'd go with a 4-slot cooler; they certainly could, but if they don't increase clock speeds while making RDNA 3 more power-efficient, they might not need to. Just put a vapor chamber on both sides of the PCB, with four U-shaped heat pipes spaced out across the PCB's length and two blower fans on each side at the rear, exhausting all the heat. That would push plenty of heat outside the case. The heat pipes themselves could be filled with a bit of liquid metal, if they aren't already made that way these days.



HD64G said:


> 5nm could easily make that happen and consume close to 350W at over 2GHz. Especially if combined with HBM(3?) for low latency which might be needed more when using chiplets.



Exactly, and with that many CUs you'd want to use the latest HBM chips regardless. The cost of HBM is more justified in a design with that much compute power, which can make heavy use of tons of bandwidth. Combined with Infinity Cache and a larger BAR size, it becomes more worthwhile, and easier to reduce the memory bus width itself to offset some of the cost, especially in a twin-chip design with more leeway and wiggle room. The faster chips get, the more room there is to trim the bus width a bit without a major performance impact in all scenarios. GPUs don't just push raw bandwidth at all times; by nature, a lot of data is shuffled about in chunks, or loaded once, taking a moment to do so, and then pooled from more or less indefinitely, or until shuffled in and out as needed.


Chrispy_ said:


> "We heard you liked stock shortages, so we're going to take the constrained supply you're waiting on and use it to make half as many graphics cards"



I hear you, but they are a business, and they can make significantly more money selling cards aimed at the scientific, healthcare, 3D-artist, and CGI industries and so on than selling them to the gaming community. It's just the way it is: we paved the way for all of that over the years, but we're lowest on the priority list in a lot of ways at the same time. It doesn't help that they sell cards like the RX 6900 XT in the first place to high-end enthusiast gamers, rather than splitting it in two and selling two cards instead, which would benefit the industry as a whole more and be fairer to gamers who can't pay to win in the same way.



ZoneDymo said:


> Always have back up parts, perferably evne a complete back up pc so you can chuck your main storage drive in there if something is wrong with it.



That's my current situation. I believe I unintentionally enabled BitLocker on my drive and don't know the password for it, so I booted up to an orange, corrupted-looking screen. I tried to install Windows 10 on it from the media creation tool, no luck. I tried Windows repair, no luck. I still haven't sorted out whether I can salvage the drive or not. I tried using it as a portable USB drive; it didn't work at first, then partially managed to work enough to copy data off it, but I can't write to it to erase or format it. I tried to install Windows 10 on another drive with it in the system, and that wouldn't even work. I turned on the portable drive after the initial OS had booted from a HDD I installed it on after removing the SSD. A horrible PITA experience. I thought it was the GPU or display initially, though, so I'm actually relieved it's neither, particularly the GPU, which wouldn't be cheap to replace at this point in time. Windows 10 is unbelievably slow and terrible on a HDD, for the record, as well; holy f*ck, it's a pure garbage experience. It's pretty tempting to install Linux, frankly.



Vya Domus said:


> That’s because CP uses some really heavy post processing and has a very soft image quality anyway.



The cloudy-piss soft-image-quality effect. I generally just don't like soft unless it's lighting, and even then I prefer to keep it minimal to avoid it being exaggerated too heavily.




Alexa said:


> DLSS is objectively great.



Objectively, I'd much rather have the raw GPU resources to devote to rasterization and ReShade shader effects myself, and to other aspects like more TMUs. You can upscale and downscale individual post-process effects in ReShade rather trivially, with an enormous degree of custom shader configuration. A lot can be tuned down from around a 5 FPS impact to 1 FPS without looking drastically different, saving a lot of performance, or even objectively looking better. Too much exaggerated post-processing can actually have a rather nasty negative impact on image quality rather than improving it, DLSS included. The performance impact is overblown: if you turn off AA you gain a lot of performance; if you upscale from a lower resolution you save a lot of overhead; if you downsample at a lower resolution you save a lot of overhead as well. It's a matter of balance, really, and frankly, just as an ASIC is fixed-function optimized performance, post-processing is best when done similarly, though that's more time-consuming. AI will get better at inference and can close that gap as it advances, in many cases even surpassing human inaccuracies, but it's far from perfect today.


----------



## Khonjel (Jan 24, 2021)

My recommendation to people browsing TPU: skip reading/discussing technical posts like these.

Jesus Christ! I felt like I was swimming in Facebook slime, and I could literally feel thousands of my brain cells dying after reading through the comments.


----------



## Valantar (Jan 24, 2021)

medi01 said:


> 6900 was released in December.
> Situation wasn't much different 3 weeks ago (same site), if anything, GPUs were a tad cheaper.
> 
> Compared to pre-current gen GPUs, mindfactory was selling roughly the same ballpark of GPUs weekly.


... I was talking about the RTX GPUs, wasn't I? There is obviously more reason for Radeons to have less availability, at least for the time being. It's still kind of scary that they had more 6900 XTs than 6800 + 6800 XT put together. 

Also, at what point were they selling comparable amounts? Just before the launch of these new GPUs? Or 1-4 months after the launch of the previous generation? I sincerely hope you see the difference in what sales numbers to expect at those different times.


----------



## yeeeeman (Jan 24, 2021)

I find these articles so funny, because every single time AMD is announced to destroy the competition, and after the product launches there's no comment about it getting beaten. But rumours start to circulate again about the next gen being some sort of monster that will take over the competition. The end result... we know. Nvidia will launch something much better, and we will then move on to RDNA 4 rumours.


----------



## 1d10t (Jan 24, 2021)

Upcoming Ray Tracing 2.0; apparently AMD borrowed a page from nVidia.


----------



## Legacy-ZA (Jan 24, 2021)

Why stop there? Why not $1,000,000.00? People seem to have more money than sense, so why not?


----------



## InVasMani (Jan 24, 2021)

Mussels said:


> If the latency is high, they'll throw it out as a compute card for rendering and such where horsepower is needed, not lowest latency


It's probably not even an if, but their intended market. My best guess is that AMD plans to use the less-perfect monolithic chip yields for MCM, for serious compute workloads that are less latency-critical and more parallel-compute-critical in nature, improving the tech over time. Similar to EPYC, that reduces the cost of the more premium, perfect single-chip yields. If it's used for gaming at all, it'll likely start with niche enthusiasts initially, and be catered toward niche markets where the latency concerns are less damning, like 4K and 8K, because frame rates are lower and GPU rendering demands are much higher. The latency penalty will be more pronounced at lower resolutions and high refresh rates, so they'd target that market last.

The lucrative markets that actually require GPUs for work, as opposed to fun and games, will help subsidize MCM GPU tech for gaming in future iterations, with latency improvements down the road, because it's sure to be refined over time. Luckily, what AMD has learned from Ryzen and RDNA 2 will help a lot in practice. I don't think this tech will have as many problems as Ryzen had with its CCX issues, because some of that has already been ironed out with Ryzen, and additionally, Infinity Cache is a big innovation in itself. AMD is gearing up for major innovation in the coming years, as is the industry as a whole.



Legacy-ZA said:


> Why stop there? Why not $1,000 000.00 people seem to have more money than sense, why not?



You'd be surprised what governments pay for the right leading edge technology.


----------



## Totally (Jan 24, 2021)

Fouquin said:


> Yeah like the PIII-1000 paper launch, Radeon X800 paper launch, GeForce 6800U, X1900, 8800 GTX, GTX 480... We've never had paper launches before.



tbf, with paper launch there's no need to sweat it, the part is coming eventually, and now it's "How much time and effort could I put towards beating the scalpers and miners? Can I even beat the scalpers/miners?"


----------



## Kohl Baas (Jan 24, 2021)

medi01 said:


> Nothing "paper" about it, besides, perhaps, MSRP price, but I was told gamers were fine with it, 2080Ti, cough.



Well, if you remember Jensen's speech at launch, gamers weren't fine with it. That's why he had to address GTX 10-series owners specifically, saying "Now is the time to buy new GPUs." He screwed them again, but ultimately screwed the company, because only a small minority got the cards: those who were able to get one from nVidia at MSRP, or those who bought the RTX 2080 Ti anyway because they didn't care...


----------



## medi01 (Jan 24, 2021)

evernessince said:


> Those numbers are extremely bad given we are months out from launch.  In fact they are downright pathetic.  The definitely says paper launch to me.


Your statement makes ZERO SENSE.
You have NO IDEA how many cards were sold by the said retailer normally.
Oh, yeah, actually: roughly THE number they sell now.
Oh, and almost none of those cards are sold out either; they're all yours, just go and buy one.

There is nothing "paper" about this launch at this point; it just came with *price gouging*, not only on the green side (which people are used to) but also on the red side.


----------



## TumbleGeorge (Jan 24, 2021)

InVasMani said:


> You'd be surprised what governments pay for the right leading edge technology


Show me, please, especially for hardware. What leading-edge technology makes the government hardware in use (running DOS, 95/98, 2000, XP...) different from and better than the hardware (supercomputers or other devices) in private hands? I hope this is not another conspiracy theory. If it is, just stop!


----------



## cueman (Jan 24, 2021)

First AMD put two RX 5700 XTs' worth of cores into one GPU, and the RX 6000 series was born; now they want to push four 5700 XTs' worth into one GPU for the RX 7000 series.
Is that really the way it is, nothing genuinely new coming from AMD engineering? Come on... that's just enough...

Anyway, looking at the future, a 300 W TDP will soon be nothing; we'll take 400 W GPU TDPs as 'normal'.


I think it's the wrong way... but with today's engineering and process tech, it looks unavoidable.


Maybe once we get to around the 3 nm process, staying under a 300 W TDP might happen... in the short term.


My opinion is that 300 W TDP should be the limit... soon we'll need a 1000 W PSU for a normal gaming rig... not good... I mean, these days...


----------



## TumbleGeorge (Jan 24, 2021)

cueman said:


> i think its wrong way


Yes, this is the way of muscle, not the way of brain.


----------



## evernessince (Jan 24, 2021)

medi01 said:


> Oh, and almost none of those cards are sold out either, all yours, just go and buy it.



1. They are factually sold out at every US retailer, so yes, they are sold out by definition.
2. Being available from third parties doesn't change the fact that they are sold out at official retailers.
3. Advising people to buy from scalpers, paying 2-3 times the price of an already expensive product, isn't a solution.  When people say you can buy something, it's implied that the product has general availability and comes from a reputable source.  I didn't go around telling my friends they could have bought the 3900X at launch, because it was very unlikely that they could, in fact, at a reasonable cost.  Your comment is the picture-perfect internet comment, in that you are saying "But you can buy it!!!"  Yeah, if you ignore that you are paying 2 to 3 times the already high price, plus losing the warranty, and only from a third party.  Context is important, and it's precisely there that your point falls flat on its face.  The internet is the only place you can make an argument stripped of such context, because if you told your friends with computers that they could buy the card and then later added "but at 2-3x the price..." when they found out that all retailers are sold out, they wouldn't be your friends for very long.  If it's not something you'd say to people you know, why are you trolling us on the forums with this poor logic?




medi01 said:


> Your statement makes ZERO SENSE.
> You have NO IDEA how many cards were sold by the said retailer normally.
> Oh, yeah, actually roughly THE number they sell now.



I do have evidence of absence, which is a legitimate form of suitable evidence.

On the other hand, you are submitting a claim with zero evidence.  You claim that stock numbers are normal but provide no definition of "normal" nor any data to back it up.

Lay off the weed, buddy; you can't even make a coherent reply, let alone debate a topic.


----------



## Kyuta (Jan 24, 2021)

AusWolf said:


> Never mind, even if you could afford it and actually buy it, there's no power supply on this planet that can feed this beast.


It wouldn't consume that much; on 5 nm, a 300 W TDP for sure.


----------



## HD64G (Jan 24, 2021)

cueman said:


> 1st amd put two rx 5700xt core one gpus inside,born rx 6000 series, now they want push four 5700xt for ne gpu inside...rx 7000 series.
> is it that way,..that nothing really not coming there amd engineering..come on...its just enough...
> 
> anywya,looks future, 300W tdp is nothing soon, we took 400w gpus tdp 'normal'
> ...


Since the 6900 XT - with twice the shaders of the 5700 XT, made on the same node, clocked higher, and with double the VRAM - consumes only 300 W, which is just 36% more than the 5700 XT's power draw, imho if made on the 5 nm node and with HBM VRAM, Navi 31 will easily consume closer to 350 W than to 400 W.
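
The arithmetic behind that estimate can be sketched out. The TBP figures are the official numbers (the post's 36% implies a slightly lower 5700 XT baseline than the official 225 W); the core/board split, node factor, and clock derate are labeled guesses, not AMD data:

```python
# Back-of-envelope version of the estimate above.
RX_5700XT_TBP = 225   # W, official total board power
RX_6900XT_TBP = 300   # W, official total board power

# Doubling the shaders on the same 7 nm node cost only ~33% more board power:
uplift = RX_6900XT_TBP / RX_5700XT_TBP - 1

CORE_SHARE = 220         # W, guessed GPU-core portion of the 6900 XT's 300 W
BOARD_SHARE = 80         # W, guessed VRAM + VRM + misc portion, shared once
N5_POWER_FACTOR = 0.75   # guessed iso-clock power scaling, TSMC N7 -> N5
CLOCK_DERATE = 0.85      # guessed extra saving from clocking a dual-die part lower

navi31_est = 2 * CORE_SHARE * N5_POWER_FACTOR * CLOCK_DERATE + BOARD_SHARE
print(f"shader-doubling uplift: {uplift:.0%}")
print(f"Navi 31 estimate: ~{navi31_est:.0f} W")
```

With these guesses the estimate lands around 360 W, inside the thread's 350-400 W range; nudging the assumed node factor or clocks moves it toward either end.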


----------



## medi01 (Jan 24, 2021)

evernessince said:


> They are factually sold out from every US retailer


However, they have been available FOR WEEKS AND WEEKS at a German retailer.
Puzzling, eh?

And, *what about the PS5 and XSeX, by the way? Nowhere to be found last time I checked; were those also "paper launched"?*


evernessince said:


> 2. Being available from 3rd parties doesn't change the fact of their sold out status for Official retailers.


There is nothing "unofficial" about Mindfactory.
In fact, what is unusual for the German market is buying from the amd.com site.



evernessince said:


> 3. Advising people to buy from scalpers, paying 2- 3 times the price of an already expensive product isn't a solution.


a) The MSRP of AIB cards was higher upfront.
b) It is up to 60% more, not 2-3 times.
c) There seem to be plenty of buyers at that price; once the market runs out of those people (though why would it, given how many bought the 2080 Ti for $200 over official MSRP), prices might go down.



evernessince said:


> I do have evidence of absence, which is a legitimate form of suitable evidence.


BS.
Your statement would only make sense if sales were lower than before, something you have NO IDEA about, yet you make the claim anyway. That describes the whole "paper launch" BS from the whiny camp.



HD64G said:


> Since the twice the shaders of 5700XT and made on the same node, 6900XT, even clocked higher consumes only 300W which is just 36% more than the 5700XT power draw having double the VRAM size, imho if made on 5nm node and with HBM VRAM, the Navi31 will consume easily closer to 350W than to 400W.


It's not the same 7nm node though.

I doubt that AMD would use expensive HBM again, given how 6900XT manages to beat 3090 despite using notably slower VRAM on a narrower bus.


----------



## DeathtoGnomes (Jan 24, 2021)

medi01 said:


> Why people keep mumming "paper launch", when cards are available in online stores (e.g. mindfactory in Germany) for WEEKS?
> Now, yeah, pricing is way above MSRP, but so was 2080Ti's price up until Ampere came.


Oh wait, I thought this was the Intel PR for its 10 nm, my bad....


----------



## Valantar (Jan 24, 2021)

medi01 said:


> And, *what about PS5 and XSeX, by the way, nowhere to be found last time I've checked, were those also "paper launched"?*


Arguably, yes. Though of course there's also the issue there of no interesting titles available either, so ... more like a "don't worry about availability, it will be better once there is actually a point to buying these."

As for German stocks, Mindfactory doesn't have a single 3080 SKU in stock, but they do have some 3070s ... at prices significantly higher than the 3080's MSRP. They also have some 3090s at near €500 premiums. Caseking has a single €2250 3090 SKU in stock, no 3080s, no 3070s and no 3060 Tis. Anywhere else you had in mind?


InVasMani said:


> It's probably not even a if, but their intended market. My best assumption is AMD is planning to utilize the less perfect monolithic chip yields for MCM for more serious use compute workloads that aren't are latency critical or less so and more parallel computational critical in nature as they improve that tech over time. That similar to EPYC reduces the cost of the more premium perfect single chiplet yields. If it's used for gaming at all it'll likely start with niche enthusiasts initially and be catered more toward niche markets where the latency concerns are less damning like 4K and 8K because frame rates are lower and GPU rendering needs are greatly higher. The latency penalty is going to be more pronounced for lower resolution high refresh rate so they'd target that market last.
> 
> The lucrative markets that actually require GPU for work as opposed to fun and games will help subsidize the MCM GPU tech for gaming in future iterations with latency improvements down the road however because it's sure to be refined over time. Luckily what AMD's learned from Ryzen and with RNDA2 will help a lot in practice. I don't think this tech will have as much problems as Ryzen had with CCX issues because some of that is been ironed out with Ryzen already and additionally infinity cache is a big innovation in itself. AMD is gearing up for major innovation in the coming years as is the industry as a whole.
> 
> ...


This rumor specifically says RDNA, so this isn't headed for compute workloads - that's what CDNA is for. RDNA's biggest improvements came in gaming performance/TFLOP after all, which doesn't quite make sense for compute workloads. CDNA has significantly better compute performance per watt and per die area than RDNA - otherwise AMD wouldn't have gone to the trouble of branching off the two architectures, after all. Of course RDNA 2 delivered massive perf/W gains, but I would be quite shocked if none of those carried over to newer revisions of CDNA as well. I would indeed be surprised if we saw MCM GPUs with RDNA before CDNA - that kind of tech seems like a shoo-in for compute and HPC - but that's not what this rumor alleges. Of course it's entirely possible that there are MCM CDNA GPUs in the pipeline before this - enterprise GPUs don't tend to get much press attention.


----------



## InVasMani (Jan 24, 2021)

To be fair, if it works well enough in terms of latency, even in its infancy, they'll use it for gaming. Frankly, it only needs to improve a bit over how CF/SLI has traditionally performed, and enough people will be interested for AMD to refine it further over time. I still think the lion's share of the potential and profit for MCM GPUs, especially initially, doesn't reside with gaming. Also, what's to say this isn't the intended revision to CDNA initially!? Just because it's based on RDNA doesn't mean a lot once you venture into MCM GPU design, where you could potentially double, triple, or quadruple performance with chiplets, at least if done well. From a time-to-market standpoint, not redesigning another CDNA-specific chip and using MCM instead is probably better, for now at least.

It's hard to say what AMD's plans are for which market; until more details are known, it's too hard to tell definitively at this stage. There is a good chance you're right about the RDNA/CDNA aspect, though, and if so, AMD must feel confident enough about the MCM approach and its potential. To be fair, I can't see it being bad myself. Even if it's basically CF/SLI with some Infinity Cache between chiplets, that's still good progress over the previous situation. Just having a single PCB can cut power a lot for a two-chip GPU versus two discrete ones, and if they can make them sync better with less hardware-overhead penalty, that will be big progress. AMD's choice of words is something to think about too: "orchestration" gives the impression of a rather seamlessly synchronized workload. Let's hope it's more than a clever name that's more suggestive than actual functional substance.


----------



## medi01 (Jan 24, 2021)

Valantar said:


> Arguably, yes. Though of course there's also the issue there of no interesting titles available either, so ... more like a "don't worry about availability, it will be better once there is actually a point to buying these."


3.4 million PS5 units were sold in the first 4 weeks.
Paper launch is a nonsensical take on "how does availability of new anticipated products work".



Valantar said:


> Mindfactory doesn't have a single 3080 SKU in stock


We were talking about AMD.
NV is clearly discouraged from producing the 3080 (why bother with GA102 when the GA104-based 3070 sells for nearly as much?).
I think the last time Mindfactory had several hundred of those, it took about 15 hours to sell them at somewhat above 1k euros.


----------



## Batailleuse (Jan 24, 2021)

Chrispy_ said:


> I just feel sorry for people who didn't pick up a modern GPU last generation. They're completely boned right now.
> 
> It's understandable, too, because Turing was a complete rip-off for the first year on the market, and then when Navi arrived on the scene a year late the party it was underwhelming AF; The 5700 wasn't even 15% faster than the Vega56, and even at MSRP, it actually offered lower performance/$. Street pricing on Vega56 by that point was way lower than the 5700-series ever reached.
> 
> The only "good" thing that happened that entire generation is that AMD rejoining the GPU market after a year of total nothingness brought TU104 down to under $500 in the 2070 Super. That didn't make it good value, but at least it restored price/performance to somewhere that wasn't obscene. Unfortunately Nvidia stopped manufacturing Turing around 9 months after the Super series launched, so if you didn't grab one during that short window you're SOL right now.



The Vega 56 is actually kind of amazing: you can flash a Vega 64 BIOS onto it, and it performs at about 95% of a Vega 64 in games, which itself performs between a 1080 and a 1080 Ti.

For the price, it was hard to do better on perf/$ until this gen... a 3060/3060 Ti, once their prices drop, is a no-brainer in this area.


----------



## Midland Dog (Jan 24, 2021)

*cOmpUtE MonSteR*
is slower than the 3090 in FP32


----------



## Minus Infinity (Jan 24, 2021)

So in Australia I've seen the 6900 XT being advertised for $2K, so I would expect this ridiculous card to be over $4K, as we'll no doubt see AMD also increase prices again with next-gen CPUs and GPUs.

I'd rather they increased the bus width, as it's patently obvious that the reason Navi 21 falls behind at 4K is its 256-bit bus. It was always a stupid decision for the high-end cards, despite what Infinity Cache achieves. The same will happen with the 6700 XT being only 192-bit.


----------



## awesomesauce (Jan 25, 2021)

one word: driver


----------



## Nihilus (Jan 25, 2021)

MEGA Navi


----------



## Valantar (Jan 25, 2021)

awesomesauce said:


> one word: driver


I haven't heard of any significant driver issues with RDNA 2. Not whatsoever.


----------



## saikamaldoss (Jan 25, 2021)

If they can release it in the second half, that would be awesome.


----------



## Valantar (Jan 25, 2021)

medi01 said:


> 3.4 million PS5 units were sold in the first 4 weeks.
> Paper launch is a nonsensical take on "how does availability of new anticipated products work".
> 
> 
> ...


Console APUs have been in mass production since before summer 2020; stocks have had a lot more time to build up than GPUs. Them still being out of stock speaks of gross underestimations of demand there too. At least there are no miners using consoles ...

As for speaking of AMD vs. Nvidia, I still can't find any noticeable stock there either, so...

Also, what they are encouraged/discouraged to produce is meaningless - if a GPU hits shelves in Europe or the US today, its silicon started its journey through the fab 3-4 months ago at the very least. If Nvidia had made adjustments to which die each wafer is made into immediately after launch, we'd be seeing those adjustments about now, if not a bit into the future. And there's no way they made any such adjustments that early. So your quasi-conspiracy theory has pretty wobbly legs to stand on, at best.


----------



## AusWolf (Jan 25, 2021)

InVasMani said:


> *Based on what evidence exactly?* I wouldn't expect a single PCB RDNA3 twin chip design with a CU count like the RX 6900 XT to be worse on power draw than two discrete RNDA2 RX 6900 XT's. There should be power optimized refinements from RDNA2 to RNDA3 and *just because it's using a pair of chips with the same CU doesn't mean on a single PCB they will be clocked the same* which even if they were and AMD's optimized various aspects of it to increase power and efficiency it'll still draw less power. Even if AMD optimized nothing at all a single board design is going to be more efficient in practice. The more challenging aspect would be cooling it, but they'll probably make it water cooled or at least a 3-slot cooler rather than a 2-slot design. I don't know if they'd go with a 4-slot cooler or not they certainly could, but if they don't increase clock speeds while making it more power efficient for RDNA3 they might not need to. Just put a vapor chamber on both sides of the PCB with 4 U-shaped heat pipes connected spaced out across the PCB length and two blower fans on each side of the PCB on the rear exhausting all the heat. That would exhaust plenty of heat all outside of the case. The heat-pipes themselves could be filled with a bit of liquid metal if they aren't already done that way these days.


Why would I need evidence to support my speculation that is based on a news article that is also based on speculation? I am not stating truths here, merely guessing based on current GPU trends.

Clocking different cores of a CPU differently makes a lot of sense, as they're usually working on different tasks. Clocking two chiplets of the same GPU differently sounds like a terrible idea to me. Mismatched SLI never worked, and there are plenty of videos on YouTube of mismatched Crossfire setups performing worse than single cards in some games. Chiplets are on the same die, have the same memory allocation, are connected to the same display; they essentially perform together as one chip. They can't be clocked differently, otherwise the shader processors of current GPUs could be clocked differently too. But again, I'm merely speculating, and we'll see what the future brings.

Let's just hope those people who commented about RDNA 3's 5 nm process being a lot more energy efficient than 7 nm are right.  I have no desire for 300 W+ cards either PSU or cooling-wise.


----------



## Valantar (Jan 25, 2021)

AusWolf said:


> Why would I need evidence to support my speculation that is based on a news article that is also based on speculation? I am not stating truths here, merely guessing based on current GPU trends.
> 
> Clocking different cores of a CPU differently makes a lot of sense as they're usually working on different tasks. Clocking two chiplets of the same GPU differently sounds like a terrrible idea to me. Mismatched SLI never worked, and there's plenty of videos on youtube on mismatched Crossfire setups performing worse than single cards in some games. Chiplets are on the same die, have the same memory allocation, are connected to the same display, they essentially perform together as one chip. They can't be clocked differently, otherwise shader processors of current GPUs could be clocked differently too. But again, I'm merely speculating and we'll see what the future brings.
> 
> Let's just hope those people who commented about RDNA 3's 5 nm process being a lot more energy efficient than 7 nm are right.  I have no desire for 300 W+ cards either PSU or cooling-wise.


I think you misread what they said about clocks: I don't think they meant we might see two dice on the same board with different clocks, but that we shouldn't expect to see dice clocked as high when we have two on one board as when there's only one (like the 6900 XT). Given that power draw increases roughly by the square of voltage, dropping voltage even 100mV can make a huge difference, so a small-ish downclock can definitely make a dual-die solution stay at or below 300W. You could probably run 2x80CU RDNA2 7nm at ~1800-2000MHz at 300W if you tune the voltages ever so slightly.

Oh, and chiplets are not on the same die. That's kind of the definition of MCM/chiplet architectures: multiple chips/chiplets/dice (those terms are _nearly_, though not quite, interchangeable) on the same substrate or package.
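
The voltage argument is easy to put numbers on, using the standard rule of thumb that dynamic CMOS power scales roughly as P ∝ f·V². The clocks and voltages below are illustrative guesses, not measured 6900 XT values:

```python
# Dynamic CMOS power scales roughly as P ∝ f · V².
def dynamic_power_ratio(f_new, f_old, v_new, v_old):
    """Relative dynamic power after a frequency/voltage change."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# Voltage alone: a 100 mV drop (1.175 V -> 1.075 V, guessed values)
# already cuts dynamic power by roughly 16%.
v_only_saving = 1 - dynamic_power_ratio(1, 1, 1.075, 1.175)
print(f"saving from -100 mV alone: {v_only_saving:.0%}")

# Combined with a modest downclock (2250 -> 1900 MHz, 1.175 -> 1.00 V),
# each die draws only ~61% of its original dynamic power:
ratio = dynamic_power_ratio(1900, 2250, 1.00, 1.175)
print(f"per-die dynamic power: {ratio:.0%} of stock")
```

So a ~15% downclock with the accompanying voltage drop takes nearly 40% off each die's dynamic power, which is why a dual-die part at lower clocks isn't simply twice a 6900 XT's power draw.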


----------



## InVasMani (Jan 26, 2021)

Valantar said:


> I think you misread what they said about clocks: I don't think they meant we might see two dice on the same board with different clocks, but that we shouldn't expect to see dice clocked as high when we have two on one board as when there's only one (like the 6900 XT). Given that power draw increases roughly by the square of voltage, dropping voltage even 100mV can make a huge difference, so a small-ish downclock can definitely make a dual-die solution stay at or below 300W. You could probably run 2x80CU RDNA2 7nm at ~1800-2000MHz at 300W if you tune the voltages ever so slightly.
> 
> Oh, and chiplets are not on the same die. That's kind of the definition of MCM/chiplet architectures: multiple chips/chiplets/dice (those terms are _nearly_, though not quite, interchangeable) on the same substrate or package.


Right, I was implying different chip clock speeds between single monolithic and MCM GPU designs. The MCM GPUs could use lower-grade chips, but with lower voltages and heat output. Alternatively, they might be better chips, with just better heat distribution across the heat spreader, though that's the less likely of the two. AMD could maximize profit by combining lower-grade chiplets in MCMs and saving the best-binned chips for monolithic single-chip GPUs. I imagine eventually they will all be MCM GPUs, or all but the highest-binned monolithic chips. Outside the gaming market, though, I think AMD will pair premium, well-binned chips for the other, more lucrative markets.



AusWolf said:


> Let's just hope those people who commented about RDNA 3's 5 nm process being a lot more energy efficient than 7 nm are right.  I have no desire for 300 W+ cards either PSU or cooling-wise.


It'll definitely offer the possibility of reduced power consumption even with higher performance, but how much comes down to AMD's design decisions about the ideal trade-off between the two. It'll probably be similar to the 12 nm to 7 nm jump, maybe a bit less pronounced. It's tough to say, though; if it enables them to physically arrange and squeeze more chips underneath a heat spreader, that changes things a lot, and total power could end up higher as a byproduct.

As far as the "based on what evidence exactly" thing goes, I just meant: what makes you feel that way beyond a gut feeling? I personally think it can make things cheaper, because they can pair up lower-quality, smaller, more imperfect chips that together require less voltage and dissipate less heat than a monolithic chip of similar circuitry would. It also lets them spread heat better across the heat spreader, which is of course an advantage. It has the potential to offer more performance, better efficiency, and lower cost, plus it makes binning more flexible as well. It's actually what AMD needs to better compete with Nvidia, which often trounces them on SKU offerings within a GPU architectural generation. Nvidia routinely has far more chips diced and binned into more GPU brackets covering a wider swath of performance. They have a tendency to squeeze AMD on both sides of competing GPU offerings, since they've got a bigger R&D budget and aren't simultaneously competing against Intel to the same degree.


----------



## medi01 (Jan 26, 2021)

Valantar said:


> Console APUs have been in mass production since before summer 2020; stocks have had a lot more time to build up than GPUs. Them still being out of stock speaks of gross underestimations of demand there too


No.
An out-of-stock situation means only that the console hasn't failed (yet), nothing more.
The shortage itself is merely a reflection of supply chains not being capable of "burst" deliveries.
You can calculate whatever demand figure you want; it doesn't magically increase your production capacity.

This discussion is particularly ironic in the light of EU's vaccine woes...  



Valantar said:


> As for speaking of AMD vs. Nvidia, I still can't find any noticeable stock there either, so...


I see overpriced GPUs from both manufacturers in stock (not even the "accepted" 20% on top of MSRP, thank you, 2080 Ti, but +60% and more), with only the 3080 being a rather rare (but still real) guest.

Sales are also 60% vs. 40% green vs. red, so AMD has grabbed some of the market back despite production shortages.


----------



## Valantar (Jan 26, 2021)

medi01 said:


> No.
> Out of stock situation means that console has not failed (yet), merely that.
> Shortage itself is a mere reflection of supply chains not being capable of "burst" deliveries.
> You can calculate whatever demand figure you want, it doesn't magically increase  your production capacity.


I never said they had failed, I said they had grossly underestimated demand. I never said they could have increased production volumes up until now either, only that they had been mass producing consoles for several months before launch and should have had a better estimate of necessary stocks. Of course, Covid has likely increased game console demand noticeably, which would have been very difficult for manufacturers to plan for, but they've nonetheless clearly and significantly underestimated demand. When the consoles are still out of stock globally several months after release, they fucked up. Plain and simple. Calling it a paper launch might be a _tad_ harsh on my end, but it's essentially that in the end - no availability, and no actual games to play on them beyond forwards-compatible previous-gen titles.


medi01 said:


> I see in stock overpriced (not even "accepted" 20% on top of MSRP, thank you, 2080Ti, but +60% and more) GPUs from both manufacturer, with only 3080 being rather rare (but still real) guest.
> 
> Sales are also 60% vs 40% green vs red, so, AMD has grabbed some of the market back, despite production shortages.


10-20% premiums for high-end OC models with fancy coolers are somewhat acceptable. 60% is a travesty, and really shouldn't count toward whether a product is in stock or not. Whether that is from a major retailer or a reseller, those are scalper prices, and can't be taken into account alongside normal sales at normal prices. Any product can be kept in stock given a sufficient increase in the sales price, after all. As for sales distributions, this is too small a sample size to really say. I'm hoping AMD grabs back some market share (we _desperately_ need a more competitive GPU market), but it's going to take a long, long time for them to reach 40% global market share.


----------



## ratirt (Jan 26, 2021)

I like the idea of a dual GPU, but in chiplet form, not two monolithic chips paired together. I think chiplets are not such a bad idea to introduce into the GPU market, especially for AMD. The company has huge experience with chiplets from Ryzen, and so far they have delivered both performance and efficiency, improved with every new release. It's promising, and I'd like to see more of this approach and how the company introduces it in GPUs. I'm only hoping this won't be another SLI/Crossfire-like solution, though.


----------



## MikeMurphy (Jan 26, 2021)

AusWolf said:


> Never mind, even if you could afford it and actually buy it, there's no power supply on this planet that can feed this beast.


Power consumption comes way down when run at the golden ratio along the efficiency curve.  I could see this being 350-400 W.


----------



## AusWolf (Jan 26, 2021)

MikeMurphy said:


> Power consumption comes way down when run at the golden ratio along the efficiency curve.  I could see this being 350-400 W.


That is true, though I don't want to see 350-400 W being the new "way down".


----------



## Valantar (Jan 26, 2021)

MikeMurphy said:


> Power consumption comes way down when run at the golden ratio along the efficiency curve.  I could see this being 350-400 W.


350-400W is the most that can feasibly be cooled in a consumer PCIe product, so aiming for that at stock clocks is... iffy. Slow and wide is always faster than narrow and fast, so they don't have much to lose by clocking these really, really low. Also, there'd be lots of OC headroom for whoever wants it!


----------



## InVasMani (Jan 27, 2021)

Valantar said:


> 350-400W is the most that can feasibly be cooled in a consumer PCIe product, so aiming for that at stock clocks is... iffy. Slow and wide is always faster than narrow and fast, so they don't have much to lose by clocking these really, really low. Also, there'd be lots of OC headroom for whoever wants it!


I feel that with MCM GPUs, 3-slot coolers will be upper mid-range, 4-slot high end, 2-slot lower mid-range, and 1-slot low end or SFF. Which chips are used, and why, will depend on circumstances.


----------



## pantherx12 (Jan 27, 2021)

Minus Infinity said:


> So in Australia I've seen the 6900XT being advertised for $2K so I would expect this ridiculous card to be over $4K as we'll no doubt see AMD also increase prices again with next gen CPUs and GPUs.
> 
> I'd rather they increased bus width as it's patently obvious that the reason Navi 21 falls behind at 4K is the 256 bit bus width. It was always a stupid decision for the high end cards despite what infinity cache achieves. Same will happen with 6700XT being only 192 bit.




If they have IO as a separate chip, then it will presumably be easier to have higher-bandwidth cache.

This could also bring down prices, as they could do what they do with Zen now: put the IO on a cheaper, larger process node and keep just the compute parts on 7 or 5 nm.


----------



## medi01 (Jan 27, 2021)

Valantar said:


> I said they had grossly underestimated demand.


This statement is based purely on the fact of supply not meeting demand.
And it turns a blind eye to the already voiced "but you can't just ramp up production at will".
And no, "but you can produce it in advance" doesn't fly; production starts only a few months before sales, as the chips that go inside consoles are rather new.


----------



## Valantar (Jan 27, 2021)

medi01 said:


> This statement is based purely on fact of supply not being demand.
> And it turns blind eye to already voiced "but you can't just ramp up production at will".
> And no, "but you can produce it in advance" does not fly, production starts only a few months before sales, as chips that go inside consoles are rather new.


Mass production of console SoCs starts at least 6 months before launch, precisely due to the long and complex supply chain for these devices as well as the need to build up volumes for launch. That obviously doesn't mean their makers have much opportunity to increase volumes after that start, given the typically ~6-month lead time on foundry orders and the lack of sales data before launch. But it's still undeniable that their initial sales predictions have very clearly been woefully low compared to actual demand. That might be due to Covid, it might be due to a heap of other reasons, but that still doesn't change the fact that they severely underestimated demand, and have completely failed to supply enough consoles for them to be widely available. But as I said above, there aren't any current-gen titles either (well, a handful at best), so it's not much of a loss even if the backwards compatibility experience is excellent. And hopefully by now they've long since increased their orders with TSMC - but it might be that they too are held back by other current shortages, such as in chip packaging, silicon wafers, or even DRAM.


----------



## r.h.p (Jan 28, 2021)

DeathtoGnomes said:


> The Voodoo that you do so welllll!
> 
> "paper launch q1 2023!"



Voodoo 4, I think. Excited, I was. Swapped to GeForce, from memory.


----------



## Vayra86 (Jan 28, 2021)

Valantar said:


> I never said they had failed, I said they had grossly underestimated demand. I never said they could have increased production volumes up until now either, only that they had been mass producing consoles for several months before launch and should have had a better estimate of necessary stocks. Of course, Covid has likely increased game console demand noticeably, which would have been very difficult for manufacturers to plan for, but they've nonetheless clearly and significantly underestimated demand. When the consoles are still out of stock globally several months after release, they fucked up. Plain and simple. Calling it a paper launch might be a _tad_ harsh on my end, but it's essentially that in the end - no availability, and no actual games to play on them beyond forwards-compatible previous-gen titles.
> 
> 10-20% premiums for high end OC models with fancy coolers is somewhat acceptable. 60% is a travesty, and really shouldn't affect any account of whether a product is in stock or not. Whether that is from a major retailer or a reseller, those are scalper prices, and can't be taken into account alongside normal sales at normal prices. Any product can be kept in stock given a sufficient increase in the sales price, after all. As for sales distributions, this is too small a sample size to really say. I'm hoping AMD grabs back some market share (we _despreately_ need a more competitive GPU market), but it's going to take a long, long time for them to reach 40% global marketshare.



The shortage is global. I'm not sure if fucking up is the right word for it. When you haven't got the capacity, it's just not there. This is not directly pandemic-related either; the whole supply chain has been under stress because demand has exceeded supply for years now. This is why prices have been surging for several components like NAND, RAM, etc., and keep doing so.

At the same time, these chips are produced for the cutting edge of consumer hardware, so you can't just relegate that to older nodes and be done with it - that directly hurts the selling point of your new product, if it doesn't kill it altogether.

I'm still puzzled by the whole entitled attitude wrt buying luxury products like these. Still whining about scalper prices. This is how free markets work, if it took you until today to figure that out... well... welcome to Earth. It's been like this for over 2000 years, ever since we discovered the concept of trade. Scarce = expensive. Simple.


----------



## kapone32 (Jan 28, 2021)

Vayra86 said:


> The shortage is global. I'm not sure if fucking up is the right word for it. When you haven't got capacity, its just not there. This is not directly pandemic related either, the whole supply chain is under stress because demand has exceeded supply for years now. This is why prices have been surging for several components like NAND, RAM, etc and keeps doing so.
> 
> At the same time, these chips are produced for the cutting edge of consumer hardware, so you can't just relegate that to older nodes and be done with it - that directly hurts the selling point of your new product, if it doesn't kill it altogether.
> 
> I'm still puzzled by the whole entitled attitude wrt buying luxury products like these. Still whining about scalper prices. This is how free markets work, if it took you until today to figure that out... well... welcome to Earth. It's been like this for over 2000 years, ever since we discovered the concept of trade.


I was talking to a friend at my local PC shop. They have lots of 3070s and a few 3080s from last week's stock drop. They had 30 RX 6000 series cards listed at the same time, and those sold in less than 2 hours, pickup only.


----------



## Valantar (Jan 28, 2021)

Vayra86 said:


> The shortage is global. I'm not sure if fucking up is the right word for it. When you haven't got capacity, its just not there. This is not directly pandemic related either, the whole supply chain is under stress because demand has exceeded supply for years now. This is why prices have been surging for several components like NAND, RAM, etc and keeps doing so.
> 
> At the same time, these chips are produced for the cutting edge of consumer hardware, so you can't just relegate that to older nodes and be done with it - that directly hurts the selling point of your new product, if it doesn't kill it altogether.
> 
> I'm still puzzled by the whole entitled attitude wrt buying luxury products like these. Still whining about scalper prices. This is how free markets work, if it took you until today to figure that out... well... welcome to Earth. It's been like this for over 2000 years, ever since we discovered the concept of trade. Scarce = expensive. Simple.


I don't disagree with any of that, especially regarding the expectation of widely available parts - it's more that I'm surprised at the seeming lack of foresight of these companies. Though there are of course relevant questions to be asked regarding their actual influence in that regard, seeing how they are all fighting over the same limited supply.

I don't quite remember where, but in some other thread I raised the exact question of why nobody seems to have seen this coming, despite a situation like this needing several years to develop. It's frankly rather baffling, and makes me wonder if it's simply down to consolidation (ever fewer actors competing in every part of the value chain) or if the problem is more fundamental than that. I mean, we all know how the memory industry actively works to maintain _just the right amount_ of scarcity to keep prices high without triggering a crisis, so I'm suspecting similar ideologies to be at play here as well in some form or other. I.e. nobody wants to compete in the high-end fab business just to break even (or perhaps more precisely, investors wouldn't allow them to; it would tank their stock prices and drive them into bankruptcy for not being profitable _enough_), but someone somewhere must have seen where this was going.

A decreasing rate of node improvements has no doubt contributed by squeezing ever more products onto an ever smaller collection of nodes, as has ever-increasing chip demands for an ever expanding range of products (light bulbs 10 years ago didn't have much silicon in them, nor did toasters or coffee makers), but the assumption from that if going by simple supply/demand logic would be a continuous expansion of the silicon fab industry. Of course building a fab is a 5+-year, multi-billion-dollar bet, so there can't be many companies willing and able to make those bets, but it still seems really weird that so few seem to have seen this as a major opportunity. I guess the increased digitalization of  ... well, everything is likely the biggest contributor here (the amount of advanced computation taking place in contemporary cars is downright staggering), but it also seems to me that the loss of GloFo in the high-end fab space was a bigger blow than anyone seemed to notice, and it's starting to look like Intel's previous 14nm shortage was more of a precursor to a broader problem than anyone thought. I mean, since 2017 we've gone from 3 major high-end fab actors to 1, with Samsung being the perennial outsider looking in but somehow seeming to be number two now? It's all rather weird.

It's maybe even more baffling again that there are currently major, long-term shortages for ... silicon wafers and chip packaging substrates. I mean, really? I get that wafers require very pure silicon and highly specialized equipment, but it's still just monocrystalline melted sand spun into a cylinder(-ish) and cut into discs, and packaging substrates are ultrafine fiberglass and copper sandwiches. Are these things _that_ difficult to scale up? Or are the forces behind these industries just not interested in investing into expansion due to the commodity nature of the products?

So while I am in some sense frustrated that there's a sudden and systemic shortage of what seemed to be widely available consumer products pretty much yesterday, I'm more baffled by just how unprepared the industry seems to be for the direction it's growing in, especially given just how the very same industry constantly promotes how computerization of anything and everything will somehow save the world. It just looks like they "suddenly" got their wish (well, over 10+ years), yet had never really imagined it actually happening, and certainly hadn't actually considered what that scenario coming true would actually mean.


----------



## Vayra86 (Jan 28, 2021)

Valantar said:


> I don't disagree with any of that, especially the expectation of widely available parts - it's more that I'm surprised at the seeming lack of foresight of these companies. Though there are of course relevant questions to be asked regarding their actual influence in that regard, seeing how they are all fighting over the same limited supply.
> 
> I don't quite remember where, but I raised the exact question of why nobody seems to have seen this coming despite a situation like this needing several years to take place in some other thread. It's frankly rather baffling, and makes me wonder if it's simply down to consolidation (ever fewer actors competing in every part of the value chain) or if the problem is more fundamental than that. I mean, we all know how the memory industry actively works to maintain _just the right amount_ of scarcity to keep prices high without triggering a crisis, so I'm suspecting similar ideologies to be at play here as well in some form or other. I.e. nobody wants to compete in the high-end fab business just to break even (or perhaps more precisely, investors wouldn't allow them to, would tank their stock prices and drive them into bankruptcy for not being profitable _enough_), but someone somewhere must have seen where this was going.
> 
> ...



Simple answer: managers have zero knowledge of, or care for, details. Their reality held up in normal times, but when the shit hits the fan, it all falls apart. I see it at work, I see it in the news and in society at large. We managed away responsibility.

We live in a paper reality, and the pandemic shows us where the problems really are. The same thing occurs in politics, especially in first-world countries. The Netherlands, the UK, the US... we are fucking failures with no direction whatsoever. No perspective and no vision. The stuff that happens over here lately, you couldn't make it up. And it's all down to that same distant mentality: managers trying to manage nothing but their own PR.

That's that top 3% for ya.


----------



## Valantar (Jan 28, 2021)

Vayra86 said:


> Simple answer: managers have zero knowledge of, or care for, details. Their reality held up in normal times, but when the shit hits the fan, it all falls apart. I see it at work, I see it in the news and in society at large. We managed away responsibility.
> 
> We live in a paper reality, and the pandemic shows us where the problems really are. The same thing occurs in politics, especially in first-world countries. The Netherlands, the UK, the US... we are fucking failures with no direction whatsoever. No perspective and no vision. The stuff that happens over here lately, you couldn't make it up. And it's all down to that same distant mentality: managers trying to manage nothing but their own PR.
> 
> That's that top 3% for ya.


That's true in a lot of ways - IMO this can mostly be attributed to the growth of neoliberalism since the Reagan era, with its combination of deregulation of business, export of production to wherever the cheapest (and least organized, least protected) labor can be found, and introduction of an ideology obsessed with measuring "productivity" and "profitability" in ever-smaller and more arbitrary metrics while ignoring the big picture whenever possible. As long as the numbers look right (which they do, as they've defined what to count by what looks best when counted and can be optimized for counting), nobody is to blame when the house of cards collapses. While I'm rather scared of the isolationist/xenophobic undertones in a lot of it, I'm glad various governing bodies around the world are finally starting to see the necessity of maintaining a global distribution of high-end industries. Who knew it would be a bad idea to let the industries producing the things we fundamentally need for _everything_ to be concentrated in a few regions, in constant competition, go through heaps of buyouts and mergers, and become ever more consolidated? Yeah, someone ought to have seen that one coming. At this point, who knows if it's too late already and we're heading into some major slump? Guess we'll see in a year or two.


----------



## medi01 (Jan 29, 2021)

Valantar said:


> But it's still undeniable that their initial sales predictions have very clearly been woefully low compared to actual demand.


Dude, you are freaking me out.
There has not been A SINGLE MAJOR CONSOLE LAUNCH when product was not sold out for many weeks.
Because there is BURST demand in the beginning that nobody can match production capacity wise.
Because you cannot have BURST production and neither could you afford producing stuff YEARS before launch.

"Oh, you suddenly need 10 times normal supply of X? Well, am I supposed to build additional production capacity just to satisfy your needs for the next 3 onths???"

How hard is that to comprehend?

In such situation, something being sold out is NOT and can NOT be a sufficient indicator of someone mis-calculating sales.

There is exactly ZERO evidence either Sony or Microsoft underestimated demand for PS5 and XSex respectively.



Valantar said:


> That might be due to Covid


Someone sane expected demand for AT HOME ACTIVITY goods to DROP due to covid? 
/facepalm

PS
And on purchasing and supply chains, including logistics: I have, in fact, worked on IT projects in the area.
Stop trash-talking those guys; they do their job pretty damn well. In fact, it's shocking how good they are at something that complex.


----------



## Valantar (Jan 29, 2021)

medi01 said:


> Dude, you are freaking me out.
> There has not been A SINGLE MAJOR CONSOLE LAUNCH when product was not sold out for many weeks.
> Because there is BURST demand in the beginning that nobody can match production capacity wise.
> Because you cannot have BURST production and neither could you afford producing stuff YEARS before launch.
> ...


First off: being "sold out for many weeks" does not equate to _zero_ availability for several months, as we're seeing now. A restock blip every two weeks that sells out in a few hours is still zero availability. Previous consoles have been hard to get after launch, but not at this level, at least in recent history.

Secondly, the burst of initial demand is _exactly_ why stocks are built up beforehand. You're not saying anything that hasn't already been addressed at length. If absolute peak production is X consoles/month, expected normal demand is, say, 1/2 X (to ensure sufficient production overhead to account for sales fluctuations), and launch month demand is expected to be 5X, then you try to produce consoles at maximum capacity for 5 months before launch to ensure sufficient supply. That's common practice, I'm just saying they missed their estimates. Of course there are tons of complicating factors like the added difficulty of shipping and distributing huge quantities just before launch (compared to the steady flow of regular sales), but given the sustained lack of stock, that clearly wasn't the issue. It is of course entirely possible that demand has been so high that no possible amount of pre-launch production could reasonably have ensured availability, but if that's the case, the companies involved should (and likely would) have addressed it far more clearly than they have.
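The arithmetic in that example can be written out explicitly. A trivial sketch; X and the 5X launch-month demand are the hypothetical figures from the paragraph above, not real console numbers.

```python
import math

def preproduction_months(peak_rate, launch_demand):
    """Months of full-capacity production needed to have `launch_demand`
    units in warehouses by launch day, producing at `peak_rate` units/month."""
    return math.ceil(launch_demand / peak_rate)

X = 1_000_000                           # hypothetical peak output, consoles/month
print(preproduction_months(X, 5 * X))   # 5 months at maximum capacity
```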

Third, _of course _there is evidence of underestimated demand. There being no stock is evidence of that. Nobody plans to be sold out for months and months - consoles are a low-margin business reliant on after-sales, so every sale that doesn't happen hurts their bottom line. You can't buy games for a console you don't have, so they clearly plan to have the product available at all times. Nobody plans for stock shortages unless they're in the fashion industry. Again, there is a possible second explanation: component shortages beyond the control of the console maker. But again, if that was the case, why haven't they addressed that publicly? It's no skin off their backs to say "sorry, we're working as fast as we can, but we're struggling to get a hold of [component A] and production volumes are thus lower than our targets."

Fourth: please reread. I very clearly didn't say anyone expected at-home activity to drop due to covid, I said that the influence of covid on demand might be one of the reasons why their estimates were off, as it's an unknown factor that obviously makes estimating demand much, much more difficult. (Especially as it works both ways: on the one hand there's increased demand for at-home activities, while on the other hand a lot of people are struggling financially. Balancing the two without any real data to go by - which doesn't exist - is essentially just guesswork.) I mean, at this point you're just misreading me on purpose if that's what you got out of that sentence. Please take a breath and swallow your indignation, as it's causing you to not actually read what I'm saying.

Fifth: "trash talking"? Seriously? I mean, get a grip, man. This is just ridiculous. Saying that a vaguely defined group of people spanning dozens of companies and hundreds if not thousands of people missed some estimates and thus seriously messed up a product launch constitutes _trash talking_ to you? Calm down, please. There's nothing personal in this, neither directed at you or at anyone involved in this process. Not even anything saying they're bad at their jobs - I've never said anything to that effect. Estimates are always estimates, they are never a sure thing. This just happens to be a much bigger mess-up than what we normally see. Stop making this personal.


----------



## Sunny and 75 (Jan 29, 2021)

Another reason to skip the current gen! The RX 7000 series and RTX 40 series are the place to be. And maybe Xe-HPG, if it proves to be a competitive Intel product. We'll see.

Chiplets on a GPU! Curious about the result of such an implementation, and if Ryzen is anything to go by, I'd say Nvidia's Lovelace is about to get a good run for its money.



Valantar said:


> there aren't any current-gen titles



Sony themselves said the real PS5 games won't be available before 2022.





*Sony boss: no 'generation-defining' PlayStation 5 games until 2022* (www.tweaktown.com): Sony Interactive Entertainment CEO Jim Ryan says that 'generation defining' games will arrive on the PlayStation 5 in 2022.


----------



## watzupken (Feb 3, 2021)

Actually I will be really interested to see how hot these chiplet GPUs run. Considering the Ryzen 7 5800X easily hits 90 degs with the 142W thermal limit and a 360 AIO cooler, these high end GPUs are not going to be just sipping power if the current gen GPUs are a precursor of what to expect in future GPUs. I think it will be a challenge to try and keep 2x 100+ watts of chiplets cool since each chiplet should be quite small in size.



Valantar said:


> Console APUs have been in mass production since before summer 2020; stocks have had a lot more time to build up than GPUs. Them still being out of stock speaks of gross underestimations of demand there too. At least there are no miners using consoles ...
> 
> As for speaking of AMD vs. Nvidia, I still can't find any noticeable stock there either, so...
> 
> Also, what they are encouraged/discouraged to produce is meaningless - if a GPU hits shelves in Europe/the US today, that means its silicon started its journey through the fab 3-4 months ago at the very least. If Nvidia made adjustments to which die each wafer is made into immediately after launch, we'd be seeing those adjustments about now, if not a bit into the future. And there's no way they made any such adjustments that early. So your quasi-conspiracy theory has pretty wobbly legs to stand on at best.


Actually, I think it is a combination of supply and demand that is causing havoc.

Supply - If you look at the timeline, COVID lockdowns hit really hard right when these products were supposed to start (or had just started) mass production. So if AMD, for example, had an obligation to fulfill XXX number of Xbox and PS5 SoCs, it is likely they fell behind in production. When things started to improve towards the end of the year, they were still busy trying to fulfill those orders, with a knock-on impact on producing Ryzen 5000 and RX 6900/6800. And don't forget, the lockdowns did not happen concurrently globally; they took place at different times and with different durations, so I am sure there were parts/component shortages.

Demand - Having launched multiple times in the past, both AMD and Nvidia are very seasoned at estimating demand. But I feel the exception this year is that AMD became very competitive against Nvidia's Ampere. In the past few generations, it has always been a cakewalk for Nvidia, and they could afford to price their products at a premium. This time, Nvidia generously handed gamers a GA102 chip at USD 699 and the GA104 at USD 499. I feel Nvidia would easily have priced the RTX 3080 at USD 999 without competition, or given you an RTX 3080 based on the smaller GA104 chip. So it came as a surprise to us all and sells like hot cakes. Of course, the GA102, being a big and complex chip, was never meant to be mass-produced in large numbers to begin with. As AMD joined the party late, they knew they couldn't compete on features, so they had to price their RDNA2 flagships lower to compete. So it's not that AMD or Nvidia did not expect demand to be high; it's that competition forced them to compete on price vs. performance, causing the imbalance between supply and demand. At least this is my opinion.


----------



## medi01 (Feb 3, 2021)

Valantar said:


> First off: being "sold out for many weeks" does not equate to _zero_ availability for several months, as we're seeing now


"Sony has sold X million units" somehow equates to "zero availability".



Valantar said:


> Secondly, the burst of initial demand is _exactly_ why stocks are built up beforehand.


Except that "beforehand" is never long enough to accommodate the burst, as designs are finalized not long before launch.



Valantar said:


> Third, _of course _there is evidence of underestimated demand. There being no stock is evidence of that.


So, repeating a moot argument as a third point somehow makes it true. Got it.
Once again, for particularly bright people: there are reasons OTHER THAN underestimating demand that lead to product shortages. Such as, you know, MANUFACTURING. AMD would love to bump production of its stuff by about 50% more wafers, except, nope, TSMC is at full capacity, you know.
BUT CAN'T I BUILD IT IN ADVANCE? Yes, you can; what we got now was built in advance, as early as it was technically possible.


----------



## evolucion8 (Feb 4, 2021)

The GPU market now is terrible. I had a Radeon VII Anniversary Edition which I paid $699 for in April 2019, directly from AMD, and sold it three weeks ago for $1300 on eBay; I just put it up for bid to see how far it would go. I ended up using that same money, added $100 more, and purchased an RX 6900 XT from Facebook Marketplace, brand new, unopened. Very happy with the purchase, but it's just insane how pricing of new and old hardware goes these days.


----------



## Valantar (Feb 4, 2021)

medi01 said:


> "Sony has sold X million units" somehow equates to "zero availability".
> 
> 
> Except that "beforehand' is never long enough to accommodate for the burst, as designs are finalized not long before launch.
> ...


And 'round and 'round we go. Did you notice that nothing you said there actually contradicts my reasoning? I've never said that factors beyond poor planning haven't contributed significantly to this. Nor have I said that they haven't sold a significant number of units. It's obvious that these products are subject to the same limitations as everything else. What I did say was that it's clear that they underestimated demand - which they did, otherwise they would have moved up preproduction to increase stocks - which by itself amounts to poor planning, regardless of contributing circumstances. I never said that this is the sole reason for the shortages, nor anything to that effect. Though there's one thing worth pointing out: final console specs are decided quite late, but the silicon design for the SoC is finalized at least a year before launch, though often more like 18 months. Then it goes to tape-out, test production, then mass production - as I said, at least half a year before launch. There is typically wiggle room there to get things going faster if necessary.

And again: total sales numbers don't matter much if there's a sustained shortage - the shortage itself tells us that there is insufficient supply, which is a supply-side issue. You're presenting it as if this is only down to higher than normal demand, which is simply not true. Demand is indeed higher than normal, but supply is also significantly constrained, and it's reasonable to say that console makers should have been aware of these challenges and done their best to account for them. It's also entirely possible that they have done so, but given the distinct lack of clear public statements about this, that doesn't seem like the most reasonable assumption.


watzupken said:


> Actually I will be really interested to see how hot these chiplet GPUs run. Considering the Ryzen 7 5800X easily hits 90 degs with the 142W thermal limit and a 360 AIO cooler, these high end GPUs are not going to be just sipping power if the current gen GPUs are a precursor of what to expect in future GPUs. I think it will be a challenge to try and keep 2x 100+ watts of chiplets cool since each chiplet should be quite small in size.
> 
> 
> Actually I think it is a combination of supply and demand that is causing havok.
> ...


I think you're being far too generous with Nvidia here. Remember, Turing prices were unprecedentedly bad value compared to previous generations. $699 for the 3080 isn't generous so much as a return to a semblance of normalcy, though the lack of $300-and-below products still tells of a skewed market with a harmful focus on high ASPs and profits. Hopefully the Ampere and RDNA2 stacks will fill out soon to rectify this, but until then we still have a really bad GPU market regardless of supply. Of course production costs are rising (10GB of GDDR6X costs more now than 4GB of GDDR5 did for the 980, after all, and a huge die on 8nm costs a lot more than a huge die on 28nm did), but that still doesn't explain the huge price increases we've seen in recent years.

But other than that, yeah, it's clear that there's an unprecedented confluence of supply-side issues and higher than normal demand. It also seems like the supply-side issues run down the production chain quite a bit, which is quite worrying. A demand spike is one thing, but it shouldn't be enough to trigger what we've seen over the past few months.


----------



## medi01 (Feb 6, 2021)

Valantar said:


> What I did say was that it's clear that they underestimated demand - which they did, otherwise they would have moved up preproduction to increase stocks


Oh boy.
You were specifically told "no, you can't go into preproduction as early as you will" at least twice in this thread.


Valantar said:


> And 'round and 'round we go.


Ironic.


Valantar said:


> Though there's one thing worth pointing out: final console specs are decided quite late, but the silicon design for the SoC is finalized at least a year before launch, though often more like 18 months. Then it goes to tape-out, test production, then mass production - as I said, at least half a year before launch. There is typically wiggle room there to get things going faster if necessary.


There is exactly ZERO indication of console APUs not being produced half a year before launch, nor is there any evidence of that possibly having any visible effect on availability TODAY.



Valantar said:


> It's also entirely possible that they have done so, but given the distinct lack of clear public statements about this that doesn't seem like the most reasonable assumption.


In other words, instead of admitting the obvious (made up accusations of "miscalculation of demand") let's call that weird theory "most reasonable assumption", shall we...


----------



## InVasMani (Feb 7, 2021)

Perhaps by the year 2035 we'll have GTX 1080 levels of performance from a new $200 GPU, at the rate innovation is happening. The Radeon RX 580 is still on an island of its own in terms of value for the dollar at the $200 price point, years after it launched. It's tragic.


----------



## Valantar (Feb 7, 2021)

medi01 said:


> Oh boy.
> You were specifically told "no, you can't go into preproduction as early as you will" at least twice in this thread.


This is going to become a theme for this response: I never said that. Please stop putting words in my mouth. I said that they could have worked to bring production forward a bit. An initial production timeline always has some margins, some leeway, some room to tune or push things. If that to you is the same as saying one can "go into preproduction as early as [one wants]" then the error lies in your reading, not my writing.


medi01 said:


> There is exactly ZERO indication of console APUs not being produced half a year before launch, nor is there any evidence of that possibly having any visible effect on availability TODAY.


Again, I never said that. I said that they could have worked to push preproduction slightly earlier than the original plan. Which would have had an effect on availability, as a larger proportion of interested buyers would have been able to get their hands on consoles, lowering demand. That obviously isn't saying that, for example, an extra month of preproduction would have eradicated all shortages (that's very unlikely), but it would have improved things.


medi01 said:


> In other words, instead of admitting the obvious (made up accusations of "miscalculation of demand") let's call that weird theory "most reasonable assumption", shall we...


Ah, yes, the "accusations". Who am I accusing, specifically? And of what, specifically? I don't know why you're choosing to take this as some sort of personal attack (whether against you personally or against some vaguely defined group you are choosing to stand in for - I honestly can't tell), but ... it isn't. It's undeniable that there has been a supply, manufacturing and distribution chain failure to meet demand. Period. Demand has also been unprecedented, but that doesn't mean that there is nothing that could have been done to alleviate things. And just because you seem to like misreading things: I'm not saying (and have never said) that the supply chain could have entirely avoided shortages. I'm just saying they could have handled this better. I was hoping you could see the difference between describing a systemic failure to respond to a situation and accusing either individuals or groups of not doing their jobs, but ... well, apparently not.

As for reasonable assumptions: do you have any arguments to say that it's _unreasonable_ to think this could have been handled better? Because I have yet to see any, beyond you somehow claiming that I'm insulting the people doing these jobs, which ... isn't an argument, but a derailing tactic.

And again: if we had seen statements from console makers to the effect of "we're producing these as fast as we can, but volumes are constrained by factors outside of our control" or something similar (which is quite common) we could have reasonably believed that they had been on this from early on and had been actively working to improve supply. Instead, all public evidence points towards initial sales estimates being significantly below actual demand, with console makers then having to scramble to increase volumes after launch. Which, as we have both been saying, takes quite some time, and likely won't have noticeable effects for several months, if not half a year.

Now can we please leave this silly off-topic discussion alone? Feel free to PM me if you want to continue this, but at least let us save the other people watching this thread from the pain of watching this play out.


----------



## medi01 (Feb 7, 2021)

Valantar said:


> This is going to become a theme for this response: I never said that.


You are responding to a quote about something being SAID TO YOU.
Are you ok?



Valantar said:


> And of what, specifically? a larger proportion of interested buyers would have been able to get their hands on console


How much larger a proportion? How much "earlier"? 


Valantar said:


> Who am I accusing, specifically?


Console manufacturers.


Valantar said:


> And of what, specifically?


That they are literally idiots who haven't learned how to estimate demand despite having done it so many times.

Something something, AMD availability, something:


		https://www.reddit.com/r/realAMD/comments/le2wlh


----------



## Valantar (Feb 8, 2021)

medi01 said:


> You are responding to a quote about something being SAID TO YOU.
> Are you ok?


So, let's take a teeny tiny step back here. For telling me that "no, you can't go into preproduction as early as you will" to make any kind of sense, I must first have said something to the effect of "yes, you can go into preproduction as early as you will." That statement is explicitly formulated as a contradiction, either of an explicit statement or of something implicit. Yet I have neither said nor implied that such a thing is possible, making your contradiction meaningless. What are you contradicting? The fact that I never said such a thing? Or are you just saying things to try to win some argument you've concocted? 'Cause making a contradictory statement that isn't actually contradicting something anyone said? Yeah, that's pretty much the definition of a straw man argument.


medi01 said:


> How much larger a proportion? How much "earlier"?


Does that matter? Any improvement is an improvement. And that's all I've ever said. To summarize what I've argued all along: "Things could have been better if the response had been better suited to the situation at hand." Obviously none of us here are in a position to go into specifics on anything like this - unless you're an executive at one of the console manufacturers?


medi01 said:


> Console manufacturers.


... you're aware that those are giant corporations, right? As in: not humans. Companies. Entities gathering the labor of hundreds if not thousands of people into more-or-less concerted efforts to achieve whatever the people in power decide on, with dozens of levels of in-between management trying to make this all work. The margin of error is _huge_ in any undertaking even remotely resembling a corporation. So, "accusing" them of underestimating demand is bad because ... some executive somewhere might take offense at a random forum user saying they could have handled this better? Yeah, sorry, I don't see the issue.

They underestimated demand. Period. They could have done a better job. If me saying that is insulting to you, please grow up.


medi01 said:


> That they are literally idiots who haven't learned how to estimate demand despite having it done so many times.


_Please, pretty please_, show me a quote of me saying that - or anything even remotely to that effect. Seriously. Otherwise, please go away. I mean, seriously. "Literally idiots"? Where? And are you saying that experience actually makes you immune to making poor decisions? Are you actually saying that an experienced person, when faced with an unprecedented situation (such as a pandemic, which, while technically not unprecedented, hasn't happened in a century), could not possibly make a slightly bad call? I mean, is your thinking so black and white that me being slightly critical of this must mean that I'm saying they are literally idiots? I honestly don't even know what to do with absurd statements like that.

Also, I don't know what you're trying to say with those numbers... it maybe looks like shipments are picking up somewhat? I frankly have no idea. Still sold out is still sold out.

Oh, and btw:

https://twitter.com/i/web/status/1357040327870328833

Not necessarily a bullet-proof source (I'm generally wary of "industry analysts", but they do tend to have a lot of access), but at least part of the sales discrepancy between the (equally sold-out) competing current-gen consoles is then explained by one company starting manufacturing later than the other. Hm, maybe, just maybe, they might have sold more units if they had pushed to get mass production going slightly earlier? It obviously wouldn't have alleviated the other issues also affecting supply, nor would it have erased the massive demand, but demand does after all get saturated after a while, so earlier production would have meant a higher chance of an earlier drop-off in demand, no?


----------



## Octopuss (Feb 8, 2021)

/thread


----------



## GeeBee (Feb 8, 2021)

I am one of the unfortunate ones with two PCs at home, neither of which has a decent card at the moment.

All I see is cards going for 2300 Euros over on eBay. I don't know what to do or how to get one without getting robbed.

I'm reading all sorts of stuff online, and it seems to me (albeit I ain't no expert) that it must be due to a combination of things, really.

1) Miners indeed.
2) Scalpers also.
3) Unexpected demand much higher than usual.
4) Lack of foundry diversity.

Just talking out of my ass here, but it seems to me that having only TSMC producing stuff is a pretty frightening thought. I mean, even Intel is considering outsourcing to TSMC? Are we for real?

Someone outside the business would assume that it's an ideal opportunity to jump on the wagon and make some money - lots of money, in fact. However, and here things get really interesting and pretty strange for an outsider, that seems to be out of the question? I mean, is it so inherently complex to design a new graphics card architecture, as well as a fab to manufacture it? It must be, because nobody seems to be able to do it. Even the lads with the deep pockets are not interested, it seems.

Also, I've been reading some stuff about the start-ups designing AI chips for NN training etc. There are quite a few around and, again, all they can afford is the DESIGN? I mean, as far as manufacturing goes, they send orders to TSMC as well? What the hell is going on here?

Anyway, I'm really really disappointed I cannot get a couple of 3080s for the 2 PCs we have at home, let alone just 1. Sorry about the rant, lads.


----------



## medi01 (Feb 9, 2021)

Valantar said:


> Any improvement is an improvement


No.
In this context, a small improvement is highly unlikely to have a visible impact.
I.e. if we call a 5% bump a "small improvement", supply would already need to be at 95% of demand for the situation to change from "missing" to "available".

This also addresses "things could have been better... if they'd started one month earlier" => highly unlikely.


----------



## Valantar (Feb 9, 2021)

medi01 said:


> No.
> In this context, small improvement is highly unlikely to have visible impact.
> I.e. if we call bump by 5% to be "small improvement" supply would need to be at 95% demand for situation to change from "missing" to "available".
> 
> This also addresses "things could have been better... if they'd started one month earlier" => highly unlikely.


Uh... how is that unlikely? Starting production X time earlier would, unless that early start meant a much slower ramp-up (which is possible, but not given) mean an earlier ramp to full production capacity, meaning more units produced proportional to how much earlier they started. So, a month earlier should equate to roughly a month's worth of units having been made more than the current state, which would then have served to fulfill some of the current demand. Again, I've never said this would have fixed things and magically solved availability, but at this point you're effectively arguing that producing more units doesn't make more units available. Which... well, good luck with that.
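To put rough numbers on that reasoning (a minimal sketch; all figures are hypothetical and not from either console maker), a simple linear ramp-up model shows that pulling the production start forward by four weeks yields roughly four extra weeks' worth of full-rate output by any later date:

```python
# Hypothetical illustration of the "start a month earlier" argument.
# All figures (ramp length, weekly rate) are made up for the example.

def units_produced(start_week, ramp_weeks=8, full_rate=300_000, horizon=26):
    """Total units made between `start_week` and week `horizon`,
    ramping linearly from zero up to `full_rate` units/week."""
    total = 0
    for week in range(start_week, horizon):
        weeks_running = week - start_week
        rate = full_rate * min(1.0, (weeks_running + 1) / ramp_weeks)
        total += int(rate)
    return total

on_time = units_produced(start_week=4)   # planned production start
earlier = units_produced(start_week=0)   # started four weeks earlier
print(earlier - on_time)  # → 1200000, i.e. four extra full-rate weeks
```

Since both runs follow the same ramp, the earlier start simply adds extra weeks of full-rate production at the tail, which is the point being made: earlier production proportionally means more units available to meet demand.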

Also, those percentage examples are meaningless. Neither demand nor supply can be expressed without taking time into account, so unless you're speaking of, say, X%/month, those numbers mean nothing. Of course shipping and distribution makes this all the more complex as there's a 1-3 month delay between production and sales. Demand also always tapers off over time (more people are interested in buying while the product is new, and over time the market is saturated), meaning that boosting production will help get closer to being ahead of the curve there. So yes, any improvement helps. 

It's also pretty telling that all of a sudden you've abandoned your claims that I was saying "they" (presumably specific people, or at least specific groups of people) are "literally idiots". Hm. Might it be that you had absolutely zero basis for saying this? Moving the goalposts, personal attacks, straw man arguments and accusing people who disagree with you of acting poorly are all classic signs of bad-faith arguing, so I'd recommend you take a minute to try to identify what aggravated you so much about this, and why. Because it doesn't seem to be relevant to this thread.


----------



## medi01 (Feb 9, 2021)

Valantar said:


> at this point you're effectively arguing that producing more units doesn't make more units available.


Wrong.
I'm telling you that to end the "out of stock" situation, supply needs to beat demand.
If you are able to satisfy only 40% of orders, even DOUBLING the number of units available, i.e. getting to 80% of demand, won't get you out of the "out of stock" situation.


----------



## Valantar (Feb 9, 2021)

GeeBee said:


> I am one of the unfortunate ones with 2 PCS at home and neither having a decent card at the moment.
> 
> All I see is 2300 Euro worth of cards over at eBay. Don't know what to do and how to get one without getting robbed.
> 
> ...


Pretty much everything you're saying here is accurate, but to answer your main question:
Designing chips is overall relatively cheap - they're all built from the base silicon components (the specifics of which are provided by the fab you're working with for your chip), so you "just" need appropriate design software and a team with the skillset necessary to understand how to make transistors do whatever magic it is you want them to do. That's a millions-of-dollars type of cost scale, unless you're designing mature, highly optimized products in a very competitive market, in which case tuning and tweaking massively increases costs. Manufacturing chips, on the other hand, is _incredibly_ expensive and complex. Developing cutting-edge fabrication nodes is something very, very few companies are capable of. Intel's persistent failure to deliver new nodes on time over the past 7 years and GloFo dropping out of the cutting-edge node game entirely are both clear signs of this. We're talking development costs in the billions of dollars, as well as similar costs for every single fab built.

Another crucial issue contributing to this that you didn't mention is that there's just one company producing lithography machines for these cutting-edge nodes: ASML. _Everyone_ buys the machines they use to make chips from them - TSMC, Samsung, Intel, everyone. And they have relatively limited production capacity - on the order of ~100 machines a year, which, considering that each fab needs more than one of these for any kind of volume production, is not a lot. There are other companies making lithography machines for legacy nodes (IIRC both Canon and Nikon have been in this market, though I don't know if they still are), but given that the manufacturing and development of this equipment is in and of itself incredibly expensive and complex, things tend to consolidate in unregulated capitalist systems. What we're seeing in this regard is just an expression of a decades-long process.

I don't see this improving without public/government intervention, as the pressure on companies to deliver profits to investors in most parts of the world is far too harsh for such a venture to be feasible. The world needs to start recognizing that silicon fabrication is a crucial piece of global infrastructure, and that it needs to be broadly supported if we are to avoid potentially very dangerous shortages. Happily, a lot of governments (at least the US and the EU) are recognizing this, but these things are slooooooow to get off the ground. I mean, first you need billions in funding, then you need land (with highly specific requirements), 100% stable power infrastructure and a qualified (and extremely specialized) workforce, then you need to build a fab, which takes a long time, then you need to get your lithography machines, which takes several years... yeah, this is going to take some time to fix. But for any fix to be sustainable, it needs to be kept at least partially out of the control of kleptocrat investors and financiers - unless the knowledge required to build, maintain and develop all this is at least partly publicly owned and available, we'll just keep falling into this situation.



medi01 said:


> Wrong.
> I'm telling you, that to stop "out of stock" situation supply needs to beat demand.
> If you are able to satisfy only 40% of orders, even DOUBLING number of units available, i.e getting to 80% of demand, won't get you out of "out of stock" situation.


Again: please stop pretending that demand and supply aren't affected by time. Speaking of these things as if they are static or in "snapshots" like that is meaningless. Demand is dynamic and finite, just as supply is. As more is produced, you get closer to meeting demand.

There are easily 50 million people worldwide interested in a new console. That obviously doesn't mean even remotely close to all of them have the money to buy one now - that number (which is a random example, obviously) is total possible demand for the product. We know console sales are highly cyclical, with massive sales spikes around new releases and smaller spikes around refreshes, with everything else following a pretty typical decelerating decline until some semblance of a steady-state level of demand is hit. So there might be 15 million willing and able to buy a console in the first 6 months, but another 10 million over the next year, and another 5 the year after that, and so on. To meet demand, you need to chip away at that initial mass, as that's where the bulk of sales are found and where issues arise. And the longer you take to meet that initial demand, the longer you'll have shortages, angry customers, etc., as the volumes are that much higher. In this example scenario, if you're only able to meet 50% of demand for the first 6 months, that pushes another 7.5 million buyers into the next year, nearly doubling demand for that period. So the longer you're unable to meet demand, the more harmful it is, as you just keep pushing customers away, and keep piling up unmet demand. That's why you do everything in your power - hire more temporary staff, order more parts, rent more facilities, even build new factories! - to meet demand as early as possible, as those are a) sales just waiting to happen, meaning revenue, and b) potential angry customers that will hurt your reputation long-term if you fail to meet demand. Compared to that, whatever initial costs are needed are typically quite small, and can be made up for in various ways (if you've built a factory, you can always sell it or use it for something else, after all).

And again, I've never argued that this would magically fix things, I said it would _improve_ things. If you meet _a higher proportion of demand_ at any given time, that means you've eliminated more of the total pool of demand. That means there's less future demand to deal with in order to catch up. Thus, any increase in production volumes _helps_. Is that really so hard to grasp?


----------



## medi01 (Feb 9, 2021)

Valantar said:


> So the longer you're unable to meet demand, the more harmful it is


Yep, as in "it's better to be healthy and rich, than ill and poor", as in something super obvious nobody on this planet has ever argued about.



Valantar said:


> there's less future demand to deal


The argument is that the current situation (an acute shortage of consoles... oh wait, just like every single popular console ever) somehow demonstrates a "miscalculation" of demand (your statement). Which lacks any sort of evidence whatsoever.

The healthy-and-rich bit is not only "not hard to grasp", it's also not something anyone is arguing with you about.


----------



## Valantar (Feb 9, 2021)

medi01 said:


> Yep, as in "it's better to be healthy and rich, than ill and poor", as in something super obvious nobody on this planet has ever argued about.
> 
> 
> The argument is, that current situation (acute shortage of consoles... oh wait, as every single time with any popular console ever) somehow demonstrates "miscalculation" of the demand (your statement). Which lacks any sort of evidence whatsoever.
> ...


The evidence lies in that there has never ever been a shortage on this scale. Period. And while demand has never been higher either, its increase is largely predictable (due to increased social acceptance of gaming etc.). So they must clearly have been planning for more sales than for the previous generation - anything else would be lunacy. But they still failed worse than ever before, speaking to a larger gap than normal between demand and supply. Which, yes, tells us that someone miscalculated demand. They might have miscalculated the size of the gaming market compared to 2013, the demand spike due to Covid, or any other bunch of factors. That is unknowable. What we do know is that we're seeing an unprecedented shortage. Which, again, means they miscalculated something. Period.

Also, you know that you've been arguing that nothing would have been better if they had made some more units, right? Putting your oversimplifying metaphors aside, delivering more consoles is good in and of itself. Isn't that an improvement, then?


----------



## GeeBee (Feb 9, 2021)

Valantar said:


> Pretty much everything you're saying here is accurate, but to answer your main question:
> Designing chips is overall relatively cheap - they're all built from the base silicon components (the specifics of which are provided by the fab you're working with for your chip), so you "just" need appropriate design software and a team with the skillset necessary to understand how to make transistors do whatever magic it is you want them to do. That's a millions-of-dollars type of cost scale, unless you're designing mature, highly optimized products in a very competitive market, in which tuning and tweaking massively increases costs. Manufacturing chips is _incredibly_ expensive and complex. Developing cutting-edge fabrication nodes is something very,  very few companies are capable of. Intel's persistent failure to deliver new nodes on time for the past 7 years and GloFo dropping out entirely of the cutting-edge node game are both clear signs of this. We're talking development costs in the billions of dollars, as well as similar costs for every single fab built.
> 
> Another crucial issue that causes some of this that you didn't mention is that there's just one company producing lithography machines for these nodes: ASML. _Everyone_ buys the machines they use to make chips from them - TSMC, Samsung, Intel, everyone. And they have relatively limited production capacities - on the order of ~100 machines a year, which when considering that each fab needs more than one of these for any kind of volume production, is not a lot. There are other companies making lithography machines for legacy nodes (IIRC both Zeiss and Nikon have been in this market, though I don't know if they still are), but given that the manufacturing and development of this equipment is in and of itself incredibly expensive and complex, things tend to consolidate in unregulated capitalist systems. What we're seeing in this regard is just an expression of a decades-long process.
> ...



Excellent post, Valantar - thank you.

Indeed, I forgot to mention ASML - the company behind all of tech that nobody has ever heard of. I recall some videos I've watched on YouTube, and I was amazed by the size of those tools and therefore their complexity. It was like watching a sci-fi bio-thriller or something.

You're quite right that maybe it's time for governments to chime in and save the industry, because clearly this isn't going anywhere on its own. It will just get worse, given that more and more industries are going digital. Now we hear about cars; tomorrow it will be refrigerators with cameras and sensors and Ethernet cards and all sorts. Like having software agents watching when we run out of milk and fruit and then sending an order directly to Tesco's or something.

Far-fetched maybe, but this scenario may unfold sooner rather than later in smart cities and the like.

Cheers


----------



## Valantar (Feb 9, 2021)

GeeBee said:


> Excellent post, Valantar - thank you.
> 
> Indeed I forgot to mention ASML - the company behind all of tech nobody ever heard of. I recall some videos I've watched on youtube and I was amazed with the size of those tools and therefore their complexity. It was like watching a Sci-Fi bio thriller or something.
> 
> ...


The funny thing is, as with most research, a huge portion of lithographic development is done in universities and thus mostly funded by the public already (including in countries where universities are private, as they still rely heavily on public research grants). It's just rather baffling that these funding programmes haven't long since tackled how deeply problematic it is that the public pays for development of technologies that are then made proprietary to whatever company the university sells it to or creates to own it, meaning public funding is funneled into creating the basis for private profits with no direct reciprocation. The current move towards open publishing in academia is a good step in the right direction, but we need a lot more public ownership of the technologies that public funding make possible.


----------



## medi01 (Feb 9, 2021)

Valantar said:


> The evidence lies in that there has never ever been a shortage on this scale.


By which metric?
We have seen acute shortages of PS3, PS4, XBoxSomething, Nintendo Switch that lasted months and months after release.


----------



## Valantar (Feb 9, 2021)

medi01 said:


> By which metric?
> We have seen acute shortages of PS3, PS4, XBoxSomething, Nintendo Switch that lasted months and months after release.


While I haven't bothered looking up data on it, my own experience selling the PS4 and XBone at retail at launch definitely speaks against any major shortages there. They were periodically sold out for the first couple of months, but not consistently; our suppliers always had stock on the way, and outside of launch day a "stock is gone in minutes/hours" scenario never happened. Supply was tight, but sufficient within a relatively short period. Pre-orders were fulfilled on day 1. And I know from speaking to colleagues across the Norwegian electronics retail business that this impression is representative. Of course, that's just one country, but given how small a market it is, it's hardly likely that Norway got a massive allocation of consoles, no matter how wealthy people there are.

The Xbox 360 and PS3? As for anything pre-PS2, you can't really compare those businesses, given the _vastly_ different scales they operated on.

So nah, sorry. Selling out at launch and having spotty supply for weeks or even a couple of months afterwards? That's normal. Every single restock disappearing within a few hours, globally, for three months after launch - and for two simultaneously launched consoles? Sorry, but that's unprecedented. And remember, spotty supply (i.e. stock coming in, then selling out over some time before the next restock appears) is quite different from stock being ripped off shelves the moment it appears.

Oh, and the Nintendo Switch? It's hardly a secret that Nintendo massively underestimated demand - they literally confirmed it themselves. So using that as an example of people _not_ underestimating demand doesn't quite work, eh?

I still don't understand why you're taking on some imagined burden of defending the poor, put-upon amorphous mass of the console production and distribution chain, as if it's somehow being mistreated by me stating the simple fact that from everything we currently know, it sounds reasonable that more could have been done earlier on to alleviate supply issues. There is nothing personal in saying that - it's a perfectly normal thing. How is anything supposed to improve if we're not allowed to point out things that haven't gone as they should?


----------



## voltage (Mar 22, 2021)

Always copying everyone else's ideas.


----------



## Vayra86 (Mar 22, 2021)

medi01 said:


> By which metric?
> We have seen acute shortages of PS3, PS4, XBoxSomething, Nintendo Switch that lasted months and months after release.



The current situation includes many unpredictable elements, and that's new. It's just that simple. On the larger scale of things, what we are seeing now are ALL results of overpopulation and global pressure on systems. We're too many and we want too much. You can easily translate that, directly, to:

- pandemic; a virus is a direct countermeasure of the planet against overpopulation. It's an ecosystem, and we're trying to break it; it does fight back - and we will probably win, like we did in the past, making the problem and the threat even greater. The only way forward here is escalation if we keep growing as a species. No square miles are being added to the Earth's surface, and something's gonna give. More people & animals per square mile = more disease; it's a statistical truth we'll never overcome.

- demand; we want and require chips in everything these days. Mechanical is being replaced with digital processing, in every place all over the world. Automotive is going full mental now, essentially turning cars into massive computers - Internet of Things is here.

- wealth; China is now at the point where it has feature and technological parity with the Western world. They have in-house development instead of copying, and they have a third of the world's population now gearing up to do as we have done for decades. That's another major demand factor with a trickle effect on other Asian countries, while the more developed Asian countries show no signs of their existing demand decreasing - they have the same IoT development going on.

- shortages; chip production is stalling not because of fab capacity, but because of the supplies needed to make chips. The whole supply chain needs to satisfy the demand - not just lithography.


----------

