Wednesday, November 23rd 2022

TSMC 3 nm Wafer Pricing to Reach $20,000; Next-Gen CPUs/GPUs to be More Expensive

Semiconductor manufacturing is a significant investment that requires long lead times and constant improvement. According to the latest DigiTimes report, 3 nm wafer pricing is expected to reach $20,000, a 25% increase over a 5 nm wafer. TSMC produced 7 nm wafers for "just" $10,000, while 5 nm wafers sit around the $16,000 mark. The latest and greatest technology carries an even higher price point of $20,000, a new record in wafer pricing. Since TSMC has a proven track record of delivering constant innovation, clients are expected to keep buying into its latest technology.

Companies like Apple, AMD, and NVIDIA are known for securing capacity on the latest semiconductor manufacturing nodes. With a 25% increase in wafer pricing, we can expect next-generation hardware to be even more expensive. Chip manufacturing cost is a significant price-determining factor for many products, so the 3 nm generation of CPUs, GPUs, and other chips will see the biggest price difference.
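
As a back-of-the-envelope illustration of how wafer pricing feeds into per-chip cost, here is a minimal sketch; the die size, dies-per-wafer count, and yield below are illustrative assumptions, not figures from the report:

# Rough per-die cost for a hypothetical ~300 mm^2 chip on 5 nm vs 3 nm.
# Only the wafer prices come from the report; everything else is assumed.
WAFER_PRICE_5NM = 16_000   # USD per wafer (reported)
WAFER_PRICE_3NM = 20_000   # USD per wafer (reported)

dies_per_wafer = 180       # assumed gross dies from a 300 mm wafer
yield_rate = 0.70          # assumed fraction of good dies

def cost_per_good_die(wafer_price):
    return wafer_price / (dies_per_wafer * yield_rate)

print(f"5 nm: ~${cost_per_good_die(WAFER_PRICE_5NM):.0f} per good die")   # ~$127
print(f"3 nm: ~${cost_per_good_die(WAFER_PRICE_3NM):.0f} per good die")   # ~$159
# With die size and yield held constant, per-die cost scales directly
# with the ~25% wafer price increase.
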
Sources: DigiTimes, via RetiredEngineer

49 Comments on TSMC 3 nm Wafer Pricing to Reach $20,000; Next-Gen CPUs/GPUs to be More Expensive

#26
AceKingSuited
I think it's smart for AMD to price their next gen substantially below Nvidia's offerings. At some point, high prices are going to limit demand, as these aren't necessities and there are cheaper options. It's not like gas, food, or housing, where you don't have cheaper choices. $1,600 for a single GPU is just an insane price! I built my entire top-of-the-line PC for less than $1,600 just three years ago.
Posted on Reply
#27
hat
Enthusiast
the54thvoidWell, at some point, the DIY PC culture will collapse. Constant price increases are not sustainable for 'non-essential' items.

Tech is meant to become cheaper as it evolves. Seems big corporations didn't get that memo.
I'm not so sure. It's all chips here, not just those going to the DIY PC market. Phones are a great example. It all depends on what the market will bear. So far it does seem like Nvidia is getting some pushback for ridiculous RTX 4000 series prices.

Side note: I've said before that I thought chip makers were relying too much on advancements from the foundry rather than making their own advancements in architecture. Looks like now they're going to have to pay for that... and pass the cost along to us, the consumers, of course. It may be interesting to see what happens going forward.
Posted on Reply
#28
TheinsanegamerN
dragontamer5788Nah, just the "PC Master Race" culture will collapse, which is probably a good thing.

PCs are tools. DIY PCs are other tools, be it a NAS, a server, a rendering box, or what have you. Lawnmowers aren't really improving that much, but there are plenty of DIY lawncare people around the USA. If your "gaming box" doesn't need a $5000 GPU, then it's probably a good thing to stick with $500 GPUs or cheaper.

The concept of "keeping up with the Joneses" is pretty toxic in general IMO. It's one thing if you want better raytracing performance because you wanna see what it looks like. But it's another thing to be buying $1500 GPUs just because they exist.
You don't need a $500 GPU. A $30 GPU does the job just fine. You don't need 1080p to play games. We should ban all GPUs over $50 for consumers.
Posted on Reply
#29
Jism
No shit.

The smaller the node, the more investment in R&D is required. And R&D is "expensive".
Posted on Reply
#30
Ownedtbh
So the GPU chip will not cost $65 to produce, but around $99?

Gee, I wonder how much more I will have to pay then, while getting the same hardware but with a different chip on my GPU.
Posted on Reply
#31
64K
TheinsanegamerNYou don't need a $500 GPU. A $30 GPU does the job just fine. You don't need 1080p to play games. We should ban all GPUs over $50 for consumers.
Says the guy with the $800 card.
Posted on Reply
#32
stimpy88
I think they discovered nGreedia's business tactics. We need competition in this market!
Posted on Reply
#33
Fatalfury
TSMC be like: if you want us to build the 3 nm fab in the USA (Arizona), then the US companies (AMD, NVIDIA, Qualcomm, etc.) had better contribute to the development as well. ;)
Posted on Reply
#34
stimpy88
FatalfuryTSMC be like: if you want us to build the 3 nm fab in the USA (Arizona), then the US companies (AMD, NVIDIA, Qualcomm, etc.) had better contribute to the development as well. ;)
The US taxpayer already is... A lot!
Posted on Reply
#35
THU31
It does not look too bad, only a 25% increase over 5 nm. The jumps between 10, 7, and 5 nm are much bigger according to that graph.

Seems it might be a good idea to stick with my 3080 for two more years; I do not mind lowering details. I will still have to pay over $1,000 for a 5080, but at least I will get double the performance (so 4090 level or more).
Posted on Reply
#36
RedelZaVedno
It's not like the world is experiencing stagflation or anything. We're all loaded with cash, just waiting for a $3,000 GPU to finally hit the market. /sarcasm :banghead:
Posted on Reply
#37
mechtech
DavenThis is probably the best time to build a complete PC from the ground up before prices get out of hand. All companies have released or are about to release their greatest hits. Competition is at an all-time high, and the consumer is winning for the most part.

I understand waiting if you don't have the budget, but for anyone who has the disposable income now and needs a new PC, now is the time.
Yep. In progress. Glad I had the patience to wait out the vids pricing. ;)
Posted on Reply
#38
medi01
GunShot...or if Apple, NVIDIA, Qualcomm, etc. ALL COLLABORATE and tell TSMC, "go pound sand!" watch how TSMC go hide under their bed in fear!
That is what would happen if customers voted with their wallets and told NV, Apple, AMD, etc. to go pound sand.
Posted on Reply
#39
Assimilator
The reason is quite simple: EUV machines are hellishly complex and therefore expensive devices, upwards of $340 million apiece, and the smaller the node gets, the fewer wafers each machine can process in a given amount of time. That means you need more machines to achieve the same output, and they cost ever more, so the fabs really have no choice but to pass that cost on to their direct customers, who pass it on to consumers.
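
To put rough numbers on that throughput point, here is a minimal sketch; the scanner throughput, uptime, and EUV layer count are made-up assumptions, not ASML or TSMC figures, and only the ~$340 million machine price comes from the comment above:

# Illustrative only: fewer wafers per hour per scanner means more scanners
# (and more capital) for the same fab output. All inputs except the machine
# price are assumptions.
machine_price = 340e6            # USD per EUV scanner (figure cited above)
target_wspm = 100_000            # assumed wafer starts per month for the fab
euv_layers = 15                  # assumed EUV exposure passes per wafer

def scanners_needed(wafers_per_hour, uptime=0.85):
    exposures_per_month = wafers_per_hour * 24 * 30 * uptime
    return target_wspm * euv_layers / exposures_per_month

for wph in (160, 120):           # assumed throughput: older node vs denser node
    n = scanners_needed(wph)
    print(f"{wph} wph -> ~{n:.0f} scanners, ~${n * machine_price / 1e9:.1f}B")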

Silicon is at the end of the road for semiconductors, but unfortunately there isn't yet a compelling replacement for it, and until there is the cost of chips is going to keep increasing.
Posted on Reply
#40
Aquinus
Resident Wat-man
medi01That is what would happen if customers vote with their wallet and tell NV, Apple, AMD etc to go pound sand.
Well, just about every GPU chip these days comes from a TSMC fab, don't they? Intel, AMD, and nVidia? It's hard to vote with your wallet when all of your options use TSMC.
Posted on Reply
#41
Assimilator
AquinusWell, just about every GPU chip these days comes from a TSMC fab, don't they? Intel, AMD, and nVidia? It's hard to vote with your wallet when all of your options use TSMC.
It's more that all of the options use ASML (including even the mighty Intel, who's bowed to the inevitable) and ASML has no competition. I don't believe that they're price-gouging, because they (and their suppliers like Carl Zeiss) are pushing the limits of how precise lithography technology can be while still being physically possible, but more competitors is never a bad thing. I was hoping that America's CHIPS act was going to go towards funding the type of R&D required to build that competitor, but it seems like it's just subsidies for fabs, which quite honestly isn't going to spur any sort of innovation if they're just importing ASML's machines into the USA as well as Taiwan.
Posted on Reply
#42
Palladium
RedelZaVednoIt's not like the world is experiencing stagflation or anything. We're all loaded with cash, just waiting for a $3,000 GPU to finally hit the market. /sarcasm :banghead:
Also, it's not like the gaming industry is already terrible at extracting performance out of multi-billion-transistor chips, like the newest Pokemon with .pptx frame rates.
Posted on Reply
#43
The Riddler
R&D for chips has become more and more expensive too, and it's a more critical factor than wafer prices in high CPU/GPU prices. NVIDIA's gross margin is 58%, but its net margin is now only 21%.

And N3E theoretically has a maximum transistor density of 180 million transistors/mm², which could shrink the RTX 4090 GPU to below 450 mm². Lots of dies from a 300 mm wafer:

www.silicon-edge.co.uk/j/index.php/resources/die-per-wafer
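
For a rough sense of what the linked calculator would report, here is a sketch using a common gross-die approximation (edge exclusion and scribe lines ignored, so the calculator's exact numbers will differ a bit); the 450 mm² figure is the estimate from the post above, and the ~608 mm² AD102 size is included for comparison:

import math

# Common gross-dies-per-wafer approximation for square-ish dies.
def gross_dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    d = wafer_diameter_mm
    return math.floor(math.pi * (d / 2) ** 2 / die_area_mm2
                      - math.pi * d / math.sqrt(2 * die_area_mm2))

print(gross_dies_per_wafer(608))  # RTX 4090 (AD102, ~608 mm^2): ~89 gross dies
print(gross_dies_per_wafer(450))  # hypothetical N3E shrink to 450 mm^2: ~125 gross dies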
Posted on Reply
#44
THU31
The RiddlerAnd N3E theoretically has a maximum transistor density of 180 million transistors/mm², which could shrink the RTX 4090 GPU to below 450 mm². Lots of dies from a 300 mm wafer:
But what is the point of shrinking that chip if it does not get cheaper?

It seems like new nodes are only relevant for increasing performance and efficiency, but not for reducing cost. But it makes lower-tier GPUs rather pointless from an upgrade path perspective.

The 4060 is supposed to have 3070 performance at $400. But the 3070 was only $500. 4060 will release almost three years after the 3070, so the performance per dollar increase will be terrible over such a long time period.
I feel like people will be upgrading GPUs far less often, because the performance gain will not be worth the cost.

They should focus on developing drastically different architectures and do what Maxwell did in 2014 on the same node as Kepler.
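
Putting rough numbers on that perf-per-dollar point, a quick sketch, assuming the rumoured $400 4060 really does land at 3070 performance (which is unconfirmed):

# Rough perf-per-dollar comparison using the rumoured figures above.
# Performance is normalised to the 3070 (= 1.0); prices are launch MSRPs.
pp_dollar_3070 = 1.0 / 500     # 3070: $500 MSRP (late 2020)
pp_dollar_4060 = 1.0 / 400     # rumoured 4060: same performance at $400

gain = pp_dollar_4060 / pp_dollar_3070 - 1
print(f"~{gain:.0%} better performance per dollar")              # ~25%
print(f"~{(1 + gain) ** (1/3) - 1:.1%} per year over ~3 years")  # ~7.7%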
Posted on Reply
#45
Assimilator
THU31But what is the point of shrinking that chip
You answered your own question:
THU31It seems like new nodes are only relevant for increasing performance and efficiency
and this chart explains why: compute revenue has increased from ~30% in 2020 to more than 41% this year. Companies that buy compute products are willing to pay huge premiums for performance improvements that let their workloads complete in less time and thus make them more money. That makes compute far more profitable for NVIDIA, which is why its gaming GPUs of the past few generations have been derived from its compute products.
Posted on Reply
#46
The Riddler
Are these inflation-adjusted prices or not? Cumulative inflation between October 2020 and October 2022 is about 14%, so if these numbers are not inflation-adjusted, the actual price increase is not really 25 percent.

www.bls.gov/data/inflation_calculator.htm
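
For what it's worth, deflating the sticker increase by that ~14% figure (a quick sketch assuming the 5 nm price dates from roughly October 2020, which the report does not actually state) gives:

# Real (inflation-adjusted) increase of the 3 nm wafer price over 5 nm,
# assuming the two prices are roughly two years apart.
nominal_increase = 20_000 / 16_000 - 1   # 25% sticker increase
cumulative_inflation = 0.14              # Oct 2020 -> Oct 2022, per the BLS calculator
real_increase = (1 + nominal_increase) / (1 + cumulative_inflation) - 1
print(f"~{real_increase:.1%} in real terms")   # ~9.6%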
THU31But what is the point of shrinking that chip if it does not get cheaper?

It seems like new nodes are only relevant for increasing performance and efficiency, but not for reducing cost. But it makes lower-tier GPUs rather pointless from an upgrade path perspective.

The 4060 is supposed to have 3070 performance at $400. But the 3070 was only $500. 4060 will release almost three years after the 3070, so the performance per dollar increase will be terrible over such a long time period.
I feel like people will be upgrading GPUs far less often, because the performance gain will not be worth the cost.

They should focus on developing drastically different architectures and do what Maxwell did in 2014 on the same node as Kepler.
Yes, Moore's Law is definitely dead, but I am saying that wafer prices are not the biggest part of it. The real problem is the R&D cost of semiconductors.

I think consoles have a big advantage there.
Posted on Reply
#48
Assimilator
Prima.VeraLooking forward to 2027, when a new semiconductor corporation (Rapidus Corp.) will start production on 2 nm first, to challenge TSMC and Samsung. Too much monopoly by those greedy companies.
english.kyodonews.net/news/2022/11/148c2f25de20-breaking-news-japan-announces-strategy-for-domestic-production-of-advanced-chips.html
From that very same article:
As part of efforts to secure stable domestic chip production, the Japanese government has decided to provide up to 476 billion yen to a Taiwan Semiconductor Manufacturing Co. subsidiary to fund the construction of a plant in Kumamoto Prefecture, southwestern Japan.
70 billion yen to rebuild their domestic semiconductor industry, nearly 7 times that amount for TSMC to build a plant in Japan. At least they're realistic.

Also stop parroting this nonsense about greed. It's not greed:
EUV mirrors from ZEISS are the most accurate mirrors in the world today. Scaled up to the size of Germany, the largest unevenness would be just a tenth of a millimeter. The sensors and actuators in a ZEISS projection optics work so precisely that a reflected laser beam could hit a golf ball on the moon with pinpoint accuracy.
Every part of the semiconductor manufacturing process at the leading-edge nodes is pushing the bounds of what human technology is quite literally physically capable of. That's why it took ASML and its partners nearly two decades to rise to the position they're at now, and why doing so cost so very much, and why continual innovation there continues to cost so much. And that's ignoring the production costs.
Posted on Reply