Monday, April 3rd 2023

Intel's Next Generation GPUs to be Made by TSMC, Celestial Set for 3 nm Process

Intel has awarded TSMC some big contracts for the future manufacturing of its next generation GPUs, according to Taiwan's Commercial Times. As previously covered on TPU, the second generation Battlemage graphics processing units will be fabricated on a 4 nm process. According to insider sources at both partnering companies, Intel is eyeing a release date in the second half of 2024 for this Xe2-based architecture. The same sources pointed to the third generation Celestial graphics processing units being ready in time for a second half of 2026 launch window. Arc Celestial, which is based on the Xe3 architecture, is set to be manufactured in the coming years on TSMC's N3X (3 nm) process node.

One of the sources claims that Intel is quietly confident about its future prospects in the GPU sector, despite mixed critical and commercial reactions to the first generation line-up of Arc Alchemist discrete graphics cards. The company is said to be anticipating great demand for more potent versions of its graphics products in the future, and internal restructuring efforts have not dulled the will of a core team of engineers. The restructuring process resulted in the original AXG graphics division being divided into two sub-groups - CCG and DCAI. The pioneer of the entire endeavor, Raja Koduri, departed Intel midway through last month to pursue new opportunities with an AI-focused startup.
The reportedly very large production order for next-gen Intel GPUs has surprised industry analysts, especially given the late arrival of the Arc 6 nm-based family of cards, also manufactured under contract by TSMC. Final silicon was ready by the middle of 2022, but further delays resulted in cards not reaching customers until later in the year. Intel looks determined to secure its position as a third pillar in a market segment long dominated by NVIDIA and AMD, and reports of the bulk orders at TSMC seem to confirm that ambition, alongside continued improvement of drivers for the current crop of Arc cards. GPU R&D projects are ongoing, aiming to meet demand from entertainment consumers (video games) and enterprise (artificial intelligence-assisted tasks) alike. In the more immediate future, Intel is expected to launch a refreshed "Alchemist+" range of graphics cards, with insiders pointing to a late 2023 launch for the Arc refresh.
Sources: TomsHardware, Commercial Times Taiwan

21 Comments on Intel's Next Generation GPUs to be Made by TSMC, Celestial Set for 3 nm Process

#1
Darmok N Jalad
It's a rough time to be a startup, even if you're Intel. I read that Apple cut M2 production significantly due to weak sales, and the M1 models were quite successful. Intel placing a large order is really quite surprising, especially in a space where competition can get rather brutal. Beating nvidia on pricing isn't the hard part.
#2
TheoneandonlyMrK
Darmok N Jalad: It's a rough time to be a startup, even if you're Intel. I read that Apple cut M2 production significantly due to weak sales, and the M1 models were quite successful. Intel placing a large order is really quite surprising, especially in a space where competition can get rather brutal. Beating nvidia on pricing isn't the hard part.
Getting chips into OEM units was never the hard part for Intel, and going from zero dGPUs sold per year, the graph probably looks quite promising ATM.
#3
Jism
Raja's work is done.
#4
Operandi
Any new CPU or GPU is an absolutely insane undertaking in terms of resources and man hours, never mind when you have no real experience building GPUs at this scale. Predicting performance and projecting timelines is very difficult for AMD and Nvidia, but when you are starting from zero, that's about what your chances of getting it right are. Intel has to be willing to spend a lot of money and be willing to lose at least a few generations in order to get this right and be on some sort of level footing with AMD and Nvidia, whom they ultimately want to compete with. The underlying hardware looks promising; hopefully they don't cut resources to the point where they make it impossible for the GPU team to succeed.
#5
TumbleGeorge
Operandi: Intel has to ............ and be willing to lose at least a few generations
I don't agree. That is a myth. When GPU architectures were developed by hand with pencil, you would be right. Today, people use powerful supercomputers. This works much faster.
#6
Solaris17
Super Dainty Moderator
Darmok N Jalad: Beating nvidia on pricing isn't the hard part.
But it may make it worth the fight! I'm still rooting for them! A third competitor, regardless of who it is, is welcome. I find the actual determination to be something worth studying. I mean, the driver development team is going 110% with the improvements. If the interviews with the engineers are a reflection of the teams they front for, then they seem genuinely excited for what they do, even if other divisions don't seem that married to their projects. It would be a fun time to be an engineer there, I think.
#7
Daven
A two-year cadence for doubling performance is not enough to break out of the low midrange to budget market segment.
#8
TumbleGeorge
Daven: A two-year cadence for doubling performance is not enough to break out of the low midrange to budget market segment.
Lol. Make the cadence 3-4 years or more, so that companies can milk their customers even longer without having to work hard.
#9
Gmr_Chick
Daven: A two-year cadence for doubling performance is not enough to break out of the low midrange to budget market segment.
I would argue that a low midrange to budget competitor is exactly what's needed right now, though, as it feels like both AMD and Nvidia have seemingly abandoned this critical market. Nvidia's last great budget/low midrange GPU was probably the 1660 Super/Ti; not sure about AMD though, as the 6500XT was honestly a flop.
#10
Darmok N Jalad
Gmr_Chick: I would argue that a low midrange to budget competitor is exactly what's needed right now, though, as it feels like both AMD and Nvidia have seemingly abandoned this critical market. Nvidia's last great budget/low midrange GPU was probably the 1660 Super/Ti; not sure about AMD though, as the 6500XT was honestly a flop.
I just grabbed a used 5600 XT, and it’s a great card. Runs cool and quiet and only needs one 8-pin. Nothing new out there seems to want to run on one 8-pin anymore. That’s the most my old Mac Pro will drive without doing cable mods. In 2010, dual 6-pin was high end!
#11
SOAREVERSOR
Gmr_Chick: I would argue that a low midrange to budget competitor is exactly what's needed right now, though, as it feels like both AMD and Nvidia have seemingly abandoned this critical market. Nvidia's last great budget/low midrange GPU was probably the 1660 Super/Ti; not sure about AMD though, as the 6500XT was honestly a flop.
I'd argue their last good midrange GPU was the 8800 GT; before that, the 6800 GT.
#12
Squared
The original timelines, as I recall, were H1 2022 for Alchemist, 2023 for Battlemage, and 2024 for Celestial. AMD and Nvidia seem to be on two-year cycles now, so maybe this pace is fast enough, but Intel's upcoming products sounded more promising when they were only a year apart from one another. The product also needs the team behind it to be committed to its success, so it's good to hear that part is in good health.
#13
Fourstaff
Good. Intel's drivers still require a lot of work; this will buy them some more time to polish.
#14
wolf
Better Than Native
Glad to hear they're sticking it out. I'm rooting for you, Intel: flood the mid to low end, and heck, if you can make a good flagship then I'm all ears.
Darmok N Jalad: Nothing new out there seems to want to run on one 8-pin anymore.
You'll be able to get a 4070 with a single 8-pin, so I'd imagine anything below should be that or less, like a single 6-pin. At the risk of sounding like a broken record, the biggest issue will be the price. Just about anything can be excused if the price is low enough. I find the 6500XT insultingly bad, but hey, if it was brand new for $49, that would be acceptable to me, for example.
#15
Darmok N Jalad
wolf: Glad to hear they're sticking it out. I'm rooting for you, Intel: flood the mid to low end, and heck, if you can make a good flagship then I'm all ears.

You'll be able to get a 4070 with a single 8-pin, so I'd imagine anything below should be that or less, like a single 6-pin. At the risk of sounding like a broken record, the biggest issue will be the price. Just about anything can be excused if the price is low enough. I find the 6500XT insultingly bad, but hey, if it was brand new for $49, that would be acceptable to me, for example.
The 6500XT was a victim of circumstance, or perhaps we were the victim of circumstance. Anything better than that when it launched was hard to get and cost way too much. I'm not sure if AMD intentionally crippled the specs to make it unappealing to miners, or if they just wanted to see how desperate the customer base really was. I mean, it performed about as well as an RX480, and those were selling for over $200 used on eBay at that time. I think the 6500XT was the canary in the coal mine. I remember buying an R9 380 for $175 at that time. At least I sold it for about what I paid before the market improved. How sad.
#16
lexluthermiester
Solaris17: I’m still rooting for them!
As am I. ARC cards have solid value, excellent 1080p performance and respectable 4K performance.
#17
ratirt
Intel is pulling out all the stops. I think that is a good move. I doubt their graphics can reach or surpass NV and AMD, but they might get closer than with this gen of ARC.
I only hope this move is not "to be or not to be" for Intel.
#18
watzupken
While I feel that the first gen ARC GPUs are pretty decent, the timing of their release is not. The fact that they launched alongside Nvidia's Ada Lovelace basically destined them for the bargain bin, which is what we are seeing now. Price cut after price cut to try to sell their A7xx cards, and to undercut the aggressive pricing by AMD. The kicker is that AMD is trying to clear a previous gen GPU, while Intel is trying to introduce a new GPU, but at rock bottom prices. If the next gen ARC is going to be launched in H2 2024, I feel the same will happen again.
Darmok N Jalad: The 6500XT was a victim of circumstance, or perhaps we were the victim of circumstance. Anything better than that when it launched was hard to get and cost way too much. I'm not sure if AMD intentionally crippled the specs to make it unappealing to miners, or if they just wanted to see how desperate the customer base really was. I mean, it performed about as well as an RX480, and those were selling for over $200 used on eBay at that time. I think the 6500XT was the canary in the coal mine. I remember buying an R9 380 for $175 at that time. At least I sold it for about what I paid before the market improved. How sad.
The RX 6500 XT is a forgettable card, which I feel AMD was trying to capitalise on due to the desperate market. The reason I think this is the sorry state of the card. Lack of VRAM, I get it. But the lack of the updated video encoder/decoder that its bigger brothers have, the lack of display outputs (limited to 2), and the narrow PCI-E bus (limited to 4 lanes) basically mean it is bad at mostly everything. In games, it can be slower than a GTX 1650 even on a system that supports PCI-E 4.0. I assume AMD should have stopped production of the RX 6500 and 6400 by now, or they will be struggling to sell them off.
#19
N3utro
Imagine going from being the world's greatest foundry to having to rely on another foundry because you can't keep up. What a shame for Intel.
#20
ratirt
watzupken: While I feel that the first gen ARC GPUs are pretty decent, the timing of their release is not. The fact that they launched alongside Nvidia's Ada Lovelace basically destined them for the bargain bin, which is what we are seeing now. Price cut after price cut to try to sell their A7xx cards, and to undercut the aggressive pricing by AMD. The kicker is that AMD is trying to clear a previous gen GPU, while Intel is trying to introduce a new GPU, but at rock bottom prices. If the next gen ARC is going to be launched in H2 2024, I feel the same will happen again.
To be fair, time and performance combined dictate if it is decent or not, since you have to compare it to something. If it had been released 3 years ago it would have been decent, but in today's reality it is rather average at best. So I guess you can't say it is decent but then say the timing is wrong; that does not make much sense. Heck, if it had been released 10 years ago it would have been a killer GPU, not just decent.
I really wish Intel would catch up with AMD and NV next gen, and hopefully TSMC's 3 nm is going to help with that, even if only a bit.
watzupken: The RX 6500 XT is a forgettable card, which I feel AMD was trying to capitalise on due to the desperate market. The reason I think this is the sorry state of the card. Lack of VRAM, I get it. But the lack of the updated video encoder/decoder that its bigger brothers have, the lack of display outputs (limited to 2), and the narrow PCI-E bus (limited to 4 lanes) basically mean it is bad at mostly everything. In games, it can be slower than a GTX 1650 even on a system that supports PCI-E 4.0. I assume AMD should have stopped production of the RX 6500 and 6400 by now, or they will be struggling to sell them off.
The 6500XT is really bad, just like the 6600XT in my opinion. If you look at the 5600XT, that was a very well-rounded, good card.
#21
Operandi
TumbleGeorge: I don't agree. That is a myth. When GPU architectures were developed by hand with pencil, you would be right. Today, people use powerful supercomputers. This works much faster.
It's not a myth; it's how the industry works and how these products are made. Design and prototyping take years, and testing and refinement take years more. Automated design tools make the design process possible; it's been decades since anything was done without them.