
AMD Denies Radeon RX 9070 XT $899 USD Starting Price Point Rumors

Joined
Dec 24, 2010
Messages
620 (0.12/day)
Location
mississauga, on, Canada
System Name YACS amd
Processor 5800x,
Motherboard gigabyte x570 aorus gaming elite.
Cooling bykski GPU, and CPU, syscooling p93x pump
Memory corsair vengeance pro rgb, 3600 ddr4 stock timings.
Video Card(s) xfx merc 310 7900xtx
Storage kingston kc3000 2TB, amongst others. Fanxiang s770 2TB
Display(s) benq ew3270u, or acer XB270hu, acer XB280hk, asus VG 278H,
Case lian li LANCOOL III
Audio Device(s) obs,
Power Supply FSP Hydro Ti pro 1000w
Mouse logitech g703
Keyboard durogod keyboard. (cherry brown switches)
Software win 11, win10pro.
As usual, the most obvious explanation proves correct. AMD had an over-optimistic pricing scheme at the outset, then panicked when Nvidia announced Blackwell's MSRPs. That much has been pretty clear from the start. The only remaining question was, "so how delusional were AMD's intended prices?" Padded-room levels of delusional, it turns out.

Whatever you want to say about the products specifically, or the market generally, it's clownish for AMD to spend the last year making a big deal about how they're bowing out of the high end and then turn around and price their next GPU at just ever so slightly less than their previous "high end" cards. Given that context, even $750 would look goofy, irrespective of the competition's pricing.

I don't have a lot of sympathy for the "AMD can't just give their products away" defense. Profit margins are sky high on these chips. If AMD wants to continue their decline into total irrelevance on the GPU market, then by all means, cling to those profit margins. But cutting prices to move product seems like a much smarter play. AMD doesn't even have the excuse that all these consumer GPUs could have gone for a much higher price in the enterprise market. From that perspective, I'm almost coming around to the idea that AMD is greedier than Nvidia.
”From that perspective, I'm almost coming around to the idea that AMD is greedier than Nvidia.”

What? Nvidia's gross margin is 50 percent higher than AMD's or Apple's. Look it up.
IMO it is not possible for a company to be greedier than a company with a significantly higher gross margin…
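
For context, gross margin is just (revenue − cost of goods sold) / revenue. Here's a minimal sketch of the comparison with made-up round numbers (these are NOT actual Nvidia/AMD/Apple financials, just an illustration of what "50 percent higher" means):

```python
# Gross margin = (revenue - cost of goods sold) / revenue.
# Round illustrative numbers only -- NOT actual company financials.
def gross_margin(revenue: float, cogs: float) -> float:
    return (revenue - cogs) / revenue

nvidia = gross_margin(revenue=100.0, cogs=25.0)  # hypothetical 75% margin
amd = gross_margin(revenue=100.0, cogs=50.0)     # hypothetical 50% margin

print(f"hypothetical Nvidia margin: {nvidia:.0%}")  # 75%
print(f"hypothetical AMD margin:    {amd:.0%}")     # 50%
# 'Higher' here is relative: (0.75 - 0.50) / 0.50 = 50% higher.
print(f"relative difference: {(nvidia - amd) / amd:.0%}")
```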
 
Joined
Jan 14, 2019
Messages
14,697 (6.56/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case It's not about size, but how you use it
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanical
VR HMD Not yet
Software Linux gaming master race
”From that perspective, I'm almost coming around to the idea that AMD is greedier than Nvidia.”

What? Nvidia's gross margin is 50 percent higher than AMD's or Apple's. Look it up.
IMO it is not possible for a company to be greedier than a company with a significantly higher gross margin…
Nvidia isn't greedy because it offers last gen performance for last gen price - no price increase is a positive thing, right? AMD is greedy because... well... the same. They should decrease prices, or offer more performance, obviously. :oops:

People will say anything to justify buying Nvidia. It boggles the mind.
 
Joined
Jan 18, 2021
Messages
237 (0.16/day)
Processor Core i7-12700
Motherboard MSI B660 MAG Mortar
Cooling Noctua NH-D15
Memory G.Skill Ripjaws V 64GB (4x16) DDR4-3600 CL16 @ 3466 MT/s
Video Card(s) AMD RX 6800
Storage Too many to list, lol
Display(s) Gigabyte M27Q
Case Fractal Design Define R5
Power Supply Corsair RM750x
Mouse Too many to list, lol
Keyboard Keychron low profile
Software Fedora, Mint
No, they do not. Not for the past 10 years.
RDNA 2 is the generation that put AMD back onto the map after GCN. You can disagree with me all you want, but I won't budge on this one.

Yeah, RDNA 2 was the last time AMD had a full stack with competitive performance and sensible pricing/timing, more or less from top to bottom--or it would have been, if not for the crypto-mining boom. It was also the first full stack with all of those characteristics that AMD had fielded in many years (since the 200 series in ~2013?). Obviously I can't say how things would have played out without the crypto/COVID mess, but RDNA 2 sure looked like a genuine contender on paper, a serious product launch by a serious GPU company.

That was then. It feels like AMD is still punch drunk from that bit of bad luck in 2021. "We tried to do things properly for once, and God smote us, so let's do all sorts of random shit instead!" Of course, COVID wasn't the only bit of bad luck. Their chiplet strategy didn't pan out as well as they'd hoped, either.

I'm clearly more pro-AMD than JustBenching here. I do think AMD makes good products. Unfortunately, everything surrounding those products ranges from mediocre to dumpster fire. What's so frustrating about these discussions is that on the one hand, despite its second-place status, AMD is still one of the most high-tech organizations in the history of human endeavor, staffed with world class engineers. On the other hand, their public behavior is goofy enough to make you forget that fact on a near daily basis. It's as if someone handed a toddler the keys to a Ferrari.

What? Nvidia's gross margin is 50 percent higher than AMD's or Apple's. Look it up.
IMO it is not possible for a company to be greedier than a company with a significantly higher gross margin…
The point is that Nvidia at least has the excuse that their consumer GPUs could have been sold for much more on the enterprise market. I don't quite subscribe to the popular notion that Nvidia's consumer GPUs represent "charity," but a case could be made. AMD has no such excuse.
 
Joined
Sep 17, 2014
Messages
23,420 (6.13/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
This sounds interesting. I vaguely remember hearing something about Forza using Ray Tracing for their audio.
Bwhahahaa ray traced audio. Imagine that. What's next, AI shoelaces?
 
Joined
Jan 14, 2019
Messages
14,697 (6.56/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case It's not about size, but how you use it
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanical
VR HMD Not yet
Software Linux gaming master race
Yeah, RDNA 2 was the last time AMD had a full stack with competitive performance and sensible pricing/timing, more or less from top to bottom--or it would have been, if not for the crypto-mining boom. It was also the first full stack with all of those characteristics that AMD had fielded in many years (since the 200 series in ~2013?). Obviously I can't say how things would have played out without the crypto/COVID mess, but RDNA 2 sure looked like a genuine contender on paper, a serious product launch by a serious GPU company.

That was then. It feels like AMD is still punch drunk from that bit of bad luck in 2021. "We tried to do things properly for once, and God smote us, so let's do all sorts of random shit instead!" Of course, COVID wasn't the only bit of bad luck. Their chiplet strategy didn't pan out as well as they'd hoped, either.
Chiplets are all about cost savings: when a chip turns out to be defective, you throw away a small chiplet instead of a huge monolithic die - nothing else. Whoever expected AMD to introduce chiplets into the GPU space and dominate Nvidia a million times over lives in la-la land. Sure, it didn't work out because they couldn't divide the compute die into smaller chunks, but at least they tried something new, which deserves a point in my book.
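
To make the yield argument concrete, here's a minimal sketch using the classic Poisson yield model (yield = e^(-D·A)); the defect density and die sizes below are illustrative assumptions, not real fab data:

```python
import math

# Poisson yield model: yield = exp(-D * A), with defect density D
# (defects/mm^2) and die area A (mm^2). Assumed values, not fab data.
def die_yield(area_mm2: float, d: float = 0.002) -> float:
    return math.exp(-d * area_mm2)

mono_area = 600.0                    # one big monolithic die, mm^2
chiplet_area, n_chiplets = 100.0, 6  # same silicon split six ways

y_mono, y_chip = die_yield(mono_area), die_yield(chiplet_area)
print(f"600 mm^2 monolithic yield: {y_mono:.1%}")  # ~30%
print(f"100 mm^2 chiplet yield:    {y_chip:.1%}")  # ~82%

# Chiplets are tested before packaging, so one defect scraps a
# 100 mm^2 die rather than the whole 600 mm^2 chip.
print(f"mm^2 fabbed per good GPU, monolithic: {mono_area / y_mono:.0f}")                  # ~1992
print(f"mm^2 fabbed per good GPU, chiplets:   {n_chiplets * chiplet_area / y_chip:.0f}")  # ~733
```

Same total silicon, roughly a third of the wasted area per good part - which is exactly the "throw away a small chiplet" saving described above.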

I'm clearly more pro-AMD than JustBenching here. I do think AMD makes good products. Unfortunately, everything surrounding those products ranges from mediocre to dumpster fire. What's so frustrating about these discussions is that on the one hand, despite its second-place status, AMD is still one of the most high-tech organizations in the history of human endeavor, staffed with world class engineers. On the other hand, their public behavior is goofy enough to make you forget that fact on a near daily basis. It's as if someone handed a toddler the keys to a Ferrari.
What I don't understand is: if people like yourself say that AMD makes good products, why on Earth do we have to talk about their "public behaviour"? Yes, they're goofy, yes, they're awkward on stage, their marketing is f*ed up beyond measure, but who the heck cares if the product is good? I pay to be able to play games, not to watch a CEO waffle some shit in front of a bunch of journalists.

The point is that Nvidia at least has the excuse that their consumer GPUs could have been sold for much more on the enterprise market. I don't quite subscribe to the popular notion that Nvidia's consumer GPUs represent "charity," but a case could be made. AMD has no such excuse.
That's not a case. The consumer and enterprise markets are entirely different. Besides, AMD has enterprise stuff as well. I've heard their MI Instinct stuff isn't half bad (I don't know, I've just heard).
 
Joined
Jun 14, 2020
Messages
4,581 (2.66/day)
System Name Mean machine
Processor AMD 6900HS
Memory 2x16 GB 4800C40
Video Card(s) AMD Radeon 6700S
Yeah, RDNA 2 was the last time AMD had a full stack with competitive performance and sensible pricing/timing, more or less from top to bottom--or it would have been, if not for the crypto-mining boom. It was also the first full stack with all of those characteristics that AMD had fielded in many years (since the 200 series in ~2013?). Obviously I can't say how things would have played out without the crypto/COVID mess, but RDNA 2 sure looked like a genuine contender on paper, a serious product launch by a serious GPU company.

That was then. It feels like AMD is still punch drunk from that bit of bad luck in 2021. "We tried to do things properly for once, and God smote us, so let's do all sorts of random shit instead!" Of course, COVID wasn't the only bit of bad luck. Their chiplet strategy didn't pan out as well as they'd hoped, either.

I'm clearly more pro-AMD than JustBenching here. I do think AMD makes good products. Unfortunately, everything surrounding those products ranges from mediocre to dumpster fire. What's so frustrating about these discussions is that on the one hand, despite its second-place status, AMD is still one of the most high-tech organizations in the history of human endeavor, staffed with world class engineers. On the other hand, their public behavior is goofy enough to make you forget that fact on a near daily basis. It's as if someone handed a toddler the keys to a Ferrari.


The point is that Nvidia at least has the excuse that their consumer GPUs could have been sold for much more on the enterprise market. I don't quite subscribe to the popular notion that Nvidia's consumer GPUs represent "charity," but a case could be made. AMD has no such excuse.
I wouldn't say they were competitive from top to bottom with RDNA 2, but they surely had some really compelling options. The 6800 (non-XT) especially was a killer product. It cost more than the 3070, but I'd gladly pay that for 2x the VRAM and faster raster. Mining killed their momentum, though.
 
Joined
Nov 15, 2024
Messages
264 (2.47/day)
Chiplets are all about cost savings: when a chip turns out to be defective, you throw away a small chiplet instead of a huge monolithic die - nothing else. Whoever expected AMD to introduce chiplets into the GPU space and dominate Nvidia a million times over lives in la-la land. Sure, it didn't work out because they couldn't divide the compute die into smaller chunks, but at least they tried something new, which deserves a point in my book.
I can't seem to find the article I read, but AMD mentioned how it provides a lot of flexibility to tailor chips (or configurations) to use cases. Now they seemingly have communication between CCUs on the Halo chip, and with DeepSeek taking advantage of parallelization (something CUDA apparently can't do) and neural rendering on the horizon, I'm interested to see what they have planned for their next architecture.
 
Joined
Jan 18, 2021
Messages
237 (0.16/day)
Processor Core i7-12700
Motherboard MSI B660 MAG Mortar
Cooling Noctua NH-D15
Memory G.Skill Ripjaws V 64GB (4x16) DDR4-3600 CL16 @ 3466 MT/s
Video Card(s) AMD RX 6800
Storage Too many to list, lol
Display(s) Gigabyte M27Q
Case Fractal Design Define R5
Power Supply Corsair RM750x
Mouse Too many to list, lol
Keyboard Keychron low profile
Software Fedora, Mint
What I don't understand is: if people like yourself say that AMD makes good products, why on Earth do we have to talk about their "public behaviour"? Yes, they're goofy, yes, they're awkward on stage, their marketing is f*ed up beyond measure, but who the heck cares if the product is good? I pay to be able to play games, not to watch a CEO waffle some shit in front of a bunch of journalists.
We all have an interest in healthy competition. I want AMD to stop stepping on its own dick. What I don't understand is why you'd defend them when they make avoidable errors. "B-But Nvidia is bad too" doesn't move me at this point. I have dozens of posts shitting on Nvidia. I'm a Linux cultist, after all. But AMD is not some battered wife or tiny baby in need of coddling. It is a multi-billion-dollar high-tech company with some of the world's most brilliant engineers, who, by the way, are probably more annoyed than I am with corporate/marketing's mistakes.

That's not a case. The consumer and enterprise markets are entirely different. Besides, AMD has enterprise stuff as well. I've heard their MI Instinct stuff isn't half bad (I don't know, I've just heard).
AMD uses different chips for enterprise vs consumer. Nvidia does not. That is the point. AMD will be merging their architectures with the upcoming UDNA. Hopefully that will be successful.

I wouldn't say they were competitive from top to bottom with rdna 2 but surely they had some really compelling options. Especially the 6800 (non xt) was a killer product. It cost more than the 3070 but I'd gladly pay it for 2x the vram and faster raster. Mining killed their momentum though.
Yeah, banger of a card. I have one. There still isn't a compelling upgrade option for it, and it looks like there won't be for some years to come. Not that I'm in the market anyway. I don't play demanding games often enough to care. Frankly every time I look at the AAA space, I wonder why anyone does.
 
Last edited:
Joined
Jan 14, 2019
Messages
14,697 (6.56/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case It's not about size, but how you use it
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanical
VR HMD Not yet
Software Linux gaming master race
I can't seem to find the article I read, but AMD mentioned how it provides a lot of flexibility to tailor chips (or configurations) to use cases. Now they seemingly have communication between CCUs on the Halo chip, and with DeepSeek taking advantage of parallelization (something CUDA apparently can't do) and neural rendering on the horizon, I'm interested to see what they have planned for their next architecture.
It provides a lot of flexibility on CPUs, which don't need super-tight latency between cores. Not so much on GPUs. That's why their CPU business is booming while RDNA 3 as a whole generation was just a bit meh, despite all the money they pumped into making chiplets work.

We all have an interest in healthy competition. I want AMD to stop stepping on its own dick. What I don't understand is why you'd defend them when they make avoidable errors. "B-But Nvidia is bad too" doesn't move me at this point. I have dozens of posts shitting on Nvidia. I'm a Linux cultist, after all. But AMD is not some battered wife or tiny baby in need of coddling. It is a multi-billion-dollar high-tech company with some of the world's most brilliant engineers, who, by the way, are probably more annoyed than I am with corporate/marketing's mistakes.
I'm not defending the company. I'm defending their products which you yourself admitted were good. That's all I care about - products. Dr Su herself could come on stage during the next show and say that RDNA 4 is shit and no one should buy it, but I don't care. If it's good, I'll buy it. Marketing is for idiots.

Same goes for Nvidia (despite all the crap I've given them lately) - if they produce something solid, they'll get my vote (again). The problem is that they don't. They keep pushing the same architecture again and again for a higher price, hidden behind more smoke and mirrors with every gen. AMD is at least trying (even if they fail sometimes), but Nvidia clearly doesn't give a crap about gaming.

Oh, and I'm a (recent) Linux cultist as well thanks to Bazzite. :)

AMD uses different chips for enterprise vs consumer. Nvidia does not. That is the point. AMD will be merging their architectures with the upcoming UDNA. Hopefully that will be successful.
That's a fair point.
 
Joined
Nov 13, 2024
Messages
207 (1.92/day)
System Name le fish au chocolat
Processor AMD Ryzen 7 5950X
Motherboard ASRock B550 Phantom Gaming 4
Cooling Peerless Assassin 120 SE
Memory 2x 16GB (32 GB) G.Skill RipJaws V DDR4-3600 DIMM CL16-19-19-39
Video Card(s) NVIDIA GeForce RTX 3080, 10 GB GDDR6X (ASUS TUF)
Storage 2 x 1 TB NVME & 2 x 4 TB SATA SSD in Raid 0
Display(s) MSI Optix MAG274QRF-QD
Power Supply 750 Watt EVGA SuperNOVA G5
[Took this from another post and migrated it here, because it was a bit off topic in the other thread]

The price difference between the cheapest 7900xtx & 4080 Super/5080 is pretty big right now in my region (VAT included):

The cheapest 4080 Super is 1132 € (and its price is steadily increasing)

The cheapest 5080 will be 1169 €

The (almost) cheapest 7900xtx is currently 899 €


There is now a 230 € price difference between the cheapest 4080 Super/5080 and the 7900xtx (three months ago it was a 100 € difference)

Seems like Nvidia really doesn't care about the 7900xtx, or they want to signal to AMD: "Hey, you don't want to increase P/P this gen, right? Just price your next cards linearly (P/P) against your old stack (7000 series)."

Either way, this gen looks rather bad, pretty sad. Hopefully I am wrong. *pls*

Edit: Also, the 899 price makes absolutely no sense if the 7900xtx costs the same right now, unless they value FSR4 & lower power draw highly
 
Last edited:
Joined
Nov 26, 2021
Messages
1,863 (1.56/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
It provides a lot of flexibility on CPUs, which don't need super-tight latency between cores. Not so much on GPUs. That's why their CPU business is booming while RDNA 3 as a whole generation was just a bit meh, despite all the money they pumped into making chiplets work.


I'm not defending the company. I'm defending their products which you yourself admitted were good. That's all I care about - products. Dr Su herself could come on stage during the next show and say that RDNA 4 is shit and no one should buy it, but I don't care. If it's good, I'll buy it. Marketing is for idiots.

Same goes for Nvidia (despite all the crap I've given them lately) - if they produce something solid, they'll get my vote (again). The problem is that they don't. They keep pushing the same architecture again and again for a higher price, hidden behind more smoke and mirrors with every gen. AMD is at least trying (even if they fail sometimes), but Nvidia clearly doesn't give a crap about gaming.

Oh, and I'm a (recent) Linux cultist as well thanks to Bazzite. :)


That's a fair point.
Chiplets are good for yields and reducing costs, but they also allow scaling to higher performance than a single die: see the MI300X. As for latency, GPUs actually don't have better core-to-core latency than CPUs.

[Attachment: GPU core-to-core latency chart]

Contrast GPUs' core-to-core latency with CPUs': for Zen 4, latencies range from sub-20 ns to 80 ns. On the same CCD, latencies are much lower than GPUs could ever dream of.
[Attachment: Zen 4 core-to-core latency chart]
 
Last edited:
Joined
Jan 14, 2019
Messages
14,697 (6.56/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case It's not about size, but how you use it
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanical
VR HMD Not yet
Software Linux gaming master race
Chiplets are good for yields and reducing costs, but they also allow scaling to higher performance than a single die: see the MI300X. As for latency, GPUs actually don't have better core-to-core latency than CPUs.

View attachment 382693
Contrast GPUs' core-to-core latency with CPUs': for Zen 4, latencies range from sub-20 ns to 80 ns. On the same CCD, latencies are much lower than GPUs could ever dream of.
View attachment 382694
I see. But what about splitting the GPU itself into chiplets? As far as I recall, AMD experimented with that on RDNA 3, and it introduced unwanted latency. That's why they ended up splitting off only the cache / memory controllers from the main die in the end.
 
Joined
Nov 26, 2021
Messages
1,863 (1.56/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
I see. But what about splitting the GPU itself into chiplets? As far as I recall, AMD experimented with that on RDNA 3, and it introduced unwanted latency. That's why they ended up splitting off only the cache / memory controllers from the main die in the end.
That apparent increase was an error due to AMD's aggressive power saving on the GCD-to-MCD link. There's a latency difference, but it isn't as dramatic as initial tests suggested.

We see a similar pattern with vector accesses, where measured Infinity Cache latency dropped from 199 ns to 150.3 ns with the card in a higher power state.

For the monolithic 7600, power saving didn't impact it as badly.

Vector accesses show similar behavior, with just a 9.5% Infinity Cache latency difference depending on whether the Infinity Fabric was in power saving state.
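
Back-of-the-envelope, using only the figures quoted above (a quick sketch, nothing more):

```python
# Infinity Cache latency figures as quoted above.
idle_ns, active_ns = 199.0, 150.3   # chiplet RDNA 3 card, GCD<->MCD link
drop = (idle_ns - active_ns) / idle_ns
print(f"Chiplet card: measured latency drops {drop:.1%} in the higher power state")  # ~24.5%
# The monolithic 7600's quoted difference is only ~9.5%, so the big
# swing really does come from the cross-die link's power saving.
```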
 
Joined
Oct 21, 2009
Messages
139 (0.02/day)
Location
Netherlands
System Name Whitewonder
Processor 7800X3D
Motherboard Asus Proart X670-E Creator
Cooling Corsair custom Watercooled
Memory 64 Gb
Video Card(s) RX 6800 XT, 2080
Storage Too much to mention in all 1190 TB
Display(s) 2 x Dell 4K @ 60 hz
Case White XL case
Audio Device(s) Realtek + Bayer Dynamics 990 Pro headset
Power Supply 2100 watt
Mouse Corsair cord mouse
Keyboard Corsair red lighter cabled keyboard ages old ;)
When AMD does have a design win, it sells well, but Nvidia still outsold them while AMD continued to lose market share. I don't know how much consistency would help Radeon, as they would have to do something to attract customers - a problem Nvidia doesn't have, since Nvidia has been successful in selling software at a significant premium. Before RDNA, AMD did have consistency, and it didn't seem to help them enough.
However, the CPU market isn't the same as the GPU market. AMD would still have to overcome the software features that are proprietary to Nvidia, and compete on gaming when most of those features run better on Nvidia hardware. AMD would have to launch faster cards than Nvidia for several generations, with better RT and upscaling, in order to win over the market the way Ryzen did. AMD would also have to shake off the old stigma of bad drivers and the complaints about their cards being too expensive. AMD always has to undercut to the point of having low margins; there can't be consistency when there is likely barely enough R&D budget to develop the next product.
Nvidia also gets away with a lot of anti-consumer things, like GPP, handing game devs piles of money to develop games with features only available to Nvidia users, marketing lies like saying the 5070 is faster than a 4090, or MSRPs that are completely fake because the MSRP only applies to the FE cards, which are usually purposely supply-limited.
AMD has always been seen as the underdog, even though their products have been pretty good in recent years.
They also need more fab capacity to build the AI GPU clusters (supercomputers) that companies use, because they do well on that front as far as I know.
The info from within AMD has dried up for me, as the friend who worked there has retired. He knew every secret and never ever told me any.
That of course did not stop me from trying to get the latest and greatest, but he always waited until the secret had already leaked :laugh:
 