Sunday, October 3rd 2021

AMD Ryzen Threadripper 5000 Series Delayed to 2022?

The launch of AMD's upcoming Ryzen Threadripper 5000 series high-end desktop (HEDT) and Threadripper WX workstation processors is rumored to have been delayed to 2022, according to Greymon55, a reliable source of AMD leaks. Codenamed "Chagall," these processors are compatible with existing sTRX4 and sWRX8 motherboards, based on the AMD TRX40 and AMD WRX80 chipsets, respectively. What's new is the "Zen 3" microarchitecture.

It remains to be seen whether the delay is the result of a last-minute decision by AMD to go with the newer "Zen 3" CCD featuring 3D Vertical Cache technology over the conventional "Zen 3" CCD, or of some other factor. A 2022 launch would mean the Threadripper 5000 series arrives around the time Intel has desktop platforms with DDR5 memory and PCI-Express Gen 5. Threadripper 5000 chips with quad-channel DDR4 memory (four 64-bit wide channels) would then offer only comparable memory bandwidth to "Alder Lake" systems with overclocked DDR5 memory (four 40-bit wide sub-channels, 32 bits of which carry data). AMD is likely to prioritize its next "big" socket for the enterprise segment with EPYC "Genoa," as the company could find itself up against Xeon "Sapphire Rapids" processors that come with next-gen I/O.
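
For a rough sense of why the two platforms could end up in the same ballpark, here is a back-of-the-envelope peak-bandwidth calculation. The memory speeds (DDR4-3200 for Threadripper and an overclocked DDR5-6400 kit for "Alder Lake") are illustrative assumptions, not confirmed specifications:

```python
# Rough theoretical peak memory bandwidth comparison. Data rates below are
# assumptions for illustration, not confirmed specs for either platform.

def peak_bandwidth_gbs(channels: int, width_bits: int, mt_per_s: int) -> float:
    """Peak bandwidth in GB/s: channels * bytes per transfer * MT/s / 1000."""
    return channels * (width_bits / 8) * mt_per_s / 1000

# Threadripper-style quad-channel DDR4, assuming DDR4-3200
tr_ddr4 = peak_bandwidth_gbs(channels=4, width_bits=64, mt_per_s=3200)

# "Alder Lake"-style dual-channel DDR5: each DIMM exposes two 32-bit data
# sub-channels (40 bits counting ECC on ECC modules), assuming DDR5-6400
adl_ddr5 = peak_bandwidth_gbs(channels=4, width_bits=32, mt_per_s=6400)

print(f"Quad-channel DDR4-3200:  {tr_ddr4:.1f} GB/s")  # ~102.4 GB/s
print(f"Dual-channel DDR5-6400:  {adl_ddr5:.1f} GB/s")  # ~102.4 GB/s
```

Real-world throughput will be lower on both platforms once controller efficiency and latency come into play, but the theoretical ceilings land remarkably close together.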
Sources: Greymon55 (Twitter), VideoCardz

39 Comments on AMD Ryzen Threadripper 5000 Series Delayed to 2022?

#26
Lionheart
TurmaniaI just want to point out the double standards here, nothing else. If this happened to Intel, you would see hundreds of comments mocking them, but when it happens to AMD, all those fanboys run out of excuses.
These things can happen in the tech industry; it's just very sad to see blind loyalty.
What blind loyalty? It's just facts: AMD back in the FX days was absolute garbage and was made fun of, and rightfully so. Zen rumors/hype started and people had their doubts, myself included, but hey, they delivered. It wasn't perfect, that's for sure, but it was a start.
HenrySomeoneBoy, the fanboyism is running rampant around here! But the first dose of cure is fortunately just around the corner.
Labelling simple logic "fanboyism" is pretty sad.
#27
Aquinus
Resident Wat-man
HenrySomeoneThreadripper with 4-times the cores, pal. If 5600x was beating a 24 core xeon, you'd be all over it! :rolleyes:
It's a 16-core part, with 8 of those cores being lower-power Gracemont cores, but they're still cores being used that contribute to compute performance and power consumption. That would be half the cores, not a quarter. My point is that it's Intel with a 16c/24t part that scores about the same as AMD's 16c/32t part. That's also comparing it to the 5950X, which is already in the real world and in people's machines. There is also an open question as to how much power this CPU was using under full load. As we know, it has a 125 W TDP, and we know boost clocks and power consumption are going to be much higher. Take the 10900K, for example: 10c/20t with a 125 W TDP and insane boost consumption north of 300 W. Even the 9880H in my laptop, which has a 45 W TDP, has a short-term power limit of almost 95 W and a long-term limit of 65 W... in a laptop. So until we see more numbers, I wouldn't get too excited, because as far as we know, Intel has tuned this CPU to eat power full tilt and to bounce off the thermal limits of the chip like they have in the past.
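For a sense of how a chip with a 45 W or 125 W TDP can legitimately pull far more than that under boost, here is a toy model of a short-term/long-term power-limit pair (the PL2/PL1-with-a-time-window scheme). The limits mirror the 9880H figures quoted above; the model is a deliberate simplification for illustration, not Intel's exact algorithm:

```python
# Toy model of short-term (PL2) vs long-term (PL1) package power limits.
# PL1, PL2, and TAU are illustrative values mirroring the 9880H discussion;
# the real firmware behavior differs in detail.

PL1, PL2, TAU = 65.0, 95.0, 28.0   # watts, watts, seconds (assumed)

def simulate(demand_w: float, seconds: int, dt: float = 1.0):
    """Return the package power drawn each second under a sustained load."""
    avg = 0.0            # running (exponentially weighted) average power
    alpha = dt / TAU
    drawn = []
    for _ in range(int(seconds / dt)):
        # Allow the short-term limit while the average is under the long-term
        # limit; once the budget is spent, clamp to the long-term limit.
        limit = PL2 if avg < PL1 else PL1
        power = min(demand_w, limit)
        avg += alpha * (power - avg)
        drawn.append(power)
    return drawn

trace = simulate(demand_w=120, seconds=60)
print(f"first 5 s: {trace[:5]}")    # boosts at ~95 W
print(f"last 5 s:  {trace[-5:]}")   # settles to ~65 W sustained
```

Under a sustained heavy load, the modeled chip boosts at the short-term limit for roughly the length of the time window and then settles at the long-term limit, which is why short benchmark runs can look much better than the sustained-power picture.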
HenrySomeoneBoy, the fanboyism is running rampant around here! But the first dose of cure is fortunately just around the corner.
Don't call people names. It reflects poorly on you and your argument. It's ad hominem in broad daylight.
#28
las
AquinusYou mean how Alder Lake was keeping up with a Threadripper from 3 years ago but gets wrecked by a modern 32c/64t TR chip? Sure bub. I don't think AMD is scared and as far as we know, this chip was consuming 200w to do it.
Yeah, it beat a 3-year-old Threadripper, but in leaks the i9-12900K also performed like the Ryzen 5950X in multi-threaded and beat it by a lot in single-threaded; that's pretty decent.

Intel 7, aka 10 nm+, is on par with TSMC 7 nm, hence the new naming scheme. The future is looking bright for Intel with new superfabs under construction, and Samsung is also improving fast. TSMC had better not sleep (or they will lose Apple too).

Can't wait to see AMD's take on a hybrid design.
#29
Aquinus
Resident Wat-man
lasYeah, it beat a 3-year-old Threadripper, but in leaks the i9-12900K also performed like the Ryzen 5950X in multi-threaded and beat it by a lot in single-threaded; that's pretty decent.
Yeah, that's something worth calling out, but as I said before, traditionally the way Intel has pulled that off is by letting boost power consumption go through the roof, particularly for those high single-core numbers. I'm hoping that 10 nm will help improve those numbers, but I'm not very confident, and I'll explain why. When you shrink the process you're concentrating heat in a smaller area, so heat flux becomes more of a problem. AMD actually has an advantage here because of the chiplet design, which spreads out the heat-producing components. Intel is still producing large monolithic dies, so even if power consumption does get under control, I think we'll find that thermals won't be.

Also, mind you that the 5950X is a product that has been out in the wild for almost a year now. So while Intel might be making progress, they're still coming from behind. That isn't to say that I'm not excited for these new chips. I just think we need to be careful with the hype train, particularly given what Intel has done in the past.
#30
las
AquinusYeah, that's something worth calling out, but as I said before, traditionally the way Intel has pulled that off is by letting boost power consumption go through the roof, particularly for those high single-core numbers. I'm hoping that 10 nm will help improve those numbers, but I'm not very confident, and I'll explain why. When you shrink the process you're concentrating heat in a smaller area, so heat flux becomes more of a problem. AMD actually has an advantage here because of the chiplet design, which spreads out the heat-producing components. Intel is still producing large monolithic dies, so even if power consumption does get under control, I think we'll find that thermals won't be.

Also, mind you that the 5950X is a product that has been out in the wild for almost a year now. So while Intel might be making progress, they're still coming from behind. That isn't to say that I'm not excited for these new chips. I just think we need to be careful with the hype train, particularly given what Intel has done in the past.
Personally, I don't really care about CPU watt usage in a desktop PC.
My 9900K uses something like 100-150 watts running at 5.2 GHz during gaming. It takes a synthetic burn-in (AVX2 especially) or 100% load across all cores to make it hit 200+; even then there's no difference in noise or temps inside the case for me, so yeah, not really bothered. An Nvidia 3090 or an AMD 6900 XT can peak at 600+ watts, but a CPU can't use more than 150 watts? I don't really understand this.

The 5950X might have been out for a year (or actually more like 10 months), but it was a huge paper launch. Tons of buyers waited for months and months after release to receive one. I actually ordered one but cancelled my order after 6 weeks, and looking back I'm glad I did. I will be waiting for DDR5 to mature and pick up something truly next-gen in 2023-2024 instead; my 9900K is holding up really well.
#31
HenrySomeone
LionheartWhat blind loyalty? It's just facts: AMD back in the FX days was absolute garbage and was made fun of, and rightfully so. Zen rumors/hype started and people had their doubts, myself included, but hey, they delivered. It wasn't perfect, that's for sure, but it was a start.
The first two iterations of Zen (that is, Zen and Zen+) were no better in comparison to their rivals (Kaby and Coffee Lake) than Bulldozer was next to Sandy Bridge. It's just that the FX chips remained almost the same for more than 5 years (in the very end "competing" against Kaby Lake even), which is why even some confirmed team red, hmmm, aficionados? :D now say they were crap (to try and make themselves seem less obvious). It's only with Zen 3 that they finally released something worth buying, and even that only if you owned less than an 8700K, or, well, if you really needed lots of cores on only two memory channels.
#32
seth1911
Right now it isn't a problem because AMD is still on DDR4 and PCIe 4, but Intel was a damn asshole company because of DDR4 and PCIe 3.0. Intel releases DDR5 and PCIe 5.0, and in the future AMD will be great because they'll make PCIe 6.0 :laugh:

Stupid fanboy sight ;)
#33
Aquinus
Resident Wat-man
lasPersonally, I don't really care about CPU watt usage in a desktop PC.
My 9900K uses something like 100-150 watts running at 5.2 GHz during gaming. It takes a synthetic burn-in (AVX2 especially) or 100% load across all cores to make it hit 200+; even then there's no difference in noise or temps inside the case for me, so yeah, not really bothered. An Nvidia 3090 or an AMD 6900 XT can peak at 600+ watts, but a CPU can't use more than 150 watts? I don't really understand this.
Mind you, that's with a CPU with a 95 W TDP; a chip with a 125 W TDP is going to go further. With all due respect, I have a machine with a 3930K and a Vega 64, and it's no stranger to consuming power. When you start pushing ~500-700 W from the wall for your system, it's not just going to heat up the room, it's going to be audible, unless you limit how much air goes through the system and let thermal boost algorithms tune it down when it gets too toasty. The simple fact of the matter is that if you produce more heat, you need more air to get rid of it, otherwise temps go up. The only alternative to that is altering the ∆T by lowering the ambient air temperature.
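
As a rough illustration of that air-versus-heat trade-off, here is a minimal sketch based on the standard sensible-heat relation for air. The wattages and the assumed 10 °C exhaust-to-intake temperature rise are illustrative, not measurements from any of the systems mentioned:

```python
# Back-of-the-envelope airflow needed to remove a given heat load at a given
# air temperature rise (∆T). Constants are for dry air at roughly 25 °C; the
# wattages below are assumptions for illustration, not measured figures.

AIR_DENSITY = 1.2          # kg/m^3
AIR_SPECIFIC_HEAT = 1005   # J/(kg*K)
M3S_TO_CFM = 2118.88       # 1 m^3/s expressed in cubic feet per minute

def required_airflow_cfm(heat_watts: float, delta_t_c: float) -> float:
    """Airflow (CFM) needed so exhaust air is delta_t_c above intake air."""
    m3_per_s = heat_watts / (AIR_DENSITY * AIR_SPECIFIC_HEAT * delta_t_c)
    return m3_per_s * M3S_TO_CFM

for watts in (150, 300, 600):
    print(f"{watts} W at 10 °C rise: {required_airflow_cfm(watts, 10):.0f} CFM")
# 150 W -> ~26 CFM, 300 W -> ~53 CFM, 600 W -> ~105 CFM
```

The relation is linear: doubling the heat dumped into the case at a fixed ∆T doubles the airflow required, so the options are more (louder) airflow, accepting a larger ∆T (warmer exhaust and warmer components), or cooler intake air.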

My simple point is that a performance metric alone doesn't really tell us a whole lot. With enough power and cooling, a lot of these chips will produce nice benchmark numbers. The question is how far Intel had to go to achieve those numbers. With a 125 W TDP, I suspect we're not talking 100-150 W under load, but rather something closer to 130-200 W, with some situations pushing it closer to 250 W, which isn't unrealistic given what we've seen in the past.

With that said, I'm optimistically skeptical. Stuff like this almost always leads to disappointment, as most hype-trains do.
#34
Lionheart
HenrySomeoneThe first two iterations of Zen (that is, Zen and Zen+) were no better in comparison to their rivals (Kaby and Coffee Lake) than Bulldozer was next to Sandy Bridge. It's just that the FX chips remained almost the same for more than 5 years (in the very end "competing" against Kaby Lake even), which is why even some confirmed team red, hmmm, aficionados? :D now say they were crap (to try and make themselves seem less obvious). It's only with Zen 3 that they finally released something worth buying, and even that only if you owned less than an 8700K, or, well, if you really needed lots of cores on only two memory channels.
Well, yes, Skylake/Kaby Lake/Coffee Lake, whatever-you-wanna-call-it Lake, still had better IPC, clocks, and lower latency than first-gen Zen and Zen+. AMD clearly knew this, so they had to go the price-to-performance route with more cores, and it worked out in the end. I disagree with you on Zen 3 being the only thing worth buying; Zen 2 is what really shook things up, followed by Zen 3.
#35
r9
It's for exactly the same reason Intel stopped pushing development: lack of competition.
AMD is as good as Intel at milking old tech.
#36
trsttte
r9It's for exactly the same reason Intel stopped pushing development: lack of competition.
AMD is as good as Intel at milking old tech.
You're not wrong, but there are also supply shortages and other priorities. Regular Threadripper is a niche product, and potential Threadripper Pro users can make do with EPYC.

It's a bit of a weird situation they've got going, and it's why I'd argue they should merge Threadripper and Threadripper Pro into a single product line.
#37
Hardware Geek
lynx29this is what I want it for, though. I never owned a PS4; I was living at college during those days and just busy with other stuff. I missed all of the Sony games, and I want to experience them remastered.
Being able to play PS4 games at higher quality and/or with more stable frame rates is a huge plus. I was planning on getting a PS5 eventually and happened to get lucky.
#38
THEDOOMEDHELL
As a workstation user myself, I'm really hoping this CPU actually drops Threadripper prices. Even 1st gen is still in the $250 price range on used markets... it's abysmal compared to Intel's HEDTs. I know they don't really care how used markets work, but here's hoping it does something. Otherwise I'm going to be stuck with Intel HEDT for a long time!
#39
doejohn
Minus InfinityThey're already looking scared of what they're seeing from Alder Lake benchmarks. Sapphire Rapids will smash Threadripper if it's just old Zen 3 cores.
Alder Lake tech will not play well in the HEDT market. Remember, the 16-core (8P + 8E) part consumes almost as much power as a 32-core Threadripper system.
A 32-core or bigger part would most likely consume an insane amount of power, at the level of the silly W-3175X "5 GHz on all 28 cores" (which consumed more than 1000 W from the wall and required a chiller plus a special plug).
HenrySomeoneThreadripper with 4-times the cores, pal. If 5600x was beating a 24 core xeon, you'd be all over it! :rolleyes:
Huh, but that is not even the comparison. The apt comparison would be the 5950X (which is the flagship) beating the 24-core overclocked Xeon.
The difference is, AMD is still using less power and producing less heat.