
AMD Ryzen Threadripper 5000 Series Delayed to 2022?

Joined
Nov 6, 2016
Messages
1,770 (0.60/day)
Location
NH, USA
System Name Lightbringer
Processor Ryzen 7 2700X
Motherboard Asus ROG Strix X470-F Gaming
Cooling Enermax Liqmax Iii 360mm AIO
Memory G.Skill Trident Z RGB 32GB (8GBx4) 3200Mhz CL 14
Video Card(s) Sapphire RX 5700XT Nitro+
Storage Hp EX950 2TB NVMe M.2, HP EX950 1TB NVMe M.2, Samsung 860 EVO 2TB
Display(s) LG 34BK95U-W 34" 5120 x 2160
Case Lian Li PC-O11 Dynamic (White)
Power Supply BeQuiet Straight Power 11 850w Gold Rated PSU
Mouse Glorious Model O (Matte White)
Keyboard Royal Kludge RK71
Software Windows 10
Sure bro.
Hahaha, I know, right? That guy probably saw the one benchmark where Alder Lake MATCHED a 2990WX from three years ago on the Zen+ architecture and somehow extrapolated that to it BEATING a 32-core Zen 3 Threadripper.

I just want to point out the double standard here, nothing else. If this happened to Intel, you would see hundreds of comments mocking them, but when it happens to AMD, all those fanboys run out of excuses.
These things can happen in the tech industry; it's just very sad to see blind loyalty.
Yeah... I just want to point out that Intel's R&D budget for 2020 was literally 6.5x AMD's, and Intel's 2020 revenue ($78 billion) is about 8x AMD's 2020 revenue ($9.76 billion). So when Intel doesn't deliver with several times AMD's resources, there's no excuse for Intel, while AMD has been working on a shoestring by comparison and whipping them for several years now, which makes AMD's victories all the more impressive. Please, name any other business from any other industry that is BEATING a competitor with a 6.5x greater R&D budget and an 8x greater revenue stream. For that matter, Nvidia's R&D budget for 2020 was $4 billion, double AMD's, and AMD has to split its R&D budget between GPUs and CPUs while Nvidia spends it entirely on GPUs, so it'd be more accurate to say Nvidia's effective GPU R&D budget is more than 4x AMD's (since I'm sure AMD spends more of its budget on x86, which has a much larger TAM). AMD is still able to compete, and in a sane world where we could go by MSRP, AMD would arguably be offering better value in each price tier.

We live in a reality dominated by financial resources, and until you factor those resources into assessing which company is doing better or worse, your assessment is divorced from reality. When you consider that AMD beats Intel and competes against Nvidia (and even beats them in rasterization), AMD's performance is more impressive than either of those companies'. If AMD had Intel's financial resources in x86 and Nvidia's in dGPU, they'd have completely run away with it over the past few years.
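For what it's worth, here is a trivial sanity check of the ratios above, using only the figures quoted in this post (the dollar amounts are the poster's, not audited financials):

```python
# Quick arithmetic check of the figures quoted above. All inputs are the
# numbers stated in the post itself, not official financial filings.
intel_revenue_2020 = 78.0    # $B, as quoted
amd_revenue_2020 = 9.76      # $B, as quoted
nvidia_rnd_2020 = 4.0        # $B, as quoted
amd_rnd_2020 = nvidia_rnd_2020 / 2   # post says Nvidia's R&D was double AMD's

print(f"Intel/AMD revenue ratio: {intel_revenue_2020 / amd_revenue_2020:.1f}x")  # ~8.0x
print(f"Nvidia/AMD R&D ratio:    {nvidia_rnd_2020 / amd_rnd_2020:.1f}x")         # 2.0x
```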

Boy, the fanboyism is running rampant around here! But the first dose of cure is fortunately just around the corner.
How is it fanboyism when he is stating the fact that Alder Lake was beating a 3-year-old Threadripper chip on the Zen+ architecture (which literally represents nearly a 40% decrease in IPC vs Zen 3)? It seems more like fanboyism to be ignoring that fact, like somebody around here.
 
Joined
Apr 30, 2008
Messages
4,901 (0.81/day)
Location
Multidimensional
System Name Boomer Master Race
Processor AMD Ryzen 7 8745H
Motherboard MinisForum 870 Slim Board
Cooling Mini PC Cooling
Memory Crucial 32GB 5600Mhz
Video Card(s) Radeon 780M
Storage Kingston 1TB SSD
Display(s) Sony 4K Bravia X85J 43Inch TV 120Hz
Case MinisForum 870 Slim Case
Audio Device(s) Built In Realtek Digital Audio HD
Power Supply 120w External Power Brick
Mouse Logitech G203 Lightsync
Keyboard Atrix RGB Slim Keyboard
VR HMD ( ◔ ʖ̯ ◔ )
Software Windows 11 Pro 64bit
Benchmark Scores Don't do them anymore.
I just want to point out the double standard here, nothing else. If this happened to Intel, you would see hundreds of comments mocking them, but when it happens to AMD, all those fanboys run out of excuses.
These things can happen in the tech industry; it's just very sad to see blind loyalty.
What blind loyalty? It's just facts. AMD back in the FX days was absolute garbage and was made fun of, and rightly so. Zen rumors/hype started and people had their doubts, myself included, but hey, they delivered. It wasn't perfect, that's for sure, but it was a start.

Boy, the fanboyism is running rampant around here! But the first dose of cure is fortunately just around the corner.
Labelling simple logic "fanboyism" is pretty sad.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,171 (2.80/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
Threadripper with 4 times the cores, pal. If a 5600X was beating a 24-core Xeon, you'd be all over it! :rolleyes:
It's a 16-core part, with 8 of those cores being lower-power Gracemont cores, but they're still cores that contribute to compute performance and power consumption. That would be half the cores, not a quarter. My point is that it's Intel with a 16c/24t part scoring about the same as AMD's 16c/32t part. That's also comparing it to the 5950X, which is already in the real world and in people's machines. There is also an open question as to how much power this CPU was using under full load. As we know, it has a 125W TDP, and boost clocks and power consumption are going to be much higher. Take the 10900K, for example: 10c/20t with a 125W TDP and an insane boost consumption north of 300W. Even the 9880H in my laptop, which has a 45W TDP, has a short-term power limit of almost 95W and a long-term limit of 65W... in a laptop. So until we see more numbers, I wouldn't get too excited, because as far as we know, Intel has tuned this CPU to eat power at full tilt and bounce off the thermal limits of the chip like they have in the past.
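To make the TDP-vs-boost point concrete, here is a very simplified sketch of how PL1/PL2-style power limits behave. The model (a fixed boost window) and all the wattages are illustrative assumptions; Intel's actual controller uses an exponentially weighted moving average, and board vendors routinely raise or remove the limits entirely:

```python
# Rough illustration of why "TDP" and observed package power diverge.
# PL1 ~ sustained limit (roughly the advertised TDP), PL2 ~ short-term boost
# limit, tau ~ how long the chip may sit above PL1. The real controller uses
# an exponentially weighted moving average; this fixed-window version only
# shows the shape of the behaviour, with made-up numbers.

def package_power(t_seconds, pl1=125.0, pl2=250.0, tau=56.0):
    """Power (W) a fully loaded CPU would be allowed to draw at time t."""
    return pl2 if t_seconds < tau else pl1

for t in (0, 10, 30, 55, 60, 300):
    print(f"t={t:>3}s  allowed package power ≈ {package_power(t):.0f} W")

# A short benchmark run (under tau) sees ~PL2 the whole time, which is why a
# "125 W" part can post scores while drawing twice that; a long all-core
# render eventually settles back to PL1 (or wherever the board vendor set it).
```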
Boy, the fanboyism is running rampant around here! But the first dose of cure is fortunately just around the corner.
Don't call people names. It makes the optics look bad for you and your argument. It's ad hominem in broad daylight.
 

las

Joined
Nov 14, 2012
Messages
1,693 (0.38/day)
System Name Meh
Processor 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Phantom Spirit
Memory 32GB G.Skill @ 6000/CL30
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR
Case Fractal Design North XL
Audio Device(s) FiiO DAC
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
You mean how Alder Lake was keeping up with a Threadripper from 3 years ago but gets wrecked by a modern 32c/64t TR chip? Sure, bub. I don't think AMD is scared, and as far as we know, this chip was consuming 200W to do it.

Yeah, it beat a 3-year-old Threadripper, but the i9-12900K in leaks also performed like a Ryzen 5950X in multi-threaded and beat it by a lot in single-threaded; that's pretty decent.

Intel 7, aka 10nm+, is on par with TSMC 7nm, hence the new naming scheme. The future is looking bright for Intel with new superfabs under construction, and Samsung is also improving fast; TSMC had better not sleep (or they will lose Apple too).

Can't wait to see AMD's take on hybrid designs.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,171 (2.80/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
Yeah, it beat a 3-year-old Threadripper, but the i9-12900K in leaks also performed like a Ryzen 5950X in multi-threaded and beat it by a lot in single-threaded; that's pretty decent.
Yeah, that's something worth calling out, but as I said before, traditionally the way Intel has pulled that off was by letting boost power consumption go through the roof, particularly for those high single-core numbers. I'm hoping that 10nm will help improve those numbers, but I'm not very confident, and I'll explain why. When you shrink the process you're concentrating heat in a smaller area, so heat flux becomes more of a problem. AMD actually has an advantage here because of the chiplet design, which spreads out the heat-producing components. Intel is still producing large monolithic dies, so even if power consumption does get under control, I think we'll find that thermals won't be.
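A rough back-of-the-envelope illustration of the heat-flux point, with made-up die sizes and power figures chosen only to show the effect (real dies are not uniform heat sources, so treat this as a first-order sketch):

```python
# Illustrative heat-flux comparison. All die areas and power figures below are
# invented round numbers, only meant to show why shrinking or consolidating
# silicon raises power density while chiplets spread it out.

def heat_flux(power_w, area_mm2):
    """Average power density in W/mm² across the total die area."""
    return power_w / area_mm2

monolithic = heat_flux(power_w=200, area_mm2=200)            # one big die
chiplet    = heat_flux(power_w=200, area_mm2=2 * 80 + 125)   # two compute dies + IO die
shrunk     = heat_flux(power_w=200, area_mm2=140)            # same power on a smaller die

print(f"monolithic die:        {monolithic:.2f} W/mm²")
print(f"chiplets (spread out): {chiplet:.2f} W/mm²")
print(f"after a shrink:        {shrunk:.2f} W/mm²  <- higher flux, harder to cool")
```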

Also mind you that the 5950X is a product that has been out in the wild for almost a year now. So while Intel might be making progress, they're still coming back from behind. That isn't to say that I'm not excited for these new chips. I just think we need to be careful with the hype train, particularly given what Intel has done in the past.
 

las

Joined
Nov 14, 2012
Messages
1,693 (0.38/day)
System Name Meh
Processor 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Phantom Spirit
Memory 32GB G.Skill @ 6000/CL30
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR
Case Fractal Design North XL
Audio Device(s) FiiO DAC
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
Yeah, that's something worth calling out, but as I said before, traditionally the way Intel has pulled that off was by letting boost power consumption go through the roof, particularly for those high single-core numbers. I'm hoping that 10nm will help improve those numbers, but I'm not very confident, and I'll explain why. When you shrink the process you're concentrating heat in a smaller area, so heat flux becomes more of a problem. AMD actually has an advantage here because of the chiplet design, which spreads out the heat-producing components. Intel is still producing large monolithic dies, so even if power consumption does get under control, I think we'll find that thermals won't be.

Also mind you that the 5950X is a product that has been out in the wild for almost a year now. So while Intel might be making progress, they're still coming back from behind. That isn't to say that I'm not excited for these new chips. I just think we need to be careful with the hype train, particularly given what Intel has done in the past.

Personally, I don't really care about CPU power usage in a desktop PC.
My 9900K uses like 100-150 watts running at 5.2 GHz during gaming. It takes a synthetic burn-in (AVX2 especially) or 100% load across all cores to make it hit 200+; however, there's no difference in noise or temps inside the case for me, so yeah, not really bothered. An Nvidia 3090 or AMD 6900 XT can peak at 600+ watts, but a CPU can't use more than 150 watts? I don't really understand this.

The 5950X might have been out for a year (or actually more like 10 months), but it was a huge paper launch. Tons of buyers waited for months and months after release to receive one. I actually ordered one but cancelled my order after 6 weeks, and looking back I'm glad I did. I will be waiting for DDR5 to mature and pick up something truly next-gen in 2023-2024 instead; my 9900K is holding up really well.
 
Joined
Apr 16, 2019
Messages
632 (0.30/day)
What blind loyalty? It's just facts. AMD back in the FX days was absolute garbage and was made fun of, and rightly so. Zen rumors/hype started and people had their doubts, myself included, but hey, they delivered. It wasn't perfect, that's for sure, but it was a start.
The first two iterations of Zen (that is, Zen and Zen+) were no better in comparison to their rivals (Kaby and Coffee Lake) than Bulldozer was next to Sandy Bridge. It's just that the FX chips remained almost the same for more than 5 years (in the very end "competing" against Kaby Lake, even), which is why even some confirmed team red, hmmm, aficionados? :D now say they were crap (to try to make themselves seem less obvious). It's only with Zen 3 that they finally released something worth buying, and even then only if you owned something less than an 8700K, or, well, if you really needed lots of cores on only two memory channels.
 
Joined
Oct 23, 2020
Messages
671 (0.44/day)
Location
Austria
System Name nope
Processor I3 10100F
Motherboard ATM Gigabyte h410
Cooling Arctic 12 passive
Memory ATM Gskill 1x 8GB NT Series (No Heatspreader bling bling garbage, just Black DIMMS)
Video Card(s) Sapphire HD7770 and EVGA GTX 470 and Zotac GTX 960
Storage 120GB OS SSD, 240GB M2 Sata, 240GB M2 NVME, 300GB HDD, 500GB HDD
Display(s) Nec EA 241 WM
Case Coolermaster whatever
Audio Device(s) Onkyo on TV and Mi Bluetooth on Screen
Power Supply Super Flower Leadx 550W
Mouse Steelseries Rival Fnatic
Keyboard Logitech K270 Wireless
Software Deepin, BSD and 10 LTSC
Right now it isn't a problem because AMD is still on DDR4 and PCIe 4, but Intel is a damn asshole company for being on DDR4 and PCIe 3.0. Intel releases DDR5 and PCIe 5.0, and in the future AMD will be great because they make PCIe 6.0 :laugh:

That's the stupid fanboy way of seeing things ;)
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,171 (2.80/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
Personally, I don't really care about CPU power usage in a desktop PC.
My 9900K uses like 100-150 watts running at 5.2 GHz during gaming. It takes a synthetic burn-in (AVX2 especially) or 100% load across all cores to make it hit 200+; however, there's no difference in noise or temps inside the case for me, so yeah, not really bothered. An Nvidia 3090 or AMD 6900 XT can peak at 600+ watts, but a CPU can't use more than 150 watts? I don't really understand this.
Mind you, that's with a CPU with a 95W TDP; a chip with a 125W TDP is going to go further. And with all due respect, I have a machine with a 3930K and a Vega 64, and it's no stranger to consuming power. When you start pushing ~500-700W from the wall for your system, it's not just going to heat up the room; it's going to be audible unless you limit how much air goes through the system and let thermal boost algorithms tune it down when it gets too toasty. The simple fact of the matter is that if you produce more heat, you need more air to get rid of it, otherwise temps go up. The only alternative is altering the ΔT by lowering the ambient air temperature.
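For anyone who wants to put numbers on that, the standard back-of-the-envelope relation is Q = ρ · V̇ · c_p · ΔT: the heat carried out of the case equals airflow times the temperature rise of the air. A small sketch (the wattages are just example figures):

```python
# Back-of-the-envelope airflow estimate: Q = rho * V_dot * c_p * dT.
# Shows why a ~600 W system either needs a lot of air through the case or
# runs with a larger temperature rise. Constants are textbook values for air;
# the wattages are example numbers, not measurements of any specific build.

RHO_AIR = 1.2       # kg/m^3, air density near sea level
CP_AIR = 1005.0     # J/(kg*K), specific heat of air
M3S_TO_CFM = 2118.88

def required_cfm(watts, delta_t_c):
    """Airflow (CFM) needed to carry `watts` of heat out of the case at a given dT."""
    m3_per_s = watts / (RHO_AIR * CP_AIR * delta_t_c)
    return m3_per_s * M3S_TO_CFM

for watts in (150, 300, 600):
    for dt in (5, 10, 15):
        print(f"{watts:>3} W at ΔT={dt:>2} °C -> {required_cfm(watts, dt):6.0f} CFM")
```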

My simple point is that just having a performance metric doesn't really tell us a whole lot. With enough power and cooling, a lot of these chips will produce nice benchmark numbers. The question is how far Intel had to go to achieve those numbers. With a 125W TDP, I suspect we're not talking 100-150W under load, but rather something closer to 130-200W, with some situations pushing it closer to 250W, which isn't unrealistic given what we've seen in the past.

With that said, I'm optimistically skeptical. Stuff like this almost always leads to disappointment, as most hype-trains do.
 
Joined
Apr 30, 2008
Messages
4,901 (0.81/day)
Location
Multidimensional
System Name Boomer Master Race
Processor AMD Ryzen 7 8745H
Motherboard MinisForum 870 Slim Board
Cooling Mini PC Cooling
Memory Crucial 32GB 5600Mhz
Video Card(s) Radeon 780M
Storage Kingston 1TB SSD
Display(s) Sony 4K Bravia X85J 43Inch TV 120Hz
Case MinisForum 870 Slim Case
Audio Device(s) Built In Realtek Digital Audio HD
Power Supply 120w External Power Brick
Mouse Logitech G203 Lightsync
Keyboard Atrix RGB Slim Keyboard
VR HMD ( ◔ ʖ̯ ◔ )
Software Windows 11 Pro 64bit
Benchmark Scores Don't do them anymore.
The first two iterations of Zen (that is, Zen and Zen+) were no better in comparison to their rivals (Kaby and Coffee Lake) than Bulldozer was next to Sandy Bridge. It's just that the FX chips remained almost the same for more than 5 years (in the very end "competing" against Kaby Lake, even), which is why even some confirmed team red, hmmm, aficionados? :D now say they were crap (to try to make themselves seem less obvious). It's only with Zen 3 that they finally released something worth buying, and even then only if you owned something less than an 8700K, or, well, if you really needed lots of cores on only two memory channels.
Well, yes, Skylake/Kaby Lake/Coffee Lake, whatever-you-wanna-call-it Lake, still had better IPC, clocks & lower latency than first-gen Zen & Zen+. AMD clearly knew this, so they had to go the price-to-performance route with more cores, and it worked out in the end. I disagree with you on Zen 3 being the only thing worth buying; Zen 2 is what really shook things up, followed by Zen 3.
 

r9

Joined
Jul 28, 2008
Messages
3,300 (0.55/day)
System Name Primary|Secondary|Poweredge r410|Dell XPS|SteamDeck
Processor i7 11700k|i7 9700k|2 x E5620 |i5 5500U|Zen 2 4c/8t
Memory 32GB DDR4|16GB DDR4|16GB DDR4|32GB ECC DDR3|8GB DDR4|16GB LPDDR5
Video Card(s) RX 7800xt|RX 6700xt |On-Board|On-Board|8 RDNA 2 CUs
Storage 2TB m.2|512GB SSD+1TB SSD|2x256GBSSD 2x2TBGB|256GB sata|512GB nvme
Display(s) 50" 4k TV | Dell 27" |22" |3.3"|7"
VR HMD Samsung Odyssey+ | Oculus Quest 2
Software Windows 11 Pro|Windows 10 Pro|Windows 10 Home| Server 2012 r2|Windows 10 Pro
It's for exactly the same reason Intel stopped pushing development: lack of competition.
AMD is as good as Intel at milking old tech.
 
Joined
Jun 18, 2021
Messages
2,567 (2.01/day)
It's for exactly the same reason Intel stopped pushing development: lack of competition.
AMD is as good as Intel at milking old tech.

You're not wrong, but there are also supply shortages and other priorities. Regular Threadripper is a niche product, and potential Threadripper Pro users can make do with Epyc.

It's a bit of a weird situation they've got going, and it's why I'd argue they should merge Threadripper and Threadripper Pro into a single product line.
 
Joined
Jun 12, 2017
Messages
184 (0.07/day)
System Name Linotosh
Processor Dual 800mhz G4
Cooling Air
Memory 1.5 GB
This is what I want it for, though. I never owned a PS4; I was living at college during those days and just busy with other stuff. I missed all of the Sony games, and I want to experience them remastered.
Being able to play PS4 games at higher quality and/or more stable frame rates is a huge plus. I was planning on getting a PS5 eventually and happened to get lucky.
 

THEDOOMEDHELL

New Member
Joined
Jan 27, 2020
Messages
7 (0.00/day)
As a workstation user myself, I'm really hoping this CPU actually drops Threadripper prices. Even 1st gen is still in the $250 price range on used markets... it's abysmal compared to Intel's HEDTs. I know they don't really care how used markets work, but here's to hoping it does something. Or else I'm gonna be stuck with Intel HEDT for a long time!
 
Joined
Apr 5, 2010
Messages
8 (0.00/day)
They already look scared of what they are seeing from Alder Lake benchmarks. Sapphire Rapids will smash Threadripper if it's just old Zen 3 cores.
Alder Lake tech will not play well in the HEDT market. Remember, the 16-core (8P+8E) part consumes almost as much power as a 32-core Threadripper system.
A 32-core or bigger version would most likely consume an insane amount of power, at the level of the silly W-3175X "5 GHz on all 28 cores" (which consumed more than 1000W from the wall and required a chiller plus a special plug).

Threadripper with 4 times the cores, pal. If a 5600X was beating a 24-core Xeon, you'd be all over it! :rolleyes:
Huh, but that is not even the comparison. The apt comparison would be a 5950X (which is the flagship) beating the 24-core overclocked Xeon.
The difference is, AMD is still using less power and producing less heat.
 