
AMD Defends MCM Approach for Building 32-core EPYC: Huge Cost Savings of ~41%

Joined
Jun 28, 2016
Messages
3,595 (1.16/day)
You're kidding :confused: zen is more efficient than anything Intel has atm, excluding Denverton.
I meant "ULV lineup" - the mobile parts.
And I didn't say they're more efficient than Zen. I said that they exist and work beautifully. So Intel can make a good purpose-built CPU for passively cooled notebooks.

Let's face it: Intel mobile CPUs work beautifully in the frugal 7.5 W mode, while it's difficult to imagine a Zen+Vega package sipping under 15 W (but I'd love to be surprised).
You see, since this is an MCM design, the bigger it is, the more efficient it gets. But it's also pretty complicated as a result: Infinity Fabric, SenseMI, large caches... a lot of things that are important for the efficiency of the whole package, but draw power themselves.

The result could be that Zen is very efficient where it's not that important (gaming desktops) and not very efficient where it really matters (ultrabooks).

Look at this Ryzen review:
https://www.techpowerup.com/reviews/AMD/Ryzen_3_1200/18.html
Ryzen CPUs are efficient under load, but suck at idle. All of them.

And now Vega:
https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_64/29.html
Vega sucks at both idle and load.

BTW: Intel HD graphics is 1-2 W. I bet Infinity Fabric itself draws more.

If anything RR is the reason Intel might've been spooked & released mobile parts first, before desktop CFL.
Mobile Kaby Lake and Broadwell were also released before their desktop variants; Skylake and Haswell launched mobile and desktop together.
Mobile segment is way more important for Intel. That's where the new tech goes first.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
42,630 (6.68/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
notb must be brothers with cvrk
 
Joined
Nov 13, 2007
Messages
10,845 (1.74/day)
Location
Austin Texas
System Name stress-less
Processor 9800X3D @ 5.42GHZ
Motherboard MSI PRO B650M-A Wifi
Cooling Thermalright Phantom Spirit EVO
Memory 64GB DDR5 6400 1:1 CL30-36-36-76 FCLK 2200
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse DeathadderV2 X Hyperspeed
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
Am I missing something? Ryzen is 10-15% inferior at idle. That's really not huge.

Isn't 95%+ of the time during regular usage spent at idle, though? For mobile parts a ~10% idle deficit adds up to 36 minutes to 1+ hour over a normal usage cycle (6-10 hours).
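A quick back-of-envelope check of those numbers (a sketch only; it assumes the 10-15% idle-power deficit translates roughly one-to-one into lost runtime, since the machine idles for most of the cycle):

```python
# Back-of-envelope: runtime lost to a 10-15% idle-power deficit over a
# 6-10 hour usage cycle, assuming the machine is effectively idle throughout.
for cycle_hours in (6, 10):
    for deficit in (0.10, 0.15):
        lost_minutes = cycle_hours * 60 * deficit
        print(f"{cycle_hours} h cycle, {deficit:.0%} deficit: ~{lost_minutes:.0f} min less runtime")
```

That recovers the 36-minutes-to-an-hour range at the 10% end, and a bit more at 15%.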
 

bug

Joined
May 22, 2015
Messages
13,843 (3.95/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
So, a smaller die allows cost savings. You hear that, Vega?
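For anyone wondering why that follows, here's a minimal sketch using the classic Poisson die-yield model (yield = exp(-defect_density * area)). The defect density, wafer cost, and die sizes are illustrative assumptions, not AMD or GlobalFoundries figures, and the extra packaging cost of an MCM is ignored:

```python
import math

# Minimal die-cost sketch using the Poisson yield model: yield = exp(-D * A).
# All numbers below are assumptions for illustration only.
WAFER_AREA_MM2 = 70_000   # usable area of a 300 mm wafer, edge losses ignored
WAFER_COST = 5_000        # cost per processed wafer, arbitrary units
DEFECT_DENSITY = 0.1      # defects per cm^2, assumed

def cost_per_good_die(die_area_mm2):
    yield_fraction = math.exp(-DEFECT_DENSITY * die_area_mm2 / 100)  # area in cm^2
    dies_per_wafer = WAFER_AREA_MM2 / die_area_mm2                   # crude estimate
    return WAFER_COST / (dies_per_wafer * yield_fraction)

mcm = 4 * cost_per_good_die(200)   # four ~200 mm^2 dies on one package
mono = cost_per_good_die(800)      # one hypothetical ~800 mm^2 monolithic die
print(f"MCM: {mcm:.0f}  monolithic: {mono:.0f}  saving: {1 - mcm / mono:.0%}")
```

With these made-up inputs the saving lands in the same ballpark as the ~41% AMD quotes; the real figure depends on the actual defect density, die sizes, and how much the MCM packaging adds back.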
 
Joined
Sep 6, 2013
Messages
3,392 (0.82/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 7600 / Ryzen 5 4600G / Ryzen 5 5500
Motherboard X670E Gaming Plus WiFi / MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2)
Cooling Aigo ICE 400SE / Segotep T4 / Νoctua U12S
Memory Kingston FURY Beast 32GB DDR5 6000 / 16GB JUHOR / 32GB G.Skill RIPJAWS 3600 + Aegis 3200
Video Card(s) ASRock RX 6600 + GT 710 (PhysX) / Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes / NVMes, SATA Storage / NVMe, SATA, external storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) / 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
10 years ago AMD was saying how important it was to create a monolithic, true quad-core processor instead of gluing together two dual-core processors. And that was of course because Intel was gluing together dual cores back then. The end result showed that gluing cores was the correct thing to do. AMD was late with its quad-core model, while Intel moved faster in the market with its gluing plan. And for what? AMD's quad-core processor wasn't much faster than two glued dual cores, at least with the software written 10 years ago that could take advantage of four cores.

Now AMD, mostly out of necessity, is doing the right thing. It's not financially strangling itself just to produce a product that would be 10% faster and cost much more to make, only to end up unable to sell it against Intel's products because of its price.
 

bug

Joined
May 22, 2015
Messages
13,843 (3.95/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
10 years ago AMD was saying how important it was to create a monolithic, true quad-core processor instead of gluing together two dual-core processors. And that was of course because Intel was gluing together dual cores back then. The end result showed that gluing cores was the correct thing to do. AMD was late with its quad-core model, while Intel moved faster in the market with its gluing plan. And for what? AMD's quad-core processor wasn't much faster than two glued dual cores, at least with the software written 10 years ago that could take advantage of four cores.

Now AMD, mostly out of necessity, is doing the right thing. It's not financially strangling itself just to produce a product that would be 10% faster and cost much more to make, only to end up unable to sell it against Intel's products because of its price.
We didn't have as many transistors on a chip 10 years ago, and we also weren't as close to silicon's physical limitations. There is the distinct possibility that a monolithic CPU made sense 10 years ago but doesn't today. I'm not saying that's the case, because I don't have all the data; I'm saying both of AMD's statements could be true, given the timeline.
 
Joined
Jun 28, 2016
Messages
3,595 (1.16/day)
Isn't 95%+ of the time during regular usage spent at idle, though? For mobile parts a ~10% idle deficit adds up to 36 minutes to 1+ hour over a normal usage cycle (6-10 hours).
It's way worse, actually.
It's 10% for the whole system. 4-core Kaby Lakes use ~5 W at idle; the rest is the platform.
I'm not saying the Zen CPU uses twice as much, but it doesn't look good either.
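To put rough numbers on that (a sketch only: the platform power and the 15% figure are assumptions, and only the ~5 W Kaby Lake idle draw comes from the post above):

```python
# If the 10-15% gap is measured at the wall for the whole system, the CPU-only
# gap can be much larger, because most of the idle draw is the platform
# (board, RAM, SSD, screen...), not the CPU. All inputs here are assumptions.
platform_idle = 25.0    # W, everything except the CPU (assumed)
intel_cpu_idle = 5.0    # W, roughly what a 4-core Kaby Lake idles at (per the post)
system_gap = 0.15       # 15% whole-system idle difference (assumed upper bound)

intel_system = platform_idle + intel_cpu_idle
amd_system = intel_system * (1 + system_gap)
amd_cpu_idle = amd_system - platform_idle
print(f"Implied CPU-only idle: {amd_cpu_idle:.1f} W vs {intel_cpu_idle:.1f} W "
      f"({amd_cpu_idle / intel_cpu_idle:.1f}x)")
```

With those assumed inputs the CPU-level difference comes out close to 2x even though the wall measurement only shows 15%.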
 
Joined
Sep 6, 2013
Messages
3,392 (0.82/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 7600 / Ryzen 5 4600G / Ryzen 5 5500
Motherboard X670E Gaming Plus WiFi / MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2)
Cooling Aigo ICE 400SE / Segotep T4 / Νoctua U12S
Memory Kingston FURY Beast 32GB DDR5 6000 / 16GB JUHOR / 32GB G.Skill RIPJAWS 3600 + Aegis 3200
Video Card(s) ASRock RX 6600 + GT 710 (PhysX) / Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes / NVMes, SATA Storage / NVMe, SATA, external storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) / 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
We didn't have as many transistors on a chip 10 years ago, and we also weren't as close to silicon's physical limitations. There is the distinct possibility that a monolithic CPU made sense 10 years ago but doesn't today. I'm not saying that's the case, because I don't have all the data; I'm saying both of AMD's statements could be true, given the timeline.
In the end, it didn't.

AMD's efforts to once again be the company that leads in technology, by designing the first true quad-core processor, led to delays in the server market:
AMD shoves delayed Barcelona chip out of the door • The Register (9 Apr 2008)
Until then, Intel was having a party gluing dual cores together:
Intel launches first quad-core chips (November 14, 2006)

Their ego gave the server market to Intel on a plate. Not that things were better in the consumer market.
Intel's Core 2 Extreme QX6700: The Multi-core Era Begins (November 2, 2006)

AMD managed to offer Phenom much later, a year later:
AMD's Phenom Unveiled: A Somber Farewell to K8 (November 19, 2007)
only to end up with a bug:
The "TLB Bug" Explained - AMD's B3 Stepping Phenom Previewed, TLB Hardware Fix Tested


As for the question of whether it was worth it, and why it proved not to be: here is what Anand found in his Phenom review.

Looking at the multi-threaded results we see that AMD does gain some ground, but the standings remain unchanged: Intel can't be beat. We can actually answer one more question using the Cinebench results, and that is whether or not AMD's "true" quad-core (as in four cores on a single die) actually has a tangible performance advantage to Intel's quad-core (two dual-core die on a single package).

If we look at the improvement these chips get from running the multi-threaded benchmark, all of the Phenom cores go up in performance by around 3.79x, while all the Intel processors improve by around 3.53x. There's a definite scaling advantage (~7%), but it's not enough to overcome the inherent architectural advantages of the Core 2 processors.

So AMD spent millions and millions of dollars, only to lose a huge part of the server and consumer markets in no time because of delays and, as the icing on the cake, to eat a nice TLB bug in its face. And for what? To get 7% extra performance by creating the first true quad-core CPU.
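For reference, the ~7% comes straight from the scaling figures quoted above:

```python
# Multi-threaded scaling figures from the quoted Cinebench comparison.
phenom_scaling = 3.79   # speedup on AMD's monolithic quad core
core2_scaling = 3.53    # speedup on Intel's glued (MCM) quad core
print(f"Scaling advantage of the monolithic die: ~{phenom_scaling / core2_scaling - 1:.0%}")
```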

AMD made a smart move by gluing CPUs together this time. Intel knows it, and it's just saying the same thing AMD was saying 10 years ago.

Advanced Micro Devices plans to produce quad-core chips from the middle of next year on 65nm. It says Intel’s chips are not true quad-core in that its rival has combined two dual-core chips.
 

bug

Joined
May 22, 2015
Messages
13,843 (3.95/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Eh, until the Athlon XP, AMD was always "the company that makes cheap x86 CPUs". They didn't take the server world by storm, but the Athlon XP put them on the map for many users.
Hindsight is always 20/20, but to me AMD is just the underdog that sometimes punches above their weight (and sometimes releases a Bulldozer). I like AMD; there was a time when I thought I'd never buy anything but AMD, but at the end of the day it's just business.
 
Joined
Dec 28, 2012
Messages
3,955 (0.90/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
You don't have a surprise for the competition if you parrot EVERYTHING to the whole world. It has NOTHING to do with R&D budget. I'm just surprised Ryzen disrupted the market as much as it did, considering how they parroted everything straight to Intel. They didn't even need to use leaks and corporate espionage; everything was served directly to them. I guess after having no competition for so long, even Intel didn't believe it was real or something...
Your inner Intel fanboi is showing. Since you seem so convinced: how does Infinity Fabric technically work, given you can read these slides? Could you take these slides and make a CPU using them as your only instructions?

No, you can't. Because all AMD has done is communicate ideas about how their tech works. They have never shown the technical details, and considering Intel has glued dies together before, AMD broadcasting the idea harms nobody. The only reason we have some understanding of Infinity Fabric is from reviews, but the actual IP, the method of its operation, is a closely held secret.

Seriously, you act like AMD released their internal transistor-level design or something. These are just investor PowerPoint slides; there is no technical detail here.
 
Joined
Oct 2, 2004
Messages
13,791 (1.87/day)
Look at this shit. Now I'm an Intel fanboy. Wasn't I a massive AMD fanboy just a week or two ago? Make up your damn minds, people lololololol
 

bug

Joined
May 22, 2015
Messages
13,843 (3.95/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Your inner Intel fanboi is showing. Since you seem so convinced: how does Infinity Fabric technically work, given you can read these slides? Could you take these slides and make a CPU using them as your only instructions?

No, you can't. Because all AMD has done is communicate ideas about how their tech works. They have never shown the technical details, and considering Intel has glued dies together before, AMD broadcasting the idea harms nobody. The only reason we have some understanding of Infinity Fabric is from reviews, but the actual IP, the method of its operation, is a closely held secret.

Seriously, you act like AMD released their internal transistor-level design or something. These are just investor PowerPoint slides; there is no technical detail here.
This time the truth really is in the middle. No, you can't surprise your competition if you spill your beans ahead of time, but no, AMD did not do that. Intel knew about AMD gluing dies together long before AMD put out their Infinity Fabric slides. How? They simply keep an eye on the talent AMD hires.
This is how Microsoft pretty much told the world Google was going to develop an OS: looking at the talent Google was hiring at the time, they found a striking resemblance to what Microsoft required from its Windows hires. And I assure you, there are fewer people in this world who can design a chip than there are software engineers.
 
Joined
Jul 25, 2009
Messages
147 (0.03/day)
Location
AZ
Processor AMD Threadripper 3970x
Motherboard Asus Prime TRX40-Pro
Cooling Custom loop
Memory GSkil Ripjaws 8x32GB DDR4-3600
Video Card(s) Nvidia RTX 3090 TI
Display(s) Alienware AW3420DW
Case Thermaltake Tower 900
Power Supply Corsair HX1200
The interconnect adds latency, doesn't it? I wonder what the actual numbers are.
 
Joined
Jun 28, 2016
Messages
3,595 (1.16/day)
The interconnect adds latency, doesn't it? I wonder what the actual numbers are.
You can look at Ryzen reviews. Yes, there is visible latency - one of the reasons why they don't perform very well in games (and a few other tasks as well).

But it's not at all harmful in many other scenarios, so the whole idea is still valid - just not perfect for gaming.
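If you want to poke at it yourself, here's a rough sketch (Linux-only; the core numbers at the bottom are placeholders you'd pick from your own topology with lscpu or lstopo). It ping-pongs a token between two pinned processes, so most of the measured time is OS and pipe overhead rather than the fabric itself; treat it only as a relative comparison between core pairs on the same CCX/die versus different ones, not as the real interconnect latency:

```python
# Crude core-to-core round-trip timing: two processes pinned to chosen cores
# exchange a token over a Pipe. Linux-only (sched_setaffinity).
import os
import time
from multiprocessing import Pipe, Process

ROUNDS = 50_000

def pong(core, conn):
    os.sched_setaffinity(0, {core})   # pin this process to one core
    for _ in range(ROUNDS):
        conn.recv()
        conn.send(1)

def measure(core_a, core_b):
    parent, child = Pipe()
    p = Process(target=pong, args=(core_b, child))
    p.start()
    os.sched_setaffinity(0, {core_a})
    t0 = time.perf_counter()
    for _ in range(ROUNDS):
        parent.send(1)
        parent.recv()
    elapsed = time.perf_counter() - t0
    p.join()
    return elapsed / ROUNDS * 1e6     # microseconds per round trip

if __name__ == "__main__":
    # Placeholder core numbers: pick one pair within a CCX and one pair
    # spanning CCXs/dies for your particular CPU.
    print("cores 0-1:", round(measure(0, 1), 2), "us/round trip")
    print("cores 0-4:", round(measure(0, 4), 2), "us/round trip")
```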
 
Joined
Sep 17, 2014
Messages
22,673 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
You don't have a surprise for the competition if you parrot EVERYTHING to the whole world. It has NOTHING to do with R&D budget. I'm just surprised Ryzen disrupted the market as much as it did, considering how they parroted everything straight to Intel. They didn't even need to use leaks and corporate espionage; everything was served directly to them. I guess after having no competition for so long, even Intel didn't believe it was real or something...

Once again, the reality check...

AMD's current CPU design is just four dual-core Intels with fast interconnects, and then another fast interconnect for big dies. Intel already has this interconnect too, it just uses it differently. Intel's HT is AMD's later SMT. Tell me again, please: what exactly has AMD shared with the world that's new and doesn't suck? Has it occurred to you that many of the features you see in Ryzen are Intel ideas put into practice and proven as effective designs?

Or let's look at GPUs. AMD has still NOT implemented all the features you find on Nvidia GPUs, their boost is clunky as hell, and several features don't even work properly. Meanwhile, Nvidia is rocking a GP100 and GV100 with HBM that actually sell at a margin, while AMD is struggling to get it functional on a consumer GPU, let alone sell it without a loss. Again, what's new here from AMD? Mantle perhaps? Oh wait, DX12 beat them to it, and back then we also heard your argument, which made just as little sense as it does today. It's Q3 2017 and we can count Vulkan games on one hand.

You may or may not have noticed, but in the semicon architecture business, all the architects are having a real close look at each other's designs, taking the best there is, and iterating from there. If you think a few press slides with block diagrams describe an architecture at the level of what it really is, you really need to stop drinking the Kool-Aid.
 
Joined
Oct 2, 2004
Messages
13,791 (1.87/day)
No it hasn't. DX12 exists because AMD pushed the low-level API idea. Mantle existed long before DX12. And hell, Mantle as a whole went on to replace OpenGL in the form of Vulkan. But AMD didn't du nuffin for the industry, right? Also, what features? PhysX? Even NVIDIA didn't invent that; they just bought the IP from Ageia and adapted it for CUDA. Which is still total utter garbage even on a GTX 1080 Ti. Killing Floor 2 and Cryostasis are entirely unplayable with it on a bloody GTX 1080 Ti. And one of them is like a decade-old game now...
 
Joined
Sep 17, 2014
Messages
22,673 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
No it hasn't. DX12 exists because AMD pushed the low-level API idea. Mantle existed long before DX12. And hell, Mantle as a whole went on to replace OpenGL in the form of Vulkan. But AMD didn't du nuffin for the industry, right? Also, what features? PhysX? Even NVIDIA didn't invent that; they just bought the IP from Ageia and adapted it for CUDA. Which is still total utter garbage even on a GTX 1080 Ti. Killing Floor 2 and Cryostasis are entirely unplayable with it on a bloody GTX 1080 Ti. And one of them is like a decade-old game now...

We're talking architectural block diagrams and you pop around the corner speaking of PhysX and reiterating the nonsensical rumor of DX12 being based on Mantle, which is WCCFTech-level garbage that speaks volumes about how little you know and what you read.

On an architectural level, ever since GCN and Ryzen, AMD has only been chasing the industry leaders, even though they like to say this is not the case and that they are revolutionizing everything they touch. And at every turn where they refuse to follow the industry leaders, they produce failures. This ranges from the modular approach of the FX processors, to the way-too-late implementation of delta compression (Nvidia was first), tiled rendering (Nvidia was first), and pushing heavily on core clocks (Nvidia was first); hell, even the fast interconnect isn't an AMD idea, just the way they implemented it is original (and at the same time, not even a novel thing in the industry). I could go on with probably ten more examples of AMD following the popular and leading design choices of the industry.

But at least GCN is a great arch for (semi-)pro markets, right? Much better than Nvidia's CUDA, which most applications now speak fluently. AMD is bleeding market share for everything related to GPUs, so why the F would any competitor look at their slides and steal their ideas...

'AMD pushed the low-level API idea'... have you forgotten 3dfx? Have you had a look at Vulkan and DX12 adoption rates?
 
Joined
Oct 2, 2004
Messages
13,791 (1.87/day)
I literally NEVER said DX12 is based on Mantle. Start reading things and understanding context. This, however, is what I said: DX12 exists because AMD pushed the low-level API idea. Quite a difference between "based on" and "pushed the idea", isn't it? As for architecture, tiling is also nothing new; it wasn't invented by NVIDIA either, as it existed long ago and was used in practice in the PC segment as well (Kyro). People should stop undermining AMD's firsts and achievements. People just always dismiss them as unimportant because they don't fuel their narratives. When NVIDIA does it, everyone is raving about it, but when AMD does it, everyone just kind of prefers to look the other way in ignorance. It's slowly getting on my nerves (and then people quickly call me an AMD fanboy for being annoyed by that, and then in almost the same breath call me an Intel fanboy because of some other post I made somewhere...).

As for 3dfx, no, I haven't forgotten it. I guess you just don't realize that without Glide, we'd probably never have gotten Direct3D, or we'd have gotten it significantly later than we did. The problem with Glide was that it was entirely proprietary. The same goes for Mantle, which is why Mantle also didn't take off. The story changes with Vulkan, which is fully open. The reason it's not widely adopted is that games aren't small five-figure projects anymore. In the past, when a new DX came out, they could make a new game based on it in a very short time. Now it's a multi-million-dollar project. Things take time, and that's why adoption is so slow. But realistically, DX12 support is reasonably wide given the situation. It all comes down to engine support: if Unreal and Unigine both support DX12 and Vulkan, adoption will spread quickly; otherwise it's limited to individual studios that decide to use it. Things have changed since the year 2000 or so, when we basically had the Unreal and Quake engines running pretty much everything. Today even Unreal isn't all that popular, as studios mostly opt to write their own engines.
 
Joined
Jul 10, 2015
Messages
754 (0.22/day)
Location
Sokovia
System Name Alienation from family
Processor i7 7700k
Motherboard Hero VIII
Cooling Macho revB
Memory 16gb Hyperx
Video Card(s) Asus 1080ti Strix OC
Storage 960evo 500gb
Display(s) AOC 4k
Case Define R2 XL
Power Supply Be f*ing Quiet 600W M Gold
Mouse NoName
Keyboard NoNameless HP
Software You have nothing on me
Benchmark Scores Personal record 100m sprint: 60m
RejZoR, you should learn when to stop. You are too emotional, like a true fanboy. Ryzen is good, but IPC is Haswell-ish. Stop partying like it's 2013.
 

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
19,671 (2.86/day)
Location
w
System Name Black MC in Tokyo
Processor Ryzen 5 7600
Motherboard MSI X670E Gaming Plus Wifi
Cooling Be Quiet! Pure Rock 2
Memory 2 x 16GB Corsair Vengeance @ 6000Mhz
Video Card(s) XFX 6950XT Speedster MERC 319
Storage Kingston KC3000 1TB | WD Black SN750 2TB |WD Blue 1TB x 2 | Toshiba P300 2TB | Seagate Expansion 8TB
Display(s) Samsung U32J590U 4K + BenQ GL2450HT 1080p
Case Fractal Design Define R4
Audio Device(s) Plantronics 5220, Nektar SE61 keyboard
Power Supply Corsair RM850x v3
Mouse Logitech G602
Keyboard Dell SK3205
Software Windows 10 Pro
Benchmark Scores Rimworld 4K ready!
Ryzen is good, but IPC is Haswell-ish.

Here are some charts of CPUs running at 2.8 GHz. Haswell IPC ain't to be sneezed at, because cores and progress, etc. You know the argument. I find The Witcher 3 720p benchmarks the most hilarious.
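For what it's worth, the usual way to put a number on "IPC" from benchmark scores is to compare at the same clock (as those charts do) or to normalize the score by clock. A trivial sketch with placeholder scores, not figures from the charts:

```python
# Per-clock performance ratio from benchmark scores; placeholder inputs only.
def per_clock_ratio(score_a, clock_a_ghz, score_b, clock_b_ghz):
    return (score_a / clock_a_ghz) / (score_b / clock_b_ghz)

# e.g. two CPUs both locked at 2.8 GHz, hypothetical scores:
print(f"{per_clock_ratio(160, 2.8, 150, 2.8):.2f}x per-clock advantage")
```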
 
Joined
Oct 21, 2005
Messages
7,074 (1.01/day)
Location
USA
System Name Computer of Theseus
Processor Intel i9-12900KS: 50x Pcore multi @ 1.18Vcore (target 1.275V -100mv offset)
Motherboard EVGA Z690 Classified
Cooling Noctua NH-D15S, 2xThermalRight TY-143, 4xNoctua NF-A12x25,3xNF-A12x15, 2xAquacomputer Splitty9Active
Memory G-Skill Trident Z5 (32GB) DDR5-6000 C36 F5-6000J3636F16GX2-TZ5RK
Video Card(s) ASUS PROART RTX 4070 Ti-Super OC 16GB, 2670MHz, 0.93V
Storage 1x Samsung 990 Pro 1TB NVMe (OS), 2x Samsung 970 Evo Plus 2TB (data), ASUS BW-16D1HT (BluRay)
Display(s) Dell S3220DGF 32" 2560x1440 165Hz Primary, Dell P2017H 19.5" 1600x900 Secondary, Ergotron LX arms.
Case Lian Li O11 Air Mini
Audio Device(s) Audiotechnica ATR2100X-USB, El Gato Wave XLR Mic Preamp, ATH M50X Headphones, Behringer 302USB Mixer
Power Supply Super Flower Leadex Platinum SE 1000W 80+ Platinum White, MODDIY 12VHPWR Cable
Mouse Zowie EC3-C
Keyboard Vortex Multix 87 Winter TKL (Gateron G Pro Yellow)
Software Win 10 LTSC 21H2
AMD's logic is sound; there is nothing wrong with MCM. The Q6600 was an MCM of two dual cores, and AMD at the time blasted Intel for not having a "true" quad like the Phenom; regardless, the Q6600 was the better processor at the time, right up until Phenom II. If it gives you the performance you need at the price you need, because you get the proper yield, then so be it. The latency between the dies might be a problem, but I guess you can make up for that shortcoming with core count and thread speed.
 