
Some AMD Ryzen 5 7600X Processors Made with Dual-CCD Packages

5800X3D is good, but it's not magic. The 12700(F) is still a good CPU; it will perform similarly in games, and you still get extra threads for other applications.
The 12700 and 5800X3D are not even in the same ballpark in a large number of games. Check the recent 5800X vs. 5800X3D comparison article.
Overall it beats even the 12900K, and it beats it in the games where it matters most: the ones that are CPU-bound rather than GPU-bound.

And it manages to do all this at lower power too.
So you have 4 extra (E-core) threads... who cares, honestly? We're talking about 16 vs 20 threads (math spelled out below), and both have 8 P-cores with SMT regardless.

It's definitely magic: magic cache... the X3D is really positioned perfectly for most consumer use cases, in every metric: power, perf/W, FPS/W, gaming latency/frame times, and even MT perf is more than fine.

I mean, if you compare the X3D to stuff lower in the stack, sure, you can get a more cost-competitive gaming CPU. But when you want perf at the top end... the 12700 is old news.
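To spell out the thread math in this exchange (a trivial sketch; the core configurations are from the public specs):

# 12700: 8 P-cores with SMT plus 4 single-threaded E-cores -> 20 threads
# 5800X3D: 8 cores, all with SMT -> 16 threads
p_cores, e_cores = 8, 4
threads_12700 = p_cores * 2 + e_cores * 1  # 16 + 4 = 20
threads_5800x3d = 8 * 2                    # 16
print(threads_12700, threads_5800x3d)      # 20 16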
 
You can find a 12400 for $150 on eBay, and it can do 5 GHz+ easily.
Lol, Intel fanboys are smart as f: pay $100 more for DDR5 plus $200 for a board that will allow overclocking a non-K Intel, all for less than a 10% improvement :roll:, and let's not forget the second-hand 12400F from eBay :roll:
Meanwhile, the cheapest B350 or B450 for $50-60 is enough to overclock both the RAM and the R5 5600.
 
The 12700 and 5800X3D are not even in the same ballpark in a large number of games. Check the recent 5800X vs. 5800X3D comparison article.
Overall it beats even the 12900K, and it beats it in the games where it matters most: the ones that are CPU-bound rather than GPU-bound.
Yeah, if you still count 1080p as relevant data. The reality is that most modern CPUs get you very high framerates, and once you go to 4K, games are GPU-limited anyway.

The 5800X3D is an amazing deal now that AMD has dropped prices a whole lot; it wasn't as great when it cost more than the 12700.
 
It's actually quite simple: while attempting to make a 7900X, one of the CCDs somehow became sufficiently damaged to force its scrapping.

Instead of scrapping the whole CPU, they simply disabled any and all links to that CCD and re-used the other chiplets for "a lower model" CPU.

This is actually a good thing because, were this a monolithic chip, the whole CPU would have to be scrapped.

I wouldn't be surprised if this actually occurred more often, and not just with Zen 4 either.
Why is it that we have to explain this with every processor launch, every GPU launch, ever? Like, do people just have the worst memory ever? For some reason, it seems like every year the same things have to be explained over and over again, because people just can't fathom that silicon would be harvested for lower parts.
 
You can also disable cores with a monolithic chip, and chiplets themselves can have cores cut down as well: see AMD's 2*5 chiplet CPUs.

You missed this bit of what I said:

sufficiently damaged to force its scrapping

Damage "of this magnitude" would force the scrap of the entire chip, were it monolithic.

With chiplets, so long as it's not the IO die, it's possible to salvage it by using the other, undamaged chiplet, just like in the OP's example.
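As a concrete illustration of the harvesting logic described above, here's a minimal Python sketch. The SKU names are the real Zen 4 lineup, but the mapping from good-core counts to SKUs is a simplified assumption; the actual binning criteria (clocks, leakage, cache defects) are AMD-internal:

# Hypothetical die-harvesting decision for a two-CCD Zen 4 package.
def harvest_sku(ccd0_cores: int, ccd1_cores: int) -> str:
    # Simplifying assumption: a CCD ships with 8, 6, or 0 active cores;
    # anything under 6 good cores gets fused off entirely.
    usable = sorted((c if c >= 6 else 0 for c in (ccd0_cores, ccd1_cores)),
                    reverse=True)
    if usable == [8, 8]:
        return "Ryzen 9 7950X"   # two fully working 8-core CCDs
    if usable[1] >= 6:
        return "Ryzen 9 7900X"   # two CCDs with 6+ good cores each
    if usable[0] == 8:
        return "Ryzen 7 7700X"   # one good 8-core CCD, the other disabled
    if usable[0] >= 6:
        return "Ryzen 5 7600X"   # one 6-core CCD, the other disabled
    return "scrap"               # no usable CCD left

# The package from the article: one CCD with 6 good cores, the 2nd CCD
# physically present but severed from the IO die.
print(harvest_sku(6, 0))  # -> Ryzen 5 7600X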
 
avoiding waste by recycling stuff?

that's awesome, tbf... and that's also what most people do not understand :p

(electronic stuff... I recycle a lot, resell some of what I don't need to those who need it, and offer to either buy it back or dispose of it when they no longer need it, or direct them toward an adequate disposal point for it)

although at that price... I'd take a 5800X for the same money and be content with the performance upgrade, while not needing to upgrade the rest (mobo/RAM)
 
avoiding waste by recycling stuff?

I doubt it: my guess is that placing the 1st CCD gave no problems, but with the 2nd there was "an accident" and it was no longer viable for a 7900X.

In order to avoid scrapping the entire thing, AMD simply severed all connections between the 2nd CCD and the 1st CCD / IO die, and then "transformed it" into a 7600X.
 
Did der8auer ever say how thick the new CPU cover is?
 
Besides recycling failed 7900Xs, another option is that they just needed more 7600Xs to put on shelves, so they cut down working 7900X packages; manufacturing cost is not that different at the end of the day, and a sold product is better than a warehoused product.
 
It is a common practice with all chip manufacturers (Intel, AMD, NV, Samsung, etc.).
You can also disable cores with a monolithic chip, and chiplets themselves can have cores cut down as well: see AMD's 2*5 chiplet CPUs.
What is sad with Zen 4 is that AMD keeps all the profit from this saving to themselves.
2*5? What 2*5? I'll be mildly surprised if 10-core models aren't 6+4.
 
2*5? What 2*5? I'll be mildly surprised if 10-core models aren't 6+4.
I see your surprise, but they already exist, and if I'm not mistaken, Zen 4 with 10 cores (2*5) is coming for OEMs (not for us consumers). You will always prefer the 1:1 outlet.
 
they already exist, and if I'm not mistaken, Zen 4 with 10 cores (2*5) is coming for OEMs (not for us consumers).

Where? There hasn't been any 10-core Zen CPU so far; what makes you think one will come in the future? It's probably too close to an 8-core CPU for it to make sense to use 2 chiplets, with the added latency that incurs.

I'd love to see an 8+4 myself, an alternative take on big.LITTLE of sorts. Not that it's a configuration that makes any sense, just something I think could be cool.
 
I doubt it: my guess is that placing the 1st CCD gave no problems, but with the 2nd there was "an accident" and it was no longer viable for a 7900X.

In order to avoid scrapping the entire thing, AMD simply severed all connections between the 2nd CCD and the 1st CCD / IO die, and then "transformed it" into a 7600X.
It's exactly what recycling semi-defective higher-end products would mean to me; different explanation, same result ;)
 
It's exactly what recycling semi-defective higher-end products would mean to me; different explanation, same result ;)

That's not what I mean: I think both CCDs were good for a 7900X prior to their "installation", but something went wrong and the 2nd CCD became too damaged in the assembly stage, which is what forced the switch from a 7900X to a 7600X.

It's different from "installing" CCDs with known defects in the CPU.

Come to think of it, BOTH CCDs were "already slightly defective" to begin with, or they would have been used for a 7950X and NOT a 7900X.
 
That's not what I mean: I think both CCDs were good for a 7900X prior to their "installation", but something went wrong and the 2nd CCD became too damaged in the assembly stage, which is what forced the switch from a 7900X to a 7600X.

It's different from "installing" CCDs with known defects in the CPU.

Come to think of it, BOTH CCDs were "already slightly defective" to begin with, or they would have been used for a 7950X and NOT a 7900X.
That's still recycling a CPU with a defective CCD to make a lower-tier CPU, thus: waste avoidance.
 
5800X3D is good, but it's not magic. The 12700(F) is still a good CPU; it will perform similarly in games, and you still get extra threads for other applications.
It's magic.

It keeps up with the best of the best CPUs in gaming, while using drastically less power.
TPU doesn't cover 0.1% lows, but getting double the low FPS of the 13900K at 45 W vs 110 W?
Magic.
Not a single other CPU comes close on the 0.1% values.
[attached charts: 0.1% low FPS and CPU power draw comparisons]
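For scale, a quick back-of-the-envelope sketch on those figures (taking the quoted "double the lows at 45 W vs 110 W" at face value; the exact FPS values are in the attached charts):

# FPS-per-watt advantage = FPS ratio * inverse power ratio
x3d_watts, raptor_watts = 45, 110
lows_ratio = 2.0  # 5800X3D's 0.1% lows vs the 13900K's, as quoted above
efficiency_ratio = lows_ratio * (raptor_watts / x3d_watts)
print(f"~{efficiency_ratio:.1f}x the 0.1%-low FPS per watt")  # ~4.9x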
 
It's magic.

It keeps up with the best of the best CPUs in gaming, while using drastically less power.
TPU doesn't cover 0.1% lows, but getting double the low FPS of the 13900K at 45 W vs 110 W?
Magic.
Not a single other CPU comes close on the 0.1% values.
Sorry, but that is very irrelevant. When AMD sticks 3D cache into a laptop, then maybe we can talk about it.
 
I doubt it: my guess is that placing the 1st CCD gave no problems, but with the 2nd there was "an accident" and it was no longer viable for a 7900X.

In order to avoid scrapping the entire thing, AMD simply severed all connections between the 2nd CCD and the 1st CCD / IO die, and then "transformed it" into a 7600X.

Or have software that cuts it off at boot. There's microcode in every CPU as far as software is concerned; it dictates the brand and/or model upon boot, too.
 
Sorry, but that is very irrelevant. When AMD sticks 3D cache into a laptop, then maybe we can talk about it.
Sorry, HOW is it irrelevant?
Or did you change the topic and we forgot to update the thread's title?


A blatantly incorrect statement was made, and it got refuted by multiple people with evidence to back it up.
 