Monday, February 20th 2023

AMD Ryzen 9 7950X3D Runs First Benchmarks

AMD's upcoming Ryzen 9 7950X3D processor will bring 16 cores and 32 threads along with 16 MB of L2 cache and 128 MB of L3 cache, for a total of 144 MB of cache on the package, 64 MB of it stacked 3D V-Cache. Today, we get to see it in action for the first time in benchmarks such as Blender for 3D content creation and Geekbench 5 for synthetics, where we can compare its scores to existing models. In Blender, the new AMD Ryzen 9 7950X3D scores 558.59 points, while the regular Ryzen 9 7950X scores 590.28 points. This represents a 5.4% regression from the original model; however, we have yet to see how other content creation benchmarks suit the new CPU.

In Geekbench 5 synthetics, the upcoming Ryzen 9 7950X3D scores 2,157 points in the single-core test and 21,841 points in the multi-core test. The regular Ryzen 9 7950X reaches around 2,246 points in single-core and 25,275 points in multi-core, making it somewhat faster than the new cache-enhanced Ryzen 9 7950X3D design. Some of these results suggest that the 4.2 GHz base frequency of the Ryzen 9 7950X3D plays a significant role in the comparison, given that the regular Ryzen 9 7950X runs a 4.5 GHz base clock. Both designs share the same 5.7 GHz boost speed, so we have yet to see more benchmarks showing other differences induced by the larger cache.
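For reference, here is a quick sketch of how the percentage gaps above fall out of the raw scores (a minimal example using only the figures quoted in this article; higher is better in both benchmarks):

```python
# Scores quoted above (higher is better in both benchmarks).
blender = {"7950X3D": 558.59, "7950X": 590.28}
gb5_single = {"7950X3D": 2157, "7950X": 2246}
gb5_multi = {"7950X3D": 21841, "7950X": 25275}

def regression(scores: dict) -> float:
    """Percentage by which the 7950X3D trails the regular 7950X."""
    return (scores["7950X"] - scores["7950X3D"]) / scores["7950X"] * 100

print(f"Blender:    {regression(blender):.1f}% slower")    # ~5.4%
print(f"GB5 single: {regression(gb5_single):.1f}% slower") # ~4.0%
print(f"GB5 multi:  {regression(gb5_multi):.1f}% slower")  # ~13.6%
```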
Source: via Tom's Hardware

76 Comments on AMD Ryzen 9 7950X3D Runs First Benchmarks

#51
Fierce Guppy
Godrilla: I once had an i7-980X at a 4.3 GHz all-core OC that lasted 10 years and was able to play Metro Exodus at extreme settings at 3440x1440 (no RT) with a 1080 Ti FTW3 at 60 FPS. The best way to make a CPU last almost a decade, as in my case, is to upgrade the resolution. Unfortunately, 8K gaming is nowhere near within grasp due to the stagnation of display technology and the slow adoption of DP 2.x standards. Even without upgrading your monitor resolution, there is always downscaling from a higher resolution to a lower one to further improve image quality and keep yourself GPU-bound longer, as with DSR, DLDSR, NVIDIA's DLAA AI anti-aliasing, and AMD's Virtual Super Resolution. As GPUs become more powerful, I believe these downscaling techniques will become more popular, especially if there is performance on the table.
Many people talk down buying high-end PC gear, but I want good performance and a long time between PC upgrades. The bedroom PC I built in Jan 2009 runs Battlefield 4 very smoothly. A couple of years after BF5 arrived, I swapped its CPU for a (6-core) Xeon X5690 to get BF5 running just as smoothly. If it gets old and crusty, replace the CPU with a Xeon. However, your i7-980X is probably in the ballpark of an X5690 already, which just goes to show that paying for the good stuff from the get-go means all one has to care about for many years to come is buying a new graphics card.

I might try downscaling to see if it improves BF2042 performance on the 2009 PC. Runs like a slideshow atm.
#52
freeagent
Fierce Guppy: However, your i7-980X is probably in the ballpark of an X5690 already,
It’s the same CPU.

The 980X is actually better, because it has memory dividers that can be useful if you have good memory.
#53
Sara bnt yazn
This processor is different from the 5800X3D: that one's clock speed was much lower than the 5800X's, yet it performed far better in games by a large margin. Now the 7900X3D and 7950X3D processors will come at the same speeds as the original X-version parts, and in the future the differences between processors in games will be similar to the differences between graphics cards. And God knows best what people will invent.
#54
JustBenching
las: The 7800X3D will probably beat the 13900KS in most CPU-bound games, while costing half as much and using far fewer watts
And then when you compare it to a 13600K, you realize it only beats it by 10% while costing way more and being way slower in everything but games. Yikes.
john_: We are in an era where Intel has taken over the market by selling higher core counts, with most of those cores usually being E-cores. For many, 24 cores are 24 cores. They close their eyes to the fact that 16 of those are E-cores. And never, I mean NEVER, ask themselves "What if we had 24 P-cores? What difference in performance would we have witnessed?"
Now AMD comes with X3D chips that will probably win many gaming benchmarks, and because the non-X3D chips are already on the market and we know what those chips can do with an unlocked TDP, people try to find flaws to paint a negative image of those X3D chips. Intel's use of E-cores might translate to something like a 50+% loss in multi-core performance compared to what P-cores would offer, but no, we should start a revolution because X3D chips will be 10% slower in the game of... Cinebench.

These chips are what people were asking AMD for: to integrate X3D cache into high-end models, not just the 8-core one. AMD did it, and now the only thing we have to do is wait and see whether that extra cache and the benchmark results in games can justify those prices.
The difference is that E-cores make the CPU cheaper (less die space) while providing MORE MT performance. The 3D cache makes the CPU more expensive while hindering performance in everything but games. So it's not really a good analogy.
#55
armit
AMD processors are very unstable. Even if you record phenomena such as freezes and hangs on video and send it to the after-sales service center, they judge the defective product to be a normal one, because in the case of a continuing defect the AMD CPU would have to be refunded in accordance with the Consumer Protection Act. Intel, in the same situation, issued a recall.

In the end, the Ryzen and the ASUS B450 board were discarded and replaced with an Intel CPU and board, after which there were no problems except for heat.
#56
AnotherReader
The Ryzen CPUs with stacked cache will have lower clock frequencies than their regular counterparts until the stacked cache die is moved below the main die. That is possible: AMD's MI300 already does it.
#57
HD64G
Something else that most people tend to ignore or forget: the die without the 3D cache will not be hampered in clocks, so in all apps apart from the heavily multi-threaded ones, the CPU will match the 7950X, and in cache-sensitive ones it will easily beat it. Personally, I cannot find any con in the 7900X3D and 7950X3D CPUs, especially with their power limits being absolutely sensible.
#59
imeem1
So, around 5% slower outside of gaming vs. the 7950X. Would it, in theory, be 5% faster in gaming?
#60
Godrilla
imeem1: So, around 5% slower outside of gaming vs. the 7950X. Would it, in theory, be 5% faster in gaming?
It might have better 0.1% lows and less frame variance. Even if the average gain is only a 5% delta, if the 0.1% lows are much better even at higher, more practical resolutions like 4K, it will be a win IMO.
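For anyone unfamiliar with the metric, "0.1% lows" summarize the slowest frames rather than the average. Here is a minimal sketch with made-up frame-time data (conventions vary slightly between reviewers; this one takes the average FPS across the slowest 0.1% of frames):

```python
import random
from statistics import mean

# Made-up frame times in milliseconds, purely for illustration.
random.seed(1)
frame_times_ms = [max(1.0, random.gauss(7.0, 1.5)) for _ in range(10_000)]

avg_fps = 1000 / mean(frame_times_ms)

# 0.1% low: average FPS across the slowest 0.1% of frames.
slowest = sorted(frame_times_ms, reverse=True)[: max(1, len(frame_times_ms) // 1000)]
low_fps = 1000 / mean(slowest)

print(f"average FPS: {avg_fps:.1f}")  # around 143 with this fake data
print(f"0.1% lows:   {low_fps:.1f}")  # noticeably lower
```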
#61
Avro Arrow
I really think that AMD made a huge mistake by creating the R9-7900X3D and R9-7950X3D. We've already seen from tests with the R7-5800X3D that the increased cache does little to nothing to improve productivity performance, while the reduced (and restricted) clock speeds do a lot to hinder it. I think that people who buy these APUs are going to be unhappy with them because, generally, gamers (almost) never buy 12 or 16-core chips; we (almost) always buy 6 or 8 cores. I fear that the R9 X3D APUs will end up being a bit of a debacle for AMD because I don't see them selling well at all. Prosumers are, on average, more tech-savvy than gamers. Sure, there are gamers who are every bit as tech-savvy as any prosumer, but there are also kids who game on Alienwares and don't have a clue about PC tech, so, on average, prosumers are more tech-savvy. Besides the lack of sales that I foresee, those who do take the chance and buy one won't be satisfied with it, which is especially bad considering the price.

There's also the huge opportunity that AMD missed by not having an R5-7600X3D. We've seen that X3D can have huge performance advantages in gaming and, while I'm sure that the R7-7800X3D will sell well, AMD could've been the undisputed king of the gaming CPU market with an R5-7600X3D. Any competition from Intel in that space would've been a joke at best. AMD might change its mind and end up releasing an R5-7600X3D, but by then it might be too little, too late, because if gamers are forced to buy something else, like the R5-7600(X) or i5-13600K, they're not going to throw money down on a new (and more expensive) APU to get that extra performance after they've already shelled out once.

I believe that AMD's product choices with regard to the Ryzen 7000-series X3D parts have been based on greed and that greed is going to cost them in the long-term. When a decision like this is made for short-term benefit, sooner or later the future comes along and plants a big, juicy bite right on the buttocks of the company that made said decision. It happened to Intel because of their refusal to offer more than four cores in a CPU for too long and I foresee it happening to AMD. They just never seem to learn, eh?
#62
A Computer Guy
Avro Arrow: There's also the huge opportunity that AMD missed by not having an R5-7600X3D. We've seen that X3D can have huge performance advantages in gaming and, while I'm sure that the R7-7800X3D will sell well, AMD could've been the undisputed king of the gaming CPU market with an R5-7600X3D. Any competition from Intel in that space would've been a joke at best. AMD might change its mind and end up releasing an R5-7600X3D, but by then it might be too little, too late, because if gamers are forced to buy something else, like the R5-7600(X) or i5-13600K, they're not going to throw money down on a new (and more expensive) APU to get that extra performance after they've already shelled out once.
I suspect a 7600X3D would have cut too close into 5800X3D sales. If you were already on AM4, a 5800X3D would be the final stop for many looking for a worthwhile upgrade who don't need or necessarily want more cores. Maybe it's greed, or perhaps they are trying to hit some target that makes something look good on paper because they know they are going to take some losses with current trends going into 2023.
#63
Avro Arrow
A Computer Guy: I suspect a 7600X3D would have cut too close into 5800X3D sales. If you were already on AM4, a 5800X3D would be the final stop for many looking for a worthwhile upgrade who don't need or necessarily want more cores.
I honestly don't know how an R5-7600X3D would affect R7-5800X3D sales because they're on completely different platforms. The only people who would be interested in the 5800X3D would be people who already own AM4. There's no way to make the performance of the R5-7600X3D over the R7-5800X3D worth the cost of an expensive B650 motherboard and new DDR5 RAM. I don't think that it would have any effect on people who already own AM4 because the 5800X3D is already amazing and no new motherboard or RAM is required. OTOH, people who don't already own AM4 aren't interested in the 5800X3D because AM4 is a dead platform. If they're going to invest new money, they're going to go for the AM5 APU over the AM4 CPU. The R7-5800X3D is only worth it if you already have an AM4 motherboard while an R5-7600X3D would only be worth it if you didn't already have an AM4 motherboard. There's no way that could possibly change because the adoption cost of AM5 isn't cheap and people will either want to put off paying for AM5 as long as possible or would be averse to spending money on a dead platform instead of investing in the new one. I really think that this is a non-issue.

Besides, even if you were right, all that would mean to AMD is more motherboard sales and I'm sure that wouldn't be considered a bad thing. I don't see that being their reasoning.
A Computer Guy: Maybe it's greed, or perhaps they are trying to hit some target that makes something look good on paper because they know they are going to take some losses with current trends going into 2023.
That sounds a lot more plausible but if it's true, AMD's greed is causing them to be galactically stupid here.

I say this because I believe the R9-7900X3D and R9-7950X3D to be complete wastes of time and resources on AMD's part. As a rule, people who buy 12 and 16-core R9s don't buy them primarily for gaming. Sure, they might game a bit on them, but their primary use will always be productivity. The only people who buy those specifically for gaming are people with more money than brains, and while such people surely exist, there aren't enough of them to make a product successful.

We've all seen from the tests on the R7-5800X3D that the cheaper R7-5800X beats it in productivity because of its faster stock clocks and its ability to be overclocked. The 3D cache has a net-negative impact on productivity because, while very few (if any) productivity suites benefit from the extra cache, they all suffer from the clock speed restrictions. People looking for a gaming APU will choose the R7-7800X3D while people looking for a productivity APU will choose the R9-7900X or R9-7950X. After all, why would anyone pay more for a product that they know is going to be inferior for their purposes than an APU that costs a good deal less?

If your answer is "nobody" then you win the prize. In this case, the prize is AMD losing a crap-tonne of money when they could've made an absolute killing with an R5-7600X3D. The very expensive (not just to buy but to produce) R9-7900X3D and R9-7950X3D will gather dust on the shelves and AMD will be forced to take a huge loss on them. Even worse, AMD's brand-image will be damaged because knowingly bringing a useless product to market which will inevitably result in returns from unsatisfied customers is easily the best way to drag your own name through the mud. I honestly can't believe that Lisa Su signed off on it because she's a lot smarter than this.
#64
AnotherReader
Avro Arrow: The very expensive (not just to buy but to produce) R9-7900X3D and R9-7950X3D will gather dust on the shelves and AMD will be forced to take a huge loss on them. Even worse, AMD's brand-image will be damaged because knowingly bringing a useless product to market which will inevitably result in returns from unsatisfied customers is easily the best way to drag your own name through the mud.
The difference between the 7950X3D and the 7950X in productivity is rather small; the higher-clocked CCD helps there. The gap is almost entirely due to the higher power limit of the 7950X, which arguably makes the 7950X3D the saner product. Moreover, if you're buying a CPU for some work-related multithreaded application, you should look at the benchmarks for that particular application. For some scientific workloads, the 7950X3D is significantly faster than the 7950X. For other workloads, the 13900K is faster. As always, for specialized workloads, don't look at the general rating; look at workloads similar to yours.

#65
A Computer Guy
Avro Arrow: I honestly don't know how an R5-7600X3D would affect R7-5800X3D sales because they're on completely different platforms. The only people who would be interested in the 5800X3D would be people who already own AM4. There's no way to make the performance of the R5-7600X3D over the R7-5800X3D worth the cost of an expensive B650 motherboard and new DDR5 RAM. I don't think that it would have any effect on people who already own AM4 because the 5800X3D is already amazing and no new motherboard or RAM is required. OTOH, people who don't already own AM4 aren't interested in the 5800X3D because AM4 is a dead platform. If they're going to invest new money, they're going to go for the AM5 APU over the AM4 CPU. The R7-5800X3D is only worth it if you already have an AM4 motherboard while an R5-7600X3D would only be worth it if you didn't already have an AM4 motherboard. There's no way that could possibly change because the adoption cost of AM5 isn't cheap and people will either want to put off paying for AM5 as long as possible or would be averse to spending money on a dead platform instead of investing in the new one. I really think that this is a non-issue.

Besides, even if you were right, all that would mean to AMD is more motherboard sales and I'm sure that wouldn't be considered a bad thing. I don't see that being their reasoning.
Just spitballing some numbers: $350 for the 5800X3D (Newegg sale right now is $310, but ignore that for a moment).

About $570 for an AM5 board with DDR5 and a 7600X3D (estimated).

$570 - $350 = $220; when the cheaper AM5 boards come out and DDR5 price reductions knock off at least $100, you're close to $120.

If you are on an older AM4 X300 or X400 board with Zen, Zen+, or Zen 2, it might be more appealing to jump to the growing AM5 platform with a 7600X3D for the minor difference in price and a significant bump in performance.
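Running those spitballed numbers (all prices are the rough estimates above, not official figures):

```python
# Hypothetical prices from the post above (USD).
am4_5800x3d = 350     # 5800X3D, ignoring the $310 sale
am5_bundle = 570      # estimated: 7600X3D + AM5 board + DDR5
expected_drops = 100  # cheaper AM5 boards + DDR5 price cuts

premium_now = am5_bundle - am4_5800x3d
premium_later = premium_now - expected_drops
print(f"AM5 premium today: ${premium_now}")     # $220
print(f"after price drops: ~${premium_later}")  # ~$120
```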

But since it's not available, AMD can milk the 5800X3D for a bit longer. Motherboard sales are in a bit of a slump right now, so AMD is in a unique position to move CPU inventory on AM4 with CPU upgrades.
#66
ThrashZone
john_: We are in an era where Intel has taken over the market by selling higher core counts, with most of those cores usually being E-cores. For many, 24 cores are 24 cores. They close their eyes to the fact that 16 of those are E-cores. And never, I mean NEVER, ask themselves "What if we had 24 P-cores? What difference in performance would we have witnessed?"
Now AMD comes with X3D chips that will probably win many gaming benchmarks, and because the non-X3D chips are already on the market and we know what those chips can do with an unlocked TDP, people try to find flaws to paint a negative image of those X3D chips. Intel's use of E-cores might translate to something like a 50+% loss in multi-core performance compared to what P-cores would offer, but no, we should start a revolution because X3D chips will be 10% slower in the game of... Cinebench.

These chips are what people were asking AMD for: to integrate X3D cache into high-end models, not just the 8-core one. AMD did it, and now the only thing we have to do is wait and see whether that extra cache and the benchmark results in games can justify those prices.
Hi,
Yeah, those E-cores have worse thermals than the Skylake-X 79 series, but at least that one was easy to delid :laugh:
#67
Avro Arrow
AnotherReader: The difference between the 7950X3D and the 7950X in productivity is rather small; the higher-clocked CCD helps there. The gap is almost entirely due to the higher power limit of the 7950X, which arguably makes the 7950X3D the saner product. Moreover, if you're buying a CPU for some work-related multithreaded application, you should look at the benchmarks for that particular application. For some scientific workloads, the 7950X3D is significantly faster than the 7950X. For other workloads, the 13900K is faster. As always, for specialized workloads, don't look at the general rating; look at workloads similar to yours.

The problem is that it doesn't change the price or the lack of value. All anyone has to do if they want sane power draw and temps from an R9-7950X is enable Eco Mode, which completely nullifies the advantage to which you refer. If I'm a prosumer who wants a processor for productivity, I'm going to choose the R9-7950X because it's a lot less expensive and performs better in productivity workloads, even if the difference isn't huge. I can then decide whether I want Eco Mode or all-out performance and/or overclocking, while paying less for it. If I also want to game with it, that's no problem either, because the R9-7950X is still a top-tier gaming APU, matching the i9-12900K for much less money than the R9-7950X3D.

Even at 1080p (a resolution that nobody will game at anyway), the gaming performance difference between the R9-7950X and R9-7950X3D would be imperceptible while the productivity performance difference would be quite obvious. Therefore, it's a bad product for prosumers.

For gamers, it's an even worse value proposition because the R7-7800X3D is coming out. The reason we gamers only choose 6 or 8 cores is that all you get beyond that is a bunch of expensive cores sitting idle and eating expensive power for no reason.

The title of Steve Walton's review says it all and confirms everything that I've said about AMD's moronic product choices:

AMD Ryzen 9 7950X3D Review: Gamers, Don't Buy This One! Check Out Our 7800X3D Simulated Results Instead (https://www.techspot.com/review/2636-amd-ryzen-7950x3d/)

If you're wondering what "Simulated 7800X3D Results" are, Steve did something pretty ingenious by disabling the R9-7950X3D's conventional CCD so that the APU was only using the CCD with the 3D cache on it, simulating the performance of the R7-7800X3D. I've seen this done before for similar purposes, but I didn't see any other reviewer try it with the R9-7950X3D. Since Steve did, though, we can see just how terrible a product the R9-7950X3D is.


Now, simulations like this are never 100% accurate because the standalone chip usually performs better than the one that was cut in half. This only makes things worse for the R9-7950X3D. I actually expected this because the 3D cache makes CPU and RAM speeds essentially irrelevant (within the same generation), so even the R7-7800X3D running at lower clock speeds than the R9-7900X3D/7950X3D won't matter. The R9-7900X3D will be a real dumpster fire because it's going to have only six cores in its 3D-imbued CCD and will therefore perform in games like an R5-7600X3D would.

I said before that AMD made a colossal mistake by creating R9 APUs instead of R5 APUs with 3D cache. I caught A LOT of flak from fools who can't see the bigger picture but I didn't care because you can't fix stupid. I said that I hoped I was wrong but I knew that I wasn't because AMD isn't magic and you can't make a CPU that's a "best choice" for both gaming and productivity at the same time. The vindication is bittersweet though because it doesn't change the fact that every member of AMD's executive leadership belongs in Arkham Asylum for this.

AMD made the most galactically stupid decision I've ever seen them make by producing two APUs that couldn't succeed (R9-7900X3D and R9-7950X3D) instead of one that couldn't fail (R5-7600X3D). This is not a new concept; it really didn't take a genius to foresee, and I don't know why so few people did.
#68
A Computer Guy
Avro Arrow: Even at 1080p (a resolution that nobody will game at anyway),
I think this is incorrect. Plenty of people will game at 1080p for reasons I won't bother to enumerate. 1080p gaming is still the reality for probably most people, because of their hardware, the game, and/or the desired experience.
Avro Arrow: For gamers, it's an even worse value proposition because the R7-7800X3D is coming out. The reason we gamers only choose 6 or 8 cores is that all you get beyond that is a bunch of expensive cores sitting idle and eating expensive power for no reason.
I think the 7950X3D will make a lot of sense for some people, not just for gaming but for gaming plus streaming or other activities. You can still pin your games to the 3D-cached cores while streaming and other activities run on the remaining cores. From that vantage point it's pretty useful, especially in a multi-monitor setup.
Just yesterday I was playing TF2 on my 5950X with YouTube Music playing and browsing the web between respawns. To my understanding, the cores don't eat expensive power, as you describe, when they are not being used.
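As a concrete sketch of that kind of core pinning, here is a minimal example using the third-party psutil library. The PIDs and the CCD-to-core mapping are placeholder assumptions; logical-CPU numbering varies by system, so check your own topology before pinning anything:

```python
import psutil

# Assumed mapping: on many dual-CCD 16-core Ryzens, logical CPUs 0-15
# sit on CCD0 (the V-Cache die on a 7950X3D) and 16-31 on CCD1.
VCACHE_CORES = list(range(0, 16))
FREQ_CORES = list(range(16, 32))

def pin(pid: int, cores: list[int]) -> None:
    """Restrict the process with the given PID to the given logical CPUs."""
    psutil.Process(pid).cpu_affinity(cores)

# Usage (PIDs are placeholders for your game and streaming software):
# pin(game_pid, VCACHE_CORES)  # game gets the 3D V-Cache CCD
# pin(obs_pid, FREQ_CORES)     # encoder runs on the higher-clocked CCD
```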
Avro Arrow: Now, simulations like this are never 100% accurate because the standalone chip usually performs better than the one that was cut in half.
I'm not so sure about that; I think the margins of difference would be really small, if any. I have a 3800X and a 3950X, and from what I recall, cutting the 3950X in half basically equaled the 3800X in the benchmarks I ran on my machines at the time. That's not to say the same will happen with the 7800X3D/7950X3D, but I think it's a reasonable expectation. Testing will bear out whether that expectation was realistic, but I wouldn't expect a huge difference.
#69
AnotherReader
Avro Arrow: If you're wondering what "Simulated 7800X3D Results" are, Steve did something pretty ingenious by disabling the R9-7950X3D's conventional CCD so that the APU was only using the CCD with the 3D cache on it, simulating the performance of the R7-7800X3D. I've seen this done before for similar purposes, but I didn't see any other reviewer try it with the R9-7950X3D. Since Steve did, though, we can see just how terrible a product the R9-7950X3D is.
@W1zzard also simulated the 7800X3D by disabling a CCD. As for the rest of your points, I agree that it doesn't make sense for a gaming-only CPU. However, I disagree on the pointlessness of the 7900X3D and the desirability of the 7600X3D. If the former is pointless, then a 7600X3D would be pointless too. As far as productivity is concerned, for most users, the 7950X is the better option. However, productivity isn't a one-size-fits-all case, and there are non-gaming workloads where the 7950X3D dominates the 7950X. For productivity, you should always research what suits your application and buy that.

The 7950X3D isn't stupid; it appeals to the braggarts and the spendthrifts among us. As Nvidia has proven, that is a winning strategy. It's also suitable for those who have workloads that need the multi-threaded performance and want to both game and work on the same system.
#70
Avro Arrow
A Computer Guy: I think this is incorrect. Plenty of people will game at 1080p for reasons I won't bother to enumerate. 1080p gaming is still the reality for probably most people, because of their hardware, the game, and/or the desired experience.
You didn't read what I wrote correctly. I said EVEN at 1080p... which INCLUDES 1080p.
A Computer Guy: I think the 7950X3D will make a lot of sense for some people, not just for gaming but for gaming plus streaming or other activities.
No, it really won't.
A Computer Guy: You can still pin your games to the 3D-cached cores while streaming and other activities run on the remaining cores. From that vantage point it's pretty useful, especially in a multi-monitor setup.
You can already do that with an R9-7950X for $100 less. It has the same gaming performance as the i9-12900K, which makes it an incredible gaming APU in its own right and definitely NOT in need of an increase in gaming performance. The R9-7950X would actually be MORE suitable for that, because it's already so fast in gaming that adding the 3D cache won't make any appreciable difference no matter what resolution you use.


You didn't think that, just because AMD released this stupid abomination of an R9 X3D APU, the R9-7950X suddenly started to suck at gaming, did you? :laugh:
A Computer Guy: Just yesterday I was playing TF2 on my 5950X with YouTube Music playing and browsing the web between respawns. To my understanding, the cores don't eat expensive power, as you describe, when they are not being used.
I'm afraid that you're wrong. As long as they are active, they do eat power, just not as much as when they're being used. It's called "Idle power draw" and it has always been a thing. The only way that they don't use power is if they're disabled in the BIOS.
A Computer Guy: I'm not so sure about that; I think the margins of difference would be really small, if any.
That's exactly the point. The margins will be similar, and they could easily price an R5-7600X3D at the same price as or higher than the R7-7700X, which would make it even MORE profitable, because I can guarantee you that a little extra cache silicon doesn't cost AMD very much.
A Computer Guy: I have a 3800X and a 3950X, and from what I recall, cutting the 3950X in half basically equaled the 3800X in the benchmarks I ran on my machines at the time. That's not to say the same will happen with the 7800X3D/7950X3D, but I think it's a reasonable expectation. Testing will bear out whether that expectation was realistic, but I wouldn't expect a huge difference.
I would actually expect the R7-3800X to perform slightly better than the R9-3950X with one CCD disabled.
#71
A Computer Guy
Avro Arrow: You didn't read what I wrote correctly. I said EVEN at 1080p... which INCLUDES 1080p.
You also said "(a resolution that nobody will game at anyway)", hence my reply to that.
Avro Arrow: You didn't think that, just because AMD released this stupid abomination of an R9 X3D APU, the R9-7950X suddenly started to suck at gaming, did you? :laugh:
Of course not. I'm not sure what you are driving at here.
#72
Avro Arrow
AnotherReader: @W1zzard also simulated the 7800X3D by disabling a CCD. As for the rest of your points, I agree that it doesn't make sense for a gaming-only CPU. However, I disagree on the pointlessness of the 7900X3D and the desirability of the 7600X3D. If the former is pointless, then a 7600X3D would be pointless too.
You're 100% wrong. If what you say here were true, then there would be no R7-5800X3D and no R7-7800X3D. The 3D cache is beneficial in GAMING and ONLY GAMING. Therefore, a CPU like the R7-5800X3D or an APU like the R7-7800X3D or R5-7600X3D would be extremely attractive to gamers (as the R7-5800X3D has been), because gamers tend to buy 6 or 8-core processors.
AnotherReader: As far as productivity is concerned, for most users, the 7950X is the better option. However, productivity isn't a one-size-fits-all case, and there are non-gaming workloads where the 7950X3D dominates the 7950X.
You think that the odd win of no more than 10% is "domination"? I have seen nothing in any of these tests that says "This APU is worth $100 more than the R9-7950X", and neither have you; you just don't realise it.
AnotherReader: The 7950X3D isn't stupid; it appeals to the braggarts and the spendthrifts among us. As Nvidia has proven, that is a winning strategy.
No, nVidia hasn't proven that at all. What nVidia has proven is that if your competition has historically had unstable drivers, as long as you make sure that yours are stable, people will flock to you because they don't know better. THEN you can charge whatever you want.

If AMD has made a very limited number of these, then sure, it's not that bad. The fact that they didn't make an R5-7600X3D, an APU that would have guaranteed their dominance over Intel in the gaming space, IS that bad. I'm not even saying this for me, because I HAVE an R7-5800X3D which means that I have no interest in buying anything from the first generation of AM5. What AMD did is push god-only-knows how many consumers into Intel's arms instead of releasing an R5-7600X3D and getting thousands more people onto the AM5 platform with it, guaranteeing future APU sales because of their platform philosophy that started with AM4.

You're either relatively new to PC tech or you just don't pick up on historical patterns. That doesn't mean they're not there.
AnotherReader: It's also suitable for those who have workloads that need the multi-threaded performance and want to both game and work on the same system.
Again, you're only proving how little you understand PC tech, because the R9-7950X can already do that. It has the gaming performance of the i9-12900K, and any attempt to increase its gaming performance will go unnoticed because of how good it already is. As I said to another person, do you think that just because this X3D abomination came out, the R9-7950X suddenly started to suck at gaming? I honestly wonder if you're both young, because nobody with any real level of expertise would use such broken arguments. As if you need to spend an extra $100 USD just to do something that can already be done with the extant R9-7950X. Only a newbie would think that, and I don't say that to be insulting; I say it because it's true. Anyone with any degree of experience and expertise immediately sees the R9-7950X3D for what it is: a shameless and useless cash-grab. That you would defend it with a bunch of broken arguments says a lot more about you than it does about AMD.

That you would say that the utility of 3D cache in the 6-core R5-7600X is no different than the utility in the 16-core R9-7950X means that you really don't have much understanding of PC tech, how it's used and what is good for what. I honestly can't take you seriously after these words.
#73
A Computer Guy
Avro Arrow: I'm afraid that you're wrong. As long as they are active, they do eat power, just not as much as when they're being used. It's called "Idle power draw" and it has always been a thing. The only way that they don't use power is if they're disabled in the BIOS.
My understanding was that with Zen-series cores (default configuration), when you weren't using a core it was parked and hence not active, thus not using any meaningful power. I could be wrong.
Avro Arrow: I would actually expect the R7-3800X to perform slightly better than the R9-3950X with one CCD disabled.
Well, to my recollection, CCD1 was on par with my 3800X and CCD2 was a tad slower.
#74
Avro Arrow
A Computer Guy: My understanding was that with Zen-series cores (default configuration), when you weren't using a core it was parked and hence not active, thus not using any meaningful power. I could be wrong.


A Computer Guy: Well, to my recollection, CCD1 was on par with my 3800X and CCD2 was a tad slower.
Well, you're wrong. Even parked cores use some power, just very little. Being parked is like sleep, not shutdown.
#75
95Viper
Stay on topic.
Stop your bickering...
Stop the insulting remarks.
Have a civil discussion... if you cannot do this... back out and take a breather.

And, another bit of advice from the guidelines:
Reporting and complaining
All posts and private messages have a "report post" button on the bottom of the post, click it when you feel something is inappropriate. Do not use your report as a "wild card invitation" to go back and add to the drama and therefore become part of the problem.