
AMD Ryzen 9 7950X3D

Xbox Game Bar and the Balanced power plan are needed for it to run properly?! lol wow, that's a big no... smh

GN's YouTube review, around the 26m mark... wow, wtf... they're needed for Windows to park the higher-frequency cores. Looks like the 7800X3D will be AMD gamers' CPU of choice.


EDIT: that's all I needed to hear... not interested in any more reviews of these chips
Yes, the 7800X3D will be the AMD gamers' CPU of choice if they're not doing productivity work. This was never really in question. The 7950X3D is pretty clearly a content creator + gamer CPU. It is excellent at both.
13900KF on Amazon.com: $568
7950X3D: $700 + mandatory DDR5

I could run a 13900KF at full load, 4 hours a day, for 5 years before the extra power cost reaches the $132 difference. I'd have to use the 13900KF for more than 10 years to justify the price difference between them.

You can buy whatever you want, but the power consumption of a high-end computer is almost nothing compared to its retail price.

Both are excellent processors, but the Intel is cheaper and better in applications, and it's almost a tie in games. You choose.
If you buy a 13900KF and don't buy DDR5, it will be slower than the 7950X3D in almost everything. Value-wise they come out a bit closer than you would think, as the 13900K requires more expensive cooling than the 7950X3D if you're actually using it for productivity, whereas the 7950X3D gets by just fine with the Noctua NH-U14S shown in the temperature test in the TPU review.
 
Would love to know how this CPU does in Hogwarts Legacy and Witcher 3 remastered with maxed out settings + ray tracing.
 
Page title says 4090 but the charts say 3080, which is it?
All charts should be saying 4090 right now; if there's one that I missed, let me know. I tested on a 4090.

A 5800X3D comparison would have been nice.
Added, charts are rendering right now

Would love to know how this CPU does in Hogwarts Legacy and Witcher 3 remastered with maxed out settings + ray tracing.
I've started work on a 50+ game comparison article today. No plans for Witcher 3 Remastered with RT though, just classic Witcher 3; the current release is a trainwreck.
 
If you buy a 13900KF and don't buy DDR5, it will be slower than the 7950X3D in almost everything.
This sentence needs proof. However, if I'm going to buy one of them and I already have DDR4, I can easily save €300 and get almost the same performance. Maybe a little worse, but many users won't notice it, and it's a lot of money: 300 + 130 = €430, maybe for the GPU, maybe for a future refresh.

As far as I can see, this time the AMD option is a lot more expensive for similar performance.
 
I'm aware Puget's got some other performance benchmarks in there. Is there a chance of a test against DaVinci Resolve and Lightroom? Lightroom especially.
 
Thanks w1zzard

For a gamer (me), not so interested in these higher-core-count, expensive parts, but glad we got to see the (7800X3D-like) CCD1-disabled results. I'm blown away by the efficiency of these X3D models... better than I expected.

AM5 as a platform with forward-gen support just got a little more interesting for me. Looking forward to the 7800X3D review.

I've started work on a 50+ game comparison article today. No plans for Witcher 3 Remastered with RT though, just classic Witcher 3; the current release is a trainwreck.

wow!! can't wait! Hope it includes the 7800X3D and 13700K :)
 
175% more efficient than the 13900K in gaming. Intel needs new management over at their CPU R&D department :roll:
 
This sentence needs proof. However, if I'm going to buy one of them and I already have DDR4, I can easily save €300 and get almost the same performance. Maybe a little worse, but many users won't notice it, and it's a lot of money: 300 + 130 = €430, maybe for the GPU, maybe for a future refresh.

As far as I can see, this time the AMD option is a lot more expensive for similar performance.

If you have DDR4 and you "won't notice" the performance loss, then you're wasting money on the wrong CPUs. Try a 5800X3D or Core i5 13600K.
 
If you have DDR4 and you "won't notice" the performance loss, then you're wasting money on the wrong CPUs. Try a 5800X3D or Core i5 13600K.
[attached: five benchmark screenshots]


I could continue, but I think the point is clear. You need to update your biases about DDR4 and DDR5 (and I'm writing this from a computer with DDR5).
 
At $0.40 per kWh, the price difference would cover 330 kWh of electricity. That's 330,000 watt-hours.

In the multithreading test, AMD hit 140 watts, Intel 276. That's a difference of 136 watts, which works out to 2,426 hours of operation. At 4 hours a day, 365 days a year, it would take you 606 days of operation, or 1.7 years, to break even.

Now, if you are going off of the application tests, AMD pulls 79 watts, Intel 169. A difference of 90 watts: 3,667 hours of operation. At 4 hours per day, that's 917 days, 2.5 YEARS, to break even.

If breaking even after two and a half YEARS on $132 in electricity is a major concern for you, you are not in the market for $500+ CPUs in the first place. The "muh power bill" arguments, simply put, do NOT work out. You're fretting over the cost of gas while browsing Ferraris.
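For anyone who wants to check the math, here's a minimal sketch of that break-even calculation in Python, using only the rate and wattage figures quoted above:

```python
# Break-even check for the 13900KF vs 7950X3D price gap, using the
# figures above: $132 price difference, $0.40/kWh electricity, and
# the measured power draws from the review.
PRICE_GAP_USD = 132
RATE_USD_PER_KWH = 0.40
HOURS_PER_DAY = 4

def break_even_years(watts_intel: float, watts_amd: float) -> float:
    """Years of use before the power savings cover the price gap."""
    delta_kw = (watts_intel - watts_amd) / 1000    # extra draw in kW
    kwh_budget = PRICE_GAP_USD / RATE_USD_PER_KWH  # 330 kWh covered by $132
    hours = kwh_budget / delta_kw                  # hours until break-even
    return hours / (HOURS_PER_DAY * 365)

print(f"Multithread (276 W vs 140 W): {break_even_years(276, 140):.1f} years")  # ~1.7
print(f"Applications (169 W vs 79 W): {break_even_years(169, 79):.1f} years")   # ~2.5
```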
4 hrs? If I'm not actively using the PC for something intensive, it's running AV1 encoding almost all the time, so we're talking more like 16 hrs/day. That's basically less time than between generations, AND I'm on a superior platform (AM5), AND I'm not cooler-constrained and can use my favorite air cooler, AND I have a better CPU all along. And also, Ferraris? When we're talking about $132? My guy, stop using these brain-dead analogies, because you have no clue what you're saying. :roll:
 
Guys, calm down. Why bother wasting your time arguing with that f-something guy? He said I was uneducated in another post, and I simply set him to ignore.
You can never wake someone who pretends to be asleep, you know? Believe me, not seeing those stubborn posts will give you back a pleasant day.

Anyway, I was worried about scheduling problems too. Turns out they did a very good job. That efficiency just shines. And I always prefer the simplest, in this case the simpler 7800X3D for me.
Competition is finally back after all these years. Enjoy the journey.
 
It's a nice CPU, but damn, what am I reading?

 
The 5800X3D is going to go down as an "all-timer" piece of hardware. An exceptional icon ahead of its time. Like the 1080 Ti.
 
This sentence needs proof. However, if I'm going to buy one of them and I already have DDR4, I can easily save €300 and get almost the same performance. Maybe a little worse, but many users won't notice it, and it's a lot of money: 300 + 130 = €430, maybe for the GPU, maybe for a future refresh.

As far as I can see, this time the AMD option is a lot more expensive for similar performance.

If someone has something to throw a fit about in terms of price, it’s the cost of motherboards.

I just purchased a 32 GB kit of DDR5 M-die Dominators for $30 less than what I spent on 32 GB of DDR4 Dominators.

There's also very little reason to pay the premium that high-end X670E boards sit at this gen. Unless you need as much connectivity as possible, the only benefit a board can give you for overclocking on AM5 (and it's largely the same on AM4) is the ability to tune RAM. Most midrange boards have adequate VRMs that won't melt; the enthusiast-tier boards are just so ridiculously overbuilt that there's no point buying them.
 
Will the computer break at 1000 FPS?
 
Thank you for the good review, @W1zzard. I have been excited, waiting an entire month for it. Really good stuff. Hopefully the industry continues to innovate and do well on all sides.

Thanks for rendering/adding the 5800X3D scores. Looking forward to that being updated soon.
 
All charts should be saying 4090 right now; if there's one that I missed, let me know. I tested on a 4090.


Added, charts are rendering right now


I've started work on a 50+ game comparison article today. No plans for Witcher 3 Remastered with RT though, just classic Witcher 3; the current release is a trainwreck.
thank you
 
This sentence needs proof. However, if I'm going to buy one of them and I already have DDR4, I can easily save €300 and get almost the same performance. Maybe a little worse, but many users won't notice it, and it's a lot of money: 300 + 130 = €430, maybe for the GPU, maybe for a future refresh.

As far as I can see, this time the AMD option is a lot more expensive for similar performance.
The 12600K was 4% slower at 1080p in games with DDR4 3600 CL14 than with DDR5 6000 CL36 in Hardware Unboxed's memory scaling review. I'd expect the 13900K to suffer a similar performance reduction. That would put the 13900K at around the 13700K's numbers in this review, and make it pretty obviously slower than the 7950X3D. DDR4 3600 CL14 isn't even all that cheap, either. For example, the G.Skill Flare X5 DDR5 6000 CL36 2x16 GB kit is $110 right now on PCPartPicker, while the cheapest DDR4 3600 CL14 2x16 GB kit is $100. Cost savings for RAM have become negligible.

Besides, I don't see the point in 'saving money' when you're buying $500+ flagships, and if money is no issue, there's not much separating the 13900K vs the 7950X3D. However, if you're just gaming and want value, you're better off with something like a 13700K/13600K + DDR4.
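To put a rough number on "negligible": a quick back-of-the-envelope in Python, using the kit prices above and assuming the ~4% DDR4 gaming penalty carries over from the 12600K (which may not hold exactly on a 13900K):

```python
# Back-of-the-envelope on the DDR4 "savings". Prices are the kit prices
# quoted above; the ~4% gaming penalty is an assumption carried over
# from the 12600K scaling data, not a measured 13900K result.
ddr5_price, ddr4_price = 110, 100   # G.Skill Flare X5 vs cheapest DDR4 3600 CL14
perf_ddr5, perf_ddr4 = 1.00, 0.96   # relative 1080p gaming performance

savings = ddr5_price - ddr4_price
perf_lost_pct = (perf_ddr5 - perf_ddr4) * 100
print(f"DDR4 saves ${savings} but gives up ~{perf_lost_pct:.0f}% gaming performance")
print(f"That's ${savings / perf_lost_pct:.2f} saved per percent of performance lost")
```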
 
One question lingers in my mind about these new X3D CPUs: how does gaming performance compare between the X3D and non-3D chips when using the integrated graphics?
 
Will the computer break at 1000 FPS?
No, its flux capacitor will begin to operate at 1000 FPS and send you back into the future... and then back again to 1985.
 
Mobile Ryzen 7k is looking good
 
4 hrs? If I'm not actively using the PC for something intensive, it's running AV1 encoding almost all the time, so we're talking more like 16 hrs/day. That's basically less time than between generations, AND I'm on a superior platform (AM5), AND I'm not cooler-constrained and can use my favorite air cooler, AND I have a better CPU all along. And also, Ferraris? When we're talking about $132? My guy, stop using these brain-dead analogies, because you have no clue what you're saying. :roll:
Where does this silly notion that you can't use your favorite cooler on Intel come from? I'm using a U12A; it happily runs y-cruncher at 330 watts on a 13900K.
 
The 12600K was 4% slower at 1080p in games with DDR4 3600 CL14 than with DDR5 6000 CL36 in Hardware Unboxed's memory scaling review. I'd expect the 13900K to suffer a similar performance reduction. That would put the 13900K at around the 13700K's numbers in this review, and make it pretty obviously slower than the 7950X3D. DDR4 3600 CL14 isn't even all that cheap, either. For example, the G.Skill Flare X5 DDR5 6000 CL36 2x16 GB kit is $110 right now on PCPartPicker, while the cheapest DDR4 3600 CL14 2x16 GB kit is $100. Cost savings for RAM have become negligible.

Besides, I don't see the point in 'saving money' when you're buying $500+ flagships, and if money is no issue, there's not much separating the 13900K vs the 7950X3D. However, if you're just gaming and want value, you're better off with something like a 13700K/13600K + DDR4.

The real winner here is that the 7950X3D is 175% more efficient than the 13900K in gaming. That is substantial.
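In case anyone wonders where a "175% more efficient" number comes from, it's a frames-per-watt ratio. A minimal sketch, with purely illustrative FPS and wattage placeholders rather than the actual review figures:

```python
# How a "175% more efficient" headline falls out of frames-per-watt.
# The FPS and wattage values below are illustrative placeholders only,
# not the actual numbers from the TPU review.
def frames_per_watt(fps: float, watts: float) -> float:
    return fps / watts

amd = frames_per_watt(fps=220, watts=55)     # hypothetical 7950X3D gaming load
intel = frames_per_watt(fps=215, watts=148)  # hypothetical 13900K gaming load
print(f"{(amd / intel - 1) * 100:.0f}% more efficient")  # ~175% with these inputs
```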
 