
AMD Ryzen 7 7800X3D

And at idle, it's widely known that Intel CPUs are more efficient than AMD's.
Certainly, though I don't test or report idle. Idle measured on the 12V CPU line is also misleading; you need to measure the idle power draw of the whole platform.

In gaming you have results at least 30W higher than usual
Source for "usual"? Do they measure using the CPU's software sensors or with test equipment?

In CP2077 it's at least 40–50 W less than that
At higher resolutions when GPU-limited, yes. 720p: 196.8 W, 1080p: 200.5 W, 1440p: 197.7 W, 4K: 135.8 W (13900K)
 
Nice review. I'm wondering if you can use slower memory, like you could with the 5800X3D, without a big performance hit.
 
indeed... THIS IS THE WAY!

(we need a formidably unyielding performing straight-face mandalorian emoji)
What we need is this little turnip :pimp:
 
Enthusiasm is at its maximum. Of course, to maximize profit, supply will be limited and prices tailored accordingly. I anticipate ~500 euros in Europe for this processor.

However, let's ask some questions.
1. Is it the best gaming processor?
Definitely yes! And the most energy-efficient, from what we can see. The condition is having a suitable video card, one from the top 5 at the moment.
2. Is it the best-performing Ryzen 7 in applications?
Of course not! Here the price/performance ratio is disastrous. Comparing the encoding and rendering results from the review, it doesn't even beat my 13500, which costs half as much; for the price of this processor alone (market price, not MSRP) I can put together a whole CPU-motherboard-RAM platform.
3. Is it worth paying so much for this processor if I have a low-end to mainstream video card?
Although no one tests that kind of configuration (maybe they don't have AMD's agreement, I don't know), I bet not.

It's not Intel versus AMD. It's just the fine print that everyone needs to see. I am convinced that the enthusiasts on this topic will be very disappointed if reliable reviewers ever compare these processors (X3D, non-X3D, Intel) using a video card from the range found in probably more than 80% of the gaming systems on the planet.
 
It's not Intel versus AMD. It's just the fine print that everyone needs to see. I am convinced that the enthusiasts on this topic will be very disappointed if reliable reviewers ever compare these processors (X3D, non-X3D, Intel) using a video card from the range found in probably more than 80% of the gaming systems on the planet.
Right, because among the enthusiasts on this topic, you are the only one who understood the scaling needed between CPU and GPU performance and the notion of bottleneck? Indeed, how come nobody ever thought of that before :roll:
 
Got it shipped Tuesday with a B650E-E Strix, i'm happy.

PS: 4070ti PNY's
 
The issue reviewers face is that everyone games differently. For the average gamer who just maxes out settings and plays at 1440p/4K, those results are likely going to be the most important, but there are also gamers who play at competitive settings, and in that scenario the 720p/1080p results might be the most important. It's impossible for a reviewer to cover all the bases. Even though I tend to game at 4K with maxed-out settings, I still look at the 1080p 1% low data as the most important result when purchasing a CPU.
@W1zzard covers the academic stuff but also the more important real-life results, like 1% lows and high-resolution results.

Seems to me lots of reviewers are just enthusiasts like us, but don't necessarily know how to properly review hardware, at least not in a way that actually helps people make purchasing decisions.
 
I really look forward to playing the Witcher 3 remaster with this CPU. I love the game, but it is TERRIBLY optimized in regards to CPU usage, so this should help a lot.
 
So, like a lot of people, I have been paying close attention to the 7800X3D. While the gaming performance is up to roughly 37% faster at 1440p, it's either only slightly faster or in fact slower than my 5900X in non-gaming tasks.

The total cost of upgrading to the new platform would be close to £1000, and honestly it's not looking like it's worth it. I would be better off trading in my 3080 for a 4080/7900 XTX (which would give me up to 45% more performance over my 12 GB 3080) and save myself a few £££ at the same time.
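The reasoning above can be sketched as a quick back-of-the-envelope comparison of uplift bought per pound spent. The 37%/45% figures come from the post itself; the £1000 platform cost is quoted, but the ~£800 net cost of the GPU swap is purely a placeholder assumption for illustration:

```python
# Back-of-the-envelope upgrade-value comparison using the rough figures
# quoted above. The GPU-swap cost is an illustrative assumption, not a quote.

def uplift_per_pound(uplift_percent: float, cost_pounds: float) -> float:
    """Performance uplift (in %) bought per pound spent."""
    return uplift_percent / cost_pounds

# Option 1: full platform upgrade (CPU + board + RAM), ~37% gaming uplift.
platform = uplift_per_pound(37.0, 1000.0)

# Option 2: trade the 3080 for a faster GPU, ~45% uplift; assume the swap
# nets out around £800 after the trade-in (placeholder figure).
gpu_swap = uplift_per_pound(45.0, 800.0)

print(f"platform: {platform:.3f} %/£")   # 0.037 %/£
print(f"gpu swap: {gpu_swap:.3f} %/£")   # 0.056 %/£ -> better value here
```

With those assumed numbers the GPU swap buys more performance per pound, which matches the poster's conclusion; change the placeholder cost and the answer can flip.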
 
Scalpers are already at it, charging $700 as I predicted, making about $35 on the hustle. I guess every penny counts.
 
@W1zzard covers the academic stuff but also the more important real-life results, like 1% lows and high-resolution results.

Seems to me lots of reviewers are just enthusiasts like us, but don't necessarily know how to properly review hardware, at least not in a way that actually helps people make purchasing decisions.

He's one of the best if not the best at trying to cover every base.
 
I'd say if you were to criticize anything, it would be the lack of a slightly bloated Windows install to simulate someone's actual gaming PC. But it's not just him; the rest of the tech reviewers don't test it either, even as a secondary methodology. Multiplayer games are rarely tested as well, and I'm not talking about built-in benchmarks. HUB is one of the few places that does.
 
I'd say if you were to criticize anything, it would be the lack of a slightly bloated Windows install to simulate someone's actual gaming PC. But it's not just him; the rest of the tech reviewers don't test it either, even as a secondary methodology. Multiplayer games are rarely tested as well, and I'm not talking about built-in benchmarks. HUB is one of the few places that does.
A bit of a philosophical question. Adding bloat is very unpredictable though. I do have VBS enabled, because it's the fresh OS installation default.

Multiplayer games are terrible to test reliably, because they will just release a patch and that invalidates all previous testing. For me that means retesting these CPUs:
 
Can anyone replicate these results?


5.403 GHz OC?
 
Right, because among the enthusiasts on this topic, you are the only one who understood the scaling needed between CPU and GPU performance and the notion of bottleneck? Indeed, how come nobody ever thought of that before :roll:
I bet many don't?
 
He's one of the best if not the best at trying to cover every base.

I totally agree, and on top of that, the presentation of all the gathered data/info is flawless! Very easy to fall back on and quickly pull relevant material. Things are just getting better and better, e.g. the addition of min. FPS charts, game-relevant power consumption charts, etc. A fulfilling experience, considering at some point in the past I did ask for both of these additions. The only thing missing for me is a Mandalorian "this is the way" emoji hehe

Can anyone replicate these results?


5.403 GHz OC?

Unlike the 5800X3D, it's great to see the 7800X3D and co. being unlocked. If I were buying one, I can't see myself delving into the complexity of it all, but it would have been nice to break through what seems like a deliberate 5 GHz MBC cap.

How sensitive is the stacked cache to raised voltages/temperatures?
 
Scalpers are already at it, charging $700 as I predicted, making about $35 on the hustle. I guess every penny counts.
I haven't ever paid a dime to any scalper, whether it be for hardware/tickets or really anything else! Getting scalped is a choice, albeit not an easy one in rare cases where you really need something.
 
I bet many don't?

Mate, just get over it. Most of that 80% segment, if not all, wouldn't even buy $450+ gaming processors or look to purchase top-end $800+ graphics cards. If some buyers just fancy the bee's knees of a processor, overkill or not when paired with an inferior GPU, I don't see that being a problem either. Look at it from another perspective: it's also believable that most gamers are on lower-tier to mid-tier GPUs from the previous 2-3 generations. Even the bottom-level ~$200-ish 13th Gen or 7000-series chips in various play configurations will be bottlenecked by these entry-level or just-above-average GPUs (depending on title type/resolution/quality presets/etc.). That shouldn't mean we shouldn't upgrade. In all frankness, I wouldn't mind GPU bottlenecks over CPU constraints, seeing how expensive graphics cards are. CPUs are more affordable and offer way more than just gaming, on top of opening doors to future GPU investments.

Also, when I look at processor reviews, regardless of whether it's a $200 chip or a $600 one, I'm expecting uncapped scalability at the GPU end, as opposed to seeing newer-gen chips all hitting a brick wall at a static FPS reading with inferior cards. A 4090 for testing purposes is perfect for the task. If buyers want to check specific hardware benchmarks pertaining to, or best resembling, what they're running or intend to run, that's not hard to work out either: use the existing data, or, for the layman, it's as easy as searching the web/YT/etc.
 
  • Some setup/verification required for optimum performance
Which are they?
 
As mentioned before, I tried; I even transplanted the whole power profile from a fresh OS. It made no difference.
Did you modify the profiles under overlays and the balanced scheme, or just the setting under the topmost scheme? Also, restoring (in the "old" GUI) won't revert overlay or profile settings.

Anyway, I tested this with a custom-built provisioning package that provisioned my 3900X analogously to what AMD does for 2-CCD X3D CPUs. I wrote a bigger guide here; maybe it will be useful for someone.
 
A bit of a philosophical question. Adding bloat is very unpredictable though. I do have VBS enabled, because it's the fresh OS installation default.

Multiplayer games are terrible to test reliably, because they will just release a patch and that invalidates all previous testing. For me that means retesting these CPUs

For sure, I never said it was going to be easy or without its own issues. In the case of multiplayer games, run-to-run variation is offset with a larger number of runs, which then requires more manpower to yield results with strong confidence. I understand completely why reviewers DON'T do partially bloated systems or multiplayer games; that doesn't mean we don't need them, or that those results wouldn't be much more applicable than what we get.

Some of the most CPU-intensive games are multiplayer games. I, for instance, always have RivaTuner running and keep track of CPU utilization. Battle royales especially, with more players and more interactions between elements in the game itself, lead to much higher utilization compared to a single-player experience. It's one of the reasons I never skimp on a CPU and would rather be GPU-bound than ever run into a scenario where my CPU is getting taxed too hard, as that leads to hit-registration issues, desync, input lag, hiccups, and an all-around horrible experience when it really matters.

In the case of solo testers, I don't think this should replace normal testing, but these things should be looked at, even if in their own section with asterisks.

Steam could actually fix all of this, but Valve doesn't care about gamers anymore.
 
I totally agree, and on top of that, the presentation of all the gathered data/info is flawless! Very easy to fall back on and quickly pull relevant material. Things are just getting better and better, e.g. the addition of min. FPS charts, game-relevant power consumption charts, etc. A fulfilling experience, considering at some point in the past I did ask for both of these additions. The only thing missing for me is a Mandalorian "this is the way" emoji hehe



Unlike the 5800X3D, it's great to see the 7800X3D and co. being unlocked. If I were buying one, I can't see myself delving into the complexity of it all, but it would have been nice to break through what seems like a deliberate 5 GHz MBC cap.

How sensitive is the stacked cache to raised voltages/temperatures?
About the locked CO on the 5800X3D: it depends on the motherboard. My Prime B550-Plus has it unlocked (as a matter of fact, my X3D has been at CO -30 since day two; 4.45 GHz single- and multi-core clocks, baby!).
It can also be set in Windows with software (which I don't remember the name of right now).
 
  • Some setup/verification required for optimum performance
Which are they?
Making sure the AMD chipset driver works correctly. See the "Core Parking Fail" page in the review. This probably won't affect most people, but I still feel that raising awareness is important. Once you've got your X3D set up, fire up Cyberpunk and check that ALL your V-Cache-capable cores are loaded and not parked, or you will lose performance in games. Cinebench and similar benchmarks will not show the issue.
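On Windows you would eyeball this with the per-core graphs in Task Manager, HWiNFO, or RivaTuner while the game runs. Just to illustrate the underlying idea (sample per-core busy time twice and flag cores that stay near idle under load), here is a stdlib Python sketch for Linux using /proc/stat; the 5% "parked" threshold is an arbitrary assumption, not anything AMD documents:

```python
# Sketch: sample per-core utilization and flag cores that look near-idle
# under load. Linux-only (/proc/stat); on Windows you'd watch the per-core
# graphs in Task Manager, HWiNFO or RivaTuner instead.
import time

def read_core_times():
    """Return {core_name: (busy_jiffies, total_jiffies)} from /proc/stat."""
    cores = {}
    with open("/proc/stat") as f:
        for line in f:
            # Per-core lines look like "cpu0 ...", the aggregate is "cpu ...".
            if line.startswith("cpu") and line[3].isdigit():
                name, *fields = line.split()
                vals = [int(v) for v in fields]
                idle = vals[3] + vals[4]          # idle + iowait
                cores[name] = (sum(vals) - idle, sum(vals))
    return cores

def core_utilization(interval: float = 1.0):
    """Percent busy per core over `interval` seconds."""
    a = read_core_times()
    time.sleep(interval)
    b = read_core_times()
    util = {}
    for name in a:
        busy = b[name][0] - a[name][0]
        total = b[name][1] - a[name][1]
        util[name] = 100.0 * busy / total if total else 0.0
    return util

if __name__ == "__main__":
    for core, pct in core_utilization().items():
        # Arbitrary 5% threshold just to highlight near-idle cores.
        flag = "  <- near idle (parked?)" if pct < 5.0 else ""
        print(f"{core}: {pct:5.1f}%{flag}")
```

Run it while the game is loading the CPU: cache-die cores that never rise above a few percent are the symptom the review's "Core Parking Fail" page describes.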

Did you modify the profiles under overlays and the balanced scheme, or just the setting under the topmost scheme? Also, restoring (in the "old" GUI) won't revert overlay or profile settings.

Anyway, I tested this with a custom-built provisioning package that provisioned my 3900X analogously to what AMD does for 2-CCD X3D CPUs. I wrote a bigger guide here; maybe it will be useful for someone.
Good article. End of next week I'll have more time to mess around with CPUs again
 