
MSI OCLab Reveals Ryzen 9000X3D 11-13% Faster Than 7000X3D, AMD Set to Dominate "Arrow Lake" in Gaming

Less than 10% might as well be zero. You'll never notice that in a game. Why bother even launching, other than to win a bar chart?
 
:kookoo:
It looks like many people here have BSOD problems with their Intel systems, and they feel affected. Sorry for your mistake, but next time just avoid Intel.
I've worked on nearly every generation of Intel since the Core 2 Quad era and never had constant BSOD issues, and that includes a 14900K currently in service. Your trolling is 100% PEBCAK.
 
I've worked on nearly every generation of Intel since the Core 2 Quad era and never had constant BSOD issues, and that includes a 14900K currently in service. Your trolling is 100% PEBCAK.
What do they say about anecdotal evidence?
 
Why some people still think that every CPU/GPU generation is meant for single generational upgrades is beyond me.
I think it becomes a special interest of sorts. You see it everywhere, really. I'm into photography, and the amount of gear that people buy and sell in that space can put PC hardware to shame, and there's just as much brand fanaticism there, too. Chasing a pixel-peeping difference in IQ, or marginally better autofocus, or some insane burst rates. One thing I've noticed in both spaces: many folks who buy the best expect too much, and the happiness doesn't last.
 
I think it becomes a special interest of sorts. You see it everywhere, really. I'm into photography, and the amount of gear that people buy and sell in that space can put PC hardware to shame, and there's just as much brand fanaticism there, too. Chasing a pixel-peeping difference in IQ, or marginally better autofocus, or some insane burst rates. One thing I've noticed in both spaces: many folks who buy the best expect too much, and the happiness doesn't last.
How many times have I upgraded with great expectations only to be left feeling "meh, it's ok" in the end. Not anymore, though. I think I'll replace my 6750 XT with top RDNA 4 because I'm curious, and leave it at that until I actually need something better. With the way these generational differences are getting smaller and smaller, I'm sure that and my 7800X3D will be good enough for a long time.
 
I think it becomes a special interest of sorts. You see it everywhere, really. I'm into photography, and the amount of gear that people buy and sell in that space can put PC hardware to shame, and there's just as much brand fanaticism there, too. Chasing a pixel-peeping difference in IQ, or marginally better autofocus, or some insane burst rates. One thing I've noticed in both spaces: many folks who buy the best expect too much, and the happiness doesn't last.

Yeah, chasing the dragon. It happens in every enthusiast community. I'm certainly not immune. There's nothing for it but to pinch yourself once in a while. Maybe take a break from tech forums from time to time.

A good method for keeping perspective is to imagine yourself applying the same level of exacting enthusiasm to every area of your life, which then leads to a humbling realization that you (or, at least, me) really don't know (or care to know) about all sorts of things. Whenever I hear someone bemoan the state of "normies," that's always my first thought--at this very moment, there are probably countless enthusiasts in other communities who feel exactly the same way about people like me. We're all normies in certain contexts. And for the most part, we're happy being normies.
 
How many times have I upgraded with great expectations only to be left feeling "meh, it's ok" in the end. Not anymore, though. I think I'll replace my 6750 XT with top RDNA 4 because I'm curious, and leave it at that until I actually need something better. With the way these generational differences are getting smaller and smaller, I'm sure that and my 7800X3D will be good enough for a long time.
I’ve yet to justify upgrading past my Ivy E and 5600XT. It does okay at 60fps for what I play. I can’t justify paying a bunch more for a hobby these days. Inflation has kinda put my hobbies on hold.
 
I’ve yet to justify upgrading past my Ivy E and 5600XT. It does okay at 60fps for what I play. I can’t justify paying a bunch more for a hobby these days. Inflation has kinda put my hobbies on hold.
And that's honestly okay. Older hardware can and will keep up and all this upgrading is more of a hobby thing if it's within reason.
 
I’ve yet to justify upgrading past my Ivy E and 5600XT. It does okay at 60fps for what I play. I can’t justify paying a bunch more for a hobby these days. Inflation has kinda put my hobbies on hold.
I'd justify it if more games came out worth playing. Space Marine 2 is the only game that has really pushed my 6800 XT to an uncomfortable point that I've enjoyed, but I finished it in one weekend, so that wasn't enough.

But this isn't a bad thing. I'm not sad about saving money, especially given our economy right now.
 
The biggest change is that the X3D chips won't have as large a productivity penalty compared to their non-X3D counterparts, which is neat but also somewhat pointless for gaming purposes.
Does make them less of a compromise compared to the 7000/5000 series, though.
 
Three games, especially those three games, don't tell much. Shadow of the Tomb Raider is six years old, Far Cry 6 is three years old. No one is buying a new CPU because it's faster in Far Cry 6. Wukong is new, but it's not particularly CPU limited. I'll wait for reviews with 20+ games including more relevant new stuff.
 
Three games, especially those three games, don't tell much. Shadow of the Tomb Raider is six years old, Far Cry 6 is three years old. No one is buying a new CPU because it's faster in Far Cry 6. Wukong is new, but it's not particularly CPU limited. I'll wait for reviews with 20+ games including more relevant new stuff.

I mean... yeah? Obviously, this is just leaked stuff to give some sort of POTENTIAL indication, nothing real.
 
Three games, especially those three games, don't tell much. Shadow of the Tomb Raider is six years old, Far Cry 6 is three years old. No one is buying a new CPU because it's faster in Far Cry 6. Wukong is new, but it's not particularly CPU limited. I'll wait for reviews with 20+ games including more relevant new stuff.
New stuff is irrelevant, as it's all GPU-bottlenecked even by an RTX 4090 at 1080p ultra settings.
You're better off using older games that can be CPU-limited to show differences.
This need to use popular games that everyone plays is asinine and is ruining reviews.
 
New stuff is irrelevant, as it's all GPU-bottlenecked even by an RTX 4090 at 1080p ultra settings.
You're better off using older games that can be CPU-limited to show differences.
This need to use popular games that everyone plays is asinine and is ruining reviews.

Why not Alan Wake 2, it's the last with PT (RT) and RR... is there a more demanding game out ?
 
Why not Alan Wake 2, it's the last with PT (RT) and RR... is there a more demanding game out ?
It's a GPU benchmark. You don't benchmark CPUs with games like that.

Starfield, on the other hand, can make CPUs cry. Cyberpunk 2077 scales exceptionally well (even though anything beyond 150 FPS is practically overkill in this case). One might also want to crunch it with CoD, Counter-Strike, Fortnite, and other games where having a lot of FPS is a much higher priority than having optimal visuals.

Using old games, you clearly don't have to worry about your GPU being a bottleneck. Using new games, you will need to use outdated resolutions like 720p and set everything GPU-intensive to the lowest possible settings if you want the benchmark to be as CPU-bound as possible.
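Just as an illustration of that logic (my own back-of-the-envelope sketch, not anyone's actual review methodology): if dropping the resolution barely raises FPS, the GPU wasn't the limiter, so the test is effectively CPU-bound. The function name and numbers below are made up for the example.

```python
def likely_cpu_bound(fps_low_res, fps_native, tolerance=0.05):
    """Heuristic (illustrative only): if dropping to a lower
    resolution barely raises FPS, the CPU, not the GPU, is
    the limiting factor."""
    gain = (fps_low_res - fps_native) / fps_native
    return gain < tolerance

# 158 FPS at 720p vs 155 FPS at 1440p: resolution barely matters
print(likely_cpu_bound(158, 155))   # True  (CPU-limited)
# 240 FPS at 720p vs 155 FPS at 1440p: the GPU was holding it back
print(likely_cpu_bound(240, 155))   # False (GPU-limited)
```

That's why reviewers drop to 720p low for CPU charts: it pushes every game toward the first case.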
 
New stuff is irrelevant as it's all GPU bottlenecked even by a RTX 4090 in 1080p ultra settings.
you're better off using older games that can be CPU limited to show differences.
this need to use popular games that everyone plays is asinine & ruining reviews.
I don't think that's true. If it were, every gaming-focused CPU review would just say buy a 7600X (or whatever). Baldur's Gate 3, Cities: Skylines 2, and Total War: Warhammer III all scale with CPU performance at 1080p.

I want to know how a CPU will run games I might actually play. Being 10% faster at running Shadow of the Tomb Raider is irrelevant because anyone who wanted to play it did so years ago. I could see if it was using an engine that was still really popular, but no one is using the Foundation Engine. If it turns out that any CPU will work for new games because I'll always be GPU bound, then that's great because I can save money.
 
Hey, come on, it's not like you didn't have a Sandy Bridge/Ivy Bridge or a 4790K (or maybe you prefer the powerful Bulldozer; it can still hold down lots of papers).
You missed every gen until that $1k 7800X3D (?). Or did you buy the 7 or 8 series? If you bought the over-praised Ryzen, sorry, dude...

Also, this must be hurtful, but costly as it is, Intel is still better for gaming, except in Cinebench, where AMD and its 59 trillion cores shine. It doesn't help that a lot of the press backs Sony, Microsoft, and Valve with their mediocre APUs, even the Pro that's about to be released, unless it gets a 4090. ^^

Yeah, finally years of bad press are affecting Intel, but what they make in revenue in one fiscal year they could use to buy AMD, no problem. But they make foolish GPUs XD.
 
I am surprised that people are missing one of the big changes in this report.

According to those charts, there is a real good chance the X3D x900 and x950 parts are NOT massively compromised on productivity like the 7000-series parts were!

If this is true, then you're no longer juggling between the "money maker" non-X3D parts for your day-to-day work and sacrificing the gaming benefits the X3D gives you.
 
Hey, come on, it's not like you didn't have a Sandy Bridge/Ivy Bridge or a 4790K (or maybe you prefer the powerful Bulldozer; it can still hold down lots of papers).
You missed every gen until that $1k 7800X3D (?). Or did you buy the 7 or 8 series? If you bought the over-praised Ryzen, sorry, dude...

Also, this must be hurtful, but costly as it is, Intel is still better for gaming, except in Cinebench, where AMD and its 59 trillion cores shine. It doesn't help that a lot of the press backs Sony, Microsoft, and Valve with their mediocre APUs, even the Pro that's about to be released, unless it gets a 4090. ^^

Yeah, finally years of bad press are affecting Intel, but what they make in revenue in one fiscal year they could use to buy AMD, no problem. But they make foolish GPUs XD.
How big was the bowl you smoked today?

6000/8000 Bulldozer/Piledriver are horrible now and were horrible then, and a 4790K will keep a 1080 Ti from fully flexing. I personally tested this.

7800X3Ds were never that much; I don't know where you got that info.

Intel has been proven to not be at the top for "gaming" chips. They've been pretty good general-usage chips, but definitely not top in games. If anything, in R23 they've been way up there, contrary to what you said.

They would never buy out AMD due to licensing, and Arc cards are not bad by any means. Their AV1 encoding is strong, and they do well on price/performance.
 
I think it becomes a special interest of sorts. You see it everywhere, really. I'm into photography, and the amount of gear that people buy and sell in that space can put PC hardware to shame, and there's just as much brand fanaticism there, too. Chasing a pixel-peeping difference in IQ, or marginally better autofocus, or some insane burst rates. One thing I've noticed in both spaces: many folks who buy the best expect too much, and the happiness doesn't last.
There aren't benchmarks for subject tracking and ergonomics; the whole industry is based on bought opinions of social media talking heads and charts of dynamic range that are imperceptible anywhere but in those charts.

Worst of all, you can make stunning images with trash cameras and optics.
But for some reason, I still need a new camera and a bunch of lenses…
 
They got 5.2 GHz...
Hmm... X3D for 200 MHz less (compared to the 9700X)?
I wouldn't mind that slight frequency drop.
 
Not surprising, at least to me. I guess we "should" get used to such small increments; hey, at least it's not a regression :D
I'll wait for the reviews. If the 9800X3D can improve efficiency, I might upgrade; after all, it's just a drop-in :rolleyes:


[Attachment: SS1.png]
 
Less than 10% might as well be zero. You'll never notice that in a game. Why bother even launching, other than to win a bar chart?
Intel went over 300 watts and ended up with degrading, unstable CPUs for less than 10%. 10% can also be the performance difference between two CPUs that are over $100 apart in price.
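To put a rough number on what "less than 10%" means in frame time (my own arithmetic, not from the article): at around 144 FPS, a 10% uplift only shaves about 0.6 ms off each frame.

```python
def frametime_ms(fps):
    # frame time in milliseconds at a given average FPS
    return 1000.0 / fps

base_fps = 144
uplift_fps = base_fps * 1.10  # a hypothetical 10% generational gain
saved = frametime_ms(base_fps) - frametime_ms(uplift_fps)
print(f"{saved:.2f} ms saved per frame")  # 0.63 ms saved per frame
```

Whether 0.6 ms per frame is "might as well be zero" or worth $100 is exactly the argument here.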

And to make it a little more personal (sorry in advance): I am pretty sure you justified paying for that "G.SKILL Ripjaws S5 DDR5 6000 CL30-40-40-96 (F5-6000J3040F16GX2-RS5K)" kit in your system specs instead of going for cheaper 4800-5200-5600 RAM, even though the performance difference is usually no more than 2-3%.


The fun part of desktop PCs is upgradeability and lots of options. As long as these options don't become the foundation for a negative change in the desktop PC market, for example in prices (the Nvidia Titan was just that, and we see the results today), any new hardware part is welcome.
 