
Are people planning an upgrade?

I have mine locked at 5.55 GHz in a manual OC at 1.23 V. It has actual second-gen V-Cache and the wider Zen 5 cores, which aren't immediately obvious in current gaming benchmarks. A big leap from Zen 4, just not at stock. The 7800X3D is only efficient because first-gen V-Cache forces it to run at low voltage, not by AMD's choice. As for heat and power draw, that's irrelevant and entirely dependent on settings, just like Zen 4.

About to tune a 6600 MT/s CL32 64 GB white Dominator Titanium kit that's arriving today. Got some TG Putty Pro to keep it cooler than stock.
6600 is a ***tch to run. You need to run a LOT of stability tests because IMC/FCLK instabilities are hard to detect. Or I'm just clueless. I settled for 6400 CL28 because I couldn't be bothered fine-tuning for 6600.

Also looks like you have a god-tier bin; mine hits a wall at 5.4 GHz / 1.3 V flat. Which is terrible...
 
Yeah, the 7800X3D barely crosses 1 V on the cores. It's set up extremely conservatively by default, but that's why I love it. The 9800 is maybe 10% faster, but with 100% higher power. Wider cores and new V-Cache sound nice on paper, but I'm a practical person.
The only thing I'd add here is that's either an exaggeration or you're being fairly selective about the numbers you put forward. Near 100% more power applies to all-core workloads, not games, where it's closer to 20% faster on average, with apps doing better and worse than that, of course. In gaming it's closer to your 10% faster, but it uses more like 40% more power (46 W vs 65 W). Still relatively inefficient, no doubt about it, but there's more to it borne out in the numbers, and given it's a through-and-through gaming chip, I doubt many are fussed about productivity power consumption. To be clear, I agree it's considerably less efficient.
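To put rough numbers on the gaming-efficiency point, here's a back-of-the-napkin sketch using the figures quoted above (~10% faster, 46 W vs 65 W average gaming draw); these are the thread's estimates, not measurements:

```python
# Relative gaming efficiency from the figures quoted above
# (7800X3D as the 1.00x baseline; numbers are rough forum estimates).
perf_7800, watts_7800 = 1.00, 46
perf_9800, watts_9800 = 1.10, 65

power_increase = watts_9800 / watts_7800 - 1
perf_per_watt_ratio = (perf_9800 / watts_9800) / (perf_7800 / watts_7800)

print(f"{power_increase:.0%} more power")     # ~41% more power
print(f"{perf_per_watt_ratio:.2f}x perf/W")   # ~0.78x, i.e. ~22% less efficient in games
```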
 
6600 is a ***tch to run. You need to run a LOT of stability tests because IMC/FCLK instabilities are hard to detect. Or I'm just clueless. I settled for 6400 CL28 because I couldn't be bothered fine-tuning for 6600.

Also looks like you have a god-tier bin; mine hits a wall at 5.4 GHz / 1.3 V flat. Which is terrible...
I'm not going to run 6600.

It's just a high bin kit I bought.
 
I dunno, I dunno if I'm being overly petty here. Even if you didn't know I'm using one, I can't digest why I, or monitors of that form factor, got hit with such a, erm, rude take.
There's a wall between "waste-of-money scammy trash" and "top-of-the-line product in a niche form factor with a price tag to match". And maybe I'm crazy, but in this specific case I know well enough what I'm gaining from the product. I'm committed to the plan, and I don't need someone else trashing it.
YEAH, I agree. I've seen there are even 24" 4K monitors which, again, were "blessed by happy photoshoppers". But even for this one I have my story: I got a 24" WQHD (2K) one for office work, then... realized it's UNUSABLE at 100% without leaning deep into the screen, which is already a pain in the... umm, well, again, eyes, because the screen is small to stare at compared to, say, 27". So again we have a case where pictures/movies will be PERFECT to watch, but at the same time we get LOWER THAN ADVERTISED effective RESOLUTION (Windows recommends 125%, which leaves 2048x1152 of workspace; what is that compared to 1080p, and even lower vertically than the 1200p of that "golden 16:10 era", lol), or we burn our eyes using it at 100%. So it isn't "niche", it's more like "dumb", sorry. No offence, bro. :oops:
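For what it's worth, the 2048x1152 figure checks out; a tiny sketch of how Windows display scaling eats workspace (effective area is just the panel resolution divided by the scale factor):

```python
def effective_resolution(width_px, height_px, scale_percent):
    """Logical desktop area left after Windows display scaling."""
    factor = scale_percent / 100
    return round(width_px / factor), round(height_px / factor)

# 24" WQHD at the recommended 125% scaling:
print(effective_resolution(2560, 1440, 125))  # (2048, 1152), less vertical space than 1200p
```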

I'm really skeptical about the new GPU releases and GPUs in general. On one hand you have NV with high prices, despite what NV says about them and whatever tech it offers. On the other hand you have AMD, which may be lower in price but still doesn't offer much of a performance bump. Intel, well, Intel is still waiting for an opportunity. I must say, it will be hard for me to upgrade my graphics. Considering I don't play much nowadays, and when I do, my 6900 XT handles the games I play just fine, I might skip this gen's upgrade as well.

Yeah, I don't need scaling either with my 27-inch 4K screen, and why would I? I see everything clearly. 32-inch is too big for me anyway.
Same here. I understand this is "bad" for the eyes, but that's me: when chillin' with the PC I'm OK to sit close to the monitor, utilizing full resolution for my tasks, whatever that resolution is. But for the 9-to-5 workday I sit at a distance and use a crazy scaling factor, well, just not to burden my eyes, lol. :) As for 32", I have tried using a 32" TV (FHD, lol) as a monitor. Super good for a "healthy distance", not usable for a "close sit" though. Well, I'm currently a lappie-boy (don't want to mess with an external monitor again), but in future, if I go for 32" it will clearly be 4K, and if I choose 2K (I hate the break-ya-tongue name WQHD) it will be 27".
 
The only thing I'd add here is that's either an exaggeration or you're being fairly selective about the numbers you put forward. Near 100% more power applies to all-core workloads, not games, where it's closer to 20% faster on average, with apps doing better and worse than that, of course. In gaming it's closer to your 10% faster, but it uses more like 40% more power (46 W vs 65 W). Still relatively inefficient, no doubt about it, but there's more to it borne out in the numbers, and given it's a through-and-through gaming chip, I doubt many are fussed about productivity power consumption. To be clear, I agree it's considerably less efficient.
That's true, but that's why I'm not keen on it. Pushing considerably more power for a tiny bit more performance isn't progress. It's brute force. Exactly what Intel has been doing lately. I love my 7800X3D, but the 9800 doesn't impress me.
 
Most likely I'll jump to 64 gigs this year (not that I need 'em, just an itch). Also, some racing stuff is required in order to be more comfy, probably a Next Level Racing chassis of some kind (I still have to figure out the actual space required for that). CPU and GPU upgrades: I guess 2027 at best, not sooner; I don't have any reason to upgrade either (at least for now).
 
Definitely older tech at lower prices. Video cards and motherboards have skyrocketed since the crypto rush. I remember paying insane money for my GTX 1080, and that was on sale.

I need an overhaul, though. I'd like to give Win11 a try without the worry of getting cut off at any moment. And I definitely need a new GPU that can run HDR/4K/10-bit/4:4:4/120 Hz all at the same time. I'm not too worried about performance; even low-end is at 1080 level by now.
 
I mean - half a grand these days is like 2 runs to the grocery store

What on earth are you buying from there? Premium Japanese beef, caviar and truffles, all washed down with some rare vintage spirits? lol. Nah, I agree, nowadays groceries and home-wares have blown projected shopping budgets through the roof, esp. if you're hunting for farm-fresh veg/fruit and generally a healthier lifestyle. I still check the receipts when the Mrs returns and have those 'silent WTF moments'.

You got me thinking: a discounted bulk-buy month's supply of instant noodles for the fam, and put that bundle of joy towards a more comfy 5080 splurge. Sacrifices must be made for the greater good ;)
 
but the 9800 doesn't impress me.
From the perspective of a 7800/7900/7950X3D owner mostly interested in gaming, I doubt it impressed many, if any. From my perspective, it was enough to entice me and others coming from AMD and Intel generations older than that.
 
YEAH, I agree. I've seen there are even 24" 4K monitors which, again, were "blessed by happy photoshoppers". But even for this one I have my story: I got a 24" WQHD (2K) one for office work, then... realized it's UNUSABLE at 100% without leaning deep into the screen, which is already a pain in the... umm, well, again, eyes, because the screen is small to stare at compared to, say, 27". So again we have a case where pictures/movies will be PERFECT to watch, but at the same time we get LOWER THAN ADVERTISED effective RESOLUTION (Windows recommends 125%, which leaves 2048x1152 of workspace; what is that compared to 1080p, and even lower vertically than the 1200p of that "golden 16:10 era", lol), or we burn our eyes using it at 100%. So it isn't "niche", it's more like "dumb", sorry. No offence, bro. :oops:

It's probably only "dumb" for your eyes.
If my IT department somehow offered to swap in / add a 24" 1440p monitor, I know I'd raise my hand, then use it at 100% scaling and probably zoom out even further in Excel.
(I don't mean any offense here. There are older, wiser, calmer colleagues than me who struggle with / have to scale their generic 1080p, so that is that.)

But the "LOWER THAN ADVERTISED RESOLUTION" argument doesn't quite stand, because whatever the GPU outputs after scaling is still at the display's native resolution by default. I can see smoother images on my 27" 4K than on my brother's 27" 1440p in terms of pixelation, be it in games like Forza Horizon 5 or text in general Windows use (mine is at 150% scaling currently because of the 1080p displays at its side). It's not annoying at all, but I can notice it at a close-but-normal distance without trying hard.

Let's just say we have very, very, very different experiences and views on this topic; maybe I'm the crazy niche one here, and let's move on.
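The pixel-density difference being described here is easy to quantify; a quick sketch using the sizes and resolutions from the posts above:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch of a display from its resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'27" 4K:    {ppi(3840, 2160, 27):.0f} PPI')  # ~163
print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')  # ~109
print(f'24" 1440p: {ppi(2560, 1440, 24):.0f} PPI')  # ~122
```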
 
I'm pretty much always planning an upgrade. Don't always pull the trigger tho.
 
I'm upgrading my cooling and just going to push my hardware a little further. Nothing coming down the pipe interests me, so no need for new hardware.
 
YEAH, I agree. I've seen there are even 24" 4K monitors which, again, were "blessed by happy photoshoppers". But even for this one I have my story: I got a 24" WQHD (2K) one for office work, then... realized it's UNUSABLE at 100% without leaning deep into the screen, which is already a pain in the... umm, well, again, eyes, because the screen is small to stare at compared to, say, 27". So again we have a case where pictures/movies will be PERFECT to watch, but at the same time we get LOWER THAN ADVERTISED effective RESOLUTION (Windows recommends 125%, which leaves 2048x1152 of workspace; what is that compared to 1080p, and even lower vertically than the 1200p of that "golden 16:10 era", lol), or we burn our eyes using it at 100%. So it isn't "niche", it's more like "dumb", sorry. No offence, bro. :oops:


Same here. I understand this is "bad" for the eyes, but that's me: when chillin' with the PC I'm OK to sit close to the monitor, utilizing full resolution for my tasks, whatever that resolution is. But for the 9-to-5 workday I sit at a distance and use a crazy scaling factor, well, just not to burden my eyes, lol. :) As for 32", I have tried using a 32" TV (FHD, lol) as a monitor. Super good for a "healthy distance", not usable for a "close sit" though. Well, I'm currently a lappie-boy (don't want to mess with an external monitor again), but in future, if I go for 32" it will clearly be 4K, and if I choose 2K (I hate the break-ya-tongue name WQHD) it will be 27".
It is not bad for the eyes if you have sufficient ambient light. If you sit around 1 m from the monitor, you should be OK.
For me, 4K is the way to go.
 
It is not bad for the eyes if you have sufficient ambient light. If you sit around 1 m from the monitor, you should be OK.
For me, 4K is the way to go.
1 m is what I practice at work. At home I'm more like 40-50 cm, using a laptop. With a desktop it could be 60-70 cm.

It's probably only "dumb" for your eyes.
If my IT department somehow offered to swap in / add a 24" 1440p monitor, I know I'd raise my hand, then use it at 100% scaling and probably zoom out even further in Excel.
(I don't mean any offense here. There are older, wiser, calmer colleagues than me who struggle with / have to scale their generic 1080p, so that is that.)

But the "LOWER THAN ADVERTISED RESOLUTION" argument doesn't quite stand, because whatever the GPU outputs after scaling is still at the display's native resolution by default. I can see smoother images on my 27" 4K than on my brother's 27" 1440p in terms of pixelation, be it in games like Forza Horizon 5 or text in general Windows use (mine is at 150% scaling currently because of the 1080p displays at its side). It's not annoying at all, but I can notice it at a close-but-normal distance without trying hard.

Let's just say we have very, very, very different experiences and views on this topic; maybe I'm the crazy niche one here, and let's move on.
Well, won't argue with that. I COULD use 24" WQHD at 100%, but... my eyes would just tire fast. I HATE AMBIENT LIGHT for PC usage. Same goes for 4K 27". Lucky you, then. ;)
 
50xx look amazing :D

 
50xx look amazing :D


I would have liked to see the 5070 Ti beat the 4070 Ti SUPER by a decent margin of 10-15%. Those are low expectations for a $750-800 card.

It would have been nice to see the 5080 match or closely trail the 4090, but then again that was hardly expected, with Nvidia's carefully and tightly calibrated performance tiers to encourage upselling, and AMD, tail between its legs, exiting the higher-tier race.

Not exactly "amazing", but a reasonable generational upgrade! The crafty upselling is working on me: the 5070 Ti locks in my performance goals, but for $250 more the 5080 itch is growing like that weird rash you know you shouldn't scratch but absolutely can't resist.

Anyway - hope the larger game-suite benchmarks, once NDAs lift, don't fall short of those digits, or better yet, exceed them!
 
Surely the 50 series has to scale better than 1:1 with the current one... (while also consuming more power)
Just like I said several pages ago now. From Nvidia's own slides, the 5090 is about 30% more expensive, uses about 30% more power, and has about 30% better performance than the 4090 (I think the numbers are more like 26/28/20-40, with performance depending on the game and whether RT is present). It definitely seems to scale 1:1 with the 4090 on price/power/performance if MFG is not factored in. Edit: also, I'm just talking MSRP, which has been practically mythological for many people over the last ~4 years... but I got a 4090 at MSRP, which means it can happen... and it's the only way to track this reliably.
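Using the public MSRPs ($1,599 vs $1,999) and rated board power (450 W vs 575 W), and taking the ~30% uplift above as an assumed average, the "scales about 1:1" claim works out like this:

```python
# 5090 vs 4090: public MSRP and rated board power; the 1.30x perf
# figure is the rough average uplift quoted above, not a measurement.
price_ratio = 1999 / 1599   # ~1.25x the price
power_ratio = 575 / 450     # ~1.28x the power
perf_ratio = 1.30           # assumed average uplift

print(f"perf per dollar: {perf_ratio / price_ratio:.2f}x")  # ~1.04x
print(f"perf per watt:   {perf_ratio / power_ratio:.2f}x")  # ~1.02x
```

So on these rough numbers, cost- and power-efficiency are nearly flat generation over generation, which is the 1:1 point.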
 
I'm torn on whether I want/need to upgrade. The games I like to play aren't super demanding in terms of GPU or CPU horsepower, so my 7700X and 3070 are good in that regard. I do like to play around with Blender. But peer pressure is a bitch and marketing doesn't help. Makes me feel like I'm "inferior" somehow, like what I have isn't good enough. I know it's silly, but that's how I feel about the market in general right now.

That being said, about the only things I can think of that would benefit from an upgrade are my case (I love it, but it's too heavy and cumbersome for me now; I'd like to downsize a little), my AIO, and my monitor (I want either a second monitor or a bigger one with speakers, so I can play my Switch on it without having to use headphones).
 
I'm torn on whether I want/need to upgrade. The games I like to play aren't super demanding in terms of GPU or CPU horsepower, so my 7700X and 3070 are good in that regard. I do like to play around with Blender. But peer pressure is a bitch and marketing doesn't help. Makes me feel like I'm "inferior" somehow, like what I have isn't good enough. I know it's silly, but that's how I feel about the market in general right now.
Your hardware isn't inferior in any way. Its sole purpose is to fulfill your needs. As long as it does that, it's good enough.

I have an HTPC with a low-power Haswell i7 and a GT 1030 in it, and I love it to bits. It plays all my films just fine and doesn't overheat with only a passive aluminium block on both the CPU and GPU.

I also had a 7700X that I upgraded to a 7800X3D. I'm happy with my choice because of the massively decreased power consumption and thermals (especially with 4800 MT/s RAM), but I don't feel the bump in gaming performance. All those reviews out there done with a 4090 are massively exaggerated compared to the average user experience.
 
I'm torn on whether I want/need to upgrade. The games I like to play aren't super demanding in terms of GPU or CPU horsepower, so my 7700X and 3070 are good in that regard. I do like to play around with Blender. But peer pressure is a bitch and marketing doesn't help. Makes me feel like I'm "inferior" somehow, like what I have isn't good enough. I know it's silly, but that's how I feel about the market in general right now.

That being said, about the only things I can think of that would benefit from an upgrade are my case (I love it, but it's too heavy and cumbersome for me now; I'd like to downsize a little), my AIO, and my monitor (I want either a second monitor or a bigger one with speakers, so I can play my Switch on it without having to use headphones).
Don't be pressured to upgrade. If you were looking at a 4K monitor and some newer games that use Ray Tracing, I'd say "sure, consider a GPU upgrade", but from what you've said there's definitely no "need". "Want"...well, only you can decide that. If you're using your GPU for work, that can be a whole different story when time is money and you can make more money by using a more powerful GPU.

Nvidia is trying to wind down availability of the 4000 series before the 5000 series launches, but when that happens in a couple of weeks, we'll see some price adjustments on everything. The 5070 seems like it may be a decent bang/buck card, and there's a chance (just rumors for now) that AMD's 9070 XT actually hits a really good performance/price spot with some good RT improvements as well. AMD says the rumors are wrong but hasn't released any information, so we're still waiting to find out; with two series launching in the next couple of months, the used market could get pretty interesting too when other people upgrade.

So in the end, you upgrade your GPU when you have the money, there's a card available that meets your performance/price requirements, and what you have isn't meeting your requirements anymore. I think we all get the upgrade itch sometimes (I'm certainly not immune), but often it just makes us forget to enjoy the games. Honestly, I've seen the advice given elsewhere: if the number in the FPS counter is bothering you, sometimes simply turning off the counter makes the game more enjoyable.
 
I'm with the guys above... go with: Fulfill your 'needs' or your 'wants' will consume you

I have never upgraded to anything with long-lasting satisfaction... even great hardware delivering the desired performance doesn't help. Forget lasting satisfaction; it doesn't take long after getting the 'want' for the hardware hype, minor gains, and that bloody endless upgrade itch to resurface. Credit to me, I do a better job nowadays focusing on the "needs" (actually, gaming isn't a need... shit, I gotta rethink everything lol).
 
I'm with the guys above... go with: Fulfill your 'needs' or your 'wants' will consume you

I have never upgraded to anything with long-lasting satisfaction... even great hardware delivering the desired performance doesn't help. Forget lasting satisfaction; it doesn't take long after getting the 'want' for the hardware hype, minor gains, and that bloody endless upgrade itch to resurface. Credit to me, I do a better job nowadays focusing on the "needs" (actually, gaming isn't a need... shit, I gotta rethink everything lol).
How many times have I upgraded with great expectations, only to react with "meh, it's fast, so what?" Then I downgraded and was just as happy as before.

Sometimes, making new games run on old, or low-end hardware feels more satisfying than all the shiny new stuff out there. :)

Edit: This is what an "enthusiast" should be defined by, imo. Someone who's willing to put the effort in learning and making things work. Not someone who just throws money at problems that don't even exist.
 
Credit to me, I do a better job nowadays focusing on the "needs" (actually, gaming isn't a need... shit, I gotta rethink everything lol)
Gaming is a 'need' for some, to vent off from real life. Nowadays I just open up games I'm very familiar with and have a stroll: driving or flying around in GTA San Andreas while listening to a talk show on the in-game radio, or taking Roach for a ride along the isles of Skellige in The Witcher 3 while listening to the game's OST, just to relax after work.
 
Truth be told, I'm not even sure why I bought the 7900 XTX. I was planning on buying the 6800 or the 6800 XT, but then decided to wait a little and bought the 7900 XTX right after it released in my country, for 950 euro. It's overkill for me and will continue to be for the following years. Maybe I'll upgrade around the RTX 7000 series. Whether I'll upgrade to an AMD GPU or an Nvidia one depends only on AMD: if they bring FSR4 to the 7000 series, they'll have my support for the next gen as well. And if they don't, I might as well buy Nvidia if I'm going to get feature-locked for not having the latest product. I will likely not use upscaling anyway, but it's simply a matter of principle for me. I might upgrade sooner if AMD comes up with a GPU offering 95-105% more raster performance for the same price (a ±15% difference in price is acceptable), but that's unlikely before UDNA2. For now my GPU is enough for my 1440p @ 170 Hz and 4K @ 60-120 Hz (depending on the game) gameplay.
 