
Are people planning an upgrade?

TechPowerUp's RT charts have an odd way of showing it with the -x%, but yes, with RT effects enabled the hit gets smaller gen on gen with the same number of RT 'cores'. Doubling ray-triangle intersection performance doesn't necessarily mean double the resulting FPS with RT on in every single title, as there's more to it than that, but it's demonstrable that RT and Tensor performance increases.
The only thing that does is shorten the latency of the RT work within the frame's render time. It won't change the major bottleneck: too much competition for the render outputs (ROPs) in the end.
You've got Tensor cores/RT cores, shaders, texture units, and normal rasterization all trying to push through a certain number of ROPs, which is the major limit on all current card designs.
 
7800X3D, I'm keeping it and I'm happy, more than if I had a 9800X3D btw, no AI gen 1.0!

4070 Ti, I'll keep it until there's a 70-series card that pushes my 7800X3D to its knees; that'll be the 7070 or 8070.
 
I was planning on getting a 5070 Ti, but it's over budget, and the 5070 doesn't have 16 GB, so I'll wait for the 5060 Ti.
 
I'd love to upgrade my GRE Red Devil if the stars all align and the 9070 XT has enough performance to make it not a side-grade.
 
No, unless I find a cheap X3D replacement for the 7600. I'm done shopping for the foreseeable future.
 
No, I have zero plans for upgrades this year. I have an RTX 4090 and a 5950X CPU. That should be sufficient for this year at 1440p. I have 32 GB of RAM and sufficient storage, so I see no reason to do anything.
 
I prefer Always More Disappointments myself :nutkick:

Why are we so hung up on accepting forward technologies? We are supposed to be geeks, seeking excellence, adopting and embracing new technologies as they come.
I like me a new bit of tech, but I've been paying attention to the vendor lock-in of proprietary tech since forever, and especially since consoles showed their true nature as walled gardens for gaming. It just feels bad, a bit like selling your soul to company A or B. And that's the last thing I want. They can have my money, but the sale is the sale, and after that, they can fck right off. I don't want companies nagging me to keep buying, or 'nudging' me to the latest and greatest all the time. Because then my purchase gets a negative psychological effect added to it. Why would I spend hard-earned money only to be made to feel bad for as long as I keep using said purchase? I want companies to make me feel good about my purchase, not bad because I'm not spending more with them. Similar things occur when I see creators asking for donations everywhere. It's fine in moderation, though.

For similar reasons I don't do much, if any, online service-model-oriented gaming anymore. It's all a major clusterfuck that will show itself sooner or later. It never works out well, and it has lots of similarities to drug addicts looking for dealers. If I want that, I'll just do drugs; much more fun.

Still, I understand not everyone is like this / hasn't experienced the negative effects of these lock-ins yet, or what they do to their own behaviour. And I might turn to green again too, lacking good alternatives. It is what it is, but as long as we still have the opportunity to resist those lock-ins, I will resist them. In the end, it IS a lock-in, and it does lead to monopolists that kill the value proposition for us, and eventually, worse products at higher prices. This is already happening big time, we're knee-deep into this, and it relates directly to DLSS.

Ask yourself this: are you happier now with FreeSync, or would you have preferred paying Nvidia for that 100-dollar G-Sync module that doesn't net you much, if any, additional advantage? I do know I'm a lot happier with FreeSync just being present on virtually every monitor above 60 Hz, without paying a premium. Both are forward-thinking technologies. The tech isn't the issue. It's the way it is sold.

An interesting one is AI right now. The overwhelming majority is cloud-powered and undergoing constant change = you have no control whatsoever. That's possibly the worst place for AI to ever be a trustworthy 'companion' to the consumer market. Lots of people barely realize this, but it's huge. Now factor in how Nvidia is positioning itself as a software company. How they keep pushing NOW. Etc. The push is clear: you will own nothing and be happy. I have nothing other than a hard pass and a middle finger for that reality; it's a transfer of influence and power that will only happen over my dead body, I shit you not.
 
All they're giving us is more software gimmicks on the same hardware
But think about it. If that were the case, shouldn't every other hardware manufacturer (including Intel, btw) be miles ahead of them in raw performance? But they aren't. The top 6-7 cards in raster performance are all Nvidia's, and the top 10+ in RT likewise. That's without using the software gimmicks. So clearly, they are giving us more hardware than anyone else does.
 
Yeah, I'm getting a 5090. I bought an RTX A2000 I got a nice deal on to use in the interim and that will also replace the GTX 1070 Ti I have on my secondary system with something RT capable. The 4080 will be getting a new home soon. It was gonna be a 6500 XT, but once I saw that one listed...


Slot-powered and quite capable for the footprint. Will be a great one to own. Just gotta find a way to get it to work with my TV, though (so, an mDP to HDMI converter).
Where did you even find one for a reasonable price? Anything "Quadro" here usually gets an awful markup because it's "professional" stuff, no matter how old/slow it is.
I totally disagree there. Nvidia doesn't even let you monitor your card, change power settings, over/underclock, over/undervolt, etc. The AMD control panel feels better, too. Nvidia crams everything into the "3D Settings" tab, which is just silly.
Funnily enough, Nvidia's panel on Linux allows you to do so, whereas AMD has no GUI tools for Linux at all.
Don't we all buy these cards to play games?
Nope :p
 
Where did you even find one for a reasonable price? Anything "Quadro" here usually gets an awful markup because it's "professional" stuff, no matter how old/slow it is.

Mercado Livre, 1400-ish shipped. I mean, yeah, bored at night I just go search for this stuff, and sometimes I do find nice things. I just don't usually have the money.

There's someone selling a pair of GP100s there too, 3500 each, seen it go down to 3325 on occasion.
 
But think about it. If that were the case, shouldn't every other hardware manufacturer (including Intel, btw) be miles ahead of them in raw performance? But they aren't. The top 6-7 cards in raster performance are all Nvidia's, and the top 10+ in RT likewise. That's without using the software gimmicks. So clearly, they are giving us more hardware than anyone else does.
What's the connection there? Just because one company focuses most of its efforts on software, it doesn't automatically mean that the other two suddenly got all their hardware figured out. There are people working on these designs; it's not like you press a button to pump money in and a faster GPU comes out.

My theory is that the whole GPU scene is hitting a wall either in chip design or production, that everybody is trying to bridge with software, which is currently best suited to Nvidia.
 
My theory is that the whole GPU scene is hitting a wall either in chip design or production, that everybody is trying to bridge with software, which is currently best suited to Nvidia.
Exactly. That was really my point. Look at the 5090: a way bigger chip than the 4090, way more power, way more bandwidth, yet it looks (until reviews come out) like it's not scaling with hardware AT ALL. Realistically, with those kinds of specs, it should be a good 60-80% faster than the 4090.
 
What's the connection there? Just because one company focuses most of its efforts on software, it doesn't automatically mean that the other two suddenly got all their hardware figured out. There are people working on these designs; it's not like you press a button to pump money in and a faster GPU comes out.

My theory is that the whole GPU scene is hitting a wall either in chip design or production, that everybody is trying to bridge with software, which is currently best suited to Nvidia.
Companies always hit walls, and then they have extra special pieces of rock to sell you. This is the story of business in a nutshell. Create demand, sell. A great way to create demand is to keep telling the world how incredibly difficult what you're doing is. Instant value increase. And then, if everyone in the field agrees (of course they do! it's their paycheck), you've created a reality.

But it's not hard, it's just work. AMD can make a faster piece of hardware just fine. But they don't target it. Look at RDNA4 and you have the perfect example of this. So I'm inclined to side with @JustBenching here. Nvidia's focus isn't just software. It has been a long-term strategy that they have been continuously refining, and they have been busy trying to make the best possible GPUs with the technologies they can think of. Of course, they're also making those technologies look like extra special pieces of rock... but that's par for the course if you have the best stuff on offer.

It's entirely up to the REST of the market to prove Nvidia wrong. But AMD ain't doing it; they're focusing on other things. So you can rest assured this is not a problem of 'can't do'... it's a situation of 'won't do'. The technology isn't the limitation here. Have you forgotten that the gaming lineups are exclusively scraps and leftovers to begin with?!
 
I vote we wait and see what the 5000 and 9000 series offer after reviews and then decide how much of a wall these companies are hitting.

And while sure, they might not be able to blow the 4090 away, there's no reason they shouldn't be able to blow the 4060 Ti away for $400...
 
Companies always hit walls, and then they have extra special pieces of rock to sell you. This is the story of business in a nutshell. Create demand, sell. A great way to create demand is to keep telling the world how incredibly difficult what you're doing is. Instant value increase. And then, if everyone in the field agrees (of course they do! it's their paycheck), you've created a reality.

But it's not hard, it's just work. AMD can make a faster piece of hardware just fine. But they don't target it. Look at RDNA4 and you have the perfect example of this. So I'm inclined to side with @JustBenching here. Nvidia's focus isn't just software. It has been a long-term strategy that they have been continuously refining, and they have been busy trying to make the best possible GPUs with the technologies they can think of. Of course, they're also making those technologies look like extra special pieces of rock... but that's par for the course if you have the best stuff on offer.

It's entirely up to the REST of the market to prove Nvidia wrong. But AMD ain't doing it; they're focusing on other things. So you can rest assured this is not a problem of 'can't do'... it's a situation of 'won't do'. The technology isn't the limitation here. Have you forgotten that the gaming lineups are exclusively scraps and leftovers to begin with?!
Scraps or not, I'm just basing my assumptions on the fact that I can barely see any architectural change since Turing except for the dual-issue shaders introduced in Ampere. Nvidia certainly isn't touching their hardware much, only tweaking this, fixing that.

AMD focused a lot more on hardware with RDNA 3, starting from the chiplet design and whatnot, only that it didn't pay off because of their bad product positioning, timing and marketing. Now, they only promised improved RT and bug fixes for RDNA 4, so maybe they've decided to focus more on software as well.
 
AMD focused a lot more on hardware with RDNA 3, starting from the chiplet design and whatnot, only that it didn't pay off because of their bad product positioning, timing and marketing. Now, they only promised improved RT and bug fixes for RDNA 4, so maybe they've decided to focus more on software as well.
They focused on hardware, made a bet and lost. That's not a result of impossibility, but of plain incompetence if you ask me. RDNA3 didn't pay off because it didn't perform as expected - that's all, and only, AMD brainpower at work right there. And it is a recurring issue, every time they approach high(er) end, they misfire. At the same time, because the basis is not good, they don't really get to software either because that's always in support of hardware.

There are also things going well on the AMD side, but for HOW LONG have they been doing chiplets now? Look at the tight series of refinements on the CPU side. The dedication is clear. We don't see this dedication on GPUs; not the forward-thinking approach, not the leapfrogging ahead every time with something good. What AMD has decided is NOT to focus enough on GPUs, and it shows itself not just in software, but also in hardware. I can only draw one conclusion and zero excuses: they're incompetent, and it's not just the marketing department anymore.
 
I'll believe it when I see it in benchmarks
I believe it because I see it in benchmarks, like TPU when it's not a single image chosen, other reputable sites and my own cross gen testing. 4080 to 4080 super, now that's a "refresh".
 
They focused on hardware, made a bet and lost. That's not a result of impossibility, but of plain incompetence if you ask me. RDNA3 didn't pay off because it didn't perform as expected - that's all, and only, AMD brainpower at work right there. And it is a recurring issue, every time they approach high(er) end, they misfire. At the same time, because the basis is not good, they don't really get to software either because that's always in support of hardware.
AMD likes to gamble on experimental tech and lose big time, like they did with Bulldozer. But it's not just that with RDNA 3. The products weren't that bad, imo. The 7800 XT is a great card, but 9 out of 10 people thought it was a bad replacement for the 6800 XT purely by its name, even though it was priced the same as the 6700 XT with 50% more performance, so it's clearly a 6700 XT replacement. It also came too late. The 7700 and 7900 XT were fine, just priced way too close to the 7800 and the XTX to make any sense. Only the 7600 didn't sell, simply because it was too slow.

I believe it because I see it in benchmarks, like TPU when it's not a single image chosen, other reputable sites and my own cross gen testing. 4080 to 4080 super, now that's a "refresh".
Check any GPU benchmark of your choosing. It's all there, in all of them. It would be pointless for me to spam the thread with more graphs showing the same thing.
 
I was planning 5070Ti, but might stretch to 5080.
 
Check any GPU benchmark of your choosing. It's all there, in all of them. It would be pointless for me to spam the thread with more graphs showing the same thing.
I've seen them, and many others, and done my own testing, and they support my claim, so I won't spam the thread either. We end up at 'agree to disagree' quite often, don't we?

In other news, today was a good day for 9070 XT leaks; it seems the performance potential there is perhaps a little better than I personally anticipated, and the first showing of FSR4 at CES shows considerable promise.
 
AMD likes to gamble on experimental tech and lose big time, like they did with Bulldozer. But it's not just that with RDNA 3. The products weren't that bad, imo. The 7800 XT is a great card, but 9 out of 10 people thought it was a bad replacement for the 6800 XT purely by its name, even though it was priced the same as the 6700 XT with 50% more performance, so it's clearly a 6700 XT replacement. It also came too late. The 7700 and 7900 XT were fine, just priced way too close to the 7800 and the XTX to make any sense. Only the 7600 didn't sell, simply because it was too slow.
Do you read yourself?

"The products weren't that bad"... But they also weren't great. There's not a single AMD GPU in RDNA3 that offers more over its direct competitor other than VRAM.

"I view the 7800XT not as successor to the 6800XT"... but AMD did, and the 7900XT was indeed placed in the wrong spot, but at the same time, if you look at AMD's entire stack, it is only just below halo card territory. The 4090 rained on that parade and made the 7900XTX look like 'just another high end offering', and the 7900XT joined as the 'below high end offering'. The bottom line here is that AMD tried to upsell RDNA3 and that, after its somewhat below expectation performance and lacking RT performance, killed the entire proposition. Its really not about what you think, but about how the market responds.

You're just making up excuses here, imho, to not have to admit AMD fucked up one thing after another. Now we're hoping RDNA4 will be different... and the first signs are nothing other than a repeat of what AMD's always done.

Fingers crossed FSR4 is actually good this time.
 
other than VRAM.
The sad state of affairs is exactly that. The only reason AMD has that 10% market share is Nvidia's stinginess with VRAM. Which I've heard a lot of people say "costs nothing". But if that's the case, Nvidia could just remove AMD from the market at little to no cost, which is just sad. Where the heck are the ATI days, huh?
 
While I will always keep an eye on what AI bollocks NVIDIA and AMD throw at us, whatever the real benchmark numbers for the RTX 50 and RX 9070 turn out to be, I don't think my plan of waiting for an RTX 7080 equivalent will change.

I'm more anxiously waiting for the reviews of the new 27" 4K OLEDs, as those are in my immediate plans (and I'm more tolerant of low-ish FPS).
27" and 4K is trash unless you are crazy "retina-fanboy". Scaled you will be ended up with 2K or a lil more, go 32" 4K, don't be stupid like me, I got 4K for "RESOLUTION", then realized at 100% it's eye-destroying or scaling it won't be 4K, then the question is, are you OKAY with $$$ WASTE for marketing SCAM?
 
Already had my upgrade, but ill be skipping the rtx 50 series this time.
 
27" and 4K is trash unless you are crazy "retina-fanboy". Scaled you will be ended up with 2K or a lil more, go 32" 4K, don't be stupid like me, I got 4K for "RESOLUTION", then realized at 100% it's eye-destroying or scaling it won't be 4K, then the question is, are you OKAY with $$$ WASTE for marketing SCAM?
Even 4k 32" needs scaling unless you have some superhuman vision or sit too close. At around ~1 meter distance I cannot use it to read text with 100% scaling.
 