Are people planning an upgrade?

Having a 4090, I don't really need to upgrade my GPU, but I have a 4K 240Hz QD-OLED monitor and would love to be able to play most newer and next-gen games at 120fps+!
I only play single-player games, so I don't need 500fps+ and very low input lag; therefore I might sell my 4090 and get a 5090.
=> But if I were playing multiplayer games I wouldn't upgrade for sure (since FG adds a lot of input lag)
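To put rough numbers on the input-lag point (all values invented for illustration, and the 2x/3x/4x multipliers are just example frame-generation modes): latency tracks the frames you actually render, not the frames you see.

```python
# Hedged sketch: frame generation multiplies displayed frames, but input
# latency still follows the rendered ("base") framerate underneath.

def base_framerate(displayed_fps: float, fg_multiplier: int) -> float:
    """Frames actually rendered per second behind a frame-generated output."""
    return displayed_fps / fg_multiplier

for target in (120, 240):
    for mult in (2, 3, 4):  # hypothetical FG multipliers, for illustration only
        base = base_framerate(target, mult)
        print(f"{target} fps shown with {mult}x FG -> ~{base:.0f} fps rendered "
              f"(~{1000 / base:.1f} ms per real frame)")
```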
 
Which brand of adapter is it? I reckon I can look for it here, since I haven't had much luck finding a mini DP to HDMI 2.1 cable. There is a DP to HDMI 2.1 cable, pricey, but the card comes with those mDP>DP adapters, which will be my last resort if I can't get something more elegant.
I got this one pictured below, shows as currently unavailable, but as you say combined with a certified HDMI 2.1 48 Gbps cable it's relatively elegant and doesn't mess with adding yet more adapters that can break the spec.

EDIT: Looks like this one could be the replacement? https://www.amazon.com.au/Matters-D...le-Adapter/dp/B00HNF0KF0?ref_=ast_sto_dp&th=1


[image: the adapter in question]
 
I liked AMD's TressFX/PureHair. I honestly thought AMD should have kept at it, and I thought it was generally better than HairWorks, although neither company seems interested in that anymore, just FG and upscaling...
Yes, and that's sad, too.

That is only because they are in no position to do otherwise, although the 7900 XT shows they will at least try to be Nvidia... If they did offer a card better than the 5090, you'd better believe it would cost more, and the same all the way down the stack.
That's the thing, though. People bring up the 7900 XT as a scummy product from a scummy company sort of like a "but but" argument to slash back at the "scummy Nvidia" arguments. To me, the 7900 XT is just a badly positioned, badly priced product that no one should choose over the XTX, because they're too close in price and you get an extra 4 GB of VRAM. It's a fine card otherwise; AMD just doesn't know how to position products and incentivise people to buy them. I don't see it in relation to what Nvidia is doing. I see it only in relation to its own price.

The 5600X is a good example: the minute AMD was slightly better, we got the $300 6-core...
That's when people should go "nah, F it" and the market adjusts itself. Just like the 9800X3D which I believe is still out of stock. The number of people ditching their 7800X3Ds for that extra 10% that you don't feel even with a 4090 is astonishing. Not to mention the much higher power consumption which put me off, but people don't care for some reason. When Intel does it, it's bad, but on an X3D chip it's fine?

So yes, the number of blind fans is mind-boggling on all sides. I just wish PC building was still just a cool hobby, and not the basis for tribal warfare on an online forum.
 
I got this one pictured below, shows as currently unavailable, but as you say combined with a certified HDMI 2.1 48 Gbps cable it's relatively elegant and doesn't mess with adding yet more adapters that can break the spec.

EDIT: Looks like this one could be the replacement? https://www.amazon.com.au/Matters-D...le-Adapter/dp/B00HNF0KF0?ref_=ast_sto_dp&th=1



Yeah, no luck. These all seem out of stock. I guess I'll get a 4K 60Hz/2.0 spec cable, it's going to be used with a 32 inch 1080p60 TV later on anyways. I can live with 1080p 120Hz for a few weeks
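For what it's worth, a quick back-of-envelope on why the cable spec matters here (active pixels only, ignoring blanking and link-encoding overhead, so the real requirements sit a bit higher than these lower bounds):

```python
# Hedged sketch: uncompressed RGB video data rates for a few modes discussed
# in this thread. Blanking and link-encoding overhead are ignored.

def data_rate_gbps(width: int, height: int, hz: int, bits_per_channel: int) -> float:
    """Active-pixel data rate in Gbit/s for uncompressed RGB."""
    return width * height * hz * bits_per_channel * 3 / 1e9

print(f"4K  60 Hz,  8 bpc: {data_rate_gbps(3840, 2160, 60, 8):5.1f} Gbps")    # ~12, fits an 18 Gbps HDMI 2.0 link
print(f"4K 120 Hz, 10 bpc: {data_rate_gbps(3840, 2160, 120, 10):5.1f} Gbps")  # ~30, wants HDMI 2.1 / 48 Gbps
print(f"4K 240 Hz, 10 bpc: {data_rate_gbps(3840, 2160, 240, 10):5.1f} Gbps")  # ~60, over 48 Gbps even before overhead, so DSC gets involved
```

Which is roughly why a plain 2.0-spec cable is fine for the 1080p TV use case but not for driving the 240 Hz panel at full tilt.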
 
They always underdeliver and they never innovated anything GPU related, with a single exception (RDNA2, Infinity Cache), in decades, so I'm definitely not. Just plain low-quality copycatting with "but it's open source!" sauce at the very best. Do I need to care about it being open source if it's so garbage I can't even enjoy it in 99% of scenarios? Pricing is also ridiculous. Being 2+ generations behind on everything that's not raw raster performance means the 5 to 10% discount is anything but enough.
They made the first chiplet GPU. You can argue that you don't feel its effects in your games, but it's still more innovation in their hardware than Nvidia did since Turing, which is nothing. All they're giving us is more software gimmicks on the same hardware that has a different codename for some reason. You can choose to buy RDNA 2 refresh-refresh, or Turing refresh-refresh-refresh. Please excuse me if I'm not amazed.

So unless AMD sell their dGPU division to anyone who actually CARES, I'm not buying their GPUs. At least for serious business. Fake frames or not, NV GPUs are just a more finished product. And they age better for the most part.
That's where our opinions differ. Since I don't care about fake frames, AMD's products don't feel any less finished to me than Nvidia's.

I start a game on an Nvidia card, it runs. I start it on an AMD card, it also runs. What more is there to "finish"?

When you compete there is no reason for you to allow your competition benefits of what you invented.
Absolutely. But then, I also have no reason to see it as anything other than an attempt to establish a monopoly which is bad for every consumer, regardless of which side you prefer.

Freedom is great, but when you're free to choose betwixt a steaming pile of shit and a bulldozed piece of frozen urine, does this freedom actually count? Iunno, man.
To you, it's a steaming pile of shit. To me, it's a graphics card that plays my games like any other, just slightly cheaper.
 
I always plan to......the timetable just keeps moving around :D
Same, I call it upgrade limbo. Good fun, you can always get excited about something you might eventually not buy anyway.

After buying, you can't get excited anymore (for a while); it's a great way to keep your money in your wallet.
 
but it's still more innovation in their hardware than Nvidia did since Turing, which is nothing.
Do I need to remind you that Turing, a 6-year-old architecture, is STILL more feature rich than whatever AMD have got in stock? There is zero software that runs on AMD GPUs but doesn't on Turing. However, a lot of prosumer stuff refuses to run (at least tinker-free) on AMD GPUs. Linux drivers might be better on AMD, but it's not a game changer. Sure, even the 2080 Ti is slower than the top dog RDNA2 and RDNA3 products, but it'd be ridiculous if it wasn't.
They made the first chiplet GPU
Sure, it's an innovation. I forgot to mention it. However, why have they abandoned it right away? Seemed silly.
All they're giving us is more software gimmicks on the same hardware that has a different codename for some reason.
They're giving us more and more quality in said gimmicks so they no longer are gimmicks. DLAA (this is NOT upscaling) is a brilliant way to improve your visuals without buying a more powerful monitor (FSR and XeSS are very far behind at that). DLSS (this, however, is upscaling) is a great tool to let your ageing GPU have another couple years to last. RT... can't give any credit to NV in this regard because it's not them who invented this conceptually but at least their hardware does it better than anything else. Also... if you wanna buy the best raster performer it's an NVIDIA GPU. For, like, two decades straight with two brief periods of AMD GPUs being slightly more productive. Sure, expensive but it's the best. The best must be expensive.
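To make the DLAA vs DLSS distinction concrete, here's a minimal sketch; the per-axis scale factors are the commonly quoted approximations, so treat them as illustrative rather than official:

```python
# Hedged sketch: DLAA applies the AI anti-aliasing at native resolution,
# while DLSS renders internally at a lower resolution and upscales.
# Scale factors are approximate, per-axis values.

MODES = {"DLAA": 1.0, "Quality": 0.667, "Balanced": 0.58,
         "Performance": 0.50, "Ultra Performance": 0.333}

def internal_res(width: int, height: int, scale: float) -> tuple[int, int]:
    """Resolution actually rendered before the upscaler runs."""
    return round(width * scale), round(height * scale)

for mode, scale in MODES.items():
    w, h = internal_res(3840, 2160, scale)
    print(f"{mode:>17}: renders {w}x{h}, outputs 3840x2160")
```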
I don't care about fake frames
Me neither.
What more is there to "finish"?
Let's start with the fact you have more options on an NVIDIA GPU. You have better image quality because even DLSS Q is better than anything that FSR/XeSS can do, even without upscaling. You have better RT performance. Your GPU can also do non-gaming stuff. Your driver control panel provides more functionality (oh no, I can't measure FPS with NVCP!) and is still better designed. Despite it effectively being a W98 era dinosaur.

AMD GPUs tank in recent titles because they cannot into RT. NVIDIA GPUs have some wiggle room (unless it's an 8 GB nonsense which I despise).
I also have no reason to see it as anything other than an attempt to establish a monopoly which is bad for every consumer, regardless of which side you prefer.
We got a monopoly precisely because AMD don't do anything like that. We would've had a healthy market if there were features exclusive to Radeons. Killer features that is. Like, imagine Radeon GPUs having some advanced thing that can make models and textures from older titles appear much more up to date on the fly without any meaningful performance impact. Or literally anything that gamers will enjoy, like, idk, TAA artifacts mitigator that makes image quality and stability even better than with DLAA.
They, however, just laze out and copy what NV do but so much worse it's not even worth consideration.
To me, it's a graphics card that plays my games like any other, just slightly cheaper.
I wasn't talking hardware, I was talking software. HW is fine in both parties (with an asterisk: AMD GPUs really lack RT performance). SW is relatively good in NV and it's beyond horrible in AMD.
 
They always underdeliver and they never innovated anything GPU related, with a single exception (RDNA2, Infinity Cache), in decades, so I'm definitely not. Just plain low-quality copycatting with "but it's open source!" sauce at the very best. Do I need to care about it being open source if it's so garbage I can't even enjoy it in 99% of scenarios? Pricing is also ridiculous. Being 2+ generations behind on everything that's not raw raster performance means the 5 to 10% discount is anything but enough.

So unless AMD sell their dGPU division to anyone who actually CARES, I'm not buying their GPUs. At least for serious business. Fake frames or not, NV GPUs are just a more finished product. And they age better for the most part.

Normally, people want something that actually works. Something you'd want to enable. Something you'd look at and say, "yes, this was worth it." Locking features to your hardware means you have a much better shot at making that happen, since you don't need to target hardware built on an architecture you can only guess at, because you've never seen what's under the bonnet. "We wanna do X, so let us implement Y hardware for that, and that's it" vs. "we wanna do X, but the competition never made Y hardware, so it's either to be emulated or to be abandoned, so what do we do, boss?"

When you compete there is no reason for you to allow your competition benefits of what you invented. You have a full moral right to keep it to yourself. That's why it's called competition, not charity. NVIDIA do it right by inventing stuff (usually meaningless, at least if used as intended; but they at least provide new options almost every new generation), improving performance and constantly forcing more and more developers to try their things out, so they get more reviews and more bug reports, which is crucial. AMD look like a bunch of amateurs whose main goal is to look as foolish as possible on major displays.

Freedom is great, but when you're free to choose betwixt a steaming pile of shit and a bulldozed piece of frozen urine, does this freedom actually count? Iunno, man.

AMD should no longer stand for "Advanced Micro Devices," they should change their backronym to "Another MayDay."
Or Always Mocking DGPU
 
People bring up the 7900 XT as a scummy product from a scummy company sort of like a "but but" argument to slash back at the "scummy Nvidia" arguments.
I don't see the 7900XT as scummy at all, that was just a stupid launch price for it. That's all, only the price. From my perch AMD have done several other actually scummy things though, so I don't really count the beans on that one. I just don't see them as any better than the other two; they get away with precisely as much as they possibly can, so I don't let a perceived morality of the 3 companies affect whether I buy from them.
They made the first chiplet GPU. You can argue that you don't feel its effects in your games, but it's still more innovation in their hardware than Nvidia did since Turing, which is nothing
We've disagreed here before and I'll do it again. Turing changed the feature set. Ampere added the dual issue shaders, which aren't often leveraged but they absolutely can be. Ada was mostly a clock pump up, and Blackwell... Remains to be seen if IPC has increased but every subsequent generation doubled the ray triangle intersection performance and increased (not sure if double) tensor performance.

All of those effects can be felt in games when RT runs considerably better gen in gen, or DLSS incurs a smaller frametime cost, all the while rasterisation performance has increased.

I can't stop you calling that a refresh of a refresh of a refresh and "nothing", suffice to say I see it differently and I believe the metrics, performance, benchmarks etc demonstrate that.
 
Do I need to remind you that Turing, a 6-year-old architecture, is STILL more feature rich than whatever AMD have got in stock? There is zero software that runs on AMD GPUs but doesn't on Turing. However, a lot of prosumer stuff refuses to run (at least tinker-free) on AMD GPUs. Linux drivers might be better on AMD, but it's not a game changer. Sure, even the 2080 Ti is slower than the top dog RDNA2 and RDNA3 products, but it'd be ridiculous if it wasn't.
I don't care about said features, and I don't care about prosumer stuff, but I do very much care about Linux. It is a game changer for me as there is not a chance in hell I'm gonna move back to Windows outside of a virtual machine. So can we just agree to disagree? We obviously have vastly different priorities, which is fine by me.

They're giving us more and more quality in said gimmicks so they no longer are gimmicks. DLAA (this is NOT upscaling) is a brilliant way to improve your visuals without buying a more powerful monitor (FSR and XeSS are very far behind at that). DLSS (this, however, is upscaling) is a great tool to let your ageing GPU have another couple years to last. RT... can't give any credit to NV in this regard because it's not them who invented this conceptually but at least their hardware does it better than anything else. Also... if you wanna buy the best raster performer it's an NVIDIA GPU. For, like, two decades straight with two brief periods of AMD GPUs being slightly more productive. Sure, expensive but it's the best. The best must be expensive.
I don't need a more "powerful" monitor, and I don't need DLAA (although I'm not denying that it must be pretty awesome). AMD has VSR if I wanted to use it, but I don't. Plain 3440x1440 looks pretty darn fine to me. Yes, DLSS is great for aging GPUs, but so is FSR which has improved a lot through the years, but I'd still look at a GPU upgrade if I relied on either. I don't need the best GPU, either. Not for 2 grand, thanks.

Let's start with the fact you have more options on an NVIDIA GPU. You have better image quality because even DLSS Q is better than anything that FSR/XeSS can do, even without upscaling. You have better RT performance. Your GPU can also do non-gaming stuff.
There is no difference in image quality without DLSS/FSR/XeSS which I don't care about anyway. I don't care about non-gaming stuff, either (some of which, like BOINC, works on AMD too, by the way).

Your driver control panel provides more functionality (oh no, I can't measure FPS with NVCP!) and is still better designed. Despite it effectively being a W98 era dinosaur.
I totally disagree there. Nvidia doesn't even let you monitor your card, change power settings, over/underclock, over/undervolt, etc. The AMD control panel feels better, too. Nvidia crams everything into the "3D Settings" tab, which is just silly.

AMD GPUs tank in recent titles because they cannot into RT. NVIDIA GPUs have some wiggle room (unless it's an 8 GB nonsense which I despise).
I turn RT off and have fewer shiny puddles. What a terrible disaster! How can I keep on living now? :rolleyes:

We got a monopoly precisely because AMD don't do anything like that. We would've had a healthy market if there were features exclusive to Radeons. Killer features that is. Like, imagine Radeon GPUs having some advanced thing that can make models and textures from older titles appear much more up to date on the fly without any meaningful performance impact. Or literally anything that gamers will enjoy, like, idk, TAA artifacts mitigator that makes image quality and stability even better than with DLAA.
They, however, just laze out and copy what NV do but so much worse it's not even worth consideration.
That I agree with. AMD needs something of its own.

I wasn't talking hardware, I was talking software.
Yep - the thing I don't care about the slightest. As long as my games run, I'm cool.
 
Honestly, I put a lot of thought into it, at the end of the day, I realized that AMD's issues are all self-inflicted. They were complacent, lazy, and at times, even arrogant despite constantly losing ground.

And at each turn, they ended up:
  • Choosing not to implement optional extensions in DirectX 11 and OpenGL, resulting in drivers that, even after they were refactored years later, still fail to support things like Command Lists
  • Choosing not to implement a software DXR solution when it mattered and RT developers were just getting started, handing all the attention and development around the single biggest change in computer graphics since the invention of the programmable pixel shader to their competitors on a silver platter
  • Choosing not to support hardware ray acceleration and the DirectX 12 Ultimate feature set at all until their competitor had a full generational head start
  • Choosing not to integrate dedicated tensor processing capabilities in their hardware, eight years after the first product from the competitor introduced this
  • Choosing not to give the necessary engineering attention to their drivers, instead slapping some new control panel and calling it revitalized when it's still the same crummy garbage underneath
  • Choosing not to support their graphics cards over the years, with a reminder that Nvidia's GTX 900 series is now in its eleventh year of mainline driver support, still receiving targeted fixes
  • Choosing not to spend time on improving their media encoder for the longest time, opening a massive gap between them and their competitors for content creators
  • Choosing not to invest in a creator ecosystem, their only attempt at a prosumer card, the Radeon Vega Frontier Edition, being abandoned and given up on within pretty much a year
  • Choosing not to listen to their own closed beta testers for a large majority of issues brought up
  • Choosing not to submit their drivers for WHQL validation more than once every quarter
  • Choosing not to spend the resources to have day zero driver releases for major game launches, in fact, sometimes skipping entire launch seasons between stable driver releases
  • Choosing not to follow market trends and constantly make their product the minimum bar that developers have no choice but to scale their products to
  • Choosing not to provide documented APIs for their new features, such as the Radeon Anti-Lag issue that was getting players banned, simply because some clown decided that injecting code directly into games would be a good idea instead of writing a properly documented API that developers have control over, like Nvidia Reflex
  • Choosing not to come up with a meaningful answer to CUDA right back from the G80 days, instead opting to halfheartedly fund almost hobbyist-level projects like ZLUDA, which got nuked with a single-line change in the CUDA license terms. To this day, they still haven't managed to properly ship ROCm, if that ever catches up in the user space. Good thing their chips at least crunch AI decently enough for someone to be interested
I could go on and on; I'm sure there are some points I missed. All of these were CHOICES. I'm just gonna be real: the bill is almost due, and UDNA will be their Zen moment. They cannot afford to flop it, and at the same time they can no longer afford to take a meek and conservative approach. If it fails and Intel plays their cards right with Arc Celestial, Radeon will have no future beyond being a third-rate iGPU solution for Ryzen chips. A most unfitting end to ATI's legacy. Yet many would still question why the competition's market share is 90% and rising, while getting genuinely angry that people somehow aren't enthusiastic about AMD graphics hardware anymore.
 
I don't see the 7900XT as scummy at all, that was just a stupid launch price for it. That's all, only the price. From my perch AMD have done several other actually scummy things though, so I don't really count the beans on that one. I just don't see them as any better than the other two; they get away with precisely as much as they possibly can, so I don't let a perceived morality of the 3 companies affect whether I buy from them.
That's the healthy attitude we need more of, imo. Yet, here we are arguing about who wants what from a GPU. Some people want DLSS. I don't. Can't we all be right in our own way? Wouldn't the world be a much more boring (and much more expensive) place if we all wanted the same? :)

every subsequent generation doubled the ray triangle intersection performance
All of those effects can be felt in games when RT runs considerably better gen in gen,
Really?
[chart: Cyberpunk 2077 RT performance, 2560x1440]


I can't stop you calling that a refresh of a refresh of a refresh and "nothing", suffice to say I see it differently and I believe the metrics, performance, benchmarks etc demonstrate that.
All the metrics demonstrate is that you have more performance if you increase core count, clocks and power consumption. Exactly what you see on AMD, too.
 
All the metrics demonstrate is that you have more performance if you increase core count, clocks and power consumption. Exactly what you see on AMD, too.
Techpowerup RT charts have an odd way of showing it with the -x%, but yes, RT effects enabled take a smaller hit gen on gen with the same amount of RT 'cores'. Ray triangle intersection performance doubling doesn't necessarily mean double the resultant FPS with RT on in every single title, as there's more to it than that, but it's demonstrable that RT and Tensor performance increases.
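As a toy illustration of that non-linearity (all numbers invented): doubling ray/triangle intersection throughput only shrinks the RT slice of the frame time, so FPS goes up by much less than 2x.

```python
# Hedged sketch with made-up frame-time numbers: only the RT portion of the
# frame benefits from faster intersection hardware.

def fps(raster_ms: float, rt_ms: float) -> float:
    """Frames per second for a frame split into raster and RT work."""
    return 1000 / (raster_ms + rt_ms)

raster_ms, rt_ms = 8.0, 6.0          # hypothetical frame-time split
old = fps(raster_ms, rt_ms)          # ~71 fps
new = fps(raster_ms, rt_ms / 2)      # RT work done twice as fast -> ~91 fps
print(f"{old:.0f} fps -> {new:.0f} fps (+{new / old - 1:.0%}), nowhere near +100%")
```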
 
Techpowerup RT charts have an odd way of showing it with the -x%, but yes, RT effects enabled take a smaller hit gen on gen with the same amount of RT 'cores'. Ray triangle intersection performance doubling doesn't necessarily mean double the resultant FPS with RT on in every single title, as there's more to it than that, but it's demonstrable that RT and Tensor performance increases.
RT perf. / non-RT perf. * 100 = % of performance retained, and 100 minus that is the performance lost. What's so odd about this?
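Spelled out with invented fps numbers, that metric is just:

```python
# Hedged one-liner version of the metric above, using made-up numbers.
def rt_hit(raster_fps: float, rt_fps: float) -> float:
    """Percent of performance lost when RT is enabled."""
    return (1 - rt_fps / raster_fps) * 100

print(f"{rt_hit(100, 55):.0f}% lost")  # e.g. 100 fps raster vs 55 fps with RT -> 45%
```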

You can hold a seminar on ray intersections with a hundred slides, but if I see no real-world improvement in the performance loss with RT enabled, I won't care. With this logic, we could also discuss AMD's technical improvements generation-per-generation, because there's plenty, but what does it matter, honestly? Don't we all buy these cards to play games?
 
Or Always Mocking DGPU
Absolutely Mediocre Development.
We obviously have vastly different priorities, which is fine by me.
This is irrelevant. My wants are irrelevant; I'm not the defining customer. Neither are you. I'm talking strictly about the general arsenal. And it's much more polished in Team Green.
FSR which has improved a lot through the years
Less than DLSS did in the span of the last year. xD
Nvidia doesn't even let you monitor your card, change power settings, over/underclock, over/undervolt, etc.
Sure, but you can calibrate the software-side behaviour MUCH finer there. This "3D Settings" tab alone is more advanced than the whole Adrenalin software combined. You can do many more DX/VK/OpenGL overrides, and they work with far fewer caveats. DSR, for example, is more powerful, at least because you can disable unwanted modes in the driver panel so your in-game resolution lists are less bloated.
I turn RT off
You don't. RT is baked into games. Welcome to the future. You can lower its intensity, but disabling it altogether just disables the game.
As long as my games run, I'm cool.
...until you witness how they're meant to run. But no, you'll buy an AMD GPU once again just in plain spite.

I bought my 6700 XT to see improvements. There were. Over the previous generations. But NVIDIA GPUs still feel much better. Even "outdated" ones such as 2080 Ti.

Freak it, I'll jump the gun and get me a 5070. Same 12 GB but a whole lot of additional horsepower and features.
 
Absolutely Mediocre Development.

I prefer Always More Disappointments myself :nutkick:

Why are we so hung up on accepting forward technologies? We are supposed to be geeks, seeking excellence, adopting and embracing new technologies as they come.
 
FSR4 is looking promising. Just one game though...

 
You don't. RT is baked into games. Welcome to the future. You can lower its intensity, but disabling it altogether just disables the game.
I haven't played a single game with RT baked into it.

For the rest of your post: who are you trying to convince? Me? Or yourself? I told you what I want from a GPU, and that's that. What the general public wants doesn't bother me the slightest.

This is the same crap that made me not want to disclose details about my upgrade plans earlier... I knew some butthurt Nvidia fan would start a crusade to prove a point that no one needs proving.
 
I've been slowly gathering the parts for a new build. Have everything but CPU (still trying to get a 9800x3d at retail) and GPU (5090 if they don't get sold out and scalped right off the bat). All I have is expensive paperweights sitting in the corner until then.
 
Why are we so hung up on accepting forward technologies? We are supposed to be geeks, seeking excellence, adopting and embracing new technologies as they come.
As someone who calls himself a geek, I'm curious about every new piece of technology, but I'm not going to adopt anything that I don't see fit or necessary for my own personal use. An interest in technology and unquestioning obedience are not the same thing (they're quite the opposite, in fact). Being geeks doesn't mean that there's only one way to love technology, or that we've given up on critical thinking, right?
 
Sticking with 4090 till 6xxx series.

Probably in the same boat, especially if raw performance isn't very impressive. MFG doesn't get me even a little hard....

Beware!


Yeah, I'm getting a 5090

5090 and Optimus block.

Yeah, count me in. Just waiting for some benchmarks, and for the dust to settle on all the hype so the price drops a little from launch prices.

Will definitely be putting it on water so will have to have a look see at what blocks will be available for any given partner card.
 
You can hold a seminar on ray intersections with a hundred slides, but if I see no real-world improvement in the performance loss with RT enabled, I won't care.
Performance increasing at every wattage tier every generation so far (50 series, we'll see), and the before-and-after-RT performance hit shrinking each generation, show it fairly well.

As for the metric, it can obfuscate outright gen-on-gen gains because of the non-linearity of the impact of adding RT. My example from yesterday here https://www.techpowerup.com/forums/...5090-5080-and-5070.330664/page-3#post-5412976

My point being, Nvidia are not lying that per RT core performance has changed every generation, and with the other things I pointed out, I call it more than a refresh (my bar for a refresh would be Intel's 4 core stagnation with minuscule improvements gen after gen), and assert they continue to innovate. We can agree to disagree, but that's my 2c.
 
Beware!





Yeah, count me in. Just waiting for some benchmarks, and for the dust to settle on all the hype so the price drops a little from launch prices.

Will definitely be putting it on water so will have to have a look see at what blocks will be available for any given partner card.

Yeah, I'll be passing some of my stuff on, and once I have enough cash to pay for at least half of it upfront, I'll be grabbing a ROG Astral. It will mean a hefty downgrade for some time, but I am surely gonna live :laugh:
 
My point being, Nvidia are not lying that per RT core performance has changed every generation, and with the other things I pointed out, I deny they've changed "nothing".
I'll believe it when I see it in benchmarks. So far, I don't. What I see is that RT affects performance the same way on Ada as it did on Ampere and Turing. See for yourself in any review here on TPU.
 