The midrange was weak this gen, which makes your situation rough. The ~$400 CAD cards like the 4060 and RX 7600 are basically sidegrades from a 2070 Super.
If you want new, then you'll most likely have to wait for the RTX 5060 and RX 9600/9600 XT to release, but even then you're crossing your fingers that those cards can offer at least 3070/6800 (non-XT) level performance, which is the minimum I'd be looking for coming from a 2070 Super.
The 3070 Ti, 3080, or 4070 would offer a nice boost in performance, if I can find one at a reasonable price. It's that damn price-to-performance ratio that keeps me from pulling the trigger. A mid-range GPU should NOT cost the same as an entire entry-level system, for Christ's sake!
Happy I bit the bullet on the 4080 Super. The 5000 series seems 'meh', as I gambled it would be: the rumoured VRAM boost never really surfaced, and most of the gains come down to better AI (multi frame gen).
I'll still get the better DLSS/DLAA, so I'm happy.
For what it's worth, my prediction is we're going to see another delayed price drop; I don't think these will sell that well compared to older gens. So I wouldn't buy now, and I'd hold off if your plan is to get a 5000 series card.
Yeah, saw that. IF it's not a single-title scenario, the advance from 3.1 to 4.0 is really jaw-dropping. I'll hold my beer until the real launch and reviews.
There is also a third-party 4x frame gen: a YouTuber had access to a beta version of a tool (sorry, I already forgot the name) and used multi frame gen on a 4070 Ti Super. He tested it on Cyberpunk 2077.
Just remembered: it's this tool. I don't know if it's publicly available yet, though.
Even 4K at 32" needs scaling unless you have some superhuman vision or sit uncomfortably close. At around ~1 meter viewing distance I can't use it to read text at 100% scaling.
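For context, the pixel-density math backs this up; a quick sketch (the ~96 PPI figure is the traditional desktop 100% baseline):

```python
import math

def ppi(diag_inches: float, w_px: int = 3840, h_px: int = 2160) -> float:
    """Pixels per inch of a panel with the given diagonal and resolution."""
    return math.hypot(w_px, h_px) / diag_inches

for size in (27, 32):
    print(f'{size}" 4K: {ppi(size):.0f} PPI')
# 27" 4K: 163 PPI
# 32" 4K: 138 PPI
# Desktop UIs at 100% scaling are designed around ~96 PPI, so text on
# either panel renders well under its intended physical size.
```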
Yeah. I sat close to my 27" 4K, mostly because of a small desk where the keyboard and mouse also have to live, lol. I could see at 100%, but it was eye pain for anything except pics/vids.
Do YOU have to throw that reaction at MY plan/decision to go 27" 4K OLED when I already have a 27" 4K IPS that's been doing fine without scaling/resolution issues for a year? Really? REALLY? What the hell?
I'm not getting into the team debate this time, and somehow a bullet still flies my way.
You don't have to be such a dick about my otherwise rational choice. I'm not pairing a 4090 with a Pentium, for example.
"Calm down"? You calm down. Now I have to calm down.
There's significant market demand for 27" 4K displays, OLED or not. Yet market demand hasn't stopped any of the "YOU REALLY WANT 27" 4K?" talk. Recently, almost every time I say (not only on TPU) "I want a 27" 4K OLED, this is what I have been waiting for!", I have to spend a lot of effort justifying it and covering my backside, like in a team red/green/blue debate, when I'm already using one, just not in OLED. I thought I could get away with it this time. And I don't want to throw a wall of text at questions every time I mention 27" 4K. *sigh*
You found out that you can't get used to 27" 4K, or whatever. I'm using one, I'm completely fine with it, and I want a better version. We are very different people. I didn't say anything about you for that exact reason. It is not a scam when a sizable portion of the market wants these and is fine with using them. You don't have to laugh that hard.
Uh, UI scaling in the OS and game resolution are two different things. Scaling doesn't make games look worse: the games render at 4K no matter what funny UI scaling setting the OS is on. Just sayin'.
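To illustrate the distinction, a minimal Windows-only sketch: a DPI-unaware process is shown the virtualized (scaled) desktop size, while a DPI-aware one, like a game rendering at native 4K, sees the physical pixel grid:

```python
import ctypes

user32 = ctypes.windll.user32

# A DPI-unaware process is shown a virtualized (scaled-down) desktop;
# this is the layer OS scaling operates on.
print("virtualized:", user32.GetSystemMetrics(0), "x", user32.GetSystemMetrics(1))

# After declaring DPI awareness, the same calls report the physical
# pixel grid, which is what a game rendering at native 4K targets
# regardless of the desktop scaling percentage.
user32.SetProcessDPIAware()
print("physical:   ", user32.GetSystemMetrics(0), "x", user32.GetSystemMetrics(1))
```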
edit: moved paragraphs around, reworded a lot, and shoved another wall of text into a spoiler box, trying to make this post more structured and less annoying apart from this part. I know I'm a vocal minority here just for waiting on a fairly niche product so I can get out of a clunky setup and have a very noticeably better experience. The last time I tried 100% scaling didn't end well, but at this point I'll probably try it again for a week just for shits and giggles, just to spite some forum members here and there, and/or collect more weirdo points.
TLDR: I know what a 27" 4K looks like. I know what a 27" 1440p QDOLED looks like. When they fuse together, the result will be my ideal display.
I had plans for a 27" 4K before I moved to my current place 1.5 yrs ago (the old place could only fit 24"; the desk and shelf didn't allow 27" for height reasons). And just after I got this part of my clunky plan done, thinking this clunky setup was the best I could do for the time being and getting used to it, gaming on OLED became a thing. I learnt there would be gaming OLED options at 27" 4K in the near future, and my dream got bigger. The 1080p displays at its side are now a stopgap solution.
The new place still has physical space limitations (double 32" would be dodgy on my desk, and no, there's no space for a bigger desk). If not for that, I might have gotten a 32" 4K QDOLED last year. But as I said, I have got used to a 27" 4K IPS.
It isn't horrible; it's just that its black level looks really bad next to the AOC 24G2 beside it. And I got a taste of QDOLED from my brother.
I have been staring at his 27" 1440p QDOLED for some time, and I'm sure I can't go back to 1440p. My ideal display would be a 27" 4K QDOLED with a high enough refresh rate, which has been "rumoured" for quite some time. I've been waiting for CES 2025 just for confirmation that they will launch, and I can't wait to throw my hard-earned money at one of them.
Look, yes, I'm running 150% scaling on my 4K to roughly match the 1080p displays, and I expect to run 125% scaling when I get my hands on whatever new display for a double 27" 4K setup. That's still more workspace than 1440p, with crisper text on top (even if I end up not okay with 125%). And even an (admittedly wasteful) 200% scaling would still look better than 1080p at a similar size.
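For the scaling math, a quick sketch of the effective workspace at each factor:

```python
# Effective desktop workspace on a 3840x2160 panel at common scaling factors.
NATIVE_W, NATIVE_H = 3840, 2160

for scale in (1.00, 1.25, 1.50, 2.00):
    w, h = round(NATIVE_W / scale), round(NATIVE_H / scale)
    print(f"{scale:.0%} scaling -> {w} x {h} effective workspace")

# 100% -> 3840 x 2160
# 125% -> 3072 x 1728  (still more room than native 1440p)
# 150% -> 2560 x 1440  (1440p-sized workspace, but with crisper text)
# 200% -> 1920 x 1080  (1080p layout, each point drawn from 4 physical pixels)
```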
Side note, just to say that I'm really okay with small text:
At the day job I got ~5 yrs ago, whenever I touch an Excel file with default font settings and the sheet gets wide, I always set the zoom to 70%. I'd love 55% to be usable, just to fit more on screen (not going to happen on generic office 1080p displays). I'd also love a secondary monitor for my office desk, preferably anything above 1080p (also not going to happen, because corporate). I use the office monitor at arm's length.
I've used an iPhone 13 mini as my daily since its launch day. That probably needs even more justification, and it's another can of worms someone opened for me IRL that I don't want to reopen here. EDIT: yes, I know, that makes me a weirdo. Good thing I didn't ask anyone on any forum when the 13 mini launched. TLDR: the 13 mini is the closest in size to the first properly responsive smartphone I ever used, the iPhone 5C.
I'm fairly short-sighted in many ways, but my eyesight hasn't got any worse in at least 10 yrs, despite all the small-text reading and gaming my parents understandably cautioned me about.
I'm really skeptical about the new GPU releases and GPUs in general. On one hand you have NV with its high prices, whatever NV says about them and the tech on offer. On the other hand you have AMD, which may be lower in price but still doesn't offer much of a performance bump. Intel, well, Intel is still waiting for its opportunity. I must say, it will be hard for me to upgrade my graphics. Considering I don't play much nowadays, and when I do, my 6900 XT fully keeps up with the games I play, I might skip this gen as well.
I am trying not to get my hopes up too much, but over the past few days the information surfacing has gotten better and better. In the other thread I summed up my personal thoughts as below; it didn't get any engagement, perhaps because I also went against the tide on RT opinions in that thread.
Personally, this is shaping up to be ever more appetising as my upgrade path. If it really is:
A raster match~ish for a 7900XTX or 4080/S
RT performance that is generationally ahead of Ampere
FSR4 (or what they directly called a research project) is as good as what we saw at their booth in all or at least most games (not just fine-tuned for one title), and is adopted widely or can be substituted in place of 3.1, as has been rumoured
Some AIB cards have 2x HDMI 2.1
And of course, priced to party...
Well then, I'm going to have a hard time justifying to myself paying a bare minimum of $1519 AUD for a 5070 Ti or better.
I dunno, I dunno if I'm being overly petty here. Even if you didn't know I'm using one, I can't digest why I... or monitors of that form factor got hit with such a, erm, rude take.
There's a wall between "waste-of-money scammy trash" and "top-of-the-line product in a niche form factor with a price tag to match". And maybe I'm crazy, but in this specific case I know well enough what I'm gaining from the product. I'm committed to the plan, and I don't need someone else trashing it.
It's one of the things I hear ad nauseam, that's for sure. I genuinely *want* to main a current-gen Radeon again (while avoiding letting it consume me like the people who say those things, and the rest of the typical diatribe I'm tired and bored of), so it'd be neat if they could just recapture that lightning in a bottle they've managed a few times now.
Tbf, what I think is happening is the exact opposite. People cry about Nvidia because they want its cards to be cheaper so they can buy AMD. They know damn well that if the 5070 were $999, the 9070 XT would be $949.
This is happening on both sides. The world is going mad; everyone is polarised for some reason. People who are genuinely interested in technology (or anything, really) regardless of branding are a dying species.
It's not about DLSS being a thousand times better. It's that FSR is much worse, much later to the party, and much harder to update for the end user. With DLSS, you just download a new .dll and you're rocking it. With FSR, you're mostly out of luck, because only the game's developers know where they buried it.
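As a sketch of how simple the swap is (the paths below are placeholders; TechPowerUp hosts an archive of DLSS DLL versions, and backing up the original is assumed good practice):

```python
import shutil
from pathlib import Path

# Hypothetical paths: adjust for your game install and the newer
# nvngx_dlss.dll you grabbed (e.g. from TechPowerUp's DLSS archive).
game_dir = Path(r"C:\Games\SomeGame")           # placeholder install dir
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")  # placeholder newer DLL

old_dll = next(game_dir.rglob("nvngx_dlss.dll"))                 # locate the game's copy
shutil.copy2(old_dll, old_dll.with_name(old_dll.name + ".bak"))  # back it up first
shutil.copy2(new_dll, old_dll)                                   # drop in the new version
print(f"Replaced {old_dll}")
```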
Your standards are probably extremely low. I have limited attention because of head injuries, and the image tends to blur on its own in motion. Even so, I see A WHOLE LOT of difference: DLSS is mostly okay-ish in 70+% of games and considerably bad in the rest, while with FSR there is exactly one game where it didn't make me want to throw my monitor out the window. That's it. Total massacre. FSR4, however, proved to be much better; but still, is it really worth buying a brand-new video card..?
The 290X was released 11 years ago and is no longer supported; last drivers: mid '22. That's a resounding almost 9 years of support. Massive TGP. Extreme size.
The 970 was released 10 years ago. It still receives updates. HALF the power consumption. Overclocks better.
The performance difference is insignificant. The GTX 970 was cheaper, and today's price difference is insignificant.
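Putting rough numbers on the power point (reference board power: 290 W for the 290X, 145 W for the 970), taking the post's premise of near-equal performance:

```python
# Reference board power: R9 290X vs GTX 970, with the post's premise
# that the two cards perform about the same.
r9_290x_watts, gtx_970_watts = 290, 145

print(f"Power ratio: {r9_290x_watts / gtx_970_watts:.1f}x")  # -> 2.0x
# Equal performance at half the power is exactly double the
# perf-per-watt, i.e. the "HALF the power consumption" point above.
```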
Literally any RTX GPU makes fun of AMD GPUs in RT titles, which are becoming more and more of a thing. The only exception is 8 GB GPUs; those suck. Even if they're NVIDIA, they still suck. But the 3080 Ti aged leagues better than the RX 6900 XT did, and they cost about the same.
Well sure, but for example, the 5080 is criticized for being an anemic improvement over the 4080, which was itself criticized for being anemic and "should have been a 4070" and what have you. And yet, after all these consecutive years of anemic this and anemic that, the 5080 will still be faster than any non-Nvidia card in both RT and raster.
If Intel and AMD can't hold a candle to a series of anemic Nvidia releases, I'd argue we should be shaming them more so than Nvidia. Based on the SM count, people are calling the 5080 a 5060 Ti / 5070. Imagine what AMD and Intel are feeding us when they can't catch up to a 5060 Ti.
The whole GPU industry came to a standstill, everybody is focusing on software and AI. I don't think anyone is trying too hard to dethrone the 40-series, not even Nvidia. Like someone said before (here or in another thread, I can't remember), we're being fed scraps because the big money is in AI now.
Edit: This just makes it look even more ridiculous to fight over brand loyalty.
Yeah, but wouldn't you expect the underdogs to be the ones going all out? Why would Nvidia even try? They have the fastest cards and the best sales; it's everyone else that needs to try twice as hard. Yet here we are, expecting the market leader to innovate and push performance to the next level. Why? They didn't even have to release Blackwell at all.
They're obviously not trying if all they have is "more fake frames" and no data on raw performance. The 5070, for example, has almost the same core count as the 4070. I doubt it'll be much faster in pure raster; 4070 Super level, perhaps.
The only tier they're meaningfully upgrading is the x90, because that's where people seriously interested in AI are shopping, and like I said, AI is where the big money is.
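A back-of-the-envelope check on that core-count point; published specs put the 4070 at 5888 CUDA cores and the 5070 at 6144, and treating raster as linear in cores x clock is a simplifying assumption, not a benchmark:

```python
# Naive raster estimate from shader count x boost clock.
# Published specs: RTX 4070 = 5888 CUDA cores @ ~2.48 GHz boost,
# RTX 5070 = 6144 CUDA cores @ ~2.51 GHz boost. Linear scaling in
# (cores x clock) is a simplifying assumption.
rtx_4070 = 5888 * 2.48
rtx_5070 = 6144 * 2.51

print(f"Naive uplift: {rtx_5070 / rtx_4070 - 1:+.1%}")  # -> +5.6%
# Even allowing extra headroom for GDDR7 bandwidth, that lands
# roughly in 4070 Super territory, as guessed above.
```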
My 3070 still gets the job done. It manages 1080p60 in all but a few AAA titles, often 1440p60 with settings tweaks (and I actually enjoy fucking around with settings), and 4K60 is attainable in every AA/indie title I've played.
The only game I couldn't enjoy on this card was CP2077. I think Ultra RT is needed to give the game its proper ambience, but late-game/Phantom Liberty exhausts the memory and causes severe stutters when exiting menus. Every other game is still fun on a 3070! Even the Hogwarts Legacy and Resident Evil games that HWUB uses to bludgeon 8 GB cards were fine for me.
My past 30 years of upgrades:
Voodoo1 -> Riva TNT2 -> GeForce3 Ti200 -> 6600 GT -> 9600 GSO -> HD 5850 -> GTX 970 -> RTX 3070.
Same thing every time: +200% performance when upgrading to the next midrange model. (Out of curiosity I even checked the pre-TPU cards: TNT2 -> GF3 was also +200% in the one benchmark I could find.)
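Just for fun, that "+200% per step" compounds like this (a trivial sketch, taking each step as exactly 3x):

```python
# Each upgrade above is stated as +200%, i.e. 3x the previous card.
chain = ["Voodoo1", "Riva TNT2", "GeForce3 Ti200", "6600 GT",
         "9600 GSO", "HD 5850", "GTX 970", "RTX 3070"]

steps = len(chain) - 1          # 7 upgrade steps
total = 3 ** steps              # compounded multiplier
print(f"{steps} steps of 3x each -> ~{total}x a Voodoo1")  # -> ~2187x
```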
If you buy Nvidia, then you're an Nvidia fanboy, because Nvidia is evil incarnate. If you buy AMD, then you're an AMD fanboy because AMD is pure shit. The only way out of this mess is an Arc B580, which came straight from heaven, but I'm sure people will pick a fight with Intel, too, sooner or later.
To be fair, when you're driving on the highway and see cars coming straight at you, and you start shouting "hey, why are these idiots all going the wrong way", odds are (high ones) that you're the one going the wrong way.
Buying the product everyone buys hardly makes anyone a fanboy, but buying the product no one cares about really does.
The itch to upgrade is there, but with my 13700K + MSI Z690 Force + 7900 XTX it would cost a lot to replace everything for no practical benefit. Not too happy with my 6000 @ CL40 memory, which doesn't even run at that; the best I could do is 5800 @ CL36, meh.
The 5090 looks interesting, of course, but I'd have to replace my 850 W PSU, and down the rabbit hole it goes.
Guess I'm on the sidelines waiting for the next generation.
The 9070XT looks very appealing as an upgrade path for 6xxx series owners