Genuinely curious, I thought from other threads that you'd already decided to get the 9070XT anyway?
I'll be upgrading within the next 3 months almost certainly, either a 9070XT, a 5070 Ti, or a higher 50-series card - unless they're all so disappointing and/or priced so crazily locally that I decide to wait.
It won't be; that's with Multi Frame Generation in 3x mode compared to the 4090's regular 1x FG.
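To put rough numbers on that (just a sketch, not benchmarks; the 30 fps base and reading "Nx" as N generated frames per rendered frame are my assumptions):

```python
# Why comparing multi frame gen against regular frame gen inflates the
# apparent gain. Assumption: "Nx" means N AI-generated frames per rendered
# frame, so displayed fps = rendered fps * (1 + N). 30 fps is illustrative.

def displayed_fps(rendered_fps: float, generated_per_rendered: int) -> float:
    """Displayed frame rate with N generated frames per rendered frame."""
    return rendered_fps * (1 + generated_per_rendered)

base = 30.0  # the same hypothetical rendered frame rate on both cards
print(displayed_fps(base, 3))  # multi FG "3x"   -> 120.0 fps displayed
print(displayed_fps(base, 1))  # regular FG "1x" ->  60.0 fps displayed
# Identical rendering performance, yet the multi-FG figure looks 2x higher.
```

Same rendered frames, double the displayed number; that's the whole trick behind that kind of marketing comparison.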
Even 4k 32" needs scaling unless you have some superhuman vision or sit too close. At around ~1 meter distance I cannot use it to read text with 100% scaling.
Absolutely, but not RTX 50. No doubt the pricing will be twice MSRP and will stay there for another 6 to 12 months, as the trend goes here, so I'm not bothering to scratch my head over the new tech. The lucky part is that the RTX 50 launch should finally drop RTX 40 series prices, making it a good window of opportunity to get an RTX 40 GPU, possibly a 4070 Ti/Super, and finally let go of my RTX 3060 Ti for good. For my needs, and some productivity work of course, it would serve me comfortably for the next two years.
A 5080 if I can get one on release day; my second choice would be a 5070 Ti. I have a new computer built, everything installed, up and running on the iGPU, but the card I've been using for the last 10 years is so old it's eligible for an old-age pension cheque. 4080s are in short supply here, and I may as well buy the latest and so-called greatest in a few weeks. In the spring it will be a new monitor; I'm not sure why, but I have never been able to get the brightness, contrast, and colour right on this widescreen. My old BenQ, when it was alive and well, was much, much better.
"The products weren't that bad"... But they also weren't great. There's not a single AMD GPU in RDNA3 that offers more over its direct competitor other than VRAM.
"I view the 7800XT not as successor to the 6800XT"... but AMD did, and the 7900XT was indeed placed in the wrong spot, but at the same time, if you look at AMD's entire stack, it is only just below halo card territory. The 4090 rained on that parade and made the 7900XTX look like 'just another high end offering', and the 7900XT joined as the 'below high end offering'. The bottom line here is that AMD tried to upsell RDNA3 and that, after its somewhat below expectation performance and lacking RT performance, killed the entire proposition. Its really not about what you think, but about how the market responds.
You're just making up excuses here, imho, so you don't have to admit AMD fucked up one thing after another. Now we're hoping RDNA4 will be different... and the first signs are nothing but a repeat of what AMD's always done.
Looks good, which makes it even stranger that they didn't show it in the keynote.
However, if they plan to implement it only for RDNA4, then it's the same as or worse than what Nvidia does. What will the 96 AI accelerators written on my box do? At least RTX 20-40 owners get the new DLSS image-quality improvements; FSR4 not working on RDNA3 is a shot in the knee.
As for upgrading, I might sell my XTX and buy a 5080. I'm not expecting huge performance improvements, but I like switching between AMD and Nvidia. I had a 3080 before the 7900XTX; maybe it's time to go back to team green.
I'm not shopping for anything, but I'm curious how the 9070XT will pan out. If it ends up being close to the 7900XT in raster with meaningfully faster RT, it could be a great option at $400-450.
And honestly, I've kinda lost interest in upgrading. My main rig is already overpowered for my needs, and I can have just as much fun tinkering with one of my retro systems. I find myself spending far more time on my 13-year-old backup PC playing indie games than testing new releases anyway.
The ATI days were not better. The driver support was horrible; third-party drivers (the Omega drivers) were somewhat better. People can complain about AMD's drivers today, but they're way, way better than in the ATI days. (IMO)
I've installed and overclocked many AMD cards in the past couple of years, from a 6400 all the way up to a 6800, on W10 and W11. No issues; smooth installs and really decent benchmark scores.
The last card I benched on modern hardware was an ATI X850 XT. It was a pain to get a good working driver installed on W10. It was worth like 6 submissions, DX9 only: 3DMark99, 2000, 01, 03, 05 and 06, plus Aquamark. That's all those cards are good for. Essentially irrelevant; the Omega drivers were better than ATI's. Today, we don't have third-party driver choices....
I don't know, but the issues I have aren't anything rare; just googling them will get you thousands of results. E.g., having to turn off Windows Update or disconnect from the internet to install the drivers so they aren't gone after a restart.
Even 4k 32" needs scaling unless you have some superhuman vision or sit too close. At around ~1 meter distance I cannot use it to read text with 100% scaling.
For me personally, I don't see much reason to upgrade unless I get a great deal, as I don't see any games I want to play in 2025 that need demanding hardware:
Path of Exile 2
Avowed
Solasta 2
Outer Worlds 2
Next Diablo IV expansion
With the impending release of new-generation Nvidia, Intel, and AMD graphics cards, plus new AMD and Intel CPUs and APUs, is anyone thinking of upgrading their rig? Maybe an upgrade to older-generation parts, which might now come down in price? This is not a question about new builds.
Which would put it slightly below the 7900XTX in raster (in Time Spy) and on the same level as the XTX in RT (Speed Way), with far fewer WGPs.
There was also a KitGuru video from the PowerColor booth at CES saying the drivers AIBs got for testing only allow full performance in FurMark (for thermal testing), while 3DMark performance is cut by 20%. The question is whether those performance numbers come from those drivers or unlocked ones. Time will tell; supposedly the launch will be on 22.01.
Now let's see FSR4 actually run and be available everywhere too, just like FSR in all of its renditions across various games. I think AMD isn't quite there yet; games often ship with older versions.
27" and 4K is trash unless you are crazy "retina-fanboy". Scaled you will be ended up with 2K or a lil more, go 32" 4K, don't be stupid like me, I got 4K for "RESOLUTION", then realized at 100% it's eye-destroying or scaling it won't be 4K, then the question is, are you OKAY with $$$ WASTE for marketing SCAM?
Do YOU have to throw that reaction at MY plan/decision to go 27" 4K OLED when I already have a 27" 4K IPS that's been doing fine without scaling or resolution issues for a year? Really? REALLY? What the hell?
I'm not even getting into a team discussion this time, and somehow a bullet still comes flying at me.
You don't have to be that much of a dick about my otherwise rational choice. I'm not pairing a 4090 with a Pentium, for example.
"Calm down"? You calm down. Now I'm the one who has to calm down.
There's significant market demand for 27" 4K displays, OLED or not. Yet that demand doesn't stop any of the "YOU REALLY WANT 27" 4K?" talk. Recently, almost every time I say (and not only on TPU) "I want a 27" 4K OLED, this is what I've been waiting for!", I have to put in a lot of effort to justify it and cover my backside, like in a team red/green/blue debate, even though I'm already using one, just not an OLED. I thought I could get away with it this time, and I don't want to throw a wall of text at the questions every time I mention 27" 4K. *sigh*
You found out that you can't get used to 27" 4K, or whatever. I'm using one, I'm completely fine with it, and I want a better version. We are very different people; I didn't say anything about you for that reason alone. It's not a scam when a sizable portion of the market wants these displays and is fine with using them. You don't have to laugh that hard.
Uh, UI scaling in the OS and game resolution are two different things. It doesn't make games look worse; games render at 4K no matter what UI scaling setting the OS uses. Just saying.
edit: moved paragraphs around, reworded a lot, and shoved another wall of text into a spoiler box, trying to make this post more structured and less annoying, apart from this part. I know I'm a vocal minority here, just waiting for a slightly niche product so I can get out of a clunky setup and have a very noticeably better experience. The last time I tried 100% scaling it didn't end well, but at this point I'll probably try it again for a week, just for shits and giggles, to spite some forum members here and there and/or collect more weirdo points.
TLDR: I know what a 27" 4K looks like. I know what a 27" 1440p QD-OLED looks like. When they fuse together, the result will be my ideal display.
I had plans for a 27" 4K before I moved to my current place 1.5 years ago (the old place could only take a 24"; the desk and shelf didn't allow a 27" for height reasons). Just after I got this part of my clunky plan done, thinking this clunky setup was the best I could do for the time being, and got used to it, gaming on OLED became a thing. I learnt that there would be 27" 4K gaming OLED options in the near future, and my dream got bigger. The 1080p displays at its side are now a stopgap solution.
The new space still has physical limitations (dual 32" would be dodgy on my desk, and no, there's no room for a bigger desk). If not for that, I might have gotten a 32" 4K QD-OLED last year. But as I said, I've gotten used to a 27" 4K IPS.
It isn't horrible; it's just that its black level looks really bad next to the AOC 24G2 at its side. And I got a taste of QD-OLED from my brother.
I've been staring at his 27" 1440p QD-OLED for some time, and I'm sure I can't go back to 1440p. My ideal display would be the 27" 4K QD-OLED with a high enough refresh rate that has been "rumoured" for quite some time. I've been waiting for CES 2025 just for confirmation that they will launch, and I can't wait to throw my hard-earned money at one of them.
Look, yes, I'm running 150% scaling on my 4K to roughly match the 1080p displays, and I expect to run 125% scaling when I get my hands on whatever new display completes a dual 27" 4K setup. That's still more logical space than 1440p, with crisper text on top (even if I end up not being okay with 125%). Even an (admittedly wasteful) 200% scaling would still look better than a 1080p panel of similar size. The raw numbers are in the sketch below.
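A quick sketch, if anyone cares; this assumes Windows-style scaling, where the logical workspace is simply the native resolution divided by the scale factor:

```python
# Effective (logical) desktop workspace at common Windows scale factors.
# Assumption: logical size = native pixels / (scale% / 100). Games still
# render at the native 3840x2160 regardless of this desktop-only setting.

def logical_size(width: int, height: int, scale_pct: int) -> tuple[int, int]:
    factor = scale_pct / 100
    return round(width / factor), round(height / factor)

for pct in (100, 125, 150, 200):
    w, h = logical_size(3840, 2160, pct)
    print(f"4K at {pct}% scaling -> {w}x{h} logical workspace")
# 100% -> 3840x2160, 125% -> 3072x1728 (still more room than 1440p),
# 150% -> 2560x1440, 200% -> 1920x1080 (but far crisper than a native
# 1080p panel, since every logical pixel is drawn from four real ones)
```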
Side note, just to say that I'm really okay with small text:
At the day job I got ~5 years ago, whenever I touch an Excel file with default font settings and the sheet gets wide, I always set the zoom to 70%. I'd love for 55% to be usable, just to stuff more onto my screen (not going to happen on generic office 1080p displays). I'd also love a secondary monitor for my office desk, preferably anything above 1080p (also not going to happen, because corporate). I use the office monitor at arm's length.
I've used an iPhone 13 mini as my daily since its launch day. That probably needs even more justification, and it's another can of worms someone opened for me IRL that I don't want to reopen here. EDIT: yes, I know, that makes me a weirdo. Good thing I didn't ask anyone on any forum when the 13 mini launched. TLDR: the 13 mini is the closest in size to the first good, responsive smartphone I ever used, the iPhone 5C.
I'm a fairly short-sighted person in many ways, but my eyesight hasn't gotten any worse for at least 10 years, despite lots of the small-text reading and gaming that my parents understandably caution me about.
I'm set on the 5090 after skipping the 7900XTX. Major dilemma over the motherboard, though: I want the X870 Apex, but my $100 HDV is doing rather well, and I'm not sure I want to spend that much more to push my memory from 7800 GDM E to 8400 GDM D or something along those lines.
Then there's the 9950X3D, which seems enticing, but I don't want to spend almost $4000 on three components. I think I'll just grab the 5090, sell my 3090, and call it a day.
Now, if Gigabyte releases a Tachyon Ice at a reasonable price...
From what I've been told, the 9070XT performs as well as the 4080 Super in RT and better in raster. As far as I understand it, the 9070 was the card intended to hit the $500-550 mark before the 5070's price was confirmed. A 9070XT at $500 should not be DOA by any means, except to the hopeful green goblins on this forum.