
NVIDIA GeForce RTX 3090 Ti Founders Edition

Hm, we already have more than enough gaming GPUs for everyone on Earth, so why do you think more people should have more GPUs, leading to more e-waste?

You are denouncing one aspect of capitalism while encouraging another: overconsumption. That's why I brought up the 6500 XT/6400 example.

If next-gen GPUs are overpriced, what's to stop people from buying used GPUs? Do you think buying used is a sin or something? :laugh:

Like I've said many times now, if next gen doesn't bring any benefit in efficiency or price to performance, there is no reason to buy it; just look away. Nvidia and AMD will get the memo. They need customers; we don't need them.

In short, fighting for more accessible gaming GPUs is a useless endeavour, completely meaningless IMO. Gaming is a luxury, not a necessity.
Wait, what? I am explicitly arguing against overconsumption. On several levels, even. If you're reading what I'm saying as me arguing for "more people having more GPUs", you're misreading me. More equal access to beneficial hobbies and pastimes, such as gaming, is generally beneficial, yes, and is an argument for democratizing access to gaming-capable computers in general. That amounts to a public good. So I'm arguing (although indirectly - there are more ways to game than by having a dGPU) for more people having access to GPUs, but not for more people having more GPUs as you put it, since that strongly implies rapid replacement rates for said GPUs as well. I'm a strong proponent of keeping hardware in service as long as possible, which is also reflected in my own hardware choices - as I said before, my previous GPU was a Fury X, bought in 2015, and while I've done a relatively rapid system upgrade now (4 years, 2017 to 2021), my previous CPU and motherboard lasted me nearly a decade, from late 2008 to mid 2017, and I generally advise against upgrading unless actually necessary (and if necessary, upgrading selectively and critically).

As for how this ties back into the 3090 Ti and its pricing: the knock-on effect of higher flagship prices pulling low-end and midrange pricing up will make gaming less accessible. This might have the unintended side effect of people keeping their GPUs for longer because they can't afford to upgrade, but that's both impossible to control and at best an indirect benefit of a larger harm. Relegating gaming to only the wealthy is not beneficial in any way - not to players, not to the game industry, not to gaming as a cultural phenomenon. And, of course, the potential cut in consumption from unaffordable lower-end GPUs is likely offset by the increased prestige of higher-end GPUs, leading to higher replacement rates among a larger wealthy audience (which we've already seen alongside the growth of gaming as a more broad-reaching activity). And, crucially, while many enthusiasts will sell or give their used GPUs to someone who will use them, massive amounts of fully usable, still decently performing hardware are discarded every year. More status and prestige surrounding high-end hardware will only accelerate this, as the reverse of that prestige will always be negative sentiment toward hardware perceived as out of date, even if it is rarely expressed. (Though looking around these forums, you won't have to look hard to find such sentiments.)

I've never said anything against buying used GPUs whatsoever - I think a more active used hardware market is something PC gaming needs. But what stops people from buying used GPUs is, typically, the prevalence of scams, uncertainty about hardware longevity, lack of warranties, lack of buyer protections against unscrupulous sellers, lack of knowledge, lack of maintenance skills, and of course the massive hype and status surrounding new hardware. IMO, anyone buying used rather than new hardware is doing the world a favor.

As for the "if next gen doesn't bring any benefit in efficiency or price to performance, there is no reason to buy it; just look away" line - as has been pointed out plenty of times earlier in this thread, that's nothing more than a cop-out. The hype around new products is essentially always sufficient to ensure sales, as long as they aren't complete trash. And you don't need improvements in either efficiency or price/perf as long as absolute performance is higher and there are people with more money than sense out there - which is why the rumors of 600+W flagships are so scary. Of course, that's also part of what I think Nvidia is doing with the 3090 Ti (and, to some extent, was doing with the 3090) - readying the ground for a "better value" 4090 Ti by producing a predecessor with such abominably bad value that it's laughable. Anything can be made to look good with sufficient planning.
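To make that anchoring effect concrete, here's a minimal sketch of the value framing, using purely hypothetical prices and performance figures (none of these numbers come from this thread or any real product):

```python
# Hypothetical illustration of value anchoring between GPU generations.
# Every price and performance number below is invented for the example.

def dollars_per_perf(price_usd, relative_perf):
    """Price divided by relative performance - lower means better value."""
    return price_usd / relative_perf

# Assumed: a flagship at $1999 with baseline performance 1.0, followed by
# a successor at $1599 that is 60% faster.
predecessor = dollars_per_perf(1999, 1.00)  # ~$1999 per performance unit
successor = dollars_per_perf(1599, 1.60)    # ~$999 per performance unit

print(f"Successor looks {predecessor / successor:.1f}x better value")  # ~2.0x
# The successor reads as a huge value win even though $1599 may still be
# poor value in absolute terms - the overpriced predecessor set the anchor.
```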

As for that elitist spiel you end your post with: whether something is a "luxury" or a necessity is extremely dependent on context, and immensely variable, outside of basic needs like food and shelter. Nobody needs to game, but then nobody needs mechanised transportation either. Nobody needs electric lighting. Whatever your cut-off point for "luxury" is, it will always necessarily be arbitrary, making it a slippery slope towards ever-increasing demarcation of what other people don't deserve because it's a "luxury". IDGAF whether you think gaming is a "luxury" - what you're saying here is that you believe that you have the right to modes of enjoyment in life that others don't. That is an explicitly harmful stance, and of course a deeply unegalitarian one. In short: elitist. You're welcome to believe that your circumstances have somehow given you the right to enjoy life in ways that others don't, but that belief will always be predicated upon some deeply flawed and harmful logic.
 

I gave all my used stuff to my relatives and only keep one PC and one laptop, so yeah i also try to avoid over-consuming to keep my mind sane.

I think TPU reviews and some others are more than detailed enough to educate potential buyers about the pros and cons of any new GPU, so you must be referring to uninformed crowds outside the tech circle who buy GPUs based on hype.

Anyway, it has been fun and refreshing arguing with you, but yeah, maybe I should get back to playing games now :D. Have fun.
 
DLSS excels at sub-pixel detail; its strengths often show in details only a pixel wide. You speak with a lot of authority about it, but from what I can see, that authority is somewhat misplaced.

In any case, I think that probably does it for me, I don't see this conversation continuing and benefitting either of us, as I neither require nor desire your spurious explanations, and I doubt you'll legitimately take on board anything I have to say about it.
Fundamentally, I don't think we disagree at all. We're both aware of what DLSS can and can't do - whether it provides great or unsatisfactory image quality depends an awful lot on the version of DLSS used, the implementation in the game in question, and to a great extent the scene at that particular moment. DLSS is a compromise: it generates excellent, native-grade (and occasionally better-than-native) detail for some things whilst adding motion artifacts and failing to always hide the true, lower rendering resolution for others. It's not universally better or worse - it's situational, and at least we have the choice to enable or disable it based on personal preference.

Desperately trying to get back on topic: DLSS is fancy upscaling that *can* make a game unplayable at 4K look pretty close to 4K. If you can't afford a $2000 GPU, there's a lot to be said for buying a $500-600 GPU and just using DLSS or FXAA to get playable framerates in the small handful of games that a current-gen $500-600 GPU won't handle acceptably at 4K native.
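As a rough illustration of what that upscaling means in practice, here's a short sketch computing the internal render resolution for each DLSS mode at 4K output. The per-axis scale factors are the commonly cited ones (Quality 2/3, Balanced ~0.58, Performance 1/2, Ultra Performance 1/3) and can vary by DLSS version, so treat them as approximations:

```python
# Internal render resolution per DLSS mode, using commonly cited per-axis
# scale factors (approximate; actual factors can vary by DLSS version).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    """Resolution the GPU actually shades before DLSS upscales to output."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    w, h = internal_resolution(3840, 2160, mode)
    pixel_share = (w * h) / (3840 * 2160)
    print(f"{mode:>17}: {w}x{h} (~{pixel_share:.0%} of native 4K pixels)")
```

At Quality mode, the GPU is only shading roughly 44% of the pixels it would at native 4K, which is where the performance headroom comes from.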
 
Not gonna happen. The market is gonna get saturated with old Ampere GPUs soon enough that, in order to sell new GPUs, both AMD and Nvidia will have to make the new gen more appetizing: higher efficiency, better price to performance.
I don't think it will have as much of an effect as you think it will.

New PC owners won't risk it (no warranty, limited places to buy, and the used market's reputation isn't great).

More enthusiast people like us will logically skip it because we were active in the community in summer 2014 and summer 2018, so we all saw first-hand the backlash the last two times used crypto cards flooded the market (bricked and artifacting cards being sent out by the hundreds of thousands).
 
I know this is an old post, but I drive a £55k BMW. BMW do a near-£90k version of my car. The M3's performance isn't 70% better than mine, and to 60 mph the RWD M3 is actually slower than my 4WD. Either way, you can only go so fast on the road anyway, and nearly all the people who buy the M3 never get near its potential, so one could ask why people pay that money for it when you could have my version and get where you're going in the same time, with practically the same interior layout.
The point is, these things do exist and people do buy them. You've all got too hung up on the price in this discussion.
 
Desperately trying to get back on topic: DLSS is fancy upscaling that *can* make a game unplayable…
DLSS Auto/Quality can look so close to native when done right that you just get free FPS for nothing.

Deep Rock Galactic is like this: the only place it's visible is on certain computer screen displays in the pre-game lobby area (the Space Rig) - in a mission, you can't tell at all and get a good 30% speed boost.
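That ~30% figure passes a rough sanity check. Here's a back-of-the-envelope sketch, with the assumption (mine, not the poster's) that a bit over half of the frame time is resolution-independent (CPU work, post-processing, DLSS's own overhead):

```python
# Rough FPS-gain estimate for DLSS Quality at 4K. Assumed numbers:
# Quality mode renders at ~2/3 scale per axis (~44% of native pixels),
# and 55% of frame time is taken to be resolution-independent.

native_fps = 60.0
pixel_fraction = (2 / 3) ** 2  # ~0.444 of the native pixel count
fixed_cost = 0.55              # assumed resolution-independent share

native_frame_time = 1 / native_fps
dlss_frame_time = native_frame_time * (fixed_cost + (1 - fixed_cost) * pixel_fraction)
dlss_fps = 1 / dlss_frame_time

print(f"Estimated: {dlss_fps:.0f} FPS ({dlss_fps / native_fps - 1:.0%} gain)")
# With these assumptions: ~80 FPS, a ~33% gain - close to the ~30%
# observed above. GPU-bound games with less fixed cost would gain more.
```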
 
...what should I say... for many years I've preferred EVGA, and my 3080 FTW3 Ultra broke after 23 months. No support, and trouble I hadn't known from EVGA in years past. So I needed a new card. Then I saw the "low price" (€1,300) - yes, it's a lot, but for a Titan-class card with 24 GB and a 384-bit bus... hm, I ordered. Though first: why did Nvidia decide to call this thing a 3090 and not a Titan? ...okay, I ordered.
The 4090 was already on sale (€2,000) and surely 50% more powerful, but I bought the 3090 Ti Founders Edition. All I can say about it is: uncompromising 4K gaming. All games at Ultra or Maximum settings, ray tracing combined with DLSS, or older games at true 4K without DLSS. This thing pumps and has enough power to play whatever you want. And with just two fans? No problem, and absolutely silent even at maximum load in 3DMark. It reaches 70 degrees with a 2040 MHz boost. That's better than good. All Nvidia Control Panel settings on High Quality, FXAA on, 16x anisotropic filtering, ambient occlusion and VSync on, and in-game settings at Maximum or Ultra - no problem for this beast. With ray tracing combined with DLSS you'll get a constant 4K 60 FPS; without VSync you'll get more. The card doesn't always reach 450 W - normally in games it's 240-380 W at 63 degrees. You should definitely use a big tower with excellent airflow, though.
Tested: Red Dead Redemption 2, Shadow of the Tomb Raider, Cyberpunk 2077 (DLSS Ultra Performance, but the rest of the settings on maximum), Jurassic World Evolution 2, Crysis Remastered, Bright Memory: Infinite, Mass Effect Legendary Edition, Chorus, Spider-Man Remastered, several Resident Evil titles, and much more... play what you want, you'll have no trouble with the RTX 3090 Ti FE. The main fan blows out of the PC case and lets no heat in, and the other blows bottom to top, so with efficient airflow in your case you'll have no heat problems. Really, it's such a quiet and powerful card, it's crazy. The Founders Edition looks solid, clean, and well balanced.
Yes, compared to the EVGA 3080 FTW3 Ultra, which was absolutely enough to play all of these titles without problems too: the 3090 Ti uses more VRAM and is a bit faster, so the difference is really not huge. I never wanted to buy a Titan-class card because of the price, and it isn't worth it - when you have a good flagship-class card like an xx80 with good power, you don't need one. Ultimately a 4080 with 16 GB would actually be the better choice, but the prices are obscene. Sorry, but it's true. I remember when I ordered the EVGA 1080 SC for €520... it was twice as powerful as the 980, and that really was a jump. Not so from the 1080 to the 2080... features like ray tracing weren't really usable on the 20 series, but they're good on the 30 series when using DLSS. My advice to everyone: skip a generation in each class. If you're on the 30 series, jump to the upcoming 50 series. That's sensible and saves money. The 40 series is great and powerful, but expensive and maybe not worth it. But everyone needs to decide for themselves, and for which games they'll use the cards.
 

Attachments

  • 3d mark 3090ti.jpg
  • 3090.jpg
  • 3090ti.jpg
  • IMG_20221112_114306136.jpg
  • IMG_20221112_120944274.jpg