
Next Gen GPUs will be even more expensive

The 7900 XTX was $999 too, and an equivalent to, surprise surprise, Nvidia's $999 4080, but with worse RT, worse efficiency, and worse upscaling. Yet still people acted like the blame was on one side only, when to my eyes it has always been on both in equal measure.
Except that the 4080 was $1,199, a whole 20% more expensive than the 7900 XTX.
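Quick sanity check on that figure, going by the two launch MSRPs quoted in this thread:

$\$1{,}199 \div \$999 \approx 1.20$, i.e. the 4080 launched at roughly 20% over the 7900 XTX.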

What is also true is that all this negativity did not start just now; the people you see complaining here have been green team haters for a long time, even when the 1080 Ti was absolutely killing it in value. I too decided to change things just to try, upgrading from a 3060 Ti to a 6800.
Are we seeing things into the past now? I don't know who you mean, but personally, I had a 6 GB 1060 and absolutely loved the heck out of it! It was quiet, compact (single fan), it sipped power, and killed every game at 1080p - just what I expect from a mid-range card.

And you know what? It was absolutely meaningless when a card with 16 GB of VRAM fell flat on its face whenever I tried running max settings in new games, because they all required ray-traced GI, reflections, and shadows. The numbers were actually worse on the 6800, and unplayable on both cards.
Which game requires all that ray tracing by default? I'm curious.

I'm just calling out a specific manufacturer and their high-end buyers that are guilty of exactly what you were saying :p
Is there any need for that? Why is it so goddamn difficult for some to see the point in what others are saying without shifting blame onto one manufacturer or another? :confused:

This behaviour is so f-ing childish, I'm telling 'ya. :( Besides, are you not a high-end buyer of a specific manufacturer by any chance?

I mean sure, but the xx90 functionally also replaces the xx80 Ti.
I disagree. If you look at the specs and price, x90 replaced the Titan, not the x80 Ti.
 
Which game requires all that ray tracing by default? I'm curious.
Well, for max settings it's almost 10 out of 10 new games these days. Would you rather buy a €600 card that runs them at 50+ or 70+ fps? Or spend €600 and just give up on max settings entirely? If it's the latter, the 4070 Super still has a major advantage over the 7900 GRE, namely DLAA.
Except that the 4080 was $1,199, a whole 20% more expensive than the 7900 XTX.
Must have confused it with the 4080 Super launch, that was $999 IIRC.
Are we seeing things into the past now? I don't know who you mean, but personally, I had a 6 GB 1060 and absolutely loved the heck out of it! It was quiet, compact (single fan), it sipped power, and killed every game at 1080p - just what I expect from a mid-range card.
not you, you're one of the most objective guys here imo.
 
I was more into console gaming still but had a Dell with a 6600GT. Pretty sure that was my first Nvidia card.... All ATI prior to that.

I still remember the crazy-looking Dell XPS laptops from around that time with what I believe was a 9700, although it was a desktop 9600 XT in disguise, if I remember correctly.

Back then I just rolled up to Best Buy and bought whatever looked cool, lol... I do remember the fake HDR age though, it was cool.
Dell XPS Gen 1 here, I had a 9800 in it.
 
Well, for max settings it's almost 10 out of 10 new games these days. Would you rather buy a €600 card that runs them at 50+ or 70+ fps? Or spend €600 and just give up on max settings entirely? If it's the latter, the 4070 Super still has a major advantage over the 7900 GRE, namely DLAA.
Unfortunately, I don't think you can expect max settings + RT in the latest games out of a $500-600 card. Max settings, sure; RT, no chance. Even the 4070 Super you mentioned isn't strong enough for that.

Must have confused it with the 4080 Super launch, that was $999 IIRC.
Yep. A slap in the face of everyone who bought a vanilla 4080. This is why I'm not happy with Nvidia these days, and always look for GPU specs. A partially disabled GPU die usually means that a refresh is coming.

not you, you're one of the most objective guys here imo.
Gee, thanks. :)

I'm just sad that trying one's best to be reasonable and objective seems to be a dying and laughed at trait even here on TPU.
 
I disagree. If you look at the specs and price, x90 replaced the Titan, not the x80 Ti.
What's a Titan? Outside the Titan V, they were always basically the same thing as the 80 Ti card, but with enabled performance in their Titan drivers that was gimped in GeForce drivers, and sometimes a doubling of VRAM over the 80 Ti, but not always. And sometimes they were faster, the same, or slower than the 80 Ti.

I stand by my statement. The xx90 replaced both the Titan and the xx80 Ti in the product stack and in pricing. When the 3090 came out it did worse than the Titan RTX at certain workloads, so at the time it definitely wasn't a Titan. Nowadays you have a choice of driver, so it can be either.
 
Don't know how it is in the US, but over here in Europe most sellers also add an "Nvidia tax", meaning that the difference between the 7900 XTX and the 4080 was more than $200. And you could get the Sapphire Nitro+ version of the 7900 XTX for less than the cheapest 4080 option, easily a $300 difference. With Asus, MSI, etc., the price difference was even higher. That's why I upgraded from a 3080 to a 7900 XTX instead of a 4080. If they plan to launch the 5080 at an even higher price and AMD won't compete, then I'm skipping this gen. I wanted to sell the 7900 XTX and buy a 5080, but €1,500? I'm not crazy enough.
 
What's a Titan?
To me personally, a Titan is a class of GPU that no one truly needs. It is bought by extreme overclockers, prosumers, and those who just simply want the best and have some cash to throw away. I'm saying "no one truly needs" it because it is not needed for playing the latest games at high detail - there are cheaper cards that can do it just as well. It is also characterised by insane power consumption and insane depreciation in value over time.

Looking at the specs and price of the 5090, and comparing it to the 5080 and my own definition above, it is definitely a Titan in disguise.

Needless to say, this is my opinion, not fact, so if someone takes offence, that isn't my problem.
 
Unfortunately, I don't think you can expect max settings + RT in the latest games out of a $500-600 card. Max settings, sure; RT, no chance. Even the 4070 Super you mentioned isn't strong enough for that.
It averages 73 fps in the latest games, and that is before FG is enabled. I was referring to these numbers in my previous comment; I try to avoid making claims that aren't backed up by trustworthy sources like ComputerBase, TPU, PCGH, PurePC, etc. Those numbers already have DLSS/FSR enabled, if you were wondering. So, as you can see, RT on is very playable on the 4070 Super; the question is for how long before the card starts being VRAM limited. My bet is it has until Nvidia's Rubin architecture, because the 5070 still has 12 GB, which will probably help the 12 GB RTX 40 cards too in terms of how they (meaning both game devs and Nvidia) optimize VRAM usage in the foreseeable future. On the other hand, you have the 7900 GRE, which never could run RT smoothly.
[Screenshot: computerbase.de RT benchmark results]
 
I'm just sad that trying one's best to be reasonable and objective seems to be a dying and laughed at trait even here on TPU.
Nah..
 
It averages 73 fps in the latest games, and that is before FG is enabled. I was referring to these numbers in my previous comment; I try to avoid making claims that aren't backed up by trustworthy sources like ComputerBase, TPU, PCGH, PurePC, etc.
Fair enough, I'll correct myself - I don't think you can expect max settings + RT on a $500-600 GPU in every game.
 
Don't know how it is in the US, but over here in Europe most sellers also add an "Nvidia tax", meaning that the difference between the 7900 XTX and the 4080 was more than $200. And you could get the Sapphire Nitro+ version of the 7900 XTX for less than the cheapest 4080 option, easily a $300 difference. With Asus, MSI, etc., the price difference was even higher. That's why I upgraded from a 3080 to a 7900 XTX instead of a 4080. If they plan to launch the 5080 at an even higher price and AMD won't compete, then I'm skipping this gen. I wanted to sell the 7900 XTX and buy a 5080, but €1,500? I'm not crazy enough.
They're marked up as well, been that way since GF2
 
Fair enough, I'll correct myself - I don't think you can expect max settings + RT on a $500-600 GPU in every game.
That is technically correct, but you'll get that in most. Like I said, these results are without frame generation. I mean, I played over 100 hours of Alan Wake 2 with high path tracing, and I don't think there is anything remotely as taxing as PT these days, so I know what I'm saying from experience. The worst you're gonna see is around 70-75 fps with FG on, but most of the time you'll have no problem hitting 90 or more. This is certainly fine for a single-player + controller type of gamer like me. I play games for story and graphics these days, mostly on story difficulty so as not to waste too much time on skill/weapon grind. If I have to do too much grinding for gear/skills, I tend to lose interest in the story.
I get that others prefer to hit very high framerates and not worry too much about image quality being cranked up to the max. I mean, games looked nice before RT was a thing; it's just that now they can look more realistic with ray/path-traced GI.
 
To me personally, a Titan is a class of GPU that no one truly needs. It is bought by extreme overclockers, prosumers, and those who just simply want the best and have some cash to throw away. I'm saying "no one truly needs" it because it is not needed for playing the latest games at high detail - there are cheaper cards that can do it just as well. It is also characterised by insane power consumption and insane depreciation in value over time.

Looking at the specs and price of the 5090, and comparing it to the 5080 and my own definition above, it is definitely a Titan in disguise.

Needless to say, this is my opinion, not fact, so if someone takes offence, that isn't my problem.
No one took offense. I merely pointed out that, looking at what Titans were to their respective generations, it's all over the place. The only thing that really made them different from the xx80 Ti cards was their drivers, and sometimes VRAM segmentation where they had double that of the xx80 Ti... until they weren't. Titans have almost never been a different class of performer to the xx80 Ti in any significant way on specs.

So in my opinion the 4090 is absolutely both the Titan and the flagship gaming card of the 40-series gen at the same time. No 4080 Ti on AD102 silicon was ever released. You now also have a choice of drivers to use, whether Game Ready or Studio.

By comparison, when the 3090 released and Nvidia tried to market it as "Titan class" (which means nothing), you didn't have a choice of driver, and the 3090 drivers were just as gimped as any other GeForce card's vs actual Titans. LTT, oddly enough, was one of the only YouTubers at the time to pick up on this and show the 3090 lost to the Titan RTX at certain workloads you'd use Titans for.

Just providing some historical context because people forget what Titans actually are.
 
I'm just sad that trying one's best to be reasonable and objective seems to be a dying and laughed at trait even here on TPU.
What most of those "laughing face" reactions tell me is that the person on the other end is just not properly informed and lacks the willingness to become so. They're really a means of masking anger more often than you realize.
 
No one took offense.
That's good to know. :) It just seems like sensitivity about expensive toys is through the roof with some people these days. :(

I merely pointed out that, looking at what Titans were to their respective generations, it's all over the place. The only thing that really made them different from the xx80 Ti cards was their drivers, and sometimes VRAM segmentation where they had double that of the xx80 Ti... until they weren't. Titans have almost never been a different class of performer to the xx80 Ti in any significant way on specs.
That's a fair assumption. By that, though, the 5090 is way more than a Titan class product.
 
That's a fair assumption. By that, though, the 5090 is way more than a Titan class product.
Mostly yes, but also not entirely. It is meant to be a "pro-sumer" card indeed. But IIRC, Titans always had the full 102 die. It'd be safer to assume that the 3090 Ti was one (full GA102), but as for the 4090 - not necessarily.
 
That's good to know. :) It just seems like sensitivity about expensive toys is through the roof with some people these days. :(


That's a fair assumption. By that, though, the 5090 is way more than a Titan class product.
Yes I'd agree with that. We have never before seen this level of gap between GeForce branded products.
 
Mostly yes, but also not entirely. It is meant to be a "pro-sumer" card indeed. But IIRC, Titans always had the full 102 die. It'd be safer to assume that the 3090 Ti was one (full GA102), but as for the 4090 - not necessarily.
That's because the full AD102 die was never released as a product. Possibly a yield issue?
 
What's a Titan? Outside the Titan V, they were always basically the same thing as the 80 Ti card, but with enabled performance in their Titan drivers that was gimped in GeForce drivers, and sometimes a doubling of VRAM over the 80 Ti, but not always. And sometimes they were faster, the same, or slower than the 80 Ti.

A Titan needs to have memory on both sides of the PCB, meeting the double-sided VRAM requirement and/or having a fully enabled die; the 3090 and 3090 Ti are exactly that, but were introduced as BFGPUs. The Titans, which had 12 GB while their 1080 Ti and 2080 Ti counterparts had 11 GB, also meet those requirements. The 4090 and 5090 are not only heavily cut down, but they also lack double-sided memory, so they are definitely not Titan-class.
 
Fair enough, I'll correct myself - I don't think you can expect max settings + RT on a $500-600 GPU in every game.
We have the game developers to blame for that. They are becoming lazy and instead of properly optimizing their games/game engines, they leave it to upscaling technology and frame generation to improve the frame rate.
 
Mostly yes, but also not entirely. It is meant to be a "pro-sumer" card indeed. But IIRC, Titans always had the full 102 die. It'd be safer to assume that the 3090 Ti was one (full GA102), but as for the 4090 - not necessarily.
Titans didn't always have the full die. The 780 Ti had the full die before they then released the Titan Black, for example.

OG Titan: GK110, 2688 cores, 6GB, released 2/21/13

GTX 780 Ti: GK110, 2880 cores, 3GB, released 11/7/13

Titan Black: GK110, 2880 cores, 6GB, release 2/18/14

This goes back to "Titan class" being an extremely loose term with a sliding scale of what it means from generation to generation.

I always considered the 3090 and 4090 a rebrand and price hike of the 80 Ti tier first, before I ever considered them Titans. The only things I can say that make them quasi-able to occupy both product tiers now are the choice of drivers, the larger VRAM capacity, and the continued use of the flagship die. But as we have seen, Titans historically have not always used fully unlocked dies either.
 