
Next Gen GPUs will be even more expensive

What I dislike about RT is the performance impact on both Nvidia and AMD. It's not worth it. I also don't think it looks that great. In most games, I can't see much difference, if any. If Nvidia and AMD can improve performance enough for developers to make really good use of the tech, I'll welcome it. But as long as a little sheen in a puddle tanks my FPS from 80 to 25, I'll say it's a gimmick.

I'm not against DLSS/FSR either. People with lower end GPUs, or those with 4K screens can make good use of it. I'm only against the hype, and the statement that it's outright better than native, which is highly dependent on what you consider native.
In 2024 RT isn't really an option for you with a 6750 XT or a 2070 so that's to be expected.
 
You had a 2070; it just doesn't compare to what we have now.
I also briefly had a 7800 XT. It's still not that great (I know, it's AMD, but the Nvidia equivalent isn't stellar, either).

In 2024 RT isn't really an option for you with a 6750 XT or a 2070 so that's to be expected.
The problem is that you need at least a 4080 to enjoy RT at any resolution. That's a grand spent on just your GPU. Way over my comfortable budget.
 
The problem is that you need at least a 4080 to enjoy RT at any resolution. That's a grand spent on just your GPU. Way over my comfortable budget.
Who told you that?
 
Who told you that?
This did (random example):
[attached benchmark screenshot]
 
I game at 3840x2160 60fps, I guess that is why I have no problems. There are a couple of games like CP77 that will thrash this puny card but that game is like a playable tech demo. But I just play at its default settings at 3840x2160 and it is still awesome. They include upscaling, but no blur.
 
I game at 3840x2160 60fps, I guess that is why I have no problems. There are a couple of games like CP77 that will thrash this puny card but that game is like a playable tech demo. But I just play at its default settings at 3840x2160 and it is still awesome. They include upscaling, but no blur.
Upscaling at 2160p is nice because it works with a high enough input resolution to avoid being too blurry (rough input-resolution numbers in the sketch below).

I consider everything with RT a playable tech demo. As soon as it becomes playable on mid-range cards ($500 and below), I'll welcome it.
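For context, here is a rough sketch of what "high enough input resolution" means in practice. The per-axis scale factors are the commonly cited FSR 2 / DLSS preset ratios, used purely for illustration; exact values vary by game and upscaler version.

```python
# Internal render resolutions behind the common upscaler presets, assuming the
# usual per-axis scale factors (Quality ~0.667, Balanced ~0.59, Performance 0.5).
# Figures are illustrative, not exact for every game.
presets = {"Quality": 1 / 1.5, "Balanced": 1 / 1.7, "Performance": 1 / 2.0}

for out_w, out_h in [(2560, 1440), (3840, 2160)]:
    for name, scale in presets.items():
        in_w, in_h = round(out_w * scale), round(out_h * scale)
        print(f"{out_w}x{out_h} {name:<11} -> renders at {in_w}x{in_h}")
```

Quality mode at 2160p starts from a full 1440p internal image, while the same preset at 1440p output only has 960p to work with, which is a big part of why upscaling holds up better at 4K.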
 
CP has RT settings, not sure what's in there, maybe all maxed, idk.
I used RTX on my 3060 Ti and it worked just fine, but like with any game, you have to work with the settings.
TPU tests at max settings, but not overdrive (path tracing).
 
Those days are gone man, have you been to the grocery store lately? :D
Tell that to my paycheck. :laugh: :(

Edit: Not that I'm complaining. Games still run fine at 1440 UW as long as I don't use RT.
 
Tell that to my paycheck. :laugh: :(

Edit: Not that I'm complaining. Games still run fine at 1440 UW as long as I don't use RT.
Yeah man, I don't use my 3070 Ti anymore, but my kid has not complained once. These things will last a while, maybe not at the highest settings... but at that point you have to ask yourself what you like most about gaming: playing for the fun, or just the visual appeal?

Edit:

Keep in mind that I did use a GTX580 for 8 years lol..
 
Yeah man, I don't use my 3070 Ti anymore, but my kid has not complained once. These things will last a while, maybe not at the highest settings... but at that point you have to ask yourself what you like most about gaming: playing for the fun, or just the visual appeal?

Edit:

Keep in mind that I did use a GTX580 for 8 years lol..
I like visuals, but only up to the point where there's a meaningful difference, and as long as they add to the immersion. For example, Black Mesa is an awesome upgrade over the original Half-Life. But whether there's RT in Cyberpunk or Alan Wake 2, I don't care; those games look great even without it. That's the other thing: even today, there's a lot to improve in a game's visuals without RT. RT is only icing on the cake.
 
Right?

The cards can perform very well with no RT, so it isn't needed by any stretch.
 
Those days are gone man, have you been to the grocery store lately? :D

Groceries increased some 28%, not the 400% we've seen in GPUs. A 970 cost $330 and would get you 78% of flagship performance. The rumored 5080 is likely to be around 70% of flagship performance for $1,370 (rough math in the sketch below). That's before you consider that the performance gap between the two (like the 4080 vs the 4090) will only widen over time, given the 5080 has only half the die size, 16 GB of VRAM, and a narrow bus for such an expensive GPU.

This isn't inflation at work; it's the result of a near-total monopoly, with Nvidia exceeding even the Bell System's market share at the height of its power.
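As a back-of-the-envelope check of those numbers (the 78% / 70% flagship shares and the $1,370 figure are the poster's estimates, not confirmed specs):

```python
# Cost per unit of flagship-relative performance, using the figures quoted above.
# The 78% / 70% shares and the $1,370 rumored price are estimates, not confirmed.
gtx_970  = {"price": 330,  "flagship_share": 0.78}   # GTX 970 launch MSRP
rtx_5080 = {"price": 1370, "flagship_share": 0.70}   # rumored RTX 5080

then = gtx_970["price"]  / gtx_970["flagship_share"]    # ~$423 per "full flagship"
now  = rtx_5080["price"] / rtx_5080["flagship_share"]   # ~$1,957 per "full flagship"

print(f"GTX 970 : ${then:,.0f} per flagship-equivalent")
print(f"RTX 5080: ${now:,.0f} per flagship-equivalent")
print(f"Increase: {100 * (now / then - 1):.0f}%")       # ~360%, before inflation adjustment
```

That lands in the same ballpark as the "400%" figure, even before arguing about die sizes or VRAM.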
 
@3valatzy
Calling any product you don't care about "stupid" just because of its price doesn't make you sound like someone trying to be objective.

Not everyone buys it for gaming only, and there are people willing to save up and buy a big chip so they don't have to buy anything for years, not because they're morons throwing money at Nvidia.
Just because it wins in raw performance doesn't mean much once you start using CUDA for, say, 4K editing. You really think I'm going to give up ~30% project speed so I can have a few more FPS in games? Nope.

@Hecate91
Saying "Ngreedy" in every post gets annoying after the third time.

And if the competition actually delivered (consistently), there would be no ridiculing.
Has AMD always had the best-performing GPU from the low end to mid-range, every year for the past 15 years?
Right.

And for me:
Why would I buy a non-Nvidia product if it doesn't do what I tend to use it for? And how does buying the product that matches my use, and that I have the funds for, make me a fan or in any way "influenced" by anyone?
I hate Ferrari (one reason stated above), but that doesn't mean I'll call people names for owning one, or ask them why they didn't buy the faster Lambo.

People buy what they like if they have the money for it. That doesn't mean there's a conspiracy behind it, or that people are brand-brainwashed because they bought one brand over another.

@AusWolf
Not really, or we wouldn't have virtually every European carmaker offering a product similar to the VW GTI.
They only really sell because they have at least one thing they're "better" at than the Golf (suspension/power/top speed/looks/features/price),
and the sales numbers for the next "smaller" model show that concentrating on (just) the mass market isn't a good idea.
Do you think we would have AMD server chips if AMD wanted mid-range only?


@evernessince
Except:
groceries = need
GPU = want

You don't have to buy the latter.
 
Supposed leak from an Australian retailer about 5080 pricing: Asus Prime 5080 at $2,500 AUD cost price, $2,750 AUD MSRP. Minus sales tax and adjusted to US dollars, that's roughly $1,500-$1,600 end-user pricing (rough conversion sketched below). The video is obviously speculation and to be taken with a pinch of salt, but it doesn't seem unlikely.

Even if true, aren't Australian market MSRPs higher than they are in the US?

My guess is this still points to $1200-$1400 for the US market.
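For reference, a rough sketch of how that AUD-to-USD conversion works out. The 10% GST and the 0.62 USD/AUD exchange rate are assumptions for illustration, not confirmed figures.

```python
# Rough AUD-MSRP -> US-price estimate for the leaked figure above. The 10% GST
# and the 0.62 USD/AUD exchange rate are assumptions for illustration only.
aud_msrp_incl_gst = 2750        # leaked Australian MSRP, GST included
gst_rate = 0.10                 # Australian goods and services tax
usd_per_aud = 0.62              # approximate exchange rate

aud_ex_gst = aud_msrp_incl_gst / (1 + gst_rate)   # ~2,500 AUD before tax
usd_estimate = aud_ex_gst * usd_per_aud           # ~1,550 USD

print(f"Ex-GST: {aud_ex_gst:,.0f} AUD  ->  roughly {usd_estimate:,.0f} USD")
```

US MSRPs are quoted before sales tax, which is why the GST comes off first; that puts the leak in the $1,500-$1,600 range mentioned above.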
 
Groceries increased some 28%, not the 400% we've seen in GPUs.

That's not true; people lose sight of prices over time. The 3dfx cost $699 USD adjusted for inflation.
 
@evernessince
Except:
groceries = need
GPU = want

You don't have to buy the latter.

GPUs are used in medical, engineering, scientific, AI, streaming, and computer rendering fields.

For a large number of professions they are in fact a need, and their pricing will impact you one way or another, because GPUs are an increasingly necessary element of the modern world. That's true whether you use one personally or use a service that relies on one (like a medical imaging device).

Mind you, whether a product is a want or a need isn't a rebuttal to a product's price increase. It's an argument that doesn't address the topic, only avoids it. You could argue you don't need anything but food, water, and sleep; that doesn't change the arguments for or against the pricing of any product.
 
3440x1440 (all high settings)

Not the best scene for it but...
The comparison is better if you download the screenshots and switch between them instantly.

No RT, No Upscaling
110-115 FPS

[screenshot]


RT Ultra, No Upscaling
50-55 FPS

[screenshot]


And now the horror AMD shamefully unleashed...

RT Ultra, FSR 2.1 Quality with 0.7 sharpening
80-85 FPS

[screenshot]

--------------------------------------

I'll take RT+FSR 24/7 (relative numbers in the sketch below).

RT is here to stay, and I think it adds to the "reality" of a game, especially reflections (water, glass and other surfaces).

I know it can look different in a "live" scene, because in screenshots you can't see whether there's any shimmering.
If you can believe me, it can only be seen a little at maximum distance, and only after you stop to look for it. In most cases you have to get really close to the screen.
And the higher the resolution, the fewer unwanted artifacts upscaling produces; 4K + upscaling has close to none.
BTW, the game offers FSR 3.0 too, but the implementation from Projekt Red is bad. Way too much shimmering, even though it offers +5-10% more performance.
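Just to put the screenshot numbers in relative terms, using nothing but the midpoints of the reported FPS ranges:

```python
# Relative performance implied by the midpoints of the FPS ranges in the
# screenshots above (110-115, 50-55 and 80-85 FPS).
no_rt, rt, rt_fsr = 112.5, 52.5, 82.5

print(f"RT Ultra cost vs native:     -{1 - rt / no_rt:.0%}")      # ~-53%
print(f"FSR Quality uplift over RT:  +{rt_fsr / rt - 1:.0%}")     # ~+57%
print(f"RT + FSR vs no-RT baseline:  -{1 - rt_fsr / no_rt:.0%}")  # ~-27%
```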
 
ROG-ASTRAL-RTX5080-O16G-GAMING
ROG-ASTRAL-RTX5080-O16G-GAMING
PRIME-RTX5080-O16G
TUF-RX9070XT-O16G-GAMING
TUF-RX9070-O16G-GAMING
PRIME-RX9070XT-O16G
PRIME-RX9070-O16G

If these cards come with 16 GB, we can only expect the worst for the lower-tier ones.
 
Why is AMD not using GDDR6X in its new GPUs? It's still cheaper than GDDR7... and GDDR6 is very cheap...
 
RT is here to stay, and I think it adds to the "reality" of a game, especially reflections (water, glass and other surfaces).

It's such a pity that artificial, anime-style scenes chase real-world lighting effects. It's obviously a money-grabbing practice that's disloyal to gamers.
AMD, Nvidia and Intel, but mostly Nvidia, couldn't find a "better" marketing buzzword to keep loyal clients ordering and spending, clients who only get pats on the shoulder in return: the more you buy, the more you save... :kookoo:

Why is AMD not using GDDR6X in its new GPUs? It's still cheaper than GDDR7... and GDDR6 is very cheap...

I would rather ask - why doesn't AMD use the newer TSMC 3nm node? Is this node broken for GPUs?
 
I would rather ask - why doesn't AMD use the newer TSMC 3nm node? Is this node broken for GPUs?
Expensive is the right word... TSMC is overpricing its technology because of its monopoly, the same as Nvidia.
 
Expensive is the right word...

Then two new questions:
Is 5 nm/4 nm the end of the road forever?
Is it really better for AMD to keep losing market share until it drops under 1%, or wouldn't it be better to move now and invest in a more modern, faster product line?
 