
So what does the Ada release by Nvidia mean to you

Has the Ada release by Nvidia been good in your opinion


• Total voters: 101
No, I am just tired of reading that the RTX 4000 series is bad when it is not true at all. Even the 4070 Ti beats every single card from last generation when you look at the overall performance across many games, like what TPU is doing.
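For reference, an "overall performance" figure like that is basically each card's fps relative to another card, averaged across the whole test suite. A minimal sketch of the idea (the fps numbers are made up for illustration, and TPU's exact game list and weighting may differ):

```python
# Minimal sketch of an "overall relative performance" summary across many games.
# The fps numbers are made up for illustration; a real review uses dozens of titles.
from math import prod

fps = {
    #            card_a  card_b
    "Game 1":   (112.0,  104.0),
    "Game 2":   ( 88.0,   95.0),
    "Game 3":   (143.0,  130.0),
}

# Per-game performance ratio, then a geometric mean so no single outlier
# title dominates the summary figure.
ratios = [a / b for a, b in fps.values()]
overall = prod(ratios) ** (1 / len(ratios))

print(f"Card A delivers {overall:.1%} of card B's performance overall")
```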

No, no sane person would begin to praise this VRAM-limited card.
It will be slower as time goes by.

[attached chart: 4K performance results]
 
No, no sane person would begin to praise this VRAM-limited card.
It will be slower as time goes by.

No, it won't. You won't see games requiring a lot more memory for years. Current-gen consoles have 16GB of shared RAM. UE5 will only run better when devs actually know how to optimize it properly.

Besides this card is not meant for 4K gaming really, yet it works just fine for 99% of games here, and 100% with DLSS enabled. The only real 4K gaming card today is the 4090, which is 25% faster than both the 4080 and 7900 XTX overall. It completely destroys last-generation cards. In a league of its own for 4K gaming, really. Hence the price.

The 7900 XT only averages about 5 fps higher minimum fps than the 4070 Ti in 4K gaming -> https://www.techpowerup.com/review/amd-radeon-rx-7800-xt/35.html

Also, the 7900 XT has no DLSS, which easily beats FSR. TechSpot tested this recently and FSR did not beat DLSS in a single game. DLSS is a lifesaver for 4K gamers, while AMD users are stuck with FSR only.

The image you posted is terrible because it is based on 4K results. For 1440p gaming, the 4070 Ti is crazy good, and according to Steam, 95% of gamers use 1440p or less. It performs like a 3090 Ti at 1440p, without using 500 watts to do so.

Yet the 4070 Ti only performs 10% below a 7900 XT in 4K gaming and you say it's bad? :roll: Wake up...
 
Probably the best GPU generation I can remember; the 4090 coupled with PT (path tracing) in Cyberpunk offers a glimpse into the future TODAY.
 
I really like my 4070 Ti; 4K games run like a champ and I can do RT with DLSS much better than on my 2080. It cost too much but performs quite well. I got a new TV just for this card, so I'm rocking 4K @ 120 Hz. A big improvement from the 4K 60 Hz 120mr TV.

Lately I've been playing Diablo 4, but soon I'm going to add Starfield to my gaming collection.

DLSS is good. I can only do RT at 1080p natively, but with DLSS I can run RT at a much higher output resolution. 1080p on a 4K TV is blocky, but DLSS nicely makes the image noticeably less blocky. Apparently there was a Starfield mod on day one to add DLSS.
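For anyone wondering why DLSS at a 4K output looks so much less blocky than plain 1080p: the image is reconstructed to the full 3840x2160 target from a lower internal resolution. A rough sketch using the commonly cited per-axis scale factors (these are approximate and can vary by game and DLSS version):

```python
# Rough sketch of DLSS internal render resolutions for a 4K output target.
# Scale factors are the commonly cited per-axis values for the DLSS 2 presets;
# actual values can vary slightly between games and DLSS versions.
OUTPUT_W, OUTPUT_H = 3840, 2160

presets = {
    "Quality":           0.667,
    "Balanced":          0.58,
    "Performance":       0.50,
    "Ultra Performance": 0.333,
}

for name, scale in presets.items():
    w, h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
    print(f"{name:<17} renders ~{w}x{h}, reconstructs to {OUTPUT_W}x{OUTPUT_H}")
```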
 
Too expensive, not enough VRAM, and no sign of innovation except for fake frames. This can be said about all models. Efficiency is good, and the 4090 is uber fast, but this won't save the series if Nvidia doesn't decrease prices. All SKUs should cost $50-100 less to make them a worthy purchase, especially if you consider how small these GPUs are and how simple their PCBs are.

I would probably choose the 4060 over the RX 7600 because this is the segment where DLSS and frame generation can potentially make a difference, but the 7700, 7800 and 7900 series are all much better offerings than their Nvidia competitors.

Nvidia should wake up from their post-mining slumber and stop ripping gamers off with simple cards that cost an arm and a leg for no reason whatsoever. This is what I think.
 
Besides this card is not meant for 4K gaming really

It really is for 4K. But the RTX 4070 Ti is the wrong card, with the wrong specifications and the wrong market position.
The reason for the good performance uplift is the change from Samsung's 8 nm node to TSMC's 4N.

And it's already memory-starved in certain titles:

[attached benchmark charts]

 
Some of the people on this forum must be really fun at parties... "The balloons don't have enough helium, they're supposed to have 16 psi, not 12." "I love 12 psi balloons." "No, that is crap."
 
That's crazy talk. I play at 3840x2160 just fine.

People like to exaggerate small differences and use the same 5 terrible console ports to prove their point.
 
Never mind...
It's only fitting, considering some posters identify one way or the other, which is what led to the other shit thread... so this is fitting.
 
So, as stated, but with no preset theoretical statement.

I want your opinions; keep them real, polite, and non-flamey! I'll add mine after page 5.
Are you a journalist or "think tank researcher" that gets paid by a third party to check what public opinion is on stuff?

"Ada" is just another series that I will buy one or two cards from for 1/3 or lower than MSRP in few years.
In other words : Just another day in future of computers.
 
Ada in my mind is best described by one word: Lazy.

1. Ampere 'graced' us with oversized coolers throughout the stack, and apparently, despite Ada nearly halving the TDP for some tiers, there are a lot of chunky boys around. There is no SFF / 1-slot GPU either, despite the fact that this is easily doable.

2. Ada continues the 'not enough VRAM relative to core power' nonsense, clearly pushing every GPU below the 4080 into obsolete territory sooner rather than later. The lack of VRAM is not necessary either - the margins are fat, the TDPs are not high. Another big meh is the 4060/Ti not carrying GDDR6X, while they damn well could and the card damn well needs it. Bandwidth on every Ada card below the 4090 is abysmal.

3. The price/perf offer with Ada is the worst it has been in years.

4. The feature set is a per-game implementation offer; Ada is forever in beta and might eventually offer you a performance uptick, IF you enable DLSS3/FG. Another aspect where the card is actually only good in a best-case scenario.

5. The positioning of the whole lineup is horrible, to the point of even Nvidia weaseling out from under their "4080" 12GB.

6. DLSS is a crutch, not a core technology guaranteed to appear in every game, but it is being sold as if it's a basic premise of your GPU. That just oozes all kinds of yuck and puke.
 
There are multiple parts of this and it is hard to sum up in one "yes" or "no" answer.

1. The 4090 is a beast of a card. It absolutely blows away everything and it even does that at lower power draw than a 3080 if you want it to. The catch is that you have to pay $1500+ (sometimes+++) for that. The costs have gone up beyond the rate of inflation, which casts a bad light on it.

2. The rest of the lineup is disappointing. The 4080 is a good card on its own, but it's overpriced. Going from a $699 "80 level" MSRP to $1199 in one gen is crazy. From a performance, and especially a perf/W, perspective it's a great card... but does it outperform enough other cards in that price range to make it worth it? No.

3. The 4080-lite 4070 Ti is way overpriced. Everything below it is memory limited and extremely price-gouged. Nvidia basically said you have to spend $1200+++ if you want to play over 1440p, and even there sometimes. Game devs are designing with 16GB consoles in mind today. Even beyond the total memory limit, the buses and bandwidth are extremely limited this gen, which has shown issues across all sorts of games. Most of Nvidia's cards this gen underperform, especially for the price, and it usually comes down to the memory choices and the cut-down chips compared to previous generations. There's a very visible decision from them to reduce the amount of silicon they put into gaming GPUs unless you buy the top card. They cherry-pick a few titles where they do OK to try and justify it, but there's no longevity in the series below the top two cards, and while some of that is the fault of game devs, Nvidia is well aware of the trends and made those decisions anyway. It's clear that they value AI more and are moving silicon allotment in that direction while still charging full price to us (and then some).

4. A note on the prices... You could not get a 4090 for months without spending well over "MSRP". Nvidia is clearly charging so much money that the AIBs have to bump up the prices by quite a bit to make any money. This is why EVGA dropped out of the game and why we have $2.2k 4090s from Asus and cards ranging from $1650 to $1900 from Gigabyte, MSI, PNY, Zotac, etc. Nvidia is making themselves the only viable company to sell cards at "MSRP" as an intentional means of undercutting their partners. This trend is worse with this generation than previous ones, so it would have to be weighed against this launch in a "yes" or "no" answer.

So, TL;DR: there are a couple of great cards in this generation, but most of it has been horribly priced, and some of it is so cut down that it doesn't outperform the previous generation even though the new architecture and platform are considerably more powerful/efficient than the previous gen.
 
^^ This is one of the key reasons why Tesla created Dojo: to ramp their AI compute without feeding a supplier's 60%-plus gross margins.
 
In terms of architecture, Ada is fine at the top and degrades as you go lower in the stack, thanks to the increasingly restrictive cuts to both VRAM capacity and VRAM bandwidth. Going forward, the lower cards' overall performance will likely degrade further relative to Ampere, Turing, and RDNA.

However that would be fine if pricing was decent, but from the 4080 on down the pricing is in no way an attempt to deliver value. This continues the trend started with Turing and arguably paused with Ampere if you stick with MSRPs, though in reality mining killed Ampere's value. Same with RDNA2. It seems Nvidia realized people will pay and kept prices up accordingly and AMD is happy to undercut Nvidia by just enough at just enough price points to where their cards are viable, but Nvidia doesn't really need to slash prices.

So the whole thing has been a disappointment unless you think it's reasonable to pay $1600 for a GPU.

Put another way: Towards the end of last year I was ready to pay about $700 or so for an Ada or RDNA3 GPU. I got a $550 6800 XT and it continues to very clearly be the best choice I could have made.
 
People could disagree just fine in the 'opposite' RX 7000 topic... and it's an interesting topic too, mind, because it also shows perspectives and poll changes over time.
 
In terms of architecture, Ada is fine at the top and degrades as you go lower in the stack, thanks to the increasingly restrictive cuts to both VRAM capacity and VRAM bandwidth. Going forward, the lower cards' overall performance will likely degrade further relative to Ampere, Turing, and RDNA.

However that would be fine if pricing was decent, but from the 4080 on down the pricing is in no way an attempt to deliver value. This continues the trend started with Turing and arguably paused with Ampere if you stick with MSRPs, though in reality mining killed Ampere's value. Same with RDNA2. It seems Nvidia realized people will pay and kept prices up accordingly and AMD is happy to undercut Nvidia by just enough at just enough price points to where their cards are viable, but Nvidia doesn't really need to slash prices.

So the whole thing has been a disappointment unless you think it's reasonable to pay $1600 for a GPU.

Put another way: Towards the end of last year I was ready to pay about $700 or so for an Ada or RDNA3 GPU. I got a $550 6800 XT and it continues to very clearly be the best choice I could have made.

Ever-increasing card prices are a very dangerous and risky game for the players that like to try this approach.
By charging more and more even for the lower tiers, they effectively kill the generational progress the whole exercise is supposed to deliver, and price people out of buying at all.
Today you bought the RX 6800 XT for $550, as you say; next year they will offer 10% more performance for 20% more money, and so on...
This is not sustainable, and the market will throw them out in response to such illogical strategies.
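To put numbers on that hypothetical: 10% more performance for 20% more money is a net loss in value, as a quick back-of-the-envelope check shows (illustrative figures only):

```python
# Back-of-the-envelope value check for the hypothetical above:
# "+10% performance for +20% price" is a net loss in performance per dollar.
old_perf, old_price = 100, 550   # e.g. today's RX 6800 XT purchase (illustrative)
new_perf, new_price = 110, 660   # +10% performance, +20% price next year

change = (new_perf / new_price) / (old_perf / old_price) - 1
print(f"Performance per dollar changes by {change:.1%}")  # roughly -8.3%
```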
 
I voted No. Terrible, terrible pricing all-round. Only the 4090 is really worth getting, if you can stomach the price. The 4080 is meh, and from there on it goes downhill rapidly. Had the pricing been sane and the sub-4080 cards been less cut down, especially with regard to memory size and bandwidth, it could have been an excellent launch cycle. However, my personal conviction is that for Nvidia this has been far less an actual product cycle than a market investigation into how little consumers can be offered, for how much money, and still end up buying their products. Sadly, I'm sure it has been a reasonably successful investigation too, from their perspective.

Edit: re. the above post - for Nvidia the risk of "being thrown out of the market" is likely negligible. The money they earn on gamers and semi-professionals who can't afford their big-boy cards counts for nearly naught compared to the professional market and the still nascent AI boom. They could drop us completely tomorrow and the losses in revenue would hardly be noticeable.
 
Warning to all. Keep it civil, on topic, and we're good. If you'd like to say what you think about ADA, feel free. Sack of shit, or bag of glory, please, go ahead.
 
Edit: re. the above post - for Nvidia the risk of "being thrown out of the market" is likely negligible. The money they earn on gamers and semi-professionals who can't afford their big-boy cards counts for nearly naught compared to the professional market and the still nascent AI boom. They could drop us completely tomorrow and the losses in revenue would hardly be noticeable.

If Nvidia wants to make the DIY graphics card segment a niche one, they are free to go on and see where it leads...
 
Ever-increasing card prices are a very dangerous and risky game for the players that like to try this approach.
By charging more and more even for the lower tiers, they effectively kill the generational progress the whole exercise is supposed to deliver, and price people out of buying at all.
Today you bought the RX 6800 XT for $550, as you say; next year they will offer 10% more performance for 20% more money, and so on...
This is not sustainable, and the market will throw them out in response to such illogical strategies.
It's another 'death of PC gaming' line, but so far none of that has materialized. Graphics card purchases have been influenced by various things over the years, but never really by the price per FPS.

People just postpone, save longer, buy second-hand more readily, drop down a tier or two, etc. Compromises. But they're being made. It'll take a lot more for the market to vanish, and so far, PC gaming has never - not once - in its history seen continuous decline. And overall, it's still a growth market too, even DIY PCs, mind. Not too long ago the whole PC gamer space spewed RGB, for example, with whole product lines popping up left and right; there are more peripheral companies than ever, etc. It's booming. And it even did that on the back of a crypto craze and grossly inflated GPU prices.

It's not entirely a bad thing that we buy cards less frequently, for example, and make bigger jumps in doing so. And if you look at the whole inventory of cards in the hands of gamers, devs are just going to optimize for whatever the mainstream line of performance is, even if it stalls. We've been there a few times in GPU history.

ALSO: consider the emergence of the PC handheld. The PC is now getting in on the low-barrier-of-entry race, for real, and I would dare say it's a success that is getting copied throughout the industry now. And what do you think of VR? It wouldn't exist if the PC wasn't a huge growth market with potential. Whoever said we needed those overpriced GPU stacks, hm?
 
Not worth the jump from the previous generation; not that huge a leap. The 4060 loses to the 3060 Ti in raw power, and the 3070 Ti beats the 4070... The 4xxx series is good only with DLSS, except for the two most powerful cards in the lineup. I have used it a few times and saw a difference (in lower quality) compared to all the advertising etc. With Nvidia's mega-duper 8GB of "high" VRAM I have run into a wall with a few games already (Far Cry 6, Harry Potter, Resident Sleeper) where fps goes from above 60 to a slideshow :love:. It ain't the VRAM, it ain't the VRAM, goddammit.

The main point of disappointment is that Nvidia pushes useless DLSS in games, presents DLSS as a groundbreaking innovation, and tries to cash in on people who believe everything they have read, seen, and been told.

Sad that I got a 3060 Ti during the mining boom, and the 6700 XT at that moment cost almost twice as much as the 3060 Ti, which was already overpriced cr@p.



Short answer: meh. My next GPU will be from RED.
 