Tuesday, January 3rd 2023

NVIDIA GeForce RTX 4070 Ti Launched at $799 with Performance Matching RTX 3090 Ti

NVIDIA today formally launched the GeForce RTX 4070 Ti "Ada" performance-segment graphics card at a starting MSRP of USD $799. Based on the 4 nm "AD104" silicon, the RTX 4070 Ti is essentially the same product as the RTX 4080 12 GB, which NVIDIA cancelled ahead of its original mid-November launch and has now relaunched at CES under a new model name. The card maxes out the silicon it's based on, featuring 7,680 CUDA cores, 60 RT cores, 240 Tensor cores, 240 TMUs, and 80 ROPs. It gets 12 GB of GDDR6X memory across a 192-bit wide memory interface, running at 21 Gbps (GDDR6X-effective). The card has a typical power rating of 285 W, and continues to use a 12VHPWR power connector, even on custom-design products.
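For reference, the peak memory bandwidth implied by those figures is easy to work out (a quick sketch; the 192-bit and 21 Gbps numbers come from the article above):

```python
# Peak memory bandwidth from bus width and per-pin data rate.
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    # bytes per second = (bus width in bytes) * (transfers per pin per second)
    return bus_width_bits / 8 * data_rate_gbps

# RTX 4070 Ti: 192-bit bus at 21 Gbps effective
print(peak_bandwidth_gb_s(192, 21.0))  # 504.0 GB/s
```

That 504 GB/s is roughly half the raw bandwidth of a 384-bit board at the same data rate, which is the gap the large L2 cache is meant to paper over.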

NVIDIA claims that the RTX 4070 Ti should enable maxed out AAA gaming with ray tracing at 1440p, while also being formidable at 4K Ultra HD in games that can take advantage of technologies such as DLSS 3 frame-generation, or even classic DLSS 2. The company claims that it offers performance comparable to the previous-generation flagship, the GeForce RTX 3090 Ti "Ampere," with a much higher performance/Watt rating. The RTX 4070 Ti doesn't appear to feature an NVIDIA Founders Edition model, and is a partner-driven launch, with custom-design cards dominating the scene. The RTX 4070 Ti will be available from January 5, 2023, but we'll have reviews for you before that!

150 Comments on NVIDIA GeForce RTX 4070 Ti Launched at $799 with Performance Matching RTX 3090 Ti

#101
Lycanwolfen
LeiI see, 6.3 times more transistors; 2.75 times more fps:

but 2.75 1080 Tis would draw about 300 W more than a single 4090

Funny thing: when two 1080 Tis are in SLI, it's pretty much the same result as the new card today. Took them 4 years to compete with the greatness they already had but killed off.
Posted on Reply
#102
Unregistered
Another overpriced GPU, hopefully no one buys it. Fortunately that seems to be the case for all new releases; the 4090, 4080 and 7900 are all available to buy at their real retail price.
#103
Denver
RedelZaVednoNgreedia's profit margin tells a different story. They hit 65%, an all-time high, before the mining bust, and AMD is no different. It's greed, not the cost of dies, that's hiking the prices.
This comes from server and AI GPUs that sell for multiple times the price of mainstream consumer GPUs.

The current cost of GPUs comes from multiple factors, the main one being that density gains from new manufacturing processes are coming more slowly, while cost per wafer practically doubles due to complexity. Don't expect it to get any cheaper going forward, with TSMC alone swimming ahead with no competition.

Second, the obsession with RT and high resolutions requires much larger GPUs with dedicated cores, plus all the additional R&D cost, and both RT and high resolutions demand massive amounts of (also increasingly expensive) video memory: imagine $300+ in memory alone on recent GPUs.
Posted on Reply
#104
watzupken
MxPhenom 216Die size is not the only thing determining fab costs; if that were the case then we would see very different wafer prices in the industry, but it's just not the case. 5nm/4N is expensive AF, no way to shake that. The GTX 1080 was on 16nm, and that process was dirt cheap relative to fab costs at <10nm today. Fab costs have gone up exponentially as nodes have gotten smaller, and TSMC has been announcing price increases the last couple of years, and just did again recently.
I don't disagree that fab costs have increased exponentially over the years. But assuming costs have increased 2x, we are just looking at the cost of the die. In other words, if it cost $150 to manufacture the GA104, and it's now $300 for the AD104, I don't think that is enough to account for all the price increases. It's got more GDDR6X, but the cost of VRAM and other components has likely come down due to stable supply and low demand at this point. In my opinion, they are just trying to increase prices knowing that demand is going to be weak, just so that they can maintain a level of profit by widening the margins.
kiriakostThis is the only thing worth talking about today: power consumption, RTX 3000 vs RTX 4000.
RTX 3000 failed to have reasonable power consumption relative to its transistor count.
And many poor kids wasted piles of money buying thermal pads and magical pills, and achieved nothing.
Power consumption is clearly an advantage for Ada. All retail Ampere chips were built on Samsung's 10nm-class process, even though they call it 8nm. Looking at the power characteristics of Samsung vs TSMC nodes, the latter clearly has a significant advantage. This is very apparent in the mobile SoC space, most recently with Qualcomm moving from Samsung to TSMC; the Snapdragon 8 Gen 1 vs 8+ Gen 1 is a good example. So moving from Samsung's 10nm-class node to TSMC's 4nm is a good two-node upgrade or more.

Also, thermal pads were meant for the VRAM, not for the core. At least in my case, changing the thermal pads clearly brought the memory hotspot temps down significantly.
Posted on Reply
#105
Minus Infinity
Just look at the pure raster numbers of this rip-off. It's not faster than the 3090. Without DLSS to save its arse it's an utter joke. It's a $599 pig dressed up with lipstick for $799. Even if it had a 256-bit bus and 16 GB, with just a lot fewer CUDA cores than the 4080, it shouldn't be more than $699.
Posted on Reply
#106
Chaitanya
So a whole lot of BS without any footnotes; hopefully they get dragged down legally for their "performance" claims.
Posted on Reply
#107
wheresmycar
It amazes me how quickly people jump to justify every and any given NVIDIA price point. Doesn't matter how much the 4070 Ti goes for (MSRP), you will always get people jumping on the YES-MERCHANT bandwagon and fighting tooth and nail for these holy Nvidia revelations. If I was a shareholder or investor it would make sense, but as a "consumer", yep just a barebone "consumer", it's difficult to digest the mid-range XX70 segment being hijacked with XXX-profiting. Not that i'm surprised.... seeing the exorbitant 4080 hitting the shelves, no doubt the 4070~ was always destined to rip holes in the general consumer's wallet (a pretty large portion of the consumer base). What is also inevitable is seeing the XX60/XX50 segments equally suffering from this greasy profit war machine, and no doubt AMD will sadly follow.

Personally, i might just pull the trigger on one, as I had stated previously i am willing to fork out $800 for a GPU (not a dime more).... but i expected way more for this sort of money and it's hardly an "exciting" buy for the money's worth. I feel for the guys on a budget... although most likely some will have a fat enough budget, it's just not cutting it nowadays with this ridiculous post-pandemic pocket-pinch price scheming. I dunno, might even give the 40-series/RDNA3 a miss ... i'm just not feeling the "GPU" upgrade impulse nowadays (well, since the 30-series anyway). Closely trailing RDNA3 doesn't seem exciting either.... all PANTS if you ask me.

Looks like most gamers looking for a spectacular eye-candy gaming experience + snappy frame rendering will have to settle for less. I'm glad i kept the impulse at bay and didn't move to 4K... that would have sucked for me! 1440p it is for another 5 years or so, it seems.
Posted on Reply
#108
Dr. Dro
N/AIt's not bad at all, what is not to like? The very fact that it's not 256-bit like it should be,

But take into account that the 3080 Ti, 3090 and all the halo products were discounted to under $1K, from the original $1199 to $1999. The 3080 12 GB went for as low as $750.

Clearly 48 MB of L2$ with 192-bit is as efficient as 384-bit with 6 MB. And the price goes from the slashed $999 of the 3090 Ti 24 GB to a more reasonable $799, losing half the bus and half the memory, but keeping the same ~40 TFLOPS.

The one to get is the 4070, the 5888-CUDA version, and if delivering 3070 Ti + 10% is as good as it gets, I'll take that. 3x faster than my 980 Ti.
3x performance in 8 years with a relative increase in price is not an accomplishment. The GTX 980 Ti is an ancient graphics card at this point. If the market was anywhere close to healthy, you'd have $150 low-end graphics cards giving it a biblical spanking. But instead, anything below $200 cannot beat it in performance, only in power consumption. That's just sad!
wheresmycarIt amazes me how quickly people jump to justify every and any given NVIDIA price point. Doesn't matter how much the 4070 Ti goes for (MSRP), you will always get people jumping on the YES-MERCHANT bandwagon and fighting tooth and nail for these holy Nvidia revelations. If I was a shareholder or investor it would make sense, but as a "consumer", yep just a barebone "consumer", it's difficult to digest the mid-range XX70 segment being hijacked with XXX-profiting. Not that i'm surprised.... seeing the exorbitant 4080 hitting the shelves, no doubt the 4070~ was always destined to rip holes in the general consumer's wallet (a pretty large portion of the consumer base). What is also inevitable is seeing the XX60/XX50 segments equally suffering from this greasy profit war machine, and no doubt AMD will sadly follow.

Personally, i might just pull the trigger on one, as I had stated previously i am willing to fork out $800 for a GPU (not a dime more).... but i expected way more for this sort of money and it's hardly an "exciting" buy for the money's worth. I feel for the guys on a budget... although most likely some will have a fat enough budget, it's just not cutting it nowadays with this ridiculous post-pandemic pocket-pinch price scheming. I dunno, might even give the 40-series/RDNA3 a miss ... i'm just not feeling the "GPU" upgrade impulse nowadays (well, since the 30-series anyway). Closely trailing RDNA3 doesn't seem exciting either.... all PANTS if you ask me.

Looks like most gamers looking for a spectacular eye-candy gaming experience + snappy frame rendering will have to settle for less. I'm glad i kept the impulse at bay and didn't move to 4K... that would have sucked for me! 1440p it is for another 5 years or so, it seems.
Agreed, though I'm unsure I can call either of the Navi 31 duo "settling". They perform very well, even if AD102 is ahead of the curve. It will be some time before games can't run well on them.
Posted on Reply
#109
MxPhenom 216
ASIC Engineer
ARFThe cable issue is due to the terrible card design. Instead of making a longer PCB with the connector on the far right, they did the opposite: a very short PCB with the connector in the middle.
If you look at your ATX case and the cable management, you will see that it is this design choice which led to the "cable issue".



RTX 4070 Ti is too slow for this market tier.
Look, if the RTX 4080 were within 5-10% of the RTX 4090, and the RTX 4070 Ti within 5-10% of the RTX 4080, it would be better.
Now the RTX 4070 Ti will be at around 60% of the performance of the RTX 4090! Not OK.
Too slow? If it's anywhere close to RTX 3090 performance at $799, that ain't that bad. The RTX 3080 MSRP was $699 and the 4070 Ti will likely be faster.
ARF12nm didn't cost that low :kookoo:
12nm wafer costs roughly the same as 16nm since they are mostly the same thing.
Posted on Reply
#110
Dr. Dro
MxPhenom 216Too slow? If it's anywhere close to RTX 3090 performance at $799, that ain't that bad. The RTX 3080 MSRP was $699 and the 4070 Ti will likely be faster.


12nm wafer costs roughly the same as 16nm since they are mostly the same thing.
The primary problem is that performance should increase generationally, yet the price should not; otherwise that negates the generational leap almost entirely, since these GPUs have no major breakthrough features that differentiate them from the previous generation (i.e. Ampere supports DirectX 12 Ultimate to its fullest, and the vast majority of software doesn't take advantage of its capabilities yet).

So overall we have NVIDIA's marketing team tryharding with ridiculous lies like "4070 Ti + DLSS = 3x 3090 Ti", as if that 8-month-old $2000 GPU were already an ancient relic, all in an attempt to get unsuspecting buyers on board. It's beyond low, it's just pathetic. Not two years ago they were touting the RTX 3090 as an 8K-ready next-generation product ready for the future, and now they're calling it "mainstream gamer's hardware", a polite dig of "what are you, poor?", which to me just implies a medium-settings 1080p experience throughout; the 40 fps or so that my RTX 3090 runs Portal RTX at seems to imply that, at least.

A 192-bit GPU with such a modest BoM, in this precise segment, should not be over $499.
Posted on Reply
#111
MxPhenom 216
ASIC Engineer
Dr. DroThe primary problem is that performance should increase generationally, yet the price should not; otherwise that negates the generational leap almost entirely, since these GPUs have no major breakthrough features that differentiate them from the previous generation (i.e. Ampere supports DirectX 12 Ultimate to its fullest, and the vast majority of software doesn't take advantage of its capabilities yet).

So overall we have NVIDIA's marketing team tryharding with ridiculous lies like "4070 Ti + DLSS = 3x 3090 Ti", as if that 8-month-old $2000 GPU were already an ancient relic, all in an attempt to get unsuspecting buyers on board. It's beyond low, it's just pathetic. Not two years ago they were touting the RTX 3090 as an 8K-ready next-generation product ready for the future, and now they're calling it "mainstream gamer's hardware", a polite dig of "what are you, poor?", which to me just implies a medium-settings 1080p experience throughout; the 40 fps or so that my RTX 3090 runs Portal RTX at seems to imply that, at least.

A 192-bit GPU with such a modest BoM, in this precise segment, should not be over $499.
I mean, if we are going to use a GPU's memory bus width as a metric for GPU pricing, sure.... I guess
Posted on Reply
#112
Dr. Dro
MxPhenom 216I mean, if we are going to use a GPU's memory bus width as a metric for GPU pricing, sure.... I guess
Bus width is, imho, a decent metric for a couple of reasons: G6X ICs are still relatively expensive, and the more of them you add, the more PCB complexity is involved, which also increases the BoM.

A good case study is the original RTX 3090: it initially carried hundreds of dollars in memory alone*, due to very high prices early on and needing 24 ICs, plus a PCB and power delivery system to match.

*= (the rumor was that it had roughly $600 of memory on it back then, even accounting for economies of scale; I do not know if this is true, but given that the chips used on it still fetch $24.50 each on the low-volume market, it may very well be)

The RTX 4070 Ti, in contrast, should use a much simpler design with only six 16 Gbit G6X ICs, and the AD104 is a relatively small processor with a 250-280 W footprint, which doesn't require as advanced a VRM as either the AD103 or the AD102.
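The IC-count arithmetic above can be sketched quickly (the $24.50 per-IC spot price is the figure quoted earlier in this post, not an official number):

```python
# Rough GDDR6X BoM sketch. IC counts follow from capacity:
# an 8 Gbit IC holds 1 GB, a 16 Gbit IC holds 2 GB.
def ic_count(total_capacity_gb: int, ic_density_gbit: int) -> int:
    return total_capacity_gb // (ic_density_gbit // 8)

rtx_3090_ics = ic_count(24, 8)     # 24 ICs, mounted clamshell on the 3090
rtx_4070ti_ics = ic_count(12, 16)  # only 6 ICs on a 192-bit board

SPOT_PRICE_USD = 24.50  # per-IC price quoted in the post; unverified
print(rtx_3090_ics, rtx_4070ti_ics)       # 24 6
print(rtx_3090_ics * SPOT_PRICE_USD)      # 588.0, close to the $600 rumor
```

At that (unverified) spot price the 4070 Ti's six ICs would run about $147, a quarter of the 3090's memory bill, which is the BoM gap the post is pointing at.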
Posted on Reply
#113
Pumper
Argyrwaiting for 4000 series prices to fall is delusional, just look at current 3000 series prices. it's over. cheap or affordable GPU's are a thing of the past
Is it? GPU sales are at a 20-year low, so it's only a matter of time before the scalping corporations have to make significant price cuts just to free up storage space.
Posted on Reply
#114
john_
PumperIs it? GPU sales are at a 20-year low, so it's only a matter of time before the scalping corporations have to make significant price cuts just to free up storage space.
No, he is right. Nvidia didn't drop prices on cheaper 3000 models, only on expensive ones. Nvidia dropped prices only on the 3090/Ti, and the tech press celebrated with "Nvidia dropped prices" articles, creating the misleading impression that prices dropped on all models. AMD did price drops across the line, the tech press relegated those to footnotes, and people kept buying Nvidia cards.

The environment is so toxic, so monopolistic, so favorable to Nvidia, that Nvidia, controlling 90% of the market, has no real reason to drop prices. Intel is still too far behind to be a factor, and AMD still doesn't seem very excited about the GPU market; they are probably more concerned about AM5. Both Intel and AMD will gain long term if Nvidia does all the work needed to establish higher prices in the retail GPU market. Also, I doubt OEMs pay these prices. Nvidia advertises the 4070 Ti at $800, but what does Dell, for example, pay to get one? $800? $700? $600? $500? Those high prices are beneficial for every company out there making gaming equipment, just not for consumers. MS and Sony can price their next-gen consoles higher, and so can Valve and others building gaming handhelds. Dell, to use the same example again, can buy a 4070 Ti from Nvidia at, let's say, $600 and price it at $700, making its prebuilt PCs look like bargains, and of course AMD and Intel price their own offerings higher. While they will sell less stuff, they will achieve higher profit margins, and if TSMC doesn't have enough capacity for everyone, selling less stuff at higher profit margins is crucial. Nvidia also looks more at AI, robotics and such today than at gaming; AMD is concentrating more on servers while it still has an advantage there; as for Intel, they build GPUs out of necessity, not to make gamers happy.

Cheap GPUs are dead. We already witnessed the death of sub-$100 GPUs these last years; for sub-$200 we only got the laptop-grade RX 6500 XT, then the beta A380, and finally the abysmal insult in the form of the old-arch GTX 1630. Under $300 we only have one good option, the RX 6600, which no one buys because it doesn't have an Nvidia logo on it. Instead they buy the RTX 3050. Go figure....

Nvidia dropping prices? Why?
Posted on Reply
#115
zer0day777
Argyrsweet, exactly what I need, nothing more.


RTX 3080 VENTUS 3X PLUS 10G OC - cheapest 3080 in my country, price is exactly 1k EUR
lol I'd need AAA 4K raytracing, sorry. And I definitely mean that WITH sane pricing. Oh, what's that? nuVidia™ RTX™ can't even perform on the third iteration? pfffft

Facts: I've had a 1440p/75 Hz panel since the start of the pandemic, and I've been gaming at 1440p with an RDNA1 card because RTX gen 1 cost too much in comparison ($500 floor minimum on the 2070 Super aka 2070 Ti; $410 for a much nicer-looking 5700 XT, same performance, up to $170 cheaper than what I was looking at back then for a 2070S custom AIB). So, to me, I'm going to need 4K raytracing at affordable prices by this point, considering we are now up to RTX gen 3. nuVidia no longer has any excuse on this. It's even worse now that RDNA3 has shown up and, underwhelming though it is, still isn't that far behind in RT ultimately.

So basically, if it can't even do that, I've got ZERO reason to ever upgrade, because by that logic I don't have to replace my monitor, since the card is literally unable to do raytracing at 1440p native even to this day. Really, all the DLSS bull$%^& and FSR nonsense is trying to muddy the water (literally lel) when the clear issue is PRICE TO PERFORMANCE. That's it. I don't care what s&%$ this company throws at the fan to trick and confuse people at this point. Jensen honestly bet (and lost his gamble, badly) on mining farms still having demand for scalped-af mining GPUs; that's vanished. And so now they're literally stuck with gamers, who they've been screwing for generations, trying to justify this HORSE %$#&. IT WAS NEVER "INFLATION".

IT WAS LITERALLY ALWAYS THEIR PRICE/PROFIT MODEL. So bad that nobody purchased a 2080, because why tf would you when you could get a 1080 Ti instead for cheap. And that was a thousand dollars/euros at retail, which the 2080 Ti then bumped to $1200. Think about the MSRP of the GTX 1070 or 1070 Ti. Well, then the 2070 Ti (aka 2070 Super) is $500, right? So that's when I thought the price didn't make sense, so I switched to AMD, which hasn't failed me yet (and was frankly an excellent mining card; it literally paid for itself with extra profit in 2020-2021). That was BEFORE the $600 RTX 3070 Ti, btw, which WAS BEFORE THE ALLEGED "INFLATION" EVEN STARTED HAPPENING. It's a SCAM.

So, no, Jensen can go F himself, and his shareholders. Anyone who buys this card at anywhere near these prices is a simp, a moron, and a cuck of the highest calibre. It's like they're playing some alleyway scam on total idiots, where you show bad thing 1 and it sucks less than bad thing 2, so they con themselves into thinking bad thing 1 is "a better deal." No, they're both TERRIBLE deals. Seriously, I hopped off team green when RTX 2000 pricing was unbelievable, and as disappointing as RDNA3 has been, it's nowhere near the disaster that is Lovelace. I wouldn't even consider it without them literally halving the prices.
Posted on Reply
#116
kiriakost
watzupkenI don't disagree that fab costs have increased exponentially over the years. But assuming costs have increased 2x, we are just looking at the cost of the die. In other words, if it cost $150 to manufacture the GA104, and it's now $300 for the AD104, I don't think that is enough to account for all the price increases. It's got more GDDR6X, but the cost of VRAM and other components has likely come down due to stable supply and low demand at this point. In my opinion, they are just trying to increase prices knowing that demand is going to be weak, just so that they can maintain a level of profit by widening the margins.


Power consumption is clearly an advantage for Ada. All retail Ampere chips were built on Samsung's 10nm-class process, even though they call it 8nm. Looking at the power characteristics of Samsung vs TSMC nodes, the latter clearly has a significant advantage. This is very apparent in the mobile SoC space, most recently with Qualcomm moving from Samsung to TSMC; the Snapdragon 8 Gen 1 vs 8+ Gen 1 is a good example. So moving from Samsung's 10nm-class node to TSMC's 4nm is a good two-node upgrade or more.

Also, thermal pads were meant for the VRAM, not for the core. At least in my case, changing the thermal pads clearly brought the memory hotspot temps down significantly.
This is the point you got wrong: it's true that the pads serve the VRAM, but with adequate VRAM cooling a portion of the heat generated by the GPU is removed too.
It was 100% NVIDIA's responsibility to think first and improve the electronics design, instead of regular people becoming beta testers and searching for solutions of their own.
NVIDIA is the one which got your money, and you should receive a trouble-free product.

My advice to all: just be more careful about your choices from now on.
Posted on Reply
#117
ARF
watzupkenI don't disagree that cost of fab have increased
It is exaggerated. The fab cost can actually be lower than in years past because of more efficient technologies and optimisations: independent power supply from their own photovoltaics, more energy-efficient buildings and equipment, lower employee overhead, and other cost reductions.

There is no scientific proof that the newest technology today is more expensive than the newest technology was 5 or 10 years ago.
It could be fake news, speculation, and lies in order to justify the profit margins, and the private jets and yachts of top management and stockholders.
Posted on Reply
#118
zer0day777
john_No, he is right. Nvidia didn't drop prices on cheaper 3000 models, only on expensive ones. Nvidia dropped prices only on the 3090/Ti, and the tech press celebrated with "Nvidia dropped prices" articles, creating the misleading impression that prices dropped on all models. AMD did price drops across the line, the tech press relegated those to footnotes, and people kept buying Nvidia cards.

The environment is so toxic, so monopolistic, so favorable to Nvidia, that Nvidia, controlling 90% of the market, has no real reason to drop prices. Intel is still too far behind to be a factor, and AMD still doesn't seem very excited about the GPU market; they are probably more concerned about AM5. Both Intel and AMD will gain long term if Nvidia does all the work needed to establish higher prices in the retail GPU market. Also, I doubt OEMs pay these prices. Nvidia advertises the 4070 Ti at $800, but what does Dell, for example, pay to get one? $800? $700? $600? $500? Those high prices are beneficial for every company out there making gaming equipment, just not for consumers. MS and Sony can price their next-gen consoles higher, and so can Valve and others building gaming handhelds. Dell, to use the same example again, can buy a 4070 Ti from Nvidia at, let's say, $600 and price it at $700, making its prebuilt PCs look like bargains, and of course AMD and Intel price their own offerings higher. While they will sell less stuff, they will achieve higher profit margins, and if TSMC doesn't have enough capacity for everyone, selling less stuff at higher profit margins is crucial. Nvidia also looks more at AI, robotics and such today than at gaming; AMD is concentrating more on servers while it still has an advantage there; as for Intel, they build GPUs out of necessity, not to make gamers happy.

Cheap GPUs are dead. We already witnessed the death of sub-$100 GPUs these last years; for sub-$200 we only got the laptop-grade RX 6500 XT, then the beta A380, and finally the abysmal insult in the form of the old-arch GTX 1630. Under $300 we only have one good option, the RX 6600, which no one buys because it doesn't have an Nvidia logo on it. Instead they buy the RTX 3050. Go figure....

Nvidia dropping prices? Why?
The thing is, though, they don't; people just keep claiming everyone is buying those. Partly this is because most people buy prebuilts, and nVidia still hasn't managed to F up their contracts (yet) with all the SIs and major brick-and-mortar stores, so every piece-of-crap prebuilt has got a 1060 or 1650 or 3060 or something like that in it, no matter how bad the hardware actually is or how outrageous the price. Like, tf you think I'd pay that much for a 1650 Max-Q? Hell, Craigslist is hysterical or sad, not sure which; I saw somebody trying to sell, no joke, an RX 460 2GB system "bought new from Best Buy never opened" with some other garbage hardware for $700. This on the same page where someone's selling a used 2080 Ti for under $400.

I think it's that the megacorpos just love ignorance; hell, capitalistic excess generally favours NPCs: the ignorant, the impulsive, the most childlike creature imaginable, useless at anything other than being a consumer drone, a cheap biorobot worker, and expendable cannon fodder (the rest get shipped to private schools to be the middle-management biorobots). So it's even more obvious to us in tech, where we can clearly see the ripoff and are used to working with numbers, and these scumbag dirtbag corporations pull literally the exact same sort of scammer-from-Mumbai "hello your PC is broken we need your Social Security number to unlock it from virus" crap on old people, kids, and NORPs that don't know better. And they try making it sound like computers are "really complicated" when they are not; you can easily teach yourself how not to get swindled, it's just that they hold back some of the info and play 3-card shell games, like calling it "i7 with 8gb RAM", yeah, a single stick, not dual rank, lowest-speed dustbin RAM with an ancient low-tier 9700 or something.

I'm so outraged by nuVidia I'm just not buying anymore. They became nuVidia at RTX, frankly; it's embarrassing. They had their problems and have always been a scumbag, scammy company, from the alleged GTX 1060 "3GB model" to the 3.5GB of the GTX 970, but at least Pascal and Maxwell were really good. They didn't even age poorly, no matter how much nuVidia wishes they did. This is because their dumb crap usually doesn't take off, since no one wants to deal with them and their proprietary nonsense. No one uses HairWorks, for example: can you imagine selling 1080 Tis purely on "it does HairWorks and AMD cannot"?? They try forcing the market to obsolescence more quickly, but really, RTX can't even run RT natively; their bottom-end SKUs can't even hit acceptable frames with the smudgiest DLSS on. The alleged "RTX" 3050 should be a GTX 3050. But the real problem is lots and lots of morons too braindead to understand why a 3050 is a horrific deal at $400; it's literally slapped to s&%$ and thrown in a dumpster by the 5700 XT, even the 5600 XT iirc.
MxPhenom 216I mean, if we are going to use a GPU's memory bus width as a metric for GPU pricing, sure.... I guess
I think the bigger deal is the fail across every conceivable metric, from pricing, to performance, to power consumption, hell, even to memory bus and anything else you can think of. There's no reason to buy a Lovelace GPU at all. You'd have to be a moron, or so completely misinformed you already spent your money like a simp and found this thread in like 2025. They SUCK. I think literally the last time nVidia released a generation this terrible was the GTX 280? Wasn't that when they charged $650 for a card that AMD's top-end HD something beat at $400, so they had to lower prices?

That's the problem: they not only refused to lower prices, they jacked them. I saw those TDPs long ago and was like "alright, AMD needs to not screw up, basically, or nVidia needs to lower prices, because I'm not paying that for something so inefficient I need a new PSU." Lo and behold, they not only didn't lower prices, they jacked them outrageously, to literally scalper levels.

Stop giving nuVidia your money.
MxPhenom 216Too slow? If it's anywhere close to RTX 3090 performance at $799, that ain't that bad. The RTX 3080 MSRP was $699 and the 4070 Ti will likely be faster.


12nm wafer costs roughly the same as 16nm since they are mostly the same thing.
Like this: why are any of you even rationalizing this? It's like watching heroin addicts and alcoholics trying to justify why dying in a gutter is a great life decision for them. Also, the 3090 wasn't much better than a 3080, facts. Each time with nVidia lately it felt like paying $500 for a 2.5% uplift. That's literally hitting the margin of error, where you can get that kind of performance by just cleaning the dust out of your GPU and repasting, or getting a different AIB model or something.

You know the funniest thing to me? They also released such a terrible last few generations that not only have people been having all kinds of performance problems due to dumb s--- like driver bloat, but even just going on Steam all the time it's "how come my 3080 gets such low frames", "hello why is my 3070 getting this problem", and you go through them and quickly realize people are like "I have an RX 6800 and it just werks for me."
wheresmycarIt amazes me how quickly people jump to justify every and any given NVIDIA price point. Doesn't matter how much the 4070 Ti goes for (MSRP), you will always get people jumping on the YES-MERCHANT bandwagon and fighting tooth and nail for these holy Nvidia revelations. If I was a shareholder or investor it would make sense, but as a "consumer", yep just a barebone "consumer", it's difficult to digest the mid-range XX70 segment being hijacked with XXX-profiting. Not that i'm surprised.... seeing the exorbitant 4080 hitting the shelves, no doubt the 4070~ was always destined to rip holes in the general consumer's wallet (a pretty large portion of the consumer base). What is also inevitable is seeing the XX60/XX50 segments equally suffering from this greasy profit war machine, and no doubt AMD will sadly follow.

Personally, i might just pull the trigger on one, as I had stated previously i am willing to fork out $800 for a GPU (not a dime more).... but i expected way more for this sort of money and it's hardly an "exciting" buy for the money's worth. I feel for the guys on a budget... although most likely some will have a fat enough budget, it's just not cutting it nowadays with this ridiculous post-pandemic pocket-pinch price scheming. I dunno, might even give the 40-series/RDNA3 a miss ... i'm just not feeling the "GPU" upgrade impulse nowadays (well, since the 30-series anyway). Closely trailing RDNA3 doesn't seem exciting either.... all PANTS if you ask me.

Looks like most gamers looking for a spectacular eye-candy gaming experience + snappy frame rendering will have to settle for less. I'm glad i kept the impulse at bay and didn't move to 4K... that would have sucked for me! 1440p it is for another 5 years or so, it seems.
lmao
"I will pay the merchant $1299, but not a dime more!"
I wouldn't be willing to pay $600 for a 4070 Super, period. I literally just skipped the RTX 2070 Super based on it being $500 minimum, EVGA models $570+, while meanwhile Radeon released comparably performing $400 cards. "But it doesn't have ray tracing!" So tell me then, was your DLSS 1.0 RT experience on that card really worth the extra $200? And it mined worse too, so it cost less on eBay even during the scalping.

That's what you are asking me to do: make my next upgrade a card that, at my targeted segment, was already getting a bit steep at $500, and going up to $600 I'm starting to expect xx80 performance. Considering the GTX 980 was $550, and not even all that long ago, and was a MUCH better made GPU, I don't find it terribly unreasonable to say "I'm not spending a dime more than $600 on a 70 Ti custom card and that's firm." And that's also asking me to do it for a generation that, let's face it, just plain sucks.
We all know it. It's literally a MEME. Like an
honest-to-God joke GPU. It's a clowncarPU. And I can't even fit the stupid things in my case anyway, they take up 4 freaking slots, so goodbye literally anything else (yes, I do use those x1 and x8 slots btw), and they require a completely monstrous PSU, which suddenly pushes my budget way past a thousand dollars, because think about it:
As the average gamer, you have, what, like $500 for a GPU? Well, let's think of it this way: you've got a new monitor in store too. Let's double that budget. You aren't going to magically have $500 appear from thin air in your account, but let's imagine you do. $1000 to play games at better visual quality. So let's say $350 for the monitor. That's $650 for the GPU, and now how much is that PSU going to cost? Suddenly you've dropped from being able to afford 3080 quality to 3060 Ti quality at best. Meanwhile you can get 6800 XT quality for the same cost, because you don't have to buy a bigger PSU.
That's why Lovelace is such a bad deal.
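The budget math in that post can be sketched out explicitly. All the dollar figures here are the post's own ballpark assumptions (the $150 PSU-upgrade cost is an added illustrative guess), not real quoted prices:

```python
# Rough upgrade-budget arithmetic from the post above.
# All figures are assumed ballpark numbers, not actual prices.
budget = 1000          # assumed total upgrade budget, USD
monitor = 350          # assumed monitor cost
psu_upgrade = 150      # hypothetical cost of a beefier PSU for a high-TDP card

gpu_without_psu = budget - monitor               # GPU money if the old PSU suffices
gpu_with_psu = budget - monitor - psu_upgrade    # GPU money if a PSU upgrade is forced

print(f"GPU budget, existing PSU: ${gpu_without_psu}")
print(f"GPU budget, new PSU:      ${gpu_with_psu}")
```

Under those assumptions, a card that forces a PSU upgrade effectively costs a tier more than its sticker price, which is the post's whole point.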

I mean really, it's just s#@$ from every conceivable angle. The sole thing nuVidia accomplished was padding shareholder pockets and getting simps to cheer on their own robberies: "yeah, but it benefits shareholders, so it's so smart!" Oh, and allegedly better performance, but when you slam in that much additional power and it costs more, you really didn't advance at all, you just expanded the range of halo products into more luxury-priced halo products. You could LN2 a 3080 and do the same thing, more or less.
Minus InfinityJust look at the pure raster numbers of this rip-off. It's not faster than a 3090. Without DLSS to save its arse, it's an utter joke. It's a $599 pig dressed up with lipstick for $799. Even if it were a 256-bit bus and had 16 GB, with just a lot fewer CUDA cores than the 4080, it shouldn't be more than $699.
This.

But again, this is part of a broader problem. It's like, if a guy is a heroin addict, stealing from people, stealing from friends, with a big criminal record, are we really going to spend that much time arguing about how he ripped off grandma? It isn't even that he stole grandma's checkbook at this point. It's that Jensen has simply become so freaking brazen about it he's all but throwing a towel on the floor of the jail cell: "pick up my towel for me, bish." Jensen is openly daring you all to bend over, because he feels he's pushed up on you so much that he can do literally anything to you and you'll love him for it.
kiriakostThis is the point you got wrong: it's true that the pads serve the VRAM, but adequate VRAM cooling also removes a portion of the heat generated by the GPU.
It was 100% NVIDIA's responsibility to think first and improve the electronics design, instead of regular people becoming beta testers and searching for solutions on their own.
NVIDIA is the one that got your money, and you should receive a trouble-free product.

My advice to all: just be more careful about your choices from now on.
Man... lol, that's the problem, and again I cannot emphasize enough that these kinds of things are why I hopped off nVidia's dik years ago and got an RDNA1 card instead. People gave the 5700XT so much shit (it deserved it on release, but it became a dead-horse meme as people realized it got fixed and was one of the best value cards of the pandemic), but really, that's a ~$400 card, and meanwhile crickets when you bring up the nonstop issues of a $1200 GPU like the 2080 Ti. I mean, if I spent over a grand on a GPU, it damn well better work right out of the box. Instead, somehow the 2080 Ti never got the meme-status s%$* it rightfully deserved for being 5700XT-release tier... while costing over a thousand dollars.

But it didn't end there, did it? Ampere was the same thing, all kinds of problems plagued that series of cards, right up to their halo products bursting into flames. The 3090, not the 4090, which also burst into flames apparently. POSCAPs were a big issue, they had bad drivers, and I remember they actually downclocked the 3080, so it hilariously became the anti-fine-wine, performing worse with drivers over time, because they couldn't stop its crashing otherwise.

It happens because nobody wants to hold nVidia accountable, and you'll notice the actual businessmen, the ones who make money instead of just playing with toys, all dropped nVidia. Apple, EVGA, Sony, Microsoft, nobody really wants to work with that company, and that includes game devs. So I think it's even funnier thinking about this and realizing all that AAA gaming is being fine-tuned on AMD-only hardware: Xbox and PS5 use Ryzen processors and RDNA2 graphics, so I'm not sure what people are thinking buying these and trying to claim some AAA gimmick.
Dr. Dro3x performance in 8 years with a relative increase in price is not an accomplishment. The GTX 980 Ti is an ancient graphics card at this point. If the market was anywhere close to healthy, you'd have $150 low-end graphics cards giving it a biblical spanking. But instead, anything below $200 cannot beat it in performance, only in power consumption. That's just sad!

Agreed, though, I'm unsure I can call either of the Navi 31 duo "settling". They perform very well, even if AD102 is ahead of the curve. It will be some time until games cannot run well on that.
Yeah, see, this guy also gets it. If you actually remember the 1 GB graphics cards being "huge" because we finally hit the gigabyte mark on VRAM... well, it's not really been impressive lately per performance. We had something like this sort of with Fermi, but not even Fermi was anywhere near these TDPs, and the 590 was like 365 W. Meanwhile we have pathetic things like GTX 1630 pricing, companies realizing they can sell literally anything and people will buy it. The same crap happened with scalping, because the mining farms set the new standard: it was a direct economic calculation of how much money can I make, and how long will it take to turn my investment from a net loss into a gain. That's why cards cost like $2000, because you could still turn a profit at that mark. Meanwhile the occasional moron gamer would buy a GPU at those prices to play games on. But when you go "all the way back" to like 10 years ago, a GTX 980 Ti was a beast at 240 W or whatever it was.

If you remember that, then you're also mentally comparing any card today to the GTX 980 at $550, both in terms of uplift and pricing. Same goes for the GTX 980 Ti and GTX 1080 Ti, the two generations nVidia was undisputedly good, before the dark times, before RTX. Every single year since then they've jacked prices and delivered far less. They've had even worse standards and all kinds of problems everywhere, poorer physical products (no $500+ card should have a plastic backplate; the 900 series was all metal backplates, though to be fair they started using more plastic shrouds after that, and there was a time when it was all metal), numerous bugs. I mean really, nuVidia has zero room to poke fun at AMD when bug-wise nuVidia's been a complete disaster. And it wouldn't even be such a big deal were it not for the fact that they're charging these ridiculously outrageous prices for a throttling inferno. Like, I can't even imagine paying $800 for a 70 Ti. That's insane. So a 4070 is NOT in my plans, period. But the problem is AMD has no reason to set a 7700XT at a "normal" $500 if people are stupid enough to buy an nVidia card. So it's the same problem as people rewarding the scalpers, only now nVidia is the scalper.
Would I publicly shame a man for giving his money to a scalper?
Yes I would.
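The mining-era pricing logic described above really was just a payback-period calculation. A minimal sketch, where every input (card price, daily revenue, power draw, electricity cost) is a hypothetical number for illustration:

```python
# Payback-period math a mining farm would run before buying a scalped GPU.
# All inputs below are hypothetical, for illustration only.
card_price = 2000.0        # assumed scalped GPU price, USD
daily_revenue = 8.0        # assumed mining revenue per day, USD
power_draw_kw = 0.3        # assumed card power draw, kilowatts
power_cost_kwh = 0.10      # assumed electricity price, USD per kWh

daily_power_cost = power_draw_kw * 24 * power_cost_kwh
daily_profit = daily_revenue - daily_power_cost
payback_days = card_price / daily_profit

print(f"Daily profit: ${daily_profit:.2f}, payback in {payback_days:.0f} days")
```

As long as that payback period stayed shorter than the expected mining-profitability window, a farm could rationally pay $2000 for a card, which is exactly how farms set the street price over gamers' heads.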
Posted on Reply
#119
W1zzard
TaishoEverything here gets a 5/5, no matter that it underdelivered on performance, efficiency, affordability, and cooling - all at the same time. Look at the RX 7900 XT review. I would remove the star rating entirely (for past reviews too) because it makes TPU look like sponsored influencers.
We removed numeric scores a long time ago, because people were crazy about the number and the drama disrupted all other discussion.

I guess you mean the rating stars in Google search results? Google awards the ability to display those stars to sites that it considers authoritative on the topic. The stars make the result stand out more, so more clicks. More clicks = higher placement in the search results. Unfortunately a majority of visitors (I polled people on this!) think that the stars are a rating for the article, not for the tested product.

Removing the stars would lower click rates, which would push us down in search results, and people would no longer find our reviews. Then why even write reviews?
Posted on Reply
#120
zer0day777
WatchThe80s800 + 27% VAT is about 1020 USD (VAT only), up to 1300 USD (bonus for design, or for energy prices, lol) depending on how much retailers put on it, so the 4070 Ti is still damn cheap (though it's not).
The cheapest 3070 Ti is 790 USD (with VAT), you can't get a 3080 below 1060 USD (with VAT, and the average price is still over 1160 USD with VAT), the cheapest 3080 Ti is 1670 USD (with VAT, average 1850), the best 3090 deal of the month was 1965 USD (with VAT, but it was a single card and then somehow gone; after that the average minimum price is 2500 USD with VAT, GG Hungary, look what nice retailers we have), and there are no new 3090 Tis.

Most of the 4080s landed at 1560-1690 USD (with VAT), the cheapest 4090 at 2100 but averaging 2600 USD (with VAT).
The 7900 XT, oh boy, goes for 1170-1270 USD with VAT and the AIB design models go for 1320-1450 USD; the 7900 XTX is 1530 (only the standard) to 1800 (feel the premium, just feel it, you can't see it).

So with that performance I can finally see it as an actual long-lasting buy at that price, with 3090 performance, though it could still be cheaper for an XX70 card, like 600-650 + VAT.
I use a 970, and in 2021 I wanted to upgrade to a 3070 Ti or a 3080. HAHAHA lol, still using my 970.
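The quoted Hungarian shelf prices follow from straightforward VAT math (Hungary's standard VAT rate is 27%; the ~28% retailer/AIB markup used here is an assumption chosen to reproduce the post's upper figure):

```python
# MSRP-to-shelf-price math for a 27% VAT market like Hungary.
msrp = 799                 # RTX 4070 Ti launch MSRP, USD
vat_rate = 0.27            # Hungarian standard VAT rate

with_vat = msrp * (1 + vat_rate)    # roughly the post's "VAT only" figure
with_markup = with_vat * 1.28       # assumed ~28% retailer/AIB markup on top

print(f"MSRP + VAT:        ${with_vat:.0f}")
print(f"+ retailer markup: ${with_markup:.0f}")
```

That puts VAT-only pricing near 1015 USD and the marked-up "premium AIB" figure near 1300 USD, matching the range the poster reports.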
The funniest thing, the only reason to even buy a card today is your old Maxwell/Pascal broke, or because you're getting a 4k or 1440p monitor.
Seriously, that's it. And they can shill this RT b.s. all they want but AMD has RT too so nuVidia's not special, it's just another HairWorks far as I (or a lot of other gamers) are concerned. Games like Battlefleet Gothic Armada II and The Ascent look very pretty regardless, because Unreal is a pretty engine. nVidia would try calling those particle effects "PTXtm ParticleCorestm" or something and try comparing how well their card does that compared to AMD.
Meanwhile in all reality, they're more or less equivalent brands in performance, and the real reason to upgrade is going higher resolution because lots of games also aren't super demanding and it still feels like stagnation in the gaming industry too.

I only even wanted a new card because I was thinking about 4k144hz, and frankly that's what EVERYONE wants. That's why it's so nuts they went with the stupid displayport gimping because 4k144 is quite literally the new normative standard. It's solely about whoever can do 4k144 native better and at a better cost. That's the metric. Nobody is buying an RTX 4000 card because of raytracing or whatever. So basically, if you're still using a 1080p panel, or 1440p75 or anything below 1080p240 you have zero reason to upgrade. 970/980/1060 is more or less what every game needs today anyway.
N/AIt's not bad at all, what's not to like? Only the very fact that it's not 256-bit like it should be.

But take into account that the 3080 Ti, the 3090, and all the halo products were discounted to under 1K from their original 1199 to 1999. The 3080 12 GB went for as low as 750.

Clearly 48 MB of L2$ on a 192-bit bus is as efficient as 384-bit with 6 MB. And the price was slashed from the $999 3090 Ti 24 GB to a more reasonable 799, losing half the bus and the memory, with the same 40 TFLOPS.

The one to get is the 4070 5888-CUDA version, and if delivering 3070 Ti + 10% is as good as it gets, I'll take that. 3x faster than my 980 Ti.
Jesus do you guys really not get this
>as good as it gets
Maybe that's why, you have zero standards. How someone has a 980ti and thinks like this is beyond me.

Yes, of course it's cheaper and discounted, it's 2 freaking years old! Like, why tf would you expect it to still cost that much? And it's not "as low as": the f'ing things cost $700 at retail brand new at the end of 2020. It's 2023 now. If your 980 Ti cost "only" $900 today, would you call that a great deal?

>to a more reasonable
No it's not, because that TDP isn't being slashed, and my PSU isn't magically going to do a kilowatt. So it's altogether a much worse deal buying power-hog halo products. I'd consider a 1080 Ti or 980 Ti used, but that's more because they're really efficient and just noice cards with a special place in our hearts. Like, imagine GTX 590 performance at over 300 W. That's why there's a sort of balance to older and used cards, because they reach a certain limit of inefficiency, and the problem with Lovelace and Ampere is that they're literally the most inefficient graphics cards since Fermi, in fact worse than Fermi. So anyone buying these cards is also going to have a much harder time offloading them on the used market. The kind of person with a 600 W PSU gaming on an R5 1600 and looking for used parts isn't looking at a used 4090 or 3090 Ti.
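The PSU argument above boils down to simple headroom math. A sketch with ballpark figures: the GPU board-power numbers (450 W 3090 Ti, 285 W 4070 Ti, 300 W 6800 XT) are the official specs, while the CPU draw, platform draw, and 80% sustained-load rule of thumb are assumptions:

```python
# Whole-system draw vs. PSU capacity (CPU/platform wattages are rough assumptions).
def system_draw(gpu_tdp, cpu_tdp=105, rest=75):
    """Estimated peak system draw in watts: GPU + CPU + rest of the platform."""
    return gpu_tdp + cpu_tdp + rest

psu = 600          # the existing 600 W PSU from the post's example
headroom = 0.8     # rule of thumb: keep sustained load under ~80% of rating

for name, tdp in [("RTX 3090 Ti", 450), ("RTX 4070 Ti", 285), ("RX 6800 XT", 300)]:
    draw = system_draw(tdp)
    verdict = "fits" if draw <= psu * headroom else "needs a bigger PSU"
    print(f"{name}: ~{draw} W total -> {verdict}")
```

Under these assumptions the used 3090 Ti blows past a 600 W unit while the 285-300 W cards squeak by, which is why the resale pool for power-hog halo cards is smaller.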
Posted on Reply
#121
error1984
TaishoEverything here gets a 5/5, no matter that it underdelivered on performance, efficiency, affordability, and cooling - all at the same time. Look at the RX 7900 XT review. I would remove the star rating entirely (for past reviews too) because it makes TPU look like sponsored influencers.
I browse the TPU forum only. Take a close look at the reviews and news section (every leading manufacturer, like every tech company, gets treated as a god-saviour). TPU is the most anti-consumer website out there: just buy it, it's the best, and shut up. Also many posts on the forum are fake and promote buying overpriced stuff. Once I saw how one guy wanted to cancel a 4080 order because of backlash from other users... and guess what? This W1zzard advised him to buy that scam product anyway. Don't trust anything you read here, especially the reviews!
Posted on Reply
#122
TheinsanegamerN
zer0day777The funniest thing, the only reason to even buy a card today is your old Maxwell/Pascal broke, or because you're getting a 4k or 1440p monitor.
Seriously, that's it. And they can shill this RT b.s. all they want but AMD has RT too so nuVidia's not special, it's just another HairWorks far as I (or a lot of other gamers) are concerned. Games like Battlefleet Gothic Armada II and The Ascent look very pretty regardless, because Unreal is a pretty engine. nVidia would try calling those particle effects "PTXtm ParticleCorestm" or something and try comparing how well their card does that compared to AMD.
Meanwhile in all reality, they're more or less equivalent brands in performance, and the real reason to upgrade is going higher resolution because lots of games also aren't super demanding and it still feels like stagnation in the gaming industry too.

I only even wanted a new card because I was thinking about 4k144hz, and frankly that's what EVERYONE wants. That's why it's so nuts they went with the stupid displayport gimping because 4k144 is quite literally the new normative standard. It's solely about whoever can do 4k144 native better and at a better cost. That's the metric. Nobody is buying an RTX 4000 card because of raytracing or whatever. So basically, if you're still using a 1080p panel, or 1440p75 or anything below 1080p240 you have zero reason to upgrade. 970/980/1060 is more or less what every game needs today anyway.


Jesus do you guys really not get this
>as good as it gets
Maybe that's why, you have zero standards. How someone has a 980ti and thinks like this is beyond me.

Yes, of course it's cheaper and discounted, it's 2 freaking years old! Like, why tf would you expect it to still cost that much? And it's not "as low as": the f'ing things cost $700 at retail brand new at the end of 2020. It's 2023 now. If your 980 Ti cost "only" $900 today, would you call that a great deal?

>to a more reasonable
No it's not, because that TDP isn't being slashed, and my PSU isn't magically going to do a kilowatt. So it's altogether a much worse deal buying power-hog halo products. I'd consider a 1080 Ti or 980 Ti used, but that's more because they're really efficient and just noice cards with a special place in our hearts. Like, imagine GTX 590 performance at over 300 W. That's why there's a sort of balance to older and used cards, because they reach a certain limit of inefficiency, and the problem with Lovelace and Ampere is that they're literally the most inefficient graphics cards since Fermi, in fact worse than Fermi. So anyone buying these cards is also going to have a much harder time offloading them on the used market. The kind of person with a 600 W PSU gaming on an R5 1600 and looking for used parts isn't looking at a used 4090 or 3090 Ti.
Bruh chill out LMFAO. It's just a GPU, why do you have to be mad?

If power usage is that big a deal to you, you shouldn't be buying $500+ GPUs in the first place.

And LOL at thinking the ONLY reasons you'd replace a Maxwell card are that it broke or you were going 4K. Bud, Maxwell cards were great GPUs, but they're 8 years old now; games have moved on.
Posted on Reply
#123
gasolin
eidairaman1Still overpriced
Why? It's as fast or faster and much cheaper than an RTX 3090 Ti.
Posted on Reply
#124
dir_d
gasolinWhy? It's faster and much cheaper than an RTX 3090 Ti.
You can't compare the price of a halo product to a **70 Ti. Halo products have their own pricing, which is usually unreasonable because it's the best. Just because it comes in as fast or slower for cheaper does not mean it's good value.
Posted on Reply
#125
xorbe
I didn't see any $799 launches reviewed here today.
Posted on Reply