Tuesday, January 3rd 2023
NVIDIA GeForce RTX 4070 Ti Launched at $799 with Performance Matching RTX 3090 Ti
NVIDIA today formally launched the GeForce RTX 4070 Ti "Ada" performance-segment graphics card at a starting MSRP of USD $799. Based on the 4 nm "AD104" silicon, the RTX 4070 Ti is essentially the same product as the RTX 4080 12 GB, which NVIDIA cancelled ahead of its original mid-November launch and re-introduced at this CES under a new model name. The card maxes out the silicon it's based on, featuring 7,680 CUDA cores, 60 RT cores, 240 Tensor cores, 240 TMUs, and 80 ROPs. It gets 12 GB of GDDR6X memory across a 192-bit wide memory interface, running at 21 Gbps (GDDR6X-effective). The card has a typical power rating of 285 W, and continues to use a 12VHPWR power connector, even on custom-design products.
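As a quick sanity check, the memory speed and bus width quoted above imply the card's peak memory bandwidth; a minimal sketch (the 504 GB/s result matches NVIDIA's published figure):

```python
# Back-of-the-envelope peak memory bandwidth for the RTX 4070 Ti,
# from the figures quoted above (21 Gbps effective, 192-bit bus).
data_rate_gbps = 21    # GDDR6X effective data rate per pin, in Gbps
bus_width_bits = 192   # memory interface width

bandwidth_gbs = data_rate_gbps * bus_width_bits / 8  # bits -> bytes
print(f"Peak bandwidth: {bandwidth_gbs:.0f} GB/s")   # -> 504 GB/s
```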
NVIDIA claims that the RTX 4070 Ti should enable maxed-out AAA gaming with ray tracing at 1440p, while also being formidable at 4K Ultra HD in games that can take advantage of technologies such as DLSS 3 frame-generation, or even classic DLSS 2. The company claims that it offers performance comparable to the previous-generation flagship, the GeForce RTX 3090 Ti "Ampere," with a much higher performance/Watt rating. The RTX 4070 Ti doesn't appear to feature an NVIDIA Founders Edition model; this is a partner-driven launch, with custom-design cards dominating the scene. The RTX 4070 Ti will be available from January 5, 2023, but we'll have reviews for you before that!
150 Comments on NVIDIA GeForce RTX 4070 Ti Launched at $799 with Performance Matching RTX 3090 Ti
The current cost of GPUs comes from multiple factors. The main one is that advances in manufacturing-process density are getting slower, while cost per wafer practically doubles due to complexity. Don't expect it to get any cheaper going forward, with TSMC swimming ahead alone with no competition.
Second, the obsession with RT and high resolutions requires much larger GPUs with dedicated cores, plus all the additional cost of R&D. Both RT and high resolutions also pull massive amounts of (increasingly expensive) video memory; imagine $300+ in memory alone on recent GPUs.
Also, the thermal pads are meant for the VRAM, not for the core. At least in my case, changing the thermal pads clearly brought the memory hotspot temps down significantly.
Personally, I might just pull the trigger on one, as I had stated previously I am willing to fork out $800 for a GPU (not a dime more)... but I expected way more for this sort of money, and it's hardly an "exciting" buy for the money's worth. I feel for the guys on a budget... although most likely some have a fat enough budget that just isn't cutting it nowadays with this ridiculous post-pandemic pocket-pinch price scheming. I dunno, might even give the 40-series/RDNA3 a miss... I'm just not feeling the "GPU" upgrade impulse nowadays (well, since the 30-series anyway). Closely trailing RDNA3 doesn't seem exciting either... all PANTS if you ask me.
Looks like most gamers looking for a spectacular eye-candy gaming experience + snappy frame rendering will have to settle for less. I'm glad I kept the impulse to move to 4K at bay... that would have sucked for me! 1440p it is, for another 5 years or so it seems.
So overall we have NVIDIA's marketing team tryharding with ridiculous lies like "4070 Ti + DLSS = 3x 3090 Ti," as if that 8-month-old $2,000 GPU were already an ancient relic, all in an attempt to get unsuspecting buyers on board. It's beyond low; it's just pathetic. Not two years ago they were touting the RTX 3090 as an 8K-ready, next-generation product ready for the future, and now they're calling it "mainstream gamer's hardware," a polite dig along the lines of "what are you, poor?", which to me just implies a medium-settings 1080p experience throughout. The 40 fps or so that my RTX 3090 runs Portal RTX at seems to imply that, at least.
A 192-bit GPU with such a modest BoM, in this precise segment, should not be over $499.
A good case study is the original RTX 3090: it initially carried hundreds of dollars in memory alone*, due to very high prices earlier on and needing 24 ICs, plus a PCB and power delivery system to match.
*= (the rumor was that it had roughly $600 of memory on it back then, even accounting for economies of scale; I don't know if this is true, but given that the chips used on it still fetch $24.50 each on the low-volume market, it may very well be)
The RTX 4070 Ti, in contrast, should use a much simpler design with only six 16 Gbit G6X ICs, and AD104 is a relatively small processor with a 250-280 W footprint, which doesn't require as advanced a VRM as either the AD103 or the AD102.
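To put rough numbers on that comparison, here's a minimal memory-BoM sketch using the rumored $24.50-per-IC figure from the footnote above (an assumption, not a confirmed BoM number, and 16 Gbit parts are priced differently, so the 4070 Ti figure is purely illustrative):

```python
# Rough memory-cost sketch; the per-IC price is the rumored low-volume
# figure cited above, applied to both cards purely for illustration.
ic_price_usd = 24.50

# RTX 3090: 24x 8 Gbit (1 GB) GDDR6X ICs for 24 GB on a 384-bit bus
rtx3090_memory = 24 * ic_price_usd   # = $588.00, close to the ~$600 rumor

# RTX 4070 Ti: 6x 16 Gbit (2 GB) GDDR6X ICs for 12 GB on a 192-bit bus
rtx4070ti_memory = 6 * ic_price_usd  # = $147.00, a quarter of the IC count

print(f"RTX 3090:    ${rtx3090_memory:.2f}")
print(f"RTX 4070 Ti: ${rtx4070ti_memory:.2f}")
```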
The environment is so toxic, so monopolistic, so favorable to Nvidia, that Nvidia, controlling 90% of the market, has no real reason to drop prices. Intel is still too far behind to be a factor, and AMD still doesn't seem very excited about the GPU market; they're probably more concerned about AM5. Both Intel and AMD gain long-term if Nvidia does all the necessary work to establish higher prices in the retail GPU market.

Also, I doubt OEMs pay these prices. Nvidia advertises the 4070 Ti at $800, but what does Dell, for example, pay to get one? $800? $700? $600? $500? Those high prices are beneficial for every company out there making gaming equipment, just not for consumers. MS and Sony can price their next-gen consoles higher; so can Valve and others building gaming handhelds; and Dell, to use them as an example again, can buy a 4070 Ti from Nvidia at, let's say, $600 and price it at $700, making its prebuilt PCs look like bargains. And of course AMD and Intel price their own offerings higher. While they will all sell less stuff, they will achieve higher profit margins, and if TSMC doesn't have enough capacity for everyone, selling less stuff at higher margins is crucial. Nvidia also looks more at AI, robotics and such today than gaming; AMD is concentrating more on servers while it still has an advantage there; and as for Intel, they build GPUs out of necessity, not to make gamers happy.
Cheap GPUs are dead. We already witnessed the death of sub-$100 GPUs these last few years; for sub-$200 we only got the laptop-grade RX 6500 XT, then the beta-quality A380, and finally the abysmal insult that is the old-architecture GTX 1630. Under $300 we have only one good option, the RX 6600, which no one buys because it doesn't have an Nvidia logo on it. Instead they buy the RTX 3050. Go figure...
Nvidia dropping prices? Why?
Facts: I've had a 1440p 75 Hz panel since the start of the pandemic, and I've been gaming at 1440p on an RDNA1 card because first-gen RTX cost too much money in comparison ($500 floor minimum on the 2070 Super, aka the real 2070 Ti; $410 for a much nicer-looking 5700 XT with the same performance, up to $170 cheaper than what I was looking at back then for a custom AIB 2070 Super). So, to me, I'm going to need 4K ray tracing at affordable prices by this point, considering we are now up to RTX gen 3. nuVidia no longer has any excuse on this. It's even worse now that RDNA3 has shown up and, underwhelming though it is, still isn't that far behind in RT ultimately.
So basically, if it can't even do that, I've got ZERO reason to ever upgrade, because by that logic I don't have to replace my monitor; the card is literally unable to do ray tracing at 1440p native, even to this day. Really, all the DLSS bull$%^& and FSR nonsense is just trying to muddy the water (literally lel) when the clear issue is PRICE TO PERFORMANCE. That's it. I don't care what s&%$ this company throws at the fan to trick and confuse people at this point. Jensen honestly bet (and lost his gamble, badly) on mining farms still having demand for scalped-af mining GPUs; that demand has vanished. And so now they're literally stuck with gamers, who they've been screwing for generations, while trying to justify this HORSE %$#&. IT WAS NEVER "INFLATION".
IT WAS LITERALLY ALWAYS THEIR PRICE/PROFIT MODEL. It was so bad nobody purchased a 2080, because why tf would you when you could get a 1080 Ti instead for cheap. And that was a thousand dollars/euros at retail, which the 2080 Ti then bumped to $1,200. Think about the MSRP of the GTX 1070 or 1070 Ti. Well, then the real 2070 Ti (aka the 2070 Super) is $500, right? That's the point where the price stopped making sense to me, so I switched to AMD, which hasn't failed me yet (and was frankly an excellent mining card; it literally paid for itself with extra profit in 2020-2021). That was BEFORE the $600 RTX 3070 Ti, btw, which WAS BEFORE THE ALLEGED "INFLATION" EVEN STARTED HAPPENING. It's a SCAM.
So, no, Jensen can go F himself, and his shareholders too. Anyone who buys this card at anywhere near these prices is a simp, a moron, and a cuck of the highest calibre. It's like they're playing some alleyway scam on total idiots, where you show bad thing 1 and it sucks less than bad thing 2, so they con themselves into thinking bad thing 1 is "a better deal." No, they're both TERRIBLE deals. Seriously, I hopped off team green when RTX 2000 pricing got unbelievable, and as disappointing as RDNA3 has been, it's nowhere near the disaster that is Lovelace. I wouldn't even consider it without them literally halving the prices.
It was 100% NVIDIA's responsibility to think first and improve the electronics design, instead of regular people becoming beta testers and searching for solutions on their own.
NVIDIA is the one that got your money, and you should receive a trouble-free product.
My advice to all: just be more careful about your choices from now on.
There is no scientific proof that the newest technology today is more expensive than the newest technology 5 or 10 years ago.
It could be fake news, speculation, and lies in order to justify the profit margins and the private jets and yachts for top management and stockholders.
I think it's that the megacorpos just love ignorance. Hell, capitalistic excess generally favours NPCs: the ignorant, the impulsive, the most childlike creature imaginable, useless at anything other than being a consumer drone, a cheap biorobot worker, and expendable cannon fodder (the rest get shipped to private schools to become the middle-management biorobots). So it's even more obvious to those of us in tech, who can clearly see the ripoff and are used to working with numbers. These scumbag, dirtbag corporations pull literally the same sort of scammer-from-Mumbai "hello, your PC is broken, we need your Social Security number to unlock it from the virus" crap on old people, kids, and NORPs who don't know better. And they try making it sound like computers are "really complicated," and they are not; you can easily teach yourself how to not get swindled. It's just that they hold back some of the info and play three-card shell games, like calling it an "i7 with 8GB RAM": yeah, a single stick, not dual rank, lowest-speed dustbin RAM, with an ancient low-tier 9700 or something.
I'm so outraged by nuVidia I'm just not buying anymore. They became nuVidia at RTX, frankly; it's embarrassing. They had their problems and have always been a scummy, scammy company, from the alleged GTX 1060 "3GB model" to the 3.5 GB of the GTX 970, but at least Pascal and Maxwell were really good. They didn't even age poorly, no matter how much nuVidia wishes they did. That's because their dumb crap usually doesn't take off; no one wants to deal with them and their proprietary nonsense, so no one uses HairWorks, for example. Can you imagine selling 1080 Tis purely on "it does HairWorks and AMD cannot"?

They try forcing the market to obsolescence more quickly, but really, RTX can't even run RT natively, and their bottom-end SKUs can't even run acceptable frames with the smudgiest DLSS on. The alleged "RTX" 3050 should be a GTX 3050. But the real problem is the lots and lots of morons too braindead to understand why a 3050 is a horrific deal at $400; it's literally slapped to s&%$ and thrown in a dumpster by the 5700 XT, even the 5600 XT iirc. I think the bigger deal is the failure across every conceivable metric: pricing, performance, power consumption, hell, even the memory bus and anything else you can think of. There's no reason to buy a Lovelace GPU at all. You'd have to be a moron, or so completely misinformed you already spent your money like a simp and found this thread in like 2025. They SUCK. I think the last time nVidia released a generation this terrible was the GTX 280? Wasn't that when they charged $650 for a card that AMD's top-end HD something beat at $400, so they had to lower prices?
That's the problem: they not only refused to lower prices, they jacked them. I saw those TDPs long ago and thought, "alright, AMD basically needs to not screw up, or nVidia needs to lower prices, because I'm not paying that for something so inefficient I'd need a new PSU." Lo and behold, they not only didn't lower prices, they jacked them outrageously, to literal scalper levels.
Stop giving nuVidia your money. Why are any of you guys even rationalizing this? It's like watching heroin addicts and alcoholics trying to justify why dying in a gutter is a great life decision for them. Also, the 3090 wasn't much better than a 3080, facts. Each time with nVidia lately it felt like paying $500 for a 2.5% uplift. That's literally within the margin of error, where you can get that kind of performance by just cleaning the dust out of your GPU and repasting, or getting a different AIB model or something.
You know the funniest thing to me? They've also released such a terrible last few generations that not only have people been having all kinds of performance problems due to dumb s--- like driver bloat, but even just going on Steam it's constantly "how come my 3080 gets such low frames," "hello, why is my 3070 getting this problem," and you go through the threads and quickly realize people are like "I have an RX 6800 and it just werks for me." lmao
"I will pay the merchant $1299, but not a dime more!"
I wouldn't be willing to pay $600 for a 4070 Super, period. I literally just skipped the RTX 2070 Super based on it being $500 minimum, EVGA models $570+, while Radeon was releasing comparably performing $400 cards. "But it doesn't have raytracing!" So tell me, was your DLSS 1.0 RT experience on that card really worth the extra $200? And it mined worse too, so it cost less on eBay even during the scalping.
That's what you're asking me to do: make my next upgrade a card in a segment where $500 was already getting a bit steep, and at $600 I'm starting to expect xx80 performance. Considering the GTX 980 was $550, not even all that long ago, and was a MUCH better-made GPU, I don't find it terribly unreasonable to say "I'm not spending a dime more than $600 on a custom 70 Ti card, and that's firm." And then you're also asking me to do it for a generation that, let's face it, just plain sucks.
We all know it. It's literally a MEME. Like, as the average gamer, you have, what, $500 for a GPU? Well, let's think of it this way: you've got a new monitor in the plans too, so let's double that budget. You aren't going to magically have another $500 appear from thin air in your account, but let's imagine you do. $1,000 to play games at better visual quality. So let's say $350 for the monitor. That's $650 for the GPU; now, how much is that PSU upgrade going to cost? Suddenly you've dropped from being able to afford 3080 quality to 3060 Ti quality at best. Meanwhile, for the same money you can get 6800 XT quality, because you don't have to buy a bigger PSU.
That's why Lovelace is such a bad deal.
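Spelled out, the arithmetic above looks like this (a minimal sketch; the $150 PSU figure is my own hypothetical placeholder, since the post doesn't name a price):

```python
# The budget arithmetic from the post above, with all figures as stated
# there except psu_upgrade, which is a hypothetical placeholder.
total_budget = 1000   # GPU budget doubled to cover a new monitor, USD
monitor = 350         # monitor price assumed in the post

gpu_money = total_budget - monitor   # 650 -> roughly 3080-class money
psu_upgrade = 150     # hypothetical cost of a beefier PSU for a power-hungry card
gpu_money_after_psu = gpu_money - psu_upgrade  # 500 -> 3060 Ti-class at best

print(gpu_money, gpu_money_after_psu)  # 650 500
```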
I mean, really, it's just s#@$ from every conceivable angle. The sole thing nuVidia accomplished was padding shareholder pockets and getting simps to cheer on their own robbery: "yeah, but it benefits shareholders, so it's so smart!" Oh, and allegedly better performance, but when you slam in that much additional power and it costs more, you really didn't advance at all; you just expanded the range of halo products into even more luxury-priced halo products. You could LN2 a 3080 and accomplish more or less the same thing.
But again, this is part of a broader problem. If a guy is a heroin addict, stealing from people, stealing from friends, with a big criminal record, are we really going to spend that much time arguing about how he ripped off grandma? It isn't even that he stole grandma's checkbook at this point; it's that Jensen has simply become so freaking brazen about it, he's all but throwing a towel on the floor of the jail cell: "pick up my towel for me, bish." Jensen is openly daring you all to bend over, because he feels he's pushed up on you so much he can do literally anything he wants and you'll love him for it.

Man... lol, that's the problem, and I cannot emphasize enough that these kinds of things are why I hopped off nVidia's dik years ago and got an RDNA1 card instead. People gave the 5700XT so much shit (it deserved it on release, but it became a dead-horse meme as people realized it got fixed and was one of the best value cards of the pandemic), but really, that's a ~$400 card, and meanwhile crickets when you bring up the nonstop issues of a $1,200 GPU like the 2080 Ti. If I spent over a grand on a GPU, it had damn well better work right out of the box. Instead, somehow the 2080 Ti never got the meme-status s%$* it rightfully deserved for being 5700XT-release-tier... while costing over a thousand dollars.
But it didn't end there, did it? Ampere was the same thing; all kinds of problems plagued that series of cards, right up to their halo products bursting into flames. The 3090, that is, not the 4090, which also burst into flames apparently. POSCAPs were a big issue, the drivers were bad, and I remember they actually downclocked the 3080, so it hilariously became the anti-fine-wine card, performing worse with drivers over time, because they couldn't stop its crashing otherwise.
It happens because nobody wants to hold nVidia accountable, and you'll notice the actual businessmen who make money, not just play with toys, all dropped nVidia. Apple, EVGA, Sony, Microsoft: nobody really wants to work with that company, and that includes gamedevs. It's even funnier when you realize all that AAA gaming is being fine-tuned on AMD-only hardware; the Xbox and PS5 use Ryzen processors and RDNA2 graphics, so I'm not sure what people are thinking buying these cards and claiming some AAA gimmick.

Yeah, see, this guy also gets it. If you actually remember 1 GB graphics cards being "huge" because we finally hit the gigabyte mark on VRAM... well, per unit of performance things haven't been impressive lately. We had something like this with Fermi, but not even Fermi was anywhere near these TDPs; the 590 was like 365 W. Meanwhile we have pathetic things like GTX 1630 pricing, with companies realizing they can sell literally anything and people will buy it. The same crap happened with scalping: the mining farms set the new standard, because it was a direct economic calculation of "how much money can I make, and how long will it take to turn my investment from a net loss into a gain." That's why cards cost like $2,000; you could still turn a profit at that mark. Meanwhile, the occasional moron gamer would buy a GPU at those prices to play games on. But go "all the way back" to like 10 years ago, and a GTX 980 Ti was a beast at 240 W or whatever it was.
If you remember that, then you're also mentally comparing any card today to the $550 GTX 980, both in terms of uplift and pricing. Same goes for the GTX 980 Ti and GTX 1080 Ti, the two generations where nVidia was indisputably good, before the dark times, before RTX. Every single year since then they've jacked prices and delivered far less. They've had even worse standards and all kinds of problems everywhere: poorer physical products (no $500+ card should have a plastic backplate; to be fair, the 900 series had all-metal backplates, though that's also when they started using more plastic shrouds; there was a time when it was all metal), numerous bugs... really, nuVidia has zero room to poke fun at AMD when, bugs-wise, nuVidia has been a complete disaster. And it wouldn't even be such a big deal were it not for the fact that they're charging these ridiculously outrageous prices for a throttling inferno.

I can't even imagine paying $800 for a 70 Ti. That's insane. So, a 4070 is NOT in my plans, period. But the problem is that AMD has no reason to set a 7700 XT at a "normal" sub-$500 price if people are stupid enough to buy an nVidia card instead. It's the same problem as people rewarding the scalpers, only now nVidia is the scalper.
Would I publicly shame a man for giving his money to a scalper?
Yes I would.
I guess you mean the rating stars in Google search results? Google awards the ability to display those stars to sites it considers authoritative on the topic. The stars make the result stand out more, so more clicks. More clicks = higher placement in the search results. Unfortunately, a majority of visitors (yes, I polled people on this) think the stars are a rating for the article, not for the tested product.
Removing the stars would lower the click rates, which would push us down in the search results, and people would no longer find our reviews. Then why even write reviews?
Seriously, that's it. And they can shill this RT b.s. all they want, but AMD has RT too, so nuVidia's not special; it's just another HairWorks as far as I (or a lot of other gamers) am concerned. Games like Battlefleet Gothic: Armada II and The Ascent look very pretty regardless, because Unreal is a pretty engine. nVidia would try calling those particle effects "PTX(tm) ParticleCores(tm)" or something and compare how well their card runs them versus AMD.
Meanwhile, in reality, the brands are more or less equivalent in performance, and the real reason to upgrade is going to a higher resolution, because lots of games aren't super demanding, and it still feels like stagnation in the gaming industry too.
I only even wanted a new card because I was thinking about 4K 144 Hz, and frankly that's what EVERYONE wants. That's why it's so nuts they went with the stupid DisplayPort gimping, because 4K 144 is quite literally the new normative standard. It's solely about who can do 4K 144 native better and at a better price. That's the metric. Nobody is buying an RTX 4000 card because of raytracing or whatever. So basically, if you're still using a 1080p panel, or 1440p 75 Hz, or anything below 1080p 240 Hz, you have zero reason to upgrade. A 970/980/1060 is more or less what every game needs today anyway. Jesus, do you guys really not get this?
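For reference, a rough back-of-the-envelope on why the DisplayPort complaint matters (the RTX 40 series shipped with DP 1.4a rather than DP 2.x; this sketch ignores blanking overhead, so the real requirement is somewhat higher):

```python
# Uncompressed pixel-data rate for 4K 144 Hz at 8-bit RGB, vs. the
# DP 1.4 (HBR3 x4) payload ceiling. Blanking overhead is ignored.
h, v, refresh_hz, bits_per_pixel = 3840, 2160, 144, 24

required_gbps = h * v * refresh_hz * bits_per_pixel / 1e9  # ~28.7 Gbps
dp14_payload_gbps = 25.92   # 32.4 Gbps raw minus 8b/10b coding overhead

# True: 4K144 doesn't fit uncompressed, so DP 1.4a needs DSC or
# chroma subsampling to get there.
print(required_gbps, required_gbps > dp14_payload_gbps)
```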
>as good as it gets
Maybe that's why: you have zero standards. How someone with a 980 Ti thinks like this is beyond me.
Yes, of course it's cheaper and discounted; it's two freaking years old! Why tf would you expect it to still cost that much? And it's not "as low as" anything; the f'ing things cost $700 at retail, brand new, at the end of 2020. It's 2023 now. If your 980 Ti cost "only" $900 today, would you call that a great deal?
>to a more reasonable
No it's not, because that TDP isn't being slashed, and my PSU isn't magically going to do a kilowatt. So it's altogether a much worse deal buying power-hog halo products. I'd consider a used 1080 Ti or 980 Ti, but that's more because they're really efficient and just noice cards with a special place in our hearts. Like, imagine GTX 590 performance at over 300 W. That's why there's a sort of balance to older and used cards: it reaches a certain limit in inefficiency, and the problem with Lovelace and Ampere is that they're literally the most inefficient graphics cards since Fermi; in fact, they're worse than Fermi. So anyone buying these cards is also going to have a much harder time offloading them on the used market. The kind of person with a 600 W PSU gaming on an R5 1600 and looking for used parts isn't going to be looking at a used 4090 or 3090 Ti.
If power usage is that big a deal to you, you shouldn't be buying $500+ GPUs in the first place.
And LOL at thinking the ONLY reason you'd replace a Maxwell card is if it broke or you were going 4K. Bud, Maxwell GPUs were great, but they're 8 years old now; games have moved on.