Friday, February 9th 2024

NVIDIA GeForce RTX 4070 Ti Drops Down to $699, Matches Radeon RX 7900 XT Price

The NVIDIA GeForce RTX 4070 Ti can now be found for as low as $699, which means it is now selling at the same price as the AMD Radeon RX 7900 XT graphics card. The GeForce RTX 4070 Ti lags behind the Radeon RX 7900 XT, which is around 6 to 11 percent faster depending on the game and resolution, and it packs less VRAM (12 GB vs. 20 GB). The faster GeForce RTX 4070 Ti SUPER, meanwhile, sells for around $100 more.

The GeForce RTX 4070 Ti card in question comes from MSI: the Ventus 2X OC model, listed over at Newegg.com for $749.99 with a $50-off promotion code that brings it down to $699.99. Bear in mind that this is a dual-fan version from MSI, and we are quite sure we'll see similar promotions from other NVIDIA AIC partners.
Sources: Newegg.com, via Videocardz.com

122 Comments on NVIDIA GeForce RTX 4070 Ti Drops Down to $699, Matches Radeon RX 7900 XT Price

#51
bonehead123
Still too damned expensive....regardless of mfgr, model, version etc....

What we need ATM are well-rounded, well-spec'd cards that can do 99% of what we need them to, AND are affordable for the average everyday user, including gamers, CAD folk, the Blender crowd etc....
Posted on Reply
#52
Craptacular
OnasiI am sure. I guess that’s why NV put in time and RnD into securing this semi-professional and professional market. Because it’s irrelevant. AMD would not really like if production houses, 3D artists and designers and CAD users bought their GPUs. Guess CUDA is a fluke.
Come on now. AMD cards suck at non-gaming workloads. They must do better if they want to compete. The end. Arguing the obvious is silly.
It is almost like AMD has a separate category of cards for those use cases, such as the Radeon Pro.....
Posted on Reply
#53
Vya Domus
DaworaIts good that we have at leas one who can releases good Gpus.
It's good that at least one of us can write in proper English.

You're right bro they at leas releases top dog gpus.
Posted on Reply
#54
Vayra86
Vya DomusWhat are you even talking about, all I pointed out is that regular consumers and professionals are different segments.

That's why Quadro and Firepro/Radeon Pro exist, they are clearly distinct markets in the eyes of these companies, mister "oH yOU SO wISe In tHe WAYS OF busInesS".
Dude, it's been a thing for decades, and still is, that people buy non-Quadro cards to do similar things. There used to be hard segmentation, but as with CPUs, the HEDT space bled into MSDT.

A lot of pros and semi-pros, but also amateur creators, just want CUDA, and NVIDIA is catering to them. It also knows creative demand can be satisfied on any GPU.
Posted on Reply
#55
mechtech
Meanwhile in Canada, w/o 13% tax
But yeah, about the same price as a 7900 XT
Posted on Reply
#56
kapone32
mechtechMeanwhile in Canada, w/o 13% tax
But yeah, about the same price as a 7900 XT
Actually, the 7900 XT's lowest price on Newegg is $999, and there is also a variant for $1,049. Give me the extra 8 GB of VRAM for $100+ less, versus 12 GB for over $1,000. That is probably why the 6700 XT is the best-selling card on Newegg.ca.
Posted on Reply
#57
Vya Domus
Vayra86Dude, it's been a thing for decades, and still is, that people buy non-Quadro cards to do similar things.
Because they share the same architecture. I already explained it: it's inevitable that people try to cut costs by going with the cheaper consumer variant, but these companies nonetheless clearly see these customers as different.
kapone32Using that as a basis would have to include that info. Of course Nvidia did not tell you that distinction.
Well, remember when they lied to investors about their "record gaming sales numbers" during the crypto mining craze? Of course they don't want people to know exactly who is buying what.
Posted on Reply
#58
Vayra86
Vya DomusBecause they share the same architecture. I already explained it: it's inevitable that people try to cut costs by going with the cheaper consumer variant, but these companies nonetheless clearly see these customers as different.
Exactly. NVIDIA pushes that demand button constantly. It's part of the reason they bring out double-VRAM versions of GPUs.
Posted on Reply
#59
Assimilator
Vya DomusI have an academic background in these things, people used OpenCL a lot in the 2010-2015 era, Nvidia refused to support anything but the basic 1.2 layer for almost a decade. There was nothing wrong with OpenCL, Nvidia killed it by owning the majority platform which supported OpenCL but only a version that was ancient thereby forcing people to use CUDA.
NVIDIA didn't kill OpenCL; OpenCL failed because it was rubbish. It was late, badly thought out, took the worst from the various compute languages/SDKs while ignoring the best, and was disliked by developers. Why would NVIDIA waste valuable development effort on trying to make something bad come good, when they already had something good in CUDA that they could make even better? From a business standpoint there was zero reason for them to support OpenCL.

If anyone killed OpenCL it was AMD, because their OpenCL implementation was buggy as all hell. Nobody was going to mess around with trying to get OpenCL to work on AMD when they could just use CUDA on NVIDIA without any pain. If AMD had actually made the effort to provide a good OpenCL experience, instead of half-assing it like they always do with their software, OpenCL might've had a chance.
Vya DomusUltimately CUDA/HIP/OpenCL/OneAPI/Metal are irrelevant, they're all the same
That's about as laughably incorrect as saying that C and C++ are the same language.
Vya Domusthere is not much value in the software itself
Of course there's no inherent value in the SDKs. The value is in what they enable users of those SDKs to build, and in how quickly and easily they can do so. And CUDA wins hands-down there.
Vya DomusThere is a reason why GPU makers refuse to converge on one unified ISA/shader/compute language, because if that were to happen all their leverage would vanish, all the software would run the same on anything, they'd all be able to optimize everything from day one.
I completely agree that a 100% open compute standard would be an extremely good thing for consumers. But it has to be (a) good (b) well-implemented, and OpenCL was never either of those.
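
To put the ergonomics gap in concrete terms, here is a minimal sketch (illustrative only, with made-up names like vecAdd) of a complete CUDA program doing vector addition; it compiles and runs as written with nvcc. The OpenCL equivalent needs the same kernel plus substantial host boilerplate: platform and device enumeration, context and queue creation, runtime compilation of the kernel source, and per-argument setup calls before the launch.

#include <cstdio>
#include <cuda_runtime.h>

// The whole device side: one kernel, one line of real work.
__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);
    float *a, *b, *c;
    // Managed memory: visible to both CPU and GPU, no explicit copies needed.
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    vecAdd<<<(n + 255) / 256, 256>>>(a, b, c, n);  // one line to launch
    cudaDeviceSynchronize();                       // wait before reading results

    printf("c[0] = %f\n", c[0]);  // expect 3.000000
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}

The host setup alone in an equivalent OpenCL program typically runs several times this length, which is the developer-experience point being argued here.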
Posted on Reply
#60
Vya Domus
AssimilatorNVIDIA didn't kill OpenCL, OpenCL failed because it was rubbish.
Not even going to try to argue. I have worked with OpenCL/CUDA/HIP myself; I know for a fact what I am talking about. I need to get out of the habit of arguing with forum dwellers who are clearly out of their depth on this subject.
Vayra86It's part of the reason they bring out double-VRAM versions of GPUs.
On low-end parts. You're not gonna see them doubling the VRAM on a 4090 unless they slap a Quadro sticker on it and charge some $2,000 more.
Posted on Reply
#61
rv8000
theoutoDo remember this poll, lads, it helps to know what the vast majority intend to use their GPU for.
This isn't a majority though, and it's taken out of the context of the grand scheme. The likely majority of consumers purchasing a GAMING-oriented GPU are using their GPUs for gaming, not browsing a hardware enthusiast website to fill out polls on niche topics.
Posted on Reply
#62
Onasi
rv8000This isn't a majority though, and it's taken out of the context of the grand scheme. The likely majority of consumers purchasing a GAMING-oriented GPU are using their GPUs for gaming, not browsing a hardware enthusiast website to fill out polls on niche topics.
The majority of consumers are either buying consoles for gaming or, as we have known for years from Steam HS, buy mid range GPUs… from NVidia. And that mindshare is what AMD has to overcome to succeed. And part of said mindshare is, yes, formed by reputation of NV in all kinds of circles, professional and creative ones very much included. It all adds up.

Is any of this sinking in yet? I am almost sorry I started this whole discussion. I merely pointed out that AMD is behind in non-gaming workloads, something that I personally have interest in. This somehow seems to have summoned another pointless forum war with people insulting each other and making vague authority claims to prove… something. *sigh*
Posted on Reply
#63
kapone32
OnasiThe majority of consumers are either buying consoles for gaming or, as we have known for years from Steam HS, buy mid range GPUs… from NVidia. And that mindshare is what AMD has to overcome to succeed. And part of said mindshare is, yes, formed by reputation of NV in all kinds of circles, professional and creative ones very much included. It all adds up.

Is any of this sinking in yet? I am almost sorry I started this whole discussion. I merely pointed out that AMD is behind in non-gaming workloads, something that I personally have interest in. This somehow seems to have summoned another pointless forum war with people insulting each other and making vague authority claims to prove… something. *sigh*
I have a question about the Steam hardware chart. In the GPU section it shows that the 3060 laptop GPU increased in January. The only issue with that is that the 3060 laptop GPU has not been available since 2022. How could that be?
Posted on Reply
#64
Onasi
kapone32I have a question about the Steam hardware chart. In the GPU section it shows that the 3060 laptop GPU increased in January. The only issue with that is that the 3060 laptop GPU has not been available since 2022. How could that be?
If I had to take a wild guess - maybe those weird Chinese cards with mobile GPUs on AIB boards? The vendors could have had surplus chips and are recycling them this way for, say, internet cafes. That would certainly explain the increase.
Just a theory though; hard to know. But I am fairly sure that the Chinese market is responsible here.
Posted on Reply
#65
FeelinFroggy
ExcuseMeWtfPrice war is the only war I approve of.
Unless it's with a prostitute.
Posted on Reply
#66
rv8000
OnasiThe majority of consumers are either buying consoles for gaming or, as we have known for years from Steam HS, buy mid range GPUs… from NVidia. And that mindshare is what AMD has to overcome to succeed. And part of said mindshare is, yes, formed by reputation of NV in all kinds of circles, professional and creative ones very much included. It all adds up.

Is any of this sinking in yet? I am almost sorry I started this whole discussion. I merely pointed out that AMD is behind in non-gaming workloads, something that I personally have interest in. This somehow seems to have summoned another pointless forum war with people insulting each other and making vague authority claims to prove… something. *sigh*
There's nothing to let "sink in". The overwhelming majority of people playing League, Valorant, BF, COD etc... aren't buying a 4060, 4070, 7800 XT and so on to run Blender.

The poll is massively skewed towards TPU's userbase, which makes any conclusion about the actual gaming and consumer GPU market, in regards to usage, objectively inaccurate.

I hate to make a car reference, but that's like surveying a car forum dedicated to Supras, asking what car they drive, and then claiming everyone in the world drives a Supra.
Posted on Reply
#67
freeagent
I shouldn't really talk... I was just thinking of my last comment in this thread. I bought my 4070 Ti last summer during one of the last great Amazon sales we had, and it is still hundreds less than what they go for now. Good deal. Prices everywhere still suck.
Posted on Reply
#68
Chrispy_
freeagentI shouldn't really talk... I was just thinking of my last comment in this thread. I bought my 4070 Ti last summer during one of the last great Amazon sales we had, and it is still hundreds less than what they go for now. Good deal. Prices everywhere still suck.
The 4070 Ti is what the 4070 should have been at launch for $599, and the 4080 at $1,200 has always been a massive rip-off.

I'm not even sure I'd want one at $599 these days; it was a decent performer at launch, but 2023 was the year NVIDIA's VRAM joke came to fruition. 1440p high-refresh is viable on 12 GB with 2023 titles, but I suspect we're going to see 12 GB cards become a problem at 1440p before the year is out.

For 4K, a not-unreasonable expectation, 12 GB is already too small - especially if you want to turn on RT, and you do - otherwise you'd have bought a 7900 XT long ago.
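
If you want to sanity-check VRAM claims like this against your own card, here is a minimal sketch: cudaMemGetInfo is a real CUDA runtime call that reports free and total device memory, while the 4K render-target arithmetic uses assumed illustrative numbers (five RGBA16F targets plus a 32-bit depth buffer), not figures from any particular game.

#include <cstdio>
#include <cuda_runtime.h>

int main() {
    size_t freeB = 0, totalB = 0;
    cudaMemGetInfo(&freeB, &totalB);  // actual free/total device memory
    printf("VRAM: %.1f GiB free of %.1f GiB total\n",
           freeB / 1073741824.0, totalB / 1073741824.0);

    // Back-of-envelope 4K estimate with an assumed G-buffer layout:
    // five RGBA16F render targets (8 bytes/pixel each) plus a 32-bit depth buffer.
    const double pixels = 3840.0 * 2160.0;
    const double renderTargets = pixels * (5 * 8 + 4);
    printf("~%.2f GiB for render targets alone\n", renderTargets / 1073741824.0);
    // Textures, geometry, and RT acceleration structures supply the multi-GiB rest.
    return 0;
}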
Posted on Reply
#69
3valatzy
Chrispy_The 4070 Ti is what the 4070 should have been at launch for $599, and the 4080 at $1,200 has always been a massive rip-off.
Not only are the product tiers wrong, but also the asking prices, hence the over-70% profit margins.

Should be like this instead:

RTX 4090 24 GB - $1,200
RTX 4080 16 GB - $699
RTX 4070 Ti 16 GB - $549
RTX 4070 12 GB - $399
RTX 4060 Ti 12 GB - $299
RTX 4060 10 GB - $209
RTX 4050 8 GB - $129
Posted on Reply
#70
kapone32
OnasiIf I had to take a wild guess - maybe those weird Chinese cards with mobile GPUs on AIB boards? The vendors could have had surplus chips and are recycling them this way for, say, internet cafes. That would certainly explain the increase.
Just a theory though; hard to know. But I am fairly sure that the Chinese market is responsible here.
Those were never the 3060 6 GB. I am sure, though, that the news was about the 4080s and 4090s.
OnasiThe majority of consumers are either buying consoles for gaming or, as we have known for years from Steam HS, buy mid range GPUs… from NVidia. And that mindshare is what AMD has to overcome to succeed. And part of said mindshare is, yes, formed by reputation of NV in all kinds of circles, professional and creative ones very much included. It all adds up.

Is any of this sinking in yet? I am almost sorry I started this whole discussion. I merely pointed out that AMD is behind in non-gaming workloads, something that I personally have interest in. This somehow seems to have summoned another pointless forum war with people insulting each other and making vague authority claims to prove… something. *sigh*
The Steam Deck is the cheapest way to get into PC gaming, and I am sure AMD is making serious money selling those APUs. I am not even sure how many handhelds have been brought to market using AMD APUs. I know the Claw is using Intel, but the price they ask for it is ridiculous: over $1,000 Canadian for the base version. A Steam Deck is less than half that, and if you are a PC gamer you already have a Steam library.
Posted on Reply
#71
mb194dc
What do we think demand is like, given the price cuts from each side every week?
Posted on Reply
#72
freeagent
Chrispy_For 4K, a not-unreasonable expectation, 12 GB is already too small - especially if you want to turn on RT
That’s all I play at, usually at maxed settings, RT on. 4K/60 @ 55”
Posted on Reply
#73
kapone32
freeagentThat’s all I play at, usually at maxed settings, RT on. 4K/60 @ 55”
My 7900 XT gives me over 100 FPS in every game at 4K. I run at 144 Hz on a 43".
Posted on Reply
#74
freeagent
kapone32My 7900 XT gives me over 100 FPS in every game at 4K. I run at 144 Hz on a 43".
My TV is 60 Hz, so I aim low :)

Still, everyone talks shit about the card and it's actually not too bad.
Posted on Reply
#75
3valatzy
freeagentStill, everyone talks shit about the card and it's actually not too bad.
I haven't seen anyone talking about the RX 7900 XT.
Actually, of the RX 6000 and RX 7000 series, it is the only 7000-series variant worth buying today, along with (maybe) the RX 6650 XT, RX 6700 XT, and RX 6800 (XT).

Of course, it could have been better had it been a monolithic GPU: more performance than the chiplet approach delivers, plus the missing 10% of performance up to the original target for Navi 31.
I wonder why AMD doesn't release an improved or fixed card.
Posted on Reply