Thursday, September 14th 2023
NVIDIA GeForce RTX 4070 Could See Price Cuts to $549
NVIDIA's GeForce RTX 4070 12 GB graphics card finds itself embattled against the recently launched AMD Radeon RX 7800 XT, and board partners from NVIDIA's ecosystem plan to do something about it, reports Moore's Law is Dead. A GIGABYTE custom-design RTX 4070 Gaming OC graphics card saw a $549 listing on the web, deviating from the $599 MSRP for the SKU, which hints at what new pricing for the RTX 4070 could generally look like. At $549, the RTX 4070 would still sell at a $50 premium over the RX 7800 XT, probably banking on better energy efficiency and features such as DLSS 3. NVIDIA partners could take turns pricing their baseline custom-design RTX 4070 products below the MSRP on popular online retail platforms; we don't predict an official price cut that applies across all brands and forces them all to lower their prices to $549. We could also see NVIDIA partners review pricing for the RTX 4060 Ti, which faces stiff competition from the RX 7700 XT.
Source:
Moore's Law is Dead (YouTube)
130 Comments on NVIDIA GeForce RTX 4070 Could See Price Cuts to $549
Ray tracing is cool, but not cool enough to justify the price you have to pay for those NVIDIA cards.
And what are the other things you are talking about?
But for other gamers with big-ass PSUs (as they usually have) and a full-tower case as big as a skyscraper, I don't think the GPU's power usage even crosses their minds. They just want the best GPU for the money.
When it comes to ray tracing, I have an NVIDIA GPU with ray tracing myself. Ray tracing today is mostly worth it if you have a 4080 (minimum) or a 4090; otherwise, performance with ray tracing will be 'meh'.
And why would a 'gamer' care about power usage?
Do you think a race driver cares about his race car's fuel usage, just to take an example?
If you're serious about gaming, you buy whatever gives you the best performance, no matter how much power it uses.
Gaming performance is a one-dimensional number only if you're 14 years old and this is your first PC.
I can see, however, that the other components in the computer can take a bit of a hit if they get blasted with heat from the GPU. But with good airflow, that shouldn't be much of a problem :).
A card like the 7800 XT can't even utilize 16 GB because the GPU is too weak. Future-proofing makes no sense when the GPU is what stops you from maxing out games.
More VRAM does not help when the GPU is the limiting factor. That's why the 3070 8 GB still beats the 6700 XT 12 GB with ease in overall 4K gaming in 2023 -> www.techpowerup.com/review/amd-radeon-rx-7800-xt/32.html
And they launched at the same MSRP in 2020/2021, or $20 apart, but the 3070 launched about six months before the 6700 XT. AMD is always too late to the party, really.
A 4070 with DLSS enabled would destroy a 7800 XT with ease. They perform within 5% of each other without it, the 4070 uses 50 watts less, and it was released almost half a year earlier. Nothing about the 7800 XT is impressive if you ask me; we had this kind of performance several years ago. It's a fine mid-range card, but AMD released it too late, since they're already talking about RDNA4 in 2024, which will replace the 7800 XT. RDNA4 will have no high-end products, so NVIDIA doesn't even have to respond before 2025.
The 4070 should be priced slightly higher than the 7800 XT because of its superior features and RT performance. Nine out of ten people who buy a GPU today don't look solely at raster performance like it's 2015. AMD is missing killer features like DLSS/DLAA/DLDSR; there's no way they can ask a higher price when all they have is FSR, VSR, mediocre RT performance, and weaker features in general.
AMD is always the cheaper option, with subpar features and less optimized drivers overall. Nothing new here. Resale value is also much lower on AMD GPUs (lower demand and AMD's price drops over time make them almost unsellable after a few years).
You can keep rambling all you want; what you're saying is exactly what people said about the 6700 XT because it had 12 GB, and it aged like milk anyway because the GPU isn't good enough to utilize it. The 3060 Ti 8 GB aged just as well, was $79 less at release, and has DLSS to rescue performance, while 6700 XT users have identical raster performance but only FSR to help them once the card becomes too slow, and it's already considered slow in many new games.
I can't stop laughing when people think VRAM will make their GPU future-proof... Not a single time has this been the case. Even the 3090 24 GB feels kind of slow for 4K gaming today, and people thought it would be able to max out games in 4K for many years because of its 24 GB of VRAM... Yet the 4070 Ti 12 GB beats it overall in 4K gaming today -> www.techpowerup.com/review/amd-radeon-rx-7800-xt/35.html
The 7900 XT is also only 5 fps faster than the 4070 Ti in 4K minimum fps across all the games tested, meaning 12 GB is plenty, and the 4070 Ti can always enable DLSS or DLAA to make up the difference or push visuals further than the 7900 XT is capable of.
AMD severely lacks a proper AA solution like DLAA, which is amazing and hands down the best AA solution out there.
Also, I didn't even mention PSU requirements, transient spikes, and such. I never said 12 GB isn't enough for current games. However, if I'm paying this much for a GPU, I expect it to have more VRAM just in case. If the 4070 were the same price as the 7800 XT, it would be an absolutely compelling option.
Not a single game uses more than 12 GB, even at 4K maxed out. And please don't get fooled by allocation like most people do: many game engines allocate 90-95% of available VRAM regardless of how much they actually need.
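For what it's worth, the numbers monitoring overlays report are allocations, not what a game actively needs. As a minimal sketch (assuming an NVIDIA card and Python with the nvidia-ml-py/pynvml bindings installed), this reads the same allocated-VRAM figure those tools show:

import pynvml  # NVIDIA's NVML bindings: pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)   # total/used/free, in bytes

# "used" is memory *allocated* across all processes; an engine that grabs
# 90-95% of VRAM up front inflates this number without actually needing it.
print(f"total:     {mem.total / 2**30:.1f} GiB")
print(f"allocated: {mem.used / 2**30:.1f} GiB")
print(f"free:      {mem.free / 2**30:.1f} GiB")

pynvml.nvmlShutdown()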
A singular case, I know, but NV's market share is mostly due to greater availability and MASSIVE marketing.
The best marketing for NVIDIA is buying an AMD card. I just did; never again. Driver issues are just through the roof.
How many times has someone posted about the next AMD product that's going to destroy NVIDIA? MLID does it daily. Do you think those posts aren't marketing? Nope.
And even if you already know that GPU X exists, you can still go to E3, Gamescom, etc. to drool over it and/or NVIDIA's/AMD's presentations, and/or try the card first-hand. This is what marketing is.
Or why do you think reviewers get free units? Because they can tell everyone what a great product it is (or at least generate some publicity) so that more people will buy it. Simple.
There's TONS of marketing in GPUs!