Thursday, September 14th 2023

NVIDIA GeForce RTX 4070 Could See Price Cuts to $549

NVIDIA's GeForce RTX 4070 12 GB graphics card finds itself embattled against the recently launched AMD Radeon RX 7800 XT, and board partners from NVIDIA's ecosystem plan to do something about it, reports Moore's Law is Dead. A GIGABYTE custom-design RTX 4070 Gaming OC graphics card saw a $549 listing on the web, deviating from the $599 MSRP for the SKU, which hints at what new pricing for the RTX 4070 could generally look like. At $549, the RTX 4070 would still sell at a $50 premium over the RX 7800 XT, probably banking on better energy efficiency and features such as DLSS 3. NVIDIA partners could take turns pricing their baseline custom-design RTX 4070 products below MSRP on popular online retail platforms; we don't predict an official price cut that applies across all brands and forces them all down to $549. We could also see NVIDIA partners review pricing for the RTX 4060 Ti, which faces stiff competition from the RX 7700 XT.
Source: Moore's Law is Dead (YouTube)

130 Comments on NVIDIA GeForce RTX 4070 Could See Price Cuts to $549

#52
pavle
CUDA, would-a, should-a... Just drop the price.
Posted on Reply
#53
Assimilator
cvaldesAnd DirectX is really a Microsoft standard that has been forced on the world due to their near-monopolistic practices and stranglehold on the PC industry (you remember that anti-trust investigation 20 years ago?).
It also happens to be a pretty good standard, though. Certainly better than OpenGL, and that quality gap is mostly what killed the latter.
Posted on Reply
#54
Vya Domus
So this would be AIBs trying to lower prices, not a price cut coming directly from NVIDIA. I can't even remember the last time they officially lowered the price of one of their products; I think the 1000 series was the last time they did that.
Posted on Reply
#55
The Norwegian Drone Pilot
lexluthermiesterYour opinion. Clearly, not everyone agrees.

And there you go. Those two things, and a few others you left out, are very good reasons to go with a 4070.
Power usage is something laptop owners care about. For a gamer with a stationary computer, it's something most of them don't care about.

Ray tracing is cool, but not cool enough to justify the price you have to pay for those NVIDIA cards.

And what are the other things you are talking about?
Posted on Reply
#56
lexluthermiester
QuattrokongenPower usage is something laptop owners care about.
Not true. Lots of desktop users care, myself included.
QuattrokongenFor a gamer with a stationary computer, it's something most of them don't care about.
That's your opinion, not everyone agrees.
QuattrokongenRay tracing is cool, but not cool enough to justify the price you have to pay for those NVIDIA cards.
Another opinion. Not everyone agrees.
QuattrokongenAnd what are the other things you are talking about?
I could mention them, but I'm not going to. If YOU don't understand, that's ok. Everyone else will be happy with their cards.
Posted on Reply
#57
The Norwegian Drone Pilot
lexluthermiesterNot true. Lots of desktop users care, myself included.

That's your opinion, not everyone agrees.

Another opinion. Not everyone agrees.

I could mention them, but I'm not going to. If YOU don't understand, that's ok. Everyone else will be happy with their cards.
The only place where I would care about power usage is if the GPU is placed inside a Mini-ITX build. I have a Mini-ITX build myself, but I can easily use GPUs that consume around 350 W in it with no problems.

But for other gamers with big-ass PSUs (as they usually have) and a full-tower case as big as a skyscraper, I don't think the GPU's power usage is even on their minds. They just want the best GPU for the money.

When it comes to ray tracing, I have an NVIDIA GPU myself that supports it. Ray tracing today is mostly worth it if you have a 4080 (minimum) or a 4090. Otherwise the performance with ray tracing will be 'meh'.
Posted on Reply
#58
lexluthermiester
QuattrokongenThe only place where I would care about power usage is if the GPU is placed inside a Mini-ITX build.
That's you. And while there are bound to be people who agree with you, many people care about power usage regardless of the type of system they're running.
QuattrokongenRay tracing today is mostly worth it if you have a 4080 (minimum) or a 4090. Otherwise the performance will be 'meh'.
Nonsense. My 2080 does RTRT just fine at very good framerates. Anyone who thinks a 4080 is the minimum a gamer needs is doing something wrong and not doing their research.
Posted on Reply
#59
The Norwegian Drone Pilot
lexluthermiesterThat's you. And while there are bound to be people who agree with you, many people care about power usage regardless of the type of system they're running.

Nonsense. My 2080 does RTRT just fine at very good framerates. Anyone who thinks a 4080 is the minimum a gamer needs is doing something wrong and not doing their research.
Yeah sure, you can probably run a 2080 with ray tracing on while keeping the rest of the graphics settings on medium or even low just to get an acceptable frame rate.

And why would a 'gamer' care about power usage?

Do you think a race driver would care about fuel usage on his race car, just to take an example?
Posted on Reply
#60
lexluthermiester
QuattrokongenAnd why would a 'gamer' care about power usage?
The fact that you have to ask that question proves the answer would mean nothing to you. You do you and quit assuming everyone else agrees with you.
Posted on Reply
#61
The Norwegian Drone Pilot
lexluthermiesterThe fact that you have to ask that question proves the answer would mean nothing to you. You do you and quit assuming everyone else agrees with you.
I'm just asking why that would be something any gamer would care about, in the same way a race car driver wouldn't care about his car's fuel consumption.

If you are serious about gaming, you buy whatever gives you the best performance, no matter how much power it uses.
Posted on Reply
#62
AusWolf
QuattrokongenI'm just asking why that would be something any gamer would care about, in the same way a race car driver wouldn't care about his car's fuel consumption.

If you are serious about gaming, you buy whatever gives you the best performance, no matter how much power it uses.
Heat, noise, cooling/airflow requirements, component longevity...

Gaming performance is a one-dimensional number only if you're 14 years old and this is your first PC.
Posted on Reply
#63
The Norwegian Drone Pilot
AusWolfHeat, noise, cooling/airflow requirements, component longevity...
Yeah, but a GPU with high power usage is usually built to consume a lot of power anyway. And noise is the last thing a gamer would care about, as they usually wear headphones while gaming.

I can see, however, that the other components in the computer can take a little hit if they get blasted with heat from the GPU. But with good airflow, that shouldn't be much of a problem :).
Posted on Reply
#64
las
AusWolfNvidia's partners have been crying about too-small profit margins for years, EVGA has even dropped out of the game, and now they suddenly let the 4070 go $50 cheaper out of the kindness of their hearts, with no interference from Nvidia whatsoever. Tsk-tsk-tsk. :rolleyes:

Anyway, a 12 GB card should not cost more than $500 MSRP. At that price, a 16 GB (7800XT) vs DLSS (4070) battle would be exciting. $550 is still too much, imo.
Stop the BS and look at reality. 12GB is plenty for 1440p gaming, and even 4K gaming.

A card like the 7800XT can't even utilize 16GB because the GPU is too weak. Future-proofing makes no sense when the GPU is what stops you from maxing out games.

More VRAM does not help when the GPU is the limiting factor. And this is why the 3070 8GB still beats the 6700XT 12GB in 4K gaming overall in 2023, with ease -> www.techpowerup.com/review/amd-radeon-rx-7800-xt/32.html

And they launched at roughly the same MSRP in 2020/2021, about 20 dollars apart, but the 3070 launched some six months before the 6700XT. AMD is always too late to the party, really.


A 4070 with DLSS enabled would destroy a 7800XT with ease. They perform within 5% of each other without it, the 4070 uses 50 watts less, and it was also released almost half a year earlier. Nothing about the 7800XT is impressive if you ask me; we had this kind of performance several years ago. It's a fine mid-range card, but AMD released it too late: they're already talking about RDNA4 in 2024, which will replace the 7800XT, and RDNA4 will have no high-end products, so Nvidia doesn't even have to respond before 2025.

The 4070 should have a slightly higher price than the 7800XT because of superior features and RT performance. Nine out of ten people who buy a GPU today don't look solely at raster performance like it's 2015. AMD is missing killer features like DLSS/DLAA/DLDSR. There's no way AMD can ask a higher price when all they have is FSR, VSR, and mediocre RT performance and features in general.

AMD is always the cheaper option, with subpar features to follow and less optimized drivers overall. Nothing new here. Resale value is also much lower on AMD GPUs (less demand and AMD's price drops over time make them almost unsellable after a few years).

You can keep rambling all you want; what you're saying is exactly what people said about the 6700XT because it had 12GB, and it aged like milk anyway because the GPU isn't good enough to utilize it. The 3060 Ti 8GB aged just as well, was 79 dollars less at release, and has DLSS to save performance, while 6700XT users have identical raster performance but only FSR to help them when the card becomes too slow, and it's already considered slow in many new games.

I can't stop laughing when people think VRAM will make their GPU future-proof... Not a single time has this been the case. Even the 3090 24GB feels kinda slow for 4K gaming today, and people thought it would be able to max out games in 4K for many years because of its 24GB of VRAM... Yet the 4070 Ti 12GB beats it overall in 4K gaming today -> www.techpowerup.com/review/amd-radeon-rx-7800-xt/35.html

The 7900XT is also only 5 fps faster than the 4070 Ti in 4K minimum fps across all the games tested. Meaning that 12GB is plenty, and the 4070 Ti can always enable DLSS or DLAA to make up for it or to improve visuals beyond what the 7900XT is capable of.

AMD severely lacks a proper AA solution like DLAA, which is the best AA solution hands down.
Posted on Reply
#65
AusWolf
Quattrokongennoise is the last thing a gamer would care about, as they usually wear headphones while gaming.
Well, I don't. ;)
QuattrokongenI can see, however, that the other components in the computer can take a little hit if they get blasted with heat from the GPU. But with good airflow, that shouldn't be much of a problem :).
Not just other components, but the GPU itself as well. If airflow is restricted, its fans will run harder to keep it cool. I'd recommend looking at the arsenal of cases at any computer store and seeing how many of them have practically zero airflow, even the more expensive ones. You know... glass front panels, idiotic fan mount placements and such.

Also, I didn't mention PSU requirements with transient spikes and such.
lasStop the BS and look at reality. 12GB is plenty for 1440p gaming, and even 4K gaming.
I never said it isn't for current games. However, if I pay this much for a GPU, I expect it to have more VRAM just in case. If the 4070 was the same price as the 7800 XT, it would be an absolutely compelling option.
Posted on Reply
#66
las
AusWolfWell, I don't. ;)


Not just other components, but the GPU itself as well. If airflow is restricted, its fans will run harder to keep it cool. I'd recommend looking at the arsenal of cases at any computer store and seeing how many of them have practically zero airflow, even the more expensive ones. You know... glass front panels, idiotic fan mount placements and such.

Also, I didn't mention PSU requirements with transient spikes and such.


I never said it isn't for current games. However, if I pay this much for a GPU, I expect it to have more VRAM just in case. If the 4070 was the same price as the 7800 XT, it would be an absolutely compelling option.
No games will demand much more VRAM than they do now for years. UE5 will only get better, because devs will learn how to optimize it, and UE5 itself will get tweaked over time. There are no big new game engines coming and no new consoles with way more RAM. The current-gen PS5 and XSX have 16GB of shared RAM for the entire system/OS and graphics. You can dream all you want, but 16GB doesn't future-proof a mid-range GPU, because you won't be able to run ultra settings in 2-3 years anyway, unless maybe you enable FSR.

Not a single game uses more than 12GB even in 4K maxed out. And please don't get fooled by allocation like most people do. Many game engines allocate 90-95% of available VRAM regardless of how much they actually need.
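For anyone who wants to check this themselves rather than trust an in-game overlay's number: the sketch below reads the device-level VRAM counters through NVML. It's a minimal illustration, assuming the third-party nvidia-ml-py package (which provides the pynvml module) and an NVIDIA GPU; it isn't taken from any tool mentioned in this thread.

# Minimal sketch: read device-level VRAM counters via NVML.
# Assumes `pip install nvidia-ml-py` and an NVIDIA GPU (an assumed setup,
# not something from this thread). Note that even "used" here is memory
# allocated across all processes, not the working set a game actually touches.
from pynvml import (
    nvmlInit, nvmlShutdown,
    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo,
)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
    mem = nvmlDeviceGetMemoryInfo(handle)    # total / used / free, in bytes
    print(f"total: {mem.total / 2**30:.2f} GiB")
    print(f"used:  {mem.used / 2**30:.2f} GiB")
    print(f"free:  {mem.free / 2**30:.2f} GiB")
finally:
    nvmlShutdown()

Even that "used" figure is an allocation number, which is exactly why allocation-based readings tend to overstate what a game genuinely needs.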
Posted on Reply
#67
AusWolf
lasNo games will demand much more VRAM than they do now for years. UE5 will only get better, because devs will learn how to optimize it, and UE5 itself will get tweaked over time. There are no big new game engines coming and no new consoles with way more RAM. The current-gen PS5 and XSX have 16GB of shared RAM for the entire system/OS and graphics. You can dream all you want, but 16GB doesn't future-proof a mid-range GPU, because you won't be able to run ultra settings in 2-3 years anyway, unless maybe you enable FSR.
PS and Xbox games don't use the same graphics settings and resolutions as their PC versions. A month ago, I was under the assumption that a 6700-6750 XT or 3070 Ti would be enough for rock-solid 60 FPS at 1080p in every game for the next 3-5 years. Then Starfield came out. So, like I said: just in case. ;) Games may or may not need this much VRAM in the near future, and they may or may not run out of GPU horsepower first. But it's better to play it safe. I'm pretty sure similar conversations took place with the 30-series and all its 8 GB cards back in the day.
Posted on Reply
#68
JustBenching
QuattrokongenYeah, but a GPU with high power usage is usually built to consume a lot of power anyway. And noise is the last thing a gamer would care about, as they usually wear headphones while gaming.

I can see, however, that the other components in the computer can take a little hit if they get blasted with heat from the GPU. But with good airflow, that shouldn't be much of a problem :).
I'm using open-back headphones. I don't like the fan noise.
QuattrokongenRay tracing is cool, but not cool enough to justify the price you have to pay for those NVIDIA cards.
You do realize the overwhelming majority of PC gamers disagree with you; they think it's worth paying the price for those NVIDIA cards. That's why NVIDIA has over 80% market share.
Posted on Reply
#69
wNotyarD
fevgatosYou do realize the overwhelming majority of PC gamers disagree with you; they think it's worth paying the price for those NVIDIA cards. That's why NVIDIA has over 80% market share.
Personally, I only have a 3070 because the equivalent Radeons were unobtainium when I was looking to buy a card. I haven't ever used RT or DLSS on it (I did actually try DLSS on F1 2020, never to touch it again).
A singular case, I know, but NV's market share is mostly due to greater availability and MASSIVE marketing.
Posted on Reply
#70
AusWolf
wNotyarDI haven't ever used RT or DLSS on it (I did actually try DLSS on F1 2020, never to touch it again).
Hah, I'm not the only one, it seems! :oops:
Posted on Reply
#71
JustBenching
wNotyarDPersonally, I only have a 3070 because the equivalent Radeons were unobtainium when I was looking to buy a card. I haven't ever used RT or DLSS on it (I did actually try DLSS on F1 2020, never to touch it again).
A singular case, I know, but NV's market share is mostly due to greater availability and MASSIVE marketing.
What marketing? GPUs aren't even a product that's really marketed anywhere. It's not like you see ads or anything. AMD and NVIDIA cards have exactly the same exposure through reviews on YouTube, Twitter, etc.

The best marketing for NVIDIA is buying an AMD card. I just did; never again. Driver issues are just through the roof.
Posted on Reply
#72
AusWolf
fevgatosWhat marketing? GPUs aren't even a product that's really marketed anywhere.
Except for games themselves, trailers, news sites, reviews, public events (like E3 or Gamescom), not to mention Nvidia/AMD's own marketing campaigns (website, YouTube channel, social media, etc.)... How many times have you or someone else posted links to reviews that talk about how great DLSS is? Do you think those reviews are not marketing?
fevgatosDriver issues are just through the roof.
Was it a 5700 XT?
Posted on Reply
#73
JustBenching
AusWolfExcept for games themselves, trailers, news sites, reviews, public events (like E3 or Gamescom), not to mention Nvidia/AMD's own marketing campaigns (website, YouTube channel, social media, etc.)... How many times have you or someone else posted links to reviews that talk about how great DLSS is? Do you think those reviews are not marketing?
Whoever watches any of these already knows about AMD's existence. You think someone goes to E3 and doesn't know that AMD makes GPUs?

How many times has someone posted about the next AMD product that's going to destroy NVIDIA? MLID does it daily. Do you think those are not marketing?
AusWolfWas it a 5700 XT?
Nope
Posted on Reply
#74
AusWolf
fevgatosWhoever watches any of these already knows about AMD's existence. You think someone goes to E3 and doesn't know that AMD makes GPUs?

How many times has someone posted about the next AMD product that's going to destroy NVIDIA? MLID does it daily. Do you think those are not marketing?
You just contradicted yourself. First you said there's no marketing in GPUs, and now you're bringing up examples of it. Of course these are marketing!

And even if you already know that GPU X exists, you can still go to E3, Gamescom, etc. to drool over it and/or Nvidia/AMD's presentation, and/or try the card first-hand. This is what marketing is.

Or why do you think reviewers get free units? Because they can tell everyone what a great product it is (or at least generate some publicity) so that more people will buy it. Simple.

There's TONS of marketing in GPUs!
Posted on Reply
#75
hat
Enthusiast
QuattrokongenI'm just asking why that would be something any gamer would care about, in the same way a race car driver wouldn't care about his car's fuel consumption.

If you are serious about gaming, you buy whatever gives you the best performance, no matter how much power it uses.
Gamers aren't the equivalent of race car drivers; extreme overclockers chasing world records using things like liquid nitrogen are.
Posted on Reply