Thursday, September 14th 2023

NVIDIA GeForce RTX 4070 Could See Price Cuts to $549

NVIDIA's GeForce RTX 4070 12 GB graphics card finds itself embattled against the recently launched AMD Radeon RX 7800 XT, and board partners from NVIDIA's ecosystem plan to do something about it, reports Moore's Law is Dead. A GIGABYTE custom-design RTX 4070 Gaming OC graphics card was spotted listed at $549 on the web, deviating from the $599 MSRP for the SKU, which hints at what new pricing for the RTX 4070 could generally look like. At $549, the RTX 4070 would still sell at a $50 premium over the RX 7800 XT, probably banking on better energy efficiency and features such as DLSS 3. NVIDIA partners could take turns pricing their baseline custom-design RTX 4070 products below MSRP on popular online retail platforms; we don't expect an official price cut that applies across all brands and forces them all down to $549. We could also see NVIDIA partners review pricing for the RTX 4060 Ti, which faces stiff competition from the RX 7700 XT.
Source: Moore's Law is Dead (YouTube)

130 Comments on NVIDIA GeForce RTX 4070 Could See Price Cuts to $549

#26
N/A
FG and DLAA gimmicks should be free. They insist on pushing them; not our problem.
Posted on Reply
#27
oxrufiioxo
N/AFG and DLAA gimmicks should be free. They insist on pushing them; not our problem.
100% agree. As much as I like it in a couple of games, it isn't a feature people should be buying any GPU for.
Posted on Reply
#28
TheoneandonlyMrK
oxrufiioxo100% agree. As much as I like it in a couple of games, it isn't a feature people should be buying any GPU for.
And that's exactly how every feature Nvidia sold me a card for has played out: PhysX, DLSS 1/2. To be fair, it's the same thing with AMD, just less intense on the big sell; I mean hair FX, say no more.
A feature has scarcely ever been worth the effort, because it'll be superseded well before you feel you got your money's worth out of the GPU you bought for it.
And I have, many times.
Posted on Reply
#29
oxrufiioxo
TheoneandonlyMrKAnd that's exactly how every feature Nvidia sold me a card for has played out: PhysX, DLSS 1/2. To be fair, it's the same thing with AMD, just less intense on the big sell; I mean hair FX, say no more.
A feature has scarcely ever been worth the effort, because it'll be superseded well before you feel you got your money's worth out of the GPU you bought for it.
And I have, many times.
I like DLSS enough that an AMD alternative would have to be clearly better; other than that, I don't care much about any other Nvidia feature.
Posted on Reply
#30
TheoneandonlyMrK
oxrufiioxoI like DLSS enough that an AMD alternative would have to be clearly better; other than that, I don't care much about any other Nvidia feature.
Yeah, I didn't.
DLSS 3 FG might be great, but I bought into DLSS 1, got 2 for free, then no 3.
So it's of limited use in the 600+ games I own!
It doesn't suit most games I play either; or, more succinctly, I choose other ways to gain frames if I need to, such is my revulsion for what a 1080-degree spin (DiRT Rally 2.0 and others) looks like with it on. And if you're not spinning out sometimes, are you sure you're trying hard enough?
I play online FPS too, in groups, so stable high FPS and high accuracy outweigh all other needs, and there FSR or DLSS isn't good enough; none of them are. And before it's said: Reflex plus native is again faster than Reflex plus DLSS.
All personal taste really, so I'm not arguing my way is it; I'm again saying YDY (you do you).
Posted on Reply
#31
Tsukiyomi91
"could" is a very loose term. I say let all of NoVideo's AIB partners suffer a little more and watch the stocks rot away when it's still more expensive than a 7800XT. (yes, AMD also gets the same treatment too.)
Posted on Reply
#32
oxrufiioxo
TheoneandonlyMrKYeah, I didn't.
DLSS 3 FG might be great, but I bought into DLSS 1, got 2 for free, then no 3.
So it's of limited use in the 600+ games I own!
It doesn't suit most games I play either; or, more succinctly, I choose other ways to gain frames if I need to, such is my revulsion for what a 1080-degree spin (DiRT Rally 2.0 and others) looks like with it on. And if you're not spinning out sometimes, are you sure you're trying hard enough?
I play online FPS too, in groups, so stable high FPS and high accuracy outweigh all other needs, and there FSR or DLSS isn't good enough; none of them are. And before it's said: Reflex plus native is again faster than Reflex plus DLSS.
All personal taste really, so I'm not arguing my way is it; I'm again saying YDY (you do you).
Me neither. For sure, everyone should do what's best for their hobby and grab the card that fits their use case best.
Posted on Reply
#33
Keivz
john_Maybe FSR 3.0 works. I mean, without FSR 3.0, the RTX 4070 still enjoys a nice advantage because of Frame Generation, and even if someone doesn't care about RT performance, CUDA, power consumption or whatever, FG, no matter how we see it, does give the RTX 4070 a very nice performance advantage, at least on paper, over Radeon cards and even RTX 3000 cards. But IF FSR 3.0 works, then there is no FG advantage. RTX 4060 and RTX 4070 cards lose an advantage over Radeons and RTX 3000 cards; in fact, probably the main advantage Nvidia was pushing for the RTX 4000 series in games.
Agree 100%.
Posted on Reply
#35
cvaldes
TheoneandonlyMrKAnd that's exactly how every feature Nvidia sold me a card for has played out: PhysX, DLSS 1/2. To be fair, it's the same thing with AMD, just less intense on the big sell; I mean hair FX, say no more.
A feature has scarcely ever been worth the effort, because it'll be superseded well before you feel you got your money's worth out of the GPU you bought for it.
And I have, many times.
All computer graphics are fakery. They're just mathematical tricks to get you to think something is realistically portrayed.

Better results generally come from more sophisticated formulas. That means increased computational requirements.

Person A: "Hey, there's a new shading model called Gouraud. It looks better than flat shading."
Person B: "Why do I need that? I'm happy with flat shading."



Years later.

Person A: "Hey, there's an even better shading model called Phong. It's more realistic than Gouraud."
Person B: "Nah, I upgraded to Gouraud a couple of years ago. I'm fine with that."

A few more years pass.

Person A: "A mathematician by the name of Jim Blinn has altered Phong shading."
Person B: "I'll check it out. How do you spell his name?"
Person A: "B-L-I-N-N"

DLSS, upscaling, frame generation, ray-trace reconstruction, all part of the evolution of computer graphics.

There's a reason why we don't see flat shading in computer games anymore.
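
For anyone curious, here is a minimal sketch of what that Phong-to-Blinn step actually changed: the specular term swaps a per-pixel reflection vector for a cheaper half-vector. The vectors and shininess below are made-up toy values, purely for illustration.

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

# Toy inputs (made up for illustration): surface normal, direction to the
# light, direction to the viewer, and a specular exponent.
n = normalize(np.array([0.0, 1.0, 0.0]))
l = normalize(np.array([0.3, 1.0, 0.2]))
v = normalize(np.array([0.0, 0.5, 1.0]))
shininess = 32.0

# Classic Phong: reflect the light direction about the normal,
# then compare the reflection with the view direction.
r = normalize(2.0 * np.dot(n, l) * n - l)
phong_spec = max(np.dot(r, v), 0.0) ** shininess

# Blinn's modification: use the half-vector between the light and view
# directions instead of an explicit reflection vector.
h = normalize(l + v)
blinn_spec = max(np.dot(n, h), 0.0) ** shininess

print(f"Phong specular:       {phong_spec:.4f}")
print(f"Blinn-Phong specular: {blinn_spec:.4f}")
```

Same idea, slightly different math, noticeably cheaper per pixel, which is exactly the kind of incremental evolution described above.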

Yes, there might not be a use case for you today for DLSS 3 Frame Generation or DLSS 3.5 Ray Reconstruction. But someday there probably will be one, for a use case (e.g., a game title) that you care about. The problem is you just don't know when that will happen.

DLSS 1.0 was not embraced at launch. Now all three GPU manufacturers (Nvidia, AMD, Intel) provide upscaling as a tool for developers to tap into, often with great effect. Many now consider DLSS 2.0 to have superior results to conventional TAA.

For sure the technology is improving, often in software. And it's not just about who has better/more transistors. A lot of these implementations are heavily influenced by the quality of the developer tools used to harness this technology.
Posted on Reply
#36
AusWolf
TheoneandonlyMrKAnd that's exactly how every feature Nvidia sold me a card for has played out: PhysX, DLSS 1/2. To be fair, it's the same thing with AMD, just less intense on the big sell; I mean hair FX, say no more.
A feature has scarcely ever been worth the effort, because it'll be superseded well before you feel you got your money's worth out of the GPU you bought for it.
And I have, many times.
That's why I'm an advocate of hardware-agnostic technologies and winning on pure performance. That makes me an AMD fan in some people's eyes, which is laughable considering that maybe 30% of my hardware arsenal is AMD; the rest is Intel/Nvidia. :roll:
Posted on Reply
#37
cvaldes
AusWolfThat's why I'm an advocate of hardware-agnostic technologies and winning on pure performance. That makes me an AMD fan in some people's eyes, which is laughable considering that maybe 30% of my hardware arsenal is AMD; the rest is Intel/Nvidia. :roll:
There is no such thing as totally hardware-agnostic technologies in modern computing or consumer electronics.

The reason we have GPUs in the first place is that CPU cores aren't efficient at handling the mathematical calculations for 3D graphics. Many tasks and functions that were formerly handled by the CPU are now handled by specialized ASICs.

Want to see your CPU in action doing 3D graphics calculations? Just run Cinebench. Note the speed that the images are generated. Imagine playing a game with that rendering speed.

If your phone tried to decode video just using CPU cores, the battery life would be minutes, not hours.

When you watch YouTube on your fancy computer, it is using a video decode block on your graphics card, not the fancy Ryzen 7800X3D CPU, at a fraction of the power (and thus heat). If you forced software decoding on the CPU, you'd see a big power spike and complain about the CPU fan being too loud.
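
If you want to see that split on your own machine, here is a rough sketch (mine, not from the post) that times a software decode against a hardware-accelerated one with ffmpeg. It assumes ffmpeg is installed with NVIDIA CUDA decode support and that input.mp4 is any H.264 clip you have lying around; watch CPU utilisation in a monitor while each run executes to see the load difference.

```python
import subprocess
import time

def time_decode(cmd):
    """Run an ffmpeg decode to a null sink and return wall-clock seconds."""
    start = time.perf_counter()
    subprocess.run(cmd, check=True,
                   stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    return time.perf_counter() - start

clip = "input.mp4"  # placeholder: any H.264 clip on disk

# Software decode: the CPU cores do all the work.
cpu_seconds = time_decode(["ffmpeg", "-i", clip, "-f", "null", "-"])

# Hardware decode: the GPU's fixed-function video block does the work
# (assumes an NVIDIA GPU and an ffmpeg build with CUDA hwaccel).
gpu_seconds = time_decode(["ffmpeg", "-hwaccel", "cuda", "-i", clip, "-f", "null", "-"])

print(f"Software decode: {cpu_seconds:.2f}s  Hardware decode: {gpu_seconds:.2f}s")
```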

At some point, someone came up with algorithms for ray tracing. Originally this was done in software on CPUs. Took forever, not useful for real-time graphics. So it was reserved for still images or a short film (if you had the budget and time) like early Pixar shorts.

At some point, someone said, "hey, let's build a circuit that will handle these calculations more efficiently." Today, we have smartphone SoCs with ray-tracing cores.

Someday in the not-too-distant future, we'll have some other form of differentiated silicon. MPEG-2 encoders used to be custom ASICs; today's media engines handle a wide variety of encoding schemes, the latest being AV1. Someday there will be something else that succeeds AV1 as the next generation. Performance will suck on today's encoding hardware and will get better once the silicon is modified to speed things up.

A graphics card purchase is a singular event in time, but a use case may pop up next month that wasn't on the radar last month. We saw this with the crypto-mining craze. We also found out what happens when crypto-mining conditions change, leaving a bunch of cards utterly useless.

I remember buying a Sapphire Pulse Radeon RX 580 new for $180 (down from the original launch MSRP of $230). Six months later, during the height of the mining craze, that card was going for 3x what I paid for it.
Posted on Reply
#38
AusWolf
cvaldesThe reason we have GPUs in the first place is that CPU cores aren't efficient at handling the mathematical calculations for 3D graphics. Many tasks and functions that were formerly handled by the CPU are now handled by specialized ASICs.

If your phone tried to decode video just using CPU cores, the battery life would be minutes, not hours.

There is no such thing as totally hardware-agnostic technologies in modern computing or consumer electronics.
DirectX, OpenGL, programmable shaders... there's a reason why 99% of game technologies run on every CPU and GPU.
Posted on Reply
#39
cvaldes
AusWolfDirectX, OpenGL, programmable shaders... there's a reason why 99% of game technologies run on every CPU and GPU.
Not sure how many DirectX games run on my iPhone.

Before OpenGL was a standard, it was proprietary IrisGL. It's not like OpenGL was instantly welcomed and adopted by everyone the moment the OpenGL ARB pressed the "publish" button. OpenGL wasn't always "just there" and it's not going to last forever either. Years ago Apple deprecated OpenGL (which was invented by SGI, a defunct 3D graphics company whose heyday was in the Nineties).

My computer (Mac mini M2 Pro) doesn't support OpenGL. My Mac mini 2018 (Intel CPU) might if I installed an old version of the operating system.

And DirectX is really a Microsoft standard that has been forced on the world due to their near-monopolistic practices and stranglehold on the PC industry (remember the antitrust investigation 20 years ago?).

A few years ago Vulkan wasn't on anyone's radar. Today it's important. Someday it'll fall by the wayside, overtaken by more modern graphics technology developed to address the changing needs of the industry and its users.

There are basic concepts that span multiple architectures, but even within one company's products there isn't full compliance. As an AMD guy, you should know that DirectX 12 isn't fully and evenly implemented on every single GPU, even within one generation.

Designing any computer architecture is a combination of hardware and software. The people who understand the hardware best will have the best software. So Nvidia has hardware engineers who work with software engineers, the latter writing drivers, APIs, etc. Apple has done this to great effect.

Remember that just because the industry picks a standard doesn't mean that it will be embraced by all forever and ever. How many DVI and VGA connectors does your RX 7800 XT have? Does it have a VirtualLink port (looks like USB-C)?
Posted on Reply
#40
AusWolf
cvaldesNot sure how many DirectX games run on my iPhone.

Before OpenGL was a standard, it was proprietary IrisGL. It's not like OpenGL was instantly welcomed and adopted by everyone the moment the OpenGL ARB pressed the "publish" button. OpenGL wasn't always "just there" and it's not going to last forever either. Years ago Apple deprecated OpenGL (which was invented by SGI, a defunct 3D graphics company whose heyday was in the Nineties).

My computer (Mac mini M2 Pro) doesn't support OpenGL. My Mac mini 2018 (Intel CPU) might if I installed an old version of the operating system.

And DirectX is really a Microsoft standard that has been forced on the world due to their near-monopolistic practices and stranglehold on the PC industry (remember the antitrust investigation 20 years ago?).

A few years ago Vulkan wasn't on anyone's radar. Today it's important. Someday it'll fall by the wayside, overtaken by more modern graphics technology developed to address the changing needs of the industry and its users.

There are basic concepts that span multiple architectures, but even within one company's products there isn't full compliance. As an AMD guy, you should know that DirectX 12 isn't fully and evenly implemented on every single GPU, even within one generation.

Designing any computer architecture is a combination of hardware and software. The people who understand the hardware best will have the best software. So Nvidia has hardware engineers who work with software engineers, the latter writing drivers, APIs, etc. Apple has done this to great effect.

Remember that just because the industry picks a standard doesn't mean that it will be embraced by all forever and ever. How many DVI and VGA connectors does your RX 7800 XT have? Does it have a VirtualLink port (looks like USB-C)?
Sure, things don't always (or rather, usually don't) start out universal, but there's some sort of standardisation along the way. Power connectors, car safety standards: lots of things have been put into law, or at least covered by some sort of general agreement. Companies have their own stuff, which gets standardised, dies out, or takes the Apple route (closed ecosystem with a solid fanbase) with time. I'm all for standardisation and all against the Apple approach (which Nvidia seems to be following lately). I like choice when I'm buying something, and I don't want to be forced to buy Nvidia because of DLSS, or AMD because of whatever.

I'm not an AMD guy. Only my main gaming rig is fully AMD at the moment, but I've got two HTPCs that are both Intel+Nvidia, and I've got lots of various parts lying around from every manufacturer. I generally prefer AMD's open source approach towards new technologies, but that doesn't mean I restrict my choices to only one brand (although prices seem to be doing that for me anyway).

I hope that makes sense. :)
Posted on Reply
#41
Minus Infinity
ChaitanyaToo little too late for 4060 4070.
Yep, $449 would be closer to the mark for a 12GB 3060 replacement.
Posted on Reply
#42
oxrufiioxo
Minus InfinityYep, $449 would be closer to the mark for a 12GB 3060 replacement.
We'd need a substantially more competitive market for that; the 7800 XT would likely have needed to launch at the same time as the 4070 at $399, and even then I doubt Nvidia would price that low.
Posted on Reply
#43
OneMoar
There is Always Moar
Call me when it gets to $329.
Posted on Reply
#44
lexluthermiester
QuattrokongenNo, even at 499 dollars the 4070 is a bad deal.
Your opinion. Clearly, not everyone agrees.
QuattrokongenThe ONLY things the 4070 does better than the RX 7800 XT are power usage (it's gonna be used on stationary desktops, so who really cares anyway?) and ray tracing. And that's it.
And there you go. Those two things, and a few others you left out, are very good reasons to go with a 4070.

I'm not saying the 7800 XT isn't a great card, because it is. I'm saying that it depends on what the user needs and wants out of their gaming experience.
OneMoarCall me when it gets to $329.
Wait 2 years and buy it used.
ARFPeople not happy at all, if you ask me.
Those are whiners doing what they do best. The rest of us live in the real world.
cvaldesBetter results generally come from more sophisticated formulas. That means increased computational requirements.
Not always. Frequently, methods to do the same work in better, more efficient ways are developed.
Posted on Reply
#45
ARF
cvaldesThere is no such thing as totally hardware-agnostic technologies in modern computing or consumer electronics.

The reason we have GPUs in the first place is that CPU cores aren't efficient at handling the mathematical calculations for 3D graphics. Many tasks and functions that were formerly handled by the CPU are now handled by specialized ASICs.

Want to see your CPU in action doing 3D graphics calculations? Just run Cinebench. Note the speed that the images are generated. Imagine playing a game with that rendering speed.
Short answer: GPUs have far more floating-point execution units than CPUs. Long answer: a GPU is designed for highly parallel, low-precision floating-point computation, such as graphics rendering.
GPUs (graphics processing units) are optimized for parallel processing, which allows them to perform many calculations at once. This is in contrast to CPUs (central processing units), which are typically optimized for sequential processing. Because of this, GPUs are able to perform many more floating-point operations per second (FLOPS) than CPUs. Additionally, GPUs have specialized hardware, such as thousands of small cores and wide memory buses, optimized for the types of calculations commonly used in graphics, such as matrix operations. This further increases their FLOPS throughput.

GPU computing is faster than CPU computing for this kind of work because GPUs have thousands of processing cores, while CPUs have comparatively few.
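
A quick, hedged illustration of that parallelism gap; this is just a sketch that assumes NumPy, CuPy and a CUDA-capable GPU are available, with an arbitrary matrix size.

```python
import time

import numpy as np
import cupy as cp  # assumes the CuPy package and a CUDA-capable GPU

N = 4096  # arbitrary size chosen for illustration

# CPU: one large single-precision matrix multiply via NumPy's BLAS backend.
a_cpu = np.random.rand(N, N).astype(np.float32)
b_cpu = np.random.rand(N, N).astype(np.float32)
t0 = time.perf_counter()
np.matmul(a_cpu, b_cpu)
cpu_s = time.perf_counter() - t0

# GPU: the same multiply spread across thousands of CUDA cores.
a_gpu = cp.asarray(a_cpu)
b_gpu = cp.asarray(b_cpu)
cp.matmul(a_gpu, b_gpu)          # warm-up: allocations and kernel compilation
cp.cuda.Device().synchronize()
t0 = time.perf_counter()
cp.matmul(a_gpu, b_gpu)
cp.cuda.Device().synchronize()   # kernels launch asynchronously; wait for completion
gpu_s = time.perf_counter() - t0

print(f"CPU matmul: {cpu_s:.3f} s   GPU matmul: {gpu_s:.3f} s")
```

It's the same operation on both sides; only the width of the hardware doing it differs.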
lexluthermiesterThose are whiners doing what they do best. The rest of us live in the real world.
Actually, it is the opposite. The "whiners" do live in the real world, while those who support unreal pricing have lost touch with the current state of the world.
It's called stagflation, the worst, if you still remember it.
Posted on Reply
#46
lexluthermiester
ARFActually, it is the opposite. The "whiners" do live in the real world, while those who support unreal pricing have lost touch with the current state of the world.
It's called stagflation, the worst, if you still remember it.
There's a fanciful twist. Reality is accepting things the way they really are; fantasy is wishing for what you want them to be. Those people, as well as many here, are expressing their wishes, fantasies that have zero bearing on actual reality. And some of them are doing so with complaint as the context. Thus, whiners be whining.
Posted on Reply
#47
Legacy-ZA
ChaitanyaToo little too late for 4060 4070.
Yep, and no deal if it's not $400.
Posted on Reply
#48
ARF
lexluthermiesterThere's a fanciful twist. Reality is accepting things the way they really are; fantasy is wishing for what you want them to be. Those people, as well as many here, are expressing their wishes, fantasies that have zero bearing on actual reality. And some of them are doing so with complaint as the context. Thus, whiners be whining.
No, this is simply some capitalists, speaking of Nvidia, abusing their monopolistic position in the market.
There is a solution: vote with your wallet and don't buy (like the "whiners" actually do), which results in all-time low graphics card shipments; if this trend goes on, Nvidia will be forced to exit the graphics card market.
Posted on Reply
#49
AusWolf
ARFall-time low graphics card shipments
Do you have a source on this? This is getting interesting.
Posted on Reply
#50
lexluthermiester
ARFNo, this is simply some capitalists, speaking of Nvidia, abusing their monopolistic position in the market.
And THAT is a statement of complaint. I wouldn't call it whining in this context. It is an opinion.
ARFThere is a solution: vote with your wallet and don't buy (like the "whiners" actually do)
That's fair.
ARFwhich results in all-time low graphics card shipments
Not in the case of Nvidia. Only the small minority who don't have the money anyway will boycott them. Not really impactful.
ARFand if this trend goes on, Nvidia will be forced to exit the graphics card market.
And you think the likelihood of that is high? Now who's fantasizing?
AusWolfDo you have a source on this? This is getting interesting.
He doesn't.
Posted on Reply