Thursday, September 14th 2023
NVIDIA GeForce RTX 4070 Could See Price Cuts to $549
NVIDIA's GeForce RTX 4070 12 GB graphics card finds itself embattled against the recently launched AMD Radeon RX 7800 XT, and board partners from NVIDIA's ecosystem plan to do something about it, reports Moore's Law is Dead. A GIGABYTE custom-design RTX 4070 Gaming OC graphics card saw a $549 listing on the web, deviating from the $599 MSRP for the SKU, which hints at what the new pricing for the RTX 4070 could generally look like. At $549, the RTX 4070 would still sell for a $50 premium over the RX 7800 XT, probably banking on better energy efficiency and features such as DLSS 3. NVIDIA partners could take turns pricing their baseline custom-design RTX 4070 products below the MSRP on popular online retail platforms; we don't predict an official price cut that applies across all brands and forces them all to lower their prices to $549. We could also see NVIDIA partners review pricing for the RTX 4060 Ti, which faces stiff competition from the RX 7700 XT.
Source: Moore's Law is Dead (YouTube)
130 Comments on NVIDIA GeForce RTX 4070 Could See Price Cuts to $549
Let's see...
3070 8GB beats 6700XT 12GB in 4K/UHD gaming FYI -> www.techpowerup.com/review/amd-radeon-rx-7800-xt/35.html
Yet none of those cards are good for 4K/UHD gaming anyway, so who cares. Upscaling can make it happen and DLSS reigns supreme as usual.
8K gaming? Lmao... Like 1% of Steam users are using 4K/UHD, and 8K is literally a standing joke even in the TV market, with sales going down YoY. There's not even 8K physical media, so you have to rely on 4K upscaling; meanwhile most people barely stream 1080p on average.
Once again, when 20GB of VRAM is actually needed, the 7900XT will be utter garbage. VRAM never saved a GPU, because the GPU itself will be the limiting factor.
My 4090 would perform identically in 99.9% of games with just 12GB of VRAM instead of 24GB, and the 4090 won't be considered fast in 2-3 generations either.
4070 Ti 12GB beats 3090 24GB in 4K gaming with half the VRAM.
I'm not saying that paying a lot more for more VRAM for the illusion of futureproofing is worth it, but if two cards come in the same price range, and one has 1.5x or 2x the VRAM, I'll choose that one.
I'll take superior features over more VRAM any day. I have 24GB, yet it is pointless.
I see 4070 Ti and 7900XT as upper mid-end solutions, nothing more. 7900XTX and 4080 are much faster but 4090 is in a league of its own.
An upper mid-end GPU doesn't need 20 GB of VRAM.
More VRAM is only good when you actually play games maxed out, and you won't be maxing out demanding games years from now using a 7900XT in 4K/UHD, for sure. The GPU will be too weak. It is already too weak to do it today.
Lower settings = less VRAM required. This is why it is pointless to put a lot of VRAM on a weak GPU. The GPU will be the limiting factor anyway, forcing you to lower details and hence lowering VRAM usage.
DLAA beats any other AA method today, and it's simply a preset of DLSS.
DLAA will always improve on native. Big time, actually. It looks FAR BETTER than native.
DLSS @ Quality Mode will very often beat native as well, + improve performance by 50% or so -> www.rockpapershotgun.com/outriders-dlss-performance
Why? Because DLSS has built-in AA with sharpening as well = delivering a cleaner and sharper image than native most of the time while also upping performance.
Native on its own is pretty much meh -> you need proper AA on top, and you might also need some form of sharpening (AMD CAS or Nvidia Sharpening, or DLAA which does BOTH).
"Native" is pretty much dying. It's not the best solution in many new games. Native means you need to use some old crappy AA solution like TAA (blurry) or live with jaggies all over. DLAA removes EVERYTHING and makes the picture clean. That is the whole point of DLAA = best-looking image, and it will beat native any time.
Upscaling is here to stay. The industry embraced it and many new PC games use it by default (Starfield, Remnant 2 among others), and consoles use it as well in many games (+ dynamic res to make up for the weaker hardware).
It is actually insane that some people think upscaling is only for when you lack performance. AMD users think this because they don't know about DLSS/DLAA and are stuck with FSR, which is blurry and has mediocre motion. Yeah, FSR upscaling is worse than native most of the time; DLSS is not, and DLAA is ALWAYS BETTER than native.
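To put rough numbers on the DLSS Quality performance claim above: here's a minimal sketch of the render-resolution math, assuming the commonly cited per-axis scale factors for each DLSS preset (the exact factors can vary per game and DLSS version):

```python
# Sketch of DLSS preset render-scale math (assumed, commonly cited factors).
# Quality mode renders ~44% of the native pixels, which is roughly where the
# "+50% fps in GPU-bound games" figure comes from.

PRESET_AXIS_SCALE = {
    "Quality": 2 / 3,            # ~0.667 per axis
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(width: int, height: int, preset: str) -> tuple[int, int]:
    """Resolution the game renders at before DLSS upscales to native."""
    s = PRESET_AXIS_SCALE[preset]
    return round(width * s), round(height * s)

for preset, s in PRESET_AXIS_SCALE.items():
    w, h = internal_resolution(3840, 2160, preset)
    print(f"4K {preset}: {w}x{h} internal ({s * s:.0%} of native pixels)")
```

At 4K Quality that works out to a 2560x1440 internal render - less than half the pixel work of native 4K, which lines up with the ~50% uplift figure linked above.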
Yeah VRAM matters if the GPU is up to the task. I don't lack VRAM on my 4090. 4080 has more than enough VRAM as well and so does 4070 series.
The 4060 and Ti 8GB might run into a problem in a game or two at high res when fully maxed; however, the 4060 8GB won't max out demanding games at high res anyway, it's a lower-end solution.
Besides, the 4060 Ti 8GB vs 16GB was tested and the conclusion was: "No significant performance gains from 16 GB VRAM" -> www.techpowerup.com/review/nvidia-geforce-rtx-4060-ti-16-gb/40.html
3070 8GB beats 6700XT 12GB in 2023 as well, even in 4K/UHD gaming. Launch MSRPs were 20 dollars apart.
But sure, keep thinking VRAM is the most important stuff :laugh:
Nah, draw distance is not mostly VRAM. Actually it uses very little VRAM and leans more on the CPU. This is why consoles typically lower draw distance: the CPU is weak. Get some engine knowledge please.
Yeah highest settings = Most VRAM usage. Logic 101.
A 1080 doesn't really do 4K/UHD gaming. Lmao. Dialed-down settings = less VRAM usage, which is why it is pointless to try and futureproof with VRAM in the first place. Once again, logic.
Are you drunk or something? Was funny to read your post :laugh: Makes absolutely no sense. You say VRAM is important but also talk about LOWERING SETTINGS, which will LOWER the VRAM REQUIREMENT - sigh :roll:
But tell me... why does it hurt you so much that not everyone has an orgasm when looking at an upscaled image? Considering that RDNA 4 is rumoured not to have a high-end model, and RDNA 5 is about 4-5 years away, I'd say pretty much all of them.
You are in denial, really. AMD has no features that even come close to what Nvidia offers with RTX.
RDNA4 is 2024 (will have no high-end offerings)
RDNA5 is 2025 (competes with RTX 5000 series)
VRAM doesn't matter when the GPU is weak and the limiting factor, and every GPU will be considered weak in a few generations, and then upscaling will be a must.
Feel free to try and futureproof with VRAM; let's talk in a few generations and see how your 7800XT is doing then.
I tested DLSS at 1080p and didn't like it. I just bought a 3440x1440 ultrawide about a week ago, so I'll make sure to give FSR a go (or even pop my 2070 back in to see DLSS), but I highly doubt my opinion will change.
Until then, this topic is closed on my part.
Your 2070 is too slow to handle DLAA at 3440x1440 and too slow to use DLDSR to downsample from 1440p-2160p, which is what makes a 1080p monitor worth looking at. DLAA is all about improving visuals, not upping performance. DLAA beats any other AA solution and very easily beats "native", because native needs an AA solution on top or you will see jaggies and funky visuals. Native without AA is meh even for 4K/UHD gaming. It needs some form of AA + preferably sharpening to look its best.
AMD CAS is as close to DLAA as you will get, but it is still inferior - though still way better than native. Native in itself is not a goal for me today. I want BETTER THAN NATIVE visuals. DLAA gives me that.
FSR is mediocre at best and looks kinda bad in most games, especially in motion. This is probably why AMD owners talk crap about DLSS - they think it's like FSR.
I can keep writing the same forever. I have experience with both AMD and Nvidia GPUs, tons of it actually. I build high-end custom watercooled PCs as a side business and I touch every high-end part, every generation, and I am not touching AMD myself before they can match Nvidia on features and drivers. If my customers request an AMD GPU, sure, I will use it ofc. Most want an Nvidia card tho. That is reality for you.
AMD mostly sells cheaper GPUs, which you can easily confirm by checking the Steam HW Survey. Barely any higher-end AMD GPUs are represented in the top list. Mostly low to mid-end GPUs. Meanwhile Nvidia has tons of higher-end GPUs on the list.
And this is why "native" is pointless. If AI can improve visuals for me, I will use it. AMD has close to nothing when it comes to AI and features. This is why all AMD users speak of "native" and only look at rasterization performance. Sad but true.
As I have said many times, AMD is cheaper for a reason. They spend little R&D funds on their GPUs and features. Their primary focus is and will always be CPUs and APUs. This is why they can't compete in the GPU segment.
You are happy with native and raster only because you don't know any better - you are simply in denial because you already purchased a 7800XT and refuse to understand that native today is not the best experience.
Nah, sorry to burst your bubble, AMD is not really competing. Their YoY market share went down for several generations in a row. They especially don't compete in the high-end segment, which is why RDNA4 won't even have high-end options.
You will see in 2024 when RDNA4 comes out. The 8800XT comes in less than a year, which just shows how delayed the 7700XT/7800XT were. AMD delayed those cards to sell out remaining 6000 series inventory. The 7800XT literally replaced the 6800XT, delivering almost zero gen-to-gen improvement.
16GB on a card like the 7800XT is close to pointless since the GPU is not even fast enough to utilize it properly. 3440x1440 on high/ultra settings in demanding games? Forget about it, meaning you need to lower settings to attain decent fps (lowering VRAM usage too) or use FSR, which is mediocre.
You see, this is why futureproofing with VRAM is pointless. You won't be able to max out demanding games in a few years anyway. Especially not at 3440x1440. Lower settings = lower VRAM requirement.
Logic for you. You probably still won't understand it.
Even my 4090 will be considered trash when 24GB of VRAM is actually needed in some games, meaning it can't run high settings anyway. I will have changed GPU several times by then tho. The RTX 5090 by 2025 is probably my next upgrade. Let's see if AMD wakes up and decides to compete in the high-end market.
Not that I think AMD will age better, just saying.
Why do you think I bought a 7800 XT and not a 4070 if I care so much about DLSS? Hm? How about you use DLSS to your heart's content and I don't, and we leave each other alone?
The 2070 was not enough to play the game on high settings at release, DLSS or not - maybe at 1080p, but 1080p is really too low to use DLSS; you will be running games internally at 720p on the Quality preset, which is not going to look great. DLSS is mainly for 1440p and up; unless you really need the performance, you should be using DLDSR at 1080p instead (downsampling = delivering much better visuals than 1080p on a 1080p monitor).
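For what it's worth, the DLDSR math works the same way in reverse; here's a minimal sketch, assuming the two pixel-count multipliers the NVIDIA driver exposes for DLDSR (1.78x and 2.25x), where "1.78x" is (4/3)^2 and "2.25x" is (3/2)^2:

```python
# Sketch: DLDSR render resolutions on a 1080p panel (assumed factors).
# The driver labels its factors by pixel count: 1.78x = (4/3)^2, 2.25x = (3/2)^2,
# so the per-axis scales are 4/3 and 3/2.
NATIVE_W, NATIVE_H = 1920, 1080
DLDSR_AXIS_SCALE = {"1.78x": 4 / 3, "2.25x": 3 / 2}

for label, s in DLDSR_AXIS_SCALE.items():
    w, h = round(NATIVE_W * s), round(NATIVE_H * s)
    print(f"DLDSR {label}: renders {w}x{h}, then downsamples to {NATIVE_W}x{NATIVE_H}")
```

That gives 2560x1440 and 2880x1620 internal renders on a 1080p panel; reaching a full 2160p on a 1080p monitor would be classic DSR's 4.00x factor, since DLDSR itself tops out at 2.25x.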
They showcased Cyberpunk 2077 with the 2080 Ti, but the 3000 series came out a few months before Cyberpunk released back in 2020.