Thursday, September 14th 2023

NVIDIA GeForce RTX 4070 Could See Price Cuts to $549

NVIDIA's GeForce RTX 4070 12 GB graphics card finds itself embattled against the recently launched AMD Radeon RX 7800 XT, and board partners from NVIDIA's ecosystem plan to do something about it, reports Moore's Law is Dead. A GIGABYTE custom-design RTX 4070 Gaming OC graphics card saw a $549 listing on the web, deviating from the $599 MSRP for the SKU, which hints at what the new pricing for the RTX 4070 could generally look like. At $549, the RTX 4070 would still sell at a $50 premium over the RX 7800 XT, probably banking on better energy efficiency and features such as DLSS 3. NVIDIA partners could take turns pricing their baseline custom-design RTX 4070 products below MSRP on popular online retail platforms; we don't expect an official price cut that applies across all brands and forces them all down to $549. We could also see NVIDIA partners review pricing for the RTX 4060 Ti, which faces stiff competition from the RX 7700 XT.
Source: Moore's Law is Dead (YouTube)

130 Comments on NVIDIA GeForce RTX 4070 Could See Price Cuts to $549

#101
las
lexluthermiester: That is an opinion. Some apps can use it. And if devs wrote better draw-distance code, 16GB & 20GB would be amazing in most games.

You need to improve your math skills. 8GB is barely enough for 4K gaming right now. The moment the industry hits 8K displays for gaming, 20GB will not be enough.

But I digress... We seem to be getting a bit off topic.
*If*, yeah... :laugh: Draw distance tanks CPU power as well, not really VRAM...

Let's see...

3070 8GB beats 6700XT 12GB in 4K/UHD gaming FYI -> www.techpowerup.com/review/amd-radeon-rx-7800-xt/35.html
Yet none of those cards are good for 4K/UHD gaming anyway, so who cares. Upscaling can make it happen and DLSS reigns supreme as usual.


8K gaming? Lmao... Like 1% of Steam users are using 4K/UHD and 8K is literally a standing joke even in the TV market with sales going down YoY. There's not even 8K physical media, so you have to rely on 4K upscaling; meanwhile most people barely stream 1080p on average.

Once again, when 20GB VRAM is actually needed, the 7900XT will be utter garbage. VRAM never saved a GPU, because the GPU itself will be the limiting factor.

My 4090 would perform identically in 99.9% of games with just 12GB VRAM instead of 24GB, and the 4090 won't be considered fast in 2-3 generations either.

4070 Ti 12GB beats 3090 24GB in 4K gaming with half the VRAM.
#102
AusWolf
las: *If*, yeah... :laugh: Draw distance tanks CPU power as well, not really VRAM...

Let's see...

3070 8GB beats 6700XT 12GB in 4K/UHD gaming FYI -> www.techpowerup.com/review/amd-radeon-rx-7800-xt/35.html
Yet none of those cards are good for 4K/UHD gaming anyway, so who cares. Upscaling can make it happen and DLSS reigns supreme as usual.
That's the present. You don't know what future games will need.
las: Once again, when 20GB VRAM is actually needed, the 7900XT will be utter garbage. VRAM never saved a GPU, because the GPU itself will be the limiting factor.
You don't know that, either. Maybe, maybe not. If you have more VRAM, at least there's only one aspect of future performance you can't control (the GPU) instead of two (GPU and VRAM).

I'm not saying that paying a lot more for more VRAM for the illusion of futureproofing is worth it, but if two cards are in the same price range and one has 1.5x or 2x the VRAM, I'll choose that one.
#103
las
AusWolf: That's the present. You don't know what future games will need.

You don't know that, either. Maybe, maybe not. If you have more VRAM, at least there's only one aspect of future performance you can't control (the GPU) instead of two (GPU and VRAM).

I'm not saying that paying a lot more for more VRAM for the illusion of futureproofing is worth it, but if two cards are in the same price range and one has 1.5x or 2x the VRAM, I'll choose that one.
Neither do you. Superior upscaling is preferable tho, and DLSS wins easily + has the option to use DLAA for superior AA instead. Nothing new will happen anytime soon. Next-gen consoles are 4-5 years away and there are no new game engines coming that will raise requirements further. UE5 will only get better optimized over time, and developers will learn how to optimize games better.

I'll take superior features over more VRAM any day. I have 24GB, yet it is pointless.

I see 4070 Ti and 7900XT as upper mid-end solutions, nothing more. 7900XTX and 4080 are much faster but 4090 is in a league of its own.
An upper mid-end GPU doesn't need 20 GB VRAM.

More VRAM is only good when you actually play games maxed out, and you won't be maxing out demanding games a few years from now using a 7900XT in 4K/UHD, for sure. The GPU will be too weak. It is already too weak to do it today.

Lower settings = less VRAM required. This is why it is pointless to put a lot of VRAM on a weak GPU. The GPU will be the limiting factor anyway, forcing you to lower details and hence VRAM usage.
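
To put rough numbers on the resolution side of that argument - a back-of-envelope sketch; the buffer counts and byte sizes below are illustrative assumptions, not measurements from any engine or from the linked reviews:

```python
# Rough VRAM math for resolution-dependent render targets.
# Assumed (for illustration only): one RGBA16F HDR color buffer (8 B/px),
# a 32-bit depth buffer (4 B/px), and ~6 full-resolution intermediate
# targets (G-buffer, TAA history, post-processing) at 8 B/px each.
# Real engines vary widely and compress aggressively; the texture pool,
# which dominates VRAM, scales with settings rather than resolution.

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

BYTES_PER_PIXEL = 8 + 4 + 6 * 8  # color + depth + intermediates

for name, (w, h) in RESOLUTIONS.items():
    mib = w * h * BYTES_PER_PIXEL / 2**20
    print(f"{name}: ~{mib:,.0f} MiB of render targets")

# ~475 MiB at 4K vs ~1,900 MiB at 8K: resolution-dependent buffers
# quadruple, while lowering texture/detail settings shrinks the rest.
```
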
#104
lexluthermiester
las: Draw distance tanks CPU power as well, not really VRAM...
Not really. If done right it works very well.
las: 3070 8GB beats 6700XT 12GB in 4K/UHD gaming FYI -> www.techpowerup.com/review/amd-radeon-rx-7800-xt/35.html
Yet none of those cards are good for 4K/UHD gaming anyway, so who cares. Upscaling can make it happen and DLSS reigns supreme as usual.
News flash: Those benchmarks are run on the HIGHEST settings possible. No real gamer runs games like that. Your logic has no merit.
las: 8K gaming? Lmao... Like 1% of Steam users are using 4K/UHD and 8K is literally a standing joke even in the TV market with sales going down YoY.
This is what they said about 4K a few years ago, and here we are...
las: Once again, when 20GB VRAM is actually needed, the 7900XT will be utter garbage.
YOUR opinion, one that does not square with logic. Another news flash for you: the GeForce GTX 1080 is 7 years and 4 generations old, is STILL a good card, and can do 4K gaming when some settings are turned down. The RX 580 is 6 years and 5 generations old and can do 4K gaming with a few more settings turned down. Both cards are still relevant, though showing their age. In 6 years' time the 7800XT will be showing its age but will still be able to push games at 4K very well.
las: VRAM never saved a GPU, because the GPU itself will be the limiting factor.
Absolute moose-muffins! Extra memory can ALWAYS make a difference and often does. Stop shoveling that crap, no one believes you because most of us know better and the rest can use their brains for something more than a seat-cushion.
#105
AusWolf
las: Neither do you. Superior upscaling is preferable tho, and DLSS wins easily + has the option to use DLAA for superior AA instead. Nothing new will happen anytime soon. Next-gen consoles are 4-5 years away and there are no new game engines coming that will raise requirements further. UE5 will only get better optimized over time, and developers will learn how to optimize games better.

I'll take superior features over more VRAM any day. I have 24GB, yet it is pointless.

I see 4070 Ti and 7900XT as upper mid-end solutions, nothing more. 7900XTX and 4080 are much faster but 4090 is in a league of its own.
An upper mid-end GPU doesn't need 20 GB VRAM.

More VRAM is only good when you actually play games maxed out, and you won't be maxing out demanding games a few years from now using a 7900XT in 4K/UHD, for sure. The GPU will be too weak. It is already too weak to do it today.

Lower settings = less VRAM required. This is why it is pointless to put a lot of VRAM on a weak GPU. The GPU will be the limiting factor anyway, forcing you to lower details and hence VRAM usage.
That's an opinion. Personally, I prefer to run games at native resolution and only use upscaling if I have to. To me, it's not a feature to sell a graphics card with, but a helping hand before an upgrade. Textures are what really make a game pop, and that's where your VRAM capacity matters.
#106
80-watt Hamster
lexluthermiester: Absolute moose-muffins! Extra memory can ALWAYS make a difference and often does. Stop shoveling that crap, no one believes you because most of us know better and the rest can use their brains for something more than a seat-cushion.
There's a point of diminishing returns, though. The RX 470/570 and RTX 4060 Ti demonstrate this rather conclusively.
#107
lexluthermiester
80-watt Hamster: There's a point of diminishing returns, though. The RX 470/570 and RTX 4060 Ti demonstrate this rather conclusively.
True. Sticking 64GB of VRAM on a current gen card would be pointless for anything gaming related. But the 16GB or 20GB being discussed has potential. History has shown us that the more memory a card has, the better it will age.
#108
AusWolf
lexluthermiester: True. Sticking 64GB of VRAM on a current gen card would be pointless for anything gaming related. But the 16GB or 20GB being discussed has potential. History has shown us that the more memory a card has, the better it will age.
Yep. There's also the GTX 960 2 vs 4 GB as a perfect example.
#109
80-watt Hamster
lexluthermiester: True. Sticking 64GB of VRAM on a current gen card would be pointless for anything gaming related. But the 16GB or 20GB being discussed has potential. History has shown us that the more memory a card has, the better it will age.
AusWolf: Yep. There's also the GTX 960 2 vs 4 GB as a perfect example.
The 960 4GB, like the RX 580 8GB, aged well because it was a balanced design. The 2GB 960 was NV's answer to the question, "How do we sell this card cheaper without sacrificing too much margin?" Contrast the 470/570/4060 Ti, which were more, "Gamers feel they need a ton of VRAM, so let's slap some on and make a few extra bucks in the process." I actually fell for that twice, on an R9 380 (which did hypothetically benefit; I've no evidence either way) and an RX 470, which I admittedly got for a song.
#110
las
AusWolf: That's an opinion. Personally, I prefer to run games at native resolution and only use upscaling if I have to. To me, it's not a feature to sell a graphics card with, but a helping hand before an upgrade. Textures are what really make a game pop, and that's where your VRAM capacity matters.
RTX means DLSS, DLAA, DLDSR, Reflex, useful RT and many other features. AMD has no features that come close. FSR is mediocre compared to DLSS. They have no answer to DLAA, DLDSR or Reflex, and their RT performance is too slow. Hence the lower price.

DLAA beats any other AA method today, and is a part of DLSS. It's simply a preset of DLSS.
DLAA will always improve on native. Bigtime actually. Looks FAR BETTER than native.

DLSS @ Quality mode very often will as well, + improves performance by 50% or so -> www.rockpapershotgun.com/outriders-dlss-performance

Why? Because DLSS has built-in AA with sharpening as well = delivering a cleaner and sharper image than native most of the time while also upping performance.
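
For context, a quick sketch of the render-resolution arithmetic behind those gains; the per-axis scale factors are the ones commonly cited for the DLSS 2 presets, so treat them as approximate:

```python
# Internal render resolution for the DLSS 2 quality presets
# (per-axis scale factors as commonly cited; approximate).
DLSS_SCALE = {
    "Quality":           2 / 3,
    "Balanced":          0.58,
    "Performance":       0.50,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(width, height, preset):
    s = DLSS_SCALE[preset]
    return round(width * s), round(height * s)

for preset in DLSS_SCALE:
    w, h = internal_resolution(3840, 2160, preset)
    print(f"4K {preset}: renders at ~{w}x{h}, reconstructed to 3840x2160")

# 4K Quality shades ~2560x1440, i.e. ~44% of the native pixels - that is
# where the performance gain comes from; the AA/sharpen pass runs on top.
```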

Native on its own is pretty much meh -> you need proper AA on top, and you might also need some form of sharpening (AMD CAS, Nvidia Sharpening, or DLAA, which does BOTH).

"Native" is pretty much dying. It's not the best solution in many new games. Native means you need to use some old, crappy AA solution like TAA (blurry) or live with jaggies all over. DLAA removes EVERYTHING and makes the picture clean. That is the whole point of DLAA = best-looking image, and it will beat native any time.

Upscaling is here to stay. The industry embraced it, many new PC games use it by default (Starfield, Remnant 2 among others), and consoles use it as well in many games (+ dynamic res to make up for the weaker hardware).

It is actually insane that some people think upscaling is only for when you lack performance. AMD users think this because they don't know about DLSS/DLAA and are stuck with FSR, which is blurry and has mediocre motion. Yeah, FSR upscaling is worse than native most of the time; DLSS is not, and DLAA is ALWAYS BETTER than native.

Yeah, VRAM matters if the GPU is up to the task. I don't lack VRAM on my 4090. The 4080 has more than enough VRAM as well, and so does the 4070 series.

The 4060 and 4060 Ti 8GB might hit a problem in a game or two at high res when fully maxed; however, the 4060 8GB won't max out demanding games at high res anyway - it's a lower-end solution.

Besides, the 4060 Ti 8GB vs 16GB was tested and the conclusion was: "No significant performance gains from 16 GB VRAM"

The 3070 8GB beats the 6700XT 12GB in 2023 as well, even in 4K/UHD gaming. Launch MSRPs were 20 dollars apart.


But sure, keep thinking VRAM is the most important stuff :laugh:

www.techpowerup.com/review/nvidia-geforce-rtx-4060-ti-16-gb/40.html
lexluthermiester: Not really. If done right it works very well.

News flash: Those benchmarks are run on the HIGHEST settings possible. No real gamer runs games like that. Your logic has no merit.

This is what they said about 4K a few years ago, and here we are...

YOUR opinion, one that does not square with logic. Another news flash for you: the GeForce GTX 1080 is 7 years and 4 generations old, is STILL a good card, and can do 4K gaming when some settings are turned down. The RX 580 is 6 years and 5 generations old and can do 4K gaming with a few more settings turned down. Both cards are still relevant, though showing their age. In 6 years' time the 7800XT will be showing its age but will still be able to push games at 4K very well.

Absolute moose-muffins! Extra memory can ALWAYS make a difference and often does. Stop shoveling that crap, no one believes you because most of us know better and the rest can use their brains for something more than a seat-cushion.
Nah, draw distance is not mostly VRAM. Actually it uses very little VRAM and is more CPU. This is why consoles typically lower draw distance - because the CPU is weak. Get some engine knowledge, please.
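
A toy model of why longer draw distance lands on the CPU rather than VRAM; the object density and per-object submission cost below are invented purely for illustration:

```python
# Objects visible within draw distance r on a flat open-world map grow
# roughly with r^2, and each one costs CPU time for culling and draw-call
# submission. The same meshes/textures are reused, so VRAM barely moves.
import math

OBJECTS_PER_KM2 = 5_000      # assumed scene density
CPU_US_PER_OBJECT = 0.5      # assumed cull + submit cost, microseconds

for r_km in (0.5, 1.0, 2.0):
    visible = OBJECTS_PER_KM2 * math.pi * r_km**2
    cpu_ms = visible * CPU_US_PER_OBJECT / 1000
    print(f"draw distance {r_km} km: ~{visible:,.0f} objects, "
          f"~{cpu_ms:.1f} ms CPU per frame")

# Doubling the draw distance roughly quadruples the objects to process,
# which is why consoles with weak CPUs pull the draw distance in.
```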

Yeah highest settings = Most VRAM usage. Logic 101.

The 1080 doesn't really do 4K/UHD gaming. Lmao. Dialed-down settings = less VRAM usage, which is why it is pointless to try and futureproof with VRAM in the first place. Once again, logic.

Are you drunk or something? Your post was funny to read :laugh: It makes absolutely no sense. You talk about VRAM being important, but also about LOWERING SETTINGS, which will LOWER the VRAM REQUIREMENT - sigh :roll:
#111
JustBenching
How well a card ages is completely irrelevant. Who the heck will be holding onto a 7900xt 5+ years from now? 1% of the 0.5% that bought it?
#112
AusWolf
las: RTX means DLSS, DLAA, DLDSR, Reflex, useful RT and many other features. AMD has no features that come close. FSR is mediocre compared to DLSS. They have no answer to DLAA, DLDSR or Reflex, and their RT performance is too slow. Hence the lower price.

DLAA beats any other AA method today, and is a part of DLSS. It's simply a preset of DLSS.
DLAA will always improve on native. Bigtime actually. Looks FAR BETTER than native.

DLSS @ Quality mode very often will as well, + improves performance by 50% or so -> www.rockpapershotgun.com/outriders-dlss-performance

Why? Because DLSS has built-in AA with sharpening as well = delivering a cleaner and sharper image than native most of the time while also upping performance.
Okay, one more time in plain English: I... do... not... give... a... damn... about... upscaling. Got it? ;)
las: It is actually insane that some people think upscaling is only for when you lack performance. AMD users think this because they don't know about DLSS/DLAA and are stuck with FSR, which is blurry and has mediocre motion. Yeah, FSR upscaling is worse than native most of the time; DLSS is not, and DLAA is ALWAYS BETTER than native.
I know about DLSS. I used it in Cyberpunk while I had a 2070. And guess what: it was sub-par compared to native. I only had to use it to get usable frame rates with RT on. Otherwise, native is better.
las: Besides, the 4060 Ti 8GB vs 16GB was tested and the conclusion was: "No significant performance gains from 16 GB VRAM"
And now we're back to the present, after I already said like 3 times that a higher VRAM capacity is about future games, not the present. The 2 GB 960 was fine at release, but fell on its face shortly after. Current cards may suffer the same fate, or may not. But I'd rather make sure, as long as I'm paying the same.

But tell me... why does it hurt you so much that not everyone has an orgasm when looking at an upscaled image?
fevgatos: How well a card ages is completely irrelevant. Who the heck will be holding onto a 7900xt 5+ years from now? 1% of the 0.5% that bought it?
Considering that RDNA 4 is rumoured not to have a high-end model, and RDNA 5 is about 4-5 years away, I'd say pretty much all of them.
#113
las
AusWolf: Okay, one more time in plain English: I... do... not... give... a... damn... about... upscaling. Got it? ;)

I know about DLSS. I used it in Cyberpunk while I had a 2070. And guess what: it was sub-par compared to native. I only had to use it to get usable frame rates with RT on. Otherwise, native is better.

And now we're back to the present, after I already said like 3 times that a higher VRAM capacity is about future games, not the present. The 2 GB 960 was fine at release, but fell on its face shortly after. Current cards may suffer the same fate, or may not. But I'd rather make sure, as long as I'm paying the same.

But tell me... why does it hurt you so much that not everyone has an orgasm when looking at an upscaled image?

Considering that RDNA 4 is rumoured not to have a high-end model, and RDNA 5 is about 4-5 years away, I'd say pretty much all of them.
You are using 1080p; upscaling is not really relevant for you. DLAA and DLDSR would be. DLSS is for 1440p and up. And as you can see in the link, it makes visuals better and sharper than native.

You are in denial, really, because AMD has no features that even come close to what Nvidia offers with RTX.

RDNA4 is 2024 (will have no high-end offerings)
RDNA5 is 2025 (competes with RTX 5000 series)

VRAM doesn't matter when the GPU is weak and the limiting factor, and every GPU will be considered weak in a few generations, and then upscaling will be a must.

Feel free to try and futureproof with VRAM; let's talk in a few generations and see how your 7800XT is doing then.
#114
AusWolf
las: You are using 1080p; upscaling is not really relevant for you. DLAA and DLDSR would be. DLSS is for 1440p and up. And as you can see in the link, it makes visuals better and sharper than native.

You are in denial, really, because AMD has no features that even come close to what Nvidia offers with RTX.

RDNA4 is 2024 (will have no high-end offerings)
RDNA5 is 2025 (competes with RTX 5000 series)

VRAM doesn't matter when the GPU is weak and the limiting factor, and every GPU will be considered weak in a few generations, and then upscaling will be a must.

Feel free to try and futureproof with VRAM; let's talk in a few generations and see how your 7800XT is doing then.
You're making it look like I'm suffering from some kind of penis envy because my main gaming PC has an AMD graphics card in it, when in fact, there's absolutely nothing stopping me from buying an Nvidia GPU right now. I just don't want to. Getting the 7800 XT was my choice, I was never locked out of the glorified Nvidia ecosystem by any force. So no, it's not "the poor AMD guy crying that he can't use DLSS". It's me not paying extra for features (gimmicks?) that I don't need, even though I have the means.

I tested DLSS at 1080p and didn't like it. I just bought a 3440x1440 ultrawide about a week ago, so I'll make sure to give FSR a go (or even pop my 2070 back in to see DLSS), but I highly doubt my opinion will change.

Until then, this topic is closed on my part.
#115
las
AusWolf: You're making it look like I'm suffering from some kind of penis envy because my main gaming PC has an AMD graphics card in it, when in fact, there's absolutely nothing stopping me from buying an Nvidia GPU right now. I just don't want to. Getting the 7800 XT was my choice, I was never locked out of the glorified Nvidia ecosystem by any force. So no, it's not "the poor AMD guy crying that he can't use DLSS". It's me not paying extra for features (gimmicks?) that I don't need, even though I have the means.

I tested DLSS at 1080p and didn't like it. I just bought a 3440x1440 ultrawide about a week ago, so I'll make sure to give FSR a go (or even pop my 2070 back in to see DLSS), but I highly doubt my opinion will change.

Until then, this topic is closed on my part.
Any and all AMD GPU owners call DLSS, DLAA and DLDSR, Reflex, RT (and I could go on) gimmicks, because they can't use them. FOMO hits hard, I know. We RTX owners use the features a lot. We paid for them, after all. AMD is cheaper because the features are inferior. If AMD GPUs were actually better, the price would be higher, not lower. Business 101.

Your 2070 is too slow to handle DLAA at 3440x1440 and too slow to use DLDSR to downsample from 1440p-2160p, which would make a 1080p monitor worth looking at. DLAA is all about improving visuals, not upping performance. DLAA beats any other AA solution and very easily beats "native", because native needs an AA solution on top or you will see jaggies and funky visuals. Native without AA is meh even for 4K/UHD gaming. It needs some form of AA + preferably sharpening to look its best.

AMD CAS is as close to DLAA as you will get but it is still inferior. But still way better than native. Native in itself today is not a goal for me. I want BETTER THAN NATIVE visuals. DLAA gives me that.

FSR is mediocre at best and looks kinda bad in most games, especially in motion. This is probably why AMD owners talk crap about DLSS because they think it's like FSR.

I can keep writing the same forever. I have experience with both AMD and Nvidia GPUs, tons of it actually. I build high-end custom watercooled PCs as a side business and I touch every high-end part, every generation, and I am not touching AMD myself until they can match Nvidia on features and drivers. If my customers request an AMD GPU, sure, I will use it ofc. Most want a Nvidia card tho. That is reality for you.

AMD mostly sells cheaper GPUs, which you can easily confirm by checking the Steam HW Survey. Barely any higher-end AMD GPUs are represented in the top list - mostly low to mid-end GPUs. Meanwhile Nvidia has tons of higher-end GPUs on the list.
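
On the DLDSR point above, a small sketch using the 1.78x and 2.25x total-pixel factors NVIDIA exposes for DLDSR; the per-axis scale is the square root of the factor, and outputs are rounded:

```python
# DLDSR renders at a higher internal resolution, then downsamples to the
# monitor - supersampling, i.e. the opposite direction from DLSS upscaling.
NATIVE_W, NATIVE_H = 1920, 1080  # the 1080p monitor from the example above

for factor in (1.78, 2.25):
    per_axis = factor ** 0.5
    w = round(NATIVE_W * per_axis)
    h = round(NATIVE_H * per_axis)
    print(f"DLDSR {factor}x: render at ~{w}x{h}, downsample to 1920x1080")

# Roughly 2560x1440 and exactly 2880x1620 - trading GPU time for image
# quality on a 1080p panel.
```
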
#116
AusWolf
las: Any and all AMD GPU owners call DLSS, DLAA and DLDSR, Reflex, RT (and I could go on) gimmicks, because they can't use them. FOMO hits hard, I know. We RTX owners use the features a lot. We paid for them, after all. AMD is cheaper because the features are inferior. If AMD GPUs were actually better, the price would be higher, not lower. Business 101.

Your 2070 is too slow to handle DLAA at 3440x1440 and too slow to use DLDSR to downsample from 1440p-2160p, which would make a 1080p monitor worth looking at. DLAA is all about improving visuals, not upping performance. DLAA beats any other AA solution and very easily beats "native", because native needs an AA solution on top or you will see jaggies and funky visuals. Native without AA is meh even for 4K/UHD gaming. It needs some form of AA + preferably sharpening to look its best.

AMD CAS is as close to DLAA as you will get but it is still inferior. But still way better than native. Native in itself today is not a goal for me. I want BETTER THAN NATIVE visuals. DLAA gives me that.

FSR is mediocre at best and looks kinda bad in most games, especially in motion. This is probably why AMD owners talk crap about DLSS because they think it's like FSR.

I can keep writing the same forever. I have experience with both AMD and Nvidia GPUs, tons of it actually. I build high-end custom watercooled PCs as a side business and I touch every high-end part, every generation, and I am not touching AMD myself until they can match Nvidia on features and drivers. If my customers request an AMD GPU, sure, I will use it ofc. Most want a Nvidia card tho. That is reality for you.

AMD mostly sells cheaper GPUs, which you can easily confirm by checking the Steam HW Survey. Barely any higher-end AMD GPUs are represented in the top list - mostly low to mid-end GPUs. Meanwhile Nvidia has tons of higher-end GPUs on the list.
If I had FOMO, I would have bought a 4070 instead of my 7800 XT, don't you think? Now you're not only not making any sense, but you're also being straight up offensive, which does no good, believe that. You don't know me, and I'd prefer if you didn't make assumptions about me. The fact that not everyone prays to the maker of your shiny toy every morning and night is hard, I know (see, I can do it too).
#117
las
AusWolf: If I had FOMO, I would have bought a 4070 instead of my 7800 XT, don't you think? Now you're not only not making any sense, but you're also being straight up offensive, which does no good, believe that. You don't know me, and I'd prefer if you didn't make assumptions about me. The fact that not everyone prays to the maker of your shiny toy every morning and night is hard, I know (see, I can do it too).
You had no experience with the features and obviously thought upscaling was all about improving fps while decreasing visuals, which is not true, unless you cherrypick a game with a bad DLSS implementation, or even find DLSS 1 screenshots. DLSS 2+ was where the magic happened. There are literally tons of games with great implementations. Besides, DLAA improves on native.

And this is why "native" is pointless. If AI can improve visuals for me, I will use it. AMD has close to nothing when it comes to AI and features. This is why all AMD users speak of "native" and only look at rasterization performance. Sad but true.

As I have said many times, AMD is cheaper for a reason. They spend little R&D money on their GPUs and features. Their primary focus is and will always be CPUs and APUs. This is why they can't compete in the GPU segment.
#118
AusWolf
las: You had no experience with the features and obviously thought upscaling was all about improving fps while decreasing visuals, which is not true, unless you cherrypick a game with a bad DLSS implementation, or even find DLSS 1 screenshots. DLSS 2+ was where the magic happened. There are literally tons of games with great implementations. Besides, DLAA improves on native.
Yep. Finishing Cyberpunk with DLSS 2 means no experience at all. I should finish at least 345799345723456 more games before I'm allowed to speak. How foolish of me! :slap:
las: If AI can improve visuals for me, I will use it.
Then go ahead and use it. But please allow me not to give a hoot.
las: AMD has close to nothing when it comes to AI and features. This is why all AMD users speak of "native" and only look at rasterization performance. Sad but true.
I don't see anything sad about it. I'm happy without AI, thank you very much.
las: As I have said many times, AMD is cheaper for a reason. They spend little R&D money on their GPUs and features. Their primary focus is and will always be CPUs and APUs. This is why they can't compete in the GPU segment.
They can, just with VRAM and price instead of AI, which is totally fine by me.
#119
las
AusWolf: Yep. Finishing Cyberpunk with DLSS 2 means no experience at all. I should finish at least 345799345723456 more games before I'm allowed to speak. How foolish of me! :slap:


Then go ahead and use it. But please allow me not to give a hoot.


I don't see anything sad about it. I'm happy without AI, thank you very much.


They can, just with VRAM and price instead of AI, which is totally fine by me.
Finishing Cyberpunk, if not v2.0+, means nothing, yeah - especially using a 2070, which can't run the game properly on high settings anyway.

You are happy with native and raster only because you don't know better - you are simply in denial, because you already purchased a 7800XT and refuse to understand that native today is not the best experience.

Nah, sorry to burst your bubble, AMD is not really competing. Their YoY market share went down for several generations in a row. They especially don't compete in the high-end segment, which is why RDNA4 won't even have high-end options.

You will see in 2024 when RDNA4 comes out. The 8800XT comes in less than a year, which just shows how delayed the 7700XT/7800XT were. AMD delayed those cards to sell out remaining 6000 series inventory. The 7800XT literally replaced the 6800XT, delivering almost zero gen-to-gen improvement.

16GB on a card like the 7800XT is close to pointless, since the GPU is not even fast enough to utilize it properly. 3440x1440 on high/ultra settings in demanding games? Forget about it, meaning you need to lower settings to attain decent fps (lowering VRAM usage too) or use FSR, which is mediocre.

You see, this is why futureproofing with VRAM is pointless. You won't be able to max out demanding games in a few years anyway. Especially not in 3440x1440. Lower settings = lower VRAM requirement.

Logic for you. You probably still won't understand it.

Even my 4090 will be considered trash when 24GB VRAM is actually needed in some games, meaning it can't run high settings anyway. I will have changed GPUs several times by then tho. The RTX 5090 in 2025 is probably my next upgrade. Let's see if AMD wakes up and decides to compete in the high-end market.
#120
JustBenching
AusWolf: Considering that RDNA 4 is rumoured not to have a high-end model, and RDNA 5 is about 4-5 years away, I'd say pretty much all of them.
That doesn't make a difference either. You don't have to buy AMD; you can buy Nvidia next gen, and they will have high-end models.
Not that I think AMD will age better, but just saying.
#121
AusWolf
las: Finishing Cyberpunk, if not v2.0+, means nothing, yeah - especially using a 2070, which can't run the game properly on high settings anyway.
It was v 2.0, and it did run the game at high settings - that's why I used DLSS.
las: You are happy with native and raster only because you don't know better - you are simply in denial, because you already purchased a 7800XT and refuse to understand that native today is not the best experience.
Yep, I'm totally stupid, I've never owned a 2070 or tried DLSS, and I only bought a 500 GBP 7800 XT when I could have bought a 500 GBP 4070 instead because I'm poor. Are you done with your insults? :mad:

Why do you think I bought a 7800 XT and not a 4070 if I care so much about DLSS? Hm? How about you use DLSS to your heart's content and I don't, and we leave each other alone?
#122
las
AusWolf: It was v 2.0, and it did run the game at high settings - that's why I used DLSS.


Yep, I'm totally stupid, I've never owned a 2070 or tried DLSS, and I only bought a 500 GBP 7800 XT when I could have bought a 500 GBP 4070 instead because I'm poor. Are you done with your insults? :mad:

Why do you think I bought a 7800 XT and not a 4070 if I care so much about DLSS? Hm? How about you use DLSS to your heart's content and I don't, and we leave each other alone?
Sure, you played Cyberpunk 2077 v2.0, which came out last week, on max settings with a 2070, after you bought a 7800XT. Totally makes sense.
#123
AusWolf
las: Sure, you played Cyberpunk 2077 v2.0, which came out last week, on max settings with a 2070, after you bought a 7800XT. Totally makes sense.
It had DLSS 2.0 since patch 1.2.something.
#124
las
AusWolf: It had DLSS 2.0 since patch 1.2.something.
It has had DLSS 2 since launch; I am talking about CP 2.0, which changed the game completely.

The 2070 was not enough to play the game on high settings at release, DLSS or not. Maybe in 1080p, but 1080p is really too low to use DLSS - you will be running the game internally at 720p on the Quality preset, which is not going to look great. DLSS is mainly for 1440p and up; unless you really need the performance, you should be using DLDSR at 1080p instead (downsampling = delivering much better visuals than 1080p on a 1080p monitor).

They showcased CP with the 2080 Ti, but the 3000 series came out a few months before CP released back in 2020.
#125
AusWolf
las: It has had DLSS 2 since launch; I am talking about CP 2.0, which changed the game completely.
Ah, I see. I just don't know if that's relevant here.