Thursday, January 11th 2024

Feast Your Eyes on These NVIDIA GeForce RTX 40-series SUPER Cards, We'll Be Reviewing Most of These

We were at NVIDIA's CES booth earlier today, and snapped a constellation of brand-new GeForce RTX 40-series SUPER graphics cards waiting to hit the shelves starting January 17. The RTX 4070 SUPER, RTX 4070 Ti SUPER, and RTX 4080 SUPER will form the bulwark of NVIDIA's premium GeForce RTX product line, and will see the company through much of 2024. If you're building a premium 1440p or 4K UHD gaming PC for the spring-summer, you might want to check out our reviews. We have close to three dozen custom-design graphics card reviews lined up this month alone!

We already went hands-on with the RTX 4080 SUPER Founders Edition yesterday at the iBUYPOWER booth; here we caught another one on display, next to the RTX 4070 SUPER FE. There won't be an RTX 4070 Ti SUPER FE; that SKU is a custom-only launch. Every NVIDIA board partner that sells in the West had models on display at the booth, spanning all three of the new GPUs.
These included Colorful with their iGame Ultra W; PNY with their Velocity-X, Verto, and XLR8; ZOTAC with their Trinity Black series; Palit with their JetStream series; MSI with their Ventus 3X and Ventus 2X series; Inno3D with their X3 and Twin X2 series; GIGABYTE with their Eagle, Aero, and Gaming OC series; GALAX with their 1-click OC, EX Gamer, and SP series; Gainward with their Ghost and Panther series; and ASUS with their DUAL and TUF Gaming series.
Lastly, some more pics of the Founders Edition cards. These things are jewellery.

45 Comments on Feast Your Eyes on These NVIDIA GeForce RTX 40-series SUPER Cards, We'll Be Reviewing Most of These

#26
las
Legacy-ZA: Still stuck with 12GB VRAM, and a nice new, shitty power connector. Nope, nopity nopity nope nope. And if the price was near $500, I would say sure; otherwise, NOOOOOOOOOOOOOO.
Says the 8GB VRAM user? :laugh:

AMD will use 12VHPWR/12V-2x6 for next gen as well. It is the future for GPU power. PCI-SIG developed it, and it's part of the ATX 3.0 spec.

I've had zero problems with it, and cable management is much easier; it looks much better too.

GPUs with 3 or even 4 x 8-pin look absolutely laughable in comparison, with a huge cable mess to boot.
Posted on Reply
#27
Vayra86
JAB Creations: Oh, now that the AI hype train is cooling off, Nvidia suddenly wants to bring gamers back, you know, the only reason they even exist to begin with? And the prices are still wildly jacked? Nvidia can keep their inventory and low-VRAM GPU garbage.
That's a big pile of nope right there. What are you on about? Nothing is cooling off, and they aren't bringing gamers back either.
Noztra: I am guessing "Editors Choice", "Highly Recommended" and "Great Value" award. :)
But Expensive
Posted on Reply
#28
Bwaze
Yeah, if anything AI orders are on the rise, not slowing down. Maybe the general public perception has fallen a bit, since ChatGPT, Midjourney and other tools haven't replaced workers just yet.

I bet Nvidia can internally shuffle revenue by sector as they please, so after the release of these brand-new SUPER cards (less than a year before the release of the next generation) they can show enormous growth in Gaming, and confirm their correct decision to raise the prices!

And the surveys will show that gamers own the new cards in almost negligible percentages, but that doesn't matter, because the surveys clearly lie and are biased ...
Posted on Reply
#29
las
b1k3rdude: I picked up my Gaming Trio 4080 for £1000 about 6 months ago, so I'm just waiting for the new 4080 SUPER at that same price point to be a bit of a meh. There is clearly going to be a 4080 Ti/SUPER, but by then you may as well wait for the 5080...
SUPER usually means +10% perf after waiting a year longer.
Absolutely no-one with a 4000 series card should upgrade.

I expect 5090 by Q4 myself.
Posted on Reply
#31
Macro Device
las: You don't buy a 4070 Ti for ultimate 4K gaming tho.
FTFY. With the advancement of DLSS/FSR/XeSS, and RT being an option rather than a mandatory thing, even an RX 6800 is not bad at 4K.
las: most will upgrade every gen or every other gen too
Where did you get this info? Most people upgrade their PC when everything is beyond obsolete, think a Haswell + Maxwell combo. Not everyone is a performance enthusiast with pockets deep enough for annual upgrades. Very much not everyone.
las: I expect 12GB VRAM to be enough until the next console generation around 2028. Absolutely nothing suggests otherwise, and 95% of PC gamers use 1440p or less on top of this. PS5 and XSX have 16GB shared RAM in total. Even the most demanding games on PS5 and XSX barely use 8GB for graphics.
Consoles use lower resolutions, heavy FSR, and lower settings than PC games are "supposed" to run. You can't conclude from 1080p Low-Med + FSR Balanced/Performance what will be enough for the much more common 1440p High + FSR Quality scenarios. Other than that, I agree that GPUs run out of compute power way before running out of VRAM in 99+% of real gaming scenarios.
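To put rough numbers on the resolution side of this argument: raw render-target memory scales linearly with pixel count. A minimal sketch (the 12 bytes/pixel figure is an assumed simplification, one RGBA16F color target plus a 32-bit depth buffer; real engines keep far more buffers, and textures dominate actual VRAM use):

```python
# Rough render-target memory at common resolutions.
# Assumption: one RGBA16F color buffer (8 B/px) + one D32 depth buffer
# (4 B/px). Real engines hold many more targets (G-buffer, shadow maps).
BYTES_PER_PIXEL = 8 + 4

def render_target_mib(width: int, height: int) -> float:
    """Estimated render-target size in MiB for one frame."""
    return width * height * BYTES_PER_PIXEL / 2**20

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f"{name}: {render_target_mib(w, h):.1f} MiB")  # 4K is exactly 4x 1080p
```

The point being: going from 1080p to 4K quadruples this particular cost, but it is tens of MiB, not gigabytes; the bulk of the VRAM debate is really about texture and asset residency.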
Posted on Reply
#32
Scircura
MarsM4N: Nah, the FE had shit cooling. :D Just look at the numbers. The only cards that did worse were heavy OC cards. Compared to the most silent card it's almost 10 dBA louder, which is massive for the ears.
Noise-normalized, the difference from FE to Noctua is 8 °C, and temperature-normalized it looks to be 6 dBA @ 50 cm (the FE is 34.3 dBA @ 63 °C, the Noctua 27.9 dBA @ 64 °C, and you can see all cards with dual BIOS trade 0.5-1.0 dBA per °C). At a normal (farther) listening distance, I think the actual difference could be 3 dBA or less. (Edit: I rechecked the sound pressure formulas and the difference remains 6 dBA regardless of distance.) Audible for sure, but not massive.

I personally would use both a custom fan curve and an undervolt on the FE to get it close to the Noctua "Quiet BIOS" volume, accepting a minor drop in performance.
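The distance point can be sanity-checked with the point-source falloff formula: SPL drops by 20*log10(r/r_ref) dB, and since that term is identical for both cards, their gap is the same at any listening distance. A quick sketch, treating each cooler as an ideal point source (an approximation) and reusing the 34.3/27.9 dBA @ 50 cm readings:

```python
import math

def spl_at_distance(spl_ref: float, r_ref: float, r: float) -> float:
    """Point-source falloff: each doubling of distance costs ~6 dB."""
    return spl_ref - 20 * math.log10(r / r_ref)

fe, noctua = 34.3, 27.9              # dBA readings at 0.5 m
for r in (0.5, 1.0, 2.0):            # listening distances in metres
    gap = spl_at_distance(fe, 0.5, r) - spl_at_distance(noctua, 0.5, r)
    print(f"at {r} m: FE is {gap:.1f} dBA louder")
# Both levels lose the same 20*log10 term, so the ~6.4 dBA gap never changes.
```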
Posted on Reply
#33
Dr. Dro
Bwaze: 20% lower price?

Although I have heard comments that the street price of base models in the US is now about $1100, so the discount might be a bit disappointing, especially if the rumors about limited manufacturing capacity due to Nvidia's focus on AI are true. We might even see increased prices compared to non-SUPER models, explained by low stock.
I mean, the "lower price" is attractive, I suppose, but speaking of the hardware itself, I expect the changes to be even more negligible than they were when comparing the RTX 2080 to the RTX 2080 SUPER, so there's that.
b1k3rdude: I picked up my Gaming Trio 4080 for £1000 about 6 months ago, so I'm just waiting for the new 4080 SUPER at that same price point to be a bit of a meh. There is clearly going to be a 4080 Ti/SUPER, but by then you may as well wait for the 5080...
The 5090 will be the only logical upgrade path for people on the RTX 4080 and 4090.
Posted on Reply
#34
gffermari
These new Super models seem to be more appealing to the eye than the normal models.
And as always the FE is the most beautiful card.
Posted on Reply
#35
iameatingjam
las: It's funny how people think you need 16-24GB VRAM to play AAA games on PC. Like 80-90% of users on Steam have 8GB tops, and developers know it. Just slamming games on Ultra enables tons of crap like motion blur, DoF, etc. that makes the VRAM requirement go up but makes the game look worse. If you actually tweak settings correctly, 12GB VRAM is more than plenty, even for the 1-2% of PC gamers that use 4K/UHD. GPU power is more important, because you CAN and SHOULD always optimize settings manually, which will lower VRAM usage. Most games today look pretty much identical on high and ultra anyway. Hell, even medium/high many times looks CLEANER than ultra because of less BLUR and DOF etc. Horrible.
Idk if you were still talking to me, or just in general, but I didn't buy a 4090 because it had 24GB of VRAM; I bought a 4090 because I needed an upgrade from 8GB of VRAM, and also wanted a performance improvement since I was buying a new card.

Before the 4090 I did get a 4070 actually, but the first benchmark I ran was Heaven, in which I got only 1 fps more than my 3070... which was pretty disappointing. This was probably because of bandwidth and the 8x MSAA, which wouldn't apply so much in most new AAA games (which use TAA or something like it), but I often find myself playing old games and niche games and am super sensitive to aliasing. And seeing how DLDSR + ReShade is my best way to combat that, I felt this bandwidth limit was relevant to me.

So like I said, I didn't really like the value proposition of either the 4070 Ti (same bandwidth limit) or the 4080, so 4090 it was. I mean, think of it from my perspective: I bought a 3070 right before an avalanche of games came out with issues on 8GB cards, and not just at ultra; to play TLOU I had to use high settings and DLSS Balanced to get close to the VRAM limit (I was still over, but it was close enough to stop it from crashing). Oh, and the 3070 had chronic overheating problems too.

I knew 12GB would be enough, and I was happy to pay 4070 prices for it, but not 4070 Ti prices (especially with that bandwidth limit). For that I really would have expected 16GB. I don't think there's anything wrong with the 4080, other than the price. $1200, that's a lot. So you can see how I just went for the 4090... and also... I was a little intoxicated.

Since getting the 4090 I am now using DLDSR in every game, not just games with bad AA. It's getting to the point where running a game (even with good AA) at native resolution (1440p) just doesn't look right. So yes, even though I have a 1440p display, I render at 4K.

Anyway, I totally agree with everything you were saying about ultra not being necessary and all that. I usually go to the second-highest preset by default and tweak from there if necessary. And I don't use RT either; it's not worth the power use imo. I don't like my 4090 even going above 300W. Most of the time it's a lot lower than that. I don't shy away from using DLSS either; it actually works great in combination with DLDSR for keeping the visual quality up and the power use down.
Posted on Reply
#36
Minus Infinity
Daven: All cards are within the performance range of the 4060 and 4090, so no new capabilities here. Just selling off newly collected bins.

I’m more interested in why there are no budget cards below the 4060 and why there isn’t a fully unlocked model 4090 Ti at 18,176 CUDA cores.
LOL, the 4060 is actually a 4050-class card at high prices. Huang claims this is the budget card. Gosh, imagine a 64-bit 4050 with 8MB of L2 cache and 4GB. It could give that AMD POS, the 6500 XT, a run for its money.
Posted on Reply
#37
iameatingjam
Minus Infinity: LOL, the 4060 is actually a 4050-class card at high prices. Huang claims this is the budget card. Gosh, imagine a 64-bit 4050 with 8MB of L2 cache and 4GB. It could give that AMD POS, the 6500 XT, a run for its money.
I think Nvidia is trying to fill the lowest-class card spot with their 'new' 3050 6GB. But yeah, when I saw that the 4060 is now coming in low-profile versions... that pretty much convinced me... it's the 4050. I did fight back against this idea at first, just because, you know, the naming scheme has never been totally consistent and has varied from generation to generation and whatnot. But yeah... I've been convinced... it's the 4050.
Posted on Reply
#38
Bwaze
"In economics, shrinkflation, also known as the grocery shrink ray, deflation, or package downsizing, is the process of items shrinking in size or quantity, or even sometimes reformulating or reducing quality, while their prices remain the same or increase."
Posted on Reply
#39
las
iameatingjam: Idk if you were still talking to me, or just in general, but I didn't buy a 4090 because it had 24GB of VRAM; I bought a 4090 because I needed an upgrade from 8GB of VRAM, and also wanted a performance improvement since I was buying a new card.

Before the 4090 I did get a 4070 actually, but the first benchmark I ran was Heaven, in which I got only 1 fps more than my 3070... which was pretty disappointing. This was probably because of bandwidth and the 8x MSAA, which wouldn't apply so much in most new AAA games (which use TAA or something like it), but I often find myself playing old games and niche games and am super sensitive to aliasing. And seeing how DLDSR + ReShade is my best way to combat that, I felt this bandwidth limit was relevant to me.

So like I said, I didn't really like the value proposition of either the 4070 Ti (same bandwidth limit) or the 4080, so 4090 it was. I mean, think of it from my perspective: I bought a 3070 right before an avalanche of games came out with issues on 8GB cards, and not just at ultra; to play TLOU I had to use high settings and DLSS Balanced to get close to the VRAM limit (I was still over, but it was close enough to stop it from crashing). Oh, and the 3070 had chronic overheating problems too.

I knew 12GB would be enough, and I was happy to pay 4070 prices for it, but not 4070 Ti prices (especially with that bandwidth limit). For that I really would have expected 16GB. I don't think there's anything wrong with the 4080, other than the price. $1200, that's a lot. So you can see how I just went for the 4090... and also... I was a little intoxicated.

Since getting the 4090 I am now using DLDSR in every game, not just games with bad AA. It's getting to the point where running a game (even with good AA) at native resolution (1440p) just doesn't look right. So yes, even though I have a 1440p display, I render at 4K.

Anyway, I totally agree with everything you were saying about ultra not being necessary and all that. I usually go to the second-highest preset by default and tweak from there if necessary. And I don't use RT either; it's not worth the power use imo. I don't like my 4090 even going above 300W. Most of the time it's a lot lower than that. I don't shy away from using DLSS either; it actually works great in combination with DLDSR for keeping the visual quality up and the power use down.
In which game is the 4070 only 1 fps faster than the 3070?

Do you run your Windows resolution at your DLDSR resolution too, then? Because this is the big issue with DLDSR. If you use native res in Windows and DLDSR res in games, it will be wonky, especially when tabbing in and out of games, and crashing can easily happen. Tons of posts about this.

This is why most people who use DLDSR a lot - which is not supported correctly in all games, for your information - run the DLDSR res on the desktop as well. This can make text look weird and introduce other issues like artifacts, depending on your scaling and DLDSR level.
Beginner Micro Device: FTFY. With the advancement of DLSS/FSR/XeSS, and RT being an option rather than a mandatory thing, even an RX 6800 is not bad at 4K.

Where did you get this info? Most people upgrade their PC when everything is beyond obsolete, think a Haswell + Maxwell combo. Not everyone is a performance enthusiast with pockets deep enough for annual upgrades. Very much not everyone.

Consoles use lower resolutions, heavy FSR, and lower settings than PC games are "supposed" to run. You can't conclude from 1080p Low-Med + FSR Balanced/Performance what will be enough for the much more common 1440p High + FSR Quality scenarios. Other than that, I agree that GPUs run out of compute power way before running out of VRAM in 99+% of real gaming scenarios.
That surely depends on the game and performance target. But yeah, the Radeon 6800 can play some games at 4K/UHD, mostly older and less demanding ones.

DLSS/FSR is not magic. You still need a lot of GPU power unless you are willing to use the lower DLSS/FSR presets, which look much worse than native. The Quality/Ultra Quality presets look way better.

Tons of 4K gamers with lacking GPU power are using Performance mode, which is 1080p internally. It can look "fine" and somewhat acceptable, but it won't look good enough for many.

I'd not go 4K unless you can at least afford a new and somewhat high-end GPU, if you plan to play new AAA games.

I don't see much point in going 4K if you play games at 1080p + upscaling anyway. I'd rather go 1440p then, with a higher refresh rate.

At 1440p you can use DLAA or FSR Native AA, which will look and (probably) run better than 4K using FSR/DLSS on the lower presets.

Artifacts and shimmering are the big problem, especially with FSR. DLSS has them too, but to a much lower degree. This is why DLSS/DLAA pretty much always beats FSR in TechPowerUp's testing. They have tons of DLSS/DLAA vs FSR comparisons in newer AAA games by now.
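For reference on what those presets mean in pixels: DLSS 2 and FSR 2 both use per-axis render-scale factors, commonly published as roughly 0.667 for Quality, 0.59 for Balanced, and 0.5 for Performance (individual games and versions can differ, so treat these as approximations). A quick sketch of the internal resolutions:

```python
# Commonly published per-axis render-scale factors for DLSS 2 / FSR 2
# presets; treat these as approximations, as games can override them.
PRESETS = {"Quality": 0.667, "Balanced": 0.59, "Performance": 0.50}

def internal_resolution(out_w: int, out_h: int, scale: float) -> tuple:
    """Internal render resolution for an output size and preset scale."""
    return round(out_w * scale), round(out_h * scale)

for out_w, out_h in ((2560, 1440), (3840, 2160)):
    for preset, s in PRESETS.items():
        w, h = internal_resolution(out_w, out_h, s)
        print(f"{out_w}x{out_h} {preset:<11} -> {w}x{h}")
```

At 4K output, Performance renders 1920x1080 internally and Quality lands near 2560x1440, which is exactly the 1080p-internal figure mentioned earlier in the thread.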
Posted on Reply
#40
iameatingjam
las: In which game is the 4070 only 1 fps faster than the 3070?
Heaven... the benchmark. 1440p, highest quality settings, and 8x MSAA. The 3070 was 113 fps and the 4070 was 114, or something like that.
Do you run your Windows resolution at your DLDSR resolution too, then?
Yes
Because this is the big issue with DLDSR. If you use native res in Windows and DLDSR res in games, it will be wonky, especially when tabbing in and out of games, and crashing can easily happen. Tons of posts about this.
Yeah, I switched to using DLDSR on the desktop just so I didn't have to wait that extra second for the monitor to adjust every time I switch from a game to... something else. But good to know, I suppose.
This is why most people who use DLDSR a lot - which is not supported correctly in all games, for your information - run the DLDSR res on the desktop as well. This can make text look weird and introduce other issues like artifacts, depending on your scaling and DLDSR level.
Yeah, I just used scaling... that keeps things nice and legible. Can't say I've had any artifacts, not on my 4090 anyway. I've had a few games that didn't recognize the resolution in fullscreen mode, so I just had to switch to borderless window.
Posted on Reply
#41
Macro Device
las: But yeah, the Radeon 6800 can play some games at 4K/UHD, mostly older and less demanding ones.
It can play almost any game at UHD native at 60+ FPS. With FSR at Balanced and a couple of settings at High instead of Ultra, it's literally any game, with maybe one or two outliers, namely the latest Avatar game.

The RX 7700 XT is actually a worse 4K performer than the RX 6800, but the difference is tiny.
las: I don't see much point in going 4K if you play games at 1080p + upscaling anyway.

I'd rather go 1440p then.
I have direct access to a 1440p and a 2160p display, with the former being of higher quality. Despite that, native 1440p is very rarely superior to 2160p with FSR at 50% render resolution. It's rather a draw; sometimes 4K + FSR is better.

I play at 4K60 with FSR Performance on my 6700 XT. An additional 35% performance from an RX 6800 would have allowed me FSR Balanced and even FSR Quality in some titles. Can't complain about image quality. Stability is the real subject for improvement, but we will get there eventually.

The 7900/4070 Ti level recommendation is completely understandable. More speed always beats less speed. But even people like me who bought a 4K display for work in the first place (my eyes started getting annoyed to death from working with long reads at 1080p) can still enjoy gaming without going for a different monitor, despite having a 300-dollar GPU.
Posted on Reply
#42
las
Beginner Micro Device: It can play almost any game at UHD native at 60+ FPS. With FSR at Balanced and a couple of settings at High instead of Ultra, it's literally any game, with maybe one or two outliers, namely the latest Avatar game.

The RX 7700 XT is actually a worse 4K performer than the RX 6800, but the difference is tiny.

I have direct access to a 1440p and a 2160p display, with the former being of higher quality. Despite that, native 1440p is very rarely superior to 2160p with FSR at 50% render resolution. It's rather a draw; sometimes 4K + FSR is better.

I play at 4K60 with FSR Performance on my 6700 XT. An additional 35% performance from an RX 6800 would have allowed me FSR Balanced and even FSR Quality in some titles. Can't complain about image quality. Stability is the real subject for improvement, but we will get there eventually.

The 7900/4070 Ti level recommendation is completely understandable. More speed always beats less speed. But even people like me who bought a 4K display for work in the first place (my eyes started getting annoyed to death from working with long reads at 1080p) can still enjoy gaming without going for a different monitor, despite having a 300-dollar GPU.
Well, pretty much all newer GPUs can do 4K/UHD if you tweak settings (lower details) and apply DLSS/FSR; however, this also lowers the VRAM requirement a lot, and then stuff like the 3070 8GB will do 4K gaming as well. If you are willing to sacrifice details by lowering the internal resolution a lot, then pretty much every single GPU can do 4K gaming. Won't look great tho.

I have a 32" 4K/UHD IPS monitor too, only 60 Hz tho, since it's for work. I don't think it looks better than my 27" 1440p at native when using DLAA. Even DLSS Quality at 1440p can look close or similar to 4K with DLSS Performance (1080p internal res), depending on the game and scene. FSR/DLSS always has more shimmering and wonky stuff going on. If I want maximum visuals I always go DLAA instead of DLSS, and 4K with DLAA is very demanding.

I use DLSS Quality most of the time when I output to my UHD OLED TV. The DLSS difference is not really noticeable when you're not pixel peeping, so yeah, it depends on requirements and performance goals.

However, personally I can't play games that drop below 85-90 fps minimum; I prefer 100+ at all times and 200+ fps in shooters. High fps means high minimums, and I hate low minimums.

Several games on that graph drop closer to 30 fps than 60 fps. I see 60 fps as the bare minimum for PC gaming, but I guess some people can live with less.

When using my 4K monitor, there's little difference between Performance and Quality really. 1080p vs 1440p, but 1080p upscales easily and cleanly to 2160p; 1440p doesn't. Probably why most people using DLSS with a 4K monitor use Performance mode instead of Quality.

TechPowerUp tested this in Alan Wake 2, and you can pixel peep in the comparison tool. Very little difference, but way more performance using Performance.
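The "1080p upscales easily to 2160p" point comes down to the scale ratio: 2160/1080 is an exact 2x, so each source pixel maps to a clean 2x2 block, while 2160/1440 is 1.5x, which forces uneven pixel mapping. A quick check (note that temporal upscalers like DLSS don't literally pixel-double, so this only illustrates why the integer ratio is friendlier):

```python
from fractions import Fraction

def scale_factor(src: int, dst: int) -> Fraction:
    """Exact per-axis scale factor from source to destination size."""
    return Fraction(dst, src)

for src in (1080, 1440):
    f = scale_factor(src, 2160)
    kind = ("integer: clean pixel doubling" if f.denominator == 1
            else "fractional: pixels must be interpolated")
    print(f"{src}p -> 2160p: x{f} ({kind})")
```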
Posted on Reply
#43
Macro Device
las: Several games on your image there drop closer to 30 fps than 60 fps.
It's just about 10 games where you need to adjust the settings, yes. There are billions of games in this world and only a dozen needing adjustments. Pretty much lines up with "you can play almost anything at 4K60 with a 6800."
las: 1440p with DLAA looks close to 4K native for me.
Unfortunately, I don't have access to DLAA, yet I know DLSS is much better than FSR, so this statement doesn't scream nonsense to me.
las: personally I can't play games that drop below 85-90 fps minimum; I prefer 100+ at all times and 200+ fps in shooters.
That is what's called "high standards." It's okay, but realistically most AAA games are completely feasible at 60, and going above 120 doesn't improve the experience as much as you'd expect it to.

I'm just happy with the fact that my sub-$1000 puter can handle some 4K. 10 years ago, that sounded like a very LSD-driven dream.
Posted on Reply
#44
las
Beginner Micro Device: It's just about 10 games where you need to adjust the settings, yes. There are billions of games in this world and only a dozen needing adjustments. Pretty much lines up with "you can play almost anything at 4K60 with a 6800."

Unfortunately, I don't have access to DLAA, yet I know DLSS is much better than FSR, so this statement doesn't scream nonsense to me.

That is what's called "high standards." It's okay, but realistically most AAA games are completely feasible at 60, and going above 120 doesn't improve the experience as much as you'd expect it to.

I'm just happy with the fact that my sub-$1000 puter can handle some 4K. 10 years ago, that sounded like a very LSD-driven dream.
Did you actually play some of the recent demanding games tho? Alan Wake 2, Cyberpunk 2.0 + Expansion?

These games will melt most older GPUs, upscaling or not, because they look great even on low settings.

XeSS beats FSR in many games tho, so don't just choose FSR if you have the XeSS option -> www.pcgamer.com/cyberpunk-2077-fsr-vs-xess/

In terms of compatibility XeSS is bad (low support), but in terms of quality it's actually closer to DLSS than FSR is.

XeSS support is improving slowly tho, so we have 3 upscalers, and RTX users can use them all. However, in games with DLSS/DLAA, RTX users hardly need to worry about choosing the right one; it will always be DLSS/DLAA. So I hope Intel and AMD can improve their upscaling to match it.
Posted on Reply
#45
Macro Device
las: Cyberpunk 2.0 + Expansion
4K, Ultra preset but with Screen Space Reflections at Medium quality, FSR at Balanced outside D-town, at Performance inside D-town. Smooth 60 FPS. Ray tracing obviously is completely disabled.
las: Alan Wake 2
Don't own the game.
las: XeSS beats FSR in many games tho, so don't just choose FSR if you have the XeSS option
FSR at Quality is faster than XeSS at Balanced on RDNA GPUs, while being a tiny tad slower than the most aggressive XeSS mode. That's why XeSS is only a go-to option if you are an Arc owner, or if the game for some reason doesn't support DLSS and you're an Ada GPU owner.

GTA V is smooth at 4K. Didn't have many issues in Hogwarts Legacy (the main issue was that the game was boring for me). Also played RDR2 at 4K Medium-Low (or Medium-High, don't remember lol) without a hitch. Crysis 3 wasn't a huge issue either.
Posted on Reply