# NVIDIA GeForce RTX 4090 Founders Edition



## W1zzard (Oct 11, 2022)

The NVIDIA GeForce RTX 4090 Founders Edition offers huge gains over its predecessors. It's the first graphics card to get you 4K 60 FPS with ray tracing enabled and upscaling disabled. Prefer 120 FPS instead of 60? Just turn on DLSS 3.

*Show full review*


----------



## Kovoet (Oct 11, 2022)

Expected more for that price and size. I'll stick with my 3080, but I'll try Arc for my cheaper rig


----------



## Dirt Chip (Oct 11, 2022)

Very fast, very efficient. Very, very expensive.
yawn


----------



## Fluffmeister (Oct 11, 2022)

Damn Ray Tracing performance has taken quite a leap up, impressive indeed!

And it's actually rather more efficient too despite what was expected.


----------



## TheDeeGee (Oct 11, 2022)

While it isn't the claimed 4x faster, the sometimes near-double jump from one generation to the next is impressive.


----------



## Arco (Oct 11, 2022)

I wonder what AyyyyyMD is going to come up with!


----------



## Rowsol (Oct 11, 2022)

The efficiency is the most surprising part.


----------



## spnidel (Oct 11, 2022)

63% faster than a 3080 ti, meanwhile the 4080 was advertised to be "2-4x faster than 3080 ti" lmao
yeah this generation is a joke price and power consumption wise
but hey - for only $1600 you can now play cyberpunk 2077 in 4k at 40 fps with raytracing enabled! 
waiting for rdna 3 now


----------



## KarymidoN (Oct 11, 2022)

4K gaming beast.
Not that power hungry and not really hot.
If it fits your case and budget, it's a great card.

Thx for the amazing Review @W1zzard


----------



## GerKNG (Oct 11, 2022)

TWO TO FOUR TIMES FASTER!!!


----------



## usiname (Oct 11, 2022)

But, but 2-4x faster than 3090ti




Be ready for the disaster: the 4080 12GB has half the cores => 3080 10GB performance level


----------



## Dirt Chip (Oct 11, 2022)

Also, great review, with two new and very welcome tests!
The split video for DLSS 3 (page 35) is a wonderful added feature (though at the moment it's an endless loading loop in Chrome).
The upcoming noise-normalized test (page 38) is a terrific tool, so thank you very much for adding it.


----------



## Zubasa (Oct 11, 2022)

Rowsol said:


> The efficiency is the most surprising part.


Not really; TSMC 5nm-class nodes are miles ahead in efficiency compared to Samsung 8nm, and even to TSMC's N7.
Apple's M1 used a similar node and blew Intel's previous laptop CPUs out of the water in that regard.


----------



## P4-630 (Oct 11, 2022)

Arco said:


> I wonder what AyyyyyMD is going to come up with!



AMD will probably lag behind with RT.


----------



## Space Lynx (Oct 11, 2022)

30 FPS faster at 1440p than a 3080 Ti in Elden Ring... and several other games show only 20 FPS gains... I mean, it's cool that some of the NVIDIA-favored games are seeing 100+ FPS gains... but honestly... for $1600... ehh... I will wait for RDNA3. Not impressed.

$1600 should get me more than 30 extra FPS in Elden Ring vs. a 3080 Ti.



P4-630 said:


> AMD will probably lag behind with RT.



good, I want raw fps increases for smoothness, not RT

RDNA3 to the moon


----------



## Chaitanya (Oct 11, 2022)

Given NVIDIA's official policy that even thermal pad changes void the warranty on their FE cards, thanks but no thanks. I'd like to see a computational review of these new cards.


----------



## BigMack70 (Oct 11, 2022)

> For a majority of gamers, the "classic" raster performance is very important though—highest settings, RT off, DLSS off—so we made sure to extensively test this scenario using 25 games at three resolutions. The GeForce RTX 4090 achieves incredible performance results here: +45% vs RTX 3090 Ti. Yup, 45% faster than last generation's flagship—this is probably the largest jump gen-over-gen for many years. Compared to RTX 3080 the uplift is 89%—wow!—almost twice as fast. Compared to AMD's flagship, the Radeon RX 6950 XT, the RTX 4090 is 64% faster. Somehow I feel that after RDNA2, Jensen said to his people "go all out, I want to conclusively beat AMD next time".



I can't stand when you guys pop off absolute nonsense like this in your reviews. This is literally the SAME performance uplift as last generation based on YOUR own review data of the 3090. Identical to the uplift from 3090 over 2080 Ti. It's absolutely unremarkable.

The 3090 was 45% faster than the 2080 Ti. The 2080 Ti was 39% faster than the 1080 Ti. The 1080 Ti was 85% faster than the 980 Ti. That's all based on YOUR data for 4k relative performance for the cards at launch.
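Those launch-day uplift figures can be multiplied out as a quick sanity check. A minimal sketch, using only the percentages quoted in this post (not fresh benchmarks; the 4090 step uses the review's +45% vs. the 3090 Ti):

```python
# Gen-over-gen 4K relative-performance uplifts quoted in the post above
# (TPU launch-review figures, per the poster; treated here as given).
uplifts = [
    ("980 Ti  -> 1080 Ti", 1.85),
    ("1080 Ti -> 2080 Ti", 1.39),
    ("2080 Ti -> 3090",    1.45),
    ("3090 Ti -> 4090",    1.45),  # the review's +45% figure
]

cumulative = 1.0
for step, factor in uplifts:
    cumulative *= factor
    print(f"{step}: x{factor:.2f}  (cumulative: x{cumulative:.2f})")
# The chain works out to roughly x5.4 from the 980 Ti era to today.
```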

Stop advertising for Nvidia. This is expected and unremarkable flagship performance for a generational improvement and if you start going back a few generations before Turing, it's historically sub-par. Do better.


----------



## Ayhamb99 (Oct 11, 2022)

Huh...... I was expecting this card to be way more power hungry, but given the improvements over last gen, I was surprised that it didn't surpass the 3090 Ti's power consumption by a lot...

Of course, for $1600... ehhhh... best to wait until the 4080 and the imposter 4070 reviews come out


----------



## TheinsanegamerN (Oct 11, 2022)

Zubasa said:


> Not really; TSMC 5nm-class nodes are miles ahead in efficiency compared to Samsung 8nm, and even to TSMC's N7.
> Apple's M1 used a similar node and blew Intel's previous laptop CPUs out of the water in that regard.


Doesn't change the fact that this is a very BIG card, and every company has been boosting out of the efficiency curve for several generations now. It's surprising NVIDIA didn't crank this thing to the moon.


----------



## P4-630 (Oct 11, 2022)

usiname said:


> Be ready for the disaster: the 4080 12GB has half the cores => 3080 10GB performance level



Don't forget the special sauce and sprinkles on these new GPU's....


----------



## TheinsanegamerN (Oct 11, 2022)

BigMack70 said:


> I can't stand when you guys pop off absolute nonsense like this in your reviews. This is literally the SAME performance uplift as last generation based on YOUR own review data of the 3090. Identical to the uplift from 3090 over 2080 Ti. It's absolutely unremarkable.
> 
> The 3090 was 45% faster than the 2080 Ti. The 2080 Ti was 39% faster than the 1080 Ti. The 1080 Ti was 85% faster than the 980 Ti. That's all based on YOUR data for 4k relative performance for the cards at launch.
> 
> Stop advertising for Nvidia. Do better.


a 45% improvement over an already insane card is unremarkable? 

Go to bed BigMack, you're drunk.


----------



## Frick (Oct 11, 2022)

Would it be interesting to add 4K Vsync 60Hz to this?


----------



## Micko (Oct 11, 2022)

May I point out that using Unigine Heaven at 1920x1080 for overclocked performance is not valid? Heaven is CPU-limited in some parts of the test on much weaker cards, let alone a 4090. Could you at least switch the resolution to 4K to minimize the CPU bottleneck? I bet the performance gained from overclocking would be closer to 10% in that case.


----------



## Dirt Chip (Oct 11, 2022)

Rowsol said:


> The efficiency is the most surprising part.


Yep, less total W consumption in gaming (vs. the 3090 Ti, 3090, 3080 Ti) while much, much faster.


----------



## Colddecked (Oct 11, 2022)

This thing packs so much AI it gets bored of rendering graphics below 4K.


----------



## BigMack70 (Oct 11, 2022)

TheinsanegamerN said:


> a 45% improvement over an already insane card is unremarkable?
> 
> Go to bed BigMack, you're drunk.


Yes it's unremarkable. It's ordinary. Expected. Routine.

Don't believe me? Go look at the flagship card launches for every generation of GPUs. Historically, it's ordinary and unremarkable in every way. It's identical to Ampere, slightly higher than Turing, MUCH lower than Pascal. Higher or lower than Maxwell depending on if you consider the 780 or the 780 Ti the Kepler flagship. Lower than Kepler.

Anyone jumping up and down as if a 45% performance improvement is anything other than expected is drinking Jensen's Kool-Aid and needs to take a break from Nvidia's marketing materials.


----------



## birdie (Oct 11, 2022)

The biggest issue of this generation is that the other cards, despite costing an arm and a leg, are castrated too much, not to mention that NVIDIA had no qualms calling an RTX 4070 an RTX 4080.

It even looks like there will be no RTX 4060/4050 this gen and Nvidia will continue selling the previous generation stock.


----------



## Colddecked (Oct 11, 2022)

BigMack70 said:


> Yes it's unremarkable. It's ordinary. Expected. Routine.
> 
> Don't believe me? Go look at the flagship card launches for every generation of GPUs. Historically, it's ordinary and unremarkable in every way. It's identical to Ampere, slightly higher than Turing, MUCH lower than Pascal. Higher or lower than Maxwell depending on if you consider the 780 or the 780 Ti the Kepler flagship. Lower than Kepler.
> 
> Anyone jumping up and down as if a 45% performance improvement is anything other than expected is drinking Jensen's Kool-Aid and needs to take a break from Nvidia's marketing materials.


I mean, you can see what it can do in games that can somewhat take advantage of its strong suits.  100% over a 3080 is nuts.


----------



## Rokugan (Oct 11, 2022)

+63% vs. RTX 3090 @ 3x the second-hand market price in Europe (2K vs. 700 EUR)
+88% vs. RTX 3080 @ 4x the second-hand market price in Europe (2K vs. 500 EUR)

Absolutely idiotic pricing in Europe. No thanks, Ngreedia


----------



## LFaWolf (Oct 11, 2022)

Wow impressive performance and jump from last gen indeed. Great review!


----------



## erek (Oct 11, 2022)

KarymidoN said:


> 4k gaming beast.
> Not that power hungry and not really hot.
> if it fits your case and budget its a great card.
> 
> Thx for the amazing Review @W1zzard


Agreed; I immediately noticed the 4K performance. Wonder if I can finally do 4K 144 Hz in GTA Online


----------



## Denver (Oct 11, 2022)

I believe the CPU is limiting the performance the card can achieve, especially at 1080P. 

It's worth a new round with the big dogs like 7700x/12900k and 5800x3d


----------



## BigMack70 (Oct 11, 2022)

Colddecked said:


> I mean, you can see what it can do in games that can somewhat take advantage of its strong suits.  100% over a 3080 is nuts.


It's obviously a card that performs incredibly well.

My point is that the hyperbole in the conclusion of this review reflects severe historical ignorance and is unwarranted. It performs incredibly well, and its incredible performance uplift is exactly in the ordinary window of expectations that has been occurring for over a decade as new generations of GPU get launched.

This is a "meets expectations" product concerning traditional performance. Which is perfectly fine - generational improvements are fantastic and game-changing! But to act like this is an "exceeds expectations" product when talking about plain rasterized performance is nonsense. The Ray Tracing and feature set is in many ways extra-ordinary and I think some hyperbole there appears justified. But not on the baseline performance as exists currently in this review's conclusion.


----------



## W1zzard (Oct 11, 2022)

Micko said:


> May I notice that using Unigine Heaven at 1920x1080 for overclocked performance is not valid. Heaven is cpu limited in some parts of the test on much weaker cards, let alone 4090. Could you at least switch the res to 4k to minimize CPU bottleneck ? I bet the performance gained from overclocking should be closer to 10% in that case.


I'm only using a small subset of Heaven, the most demanding area. I've seen it do over 500 FPS in that scene, so not completely CPU limited.
I'll still look into a better test though, but that invalidates all previous data, so lots of retesting will be needed



Denver said:


> It's worth a new round with the big dogs like 7700x/12900k and 5800x3d


Will upgrade CPU later this year. I just spent two weeks in summer retesting on the current 5800X. Due to so many cards, games and resolutions I can't just switch easily


----------



## HD64G (Oct 11, 2022)

Methinks that using the 5800X instead of a 5800X3D or 7700X is bottlenecking the 4090 up to 1440p. That also keeps power consumption lower and increases the efficiency. My 5c.


----------



## DemonicRyzen666 (Oct 11, 2022)

How is this better? All they did was add more ROPs (a 57% increase vs. the 3090) and MHz to the core. The lower node is what makes this card efficient; it's actually crappy vs. Ampere.


TheinsanegamerN said:


> a 45% improvement over an already insane card is unremarkable?
> 
> Go to bed BigMack, you're drunk.



10% increase in clock speeds
57% increase in ROPs
48% increase in TMUs
52% increase in shaders
52% increase in Tensor cores
52% increase in RT cores
12 times the L2 cache

4nm vs 8nm ? 

The power drop seems like it's all from the node.
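A naive throughput sketch from those numbers: unit count times clock, everything else (memory bandwidth, cache effects, CPU limits) ignored. Figures are the rounded increases listed above:

```python
# Naive theoretical shader-throughput scaling vs. the 3090:
# more shaders x higher clocks, all other factors ignored.
shader_scale = 1.52  # +52% shaders
clock_scale  = 1.10  # +10% clock speed

theoretical = shader_scale * clock_scale
print(f"naive theoretical uplift: x{theoretical:.2f}")  # ~x1.67

# TPU measured +45% at 4K raster (x1.45), well short of this naive
# estimate, consistent with other bottlenecks being in play.
```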


----------



## Vipeax (Oct 11, 2022)

Average FPS being around 150 at 4K indicates a CPU limit, as others have also concluded. This also makes any talk about percentages irrelevant for now.


----------



## mashie (Oct 11, 2022)

Nice card, won't fit my case.


----------



## Crackong (Oct 11, 2022)

usiname said:


> Be ready for the disaster: the 4080 12GB has half the cores => 3080 10GB performance level


This

The 4080 (16GB) has exactly 60% of the CUDA cores of the 4090.
The 4080 (12GB) has just 47%.
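For reference, those shares follow from the announced CUDA core counts (16384 / 9728 / 7680 for the 4090 / 4080 16GB / 4080 12GB; figures as reported at announcement, and the 16GB actually lands at ~59%):

```python
# CUDA core counts as announced (assumed figures, not re-verified here).
cores = {
    "RTX 4090":      16384,
    "RTX 4080 16GB":  9728,
    "RTX 4080 12GB":  7680,
}

for name, count in cores.items():
    share = count / cores["RTX 4090"]
    print(f"{name}: {count} cores ({share:.0%} of the 4090)")
# RTX 4080 16GB -> 59%, RTX 4080 12GB -> 47%
```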


----------



## bobsled (Oct 11, 2022)

Awesome review! Thanks for the testing without DLSS!

Interesting how Nvidia doesn’t seem to be increasing RT capability ahead of raster, but rather keeping it in line. (RT performance impact in FPS vs not enabled)


----------



## ymdhis (Oct 11, 2022)

Ah yes, the new flagship card is out, I can't wait until TPU's entire first page will be filled with 30 iterations of "company X releasing card Y" headlines.

also, 500W, lmao.



BigMack70 said:


> Yes it's unremarkable. It's ordinary. Expected. Routine.
> 
> Don't believe me? Go look at the flagship card launches for every generation of GPUs. Historically, it's ordinary and unremarkable in every way. It's identical to Ampere, slightly higher than Turing, MUCH lower than Pascal. Higher or lower than Maxwell depending on if you consider the 780 or the 780 Ti the Kepler flagship. Lower than Kepler.
> 
> Anyone jumping up and down as if a 45% performance improvement is anything other than expected is drinking Jensen's Kool-Aid and needs to take a break from Nvidia's marketing materials.


This guy has a point; new cards being almost twice as fast has always been a thing. You can go back practically to the 3dfx Voodoo 2 and still see the same numbers.


----------



## bobsled (Oct 11, 2022)

Dirt Chip said:


> Yep, less total W consumption in gaming (vs. 3090ti, 3090, 3080ti) while much much faster.


Samsung 8nm really can't win against TSMC's processes. The reduction in transistor size (i.e., density) alone would've done major favours.


----------



## nguyen (Oct 11, 2022)

A 5800X + 16GB of (single-rank) memory is just too slow for the next gen of GPUs


----------



## gridracedriver (Oct 11, 2022)

Impressed by the performance per watt, better than I expected. DLSS 3, however, does not convince me: excellent performance but low quality for now.

The real problems remain the price (no perf/cost increase) and the size (too big).


----------



## Gica (Oct 11, 2022)

Rokugan said:


> View attachment 264998
> 
> +63% vs  RTX3090 @ 3x  2nd hand market price in Europe (2K vs 700 Eur)
> +88% vs RTX3080 @ 4x  2nd hand market price in Europe (2K vs 500 Eur)
> ...


+13223363% vs  Mazda
+354534242% vs Logan

Absolutely idiotic pricing in Europe. No thanks *Ferrari*. 

PS. 
All women in the world are anatomically identical. However, you don't choose randomly, no. If you want one of the elite, you have to make efforts.
As for the price, the RTX 4090 sells for a price at which you'd consider yourself lucky if you got a 3080 last year.


----------



## mechtech (Oct 11, 2022)

Takeaways:

TSMC 4nm looks good
Power use can be crazy
Vsync power use surprisingly low
At 1080p/1440p there's mostly no point to this card
No DP 2.0
Good RT performance (if you have any games with that)
MSRP not even close to retail; seeing $3400-$4000 here in Canada w/o the 13% tax

Nice beast, but hard pass. Will get all new appliances first.


----------



## wolf (Oct 11, 2022)

That efficiency at 450 W, dayum. Just about 2x perf/watt over Ampere, and I'd wager it will be even better further down the stack.

Still: too big, too expensive, and not quite enough for me to consider replacing a 3080 with one right this very minute.


----------



## Ayhamb99 (Oct 11, 2022)

While the power consumption at stock speeds wasn't as bad as I thought it would be, for anyone who is thinking of buying this thing and overclocking it, prepare yourself


----------



## CyberCT (Oct 11, 2022)

While impressive, I'm disappointed. There were multiple "leaks" on Videocardz.com showing nearly double the performance of a 3090. This was looking like a 980 => 1080 generational jump. The 4090 isn't anywhere near that. Yes, 64ish% is a nice uplift, but I was hoping those "leaks" were true. The only reason I was looking to upgrade was for better VR performance, and right now, not even the 4090 at its current performance will be powerful enough for the upcoming high-end VR headsets. Heck, it won't even be fully capable with the currently available Pimax 8KX DMAS in all games. Maybe the simple games will be OK.

With rain effects and multiple cars in Assetto Corsa, I have to drop my Reverb G2 down to 50% resolution scale to get a constant 90fps on my 3080ti. Was hoping the 4090 would allow it to run at the full 100%, which should be possible if it truly was @ double the performance as the leaks showed.

They really need to release VR headsets with VRR or Gsync.


----------



## ThrashZone (Oct 11, 2022)

Hi,
Nice but this does not compel me to want to buy one.


----------



## 64K (Oct 11, 2022)

This GPU is a beast. Very impressive.
Good job Nvidia!


----------



## Easy Rhino (Oct 11, 2022)

Wow, all of those rehashed AAA titles like MW2 and all of those botched AAA titles like Cyber Punk that are unplayable will look amazing.


----------



## SOAREVERSOR (Oct 11, 2022)

I wonder why you didn't ask the most important question!!  Will it blend!  Then it hit me you can't fit this thing in a blender.


----------



## Space Lynx (Oct 11, 2022)

gridracedriver said:


> Impressed by the performance watt, better than I expected, the dlss3 however does not convince me, excellent performance but low quality for now.
> 
> The real problem remains the price, no perf / cost increase and the size, too big.




DLSS 3 is vaporware; maybe we will see it working properly in 5 games 6 months from now. PhysX flashbacks all over again.


Also, the JayzTwoCents review:

If I can get a 3090 or 3090 Ti in the $899-999 range, I may just do that. I couldn't care less about ray tracing, but I am waiting on RDNA3 first. I have a feeling the chiplet design is going to kick NVIDIA's ass (not in ray tracing, but in everything else).


----------



## Bwaze (Oct 11, 2022)

Thousand friggin' six hundred dollars.

I have to work for about two months for EU equivalent. 

Who is this card for? Optimistic miners?


----------



## Denver (Oct 11, 2022)

W1zzard said:


> I'm only using a small subset of Heaven, the most demanding area. I've seen it do over 500 FPS in that scene, so not completely CPU limited.
> I'll still look into a better test though, but that invalidates all previous data, so lots of retesting will be needed
> 
> 
> Will upgrade CPU later this year. I just spent two weeks in summer retesting on the current 5800X. Due to so many cards, games and resolutions I can't just switch easily


_Would it be possible to test just the 4090 with a stronger CPU for comparison purposes?_


----------



## Micko (Oct 11, 2022)

W1zzard said:


> I'm only using a small subset of Heaven, the most demanding area. I've seen it do over 500 FPS in that scene, so not completely CPU limited.
> I'll still look into a better test though, but that invalidates all previous data, so lots of retesting will be needed


In that case all is good, thank you! Still, it feels bad to see such poor scaling with an overclock.

The relatively low core/memory/hotspot temperatures on this reference model are a pleasant surprise. When the reference model brings top performance, aesthetics, and a cool and silent card, one might ask: what's the point of non-reference cards? Can't wait to read your reviews of custom models.


----------



## Ayhamb99 (Oct 11, 2022)

SOAREVERSOR said:


> I wonder why you didn't ask the most important question!!  Will it blend!  Then it hit me you can't fit this thing in a blender.


They managed to fit an entire blender inside another, larger one in one of the Blendtec videos, so I think it's plausible to fit a 4090


----------



## bogda (Oct 11, 2022)

How exactly is a 45% speedup "probably the largest jump gen-over-gen for many years"??
100% (3090 Ti) vs. 64% (2080 Ti) is roughly a 56% speedup.
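The arithmetic behind that figure, taking the relative-performance chart positions as stated in the post (a trivial check):

```python
# Relative-performance chart positions quoted above
# (3090 Ti = 100%, 2080 Ti = 64% on the same chart).
r_3090ti = 100
r_2080ti = 64

speedup = r_3090ti / r_2080ti - 1
print(f"3090 Ti over 2080 Ti: +{speedup:.0%}")  # +56%
```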


----------



## swirl09 (Oct 11, 2022)

CyberCT said:


> I'm disappointed. There were multiple "leaks" on Videocardz.com showing nearly double the performance of a 3090.


Check out some more reviews; there are games showing literally double the performance of a 3090 in rasterization.


Bit surprised by the power charts here; the 4090 is using less power in gaming than a 3090? :S


----------



## Space Lynx (Oct 11, 2022)

JayzTwoCents review, 18 minutes in: bend the cable gently, folks, lol

And I also finished the GamersNexus review. It's an impressive jump, but at the end of the day this is just for 4K gamers. I only care about 1440p 27" high refresh, so I am waiting for high-end RDNA3


----------



## Hofnaerrchen (Oct 11, 2022)

Really impressed by the performance and energy efficiency (especially if the power limit is reduced to 60% or an FPS limit is in place), but the dimensions are a different story, especially considering its energy efficiency.

Pricing is hard to judge right now. The 90-series of NVIDIA cards was always expensive compared to the rest of the stack, but without numbers for the 4080 16GB and 4070... ehm, sorry... 4080 12GB, it's just not possible right now. Until those get tested I'll keep my opinion: the 4080 is too expensive, the markup too high. And I fear prices for the 4070, 4060 and 4050 will also go up significantly, and NVIDIA will look even greedier than it currently does.

Compared to the 3090/3090 Ti the price is really good, especially considering prices from last year or even the beginning of this year.


----------



## Space Lynx (Oct 11, 2022)

guru3d review, ray tracing on at 4k for an old ass game like Legion... still only 60 fps. nah thanks bros, I'd rather be gaming at 144+ fps on a 144+hz monitor.

much more immersive experience that way.

I have said it all along, RT is the new Physx, neat, but too much of a performance hit.


----------



## yeeeeman (Oct 11, 2022)

Given DLSS is the biggest new feature of this generation, you've spent way too much time on raster performance alone, which, OK, isn't a huge jump, but that isn't really the point. Because if by this time next year 95% of games support DLSS 3, then this card will be 3x the performance of the 3090 Ti, which is actually FANTASTIC performance for the price...

And sure, DLSS 3 needs to be implemented by game devs, but given there are already numerous partnerships in place for DLSS 2, those will most likely add DLSS 3 support, and NVIDIA being NVIDIA will push it even more with other games, given it is IMO their biggest feature this generation.

I really don't agree with the people saying DLSS 3 is vaporware. It has its flaws, sure, but improving raster performance by 2-3x without something like DLSS 3 would mean a 2025-2028 timeframe. With DLSS 3 and AI we can get that NOW. That is 4K 144 Hz with RT on, which is great.


----------



## Prince Valiant (Oct 11, 2022)

BigMack70 said:


> It's obviously a card that performs incredibly well.
> 
> My point is that the hyperbole in the conclusion of this review reflects severe historical ignorance and is unwarranted. It performs incredibly well, and its incredible performance uplift is exactly in the ordinary window of expectations that has been occurring for over a decade as new generations of GPU get launched.
> 
> This is a "meets expectations" product concerning traditional performance. Which is perfectly fine - generational improvements are fantastic and game-changing! But to act like this is an "exceeds expectations" product when talking about plain rasterized performance is nonsense. The Ray Tracing and feature set is in many ways extra-ordinary and I think some hyperbole there appears justified. But not on the baseline performance as exists currently in this review's conclusion.


It's also made less impressive when making other considerations like the price increases generation to generation.


----------



## big_glasses (Oct 11, 2022)

Agree with others here: nothing exceptional (raster only), and at that price point it shouldn't be recommended, IMO.

Anyway, W1zzard, would it be possible to include a Unity game in the test list?
Also a per-API average FPS graph?

Thanks for the review


----------



## Space Lynx (Oct 11, 2022)

yeeeeman said:


> given dlss is the biggest new feature of this generation, you've spent way too much time on only raster performance which OK, isn't a huge jump, but it isn't the point really. Because if next year on this time, 95% of the games will support dlss 3 then this car will be a 3x the performance of 3090TI card, which is a FANTASTIC performance for the price actually...



What are you on about? Most games will not support DLSS 3. Even the games that most recently added DLSS 2 barely increase FPS at all. lol, check out some of @maxus24's reviews; he does comparisons. It does help the image quality some, but FPS gains have been very unimpressive lately. DLSS is overhyped. Hell, they were bragging about DLSS 3 for Cyberpunk 2077, but they couldn't even have it ready for launch; it's still too buggy. And that is for their flagship showcase game... let alone some small developer, or a console AMD port...


----------



## dir_d (Oct 11, 2022)

If this card cost $1000 or even $1200, it would have been a winner.


----------



## Vipeax (Oct 11, 2022)

This is the truly relevant data, in my opinion: run your 600 W-targeted cooling solution at 300 W for close-to-passive cooling and a much better room temperature.


----------



## HTC (Oct 11, 2022)

This chart was made by a youtuber using *nVidia's claimed performance of the card*: the Y axis is how much % faster the card was VS previous X80 card, so the GTX 480 was around 54% faster than the GTX 280, and so on and so forth.

@W1zzard : any chance you could do something similar to this *of the X90 class of GPUs*?


----------



## iO (Oct 11, 2022)

Massive performance but 2150€ is just ridiculous.

And ironically it's so fast that their hyped DLSS3 support becomes kinda moot.


----------



## Bwaze (Oct 11, 2022)

yeeeeman said:


> given dlss is the biggest new feature of this generation, you've spent way too much time on only raster performance which OK, isn't a huge jump, but it isn't the point really. Because if next year on this time, 95% of the games will support dlss 3 then this car will be a 3x the performance of 3090TI card, which is a FANTASTIC performance for the price actually...



No. Right now 95% of games don't support DLSS, and it has been around for two generations. And game developers now also have other options that don't require working closely with the graphics card maker / paying to implement.


----------



## yeeeeman (Oct 11, 2022)

CallandorWoT said:


> what are you on about? most games will not support DLSS3. even the games that most recently supported DLSS2, barely increase FPS at all.  lol check out some of @maxus24 reviews he does comparisons, it does help the image quality some, but fps gains have been very unimpressive lately. DLSS is overhyped. hell they were bragging about DLSS3 for Cyberpunk 2077, but they couldn't even have it ready for launch, its too buggy still. and that is for their flagship showcase game... let alone some small developer, or a console AMD port...


1. DLSS 3 is present in Cyberpunk; check other reviews...
2. You are not very well informed. DLSS 2 is an upscaling tech; DLSS 3 is a frame-generation tech, so it will most definitely add a lot of performance, especially in CPU-limited games. Read a bit more about how DLSS 3 works...


----------



## Pumper (Oct 11, 2022)

I expect RDNA3 to do great in raster, but AMD will be in trouble if the RT performance is trash like their current cards, seeing how more and more games are using raytracing.


----------



## yeeeeman (Oct 11, 2022)

Bwaze said:


> No. Right now 95% of games don't support DLSS, and it has been around for two generations. And now game developers have also other options that don't require working closely with graphics card maker / paying to be implemented.


It would be a bit childish to believe that NVIDIA spent this much silicon on Ada for DLSS 3 and won't push it with the devs. Also, the main game engines already support DLSS 3 integration out of the box, so newer games will have it...


----------



## BSim500 (Oct 11, 2022)

The only thing that interests me is seeing more efficient and sanely priced lower-end GPUs. My excitement for 4-digit flagship GPU hype is the same as for flagship smartphones: absolute zero.


----------



## qubit (Oct 11, 2022)

Looks beautiful, large and heavy. Would make a nice paperweight.


----------



## W1zzard (Oct 11, 2022)

Ayhamb99 said:


> While the power consumption at stock speeds wasn't as bad as i thought it would be, for anyone who is thinking of buying this thing and overclocking it, prepare yourself
> 
> View attachment 265002


Playing Furmark?


----------



## mrpaco (Oct 11, 2022)

Testing this GPU with a 5800X was a bad move.

Your review clearly shows a bottleneck at 1080p and 1440p vs. other reviews done with a 7950X or 12900KS.

While TPU shows an improvement over the 3090 of 20% at 1080p and 35% at 1440p, others with a better CPU, Igor's Lab for example, show improvements of around 60% at the same resolutions.


----------



## W1zzard (Oct 11, 2022)

CallandorWoT said:


> DLSS3 is vaporware, maybe we will see in 5 games working properly 6 months from now.





> Quick update on the availability of DLSS 3 games for the GeForce RTX 4090 launch. A few highlights, among others (a few more are expected this week):
> 
> •	SUPER PEOPLE: Early Access available tonight at 7 PM CET with DLSS 3
> •	Loopmancer: DLSS 3 update on October 12
> ...



(The above is translated from the German original.)

Enabling DLSS 3 is zero work if you have DLSS 2+Reflex, so most games just need to add Reflex, which takes a day or two



big_glasses said:


> Anyways, W1zzard, would it be possible to include a unity game to the test list.


Would love to. Any recommendation... that's not super CPU-limited due to the POS that Unity is?


----------



## clopezi (Oct 11, 2022)

CallandorWoT said:


> DLSS3 is vaporware, maybe we will see in 5 games working properly 6 months from now. Physx flashbacks all over again.



C'mon, you're not serious here. DLSS is implemented in so many games, especially the games that need it: AAA titles. DLSS 3 will be implemented in every single AAA game over the next years. NVIDIA has big market share, and developers need to put it in their games because it's important to many gamers.



CallandorWoT said:


> jayz2cents review


I watch many tech channels, and JayzTwoCents is the classic smart-ass guy who knows a little about everything and a lot about nothing. Pure clickbait.


----------



## Pumper (Oct 11, 2022)

bogda said:


> How exactly is 45% speed up "probably the largest jump gen-over-gen for many year"??
> 100% (3090Ti) vs 64% (2080Ti) is roughly 56% speed up.


Throw out the CPU limited games like Borderlands 3 and Far Cry 6, etc., and the result will be a lot more impressive.


----------



## Wasteland (Oct 11, 2022)

birdie said:


> The biggest issue of this generation is that other cards despite costing an arm and a leg are castrated too much, not to mention that NVIDIA had no qualms calling RTX 4070 an RTX 4080.
> 
> It even looks like there will be no RTX 4060/4050 this gen and Nvidia will continue selling the previous generation stock.


This is my concern too.  Though Ada's generational uplift is impressive, it seems like NVIDIA plans to top-load that uplift far more than they have in the past, when flagship/halo cards were typically only 10-15% faster than the more sane "high end options."  This is the first time, as far as I'm aware, that the halo card uses a whole different die--not to mention the fact that now NVIDIA will have two 80-class cards using different dies too, lol.

We'll see.  Sure looks like the "4080s" will be far less impressive relative to their predecessors than this card is.  And as you say, even the cheapest 4080 is out of most consumers' price range, so all of the hoopla over Ada is largely academic.  All in all, a good PR day for NVIDIA, but it's hard to look at the overall trend line, in terms of value, as positive.


----------



## bogda (Oct 11, 2022)

HTC said:


> View attachment 265010
> 
> This chart was made by a youtuber using *nVidia's claimed performance of the card*: the Y axis is how much % faster the card was VS previous X80 card, so the GTX 480 was around 54% faster than the GTX 280, and so on and so forth.
> 
> @W1zzard : any chance you could do something similar to this *of the X90 class of GPUs*?


It is pointless to compare x80 graphics cards because "x80" is only a marketing term. You can compare flagship vs flagship, or cards with the 102 chips.


----------



## nguyen (Oct 11, 2022)

Pumper said:


> Throw out the CPU limited games like Borderlands 3 and Far Cry 6, etc., and the result will be a lot more impressive.



Yup, the 4090 easily beats the 3090 by 80-100% at 4K with RT.


----------



## DemonicRyzen666 (Oct 11, 2022)

W1zzard, any plans later on to compare two RTX 3090 Tis in SLI/mGPU vs one RTX 4090?

Most likely going to be a no, because you focus on triple-A games that everyone has to be playing, for a mass audience. It makes the forum here feel like a gaming forum more than a PC enthusiast forum.

I do have a list of supposed mGPU games; they need confirmation, but it might help.


----------



## W1zzard (Oct 11, 2022)

mrpaco said:


> Testing this GPU with a 5800x was a bad move.


What CPU do you have in your computer?


----------



## yeeeeman (Oct 11, 2022)

@W1zzard: I think you should've spent more time/more graphs showing what potential DLSS 3 has.
People are looking too much at that ~40% perf increase in raster, which isn't the point of what NVIDIA is trying to achieve here.
Also, DLSS 3 brings lower power consumption: instead of 400+ W with raster only, you get 2x the FPS with DLSS at 300 W-ish.

People, try to get this into your heads: improving raster performance isn't a simple thing, and it isn't just about adding more SMs. You need to improve everything: memories, controllers, buses, algorithms, caches, etc. Increasing only raster would take 10 more years to give you what DLSS 3 brings now: 4K RT at 144 Hz. Just check other reviews with DLSS 3 and RT games and you'll see what the potential of the card really is... DLSS, and AI in general, is about doing rasterization the smart way instead of the hard, brute-force way...


----------



## Space Lynx (Oct 11, 2022)

W1zzard said:


> Germay, run it through Google Translate.
> 
> Enabling DLSS 3 is zero work if you have DLSS 2+Reflex, so most games just need to add Reflex, which takes a day or two
> 
> ...



This doesn't make any sense to me: then why is it not ready for Cyberpunk 2077 at the 4090 launch? They made a big deal of it during their press time.

CP2077 is literally their flagship showcase game for DLSS 3, so if it's that easy to do, that doesn't make any sense to me.



nguyen said:


> Yup, 4090 easily beat 3090 by 80-100% at 4K with RT



All well and good for people that like ray tracing, I'd rather game at higher frame rates though personally. 60 fps just is not immersive to me anymore.



clopezi said:


> C'mon, you are not serious here. DLSS are implemented in so many games, especially games that needs it, AAA. DLSS3 will be implemented in every single AAA game over the next years. Nvidia has a big market share and developers need to put in their games because it's important to many gamers.
> 
> 
> I see many tech channels and Jayztwocents it's the classic smart ass guy who knows little about everything and a lot about nothing. Pure clickbait.



I am serious. Go look at Maxus24; he has several reviews on the TPU website comparing DLSS and native in games, and yes, DLSS does make games look better, I will concede that 100%, but the performance gains really are not all that much; sometimes it's only a few FPS. 

I don't find FSR or DLSS or any of that new tech impressive personally. The whole point of it is to drastically increase FPS without hurting image quality, and it just doesn't give the boost to FPS I'd like to see. I will probably play most games in native mode.


----------



## metalslaw (Oct 11, 2022)

8 games over the overall 4K average of 153 FPS.

This card really should have come with DP 2.0 ports. The lack of them means this card isn't properly future-proof.

Hope the 4080 Ti and/or the 4090 Ti come with DP 2.0 ports when they arrive in several months' time.


----------



## Space Lynx (Oct 11, 2022)

metalslaw said:


> 8 games over the (the overall avg) 153fps for 4k.
> 
> This card really should have come with DP 2.0 ports. The lack of these ports means this card isn't properly future proof enough.
> 
> Hope the 4080ti or/and the 4090ti come out with DP 2.0 ports when they arrive in several months time.



Doesn't HDMI 2.1 already max out at something like 4K 240 Hz and 8K 144 Hz? I mean, who would ever need more than that, mate?
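For reference, the raw bandwidth such modes need can be estimated with a few multiplications. This is a simplified sketch (uncompressed RGB, blanking intervals ignored, not a spec-accurate FRL calculation); it shows why 4K 240 Hz over HDMI 2.1 relies on DSC compression:

```python
def raw_bandwidth_gbps(h_px: int, v_px: int, hz: int, bits_per_channel: int = 10) -> float:
    """Uncompressed RGB data rate in Gbit/s, ignoring blanking intervals."""
    return h_px * v_px * hz * bits_per_channel * 3 / 1e9

# 4K 240 Hz 10-bit needs ~59.7 Gbit/s raw, more than HDMI 2.1's ~42.7 Gbit/s
# of effective FRL payload, hence DSC; 1440p 165 Hz fits easily
print(round(raw_bandwidth_gbps(3840, 2160, 240), 1))  # 59.7
print(round(raw_bandwidth_gbps(2560, 1440, 165), 1))  # 18.2
```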


----------



## Dazz023 (Oct 11, 2022)

@W1zzard 
Does DLSS 3 work if you manually cap FPS? Let's say I want to keep the game below my monitor's refresh rate for G-SYNC, etc...


----------



## qubit (Oct 11, 2022)

W1zzard said:


> Playing Furmark?


The burning question is: would removing the Furmark power limiter cause it to consume something like 1000W, or would it just blow up?

It's worth your sacrifice...


----------



## nguyen (Oct 11, 2022)

CallandorWoT said:


> All well and good for people that like ray tracing, I'd rather game at higher frame rates though personally. 60 fps just is not immersive to me anymore.



For someone who never plays competitive games, you sure love your FPS LOL


----------



## W1zzard (Oct 11, 2022)

Dazz023 said:


> @W1zzard
> Does dlss3 work if you manually cap fps? Lets say I want to keep the game below my monitor refresh rate for gsync etc...


Yes, just not with V-Sync


----------



## Ayhamb99 (Oct 11, 2022)

W1zzard said:


> Playing Furmark?


You have a point, but nevertheless, overclocking this beast is going to require a very, very beefy PSU.


----------



## W1zzard (Oct 11, 2022)

Ayhamb99 said:


> You have a point but nevertheless overclocking this beast is going to require a very very beefy PSU.


Like 600 W GPU + 300 W CPU ?


----------



## Arco (Oct 11, 2022)

Entire grid explodes.


----------



## FeelinFroggy (Oct 11, 2022)

Kovoet said:


> Expect more for that price and size. Will stick with my 3080 but will try the arc for my cheaper rig



Expect more? It doubles the FPS of your 3080. Of course it's expensive: it is the best GPU in the world. Maybe you should reconsider your expectations; I guess doubling the performance of your 3080 is not good enough. It's easy to complain about the price, but explain which '90 or Titan cards were ever considered reasonably priced?


----------



## HTC (Oct 11, 2022)

bogda said:


> It is pointless to compare x80 graphic cards because it is only a marketing term. You can compare flagship vs flagship card or cards with 102 chips.


No, because NVIDIA changed that: the 4080 series of cards apparently uses 103 AND 104 chips, so it wouldn't be an "apples to apples" comparison.

The point is to see how much faster a 4080 or, in this case, a 4090 is than its predecessor, *and compare that to the percentage increases of previous generations of cards*.


----------



## Space Lynx (Oct 11, 2022)

nguyen said:


> For someone who never play competitive games, you sure love your FPS LOL



I do. Take The Witcher 3, for example: in sword combat at 60 to 90 FPS it's a great game, sure, but then play it at 165 FPS on a 165 Hz monitor and the sword is so smooth you can see every intricate detail of its movement through the air. Very immersive, IMO.

And immersive = more fun.

Now, I will concede that ray tracing done right would be amazing, but I am not willing to go back to 90 FPS or lower, unless it's an indie game like Wizard of Legend. I love playing Wizard of Legend at 60 FPS, but that game has very simple 2D graphics, so I can't tell the difference in that case.


----------



## Xex360 (Oct 11, 2022)

Very fast but at more than twice the price of the 3090 it isn't impressive.


----------



## Space Lynx (Oct 11, 2022)

Xex360 said:


> Very fast but at more than twice the price of the 3090 it isn't impressive.



I actually think the 4090 is a good price; you need to think of it as the new Titan, though. This card isn't meant for average gamers; it's meant for 4K stuff and 4K gamers pretty much exclusively. I honestly am not sad at all that it's out of my price range. I just hope RDNA3 can get me to 165 FPS on a 165 Hz 1440p monitor in games like Cyberpunk 2077 with ray tracing turned off.

That's all I want. lol


----------



## damric (Oct 11, 2022)

Looking at that 152 FPS average at 4K, it's finally time to test the most powerful cards at 8K.

Those ROPs they crammed in there will like that.


----------



## Bwaze (Oct 11, 2022)

W1zzard, any chance of a Microsoft Flight Simulator benchmark? Even NVIDIA was showing off 4090 performance in that simulator, especially with DLSS 3.

Also, does any review site do VR benchmarks, at high resolutions of headsets like PIMAX 8K (3840 × 2160), Varjo VR-3 (2880 × 2720), HTC Vive Pro 2 (4896 x 2448), HP Reverb G2 (4320 x 2160)...?


----------



## ThrashZone (Oct 11, 2022)

W1zzard said:


> Like 600 W GPU + 300 W CPU ?


Hi,
If you feel lucky that will be enough


----------



## W1zzard (Oct 11, 2022)

Bwaze said:


> W1zzard, any chance of Microsoft Flight SImulator benchmark? Even Nvidia was showing off 4090 performance in that simulator, especially with DLSS 3.


It's extremely CPU limited. The magic of DLSS 3 is that it doubles frames without consuming CPU or GPU, so it can double frames of CPU limited games too. And no other solution can achieve that, which is why NVIDIA is showcasing it

Also it takes forever to load, is super horrible to benchmark and crashed fairly often on me. I had several WTF moments while taking that video


----------



## THU31 (Oct 11, 2022)

Amazing piece of technology. I would never buy a card like this, but I do not have a problem with a ridiculous flagship product like this existing. This is so much better than SLI ever was (especially the dual-GPU cards). This card will even be whisper quiet after undervolting.

The problem is with the rest of the line-up, at least for now. Maybe AMD can make them lower the prices. If not, we will have to wait until they sell out their 30 series stock.


----------



## Space Lynx (Oct 11, 2022)

W1zzard said:


> It's extremely CPU limited. The magic of DLSS 3 is that it doubles frames without consuming CPU or GPU, so it can double frames of CPU limited games too. And no other solution can achieve that, which is why NVIDIA is showcasing it
> 
> Also it takes forever to load, is super horrible to benchmark and crashed fairly often on me. I had several WTF moments while taking that video



DLSS3 is a dream come true then if it really does manifest itself in this way and for many games not just a handful. Hopefully in 6 months we will have a better picture if it will go the way of Physx or if it will actually make my dreams come true. All I want is high fps, so I am more than happy to give Nvidia the benefit of the doubt for now.


----------



## Fleurious (Oct 11, 2022)

GerKNG said:


> TWO TO FOUR TIMES FASTER!!!


They obviously meant the speed at which cash leaves your wallet


----------



## W1zzard (Oct 11, 2022)

CallandorWoT said:


> DLSS3 is a dream come true then if it really does manifest itself in this way and for many games not just a handful. Hopefully in 6 months we will have a better picture if it will go the way of Physx or if it will actually make my dreams come true. All I want is high fps, so I am more than happy to give Nvidia the benefit of the doubt for now.


I feel like every game that supports DLSS will have DLSS 3. For a gamedev it costs nearly nothing to add, one or two days of work, and I'm sure NVIDIA will show its appreciation in many ways.


----------



## the54thvoid (Oct 11, 2022)

@W1zzard 







Can this be done for the 3090 FE, for a side-by-side comparison?

As for the card: aye, it's excellent. Unfortunately, it's for those that have. The have-nots don't seem to have an option from NVIDIA for the 40 series if the 4080 12 GB is coming in at over 900 bucks.

NVIDIA has carved a new product niche for its GPUs, and I'll gladly walk away from their brand. Of course, if AMD does the same, hell, I'll give up on PCs for gaming. I mean, this is how you drive people to consoles.


----------



## RandallFlagg (Oct 11, 2022)

HD64G said:


> Me thinks that using 5800X instead of 5800X3D or 7700X is bottlenecking 4090 up to 1440P. That also keeps power consumption lower and increased the efficiency. My 5c.



TechSpot uses a 5800X3D and they still got CPU bound on some titles.

I don't think even a 12900KS will escape being CPU bound on some titles with this card, and it's faster than Zen 4 in games in aggregate when both are equipped with DDR5-6000, as TPU uses.


----------



## CyberCT (Oct 11, 2022)

swirl09 said:


> Check some more reviews out, there are games showing literally double the performance of a 3090 is rasterization.
> 
> 
> Bit surprised by the power charts here, the 4090 is using less power in gaming than a 3090? :S



Found one review that has some VR data:

The RTX 4090 - 45+ Games, Pro Apps & VR Performance
The RTX 4090 Founders Edition Arrives at $1599 - Performance Revealed - 45+ Games, VR, SPEC, Pro App & Workstation & GPGPU
babeltechreviews.com




Here are the performance increases in these games, compared to a 3090:

64.4%
67.2%
84.4%
72.0%
70.0%

And this is a Valve Index at 150% resolution, which is still lower than the native resolution of the HP Reverb G2 V2.
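Averaging the quoted uplifts is a quick sanity check; this snippet uses only the five figures listed above:

```python
# The five per-game gains over the 3090 quoted above, in percent
uplifts = [64.4, 67.2, 84.4, 72.0, 70.0]

mean_uplift = sum(uplifts) / len(uplifts)
print(f"average uplift: {mean_uplift:.1f}%")  # average uplift: 71.6%
```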


----------



## LFaWolf (Oct 11, 2022)

A lot of people are missing the point: this is a halo, flagship card. It is not for everyone. If you already have a 3080, chances are you should be looking at the 4080 (whatever the variant) or not upgrading generation over generation. The 4090 is for well-to-do enthusiasts, or power users who need all the VRAM for professional work and have the discretionary income to match. If you have to look at the performance-to-price ratio, this card is not for you.


----------



## W1zzard (Oct 11, 2022)

the54thvoid said:


> Can this be done for the 3090FE for a side by side comparison.


3090 FE will be the same on all due to power limiter. I touched on this in the conclusion's power paragraph


----------



## HTC (Oct 11, 2022)

LFaWolf said:


> this is a halo, flagship card. If is not for everyone



Isn't that supposed to be the 4090 Ti?


----------



## codex5600x (Oct 11, 2022)

Wow, very good improvements, but it is so expensive.


----------



## GhostRyder (Oct 11, 2022)

I mean, the power consumption seems high to me but I think for the performance jump it sort of justifies it.  I mean the Ray Tracing performance is great on this card and actually makes me want to get one and start using ray tracing!

Still though, my issue is the price point as its pretty high for what it is.  I will be curious where the RTX 4080 sits compared to this and when the RTX 4080ti comes out to close that gap.

Also that cooler...  Geez that thing is massive!  I mean, good grief it performs well but that card has to be the biggest one I have seen.  At least if you don't include the crazy aftermarket weird cards that have been made in the past.  Though to be fair even the crazy dual GPU cards were not that big LOL.


----------



## codex5600x (Oct 11, 2022)

codex5600x said:


> Wow,very good improvements, but is so expensive


I think AMD will not have any chance.


----------



## Space Lynx (Oct 11, 2022)

From TechSpot:

Nvidia GeForce RTX 4090 Review
This is our first look at Nvidia's new flagship GeForce RTX 4090 graphics card. We'll find out all you need to know about this next-gen GPU, most...
www.techspot.com

Honestly, I concede: the 4090 and DLSS 3 are impressive. If more and more games get DLSS 3... I mean, this may be the last GPU anyone needs for like 10 years. I honestly think the price is justified for this reason alone. Games may get more demanding, sure, but I think that is more a thing of the past; take Unreal Engine 5, for example: it's gorgeous and really is not that taxing. Engines are getting better and better. Combine that with FSR 2.0 games and DLSS 3 games, and the 4090 is a very valuable card, all things considered.


----------



## PapaTaipei (Oct 11, 2022)

Less than 15% improvement vs the 3090 Ti at 1080p, if you don't care about that useless RT BS. Amazing! So much power draw, and so many transistors wasted on RT and tensor cores... BTW, the article would attract more people if it included CSGO and Overwatch 2. 

Amazing to see this GPU with 78 billion transistors get fewer FPS than a 3080 (less than 28 billion transistors) in some games that do not use RT at 1080p, all while drawing up to 600 watts!!! This has to be a new record of stupidity.


----------



## mrpaco (Oct 11, 2022)

CallandorWoT said:


> from techspot
> 
> 
> 
> ...


Especially if you like shit visual quality, a blurry image, and artifacts everywhere.

Pretty amazing technology, that DLSS.


----------



## LFaWolf (Oct 11, 2022)

HTC said:


> Isn't that supposed to be the 4090 Ti?


Sure, but it is not out yet, is it?


----------



## Nordic (Oct 11, 2022)

What was the power used with the 600w overclock? I am also curious how bad the performance per watt is at 600w.


----------



## thunderingroar (Oct 11, 2022)

Holy CPU bottleneck. Please update the test system to 13th gen or 3D Zen 4 when they come out.


----------



## LFaWolf (Oct 11, 2022)

PapaTaipei said:


> Less than 15% improvements vs 3090ti on 1080p and if you don't care about that useless RT bs. Amazing! So much power draw and wasted transistors on RT and tensor cores... Btw the article would attract more ppl if it included CSGO and Overwatch 2.
> 
> Amazing to see this GPU with 78 BILLION transistors has less FPS vs a 3080 (less than 28b transistors) on some games that do not use RT and on 1080p all the while using up to 600 WATTS!!! This has to be a new record of stupidity.


4090 and game at 1080p?  Okay…


----------



## GerKNG (Oct 11, 2022)

TheinsanegamerN said:


> a 45% improvement over an already insane card is unremarkable?


Yes.
Nothing special for a big node change.
The jump from the 7800 GTX to the 8800 GTX was 2x, and depending on the workload even more.
And it didn't cost €2,500...


----------



## Trompochi (Oct 11, 2022)

Nice card, but I'm still very happy with my old 2080 Ti for 1440p.


----------



## PapaTaipei (Oct 11, 2022)

LFaWolf said:


> 4090 and game at 1080p?  Okay…


What is your problem? Yes, if you play competitively at 400-600 FPS.


----------



## HTC (Oct 11, 2022)

LFaWolf said:


> Sure, but it is not out yet, is it?



No, it isn't.

However, a "halo" product is supposed to be the "top of the line", and the 4090 IS NOT the "top of the line", regardless of it having been launched or not, because we know *it's already been announced*: were this not the case, then you'd have a point.


----------



## solarmystic (Oct 11, 2022)

Trompochi said:


> Nice card, but I'm still very happy with my old 2080 Ti for 1440p.


Same here, but on a 1080p 144 Hz monitor; I'd expect this card to be enough for the next 3-4 years at that resolution.


----------



## ARF (Oct 11, 2022)

A 45% performance improvement is really bad. And no, it's not the CPU; if you think a faster CPU will make a big difference, don't expect it.

Also, it's not really the time for 8K gaming just yet; between 4K and 8K there are other resolutions, such as 5K, 6K and 7K.



HTC said:


> View attachment 265010
> 
> This chart was made by a youtuber using *nVidia's claimed performance of the card*: the Y axis is how much % faster the card was VS previous X80 card, so the GTX 480 was around 54% faster than the GTX 280, and so on and so forth.
> 
> @W1zzard : any chance you could do something similar to this *of the X90 class of GPUs*?



lol, the 4080 16 GB is the right last entry, not the rebadged RTX 4070 (the "4080 12 GB", meh!).


----------



## gridracedriver (Oct 11, 2022)

+50% perf/W over the 6950 XT; AMD just needs a 350 W (-100 W) card to match this 4090, at least in raster.

https://tpucdn.com/review/nvidia-geforce-rtx-4090-founders-edition/images/energy-efficiency.png

We will see whether AMD has pushed RT too or not; at a minimum they have to do 2x over RDNA2.


----------



## bhappy (Oct 11, 2022)

I wonder how much the 5800X is bottlenecking the 4k performance of this card.


----------



## Dirt Chip (Oct 11, 2022)

I'm horrified by the physical size of this GPU, and it's the smallest of the bunch.
Maybe the cooler is of the finest quality and performs remarkably, but just seeing that brick of a thing, you can tell it's an unbalanced product: just too big for the purpose it serves. The 3rd-party cards coming tomorrow will be just huge abominations. 

We need a new metric, performance per m^3 (volume), to factor in the disproportionate increase in physical size compared to the previous gen.
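Such a metric is easy to compute. A minimal sketch, where the dimensions (roughly the 4090 FE's published ~304 x 137 x 61 mm) and the review's ~152 FPS 4K average are used as illustrative inputs:

```python
def perf_per_litre(avg_fps: float, length_mm: float, width_mm: float, height_mm: float) -> float:
    """FPS per litre of card volume: a rough 'size efficiency' metric."""
    volume_l = (length_mm * width_mm * height_mm) / 1_000_000
    return avg_fps / volume_l

# Illustrative numbers for the 4090 FE; treat as approximations
print(round(perf_per_litre(152, 304, 137, 61), 1))  # 59.8
```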


----------



## HTC (Oct 11, 2022)

ARF said:


> lol, 4080-16 is the right last entry, not the rebadged RTX 4070 (""4080-12"" meh!).



Take it up with NVIDIA: they're the ones who decided to make TWO versions of the 4080, one of which is CLEARLY WORSE than the other.


----------



## ARF (Oct 11, 2022)

Dear Lord


----------



## LFaWolf (Oct 11, 2022)

HTC said:


> No, it isn't.
> 
> However, a "halo" product is supposed to be the "top of the line", and the 4090 IS NOT the "top of the line", regardless of it having been launched or not, because we know *it's already been announced*: were this not the case, then you'd have a point.


Uh, when and where was 4090 ti announced?


----------



## SOAREVERSOR (Oct 11, 2022)

thunderingroar said:


> Holy cpu bottleneck, pls update test system to 13th gen or 3D zen4 when they come out



Competitive players are at 1080p on 24-inch monitors that do 240 or 360 Hz, and they do not enable G-SYNC/FreeSync. Many of them don't have to pay for their gear, either.


----------



## DemonicRyzen666 (Oct 11, 2022)

Dirt Chip said:


> I'm horrified by the physical size of this GPU, and it's the smallest of the bunch..
> Maybe the cooler is of the finest quality and performing remarkably but just seeing that brick of a thing you can tell it's unbalanced product- just too big for the purpose it serve. The 3rd party coming tomorrow will be just huge abominations.
> 
> We need a new metric: preformance per m^3 (volume) to factor the (un-parallel) increase in physical size, compare to previous gen.



Metric: performance per mm², along with performance per watt per mm², averaged.


----------



## ARF (Oct 11, 2022)

gridracedriver said:


> + 50% perf / watt on the 6950XT, AMD just needs a 350watt (-100watt) card to match this 4090 at least in Raster
> 
> 
> https://tpucdn.com/review/nvidia-geforce-rtx-4090-founders-edition/images/energy-efficiency.png
> ...



Yes, the good news today is that AMD has a chance to be very very competitive. Can't wait for the presentation next month!


----------



## R0H1T (Oct 11, 2022)

LFaWolf said:


> Uh, when and where was 4090 ti announced?


It's planned for sure!

NVIDIA RTX Titan Ada: Four-slot and full AD102 graphics card shelved after melting PSUs
Reportedly, NVIDIA was developing an even more powerful version of the GeForce RTX 4090. Billed as the Titan Ada or the RTX 4090 Ti, the graphics card is thought to be a whopping four slots thick and requires twin 16-pin power connectors. Supposedly, NVIDIA has ceased development though, with...
www.notebookcheck.net

----------



## defaultluser (Oct 11, 2022)

Xex360 said:


> Very fast but at more than twice the price of the 3090 it isn't impressive.




Yeah, but identical launch-day prices.


----------



## HTC (Oct 11, 2022)

LFaWolf said:


> Uh, when and where was 4090 ti announced?



Ooooops: my bad.

I was going by something taken from Tom's Hardware, but that is a rumor rather than an announcement.

That said, I seriously doubt NVIDIA won't release a Ti version of the 4090: they're likely waiting for AMD to launch their cards AND THEN use the 4090 Ti to counter AMD's best.


----------



## SOAREVERSOR (Oct 11, 2022)

R0H1T said:


> It's planned for sure!
> 
> 
> 
> ...


Shelved after melting PSUs....


----------



## W1zzard (Oct 11, 2022)

bhappy said:


> I wonder how much the 5800X is bottlenecking the 4k performance of this card.


Compare it to the 1080p FPS. 
Same number? Bottleneck (Borderlands 3, DoS II).
Lower number? No bottleneck.
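That rule of thumb can be written as a tiny helper; the FPS numbers and the 3% tolerance in this sketch are illustrative, not taken from the review:

```python
def cpu_bound(fps_1080p: float, fps_4k: float, tolerance: float = 0.03) -> bool:
    """If 4K FPS roughly equals 1080p FPS, the CPU (not the GPU) is the limit."""
    return abs(fps_1080p - fps_4k) / fps_1080p <= tolerance

# Illustrative numbers
print(cpu_bound(180.0, 178.0))  # True  -> CPU limited
print(cpu_bound(240.0, 120.0))  # False -> GPU still has headroom at 4K
```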


----------



## outpt (Oct 11, 2022)

Very good review; thanks for taking the time to process so much data. The newer chips should be tested at some point, as they should be much stronger.


----------



## LFaWolf (Oct 11, 2022)

HTC said:


> Ooooops: my bad.
> 
> I was using this, taken from tom's hardware:
> 
> ...


Therefore, *at this moment*, the 4090 is the halo card, just as the 3090 was the halo card when it was released.

Of course they will release a 4090 Ti at a later time; the 4090 is not a fully enabled chip.


----------



## nguyen (Oct 11, 2022)

W1zzard said:


> Compare it to 1080p FPS. Same number? Bottleneck, lower number? No bottleneck



A 5800X + 16 GB of RAM is simply not enough to avoid bottlenecking the 4090 at 4K; the ray tracing results show how much further the 4090 can stretch its legs without CPU+RAM bottlenecks.


----------



## ARF (Oct 11, 2022)

@W1zzard Would you make this page active? NVIDIA GeForce RTX 4090 Specs | TechPowerUp GPU Database


----------



## R0H1T (Oct 11, 2022)

Wow look at that *insane *VFM


----------



## W1zzard (Oct 11, 2022)

nguyen said:


> the Ray Tracing results show how much further 4090 can stretch its leg


You mean relative to Ampere? That's because of the improved RT units and the additional power headroom


----------



## vlad.coolish (Oct 11, 2022)

For the next tests of the 4090, please run an experiment with the power limit at -100 W, -150 W and -200 W from the current 460 W max,
and check the efficiency (performance per watt) afterward.
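The requested metric is just FPS divided by board power. A sketch with made-up sweep numbers (not measured data), showing how efficiency typically rises as the limit drops because FPS falls off more slowly than power:

```python
def efficiency(avg_fps: float, board_power_w: float) -> float:
    """FPS per watt at a given power limit."""
    return avg_fps / board_power_w

# Hypothetical (limit in W, average FPS) pairs for illustration only
for limit_w, fps in [(450, 152), (350, 147), (300, 140)]:
    print(limit_w, round(efficiency(fps, limit_w), 3))
```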


----------



## clopezi (Oct 11, 2022)

Would be interesting to see the performance on PCIe 3.0...


----------



## amit_talkin (Oct 11, 2022)

Intimidating !


----------



## Vayra86 (Oct 11, 2022)

I'm starting to understand why NVIDIA only released two SKUs: the rest is just not worth it versus Ampere.
Basically, Ada is what Ampere should've been.

Perf/W is the only good thing I see here, but they had catching up to do anyway.


----------



## THU31 (Oct 11, 2022)

PapaTaipei said:


> Less than 15% improvements vs 3090ti on 1080p and if you don't care about that useless RT bs. Amazing! So much power draw and wasted transistors on RT and tensor cores... Btw the article would attract more ppl if it included CSGO and Overwatch 2.
> 
> Amazing to see this GPU with 78 BILLION transistors has less FPS vs a 3080 (less than 28b transistors) on some games that do not use RT and on 1080p all the while using up to 600 WATTS!!! This has to be a new record of stupidity.


You are blaming the card for bad 1080p performance? Do you even know anything about how video games work?


----------



## outpt (Oct 11, 2022)

These will sell like hotcakes.
Will you do an RT-only review, or does that go beyond the call of duty?


----------



## defaultluser (Oct 11, 2022)

Hey man, proof that Doom Eternal is such an easy game to render: it gets almost the same performance as with a 3300X at 1080p.

But now we can see just how much CPU headroom the GPU test platforms have left.


----------



## mrpaco (Oct 11, 2022)

W1zzard said:


> What CPU do you have in your computer?


Does it matter? I'm not a journalist trying to make a review and showcase the true power of a product; that question is just a fallacy.

If you are artificially limiting the product, how can people know what its true unlocked performance is?

Do you do CPU reviews with a €10 cooler that would throttle the chip by 50%? Then why do it here?

Don't get me wrong, I am happy to see this review vs others with the latest CPU, because I can now see something we wouldn't see under other circumstances. But at the same time, it was not the best choice to display the 4090. Adding the top-tier CPU changes it so much that maybe there should have been a second CPU test, like the Ryzen 7000 ones with air cooling and AIOs at the same time; but if you are going to do only one, then I think this was the wrong choice. Just my 2c.


----------



## EatingDirt (Oct 11, 2022)

Fluffmeister said:


> Damn Ray Tracing performance has taken quite a leap up, impressive indeed!
> 
> And it's actually rather more efficient too despite what was expected.


Raytracing is only a ~5% improvement over the 3090 Ti in terms of frame cost (original FPS vs raytraced FPS). It's just a much faster card overall, so it's much faster with raytracing enabled too, but it's not much more efficient at raytracing.

I'd say the very small raytracing improvement is the biggest surprise for me.


----------



## arandomguy (Oct 11, 2022)

@W1zzard 

The clock states table on page 40, "Efficiency and Clock Speeds", seems to show the numbers from the Intel Arc A770.


----------



## Razrback16 (Oct 11, 2022)

Nice writeup, thanks. I can easily afford the 4090, or even several of them, but I simply will not pay $1,600 for a video card. Not gonna happen. Especially for one that doesn't even have the correct cooling solution installed; a premium card like that should have a waterblock, period. 

So, cut that price in half and offer it with a full-cover waterblock and I'd buy one immediately. Otherwise, I'm happy to wait for the second-hand market in a couple of years.


----------



## R0H1T (Oct 11, 2022)

mrpaco said:


> does it matter? I'm no journalist trying to make a review and showcase the true power of a product, that question is just a fallacy
> 
> if you are limiting artificially the limits of the product how people can know what the true unlocked performance is?
> 
> ...


Artificially how? Are you going to subsidize every household to get a (free) upgrade to a 13900KS or 7950X3D to pair with a 4090/Ti? 

*Most users out there will have an inferior platform* to the one used here; in fact, more than 90% of them will!


----------



## grammar_phreak (Oct 11, 2022)

ARF said:


> Dear Lord
> 
> View attachment 265026


Someone should make a GPU cooler that resembles a retro 80's Boom Box.


----------



## AusWolf (Oct 11, 2022)

The table on the "clock speeds" page shows the Intel Arc cards' clock speeds.

Nice review by the way.  It's a shame the card is pretty underwhelming considering its price. Only slightly faster than the 6950 XT for double the price. What a massive slap in the face for the 5 green fans who will buy it!


----------



## W1zzard (Oct 11, 2022)

ARF said:


> @W1zzard Would you make this page active? NVIDIA GeForce RTX 4090 Specs | TechPowerUp GPU Database


Done



clopezi said:


> Would be interesting see the performance on PCIe 3.0...


Soon



defaultluser said:


> hey man, proof that doom eternal IS SUCH AN EASY GAME TO RENDER, ITS ALMOST GOT THEW SAME PERFORMANCE AS A 3300X AT 1080P :


Different test system, different game version, different graphics drivers, slightly different test scene



mrpaco said:


> do you make CPU reviews using a 10€ cpu cooler that would throttle the device at 50%?


I tested that, throttling is actually minimal: https://www.techpowerup.com/review/amd-ryzen-9-7950x-cooling-requirements-thermal-throttling/
Let me retest on 13900K and we'll talk again soon



arandomguy said:


> The clock states table on page 40 "Efficiency and Clock Speeds" seems be the numbers from the Intel Arc 770.


I knew that looked wrong, fixing


----------



## AdmiralThrawn (Oct 11, 2022)

Not impressed with the price chart TPU is using here. You cannot use current Amazon prices to compare against the launch price of the 4090. The RTX 3080 is not $660. Please fix this.

I am not defending the $1,600 price, but it is incredibly dishonest to say the 3090 is 900 bucks without stating that you are using current eBay/Amazon prices instead of launch pricing.


----------



## Legacy-ZA (Oct 11, 2022)

For that price and power draw, this "thing" is an abomination. I expected more performance to make up for it, much more.


----------



## RandallFlagg (Oct 11, 2022)

codex5600x said:


> I think amd will not have eny chances



Actually for the first time in a long time, I'm interested in what AMD provides.

The real game is in the mid and upper midrange.  Call it below the 6800.  Stuff that costs <=$500.

It's entirely possible to win in the midrange, and lose at the halo product or perhaps not even try to make a monster like the 4090.  

Beyond just cost and power, that thing will not even physically fit in a lot of ATX cases.


----------



## grammar_phreak (Oct 11, 2022)

mrpaco said:


> does it matter? I'm no journalist trying to make a review and showcase the true power of a product, that question is just a fallacy
> 
> if you are limiting artificially the limits of the product how people can know what the true unlocked performance is?
> 
> ...


If you know how to read benchmarks, you can see where there is a CPU bottleneck and where there isn't, or where there is a limitation in the game engine. Testing with a 7950X isn't going to magically make a 3090 look any better or worse.
Since all the cards in the chart are tested with the same setup, the results aren't going to be a whole lot different from JayzTwoCents', especially at 4K where it really matters.


----------



## Bwaze (Oct 11, 2022)

AdmiralThrawn said:


> Not impressed with the price chart TPU is using here. You cannot use current amazon prices to compare against the launch price of the 4090. The RTX 3080 is not 660$. Please fix this.
> 
> I am not defending the 1,600 dollar price but it is incredibly dishonest to say the 3090 is 900 bucks and not state that you are using current ebay/amazon prices instead of launch pricing.


It says just that, and that is actually relevant to buyers deciding right now what to buy. What should it be compared to? Release MSRP? Increased mid-generation MSRP? The highest prices reached during the crypto insanity (in which case the RTX 4090 would actually look affordable: cheaper than the RTX 3080, and around the maximum price the RTX 3070 reached)?

"We looked up each card's current USD price on Newegg and used that and the relative performance numbers to calculate a performance-per-dollar index."


----------



## Dimitriman (Oct 11, 2022)

After reading the 4090 reviews, I feel like this now:


----------



## clopezi (Oct 11, 2022)

I really hope AMD puts up a fight, because that's good for everyone, but without DLSS (FSR's current implementation is like DLSS five years ago) and without good RT performance, they can't compete on equal terms...


----------



## Bwaze (Oct 11, 2022)

I'm worried that the RTX 4080 16 GB will trail quite a bit and will actually have much worse price/performance than the RTX 3080 (since it's going to launch at 1500+ EUR in the EU); at least the numbers for how much it's cut down compared to the 4090 suggest so. And the fake RTX 4080 (the one with 12 GB) for 1200+ EUR will be even worse.

And people will still buy Ada cards, just because the lineup has its pinnacle in the RTX 4090, even if the mid-range is slower than AMD...


----------



## grammar_phreak (Oct 11, 2022)

I really wish SLI and multi-GPU setups would have evolved.


----------



## W1zzard (Oct 11, 2022)

AdmiralThrawn said:


> Not impressed with the price chart TPU is using here. You cannot use current amazon prices to compare against the launch price of the 4090. The RTX 3080 is not 660$. Please fix this.











GeForce RTX 3080 GPUs / Video Graphics Cards | Newegg.com
Shop GeForce RTX 3080 GPUs / Video Graphics Cards on Newegg.com. Watch for amazing deals and get great pricing.
www.newegg.com




using launch pricing from years ago makes no sense for a review posted today


----------



## HTC (Oct 11, 2022)

Bwaze said:


> I'm worried that RTX 4080 16 GB will trail quite a bit, and will actually have much worse price / performance than RTX 3080 (since it's going to launch at 1500+ EUR in EU) - at least the numbers of how much it's cut compared to 4090 seem so. And the fake RTX 4080 (the one with 12 GB) for 1200+ EUR will be even worse.
> 
> And the people will still be buying Ada cards, just because they have the pinnacle in RTX 4090 - even if the mid range will be slower than AMD...



Which is why I requested this.

Actually, it would also be good to verify/disprove it for both versions of the 4080 as well.

It should be possible to do this by using TPU's own reviews of the cards and comparing the % gains from one generation to the next at launch.

On a more personal note, I'm so out of touch with NVIDIA's GPUs that I just noticed there's no 2090 card ... facepalm ...



grammar_phreak said:


> I really wish SLI and multi-GPU setups would have evolved.



They have, in a much more "internal" way: they're called chiplets now.


----------



## W1zzard (Oct 11, 2022)

grammar_phreak said:


> I really wish SLI and multi-GPU setups would have evolved.


they have, they are dead now


----------



## Parn (Oct 11, 2022)

If the rest of the Ada lineup can keep up with this efficiency, I might get myself a 4080, provided the price is within a reasonable range (< £750; I could be dreaming, based on the astronomical price of the 4090).


----------



## P4-630 (Oct 11, 2022)

Just 66 degrees @ 35.1dB during gaming really isn't bad for a FE imo....


----------



## Dirt Chip (Oct 11, 2022)

Vayra86 said:


> I start to understand why Nvidia only releases two SKUs. The rest is just not worth it versus Ampere.
> Basically, Ada is what Ampere shouldve been.
> 
> Perf/watt is the only good thing I see here. But they had catching up to do anyway.


I think it's the opposite: going by the 4090's high efficiency, the lower Ada cards will be so much better than Ampere perf/watt-wise that if they were out now, no one would buy any Ampere. NV would be stuck with all the stock.
I will patiently wait for a 10-12 GB Ada GPU down the road with a 150 W TDP.


----------



## dick_cheney (Oct 11, 2022)

Parn said:


> IF the rest of the Ada lineup could keep up with the efficiency, I might get myself a 4080 provided the price is within reasonable range (< £750, could be dreaming based on the astronomical price of the 4090).


The 4070 is $899 USD / £949 GBP and the 4080 is $1,199 USD / £1,269 GBP.


----------



## AdmiralThrawn (Oct 11, 2022)

W1zzard said:


> GeForce RTX 3080 GPUs / Video Graphics Cards | Newegg.com
> 
> 
> Shop GeForce RTX 3080 GPUs / Video Graphics Cards on Newegg.com. Watch for amazing deals and get great pricing.
> ...


I would argue that it does make sense when using price-to-performance or comparing on a price chart. It would be like saying a 2008 Chevy Silverado wins the price chart because the 2020 model is $40,000 but you can get a 2008 for $10,000. That is not a fair comparison, especially when you take into account that the 3090 Ti launched at $1,999.99, a much higher price than $1,600.

Also, in the Newegg link you posted there was not one single 3080 for $660, but the review lists the 3080 at $660.

I'm not trying to be rude or argue with you here; I just don't think it's 100% honest. A possible solution could be adding a launch-price column to the chart, with a current-price column next to it. But I'm not really an expert on review formats, so forgive me if that sounds stupid. Thanks


----------



## Blueberries (Oct 11, 2022)

@W1zzard

I would really appreciate a side by side of 2x TSAA / 4xTSAA / DLSS 3.0 Quality / DLSS 3.0 Performance

I've been using a 28" 4K panel for 6-7 years now and what I've noticed in that time is that I have a very difficult time realizing a difference in smoothness past 2X at that high of a resolution. I'm curious how that compares to DLSS 3.0.


----------



## N/A (Oct 11, 2022)

Parn said:


> IF the rest of the Ada lineup could keep up with the efficiency, I might get myself a 4080 provided the price is within reasonable range (< £750, could be dreaming based on the astronomical price of the 4090).


The 4080 12 GB and 4080 16 GB are based on the same die sizes and bus widths as the 3060 and 3070 respectively, which is also where they stand performance-wise relative to the 3090.
The pricing gets creepier the lower you go, not better; no hope there. Waiting for GDDR7, DP 2.0 and a Super series is the only way.


----------



## neatfeatguy (Oct 11, 2022)

I'm not sure how to feel about this launch. The 4090 is clearly faster than the 3090 Ti, but the way Nvidia marketed the performance difference as 2-4x on all their slides is kind of shitty on their end.

Guess it doesn't matter to me, I wasn't the target audience for this card to begin with, but their shady PR crap does leave a bad taste in my mouth.


----------



## MarsM4N (Oct 11, 2022)

Just watched _*GamersNexus's review*_, and I have to say the thermal performance (for the power draw) is really nice. 
The clean & elegant design (compared to the gamer-ish partner cards) is also a selling point. Paired with the cheaper price, it's going to be *a blood bath* for the partners, lol.

Also, I never noticed before that there is a fan on each side, lol. Is that unique to the 4090, or will the lower models also get 2 fans?

About the benchmarks: why is the 4090 trailing behind all the old-gen cards at 1080p & 1440p in games like _*Battlefield V*_, *Borderlands 3*, *Divinity: Original Sin II*, *Far Cry 6*, *Halo Infinite* or *Hitman 3*? That's really odd!  Also, if you exclude the 5-6 (out of the 25 tested) games where the 4090 massively outperforms the other cards, the performance gain isn't really that great. Just an observation, no hate boys.



Razrback16 said:


> Nice writeup, thanks. I can easily afford the 4090 or even several of them, but I simply will not pay $1600 for a video card. Not gonna happen. Especially for one that doesn't even have the correct cooling solution installed. A premium card like that should have a waterblock. Period.
> 
> So, cut that price in half and offer it with a FC waterblock and I'd buy one immediately. Otherwise I'm happy to wait for 2nd-hand market in a couple years.



Then go out & just buy one! Or 2, or 3, ... _"The more you buy, the more you save."_ (Jen-Hsun Huang)
You need to support Nvidia (and their shareholders) in these desperate times. Listen to the preacher. Repeat it.












HTC said:


> No, it isn't.
> 
> However, a "halo" product is supposed to be the "top of the line", and the 4090 IS NOT the "top of the line", regardless of it having been launched or not, because we know *it's already been announced*: were this not the case, then you'd have a point.



Given our current *global economic situation* (plus the crypto mining crash), I wouldn't bet on it.  The 4090 will most likely go down in history as the rarest "mainstream card". It might be a good idea to get one used down the road as a "collectible" for historical reasons, though. But knowing Nvidia, they will likely release a 4090 Ti, just to collect dem dollarinos from dem high rollers with small ePeens.



mrpaco said:


> Testing this GPU with a 5800x was a bad move.
> 
> your review vs the rest, clearly shows a bottleneck at 1080p and 1440p vs other reviews with 7950x or 12900KS
> 
> while TPU shows an improvement vs 3090 of 20% at 1080p and 35% at 1440, others with better CPU like Igor's lab for example show improvements of around 60% for the same resolutions



*Agree*. Gaming benchmarks should have been done with the *most potent gaming CPU*, which right now is either the 7950X, the 12900KS or the 5800X3D (depending on the game).
The 5800X3D especially would have been very interesting: a little preview of the performance gains we could get from the upcoming AMD X3D models.


----------



## Aretak (Oct 11, 2022)

W1zzard said:


> GeForce RTX 3080 GPUs / Video Graphics Cards | Newegg.com
> 
> 
> Shop GeForce RTX 3080 GPUs / Video Graphics Cards on Newegg.com. Watch for amazing deals and get great pricing.
> ...


The cheapest new 3080 there at "$660" is not sold by Newegg, ships from Hong Kong (lol) and shipping is $45. The cheapest one actually sold by Newegg is $740.


----------



## LFaWolf (Oct 11, 2022)

AdmiralThrawn said:


> I would argue that it does make sense, when using price to performance or comparing on a price chart. It would be like saying a 2008 chevy silverado wins in the price chart becuase the 2020 model is 40,000 dollars but you can get a 2008 for 10,000. That is not a fair comparison, especially when you take into account the fact that the 3090ti launched at 1,999.99, a much higher price than 1,600.
> 
> Also in the newegg link you posted there was not one single 3080 for $660 but the review lists the 3080 for $660
> 
> I am not trying to be rude or argue with you here I just don't think it is 100% honest. A possible solution could be adding a launch price column to the chart, and then a current price column next to it. But I am not really an expert on review format so forgive me if that sounds stupid. Thanks


https://www.newegg.com/p/pl?N=10000...ID=KolujHFK2dc-HckoYCJ8Yl.TUxVuCoGfTg&Order=1

You need to sort by price; specifically, it is this one, although it's sold and shipped not by Newegg but by a 3rd-party seller.
https://www.newegg.com/gigabyte-gef...oc-10gd/p/N82E16814932459?Item=9SIBF9VJ706737


----------



## 3x0 (Oct 11, 2022)

@W1zzard Have you noticed any fan wobble on your sample? I think Der8auer and TechYesCity had their fans wobble and exhibit strange fan noise at certain levels


----------



## r9 (Oct 11, 2022)

Arco said:


> I wonder what AyyyyyMD is going to come up with!


They don't have anything that can beat this thing, but they'll try to convince us that's okay, and that's like saying that fking with a limp dick is just better.


----------



## Gameslove (Oct 11, 2022)

Wow, such an impressive card; I did not expect 76 W power consumption with V-Sync at 60 Hz.


----------



## Jism (Oct 11, 2022)

Ayhamb99 said:


> Huh...... I was expecting this card to be way more power hungry but for the improvements over the last gen i was surprised that it didn't surpass the 3090 Ti power consumption by a lot...



The hype was caused by Nvidia coming up with another power connector rated for up to 600 W or so.


----------



## junglist724 (Oct 11, 2022)

A 5800X really isn't enough to test this card. Hell, there doesn't seem to be a CPU in existence that doesn't bottleneck this card.


----------



## Steevo (Oct 11, 2022)

Impressive amount of extra hardware on the die; it seems the node change did as much as anything, and it looks like Nvidia was gambling on crypto not crashing with all the extra math hardware.

The RT performance is only marginally better if you also use interpolated frames or a low resolution; otherwise it scales with the 3000 series.


----------



## 80-watt Hamster (Oct 11, 2022)

LFaWolf said:


> https://www.newegg.com/p/pl?N=100007709 601357247&utm_medium=affiliates&utm_source=afc-TechPowerUp&AFFID=3647627&AFFNAME=TechPowerUp&ACRID=1&ASID=https://www.techpowerup.com/&ranMID=44583&ranEAID=3647627&ranSiteID=KolujHFK2dc-HckoYCJ8Yl.TUxVuCoGfTg&Order=1
> 
> You need to sort it by price, and specifically, it is this one, although not sold and shipped by Newegg but by a 3rd party seller.
> https://www.newegg.com/gigabyte-gef...oc-10gd/p/N82E16814932459?Item=9SIBF9VJ706737



A 3rd-party seller that'll charge you $45 shipping.  Filtering for "New" + "Shipped by Newegg" gives the best price representation, IMO. That makes the minimum buy-in $720, which is more realistic.


----------



## Gameslove (Oct 11, 2022)

junglist724 said:


> A 5800X really isn't enough to test this card. Hell, there doesn't seem to be a CPU in existence that doesn't bottleneck this card.


This graphics card is designed for 4K and 8K gaming, nothing less. It's ridiculous to use a card this powerful for 1080p.
For 4K gaming, the 5800X is quite enough here.

@W1zzard 
Thanks for the review. Also waiting for 5K, 6K and 8K tests. And PCIe 3.0 vs PCIe 4.0?


----------



## R0H1T (Oct 11, 2022)

Dirt Chip said:


> I think it's the opposite: according to 4090 high efficiency, *lower ada are so much better than ampera pref\watt wise* that if they are out now no one will buy any ampera. NV will be stuck with all the stock.
> I will patiently wait for a 10-12GB Ada GPU down the road with 150w tdp.


That usually depends on their clocks. If the bigger chips are clocked much higher, they tend to be a tad "leakier", but generally speaking the xx80, xx50/Ti or xx60/Ti cards have been among the most efficient since Kepler; from memory, the xx80 cards have regularly been the most efficient ones.


----------



## junglist724 (Oct 11, 2022)

Gameslove said:


> This graphics card designed for a 4K, 8K gaming, not less. It ridiculous is using this powerful card for 1080p.
> 4K gaming 5800X quite enough here.
> 
> @W1zzard
> Thanks for the review. Waiting also 5K, 6K, 8K test. And PCI-E 3.0 vs PCI-E 4.0?


Even with a 12900K/5800X3D/7700X you'll still be CPU limited at 4K in a lot of titles.


----------



## evernessince (Oct 11, 2022)

BigMack70 said:


> Yes it's unremarkable. It's ordinary. Expected. Routine.
> 
> Don't believe me? Go look at the flagship card launches for every generation of GPUs. Historically, it's ordinary and unremarkable in every way. It's identical to Ampere, slightly higher than Turing, MUCH lower than Pascal. Higher or lower than Maxwell depending on if you consider the 780 or the 780 Ti the Kepler flagship. Lower than Kepler.
> 
> Anyone jumping up and down as if a 45% performance improvement is anything other than expected is drinking Jensen's Kool-Aid and needs to take a break from Nvidia's marketing materials.



To be fair, Turing was statistically the worst Nvidia GPU generation ever.  Zero improvement in performance per dollar, only a small performance bump over last gen, and a single digit efficiency increase.

I hope to see the same people that are praising this 45% increase also praise AMD when they achieve the same.



Denver said:


> I believe the CPU is limiting the performance the card can achieve, especially at 1080P.
> 
> It's worth a new round with the big dogs like 7700x/12900k and 5800x3d



No point until the 13900K and the 7950X3D or other Zen 4 3D variants release.


----------



## SOAREVERSOR (Oct 11, 2022)

evernessince said:


> To be fair, Turing was statistically the worst Nvidia GPU generation ever.  Zero improvement in performance per dollar, only a small performance bump over last gen, and a single digit efficiency increase.
> 
> I hope to see the same people that are praising this 45% increase also praise AMD when they achieve the same.
> 
> ...



People praise AMD's stuff, but then they all say they hope it will force Nvidia to drop prices, and when Nvidia doesn't, they go out and buy Nvidia anyway at the inflated price.


----------



## AnotherReader (Oct 11, 2022)

Thanks for the timely review, @W1zzard. The performance, while great, is nothing unexpected; the uplift is within the historical norm and less than some other transitions in the past. Even the much panned Turing saw a similar uplift. Considering the node change and the vastly greater clock speeds, the performance increase is rather lacking. On the plus side, raytracing performance has improved again. To summarize:

*Pros*

- Acceptable performance increase over the 3090
- Greater increase in raytracing performance compared to rasterization

*Cons*

- If going from 82 SMs to 128 SMs at higher clock speeds only gets us a nearly 65% increase, the 4080 16 GB is likely to merely trade blows with the 3090 Ti
- Increase in performance is far less than Nvidia's claims and the vast increase in resources and clock speed
- Lower performance per TFLOP than Ampere
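The scaling argument in those cons can be checked with quick arithmetic; the boost clocks below are rough approximations assumed for the sketch, not figures from the review:

```python
# Raw resource scaling from the 3090 (GA102) to the 4090 (AD102),
# using SM count times an assumed boost clock as a crude throughput proxy.
sms_3090, sms_4090 = 82, 128
clk_3090, clk_4090 = 1.70, 2.52  # GHz, approximate boost clocks (assumed)

theoretical = (sms_4090 * clk_4090) / (sms_3090 * clk_3090)  # ~2.31x raw
observed = 1.65  # ~65% measured uplift, per the post

# Well below 1.0: each TFLOP delivers less than it did on Ampere.
scaling_efficiency = observed / theoretical  # ~0.71
```

Under these assumptions the 4090 extracts only about 70% of its theoretical throughput gain, which is the "lower performance per TFLOP" point in numbers.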


----------



## W1zzard (Oct 11, 2022)

MarsM4N said:


> why is the 4090 trailing behind all the old-gen cards at 1080p & 1440p in games like _*Battlefield V*_, *Borderlands 3*, *Divinity: Original Sin II*, *Far Cry 6*, *Halo Infinite* or *Hitman 3*


As mentioned in the conclusion, it seems the driver overhead is higher than on Ampere, so this higher CPU usage will eat into the CPU time available for the game and thus make it even more CPU limited



3x0 said:


> @W1zzard Have you noticed any fan wobble on your sample? I think Der8auer and TechYesCity had their fans wobble and exhibit strange fan noise at certain levels


No, runs perfectly smooth



Gameslove said:


> Waiting also 5K, 6K, 8K test.


No plans, unless someone sends me a monitor  Even then, not sure if I can add another resolution.



Gameslove said:


> And PCI-E 3.0 vs PCI-E 4.0?


Very soon


----------



## evernessince (Oct 11, 2022)

64K said:


> This GPU is a beast. Very impressive.
> Good job Nvidia!



Agreed.  Regardless of other factors Nvidia did a good job with this GPU.


----------



## chowow (Oct 11, 2022)

Maybe it's a good thing you tested with the 5800X; my 5600 might not bottleneck this card at 4K on ultra settings and in VR gaming.


----------



## Richards (Oct 11, 2022)

https://twitter.com/i/web/status/1579825383842537473
The card needs Zen 4 and Raptor Lake to show its true power... the 5800X is bad.


----------



## AnotherReader (Oct 11, 2022)

I understand the constraints that @W1zzard was operating under, but this would have been a good time to switch the GPU test bed. If not going with Zen 4 or Alder Lake, replacing the 5800X with a 5800X3D would have been appreciated. That being said, it's unlikely to be a bottleneck in most games at 4K, and for this card, the 4K results are the most relevant ones.


----------



## W1zzard (Oct 11, 2022)

You'll get your 13900K with RTX 4090 results soon enough, just be a bit more patient


----------



## RandallFlagg (Oct 11, 2022)

evernessince said:


> To be fair, Turing was statistically the worst Nvidia GPU generation ever.  Zero improvement in performance per dollar, only a small performance bump over last gen, and a single digit efficiency increase.



That's not accurate.  The 2060 was 45% faster than its predecessor 1060.  The MSRP increase with the same VRAM was +$50.  There was a huge jump in the midrange with Turing.  The top end is where it was lackluster.

With Ampere, the mid range didn't get much.  The 3060 was only about 19% faster than the 2060, and basically tied the 2060 Super.  On paper, the 3060 had a $20 price reduction, but we never saw that thanks to Covid and crypto.

1060 vs 2060 on release :





3060 vs 2060 on release :


----------



## AnotherReader (Oct 11, 2022)

Richards said:


> __ https://twitter.com/i/web/status/1579825383842537473The card needs zen 4 and raptor  lake  to show its true power.. 5800x is bad


TechSpot only tested 12 games, so their average can't be compared to TPU's. Even in GPU-limited scenarios like Cyberpunk at 4K, the ratio between the 4090 and the 3090 is almost the same as the average. Still, a faster CPU seems to be called for with this new generation of GPUs.



RandallFlagg said:


> That's not accurate.  The 2060 was 45% faster than its predecessor 1060.  The MSRP increase with the same VRAM was +$50.  There was a huge jump in the midrange with Turing.  The top end is where it was lackluster.
> 
> With Ampere, the mid range didn't get much.  The 3060 was only about 19% faster than the 2060, and basically tied the 2060 Super.  On paper, the 3060 had a $20 price reduction, but we never saw that thanks to Covid and crypto.
> 
> ...


Even the flagship was pretty fast for Turing; it's the prices that were terrible. At 4K, we saw:


----------



## vmarv (Oct 11, 2022)

AnotherReader said:


> I understand the constraints that @W1zzard was operating under, but this would have been a good time to switch the GPU test bed. If not going with Zen 4 or Alder Lake, replacing the 5800X with a 5800X3D would have been appreciated. That being said, it's unlikely to be a bottleneck in most games at 4K, and for this card, the 4K results are the most relevant ones.


Hardware Unboxed did the review with the 5800X3D


----------



## TheDeeGee (Oct 11, 2022)

TheDeeGee said:


> While not the claimed 4 times faster, the sometimes near double jump is impressive from one generation to the next.


I have to quote myself.

It is actually 4 times faster in 4K with DLSS ^^


----------



## Fluffmeister (Oct 11, 2022)

Hey @W1zzard, it must be terrible for you having all those big chunky 4090s taking up all that room in your no doubt comfortably sized lab, but still... feel free to send me your least favourite 4090, if only to free up some room of course.


----------



## evernessince (Oct 11, 2022)

RandallFlagg said:


> That's not accurate.  The 2060 was 45% faster than its predecessor 1060.  The MSRP increase with the same VRAM was +$50.  There was a huge jump in the midrange with Turing.  The top end is where it was lackluster.
> 
> With Ampere, the mid range didn't get much.  The 3060 was only about 19% faster than the 2060, and basically tied the 2060 Super.  On paper, the 3060 had a $20 price reduction, but we never saw that thanks to Covid and crypto.
> 
> ...



No, Turing was terrible across the stack.

The comparison you are making, the 1060 vs the 2060, is a bad one, because the 1060 had an MSRP of $250 whereas the 2060 had an MSRP of $399. That's an increase of $150 USD. I'm not sure where you are getting your +$50 figure, but it was a lot more than that: https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units#GeForce_200_series

Mind you, in this price category even a $50 price increase is huge; for a $200 video card, that's a 25% increase. The $150 price increase more than offsets any performance gains.

So yes, even in the best scenario you can come up with for Turing, you are still looking at an average to below-average gain for a large increase in price. In fact, the price increase is so large that the 2060 doesn't even target the same market segment.

Ampere just continues the trend of a lack of improvement in entry-level / mid-range video cards. Mind you, I'm not sure I'd qualify $400 as mid-range; it's in that middle ground where the price could warrant a high-end tag.


----------



## RandallFlagg (Oct 11, 2022)

AnotherReader said:


> TechSpot only tested 12 games so their average can't be compared to TPU's. Even in GPU limited scenarios like CyberPunk at 4K, the ratio between the 4090 and the 3090 is almost the same as the average. Still a faster CPU seems to be called for with this new generation of GPUs.
> 
> 
> Even the flagship was pretty fast for Turing; it's the prices that were terrible. At 4K, we saw:
> ...



Both the 4K and the 1080P gains with 1080 Ti vs 2080 Ti pale compared to 1060 vs 2060 gains.  At 1080P the 2060 got +45% - but at 4K, it got +67%.

4K




evernessince said:


> No, Turing was terrible across the stack.
> 
> The comparison you are making, the 1060 vs the 2060, is a bad one because the 1060 had an MSRP of $250 whereas the 2060 had an MSRP of $399.  That's an increase of $150 USD.  I'm not sure where you are getting your +$50 figure but it was a lot more than that: https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units#GeForce_200_series



More bad information.  The 1060 6GB that is being compared was $299.  The 2060 6GB was $350.   I noted this in my post.  The 1060 4GB you speak of was $250.












evernessince said:


> Mind you in this price category even a $50 price increase is huge.  For a $200 video card that's a 25% price increase.  The price increase of $150 more than offsets any performance increases seen.



For a 20% price increase you got +45% in 1080P and +67% in 4K.  I paid $309 for my 2060 KO which is a few percent faster still.


----------



## AnotherReader (Oct 11, 2022)

RandallFlagg said:


> Both the 4K and the 1080P gains with 1080 Ti vs 2080 Ti pale compared to 1060 vs 2060 gains.  At 1080P the 2060 got +45% - but at 4K, it got +67%.
> 
> 4K
> View attachment 265045
> ...


That's right, but I think the true successor for the 1060 was the 1660 Ti. The 2060 was too expensive to be considered a 1060 successor.


----------



## R0H1T (Oct 11, 2022)

TheDeeGee said:


> I have to quote myself.
> 
> It is actually 4 times faster in 4K with DLSS ^^


And you need to buy that new card to get DLSS 3, *win-win*


----------



## HTC (Oct 11, 2022)

evernessince said:


> The comparison you are making, the 1060 vs the 2060, is a bad one because the 1060 had an MSRP of $250 whereas the 2060 had an MSRP of $399



On price yes, but that's not what he was comparing: he was comparing the % gains of the 2060 over the 1060 with the % gains of the 3060 over the 2060.


----------



## mb194dc (Oct 11, 2022)

Great performance if you want RT. You can already game at 4K 60+ FPS on the prior gen. $1,600 seems like a totally ludicrous price to me.


----------



## RandallFlagg (Oct 11, 2022)

AnotherReader said:


> That's right, but I think the true successor for the 1060 was the 1660 Ti. The 2060 was too expensive to be considered a 1060 successor.



I heard a lot of people say this at the time; many even said it was just as fast as the 2060, which was false (the 2060 was 18-20% faster).

The problem with comparing at a static price point is that you fail to factor in inflation. Think what you'd be looking at if you compared cars that way.

To illustrate: in the past 24 months in the US there has been over 14% general price inflation, so a $300 widget from 2020 would cost $342 today. We are talking about a 1060 from 2016 vs a 2060 from 2018 vs a 3060 from 2020. The same tier is not going to stay the same price. $299-$350 is, IMO, a perfectly valid range for these cards.
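That adjustment is just scaling a price by a cumulative inflation rate; a minimal sketch using the post's own numbers:

```python
def inflation_adjust(price: float, cumulative_rate: float) -> float:
    """Scale a past price up by a cumulative inflation rate."""
    return price * (1 + cumulative_rate)

# The example from the post: ~14% US inflation over the past 24 months
# turns a $300 widget from 2020 into about $342 today.
adjusted = inflation_adjust(300, 0.14)
print(round(adjusted))  # 342
```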


----------



## adilazimdegilx (Oct 11, 2022)

Impressive performance; glad to see it's efficient and runs quite cool even as an FE card.
Shocked to see the CPU struggling to keep up with this card at 4K! That got me even more hyped for DLSS 3.
I'm not really sold on the new PCIe power connector, though; other reviewers had some serious concerns.
Great review as always, looking forward to the other brands' cards and the scaling charts. 
Also, what's up with the RE Village chart? The 4090 has phenomenal gains there compared to the others.


----------



## BigMack70 (Oct 11, 2022)

evernessince said:


> To be fair, Turing was statistically the worst Nvidia GPU generation ever.  Zero improvement in performance per dollar, only a small performance bump over last gen, and a single digit efficiency increase.
> 
> I hope to see the same people that are praising this 45% increase also praise AMD when they achieve the same.
> 
> ...


Yup. By the way, Turing was 38% faster than Pascal. Ada is 45% faster than Ampere.

Again, I ask - why is this review's conclusion authored in such a way that would suggest historically significant performance improvements worthy of effusive praise, when it's doing what is merely ordinary?

It's really misleading, and suggests far too much of Jensen's marketing kool-aid has been consumed.


----------



## mouacyk (Oct 11, 2022)

Rowsol said:


> The efficiency is the most surprising part.


That should be credited to TSMC, not NVidia?


----------



## Upgrayedd (Oct 11, 2022)

I thought 95C was supposed to be the new 65C?
Does that mean this 72C is the new 42C?


----------



## mouacyk (Oct 11, 2022)

Colddecked said:


> I mean, you can see what it can do in games that can somewhat take advantage of its strong suits.  100% over a 3080 is nuts.


At >2x the cost. Unremarkable.


----------



## N/A (Oct 11, 2022)

BigMack70 said:


> Yup. By the way, Turing was 38% faster than Pascal. Ada is 45% faster than Ampere.


Where does this 45% number come from? It's double the performance of the 3090 in non-CPU-limited situations, and that's without the new frame-insertion feature.


----------



## TheinsanegamerN (Oct 11, 2022)

mouacyk said:


> That should be credited to TSMC, not NVidia?


Intel's ARC GPUs are a prime example of the exact opposite. The node is not the only thing that matters, GPU arch design is just as important.


----------



## Chrispy_ (Oct 11, 2022)

@W1zzard: Holy shit, you have the ONLY DLSS 3.0 frame generation results on the internet right now.
Good job, nice review; Crack open a cold beer and put your feet up!

As for the 4090, it's a beast, as expected - but it's not the model that 99.9% of people are interested in and we don't know what AMD's offering will be. For now it's just the first data point in a _very_ incomplete next-gen dataset and I cannot wait to see how the 4070 or any RDNA3 cards compare.


----------



## ARF (Oct 11, 2022)

grammar_phreak said:


> I really wish SLI and multi-GPU setups would have evolved.



RTX 4090 is your new "dual-GPU replacement".
The RTX 4080-16 will be nvidia's new single-GPU top dog.
While the RTX 4080-12 - well, who knows, there is no such animal


----------



## clopezi (Oct 11, 2022)

Chrispy_ said:


> @W1zzard: Holy shit, you have the ONLY DLSS 3.0 frame generation results on the internet right now.
> Good job, nice review; Crack open a cold beer and put your feet up!
> 
> As for the 4090, it's a beast, as expected - but it's not the model that 99.9% of people are interested in and we don't know what AMD's offering will be, so for now it's just the first data point in a _very_ incomplete next-gen dataset. I cannot wait to see how the 4070 or any RDNA3 cards compare.


It's superb work! For things like this, TechPowerUp is my first tech place to read.


----------



## Frick (Oct 11, 2022)

Frick said:


> Would it be interesting to add 4K Vsync 60Hz to this?



For power draw I meant.


----------



## Gameslove (Oct 11, 2022)

@W1zzard 
On a 4K monitor, you can try enabling Dynamic Super Resolution (DSR): 4K upscaled to 8K.


----------



## Shazamy (Oct 11, 2022)

Thanks for an amazing review W1zzard, as always, much appreciated!
The DLSS FG video was an awesome and unexpected inclusion!
I can't believe how much more power efficient and quiet this card is, really impressed!
I was going to get the MSI water cooled 4090, but with these temps I may not have to. Looking forward to the custom design card reviews!


----------



## BigMack70 (Oct 11, 2022)

N/A said:


> Where does this 45% number come from? It's double the performance of the 3090 in non-CPU-limited situations, and that's without the new frame-insertion feature.
> 
> View attachment 265050


Um.... from the portion of the review conclusion that I've been criticizing (because it makes no sense from the data)?

I'll quote it again, for reference. *Emphasis *added.



> For a majority of gamers, the "classic" raster performance is very important though—highest settings, RT off, DLSS off—so we made sure to extensively test this scenario using 25 games at three resolutions. *The GeForce RTX 4090 achieves incredible performance results here: +45% vs RTX 3090 Ti. Yup, 45% faster than last generation's flagship—this is probably the largest jump Gen-over-Gen for many years.* Compared to RTX 3080 the uplift is 89%—wow!—almost twice as fast. Compared to AMD's flagship, the Radeon RX 6950 XT, the RTX 4090 is 64% faster. Somehow I feel that after RDNA2, Jensen said to his people "go all out, I want to conclusively beat AMD next time."



I'm not critiquing the card. I'm critiquing a section of this review's conclusion that makes ZERO sense. The data in this review does not support these statements with respect to "classic" raster performance. The data in this review suggests that this card is right in the middle of the pack with regards to new GPU architectures launched over the past 10 years of GPU launches and is just fine.

This card's performance is excellent - as expected - without needing to use unwarranted hyperbole and unearned effusive praise in comparison to what previous architectures achieved at their respective launches. It sounds much more like marketing than product review.


----------



## Gica (Oct 11, 2022)

The price is not high for a monster. You get everything you want: 4K@100+ FPS gaming, DLSS 3.0 and full enc/dec support. For content creation, the former king 3090 Ti turns into a poor kitten in front of the new king; here, at least a 100% performance increase is expected.


----------



## evernessince (Oct 11, 2022)

RandallFlagg said:


> More bad information.  The 1060 6GB that is being compared was $299.  The 2060 6GB was $350.   I noted this in my post.  The 1060 4GB you speak of was $250.
> 
> View attachment 265046
> 
> ...



The only bad information here is the implication that $350 is the price the card was broadly available for. $350 was the Founders Edition price, which almost no one saw. Scalping aside, AIB pricing started at $400+, as is usual relative to the Founders Edition.

Mind you that's comparing the 6GB 2060 to the 6GB 1060.  Zero increase in VRAM.  The 12GB 2060 cost a whopping $600 at time of review on TechPowerUp.

You paid $309 for a 2060 KO, which launched March 3rd 2020, a whopping 14 MONTHS after the Jan 7th 2019 launch of the 2060.  Suffice to say, it's extremely misleading to compare late generation prices to MSRP.

You admitted you own a 2060, and I believe your ownership is clouding your judgement in this case. Even when you take the best Turing SKU, value-wise, in the best possible light, it's at best underwhelming.



HTC said:


> On price yes, but that's not what he was comparing: he was comparing the % gains of the 2060 over the 1060 with the % gains of the 3060 over the 2060.



He was replying to my comment referencing performance per dollar, so I have to assume he's replying on topic. Otherwise I couldn't care less about off-topic arguments.


----------



## Dimestore (Oct 11, 2022)

DLSS3 'Frame Generation' makes no sense to me.  Take two frames and interpose a generated frame in between them.  What happens?  Your FPS goes up, sure, but your *latency* goes up as well!  Latency is time between action and reaction, from the time you move the control to the time the movement hits your eyes.  If you are generating an extra frame for every two frames that are not controlled by you, then you are pushing latency up by 1/3.  Does this make sense to anyone else?
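The latency concern above can be sketched with a toy model. This is a simplified back-of-envelope view, not NVIDIA's actual pipeline (real figures depend on Reflex, the render queue, and display timing):

```python
def interpolation_latency_ms(native_fps: float, interpolating: bool) -> float:
    """Rough added display latency from inserting a generated frame
    between two rendered frames: frame N can only be presented once
    frame N+1 exists, so presentation lags by about one native
    frame-time when interpolation is on."""
    native_frame_time = 1000.0 / native_fps
    return native_frame_time if interpolating else 0.0

# At 60 FPS native, interpolation defers presentation by ~16.7 ms,
# even though the displayed frame rate roughly doubles.
extra_ms = interpolation_latency_ms(60, True)
```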


----------



## clopezi (Oct 11, 2022)

Dimestore said:


> DLSS3 'Frame Generation' makes no sense to me.  Take two frames and interpose a generated frame in between them.  What happens?  Your FPS goes up, sure, but your *latency* goes up as well!  Latency is time between action and reaction, from the time you move the control to the time the movement hits your eyes.  If you are generating an extra frame for every two frames that are not controlled by you, then you are pushing latency up by 1/3.  Does this make sense to anyone else?


Latency is reduced in all tests; they have dedicated technology for it.

I don't know about competitive games, but in single-player games latency is heavily reduced.


----------



## terroralpha (Oct 11, 2022)

this review is worthless. why would you test a high-end GPU on a budget CPU? you are kneecapping the performance. a lot of the benches here, even those at 4K, are CPU limited. you guys couldn't spring for a 5800X3D? or an intel bench?

no one who is in the market for this GPU is going to be using a 5800X system.

everyone else used ryzen 7000 or high-end intel. but you guys published this junk



GerKNG said:


> TWO TO FOUR TIMES FASTER!!!



in some cases, yes, it does go 4x faster. the reason you don't see that here is because, as mentioned above, these benches are being run on a ryzen 5800X


----------



## W1zzard (Oct 11, 2022)

Dimestore said:


> DLSS3 'Frame Generation' makes no sense to me.  Take two frames and interpose a generated frame in between them.  What happens?  Your FPS goes up, sure, but your *latency* goes up as well!  Latency is time between action and reaction, from the time you move the control to the time the movement hits your eyes.  If you are generating an extra frame for every two frames that are not controlled by you, then you are pushing latency up by 1/3.  Does this make sense to anyone else?


You are absolutely right, that's why DLSS 3 requires devs to add Reflex support to their game


----------



## Dimestore (Oct 11, 2022)

clopezi said:


> Latency is reduced in all tests; they have dedicated technology for it.
> 
> I don't know about competitive games, but in single-player games latency is heavily reduced.


Where is this addressed?  I see no mention of it.  I only see reviews use frame rates as performance indicators.


----------



## clopezi (Oct 11, 2022)

terroralpha said:


> this review is worthless. why would you test a high end GPU on a budget CPU? you are knee capping the performance. a lot of the benches here, even those at 4K, are CPU limited. you guys couldn't spring for a 5800X3D? or an intel bench?
> 
> everyone else used ryzen 7000 or high end intel. but you guys published this junk
> 
> ...


LTT's numbers are strange. Those numbers without DLSS enabled aren't seen in any other review...


----------



## W1zzard (Oct 11, 2022)

Chrispy_ said:


> @W1zzard: Holy shit, you have the ONLY DLSS 3.0 frame generation results on the internet right now.
> Good job, nice review; Crack open a cold beer and put your feet up!


lol really? and I felt bad because I had only two games


----------



## 3x0 (Oct 11, 2022)

clopezi said:


> Latency is reduced in all tests; they have dedicated technology for it.
> 
> I don't know about competitive games, but in single-player games latency is heavily reduced.


----------



## TheoneandonlyMrK (Oct 11, 2022)

Dimestore said:


> DLSS3 'Frame Generation' makes no sense to me.  Take two frames and interpose a generated frame in between them.  What happens?  Your FPS goes up, sure, but your *latency* goes up as well!  Latency is time between action and reaction, from the time you move the control to the time the movement hits your eyes.  If you are generating an extra frame for every two frames that are not controlled by you, then you are pushing latency up by 1/3.  Does this make sense to anyone else?


Yeah... but more frames, though.
Sorry, I couldn't help myself.

I agree with your sentiment though; virtual high FPS isn't for me.

Wow, did I just delete a rant. In short, wtaf, I don't like AI-generated frames or their equivalents.

Secondly, htaf does a maximum gain of about 40% at 4K and 12% at 1080p turn into 100%, or 2 to 4 times the performance? Just how niche is the use case that gain applies to? Considering it's a gaming card, that's some clear hyping by Nvidia.
I do want to know, though: where does the 4x kick in?


----------



## evernessince (Oct 11, 2022)

clopezi said:


> Latency is reduced in all tests; they have dedicated technology for it.
> 
> I don't know about competitive games, but in single-player games latency is heavily reduced.



No latency is absolutely not reduced by DLSS's AI frame insertion.  Given that it requires you to wait for the next frame to be rendered before creating the AI generated frame it will always carry a latency penalty until they remove that requirement.

This is why Nvidia require that Reflex be enabled with the frame insertion, to hide a portion of the latency penalty.  If you disable frame insertion and enable reflex you will get lower latency.



Dimestore said:


> DLSS3 'Frame Generation' makes no sense to me.  Take two frames and interpose a generated frame in between them.  What happens?  Your FPS goes up, sure, but your *latency* goes up as well!  Latency is time between action and reaction, from the time you move the control to the time the movement hits your eyes.  If you are generating an extra frame for every two frames that are not controlled by you, then you are pushing latency up by 1/3.  Does this make sense to anyone else?



I assume there's a sweet spot between 60 FPS and 144 FPS where it makes sense for some people to enable it. Above 144 FPS it doesn't make much sense, as smoothness is already very good and the latency penalty would outweigh the diminishing benefits. Below 60 FPS, the latency penalty would be quite sizable; maybe it suits strategy games that aren't real time, but even then it could still be annoying, and those are the games least likely to benefit from additional motion smoothness.

Hopefully improvements come to the tech but until then I feel like it's going to be highly preferential as to whether it's worth enabling.


----------



## Footman (Oct 11, 2022)

Yes very good but..... 

Nice Review. I noticed Techspot remarked that the 4090 was cpu bound with the 5800X3D in their test machine. Your test machine is running a 5800X, so I wonder if the performance of the 4090 might be higher at lower resolutions with a faster cpu???


----------



## GerKNG (Oct 11, 2022)

terroralpha said:


> in some cases, yes, it does go 4x faster. the reason you don't see that here is because, as mentioned above, these benches are being run on a ryzen 5800X


you show us the LTT benchmark where they forgot to disable DLSS.
this is not a real result.


----------



## RandallFlagg (Oct 11, 2022)

evernessince said:


> The only bad information here is that you are implying that $350 is the price the card was broadly available for.  $350 was the founder's edition price which almost no one saw.  Aside from the scalping, AIB pricing started at $400+ as was usual when comparing founders edition to non-founders edition.
> 
> Mind you that's comparing the 6GB 2060 to the 6GB 1060.  Zero increase in VRAM.  The 12GB 2060 cost a whopping $600 at time of review on TechPowerUp.
> 
> You paid $309 for a 2060 KO, which launched March 3rd 2020, a whopping 14 MONTHS after the Jan 7th 2019 launch of the 2060.  Suffice to say, it's extremely misleading to compare late generation prices to MSRP.




It is you who is trying to interject "market price" from 3-4 years ago. I am using MSRP. I'll keep on using MSRP. The reasoning for using it to compare has been explained ad nauseam not only here, but on countless other sites. If demand is too high, the AIBs and retailers pocket the difference. If it's too low, they take a hit. MSRP is the only thing we have; anything else is chaos and cherry-picking.

My comment about paying $309 was just pointing out that I got a discount on the card by buying late in the cycle, since you seemed dead set on comparing discounted prices to new release prices, or high demand scalp prices to MSRP.  Anyone can do that, just cherry pick launch vs mid cycle vs late cycle, crypto bust vs crypto mania.  That's chaos.

As I said, I think MSRP is the only reasonable way to compare pricing.  And +$50 on a $299 card is only +16.7%.  For that you get +45% at 1080P and +67% at 4K.  That's not hard to understand.  



evernessince said:


> You admitted you own a 2060 and I believe your ownership is clouding your judgement in this case.  Even when you take the best Turing SKU value wise in the best possible light it's at best underwhelming.



That's a laugh. I posted charts, I've been posting charts; I don't usually say stuff without proof when it comes to numbers, because proof is so easy to find. It's 45% faster than the 1060 at 1080p and 67% faster at 4K. I paid about $30 over what the 1660 Ti was selling for at the time (about 12% more) to get +20% over that card. The likely reason was that everyone said the 2060 wasn't worth it, which drove up 1660 Ti demand while the 2060 wasn't selling. So I got to take advantage of both the bad information in circulation and its results. I don't do these things because of fanboy gibberish, as you imply; I do it because the math works out, and I already showed that it did.
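The value math being argued in this exchange reduces to a simple ratio; a sketch using the MSRP and uplift percentages quoted in the thread (the figures are the posters' numbers, not re-measured):

```python
def perf_per_dollar_gain(price_ratio: float, perf_ratio: float) -> float:
    """Relative change in performance per dollar between two cards,
    as a fraction (0.0 means no change)."""
    return perf_ratio / price_ratio - 1

# 2060 vs 1060 at MSRP: +16.7% price for +45% (1080p) or +67% (4K) perf.
gain_1080p = perf_per_dollar_gain(1.167, 1.45)  # roughly +24%
gain_4k = perf_per_dollar_gain(1.167, 1.67)     # roughly +43%
```

Note the result is very sensitive to which price you plug in, which is exactly why the MSRP-vs-street-price disagreement above matters.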


----------



## Dr. Dro (Oct 11, 2022)

terroralpha said:


> this review is worthless. why would you test a high end GPU on a budget CPU? you are knee capping the performance. a lot of the benches here, even those at 4K, are CPU limited. you guys couldn't spring for a 5800X3D? or an intel bench?
> 
> no one who is the market for this GPU is going to be using a 5800X system.
> 
> everyone else used ryzen 7000 or high end intel. but you guys published this junk



Since when is a 5800X a budget CPU? I must have missed something. Its performance comfortably exceeds that of the CPUs installed in most gaming PCs today.

This guy out here literally with the mindset of "Imagine being so poor you could only afford a 5950X and a 3090 " , damn, I've never loved being poor as much as I do today  



TheoneandonlyMrK said:


> in short wtaf I don't like ai generated frames. or its equivalents



You're not alone, brother. I have to confess, I have never liked upscalers and I definitely do not like this frame generation thing. I will be skipping the launch models of this gen, probably going to grab something during the inevitable mid-gen refresh.

Other than that, it's a remarkable GPU. I'll be keeping a close eye on AMD's counteroffer, it looks like they have their work cut out for them.


----------



## gffermari (Oct 11, 2022)

The 4090 shouldn't be tested at 1080p/1440p at the moment. Even at 4K the card was held back by the CPU, even with a 5800X3D (Hardware Unboxed).
The CPUs are clearly ...30 years behind the GPUs in technology.


----------



## Gica (Oct 11, 2022)

For haters: insert a link with a review in which the general impression was disappointment. I found it only with WoW!!!!


----------



## Colddecked (Oct 11, 2022)

Dr. Dro said:


> Since when is a 5800X a budget CPU? I must have missed something. Its performance comfortably exceeds that of the CPUs installed in most gaming PCs today.
> 
> This guy out here literally with the mindset of "Imagine being so poor you could only afford a 5950X and a 3090 " , damn, I've never loved being poor as much as I do today
> 
> ...



Since the 12900K and 5800X3D came out, it's mid-range now at best. Heck, even within the 5000 series stack it's in the middle.


----------



## bobmeix (Oct 11, 2022)

Small typo on page 35: you can enable DLAA + FG


----------



## Dr. Dro (Oct 11, 2022)

Colddecked said:


> Since 12900 and 5800x3d came out, it's mid range now at best.  Heck even in the 5000 series stack its in the middle.



I dunno, man, I want some of what you guys are on, calling it low-mid; it must be the good stuff. You guys refer to it as if it were some sort of antique lmao

Vermeer remains one of the most competent gaming processors money can buy, as well as one of the most efficient processor architectures out there, and the 5800X remains a very high-end processor for this purpose. I'd go a step further: the 5700X is the CPU for most people looking to play video games, including those looking to build with an Ada GPU.

That particular processor is reliable, runs cool and has amazing performance, too. I'll just let @Mussels' post on the Zen Garden do the talking:









Ryzen Owners Zen Garden (www.techpowerup.com)

I don't think you will be losing out on a quality experience by using these with a Ryzen 5000 processor, or even one of the faster 3000 processors like a 3800XT or a 3950X.


----------



## Why_Me (Oct 11, 2022)

Rokugan said:


> View attachment 264998
> 
> +63% vs  RTX3090 @ 3x  2nd hand market price in Europe (2K vs 700 Eur)
> +88% vs RTX3080 @ 4x  2nd hand market price in Europe (2K vs 500 Eur)
> ...


Just about everything in the EU is priced insane.  Do you ever think it might be the EU and not Nvidia?


----------



## Wasteland (Oct 11, 2022)

RandallFlagg said:


> As I said, I think MSRP is the only reasonable way to compare pricing.  And +$50 on a $299 card is only +16.7%.  For that you get +45% at 1080P and +67% at 4K.  That's not hard to understand.


The MSRP of the 1060 (6 GB) was $250.  The MSRP for the _Founder's Edition _was $300.  This was back when NVIDIA charged a premium for the FE.  I know because I bought a Zotac 1060 for $250 at launch.

I don't have a dog in this hunt, particularly, but there was a minor outcry about bad perf/price on Turing when it launched.  I remember being deeply underwhelmed, myself.  Later on the 16 series appeased people who felt that RT features weren't worth the premium, which I think was a sensible position at the time.

EDIT: Better link - https://www.gamersnexus.net/news-pc/2505-official-nvidia-gtx-1060-specs-price-release-date


----------



## Dr. Dro (Oct 11, 2022)

Why_Me said:


> Just about everything in the EU is priced insane.  Do you ever think it might be the EU and not Nvidia?



It's NV alright. Prices are historically high even in America which gets dirt cheap hardware anyway. That said it is not out of line compared to the 3090's launch price, and honestly, anyone with a GA102-based GPU should not be running around with their hair on fire, it's just FOMO, really. 

If you spent $800-$1600+ on a high end GPU, give it at least 3 years before you replace it. I guarantee anyone on a 3080+ is going to be comfortable gaming on ultra high settings for the next year if not more.


----------



## Dimestore (Oct 11, 2022)

evernessince said:


> No latency is absolutely not reduced by DLSS's AI frame insertion.  Given that it requires you to wait for the next frame to be rendered before creating the AI generated frame it will always carry a latency penalty until they remove that requirement.
> 
> This is why Nvidia require that Reflex be enabled with the frame insertion, to hide a portion of the latency penalty.  If you disable frame insertion and enable reflex you will get lower latency.
> 
> ...


I don't see how it is possible to generate an intermediary frame between two frames using temporal (time-based) data and not require that the second frame be rendered before the interpose.  Frame1 - Interframe - Frame2, in order to create interframe you must have the already rendered Frame1 and Frame2, and by the time you push Frame1 to the screen it is behind by a frame, giving the 1/3 extra latency.  Is this incorrect?


----------



## gffermari (Oct 11, 2022)

Dr. Dro said:


> I dunno man but I want some of what you guys are on calling it low-mid, but I want some, it must be the good stuff. You guys referring to it as if it were some sort of antique lmao
> 
> Vermeer remains one of the most competent gaming processors money can buy, as well as one of the most efficient processor architectures out there, and the 5800X remains a very high end processor for this purpose. I would go a step further on that, the 5700X is the CPU that most people looking to play video games, including those looking to build with an Ada GPU.



We don't care whether the 5000 series is still capable or what most people have.
In reviews we care about real numbers.

It's the same as when we test CPUs and run a 4090 at 720p.
We don't care that it never happens in practice. We only care about the numbers and the differences.

Yes, a 5800X was a bad choice for a GPU review, but it's understandable: every reviewer would have to retest every GPU on the latest and greatest CPU, and there's not always time for that.
W1zzard will do that, obviously, when the 13900K arrives. And this time it would be great if we had GPU usage at some point, so we know whether there is a CPU bottleneck.


----------



## adilazimdegilx (Oct 11, 2022)

BigMack70 said:


> Um.... from the portion of the review conclusion that I've been criticizing (because it makes no sense from the data)?
> 
> I'll quote it again, for reference. *Emphasis *added.
> 
> ...


I think W1z is correct with this one, but I agree it's not obvious, as it's not explained clearly.
The 4090 does indeed make the biggest jump between generations since (at least) the 6xx series. (I didn't check older generations, but it probably holds back to the 8xxx series; TPU reviews only.)

*This 'gap' is between the fastest new generation card 'at launch' versus the fastest old generation card at that time. *

So it's between 4090 and 3090ti and it's 45% at 4K.
It was close to this (43%) between 3090 (launched 1 week after 3080 but still counts I guess) and 2080ti.
39% between 2080ti and 1080ti.
Only 23% between 1080 and Titan X (1080ti released 1 year later).
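The gen-over-gen gaps listed above are just flagship-vs-flagship ratios at launch; a sketch (the 145-vs-100 scores below are hypothetical relative-performance numbers for illustration, in the style of TPU's summary charts):

```python
def launch_gap_pct(new_perf: float, old_perf: float) -> float:
    """Percent uplift of the fastest new-gen card over the fastest
    previous-gen card available at launch time."""
    return (new_perf / old_perf - 1) * 100

# If the new flagship scores 145 where the old flagship scores 100,
# the launch gap is 45%.
gap = launch_gap_pct(145, 100)
```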


----------



## P4-630 (Oct 11, 2022)

damric said:


> it's finally time to test the most powerful cards at 8k.





Gameslove said:


> Waiting also 5K, 6K, 8K test. And PCI-E 3.0 vs PCI-E 4.0?


Here you go, 8K gaming tested in various games:








Nvidia GeForce RTX 4090 - Olifant in de porseleinkast ("elephant in the china shop")
The Nvidia GeForce RTX 4090 Founders Edition is a gigantic video card that has to deliver a serious performance gain. Is the card as fast as it is big?
tweakers.net


----------



## metalslaw (Oct 11, 2022)

CallandorWoT said:


> Doesn't HDMI 2.1 already max out at like 4k 240hz and 8k 144hz? I mean who would ever need more than that mate?


If I'm going to run triple screen 4k for a sim rig, I'm limited to a max of 97 hertz with 1.4a at 4:4:4.

With that number not being 120/144 hertz, this would also nullify freesync/g-sync use at 4k, as most screens only offer the magical 2.4x range required, at min to max hertz range of 50 hertz to 120 hertz, or 60 to 144.

This of course wouldn't be a problem if the gfx cards still had a 3 DP, 3 HDMI, arrangement (as I could then use 3x HDMI instead). But every 3090 is currently maxed at 2 hdmi ports (and 3 1.4a DP ports), so I'm stuck with 1.4a for triple screen, with no ability to run triple 4k 4:4:4 with g-sync/freesync on in that triple setup.
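That ~97 Hz figure is roughly what the DisplayPort bandwidth math gives you. A back-of-envelope sketch, assuming HBR3's ~25.92 Gbps effective payload, 10-bit RGB (30 bits/pixel), no DSC, and ignoring blanking overhead (which is what pulls the real-world ceiling down from ~104 Hz toward the ~97 Hz quoted above):

```python
def max_refresh_hz(width: int, height: int, bits_per_pixel: int,
                   link_gbps: float = 25.92) -> float:
    """Upper bound on refresh rate for an uncompressed video signal
    over a link with the given effective payload bandwidth.
    Ignores blanking intervals, so real limits are a few Hz lower."""
    bits_per_frame = width * height * bits_per_pixel
    return link_gbps * 1e9 / bits_per_frame

# 4K at 10 bpc RGB over DP 1.4 HBR3: ~104 Hz before blanking.
ceiling = max_refresh_hz(3840, 2160, 30)
```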


----------



## Colddecked (Oct 11, 2022)

Dr. Dro said:


> I dunno man but I want some of what you guys are on calling it low-mid, but I want some, it must be the good stuff. You guys referring to it as if it were some sort of antique lmao
> 
> Vermeer remains one of the most competent gaming processors money can buy, as well as one of the most efficient processor architectures out there, and the 5800X remains a very high end processor for this purpose. I would go a step further on that, the 5700X is the CPU that most people looking to play video games, including those looking to build with an Ada GPU.
> 
> ...



Just because its a mid level CPU doesn't mean it sucks.  Its now a very good value.  BUT it isn't the pinnacle of PC gaming, that's all.


----------



## TheoneandonlyMrK (Oct 11, 2022)

metalslaw said:


> If I'm going to run triple screen 4k for a sim rig, I'm limited to a max of 97 hertz with 1.4a at 4:4:4.
> 
> With that number not being 120/144 hertz, this would also nullify freesync/g-sync use at 4k, as most screens only offer the magical 2.4x range required, at min to max hertz range of 50 hertz to 120 hertz, or 60 to 144.
> 
> This of course wouldn't be a problem if the gfx cards still had a 3 DP, 3 HDMI, arrangement (as I could then use 3x HDMI instead). But every 3090 is currently maxed at 2 hdmi ports, so I'm stuck with 1.4a for triple screen, with virtually no ability to run triple 4k 4:4:4 with g-sync/freesync on in that triple setup.


I think the dire video-output situation on these cards is on purpose; a lifetime limiter, so to speak.


----------



## antuk15 (Oct 11, 2022)

So we get ~70% more performance on average from ~60% more shaders plus ~50% higher clock speed.

Architecturally, it's not scaling very well compared to previous generations.
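That scaling observation can be put in numbers; a minimal sketch using the poster's approximate figures (~60% more shaders, ~50% higher clocks, ~70% realized gain), treating shaders-times-clocks as a naive theoretical upper bound:

```python
def scaling_efficiency(shader_gain: float, clock_gain: float,
                       realized_gain: float) -> float:
    """Fraction of the naive theoretical throughput increase
    (shader count x clock speed) that shows up as measured
    performance."""
    theoretical = (1 + shader_gain) * (1 + clock_gain)
    return (1 + realized_gain) / theoretical

# 1.6 x 1.5 = 2.4x theoretical vs 1.7x measured: ~71% of theory.
eff = scaling_efficiency(0.60, 0.50, 0.70)
```

Real scaling is never linear in shader count (memory bandwidth and occupancy limits intervene), so this ratio overstates how "badly" the architecture scales, but it captures the poster's point.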


----------



## evernessince (Oct 11, 2022)

RandallFlagg said:


> It is you who is trying to interject "market price" from 3-4 years ago.  I am using MSRP.  I'll keep on using MRSP.  The reasoning for using that to compare has been explained ad nauseum not only here, but on countless other sites.  If demand is too high, then the AIBs and retailers pocket the difference.  If it's too low, then they take a hit.  MSRP is the only thing we have, anything else is chaos and cherry picking.
> 
> My comment about paying $309 was just pointing out that I got a discount on the card by buying late in the cycle, since you seemed dead set on comparing discounted prices to new release prices, or high demand scalp prices to MSRP.  Anyone can do that, just cherry pick launch vs mid cycle vs late cycle, crypto bust vs crypto mania.  That's chaos.
> 
> ...



Are you contesting the fact that AIB models were more expensive than Founders Edition cards? That's an argument certain to fail. That fact alone shows that using only MSRP while ignoring street price is misguided, let alone the 3000 series' MSRP versus what people actually paid. It's one thing to use MSRP when people could actually purchase at that price, like with the 10xx series; that's completely fine. It's not fine to use only MSRP when the vast majority of people are not getting MSRP prices, whether because of scalpers or Nvidia's Founders Edition shenanigans. The goal of any price comparison is to compare the prices people actually paid; it makes no sense to ignore the factors that affect that.

You are using MSRP? Your $309 2060 KO comment says otherwise.

No, you seem to be using whatever numbers happen to support your point, and you are clearly biased by your 2060 ownership.

MSRP has been plainly misleading in many cases over the last two generations, and the 2000 series was a terrible generation overall. If you can't agree on those two points, you are just denying reality and we have nothing further to discuss. No amount of goalpost moving will change that.


----------



## erek (Oct 11, 2022)

birdie said:


> The biggest issue of this generation is that other cards despite costing an arm and a leg are castrated too much, not to mention that NVIDIA had no qualms calling RTX 4070 an RTX 4080.
> 
> It even looks like there will be no RTX 4060/4050 this gen and Nvidia will continue selling the previous generation stock.


Won’t they rebrand the 3050 through 3060 to new generation parts?


----------



## evernessince (Oct 11, 2022)

Dimestore said:


> I don't see how it is possible to generate an intermediary frame between two frames using temporal (time-based) data and not require that the second frame be rendered before the interpose.  Frame1 - Interframe - Frame2, in order to create interframe you must have the already rendered Frame1 and Frame2, and by the time you push Frame1 to the screen it is behind by a frame, giving the 1/3 extra latency.  Is this incorrect?



DLSS 3.0 frame insertion does indeed wait for frame 2 in your example before generating the intermediary frame.
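The latency cost of that wait can be sketched with simple arithmetic (a back-of-the-envelope model of the hold-back only, not NVIDIA's actual pipeline; the function name and the one-frame-hold assumption are mine):

```python
# Sketch: if frame 1 cannot be scanned out until frame 2 has rendered,
# frame 1 is held back by roughly one rendered-frame interval.
def added_latency_ms(rendered_fps: float) -> float:
    """Extra display latency from holding frame 1 until frame 2 exists."""
    return 1000.0 / rendered_fps

# At 60 rendered FPS the hold-back is about 16.7 ms; at 120 FPS, about 8.3 ms.
print(f"{added_latency_ms(60):.1f} ms, {added_latency_ms(120):.1f} ms")
```

So the higher the base (pre-generation) frame rate, the smaller the added-latency penalty.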


----------



## cvaldes (Oct 11, 2022)

erek said:


> Won’t they rebrand the 3050 through 3060 to new generation parts?



One would presume so although no one here has a timetable. Clearly NVIDIA is trying to draw down inventory of assembled graphics cards in the channel as well as Ampere GPU chips.

Unfortunately for NVIDIA, crypto mining demand has completely evaporated due to the crypto market crash and Ethereum's PoS merge, so they don't have the option of selling off excess Ampere GPU chips in mining cards.

At some point, they will sell out of a particular Ampere GPU and then it'll be time for NVIDIA to decide whether it's worth transitioning the low end products to the Ada Lovelace generation. Or they can tell Samsung to fire up their foundry and churn out another batch of those Amperes.

From a marketing standpoint, I'm sure they would rather have their entire product line on the latest generation rather than straddle both Lovelace and Ampere.

But remember that two years ago when Ampere launched, they marketed the 3090 and 3080 while still selling 2060s. The 3060 and 3050 didn't come until later.


----------



## QUANTUMPHYSICS (Oct 11, 2022)

As we can see from the MSI SUPRIM LIQUID X (and even the Kingpin 3090 Ti), there's no good reason the cards have to be that size.


----------



## N/A (Oct 11, 2022)

antuk15 said:


> So we get ~70% more performance on average with ~60% more shaders + 50% more clock speed.
> 
> Architecturally it's not scaling very well compared to previous generations.



The 4090 is double the performance gen over gen (i.e., vs the non-Ti 3090) in games that are not CPU limited, like Resident Evil at 4K. And with frame insertion it probably goes up to 3x.
How is it supposed to scale when it uses the same memory type? It needs GDDR7 very badly, but no such luck. This behemoth is doomed to a kind of failure like Fermi. GDDR6X is broken; it does very little. The 3070 Ti is the same as the 3070, with no actual benefit from transferring data on four voltage levels, just a ginormous power loss.
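The scaling arithmetic being disputed here is easy to check with the figures from the quote (illustrative numbers taken from the post above, not measurements):

```python
# Quoted figures: ~60% more shaders, ~50% higher clocks, ~70% average uplift.
shader_gain = 1.60   # relative shader count
clock_gain = 1.50    # relative clock speed

theoretical = shader_gain * clock_gain   # ideal linear scaling: 2.4x
measured = 1.70                          # quoted average gain
efficiency = measured / theoretical      # fraction of ideal scaling realized

print(f"theoretical {theoretical:.1f}x, realized {efficiency:.0%} of ideal")
```

On those numbers only about 71% of the ideal linear scaling shows up on average, which is why both the "not scaling well" reading and the "CPU/memory limited" reading are defensible.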


----------



## cvaldes (Oct 11, 2022)

QUANTUMPHYSICS said:


> As we can see from the MSI SUPRIM LIQUID X (and even the Kingpin 3090 Ti), there's no good reason the cards have to be that size.



Water has a higher cooling capacity than air. 

However you're also moving the heat from one place (the graphics card PCB) to another (the radiator) which is placed in a location that is better for heat dissipation.

You can't just add up cubic centimeters of cooling solutions and say "the stock cooler doesn't need to be that thick."


----------



## mama (Oct 11, 2022)

Where's the 4080?  Cheeky... or sneaky?

A solid card.  Good uplift in rasterization, reasonable power draw (and according to GN a nifty power adapter included), good temperatures.  Not a lot not to like, except for the halo pricing.

But is it enough to be top dog?


----------



## ARF (Oct 11, 2022)

gffermari said:


> Yes, a 5800X was a bad choice for a GPU review, but it is understandable. Every reviewer would have to retest every GPU using the latest and greatest CPU, and there is not always time for that.
> W1zzard will do that, obviously, when the 13900K arrives. And this time it would be great if we had GPU usage at some point, so we know whether or not there is a CPU bottleneck.



I don't think the performance loss at 2160p is more than 5% compared to Core i9-12900K / Ryzen 7 5800X3D.

Look at a review with Core i9-12900K:
GeForce RTX 4090: 4K Gaming Performance - Nvidia GeForce RTX 4090 Review: Queen of the Castle | Tom's Hardware (tomshardware.com)


----------



## dgianstefani (Oct 11, 2022)

bobmeix said:


> Small typo on page 35: you can enable DLAA + FG


DLAA is a real technology: NVIDIA Deep Learning Anti-Aliasing, similar to DLSS but without rendering at a lower resolution and then upscaling. FG is frame generation.

DLAA is better than TAA, but doesn't offer the performance benefits of DLSS.


----------



## DemonicRyzen666 (Oct 11, 2022)

Whose 4nm is this, TSMC or Samsung?

From what I'm reading, TSMC 5nm is 88% more dense than TSMC 7nm.
TSMC 4nm is like an in-between node, a facelift of 5nm with around a 6%-22% density increase over 5nm, plus some power and frequency tweaks for high performance.
So around 94%-108%+ more dense vs Ampere, since Ampere was on Samsung's 8nm.
It seems like NVIDIA is just brute forcing everything. ¯\_(ツ)_/¯


----------



## Dr. Dro (Oct 11, 2022)

DemonicRyzen666 said:


> who's 4nm is this TSMC or Samsung ?
> 
> from what I'm reading 5nm TSMC is 88% more dense than 7nm on TSMC
> That 4nm TSMC is a like an inbetween node with a facelift of 5nm for around 6%-22% increase in density compare to 5nm, with some power tweaks & frequnce tweaks for high prefromance.
> ...



It uses the TSMC N4 node.


----------



## Fluffmeister (Oct 11, 2022)

It's now time to take aim at the CPU duopoly, pull your fingers out and give us faster CPUs.


----------



## DemonicRyzen666 (Oct 12, 2022)

Dr. Dro said:


> It uses the TSMC N4 node.


Thanks. Listening to LTT saying the clocks are 35% higher makes this card look kind of meh.


----------



## Dr. Dro (Oct 12, 2022)

DemonicRyzen666 said:


> thanks listening to the LTT saying the clocking being 35% higher makes this card look like meh



Yeah, architecturally speaking, this isn't exactly out of the ordinary. There are more execution units and significantly faster clocks, in addition to the power limit being raised very high (something that most Ampere models struggle with due to their very conservative power limits, unless you have ROG Strix/KPE GPUs or the 3090 Ti).

Still, the jump is very healthy, especially at 4K. I would wager the RTX 4090 Ti (aka this GPU perfected) will be one to remember.


----------



## Why_Me (Oct 12, 2022)

PapaTaipei said:


> Less than 15% improvements vs 3090ti on 1080p and if you don't care about that useless RT bs. Amazing! So much power draw and wasted transistors on RT and tensor cores... Btw the article would attract more ppl if it included CSGO and Overwatch 2.
> 
> Amazing to see this GPU with 78 BILLION transistors has less FPS vs a 3080 (less than 28b transistors) on some games that do not use RT and on 1080p all the while using up to 600 WATTS!!! This has to be a new record of stupidity.


Someone would have to be brain dead to purchase this card for gaming at 1080P.



SOAREVERSOR said:


> Competitive players are at 1080p on 24in monitors that do 240 or 360hz and they do not enable gsync/freesync.  Many of them don't have to pay for their stuff either.


Not sure what games you're talking about.  Most competitive FPS gamers with sponsors that play games such as Warzone play at 1440p.


----------



## RandallFlagg (Oct 12, 2022)

DemonicRyzen666 said:


> who's 4nm is this TSMC or Samsung ?
> 
> from what I'm reading 5nm TSMC is 88% more dense than 7nm on TSMC
> That 4nm TSMC is a like an inbetween node with a facelift of 5nm for around 6%-22% increase in density compare to 5nm, with some power tweaks & frequnce tweaks for high prefromance.
> ...



N4 isn't even an in-between 'half node'.  It's more like N5+.  6% greater density.  

But Samsung's 8LP was really just an enhancement to their 10LP node.  It was only 12% more dense.  

For comparison, Samsung 8LP is not even a full node jump over Intel's 14nm node; it's closer to a half-node jump from that, and not even 2/3 the density of Intel 7 (61 MT/mm2 vs 106 MT/mm2).

So in fact TSMC N5 and N4 are almost 3X more dense than Samsung 8LP (180 MT/mm2 vs 61MT/mm2).
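The "almost 3X" figure follows directly from the quoted peak densities (marketing peak numbers, not measured die densities):

```python
# Quoted peak transistor densities in MT/mm2.
density = {
    "Samsung 8LP": 61.0,
    "Intel 7": 106.0,
    "TSMC N5/N4": 180.0,
}

ratio = density["TSMC N5/N4"] / density["Samsung 8LP"]
print(f"TSMC N5/N4 vs Samsung 8LP: {ratio:.2f}x")
```

The same dictionary puts Intel 7 at roughly 1.7x Samsung 8LP, matching the "not even 2/3" comparison above.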


----------



## DemonicRyzen666 (Oct 12, 2022)

RandallFlagg said:


> N4 isn't even an in-between 'half node'.  It's more like N5+.  6% greater density.
> 
> But Samsung's 8LP was really just an enhancement to their 10LP node.  It was only 12% more dense.
> 
> ...


 That makes this card even worse than what I just said.


----------



## Denver (Oct 12, 2022)

RandallFlagg said:


> N4 isn't even an in-between 'half node'.  It's more like N5+.  6% greater density.
> 
> But Samsung's 8LP was really just an enhancement to their 10LP node.  It was only 12% more dense.
> 
> ...


Unfortunately, GPUs don't reach this reported density... this number is probably based on the best possible scenario: a small chip with little cache.


----------



## Dyatlov A (Oct 12, 2022)

Will it melt down a Corsair SF Series 750W Platinum PSU, or will it be OK?


----------



## wheresmycar (Oct 12, 2022)

@W1zzard glad we've got a larger number of games to compare and as always appreciate the reviews.

Out of the 3 titles "I play" @ 1440p, all 3 are showing a 60-100% increase in FPS over my current 2080 Ti (yep, I'm comparing 2 gens behind as it best reflects personal relevance). I have to admit, that's pretty impressive. A 4th played title sees the 4090 performing below AMD's top 3 cards, which was a little odd, but we'll let it pass.

2 of the titles "I play" shown in the ray tracing chart easily see a 2.5x~3x increase in perf. I haven't bothered with ray tracing as taxing performance ain't my thing... but the 40 series with RT enabled easily shoots beyond my display's 144Hz max refresh rate... I have to admit that's impressive!!!

Needless to say, DLSS sees equally impressive gains in the titles I'm playing.

Oh well, performance looks great, it's one hell of a teaser, BUT nah... sorry NVIDIA, I ain't wasting $1600 on a graphics card to over-fill your already full pockets. IMO, for gaming it's utter madness to even consider forking out this sort of cash for a GPU. I'll pay you half ($800) for an on-par but a couple of pegs down 4070 or similar, unless RDNA3 pulls the rug and steals your boots first. I can't get over it... even $800 IMO is way too much to invest in graphics muscle, but I'm willing to commit if the numbers shine. No wonder SLI was shot in the knees long ago... you naughty buggers always planned for this type of 2x-and-north-of-it increase in cost, seeing there was a small market for it. Flagship cards are no longer flagships but "pocketpits".

I read some comments suggesting _"oh but it's a flagship card, you don't need to buy into it, so why complain"_ - something along those lines. Wrong! These are price pre-engineered "pocketpit" cards, a reference point for NVIDIA to set the stage for higher MSRPs across the lower tiers. We've seen this since the 1000 series, with each gen hoarding wildly hungrier premiums at each launch. Going by the current trajectory, soon we'll be on a 5090 @ $1800-$2000 and then a 6090 @ $2300-$2500... with the 1-2-3-peg-down SKUs correspondingly following these fattened-up reference points, which is tough to stomach for the average, neglected, miles-wider majority of the consumer base. It's downright tier-down chaos!! I'm so close to pulling a finger at NVIDIA and accepting anything AMD drops for the sake of it (providing it's a decent lift over my current 2080 Ti). EDIT: actually no, a little more frankness: after years of buying into NVIDIA and kinda feeling snubbed, even if AMD's RDNA3 trails marginally and completely cocks up RT (or other features) by comparison... I think I'm well on my way already to the red team.  What's with these colours anyway... Red AMD, Green NV and Blue Intel... spells out RGB, and I've had enough of that too.


----------



## 80-watt Hamster (Oct 12, 2022)

Dyatlov A said:


> Will it melt down a Corsair SF Series 750W Platinum PSU or can be ok?



It'll probably be fine. SF has pretty robust protection IIRC, so it should shut down before melting down.


----------



## Xex360 (Oct 12, 2022)

defaultluser said:


> yeah, but identical launch day prices


Not really; MSRP isn't a real price. The real price is what you buy it for, and a good starting point is average prices in different markets.


CallandorWoT said:


> I actually think the 4090 is a good price, you need to think of it as the new Titan though. This card isn't meant for average gamers, it's meant for 4k stuff and 4k gamers pretty much exclusively. I honestly am not sad at all its out of my price range. I just hope RDNA3 can get me at 165 fps 165hz 1440p in games like cyberpunk 2077 with raytracing turned off.
> 
> That's all I want. lol


Except it's not a Titan. Had it been one, it would be a steal; a true Titan is more akin to a Quadro than a GeForce. People are just confused (probably on purpose by nVidia, even though they never say it's a Titan).


----------



## Crackong (Oct 12, 2022)

GerKNG said:


> you show us the LTT benchmark where they forgot to disable DLSS.
> this is not a real result.



They said it was FSR.
They have issued a 'correction' and promised an update to the video.


----------



## Richards (Oct 12, 2022)

RandallFlagg said:


> N4 isn't even an in-between 'half node'.  It's more like N5+.  6% greater density.
> 
> But Samsung's 8LP was really just an enhancement to their 10LP node.  It was only 12% more dense.
> 
> ...


TSMC N5 has been exposed: it's 137 MT/mm2, not 180. Plus, Samsung 8LP uses high-density cells, not high-performance ones; that's why it doesn't clock high.


----------



## sepheronx (Oct 12, 2022)

Rasterization performance is great.  Price? Crap.  RT performance? Not that great. DLSS 3? cool I guess but I am not interested in theoretical input of frames vs the real deal.

Now my question is, why isn't there a separate card for RT, like there was a PhysX card?


----------



## RandallFlagg (Oct 12, 2022)

Richards said:


> TSMC N5 has been exposed: it's 137 MT/mm2, not 180. Plus, Samsung 8LP uses high-density cells, not high-performance ones; that's why it doesn't clock high.



None of them reach their calculated densities, largely because the calculated density is based on transistor size only.  The size of the interconnects / traces hasn't much advanced.  Also, there's no standard way of measuring more complex circuits, and so on.

Nevertheless, the 'predicted' max density based on transistor dimensions is still useful to compare relative size.


----------



## Ando (Oct 12, 2022)

So I like AMD as much as the next guy, but I’m fairly certain this is a mistake.


----------



## BigMack70 (Oct 12, 2022)

adilazimdegilx said:


> I think w1z is correct with this one. But I agree it's not obvious as it's not explained clearly.
> 4090 does indeed make the highest jump between new generations since (at least) 6xx series. (I didn't check older generations but it probably goes until 8xxx series, TPU reviews only).
> 
> *This 'gap' is between the fastest new generation card 'at launch' versus the fastest old generation card at that time. *
> ...


Yes, the "gap" referenced is a new flagship vs last generation's flagship. No, the conclusion in this review is not warranted by the data. It's straight up nonsense that only makes sense if you let Nvidia's marketing department dictate how you evaluate their "generations" of GPUs.

Here are the actual "generations" of Nvidia GPUs, by architecture:
Fermi (GTX 4xx and 5xx)
Kepler (GTX 6xx and 7xx)
Maxwell (GTX 9xx)
Pascal (GTX 10xx)
Turing (RTX 20xx)
Ampere (RTX 30xx)
Ada (RTX 40xx)

Using any other criteria to determine a "generation" of GPU is ignorant and misleading. When one speaks of "generations" of graphics cards, one speaks of GPU architectures, not specific SKUs.

And here are the performance jumps from the top flagship card in each generation to the next, based on TPU's own data:

Tesla 2.0 (GTX 285) --> Fermi (GTX 580) = 67% performance jump gen on gen
Fermi (GTX 580) --> Kepler (GTX 780 Ti) = 104% performance jump gen on gen
Kepler (GTX 780 Ti) --> Maxwell (Titan X) = 45% performance jump gen on gen
Maxwell (Titan X) --> Pascal (Titan Xp) = 72% performance jump gen on gen [Titan Xp and 1080 Ti were very similar, 1080 Ti a couple points faster]
Pascal (GTX 1080 Ti/Titan Xp) --> Turing (RTX 2080 Ti) = 39% performance jump gen on gen
Turing (RTX 2080 Ti) --> Ampere (RTX 3090 Ti) = 56% performance jump gen on gen
Ampere (RTX 3090 Ti) --> Ada (RTX 4090) = 45% performance jump gen on gen

If you assume the "true" Ada flagship will be a 4090 Ti, and that card will be 10% faster than the 4090, then Ada is a 59% performance jump gen on gen.

In no way is this remarkable. It's decidedly *un*remarkable. Ordinary. Expected. Typical. *If you exclude Kepler, which WAS extraordinary and remarkable, the average gen on gen performance jump is 54% per generation. *
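The ex-Kepler average in the post can be reproduced straight from the listed figures (TPU relative-performance numbers as quoted above):

```python
# Gen-on-gen flagship uplifts listed above, in percent.
jumps = {
    "Fermi": 67, "Kepler": 104, "Maxwell": 45, "Pascal": 72,
    "Turing": 39, "Ampere": 56, "Ada": 45,
}

# Average excluding the Kepler outlier, as the post does.
without_kepler = [v for arch, v in jumps.items() if arch != "Kepler"]
avg = sum(without_kepler) / len(without_kepler)
print(f"average ex-Kepler: {avg:.0f}%")   # -> 54%
```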


----------



## neatfeatguy (Oct 12, 2022)

evernessince said:


> DLSS 3.0 frame insertion does indeed wait for frame 2 in your example before generating the intermediary frame.



I'm sorry, but the only way I can picture this and how it happens is the scene in Fight Club where Tyler splices a single frame of porno into a family movie.....just a magical frame that's not really needed, but it shows up without you even really noticing.


----------



## matar (Oct 12, 2022)

Wow, this makes the RTX 3090 Ti look like an RTX 3060: the 3090 Ti next to the RTX 4090 is like a 3060 next to a 3090 Ti.


----------



## PapaTaipei (Oct 12, 2022)

Why_Me said:


> Someone would have to be brain dead to purchase this card for gaming at 1080P.
> 
> 
> Not sure what games you're talking about.  Most competitive FPS gamers with sponsors that play games such as Warzone play at 1440p.


Warzone is not a competitive game. I'm talking about CSGO, Quake Live/Quake Champions, Overwatch, Valorant, StarCraft 2.


----------



## Why_Me (Oct 12, 2022)

PapaTaipei said:


> Warzone is not a competitive game. I'm talking about CSGO, Quake Live/Quake Champions, Overwatch, Valorant, StarCraft 2.


I'm talking about gamers that have actual sponsors, and yes, there are plenty of those that play Warzone.  The reason for 1440p *>* 1080p is sniping, as in depth. 240Hz 1440p is what it's about.  For the poors who don't have sponsors... then ya, 1080p for them.


----------



## Prima.Vera (Oct 12, 2022)

When you think a PS5 is $500 and can play ANY game, you kind of start to laugh at those ridiculous prices from nGreedia. I mean, really. In the end it's just a game you're playing. 
Any price above $800 for a GPU is not worth it at all.


----------



## ModEl4 (Oct 12, 2022)

W1zzard said:


> As mentioned in the conclusion, it seems the driver overhead is higher than on Ampere, so this higher CPU usage will eat into the CPU time available for the game and thus make it even more CPU limited


Regarding why the 4090 is slower than Ampere models in some games at 1080p, it's not only higher driver overhead imo.
It's probably also the latency incurred by being a 12 GPC (11 active) design. If I'm right, the RTX 4080 16GB (being a 7 GPC design) will be slightly faster than the RTX 4090 in these specific games at 1080p; we will see if my hypothesis is correct next month.


----------



## swaaye (Oct 12, 2022)

Prima.Vera said:


> When you think a PS5 is 500$ and can play ANY game, you kind of start to laugh of those ridiculous prices from nGreedia. I mean, really. In the end is just a game you're playing.
> Any price above 800$ for a GPU is not worthing at all.


All this new hardware and the end of GPU mining is creating some lovely prices on all that now apparently completely undesirable hardware that many of these guys paid a lot of money for a year or two ago.


----------



## Bwaze (Oct 12, 2022)

matar said:


> Wow, this makes the RTX 3090 Ti look like an RTX 3060: the 3090 Ti next to the RTX 4090 is like a 3060 next to a 3090 Ti.



Yeah, that's what new generation does. RTX 3090 did that to RTX 2080 Ti. As has been shown, rasterisation shows quite a normal and expected leap - and there is no "2 - 4x" leap even if you look at raytracing or DLSS. 

Except this time around there is (yet) no cryptominers that would leap to such an expensive card. 

And so the fall in next quarter's revenue will be squarely on us, unreliable gamers.


----------



## The Von Matrices (Oct 12, 2022)

Dr. Dro said:


> It's NV alright. Prices are historically high even in America which gets dirt cheap hardware anyway. That said it is not out of line compared to the 3090's launch price, and honestly, anyone with a GA102-based GPU should not be running around with their hair on fire, it's just FOMO, really.
> 
> If you spent $800-$1600+ on a high end GPU, give it at least 3 years before you replace it. I guarantee anyone on a 3080+ is going to be comfortable gaming on ultra high settings for the next year if not more.


As a 3090 owner, I'm debating whether to buy. 

On one hand, it costs a lot of money at $1600.

On the other hand, each GPU generation now lasts about 2 years and prices move very little within those two years (absent external events like cryptocurrency crashes). Economics suggests that if I am going to buy it at any point in the next 2 years, I should buy as soon as possible. If I wait until 2023 or 2024 to buy, it will be the same price, and I will have given up 1-2 years I could have been using it.

It's an interesting consideration.


----------



## Xex360 (Oct 12, 2022)

BigMack70 said:


> Yes, the "gap" referenced is a new flagship vs last generation's flagship. No, the conclusion in this review is not warranted by the data. It's straight up nonsense that only makes sense if you let Nvidia's marketing department dictate how you evaluate their "generations" of GPUs.
> 
> Here are the actual "generations" of Nvidia GPUs, by architecture:
> Fermi (GTX 4xx and 5xx)
> ...


Such wisdom.
I would say nVidia marketing is very powerful; some philistines even call RT "RTX", and some reviewers (like DF) are basically nVidia's marketing tools (if you'd like similar analysis, check NX Gamer, who is much more knowledgeable and impartial).
Another thing: people tend to think about Turing most of the time, where performance stagnated (2080 vs 1080 Ti), hence lots of people got the impression that the 3080 was a huge step up.
I believe sites like TechPowerUp should stand up to this mediocrity; maybe the community here (which BTW is great compared to other fanboy-filled sites) can help avoid falling into the trap of marketing, be it nVidia's, Intel's or AMD's.


----------



## Dirt Chip (Oct 12, 2022)

Gameslove said:


> This graphics card is designed for 4K and 8K gaming, not less. It's ridiculous using this powerful card for 1080p.
> For 4K gaming the 5800X is quite enough here.
> 
> @W1zzard
> Thanks for the review. Waiting also 5K, 6K, 8K test. And PCI-E 3.0 vs PCI-E 4.0?


One must have 500fps on FHD so no cpu in sight is enough...


----------



## Legacy-ZA (Oct 12, 2022)

Dyatlov A said:


> Will it melt down a Corsair SF Series 750W Platinum PSU or can be ok?




Hard to say; it depends on what else you have in your system and how much power it draws. 

I would recommend an 850W Gold/Platinum PSU at minimum to be safe. If you slot an RTX 4090 into a system with a 750W PSU, it probably won't POST, or if it does, you will have to power limit the card in MSI Afterburner before use. Even with the power limit, your machine may "dip" out and reboot under load.


----------



## Sake (Oct 12, 2022)

CallandorWoT said:


> I actually think the 4090 is a good price, you need to think of it as the new Titan though. This card isn't meant for average gamers, it's meant for 4k stuff and 4k gamers pretty much exclusively. I honestly am not sad at all its out of my price range. I just hope RDNA3 can get me at 165 fps 165hz 1440p in games like cyberpunk 2077 with raytracing turned off.
> 
> That's all I want. lol


It is very expensive; the price has nothing to do with the MSRP.


----------



## Luminescent (Oct 12, 2022)

I am surprised by how amazed reviewers are by the RTX 4090's performance.
If you look at the numbers, this was to be expected.
RTX 3090 Ti: 28,300 million transistors and lower clocks. RTX 4090: 76,300 million transistors and much higher clocks.
While they claim they made some architectural changes, plus the usual marketing of "second, third, fourth generation...", I attribute this gain to TSMC, and to NVIDIA taking a look at what AMD did with Infinity Cache and doing something similar; just look at that L2 cache now.
All this gain comes from a much improved process node; this couldn't be done on the previous Samsung 8nm.


----------



## rrrrex (Oct 12, 2022)

Transistors 3x, clock 1.35x, but performance 1.5x.


----------



## Dyatlov A (Oct 12, 2022)

Why_Me said:


> Just about everything in the EU is priced insane.  Do you ever think it might be the EU and not Nvidia?


I blame Biden primarily for the super high prices; the war is sponsored from our savings, as endless mad money printing causes huge inflation and our money is always worth less. Better to spend our savings on a 4090, because in a year they will be worth half again.


----------



## Why_Me (Oct 12, 2022)

Sake said:


> It is very expensive; the price has nothing to do with the MSRP.


That's expensive!  

https://www.newegg.com/p/pl?N=100007709 601408872 *<---* US prices atm.


----------



## N/A (Oct 12, 2022)

rrrrex said:


> Transistors 3x, Clock 1.35x, but Perfomance 1.5x


That 1.5x number includes a lot of CPU-limited games that don't scale very well above 100 FPS. The frame rate doubles at 4K in Resident Evil. This will be the case for many games coupled with a 14900KS next year, triple with frame generation, and could quadruple with GDDR7 in the next gen or a refresh. This is still suboptimal, no way around it. Work in progress. RT cores, tensor cores and more L2$ do take a lot of space, but apparently it's the way forward.


----------



## Luminescent (Oct 12, 2022)

I am from the EU and I'm telling you, very few people will buy an RTX 4090; we have much bigger problems, and a 400W GPU is not something anyone wants in a household facing possible energy rationing.
Aside from this, *look at the share prices of Nvidia, Intel and AMD*: they continue to drop even though they released new shiny toys. Very few people care anymore about flagship GPUs and CPUs.
I predict GPUs and CPUs will get cheaper, even sold at a loss if needed just to get rid of them, in a year or two; there is no interest from the masses in this anymore.


----------



## clopezi (Oct 12, 2022)

Luminescent said:


> very few people will buy an RTX 4090



I expect a total sell-out. There are so many people in the world, and everyone has a different situation.



Luminescent said:


> with possible rations for energy.



No energy rationing is expected for any household, maybe only for industry. For example, here in Spain we don't have any problems with this; we have a lot of renewable energy and other solutions. Many countries in Europe don't have these problems, but as always, Germany's problems have to be everyone's problems.



Luminescent said:


> Aside this, *look at the share price of Nvidia, Intel and AMD*, they continue to drop even though they released new shiny toys, very few people care anymore about flagship GPU's and cpu's.



That has nothing to do with these models; all semiconductor companies are down in the market because of the global situation (China chip bans, etc.). For example, in 2020 these companies went up a lot with the same sales. 

However, Nvidia has been growing its revenue for... 6 years? Every quarter, without interruption.




Luminescent said:


> I predict GPU's and cpu's will get cheaper even at a loss if need it just to get rid of them in a year or two, there is no interest from the masses for this anymore.



That's my wish too, but it's impossible while only 2-3 companies in the world can make 4nm/3nm chips.


----------



## 1d10t (Oct 12, 2022)

Definitely a fast card, but obviously not for me. I'm more interested in how AMD counters this core-heavy beast with their chiplet design.


----------



## Bwaze (Oct 12, 2022)

Luminescent said:


> I predict GPU's and cpu's will get cheaper even at a loss if need it just to get rid of them in a year or two, there is no interest from the masses for this anymore.



Not necessarily. 

When the crypto market broke in early 2018, prices of Pascal cards (GTX 10x0) remained very high until the fall, when Turing (RTX 20x0) came out with zero price/performance increase. Nvidia had several bad quarters, and allegedly a huge stock of unsold high-end Pascal chips. People were guessing whether we were going to see a flood of cheap GTX 1080s and 1080 Tis, or some product based on them, but they just disappeared. 

Throwing the excess in the landfill and pushing on with high prices eventually paid off, with a new generation and a new crypto boom. 

Nvidia can well afford a year or two of drought. Sure, they will blame us gamers for the bad market results, but that won't lower the prices. 

And competitors are irrelevant. They reach only a small portion of the market, and are so limited in their ability to offer a huge number of products that they can only price match Nvidia, or lose income by selling cheaper.


----------



## tfdsaf (Oct 12, 2022)

Better than expected, but it's way overpriced for what it does! And Nvidia should stop with the 'cheat codes' to boost FPS, like frame generation and crap like that. At a $1600 starting price it's way worse in price-to-performance than any of the AMD cards, and much worse than Nvidia's own RTX 3080.

If this card had released at $1100, I'd say it's the best GPU released in the past 5-6 years, no doubt, but it's about $500-600 more expensive than it should be! $1600 just to be able to play a game at 80fps at 4K? Weren't we supposed to be gaming at 60+fps at 4K generations ago?

Oh NOW I can play old generation games at 60-80fps at 4k for "only" $1600, goody!

Gamers should STOP rewarding Ngreedia for way overpriced cards. $1200 should be the HIGHEST price anyone pays for the highest tier GPU, and it should be a 'Titan' class GPU: you get that for $1200, but you have the Ti version for $900, the vanilla version for $700, etc...

What I would have liked to see is the RTX 4090 starting at $1000, the RTX 4080 at $800, the RTX 4080 12GB at $700, the RTX 4070 at $500, and later on the RTX 4090 Ti at $1200.  This would also have mid-range GPUs return to somewhat sane prices, with the RTX 4060 at $350, RTX 4050 at $250, etc...


----------



## big_glasses (Oct 12, 2022)

W1zzard said:


> Would love to. Any recommendation that's not super CPU limited due to the POS that Unity is?


I'd personally love GTFO, but that'd be a proper POS to benchmark (due to RNG; maybe the devs would respond with a benchmark level, though I doubt it).
Wasteland 3, Hardspace: Shipbreaker, Desperados?
Valheim is probably too CPU limited, same with Raft. 

It's kinda hard, given a lot of Unity games either involve RNG or are "low graphical". I'd personally just love an indicator of how CPUs/GPUs perform with that engine.









Performance Benchmarking in Unity: How to Get Started | Unity Blog (blog.unity.com)



Maybe Unity devs could respond with a perf benchmark?


----------



## GoldenTiger (Oct 12, 2022)

W1zzard said:


> lol really? and I felt bad because I had only two games


You're one of the few... TheFpsReview had them up too, though.


----------



## gffermari (Oct 12, 2022)

The card and its price are fine. The 2080 Ti, 3090, and 4090 are in the same price range and are OK for enthusiasts.
The 2080, 3080, and 4080s should offer some value, but apart from the 1080 and 3080, the rest were a failure.

Even AMD's cards give a reasonable bump in performance every gen.
The problem is the CPUs and the game engines, which don't keep up with the performance increases.


----------



## R0H1T (Oct 12, 2022)

N/A said:


> That 1.5x number includes a lot of CPU-limited games that don't scale very well above 100 FPS. The frame rate doubles at 4K in Resident Evil. *This will be the case for many games paired with a 14900KS next year, triple with frame generation, and could quadruple with GDDR7 in the next gen or refresh.* This is still suboptimal, no way around it. Work in progress. RT cores, tensor cores, and more L2 do take a lot of space. But apparently it's the way forward.


Yeah and we'll have flying unicorns by then just as well


----------



## igralec84 (Oct 12, 2022)

Wouldn't mind one for 4K 144 Hz gaming, but not at 2000€... maybe 1700€ with gritted teeth  

Hoping the 7900 XT is priced at 1200€ and comes very close to the 4090 in performance, or is at least less than 25% slower


----------



## Luminescent (Oct 12, 2022)

clopezi said:


> I expect a total sell-out. There are so many people in the world, and everyone has a different situation.


Only if the stock is two GPUs, of course. I doubt retailers will stock up in the thousands for the RTX 4090; remember, mining is dead.



clopezi said:


> No energy rationing is expected for any household, maybe for industry. For example, here in Spain we don't have any problems with this; we have a lot of renewable energy and other solutions. Many countries in Europe don't have these problems, but as always, Germany's problems have to be everyone's problems.


I don't know about Spain, but in most of the EU energy prices have tripled or quadrupled, and there is no sign of it stopping.


clopezi said:


> It has nothing to do with these models; all semiconductor companies are down in market share because potatoes (and the global situation, China chip bans, etc.), but for example, in 2020 these companies went up a lot with the same sales.
> 
> However, Nvidia has been growing its revenue for... six years? Every quarter, without interruption.


Isn't 2020 the year pandemic restrictions hit?
I'll leave this written here so we can come back in a year or two. October 12, 2022: no matter the GPU segment (low, mid or high end), Nvidia's sales will drop considerably and nobody will care. There is no more mining, most gamers have moved on, and the current inflation needs a recession.
They'll be lucky to sell a $300-400 GPU in a year; hopefully Jen-Hsun Huang set aside some money for employees when times get tough.


----------



## clopezi (Oct 12, 2022)

tfdsaf said:


> Nvidia should stop with the 'cheat codes' to boost fps like frame generation and crap like that.



Did you see the comparison? I had to view it frame by frame to see any differences. In real time, it looks the same to me.

DLSS 3 doesn't look like "crap"; it's a wonderful tool for turning 40 FPS games into 100 FPS games.


----------



## Rokugan (Oct 12, 2022)

Why_Me said:


> Just about everything in the EU is priced insanely. Do you ever think it might be the EU and not Nvidia?



I know it's the dreadful combination of the current EUR/USD rate being around parity and our stupidly high VAT. But still, a +25% increase over the US price is bonkers, period.
I'd buy a 4090 for 1,600 Eur. For 2K Eur I won't. Ever. Especially taking into account that I can buy a second-hand 3090 for 700 Eur.
Also, I'm 99% sure DLSS 3 will be implemented for the RTX 3000 series as well; the RTX 4000 exclusivity is just a temporary BS marketing stunt, with zero technical reason for it.



Luminescent said:


> I don't know about Spain, but in most of the EU energy prices have tripled or quadrupled, and there is no sign of it stopping.



In Spain prices have tripled as well. Renewables are significant in Spain, but we are heavily dependent on gas like the rest of Europe, and we import a lot of nuclear-generated electricity from France



clopezi said:


> No energy rationing is expected for any household, maybe for industry. For example, here in Spain we don't have any problems with this; we have a lot of renewable energy and other solutions. Many countries in Europe don't have these problems, but as always, Germany's problems have to be everyone's problems.



Stupid beyond comprehension. I'm in Spain, and renewables DON'T guarantee a continuous 24/7 supply, because the energy can't be stored, and they don't produce enough anyway.
Simple proof: everyone's electricity bill in Spain has tripled. And the clowns in our government have already warned that there could be energy cuts this winter.
Don't spread BS


----------



## clopezi (Oct 12, 2022)

Rokugan said:


> Simple proof is that everyone's electricity bill in Spain has tripled. Don't spread BS



I didn't say anything about prices, but about possible rationing. We are now serving, in addition to 100% of domestic demand, 50% of Portugal's energy and 20% of France's


----------



## Rokugan (Oct 12, 2022)

clopezi said:


> I didn't say anything about prices, but about possible rationing. We are now serving, in addition to 100% of domestic demand, 50% of Portugal's energy and 20% of France's



Do you understand supply and demand? Why do you think prices have tripled if we have so much energy to spare?
Answer: we don't. Renewables produce peaks that are lost if not consumed, so they get exported.
But when renewables aren't available, we burn gas and import nuclear-generated electricity from France like crazy.
Renewables are a band-aid, not a cure. And in winter demand will rise and prices will go even higher.

Also, Pinocchio Sánchez has already warned everyone there could be energy cuts. Probably they want to steal even more money.


----------



## Valantar (Oct 12, 2022)

@W1zzard Most other publications seem to have found DLSS 3 so broken as to be essentially useless, yet you note no major problems. Any idea why?

Also, those power numbers are _very_ different from what some other publications (GN, LTT) report, to the tune of ~100 W in games. Makes me curious as to why that is: is CP2077 somehow CPU limited in the test scenario? Or is RTX 4000 power management just really weird and unpredictable between different games?


Other than that though, the absolute performance of this is quite impressive. To the degree that, well, buy this GPU and you'll be set for ... five years? More? Of high resolution, high frame rate, high settings gaming, with few compromises.

In light of which I can't help but foresee the GPU market shrinking _massively_ in the next few years. Yes, there are tons of people on five-year-old GPUs who would be in the market for an upgrade right about now, but those are five-year-old $200 GPUs, and these people aren't spending $900+ on an upgrade. And these will no doubt sell to wealthy enthusiasts, but there aren't _that_ many of those globally. And then there's the glut of now-previous gen cards, and the entry of Intel into the market (which is looking somewhat passable, at least for a first-gen effort). My hope? That AMD doesn't bother to compete all that hard with this, but rather puts out a couple of high-end SKUs before rapidly moving on to a highly competitive, cost-optimized midrange and lower midrange. Maybe an RX 7500 XT that's mostly a ~$150-200 die shrink of the RX 6600? That would be awesome, and would actually have a chance at selling in volume, unlike these billionaire yacht-style flagships.


----------



## W1zzard (Oct 12, 2022)

Valantar said:


> Most other publications seem to have found DLSS 3 so broken as to be essentially useless, yet you note no major problems. Any idea why?


NVIDIA gave the press a huge guide with lots of info on how to use DLSS 3. One requirement is to have HAGS enabled; also turn off V-Sync, don't use certain overlays, etc. Maybe they didn't read the guide? I did encounter crashes when changing settings in some games, but I'm 100% sure these are just minor implementation details that are easy to fix in final builds. Publishers gave us beta access to preview builds, so this isn't unexpected.

The visuals you can see and judge for yourself. Once you're trained to look at the fence area and the tyres in F1, you can see it at normal playback speed, but not while you're concentrating on driving the car. At least that's my experience. Zero difference in MSFS, even in stills



big_glasses said:


> I'd personally love GTFO, but that'd be a proper POS to check (due to RNG) (maybe the devs will respond with a benchmark level; doubt it)
> Wasteland 3, Hardspace: Shipbreaker, Desperados?
> Valheim is probably too CPU limited. Same with Raft.


Wasteland 3 is super CPU limited; I tried it for Unity. I have doubts about the others too, and I don't want a POS or an always-online game making my benching life miserable .. remember I'm at 25 titles now


----------



## antuk15 (Oct 12, 2022)

N/A said:


> The 4090 is double the performance in games that are not CPU limited, like Resident Evil at 4K, gen over gen, meaning versus the 3090 non-Ti. And with frame insertion it probably goes up to 3x.
> How is it supposed to scale when it uses the same memory type? It needs GDDR7 very badly, but no such luck. This behemoth is doomed to a Fermi-like failure. GDDR6X is broken; it does very little. The 3070 Ti is the same as the 3070: no actual benefit from transferring data at four voltage levels, just a ginormous power loss.



It doesn't need GDDR7, it needs HBM3.


----------



## ARF (Oct 12, 2022)

igralec84 said:


> Hoping 7900XT is priced at 1200€ and very close to the 4090 in performance or at least less than 25% slower



The RX 7900 XT won't be that slow, for sure. 25% slower would make it only a miserable 20-30% faster than the RX 6950 XT.

Meanwhile, the RTX 4090 Ti is around the corner: NVIDIA GeForce RTX 4090 Ti Rumored To Be Up To 20% Faster Than 4090: Specs Include 18176 Cores, 475W TGP, 96 MB Cache & Almost 3 GHz Boost (wccftech.com)


----------



## P4-630 (Oct 12, 2022)

About the 4090 Ti









According to the rumor, NVIDIA is already saving AD102 GPUs for GeForce RTX 4090 Ti (videocardz.com)


----------



## ARF (Oct 12, 2022)

24 GB is overkill. The jump from lower models to higher models shouldn't be this steep.

Current: 12 - 16 - 24 - 24.

Better: 16 - 18 - 20/22 - 24.


----------



## Why_Me (Oct 12, 2022)

ARF said:


> 24 GB is overkill. The jump from lower models to higher models shouldn't be this steep.
> 
> Current: 12 - 16 - 24 - 24.
> 
> Better: 16 - 18 - 20/22 - 24


I'm thinking 20GB for the 4080 Ti.


----------



## Xex360 (Oct 12, 2022)

Sake said:


> It is very expensive; the price has nothing to do with the MSRP.
> View attachment 265104


Damn, that's even worse than I thought. 
The review should reflect this, especially in the price/performance ratio.


----------



## Dr. Dro (Oct 12, 2022)

The Von Matrices said:


> As a 3090 owner, I'm debating whether to buy.
> 
> On one hand, it costs a lot of money at $1600.
> 
> ...



I had to blow what I'd saved for this on emergency repairs for my setup. I planned on upgrading, but now it's going to be the inevitable Ti model. Honestly, as another 3090 owner I'm telling you, the FOMO should blow over. You are not missing out on much, if anything; the wise thing for people like us is to buy the 4090 Ti or AMD's 7950 XT (Navi 31 refresh?)


----------



## big_glasses (Oct 12, 2022)

Xex360 said:


> Damn that's even worse than what I thought.
> Review should reflect this, especially in the ratio price/performance.


It's why I'm generally not a fan of price/perf: it's too country/region dependent, and most (English) tech sites have a very wide, global audience. 
I do like what W1zzard has done with 4 "tiers" of price/perf for the new card; it's the best solution I've seen so far. But there should have been a tier more expensive than MSRP


----------



## Bwaze (Oct 12, 2022)

big_glasses said:


> I do like what W1zzard has done with 4 "tiers" of price/perf for the new card; it's the best solution I've seen so far. But there should have been a tier more expensive than MSRP



But what's the point in comparing a "$1000" 4090 to other cards at their real prices? Does anyone believe that's a possibility?


----------



## Why_Me (Oct 12, 2022)

Xex360 said:


> Damn that's even worse than what I thought.
> Review should reflect this, especially in the ratio price/performance.


That's the EU, where pretty much everything is expensive. 

https://www.newegg.com/p/pl?N=100007709 601408872 *<----* US prices.


----------



## Valantar (Oct 12, 2022)

Bwaze said:


> But what's the point in comparing a "$1000" 4090 to other cards at their real prices? Does anyone believe that's a possibility?


Agreed, that's rather weird. Below MSRP is fine, but two-thirds of the price? Not happening. And most partner models will be more expensive. Rather than $1000-1200-1400-1600 it should have been something like $1400-1600-1800-2000.


----------



## Bwaze (Oct 12, 2022)

Yeah. After months of "plummeting prices", the RTX 3080 is still at or above its launch MSRP in Europe...


----------



## Valantar (Oct 12, 2022)

Bwaze said:


> Yeah. After months of "plummeting prices" RTX 3080 is still at or above starting MSRP in Europe...


What's the EU MSRP for those? If you're thinking of US$ MSRPs, those don't include sales tax/VAT/GST, so you need to add however much your country charges on top of the US MSRP. Blame the silly Americans with their wildly variable sales tax rates.


----------



## ARF (Oct 12, 2022)

Valantar said:


> What's the EU MSRP for those? If you're thinking of US$ MSRPs, those don't include sales tax/VAT/GST, so you need to add however much your country charges on top of the US MSRP. Blame the silly Americans with their wildly variable sales tax rates.



It's not only VAT. There are pricing differences between countries with the same 19-20% VAT.
It really depends on how greedy the retailers are.

I think the lowest prices in the EU are in Germany.


----------



## Bwaze (Oct 12, 2022)

Valantar said:


> What's the EU MSRP for those? If you're thinking of US$ MSRPs, those don't include sales tax/VAT/GST, so you need to add however much your country charges on top of the US MSRP. Blame the silly Americans with their wildly variable sales tax rates.


It should be about US MSRP + VAT, so in Germany about 810 EUR. There are a few models around that price, and many far above it, after months of news about how prices are falling.


----------



## Xex360 (Oct 12, 2022)

big_glasses said:


> It's why I'm generally not a fan of price/perf: it's too country/region dependent, and most (English) tech sites have a very wide, global audience.
> I do like what W1zzard has done with 4 "tiers" of price/perf for the new card; it's the best solution I've seen so far. But there should have been a tier more expensive than MSRP


The problem with some English tech sites is that they think there's only one market, the US market (ironic given that the US gave up on English a century ago).
I noticed the 4-tier system, but it's still not representative at all; there's a huge difference between $1,600 and 3,100€. Maybe use Amazon's prices from different markets.
Another issue is tax: no one should show pre-tax prices, they're meaningless.


----------



## Valantar (Oct 12, 2022)

ARF said:


> It's not only VAT. There are pricing differences between countries with the same 19-20% VAT.
> It really depends on how greedy the retailers are.
> 
> I think the lowest prices in the EU are in Germany.


It's very little due to "greedy retailers", and more due to the realities of operating in smaller, less unified markets. The US has _massive_ economies of scale from being a wealthy, single-language market of 300 million people, meaning tons of sales of high-margin products with relatively low staffing needs and more opportunity to centralize various branches of the retail value chain. In contrast, while the EU is nominally a single market, it is still fragmented by language, distribution chains, localized economic factors, and more. This drives up prices simply because there are more people involved, more business costs that need covering to deliver the products onto retail shelves.

Germany being the cheapest in Europe just confirms this: it's the largest single market and has a relatively high income level, so it gets a lot of the same advantages as the US, just to a lesser degree due to smaller scale. The only major difference is that many brands have direct distribution in the US (meaning AMD, Nvidia, Asus, and others sell directly to retailers rather than to distributors), which is again a function of market size: running your own distribution business is _expensive_ and _difficult_.

After nearly a decade in electronics retail, even if that's a while back now, I can say with 100% certainty that for most retailers, margins (_especially_ on expensive products like GPUs and CPUs) are _razor-thin_, often below what is actually needed for the company to break even, let alone make any net profit.



Bwaze said:


> It should be about US MSRP + VAT, so in Germany about 810 EUR.


By my calculation and current USD-EUR conversion rates according to DuckDuckGo, that's more like €850. $699 × 1.03 × 1.19 = €856.8. And, of course, you never, ever get that conversion rate in the real world, so adding another few percent is pretty safe.
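That back-of-the-envelope conversion can be sketched in a few lines. To be clear, `eur_street_price` is just an illustrative helper, and the 1.03 EUR/USD rate, 19% German VAT, and 2% currency margin are the assumptions from the discussion above, not official figures:

```python
def eur_street_price(usd_msrp, eur_per_usd=1.03, vat=0.19, fx_margin=0.02):
    """Estimate an EU shelf price from a US MSRP.

    US MSRPs exclude sales tax, so VAT is added on top of the currency
    conversion; fx_margin covers the few percent lost versus the
    published exchange rate in real-world transactions.
    """
    return usd_msrp * eur_per_usd * (1 + vat) * (1 + fx_margin)

# RTX 3080's $699 MSRP, with no currency margin, lands in the mid-800s:
print(round(eur_street_price(699, fx_margin=0.0), 1))  # 856.8
```

With the extra few percent of currency margin included, a $1,599 card already clears €2,000, which matches the European launch prices people are quoting in this thread.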


----------



## ARF (Oct 12, 2022)

Valantar said:


> It's very little due to "greedy retailers", and more due to the realities of operating in smaller, less unified markets. The US has _massive_ economics of scale due to being a 300m people, wealthy, single-language market, meaning tons of sales of high margin products with relatively low staff needs and increased opportunity for centralization of various branches of the retail value chain. In contrast, while the EU is nominally a single market, it is still fragmented through language, distribution chains, localized economic factors, and more. This drives up prices simply because there are more people involved, more business costs that need covering in order to deliver the products onto retail shelves.
> 
> Germany being the cheapest in Europe just confirms this: it's the largest single market, and has a relatively high level of income, meaning it gets a lot of the same advantages as the US, just to a lesser degree due to less scale. The only major difference would be that many brands have direct distribution in the US (meaning AMD, Nvidia, Asus, and others sell directly to retailers rather than to distributors), which is again a limitation of market size - running your own distribution business is _expensive_ and _difficult_.
> 
> After nearly a decade in electronics retail, even if that's a while back now, I can verify with 100% security that for most retailers, margins (_especially_ on expensive products like GPUs and CPUs) are _razor-thin_, often below what is actually needed for the company to break even, let alone make any net profit.



No, I disagree. The poorer EU countries have much lower purchasing power, so these products sit on the shelves collecting dust, because literally no one buys them.

Anyway, no matter where you live, you can always buy and import from Germany or wherever it's cheapest


----------



## TheinsanegamerN (Oct 12, 2022)

Prima.Vera said:


> When you consider that a PS5 is $500 and can play ANY game, you kind of have to laugh at those ridiculous prices from nGreedia. I mean, really, in the end it's just a game you're playing.
> Any price above $800 for a GPU is not worth it at all.


$500, unless you want to play games off of disc, then it's $600, for a console that will likely play many of its games at 30 FPS with dithered resolution, in a wonderful walled-garden environment where you have to subscribe to play online, can't mod, and get to pay full price for years-old games. Then yeah, it's great.


----------



## Valantar (Oct 12, 2022)

ARF said:


> No, I disagree. The poorer EU countries have much lower purchase power, so these products are sitting on the shelves collecting dust. Because literally no one buys them.


... exactly. Which drives up prices, because actually running a business selling these things _becomes more expensive_ as a result. You have more money tied up in products that take time to sell, meaning less cash flow and more trouble paying your bills, meaning you need larger cash reserves and higher margins to cover operating costs over the time between buying and selling the product. What you are describing is literally why things are this way. And at the distribution level, fewer products flow through the system, forcing distributors to charge more to cover their (otherwise relatively static) business costs.


----------



## SOAREVERSOR (Oct 12, 2022)

Valantar said:


> It's very little due to "greedy retailers", and more due to the realities of operating in smaller, less unified markets. The US has _massive_ economics of scale due to being a 300m people, wealthy, single-language market, meaning tons of sales of high margin products with relatively low staff needs and increased opportunity for centralization of various branches of the retail value chain. In contrast, while the EU is nominally a single market, it is still fragmented through language, distribution chains, localized economic factors, and more. This drives up prices simply because there are more people involved, more business costs that need covering in order to deliver the products onto retail shelves.
> 
> Germany being the cheapest in Europe just confirms this: it's the largest single market, and has a relatively high level of income, meaning it gets a lot of the same advantages as the US, just to a lesser degree due to less scale. The only major difference would be that many brands have direct distribution in the US (meaning AMD, Nvidia, Asus, and others sell directly to retailers rather than to distributors), which is again a limitation of market size - running your own distribution business is _expensive_ and _difficult_.
> 
> After nearly a decade in electronics retail, even if that's a while back now, I can verify with 100% security that for most retailers, margins (_especially_ on expensive products like GPUs and CPUs) are _razor-thin_, often below what is actually needed for the company to break even, let alone make any net profit.



Yep, this. Way before the Founders program, Nvidia had direct distribution in the US to places like Best Buy, as Nvidia-branded cards. The US has relatively few, massive retailers, and companies can get their products there directly at reasonable cost. There are really only a few shipping channels as well, which often own their own airports or own/lease dedicated hubs at airports.

Even among "distributors" that operate on a global scale, like Amazon, the scale of their US facilities compared to overseas is mind-boggling. If you haven't seen a US Amazon facility or, say, a UPS hub, you have no idea how stupidly insane things are here. These buildings take up acres, run 24/7, and sometimes have their own airstrips. It's nuts.


TheinsanegamerN said:


> $500, unless you want to play games off of disc, then it's $600, for a console that will likely play many of its games at 30 FPS with dithered resolution, in a wonderful walled-garden environment where you have to subscribe to play online, can't mod, and get to pay full price for years-old games. Then yeah, it's great.


This is silly. Discless is $400; with a Blu-ray drive it's $500. Not all games are 30 FPS, and most people's PCs simply will not run 4K at even 30 FPS. 4K on the PS5 is a reality at 30-60 FPS, which is something the vast majority of PC gamers simply cannot do. As for subscriptions, sure, but I know lots of PC gamers with multiple subscriptions to multiple games or services, so that's not a good argument. You also do not pay full price for much older games; that's really only the case for Nintendo's own IPs, and for the most part prices drop fast. You're also generally not playing games off the disc: you're installing to the SSD, and you can add another SSD.

I have a PC, a Switch, and a PS5 (along with my older stuff), and the PS5 is really better at 4K in general than the PC for most people. I live in a very affluent area, and the general trend among people who were PC gamers has been moving to a MacBook Pro plus a console for a while now. Simply because even though these Tesla/BMW-owning assholes could afford a 4090, it's sort of ludicrous when a PS5 does what it should, is less hassle, and really just works. The PC gaming that's still around is stupidly-high-refresh 1080p monitors for stuff like FPS games.

All these things have their own place and purpose, and I enjoy all of them. Really, I enjoy the PC for keyboard and mouse in FPS games, and I love me some strategy games. But now that I have a 120 Hz OLED in the living room and consoles are getting there for stuff like Dark Souls and Elden Ring, it's sort of a wash.


----------



## Valantar (Oct 12, 2022)

Lol, the "consoles need subscriptions" argument is ... well, growing thinner by the day. How many PC gamers have Game Pass PC or Ultimate? I sure do, and I love it. Xbox All Access is possibly the best gaming deal available, with a zero-interest payment plan for a great console including Game Pass Ultimate? Awesome deal. PlayStation doesn't have the same value proposition, but they still give out a ton of _good_ free games through PS+. PCs have the advantage of backwards compatibility with a huge library of free games, but if you want to play new games, it takes _a long time_ for the ~$10 advantage in PC game pricing to overcome the cost difference versus a console at similar graphics/performance levels.


----------



## SOAREVERSOR (Oct 12, 2022)

Valantar said:


> ... exactly. Which drives up prices, because actually running a business selling these things _becomes more expensive_ as a result. You have more money tied up in products that take time to sell, meaning less cash flow and more trouble paying your bills, meaning you need larger cash reserves and higher margins to cover operating costs over the time between buying and selling the product. What you are describing is literally why things are this way. And at the distribution level, fewer products flow through the system, forcing distributors to charge more to cover their (otherwise relatively static) business costs.



To get back to this: since the US is massive, stuff is routed quickly to the places that have the income. Case in point, we have Costco, which is a US thing: a warehouse full of stuff, and a national chain. It's a "discount" store in the sense that it's cheaper, but they sell stuff in massive quantities, i.e., by the crate, the entire rib rack, or multiple whole chickens. I've included some pictures, but in the US these things aren't rare; there are dozens in some areas. To call it a store is a bit silly. It takes forklifts to get you some things. And we have these in the middle of downtown cities! They also have pharmacies, food courts, and auto centers.

The funny thing, though, is that thanks to the US's stupidly insane ability to move stuff around, they don't all carry the same things. The Costco where I live sells diamond jewelry in the tens of thousands of dollars, Rolex watches, LG OLEDs, $1600 giant slabs of wagyu beef, and other stuff. Drive 30 minutes away and they don't have any of that. It has jewelry, watches, and TVs, but not the same stuff.

So in the US, the higher-end GPUs show up in the market areas that can afford them, and the other areas get lower-end ones. All this is done by AI systems and runs itself. And if you want to buy a gift for a relative in another state whose Costco doesn't carry it, you buy it at yours, they route the shipping from their distribution center to the other store, and your relative picks it up there.

You can't grasp the scale of the US if you haven't seen how bonkers it gets.


----------



## Valantar (Oct 12, 2022)

@W1zzard Looking more closely at the efficiency measurements and CP2077 benchmarks here, it seems to me that you've run into a problem: CP2077 @1440p is quite clearly CPU limited on this GPU. No, it isn't _quite_ at the average FPS of 1080p, but the difference is negligible, meaning that for a significant proportion of the test sequence, the game is likely CPU bound rather than GPU bound, with some sections being heavier on the GPU and thus drawing the average FPS down by a couple of percentage points. To me this would seem to align with other media outlets reporting power draws in the ~450W range for gaming loads. IMO, this makes your conclusions on efficiency essentially invalid - it seems impossible that the GPU is actually being loaded 100% during this workload, making it rather unrepresentative outside of that specific game and system configuration.

Of course that also makes the performance increase in CP2077 all the more impressive, but as Der8auer showed in a recent video, the 4090 seems to lose only ~5% performance with a power limit reduction down to 60%!


----------



## ARF (Oct 12, 2022)

Valantar said:


> ... exactly. Which drives up prices, because actually running a business selling these things _becomes more expensive_ due to this. You have more money bound up in products that take time to sell, meaning you have less cash flow and more trouble paying your bills, meaning you need more cash reserves and higher margins to cover operating costs relative to the time between you buy and sell the product. What you are describing is literally why this is the way it is. And at the level of distribution, there are less products flowing through the distribution system, forcing distributors to charge more to cover their (otherwise relatively static) business costs.



I want to ask you the following - when do the prices decrease?
1. when the demand is low, supply is fine, and no one buys?
or
2. when the demand is high, supply is plenty, and everyone buys?


----------



## igralec84 (Oct 12, 2022)

Soo, I just got home at 15:00 and didn't check NBB and the like. Did the FE drop, and how much did it cost in the EU? Did $1,599 turn into 1,949€ this time, even though the 3080 FE was $699 or 710€ at launch in 2020?


----------



## 80-watt Hamster (Oct 12, 2022)

Legacy-ZA said:


> Hard to say; depends on what else and how much of that else you have in your system.
> 
> I would recommend an 850 W Gold/Platinum PSU at minimum to be safe. I think if you slot an RTX 4090 into a system with a 750 W PSU, it probably won't POST, or if it does, you'll have to power limit the card in MSI Afterburner before use; and even then, under load your machine may "dip" out and reboot.



Er, why wouldn't it POST?  I'm not going to claim that 750W isn't riding kinda close to the edge, but I can't think of a reason it wouldn't even run.


----------



## chowow (Oct 12, 2022)

W1zzard said:


> You'll get your 13900K with RTX 4090 results soon enough, just be a bit more patient


THANKS FOR PUTTING IN THAT 5800X, now we'll see if there's any difference AT 4K. MAYBE if there's not much difference, like 5%, I'M GOING TO GET A 4090 WITH MY 5600. don't listen to these goofballs, and again thanks for all your hard work.


----------



## W1zzard (Oct 12, 2022)

chowow said:


> THANK FOR PUTTING IN THAT 5800X now we'll see if there's any difference AT 4K MAYBE if there's not much different like 5% I GOING TO GET 4090 WITH MY 5600 don't listen to these goofballs again thanks for all your hard work.


+1 for 5600X + 4090, it's probably not even 5%, but we'll know soon enough, very interesting question


----------



## HTC (Oct 12, 2022)

After checking the reviews of the non-reference cards, I see why EVGA opted out: only 1 to 3% higher performance at 4K, for $100 to $400 more ...


----------



## AnotherReader (Oct 12, 2022)

Dimestore said:


> I don't see how it is possible to generate an intermediary frame between two frames using temporal (time-based) data and not require that the second frame be rendered before the interpose.  Frame1 - Interframe - Frame2, in order to create interframe you must have the already rendered Frame1 and Frame2, and by the time you push Frame1 to the screen it is behind by a frame, giving the 1/3 extra latency.  Is this incorrect?


Latency varies; in Spider-Man, it is similar to native rendering without Reflex. In Cyberpunk, it is slightly lower than native with Reflex. In both, it is worse than DLSS 2 with Reflex.


| Marvel's Spider-Man (Feast HQ) | Perf Differential | Reflex Off | Reflex On |
| --- | --- | --- | --- |
| Native 4K | 100% | 39 ms | 36 ms |
| DLSS 2 Performance | 136% | 24 ms | 23 ms |
| DLSS 3 Frame Generation | 219% | - | 38 ms |



| Cyberpunk 2077 (Market) | Perf Differential | Reflex Off | Reflex On |
| --- | --- | --- | --- |
| Native 4K | 100% | 108 ms | 62 ms |
| DLSS 2 Performance | 258% | 42 ms | 31 ms |
| DLSS 3 Frame Generation | 399% | - | 54 ms |


----------



## Valantar (Oct 12, 2022)

ARF said:


> I want to ask you the following - when do the prices decrease?
> 1. when the demand is low, supply is fine, and no one buys?
> or
> 2. when the demand is high, supply is plenty, and everyone buys?


That simplistic reasoning can only be used for comparing absolute pricing across different markets if all other factors are equal - otherwise you're just ignoring a heap of variables in how differently businesses operate in different regions.

A distributor covering a market of 5 million people like Norway will have much higher costs per product sold than one covering 80 million people like in Germany, even if the latter necessarily has a lot more employees and other costs. Why? Because the market is 15 times larger, but they won't need 15x the employees, warehouse space, or other operating costs. A lot of those costs (assuming similar price levels in each location) will be very, very similar, despite the latter potentially selling 15x more products.

They will also have a much more substantial cash flow, making it easier for them to pay creditors and thus having less credit interest to pay at any given time. The larger market also allows for thinner stock margins, as you can make better predictive models for a larger population, making it possible to have less stock sitting around unsold for significant periods of time, further improving cash flow.

All this means that they need much lower margins in order to break even or make any given profit, which again drives down prices compared to smaller markets. The lack of a market basis for any real competition in smaller markets also severely affects pricing.



Dimestore said:


> I don't see how it is possible to generate an intermediary frame between two frames using temporal (time-based) data and not require that the second frame be rendered before the interpose.  Frame1 - Interframe - Frame2, in order to create interframe you must have the already rendered Frame1 and Frame2, and by the time you push Frame1 to the screen it is behind by a frame, giving the 1/3 extra latency.  Is this incorrect?


That's not what they're doing. Rather, let's say we have three frames, 0, 1 and 2. They use the data from frames 0 and 1 to make a guess at what will happen between frame 1 and the yet-to-be-rendered frame 2. Interframe 1, between frames 1 and 2, has no actual relation to frame 2 beyond coming before it in the order of frames displayed.


----------



## Dimestore (Oct 12, 2022)

Valantar said:


> That simplistic reasoning can only be used for comparing absolute pricing across different markets if all other factors are equal - otherwise you're just ignoring a heap of variables in how differently businesses operate in different regions. A distributor covering a market of 5 million people like Norway will have much higher costs per product sold than one covering 80 million people like in Germany, even if the latter necessarily has a lot more employees and other costs. Why? Because the market is 15 times larger, but they won't need 15x the employees, warehouse space, or other operating costs. A lot of those costs (assuming similar price levels in each location) will be very, very similar, despite the latter potentially selling 15x more products. They will also have a much more substantial cash flow, making it easier for them to pay creditors and thus having less credit interest to pay at any given time. The larger market also allows for thinner stock margins as you can make better predictive models for a larger population, making it possible to have less stock sitting around unsold for significant periods of time, further improving cash flow. All this means that they need much lower margins in order to break even or make any given profit, which again drives down prices compared to smaller markets. The lack of a market basis for any real competition in smaller markets also severely affects pricing.
> 
> 
> That's not what they're doing. Rather, let's say we have three frames, 0, 1 and 2. They use the data from frames 0 and 1 to make a guess at what will happen between frame 1 and the yet-to-be-rendered frame 2. Interframe 1, between frames 1 and 2, has no actual relation to frame 2 beyond coming before it in the order of frames displayed.


So:

Frame 0 (player frame) - render, queue
Frame 1 (ai frame), render, queue
Frame 0 push
Frame 2 (player frame), render, queue
Frame 1 push
Frame 2 push
repeat

Is this correct? If so, how does Frame 1 get generated temporally if it does not have data from Frame 2? It would be like seeking in an MPEG video with access only to already-played frames; it wouldn't work. What am I missing?


----------



## 3x0 (Oct 12, 2022)

It's like this:
Frame 1, Frame 2 get rendered but not pushed yet, Frame 1.5 AI gets generated based on 1 and 2, and then pushed out Frame 1, AI 1.5, 2.


----------



## W1zzard (Oct 12, 2022)

3x0 said:


> It's like this:
> Frame 1, Frame 2 get rendered but not pushed yet, Frame 1.5 AI gets generated based on 1 and 2, and then pushed out Frame 1, AI 1.5, 2.


This is what NVIDIA told me is happening.

To expand slightly, you have a render queue in which rendered frames are saved before they go out to screen, also called "frames render ahead" or similar. Typically the queue is 3 frames

DLSS 3 (just like Reflex) removes that queue, buffers the frames internally and sends them out at the right time, with generated frames in-between.

If this conflicts with other sources let me know so I can double-check with them
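To make the ordering concrete, here is a toy sketch of the flow described above: two rendered frames are buffered, a generated frame is interpolated between them, and the sequence goes out in display order. This is purely illustrative (the function names are mine, not NVIDIA's API, and the "frames" are plain numbers standing in for images):

```python
# Toy sketch (my own, not NVIDIA code) of the DLSS 3 ordering described
# above: two rendered frames are buffered, a generated frame is
# interpolated between them, and the sequence goes out in display order.

def generate_intermediate(frame_a, frame_b):
    # Stand-in for the optical-flow / AI interpolation step.
    return (frame_a + frame_b) / 2

def present_sequence(rendered):
    """Return the display order for rendered frames [1, 2, 3, ...]:
    1, 1.5, 2, 2.5, 3, ... with generated frames in between."""
    out = []
    for prev, nxt in zip(rendered, rendered[1:]):
        out.append(prev)                              # real frame
        out.append(generate_intermediate(prev, nxt))  # generated frame
    out.append(rendered[-1])                          # last real frame
    return out

print(present_sequence([1, 2, 3]))  # [1, 1.5, 2, 2.5, 3]
```

Note that frame 1 can only be paced out once frame 2 has been rendered (the generated frame needs both), which is where the extra latency discussed in this thread comes from.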


----------



## Dimestore (Oct 12, 2022)

3x0 said:


> It's like this:
> Frame 1, Frame 2 get rendered but not pushed yet, Frame 1.5 AI gets generated based on 1 and 2, and then pushed out Frame 1, AI 1.5, 2.


That's what I originally wrote though...



> I don't see how it is possible to generate an intermediary frame between two frames using temporal (time-based) data and not require that the second frame be rendered before the interpose. Frame1 - Interframe - Frame2, in order to create interframe you must have the already rendered Frame1 and Frame2, and by the time you push Frame1 to the screen it is behind by a frame, giving the 1/3 extra latency. Is this incorrect?


But then it was said that I was wrong...


Valantar said:


> That's not what they're doing. Rather, let's say we have three frames, 0, 1 and 2. They use the data from frames 0 and 1 to make a guess at what will happen between frame 1 and the yet-to-be-rendered frame 2. Interframe 1, between frames 1 and 2, has no actual relation to frame 2 beyond coming before it in the order of frames displayed.





W1zzard said:


> This is what NVIDIA told me is happening.
> 
> To expand slightly, you have a render queue in which rendered frames are saved before they go out to screen, also called "frames render ahead" or similar. Typically the queue is 3 frames
> 
> ...


What does "removes that queue and buffers the frames internally" mean? Does it get rid of the queue so that frames are effectively real-time and "out of order", pushing frames whenever they are ready, and using a buffer somewhere to hold the frames, generate the AI frame, and push them out in whatever order is most performant?


----------



## TheinsanegamerN (Oct 12, 2022)

SOAREVERSOR said:


> This is silly.  Diskless is 400 with BR disk is 500.


If you can find them. Walmart, for instance, is selling the diskless version for $658 and the BR version for $739. 


SOAREVERSOR said:


> Not all the games are 30 fps and most peoples PCs simply will not run 4k at even 30fps





SOAREVERSOR said:


> 4k on PS5 is a reality with performance from 30-60fps which is something that the vast majority of PC gamers simply cannot do.


4K on PS5 is dithered/checkerboarded from lower resolutions. You can easily do the same with most gaming PCs. You can also, you know, choose what settings and frame rate you would like on PC, instead of being stuck at a console's 4K30. 


SOAREVERSOR said:


> As for subscriptions yeah sure.  But I know lots of PC gamers with multiple subscriptions to multiple games or services so really that's not a good argument.


Whataboutism at its finest. Come back with an actual argument. 


SOAREVERSOR said:


> You also do not pay full price for much older games that's only the case in stuff like Nintendo's own IPs for the most part they drop fast.


You clearly do not pay attention to the price of console vs. PC games if you think the console prices are anywhere close. 


SOAREVERSOR said:


> You are also generally not playing the games off disk.  You're installing part of it to the SSD, and you can add another SSD.


This has absolutely nothing to do with the argument, which is that consoles are a walled garden and not the same experience as a (relatively) open  and customizable PC experience.  


SOAREVERSOR said:


> I have a PC, a Switch, and a PS5 (along with my older stuff) and the PS5 is really better at 4k in general than the PC for most people.  I live in a very afluent area and the general trend among the people who were PC gamers has been moving to a macbook pro and getting a console for a while now.   Simply because even though these tesla/bmw owning assholes could afford a 4090 but it's sort of ludicrus when a PS5 does what it should, is less hassle, and really just works.   The PC gaming side of things that is still around is stupidly highrefresh 1080p monitors for stuff like FPS.
> 
> All these things have their own place and purpose and I enjoy all of them.  Really I enjoy the PC for the keyboard and mouse on FPS and I love me some strategy games.   But now that I have a 120hz OLED in living room and consoles are getting there for stuff like Dark Souls, Elden Ring, and other stuff it's sort of a wash.


Literally this entire section is anecdotal gibberish.


----------



## mainlate (Oct 12, 2022)

does RTX 4000 series have LHR or similar workload limiters?


----------



## Dr. Dro (Oct 12, 2022)

mainlate said:


> does RTX 4000 series have LHR or similar workload limiters?



I doubt it. The 3090 series didn't either; besides, mining is dead.


----------



## igralec84 (Oct 12, 2022)

Dr. Dro said:


> I doubt it. The 3090 series didn't either, besides, mining is dead.



Which is why scalping is very risky now compared to 2020, when buying a scalped 3090 just meant two more months to ROI, and the card could've mined for 23 months anyway, so it didn't matter. 

I see ebay is filling up with 4090s for 3000-4000 EUR by the hour


----------



## Dr. Dro (Oct 12, 2022)

igralec84 said:


> Which is why scalping now is very risky compared to 2020, when buying a scalped 3090 meant 2 months longer to ROI and they could've mined for 23 months anyway, so it didn't matter.
> 
> I see ebay is filling up with 4090s for 3000-4000 EUR by the hour



Yeah, I saw this coming. But if my hunch is right, it should self-correct fast; this scalping mostly profits on people who rush to get the new thing day one no matter what. This has kind of always happened. But I have to be honest with you: early adopting these GPUs has "sucked" for a while now, because NVIDIA has made a habit of releasing the full processor (coupled with some decent process improvements) as a mid-gen refresh, so you pay flagship money for a slightly cut-down card these days.

Honestly, the market was significantly different back then. This time around I feel comfortable waiting for AMD's answer and the mid-generation refresh; it's not like my 3090 sucks


----------



## Why_Me (Oct 12, 2022)

Valantar said:


> What's the EU MSRP for those? If you're thinking of US$ MSRPs, those don't include sales tax/VAT/GST, so you need to add however much your country charges on top of the US MSRP. Blame the silly Americans with their wildly variable sales tax rates.


No sales tax in the US state where I live.  This silly American can get that card for MSRP.


----------



## 3x0 (Oct 12, 2022)

BigMack70 said:


> And here are the performance jumps from the top flagship card in each generation to the next, based on TPU's own data:
> 
> Tesla 2.0 (GTX 285) --> Fermi (GTX 580) = 67% performance jump gen on gen
> Fermi (GTX 580) --> Kepler (GTX 780 Ti) = 104% performance jump gen on gen
> ...


Thanks for summarizing the architectural improvements. I have added release dates and a performance-improvement-per-month figure to the calculation, which provides an additional data point, although the figures are affected by manufacturing problems (delays) and lack of competition (i.e., NVIDIA sandbagging with the Kepler generation)

*Tesla 2.0 (GTX 285) January 15, 2009--> Fermi (GTX 580) November 9, 2010* = 67% performance jump gen on gen (~22 months between releases, *3.05% perf jump per month*)
*Fermi (GTX 580) November 9, 2010 --> Kepler (GTX 780 Ti) February 19, 2013* = 104% performance jump gen on gen (~27 months between releases, *3.85% perf jump per month*)
*Kepler (GTX 780 Ti) February 19, 2013 --> Maxwell (Titan X) March 17, 2015* = 45% performance jump gen on gen (~25 months between releases, *1.8% perf jump per month*)
*Maxwell (Titan X) March 17, 2015 --> Pascal (Titan Xp) April 6, 2017* = 72% performance jump gen on gen (~25 months between releases, *2.88% perf jump per month*)
*Pascal (GTX 1080 Ti/Titan Xp) April 6, 2017 --> Turing (RTX 2080 Ti)* September 27, 2018 = 39% performance jump gen on gen (~18 months between releases, *2.16% perf jump per month*)
*Turing (RTX 2080 Ti) September 27, 2018 --> Ampere (RTX 3090 Ti)* March 29, 2022 = 56% performance jump gen on gen (~42 months between releases, 1.33% perf jump per month)
*Comparing RTX 2080 Ti vs. the 3090 September 24, 2020* = 43% performance jump gen on gen (~24 months between releases, *1.79% perf jump per month*)
*Ampere (RTX 3090 Ti) March 29, 2022 --> Ada (RTX 4090) October 12, 2022* = 45% performance jump gen on gen (~7 months between releases, 6.42% perf jump per month)
*Comparing RTX 3090 September 24, 2020 vs. the 4090 October 12, 2022* = 64% performance jump gen on gen (~25 months between releases, *2.56% perf jump per month*)

The Turing (RTX 2080 Ti, September 27, 2018) --> Ampere (RTX 3090 Ti, March 29, 2022) span was an outlier in the perf-per-month calculation, so I also added the 3090, with its September 24, 2020 release date, into the equation.

Your hierarchy is:
Kepler
Pascal
Fermi
Ada
Ampere
Maxwell
Turing

While the perf. per month hierarchy is:
Kepler
Fermi
Pascal
Ada
Turing
Maxwell
Ampere
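For anyone who wants to reproduce the per-month figures, here is a small script (my own sketch, using the gains and release gaps from the post). It also prints a compounded monthly rate alongside the simple division used above, since simple division slightly overstates the rate over long gaps:

```python
# Recompute the per-month figures from the post. "simple" is total gain
# divided by months (as in the post); "compounded" is the equivalent
# constant monthly growth rate.

gens = {
    # name: (total perf gain in %, months between releases)
    "Fermi (GTX 580)": (67, 22),
    "Kepler (GTX 780 Ti)": (104, 27),
    "Maxwell (Titan X)": (45, 25),
    "Pascal (Titan Xp)": (72, 25),
    "Turing (RTX 2080 Ti)": (39, 18),
    "Ampere (RTX 3090)": (43, 24),
    "Ada (RTX 4090 vs 3090)": (64, 25),
}

for name, (gain, months) in gens.items():
    simple = gain / months
    compounded = ((1 + gain / 100) ** (1 / months) - 1) * 100
    print(f"{name}: {simple:.2f}%/mo simple, {compounded:.2f}%/mo compounded")
```

The simple rates match the post (e.g., Fermi: 67/22 ≈ 3.05%/mo); the compounded rates are lower but rank the generations the same way.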


----------



## Vayra86 (Oct 12, 2022)

Dirt Chip said:


> I think it's the opposite: according to 4090 high efficiency, lower ada are so much better than ampera pref\watt wise that if they are out now no one will buy any ampera. NV will be stuck with all the stock.
> I will patiently wait for a 10-12GB Ada GPU down the road with 150w tdp.


I think both considerations are true.

Still, it's a shame the gap between the x80 and x90 is so large, and the x80 is in fact rather meagre compared to last gen's x80s. That leaves less room below the x80 in the stack for substantial perf gains, too.

But 150W? That might end up being an x50 (Ti) or something. I mean, the x60 is likely going to end up higher...


----------



## spnidel (Oct 12, 2022)

clopezi said:


> Did you see the comparative? I've to see it in frame by frame mode to see any differences. In real time, looks the same to me.
> 
> DLSS3 dosn't look like a "crap", it's a wonderful tool to take 40fps games and make it 100fps games.


yes, you're right, no difference at all, wouldn't be noticeable in real time... if all you do is watch twitch and video encoder artifacts are second nature to your eyes


----------



## Valantar (Oct 12, 2022)

Vayra86 said:


> I think both considerations are true.
> 
> Still though, its a shame the gap between x80 and x90 is so high, and the x80 is in fact rather meagre compared to last gen's x80's. That leaves less room below x80 in the stack as well for substantial perf gains.
> 
> But 150W? That might end up being an x50(ti) or something. I mean, x60 is likely going to end up higher...


It really depends on the tuning. Der8auer has shown that the 4090 only loses ~5% performance going from a 100% to a 60% power limit (which is partly why it scores so high in the efficiency testing here: it's quite CPU-limited in the CP2077 test scenario and thus clocks down), but Nvidia has clearly factory-OC'd it quite a bit. In other words, lower-tier SKUs could be crazy efficient, or they could be quite inefficient - it all comes down to how hard Nvidia wants to push them.
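As a rough back-of-envelope (my own, assuming Der8auer's observation of -5% performance at a 60% power limit holds exactly), the perf-per-watt implication looks like this:

```python
# Back-of-envelope: if the 4090 keeps 95% of its performance at a 60%
# power limit, perf/W improves by roughly 58% over stock.

baseline_perf, baseline_power = 1.00, 1.00  # normalized stock values
limited_perf, limited_power = 0.95, 0.60    # Der8auer's observation

stock_efficiency = baseline_perf / baseline_power
limited_efficiency = limited_perf / limited_power

gain = limited_efficiency / stock_efficiency - 1
print(f"perf/W gain at 60% power limit: {gain:.0%}")  # 58%
```

Which is why a lightly power-limited 4090 looks so absurd in efficiency terms: the card is factory-tuned well past the knee of its voltage/frequency curve.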


----------



## Why_Me (Oct 12, 2022)

From the Nvidia site.  The RTX 4080 16GB looks to be a beast.


----------



## spnidel (Oct 12, 2022)

Why_Me said:


> From the Nvidia site.  The RTX 4080 16GB looks to be a beast.


you missed the part where it says "dlss on", and the 4000 series fps is colored with the dlss 3 palette - in other words bullshit fake frames lol


----------



## Why_Me (Oct 12, 2022)

spnidel said:


> you missed the part where it says "dlss on", and the 4000 series fps is colored with the dlss 3 palette - in other words bullshit fake frames lol


And you missed the part where that card runs about dead even with the 3090 Ti with DLSS turned off.


----------



## Colddecked (Oct 12, 2022)

Why_Me said:


> From the Nvidia site.  The RTX 4080 16GB looks to be a beast.



Nvidia really has no shame calling that 12GB card an 80-series.


----------



## Why_Me (Oct 12, 2022)

Colddecked said:


> Nvidia really has no shame calling that 12gb card a 80 series.


Look at the RTX 3090 Ti vs the RTX 4080 12GB on that graph. The MSRP for the 4080 12GB is $899.99 USD.









NVIDIA GeForce RTX 4080 Graphics Cards - For gamers and creators. (www.nvidia.com)


----------



## Colddecked (Oct 12, 2022)

Why_Me said:


> Look at the RTX 3090 Ti vs the RTX 4080 12GB on that graph. The MSRP for the 4080 12GB is $899.99 USD.
> 
> 
> 
> ...


I don't care how it stacks up against the previous gen. There's way too big a gap between the 16GB and 12GB to call them the same series, IMHO.
There's less of a difference between the real 4080 and the 4090!


----------



## Wasteland (Oct 12, 2022)

Why_Me said:


> And you missed the part where that card is running about dead even the 3090 Ti with DLSS turned off.



Based on that chart:

The RTX 4080 12 GB has a ~22% advantage, in terms of non-DLSS FPS, over the RTX 3080. Its MSRP is 28.5% higher than the 3080's at launch.
The RTX 4080 16 GB has a ~53% advantage, in terms of non-DLSS FPS, over the RTX 3080. Its MSRP is *71%* higher than the 3080's at launch.

Sure, you can make the value look better by comparing against the 3090 Ti's MSRP, but that card was always a bad (gaming) value by design, just as the Titan-class cards were. And sure, you can argue that 30-series MSRPs quickly became academic after the mining craze hit, but NVIDIA seems to think that mining-craze-era prices should be the norm. We'll see how things go after AMD makes its move (and if/when the backlog of Ampere GPUs clears out), but it's hard to get excited ATM; there's so little room for real perf/price progress lower down the stack.

EDIT: I should have said "non-DLSS FPS" instead of rasterized.  I assume the Microsoft Flight Sim numbers are with RT on, but either way I misspoke.  Fixed
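As a back-of-envelope check on those numbers (my own sketch, using the $699/$899/$1199 US MSRPs and the non-DLSS FPS advantages quoted above):

```python
# Perf-per-dollar vs. the RTX 3080 at launch MSRP, from the chart
# figures quoted above (non-DLSS FPS advantage).

msrp_3080 = 699
cards = {
    "RTX 4080 12GB": (1.22, 899),   # +22% perf, $899
    "RTX 4080 16GB": (1.53, 1199),  # +53% perf, $1199
}

for name, (perf_ratio, msrp) in cards.items():
    price_ratio = msrp / msrp_3080
    value_change = perf_ratio / price_ratio - 1
    print(f"{name}: perf/$ change vs 3080 = {value_change:+.0%}")
```

Both 4080s come out slightly worse in perf-per-dollar than the launch 3080: roughly -5% for the 12 GB and -11% for the 16 GB.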


----------



## AnotherReader (Oct 12, 2022)

Nvidia shared benchmarks for various titles. This jumped out at me: notice that the 12 GB 4080 is barely faster than the 3080 10 GB without DLSS.


----------



## Wasteland (Oct 12, 2022)

AnotherReader said:


> Nvidia shared benchmarks for various titles. This jumped out at me: notice that the 12 GB 4080 is barely faster than the 3080 10 GB without DLSS.
> 
> View attachment 265216


Wow, look at that gap between the 4090 and the $1200 4080.


----------



## ARF (Oct 12, 2022)

Wasteland said:


> Wow, look at that gap between the 4090 and the $1200 4080.



Double the performance. This makes the $1200 4080 DOA. Dead on arrival.


----------



## Richards (Oct 12, 2022)

@W1zzard we need Spider-Man Remastered and Overwatch in future benchmarks for the 7900 XT.. Borderlands and Witcher are old


----------



## dir_d (Oct 13, 2022)

Wasteland said:


> Wow, look at that gap between the 4090 and the $1200 4080.


The more you spend the more you save /s


----------



## wheresmycar (Oct 13, 2022)

spnidel said:


> you missed the part where it says "dlss on", and the 4000 series fps is colored with the dlss 3 palette - in other words bullshit fake frames lol



"bullshit fake frames"

Tell me more...

After reading pre-review speculation of a 4x FPS increase with DLSS 3 enabled... I immediately fell into the "what if" pit of too-good-to-be-true skepticism (marketing gimmickry?). So kill the curiosity, tell me more!


----------



## Why_Me (Oct 13, 2022)

ARF said:


> Double the performance. This makes the $1200 4080 DOA. Dead on arrival.


That's at 4K. The 4090 stretches its lead at 4K, not so much at lower resolutions. The 4080s will be geared for 1440p


----------



## outpt (Oct 13, 2022)

I'm waiting to see what a 13900K can do. It seems W1zzard has one and the testing may already be done. I bet these things will kick ass. A 13900K vs. a 7950X is going to be interesting.


----------



## Legacy-ZA (Oct 13, 2022)

80-watt Hamster said:


> Er, why wouldn't it POST?  I'm not going to claim that 750W isn't riding kinda close to the edge, but I can't think of a reason it wouldn't even run.



Depends on the PSU; some won't POST if the system's power draw at startup exceeds what they can deliver.


----------



## Why_Me (Oct 13, 2022)

outpt said:


> I’m waiting to see what a 13900k can do. It seems that wizard has one and the testing may already be done. I bet these things will kick ass. 13900k and a 7950x is going to be interesting.


Why a 7950X, when you can pair that CPU up with a 4090?


----------



## Bwaze (Oct 13, 2022)

AnotherReader said:


> Nvidia shared benchmarks for various titles. This jumped out at me: notice that the 12 GB 4080 is barely faster than the 3080 10 GB without DLSS.
> 
> View attachment 265216



That looks really bad. Remember, the RTX 3080 is a *$700* card, and the RTX 4080 "Lite" is a *$900* card!

And with the much lower memory bandwidth, we could even see games that run much faster on the RTX 3080 than on the RTX 4080 12GB! 

Of course we might see Nvidia pushing benchmarks of very select games that run well with lower bandwidth - and benchmarking sites like this one are already angering Nvidia by mostly benchmarking pure rasterisation, with little focus on ray tracing or DLSS...


----------



## HenrySomeone (Oct 13, 2022)

Just like I was saying back in the 3000 vs 6000 efficiency debate, Nvidia on a cutting-edge node is far above the rest, and it truly shows now:




And that's with the top-tier, no-holds-barred card, tuned for maximum performance. Optimize it with a lower power limit and some undervolting and you get something leagues beyond anything else, as the ridiculously low "60 Hz v-sync" consumption figure already shows.




AMD will have a REALLY tough time coming anywhere near this with the 7000 series, especially if their new CPUs are any indication, since those are literally less efficient than the previous gen:


----------



## spnidel (Oct 13, 2022)

Why_Me said:


> And you missed the part where that card is running about dead even the 3090 Ti with DLSS turned off.


a next generation card being on par with the last generation's flagship? wooow, that's insane and unheard of, never seen that happen before, nvidia truly outdid themselves and everyone else 
hasn't happened with 2080 ti -> 3080...
...or 1080 ti ->2080
...or 980 ti ->1080
...or 780 ti ->980

seriously, what is it with you people: you look at an expected performance uplift from a new generation of GPUs and treat it as some kind of impossible feat... you sound like utter shills trying to drum up excitement for overpriced cards


----------



## ARF (Oct 13, 2022)

dir_d said:


> The more you spend the more you save /s



This is nasty 

The greedier they are, the more you pay.


----------



## HenrySomeone (Oct 13, 2022)

outpt said:


> I’m waiting to see what a 13900k can do. It seems that wizard has one and the testing may already be done. I bet these things will kick ass.


Agreed. The 5800X is more or less 4-5-year-old performance (8700K-9900K) and simply isn't good enough to unlock the full potential of this beast, especially at 1080p, but probably also at 1440p in many cases, and maybe even at 4K in a couple. Testing with a 13900K will give the proper picture (or I guess the 7000X3D parts, if and when they arrive, but those are still months away).


----------



## Bwaze (Oct 13, 2022)

Yeah, comparing RTX 3080 with RTX 2080 Ti:

At 1080p it's 18% faster
At 1440p it's 23% faster
At 4K it's 27.5% faster.

MSRP of RTX 2080 Ti was $1499. MSRP of RTX 3080 was $699.

But now we should be glad the RTX 4080 12GB barely matches the RTX 3090 Ti - so we'll be able to buy an 1100 EUR card that matches a 1200 EUR card - wow, much price/performance increase?

And in some cases I predict the RTX 4080 12GB will be much slower, and we'll see an 1100 EUR card barely match the 800 EUR RTX 3080.


----------



## W1zzard (Oct 13, 2022)

Richards said:


> @W1zzard  we need Spider-Man  remasted and overwatch on future  benchmarks  on the 7900 xt.. borderlands  and witcher are old


Will definitely add Spiderman even though it's super CPU limited. Not sure about Overwatch due to its always-online design, also it runs 4896748956 FPS and is CPU limited, too. Will probably kick Borderlands 3, but Witcher 3 stays, because very important DX11 game


----------



## ARF (Oct 13, 2022)

W1zzard said:


> Will definitely add Spiderman even though it's super CPU limited. Not sure about Overwatch due to its always-online design, also it runs 4896748956 FPS and is CPU limited, too. Will probably kick Borderlands 3, but Witcher 3 stays, because very important DX11 game



Is it possible to add a review to compare different CPUs running with RTX 4090?


----------



## W1zzard (Oct 13, 2022)

ARF said:


> Is it possible to add a review to compare different CPUs running with RTX 4090?


Anything is possible  

5800X vs 12900K vs 7700X vs 13900K definitely sounds interesting, but that'll be A LOT of work


----------



## spnidel (Oct 13, 2022)

Bwaze said:


> Yeah, comparing RTX 3080 with RTX 2080 Ti:
> 
> At 1080p it's 18% faster
> At 1440p it's 23% faster
> ...


people pushing the narrative that we should be amazed at a generic generational performance leap and grateful for higher prices is really funny


----------



## bobmeix (Oct 13, 2022)

dgianstefani said:


> DLAA is a real technology, it's NVIDIA deep learning anti aliasing, similar to DLSS but without the lower render resolution then upscaling. FG is frame generation.
> 
> DLAA is better than TAA, but doesn't offer the performance benefits of DLSS.


Yes, it was my mistake. I thought it should have been DLSS + FG, not DLAA + FG.


----------



## 3x0 (Oct 13, 2022)

HenrySomeone said:


> AMD will have a REALLY tough time trying to come anywhere near this with 7000 series, especially if their new cpus are any indication, since they are literally less efficient than previous gen:


You realize that the 7950X at 65W has the same performance as the 5950X at stock? AMD, Intel, and NVIDIA have all opted to increase power consumption drastically, well outside the window of optimal energy efficiency, just for a few percent more performance.


----------



## Bwaze (Oct 13, 2022)

spnidel said:


> people pushing the narrative to be amazed at a generic generational performance leap and to be grateful for higher prices is really funny



That's the thing - it seems like the "generic generational performance leap" will only be present in the RTX 4090; even the RTX 4080 16GB looks too cut down to deliver RTX 3080 +50-70%, and that's a 1500 EUR card now! 

Will we see a push of "but DLSS 3.0 does that, and more - so who cares about rasterisation uplift?", and get a Turing kind of release, perhaps even worse?


----------



## HenrySomeone (Oct 13, 2022)

3x0 said:


> You realize that the 7950x at 65W has the same performance as 5950X at stock? AMD, Intel and nV have all opted to increase their power consumption drastically and out of the window of optimal energy efficiency just for a few percent more performance.


Ahh, so now that the ball is in the other court, it's fine to compare performance at a certain, limited power and not only at stock? Back when 12900k was killing it in this metric, all that mattered was its "horrible stock consumption", hehe... And yes, I realize that, but 4090 just pushed the bar so high, there is no way 7000 series will have any hope of even coming within a class of its performance while still staying at least somewhat on the efficiency side of the curve.


----------



## 3x0 (Oct 13, 2022)

HenrySomeone said:


> there is no way 7000 series will have any hope of even coming within a class of its performance while still staying at least somewhat on the efficiency side of the curve.


Let's wait for reviews


----------



## Bwaze (Oct 13, 2022)

HenrySomeone said:


> And yes, I realize that, but 4090 just pushed the bar so high, there is no way 7000 series will have any hope of even coming within a class of its performance while still staying at least somewhat on the efficiency side of the curve.



That's not necessarily a given. The lower-end Ada cards are more severely cut down than was normal in the past - in terms of units and memory bandwidth - so will they be pushed harder to compensate, and thus run less efficiently? 

Very few people actually care about the RTX 4090. It's not a normal gaming card, no matter how much reviewers and YouTube influencers drool over its performance.


----------



## clopezi (Oct 13, 2022)

W1zzard said:


> Will definitely add Spiderman even though it's super CPU limited. Not sure about Overwatch due to its always-online design, also it runs 4896748956 FPS and is CPU limited, too. Will probably kick Borderlands 3, but Witcher 3 stays, because very important DX11 game


Thanks a lot for your hard work!

I agree on Witcher 3; for me it's still a reference point and one of the first games I look at in any GPU review. It has always been very balanced, and its results are important!


----------



## spnidel (Oct 13, 2022)

HenrySomeone said:


> there is no way 7000 series will have any hope of even coming within a class of its performance while still staying at least somewhat on the efficiency side of the curve.


people said the exact same thing back when 6000 series was about to be launched xD


----------



## HTC (Oct 13, 2022)

Bwaze said:


> Yeah, comparing RTX 3080 with RTX 2080 Ti



You're comparing a flagship card with a non-flagship card: try comparing it to the 2080 instead (I'm referring to launch-day reviews).

Do that and then, when the 4080 releases, compare its % lead vs the 3080 with the % lead of the 3080 vs the 2080, and then factor in the prices of the cards.


----------



## HenrySomeone (Oct 13, 2022)

spnidel said:


> people said the exact same thing back when 6000 series was about to be launched xD


No they didn't - at least those of us who can think didn't. Back then AMD had the superior process, due to Nvidia favoring volume and going with Samsung (which proved to be a great business move; they sold an order of magnitude more 3000-series cards than AMD did 6000-series). This time though, if anything, Nvidia will even have a small node edge (N4 vs N5), and it's not hard to predict the outcome.


----------



## 3x0 (Oct 13, 2022)

HenrySomeone said:


> This time though, if anything, Nvidia will even have a small node edge (N4 vs N5) and it's not hard to predict the outcome.


The difference in nodes could very well be in naming only and effectively the same when it comes to perf/watt/prices


----------



## Valantar (Oct 13, 2022)

HenrySomeone said:


> Just like I was saying back in 3000 vs 6000 efficiency debate, Nvidia on a cutting edge node is far above the rest and it truly shows now:
> 
> 
> 
> ...


The 4090 is quite clearly CPU limited in the TPU efficiency test scenario (not as hard as at 1080p, but with an average that close, it's CPU limited most of the time), rendering that comparison quite invalid, as the card is essentially running underclocked. Of course the performance is bonkers nonetheless, and the UV/UC/power limiting potential for this card is HUGE (as Der8auer has demonstrated), but these results are not representative. @W1zzard needs to get on this and find another efficiency test scenario.

Edit: autocorrect. Also, W1zzard has tested the 4090, 3090 and 6900 XT at 2160p with some interesting results - the AMD card stays at about the same relative efficiency, as it lags behind at the higher resolution, but the 3090 looks much better compared to the 4090.


----------



## Wasteland (Oct 13, 2022)

wheresmycar said:


> "bullshit fake frames"
> 
> Tell me more...
> 
> After reading pre-review earlier speculations of 4x increase in FPS with DLSS 3 enabled... i immediately fell into the "what if" pit of too-good-to-be-true skepticism (marketing gimmickery?). So kill the curiousity, tell me more!



Here are a couple of informative videos:



Spoiler: videos

I don't expect you to watch all of that, though.  The gist seems to be that DLSS 3 frames aren't quite "fake," but they are definitely "half-fake," or maybe even "three-quarters fake."  Certainly NVIDIA's marketing around DLSS 3 trends towards fake.  Why?  Because the extra frames generated by DLSS 3 don't reduce input latency, at all, in contrast to normal extra framerate.  In some cases DLSS 3 even makes latency marginally worse than it would be at a lower native framerate.

At first this didn't sound so bad to me, but it turns out that the use case for this tech is a pretty small niche.  For example, if you're already at or near your monitor's max refresh rate, then DLSS 3 is wasted, because the screen can't convey the visual smoothness benefits.  Likewise, if you're looking to push stratospheric FPS for competitive gaming, DLSS 3 is completely pointless.

On the other side of the spectrum, at lower FPS numbers DLSS 3's visual artifacting is more noticeable, so the extra frames provided come at a higher visual cost without providing any benefit in terms of responsiveness.  Plus DLSS 3 disables V-Sync and FPS limiters by default, so there's tearing if you don't have Variable Refresh Rate or if you're below/above your monitor's thresholds for VRR.  These factors limit DLSS 3's appeal as an FPS booster to lower end or mid-range hardware.

So FWIW, Tim says this tech is best for people who fit the following criteria:

- They're already capable of running the game at roughly 100-120 FPS without DLSS 3;
- They're running a (VRR-capable) monitor with a refresh rate significantly higher than 100-120 Hz, and
- They're playing games that aren't especially latency sensitive (e.g. graphically impressive single player stuff, like Cyberpunk 2077)

I don't believe this is an especially large market.  People expecting DLSS 3 to be anywhere near as impactful as DLSS 2 are destined for disappointment.

EDIT: Here's the companion article to the HUB video linked above, for those who are more text-inclined: https://www.techspot.com/article/2546-dlss-3/
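The latency point above can be illustrated with a toy calculation (my own simplification, not something from the linked videos): interpolation doubles the displayed frame rate, but the game still only samples input once per real frame.

```python
def display_and_input_rates(native_fps: float, frame_gen: bool):
    """Return (displayed fps, input sample interval in ms).

    With frame generation on, displayed fps doubles, but the game still
    only reacts to input once per *real* frame - so responsiveness is
    unchanged even though motion looks smoother.
    """
    displayed = native_fps * 2 if frame_gen else native_fps
    input_interval_ms = 1000.0 / native_fps  # inputs sampled on real frames only
    return displayed, input_interval_ms

print(display_and_input_rates(60.0, frame_gen=False))  # 60 fps shown, ~16.7 ms input interval
print(display_and_input_rates(60.0, frame_gen=True))   # 120 fps shown, still ~16.7 ms input interval
```

In other words, the second call shows twice the frames but the exact same input cadence - smoother, not more responsive.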


----------



## R0H1T (Oct 13, 2022)

In a dynamic scene, frame generation is basically useless, especially in high-FPS scenarios. I can't see how a predicted or AI-generated frame can ever be as accurate as the real scene!
You would probably need 10x the computational power & 1000x-10000x more AI training to get it working in an acceptable way - acceptable to me, at least.


----------



## AnotherReader (Oct 13, 2022)

Valantar said:


> The 4090 is quite clearly CPU limited in the TPU efficiency test scenario (not as hard as 1080p, but with an average that close, it's CPU limited most of the time), rendering that comparison quite invalid, as the card is essentially running underclocked. Of course the performance is bonkers nonetheless, and UV/UC/power limiting potential for this card is HUGE (as Der8auer has demonstrated), but these results are not representative. @W1zzard needs to get on this and find another efficiency test scenario.


I think increasing the test scene's resolution to UHD for cards faster than 3090 Ti will be enough to resolve that.


----------



## Valantar (Oct 13, 2022)

AnotherReader said:


> I think increasing the test scene's resolution to UHD for cards faster than 3090 Ti will be enough to resolve that.


Possibly, though it also skews inter-architectural comparisons as different architectures scale across resolutions differently. The ideal for a broadly representative test would be a demanding 1440p title that still scales to very high fps without becoming cpu limited.


----------



## AnotherReader (Oct 13, 2022)

Valantar said:


> Possibly, though it also skews inter-architectural comparisons as different architectures scale across resolutions differently. The ideal for a broadly representative test would be a demanding 1440p title that still scales to very high fps without becoming cpu limited.


Funnily enough, for all the talk of DX12 decreasing CPU bottlenecks, the one game that doesn't seem CPU limited at 1440p is The Witcher 3.

I also excluded all the games from TPU's test suite that are clearly CPU limited and got somewhat better speedups for the 4090: 53% and 73% over the 3090 Ti and the 3090 respectively at 4K. The games that I excluded are:


Battlefield V
Borderlands 3
Civilization VI
Divinity Original Sin II
Elden Ring
F1 22
Far Cry 6
Forza Horizon 5 
Guardians of the Galaxy
Halo Infinite
Hitman 3
Watch Dogs Legion


----------



## W1zzard (Oct 13, 2022)

AnotherReader said:


> I think increasing the test scene's resolution to UHD for cards faster than 3090 Ti will be enough to resolve that.


All cards have to run the same scene + resolution, because "efficiency" = "fps / power" .. and it has to be a game that's fair to both vendors .. and something popular .. leaning towards switching to Doom Eternal 4K for all cards .. even very old cards get decent fps there and don't fall off a cliff due to VRAM limits
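For what it's worth, the metric described above is simple enough to sketch (the numbers below are invented for illustration and are not TPU results):

```python
def efficiency(fps: float, watts: float) -> float:
    """Efficiency as defined above: frames per second per watt of board power,
    with every card measured in the same scene at the same resolution."""
    return fps / watts

# Purely hypothetical cards, not real review data:
fast_card = efficiency(fps=120.0, watts=400.0)  # 0.30 fps/W
slow_card = efficiency(fps=90.0, watts=350.0)   # ~0.26 fps/W
print(fast_card > slow_card)  # a faster card can still be the more efficient one
```

The comparison only holds because both cards ran the same scene at the same resolution, which is exactly why a CPU-limited test scene distorts it.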


----------



## AnotherReader (Oct 13, 2022)

W1zzard said:


> All cards have to run the same scene + resolution, because "efficiency" = "fps / power" .. and it has to be a game that's fair to both vendors .. and something popular .. leaning towards switching to doom eternal 4k for all cards .. even very old cards get decent fps there and dont fall off a cliff due to vram limits


My bad; I forgot about the efficiency metric. I was only thinking of peak power.


----------



## W1zzard (Oct 13, 2022)

AnotherReader said:


> My bad; I forgot about the efficiency metric. I was only thinking of peak power.


you mean "maximum" in my charts? that's furmark and definitely not cpu limited. but furmark is a totally unrealistic load, that's why I also have a real gaming load and the differences are huge


----------



## AnotherReader (Oct 13, 2022)

W1zzard said:


> you mean "maximum" in my charts? that's furmark and definitely not cpu limited. but furmark is a totally unrealistic load, that's why I also have a real gaming load and the differences are huge


I phrased it badly. I meant the Gaming power usage; I consider that the maximum for most purposes as Furmark is unrealistic. I was thinking that the game and resolution were chosen for a near peak gaming power draw.


----------



## medi01 (Oct 13, 2022)

How come a 40%-something perf boost is not a disappointment after the promised "2-4 times faster"?

3080 Ti vs 2080 Ti was a 56% bump, mind you.

How is the 4080 supposed to compete, given it is a heavily cut-down version of the 4090?


----------



## DemonicRyzen666 (Oct 13, 2022)

AnotherReader said:


> Funnily enough, for all the talk of DX12 decreasing CPU bottlenecks, the one game that doesn't seem CPU limited at 1440p is The Witcher 3.
> 
> I also excluded all the games from TPU's test suite that are clearly CPU limited and got somewhat better speedups for the 4090: 53% and 73% over the 3090 Ti and the 3090 respectively at 4K. The games that I excluded are:
> 
> ...



List of DirectX 12 games - PCGamingWiki PCGW - bugs, fixes, crashes, mods, guides and improvements for every PC game: only 240 DX12 games vs 3,099 DX11 games.
Ray tracing is even smaller - only 141 games support it: List of games that support ray tracing - PCGamingWiki PCGW - bugs, fixes, crashes, mods, guides and improvements for every PC game


----------



## Bwaze (Oct 14, 2022)

medi01 said:


> How come a 40%-something perf boost is not a disappointment after the promised "2-4 times faster"?
> 
> 3080Ti vs 2080Ti was a 56% bump, mind you.
> 
> How is 4080 supposed to compete, given it is a heavily cut down version of 4090?



I think we'll have a Turing situation all over again: a higher price increase than performance increase.

In 2018 it was explained by RTX - ray tracing and DLSS. Although it took quite a long time for both technologies to become widely adopted, and most people never enjoyed ray tracing on the RTX 2080 - it was just too slow.

Now they'll say we have a revolution in frame doubling with DLSS 3.0. It increases latency and leaves clear artifacts for all to see in moving GUI elements and such, but it's your fault if you notice it - you should be playing the game, not looking for artifacts!


----------



## W1zzard (Oct 14, 2022)

medi01 said:


> "2-4 times faster"?


Base performance increase: +50%
Turn on DLSS Super Resolution: +50%-+100% depending on setting
Turn on DLSS Frame Generation: +100%
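Read charitably, and assuming the gains compound multiplicatively (my own back-of-the-envelope reading, not an official NVIDIA formula), the marketing range falls out of those three multipliers:

```python
base = 1.5                  # +50% raw performance over last gen
sr_low, sr_high = 1.5, 2.0  # DLSS Super Resolution: +50% to +100% depending on setting
fg = 2.0                    # DLSS Frame Generation: roughly +100% displayed FPS

low = base * sr_low         # 2.25x with the base uplift and conservative SR
high = base * sr_high * fg  # 6.0x with everything stacked in a best case
print(low, high)            # the advertised "2-4x" sits inside this optimistic envelope
```

Which is exactly why the raster-only numbers in the review land so far below the headline claim: most of the multiplier comes from upscaling and interpolation, not raw performance.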


----------



## Bwaze (Oct 14, 2022)

I think Hardware Unboxed has done a good job at showing the visual artefacts of DLSS 3.0:

It's all hard to detect in many cases, but some things are very apparent. Any moving GUI element is really hard for frame generation to predict, so such elements are heavily garbled in the AI-generated frames - which causes them to flicker.

Could this be repaired? Well, the game could render the GUI elements after the frame generation, but that would require a completely different approach - and more involvement from developers.


----------



## Dirt Chip (Oct 14, 2022)

medi01 said:


> How come 40%-somthing perf boost is not a disappointment after promised "2-4 times faster"?
> 
> 3080Ti vs 2080Ti was a 56% bump, mind you.
> 
> How is 4080 supposed to compete, given it is a heavily cut down version of 4090?


2x-4x in a very specific game/demo.
No one expected to see those increases in every game.
This repetitive mantra about 2-4x is sooo very boring.

If you're into the bashing business then bash on, mate, but know it looks somewhat pathetic.
I very much agree that this product is quite stupid (as is any >1000$ GPU out there for gaming, imo), but the "you promised me 2-4x" mantra is not the reason.
There are many more real things to criticize this GPU about.


----------



## Bwaze (Oct 14, 2022)

But it is silly.

Even if you take into account that Nvidia chose a best-case scenario - it's pure bullshit when you notice the claim holds true (even in that very specific scenario) only because they compared it to a non-DLSS result. As if we haven't had DLSS for 4 years now.

It's not "pathetic" to call bullshit on such practices.


----------



## Gundem (Oct 14, 2022)

When 150 watts was serious business... and for 2 PCBs


----------



## Dirt Chip (Oct 14, 2022)

Bwaze said:


> But it is silly.
> 
> Even if you take into account that Nvidia chose best case scenario - it's pure bullshit, when you notice this claim holds true (even in very specific scenario) only because they compared it to non-DLSS result. As if we don't have the DLSS for 4 years now.
> 
> ...


"Somewhat pathetically" naive, if you will, to think that you get "2-4x".
If NV was misleading in its presentation about best case scenario "2-4x" than OK fight them to the moon and I will join but as you said- this is a very much valid graph, no matter how much sugar-coated it is.
So, to be lead by NV PR presentation about "2-4x" and to believe\expect that you will see that improvement in day-to-day usage, while knowing their way of doing business, is either (somewhat pathetically) naive or just "somewhat pathetic" way of bashing because it is not actual, proper, bullshit.

What I will bash on (if i`m into that sort of practice, which i`m not): price, total power consumption, phisical size and weight, DLSS3 only practical for the very high end usage (>120FPS on 240Hz screens) while increasing input lag and so on.

Ranting about about a point that not really exist take all the air from other valid ranting you (and others) might say.


----------



## spnidel (Oct 14, 2022)

W1zzard said:


> Base performance increase: +50%
> *Turn on DLSS Super Resolution: +50%-+100% depending on setting*
> Turn on DLSS Frame Generation: +100%


no way!... but what if you enable Deep Learning Super Sample Super Sample on, you know, the 3000 series gpu?


----------



## Wasteland (Oct 14, 2022)

Dirt Chip said:


> "Somewhat pathetically" naive, if you will, to think that you get "2-4x".
> If NV was misleading in its presentation about best case scenario "2-4x" than OK fight them to the moon and I will join but as you said- this is a very much valid graph, no matter how much sugar-coated it is.



We all expect first-party benchmark results to be sugar-coated.  That NVIDIA would inflate its performance numbers in marketing is unremarkable, but in this case the inflation is especially dishonest, because the extra frames generated by DLSS 3.0 don't give you at least half the benefit of frames generated by other means (i.e. reduced input latency).

In retrospect, I think I was too kind earlier in discussing this feature. A commenter on the Techspot article I linked described DLSS 3.0 as a "motion fidelity" feature, rather than a performance boost, and that seems like the most sensible way to look at it. Imagine a feature that increased perceived smoothness without adding extra frames. That's what DLSS 3.0 does, in effect.  It's purely a visual enhancement, though one that comes with a trade off to picture quality. 

(One of the more interesting, and I think damning, passages in the Techspot article observes that DLSS 2.0 in Performance mode gave the same FPS and picture quality as DLSS 2.0 in Quality mode when combined with DLSS 3.0 in a particular game/scenario, and thus DLSS 3.0 was worse than pointless in that scenario, increasing latency in return for zero benefit.)

I think DLSS 3.0 is an impressive invention, and in time it could prove to be useful, but it isn't remotely comparable to extra GPU horsepower.

EDIT: Also I think it's somewhat annoying that NVIDIA chose to label its AI-frame-generation tech as "DLSS 3.0," implying that it's in some way not only linked to DLSS 2.0, but _superior to it_.  In fact, the two features have basically nothing to do with one another.  DLSS 2.0 increases real frame rate by rendering the scene in a lower resolution and then ingeniously scaling it up to look like you're running in native.  (And in some cases, DLSS 2.0 can actually _enhance_ the image, which is a neat trick.) DLSS 3.0 is a fancy interpolation technology that increases perceived motion smoothness. You can choose to enable one or both; they operate independently.

DLSS 2.0 will remain _vastly_ more useful to the average gamer long after DLSS 3.0 proliferates to the masses. Vastly vastly more useful; it isn't a contest.


----------



## Dirt Chip (Oct 14, 2022)

Wasteland said:


> We all expect first-party benchmark results to be sugar-coated.  That NVIDIA would inflate its performance numbers in marketing is unremarkable, but in this case the inflation is especially dishonest, because the extra frames generated by DLSS 3.0 don't give you at least half the benefit of frames generated by other means (i.e. reduced input latency).
> 
> In retrospect, I think I was too kind earlier in discussing this feature. A commenter on the Techspot article I linked described DLSS 3.0 as a "motion fidelity" feature, rather than a performance boost, and that seems like the most sensible way to look at it. Imagine a feature that increased perceived smoothness without adding extra frames. That's what DLSS 3.0 does, in effect.  It's purely a visual enhancement, though one that comes with a trade off to picture quality.
> 
> ...


Agreed.
Unlike DLSS2/FSR2, DLSS3 is a gimmick feature, much like RTX, and the pros are considerably smaller than the cons.
In time though, NV can improve DLSS3 and, except maybe for the increased input-lag problem, deal with all the visual artifacts.
I'm sure DLSS 3.4 will be much better, just as DLSS 2.4 is now.


----------



## Wasteland (Oct 14, 2022)

Dirt Chip said:


> Agreed.
> Unlike DLSS2/FSR2, DLSS3 is a gimmick feature, much like RTX, and the pros are considerably smaller than the cons.
> 
> In time though, NV can improve DLSS3 and, except maybe for the increased input-lag problem, deal with all the visual artifacts.
> I'm sure DLSS 3.4 will be much better, just as DLSS 2.4 is now.



As I understand it, the latency problem can't ever really be "fixed."  NVIDIA might reduce the latency _cost_ of enabling the feature (relative to not enabling it), but the extra frames generated by DLSS 3.0 will always be "fake," or dumb if you prefer, with respect to the player's inputs.  That's just the nature of interpolation.

Image quality might be improved, though, and hopefully at some point NVIDIA will find a way to allow V-Sync and framerate caps with DLSS 3.0 on.  The Digital Foundry guy actually already forced V-Sync to work in certain situations, which is encouraging.

But again there are certain fundamental limitations here that can't be eliminated--e.g. frames generated by DLSS 3.0 will always be pointless above the maximum refresh rate of your monitor, and DLSS 3.0's picture quality will always be worse at lower FPS, because a longer delay between "real" frames requires the AI to guess more in constructing the mid-point image between them.  (And if the frames are displayed longer, obviously, the human eye is more likely to notice errors.)  Thus, DLSS 3.0 will tend to skew against the very people you'd expect to want it most (i.e. those without high-refresh monitors or strong rendering hardware, or on the other end of the spectrum, competitive gamers who want stratospheric FPS to improve latency).

I'm sure there's room for improvement, but it's a niche feature now and I doubt that niche will ever dramatically expand.  And even if DLSS 3.0 were 100% perfected, it still wouldn't be analogous to adding extra FPS in the traditional way.  DLSS 2.0 and its analogues, on the other hand, are and will continue to be extremely useful to huge swathes of the user base, precisely because they _do_ provide real performance boosts analogous to traditional frame rate improvements.

EDIT: lol, it looks like I misread your post.  I thought you said NVIDIA might improve the input lag.  My fault, man.


----------



## wheresmycar (Oct 14, 2022)

Dirt Chip said:


> Agreed.
> Unlike DLSS2/FSR2, DLSS3 is a gimmick feature, much like RTX, and the pros are considerably smaller than the cons.
> 
> In time though, NV can improve DLSS3 and, except maybe for the increased input-lag problem, deal with all the visual artifacts.
> I'm sure DLSS 3.4 will be much better, just as DLSS 2.4 is now.


 
After seeing a couple of reviews from tech-tubers I'm of the same opinion. Perhaps nV is rattled with AMD trailing closely behind + Intel now enlisted, with hopefully stronger competition ahead. Rather than offering raw performance at a reasonable cost, it appears the king of the hill decided to milk the hill further with the "perception" of wider performance gains. I wonder if it's desperate times or just another nV authoritative swindling strategy to rob the less informed/insensible rich?


----------



## medi01 (Oct 14, 2022)

W1zzard said:


> Base performance increase: +50%
> Turn on DLSS Super Resolution: +50%-+100% depending on setting
> Turn on DLSS Frame Generation: +100%


Oh, that is how that works.
My TV can do "frame generation".
I guess, that's another 200% on top, if base fps is low enough.



Dirt Chip said:


> 2-4x" mantra



If you are fine with "2-4 times" claims, in regards to a card that per TPU tests is about 45% faster, that's cool, I guess.
But fairly pathetic in my books.









Nvidia's monstrous RTX 4090 is even crazier than rumors said | Digital Trends (www.digitaltrends.com): "The Nvidia RTX 4090 is real, and it's an absolute monster. Here are all the details Nvidia revealed about the card during its GeForce Beyond broadcast."
				





Even if one zooms in on what was claimed:

"in games, it's 2 times"








Nvidia RTX 4090 will be "2-4 times faster" than RTX 3090 Ti - Ripene (ripene.com)


----------



## DemonicRyzen666 (Oct 15, 2022)

So W1zzard?
Tom's Hardware has an article comparing two SLI'd RTX 3090 Tis vs an RTX 4090 running DLSS in Microsoft Flight Simulator.


DemonicRyzen666 said:


> W1zzard, any plans later on to compare two RTX 3090 Tis in SLI/mGPU vs one RTX 4090?
> 
> Most likely going to be a no, because you focus on triple-A games that everyone has to be playing, for a mass audience. It makes the forum here feel like a gaming forum more than a PC enthusiast forum.
> 
> I do have a list of supposed mGPU games; they need confirmation. It might help.


RTX 4090 Beats Two RTX 3090s in SLI — Barely | Tom's Hardware (tomshardware.com)

1. Why would they use DLSS with SLI and then complain about the second card not being loaded enough?

2. Can someone tell me why or how the hell they got Cyberpunk 2077 supporting SLI or mGPU, when CD Projekt stated they had no plans to implement it?


----------



## W1zzard (Oct 15, 2022)

DemonicRyzen666 said:


> microsoft flight simulator.


MSFS is a CPU-limited POS. The only way to get higher fps is by using frame doubling in DLSS


----------



## medi01 (Oct 15, 2022)

Dirt Chip said:


> 2x-4x in a very specific game\demo.


I've come across where I got that from - ResetEra:

*RTX 4080*

Starts at $900 for 12GB G6X, $1200 for 16GB G6X
2-4x faster than 3080 Ti
Launching in November
*RTX 4090*

$1600
24GB G6X
2-4x faster than 3090 Ti
Launching on October 12th

Ada Lovelace is 2x faster in rasterization and 4x faster in ray-tracing compared to Ampere
Ada Lovelace GPUs are significantly more energy efficient compared to Ampere









NVIDIA RTX 40 Series Launch and Discussion Thread | News (www.resetera.com), quoting: https://www.globenewswire.com/en/news-release/2022/09/20/2519484/0/en/NVIDIA-Delivers-Quantum-Leap-in-Performance-Introduces-New-Era-of-Neural-Rendering-With-GeForce-RTX-40-Series.html
				




So, reality is quite far from it, ain't it? (oh, I mean, besides the pricing, although in DE AIBs want 25% on top... I guess I know why EVGA quit)


----------



## DemonicRyzen666 (Oct 15, 2022)

W1zzard said:


> MSFS is a CPU-limited POS. The only way to get higher fps is by using frame doubling in DLSS


You mean it's single-threaded even on DX12. Stop calling it CPU limited when it barely even uses the CPU.
How about calling it CPU limited when a CPU like a 5600X or 12400 is at 80% load, and moving to something like a 5900X or a 12700K increases frame rates while keeping a decent load on the CPU?
It's a crappily built engine that wasn't properly designed for multithreading on DX12


----------



## HenrySomeone (Oct 16, 2022)

DemonicRyzen666 said:


> You mean it's single-threaded even on DX12. Stop calling it CPU limited when it barely even uses the CPU.
> How about calling it CPU limited when a CPU like a 5600X or 12400 is at 80% load, and moving to something like a 5900X or a 12700K increases frame rates while keeping a decent load on the CPU?
> It's a crappily built engine that wasn't properly designed for multithreading on DX12


Single-thread limited is still, well, CPU limited, pal...


----------



## Dirt Chip (Oct 16, 2022)

medi01 said:


> I've came across where I got that from, the resetera:
> 
> *RTX 4080*
> 
> ...


Yes, reality is very different from a sugar-coated PR presentation (yet they are right according to the specific details of the graph/test). What of it? Something new?


----------



## bhappy (Oct 18, 2022)

W1zzard said:


> Compare it to 1080p FPS.
> Same number? Bottleneck (Borderlands 3, DoS II)
> Lower number? No bottleneck


Could we please have a CPU scaling review for this gpu? That would be awesome to see.
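The rule of thumb quoted above amounts to a one-line check (a hypothetical helper; the 5% tolerance for "same number" is my own assumption, not W1zzard's):

```python
def looks_cpu_limited(fps_1080p: float, fps_4k: float, tol: float = 0.05) -> bool:
    """Quoted rule of thumb: if a card posts (nearly) the same FPS at 1080p
    and at 4K, the CPU - not the GPU - is the bottleneck.
    The 5% tolerance for "same number" is an assumption for illustration."""
    return abs(fps_4k - fps_1080p) / fps_1080p < tol

print(looks_cpu_limited(144.0, 142.0))  # True  - flat across resolutions, CPU bound
print(looks_cpu_limited(144.0, 80.0))   # False - FPS drops at 4K, GPU bound
```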


----------



## big_glasses (Oct 18, 2022)

Anyone know how DLSS 3.0 does in "slower gameplay" games that are CPU limited?
Like Stellaris late-game or other Paradox map games, Cities: Skylines, and similar games


----------



## Gica (Oct 19, 2022)

spnidel said:


> people pushing the narrative to be amazed at a generic generational performance leap and to be grateful for higher prices is really funny


It is the beast of the moment, and no one is forcing you to buy it. You want pure performance in 4K? You buy it... if you can afford it. You want to play decently in 8K? Buy it. You play WoT in 1080p? Don't buy it!
For content creation, this price is really low. It takes 2x 3090 Ti to beat it in this segment.


----------



## HenrySomeone (Oct 19, 2022)

Gica said:


> It is the beast of the moment and no one is forcing you to buy it. You want pure performance in 4K, you buy it... if you can afford it. You want to play decently in 8k, buy it. Play WoT in 1080p, don't buy it!
> For Content Creation, this price is really low. It takes 2x3090Ti to beat it in this segment.


Precisely! Under proper, gpu-constrained circumstances (top cpu, 4k resolution, rt), it is anything but a "generic performance leap" and is simply leagues above everything else and will most likely retain a very healthy lead even against the best rdna3 will have to offer.


----------



## Bwaze (Oct 19, 2022)

No, it simply isn't. It's a similar leap to the ones the RTX 3080 and GTX 1080 Ti made, no matter how you sugarcoat it.


----------



## Gica (Oct 19, 2022)

medi01 said:


> How come a 40%-something perf boost is not a disappointment after the promised "2-4 times faster"?
> 
> 3080Ti vs 2080Ti was a 56% bump, mind you.
> 
> How is 4080 supposed to compete, given it is a heavily cut down version of 4090?


"In full ray-traced games, the RTX 4090 with DLSS 3 is up to 4x faster compared to last generation’s RTX 3090 Ti with DLSS 2"
In this screenshot it's "only" 3x, and surely 4K players are extremely unhappy playing at an average of 120 fps. I suppose it was better at around a 30-40 fps average.


----------



## W1zzard (Oct 19, 2022)

Gica said:


> In this screenshot


This testing is with DLSS 3 Frame Generation disabled, enabling it will double the FPS, so another +100%


----------



## HTC (Oct 19, 2022)

W1zzard said:


> This testing is with DLSS 3 Frame Generation disabled, enabling it will double the FPS, so another +100%



Possibly right, but the problem is that they claimed up to 2x in raster and up to 4x with DLSS 3, and they seem to have failed at the 1st.

Have ANY of the games tested gotten 2 times faster in raster, like nVidia claimed?

Also, 2 times faster VS which card exactly?


----------



## W1zzard (Oct 19, 2022)

HTC said:


> Have ANY of the games tested gotten 2 times faster in raster, like nVidia claimed?


Not that I'm aware of


----------



## karakarga (Oct 19, 2022)

4090 --> Expensive, but good performance.
4080 --> High price for really lower performance; only a 256-bit memory bus for the second member - it should rather have been called a 4070!

The good: 4 nm production, good new DLSS features, AV1 encode & decode, doubled RAM amount for the memory bus.
The bad: still PCIe 4.0, same generation RAM & speed, still no H.266 encode & decode.

I was thinking of buying one, but with those specs I gave up and decided to wait for the 5xxx series instead.


----------



## HenrySomeone (Oct 19, 2022)

karakarga said:


> 4090 --> Expensive, but good performance.
> 4080 --> High price for really lower performance; *only a 256-bit memory bus* for the second member - it should rather have been called a 4070!


Very flawed logic here; let me remind you that AMD's very top card(s) also have exactly a 256-bit bus. If anything, you could complain about the much lower CUDA core count...



karakarga said:


> The bad: *still PCIe 4.0*, same generation RAM & speed, still no H.266 encode & decode.


----------



## karakarga (Oct 20, 2022)

HenrySomeone said:


> Very failed logic here; let me remind you that AMD's very top card(s) also have exactly 256 bit bus. If anything, you could complain about a lot less cuda cores...


I am not counting AMD in any way. Noisy, worth nothing except for Macs. A 256-bit bus is really low; 512-bit is essential today - I want to see 512-bit HBM. Core count doesn't always represent more productivity, and AMD is not good at ray tracing. For nearly 15 years I have not bought AMD graphics cards. I did not forget the failing Sapphire cards - sudden death out of nowhere. My last was an HD 6850; at that time nVidia was more or less DX12 capable starting with the 4xx series, but AMD was not. You can still use 4xx and 5xx GeForce cards with Windows 11, but not AMD cards from that era. A new driver comes from nVidia nearly every 3 weeks; from AMD, not even every 3 months. Never use AMD; let them be used in Macs...


----------



## medi01 (Oct 20, 2022)

Gica said:


> "In full ray-traced games, the RTX 4090 with DLSS 3 is up to 4x faster compared to last generation’s RTX 3090 Ti with DLSS 2"
> In this screenshot, there are "only" 3x and surely 4K players are extremely unhappy playing at an average of 120 fps. It was better around 30-40 fps average.



In this screenshot it is 68/40 => around 1.7 times faster.

"But if I run it at lower resolution and upscale it?" => yes, if you run it at lower resolution it "gets faster" indeed...


----------



## Gica (Oct 21, 2022)

All manufacturers (it doesn't matter if they produce processors or nails) will praise their products based on tests. If in one of these they have a "4x", then they will mark that product as "up to 4x".
I think you remember that with the first Radeon PCIe 4.0, AMD used this aspect with much fanfare. Was it fake? No, but they didn't say that this PCIe 4.0 doesn't bring even 0.000000001% performance increase for those video cards.
Advertising, the soul of commerce. Fortunately, we have reviews for hardware.


----------



## HenrySomeone (Oct 21, 2022)

karakarga said:


> I am not counting AMD in any way. Noisy, worth nothing except for Macs. A 256-bit bus is really low; 512-bit is essential today, and I want to see 512-bit HBM. Core count doesn't always mean more productivity, and AMD is not good at ray tracing. For nearly 15 years I have not bought AMD graphics cards. I have not forgotten the Sapphire cards failing, dying all of a sudden. My last was an HD 6850. At that time Nvidia was more or less DX12 capable starting with the 4xx series, but AMD was not. You can still use 4xx and 5xx GeForce cards with Windows 11, but not AMD cards from that era. A new driver comes from Nvidia nearly every 3 weeks; from AMD, not even every 3 months. I will never use AMD; leave them to Macs...


Ahhh, okay. It's just that 99% of posts like this are from AMD fanboys who never fail to display disgusting double standards, and I guess I just assumed something similar... Still, more than a 256-bit bus on an xx80 series card is far more the exception than the rule, appearing only once in the last 8 years (namely the 3080), and 512-bit only a single time ever (the GTX 280, over 14 years ago).



Gica said:


> All manufacturers (it doesn't matter if they produce processors or nails) will praise their products based on tests. If in one of these they have a "4x", then they will mark that product as "up to 4x".
> I think you remember that with the first Radeon PCIe 4.0, AMD used this aspect with much fanfare. Was it fake? No, but they didn't say that this PCIe 4.0 doesn't bring even 0.000000001% performance increase for those video cards.
> Advertising, the soul of commerce. Fortunately, we have reviews for hardware.


It's interesting how it's always the same group of people who get hung up on these marketing discrepancies, except never with the ones from "their" company...


----------



## Gica (Oct 21, 2022)

It's just as interesting how some minimize one of the most successful video card launches. Probably only Pascal can match such a performance boom from one generation to another (in gaming only), but these "others" send the discussion into the weeds. I'm wondering why?
The prices of raw materials and energy have increased enormously, salaries have also increased, but wait for the good times when an RTX x090 will be 400% more competitive than its predecessor and will cost $49.9.
Prices are closely related to demand. If the demand is high, the prices will be high.

RTX 4090:
1. The first video card that gives you high (60+) fps with maximum details (including RT) in 4K
2. A decent fps in 8K
3. Explosion in Content Creation, by far the biggest jump in performance from one generation to another.


----------



## Valantar (Oct 21, 2022)

karakarga said:


> I am not counting AMD in any way. Noisy, worth nothing except for Macs. A 256-bit bus is really low; 512-bit is essential today, and I want to see 512-bit HBM. Core count doesn't always mean more productivity, and AMD is not good at ray tracing. For nearly 15 years I have not bought AMD graphics cards. I have not forgotten the Sapphire cards failing, dying all of a sudden. My last was an HD 6850. At that time Nvidia was more or less DX12 capable starting with the 4xx series, but AMD was not. You can still use 4xx and 5xx GeForce cards with Windows 11, but not AMD cards from that era. A new driver comes from Nvidia nearly every 3 weeks; from AMD, not even every 3 months. I will never use AMD; leave them to Macs...


Wow, it's really clear that you haven't been near an AMD GPU in a decade if that's how you think things are. AMD has released at least a GPU driver per month for at least all of 2022 (that's all that's listed on their 'previous drivers' site (current drivers are here)), and from my recollection for far longer than that. Whether it's important to you that a 2010 GPU still works in W11 is for you to decide, but I don't see that as a big issue - if your GPU is that old, most likely the rest of your hardware isn't W11 compatible anyway. It also obviously stands to reason that a much larger company like Nvidia will have more resources for long term support. Also, there's no W11 driver download on Nvidia's site for anything older than the 600 series, FWIW.

Oh, and what you're saying about memory buses is nonsense - you can't just do a 1:1 comparison between bus width today and ten years ago and pretend that it's the same - the VRAM itself is _far_ faster, you have memory compression leading to significant speedups on top of that, and of course memory is utilized very differently in games today v. 10 years ago (asset streaming v. preloading etc.). It's still true that overall effective memory bandwidth has gone way down relative to the compute power of GPUs (mostly because compute has gone up _massively_, while memory hasn't become all that much faster), but that's unavoidable if you want to keep GPUs in usable form factors and at even somewhat affordable prices. Oh, and 512-bit HBM would _suck_. The whole point of HBM is its massive bus width - the Fury X had a 4096-bit bus. HBM clocks much lower than GDDR, but makes up for that with more bus width. What you want is at least 2048-bit HBM (current HBM is 4-8 times faster than the HBM1 on the Fury X), but ideally even more.
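The point about bus width is easy to make concrete: theoretical peak bandwidth is bus width (in bytes) times the per-pin data rate, so a narrow bus at HBM clocks goes nowhere. A small sketch using published figures (Fury X: 4096-bit HBM1 at 1 Gbps/pin; RTX 4090: 384-bit GDDR6X at 21 Gbps/pin):

```python
def peak_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak memory bandwidth in GB/s:
    (bus width in bits / 8 bits per byte) * per-pin data rate in Gb/s."""
    return bus_width_bits / 8 * data_rate_gbps

fury_x = peak_bandwidth_gbps(4096, 1.0)    # HBM1: 512.0 GB/s
rtx_4090 = peak_bandwidth_gbps(384, 21.0)  # GDDR6X: 1008.0 GB/s
# A hypothetical 512-bit bus at HBM1 speeds, as suggested above:
narrow_hbm = peak_bandwidth_gbps(512, 1.0)  # only 64.0 GB/s
print(fury_x, rtx_4090, narrow_hbm)
```

This is why "512-bit HBM" would be a downgrade: HBM only works because its bus is thousands of bits wide.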



Gica said:


> It's just as interesting how some minimize one of the most successful video card launches. Probably only Pascal can match such a performance boom from one generation to another (in gaming only), but these "others" send the discussion into the weeds. I'm wondering why?
> The prices of raw materials and energy have increased enormously, salaries have also increased, but wait for the good times when an RTX x090 will be 400% more competitive than its predecessor and will cost $49.9.
> Prices are closely related to demand. If the demand is high, the prices will be high.
> 
> ...


The 4090 is definitely a very fast GPU, but there have been plenty of calculations showing that its generational gains aren't _that_ special - it's just that the past couple of generations had particularly small gains, while this is more of a return to the previous norm. On the other hand, it only manages this thanks to a 1.5-2x node jump.

Also, BOM costs are indeed higher than previously, so some cost increases do make sense, but you need to remember that this is a GPU with an MSRP 2-3x higher than its predecessors (except for the 30 series, which brought prices to this level to begin with). Also, salaries have increased? Really? Where? For who? In most of the Western world, middle class wages have been stagnant for decades.


----------



## Gica (Oct 21, 2022)

Valantar said:


> The 4090 is definitely a very fast GPU, but there have been plenty of calculations showing that its generational gains aren't _that_ special - it's just that the past couple of generations had particularly small gains, while this is more of a return to the previous norm. On the other hand, it only manages this thanks to a 1.5-2x node jump.
> 
> Also, BOM costs are indeed higher than previously, so some cost increases do make sense, but you need to remember that this is a GPU with an MSRP 2-3x higher than its predecessors (except for the 30 series, which brought prices to this level to begin with). Also, salaries have increased? Really? Where? For who? In most of the Western world, middle class wages have been stagnant for decades.


At a performance increase of over 100% in Content Creation, where time means money, a creative studio amortizes its purchase price in less than a month. For gaming, even a GT 1030 is nothing more than an expense for entertainment.
P.S. The video cards are not manufactured in the West, and the carriers are not from the middle class of the West.
PPS. Someone buy them if you can't find them available here, even with prayers.
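The amortization claim is back-of-the-envelope arithmetic. A sketch with made-up numbers (the card price matches the 4090's MSRP, but the hours saved and hourly rate are purely illustrative assumptions, not figures from the review):

```python
def payback_days(card_price: float, hours_saved_per_day: float,
                 hourly_rate: float) -> float:
    """Working days until the time saved pays for the card."""
    return card_price / (hours_saved_per_day * hourly_rate)

# Hypothetical: a 2x render speedup frees 2 hours of a $50/h artist's day.
print(payback_days(1600, 2, 50))  # 16.0 working days, under a month
```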


----------



## Valantar (Oct 21, 2022)

Gica said:


> At a performance increase of over 100% in Content Creation, where time means money, a creative studio amortizes its purchase price in less than a month.


... And? Does that somehow make its generational performance increase any bigger than it is?


Gica said:


> For gaming, even a GT 1030 is nothing more than an expense for entertainment.


Again: and? Is entertainment not something worth paying for?


Gica said:


> P.S. The video cards are not manufactured in the West


... did anyone say so?


Gica said:


> and the carriers are not from the middle class of the West.


"Carriers"? 

Most high end/expensive GPUs are sold in the West, and as they're priced well out of reach of anyone not at least middle class, that and above is the target market. Asia is definitely growing into a juggernaut of a PC market, but is still dominated by lower end hardware due to overall lower income levels.


----------



## CarlkCarson (Oct 21, 2022)

DLSS 3 Frame Generation doesn't work in Windows 11 insider build 25227; it crashes games. Does anyone know of a way to fix it?


----------



## wheresmycar (Oct 22, 2022)

Gica said:


> *It's just as interesting how some minimize one of the most successful video card launches.* Probably only Pascal can match such a performance boom from one generation to another (in gaming only), but these "others" send the discussion into the weeds. I'm wondering why?
> The prices of raw materials and energy have increased enormously, salaries have also increased, but wait for the good times when an RTX x090 will be 400% more competitive than its predecessor and will cost $49.9.
> Prices are closely related to demand. If the demand is high, the prices will be high.



Interesting? Should have been the other way around for the 99%-99.9% of gamers. How about:

"its rather interesting how a tiny handful of gamers are buying $1600-$2000 4090s... no seriously its interesting!! But thats about all, just interesting!


"its rather interesting we the 99ers have to put up with the ~1% percenters raising miniscule DEMAND which runs price-inflated shockwaves through the wider all-tier SUPPLY war machine... poor ole disinterested and disparaging 99er anomalies - empty your bank accounts pls!!"


"its rather interesting the most successful video card launch surrounding the flagship is unambiguously targeted at the ~1%.... can't believe the rest of them never came out in flip flops singing songs of nV triumphant hymns"


"its rather interesting 2X-4X raster performance didnt even make it to the flagship, God knows what 4000 series has in store for the non-flagship models.... maybe a 0.3x increase for the bottom barrel scrapers and a price justifiable free lollypop - that ought to shup em up!"


"its rather interesting the XX80 club is now banishing whopping $800 spends... silly minions no entry for you guys unless you top up with $400 more or above the already inflated $700-$800 recent spikes (which never settled) for the VIP green card.... Oh, and supposedly the increase in wages should help by ~2025"


"its rather interesting with a little noise our holy omniscent NVIDIA pulled the 4080-12.... actually interestingly laughable for the $900 asking price if you ask me..... whats more interesting is NVIDIAs gigantic balls for the attempt in the first place"


"its rather interesting I was interested but now disinterested... so much for the 'most successful video card launches' for the ~1% vacuum which is hardly a success measuring tool for the all-encompassing wider gaming consumer who almost fits the entire pie. That leaves us room for celebration for nV and the VIP ~1% or ~0.1%... and not forgetting NVs extremely successful profiteering punt which seemingly fortifies exortionate pricing standardisation across the board....definitely deserves a BIG SALUTE to those heavy jaw busting pocket rinsing weighty nV BALLS - only i'm gutted, i don't get a dividend to share the same level of self-indulgent satisfaction"
No qualms with the content creator segment...



> ....but wait for the good times when an RTX x090 will be 400% more competitive than its predecessor and will cost $49.9.



$49.90 :O

$0.10 less from 50 bucks... thats a steal!!! Although i suspect, by then i'll be bald, old and sold 6-feet under six-fold

LASTLY, to add perspective... nVs set the $-bar very high and no doubt RDNA3 will follow (to some extent anyway) = hardly exciting times for the gaming consumer! lol everyones banging on about platform costs, mobo premiums and DDR5 mark-ups... rightly so! but little stones compared to nVs big rock of BALLS


----------



## Gica (Oct 22, 2022)

Something is not right with you. We complain about prices, but we missed the session where the fundamental principle of the market economy was taught: "demand sets the price".
If AMD* launches a product, not similar but even more powerful in all aspects, do you think they will sell it cheap???? Hint: look at the price of their processors

*AMD, because it is the only real competitor for nVidia, but you can also look at the energy market.


----------



## Valantar (Oct 22, 2022)

Gica said:


> "demand sets the price".


Literally nobody in the world with any basic understanding of how economics works would agree with your utterly simplistic application of this idea here. The 4090 isn't $1600 because of demand, it's $1600 because Nvidia feels comfortable pricing it at that level. They didn't just offer up the GPU to "the market" at whatever price and then $1600 spontaneously appeared out of the ether - they picked a price they saw as giving them the highest possible profits while still selling - and they've been working hard for years and years to normalize the idea of ever more expensive GPUs.


----------



## Gica (Oct 22, 2022)

Putin's army is not coming to force you to buy video cards based on some... referendum. The money is yours and you decide what to buy. The price adjusts according to demand, and demand clearly exists, because in stores they are even more expensive.
Let's leave the past behind. Gone are the days when a video card had 2-3 million transistors, 3 mosfets, and 5 grams of aluminum.

Excursions in the cosmos.
Sports cars.
Holiday in Dubai.
5-star hotels and restaurants.
The latest iphone
Million dollar jewelry
and so on
They all cater to the enthusiast in their category. The price you pay is definitely not a fair one, but that's how the market works. You want a "Ferrari" video card? Pay up, or sit on the sidelines and complain that the grapes are sour.


----------



## Valantar (Oct 22, 2022)

Gica said:


> Putin's army is not coming to force you to buy video cards based on some... referendum. The money is yours and you decide what to buy. The price adjusts according to demand, and demand clearly exists, because in stores they are even more expensive.
> Let's leave the past behind. Gone are the days when a video card had 2-3 million transistors, 3 mosfets, and 5 grams of aluminum.
> 
> Excursions in the cosmos.
> ...


Seriously, you really, really, really need to brush up on some very, very basic principles of economics. Luxury products like that are a prime example of the fact that supply v. demand as a price-regulating mechanism is a gross oversimplification that only has any applicability on a very large scale, abstracted from any specific product. You also seem entirely blind to the fact that demand isn't just something that spontaneously appears in the world, but is itself fundamentally bound up in marketing, advertising, culture, money, and a lot more. Your "arguments" here are laughably simplistic.


----------



## Gica (Oct 22, 2022)

Correct. They are simplistic, but real. If there is no demand for their offer, prices will drop. If demand exceeds supply, prices will rise. Damn, simplistic and true, right? According to this simplistic indicator, the market worked at the launch of the RTX 3000 and Radeon 6000 series. An explosion of demand triggered all the madness, and everything calmed down when the demand decreased.
Intel, without a competitor, sold octa-cores for $2000 in 2017. In 2018, the same class of processor sold for $800 when AMD came back into the game. If AMD succeeds in dethroning Nvidia in performance on all levels (rasterization, RT, content creation), do you think their cards will have a more customer-friendly MSRP? You'd be stupid to believe that; again, just look at their processors.


----------



## wheresmycar (Oct 23, 2022)

Gica said:


> Something is not right with you.



i tell that to myself every morning... a quick shower, a light breakie and a nice HOT cuppa tea gets me sorted double time 



Gica said:


> We complain about prices, but we missed the session where the fundamental principle of the market economy was taught: *"demand sets the price".*



That's incorrect. Please don't try to validate this claim as you have done in some of the other posts; instead, spend a little time looking into the basic economic concepts around supply and demand and you'll have your answer. On screen or in a hardback economics text, "price" will often appear as a relevant piece of the puzzle, but this should not be confused with "demand" being a "price-setter".



Gica said:


> If AMD* launches a product, not similar but even more powerful in all aspects, do you think they will sell it cheap???? Hint: look at the price of their processors
> 
> *AMD, because it is the only real competitor for nVidia, but you can also look at the energy market.



Not entirely sure how the AMD correlation fell into our laps, but if it helps - I absolutely don't believe AMD will sell superior performance for less, not in a million years. Even worse, should AMD fall short of nV's performance feat, I still suspect RDNA3's top-end models will cost a BOMB... a highly undesirable outcome!! It's no secret: nV setting the stage and AMD tagging along hardly helps "us", the dissatisfied potential buyers. TBH, it surprises me to witness people (consumers) bending over backwards in defence of these awful price hikes... for me that's 10-fold more startling than witnessing someone picking up a 4090 for gaming. The buyer fed his impulse/desire BUT the partisan vocalist fed his ........... [still tryna work that one out]


----------



## Valantar (Oct 23, 2022)

Gica said:


> Correct. They are simplistic, but real. If there is no demand for their offer, prices will drop. If demand exceeds supply, prices will rise. Damn, simplistic and true, right? According to this simplistic indicator, the market worked at the launch of the RTX 3000 and Radeon 6000 series.


... but this simplistic approach _entirely_ fails to account for how and why this situation in fact indicated the exact opposite of a well functioning market.


Gica said:


> An explosion of demand triggered all the madness, and everything calmed down when the demand decreased.


... and? The entire point here is that "supply and demand determines prices" is a _gross_ oversimplification. Demand can be faked, or driven up in all kinds of manipulative or outright fraudulent ways. Supply can be artificially restrained - like Nvidia is currently doing with RTX 3000 GPUs. Even the balance between supply and demand can be tweaked and manipulated in all kinds of ways. Not to forget the fact that _none of this is a given_. It's all predicated on the premise that everyone in the value chain is out only for profiteering, and will drive up prices at any given opportunity. This is often the case - and it is indeed a mode of operations that capitalist ideologies promote very strongly - but it's not a universal law.


Gica said:


> Intel, without a competitor, sold octa-cores for $2000 in 2017. In 2018, the same class of processor sold for $800 when AMD came back into the game. If AMD succeeds in dethroning Nvidia in performance on all levels (rasterization, RT, content creation), do you think their cards will have a more customer-friendly MSRP? You'd be stupid to believe that; again, just look at their processors.


... did I say any of that? It'd be nice if you actually made arguments of your own instead of drawing up ludicrous straw men. Nvidia has been working _hard_ over the past few years to drive up premium GPU prices and to normalize the idea of >$1000 GPUs as a thing that has any reasonable existence in the market. There isn't a natural, pre-existing demand for these things - demands are manufactured, just as products are. That's what marketing is for, and what long term business and marketing strategies do. This is why luxury products operate entirely outside of the realm of supply and demand - because it's _all_ manipulation, all the way. The premise of the logic of supply and demand regulating prices is that this happens in a value-neutral, non-manipulative market situation, not in one where market actors are working as hard as they can to push things in one direction or the other. And it's a model logic that entirely ignores these external manipulations, which is where it becomes woefully simplistic in relation to the actual real world, which is far, far, far too complex for any such logic to have anything but _very_ broad applicability.


----------



## Gica (Oct 24, 2022)

You still don't understand that you have the freedom NOT to buy. And if no one buys, prices will drop or the product will disappear. As long as you have no intention of buying, why are you complaining? I don't know if you noticed, but for processors too, r5/i5 parts are now sold at r7/i7 prices.
This is the offer and I don't think your girls mind as long as they have buyers.


----------



## wheresmycar (Oct 25, 2022)

Gica said:


> You still don't understand that you have the freedom NOT to buy.



We have been rescued!!

Thank you Detective Gica... you have solved the greatest enigma known to man    I shall now use the full might of my power, and newly discovered ability, and fight the green monster without lifting another keystroke finger.



Gica said:


> As long as you have no intention of buying, why are you complaining?



Oh trust me, we (or i can speak for myself) have all the intention in the world to "buy" and i've been waiting since 2017 for a solid upgrade. Not just the intention to buy, but "already" willing to fork out an exorbitant sum of cash seeing how GPU prices have excessively sky-rocketed in the last 6/more years (of course, with NVIDIA captaining the wheel). Now with 40-series i'm speechless, which takes my presupposed huge chunk of cash (~$800) and turns it into a bad joke. Forget the 4090 inflated contagion - a 4080 for $1200??? Or maybe a 4070 for $900? Seems like the green monster's appetite has grown tremendously since taking your nonconformist Freedom classes, with a newly assumed position to neglect the majority of GPU buyers (nV: "tough luck biatches"). Oh well, we get to feed our disappointment once again but only this time we get to use our newly discovered Gica certified weapon of choice "Freedom NOT to buy" hahahaha


----------



## Valantar (Oct 25, 2022)

Gica said:


> You still don't understand that you have the freedom NOT to buy.


How do I not understand that? What part of my argument says that people don't have that right?

On the other hand, I would contend that your argument takes as its basis an idea of customers as free, rational actors which is woefully out of touch with any actual reality, and has been disproven time and time again. Human beings do not exist or act in a vacuum, and we are strongly affected by the physical and cultural context we are embedded in at all times. Speaking of "free to not buy" in that situation is so woefully simplistic that even a child would understand it.


Gica said:


> And if no one buys, prices will drop or the product will disappear.


... yes, that is why we have marketing and complex structures geared towards generating and maintaining quasi-artificial demands (not that there's any such thing as a "real" demand beyond the most basic physical human needs, which don't really come into play in developed societies).


Gica said:


> As long as you have no intention of buying, why are you complaining?


... because of companies exploiting customers, and abusing their massive power? Oh, right, you seem like one of those libertarians who deny the existence of power dynamics and structures beyond individual people at all, so you probably don't see this happening at all. Too bad that's just a failure of your analysis of society, rather than anything resembling reality.


Gica said:


> I don't know if you noticed, but for processors too, r5/i5 parts are now sold at r7/i7 prices.


... yes? Has anyone here said this isn't also bad?


----------



## Gica (Oct 25, 2022)

They could set an MSRP of $0.99 and let the market regulate the final price. Fck, it would still go to $2000, but we could blame the store.
You lived through the launch of the RTX 3000 and Radeon 6000 series and you still haven't understood that MSRP means nothing in relation to the final price. In the case of the 4090: MSRP = $1600 and the final price exceeds $2000.
All the best.


----------



## Valantar (Oct 25, 2022)

Gica said:


> They could set an MSRP of $0.99 and let the market regulate the final price. Fck, it would still go to $2000, but we could blame the store.
> You lived through the launch of the RTX 3000 and Radeon 6000 series and you still haven't understood that MSRP means nothing in relation to the final price. In the case of the 4090: MSRP = $1600 and the final price exceeds $2000.
> All the best.


And you still have absolutely zero conception of market manipulation (which comes in too many forms and from too many directions to count) or the massively complex and irrational dynamics causing these things to happen, instead holding onto the fantasy that this is explained by the "rational" libertarian idea of supply and demand in a free market. Which just demonstrates that your theory of economics is the woefully naive and massively disproven libertarian one - one that insists on simplicity and rationality above all else. It is literally impossible to have a productive conversation about anything involving economics with someone so insistent on strict adherence to such an ideology.


----------



## Gica (Oct 26, 2022)

Explain that...smartass.
RTX 3090: $1499 msrp
RTX 3090Ti: $1999 msrp
RTX 4090: $1599 msrp
For a company, the best product is not necessarily the best performing. It is the one that brings the most profit. It's no wonder you look like fools at Putin and don't understand anything.


Beyond gaming:
Overall, the new NVIDIA GeForce RTX 4090 24GB GPU represents a massive leap in GPU performance. The exact amount depends highly on the application, with the greater benefit of course being found when the GPU is the primary bottleneck to performance.

For *video editing, the RTX 4090 can be as much as 40% faster than the previous generation RTX 3090 and 3090 Ti, or almost 2x faster than the older RTX 2080 Ti*. The RTX 40 Series also brings about a small performance boost for those using the GPU for either hardware decoding or encoding of H.264 and HEVC media.

*Unreal Engine sees an even greater performance gain, with the RTX 4090 giving us roughly an 85% increase in FPS over the RTX 3090 and 3090 Ti across all our tests. *Depending on the exact use case (ArchViz, Virtual Production, etc.), that means either faster renders, smoother performance, or the capacity for increased detail.

Lastly, GPU rendering is really where you are going to get the most out of a more powerful GPU, and the RTX 4090 comes through in spades. *GPU Rendering is often nearly twice as fast as the previous generation RTX 3090 or 3090 Ti, or four times faster than the older RTX 2080 Ti.*


----------



## 80-watt Hamster (Oct 26, 2022)

Gica said:


> Explain that...smartass.
> RTX 3090: $1499 msrp
> RTX 3090Ti: $1999 msrp
> RTX 4090: $1599 msrp
> ...



Um. What are you on about here? And what does Putin have to do with any of this?


----------



## wheresmycar (Oct 26, 2022)

80-watt Hamster said:


> Um. What are you on about here? And what does Putin have to do with any of this?



Apparently Gica believes Putin is taking over the world by forcing everyone to buy 40-series RTX cards... Gica is counteracting and thwarting Putin's plans by offering free online "Freedom NOT to buy" classes. In Gica's very first enlightening class, Gica cracks the uncrackable by revealing the TRUE mechanics of GPU price-setting - apparently it's the AI autonomous market which sets the price, not NVIDIA. All this time we had been fooled! Not anymore! With this newly discovered truth, in Day 2 (second class) Gica teaches how to fight back without lifting a single finger - just close your eyes and repeat thrice:

"Freedom NOT to buy"
"Freedom NOT to buy"
"Freedom NOT to buy"

It worked for me!! Putin's got nothing on us, not with Sensei Gica at the helm of this tide-turning pacifist ship



Gica said:


> For a company, the best product is not necessarily the best performing. *"It is the one" that brings the most profit. *It's no wonder you look like fools at Putin and don't understand anything.



Yes SENSEI GICA..... dont worry these fools dont understand.

_*"It is the one"*_ .... if only these fools knew the "AI autonomous market" is "THE ONE", the one who munches up demand and spews out MSRPs and generously fills nV's pockets with huge profits. I hope the market isn't burdening nV too much with all this unforeseen weight.


----------



## Gica (Oct 28, 2022)

If you don't understand how the market works (and it's clear that you don't), visit this site.
At best buy (1499 dollars) you can't find anything in stock.
At a price slightly above MSRP, you can find the most awaited in the queue.
You can find plenty of RTX4090 at 2000 dollars and even 2500 dollars.
Why? Because they are wanted. Despite your complaints, the market has adjusted this product to a much higher price than msrp. The exception proves the rule.


----------



## Valantar (Oct 28, 2022)

Gica said:


> If you don't understand how the market works (and it's clear that you don't), visit this site.
> At best buy (1499 dollars) you can't find anything in stock.
> At a price slightly above MSRP, you can find the most awaited in the queue.
> You can find plenty of RTX4090 at 2000 dollars and even 2500 dollars.
> Why? Because they are wanted. Despite your complaints, the market has adjusted this product to a much higher price than msrp. The exception proves the rule.


... Does that link contain some hidden market analysis showing why MSRPs of non-founders cards are higher but that isn't visible to ordinary readers? All I could see was some bog-standard reporting on prices and availability. And MSRPs are higher to cover costs and provide livable margins for AIB partners, which have a less advantageous market position than Nvidia and thus need to price themselves higher in order to survive. 

That there is demand doesn't change the fact that this demand had been created through years of marketing and careful manipulation of expectations and accepted pricing levels by Nvidia, nor does it change the fact that AIB partners are pricing their cards higher not because "the market makes them do it", but because they would be losing money if they tried selling these at Nvidia's MSRP.

As I've said a dozen times by now: your analysis is simplistic and utterly fails to explain how these things work. Pretending that this isn't an extremely complex, multi-factor system doesn't change the fact that it is exactly that.


----------



## Gica (Oct 30, 2022)

Access the links that lead directly to the product.
Sold out at every "best buy" listing at $1,499, waiting lists up to $2,000, and actual availability from $2,200 and up. You still can't buy one in Romania, where prices start at $2,200 for the most basic implementation. Do you think they were available at the $1,000 MSRP?
Let's not forget that AMD had a much more attractive MSRP for the 6000 series, but it was impossible to find anything on the market at that price. Yes, a $1,000 MSRP at launch, but you couldn't find a 6900 XT under $2,000.
Good luck learning market economics for beginners.


----------



## Night (Oct 30, 2022)

Gica said:


> Let's not forget that AMD had a much more attractive MSRP for the 6000 series, but it was impossible to find anything on the market at that price. Yes, a $1,000 MSRP at launch, but you couldn't find a 6900 XT under $2,000.


That was due to the cryptocurrency hysteria inflating GPU prices by 100% or more. Now it's blamed on worldwide inflation. Personally, I find hype to be Nvidia's main marketing tool. I never buy into such hype; I wait for reviews and compare multiple charts if I think a product may be worth my money, and only then draw a conclusion. I'd never pay $1,500 (let alone $2,000+) for 4K120 today when in two years the same card will only manage 4K60, unless I had excess money lying around, ready to be wasted for no good reason.


----------



## Gica (Oct 31, 2022)

In short, it was demand. It doesn't matter what caused that anomaly; demand dictates the asking price. If no one bought the RTX 4090 for $1,500, you could find it in every store at a price just below MSRP. Unfortunately, it's the other way around, and you can barely find it for $2,000 even with crypto GPU mining dead. High demand also comes from creators, who buy the RTX 4090 at MSRP and amortize the investment very quickly. Gamers have been and will remain in the queue ever since the Quadro restrictions were mostly removed.
Buy at the market price or cry on the forums; it's their choice.


----------



## Valantar (Oct 31, 2022)

Gica said:


> In short, it was demand. It doesn't matter what caused that anomaly; demand dictates the asking price.


I mean, that is some of the most beautifully circular reasoning I've ever seen. Well done! The mechanisms and systems behind what is perceived as supply and/or demand don't matter whatsoever; as long as it can be described in those terms in some way, your theory applies. Not a self-fulfilling prophecy at all that, no sir, not at all.

Seriously, your approach to economics and markets is so woefully naive and underinformed that it's downright frightening. You're truly embodying the libertarian refusal to engage whatsoever with the concrete realities of the world or its power dynamics, instead keeping to the broadest, most vague categories possible - into which literally everything fits.


Gica said:


> If no one bought the RTX 4090 for $1,500, you could find it in every store at a price just below MSRP. Unfortunately, it's the other way around, and you can barely find it for $2,000 even with crypto GPU mining dead.


Oh, and you also seem to not know the difference between MSRP and price?


Gica said:


> High demand also comes from creators, who buy the RTX 4090 at MSRP and amortize the investment very quickly.


... Just how many creators with that kind of budget do you think exist in the world?


----------



## Gica (Nov 1, 2022)

Valantar said:


> Oh, and you also seem to not know the difference between MSRP and price?


MSRP = Mi Se RuPe, Romanian street slang meaning, roughly, "I couldn't care less" (with a cruder literal sense). It's a play on words that perfectly mirrors reality, and that reality only takes demand into account when determining the selling price.


----------



## Valantar (Nov 1, 2022)

Gica said:


> MSRP = Mi Se RuPe, Romanian street slang meaning, roughly, "I couldn't care less" (with a cruder literal sense). It's a play on words that perfectly mirrors reality, and that reality only takes demand into account when determining the selling price.


Man, your head must be a fascinating place to live in. No regional differences in pricing, no power dynamics in trade, no market momentum, manipulation, scalping, artificial demand or artificially constrained supply, no taxes, import fees or local market variances affecting prices. No, it all boils down to the most simplistic and broad conception of "supply and demand" possible, with absolutely zero interest in looking below the surface. No curiosity or basic critical thinking, since it all adds up in its superficial circularity. Makes perfect sense, of course.


----------



## Ravenas (Nov 6, 2022)

W1z, how is not having PCIe 5.0 not a con?


----------



## Lycanwolfen (Nov 7, 2022)

Does anyone have any tests without DLSS turned on, and with the settings done in the control panel rather than in the game?


----------



## Valantar (Nov 7, 2022)

Ravenas said:


> W1z, how is not having PCIe 5.0 not a con?


Because it has absolutely zero impact on anything relevant to the card? PCIe 4.0 x16 is in no way a bottleneck, 5.0 wouldn't improve performance in any way, and would just add unnecessary cost.

Would you complain if your USB HDD didn't have 40Gbps Thunderbolt also? Adding a higher bandwidth interface increases cost and development complexity, and PCIe 5.0 might even drive up BOM costs by requiring fancy circuitry to maintain signal integrity (at least it does on motherboards). So, it would increase costs and make everything more complex, but bring zero real-world benefits. How is the absence of that a con?


----------



## Ravenas (Nov 7, 2022)

Valantar said:


> Because it has absolutely zero impact on anything relevant to the card? PCIe 4.0 x16 is in no way a bottleneck, 5.0 wouldn't improve performance in any way, and would just add unnecessary cost.
> 
> Would you complain if your USB HDD didn't have 40Gbps Thunderbolt also? Adding a higher bandwidth interface increases cost and development complexity, and PCIe 5.0 might even drive up BOM costs by requiring fancy circuitry to maintain signal integrity (at least it does on motherboards). So, it would increase costs and make everything more complex, but bring zero real-world benefits. How is the absence of that a con?



PCIe 5.0 could and would provide FPS gains, potentially 5 to 10 FPS on average.

I don't need a metaphor comparing HDDs and lacking feature sets to understand. It's also a con for the 7900 XTX.

These cards were likely in production prior to the Z590 release. If I'm handing over $1,800 from my bank account, the card needs to support current bandwidths. I'm not asking for future technology that isn't on the market yet, like DP 2.0+.


----------



## Valantar (Nov 7, 2022)

Ravenas said:


> PCIe 5.0 could and would provide FPS gains, potentially 5 to 10 FPS on average.


Based on what? Where would these gains be coming from?


Ravenas said:


> I don't need a metaphor comparing HDDs and lacking feature sets to understand. It's also a con for the 7900 XTX.


No, it is utterly and completely meaningless. These GPUs lose only a few percentage points of performance when moving to PCIe 3.0 x16, meaning even that isn't a noticeable bottleneck for them. Thus, doubling the bandwidth over 4.0 x16 would most likely have near-zero effect on performance.


You're making statements with zero factual basis; the only possible source is speculative extrapolation from current and lower PCIe generations - but if so, that extrapolation has methodological flaws, as PCIe scaling tests even for the 4090 show no meaningful sign of a bottleneck at PCIe 3.0 x16.


Ravenas said:


> These cards were likely in production prior to the Z590 release. If I'm handing over $1,800 from my bank account, the card needs to support current bandwidths. I'm not asking for future technology that isn't on the market yet, like DP 2.0+.


Z590? Do you mean Z690? Neither the GPUs nor the cards were in production in early 2021, that's for sure - most likely AD102 entered mass production around six months before launch, meaning it was taped out a bit before that. So it might roughly have coincided with the launch of Z690, sure.

But here's the issue with demanding this: what are you getting in return? It will not affect performance; it will only increase costs. PCIe 5.0 has zero use for consumers. It's extremely useful in datacenters, which is why it replaced 4.0 so rapidly there, but datacenters also don't care about hardware costs and can use all the bandwidth you throw at them. Consumer GPUs are not bandwidth-constrained in any common usage scenario, so pushing for faster PCIe is just pushing for higher costs with nothing to show for it. It makes no sense whatsoever. Nvidia and AMD not pushing PCIe 5.0 on these GPUs is among the most sensible things they've done these past few years.
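For anyone wanting the raw numbers, the theoretical per-direction bandwidth of an x16 slot is easy to work out from the per-lane transfer rate and the 128b/130b encoding used since Gen 3 (a quick back-of-the-envelope sketch, nothing more):

```python
# Theoretical per-direction bandwidth of a PCIe x16 link.
# Gen 3 runs 8 GT/s per lane with 128b/130b encoding;
# Gen 4 and Gen 5 keep the encoding and double the rate each step.
GENS = {3: 8e9, 4: 16e9, 5: 32e9}  # transfers/second per lane
ENCODING = 128 / 130               # usable payload fraction
LANES = 16

for gen, rate in GENS.items():
    gb_per_s = rate * ENCODING / 8 * LANES / 1e9
    print(f"PCIe {gen}.0 x16: ~{gb_per_s:.1f} GB/s per direction")
```

That gives roughly 15.8, 31.5 and 63 GB/s for Gen 3/4/5 x16 - and if a card barely flinches when capped at the Gen 3 figure, doubling the Gen 4 one buys nothing.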


----------



## Solaris17 (Nov 7, 2022)

Ravenas said:


> PCIe 5.0 could and would provide FPS gains, potentially 5 to 10 FPS on average.



What?? Where are you getting this?


----------



## Nopa (Nov 7, 2022)

Ravenas said:


> PCIe 5.0 could and would provide FPS gains, potentially 5 to 10 FPS on average.


5-10 more FPS? I'm all for it, even 1! Every gain matters no matter how small, imho.
I'd be interested to know, though, whether 5.0 x16 does anything better than 4.0 x16 or 3.0 x16 when it comes to RT.


----------



## Valantar (Nov 7, 2022)

Nopa said:


> 5-10 more FPS? I'm all for it, even 1! Every gain matters no matter how small, imho.


... even when you're paying noticeably more for it? That makes no sense whatsoever.


Nopa said:


> I'd be interested to know, though, whether 5.0 x16 does anything better than 4.0 x16 or 3.0 x16 when it comes to RT.


No. RT is bottlenecked by the GPU's RT compute capabilities (and also the CPU's ability to keep up), but not at all by PCIe bandwidth.


----------



## dgianstefani (Nov 7, 2022)

Nopa said:


> 5-10 more FPS? I'm all for it, even 1! Every gain matters no matter how small, imho.
> I'd be interested to know, though, whether 5.0 x16 does anything better than 4.0 x16 or 3.0 x16 when it comes to RT.





Ravenas said:


> PCIe 5.0 could and would provide FPS gains, potentially 5 to 10 FPS on average.
> 
> I don't need a metaphor comparing HDDs and lacking feature sets to understand. It's also a con for the 7900 XTX.
> 
> These cards were likely in production prior to the Z590 release. If I'm handing over $1,800 from my bank account, the card needs to support current bandwidths. I'm not asking for future technology that isn't on the market yet, like DP 2.0+.


Do you understand the concept of "adding bandwidth potential doesn't matter when the existing bandwidth was nowhere near saturated"?

I guess what the market really needed right now was another price hike on current-gen motherboards, with OEMs upselling a "feature" that benefits zero people (PCIe Gen 5 SSDs aren't even out yet, and it's doubtful even next-generation cards will saturate the PCIe Gen 4 bus).


----------



## Valantar (Nov 7, 2022)

dgianstefani said:


> Do you understand the concept of "adding bandwidth potential doesn't matter when the existing bandwidth was nowhere near saturated"?


Doesn't seem like it, no.


dgianstefani said:


> I guess what the market really needed right now was another price hike on current-gen motherboards, with OEMs upselling a "feature" that benefits zero people (PCIe Gen 5 SSDs aren't even out yet, and it's doubtful even next-generation cards will saturate the PCIe Gen 4 bus).


They 99.9% sure won't, given that current cards aren't really saturating even PCIe 3.0 consistently. And even PCIe 5.0 SSDs are meaningless outside of large sequential file transfers, which most consumers don't spend much of their time doing. The NAND is still the same, so random and mixed performance will be the same - and outside the effects of slightly faster controller cores, PCIe 3.0 and 4.0 drives perform pretty much identically in those metrics. The only reason PCIe 4.0 drives are faster today is that nobody is producing PCIe 3.0 drives with premium NAND and controllers; if you had a 3.0 controller paired with the fastest NAND available today, that drive would most likely match 4.0 drives in any non-sequential task. In real-world applications, even the fastest 4.0 drives don't come close to saturating PCIe 3.0 bandwidth.
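To put rough numbers on that (the ~70 MB/s random-4K figure below is an assumed ballpark for a fast consumer drive, purely for illustration - only the link math is from the spec):

```python
# Compare an x4 NVMe link's theoretical bandwidth with what a drive
# actually pushes in a random-I/O workload.
ENCODING = 128 / 130  # 128b/130b payload fraction (PCIe 3.0 onward)
LANES = 4

link_gen3 = 8e9 * ENCODING / 8 * LANES / 1e9    # ~3.9 GB/s
link_gen4 = 16e9 * ENCODING / 8 * LANES / 1e9   # ~7.9 GB/s

random_4k = 0.07  # GB/s -- assumed ballpark for random 4K reads at low queue depth

print(f"Gen3 x4 link: ~{link_gen3:.1f} GB/s, Gen4 x4 link: ~{link_gen4:.1f} GB/s")
print(f"Random 4K: ~{random_4k * 1000:.0f} MB/s, "
      f"i.e. {random_4k / link_gen3:.1%} of even a Gen3 x4 link")
```

A random-I/O workload in that ballpark uses around 2% of a Gen 3 x4 link, which is why moving the same NAND to a faster bus changes nothing outside sequential transfers.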

PCIe 5.0 for consumers is a clear-cut case of Intel and AMD being caught in a destructive game of one-upmanship, where they "have no choice" but to include whatever harebrained feature they can _because they exist_, because if they don't they'll get all kinds of shit from customers complaining that their competitor has [feature X that nobody can make use of].


----------



## Assimilator (Nov 23, 2022)

@W1zzard The table on the first page has GA102 at 28 billion transistors, while [the DB](https://www.techpowerup.com/gpu-specs/nvidia-ga102.g930) has it at 28.3 billion. The others appear to be correct (or at least correspond to what the DB says).


----------

