
NVIDIA GeForce RTX 4090 Founders Edition

Joined
Jan 27, 2015
Messages
1,747 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 2S
Keyboard Logitech MX Keys
Software Lots
The only bad information here is that you are implying that $350 is the price the card was broadly available for. $350 was the Founders Edition price, which almost no one saw. Scalping aside, AIB pricing started at $400+, as was typical for AIB cards versus the Founders Edition.

Mind you, that's comparing the 6GB 2060 to the 6GB 1060: zero increase in VRAM. The 12GB 2060 cost a whopping $600 at the time of its TechPowerUp review.

You paid $309 for a 2060 KO, which launched March 3rd, 2020, a whopping 14 MONTHS after the January 7th, 2019 launch of the 2060. Suffice it to say, it's extremely misleading to compare late-generation prices to MSRP.


It is you who is trying to inject "market price" from 3-4 years ago. I am using MSRP. I'll keep on using MSRP. The reasoning for comparing on that basis has been explained ad nauseam, not only here but on countless other sites. If demand is too high, the AIBs and retailers pocket the difference. If it's too low, they take a hit. MSRP is the only constant we have; anything else is chaos and cherry-picking.

My comment about paying $309 was just pointing out that I got a discount on the card by buying late in the cycle, since you seemed dead set on comparing discounted prices to new-release prices, or high-demand scalper prices to MSRP. Anyone can do that: just cherry-pick launch vs. mid-cycle vs. late-cycle, crypto bust vs. crypto mania. That's chaos.

As I said, I think MSRP is the only reasonable way to compare pricing. And +$50 on a $299 card is only +16.7%. For that you get +45% at 1080P and +67% at 4K. That's not hard to understand.
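To put numbers on that, a quick back-of-envelope script (a sketch using the figures quoted in this thread, not data pulled from the review):

```python
# Price increase vs. the performance-per-dollar change it buys.
# Figures are the MSRPs and TPU deltas quoted above, nothing new.

old_price, new_price = 299, 349          # 1060 vs. 2060 MSRP, USD
gain_1080p, gain_4k = 1.45, 1.67         # 2060 performance relative to 1060

price_ratio = new_price / old_price      # ~1.167 -> +16.7%
print(f"Price increase:  +{price_ratio - 1:.1%}")
print(f"Perf/$ at 1080p: {gain_1080p / price_ratio - 1:+.1%}")  # ~+24%
print(f"Perf/$ at 4K:    {gain_4k / price_ratio - 1:+.1%}")     # ~+43%
```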

You admitted you own a 2060, and I believe your ownership is clouding your judgement in this case. Even when you take the best Turing SKU value-wise in the best possible light, it's underwhelming at best.

That's a laugh. I posted charts; I've been posting charts. I don't usually say stuff without proof when it comes to numbers, because proof is so easy to find. It's 45% faster than the 1060 at 1080P and 67% faster at 4K. I paid about $30 over what the 1660 Ti was selling for at the time (about 12% more) to get +20% over that card. The likely reason was that everyone said the 2060 wasn't worth it, which drove up 1660 Ti demand while the 2060 wasn't selling. So I got to take advantage of both the bad information in circulation and its results. I don't do these things because of fanboy gibberish like you imply; I do them because the math works out, and I already showed that it did.
 
Joined
Dec 25, 2020
Messages
7,459 (4.97/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) NVIDIA RTX A2000
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
this review is worthless. why would you test a high-end GPU on a budget CPU? you are kneecapping the performance. a lot of the benches here, even those at 4K, are CPU limited. you guys couldn't spring for a 5800X3D? or an Intel bench?

no one who is in the market for this GPU is going to be using a 5800X system.

everyone else used Ryzen 7000 or high-end Intel. but you guys published this junk

Since when is a 5800X a budget CPU? I must have missed something. Its performance comfortably exceeds that of the CPUs installed in most gaming PCs today.

This guy out here literally with the mindset of "Imagine being so poor you could only afford a 5950X and a 3090 :banghead:" , damn, I've never loved being poor as much as I do today :kookoo: :laugh:

in short, wtaf. I don't like AI-generated frames or their equivalents

You're not alone, brother. I have to confess, I have never liked upscalers and I definitely do not like this frame generation thing. I will be skipping the launch models of this gen, probably going to grab something during the inevitable mid-gen refresh.

Other than that, it's a remarkable GPU. I'll be keeping a close eye on AMD's counteroffer, it looks like they have their work cut out for them.
 
Joined
Apr 14, 2022
Messages
786 (0.77/day)
Location
London, UK
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS B550M-Plus WiFi II
Cooling Noctua U12A chromax.black
Memory Corsair Vengeance 32GB 3600MHz
Video Card(s) Palit RTX 4080 GameRock OC
Storage Samsung 970 Evo Plus 1TB + 980 Pro 2TB
Display(s) Acer Nitro XV271UM3B IPS 180Hz
Case Asus Prime AP201
Audio Device(s) Creative Gigaworks - Razer Blackshark V2 Pro
Power Supply Corsair SF750
Mouse Razer Viper
Keyboard Asus ROG Falchion
Software Windows 11 64bit
The 4090 shouldn't be tested at 1080p/1440p at the moment. Even at 4K, the card was held back by the CPU, even with a 5800X3D (Hardware Unboxed).
The CPUs are clearly... 30 years behind the GPUs in technology.
 
Joined
Jun 6, 2022
Messages
622 (0.64/day)
For the haters: post a link to a review where the general impression was disappointment. The only disappointment I found was with WoW!!!!
 
Joined
Jun 11, 2020
Messages
574 (0.34/day)
Location
Florida
Processor 5800x3d
Motherboard MSI Tomahawk x570
Cooling Thermalright
Memory 32 GB 3200MHz E-die
Video Card(s) 3080
Storage 2tb nvme
Display(s) 165hz 1440p
Case Fractal Define R5
Power Supply Toughpower 850 Platinum
Mouse HyperX Hyperfire Pulse
Keyboard EVGA Z15
Since when is a 5800X a budget CPU? I must have missed something. Its performance comfortably exceeds that of the CPUs installed in most gaming PCs today.

This guy out here literally with the mindset of "Imagine being so poor you could only afford a 5950X and a 3090 :banghead:" , damn, I've never loved being poor as much as I do today :kookoo: :laugh:



You're not alone, brother. I have to confess, I have never liked upscalers and I definitely do not like this frame generation thing. I will be skipping the launch models of this gen, probably going to grab something during the inevitable mid-gen refresh.

Other than that, it's a remarkable GPU. I'll be keeping a close eye on AMD's counteroffer, it looks like they have their work cut out for them.

Since the 12900K and 5800X3D came out, it's mid-range now at best. Heck, even in the 5000 series stack it's in the middle.
 
Joined
Nov 18, 2020
Messages
39 (0.03/day)
Location
Arad, Romania
Processor i9-10850K @ 125W Power Limit
Motherboard ASUS TUF Gaming Z590-PLUS
Cooling Noctua NH-D15S
Memory Kingston KF432C16RBK2/64
Video Card(s) ASUS RTX 3070 TUF GAMING O8G @ 950mV / 2010MHz
Storage Samsung 970 EVO Plus 2TB + Kingston KC3000 2TB + Samsung 860 EVO 2TB + Samsung 870 EVO 4TB
Display(s) ASUS PB287Q + DELL S2719DGF
Case FRACTAL Define 7 Dark TG
Audio Device(s) integrated + Microlab FC330 / Audio-Technica ATH-M50s/LE
Power Supply Seasonic PRIME TX-650, 80+ Titanium, 650W
Mouse SteelSeries Rival 600
Keyboard Corsair K70 RGB TKL – CHERRY MX SPEED
Small typo on page 35: you can enable DLAA + FG
 
Joined
Dec 25, 2020
Messages
7,459 (4.97/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) NVIDIA RTX A2000
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
Since the 12900K and 5800X3D came out, it's mid-range now at best. Heck, even in the 5000 series stack it's in the middle.

I dunno, man, I want some of whatever you guys are on, calling it low-mid; it must be the good stuff. You guys are referring to it as if it were some sort of antique lmao

Vermeer remains one of the most competent gaming processors money can buy, as well as one of the most efficient processor architectures out there, and the 5800X remains a very high-end processor for this purpose. I would go a step further: the 5700X is the CPU I'd point most people toward for playing video games, including those looking to build with an Ada GPU.

That particular processor is reliable, runs cool, and has amazing performance, too. I'll just let @Mussels' post on the Zen Garden do the talking:


I don't think you will be losing out on a quality experience by using these with a Ryzen 5000 processor, or even one of the faster 3000 processors like a 3800XT or a 3950X.
 
Joined
Jan 29, 2021
Messages
1,917 (1.31/day)
Location
Alaska USA

+63% vs RTX3090 @ 3x 2nd hand market price in Europe (2K vs 700 Eur)
+88% vs RTX3080 @ 4x 2nd hand market price in Europe (2K vs 500 Eur)

Absolutely idiotic pricing in Europe. No thanks Ngreedia o_O
Just about everything in the EU is priced insanely. Did you ever think it might be the EU and not Nvidia?
 
Joined
Jan 18, 2021
Messages
225 (0.15/day)
Processor Core i7-12700
Motherboard MSI B660 MAG Mortar
Cooling Noctua NH-D15
Memory G.Skill Ripjaws V 64GB (4x16) DDR4-3600 CL16 @ 3466 MT/s
Video Card(s) AMD RX 6800
Storage Too many to list, lol
Display(s) Gigabyte M27Q
Case Fractal Design Define R5
Power Supply Corsair RM750x
Mouse Too many to list, lol
Keyboard Keychron low profile
Software Fedora, Mint
As I said, I think MSRP is the only reasonable way to compare pricing. And +$50 on a $299 card is only +16.7%. For that you get +45% at 1080P and +67% at 4K. That's not hard to understand.
The MSRP of the 1060 (6 GB) was $250; the MSRP for the Founders Edition was $300. This was back when NVIDIA charged a premium for the FE. I know because I bought a Zotac 1060 for $250 at launch.

I don't have a dog in this hunt, particularly, but there was a minor outcry about bad perf/price on Turing when it launched. I remember being deeply underwhelmed myself. Later on, the 16 series appeased people who felt that RT features weren't worth the premium, which I think was a sensible position at the time.

EDIT: Better link - https://www.gamersnexus.net/news-pc/2505-official-nvidia-gtx-1060-specs-price-release-date
 
Joined
Dec 25, 2020
Messages
7,459 (4.97/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) NVIDIA RTX A2000
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
Just about everything in the EU is priced insanely. Did you ever think it might be the EU and not Nvidia?

It's NV alright. Prices are historically high even in America, which gets dirt-cheap hardware anyway. That said, it is not out of line compared to the 3090's launch price, and honestly, anyone with a GA102-based GPU should not be running around with their hair on fire; it's just FOMO, really.

If you spent $800-$1,600+ on a high-end GPU, give it at least 3 years before you replace it. I guarantee anyone on a 3080 or better is going to be comfortable gaming at ultra-high settings for the next year, if not more.
 
Joined
Oct 5, 2022
Messages
12 (0.01/day)
No, latency is absolutely not reduced by DLSS's AI frame insertion. Given that it requires you to wait for the next frame to be rendered before creating the AI-generated frame, it will always carry a latency penalty until they remove that requirement.

This is why Nvidia requires that Reflex be enabled with frame insertion: to hide a portion of the latency penalty. If you disable frame insertion and enable Reflex, you will get lower latency.



I assume there's a sweet spot between 60 FPS and 144 FPS where it makes sense for some people to enable it. Above 144 FPS it doesn't make much sense, as smoothness is already very good and the latency penalty would outweigh the diminishing benefits at that frame rate. Below 60 FPS, the latency penalty would be quite sizable; maybe it would suit strategy games that aren't real-time, but even then it could still be annoying, and those are the games least likely to benefit from additional motion smoothness.

Hopefully improvements come to the tech but until then I feel like it's going to be highly preferential as to whether it's worth enabling.
I don't see how it is possible to generate an intermediary frame between two frames using temporal (time-based) data without requiring that the second frame be rendered before the interpolation. Frame1 - Interframe - Frame2: in order to create the interframe you must already have the rendered Frame1 and Frame2, and by the time you push Frame1 to the screen it is behind by a frame, giving the 1/3 extra latency. Is this incorrect?
 
Joined
Apr 14, 2022
Messages
786 (0.77/day)
Location
London, UK
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS B550M-Plus WiFi II
Cooling Noctua U12A chromax.black
Memory Corsair Vengeance 32GB 3600MHz
Video Card(s) Palit RTX 4080 GameRock OC
Storage Samsung 970 Evo Plus 1TB + 980 Pro 2TB
Display(s) Acer Nitro XV271UM3B IPS 180Hz
Case Asus Prime AP201
Audio Device(s) Creative Gigaworks - Razer Blackshark V2 Pro
Power Supply Corsair SF750
Mouse Razer Viper
Keyboard Asus ROG Falchion
Software Windows 11 64bit
I dunno, man, I want some of whatever you guys are on, calling it low-mid; it must be the good stuff. You guys are referring to it as if it were some sort of antique lmao

Vermeer remains one of the most competent gaming processors money can buy, as well as one of the most efficient processor architectures out there, and the 5800X remains a very high-end processor for this purpose. I would go a step further: the 5700X is the CPU I'd point most people toward for playing video games, including those looking to build with an Ada GPU.

We don't care whether the 5000 series is still capable, or what most people have.
In reviews we care about real numbers.

It's the same as when we test CPUs and run a 4090 at 720p.
We don't care that it never happens in practice. We only care about the numbers and the differences.

Yes, a 5800X was a bad choice for a GPU review, but it's understandable: every reviewer would have to retest every GPU with the latest and greatest CPU, and there isn't always time for that.
W1zzard will do that, obviously, when the 13900K arrives. And this time it would be great if we had GPU usage figures at some point, so we know whether or not there is a CPU bottleneck.
 
Joined
Dec 26, 2013
Messages
200 (0.05/day)
Processor Ryzen 7 5800x3D
Motherboard Gigabyte B550 Gaming X v2
Cooling Thermalright Phantom Spirit 120 SE
Memory Corsair Vengeance LPX 2x32GB 3600MHz C18
Video Card(s) XFX RX 6800 XT Merc 319
Storage Kingston KC2500 2TB NVMe + Crucial MX100 256GB + Samsung 860QVO 1TB + Samsung Spinpoint F3 500GB HDD
Display(s) Samsung CJG5 27" 144 Hz QHD
Case Phanteks Eclipse P360A DRGB Black + 3x Thermalright TL-C12C-S ARGB
Audio Device(s) Logitech X530 5.1 + Logitech G35 7.1 Headset
Power Supply Cougar GEX850 80+ Gold
Mouse Razer Viper 8K
Keyboard Logitech G105
Um.... from the portion of the review conclusion that I've been criticizing (because it makes no sense from the data)?

I'll quote it again, for reference. Emphasis added.



I'm not critiquing the card. I'm critiquing a section of this review's conclusion that makes ZERO sense. The data in this review does not support those statements with respect to "classic" raster performance. The data suggests that this card is right in the middle of the pack compared with the new GPU architectures launched over the past 10 years, and is just fine.

This card's performance is excellent, as expected, without any need for unwarranted hyperbole and unearned effusive praise in comparison to what previous architectures achieved at their respective launches. It reads much more like marketing than a product review.
I think w1z is correct on this one, but I agree it's not obvious, as it's not explained clearly.
The 4090 does indeed make the biggest generational jump since (at least) the 600 series. (I didn't check older generations, but it probably holds all the way back to the 8xxx series; TPU reviews only.)

This 'gap' is between the fastest new-generation card at launch and the fastest old-generation card available at that time.

So it's between the 4090 and the 3090 Ti, and it's 45% at 4K.
It was close to this (43%) between the 3090 (launched 1 week after the 3080, but it still counts I guess) and the 2080 Ti.
39% between the 2080 Ti and the 1080 Ti.
Only 23% between the 1080 and the Titan X (the 1080 Ti released a year later).
 
Joined
Jan 5, 2006
Messages
18,584 (2.67/day)
System Name AlderLake
Processor Intel i7 12700K P-Cores @ 5Ghz
Motherboard Gigabyte Z690 Aorus Master
Cooling Noctua NH-U12A 2 fans + Thermal Grizzly Kryonaut Extreme + 5 case fans
Memory 32GB DDR5 Corsair Dominator Platinum RGB 6000MT/s CL36
Video Card(s) MSI RTX 2070 Super Gaming X Trio
Storage Samsung 980 Pro 1TB + 970 Evo 500GB + 850 Pro 512GB + 860 Evo 1TB x2
Display(s) 23.8" Dell S2417DG 165Hz G-Sync 1440p
Case Be quiet! Silent Base 600 - Window
Audio Device(s) Panasonic SA-PMX94 / Realtek onboard + B&O speaker system / Harman Kardon Go + Play / Logitech G533
Power Supply Seasonic Focus Plus Gold 750W
Mouse Logitech MX Anywhere 2 Laser wireless
Keyboard RAPOO E9270P Black 5GHz wireless
Software Windows 11
Benchmark Scores Cinebench R23 (Single Core) 1936 @ stock Cinebench R23 (Multi Core) 23006 @ stock
Joined
Oct 8, 2014
Messages
123 (0.03/day)
Doesn't HDMI 2.1 already max out at like 4K 240Hz and 8K 144Hz? I mean, who would ever need more than that, mate?
If I'm going to run triple-screen 4K for a sim rig, I'm limited to a max of 97 Hz with DP 1.4a at 4:4:4.

With that number not being 120/144 Hz, this also nullifies FreeSync/G-Sync use at 4K, as most screens only offer the magical 2.4x range required (for LFC) across a min-to-max range of 50 Hz to 120 Hz, or 60 to 144.

This of course wouldn't be a problem if the graphics cards still had a 3 DP, 3 HDMI arrangement (as I could then use 3x HDMI instead). But every 3090 currently maxes out at 2 HDMI ports (and 3 DP 1.4a ports), so I'm stuck with 1.4a for triple screen, with no ability to run triple 4K 4:4:4 with G-Sync/FreeSync on in that triple setup.
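For anyone who wants to check the math, here's a rough DP 1.4a bandwidth estimate, a sketch with an assumed ~10% blanking overhead rather than spec-exact timings:

```python
# Rough DP 1.4a (HBR3) refresh ceiling at 4K 4:4:4.
# Assumptions: 4 lanes x 8.1 Gbps, 8b/10b encoding (80% efficient),
# ~10% blanking overhead (reduced-blanking timings). Ballpark only.

LINK_BPS = 4 * 8.1e9 * 0.8                 # ~25.9 Gbps usable payload

def max_refresh_hz(h, v, bits_per_channel, blanking=1.10):
    bits_per_pixel = 3 * bits_per_channel  # 4:4:4 keeps all three channels full-res
    bits_per_frame = h * v * blanking * bits_per_pixel
    return LINK_BPS / bits_per_frame

print(f"4K  8-bit 4:4:4: ~{max_refresh_hz(3840, 2160, 8):.0f} Hz")   # ~118 Hz
print(f"4K 10-bit 4:4:4: ~{max_refresh_hz(3840, 2160, 10):.0f} Hz")  # ~95 Hz, near the 97 Hz above
```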
 
Joined
Jun 11, 2020
Messages
574 (0.34/day)
Location
Florida
Processor 5800x3d
Motherboard MSI Tomahawk x570
Cooling Thermalright
Memory 32 GB 3200MHz E-die
Video Card(s) 3080
Storage 2tb nvme
Display(s) 165hz 1440p
Case Fractal Define R5
Power Supply Toughpower 850 Platinum
Mouse HyperX Hyperfire Pulse
Keyboard EVGA Z15
I dunno, man, I want some of whatever you guys are on, calling it low-mid; it must be the good stuff. You guys are referring to it as if it were some sort of antique lmao

Vermeer remains one of the most competent gaming processors money can buy, as well as one of the most efficient processor architectures out there, and the 5800X remains a very high-end processor for this purpose. I would go a step further: the 5700X is the CPU I'd point most people toward for playing video games, including those looking to build with an Ada GPU.

That particular processor is reliable, runs cool, and has amazing performance, too. I'll just let @Mussels' post on the Zen Garden do the talking:


I don't think you will be losing out on a quality experience by using these with a Ryzen 5000 processor, or even one of the faster 3000 processors like a 3800XT or a 3950X.

Just because its a mid level CPU doesn't mean it sucks. Its now a very good value. BUT it isn't the pinnacle of PC gaming, that's all.
 
Joined
Mar 10, 2010
Messages
11,880 (2.18/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360 EK extreme rad + 360 EK slim, all push; CPU EK Supremacy, GPU full cover, all EK
Memory Gskill Trident Z 3900cas18 32Gb in four sticks./16Gb/16GB
Video Card(s) Asus tuf RX7900XT /Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung U28E850R 28" 4K FreeSync / Dell shitter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores laptop Timespy 6506
If I'm going to run triple-screen 4K for a sim rig, I'm limited to a max of 97 Hz with DP 1.4a at 4:4:4.

With that number not being 120/144 Hz, this also nullifies FreeSync/G-Sync use at 4K, as most screens only offer the magical 2.4x range required (for LFC) across a min-to-max range of 50 Hz to 120 Hz, or 60 to 144.

This of course wouldn't be a problem if the graphics cards still had a 3 DP, 3 HDMI arrangement (as I could then use 3x HDMI instead). But every 3090 currently maxes out at 2 HDMI ports, so I'm stuck with 1.4a for triple screen, with virtually no ability to run triple 4K 4:4:4 with G-Sync/FreeSync on in that triple setup.
I think the video output dire straits on these cards are on purpose, a lifetime limiter so to speak.
 
Joined
Feb 23, 2011
Messages
467 (0.09/day)
System Name Gaming PC
Processor Intel Core i5 12400F at 5.4GHz
Motherboard ASRock B660 Riptide PG (external clock gen)
Cooling ARCTIC Liquid Freezer II
Memory 32GB Kingston FURY Renegade
Video Card(s) Zotac RTX 4070ti Trinity
Storage WD SN750 SE 1TB M.2 + 2x Kioxia Exceria 480GB
Display(s) BenQ EX2780Q 1440p/144Hz
Case Lian Li 205M
Power Supply Corsair RM850x SHIFT
So we get ~70% more performance on average with ~60% more shaders + 50% more clock speed.

Architecturally it's not scaling very well compared to previous generations.
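As a quick back-of-envelope check of that claim (using the rough figures above, so ballpark only):

```python
# Shader count and clock speed multiply into theoretical throughput;
# compare that against the observed average gain quoted above.

shader_gain = 1.60    # ~60% more shaders
clock_gain  = 1.50    # ~50% higher clocks
observed    = 1.70    # ~70% more measured performance

theoretical = shader_gain * clock_gain    # 2.40x raw throughput
efficiency  = observed / theoretical      # ~0.71

print(f"Theoretical {theoretical:.2f}x vs. observed {observed:.2f}x "
      f"-> ~{efficiency:.0%} of the raw gain realized")
```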
 
Joined
Jul 13, 2016
Messages
3,456 (1.10/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
It is you who is trying to inject "market price" from 3-4 years ago. I am using MSRP. I'll keep on using MSRP. The reasoning for comparing on that basis has been explained ad nauseam, not only here but on countless other sites. If demand is too high, the AIBs and retailers pocket the difference. If it's too low, they take a hit. MSRP is the only constant we have; anything else is chaos and cherry-picking.

My comment about paying $309 was just pointing out that I got a discount on the card by buying late in the cycle, since you seemed dead set on comparing discounted prices to new-release prices, or high-demand scalper prices to MSRP. Anyone can do that: just cherry-pick launch vs. mid-cycle vs. late-cycle, crypto bust vs. crypto mania. That's chaos.

As I said, I think MSRP is the only reasonable way to compare pricing. And +$50 on a $299 card is only +16.7%. For that you get +45% at 1080P and +67% at 4K. That's not hard to understand.



That's a laugh. I posted charts; I've been posting charts. I don't usually say stuff without proof when it comes to numbers, because proof is so easy to find. It's 45% faster than the 1060 at 1080P and 67% faster at 4K. I paid about $30 over what the 1660 Ti was selling for at the time (about 12% more) to get +20% over that card. The likely reason was that everyone said the 2060 wasn't worth it, which drove up 1660 Ti demand while the 2060 wasn't selling. So I got to take advantage of both the bad information in circulation and its results. I don't do these things because of fanboy gibberish like you imply; I do them because the math works out, and I already showed that it did.

Are you contesting the fact that AIB models were more expensive than Founders Edition cards? That's an argument certain to fail. That fact alone shows that using only MSRP while ignoring street price is misguided, to say nothing of the 3000 series' MSRP versus what people actually paid. It's one thing to use MSRP when people could actually purchase at that price, as with the 10-series; that's completely fine. It's not fine to use only MSRP when the vast majority of people are not getting MSRP prices, whether because of scalpers or Nvidia's Founders Edition shenanigans. The goal of any price comparison is to compare prices people actually paid; it makes no sense to ignore the factors that affect that.

You are using MSRP? Your $309 2060 KO comment says otherwise.

No, you seem to be using whatever numbers happen to support your point when you are clearly biased by your 2060 ownership.

MSRP has been plainly misleading in many cases over the last two generations, and the 2000 series was a terrible generation overall. If you can't agree on those two points, you are just denying reality and we have nothing further to discuss. No amount of goalpost moving will change that.
 
Joined
Jun 22, 2006
Messages
1,098 (0.16/day)
System Name Beaver's Build
Processor AMD Ryzen 9800X3D
Motherboard Asus TUF Gaming X670E Plus WiFi
Cooling Corsair H115i RGB PLATINUM 97 CFM Liquid
Memory G.SKILL Trident Z5 Neo DDR5-6000 CL30 RAM 32GB (2x16GB)
Video Card(s) NVIDIA GeForce RTX 4090 Founders Edition
Storage WD_BLACK 8TB SN850X NVMe
Display(s) Alienware AW3225QF 32" 4K 240 Hz OLED
Case Fractal Design Design Define R6 USB-C
Audio Device(s) Focusrite 2i4 USB Audio Interface
Power Supply SuperFlower LEADEX TITANIUM 1600W
Mouse Razer DeathAdder V2
Keyboard Corsair K70 RGB Pro
Software Microsoft Windows 11 Pro
Benchmark Scores 3dmark = https://www.3dmark.com/spy/51229598
The biggest issue of this generation is that the other cards, despite costing an arm and a leg, are castrated too much, not to mention that NVIDIA had no qualms calling an RTX 4070 an RTX 4080.

It even looks like there will be no RTX 4060/4050 this gen, and Nvidia will continue selling previous-generation stock.
Won’t they rebrand the 3050 through 3060 to new generation parts?
 
Joined
Jul 13, 2016
Messages
3,456 (1.10/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
I don't see how it is possible to generate an intermediary frame between two frames using temporal (time-based) data without requiring that the second frame be rendered before the interpolation. Frame1 - Interframe - Frame2: in order to create the interframe you must already have the rendered Frame1 and Frame2, and by the time you push Frame1 to the screen it is behind by a frame, giving the 1/3 extra latency. Is this incorrect?

DLSS 3.0 frame insertion does indeed wait for frame 2 in your example before generating the intermediary frame.
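A simplified timing model of that constraint (my own toy sketch of the pacing, not NVIDIA's documented pipeline) shows where the extra latency comes from:

```python
# Toy model of 2x frame interpolation pacing: each rendered frame is
# held until its successor exists, so its presentation slips by roughly
# one native frame time. A sketch of the constraint discussed above only.

def present_schedule(native_fps: float, n_frames: int = 3):
    ft = 1000.0 / native_fps                 # native frame time, ms
    for i in range(1, n_frames + 1):
        rendered = i * ft                    # Frame i finishes rendering
        shown = (i + 1) * ft                 # shown only once Frame i+1 is done
        interframe = shown + ft / 2          # interpolated frame slots in after it
        print(f"Frame{i}: rendered {rendered:5.1f} ms, shown {shown:5.1f} ms "
              f"(+{shown - rendered:.1f} ms), interframe at {interframe:5.1f} ms")

present_schedule(60)  # at 60 FPS native, each real frame arrives ~16.7 ms late
```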
 
Joined
Jun 21, 2021
Messages
3,127 (2.36/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG UltraFine 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
VR HMD Oculus Rift S (hosted on a different PC)
Software macOS Sonoma 14.7
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)
Won’t they rebrand the 3050 through 3060 to new generation parts?

One would presume so, although no one here has a timetable. Clearly NVIDIA is trying to draw down the channel's inventory of assembled graphics cards as well as Ampere GPU chips.

Unfortunately for NVIDIA, crypto-mining demand has completely evaporated due to the crypto market crash and Ethereum's PoS merge, so they don't have the option of selling off excess Ampere GPU chips in mining cards.

At some point, they will sell out of a particular Ampere GPU and then it'll be time for NVIDIA to decide whether it's worth transitioning the low end products to the Ada Lovelace generation. Or they can tell Samsung to fire up their foundry and churn out another batch of those Amperes.

From a marketing standpoint, I'm sure they would rather have their entire product line on the latest generation rather than straddle both Lovelace and Ampere.

But remember that two years ago when Ampere launched, they marketed the 3090 and 3080 while still selling 2060s. The 3060 and 3050 didn't come until later.
 
Joined
Oct 18, 2019
Messages
428 (0.22/day)
Location
NYC, NY
As we can see from the MSI SUPRIM LIQUID X (and even the Kingpin 3090 Ti), there's no good reason the cards have to be that size.
 
Joined
Dec 31, 2020
Messages
1,110 (0.74/day)
Processor E5-4627 v4
Motherboard VEINEDA X99
Memory 32 GB
Video Card(s) 2080 Ti
Storage NE-512
Display(s) G27Q
Case DAOTECH X9
Power Supply SF450
So we get ~70% more performance on average with ~60% more shaders + 50% more clock speed.

Architecturally it's not scaling very well compared to previous generations.

The 4090 is double the performance gen over gen (meaning vs. the non-Ti 3090) in games that are not CPU limited, like Resident Evil at 4K. And with frame insertion it probably goes up to 3 times.
How is it supposed to scale when it uses the same memory type? It needs GDDR7 very badly, but no such luck. This behemoth is doomed to a kind of failure like Fermi. GDDR6X is broken; it does very little. The 3070 Ti is the same as the 3070, with no actual benefit from transferring data at 4 voltage levels, just a ginormous power loss.
 
Joined
Jun 21, 2021
Messages
3,127 (2.36/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG UltraFine 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
VR HMD Oculus Rift S (hosted on a different PC)
Software macOS Sonoma 14.7
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)
As we can see from the MSI SUPRIM LIQUID X (and even the Kingpin 3090 Ti), there's no good reason the cards have to be that size.

Water has a higher cooling capacity than air.

However, you're also moving the heat from one place (the graphics card PCB) to another (the radiator), which sits in a location better suited to dissipating it.

You can't just add up cubic centimeters of cooling solutions and say "the stock cooler doesn't need to be that thick."
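For a sense of scale, compare volumetric heat capacities (textbook values, rough numbers):

```python
# Water vs. air: heat stored per cubic meter per kelvin at room temperature.
# This is why a modest water loop can move heat that would need a much
# larger air cooler. Values are standard textbook figures.

water = 4186 * 998     # specific heat J/(kg*K) x density kg/m^3
air   = 1005 * 1.204

print(f"Water holds ~{water / air:,.0f}x more heat per m^3 per kelvin than air")
# ~3,452x
```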
 