
NVIDIA Delivers Quantum Leap in Performance, Introduces New Era of Neural Rendering With GeForce RTX 40 Series

btarunr

Editor & Senior Moderator
Staff member
NVIDIA today unveiled the GeForce RTX 40 Series of GPUs, designed to deliver revolutionary performance for gamers and creators, led by its new flagship, the RTX 4090 GPU, with up to 4x the performance of its predecessor. The world's first GPUs based on the new NVIDIA Ada Lovelace architecture, the RTX 40 Series delivers massive generational leaps in performance and efficiency, and represents a new era of real-time ray tracing and neural rendering, which uses AI to generate pixels.

"The age of RTX ray tracing and neural rendering is in full steam, and our new Ada Lovelace architecture takes it to the next level," said Jensen Huang, NVIDIA's founder and CEO, at the GeForce Beyond: Special Broadcast at GTC. "Ada provides a quantum leap for gamers and paves the way for creators of fully simulated worlds. With up to 4x the performance of the previous generation, Ada is setting a new standard for the industry," he said.



DLSS 3 Generates Entire Frames for Faster Game Play
Huang also announced NVIDIA DLSS 3, the next revolution in the company's Deep Learning Super Sampling neural-graphics technology for games and creative apps. The AI-powered technology can generate entire frames for massively faster game play, and can overcome CPU performance limitations by letting the GPU produce frames without waiting on the CPU.
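As a rough illustration of why frame generation can sidestep a CPU bottleneck, here is a toy model (my own sketch, not NVIDIA's implementation): if every rendered frame is followed by one AI-generated frame, the displayed frame rate doubles while the CPU still only prepares half the frames.

```python
# Toy model of DLSS 3-style frame generation (an illustrative sketch,
# not NVIDIA's implementation): each rendered frame is followed by N
# AI-generated frames, so the display rate can exceed the rate at
# which the CPU-bound simulation produces frames.

def effective_fps(cpu_limited_fps: float, generated_per_rendered: int = 1) -> float:
    """Displayed frame rate when each rendered frame is followed by
    `generated_per_rendered` generated frames."""
    return cpu_limited_fps * (1 + generated_per_rendered)

# A game CPU-limited to 60 fps could display ~120 fps with one
# generated frame per rendered frame.
print(effective_fps(60))  # 120.0
```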

The technology is coming to the world's most popular game engines, such as Unity and Unreal Engine, and has received support from many of the world's leading game developers, with more than 35 games and apps coming soon.

Additionally, the RTX 40 Series GPUs feature a range of new technological innovations, including:
  • Streaming multiprocessors with up to 83 teraflops of shader power—2x over the previous generation.
  • Third-generation RT Cores with up to 191 effective ray tracing teraflops—2.8x over the previous generation.
  • Fourth-generation Tensor Cores with up to 1.32 Tensor petaflops—5x over the previous generation using FP8 acceleration.
  • Shader Execution Reordering (SER) that improves execution efficiency by rescheduling shading workloads on the fly to better utilize the GPU's resources. As significant an innovation as out-of-order execution was for CPUs, SER improves ray tracing performance up to 3x and in-game frame rates by up to 25%.
  • Ada Optical Flow Accelerator with 2x faster performance allows DLSS 3 to predict movement in a scene, enabling the neural network to boost frame rates while maintaining image quality.
  • Architectural improvements tightly coupled with custom TSMC 4N process technology result in an up to 2x leap in power efficiency.
  • Dual NVIDIA Encoders (NVENC) cut export times by up to half and feature AV1 support. The NVENC AV1 encode is being adopted by OBS, Blackmagic Design DaVinci Resolve, Discord and more.
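The 83-teraflop shader figure in the first bullet is consistent with simple arithmetic over the RTX 4090's specs. A quick sanity check, with the caveat that the boost clock below is an assumption (roughly 2.52 GHz, per later spec listings) since the article does not state it:

```python
# Sanity check of the "up to 83 teraflops of shader power" figure
# using the RTX 4090 specs mentioned in the article.
cuda_cores = 16_384
boost_clock_hz = 2.52e9          # assumed boost clock, not from the article
flops_per_core_per_clock = 2     # one fused multiply-add = 2 FP32 ops

shader_tflops = cuda_cores * flops_per_core_per_clock * boost_clock_hz / 1e12
print(f"{shader_tflops:.1f} TFLOPS")  # ~82.6, in line with the quoted 83
```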
New Ray Tracing Tech for Even More Immersive Games
For decades, rendering ray-traced scenes with physically correct lighting in real time has been considered the holy grail of graphics. At the same time, geometric complexity of environments and objects has continued to increase as 3D games and graphics strive to provide the most accurate representations of the real world.

Achieving physically accurate graphics requires tremendous computational horsepower. Modern ray-traced games like Cyberpunk 2077 run over 600 ray tracing calculations for each pixel just to determine lighting—a 16x increase from the first ray-traced games introduced four years ago.
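To get a feel for the scale described above, a quick back-of-the-envelope calculation (the 60 fps target is my assumption, not a figure from the article):

```python
# Rough scale of the workload described above: ~600 ray tracing
# calculations per pixel at 4K resolution, at an assumed 60 fps target.
width, height = 3840, 2160
calcs_per_pixel = 600
target_fps = 60

calcs_per_frame = width * height * calcs_per_pixel
calcs_per_second = calcs_per_frame * target_fps
print(f"{calcs_per_second:.2e} ray tracing calculations per second")  # ~2.99e+11
```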

The new third-generation RT Cores have been enhanced to deliver 2x faster ray-triangle intersection testing and include two important new hardware units. An Opacity Micromap Engine speeds up ray tracing of alpha-test geometry by a factor of 2x, and a Micro-Mesh Engine generates micro-meshes on the fly to generate additional geometry. The Micro-Mesh Engine provides the benefits of increased geometric complexity without the traditional performance and storage costs of complex geometries.

Creativity Redefined With RTX Remix, New AV1 Encoders
The RTX 40 Series GPUs and DLSS 3 deliver advancements for NVIDIA Studio creators. 3D artists can render fully ray-traced environments with accurate physics and realistic materials, and view the changes in real time, without proxies. Video editing and live streaming also get a boost from improved GPU performance and the inclusion of new dual, eighth-generation AV1 encoders. The NVIDIA Broadcast software development kit has three updates, now available for partners, including Face Expression Estimation, Eye Contact and quality improvements to Virtual Background.

NVIDIA Omniverse—included in the NVIDIA Studio suite of software—will soon add NVIDIA RTX Remix, a modding platform to create stunning RTX remasters of classic games. RTX Remix allows modders to easily capture game assets, automatically enhance materials with powerful AI tools, and quickly enable RTX with ray tracing and DLSS.

Portal Is RTX ON!
RTX Remix has been used by NVIDIA Lightspeed Studios to reimagine Valve's iconic video game Portal, regarded as one of the best video games of all time. Advanced graphics features such as full ray tracing and DLSS 3 give the game a striking new look and feel. Portal with RTX will be released as free, official downloadable content for the classic platformer with RTX graphics in November, just in time for Portal's 15th anniversary.

The GeForce RTX 4090 and 4080: The New Ultimate GPUs
The RTX 4090 is the world's fastest gaming GPU with astonishing power, acoustics and temperature characteristics. In fully ray-traced games, the RTX 4090 with DLSS 3 is up to 4x faster compared to last generation's RTX 3090 Ti with DLSS 2. It is also up to 2x faster in today's games while maintaining the same 450 W power consumption. It features 76 billion transistors, 16,384 CUDA cores and 24 GB of high-speed Micron GDDR6X memory, and consistently delivers over 100 frames per second at 4K-resolution gaming. The RTX 4090 will be available on Wednesday, Oct. 12, starting at $1,599.

The company also announced the RTX 4080, launching in two configurations. The RTX 4080 16 GB has 9,728 CUDA cores and 16 GB of high-speed Micron GDDR6X memory, and with DLSS 3 is 2x as fast in today's games as the GeForce RTX 3080 Ti and more powerful than the GeForce RTX 3090 Ti at lower power. The RTX 4080 12 GB has 7,680 CUDA cores and 12 GB of Micron GDDR6X memory, and with DLSS 3 is faster than the RTX 3090 Ti, the previous-generation flagship GPU.

Both RTX 4080 configurations will be available in November, with prices starting at $1,199 and $899, respectively.

The GeForce RTX 4090 and 4080 GPUs will be available as custom boards, including stock-clocked and factory-overclocked models, from top add-in card providers such as ASUS, Colorful, Gainward, Galaxy, GIGABYTE, Innovision 3D, MSI, Palit, PNY and Zotac. The RTX 4090 and RTX 4080 (16 GB) are also produced directly by NVIDIA in limited Founders Editions for fans wanting the NVIDIA in-house design. Look for the GeForce RTX 40 Series GPUs in gaming systems built by Acer, Alienware, ASUS, Dell, HP, Lenovo and MSI, leading system builders worldwide, and many more.

Time for 4090... finally.
 
The two 4080s are what get me. I'm not a big fan of releasing two memory variants that also have different core counts, because it comes off as just a memory difference. But I will be interested to see these in the wild and some comparisons!
 
The two 4080s are what get me. I'm not a big fan of releasing two memory variants that also have different core counts, because it comes off as just a memory difference. But I will be interested to see these in the wild and some comparisons!

A wiser approach is to ignore the 4080 model number designations and just assess the cards by actual performance.

In addition to the VRAM size and core count differences, they also have different memory bus sizes, clock speeds, and power requirements.

It is noteworthy that all three cards announced today are based on different GPUs: AD102-300, AD103-300, and AD104-400.

In the same way, many people erroneously compared the 3080 Ti to the 3080 because of the model numbers. The 3080 Ti actually shared the same GPU as the 3090 so the better comparison would have been with the 3090 (basically the 3080 Ti was a binned 3090 with half the VRAM).
 
Artem S. Tashkinov
Key points:
  • DLSS 3.0 is fantastic, though proprietary.
  • Pricing is just bad.
  • Two 4080 SKUs with different shader counts? Looks like NVIDIA decided to charge top dollar for what should have been the RTX 4070 Ti. Let's see what RDNA 3.0 will bring, because this is just ugly.
  • I expect RDNA 3.0 to reach the RTRT performance of the RTX 30 series, which again means NVIDIA will keep the performance crown for heavy RT games for the next two years.
  • Looks like we've reached the point where the laws of physics no longer allow more performance within the same power envelope, which is really sad.
 
Wish they would have announced a new Shield TV. The current one could really use a CPU and GPU upgrade: hardware decoding of newer formats, DLSS-based upscaling for better 4K content, and potentially DLSS 3-style frame interpolation to 60-120 fps that is actually good and doesn't add much latency. The current one is also too slow to run emulators, so that would be a nice upgrade too. Guess we will have to wait until probably next year for that.


Is it confirmed that the RTX 4080 16 GB uses the AD103 core with 9,728 cores, while the RTX 4080 12 GB uses the AD104 core with 7,680 cores? I didn't see core counts listed in the presentation, but he went pretty fast through that part (probably to try and pull a fast one on as many people as he could).
So effectively the 4080 12 GB is a renamed 4070 Ti, and NVIDIA once again shifted the product stack costs up a tier?
 
Saying how much more powerful they are when using DLSS 3 compared to the last-gen cards... I bet those last-gen cards aren't even using DLSS 2.3 or anything; it will be native rendering on them vs. DLSS 3 on the new cards to make the jump seem bigger.
 
"a quantum leap" but they can just throw out anything without consequences cant they?
 
A wiser approach is to ignore the 4080 model number designations and just assess the cards by actual performance.

In addition to the VRAM size and core count differences, they also have different memory bus sizes, clock speeds, and power requirements.

It is noteworthy that all three cards announced today are based on different GPUs: AD102-300, AD103-300, and AD104-400.

In the same way, many people erroneously compared the 3080 Ti to the 3080 because of the model numbers. The 3080 Ti actually shared the same GPU as the 3090 so the better comparison would have been with the 3090 (basically the 3080 Ti was a binned 3090 with half the VRAM).
The 3080 was also cut from the same die as the 90-class cards.

While ignoring the model number is okay for some, at the store the only difference a regular person will see is the VRAM. Which isn't cool if that's not the only difference.
 
Well, the price is going to be a difference.

Discrete graphics cards have become an increasingly niche product, especially in the upper tier. For sure, NVIDIA's model number choices may confuse a handful of people, but not those who do their homework.

Joe Consumer in the USA is going to buy whatever's cheaper anyhow.
 
so nvidia holding back the RTX4070 to counter amd price to perf gpu
or if you guys think the rtx4080 12gb is the new 4070...
massive gap between the 4080 16gb and 4090 so i guess its for a RTX4080ti to counter amd
later on we will see RTX4090ti?
 
This will be the biggest difference between the flagship and the next fastest GPU in terms of SMX count that I can remember. The previous generations were like this:

Generation | Flagship SMX count | 2nd-tier SMX count | Ratio | Comments
Kepler     | 15  | 12 | 1.25 | GTX 780 Ti vs GTX 780
Maxwell    | 24  | 22 | 1.09 | Titan X vs GTX 980 Ti
Pascal     | 30  | 28 | 1.07 | Titan Xp vs GTX 1080 Ti
Turing     | 72  | 68 | 1.06 | RTX Titan vs RTX 2080 Ti
Ampere     | 84  | 68 | 1.24 | RTX 3090 Ti vs RTX 3080 10 GB
Ada        | 128 | 76 | 1.68 | RTX 4090 vs RTX 4080 16 GB

One can easily see how the 4080 16 GB stands out as the runt and poor value.

Even if we go by die sizes for the actual 2nd tier full die, this generation is an outlier.

Generation | Flagship SMX count | Flagship price | 2nd-tier SMX count | 2nd-tier price | SMX ratio | Price ratio | Comments
Kepler     | 15  | $699  | 8  | $330  | 1.88 | 2.12 | GTX 780 Ti vs GTX 680
Maxwell    | 24  | $649  | 16 | $499  | 1.50 | 1.30 | GTX 980 Ti vs GTX 980
Pascal     | 30  | $699  | 20 | $499  | 1.50 | 1.40 | GTX 1080 Ti vs GTX 1080
Turing     | 72  | $1200 | 48 | $699  | 1.50 | 1.72 | RTX 2080 Ti vs RTX 2080 Super
Ampere     | 84  | $1999 | 48 | $599  | 1.75 | 3.34 | RTX 3090 Ti vs RTX 3070 Ti
Ada        | 128 | $1599 | 76 | $1199 | 1.68 | 1.33 | RTX 4090 vs RTX 4080 16 GB
Now it looks better for the 4080 16 GB, until you consider the price, which is outside the historical norm for lower-tier GPUs. Only the GTX 980 was priced this close to the flagship, and that was an atypical generation in many ways.
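The flagship-to-second-tier ratios in the tables above can be recomputed directly from the posted SMX figures; a quick sketch:

```python
# Recomputing the flagship vs. 2nd-tier SMX ratios from the figures
# in the tables above (flagship SMX count, 2nd-tier SMX count).
generations = {
    "Kepler":  (15, 12),
    "Maxwell": (24, 22),
    "Pascal":  (30, 28),
    "Turing":  (72, 68),
    "Ampere":  (84, 68),
    "Ada":     (128, 76),
}
ratios = {name: round(flagship / second, 2)
          for name, (flagship, second) in generations.items()}
print(ratios)
# Ada's 1.68 towers over the 1.06-1.25 range of earlier generations.
```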
 
3080 was also cut from the same die as 90s.

While ignoring the model number is okay for some, at the store to the regular person only difference will be VRAM. Which isn't cool if that's not the only difference.

Agree; I was just looking at the specs at videocardz and that is very deceptive. They even use different dies, it seems. This is more like the difference I'd expect between a 4080 and a 4080 Ti, or perhaps even a 4080 and a 4090. Overall the 16 GB has 20-25% more SMs, CUDA cores, and memory bandwidth. That's normally an entire tier of performance.

 
so nvidia holding back the RTX4070 to counter amd price to perf gpu
or if you guys think the rtx4080 12gb is the new 4070...
massive gap between the 4080 16gb and 4090 so i guess its for a RTX4080ti to counter amd
later on we will see RTX4090ti?

I don't think it's wise to simply rely on NVIDIA marketing department's model numbers. They have a habit of changing their model number implementations from generation to generation. Hell, even the --90 cards are really Titans in sheep's clothing these days.

My guess is that we will see a 4090 Ti someday in the future: the full-fat AD102 GPU from binned silicon. Why not? NVIDIA can set aside some better samples and charge more money for them. The cost to NVIDIA is the same; they all come off the same wafers.
 
Any idea when the NDA lifts on reviews?
 
Agree; I was just looking at the specs at videocardz and that is very deceptive. They even use different dies, it seems. This is more like the difference I'd expect between a 4080 and a 4080 Ti, or perhaps even a 4080 and a 4090. Overall the 16 GB has 20-25% more SMs, CUDA cores, and memory bandwidth. That's normally an entire tier of performance.

Uhhh there's a 192bit 4080? Wtf
 
Any idea when the NDA lifts on reviews?

My guess is that W1zzard knows. Maybe some other TPU staffers as well.

Part of the NDA might be to not talk about the NDA until it is lifted. Any date at this point would just be speculation unless you've actually read the NDA itself.
 
Any idea when the NDA lifts on reviews?
Safest bet will be October 12th, the same day the card goes on sale. That way everyone has to run and buy one first, knowing they will run out of stock, and then sit down and read the review after they have already purchased it.
 
I don't think it's wise to simply rely on NVIDIA marketing department's model numbers. They have a habit of changing their model number implementations from generation to generation. Hell, even the --90 cards are really Titans in sheep's clothing these days.

My guess is that we will see a 4090 Ti someday in the future: the full-fat AD102 GPU from binned silicon. Why not? NVIDIA can set aside some better samples and charge more money for them. The cost to NVIDIA is the same; they all come off the same wafers.
About Titan or 4090 Ti branding... it depends on the competitiveness of AMD. If the performance difference is significant enough, it'll be a Titan at >$2k; otherwise a 4090 Ti at <$2k.
 
"Computing is getting more expensive at incredible speeds!" - Jensen Huang, probably.

RTX 3080 was $700.

RTX 4080 12GB is $900, 16GB is $1200.
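For what it's worth, the percentage jumps implied by those MSRPs work out as follows (simple arithmetic on the prices quoted above):

```python
# Generation-over-generation MSRP increases, using the prices quoted
# above (comparing the new 4080 SKUs against the RTX 3080's $700 MSRP).
rtx_3080_msrp = 700
new_cards = {"RTX 4080 12GB": 900, "RTX 4080 16GB": 1200}

for name, price in new_cards.items():
    increase_pct = (price - rtx_3080_msrp) / rtx_3080_msrp * 100
    print(f"{name}: +{increase_pct:.0f}% vs the RTX 3080's $700")
# +29% for the 12GB model, +71% for the 16GB model.
```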

I just hate it when I'm right. I kind of feared such a price increase, and I think this is just the beginning: everything coming out this fall will have similarly perverse price increases. New CPUs, motherboards, even new PSUs are reported to be much more expensive.

On the other hand, even though inflation is more than 10% here and the cost of living is skyrocketing, salaries mostly remain the same, because companies are reportedly struggling and higher salaries would be a breaking point.
 
The debacle with EVGA and the reporting from JPR show full well that Nvidia has been abusing its position in the market to raise prices for greater margins while squeezing AIBs. I wish AMD weren't willing to go along with it, and/or that Intel would come out swinging on price. As it stands, I can't imagine buying another video card in this market; it's apparent that it is up to the customers to stop it.
 
I wonder if EVGA exited because it didn't see much demand at these MSRP prices, given the flood of used cards hitting the market and far less crypto demand.
 
I wonder if EVGA exited because it didn't see much demand at these MSRP prices, given the flood of used cards hitting the market and far less crypto demand.

Not a likely factor. One of EVGA's clearly stated gripes was that NVIDIA would not reveal MSRPs to its AIB partners until the very last moment, so they were basically flying blind on gross-margin forecasting.

EVGA has two decades of 20/20 hindsight on how gross margins ended up, and apparently it did not like how they were trending.
 
That is clearly a supposed-to-be 4070 rebranded as the 4080 12 GB to sell for double the bucks.
 
Those prices... Worse than I feared, that's the best I can say without swearing.
 