
NVIDIA Provides a Statement on MIA RTX 3090 Ti GPUs

Joined
Dec 23, 2012
Messages
1,715 (0.39/day)
Location
Somewhere Over There!
System Name Gen2
Processor Ryzen R9 5950X
Motherboard Asus ROG Crosshair VIII Hero WiFi
Cooling Lian Li 360 Galahad
Memory G.Skill Trident Z RGB 64 GB @ 3600 MHz CL14-13-13-24 1T @ 1.45V
Video Card(s) Sapphire RX 6900 XT Nitro+
Storage Seagate 520 1TB + Samsung 970 Evo Plus 1TB + lots of HDDs
Display(s) Samsung Odyssey G7
Case Lian Li PC-O11D XL White
Audio Device(s) Onboard
Power Supply Super Flower Leadex SE Platinum 1000W
Mouse Xenics Titan GX Air Wireless
Keyboard Kemove Snowfox 61
Software Main: Gentoo+Arch + Windows 11
Benchmark Scores Have tried but can't beat the leaders :)
I think they found out there's no performance uplift over the 3090, ha ha, joke.

But I believe they're waiting until the 6950 is released...
 
Joined
Sep 17, 2014
Messages
22,431 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 6000 CL30
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I still say it's because no Radeon 6950XT showed up. There's no reason for it in this market, especially if they're having issues producing one, which it seems they are.

This has never stopped Nvidia before

They probably realized that releasing another video card that costs $3k is stupid, especially when the 40 series and AMD's next cards are due out later this year.

This has never stopped Nvidia before

The only thing stopping Nvidia or its cards... is HEAT, boys. Heat.
They built a boost algorithm that works better than the hardware it's used on. Every time we see Space Invaders or EVGA producing yet another shite cooler, it's a heat problem.

Or, put differently for this current gen: Ampere is shit on Samsung 8nm, as it was initially scaled for TSMC.
Nobody in their right mind willingly ups the TDP at the top end of their stack by nearly a third in one gen - and that's just counting the non-Ti. Every lower configuration can 'make it' fine, albeit also with inflated TDPs. But the top end... that's new territory. We didn't need Raja after all to get a solid handle on >300 W cards in the consumer space, did we...

It's a trend with all newer components now: to get more performance we get more heat, and the margin for error is thin, but the hardware is stopped from frying itself by smart algorithms.
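That "smart algorithm" is, at its core, a feedback loop. Here is a minimal sketch in Python of the idea - the clock bins, limits, and step behaviour are invented for illustration and are not NVIDIA's actual GPU Boost logic:

```python
# Toy model of a GPU boost governor: climb a clock bin when there is
# power/thermal headroom, drop a bin when a limit is exceeded.
# All numbers here are invented for illustration.

BOOST_BINS_MHZ = [1395, 1470, 1545, 1620, 1695, 1770, 1845, 1920]
POWER_LIMIT_W = 350
TEMP_LIMIT_C = 83

def next_bin(current: int, power_w: float, temp_c: float) -> int:
    """Pick the boost bin for the next tick from current sensor readings."""
    if power_w > POWER_LIMIT_W or temp_c > TEMP_LIMIT_C:
        return max(current - 1, 0)                        # throttle: drop one bin
    return min(current + 1, len(BOOST_BINS_MHZ) - 1)      # headroom: climb one bin

# A card pinned at its limits ends up oscillating around a lower bin
# instead of frying itself at the top one.
bin_idx = len(BOOST_BINS_MHZ) - 1
for power_w, temp_c in [(360, 84), (340, 82), (355, 83.5), (330, 80)]:
    bin_idx = next_bin(bin_idx, power_w, temp_c)
    print(BOOST_BINS_MHZ[bin_idx], "MHz")
```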
 
Joined
Feb 18, 2021
Messages
83 (0.06/day)
Processor Ryzen 7950X3D
Motherboard Asus ROG Crosshair X670E Hero
Cooling Corsair iCUE H150i ELITE LCD
Memory 64GB (2x 32GB) Corsair Dominator Platinum RGB DDR5 6000MHz CL30
Video Card(s) Zotac GeForce RTX 4090 AMP Extreme AIRO 24GB
Storage WD SN850X 4TB NVMe / Samsung 870 QVO 8TB
Display(s) Asus PG43UQ / Samsung 32" UJ590
Case Phanteks Evolv X
Power Supply Corsair AX1600i
Mouse Logitech MX Master 3
Keyboard Corsair K95 RGB Platinum
Software Windows 11 Pro 24H2
Coming from a Strix 3090 OC owner: the Ti variant of the 3090 is absolutely stupid. Just look at the spec difference; it's tiny, to the point where I doubt you'll see more than a few frames of difference, if any.

Just look at the spec difference between the 3080 and 3090: on paper it's pretty massive, but in reality it's what, 10 to 15 fps faster in the best cases? For a 3090 Ti you would be paying way more than for a 3090, and you'd get a card with a much larger power draw (so most likely hotter) for what, 0 to 5 fps if you're very lucky.

It would be a DOA product at launch and ridiculed by reviewers; it would be the worst Ti card ever released, and Nvidia knows it.
 
Joined
Dec 25, 2020
Messages
6,698 (4.69/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
They built a boost algorithm that works better than the hardware it's used on. Every time we see Space Invaders or EVGA producing yet another shite cooler, it's a heat problem.

Sorry, call me old school, but give me a stable static clock over whatever rubbish this GPU Boost system is, any day, every day... I actually set a custom curve that has my card target a ~1100 MHz base so it flatlines at the maximum possible overboost bin all the time (which turns out to be the 1755 MHz target I want it to run at). GPU Boost is especially awful on cards with a lower power limit; you will pretty much never see it behave correctly - this thing loves to throttle. Trust me when I tell you, it is a better experience to have stable, flatlined clocks at a very modest vcore and absolutely no thermal or power throttling whatsoever than to just let this boost-algorithm rubbish run wild.
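For anyone who wants to try flat clocks without Afterburner: NVML exposes clock locking on Volta and newer GPUs. A minimal sketch using the pynvml bindings, assuming admin/root rights and reusing the 1755 MHz target from the post above; note that NVML has no voltage-curve control, so the undervolt itself still has to come from another tool:

```python
# Pin the GPU to a flat clock with NVML (pynvml / nvidia-ml-py).
# Requires administrator/root privileges and a Volta-or-newer GPU.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

# Lock min and max graphics clock to the same value -> flatlined clocks.
pynvml.nvmlDeviceSetGpuLockedClocks(gpu, 1755, 1755)

# ... run your workload ...

# Restore default boost behaviour when done.
pynvml.nvmlDeviceResetGpuLockedClocks(gpu)
pynvml.nvmlShutdown()
```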
 
Joined
Jan 14, 2019
Messages
12,337 (5.77/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Sorry, call me old school, but give me a stable static clock over whatever rubbish this GPU Boost system is, any day, every day... I actually set a custom curve that has my card target a ~1100 MHz base so it flatlines at the maximum possible overboost bin all the time (which turns out to be the 1755 MHz target I want it to run at). GPU Boost is especially awful on cards with a lower power limit; you will pretty much never see it behave correctly - this thing loves to throttle. Trust me when I tell you, it is a better experience to have stable, flatlined clocks at a very modest vcore and absolutely no thermal or power throttling whatsoever than to just let this boost-algorithm rubbish run wild.
I sort of agree, but sort of don't. I prefer seeing flat clocks myself, though I have to admit that not every scene in a game is the same, so the card may need less or more power to render them. Targeting a TDP level instead of a clock makes sense for cooling, and is better for your PSU as well.
 
Joined
Nov 11, 2016
Messages
3,403 (1.16/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.Skill 6400 MT/s CL32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razer DeathAdder V3
Keyboard Razer Huntsman V3 Pro TKL
Software win11
I sort of agree, but sort of don't. I prefer seeing flat clocks myself, though I have to admit that not every scene in a game is the same, so the card may need less or more power to render them. Targeting a TDP level instead of a clock makes sense for cooling, and is better for your PSU as well.

I use three undervolt profiles:
1725 MHz / 750 mV (~280 W)
1830 MHz / 800 mV (~310 W)
1920 MHz / 850 mV (~350 W)
I just toggle between them in-game to find the best FPS/efficiency balance; 90% of the time I use 1830 MHz / 800 mV though.
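That toggling can be roughly scripted through NVML as well. A sketch using pynvml that approximates each profile with a clock cap plus the quoted board power as a power limit - the mapping is an assumption, since NVML cannot apply the mV values directly and the undervolt itself still has to come from a tool like MSI Afterburner:

```python
# Approximate the three profiles above with NVML clock caps + power limits.
# NVML has no voltage-curve control, so the mV values cannot be set here.
import pynvml

PROFILES = {
    "quiet":    {"max_mhz": 1725, "power_w": 280},   # ~1725 MHz / 750 mV
    "balanced": {"max_mhz": 1830, "power_w": 310},   # ~1830 MHz / 800 mV
    "fast":     {"max_mhz": 1920, "power_w": 350},   # ~1920 MHz / 850 mV
}

def apply_profile(name: str) -> None:
    """Cap the boost ceiling and board power for the chosen profile."""
    pynvml.nvmlInit()
    gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
    p = PROFILES[name]
    # Wide min, capped max: the card can still clock down under light load.
    pynvml.nvmlDeviceSetGpuLockedClocks(gpu, 210, p["max_mhz"])
    # NVML takes the power limit in milliwatts.
    pynvml.nvmlDeviceSetPowerManagementLimit(gpu, p["power_w"] * 1000)
    pynvml.nvmlShutdown()

apply_profile("balanced")  # the 90%-of-the-time choice
```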
 
Joined
Sep 17, 2014
Messages
22,431 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 6000 CL30
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Sorry, call me old school, but give me a stable static clock over whatever rubbish this GPU Boost system is, any day, every day... I actually set a custom curve that has my card target a ~1100 MHz base so it flatlines at the maximum possible overboost bin all the time (which turns out to be the 1755 MHz target I want it to run at). GPU Boost is especially awful on cards with a lower power limit; you will pretty much never see it behave correctly - this thing loves to throttle. Trust me when I tell you, it is a better experience to have stable, flatlined clocks at a very modest vcore and absolutely no thermal or power throttling whatsoever than to just let this boost-algorithm rubbish run wild.

Meh. As much as I understand the reasoning, I do disagree. A well-built boost algorithm is just extra performance, and usually where it counts.

It requires a different stance towards overclocking or undervolting, I think.
You're no longer setting the exact operating point you want; you're setting the limits you want. Much like @nguyen here above: three sets of limits for voltage and boost will ensure you get the maximum clock within each limitation. The net result is probably an equal amount of control compared to the old situation, but still with fluctuating clocks, where the fluctuation is likely overclocking potential you'd never have had with a flat clock line.

But... it all depends on how refined and well designed the boost algorithm is, and what its stock limits are.
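The "set limits, not operating points" idea fits in a few lines: given a voltage/frequency table, the boost logic just picks the highest bin that fits under whatever caps you set. A toy sketch with invented V/F points:

```python
# Limit-based tuning in miniature: you set caps, boost finds the clock.
# The V/F table below is invented for illustration.
VF_TABLE = [  # (clock MHz, voltage mV, approx. board power W)
    (1725, 750, 280),
    (1830, 800, 310),
    (1920, 850, 350),
    (1980, 900, 390),
]

def max_clock(voltage_cap_mv: int, power_cap_w: int) -> int:
    """Highest clock whose voltage and power both fit under the caps."""
    fitting = [mhz for mhz, mv, w in VF_TABLE
               if mv <= voltage_cap_mv and w <= power_cap_w]
    return max(fitting) if fitting else 0  # 0 = nothing fits; real hardware would idle-clock

# Loosen the caps and the same logic yields a higher clock "for free".
print(max_clock(800, 320))   # -> 1830
print(max_clock(900, 400))   # -> 1980
```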
 