
NVIDIA Blackwell Graphics Architecture GPU Codenames Revealed, AD104 Has No Successor

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.65/day)
Location
Ex-usa | slava the trolls
A refresh, because why else name it "2xx" if it's a brand-new arch? Of course I'm just speculating like everyone else here. But it's "just" 3 years.

It could be a refresh of the brand-new arch, with the first iteration cancelled before it ever saw the light of day. They are skipping GB100 and going straight to GB200...
 
Joined
Aug 10, 2023
Messages
341 (0.72/day)
It could be a refresh of the brand-new arch, with the first iteration cancelled before it ever saw the light of day. They are skipping GB100 and going straight to GB200...
It makes sense, because right now Ada isn't selling that well; the COVID times are over, and the fat selling times are gone. Otherwise, both are possible: it could be a refresh, or GB200 could be more than a refresh, a 2nd-gen Ada (I kinda overlooked this earlier). GM200, for example, wasn't a refresh of GM100; it was a 2nd-generation Maxwell product. This could very well be the case with Ada as well, just renamed to Blackwell for whatever (probably marketing) reasons.
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
If the new gens from both AMD and nvidia are indeed delayed till 2025 | 2026, then you can safely assume that's the end of the graphics market as we know it.
2018 - RTX 2000
2020 - RTX 3000
2022 - RTX 4000
2025 | 2026 - RTX 5000

3 | 4 years to launch a new generation ?
Quite possible, since die shrinks can no longer provide the massive performance increases and power decreases we've been used to, because we're reaching the limits of both physics and economics. Physics, because silicon becomes susceptible to quantum effects below ~7nm and has doubtful scaling below 1nm, while copper at such small sizes has massive leakage issues. Economics, because the processes (EUV) needed to etch chips at such tiny sizes are expensive and slow, and any replacements for copper or silicon are going to make chip production incredibly expensive.

There are technologies like chiplets and vertical stacking that can mitigate some of these fundamental limitations and allow us to eke a little more life out of copper and silicon, but those are really just band-aids and side-steps.

GTX 900 - 28nm
GTX 1000 - 16nm/14nm
RTX 2000 - 12nm
RTX 3000 - 8nm
RTX 4000 - 4nm
RTX 5000 - 3nm if 2024, which may not be worth it if TSMC has managed to get 2nm to volume production by end of 2024 (assuming Apple doesn't snipe all that capacity)
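Putting the two lists above together, the cadence is easy to check. A quick sketch using the years and nodes quoted in this thread (the GTX 900/1000 launch years are added from memory, and the 2025 date for RTX 5000 is pure speculation):

```python
# Generation, launch year, and process node as quoted in this thread
# (GTX 900/1000 years added from memory; RTX 5000 is speculative).
generations = [
    ("GTX 900",  2014, "28nm"),
    ("GTX 1000", 2016, "16nm/14nm"),
    ("RTX 2000", 2018, "12nm"),
    ("RTX 3000", 2020, "8nm"),
    ("RTX 4000", 2022, "4nm"),
    ("RTX 5000", 2025, "3nm?"),  # assumed earliest rumoured date
]

# Print the gap between consecutive generations
for (prev, py, _), (cur, cy, node) in zip(generations, generations[1:]):
    print(f"{prev} -> {cur}: {cy - py} years, on {node}")
```

The pattern that falls out is a steady 2-year cadence up to Ada, stretching to 3 years only if the speculative 2025 date holds.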
 
Joined
Aug 10, 2023
Messages
341 (0.72/day)
Quite possible, since die shrinks can no longer provide the massive performance increases and power decreases we've been used to, because we're reaching the limits of both physics and economics. Physics, because silicon becomes susceptible to quantum effects below ~7nm and has doubtful scaling below 1nm, while copper at such small sizes has massive leakage issues. Economics, because the processes (EUV) needed to etch chips at such tiny sizes are expensive and slow, and any replacements for copper or silicon are going to make chip production incredibly expensive.

There are technologies like chiplets and vertical stacking that can mitigate some of these fundamental limitations and allow us to eke a little more life out of copper and silicon, but those are really just band-aids and side-steps.

GTX 900 - 28nm
GTX 1000 - 16nm/14nm
RTX 2000 - 12nm
RTX 3000 - 8nm
RTX 4000 - 4nm
RTX 5000 - 3nm if 2024, which may not be worth it if TSMC has managed to get 2nm to volume production by end of 2024 (assuming Apple doesn't snipe all that capacity)
Silicon has limits; other materials can surpass those limits and shrink further still. That debate is old; tech will never stop or slow down.
 
Joined
Mar 10, 2023
Messages
22 (0.04/day)
Processor 5700X
Motherboard B550 Tomahawk
Memory 16GB Teamgroup
Video Card(s) 6800XT
Storage WD Blue 1TB
Display(s) Iiyama
Case Define S
With AMD dropping out and Intel not anywhere close to NVIDIA, I don't think we will see the 50 series until 2025 at the earliest. NVIDIA even cancelled the 4090 Ti because it has no competition.
You must have some insider knowledge if you know AMD is dropping out. From where? There was a rumour they won't try to compete with the 4090 anymore; is that your insider knowledge?
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
Silicon has limits; other materials can surpass those limits and shrink further still. That debate is old; tech will never stop or slow down.
Tell us, captain obvious, how long do you think it'll take for the whole semiconductor industry plus the entirety of its supply chain to completely retool for completely different materials and completely different ways of fabricating them together?

What, exactly, is going to be the socio-economic impact when we're told that the chips we've been used to getting faster and cheaper for decades, can only keep becoming faster if we're willing to pay orders of magnitude more for them because faster is only possible with those new, vastly more expensive, materials and methods?
 
Joined
Aug 10, 2023
Messages
341 (0.72/day)
Tell us, captain obvious, how long do you think it'll take for the whole semiconductor industry plus the entirety of its supply chain to completely retool for completely different materials and completely different ways of fabricating them together?
They have already been researching that for many years now, even if you haven't heard about it. No need to get angry, btw. Technological advancement is no myth.
What, exactly, is going to be the socio-economic impact when we're told that the chips we've been used to getting faster and cheaper for decades, can only keep becoming faster if we're willing to pay orders of magnitude more for them because faster is only possible with those new, vastly more expensive, materials and methods?
It's funny that you think pricing will slow down; it's been increasing since 7nm, so why should I state anything else? That being said, it's very possible that some marvelous technological breakthrough delivers what silicon no longer can: good pricing while being better. One of the reasons new silicon is so expensive right now is that manufacturing has become too complicated on the advanced nodes. That's not necessarily the case for a new technological approach. Oh yes, I'm being optimistic here.
 
Joined
Dec 25, 2020
Messages
6,749 (4.71/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
When you take a chip typically put on a 104 die, move it to a 106 die, but sell it for 104 die prices, you deserve to be called out for it. "but muh chip names dont matter" is a very poor excuse for this behavior.


HARD disagree. The last time AMD tried this, it handed nvidia the upper mid range and high end on a silver platter, leading not only to price increases, but also to nvidia getting more money from sweet high-end margins to continue pulling ahead. When you drop out of a market, it's HARD to break back in, as everyone will have moved on.

If you're referring to Vega 20 or Navi 10 vs. Turing, they always had it... Price aside, the Titan RTX could pull almost twice the frame rates of the Radeon VII; it just cost like 4 VIIs, though.
 
Joined
Sep 17, 2014
Messages
22,447 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
1. We already knew it's Blackwell
2. It's said from numerous sources, that it comes at earliest 2025.
Yep, and nothing stops Nvidia from introducing a new cut later down the line. No 104? I don't see it; why leave such a massive gap between the high end and midrange? Or perhaps this is just Nvidia reshuffling the tiers again. Whatever it will be, the x04 is generally 2/3rds to 3/4ths of the 'full' die at the top. Maybe the x03 is taking that spot, because the current gen's x70 and x80 positioning was problematic.
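To put that rule of thumb in numbers: with a purely hypothetical flagship die area (the 600 mm² figure below is an assumption for illustration, not an official spec), the 2/3 to 3/4 ratio implies roughly:

```python
# Illustrative sketch of the x04 tier's usual size relative to the flagship die.
# The 600 mm^2 flagship area is a hypothetical figure, not an official number.
full_die_mm2 = 600.0
x04_low = full_die_mm2 * 2 / 3   # lower bound: 2/3 of the full die
x04_high = full_die_mm2 * 3 / 4  # upper bound: 3/4 of the full die
print(f"x04-class die: ~{x04_low:.0f}-{x04_high:.0f} mm^2 of a {full_die_mm2:.0f} mm^2 flagship")
```

So dropping the x04 tier entirely would leave a wide band of die area between the flagship and the midrange with nothing in it, which is the gap being questioned above.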

When you take a chip typically put on a 104 die, move it to a 106 die, but sell it for 104 die prices, you deserve to be called out for it. "but muh chip names dont matter" is a very poor excuse for this behavior.


HARD disagree. The last time AMD tried this, it handed nvidia the upper mid range and high end on a silver platter, leading not only to price increases, but also to nvidia getting more money from sweet high-end margins to continue pulling ahead. When you drop out of a market, it's HARD to break back in, as everyone will have moved on.
Yep... if AMD abandons the high end we can well see them take 2-3 generations again to recover. If you don't play ball in the top or subtop, you're just not playing much. AMD will find Intel in the midrange so they stand to gain nothing, they've just dropped a segment where there's but one competitor and now have two everywhere else.
 
Joined
Dec 25, 2020
Messages
6,749 (4.71/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Yep, and nothing stops Nvidia from introducing a new cut later down the line. No 104? I don't see it; why leave such a massive gap between the high end and midrange? Or perhaps this is just Nvidia reshuffling the tiers again. Whatever it will be, the x04 is generally 2/3rds to 3/4ths of the 'full' die at the top. Maybe the x03 is taking that spot, because the current gen's x70 and x80 positioning was problematic.

We've kind of reached an inflection point, imo. GPUs have gotten plenty fast, and the battle has shifted to feature set. Except that has been standardized as well, so as long as you have a Pascal card (1060 and above) or an equivalent AMD RDNA card, smooth 1080p 60 gaming is well within your reach.

By Turing you're already fully compliant with DirectX 12 Ultimate, so you're looking strictly at RT performance and general efficiency gains, and Ada's advanced features like SER aren't used by games yet.

So for the eSports segment, GPUs have been good enough for the past 6 years; for AAA flagships, the past 4. And Ada has all but made currently released RT games easy to run, so it makes more sense than ever to pay this surcharge only if you want to go above and beyond in performance... because otherwise your experience is probably going to be the same.
 
Joined
Sep 17, 2014
Messages
22,447 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
They have already been researching that for many years now, even if you haven't heard about it. No need to get angry, btw. Technological advancement is no myth.

It's funny that you think pricing will slow down; it's been increasing since 7nm, so why should I state anything else? That being said, it's very possible that some marvelous technological breakthrough delivers what silicon no longer can: good pricing while being better. One of the reasons new silicon is so expensive right now is that manufacturing has become too complicated on the advanced nodes. That's not necessarily the case for a new technological approach. Oh yes, I'm being optimistic here.
Dies can just get bigger, and nodes can be refined for better yield and reduced cost as well. We're already seeing this, in fact. Shrinking isn't the only path forward on silicon: DUV required lots of patterning steps to make a chip on a small (like 7nm) node, and EUV has now reduced that number significantly. This is all a gradual process. If anything new is going to replace silicon, that will be a very slow process too, and silicon will remain alongside it for a long time.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.65/day)
Location
Ex-usa | slava the trolls
That being said, it's very possible that some marvelous technological breakthrough delivers what silicon no longer can: good pricing while being better.

1. Online gaming, but every computer would need a fibre-optic connection with as low latency as possible, and servers in every country, located as densely across the globe as possible.
2. Technology that gets rid of traditional polygon rasterisation: a transition to unlimited detail, with unlimited zoom, of points | atoms in 3D instead of polygons and strips.


Something close to it is actually used in Unreal Engine 5: Nanite.
 

ir_cow

Staff member
Joined
Sep 4, 2008
Messages
4,469 (0.75/day)
Location
USA
You must have some insider knowledge if you know AMD is dropping out? From where? There was a rumour they won't try to compete with 4090 anymore, is that your insider knowledge?
I know nothing besides the rumor mill
 
Joined
Aug 20, 2007
Messages
21,469 (3.40/day)
System Name Pioneer
Processor Ryzen R9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 905p Optane 960GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
Joined
Jan 14, 2019
Messages
12,340 (5.76/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
If the new gens from both AMD and nvidia are indeed delayed till 2025 | 2026, then you can safely assume that's the end of the graphics market as we know it.
I disagree. AMD and Nvidia have probably realised that gamers don't need yearly upgrades, as game graphics aren't advancing quickly enough these days. If they can keep selling the same line of products for 3-4 years, that's less R&D cost and more focus on generational uplift. I mean, please don't tell me that one desperately needs to upgrade a 6600 XT to a 7600, or a 3060 to a 4060.
 
Joined
Dec 14, 2011
Messages
1,038 (0.22/day)
Location
South-Africa
Processor AMD Ryzen 9 5900X
Motherboard ASUS ROG STRIX B550-F GAMING (WI-FI)
Cooling Corsair iCUE H115i Elite Capellix 280mm
Memory 32GB G.Skill DDR4 3600Mhz CL18
Video Card(s) ASUS GTX 1650 TUF
Storage Sabrent Rocket 1TB M.2
Display(s) Dell S3220DGF
Case Corsair iCUE 4000X
Audio Device(s) ASUS Xonar D2X
Power Supply Corsair AX760 Platinum
Mouse Razer DeathAdder V2 - Wireless
Keyboard Redragon K618 RGB PRO
Software Microsoft Windows 11 - Enterprise (64-bit)
Meh, can't get excited anymore. The performance increases, but the average person can no longer afford it.
 
Joined
May 3, 2018
Messages
2,881 (1.20/day)
Just a rumor that AMD is dropping out. AMD is just delayed because Apple bought 100% of TSMC's 3nm node for 2024. AMD will be back in winter 2025 or spring 2026; I'd bet money on it. Apple is just hogging the best node for the entirety of 2024 because they're rich as fuck and want to leave everyone else in the dust temporarily.
More than a rumour. And to be clear, it's only N41 and N42 that are being canned, to focus on what I presume will be called N51 and N52. RDNA5 is already looking good, apparently, but high-end RDNA4 is lacklustre, offering only small gains over the 7900 cards. Sources inside AMD have said it would require delays to both the RDNA4 and RDNA5 launch dates to get it right. Given Blackwell isn't coming out until 2025, AMD doesn't need to worry too much. Just focus on the 7700 and 7800 this year, and they can get the 8600 out by September next year, apparently. Hope they can make the 8600 a true generational uplift, unlike this year's crap.
 
Joined
Sep 15, 2011
Messages
6,722 (1.39/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
With AMD dropping out and Intel not anywhere close to NVIDIA, I don't think we will see the 50 series until 2025 at the earliest. NVIDIA even cancelled the 4090 Ti because it has no competition.
That's actually a good thing. You can keep your very expensive video card longer, and won't feel the temptation to buy a new generation just because it's 50% stronger than what you currently have, even when there are no quality games out there worthy of a new GPU investment. ;)
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
17,221 (4.66/day)
Location
Kepler-186f
That's actually a good thing. You can keep your very expensive video card longer, and won't feel the temptation to buy a new generation just because it's 50% stronger than what you currently have, even when there are no quality games out there worthy of a new GPU investment. ;)

I'm not sure why this is even news. We have always been on a 2-to-3-year cycle for high-end cards.
 
Joined
Sep 15, 2011
Messages
6,722 (1.39/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Interesting comments to read, but I was wondering: maybe 3dfx had a great idea 20 years ago, just a bad implementation.
Instead of using 1 big monolithic GPU, why not use multiple small ones, which are easier and faster to produce due to better yields?
They call it "chiplets" nowadays, so why would AMD abandon the high-end and enthusiast segment if you can stack 4 or more of those, basically doubling the performance of the video card with each pair added? I thought they were going to smoke enVideea with this...
Confusing times.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
17,221 (4.66/day)
Location
Kepler-186f
It was every 6 months at one point when ATi rocked NVIDIA.

Maybe the most recent GPUs are all I can remember, huh. Interesting.

The Radeon 6800 XT's launch date was November 2020, so it took 3 years for the 7800 XT to replace it. It's probably fair to say it's not just COVID; we are on a 3-year cycle.

I'm glad I got my 7900 XT for the price I did; I'm set for many years. I was playing Assassin's Creed Brotherhood at 160 fps on ultra settings last night (one of the few AC games that supports high refresh) and the fans on the GPU didn't even kick on, because the card itself is so powerful. Absolutely blew me away. lol

Not a single frame drop either; it was 160 the entire time, same as in Uncharted 4. Unbelievable the power this thing has. Now, in Uncharted 4 the fans do kick into high gear and she gets hot; it's about the only game that makes my card extra hot.
 
Joined
Aug 10, 2023
Messages
341 (0.72/day)
That's actually a good thing. You can keep your very expensive video card longer, and won't feel the temptation to buy a new generation just because it's 50% stronger than what you currently have, even when there are no quality games out there worthy of a new GPU investment. ;)
Why? Is your buying decision dependent on what the fastest GPU is right now, or on what Nvidia told you "you have to have, because of feature X"? Please don't.
Interesting comments to read, but I was wondering: maybe 3dfx had a great idea 20 years ago, just a bad implementation.
Instead of using 1 big monolithic GPU, why not use multiple small ones, which are easier and faster to produce due to better yields?
They call it "chiplets" nowadays, so why would AMD abandon the high-end and enthusiast segment if you can stack 4 or more of those, basically doubling the performance of the video card with each pair added? I thought they were going to smoke enVideea with this...
Confusing times.
Yes, but those GPUs were way different back then. 3dfx SLI (the original SLI, not Nvidia's SLI, which essentially got its name when Nvidia bought up the remains of 3dfx) worked by using multiple GPUs in tandem, each doing different parts of the work in perfect sync; not like the later Crossfire / SLI (which is dead anyway), because the hardware was specifically built for it from the start. Newer GPUs don't work like that: they are built to work alone, and letting them work with a partner card is just optional. That was a lot of work, and it was abandoned; it was also mostly multiple cards, not multiple GPUs on one board. TL;DR: way different tech than what 3dfx used back then, but of course it's a shame 3dfx is gone. It's one of the biggest regrets of all time.
Maybe the most recent GPUs are all I can remember, huh. Interesting.

The Radeon 6800 XT's launch date was November 2020, so it took 3 years for the 7800 XT to replace it. It's probably fair to say it's not just COVID; we are on a 3-year cycle.

I'm glad I got my 7900 XT for the price I did; I'm set for many years. I was playing Assassin's Creed Brotherhood at 160 fps on ultra settings last night (one of the few AC games that supports high refresh) and the fans on the GPU didn't even kick on, because the card itself is so powerful. Absolutely blew me away. lol

Not a single frame drop either; it was 160 the entire time, same as in Uncharted 4. Unbelievable the power this thing has. Now, in Uncharted 4 the fans do kick into high gear and she gets hot; it's about the only game that makes my card extra hot.
The 7800 XT isn't even the real successor to the 6800 XT; the 7900 XT is (they renamed everything 1 tier up), more or less, though the chip is more cut down in comparison. Just like Nvidia gave us a fat chip for just $700 (if you were lucky enough to get one), the 3080, the 6800 XT was a slightly cut-down version of the 6900 XT's big chip to compete with it, all for $650-700, something we can't imagine now. Nvidia's big chip is $1600 now and still not the full thing; it's ridiculous, but whatever, I guess. The 4080 isn't that big but still costs $1200. In comparison, AMD didn't increase prices that much, so I'm not surprised people don't like this gen, mainly because Nvidia got too greedy. First they did it to clear old inventory of 3090s and below, but that was never the real reason. The real reason: they saw during the COVID times that people would pay ridiculous prices, and they kept it up.
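The jump being complained about is easy to quantify from the MSRPs in the post itself (taking $700 for the 3080 and $1200 for the 4080 as stated; these are the poster's figures, not verified launch prices):

```python
# MSRPs as quoted in the post above (the poster's figures, not verified).
msrp = {"RTX 3080": 700, "RTX 4080": 1200, "flagship (4090)": 1600}

# Relative increase from the 3080's price point to the 4080's
increase = (msrp["RTX 4080"] - msrp["RTX 3080"]) / msrp["RTX 3080"]
print(f"3080 -> 4080 MSRP increase: {increase:.0%}")
```

On those numbers, the xx80 tier went up by roughly 71% in one generation, which is the "too greedy" complaint in concrete terms.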
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
Interesting comments to read, but I was wondering: maybe 3dfx had a great idea 20 years ago, just a bad implementation.
Instead of using 1 big monolithic GPU, why not use multiple small ones, which are easier and faster to produce due to better yields?
They call it "chiplets" nowadays, so why would AMD abandon the high-end and enthusiast segment if you can stack 4 or more of those, basically doubling the performance of the video card with each pair added? I thought they were going to smoke enVideea with this...
Confusing times.
Chiplets are nothing like 3dfx's approach. You are comparing apples to pigs.

it's a shame 3DFX is gone
Nope, they were a poorly-managed company that made many questionable decisions and ultimately contributed little long-term value to the graphics card industry, except nostalgia.
 
Joined
Aug 10, 2023
Messages
341 (0.72/day)
Nope, they were a poorly-managed company that made many questionable decisions and ultimately contributed little long-term value to the graphics card industry, except nostalgia.
What a terrible take, and one that unfortunately doesn't have much to do with reality. Poorly managed at the end, yes, but not in general. Otherwise you're spot-on wrong.
 