
NVIDIA GTC 2019 Kicks Off Later Today, New GPU Architecture Tease Expected

Joined
Sep 17, 2014
Messages
22,673 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I hear what you guys are saying. Anyway, if you take the Turing die and its size and move from 12 to 7 nm, considering it's got the same number of cores, shaders etc., the die will be smaller. You get more of them on one wafer. I kinda see it that way. What it means for me is that you get the same performance (because it is the same chip) but it uses less power and it's smaller due to the shrink.
Isn't it going that way?

Yeah, you got that right, but you still need to factor in the actual cost of moving fabs to a smaller node, adjusting the processes and machinery etc. And then all you've got is the same product that uses a bit less power - and has headroom for further improvement. That on its own is not enough to compete. You go smaller so you can go 'bigger' :)
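
To put some rough numbers on that trade-off, here is a minimal back-of-envelope sketch (Python). The wafer prices and defect densities are invented placeholders, not actual foundry figures, and the dies-per-wafer formula is the common textbook approximation; it only illustrates why "smaller die, more dies per wafer" doesn't automatically mean "cheaper chip".

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300.0):
    """Textbook approximation: gross candidates minus the dies lost at the wafer edge."""
    radius = wafer_diameter_mm / 2.0
    return (math.pi * radius ** 2) / die_area_mm2 \
           - (math.pi * wafer_diameter_mm) / math.sqrt(2.0 * die_area_mm2)

def cost_per_good_die(die_area_mm2, wafer_cost_usd, defects_per_mm2):
    """Poisson yield model: yield = exp(-area * defect_density)."""
    yield_fraction = math.exp(-die_area_mm2 * defects_per_mm2)
    return wafer_cost_usd / (dies_per_wafer(die_area_mm2) * yield_fraction)

# Hypothetical numbers: a 545 mm^2 "12 nm" die that shrinks to ~60% of its area on "7 nm",
# a noticeably pricier 7 nm wafer, and a slightly worse early-node defect density.
print(f"12 nm-ish: ${cost_per_good_die(545, 6000, 0.0010):.0f} per good die")
print(f" 7 nm-ish: ${cost_per_good_die(331, 10000, 0.0015):.0f} per good die")
```

With these made-up inputs the shrink only barely comes out ahead on cost per good die; the real win, as noted above, is the headroom to build something bigger or faster in the same power envelope.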
 
Joined
May 31, 2016
Messages
4,440 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Yeah, you got that right, but you still need to factor in the actual cost of moving fabs to a smaller node, adjusting the processes and machinery etc. And then all you've got is the same product that uses a bit less power - and has headroom for further improvement. That on its own is not enough to compete. You go smaller so you can go 'bigger' :)
Yes, I agree with that; I just wanted to make a point about die size and yields. On the other hand, if what you say is true, AMD or NV wouldn't go that direction but would improve upon the current manufacturing process until they hit its limit. At the beginning it might be more expensive, but over time both companies cut costs with new, smaller nodes. That's also what they are after, and of course they can go bigger and even faster. I think this all goes hand in hand: smaller, less power, faster, and of course the option to go bigger, which gives even more performance. This gives the companies the possibility to go bigger (in comparison to the 12 or 16 nm nodes), and they will need to evaluate how big they can reasonably go on the 7 nm node.
 
Joined
Sep 17, 2014
Messages
22,673 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
On the other hand, if what you say is true, AMD or NV wouldn't go that direction but would improve upon the current manufacturing process until they hit its limit.

They did! 28nm was dragged out long past its expiry date, and Pascal was little more than a shrink of Maxwell with some bonus features (which is why it was dubbed "Paxwell"). 12nm is just 16nm++ in a sense, and on this node Nvidia has already pushed die sizes to the absolute maximum with the 2080 Ti. On the other side of the fence, AMD's Vega was already scaled beyond sensible power/temp targets, and prior to it, Fury X also maxed out 28nm - it even required HBM and watercooling to round it off.
 
Joined
Feb 3, 2017
Messages
3,822 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
The current state of things indicates that 7nm is not ready for prime time with big dies like GPUs. Vega10 at 14nm and TU104 at 12nm are about 55-60% larger than Vega20 at 7nm. If 7nm is more than ~50% more expensive (taking both manufacturing costs and yields into account), it does not make financial sense to go for a shrink. I did a quick Google search but could not find the AMD slide I mentioned earlier, but twice the cost for a 250 mm² die at the end of 2017 is a cause for some pessimism. Vega20 is already a third larger than that. I would expect Vega20's production cost to be about on par with the same GPU (with a 55-60% larger die) made on 12nm today. For AMD, the choice to shrink was a no-brainer - they really needed the power efficiency, and judging from Radeon VII the choice was justified. Nvidia felt they did not need that for Turing and stayed with 12nm for now.

This will change in time. Or perhaps it has already changed, considering the time it takes to develop and manufacture a GPU.
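
A sketch of that break-even arithmetic, using the public die sizes (Vega10 ≈ 495 mm², Vega20 ≈ 331 mm²) and a range of assumed 7 nm cost multipliers - the multipliers are hypothetical, since actual wafer pricing isn't public:

```python
# Break-even sketch for the "shrink or not" argument above.
# Die areas are the published figures; the per-mm^2 cost multipliers (yield included)
# for 7 nm relative to 12/14 nm are assumptions, not known foundry pricing.
VEGA10_MM2, VEGA20_MM2 = 495.0, 331.0

for cost_multiplier in (1.3, 1.5, 1.6, 2.0):
    relative_cost = (VEGA20_MM2 / VEGA10_MM2) * cost_multiplier
    verdict = "shrink is cheaper" if relative_cost < 1.0 else "shrink costs more"
    print(f"7 nm at {cost_multiplier:.1f}x per mm^2 -> Vega20 die costs "
          f"{relative_cost:.2f}x a Vega10 die ({verdict})")
```

The crossover sits right around a ~1.5x cost premium, which is why the ">50% more expensive" figure above is the pivot point.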
 
Joined
May 31, 2016
Messages
4,440 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
They did! 28nm was dragged out long past its expiry date, and Pascal was little more than a shrink of Maxwell with some bonus features (which is why it was dubbed "Paxwell"). 12nm is just 16nm++ in a sense, and on this node Nvidia has already pushed die sizes to the absolute maximum with the 2080 Ti. On the other side of the fence, AMD's Vega was already scaled beyond sensible power/temp targets, and prior to it, Fury X also maxed out 28nm - it even required HBM and watercooling to round it off.
OK. So you think 12nm is already maxed out, since NV is going 7nm? I think there's still headroom, and yet NV is moving to 7nm. The question is: is it because AMD is doing it, or do they have doubts about 12nm yields or performance gains? The crown for the first 7nm chip has already gone to AMD, so what's the point in rushing it?
 
Joined
Feb 3, 2017
Messages
3,822 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
OK. So you think 12nm is already maxed out, since NV is going 7nm? I think there's still headroom, and yet NV is moving to 7nm. The question is: is it because AMD is doing it, or do they have doubts about 12nm yields or performance gains? The crown for the first 7nm chip has already gone to AMD, so what's the point in rushing it?
12nm is maxed out. TU102 is 754 mm², very close to TSMC's reticle limit. They simply cannot manufacture a larger die without workarounds (and very considerable additional cost). GV100 was a bit larger at 815 mm², but that is at the very limit, and cards with that chip sold for $3k at the cheapest. If Nvidia wants more transistors, they have to go to 7nm.

Performance gain is an open question. For Nvidia GPUs, on 16/14nm the efficiency curve goes to hell a little over 2 GHz; 12nm gets to about 2.1 GHz. We really do not know whether that is a process or an architecture limit - probably a bit of both. AMD's Vegas get power-limited very quickly but seem to have gained a couple hundred MHz from the shrink. That is not bad, but not that much either. Power consumption will go down noticeably, which is a good thing, but is it good enough by itself?
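
That clock/efficiency observation maps onto the usual dynamic-power relation P ≈ C·V²·f: the last few hundred MHz need a disproportionate voltage bump, so power climbs much faster than performance. A minimal sketch with an invented voltage/frequency table (not measured Turing or Vega data), just to show the shape of the curve:

```python
# Dynamic power scales roughly with C * V^2 * f. The V/f points below are invented,
# not measured silicon data; they only illustrate why the efficiency curve collapses
# once extra clocks start demanding extra voltage.
BASE_F_MHZ, BASE_V = 1600.0, 0.90

vf_points = [(1600, 0.90), (1800, 0.95), (2000, 1.03), (2100, 1.09), (2200, 1.18)]

for f_mhz, volts in vf_points:
    rel_power = (volts / BASE_V) ** 2 * (f_mhz / BASE_F_MHZ)
    rel_clock = f_mhz / BASE_F_MHZ  # assuming performance tracks clock, which it rarely fully does
    print(f"{f_mhz:.0f} MHz @ {volts:.2f} V -> {rel_power:.2f}x power for {rel_clock:.2f}x clock")
```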
 
Joined
Sep 17, 2014
Messages
22,673 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
OK. So you think 12nm is already maxed out, since NV is going 7nm? I think there's still headroom, and yet NV is moving to 7nm. The question is: is it because AMD is doing it, or do they have doubts about 12nm yields or performance gains? The crown for the first 7nm chip has already gone to AMD, so what's the point in rushing it?

Maxed as in: has no future for their product stack. Look at this die shot

https://www.techpowerup.com/gpu-specs/geforce-rtx-2080-ti.c3305

And this chip is already using 250W or more, so it is also TDP-'capped' for the GeForce range (historically speaking; never say never).

The headroom that does remain simply isn't enough to push another generation or refresh out. There is too little to gain, and it would require even larger dies, which also means higher power consumption. That won't work for the 2080 Ti without extreme measures in cooling or otherwise. So where does that leave all the products below it? They have nowhere to reposition to...

As for AMD, they simply have to jump straight to 7nm because Vega 64 already touches 300W and has the exact same problem - and that is even with HBM already in place, which is marginally more power efficient than conventional memory.
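
Assuming power scales roughly in proportion to active units at fixed clocks and voltage (a simplification, but good enough for the point being made), the "nowhere to go" problem is easy to see. The only hard number below is the roughly 260 W a 2080 Ti already draws; the scaling percentages are hypothetical:

```python
# Crude headroom sketch: at the same clocks and voltage, power grows roughly in
# proportion to the number of enabled units. Board power is the ballpark 2080 Ti
# figure; the "bigger die" percentages are hypothetical refreshes.
BOARD_POWER_W = 260.0

for extra_units in (0.10, 0.20, 0.30):
    print(f"+{extra_units:.0%} more units -> ~{BOARD_POWER_W * (1 + extra_units):.0f} W "
          f"at unchanged clocks")
```

Even a modest 20-30% bump lands well past 300 W, which is the wall both vendors were already bumping against.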
 
Joined
Dec 22, 2011
Messages
3,890 (0.82/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
Current state of things indicates that 7nm is not ready for prime-time with big dies like GPUs. Vega10 at 14nm and TU104 at 12nm are about 55-60% larger than Vega20 at 7nm. If 7nm is more than 50% more expensive (taking both manufacturing costs and yields in account) it does not make financial sense to go for a shrink. I did a quick Google search but could not find the AMD slide I mentioned earlier but twice the cost for 250mm² die at the end of 2017 is a cause for some pessimism. Vega20 is already a third larger than that. I would expect Vega20 production cost to be about on par with the same GPU (with a 55-60% larger die) made on 12nm today. For AMD, the choice to shrink was a no-brainer - they really need the power efficiency and judging from Radeon VII the choice was justified. Nvidia felt they did not need that for Turing and stayed with 12nm for now.

This will change in time. Or perhaps has changed, considering the time it takes to develop and manufacture a GPU.

Voila!

https://www.techpowerup.com/forums/...auce-ray-tracing-and-more.251444/post-3974556
 
Joined
Dec 31, 2009
Messages
19,371 (3.54/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
They need more RTX games; it's been just two, and one is an MP shooter. This technology will be dead if this continues.
They aren't really behind. If you recall when these were released, IIRC the only things due in '18 were BF V and SOTR, with the latter's RT being delayed. We should see more this year... the sooner the better.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
17,425 (4.69/day)
Location
Kepler-186f
Processor 7800X3D -25 all core
Motherboard B650 Steel Legend
Cooling Frost Commander 140
Video Card(s) Merc 310 7900 XT @3100 core -.75v
Display(s) Agon 27" QD-OLED Glossy 240hz 1440p
Case NZXT H710 (Red/Black)
Audio Device(s) Asgard 2, Modi 3, HD58X
Power Supply Corsair RM850x Gold
So the RTX 2000 series is obsolete in less than a year?

no... gtx 1080 ti and rtx 2080 ti were 2 1/2 years apart from launch dates... Nvidia knows silicon has its limits so they probably will move rtx 2080 ti to a 3 year cycle... unless competition arrives, but it won't so....
 
Joined
Feb 3, 2017
Messages
3,822 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
Nvidia has other things they can talk about: Ampere, RTX, whatever piece of software they want to hype - probably directed at devs, not consumers.

Turing is still being ramped up. Or down, in this case, as TU116 is true midrange material. Announcing an RTX 3000 series now would be unexpected. Even if RTX 2000 turns out to be a one-year thing, an RTX 3000 announcement is more likely to come at Gamescom in August.

Eventually, though, it depends on what AMD has up its sleeve. If AMD comes out with a competitive enough Navi in August, as rumors currently say, Nvidia will need to answer. AMD and Nvidia know pretty well what the other is working on and generally have a good idea of what the other will announce and release. There are details like final clocks and final performance that they have to estimate, but those estimates are not far off.
 
Joined
Mar 11, 2019
Messages
201 (0.10/day)
Location
over the HoRYZEN
System Name Not an Intel Piece of Shite
Processor Superior AMD Glorious Master Race 2700SEX
Motherboard Glorious low cost Awesome Motherboard 4
Cooling A piece of metal that cools the amazing Ryzen CPU
Memory SAMMY BEEE DAI BABEH
Video Card(s) Turding
Storage irelevant
Display(s) monitor
Case It's red because AMD = red and AMD = awesome
Power Supply 1000W,. but not needed as Glorious RYZEN CPU is extremely afficient unlike that recylced 14nm++ Junk
Mouse *gets cat*
Keyboard RUHGUBUH!
Software Not Linux
Benchmark Scores Higher than Intel shite
Welp. Didn't expect them to potentially announce Turing's successor so soon, lol. I was thinking next-gen would be a 'Turing refresh' on 7nm, using the increased density to throw more CUDA cores at the problem and ramp up the clocks a bit. Turing, IMO, is a fantastic uArch with some very forward-looking features (dual INT/FP execution could be huge - look at games like Wolfenstein II and how they run on Turing cards). Anyway, I never thought the 20 series was a good investment. They are great cards, and if you've got the cash, sure, get one. But you are paying a significant early-adopter tax - not that it matters to real hardware enthusiasts.

Either way, with my budget of 200 quid or less for a GPU, the anxiety of buyer's remorse over this is largely diminished (much lower than if I'd spent £1000+ on a 2080 Ti, lol). The 1660 I just bought is a great little card; I'll most definitely be keeping it until 7nm arrives at the same price point and gives me 2x perf/watt in F@H ^^
 
Joined
Sep 17, 2014
Messages
22,673 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
They aren't really behind. If you recall when these were released, IIRC the only things due in '18 were BF V and SOTR, with the latter's RT being delayed. We should see more this year... the sooner the better.

As much as I understand these things take time, the initial hype has already died down, and for a new tech that needs adoption, that is not a good thing.

Ironically, there seems to be a bit more hype surrounding the recent CryEngine demo. And that is not just me looking through my tinted glasses... we also know Crytek is a studio that isn't in the best position right now, and they could just be fishing for exposure. But even so, their vision of the ray-traced future looks a whole lot better, IMO.
 

Moldysaltymeat

New Member
Joined
Mar 18, 2019
Messages
1 (0.00/day)
So the RTX 2000 series is obsolete in less than a year?

The RTX 2000 series was obsolete from the time the first benchmarks launched. Way overpriced with little performance benefit over Pascal. Not to mention that the flagship, the $1,200 2080 Ti, was dying on people shortly after launch. This RTX experiment was courageous, but it caused a massive loss in share value. Investors are demanding an upgrade to Pascal for a cost-to-performance ratio that makes sense.
 
Joined
Jan 8, 2017
Messages
9,504 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Eventually, though, it depends on what AMD has up its sleeve. If AMD comes out with a competitive enough Navi in August, as rumors currently say, Nvidia will need to answer. AMD and Nvidia know pretty well what the other is working on and generally have a good idea of what the other will announce and release. There are details like final clocks and final performance that they have to estimate, but those estimates are not far off.

No high-end part from AMD is in sight, so it will all come down to whether or not the RTX series is selling well enough. They have no reason to compete with their own products; I suspect Turing was a money pit and they will likely not let go of it any time soon.
 
Joined
Oct 10, 2009
Messages
944 (0.17/day)
System Name Desktop
Processor AMD Ryzen 7 5800X3D
Motherboard MAG X570S Torpedo Max
Cooling Corsair H100x
Memory 64GB Corsair CMT64GX4M2C3600C18 @ 3600MHz / 18-19-19-39-1T
Video Card(s) EVGA RTX 3080 Ti FTW3 Ultra
Storage Kingston KC3000 1TB + Kingston KC3000 2TB + Samsung 860 EVO 1TB
Display(s) 32" Dell G3223Q (2160p @ 144Hz)
Case Fractal Meshify 2 Compact
Audio Device(s) ifi Audio ZEN DAC V2 + Focal Radiance / HyperX Solocast
Power Supply Super Flower Leadex V Platinum Pro 1000W
Mouse Razer Viper Ultimate
Keyboard Razer Huntsman V2 Optical (Linear Red)
Software Windows 11 Pro x64
I suspect Turing was a money pit and they will likely not let go of it any time soon.

This. It'll probably be a shrink and a tweak at most. I'd be surprised to see anything more at this stage, if we see anything at all.
 

64K

Joined
Mar 13, 2014
Messages
6,773 (1.72/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) Temporary MSI RTX 4070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Temporary Viewsonic 4K 60 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
no... gtx 1080 ti and rtx 2080 ti were 2 1/2 years apart from launch dates... Nvidia knows silicon has its limits so they probably will move rtx 2080 ti to a 3 year cycle... unless competition arrives, but it won't so....

According to the GPU database here the 1080 Ti and 2080 Ti launches were 1 1/2 years apart.
 
Joined
Mar 18, 2019
Messages
3 (0.00/day)
Until a card comes out with 2x the performance of the 1080 Ti, I'm waiting. Upgrading is a waste of money unless you can at least double your performance. I miss the golden age of computing, when CPUs and GPUs at least doubled in performance every generation and prices weren't so ridiculous...
 
Joined
Mar 10, 2014
Messages
1,793 (0.45/day)
Yeah, it's GTC, so a successor to Volta at most. Maybe some new bits about the Tegra Orin automotive part too. Vega20 is quite a compelling HPC compute GPU on paper, so perhaps Nvidia focuses its next 7nm effort on a full-FP64 GPU. GV100 is a two-year-old GPU at this point, and it only got an updated memory config a year ago. Not to mention it lacks some of the mixed-precision compute that Turing has.
 
Joined
Feb 3, 2017
Messages
3,822 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
According to the GPU database here the 1080 Ti and 2080 Ti launches were 1 1/2 years apart.
...
GTX 980 - September 2014
GTX 980 Ti - June 2015
GTX 1080 - May 2016
GTX 1080 Ti - March 2017
RTX 2080 Ti - September 2018
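
For reference, the gaps implied by that list, computed straight from the launch months above (days are arbitrary, since only month and year matter here):

```python
from datetime import date

# Launch months as listed above; day-of-month is set to 1 because only month/year matter.
launches = [
    ("GTX 980",     date(2014, 9, 1)),
    ("GTX 980 Ti",  date(2015, 6, 1)),
    ("GTX 1080",    date(2016, 5, 1)),
    ("GTX 1080 Ti", date(2017, 3, 1)),
    ("RTX 2080 Ti", date(2018, 9, 1)),
]

for (prev_name, prev_date), (name, launch) in zip(launches, launches[1:]):
    months = (launch.year - prev_date.year) * 12 + (launch.month - prev_date.month)
    print(f"{prev_name} -> {name}: {months} months")
```

That backs up the roughly 1 1/2 year gap from 1080 Ti to 2080 Ti, against the ~9-11 months between the earlier launches.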
 
Joined
Sep 17, 2014
Messages
22,673 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
According to the GPU database here the 1080 Ti and 2080 Ti launches were 1 1/2 years apart.

I think it's more accurate to use the Gx104 parts to compare generation release timing. Nvidia plays around with its big die and launches it when it will have the greatest impact - or not at all (Kepler). For Turing, that was right at the beginning, because it was the only card that offered a performance gain over Pascal. For Pascal, it was right at the end, because the rest of the stack could carry everything fine.

The bottom line doesn't really change: we've been looking at 'Pascal performance' for far too long now, and Turing barely changes that.
 

64K

Joined
Mar 13, 2014
Messages
6,773 (1.72/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) Temporary MSI RTX 4070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Temporary Viewsonic 4K 60 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
I think it's more accurate to use the Gx104 parts to compare generation release timing. Nvidia plays around with its big die and launches it when it will have the greatest impact - or not at all (Kepler). For Turing, that was right at the beginning, because it was the only card that offered a performance gain over Pascal. For Pascal, it was right at the end, because the rest of the stack could carry everything fine.

The bottom line doesn't really change: we've been looking at 'Pascal performance' for far too long now, and Turing barely changes that.

Good point, but I disagree about big-die Kepler. They eventually did release the 780 Ti, which had more cores than the Titan and was faster than it until the Titan Black was released - but the 780 Ti wasn't good for compute.

I am of the opinion, though I can't prove it, that if Turing hadn't needed die space for the RT and Tensor cores, there would have been more CUDA cores and we would have seen the kind of performance increase Pascal had over Maxwell.
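
That what-if can at least be sized on the back of an envelope. TU102's ~754 mm² die and 4608 CUDA cores are public specs; the share of die area assumed to be spent on RT and Tensor cores is a pure assumption (Nvidia has never published it), so treat the output as a thought experiment rather than a measurement:

```python
# What-if sketch: if the area spent on RT and Tensor cores had gone to ordinary SMs
# instead, roughly how many more CUDA cores would fit? TU102 area and core count are
# public figures; the rt_tensor_share values are assumptions, not measured data.
TU102_AREA_MM2 = 754.0
TU102_CUDA_CORES = 4608  # full TU102

for rt_tensor_share in (0.10, 0.20, 0.30):
    reclaimed_mm2 = TU102_AREA_MM2 * rt_tensor_share
    # If cores scale linearly with the non-RT/Tensor area, reclaiming that area adds:
    extra_cores = TU102_CUDA_CORES * rt_tensor_share / (1.0 - rt_tensor_share)
    print(f"{rt_tensor_share:.0%} of the die reclaimed (~{reclaimed_mm2:.0f} mm^2) "
          f"-> roughly {extra_cores:.0f} extra CUDA cores")
```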
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,246 (1.28/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
The RTX 2000 series was obsolete from the time the first benchmarks launched. Way overpriced with little performance benefit over Pascal. Not to mention that the flagship, the $1,200 2080 Ti, was dying on people shortly after launch. This RTX experiment was courageous, but it caused a massive loss in share value. Investors are demanding an upgrade to Pascal for a cost-to-performance ratio that makes sense.

The Radeon VII was obsolete from the time the first benchmarks launched. Way overpriced with little performance benefit over Vega. Not to mention that it's noisy, power hungry and can't consistently match the product it competes against. This MI50 experiment was courageous, but the cards are sold at virtually no profit. Investors are demanding an upgrade to GCN for a cost-to-performance ratio that makes sense.

Fixed.
 
Joined
Jan 8, 2017
Messages
9,504 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Investors are demanding an upgrade to GCN for a cost-to-performance ratio that makes sense.

Your comment is cute.

Meanwhile, here is what has actually happened in the real world (with the launch dates of both the 20 series and the Radeon VII marked):

[attached images: lol.png, s.png]


Regardless, nice try.
 