
January 5 Release Date Predicted for NVIDIA GeForce RTX 4070 Ti

Joined
Sep 17, 2014
Messages
22,449 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I don't think they were meant to coexist. I think Nvidia expected scalpers and miners to buy up all Ampere stock before the Ada launch, but that plan got thrown out the window when crypto crashed.
Right, because they didn't learn from Pascal?

and then

With Ampere, they simply didn't bother making a mining-specific card; rather, they started making gamer-specific cards (LHR) when the backlash and the rumor mill around other sales channels started to become tangible.

Come on. Nvidia has always deployed some tactic to time each part of its stack to hit the market at just the right moment. The strategy is profit maximization; that and that alone determines what gets launched when. The 4nm and limited-run story fits right in, too. They stand to gain nothing from cannibalizing their own stock.

As far as the 4080s go, though, yeah, I don't think they expected that, because the 4070 Ti is probably just as tainted right now because of it, which leaves them in a crappy position. If they delay the 4070 Ti further through limited stock or otherwise, they might miss their sales on it entirely, as people can simply buy an equally performant Ampere alternative or something from the AMD stable.
 
Last edited:
Joined
Jan 14, 2019
Messages
12,340 (5.76/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Right, because they didn't learn from Pascal?


I mean, I wouldn't blame them. Who thought Ethereum was actually going to switch to Proof-of-Stake, effectively crashing the mining market? It had been hanging in the air for years, but nobody knew when it would actually happen, or if it would at all.
 
Joined
May 11, 2018
Messages
1,255 (0.52/day)
Ethereum switched to Proof-of-Stake well after the crypto crash, in September 2022. And no, crypto didn't crash because Ethereum announced the change to Proof-of-Stake; that had been planned and postponed since 2019. Home mining on gaming GPUs has very little impact on the overall crypto market; all it does is fuck up the GPU market. People used to praise it publicly on forums like this one; now they know better.
 
Joined
Jan 14, 2019
Messages
12,340 (5.76/day)
Ethereum switched to Proof-of-Stake well after the crypto crash, in September 2022. And no, crypto didn't crash because Ethereum announced the change to Proof-of-Stake; that had been planned and postponed since 2019. Home mining on gaming GPUs has very little impact on the overall crypto market; all it does is fuck up the GPU market. People used to praise it publicly on forums like this one; now they know better.
There's no denying that, but I'm talking about the GPU market, which was hugely f-ed up due to crypto mining. The other thing is that a GPU architecture's roadmap spans several years. When Nvidia was making plans for the transition from Ampere to Ada, we were still well within the crypto boom. Back then, they could easily have counted on Ethereum not making the switch for a couple more years.

Edit: I find this supported by the fact that Ada is basically a larger, more efficient Ampere made on a smaller node. Not much effort was made to improve the architecture, which makes me believe that gaming wasn't the main focus of development. Sure, there's DLSS 3 and such, but those improvements are usually a side note, not the main focus.
 
Last edited:
Joined
May 11, 2018
Messages
1,255 (0.52/day)
Well, the way they did Ada was going for the low-hanging fruit: the new process at TSMC enabled quite a jump from the inefficient Samsung one, so they got to pocket the money from the 2020-2022 crypto high and save money on a redesign!

And then they have the audacity to tell us that it's really expensive, so we should be glad it's only a 70% price increase.
 
Joined
Sep 17, 2014
Messages
22,449 (6.03/day)
There's no denying that, but I'm talking about the GPU market, which was hugely f-ed up due to crypto mining. The other thing is that a GPU architecture's roadmap spans several years. When Nvidia was making plans for the transition from Ampere to Ada, we were still well within the crypto boom. Back then, they could easily have counted on Ethereum not making the switch for a couple more years.

Edit: I find this supported by the fact that Ada is basically a larger, more efficient Ampere made on a smaller node. Not much effort was made to improve the architecture, which makes me believe that gaming wasn't the main focus of development. Sure, there's DLSS 3 and such, but those improvements are usually a side note, not the main focus.
Interesting point. But: gaming is the main focus for GeForce, and Ada is a gaming arch. They have Hopper and others for datacenter/enterprise, and they have Quadro RTX cards for pro. But it's all more of the same, with tweaks, additions or omissions.

In the end it's a chip that works and improves through iterative technological upgrades and adjustments. Both Nvidia and AMD have historically done a dance with what they do or don't enable on these chips, and how that suits different purposes or markets. But in the end it's a huge number of programmable cores that can work in parallel. We've seen Nvidia's CUDA-based arch evolve over time: GeForce was already thinned down to pure gaming with Pascal, GPGPU capability removed; then we got Volta, and they borrowed its tensor cores to transplant into the Pascal 'base' (= Turing), added some cache and sauce, and now we have an RT core too. But in usage and purpose, there are similarities between everything Nvidia releases, and there always have been. The use of machine learning/'AI' to produce frames is effectively also a gaming improvement that originates from another corner of the market.

I think they've reached the end of the line with the fruit they can harvest for a more efficient gaming GPU based on raster alone. The marketing on GeForce kind of follows that logic; it's getting separated further and further from reality, just like Intel's chaotic explanations and changes to how turbo and frequency work, resulting in product stacks that become largely unusable, provide parts you'd rather undervolt, or are priced way out of comfort because they must adhere to inflated marketing claims. You're right, what used to 'seem' like a side note, such as a DLSS version upgrade, is now front and center; but compare it to Maxwell > Pascal! That was similarly mostly just a good shrink and some improvements to power delivery (Ada has almost the same PowerPoint slides :D) enabling higher frequencies. Except today, those barely pay off on their own; diminishing returns set in, especially as many raster components in the pipeline are becoming more dynamic.

TL;DR: I agree that the focus has indeed shifted since Volta, but it remains plausible that this was still the best way forward for their gaming lineup.
 
Last edited:
Joined
Jan 14, 2019
Messages
12,340 (5.76/day)
Interesting point. But: gaming is the main focus for GeForce, and Ada is a gaming arch. They have Hopper and others for datacenter/enterprise, and they have Quadro RTX cards for pro. But it's all more of the same, with tweaks, additions or omissions.
Yeah, but they haven't had a separation between gaming and mining cards since the unpopular (and, as I've seen, unavailable) CMP series. When you say "Ada is a gaming arch", it is kind of synonymous with "Ada is a mining arch". Considering how similar Ada is to Ampere in its physical design, I would say mining was a more likely target than gaming, as miners were the ones who needed more of the same. Gaming architectures tend to undergo more changes as games change their approach to graphics.

I think they've reached the end of the line with the fruit they can harvest for a more efficient gaming GPU based on raster alone. The marketing on GeForce kind of follows that logic; it's getting separated further and further from reality, just like Intel's chaotic explanations and changes to how turbo and frequency work, resulting in product stacks that become largely unusable, provide parts you'd rather undervolt, or are priced way out of comfort because they must adhere to inflated marketing claims.
I agree, but then logic dictates that they should have focused on things that aren't at the end of the line, like RT, for example. Ada seems to show exactly the same performance drop with RT on versus off as Ampere, which is a real shame, imo. Cramming more RT cores onto the die relative to traditional shaders, or just making the existing ones faster, would have paid off much more, especially since AMD doesn't seem to be focusing on improving RT with RDNA 3, either. Nobody cares if you get 600 FPS in CS:GO instead of 400, so having more of the same is, again, a choice that would have catered to miners much more than gamers.
 
Joined
May 11, 2018
Messages
1,255 (0.52/day)
Considering how similar Ada is to Ampere in its physical design, I would say mining was a more likely target than gaming, as miners were the ones who needed more of the same. Gaming architectures tend to undergo more changes as games change their approach to graphics.

I'd say gaming changed just about zero from the arrival of Ampere to Ada. We don't even have a single AAA game that they could promote as a showcase of the new RTX and DLSS iterations; they're showing them off in patches to old games, or in the fully path-traced Portal RTX mod, which only shows we're not there by a long shot...

Remember the GeForce RTX 2080 Ti Cyberpunk 2077 Edition that came out months before the game was finally launched, and was then actually too slow to run the game's ray tracing properly?
:-D
 
Joined
Sep 17, 2014
Messages
22,449 (6.03/day)
Yeah, but they haven't had a separation between gaming and mining cards since the unpopular (and, as I've seen, unavailable) CMP series. When you say "Ada is a gaming arch", it is kind of synonymous with "Ada is a mining arch". Considering how similar Ada is to Ampere in its physical design, I would say mining was a more likely target than gaming, as miners were the ones who needed more of the same. Gaming architectures tend to undergo more changes as games change their approach to graphics.


I agree, but then logic dictates that they should have focused on things that aren't at the end of the line, like RT, for example. Ada seems to show exactly the same performance drop with RT on versus off as Ampere, which is a real shame, imo. Cramming more RT cores onto the die relative to traditional shaders, or just making the existing ones faster, would have paid off much more, especially since AMD doesn't seem to be focusing on improving RT with RDNA 3, either. Nobody cares if you get 600 FPS in CS:GO instead of 400, so having more of the same is, again, a choice that would have catered to miners much more than gamers.
Interesting points! I'm more of the opinion (we'll never really know, will we...) that Nvidia's mining performance is a side effect, not a target for them. It would present a major risk to their continued business strategy to make mining-oriented GPUs. They just repurposed existing cards.
 

bug

Joined
May 22, 2015
Messages
13,773 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Interesting points! I'm more of the opinion (we'll never really know, will we...) that Nvidia's mining performance is a side effect, not a target for them. It would present a major risk to their continued business strategy to make mining-oriented GPUs. They just repurposed existing cards.
I believe you are correct. GPUs only started doing compute when rendering shaders became flexible enough to tackle compute tasks. And later on, someone figured that since the compute capabilities of a GPU are highly parallel, they could be used for mining.
If Nvidia were actively trying to target miners, there's a lot of GPU silicon they could remove by building dedicated, smaller dies. As it stands, their compute cards use the same dies meant for gaming, albeit with (a lot of) disabled shaders.
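(A toy sketch of why mining maps so neatly onto those parallel shaders: every nonce can be hashed independently, with zero coordination between attempts. Illustrative Python only, nothing like a real miner; the function names are just made up for the example.)

```python
import hashlib
from typing import Optional

def hash_nonce(header: bytes, nonce: int) -> bytes:
    # Bitcoin-style double SHA-256 over header + nonce. Each nonce is
    # independent of every other one, which is why the search is
    # "embarrassingly parallel" and maps so well onto GPU shader cores.
    data = header + nonce.to_bytes(4, "little")
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def search(header: bytes, difficulty_zero_bytes: int, limit: int) -> Optional[int]:
    # On a GPU, each core would test a different slice of the nonce
    # range simultaneously; here we just loop sequentially.
    target_prefix = b"\x00" * difficulty_zero_bytes
    for nonce in range(limit):
        if hash_nonce(header, nonce).startswith(target_prefix):
            return nonce
    return None
```

No shared state, no memory traffic between attempts: exactly the workload a wall of simple parallel cores is good at, whether or not the vendor intended it.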
 