
NVIDIA GeForce RTX 5070 Ti Leak Tips More VRAM, Cores, and Power Draw

Joined
Jul 31, 2024
Messages
186 (1.27/day)
Processor AMD Ryzen 7 5700X
Motherboard ASUS ROG Strix B550-F Gaming Wifi II
Cooling Noctua NH-U12S Redux
Memory 4x8G Teamgroup Vulcan Z DDR4; 3600MHz @ CL18
Video Card(s) MSI Ventus 2X GeForce RTX 3060 12GB
Storage WD_Black SN770, Leven JPS600, Toshiba DT01ACA
Display(s) Samsung ViewFinity S6
Case Fractal Design Pop Air TG
Power Supply Corsair CX750M
Mouse Corsair Harpoon RGB
Keyboard Keychron C2 Pro
VR HMD Valve Index
I never thought I'd say this... THANK GOD FOR INTEL and their decision to make 8GB VRAM a thing of the past! 12GB VRAM for just $250 is a thing of beauty that the GPU market so desperately needed.
Ehhh... I wouldn't get too excited. B300 will probably be a 6/8GB lineup, but then again A300 was the ultra-budget/transcoding/SFF series so it's not much of a tragedy.
 
Joined
Jan 9, 2023
Messages
324 (0.45/day)
> Supposedly, the RTX 5070 Ti will also see a bump in total CUDA cores, from 8448 in the RTX 4070 Ti to 8960 in the RTX 5070 Ti.

@Cpt.Jank The 4070Ti Super has 8448 cores, the 4070Ti has 7680 cores.

Upgrade seems a bit mid. If we compare it against (and assume it's priced similarly to) the 4070 Ti, the uplift should be less than 25%.
The additional VRAM is nice but the extra memory speed is a bit of a meme.
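
Quick back-of-the-envelope check on that figure, assuming the leaked 8960-core count for the 5070 Ti holds (it is still only a rumour):

```python
# Shader uplift of the rumoured RTX 5070 Ti over the RTX 4070 Ti (non-Super).
# 7680 is the known 4070 Ti core count; 8960 is the leaked 5070 Ti figure.
cores_4070_ti = 7680
cores_5070_ti = 8960  # rumoured, may change
print(f"Core uplift vs 4070 Ti: {cores_5070_ti / cores_4070_ti - 1:+.1%}")  # ~+16.7%
```

Cores alone give roughly +17%, so clocks and architecture would have to add less than about 7% on top for the total to stay under 25%.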

Does make it somewhat comparable to a 4080 though.

But I think this thing will get an MSRP of $850.
 
Joined
Jan 27, 2024
Messages
305 (0.92/day)
Processor Ryzen AI
Motherboard MSI
Cooling Cool
Memory Fast
Video Card(s) Matrox Ultra high quality | Radeon
Storage Chinese
Display(s) 4K
Case Transparent left side window
Audio Device(s) Yes
Power Supply Chinese
Mouse Chinese
Keyboard Chinese
VR HMD No
Software Android | Yandex
Benchmark Scores Yes
Only 6% more shaders means roughly that much higher performance. As time goes by, the 16 GB framebuffer will become a bottleneck, so no matter the other improvements, it will hit a performance wall.

I never thought I'd say this... THANK GOD FOR INTEL and their decision to make 8GB VRAM a thing of the past! 12GB VRAM for just $250 is a thing of beauty that the GPU market so desperately needed.

24GB on a 5080 should now be a thing, nGreedia, as it should have been all along. But I won't hold my breath.

Even if it's not exactly 24 GB, 18 GB, 20 GB or 22 GB would definitely be better.
 
Joined
Nov 24, 2018
Messages
2,252 (1.01/day)
Location
south wales uk
System Name 1.FortySe7en VR rig 2. intel teliscope rig 3.MSI GP72MVR Leopard Pro .E-52699, Xeon play thing
Processor 1.3900x @stock 2. 3700x . 3. i7 7700hq
Motherboard 1.aorus x570 ultra 2. Rog b450 f,4 MR9A PRO ATX X99
Cooling 1.Hard tube loop, cpu and gpu 2. Hard loop cpu and gpu 4 360 AIO
Memory 1.Gskill neo @3600 32gb 2.corsair ven 32gb @3200 3. 16gb hyperx @2400 4 64GB 2133 in quad channel
Video Card(s) 1.GIGABYTE RTX 3080 WaterForce WB 2. Aorus RTX2080 3. 1060 3gb. 4 Arc 770LE 16 gb
Storage 1 M.2 500gb , 2 3tb HDs 2. 256gb ssd, 3tbHD 3. 256 m.2. 1tb ssd 4. 2gb ssd
Display(s) 1.LG 50" UHD , 2 MSI Optix MAG342C UWHD. 3.17" 120 hz display 4. Acer Preditor 144hz 32inch.z
Case 1. Thermaltake P5 2. Thermaltake P3 4. some cheapo case that should not be named.
Audio Device(s) 1 Onboard 2 Onboard 3 Onboard 4. onboard.
Power Supply 1.seasonic gx 850w 2. seasonic gx 750w. 4 RM850w
Mouse 1 ROG Gladius 2 Corsair m65 pro
Keyboard 1. ROG Strix Flare 2. Corsair F75 RBG 3. steelseries RBG
VR HMD rift and rift S and Quest 2.
Software 1. win11 pro 2. win11 pro 3, win11 home 4 win11 pro
Benchmark Scores 1.7821 cb20 ,cb15 3442 1c 204 cpu-z 1c 539 12c 8847
New shiny thing, eh? I'm not falling for it until it gets dull.
 
Joined
Jan 14, 2019
Messages
12,582 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Ehhh... I wouldn't get too excited. B300 will probably be a 6/8GB lineup, but then again A300 was the ultra-budget/transcoding/SFF series so it's not much of a tragedy.
I'm more excited to see what B700 has in store for us (and RDNA 4 for that matter). RTX 5000 will most probably be overpriced, just like everything Nvidia since the 3000 or maybe even 2000 series.
 
Joined
Nov 20, 2012
Messages
163 (0.04/day)
@Cpt.Jank: You mixed up the specs of the 4070 Ti and the 4070 Ti Super. The Ti had 60 SMs (7,680 cores) and a 192-bit memory interface; the Ti Super has 66 SMs (8,448 cores) and 256-bit.

The "leaked" specs have been leaked for about the third time, so no surprise at all, just nearly a confirmation. The card seems to be no big bump at all, with only 6% more cores, 33% more bandwith and 65W TDP (the 4070TiS was not TDP-limited at all!). If core clock doesn't go up quite a bit, which I doubt given the nearly identical fabrication process, increased performance will have to come mainly from architectural improvements or DLSS4.x. It is the same with 5080 and 5070, which offer similarly small spec-bumps over their predecessors.

If they don't cost more, I don't care, but I doubt that.
 
Joined
Jul 31, 2024
Messages
186 (1.27/day)
Processor AMD Ryzen 7 5700X
Motherboard ASUS ROG Strix B550-F Gaming Wifi II
Cooling Noctua NH-U12S Redux
Memory 4x8G Teamgroup Vulcan Z DDR4; 3600MHz @ CL18
Video Card(s) MSI Ventus 2X GeForce RTX 3060 12GB
Storage WD_Black SN770, Leven JPS600, Toshiba DT01ACA
Display(s) Samsung ViewFinity S6
Case Fractal Design Pop Air TG
Power Supply Corsair CX750M
Mouse Corsair Harpoon RGB
Keyboard Keychron C2 Pro
VR HMD Valve Index
I'm more excited to see what B700 has in store for us (and RDNA 4 for that matter). RTX 5000 will most probably be overpriced, just like everything Nvidia since the 3000 or maybe even 2000 series.
It's an odd pacing, at least as far back as I've had my head dipped in things.
  • Pascal (10) was a breakout hit and was priced insanely well across the board, seeing volumes of 1060s, 1070s, and 1080/1080Tis moving. Even the 1050Ti sold like hotcakes as a slot-only card.
  • Turing (20/16) came out sorely overpriced and underperformed due to it being mostly just an experimentalist platform for 1st gen RT and matrix math. The 16 series cut the crap and focused on price appeal, to much success with SIs and OEMs. Seen as an afterthought nowadays...
  • Ampere (30) was, like the 10 series, so good that literally everything sold. Again. Even the RTX 3050, a laughably bad model in the lineup, was moving units. Everyone wanted a piece of that fresh and functional 2nd gen RT and the true fruit of all those fancy arch additions—DLSS. Good DLSS. It was the crypto boom that ruined prices after launch.
  • Ada (40) follows the same leapfrogging pattern that was forming the last few generations, being an overpriced, underperforming software showcase for special first-adopters only. DLSS framegen, ray reconstruction, AI toolkit—yadda yadda no one friggin' cares.
Naturally, we should expect the 50 series to be a return to form for Nvidia, another lineup of home-runs that makes everyone happy, but the forecast isn't looking great. Higher power draw, barely improving on shader count from Ada/Ampere, and a lame leg for VRAM on the lower end. They're putting less and less into impressing across the board, and putting more and more into the ultra-mega-super-expensive chips from the 4090 to the—what, B6000? A6000 Blackwell?

The margins are getting to their head, and the shareholders are only gonna stop huffing nitrous when the stock price eventually, inevitably, crashes to what Nvidia is ACTUALLY worth as an asset.
 
Joined
Jul 27, 2022
Messages
31 (0.04/day)
So the TDP is going up a bit compared to 4070ti? That always seems like the wrong direction to me.
 
Joined
Sep 17, 2014
Messages
22,673 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I don't feel like I got screwed. I feel like money doesn't buy what it used to though.
Right, you're totally not annoyed with having 12GB on what is essentially a 16GB card (signed, Nvidia), because they then released a 4070 Ti Super that's a lot better at almost the same cost.

You don't feel screwed? And now you're ready to jump on the successor, again, too early and paying far too much, and you'll still be saying 'you don't get what you used to for your money'? If you had waited a few months and gone 4070 Ti S in the last round, there would have been ZERO reason to even put this 5070 Ti on your radar to begin with. You'll be gaining what, 10-15% perf and 4GB VRAM; it's almost a sidegrade.

It's strange how that logic works wrt spending on hardware, honestly. Just be honest: the 4070 Ti was a shit buy. I'm going to be honest too: I regret buying the 7900XT at the price it had at the time, 250 bucks over what it costs now. Worst timing I've had on a GPU. Could've had a free B580 for that today.

We should do better :)

So the TDP is going up a bit compared to 4070ti? That always seems like the wrong direction to me.
Same node, a few more shaders, and a higher voltage to extract another 3% you could've had from an OC yourself.
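
To put a rough number on that trade, here is a sketch using the usual dynamic-power rule of thumb (power scales roughly with frequency times voltage squared); the clock and voltage deltas are made-up illustrations, not leaked figures:

```python
# Why a factory clock/voltage bump on the same node is an expensive way to buy performance:
# dynamic power scales roughly with f * V^2.
def relative_dynamic_power(freq_scale: float, volt_scale: float) -> float:
    return freq_scale * volt_scale ** 2

# Hypothetical example: +3% clock paid for with +5% voltage on the same silicon.
extra = relative_dynamic_power(1.03, 1.05) - 1
print(f"~{extra:+.1%} dynamic power for a +3% clock")  # roughly +14%
```

A few percent of clock bought this way eats a double-digit percentage of power, which is one plausible reading of that +65 W TDP.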

Nvidia is going to sell this card to everyone who didn't buy 4070ti Super, basically, just stalling at virtually the same level of performance right there, but calling it new.

It's an odd pacing, at least as far back as I've had my head dipped in things.
  • Pascal (10) was a breakout hit and was priced insanely well across the board, seeing volumes of 1060s, 1070s, and 1080/1080Tis moving. Even the 1050Ti sold like hotcakes as a slot-only card.
  • Turing (20/16) came out sorely overpriced and underperformed due to it being mostly just an experimentalist platform for 1st gen RT and matrix math. The 16 series cut the crap and focused on price appeal, to much success with SIs and OEMs. Seen as an afterthought nowadays...
  • Ampere (30) was, like the 10 series, so good that literally everything sold. Again. Even the RTX 3050, a laughably bad model in the lineup, was moving units. Everyone wanted a piece of that fresh and functional 2nd gen RT and the true fruit of all those fancy arch additions—DLSS. Good DLSS. It was the crypto boom that ruined prices after launch.
  • Ada (40) follows the same leapfrogging pattern that was forming the last few generations, being an overpriced, underperforming software showcase for special first-adopters only. DLSS framegen, ray reconstruction, AI toolkit—yadda yadda no one friggin' cares.
Naturally, we should expect the 50 series to be a return to form for Nvidia, another lineup of home-runs that makes everyone happy, but the forecast isn't looking great. Higher power draw, barely improving on shader count from Ada/Ampere, and a lame leg for VRAM on the lower end. They're putting less and less into impressing across the board, and putting more and more into the ultra-mega-super-expensive chips from the 4090 to the—what, B6000? A6000 Blackwell?

The margins are getting to their head, and the shareholders are only gonna stop huffing nitrous when the stock price eventually, inevitably, crashes to what Nvidia is ACTUALLY worth as an asset.
What? Ampere was good? What are you talking about? It was heavily underspecced and overpriced, and baked on the worst node we've seen on Nvidia cards in decades. The worst perf/W as well, which is a big reason why Ada looks so good now. If you just look at the hardware in isolation (not the pricing; mining at the time did inflate things badly, and Nvidia positioned the MSRPs quite alright initially), it's really not a good generation at all. What sold cards at the time was mining, and it barely left scraps for gamers. I think you're right about DLSS. But that's not really related to the GPUs, is it; it's artificial segmentation. I think people would have gobbled up the GPUs anyway.

10GB on a 3080 (the most interesting card in the lineup; everything else was overpriced or halfway obsolete on release, see x60/x70 performance in stuff like Hogwarts for example) was a fucking joke, and the lower tiers were nothing better. Double-VRAM cards came too late and looked strange relative to core power, so basically one half of the stack was underspecced and the replacements were overkill on VRAM. The whole release of Ampere was a horrible mess, and the failure rate wasn't exactly the lowest either. I think Ampere may go down in history as the generation with the lowest useable lifetime of Nvidia's GPU stacks, all things considered. What also didn't and still doesn't help is the way games have developed: new engine features simply slaughter these GPUs. Ampere has already been plagued with issues in various games. To illustrate, a 4090 is almost twice the GPU that a 3090 is at this moment, and that's all core/shader performance; neither card is VRAM-constrained.
 
Last edited:
Joined
Dec 14, 2011
Messages
1,087 (0.23/day)
Location
South-Africa
Processor AMD Ryzen 9 5900X
Motherboard ASUS ROG STRIX B550-F GAMING (WI-FI)
Cooling Noctua NH-D15 G2
Memory 32GB G.Skill DDR4 3600Mhz CL18
Video Card(s) ASUS GTX 1650 TUF
Storage SAMSUNG 990 PRO 2TB
Display(s) Dell S3220DGF
Case Corsair iCUE 4000X
Audio Device(s) ASUS Xonar D2X
Power Supply Corsair AX760 Platinum
Mouse Razer DeathAdder V2 - Wireless
Keyboard Corsair K70 PRO - OPX Linear Switches
Software Microsoft Windows 11 - Enterprise (64-bit)
My 4070Ti smokes my 3070Ti in every possible way. Lots of guys hate Nvidia, and that's ok :)

Even the 3070Ti is a powerful GPU, the problem is, hit that VRAM ceiling and you might as well be rocking a GTX1060.
 
Joined
Jul 31, 2024
Messages
186 (1.27/day)
Processor AMD Ryzen 7 5700X
Motherboard ASUS ROG Strix B550-F Gaming Wifi II
Cooling Noctua NH-U12S Redux
Memory 4x8G Teamgroup Vulcan Z DDR4; 3600MHz @ CL18
Video Card(s) MSI Ventus 2X GeForce RTX 3060 12GB
Storage WD_Black SN770, Leven JPS600, Toshiba DT01ACA
Display(s) Samsung ViewFinity S6
Case Fractal Design Pop Air TG
Power Supply Corsair CX750M
Mouse Corsair Harpoon RGB
Keyboard Keychron C2 Pro
VR HMD Valve Index
What? Ampere was good? What are you talking about? It was heavily underspecced and overpriced, and baked on the worst node we've seen on Nvidia cards in decades. The worst perf/W as well, which is a big reason why Ada looks so good now. If you just look at the hardware in isolation (not the pricing; mining at the time did inflate things badly, and Nvidia positioned the MSRPs quite alright initially), it's really not a good generation at all. What sold cards at the time was mining, and it barely left scraps for gamers. I think you're right about DLSS. But that's not really related to the GPUs, is it; it's artificial segmentation. I think people would have gobbled up the GPUs anyway.

10GB on a 3080 (the most interesting card in the lineup; everything else was overpriced or halfway obsolete on release, see x60/x70 performance in stuff like Hogwarts for example) was a fucking joke, and the lower tiers were nothing better. Double-VRAM cards came too late and looked strange relative to core power, so basically one half of the stack was underspecced and the replacements were overkill on VRAM. The whole release of Ampere was a horrible mess, and the failure rate wasn't exactly the lowest either.
In isolation, yeah, it wasn't the same "OMFG GUYS BUY IT NOW" goodness of Pascal, but it did what people wanted at the time—something better than Turing (and more importantly, much better than Pascal) for a price they could justify. VRAM back then was less of a reason to hem and haw about a GPU; no one expected to need more than 8GB to run games at the settings they wanted, and at the time they were right.

Even the most demanding titles barely tickled the 8-gig barrier in the 30 series era, and if you were exceeding it, it was because you were trying to play 4K High/Ultra on games that just came out. That was barely in reach for the 3080, nevermind a 3070 (or 3060Ti, which iirc sold way better). The only reason it's an issue now is because easy-bake 'cinematic quality' has become a trend while cinematic framerates and studio-spec GPUs have not.

As for the failure rate debacle, I hardly remember anything of the sort (and what inklings I do have point to it being specifically problems with the first batch or two of boards). I do remember the power draw: Ampere guzzled gas and put out heat, but it was worth it for what ya got, and you could usually squeeze much better voltages or more worthwhile performance out of a card if you gave even a fart about it, speaking from experience.
 
Joined
Sep 17, 2014
Messages
22,673 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
no one expected to need more than 8GB to run games at the settings they wanted, and at the time they were right.
This is the story of Nvidia buyers' lives. I was one of them. Only the 1080 with a rich 8GB was a good balance in my history of Nvidia cards. And the 780 Ti with 3GB. The rest? Underspecced, or put differently, badly balanced: a VRAM shortage will always make them go obsolete before core power does.

'No one expected' is the wrong term for it, imho. They were warned but ignored it, or took the view that they would upgrade before that moment occurs anyway. But the latter is of course cognitive dissonance: if you had no need, you wouldn't upgrade; they simply pre-empted their disappointment with a lowered expectation. This is the 'money is no object' crowd that then proceeds to complain about pricing and buys it anyway.
 
Joined
Jan 14, 2019
Messages
12,582 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
What? Ampere was good? What are you talking about? It was heavily underspecced and overpriced, and baked on the worst node we've seen on Nvidia cards in decades. The worst perf/W as well, which is a big reason why Ada looks so good now. If you just look at the hardware in isolation (not the pricing; mining at the time did inflate things badly, and Nvidia positioned the MSRPs quite alright initially), it's really not a good generation at all. What sold cards at the time was mining, and it barely left scraps for gamers. I think you're right about DLSS. But that's not really related to the GPUs, is it; it's artificial segmentation. I think people would have gobbled up the GPUs anyway.

10GB on a 3080 (the most interesting card in the lineup; everything else was overpriced or halfway obsolete on release, see x60/x70 performance in stuff like Hogwarts for example) was a fucking joke, and the lower tiers were nothing better. Double-VRAM cards came too late and looked strange relative to core power, so basically one half of the stack was underspecced and the replacements were overkill on VRAM. The whole release of Ampere was a horrible mess, and the failure rate wasn't exactly the lowest either. I think Ampere may go down in history as the generation with the lowest useable lifetime of Nvidia's GPU stacks, all things considered. What also didn't and still doesn't help is the way games have developed: new engine features simply slaughter these GPUs. Ampere has already been plagued with issues in various games. To illustrate, a 4090 is almost twice the GPU that a 3090 is at this moment, and that's all core/shader performance; neither card is VRAM-constrained.
Ampere was supposed to be good based on launch slides, but then, Nvidia ended up doubling shader count without actually doubling shader count (by calling half of them "dual issue"). Combine that with the fact that we didn't see a single model sold anywhere near MSRP through the entire life cycle of the generation, and we have the complete dumpster fire we call Ampere.

It's only exacerbated by the fact that Nvidia went full retard with prices on Ada, thinking that if gullible gamers are willing to pay thousands for a graphics card just because it comes in a green box, then maybe they should charge that much. The saddest part of it all is that time proved Nvidia right.
 
Joined
Jul 31, 2024
Messages
186 (1.27/day)
Processor AMD Ryzen 7 5700X
Motherboard ASUS ROG Strix B550-F Gaming Wifi II
Cooling Noctua NH-U12S Redux
Memory 4x8G Teamgroup Vulcan Z DDR4; 3600MHz @ CL18
Video Card(s) MSI Ventus 2X GeForce RTX 3060 12GB
Storage WD_Black SN770, Leven JPS600, Toshiba DT01ACA
Display(s) Samsung ViewFinity S6
Case Fractal Design Pop Air TG
Power Supply Corsair CX750M
Mouse Corsair Harpoon RGB
Keyboard Keychron C2 Pro
VR HMD Valve Index
This is the story of Nvidia buyers' lives. I was one of them. Only the 1080 with a rich 8GB was a good balance in my history of Nvidia cards. And the 780 Ti with 3GB. The rest? Underspecced, or put differently, badly balanced: a VRAM shortage will always make them go obsolete before core power does.

'No one expected' is the wrong term for it, imho. They were warned but ignored it, or took the view that they would upgrade before that moment occurs anyway. But the latter is of course cognitive dissonance: if you had no need, you wouldn't upgrade; they simply pre-empted their disappointment with a lowered expectation. This is the 'money is no object' crowd that then proceeds to complain about pricing and buys it anyway.
Fair enough, lol. I picked up the 3060 12GB in '22 for $370 because it was the cheapest 12-gig card I knew about and I wanted that extra legroom for VRChat publics and super duper texture quality. The rest of my settings can stay at a happy Medium until I get a 7800XT/equivalent for 4 hundo or less. Fingers crossed Intel or AMD makes my wish come true next year.

I'm also making good on buying the DLSS-majigger now that I'm at 1440p, too. Cyberpunk doesn't look that bad on Ultra Quality.
 
Joined
Jan 8, 2017
Messages
9,505 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
I don't really understand what Nvidia is doing
Nothing; the answer is they're pretty much doing nothing. This is somewhere between a half-baked node shrink and a refresh. They've got most of the market share, so there's no reason for them to make significantly better products: they know there just isn't that much more of the market they can capture, and there's also not much more they can milk out of their existing customers.
 
Joined
Jan 8, 2021
Messages
10 (0.01/day)
The comparisons in this article are messed up. The 4070 Ti was superseded by the 4070 Ti Super and has been discontinued, so there's no point mentioning or comparing to the 4070 Ti non-Super.
Yet it just mentions 4070 Ti, while mixing specs from the non-Super (12 GB VRAM) and Super (8448 CUDA cores, the non-Super had 7680).
So in fact, it does not have more VRAM, because the 4070 Ti Super already had 16 GB.
 
Joined
Jan 18, 2020
Messages
835 (0.46/day)
It's what happens when they're able to print money with the LLM GPUs. No incentive for them to try on the consumer side; total complacency, like Intel after 2006. It remains to be seen whether the data centre side holds up, given there are no decent front-end, mass-market paid use cases for any of it yet. Or whether AMD or another competitor can come up with something better on the consumer side.

Seems tech stagnation will go on for a while yet.
 
Joined
Dec 31, 2020
Messages
1,004 (0.69/day)
Processor E5-4627 v4
Motherboard VEINEDA X99
Memory 32 GB
Video Card(s) 2080 Ti
Storage NE-512
Display(s) G27Q
Case DAOTECH X9
Power Supply SF450
The RTX 5070 Ti is very special and brings 4090 performance down to $800. At least at 1440p, it should land much closer to the 4090 than to the 4080.
 
Joined
Feb 18, 2013
Messages
2,186 (0.51/day)
Location
Deez Nutz, bozo!
System Name Rainbow Puke Machine :D
Processor Intel Core i5-11400 (MCE enabled, PL removed)
Motherboard ASUS STRIX B560-G GAMING WIFI mATX
Cooling Corsair H60i RGB PRO XT AIO + HD120 RGB (x3) + SP120 RGB PRO (x3) + Commander PRO
Memory Corsair Vengeance RGB RT 2 x 8GB 3200MHz DDR4 C16
Video Card(s) Zotac RTX2060 Twin Fan 6GB GDDR6 (Stock)
Storage Corsair MP600 PRO 1TB M.2 PCIe Gen4 x4 SSD
Display(s) LG 29WK600-W Ultrawide 1080p IPS Monitor (primary display)
Case Corsair iCUE 220T RGB Airflow (White) w/Lighting Node CORE + Lighting Node PRO RGB LED Strips (x4).
Audio Device(s) ASUS ROG Supreme FX S1220A w/ Savitech SV3H712 AMP + Sonic Studio 3 suite
Power Supply Corsair RM750x 80 Plus Gold Fully Modular
Mouse Corsair M65 RGB FPS Gaming (White)
Keyboard Corsair K60 PRO RGB Mechanical w/ Cherry VIOLA Switches
Software Windows 11 Professional x64 (Update 23H2)
I have a gut feeling it will be an $850 or $900 card.
 
Joined
Sep 6, 2013
Messages
3,392 (0.82/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 7600 / Ryzen 5 4600G / Ryzen 5 5500
Motherboard X670E Gaming Plus WiFi / MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2)
Cooling Aigo ICE 400SE / Segotep T4 / Νoctua U12S
Memory Kingston FURY Beast 32GB DDR5 6000 / 16GB JUHOR / 32GB G.Skill RIPJAWS 3600 + Aegis 3200
Video Card(s) ASRock RX 6600 + GT 710 (PhysX) / Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes / NVMes, SATA Storage / NVMe, SATA, external storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) / 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
I never thought I'd say this... THANK GOD FOR INTEL and their decision to make 8GB VRAM a thing of the past! 12GB VRAM for just $250 is a thing of beauty that the GPU market so desperately needed.
There are the A770 and RTX 3060, which were already selling at $250 or lower with 12-16GB of VRAM. But you are right that putting a $250 MSRP on a 12GB card, and not having to wait months for the price to slide down to those levels, is a step in the right direction.
Also, with AMD not looking only at Nvidia's pricing but also having to worry about Intel's, that could help them ignore their fears and start with lower MSRPs. Until today they were afraid that a much lower MSRP for their cards compared to Nvidia's equivalents would trigger a response from Nvidia. Now they will have to think differently if they don't want to get squeezed from two directions.
 
Joined
Oct 22, 2014
Messages
14,170 (3.81/day)
Location
Sunshine Coast
System Name H7 Flow 2024
Processor AMD 5800X3D
Motherboard Asus X570 Tough Gaming
Cooling Custom liquid
Memory 32 GB DDR4
Video Card(s) Intel ARC A750
Storage Crucial P5 Plus 2TB.
Display(s) AOC 24" Freesync 1m.s. 75Hz
Mouse Lenovo
Keyboard Eweadn Mechanical
Software W11 Pro 64 bit
350W... No thanks.
Make it 250W maximum.
 

freeagent

Moderator
Staff member
Joined
Sep 16, 2018
Messages
8,861 (3.87/day)
Location
Winnipeg, Canada
Processor AMD R7 5800X3D
Motherboard Asus Crosshair VIII Dark Hero
Cooling Thermalright Frozen Edge 360, 3x TL-B12 V2, 2x TL-B12 V1
Memory 2x8 G.Skill Trident Z Royal 3200C14, 2x8GB G.Skill Trident Z Black and White 3200 C14
Video Card(s) Zotac 4070 Ti Trinity OC
Storage WD SN850 1TB, SN850X 2TB, SN770 1TB
Display(s) LG 50UP7100
Case Fractal Torrent Compact
Audio Device(s) JBL Bar 700
Power Supply Seasonic Vertex GX-1000, Monster HDP1800
Mouse Logitech G502 Hero
Keyboard Logitech G213
VR HMD Oculus 3
Software Yes
Benchmark Scores Yes
You guys that are jumping on me are like children when it comes to hardware. Like oh my god you guys.

Buy what makes you happy, don't worry about what I am doing.
 
Joined
Nov 2, 2016
Messages
122 (0.04/day)
Judging by the specs I'd say the GPU will be at $699 or $749
But judging by the manufacturer it's more likely to be closer to $900. :)

You guys that are jumping on me are like children when it comes to hardware. Like oh my god you guys.

Buy what makes you happy, don't worry about what I am doing.
I think they're jumping on you like children because of the "whoever doesn't agree with me has an AMD card" remark. That's the call for playmates at this young age. So you asked for playmates and you got playmates; stop complaining about it, any other child would be super happy :).
 
Last edited:

freeagent

Moderator
Staff member
Joined
Sep 16, 2018
Messages
8,861 (3.87/day)
Location
Winnipeg, Canada
Processor AMD R7 5800X3D
Motherboard Asus Crosshair VIII Dark Hero
Cooling Thermalright Frozen Edge 360, 3x TL-B12 V2, 2x TL-B12 V1
Memory 2x8 G.Skill Trident Z Royal 3200C14, 2x8GB G.Skill Trident Z Black and White 3200 C14
Video Card(s) Zotac 4070 Ti Trinity OC
Storage WD SN850 1TB, SN850X 2TB, SN770 1TB
Display(s) LG 50UP7100
Case Fractal Torrent Compact
Audio Device(s) JBL Bar 700
Power Supply Seasonic Vertex GX-1000, Monster HDP1800
Mouse Logitech G502 Hero
Keyboard Logitech G213
VR HMD Oculus 3
Software Yes
Benchmark Scores Yes
your kindergarten level remark about "whoever doesn't agree with me has an AMD card"
I did no such thing.

I use the hardware that pleases me, I don't give a fuck about the name on the box.

Edit:

How well do AMD GPUs run F@H?

Like shit, that is one of the reasons I am not running one.
 