Monday, December 16th 2024

NVIDIA GeForce RTX 5070 Ti Leak Tips More VRAM, Cores, and Power Draw

It's an open secret by now that NVIDIA's GeForce RTX 5000 series GPUs are on the way, with an early 2025 launch on the cards. Now, preliminary details about the RTX 5070 Ti have leaked, revealing an increase in both VRAM and TDP and suggesting that the new upper mid-range GPU will finally address the increased VRAM demands of modern games. According to the leak from Wccftech, the RTX 5070 Ti will have 16 GB of GDDR7 VRAM, up from 12 GB on the RTX 4070 Ti, as we previously speculated. The new sources also corroborate earlier reports that the 5070 Ti will use the cut-down GB203 chip, although this leak points to a significantly higher TBP of 350 W. The new memory configuration will supposedly sit on a 256-bit memory bus running at 28 Gbps, for a total memory bandwidth of 896 GB/s, a significant boost over the RTX 4070 Ti.
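For those keeping score, the quoted 896 GB/s follows directly from the leaked bus width and data rate. A rough sanity check of the arithmetic, using the RTX 4070 Ti's published 192-bit, 21 Gbps configuration as the comparison point:

```python
# Rough sanity check of the leaked memory bandwidth figure.
# Peak bandwidth (GB/s) = bus width (bits) / 8 * per-pin data rate (Gbps)

def mem_bandwidth(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

rtx_5070_ti = mem_bandwidth(256, 28)  # leaked GDDR7 configuration
rtx_4070_ti = mem_bandwidth(192, 21)  # published GDDR6X configuration

print(f"RTX 5070 Ti: {rtx_5070_ti:.0f} GB/s")          # 896 GB/s
print(f"RTX 4070 Ti: {rtx_4070_ti:.0f} GB/s")          # 504 GB/s
print(f"Uplift: {rtx_5070_ti / rtx_4070_ti - 1:.0%}")  # 78%
```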

Supposedly, the RTX 5070 Ti will also see a bump in total CUDA cores, from 7680 in the RTX 4070 Ti to 8960 in the RTX 5070 Ti. The RTX 5070 Ti will also switch to the 12V-2x6 power connector, the revised version of the 16-pin 12VHPWR connector used by the 4070 Ti. NVIDIA is expected to announce the RTX 5000 series graphics cards at CES 2025 in early January, but the RTX 5070 Ti will supposedly be the third card in the 5000-series launch cycle. That said, leaks suggest that the 5070 Ti will still launch in Q1 2025, meaning we may see an indication of specs at CES 2025, although pricing is still unclear.
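Taking the leak at face value, the generational deltas against the RTX 4070 Ti's published specs (7680 cores, 285 W TBP, 12 GB) work out roughly as follows:

```python
# Leaked RTX 5070 Ti figures vs. the RTX 4070 Ti's published specs,
# taking the leak at face value.
leaked_rtx_5070_ti = {"cuda_cores": 8960, "tbp_w": 350, "vram_gb": 16}
rtx_4070_ti        = {"cuda_cores": 7680, "tbp_w": 285, "vram_gb": 12}

for spec, new in leaked_rtx_5070_ti.items():
    old = rtx_4070_ti[spec]
    print(f"{spec}: {old} -> {new} ({(new - old) / old:+.0%})")
# cuda_cores: 7680 -> 8960 (+17%)
# tbp_w: 285 -> 350 (+23%)
# vram_gb: 12 -> 16 (+33%)
```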

Update Dec 16th: Prolific hardware leaker Kopite7kimi has since responded to the RTX 5070 Ti leaks, stating that 350 W may be on the higher end for the RTX 5070 Ti: "...the latest data shows 285W. However, 350W is also one of the configs." This suggests that a 350 W TBP is possible, though perhaps only on certain board-partner models, if competition forces NVIDIA's hand, or in certain boost scenarios.
Sources: Wccftech, Kopite7kimi on X

160 Comments on NVIDIA GeForce RTX 5070 Ti Leak Tips More VRAM, Cores, and Power Draw

#51
Nostras
> Supposedly, the RTX 5070 Ti will also see a bump in total CUDA cores, from 8448 in the RTX 4070 Ti to 8960 in the RTX 5070 Ti.

@Cpt.Jank The 4070Ti Super has 8448 cores, the 4070Ti has 7680 cores.

Upgrade seems a bit mid. If we compare it against the 4070 Ti (and assume it's similarly priced), the uplift should be less than 25%.
The additional VRAM is nice but the extra memory speed is a bit of a meme.

Does make it somewhat comparable to a 4080 though.

But I think this thing will get an MSRP of 850 USD.
Posted on Reply
#52
3valatzy
Only 6% more shaders means roughly that much higher performance. As time goes by, the 16 GB framebuffer will become a bottleneck, so no matter the other improvements, it will hit a performance wall.
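That ~6% presumably comes from comparing against the 4070 Ti Super's 8448 cores rather than the non-Super's 7680; a quick check:

```python
# Quick check on the comparison base: the ~6% figure matches the
# RTX 4070 Ti Super (8448 cores), not the 4070 Ti (7680 cores).
leaked_cores = 8960
print(f"vs. 4070 Ti Super: {leaked_cores / 8448 - 1:+.1%}")  # +6.1%
print(f"vs. 4070 Ti:       {leaked_cores / 7680 - 1:+.1%}")  # +16.7%
```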
StimpsonJCat: I never thought I'd say this... THANK GOD FOR INTEL and their decision to make 8GB VRAM a thing of the past! 12GB VRAM for just $250 is a thing of beauty that the GPU market so desperately needed.

24GB on a 5080 should now be a thing, nGreedia, as it should have been all along. But I won't hold my breath.
Even if it's not exactly 24 GB, then 18 GB, 20 GB or 22 GB would definitely be better.
Posted on Reply
#53
xtreemchaos
New shiny thing, eh? I'm not falling for it until it gets dull.
Posted on Reply
#54
AusWolf
yfn_ratchet: Ehhh... I wouldn't get too excited. B300 will probably be a 6/8GB lineup, but then again A300 was the ultra-budget/transcoding/SFF series so it's not much of a tragedy.
I'm more excited to see what B700 has in store for us (and RDNA 4 for that matter). RTX 5000 will most probably be overpriced, just like everything Nvidia since the 3000 or maybe even 2000 series.
Posted on Reply
#55
Tigerfox
@Cpt.Jank: You mixed up the specs of the 4070 Ti and 4070 Ti Super. The Ti had 60 SMs (7,680 cores) and a 192-bit memory interface; the Ti Super had 66 SMs (8,448 cores) and 256-bit.

The "leaked" specs have been leaked for about the third time, so no surprise at all, just nearly a confirmation. The card seems to be no big bump at all, with only 6% more cores, 33% more bandwith and 65W TDP (the 4070TiS was not TDP-limited at all!). If core clock doesn't go up quite a bit, which I doubt given the nearly identical fabrication process, increased performance will have to come mainly from architectural improvements or DLSS4.x. It is the same with 5080 and 5070, which offer similarly small spec-bumps over their predecessors.

If they don't cost more, I don't care, but I doubt that.
Posted on Reply
#56
yfn_ratchet
AusWolf: I'm more excited to see what B700 has in store for us (and RDNA 4 for that matter). RTX 5000 will most probably be overpriced, just like everything Nvidia since the 3000 or maybe even 2000 series.
It's an odd pacing, at least as far back as I've had my head dipped in things.
  • Pascal (10) was a breakout hit and was priced insanely well across the board, seeing volumes of 1060s, 1070s, and 1080/1080Tis moving. Even the 1050Ti sold like hotcakes as a slot-only card.
  • Turing (20/16) came out sorely overpriced and underperformed due to it being mostly just an experimentalist platform for 1st gen RT and matrix math. The 16 series cut the crap and focused on price appeal, to much success with SIs and OEMs. Seen as an afterthought nowadays...
  • Ampere (30) was, like the 10 series, so good that literally everything sold. Again. Even the RTX 3050, a laughably bad model in the lineup, was moving units. Everyone wanted a piece of that fresh and functional 2nd gen RT and the true fruit of all those fancy arch additions—DLSS. Good DLSS. It was the crypto boom that ruined prices after launch.
  • Ada (40) follows the same leapfrogging pattern that was forming the last few generations, being an overpriced, underperforming software showcase for special first-adopters only. DLSS framegen, ray reconstruction, AI toolkit—yadda yadda no one friggin' cares.
Naturally, we should expect the 50 series to be a return to form for Nvidia, another lineup of home-runs that makes everyone happy, but the forecast isn't looking great. Higher power draw, barely improving on shader count from Ada/Ampere, and a lame leg for VRAM on the lower end. They're putting less and less into impressing across the board, and putting more and more into the ultra-mega-super-expensive chips from the 4090 to the—what, B6000? A6000 Blackwell?

The margins are getting to their head, and the shareholders are only gonna stop huffing nitrous when the stock price eventually, inevitably, crashes to what Nvidia is ACTUALLY worth as an asset.
Posted on Reply
#57
Mack4285
So the TDP is going up a bit compared to 4070ti? That always seems like the wrong direction to me.
Posted on Reply
#58
Vayra86
freeagent: I don't feel like I got screwed. I feel like money doesn't buy what it used to though.
Right, you're totally not annoyed with having 12GB on what is essentially a 16GB card, signed Nvidia, because they released a 4070ti Super that's a lot better at almost the same cost.

You don't feel screwed? And now you're ready to jump on the successor, again, too early and paying far too much, and you'll still be saying 'you don't get what you used to for your money'? If you had waited a few months and gone 4070 Ti Super in the last round, there would have been ZERO reason to even put this 5070 Ti on your radar to begin with. You'll be gaining what, 10-15% perf and 4 GB VRAM; it's almost a sidegrade.

It's strange how that logic works wrt spending on hardware, honestly. Just be honest, the 4070 Ti was a shit buy. I'm going to be honest too: I regret buying the 7900 XT at the price it had at the time, 250 bucks over what it costs now. Worst timing I've had on a GPU. Could've had a free B580 for that today.

We should do better :)
Mack4285: So the TDP is going up a bit compared to 4070ti? That always seems like the wrong direction to me.
Same node, a few more shaders and a higher voltage to extract another 3% you could've had from an OC yourself.

Nvidia is going to sell this card to everyone who didn't buy 4070ti Super, basically, just stalling at virtually the same level of performance right there, but calling it new.
yfn_ratchet: It's an odd pacing, at least as far back as I've had my head dipped in things.
  • Pascal (10) was a breakout hit and was priced insanely well across the board, seeing volumes of 1060s, 1070s, and 1080/1080Tis moving. Even the 1050Ti sold like hotcakes as a slot-only card.
  • Turing (20/16) came out sorely overpriced and underperformed due to it being mostly just an experimentalist platform for 1st gen RT and matrix math. The 16 series cut the crap and focused on price appeal, to much success with SIs and OEMs. Seen as an afterthought nowadays...
  • Ampere (30) was, like the 10 series, so good that literally everything sold. Again. Even the RTX 3050, a laughably bad model in the lineup, was moving units. Everyone wanted a piece of that fresh and functional 2nd gen RT and the true fruit of all those fancy arch additions—DLSS. Good DLSS. It was the crypto boom that ruined prices after launch.
  • Ada (40) follows the same leapfrogging pattern that was forming the last few generations, being an overpriced, underperforming software showcase for special first-adopters only. DLSS framegen, ray reconstruction, AI toolkit—yadda yadda no one friggin' cares.
Naturally, we should expect the 50 series to be a return to form for Nvidia, another lineup of home-runs that makes everyone happy, but the forecast isn't looking great. Higher power draw, barely improving on shader count from Ada/Ampere, and a lame leg for VRAM on the lower end. They're putting less and less into impressing across the board, and putting more and more into the ultra-mega-super-expensive chips from the 4090 to the—what, B6000? A6000 Blackwell?

The margins are getting to their head, and the shareholders are only gonna stop huffing nitrous when the stock price eventually, inevitably, crashes to what Nvidia is ACTUALLY worth as an asset.
What? Ampere was good? What are you talking about? It was heavily underspecced and overpriced and baked on the worst node we've seen on Nvidia cards in decades. The worst perf/W as well, which is a big reason why Ada looks so good now. If you just look at the hardware in isolation (not the pricing; it was indeed mining at the time that inflated things badly, Nvidia positioned the MSRPs quite alright initially), it's really not a good generation at all. What sold cards at the time was mining, and it barely left scraps for gamers. I think you're right about DLSS. But that's not really related to the GPUs, is it? It's artificial segmentation. I think people would have gobbled up the GPUs anyway.

10GB on a 3080 (the most interesting card in the lineup; everything else was overpriced or halfway obsolete on release, see x60/x70 performance in stuff like Hogwarts for example) was a fucking joke, and the lower tiers were no better. Double-VRAM cards came too late and looked strange relative to core power, so basically one half of the stack was underspecced and the replacements were overkill on VRAM. The whole release of Ampere was a horrible mess, and the failure rate wasn't exactly the lowest either. I think Ampere may go down in history as the generation with the lowest usable lifetime of Nvidia's GPU stacks, all things considered. What also didn't and still doesn't help it is the development in games, where new engine developments just simply slaughter these GPUs. Ampere has been plagued with issues in various games already. To illustrate, a 4090 is almost twice the GPU that a 3090 is at this moment, and that's all core/shader performance; neither card is VRAM constrained.
Posted on Reply
#59
Legacy-ZA
freeagent: My 4070Ti smokes my 3070Ti in every possible way. Lots of guys hate Nvidia, and that's ok :)
Even the 3070 Ti is a powerful GPU; the problem is, hit that VRAM ceiling and you might as well be rocking a GTX 1060.
Posted on Reply
#60
yfn_ratchet
Vayra86: What? Ampere was good? What are you talking about? It was heavily underspecced and overpriced and baked on the worst node we've seen on Nvidia cards in decades. The worst perf/W as well, which is a big reason why Ada looks so good now. If you just look at the hardware in isolation (not the pricing; it was indeed mining at the time that inflated things badly, Nvidia positioned the MSRPs quite alright initially), it's really not a good generation at all. What sold cards at the time was mining, and it barely left scraps for gamers. I think you're right about DLSS. But that's not really related to the GPUs, is it? It's artificial segmentation. I think people would have gobbled up the GPUs anyway.

10GB on a 3080 (the most interesting card in the lineup; everything else was overpriced or halfway obsolete on release, see x60/x70 performance in stuff like Hogwarts for example) was a fucking joke, and the lower tiers were no better. Double-VRAM cards came too late and looked strange relative to core power, so basically one half of the stack was underspecced and the replacements were overkill on VRAM. The whole release of Ampere was a horrible mess, and the failure rate wasn't exactly the lowest either.
In isolation, yeah, it wasn't the same "OMFG GUYS BUY IT NOW" goodness of Pascal, but it did what people wanted at the time—something better than Turing (and more importantly, much better than Pascal) for a price they could justify. VRAM back then was less of a reason to hem and haw about a GPU; no one expected to need more than 8GB to run games at the settings they wanted, and at the time they were right.

Even the most demanding titles barely tickled the 8-gig barrier in the 30 series era, and if you were exceeding it, it was because you were trying to play 4K High/Ultra on games that just came out. That was barely in reach for the 3080, nevermind a 3070 (or 3060Ti, which iirc sold way better). The only reason it's an issue now is because easy-bake 'cinematic quality' has become a trend while cinematic framerates and studio-spec GPUs have not.

As for the failure rate debacle, I hardly remember anything of the sort (and what inklings I do have tell me it was specifically problems with the first batch or two of boards). I do remember the power draw; Ampere guzzled gas and put out heat, but it was worth it for what ya got, and you could usually squeeze much better voltages or more worthwhile performance out of a card if you gave even a fart about it, speaking from experience.
Posted on Reply
#61
kondamin
OK, now name it 5070 and scrap the 'Ti'; keep that for a 2026 release.
Posted on Reply
#62
Vayra86
yfn_ratchet: no one expected to need more than 8GB to run games at the settings they wanted, and at the time they were right.
This is the story of Nvidia buyers' lives. I was one of them. Only the 1080 with a rich 8GB was a good balance in my history of Nvidia cards. And the 780ti with 3GB. The rest? Underspecced, or put differently, badly balanced; VRAM shortage will always make them go obsolete before core power does.

'No one expected' is the wrong term for it imho. They were warned but ignored it, or took the view that they would upgrade before that moment occurs anyway. But the latter is of course cognitive dissonance: if you had no need, you wouldn't upgrade; they simply pre-empted their disappointment with a lowered expectation. This is the 'money is no object' crowd that then proceeds to complain about pricing and buys it anyway.
Posted on Reply
#63
AusWolf
Vayra86: What? Ampere was good? What are you talking about? It was heavily underspecced and overpriced and baked on the worst node we've seen on Nvidia cards in decades. The worst perf/W as well, which is a big reason why Ada looks so good now. If you just look at the hardware in isolation (not the pricing; it was indeed mining at the time that inflated things badly, Nvidia positioned the MSRPs quite alright initially), it's really not a good generation at all. What sold cards at the time was mining, and it barely left scraps for gamers. I think you're right about DLSS. But that's not really related to the GPUs, is it? It's artificial segmentation. I think people would have gobbled up the GPUs anyway.

10GB on a 3080 (the most interesting card in the lineup; everything else was overpriced or halfway obsolete on release, see x60/x70 performance in stuff like Hogwarts for example) was a fucking joke, and the lower tiers were no better. Double-VRAM cards came too late and looked strange relative to core power, so basically one half of the stack was underspecced and the replacements were overkill on VRAM. The whole release of Ampere was a horrible mess, and the failure rate wasn't exactly the lowest either. I think Ampere may go down in history as the generation with the lowest usable lifetime of Nvidia's GPU stacks, all things considered. What also didn't and still doesn't help it is the development in games, where new engine developments just simply slaughter these GPUs. Ampere has been plagued with issues in various games already. To illustrate, a 4090 is almost twice the GPU that a 3090 is at this moment, and that's all core/shader performance; neither card is VRAM constrained.
Ampere was supposed to be good based on launch slides, but then, Nvidia ended up doubling shader count without actually doubling shader count (by calling half of them "dual issue"). Combine that with the fact that we didn't see a single model sold anywhere near MSRP through the entire life cycle of the generation, and we have the complete dumpster fire we call Ampere.

It's only exaggerated by the fact that Nvidia went full retard with prices on Ada, thinking that if gullible gamers are willing to pay thousands for a graphics card just because it comes in a green box, then maybe they should. The saddest part of it all is that time proved Nvidia right.
Posted on Reply
#64
yfn_ratchet
Vayra86: This is the story of Nvidia buyers' lives. I was one of them. Only the 1080 with a rich 8GB was a good balance in my history of Nvidia cards. And the 780ti with 3GB. The rest? Underspecced, or put differently, badly balanced; VRAM shortage will always make them go obsolete before core power does.

'No one expected' is the wrong term for it imho. They were warned but ignored it, or took the view that they would upgrade before that moment occurs anyway. But the latter is of course cognitive dissonance: if you had no need, you wouldn't upgrade; they simply pre-empted their disappointment with a lowered expectation. This is the 'money is no object' crowd that then proceeds to complain about pricing and buys it anyway.
Fair enough, lol. I picked up the 3060 12GB in '22 for $370 because it was the cheapest 12-gig card I knew about and I wanted that extra legroom for VRChat publics and super duper texture quality. The rest of my settings can stay at a happy Medium until I get a 7800XT/equivalent for 4 hundo or less. Fingers crossed Intel or AMD makes my wish come true next year.

I'm also making good on buying the DLSS-majigger now that I'm at 1440p, too. Cyberpunk doesn't look that bad on Ultra Quality.
Posted on Reply
#65
Vya Domus
Scircura: I don't really understand what Nvidia is doing
Nothing, the answer is they're pretty much doing nothing. This is somewhere between a half-baked node shrink and a refresh. They've got most of the market share, so now there's no reason for them to make significantly better products: they know there just isn't that much more of the market they can capture, and there's also not much more they can milk out of their existing customers.
Posted on Reply
#66
dismuter
The comparisons in this article are messed up. The 4070 Ti was superseded by the 4070 Ti Super and has been discontinued, so there's no point mentioning or comparing to the 4070 Ti non-Super.
Yet it just mentions 4070 Ti, while mixing specs from the non-Super (12 GB VRAM) and Super (8448 CUDA cores, the non-Super had 7680).
So in fact, it does not have more VRAM, because the 4070 Ti Super already had 16 GB.
Posted on Reply
#67
mb194dc
It's what happens when they're able to print money with the LLM GPUs. No incentive for them to try on the consumer side, total complacency, like Intel after 2006. Remains to be seen if the data centre side holds up, given there are no decent front-end mass-market paid use cases for any of it yet. Or if AMD or another competitor can come up with something better on the consumer side.

Seems tech stagnation will go on for a while yet.
Posted on Reply
#68
N/A
The RTX 5070 Ti is very special and brings 4090 performance down to $800. At least in 1440p it should land much closer to the 4090 than to the 4080.
Posted on Reply
#69
Tsukiyomi91
I have a gut feeling it will be an $850 or $900 card.
Posted on Reply
#70
john_
StimpsonJCat: I never thought I'd say this... THANK GOD FOR INTEL and their decision to make 8GB VRAM a thing of the past! 12GB VRAM for just $250 is a thing of beauty that the GPU market so desperately needed.
There are the A770 and RTX 3060, which were selling at $250 or lower with 12-16 GB of VRAM. But you are right that putting a $250 MSRP on a 12 GB card, without having to wait months for the price to slide down to those levels, is a step in the right direction.
Also, with AMD not looking only at Nvidia's pricing but also having to worry about Intel's, that could help them ignore their fears and start with lower MSRPs. Until today they were afraid that a much lower MSRP for their cards compared to Nvidia's equivalent would trigger a response from Nvidia. Now they will have to think differently if they don't want to get squeezed from two directions.
Posted on Reply
#71
Caring1
350W... No thanks.
Make it 250W maximum.
Posted on Reply
#72
freeagent
You guys that are jumping on me are like children when it comes to hardware. Like oh my god you guys.

Buy what makes you happy, don't worry about what I am doing.
Posted on Reply
#73
close
Hyderz: Judging by the specs I’d say the gpu will be at 699 or 749
But judging by the manufacturer it's more likely to be closer to $900. :)
freeagent: You guys that are jumping on me are like children when it comes to hardware. Like oh my god you guys.

Buy what makes you happy, don't worry about what I am doing.
I think they're jumping on you like children because of the "whoever doesn't agree with me has an AMD card" remark. That's the call for playmates at this young age. So you asked for playmates, you got playmates, stop complaining about it, any other child would be super happy :).
Posted on Reply
#74
freeagent
close: your kindergarten level remark about "whoever doesn't agree with me has an AMD card"
I did no such thing.

I use the hardware that pleases me, I don't give a fuck about the name on the box.

Edit:

How well do AMD GPUs run F@H?

Like shit, that is one of the reasons I am not running one.
Posted on Reply
#75
john_
freeagent: Lots of guys hate Nvidia, and that's ok :)
freeagent: And many of the comments in this thread are from guys running AMD GPUs lol..
freeagent: You guys just need to stop with this red vs green nonsense.
freeagent: I think some people are just way too sensitive.
freeagent: You guys that are jumping on me are like children when it comes to hardware.
You guys, lots of guys, sensitive guys, guys that are targeting you. Maybe we are jealous of you? Just a thought.
The worst thing is that you are a staff member and I can't put you on my ignore list (yes, I checked that). But you did start following me just 35 minutes ago. I guess you are preparing your BAN hammer for any future posts by me, because, well, you are a staff member, so I guess you can do that.
freeagent: How well do AMD GPUs run F@H?

Like shit, that is one of the reasons I am not running one.
Yep, it's a wonderful game. I do agree that I am tempted to buy an Nvidia card to play it. But don't tell anyone.
Posted on Reply