Monday, December 16th 2024

NVIDIA GeForce RTX 5070 Ti Leak Tips More VRAM, Cores, and Power Draw

It's an open secret by now that NVIDIA's GeForce RTX 5000 series GPUs are on the way, with an early 2025 launch on the cards. Now, preliminary details about the RTX 5070 Ti have leaked, revealing an increase in both VRAM and power draw and suggesting that the new upper mid-range GPU will finally address the increased VRAM demands of modern games. According to the leak from Wccftech, the RTX 5070 Ti will have 16 GB of GDDR7 VRAM, up from 12 GB on the RTX 4070 Ti, as we previously speculated. In line with previous leaks, the new sources also indicate that the 5070 Ti will use the cut-down GB203 chip, although the new leak points to a significantly higher TBP of 350 W. The new memory configuration will supposedly sit on a 256-bit memory bus running at 28 Gbps, for a total memory bandwidth of 896 GB/s, a significant boost over the RTX 4070 Ti.
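For reference, assuming the leaked 256-bit and 28 Gbps figures hold, the bandwidth number is simple arithmetic: 256 bits per transfer × 28 Gbps per pin ÷ 8 bits per byte = 896 GB/s.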

Supposedly, the RTX 5070 Ti will also see a bump in total CUDA cores, from 7680 in the RTX 4070 Ti to 8960 in the RTX 5070 Ti. The new card will also switch to the revised 12V-2x6 power connector, replacing the 16-pin 12VHPWR connector used by the 4070 Ti. NVIDIA is expected to announce the RTX 5000 series graphics cards at CES 2025 in early January, but the RTX 5070 Ti will supposedly be the third card in the 5000-series launch cycle. That said, leaks suggest that the 5070 Ti will still launch in Q1 2025, meaning we may see an indication of specs at CES 2025, although pricing is still unclear.
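Taken at face value, that core-count bump works out to roughly a 17% increase over the original 4070 Ti (8960 ÷ 7680 ≈ 1.167).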

Update Dec 16th: Prolific hardware leaker Kopite7kimi has since responded to the RTX 5070 Ti leaks, stating that 350 W may be on the higher end for the RTX 5070 Ti: "...the latest data shows 285W. However, 350W is also one of the configs." This suggests that a 350 W TBP is possible, though perhaps only on certain graphics card models, in response to strong competition, or in certain boost scenarios.
Sources: Wccftech, Kopite7kimi on X

161 Comments on NVIDIA GeForce RTX 5070 Ti Leak Tips More VRAM, Cores, and Power Draw

#26
notoperable
Wonder how long it will take for the first melting cables headline to hit the news after delivery
Posted on Reply
#27
Hyderz
This feels like 4080 Super-level performance
Posted on Reply
#28
dartuil
Hyderz: This feels like 4080 Super-level performance
it should be
Posted on Reply
#29
Onasi
notoperable: Wonder how long it will take for the first melting cables headline to hit the news after delivery
…never? The revised connector is explicitly designed so that it is impossible to make it fail in such a manner, even DELIBERATELY. It's been tested already. The connector is a non-issue.
Posted on Reply
#30
FreedomEclipse
~Technological Technocrat~
Outback Bronze: @ $10K?
We used to make jokes about sacrificing your firstborn. Unfortunately, it seems like it's now a reality.
Posted on Reply
#31
notoperable
Onasi: …never? The revised connector is explicitly designed so that it is impossible to make it fail in such a manner, even DELIBERATELY. It's been tested already. The connector is a non-issue.
That's what 'they' said last time as well, didn't they? You'd have to be naive to think that OEMs won't cut corners or that humans will stop being humans all of a sudden, but I might be wrong :)
Posted on Reply
#32
gffermari
The Ti will be at least 10% faster than the 4080.
The 4070 Ti Super is already a tad slower than the 4080.
Posted on Reply
#33
Outback Bronze
FreedomEclipse: We used to make jokes about sacrificing your firstborn. Unfortunately, it seems like it's now a reality.
Quite happy to sacrifice my newborn atm…
Posted on Reply
#34
Hecate91
notoperable: That's what 'they' said last time as well, didn't they? You'd have to be naive to think that OEMs won't cut corners or that humans will stop being humans all of a sudden, but I might be wrong :)
I remember everyone said the first version of the connector was just fine, and anyone who had one melt was "plugging it in wrong". The 6+2 and 8-pin Molex connectors were just better, IMO.
Posted on Reply
#35
FierceRed
FreedomEclipse: We used to make jokes about sacrificing your firstborn. Unfortunately, it seems like it's now a reality.
Heh, I know you're joking, but damn, my firstborn is worth WAY more than $10k
Outback Bronze: Quite happy to sacrifice my newborn atm…
A true parent right here :laugh:
Posted on Reply
#36
Hyderz
Judging by the specs, I'd say the GPU will be at $699 or $749
Posted on Reply
#37
rv8000
Hyderz: Judging by the specs, I'd say the GPU will be at $699 or $749
I expect $799, plus whatever tariff impacts. Funny how 70-series cards used to be half that price. Nvidia has no reason to price things sanely anymore.
Posted on Reply
#38
Scircura
I don't really understand what Nvidia is doing, assuming these leaked specs are correct. The only meaningful spec boost is in memory bandwidth, but we just saw, comparing the 4070 Ti to the 4070 Ti Super (192-bit -> 256-bit bus), that the 40-series cards don't benefit from increased bandwidth, not even in ray tracing.

Remember that the 40-series cards actually had lower bandwidth than the 30 series: Nvidia slashed bus width and only slightly increased memory clocks, even as they raised core counts and core clocks by a huge amount. Now the 50 series is going to barely improve core counts, but increase bandwidth by 30-50% over the previous generation?!

Can anyone weigh in: How would it benefit gaming? Seems mostly a boon for LLM inference speed.

Edit: and a big increase in TDP without an increase in core count probably means throwing efficiency out the window to crank clocks. Then again, leaked TDPs have been way off in the past: kopite7kimi said the 4080 would have a 420 W TDP, but it launched with a 320 W TDP.
Posted on Reply
#39
john_
I guess this is going to be a $900 4080 Super replacement.
Nvidia tried to justify the high price of the 4070 Ti last time by calling it the "4080 12GB".
Now, with no competition and the public hypnotized by Nvidia's logo, they can easily put a $900 price tag on it and still sell it like hotcakes. If they come out with an $800 price tag, the original price of the 4070 Ti, the hordes will go bananas, looking at Jensen like he is the biggest humanitarian alive. The few who think this is an unrealistically high price will just blame AMD and start praying for Intel to save them..... in 5 years.
Vya Domus: Man these things will be atrociously underpowered compared to their predecessors, 6% more shaders lol.
They will probably support a new feature that will justify an upgrade. Think of Frame Generation as the main difference between the RTX 3060 and the RTX 4060. A new feature for the RTX 5000 series that could be supported by the RTX 4000 and probably even the RTX 3000 series, but Nvidia will say that "it can't be supported on older GPUs because of hardware changes in RTX 5000 that are necessary". Then AMD and Intel will introduce that feature for their cards, making it obvious that Nvidia lies, but I doubt they will give it to Nvidia owners this time.
freeagent: My 4070Ti smokes my 3070Ti in every possible way. Lots of guys hate Nvidia, and that's ok :)
42% faster (based on TPU) with a 33% higher MSRP.
freeagent: I don't feel like I got screwed. I feel like money doesn't buy what it used to though.
Nvidia doesn't sell only performance. It also sells the feeling of having a superior product compared to the others. Think Apple vs Samsung. Even if Samsung comes out with a superior model, people will still think that buying an Apple iPhone is the premium choice.
Posted on Reply
#40
freeagent
john_: 42% faster (based on TPU) with a 33% higher MSRP.
I have 2 computers :)
john_: It also sells the feeling of having a superior product compared to the others.
I do not believe that for an instant. You guys just need to stop with this red vs green nonsense. People buy what they buy because it works for them.
Posted on Reply
#41
john_
freeagent: You guys just need to stop with this red vs green nonsense.
You probably mean that previous post of yours about "most posters having an AMD GPU. lol"
Or that previous post where you were saying that many people "just hate Nvidia".
Or maybe we should just combine those two posts.

"Many people hate Nvidia, most posters here have an AMD GPU. lol".
How about that?

..........please.......... The above two posts of yours could be considered too close to an effort to ignite a flame war.
Posted on Reply
#42
freeagent
I think some people are just way too sensitive. Anyways, I am going to bed.

Peace and thermal grease y0
Posted on Reply
#43
john_
freeagent: I think some people are just way too sensitive.
Self-criticism is useful.
freeagent: Anyways, I am going to bed.
Good idea.
Posted on Reply
#44
Legacy-ZA
I have my eyes set on either the 5080 or the 5070 Ti, as long as it has 16 GB of VRAM or more; I'm hoping the 5080 gets 20 GB of VRAM instead of the rumoured 16 GB. I am a little disappointed with the power draw though. I was hoping for around 275 W for the 5070 Ti and 300 W for the 5080, but I can always do some underclocking, hopefully without ruining performance too much. These space heaters are ridiculous at this point, but I need a proper GPU.

Let's hope some semblance of sanity has been restored with pricing, as it only took one generation for xx70-card prices to jump from $400 to over $600.

Anyways, time will tell.
Posted on Reply
#45
usiname
rv8000: I expect $799, plus whatever tariff impacts. Funny how 70-series cards used to be half that price. Nvidia has no reason to price things sanely anymore.
The 70 series also used to have ~40% fewer cores than the full configuration, and ~30% fewer for the 70 Ti; now it's ~55% fewer for the 70 Ti and ~65% fewer for the regular 70. It's basically a 60-class card at the price of an 80/80 Ti.
Posted on Reply
#46
AusWolf
16 GB on a 350 W card. Am I supposed to be impressed or something? :wtf:
Posted on Reply
#47
3valatzy
AusWolf: 16 GB on a 350 W card. Am I supposed to be impressed or something? :wtf:
Nope! :banghead:

The results:
1. No future-proofing
2. Reduced texture resolution in order to fit them into the limited framebuffer

See this for reference at 6:06 and 8:03: [embedded video]

The GTX 1070 Ti was a 180 W card:
www.techpowerup.com/gpu-specs/geforce-gtx-1070-ti.c3010

Question: why did they backport GB102 on 3 nm to GB202 on 4 nm? Does it mean the TSMC 3 nm node is broken for GPUs? :rolleyes:
Posted on Reply
#48
StimpsonJCat
I never thought I'd say this... THANK GOD FOR INTEL and their decision to make 8GB VRAM a thing of the past! 12GB VRAM for just $250 is a thing of beauty that the GPU market so desperately needed.

24GB on a 5080 should now be a thing, nGreedia, as it should have been all along. But I won't hold my breath.
Posted on Reply
#49
Hecate91
StimpsonJCat: I never thought I'd say this... THANK GOD FOR INTEL and their decision to make 8GB VRAM a thing of the past! 12GB VRAM for just $250 is a thing of beauty that the GPU market so desperately needed.

24GB on a 5080 should now be a thing, nGreedia, as it should have been all along. But I won't hold my breath.
I like seeing more VRAM from Intel, but they need to do something to gain market share; sadly, most gamers will blindly buy the card with Nvidia on the box.
But I'm not impressed by 16GB on the 5070 Ti, which will probably be $800-900; it's the minimum a card should have at that price.
freeagent: I don't feel like I got screwed. I feel like money doesn't buy what it used to though.
I don't expect anyone to feel like they're getting screwed while still having any excitement for new GPUs from Nvidia, and I mean anyone still accepting what the leather jacket man is charging for what you get in a mid-range card.
As for inflation, it doesn't really apply when Nvidia has margins over 70% and 90% of the dGPU market; they're pricing things well above inflation because they can.
Posted on Reply
#50
yfn_ratchet
StimpsonJCat: I never thought I'd say this... THANK GOD FOR INTEL and their decision to make 8GB VRAM a thing of the past! 12GB VRAM for just $250 is a thing of beauty that the GPU market so desperately needed.
Ehhh... I wouldn't get too excited. B300 will probably be a 6/8GB lineup, but then again A300 was the ultra-budget/transcoding/SFF series so it's not much of a tragedy.
Posted on Reply