Friday, November 22nd 2024

NVIDIA GeForce RTX 5070 Ti Specs Leak: Same Die as RTX 5080, 300 W TDP

Recent leaks have unveiled specifications for NVIDIA's upcoming RTX 5070 Ti graphics card, suggesting an increase in power consumption. According to industry leaker Kopite7kimi, the RTX 5070 Ti will feature 8,960 CUDA cores and operate at a 300 W TDP. In a departure from previous generations, the RTX 5070 Ti will reportedly share the same GB203 die as its higher-tier sibling, the RTX 5080. This differs from the RTX 40-series lineup, where the 4070 Ti and 4080 used different dies (AD104 and AD103, respectively). The shared-die approach could keep NVIDIA's manufacturing costs lower. Performance-wise, the RTX 5070 Ti shows promising improvements over its predecessor: the leaked specifications indicate a 16% increase in CUDA cores compared to the RTX 4070 Ti, though the advantage shrinks to 6% when measured against the RTX 4070 Ti Super.

Power consumption sees a modest 5% increase to 300 W, suggesting improved efficiency despite the enhanced capabilities. Memory configurations remain unconfirmed, but speculation suggests the card could feature 16 GB of memory on a 256-bit interface, distinguishing it from the RTX 5080's rumored 24 GB configuration. The RTX 5070 Ti's positioning within the 50-series stack appears carefully calculated, with its 8,960 CUDA cores sitting roughly 20% below the RTX 5080's 10,752. This larger gap between tiers contrasts with the previous generation's approach, potentially indicating a more defined product hierarchy in the Blackwell lineup. NVIDIA is expected to unveil its Blackwell gaming graphics cards at CES 2025, with the RTX 5090, 5080, and 5070 series leading the announcement.
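For readers who want to sanity-check the percentages above, here is a minimal Python sketch using the leaked/rumored counts cited in the article and the 4070 Ti's official 285 W TDP; none of the Blackwell figures are confirmed by NVIDIA:

```python
# Rumored RTX 5070 Ti figures from the leak, alongside official RTX 40-series specs for comparison.
cards = {
    "RTX 4070 Ti":       {"cuda": 7_680,  "tdp_w": 285},
    "RTX 4070 Ti Super": {"cuda": 8_448,  "tdp_w": 285},
    "RTX 5070 Ti":       {"cuda": 8_960,  "tdp_w": 300},   # leaked, unconfirmed
    "RTX 5080":          {"cuda": 10_752, "tdp_w": None},  # rumored, unconfirmed
}

def pct_increase(new: float, old: float) -> float:
    """Percentage increase of `new` over `old`."""
    return (new - old) / old * 100

ti = cards["RTX 5070 Ti"]
print(f"CUDA cores vs 4070 Ti:       +{pct_increase(ti['cuda'], cards['RTX 4070 Ti']['cuda']):.1f}%")        # ~ +16.7%
print(f"CUDA cores vs 4070 Ti Super: +{pct_increase(ti['cuda'], cards['RTX 4070 Ti Super']['cuda']):.1f}%")  # ~ +6.1%
print(f"TDP vs 4070 Ti:              +{pct_increase(ti['tdp_w'], cards['RTX 4070 Ti']['tdp_w']):.1f}%")      # ~ +5.3%
print(f"RTX 5080 core advantage:     +{pct_increase(cards['RTX 5080']['cuda'], ti['cuda']):.1f}%")           # ~ +20.0%
```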
Source: VideoCardz

36 Comments on NVIDIA GeForce RTX 5070 Ti Specs Leak: Same Die as RTX 5080, 300 W TDP

#1
Macro Device
Wait till competition commits suicide →
"Design" the "new" architecture for a couple months longer than usual →
Sell a slightly overclocked and tuned mid-tier product for even longer dollars than last time →
"You definitely need that, trust me"
#2
wNotyarD
While on paper this sounds a lot better than the 4070 Ti was design-wise, I don't like the power consumption increase (even if modest) and certainly won't like its price. Lucky me, I'm not in the market for a new graphics card in the foreseeable future.
#3
Daven
Looks like the upper part of the line-up is coming into focus

5070 $600 250W Slightly higher performance than 4070 Super
5070 Ti $800 300W Slightly higher performance than 4070 Ti Super
5080 $1000 400W Slightly higher performance than 4080 Super
5090 $2000 600W 40% higher performance than 4090

Nothing too exciting given the same 4 nm die process except for the 5090. I have no idea how this thing is going to work at 600W if your rig isn't perfectly up to snuff.

As for the lower part of the line-up, Nvidia is definitely waiting to see how Battlemage and RDNA4 perform.
#4
Legacy-ZA
I have my eye on the 5070 Ti; however, if it arrives with 12 GB VRAM, there is no way. Also, the MSRP should be reasonable. You can't just keep pushing prices up, nVidia; the world doesn't work that way. Salaries don't go up ad infinitum, and eventually you will price yourself out of the tier/market you used to target.
#5
usiname
Daven: Looks like the upper part of the line-up is coming into focus

5070 $600 250W Slightly higher performance than 4070 Super
5070 Ti $800 300W Slightly higher performance than 4070 Ti Super
5080 $1000 400W Slightly higher performance than 4080 Super
5090 $2000 600W 40% higher performance than 4090

Nothing too exciting given the same 4 nm die process except for the 5090. I have no idea how this thing is going to work at 600W if your rig isn't perfectly up to snuff.

As for the lower part of the line-up, Nvidia is definitely waiting to see how Battlemage and RDNA4 perform.
Replace "slightly higher" with "same" and you will nail them.
Also add $100 to the 5070 and $200 to the 70 Ti and 80.
#7
docnorth
Legacy-ZA: I have my eye on the 5070 Ti; however, if it arrives with 12 GB VRAM, there is no way. Also, the MSRP should be reasonable. You can't just keep pushing prices up, nVidia; the world doesn't work that way. Salaries don't go up ad infinitum, and eventually you will price yourself out of the tier/market you used to target.
It's rumored to be 16 GB. As for MSRP, I'm afraid it will be the same as the 4070 Ti / Ti Super at best.
#8
Legacy-ZA
docnorth: It's rumored to be 16 GB. As for MSRP, I'm afraid it will be the same as the 4070 Ti / Ti Super at best.
Hopefully some semblance of sanity can be restored. I was happy with the RTX 3070 Ti's GPU performance itself; after all, I am only on 1440p. However, that damn VRAM ceiling of only 8 GB means the thing cries out in pain in AAA games at this resolution. I can't stand stuttering and poor 1% lows; it irks me to the extreme.
#9
close
Macro Device: Wait till competition commits suicide →
"Design" the "new" architecture for a couple months longer than usual →
Sell a slightly overclocked and tuned mid-tier product for even longer dollars than last time →
"You definitely need that, trust me"
Of course the competition committed suicide. Journalists and reviewers banged the drum of how much better Nvidia is in terms of performance even when it was just edging out AMD, and banged another drum about how ray tracing is "the future" (TM). People gobbled that up and went for Nvidia even when AMD was a perfectly decent (almost equivalent) or cheaper alternative. The demand for either cheaper (at the mid/low end) or more efficient (at the high end) GPUs (I think AMD never had both together in a single product) dried up in favor of "Nvidia at every end".

Only now that Nvidia is squeezing every $ it can, with last year's overclocked products, out of a market it ended up dominating have many of the same journalists and reviewers realized the consequences. I started seeing articles about how "Nvidia's GPUs don't get better, they just trade more power for more performance", and users started voting in polls that they care about raster performance, not ray tracing.
#10
Vayra86
Legacy-ZA: Hopefully some semblance of sanity can be restored. I was happy with the RTX 3070 Ti's GPU performance itself; after all, I am only on 1440p. However, that damn VRAM ceiling of only 8 GB means the thing cries out in pain in AAA games at this resolution. I can't stand stuttering and poor 1% lows; it irks me to the extreme.
Naaah that can't be, 'it's just allocation, not utilisation'

:roll::oops::(

But yeah, that's what's about to happen to the crop of 12GB cards in about 2 years, with the occasional game popping up in 2025 already. If you upgrade now, better make it 16 or don't bother.
#12
Vayra86
close: Of course the competition committed suicide. Journalists and reviewers banged the drum of how much better Nvidia is in terms of performance even when it was just edging out AMD, and banged another drum about how ray tracing is "the future" (TM). People gobbled that up and went for Nvidia even when AMD was a perfectly decent (almost equivalent) or cheaper alternative. The demand for either cheaper (at the mid/low end) or more efficient (at the high end) GPUs (I think AMD never had both together in a single product) dried up in favor of "Nvidia at every end".

Only now that Nvidia is squeezing every $ it can, with last year's overclocked products, out of a market it ended up dominating have many of the same journalists and reviewers realized the consequences. I started seeing articles about how "Nvidia's GPUs don't get better, they just trade more power for more performance", and users started voting in polls that they care about raster performance, not ray tracing.
But... Nvidia was always better. Simple as that. Even during GCN, AMD drew more power for a slightly better bang for buck, offered more VRAM for a slightly better bang for buck. And that's all AMD wrote. Not ONCE did they take the leading position, either in feature set or in software altogether. The driver regime has been spotty. GPU time to market has no real fixed cadence; it's 'whatever happens with AMD' every single time, and it never happens to be just a smooth launch. The list of issues goes on and on and on.

The only thing to applaud is that AMD brought RDNA2/3 to a good, stable situation. Too bad the products don't sell, because AMD chose to price them at parity with Nvidia... So at that point, they didn't have the consistency, nor the trust factor or brand image, nor the bang-for-buck price win... and guess what: RDNA4 is a bugfix, and then they're going back to the drawing board. Again: no consistency; they even admitted themselves that they failed now.
#13
rv8000
This has got some strong "4080 performance for $999" vibes written all over it, with the potential to be much worse value than the 4070 Ti Super depending on tariffs.

Another cycle of Nvidia giving you less or the same for more.
#14
Daven
usiname: Correct, I was confused by the 5070 - it has a significantly lower core count than the 4070 Super, but for the prices I believe they will be higher.
RTX 4070 - 5888 cores
RTX 5070 - 6400 cores
RTX 4070 super - 7168 cores
Whoops, you are right. I was mistaken that the 4070 Super had 5888 cores.
#15
mate123
rv8000: This has got some strong "4080 performance for $999" vibes written all over it, with the potential to be much worse value than the 4070 Ti Super depending on tariffs.

Another cycle of Nvidia giving you less or the same for more.
$999 for the always-out-of-stock "Founders Edition" and probably $1,100+ for the non-hairdryer models
#16
Daven
Vayra86: But... Nvidia was always better. Simple as that. Even during GCN, AMD drew more power for a slightly better bang for buck, offered more VRAM for a slightly better bang for buck. And that's all AMD wrote. Not ONCE did they take the leading position, either in feature set or in software altogether. The driver regime has been spotty. GPU time to market has no real fixed cadence; it's 'whatever happens with AMD' every single time, and it never happens to be just a smooth launch. The list of issues goes on and on and on.

The only thing to applaud is that AMD brought RDNA2/3 to a good, stable situation. Too bad the products don't sell, because AMD chose to price them at parity with Nvidia... So at that point, they didn't have the consistency, nor the trust factor or brand image, nor the bang-for-buck price win... and guess what: RDNA4 is a bugfix, and then they're going back to the drawing board. Again: no consistency; they even admitted themselves that they failed now.
Not quite true. AMD had good wins with the R300, Fury, and all RDNA products. Also, their All-in-Wonder products were good. And their iGPUs were always a step ahead.

In between those, AMD and Nvidia traded blows, with Nvidia's FX 5000 series being a notable flop. Only recently, starting with the release of Maxwell, did Nvidia really start pulling ahead. And yes, they have been killing it for the last six years.
#17
TechBuyingHavoc
Daven: As for the lower part of the line-up, Nvidia is definitely waiting to see how Battlemage and RDNA4 perform.
I expect both to suck as well. AMD is distracted with AI and not focused on gaming, despite what they are saying, and Intel may *wish* to gain market share with Battlemage, but they have no money left and Celestial looks dead in the water.
#18
Prima.Vera
Vayra86: But... Nvidia was always better. Simple as that.
Maybe you are too young to remember, but nVidia definitely wasn't always better.
Just for your homework, look up the AMD Radeon HD 5870. It was so good that it almost beat nVidia's dual-GPU card while wiping the floor with nVidia's entire generation of cards. The 5850 was a monster too, and could work in a pair with the 5870. I remember that was my last SLI setup ever, but it was a blast. Good ol' times.
www.techpowerup.com/review/ati-radeon-hd-5870/30.html
#19
ThomasK
Vayra86: But... Nvidia was always better. Simple as that. Even during GCN, AMD drew more power for a slightly better bang for buck, offered more VRAM for a slightly better bang for buck. And that's all AMD wrote. Not ONCE did they take the leading position, either in feature set or in software altogether. The driver regime has been spotty. GPU time to market has no real fixed cadence; it's 'whatever happens with AMD' every single time, and it never happens to be just a smooth launch. The list of issues goes on and on and on.
Were you even around when the HD 5870 came out and Nvidia's answer was the Fermi toaster?
#20
TechBuyingHavoc
Prima.Vera: Maybe you are too young to remember, but nVidia definitely wasn't always better.
Just for your homework, look up the AMD Radeon HD 5870. It was so good that it almost beat nVidia's dual-GPU card while wiping the floor with nVidia's entire generation of cards. The 5850 was a monster too, and could work in a pair with the 5870. I remember that was my last SLI setup ever, but it was a blast. Good ol' times.
www.techpowerup.com/review/ati-radeon-hd-5870/30.html
And earlier than that in the ATI era, the Radeon 9700 was a game-changer.
#21
Random_User
Vayra86: But... Nvidia was always better. Simple as that. Even during GCN, AMD drew more power for a slightly better bang for buck, offered more VRAM for a slightly better bang for buck. And that's all AMD wrote. Not ONCE did they take the leading position, either in feature set or in software altogether. The driver regime has been spotty. GPU time to market has no real fixed cadence; it's 'whatever happens with AMD' every single time, and it never happens to be just a smooth launch. The list of issues goes on and on and on.

The only thing to applaud is that AMD brought RDNA2/3 to a good, stable situation. Too bad the products don't sell, because AMD chose to price them at parity with Nvidia... So at that point, they didn't have the consistency, nor the trust factor or brand image, nor the bang-for-buck price win... and guess what: RDNA4 is a bugfix, and then they're going back to the drawing board. Again: no consistency; they even admitted themselves that they failed now.
AMD was beating nVidia from the 4870 till Polaris. Even the dumb GTX 260 was a hot furnace. The 5870 was the first DX11 card and the first generation with HW tessellation. Also, AMD/ATi's hardware was of much higher quality. But drivers were hit-and-miss for decades, that's true. The picture quality was a bit better on AMD/ATi as well, much like not gimping the colour presets like the "green" company has always done.
P.S.: AMD made Mantle, which became Vulkan. And software RT back in 2016.
TechBuyingHavoc: And earlier than that in the ATI era, the Radeon 9700 was a game-changer.
There was also the Radeon 9600 Pro: a great, affordable low-end card that made gaming possible for a lot of people. And it had quite a bit of OC headroom. It was the "Barton 2500+" of video cards.
#22
freeagent
Daven: Also, their All-in-Wonder products were good.
Just to be clear, this was all ATi; AMD had nothing to do with that :)

I will probably still aim for a 5080 this time around. My oldest will get my 4070 Ti, and my youngest gets my 3070 Ti.

Although I could be swayed into a 5070Ti, because while the internet was hating on the Ti series, I was enjoying them :D
#23
Vayra86
Random_User: AMD was beating nVidia from the 4870 till Polaris. Even the dumb GTX 260 was a hot furnace. The 5870 was the first DX11 card and the first generation with HW tessellation. Also, AMD/ATi's hardware was of much higher quality. But drivers were hit-and-miss for decades, that's true. The picture quality was a bit better on AMD/ATi as well, much like not gimping the colour presets like the "green" company has always done.
P.S.: AMD made Mantle, which became Vulkan. And software RT back in 2016.

There was also the Radeon 9600 Pro: a great, affordable low-end card that made gaming possible for a lot of people. And it had quite a bit of OC headroom. It was the "Barton 2500+" of video cards.
So pray tell where it went wrong then. They were 'beating' Nvidia with what? Tech that had met the end of its dev cycle. They had every opportunity to obtain true leadership, but AMD was thinking 'meh, we're good, this is fine, we don't need to chase the cutting edge continuously, 50% market is all we can do'? And then they thought, 'beating Nvidia': 'Let's release Nvidia's 970*(Edited) 2 years after the fact and kill this market!' I mean... what?! They weren't beating Nvidia at all. They traded punches, but never answered Nvidia's Titan, and Hawaii XT failed miserably - a way too hungry dead end forcing them into Fury X and the capital loss against Maxwell. AMD's death of GCN happened somewhere between the great release of the 7970 and the birth of Tonga, which proved the arch was a dead end, but they pushed out the 290(X) anyway on a whopping 512-bit bus. And then Fury had to happen, because how else do you go above and beyond moar VRAM on a 512-bit bus? And then they got their 1000-bit HBM ass kicked by a 384-bit 980 Ti.

AMD made Mantle, which became Vulkan. And then what? What is the overarching strategy here, console access? We can applaud their many successes but the key to those events is that you use them to increase your market share and control, to the detriment of other key players. That's commerce.

It's one thing to make the occasional 'good card' (which is really nothing more than pricing a product correctly / in a way people buy it!) that sells; it's another to actually execute on a strategy. Over several decades of AMD GPUs I haven't discovered what it is. If we go by the marketing, it's some wild mix of making fun of the others while failing yourself (Poor Volta and a string of other events), going unified arch first and then not, and then yes, we might as well unify this again after dropping under 20% share convincingly; going 'midrange with Polaris' to lose key market share and brand recognition earned on GCN (which had a few 'good cards'), only to claw back into the high end with RDNA2/3 and then go back to midrange again?

There's just no rhyme or reason to it, and that is why it can't ever get consistently good.
Prima.Vera: Maybe you are too young to remember, but nVidia definitely wasn't always better.
Just for your homework, look up the AMD Radeon HD 5870. It was so good that it almost beat nVidia's dual-GPU card while wiping the floor with nVidia's entire generation of cards. The 5850 was a monster too, and could work in a pair with the 5870. I remember that was my last SLI setup ever, but it was a blast. Good ol' times.
www.techpowerup.com/review/ati-radeon-hd-5870/30.html
I was in a console phase in those years; for some reason it was PS3 at that point, not PC :D
#24
Zazigalka
My prediction for this 5070 Ti is 4080S-equivalent performance for $799. If it manages to beat the 4080S by 10-15%, $899.
Vayra86: So pray tell where it went wrong then. They were 'beating' Nvidia with what? Tech that had met the end of its dev cycle. They had every opportunity to obtain true leadership, but AMD was thinking 'meh, we're good, this is fine, we don't need to chase the cutting edge continuously, 50% market is all we can do'? And then they thought, 'beating Nvidia': 'Let's release Nvidia's 970*(Edited) 2 years after the fact and kill this market!' I mean... what?! They weren't beating Nvidia at all. They traded punches, but never answered Nvidia's Titan, and Hawaii XT failed miserably - a way too hungry dead end forcing them into Fury X and the capital loss against Maxwell. AMD's death of GCN happened somewhere between the great release of the 7970 and the birth of Tonga, which proved the arch was a dead end, but they pushed out the 290(X) anyway on a whopping 512-bit bus. And then Fury had to happen, because how else do you go above and beyond moar VRAM on a 512-bit bus? And then they got their 1000-bit HBM ass kicked by a 384-bit 980 Ti.

AMD made Mantle, which became Vulkan. And then what? What is the overarching strategy here, console access? We can applaud their many successes but the key to those events is that you use them to increase your market share and control, to the detriment of other key players. That's commerce.

It's one thing to make the occasional 'good card' (which is really nothing more than pricing a product correctly / in a way people buy it!) that sells; it's another to actually execute on a strategy. Over several decades of AMD GPUs I haven't discovered what it is. If we go by the marketing, it's some wild mix of making fun of the others while failing yourself (Poor Volta and a string of other events), going unified arch first and then not, and then yes, we might as well unify this again after dropping under 20% share convincingly; going 'midrange with Polaris' to lose key market share and brand recognition earned on GCN (which had a few 'good cards'), only to claw back into the high end with RDNA2/3 and then go back to midrange again?

There's just no rhyme or reason to it, and that is why it can't ever get consistently good.


I was in a console phase in those years; for some reason it was PS3 at that point, not PC :D
Not moving forwards = going backwards
#25
3valatzy
Daven: Looks like the upper part of the line-up is coming into focus

5070 $600 250W Slightly higher performance than 4070 Super
5070 Ti $800 300W Slightly higher performance than 4070 Ti Super
5080 $1000 400W Slightly higher performance than 4080 Super
5090 $2000 600W 40% higher performance than 4090

Nothing too exciting given the same 4 nm die process except for the 5090. I have no idea how this thing is going to work at 600W if your rig isn't perfectly up to snuff.

As for the lower part of the line-up, Nvidia is definitely waiting to see how Battlemage and RDNA4 perform.
Nothing too exciting except the RTX 5090, which will be crazy expensive - maybe $4000 for the GB202 die at 744 mm². That is at the reticle size limit! :kookoo:

The good news is that AMD will have a chance to survive after this, because the RTX 5000 series will mostly not be worth buying.

RTX 4070 - 5888 shaders | RTX 5070 - 6400 shaders
RTX 4070S - 7168 shaders
RTX 4070 Ti - 7680 shaders
RTX 4070 Ti S - 8448 shaders | RTX 5070 Ti - 8960 shaders
RTX 4080 - 9728 shaders
RTX 4080S - 10240 shaders | RTX 5080 - 10752 shaders
RTX 4090 - 16384 shaders | RTX 5090 - 21760 shaders
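A quick sketch of the tier-by-tier shader increases implied by the list above (the Blackwell counts are rumors, not confirmed figures), which shows how modest the gains are outside the 5090:

```python
# Rumored Blackwell shader counts vs. their closest Ada counterparts (from the list above).
pairs = [
    ("RTX 4070",      5888,  "RTX 5070",    6400),
    ("RTX 4070 Ti S", 8448,  "RTX 5070 Ti", 8960),
    ("RTX 4080S",     10240, "RTX 5080",    10752),
    ("RTX 4090",      16384, "RTX 5090",    21760),
]

for old_name, old_count, new_name, new_count in pairs:
    gain = (new_count - old_count) / old_count * 100
    print(f"{new_name:<12} vs {old_name:<13}: +{gain:.1f}% shaders")
# Roughly +8.7%, +6.1%, +5.0%, +32.8% - only the 5090 gets a big bump.
```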