Friday, November 22nd 2024

NVIDIA GeForce RTX 5070 Ti Specs Leak: Same Die as RTX 5080, 300 W TDP

Recent leaks have unveiled specifications for NVIDIA's upcoming RTX 5070 Ti graphics card, suggesting an increase in power consumption. According to industry leaker Kopite7kimi, the RTX 5070 Ti will feature 8,960 CUDA cores and operate at a 300 W TDP. In a departure from previous generations, the RTX 5070 Ti will reportedly share the same GB203 die as its higher-tier sibling, the RTX 5080. This differs from the RTX 40-series lineup, where the 4070 Ti and 4080 used different dies (AD104 and AD103, respectively). The shared-die approach could help keep NVIDIA's manufacturing costs down. Performance-wise, the RTX 5070 Ti shows promising improvements over its predecessor. The leaked specifications indicate a 16% increase in CUDA cores compared to the RTX 4070 Ti, though this advantage shrinks to 6% when measured against the RTX 4070 Ti Super.

Power consumption sees a modest 5% increase to 300 W, suggesting improved efficiency despite the enhanced capabilities. Memory configurations remain unconfirmed, but speculation points to 16 GB of memory on a 256-bit interface, distinguishing the card from the RTX 5080's rumored 24 GB configuration. The RTX 5070 Ti's positioning within the 50-series stack appears carefully calculated, with its 8,960 CUDA cores sitting roughly 17% below the RTX 5080's 10,752. This larger gap between tiers contrasts with the previous generation's approach, potentially indicating a more defined product hierarchy in the Blackwell lineup. NVIDIA is expected to unveil its Blackwell gaming graphics cards at CES 2025, with the RTX 5090, 5080, and 5070 series leading the announcement.
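The percentages above follow directly from the leaked figures; as a quick sanity-check sketch (all RTX 50-series numbers are rumored, not confirmed; 4070 Ti figures are official specs):

```python
# Sanity check of the percentages quoted above.
# RTX 5070 Ti / 5080 figures are leaked and unconfirmed; 4070 Ti figures are official.
CORES_5070TI, CORES_5080 = 8960, 10752
CORES_4070TI, CORES_4070TIS = 7680, 8448
TDP_5070TI, TDP_4070TI = 300, 285  # watts

def pct(new: float, old: float) -> float:
    """Percentage change from old to new."""
    return (new - old) / old * 100

print(f"Cores vs 4070 Ti:       {pct(CORES_5070TI, CORES_4070TI):+.1f}%")   # ~ +16.7%
print(f"Cores vs 4070 Ti Super: {pct(CORES_5070TI, CORES_4070TIS):+.1f}%")  # ~ +6.1%
print(f"TDP vs 4070 Ti:         {pct(TDP_5070TI, TDP_4070TI):+.1f}%")       # ~ +5.3%
print(f"Cores vs RTX 5080:      {pct(CORES_5070TI, CORES_5080):+.1f}%")     # ~ -16.7%
```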
Source: VideoCardz

80 Comments on NVIDIA GeForce RTX 5070 Ti Specs Leak: Same Die as RTX 5080, 300 W TDP

#51
ThomasK
Visible NoiseThat was 15 years ago dude
How old were you back then? Like 4?
Posted on Reply
#52
nguyen
DavenSo you are saying there will be a 100% increase in IPC going from Ada to Blackwell? That would mean the 5090 would be almost three times the performance of 4090.

By the way, the 4070 Ti has an over-50% clock increase over the 3090, which was possible going from Samsung 8LPP to TSMC 4N. Blackwell is on the same node as Ada.
The 2070 Super has fewer CUDA cores and less bandwidth, and is based on TSMC 12 nm (a tweaked TSMC 16 nm), yet it still performs the same as the 1080 Ti regardless.

Well, if the 5090 were given a 1000 W+ TDP and GDDR8 then it could be 3x as fast as the 4090 LOL. Obviously perf doesn't scale with CUDA cores and bandwidth past a certain point, e.g. the 4090 is not 60% faster than the 4080
Posted on Reply
#53
Daven
nguyenThe 2070 Super has fewer CUDA cores and less bandwidth, and is based on TSMC 12 nm (a tweaked TSMC 16 nm), yet it still performs the same as the 1080 Ti regardless.

Well, if the 5090 were given a 1000 W+ TDP and GDDR8 then it could be 3x as fast as the 4090 LOL. Obviously perf doesn't scale with CUDA cores and bandwidth past a certain point, e.g. the 4090 is not 60% faster than the 4080
It really comes down to boost clocks and cores, since the IPC across all three RTX series is similar. Just multiply the two and you get the ranking order in the charts.

It’s that simple. And here’s the prediction:

5070 Ti: 8.9K cores times 2.7 GHz = 24
4090: 16.4K cores times 2.5 GHz = 41

Not even close. The 5070 Ti will be about 20% faster than the 4070 Ti (24/20).
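A minimal sketch of this back-of-envelope estimate in Python (the clocks, especially for the unreleased 5070 Ti, are assumptions, not confirmed specs):

```python
# Back-of-envelope performance proxy: cores (thousands) x boost clock (GHz).
# Clocks for unreleased cards are guesses, not confirmed specs.
cards = {
    "RTX 4070 Ti": (7.68, 2.6),
    "RTX 4090": (16.4, 2.5),
    "RTX 5070 Ti": (8.96, 2.7),  # leaked core count, assumed clock
}

scores = {name: cores_k * ghz for name, (cores_k, ghz) in cards.items()}
for name, score in sorted(scores.items(), key=lambda kv: kv[1]):
    print(f"{name}: {score:.1f}")

uplift = scores["RTX 5070 Ti"] / scores["RTX 4070 Ti"] - 1
print(f"5070 Ti vs 4070 Ti: {uplift:+.0%}")  # ~ +21%, matching the ~20% estimate above
```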
Posted on Reply
#54
N/A
The 4090 has 16.4K cores, more than 2x the 4070 Ti, and is only 50% faster; the 5070 Ti just needs to close that gap. I believe it can get pretty close with a ~20% uArch improvement.
Posted on Reply
#55
nguyen
DavenIt really comes down to boost clocks and cores, since the IPC across all three RTX series is similar. Just multiply the two and you get the ranking order in the charts.

It’s that simple. And here’s the prediction:

5070 Ti: 8.9K cores times 2.7 GHz = 24
4090: 16.4K cores times 2.5 GHz = 41

Not even close.
Man, you are cracking me up. So the 5070 Ti will be slower than the 4070?
Posted on Reply
#56
Daven
nguyenMan, you are cracking me up. So the 5070 Ti will be slower than the 4070?
Ummm…no. Again, the math is simple.

4070: 5.9K cores times 2.5 GHz = 14.75
5070 Ti: 8.9K cores times 2.7 GHz = 24

Btw, this is how raw compute is calculated: cores times clocks. While the raw throughput scales, games tend not to at higher resolutions and higher core counts, so the calculation starts to break down at 4K and 4090 levels. I don’t expect the 5090 to be 33% faster than the 4090 at 4K at the same clocks, but it could be close.
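In FLOPS terms, each CUDA core retires one FMA (two FLOPs) per clock, so the cores-times-clocks figure maps directly onto peak FP32 throughput; a minimal sketch (the 5070 Ti clock is an assumption):

```python
# Peak FP32 throughput: each CUDA core retires one FMA (2 FLOPs) per clock,
# so peak TFLOPS = cores * clock_GHz * 2 / 1000.
def peak_tflops(cores: int, clock_ghz: float) -> float:
    return cores * clock_ghz * 2 / 1000

print(f"RTX 4070:    {peak_tflops(5888, 2.48):.1f} TFLOPS")   # ~29.2 (official boost clock)
print(f"RTX 4090:    {peak_tflops(16384, 2.52):.1f} TFLOPS")  # ~82.6
print(f"RTX 5070 Ti: {peak_tflops(8960, 2.7):.1f} TFLOPS")    # ~48.4 (leaked cores, assumed clock)
```

As noted above, real frame rates stop tracking this number at the top of the stack, so treat it as a ceiling, not a prediction.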
Posted on Reply
#57
Wasteland
nguyen4070 Ti: 7680 CUDA cores, 192-bit bus
3090: 10496 CUDA cores, 384-bit bus

Pretty much every xx70 GPU ties with the previous gen xx80 Ti/xx90 (1070 = 980 Ti, 2070 Super = 1080 Ti, 3070 = 2080 Ti, 4070 Ti = 3090)
Right but Ada's a different case. The 3090, like most flagship-tier GPUs before it, didn't have a huge performance uplift over the next-best product in the stack. Techpowerup has the 3090 at +14% over the 3080, and 39% over the 3070 Ti. (And it's worth pointing out the 3070 Ti was widely considered a turd.)

By contrast, the 4090 shows a 26% advantage over the 4080, and a 60% advantage over the 4070 Ti. This change in the product stack's composition was exacerbated by the huge increase to the 4080's price over its previous generation analogues, which explains why so many people complained: Nvidia concentrated most of Ada's performance gains at the tippy top end. There's no obvious reason to expect that they'll change that approach.

I think it's unrealistic to expect the 5070/Ti to meet or exceed the 4090; the fact that it appears to carry many fewer CUDA cores only strengthens the case.
Posted on Reply
#58
nguyen
WastelandRight but Ada's a different case. The 3090, like most flagship-tier GPUs before it, didn't have a huge performance uplift over the next-best product in the stack. Techpowerup has the 3090 at +14% over the 3080, and 39% over the 3070 Ti. (And it's worth pointing out the 3070 Ti was widely considered a turd.)

By contrast, the 4090 shows a 26% advantage over the 4080, and a 60% advantage over the 4070 Ti. This change in the product stack's composition was exacerbated by the huge increase to the 4080's price over its previous generation analogues, which explains why so many people complained: Nvidia concentrated most of Ada's performance gains at the tippy top end. There's no obvious reason to expect that they'll change that approach.

I think it's unrealistic to expect the 5070/Ti to meet or exceed the 4090; the fact that it appears to carry many fewer CUDA cores only strengthens the case.
Pretty much every 104 die (xx70/xx70 Ti) matches the previous 102 die (xx80 Ti/xx90) going way, way back.

So many people complained, yet Nvidia gained market share with Ada? :kookoo:

Edit: even that turd 3070 Ti is faster than the 2080 Ti by 10%.
Posted on Reply
#59
RootinTootinPootin
Eyeing the 5080, but its specs seem identical to the 4080 Super's, just with GDDR7 soldered in there.. hmmm, the future is indeed crazy/lazy.
Posted on Reply
#60
close
Vayra86But... Nvidia was always better. Simple as that. Even during GCN; AMD drew more power for a slightly better bang for buck, offered more VRAM for a slightly better bang for buck. And that's all AMD wrote. Not ONCE did they take the leading position, either in featureset or in software altogether. Driver regime has been spotty. GPU time to market has no real fixed cadence, its 'whatever happens with AMD' every single time and it never happens to be just a smooth launch. The list of issues goes on and on and on.

The only thing to applaud is AMD brought RDNA2/3 to a good, stable situation. Too bad the products don't sell. Because AMD chose to price them in parity with Nvidia... So at that point, they didn't have the consistency, nor the trust factor or brand image, nor the bang for buck price win.... and guess what. RDNA4 is a bugfix and then they're going back to the drawing board. Again: no consistency, they even admitted themselves that they failed now.
Don't get me wrong, I never denied Nvidia's performance was better. Just that almost every reviewer only ever cared about that for years. Ran the benchmarks, wrote down the FPS, declared Nvidia the best because FPS(Nvidia) > FPS(AMD). Not to mention the raytracing hype that never turned into anything but AMD sucked at that. Reviewers fueled the "go for FPS and RT" craze even if they had seen where this attitude led in the CPU world: years of overpriced low core count CPUs that were rehashes of the previous gen. Most reviews were anyway just basic bar charts of builtin game benchmarks but enough of those created a general consensus that Nvidia is the only good choice.

AMD took a beating from all sides and (maybe smartly?) chose to focus on the CPU business. Not sure how many companies could have successfully taken on Intel (back when they were not the shadow they are today), but certainly none could take on both Intel and Nvidia simultaneously. Thankfully, more reviewers have recently been noticing, after so many years, that Nvidia's bang per buck and per watt keeps going down, and are wising up to the fact that "better" and "faster" might be different things. Sometimes "faster" doesn't justify recommending a card.

Long story short, we get exactly what our "expert" reviewers sold. They sold Nvidia as the best, users bought Nvidia, now we all get Nvidia. But any "surprise" that a GPU is just a tweak of last generation, or consumes way more power, or costs too much is pointless now or in the near future. You still get more FPS so it must be just as much a win as ever. Let's see if everyone still gives the usual "Editor's pick" for these GPUs, they're the fastest after all.
Posted on Reply
#61
freeagent
Just skimming through..

Is 300 W really that bad for the performance gains? I mean, my 4070 Ti has been doing 300 W for 2 years.

Also, my GTX 580 has a 384-bit bus, but that doesn't mean it's better than my 3070 Ti, or even my 4070 Ti..

No point in arguing about leaks and speculation..
Posted on Reply
#62
Visible Noise
ThomasKHow old were you back then? Like 4?
Let's put it this way - I bought a C64 as a teenager.

You?
Posted on Reply
#63
3valatzy
closeDon't get me wrong, I never denied Nvidia's performance was better. Just that almost every reviewer only ever cared about that for years. Ran the benchmarks, wrote down the FPS, declared Nvidia the best because FPS(Nvidia) > FPS(AMD). Not to mention the raytracing hype that never turned into anything but AMD sucked at that. Reviewers fueled the "go for FPS and RT" craze even if they had seen where this attitude led in the CPU world: years of overpriced low core count CPUs that were rehashes of the previous gen. Most reviews were anyway just basic bar charts of builtin game benchmarks but enough of those created a general consensus that Nvidia is the only good choice.
There is no such consensus. There is too much sponsorship paid by Nvidia in reviews, which are misleading and lack true data.
closeAMD took a beating from all sides and (maybe smartly?) chose to focus on the CPU business.
It was more a matter of luck that they got the right man in the right place: Jim Keller, who designed the Zen architecture that is saving AMD for now.
The thing is that Nvidia is focused on a market which made them a multi-trillion-dollar company, while AMD is nowhere near that, especially with today's threat of exiting GPU competition altogether. One or two weak Radeon GPU generations and AMD will be out of the business.
Which would be fatal for the company.
Posted on Reply
#64
chrcoluk
Why is a Ti card coming at the start of a generation? Aren't they usually later cards?
Posted on Reply
#65
k0vasz
chrcolukWhy is a Ti card coming at the start of a generation? Aren't they usually later cards?
Usually there is quite some performance (and price) gap between an xx70 and an xx80 card, and when AMD launches a card with somewhat better performance than the current xx70, Nvidia launches the xx70 Ti, which is just a little bit better than the newly released AMD counterpart. But this time it's very likely that AMD's top-of-the-line card will be weaker than the 5070, so Nvidia won't have to keep a stronger card in hand.
Posted on Reply
#66
wheresmycar
chrcolukWhy is a Ti card coming at the start of a generation? Aren't they usually later cards?
Yeah, that is unusual, but then again Ti is no longer the top performer in any segment, not since NVIDIA started including "Ti Super" in its lineup. It's just another example of NVIDIA's skill in exploiting sub-tiered product segmentation within already segmented product categories. Perhaps the Ti model comes with 16 GB while the non-Ti has 12 GB, and maybe a TIS gets introduced later to bridge the supposedly wide performance gap between the 70s mid-range and the high-end 80-class giants.

The 5080 could be anywhere between $1200 and $1500. Below $1200, that's a lot of price points to cover for mid-tier cards. Nvidia's gonna have a field day with the 70s, and who knows, maybe a TISD (~DELUXE) variant to comfortably fill that revenue-tasty ~$1100 gap.

King of marketing semantics!
Posted on Reply
#67
tfdsaf
RTX 5090 will cost $2500, RTX 5080 will cost $1400, RTX 5070ti will cost $1000, RTX 5070 will cost $800, RTX 5060ti will cost $500, RTX 5060 will cost $400.

AMD's line up will be: RX 8800XT will cost $600, RX 8700XT will cost $500, RX 8600XT will cost $400, RX 8600 will cost $300, RX 8500XT will cost $200.
Posted on Reply
#68
Marcus L
wheresmycarBelow $1200, that's a lot of price points to cover for mid-tier cards
This statement says it all: below $1200 is "mid-tier" :rolleyes: Honestly, this whole shit show of charging $800+ for what is essentially mid-range-tier GPU performance can fuck right off. I am not playing their games; I will sit on my current HW until it's obsolete or dead, and maybe just stick to retro HW and gaming. It's not like the AAA games bringing these $1k+ cards to their knees even look any better than top titles from almost 10 years ago to justify it, either; no optimisation from game devs. NV couldn't give a fk about budget gamers, who make up 90% of the PC gaming market. I really hope their AI bubble bursts big time and they fall flat on their face. A nice big helping of humble pie is needed in the PC gaming market space; just ask Intel how it tastes.
Posted on Reply
#69
JustBenching
Vayra86But... Nvidia was always better. Simple as that. Even during GCN; AMD drew more power for a slightly better bang for buck, offered more VRAM for a slightly better bang for buck. And that's all AMD wrote. Not ONCE did they take the leading position, either in featureset or in software altogether. Driver regime has been spotty. GPU time to market has no real fixed cadence, its 'whatever happens with AMD' every single time and it never happens to be just a smooth launch. The list of issues goes on and on and on.

The only thing to applaud is AMD brought RDNA2/3 to a good, stable situation. Too bad the products don't sell. Because AMD chose to price them in parity with Nvidia... So at that point, they didn't have the consistency, nor the trust factor or brand image, nor the bang for buck price win.... and guess what. RDNA4 is a bugfix and then they're going back to the drawing board. Again: no consistency, they even admitted themselves that they failed now.
Nah, there was an era when AMD (or was it ATI? frankly, I don't remember) had the overall better product. During the HD 4770 / HD 5850 / HD 6850 / HD 7850 era they had either more performance, better efficiency, or were simply first to market. And first to market is a huge thing in my book, especially at the high end. I don't like buying the top-end card 6 or 12 months in, since the later you buy it, the closer you are to the next big launch, and suddenly your flagship GPU is, well, not so flagship.
Prima.VeraMaybe you are too young to remember, but nVidia definitely wasn't always better.
Just for your homework, search for the AMD Radeon HD 5870. It was so good that it was almost beating nVidia's dual-GPU card, while wiping the floor with nVidia's whole generation of cards. Also, the 5850 was a monster too, and could work in a pair with the 5870. I remember that was my last SLI setup ever, but it was a blast. Good ol' times.
www.techpowerup.com/review/ati-radeon-hd-5870/30.html
My bro had the 5850s in SLI. Good times
Posted on Reply
#70
Marcus L
JustBenchingMy bro had the 5850s in SLI.
No he didn't lol :pimp:
Posted on Reply
#71
wheresmycar
Marcus LThis statement says it all, below 1200 mid-tier :rolleyes: honestly this whole shit show of charging 800+ for yes, what is essentially mid range performance tier GPU's can fuck right off, I am not playing their games, will sit on my current HW until it's obsolete or dead and maybe just stick to retro HW and gaming, not like AAA games that are bringing these 1k+ cards to their knees even look any better than top titles almost 10 years ago to justify it either, no optimisation from game devs, NV could give a fk about budget gamers which make up 90% of the PC gaming market and I really hope their AI bubble bursts big time and they fall flat on their face, a nice big helping of humble pie is needed in the PC gaming market space, just ask Intel how it tastes
Yep, it's a fully lit-up shit show, like a burning dumpster rolling downhill into a fireworks factory (inevitable). Unfortunately, I'm shackled to this thing because I need an upgrade to "desirably" hit my consistent 120 fps+ performance goals at 1440p. I've always been an nV 80-class subscriber, although this time around I ain't bending over for any card above £800. I might settle for a 16 GB+ 70-class card from the 50-series at this price limit, or, if that's not possible and AMD's 8000-series offers something compelling, I'm open to switching.

I couldn't care less about some of the flashy features everyone keeps banging on about, but if Nvidia continues to deliver better power efficiency than AMD, that's a big win in my book, definitely something I wouldn't mind paying a bit extra for. I just can't deal with GPUs that double up as mini-heaters during the summer; nothing annoys me more than sweaty pants during a 2-hour+ gaming session.
Posted on Reply
#73
Marcus L
JustBenchingOkay I guess
He had 5850's in Crossfire :p
Posted on Reply
#74
JustBenching
Marcus LHe had 5850's in Crossfire :p
Oh well, you are correct.
Posted on Reply
#75
Dawora
DavenLooks like the upper part of the line-up is coming into focus

5070 $600 250 W Slightly higher performance than 4070 Super
5070 Ti $800 300 W Slightly higher performance than 4070 Ti Super
5080 $1000 400 W Slightly higher performance than 4080 Super
5090 $2000 600 W 40% higher performance than 4090

Nothing too exciting given the same 4 nm die process except for the 5090. I have no idea how this thing is going to work at 600W if your rig isn't perfectly up to snuff.

As for the lower part of the line-up, Nvidia is definitely waiting to see how Battlemage and RDNA4 perform.
It seems u don't know much, u underrate all the performance those cards will have. U trolling or serious?
usinameReplace "slightly higher" with "same" and you will nail them.
Also add $100 to 5070 and $200 to 70ti and 80
More butthurt AMD fans, just like with every Nvidia news/release.
Random_UserAMD was beating nVidia from the 4870 'till Polaris. Even the dumb GTX 260 was a hot furnace. The 5870 was the first DX11 card, and the first gen of HW tessellation. Also, the AMD/ATi hardware was of much higher quality. But drivers were hit-and-miss for decades, that's true. The picture quality was a bit better on AMD/ATi too, in part from not gimping the colour presets like the "green" company has done since forever.
P.S.: AMD made Mantle, which became Vulkan. And SW RT back in 2016.

There was also the Radeon 9600 Pro. A great, affordable low-end card that made gaming possible for a lot of people. And it had quite a nice OC room. It was like the "2500+ Barton" of video cards.
And the whole AMD community thought Nvidia = RIP because of Mantle. It was so stupid..

Look now!
How did it go?
3valatzyNothing too exciting except the RTX 5090, which will be crazy expensive - maybe $4000 for the GB202 die that is 744 mm^2. That is at the reticle size limit! :kookoo:

The good news is that AMD will have a chance to survive after this, because the RTX 5000 series will mostly not be worth buying..

RTX 4070 - 5888 shaders    | RTX 5070 - 6400 shaders
RTX 4070S - 7168 shaders   |
RTX 4070Ti - 7680 shaders  |
RTX 4070TiS - 8448 shaders | RTX 5070Ti - 8960 shaders
RTX 4080 - 9728 shaders    |
RTX 4080S - 10240 shaders  | RTX 5080 - 10752 shaders
RTX 4090 - 16384 shaders   | RTX 5090 - 21760 shaders
One AMD fan doing some more damage control..
There is no way to know if the 5000 series is worth buying or not, but Nvidia knows what they're doing, so be ready for a surprise.

And comparing shader count alone makes u look stupid
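For what it's worth, the gen-over-gen deltas in the shader list quoted above work out as follows; a quick sketch (all RTX 50-series counts are still rumors):

```python
# Gen-over-gen shader increases from the (rumored) list quoted above.
pairs = {
    "5070 vs 4070": (6400, 5888),
    "5070 Ti vs 4070 TiS": (8960, 8448),
    "5080 vs 4080S": (10752, 10240),
    "5090 vs 4090": (21760, 16384),
}
for label, (new, old) in pairs.items():
    print(f"{label}: {(new - old) / old:+.1%}")
# Only the 5090 jump (~+33%) is large; the rest are single-digit increases.
```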
3valatzyEven the 4090 is not sold for this.

Nice cherry-pick, how long did u search to find those?
phints300W TDP is quite high. Without knowing anything else you can tell the TSMC lithography jump is more minor this time :ohwell:
Or TGP?
3valatzy^^^ RTX 5090 will be $4000, maybe after discounts $3900..
We hope for a big price, because what else can we AMD fans do? AMD can't compete, so let's troll the forums and do some damage control.
Visible NoiseRead what I said. The 7970 reached parity with the 680 two years later.

And are you really going to bring up janky CrossFire on a single board? There’s a reason why that was quickly abandoned. But hey, if “two AMD GPUs beat one Nvidia GPU for a $200 price increase ($275 inflation adjusted)” works for you, I’m not going to say you’re wrong. But then we would bring up the GTX 690, which beat everything AMD would produce for the next 5 years - probably a record.

Funny, inflation adjusted, the 5970 was a $1K card, and nobody screamed about the price then.

No AMD fan cares about price, power usage, noise, or heat if it's an AMD product.

They only QQ if it's Nvidia.
mb194dcSeems really meh; there won't be much of a generational increase. Pretty convinced the next-gen cards are going to flop hard.

There just isn't the demand for ultra-expensive cards anymore.

Compute is what's been driving Nvidia's sales, even for the gaming division: people trying to use consumer cards for their LLMs.

Without a massive breakthrough on the front end, it's going to end very badly.
An AMD fan's wet dream, right?
Keep hoping
tfdsafRTX 5090 will cost $2500, RTX 5080 will cost $1400, RTX 5070ti will cost $1000, RTX 5070 will cost $800, RTX 5060ti will cost $500, RTX 5060 will cost $400.

AMD's line up will be: RX 8800XT will cost $600, RX 8700XT will cost $500, RX 8600XT will cost $400, RX 8600 will cost $300, RX 8500XT will cost $200.
There is no 8800XT coming.

Better to buy Nvidia if u want a faster GPU.
Posted on Reply