Wednesday, December 25th 2024

NVIDIA GeForce RTX 5070 and RTX 5070 Ti Final Specifications Seemingly Confirmed

Thanks to kopite7kimi, we can now piece together what appear to be the final specifications of NVIDIA's upcoming GeForce RTX 5070 and RTX 5070 Ti graphics cards.
Starting with the RTX 5070 Ti: it will feature 8,960 CUDA cores and come equipped with 16 GB of GDDR7 memory on a 256-bit memory bus, offering 896 GB/s of bandwidth. The card is reportedly designed with a total board power (TBP) of 300 W. The Ti variant appears to use the PG147-SKU60 board design with a GB203-300-A1 GPU. The standard RTX 5070 is positioned as a more power-efficient option, with specifications pointing to 6,144 CUDA cores and 12 GB of GDDR7 memory on a 192-bit bus, with 672 GB/s of memory bandwidth. This model is expected to operate at a slightly lower 250 W TBP.
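The quoted bandwidth figures follow directly from bus width and per-pin data rate. A minimal sanity check, assuming 28 Gbps GDDR7 modules (the data rate is implied by the totals, not stated in the leak):

```python
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bits / 8) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# RTX 5070 Ti: 256-bit bus at an assumed 28 Gbps
print(bandwidth_gbs(256, 28.0))  # 896.0
# RTX 5070: 192-bit bus at the same assumed data rate
print(bandwidth_gbs(192, 28.0))  # 672.0
```

Both results line up with the leaked totals, which is why a 28 Gbps module speed looks like a safe inference.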

Interestingly, the non-Ti RTX 5070 will be available in two board variants, PG146 and PG147, both utilizing the GB205-300-A1 GPU. While we don't know what the pricing structure looks like, we can see that NVIDIA has chosen to differentiate its SKUs more substantially. The Ti variant not only gets an extra 4 GB of GDDR7 memory, it also gets a whopping 46% increase in CUDA core count, going from 6,144 to 8,960 cores. While we wait for CES to see the initial wave of GeForce RTX 50 series cards, the GeForce RTX 5070 and RTX 5070 Ti are expected to arrive later, possibly after the RTX 5080 and RTX 5090.
Sources: @kopite7kimi #1, @kopite7kimi #2

132 Comments on NVIDIA GeForce RTX 5070 and RTX 5070 Ti Final Specifications Seemingly Confirmed

#76
sepheronx
TheinsanegamerNMore likely they're catering to the millions of gamers that still clutch to their 8GB cards crying about how anything that uses more than 8GB is poorly made and everyone else is greedy.
What?

How does that make sense to anything I said?

It's the complete opposite. The patch isn't made by any company but by community members, so that users of GPUs with less than 12GB of VRAM are able to enable and use path tracing. It's the game developers (possibly with aid from Nvidia) disabling it for anything below 12GB of VRAM.

It also proves path tracing can work with less VRAM.
Posted on Reply
#77
TheinsanegamerN
PalladiumI got my GTX460 and HD5850 for $110 each, and AAA didn't suck as hard then as it does now.
I remember those days, when we were in the grip of the Great Financial Crisis and you were damn lucky to get a job flipping burgers for $8 an hour while tens of millions lost their homes. Pushing hundreds of job applications with no responses.
AtiAmdPeople dont really have a choice...
They do, though. AMD is a choice. Not buying is also a choice.

People act like you HAVE to twist your arm into the meat grinder because you HAVE to upgrade and buy the new shiny bing bing wahoo. That consumerist mindset is why the average American has over $7k in CC debt.
RuruHey, they haven't even killed Maxwell yet; I just installed the newest drivers for my brother's GTX 970 as I was at my parents' celebrating Christmas. My bet is that Pascal still has at least 2 years of support.

Could it be the first generation with 10 years of support? o_O
It's not the first; the GeForce 8000 series got 10 years. A decade has been pretty typical for Nvidia for a while.
Posted on Reply
#78
Ruru
S.T.A.R.S.
TheinsanegamerNIt's not the first; the GeForce 8000 series got 10 years. A decade has been pretty typical for Nvidia for a while.
Maybe, though it feels like Fermi and Kepler were dropped hella early.
Posted on Reply
#79
hsew
esserpainThe 5070 Ti is GB203, while the 5070 is GB205. The latter is two tiers down in the chip stack compared to the former, despite being only one tier down by name (or even half a tier, if you argue that the Ti designation isn't worth a full tier).

NVIDIA couldn't be more blatant if they tried about moving the "midrange" and "low-end" parts of the stack down in the naming department. They should've called the 5070 the 5060 Ti or smth.
Assuming there will be a gaming GB204 SKU, or that there won't be a 5070 Super…
Posted on Reply
#80
freeagent
RuruMaybe, though it feels like Fermi and Kepler were dropped hella early.
In the big picture, they were, but not by much. Like I said before, I used my GTX580 for 8 long, painful years :D
Posted on Reply
#81
TheinsanegamerN
RuruMaybe, though it feels like Fermi and Kepler were dropped hella early.
Fermi got 9 years of drivers, and so did Kepler. If that's "hella early", you are gonna be pearl clutching when you find out that AMD recently dropped cards after just 6 years, and less than 5 years back in the Evergreen days. And the less said about Intel GPU support the better.

And frankly, who was still using Fermi on anything even remotely modern in 2019? Or Kepler in 2021? Even with the end of bugfixes, youtubers showed in 20/21 that Fermi still worked in newer games; they were unplayably slow, but they still worked. It took another couple of years before the cards just flat out didn't work. Of course, they still worked fine for anything 2018 and earlier. Same with Kepler.
Posted on Reply
#82
Ruru
S.T.A.R.S.
freeagentIn the big picture, they were, but not by much. Like I said before, I used my GTX580 for 8 long, painful years :D
Maybe we met in BF4 (if you played it?) with Thermis in spring 2014. :D 2500K @ 4.5, 16GB 1600MHz, 470 SOC @ 870/1100 was a dope system even then. :toast:
Posted on Reply
#83
freeagent
RuruMaybe we met in BF4
I have a few BF games that I pop into now and then. Could be..
Posted on Reply
#84
Broken Processor
I was looking to upgrade my 6800 XT this gen, but I simply don't see a suitable upgrade path so far. Prices will dictate what I do, but I'm strongly looking at a used 4090 unless the 5080 is really impressive and priced right - though going by the 40 series, that's a pipe dream.

I had three 7900 XTXs and each one had issues, so I'm not going AMD even if they had a product to suit me, but Nvidia is looking equally bad for different reasons. It's a suck ass time to be a PC enthusiast.
Posted on Reply
#85
Vayra86
oxrufiioxoI was mostly just pointing out why people are harsh or have negative sentiment towards them.

It's up to you and every consumer to decide if they are actually worth their hard-earned $$$.

It's a great game plan by Nvidia: give the cards just enough VRAM that they aren't garbage, but not so much that people want to hold onto them for multiple generations. AMD would be doing the same if people actually bought their cards....

Nobody should be happy that the 5070, a likely $600 card in 2025, is going to come with 12GB of VRAM - and that has zero to do with whether that's enough or not.
This is the thing, really. It's never a criticism of anyone in particular; it's just a shame we get so little hardware for that money, and that, I think, is something we should stress every time. Because underneath it we really need to realize this is not the market functioning as it should. The game is rigged now, and the margins made are absolutely silly - margins made on OUR backs with OUR money.
Dyatlov AHow do you guys know the performance from these numbers? The 5070 Ti should beat the 4090 by a little margin, just like the 4070 Ti beats the 3090.
Nah. IF the leaks about shader counts are true, the 5080 is half the GPU that a 5090 is going to be; 10k vs 21k shaders. There is no way the 5080's inflated 350W will cover that gap by any notable margin, even if you give it say 10-15% from memory upgrade and some 10% from architecture - because those same upgrades also apply to the 5090.

It's also logically the best deal for Nvidia because it keeps their previous halo product 'quite halo ish' by letting the 5080 land just under it. They're not cannibalizing their own stack, but they do EOL it rapidly, creating persistent value for the past gen's 4090 - while killing the value of anything under it and making the 5080, despite being just a meagre gen-to-gen upgrade, a wanted card for being a cheaper 4090.
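The back-of-envelope reasoning above can be sketched with the post's own round figures (~21k vs ~10k shaders). The scaling exponent below is an assumption - a common rule of thumb for sub-linear shader scaling, not a measured value - and any memory or architecture uplift that applies equally to both cards cancels out of the ratio, which is the poster's point:

```python
def relative_perf(shaders_a: int, shaders_b: int, scaling_exponent: float = 0.75) -> float:
    """Crude shader-count scaling estimate. Real scaling also depends on
    clocks, bandwidth, and power limits, so treat this as a rough bound."""
    return (shaders_a / shaders_b) ** scaling_exponent

# Post's round figures: ~21k shaders (5090) vs ~10k (5080).
gap = relative_perf(21_000, 10_000)
print(f"{gap:.2f}x")  # roughly 1.74x even with sub-linear scaling
```

Even with generous sub-linear scaling, the 5090 lands far above anything a 350 W 5080 could plausibly close, which supports the argument that the 5080 slots in just under the 4090.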
Posted on Reply
#86
nguyen
Onasi@nguyen
I also enjoy science fiction, but no, that's not happening. The GA102 was already essentially as big and dense as the process allowed, hence the poor yields. And comparing the 4090 to the 1080 Ti vs 2070 situation is silly - the 1080 Ti was absolutely not as far removed from the rest of the stack as the 4090 is. Not to mention that from Turing onwards the die sizes are not comparable to previous architectures - those chips didn't have RT and Tensor cores. And that was also on a new node. None of those factors will be in play here. Can the 5070 Ti come close, like within a 10-15% range? Potentially. Being faster? Not happening. Again, we're not getting 50% generational uplifts anymore.

tl;dr: The 5070 Ti will be 4080S level or a bit faster.
You couldn't be more wrong; the 1080 Ti was even further removed from the rest of the stack than the 4090 in terms of performance.


The 1080 Ti was 35% faster than the 1080; meanwhile, the 4090 beats the 4080 by 31%.

The 2070 Super has 30% fewer CUDA cores and 10% less bandwidth than the 1080 Ti, yet still matches the 1080 Ti in older DX11 games.

TSMC 12nm (2070S) was only a tweaked node vs TSMC 16nm (1080 Ti); TSMC presented them as the same node, just like its N5/N4 nodes.

Nvidia has been very consistent in the past decade at making the 104 die perform just as well as the previous gen's 102 die, but now we will see the 103 die (GB203) carry that torch.
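The quoted deficits can be sanity-checked against the commonly listed spec-sheet numbers (2,560 cores / 448 GB/s for the RTX 2070 Super vs 3,584 cores / 484 GB/s for the GTX 1080 Ti - figures from public spec sheets, not from this thread):

```python
cores_2070s, bw_2070s = 2_560, 448    # RTX 2070 Super: CUDA cores, GB/s
cores_1080ti, bw_1080ti = 3_584, 484  # GTX 1080 Ti: CUDA cores, GB/s

# Fractional deficit of the 2070 Super relative to the 1080 Ti
core_deficit = 1 - cores_2070s / cores_1080ti
bw_deficit = 1 - bw_2070s / bw_1080ti
print(f"cores: {core_deficit:.0%} fewer, bandwidth: {bw_deficit:.0%} less")
# → cores: 29% fewer, bandwidth: 7% less
```

So the "30% fewer cores" claim checks out almost exactly, while the bandwidth gap is a bit smaller than the quoted 10%.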
Posted on Reply
#87
forman313
gffermari5070 and 12GB ...is not that bad, if nVidia introduces something to compensate.
You mean like loading lower resolution and/or quality textures?

In some games there is a big difference in image quality between the 4060 8GB and 16GB, or compared with a 12GB-equipped competitor. I have not seen too many reviewers point it out lately.
I couldn't care less that e.g. the RTX 4060 has a 10% higher framerate than the Arc B580 in some game when the image quality is worse. How long before we see the same thing happening in 12 vs 16GB VRAM scenarios?

LOD hacks and other tricks like them are not fair. Players and benchers get banned, rankings deleted, and accounts disabled - not to mention all the personal attacks and hate mail after numerous videos are made about them, calling them all sorts of stuff. Deliberate use of unfair advantages is cheating.
Posted on Reply
#88
gffermari
forman313You mean like loading lower resolution and/or quality textures?
No, I mean that if they bring neural texture compression this gen, 12GB would be enough for a x70-class GPU.


Posted on Reply
#89
R-T-B
Macro DeviceHighly doubt they improved IPC.
I'd be shocked if they didn't.
Posted on Reply
#90
freeagent
I didn't watch the videos, but that last one reminded me of this guy :D

Posted on Reply
#91
oxrufiioxo
R-T-BI'd be shocked if they didn't.
Unless they release a part with an identical SM count and memory bandwidth to Ada, it would be impossible to actually tell.

I'm also guessing they'd at least hit 10-15% IPC, though that's super hard to even measure with GPUs outside of synthetic workloads. My guess is path tracing will be the best measuring stick as far as gaming goes.

At the end of the day raw performance without upscaling or frame generation is all that matters.
Posted on Reply
#92
Ruru
S.T.A.R.S.
RuruHey, they haven't even killed Maxwell yet; I just installed the newest drivers for my brother's GTX 970 as I was at my parents' celebrating Christmas. My bet is that Pascal still has at least 2 years of support.

Could it be the first generation with 10 years of support? o_O
Need to quote myself, but damn, the 750 Ti is almost 11 years old and still supported. Talk about long driver support.


As much as I do hate Nvidia, I have to give them credit for that (still, RIP to Fermi and Kepler, you were forgotten too early).
freeagentI didn't watch the videos, but that last one reminded me of this guy :D

Oh boy I miss them E6400 days...
Posted on Reply
#93
Macro Device
RuruRIP to Fermi and Kepler, you were forgotten too early
In all seriousness, their gaming performance hasn't been enough since like 2019, so it's not a big deal. You wouldn't be able to reasonably play recent titles anyway. Let the grandpas go. Phasing Maxwell out next year wouldn't be a complete bummer, either.
Posted on Reply
#94
Ruru
S.T.A.R.S.
Macro DeviceIn all seriousness, their gaming performance hasn't been enough since like 2019, so it's not a big deal. You wouldn't be able to reasonably play recent titles anyway. Let the grandpas go. Phasing Maxwell out next year wouldn't be a complete bummer, either.
780 Ti would still have raw performance.
Posted on Reply
#95
Macro Device
Ruru780 Ti would still have raw performance.
It's very weak at DX12 and isn't really good at Vulkan, so it's good-quality phase-out material. It has drivers for virtually all games it can play well. Also, don't underestimate its TGP: this GPU is a really fiendish beast.

With the GTX 1070 / RX 5700 and give-or-take similar GPUs being that cheap, I don't see that as an issue. Let it go.
Posted on Reply
#96
igormp
Onasi@nguyen
I also enjoy science fiction, but no, that’s not happening. The GA102 was already essentially as big and dense as the process allowed. Hence the poor yields. And comparing 4090 to the 1080ti vs 2070 situation is silly - the 1080ti was absolutely not as far removed from the rest of the stack as the 4090 is. Not to mention that Turing and onwards the die sizes are not comparable to previous architectures - those chips didn’t have RT and Tensor cores. And that was also on a new node. None of those factors will be in play here. Can the 5070 Ti come close, like in a 10-15% range. Potentially. Being faster? Not happening. Again, we’re not getting 50% generational uplifts anymore.

tl:dr The 5070Ti will be 4080S level or a bit faster.
I actually did a graph once comparing the biggest die of each generation and the percentage of it that each consumer product represented (not sure if I posted it in this forum before):


Blackwell is supposed to make this graph even bigger, it'll be fun to update it once the products are available.
esserpainThe 5070 Ti is GB203, while the 5070 is GB205. The latter is two tiers down in the chip stack compared to the former, despite being only one tier down by name (or even half a tier, if you argue that the Ti designation isn't worth a full tier).

NVIDIA couldn't be more blatant if they tried about moving the "midrange" and "low-end" parts of the stack down in the naming department. They should've called the 5070 the 5060 Ti or smth.
I had also thought about this, but there is no GB204 chip so far (x04 dies were what the previous x70 models usually used), so it seems like the chip just got downgraded a tier - but the segmentation between the 70 Ti and the 70 is not news.
Posted on Reply
#97
Legacy-ZA
TrekEmondaSlCyberpunk 2077 4K PT FG DLSS needs 16GB VRAM ;(
Sad that we get so little VRAM. I want to believe that they came up with a better texture compression technique without loss of image quality... but this is nGreedia. We have to curb our expectations :S
freeagentIn the big picture, they were, but not by much. Like I said before, I used my GTX580 for 8 long, painful years :D
Hey, that was a good GPU, but I got the Gainward one with 3GB VRAM, so it was less painful and lasted longer. Can't have that anymore, eh, leather jacket man? It lasted me until I could get a GTX 1070. :D
Posted on Reply
#98
Outback Bronze
freeagentLol. People complaining about prices in every single thread need to find a new hobby :laugh:
If I get a new hobby, I will probably complain about its pricing too :laugh:
Posted on Reply
#99
TumbleGeorge
3valatzyThey can afford to do it now, but the greed rules them all.

Have you noticed how expensive PC gaming in general is? One game such as DiRT Rally 2.0 costs over $180 with the additional paid packs - every car costs money, it's crazy.
I wonder if it would be profitable to open a club for anonymous PC enthusiasts and gamers, where they could be cured of their addiction to spending money on things they are not obliged to buy. If the treatment is successful, I am convinced that prices will also drop back to normal values.
Posted on Reply
#100
oxrufiioxo
Outback BronzeIf I get a new hobby, I will probably complain about its pricing too :laugh:
Pricing definitely sucks for everything but they really are raw dogging us no lube....
Posted on Reply