Wednesday, December 25th 2024

NVIDIA GeForce RTX 5070 and RTX 5070 Ti Final Specifications Seemingly Confirmed

Thanks to kopite7kimi, we now have what appear to be the final specifications of NVIDIA's upcoming GeForce RTX 5070 and RTX 5070 Ti graphics cards.
Starting with the RTX 5070 Ti: it will feature 8,960 CUDA cores and come equipped with 16 GB of GDDR7 memory on a 256-bit memory bus, offering 896 GB/s of bandwidth. The card is reportedly designed with a total board power (TBP) of 300 W. The Ti variant appears to use the PG147-SKU60 board design with a GB203-300-A1 GPU. The standard RTX 5070 is positioned as a more power-efficient option, with specifications pointing to 6,144 CUDA cores and 12 GB of GDDR7 memory on a 192-bit bus, with 672 GB/s of memory bandwidth. This model is expected to operate at a slightly lower 250 W TBP.
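As a quick sanity check, both leaked bandwidth figures line up with 28 Gbps GDDR7; note the per-pin data rate is inferred from the numbers, not stated in the leak itself. A minimal sketch of the arithmetic:

def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    # Peak memory bandwidth in GB/s: bus width (bits) x per-pin rate (Gbps) / 8
    return bus_width_bits * data_rate_gbps / 8

print(peak_bandwidth_gb_s(256, 28.0))  # RTX 5070 Ti: 896.0 GB/s
print(peak_bandwidth_gb_s(192, 28.0))  # RTX 5070:    672.0 GB/s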

Interestingly, the non-Ti RTX 5070 will be available in two board variants, PG146 and PG147, both utilizing the GB205-300-A1 GPU. While we don't know what the pricing structure looks like, we can see that NVIDIA has chosen to differentiate its SKUs more substantially. The Ti variant not only gets an extra 4 GB of GDDR7 memory, it also gets a whopping increase of nearly 46% in CUDA core count, going from 6,144 to 8,960 cores. While we wait for CES to bring the initial wave of GeForce RTX 50-series cards, the GeForce RTX 5070 and RTX 5070 Ti are expected to arrive later, possibly after the RTX 5080 and RTX 5090 GPUs.
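For reference, the core-count uplift works out to just under 46% per the leaked figures; a one-line check:

ti_cores, base_cores = 8960, 6144
print(f"{(ti_cores - base_cores) / base_cores:.1%}")  # 45.8%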
Sources: @kopite7kimi #1, @kopite7kimi #2

110 Comments on NVIDIA GeForce RTX 5070 and RTX 5070 Ti Final Specifications Seemingly Confirmed

#51
Onasi
Ruru: About the 1080 Ti, I have a little tinfoil theory: first it crushed the 3060, but now it loses to it. Driver optimization, or lack thereof, for the 1080 Ti?
Nah, it's just the Kepler effect, just less pronounced this time: with time, newer games come out, get tested, and perform better on newer architectures, since those are better suited to the newer games. It's always a bit of a delayed reaction, since (obviously) games that come out while generation X is relevant were actually developed several years prior, when generation Y was the target. So it goes.
#52
Ruru
S.T.A.R.S.
Onasi: Nah, it's just the Kepler effect, just less pronounced this time: with time, newer games come out, get tested, and perform better on newer architectures, since those are better suited to the newer games. It's always a bit of a delayed reaction, since (obviously) games that come out while generation X is relevant were actually developed several years prior, when generation Y was the target. So it goes.
Hey, they haven't even killed Maxwell yet; I just installed the newest drivers for my brother's GTX 970 while I was at my parents' celebrating Christmas. My bet is that Pascal still has at least 2 years of support.

Could it be the first generation with 10 years of support? o_O
#53
Onasi
Ruru: Could it be the first generation with 10 years of support?
Depends on what one considers support. Technically Kepler, while not getting the newer drivers, had the legacy branch with security and bug fixes all the way to this September. NV, whatever one might say about them, is generally pretty good at long-term support of their hardware.
#54
Ruru
S.T.A.R.S.
Onasi: Depends on what one considers support. Technically Kepler, while not getting the newer drivers, had the legacy branch with security and bug fixes all the way to this September. NV, whatever one might say about them, is generally pretty good at long-term support of their hardware.
I mean that you'll get the newest drivers just like with your modern card.

I'm not an AMD fanboy, more of an NV hater, but I gotta admit that NV has better support when it comes to ~10-year-old cards. Though Amernime exists for GCN, those aren't official AMD drivers.
#55
Neo_Morpheus
Zazigalka: I wonder what you will say when AMD releases the long-awaited FSR4 that leverages RDNA3 AI hardware.
Oh well, you know, it's justified, because, err, reasons and stuff.

lol.
#56
Zazigalka
Neo_Morpheus: Oh well, you know, it's justified, because, err, reasons and stuff.

lol.
Progress is a good thing
#57
hsew
Ruru: I mean that you'll get the newest drivers just like with your modern card.

I'm not an AMD fanboy, more of an NV hater, but I gotta admit that NV has better support when it comes to ~10-year-old cards. Though Amernime exists for GCN, those aren't official AMD drivers.
Watch out now, AMD’s software has always been just as good as nVidia’s, and you all should know it!
#58
phints
The 5070 having a 25% TDP increase over the 4070 is enormous. Not good, but perf/watt will still improve a bit. Guess this is how we'll have to get performance gains without a more impactful improvement in TSMC lithography.
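(A quick check of that claim, taking the RTX 4070's official 200 W TDP, which the comment doesn't state:)

tdp_4070_w, tdp_5070_w = 200, 250
print(f"{(tdp_5070_w - tdp_4070_w) / tdp_4070_w:.0%}")  # 25%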
#59
Ruru
S.T.A.R.S.
hsew: Watch out now, AMD's software has always been just as good as nVidia's, and you all should know it!
I've never had anything to complain about with either camp's drivers, so I'm hella neutral on that. Mkay, I had black screens with an R9 290, but that was almost a decade ago; my 2nd PC's 6700 XT works like a charm, and I used it for about 1½ years in my main PC without issues. The drivers had also improved by the time I ran R9 290 CF in 2018, and that worked perfectly.

Never installed GFE/NV App, and Radeon Software is kinda complex but good. Now I have an AMD/NV and an Intel/AMD setup, yet nothing to complain about with the drivers.
#60
evernessince
gffermari: For those who need an NVIDIA GPU (for CUDA) with loads of memory, the company created the 3060 12 GB and 4060 Ti 16 GB. They were never meant to make sense for gaming, since both cards are quite slow anyway, and that usually becomes the limit before the VRAM runs out.

My issue is that the 5070 will be able to run most games at decent PT settings, but the VRAM will kill it if nothing is there to handle the missing VRAM.
I was watching a video on the Indiana Jones game where, at 1440p with DLSS Quality + FG at PT medium, the VRAM needed is 14 GB+.
It's bad to have a fast card and not enough VRAM for it.

I wouldn't call 12 and 16 GB loads of VRAM. 16 GB is merely enough to run all current titles without VRAM compromises, while 12 GB is good enough for most games. When you're buying a new GPU at $500+, you shouldn't have to consider making compromises on existing games right away. That doesn't reflect well on the card's longevity.
#61
Ruru
S.T.A.R.S.
evernessince: I wouldn't call 12 and 16 GB loads of VRAM. 16 GB is merely enough to run all current titles without VRAM compromises, while 12 GB is good enough for most games. When you're buying a new GPU at $500+, you shouldn't have to consider making compromises on existing games right away. That doesn't reflect well on the card's longevity.
10GB *cough*

Feels like the 3080 was made purposely obsolete.
#62
freeagent
My oldest kid is still using my old 3070 Ti; for 1080p it's not a bad card. It gets way more hate than it deserves, IMO.

Edit:

I am about to drag my 980 Classified out of retirement to use in my youngest's soon-to-be-built first rig..

Should be ok for Roblox :D
#63
Ruru
S.T.A.R.S.
freeagent: My oldest kid is still using my old 3070 Ti; for 1080p it's not a bad card. It gets way more hate than it deserves, IMO.
My little brother uses a 970...

The 3070 Ti is a good card; the hate it gets is for its 8 GB of VRAM. :/
#64
freeagent
Ruru: My little brother uses a 970...

The 3070 Ti is a good card; the hate it gets is for its 8 GB of VRAM. :/
Ninja edit :pimp:
#65
oxrufiioxo
Ruru: My little brother uses a 970...

The 3070 Ti is a good card; the hate it gets is for its 8 GB of VRAM. :/
It's the price with the 8 GB, just like the price of the 4070 Ti with 12 GB..... They were both too expensive at launch to have so little VRAM.

Owners of those cards seem to upgrade every generation, so NVIDIA knows what they're doing....
#66
TommyT
It's AMD's fault we have those 5070 and 5070 Tis...

...like it was Intel's fault that we used 4 cores/8 threads for years... till Zen came out.
#67
freeagent
oxrufiioxo: It's the price with the 8 GB, just like the price of the 4070 Ti with 12 GB..... They were both too expensive at launch to have so little VRAM.

Owners of those cards seem to upgrade every generation, so NVIDIA knows what they're doing....
GTX 580 Matrix Platinum --> GTX 980 Classified --> RTX 3070 Ti --> RTX 4070 Ti.. I normally wait a few years.. kids want to use my computer though.. nope..

:laugh:
#68
oxrufiioxo
freeagent: GTX 580 Matrix Platinum --> GTX 980 Classified --> RTX 3070 Ti --> RTX 4070 Ti.. I normally wait a few years.. kids want to use my computer though.. nope..

:laugh:
I was mostly just pointing out why people are harsh or have negative sentiment towards them.

It's up to you and every consumer to decide if they're actually worth their hard-earned $$$.

It's a great game plan by NVIDIA: give the cards just enough VRAM that they aren't garbage, but not so much that people want to hold onto them for multiple generations. AMD would be doing the same if people actually bought their cards....

Nobody should be happy that the 5070, a likely $600 card in 2025, is going to come with 12 GB of VRAM, and that has zero to do with whether that's enough or not.
#69
Ruru
S.T.A.R.S.
oxrufiioxo: It's the price with the 8 GB, just like the price of the 4070 Ti with 12 GB..... They were both too expensive at launch to have so little VRAM.

Owners of those cards seem to upgrade every generation, so NVIDIA knows what they're doing....
Feels like I'm stuck in the middle when it comes to older cards with my 3080 10 GB. Though Iceberg Tech made a video recently, and in the majority of games the difference between this and the 12 GB version was barely noticeable (read: I don't play any of the games where there was a difference, only the RE4 remake).
freeagent: Ninja edit :pimp:
U were a real ninja. :D
#70
sepheronx
Funny how some game developers have put limitations on using PT when it can be hacked, enabled, and made to work on cards with 10 GB of VRAM:

nvidia/comments/1hbak60

So I also think game developers are working with NVIDIA to create artificial limitations. 12 GB should actually be enough for a while; I just think developers are working in tandem with other companies to create limitations that aren't necessary.
#71
oxrufiioxo
Ruru: Feels like I'm stuck in the middle when it comes to older cards with my 3080 10 GB. Though Iceberg Tech made a video recently, and in the majority of games the difference between this and the 12 GB version was barely noticeable (read: I don't play any of the games where there was a difference, only the RE4 remake).

U were a real ninja. :D
For me, it's always been the stagnation that I hate: it's been 8 years and we still have $300-400 8 GB GPUs, and it seems 12 GB in the $600-ish range is the next stagnation.

Progress is always a good thing; it was the same with the quad-core era, which dragged on too long.

Developers are always going to target the lowest common denominator, so it's hard to have progress when the baseline, the segment 95% of people actually purchase from, has been garbage for half a decade. It's why I love what Intel is doing: yeah, the card isn't fast, but at least it showed progress in the segment that matters most.
#72
Dahita
I just want a decent $500 16 GB card for 1440p, or a $1,000 24 GB one. Is that really too much to ask for in 2025?
#73
esserpain
The 5070 Ti is GB203, while the 5070 is GB205. The latter is two tiers down the chip stack compared to the former, despite being only one tier down by name (or even half a tier, if you argue that the Ti designation isn't worth a full tier).

NVIDIA couldn't be more blatant if they tried about moving the "midrange" and "low-end" parts of the stack down in the naming department. They should've called the 5070 the 5060 Ti or smth.
#74
TrekEmondaSl
Cyberpunk 2077 at 4K with PT, FG, and DLSS needs 16 GB of VRAM ;(
#75
TheinsanegamerN
sepheronx: Funny how some game developers have put limitations on using PT when it can be hacked, enabled, and made to work on cards with 10 GB of VRAM:

nvidia/comments/1hbak60

So I also think game developers are working with NVIDIA to create artificial limitations. 12 GB should actually be enough for a while; I just think developers are working in tandem with other companies to create limitations that aren't necessary.
More likely they're catering to the millions of gamers who still clutch to their 8 GB cards, crying about how anything that uses more than 8 GB is poorly made and everyone else is greedy.