
NVIDIA GeForce RTX 5070 Ti Leak Tips More VRAM, Cores, and Power Draw

Most of that sounds fine (spec-wise); I'm more concerned about the price, because I'll bet this is going to be around the $900 mark or higher. Every time a card gets a bump in specs in some way, they bump the price up. I wouldn't be surprised if the xx70 series starts to become the $1K price-point cards (meaning some below, some above).
 
Does anyone really care? It will be what it is and Nvidia is going to charge whatever it wants. The MSRP would have to be $499 for it to matter to me. I'm much more impressed with the B580. Let's see where the vanilla 5060 lands. (not very hopeful)
Same, I really want to see what the B750 and B770 will put on the table. Nvidia needs to lower prices to appeal to me. I want them to prove they understand that people can't throw €700 at a mid-range GPU. €700 to €1500 should cover the high tier up to the best card of the generation, not €1000 to €2000.
 
But core clocks are 2800-2900 MHz,
plus IPC gains and more memory bandwidth.

So it will be much faster
It won't be. A 4070 Ti Super already runs at about 2850 MHz, and the faster memory will give only a small boost, as Ada already has a large cache; overall it will be some 15% faster at most. I'd be amazed if it even matches a 4080.

Question. Will this beat the 4090 or not??
Most certainly not.
 
My base expectation is 10% more cores, 10% higher clock speed and 10% bump (at 4K mostly) from memory bandwidth. 1.1*1.1*1.1=1.33x performance. I think this is perfectly realistic and the only thing that can spoil the fun is price.
Sounds reasonable, give or take a few % depending on the tier you're looking at.

I personally think they'll clock in a bit lower. +20%.
I don't really see where the remaining % comes from, but you might be right about it being a 4K advantage. Very unlike Nvidia though; they usually tend to scale pretty linearly and excel at lower res - and you can clearly see the 4090 is a shader king, so memory isn't the issue from the 4070 Ti Super on up. And Blackwell is again showing that the x90 is going to have a massive shader count advantage; it's projected to be twice an x80. That doesn't point at VRAM being the limiter at the high end.
 
> Supposedly, the RTX 5070 Ti will also see a bump in total CUDA cores, from 8448 in the RTX 4070 Ti to 8960 in the RTX 5070 Ti.

@Cpt.Jank The 4070Ti Super has 8448 cores, the 4070Ti has 7680 cores.

Upgrade seems a bit mid. If we compare it against (and assume it's priced similarly to) the 4070 Ti, the uplift should be less than 25%.
The additional VRAM is nice but the extra memory speed is a bit of a meme.

Does make it somewhat comparable to a 4080 though.

But I think this thing will get an MSRP of $850.
Nuts! Just corrected, thanks for pointing that out. I thought it seemed like a very minor uptick lol. Should've double checked.
 
Performance moving down 1 tier has been the norm for over a decade now. I doubt anyone will be upset at getting a $650-750 4080 super.

Well, let me rephrase: MOST people will not be upset. There will be those angry that Nvidia doesn't give them a 4090 at $200, but meh. Can't please everyone.

Yes and no. Prices have gone up, and while their newest tier will slide performance down a segment, the price-to-performance, or value, has dropped significantly. And somehow people are just going to gloss over how horrendous the 4000 series is, offering almost nothing to the consumer aside from the 4090. Yet here we are making excuses for Nvidia; no wonder the trend continues.
 
My base expectation is 10% more cores, 10% higher clock speed and 10% bump (at 4K mostly) from memory bandwidth. 1.1*1.1*1.1=1.33x performance. I think this is perfectly realistic and the only thing that can spoil the fun is price.
I think we can be more specific. I will compare to 4070 TiS, which is the closest comparison as a cut-down xx103 chip. The core count is +6% (says the leak). Clocks can't go up more than 5-10% even with +25% TBP since they're already at the tippy-top with the 40 series and Nvidia isn't meaningfully changing TSMC nodes. Memory is 33% faster.

Core count perf benefit should be near-linear, like +5%. Comparing core counts vs. avg. FPS @4K for 4070 TiS / 4080 / 4080 S:
  • 8448 / 9728 (+15%) / 10240 (+21%)
  • 79.7 / 92.5 (+16%) / 93.7 (+18%)
And for reference, memory speeds (Gbps) and avg. core clock (MHz):
  • 21 / 22.4 (+7%) / 23 (+10%)
  • 2686 / 2758 (+3%) / 2698 (+0%)
4080 is +15, +7, +3 and gives +16 perf.
4080 S is +21, +10, +0 and gives +18 perf

My prediction:

5070 Ti is +6, +5, +33, and gives +20 perf over the 4070 TiS. Add +2 for architectural improvements, +2 for TBP increase*, so +24% total.

* W1zzard said the 4080 S wasn't power limited, so I don't think the TBP increase will give much performance benefit by itself.
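To make that estimate concrete, here is a quick back-of-envelope script; the gains are the leaked/assumed figures from above, and the memory weighting is purely my own guess:

```python
# Rough, unofficial estimate of the 5070 Ti's uplift over the 4070 Ti Super.
# Gains: leaked +6% cores, assumed +5% clocks, +33% memory speed.
# The memory weight is a guess: Ada/Blackwell lean heavily on a large L2 cache,
# so only part of the extra bandwidth should show up as FPS, mostly at 4K.

core_gain  = 0.06   # leaked CUDA core increase vs. 4070 Ti Super
clock_gain = 0.05   # assumed boost clock increase
mem_gain   = 0.33   # GDDR7 bandwidth increase
mem_weight = 0.25   # guess: ~1/4 of the bandwidth gain turns into FPS at 4K
arch_bonus = 0.02   # assumed small architectural improvement
tbp_bonus  = 0.02   # assumed small benefit from the higher power limit

uplift = ((1 + core_gain) * (1 + clock_gain) * (1 + mem_gain * mem_weight)
          * (1 + arch_bonus) * (1 + tbp_bonus)) - 1
print(f"Estimated uplift over the 4070 Ti Super: {uplift:.0%}")  # roughly +25%
```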
 
It won't be. A 4070 Ti Super already runs at about 2850 MHz, and the faster memory will give only a small boost, as Ada already has a large cache; overall it will be some 15% faster at most. I'd be amazed if it even matches a 4080.


Most certainly not.
So you think an overclocked 4070 Ti is only 5% slower than an RTX 5070 Ti?

Or are you just hoping that because maybe you don't like Nvidia?

The RTX 5070 Ti will be faster than the RTX 4080, whether you like it or not.
 
Yes and no. Prices have gone up, and while their newest tier will slide performance down a segment, the price-to-performance, or value, has dropped significantly. And somehow people are just going to gloss over how horrendous the 4000 series is, offering almost nothing to the consumer aside from the 4090. Yet here we are making excuses for Nvidia; no wonder the trend continues.
The 4080 outperforms the 3090 Ti, and was $300 less than the 3090. The 4070 outperformed the 3080, and was $600 to the 3080's accidental price of $700. Let's also not forget that, unlike AMD, Nvidia offered an increase in performance per CU with the 4000s, which was very noticeable with ray tracing.

So...thanks for proving me right, I guess?
It won't be. A 4070 Ti Super already runs at about 2850 MHz, and the faster memory will give only a small boost, as Ada already has a large cache; overall it will be some 15% faster at most. I'd be amazed if it even matches a 4080.
Nvidia has been pretty consistent with matching a tier down every generation. Even the sucky RTX 2000s did that, with the 2080 being slightly faster than the old 1080 Ti.
 
Well, if the 5080 is 4090 +5%, then the 5070 Ti is 4090 -5%, and it could be if it retains the same memory, ROPs and L2 cache. 20% more CUDA cores translates to 10% more performance.
 
The 4080 outperforms the 3090 Ti, and was $300 less than the 3090. The 4070 outperformed the 3080, and was $600 to the 3080's accidental price of $700. Let's also not forget that, unlike AMD, Nvidia offered an increase in performance per CU with the 4000s, which was very noticeable with ray tracing.

So...thanks for proving me right, I guess?

Nvidia has been pretty consistent with matching a tier down every generation. Even the sucky RTX 2000s did that, with the 2080 being slightly faster than the old 1080 Ti.

There are some reading comprehension issues on your end, it seems. They offered a smaller performance increase at a higher cost gen over gen, which is LESS value. A 4070 offers the same performance as a 3080, yet 3070 to 4070 was a $100 markup over MSRP gen to gen. 3080 to 4080 was an insane $500 markup in MSRP (a 71% cost increase) for an 80-to-80 series generational jump. You are literally paying more for less. It's not a hard concept to figure out…
 
Nvidia offered an increase in performance per CU with the 4000s, which was very noticeable with ray tracing.
I'm curious: do you have some like-for-like numbers? I recall 40 series optimized BVH traversal for faster ray tracing, but the overall benefit was modest. Random TPU ray tracing benchmark says 40 series takes 27-31% perf hit with RT on, 30 series takes 31-32% hit. Only other CU improvements I'm aware of are Optical Flow Acceleration, Opacity Micro Meshes, and a big increase in L2 cache (like 10x).

Going from 3080 to 4070 TiS:
  • cores: 8704 to 8448, -3%
  • mem bw: 760 to 672, -13%
  • core clock in gaming: 1870 to 2686, +44%
  • overall perf: +27/+29/+25 at 1080p/1440p/4k
Based on that I don't think there was much IPC gain.
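If it helps, here is that same comparison as a tiny script; the cores-times-clock model is just a rough proxy for raster throughput, not anything rigorous:

```python
# 3080 -> 4070 Ti Super: is there much per-core, per-clock (IPC) gain in raster?
# Specs and the +25% 4K figure are the ones quoted above; cores x clock is only
# a rough proxy for shader throughput.

cores_3080, cores_4070tis = 8704, 8448
clock_3080, clock_4070tis = 1870, 2686   # average gaming clocks, MHz
measured_gain_4k = 0.25                  # measured +25% at 4K

throughput_gain = (cores_4070tis * clock_4070tis) / (cores_3080 * clock_3080) - 1
print(f"Theoretical cores x clock gain: {throughput_gain:.0%}")   # about +39%
print(f"Measured 4K gain:               {measured_gain_4k:.0%}")  # +25%

# Measured perf landing well below the raw throughput gain (with -13% memory
# bandwidth partly to blame) points to little or no raster IPC improvement.
```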
 
There are some reading comprehension issues on your end, it seems.
Uno reverse card. My original comment stated that Nvidia had consistently moved performance down a tier, which they had. The 4070 was faster than a 3080, and a 3070 was faster than the 2080. For some reason, you then went with a strawman about the value of the cards, which doesn't change my point at all.
They offered a smaller performance increase at a higher cost gen over gen, which is LESS value. A 4070 offers the same performance as a 3080, yet 3070 to 4070 was a $100 markup over MSRP gen to gen. 3080 to 4080 was an insane $500 markup in MSRP (a 71% cost increase) for an 80-to-80 series generational jump. You are literally paying more for less. It's not a hard concept to figure out…
In case you haven't noticed, there's been this thing called inflation. You may have heard of it. The price of EVERYTHING has gone up substantially. The reason I labeled the 3080 as having an "accidental" price was specifically because $700 for a 3080 was totally unsustainable, and right around the time it released, prices went through the roof. Even if supply chains had not fallen apart, that card was not staying at $700.

Realistically, almost nobody got a 3080 for anywhere near $700. $1000 was a lot more realistic.

And before you go "ah but GREED, it's not FAAAAAAAIR!!!!!" check out their gross margins. You'll notice that, prior to the AI boom, Nvidia's margin went up by a whopping 3% from Ampere to Ada, and then proceeded to FALL to levels last seen in 2019, just before the AI boom hit.


The higher cost of TSMC wafers, inflation, energy, shipping, etc. all played a role. If you don't factor in inflation, then yeah, I guess they are not a great value. But in the real world, that's not how things work.

Short of another economic crash, you're not seeing $500 high-end cards again. The 4070 was a good value for what it offered, compared to the inflated 3080s with pitiful 10GB VRAM buffers.
I'm curious: do you have some like-for-like numbers? I recall 40 series optimized BVH traversal for faster ray tracing, but the overall benefit was modest. Random TPU ray tracing benchmark says 40 series takes 27-31% perf hit with RT on, 30 series takes 31-32% hit. Only other CU improvements I'm aware of are Optical Flow Acceleration, Opacity Micro Meshes, and a big increase in L2 cache (like 10x).

Going from 3080 to 4070 TiS:
  • cores: 8704 to 8448, -3%
  • mem bw: 760 to 672, -13%
  • core clock in gaming: 1870 to 2686, +44%
  • overall perf: +27/+29/+25 at 1080p/1440p/4k
Based on that I don't think there was much IPC gain.
I'm either misremembering them or the numbers I saw were based on ray tracing. Either way, it was 2 years ago, so I don't have the numbers I saw back then.
 
A 4070 offers the same performance as a 3080, yet 3070 to 4070 was a $100 markup over MSRP gen to gen. 3080 to 4080 was an insane $500 markup in MSRP (a 71% cost increase) for an 80-to-80 series generational jump. You are literally paying more for less.
Perf/$ steadily improves, even at MSRP, even ignoring inflation. How much would you have had to pay for 4080-tier performance during the Ampere era? You'd need 3090 SLI, that's $2000.

Price at each product tier goes up because Nvidia is launching new product series faster than perf/$ can catch up. Anticipate +10% perf/$/year and you won't be so disappointed.
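As a rough illustration of what that +10% perf/$/year assumption compounds to (the rate itself is just the figure from this post, not an official number):

```python
# If perf/$ compounds at ~10% per year (the figure assumed above), here's what
# that buys over a few generations. Purely illustrative numbers.

annual_gain = 0.10
for years in range(1, 6):
    improvement = (1 + annual_gain) ** years - 1
    print(f"After {years} year(s): +{improvement:.0%} performance per dollar")
# After ~4 years that's about +46% perf/$, i.e. roughly one product tier's worth
# of performance sliding down to the next price point.
```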
 
Perf/$ steadily improves, even at MSRP, even ignoring inflation. How much would you have had to pay for 4080-tier performance during the Ampere era? You'd need 3090 SLI, that's $2000.

Price at each product tier goes up because Nvidia is launching new product series faster than perf/$ can catch up. Anticipate +10% perf/$/year and you won't be so disappointed.
That's not good enough. The 7800 XT is 33% faster than the 6700 XT with only a $20 (~4%) difference in MSRP.
 
Or are you just hoping
I am not hoping for anything; these are the facts: 6% more shaders, clock speeds around ≤3 GHz, and faster memory can only mean at most ~15% better performance, which puts it at 4080 level.

Also, people are forgetting Ada saw a massive increase in cache; GDDR7 is not going to bring much, not at this level in the product stack.
 
Perf/$ steadily improves, even at MSRP, even ignoring inflation. How much would you have had to pay for 4080-tier performance during the Ampere era? You'd need 3090 SLI, that's $2000.

Price at each product tier goes up because Nvidia is launching new product series faster than perf/$ can catch up. Anticipate +10% perf/$/year and you won't be so disappointed.

If that's how historic generational price increases vs. performance worked, a 70-series-class card would be $1500+, let alone an 80-series card.

According to your logic a 4080 fails to meet those requirements, being only 48% faster while costing 71% more.
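Plugging those two figures into a quick perf-per-dollar check (launch MSRPs assumed as $699 and $1199):

```python
# 3080 ($699 MSRP) -> 4080 ($1199 MSRP), using the +48% performance figure
# quoted above, to see what happened to perf per dollar at launch prices.

price_3080, price_4080 = 699, 1199
perf_gain = 0.48

price_increase = price_4080 / price_3080 - 1             # about +71%
perf_per_dollar_change = (1 + perf_gain) / (1 + price_increase) - 1

print(f"Price increase: {price_increase:+.0%}")
print(f"Perf/$ change:  {perf_per_dollar_change:+.0%}")  # about -14% at MSRP
```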

Accepting COVID and crypto pricing as normal is one thing, but making excuses for Nvidia to give you less at a higher cost on top of that is something else. Lest we forget, every review outlet slammed Nvidia for terrible value across the entire 4000 series lineup (with the exception of the 4090).

But back to the main point: those expecting an $800 to $900 5070 Ti to come relatively close to a 4090 with half the physical hardware are huffing copium by the tankful.
 
I'm not even sure where to start, but blaming AMD for the greed and monopolization Nvidia has on the market is an interesting take, although unsurprising.
Haha yeah, I do hope AMD quits the GPU race. It will be super funny. Who are these people going to blame when Nvidia is the only one remaining? Yeah, AMD is not doing well, so that makes it OK for Nvidia to do this to their fans? Yeah, next time a volcano erupts, I'll also blame AMD. 'Cause red something something. GRR AMD! GRRR!!!!!!!!!!!!!!!!!

P.S. I only own Nvidia cards; I've never bought any AMD/Intel cards. This doesn't blind me to the truth, though. Nvidia bloody sucks, especially after the mining craze. They never stop sucking, and God knows when this whole thing will stop. The next 2 years will be terrible, just like the whole 40 series. Biggest joke ever (yeah yeah, the 4090 is good, blah blah). You've got to be mental to even defend them!
 
I am not hoping for anything; these are the facts: 6% more shaders, clock speeds around ≤3 GHz, and faster memory can only mean at most ~15% better performance, which puts it at 4080 level.

Also, people are forgetting Ada saw a massive increase in cache; GDDR7 is not going to bring much, not at this level in the product stack.
I have seen so many times how people underestimate Nvidia and next-gen GPUs over and over again...
It's starting to be very funny, and it's happening again.

I've screenshotted this, so let's see who knows best.
 
I have seen so many times how people underestimate Nvidia and next-gen GPUs over and over again
Bro, Nvidia literally had to retract the "4080 12GB" because people realized they were about to be scammed. They straight up tried to sell an inferior product under a misleading name, and you're talking about underestimating them lol.
 
"Coming to a store near you for ONLY $999!" lol. People are going to be walking out very happy with their newly discounted 4070Ti SUPER instead.
Or a 7900 XTX for $839.99!
 
They posted data for the RTX 5070, so it's only 6400 cores? (I thought GB205 would be the same as AD104, with 7680 cores.) That probably won't bring much performance gain; the difference between the 5070 and the 4070 Super should only be around 12%, depending on the final clocks, so we're probably looking at a $599 part as well.
More interesting is the following news:
https://wccftech.com/nvidia-to-showcase-several-ai-focused-gpu-technologies-at-ces-2025/
If you check all the A.I.-focused marketing bullet points, you see the following:
"Neural Rendering Capabilities: Revolutionising how graphics are processed and displayed."
Not to be confused with the usual A.I.-accelerated functions we already have, like upscaling, which is mentioned separately anyway.
There is an excellent interview with Intel's Tom Petersen from Hardware Unboxed:


In a section talking about the future and what A.I. can bring to the table, among other mentions of where A.I. can help, he says that in the future graphics will be generated mainly by A.I., and that raster-generated frames (ray-traced ones too, I will add) will be just a hint for the A.I. to generate the final frame. But I guess we are talking about next-next gen (the 2038 PS7 era, whenever that happens, or even the one after) at the earliest.
From this point of view, these new techniques will start to shift the focus of what is important in the final generated frame, gradually making ray tracing less and less the focal point of what the future will bring (but then again, Nvidia is seemingly way ahead of the competition in this field, and combined with the market share it enjoys right now, it should be in a position to maintain the current status quo for many years to come).
 