Friday, December 30th 2022

NVIDIA France Accidentally Reveals GeForce RTX 4070 Ti Specs

With less than a week to go until the official launch of the GeForce RTX 4070 Ti, NVIDIA France has gone and spoiled things by revealing the official specs of the upcoming GPU. The French division of NVIDIA appears to have posted the full product page, but it has since been pulled. That didn't prevent Twitter leaker @momomo_us from snapping a couple of screenshots, including one of the official performance numbers from NVIDIA.

There aren't any real surprises here though, as we already knew the CUDA core count and the memory size, courtesy of the RTX 4070 Ti having been the RTX 4080 12 GB until NVIDIA changed its mind. It's interesting to see that NVIDIA compares the RTX 4070 Ti to the RTX 3080 12 GB in the three official benchmarks, as it makes the RTX 4070 Ti look a lot better than it is in reality, at least based on the rumoured MSRP of US$800-900. One of the three benchmarks is Cyberpunk 2077 using ray tracing, where NVIDIA suggests the RTX 4070 Ti is around 3.5 times faster than the RTX 3080, but it's worth reading the fine print. We'll know next week how the RTX 4070 Ti actually performs, as well as where the official pricing and actual retail pricing end up.
Sources: NVIDIA France (reverted to older page), via @momomo_us

102 Comments on NVIDIA France Accidentally Reveals GeForce RTX 4070 Ti Specs

#26
pavle
willcf15Seems pretty tight on VRAM for the amount of processing power it has. I wonder if that's intentional to keep people dishing out the big bucks for 4k cards.
That's classic Nvidia MO.
Posted on Reply
#27
kiriakost
I will wait for RTX 50xx with Windows 7 drivers support.
Posted on Reply
#28
efikkan
DristunHot turd about to drop.
So, you've tested it then? :P
I don't believe it will be hot, and whether it's a "turd" remains to be seen.
This card is likely to perform comparably to or better than the RTX 4080 per GFlop, so if it's priced at $800 it might turn out to be a better deal than most seem to think.
So, if this turns out to be right, will you eat your words? (not in a literal sense)
ZoneDymoI'm 100% sure it is, it's planned obsolescence. AMD is dishing out large amounts of VRAM so cards can remain viable, but big N (remember, over 80% market share) is now planning for even more profits.

All up to you guys to decide what you want to support....
This nonsense still lives on, unfortunately.
If the reviews show the card scales just fine with 4K, then there is no reason to worry, neither now or in the future.
Posted on Reply
#29
Dristun
efikkanSo, you've tested it then? :p
I don't believe it will be hot, and whether it's a "turd" remains to be seen.
This card is likely to perform comparably to or better than the RTX 4080 per GFlop, so if it's priced at $800 it might turn out to be a better deal than most seem to think.
So, if this turns out to be right, will you eat your words? (not in a literal sense)
I didn't mean hot in a sense that it's gonna be 110C under load like a certain competitor, haha! But I also won't eat my words based on that criteria of being just better than 4080 - 4080 is in my books egregiously priced. I just don't buy the "wafers and components are too expensive" story to that big of a degree. All hardware got more expensive but not by as much as GPUs. GN's Steve had a good graph, imo. I could accept a 20% hike as reasonable, hell, even 30 - but not 50. So being better than the worst value we've ever seen outside of mining shortages is not enough. All of these products are terribly priced and I just refuse to pay that much. If that's the new norm - well, they can have it, I'm out. Once my rig is obsolete I'll just buy a playstation again for the AAA-stuff and keep the rig for work and indies+emulation. And if others are willing to pay - I'm not going to criticize their decisions, it's not my money to spend.
Posted on Reply
#30
evernessince
efikkanSo, you've tested it then? :p
I don't believe it will be hot, and whether it's a "turd" remains to be seen.
This card is likely to perform comparably to or better than the RTX 4080 per GFlop, so if it's priced at $800 it might turn out to be a better deal than most seem to think.
So, if this turns out to be right, will you eat your words? (not in a literal sense)
An $800 MSRP 4070 Ti with a street price of $900 - $1,200 that's guaranteed to be slower than a 4080 is indeed a turd. "Deal"? If you think this is a deal, I have an extended car warranty to sell you.
Posted on Reply
#31
AusWolf
the54thvoidKind of interesting. The 3070ti was comparable, if not better than the 2080ti. Nvidia don't appear keen to compare the 4070ti to the 3080ti.

Hmmm....
But.. but... it's 3.5x faster than the non-Ti... if you turn on frame generation, decrease some settings, play a different game and buy a 4090. :roll:
Posted on Reply
#32
N/A
3080 Ti and 3080 12G OC are very similar, within -3%. If we exclude the gimmicks, the 4070 Ti probably even slightly exceeds the goal. But the vanilla 4070 is better suited for a 60 Ti.

Posted on Reply
#33
ZoneDymo
efikkanThis nonsense still lives on, unfortunately.
If the reviews show the card scales just fine with 4K, then there is no reason to worry, neither now or in the future.
VRAM is needed for RT, for texture resolution, and for the resolution it's played at, so yeah, in the future all of those just become more demanding, so VRAM will be a problem.

Portal RTX uses 16 GB of VRAM at 4K RIGHT NOW, sooo yeah.
Posted on Reply
#34
Garrus
DristunI didn't mean hot in a sense that it's gonna be 110C under load like a certain competitor, haha! But I also won't eat my words based on that criteria of being just better than 4080 - 4080 is in my books egregiously priced. I just don't buy the "wafers and components are too expensive" story to that big of a degree. All hardware got more expensive but not by as much as GPUs. GN's Steve had a good graph, imo. I could accept a 20% hike as reasonable, hell, even 30 - but not 50. So being better than the worst value we've ever seen outside of mining shortages is not enough. All of these products are terribly priced and I just refuse to pay that much. If that's the new norm - well, they can have it, I'm out. Once my rig is obsolete I'll just buy a playstation again for the AAA-stuff and keep the rig for work and indies+emulation. And if others are willing to pay - I'm not going to criticize their decisions, it's not my money to spend.
Exactly. I don't mind if NVidia wants to charge $800 for the 4080. But $1200 is just criminal.
N/A3080 Ti and 3080 12G OC are very similar, within -3%. If we exclude the gimmicks, the 4070 Ti probably even slightly exceeds the goal. But the vanilla 4070 is better suited for a 60 Ti.

RTX 3070 is $500, beats the $1200 2080 Ti. This new 4070 Ti is really the 4070, probably just beats the 3080 Ti if lucky (with RT on, not with RT off). Should also be $500. Could stomach $600. I'll probably go buy one of those $325 Arc A770's just for the giggles in January.
Posted on Reply
#35
willcf15
efikkanThis nonsense still lives on, unfortunately.
If the reviews show the card scales just fine with 4K, then there is no reason to worry, neither now or in the future.
This isn't really true. VRAM requirements gradually drift up over time. I've actually hit the 8 GB VRAM limit on my GTX 1070 Ti, and that's with a lot less processing power. I think it's reasonable to assume that, in 5 years (my typical GPU lifecycle) and at a higher settings target, 12 GB could be a problem. I'm not saying this is the end of the world or anything, but I guess personally if I'm spending $800 on a GPU, I don't want to have doubts. That said, it looks like I'm going to be either facing unacceptably high idle power draw (AMD) or questionable VRAM (NVidia) on this generation, so in my opinion we've got no winners yet... They've got like 2 months to get their shit together before I go buy a used last-gen card and put the saved cash toward other hobbies lol
Posted on Reply
#36
ZoneDymo
willcf15This isn't really true. VRAM requirements gradually drift up over time. I've actually hit the 8 GB VRAM limit on my GTX 1070 Ti, and that's with a lot less processing power. I think it's reasonable to assume that, in 5 years (my typical GPU lifecycle) and at a higher settings target, 12 GB could be a problem. I'm not saying this is the end of the world or anything, but I guess personally if I'm spending $800 on a GPU, I don't want to have doubts. That said, it looks like I'm going to be either facing unacceptably high idle power draw (AMD) or questionable VRAM (NVidia) on this generation, so in my opinion we've got no winners yet... They've got like 2 months to get their shit together before I go buy a used last-gen card and put the saved cash toward other hobbies lol
What I wonder with that power draw: if you have a CPU with integrated graphics, what if you just set your normal, ermm, desktop usage to use the integrated graphics instead? Wouldn't that solve the power draw, as it's not the AMD GPU actually doing the video playback or multi-monitor stuff?
Posted on Reply
#37
eidairaman1
The Exiled Airman
ZoneDymoI'm 100% sure it is, it's planned obsolescence. AMD is dishing out large amounts of VRAM so cards can remain viable, but big N (remember, over 80% market share) is now planning for even more profits.

All up to you guys to decide what you want to support....
Nv has done this since GF2, their cards suddenly dying, fucking bullshit.
GarrusExactly. I don't mind if NVidia wants to charge $800 for the 4080. But $1200 is just criminal.



RTX 3070 is $500, beats the $1200 2080 Ti. This new 4070 Ti is really the 4070, probably just beats the 3080 Ti if lucky (with RT on, not with RT off). Should also be $500. Could stomach $600. I'll probably go buy one of those $325 Arc A770's just for the giggles in January.
Anything over 500 for top end is criminal.

People can get an Xbox or PS5 for that price and be on their way playing
Posted on Reply
#38
efikkan
DristunI didn't mean hot in a sense that it's gonna be 110C under load like a certain competitor, haha! But I also won't eat my words based on that criteria of being just better than 4080 - 4080 is in my books egregiously priced. I just don't buy the "wafers and components are too expensive" story to that big of a degree. <snip>
My objection is to prejudging a product before we know its performance and price.
This card may very well end up in a similar performance-per-dollar range as AMD's latest cards, so will you criticize them as harshly as you criticize this product then?
ZoneDymoVram is needed for RT and Texture resolution and Resolution its played at, so yeah, in the future all those just become more demanding so Vram will be a problem.

Portal RTX uses 16gb of Vram at 4k RIGHT NOW sooo yeah.
VRAM allocated isn't the same as VRAM needed. Many buffers and textures are heavily compressed on the fly. The true judge of VRAM requirement is benchmarking the performance; if the card runs out of VRAM, the performance will drop sharply. If, on the other hand, the performance keeps scaling, then there is no issue.
willcf15This isn't really true. VRAM requirements gradually drift up over time.
VRAM requirements may increase, but the bandwidth required to utilize a given amount of VRAM is fixed for any piece of hardware. As games become more demanding, the bandwidth required to utilize the desired VRAM will inevitably become the bottleneck long before the VRAM itself. By the time you actually allocate that much VRAM, the performance for a fairly well-balanced game will be approaching "slide-show territory" (way below 60 FPS).
The only exception to this would be a game which manages VRAM extremely poorly, meaning a game which allocates a lot more VRAM than it should, like a modded game with a texture pack. This is really an edge case, and for most buyers it's silly to buy cards with extra VRAM for this purpose. (If you're the exception, then that's fine for you, but don't assume normal gamers need it.)
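The bandwidth argument above is easy to put into rough numbers. A back-of-the-envelope sketch (the 504 GB/s figure is the rumoured RTX 4070 Ti memory bandwidth, and reading the whole working set exactly once per frame is a deliberate simplification):

```python
# Rough ceiling on frame rate if a game had to read its entire VRAM
# working set once per frame. 504 GB/s is the rumoured memory
# bandwidth of the RTX 4070 Ti (an assumption, not a measured figure).
BANDWIDTH_GBPS = 504.0

def fps_ceiling(working_set_gb, bandwidth_gbps=BANDWIDTH_GBPS):
    """Max FPS if the whole working set is read once per frame."""
    return bandwidth_gbps / working_set_gb

# Touching all 12 GB every frame caps the card at 42 FPS before any
# shading work has been done at all.
print(fps_ceiling(12.0))  # 42.0
```

Real frames touch only a fraction of VRAM, which is the point: by the time a game genuinely needs the full 12 GB every frame, the memory bus is already the bottleneck.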
Posted on Reply
#39
ZoneDymo
efikkanVRAM allocated isn't the same as VRAM needed. Many buffers and textures are heavily compressed on the fly. The true judge of VRAM requirement is benchmarking the performance; if the card runs out of VRAM, the performance will drop sharply. If, on the other hand, the performance keeps scaling, then there is no issue.
I know it works like that for normal RAM, but I don't think it works like that for VRAM. I also remember Digital Foundry running into a VRAM bottleneck on an Nvidia card, resulting in the game's textures never loading in fully, just the early low-res initial texture; I think it was Far Cry 6.

And sure, but again, I'm saying that a current game, a game today (and perhaps more than one), can already demand close to the max VRAM of current cards. Spending 1000 bucks on a GPU while running the risk that it will run into basic issues like VRAM shortages within a year or two is not a risk I think many would be willing to take, though of course you are free to believe it's a non-issue.
Posted on Reply
#40
evernessince
efikkanVRAM allocated isn't the same as VRAM needed. Many buffers and textures are heavily compressed on the fly. The true judge of VRAM requirement is benchmarking the performance; if the card runs out of VRAM, the performance will drop sharply. If, on the other hand, the performance keeps scaling, then there is no issue.
16 GB is the amount Portal RTX uses at 4K, not just allocates: www.techpowerup.com/review/portal-with-rtx/3.html

Performance doesn't immediately drop when you run out of VRAM. It depends on the game but usually you can go 30% above available VRAM and the GPU will do a decent job of swapping between the VRAM and main system memory. The problem is, the instant something that needs to be fetched often is sent to the main system memory when VRAM is full, performance tanks.

It's not just an annoyance, it renders the game unplayable. The 3070 gets 1 FPS at 4K, but even in less extreme scenarios where you "just" get poor frame timing or stuttering it's easy to see why people want more VRAM. There's really no excuse other than forced obsolescence either because it would not be expensive for Nvidia to have added more.
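The spill-over penalty can be roughed out from the bandwidth gap: assets that overflow VRAM have to be fetched across PCIe, roughly an order of magnitude slower than GDDR6X. A sketch with approximate, assumed bandwidth figures (not measurements):

```python
# Illustrative model of why spilling past VRAM tanks performance:
# data that overflows into system RAM must cross PCIe, which is far
# slower than on-card GDDR6X. Both bandwidth figures are approximate
# assumptions for the sake of the arithmetic.
VRAM_BW_GBPS = 504.0   # GDDR6X on a 192-bit bus (rumoured 4070 Ti figure)
PCIE_BW_GBPS = 32.0    # PCIe 4.0 x16 theoretical peak

def frame_time_ms(working_set_gb, vram_gb=12.0):
    """Time (ms) to stream the working set once, overflow going over PCIe."""
    in_vram = min(working_set_gb, vram_gb)
    spilled = max(working_set_gb - vram_gb, 0.0)
    return (in_vram / VRAM_BW_GBPS + spilled / PCIE_BW_GBPS) * 1000.0

print(round(frame_time_ms(12.0), 1))  # 23.8 ms: fits entirely in VRAM
print(round(frame_time_ms(16.0), 1))  # 148.8 ms: 4 GB spills over PCIe
```

In this simplified model, a working set only a third larger than VRAM makes the frame over six times slower, which is the "performance tanks" behaviour, before frame-to-frame fetch patterns make it worse still.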
efikkanVRAM requirements may increase, but the bandwidth required to utilize a given amount of VRAM is fixed for any piece of hardware. As games become more demanding, the bandwidth required to utilize the desired VRAM will inevitably become the bottleneck long before the VRAM itself. By the time you actually allocate that much VRAM, the performance for a fairly well-balanced game will be approaching "slide-show territory" (way below 60 FPS).
The only exception to this would be a game which manages VRAM extremely poorly, meaning a game which allocates a lot more VRAM than it should, like a modded game with a texture pack. This is really an edge case, and for most buyers it's silly to buy cards with extra VRAM for this purpose. (If you're the exception, then that's fine for you, but don't assume normal gamers needs it.)
Games must work on a variety of cards with vastly different speed memory subsystems. Games address this issue through asset streaming. The engine will load in objects based on priority and will use multiple LODs to ensure that the game will play smoothly on a wide range of video cards with vastly differing memory subsystems.

Both VRAM and Bandwidth are equally important. You need more VRAM to store assets and graphics data of increasingly complex games and you also need more bandwidth to move those into said memory within a reasonable time.

The above example of the 3070 getting 1 FPS disproves the idea that bandwidth will always be the limiting factor before VRAM size. It may appear that bandwidth is often the limiting factor, but that's down to the fact that game developers would not design games that run like crap on newer video cards. The consequences of using too much VRAM make the game unplayable, and thus you rarely see it. Limited VRAM size restricts what devs are able to do with their games.
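A toy sketch of the LOD-based streaming described above (asset names, sizes and the budget are invented for illustration): the engine walks assets in priority order and picks the highest-detail LOD that still fits the remaining VRAM budget.

```python
# Toy model of priority-ordered, LOD-based asset streaming.
# Asset names, LOD sizes and the budget are invented examples.
def stream_assets(assets, budget_mb):
    """assets: dict of name -> LOD sizes in MB, best (largest) first,
    in engine priority order. Returns the LOD size chosen per asset
    (0 = asset left unloaded)."""
    chosen = {}
    remaining = budget_mb
    for name, lods in assets.items():
        for size in lods:               # try the best LOD first
            if size <= remaining:
                chosen[name] = size
                remaining -= size
                break
        else:
            chosen[name] = 0            # nothing fits the budget
    return chosen

scene = {"hero": [256, 64, 16], "terrain": [512, 128, 32], "prop": [128, 32, 8]}
print(stream_assets(scene, budget_mb=600))
# {'hero': 256, 'terrain': 128, 'prop': 128}
```

A real engine re-evaluates this continuously per frame, but the budget logic is the same: shrink the budget and the engine silently falls back to lower-detail LODs, which is the never-fully-loading-textures behaviour mentioned earlier in the thread.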
Posted on Reply
#41
马嘉伟
The performance difference between the "4070 Ti" and the "3080" should be less than 10% without DLSS 3. And it has to sell at a higher price.
Posted on Reply
#42
Hxx
There is nothing wrong with these 40-series cards except price; it's the only issue. The performance is there, as are power consumption, overclocking, cooling, etc. If Nvidia had released them at $500, $700 and $1K or something, it would have been a completely different discussion. But of course, why would they when the sheeple are willing to pay much more.
Posted on Reply
#43
eidairaman1
The Exiled Airman
HxxThere is nothing wrong with these 40-series cards except price; it's the only issue. The performance is there, as are power consumption, overclocking, cooling, etc. If Nvidia had released them at $500, $700 and $1K or something, it would have been a completely different discussion. But of course, why would they when the sheeple are willing to pay much more.
Till the GDDR6X starts to fail like on RTX 2000/3000...
Posted on Reply
#45
Dux
CrackongWow we are so not surprised.

I like that gif a lot.
Posted on Reply
#46
watzupken
The specs and performance are not new. Nvidia actually showed the specs and performance when it was going to launch the RTX 4080 16 GB and 12 GB. Nothing really changed here other than the model number.
Posted on Reply
#48
TumbleGeorge
马嘉伟The performance difference between "4070TI" and "3080" should be less than 10%, without dlss3. And it has to sell for more expensive prices.
"4070 Ti" 12 GB < 3080 12 GB if you don't use DLSS 3's fake frames, at similar resolution and game settings, and tested on the same PC configuration. Yes, both cards are limited by VRAM size, but the 3080 12 GB must be using its VRAM better because of its much higher bandwidth.
Posted on Reply
#49
Why_Me
BwazeReally?

Must be more akin to high end Tesla than Mercedes, due to the high quality fires…

www.tomshardware.com/news/rtx-4090-owner-hits-nvidia-with-lawsuit-over-melting-16-pin-connector

www.glitched.online/nvidias-16-pin-rtx-4090-connectors-are-setting-on-fire/
User error. Don't try to stick an RTX 4090 in a small cramped case and make sure it's plugged in all the way.

Any guesses on where the RTX 4070 Ti will place on this list after Wizard's review of that card?

pcpartpicker.com/search/?q=RTX+3080+Ti <--- current US prices for the RTX 3080 Ti 12GB via PC Partpicker

Posted on Reply
#50
nguyen
Why_MeUser error. Don't try to stick an RTX 4090 in a small cramped case and make sure it's plugged in all the way.

Any guesses on where the RTX 4070 Ti will place on this list after Wizard's review of that card?

pcpartpicker.com/search/?q=RTX+3080+Ti <--- current US prices for the RTX 3080 Ti 12GB via PC Partpicker

I will take a guess, 4070Ti will be
~3090Ti at 1440p
3090Ti -5% at 4K
Top the efficiency chart
Posted on Reply