
NVIDIA GeForce RTX 3070 Ti Founders Edition

1. Same amount of VRAM as the 3070.
2. 5-7% faster than the 3070 (at the two resolutions a card this powerful is meant for).
3. Nearly 20% less efficient than the 3070. WTF
4. GPU runs at 82°C. WTF
5. Yet costs $100 more. WTF


Sorry, but I can only call the 3070 Ti a piece of crap. At least the 3080 Ti was very close to the 3090 and it is cheaper (on paper and in real life) than the 3090.
 
I agree with the suggestions made for a new "Not Recommended" badge (though it should be used only for truly bad products). The thing here is that the 3070 Ti is a bit of an odd release: as a product, it doesn't shine, but it doesn't stink either. My opinion is that it doesn't deserve a "Not Recommended", but personally, I wouldn't be comfortable "recommending" it either, given that the competition has adequate products at similar price brackets that do not share its caveats and the potential problems arising from its use of exotic memory technology.

Personally, I would have wrapped it up without giving it any award, mentioning that the cost-to-benefit ratio in current market conditions makes it a fair (not amazing! but nowhere near bad, either!) product to own if purchased at MSRP, but not something that you should go out of your way, or part with a significant amount of cash, to own.

The most interesting bit is that this is not entirely unlike its direct predecessor, the RTX 2080 Super. That was also the full xx104 chip with enhanced memory, and it was actually a great product in its time. The 3070 Ti is just more of that - and at $600, that's not necessarily a bad thing. Just unremarkable in the face of the competition.

If the RTX 3070 Ti has earned such a lukewarm reception among enthusiasts, it is solely because of AMD's highly competitive offerings - and that is frankly amazing. A price war would have sparked long ago if the ongoing shortages from supply constraints and mining weren't in effect.

When was the last time NVIDIA truly had to sweat to keep up? I would argue it was when the Radeon HD 5000 series had full DirectX 11 support almost a year in advance while NVIDIA still lingered with the DX10 GTX 200 series, and that's a good 12 years ago now.
 
Yeah not great, but they will sell them all.

Meh.
 
Impressive.

Unlike the 3080 Ti, which is just defective 3090 dies dumped with no particular effort onto a 3080 FE board, this does not feel like a phoned-in effort; it feels like a complete redesign that is a step up from the 3070 in several key ways.

If it's still a relevant GPU when the scalping/supply pricing finally subsides, I'll be picking one of these up. I'm not holding my breath, though; I suspect GPUs will be basically unattainable at MSRP for longer than the generational lifespan of Ampere. Unless you're both wealthy and desperate, you won't be buying a $1,300+ GPU for the current slew of games. You'll either make do with the GPU you have at reduced settings, or pick up something way less powerful than you want as a placeholder for the next year or so.
 
How can you say impressive for a card like that? You get a 5-7% performance uplift for 20% worse efficiency. On top of that, it runs nearly 10 degrees hotter than the reference 3070... It also has the same amount of VRAM as the 3070, and 8 GB is NOT enough for 4K gaming if you play a variety of games. Yet it costs $100 more. You can't even call this card OK, let alone impressive... The 3080 Ti might be a cut-down 3090, but it nearly equals it in performance with ~10% better efficiency. Yes, it has half the VRAM, but 12 GB is still enough, and it costs $300 less. The 3080 Ti is far from a great card either, but this 3070 Ti is utter garbage.
 
8 GB is NOT enough for 4K gaming if you play a variety of games
Not seeing anything in my data that would suggest this to be true. Yes, of course .. modded Skyrim with uncompressed textures.
 
If I had $600 I would still pick the normal 3070. 300 watts is too much.
 
Not seeing anything in my data that would suggest this to be true. Yes, of course .. modded Skyrim with uncompressed textures.
Doom Eternal and, I think, the most recent Resident Evil can push right up to or exceed 8 GB with the right settings, and I'm sure there are other examples.

Point being, even if 8 GB is fine for 90% of the titles today, a high-end GPU shouldn't be running near its capacity the day it's released; it's going to be nearly useless in a few years' time. This is just more "F you" from Nvidia with this release.
 
Doom Eternal and, I think, the most recent Resident Evil can push right up to or exceed 8 GB with the right settings, and I'm sure there are other examples.

While allocations might reach 8 GB, there's no performance penalty for cards with less VRAM, because the game doesn't even come close to using all this data.

it's going to be nearly useless in a few years' time
That will happen with 8 GB or 16 GB.
 

While allocations might reach 8 GB, there's no performance penalty for cards with less VRAM, because the game doesn't even come close to using all this data.

That will happen with 8 GB or 16 GB.
I'm aware of the differences between usage and allocation, but without game- or graphics-driver-debugger-level access it's impossible to know exactly what a game is doing. Hardware Unboxed found limits in Doom with 8 GB cards. Not sure where I saw/read about the 8 GB issue with Resident Evil, and you can mitigate the issue by tweaking settings, but it doesn't really matter: the issue exists today.

And equating 8 GB with 16 GB as being the same is ridiculous; rasterization performance being equal, a card with 16 GB is going to have a much longer useful lifespan than an 8 GB one. This is a "high-end" card; it should be able to maintain its "high-end" status for several years.
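For what it's worth, the outside view is easy to get but only shows reservations: NVML (the same library nvidia-smi uses) reports how much VRAM a process has allocated, not the working set a game actually touches each frame. A minimal sketch, assuming the pynvml bindings (the nvidia-ml-py package) and an NVIDIA driver:

```python
# Minimal sketch: read driver-reported VRAM figures via NVML.
# These numbers are *allocations* - per the discussion above, they can
# sit well above what a game actually touches each frame.
# Assumes the nvidia-ml-py package (imported as pynvml).
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    print(f"total: {mem.total / 2**30:.1f} GiB, "
          f"allocated: {mem.used / 2**30:.1f} GiB, "
          f"free: {mem.free / 2**30:.1f} GiB")

    # Per-process figures - again reservations, not live usage.
    for p in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
        if p.usedGpuMemory is not None:  # driver may not report it
            print(f"pid {p.pid}: {p.usedGpuMemory / 2**30:.1f} GiB allocated")
finally:
    pynvml.nvmlShutdown()
```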
 
A $600 GPU with 8 GB of VRAM in 2021 is ridiculous. Nvidia introduced 8 GB with the GTX 1070, HALF A DECADE AGO!

Nvidia made 0% performance-per-watt improvement over the RTX 2000 series - what were they doing?!

They can't have been focusing just on ray tracing performance, a feature most gamers already turn off because the performance hit is so not worth it :shadedshu:
 
A $600 GPU with 8 GB of VRAM in 2021 is ridiculous. Nvidia introduced 8 GB with the GTX 1070, HALF A DECADE AGO!
Tbf it's clearly not the same memory, but I personally agree on the amount, if only because I've had 8 GB for years; my next upgrade would have to be better, since I use them a while.
 
LOL!! WTF is this crap??
Definitely not even needed. If you did a blind test between this and the vanilla 3070, you would not know which is which. For $100 more!!

OMG this is incredible.
 
Is the 3070 Ti really considered "high end"? Above it we have three more SKUs - 3080, 3080 Ti, 3090. I would say it is upper-middle at best.
 
Is the 3070 Ti really considered "high end"?

I personally don't think so, anyway.

XX60 entry
XX70 mid-level
XX70 Ti mid to high
XX80 high end
XX80 Ti high end
XX90/TITAN enthusiast.

I thought it's always been like this since the three-digit cards.
 
3070 Ti --> fail, enough said. Even the normal 3070 was a fail on my list; the 3080 is ages better for the price. 3080 Ti = fail too.
 
Gaming power consumption is increased significantly over the RTX 3070, by 80 W (!) for +7% performance—hard to justify. The underlying reason is that the GDDR6X memory is faster, but also more power hungry.

I have a very hard time believing that the VRAM itself is drawing 80 W more. I think they are juicing the GPU a little harder, which consumes disproportionately more power at the top of the curve. Or a foundry change...
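Wherever the watts go, the review's own numbers already pin down the efficiency hit. A back-of-the-envelope check, assuming roughly 220 W gaming draw for the 3070 (in line with its rated board power):

```python
# Back-of-the-envelope perf/W check using the review's figures:
# +7% performance for +80 W. The 3070's ~220 W gaming draw is an
# assumption based on its rated board power.
perf_gain = 1.07                    # 3070 Ti performance vs 3070
power_3070 = 220.0                  # watts (assumed)
power_3070ti = power_3070 + 80.0    # +80 W per the review

efficiency = perf_gain / (power_3070ti / power_3070)
print(f"perf/W vs 3070: {efficiency:.2f}x "
      f"(~{(1 - efficiency) * 100:.0f}% less efficient)")
# -> about 0.78x, i.e. roughly the 20% efficiency loss quoted upthread
```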
 
An absolute money grab if I've ever seen one. Nvidia is just another Apple; they have no intention of selling anything for a reasonable, honest price. Their mantra has become "how much can we possibly charge for this and still sell it?" Capitalism gone bad! Whatever happened to honest companies making reasonable earnings, existing to fill a need, providing jobs, and contributing to their communities?
 

While allocations might reach 8 GB, there's no performance penalty for cards with less VRAM, because the game doesn't even come close to using all this data.

That will happen with 8 GB or 16 GB.
Gamers Nexus found a substantial difference in performance between the 3070 Ti and the 3080 running Doom Eternal at 4K. The 3070 Ti is notably faster in his testing at 4K, but the gap between the 3070 Ti and the 3080 is far larger than it should be. At 1440p this gap is significantly smaller, suggesting VRAM limitations.
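The reasoning behind that can be made explicit: render load scales smoothly with pixel count, while running out of VRAM is a cliff, so a gap that balloons between 1440p and 4K points at capacity rather than raw compute. A sketch of the heuristic with purely hypothetical frame rates (not Gamers Nexus' actual numbers):

```python
# Heuristic from the post above: compare the inter-card gap at 1440p
# vs 4K. A gap that widens far beyond normal scaling suggests the
# smaller card is spilling out of VRAM. All fps values below are
# hypothetical placeholders, not measured data.
def gap(fps_small_vram: float, fps_large_vram: float) -> float:
    """Relative deficit of the smaller-VRAM card, in percent."""
    return (1 - fps_small_vram / fps_large_vram) * 100

gap_1440p = gap(140.0, 155.0)  # hypothetical: ~10% behind
gap_4k = gap(62.0, 90.0)       # hypothetical: ~31% behind

print(f"1440p gap: {gap_1440p:.0f}%  4K gap: {gap_4k:.0f}%")
if gap_4k > 2 * gap_1440p:
    print("Gap balloons at 4K -> VRAM capacity is a likely bottleneck")
```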
I thought it's always been like this since the three-digit cards.
Fermi was

x5x entry
x6x mid range (includes the x6x SE low-mid range later in life)
x6x Ti mid-high range
x7x high end
x8x high end
x9x enthusiast dual GPU
 
LMAO! Fermi was more than 10 years ago! Get with the program man!
 
If I had $600 I would still pick the normal 3070. 300 watts is too much.
Not if you can afford a bigger power supply. I'll take one of those MSI 3070 Ti's, thank you very much.

[Attached chart: Battlefield V at 2560×1440]
 
Pointless GPU, makes me glad I have my 3070. It's the exact same thing except with memory that runs hotter than the surface of Mercury, and a backplate colder than the planet's backside due to those thermal pads doing next to nothing. Mine runs at 2100 MHz @ 950 mV, down from the stock 1995 MHz @ 1050 mV, so I'm almost matching its performance while consuming less power lol. And not everyone wants high-TDP components in their PC. I avoided the higher-end Ampere cards for this reason despite my power supply being more than up for the task. I hope the next xx70 card doesn't have a TDP that would make the R9 295X2 shake in its boots.
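Those undervolt numbers check out on paper: dynamic power scales roughly with frequency times voltage squared (P ∝ f·V²), so the drop from 1050 mV to 950 mV more than pays for the higher clock. A rough estimate with the figures from the post above:

```python
# Rough dynamic-power estimate, P ~ f * V^2, using the post's numbers.
# Ignores static leakage and memory power, so treat it as a ballpark.
stock_mhz, stock_v = 1995, 1.050
uv_mhz, uv_v = 2100, 0.950

ratio = (uv_mhz / stock_mhz) * (uv_v / stock_v) ** 2
print(f"estimated core power vs stock: {ratio:.2f}x")
# -> about 0.86x: ~14% less core power despite a ~5% higher clock
```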
 