
NVIDIA GeForce RTX 3070 Ti Founders Edition

If I had $600 I would still pick the normal 3070. 300 watts is too much.
I wonder if you can really find a 3070 at $600 where you live.
In Germany the cheapest offer I can see is 1099€. :(

Mindfactory.de usually has the most volume in stock for every current-generation Nvidia and AMD card, and they are selling those 3070s for 1199€.
https://www.mindfactory.de/Hardware/Grafikkarten+(VGA)/GeForce+RTX+fuer+Gaming/RTX+3070.html

Most probably the 3070 Ti would be sold at an even higher price.
The prices are really crazy at the moment. o_O
 
I'm aware of the differences between usage and allocation, but without game or graphics driver debugger-level access it's impossible to know exactly what a game is doing. Hardware Unboxed found limits in Doom with 8GB cards. Not sure where I saw / read about the 8GB issue with Resident Evil, and you can mitigate the issue by tweaking settings, but it doesn't really matter: the issue exists today.

And equating an 8GB card with a 16GB one as being the same is ridiculous. Rasterization performance being equal, a card with 16GB is going to have a much longer useful lifespan than an 8GB one. This is a "high-end" card; it should be able to maintain its "high-end" status for several years.
Methinks you're missing how Titan-class cards (Kepler, Maxwell, Pascal), with twice the VRAM of their (x)x80 Ti counterparts, didn't last any longer; by the time the VRAM becomes of use, the (now outdated) GPU arch becomes the bottleneck. Google some old GTX 770 2/4GB reviews; adding more VRAM isn't always a good thing and can be a waste.

And "the emperor wears no clothes": this is not a high-end GPU. Hell, it's not even a 4K GPU, IMO, with a 104 chip.
 
I have only four awards to pick from "Budget", "Recommended", "Editor's Choice" and "Innovation".

Ideas?
Maybe a Bronze, Silver and Gold "recommended" badge. It's a universally recognizable tier gradient and would be very helpful to differentiate higher recommended cards from the lower. "Editor's Choice" would then be reserved for the "cream of the crop" products.
 
Trying to see if I'm lucky enough to grab one. Damn the times we live in; I hate scalpers and miners.
 
No gpu no cry.
 
How can you say "impressive" for a card like that? You get a 5-7% performance uplift for 20% worse efficiency. On top of that, it's nearly 10 degrees hotter than the reference 3070... It also has the same amount of VRAM as the 3070, and 8 GB is NOT enough for 4K gaming if you play a variety of games. Yet it costs $100 more. You can't even say "OK" for this card, never mind "impressive"... The 3080 Ti might be a cut-down version of the 3090, but it nearly equals it in performance with ~10% better efficiency. Yes, it has half the amount of VRAM, but 12 GB is still enough, and it costs $300 less. And the 3080 Ti is far from a great card either, but this 3070 Ti is utter garbage.
Why are you worried about efficiency when this is obviously the same GA-104 silicon being pushed harder? OF COURSE efficiency is down, that's what happens when you push higher voltages and higher clocks. This isn't new silicon, and the laws of physics always apply.

As for pricing, I covered that by saying this will never be attainable at MSRP. The botters have proven that they can get the lion's share of Founders Edition stocks, and so few of those are made in the first place that whatever's left for end-users to actually buy might as well be vaporware.
 
I have a very hard time believing that the vram itself is using +80 watts more. I think they are juicing the gpu a little harder, which consumes disproportionately more power at the top. Or a foundry change ...
There's more to it than just memory; I talked about this in the conclusion.

Maybe a Bronze, Silver and Gold "recommended" badge. It's a universally recognizable tier gradient and would be very helpful to differentiate higher recommended cards from the lower. "Editor's Choice" would then be reserved for the "cream of the crop" products.
I love that idea! Maybe I could remove "Highly", and put "Bronze", "Silver", "Gold" in the empty space instead?
 
There's more to it than just memory; I talked about this in the conclusion.


I love that idea! Maybe I could remove "Highly", and put "Bronze", "Silver", "Gold" in the empty space instead?
It does sound reasonable, and I noted and did like the badges, even when a product was sometimes not recommended. Any improvement here is golden, because, after giving it some thought: how many people skip to the end, see 3 stars and "Highly Recommended", and just go with your opinion based on the smaller amount of input they took in? Surely not what you would like, but in percentage terms, it happens.

Obviously, though now in text: love your reviews, work and opinions. Keep it up.
 
The 3070 Ti most definitely is NOT highly recommended; maybe a "Nice Try" badge.
 
The more I think about it, the more it looks like I'll have to wait for another generation of GPUs to drop before upgrading from my 1060.
 
"Personally, I'm not a fan of going all out on VRAM size, none of our benchmarks show any noteworthy performance issues arising from 8 GB VRAM capacity. Actually, it seems likely DirectStorage, a technology that was first pioneered on the new consoles, will reduce VRAM pressure by optimizing the disk to GPU memory path."

Wizzard, you have Doom Eternal in the review. That game at 4K does show differences between 8GB cards and cards with 10 or more gigs. I don't know where you test; I presume it is the tiny 1st level of the game, but on my old RTX 2080, switching to the otherwise slightly weaker (at 1440p and 1080p) GTX 1080 Ti led to higher averages and lows. I test in Blood Swamps (DLC level) or Urdak (large end-game mission).
It is still playable on 8GB with occasional stutters (it's less playable in end-game levels that are bigger and use more memory), so the game likely isn't going MUCH above 8GB, but it is definitely using more. The RX 5700 XT also had the same issue. The RTX 3080 10GB and RX 6900 XT 16GB don't have a problem at all.

Also... as for allocation and usage, lol.

Games don't allocate memory in the sense of "give me 10GB, I'll throw my stuff in that pool". They allocate memory only for stuff that is needed, plus or minus some granularity loss. If a game "allocates" 10GB, there is indeed 10GB of data in use. The catch is that not all of this data may be required to draw a given frame right now.

Let's say you're in a cube where each side has a different texture. You can only see at most 5 sides at any point in time - does that mean the memory for the 6th wall is "allocated but not used"? No. If you turn around, it will need to be rendered - meaning if you didn't have the VRAM required to hold that 6th texture, you would get a lag spike when you turn around as it swaps back into VRAM in place of the wall that just became invisible. Then if you turn around again, you will have to swap VRAM again...

In big games there usually isn't a single place in an area where you can see every single asset in the scene - so you can in fact get away with less VRAM than what the game calls for, but you will see higher frametimes as you start moving around and running into assets that previously spilled over into system RAM. Especially if you actually play the game - you know, move around and are in it for more than 1-10 minutes.

That is why VRAM testing is tricky. It is not easy to do via normal short tutorial benches.
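The cube example above can be sketched as a toy LRU-cache model. This is purely illustrative - the costs per asset are made-up numbers, and real drivers manage residency in far more sophisticated ways - but it shows why an undersized VRAM pool produces periodic frametime spikes as you turn around, while a pool that fits the whole scene settles down after warm-up:

```python
from collections import OrderedDict

class VramCache:
    """Toy LRU model of VRAM residency: an asset that isn't resident
    must be swapped in from system RAM, adding latency to that frame.
    Costs are arbitrary illustrative numbers, not real measurements."""

    def __init__(self, capacity, hit_cost=1.0, swap_cost=25.0):
        self.capacity = capacity
        self.hit_cost = hit_cost    # ms to use an already-resident asset
        self.swap_cost = swap_cost  # ms penalty to fetch over the PCIe bus
        self.resident = OrderedDict()

    def frame_time(self, visible_assets):
        t = 0.0
        for asset in visible_assets:
            if asset in self.resident:
                self.resident.move_to_end(asset)      # mark recently used
                t += self.hit_cost
            else:
                if len(self.resident) >= self.capacity:
                    self.resident.popitem(last=False)  # evict LRU asset
                self.resident[asset] = True
                t += self.swap_cost
        return t

# Six "walls" of a cube; the player alternates between two views of 5 walls.
view_a = ["wall1", "wall2", "wall3", "wall4", "wall5"]
view_b = ["wall2", "wall3", "wall4", "wall5", "wall6"]

small = VramCache(capacity=5)   # can't hold all six textures
large = VramCache(capacity=6)   # holds the whole scene

for cache in (small, large):
    times = [cache.frame_time(v) for v in (view_a, view_b, view_a, view_b)]
    print(cache.capacity, [round(t) for t in times])
```

After warm-up, the 6-slot cache draws every frame at hit cost, while the 5-slot cache keeps spiking each time the view flips back - the same pattern as the stutter you see when a game's working set slightly exceeds VRAM.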

Other games that use more VRAM are Wolfenstein II (use actual max settings, not the preset), Cyberpunk with RT at 1440p+, and supposedly the new US propaganda games (CoD) too. For Wolfenstein and CP2077 I am 100% certain, though.
 
a Ti card is all about the price.. nearly as good as the next card up but for less money.. in a world where recommended retail prices are meaningless nonsense, so is this card..

trog
 
Nearly as good would be 7680 CUDA cores on a 384-bit bus but only 80 ROPs, i.e. a cut-down GA102. It can't be 16GB because the 3080 Ti is 12GB, and GDDR6X is very expensive and in low supply, so decisions, decisions.
 
Gamers Nexus found a substantial difference in performance between the 3070 Ti and the 3080 running Eternal at 4K. The 3070 Ti is notably faster in his testing at 4K, but the gap between the 3070 Ti and the 3080 is far larger than it should be. At 1440p this gap is significantly smaller, suggesting VRAM limitations.

Fermi was

x5x entry
x6x mid range (includes x6xse low mid range later in life)
x6xti mid high range
x7x high end
x8x high end
x9x enthusiast dual GPU

Meh, I’m aware of the x50 and x30 cards; I just didn’t consider them because I never thought of them as anything more than “I need multiple monitors” GPUs, so I still stand by my list.
 
a Ti card is all about the price.. nearly as good as the next card up but for less money.. in a world where recommended retail prices are meaningless nonsense, so is this card..

trog
Historically, Nvidia's Ti has meant many different, conflicting things since it was first introduced with the Geforce 2.

GF2 Ti (die shrink, clocked a bit higher)
GF3 Ti (binning process, both better and worse options than the original GF3)
GF4 Ti (branding distinction between new architecture and rebranded NV1x silicon)
GF 560 Ti (total mess, three different SKUs with that name, all vastly different specs/prices to the vanilla GF560)
GF 650 Ti (literally double the card of the vanilla GF 650)
GF 660 Ti (almost identical to the vanilla GF 660, but with one extra SM)

I mean, I could go on - but I think from that list you can see that Ti has been meaningless in terms of definition since Nvidia introduced it. There's no consistency between generations, and there's no consistency between Ti models of the same generation.

Ti is just whatever Nvidia decide it's going to be for that specific SKU - which in this case is a fully-enabled GA-104 die (almost insignificant improvement) with a power draw increase and price increase to cover the beefier cooler, overclock, and switch to GDDR6X VRAM. If you can pick one up for not much more than a vanilla 3070 then it's a good deal. If you can pick one up at MSRP then you should sell it for the market value and pocket the $700 you make as instant profit.
 
As a regular person you can't just get one at MSRP, because some billionaire bought all of them at the factory exit and shipped them in containers to an unknown location to just gather dust, sold half at triple MSRP, and keeps the rest to gradually release at 3x MSRP too, but not too quickly. This is just criminal, so why endorse the same conduct?
 
God damn it, eu.evga: the 3070 Ti was "coming soon", and seconds later the page was at full capacity.
 
Is it too much to ask to attach a screenshot of GPU-Z for the card you're testing? Many people would really appreciate that.
Like the one on the overclocking page.. just like in my last 700 (!) reviews?
 
There's more to it than just memory; I talked about this in the conclusion.


I love that idea! Maybe I could remove "Highly", and put "Bronze", "Silver", "Gold" in the empty space instead?
Bronze, Silver & Gold makes really good sense, but maybe keep that 4th "Editors Choice" for the really special ones that balance innovation, performance and price, if so this award should be pretty rare!
 
What sense does it make? At least add Platinum and Titanium to the list. Titanium is an instabuy: it's not simply efficient, it produces energy back, yeah. "Highly Recommended" at least means you can buy it (if you can handle the pricing); the point is that there are no issues with the product. Other than the 350-watt stove and the 8GB framebuffer that can be found in the likes of the 6600 XT, while this is clearly 60% faster and deserving of 12GB at least.
 
There's more to it than just memory; I talked about this in the conclusion.


I love that idea! Maybe I could remove "Highly", and put "Bronze", "Silver", "Gold" in the empty space instead?
Does it need a badge though, did they really earn it?

The review itself is really solid and your conclusions can only be your own, but I read the conclusion page and you state plenty of stipulations on recommending it; myself and others have even more negative things to say. This is obviously a pretty tepid release with too many compromises and way too high of an asking price to really be recommended at all, in my opinion.
 
The reason, and the only reason, why I believe Nvidia is doing this kind of pricing while putting out this many cards is to keep the prices at their MSRP. This will keep the series of video cards as stable in price as possible for as long as possible. The entire video card sector would have to fall dramatically for any kind of price fluctuation, and even if that does happen, the price reduction will be minimal.

IMHO this is a DOG of a card.
 
The reason, and the only reason, why I believe Nvidia is doing this kind of pricing while putting out this many cards is to keep the prices at their MSRP. This will keep the series of video cards as stable in price as possible for as long as possible. The entire video card sector would have to fall dramatically for any kind of price fluctuation, and even if that does happen, the price reduction will be minimal.
No it won't. Supply and demand determine market price, and when demand outstrips supply to this degree, MSRP is meaningless. This is just Nvidia greed taking a bigger piece of the "value pie" that the market has set for these cards.
 
Maybe I could remove "Highly", and put "Bronze", "Silver", "Gold" in the empty space instead?
That sounds like a great idea. There you go!

but maybe keep that 4th "Editors Choice" for the really special ones that balance innovation, performance and price, if so this award should be pretty rare!
Agreed and also products that are so exceptional that they deserve special recognition.
 
@W1zzard

great review as always

a few thoughts/suggestions

Maybe put a GPU-Z screenshot on the first page with your chart.
With the chart itself, would it be possible to add columns for TMUs and CUs beside the ROPs and shaders?

thanks
 
