# NVIDIA GeForce RTX 3070 Ti Founders Edition



## W1zzard (Jun 9, 2021)

NVIDIA's new GeForce RTX 3070 Ti is designed to bring the fight to Radeon RX 6800 non-XT. To achieve their goal, NVIDIA maxed out the GA104 GPU design and switched to faster GDDR6X memory. Unfortunately, this also resulted in an increase in power consumption and heat output.



----------



## TheOne (Jun 9, 2021)




----------



## human_error (Jun 9, 2021)

A shame about the massive power increase relative to the performance boost. I still bet they'll sell out in seconds like the other cards...

I do like the simpler charts showing raytracing performance - much easier to read!


----------



## Aretak (Jun 9, 2021)

Seems like something of a dissonance between "I can recommend the card, but only barely" and the "Highly Recommended" badge.

The card itself is a joke for the price and power consumption compared to the 3070. Another money grab from Nvidia, just like the 3080 Ti. And some suckers actually bought the line that they wanted to somehow "help" gamers.


----------



## W1zzard (Jun 9, 2021)

human_error said:


> I do like the simpler charts showing raytracing performance - much easier to read!


Thanks, technically there's less information in them now, but I felt that was a reasonable tradeoff


----------



## ChristTheGreat (Jun 9, 2021)

Sorry nVidia, except for RT performance, this gen, AMD has my heart!


----------



## Anymal (Jun 9, 2021)

Maybe by the 4th generation, RT will only be a 10% burden when on; almost 50% is just a bad joke.


----------



## Lionheart (Jun 9, 2021)

I have no words, <-- except for these.


----------



## watzupken (Jun 9, 2021)

I was expecting poor results from this card, and it ended up worse than what I had in mind. It's more expensive than the non-Ti version, with a sub-10% improvement in performance and a massive increase in power requirements. Considering this is also a mining-nerfed card, I think it will sell well initially, but I'm not sure if it will continue to sell well over time. This RTX 3070 Ti, in my opinion, is probably in the same league as the RTX 3060 in terms of wow factor, and that is not a compliment from me.


----------



## zilul (Jun 9, 2021)

"Highly recommended" to whom? This product doesn't exist, or is out of reach for the majority of reasonable people. The card isn't even worth its normal price compared to the 3070 non-Ti. Look at those power draw charts... *sigh*


----------



## ExcuseMeWtf (Jun 9, 2021)

Supply issues? Let's just make more SKUs with small differences between them. That's surely just what everyone needs!

/s


----------



## Vya Domus (Jun 9, 2021)

To be honest, I don't really care how poor of a value this is, since it won't really exist out there in the real world anyway for the foreseeable future.


----------



## looniam (Jun 9, 2021)

all it had to do was "beat" the 6800 . . 





what a fail; gives nothing over the 3070.


----------



## Dr. Dro (Jun 9, 2021)

Wow... just wow... and not in a good way. I expected the power consumption and thermal load to drastically increase due to the use of GDDR6X, but I didn't expect that even with the addition of this exotic memory and a shader count increase, we'd see such a minimal improvement in performance over the original vanilla 3070. That concern about the card throttling under some workloads, due to GDDR6X's extremely high power consumption causing it to run into the power limit, still lingers, and I think it might end up being a reality for many owners of this card, as it is for many of us RTX 3090 owners.

Were I in the market for a GPU in this segment, I'm not gonna lie: I straight up wouldn't buy it. The Radeon RX 6800 is a better product in my eyes, especially given NVIDIA's refusal to allow us to have BIOS editors or advanced configuration settings to fix their mess (i.e. increase the power limit enough for the memory to stop suffocating the GPU ASIC), all while AMD has advanced in-driver hardware control settings. I like my hardware to be manageable, and Ampere is everything but that.

A small price reduction on the original would have made it a much better product to compete with RDNA 2 offerings in both segments surrounding it, imho, even in today's absurd market conditions.


----------



## TheoneandonlyMrK (Jun 9, 2021)

Not sure how you can give it a Highly Recommended badge when, just above that badge, you can barely recommend it. Odd. @W1zzard

Other than that sound review, no issues with any of it but the badge.

I should have read the thread first; I'm not alone.


----------



## Mistral (Jun 9, 2021)

You're really trying hard to devalue that "Highly Recommended" badge...


----------



## Legacy-ZA (Jun 9, 2021)

I waited for this turd?    



*EDIT*

It should have had a minimum of 10GB VRAM and performance right in the middle of a 3070 & 3080. In my opinion, the only cards worth having this generation are the 3080 & 3060 Ti; the rest are just straight-up trash.


----------



## ppn (Jun 9, 2021)

Apple M1: 5 nm, 16 B transistors / 120 mm².
GA104: 8 nm, 17.2 B transistors / 392 mm². On 5 nm, the 3080 Ti would be a 4060 at best.

So waiting only makes sense for a new node, at least 6 nm with 66 Mtr/mm² density.
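A quick back-of-the-envelope check of those density figures (the chip specs above are taken as given; the script is just illustrative arithmetic):

```python
# Transistor density = transistor count / die area, in Mtr/mm^2.
chips = {
    "Apple M1 (5 nm)": (16.0e9, 120),   # transistors, die area in mm^2
    "GA104 (8 nm)": (17.2e9, 392),
}
for name, (transistors, area_mm2) in chips.items():
    density_mtr = transistors / area_mm2 / 1e6
    print(f"{name}: {density_mtr:.0f} Mtr/mm^2")
```

That works out to roughly 133 Mtr/mm² for the M1 versus about 44 Mtr/mm² for GA104, which is why even a 66 Mtr/mm² 6 nm node would already be a big step up.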


----------



## RedelZaVedno (Jun 9, 2021)

5% more performance for 20% higher MSRP and 30% higher power consumption. Great value compared to the 3080 Ti 
NGeedia & AMD don't even pretend to care anymore; it's just a mindless money grab.
RIP value-oriented DIY PC builders, you've fallen out of Ngreedia's love


----------



## W1zzard (Jun 9, 2021)

@Mistral, @TheoneandonlyMrK, @Aretak, @zilul 
the issue with "Highly Recommended" is that "Highly" is just there to fill the space above "Recommended" in the award image. I couldn't come up with a better idea for the design back when I made those badges.

I have only four awards to pick from "Budget", "Recommended", "Editor's Choice" and "Innovation".

Ideas?


----------



## RedelZaVedno (Jun 9, 2021)

W1zzard said:


> @Mistral, @TheoneandonlyMrK, @Aretak, @zilul
> the issue with "Highly Recommended" is that "Highly" is just there to fill the space above "Recommended" in the award image. I couldn't come up with a better idea for the design back when I made those badges.
> 
> I have only four awards to pick from "Budget", "Recommended", "Editor's Choice" and "Innovation".
> ...


Maybe you should add at least two more: "Not Recommended" for when a product is bad in general (Vega 64 at launch) and "Bad Value" for when its MSRP price-to-performance ratio is insanely bad (3080 Ti).


----------



## Chomiq (Jun 9, 2021)

W1zzard said:


> @Mistral, @TheoneandonlyMrK, @Aretak, @zilul
> the issue with "Highly Recommended" is that "Highly" is just there to fill the space above "Recommended" in the award image. I couldn't come up with a better idea for the design back when I made those badges.
> 
> I have only four awards to pick from "Budget", "Recommended", "Editor's Choice" and "Innovation".
> ...


"Moneygrab" with an open fist reaching for wallet.


----------



## Operandi (Jun 9, 2021)

Chomiq said:


> "Moneygrab" with an open fist reaching for wallet.


I mean, the world has bigger problems, so who really cares, honestly, but nVidia's level of tone-deafness in releasing this generation of Tis in this climate is pretty amazing. I kinda wish review media would have gotten together and boycotted reviewing this garbage.


----------



## TheoneandonlyMrK (Jun 9, 2021)

W1zzard said:


> @Mistral, @TheoneandonlyMrK, @Aretak, @zilul
> the issue with "Highly Recommended" is that "Highly" is just there to fill the space above "Recommended" in the award image. I couldn't come up with a better idea for the design back when I made those badges.
> 
> I have only four awards to pick from "Budget", "Recommended", "Editor's Choice" and "Innovation".
> ...


Don't give it a badge at all if it's a bit meh in your opinion. Does every review get a badge? If so, more badge types.

More badges, hmm: "If you can get it", "Gamer recommended". I mean, really, you have Budget, so why not Mainstream or High-end too, and then also a "Value recommended".

Just seems odd to barely recommend something that's highly recommended by you. I read the review and found it fair and sound; I was genuinely surprised by the badge.

Shit, even an honest "Barely Recommended" could and should be possible.


----------



## Wasteland (Jun 9, 2021)

TheoneandonlyMrK said:


> Don't give it a badge at all if it's a bit meh in your opinion ,does every review get a badge?, If so more badge types.


----------



## B-Real (Jun 9, 2021)

1. Same amount of RAM as the 3070.
2. 5-7% faster than the 3070 (at the two resolutions desirable for a card this powerful).
3. Nearly 20% less efficient than the 3070. WTF
4. GPU is 82 degrees. WTF
5. Yet costs $100 more. WTF

Sorry, but I can only call the 3070 Ti a piece of crap. At least the 3080 Ti was very close to the 3090 and it is cheaper (on paper and in real life) than the 3090.


----------



## Dr. Dro (Jun 9, 2021)

I agree with the suggestions made for a new "Not Recommended" badge (though this should be used only for truly bad products). The thing here is that 3070 Ti is a bit of an odd release - as a product, it doesn't shine but it doesn't stink either, my opinion is that it doesn't deserve to be "Not Recommended" but personally, I wouldn't be comfortable "recommending" it either, given that the competition has adequate products at similar price brackets that do not share its caveats and potential problems arising from the use of exotic memory technology.

Personally, I would have wrapped it up without giving it any award and mentioned that the cost-to-benefit ratio, in current market conditions, makes it a fair (not amazing! but nowhere near bad, either!) product to own if purchased at MSRP, but not something that you should go out of your way, or part with a significant amount of cash, to own.

The most interesting bit is that this is not entirely unlike its direct predecessor, the RTX 2080 Super. It was also the full xx104 chip with enhanced memory and that was actually a great product at its time. The 3070 Ti is just more of that - and at $600, it's not necessarily a bad thing. Just unremarkable in face of competition.

If the RTX 3070 Ti has earned such a lukewarm reception among enthusiasts, it is solely because of AMD's highly competitive offerings, and that is frankly amazing; a price war would have sparked long ago if the ongoing shortages due to supply and mining weren't in effect.

When was the last time NVIDIA truly had to sweat to keep up? I would argue it was when the Radeon HD 5000 series had full DirectX 11 support almost a year in advance while NVIDIA still lingered with the DX10.0 GTX 200 series, and that's a good 12 years ago now.


----------



## Fluffmeister (Jun 9, 2021)

Yeah not great, but they will sell them all.

Meh.


----------



## Chrispy_ (Jun 9, 2021)

Impressive.

Unlike the 3080 Ti, which is just defective 3090 dies dumped with no particular effort onto a 3080 FE board, this does not feel like a phoned-in effort; it feels like a complete redesign that is a step up from the 3070 in several key ways.

If it's still a relevant GPU when the scalping/supply pricing finally subsides, I'll be picking one of these up. I'm not holding my breath, though; I suspect GPUs will be basically unattainable at MSRP for longer than the generational lifespan of Ampere. Unless you're both wealthy and desperate, you won't be buying a $1300+ GPU for the current slew of games. You'll either make do with the GPU you have at reduced settings, or you'll be picking up something way less powerful than you want as a placeholder for the next year or so.


----------



## B-Real (Jun 9, 2021)

Chrispy_ said:


> Impressive.
> 
> Unlike the 3080Ti which is just defective 3090 dies dumped with no particular effort into a 3080FE board, this does not feel like a phoned-in effort; It feels like a complete redesign that is a step up from the 3070 in several key ways.
> 
> If it's still a relevant GPU when the scalping/supply pricing finally subsides, I'll be picking one of these up. I'm not holding my breath though, I suspect GPUs to be basically unattainable at MSRP for longer than the generational lifespan of Ampere GPUs. Unless you're both wealthy and desperate, you won't be buying a $1300+ GPU for the current slew of games. You'll either make do with the GPU you have at reduced settings, or you'll be picking up something way less powerful than you want as a placeholder for the next year or so.


How can you say impressive for a card like that? You get a 5-7% performance uplift for 20% less efficiency. On top of that, it's nearly 10 degrees hotter than the reference 3070... It also has the same amount of VRAM as the 3070, and 8 GB is NOT enough for 4K gaming if you play a variety of games. Yet it costs $100 more. You can't even call this card OK, let alone impressive... The 3080 Ti might be a cut-down version of the 3090, but it nearly equals it in performance with ~10% better efficiency. Yes, it has half the VRAM, but 12 GB is still enough, and it costs $300 less. The 3080 Ti is far from a great card either, but this 3070 Ti is utter garbage.


----------



## W1zzard (Jun 9, 2021)

B-Real said:


> 8 GB is NOT enough for 4K gaming if you play a variety of games


Not seeing anything in my data that would suggest this to be true. Yes, of course .. modded Skyrim with uncompressed textures.


----------



## TrantaLocked (Jun 9, 2021)

If I had $600 I would still pick the normal 3070. 300 watts is too much.


----------



## Operandi (Jun 9, 2021)

W1zzard said:


> Not seeing anything in my data that would suggest this to be true. Yes, of course .. modded Skyrim with uncompressed textures.


Doom Eternal and, I think, the most recent Resident Evil can push right up to or exceed 8GB with the right settings, and I'm sure there are other examples.

Point being, even if 8GB is fine for 90% of the titles today, a high-end GPU shouldn't be running near its capacity the day it's released; it's going to be nearly useless in a few years' time. This is just more "F you" from Nvidia with this release.


----------



## W1zzard (Jun 9, 2021)

Operandi said:


> Doom Eternal and I think the most recent Resident Evil can push right up to to or exceed 8GB with the right settings and I'm sure there are other examples.











Resident Evil 8 Village Benchmark Test & Performance Review (www.techpowerup.com)




While allocations might reach 8 GB, there's no performance penalty for cards with less VRAM, because the game doesn't even come close to using all of that data



Operandi said:


> its going to be nearly useless in a few years time


That will happen with 8 GB or 16 GB.


----------



## altermere (Jun 9, 2021)

such a wasted, pointless and short-lived GPU generation all around, and I thought Fermi was bad.


----------



## Operandi (Jun 9, 2021)

W1zzard said:


> Resident Evil 8 Village Benchmark Test & Performance Review
> 
> 
> Resident Evil Village is the first Resident Evil with support for raytracing. The game looks fantastic, yet achieves very high FPS on all graphics cards. Especially GPUs using the AMD RDNA2 architecture deliver amazing performance, even with raytracing enabled.
> ...


I'm aware of the difference between usage and allocation, but without game- or graphics-driver-debugger-level access it's impossible to know exactly what a game is doing. Hardware Unboxed found limits in Doom with 8GB cards. Not sure where I saw/read about the 8GB issue with Resident Evil, and you can mitigate the issue by tweaking settings, but it doesn't really matter; the issue exists today.

And equating an 8GB card with a 16GB one is ridiculous; rasterization performance being equal, a card with 16GB is going to have a much longer useful lifespan than an 8GB one. This is a "high-end" card; it should be able to maintain its "high-end" status for several years.


----------



## Darksword (Jun 9, 2021)

looniam said:


> all it had to do was "beat" the 6800 . .
> View attachment 203238
> 
> what a fail; gives nothing over the 3070.



It gives more heating in the winter.


----------



## Solid State Soul ( SSS ) (Jun 9, 2021)

A $600 GPU with 8 GB VRAM in 2021 is ridiculous. Nvidia introduced 8 GB with the 1070, *HALF A DECADE AGO!*

Nvidia made 0% performance-per-watt improvement over the RTX 2000 series. What were they doing??!

They can't have been focusing just on ray tracing performance, a feature most gamers already turn off because the performance hit is so not worth it


----------



## TheoneandonlyMrK (Jun 9, 2021)

Solid State Soul ( SSS ) said:


> A $600 GPU with 8 GB VRAM in 2021 is ridiculous. Nvidia introduced 8 GB with the 1070, *HALF A DECADE AGO!*


Tbf, it's clearly not the same memory, but I personally agree on the amount, if only because I've had 8GB for years; my next upgrade would have to be better. I use them a while.


----------



## biffzinker (Jun 9, 2021)

Solid State Soul ( SSS ) said:


> A $600 GPU with 8 GB VRAM in 2021 is ridiculous. Nvidia introduced 8 GB with the 1070, *HALF A DECADE AGO!*


Isn’t that because of the last gen consoles?


----------



## N3M3515 (Jun 9, 2021)

LOL!!, wtf is this crap??
Definitely not even needed. If you did a blind test between this and the vanilla 3070, you wouldn't know which is which. For $100 more!!

OMG this is incredible.


----------



## LFaWolf (Jun 10, 2021)

Is the 3070 Ti really considered "high end"? Above it we have 3 more SKUs: 3080, 3080 Ti, 3090. I would say it is upper-middle at best.


----------



## Solaris17 (Jun 10, 2021)

LFaWolf said:


> Is 3070 ti really considered as “high end”?



I personally don't, anyway.

XX60 entry
XX70 mid level
XX70 Ti mid to high
XX80 high end
XX80 Ti high end
XX90/TITAN enthusiast

I thought it's always been like this since the 3-digit cards.


----------



## Metroid (Jun 10, 2021)

3070 Ti --> fail, enough said. Even the normal 3070 was a fail in my list; the 3080 is ages better for the price. 3080 Ti = fail too.


----------



## xorbe (Jun 10, 2021)

> Gaming power consumption is increased significantly over the RTX 3070, by 80 W (!) for +7% performance—hard to justify. The underlying reason is that the GDDR6X memory is faster, but also more power hungry.



I have a very hard time believing that the vram itself is using +80 watts more.  I think they are juicing the gpu a little harder, which consumes disproportionately more power at the top.  Or a foundry change ...


----------



## Zareek (Jun 10, 2021)

An absolute money grab if I've ever seen one. Nvidia is just another Apple; they have no intention of selling anything for a reasonable, honest price. Their mantra has become "how much can we possibly charge for this and still sell it?" Capitalism gone bad! Whatever happened to honest companies making reasonable earnings? Existing to fill a need, provide jobs, and contribute to their communities!


----------



## TheinsanegamerN (Jun 10, 2021)

W1zzard said:


> Resident Evil 8 Village Benchmark Test & Performance Review
> 
> 
> Resident Evil Village is the first Resident Evil with support for raytracing. The game looks fantastic, yet achieves very high FPS on all graphics cards. Especially GPUs using the AMD RDNA2 architecture deliver amazing performance, even with raytracing enabled.
> ...


Gamers Nexus found a substantial difference in performance between the 3070 Ti and the 3080 running Eternal at 4K. The 3070 Ti is notably faster in his testing at 4K, but the gap between the 3070 Ti and the 3080 is far larger than it should be. At 1440p the gap is significantly smaller, suggesting VRAM limitations.


Solaris17 said:


> I personally dont anyway.
> 
> XX60 entry
> XX 70 mid level
> ...


Fermi was 

x5x entry
x6x mid range (includes x6xse low mid range later in life)
x6xti mid high range
x7x high end
x8x high end
x9x enthusiast dual GPU


----------



## LFaWolf (Jun 10, 2021)

TheinsanegamerN said:


> Fermi was
> 
> x5x entry
> x6x mid range (includes x6xse low mid range later in life)
> ...



LMAO! Fermi was more than 10 years ago! Get with the program man!


----------



## Why_Me (Jun 10, 2021)

TrantaLocked said:


> If I had $600 I would still pick the normal 3070. *300 watts is too much.*


Not if you can afford a bigger power supply.  I'll take one of those MSI 3070 Ti's thank you very much.


----------



## Deleted member 205776 (Jun 10, 2021)

Pointless GPU, makes me glad I have my 3070. It's the exact same thing except with memory that runs hotter than the surface of Mercury, and a backplate colder than the planet's backside due to those thermal pads doing next to nothing. Mine runs at 2100 MHz @ 950mv down from the stock 1995 MHz @ 1050mv so I'm almost matching its performance while consuming less power lol. And not everyone wants high TDP components in their PC. I avoided the higher end Ampere cards for this reason despite my power supply being more than up for the task. I hope the next xx70 card doesn't have a TDP that would make the R9 295x2 shake in its boots.


----------



## turbogear (Jun 10, 2021)

TrantaLocked said:


> If I had $600 I would still pick the normal 3070. 300 watts is too much.


I wonder if you can really find a 3070 at $600 where you live.
In Germany the cheapest offer I can see is 1099€.


			https://www.idealo.de/preisvergleich/ProductCategory/16073F101483660.html?sortKey=minPrice
		


Mindfactory.de usually has the most volume in stock for every current-generation Nvidia and AMD card, and they are selling the 3070 for 1199€.
https://www.mindfactory.de/Hardware/Grafikkarten+(VGA)/GeForce+RTX+fuer+Gaming/RTX+3070.html

Most probably the 3070 Ti will be sold at an even higher price.
The prices are really crazy at the moment.


----------



## looniam (Jun 10, 2021)

Operandi said:


> I'm aware of the difference between usage and allocation, but without game- or graphics-driver-debugger-level access it's impossible to know exactly what a game is doing. Hardware Unboxed found limits in Doom with 8GB cards. Not sure where I saw/read about the 8GB issue with Resident Evil, and you can mitigate the issue by tweaking settings, but it doesn't really matter; the issue exists today.
> 
> And equating an 8GB card with a 16GB one is ridiculous; rasterization performance being equal, a card with 16GB is going to have a much longer useful lifespan than an 8GB one. This is a "high-end" card; it should be able to maintain its "high-end" status for several years.


methinks you're missing how titan-class cards (Kepler, Maxwell, Pascal), with twice the VRAM of their (x)x80 Ti counterparts, didn't last any longer; by the time the VRAM becomes of use, the (now outdated) GPU arch becomes the bottleneck. Google some old GTX 770 2/4GB reviews; adding more VRAM isn't always a good thing and can be a waste.

and "the emperor wears no clothes": this is not a high-end GPU. hell, it's not even a 4K GPU, imo, with a 104 chip.


----------



## lexluthermiester (Jun 10, 2021)

W1zzard said:


> I have only four awards to pick from "Budget", "Recommended", "Editor's Choice" and "Innovation".
> 
> Ideas?


Maybe a Bronze, Silver and Gold "recommended" badge. It's a universally recognizable tier gradient and would be very helpful to differentiate higher recommended cards from the lower. "Editor's Choice" would then be reserved for the "cream of the crop" products.


----------



## matar (Jun 10, 2021)

Trying to see if I am lucky enough to grab one. Damn, the times we live in; I hate scalpers and miners.


----------



## Anymal (Jun 10, 2021)

No gpu no cry.


----------



## Chrispy_ (Jun 10, 2021)

B-Real said:


> How can you say impressive for a card like that? You get a 5-7% performance uplift for 20% less efficiency. On top of that, it's nearly 10 degrees hotter than the reference 3070... It also has the same amount of VRAM as the 3070, and 8 GB is NOT enough for 4K gaming if you play a variety of games. Yet it costs $100 more. You can't even call this card OK, let alone impressive... The 3080 Ti might be a cut-down version of the 3090, but it nearly equals it in performance with ~10% better efficiency. Yes, it has half the VRAM, but 12 GB is still enough, and it costs $300 less. The 3080 Ti is far from a great card either, but this 3070 Ti is utter garbage.


Why are you worried about efficiency when this is obviously the same GA-104 silicon being pushed harder? *OF COURSE* efficiency is down, that's what happens when you push higher voltages and higher clocks. This isn't new silicon, and the laws of physics *always* apply.

As for pricing, I covered that by saying this will _never_ be attainable at MSRP. The botters have proven that they can get the lion's share of Founders Edition stocks, and so few of those are made in the first place that whatever's left for end-users to actually buy might as well be vaporware.


----------



## W1zzard (Jun 10, 2021)

xorbe said:


> I have a very hard time believing that the vram itself is using +80 watts more.  I think they are juicing the gpu a little harder, which consumes disproportionately more power at the top.  Or a foundry change ...


There's more to it than just memory, I talked about this in the conclusion



lexluthermiester said:


> Maybe a Bronze, Silver and Gold "recommended" badge. It's a universally recognizable tier gradient and would be very helpful to differentiate higher recommended cards from the lower. "Editor's Choice" would then be reserved for the "cream of the crop" products.


I love that idea! Maybe I could remove "Highly", and put "Bronze", "Silver", "Gold" in the empty space instead?


----------



## TheoneandonlyMrK (Jun 10, 2021)

W1zzard said:


> There's more to it than just memory, I talked about this in the conclusion
> 
> 
> I love that idea! Maybe I could remove "Highly", and put "Bronze", "Silver", "Gold" in the empty space instead?


It does sound reasonable, but I noticed and liked that some products were not recommended. Any improvement here is golden because, after giving it some thought: how many people skip to the end, see 3 stars and Highly Recommended, and just go with your opinion based on the smaller amount of input they took in? Surely not what you would like, but in percentage terms, it happens.

Obviously, though, now in text: love your reviews, work, and opinion, keep it up.


----------



## Anymal (Jun 10, 2021)

The 3070 Ti most definitely is NOT highly recommended; maybe a "Nice Try" badge.


----------



## Chomiq (Jun 10, 2021)

The more I think about it, the more it looks like I'll have to wait for another generation of GPUs to drop before upgrading from my 1060.


----------



## Charcharo (Jun 10, 2021)

"Personally, I'm not a fan of going all out on VRAM size, none of our benchmarks show any noteworthy performance issues arising from 8 GB VRAM capacity. Actually, it seems likely DirectStorage, a technology that was first pioneered on the new consoles, will reduce VRAM pressure by optimizing the disk to GPU memory path."

@W1zzard, you have Doom Eternal in the review. That game at 4K does show differences between 8GB cards and cards with 10 GB or more. I don't know where you test; I presume it is the tiny first level of the game, but on my old RTX 2080, switching to the otherwise slightly weaker (at 1440p and 1080p) GTX 1080 Ti led to higher averages and lows. I test in Blood Swamps (a DLC level) or Urdak (a large end-game mission).
It is still playable on 8GB with occasional stutters (it's less playable in end-game levels, which are bigger and use more memory), so the game likely isn't going MUCH above 8GB, but it is definitely using more. The RX 5700 XT had the same issue. The RTX 3080 10GB and RX 6900 XT 16GB don't have a problem at all.

Also... as for allocation and usage lol.

Games don't allocate memory in the sense of "give me 10GB, I'll throw my stuff in that pool". They allocate memory only for stuff that is needed, plus or minus some granularity loss. If a game "allocates" 10GB, there is indeed 10GB of data in use. The catch is that not all of this data may be required to draw a given frame right now.

Let's say you're in a cube where each side has a different texture. You can only see at most 5 sides at any point in time; does that mean the memory for the 6th wall is "allocated but not used"? No. If you turn around, it will need to be rendered, meaning that if you didn't have the VRAM required to hold that 6th texture, you would get a lag spike when you turn around as it swaps back into VRAM in place of the wall that just became invisible. Then if you turn around again, you have to swap VRAM again...

In big games there usually isn't a single place in an area where you can see every single asset in the scene, so you can in fact get away with less VRAM than what the game calls for, but you will see higher frametimes as you start moving around and running into assets that previously spilled over into system RAM. Especially if you actually play the game, you know, move around and are in it for more than 1-10 minutes.

That is why VRAM testing is tricky. It is not easy to do via normal short tutorial benches. 

Other games that use more VRAM are Wolfenstein 2 (use actual max settings, not the preset), Cyberpunk with RT at 1440P+, supposedly the new US propaganda games (CoD) too. For Wolfenstein and CB2077 I am 100% certain though.
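The turn-around-and-swap effect described above can be sketched with a toy LRU residency model (the class, texture names, and sizes are all made up for illustration; real drivers are far more sophisticated):

```python
from collections import OrderedDict

class VramCache:
    """Toy LRU model of texture residency (sizes in MB).
    Hypothetical numbers purely for illustration, not measured game data."""
    def __init__(self, capacity_mb):
        self.capacity = capacity_mb
        self.resident = OrderedDict()  # texture name -> size in MB
        self.swap_ins = 0              # each swap-in = a potential frametime spike

    def touch(self, name, size_mb):
        if name in self.resident:
            self.resident.move_to_end(name)    # recently used: keep it hot
            return
        self.swap_ins += 1                     # miss: upload over PCIe
        self.resident[name] = size_mb
        while sum(self.resident.values()) > self.capacity:
            self.resident.popitem(last=False)  # evict least recently used

# A level with 12 GB of assets; the player walks back and forth between
# two areas, each needing ~6 GB of textures.
area_a = [(f"a{i}", 600) for i in range(10)]   # 6000 MB
area_b = [(f"b{i}", 600) for i in range(10)]   # 6000 MB

for cap in (8000, 16000):
    cache = VramCache(cap)
    for _ in range(5):                         # walk back and forth 5 times
        for name, size in area_a + area_b:
            cache.touch(name, size)
    print(cap, cache.swap_ins)
```

The 16 GB cache only misses once per unique texture, while the 8 GB cache keeps evicting and re-uploading as you move between areas, which is exactly the "spilled into system RAM" frametime penalty, even though both cards run the same "allocation".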


----------



## trog100 (Jun 10, 2021)

a Ti card is all about the price.. nearly as good as the next card up but for less money.. in a world where recommended retail prices are meaningless nonsense, so is this card..

trog


----------



## ppn (Jun 10, 2021)

Nearly as good would be 7680 CUDA cores on a 384-bit bus but only 80 ROPs; can't be GA102. Can't be 16GB because the 3080 Ti is 12GB, and GDDR6X is very expensive and in low supply, so decisions, decisions.


----------



## Solaris17 (Jun 10, 2021)

TheinsanegamerN said:


> Gamers Nexus found a substantial difference in performance between the 3070ti and the 3080 running eternal in 4k. The 3070ti is notably faster in his testing at 4k, but the gap between the 3070ti and the 3080 is far larger then it should be. At 1440p this gap is significantly smaller, suggesting VRAM limitations.
> 
> Fermi was
> 
> ...



Meh, I'm aware of the x50 and x30 cards; I just didn't consider them because I never thought of them as anything more than "I need multiple monitors" GPUs, so I still stand by my list.


----------



## Chrispy_ (Jun 10, 2021)

trog100 said:


> a tI card is all about the price.. nearly as good as the next card up but for less money.. in a world where recommended retail prices are meaningless nonsense so is this card..
> 
> trog


Historically, Nvidia's Ti has meant many different, conflicting things since it was first introduced with the GeForce 2.

GF2 Ti (die shrink, clocked a bit higher)
GF3 Ti (binning process, both better and worse options than the original GF3)
GF4 Ti (branding distinction between new architecture and rebranded NV1x silicon)
GF 560 Ti (total mess, three different SKUs with that name, all vastly different specs/prices to the vanilla GF560)
GF 650 Ti (literally double the card of the vanilla GF 650)
GF 660 Ti (almost identical to the vanilla GF 660, but with one extra SM)

I mean, I could go on, but I think from that list you can see that Ti has been meaningless in terms of definition since Nvidia introduced it. There's no consistency between generations, and there's no consistency between Ti models of the same generation.

Ti is just whatever Nvidia decides it's going to be for that specific SKU, which in this case is a fully-enabled GA-104 die (almost insignificant improvement) with a power-draw increase and a price increase to cover the beefier cooler, overclock, and switch to GDDR6X VRAM. If you can pick one up for not much more than a vanilla 3070, it's a good deal. If you can pick one up at MSRP, you should sell it at market value and pocket the $700 you make as instant profit.


----------



## ppn (Jun 10, 2021)

As a regular person you can't just get one at MSRP, because some billionaire bought all of them at the factory exit and shipped them in containers to an unknown location to just gather dust, sold half at triple MSRP, and keeps the rest to gradually release, also at 3x MSRP, but not too quickly. This is just criminal, so why endorse the same conduct?


----------



## Anymal (Jun 10, 2021)

God damn it, eu.evga: the 3070 Ti was "coming soon", and seconds later the page was at full capacity.


----------



## W1zzard (Jun 10, 2021)

birdie said:


> Is it too much to ask to attach a screenshot of GPU-Z for the card you're testing? Many people would really appreciate that.


Like the one on the overclocking page.. just like in my last 700 (!) reviews?








GeCube Radeon HD 3850 X-Turbo III 512 MB Review

AMD's new Radeon HD 3850 has been a major upgrade in both performance and efficiency. GeCube has taken the reference design and increased the clock speeds. They also doubled the available memory to 512 MB GDDR3 and use a two-slot cooler on the card. But can this yield enough performance to...

www.techpowerup.com


----------



## Tatty_One (Jun 10, 2021)

W1zzard said:


> There's more to it than just memory, I talked about this in the conclusion
> 
> 
> *I love that idea! Maybe I could remove "Highly", and put "Bronze", "Silver", "Gold" in the empty space instead?*


Bronze, Silver & Gold makes really good sense, but maybe keep that fourth "Editor's Choice" award for the really special ones that balance innovation, performance, and price. If so, this award should be pretty rare!


----------



## ppn (Jun 10, 2021)

What sense does that make? At least add Platinum and Titanium to the list. Titanium is an instabuy; it's not simply efficient, it produces energy back, yeah. "Highly Recommended" at least means you can buy it (if you can handle the pricing); the point is that there are no issues with the product, other than the 350-watt stove and the 8GB framebuffer that can be found in the likes of the 6600 XT, while this card is clearly 60% faster and deserves at least 12GB.


----------



## Operandi (Jun 10, 2021)

W1zzard said:


> There's more to it than just memory, I talked about this in the conclusion
> 
> 
> I love that idea! Maybe I could remove "Highly", and put "Bronze", "Silver", "Gold" in the empty space instead?


Does it need a badge though, did they really earn it?  

The review itself is really solid and your conclusions can only be your own, but I read the conclusion page and you state plenty of stipulations on recommending it; myself and others have even more negative things to say. This is obviously a pretty tepid release with too many compromises and way too high an asking price to really be recommended at all, in my opinion.


----------



## Icon Charlie (Jun 10, 2021)

The one and only reason I believe Nvidia is doing this kind of pricing, by putting out this many cards, is to keep prices at their MSRP. This will keep the series of video cards stable in price for as long as possible. You would need the entire video card sector to fall dramatically for any kind of price fluctuation, and even if that happens, the price reduction will be minimal.

IMHO this is a DOG of a card.


----------



## Operandi (Jun 10, 2021)

Icon Charlie said:


> The one and only reason I believe Nvidia is doing this kind of pricing, by putting out this many cards, is to keep prices at their MSRP. This will keep the series of video cards stable in price for as long as possible. You would need the entire video card sector to fall dramatically for any kind of price fluctuation, and even if that happens, the price reduction will be minimal.


No it won't. Supply and demand determine market price, and when demand outstrips supply to this degree, MSRP is meaningless. This is just Nvidia's greed taking a bigger piece of the "value pie" that the market has set for these cards.


----------



## lexluthermiester (Jun 10, 2021)

W1zzard said:


> Maybe I could remove "Highly", and put "Bronze", "Silver", "Gold" in the empty space instead?


That sounds like a great idea. There you go!



Tatty_One said:


> but maybe keep that 4th "Editors Choice" for the really special ones that balance innovation, performance and price, if so this award should be pretty rare!


Agreed and also products that are so exceptional that they deserve special recognition.


----------



## mechtech (Jun 10, 2021)

@W1zzard 

great review as always

a few thoughts/suggestions 

Maybe put a GPU-Z screenshot on the first page with your chart.
For the chart itself, would it be possible to add columns for TMUs and CUs beside the ROPs and shaders?

thanks


----------



## TrantaLocked (Jun 11, 2021)

Chrispy_ said:


> Why are you worried about efficiency when this is obviously the same GA-104 silicon being pushed harder? *OF COURSE* efficiency is down, that's what happens when you push higher voltages and higher clocks. This isn't new silicon, and the laws of physics *always* apply.
> 
> As for pricing, I covered that by saying this will _never_ be attainable at MSRP. The botters have proven that they can get the lion's share of Founders Edition stocks, and so few of those are made in the first place that whatever's left for end-users to actually buy might as well be vaporware.


The 3070 Ti could have been less efficient for any number of reasons; it makes zero difference in whether someone likes or dislikes the loss of efficiency. This is like saying "but why do you care about ice cream having sugar in it? OBVIOUSLY they added it, so it's OK because we _understand_ it has sugar!" Or like "captain, we shouldn't arrest him, because we _understand_ that criminals are unstable people who make bad decisions, so why bother with a man who just assaulted someone?" Like, no, that is not how decisions work.

I'm honestly tired of seeing this sentiment every time there is discussion of Ampere's bad power efficiency, where people try to defend it and tell others what their opinion should be because we uNdErStAnD why. IT DOES NOT MATTER WHY the power efficiency is what it is. Samsung 8nm, GDDR6X, architecture: it is all irrelevant to someone's opinion of a generation that is barely more efficient than five-year-old Pascal. People need to stop telling others what their opinions about graphics cards should be, because it is incredibly annoying.


----------



## R-T-B (Jun 11, 2021)

W1zzard said:


> Like the one on the overclocking page.. just like in my last 700 (!) reviews?
> 
> 
> 
> ...


OT but GeCube Radeon sounds like a Radeon that really identifies as a GeForce.


----------



## Tom Sunday (Jun 15, 2021)

It was reported that STALKER 2 will be arriving by mid-2022 with a very high barrier to entry in its minimum GPU specs. What this signals is that RTX 3070 and RTX 3080 cards will essentially be outclassed, or by then just barely meet the minimum requirements of most new and upcoming AAA gaming titles. Since game developers never stand still, the GPU shortage, and the continuing unavailability of any 3000-series cards, has considerably shrunk their future-proofing period. Then there are the "ultrawide" monitors hitting the decks in waves, putting even more strain on needed GPU performance. With that, NVIDIA had better come out with a brand new (more powerful mainstream) GPU product in the next 10 months or so to catch up! It was already a shame that NVIDIA was not listening to the masses, supplying their new 3080 Ti with a paltry 12GB of memory. Surely it's all about marketing by NVIDIA, of which I know very little, being a simple man on the street.


----------



## Chrispy_ (Jun 15, 2021)

Tom Sunday said:


> It was reported that STALKER 2 will be arriving by mid-2022 with a very high barrier to entry in its minimum GPU specs.
> 
> ...


Whether or not 8GB of VRAM is enough isn't really a question anyone can answer.
What we do know for sure, from Steam surveys and market data, is that 12GB and 16GB cards are being adopted, and game devs looking for a graphical edge will _absolutely_ start taking advantage of that extra VRAM.


----------



## W1zzard (Jun 15, 2021)

Tom Sunday said:


> It was reported that STALKER 2 will be arriving by mid-2022 and revealing a very high barrier of entry as to required minimum GPU specs


Haven't we heard that countless times, and then the game was a huge downgrade from E3, or terrible gameplay, or some other fail?



Chrispy_ said:


> will _absolutely_ start taking advantage of that extra VRAM


we'll see if they go significantly beyond what the consoles offer, I don't think they ever have


----------



## altermere (Jun 17, 2021)

Tom Sunday said:


> It was reported that STALKER 2 will be arriving by mid-2022 with a very high barrier to entry in its minimum GPU specs.
> 
> ...


yeah, history is repeating itself once again. i think most know by now that Nvidia (and AMD to a lesser extent) cards released at the start of a new console cycle age like boxed wine: Kepler was the prime example. developers never make games for "those who didn't upgrade in time", they develop around consoles and their limits; PC gets unoptimized scraps that we just bulldoze through with raw power.


----------



## Charcharo (Jun 17, 2021)

W1zzard said:


> Haven't we heard that countless times, and then the game was a huge downgrade from E3, or terrible gameplay, or some other fail?
> 
> 
> we'll see if they go significantly beyond what the consoles offers, I don't think they ever have



STALKER is an ironic example. It was massively downgraded in gameplay and storytelling from the original design documents and even beta versions... yet it is still head and shoulders above Metro Exodus, Far Cry 5, and Fallout 4 as a game. And its mods are usually absolutely top tier in open-world FPS gameplay. 

As for VRAM, W1zzard: in Cyberpunk 2077 with RT on, 8GB is not enough. Even DF (who don't test complex areas) and Computerbase agree on this. Hell, Computerbase sees other problems too, not just in that game.

You can also install Wolfenstein 2 and manually enable all settings to max, since Mein Leben doesn't actually enable everything at max. And it will stutter or crash at 1440p on an 8GB card. There is also Doom Eternal at 4K as well, but yeah. 

I used an RTX 2080 at 4K and an RX 5700 XT at 4K as well. Both were generally good, but often I had to lower textures and/or other VRAM-heavy settings that impacted long-term gameplay stability. Because yeah, an 8GB card can handle a single loaded room in Urdak, but once you go to the next one the pain starts.


----------



## Jordlr (Jul 7, 2021)

Personally I don't care about the extra power consumption or the extra cost above the 3070.

In the real world, the 70W difference equates to an extra 1.2 pence per hour in electricity costs in the UK. I just managed to pick up a 3070 Ti for £600. The 3070 is still unobtainable, and when you can get one it's around the £800 region, so I would need to run at max load 24/7 for 22 months before the costs equal out. In real terms, the 3070 Ti plus the extra electricity costs is still going to be cheaper than buying a 3070 or 6800 GPU in the years before upgrading again, all while enjoying my extra 7% performance and RTX gaming.
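The break-even arithmetic above can be sanity-checked with a quick sketch. The £600/£800 prices and the 70W gap are from the post; the ~17p/kWh tariff is my assumption (it's roughly what yields the quoted ~1.2p/hour):

```python
# Break-even estimate: extra running cost of the 3070 Ti vs. the money saved
# on purchase. Tariff is an assumed figure, not from the post.
extra_watts = 70                  # power difference from the review
tariff_gbp_per_kwh = 0.17         # assumed UK electricity price
price_gap_gbp = 800 - 600         # £800 street-price 3070 vs. £600 3070 Ti

extra_cost_per_hour = extra_watts / 1000 * tariff_gbp_per_kwh   # in £
hours_to_break_even = price_gap_gbp / extra_cost_per_hour
months_at_full_load = hours_to_break_even / (24 * 30)

print(f"{extra_cost_per_hour * 100:.2f}p/hour")      # ≈ 1.19p/hour
print(f"{months_at_full_load:.0f} months at 24/7")   # ≈ 23 months
```

At a higher tariff the break-even comes sooner, but the conclusion holds either way: it takes roughly two years of continuous max load before the cheaper-to-run card wins.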


----------



## AteXIleR (Oct 8, 2021)

I replaced my 3070 with a 3070 Ti based on current market prices, and on the assumption that badly set voltage curves are the cause of the serious increase in power consumption, and that this can be fixed. I looked through multiple reviews of several models, and in all of them the default voltages seemed conspicuously high; it was also suspicious to me that the power draw couldn't be explained by the improved memory type on its own.

I can confirm, after playing around with a card from the same brand (Gigabyte), that this proved right: in general, with a bit of tweaking, the difference could be reduced to around 30W on average at settings where the two cards run as close to neck and neck as possible.

The first picture shows the default voltage curve in the case of the 3070 Aorus, and the second that of the 3070 Gaming.
You can see that the already poorly set curve is further botched by raising the voltage to the same levels 90 MHz lower than in the case of the 3070.
The second picture also shows what scarce data I can share, taken with a 9900K at 5.1 GHz on all cores.

Life is Strange: True Colors is a fairly poorly optimized game; Trackmania, on the other hand, runs quite well on newer-gen hardware. The power draw is 22W higher in the former and 31W higher in the latter.
I did not want to make more of these pictures, but in the other games I've tested so far, the differences are similar (on some occasions around the upper 30s).

The poor default curve kinda makes me question whether it's a deliberate act by Nvidia. Would they really be this negligent?
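A rough way to see why trimming the voltage/frequency curve pays off: dynamic power in CMOS logic scales approximately with f·V², so even a modest undervolt at the same clock cuts a noticeable share of the watts. A toy estimate (the operating points below are hypothetical, not measured from these cards):

```python
# Toy model: dynamic power ~ k * f * V^2 (classic CMOS approximation).
def relative_power(freq_mhz: float, volts: float,
                   base_freq_mhz: float, base_volts: float) -> float:
    """Power relative to a baseline operating point, per P ∝ f·V²."""
    return (freq_mhz / base_freq_mhz) * (volts / base_volts) ** 2

# Hypothetical points: stock 1905 MHz @ 1.05 V vs. undervolted 1905 MHz @ 0.95 V.
saving = 1 - relative_power(1905, 0.95, 1905, 1.05)
print(f"~{saving:.0%} less dynamic power at the same clock")   # ~18%
```

This ignores static leakage and the memory's fixed draw, which is why real-world savings on the whole card are smaller than the dynamic-power fraction alone would suggest.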


----------



## Dr. Dro (Oct 9, 2021)

AteXIleR said:


> I replaced my 3070 with a 3070 Ti based on current market prices, and on the assumption that badly set voltage curves are the cause of the serious increase in power consumption, and that this can be fixed. I looked through multiple reviews of several models, and in all of them the default voltages seemed conspicuously high; it was also suspicious to me that the power draw couldn't be explained by the improved memory type on its own.
> 
> I can confirm, after playing around with a card from the same brand (Gigabyte), that this proved right: in general, with a bit of tweaking, the difference could be reduced to around 30W on average at settings where the two cards run as close to neck and neck as possible.
> 
> ...



This is some pretty good data, thanks for sharing... and yeah, the curve is less than optimal throughout. The sweet spot for my 3090 is 0.83 V, and even at such a low voltage it still power throttles... What you're observing there is the pressure on the power budget from the use of G6X, which makes the curve even less efficient...

I personally wouldn't mind if NVIDIA weren't locking down every single card from all AIBs with a signed VBIOS; I could fix that myself... 450 W power limit, static clocks, and I'd be good to go... but I guess you have the hard evidence to back my point up already. Cheers


----------

