
ASUS GeForce RTX 4070 Ti TUF

Not exactly, they're already in trouble. The cards won't fly off the shelves. They have to make it right: 16 GB or see ya in 2024. Who wants those crappy memory config choices at above $750? Pay $900 with taxes only to get 20% bottlenecked at 4K. Well, it's not a 4K card, but still.
 
Three things:
1. 12 GB is not "already borderline"; it's plenty for 4K. Worst case, you'll find a few titles that go overboard in 3-5 years, and you can work around that by fiddling with settings with a minimal loss in IQ.
2. Looking at benchmarks, the 192-bit bus is too narrow. The card is easily faster than the 3090 Ti at FHD and QHD, but it already runs out of steam at 4K. It's still in the same league as the 3090 Ti, so it will do its job, but with a 256-bit bus it would have been significantly faster.
3. I'm pretty sure when you say "FSR is good enough for me", you're forgetting that DLSS 3 comes with frame generation, which will boost your FPS where FSR can't follow.

Long story short, I would consider the 4070 Ti without worrying about its longevity. On the other hand, I'm damn certain I'm not paying $800 for a video card.

1. Well, it is borderline. I am hitting a wall with most games at 8 GB at 3440×1440 (Shadow of the Tomb Raider hits 7 GB at 2560×1440 already). 4K is going to push that past 10 GB easily. So in two years' time, do you think 12 GB will be enough? (And no, reducing details is not an option; if I'm paying this much money for a card, it better work.)
2. My point exactly, I am worried, again about the future.
3. I did not forget. Until frame generation also improves frametimes (playing at 100 FPS but with 50 FPS input lag is not 100 FPS), I care not for it.
 
What can I say? If you won't fiddle with settings and you're one of those "all or nothing" people (nothing wrong with that, just my choice), you're going to have to cough up more $$$ for that. Servus ;)
 
How much of an issue would the 192-bit bus and only 12 GB of VRAM be in the future? I cannot afford a 4080 or 7900 XTX; the most I can go is the 7900 XT. That one has a 320-bit bus and a whopping 20 GB of VRAM. The 4070 Ti is cheaper, and thus the more attractive option. I don't care for RT, and FSR is good enough for me, so those are not a factor in my choice. But with only 12 GB of VRAM (which is already borderline at 4K) and only 192 bits, I fear that the 4070 Ti is a great performer today but, as newer, more advanced games release, won't have the lasting power of the 7900. (I remember the old days, when the rule of thumb was: avoid the 128-bit cards, always go for the 256-bit ones.)
The 12 GB of VRAM wouldn't be an issue (not in realistic workloads, anyway), as rasterization workloads will be bottlenecked by memory bandwidth or computational power first. (RT workloads tend to be bottlenecked by computational power before anything else.)

The one thing that nearly no one gets is that, given a fixed memory bandwidth, the amount of VRAM which can be used within a given timeframe is also fixed, regardless of the game, algorithm or API. So if your GPU has 504 GB/s and your desired frame rate is 120 FPS, then the maximum theoretical amount of VRAM touched in a single frame is 504 / 120 = 4.2 GB. But this assumes the GPU accesses each piece of memory only once (which it doesn't) and does so with 100% efficiency, so the real figure is probably less than half of this. The next logical deduction: if you mod a game with a giant texture pack with 16x the normal texture size, the frame rate will fall sharply simply because the memory bus can't deliver, long before you've fully utilized the VRAM. You'll quickly fall below 30 FPS because of the memory bottleneck.
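As a rough sketch of that per-frame "touch budget" arithmetic (the 504 GB/s figure is the 4070 Ti's bandwidth from above; the FPS targets and the 50% efficiency factor are just assumptions for illustration):

```python
# Rough per-frame VRAM "touch budget" implied by memory bandwidth.
# 504 GB/s = 192-bit bus with 21 Gbps GDDR6X (RTX 4070 Ti).
# The 0.5 efficiency factor is an assumption, not a measured value.

def per_frame_budget_gb(bandwidth_gb_per_s: float, fps: float, efficiency: float = 1.0) -> float:
    """Upper bound on unique data the GPU can read per frame, in GB."""
    return bandwidth_gb_per_s / fps * efficiency

BANDWIDTH = 504.0  # GB/s

for fps in (30, 60, 120):
    ideal = per_frame_budget_gb(BANDWIDTH, fps)           # every byte read once, 100% efficient
    realistic = per_frame_budget_gb(BANDWIDTH, fps, 0.5)  # assume re-reads/overhead halve it
    print(f"{fps:>3} FPS: ideal {ideal:.1f} GB/frame, realistic ~{realistic:.1f} GB/frame")
```

Even the idealized numbers sit far below 12 GB per frame, which is why bandwidth or compute gives out long before the VRAM pool does.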

So the next big question: what about the RX 7900 XT with its tempting 20 GB and impressive 800 GB/s bandwidth? Well, if this card were better balanced, it would completely outclass the RTX 4070 Ti even today, but it doesn't, because it's limited on the computational side*. Not only in pure TFLOPS; there can also be GPU scheduling and numerous tiny architectural details which "hold back" performance, and this is always the case with any GPU design. So for this reason, base your choice on a wide set of benchmarks (like this review), and the conclusions drawn from them are likely to hold true for the useful lifespan of the products; both two and even five years from now, the relative performance between the products is likely to remain the same. And this has held true in the past: looking back at Polaris, Vega and Navi, they didn't stand the test of time any better than their green counterparts.
So pick the card that fits your situation best now, and it's likely to be the best choice in the long term.

*)
A texture mod pack would probably be the exception here, as these workloads are unbalanced, and this card might get an edge there.

1. Well, it is borderline. I am hitting a wall with most games at 8 GB at 3440×1440 (Shadow of the Tomb Raider hits 7 GB at 2560×1440 already). 4K is going to push that past 10 GB easily. So in two years' time, do you think 12 GB will be enough?
Don't forget, memory allocated isn't the same as memory used. GPUs compress buffers and some textures heavily.

Secondly, to reiterate my point above: if your future games in two years are going to allocate more, then they will also demand more bandwidth and likely more computational power too, so inevitably you are going to sacrifice FPS or detail levels, and in either case you are not likely to run out of VRAM first.
 
A product doesn't become garbage because of the price.

We should live in the real world, not in a fantasy land.
In reality, this product is better priced than the RX 7900 XT, which is its closest competitor. Any comparison is meaningless without context. If we want lower prices, then AMD needs to be able to launch their cards with 100k+ units available; only then will we see prices drop further.

In reality, we are also in the middle of "hyperinflation", causing ~25% inflation over the past two years for most of the Western world (with even higher true inflation in the US), compared to the usual ~2% per year for the past decades.
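As a quick back-of-the-envelope comparison (the $699 base price is just an illustrative MSRP, and the rates are the rough figures mentioned above):

```python
# Compare two years of "normal" ~2%/yr inflation against ~25% cumulative
# inflation. The $699 base price is only an illustrative MSRP.

base_price = 699.0

normal = base_price * 1.02 ** 2   # ~2% per year, two years
elevated = base_price * 1.25      # ~25% cumulative over two years

print(f"normal inflation:   ${normal:.0f}")    # ~$727
print(f"elevated inflation: ${elevated:.0f}")  # ~$874
```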
Well said. The launch conditions of the 3000 series are ancient history today.

Something else people need to consider is cost. The cost of a single 7nm wafer from TSMC is $10k, according to Tom's Hardware; 5nm is $16,000. Nvidia was using Samsung 8nm last gen, allegedly because TSMC was too expensive, so we can assume Nvidia was paying under $10k per wafer. That means with Ada, Nvidia could be paying upwards of twice as much per wafer as they were two years ago, meaning that a 3060-sized GPU could cost more to make than a 3070-sized GPU did last gen. They are also using GDDR6X, which costs notably more than the GDDR6 used on the 3060. Then you add on the much higher cost of fuel, labor, etc.

So expecting the 4070 Ti to launch at $450 like the 3060 is, IMO, wishful thinking; Nvidia could very well be losing money. Not that this justifies the $800 price. IMO $650 would have been far more reasonable, and at that price this card would be a winner. $700 would have been acceptable: the same price as an MSRP launch 3080, with 50% more VRAM, 15-20% more performance and much better efficiency.
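To put rough numbers on the wafer math, here's a minimal sketch. The die areas are approximate public figures (GA106 ≈ 276 mm², GA104 ≈ 392 mm², AD104 ≈ 295 mm²), the $16k wafer price is the Tom's Hardware figure quoted above, the ~$9k Samsung 8nm price is purely an assumption, and yield is ignored:

```python
import math

# Rough cost-per-die estimate. Gross dies only; yield is ignored.
WAFER_DIAMETER_MM = 300.0

def dies_per_wafer(die_area_mm2: float, diameter: float = WAFER_DIAMETER_MM) -> int:
    """Classic gross dies-per-wafer approximation."""
    r = diameter / 2
    return int(math.pi * r * r / die_area_mm2 - math.pi * diameter / math.sqrt(2 * die_area_mm2))

chips = {
    "GA106 (3060, Samsung 8nm)": (276.0, 9_000),   # wafer price is an assumption (< $10k)
    "GA104 (3070, Samsung 8nm)": (392.0, 9_000),   # wafer price is an assumption (< $10k)
    "AD104 (4070 Ti, TSMC 4N)":  (295.0, 16_000),  # Tom's Hardware 5nm-class figure
}

for name, (area, wafer_cost) in chips.items():
    n = dies_per_wafer(area)
    print(f"{name}: ~{n} dies/wafer, ~${wafer_cost / n:.0f} per die")
```

Under these assumptions, the mid-size Ada die comes out more expensive per die than the bigger GA104 did last gen, which is the point being made above.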
 
$650 and Nvidia is still making money. Hopefully people will speak with their wallets and these cards will sit on shelves (for the most part) until Nvidia drops the price.

As has been said before, this whole generation from both sides is overpriced. Hopefully when old stock dwindles prices will come down.
 
$650 and Nvidia is still making money.
Well yes, that is generally how economies work. I wouldn't suggest selling GPUs at a loss; AMD did that for several years and look how well it worked for them...
Hopefully people will speak with their wallets and these cards will sit on shelves (for the most part) until Nvidia drops the price.

As has been said before, this whole generation from both sides is overpriced. Hopefully when old stock dwindles prices will come down.
They're overpriced, yes, but short of a massive global financial meltdown you're not going to see 2019 prices again. There's been far too much hyperinflation for that to happen. Despite all the gouging and price raising, Nvidia's net margin for last year was only a few percentage points higher than it was the year before.
 
Hi,
And both were affected by miner sales.
 
This card is actually a considerably better value than the RX 7900 XT. Now, I realise that this is a VERY low bar we're setting, but... there is a silver lining to all of this. A lot of people mindlessly reach for nVidia if the price is the same (which is usually a mistake, but whatever), let alone when the price is lower (performance be damned in most cases), even though nVidia cards are much worse values than their Radeon counterparts 99.99% of the time.
I'm replying to myself because I can't edit the post anymore. I was wrong about it being a better value than the RX 7900 XT, because I've since watched Gamers Nexus, Paul's Hardware, BPS Customs, LTT and JayzTwoCents. I'm afraid that something is wrong with Steve Walton, because he actually gave it a positive review. Ever since nVidia tried blacklisting Hardware Unboxed, their testing has started to lean sharply into the green. Steve Burke, on the other hand, missed no opportunity to ridicule nVidia for the crappy RTX 4070 Ti and for its marketing (which managed to be WORSE than AMD's).
When even LTT and JayzTwoCents are trash-talking an nVidia card, you know that it's terrible.
 
What is this new RT overdrive mode actually, and can we run it on the public release?
 
No idea, there's no description for overdrive on the official site.
 
Hi,
It's just a logo that is added to all reviews.
It has to be a damn bad product to not get it. It's saying it's a good GPU and/or a good 4070 Ti; it's all about what it's being compared to


This GPU goes for $1k AUD less than I paid for my 3090, so I can't fault the pricing too much
(Just the lack of VRAM/bus width, and whether that'll be a future issue)
 
Please link to your review, I wanna see what a real reviewer looks like; else you are the coward :(
 
Just a shame Techpowerup doesn't have the integrity or guts to say the same.

cowards.
The reviewing work done here is unmatched; e.g., the ray tracing section is very interesting and shows that nVidia's RT cores are useless and barely improved gen over gen.

As for the conclusions, as far as I'm concerned they are meaningless; they provide the data for you to make up your own mind. Plus, it's too US-biased; e.g. the 4070 Ti's launch price is 50%+ higher than the 3080's, and any conclusion that doesn't take this into account is wrong.

I wouldn't say they lack integrity, but rather that they are wrong, as the data clearly shows that the 4070 Ti is a bad product. Others with no integrity, like DF, will actively mislead viewers and openly do marketing for nVidia in non-sponsored videos.
 
Just a shame Techpowerup doesn't have the integrity or guts to say the same.

cowards.
Or maybe the YouTubers are doing their clickbait thing while TPU sticks to the facts, with less focus on the USD MSRP pricing, because that pricing is worthless in the long term?
 
YT influencers are resorting to antics and slander for clickbait, telling viewers what they want to hear, not what they need, or something like that. So much noise over the 4070 Ti and 4080 being just $300 overpriced. If a buyer like me can't afford $700, what difference does it make? $700 or $900 wouldn't matter to me; I can't afford either. So Nvidia is going to keep releasing new products at lower prices until we meet.
 
It literally matters because there is a $200 difference… lol

I would be willing to pay $999 for a 4080, but at $1,199 it is a hard pass. See how that works? I can afford both, but there is a point where too much is too much. $799 for the 4070 Ti is too much.
 
It has to be a damn bad product to not get it. It's saying it's a good GPU and/or a good 4070 Ti; it's all about what it's being compared to


This GPU goes for $1k AUD less than I paid for my 3090, so I can't fault the pricing too much
(Just the lack of VRAM/bus width, and whether that'll be a future issue)
Hi,
Yep, I've seen a laundry list of thumbs-down, way more than thumbs-up, and the review still gets the Editor's Choice logo, so I just don't give that part of the conclusion much notice anymore.
The context is better to pay attention to.

As far as pricing in the long run, well, in the past prices have always gone down after approximately 6-12 months.
That seems like a thing of the past since miners screwed the system up completely.
I look forward to this hardware's price going down, but I will not hold my breath, so ATM I'm out of the new hardware market, and I'm never interested in the used market.
 
Or maybe the YouTubers are doing their clickbait thing while TPU sticks to the facts, with less focus on the USD MSRP pricing, because that pricing is worthless in the long term?
Problem with the facts is that the 4070 Ti is 50% more expensive than the 3080; where was this mentioned? They still stick to US prices in their conclusion. They should instead just tie the conclusion to a price: at $xxx it's good, beyond $xxx it's stupid.

But from what I've seen, people are smart; none of the new cards are selling, and they are all available at their suggested prices.
 
Not exactly, they're already in trouble. The cards won't fly off the shelves. They have to make it right: 16 GB or see ya in 2024. Who wants those crappy memory config choices at above $750? Pay $900 with taxes only to get 20% bottlenecked at 4K. Well, it's not a 4K card, but still.
Well, it's around the RTX 3080 Ti in performance, and that was referred to as a "4K card".
1. Well, it is borderline. I am hitting a wall with most games at 8 GB at 3440×1440 (Shadow of the Tomb Raider hits 7 GB at 2560×1440 already). 4K is going to push that past 10 GB easily. So in two years' time, do you think 12 GB will be enough? (And no, reducing details is not an option; if I'm paying this much money for a card, it better work.)
This is a big part of the reason that I chose the RX 6800 XT over the RTX 3080. That 6 GB VRAM difference is HUGE, and 10 GB was not enough VRAM for a GPU with that level of potency. Down the road, it would end the card's usefulness far too early.
2. My point exactly, I am worried, again about the future.
Most intelligent people are because when you're not worried about the future, it has a tendency to sneak up on you and bite you in the ass.
3. I did not forget. Until frame generation also improves frametimes (playing at 100 FPS but with 50 FPS input lag is not 100 FPS), I care not for it.
I really don't think that will ever happen because you can't reduce latency with fake frames.
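As a toy model of why interpolated frames don't help input lag (all the numbers below are illustrative assumptions, not DLSS 3 measurements):

```python
# Toy latency model for frame interpolation. Assumes a 50 FPS base render
# rate, one generated frame between each pair of real frames, and that the
# newest real frame is held until the next one arrives before interpolating.

base_fps = 50.0
base_frame_ms = 1000.0 / base_fps   # 20 ms between real frames

displayed_fps = base_fps * 2        # 100 FPS shown on screen
input_rate_hz = base_fps            # input is still consumed once per real frame
interp_hold_ms = base_frame_ms      # assumption: up to one real frame held for interpolation

print(f"displayed frame rate : {displayed_fps:.0f} FPS")
print(f"input sampled at     : {input_rate_hz:.0f} Hz ({base_frame_ms:.0f} ms apart)")
print(f"extra interp. delay  : up to {interp_hold_ms:.0f} ms")
```

The screen shows twice as many frames, but inputs are still consumed at the base render rate, plus whatever delay comes from holding frames for interpolation.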
It has to be a damn bad product to not get it. It's saying it's a good GPU and/or a good 4070 Ti; it's all about what it's being compared to
The word "good" is extremely vague. One could say that a GPU is "good" as long as it functions. The fact is that "good" is a blend of performance, price, longevity and all the other features associated with GPUs. Everyone wants a different balance because they have different priorities so what is "good" for one person is "bad" for another. That's why I define "good" as "not defective" because that's a universal priority.
This GPU goes for $1k AUD less than I paid for my 3090, so I can't fault the pricing too much
(Just the lack of VRAM/bus width, and whether that'll be a future issue)
Actually, you can fault the pricing, because it gets thoroughly crushed by the RX 7900 XTX in GPU speed, with or without RT turned on, despite being less than $200 cheaper. You can also fault the pricing because it was a mistake to pay what you did for your RTX 3090, just like it was a mistake for me to pay what I did for my 6800 XT. Just because we screwed up doesn't mean that we have to do it again; we just have to admit that we buggered things up for ourselves.
It literally matters because there is a $200 difference… lol
Absolutely it does. Every dollar counts no matter what Jensen says.
I would be willing to pay $999 for a 4080, but at $1,199 it is a hard pass. See how that works? I can afford both, but there is a point where too much is too much. $799 for the 4070 Ti is too much.
Only if you're talking $999 AUD, because if you're talking $1,000 USD, then you're talking around $1,500 AUD, which is just insane. Unless you're using this for a professional workload, there's no reason to spend that kind of money on a VIDEO CARD. I remember when I could get a good gaming card for under $500 CAD. It wasn't that long ago either, because I paid $490 CAD for my XFX RX 5700 XT.

Having used both GeForce and Radeon cards in the past, there's no way that I'd pay $200 more for an RTX 4080 over the RX 7900 XTX because I know that the gaming experience is the same regardless of whether the box is red or green. It's like trying to tell if the GPU in your phone is Adreno or Imagination. Yeah, good luck with that. :roll:
 
Problem with the facts is that the 4070 Ti is 50% more expensive than the 3080; where was this mentioned? They still stick to US prices in their conclusion. They should instead just tie the conclusion to a price: at $xxx it's good, beyond $xxx it's stupid.

But from what I've seen, people are smart; none of the new cards are selling, and they are all available at their suggested prices.
So the reviews are supposed to include VAT? Not all countries have a VAT. I live in the US and I pay no sales tax.
 
This is a big part of the reason that I chose the RX 6800 XT over the RTX 3080. That 6 GB VRAM difference is HUGE, and 10 GB was not enough VRAM for a GPU with that level of potency. Down the road, it would end the card's usefulness far too early.
The RTX 4070 Ti with its "measly" 12 GB has already put cards with much more VRAM to shame, like the RX 6800/6900 XT with 16 GB or the RTX 3090 with 24 GB. None of these will all of a sudden start to scale better in the future.
As I've said before, I don't care how much VRAM people "feel" a card needs; the facts matter, which is evident in the benchmarks.

So the reviews are supposed to include VAT? Not all countries have a VAT. I live in the US and I pay no sales tax.
The only measure that makes sense across countries and over time is US MSRP; the other countries' local MSRPs are pretty much proportional to it, with their respective VAT and import duties on top. Prices in stores vary over time, so using those would skew the conclusions of the reviews.
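For illustration, that proportionality is trivial to sketch; the VAT rate and exchange rate below are made-up example values:

```python
# Convert a US MSRP into a rough local shelf price. The VAT rate and the
# exchange rate are made-up example values, not real market data.

US_MSRP = 799.0   # RTX 4070 Ti US MSRP, excluding sales tax
VAT = 0.25        # example VAT rate
FX = 10.5         # example exchange rate (local currency units per USD)

local_price = US_MSRP * FX * (1 + VAT)
print(f"~{local_price:,.0f} in local currency")  # scales linearly with the US MSRP
```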
 