Tuesday, January 3rd 2023

NVIDIA GeForce RTX 4070 Ti Launched at $799 with Performance Matching RTX 3090 Ti

NVIDIA today formally launched the GeForce RTX 4070 Ti "Ada" performance-segment graphics card at a starting MSRP of USD $799. Based on the 4 nm "AD104" silicon, the RTX 4070 Ti is essentially the same product as the RTX 4080 12 GB, which NVIDIA cancelled ahead of its original mid-November launch and has now relaunched at CES under a new model name. The card maxes out the silicon it's based on, featuring 7,680 CUDA cores, 60 RT cores, 240 Tensor cores, 240 TMUs, and 80 ROPs. It gets 12 GB of GDDR6X memory across a 192-bit wide memory interface, running at 21 Gbps (GDDR6X-effective). The card has a typical power rating of 285 W, and continues to use a 12VHPWR power connector, even on custom-design products.
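
For reference, the quoted memory configuration works out to roughly 504 GB/s of peak bandwidth. A minimal sketch of the standard calculation (bus width in bits divided by 8, times the data rate in Gbps), using only the figures above:

```python
# Peak memory bandwidth = (bus width in bits / 8 bytes) * data rate in Gbps.
# Figures below are from the article: 192-bit bus, 21 Gbps GDDR6X.
bus_width_bits = 192
data_rate_gbps = 21

bandwidth_gbps = bus_width_bits / 8 * data_rate_gbps
print(f"Peak bandwidth: {bandwidth_gbps:.0f} GB/s")  # ~504 GB/s
```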

NVIDIA claims that the RTX 4070 Ti should enable maxed-out AAA gaming with ray tracing at 1440p, while also being formidable at 4K Ultra HD in games that can take advantage of technologies such as DLSS 3 frame generation, or even classic DLSS 2. The company claims that it offers performance comparable to the previous-generation flagship, the GeForce RTX 3090 Ti "Ampere," at a much higher performance-per-Watt rating. The RTX 4070 Ti doesn't appear to get an NVIDIA Founders Edition model; this is a partner-driven launch, with custom-design cards dominating the scene. The RTX 4070 Ti will be available from January 5, 2023, but we'll have reviews for you before then!

150 Comments on NVIDIA GeForce RTX 4070 Ti Launched at $799 with Performance Matching RTX 3090 Ti

#76
Chrispy_
Quoting another poster: "Crutches that are better IQ than TSAA?"
That's subjective rather than objective, far from a unanimous opinion, and I for one hate DLSS because it's a smeary, incoherent blur in motion on a fast enough display, like an OLED or a good high-refresh-rate screen.

In some games, blur doesn't matter, but I have always turned off motion blur no matter what. I loathe it in games.
Some games implement DLSS better than others, and a good DLSS implementation can be good enough that I will actually tolerate it for the performance gain.

There will always be artifacts, and it's always easy for me to spot the low-resolution jaggies of the actual render resolution; you can't see them easily in static side-by-side comparisons, and you can't really see them in YouTube videos because of the compression.

Ray tracing makes the artifacts even more obvious, because ray tracing generates a noisy image, and the noise gets dramatically larger as the sampling resolution drops. You only see the noise in motion, because the denoiser can hide the worst of it within 3-4 frames; only a moving image constantly generates brand-new reflection/shadow/ambient information for which the denoiser has no prior frames.
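
(To illustrate the accumulation effect described above, here's a toy sketch of running-mean temporal accumulation, the basic building block behind real-time RT denoisers. The signal value, noise amplitude, and reset frame are illustrative assumptions, not any specific denoiser's parameters.)

```python
import random

# Toy temporal accumulation: average each new (noisy) sample into a running
# history. Averaging n frames cuts the noise's standard deviation by sqrt(n),
# so the worst of it fades within a few frames -- unless motion invalidates
# the history and the count resets, which is why the noise shows up in motion.
TRUE_VALUE = 1.0   # the signal the denoiser converges toward (illustrative)
NOISE = 0.5        # per-frame noise amplitude (illustrative assumption)

history, n = 0.0, 0
for frame in range(8):
    sample = TRUE_VALUE + random.uniform(-NOISE, NOISE)
    if frame == 4:
        n = 0  # camera moved: history rejected, accumulation restarts
    n += 1
    history += (sample - history) / n  # running mean of samples since reset
    print(f"frame {frame}: error = {abs(history - TRUE_VALUE):.3f}")
```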

One thing we can all agree on is that when DLSS 'Quality' is used at 4K, the game is being rendered at 1440p; that's an objective fact. IMO that makes it a 1440p card, because you can upscale or downscale that render to literally any arbitrary resolution you want, using any technology you want. 4K, 5K, 8K? Who cares? It's still a 1440p render, and if you want DLSS's temporal AA, use DLAA.
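
(The 4K 'Quality' figure is easy to verify: DLSS Quality renders at 2/3 of the output resolution per axis. A minimal sketch using NVIDIA's published per-axis scale factors:)

```python
# Per-axis render scale for each DLSS mode (NVIDIA's published factors).
DLSS_SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal resolution DLSS actually renders before upscaling."""
    s = DLSS_SCALES[mode]
    return round(out_w * s), round(out_h * s)

print(render_resolution(3840, 2160, "Quality"))  # (2560, 1440) -- a 1440p render
```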
Posted on Reply
#77
robb
It's going to be more of a 3080 Ti than a 3090 Ti.
Posted on Reply
#78
80-watt Hamster
HaKN ! said: "Sorry, English is not my native language"
"More money than brains," is also common.
Posted on Reply
#79
Garrus
You have to laugh so hard at these new chips.

For the first time ever, we get TWO node improvements at once: past 7 nm, straight to 5 nm. Massive transistor count increases. Free clock speed increases. Just shrink the RTX 3080 to 5 nm, make it $600, give it 3 GHz. Jesus Christ, these new products suck so bad.

RTX 2080 to 3080: a 60 percent improvement in perf/dollar.
3070 Ti to 4070 Ti: a -5 percent "improvement" (at actual pricing; there is no $800 4070 Ti - in Canada most of them cost more than the 7900 XTX, slightly over $1,000 USD).

Unbelievable. Two node improvements, and zero or negative perf/dollar improvement depending on your country.

Total garbage.
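
(For what it's worth, the perf/dollar math above is easy to reproduce. In the sketch below, the performance ratios and prices are assumptions standing in for the poster's rough figures; the exact result shifts with the assumed ratio and local price.)

```python
def perf_per_dollar_change(perf_ratio: float, old_price: float, new_price: float) -> float:
    """Percent change in performance per dollar between two cards."""
    return (perf_ratio / (new_price / old_price) - 1) * 100

# Assumed scenario: RTX 2080 -> RTX 3080 at the same $699 MSRP with ~1.6x
# the performance is a +60% perf/dollar jump...
print(f"{perf_per_dollar_change(1.6, 699, 699):+.0f}%")   # +60%
# ...while 3070 Ti ($599) -> 4070 Ti at an assumed ~$1000 street price with
# ~1.5x the performance nets out negative, in line with the poster's point.
print(f"{perf_per_dollar_change(1.5, 599, 1000):+.0f}%")  # about -10%
```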
Posted on Reply
#80
ARF
RedelZaVedno said:
Quick math using a die-per-wafer estimator calc and known numbers:
2080 Ti, 754 mm² die size -> 12 nm, 300 mm wafer at $4,000 (yield of 80%) = 70 dies, 56 of them fully functional = $71 per die -> this die probably cost between $200 and $300 to produce at the start of N12 production
4080, 379 mm² die size -> 5 nm at $17,000 (yield of 90%+) = 147 dies, 132 of them fully functional = $128 per die

As you can see, new dies are more expensive to produce, but not groundbreakingly more expensive, and prices will only get better once TSMC moves Apple's products to a smaller node.
You got everything wrong. The $4,000 wafer price is the 2020 price, not the 2018 price.
And the $17,000 is also a 2020 price, not a 2023 price :D

Don't troll - get your facts correct. You have to normalise for the launch year of the corresponding product: RTX 2080 Ti - 2018, RTX 4080 - 2023.
Posted on Reply
#82
efikkan
So, Nvidia launched a card that's slightly cheaper than the RX 7900 XT and presumably performs a little lower. What makes Nvidia evil this time?
Is it like in the Fury X days, when Nvidia was evil for making AMD look bad?
Posted on Reply
#83
ARF
RedelZaVedno said:
Quick math using a die-per-wafer estimator calc and known numbers:
2080 Ti, 754 mm² die size -> 12 nm, 300 mm wafer at $4,000 (yield of 80%) = 70 dies, 56 of them fully functional = $71 per die -> this die probably cost between $200 and $300 to produce at the start of N12 production
4080, 379 mm² die size -> 5 nm at $17,000 (yield of 90%+) = 147 dies, 132 of them fully functional = $128 per die

As you can see, new dies are more expensive to produce, but not groundbreakingly more expensive, and prices will only get better once TSMC moves Apple's products to a smaller node.
Fixed your wrong maths :D

RTX 2080 Ti (2018), 754 mm² die size -> 12 nm, 300 mm wafer at $15,000 (yield of 60%) = 70 dies, 42 of them fully functional = $357 per die
RTX 4080 (2023), 379 mm² die size -> 5 nm, 300 mm wafer at $15,000 (yield of 80%) = 147 dies, 117 of them fully functional = $128 per die
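
(Both posters' die counts can be reproduced with the standard dies-per-wafer approximation, sketched below; the wafer prices and yields remain the posters' assumptions, and estimator tools differ slightly on edge exclusion.)

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Standard approximation: gross wafer area minus edge loss along the perimeter."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(die_area_mm2: float, wafer_price: float, yield_rate: float) -> float:
    return wafer_price / (dies_per_wafer(die_area_mm2) * yield_rate)

# TU102 (RTX 2080 Ti), 754 mm^2: ~69 candidate dies per 300 mm wafer,
# matching the ~70 quoted above. Prices and yields are the posters' assumptions.
print(dies_per_wafer(754))                            # ~69
print(f"${cost_per_good_die(754, 15000, 0.60):.0f}")  # ~$362 per good die (~$357 above)
# AD103 (RTX 4080), 379 mm^2: ~152 candidates vs the ~147 quoted.
print(dies_per_wafer(379))                            # ~152
print(f"${cost_per_good_die(379, 15000, 0.80):.0f}")  # ~$123 per good die (~$128 above)
```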
Posted on Reply
#84
heflys20
efikkan said:
So, Nvidia launched a card that's slightly cheaper than the RX 7900 XT and presumably performs a little lower. What makes Nvidia evil this time?
Is it like in the Fury X days, when Nvidia was evil for making AMD look bad?
That price is gonna be $900 after a while.
Posted on Reply
#85
ARF
efikkan said:
So, Nvidia launched a card that's slightly cheaper than the RX 7900 XT and presumably performs a little lower. What makes Nvidia evil this time?
Is it like in the Fury X days, when Nvidia was evil for making AMD look bad?
1. 12 GB VRAM
2. Users' performance expectations for this price tier were somewhat higher, so the product is overpriced and underperforms. AMD doesn't matter.
Posted on Reply
#86
tvshacker
Lei said:
950$ in Europe:
Cartes graphiques GeForce RTX 4070 Ti | NVIDIA
Each country has (wildly) different prices. In Portugal, at the moment, the cheapest prices (without shipping) for each model are:
  • 3070 Ti: 691,14€
  • 3080: 839,9€ (the 2nd cheapest is 1092,92€)
  • 3080 Ti: 1559,9€
  • 3090: 1999,9€
  • 3090 Ti: 1939,98€ (not a typo, the cheapest 3090 Ti is actually cheaper than the cheapest 3090)
  • 4080: 1405,99€
Let's wait for the reviews, but around these parts the perf/€ should be nearly double that of the past gen.
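
(Taking NVIDIA's 3090 Ti-class performance claim at face value, the rough perf/€ math with the local prices above looks like the sketch below; the ~950€ 4070 Ti street price is an assumption based on the figure quoted earlier in the thread.)

```python
# If the 4070 Ti really matches the 3090 Ti (NVIDIA's claim, pending reviews),
# perf/euro scales inversely with price at equal performance.
price_3090ti_eur = 1939.98  # cheapest local 3090 Ti listing quoted above
price_4070ti_eur = 950.0    # assumed street price, from the figure quoted earlier

print(f"perf/euro ratio: {price_3090ti_eur / price_4070ti_eur:.2f}x")  # ~2.04x
```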
Posted on Reply
#87
ARF
tvshacker said:
Each country has (wildly) different prices. In Portugal, at the moment, the cheapest prices (without shipping) for each model are:
  • 3070 Ti: 691,14€
  • 3080: 839,9€ (the 2nd cheapest is 1092,92€)
  • 3080 Ti: 1559,9€
  • 3090: 1999,9€
  • 3090 Ti: 1939,98€ (not a typo, the cheapest 3090 Ti is actually cheaper than the cheapest 3090)
  • 4080: 1405,99€
Let's wait for the reviews, but around these parts the perf/€ should be nearly double that of the past gen.
AMD's pricing is better (prices in €):

Radeon RX 6400 - 140.90
Radeon RX 6500 XT - 181.90
Radeon RX 6600 - 269.00
Radeon RX 6600 XT - 387.16
Radeon RX 6650 XT - 319.00
Radeon RX 6700 XT - 399.00
Radeon RX 6750 XT - 455.00
Radeon RX 6800 - 569.00
Radeon RX 6800 XT - 659.00
Radeon RX 6900 XT - 846.46
Radeon RX 6950 XT - 879.00
Radeon RX 7900 XT - 999.00
Radeon RX 7900 XTX - 1389.00
Posted on Reply
#88
tvshacker
ARF said:
AMD's pricing is better (prices in €):

Radeon RX 6400 - 140.90
Radeon RX 6500 XT - 181.90
Radeon RX 6600 - 269.00
Radeon RX 6600 XT - 387.16
Radeon RX 6650 XT - 319.00
Radeon RX 6700 XT - 399.00
Radeon RX 6750 XT - 455.00
Radeon RX 6800 - 569.00
Radeon RX 6800 XT - 659.00
Radeon RX 6900 XT - 846.46
Radeon RX 6950 XT - 879.00
Radeon RX 7900 XT - 999.00
Radeon RX 7900 XTX - 1389.00
There were very nice discounts a little while ago on a lot of models - for example, ~340€ for a 6700 (non-XT). But right now prices have gone up quite a bit across the board.
Posted on Reply
#89
heflys20
ARF said:
AMD's pricing is better (prices in €):

Radeon RX 6400 - 140.90
Radeon RX 6500 XT - 181.90
Radeon RX 6600 - 269.00
Radeon RX 6600 XT - 387.16
Radeon RX 6650 XT - 319.00
Radeon RX 6700 XT - 399.00
Radeon RX 6750 XT - 455.00
Radeon RX 6800 - 569.00
Radeon RX 6800 XT - 659.00
Radeon RX 6900 XT - 846.46
Radeon RX 6950 XT - 879.00
Radeon RX 7900 XT - 999.00
Radeon RX 7900 XTX - 1389.00
I knew I couldn't beat a 6800 XT for $500. Good luck getting one at that price now. Lol.
Posted on Reply
#90
ARF
heflys20 said: "I knew I couldn't beat a 6800 XT for $500. Good luck getting one at that price now. Lol."
I don't think I am getting one. I was expecting something like $400 at EOL... but it's not happening.
Now it's better to get something new-generation: better AV1 video support, an improved media engine, an updated architecture, longer driver support, etc.
Even if it's a 7700 or 7800.
Posted on Reply
#91
TheoneandonlyMrK
ARF said:
Fixed your wrong maths :D
RTX 2080 Ti (2018), 754 mm² die size -> 12 nm, 300 mm wafer at $15,000 (yield of 60%) = 70 dies, 42 of them fully functional = $357 per die
RTX 4080 (2023), 379 mm² die size -> 5 nm, 300 mm wafer at $15,000 (yield of 80%) = 147 dies, 117 of them fully functional = $128 per die
Whoa there - I'm certain the production cost is not the same between 12 nm in 2018 and 5 nm in 2023.

And failure rates are debatable, variable, and improve over time.
Posted on Reply
#92
Lei
efikkan said: "what makes Nvidia evil this time?"
They're both evil.

Posted on Reply
#93
kiriakost
Blessed are the ones keeping their distance from 4K gaming.
This is the only way to fight back against unreasonable pricing policies.

At the end of the day there is only one question: how affordable is gaming for us regular people?
Posted on Reply
#94
Lei
kiriakost said: "At the end of the day there is only one question: how affordable is gaming for us regular people?"
Instead of chewing on lollipops, we game and thus keep our teeth stronger.
A dental implant costs nearly as much as a 4090, and an all-porcelain dental crown restoration is minimum $500.

Posted on Reply
#95
Garrus
The actual wafer costs are not the only issue. The problem is nVidia's designs suck FOR GAMING. They are blowing their transistor budget on bad designs. It doesn't matter if wafers are double the price; you could still make a much better GPU for a lot less money. One GPU design for 5 different use cases. Bad idea.

GTX 1080 Ti: 12 billion transistors.
RTX 3080: 28 billion.
RTX 4090: 76 billion.

We are talking 7x the old transistor count here. We could be buying a GTX 1080 Ti at 3 GHz, with double-speed VRAM, in under 100 mm²: a $200-$300 product with the same speed as the RTX 4070 Ti in normal rasterized games, thanks to that doubled performance. A die half the size of the RTX 4050's. LOL. There's a lot of crap thrown into the die design to add "features" that I don't want.

As for the RTX 3080, we have 2.7x that density. We could be buying the RTX 3080 with a 30 percent speed boost, at under 250 mm², for $400. Faster than the 4070 Ti for raster, just as good for RT. $400. Same profit margins as the old 3080. We know the RTX 3080 design works; a shrunk design at higher frequencies will also work. This isn't "pie in the sky" thinking.

TSMC handed nVidia the victory. nVidia gave TSMC terrible designs for gaming. It's all Ethereum mining, application performance, RT this, DLSS that. I just want the 1080 Ti for $200, running at double the clock speed. So would everyone else.

They call it vision. I call it stupidity. AMD had a chance to go a different way, and they messed up also. This isn't an nVidia versus AMD thing. This is a "our video card companies suck and they are ruining gaming" thing. Intel had such an opportunity here to target gamers instead of the datacenter; they messed up also.

Even the best of the bunch, the RTX 4090, doesn't look so hot: 7x the transistors for 3x the performance, and that's while running at 2x the clock speed. Basically 14x the transistor×frequency budget for 3x the gaming FPS. Bad. Those transistors are not being used effectively for the games you play.
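
(The poster's transistor-budget arithmetic, reproduced as a sketch: the transistor counts are public figures, while the clock and FPS ratios are the poster's rough estimates, taken here as assumptions.)

```python
# "Transistor-frequency budget" per unit of gaming performance, using the
# poster's rough ratios (assumptions): the 4090 has ~7x the transistors of a
# 1080 Ti, runs ~2x the clock, and delivers ~3x the raster FPS.
transistor_ratio = 76 / 12   # ~6.3x; the poster rounds up to 7x
clock_ratio = 2.0            # poster's rough figure
fps_ratio = 3.0              # poster's rough figure

budget_ratio = transistor_ratio * clock_ratio  # ~12.7x the raw budget
efficiency = fps_ratio / budget_ratio          # FPS delivered per unit of budget
print(f"{budget_ratio:.1f}x the budget for {fps_ratio:.0f}x the FPS "
      f"-> {efficiency:.2f}x the per-transistor-hertz efficiency")
```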
Posted on Reply
#96
kiriakost
Lei said: "Instead of chewing on lollipops, we game and thus keep our teeth stronger. A dental implant costs nearly as much as a 4090, and an all-porcelain dental crown restoration is minimum $500."
So you are not going to be with us at CES, giving NVIDIA's pavilion and sales people a hard time, so they can feel the gamers' answer to their unreasonable plans.
Because at the end of the day, with forum topics like this, the consumers' message never reaches its destination.
Anonymous people's words cannot harm any brand.
Posted on Reply
#97
Garrus
I mean, this is so funny. Stick modern VRAM into an old GTX 1080 Ti design and you get DOUBLE the RAM bandwidth of the RTX 4070 Ti. That's how crap the 4070 Ti is. Terrible design. It's '60-class.
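
(The "double the bandwidth" claim roughly checks out. A sketch using the standard bandwidth formula; pairing the 1080 Ti's real 352-bit bus with 21 Gbps GDDR6X is the poster's hypothetical, not a real product:)

```python
def bandwidth_gbs(bus_bits: int, rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_bits / 8 * rate_gbps

# GTX 1080 Ti's 352-bit bus, hypothetically fitted with 21 Gbps GDDR6X
# (the poster's thought experiment), vs the 4070 Ti's real 192-bit config.
old_bus_new_ram = bandwidth_gbs(352, 21)        # 924 GB/s
rtx_4070_ti = bandwidth_gbs(192, 21)            # 504 GB/s
print(f"{old_bus_new_ram / rtx_4070_ti:.2f}x")  # ~1.83x -- close to "double"
```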
Posted on Reply
#98
kiriakost
Garrus said: "This isn't an nVidia versus AMD thing. This is a "our video card companies suck and they are ruining gaming" thing. Intel had such an opportunity here to target gamers instead of the datacenter; they messed up also."
Leave Intel out of it - we need someone dedicated to building datacenter hardware without severe technical issues. :)
We need some new brains to stand up and create fresh competition.
Posted on Reply
#99
Lei
Garrus said:
GTX 1080 Ti: 12 billion transistors.
RTX 3080: 28 billion.
RTX 4090: 76 billion.
I just want the 1080 Ti for $200, running at double the clock speed. So would everyone else.
I see: 6.3 times the transistors for 2.75 times the FPS.

But 2.75 1080 Tis would draw about 300 W more than a single 4090.
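
(A sketch of that power comparison using the official board-power figures, 250 W for the 1080 Ti and 450 W for the 4090; with TDPs alone the gap comes out nearer 240 W, so the poster's ~300 W presumably assumes real-world draw above TDP.)

```python
# Matching the 4090's rough 2.75x raster performance with GTX 1080 Tis
# (the poster's scenario), using official board power figures.
tdp_1080ti, tdp_4090 = 250, 450
fleet_power = 2.75 * tdp_1080ti                         # 687.5 W for 2.75 cards
print(f"extra power: {fleet_power - tdp_4090:.0f} W")   # ~238 W more than one 4090
```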

Posted on Reply
#100
kiriakost
Garrus said: "Stick modern VRAM into an old GTX 1080 Ti design and you get DOUBLE the RAM bandwidth of the RTX 4070 Ti. That's how crap the 4070 Ti is. Terrible design. It's '60-class."
Lei said: "But 2.75 1080 Tis would draw about 300 W more than a single 4090."
This is the only thing worth talking about today: power consumption, RTX 3000 vs RTX 4000.
The RTX 3000 series failed to deliver reasonable power consumption relative to its transistor count.
And many poor kids wasted piles of money buying thermal pads and magic pills, and they achieved nothing.
Posted on Reply