
NVIDIA GeForce RTX 4070 Ti to Launch $100 Cheaper Than RTX 4080 12 GB

Regarding current card prices, here in the UK they are for once lower than in other parts of the world, at or below MSRP.

3080-10GB - £720 - https://www.scan.co.uk/products/pal...ray-tracing-graphics-card-8704-core-1710mhz-b
6800XT - £650 - https://www.overclockers.co.uk/powe...16gb-pci-express-graphics-card-gx-19t-pc.html
7900XT - £899 - but all of the available listings are based on the AMD reference design, with the potentially faulty vapor chamber.
Checked prices here (Norway), and could see one (or some) 6800 XT priced higher than the 7900 XTX... though most are cheaper:
3080 10GB - 9 999,- NOK (1,004 USD)
6800 XT - 8 699,- NOK (874 USD)
7900 XT - 11 290,- NOK (1,135 USD)
4080 - 16 409,- NOK (1,650 USD) (note: this was the 2nd cheapest listing, but the cheapest actually in stock)
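For what it's worth, the USD figures above imply a roughly constant exchange rate; a tiny sanity-check sketch, with the ~9.95 NOK/USD rate inferred from the numbers themselves rather than taken from an official source:

```python
# Quick sanity check of the NOK -> USD conversions quoted above.
# The ~9.95 NOK/USD rate is inferred from the poster's own numbers,
# not an official exchange rate.
NOK_PER_USD = 9.95  # assumed rate

prices_nok = {
    "RTX 3080 10GB": 9999,
    "RX 6800 XT":    8699,
    "RX 7900 XT":    11290,
    "RTX 4080":      16409,
}

for card, nok in prices_nok.items():
    print(f"{card}: {nok:,} NOK ~= {nok / NOK_PER_USD:,.0f} USD")
```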
A number of AIBs have used the stock AMD design, so no, not just MBA. Der8auer commented that this issue is affecting thousands of cards, which is a lot worse than the 50 cases (so Nvidia says) of the power connector issue with the 4090.
Thousands...? And that's based on the 40 reported cases, with, again, only 4 of them examined (and no conclusion yet on what the fault is)?
There's a dedicated thread for that issue anyway, and IMO it's not very relevant to a thread about the rumored 4070 Ti: https://www.techpowerup.com/forums/...re-faulty-coolers-causing-overheating.302917/
 
AMD could have been killing it this time if they had concentrated on RT performance. They concentrated on raster performance and failed even there. If they had concentrated on RT and been at least equal in performance to the RTX 4000 series, they would have been shielded from a price war. Now they have to fight a $100 cheaper product that will perform better at RT and be more than enough in raster. Not to mention fake frames 3.0, which the tech press already finds smooth and, slowly, version after version, will come to see as a must-have feature.

I think at this price the 4070 Ti will have a walk in the park. It's ridiculously expensive for an xx70 product, but it will still sell like hotcakes.

I guess it's going to be stagnation, monopoly and slow news for at least the next 2 years.
 
The 4080 should sit above a 7900 XTX because it's objectively a better card.
Sure, while the 7900s clearly have issues, the XTX is still ahead in raster. Also, you are getting 24 GB, not 16 GB; do you think that memory should be free? I don't care at all about RT; even the 4090 needs fake frames to look good in AAA titles. I have Nvidia cards and have never bothered with RT once.
 
GPU sales are at their lowest in 20 years; wonder why.

The consumer market has spoken. Only the very few are willing to pay what is being asked. The majority are saying 'no'. This is how markets work and, no matter what people say, this WILL be worrying for Nvidia and AMD. They will have expected the 4080 to fly, but it has not. AMD haven't got enough XTX inventory to sell in high numbers; regardless, against the 4080 it is a hard sell (the 4080 is more efficient, close enough in raster, and more well-rounded). The 7900 XT is on shaky ground while we await reviews of the 4070 Ti.

We're hitting a hard wall here where the 4060 will probably be priced higher than last year's 3080. :shadedshu:
 
That price is a joke; never in my life will I pay that much for an xx70 card. We badly need Intel to make competitive GPUs; there is no reason for AMD to adopt an aggressive pricing policy, and they also lack the performance and feature set. So they just accept the loss and sell a bit cheaper, but still way above the "vintage" prices.
 
Unlike the 7900 XTX, the 4080 doesn't have any defects to my knowledge, not to mention that card is loaded with Nvidia goodies including DLSS 3, along with solid drivers. If availability is decent for the 4070 Ti, then it could very well be the go-to card for a lot of people gaming at 1440p, and that of course includes streamers.

Dear lord, do you ever make it sound like you have some Nvidia stock you would like to increase in value... NONE of the new cards are "go-to cards"; that is an absolutely INSANE statement.
And Nvidia goodies? I don't know if you noticed, but DLSS 3 is pretty hit or miss at the moment, kind of like DLSS when it first came out... no wait, that was only miss.

Seriously man, looking at your other posts here as well, are you in Nvidia's pocket or just having a bad day or what?
"the 4080 doesnt have any defects" well it has the same connector which was a bit of an issue, and what defect does the 7900 xtx have? you mean that Derbauer vid of how some of the reference cooler designs have an issue with the vaporchamber? you extrapolate that to a defect for the 7900xtx in general? ignoring those that dont suffer from it not to mention partner cards with different coolers?

Honestly, I could react to just about any of your posts in this thread, man. Again, I don't know if you are in Nvidia's pocket or what, but man, are you spewing some weird stuff...

You gotta admit though DLSS 2.0 > FSR by a wide margin in adoption, quality, options, and performance

If that is a fact, then there is nothing to "admit", really.
But do back it up with some facts then: how many games support DLSS, and how many FSR?
In how many games does DLSS actually look better (and how much does that hold true if we exclude pixel peeping in direct comparisons)?

Options? Not even sure what you mean by that.

Performance? Again, provide some benchmarks to back this up.
 
Most shamelessly overpriced xx70 Ti in history.
 
You gotta admit though DLSS 2.0 > FSR by a wide margin in adoption, quality, options, and performance

I am hard pressed to find a single game where DLSS looks or runs noticeably better. Do you have any such examples? Also, there seems to be a more or less equal number of games coming out these days that support DLSS or FSR, and some support both. FSR is easy to implement; the fact that many games have been modded to support it proves that. Any developer that adds DLSS but not FSR does so with a vested interest in promoting Nvidia products.
 
Sure, while the 7900s clearly have issues, the XTX is still ahead in raster. Also, you are getting 24 GB, not 16 GB; do you think that memory should be free? I don't care at all about RT; even the 4090 needs fake frames to look good in AAA titles. I have Nvidia cards and have never bothered with RT once.
The XTX is not ahead in raster. It is on par. I'm looking at the 1440p average FPS, so I suppose we can single out games where one does better than the other; it's not a clear win for either of them.
I feel like 16 GB is plenty for a card at this level, but I agree that more is better. I do think the main decision is whether you want DLSS/RT or not. If you don't, then the decision is simple.
 
They don't have a choice, there's no competition.
HAHAHA, you're acting like people NEED GPUs like they need water and that simply not buying one isn't an option... you're forgetting that they could always just abstain from upgrading. If even 30% of potential GPU buyers did this, it WOULD send a message. What confuses me personally: we know for a fact that the global trend is toward more and more poorer people and fewer and fewer richer people... so who the heck is buying all these 4090s?

AMD could have been killing it this time if they had concentrated on RT performance. They concentrated on raster performance and failed even there. If they had concentrated on RT and been at least equal in performance to the RTX 4000 series, they would have been shielded from a price war. Now they have to fight a $100 cheaper product that will perform better at RT and be more than enough in raster. Not to mention fake frames 3.0, which the tech press already finds smooth and, slowly, version after version, will come to see as a must-have feature.

I think at this price the 4070 Ti will have a walk in the park. It's ridiculously expensive for an xx70 product, but it will still sell like hotcakes.

I guess it's going to be stagnation, monopoly and slow news for at least the next 2 years.

The implied assumption in this comment is that AMD CAN do all these things you want but is simply choosing not to, when that couldn't be further from the truth. I'm guilty of posting this ad nauseam, but it truly is THE defining factor in all of this: Nvidia has an R&D budget of over $5.27 billion; AMD has an R&D budget of just $2 billion. And don't forget, where Nvidia can spend the overwhelming majority of its much larger budget purely on graphics, AMD has to split theirs between x86 and graphics. Because x86 has the larger TAM compared to graphics, and Ryzen is much more lucrative to AMD than Radeon is, it's safe to assume that less than half of that $2 billion goes to graphics. So basically, it's a situation in which AMD's graphics division has to compete with Nvidia while Nvidia spends about 6x more on R&D (see the quick arithmetic sketch at the end of this post).

Based on that reality alone, I think it's actually a MIRACLE, and rather impressive, that AMD is able to compete at all, and even match or exceed Nvidia in raster. Can you think of any other example from any other market or industry in which a duopoly exists and the underdog still manages to compete on around a sixth of the budget? Instead of complaining about what AMD does or doesn't do, we need to understand the financial reality of the situation, realize that when it comes to things like ray tracing Nvidia had a generational head start and basically unlimited funds compared to AMD, and use that realization to temper our expectations.

FYI: Intel's R&D budget is over $15 billion, and while Intel has numerous markets and products in which to deploy those funds, I think it's safe to assume that Intel spends a whole lot more than AMD on x86 R&D, probably by a factor approaching Nvidia's 6x advantage. It is therefore yet another extremely impressive feat that AMD has accomplished what it has in x86 while at a severe financial disadvantage. Based on the money spent, Intel should be outperforming AMD by 40+%, but they're not, which testifies to the fact that AMD does a whole lot with very little.
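To make the 6x figure concrete, here's the back-of-envelope arithmetic using the budget figures quoted above; the graphics share of AMD's R&D is my assumption ("less than half"), not a number AMD discloses:

```python
# Back-of-envelope R&D comparison using the figures quoted above.
# The graphics share of AMD's budget is an assumption ("less than half"),
# not a number AMD discloses.
nvidia_rd = 5.27e9          # Nvidia annual R&D (USD), overwhelmingly graphics-related
amd_rd = 2.0e9              # AMD annual R&D (USD), split across x86 and graphics
amd_graphics_share = 0.45   # assumed Radeon share of AMD's R&D

amd_graphics_rd = amd_rd * amd_graphics_share
print(f"Radeon R&D (assumed): ${amd_graphics_rd / 1e9:.2f}B")
print(f"Nvidia outspends Radeon ~{nvidia_rd / amd_graphics_rd:.1f}x")  # ~5.9x
```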
 
Oh wow. So generous of Nvidia to rebrand their 4080 and sell it $100 cheaper.
Generous is too modest a word; we should call this divine... :nutkick:
 
In how many games does DLSS actually look better (and how much does that hold true if we exclude pixel peeping in direct comparisons)?
Like, the vast majority, but you could always argue that the FSR image is preferable to you, depending on your IQ tastes.
I am hard pressed to find a single game where DLSS looks or runs noticeably better.
I'm hard pressed to find a game where FSR looks roughly as good at a glance, although admittedly 2.2 is really starting to shape up. And here I was thinking, with the way you post, that you're allergic to Nvidia GPUs. Do you possess one to see DLSS running with your own eyes on your own setup, to do a genuine and honest apples-to-apples? Doubt it.
Nvidia has an R&D budget of over $5.27 billion. AMD has an R&D budget of just $2 Billion
Are we supposed to give them a pass, or go easy on a product's performance, features, and pricing because of that? No.

On topic, my price prediction was spot on at $799. Still too much, of course; this should be $499/$599, but in the current market it's about the best we can hope for until both camps satisfy their desire to move old-gen stock. Keen to see how it performs.
 
On topic, my price prediction was spot on at $799. Still too much, of course; this should be $499/$599, but in the current market it's about the best we can hope for until both camps satisfy their desire to move old-gen stock. Keen to see how it performs.

:toast:

Bingo. It's where the 7900XT should be as well.
 
The 3060-class GPU die in the 4070 Ti is an insult. $600, fine, but $800? NO way. Garbage.
Do you know that the chip in the 4070 Ti (AD104) literally has 3x as many transistors as the chip in the 3060 (GA106)? It's not about die size but about transistor count and the cost of those transistors; the AD104 is roughly 2.5x more expensive to produce than the GA106 (see the rough sketch below). Not to mention a video card is not just one chip: the TDP of the 4070 Ti is allegedly 280 W while the TDP of the 3060 was 170 W, which means the 4070 Ti needs a more complex PCB with better components and better/bigger cooling. Also, it's a higher-tier product, which means higher margins for Nvidia and for the AIBs. And the 3060 was basically unavailable at MSRP since its launch; maybe they sold small quantities at that price.
Coreteks really misses the plot in that video; he doesn't know what he's talking about.
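To put some rough numbers on that: the transistor counts and die sizes below are public specs (as listed in the TPU GPU database), but the wafer prices are commonly cited ballpark estimates, not confirmed figures, so treat this strictly as a sketch:

```python
import math

# Public die specs; wafer prices are rough, commonly cited estimates --
# treat them as assumptions, not facts.
ad104 = {"transistors": 35.8e9, "die_mm2": 294.5, "wafer_usd": 16000}  # TSMC 4N
ga106 = {"transistors": 12.0e9, "die_mm2": 276.0, "wafer_usd": 6000}   # Samsung 8nm

def dies_per_wafer(die_mm2, wafer_diameter_mm=300):
    """Crude upper bound: wafer area / die area, ignoring edge loss and yield."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    return wafer_area // die_mm2

for name, chip in (("AD104", ad104), ("GA106", ga106)):
    n = dies_per_wafer(chip["die_mm2"])
    chip["cost"] = chip["wafer_usd"] / n
    print(f"{name}: ~{n:.0f} dies/wafer, ~${chip['cost']:.0f} per die")

print(f"Transistor ratio: {ad104['transistors'] / ga106['transistors']:.1f}x")  # ~3.0x
print(f"Die cost ratio:   {ad104['cost'] / ga106['cost']:.1f}x")                # ~2.8x
```

Even with these crude assumptions, the die cost ratio lands in the same ballpark as the ~2.5x claim above.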
 
Like, the vast majority, but you could always argue that the FSR image is preferable to you, depending on your IQ tastes.

IQ tastes?
Also, what does it actually mean to "look better", and how? I know FSR still has some of the occlusion issues that plagued DLSS early on as well, and yeah, you can notice that.
But in general, again, unless you do a meticulous side-by-side, there is basically no difference between them. It's not a case of preferring one over the other, as if taste or opinion were involved, like some artistic impression; it's just a program doing what it's supposed to do: upscale an image to resemble the native resolution while actually rendering at a lower internal resolution.
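To illustrate just how mundane the basic idea is, here's a toy sketch: render low, upscale to output. This is plain bilinear filtering with Pillow and a made-up input filename; DLSS and FSR 2.x are temporal and use motion vectors, so this shows the concept only, not either algorithm:

```python
from PIL import Image

# Toy illustration of upscaling: render internally at a lower resolution,
# then scale to the display resolution. Plain bilinear filtering only --
# DLSS/FSR 2.x additionally use motion vectors and previous frames.
INTERNAL = (1707, 960)   # ~67% per-axis scale for a 1440p target ("quality" style)
OUTPUT = (2560, 1440)

frame = Image.open("rendered_frame.png")       # hypothetical input file
low = frame.resize(INTERNAL, Image.BILINEAR)   # stand-in for a low-res render
upscaled = low.resize(OUTPUT, Image.BILINEAR)  # upscale to display resolution
upscaled.save("upscaled_frame.png")
```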
 
It's $100 cheaper, with DLSS 3 and ray tracing. I am not saying it's good or bad to neglect raster, just that people may justify picking this one over the 7900 XT for those features and the lower price.
The 7900 XTX is 14% slower than the 4080 at RT, so even for people who are into the RT gimmick, it's a non-argument.

DLSS 3, as in "let me interpolate frames" (something my TV can do on its own), will inevitably appear under the FSR brand (and has major downsides anyhow).
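For reference, the naive "TV-style" interpolation being mocked here is literally just blending adjacent frames; a minimal sketch below (DLSS 3 instead generates frames from optical flow plus game motion vectors, so this only illustrates the crude version):

```python
import numpy as np

# Crude "TV-style" frame interpolation: the generated frame is just a
# blend of its two neighbours. DLSS 3 uses optical flow plus game motion
# vectors, so this only illustrates the naive idea.
def interpolate(frame_a: np.ndarray, frame_b: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Linear blend between two frames; t=0.5 is the midpoint frame."""
    return ((1.0 - t) * frame_a.astype(np.float32)
            + t * frame_b.astype(np.float32)).astype(np.uint8)

# Two dummy 1080p RGB frames for demonstration.
a = np.zeros((1080, 1920, 3), dtype=np.uint8)    # all-black frame
b = np.full((1080, 1920, 3), 255, dtype=np.uint8)  # all-white frame
mid = interpolate(a, b)  # a uniform grey frame
print(mid[0, 0])         # -> [127 127 127]
```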


That "with DLSS3" is needed to claim card being "faster than 3090Ti".


It is bonkers that products like that are even being rolled out at $800. The 3080 likely kicks it at 4K.
 
I'm hard pressed to find a game where FSR looks roughly as good at a glance, although admittedly 2.2 is really starting to shape up. And here I was thinking, with the way you post, that you're allergic to Nvidia GPUs. Do you possess one to see DLSS running with your own eyes on your own setup, to do a genuine and honest apples-to-apples? Doubt it.

You know that you can just prove me wrong by showing me some screenshots or something, right? Lord Jensen has been kind enough to allow screenshots of games running DLSS to be viewable by peasants on anything that has a screen, in order to do comparisons on them.

[Attached screenshot: side-by-side DLSS vs. FSR comparison]


Can you tell which is which? I sure as hell can't. To be honest, even if you told me that you can tell the difference, I wouldn't believe you in a million years.
 
The implied assumption in this comment is that AMD CAN do all these things you want but is simply choosing not to, when that couldn't be further from the truth. I'm guilty of posting this ad nauseam, but it truly is THE defining factor in all of this: Nvidia has an R&D budget of over $5.27 billion; AMD has an R&D budget of just $2 billion. And don't forget, where Nvidia can spend the overwhelming majority of its much larger budget purely on graphics, AMD has to split theirs between x86 and graphics. Because x86 has the larger TAM compared to graphics, and Ryzen is much more lucrative to AMD than Radeon is, it's safe to assume that less than half of that $2 billion goes to graphics. So basically, it's a situation in which AMD's graphics division has to compete with Nvidia while Nvidia spends about 6x more on R&D. Based on that reality alone, I think it's actually a MIRACLE, and rather impressive, that AMD is able to compete at all, and even match or exceed Nvidia in raster. Can you think of any other example from any other market or industry in which a duopoly exists and the underdog still manages to compete on around a sixth of the budget? Instead of complaining about what AMD does or doesn't do, we need to understand the financial reality of the situation, realize that when it comes to things like ray tracing Nvidia had a generational head start and basically unlimited funds compared to AMD, and use that realization to temper our expectations.

FYI: Intel's R&D budget is over $15 billion, and while Intel has numerous markets and products in which to deploy those funds, I think it's safe to assume that Intel spends a whole lot more than AMD on x86 R&D, probably by a factor approaching Nvidia's 6x advantage. It is therefore yet another extremely impressive feat that AMD has accomplished what it has in x86 while at a severe financial disadvantage. Based on the money spent, Intel should be outperforming AMD by 40+%, but they're not, which testifies to the fact that AMD does a whole lot with very little.
I do agree with the above stuff you posted. I also post similar replies to people expecting AMD to beat Intel and Nvidia in everything, just so they don't have any excuse to attack AMD.

BUT. What I was saying is that they were short-sighted here, thinking that RT performance would remain secondary for at least the next 2 years; that they still had time, could focus on raster for the next 2 years, and still be fine. Well, RT performance isn't that important in gaming yet, and it might not be in the next 2 years, but it is in marketing. And that's where they are losing. The 4070 Ti will sell better because of DLSS 3.0 and RT performance. AMD might offer something competitive with FSR 3.0, but they won't be able to do so with RT. Even Hardware Unboxed, which many Nvidia trolls call AMD-biased (they are used to the tech press favoring Nvidia over AMD 80%-20%, so anything close to 50%-50% they call biased), mentioned in their latest video that as time passes, RT performance will become more important and more games will support it. And RT performance is the main argument the tech press uses today to paint the 7900 XT/X as a non-recommended product. If AMD had focused on RT performance, they could sell their top cards at at least RTX 4070 Ti prices. Now they will have problems selling the RX 7900 XT/X even against the RTX 4070 Ti.

So, because of their limited R&D, they have to be more careful about their targets. They bet on DDR5 and a long-lasting socket and failed in CPUs. They bet on chiplets and raster and failed in GPUs. They also have to change their business model: AMD tries to offer products that are future-proof, while Intel and Nvidia try to win current benchmarks, ignoring or even limiting the long-term value of their products. AMD needs to do the same, because consumers do not appreciate future-proofing. If AM5 were a 1-2 generation socket with both DDR5 AND DDR4 support, the platform could be (much) cheaper. If the 7900 XT/X had come with lower VRAM capacity and a narrower data bus, they could have been cheaper to produce. And if they were not much better in raster than the 6900 XT but twice as fast in RT, they could have had a real chance in the market.

Intel is another kind of monster. They know they will sell even mediocre products, thanks to their strong ties with OEMs and the dependency of OEMs on Intel products. As with their efforts to remain an alternative to ARM in tablets by throwing billions at the problem, they will do the same with GPUs. They need GPUs for their future servers, and they need GPUs to keep having a gaming platform in retail, even if Nvidia someday decides to go big on ARM and slowly starts abandoning the x86 platform with the goal of creating its own ARM-based gaming platform. Think Valve, but with hardware.


PS: AMD needs to focus on RT because, if they don't, they will lose the next generation of consoles. MS and Sony are not married to AMD.
 
They don't have a choice, there's no competition.
Up to a point. For luxury goods, even in the face of monopolies where there's no choice of 'what' to buy, the other real personal choice ('whether' to buy) remains, i.e., learning to say "no" and standing firm against anti-consumer practices. If "Real Gamers (tm)" stopped acting like FOMO is some incurable supernatural possession and developed an ounce of self-control, everyone (including themselves) would be better off in the long run.

The consumer market has spoken. Only the very few are willing to pay what is being asked. The majority are saying 'no'. This is how markets work and, no matter what people say, this WILL be worrying for Nvidia and AMD. They will have expected the 4080 to fly, but it has not.
Agreed. Many "ordinary" people are obviously saying no, and asking around this Xmas has been an eye-opener. Out of the dozen or so people I know in person who are PC gamers, only one bought a new GPU this Xmas. Five bought their kids consoles (mostly XBox S/X + Gamepass) instead of a GPU and all the kids seem happy with it, two are mostly into older / Indie gaming and are just keeping what they already have (I make a 3rd in that category), one who was planning to buy a mid-range 4070 class GPU took one look at the predicted prices and bought a Steam Deck instead for half the price (and he's very happy with it too taking it damn near everywhere), and 2-3 others are in a "holding pattern" quietly watching & waiting to see if things improve / the economy worsens in 2023 vs the threshold at which they start to drift away from PC gaming altogether if the GPU market doesn't get its sh*t together.
 
Absolutely insane that a 70 series card is now $800.
It's not even a true 70-series. The xx70 class used to have a 256-bit memory bus. This thing is somewhere between the traditional xx60 and xx70 classes. It should be a 4060 Ti at best. Pure insanity.
 
If Wizard's review of this card turns out the way I think it will, I'm going to bump this thread for sure. ^^
 