Friday, February 9th 2024
NVIDIA GeForce RTX 4070 Ti Drops Down to $699, Matches Radeon RX 7900 XT Price
The NVIDIA GeForce RTX 4070 Ti can now be found for as low as $699, which means it is now selling at the same price as the AMD Radeon RX 7900 XT graphics card. The GeForce RTX 4070 Ti lags behind the Radeon RX 7900 XT in performance and packs less VRAM (12 GB vs. 20 GB), while the faster GeForce RTX 4070 Ti SUPER sells for around $100 more. The Radeon RX 7900 XT is around 6 to 11 percent faster, depending on the game and the resolution.
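For a rough sense of what price parity means in value terms, here is a minimal sketch in Python; the 6 to 11 percent gap quoted above is reduced to an illustrative midpoint assumption, not benchmark data, and actual results vary by game and resolution:

```python
# Rough performance-per-dollar comparison at the new $699 price point.
# relative_perf values are illustrative assumptions based on the ~6-11%
# gap mentioned in the article, not measured benchmark results.

cards = {
    "RTX 4070 Ti": {"price": 699, "relative_perf": 1.00},   # baseline
    "RX 7900 XT":  {"price": 699, "relative_perf": 1.085},  # midpoint of the 6-11% range
}

for name, c in cards.items():
    perf_per_dollar = c["relative_perf"] / c["price"] * 1000  # relative perf per $1000 spent
    print(f"{name}: {perf_per_dollar:.2f} relative perf per $1000")
```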
The GeForce RTX 4070 Ti card in question comes from MSI: it is the Ventus 2X OC model, listed over at Newegg.com for $749.99 with a $50-off promotion code. Bear in mind that this is a dual-fan version from MSI, and we are quite sure we'll see similar promotions from other NVIDIA AIC partners.
Sources:
Newegg.com, via Videocardz.com
122 Comments on NVIDIA GeForce RTX 4070 Ti Drops Down to $699, Matches Radeon RX 7900 XT Price
Edit:
That is with VSync enabled. The only game I have to cheat with is CP2077.
Also you can't blame AMD entirely for the bad Blender performance, as the following article shows:
"Blender's decision to enable AMD ray tracing cores marks a pivotal moment in the world of 3D rendering. This follows Maxon's recent inclusion of HIP in their Redshift renderer. We are increasingly seeing AMD looking to professional workflows with their video cards. They still aren't entirely competitive with NVIDIA, but this comes as a warning shot. AMD is taking GPU rendering seriously, and if they are able to make the same sort of improvements as they did with CPUs when they introduced the Ryzen line, 3D artists stand to win. We are excited to see what the future holds for GPU rendering."
There is a 4060ti with 16GB.
There are midrange Ampere cards with 16GB.
These cards have no business in gaming whatsoever. There's nil advantage to them over their half VRAM counterparts (perhaps situationally, but that won't last), especially the 4060ti.
That's them catering to that exact demand right there, but on a much more 'democratic' price level. Still overpriced for what it really is. But. An Nvidia card with 16 GB on the newest architecture. It can do RT. It has AI. It has creator tools. It has everything your little gamur heart wants. Yadayada. You get the gist?
I'm not telling you this makes sense in any kind of realistic economy or for real pro markets. But it makes sense in the hearts and minds of prospective buyers. Young people with little knowledge of what they might do or can do with that GPU perhaps. Somewhat more knowledgeable people that know how to put VRAM to use. Etc. There's a market here. Niche? I'm not so sure. I think a lot of people are sensitive to this.
I can't even claim I was immune to this myself; take a feature like Ansel, for example. It's not like I would have picked Pascal over any other GPU at the time for it. But still, it's yet another neat little tool you can use, and I've pulled some pretty nifty screens for my desktop with it. All these little things really do matter. If you buy an Nvidia GPU, you get a package of added value that AMD simply cannot match. I'm now past the point of caring too much about all of that, but it's a package nonetheless.
AMD tried the same thing, but they got their shit together.
$500 for the 7800 XT
$700 for the 7900 XT
is almost normal pricing. Those should have been the release prices, but we can't get everything.
Whoever thinks any 70 or 70 Ti card is worth €700-900 has lost their damn mind, especially with that VRAM greed and the ridiculous bus width on the 60-class cards,
while a 7900 XT gives you 4070 Ti SUPER to 4080 performance, depending on the game.
The funny thing is the 4090 is so cut down it would barely qualify as a proper 80 Ti card.
But hey, people bought the freaking 3090 for double the price of a 3080 while it was only about 10% faster. Stupidity has no bounds, especially among gamers, as you can see from the state of the gaming industry.
Nvidia lied when they said the 90-class card would replace the Titans. The Titan RTX burns the 3090 in some productivity tasks because it was a real Titan, not a wannabe used as an excuse to double the prices. And people are eating it up.
However, the availability of the 4070 Ti and other NVIDIA cards is another question. You're right there. However, some time ago the same sentences were aimed at the GCN and Vega architectures, which, even as low-end gaming parts, were compute monsters, unlike their NVIDIA counterparts, which, with the exception of the very high end, were completely anemic at those tasks. Now the tables have turned, but the narratives stayed the same. I'm not trying to attack you, just some points to note.
Everyone knows that Nvidia gatekeeps the market with CUDA and does really dirty things to their fans and consumers, heck, even to their precious clients and partners. They shit on absolutely everyone. That's a fact. Nvidia is an anti-consumer, pro-investor, trillion-dollar corporation, and it grows like mushrooms after the rain.
But at the same time, what prevents AMD from turning the situation around and providing a 'morally' correct, open-source alternative to CUDA, along with countless other comfortable tool sets? From breaking this vicious circle and disrupting the monopoly? AMD doesn't fight it; instead they join the same game, and might even be in collusion with Nvidia. Anybody can point out the disgusting tactics Nvidia uses and how locked down their proprietary ecosystem is, but their many endeavours and the many SDKs they open up to developers deserve some credit, direct 'incentives' aside. There's no need to bribe game developers, as most already build games around the consoles, which carry Zen 2 and RDNA 2. What is needed is to help and support developers and make the process as easy as possible, so the devs won't even care about Nvidia's fat suitcases.
Again, why can't AMD invest in its own viable, effective, comfortable and high-quality ecosystem, proprietary or not? What prevents AMD from doing so, except greed? At this point AMD looks like the laziest company: they rest on the laurels of EPYC/Ryzen, milk them as much as possible, and only occasionally respond to their rivals. And they wave the open-source banner over their stuff mostly to offload development onto the shoulders of clients and the community.
Why this matters: such incompetent behavior is dangerous not only for AMD itself, but for the entire market. Lose one participant to its own reckless moves and the market collapses. The next RTX xx50 would cost a grand, if it exists at all. Every consumer needs competition, and that's impossible when one participant has already given up. Indeed, this is almost like the Bulldozer vs. Sandy Bridge drama all over again, when Intel was competing against itself for almost eight years. AMD needs to roll out its 'Zen' of GPUs, or it will lose the consumer gaming market completely. Intel has already reached the market share that AMD spent a decade gaining, with only a couple of years on the market and even a failed Xe launch. What is AMD going to do when Battlemage happens? I bet Intel isn't sitting there idling on its arse.
Another question: is it possible to trick games into using those Radeons, since there's no way they can't handle such a basic task? Those might be the poor laptops that sweatshops and cafés were using to mine Ethereum and other crypto garbage. What nobody mentions is where all the storage used for Chia mining has gone. :rolleyes: Even if the 4060 Ti had a wider bus, the GPU still couldn't use all that VRAM fast enough. Maybe 10-12 GB would be better, but even that is doubtful.
Chia mining didn't just use lots of storage, it used up lots of storage. The expected lifespan of a TLC SSD was about 90 days of Chia mining per TB of capacity. I assume QLC drives didn't even last long enough to be worth bothering with. A mechanical drive of any capacity surviving more than 6 months was apparently also an outlier, with death usually coming at the 3-5 month mark.
Even as someone who mined and holds crypto, I couldn't see the point of Chia, and I'm not really sure I see the point of Bitcoin mining. Digital, nation-independent DeFi is the future, and Bitcoin started that, but we don't need to mine it wastefully. A successful independent DeFi doesn't have to generate 90 Mt of CO2 a year for no justifiable reason.
The narrow bus is exactly the 4060 Ti's problem. My own 4060 Ti is undervolted and underclocked to a 125 W power draw, but even hamstrung like that and rendering at just 1080p I'll run into situations where the overlay says it's not fully loaded and neither is any single CPU core. That's either memory bandwidth or game-engine bottlenecking, and I know it's not the game engine because the same scene runs at 100% GPU usage on the 4070 or 7800 XT.
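For anyone who wants to reproduce that kind of check, here is a minimal sketch of the idea, assuming an NVIDIA card with nvidia-smi on the PATH and the psutil package installed; the sampling loop is illustrative, not a real profiling tool:

```python
# Sample GPU load and per-core CPU load while a game runs. If the GPU sits
# well below 100% and no single CPU core is pegged either, the bottleneck
# is likely elsewhere (e.g. memory bandwidth).
import subprocess
import psutil

def sample():
    gpu = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    cores = psutil.cpu_percent(interval=1.0, percpu=True)  # one-second sample per core
    print(f"GPU load: {gpu}%  |  busiest CPU core: {max(cores):.0f}%")

for _ in range(10):  # ten one-second samples
    sample()
```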
It's also ROP-limited, so resolution scaling on the 4060 Ti is pathetic compared to the 3060 Ti, but since the bandwidth bottleneck is so great, we don't really get to see the ROP limitation. For the 4060 Ti to be a better 1440p card it would mostly have needed more bandwidth, but that would just have revealed the ROP deficiency, which is more situational but still an issue holding it back.
Sadly, if you head to Wikipedia and look at the one-spec-sheet-to-rule-them-all, you can see how the 4060 Ti is really a successor to the 3060 8GB in terms of bandwidth, SM+GPC counts, and the relative position of that silicon in Nvidia's range of GPU dies. It's a long way off the 3060 Ti, and the only reason it gets close is that TSMC 4N lets Nvidia clock it 55% higher than the 3060 Ti on Samsung's underwhelming 8 nm node.
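To put rough numbers on that, here is a back-of-the-envelope sketch; the bus widths, data rates, ROP counts and boost clocks are the commonly published figures and should be treated as approximate assumptions rather than authoritative specs:

```python
# Back-of-the-envelope bandwidth and pixel-fill comparison for the cards
# discussed above. All spec figures are approximate published values.

specs = {
    #  name            bus_bits  gbps   rops  boost_ghz
    "RTX 3060 8GB":   (128,      15.0,  48,   1.777),
    "RTX 3060 Ti":    (256,      14.0,  80,   1.665),
    "RTX 4060 Ti":    (128,      18.0,  48,   2.535),
}

for name, (bus_bits, gbps, rops, boost_ghz) in specs.items():
    bandwidth_gbs = bus_bits / 8 * gbps   # memory bandwidth in GB/s
    fillrate_gpix = rops * boost_ghz      # theoretical pixel fill rate in Gpixels/s
    print(f"{name:14s}  {bandwidth_gbs:5.0f} GB/s   {fillrate_gpix:5.0f} Gpix/s")
```

On those rough numbers the 4060 Ti's bandwidth lands much closer to the 3060 8GB than to the 3060 Ti, and its theoretical fill rate still trails the 3060 Ti despite the much higher clock, which is the point being made above.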
And you've enjoyed 15 months of games at 4K from 2022 and 2023 titles, I presume? TLoU-P1, CP2077, Hogwarts and MS Flight Sim all exceed 12 GB at 4K on max settings. You'll notice it when running uncapped, because it manifests initially as microstuttering and frame-pacing issues, but realistically at those settings (especially Overdrive in CP2077) you're unlikely to get much more than 60 fps in heavy scenes anyway, so hiding the stuttering/pacing issues behind a 60 Hz cap makes it less of a problem in those older 2022/2023 titles. Realistically, the issue with the 4070 Ti isn't its performance over the past 15 months; it's how it's going to perform over the next 15 months, now that so many more games in the development pipeline are moving to UE5 and ditching any semblance of PS4 and XB1 compatibility now that those consoles have been dropped for good.
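A quick way to see whether you are hitting that kind of frame-pacing problem rather than a raw FPS problem is to compare the average FPS with the 1% lows from a frame-time capture. A minimal sketch, with made-up sample data standing in for a real PresentMon/CapFrameX-style export:

```python
# Microstuttering shows up in frame-time consistency long before the average
# FPS drops. Given a list of frame times in milliseconds, compare average FPS
# with the "1% low" FPS. The sample data below is invented for illustration.

frame_times_ms = [16.7] * 950 + [45.0] * 50   # mostly smooth, 5% of frames spike

def fps_stats(times_ms):
    avg_fps = 1000 / (sum(times_ms) / len(times_ms))
    worst_1pct = sorted(times_ms)[-max(1, len(times_ms) // 100):]  # slowest 1% of frames
    low_1pct_fps = 1000 / (sum(worst_1pct) / len(worst_1pct))
    return avg_fps, low_1pct_fps

avg, low = fps_stats(frame_times_ms)
print(f"average: {avg:.0f} fps, 1% low: {low:.0f} fps")  # a large gap means poor pacing
```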
The 4070 Ti isn't a bad card. It's objectively better than the 4070 and 4070S, both of which are considered "decent" cards. Everyone talks shit about the 4070Ti because of the asking price, and the sheer hubris/cheek/greed of Nvidia in trying to launch it at $900 as the 4080 12GB.
If you fall into the trap of comparing its price to other 40-series cards, you'll end up drinking the Nvidia Kool-Aid and justifying the cost relative to the 4080, which was just ridiculously poor value. The reality at launch was that the $800 4070 Ti brought 3080 12GB/3080 Ti levels of performance and VRAM to the table at the exact same performance/$ point as new retail cards at that time. It wasn't dramatically faster than the 6950 XT (another chart in the same HUB Jan '23 update), which was selling for $649, admittedly without the same feature set.
Hardware Unboxed has been doing these monthly GPU pricing updates for a few years now, and around the 4070 Ti's launch it's easy to see why there was so much hate for the card, and it's all because of the price. The only reason you can say it's a decent card is that you bought it at a deep discount, which means you weren't actually price-scalped by Nvidia.
Nvidia's lock was so anti-consumer that even someone who had paid full price for an Nvidia card to use it for CUDA or PhysX, while also using a higher-end/newer AMD card as the main GPU, couldn't do it, because Nvidia forced the Nvidia card to be the primary one. So if, for example, I had a GeForce 9800 GT and later bought an HD 4870 to use as my primary 3D card, Nvidia punished me for not being loyal by disabling CUDA and PhysX. That was a very sh__y business practice from a company that back then had 60% of the market, less than a billion in income per quarter and less support from the public and press. Imagine how they operate in the background today, with 80%+ of the market, billions in income every quarter and total acceptance from the public and support from the tech press. And people expect AMD to offer super-competitive options and not lose money in the process for no results. When PhysX ran on the GPU, the CPU wasn't doing much. When it ran on the CPU, everything bad happened: the CPU maxed out and the FPS turned into a slideshow.
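For anyone curious whether a secondary NVIDIA card is visible for compute in a mixed AMD+NVIDIA system, here is a minimal sketch assuming PyTorch with CUDA support is installed; it only enumerates CUDA devices and says nothing about how the driver handles PhysX:

```python
# Check whether any NVIDIA card is exposed to the CUDA runtime, regardless of
# which GPU is set as primary for display. Assumes a CUDA-enabled torch build.
import torch

if torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        print(f"CUDA device {i}: {torch.cuda.get_device_name(i)}")
else:
    print("No CUDA-capable device visible to the runtime.")
```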
How does Nvidia even decide how much VRAM to put on their cards in order to meet games' hardware requirements? :rolleyes:
www.techpowerup.com/review/nvidia-geforce-rtx-4080-super-founders-edition/29.html
www.techspot.com/article/2670-vram-use-games/
2. Not everyone plays at 4K. Up to 1440p, 12 GB is still fine, in my opinion. How fine it will be in the near future, once the PS5 Pro is out and new games get developed for it, we'll see.
- Last of Us and Hogwarts are shitty console ports that make no effort to manage resources. That's not NVIDIA's fault, yet you're blaming NVIDIA. No logic.
- NVIDIA has never positioned the 4070 series or lower as 4K cards. Therefore, if you run 4K on anything lower than a 4080 (which has perfectly sufficient memory for that resolution because it's designed for it) and complain about the experience, that's on you. It's really simple: if you want to run games at 4K, buy the GPU designed for 4K. Can't believe I have to explain this, but here we are.
Don't waste your time explaining that; people whose only agenda is to parrot "NVIDIA doesn't have enough VRAM, REEEEEEEEE" don't care about facts. If you only want to play current games and swap out your GPU with every new generation, then go ahead, but I do think that having some reserve potential in your system for future games isn't a bad idea.
So, how much would you be willing to spend on that 12 GB card, and what would you use it for if you did? But that is a 4070 Ti, a tier higher. And the 2070 was a 2018 thing; it's 2024 now, and you'll be using this card for at least a few more years.
Who cares about the memory bus if the GPU is fast?
My last car was a 3.7L V6 Mustang.
Now I've got a Mercedes AMG CLA 2.0, and that thing is fast! And it's only a 2.0L four-cylinder engine.
So I don't care how they build a GPU if it's fast and runs all my games. But maybe those of us who bought a 3090 are so smart we can make money almost out of thin air, or lucky enough to have the money to spend on a 3090.
You can buy a Lambo or a Toyota,
but even Toyota owners can buy a 3090.
I just want to say the 3090 is not a smart buy, but many people can buy it without selling a kidney.
I wish I could buy better English writing talent, but I can't, so I buy the best GPU money can buy instead.