
NVIDIA GeForce RTX 4070 Ti Drops Down to $699, Matches Radeon RX 7900 XT Price

Hey, at least now people openly admit Nvidia basically just bribes everyone.
That's objectively not bribery, sorry kid. Facts don't care about your feelings.

How many do you think there are? How many people do you think care about Blender performance, for example?
Considering Gamers Nexus includes Blender as one of their tests? A lot more than you believe, again facts trump your feelings.
 
If you believe that's bribery you're objectively an idiot who doesn't understand how capitalism works.
Literally paying people to use your proprietary software in order to wall yourself off from competition is not bribery ?

By the way, I saw your comment before you edited it, just wanted to let you know I think the same about you as well.

Considering Gamers Nexus includes Blender as one of their tests? A lot more than you believe, again facts trump your feelings.
The facts are they use it for CPU reviews, not GPU. Another joke of an argument as well; most people do not even use GPU acceleration.
 
Literally paying people to use your proprietary software in order to wall yourself off from competition is not bribery ?
It's not proprietary software. It's adding enhancements to existing software, AKA a value add, that happen to be proprietary to specific hardware. The consumer benefits if they have that specific hardware; consumers that do not have it lose nothing.

Paying money to gain a competitive advantage is completely legal unless it disadvantages consumers. What NVIDIA is doing has the opposite effect; ergo, it is not bribery. I am not a lawyer, yet I have no trouble understanding this; why it's so difficult for you is beyond me.

By the way, I saw your comment before you edited it, just wanted to let you know I think the same about you as well.
If you educated yourself using the vast free resources of the internet instead of choosing to be ignorant, I wouldn't have to use such words.
 
That’s a pretty big “unless”. I love how a lot of people just downplay that Radeon are absolute cheeks in anything apart from gaming. If you do, for example, any amount of work in Blender you basically have no actual choice. NV is the only game in town. The fact that AMD still hasn’t even tried to compete with OptiX is pathetic, to be blunt. How they think that they can just keep their back turned while the competition reaps the rewards is baffling to me.
Just out of curiosity... do you just blindly assume that Radeon and Nvidia are competing on the same playing field? You understand that Radeon operates on a much, much, much smaller R&D budget than Nvidia does, right? That Nvidia has way more resources in basically every sense of the word: more money, more employees, better employees (because they can pay them more), and all those resources can be used as leverage to get better deals from suppliers, to pressure videogame developers to use their technology, etc.

Why do so many people just ASSUME that these two companies are competing on the same field with the same resources? I'm just getting really tired of armchair quarterbacks saying things like: "Radeon should just lower their prices by $150 across the board"

Almost every single one of these "suggestions" operates on the incorrect assumption that Radeon has all the resources and "options" that Nvidia does, and that Radeon's lack of success is solely due to making the wrong choices. For example, we are all aware that a publicly traded company has to maximize profit for shareholders or risk being sued by those shareholders, right? So releasing a new line of video cards with an MSRP $150 below Nvidia's competing option, which leaves a huge amount of profit margin on the table, probably wouldn't go over too well with shareholders, right? And we know shareholders and stock prices are basically the primary focus of publicly traded companies these days, right?

I'm not trying to defend AMD, and I'm not trying to tell anyone that they can't offer their criticism. What I am suggesting is that criticisms actually be anchored in, and acknowledge, reality and the limitations it presents.
 
@AnarchoPrimitiv
I am aware. I have never said anything about AMD lowering the prices on their cards. I merely noted that they are seriously lacking in non-gaming tasks. Is this true? Yes, objectively. I, as a customer, don’t care about the state of AMD as a company, their R&D budget or anything else. I care about what the final product offers me for my money and for my use cases. I have no idea why people always try to find reasons to excuse the woeful performance of AMD’s graphics division over the last decade. As much as they are winning in CPUs (versus another company that is bigger and richer than them, funny that), the GPU side of AMD is falling behind just as badly. This is just the reality of things. When and if they perform, then I will give them props. But rooting for the underdog just because they are one isn’t my style.
 
That’s a pretty big “unless”. I love how a lot of people just downplay that Radeon are absolute cheeks in anything apart from gaming. If you do, for example, any amount of work in Blender you basically have no actual choice. NV is the only game in town. The fact that AMD still hasn’t even tried to compete with OptiX is pathetic, to be blunt. How they think that they can just keep their back turned while the competition reaps the rewards is baffling to me.
Just out of curiosity... do you just assume that Radeon and Nvidia are competing on the same playing field? You understand that Radeon operates on a much, much, much smaller R&D budget than Nvidia does, right? That Nvidia has way more resources in basically every sense of the word: more money, more employees, better employees (because they can pay them more), and all those resources can be used as leverage to get better deals from suppliers, to pressure videogame developers to use their technology, etc.

Why do so many people just ASSUME that these two companies are competing on the same field with the same resources, and that the ONLY thing Radeon has to do is make the correct choices? I'm just getting really tired of armchair quarterbacks saying things like: "Radeon should just lower their prices by $150 across the board", as if it's that simple.

Almost every single one of these "suggestions" operates on the incorrect assumption that Radeon has all the resources and "options" that Nvidia does, and that Radeon's lack of success is solely due to making the wrong choices. For example, we are all aware that a publicly traded company has to maximize profit for shareholders or risk being sued by those shareholders, right? So releasing a new line of video cards with an MSRP $150 below Nvidia's competing option, which leaves a huge amount of profit margin on the table, probably wouldn't go over too well with shareholders, right? And we know shareholders and stock prices are basically the primary focus of publicly traded companies these days, right?

I'm not trying to defend AMD, and I'm not trying to tell anyone that they can't offer their criticism. What I am suggesting is that criticisms actually be anchored in, and acknowledge, reality and the limitations it presents. For example, Radeon group is probably NEVER going to "overtake" Nvidia in the marketplace while Radeon probably has an R&D budget less than half of Nvidia's (I haven't been able to find how AMD divides its overall R&D budget among its different divisions, but based on the fact that x86 has a larger T.A.M. and represents a much larger revenue stream, I think it's safe to assume x86 receives the majority of R&D funds). And they're never going to be able to develop something to counter CUDA while spending significantly less than Nvidia does on CUDA.
 
I still wouldn’t buy the AMD. Why would I? Just for a little more VRAM? Nah.
 
what I am suggesting is that criticisms actually be anchored in, and acknowledge, reality and the limitations it presents.
The reality is that consumers don't buy products based on the R&D budgets of the company producing them. They buy based on the value they perceive those products give them. If a specific company has a smaller R&D budget and its product is inferior as a result, consumers don't care - they simply don't buy that product.

AMD has chosen to play in the GPU market. They have chosen to go head-to-head with a much wealthier competitor that has historically utilised their capital more effectively, to deliver a more finished product with a better value add. If you choose to compete, and you don't do it well enough, and consumers don't buy your products, that is your fault and nobody else's.

But this is what AMD fanboys do all the time: they blame NVIDIA for being the better competitor instead of blaming AMD for being a poor one. This is the opposite of logic yet these same people will blindly claim, time after time, that they aren't actually AMD fanboys. It hurts my brain.

But rooting for the underdog just because they are one isn’t my style.
Rooting for the underdog when they're constantly shitting on themselves isn't my style.
 
It's not proprietary software.
CUDA and everything built on top of it that's part of their software stack is proprietary. You've got no clue what you are talking about.

I wouldn't have to use such words.
You do because quite literally everything you say is wrong, it's your only recourse.
 
Just out of curiosity... do you just blindly assume that Radeon and Nvidia are competing on the same playing field? You understand that Radeon operates on a much, much, much smaller R&D budget than Nvidia does, right? That Nvidia has way more resources in basically every sense of the word: more money, more employees, better employees (because they can pay them more), and all those resources can be used as leverage to get better deals from suppliers, to pressure videogame developers to use their technology, etc.

The R&D budget is relative. It depends on the particular location. You can't compare offices in India/China to offices in Luxembourg, Norway, the USA, the UK or Switzerland, because the latter are an order of magnitude more expensive to run.
You have to compare the efficiency of those budgets, not their absolute values.

Also, if AMD focuses on higher market share, that will dramatically improve the balance sheet: better ROI, and economies of scale lead to lower costs and lower prices on the street.

I still wouldn’t buy the AMD. Why would I? Just for a little more VRAM? Nah.

20 GB vs 12 GB is not so little: 66.67% more. The other question is whether that 20 GB card actually needs it, or whether its shaders and core will become obsolete long before games saturate those 20 GB.
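As a quick sanity check on that percentage (a trivial sketch; the only inputs from the thread are the two capacities):

```python
# 20 GB vs 12 GB: how much more VRAM is that, relatively?
larger_gb = 20
smaller_gb = 12

# Relative increase = (difference) / (smaller capacity)
extra_fraction = (larger_gb - smaller_gb) / smaller_gb
print(f"{extra_fraction:.2%} more VRAM")  # 66.67% more VRAM
```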
 
(attachment: screenshot of the TPU GPU-usage poll)

Do remember this poll, lads, it helps to know what the vast majority intend to use their GPU for.
 
Do remember this poll, lads, it helps to know what the vast majority intend to use their GPU for.
…the poll shows that 40% of respondents use their GPU for things other than gaming. That’s not the point you think it is. Now, of course, it could be argued that TPU members are not the “average” consumers or whatever, but still.
Most popular options are AI, rendering and encoding, by the way. Options in which NV cards are dominant.
 
Do remember this poll, lads, it helps to know what the vast majority intend to use their GPU for.
Vastly biased towards people more interested and likely to work in the tech sector given the nature of this site.

I use my card for pure compute not rendering, AI, crypto or folding, there is no option for me to even select in that poll. TPU doesn't even include any productivity benchmarks for their GPU reviews, most sites don't, that's pretty indicative of how many people care about that.
 
…the poll shows that 40% of respondents use their GPU for things other than gaming. That’s not the point you think it is. Now, of course, it could be argued that TPU members are not the “average” consumers or whatever, but still.
Most popular options are AI, rendering and encoding, by the way. Options in which NV cards are dominant.
I am not making a point for either side here, simply presenting information that I feel is being left out.
(You may argue I am taking a side, considering some posts earlier, but I assure you I am not)
 
This is why AMD can't start a price war. Nvidia has the capacity at TSMC, the profit margins, the market share, the support from consumers and the tech press, and the performance advantage to drop its prices instantly and still keep making billions in profits.

AMD has none of those. If the RX 7900 XT had been announced at an original price of $699, the RTX 4070 Ti would have been announced at $699 the day after the RX 7000 series announcement, not a month later.

People should understand that the ONLY ONE who dictates pricing is Nvidia. Attacking AMD is plain stupid.
 
You don't understand how this works. Neither Nvidia nor AMD cares about professional workflows for regular consumers; it's not relevant for that segment.

So they don't care about anyone? Who are you, so wise in the ways of business?

I use my card for pure compute not rendering, AI, crypto or folding, there is no option for me to even select in that poll.

Your perspective is as deep as a puddle in a drought.
 
So they don't care about anyone
What are you even talking about? All I pointed out is that regular consumers and professionals are different segments.

That's why Quadro and FirePro/Radeon Pro exist; they are clearly distinct markets in the eyes of these companies, mister "oH yOU SO wISe In tHe WAYS OF busInesS".
 
Literally paying people to use your proprietary software in order to wall yourself off from competition is not bribery ?

By the way, I saw your comment before you edited it, just wanted to let you know I think the same about you as well.


The facts are they use it for CPU reviews not GPU, another joke of an argument as well, most people actually do not even use GPU acceleration.
Mmmh. What I've heard from Ars Technica and a few devs is that OpenCL eventually became a pain in the ass to work with (while CUDA had better documentation, and getting help from Nvidia was easier), hence even Blender stopped using it in favor of HIP for AMD, oneAPI for Intel and Metal for Apple. At one point in time AMD was faster than Nvidia in Blender.

OpenCL


OpenCL rendering support was removed. The combination of the limited Cycles kernel implementation, driver bugs, and stalled OpenCL standard has made maintenance too difficult.

We are working with hardware vendors to bring back GPU rendering support using other APIs.

AMD is actually trying. One of the leading commercial GPU renderers supports AMD now, RT acceleration included; the performance is just not there yet. Heck, even Intel is trying, even though they still have many things to fix just on the gaming side. I find it odd that people wanting to learn 3D or any creative software as a hobby are not even taken into consideration :D. People spend a lot of money on camera gear and painting supplies even though they don't make money from them. There are a bunch of Blender tutorials that got several million views. One should also not underestimate the number of people who buy something "just in case". Someone just needs to think "I might be interested in learning Blender for fun" to choose an Nvidia GPU just in case, even if they never actually do it.

 
Do remember this poll, lads, it helps to know what the vast majority intend to use their GPU for.
About that poll.
The majority out there are not people who spend time on forums, meaning a poll of random consumers would show a different picture, where gaming would be much higher.
Encoding, rendering and AI are all very general options: someone who did ONE encode three months ago can choose that option, someone who did ONE render six months ago can choose that option, and someone who ran an AI application out of curiosity can choose that option in the poll.
On the other hand, folding and mining are very specific; that's why they get 3% in the poll.

A sample of random consumers and a change in the poll question to "What else except gaming do you do AT LEAST once a week?" would probably move gaming to over 70% and everything else to under 3%.
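The reweighting argument above can be sketched with made-up numbers (every share and weekly-use rate below is a hypothetical assumption for illustration, NOT the actual TPU poll data):

```python
# Hypothetical poll shares and assumed "at least once a week" rates.
# All numbers are invented for illustration only.
poll_share = {"gaming": 0.60, "encoding": 0.15, "rendering": 0.15, "ai": 0.10}
weekly_rate = {"gaming": 0.95, "encoding": 0.10, "rendering": 0.10, "ai": 0.05}

# Keep only respondents who would still qualify under the stricter wording,
# then renormalize to get the adjusted shares.
weighted = {k: poll_share[k] * weekly_rate[k] for k in poll_share}
total = sum(weighted.values())
adjusted = {k: v / total for k, v in weighted.items()}

for option, share in sorted(adjusted.items(), key=lambda kv: -kv[1]):
    print(f"{option}: {share:.1%}")
```

With these assumed rates, gaming climbs above 90% and every other option falls under 3%, which is the direction the post argues in: occasional-use options shrink drastically once a frequency filter is applied.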
 
What I've heard from Ars Technica and a few devs is that OpenCL eventually became a pain in the ass to work with
I have an academic background in these things. People used OpenCL a lot in the 2010-2015 era, but Nvidia refused to support anything beyond the basic 1.2 layer for almost a decade. There was nothing wrong with OpenCL; Nvidia killed it by owning the majority platform which supported OpenCL, but only an ancient version, thereby forcing people to use CUDA.

When I said Nvidia used CUDA to wall themselves off from competition I was not exaggerating; that was and still is their primary strategy. Ultimately CUDA/HIP/OpenCL/oneAPI/Metal are irrelevant, they're all the same: there is not much value in the software itself, but in how these companies use it to block competitors.

There is a reason why GPU makers refuse to converge on one unified ISA/shader/compute language: if that were to happen, all their leverage would vanish. All the software would run the same on anything, and they'd all be able to optimize everything from day one.
 
Haha, no.

It's a 12GB card for 1440p Ultra and 4K Ultra.

The price drops are never going to fix the issue this card has, which is a crippling lack of VRAM and bandwidth for the resolutions it's expected to run at. Not unless it starts competing with the 7700 XT on price....
 
Nvidia killed it by owning the majority platform which supported OpenCL, but only an ancient version, thereby forcing people to use CUDA.
Like what they did with PhysX, where they were offering software support to non-Nvidia systems that was hilariously slow, to force people to buy Nvidia GPUs. Hardware PhysX died in the end, but CUDA is a different beast.
Nvidia knows how to create the illusion of being open while driving consumers to its proprietary options.
 
So still not worth the asking price.
Nvidia is always worth it, even if it costs more than AMD.

Better RT and better game performance if we look at all the games. In early access and other non-core benchmark games there are far fewer problems using Nvidia GPUs, and much better performance.

I mean, 7900XT is still (judging by recent news) cheaper, and better (unless you do production), soooooo Idk man.

Still, back on topic, good that prices are dropping, maybe soon we might see a good value proposition from either manufacturer!
Nvidia has much better features: better RT + DLSS.
Also, DLSS 3 Frame Generation wins 6-0 against AMD.

AMD is more power hungry = more heat + higher power bills.
Nvidia has better driver support if we look at all the games.

Those are the reasons why Nvidia is top 1 in the dGPU market.
No AMD fan can change that fact; only AMD can change it, by making better GPUs.

Regular consumers do not care, that's a fact.

AMD has no incentive to care, and even if they did, like I already explained, this is a segment Nvidia is gatekeeping with CUDA and other proprietary software. Their efforts would be wasted for nothing; those companies would not bother to use whatever software AMD has.
AMD fails to compete and it's Nvidia's fault?
Let's just say it: Nvidia is top dog here, better than AMD.

Hey, at least now people openly admit Nvidia basically just bribes everyone.

How many do you think there are? How many people do you think care about Blender performance, for example?
It hurts because Nvidia is better and not AMD?
It's easy to skip facts and be butt hurt.

CUDA and everything built on top of it that's part of their software stack is proprietary. You've got no clue what you are talking about.


You do because quite literally everything you say is wrong, it's your only recourse.
It's good that we have at least one company that can release good GPUs.

Haha, no.

It's a 12GB card for 1440p Ultra and 4K Ultra.

The price drops are never going to fix the issue this card has, which is a crippling lack of VRAM and bandwidth for the resolutions it's expected to run at. Not unless it starts competing with the 7700 XT on price....
8 GB is fine at 1080p and even 1440p; it can even run many games at 4K without VRAM problems.

But many GPUs are already too slow when using max settings and higher resolutions, and not because they have a lower amount of VRAM.
 