Tuesday, November 15th 2022

AMD Confirms Radeon RX 7900 Series Clocks, Direct Competition with RTX 4080

AMD in its technical presentation confirmed the reference clock speeds of the Radeon RX 7900 XTX and RX 7900 XT RDNA3 graphics cards. The company also made its first reference to a GeForce RTX 40-series "Ada" product, the RTX 4080 (16 GB), which launches later today. The RX 7900 XTX maxes out the "Navi 31" silicon, featuring all 96 RDNA3 compute units, or 6,144 stream processors, while the RX 7900 XT is configured with 84 compute units, or 5,376 stream processors. The two cards also differ in memory configuration: the RX 7900 XTX gets 24 GB of 20 Gbps GDDR6 across a 384-bit memory interface (960 GB/s), while the RX 7900 XT gets 20 GB of 20 Gbps GDDR6 across a 320-bit interface (800 GB/s).

The RX 7900 XTX comes with a Game Clock of 2300 MHz and a Boost Clock of 2500 MHz, whereas the RX 7900 XT comes with a 2000 MHz Game Clock and a 2400 MHz Boost Clock; of the two figures, the Game Clock is the more representative of typical gaming behavior. AMD achieves 20 GB of memory on the RX 7900 XT by using ten 16 Gbit GDDR6 memory chips across a 320-bit wide memory bus, created by disabling one of the six 64-bit MCDs. This also subtracts 16 MB from the GPU's 96 MB of Infinity Cache, leaving the RX 7900 XT with 80 MB. The slide describing the specs of the two cards compares them to the GeForce RTX 4080, which is the card they will compete against most directly, especially given their pricing: the RX 7900 XTX is around 16% cheaper than the RTX 4080, and the RX 7900 XT around 25% cheaper.
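For readers who want to verify those figures, here is a minimal Python sketch of the arithmetic (illustrative only; it uses the per-MCD numbers given above, i.e. a 64-bit slice of the memory bus and 16 MB of Infinity Cache per MCD, with 20 Gbps GDDR6):

```python
# Back-of-the-envelope math for the Navi 31 memory subsystem, based only on the
# figures quoted in the article above (not an official AMD calculation).

GDDR6_SPEED_GBPS = 20    # per-pin data rate in Gbps
MCD_BUS_WIDTH_BITS = 64  # memory interface width contributed by each MCD
MCD_CACHE_MB = 16        # Infinity Cache carried by each MCD

def memory_config(active_mcds: int):
    """Return (bus width in bits, bandwidth in GB/s, Infinity Cache in MB)."""
    bus_width = active_mcds * MCD_BUS_WIDTH_BITS
    bandwidth = bus_width * GDDR6_SPEED_GBPS / 8  # bits per second -> bytes per second
    cache = active_mcds * MCD_CACHE_MB
    return bus_width, bandwidth, cache

# RX 7900 XTX: all six MCDs enabled; RX 7900 XT: one MCD disabled.
for name, mcds in (("RX 7900 XTX", 6), ("RX 7900 XT", 5)):
    width, bw, cache = memory_config(mcds)
    print(f"{name}: {width}-bit bus, {bw:.0f} GB/s, {cache} MB Infinity Cache")
# RX 7900 XTX: 384-bit bus, 960 GB/s, 96 MB Infinity Cache
# RX 7900 XT: 320-bit bus, 800 GB/s, 80 MB Infinity Cache
```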

166 Comments on AMD Confirms Radeon RX 7900 Series Clocks, Direct Competition with RTX 4080

#101
Valantar
bugThe piece that you're missing is that the measure of you providing useful goods and/or services is exactly "being profitable" and, by extension, "pleasing shareholders".
That simply isn't true. Yes, the two are related, but it is entirely possible to run a large-scale non-profit or not-for-profit business that provides eminently useful goods and/or services. Profit is a byproduct - it is by definition excess income. Of course all businesses need some form of buffer to cover investments, maintenance, R&D, etc., but all of that is still easily accounted for without chasing profits as the explicit main end goal of operations. And the crucial point you're missing here is that the core of my argument is the reversal we're seeing here: that the focus becomes the byproduct, rather than the core activity of the organization, shifting from "how can we make the best products/services for people (and how can we try to profit from that)?" to "how can we maximize profits (and what do we need to do in terms of products in order to do so)?" That reversal is a massive ideological shift, and one that has major implications for how corporations act against the societies they operate in, their customers, the environment, and so on. That of course isn't to say that unscrupulous corporations are by any means a new thing, far from it, but the degree of profiteering and the sheer ideology of profit above all else that is dominant today is still relatively new. Any argument against this must by default be an argument defending the rights of corporations to exploit their customers (and workers), as not doing so stands in the way of maximizing profits. And if that's what you're arguing, I would strongly suggest maybe taking a step back and thinking about what is important to you in this world.

Put it this way: how did Nvidia come to be as a company - was it Jensen sitting in his college dorm thinking "I want to be rich, I wonder how I can create profits?", or was it someone making a useful product - the precursor to a GPU - and then building a business out of producing these useful products?
#102
bug
ValantarThat simply isn't true. Yes, the two are related, but it is entirely possible to run a large-scale non-profit or not-for-profit business that provides eminently useful goods and/or services. Profit is a byproduct - it is by definition excess income. Of course all businesses need some form of buffer to cover investments, maintenance, R&D, etc., but all of that is still easily accounted for without chasing profits as the explicit main end goal of operations. And the crucial point you're missing here is that the core of my argument is the reversal we're seeing here: that the focus becomes the byproduct, rather than the core activity of the organization, shifting from "how can we make the best products/services for people (and how can we try to profit from that)?" to "how can we maximize profits (and what do we need to do in terms of products in order to do so)?" That reversal is a massive ideological shift, and one that has major implications for how corporations act against the societies they operate in, their customers, the environment, and so on. That of course isn't to say that unscrupulous corporations are by any means a new thing, far from it, but the degree of profiteering and the sheer ideology of profit above all else that is dominant today is still relatively new. Any argument against this must by default be an argument defending the rights of corporations to exploit their customers (and workers), as not doing so stands in the way of maximizing profits. And if that's what you're arguing, I would strongly suggest maybe taking a step back and thinking about what is important to you in this world.

Put it this way: how did Nvidia come to be as a company - was it Jensen sitting in his college dorm thinking "I want to be rich, I wonder how I can create profits?", or was it someone making a useful product - the precursor to a GPU - and then building a business out of producing these useful products?
You sound like you're being forced to buy video cards at gunpoint.

You don't like the price, you don't buy. Nobody can inflate prices past that. And video cards are not air, we can easily live without them.
#103
medi01
erockerPerformance looks to be really good for 4080 and 7xxx series but the prices are not. What was a $500-700 segment 3-4 years ago is now $1000-1200+. Well beyond the rate of inflation and a horrible value.
AMD is poised to roll out 1.5 times faster GPU for 10% less (6950XT was 1100)
NV rolled out 1.5 times faster GPU for 71% more. (4080 vs 3080)

(assuming MSRPs are true, which they likely aren't and things are even worse than that)

How you could refer to it as "they are both horrible value" is beyond me.
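As a rough sanity check of those ratios (illustrative only; it assumes launch MSRPs of $699 for the RTX 3080 and $1,199 for the RTX 4080 16 GB, which are not quoted in this thread, and takes the ~1.5x performance claims at face value):

```python
# Generational price and perf-per-dollar comparison from launch MSRPs.
# Assumed, not taken from this thread: RTX 3080 at $699, RTX 4080 16 GB at $1,199.
# From the thread: RX 6950 XT at $1,099, RX 7900 XTX at $999, ~1.5x uplift for both new cards.

def compare(old_price: float, new_price: float, perf_uplift: float = 1.5):
    price_change = (new_price / old_price - 1) * 100
    perf_per_dollar_change = (perf_uplift * old_price / new_price - 1) * 100
    return price_change, perf_per_dollar_change

for label, old, new in (
    ("RX 6950 XT -> RX 7900 XTX", 1099, 999),
    ("RTX 3080 -> RTX 4080 16 GB", 699, 1199),
):
    dp, dv = compare(old, new)
    print(f"{label}: {dp:+.1f}% price, {dv:+.1f}% performance per dollar")
# RX 6950 XT -> RX 7900 XTX: -9.1% price, +65.0% performance per dollar
# RTX 3080 -> RTX 4080 16 GB: +71.5% price, -12.6% performance per dollar
```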
#104
Valantar
bugYou sound like you're being forced to buy video cards at gunpoint.
No, I'm just saying that if you want to take part in possibly the most popular hobby on planet earth right now, you're subject to the exploitative tactics of the corporations controlling the market for the equipment necessary for this - and that these corporations have been turning ever more exploitative in recent years.
bugYou don't like the price, you don't buy. Nobody can inflate prices past that. And video cards are not air, we can easily live without them.
Yes, and we can all live without friends, and hobbies, and fun, and joy, and things that make us happy. None of this is necessary, right? I mean, come on. These aren't arguments, they're bad-faith dismissals trying to paint complex, nuanced problems as simple black-and-white distinctions. Please don't be that reductive.
medi01AMD is poised to roll out 1.5 times faster GPU for 10% less (6950XT was 1100)
NV rolled out 1.5 times faster GPU for 71% more. (4080 vs 3080)
I don't feel that it's reasonable to compare a new gen against a mid-gen refresh. AMD is rolling out ~1.55x performance for the same price as the RX 6900 XT. Yes, the new card is named one tier higher, but that's just AMD expanding their rather slim SKU selection (as demonstrated by there now being two x9xx SKUs at launch rather than one, and the 7900 XT being just $100 less).
medi01(assuming MSRPs are true, which they likely aren't and things are even worse than that)
AMD will be selling their reference models at MSRP through their own web store as usual, and will most likely be enforcing that MSRP for partners reselling that card (which has been common in previous generations), but we'll see how many take them up on that offer.
medi01How you could refer to it as "they are both horrible value" is beyond me.
You could say that in light of no $1000 GPU ever being anything but bad value (I mean, an RX 6600 trounces all of these high end cards in perf/$), but overall they're definitely not equals in this regard, no.
#105
medi01
ValantarI don't feel that it's reasonable to compare a new gen against a mid-gen refresh. AMD is rolling out ~1.55x performance for the same price as the RX 6900 XT. Yes, the new card is named one tier higher, but that's just AMD expanding their rather slim SKU selection (as demonstrated by there now being two x9xx SKUs at launch rather than one, and the 7900 XT being just $100 less).
I don't get the "mid gen refresh" point here. If anything, the beaten last-gen card being newer works even more in AMD's favor.

The fastest cards they were selling had an MSRP of $1099.
The new card, which seems to be 1.5+ times faster, has an MSRP of $999.

So top vs top. Roughly the same ballpark power consumption and not oversized either. Legit perf bump. Essentially a drop-in replacement. Instant buy in green world.

This looks good to me, even ignoring horrors going on the other side.
#106
bug
ValantarYes, and we can all live without friends, and hobbies, and fun, and joy, and things that make us happy. None of this is necessary, right? I mean, come on. These aren't arguments, they're bad-faith dismissals trying to paint complex, nuanced problems as simple black-and-white distinctions. Please don't be that reductive.
And there we go. I say video cards, you say friends, fun and joy. You're way too invested in this.
#107
Valantar
medi01I don't get the "mid gen refresh" point here. If anything, the beaten last-gen card being newer works even more in AMD's favor.

The fastest cards they were selling had an MSRP of $1099.
The new card, which seems to be 1.5+ times faster, has an MSRP of $999.

So top vs top. Roughly the same ballpark power consumption and not oversized either. Legit perf bump. Essentially a drop-in replacement. Instant buy in green world.

This looks good to me, even ignoring horrors going on the other side.
It's not an invalid comparison, no, but it's a slightly skewed one, as you're comparing an optimized mid-gen product vs. an unoptimized first launch product. There obviously isn't any guarantee that there'll be a future optimized mid-gen RX 79XX [whatever suffix] refresh, but it's still comparing two slightly different things. In terms of current market realities it's accurate (but at that point you IMO also need to take into account actual street pricing today), but in terms of how these products are positioned within a first-launch product lineup, it's inaccurate. Which of these comparisons you prefer is obviously a personal preference, but I hold strongly to the comparison that doesn't inherently push price expectations upwards. Starting from the 6950 XT presents this as a better deal than it actually is.
bugAnd there we go. I say video cards, you say friends, fun and joy. You're way too invested in this.
... that's what video cards are used for. Playing games. Having fun. Very, very, very often with friends. This isn't "being too invested", it's taking into account the actual use of these products. I mean, yes, there are collectors buying video cards just to own them and look at them or whatever, but for the most part, the point is to use them. And losing access to their use through unaffordability also then loses you access to the activities afforded by their use. Are these activities replaceable? Sure. Is that easy? Not at all. This stuff literally breaks up people's social lives and has lasting real-world effects on the things that they do in their day-to-day lives. This isn't being too invested, it's keeping perspective on what these things are actually for, and refusing to see them as abstract objects traded for abstract purposes.
#108
wolf
ValantarAnd that's precisely where the core of our disagreement lies: I don't accept "pleasing shareholders" or "being profitable" as the main operating principle for any business
This conversation has become about a lot more than Nvidia's shitty product segmentation; ima tap out here, I don't really want to discuss the economy, capitalism, politics, etc.
spnidelbro you take online discussions way too seriously lmao
In retrospect I'd have done one thing differently: the post of mine that you troll-quoted was in response to Mr All Caps, not to you specifically; I should have altered the order of what I quoted.

But reporting an obvious troll post? Yeah sorry not sorry. I take this community seriously.
#109
AusWolf
ratirtOh the answer is so obvious here. I DON'T CARE. Sorry I had to :D
And I wanna play Quake 2 RTX and Minecraft RTX even if they run at 5 FPS! Accurate lighting adds so much realism to building blocks half the size of your viewing angle! :rockout: Only joking. :D
ratirtHow about this. I remember not long ago companies advertised cards for 4K gaming. Well, this is gone now, but how about RT and 1080p gaming? Obviously the emphasis is on RT, which is so damn cool and makes your game a hundred times better, right? They are preparing for the 4K gaming advertising again, it will just have RT there. It all repeats itself with a minor change: RT in the mix.
I'm actually in between the two camps with RT. A good implementation can look cool if your PC can run it, but if it can't, meh, whatevs. :)

As for high-resolution, high refresh rate gaming, my opinion is a definite "no thanks" unless someone throws a random £10k at me.
AsRockBut why shoot themselves in the foot by doing so?
Brand image? I mean, when you buy the highest of the high end, and then an even higher end product follows half a year later, it can be seen as a scummy move. Besides, when a new line launches with only partially disabled GPUs on the table, you know something better is reserved for later. It might not bother some, but it does bother me.
#110
Valantar
AusWolfI'm actually in between the two camps with RT. A good implementation can look cool if your PC can run it, but if it can't, meh, whatevs. :)
Pretty much exactly this. As with all new, demanding graphics settings, really. It's a cool bonus if you can run it, if not then not.
AusWolfAs for high-resolution, high refresh rate gaming, my opinion is a definite "no thanks" unless someone throws a random £10k at me.
If "high resolution" is 2160p to you, I agree. On the other hand, 1440p high refresh rate gaming is shockingly accessible these days though, as long as you don't demand the highest possible settings and the best monitor ever. 1440p144 monitors can be had for as low as $200 (no, not good 1440p144 monitors); even a lowly RX 6600 manages well above 60fps average across the TPU test suite at 1440p (and that's at Ultra!), and CPUs great for gaming are more affordable than ever (looking at you, 12100F).

A big part of the current issues in the gaming market is that, I think for the first time ever, pretty much anything is good enough. We've actually moved beyond the point where you need to upgrade frequently to keep playing games - now it's that you need to upgrade frequently if you want to keep up with ever increasing resolutions (beyond a certain point, why?) at ever increasing refresh rates (beyond a certain point, again, why?). And chipmakers are recognizing this - the need they used to fulfill is saturated already, and they're struggling to invent new needs to drive sales. It's difficult to imagine that this won't lead to a major downturn for the entire industry over the next decade or so.
#111
nguyen
AusWolfBrand image? I mean, when you buy the highest of the high end, and then an even higher end product follows half a year later, it can be seen as a scummy move. Besides, when a new line launches with only partially disabled GPUs on the table, you know something better is reserved for later. It might not bother some, but it does bother me.
7900XTX comes out with gimped clocks (AMD themselves said the clocks were supposed to be 3ghz+), and 6 months later AMD will release a revised silicon that boosts to 3Ghz (just like they did with the 6900XTXH and 6950XT), so what does that make the current, soon-to-be-released 7900XTX? A placeholder?

Anyways, I don't think you will buy either the 7900XTX or the 4080; I don't care about these 2 cards either, as I have a 4090. Just giving my opinion about your lopsided perspective.
#112
Valantar
nguyen(AMD themselves said the clocks were supposed to be 3ghz+),
Did they? I seem to have missed that, but I see it thrown around everywhere. Got a source?
#113
nguyen
ValantarDid they? I seem to have missed that, but I see it thrown around everywhere. Got a source?
Navi31 Block Diagram
#114
AusWolf
bugYou sound like you're being forced to buy video cards at gunpoint.

You don't like the price, you don't buy. Nobody can inflate prices past that. And video cards are not air, we can easily live without them.
That's beside the point. The point here is that a company can focus on making the best products possible while still earning profit (to please customers), or on making the cheapest products possible for maximum profit (to please shareholders). Every new company has to start from the first standpoint to gain the interest of customers, but once that's done, they have a choice to continue focusing on the product, or to change direction towards maximising profits and keeping the momentum of the hype train going to achieve that. Whether you want to admit it or not, these changes have an effect on the company's image.
ValantarIf "high resolution" is 2160p to you, I agree. On the other hand, 1440p high refresh rate gaming is shockingly accessible these days though, as long as you don't demand the highest possible settings and the best monitor ever. 1440p144 monitors can be had for as low as $200 (no, not good 1440p144 monitors); even a lowly RX 6600 manages well above 60fps average across the TPU test suite at 1440p (and that's at Ultra!), and CPUs great for gaming are more affordable than ever (looking at you, 12100F).
Personally, if I was loaded with cash, I would only go for 2160p and skip 1440p altogether. When you're watching a movie, or playing an old game that doesn't support odd resolutions, any non-integer upscaling makes the image look blurry and potentially distorted. That is, 360p->720p->1080p->2160p upscaling looks good, but anything in between doesn't necessarily. That's why I'm staying with 1080p.
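To put numbers on that point, here is a small illustrative sketch (nothing assumed beyond the resolutions mentioned) of the per-axis scale factors when lower-resolution content is shown on 1440p and 2160p panels; whole-number factors map each source pixel to a clean block, while fractional ones need interpolation and blur:

```python
# Per-axis upscaling factors for common source resolutions on 1440p and 2160p panels.
# Integer factors scale each source pixel to a clean NxN block; fractional factors
# require interpolation, which causes the blur described above.

sources = [360, 720, 1080, 1440]  # source vertical resolutions
panels = [1440, 2160]             # target panel vertical resolutions

for panel in panels:
    for src in sources:
        if src >= panel:
            continue
        factor = panel / src
        kind = "integer (sharp)" if factor.is_integer() else "fractional (blurry)"
        print(f"{src}p -> {panel}p: x{factor:.2f}, {kind}")
# Notably, 1080p -> 1440p is x1.33 (fractional), while 1080p -> 2160p is x2.00 (integer),
# which is why a 2160p panel handles 1080p content more gracefully than a 1440p one.
```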
ValantarA big part of the current issues in the gaming market is that, I think for the first time ever, pretty much anything is good enough. We've actually moved beyond the point where you need to upgrade frequently to keep playing games - now it's that you need to upgrade frequently if you want to keep up with ever increasing resolutions (beyond a certain point, why?) at ever increasing refresh rates (beyond a certain point, again, why?). And chipmakers are recognizing this - the need they used to fulfill is saturated already, and they're struggling to invent new needs to drive sales. It's difficult to imagine that this won't lead to a major downturn for the entire industry over the next decade or so.
Well said! Actually, we're already feeling this first-hand. If it was still necessary to upgrade every generation, like it was in the '90s, I'm sure hardware prices would be a lot more reasonable. We don't spend £200 on every generation anymore - we spend £1,000 on every third or fourth generation (except for the few "enthusiasts" who always want the best of the best even if they don't actually need it).
#115
Valantar
nguyenNavi31 Block Diagram
Idk, that's a bit vague. The slide is titled RDNA3, not Navi 31 - that's just the only announced RDNA3 die for now. Also, is that an officially published slide? The blur and watermark makes it look like a leaked internal/non-public one. So, depending on the context and the specific wording, this can be read as "AMD says Navi 31 will hit 3GHz", or it can be read as "AMD told internal partners that the architecture of RDNA3 is tuned to clock past 3GHz, but will obviously vary with the configuration of each SKU". That's a pretty big difference. Makes sense that I missed it though, as I generally tend to ignore leaks like this. Non-public statements are not promises of real-world product performance.
#116
AusWolf
nguyen7900XTX comes out with gimped clocks (AMD themselves said the clocks were supposed to be 3ghz+), and 6 months later AMD will release a revised silicon that boosts to 3Ghz (just like they did with the 6900XTXH and 6950XT), so what does that make the current, soon-to-be-released 7900XTX? A placeholder?
If the binned chip can achieve higher clocks, then the non-binned chip can also potentially reach those clocks with a bit more voltage. Anyway, clock speed differences aren't even noticeable, e.g. 6600 XT vs. 6650 XT, 6700 XT vs. 6750 XT or 6900 XT vs 6950 XT. If you bought the non-50 version, you still have the full product with slightly lower clocks which you can change if you want, but you can't add more cores to a partially disabled GPU like you could in the past.
nguyenAnyways, I don't think you will buy either the 7900XTX or the 4080; I don't care about these 2 cards either, as I have a 4090. Just giving my opinion about your lopsided perspective.
You're right, I'm not gonna buy any of those (they're way out of my price and performance requirement range), but that doesn't mean that I can't have an opinion on certain business practices.
#117
ratirt
AusWolfI'm actually in between the two camps with RT. A good implementation can look cool if your PC can run it, but if it can't, meh, whatevs. :)

As for high-resolution, high refresh rate gaming, my opinion is a definite "no thanks" unless someone throws a random £10k at me.
I always say it and will say it again: RT is great and looks nice, and I'm sure that at some point it will be the future of gaming, but not today. For the price companies ask, the performance dips on literally any hardware you throw at it, which is barely capable of handling it, and at low resolution - so I say no thanks. When we get there with RT, I will gladly clap my hands twice and get a card that supports it to its full extent. Right now, RT is not a game changer, since you barely see any difference. The only thing RT actually changes in games is a performance hit so bad it's not worth it. And in order to play an RT game you need to bend over backwards to get a card for around $2k.
#118
nguyen
AusWolfIf the binned chip can achieve higher clocks, then the non-binned chip can also potentially reach those clocks with a bit more voltage. Anyway, clock speed differences aren't even noticeable, e.g. 6600 XT vs. 6650 XT, 6700 XT vs. 6750 XT or 6900 XT vs 6950 XT. If you bought the non-50 version, you still have the full product with slightly lower clocks which you can change if you want, but you can't add more cores to a partially disabled GPU like you could in the past.
Yeah sure like the 2% shader difference between 3090Ti and 3090 makes a difference LMAO, I bet my old 3090 can outperform most 3090Ti OCed vs OCed because I had a good 3090 sample.
AusWolfYou're right, I'm not gonna buy any of those (they're way out of my price and performance requirement range), but that doesn't mean that I can't have an opinion on certain business practices.
You are giving your opinion from the wrong perspective, enthusiasts who can actually afford this type of product don't care about what you are on about LOL
#119
bug
AusWolfThat's beside the point. The point here is that a company can focus on making the best products possible while still earning profit (to please customers), or on making the cheapest products possible for maximum profit (to please shareholders). Every new company has to start from the first standpoint to gain the interest of customers, but once that's done, they have a choice to continue focusing on the product, or to change direction towards maximising profits and keeping the momentum of the hype train going to achieve that. Whether you want to admit it or not, these changes have an effect on the company's image.
That is woefully shortsighted. If you operate for minimum profit, you have no cushion for a crisis situation or when the competition simply has a better product. This is why AMD had to rely on ATI to survive their Bulldozer years: they sold their CPUs for cheap (yes, Intel foul play and everything; I know, but that's another story) and had no cushion or other means to fund development of a new architecture.

Basically, when the product you sell is scarce, you will have what is essentially a Dutch auction. That's what sets the right price.

You can argue Nvidia is creating scarcity themselves, but if that were the case, nothing stops AMD from flooding the market with their own, cheap GPUs.

I'm not saying a company can't have a policy of balancing between price and profit. They can and they do. I'm just saying that policy is just one piece of a puzzle and, in and of itself, doesn't influence the market as much as some would think.
#120
AusWolf
nguyenYeah sure like the 2% shader difference between 3090Ti and 3090 makes a difference LMAO, I bet my old 3090 can outperform most 3090Ti OCed vs OCed because I had a good 3090 sample.
If the 2% core number difference doesn't matter to you (although an overclocked 3090 Ti is most probably faster than your overclocked 3090), then why does 2% clock speed difference matter so much? This is what I don't get. If Nvidia shaves a few cores off to sell the good chips at a higher price half a year after launch, it's fine, but if AMD reserves binned chips to sell them with a 2% overclock, it's the end of the world?
#121
Valantar
nguyenYeah sure like the 2% shader difference between 3090Ti and 3090 makes a difference LMAO, I bet my old 3090 can outperform most 3090Ti because I had a good 3090 sample.



You are giving your opinion from the wrong perspective
I think there's a wrinkle that needs adding to this that changes it somewhat: launching a mid-gen refresh after a while that increases clocks and/or is based on a new, better bin? That's obviously fine. Launching a flagship SKU, pitched as "the best of the best", with the plan of launching a product to supersede it shortly thereafter, say within half a year or so? That's rather underhanded. Why are these different? Because of timing and messaging. I don't care whether the flagship SKU is fully enabled or not, nor do I care about absolute clocks as long as performance is there. But I do care about bait-and-switch marketing tactics.
bugThat is woefully shortsighted. If you operate for minimum profit, you have no cushion for a crisis situation or when the competition simply has a better product. This is why AMD had to rely on ATI to survive their Bulldozer years: they sold their CPUs for cheap (yes, Intel foul play and everything; I know, but that's another story) and had no cushion or other means to fund development of a new architecture.

Basically, when the product you sell is scarce, you will have what is essentially a Dutch auction. That's what sets the right price.

You can argue Nvidia is creating scarcity themselves, but if that were the case, nothing stops AMD from flooding the market with their own, cheap GPUs.

I'm not saying a company can't have a policy of balancing between price and profit. They can and they do. I'm just saying that policy is just one piece of a puzzle and, in and of itself, doesn't influence the market as much as some would think.
Yes, companies need buffers to keep running (though for large businesses those buffers are typically loans based on projected revenues rather than cash on hand). But it feels like you're taking that point and running too far with it, ignoring surrounding factors. We know Nvidia is limiting RTX 3000 supplies to keep prices high. Why can't AMD flood the market to counteract that? Because they literally don't have the production capacity to match Nvidia's ~4x sales advantage, and Nvidia has a massive mindshare advantage, where for most buyers GPU=Nvidia, period. So yes, there are definitely things stopping AMD from flooding the market with their own cheap GPUs. Of course, another factor stopping AMD from doing this is that they themselves are working under the same ideology, wanting to maximize profits above all else, and as with most tech companies these days their focus is on ASPs, not volume. Why? 'Cause it's always cheaper (and thus more profitable) to sell fewer products more expensively than more products cheaply, just because of the realities of manufacturing and distribution.

Another factor you're either ignoring or not considering: ideologies propagate themselves. Why are AMD and Nvidia both operating on roughly the same principles? Because over the past few decades that has slowly but surely become the dominant mode of doing business. That's the thing here: none of these companies are direct origin points of this thinking, but that still doesn't remove their responsibility for their policies and actions. They can and do resist shareholder pressure - but selectively. Nobody is saying it would be simple for them to change direction, as at this point society is literally geared against them, with the possibility of shareholder lawsuits and so on. But they could still push back in many ways. Instead, we're seeing AMD slowly embrace this thinking, while Nvidia has been at the forefront of embracing it for years already. It's difficult to directly blame someone for being pulled along by an ideological current, but that is not what Nvidia is doing - they're actively paddling ahead of the pack.
#122
AusWolf
bugThat is woefully shortsighted. If you operate for minimum profit, you have no cushion for a crisis situation or when the competition simply has a better product. This is why AMD had to rely on ATI to survive their Bulldozer years: they sold their CPUs for cheap (yes, Intel foul play and everything; I know, but that's another story) and had no cushion or other means to fund development of a new architecture.

Basically, when the product you sell is scarce, you will have what is essentially a Dutch auction. That's what sets the right price.

You can argue Nvidia is creating scarcity themselves, but if that were the case, nothing stops AMD from flooding the market with their own, cheap GPUs.

I'm not saying a company can't have a policy of balancing between price and profit. They can and they do. I'm just saying that policy is just one piece of a puzzle and, in and of itself, doesn't influence the market as much as some would think.
I see what you mean, but you're talking about the market - I'm talking about the company's image. Sure, you can argue that it doesn't matter when it comes to you buying the product, but it does matter to some people. I prefer buying stuff from companies that aren't openly and intentionally trying to shove their crap down my throat with useless hype, and companies that are more transparent towards the customer about their products and business practices. The final product may not be very different, but there is a difference in price, value and availability.

Don't get me wrong, I'm not against Nvidia. Their technology is awesome. I'm only against the practice of trying to sell the worst product possible for the highest profit. It's understandable from a company leadership and shareholder point of view, but I'm a customer, and I'm not here to please rich folks with my choices.
#123
Valantar
AusWolfI see what you mean, but you're talking about the market - I'm talking about the company's image. Sure, you can argue that it doesn't matter when it comes to you buying the product, but it does matter to some people. I prefer buying stuff from companies that aren't openly and intentionally trying to shove their crap down my throat with useless hype, and companies that are more transparent towards the customer about their products and business practices. The final product may not be very different, but there is a difference in price, value and availability.

Don't get me wrong, I'm not against Nvidia. Their technology is awesome. I'm only against the practice of trying to sell the worst product possible for the highest profit. It's understandable from a company leadership and shareholder point of view, but I'm a customer, I'm not here to please them with my choices.
It's not just about image though, it's also about the actual actions taken in the actual world by these companies, and the outcomes of these actions. Which are real, tangible things, and affect far more people than just the ones interested enough to actually know anything at all about a company's image or reputation. Criticizing companies for acting like profiteering asshats isn't just because it's bad that they look like profiteering asshats, but because the things that make them profiteering asshats have actual harmful consequences in the real world. And for all the "just choose not to buy" hand-waving here, it can't be ignored that there's a direct, causal link between the massively increased economic precarity in the world today and the massively increased corporate profits over the past couple of years. Why? Because corporations are raising prices to raise profits, which, on a large scale, forces people to spend more in total. When wages don't also increase to match, you then have increased poverty instead. And yes, GPUs are clearly not a basic necessity for life, but that doesn't give them carte blanche to do whatever they want in terms of exploiting their customers. Not at all.
#124
nguyen
AusWolfIf the 2% core number difference doesn't matter to you (although an overclocked 3090 Ti is most probably faster than your overclocked 3090), then why does 2% clock speed difference matter so much? This is what I don't get. If Nvidia shaves a few cores off to sell the good chips at a higher price half a year after launch, it's fine, but if AMD reserves binned chips to sell them with a 2% overclock, it's the end of the world?
Nah my good binned 3090 can outperform most 3090Ti overclocked vs overclocked.

You seem to make up facts, 3090Ti was released 1.5 years after 3090. Meanwhile AMD released 3 revisions of the 6900XT (6900XT LC with 18Gbps VRAM, 6900XTXH and then 6950XT) a few months apart.
#125
bug
AusWolfI see what you mean, but you're talking about the market - I'm talking about the company's image. Sure, you can argue that it doesn't matter when it comes to you buying the product, but it does matter to some people. I prefer buying stuff from companies that aren't openly and intentionally trying to shove their crap down my throat with useless hype, and companies that are more transparent towards the customer about their products and business practices. The final product may not be very different, but there is a difference in price, value and availability.

Don't get me wrong, I'm not against Nvidia. Their technology is awesome. I'm only against the practice of trying to sell the worst product possible for the highest profit. It's understandable from a company leadership and shareholder point of view, but I'm a customer, and I'm not here to please rich folks with my choices.
Can't say I'm crazy about buying overpriced stuff myself.

But discussing a company's image is an exercise in futility, imho. Everyone has their own image of what a company is or isn't and that image is almost always subjective.

My "message" to Nvidia (and AMD) is simply me holding on to my GTX 1060. They don't care, I go about my business as well.