Thursday, November 17th 2022

NVIDIA Plans GeForce RTX 4060 Launch for Summer 2023, Performance Rivaling RTX 3070

NVIDIA is reportedly planning to ramp its GeForce "Ada" generation into the high-volume performance segment by Summer 2023 with the introduction of the GeForce RTX 4060. The card is expected to launch around June 2023 and will be based on the 4 nm "AD106" silicon, the fourth chip built on the "Ada Lovelace" graphics architecture. Wolstame, a reliable source of NVIDIA leaks who works as Lenovo's Legion gaming desktop product manager, predicts that the RTX 4060 could end up matching the performance of the current RTX 3070 at a lower price-point.

This should make it a reasonably fast graphics card for 1440p AAA gaming at high-to-ultra settings, with ray tracing thrown in. What's interesting is that NVIDIA is expected to extend the DLSS 3 frame-generation feature even to this segment of graphics cards, which means a near-100% frame rate uplift can be had. Other predictions include a board power in the range of 150-180 W and a 10% generational price increase, which would put the RTX 4060's launch price close to that of the RTX 3060 Ti (USD $399).
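For a rough sense of where that near-doubling comes from: DLSS 3 frame generation inserts one AI-generated frame between every pair of conventionally rendered frames, so the displayed frame rate approaches twice the rendered frame rate. The sketch below (Python, with made-up frame rates and an assumed overhead factor) only illustrates that arithmetic; it is not a benchmark.

```python
# Rough sketch: DLSS 3 frame generation inserts one AI-generated frame between
# every two rendered frames, so displayed FPS approaches 2x the rendered FPS.
# The 0.9 overhead factor is an illustrative assumption, not a measured figure.

def estimated_displayed_fps(rendered_fps: float, overhead: float = 0.9) -> float:
    """Very rough estimate of displayed FPS with frame generation enabled."""
    return rendered_fps * 2 * overhead

for rendered in (45, 60, 90):  # hypothetical rendered frame rates
    print(f"{rendered} rendered fps -> ~{estimated_displayed_fps(rendered):.0f} displayed fps")
```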
Sources: harukaze5719 (Twitter), VideoCardz

165 Comments on NVIDIA Plans GeForce RTX 4060 Launch for Summer 2023, Performance Rivaling RTX 3070

#126
ARF
efikkanIf we want lower prices, then AMD needs to show serious competition, and not just on paper, they need to have high availability of better priced cards globally.

And if RTX 4080 is a "pricing disaster", is RX 6950 XT that too? And for whichever you choose, are you basing this conclusion on MSRP, the pricing in TPU's reviews or your local pricing?
Because this may lead to very different conclusions. RX 6950 XT shows up at a greater value in TPU's reviews because it's based on Newegg's pricing at the time of writing, and right now Newegg have a couple of RX 6950 XTs way below MSRP. So is this price representative for the global market?
In my area pricing for RX 6950 XT is all over the place, and vary a lot day by day, very few in stock at or near MSRP. Many are priced comparatively to RTX 4080.
When I compare products I base my conclusions on US MSRP, which is not perfect, but is probably still more representative for a relative comparison than specific shops. (at least now that shops have stocks again)
Well, the RX 6950 XT is poor value.
Prices in Germany in euro:

Radeon RX 6800 - 499.00
Radeon RX 6800 XT - 648.00
Radeon RX 6900 XT - 699.00
Radeon RX 6950 XT - 880.00

In this group the RX 6800 is the sweet deal if you don't care about the relatively painful performance drop.
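To put rough numbers on that, here is a small illustrative sketch (Python) using the prices quoted above; the relative-performance figures are assumptions chosen only for illustration (RX 6800 = 100), not values taken from this post or from any specific review.

```python
# Price per unit of relative performance, using the German prices listed above.
# The performance numbers are rough illustrative assumptions (RX 6800 = 100).
cards = {
    "RX 6800":    (499.00, 100),
    "RX 6800 XT": (648.00, 115),
    "RX 6900 XT": (699.00, 123),
    "RX 6950 XT": (880.00, 130),
}

for name, (price_eur, perf) in cards.items():
    print(f"{name}: {price_eur / perf:.2f} EUR per relative-performance point")
```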

Posted on Reply
#127
AusWolf
efikkanIf we want lower prices, then AMD needs to show serious competition, and not just on paper, they need to have high availability of better priced cards globally.

And if RTX 4080 is a "pricing disaster", is RX 6950 XT that too? And for whichever you choose, are you basing this conclusion on MSRP, the pricing in TPU's reviews or your local pricing?
Because this may lead to very different conclusions. RX 6950 XT shows up at a greater value in TPU's reviews because it's based on Newegg's pricing at the time of writing, and right now Newegg have a couple of RX 6950 XTs way below MSRP. So is this price representative for the global market?
In my area pricing for RX 6950 XT is all over the place, and vary a lot day by day, very few in stock at or near MSRP. Many are priced comparatively to RTX 4080.
When I compare products I base my conclusions on US MSRP, which is not perfect, but is probably still more representative for a relative comparison than specific shops. (at least now that shops have stocks again)
I don't know where you live, but here in the UK, AMD offers better prices than Nvidia all across their product range. The 3060 is about £50-100 more expensive than the 6600, the 3070 is £100-150 more expensive than the 6700 XT, the 3080 is also £150 more expensive than the 6800 XT, and the 6900 XT is £50 cheaper than the 12 GB 3080. What more do you want?
Posted on Reply
#128
ARF
AusWolfI don't know where you live, but here in the UK, AMD offers better prices than Nvidia all across their product range. The 3060 is about £50-100 more expensive than the 6600, the 3070 is £100-150 more expensive than the 6700 XT, the 3080 is also £150 more expensive than the 6800 XT, and the 6900 XT is £50 cheaper than the 12 GB 3080. What more do you want?
Lower prices from AMD. The 6800 XT is still too expensive compared to the historical trend of prices dropping as a card ages; it still sits around its original MSRP from two years ago.
AMD sells Navi 21 from 499 to 880 euro depending on the binning, so there is plenty of room for price reductions on the 6800 XT, 6900 XT and 6950 XT.
Posted on Reply
#129
Dr. Dro
efikkanWhat unmitigated disaster?
RTX 4080 and RTX 4090 have proven to be great performers; the RTX 4080 performed better than "leaks" were projecting. The pricing can change if AMD offers some serious competition.

As for the rest of the product lineup, we actually don't know yet. But let's use any opportunity to bash Nvidia prematurely anyway!
I am not bashing prematurely. The RTX 4090 costs about 60% more than what it should, the RTX 4080 is irrationally priced considering the 7900 XTX is likely going to whoop it - and they still wanted to charge $900 for an even slower version of it based on a midrange chip(!), and no, NVIDIA doesn't do price wars anymore. They'd rather limit volume and spam SKUs than lower prices, and the Ada lineup has PLENTY of space for SKU spam, above and below both the 4080 and 4090, including room for a faster AD103-based card and a SIGNIFICANTLY faster AD102-based card above the RTX 4090 as well.

Like I said - I'm not an NVIDIA stakeholder or Jensen's dad, I want a competitively priced, yet high-quality product. NVIDIA cannot deliver that with this generation. They may be high-quality, but the price is an absurdity and the black-box ecosystem makes it even worse. I'll pass, and this is from an RTX 3090 owner.

I owned a Radeon VII back in the day. Lovely card, but Vega 20 was never a gaming GPU. It's little wonder that specific processor became the bedrock for the CDNA architecture. It also did not cost thousands of dollars, which greatly excuses it.
Posted on Reply
#130
AusWolf
ARFLower prices from AMD. The 6800 XT is still too expensive compared to the historical trend of prices dropping as a card ages; it still sits around its original MSRP from two years ago.
AMD sells Navi 21 from 499 to 880 euro depending on the binning, so there is plenty of room for price reductions on the 6800 XT, 6900 XT and 6950 XT.
They're still cheaper than the equal offerings from Nvidia.
Posted on Reply
#131
efikkan
Dr. DroI am not bashing prematurely. The RTX 4090 costs about 60% more than what it should, the RTX 4080 is irrationally priced considering the 7900 XTX is likely going to whoop it
I too think the pricing is way off. ~1000$ for RTX 4090 and ~$800 for RTX 4080 is the maximum I think makes sense, and that is even factoring the insane levels of inflation we have right now.
But as I've been saying, the solution is more competition, real competition. AMD needs to ship enough volumes to make a dent in Nvidia's sales across the markets, then the prices will drop rapidly.
Dr. DroNVIDIA doesn't do price wars anymore. They'd rather limit volume and spam SKUs than lower prices, and the Ada lineup has PLENTY of space for SKU spam…
This is just nonsense.
Nvidia isn't limiting volume.
Dr. Droand they still wanted to charge $900 for an even slower version of it based on a midrange chip(!)…
And this is where you venture into nonsense territory.
The chip segmentation is pretty arbitrary and varies from generation to generation, so the term "midrange chip" makes no sense. Even the naming of the chips just refers to the order in which they were designed; it says nothing about whether the chip will end up in a mid-range product or not. So if AD103 is performing like a high-end chip, then it's a high-end chip.

Keep in mind that back in the Kepler era, GK104 was used for the GTX 680 (because GK100 was faulty). In the Maxwell generation, GM204 was used for the GTX 980, the original top model of the lineup. The same goes for Pascal: the GTX 1080 (GP104) was the top model for almost a year until the GTX 1080 Ti arrived.
Posted on Reply
#132
Dr. Dro
efikkanI too think the pricing is way off. ~1000$ for RTX 4090 and ~$800 for RTX 4080 is the maximum I think makes sense, and that is even factoring the insane levels of inflation we have right now.
But as I've been saying, the solution is more competition, real competition. AMD needs to ship enough volumes to make a dent in Nvidia's sales across the markets, then the prices will drop rapidly.

This is just nonsense.
Nvidia isn't limiting volume.

And this is where you venture into nonsense territory.
The chip segmentation is pretty arbitrary and varies from generation to generation, so the term "midrange chip" makes no sense. Even the naming of the chips just refers to the order in which they were designed; it says nothing about whether the chip will end up in a mid-range product or not. So if AD103 is performing like a high-end chip, then it's a high-end chip.

Keep in mind that back in the Kepler era, GK104 was used for the GTX 680 (because GK100 was faulty). In the Maxwell generation, GM204 was used for the GTX 980, the original top model of the lineup. The same goes for Pascal: the GTX 1080 (GP104) was the top model for almost a year until the GTX 1080 Ti arrived.
They aren't now (beyond holding back the lower SKUs in the stack to move the same Ampere stock they call ancient GPUs unworthy of DLSS 3, but as a certified poor, I digress), but they would rather do that than engage in a price war, especially considering how much room for SKUs they have.

On the last bit, nonsense? I wasn't the one who announced a card and then unlaunched it; it was NVIDIA. No matter where you put it, even in these generations of old (the GTX 980 wasn't the top model, there were two GPUs above it, the 980 Ti and the Titan X), the xx104-class cards have always been the middle-of-the-pack ones. Even with Kepler, the GTX 600 series was quickly complemented by the 700 series, which introduced the GK110 that was sizably faster than the GK104, similarly to how the GTX 500 series (and the 580) launched only 8 months after the GTX 480 and fixed the 400 series' terrible thermals; it was wise of NVIDIA at the time not to repeat the GF100.

Anyway, the ill-fated 4080-12GB (full AD104) was no different; relative to the full AD102 it has something like 40% of the shader count, and NVIDIA quickly realized that it wasn't going to stick. If they had gone through with it, the 4080-12GB would have been laughed out of every publication, which would have hyped the 7900 XT instead. The 103 segment is new, and in Ampere it was only used in the RTX 3060 Ti in a very cut-down configuration, or in the mobile RTX 3080 Ti. Similarly to the AD103 compared to the AD102, the GA103 was a smaller and less powerful processor than the GA102. You could call it high-end, but it was never intended to be in the leading pack either. It'd work... if AMD hadn't crashed their party in that segment with the 7900 XT, which gives you a Navi 31 chip with an MCD and some shaders disabled, and should perform more than competitively with the RTX 4080-16GB.
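As a quick sanity check on that figure, here is a tiny sketch (Python) using the publicly listed CUDA core counts of the full AD102 and AD104 dies; shipping SKUs have some units disabled, so this is only a rough die-to-die comparison.

```python
# Rough check of the "~40% of the shader count" claim, using the published
# CUDA core counts of the full dies (actual products ship with units disabled).
full_ad102_cores = 18432  # full AD102 die
full_ad104_cores = 7680   # full AD104 die (the unlaunched RTX 4080 12GB)

ratio = full_ad104_cores / full_ad102_cores
print(f"Full AD104 has about {ratio:.0%} of the full AD102's shader count")  # ~42%
```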
Posted on Reply
#133
ARF
AusWolfThey're still cheaper than the equal offerings from Nvidia.
First we have to define what "equal" actually means. Maybe for some people RTX 3060 is "equal" with RX 6800 XT.
We know there is an extremely high performance difference but they can argue that the RTX 3060 is much cheaper cause, you know...

Look, there is 80-20% market share difference against AMD.
AMD has always been the cheaper, value option and despite this, it is not enough to improve the market situation of the company.
So, AMD needs to step up with something completely different, plus the discounts, of course.
Posted on Reply
#134
N3M3515
ARFFirst we have to define what "equal" actually means. Maybe for some people RTX 3060 is "equal" with RX 6800 XT.
We know there is an extremely high performance difference but they can argue that the RTX 3060 is much cheaper cause, you know...

Look, there is 80-20% market share difference against AMD.
AMD has always been the cheaper, value option and despite this, it is not enough to improve the market situation of the company.
So, AMD needs to step up with something completely different, plus the discounts, of course.
You need AMD to be much more competitive so that Nvidia lowers their prices and you can then buy Nvidia? That's not how it works. What needs to happen is that Nvidia loses market share to AMD, and for that to happen, people need to buy more from AMD.

From the 6800 (even the 6800 XT at $520) and below, AMD is wayyyyyy better value than the Nvidia counterparts, but hey, Nvidia mindshare does not understand how an RX 6600 is light years ahead of an RTX 3060 in value.
Posted on Reply
#135
ARF
N3M3515You need AMD to be much more competitive so that Nvidia lowers their prices and you can then buy Nvidia? That's not how it works. What needs to happen is that Nvidia loses market share to AMD, and for that to happen, people need to buy more from AMD.

From the 6800 (even the 6800 XT at $520) and below, AMD is wayyyyyy better value than the Nvidia counterparts, but hey, Nvidia mindshare does not understand how an RX 6600 is light years ahead of an RTX 3060 in value.
This means that Nvidia has created a large fanbase loyal to the brand, and also a brand that is recognisable as the "go-to" in the graphics card market no matter the performance and no matter the price.
It's like voodoo black magic or something...
Posted on Reply
#136
Dr. Dro
ARFThis means that Nvidia has created a large fanbase loyal to the brand, and also a brand that is recognisable as the "go-to" in the graphics card market no matter the performance and no matter the price.
It's like voodoo black magic or something...
This is not very far from the truth; however, AMD also has its diehards and loyalists. It's just that the segments that are loyal to AMD are either people who have grown wise to and repulsed by NVIDIA's predatory business practices, old-timers with nostalgia for ATI, or Linux users. All combined, it's a very small minority of people.

NVIDIA's marketing machine is extraordinarily effective. When someone mentions ray tracing, what comes to mind? RTX. When someone brings up upscaling solutions, what comes to mind? DLSS. By successfully capturing the public's attention, they have built a perceived trust - and are now capitalizing on the brand name.
Posted on Reply
#137
Sisyphus
Dr. DroThis is not very far from the truth; however, AMD also has its diehards and loyalists. It's just that the segments that are loyal to AMD are either people who have grown wise to and repulsed by NVIDIA's predatory business practices, old-timers with nostalgia for ATI, or Linux users. All combined, it's a very small minority of people.
Creatives have no alternative; nVidia rules due to its broad software support. 4090 sales are fine, the price is OK.
NVIDIA's marketing machine is extraordinarily effective. When someone mentions ray tracing, what comes to mind? RTX. When someone brings up upscaling solutions, what comes to mind? DLSS. By successfully capturing the public's attention, they have built a perceived trust - and are now capitalizing on the brand name.
Ray tracing was introduced by nVidia into gaming. DLSS was introduced by nVidia into gaming.
Being the technology leader in HPC/visualization and AI has its perks. AMD lags behind here. These are not marketing tricks; AMD's future depends on catching up here. These technologies increase in importance from generation to generation. For the mid-tier gamer this may not matter: if you only want rasterization, AMD is a better solution. If you want more than just playing with the card, or like RT effects/eye candy, nVidia has a clear advantage.
If the 4080/16 GB came for $700, as some here are demanding, AMD could pack up. If the 4080/16 GB's rasterization is about the same as the 7900 XTX's at $1,000, then $1,100 for the 4080 would be OK compared to AMD, given the better RT and Tensor cores. We will see.
Posted on Reply
#138
Dr. Dro
SisyphusCreatives have no alternative; nVidia rules due to its broad software support. 4090 sales are fine, the price is OK.


Ray tracing was introduced by nVidia into gaming. DLSS was introduced by nVidia into gaming.
Being the technology leader in HPC/visualization and AI has its perks. AMD lags behind here. These are not marketing tricks; AMD's future depends on catching up here. These technologies increase in importance from generation to generation. For the mid-tier gamer this may not matter: if you only want rasterization, AMD is a better solution. If you want more than just playing with the card, or like RT effects/eye candy, nVidia has a clear advantage.
If the 4080/16 GB came for $700, as some here are demanding, AMD could pack up. If the 4080/16 GB's rasterization is about the same as the 7900 XTX's at $1,000, then $1,100 for the 4080 would be OK compared to AMD, given the better RT and Tensor cores. We will see.
People have more money than sense. I only bought a 3090 back then because I saw a window at launch, I had the money and I took it. A week later, the crypto mining boom sent GPU prices skyrocketing, and at its height, it was at almost triple the already absurd sum I spent on it. But that doesn't make the price fine, even if sales are as expected for corporate. There's little justification beyond "just 'cause we can" for its pricing.

Not to mention I'm not too sure about that either. After the stores sold the initial batch of 4090s that got to my country, I haven't seen any restocks occur yet...

And no, NVIDIA just seized the moment. They didn't invent raytraced graphics. They just were the first to market with a product ready for it. AMD, Intel, and NV worked with Microsoft to design the specification. Also... the 4080 16GB isn't enough to make AMD pack up even if they didn't have something better than the 6900 XT on the way. It's just not that good a product.
Posted on Reply
#139
ARF
Dr. DroPeople have more money than sense. I only bought a 3090 back then because I saw a window at launch, I had the money and I took it. A week later, the crypto mining boom sent GPU prices skyrocketing, and at its height, it was at almost triple the already absurd sum I spent on it. But that doesn't make the price fine, even if sales are as expected for corporate. There's little justification beyond "just 'cause we can" for its pricing.

Not to mention I'm not too sure about that either. After the stores sold the initial batch of 4090s that got to my country, I haven't seen any restocks occur yet...

And no, NVIDIA just seized the moment. They didn't invent raytraced graphics. They just were the first to market with a product ready for it. AMD, Intel, and NV worked with Microsoft to design the specification. Also... the 4080 16GB isn't enough to make AMD pack up even if they didn't have something better than the 6900 XT on the way. It's just not that good a product.
It is very possible that RTX is the new PhysX and the same fate will be shared.
Dr. DroPeople have more money than sense. I only bought a 3090 back then because I saw a window at launch, I had the money and I took it. A week later, the crypto mining boom sent GPU prices skyrocketing, and at its height, it was at almost triple the already absurd sum I spent on it.
Did you sell it for large profit?
Posted on Reply
#140
Legacy-ZA
Xex360And the 1070 on par with the 980ti, seems lot of people forgot what a true generational leap is.
Indeed; hopefully there will be a hardware website somewhere that will make a chart like this, to put things back into perspective and hopefully make people aware of what a "ride" nGreedia is taking them for.
Posted on Reply
#141
Sisyphus
Dr. DroPeople have more money than sense.
No, they have other opinions. But there is a disturbing tendency regarding the ability to objectively discuss differing opinions.
I only bought a 3090 back then because I saw a window at launch, I had the money and I took it. A week later, the crypto mining boom sent GPU prices skyrocketing, and at its height, it was at almost triple the already absurd sum I spent on it. But that doesn't make the price fine, even if sales are as expected for corporate. There's little justification beyond "just 'cause we can" for its pricing
Has anyone here claimed that cryptomining has ensured fair prices for gaming GPUs?
Not to mention I'm not too sure about that either. After the stores sold the initial batch of 4090s that got to my country, I haven't seen any restocks occur yet...
There are many sources claiming the 4080 is in stock while the 4090 sold out fast. So what is your argument here?
And no, NVIDIA just seized the moment. They didn't invent raytraced graphics. They just were the first to market with a product ready for it. AMD, Intel, and NV worked with Microsoft to design the specification. Also... the 4080 16GB isn't enough to make AMD pack up even if they didn't have something better than the 6900 XT on the way. It's just not that good a product.
"Ray tracing was introduced by nVidia into gaming". 100% fact. Your answer assumes I said otherwise. Can you explain why? Raytracing is industry standard visualization for a long time. Due to high computing demand, it was for professionals only before nVidia implemented it to the rtx 20x0 series.
You failed to explain your statement. I said: 4080 for 700$ and 7900 would be DOA for 900-1000$, if they have the same rasterization power. Have you any argument?
ARFIt is very possible that RTX is the new PhysX and the same fate will be shared.
Wrong. GPUs without hardware-supported RT can only be found in the entry-level class. The higher the price, the more important RT performance and broad software support become. AMD will lose the enthusiast customer base if they don't catch up with RT, like they lost the pro segment. It won't be long before the first games developed on new-generation game engines hit the market. Then it gets serious.
Posted on Reply
#142
ARF
SisyphusWrong. GPUs without hardware-supported RT can only be found in the entry-level class. The higher the price, the more important RT performance and broad software support become. AMD will lose the enthusiast customer base if they don't catch up with RT, like they lost the pro segment. It won't be long before the first games developed on new-generation game engines hit the market. Then it gets serious.
I disagree.
RT is too far away from being reality, you can't run it without very deep and aggressive upscaling (DLSS and similar)...

I am also an enthusiast, and I don't care about ray-tracing - traditional lighting is good for me, the games have much more serious problems than only the lighting.
This is why I can't wait to order a brand new AMD Radeon RX 7900 XT 20 GB.
Posted on Reply
#143
big_glasses
64KI will......

When hell freezes over.
so now?
:D:p
Sisyphus"Ray tracing was introduced by nVidia into gaming". 100% fact. Your answer assumes I said otherwise. Can you explain why? Raytracing is industry standard visualization for a long time. Due to high computing demand, it was for professionals only before nVidia implemented it to the rtx 20x0 series.
ehh, nah, kinda. It's like a vague half-truth. Saying it like that makes it sound like nVidia just straight up "invented" RT for gaming/real-time, which isn't really the truth.
Do you think MS just scraped together DXR in the month between the 2080 release and the DXR public release? Or that the PS5 and Xbox just added (weak) RT capability in the two years in between (or the 6000-series RDNA2 (inb4 it is weak, yes, not the point))?
Yes, nVidia was first to market with an RT-"capable" product, and did some extremely good marketing work, using RTX branding for their initial RT
SisyphusWrong. GPUs without hardware-supported RT can only be found in the entry-level class. The higher the price, the more important the RT performance and a broad software support.
lmao, I quoted this in between it being edited, so it's a bit mangled

Too early to say if it'll get the "PhysX" treatment, but I doubt it. Given RT isn't a vendor-specific product, but is exposed through the APIs (DXR, Vulkan RT), it'll probably spread and be usable in the future (a matter for discussion ofc). As it looks now, it'll probably end up either being used full-on as the sole light source, or in a hybrid way (but more so) like now (or both).
Posted on Reply
#144
Sisyphus
big_glassesso now?
:D:p



ehh, nah, kinda. It's like a vague half-truth. Saying it like that makes it sound like nVidia just straight up "invented" RT for gaming/real-time, which isn't really the truth.
I said, nVidia introduced RT to gaming, which is 100% factual.
Do you think MS just scraped together DXR in the month between the 2080 release and the DXR public release? Or that the PS5 and Xbox just added (weak) RT capability in the two years in between (or the 6000-series RDNA2 (inb4 it is weak, yes, not the point))?
Yes, nVidia was first to market with an RT-"capable" product, and did some extremely good marketing work, using RTX branding for their initial RT
It takes about 4-5 years to upgrade game engine development. The better effects are coming over the next few years. The PS5 and Xbox won't be able to offer good RT anytime soon. Whoever wants cutting-edge graphics needs cutting-edge GPUs, not mainstream ones.
Posted on Reply
#145
Unregistered
Legacy-ZAIndeed; hopefully there will be a hardware website somewhere that will make a chart like this, to put things back into perspective and hopefully make people aware of what a "ride" nGreedia is taking them for.
You can check AdoredTV's videos; he was one of the few who didn't fall for nVidia's marketing for Ampere.
#146
Sisyphus
ARFI disagree.
RT is too far away from being reality, you can't run it without very deep and aggressive upscaling (DLSS and similar)...

I am also an enthusiast, and I don't care about ray-tracing - traditional lighting is good for me, the games have much more serious problems than only the lighting.
This is why I can't wait to order a brand new AMD Radeon RX 7900 XT 20 GB.
They all have problems, as the RT effects have been retrofitted into mature game engines after the fact. The next generation of game engines has RT as an integral part, which gives a different quality.
Posted on Reply
#147
rv8000
ARFIt is very possible that RTX is the new PhysX and the same fate will be shared.
Closed ecosystem anything is bad. Eventually everything will be ray traced and RTX will certainly be a thing of the past.
Posted on Reply
#148
Dr. Dro
ARFDid you sell it for large profit?
No. I kept it because I needed it, and I still do.

Also, I don't think it is going anywhere soon. DirectX Raytracing is not a closed NVIDIA-only thing, and since Ampere it's supported from the bottom up through every tier - and it works.
Posted on Reply
#149
efikkan
Dr. DroThey aren't now (beyond holding back the lower SKUs in the stack to move the same Ampere stock they call ancient GPUs unworthy of DLSS 3, but as a certified poor, I digress), but they would rather do that than engage in a price war, especially considering how much room for SKUs they have.
So you're saying Nvidia could potentially exploit something so we are going to assume they are evil?
Where is the evidence of Nvidia holding back the lower SKUs? (beyond nonsense from some YouTube channels)
It's normal that lower chips follow in sequence. I thought people would remember this by now.
Dr. DroOn the last bit, nonsense? I wasn't the one who announced a card and then unlaunched it, it was NVIDIA.
Renaming a product due to market backlash? How is this relevant to your claims?
Dr. DroNo matter where you put it, even in these generations of old (the GTX 980 wasn't the top model, there were two GPUs above it, the 980 Ti and the Titan X),
GTX 980 was the top model for about half a year, and it remained in the high-end segment until it was succeeded by Pascal.
Dr. Drothe xx104-class cards have always been the middle-of-the-pack ones. Even with Kepler, the GTX 600 series was quickly complemented by the 700 series, which introduced the GK110 that was sizably faster than the GK104, similarly to how the GTX 500 series (and the 580) launched only 8 months after the GTX 480 and fixed the 400 series' terrible thermals; it was wise of NVIDIA at the time not to repeat the GF100.
The mid-range cards of the 600-series used both GK106 and GK104 chips.
The 600-series was "short lived" compared to the current release tempo. Back then Nvidia used to release a full generation and a refreshed generation (with new silicon) every ~1.25-1.5 years or so.
Geforce GTX 480 was delayed due to at least three extra steppings.

And back in the 400-series they used a GF100 chip in the GTX 465, which scaled terribly.
You should spend some time looking through the List of Nvidia GPUs. The naming is arbitrary; in one generation a 06 chip is the lowest, in others the 08 chip is. What they do is design the biggest chip in the family first, then "cut down" the design into as many chips as they want to, and name them accordingly: 0, 2, 3, 4, 6, 7, 8. Sometimes they even make it more complicated with 110, 114, etc., which seem like minor revisions of 100 and 104 respectively.
So listen and learn, or keep digging…
ARFAMD has always been the cheaper, value option and despite this, it is not enough to improve the market situation of the company.
So, AMD needs to step up with something completely different, plus the discounts, of course.
This might be your impression, but it doesn't match the reality. Back in the ATI days, they used to offer higher value in the upper mid-range to lower high-end segments, but since then they have been all over the place.
The Fury cards didn't start things off well: low availability and a high price. They were followed by the RX 480/580, which were very hard to come by at a good price, compared to the competing GTX 1060, which sold in massive amounts and was still widely available, even below MSRP at times. The RX Vega series was even worse; most have now forgotten that the $400/$500 price tag initially came with a game bundle, and it took months before the cards were somewhat available close to that price. Over the past 5+ years, AMD's supplies have been too low. Quite often the cheaper models people want are out of stock, while Nvidia's counterparts usually are in stock. This is why I said AMD needs to have plenty of supply to gain market share.

We need to stop painting Nvidia/AMD/(Intel) as villains or heroes. They are not our friends, they are companies who want to make money, and given the chance, they will all overcharge for their products.
ARFIt is very possible that RTX is the new PhysX and the same fate will be shared.
RTX is their term for the overarching GPU architecture:

I doubt it will go away until their next major thing.
Posted on Reply
#150
Dr. Dro
efikkanSo you're saying Nvidia could potentially exploit something so we are going to assume they are evil?
Where is the evidence of Nvidia holding back the lower SKUs? (beyond nonsense from some YouTube channels)
It's normal that lower chips follow in sequence. I thought people would remember this by now.


Renaming a product due to market backlash? How is this relevant to your claims?


GTX 980 was the top model for about half a year, and it remained in the high-end segment until it was succeeded by Pascal.


The mid-range cards of the 600-series used both GK106 and GK104 chips.
The 600-series was "short lived" compared to the current release tempo. Back then Nvidia used to release a full generation and a refreshed generation (with new silicon) every ~1.25-1.5 years or so.
Geforce GTX 480 was delayed due to at least three extra steppings.

And back in the 400-series they used a GF100 chip in the GTX 465, which scaled terribly.
You should spend some time looking through the List of Nvidia GPUs. The naming is arbitrary; in one generation a 06 chip is the lowest, in others the 08 chip is. What they do is design the biggest chip in the family first, then "cut down" the design into as many chips as they want to, and name them accordingly: 0, 2, 3, 4, 6, 7, 8. Sometimes they even make it more complicated with 110, 114, etc., which seem like minor revisions of 100 and 104 respectively.
So listen and learn, or keep digging…



This might be your impression, but it doesn't match the reality. Back in the ATI days, they used to offer higher value in the upper mid-range to lower high-end segments, but since then they have been all over the place.
The Fury cards didn't start things off well: low availability and a high price. They were followed by the RX 480/580, which were very hard to come by at a good price, compared to the competing GTX 1060, which sold in massive amounts and was still widely available, even below MSRP at times. The RX Vega series was even worse; most have now forgotten that the $400/$500 price tag initially came with a game bundle, and it took months before the cards were somewhat available close to that price. Over the past 5+ years, AMD's supplies have been too low. Quite often the cheaper models people want are out of stock, while Nvidia's counterparts usually are in stock. This is why I said AMD needs to have plenty of supply to gain market share.

We need to stop painting Nvidia/AMD/(Intel) as villains or heroes. They are not our friends, they are companies who want to make money, and given the chance, they will all overcharge for their products.


RTX is their term for the overarching GPU architecture:

I doubt it will go away until their next major thing.
You kind of said a lot and said nothing at the same time. Why is there market backlash? Perhaps because of what I mentioned earlier. It doesn't hold up to the x80 tier. They are holding back the 4070 and 4060 until next year, too.

The GTX 465 was die harvested to move inventory. It's not that it used GF100 because it was designed around it, it was just a way to shift bad bins of higher end cards. No wonder it sucked. The GF100 at best felt like a prototype of the GF110, and I should know 'cause I had 3 480s in SLI, and then 2 580s back in the day.

The 11x-class chips haven't been released since the GK110, which already goes back around 8 years at this point. They are intra-generational refreshes, same as the -20x chips such as GK208 and GM204/GM200. I don't know why you brought up the correlation between HBM cards (low-yield, expensive tech) and their midrange successors; both the GTX 1060 and Polaris sold tens of millions of units and are still amongst the most widely used GPUs of all time. The 480's very low $199 launch price may have been a little difficult at the beginning, but for a couple of years after they lost their shine and before the crypto boom, you could easily get them for change.

GA106 was not the smallest Ampere, for example. The GA107 was also used in some SKUs and in the mobile space, and there is also a GA107S intended for the embedded market. It's not really a hard rule, but the tiers are clearly delineated.

I... don't see how any of this was productive?
Posted on Reply