Thursday, May 25th 2023
NVIDIA GeForce RTX 4060 Ti Sees Price Cuts in Europe Just Hours After Launch
NVIDIA's GeForce RTX 4060 Ti mid-range graphics card saw its first round of retailer-level price cuts mere hours into its market availability (starting May 24). The card was seen starting at 439€ (including taxes), but quickly saw its price trimmed by 20€, down to 419€, in what could be a very early sign that the card isn't exactly flying off the shelves. Our reviews of the RTX 4060 Ti show it barely matching the previous-generation RTX 3070 (a card it was expected to beat), and the prospect of a 439€ graphics card for 1080p gaming, when a 4K-capable Sony PlayStation 5 can be had for around 449€, isn't going down well with gamers. The comments sections of our reviews show readers to be agitated, to say the least.
The generational performance uplift trend has been for a graphics card in a given market segment to match or beat the card from the segment above in the previous generation. The RTX 4060 Ti, for example, should be matching the RTX 3070 Ti in performance, while the upcoming RTX 4060 should be matching the RTX 3070. This doesn't seem to be happening, and the RTX 4060 Ti is being offered at the same launch MSRP as the RTX 3060 Ti, which at launch beat even the enthusiast-class RTX 2080 from its preceding generation.
Source:
VideoCardz
57 Comments on NVIDIA GeForce RTX 4060 Ti Sees Price Cuts in Europe Just Hours After Launch
- Every next launch MSRP is going to be met with big questions
- Nvidia's position as a salesman gets obliterated (I mean, dropping everything even half a tier in pricing this shortly after launch is like the market dealer you have to haggle with - you just know the price can be halved, whatever it is).
- There is no reason on the competitive front - AMD's whole lineup would suddenly be standing there pointless. So you're selling by undercutting, not because the product is worthwhile on its own, which is the opening shot of an all-out price war, and you don't know where the bottom lies. But Nvidia does know its operating costs are higher than AMD's, as is its initial investment, even if it can put that investment to use in a bigger market. We can take an educated guess here that Nvidia's GPUs in general are also more expensive to build, if you include the whole software development behind them, the monolithic die, etc.
This, among many other reasons I probably don't know of, is why you will always see them refresh a lineup before doing what you suggest. This mechanic also highlights why and how AMD and Nvidia trail each other closely - neither benefits from going way out of line. Some semblance of balance preserves both their products' value. It's not a cartel... but it damn well feels like one.
Then again, their profits from gaming products are down 38% YoY...
www.techpowerup.com/309125/nvidia-announces-financial-results-for-first-quarter-fiscal-2024-gaming-down-38-yoy-stock-still-jumps-25
I opened up this thread in sheer excitement.... but.... 20 euros?
The last base-model xx50 card that I remember actually being entry-level priced with decent performance was the 1050, which was like 7 or 8 years ago? Probably the RTX 2000 era, when Nvidia phased those xx50, xx30, and xx10 cards out.
Cards below that power level are basically legacy GPUs made for retro gaming (7-10+ year old games). Why would anyone dish out 400-500€ (the price of a modern 2K-4K console) for a GPU that can solidly play maybe Skyrim or GTA5? The only scenario I can think of is if an old GPU died, and even then I would look elsewhere, save a bit more cash, or buy a console (even a Switch can do that job).
1080p is the lowest and most commonly used resolution for gaming - a pure example of entry level.
1440p is increasing its presence and is considered the sweet spot for gaming at this time (monitors are accessible, HW is accessible; it's just Nvidia trying to overprice their GPUs and AMD following their lead).
4K is the top, besides 8K, which is not really a thing (those monitors and TVs are only just coming down in price and it will take several years for them to be established en masse).
Just by this simple division of resolutions, it's clear 1080p is entry level, 1440p is midrange (where most xx70 and some xx80 cards fall), and 4K is at the top.
The previous-generation 1080p budget card had 18% of the Titan's cores for around 9% of its price:
- Titan = 3584 cores, $1200
- 1050 = 640 cores, $110
Let's compare to the 40-series; the 1080p budget card is (according to Nvidia) the 4060, which has 18% of the 4090's cores for 19% of its price:
- 4090 = 16384 cores, $1600
- 4060 = 3072 cores, $300
Not only are we being charged twice as much for 1080p, it's an xx50 class of card, by both specification and relative performance.
Nvidia makes no price cuts, the OEMs have to.
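The cores-versus-price comparison in the comment above can be double-checked with a short sketch. The CUDA core counts and launch MSRPs are the ones quoted in the comment; the card names are just dictionary labels:

```python
# Sketch: verify the "same slice of flagship, twice the relative price" claim.
# Core counts and launch MSRPs as quoted in the comment above.
cards = {
    "Titan (Pascal)": (3584, 1200),
    "GTX 1050":       (640, 110),
    "RTX 4090":       (16384, 1600),
    "RTX 4060":       (3072, 300),
}

def share(card, flagship):
    """Return (core fraction, price fraction) of a card vs. its flagship."""
    cores, price = cards[card]
    f_cores, f_price = cards[flagship]
    return cores / f_cores, price / f_price

# Pascal era: ~18% of the flagship's cores for ~9% of its price.
core_pct, price_pct = share("GTX 1050", "Titan (Pascal)")
print(f"GTX 1050: {core_pct:.1%} of cores at {price_pct:.1%} of the price")

# Ada era: ~19% of the cores now costs ~19% of the price,
# i.e. roughly double the relative cost for the same slice.
core_pct, price_pct = share("RTX 4060", "RTX 4090")
print(f"RTX 4060: {core_pct:.1%} of cores at {price_pct:.1%} of the price")
```

The ratios bear out the comment: the 1050 bought its share of the flagship for about half the relative price the 4060 asks.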
Sounds painful for OEMs
Let's wait and see.
Yep... ran Ivy too :) The days. That quad core is still working in my HTPC lol.
The 1650 and 1630 weren't serious attempts either. They came out really late, weren't even RTX 2000-generation cards, and performed significantly worse than similarly priced AMD cards at the time.
Like I said, the last card I remember actually being a decent entry-level card in price and performance was the 1050, from like 6 or 7 years ago. Everything since then has been either completely forgettable or mediocre at launch.