NVIDIA GeForce RTX 3060 Ti Founders Edition

How could TPU control pricing/MSRP pretty please?

And it's not a secret figure either - you can see it with your own eyes easily (unlike performance, which is what this site is testing).
If the price is fake, then the whole review is invalid or meaningless, because a major part of the conclusion is based on the price/performance of the card - especially when this segment is supposedly a price/performance segment.
Not to mention that the whole Founders Edition launch has been fake too - just a paper launch, with cards only sent to the media. No FE has actually been available. They are either not sold at all or sold in very limited numbers, presumably just enough to avoid getting sued or running into legal problems.
 
Pretty much most top-end, or 'cutting-edge' process tech is thin on the ground. The foundries are having problems supplying demand, whether it's Nvidia and Samsung, or AMD and TSMC. The one thing that truly is missing, however, is patience. From most people, it seems. :kookoo:
 
Once again TPU falls into Nvidia's trap and participates in the fraud of announcing an unavailable product at a fake MSRP.
Shame!
Have you read the last paragraph? I basically say "I think these prices are fake"
 
Have you read the last paragraph? I basically say "I think these prices are fake"
I've definitely read that. And I believe that is not enough, because not only are the prices fake, the product itself is fake too. It is an (almost) always out-of-stock product, only available at launch, only for the media, and of course at a fake price.
 
Prices are definitely ... not there. Here in Sweden, the MSRP is supposedly SEK 4500, yet the cheapest one I can find (all of which are fully booked by pre-orders, from what I can tell) is at 4800, with most being 5000 or more.

Still, I think @W1zzard did a good job in pointing out this issue. It's not as if TPU can help this.

Still, it makes me wonder what exactly is happening with tech pricing these days. Consoles being a bit more expensive than previously is understandable to a certain degree, and of course the effects are being exacerbated here due to exchange rates fluctuating, but with GPUs in particular ... something must be off, with relatively high MSRPs, yet those still producing 0% margins for partners, all the while there is no supply to speak of. Are these new process nodes that expensive? Are the chips that big? Are yields that bad? Are we in a new, less visible crypto boom? Is TSMC 7nm that squeezed for capacity? Is Covid having a bigger than expected effect on manufacturing and shipping? Is (regular end-user, not miner, datacenter, etc.) demand that much higher than previously?
 
NVIDIA has full fabs at Samsung yet failed to deliver the products. It's so ironic.
Well, it's completely weird. It's a shady business to jack the price up. I don't believe the scalpers.
I believe they are laughing at their customers right now. :roll:
 
I've definitely read that. And I believe that is not enough, because not only are the prices fake, the product itself is fake too. It is an (almost) always out-of-stock product, only available at launch, only for the media, and of course at a fake price.
You're exaggerating wildly. I fully expect the 3060 Ti to be readily available at not much over MSRP in a couple of weeks or less. The main reason is that it only matches the second-best card of the previous generation (the 2080 Super) in performance, which means it won't be interesting to owners of a 1080 Ti, 2080, 2080 Super, 2070 Super, 2070 or 5700 XT, and likely also not to those with a 2060 Super, 5700, 2060, 1080, Vega 64, or even a 5600 XT, 1070 Ti or Vega 56.
 
Well, as flight crew I can chime in on why the cost of products is going up: shipping costs are rising to accommodate each country's quarantine rules.

Before Covid: daily flights.
Now: 3 cargo flights per week, because crews have to remain in quarantine (14 days) after overseas flights, leaving not enough crew.
So yeah, I have been in quarantine for the past month :banghead: doing cargo flights. I would assume the same thing is happening at other airlines and to other air crews.
Basically, air crews are being treated as potential overseas virus carriers :roll:.

Whoever keeps complaining about no GPU availability can suck it :D
 
Pretty much most top-end, or 'cutting-edge' process tech is thin on the ground. The foundries are having problems supplying demand, whether it's Nvidia and Samsung, or AMD and TSMC. The one thing that truly is missing, however, is patience. From most people, it seems. :kookoo:
It's easy for you to be patient with a 2080 Ti. Some people are stuck on turds and wanna upgrade in time for CP2077.
 
"NVIDIA's GeForce RTX 3060 Ti comes at incredible pricing of $399 "
It's incredible in the literal sense, because no one believes it. :roll:
Craziest part here is that nobody believes an xx60 card is priced at $400 and it's because people think it's too cheap.
 
Craziest part here is that nobody believes an xx60 card is priced at $400 and it's because people think it's too cheap.
People don't think it's too cheap - that wording implies that people think they're charging less than it's worth. People don't believe it'll actually be available at that price, due to current market realities. Also, an xx60 Ti card is not an xx60 card.
 
NVIDIA has full fabs at Samsung yet failed to deliver the products. It's so ironic.
Well, it's completely weird. It's a shady business to jack the price up. I don't believe the scalpers.
I believe they are laughing at their customers right now. :roll:

Who says the problem is at Samsung? There's a whole chain of suppliers and vendors involved in getting a GPU onto shelves.
 
You're not wrong about the power draws increasing this generation, but a GPU running at ~210W, peaking at ~220W, does not require a >600W PSU unless your build is really out there. Most PCs these days have one SSD, maybe one HDD, a few fans, perhaps an AIO pump, and that's it. Combine the GPU power draw with the real-world power draw of a CPU in a matching price range, like the 3600 or 5600X, and you have about 300W; add in another 50-70W for the motherboard, RAM, storage and fans, and another 20% or so for safety and margin for PSU wear. That leaves you with a minimum PSU of ~420-440W. So even a 500W PSU would be plenty and would leave you room for future upgrades too (though statistically, the chance of someone moving significantly up in power level from their current GPU when they upgrade is rather small - the far more likely thing is for them to buy a newer GPU in the same tier, with roughly comparable power draw). This of course assumes one buys PSUs of reasonable quality from reliable manufacturers, but that's a given.
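To make that arithmetic concrete, here's a minimal sketch of the sizing approach described above, in Python. The helper name and example figures are mine, not any official calculator; the idea is simply to sum the measured peak draws, add a bit for the rest of the system, then add ~20% margin.

```python
# Minimal sketch of the PSU-sizing approach described above.
# The function name and defaults are illustrative assumptions, not an
# official tool: plug in measured peak draws for your actual parts.

def estimate_psu_watts(gpu_peak_w, cpu_peak_w, other_w=60, margin=0.20):
    """Minimum PSU rating: component peaks + extras + a safety margin.

    other_w -- motherboard, RAM, storage, fans (~50-70 W typical)
    margin  -- headroom for transients and PSU wear (the ~20% above)
    """
    return (gpu_peak_w + cpu_peak_w + other_w) * (1 + margin)

# The post's example: ~220 W GPU peak plus a 3600/5600X-class CPU (~80 W
# real-world gaming draw, giving the ~300 W combined figure above).
print(round(estimate_psu_watts(gpu_peak_w=220, cpu_peak_w=80)))  # -> 432
# ~432 W minimum, so a quality 500 W unit is plenty, with upgrade room.
```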

Quality > wattage. The impact of ripple gets bigger as the peak demands of GPUs go up, and they do. Clean power delivery matters, but a high-wattage unit that is somewhat less impressive in terms of ripple can still end up with lower ripple at your typical load, simply because it's running at a smaller fraction of its rating.

Real quick: the GTX 465 wasn't faster than a GTX 460; it was the stopgap because the GTX 460's GF104 wasn't ready yet.

Here is the trend over the last decade for launch price per bracket, by card series. Some things stick out: top-end cards have moved from $700 to $1,200, and the midrange has moved from $250 to $499.

I think Nvidia has lost touch. The GTX 1660 was not a good replacement for the GTX 1060 6 GB; it was more of a side-grade, at times maybe 10% better. The real improvement meant moving up a bracket, out of midrange territory and into the range of the high end.

A few oddities to notice: the GeForce3 was overpriced, but as with later Nvidia cards, ATI had nothing to compete. The GTX 200 series is all over the place on launch prices - you can tell when AMD launched the HD 4000 series. So it seems that normally, when Nvidia has competition, prices go down - but not this time.

But going back in time, the GeForce2 Ultra at $500 is only equal to about $750 in today's money, so it's not just inflation.
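As a rough sanity check on that point (the ~1.5x cumulative inflation factor from 2000 to today is my assumption, chosen to match the post's figures, not an exact CPI lookup):

```python
# Rough inflation check for the GeForce2 Ultra point above.
# The ~1.5x factor (2000 -> 2020) is an assumed round figure that
# matches the post's numbers, not an exact CPI lookup.
CPI_FACTOR = 1.5

launch_price_2000 = 500  # USD, GeForce2 Ultra
adjusted = launch_price_2000 * CPI_FACTOR
print(f"${launch_price_2000} in 2000 is roughly ${adjusted:.0f} today")
# -> ~$750, well short of today's $1,200+ flagships, so inflation alone
#    doesn't explain the climb at the top end.
```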

Launch price bracket (USD) - cards, oldest generation to newest:

$2,000-3,000: GTX Titan Z, GTX Titan V, Titan RTX
$1,000-2,000: GTX 690, GTX Titan, GTX Titan Black, GTX Titan X (Maxwell), GTX Titan X (Pascal), GTX Titan Xp, RTX 2080 Ti, RTX 3090
$700-1,000: 8800 Ultra, 9800 GX2, GTX 590, GTX 780 Ti
$600-700: 7800 GTX 512, 7950 GX2, 8800 GTX, GTX 780, GTX 980 Ti, GTX 1080 Ti, RTX 2080 Super, RTX 2080, RTX 3080
$500-600: GeForce2 Ultra, GeForce3 Ti 500, GeForce3, Ti 4800, Ti 4600, FX 5800 Ultra, FX 5900 Ultra, FX 5950 Ultra, 6800 Ultra Extreme, 7900 GTX, GTX 280, GTX 480, GTX 580, GTX 680, GTX 980, GTX 1080, RTX 2070 Super, RTX 2070, RTX 3070
$400-500: 6800 Ultra, 7800 GT, 8800 GTS 640, GTX 295, GTX 260, GTX 670, GTX 770, GTX 1070 Ti, RTX 2060 Super, RTX 3060 Ti
$300-400: GeForce2 Pro, GeForce2 GTS, GeForce3 Ti 200, Ti 4400, FX 5800, FX 5900, 7800 GS, 7950 GT, 7900 GT, 8800 GTS 512, 8800 GTS 320, 9800 GTX, GTX 285, GTX 260 216, GTX 470, GTX 570, GTX 660 Ti, GTX 970, GTX 1070, RTX 2060
$250-300: GeForce2 Ti, 6800 GT, 7900 GS, GTX 275, GTX 465, GTX 560 Ti 448, GTX 1660 Ti
$200-250: Ti 4200, FX 5900 XT, 6800, 8800 GT, 9800 GT, GTX 460, GTX 560 Ti, GTX 660, GTX 760, GTX 960, GTX 1060 6 GB, GTX 1660 Super, GTX 1660
$150-200: MX 400, MX, MX 460, FX 5700 Ultra, FX 5700, FX 5600 Ultra, FX 5600, 6800 LE, 6600 GT, 7600 GT, 8800 GS, 8600 GTS, 9600 GT, GTS 250, GTX 460 768 MB, GTX 560, GTX 650 Ti Boost, GTX 650 Ti, GTX 750 Ti, GTX 950, GTX 1060 3 GB, GTX 1650 Super
$100-150: MX 440, FX 5700 LE, FX 5600 LE, FX 5200 Ultra, 6600, 7600 GS, 8600 GT, 9600 GSO, GT 240, GTS 450, GTX 550 Ti, GTX 650, GTX 750, GTX 1050 Ti, GTX 1050, GTX 1650

It's not just inflation. It's a marketplace. Lots of factors have varying degrees of influence; inflation is just a constant one.

Even if the MSRPs are set, we see that external influences do affect price. Currently we have a lot of those factors stacking up: Covid, the trade war, Christmas shopping, a past gen with lacking competition, and a good product line on both sides today after several years of weak releases.

But another aspect I always see people omitting is the fact that right now the performance delta between the lowest-end and the highest-end card is absolutely friggin massive. It wasn't always like that. You can now buy 1080p-capable cards in the lower regions (as in ultra 60-120 fps capable, which used to be a holy grail just a few years back), and when you hit the midrange, you'll be set for 1440p gaming. That's been true since Pascal, by the way, which also got the first notable price hike per tier. I said back then it was justified and I still believe it is. Pascal offered a major jump, bigger than most generations before it; it's not weird Nvidia wanted to cash in on that - you get what you pay for. Ironically, the current price point of RDNA2 confirms that, too. Pascal was priced right.

Right now people are somehow convinced they MUST game at 4K and then start complaining the cards are so expensive... do we even logic. Similar ideas apply to quality and FPS targets... everyone has to have high refresh rate ultra it seems, even if the wallet isn't deep enough to get there.

Has Nvidia lost touch... meh. Not sure. The stars just didn't align for Ampere for them, I think. They're still leading in many ways, but that leadership is definitely shifting away from gaming and more towards datacenter/enterprise. That began properly with Volta and their attempt to acquire ARM. All they need going from there to feed Geforce is just a trickle down.
 
Quality > wattage. The impact of ripple gets bigger as the peak demands of GPUs go up, and they do. Clean power delivery matters, but a high-wattage unit that is somewhat less impressive in terms of ripple can still end up with lower ripple at your typical load, simply because it's running at a smaller fraction of its rating.
Yeah, PSU quality is so much more important than the rated wattage. It's a damn shame that no PSU manufacturer is willing to make good-quality, Gold-rated or better 400-500W modular units at affordable prices, as that would fulfill the needs of >90% of users. Sadly these likely wouldn't sell well, as too many people still believe in and regurgitate the decade-old silliness of planning for 2x the necessary capacity - which was smart back when you couldn't trust the wattage rating of your PSU, but is a load of rubbish these days.

The average gaming PC (something like an i5 or Ryzen 5 + a 1060 or similar) doesn't even hit 300W internal power draw under gaming loads, and likely not even under a power virus load, so people buying 600W+ PSUs for these things is just silly. Of course, the CYA approach of "minimum PSU wattage" ratings for components just exacerbates this, as Nvidia and AMD both seem to factor in the most outlandish configurations and the worst PSUs, and still add heaps of headroom to these numbers.

I've been happily running my i5-2400 + RX 570 off of a 350W (fanless, custom AC-DC + PicoPSU) PSU for a year or so, and it doesn't even get close to warm under load; IIRC it doesn't consume more than ~230W at the wall, including PSU losses, under a torture load - so I've got plenty of headroom for upgrades, even. I'm greatly looking forward to the day when people start buying sensible PSUs for themselves.
 
MSRP is too low, so the market is adjusting prices according to performance.
 
Have you read the last paragraph? I basically say "I think these prices are fake"
Hi,
Probably should have been in the first paragraph lol
 
Hi,
Probably should have been in the first paragraph lol
I always write my conclusions following a certain flow; the pricing discussion is at the end. I always thought it makes sense to know more about the product first and then talk about pricing.
 
Hi,
It takes a long time, usually 4-6 months, for prices to settle after a release.

I always write my conclusions following a certain flow; the pricing discussion is at the end. I always thought it makes sense to know more about the product first and then talk about pricing.
Hi,
I've seen you add your opinions in the first paragraph too :-)
 
Quality > wattage. The impact of ripple gets bigger as the peak demands of GPUs go up, and they do. Clean power delivery matters, but a high-wattage unit that is somewhat less impressive in terms of ripple can still end up with lower ripple at your typical load, simply because it's running at a smaller fraction of its rating.



It's not just inflation. It's a marketplace. Lots of factors have varying degrees of influence; inflation is just a constant one.

Even if the MSRPs are set, we see that external influences do affect price. Currently we have a lot of those factors stacking up: Covid, the trade war, Christmas shopping, a past gen with lacking competition, and a good product line on both sides today after several years of weak releases.

But another aspect I always see people omitting is the fact that right now the performance delta between the lowest-end and the highest-end card is absolutely friggin massive. It wasn't always like that. You can now buy 1080p-capable cards in the lower regions (as in ultra 60-120 fps capable, which used to be a holy grail just a few years back), and when you hit the midrange, you'll be set for 1440p gaming. That's been true since Pascal, by the way, which also got the first notable price hike per tier. I said back then it was justified and I still believe it is. Pascal offered a major jump, bigger than most generations before it; it's not weird Nvidia wanted to cash in on that - you get what you pay for. Ironically, the current price point of RDNA2 confirms that, too. Pascal was priced right.

Right now people are somehow convinced they MUST game at 4K and then start complaining the cards are so expensive... do we even logic. Similar ideas apply to quality and FPS targets... everyone has to have high refresh rate ultra it seems, even if the wallet isn't deep enough to get there.

Has Nvidia lost touch... meh. Not sure. The stars just didn't align for Ampere for them, I think. They're still leading in many ways, but that leadership is definitely shifting away from gaming and more towards datacenter/enterprise. That began properly with Volta and their attempt to acquire ARM. All they need going from there to feed Geforce is just a trickle down.

Disagree. Pascal was a return to normal, where the midrange was competitive, and then they decided: meh. I can think of bigger generational jumps:

6600 GT vs FX 5950 Ultra
7600 GT vs 6800 Ultra
GTX 460 vs GTX 285
And you only had to wait a year between generations.
 
You're not wrong about the power draws increasing this generation, but a GPU running at ~210W, peaking at ~220W, does not require a >600W PSU unless your build is really out there. Most PCs these days have one SSD, maybe one HDD, a few fans, perhaps an AIO pump, and that's it. Combine the GPU power draw with the real-world power draw of a CPU in a matching price range, like the 3600 or 5600X, and you have about 300W; add in another 50-70W for the motherboard, RAM, storage and fans, and another 20% or so for safety and margin for PSU wear. That leaves you with a minimum PSU of ~420-440W. So even a 500W PSU would be plenty and would leave you room for future upgrades too (though statistically, the chance of someone moving significantly up in power level from their current GPU when they upgrade is rather small - the far more likely thing is for them to buy a newer GPU in the same tier, with roughly comparable power draw). This of course assumes one buys PSUs of reasonable quality from reliable manufacturers, but that's a given.

You're assuming no one ever OCs anything in their rig, that they all buy Founders Editions, never get a higher-end CPU, aren't using any USB 3.2 / USB-C devices, have no SATA drives or HDDs, and so on. Keep in mind these benchmarks are on stripped-down systems.

A single HDD, for example, can consume 20W. SSDs are better, but Tom's shows that using a WD Blue instead of a Samsung 850 can lop 30 minutes of battery life off a laptop. It's not zero.

For me, I have a USB hub plugged into a USB-C port that I use to charge my phone, keyboard, mouse, and so on. Separately, I have a USB 3.1 external 3TB HDD. I also have both a SATA SSD and an M.2 drive (the tests only have one M.2), and a PCIe wireless/Bluetooth card. This stuff all adds up; I bet there's an extra 50W of draw in there, and if I plug multiple devices into that hub it could be more. I don't think this is unusual - plenty of folks have much more.

This is why the rec from Nvidia is to have 650W for a 3070 and 750W for a 3080. I'm sure Nvidia will rec 550W+ for a 3060 Ti.
 
You're assuming no one ever OCs anything in their rig, that they all buy Founders Editions, never get a higher-end CPU, aren't using any USB 3.2 / USB-C devices, have no SATA drives or HDDs, and so on. Keep in mind these benchmarks are on stripped-down systems.

A single HDD, for example, can consume 20W. SSDs are better, but Tom's shows that using a WD Blue instead of a Samsung 850 can lop 30 minutes of battery life off a laptop. It's not zero.
Nope. Please re-read - though there were a few details I left out: a core tenet of this approach is checking real-world power draws, in other words the numbers for the specific GPU you're planning to buy. Not generic numbers, not FE numbers (unless you're buying the FE), not total system power numbers. If you plan to OC, obviously factor that in, but the vast majority don't OC, and besides, the 20% total overhead is typically sufficient to account for that. Getting a higher-end CPU doesn't make that much of a difference - CPU power draws scale far less than GPU power draws, except for the past two generations of Intel chips, of course (though even those are far from their peak draws while gaming). But again, the 20% headroom accounts for that.

USB devices generally consume little power, and are unlikely to be in heavy use while the PC is under heavy load, like when a game is running. The same goes for drives - and as I said, the average gaming PC today has a single SSD and possibly an HDD. HDD peak power draw happens only during spin-up, so the chance of that happening during gaming is ... tiny. In-use power for a 7200rpm 3.5" HDD is <10W. But more importantly: summing peak power numbers adds a lot of invisible headroom. Gaming never stresses both the CPU and GPU to their maximum power draw, let alone the rest of the system. So if you have a peak-90W CPU and a peak-150W GPU, you're never going to see 240W from those two components while gaming. Games don't load the whole PC 100%. So in real-world usage scenarios, those additional 20% are already on top of built-in headroom.
For me, I have a USB hub plugged into a USB-C port that I use to charge my phone, keyboard, mouse, and so on. Separately, I have a USB 3.1 external 3TB HDD. I also have both a SATA SSD and an M.2 drive (the tests only have one M.2), and a PCIe wireless/Bluetooth card. This stuff all adds up; I bet there's an extra 50W of draw in there, and if I plug multiple devices into that hub it could be more. I don't think this is unusual - plenty of folks have much more.
That is definitely above average, if not uncommon. As I said, most PC builds these days have a single SSD, and maybe an HDD. Two years ago HDDs were ubiquitous, but not today. Mice and keyboards consume maybe a few watts each - they need to be USB 2.0 compliant, which means 2.5W max, though typically much less unless they have excessively bright RGB. Desktop USB-C ports output a maximum of 15W (5V3A) - that's all the specification allows for without an external PSU. And your HDD again might peak at 20W, but is more likely to be idling at 1-3W or running at 5-10W while the PC is being stressed.

This is why the rec from Nvidia is to have 650W for a 3070 and 750W for a 3080. I'm sure Nvidia will rec 550W+ for a 3060 Ti.
The thing is, even with your additional numbers, you get nowhere near 650W. Not even close. The 3070 is a 220W (240W peak) GPU. Add a ~150W CPU, ~25W for the motherboard and RAM, 20W for a couple of drives, 20W for a few fans and an AIO pump, and another 10W for peripherals, and you get 465W, or 558W with a 20% margin. And again, that system will never, ever consume 465W. Never. That's not how PCs work. Every single component is never under full stress at the same time, even for a millisecond, let alone long enough for it to trip the PSU's OCP. And remember, that's with a 150W CPU, not a 95W or 65W one. There is, in other words, plenty of headroom built into these numbers already. For any other 3070 than the FE, exchange 240W in the calculation with its peak power draw. It really isn't hard.
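For what it's worth, that arithmetic checks out. Here is the same budget as a quick sketch; the figures are copied from the post above, and the variable names are just mine:

```python
# The 3070 FE budget from the post above, as a quick check.
# Figures are the post's own estimates; names are just illustrative.
gpu_peak = 240    # W, 3070 FE peak
cpu_peak = 150    # W, deliberately generous (most gaming CPUs draw far less)
board_ram = 25    # W, motherboard + RAM
drives = 20       # W, a couple of drives
cooling = 20      # W, fans + AIO pump
peripherals = 10  # W

total = gpu_peak + cpu_peak + board_ram + drives + cooling + peripherals
print(f"Worst-case sum: {total} W")             # 465 W
print(f"With 20% margin: {total * 1.2:.0f} W")  # 558 W, still well under 650 W
```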
 
Right now people are somehow convinced they MUST game at 4K and then start complaining the cards are so expensive... do we even logic. Similar ideas apply to quality and FPS targets... everyone has to have high refresh rate ultra it seems, even if the wallet isn't deep enough to get there.
Just imagine if Samsung said: 1280x720 is enough for mobile users; if they want a higher resolution, they should pay more and not complain about higher prices.
Or if Qualcomm said the same thing about processors.
Or if Apple said 4" is enough, and that people shouldn't convince themselves they need a bigger screen, since bigger screens are more expensive and costlier to produce.
Do you see now how poor your point is?
 