
NVIDIA GeForce RTX 3080 Founders Edition

Common as unicorns.
 
"Now available"
[screenshot of the store listing]
 

It was like that already a few seconds after release. I guess no human being ever saw the purchase button. Only bots did.

I was actually waiting, and from "notify me" to "out of stock" was literally a sip of my coffee... I refreshed: "not ready to buy yet". Sipped my coffee, pressed F5: out of stock…


Really not funny from NVIDIA, whatever the reason for it is :(
 
I guess it's a mix of limited supply + miners.
 
Put it this way: if you're willing to pay $700 for a GPU (no matter the generation) yet unwilling to pay more than ~$300 for your monitor, that is ... wait for it ... unwillingness. There have been GPUs perfectly capable of handling >60fps @1440p for years below $700, and there have been 1440p>60Hz monitors at relatively reasonable prices for just as long. And while I'm in no way denying the effect of high refresh rates on perceived visual quality or the gaming experience as a whole, unless one is a die-hard esports gamer I would say 1440p120 beats 1080p240 every time.

I mentioned 1440p@120+ and 2160p@60 because that is most likely why someone was willing to go for a 700€+ card, and actually much more given Turing's pricing.
Further, the reference point was 1080p@144+ with all the jazz (panel type, proper HDR, color space, contrast, backlighting, etc.), which is indeed readily available for ~200-300€.
Add to that a beefy system to max everything at 1080p at those framerates, ~800 to 1000€. All said and done, 1080p goodness for ~1000 to 1500€ (or just a single AIB RTX 2080 Ti).

1440p@120+ and 2160p@60 displays with the same properties run from ~500€ to "the sky is the limit"€; add a system to match, and that's easily an additional 2500€+.
Even with the RTX 2080 Ti, you still had to fiddle with settings to really enjoy those resolutions on such displays in current titles, and even in some from past years.
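To put the gap in one place, here's a quick back-of-the-envelope tally of the figures above; the numbers are this post's rough estimates, not market data:

[CODE]
# Rough build-cost tally using the ballpark figures quoted above (EUR).
# These are forum estimates, not market data.
display_1080p, system_1080p = 250, 900      # ~200-300 display, ~800-1000 system
total_1080p = display_1080p + system_1080p
print(f"1080p setup: ~{total_1080p} EUR")   # ~1150, inside the quoted 1000-1500 range

display_1440p, system_1440p = 500, 2500     # 500 entry-level display, 2500+ system
total_1440p = display_1440p + system_1440p
print(f"1440p/2160p setup: ~{total_1440p} EUR")  # ~3000, roughly 2-3x the 1080p total
[/CODE]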

Pascal didn't have the horsepower, and Turing was horridly overpriced (while also still kinda lacking the performance).
Only Ampere now (and hopefully RDNA2), with a sufficient performance envelope as well as a more reasonable price point, has made it feasible to really jump to 1440p@120+ and 2160p@60 without giving up anything you had at 1080p.

It was just a pricey compromise in the past, a trend that is now changing and will hopefully gain more traction; that's why I questioned the term "unwillingness" in regard to 1080p users.
I totally agree that 1080p is getting rather long in the tooth. Regarding the pace, tech went from 4:3 resolutions to 16:9 720p, then 1080p, with an excursion to ultra-wide and "ultra-ultra-wide".
 
Finally had time to catch up on this review.

It's impressive, in the same way that the AMD Radeon Fury was impressive - it brute-forced its way past the previous generation but at great cost.

The 3080 is overshadowed by several glaring issues for anyone who has been following:
  • It's not the 90% faster that Nvidia claimed. That's barely realistic even in RTX- and DLSS-optimised titles that have been carefully hand-tuned through Nvidia/developer collaboration, and many new games won't get that preferential treatment.
  • 10GB of VRAM. Yeah, it's enough for now, but it's not 'plenty', and it won't necessarily be enough for that much longer.
  • That power draw. Even though the cooler is quiet enough, that heat has to go somewhere. Your poor CPU, RAM, motherboard, drives, and armpits are going to pay for it. Whatever's in the XBSX and PS5 can't possibly be as power hungry, because their designs and power bricks simply aren't enough to handle a 350W GPU (see the rough power-budget sketch after this list).
  • RDNA2 is looming ominously, and this card has failed to meet the overblown claims. Rushed out early to try and maximise sales at the current price before Big Navi lands? Nvidia must be working closely with game devs and insiders who also have hands-on time with the consoles, so regardless of confidentiality agreements, I'm sure someone has been coerced into leaking info to Nvidia about RDNA2 in the upcoming consoles.
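A minimal back-of-the-envelope check on that console point, assuming the publicly reported PSU ratings (PS5 ~350 W and Xbox Series X ~315 W, for the whole console):

[CODE]
# Back-of-the-envelope check on the console power argument above.
# Assumed figures: the PS5's internal PSU is rated ~350 W and the Xbox Series
# X's ~315 W -- and that covers the ENTIRE console (APU, RAM, SSD, fans).
console_psu_watts = 350     # assumption: PS5 PSU rating
gpu_board_watts = 320       # RTX 3080 FE board power from this review

# If the console GPU drew desktop-Ampere power, almost nothing would remain
# for the CPU cores, memory, storage, and cooling:
headroom = console_psu_watts - gpu_board_watts
print(f"Power left for everything else: {headroom} W")  # 30 W

# Essentially all of that electrical power becomes heat in the case:
# at 320 W, the GPU dumps ~320 joules of heat into the room every second.
[/CODE]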
I guess it's a mix of limited supply + miners.

It's resellers.
 
It was like that already a few seconds after release. I guess no human being ever saw the purchase button. Only bots did.

I was actually waiting, and from "notify me" to "out of stock" was literally a sip of my coffee... I refreshed: "not ready to buy yet". Sipped my coffee, pressed F5: out of stock…


Really not funny from NVIDIA, whatever the reason for it is :(
I had some in my cart within the first minute. But by the time the cart loaded, it was already sold out. I was chasing my tail the entire time, lol.

It's too bad we got shafted by a partner, or we would have had one for review. Piiiiiiiiiiissed.
 
More like QUIET important, amirite???

On a more serious note - I don't use headphones either, my PC is under 1m away from my ears, and I never hear it. Fan stop under a certain temperature threshold for GPUs and PSUs is amazing.
I think you're confusing quiet and quite. Damn, sometimes English is so confusing.
I sit closer to my PC, but if I undervolt my hopeless 5700, it's very "quiet".
Yeah, fan stop is great; when I'm not using the GPU, the case is basically silent.
 
I think you're confusing quiet and quite. Damn, sometimes English is so confusing.
I sit closer to my PC, but if I undervolt my hopeless 5700, it's very "quiet".
Yeah, fan stop is great; when I'm not using the GPU, the case is basically silent.
It was a pun.
 
I had some in my cart within the first minute. But by the time the cart loaded, it was already sold out. I was chasing my tail the entire time, lol.

It's too bad we got shafted by a partner, or we would have had one for review. Piiiiiiiiiiissed.

Yeah, can't even access some online shops; they're getting hammered.
 
Yeah, can't even access some online shops; they're getting hammered.

Remember all the AMD fanboys saying NVIDIA is scared of RDNA2? Turns out AMD fanboys aren't the ones killing e-shops trying to buy NVIDIA's latest GPU, whodathunkit?
 
It was worse than I expected, the launch sellout that is. Wow.
 
"I have to applaud NVIDIA for including a 12-pin to 2x 8-pin adapter with every Founders Edition card as it ensures people can start gaming right away without having to hunt for adapter cables of potentially questionable quality."

For all the positive things this card has, I don't think that's something that deserves any kind of applause... Without Nvidia providing an adapter, every single buyer of a reference Ampere card would just have bought a solid, useless brick (and a very expensive one at that), because there would be no cables to connect to it at launch. What would Nvidia expect? For people to buy the card en masse at launch, but then have to wait until they could buy, or ask their PSU brands for, a completely new cable design that serves only Nvidia reference cards? (They'll make them from now on, but mostly for some customers, for PC-build aesthetic reasons.) Therefore, the included custom, non-standard adapter is something that is... "expected" (to say the least) to ship with their own new cards.

The custom 12-pin is a problem they created themselves, exclusively to serve their own reference PCB design decisions... It's not an established (or even recent) standard that PSUs have already shipped with for some time. It's not even something that is essentially and technically needed for the card to work versus standard PCI-E connectors; it's an FE design decision (and nothing that AIBs have any need to jump into). For what it's worth, AIB Ampere cards just proved Nvidia didn't need any of that V-shaped PCB stuff, and by consequence none of that non-standard 12-pin connector... All they needed was a proper, normal cooler design, a normal PCB, and normal PCI-E connectors. But they did it anyway, because they wanted to (and there's nothing wrong with that, really); they just have to provide the means for buyers to connect and use what they're buying.

That's like applauding a CPU cooler brand for including the proprietary bracket / motherboard backplate that serves their own new retention system... You don't applaud them for that; you expect and demand that they include their completely new custom parts in the box, so you can actually... use the product you just bought from them. I mean, if you're designing your own system to make your product "work" in a way you just made up, one that isn't compatible with the standard brackets from AMD/Intel, then why would you ship the product without the critical custom part that is 100% needed for its most basic function (since there's no other way to get it when you launch the cooler)?

(anyway, regardless, great review, I'm just baffled by that part)
 
It was worse than I expected, the launch sellout that is. Wow.
This hobby is only going to get worse. The market's money-to-sense ratio is continually increasing. COVID stimulus money doesn't help the situation either (for those that don't NEED it, mind you).
 
This hobby is only going to get worse. The market's money-to-sense ratio is continually increasing. COVID stimulus money doesn't help the situation either (for those that don't NEED it, mind you).
I think many would just make the money available; that's what I do, bills and food be damned. Can't if they're not available, though.
 
I tried to buy one as soon as they went live; after 10 minutes, all I could hear was


I never even saw a chance to buy it on Nvidia's site.
 
It was like that already a few seconds after release. I guess no human being ever saw the purchase button. Only bots did.

I was actually waiting, and from "notify me" to "out of stock" was literally a sip of my coffee... I refreshed: "not ready to buy yet". Sipped my coffee, pressed F5: out of stock…


Really not funny from NVIDIA, whatever the reason for it is :(
On the bright side, you can now wait that little bit longer and buy the 20GB version when it's released.
At least then you won't feel ripped off.
 
I just knew we here in Gougeland were going to be ripped a BIG FAT new one: $1,469.99 NZD to $1,798 NZD. The Asus TUF is the cheapest and the Asus Strix is the most expensive; every other AIB partner card sits between those two prices. Makes no difference, though: even if you could afford one, tough, they're out of stock.
 
Of course I had noticed it; on Tom's IT I had also pointed it out to everyone. But they still declared 1.5x at the same power draw! See the picture.
Well, where is this 1.5x in performance per watt?
The claim is false; there is no excuse.

[attached: NVIDIA's Ampere performance-per-watt chart]
You are still not reading the chart fully: the 3080 sits at the end of the green line, at 320 W. You extrapolated 1.5x perf/W at the same power as a 2080 Super, but to validate that you would have to lower the 3080's power target and test with Control at 4K.
As with prior architectures, you gain a lot of efficiency just by lowering the power target, even without creating a custom voltage/frequency curve. An RTX 2080 Ti at the same power as a GTX 1080 is faster than a 1080 Ti at 280 W, just to give you a picture; Pascal behaves similarly.
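To make the disagreement concrete, here's a minimal sketch of the two readings; the relative-performance figures are illustrative placeholders, not measurements:

[CODE]
# Two ways to read NVIDIA's perf/W claim. Relative-performance numbers
# below are illustrative placeholders, not benchmark results.
power_2080s, perf_2080s = 250, 1.00    # RTX 2080 Super as the baseline
power_3080,  perf_3080  = 320, 1.55    # 3080 at its full 320 W power target

# Reading 1: perf/W at each card's stock power target.
stock_gain = (perf_3080 / power_3080) / (perf_2080s / power_2080s)
print(f"perf/W gain at stock power: {stock_gain:.2f}x")        # ~1.21x, not 1.5x

# Reading 2 (the chart's): cap the 3080 at the 2080 Super's 250 W, where the
# voltage/frequency curve is far more efficient. Hypothetical capped result:
perf_3080_capped = 1.45
iso_gain = (perf_3080_capped / 250) / (perf_2080s / power_2080s)
print(f"perf/W gain at matched 250 W: {iso_gain:.2f}x")        # ~1.45x
[/CODE]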
 
I just knew we here in Gougeland were going to be ripped a BIG FAT new one: $1,469.99 NZD to $1,798 NZD. The Asus TUF is the cheapest and the Asus Strix is the most expensive; every other AIB partner card sits between those two prices. Makes no difference, though: even if you could afford one, tough, they're out of stock.

You're right! Here in Aussie land the cards are WAY overpriced: $1,300 is the cheapest, and they should be priced more like $900. What a load of BS! It's even more expensive than a brand-new 2080 Ti, lol.
 
Does it all matter?
After all...No one can actually buy the thing.
It's all marketing hype at the moment...and I can guarantee it will never be available for £649 - EVER.
 
It's not bad, but "biggest gen performance leap ever!"????? Nvidia must have forgotten the 7 series to G80. The 8800 GTX was +40-90% in every game @1600x1200, sometimes more than double the FPS. I expected almost 2x the performance, but it came up WAY short. I'm disappointed, given the false hype Nvidia planted with that quoted^ statement.
 