# NVIDIA Cancels GeForce RTX 4080 12GB, To Relaunch it With a Different Name



## btarunr (Oct 14, 2022)

NVIDIA has decided to cancel the November 2022 launch of the GeForce RTX 4080 12 GB. The company will relaunch the card under a different name, though it hasn't announced the replacement name just yet. The naming of the RTX 4080 12 GB caused much controversy. With the RTX 40-series "Ada," NVIDIA debuted three SKUs: the RTX 4090, which is already in stores; the RTX 4080 16 GB; and the RTX 4080 12 GB. Memory size notwithstanding, the RTX 4080 12 GB is a vastly different graphics card from the RTX 4080 16 GB.

The RTX 4080 12 GB and RTX 4080 16 GB don't even share the same silicon. While the 16 GB model is based on the larger "AD103" silicon, with 9,728 CUDA cores and a 256-bit wide GDDR6X memory bus, the RTX 4080 12 GB is based on the smaller "AD104" silicon, with just 7,680 CUDA cores (21% fewer) and a meager 192-bit wide GDDR6X memory bus. This had the potential to confuse buyers, especially given the $900 price. With criticism spanning not just social media but also bad press, NVIDIA decided to pull the plug on the RTX 4080 12 GB. The company will likely re-brand it as a successor to the RTX 3070 Ti, although it will then have a hard time justifying its $900 price tag. The RTX 4080 16 GB, however, is on track for a November 16 availability date, with a baseline price of $1,200.
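For readers who want to check the math, the gap between the two SKUs works out as follows (core counts and bus widths as quoted above; a quick sketch, not official NVIDIA data):

```python
# Spec comparison between the two "RTX 4080" SKUs, using the figures
# quoted in the article above.
ad103_cores = 9728   # RTX 4080 16 GB ("AD103")
ad104_cores = 7680   # RTX 4080 12 GB ("AD104")
ad103_bus = 256      # bits, GDDR6X
ad104_bus = 192      # bits, GDDR6X

core_deficit = 1 - ad104_cores / ad103_cores
bus_deficit = 1 - ad104_bus / ad103_bus

print(f"CUDA core deficit: {core_deficit:.1%}")   # ~21.1% fewer cores
print(f"Memory bus deficit: {bus_deficit:.1%}")   # 25.0% narrower bus
```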





*View at TechPowerUp Main Site* | Source


----------



## P4-630 (Oct 14, 2022)

Great, now what to do with my $900....


----------



## JalleR (Oct 14, 2022)

Just an idea Nvidia...... You could call it the RTX 4080 MX...............................................


----------



## the54thvoid (Oct 14, 2022)

P4-630 said:


> Great, now what to do with my $900....



Buy the same card but now it'll be called a 4070Ti (maybe even just a vanilla 4070).

This shows how much Nvidia are taking the piss. You can see the marketing boardroom with folk giggling about what they were going to release. They knew it was a 4070 in spec but thought, "hey, let's just try and pretend it's another 4080 model, even though it's way inferior."

I don't normally get annoyed at Nvidia (up until now) - they are, after all, a business with shareholders - but when they pull this sort of shit, it's just a symptom of a disconnect with consumer reality.


----------



## Tuvok (Oct 14, 2022)

I think they realised there's very high demand for the 4090 and the same will be true for 4080, so no need to rush the lower end parts.

Those can come next year when AMD GPUs are available, the 3000 series is depleted and they have to rescale prices anyway; they also avoid the embarrassment of launching a 70-class card at $900.


----------



## swirl09 (Oct 14, 2022)

Um... can I make a suggestion? ...

How about, the 4070!!


----------



## cvaldes (Oct 14, 2022)

Or maybe the RTX 4080 WTF?

Jokes aside, my guess is that they'll end up calling it the 4070 Super or Ti (12GB) above a vanilla 4070 (8GB).

This really makes their marketing department look clueless.


----------



## Chrispy_ (Oct 14, 2022)

The reason they didn't want to call it a 4070 is because the price they want for an xx70 card is obscene.

Perhaps now it will be a 4070 after all, but they're going to have to explain why it's $900 instead of $500. If I had to guess, the explanation from Nvidia will be "F*CK YOU ALL, GIVE US YOUR MONEY"


----------



## Vayra86 (Oct 14, 2022)

Oh wow. So there goes any hopes of there being some higher strategic plan we didn't know about.

They really are fucking idiots.... I can't shake the feeling that RT was the trigger for all this bull. Forget inflation, war, shortages or mining. They invested heavily in something that, like many recent tech developments, is way too expensive for normality, and inspired primarily by greed, creating a new artificial problem for GPUs to solve on the fly.

I mean the marketing strategy supports it, the way it was launched supports it, the overall respect to gamers and game content supports it, and the specs of every gen in succession support it, along with their pricing. Things get ugly fast.


----------



## Dirt Chip (Oct 14, 2022)

Big LOL.
Now get your renamed 4080-12GB at the same cost and feel much better about it...

Reminds me of good old Rage Against the Machine - killing in the name of  
Rock on, bitches


----------



## trsttte (Oct 14, 2022)

cvaldes said:


> Or maybe the RTX 4080 WTF?
> 
> Jokes aside, my guess is that they'll end up calling it the 4070 Super or Ti (12GB) above a vanilla 4070 (8GB).
> 
> This really makes their marketing department look clueless.



We're past the point where a 4070 can have 8 GB of VRAM; that would quite simply be DOA. They'll probably just give it the name it should always have had - 4070 - and wait to launch until they can price it lower, after getting rid of all the 30-series cards they still have piled up. 

They can do a 4070 super ti with 4080 dies later on depending how the market evolves (availability, competition, 5000 series timing, etc..)


----------



## Vayra86 (Oct 14, 2022)

Chrispy_ said:


> The reason they didn't want to call it a 4070 is because the price they want for an xx70 card is obscene.
> 
> Perhaps now it will be a 4070 after all, but they're going to have to explain why it's $900 instead of $500. If I had to guess, the explanation from Nvidia will be "F*CK YOU ALL, GIVE US YOUR MONEY"


Oh yeah, I mean in terms of popcorn I have boxes full stored for all the great moments to come.


----------



## Dyatlov A (Oct 14, 2022)

Great news, but it is not even a 4070, it is just a 4060 Ti.


----------



## JalleR (Oct 14, 2022)

Nvidia: but it is not named right READ: 4090 is selling like crazy so we dont need a "cheaper" card yet...


----------



## Dirt Chip (Oct 14, 2022)

Man, EVGA got their crystal ball up and running pretty well - pulled the trigger spot on.
So many 4080-12GB carton boxes with the wrong letters on them, in a snap.
What a shitstorm.


----------



## AnotherReader (Oct 14, 2022)

swirl09 said:


> Um... can I make a suggestion? ...
> 
> How about, the 4070!!


I have a suggestion: 4060 Ti. The 4080 16 GB should be the 4070 Ti.


----------



## AnarchoPrimitiv (Oct 14, 2022)

Wow....look how much influence just bad press had... imagine how much the GPU market would improve for consumers if they actually had the willpower to follow through with their threats and abstain from buying Nvidia's products.....Nvidia just might be forced to actually lower its prices


----------



## Tuvok (Oct 14, 2022)

Launching Ti/Super versions before or alongside vanilla is stupid in itself


----------



## Niceumemu (Oct 14, 2022)

JalleR said:


> Nvidia: but it is not named right READ: 4090 is selling like crazy so we dont need a "cheaper" card yet...


Except it isn't selling well, microcenter had a few stores where stock didn't sell out on first day and most stores still have stock right now


----------



## AnotherReader (Oct 14, 2022)

Niceumemu said:


> Except it isn't selling well, microcenter had a few stores where stock didn't sell out on first day and most stores still have stock right now


Strange. It's the only reasonably priced card in the Ada stack.


----------



## Tuvok (Oct 14, 2022)

Niceumemu said:


> Except it isn't selling well, microcenter had a few stores where stock didn't sell out on first day and most stores still have stock right now


they have stock because they made boatloads of them, stop applying shortage-time logic

also, current stock pricing is inflated because of shops trying to capitalize on fomo


----------



## Vayra86 (Oct 14, 2022)

AnotherReader said:


> Strange. It's the only reasonably priced card in the Ada stack.


Lol we call this a 'stack'?  Two slices of bread and a burger... no cheese or bacon or lettuce, forget about sauce... and the bottom slice of bread just went missing.


----------



## Nanochip (Oct 14, 2022)

Good. There’s no reason to have an unforced error and deliberately deplete good will amongst gamers. The 4090 is a beast. Rdna3, if as powerful as rumors claim, would have eaten the 4080 12 GB, I mean the 4070, for lunch. Hopefully nvidia drops the price of the 4070 too.

What they should do is call the 4080 12 GB the 4060, the 16 GB the 4070, release the real 4080 with 16 or 20 GB in between the current 4080 16 GB and the 4090. Price the new 4080 at $900 max, the 4070 at 700 and the 4060 at 500. That would be a strong offense against the Radeon onslaught that’s coming.


----------



## Vayra86 (Oct 14, 2022)

Nanochip said:


> Good. There’s no reason to have an unforced error and deliberately deplete good will amongst gamers. The 4090 is a beast. Rdna3, if as powerful as rumors claim, would have eaten the 4080 12 GB, I mean the 4070, for lunch. Hopefully nvidia drops the price of the 4070 too.


Pfeww. You just made a very interesting point. AMD said itself I believe, top end performance isn't going to reach 4090 levels. Perhaps this also influenced Nvidia's move to lower this GPU down the stack in the future. Now AMD can't compete with any x80 with a bigger part of the stack.


----------



## Nanochip (Oct 14, 2022)

Vayra86 said:


> Pfeww. You just made a very interesting point. AMD said itself I believe, top end performance isn't going to reach 4090 levels. Perhaps this also influenced Nvidia's move to lower this GPU down the stack in the future. Now AMD can't compete with any x80 with a bigger part of the stack.


We’ll have to wait and see. Even if top RDNA3 isn't as powerful in ray tracing, if it's in roughly the same ballpark in rasterization at 4K and 1440p while consuming less power and costing less, it would sell very well and earn good will with gamers, many of whom groaned about the 4090’s price.


----------



## Darmok N Jalad (Oct 14, 2022)

Maybe this was a PR stunt. Now they can say they listen to their customers’ concerns and do something about it!


----------



## Gargf (Oct 14, 2022)

RTX 4080 _-10_ ?
RTX 4080 _Almost_ ?
RTX 4080 _But Not Quite_ ?
RTX 4080 _But Actually 4070_ ?
RTX 4080 _Ark ?_


----------



## john_ (Oct 14, 2022)

Probably too many 3090 Tis still on the market to come out with a 4000-series card at $900.

I guess we can expect a price drop on the 4090 and 4080 16GB in a few months, and the 4080 12GB coming out with its original naming, as RTX 4070, at $100 less.

So, speculating in 6 months from now
4090 Ti at $1600
4090 at $1200
4080 16GB at $1000
4080 12GB as 4070 at $800.


----------



## ZetZet (Oct 14, 2022)

john_ said:


> Probably too many 3090 Tis still in the market to come out with a 4000 series card at $900.
> 
> I guess we can expect a price drop of 4090 and 4080 16GB prices in a few months and 4080 12GB coming out with it's original naming, as RTX 4070 and at $100 less.
> 
> ...


It all depends on AMD pricing, I think. The 4080 16GB is going to be their main competitor it seems, so they might try to undercut it by a lot right away and Nvidia will drop the price sooner. I think $1000 in 6 months is super generous.


----------



## MxPhenom 216 (Oct 14, 2022)

AnarchoPrimitiv said:


> Wow....look how much influence just bad press had... imagine how much the GPU market would improve for consumers if they actually had the willpower to follow through with their threats and abstain from buying Nvidia's products.....Nvidia just might be forced to actually lower its prices


Only if AMD comes in with better cards with undercut prices, but I doubt that now.


----------



## Niceumemu (Oct 14, 2022)

Tuvok said:


> they have stock because they made boatloads of them, stop applying shortage-time logic
> 
> also, current stock pricing is inflated because of shops trying to capitalize on fomo


There was less stock than the 30 series launch so no, they didn't have boatloads of them


----------



## Daven (Oct 14, 2022)

AnarchoPrimitiv said:


> Wow....look how much influence just bad press had... imagine how much the GPU market would improve for consumers if they actually had the willpower to follow through with their threats and abstain from buying Nvidia's products.....Nvidia just might be forced to actually lower its prices


Most PC enthusiasts just reflexively buy Nvidia and cannot bring themselves to buy anything else. It’s a condition known as gullible.


----------



## Fluffmeister (Oct 14, 2022)

Yeah zero reason to play their hand yet, launch the 16GB 4080 then wait for AMD to show theirs, insert said card as required.


----------



## Rahmat Sofyan (Oct 14, 2022)

NVIDIA: The Way It's Meant To Be "Played" . . .​


----------



## Hxx (Oct 14, 2022)

So stupid, why did they have to postpone the launch? For a fucking name? BS, I don’t believe that. They can’t come up with one by Nov 16? The issue now is we probably won’t get a sub-$1k card this year.


----------



## xorbe (Oct 14, 2022)

Someone pointed out that the estimated performance/hardware relative to the 4090 would actually put it around the X060Ti mark historically.  They'll probably angle for 4070Ti though (not even 4070).


----------



## cvaldes (Oct 14, 2022)

trsttte said:


> We're past the point where a 4070 can have 8gb of vram, that would be doa quite simply. They'll probably just name what it should always have been named - 4070 - and wait to launch until they can price it lower after getting rid of all the 30 series cards they still have pilled up.
> 
> They can do a 4070 super ti with 4080 dies later on depending how the market evolves (availability, competition, 5000 series timing, etc..)



As I pointed out in another discussion thread, it makes zero sense trying to compare model numbers between GeForce generations.

This announcement pounds that concept home with brutal clarity.

Prospective customers simply need to assess the various models in the product generation and decide whether or not each SKU's features are worthy of the price being asked while disregarding the model number printed on the card and the retail packaging.

NVIDIA changes their idea of what each model number represents. Essentially, a --70 card is simply a product between a --60 model and an --80 model from the same generation's product stack. Comparing the 1070, 2070, 3070, and 4070 lacks relevancy because of NVIDIA's constant reinterpretation of their model numbers.


----------



## ModEl4 (Oct 14, 2022)

Vayra86 said:


> Pfeww. You just made a very interesting point. AMD said itself I believe, top end performance isn't going to reach 4090 levels. Perhaps this also influenced Nvidia's move to lower this GPU down the stack in the future. *Now AMD can't compete with any x80 with a bigger part of the stack.*


So you think the cut-down Navi31-based model won't be able to compete in raster with the RTX 4080 16GB?
I'm not debating, just asking.
Potentially it could mean bad news for Navi31-based parts' pricing, since Nvidia this year will start at $1200 instead of $900 for the new gen (unless they rename it 4070Ti, for example, and still launch this year since the chips are ready, at $899 or even lower since it's a 70-series henceforth, making it good news).


----------



## jesdals (Oct 14, 2022)

Wonder if the 12GB 4080 is going to be a collector's item?






*MSI GeForce RTX 4080 SUPRIM X - 12GB GDDR6X RAM - Graphics card | www.proshop.dk*

10,690.00 kr. Graphics card, NVIDIA GeForce RTX 4080 Overclocked, 7680 CUDA cores, 16 GB GDDR6X (memory clock 22.6 GHz), 256-bit, PCI-Express 4.0 x16, 3 x DisplayPort 1.4 / 1 x HDMI 2.1 connectors, supports NVIDIA G-Sync, 1 x 16-pin (1 x...


----------



## MxPhenom 216 (Oct 14, 2022)

Fluffmeister said:


> Yeah zero reason to play their hand yet, launch the 16GB 4080 then wait for AMD to show theirs, insert said card as required.


And shift pricing as needed....one could hope


----------



## cvaldes (Oct 14, 2022)

jesdals said:


> Wonder if 12gb 4080 is going to be collectors items?



A sealed retail box might be worth something in 20 or 30 years, but probably not as much as this, since the RTX 4080 12GB isn't a groundbreaking invention like the original iPhone:

*If you kept an original iPhone in the box, it might be worth $30,000 | AppleInsider* (appleinsider.com)
A 2007 iPhone sealed in its original packaging is up for auction, and is expected to sell for $30,000 or more.

Plus, if NVIDIA releases the exact card under a different model number, the retail packaging mostly becomes an oddity.

You could buy one in hopes that it becomes a collector's item; the problem is you have to find the collector who will buy it at top dollar before they buy it from someone else at a lower price. And if there aren't many collectors, the price will stagnate.

It's the same free market concept as brand new cards intended to be operated. If demand outpaces supply, prices will go up. We're seeing that with the 4090 models as we type this.


----------



## Dristun (Oct 14, 2022)

Lmao. Nobody fell for the trick, so they decided to try again later. Also saves them the embarrassment of launching a new xx80 card that barely beats the old 80 Ti (looking at the core count vs the 4090, I imagine in some games it's gonna be really close)


----------



## sephiroth117 (Oct 14, 2022)

Daven said:


> Most PC enthusiasts just reflexively buy Nvidia and cannot bring themslves to buy anything else. It’s a condition known as gullible.


AMD needs to make their drivers more stable. I tried an RX 580 and had driver issues; it was BSODing even when just surfing. Moved to an RTX 2070, never looked back since.

I don't know if they've improved drastically with time; tbh I am looking seriously at RDNA3 too, in addition to the 4080, for my next upgrade... but if picking the GPU with the most stable drivers is gullible, then I am gullible. Nothing instinctive about picking Nvidia... I had two AMD GPUs, an ATI 4870 and an RX 580, and both were driver hell; at one point you get tired of DDU, troubleshooting, etc.


----------



## Vayra86 (Oct 14, 2022)

ModEl4 said:


> So you think the cut down Navi31 die based model won't be able to compete in raster with RTX 4080 16GB?
> I'm not debating, just asking.
> Potentially it could mean bad news regarding Navi31 based parts pricing since Nvidia this year will start at $1200 instead of $900 for the new gen.(unless they rename it 4070Ti for example and still launch this year since the chips are ready at $899 or even lower since its 70s series  hencefort, making it good news)


No no, I'm saying they can't position their entire stack as competitive against everything Nvidia has. It now might seem you need a higher AMD number to beat an Nvidia number. Pure marketing.


----------



## Dirt Chip (Oct 14, 2022)

Here's the spin: 4090 sales are too good, so NV sees it can sell many more of them, so there's no need to hurry with the 4080s; but you can't cancel both products without being too suspicious. So they go 'smart' and cancel the least profitable one (the 4080-12GB) with the reason that it is "confusing", as if no one in all of NV could have thought about it earlier (a company with "unlimited" resources..).

Bonus #1: More time to dump the 3xxx GPUs on the market.
Bonus #2: "Hi, we hear you, consumers and media, and react to your will immediately" (this is what you are supposed to believe, at least)
Bonus #3: A free "upgrade" to the x7xx-tier price: it will probably drop the 4080-12GB name but keep the very same cost.
Bonus #4: Confuse AMD a bit.
(Big) Bonus #5: Hail of free PR from media and forums, 'cause there's no such thing as bad publicity.

Next up, conspiracy theories.
Stay tuned and order much more of that popcorn.

What a great lunch we have!


----------



## wheresmycar (Oct 14, 2022)

postponing launch??? aghhhhh!

I bet it's not just the criticism forcing the name pull, but the performance and price, which just don't stack up. Now that people are better aware (like me) that DLSS 3 is just a marketing gimmick (for many), it's about time nV got back to the drawing board to justify smaller performance gains as opposed to the over-hyped colossal leaps. A reality check... and hopefully more to come!

Shame on you Nvidia... dirty illusions like these will only push me towards AMD regardless of the compromise... that's coming from a to-date all-time and every-time NVIDIA buyer (only foolish business practices push away a healthy retained customer base)


----------



## ZetZet (Oct 14, 2022)

Dirt Chip said:


> Here's is the spin: 4090 sales are too good


Dunno, looking at ebay auctions going on it doesn't seem like 4090 is selling like hot cakes. Scalpers definitely scalped them, but people are not raising the prices much beyond MSRP.


----------



## windwhirl (Oct 14, 2022)

The so-called 4080 12 GB card is a 4060 Ti. Screw whatever spin Nvidia wants to put on it.


----------



## ZetZet (Oct 14, 2022)

windwhirl said:


> The so-called 4080 12 GB card is a 4060 Ti. Screw whatever spin Nvidia wants to put on it.


It beats a 3090 and a 3080. 4070 would have been definitely justified. If they launched it with that name right away.


----------



## agatong55 (Oct 14, 2022)

Dont worry they will announce the new 4070 ti 12 gig version tomorrow.


----------



## windwhirl (Oct 14, 2022)

ZetZet said:


> It beats a 3090 and a 3080. 4070 would have been definitely justified. If they launched it with that name right away.


I'm thinking percentage of active CUDA cores compared to the fully maxed out chip, so the 4090 as of right now.


----------



## cvaldes (Oct 14, 2022)

agatong55 said:


> Dont worry they will announce the new 4070 ti 12 gig version tomorrow.



LOL, it's Saturday tomorrow.

Maybe Monday.


----------



## ZetZet (Oct 14, 2022)

windwhirl said:


> I'm thinking percentage of active CUDA cores compared to the fully maxed out chip, so the 4090 as of right now.


Eh, 1070 vs 1080 ti was roughly half the CUDA cores too. I think gen to gen performance is what matters most for naming. Next-gen 70 card beating previous gen 80 card is justifiable and it's also easy to understand for consumers.
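The "roughly half" comparison checks out, and the 4080 12GB actually sits a notch lower relative to the 4090 (core counts taken from public spec listings, so treat them as assumptions):

```python
# Enabled CUDA cores of the lower card as a fraction of the higher one
# (counts from public spec listings; a sketch, not official data).
pairs = {
    "GTX 1070 vs GTX 1080 Ti": (1920, 3584),
    "RTX 4080 12GB vs RTX 4090": (7680, 16384),
}
for name, (small, big) in pairs.items():
    print(f"{name}: {small / big:.3f}")
# GTX 1070 vs GTX 1080 Ti: 0.536   (roughly half, as said above)
# RTX 4080 12GB vs RTX 4090: 0.469
```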


----------



## Easo (Oct 14, 2022)

I guess it is a small win. But the most important part - price - doesn't change, so they may as well not have bothered...


----------



## mahirzukic2 (Oct 14, 2022)

P4-630 said:


> Great, now what to do with my $900....


After all the pushback, they caved in and decided to rename the thing.
After all that you want a reduced price? Renaming was not good enough for you? What's next? Why are you so unreasonable to Nvidia?
/s


----------



## N/A (Oct 14, 2022)

Nvidia will attempt to outsmart you all and be original.


----------



## mahirzukic2 (Oct 14, 2022)

AnarchoPrimitiv said:


> Wow....look how much influence just bad press had... imagine how much the GPU market would improve for consumers if they actually had the willpower to follow through with their threats and abstain from buying Nvidia's products.....Nvidia just might be forced to actually lower its prices


Tell it to the plebs.



Darmok N Jalad said:


> Maybe this was a PR stunt. Now they can say they listen to their customers’ concerns and do something about it!


I highly doubt it. They don't need that kind of stunt. This is stupidity from the marketing department.


----------



## ARF (Oct 14, 2022)

sephiroth117 said:


> AMD needs to make their drivers more stable, tried a RX 580, had drivers issues even when surfing it was BSODing. Passed to a RTX2070, never looked back since.




Speaking of the drivers, I think this is quite an extreme and wrong opinion.
Imagine the claim "AMD's drivers don't work" were true: do you think AMD would be able to sell *any* cards?
If AMD sells cards, and if someone like me can say that AMD's drivers are as good as they can be, then you are simply mistaken in your judgement.

lol, I am not surprised that you are happier with an RTX 2070 over the RX 580. That is an 80% performance improvement 








TSMC and its suppliers (ASML Netherlands?) are to blame for completely ruining the usual business initiatives.

Normally, AMD could launch *first* a pipe cleaner on the newest process.

nvidia launches the largest and worst thing as a pipe cleaner - it's the largest die.


nvidia made a mistake. They should cancel all large die projects and concentrate on the following:

AD102 - a 350 sq. mm die powering something like RTX 4080 Ultra;
AD103 - a 200 sq. mm die powering RTX 4060 and RTX 4070 depending on the binning and yields;
AD104 - a 100 sq. mm die powering RTX 4040 and RTX 4050 ===//===;
AD105 - a 50 sq. mm die powering RTX 4010 and RTX 4030 ===//===.

Starting, of course, with the smallest die as a pipe cleaner on the new 4N TSMC process.


----------



## mb194dc (Oct 14, 2022)

Niceumemu said:


> Except it isn't selling well, microcenter had a few stores where stock didn't sell out on first day and most stores still have stock right now



It's probably brand protection. To stop backlash in reviews and generally.

Embarrassing to have what was obviously the 4070 as the 4080 12GB.


----------



## Hofnaerrchen (Oct 14, 2022)

Based on the pricing of RTX 4000 GPUs in Euroland it will need a quite hefty price reduction if they want to sell it even as a "RTX 4070ti Super" right from the start.

If it was just for renaming the GPU they should have been able to keep the release date (printing new boxes does not take ages and for the FE you don't have to do much about the cooler design) but I doubt it will now only be a renaming and repricing, specs will probably also get adjusted (no, not improved^^).


----------



## ModEl4 (Oct 14, 2022)

Dirt Chip said:


> Bonus #4: Confuse AMD a bit.


AMD has the RDNA3 announcement in 20 days.
At least before, they knew Nvidia's performance level (more or less) and pricing, so they could announce competitive pricing more securely.
If they announce pricing based on previous information, there is a chance Nvidia responds after the 3rd of November with more competitive pricing for an AD104-based GPU (but even in this optimistic scenario it's hard to imagine lower than a $799 SRP, and after all, Nvidia's announcement was missing any commentary on price)

"The RTX 4080 12GB is a fantastic graphics card, but it’s not named *and priced* right. Having two GPUs with the 4080 designation is confusing.
So, we’re pressing the “unlaunch” button on the 4080 12GB. The RTX 4080 16GB is amazing and on track to delight gamers everywhere on November 16th.
If the lines around the block and enthusiasm for the 4090 is any indication, the reception for the 4080 will be awesome."

I wonder how far off the lower-end models are.


----------



## ARF (Oct 14, 2022)

ModEl4 said:


> "The RTX 4080 12GB is a fantastic graphics card, but it’s not named *and priced* right. Having two GPUs with the 4080 designation is confusing.



Not only confusing, it is a serious provocation for global class-action lawsuits against nvidia.


----------



## ModEl4 (Oct 14, 2022)

ARF said:


> Not only confusing, it is a serious provocation for global class-action lawsuits against nvidia.


Did you spot the RTX 4050 in the photo?


----------



## BenchAndGames (Oct 14, 2022)

ModEl4 said:


> Did you spot the RTX 4050 in the photo?


That's just a box, an empty box; you did not spot anything.
Anyone can make a box like this, even yourself.


----------



## ModEl4 (Oct 14, 2022)

BenchAndGames said:


> Thats just a box, an empty box, you did not spot anything.
> Anyone can make a box like this, even youself


Probably!


----------



## medi01 (Oct 14, 2022)

What is the expected performance of 12GB 4080 if 100% is 4090?


----------



## Zareek (Oct 14, 2022)

ARF said:


> Speaking of the drivers, I think this is quite an extreme and wrong opinion.
> Imagine that the claim "AMD's driver doesn't work" was true, how do you think, would AMD be able to sell *any* card or not?
> If AMD sells cards and if someone like me can say that AMD's drivers are as good as it can be, then you simply are mistaken in your judgement.
> 
> lol, I am not surprised that you are happier with an RTX 2070 over the RX 580. That is 80% performance improvement


I concur, I ran a Vega 64 for three and a half years and never had a single BSOD. I think I crashed to desktop once during that same time period. The drivers may be a little flaky at launch but within a few months they get it right, just like Nvidia. The early adopters pay the price.


----------



## wheresmycar (Oct 14, 2022)

ModEl4 said:


> Did you spot the RTX 4050 in the photo?



Yep, that's the renaming scheme finalised for the 4080 12GB: 4050s for $900, and then we should see another tier down (maybe a 4020) for non-gaming low-profile graphics cards with one video output for a single display + a thumb-size heatsink for $699 (if we're lucky). As long as it produces fake frames, that's pretty good value if you ask me.


----------



## xorbe (Oct 14, 2022)

They should have simply "re-launched" the 4080 16GB as the 4085, all problems solved.


----------



## ARF (Oct 14, 2022)

wheresmycar said:


> Yep thats the renaming scheme finalised for the 4080 12GB. 4050's for $900 and then we should see another tier down (maybe a 4020) for non-gaming low-profile graphic cards with one video output for a single display + a thumb size heatsink for $699 (if we're lucky).



With the Ada Lovelace architecture and 1000 shaders, for instance, it would actually be a low-profile, entry-level e-sports gaming card, as cheap, cool and quiet as it can be.
Of course, with HDMI 2.1 support, DisplayPort 2.1 support and AV1 decode/encode, so that the card can run *any* YouTube video autonomously from the CPU.


----------



## Wolfkin (Oct 14, 2022)

Niceumemu said:


> Except it isn't selling well, microcenter had a few stores where stock didn't sell out on first day and most stores still have stock right now


Very much depends on allocation. Here in Europe, where most cards will lighten your wallet by $2400-2600 once all taxes and currency conversion are done, they sold out within 10-15 minutes in most countries, and to add insult to injury, Nvidia can't be bothered to sell the FE in most countries.


----------



## mahirzukic2 (Oct 14, 2022)

ModEl4 said:


> Did you spot the RTX 4050 in the photo?


I did. Why is its box so huge?


----------



## ModEl4 (Oct 14, 2022)

wheresmycar said:


> Yep thats the renaming scheme finalised for the 4080 12GB.


If only...
Anyway, the 4050 should be 3 quarters away, just like the Wccftech article that I linked suggested!


----------



## AnotherReader (Oct 14, 2022)

cvaldes said:


> As I pointed out in another discussion thread, it makes zero sense trying to compare model numbers between GeForce generations.
> 
> This announcement pounds that concept home with brutal clarity.
> 
> ...


Not completely; the 1070 and 2070 are exactly half of the full flagship die in their generation, and the 3070's die, GA104, is 57% of the 3090 Ti's die: GA102. This holds true as far back as the 770.
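Those ratios can be reproduced quickly (core counts from public spec listings, so treat them as assumptions; the 1070/2070 lines compare enabled cores against the full flagship die, the last line compares full dies):

```python
# x70-class positioning relative to the flagship die of each generation
# (a sketch using publicly listed CUDA core counts, not official data).
ratios = {
    "GTX 1070 (enabled) / full GP102": 1920 / 3840,
    "RTX 2070 (enabled) / full TU102": 2304 / 4608,
    "GA104 (full) / GA102 (full)": 6144 / 10752,
}
for name, r in ratios.items():
    print(f"{name}: {r:.0%}")
# GTX 1070 (enabled) / full GP102: 50%
# RTX 2070 (enabled) / full TU102: 50%
# GA104 (full) / GA102 (full): 57%
```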


----------



## ARF (Oct 14, 2022)

Wolfkin said:


> Very much depends on allocation, here in Europe where most cards will lighten your wallet by $2400-2600, when all taxes and currency conversion is done, sold out within 10-15 minutes in most countries, and to add insult Nvidia can't be bothered to sell the FE in most countries.



If you wish, from a German retailer for €2,186: NVIDIA GeForce RTX 4090 Grafikkarte (2022) Preisvergleich | Günstig bei idealo kaufen



mahirzukic2 said:


> I did. Why is it's box so huge?



I guess it is photoshopped.


----------



## N3M3515 (Oct 14, 2022)

ZetZet said:


> It beats a 3090 and a 3080. 4070 would have been definitely justified. If they launched it with that name right away.


The highest they can charge for a 4070 is $600, even accounting for inflation. The 3070 beat the 2080 Ti for less than half the price.
The 3080 was damn near the 3090 (within 13%) for less than half the price; now the 4080 is FARTHER in performance BUT CLOSER in price (how anyone can wish to buy this shit is unknown to me). The 4080 at most should cost 800 - 850 accounting for inflation.

If AMD can match the 4080 with the 7800 XT for $750 (more like $850), that would be awesome....


----------



## trsttte (Oct 14, 2022)

cvaldes said:


> As I pointed out in another discussion thread, it makes zero sense trying to compare model numbers between GeForce generations.
> 
> This announcement pounds that concept home with brutal clarity.
> 
> ...



My answer has nothing to do with older model numbers; it's simply that the card below the (real) 4080 can't have just 8 GB of VRAM, much less when the 4080 gets 16 GB.

Nvidia continues to push their RTX narrative, along with DLSS and 4K gaming; 8 GB is simply not enough on a high-end card. The 3070 with 8 GB and even the 3080 with 10 GB were already starved for VRAM, and the 4070 would be even more so.


----------



## ModEl4 (Oct 14, 2022)

mahirzukic2 said:


> I did. Why is its box so huge?


It seems Galax is taking the GB RX 6600 Eagle approach? (But it's probably just a trial placeholder box used for the show.)


----------



## R0H1T (Oct 14, 2022)

Fool me once shame on you, fool me twice shame on you, fool me thrice ~ ST eff up & take my mortgage JHH 




Nvidia diehards probably right now


----------



## cvaldes (Oct 14, 2022)

trsttte said:


> My answer has nothing to do with older model numbers; it's simply that the card below the (real) 4080 can't have just 8 GB of VRAM, much less when the 4080 gets 16 GB. Nvidia continues to push their RTX narrative, along with DLSS and 4K gaming; 8 GB is simply not enough on a high-end card. The 3070 with 8 GB and even the 3080 with 10 GB were already starved for VRAM, and the 4070 would be even more so.



I never said that releasing a 4070 with 8GB VRAM is best. However I think that's what NVIDIA will do.

Two different concepts.

I admire NVIDIA for their engineering, both their hardware and their software (and I'm not even a developer).

I have a separate opinion about how they run their business; however, I choose not to provide any more details today. I've already said enough about the topic this week.


----------



## ARF (Oct 14, 2022)

cvaldes said:


> I never said that releasing a 4070 with 8GB VRAM is best. However I think that's what NVIDIA will do.
> 
> Two different concepts.



They will have some explaining to do, because 8 GB is a limitation and a deal-breaker for potential buyers who think for themselves.
This would be like launching the RTX 3080 with 8 GB, when even 10 GB led to cheating by Nvidia, lowering texture resolution and image quality to maintain framerates.


----------



## Wolfkin (Oct 14, 2022)

ARF said:


> If you wish, from a German retailer for euro 2186. NVIDIA GeForce RTX 4090 Grafikkarte (2022) Preisvergleich | Günstig bei idealo kaufen


Thank you, but I have no need for a 4090 at the moment, and I very much want to see what AMD brings before I make a decision.
I'm planning to step up to a 4K monitor come Christmas, but for now, on 1440p, my 3090 is more than enough.


----------



## AnotherReader (Oct 14, 2022)

medi01 said:


> What is the expected performance of 12GB 4080 if 100% is 4090?


The 12 GB 4080 is as cut down relative to the 4090 as a 3060 Ti is to a 3090: so about 60% at 4K, judging from the 3060 Ti Founders Edition review.
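The "as cut down" framing can be sanity-checked from the public CUDA core counts of the four cards. A minimal sketch (core count is only a rough proxy for performance, not a measurement):

```python
# How cut down each card is relative to its generation's flagship,
# using CUDA core counts from the public spec sheets.
cards = {
    "RTX 4080 12GB vs RTX 4090": 7680 / 16384,
    "RTX 3060 Ti vs RTX 3090": 4864 / 10496,
}
for pair, ratio in cards.items():
    print(f"{pair}: {ratio:.0%} of the flagship's shaders")
```

Both ratios land around 46-47% of the flagship's shader count, which is what makes the two pairings comparable.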


----------



## cvaldes (Oct 14, 2022)

ARF said:


> They will have some explaining to do, because 8 GB is a limitation and a deal-breaker for potential buyers who think for themselves.
> This would be like launching the RTX 3080 with 8 GB, when even 10 GB led to cheating by Nvidia, lowering texture resolution and image quality to maintain framerates.



Well, NVIDIA sold a ton of 3080 10GB cards before they released the 3080 Ti 12GB and eventual 3080 12GB cards.

I have a 3060 Ti 8GB card, and at 1440p I don't recall running into VRAM limits apart from maybe one or two titles. 8GB of VRAM is a factor for 4K gaming.

Clearly they could debut a 4070 8GB card and later release a 10GB or 12GB variant. It's not like NVIDIA is forced to pick a specific VRAM amount for any given GPU. And consumers will likely buy whatever is offered regardless.


----------



## Aretak (Oct 14, 2022)

sephiroth117 said:


> AMD needs to make their drivers more stable. I tried an RX 580 and had driver issues; it was BSODing even when just browsing. Moved on to an RTX 2070 and never looked back.


I bought an Nvidia card once and my dog caught on fire and my wife left me and I had some minor artifacting in World of Warcraft. Never again.


----------



## Kissamies (Oct 14, 2022)

Nvidia listened to customers for once? This is incredible.



cvaldes said:


> So clearly they could debut a 4070 8GB card and later release a 10GB or 12GB variant.


The 10GB variant would then come with a 160-bit memory bus. Having a memory bus that narrow on an xx70 card would be beyond insane.


----------



## ARF (Oct 14, 2022)

cvaldes said:


> Well, NVIDIA sold a ton of 3080 10GB cards before they released the 3080 Ti 12GB and eventual 3080 12GB cards.
> 
> So clearly they could debut a 4070 8GB card and later release a 10GB or 12GB variant. It's not like NVIDIA is forced to pick a specific VRAM amount for any given GPU. And consumers will likely buy whatever is offered regardless.



I know, and it's bizarre. Next they can try a 4070 4 GB. I guess it will also sell a ton.


----------



## cvaldes (Oct 14, 2022)

Lenne said:


> The 10GB variant would then come with a 160-bit memory bus. Having a memory bus that narrow on an xx70 card would be beyond insane.



I wouldn't put anything past them. After all, they were ready to launch this 4080 12GB card at an abominable price.


----------



## Kissamies (Oct 14, 2022)

cvaldes said:


> I wouldn't put anything past them. After all, they were ready to launch this 4080 12GB card at an abominable price.


True. But a card with a 160-bit bus sold at a high-end price is beyond any sense. On the other hand, it's Ngreedia...


----------



## medi01 (Oct 14, 2022)

AnotherReader said:


> The 12 GB 4080 is as cut down compared to a 4090 as a 3060 Ti is to a 3090: so about 60% at 4K judging from the 3060 Ti founder edition review.


Ouch.

The "it's because the 4090 sells so well" comments are even more peculiar. Which GPU flagship wasn't sold out in the first weeks after release?


----------



## Sisyphus (Oct 14, 2022)

A product name change is without any importance. The price depends on demand and competition. AMD is late; nVidia has about six months to cash in with its AD102 and AD103. Then we will see what prices AMD asks for GPUs with comparable performance. A secondary impact will come from the recession, together with inflation. It is difficult to forecast the impact on the chip market and pricing.


----------



## AnotherReader (Oct 14, 2022)

Sisyphus said:


> A product name change is without any importance. The price depends on demand and competition. AMD is late; nVidia has at least 6 months to cash in with its AD102 and AD103. Then we will see what prices AMD asks for GPUs with about the same performance. A secondary impact will come from the recession, together with inflation. It is difficult to forecast the impact on the chip market and pricing.


Names have meaning. For a long time, Nvidia's top tier GPU was the x80. It changed to x80 Ti when AMD surprised them with Hawaii, and remained so until Ampere. I would also like to know your source regarding a delay in RDNA3.


----------



## medi01 (Oct 14, 2022)

Sisyphus said:


> A product name change is without any importance.


Wrong.

Per NV's own research (it was in their slides), they were shocked to discover that customers stick with a series (e.g. 970 => 1070 => 2070) rather than sticking with a price bracket.

The 4080 losing its "80" would be a major hit.


----------



## cvaldes (Oct 14, 2022)

Lenne said:


> True. But a card with 160-bit bus sold with a high-end price is beyond any sense. On the other hand, it's Ngreedia..



Due to the relationship between VRAM capacity and memory bus width, the 40 Series has created some tricky choices.

Most consumers will just look at the amount of VRAM printed on the retail box and figure that 12GB is better than 10GB, even if the bus width is narrower on the former.
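The coupling being referred to: with one 32-bit channel per GDDR6X memory IC, the bus width fixes how many ICs a card carries, and the per-IC density (the standard parts are 1 GB and 2 GB) then fixes the capacity options. A minimal sketch of that constraint:

```python
# One 32-bit channel per GDDR6X IC: bus width fixes the IC count,
# and IC density (1 GB or 2 GB per chip) fixes the capacity options.
def capacity_options(bus_width_bits):
    num_ics = bus_width_bits // 32
    return {f"{d} GB ICs": num_ics * d for d in (1, 2)}

for bus in (160, 192, 256):
    print(f"{bus}-bit:", capacity_options(bus))
# 160-bit -> 5 or 10 GB; 192-bit -> 6 or 12 GB; 256-bit -> 8 or 16 GB
```

This is why a hypothetical 10GB card in this lineup implies a 160-bit bus, and why the 12GB card sits on 192 bits.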


----------



## AnotherReader (Oct 14, 2022)

ZetZet said:


> It beats a 3090 and a 3080. A 4070 name would definitely have been justified if they had launched it with that name right away.


We'll see if it beats a 3090. In Nvidia's own benchmarks, in at least one game, it's closer to the 3080 than to the 3090 Ti. The grey bars are the native rendering numbers, i.e. sans DLSS.


----------



## pavle (Oct 14, 2022)

Maybe, just maybe, the fellow in the pictures a few posts back just wanted to give people something to chew on so they wouldn't notice the real problem: the co-existence of the RTX 30xx and 40xx series on the market, with the latter at a much higher price, just so nvidiots can prop up their stocks?
The nvidia excrement show continues...


----------



## ZetZet (Oct 14, 2022)

AnotherReader said:


> We'll see if it beats a 3090. In Nvidia's own benchmarks, in at least one game, it's closer to the 3080 than to the 3090 Ti. The grey bars are the native rendering numbers, i.e. sans DLSS.


Okay, maybe it doesn't, but it's still in the same performance bracket. Of course, the $900 price is insane if it doesn't beat the 3090...


----------



## Sisyphus (Oct 14, 2022)

AnotherReader said:


> Names have meaning. For a long time, Nvidia's top tier GPU was the x80. It changed to x80 Ti when AMD surprised them with Hawaii, and remained so until Ampere. I would also like to know your source regarding a delay in RDNA3.


Names are PR. Consumers need to inform themselves about the specs. I will not defend people who buy their GPUs based on naming or nicely colored packaging.
The delay is an estimate. Fill in whatever timespan you like.


----------



## ARF (Oct 14, 2022)

Sisyphus said:


> Names are PR. Consumers need to inform themselves about the specs. I will not defend people who buy their GPUs based on naming or nicely colored packaging.



There are lawyers who will defend them, though. Because giving the 4080 brand to a 12-GB AD104 card is like claiming a C-Class Mercedes is an S-Class.


----------



## Dimitriman (Oct 14, 2022)

Lenne said:


> Nvidia listened to customers for once? This is incredible.
> 
> 
> The 10GB variant then would be with 160-bit memory bus. Having a memory bus that narrow in a xx70 card would be beyond insane.



More like, they got caught with their pants down and are now scrambling to fix things:

- Pretending there is a scalper/miner shortage on the 4090
- A "loyalty program" for buying the FE 4090, only for current Nvidia owners
- Killing the LHR limiter on the 30 series
- And now, cancelling the dumb 4080 12GB.

More "unusual moves" are to be expected as Jensen continues to wake up from his hubris-induced coma.


----------



## ModEl4 (Oct 14, 2022)

Anyone with access to DRAMexchange (I don't have it) can see the GDDR6 spot price differences and get an indication (just an indication!).
The spot price for 8 Gbit GDDR6 is even lower than GDDR5, and there is a chance the 16 Gbit GDDR6 price is only around 1.5× the 8 Gbit price (and Nvidia probably buys below the spot session lows...).
So, for example, the actual cost difference between eight 8 Gbit GDDR6 ICs (256-bit bus / 8 GB total) and six 16 Gbit GDDR6 ICs (192-bit bus / 12 GB) could be as little as $5 total, depending on the 16 Gbit GDDR6 IC price.
There is a reason the ARC A770 8 GB is only $20 cheaper than the 16 GB version, and this could be it (8 × $5 for 8 Gbit ICs vs 8 × $7.5 for 16 Gbit ICs). (I'm not buying Intel, I just want to push the 16 GB version.)
Anyone with access can enlighten us!
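The board-cost arithmetic above can be laid out explicitly. Note that the per-IC prices are the post's own guesses, not real DRAMexchange quotes:

```python
# Memory cost sketch for two board configurations; IC prices are the
# post's assumed figures, not real DRAMexchange quotes.
PRICE_8GBIT = 5.0    # $ per 8 Gbit (1 GB) GDDR6 IC, assumed
PRICE_16GBIT = 7.5   # $ per 16 Gbit (2 GB) GDDR6 IC, assumed

cost_8gb_256bit = 8 * PRICE_8GBIT     # eight 1 GB ICs on a 256-bit bus
cost_12gb_192bit = 6 * PRICE_16GBIT   # six 2 GB ICs on a 192-bit bus
print(cost_8gb_256bit, cost_12gb_192bit, cost_12gb_192bit - cost_8gb_256bit)
# 40.0 45.0 5.0
```

Under these assumed prices, moving from 8 GB on a 256-bit bus to 12 GB on a 192-bit bus costs about $5 in memory, which is the post's point.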


----------



## AnotherReader (Oct 14, 2022)

Sisyphus said:


> Names are PR. Consumers need to inform themselves about the specs. I will not defend people who buy their GPUs based on naming or nicely colored packaging.
> The delay is an estimate. Fill in whatever timespan you like.


Estimation means you have no source. Unless AMD has managed to make a properly scaling multi-GPU chiplet card, they are unlikely to have anything competitive with the 4090, so launching around the 4080 would be fine. I would expect the 7900 XT to beat the 4080 easily in rasterized games; raytracing performance is unlikely to be as good as Nvidia's.


----------



## 80-watt Hamster (Oct 14, 2022)

AnotherReader said:


> Names have meaning. For a long time, Nvidia's top tier GPU was the x80. It changed to x80 Ti when AMD surprised them with Hawaii, and remained so until Ampere. I would also like to know your source regarding a delay in RDNA3.



Not exactly. We've had the GTX 295, 590, and 690. Then x90 took a long nap until Ampere.


----------



## Kissamies (Oct 14, 2022)

Dimitriman said:


> More like, they got caught in their own stupidity and are now scrambling to fix things:
> 
> - Pretending there is a scalper/miner shortage on 4090
> - "Loyalty program" for buying FE 4090, only for current nvidia owners
> ...


Ah, there are plot twists like that. I have to admit I haven't been following the news about these new cards that much, as their pricing makes them so uninteresting.

But I wouldn't call this a cancellation; it's rather a renaming to the model it should have been from the beginning.


----------



## AnotherReader (Oct 14, 2022)

80-watt Hamster said:


> Not exactly.  We've had the GTX 295, 590 and 690.  Then x90 took a long nap until Ampere.


Right. Historically, the x80 was the top single GPU card.


----------



## Kissamies (Oct 14, 2022)

AnotherReader said:


> Right. Historically, the x80 was the top single GPU card.


Yeah, and I count the 285 as well, as it was just a die-shrunk, improved 280.


----------



## Dimitriman (Oct 14, 2022)

Lenne said:


> Ah, there are plot twists like that. I have to admit that I haven't even been looking that much news about these new ones as their pricing make them so uninteresting.
> 
> But I wouldn't call this a canceling but rather renaming it to a model it should've been in the beginning.



Well, it's more than a renaming, because there is no way they will just call it a 4070 but keep the same price. That would reach a new level of rejection.


----------



## Sisyphus (Oct 14, 2022)

medi01 said:


> Wrong.
> 
> Per NVs own discovery (was in their slides) they've shockingly discovered that customers stick with series (e.g. 970 => 1070 => 2070) rather than sticking with the price bracket.
> 
> 4080 losing "80" would be a major hit.


It's a matter of perspective; it doesn't make a difference to me or to anyone who is informed beforehand. If you want to talk about sales strategies and uninformed consumers, you are right. Whether nVidia takes a major hit depends more on AMD than on AD104 naming.


----------



## ARF (Oct 14, 2022)

AnotherReader said:


> Estimation means you have no source. Unless AMD managed to make a properly scaling multi GPU chiplet card, they are unlikely to have anything competitive with the 4090, launching around the 4080 would be fine. I would expect the 7900XT to beat the 4080 easily in rasterized games; raytracing performance is unlikely to be as good as Nvidia.



Why? You seem to think the RTX 4090 is out of reach; why exactly?

The RX 6950 XT trails the RTX 4090 by only 53%.





The performance jump from the previous generation's top dog, the RX 5700 XT, to the RX 6900 XT was 101% in a single move.





AMD can do it.


----------



## Kissamies (Oct 14, 2022)

Dimitriman said:


> Well it's more than renaming cause there is no way they will just call it 4070 but keep the same price. That would reach a new level of rejection.


Just wondering what will happen to the cards that have already been made; will manufacturers BIOS-flash them with a 4070-named BIOS?


----------



## cvaldes (Oct 14, 2022)

AnotherReader said:


> Estimation means you have no source. Unless AMD managed to make a properly scaling multi GPU chiplet card, they are unlikely to have anything competitive with the 4090, launching around the 4080 would be fine. I would expect the 7900XT to beat the 4080 easily in rasterized games; raytracing performance is unlikely to be as good as Nvidia.



AMD knows for sure how the 7900XT stacks up to the 4090 in pure rasterization. The 7900XT launch is placed closer to the 4080 launch because it will be more comparable in performance to that card. It also buys AMD some time to improve their driver software. The hardware is already finished, probably sitting on pallets in some warehouse's finished goods section.

Radeon RT cores will be weaker than GeForce RT cores and there's no indication that AMD will dethrone NVIDIA any time soon in machine learning either.

And one key battleground is the developer environment. NVIDIA stands very tall here.


----------



## 80-watt Hamster (Oct 14, 2022)

AnotherReader said:


> Right. Historically, the x80 was the top single GPU card.



Ok, for single-chip cards you're entirely correct. I wasn't considering that distinction.


----------



## AnotherReader (Oct 14, 2022)

ARF said:


> Why? You seem to think the RTX 4090 is out of reach; why exactly?
> 
> The RX 6950 XT trails the RTX 4090 by only 53%.
> 
> ...


I think they can do it, but all indications are that they are using chiplets for the larger GPUs. Thus they will take a power hit compared to a monolithic GPU as on-die interconnects will now be inter-chip.



cvaldes said:


> AMD knows for sure how the 7900XT stacks up to the 4090 in pure rasterization. The 7900XT launch is placed closer to the 4080 launch because it will be more comparable in performance to that card. It also buys AMD some time to improve their driver software. The hardware is already finished, probably sitting on pallets in some warehouse's finished goods section.
> 
> Radeon RT cores will be weaker than GeForce RT cores and there's no indication that AMD will dethrone NVIDIA any time soon in machine learning either.
> 
> And one key battleground is the developer environment. NVIDIA stands very tall here.


The 4080 isn't much faster than a 3090 Ti; again, Nvidia's own benchmarks show a 10 to 25% improvement over the 3090 Ti. AMD has claimed a performance-per-watt increase of over 50%. A 7900 XT that is 50% faster than a 6950 XT would be comfortably 35% faster than a 3090 Ti. On the other hand, your points about RT cores, machine learning, and developer relations are all valid.
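The arithmetic behind that ~35% estimate can be sketched out. The 6950 XT's standing relative to the 3090 Ti is my assumed figure here, not a number from the post:

```python
# Sketch of the "~35% faster than a 3090 Ti" estimate.
# Assumption (mine): the 6950 XT lands at roughly 90% of a 3090 Ti at 4K.
rel_6950xt = 0.90   # 6950 XT relative to the 3090 Ti (= 1.00)
gen_gain = 1.50     # AMD's ">50% perf/W" claim read as +50% performance
rel_7900xt = rel_6950xt * gen_gain
print(f"7900 XT vs 3090 Ti: {rel_7900xt:.2f}x")  # 1.35x, i.e. ~35% faster
```

The estimate also quietly assumes the perf/W gain translates one-to-one into performance at equal power, which is the optimistic reading.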


----------



## cvaldes (Oct 14, 2022)

Lenne said:


> Just wondering that what will happen to the cards already been made, manufacturers will bios-flash those with a 4070 named bios?



Yes. They are probably sitting in a warehouse in bulk packaging anyway, waiting for final firmware. Even if a few sample units were sent out, they will all likely get reflashed with new code that identifies them as whatever model number NVIDIA decides.

There are product stickers on the PCB with the wrong SKU, and some AIB partners might have put the model number on the cooler.

All of the packaging will have to be scrapped, of course.

Remember that Apple's iOS software is RTM'ed shortly before an iPhone launch, maybe 10-14 days out, to give the manufacturer time to flash units for channel distribution and brick-and-mortar stores.


----------



## Daven (Oct 14, 2022)

sephiroth117 said:


> AMD needs to make their drivers more stable. I tried an RX 580 and had driver issues; it was BSODing even when just browsing. Moved on to an RTX 2070 and never looked back.
> 
> I don't know if they have improved drastically with time; tbh, I am also looking seriously at RDNA3, in addition to the 4080, for my next upgrade... but if picking the GPU with the most stable drivers is gullible, then I am gullible. There's nothing instinctive about picking Nvidia... I've had two AMD GPUs, an ATI 4870 and an RX 580, and both were driver hell; at some point you get tired of DDU, troubleshooting, etc.


There have been no significant tech-reviewer reports of AMD driver failures. The only reason anyone thinks AMD drivers are bad is anonymous internet posts like this one. Nothing you said can be verified, but it will still make someone casually reading these forums think twice, and it continues to perpetuate this myth.


----------



## N3M3515 (Oct 14, 2022)

ZetZet said:


> Okay, maybe it doesn't, it's still within the same performance bracket. Ofc the 900 USD price is insane if it doesn't beat the 3090...


Have people gone mad? How is $900 justifiable for a 4070? The freaking 3070 beat the 2080 Ti, and for less than half the price! That would translate to less than half of the 3090's $1,500, i.e. $650-700, and even that is pushing the price.


----------



## Sisyphus (Oct 14, 2022)

AnotherReader said:


> Estimation means you have no source.


Every forecast is an estimate. My argument was that nVidia can cash in until AMD's new GPUs are physically available.


> Unless AMD managed to make a properly scaling multi GPU chiplet card, they are unlikely to have anything competitive with the 4090, launching around the 4080 would be fine. I would expect the 7900XT to beat the 4080 easily in rasterized games; raytracing performance is unlikely to be as good as Nvidia.


I agree. The market price depends on demand and on competitive AMD products. As MSRP loses its meaning, we may have to wait for street prices once the RX 7000 series is available. Soon we will see whether that pressures nVidia to drop AD104 and AD103 prices. I hope so.


----------



## AnotherReader (Oct 14, 2022)

Sisyphus said:


> Every forecast is an estimate. My argument was that nVidia can cash in until AMD's new GPUs are physically available.
> I agree. The market price depends on demand and on competitive AMD products. As MSRP loses its meaning, we may have to wait for street prices once the RX 7000 series is available. Soon we will see whether that pressures nVidia to drop AD104 and AD103 prices. I hope so.


Of course, they'll cash in both before and after the availability of AMD's new GPUs. The uninformed masses will continue to buy Nvidia even when AMD is better. That is why the 3060 has almost the same street price as the far superior 6700 XT in Canada.


----------



## medi01 (Oct 14, 2022)

cvaldes said:


> The 7900XT launch is placed closer to the 4080 launch because it will be more comparable in performance to that card



AMD also wants to get into business of next gen cards being sold as yet another (more expensive) higher tier.



cvaldes said:


> driver


What year is it, seriously...



AnotherReader said:


> all indications are that they are using chiplets for the larger GPUs.


Chiplets are how the underdog trounced Intel.

I doubt Frau Su would go that route if it meant losing the flagship competition outright.


----------



## ModEl4 (Oct 14, 2022)

A pre-binned Navi 21 XTXH could hit 3 GHz under the right conditions.
Supposedly Navi 31, under the same conditions, can get close to 4 GHz; let's say 3.9 GHz, which is a +30% speed improvement.
The most power-efficient Navi 21 GPU in most cases/resolutions was the RX 6800.
That had a 2105 MHz boost; add 30% to that (already generous, since 15% is the official TSMC figure for the node difference) and maybe we get a 2735 MHz boost for the most efficient Navi 31-based GPU (the one they claim has +50% performance/W?).
With such a low frequency, and if we are talking about a 300 W to 335 W Navi 31-based model, the performance potential could be uneventful (relative to what Nvidia can achieve).
Add the pessimistic scenario that the +50% performance/W claim was made using the upcoming FSR 3.0, where RDNA3 may have an advantage, and the conclusions about performance potential get even more pessimistic.
Anyway, the above is probably nonsense; I don't believe it, I just examined the possibilities...


----------



## AnotherReader (Oct 14, 2022)

medi01 said:


> AMD also wants to get into business of next gen cards being sold as yet another (more expensive) higher tier.
> 
> 
> What year is it, seriously...
> ...


No one doubts AMD's engineering chops; connecting a multi-die GPU would require a massive off-chip interconnect, but it can be done. If they pull it off, such a multi-die GPU, with proper scaling, would easily surpass the 4090 in rasterization, but we will see in November.


----------



## cvaldes (Oct 14, 2022)

medi01 said:


> AMD also wants to get into business of next gen cards being sold as yet another (more expensive) higher tier.



Both NVIDIA and AMD have excess inventory of previous generation cards in the channel as well as unused GPU chips.

As far as I can tell, NVIDIA doesn't have any problem right now selling 4090 cards. The best binned GPU chips will end up in data centers anyhow.

It's really the low to mid-range graphics cards (Ampere and RDNA2) that are the major source of headaches for both companies and have contributed to these weird marketing conundrums.



> What year is it, seriously...



A very tricky one for AMD, Intel, NVIDIA, and others.



> Chiplets is how underdog had trounced Intel.



Well, the hardware for the 7900XT is already done. It's not like AMD can make it a multi-chiplet GPU in reaction to the 4090. These sort of architectural decisions need to be made years in advance.

It's important to point out that Intel gave AMD a chance to catch up by failing to transition to a smaller process node in a timely manner. It's not just the chiplet design that helped; TSMC gets a lot of the credit for the success of Zen 2, Zen 3, and now Zen 4.

Both AMD and NVIDIA are using TSMC's foundries for this new generation of GPUs. NVIDIA probably gave AMD a little help by using Samsung's foundries for the Ampere generation.


----------



## Kabouter Plop (Oct 14, 2022)

I bet they changed their mind because of a combination of things: the community realizing the RTX 4080 12 GB cards are just old RTX 30 cards, them not wanting people to find out, and people finding out that DLSS actually works fine on RTX 30 and RTX 20 cards.
I wish Nvidia treated their own customers properly with new features. Imagine buying a GTX 1080 while all the old cards get cool new features, and then they release the RTX 20 series and can't even release anything new for the GTX 1080, leaving it to rely on AMD's FSR instead.


----------



## dont whant to set it"' (Oct 14, 2022)

P4-630 said:


> Great, now what to do with my $900....


Maybe stop feeding the troll with your $?



Darmok N Jalad said:


> Maybe this was a PR stunt. Now they can say they listen to their customers’ concerns and do something about it!


Preemptive, perceived-company-image damage control?


agatong55 said:


> Dont worry they will announce the new 4070 ti 12 gig version tomorrow.


Or, better yet: a Super Ti MX-Q.


xorbe said:


> They should have simply "re-launched" the 4080 16GB as the 4085, all problems solved.  View attachment 265499


Long gone are the days of "GF-Fermi", when one could unlock more shader cores with a BIOS mod.

I am three pages in so far and calling it a night.
Is it just me, or is there a pattern in the last couple of graphics card launches from the "big green camp"?
The first time around (the RTX 30 series) could be seen as a one-off, but it happening all over again shifts one's point of view: the "green camp" has turned into a troll (a price-scalping troll).

It's not that I couldn't have forked out the buckaroos for a 3000 series card at launch (-90/-80/-70), though it would have been down to the last cent back then. It's a similar situation now, all over again; my savings are OK after not having to burn through them post-surgery. But nope: nVidia's RTX 4090 is out of stock, or way north of $2K. I am at peace with skipping this generation of theirs.


----------



## Kissamies (Oct 14, 2022)

cvaldes said:


> Yes. They are probably sitting in some warehouse in bulk packaging anyhow waiting for final firmware. Even if there were a few sample units sent out, they will all likely get reflashed to new code that identifies it as whatever model number NVIDIA decides.
> 
> There are product stickers on the PCB with the wrong SKU and some AIB partners might have put the model number on the cooler.
> 
> ...


Though relabeling packages isn't too uncommon. I remember the RX 480 4GB, for example; they used the same boxes as the 8GB versions, just with a 4GB sticker over the label.


----------



## TheoneandonlyMrK (Oct 14, 2022)

How did I miss them directly validating all the naysayers? Filth.

I bet the price doesn't change.


----------



## cvaldes (Oct 14, 2022)

Lenne said:


> Though relabeling packages isn't too uncommon. I remember RX 480 4GB for example, they used the same ones as 8GB versions, just a 4GB sticker over it.



Sure.

It's probably the decision of the individual manufacturer and depends on what sort of impression they'd like to convey. I certainly understand the practicality of using a sticker on a box or bag to denote a difference in the product inside, and not just for PC hardware.

However, this would likely be a full model number change.

Maybe they should slap a "4090" sticker on the box and charge accordingly.


----------



## mama (Oct 14, 2022)

Well, it's official: the 3080 to 4080 has *an MSRP increase of 71%*.


----------



## Timelessest (Oct 14, 2022)

They could also relaunch the 4080 16GB as the 4080 Ti.


----------



## cvaldes (Oct 14, 2022)

Timelessest said:


> They could also relaunch the 4080 16GB as the 4080 Ti.



But they decided not to. And that wouldn't change the large gap between it and the 4090.









NVIDIA cancels GeForce RTX 4080 12GB - VideoCardz.com: "In a bizarre move, NVIDIA decided to cancel the RTX 4080 12GB SKU before it was even released. Facing never-ending criticism, NVIDIA has just announced it will not launch the GeForce RTX 4080 12GB model, the card that we knew and will always know as RTX 4070."
				




Clearly there is a place for a "4080 Ti" card with 100 SMs, ~12,000 CUDA cores, and 20GB of VRAM on a 320-bit memory bus, with a ~400W TDP, at a price point of $1,400.

And from a marketing standpoint, it makes more sense for them to launch something with a plain model number (3080, 3090, 4080, 4090) versus a variant with a suffix. Otherwise they will spend six months fielding the question, "Where's the regular 4080?"


----------



## ARF (Oct 14, 2022)

mama said:


> Well, it's official: the 3080 to 4080 has *an MSRP increase of 71%*.



Whether this Nvidia pricing survives depends on AMD.
A potential RX 7900 XT at a ~$800, $900, or $1,000 MSRP would drop the RTX 4080's price.

If we are lucky, AMD will launch a cut-down Navi 31 as the 7800 XT at a $600 or $700 MSRP.





NVIDIA "Unlaunches" GeForce RTX 4080 12 GB, RTX 4080 16 GB On Track For 16th November Launch (wccftech.com)


----------



## Naito (Oct 14, 2022)

Still think that chip should be a 60 tier card...


----------



## Crackong (Oct 14, 2022)

EVGA got out at the right time


----------



## Zareek (Oct 14, 2022)

ARF said:


> Whether this Nvidia pricing survives depends on AMD.
> A potential RX 7900 XT at a ~$800, $900, or $1,000 MSRP would drop the RTX 4080's price.
> 
> If we are lucky, AMD will launch a cut-down Navi 31 as the 7800 XT at a $600 or $700 MSRP.
> ...


If people gobble up these 4080s like they did the 3080s, the 5080 will cost $1,699+. Nvidia will keep jacking up prices until the market can bear no more.


----------



## MentalAcetylide (Oct 14, 2022)

So am I to understand that buyers are getting SKU sku ska-roo'ed? I think Nvidia is just trying to muddy the waters to make it more difficult to compare price vs. performance with previous card generations.


----------



## cvaldes (Oct 15, 2022)

Zareek said:


> If people gobble up these 4080s like they did the 3080s, the 5080 will cost $1,699+. Nvidia will keep jacking up prices until the market can bear no more.



People gobbled up scalped Ampere and RDNA2 cards during the Great GPU Shortage. And now they're buying up scalped 4090 cards.

The marketplace has been sending a very clear message to AMD, NVIDIA and their AIB partners for the past few years.


----------



## Kissamies (Oct 15, 2022)

Naito said:


> Still think that chip should be a 60 tier card...


Agree. A 192-bit bus on a xx70 card feels like too much of a cut.


----------



## JrRacinFan (Oct 15, 2022)

Tuvok said:


> I think they realised there's very high demand for the 4090


Joke incoming ....

Yup, heavy both on power consumption and on your wallet xD 

Jokes aside, it's good that they realized they don't have to flood the market with SKU names.


----------



## c2DDragon (Oct 15, 2022)

nVidumb
Now to figure out who is dumber: the consumers who will buy these new-generation cards (4080 & 4090), which eat so much power for such a high price, or them (nVidumb) for attempting to use the same name for 2 cards with clearly different specs...


----------



## Fluffmeister (Oct 15, 2022)

ARF said:


> It depends on AMD if this nvidia pricing will survive.
> A potential RX 7900 XT at ~800, 900 or 1000 msrp will drop the RTX 4080 price.
> 
> If we are lucky, AMD can launch a cut-down Navi 31 as 7800 XT for 600 or 700 msrp.
> ...



But if AMD have great products, why won't they be scalped too? Stock gone, MSRPs meaningless, tears flow... nothing changes.


----------



## chrcoluk (Oct 15, 2022)

It will be a 4080M or something. I don't think they will call it a 4070 due to the price point.



AnotherReader said:


> We'll see if it beats a 3090. From Nvidia's own benchmarks, in at least one game, it's closer to the 3080 than the 3090 Ti. The grey bars are the native rendering, i.e. sans DLSS, numbers.
> 
> View attachment 265524


Wow, why did they make the gap so big between the 4080 and 4090? lol


----------



## MarsM4N (Oct 15, 2022)

cvaldes said:


> Clearly there is a place for a "4080 Ti" card with 100 SMs, 12000 CUDA cores, 20GB VRAM on a 320-bit memory bus, ~400W TDP, at a price point of $1400.



Between the _4080 (16GB)_ and the _4090_ there is not only a place for a _"4080ti"_, there is a performance gap *for a truck load of faster models*.  The 4090 is up to *~93%* faster than a 4080 (16GB)!
I get that the "fake" _4080_ is the focus right now, and I am glad it's finally being addressed, but the 16GB version is a scam, too. It's barely faster than a _3090ti_ in rasterization (ignoring the fake DLSS FPS).












cvaldes said:


> And from a marketing standpoint, it makes more sense for them to launch something with a plain model number (3080, 3090, 4080, 4090) versus a variant with a suffix. Otherwise they will spend six months fielding the question, "Where's the regular 4080?"



To me it looks like the marketing division decided to slap *"4080"* on everything, because folks associate _*"4080" = top model = fast = expensive*_.
And with the gap between the _4080_ and the _4090_, there is plenty of air to release a new, faster _4080_ every 6 months. It's all about justifying high prices & increasing profits.

I'll just call it now: Q3/23 = *4080**ti*, Q1/24 = *4080Super*, Q3/24 = _*4080Ultra*_


----------



## neatfeatguy (Oct 15, 2022)

sephiroth117 said:


> AMD needs to make their drivers more stable, tried a RX 580, had drivers issues even when surfing it was BSODing. Passed to a RTX2070, never looked back since.
> 
> I don't know if they improved drastically with time, tbh I am looking seriously at RDNA3 too in addition to the 4080 for my next upgrade...but if picking a GPU with the most stable drivers is gullible, then I am gullible. Nothing instinctive with picking up Nvidia...had 2 AMD GPU a ATI 4870 and an RX 580, both were drivers hell, at one point you get tired of DDU, troubleshooting, etc.



I've had many driver issues with Nvidia over the years.

* Broken video playback in multiple formats, while playing games or streaming videos
* 2nd monitor never waking from sleep; the only resolution was a system reboot
* Green pixelated shadows
* Crashes to desktop
* BSODs
* Games not launching
* Games that ran just fine before, but now run like crap
* One annoying issue I had for a while was lights shining through other objects. For example, as the sun passed behind a mountain (or a building), it would shine right through it. Torches or street lights would shine through walls, and so on. This issue actually plagued me over a handful of driver versions. 

Those are just the ones that I still remember; I'm sure there are more. Neither side is perfect, and to keep telling people that AMD drivers are unstable just means you've been lucky enough not to have to deal with bad Nvidia drivers. Also, not all drivers work with all hardware/software configurations. I learned some time ago that if I find a driver that works, I stick with it, and I don't update unless I absolutely have to (a new generation of GPU, or a new game that requires the most recent driver to run).


----------



## Why_Me (Oct 15, 2022)

MarsM4N said:


> Between the _4080 (16GB)_ and the _4090_ is not only a place for a _"4080ti"_, there is a performance gap *for a truck load of faster models*.  4090 is *~93%* faster than a 4080(16GB)!
> I get that the "fake" _4080_ is now the hanger and I am glad it's finally beeing adressed, but the 16GB version is a scam, too. Barely faster than a _3090ti_ in rasterization (ignoring the fake DLSS FPS).
> 
> 
> ...


I must be missing something here.  The cheapest 3090 Ti I can find in the US is going for $1450 atm yet loses out to the $1200 4080 16GB in gaming .. that's with *DLSS turned off*.


----------



## iO (Oct 15, 2022)

"We're sorry we got caught"


----------



## Chaitanya (Oct 15, 2022)

Dyatlov A said:


> Great news, but It is not even 4070, it is just 4060ti.


True, that 16GB is the true 4070 and fans are drooling over it.


----------



## Why_Me (Oct 15, 2022)

Crackong said:


> EVGA got out at the right time


EVGA will now be known for their Channel Well PSUs and their overpriced boards.




----------



## ModEl4 (Oct 15, 2022)

Why_Me said:


> I must be missing something here.  The cheapest 3090 Ti I can find in the US is going for $1450 atm yet loses out to the $1200 4080 16GB in gaming .. that's with *DLSS turned off*.


Yesterday I saw the NewEgg offer below for the AMP Extreme Holo; it's still valid (still +$100 vs. a week ago for the cheapest 3090 Ti model).


----------



## Why_Me (Oct 15, 2022)

ModEl4 said:


> I saw yesterday the below NewEgg offer for AMP Extreme Holo, it's still valid:
> View attachment 265563


Zotac is right up there with Inno, Palit and Asrock as far as budget brands go .. but good find. ^^


----------



## Mistral (Oct 15, 2022)

This only happened because they judged they couldn't get away with what they were trying to pull. If you want nVidia to treat you better, stop blindly giving them your money.


----------



## solarmystic (Oct 15, 2022)

Now watch as they re-release it as the 12 GB RTX 4070 after the furore has died down in a few months' time. For the same 899 USD. Because why not.


----------



## regs (Oct 15, 2022)

AnotherReader said:


> I have a suggestion: 4060 Ti. The 4080 16 GB should be the 4070 Ti.


The 4080/16 (AD103) was originally meant to be just the 4070, and the 4080/12 the 4060. The 4080 series was to be AD102. The 4070 Ti and 4060 Ti were to be binned AD103 and AD104 silicon coming later.


----------



## ModEl4 (Oct 15, 2022)

Ada has great technology, but personally I'm not interested in buying a new system and VGA for at least half a year.
And the only way I'm getting one (an RTX 4050-class card, when it becomes available, and if the SRP is competitive) is if POE2 and Diablo 4 support DLSS 3.0!


----------



## comtek (Oct 15, 2022)

nvidia should release the RTX 4080 16GB as the RTX 4090 16GB and the RTX 4080 12GB as the RTX 4090 12GB, just to piss consumers off. 

/s


----------



## wolf (Oct 15, 2022)

trsttte said:


> even the 3080 with 10gb were already starved by vram


I hear that a lot, but having owned one for 2 years and now running 4K120, it's not an issue and hasn't been. And even when it isn't enough, slightly lowering the textures would certainly avoid the card being 'starved'. 

As for this move, all I can say is

LOLLLLLLLLLL 

4070/ti incoming, but at what price?


----------



## MarsM4N (Oct 15, 2022)

I find it especially funny how they portray the announcement on their homepage: _*Unlaunching The 12GB 4080*_
Quote:_ "If the lines around the block and enthusiasm for the 4090 is any indication, the reception for the 4080 will be awesome."_

Posting images of consumer sheeple lining up outside the stores.  Bet most of them were _"*physical scalpers*"_, lol.
Can't wait for their Black Friday PR posts.










*P.S.:* It's gonna be interesting to see how they handle warranty for the rebadged "4080"s. Companies like Gigabyte or MSI have some special_ *warranty conditions*_ where the warranty already starts from the day of production. Theoretically, if they take 3 months for rebadging and delivery, you get 3 months less warranty.


----------



## DeathtoGnomes (Oct 15, 2022)

This was a pre-planned event; Nvidia had no intention of launching a mid-range card as long as 30-series stock was still full.


----------



## Chaitanya (Oct 15, 2022)

MarsM4N said:


> I find it especially funny how they portray the anouncement on their homepage: _*Unlaunching The 12GB 4080*_
> Quote:_ "If the lines around the block and enthusiasm for the 4090 is any indication, the reception for the 4080 will be awesome."_
> 
> Posting images of consumer sheeple lining up on the stores.  Bet most of them where _"*physical scalpers*"_, lol.
> ...


In India, both Gigabyte and MSI offer warranties from the customer's date of purchase and are held responsible for that. So I'm not sure where they enforce that from-the-date-of-manufacture condition.


----------



## R0H1T (Oct 15, 2022)

Well, I don't see how they'll be able to enforce warranty from the date of manufacture! Are they gonna set up shop just for assembly near Amazon/Newegg warehouses?


----------



## cvaldes (Oct 15, 2022)

R0H1T said:


> Well I don't see how they'll be able to enforce warranty from date of manufacture! Are they gonna setup a shop just for assembly near Amazon/Newegg warehouses



They don't.

Warranty coverage starts from the date of purchase. That's why one needs to furnish a receipt (which shows the sales date). If you buy a PSU that's been sitting on a store shelf for six months, plus a month from the factory to some distribution warehouse via ocean cargo, it doesn't mean you lose seven months of warranty coverage.

Some of the discussions in this forum are getting stranger and stranger.


----------



## The Von Matrices (Oct 15, 2022)

I don't buy the "we listened to the customers and decided to rename." It's pretty clear that NVIDIA is having trouble unloading all its remaining 3000 series cards. Since AD104 has similar performance to GA102 it makes sense to delay the release until the supply of GA102 chips is exhausted.


----------



## HTC (Oct 15, 2022)

The Von Matrices said:


> I don't buy the "we listened to the customers and decided to rename." *It's pretty clear that NVIDIA is having trouble unloading all its remaining 3000 series cards.* Since AD104 has similar performance to GA102 it makes sense to delay the release until the supply of GA102 chips is exhausted.



AMD is in a unique position to cripple nVidia's 30xx and 40xx sales: all they have to do is price their next-generation cards "much more consumer friendly", and nVidia will lose millions.

While such a move would mean lower margins for AMD too, it would also have the effect of increasing their sales across the board.

In the current global economic situation, the company that sells cheaper is likely to have the bigger sales. It's a double-edged sword, though, since AMD would lose A LOT OF MONEY too, although nowhere near as much as nVidia because of the difference in market share.


----------



## N3M3515 (Oct 15, 2022)

mama said:


> Well, it's official: the 3080 to 4080 has *an MSRP increase of 71%*.


Nobody seems to realize this.


----------



## Minus Infinity (Oct 15, 2022)

LOL, we had clowns on this forum praising the 4070 Ti in disguise (as I've been calling it since day one) and saying it was awesome. Brilliant cave-in from Ngreedia after they were called out. As a 4070 Ti at $699, it'll make sense.


----------



## The Von Matrices (Oct 15, 2022)

HTC said:


> AMD is in an unique position to cripple nVidia's 30XX and 40XX sales: all they have to do is to have their next generation cards priced "much more consumer friendly", and nVidia will lose millions.
> 
> While such a move would mean lower margins for AMD too, it would also have the effect of increasing their sales across the board.
> 
> In the current global economic situation, the company that sells cheaper is likely to have the bigger sales, but it's a double edged sword since AMD would lose A LOT OF MONEY too, though nowhere near nVidia's because of their market share.


That assumes that AMD isn't also having trouble clearing out 6800 and 6900 series cards.  From a quick glance online I don't see any indications that supplies of them are running out either.


----------



## HTC (Oct 15, 2022)

The Von Matrices said:


> That assumes that AMD isn't also having trouble clearing out 6800 and 6900 series cards.  From a quick glance online I don't see any indications that supplies of them are running out either.



True: they would also lose money there. But do the potential gains outweigh the loss of revenue?

It would be a gamble, for sure ... but if it works ... AMD would see their market share improve tremendously while forcing nVidia to lose millions at the same time, and not only from the market share loss. Heads AND jackets might roll ...


----------



## HD64G (Oct 15, 2022)

It is written that big arrogance turns into big humiliation. And this happens in most cases sooner or later.


----------



## MarsM4N (Oct 15, 2022)

cvaldes said:


> They don't.
> 
> Warranty coverage starts from the date of purchase. That's why one needs to furnish a receipt (which shows the sales date). If you buy a PSU that's been sitting on a store shelf for six months, plus a month from the factory to some distribution warehouse via ocean cargo, it doesn't mean you lose seven months of warranty coverage.
> 
> Some of the discussions in this forum are getting stranger and stranger.



Well, I did some research, and it looks like some _"*regional differences may apply*"_.  German buyers are certainly fu$$ed if they buy one of these.

Global:
*GIGABYTE / AORUS WARRANTY TERMS AND CONDITIONS*_ (3 years of limited local warranty, from the date of purchase, GIGABYTE graphics cards, except those labeled “Mining Series”, are intended only for use with desktop PCs. Other types of use, such as blockchain computing or cryptocurrency mining, will render the product warranty void.)
*MSI Warranty Information* (warranty period of 3 years, The warranty period will begin on the date of purchase the end user made from an authorized retailer if an invoice is provided as a proof of purchase. If the end user is unable to provide the invoice, then the warranty period will begin based off that product’s manufactured date)_

Germany:
*Gigabyte Garantie*_ (3 years from production date - serial number/sticker, warranty claims only via retailer)_
*MSI GPU Warranty*_ (3 years from production date - serial number/sticker, warranty claims only via retailer)_


----------



## ixi (Oct 15, 2022)

4070 here we come


----------



## Chaitanya (Oct 15, 2022)

N3M3515 said:


> Nobody seems to realize this.


3090 to 3070 - ~44% of cores cut
4090 to "4080 16GB" - ~41% of cores cut

So that 71% price increase isn't even the full story; it's much worse. nGreedia are selling what, by last gen's standards, would be a ~$500 70-series GPU for a mere $1,200, and people are drooling over it.
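Those ratios are easy to sanity-check. A quick sketch (the CUDA core counts and US launch MSRPs below are assumptions pulled from public spec sheets, not from this thread):

```python
# Rough sanity check of the "cores cut" and MSRP-jump claims.
# Core counts and launch MSRPs are assumed from public spec sheets.
specs = {
    "RTX 3090":      {"cores": 10496, "msrp": 1499},
    "RTX 3080":      {"cores": 8704,  "msrp": 699},
    "RTX 3070":      {"cores": 5888,  "msrp": 499},
    "RTX 4090":      {"cores": 16384, "msrp": 1599},
    "RTX 4080 16GB": {"cores": 9728,  "msrp": 1199},
}

def cores_cut(flagship: str, card: str) -> float:
    """Fraction of CUDA cores removed going from flagship down to card."""
    return 1 - specs[card]["cores"] / specs[flagship]["cores"]

print(f"3090 -> 3070 cut: {cores_cut('RTX 3090', 'RTX 3070'):.1%}")            # ~44%
print(f"4090 -> 4080 16GB cut: {cores_cut('RTX 4090', 'RTX 4080 16GB'):.1%}")  # ~41%
msrp_jump = specs["RTX 4080 16GB"]["msrp"] / specs["RTX 3080"]["msrp"] - 1
print(f"3080 -> 4080 16GB MSRP: +{msrp_jump:.1%}")                             # ~+71.5%
```

With these numbers the cut from the 4090 works out closer to ~41% than ~44%, but the broader point stands: by relative core count, the 4080 16GB is trimmed far more aggressively than an x80 traditionally was (the 3080 kept all but ~17% of the 3090's cores).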


----------



## R0H1T (Oct 15, 2022)

Well, a lot of us have, hence the ~


MarsM4N said:


> Well, I did some research and it looks like there are some _"**regional differences may apply*"_.  German buyers are certanly fu$$ed if they buy one of these.
> 
> Global:
> *GIGABYTE / AORUS WARRANTY TERMS AND CONDITIONS*_ (3 years of limited local warranty, from the date of purchase, GIGABYTE graphics cards, except those labeled “Mining Series”, are intended only for use with desktop PCs. Other types of use, such as blockchain computing or cryptocurrency mining, will render the product warranty void.)
> ...


It isn't just a German problem; for *BenQ*, for example, we have "39 months from the date of manufacturing or 36 months from the date of invoice (POP), whichever is earlier". Ultimately it depends on your consumer protection laws & also the retailer. I usually buy through Amazon & they do come good if the manufacturer is being a d***, but YMMV.


----------



## ZetZet (Oct 15, 2022)

N3M3515 said:


> Have people gone mad? How is 900 justifiable for a 4070??, the freaking 3070 beat the 2080 Ti!! and for less than half the price!!, that would translate to less than half or the $1500 for the 3090 = 650 - 700 and that is pushing the price still.


People have gone realistic. There is demand for those graphics cards at those prices; you can cry about the good old times all you want, they are not coming back.


----------



## Dirt Chip (Oct 15, 2022)

Using the same numbering is nothing new for NV. We are all used to this by now:
3080 10+12GB
3060 6+12GB
1060 3+6GB
And that's just from recent years.

In the end, I sense that canceling and renaming the 4080 12GB will do worse for the consumer, unless major price drops are inbound.
Changing just the name obviously changes neither the performance nor the price.
It is just a name, no strings attached.


----------



## N/A (Oct 15, 2022)

Chaitanya said:


> So that 71% price increase is not correct, its much worse. nGreedia are selling ~$500 70 series gpu(from last gen) for mere $1200 over which people are drooling over.


Much worse in Europe, where the 4080 is $1,500; it's completely out of reach now. Didn't they get the memo? Mining has ended.


----------



## OneMoar (Oct 15, 2022)

192-bit, ~500 GB/s.
This should be a 4060 Ti.
WTF were they thinking?
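That ~500 GB/s figure checks out: peak GDDR bandwidth is just bus width times per-pin data rate. A quick sketch (the 21 Gbps, 22.4 Gbps, and 14 Gbps data rates are assumptions taken from public spec sheets, not from this thread):

```python
# Peak memory bandwidth = (bus width in bits / 8 bits per byte) * per-pin data rate.
# Data rates below are assumptions from public spec sheets.
def peak_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gbps(192, 21))    # 504.0 -> the "4080 12GB", i.e. the ~500 GB/s above
print(peak_bandwidth_gbps(256, 22.4))  # 716.8 -> the 4080 16GB
print(peak_bandwidth_gbps(256, 14))    # 448.0 -> the 3070, for comparison
```

By this measure the canceled card sits much closer to a 3070 than to its 16GB namesake, which is the point being made above.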


----------



## Bwaze (Oct 15, 2022)

"NVIDIA “Unlaunches” GeForce RTX 4080 12 GB" 

That's hilarious. :-D


----------



## R0H1T (Oct 15, 2022)

Alternate headline ~ *Nvidia unleashes the beast within, unlocks full potential of RTX 4060*


----------



## Dazzm8 (Oct 15, 2022)

Just call it a 4070 like everyone said nvidia you stupid pricks.


----------



## lexluthermiester (Oct 15, 2022)

btarunr said:


> NVIDIA has decided to cancel the November 2022 launch of the GeForce RTX 4080 12 GB.


Change the name to something like RTX 4060 Ti, drop the price tag to match, and this would be a great card!


----------



## TheDeeGee (Oct 15, 2022)

Remember how we had GT, GTS and GTX?

So why not the RTS 4080.


----------



## wolf (Oct 15, 2022)

Minus Infinity said:


> LOL we had clowns on this forum praising the 4070 Ti in disguise (as I've been calling it since day one) and saying it was awesome. yada yada Ngreedia yada yada


I've yet to see a *single* person praise the 4080 12GB (4070 Ti in disguise); are you sure it was _this_ forum? This forum _certainly_ swings pro-AMD, anti-nvidia.


----------



## ARF (Oct 15, 2022)

TheDeeGee said:


> Remember how we had GT, GTS and GTX?
> 
> So why not the RTS 4080.



Why should they insist that it has to be an 80 card and not a 60 card?



wolf said:


> Ive yet to see a *single* person praise the 4080 12gb (4070 ti in disguise), are you sure it was _this_ forum? This forum _certainly_ swings pro AMD anti nvidia.



No, people are simply objective, while nvidia cheats.


----------



## wolf (Oct 15, 2022)

ARF said:


> No, people are simply objective, while nvidia cheats.


I don't think that verdict covers the entirety of nvidia: their business, their products and software across all segments, and so on. And I think that says more about you than anyone else.

Liking and enjoying certain nvidia products does not require a departure from objectivity.

Objectively, the 4080 12gb was (set to be) an outright garbage product, and like I said, I haven't seen a single person defend it, and I'd be ultra surprised to see anyone on TPU defend it, given the user base here, especially the vocal ones.


----------



## lexluthermiester (Oct 15, 2022)

TheDeeGee said:


> Remember how we had GT, GTS and GTX?
> 
> So why not the RTS 4080.


Nope. The stated specs don't qualify. RTS-4070 might work.


----------



## Dirt Chip (Oct 15, 2022)

ARF said:


> The performance jump from the older generation top dog RX 5700 XT to RX 6900 XT was 101% in a single move.
> 
> View attachment 265528
> 
> AMD can do it.


Not quite: 5700 => 6900 is like 3070 => 4090 (more than a 120% increase), 2 tiers up.
With the 5xxx series, AMD's top card was a mid-tier GPU, so comparing it to a next-gen enthusiast-tier card is dubious.


----------



## N/A (Oct 15, 2022)

They simply unannounced it; now we have to unsee and unhear.


----------



## Vayra86 (Oct 15, 2022)

N3M3515 said:


> Nobody seems to realize this.


Oh? It's clear as day: price makes the product, and Ada isn't moving things forward as of yet. It's just a way to extend the old pricing and the Ampere product stack at this moment. Totally uninteresting from a buyer's POV.


----------



## Hyderz (Oct 15, 2022)

Bring back the LE moniker  RTX4080LE


----------



## medi01 (Oct 15, 2022)

TheDeeGee said:


> Remember how we had GT, GTS and GTX?
> 
> So why not the RTS 4080.



I mean "because it is much slower" would sound plausible, but  I also remember 1060 3GB. So, yeah...


----------



## Vayra86 (Oct 15, 2022)

Hyderz said:


> Bring back the LE moniker  RTX4080LE


I would pay to say I've got Le graphics card tbh


----------



## pavle (Oct 15, 2022)

Hyderz said:


> Bring back the LE moniker  RTX4080LE


Or, if we compare it to the extreme of the GeForce FX 5900 vs. Radeon 9800 era: the GeForce RTX 4080 XT (12GB). The FX 5900 XT ran at 400/700 (core/mem) instead of the 5900's 450/850.


----------



## phill (Oct 15, 2022)

I'm glad for not buying any new GPUs this time around.....


----------



## ARF (Oct 15, 2022)

wolf said:


> I don't think that covers the entirety of nvidia, their business, their products across all segments, their software across all segments and so on, and I think that says more about you than anyone else.
> 
> Objectively, the 4080 12gb was (set to be) an outright garbage product



Well, I have the impression that nvidia as a whole is a garbage business. They have certainly broken many ethical norms and ordinary laws.
I don't want to list them all, but here are some:

- cheats in 3DMark Nvidia accused of cheating in 3DMark 03 - GameSpot
- lower image quality - cheating in textures resolution resulting in higher performance to maintain competitiveness with ATi and AMD Radeon Question: Is nvidia cheating on image quality? Can someone confirm this? : Amd (reddit.com)
- nvidia partner program - illegal way to cut AMD from partnership with select OEMs GeForce Partner Program - Wikipedia
- GTX 970 3.5 GB marketed as 4 GB Nvidia settles class-action lawsuit over GTX 970 VRAM | KitGuru
- EVGA leaves nvidia EVGA Quits NVIDIA | Electronic Design
- XFX leaves nvidia XFX officially stops doing Nvidia (fudzilla.com)
- RTX 3080 10 GB controversy and cheating RTX 3080 VRAM usage warnings and the issue with VRAM pool sizes: the compromise of 4K gaming | ResetEra


----------



## [XC] Oj101 (Oct 15, 2022)

Dirt Chip said:


> So many 4080-12GB carton boxes with wrong letters on it in a snap.
> What a shitstorm.


I imagine that they will have a sticker over the old labeling.


----------



## sephiroth117 (Oct 15, 2022)

neatfeatguy said:


> I've had many driver issues with Nvidia over the years.
> 
> * Broken video playback in multiple formats; while playing games or streaming/playing videos
> * 2nd monitor never waking up from sleep. Only resolution was a system reboot.
> ...



I'm sorry, but I choose according to my own experience. If I've had 2 very bad AMD experiences, I'm not going to pick it a third time (unless RDNA 3 is good and even more stable).
I have 0 doubt Nvidia has its share of issues; it's just that, as someone who has had both GPUs, my experience with Nvidia was far better.

Just like Nvidia lost a customer with your issues, AMD lost a customer, me, with their drivers.




Daven said:


> There have been no significant tech reviewer comments about AMD driver failures. Only reason anyone thinks AMD drivers are bad is anonymous internet posts like this. Nothing you said can be verified but it will still make someone casually reading these forums think twice and continue to perpetuate this myth.



This is a very disappointing take.

So my opinion is presumably false because it can't be verified and is thus a myth, but your opinion that AMD is OK is the truth? Your post is anonymous too, and someone casually reading you will believe AMD has 0 driver issues, no?
I stated my experience with AMD, period. It's not universal; I'm pretty sure many have great experiences with AMD, but I didn't. 
Which customer is going to pick AMD a third time after being disappointed the first two times?


----------



## ARF (Oct 15, 2022)

sephiroth117 said:


> I'm sorry but I choose according to my own experience. If I've had 2 very bad AMD experience I'm not going to pick it a third time (unless RDNA 3 is good and even more stable)
> I have 0 doubts Nvidia has its share of issues, just that as someone who had both GPUs, my experience with Nvidia was far better.
> 
> just like Nvidia lost a customer with your issues, AMD lost a customer, me, with their drivers.



I suspect your RX 580 was defective at the hardware level; nothing to do with AMD's software. Or a broken Windows install or browser.


----------



## Dirt Chip (Oct 15, 2022)

[XC] Oj101 said:


> I imagine that they will have a sticker over the old labeling.


Also on the PCB?


----------



## 64K (Oct 15, 2022)

The 12GB 4080 was stupid anyway. Nvidia will launch the 4080 16GB and then launch an upgraded 4080 later on.

EDIT: This 4080 12GB will most likely be named the 4070 or 4070 Ti, which it should have been to begin with.


----------



## remunramu (Oct 15, 2022)

They tried to test people's stupidity: if people were willing to pay $2k+ for a 3080 during the GPU crisis, why not $900 for a 4070? That is also probably why Ngreedia is charging $1,200 for the 4080 16GB this gen.


----------



## N3M3515 (Oct 15, 2022)

64K said:


> The 12GB 4080 was stupid anyway. Nvidia will launch the 4080 16GB and then launch the upgraded 4080 later on.
> 
> EDIT: This 4080 12 GB will most likely be named the 4060 Ti or 4070 which it should of been to begin with.


Fixed.


----------



## ARF (Oct 15, 2022)

remunramu said:


> They tried to test people stupidity, if people were willing to pay $2k+ for 3080 during the GPU crisis why not $900 4070. That is also the possible cause why Ngreedia charge 4080 16gb with $1200 for this gen.



Maybe nvidia had a positive intention: to soften the outcry over the elevated pricing.






But why didn't they rename the 16 GB card as RTX 4080 Ti? While leaving the 12 GB as the vanilla RTX 4080?


----------



## TheDeeGee (Oct 15, 2022)

ARF said:


> Why should they insist that it has to be an 80 card and not a 60 card?
> 
> 
> 
> No, people are simply objective, while nvidia cheats.


A 4060 with 285 watt tdp makes no sense either, just because it has 12GB doesn't mean it's the 3060 replacement.

It's clearly a 4070 then, or 4070 Ti.


----------



## 64K (Oct 15, 2022)

N3M3515 said:


> Fixed.



imo Nvidia won't launch this GPU with the current specs as a 4060 TI. That would put it in the lower end of the midrange GPUs. Gamers won't spend that much on a lower end midrange GPU.


----------



## N3M3515 (Oct 15, 2022)

64K said:


> imo Nvidia won't launch this GPU with the current specs as a 4060 TI. That would put it in the lower end of the midrange GPUs. Gamers won't spend that much on a lower end midrange GPU.


I know; it's just that the "4080" 12GB is as cut down from the 4090 as the 3060 Ti is from the 3090.
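That comparison is easy to check by relative core count. A quick sketch (the CUDA core counts are assumptions from public spec sheets, not from this thread):

```python
# How far below its generation's flagship does each card sit, by CUDA cores?
# Core counts are assumptions from public spec sheets.
cores = {
    "RTX 4080 12GB": 7680,
    "RTX 4090": 16384,
    "RTX 3060 Ti": 4864,
    "RTX 3090": 10496,
}

ada_ratio = cores["RTX 4080 12GB"] / cores["RTX 4090"]   # ~0.469
ampere_ratio = cores["RTX 3060 Ti"] / cores["RTX 3090"]  # ~0.463
print(ada_ratio, ampere_ratio)  # both land at roughly 46-47% of the flagship
```

So by share of the flagship's cores, the "4080 12GB" really would have occupied the same slot the 3060 Ti did last generation.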


----------



## HTC (Oct 15, 2022)

ARF said:


> But why didn't they rename the 16 GB card as RTX 4080 Ti? While leaving the 12 GB as the vanilla RTX 4080?



Likely, the WAY TOO BIG gap between the 4090 and the "new" 4080 Ti.

If they keep the "old" 4080 16 GB, then they leave room for a 4080 Ti in the future that has performance roughly in between those 2 cards.


----------



## R0H1T (Oct 15, 2022)

TheDeeGee said:


> A 4060 with *285 watt tdp makes no sense either*, just because *it has 12GB doesn't mean it's the 3060 replacement*.
> 
> It's clearly a 4070 then, or 4070 Ti.


TDP on everything Nvidia has been creeping up for 3 gens now; it's a natural progression with the 4xxx series, but Nvidia being Nvidia, they want to overcharge for a mid/high-mid-range card just because they usually can. 

Neither does that make it a xx70 card; the die size does!


----------



## wolf (Oct 15, 2022)

ARF said:


> Well, I have the impression that nvidia as a whole is a garbage business. They certainly broke many ethical and ordinary laws.


Be that as it may, they make products and software that are compelling. Just because you've drawn a moral/ethical or whatever line and won't purchase their products, it doesn't by default mean the products are bad. Other tech companies aren't perfect and have made some shady choices too, and I buy their products as well. Don't wanna list them, so I won't.

So let's agree Nvidia cheated in 3DMark 03, the 970 with its 3.5GB was a fiasco and a half, etc.; that doesn't, for me, automatically disallow ownership of their products. Again, this decision doesn't require a departure from objectivity.

The 4080 12GB as a product was objectively dumb and misleading, and I'm glad they caved. It's actually good that they're listening; to what extent, we'll have to see, but at least it won't be called a 40*8*0 anymore. That's a start.

Never want to buy an NVidia product again? power to you, I guess.


----------



## N3M3515 (Oct 15, 2022)

HTC said:


> Likely, the WAY TOO BIG gap between the 4090 and the "new" 4080 Ti.
> 
> If they keep the "old" 4080", then they leave room for a 4080 Ti in the future that has performance roughly in between those 2 cards.


In the gap between the 3080 and the 3090, nvidia managed to slot in two 3080s, the Ti and the 12GB. Now that they've intentionally made the gap gigantic, I guess they can slot in like 5 GPUs in there...........lol


----------



## TheoneandonlyMrK (Oct 15, 2022)

One thing I would say is that this change proves that enthusiast opinion and reaction, and the consumer in general, do have influence and can reshape the nonsense these companies occasionally try to pull.

You don't have to buy it.

You definitely don't need to back a company doing shitty things.

And your opinion does matter; shout it out.


----------



## R0H1T (Oct 15, 2022)

It does but the prices are still way above where they should be in this global economy.


----------



## Dirt Chip (Oct 15, 2022)

We will see a new sub-category suffix. Not "Ti", not "Super", not "SE".
Maybe RL (re-launch), ST (second-try), or the self-explanatory "whatthef*havewedone".


----------



## ThrashZone (Oct 15, 2022)

Hi,
Funny 9 pages now


----------



## ARF (Oct 15, 2022)

TheDeeGee said:


> A 4060 with 285 watt tdp makes no sense either



It is a very aggressive factory overclock.
They have been increasing the TDP since forever. The *8800 Ultra* was a 171-watt card with a suggested PSU of only 450 W.

This is a natural consequence of their desire to always be on top of the competition, no matter the tools.

So, if the top dog can eat up to 600 watts, why can't the mid-range eat up to 285 watts?


----------



## N3M3515 (Oct 15, 2022)

wolf said:


> The 4080 12GB as a product was objectively dumb and misleading and I'm glad they caved


The 4080 16GB being price-hiked $500 (71%) over the 3080 is also DUMB. I don't know what mental gymnastics people will do to buy it at that price. Even if I had the money, I wouldn't buy it at that price; there is no more crypto mining.

I don't know how people haven't realized this either: *the top of the range has always been the x80; Nvidia invented the x90 to justify doubling the asking price, just like Apple and Samsung have done with the "Pro Max" and "Ultra" versions. And people swallowed that trick! lol. If they hadn't done that, the regular Galaxy S22 would have all the stuff the Ultra version has, for $700.
The key here is: people just don't think before they buy.*


----------



## solarmystic (Oct 15, 2022)

N3M3515 said:


> The 4080 16GB being $500 price hiked (71%) over the 3080 is also DUMB. Don't know what mental gymnastics people will do to buy it at that price. Even if i had the money i wouldn't buy it at that price, there is no more crypto mining.
> 
> I don't know how people also hasn't realized this: *the top of the range always has been the x80, nvidia invented x90 to justify doubling the asking price. Just like apple and samsung have done with the "Pro Max" and "Ultra" versions. And people swallowed that trick! lol. If they hadn't do that, the regular Galaxy S22* *would have all the stuff the ultra version has, for $700.
> The key here is: people just don't think before they buy.*



Tbf, when they introduced the x90 in the past, it was used exclusively for cards that had SLI on a single board (GTX 295, GTX 590, GTX 690). So the premium was somewhat justified, since they were usually cheaper than buying two of the x80 cards and putting them in SLI mode.

When they reintroduced it with the RTX 3090 as the Titan replacement for that generation, that's when they went full Gordon Gekko "greed is good" with it.


----------



## mechtech (Oct 15, 2022)

meh no DP 2.0, no deal 

edit - and $900 for a 192-bit mem bus....................lolz no thanks


----------



## Super Firm Tofu (Oct 15, 2022)

Vayra86 said:


> I would pay to say I've got Le graphics card tbh



But I am Le Tired.

This is turning out to be one of the strangest things in the tech world in a while.  'Unreleasing' a product a month after announcing it?  gg nVidia.


----------



## TheoneandonlyMrK (Oct 15, 2022)

wolf said:


> I don't think that covers the entirety of nvidia, their business, their products across all segments, their software across all segments and so on, and I think that says more about you than anyone else.
> 
> Liking and enjoying certain nvidia products does not require a departure from objectivity.
> 
> Objectively, the 4080 12gb was (set to be) an outright garbage product, and like I said, I haven't seen a single person defend it, and I'd be ultra surprised to see anyone on TPU defend it, given the user base here, especially the vocal ones.


Have a read through this thread and count the times AMD is mentioned in a negative light, usually via drivers; tangential arguments meant for deflection are defensive stances.
AMD has f-all to do with this yet.

And I was stunned by a few people telling me on this forum that it was named a 4080, they can't change it, and I should deal with it.

And that it's on consumers to buy right and not be stupid, that it's on them to know their shit.

Well, eat this thread, because it's not all on consumers. It is our choice, but accepting their output, or puckering your butt and paying up, are not the only options.
Be vocal, and act on your opinions; perhaps next generation they'll make cards that mere mortals can afford.

They would have to, to survive, instead of making car-sized and car-priced GPUs.

@ARF crack on then, debate: what are your points? Do you prefer high-priced, massive GPUs? Perhaps you want Huang in a new jacket?! What?!


----------



## ARF (Oct 15, 2022)

TheoneandonlyMrK said:


> @ARF crack on then debate, what's your points, you prefer high price massive GPU's, perhaps you want Huang in a new jacket?! What?!.



No, don't you see the irony?

I prefer Radeons.  If you don't want to buy an expensive GeForce for $1600, then buy a "cheap" RX 6800 for $550:

*Radeon RX 6400 - 170.00
Radeon RX 6500 XT - 194.80
Radeon RX 6600 - 285.89
Radeon RX 6600 XT - 423.23
Radeon RX 6650 XT - 364.00
Radeon RX 6700 XT - 448.17
Radeon RX 6750 XT - 497.28
Radeon RX 6800 - 549.00
Radeon RX 6800 XT - 646.93
Radeon RX 6900 XT - 737.00
Radeon RX 6950 XT - 909.00*

AMD has been adjusting prices recently with a negative trend; they're slowly on the decline.


----------



## wolf (Oct 15, 2022)

N3M3515 said:


> The 4080 16GB being $500 price hiked (71%) over the 3080 is also DUMB. Don't know what mental gymnastics people will do to buy it at that price. Even if i had the money i wouldn't buy it at that price, there is no more crypto mining.


The price for the 16GB model, I also agree, is insanity. The 12GB should simply never have been called a 4080 at all. Interestingly excessive use of bold for the rest of it, though.


----------



## Chaitanya (Oct 15, 2022)

Looks like memes like these hurt the ego of that leather jacket man.


----------



## neatfeatguy (Oct 15, 2022)

sephiroth117 said:


> I'm sorry but I choose according to my own experience. If I've had 2 very bad AMD experience I'm not going to pick it a third time (unless RDNA 3 is good and even more stable)
> I have 0 doubts Nvidia has its share of issues, just that as someone who had both GPUs, my experience with Nvidia was far better.
> 
> just like Nvidia lost a customer with your issues, AMD lost a customer, me, with their drivers.
> ...



I never said I stopped using Nvidia. I said both sides have issues. It's your choice to use whatever hardware you like. The only thing you shouldn't do is blatantly claim AMD has unstable drivers based solely on your own experience. I get that it's hard to believe otherwise (I have the same stigma about Apple products as you do about AMD's drivers). An outright statement that AMD drivers are unstable isn't accurate; your experience with them leads you to believe that, and that's okay. I think it's just how you worded it: saying "AMD needs to make their drivers more stable" is what throws people off. If you added "in my experience" and immediately followed with why you feel the drivers are unstable, it wouldn't rub those *AMD fanboys* people the wrong way.

Like I said in my last post, I've had a myriad of issues with Nvidia drivers over the years, and just because I still have an Nvidia card doesn't mean I explicitly trust them or think they're better than AMD (I really wanted to get a 6800 or 6800 XT when they launched, but scalping and direct sales to miners screwed us out of easily obtaining those cards, and by the time they were available they were easily priced $300-500 higher than Nvidia's overpriced GPUs; I ended up finding a 3080 for around $800 at a time when the 6800/6800 XT were between $1200-1500). Look at how Nvidia pushed out an updated driver to remove LHR... I think it's a horseshit way to play consumers and blatantly mislead them about how much they care about gamers, which is where LHR came from in the first place. They don't care; it was just marketing to help push sales, and now removing LHR in this latest driver is just another marketing ploy to help drive sales and clear the remaining stock of Ampere cards.

To each their own. I know Nvidia has driver issues and I know AMD does. I'm not here to sway people one way or the other. I just want them to understand issues are on both sides of the fence, regardless of what personal feelings/experiences they've had.



64K said:


> imo Nvidia won't launch this GPU with the current specs as a 4060 TI. That would put it in the lower end of the midrange GPUs. Gamers won't spend that much on a lower end midrange GPU.



They won't rename it as a 4060 Ti or 4070. I'm sure Nvidia is panicking and trying to figure out how to insert this into the Ampere lineup, call it something like the 3080 12GB Super, and price it right around $800. One last "revision" to the Ampere line-up (even though it's a different chip, yeah I know... but this is Nvidia we're talking about) to quickly remedy the shitty situation they've put themselves in by trying to claim this card should be a 4080 model when the 4080 16GB walks all over it by 25% or more.


----------



## TheDeeGee (Oct 15, 2022)

mechtech said:


> meh no DP 2.0, no deal
> 
> edit - and $900 for a 192-bit mem bus....................lolz no thanks


Can someone explain to me why everyone gets their panties in a twist over a memory bus?


----------



## ThrashZone (Oct 15, 2022)

TheDeeGee said:


> Someone explain me why everyone get's their panties in a twist because of a memory bus?


Hi,
Design.
It's supposed to work best with 8 or 16GB.
But nobody wants 8GB of memory, so they crap out a 10 or 12GB card and call it a day.


----------



## Calenhad (Oct 15, 2022)

Good. Well played Nvidia. Easiest bonus coverage in a long time


----------



## outpt (Oct 15, 2022)

Nvidia can jam it. After what happened to EVGA, I bought AMD this time around.


----------



## TheoneandonlyMrK (Oct 15, 2022)

TheDeeGee said:


> Someone explain me why everyone get's their panties in a twist because of a memory bus?


Do you take A and B roads to faraway places, or motorways?

Now take away all the motorways in your mind.

You're there.


----------



## Zareek (Oct 15, 2022)

ARF said:


> I suspect your RX 580 was defective on a hardware level, nothing to do with the software from AMD. Or broken Windows and browser.


I was going to say the exact same thing; what was described makes no sense otherwise. If it was a launch card, maybe, but it doesn't sound like it was. The other thought I had was that older Windows versions had issues when switching between AMD and Nvidia; you had to use a third-party utility to get all the old drivers out or reinstall Windows from scratch. I think that's been fixed since at least Windows 10. I think that issue was a BSOD on boot, though I may just be remembering it wrong. I know I used to use a third-party utility to clean out old drivers for some reason.


----------



## Sisyphus (Oct 15, 2022)

remunramu said:


> They tried to test people stupidity, if people were willing to pay $2k+ for 3080 during the GPU crisis why not $900 4070. That is also the possible cause why Ngreedia charge 4080 16gb with $1200 for this gen.


Anti scalping MSRP.


----------



## cvaldes (Oct 15, 2022)

Sisyphus said:


> Anti scalping MSRP.



There is no blanket threshold amount above which scalping happens and below which scalping stops.

Scalping happens when there are people who are willing to pay above MSRP whether it's a graphics card, concert tickets, or passes to the World Cup finals.

With the launch of the 4090, its quick sellout and subsequent eBay activity, clearly there are people who are willing to buy scalped 4090s above the porcine MSRPs.

During the pandemic and the Great GPU Shortage, we saw graphics cards (and CPUs) at all price levels scalped. Hell, even prices for used graphics cards went nuts. I bought a Sapphire Pulse Radeon RX 550 2GB card for my non-gaming daily driver PC in September 2020 for $65 (below its $79 launch price). At the peak of the shortage, this card was going for over $200.

Four. Year. Old. Graphics. Card.

People will buy a scalped potato if potatoes are rarities.


----------



## ARF (Oct 15, 2022)

cvaldes said:


> There is no blanket threshold amount above which scalping happens and below which scalping stops.
> 
> Scalping happens when there are people who are willing to pay above MSRP whether it's a graphics card, concert tickets, or passes to the World Cup finals.
> 
> ...



Instead of allowing themselves to be scalped, these people can simply wait patiently. Buying a graphics card is not a life-saving requirement.


----------



## cvaldes (Oct 15, 2022)

ARF said:


> Instead of allowing to be scalped, these people can simply patiently wait. Buying a graphics card is not a life-saving requirement



Of course patience is an option. However, not everyone is patient or sensible.

Scalpers exploit those who prioritize instant gratification over patiently waiting for a good value. FOMO is a driving influence for many. Look at the 3090. It's frequently available below MSRP now. The 6900 XT can be occasionally found at a ~50% discount from its MSRP which doesn't even go into the street prices during the Great GPU Shortage. It's not like 3090s stopped working the moment the 4090s started shipping.

Look at all of those super geniuses who pre-purchased _Cyberpunk 2077_ at full price which later repeatedly went on sale with discounts of 50% or more. FOR SOFTWARE. That wasn't even release quality. They paid top dollar for a sh!tty gameplay experience riddled with bugs and performance issues just to be first on the block to say they owned the game.

Heck, even graphics cards generally end up being better toward the end of the release cycle due to minor hardware improvements (respinning the boards, new chip steppings, etc.) and better software drivers.


----------



## AsRock (Oct 15, 2022)

P4-630 said:


> Great, now what to do with my $900....



Yeah, waiting here too, but AMD is right around the corner as well.


----------



## ARF (Oct 15, 2022)

cvaldes said:


> Of course patience is an option. However not everyone is patient.
> 
> Scalpers exploit those who prioritize instant gratification over a good value. FOMO is a driving influence for many.
> 
> Look at all of those super geniuses who pre-purchased _Cyberpunk 2077_ at full price which later repeatedly went on sale with discounts of 50% or more. FOR SOFTWARE.



I hope they deeply regret it afterwards. There is no justification for throwing their hard-earned money to the wind.


----------



## medi01 (Oct 15, 2022)

Plenty of German (DE) shops sell the 4090 and have it in stock.

Although the price point is 2500... which is quite a bit too much, even given the 19% VAT.


----------



## ZoneDymo (Oct 15, 2022)

cvaldes said:


> Of course patience is an option. However not everyone is patient.
> 
> Scalpers exploit those who prioritize instant gratification over a good value. FOMO is a driving influence for many.
> 
> Look at all of those super geniuses who pre-purchased _Cyberpunk 2077_ at full price which later repeatedly went on sale with discounts of 50% or more. FOR SOFTWARE. That wasn't even release quality. They paid top dollar for a sh!tty gameplay experience riddled with bugs and performance issues just to be first on the block to say they owned the game.



Even on sale it's still not even a shadow of what was promised, so even buying it at a 50% discount would be giving too much.
At this point I'll get it when it's like 8 bucks; it's not worth more.


----------



## 3x0 (Oct 15, 2022)

Dropping the 4080 16GB to $900 and the 4070 (aka 4080 12GB) to $600 would make the situation slightly better, but I'm pretty sure they won't do it unless pressured by AMD.


----------



## cvaldes (Oct 15, 2022)

ARF said:


> I hope that they deeply regret afterwards. There is no justification of throwing their hard-earned money against the wind.



Well, it's their money. They can do with it however they please. They have the opportunity to turn down that $7 espresso drink, that $15 10-oz. beer at the ballpark, that $25 bowl of ramen.

When the renamed 4080 12GB card starts selling, it's up to each person to decide whether or not they will pay what is being charged for it, whether it's the Founders Edition at MSRP or some ridiculously priced scalped offering on fleeceBay.

But for sure, there are those who will pay big bucks for scalped product including some TPU participants. By doing business with scalpers they are encouraging scalping to continue.


----------



## medi01 (Oct 15, 2022)

From resetera thread, cough:


----------



## cvaldes (Oct 15, 2022)

medi01 said:


> From resetera thread, cough:
> 
> View attachment 265602



This is just the results from one game at one resolution so it only gives a glimpse of the performance differences. Again, cherry picking one game benchmark isn't really productive since most PC gamers don't play just one title.

There's a +45% performance increase between the 3080 and 4080 (16GB) models.

Amusingly, there's also a +45% increase between the 3070 and 4080 (12GB) models. 

So the 4080 (12GB) really does look like a 4070 if one were to expect a similar generational uplift in performance.
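The percentage-uplift arithmetic used in comparisons like this is simple to check. A minimal sketch, using the single-game FPS figures quoted in this thread (106 FPS for the 4090, 55 FPS for the 4080 16GB); these numbers are illustrative only, not a general benchmark:

```python
def uplift_pct(new_fps: float, old_fps: float) -> float:
    """Percentage performance increase of new_fps over old_fps."""
    return (new_fps / old_fps - 1) * 100

# 4090 vs 4080 16GB, per the figures cited in this thread
print(round(uplift_pct(106, 55), 1))   # ~92.7 (often rounded to "92.5%")

# sanity check: a card 45% faster than its predecessor
print(round(uplift_pct(145, 100)))     # 45
```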


----------



## medi01 (Oct 15, 2022)

cvaldes said:


> This is just the results from one game at one resolution


No shit Watson, but it is also quite in line with the expectations, given the CU cutdowns.



cvaldes said:


> , cherry picking one game


Oh, I didn't know there were other game benchmarks. Share them, please.

PS
Eternal






cvaldes said:


> Amusingly, there's also a +45% increase between the 3070 and 4080 (12GB) models.


That's a funny way of saying "the 4080 12GB is quite on par with the 3080".


----------



## cvaldes (Oct 15, 2022)

medi01 said:


> Oh, I didn't know there are other game benchmarks. Share them please.



I could share more gaming benchmarks -- as time permits -- after you show mastery of the term STFW.

Hint: there are plenty of other gaming benchmarks on the Internet. Even TPU graphics card reviews cover multiple games, so start here:

www.techpowerup.com

and once you've read all 833 posts, check back at the beginning because invariably there will be more. That should keep you busy for the weekend.

Enjoy!


----------



## N/A (Oct 15, 2022)

Why on earth is the 4090 106 FPS and the 4080 55 FPS? That's 92.5% faster with only 70% more CUDA cores; what is this free performance? Is 16GB not enough already?


----------



## cvaldes (Oct 15, 2022)

N/A said:


> Why on earth is 4090 106 Fps, and 4080 - 55 Fps. That's 92.5% and it only has 70% more CUDA, what is this free performance,. 16GB not enough already?



It's likely that it is not just one factor but a combination of many factors including -- but not limited to -- memory bus width, memory clock frequency, memory bandwidth, GPU clock frequency, and other things. Game performance isn't based on one type of transistor on a GPU.

Remember that the 4090 has a 384-bit memory bus and the 4080 has 256-bit.

As mentioned repeatedly in many, many threads, NVIDIA is binning silicon and earmarking better transistors for their higher priced products. Their top GPUs end up in data centers.

The very excellent GPUs end up in their top-tier graphics cards. They are also binning VRMs, VRAM, and other silicon; not all GDDR6X chips are the same, in the same way that not all DDR4 chips perform equally. Not sure if you've noticed that.

All of these slight improvements add up.

There's also a very real possibility that the software driver used for these comparisons was optimized for the 4090. After all, that was the first Ada Lovelace card to be released so undoubtedly NVIDIA engineers prioritized that GPU.

This is yet another example of why one can't look at a single game benchmark at a single display resolution and make conclusive statements. Some games benefit from more raster cores; some can take advantage of RT and ML cores. Other games might favor fast and wide memory transfers, others just a lot of VRAM. Some games rely more on the CPU. And not every card works equally well with all graphics APIs: some cards are better for DX11 games, others are better for DX12. And game developers sometimes end up writing code that favors a particular architecture, occasionally because game development was sponsored by a GPU manufacturer (AMD, NVIDIA, and now Intel).

So in the end, it's more than counting CUDA cores.


----------



## medi01 (Oct 15, 2022)

cvaldes said:


> Hint: there are plenty of


You didn't realize the chart contains 4080, did you...


----------



## Nater (Oct 15, 2022)

Zareek said:


> I was going to say the same exact thing what was described makes no sense otherwise. If it was a launch card, maybe but if doesn't sound like it was. The other thought I had was older Windows versions had issues when switching between AMD and Nvidia, you had to use a third party utility to get all the old drivers out or re-install Windows from scratch. I think that was fixed since at least since Windows 10. I think that issue was a BSOD on boot, I may just be remembering wrong. I know I used to use a third party utility to clean out old drivers for some reason.


I literally had a GTX 1070 + A2000 in my rig (with gaming and workstation drivers) and then swapped the 1070 for an RX580 + A2000 in my system.  So I had Adrenaline and RTX Studio drivers installed.  Never did an uninstall/reinstall between cards on either swap.  They all worked flawlessly together, although I only ran it that way for a few hours to make sure the cards were working.


----------



## metalslaw (Oct 15, 2022)

New name is gonna be 4070 ti Super Duper, and they will charge $100 more.


----------



## cvaldes (Oct 15, 2022)

medi01 said:


> You didn't realize the chart contains 4080, did you...



Oh, I'm pretty sure I've typed "4080" multiple times in my recent comments to this discussion about the resetera chart. In fact, I even noticed that both 4080 16GB and 4080 12GB were used.

The point is that using a single game benchmark to describe _____ graphics card (regardless of the make and model number) at one resolution is not a meaningful way of summarizing its capabilities. Different cards exhibit different performance with different software under different operating conditions (not just display resolution).

Again, cherry-picking one graphics benchmark to argue a point just shows pure desperation and a complete disconnect from reality. That benchmark is only valid for that one game under those specific test conditions.

I don't know about you, but I don't play just one game. Most people don't, which is why these single-game benchmarks are nearly meaningless unless they're part of a larger assessment that includes other benchmarks, datapoints, and situations.

These single game benchmarks are actually better for assessing how well the game's code was written by seeing if there are gross anomalies between various cards. Sometimes gaming benchmarks/performance tuning guides are good for pinning down optimal settings for better performance.

I realize this is a very difficult concept for you to grasp and undoubtedly someone else can explain it better to you than me. I don't write graphics card reviews for a living.


----------



## mechtech (Oct 15, 2022)

TheDeeGee said:


> Someone explain me why everyone get's their panties in a twist because of a memory bus?


I don't.  I got sand in them for a $900 192-bit bus.  For that price I want a 256-bit bus................whether it helps or not.


----------



## N/A (Oct 15, 2022)

I expected a 4070 with 8,704 CUDA cores and 16GB. Whatever they name it, it would still let me down: 42 FPS while the 4090 does 106. Whatever this is now, it should be no more than $599, and $799 for the 4080 16GB.
Nvidia isn't undoing how bad this is. Every time the mining ends, they somehow put up the most ridiculous prices. Let's hope AMD puts a dent in their market share for good.


----------



## mechtech (Oct 15, 2022)

P4-630 said:


> Great, now what to do with my $900....


Halloween candy??


----------



## ARF (Oct 15, 2022)

N/A said:


> I expected 4070 with 8704 Cuda and 16GB. Whatever they name it would still let me down. 42 FPS and 4090 does 106. This whatever this is now, should be no more than $599. and 799 for the 4080 16GB.
> Nvidia isn't undoing how bad this is.. Everytime after the mining ends they somehow put up the most ridiculous prices, Lets hope AMD puts a dent in their market share for good.



Yes, it was not fair for the customers.

Look:
*RTX 3080 10 GB*: SM - 68; Shaders - 8704; Memory throughput - 760.3 GB/s; TDP - 320 W; Price - $700
*RTX 4080 12 GB*: SM - 60; Shaders - 7680; Memory throughput - 504 GB/s; TDP - 285 W; Price - $900
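For what it's worth, the throughput figures above follow directly from bus width and per-pin data rate. A quick sketch; the 19 and 21 Gbps rates are the published GDDR6X speeds for these two cards, and the result is approximate (the small gap between 760.0 and the quoted 760.3 GB/s comes from exact memory-clock rounding):

```python
# Memory throughput in GB/s = (bus width in bits / 8 bits per byte)
# * per-pin data rate in Gbps. Approximate; real figures depend on
# the exact memory clock.
def bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    return bus_width_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(320, 19))  # RTX 3080 10 GB: 760.0 (quoted as 760.3)
print(bandwidth_gb_s(192, 21))  # RTX 4080 12 GB: 504.0
```

This is why the narrower 192-bit bus matters: even with faster 21 Gbps memory, the 4080 12GB ends up with roughly two-thirds of the 3080's bandwidth.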





NVIDIA cancels GeForce RTX 4080 12GB - VideoCardz.com


----------



## ARF (Oct 16, 2022)

What is more profitable?
1. to sell 1,000,000 units at a profit margin of 5 money units, or;
2. to sell 1,000 units at a profit margin of 600 money units?

I think that Nvidia has sold no more than 1,000 RTX 4090s worldwide, and its profit margin on this card is in the range of many hundreds of dollars.
If it sells for $1600, then at least $1000 is net profit margin *per card!*
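Running the toy numbers from the question (hypothetical unit counts and margins, nothing official):

```python
# Total profit from each hypothetical strategy: units sold * margin per unit.
volume_strategy = 1_000_000 * 5  # many units, thin margin
halo_strategy = 1_000 * 600      # few units, fat margin

# Under these toy numbers the volume play earns over 8x more, so the
# answer hinges entirely on how many units actually sell.
print(volume_strategy, halo_strategy)  # 5000000 600000
```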


----------



## N/A (Oct 16, 2022)

Actually, the 4090 is sold at a loss, the 4080 just about breaks even, and the 4070 is $150 into profit.

There was no need to impose RTRT and tensor cores or GDDR6X on us, or to use 4N; there is 6N, which also works perfectly well, 2.5 GHz no problem.
We are a long way from RTRT in a value-king consumer card, but Nvidia had to have it their way.


----------



## N3M3515 (Oct 16, 2022)

cvaldes said:


> There's a +45% performance increase between the 3080 and 4080 (16GB) models.


And a +70% price increase



N/A said:


> Why on earth is 4090 106 Fps, and 4080 - 55 Fps. That's 92.5% and it only has 70% more CUDA, what is this free performance,. 16GB not enough already?


Just look at the 3080 vs 3090, only 13% difference.


----------



## ARF (Oct 16, 2022)

N/A said:


> Actually 4090 is sold at a loss, 4080 just about breaks even, and 4070 is $150 into profit.
> 
> There was no need to impose on us the RTRT and tensor cores or GDDR6X or use 4N, there is the 6N that also works perfectly well, 2.5Ghz no problem.
> We are a long way from RTRT in a value king consumer card, but Nvidia had to have it their way.



Yeah, AMD has warned them that full-scale RT is available only in the cloud.

Anyway, without ramping production up to some millions of N4 units sold, they won't see that profit. I guess they cannot pay off the tape-outs and the initial lithography equipment investments.


----------



## Minus Infinity (Oct 16, 2022)

Further to this story, it's now come to light that Nvidia was originally going to use AD103 for the 4070, not the 4080. So why they changed course and used it for the 4080 is screwed up. AMD's 7800 XT will demolish the 4080 12GB, and their MCM architecture is cheaper to manufacture than even RDNA2's monolithic chips, so they will easily be able to undercut Ngreedia on price. The 7600 XT is supposedly going to match the 6900 XT in rasterisation if not beat it, so the 7700 XT will probably beat the 4080 12GB itself (other than in RT). AMD's stack looks very promising, and AMD is increasing bus widths as Ngreedia cuts them. Don't expect them to die in the arse at 4K this time: 7900 XT 384-bit, 7800 XT 320-bit, 7700 XT 256-bit.


----------



## Fluffmeister (Oct 16, 2022)

Yeah! God bless AMD, they will save the day with cheap and powerful cards for all!


----------



## wheresmycar (Oct 16, 2022)

Minus Infinity said:


> Further to this story, its now comes to light Nvidia were originally going to use AD103 for 4070, not 4080. So why they changed and used it for 4080 is screwed up. AMD's 7800XT will demolish the 4080 12GB and their MCM architecture is cheaper to manufacture than even RDNA2's monolithic chips, and they will be easily able to undercut Ngreedia on price. The 7600XT is supposedly going to match the 6900XT in rasterisation if not beat it, so 7700XT will probably beat the 4080 12GB itself (other than in RT). AMD's stack looks very promising and AMD is increasing bus widths as Ngreedia cut them. Don't expect them to die in the arse at 4K this time. 7900XT 384 bit, 7800XT 320 bit, 7700XT 256 bit.



What's even more frustrating is the 4080 (16GB)'s set price and the 4080 (12GB)'s 'attempted' MSRP. A mid-tier 4070 now has a reference point of $900; even if you trim $100-$200, it's still way too expensive. IMO, they should have accepted FULL DEFEAT and pulled both 4080 variants, rebranding the 4080-12 to 4070 and then MSRP-fine-tuning both cards to scale back to last gen's price-to-performance trajectory, with some icing on top for the 40-series perks. Yeah, I know, vastly improbable in the current launch climate, but it's hardly recklessly hopeful. Even last gen's incremental uptick in price is heavy on the wallet... I can't believe how NVIDIA took last gen's already inflated price levels and, irrespective of the buyers' leanings, smashed those prices through the roof.

If I hear one more comment like "it's a business, not a charity"... without responding, I'll be cursing at them through motionless silence and deep reflection on consumer absurdity.

Now it just boils down to what AMD offers and how NVIDIA responds. If it's not slashing prices but introducing SUPERs/Tis to compete, I ain't parting with a single dime towards the lifetime-ascribed green camp.


----------



## wolf (Oct 16, 2022)

Fluffmeister said:


> Yeah! God bless AMD, they will save the day with cheap and powerful cards for all!


Far too many people say this unironically.

I've seen a good chunk of what AMD is capable of in terms of greed in the last few years too.


The 6500 XT as a product, and its price
The SAM debacle
The older AM4 chipsets and Zen 3 debacle
The pricing of the 5600X and 7600X, among others
There are certainly more. No saints in this game. Are they 'less bad'? Perhaps. I find it easier to disregard all the bulldust and focus on which products are compelling to me.


----------



## AnotherReader (Oct 16, 2022)

N/A said:


> Actually 4090 is sold at a loss, 4080 just about breaks even, and 4070 is $150 into profit.
> 
> There was no need to impose on us the RTRT and tensor cores or GDDR6X or use 4N, there is the 6N that also works perfectly well, 2.5Ghz no problem.
> We are a long way from RTRT in a value king consumer card, but Nvidia had to have it their way.


How do you figure that the 4090 is sold at a loss?


----------



## Tsukiyomi91 (Oct 16, 2022)

Just fucking name it the RTX 4070, revise the price to $699, and be done with it. NoVideo is gonna contend with not just the consumers but their investors. I wanna see them try.



wolf said:


> Far too many people say this unironically.
> 
> Seen a good chunk of what AMD is capable of in terms of greed in the last few years too.
> 
> ...


With AMD missing their $1 billion revenue mark, I'd say their "fanbase" has woken up to the BS Team Red has been doing.


----------



## Prima.Vera (Oct 16, 2022)

N/A said:


> Actually 4090 is sold at a loss, 4080 just about breaks even, and 4070 is $150 into profit.


I don't think I've ever read such a big pile of crap on TPU before.
Biggest nonsense ever.


----------



## PapaTaipei (Oct 16, 2022)

N/A said:


> Actually 4090 is sold at a loss, 4080 just about breaks even, and 4070 is $150 into profit.
> 
> There was no need to impose on us the RTRT and tensor cores or GDDR6X or use 4N, there is the 6N that also works perfectly well, 2.5Ghz no problem.
> We are a long way from RTRT in a value king consumer card, but Nvidia had to have it their way.


Sold at a loss? Nice one, troll.


----------



## Dirt Chip (Oct 16, 2022)

N/A said:


> Actually 4090 is sold at a loss, 4080 just about breaks even, and 4070 is $150 into profit.
> 
> There was no need to impose on us the RTRT and tensor cores or GDDR6X or use 4N, there is the 6N that also works perfectly well, 2.5Ghz no problem.
> We are a long way from RTRT in a value king consumer card, but Nvidia had to have it their way.


I see. So basically, "the more you buy, the more they lose", right?

People, we've found the way to bankrupt NV.
Stock up on all the 4090s you can.

That's a good one


----------



## AusWolf (Oct 16, 2022)

Chrispy_ said:


> The reason they didn't want to call it a 4070 is because the price they want for an xx70 card is obscene.
> 
> Perhaps now it will be a 4070 after all, but they're going to have to explain why it's $900 instead of $500. If I had to guess, the explanation from Nvidia will be "F*CK YOU ALL, GIVE US YOUR MONEY"


You have to pay double because it's Nvidia. The same way you have to pay double for a (used) BMW even though it's just a car like any other.



ARF said:


> What is more profitable?
> 1. to sell in quantities of 1,000,000 units with profit margin of 5 money units, or;
> 2. to sell in quantities of 1,000 units with profit margin of 600 money units?
> 
> ...


Of course. It's a halo product, and history has proven that people pay whatever they're told to pay for halo products. Besides, it doesn't matter how profitable the 4090 is as long as it's profitable - and as long as the rest of the market is covered by 30-series cards.
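Just to put numbers on ARF's hypothetical (these are his toy figures, not real NVIDIA data), the comparison is a one-liner:

```python
# ARF's hypothetical scenarios: total profit = units sold * margin per unit
mass_market  = 1_000_000 * 5   # many units, thin margin  -> 5000000
halo_product = 1_000 * 600     # few units, fat margin    -> 600000

# On these toy numbers the volume play earns over 8x more.
print(mass_market, halo_product)
```

So on ARF's own figures, option 1 is far more profitable; halo pricing only wins if it doesn't cost you that volume.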


----------



## Dirt Chip (Oct 16, 2022)

Endgame: renamed to 4075, drop $50.
Everyone loses.
AMD, please make your move already


----------



## [XC] Oj101 (Oct 16, 2022)

Dirt Chip said:


> Also on the pcb?


Won’t be the first time, I’ve seen it done on MSI motherboards.


----------



## wolf (Oct 16, 2022)

Tsukiyomi91 said:


> with AMD missing their 1 billion dollar revenue mark, I say their "fanbase" have woken up to the BS Team Red has been doing.


To an extent, but it's far from everyone; the mental gymnastics I see used to defend them continue to baffle me.

I fanboy for no company whatsoever; it's absolutely bloody pointless. Yet some would have you believe this is some holy war and that they're occupying some sort of perceived moral high ground.

Call out when products or 'moves' are bad, praise them when good, recommend good products, don't recommend bad ones and so on, it's all a pretty simple affair if you don't allow yourself to 'take sides'.


----------



## arni-gx (Oct 16, 2022)

So, the RTX 4080 12 GB will become the RTX 4070 12 GB...? The price must come down... the new RTX 4080 16 GB included...


----------



## Bomby569 (Oct 16, 2022)

No one won anything, Nvidia still doesn't care what you say, they still have lots of 3000 series cards to sell, and can't have a 900$ card getting in the way.


----------



## ZoneDymo (Oct 16, 2022)

what an odd conversation this has turned into


----------



## GamerGuy (Oct 16, 2022)

IF nVidia wants to stick with the 'RTX 4080' moniker, they can always call it the RTX 4080 FE ...... as in Fake Edition.


----------



## AusWolf (Oct 16, 2022)

Bomby569 said:


> No one won anything, Nvidia still doesn't care what you say, they still have lots of 3000 series cards to sell, and can't have a 900$ card getting in the way.


On the one hand, they won't be able to justify selling a $900 4070. That's a small win. On the other hand, they won't sell anything new while their inventory of the old stuff lasts. So ultimately, the "4080 12 GB" didn't get cancelled - it only got delayed to be released under another name later. Nvidia will probably sit on their AD104 chips until 30-series stock completely dries out to make sure there is a shortage of similarly priced cards. As long as they have the upper hand with better RT, DLSS and the Tensor cores, they don't need to compete on price - the only thing they need is a shortage of Nvidia cards, which they can create whenever they want to. We may win battles like this one, but the war can only be won by decent competition.


----------



## Dirt Chip (Oct 16, 2022)

AusWolf said:


> On the one hand, they won't be able to justify selling a $900 4070. That's a small win. On the other hand, they won't sell anything new while their inventory of the old stuff lasts. So ultimately, the "4080 12 GB" didn't get cancelled - it only got delayed to be released under another name later. Nvidia will probably sit on their AD104 chips until 30-series stock completely dries out to make sure there is a shortage of similarly priced cards. As long as they have the upper hand with better RT, DLSS and the Tensor cores, they don't need to compete on price - the only thing they need is a shortage of Nvidia cards, which they can create whenever they want to. We may win battles like this one, but the war can only be won by decent competition.


Correct. 4070 is just a name.
You pay for performance, not for words.
A 4070 at the same $900, or a bit less, is very much possible. The performance of the SKU hasn't changed and won't: the chips already exist and are assembled. NV still has the upper hand, so don't expect a better perf/$ high-end GPU anytime soon.


----------



## AusWolf (Oct 16, 2022)

Dirt Chip said:


> 4070 is just a name.
> You pay for performance, not for words.
> A 4070 at the same $900 is very much possible. The performance hasn't changed...


Like I said, as long as they have better RT and DLSS, they can name and price it whatever they want to.


----------



## ARF (Oct 16, 2022)

Dirt Chip said:


> 4070 is just a name.
> You pay for performance, not for words.
> A 4070 at the same $900 or a bit less is very much possible. The performance hasn't changed...



The performance is low for a new-generation card at $900.
The RTX 3090 and RTX 3090 Ti cost the same and are 100% better buys.

So, at $900 it will be DOA.


----------



## HTC (Oct 16, 2022)

I do wonder what they'll end up naming the new card.

If they name it a 4070-something, they'll be called out because *this chip* was being sold as a "low 4080" and it would "lose a tier". But, going by nVidia's own stated reason for "unlaunching" the card, they can't sell it as a 4080-something later either, or they'll be called out for repeating what they did this time: i don't see a way forward for this card either way.


----------



## Dirt Chip (Oct 16, 2022)

ARF said:


> The performance is low for a new-generation card at $900.
> The RTX 3090 and RTX 3090 Ti cost the same and are 100% better buys.
> 
> So, at $900 it will be DOA.


Maybe for you it is DOA, but as long as there isn't any better product, the market has proven it's willing to pay as much as needed.

I agree that the price is way out of scale, no matter what letters the product carries. I'm not even considering any 4xxx at this time.
I advise everyone to do the same, knowing many will still buy them as soon as they arrive.


----------



## ARF (Oct 16, 2022)

HTC said:


> I do wonder what they'll end up naming the new card.
> 
> If they name it a 4070-something, they'll be called out because *this chip* was being sold as a "low 4080" and it would "lose a tier". But, going by nVidia's own stated reason for "unlaunching" the card, they can't sell it as a 4080-something later either, or they'll be called out for repeating what they did this time: i don't see a way forward for this card either way.



They can't change the specifications - they have the chip and must use it. They didn't cancel production of the chip; they only cancelled the launch of a product called "4080-12".
AD104 will be used, maybe with more shaders enabled.

I suspect AMD will surprise them with Navi 31 and Navi 32, so they suddenly realised that the "4080-12", as it was, would lose badly, and the only choice was to think up a new product to compete better with Navi 31 and Navi 32.



Dirt Chip said:


> Maybe for you it is DOA, but as long as there isn't any better product



lol, RX 6900 XT, RX 6950 XT, RTX 3090 and RTX 3090 Ti are already better products.. This is BEFORE Navi 31 and Navi 32


----------



## Dirt Chip (Oct 16, 2022)

ARF said:


> lol, RX 6900 XT, RX 6950 XT, RTX 3090 and RTX 3090 Ti are already better products.. This is BEFORE Navi 31 and Navi 32


We have yet to see any proper 4080 benchmark, so it is hard to tell.
Of course an existing product is better than a delayed one.
Maybe NV is betting too much on DLSS 3 as a selling point to justify the same performance as a 3090/Ti with much less memory.
Time will tell...

Anyway, the point was that the renamed 4080-12 doesn't need to change its price just because of the renaming itself.


----------



## AusWolf (Oct 16, 2022)

Dirt Chip said:


> Anyway, the point was that the renamed 4080-12 doesn't need to change its price just because of the renaming itself


It might have to because of its performance... unless Nvidia waits with its release until all 3080 and 3090 inventory has dried up. Then, being the only card in its performance segment, they can name and price it whatever they want to.


----------



## ARF (Oct 16, 2022)

Dirt Chip said:


> We have yet to see any proper 4080 benchmark, so it is hard to tell.



With those garbage specifications? A 192-bit memory interface, only 12 GB, and only 7,680 shaders? Please, don't expect miracles.
Its performance will be around RTX 3080-RTX 3090 at most.


----------



## Bomby569 (Oct 16, 2022)

AusWolf said:


> Like I said, as long as they have better RT and DLSS, they can name and price it whatever they want to.



to be completely fair, they don't even need a better card; as we've seen in the past, they will always sell more of them than the competition. You just have to look at the Steam charts to see the insane difference


----------



## N/A (Oct 16, 2022)

Prima.Vera said:


> I don't think I ever read such a big pile of crap ever on TPU before.
> Biggest nonsense ever.



A quote from Tech Yes City that I didn't attribute properly. I used it to reinforce my point that big RT cores are very expensive and mostly useless at this point in time.


----------



## medi01 (Oct 16, 2022)

cvaldes said:


> cherry picking through graphics benchmarks


Share other graphics benchmarks of 4080, be so kind.



cvaldes said:


> I don't know


That's ok and rather obvious.



Dirt Chip said:


> We yet to have any 4080 proper benchmark so it is hard to tell.


Well, here is a "non-proper" one (from her majesty nGreedia) that is supposed to highlight the 40-series' greatness... I guess (note how the 4080 12 GB stacks up against the 3080):

Nvidia Releases the First RTX 4080 and DLSS 3 Benchmarks - ExtremeTech
The 16GB RTX 4080 is notably faster than its 12GB sibling.
www.extremetech.com


----------



## ARF (Oct 16, 2022)

medi01 said:


> Share other graphics benchmarks of 4080, be so kind.



Very low performance:


----------



## gffermari (Oct 16, 2022)

ARF said:


> Very low performance:
> 
> View attachment 265692



It's not low performance.
It's ok for a 70-series card to match the Ti/90 of the previous generation.
The 3070 did the same to the 2080 Ti.
The 4070 even has the advantage of actually beating the 3090 Ti in newer games, which the 3070 couldn't claim against the 2080 Ti.


----------



## Tatty_One (Oct 16, 2022)

HTC said:


> I do wonder what they'll end up naming the new card.
> 
> If they name it 4070 something, they'll be called out on it because *this chip* was being named as a "low 4080" and it would "lose a level" but, and according to nVidia and their reason for "unlaunching" this card as a "low 4080", they won't be able to sell it as a 4080 something later on, or they'll be called out for what they did this time: i don't see a way forward with this card either way.


As they will already have a 4070 in production I would imagine, maybe this card could become the 4070Ti at some point?


----------



## N/A (Oct 16, 2022)

gffermari said:


> It's not low performance.
> It's ok for a 70-series card to match the Ti/90 of the previous generation.
> The 3070 did the same to the 2080 Ti.
> The 4070 even has the advantage of actually beating the 3090 Ti in newer games, which the 3070 couldn't claim against the 2080 Ti.



The 970 did that to the 780 Ti, the 1070 to the 980 Ti, and maybe the 2070 to the 1080 Ti. But this is a 90 Ti, which could be considered an 80 now, and only the 16 GB model qualifies in today's games.


----------



## Dirt Chip (Oct 16, 2022)

ARF said:


> With those garbage specifications? 192-bit MI and only 12 GB and only 7680 shaders? Please, don't expect miracles.
> Its performance will be around RTX 3080-RTX 3090 max.


The memory bus width is meaningless.
The shader count is also meaningless.
12 GB is enough for 4K.

The performance is the performance, not the bus width or shader count or GB.
If it is at the 3090 Ti's level or better, as in the image above, then no matter the spec, it is in the same performance tier.
It can actually deliver about the same FPS with a much smaller spec; I think that's a wonderful thing by itself - 192-bit doing the work of 384-bit, 7,680 shaders vs 10,752, 12 GB vs 24 GB. No less than a "miracle"



ARF said:


> Very low performance:


How is 3090 Ti level and above "very low"??


----------



## ARF (Oct 16, 2022)

Dirt Chip said:


> GB is enough for 4K.



It's fine if you think so. 12 GB doesn't look good for *future* 4K gaming. It will not age like fine wine


----------



## HTC (Oct 16, 2022)

Tatty_One said:


> As they will already have a 4070 in production I would imagine, maybe this card could become the 4070Ti at some point?



This "low 4080 card" was based on the AD104 chip IIRC: if they start using those chips for 4070 class products, what does that tell us about nVidia trying to originally use that as a 4080 class product?

There's A BIG DIFFERENCE between using a cut-down version of a product as a lesser product of the same class *using the same chip*, and using different chips to achieve that.


----------



## gffermari (Oct 16, 2022)

N/A said:


> The 970 did that to the 780 Ti, the 1070 to the 980 Ti, and maybe the 2070 to the 1080 Ti. But this is a 90 Ti, which could be considered an 80 now, and only the 16 GB model qualifies in today's games.



12 GB is ok for a 70-series card, and yes, most of the 70s nearly match or exceed the best card of the previous gen, no matter what that best gaming card is called (980 Ti, 1080 Ti, 2080 Ti, 3090 Ti).

Apart from the 2070, which is slower than the 1080 Ti.


----------



## ARF (Oct 16, 2022)

Dirt Chip said:


> It can actually do about the same FPS with much less spec, I think it's a wonderful thing by itself - 192bit do the same work as 384bit, 10752 vs 7680 , 12GB vs 24GB. No less than a "miracle"



It's an illusion. Between the higher-specced proper 24G RTX 3090 Ti and a limited 12G RTX 4080 I would choose the former any day.


----------



## Dirt Chip (Oct 16, 2022)

ARF said:


> It's an illusion. Between the higher-specced proper 24G RTX 3090 Ti and a limited 12G RTX 4080 I would choose the former any day.


Illusion? So you're saying the FPS counter is false only on the 4080 12 GB?
If you're buying a GPU to have specs instead of a GPU to play games or work on, then I get you, but you might end up with lower FPS in the end.
Why instantly choose a 450 W GPU if you can have the same performance level from a 285 W GPU (37% less)?
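For what it's worth, the 37% figure checks out against the board powers quoted in this thread (a quick sanity check, not official spec-sheet data):

```python
# Stock board powers as quoted in this thread (watts)
tdp_3090ti = 450     # RTX 3090 Ti at stock
tdp_4080_12gb = 285  # the cancelled "4080 12 GB"

# Fractional power saving of the smaller card vs the 3090 Ti
saving = (tdp_3090ti - tdp_4080_12gb) / tdp_3090ti
print(f"{saving:.0%}")  # → 37%
```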


----------



## Aquinus (Oct 16, 2022)

Once a morally bankrupt company, always a morally bankrupt company.


----------



## gffermari (Oct 16, 2022)

Aquinus said:


> Once a morally bankrupt company, always a morally bankrupt company.
> View attachment 265696



Not exactly.
It's actually the only company that innovates and delivers, and since they've been miles ahead of the competition, they set the pace. Like Intel did for a decade. I don't like their pace, but anyone would do the same in their position.
If you don't like it, though, there are alternatives. There always have been.


----------



## AusWolf (Oct 16, 2022)

Bomby569 said:


> to be completely fair, they don't even need a better card; as we've seen in the past, they will always sell more of them than the competition. You just have to look at the Steam charts to see the insane difference


That's what I mean. They can easily sell lesser products only based on the fact that they have DLSS. Just look at 3050 prices as an example. You can buy a 6600 for that price, but they're still selling.

AMD has the price/performance advantage, but Nvidia has the technological advantage, and they can always build on that when they set the price of the next product.


----------



## HTC (Oct 16, 2022)

AusWolf said:


> AMD has the price/performance advantage, but Nvidia has the technological advantage, and they can always build on that when they set the price of the next product.



That works when there's no global economic downturn: we'll have to wait and see if it will still hold true after both manufacturers release their lineups, a few months from now.


----------



## AusWolf (Oct 16, 2022)

ARF said:


> Very low performance:
> 
> View attachment 265692


Comparison with the previous gen won't matter when the previous gen isn't available anymore - that is, when your only choice is limited to buying the newest shiny crap or not.



HTC said:


> That works when there's no global economic downturn: we'll have to wait and see if it will still hold true after both manufacturers release their lineups, a few months from now.


I can imagine two scenarios:
1. AMD slightly undercuts Nvidia with pricing, but people will still buy Nvidia due to their technological advantage and/or fanboyism.
2. AMD joins the game of big, heavy and expensive GPUs and comes out with something massive while still selling RDNA 2 for the people who care about price.

I do not expect RDNA 3 to be significantly cheaper than Ada.


----------



## R0H1T (Oct 16, 2022)

ARF said:


> *It's an illusion.* Between the higher-specced proper 24G RTX 3090 Ti and a limited 12G RTX 4080 I would choose the former any day.


"*Power resides where men believe it resides. It’s a trick. A shadow on the wall. And a very small man can cast a very large shadow.*"


The real power is in the hands of buyers! If enough of them believe these things are worthless (or not worth *$900*), then Nvidia is effed; sadly, not many do that.


----------



## ARF (Oct 16, 2022)

AusWolf said:


> Comparison with the previous gen won't matter when the previous gen isn't available anymore - that is, when your only choice is limited to buying the newest shiny crap or not.



That's true only if you ignore that the RTX 3090 Ti will exist for quite a while either in a brand new state or in the second-hand offers.



Dirt Chip said:


> Why instantly choosing a 450W GPU if you can have the same performance level with just 285W GPU (37% less)?



It's not 450 watts, and one can always undervolt.

24 GB is 24 GB. Double that of 12 GB.


----------



## Dyatlov A (Oct 16, 2022)

ARF said:


> That's true only if you ignore that the RTX 3090 Ti will exist for quite a while either in a brand new state or in the second-hand offers.
> 
> 
> 
> ...



Yeah, I also put a 24 GB monster into my PC, which only has 16 GB of system memory. But I could buy it for $730 a couple of weeks ago, while I'd still be waiting for a 4080 (4060 Ti).


----------



## ZetZet (Oct 16, 2022)

N3M3515 said:


> *The key here is: people just don't think before they buy.*


No it isn't. The key is people want those things. We can live with candles instead of lightbulbs and a cheap beat down car, or we buy luxuries, because we can afford to buy them. Nvidia is simply pricing things according to the market, market spends more, Nvidia raises prices. Back in the day 80 series cards were a lot cheaper and people still didn't buy them. When 1080Ti came out A LOT of people bought one, that was the catalyst for Nvidia to realize they underpriced it.


----------



## N3M3515 (Oct 16, 2022)

ZetZet said:


> No it isn't. The key is people want those things. We can live with candles instead of lightbulbs and a cheap beat down car, or we buy luxuries, because we can afford to buy them. Nvidia is simply pricing things according to the market, market spends more, Nvidia raises prices. Back in the day 80 series cards were a lot cheaper and people still didn't buy them. When 1080Ti came out A LOT of people bought one, that was the catalyst for Nvidia to realize they underpriced it.


Comparing candles to lightbulbs is the same as comparing GPUs, yeah, all right.
You're just saying the same thing: people buying those stupidly overpriced GPUs is what shows they *don't think before they buy*; they are compulsive buyers who happen to have money to waste.


----------



## Wasteland (Oct 16, 2022)

gffermari said:


> It's not low performance.
> It's ok for a 70 series card to match the TI/90 of the previous one.
> The 3070 did the same to the 2080Ti.
> The 4070 has the advantage to actually beat the 3090Ti in newer games while the 3070 does not acordingly.



For $900?  Absolutely it would be low performance.  

The problem is that NVIDIA appears to be going_ backwards_ in terms of perf/dollar. It looks more and more credible to me that they frontloaded the 4090 because the 4090 is where they put the bulk of their generational performance increase. So sure, the 4090 looks good if you compare it to previous halo cards that were an absolutely terrible value--and no one ever pretended otherwise, by the way. Everyone acknowledged that 80 and below were where the value lies, yet all of a sudden the 4090 releases and we're supposed to fall all over ourselves congratulating NVIDIA for releasing a $1600 Ada card that handily beats a ludicrously overpriced Ampere card whose MSRP was based on a concurrent and unprecedented GPU shortage.

But the 3090 and its Ti used the same die as the 3080.  Any projections we discuss must take that into account; in other words, we can't depend on the lower tier cards to perform at traditional levels, relative to the halo product.  NVIDIA's own charts bear me out on that, at least so far.  

I'm glad they relented on the goofy naming scheme for the $900 "4080," but it isn't obvious that this card won't come back with a different name and at an equally ridiculous price--or, as some in this thread have suggested, perhaps this card will just disappear entirely for a long time, while NVIDIA burns off its remaining stock of Ampere.  Either way, the $1200 version of the 4080 doesn't exactly look like a value king, either, ATM.  It looks like it will give you fewer frames per dollar than its predecessor, once you strip away the hocus pocus about "DLSS 3.0"--which is nothing of the kind, incidentally; 3.0 has nothing to do with the extant versions of DLSS, and it's far less useful.  The name implies that 3.0 (interpolation, "fake" or visual-only FPS) is _better than_ 2.0 (upscaling, real FPS), which is nonsense.

NVIDIA isn't just comparing apples to oranges in this case; they're selling you a single grape and telling you that it's the hottest new thing in orange technology.

That could be another reason that NVIDIA led with the 90 card, this time--DLSS 3.0 looks better visually at higher frame rates.  Essentially they're selling you a feature that purports to raise frame rates dramatically, and thus a feature that would logically appeal most to lower-end consumers, but the feature really only works well when you already have high frame rates (and when you have an extremely high refresh rate monitor to take advantage of the extra AI-generated frames; since these frames don't reduce input latency, any of them above your monitor's max refresh are pointless).  

And I'm not fanboying for AMD here, either.  You'd think this situation presents a great opportunity for AMD to seize the day with aggressively priced products, but I'll believe it when I see it.  NVIDIA sure doesn't seem to be worried.  For the moment, I suppose at least Intel's getting a breather to refine Arc's drivers.  Remember when we all lamented the timing of Intel's launch?  Well, Big Blue may not have to worry about next-gen competition at their current price bracket any time soon.  A third credible player in this market can't come soon enough.


----------



## Dirt Chip (Oct 16, 2022)

ARF said:


> It's not 450 watts and one can always undervolt.
> 
> 24G is 24G. Double that of 12G..


The 3090 Ti is a 450 W TDP at stock and goes as high as 550 W when OC'd.
You can also undervolt the 285 W card, so there's no efficiency gain there.
And I agree on the 24 GB: it's way more memory than you need.
In any other respect, the 4080-12GB will do just fine, with far less power and heat, and will be the better choice for most if both are priced equivalently.


----------



## Vya Domus (Oct 16, 2022)

It just boggles my mind how they keep getting away with naming products in ways that are clearly misleading on purpose. They've been doing this for so long, especially on mobile parts: if you're looking at a laptop with an Nvidia GPU, you'll have pretty much no idea what you're buying going by the name alone unless you do extensive research, and even then you won't know for sure, because manufacturers do a very good job of hiding or obfuscating the TDPs.

Someone has to sue their asses and put an end to this once and for all.


----------



## Zareek (Oct 16, 2022)

Nater said:


> I literally had a GTX 1070 + A2000 in my rig (with gaming and workstation drivers) and then swapped the 1070 for an RX580 + A2000 in my system.  So I had Adrenaline and RTX Studio drivers installed.  Never did an uninstall/reinstall between cards on either swap.  They all worked flawlessly together, although I only ran it that way for a few hours to make sure the cards were working.


Yeah, like I said, I know that works in Windows 10 and newer. I was just trying to imagine a scenario that made sense. I've been building PCs since Windows 95 and there have been plenty of bad drivers and bad OSes in that time. In the past 7 to 10 years, I haven't seen many BSODs that could be attributed to a GPU driver, maybe I've been lucky.



Vya Domus said:


> It just boggles my mind how they keep getting away with naming products in ways that are clearly misleading on purpose. They've been doing this for so long, especially on mobile parts: if you're looking at a laptop with an Nvidia GPU, you'll have pretty much no idea what you're buying going by the name alone unless you do extensive research, and even then you won't know for sure, because manufacturers do a very good job of hiding or obfuscating the TDPs.
> 
> Someone has to sue their asses and put an end to this once and for all.


Laptops in general are bad in every way when it comes to that. All these CPUs with configurable TDPs on top of the GPU shenanigans. You really have to find a review of a specific laptop model and then only buy that particular model and make sure it is configured exactly as in the review to know what you are getting. It's a real pain!


----------



## medi01 (Oct 16, 2022)

AusWolf said:


> Comparison with the previous gen won't matter when the previous gen isn't available anymore - that is, when your only choice is limited to buying the newest shiny crap or not.


This assumes AMD would roll out RDNA3 with 7800 perf on par with 6800, but PRICIER.

I wonder why they would do that, though.


----------



## R0H1T (Oct 16, 2022)

Zareek said:


> All these CPUs with configurable TDPs on top of the GPU shenanigans.


Well, at least with CPUs you can underclock/fine-tune them, or say plug them in (laptops) while running, & more or less get what you're paying for. With GPUs there's very little way around the "TGP" limit BS, in large part because the cooling wouldn't be sufficient on such laptops! GPUs are definitely much worse in that regard.


----------



## ARF (Oct 16, 2022)

Dirt Chip said:


> And I agree on the 24gb, if you must know you have way more memory than you need.
> On any other occasion, the 4080-12gb will do just fine, with way less power and heat, and will be the better choice for most if both are price equivalent.



If 24 GB is too much for you, you can always buy a 16 GB RX 6900 XT or 6950 XT, still more than the pathetic 12 GB.
No in-game VRAM warnings, no lowered texture resolution (cheating by the nvidia driver), no questionable future-proofing.


----------



## sepheronx (Oct 16, 2022)

Yeah, I decided to look into the benchmarks of the 4080 that nvidia was showcasing, and it's rather pathetic for something costing $1,200.

So the only GPU worth it in the 4xxx series is the 4090, if you've got a 3070 or equivalent


----------



## N3M3515 (Oct 16, 2022)

Wasteland said:


> the $1200 version of the 4080 doesn't exactly look like a value king


The card priced 71% higher than the previous 3080 "doesn't exactly look like a value king"? Really? I must be dreaming; how are people not sure this GPU is a complete rip-off of gargantuan proportions? Dude, they added $500 to a card that previously launched at $700. Add inflation and whatever else you want, and it still doesn't come to even $900.
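The 71% figure follows directly from the launch MSRPs quoted in this thread ($700 and $1,200):

```python
# Launch MSRPs as quoted in this thread (USD)
msrp_3080 = 700       # RTX 3080 launch price
msrp_4080_16gb = 1200 # RTX 4080 16 GB launch price

# Generation-over-generation price increase at the "80" tier
increase = (msrp_4080_16gb - msrp_3080) / msrp_3080
print(f"+{increase:.0%}")  # → +71%
```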


----------



## sepheronx (Oct 16, 2022)

Hmm

Nvidia DLSS 3.0 may not require an RTX 4000 GPU after all
Someone claims to be using Nvidia DLSS 3.0 without an RTX 4000 GPU, running Cyberpunk 2077 with higher frame rates on an older GeForce graphics card
www.pcgamesn.com




Wonder how true?

Edit: in either case, DLSS 3.0 with frame injection isn't exactly what I care about or want. Raw performance with raw visuals instead.


----------



## ZetZet (Oct 16, 2022)

N3M3515 said:


> Comparing a candle to a lightbulb is the same as to gpus, yeah allright.
> You are just saying the same thing, people buying those stupidly overpriced gpus is what shows they *don't think before they buy*, they are compulsive buyers that happen to have money to waste.


People think before they buy iPhone Ultra Max for sending text messages and they still do it. Graphics cards are luxury purchases anyway.


----------



## R0H1T (Oct 16, 2022)

I don't think too many people think before buying that god awful ugly piece of brick, no not the 2 kilo one


----------



## ARF (Oct 16, 2022)

ZetZet said:


> People think before they buy iPhone Ultra Max for sending text messages and they still do it. Graphics cards are luxury purchases anyway.



Some think, others don't. For example, I always research what I need - water resistance, a normal price tag - all the things that an Apple iPhone canNOT deliver.


----------



## N/A (Oct 16, 2022)

sepheronx said:


> Edit: in either case, DLSS 3.0 with frame injections isn't exactly what I care or want. Raw performance with raw visuals instead.



According to HWB, you don't want DLSS 3 at anything less than 120 FPS, since it adds what can be perceived as mouse input latency. The 4070 only gets 40 FPS at 4K, and at that point it doesn't make sense to have it at all; on the 20/30 series, which lack the optical flow accelerator, it's even worse. And there go 20 billion transistors on the 4080 12 GB, wasted on tensor cores and ray tracing, and an L2 cache that seemingly does nothing, or compensates poorly for the 64 bits of bus that the similarly sized 1070/1080 had and this card lacks.


----------



## DemonicRyzen666 (Oct 16, 2022)

medi01 said:


> Share other graphics benchmarks of 4080, be so kind.
> 
> 
> That's ok and rather obvious.
> ...



No one has done the simple math on that DLSS 3 graph...

If you have an RTX 4090 24 GB, you get a 65% increase in frame rate from DLSS 3.
If you have an RTX 4080 16 GB, you get a 132% increase in frame rate from DLSS 3.
If you have an RTX 4080 12 GB, you get a 150% increase in frame rate from DLSS 3.

There's a drop-off as you get to the bigger & more powerful cards...
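Those percentages are just the standard uplift formula applied to the chart's bars; here's the formula with hypothetical native frame rates normalized to 100 (the boosted values are chosen to reproduce the stated percentages, not taken from the chart itself):

```python
def uplift_pct(fps_native: float, fps_dlss3: float) -> float:
    """Percent frame-rate increase from enabling DLSS 3 frame generation."""
    return (fps_dlss3 / fps_native - 1) * 100

# Hypothetical native FPS normalized to 100; boosted values chosen to
# reproduce the percentages quoted in the post above.
print(round(uplift_pct(100, 165)))  # 65   (4090-like)
print(round(uplift_pct(100, 232)))  # 132  (4080 16 GB-like)
print(round(uplift_pct(100, 250)))  # 150  (4080 12 GB-like)
```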


----------



## sepheronx (Oct 16, 2022)

N/A said:


> According to HWB you don't want DLSS3 on anything less than 120 FPS, since it adds what can be perceived as a mouse input latency. 4070 only has 40 FPS in 4K. and at this point doesn't make sense having it at all, on 20 - 30 series that don't have the optical tensor flow is even worse. And there go 20 billion transistors on the 4080 12  just wasted on tensor cores and raytracing, and l2 cache that seemingly does nothing or compensates poorly for the lack of 64 bit bus that was there on the similarly sized 1070/1080.


Which means it makes no sense to get anything less than a 4090.

I wonder if they did all this on purpose: ignore anything less than a 4090, or just grab a 30-series card.


----------



## N/A (Oct 16, 2022)

DemonicRyzen666 said:


> There's a drop-off as you get to bigger & more powerful cards....



And you know why that is. The GPU load is 60% because of a CPU bottleneck. Hilarious; on a 12900 it's less, but it's still there.


----------



## DemonicRyzen666 (Oct 16, 2022)

N/A said:


> And you know why that is. The GPU load is 60% because of a CPU bottleneck. Hilarious; on a 12900 it's less, but it's still there.
> 
> View attachment 265720


Now see, that's totally the opposite of what Jensen said DLSS 3 did. Jensen claimed that it removed the CPU bottleneck.


----------



## wheresmycar (Oct 16, 2022)

ZetZet said:


> No it isn't. The key is people want those things. We can live with candles instead of lightbulbs and a cheap beat down car, or we buy luxuries, because we can afford to buy them. Nvidia is simply pricing things according to the market, market spends more, Nvidia raises prices. Back in the day 80 series cards were a lot cheaper and people still didn't buy them. When 1080Ti came out A LOT of people bought one, that was the catalyst for Nvidia to realize they underpriced it.



Candles and cheap beat-down cars are lost at sea! ...that's where the problem is. The higher premiums at the top end are deliberately dragging up bottom/mid-segment prices. Looking at the 40-series, a $1,200 4080 sets an extremely high reference point for the mid-segment (and below) to be priced against. NVIDIA tried masking this massive price gap by throwing in a 4080 12 GB, which failed. We all get it, profits come first and we're more than happy to offer charitable donations in pursuit of our performance targets, but come on - don't charge luxury-car prices for standard premium models, otherwise you end up with cheap beat-down cars going for the premium-model rate.

The way I see it, eventually we'll end up with lacklustre, feature/performance-trimmed $350-$500 SKUs, which is hardly something to be excited about. For me, the 40-series is just tiresome, over-hyped, and in its current SKU formation financially inaccessible to 99.9% of game-dedicated consumers. Obviously it's early days... but it's not looking pretty!


----------



## medi01 (Oct 16, 2022)

DemonicRyzen666 said:


> increase in Frame rates from DLSS


Lol.


----------



## AusWolf (Oct 16, 2022)

medi01 said:


> This assumes AMD would roll out RDNA3 with 7800 perf on par with 6800, but PRICIER.
> 
> I wonder why would they do that though.


That part of my post was mainly about Nvidia - it's known to be their game to play with availability instead of price.


----------



## Totally (Oct 16, 2022)

Chrispy_ said:


> The reason they didn't want to call it a 4070 is because the price they want for an xx70 card is obscene.
> 
> Perhaps now it will be a 4070 after all, but they're going to have to explain why it's $900 instead of $500. If I had to guess, the explanation from Nvidia will be "F*CK YOU ALL, GIVE US YOUR MONEY"


Why’s it an issue now? In the not too distant past xx80 cards were $500.


----------



## awesomesauce (Oct 16, 2022)

Why over 300 comments?

Can someone do a TL;DR for me?

Why are people so mad about this?


----------



## Aquinus (Oct 16, 2022)

gffermari said:


> Not exactly.
> It's actually the only company that innovates and delivers and since they've been miles ahead of the competition, they give the pace. Like Intel did for a decade. I don't like their pace but anyone would do the same in their position.
> If you don't like it, though, there are alternatives. There have always been.


If you stick around TPU enough, you'll see that I vote with my wallet, and I'll take every opportunity to call out nVidia on their braindead business practices. AMD and Intel are far less hostile companies than nVidia is. You don't have to look any further than the Linux kernel and GPU drivers to recognize that. If you think Apple is a walled garden, then nVidia is Guantanamo; it serves the purpose but is totally distasteful.


----------



## InVasMani (Oct 16, 2022)

NVIDIA: RTX 4090

Also NVIDIA: RTX 4080 12GB


----------



## Why_Me (Oct 17, 2022)

InVasMani said:


> NVIDIA: RTX 4090
> View attachment 265738











NVIDIA GeForce RTX 4090 Founders Edition Review - Impressive Performance

The NVIDIA GeForce RTX 4090 Founders Edition offers huge gains over its predecessors. It's the first graphics card to get you 4K 60 FPS with ray tracing enabled, and upscaling disabled. Do you prefer 120 FPS instead of 60? Just turn on DLSS 3.

www.techpowerup.com


----------



## N3M3515 (Oct 17, 2022)

sepheronx said:


> Which means it makes no sense to get anything less than a 4090.


That's by design my dude.


----------



## sepheronx (Oct 17, 2022)

N3M3515 said:


> That's by design my dude.


I figured that would be the case. I mean, anything below the 4090 is a total joke; it's either a 4090 or an RTX 3000-series card.

Oh well.


----------



## ZoneDymo (Oct 17, 2022)

awesomesauce said:


> Why over 300 comments?
> 
> Can someone do a TL;DR for me?
> 
> Why are people so mad about this?



There's no TL;DR; this has just completely spiraled out of control. If I were TPU I would just lock it now, but whatever.
The TL;DR would be: ignore and move on, very little of value here; we'll pick up when there's more news on the new cards... or when they're actually out.


----------



## N/A (Oct 17, 2022)

The only way I can explain why the 4090 is so much faster than the 4080 in 4K is the L2. Somehow having 72 MB instead of 64 MB makes a difference. What a fail; the 4080 is not the perfectly designed 4K card it could have been.


----------



## Sisyphus (Oct 17, 2022)

cvaldes said:


> There is no blanket threshold amount above which scalping happens and below which scalping stops.[...]


There will always be scalpers, even if they lose money...
Back to reality: the amount of scalping depends on the difference between MSRP and the demand-driven, higher street price.
The two main counter-strategies are:
1) Auction. AIBs or nVidia can auction the first batches until broad availability is reached. Everyone interested can place a bid, and the highest bids win. Consequence: scalpers and consumers pay the same price, so there is nothing for scalpers to gain.
2) Fill the warehouses with high stock and start selling once there is enough to supply each buyer with a product. Consequence: scalpers won't find consumers willing to pay more.

In the past, Titan was significantly more expensive. These days the high-end models with 24 GB VRAM are only a bit more expensive, and the 4090 is disproportionately more performant than the x080 series. Good prices for fans of high-end GPUs; the 3090 and 4090 are fairly priced. Within the next 4-6 weeks I will decide which one to buy: a 3090 around $1,000, or later a 4090 around $1,600-1,700, once the early-adopter problems are solved, the market is saturated, and AMD's 7000 series is on the shelves.
The 40x0 series continues the problem of the 30x0 series: overpriced mid-tier cards.


----------



## Dirt Chip (Oct 17, 2022)

awesomesauce said:


> Why over 300 comments?
> 
> Can someone do a TL;DR for me?
> 
> Why are people so mad about this?


Some people feel they defeated NV in some way and cheer about that (many flavors to that ice cream), while others say nothing will change as long as AMD isn't up to the game with real competition.
Also a lot of new-name guessing, assuming (wrongly, I might add) that it will affect price (or performance..) in any meaningful way.

All in all, we're all doing NV a big service by keeping this PR stunt around.


----------



## Mussels (Oct 17, 2022)

I love that this has 15 pages of comments, caused by the fact nvidia tried to do something really shitty.


At least previous variants had the same GPU, and any memory bandwidth lost was technically required - removing one of six memory chips cuts the bus width by 1/6th, simple.

Not, uh... claiming it's the same product when it's entirely different.
(See: Apple with the US getting different hardware to the rest of the world, and Samsung with Exynos vs Snapdragon on the S22 series.)


Maybe we need the EU to turn around and say "you can't f*cking sell two different products under the same name" and get companies to stop this crap.
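The chip-count arithmetic works because each GDDR6/GDDR6X chip contributes a 32-bit slice to the memory bus - a quick sketch (illustrative, not from any spec sheet in this thread):

```python
# Each GDDR6/GDDR6X chip exposes a 32-bit interface, so the total bus
# width scales directly with the number of memory chips fitted.
BITS_PER_CHIP = 32

def bus_width_bits(num_chips: int) -> int:
    return num_chips * BITS_PER_CHIP

print(bus_width_bits(6))  # 192 - six chips, a fully populated 192-bit card
print(bus_width_bits(5))  # 160 - drop one chip, lose 1/6th of the bus
```

Bandwidth at a given memory clock scales with bus width, hence the "technically required" loss when a chip is removed.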


----------



## Jimmy_ (Oct 17, 2022)

4070 or 4070ti incoming................


----------



## ARF (Oct 17, 2022)

awesomesauce said:


> Why over 300 comments?
> 
> Can someone do a TL;DR for me?
> 
> Why are people so mad about this?



Because AMD hasn't presented anything new and interesting, so people's energy is directed here, at nvidia's shenanigans.

New Radeon release can't come soon enough.


----------



## DuxCro (Oct 17, 2022)

Unfortunately, Nvidia started with insane pricing that has been just getting worse since RTX cards came out. And AMD is just following this "new normal." Even if AMD manages to match RTX 4090 performance with its RX 7950 XT, it will probably be priced at $1500. $1400 best case scenario.


----------



## ARF (Oct 17, 2022)

DuxCro said:


> Unfortunately, Nvidia started with insane pricing that has been just getting worse since RTX cards came out. And AMD is just following this "new normal." Even if AMD manages to match RTX 4090 performance with its RX 7950 XT, it will probably be priced at $1500. $1400 best case scenario.



AMD will not release an RX 7950 XT card.

People were saying the same thing about Navi 21 - that it would only match the RTX 2080 Ti. What happened in reality?

No Nvidia Killer? Big Navi's best could match the RTX 2080 Ti, fall short of the RTX 3080 Ti | TechSpot


----------



## MrMeth (Oct 17, 2022)

Nanochip said:


> Good. There’s no reason to have an unforced error and deliberately deplete good will amongst gamers. The 4090 is a beast. Rdna3, if as powerful as rumors claim, would have eaten the 4080 12 GB, I mean the 4070, for lunch. Hopefully nvidia drops the price of the 4070 too.
> 
> What they should do is call the 4080 12 GB the 4060, the 16 GB the 4070, release the real 4080 with 16 or 20 GB in between the current 4080 16 GB and the 4090. Price the new 4080 at $900 max, the 4070 at 700 and the 4060 at 500. That would be a strong offense against the Radeon onslaught that’s coming.


I Have been against nvidia's pricing since the original RTX 2000 launch with there steady price increase. I dont find it healthy or consumer friendly that the **60 tier of card is $500.00 Think about this for a sec **50 cards or the **50 ti cards will then be $350-$400 ? that makes no sense ? We need to vote with our wallets or next time Nvidia will do just this and release a **60 card at $900.00!!! not saying AMD/Intel are better but we need them to succed this generation so we can have some competition and nvidia can get the wind knocked out there sales. And this is not fanboy talk , I currently have EVGA 3080 FTW3 ultra RIP :{ , & 3 other nvidia cards in my render farm.


----------



## ARF (Oct 17, 2022)

We need updates at the low end, because today nvidia still sells cards like these:
GF 210, GT 610, GT 710, GT 730 and *GT 1030* - a 2017 card, at the extreme low end starting at 100 euro.


----------



## fevgatos (Oct 17, 2022)

Daven said:


> Most PC enthusiasts just reflexively buy Nvidia and cannot bring themslves to buy anything else. It’s a condition known as gullible.


And there is a tiny minority like you that thinks that AMD GPUs in the high end are comparable to nvidia's. They are not. The 6900 XT at 4K max settings (RT) is comparable to a 3070, not a 3090 or a 3080 Ti. That's just the sad reality.


----------



## AusWolf (Oct 17, 2022)

fevgatos said:


> And there is a tiny minority like you that thinks that AMD GPUs in the high end are comparable to nvidia's. They are not. The 6900 XT at 4K max settings (RT) is comparable to a 3070, not a 3090 or a 3080 Ti. That's just the sad reality.


That doesn't disprove the original point - there are people who would never buy an AMD card even at a comparable price range and use case.

Straw man argument - you're just looking for proof of your own point...


----------



## Aquinus (Oct 17, 2022)

fevgatos said:


> And there is a tiny minority like you that thinks that AMD GPUs in the high end are comparable to nvidia's. They are not. The 6900 XT at 4K max settings (RT) is comparable to a 3070, not a 3090 or a 3080 Ti. That's just the sad reality.


...or some of us don't care that nVidia has the best hardware because of their scummy business practices and will buy something else just because of that. Your statement makes sense if you're just as morally bankrupt as nVidia and insist on having the fastest despite their actions being detestable.


----------



## medi01 (Oct 17, 2022)

ARF said:


> People were saying the same thing about Navi 21 - that it would only match the RTX 2080 Ti. What happened in reality?



Yeah, tell us "what happened in reality". In your reality, which seems quite different from ours, it would "only match the 2080 Ti". What the heck are you smoking???
















AMD Radeon RX 6900 XT Review - The Biggest Big Navi

AMD's Radeon RX 6900 XT offers convincing 4K gaming performance, yet stays below 300 W. Thanks to this impressive efficiency, the card is almost whisper-quiet, quieter than any RTX 3090 we've ever tested. In our review, we not only benchmark the RX 6900 XT on Intel, but also on Zen 3, with fast...

www.techpowerup.com
				






DuxCro said:


> Even if AMD manages to match RTX 4090 performance with its RX 7950 XT, it will probably be priced at $1500. $1400 best case scenario.


There are good reasons why AMD quit the "undercut NV so that its customers can buy overpriced NV cards for a bit less" game, I think.

Curiously though, last quarter AMD's margins (across all products) were about 10% higher than NV's: roughly 50% (down from 56%) vs roughly 40% (down from 66%).



AusWolf said:


> there are people who would never buy an AMD card even at a comparable range and use case.


3050 vs 6600 is the greatest of examples.
The former is significantly (a good tier) slower AND more expensive AND sells more.


----------



## Chrispy_ (Oct 17, 2022)

Totally said:


> Why’s it an issue now? In the not too distant past xx80 cards were $500.


It's an issue if it's a card that has the die size, bus width, VRM layout, and PCB complexity of a 4060 Ti.

Nvidia were trying to charge $900 for an xx60 Ti card. That's a disgusting precedent to set, and because they got away with such a crazy increase during the pandemic/ETH-mining boom, they thought they could repeat the same insane price hike.


----------



## fevgatos (Oct 17, 2022)

Aquinus said:


> ...or some of us don't care that nVidia has the best hardware because of their scummy business practices and will buy something else just because of that. Your statement makes sense if you're just as morally bankrupt as nVidia and insist on having the fastest despite their actions being detestable.


What exactly are those scummy practices? What morally bankrupt actions are you talking about? I'd like to know; if that's the case, I won't buy nvidia either.


----------



## AusWolf (Oct 17, 2022)

fevgatos said:


> What exactly are those scummy practices? What morally bankrupt actions are you talking about? I'd like to know; if that's the case, I won't buy nvidia either.


Did you read the article you're commenting under?

A few more from the past:


----------



## fevgatos (Oct 17, 2022)

AusWolf said:


> Did you read the article you're commenting under?
> 
> A few more from the past:


The only questionable thing of those three is the 1030. The rest are fine; I don't see any problem.

But then again, AMD does similar things, like cutting lanes on their cheaper cards, requiring PCIe 4.0 platforms to work properly.


----------



## AusWolf (Oct 17, 2022)

fevgatos said:


> The only questionable thing of those three is the 1030. The rest are fine; I don't see any problem.


So cutting components off of similarly named products and lying about actual useful VRAM capacity is fine. Well... suit yourself, I guess.



fevgatos said:


> But then again, AMD does similar things, like cutting lanes on their cheaper cards, requiring PCIe 4.0 platforms to work properly.


No company is perfect. But at least AMD's x4 lane PCI-e 4.0 connection isn't a lie.

Another one (if the above didn't move you, then this one won't either, but anyway):


----------



## fevgatos (Oct 17, 2022)

AusWolf said:


> So cutting components off of similarly named products and lying about actual useful VRAM capacity is fine. Well... suit yourself, I guess.
> 
> 
> No company is perfect. But at least AMD's x4 lane PCI-e 4.0 connection isn't a lie.
> ...


They never lied about VRAM capacity; the 970 does indeed have 4 GB of RAM. Regarding the 1060, nobody cares. You should watch reviews before buying a product, and buy the product that performs as well as you want it to perform. I bought a 1060 3 GB: I watched a review and liked its performance for the price, and found it better value than the 1060 6 GB. So where is the issue exactly? I really don't get it.


----------



## AusWolf (Oct 17, 2022)

fevgatos said:


> They never lied about VRAM capacity; the 970 does indeed have 4 GB of RAM. Regarding the 1060, nobody cares. You should watch reviews before buying a product, and buy the product that performs as well as you want it to perform. I bought a 1060 3 GB: I watched a review and liked its performance for the price, and found it better value than the 1060 6 GB. So where is the issue exactly? I really don't get it.


It's not the same product as the 1060 6 GB, but it bears the same name. Whether you personally found its performance acceptable or not is irrelevant. To stay on topic: it's the same thing with the 4080 12 GB in question. If this is all OK, then maybe Ford should produce a Mustang that is electric and actually a crossover. Oh wait, they did! And guess what, everybody hates them for it.

"Read reviews" is easy to say coming from someone who is generally interested in tech. Most people aren't, and they don't even know where reviews are or how to interpret them.


----------



## fevgatos (Oct 17, 2022)

AusWolf said:


> It's not the same product as the 1060 6 GB, but it bears the same name. Whether you personally found it acceptable or not is irrelevant. To stay on topic: it's the same thing with the 4080 12 GB in question.
> 
> "Read reviews" is easy to say from someone who is generally interested in tech. Most people aren't and they don't even know where reviews are or how to interpret them.
> 
> If this is all OK, then maybe Ford should produce a Mustang that is electric and actually a crossover. Oh wait, they did! And guess what, everybody hates them for it.


But if you don't watch any reviews, why would you buy a 4080? Even if there were only the 16 GB 4080 model, why the heck would you buy it without seeing reviews??? It makes absolutely no sense to me. I assume people are paying for performance, not a model number. By not watching a review, you are not buying performance - you are buying a model number. That's totally on you, not on nvidia.

Let's say next month the 4080 16 GB hits the shelves for the same price as the 3090 Ti. If you don't watch a review, which one would you buy, and why? I honestly don't get your point.


----------



## AusWolf (Oct 17, 2022)

fevgatos said:


> But if you don't watch any reviews, why would you buy a 4080? Even if there were only the 16 GB 4080 model, why the heck would you buy it without seeing reviews??? It makes absolutely no sense to me. I assume people are paying for performance, not a model number. By not watching a review, you are not buying performance - you are buying a model number. That's totally on you, not on nvidia.
> 
> Let's say next month the 4080 16 GB hits the shelves for the same price as the 3090 Ti. If you don't watch a review, which one would you buy, and why? I honestly don't get your point.


You say that because you're interested in tech - like I said, most people aren't. They just want to play games. Do you think everybody who buys a pre-built knows what's in it? And even if they do, do they know what the model numbers mean? What you're saying is basically the same as expecting every single person in the world to service their own cars.

Let me demonstrate with an actual conversation that I had with a colleague of mine not long ago:
Him: "My son plays games."
Me: "I play games too."
Him: "Ah, so you have one of those fancy computers too?"
Me: "Yep."
Him: "What Core is it? i5? i7?"
Me: "i7."
Him: "Wow, that's awesome!"

Or another one:
Him: "Would X game run on my daughter's laptop?"
Me: "I don't know. What kind of laptop is it?"
Him: "Lenovo. About 5 years old."


----------



## InVasMani (Oct 17, 2022)

fevgatos said:


> The only questionable thing of those 3 is the 1030. The rest are fine, I don't see any problem.
> 
> But then again AMD does similar practices, like with the cut lanes on their cheaper cards, requireing PCIE 4 platforms to work properly.


 
The GTX 970 was "fine"? Or did you mean "fined"? Because they absolutely were sued over that 3.5 GB VRAM fiasco.


----------



## fevgatos (Oct 17, 2022)

InVasMani said:


> The GTX 970 was "fine"? Or did you mean "fined"? Because they absolutely were sued over that 3.5 GB VRAM fiasco.


I don't care if it was fined or not. It was by far the best value card of its generation; whether it had 4 GB or 4 KB shouldn't matter. For example, the review from this very site gave it a recommendation and called it EXCELLENT for its price. So if the performance is excellent for the price, what difference does it make if it has 4 GB, 1 GB, or no VRAM at all?

The exact same thing happened to AMD with their Bulldozer CPUs. They got sued because it wasn't a real 8-core or whatever. Who freaking cares; you buy it for the performance, and if the performance is there, then whether it has 1 core or 5,000 of them is irrelevant.



AusWolf said:


> You say that because you're interested in tech - like I said, most people aren't. They just want to play games. Do you think everybody who buys a pre-built knows what's in it? And even if they do, do they know what the model numbers mean? What you're saying is basically the same as expecting every single person in the world to service their own cars.
> 
> Let me demonstrate with an actual conversation that I had with a colleague of mine not long ago:
> Him: "My son plays games."
> ...


What difference would it make for these people, though? A prebuilt with an xx80 and an i7 or i9 would cost what, over $2-2.5k? If someone spends that amount of money on a computer and has absolutely no idea what he is buying, it's no one's fault but his.

I would have an issue if both cards were just called 4080 (like what happened with the 1030 you mentioned before), but since there is an extra moniker, I'm personally fine with it.


----------



## Vya Domus (Oct 17, 2022)

fevgatos said:


> But if you don't watch any reviews, why would you buy a 4080? Even if there were only the 16 GB 4080 model, why the heck would you buy it without seeing reviews??? It makes absolutely no sense to me. I assume people are paying for performance, not a model number. By not watching a review, you are not buying performance - you are buying a model number. That's totally on you, not on nvidia.
> 
> Let's say next month the 4080 16 GB hits the shelves for the same price as the 3090 Ti. If you don't watch a review, which one would you buy, and why? I honestly don't get your point.



By having two different products bear the same name, you're implying the performance is the same and the only difference is memory size. The fact that a potential buyer with no technical knowledge cannot tell there is in fact a performance difference between the two makes it misleading; third-party reviews are irrelevant. False advertising is 100% on the manufacturer: they're compelled by law to provide accurate information about their products that won't mislead the consumer. This isn't subjective or a matter of opinion. You think Nvidia backed down on this naming scheme for no reason? Of course not; they knew very well what they were doing was misleading, they just got too much shit for it and decided it was better to ditch the names before they found themselves in legal trouble.

Just as a real-life example, I actually have a friend who didn't know there was a performance difference between the 12 GB and 16 GB until I explained it to him, and he's not a moron. He wasn't going to buy one anyway, and if he had looked into it more carefully I'm sure he would have realized they're not the same, but it goes to show how easy it is to mislead people.

I guess you just can't understand the concept of false advertising for some reason. That's fine; you should apply for a marketing position at Nvidia, they'd love you.


----------



## fevgatos (Oct 17, 2022)

Vya Domus said:


> By having two different products bear the same name, you're implying the performance is the same and the only difference is memory size. The fact that a potential buyer with no technical knowledge cannot tell there is in fact a performance difference between the two makes it misleading; third-party reviews are irrelevant. False advertising is 100% on the manufacturer: they're compelled by law to provide accurate information about their products that won't mislead the consumer. This isn't subjective or a matter of opinion. You think Nvidia backed down on this naming scheme for no reason? Of course not; they knew very well what they were doing was misleading, they just got too much shit for it and decided it was better to ditch the names before they found themselves in legal trouble.
> 
> Just as a real-life example, I actually have a friend who didn't know there was a performance difference between the 12 GB and 16 GB until I explained it to him, and he's not a moron. He wasn't going to buy one anyway, and if he had looked into it more carefully I'm sure he would have realized they're not the same, but it goes to show how easy it is to mislead people.
> 
> I guess you just can't understand the concept of false advertising for some reason. That's fine; you should apply for a marketing position at Nvidia, they'd love you.


So you are saying nvidia, a multibillion-dollar company, didn't know they would have legal trouble until just today? Lol, ok buddy; maybe you should apply for a marketing position.

What does it even mean to be misled? If you don't watch any reviews, how would you freaking know how the 16 GB version performs? If you don't know how the 16 GB version performs, how and why would you assume it performs similarly to the 12 GB? And what does "similar" even mean if you have no idea how either of them performs?

I'm sorry, but if you walk into a shop and randomly buy a GPU based on the name, nvidia misleading you should be the least of your concerns.


----------



## Vya Domus (Oct 17, 2022)

fevgatos said:


> So you are saying nvidia, a multibillion-dollar company, didn't know they would have legal trouble until just today?



What I am saying is that Nvidia, a multibillion-dollar company, wouldn't have thrown down the drain god knows how much marketing, time, and effort spent working with AIBs on a product they now won't launch, for no good reason.

And this is not even me saying that it's misleading, by the way; Nvidia said so, you know, the multibillion-dollar company:









Unlaunching The 12GB 4080

16GB GeForce RTX 4080 on track to delight gamers everywhere November 16th.

www.nvidia.com


----------



## wheresmycar (Oct 17, 2022)

fevgatos said:


> So you are saying nvidia, a multibillion-dollar company, didn't know they would have legal trouble until just today? Lol, ok buddy; maybe you should apply for a marketing position.
> 
> What does it even mean to be misled? If you don't watch any reviews, how would you freaking know how the 16 GB version performs? If you don't know how the 16 GB version performs, how and why would you assume it performs similarly to the 12 GB? And what does "similar" even mean if you have no idea how either of them performs?
> 
> I'm sorry, but if you walk into a shop and randomly buy a GPU based on the name, nvidia misleading you should be the least of your concerns.



What happens when some less-informed buyers watch 4080 (16 GB) reviews and then run off to their local stores to pick up a 4080 (12 GB) prebuilt? Unfortunately not everyone is equipped to untangle convoluted SKU disparities or has the technical know-how to fully recognise what they're buying into. Hence there are consumer regulations in place to keep buyers protected, to some degree. Likewise, if it didn't take extensive reading (reviews) and some familiarity with the tech, even I would have been misled by the DLSS 3.0 performance charts. Thanks to the TPU fam (and others), I can now make a more informed decision by giving DLSS 3 a boot in the teeth and focusing on areas where raw performance and relative latencies matter. Again, unfortunately for the less informed, those very same charts will be a hoodwinking deciding factor in determining where their hard-earned cash is spent.

And this "multibillion dollar company" is no stranger to "class action lawsuits". Big-daddy global brutes naturally serve their business interests - it happens, nothing surprising... only it should be met with consumer/reviewer/newsflash criticism to keep damage/consumer manipulation/market turbulence at a minimum.

Also, let's not forget the BIG elephant in the room... the 4080 16 GB with an exaggeratedly embellished asking price of *"$1200"*. That's right, the xx80 horizon is now set at *twelve hundred dollars.* No doubt they had to justify the xx80 wedge with a massive leap in MSRP in some way, shape or form... _"ah ha! throw the fools a two-piece 4080 designation with the lesser part for $300 less and we'll be smooth sailing."_ Even $900 for a lacklustre entry-level 80-class part is absurd - no thanks!! NVIDIA will make money regardless... so business as usual for them, but broader consumer expectations and impending upgrades have been shot six feet under, so let's not be surprised at the elevated level of criticism.


----------



## AusWolf (Oct 17, 2022)

fevgatos said:


> What difference would it make for these people, though? A prebuilt with an xx80 and an i7 or i9 would cost what, over $2-2.5k? If someone spends that amount of money on a computer and has absolutely no idea what he is buying, it's no one's fault but his.
> 
> I would have an issue if both cards were just called 4080 (like what happened with the 1030 you mentioned before), but since there is an extra moniker, I'm personally fine with it.


An extra moniker that indicates nothing but a difference in VRAM capacity. It's not the same as AMD's RX 480 4 GB and 8 GB were. They're completely different cards at completely different performance levels.

So you're saying that fooling buyers who don't browse tech sites every day like you or I do is fine? Personally, I find it disgusting. I know a little bit about cars, but I won't study to become a mechanic just to buy one.


----------



## fevgatos (Oct 17, 2022)

AusWolf said:


> An extra moniker that indicates nothing but a difference in VRAM capacity. It's not the same as AMD's RX 480 4 GB and 8 GB were. They're completely different cards at completely different performance levels.
> 
> So you're saying that fooling buyers who don't browse tech sites every day like you or I do is fine? Personally, I find it disgusting. I know a little bit about cars, but I won't study to become a mechanic just to buy one.


You really, actually think that nvidia was trying to fool buyers? I'm with you on the fact that the two GPUs sharing the same name is stupid, to say the least; I just don't think nvidia did it to fool anyone. I don't see any benefit to them in doing so.



Vya Domus said:


> What I am saying is that Nvidia, a multibillion-dollar company, wouldn't have thrown down the drain god knows how much marketing, time, and effort spent working with AIBs on a product that they now won't launch, for no good reason.
> 
> And this is not even me saying that it's misleading, by the way. Nvidia said so, you know, the multibillion-dollar company:
> 
> ...


You said that they would have legal issues, meaning that they didn't know it for the last few months and only realised it yesterday... If you still can't admit that your point just wasn't making much sense, I don't see the point of arguing with you.


----------



## AusWolf (Oct 17, 2022)

fevgatos said:


> You really, actually think that Nvidia was trying to fool buyers? I'm with you on the fact that the two GPUs sharing the same name is stupid, to say the least; I just don't think Nvidia did it to fool anyone. I don't see any benefit for them in doing so.


Yes. The benefit is being able to sell it for more money than it's worth. The 2070 and 3070 both started at an MSRP of $499, so a 4070 for $900 would have looked bad. Very bad.


----------



## InVasMani (Oct 17, 2022)

So let me get this straight: it wasn't misleading, and Nvidia's just being a Karen, cancelling its own GPU product launch because it could buy an Egg McMuffin at McDonald's after 12!!?


----------



## fevgatos (Oct 17, 2022)

AusWolf said:


> Yes. The benefit is being able to sell it for more money than it's worth. The 2070 and 3070 both started at an MSRP of $499, so a 4070 for $900 would have looked bad. Very bad.


Then where is the line? Let's say there was only one model of the 4080, and it was slower than the 3080. Are you saying that's misleading? Are you saying they shouldn't be allowed to release such a product? I mean, come on, you shouldn't buy anything based on the name. You should check reviews - period. Especially now that it seems like the 4080 and the 3090/3090 Ti will be at similar prices, there is no way to choose one over the other unless you check reviews.


----------



## AusWolf (Oct 17, 2022)

fevgatos said:


> Then where is the line? Let's say there was only one model of the 4080, and it was slower than the 3080. Are you saying that's misleading? Are you saying they shouldn't be allowed to release such a product? I mean, come on, you shouldn't buy anything based on the name. You should check reviews - period. Especially now that it seems like the 4080 and the 3090/3090 Ti will be at similar prices, there is no way to choose one over the other unless you check reviews.


Who said the 4080 will be slower than the 3080? What you're saying has no connection to my point at all. You're speaking in hypotheticals, whereas I stated facts regarding actual products and pricing.


----------



## Sisyphus (Oct 17, 2022)

A discussion about the 970? A good card. I used it for offline ray tracing. NVIDIA implemented direct CUDA core support and gave iray away for creatives on a low budget who wanted to gain experience with photorealistic renders. It did its job until I upgraded to a 2070. I bought it because of its wide range of capabilities at an acceptable price, plus excellent software support. CUDA worked in many entry-level 3D apps. AMD had nothing nearly as good and is still far behind in professional software support. Of course, I bought the card after the slower connection of the 500 MB of VRAM became known. As I always say, everybody has to inform themselves, and it's always recommended not to buy products as an early adopter. It is currently my reserve card and still works perfectly.



Vya Domus said:


> And this is not even me saying that it's misleading by the way, Nvidia said so, you know, the multibillion dollar company :


They won't be able to mislead informed consumers. Kindergarten.


----------



## AusWolf (Oct 17, 2022)

Sisyphus said:


> A discussion about the 970? A good card. I used it for offline ray tracing. NVIDIA implemented direct CUDA core support and gave iray away for creatives on a low budget who wanted to gain experience with photorealistic renders. It did its job until I upgraded to a 2070. I bought it because of its wide range of capabilities at an acceptable price, plus excellent software support. CUDA worked in many entry-level 3D apps. AMD had nothing nearly as good and is still far behind in professional software support. Of course, I bought the card after the slower connection of the 500 MB of VRAM became known. As I always say, everybody has to inform themselves, and it's always recommended not to buy products as an early adopter. It is currently my reserve card and still works perfectly.


That's, again, beside the point. Similarly, I can't say that the 1030 DDR4 is a brilliant card just because it's faster than the 710 DDR3 and has better video decode support.


----------



## Sisyphus (Oct 17, 2022)

AusWolf said:


> That's, again, beside the point. Similarly, I can't say that the 1030 DDR4 is a brilliant card just because it's faster than the 710 DDR3 and has better video decode support.


You can say whatever you like. It doesn't change anything about the usability of the GTX 970. I bought it; it fitted my needs. My condolences to anyone who purchases products that don't meet their needs.


----------



## AusWolf (Oct 17, 2022)

Sisyphus said:


> You can say whatever you like. It doesn't change anything about the usability of the GTX 970. I bought it; it fitted my needs. My condolences to anyone who purchases products that don't meet their needs.


Again, a product fitting your needs and fitting its own name, description and price are entirely different things. A 4080 12 GB with its specs would have fit my needs perfectly. It doesn't change the fact that lying about specs (970) and releasing two entirely different products with the same name (4080) is wrong.

My point is about naming and pricing which determines a product's position in the market and its (false) perception by the average consumer. Your point is about the product itself. Please read what I'm saying before you reply.


----------



## wheresmycar (Oct 18, 2022)

P4-630 said:


> Great, now what to do with my $900....



It's extremely upsetting to see that our TPU family hasn't bothered to help with this rather simple problem. I mean, come on: first message in the thread, easily observable at first glance, and no solution in sight

Anyway, don't worry, I'm here to help.

OK, I have an $800 budget for my next GPU upgrade. If we combine our budgets, we get a total of $1,700. This is looking good. We can pick up a 4090 for $1,600, woohoo.... that leaves us $100 in hand - it's yours! I know - I'm feeling charitable. We can then split the card in half and each of us takes his share. Seeing as the TPU family failed to help and much time has passed.... you can have the larger portion of the share. I'll just take the measly tiny PCB, which is probably around 5% of the total size, and the rest is yours... yep, the full 95%: casing, massive heatsink, heatpipes, fans, etc. etc. ...the full MONTY!!

Before you crack your head open on that brick wall... just sign here _____________________


----------



## AusWolf (Oct 18, 2022)

wheresmycar said:


> It's extremely upsetting to see that our TPU family hasn't bothered to help with this rather simple problem. I mean, come on: first message in the thread, easily observable at first glance, and no solution in sight
> 
> Anyway, don't worry, I'm here to help.
> 
> ...


Jokes aside: just read back the last page or two and you'll see the problem. People are happy to pay all the money on earth for whatever Nvidia shits out because _"it's the new high-tech Nvidia flagship, mwahaha, not some crappy AMD knock-off, no no"._

Disclaimer: it's not about the people who genuinely need an Nvidia feature, like CUDA. It's about people who would otherwise be perfectly fine with an AMD card, but would rather pay double for the Nvidia equivalent as a sort of cure for their inferiority complex. It's also about people who think lying to customers is acceptable as long as the product, by its own merits, is anywhere near usable.


----------



## sepheronx (Oct 18, 2022)

AusWolf said:


> Jokes aside: just read back the last page or two and you'll see the problem. People are happy to pay all the money on earth for whatever Nvidia shits out because _"it's the new high-tech Nvidia flagship, mwahaha, not some crappy AMD knock-off, no no"._
> 
> Disclaimer: it's not about the people who genuinely need an Nvidia feature, like CUDA. It's about people who would otherwise be perfectly fine with an AMD card, but would rather pay double for the Nvidia equivalent as a sort of cure for their inferiority complex. It's also about people who think lying to customers is acceptable as long as the product, by its own merits, is anywhere near usable.


We need a hero to save us. We need the one, the chosen one, who can stop the evil that is Nvidia and bring decent GPU value back to the land, so we can all be free, happy, and entertained again.

Come, Sir Robin of Drop Bears. Lead us so we can all get good, cheap GPUs again.


----------



## InVasMani (Oct 18, 2022)

sepheronx said:


> We need a hero to save us. We need the one, the chosen one, who can stop the evil that is Nvidia and bring decent GPU value back to the land, so we can all be free, happy, and entertained again.
> 
> Come, Sir Robin of Drop Bears. Lead us so we can all get good, cheap GPUs again.


Jensen: You underestimate my power...
Everyone:


----------



## sepheronx (Oct 18, 2022)

Ananas is a face to be feared and revered.


----------



## InVasMani (Oct 18, 2022)

Will have to remember that when people hate on pineapple on pizza...  for your crimes you have been sentenced to death by pineapple...


----------



## sepheronx (Oct 18, 2022)

InVasMani said:


> Will have to remember that when people hate on pineapple on pizza...  for your crimes you have been sentenced to death by pineapple...


You would eat this face?


----------



## InVasMani (Oct 18, 2022)

Is it warm or cold?


----------



## AusWolf (Oct 18, 2022)

sepheronx said:


> We need a hero to save us. We need the one, the chosen one, who can stop the evil that is Nvidia and bring decent GPU value back to the land, so we can all be free, happy, and entertained again.
> 
> Come, Sir Robin of Drop Bears. Lead us so we can all get good, cheap GPUs again.


Intel Arc A770 entered the chat...

Hmm... _"hero... chosen one... stop the evil..."_

Intel Arc A770 left the chat...


----------



## sepheronx (Oct 18, 2022)

InVasMani said:


> Is it warm or cold?


as cold as AusWolf's heart


AusWolf said:


> Intel Arc A770 entered the chat...
> 
> Hmm... _"hero... chosen one... stop the evil..."_
> 
> Intel Arc A770 left the chat...



I had an interest in the A770 until I found out I can't even get one here.


----------



## AusWolf (Oct 18, 2022)

sepheronx said:


> as cold as AusWolf's heart


Maybe not that cold. Particles completely stop moving, disrupting the known laws of space-time at those temperatures.



sepheronx said:


> I had an interest in the A770 until I found out I can't even get one here.


There is literally one store in the UK where I could put it on pre-order for £400, but that's just a big "_NO_".


----------



## sepheronx (Oct 18, 2022)

AusWolf said:


> Maybe not that cold. Particles completely stop moving, disrupting the known laws of space-time at those temperatures.
> 
> 
> There is literally one store in the UK where I could put it on pre-order for £400, but that's just a big "_NO_".


I have quite a few GPUs I would have liked to do my own tests with and share with the community. I've been wanting to start my own channel to go with a site I've been working on, but of course, without goods, I've got no content. No content, can't get viewers. No viewers, can't get some kind of borrowed equipment to do further tests. The only way to get viewers is to do stupid videos where the video's front image has me with my mouth wide open in an "O", looking like I'm about to stick something phallic into it.

So yeah, I would like one to also document performance gains over driver iterations.

But no pre-orders either. Just nothing. How absolutely pathetic Canada has become, beyond just our general stupidity. Can't even get goods now.


----------



## AusWolf (Oct 18, 2022)

sepheronx said:


> I have quite a few GPUs I would have liked to do my own tests with and share with the community.


Same here. I always say first-hand is the only real experience. I've been positively disappointed by massively hated products (6500 XT, Rocket Lake i7) and negatively disappointed by massively loved ones (R5 3600). That's why I buy a lot more PC hardware than I need to. That's also why I want a Zen 4 system in the near future even though I have zero need for that too. I like seeing for myself what the fuss is about.



sepheronx said:


> How absolutely pathetic canada has become beyond just our general stupidity.


That's the whole world, I'm afraid.


----------



## sepheronx (Oct 18, 2022)

If I find a place to buy an Arc A770 or A750, I'll grab one. Kinda wanted the Acer model.

I would have purchased a 4090 if it was cheaper, like $1,200, but at $1,600 it's just too much.

I don't have much faith in AMD either, honestly.

I've been telling people: want 4K (even if it isn't actually 4K) gaming? Cheap? Get a PS5 or an Xbox Series X.

Right now, even with GPU prices as low as they are, the situation is still abysmal.


----------



## Mussels (Oct 18, 2022)

fevgatos said:


> What exactly are those scummy practices? What morally bankrupt actions are you talking about? Id like to know, if that's the case I won't buy nvidia either


To be fair, there have been quite a few over the years.

The texture compression and lower-quality rendering cheats of the old FX series
The 970 having 3.5 GB and not 4 GB of VRAM
Deliberately selling cards in tiers that make them obsolete faster (1050 Ti 4 GB vs 1060 3 GB - they'd BOTH be better off with the VRAM amounts swapped)
Then the modern shenanigans with FE cards being limited to certain countries, and the 4080 12 GB being a 4070 at 4080 prices


Hell, look at the laptops for the worst of it, where product names became meaningless: a laptop 1060 could have been just about anything. They had products using names to mislead people, as well as different TDP variants with drastically differing performance that they did their best to keep hidden.

There's more, and AMD is not above this sort of thing either - I think all brands have done dodgy shit over the years.


----------



## fevgatos (Oct 18, 2022)

Mussels said:


> The texture compression and lower-quality rendering cheats of the old FX series


Both companies pulled that crap, and that was literally around 20 years ago, no?


Mussels said:


> The 970 having 3.5 GB and not 4 GB of VRAM


The 970 did in fact have 4 GB of VRAM, actually. But I think it's kind of whatever; it was the best value-for-money GPU on the planet at the time. I don't think whether it was advertised as 4 GB or 3.5 GB would have changed sales by one iota.

Only the 1030 shenanigans with DDR4 vs GDDR5 do I consider a BS move, because even an informed buyer couldn't actually distinguish between the products.


----------



## AusWolf (Oct 18, 2022)

Mussels said:


> Deliberately selling cards in tiers that make them obsolete faster (1050 Ti 4 GB vs 1060 3 GB - they'd BOTH be better off with the VRAM amounts swapped)


Since you mentioned that, I think the whole Ampere lineup deserves a few words too:

The 3090 Ti that only came out to milk flagship hunters beyond imagination after they've got their 3090s already,
The 3080 having 10 GB VRAM initially, with a 12 GB version unexpectedly coming out later,
The 3070 Ti 16 GB version being scrapped to force buyers into planned obsolescence with a choice of more VRAM on a lesser card or more performance with less VRAM,
The 3060 having 12 GB VRAM, basically pissing on everything that has only 8 or 10 GB even several tiers above, despite the fact that it doesn't necessarily need that much,
The 3050 with questionable performance for its tier selling for way more than it should just because it's RTX, and...
Having nothing below that. The 3050 is still based on a scrappy version of the mid-tier GA106, which means Nvidia is openly pissing on gamers on a budget.
The 20-series Super cards established a trend which Ampere only continued. That trend is _"milk gamers now, then make them realise their mistake when the proper version comes out later"_. I'm pretty sure every "mistake" we see in Ada cards is planned ahead of time, and we'll see them somewhat rectified in Ti/Super releases next year. Only somewhat because someone will have to buy 50-series, too.

Edit: Before someone argues it, these are not dealbreaker moves, just plain scummy ones.

Like you said, AMD isn't innocent, either (their laptop chip naming scheme is horrendous), but at least they give you the best performance physically possible (fully enabled chips with decent amounts of VRAM), and not crippled versions that only tease you into the "refresh" coming next year.


----------



## fevgatos (Oct 18, 2022)

AusWolf said:


> Since you mentioned that, I think the whole Ampere lineup deserves a few words too:
> 
> The 3090 Ti that only came out to milk flagship hunters beyond imagination after they've got their 3090s already,
> The 3080 having 10 GB VRAM initially, with a 12 GB version unexpectedly coming out later,
> ...


Have you heard of the XT moniker they used on Zen 2 and now on their RDNA cards? They have multiple versions of the 6900: the normal 6900 XT, the 6900 XT special edition with the binned core, and the 6950 XT. I mean, come on


----------



## AusWolf (Oct 18, 2022)

fevgatos said:


> Have you heard of the XT moniker they used on Zen 2 and now on their RDNA cards? They have multiple versions of the 6900: the normal 6900 XT, the 6900 XT special edition with the binned core, and the 6950 XT. I mean, come on


I had a feeling someone would bring this up.

Those are a different matter, because the 3800X for example, was a fully enabled chip with full capability compared to the 3800XT. Same as the 6900XT compared to the 6950XT. The only thing you lose with the "lesser" version is a couple percent performance, max, which you can bring back with overclocking if you're lucky. Nvidia's "Super" game is different, because nowadays, you mostly see vanilla (non-Ti) releases with crippled chips, which you cannot necessarily overcome with overclocking, especially if you pair it with crippled VRAM capacity as well. Additionally, my personal note is that anyone who knows a thing or two about GPUs can suspect from the crippled chips that the good ones are reserved for something coming later. They give you the scraps to tease you into the good stuff you didn't wait for - this is the scummy side of it, imo.


----------



## fevgatos (Oct 18, 2022)

AusWolf said:


> I had a feeling someone would bring this up.
> 
> Those are a different matter, because the 3800X for example, was a fully enabled chip with full capability compared to the 3800XT. Same as the 6900XT compared to the 6950XT. The only thing you lose with the "lesser" version is a couple percent performance, max, which you can bring back with overclocking if you're lucky. Nvidia's "Super" game is different, because nowadays, you mostly see vanilla (non-Ti) releases with crippled chips, which you cannot necessarily overcome with overclocking, especially if you pair it with crippled VRAM capacity as well. Additionally, my personal note is that anyone who knows a thing or two about GPUs can suspect from the crippled chips that the good ones are reserved for something coming later. They give you the scraps to tease you into the good stuff you didn't wait for - this is the scummy side of it, imo.


That's just the wrong buyer mentality. If you bought a product at a specific point in time at a specific price, you MUST have thought that the product was worth it. So what difference does it make if 6 or 9 or 15 months later a different, better product is released? I bought a 3090, and then the 3090 Ti came out. It didn't bother me one tiny bit. Only people who aren't sure about what they are buying have issues with these kinds of things.


----------



## AusWolf (Oct 18, 2022)

fevgatos said:


> That's just the wrong buyer mentality. If you bought a product at a specific point in time at a specific price, you MUST have thought that the product was worth it. So what difference does it make if 6 or 9 or 15 months later a different, better product is released? I bought a 3090, and then the 3090 Ti came out. It didn't bother me one tiny bit. Only people who aren't sure about what they are buying have issues with these kinds of things.


I'm not suggesting that you're suddenly unhappy with your 3090. Of course you're not. It's that you could have had the choice to buy the 3090 or the Ti right from the start, but Nvidia didn't give you that choice. The scummy thing is that the 3090 Ti is not an entirely different product with its own development cycle - it's just a more enabled one that was reserved to make sure people buy the lesser version. It's not even that pronounced on this level, but if you look at the 3080 12 GB, that represents what I mean a bit better.

Edit: It's like a girlfriend that gives you the worse slice of pizza, then halfway through her better slice, realizes that she's full, and you're left thinking "great, I could have had that one right from the start".


----------



## fevgatos (Oct 18, 2022)

AusWolf said:


> I'm not suggesting that you're suddenly unhappy with your 3090. Of course you're not. It's that you could have had the choice to buy the 3090 or the Ti right from the start, but Nvidia didn't give you that choice. The scummy thing is that the 3090 Ti is not an entirely different product with its own development cycle - it's just a more enabled one that was reserved to make sure people buy the lesser version. It's not even that pronounced on this level, but if you look at the 3080 12 GB, that represents what I mean a bit better.
> 
> Edit: It's like a girlfriend that gives you the worse slice of pizza, then halfway through her better slice, realizes that she's full, and you're left thinking "great, I could have had that one right from the start".


But you don't think there was a technical reason for not launching the 3090 Ti from the get-go? Like, for example, yield issues? Generally speaking, as manufacturing matures, you get better products.

For example, even though they were sold as the same product, if you happened to buy a 1st-gen Ryzen close to launch versus 10 months later, they were completely different products in terms of OCing. My 1600 needed an insane amount of voltage to hit 3.6 GHz, while two I bought for my friends were casually hitting 4 GHz on a minimal amount.


----------



## Vya Domus (Oct 18, 2022)

fevgatos said:


> You said that they would have legal issues, meaning that they didn't know it for the last few months and only realised it yesterday



No, they knew it was disingenuous from the start; they just hoped it would fly under the radar. I don't know why you are trying so hard to be obtuse on purpose and pretend not to understand the basic idea here.

We get it, you love Nvidia and won't admit that they did something shitty, even though they themselves have admitted the naming was confusing. But like you said, I am sure you know better than a multibillion-dollar company that decided to retract a product because they were too dumb or something, I don't know.



Sisyphus said:


> They won't be able to mislead informed consumers.



What even is an informed consumer? Millions of people play video games; it's evident that most won't have technical knowledge about these things. That's why there are laws against false advertising in the first place.


----------



## AusWolf (Oct 18, 2022)

fevgatos said:


> But you don't think there was a technical reason for not launching the 3090 Ti from the get-go? Like, for example, yield issues? Generally speaking, as manufacturing matures, you get better products.
> 
> For example, even though they were sold as the same product, if you happened to buy a 1st-gen Ryzen close to launch versus 10 months later, they were completely different products in terms of OCing. My 1600 needed an insane amount of voltage to hit 3.6 GHz, while two I bought for my friends were casually hitting 4 GHz on a minimal amount.


If there are technical issues during manufacturing (for example, yield), then why do they only hit Nvidia GPUs specifically and nothing else? They are the only company (as far as I know) that has ever pulled off the launch of a new generation without a single product based on a fully enabled die in the lineup.

OCing is a different matter. A 1700X bought at launch is a fully enabled chip, and so is the same 1700X bought a year later, when you compare stock settings. Intel/AMD/Nvidia aren't selling OC - that's something you do for yourself.


----------



## fevgatos (Oct 18, 2022)

Vya Domus said:


> No, they knew it was disingenuous from the start; they just hoped it would fly under the radar. I don't know why you are trying so hard to be obtuse on purpose and pretend not to understand the basic idea here.
> 
> We get it, you love Nvidia and won't admit that they did something shitty, even though they themselves have admitted the naming was confusing. But like you said, I am sure you know better than a multibillion-dollar company that decided to retract a product because they were too dumb or something, I don't know.


I love Nvidia? No, I don't, but even if I did, that's not an actual argument. I could tell you that you hate Nvidia, blah blah blah - it doesn't matter.

What I'm saying is that naming is irrelevant as long as an informed consumer can differentiate between your products on a shelf. That wasn't the case with the two versions of the 1030, for example, but it was the case with the two versions of the 4080.

What do you think will actually change now that they took it back? They'll rename it to 4070 and sell it for the same price. WOW, huge win for the consumer, right?



AusWolf said:


> If there are technical issues during manufacturing (for example, yield), then why do they only hit Nvidia GPUs specifically and nothing else?
> 
> OCing is a different matter. A 1700X bought at launch is a fully enabled chip, and so is the same 1700X bought a year later, when you compare stock settings. Intel/AMD/Nvidia aren't selling OC - that's something you do for yourself.


They don't only hit Nvidia; AMD does it, Intel does it. Sure, you can get down to the technicalities and claim it's different because Nvidia changes the number of CUDA cores, but is it really different? Is 10% more CUDA cores immoral, but 10% higher clock speeds fine?


----------



## Vya Domus (Oct 18, 2022)

fevgatos said:


> What I'm saying is that naming is irrelevant as long as an informed consumer



As I wrote above, not everyone can be an informed consumer, because not everyone has the knowledge or the interest to be one; that's why the law protects these consumers.



fevgatos said:


> huge win for the consumer, right?



Actually, yes, it is a huge win, because a huge corporation doesn't get to mislead its consumers.


----------



## AusWolf (Oct 18, 2022)

fevgatos said:


> What do you think will actually change now that they took it back? They'll rename it to 4070 and sell it for the same price. WOW, huge win for the consumer, right?


They won't be able to sell it for $900.

If I'm mistaken, and they actually do sell it for that price, then I'll agree with you that people are stupid for buying it (besides being disappointed by the human intellect for the Nth time).



fevgatos said:


> They don't only hit Nvidia; AMD does it, Intel does it. Sure, you can get down to the technicalities and claim it's different because Nvidia changes the number of CUDA cores, but is it really different? Is 10% more CUDA cores immoral, but 10% higher clock speeds fine?


Sorry, I edited my post a bit late, so I'll write it down again (my bad, really).

Nvidia is the only company I've ever seen pull off the full launch of an entirely new generation without a single product based on a fully enabled die anywhere in the product stack. Why is that? If Intel can release the 12900K together with the 12700K and 12600K, and if AMD can release the 7950X and 7700X together with the 7900X and 7600X (or the 6900 XT together with the 6800 and 6800 XT, for that matter), then why can't Nvidia do the same?


----------



## BSim500 (Oct 18, 2022)

sepheronx said:


> I have quite a few GPUs I would have liked to do my own tests with and share with the community. I've been wanting to start my own channel to go with a site I've been working on, but of course, without goods, I've got no content. No content, can't get viewers. No viewers, can't get some kind of borrowed equipment to do further tests. The only way to get viewers is to do stupid videos where the video's front image has me with my mouth wide open in an "O", looking like I'm about to stick something phallic into it.


I say go for it. You've certainly nailed the YouTube algorithm...


----------



## N3M3515 (Oct 18, 2022)

AusWolf said:


> every "mistake" we see in Ada cards is planned ahead of time


Yep, with the 4000 series prices they were testing the waters.



fevgatos said:


> They'll rename it to 4070 and sell it for the same price.


They won't sell it for the same price; you know that, right?


----------



## wheresmycar (Oct 18, 2022)

AusWolf said:


> Jokes aside: just read back the last page or two and you'll see the problem. People are happy to pay all the money on earth for whatever Nvidia shits out because _"it's the new high-tech Nvidia flagship, mwahaha, not some crappy AMD knock-off, no no"._
> 
> Disclaimer: it's not about the people who genuinely need an Nvidia feature, like CUDA. It's about people who would otherwise be perfectly fine with an AMD card, but would rather pay double for the Nvidia equivalent as a sort of cure for their inferiority complex. It's also about people who think lying to customers is acceptable as long as the product, by its own merits, is anywhere near usable.



Oh, trust me, I get it. I've always remained silent in hopes of GPU prices returning to some level of acceptance. I don't mind filling the pockets of others and supporting businesses that offer excellent products and, more importantly, keep us entertained. Happily filling those pockets essentially implies "reasonably healthy profit margins", as opposed to "appallingly fatty-inflated and sadistically greedy margins".... the latter being where we are at the moment for the best of cards. I'm not interested in XX90 flagship pocket-pits.... but I did suspect the 4080 (or a similarly performing 40X0) would go for around $800 (maybe I'm too optimistic). Seeing the $1200 price tag seriously threw me off guard.

I have a feeling AMD's gonna go the same route... feeding off NVIDIA's perf/price ratios with a little trim in price. Simply not good enough for me. BEING A CRAZY IMPULSIVE BUYER... I've convinced myself not to buy anything above $800, which by all means is already a huge sum of money for a gaming card (esp. @ 1440p).


----------



## Sisyphus (Oct 18, 2022)

AusWolf said:


> [...]My point is about naming and pricing which determines a product's position in the market and its (false) perception by the average consumer. Your point is about the product itself. Please read what I'm saying before you reply.


I will not defend ill-informed consumers. Should they buy a product for its name, it will teach them a lesson. Hopefully before they reach the age to sign real estate contracts.


----------



## AusWolf (Oct 18, 2022)

Sisyphus said:


> I will not defend ill-informed consumers. Should they buy a product for its name, it will teach them a lesson. Hopefully before they reach the age to sign real estate contracts.


So someone who doesn't read IT media every day and just wants to play games without knowing everything about the technical bits is an ill-informed consumer who should be punished for their ignorance. Wow, what an attitude!


----------



## Sisyphus (Oct 18, 2022)

Vya Domus said:


> What even is an informed consumer ?


I am an informed consumer. I am even able to vote . . . 


> Millions of people play video games, it's evident that most wont have technical knowledge about these things, that's why there are laws against false advertising in the first place.


So these people are unable to inform themselves? Not even with Google or YouTube, where the basics are explained in simple language? Those who do not inform themselves have to live with the consequences. False advertising only occurs if the product is advertised with a property that it does not have. You can name it whatever you like.



AusWolf said:


> So someone who doesn't read IT media every day and just wants to play games without knowing everything about the technical bits is an ill-informed consumer who should be punished for their ignorance. Wow, what an attitude!


You don't have to read IT media every day to inform yourself about specs. 30 minutes of YouTube or Google does the job. People who don't do their research before spending hundreds or thousands of dollars are punishing themselves.


----------



## sepheronx (Oct 18, 2022)

I agree. That's why I will never buy a Tesla or a Ford Pinto.


----------



## AusWolf (Oct 18, 2022)

Sisyphus said:


> I am an informed consumer. I am even able to vote . . .


That's not a definition.



Sisyphus said:


> So these people are unable to inform themselves? Not even with Google or YouTube, where the basics are explained in simple language? Those who do not inform themselves have to live with the consequences. False advertising only occurs if the product is advertised with a property that it does not have. You can name it whatever you like.
> 
> You don't have to read IT media every day to inform yourself about specs. 30 minutes of YouTube or Google does the job. People who don't do their research before spending hundreds or thousands of dollars are punishing themselves.


You cannot inform yourself about everything.

I read countless reviews about my car before I bought it, but not one of them mentioned that the rear coil springs are prone to rusting away and breaking in a few years' time, or that the small brakes work so hard that I have to change the brake fluid basically every year due to degradation. Is this my fault then?

I have a colleague who comes to me for PC advice from time to time because he knows nothing about them. But he knows a lot about fishing. I've never fished in my life. If I ever go fishing with him, I'll be completely clueless as to what to do or what equipment to get. I'm just not interested enough to spend time reading about something that I don't care about. Is this my fault as well?

TLDR: People have different hobbies and interests. A day is 24 hours for everyone. You can't seriously expect people to spend a chunk of their free time to read about something that they don't give a flying F about. That's what shop assistants are for. I'm sure you've bought stuff in your life that you didn't care to read reviews about. It's not your fault that you don't know everything about everything.


----------



## fevgatos (Oct 19, 2022)

AusWolf said:


> So someone who doesn't read IT media every day and just wants to play games without knowing everything about the technical bits is an ill-informed consumer who should be punished for their ignorance. Wow, what an attitude!


But what does an uninformed consumer lose if nvidia releases a 12gb 4080? I don't get it. He will buy whatever premade PC is on his budget or is suggested by the seller, whether it has a 4080 12gb, a 4080 16gb, or a renamed 4080 that's now called 4070.


----------



## N3M3515 (Oct 19, 2022)

fevgatos said:


> But what does an uninformed consumer lose if nvidia releases a 12gb 4080?


If Nvidia had released it at $900, he would have lost $300 if he bought it.


----------



## AusWolf (Oct 19, 2022)

fevgatos said:


> But what does an uninformed consumer lose if nvidia releases a 12gb 4080? I don't get it. He will buy whatever premade PC is on his budget or is suggested by the seller, whether it has a 4080 12gb, a 4080 16gb, or a renamed 4080 that's now called 4070.


_He doesn't know the difference, so let's lie to him and feed him some sh*t, 'cause that's what non-techies deserve._ That's the attitude again! 

If you build a PC for someone who doesn't know anything about PCs, do you ask them to pay you $1,000 and give them a $500 computer?


----------



## fevgatos (Oct 19, 2022)

N3M3515 said:


> If nvidia had released it at $900, he would have lost $300 if he bought it


How so? Do you think the $600 Nvidia card will be equal to / faster than the $900 4080 12 GB?



AusWolf said:


> _He doesn't know the difference, so let's lie to him and feed him some sh*t, 'cause that's what non-techies deserve._ That's the attitude again!
> 
> If you build a PC for someone who doesn't know anything about PCs, do you ask them to pay you $1,000 and give them a $500 computer?


There is no lying. There is a card for $1,200 (the 4080 16 GB) and a card for $900 (the 12 GB). An uninformed consumer will buy whatever is within his budget or whatever the seller suggests.

Again, I'm not saying it's a great decision to name them both 4080, but I don't see it as a huge issue anyway. They could have named it 4099 Ti for all I care; it doesn't really matter.


----------



## AusWolf (Oct 19, 2022)

fevgatos said:


> How so? Do you think the $600 Nvidia card will be equal to / faster than the $900 4080 12 GB?


I think the point was that the 4080 12 GB isn't worth $900 no matter how hard Nvidia pushes it.



fevgatos said:


> There is no lying. There is a card for $1,200 (the 4080 16 GB) and a card for $900 (the 12 GB). An uninformed consumer will buy whatever is within his budget or whatever the seller suggests.
> 
> Again, I'm not saying it's a great decision to name them both 4080, but I don't see it as a huge issue anyway. They could have named it 4099 Ti for all I care; it doesn't really matter.


You can't imagine how much the name of a product matters to its marketability towards non-tech-oriented consumers (a.k.a. average gamers). Higher number = better = more expensive. Maybe it doesn't work like that for you and me, but it does for most people.


----------



## Dirt Chip (Oct 19, 2022)

AusWolf said:


> _He doesn't know the difference, so let's lie to him and feed him some sh*t, 'cause that's what non-techies deserve._ That's the attitude again!
> 
> If you build a PC for someone who doesn't know anything about PCs, do you ask them to pay you $1,000 and give them a $500 computer?


As long as you're up-front with all the info and not lying about it, then basically I see no real problem.
NV offers a product, all info about it is publicly available, and they put a price tag on it.
You are free to compare it with other similar products and decide if it is good for you or not. If yes, pay.

Unless you can show that NV made false advertisements, all responsibility falls on the consumer.


----------



## AusWolf (Oct 19, 2022)

Dirt Chip said:


> As long as you're up-front with all the info and not lying about it, then basically I see no real problem.
> NV offers a product, all info about it is publicly available, and they put a price tag on it.
> You are free to compare it with other similar products and decide if it is good for you or not. If yes, pay.
> 
> Unless you can show that NV made false advertisements, all responsibility falls on the consumer.


What info is available? The number of CUDA cores? That's really helpful for someone who doesn't know what a CUDA core is. Heck, Nvidia confused even enthusiasts when they changed what the term means in the Ampere generation. Even Ampere vs. pre-Ampere CUDA cores aren't the same. If a non-tech oriented person read what I just wrote about in the last two lines, they'd have no idea what I just said, just like they'd have no idea looking at Nvidia's web page (the rest of which is pure marketing).


----------



## fevgatos (Oct 19, 2022)

AusWolf said:


> What info is available? The number of CUDA cores? That's really helpful for someone who doesn't know what a CUDA core is. Heck, Nvidia confused even enthusiasts when they changed what the term means in the Ampere generation. Even Ampere vs. pre-Ampere CUDA cores aren't the same. If a non-tech oriented person read what I just wrote about in the last two lines, they'd have no idea what I just said, just like they'd have no idea looking at Nvidia's web page (the rest of which is pure marketing).


Well, okay then, let's talk about AMD. There is an AMD graph flying around about how much better value their products are. Nowhere in that graph does it mention how terrible they are at RT. You don't think that's also misleading - comparing a 6900 XT to a 3080 Ti (or a 3090, I don't remember)? Imagine the poor guy buying a 3090-equivalent GPU (or so he thought) who tries to play Cyberpunk, only to realize he basically bought a 3060-equivalent product. So if every company is trying to mislead you, what's the solution? Not buying Nvidia products isn't a solution, since AMD is also misleading.

Especially nowadays, you cannot make any decision without seeing a review, because GPUs aren't all about raster anymore. There are so many things on top, like RT / DLSS / FSR and other stuff, to take into account. Whether or not Nvidia or AMD are trying to mislead you, you need to do some research regardless.


----------



## Dirt Chip (Oct 19, 2022)

AusWolf said:


> What info is available? The number of CUDA cores? That's really helpful for someone who doesn't know what a CUDA core is. Heck, Nvidia confused even enthusiasts when they changed what the term means in the Ampere generation. Even Ampere vs. pre-Ampere CUDA cores aren't the same. If a non-tech oriented person read what I just wrote about in the last two lines, they'd have no idea what I just said, just like they'd have no idea looking at Nvidia's web page.


The spec is there if you want it, but it's mostly irrelevant for the reasons you mentioned.
You have TPU and other sites that compare and show all kinds of metrics.
If you, as a computer user, don't do even a basic search on the internet before you buy a product costing hundreds to 1000+ dollars, then the problem is yours, plain and simple.

You can argue that NV is "unfair" with its naming, but this is where you need to look more closely at what's right for you.
If NV overly mis-names its products, then in the end it will suffer big time and ultimately destroy its own GeForce brand name.


----------



## AusWolf (Oct 19, 2022)

fevgatos said:


> Well, okay then, let's talk about AMD. There is an AMD graph flying around about how much better value their products are. Nowhere in that graph does it mention how terrible they are at RT. You don't think that's also misleading - comparing a 6900 XT to a 3080 Ti (or a 3090, I don't remember)? Imagine the poor guy buying a 3090-equivalent GPU (or so he thought) who tries to play Cyberpunk, only to realize he basically bought a 3060-equivalent product. So if every company is trying to mislead you, what's the solution? Not buying Nvidia products isn't a solution, since AMD is also misleading.


Yes, it is misleading. I wasn't defending AMD on their bullshit, I was only calling out Nvidia on theirs. Although, in AMD's defence, a higher-numbered product is actually better than a lower-numbered one within their own product stack, and there aren't two different products with the same number / designation.



fevgatos said:


> Especially nowadays, you cannot make any decision without seeing a review, because GPUs aren't all about raster anymore. There are so many things on top, like RT / DLSS / FSR and other stuff, to take into account. Whether or not Nvidia or AMD are trying to mislead you, you need to do some research regardless.


That's true, but not everybody knows that. I agree that some form of knowledge is necessary for a good decision, but I still disagree that cheating those who don't do the research is right.



Dirt Chip said:


> The spec is there if you want it, but it's mostly irrelevant for the reasons you mentioned.
> You have TPU and other sites that compare and show all kinds of metrics.
> If you, as a computer user, don't do even a basic search on the internet before you buy a product costing hundreds to 1000+ dollars, then the problem is yours, plain and simple.
> 
> ...


That's probably why they backed off the two vastly different 4080s - which I'm glad for.


----------



## Dirt Chip (Oct 19, 2022)

I, for one, would be happy to see the "GeForce" brand destroyed and replaced. Also, NV has way too much confidence. We need them to be more humble in order to get proper performance at a sane price.

#medreaming


----------



## N3M3515 (Oct 19, 2022)

fevgatos said:


> No where in that graph does it mention how terrible they are on RT


Yeah, because that 5% of games with RT is so important. RT is irrelevant for the majority of people until it is something almost every game has; hell, I don't play even one game that supports RT.


----------



## fevgatos (Oct 19, 2022)

N3M3515 said:


> Yeah, because that 5% of games with RT is so important. RT is irrelevant for the majority of people until it is something almost every game has; hell, I don't play even one game that supports RT.


10% of games require a high-end graphics card. If 5% of games have RT, that means that 50% of the games that require a high-end GPU have RT. That's a huge percentage.

High-end GPUs are irrelevant for the majority of people, but the minority that buys $800 to $2k GPUs cares about it. Sales tell you as much.
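The share argument above is just a conditional-probability ratio; a minimal sketch, using the post's assumed figures plus the extra (unstated) assumption that every RT game also requires a high-end GPU:

```python
# Sketch of the share argument, under assumed figures from the post above.
p_high_end = 0.10  # fraction of games assumed to require a high-end GPU
p_rt = 0.05        # fraction of games with RT (assumed subset of the above)

# Share of high-end-requiring games that have RT:
rt_share = p_rt / p_high_end
print(f"{rt_share:.0%}")  # prints 50%
```

Note the result only holds if RT games really are a subset of high-end-requiring games; if some RT games run fine on midrange cards, the 50% figure shrinks.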


----------



## N3M3515 (Oct 19, 2022)

fevgatos said:


> but the minority that buys $800 to $2k GPUs cares about it


$2k, sure; $800, I don't think so. I tend to believe people who buy only Nvidia do it for mindshare and "prestige", or, in the case of the halo GPUs, professionals do it for work. Also, RT games are not the only games that need expensive GPUs, so that "huge" percentage is not that huge after all. Nvidia's CEO has every reason to push RT, since he spent a huge amount of money developing it and wanted a return on investment in advance, back when the products could barely use the technology (like 3 games when it started, and even those 3 could hardly be played because of the performance hit). Only now, with the 4000 series, is RT starting to be important, because you can play with it without any upscaling and there are more games, although not enough to make it the most relevant benchmark. The most relevant benchmark is still raster.


----------



## Sisyphus (Oct 19, 2022)

AusWolf said:


> That's not a definition.
> You cannot inform yourself about everything.


Nobody can. 



> I read countless reviews about my car before I bought it, but not one of them mentioned that the rear coil springs are prone to rusting away and breaking in a few years' time, or that the small brakes work so hard that I have to change the brake fluid basically every year due to degradation. Is this my fault then?


Your example is not related to the previous discussion. The car wasn't bought because of its name, nor was it renamed. A new name would not have changed the problem here either. In any case, you bear the consequences of your purchase decision, regardless of how well you informed yourself beforehand. The better informed you are, the lower the probability of making a bad purchase. The canceled RTX 4080 12 GB proves me right.


> I have a colleague who comes to me for PC advice from time to time because he knows nothing about them. But he knows a lot about fishing. I've never fished in my life. If I ever go fishing with him, I'll be completely clueless as to what to do or what equipment to get. I'm just not interested enough to spend time reading about something that I don't care about. Is this my fault as well?
> TLDR: People have different hobbies and interests. A day is 24 hours for everyone. You can't seriously expect people to spend a chunk of their free time to read about something that they don't give a flying F about. That's what shop assistants are for. I'm sure you've bought stuff in your life that you didn't care to read reviews about. It's not your fault that you don't know everything about everything.


I understand your point of view. You are starting from moral imperatives: the seller should advise correctly, the manufacturer should choose understandable product names, advertising should not exaggerate. I suppose it's better to verify than to trust. There are different ways to do this: get multiple opinions from different sellers, or pay a professional you trust. In any case, you should learn the basics. If you spend large sums on a product without getting informed or obtaining professional, independent advice, you have a good chance of being disadvantaged or making a bad purchase.
And yes, I bought a lot of stuff during my life badly informed, and later regretted the purchase. I learned from it.


----------

