# NVIDIA GeForce RTX 3090 Founders Edition Potentially Pictured: 3-slot Behemoth!



## VSG (Aug 22, 2020)

The rumor mill takes no weekend break, and it has churned out photos of what appears to be an NVIDIA Founders Edition version of the upcoming GeForce RTX 3090 next to the equivalent FE RTX 2080, with the latter looking like a toy beside the massive triple-slotter. The cooler uses the same design we discussed in detail in June, with the unique obverse dual-fan + aluminium heatsink seen in the images below. We also covered alleged PCB photos, in case you missed them before, and everything lines up with the most recent leaks. The only difference here is that pricing for the RTX 3090 FE is claimed to be $1,400, a far cry from the $2,000 mark we saw for certain aftermarket offerings in the making, yet still significantly higher than the previous generation, a worrying trend we eagerly await to see justified by performance, before we even get into case-compatibility concerns with the increased length. Either way, if the images below are accurate, we are equally curious about the cooling capability and how it affects partner solutions and pricing.



 



*View at TechPowerUp Main Site*


----------



## ZoneDymo (Aug 22, 2020)

dear lord that is massive


----------



## DuxCro (Aug 22, 2020)

Massive card with a massive price. I doubt AMD will be able to compete. Maybe with RDNA 3 which will bring the chiplet GPU design and hopefully a massive boost in performance.


----------



## Toothless (Aug 22, 2020)

Well. NVIDIA did say it was going to have cooling on front and back.


----------



## Space Lynx (Aug 22, 2020)

This is it, the Ultima Weapon of Final Fantasy, Nvidia Edition, has finally arrived! 2080 Ti owners will weep at its almighty glory!


----------



## Vayra86 (Aug 22, 2020)

The cost of RT?

Hard pass, tbh


----------



## Chomiq (Aug 22, 2020)

Only $1400? I'll take two. Who needs expansion slots anyway?


----------



## ZoneDymo (Aug 22, 2020)

DuxCro said:


> Massive card with a massive price. I doubt AMD will be able to compete. Maybe with RDNA 3 which will bring the chiplet GPU design and hopefully a massive boost in performance.



I don't think 99.9% of people care about AMD competing with this overpriced monster, tbh.
I know it's more attainable then, but it's like saying you doubt a new Lambo is going to compete with a LaFerrari; it's not really relevant for the vast majority of us.


----------



## micropage7 (Aug 22, 2020)

i need this one to play plants vs zombies


----------



## john_ (Aug 22, 2020)

I have it bigger...

...the graphics card.


Well, I think AMD would have been in a much worse position if the 3090 were a normal-sized graphics card.
Nvidia having to go over 300W, as is rumored and probably true based on the size of the card, is a good thing for AMD.
Just consider the possibility of Nvidia coming up with a card 50% faster than the 2080 Ti at 200W. This huge card is good news for AMD, even if it is 50% faster than the 2080 Ti. Especially at the rumored $1400 price, with the 3080 at $800, Nvidia is leaving AMD plenty of room to breathe.


----------



## lexluthermiester (Aug 22, 2020)

VSG said:


>


That beast will not be going into my Dell T3500. It literally won't physically fit. I'm sure it would work, though. Might be time to build a new system... I've been dragging my feet for almost a year anyway... I've settled on Threadripper, just trying to decide which one.


----------



## s3thra (Aug 22, 2020)

That thing is so tall! The weight of it too I assume. Poor PCIe slot!

At least with the larger fan size it might be quiet.


----------



## Xex360 (Aug 22, 2020)

That is huge. Is it because they are planning on putting two dies in there, or because they'll use a less efficient node from Samsung?
I hope the prices for the rest of the Ampere family will be reasonable; I can't even imagine the price of this monstrosity.



DuxCro said:


> Massive card with a massive price. I doubt AMD will be able to compete. Maybe with RDNA 3 which will bring the chiplet GPU design and hopefully a massive boost in performance.


Navi was already faster than equivalent Turing while being much smaller; they just didn't scale it up. As for RDNA2, if we look at the consoles, the Series X should apparently be equivalent to a 2080, and the PS5 is able to run at high frequencies. So if they manage to add more CUs and run them fast enough, I don't see why they can't compete with Ampere, at least in rasterization, even if probably less efficiently than Ampere.


----------



## Rob94hawk (Aug 22, 2020)

With this card I can finally play Zork in 8k!


----------



## Searing (Aug 22, 2020)

DuxCro said:


> Massive card with a massive price. I doubt AMD will be able to compete. Maybe with RDNA 3 which will bring the chiplet GPU design and hopefully a massive boost in performance.



That's part of the problem, I want AMD to deliver better cards at the same price point. I don't want them to compete with overpriced nonsense. I want the $1000 AMD card to be much better than the $1000 nVidia one. Or $500 vs $500 (right now the most expensive card AMD has is about $360 USD all the time).


----------



## DuxCro (Aug 22, 2020)

ZoneDymo said:


> I dont think 99.9% of the people care about AMD competing with this overpriced monster tbh.
> I know its more attainable then, but its like saying you doubt a new Lambo is going to compete with a LaFerrari, its not really relevant for the vast majority of us.


Well, competition is always better for us, the end consumers. Look at what AMD did to the CPU market: we were sentenced to Intel's incremental IPC increases and insane prices for many years. Now the only advantage Intel has over AMD's Zen 2 is clock speed. With Zen 3 around the corner, idk why anyone would buy an Intel CPU for a new build. If AMD manages to make something that competes with Nvidia in all price/performance segments, it will force both sides into competitive pricing. Hopefully AMD goes with very low pricing in order to gain market share, just like they did with Zen CPUs. But I think we can all agree that discrete graphics card prices haven't had anything to do with common sense since the RTX series came out.


----------



## Lomskij (Aug 22, 2020)

DuxCro said:


> Massive card with a massive price. I doubt AMD will be able to compete. Maybe with RDNA 3 which will bring the chiplet GPU design and hopefully a massive boost in performance.



I think AMD is perfectly capable of building big and expensive cards :-D


----------



## Max(IT) (Aug 22, 2020)

that's insane   

It could probably fit in my case (I have space in both directions), but it still is massive.
BTW, I'm not going to buy a $2000 VGA.

Looking forward to the RTX 3070.


----------



## ypsylon (Aug 22, 2020)

Honestly, I never expected that after the 580/680 era we'd be moving back in time to ridiculous VGA sizes again. We're only a step away from this.






I'm excited for the 3000 series of course, my rendering rig could use some new muscle, but no way am I doing 3 slots; I even cringe at 2-slot models, being spoiled by multiple 1-slot 1080 Tis.


----------



## BorisDG (Aug 22, 2020)

This has to be hot... hmm


----------



## zlobby (Aug 22, 2020)

Extra THICC!


----------



## Fluffmeister (Aug 22, 2020)

I thought the mob had already confirmed it was $2k?


----------



## P4-630 (Aug 22, 2020)




----------



## Animalpak (Aug 22, 2020)

I don't get it; they showed a while ago that the PCB is shorter than on any current high-end graphics card, and now we see something unusual and very hard to believe.

Producing this is absolutely not cost-effective; fitting it in an ATX case... neither...


----------



## VulkanBros (Aug 22, 2020)

BUT - can it play Crysis?


----------



## P4-630 (Aug 22, 2020)

VulkanBros said:


> BUT - can it play Crysis?



Crytek unveils 8K screenshot of Crysis Remastered: 'Can it run Crysis (again)?'

The original Crysis (2007) was known at the time as a game that demanded a lot from a video card. Crysis ran on a for-its-time hyperm...

nl.hardware.info

The Crysis remaster includes:


4k textures
Screen Space Directional Occlusion (SSDO)
Voxel-based global illumination (SVOGI)
High-end volumetric fog/lighting effects and reflection
Software-based ray-tracing in CryEngine 5.6
Motion blur
Parallax Occlusion mapping
New particle effects
Enhanced depth of field


----------



## ZoneDymo (Aug 22, 2020)

lexluthermiester said:


> That beast will not be going into my Dell T3500. It literally won't fit in physically. I'm sure it would work though. Might be time to build a new system... Been dragging my feet for almost a year anyway... I've settled on Threadripper, just trying to decide on which one..



Honestly though, if done neatly that could be a pretty cool case mod: literally cutting the side panel so the video card sticks out, basically one of those muscle cars, but for a PC.


----------



## midnightoil (Aug 22, 2020)

To me, this suggests that NVIDIA *think* they are likely to be in serious trouble with RDNA2.

This card screams massive die pushed way past a reasonable point on the clock / voltage (efficiency) curve.  This appears to be a much larger die in relative terms than the already huge 2080Ti die, given how big the process shrink is.  Wonder if it's at the limits of what Samsung can reasonably do?  If so, yields are probably crap, even before they bin them.

It doesn't directly say anything about RDNA2, but we know the former is pretty damned efficient from the XBOX & PS5 numbers, and I simply don't think NVIDIA would have gone this big, this hot, this hungry for any other reason than they think that as an architecture, RDNA2 likely has them beat.  Especially not when they're changing from TSMC to Samsung as their lead supplier.


----------



## stimpy88 (Aug 22, 2020)

So nVidia is scared of AMD this time round...  Interesting.


----------



## Chomiq (Aug 22, 2020)

VulkanBros said:


> BUT - can it play Crysis?


Well, Crysis is still CPU-bound, so it wouldn't really matter.


----------



## medi01 (Aug 22, 2020)

DuxCro said:


> doubt AMD will be able to compete.


Why would it? Remind me, when was the last time AMD rolled out an oversized chip?

You don't go bananas with power consumption (and, I guess, chip size) if you are not afraid of the competitor.
When your cards are far ahead, you can comfortably downclock and boast about uber perf/watt.
Sandbagging your performance also gives you more room to roll out updated versions of the cards and milk the market more.

AMD is expected to roll out a card that beats the 2080 Ti, is 500mm² or smaller, and comes in below $1k (how much below will depend on how far ahead of the 2080 Ti it is).


----------



## Max(IT) (Aug 22, 2020)

VulkanBros said:


> BUT - can it play Crysis?


At medium settings


----------



## Totally (Aug 22, 2020)

Searing said:


> That's part of the problem, I want AMD to deliver better cards at the same price point. I don't want them to compete with overpriced nonsense. I want the $1000 AMD card to be much better than the $1000 nVidia one. Or $500 vs $500 (right now the most expensive card AMD has is about $360 USD all the time).



Unfortunately, those cash-strapped paupers at AMD seem glad to charge $800 when Nvidia's card sells for $1000, if they can deliver 80% of the performance.


----------



## mbeeston (Aug 22, 2020)

meh... my 1070 Ti is already triple slot... bring it on.. and while I'm at it.. bring on a lottery win!



P4-630 said:


> Crytek unveils 8K screenshot of Crysis Remastered: 'Can it run Crysis (again)?'
> 
> 
> The original Crysis (2007) was known at the time as a game that demanded a lot from a video card. Crysis ran on a for-its-time hyperm...
> ...


huh.. wonder if it could cap the 240Hz of my monitor >.>


----------



## medi01 (Aug 22, 2020)

midnightoil said:


> we know the former is pretty damned efficient from the XBOX & PS5 numbers


Have they ever voiced power consumption figures, by the way?



Totally said:


> AMD seem glad to charge $800 when Nvidia's card sells for $1000


How dare they.



Totally said:


> and they can deliver 80% of the performance.


A lovely strawman.


----------



## Max(IT) (Aug 22, 2020)

midnightoil said:


> To me, this suggests that NVIDIA *think* they are likely to be in serious trouble with RDNA2.
> 
> This card screams massive die pushed way past a reasonable point on the clock / voltage (efficiency) curve.  This appears to be a much larger die in relative terms than the already huge 2080Ti die, given how big the process shrink is.  Wonder if it's at the limits of what Samsung can reasonably do?  If so, yields are probably crap, even before they bin them.
> 
> It doesn't directly say anything about RDNA2, but we know the former is pretty damned efficient from the XBOX & PS5 numbers, and I simply don't think NVIDIA would have gone this big, this hot, this hungry for any other reason than they think that as an architecture, RDNA2 likely has them beat.  Especially not when they're changing from TSMC to Samsung as their lead supplier.


Most probably AMD isn’t even playing in the same league...


----------



## TheDeeGee (Aug 22, 2020)

Looks like 120 mm fans, then?


----------



## chris.london (Aug 22, 2020)

midnightoil said:


> It doesn't directly say anything about RDNA2, but we know the former is pretty damned efficient from the XBOX & PS5 numbers, and I simply don't think NVIDIA would have gone this big, this hot, this hungry for any other reason than they think that as an architecture, RDNA2 likely has them beat.  Especially not when they're changing from TSMC to Samsung as their lead supplier.



Maybe. If history is any indication, you are probably right. Or maybe nVidia just wants to make the new consoles look obsolete out of the gate. Or Samsung’s 8nm is simply subpar. We’ll know soon enough.

In any case, that is a massive card. I am not even sure it will fit into my Silverstone FT05.


----------



## kings (Aug 22, 2020)

Some people are exaggerating a lot, the dimensions don't seem to be very different from most 2080Ti AIB versions.


----------



## TheoneandonlyMrK (Aug 22, 2020)

Soo, 400+ watts is a possibility for realz.

@kings Yes, big cards have been seen before, but this is the reference design, and from Nvidia it is a step beyond any reference cooler seen so far.


----------



## cueman (Aug 22, 2020)

We might as well forget calling AMD's card 'Big Navi'; it's not what AMD calls it, and it doesn't deserve the name anyway. But BIG AMPERE... yes indeed. Believe it: all the way, in both performance and size.


----------



## bogami (Aug 22, 2020)

When I played Crysis at the time of release I was running Nvidia 9800 GX2s in SLI; at the highest settings they were capable of 28 FPS, and 59 FPS with an upward view, at 1920x1200. By turning off the AA features, at least, I reached somewhere around 34 FPS, and that was fluid enough to play. The cards cost me €1000 (2x GeForce 9800 GX2). The current mid-range offering is €500 to €800+. At the launch of the GTX 1080, a mid-range offering was sold as the best product, and priced accordingly. Nvidia will do this as long as there is no real competition. It turns out that breaking the bank is more expensive than expected.


----------



## ZoneDymo (Aug 22, 2020)

DuxCro said:


> Well competition is always better for us, end consumers. Look at what AMD did to CPU market. we were sentenced to Intels incremental IPC increases and insane prices for many years. Now the only advantage intel has over AMD Zen 2 is clock speeds. With ZEn 3 around the corner, idk why anyone would buy intel CPU for a new build. If AMD does manage to make something that will compete with Nvidia in all Price / performance segments, it will force both sides for competitive pricing. Hopefully AMD goes with very low pricing in order to gain market share. Just like they did with ZEN CPU's. But i think we can all agree that discrete graphics card prices don't have anything to do with common sense since RTX series came out.



I specifically said that AMD does not need to compete with the card shown here.
"We", the vast majority of people, want a new RX 480 type of thing: affordable and much faster. AMD needs to compete with the RTX 3050/3060/3070, hopefully in price brackets at or under $400 max; anything above is for those who want to spend too much, and if they do, then go ahead and buy Nvidia.


----------



## RedelZaVedno (Aug 22, 2020)

*GeForce GTX 960 = $199* -> fast forward 6 years -> *RTX 3060 = $399* (+101%)
*GeForce GTX 980 Ti = $649* -> fast forward 6 years -> *RTX 3090 = $1,399* (+116%)
*RX 470 = $179* -> fast forward 4 years -> *RX 5700 = $349* (+95%)
WTF just happened??? *Roughly a 100% price increase in 6 years*... It's not inflation (inflation 2014-2020 = 9.7%), it's *GREED Inc*.
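Those jumps are easy to sanity-check. A quick sketch (the RTX 3060 and 3090 prices are still only rumors, so the numbers are illustrative):

```python
# Percentage increase between two list prices (USD).
def pct_increase(old, new):
    return (new - old) / old * 100

print(round(pct_increase(199, 399)))   # GTX 960 -> rumored RTX 3060: 101
print(round(pct_increase(649, 1399)))  # GTX 980 Ti -> rumored RTX 3090: 116
print(round(pct_increase(179, 349)))   # RX 470 -> RX 5700: 95
```

Either way, every one of those steps is an order of magnitude above the ~9.7% inflation over the same period.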

F that, I'm out of DIY PC building until Leather Jacket One and his niece come to their senses which will probably never happen.


----------



## GeorgeMan (Aug 22, 2020)

RedelZaVedno said:


> *GeForce GTX 960 = $199* -> fast forward 6 year -> *RTX 3060 = $399*. (+100%)
> *GeForce GTX 980TI = $649*-> fast forward 6 year -> *RTX 3090 = $1,399* (+100%)
> *RX 470 = $179* ->  fast forward 4 years ->  *RX 5700 =* *$349* (+98%)
> WTF just happened???  *100 % price increase in 6 years*... It's not inflation (inflation 2014-2020 = 9,7%), it's *GREED Inc*.
> ...



Exactly. I got my 1080 Ti for 670€ in an EVGA deal. I'm not paying more than that for lower-end cards. I'll keep it as long as I can, and if things are still the same then, I'm out too. Vote with our wallets.


----------



## Kissamies (Aug 22, 2020)

Probably AIB models won't be as huge. Zotac for example managed to release a 1080 Ti mini.


----------



## iO (Aug 22, 2020)

$1400 for the FE, up to $1600 for high-end AIB designs, $1800+ for the halo "enthusiast" models.

And the blade geometry of the fan on the back looks wonky, that might be a pull fan...


----------



## Manoa (Aug 22, 2020)

You don't have to give up on PC systems, guys, you just have to stay away from the video cards (for a while). Just don't buy a 10900K for $600 and you are OK.
I've been running a 780 Ti for 6+ years now with no problems in any games.


----------



## trparky (Aug 22, 2020)

DuxCro said:


> Massive card with a massive price. I doubt AMD will be able to compete.


They'll be able to compete just fine. Not everyone's willing to pay such stupidly high prices to nGreedia.


----------



## jabbadap (Aug 22, 2020)

P4-630 said:


> Crytek unveils 8K screenshot of Crysis Remastered: 'Can it run Crysis (again)?'
> 
> 
> The original Crysis (2007) was known at the time as a game that demanded a lot from a video card. Crysis ran on a for-its-time hyperm...
> ...



You forgot this:


> For the first time a Crytek game will feature ray tracing on Xbox One X and PlayStation 4 Pro powered by CRYENGINE’s proprietary software based ray tracing solution. The PC version will additionally support NVIDIA® DLSS technology and hardware-based ray tracing using NVIDIA’s VKRay Vulkan extension, for NVIDIA® GeForce® RTX GPU.


----------



## Lokran88 (Aug 22, 2020)

I remember a news item about Nvidia not getting enough 7nm capacity at TSMC, and instead reserving 5nm and 7nm capacity for 2021.
What will that be used for? Isn't next year too close for another release, or will AMD/Intel release something on a smaller node in 2021, so Nvidia changes its release schedule and brings Hopper in 5nm?


----------



## ZoneDymo (Aug 22, 2020)

RedelZaVedno said:


> *GeForce GTX 960 = $199* -> fast forward 6 year -> *RTX 3060 = $399*. (+100%)
> *GeForce GTX 980TI = $649*-> fast forward 6 year -> *RTX 3090 = $1,399* (+100%)
> *RX 470 = $179* ->  fast forward 4 years ->  *RX 5700 =* *$349* (+98%)
> WTF just happened???  *100 % price increase in 6 years*... It's not inflation (inflation 2014-2020 = 9,7%), it's *GREED Inc*.
> ...



Just want to state that the RX 5700 is not really in the RX 470's bracket.
RX 480 - RX 580 - RX 590 (which is still sold today); arguably the best/closest AMD card to the RX 470 is the RX 5600. But your point still stands: prices have gone mad, which is why I and many friends haven't upgraded for a while now. It's just ridiculous.


----------



## reflex75 (Aug 22, 2020)

This 3090 volcano push/pull will melt the memory just above!!


----------



## Quicks (Aug 22, 2020)

People should just boycott Nvidia and AMD for a year and refuse to buy any video card that costs more than $500.
The only reason they get away with these insane prices is that people keep buying them!
Performance per dollar is going to suck hard on this...


----------



## stimpy88 (Aug 22, 2020)

VulkanBros said:


> BUT - can it play Crysis?


That meme is dead...  Let me elaborate...

But can it play Flight Sim 2020?


----------



## Hattu (Aug 22, 2020)

I wonder how they handle QC on these huge (and expensive) cards! Even my "cheap" cards have a bad history.

~2011 i bought ASUS GTX 560Ti. 252€. DOA.
~year ago Gigabyte RTX 2060. 399€. DOA.

In maybe 2028 it's time for a new GPU. xTXxx60, 800€. DOA again?


----------



## rtwjunkie (Aug 22, 2020)

Welp, gonna rethink my whole PC strategy moving forward if this is true. First was the $1,800 cost that fewer people than ever can afford, and now a behemoth that necessitates going back to a super-sized case. Nope. Nope.

Is NVIDIA operating in some kind of vacuum in which real life doesn’t exist?


----------



## FreedomEclipse (Aug 22, 2020)

Im more curious about the cooler than the actual performance of the card itself.

Nvidia are supposedly using some sort of vapour chamber tech

But if we check out this random render.






Then you look at pictures of the cooler itself...





Pay attention to the fins in the center of the cooler: although the render shows air being expelled by the fan through those fins, they are actually blocked off (I marked it in red; ignore the white line, it was supposed to be red but I messed up and can't be bothered to fix it).

While the cooler design looks more interesting, it also looks less efficient up to a point. At least with the 2080's old blower-style design, air was being pushed through and 'channelled' more efficiently, though blower coolers can only handle so much heat before the heatsink becomes saturated and the fan gets super loud.

I really can't get my head around the design, other than it possibly being 'disruptive' to the airflow inside the case and not cooling very well...

The fan furthest from the I/O pushes air DOWN through the heatsink, which then gets sucked up by the fan closest to the I/O and partially pushed out the back of the case; temperatures are exacerbated because hot air is being recycled. But if you reverse the fan furthest from the I/O (which would make more sense and be a better idea), that hot air then gets sucked up by your CPU cooler, though if you're using an AIO it might not affect you as much.

I don't know... I see the cooler and I just can't work out how it's supposed to cool the card efficiently.

and then there are pictures of coolers with a 'shroud' on and off...






Like wut.


----------



## yotano211 (Aug 22, 2020)

Chloe Price said:


> Probably AIB models won't be as huge. Zotac for example managed to release a 1080 Ti mini.
> 
> View attachment 166339


I had 5 of those in a mining machine, open air; they were not very good, they ran very hot. They were the same size as the 1080 Mini, and I had 9 of those. This was about 3 years ago now.



GeorgeMan said:


> Exactly. I got my 1080Ti for 670€ on an EVGA deal. I'm not paying more for lower end cards. I'll keep it as long as I can and if then things are still the same, I'm out too. Vote with our wallets.


It's why I buy most computer items used, but still with plenty of warranty left, including my current laptop. Its retail price was $1300.


----------



## Punkenjoy (Aug 22, 2020)

If it's 400W at full load, no matter what the cooling on the card looks like, you will need serious airflow inside your case. If you also have a 100+ watt CPU, you have 500W+ going into your room while playing.

No big deal if you live in a cold climate during winter, but that is a lot of heat to handle during summer. It's the equivalent of having a 500W heater running while you play.

It can be handled, but I hope that's the maximum they will go, and that mid/high-range cards will not go over 300W.
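The "500 W heater" comparison above checks out: nearly all the power a PC draws ends up as heat in the room. A rough sketch, using the rumored/assumed wattages from the post:

```python
# Back-of-the-envelope: total heat dumped into the room by a gaming PC.
gpu_watts = 400   # rumored full-load draw of the card
cpu_watts = 100   # assumed gaming CPU load
total_watts = gpu_watts + cpu_watts

# 1 watt = 3.412 BTU/h, the unit small heaters and AC units are rated in.
btu_per_hour = total_watts * 3.412
print(total_watts, round(btu_per_hour))  # 500 W, ~1706 BTU/h
```

That really is the output of a small space heater, so the summer concern is not hyperbole.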


----------



## yotano211 (Aug 22, 2020)

Why not just wait until it actually comes out instead of saying bad things about this graphics card? That picture could be a prototype that will never be released.


----------



## Mayclore (Aug 22, 2020)

Imagine trying to make this work in a Mini-ITX build.


----------



## Razbojnik (Aug 22, 2020)

Whoever thinks that AMD's prices will be much more affordable than Nvidia's must be living in wonderland. The amount of money they've invested over the years in which they haven't even competed must be insane by now, and the fact that their frankly late architecture still falls short in speed and performance doesn't look good for them at all.


----------



## Darksword (Aug 22, 2020)

yotano211 said:


> Why not just wait until it really comes out instead of saying something bad about this graphics card. That picture could have been some prototype that will never be released.



I think this is probably the case.  Just like the "leaked" PS5 design that was totally off from the final version.


----------



## Kissamies (Aug 22, 2020)

yotano211 said:


> I had 5 of those on a mining machine, open air, they where not very good, they ran very hot. They where the same size of the 1080mini, I had 9 of those. This was about 3 years ago now.


Doesn't digging toy money heat GPUs a lot more than gaming does?


----------



## Steevo (Aug 22, 2020)

They will sell 10s of these at that price!!!


----------



## Kissamies (Aug 22, 2020)

Steevo said:


> They will sell 10s of these at that price!!!


The price in Finland will be at least 1.5 kiloeuros. Usually you can translate the US MSRP straight into euros and add the "Finland extra", which is usually 100-200 EUR, though our prices include VAT (24%).

And people will probably buy these like the 600 EUR enthusiast cards back in the day. When the OG Titan was released, everyone was like "WTF, 1000 EUR for a GPU? Hell no!", and now they buy 2080 Tis like those 600 EUR cards of a decade ago.
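For what it's worth, that rule of thumb can be sketched out. The 24% VAT and the 100-200 EUR "Finland extra" come from the post above; the 1:1 USD-to-EUR conversion and the example MSRP are illustrative assumptions:

```python
# Hypothetical estimate of a Finnish retail price from a US MSRP.
# US MSRPs are quoted pre-tax; Finnish prices include 24% VAT plus a
# local markup of roughly 100-200 EUR.
def finnish_price(us_msrp_usd, extra_eur=150, vat=0.24):
    # Treat USD ~ EUR 1:1, per the rule of thumb.
    return us_msrp_usd * (1 + vat) + extra_eur

print(round(finnish_price(1400)))  # ~1886 EUR, i.e. "at least 1.5 kiloeuros"
```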


----------



## semantics (Aug 22, 2020)

kings said:


> Some people are exaggerating a lot, the dimensions don't seem to be very different from most 2080Ti AIB versions.







Yeah it's the same as the oversized non reference cards everyone buys


----------



## Kissamies (Aug 22, 2020)

The 3 slot reference one isn't for those mITX users at least..


----------



## Xaled (Aug 22, 2020)

Please, TPU, I kindly request that you
not be a part of Nvidia's dishonest, fraudulent pricing strategy!
According to Nvidia, the 2080 Ti's price is:
$999 for the non-FE edition (which never existed at all)
and "just" $1199 for the FE edition!
These prices were simply never true. The cheapest new 2080 Ti can now be found only for around $1500.
Nvidia will again announce a lower nominal price ($1400) and sell at a much higher one ($2000). Please take this into consideration when you review, and don't use Nvidia's fake numbers/prices in a conclusion, a decision, or even in specifications, because those prices were never true.


----------



## Dracius (Aug 22, 2020)

Wow that's freaking huge.. Surely it can't be that big? 

(that's what she said)


----------



## Jawz (Aug 22, 2020)

In the first photo near the PCIE connector does that read "Config 1"?


----------



## MxPhenom 216 (Aug 22, 2020)

I don't care one bit about the 3090. I want to see the 3080!


----------



## Chomiq (Aug 22, 2020)

MxPhenom 216 said:


> I don't care one bit about the 3090. I want to see the 3080!


Yeah, that actually might be part of their strategy. Drop the 3090 at $1400 just so people accept $1000 for the 3080.


----------



## swirl09 (Aug 22, 2020)

RedelZaVedno said:


> *GeForce GTX 980TI = $649*-> fast forward 6 year -> *RTX 3090 = $1,399* (+100%)


It is a lot to bear. The 1080 Ti -> 2080 Ti was a 40-something-% increase, and that was already a difficult pill to swallow. _Any_ increase at this point is going to be hard to justify, IMO.

When I see it phrased as "the alleged $1400 price isn't so bad compared to the $2000 price tag rumored previously", I don't think so! It would still be a price jump! Anchoring on a false $2K figure is a foolish way to look at it.


----------



## chodaboy19 (Aug 22, 2020)

I'm waiting for the real deal to make an appearance on September 1st!


----------



## mechtech (Aug 22, 2020)

Almost looks like photoshopped clickbait.


----------



## Legacy-ZA (Aug 22, 2020)

The prices have gotten out of hand, seriously, seriously out of hand.


----------



## Vayra86 (Aug 22, 2020)

Chomiq said:


> Only $1400? I'll take two. Who needs expansion slots anyway?



Maybe this is the very first GPU that sags the motherboard instead of itself.

Buy three and you can build yourself a house. Man, this is easy. The AMD team will have a good day.

Regardless, I'm not even remotely touching this, even at a discount. Triple slot screams 'lacking efficiency', and if RT means we have to go back to the early 2000s in terms of GPU quality of life, they can stick it right up their behind. It's arguably worse than Turing if that is the case.

It's not worth it, tbh. Come back when you've refined this enough that it fits in the usual power envelope. I'm not your 2000-dollar-paying beta tester. GTFO.

But... I'm still in wait-and-see mode. So far nothing is official.



mechtech said:


> Almost looks like photoshopped clickbait.



Yep. Wouldn't surprise me one bit. The whole design is 4chan level ridiculous



Quicks said:


> People must just boycott Nvidia and AMD, for a year and refuse to buy any video cards that is more than 500$
> Only reason they get away with these insane prices is because people keep on buying them!
> Performance per dollar is going to suck hard on this...



End result, you get a cheaper GPU that is 5% faster than what you had. Across the whole stack.

Well played?  At least we've got Intel, right?


----------



## AddSub (Aug 22, 2020)

Is SLI a thing of the past? No SLI support for these monsters?

...
..
.


----------



## chstamos (Aug 22, 2020)

ZoneDymo said:


> "we" the vast majority of the people want a new RX480 type of thing, affordable and much faster, AMD needs to compete with the RTX3050/3060/3070 and hopefully in the pricebrackets under or at 400 dollars max, anything above is for those who want to spend too much and if they do, then go ahead and buy Nvidia.



Well said. The enthusiasm over Intel's entry into the discrete GPU market alone shows that most people are interested in mid to mid-high range performance and prices, as nobody expected Intel to offer competitive high-end products from the get-go (unfortunately, Intel seems to be failing at the moment even at entry-level performance, but we can always hope against hope that they will catch up... eventually).

If AMD offers 2060 Super performance in the ~$200 price range (wasn't the RX 480 pretty much that at introduction? GTX 970-level perf at 230 bucks?), people will see them like manna from heaven, and nVidia can keep pretending to be god's gift to gamers with their big, unwieldy cards that cost as much as functional second-hand cars.


----------



## Dante Uchiha (Aug 22, 2020)

Chloe Price said:


> Doesn't digging toy money heat GPUs lot more than gaming?



Nope, usually the GPUs run undervolted.

----

Interesting how many leaks appear on the "green side" while almost nothing is known about the RX 6080/6090... A sign of delay, or a whim?

I think the 3090 is still a cut-down chip, so when AMD releases something better, Nvidia simply pulls a 3090 Ti/3090S out of its pocket.


----------



## Fluffmeister (Aug 22, 2020)

Dante Uchiha said:


> Interesting how many leaks appear on the "green side" while almost nothing is known about the RX 6080/6090... A sign of delay, or a whim?
> 
> I think the 3090 is still a cut-down chip, so when AMD releases something better, Nvidia can simply pull a 3090 Ti/3090 Super out of its pocket.



I was thinking that. Hopefully Big Balls Navi is competitive, but at this rate, two-year-old RTX 2080 Tis are going EOL without a response from team red.


----------



## TheoneandonlyMrK (Aug 22, 2020)

Simply zoom in on the right-hand picture and it looks like some writing used to be there and was digitally removed.


----------



## PowerPC (Aug 22, 2020)

The speculation here about this card is insane. Just let it come out first, sheesh. Then we'll know if the size is warranted, or if the price is even close to warranted. I'm not gonna pay this much for a GPU anyway, but I guess I'm not the target demographic. I'm just saying that if it's cool enough, performant enough, and crazy enough, I think the price can theoretically even be warranted. But just leave the speculation be until it actually arrives.


----------



## ironwolf (Aug 22, 2020)

Ouch.  Gonna have to sell a LOT of plasma or figure out how much my soul is worth...


----------



## Dristun (Aug 22, 2020)

Fluffmeister said:


> I was thinking that. Hopefully Big Balls Navi is competitive, but at this rate, two-year-old RTX 2080 Tis are going EOL without a response from team red.


The 1080 Ti has been EOL for a while now without ever being comprehensively beaten by team red, lol. But I think they'll manage at least to match the 2080 Ti in some games; after all, the 5700 XT was already quite close to the standard 2080 in Forza and some other titles.


----------



## yotano211 (Aug 22, 2020)

Chloe Price said:


> Doesn't digging toy money heat GPUs a lot more than gaming?


I don't know; my mining rigs were all open-air, with a huge box fan per two mining machines blowing air over them.



ironwolf said:


> Ouch.  Gonna have to sell a LOT of plasma or figure out how much my soul is worth...


Where I live you can make about $600/month for plasma.



Xaled said:


> Please, TPU, I kindly request that you
> not be a part of Nvidia's dishonest, fraudulent pricing strategy!
> According to Nvidia, the 2080 Ti's price is like this:
> $999 for the non-FE edition (which never existed at all)
> ...


You could find 2080 Tis over on Reddit's hardwareswap for $900 each with full warranty. They are used, though.


----------



## matar (Aug 22, 2020)

I don't think this is accurate. It may be slightly bigger, but not like what the pic shows, because then only full-tower cases could house this card and they would lose a lot of money.


----------



## kings (Aug 22, 2020)

chstamos said:


> If AMD offers 2060 Super performance in the ~$200 price range (wasn't the RX 480 pretty much that at introduction? GTX 970-level performance at 230 bucks?), people will see them like manna from heaven, and Nvidia can keep pretending to be god's gift to gamers with their big, unwieldy cards that cost as much as functional second-hand cars.



What does this have to do with halo cards like the RTX 3090?

If AMD offers that, Nvidia will have something similar, both in performance and price. Look at the RX 5600 XT vs. the RTX 2060, or the RX 5500 XT vs. the GTX 1650 Super, for example.

So yeah, Nvidia can launch an RTX 3090 at an insane price, thinking they are the kings of the castle or the owners of the world, and still sell a ton of mid-range cards in the process. One thing does not invalidate the other.


----------



## MrAMD (Aug 22, 2020)

Feels like most people haven't seen an FTW3 or similar card before. This is probably smaller than those, lol. Of course it's jarring to see a reference design this big, though. The 3090 is gonna be a monsta.


----------



## Nkd (Aug 22, 2020)

DuxCro said:


> Massive card with a massive price. I doubt AMD will be able to compete. Maybe with RDNA 3 which will bring the chiplet GPU design and hopefully a massive boost in performance.



It's because the chip is built on 8 nm, has a shitload of memory chips, and is gigantic. So bigger doesn't mean AMD can't compete. AMD's chip will be smaller, and you're looking at a similar number of cores on Big Navi. Yes, history is not in their favor, but just because the cooler is gigantic doesn't mean AMD can't compete, lol.


----------



## Chrispy_ (Aug 22, 2020)

iO said:


> And the blade geometry of the fan on the back looks wonky, that might be a pull fan...


Hard to be sure. If it's a pull fan, the rotation direction is anticlockwise and the leading/trailing edge sweep is very wrong.
My guess is that it's a clockwise-spinning, high-static-pressure fan, forcing air through the extremely dense, very restrictive heatsink.


----------



## kiriakost (Aug 22, 2020)

NVIDIA should hire ASUS as its cooler designer. 
At that price I would also expect a 12-year warranty.


----------



## TheoneandonlyMrK (Aug 22, 2020)

kiriakost said:


> NVIDIA should hire ASUS as its cooler designer.
> At that price I would also expect a 12-year warranty.


Good luck finding a use for this in 12 years!

You'd be lucky to get five; things tend to double in performance, making these cards pretty wall mounts.


----------



## kiriakost (Aug 22, 2020)

theoneandonlymrk said:


> Good luck finding a use for this in 12 years!
> 
> You'd be lucky to get five; things tend to double in performance, making these cards pretty wall mounts.



I'll deploy my sense of humor here: if this card doesn't play Unreal Tournament v1.0, then I'll never get it.


----------



## Kissamies (Aug 22, 2020)

kiriakost said:


> NVIDIA should hire ASUS as its cooler designer.
> At that price I would also expect a 12-year warranty.


Why Asus? It's not the best when it comes to coolers. MSI has been hella fine for several years, while Asus charges an insane brand premium for decent cards.


----------



## chstamos (Aug 22, 2020)

kings said:


> What does this have to do with halo cards like the RTX 3090?
> 
> If AMD offers that, Nvidia will have something similar, both in performance and price. Look at the RX 5600 XT vs. the RTX 2060, or the RX 5500 XT vs. the GTX 1650 Super, for example.
> 
> So yeah, Nvidia can launch an RTX 3090 at an insane price, thinking they are the kings of the castle or the owners of the world, and still sell a ton of mid-range cards in the process. One thing does not invalidate the other.



Maybe my memory deceives me, but doesn't Nvidia have a history of raising mid-range card prices when it releases a halo card at an insane premium?


----------



## kings (Aug 22, 2020)

chstamos said:


> Maybe my memory deceives me, but doesn't Nvidia have a history of raising mid-range card prices when it releases a halo card at an insane premium?



Well, it depends on the cards you compare. For example, the most common first mid-range Turing cards (1660/1660 Ti) were not bad values compared to the 1060 3GB and 1060 6GB, respectively.

The 1660 cost 10% more than the 1060 3GB but offered 35% more performance at 1080p, in addition to increasing the VRAM to 6GB.

The 1660 Ti cost 12% more (if we ignore the 1060's higher Founders Edition price) but had 36% more performance at 1080p than the 1060 6GB.
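For what it's worth, those deltas can be turned into a rough perf-per-dollar comparison. This is just a sketch using the launch-price premiums and 1080p performance figures quoted above; the helper name is purely illustrative:

```python
# Rough perf-per-dollar check using the figures quoted above:
# launch-price premiums and 1080p performance deltas.
def value_gain(price_ratio: float, perf_ratio: float) -> float:
    """Relative perf-per-dollar of the newer card vs. the older one."""
    return perf_ratio / price_ratio - 1

# GTX 1660 vs. 1060 3GB: +10% price, +35% performance
print(f"1660 vs 1060 3GB:    {value_gain(1.10, 1.35):+.0%}")  # +23% perf/$
# GTX 1660 Ti vs. 1060 6GB: +12% price, +36% performance
print(f"1660 Ti vs 1060 6GB: {value_gain(1.12, 1.36):+.0%}")  # +21% perf/$
```

So by the post's own numbers, both cards improved perf-per-dollar by roughly a fifth over their predecessors.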


----------



## Raendor (Aug 22, 2020)

Mayclore said:


> Imagine trying to make this work in a Mini-ITX build.



It will fit just fine in my Sliger Conswole, and perhaps in an Ncase M1 too.

But it doesn't matter, as I will only build a mid-range system for a reasonable amount of money.


----------



## Mitsman (Aug 22, 2020)

Haven't Nvidia's x90 cards historically been dual-GPU cards? That could explain the high price and the card's size. I can't see any other reason why they would bring back an x90-series card unless it's dual-GPU.


----------



## robal (Aug 22, 2020)

So, as the cooler becomes a larger and larger chunk of the card's cost, we should expect water-cooled versions to become cheaper (i.e., not way more expensive than the FE). Right? Right...


----------



## kings (Aug 22, 2020)

Mitsman said:


> Haven't Nvidia's x90 cards historically been dual-GPU cards? That could explain the high price and the card's size. I can't see any other reason why they would bring back an x90-series card unless it's dual-GPU.



Very unlikely, but it would be a good plot twist. 

Both AMD and Nvidia have been increasingly distancing themselves from dual-GPU setups, including SLI/Crossfire.

It's a solution with many disadvantages. The future will be MCM designs, and that is what both companies are working on.


----------



## Dave65 (Aug 22, 2020)

DuxCro said:


> Massive card with a massive price. I doubt AMD will be able to compete. Maybe with RDNA 3 which will bring the chiplet GPU design and hopefully a massive boost in performance.



AMD doesn't need to compete for the three people who can afford that..


----------



## thesmokingman (Aug 22, 2020)

That is obnoxiously large, ridonkulous!


----------



## lexluthermiester (Aug 22, 2020)

ZoneDymo said:


> Honestly though, if done neatly, that could be a pretty cool case mod: literally cutting into the side panel to have the video card stick out, basically one of those muscle cars, but for a PC.


As cool as I could likely make it, the base hardware is already slightly bottlenecking the RTX 2080 I have in it now. While there would still be a boost in GPU performance, the CPU bottleneck would become very pronounced. I think the RTX 2080 is the best GPU one can pair with an unlocked socket-1366 Xeon and still expect reasonably balanced performance.



kings said:


> Some people are exaggerating a lot, the dimensions don't seem to be very different from most 2080Ti AIB versions.


While you make a good point, it is big enough that fitting it into some systems will be a challenge, and impossible for others.


----------



## valrond (Aug 23, 2020)

kings said:


> Very unlikely, but it would be a good plot twist.
> 
> Both AMD and Nvidia have been increasingly distancing themselves from dual-GPU setups, including SLI/Crossfire.
> 
> It's a solution with many disadvantages. The future will be MCM designs, and that is what both companies are working on.



Yes, it has SOME disadvantages, but also a lot of advantages. The reason to kill SLI has a lot to do with the overinflated price of the 2080 Ti. If SLI still worked in most games like it did in the past, would you rather get a 2080 Ti, or a second 1080 Ti that would be a lot cheaper and much more powerful?


----------



## Batailleuse (Aug 23, 2020)

RedelZaVedno said:


> *GeForce GTX 960 = $199* -> fast forward 6 years -> *RTX 3060 = $399* (+100%)
> *GeForce GTX 980 Ti = $649* -> fast forward 6 years -> *RTX 3090 = $1,399* (+100%)
> *RX 470 = $179* -> fast forward 4 years -> *RX 5700 = $349* (+98%)
> WTF just happened??? *100% price increase in 6 years*... It's not inflation (inflation 2014-2020 = 9.7%), it's *GREED Inc*.
> ...



Well, they inserted more tiers compared to before:

960/970/980
1650/1660/2060/2070/2080/2080 Ti/Titan

That's literally over double the number of reference cards per generation (not even counting the Super and Ti refreshes when they happen).

Your old 960 is more akin to a 1650 Ti or 1660 ($200-300 bracket).

Same goes for AMD:

5500 XT/5600 XT/5700 XT, and they didn't even drop a 5800 XT.

But basically your RX 470 would match the 5600 XT, not the 5700 XT, since there was an RX 480.

Pick the tier that matches relative to the rest of the current lineup, not simply by naming convention.



GeorgeMan said:


> Exactly. I got my 1080Ti for 670€ on an EVGA deal. I'm not paying more for lower end cards. I'll keep it as long as I can and if then things are still the same, I'm out too. Vote with our wallets.



Yeah, but the 1080 Ti was an upgrade over the 1080.

What they dropped on launch day was an unmatched 2080 Ti that upgraded nothing; the 1080 Ti is more akin to the 2080 Super, which is actually around the same price. 

The 2080 Ti/Titan are purely for bragging. Not that RTX has had any use this generation; it's been a total gimmick so far. I have a 2080 Ti, and the only game that had it at launch was Control. Too many games said they would support RTX, and maybe they added it eventually, but I'm not waiting six months after a game's release just to see RT; it's either there at launch or not at all for me. 

Hopefully that changes with the next-gen consoles, but we'll see.


----------



## kings (Aug 23, 2020)

valrond said:


> Yes, it has SOME disavantages, but also a lot of advantages. The reason to kill SLI has a lot to do with the overinflated prices of the 2080Ti. If SLI were to work in most games like in the past, would you rather get a 2080Ti, or get a second 1080Ti that would be a lot cheaper and much more powerful?.



SLI/Crossfire, even in its brightest days, always had a lot of problems with support; it was very dependent on per-game driver work and the goodwill of game developers.

Then there was the problem of poor scalability. Adding a second GPU could, at best, bring 40% or 50% more performance, but mostly it was well below that. That is, you paid for a GPU to get half or a third of its performance, depending on the game. And it got worse the more GPUs you added.

There could be rare cases where it paid off, but as a general rule, after a few years it was better to sell the card and buy one from the new generation, avoiding a lot of hassle. Not to mention the heat, noise, and power consumption that SLI/Crossfire usually caused. Often, the GPU on top was constantly throttling from having no room to breathe, further decreasing the performance gain.

I don't think it has anything to do with the price of the RTX 2080 Ti; Nvidia started down this path long before that. The GTX 1060 in 2016, for example, already had no support for it. AMD did not embark on cards over $1000 and also abandoned dual-GPU.


----------



## Totally (Aug 23, 2020)

medi01 said:


> Have they ever voiced power consumption figures, by the way?
> 
> 
> How dare they.
> ...



Oops, typo, fixed. Besides that strawman? When hasn't that been the case in that price segment? That they come within 10-25% of an Nvidia card and price it proportionally, with worse power consumption might I add? Also, when Nvidia added a tier at the bottom to hide their price hike, what did AMD do? Matched Nvidia's new pricing.


----------



## lexluthermiester (Aug 23, 2020)

Batailleuse said:


> Your previous 960 is more akin to either 1650ti or 1660 (200-300$ bracket)


Performance wise? Not even close..




GeForce GTX 960 vs GeForce GTX 1650 Ti vs GeForce GTX 1660 by PassMark Software (www.videocardbenchmark.net)


----------



## mouacyk (Aug 23, 2020)

Mitsman said:


> Haven't nvidia x90 cards historically been dual gpu cards? Could explain the high price and the cards size. I can't see any other reason why they would bring back a xx90 series card unless its a dual gpu


It might be so obvious we're not seeing it. Hardware is going MCM. DX12 was meant to support multi-GPU effortlessly, same as Vulkan. So the APIs already have it; games just need to implement it. Maybe NVidia is ready to push, for the glory of 4K and RT?

75% scaling in the best case with the heaviest draw calls: 1070 + 980 Ti in Explicit Multi-GPU mode.

NVidia silently added checkerboard rendering for multiple GPUs into its drivers 8 months ago.


----------



## rtwjunkie (Aug 23, 2020)

lexluthermiester said:


> Performance wise? Not even close..
> 
> 
> 
> ...


No, I think he means where they fit on their respective tiers, which is correct. Not a performance comparison.


----------



## Space Lynx (Aug 23, 2020)

mouacyk said:


> May be NVidia is ready to push, for the glory of 4K and RT?



for the glory!!!!



theoneandonlymrk said:


> Good luck finding a use for this in 12 years!?.
> 
> Be lucky to get five, things tend to double in performance  making these cards pretty wall mounts



Normally I would agree with you, but I think those days are gone now. Moving forward after this bump, it'll be 5% max gains on the fps end, plus improved DLSS, RT, and whatever other new gimmicks they come up with to keep us spending money and not worrying about the 5%.


----------



## Minus Infinity (Aug 23, 2020)

Stupid price, stupid size, stupid power consumption. A pathetically desperate effort from Nvidia to cling to the performance crown no matter what. Again, so much time devoted to an ultra-niche product only 0.01% of gamers will actually buy, despite all the keyboard warriors claiming otherwise.


----------



## semantics (Aug 23, 2020)

Minus Infinity said:


> Stupid price, stupid size, stupid power consumption. A pathetically desperate effort from Nvidia to cling to the performance crown no matter what. Again, so much time devoted to an ultra-niche product only 0.01% of gamers will actually buy, despite all the keyboard warriors claiming otherwise.


Halo branding works. Even if it loses them money, a halo product raises the perceived worth of the products down the line by raising the perceived worth of the brand itself.

It's unclear whether what appears to be a reaching product is an attempt to justify RTX/4K and get people actually buying new top-end GPUs (a 1080 Ti still gets you there at 1440p and 4K, so upgrading has been a wash with RTX), or whether the reach comes from anticipating actual competition from AMD, even though Nvidia hasn't had competition for the performance crown since the 1080 Ti was released.


----------



## Agentbb007 (Aug 23, 2020)

They probably leaked the $2000 price so that when $1400 was leaked we'd be happy. Still, these prices are getting a bit nuts for a part to play games. Console gaming is starting to look a lot more attractive to me.


----------



## Candor (Aug 23, 2020)




----------



## silkstone (Aug 23, 2020)

I wonder if there are any ITX cases that are big enough to fit this beast.


----------



## JalleR (Aug 23, 2020)

lexluthermiester said:


> That beast will not be going into my Dell T3500. It literally won't fit physically. I'm sure it would work, though. Might be time to build a new system... Been dragging my feet for almost a year anyway... I've settled on Threadripper, just trying to decide which one..



Well, as a temp setup while building hard tubing in my primary PC, I put my Asus ROG Strix OC 1080 Ti in a T3500. I had to tweak the card retention mechanism a little, but it worked like a charm.


----------



## medi01 (Aug 23, 2020)

mouacyk said:


> DX12 was meant to support multi-GPU effortlessly, same as Vulkan.


Effortlessly for the card manufacturer. (no need to support it in driver)



Totally said:


> When hasn't that been the case in that price segment?


It is harder to find examples that match your weird take than examples that don't.
The 5700 XT is about 10% slower but 20%+ cheaper than the 2070 Super, and simply faster than the 2070, which it matches price-wise.
There were times when AMD GPUs 10-20% slower were selling for half the NV price.



mouacyk said:


> May be NVidia is ready to push, for the glory of 4K and RT?


What the heck, dude, seriously?

There is a baseline of "resolution and framerate acceptable to users", which determines how much complexity can be dedicated to graphics fidelity.
Now *next-gen consoles are officially targeting 4K* (with 2080+ kinds of GPUs).
FPS-wise, many studios will settle for 30.
So achieving 4K 30fps on PC will be doable with GPUs somewhat faster than those mentioned above.
There will be no need to "overpower" the baseline 4 times to go 4K.

As for RT, it's very close to a mere marketing push by NV at this point.
This was done without any hardware RT whatsoever and, wait for it, *people had to ask whether RT was used or not*.
Speaks volumes about the tech itself.


----------



## mouacyk (Aug 23, 2020)

medi01 said:


> Effortlessly for the card manufacturer. (no need to support it in driver)


They can make it easier to use, with GameWorks or some similar uniform library.



medi01 said:


> There is a baseline of "resolution and framerate acceptable to the users", which determines how much complexity could be dedicated to graphics fidelity.
> Now *next gen consoles are officially targeting 4*k (with 2080+ kind of GPUs).
> FPS wise, many studios will settle with 30.
> So achieving 4k 30fps on PC will be doable with GPUs somewhat faster than those mentioned above.
> There will be no need to "overpower" baseline 4 times to go 4k.


Most people who care about the high end cards are looking to go past 60fps at 4K (because we like our 120-144Hz monitors on PC, mind you).  And NVidia likely cares too from their side, because they may really want to deliver on BFGD someday.  You put all the pieces together, from API readiness to driver optimizations to tech demos and the new MCM hardware roadmap, there is no doubt that explicit multi-GPU will be the answer to go beyond 60fps at 4K.  (It's that classic throw money at the problem solution, because additional hardware does scale -- it's just a matter of software support, as demo'ed by Ashes.)


----------



## medi01 (Aug 23, 2020)

mouacyk said:


> They can make it easier to use, such as GameWorks or some such library that is uniform.


Possibly, although I doubt it. Besides, with only a tiny part of the market using it, it's a wasted effort.




mouacyk said:


> there is no doubt that explicit multi-GPU will be the answer to go beyond 60fps at 4K


Definitely not.
As with 4K, you could get there by lowering the complexity of the rendered objects.
If devs, for some far-from-obvious reason (and "I have a device that could show more fps" is one hell of a funny argument), decide to target 60fps, well, you'll have it.

The most recent time someone tried to highlight 60fps was that awkward Halo Infinite demo that spawned countless memes:





Now, most games are cross-platform.
95% of the Steam survey PC market is slower than the PS5/XSX.

Getting a card that is roughly faster than a 2080 Super would do it.
Now let's do the napkin math: 2080 Super * 1.11 = 2080 Ti; 2080 Super * 2 = GPUThatShouldBring4k60fps.

GPUThatShouldBring4k60fps = a card that is 80% faster than the 2080 Ti.

Just stresses how nonsensical "PC Master Race" becomes. You won't be able to vastly overpower the consoles, and unless you have that weird penis-size-to-fps neural entanglement, why would you want to?
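The napkin math above can be sanity-checked in a couple of lines. The performance multipliers are the poster's assumptions, not measured data:

```python
# Sanity check of the napkin math above; both multipliers are assumptions.
baseline = 1.00                 # 2080 Super, roughly console-class
rtx_2080_ti = baseline * 1.11   # assumed ~11% faster than a 2080 Super
target_4k60 = baseline * 2.00   # assumed 2x a 2080 Super for 4K 60fps

# How much faster than a 2080 Ti would that hypothetical card need to be?
gap = target_4k60 / rtx_2080_ti - 1
print(f"{gap:.0%} faster than a 2080 Ti")  # ~80%
```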


----------



## dicktracy (Aug 23, 2020)

So the reference cooler is finally on par with high-end AIB coolers. Unlike the angry mob, I commend Nvidia for not cheaping out on reference cards. Thank you!


----------



## dj-electric (Aug 23, 2020)

dicktracy said:


> So the reference cooler is finally on par with high-end AIB coolers. Unlike the angry mob, I commend Nvidia for not cheaping out on reference cards. Thank you!



There are no reference cards; that's the joke with NVIDIA "reference" cards. It will now be an FE commanding a higher-than-MSRP price, without an actually cheap version at that MSRP, just like the high-end RTX 20 cards.


----------



## Blueberries (Aug 23, 2020)

I've been craving another power monster since the 295x2.


----------



## valrond (Aug 23, 2020)

kings said:


> SLI/Crossfire, even in its brightest days, always had a lot of problems with support; it was very dependent on per-game driver work and the goodwill of game developers.
> 
> Then there was the problem of poor scalability. Adding a second GPU could, at best, bring 40% or 50% more performance, but mostly it was well below that. That is, you paid for a GPU to get half or a third of its performance, depending on the game. And it got worse the more GPUs you added.
> 
> ...


Dude, you really have NO IDEA about SLI, do you? Where are you getting those numbers? 40-50%? That was the case when SLI first came back with PCIe. By the time we got to the GTX 780 and R9 290X, the improvements were in the 90-100% range. Not only that, what you say about temperatures and noise is totally wrong. Why? Because you rarely had to push both cards to 100%, unlike with a single card, where you have to squeeze out all its power. Yep, they consume more power, but that's about it.
I have used SLI and CFX quite a few times: a CFX of 4870s, then GTX 480s, and then R9 290Xs. Heck, my MSI GT80 still has a GTX 980M SLI. 
Of course, in games that didn't properly implement SLI or CFX there was little to no gain, and that is what happened. If the card makers aren't pushing dual (or even triple or quadruple) configurations any more, game companies will not support them.


----------



## watzupken (Aug 23, 2020)

Given the size of the cooler, I wonder how hot the RTX 3xxx series will run. Despite the use of a newer fab (unknown at this point, but surely better than 12 nm), the power requirement is still shooting through the roof. I feel Nvidia went ultra-aggressive to cram in as much as they can. I suspect most of the die space will be taken up by RT, Tensor, and whatever bespoke cores they add.

Ultimately, the card with the best value for performance will still be the hottest seller. Cards like the xx80 Ti are not meant for most people, whether due to their requirements or their budget.



silkstone said:


> I wonder if there are any ITX cases that are big enough to fit this beast.


A big enough ITX case? Chances are slim. One able to handle the heat? I think the chances are very slim.


----------



## Xaled (Aug 23, 2020)

kings said:


> Well, it depends on the cards you compare. For example, the most common first mid-range Turing cards (1660/1660 Ti) were not bad values compared to the 1060 3GB and 1060 6GB, respectively.
> 
> The 1660 cost 10% more than the 1060 3GB but offered 35% more performance at 1080p, in addition to increasing the VRAM to 6GB.
> 
> The 1660 Ti cost 12% more (if we ignore the 1060's higher Founders Edition price) but had 36% more performance at 1080p than the 1060 6GB.


Are you focking serious? The 1060 is THREE years older than the 16xx cards. In the old days you could have gotten 100% more performance for the same price over such a period.


----------



## saki630 (Aug 23, 2020)

I guess I'm keeping my 1080 Ti or upgrading to a 2080 Ti, cuz nothing else will fit in this N1Case.


----------



## Kissamies (Aug 23, 2020)

Blueberries said:


> I've been craving another power monster since the 295x2.


You forgot the Radeon Pro Duo


----------



## Fluffmeister (Aug 23, 2020)

Blueberries said:


> I've been craving another power monster since the 295x2.



Ah yes, the $1500 face-melting water-cooled beast! The pursuit of performance was fun back then; now people just like to get upset about things they don't intend to buy anyway.


----------



## silkstone (Aug 23, 2020)

watzupken said:


> Given the size of the cooler, I wonder how hot will the RTX 3xxx series run. Despite the use of a newer fab (unknown at this point but surely better than 12nm), the power requirement is still shooting through the roof. I feel Nvidia went ultra aggressive to cram in as much as they can. I suspect most of the die space to be taken up by RT, Tensor and whatever bespoke cores they are going to add in there.
> 
> Ultimately, the one with the best value to performance will still be the hottest selling card. Cards like the XX80Ti are not meant for most people, whether it is due to their requirements or budget.
> 
> ...



I think size would be the biggest limiting factor. There are plenty of ITX cases that let the GPU draw fresh air through the side panel, so heat isn't an issue. 
I've seen small builds using Threadrippers, and those things can pump out some real heat.


----------



## Kissamies (Aug 23, 2020)

Fluffmeister said:


> Ah yes, the $1500 face-melting water-cooled beast! The pursuit of performance was fun back then; now people just like to get upset about things they don't intend to buy anyway.


Though I have to give credit: AMD managed to cool it with just a 120mm AIO. 

That truly was a beast. I had an R9 290 CF a year ago, and when CF worked, it scaled pretty nicely. But I kind of understand why SLI/CF is yesterday's thing.


----------



## Valantar (Aug 23, 2020)

FreedomEclipse said:


> and then there are pictures of coolers with a 'shroud' on and off...
> 
> 
> 
> ...


That is the rear of the card; what you're looking at is the backplate. Look at the position of the PCIe slot and I/O bracket. The PCB is obviously behind there, so it's not like it's blocking any airflow. I entirely agree with the rest of the post, though.


Xaled said:


> Are you focking serious? The 1060 is THREE years older than the 16xx cards. In the old days you could have gotten 100% more performance for the same price over such a period.


While you're not entirely wrong, that is also the ever-increasing reality of chipmaking: as time passes, the generational gains shrink. As production nodes near various physical limits, they become more expensive and difficult to make, making bigger chips expensive and low-yielding. As the number of GPU cores increases, memory bandwidth is ever more of a bottleneck, as is the rest of the system (hence why the new consoles are NVMe-only and have dedicated decompression hardware to feed their GPUs). So while in the past a three-year wait would allow for, say, a 100% increase in CUs/CUDA cores, a small bump in clock speeds, and the same power draw at the same price, that isn't happening any longer. Mind you, that doesn't justify current GPU prices by any stretch of the imagination, but it does explain why generational gains are shrinking. It's a sign of maturing technologies. Hopefully increased competition this generation will drop prices a bit and keep them there, though, as the current midrange and upper-midrange cards are priced where high-end and flagship cards used to be...


----------



## Kissamies (Aug 23, 2020)

Xaled said:


> Are you focking serious? The 1060 is THREE years older than the 16xx cards. In the old days you could have gotten 100% more performance for the same price over such a period.


Probably even more. Think about the 6600 GT (2004) -> 8600 GTS (2007), for example. In the high-end range the boost was even bigger than in those mid-range cards.


----------



## Frick (Aug 23, 2020)

Fluffmeister said:


> Ah yes, the $1500 face-melting water-cooled beast! The pursuit of performance was fun back then; now people just like to get upset about things they don't intend to buy anyway.



Those cards were special, aberrations. This card is a bog-standard card.


----------



## TheoneandonlyMrK (Aug 23, 2020)

lynx29 said:


> for the glory!!!!
> 
> 
> 
> normally I would agree with you, but I think those days are gone now, moving forward after this bump will be 5% max gains on fps ends, but improving DLSS and RT and any other new gimmicks they come up with to keep us spending money and not worrying about the 5%.


Ray tracing is new enough that we're about to see generation 1 from one team and gen 2 from Nvidia; it's still very new, and generational leaps should be quite good for five years. But 12 years of use from a 3090? You two are having a laugh; there's no chance of it remaining useful.

No GPU would remain useful that long. That's not a slur against Nvidia; the slur would be to imply they can't improve adequately in that time.


----------



## dont whant to set it"' (Aug 23, 2020)

The GTX 480 might have a successor with respect to power drawn and heat output.


----------



## EarthDog (Aug 23, 2020)

I've been singing the same tune for several weeks... if this power rumor is true (the 3-slot monster seems to confirm it), AMD doesn't stand a chance of competing with the non-Titan flagship. Unless this silicon is completely borked, how does a new architecture and a die shrink at 300 W+ compare against a new arch with a tweaked process? Remember, the 5700 XT was 45% slower than a 2080 Ti. If Ampere is 50% faster, then AMD needs to be ~100% faster to compete. We haven't seen a card come close, from any camp, ever. That said, maybe RTRT performance is where the big increase is... who knows.

So long as AMD's card lands between them and is notably cheaper, it will be a win for everyone. But I just don't think the RDNA 2 flagship will be within 15%.
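For what it's worth, the compounding here depends on which way the "45%" is read; a later reply in this thread picks this up. A quick sketch using the round rumor numbers being thrown around (not benchmarks):

```python
# Sanity check on the percentages (rumor numbers from the thread, not benchmarks).
# "The 5700 XT is 45% slower" and "the 2080 Ti is 45% faster" are different claims.

reading_faster = 1.45       # 2080 Ti = 1.45x the 5700 XT ("45% faster")
reading_slower = 1 / 0.55   # 2080 Ti = ~1.82x ("5700 XT is 45% slower")

ampere_vs_ti = 1.50         # rumored: Ampere 50% faster than the 2080 Ti

for label, ti_ratio in (("45% faster", reading_faster), ("45% slower", reading_slower)):
    gap = ti_ratio * ampere_vs_ti
    print(f"2080 Ti read as '{label}': Ampere = {gap:.2f}x the 5700 XT")
```

Under the first reading Ampere lands around 2.2x the 5700 XT; under the second, closer to 2.7x, so "AMD needs ~100% faster" is roughly the first reading.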


----------



## vega22 (Aug 23, 2020)

Nvidia knows that PC gaming hardware will take a hit over the next year/18 months; it always does at the start of a new console cycle. As such, they are going to try and cream as much margin off the top as they can.


----------



## efikkan (Aug 23, 2020)

Minus Infinity said:


> Stupid price, stupid size, stupid power consumption. Pathetically desperate effort from Nvidia to cling to the performance crown no matter what. Again so much time devoted to an ultra niche product only 0.01% of gamers will actually buy despite all the keyboard warriors claiming otherwise.


Desperate? High-end is what moves the market forward.

And get your facts straight, Nvidia's current top model, RTX 2080 Ti, holds 0.88% of the Steam user base, that's slightly more than e.g. RX 5700 XT.

Neither price nor power consumption is confirmed at this point.


----------



## Chrispy_ (Aug 23, 2020)

It would seem a lot of people in this thread are confusing the RTX 3080 leaked a few weeks ago with this new 3090.

3080 leak from July:




3090 leak from Friday:




No, these are not the same card. Please stop posting leaked images of the 3080 and using it to claim things about the 3090.


----------



## Wilson (Aug 23, 2020)

lexluthermiester said:


> That beast will not be going into my Dell T3500. It literally won't fit in physically. I'm sure it would work though. Might be time to build a new system... Been dragging my feet for almost a year anyway... I've settled on Threadripper, just trying to decide on which one..


How about getting a proper case for $60 if you're throwing $1,000+ at a graphics card?


----------



## Serhend (Aug 23, 2020)

How can it be justified by performance? That is nonsense.
The GTX 1070, for example, launched at $380 and gave the same performance as, if not more than, the 980 Ti, which launched at $650. It was even on par with the Titan X, which launched at $1,000 in 2015. That was progress. It was also only $50 more than the GTX 970, which it beat by 50% to 60% in various scenarios.

The RTX 2070, on the other hand, was only 30% better than the GTX 1070 and launched at $500; the FE was even $600, I think. So it provided 30% more performance at a price increase of 31%+. That is not progress; that is a purely linear upgrade with more money being paid.

They already screwed up the game there. The 3000 series adds a cherry on top of this.


----------



## ppn (Aug 23, 2020)

Nvidia should just put the GPU on the front and the CPU on the back of the PCB. And voila, no case needed, just a stand.


----------



## jayseearr (Aug 23, 2020)

Chrispy_ said:


> It would seem a lot of people in this thread are confusing the RTX 3080 leaked a few weeks ago with this new 3090.
> 
> 3080 leak from July:
> View attachment 166438
> ...



Lol ^ you are asking people to stop speculating in a thread that is entirely speculation. Why? Nobody is claiming anything; the word "potentially" is literally in the title. Relax and don't take it too seriously.


----------



## yotano211 (Aug 23, 2020)

I'll go back to sailing my boat while you guys talk it out.


----------



## crimsontape (Aug 23, 2020)

This card is going to be workstation-oriented. I think it's just too big for most systems, plus those power requirements are going to be insane. I don't think all power supplies will play fair with this monster. I'll wait for the 3050 or 3060 series and see what they offer then; something a little more reasonable.

It's cards like this that allow some crazy gaming, don't get me wrong. 4K gaming is wicked. But we've seen where 4K gaming is headed at a dev level with the Xbox and PS5: pseudo-4K is the current future. The 3090 will likely offer a full 4K experience, but at an unreasonable cost. RDNA2 will probably win with cross-platform optimizations on a more pedestrian and market-accessible product that comes in two consoles or your choice of AIB GPUs.


----------



## Raendor (Aug 23, 2020)

saki630 said:


> I guess im keeping my 1080ti or upgrading to 2080ti cuz nothing else will fit in this N1Case.



Like there won't be normal-size cards at a better price that match the 2080 Ti (i.e. the 3070). That's what I'm expecting to put in my Ncase 6.1 too, instead of my still-doing-fine 1080.


----------



## Arjai (Aug 23, 2020)

Looks like this thing runs hot enough to bubble the fan hub cover.


----------



## siki (Aug 23, 2020)

Wilson said:


> How about getting some proper case for 60$ if throwing $1000+ on VGA?



If the PC case doesn't fall apart while sitting there doing nothing, it's good enough.


----------



## Valantar (Aug 23, 2020)

EarthDog said:


> I've been singing the same tune for several weeks.. if this power rumor us true (3 slot monster seems like that is coming true) amd doesn't stand a chance to compete with non titan flagship. Unless this silicon is completely borked, how does a new architecture and a die shrink at 300W+ compare against a new arch with a tweaked process? Remember 5700XT was 45% slower than a 2080ti. If ampere is 50% faster, then AMD needs to be ~100% faster to compete. We havent see a card come close, from any camp, ever. That saod, maybe its RTRT performance is where the big increase is... who knows.
> 
> So lomg as amd's card lands between them and is notably cheaper, it will be a win for everyone. But I just don't think rdna2 flagship will be within 15%.


Hey, for once we disagree on something! Not necessarily on your conclusion (I also find it unlikely that AMD will be able to compete with a 350-400 W Nvidia card, assuming +~10% IPC and a reasonable efficiency boost from the new process node) but mostly on your reasoning. Firstly, I find it unlikely that AMD will compete at this level mainly because I find it unlikely that they'll make a GPU this (ridiculously) power hungry. (As you said, assuming that the power draw rumors are true, obviously.)

Beyond that, though, you're comparing a 215-225 W GPU against a 275-300 W GPU and extrapolating from that as if both were equal, which is obviously not true. The 5700 XT was ~45% slower but also used 28% less power. On this point it's also worth noting that TPU measures power at 1080p, where the performance delta is just 35%, and that the 2080 Ti is one of the most efficient renditions of Turing while the 5700 XT is the least efficient rendition of RDNA by quite a bit. AMD has also promoted "up to 50%" improved perf/W for RDNA 2, which one should obviously take with a heaping pile of salt (does that, for example, mean up to 50% from the 5700 XT, or from any RDNA 1 GPU?), but which must also be correct in some sense lest they be subjected to yet another shareholder lawsuit. So IMO it's reasonable to expect a notable perf/W jump from RDNA overall even if it's just an improved arch on a tweaked node.

Will it be on par with Ampere? I don't quite think they'll be there, but I think it will be closer than we're used to seeing. Which would/could also explain Nvidia deciding to make a power move of a GPU like this to cement themselves as having the most powerful GPU despite much tougher competition, as AMD would be very unlikely to gamble on a >300 W GPU given their history in GPUs over the past half decade or so.


----------



## medi01 (Aug 23, 2020)

Valantar said:


> The 5700 XT was ~45% slower


If the 2080 Ti is 45% faster, then the 5700 XT is ~31% slower.
Just saying


----------



## EarthDog (Aug 23, 2020)

The 5700 XT is a 225 W card in reference form. A 2080 Ti is 260 W in FE form (not reference, which is 250 W)... a 17% difference. Sweet spot/efficiency versus over-extending is totally irrelevant... it is what it is for each. In fact, I would think that supports my thoughts more, no? AMD is already reaching and over-extending to be 45% below the 2080 Ti (I get it, it was never intended to compete there; RDNA 2 will be true high-end) and NV isn't... so what if they try the same thing with Big Navi, running it out of the sweet spot again to get closer? I don't think many will have an issue with a 250 W Big Navi... but it had better be within 20% of flagship Ampere. I think few doubt it will be more efficient, but it isn't catching up to within 10-15% if I had to guess.

AMD has a hell of a leap to catch up and be competitive. I think they'll do it... but they'll be on the outside looking in by at least 10-15%. It will be slower, cheaper, and use less power... AMD's motto on the GPU side.


----------



## Jayp (Aug 23, 2020)

It's a big card, but mostly when compared to a reference-size card. If you put it next to AIB 2080 Ti cards it probably won't look so huge. Also, just because the cooler is that big doesn't necessarily mean it is the minimum/reasonable cooling often found on Nvidia reference cards. It is possible that Nvidia wanted an AIB-competitive factory cooler this time around instead of the barely sufficient reference cooler on the 2080 Ti.


----------



## Legacy-ZA (Aug 23, 2020)

I am really curious to see what Big Navi has to offer, I am sure we will start to see leaks from the AMD camp after the 1st of September.


----------



## lexluthermiester (Aug 23, 2020)

JalleR said:


> Well as a temp setup while building hard tubing in my primary pc I put my Asus Rog Strix OC 1080TI in a T3500 I had to tweak the Closing mechanism for the cards a little but it worked like a charm


That can be a problem too. How did you fix it?


----------



## Ubersonic (Aug 23, 2020)

lexluthermiester said:


> That beast will not be going into my Dell T3500. It literally won't fit in physically.


Actually, from the image I'm pretty sure it would. It looks to be ~40-50 mm longer than the pictured 2080 and 10-15 mm taller, so it should (just) fit in a T3500 physically, but you will have to remove the left HDD (if fitted) and take out the blanking plate from the HDD bay (it's removable for fitting expansion cards).

Having said that, you probably wouldn't want to do it, as even a 5700 XT will bottleneck like mad in a T3500 (even with a 3.8 GHz 6c/12t CPU), so this GPU would be choked to death.



dont whant to set it"' said:


> Gtx 480 might have a succesor with respect to power drawn and heat outputed.


Isn't that exactly what the GTX 580 was? It beat the GTX 480 in both; hell, some AIB 580s were sucking over 100 W more than the 480, lol


----------



## lexluthermiester (Aug 23, 2020)

Ubersonic said:


> Actually from the image I'm pretty sure it would.


It will not, unless I modify the case.


Ubersonic said:


> Having said that, You probably wouldn't want to do it as a 5700XT will bottleneck like mad in a T3500 (even with a 3.8GHz 6c12t CPU)


I currently have an RTX2080 that is only CPU bottlenecked in some games. It's not severe. However...


Ubersonic said:


> so this GPU would be choked to death.


...this is correct, which is why I will be building a new system. I only started using the T3500 as a daily driver on a challenge, and then was impressed enough by its performance that I just kept it. It is starting to show its age these days and I'm jonesing for a Threadripper with 32GB of DDR4-3800. I'm likely going to put an RTX 30xx in that system.


----------



## Valantar (Aug 23, 2020)

EarthDog said:


> 5700XT is a 225W card in reference form. A 2080Ti is 260W in FE (not reference, = 250W) form....a 17% difference. It is totally irrelevant sweetspot/efficiencyversus over extending... it is what it is for each. In fact I would think that supports my thoughts more, no? If AMD is already reaching and over extending themselves to be 45% below 2080Ti (I get it, never intended to compete there, RDNA2 will be a true high-end) and NV isn't....so what if they try the same thing with big navi running it out of the sweetspot again to be closer? I don't think many will have an issue with a 250W BNavi... but it had better be within 20% of flagship Ampere. I think few doubt it will be more efficient but it isn't catching up within 10-15% if I had to guess.
> 
> AMD has a hell of a leap to catch up and be competitive. I think they'll do it.. but they'll be on the outside looking in by at least 10-15%. It will be slower, cheaper and use less power...AMD's motto on the GPU side.


Average gaming power vs. average gaming power in TPU's benchmarks, they are 219 W vs. 273 W, which makes the 5700 XT consume 80% of the 2080 Ti's power, or the 2080 Ti consume 125% of the 5700 XT's. I guess I should have looked up the numbers more thoroughly (saying 215 vs. 275 did skew my percentages a bit), but overall, your 17% number is inaccurate. Comparing TDPs between manufacturers isn't a trustworthy metric, because the numbers are defined differently.
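Spelling out the ratio arithmetic with the TPU averages quoted above (219 W and 273 W; figures as cited in this post, used here only for illustration):

```python
# Power ratio in both directions; note the two percentages are not symmetric.
p_5700xt, p_2080ti = 219.0, 273.0

print(f"5700 XT uses {p_5700xt / p_2080ti:.1%} of the 2080 Ti's power")   # ~80%
print(f"2080 Ti uses {p_2080ti / p_5700xt:.1%} of the 5700 XT's power")   # ~125%
```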

As for the 5700 being stretched in efficiency somehow proving they're further behind: obviously not, which you yourself mention. The 5700 XT is a comparatively small die, which AMD chose to push the clocks of to make it compete at a higher level than it was likely designed for originally. The 2080 Ti on the other hand is a classic wide-and-(relatively-)slow big die GPU, which gives it plenty of OC headroom if the cooling is there, but also makes it operate in a more efficient DVFS range. AMD could in other words compete better simply by building a wider chip and clocking it lower. Given just how much more efficient the 5700 non-XT is (166W average gaming power! With the 5700 XT just winning by ~14%!) we know even RDNA 1 can get a lot more efficient than the 5700 XT (not to mention the 5600 XT, of course, which beats any Nvidia GPU out there for perf/W). And the 2080 Ti still can't get 2x the performance of the 5700 non-XT (+54-76% depending on resolution). Which tells us that AMD could in theory build a slightly downclocked double 5700 non-XT and clean the 2080 Ti's clock at the same power, as long as the memory subsystem keeps up. Of course they never built such a GPU, and it's entirely possible there are architectural bottlenecks that would have prevented this scaling from working out, but the efficiency of the architecture and node is there. And RDNA 2 GPUs promise to improve that both architecturally and from the node. We also know that they can clock to >2.1GHz _even in a console _(which means limited power delivery and cooling), so there's definitely improvements to be found in RDNA 2.

The point being: if AMD is finally going to compete in the high end again, they aren't likely to go "hey, let's clock the snot out of this relatively small GPU" once again, but rather design as wide a GPU as is reasonable within their cost/yield/balancing/marketability constraints. _Then_ they might go higher on clocks if it looks like Nvidia are pulling out all the stops, but I would be downright shocked if the biggest big Navi die had less than 80 CUs (all might not be active for the highest consumer SKU of course). They might still end up releasing a >350W clocked-to-the-rafters DIY lava pool kit, but if so that would be a reactive move rather than one due to design constraints (read: a much smaller die/core count than the competition) as in previous generations (RX 590, Vega 64, VII, 5700 XT).

I don't think anyone will mind a 250W Big Navi being more than 20% behind Ampere if said Ampere is 350W or more. On the other hand, if it was more than 20% Ampere at the same power? That would be a mess indeed - but it's looking highly unlikely at this point. If Nvidia decided to go bonkers with power for their high end card, that's on them.


----------



## EarthDog (Aug 23, 2020)

Valantar said:


> Average gaming power vs. average gaming power in TPU's benchmarks, they are 219W vs. 273W, which makes the 5700 XT consume 80% of the 2080 Ti's power, or the 2080Ti consume 125% the power of the 5700 XT. I guess I should have looked up the numbers more thoroughly (saying 215 vs. 275 did skew my percentages a bit), but overall, your 17% number is inaccurate. Comparing TDPs between manufacturers isn't a trustworthy metric due to the numbers being defined differently.
> 
> As for the 5700 being stretched in efficiency somehow proving they're further behind: obviously not, which you yourself mention. The 5700 XT is a comparatively small die, which AMD chose to push the clocks of to make it compete at a higher level than it was likely designed for originally. The 2080 Ti on the other hand is a classic wide-and-(relatively-)slow big die GPU, which gives it plenty of OC headroom if the cooling is there, but also makes it operate in a more efficient DVFS range. AMD could in other words compete better simply by building a wider chip and clocking it lower. Given just how much more efficient the 5700 non-XT is (166W average gaming power! With the 5700 XT just winning by ~14%!) we know even RDNA 1 can get a lot more efficient than the 5700 XT (not to mention the 5600 XT, of course, which beats any Nvidia GPU out there for perf/W). And the 2080 Ti still can't get 2x the performance of the 5700 non-XT (+54-76% depending on resolution). Which tells us that AMD could in theory build a slightly downclocked double 5700 non-XT and clean the 2080 Ti's clock at the same power, as long as the memory subsystem keeps up. Of course they never built such a GPU, and it's entirely possible there are architectural bottlenecks that would have prevented this scaling from working out, but the efficiency of the architecture and node is there. And RDNA 2 GPUs promise to improve that both architecturally and from the node. We also know that they can clock to >2.1GHz _even in a console _(which means limited power delivery and cooling), so there's definitely improvements to be found in RDNA 2.
> 
> ...


As far as wattages, I simply used the nameplate values for ease of scope and context.

You're going a lot further down the wormhole than I ever want to go. Time will tell... but I don't see Big Navi within 15%. That said, we'll all take that as a win, I'm sure (depending on price).


----------



## neatfeatguy (Aug 24, 2020)

RTX 2080 FE is:
10.5" long
4.6" high
1.4" wide

My 980Ti AMP! Omega (and the Extreme version) is:
12.9" long
5.25" high
??? wide - I can't find specific width dimensions listed, but it takes up just shy of 3 slots (by "just shy" I mean about 1/4 inch, if that)

My guess is the pictured (supposed) 3090 is similar in size to my 980 Ti AMP Omega card.



Manoa said:


> you don't have to go out of the PC systems guys, you just have to go out of the video cards (for a while)   just don't buy 10,900K for 600$ and you are ok
> I running 780Ti now 6+ years and no problems with any games



The only problem you run into if you keep a card for a very long time is that they will eventually drop support. I had some GTX 280s in SLI for about 3.5 years. About a year after I stopped using them I gifted one to my younger brother and he used it for about 3 years. That put the card at just over 7 years past its release date (originally released June 2008). Nvidia stopped driver support for that series of cards in 2014, if I remember correctly.

He used that card until the release of Dying Light (2015), but driver support was gone for his 280 and Dying Light literally wouldn't work because the driver was too old. The game would tell him his card/driver was not supported.

My point is, sure, you can use a card for a good amount of time, but unfortunately it will stop getting support and new games won't run.


----------



## Easo (Aug 24, 2020)

I thought technical progress meant that cards should not grow in size, but at least stay the same.
This does look like it would need a support bracket inside the case, because I can already imagine cards breaking the PCIe slots/their own connectors...


----------



## watzupken (Aug 24, 2020)

Xaled said:


> Are you focking serious? 1060 is THREE years older than 16xx's. In old days you could've get 100% performance for same price in such period.



In my opinion, it would have been possible for Nvidia to create a successor to the GTX 1060 that is close to 100% faster. If you look at the RTX cards, which are supposed to be premium Nvidia cards, Nvidia invested sideways and heavily in RT and DLSS. I believe significant die space where they could have crammed in more powerful hardware to boost performance was instead allocated to the RT and tensor cores. Just comparing transistor counts between the GTX 1660 and RTX 2060, there is a whopping 4.2 billion difference. The latter has more CUDA cores, but I still feel the extra CUDA cores do not account for the bulk of the difference.

With the premium series capped in performance, Nvidia needs to artificially gimp its GTX series to avoid cannibalizing sales of the RTX series. The same should be expected with the upcoming 3xxx series, as I am sure Nvidia will double down on the likes of RT and DLSS. In addition, I am not sure what sort of bespoke tech Nvidia will introduce at the hardware level, since they tend to do this with every new generation.



Easo said:


> I thought technicaly progress mean't that the cards should not grow in size, but at least stay the same.
> This does look like it would need a support inside the case, because I can already imagine cards breaking the PCIe slots/their own connectors...



I don't agree. While I am not a fan of giant graphics cards/coolers, the reality is that every few generations we observe a jump in size. When I started with my first PC, the graphics card I used relied on passive cooling with a small heatsink. Then active cooling started creeping in after a few years. The active coolers grew in size over the years but stayed single-slot. Then two-slot coolers with two fans appeared. Fast forward to the last 3 to 4 years, and it is not uncommon to see coolers with three fans, taking up three slots, and taller than the card itself. As technology improves, the graphics card makers get more aggressive with adding hardware and features, pushing the boundaries and also the power consumption.


----------



## Vayra86 (Aug 24, 2020)

Arjai said:


> View attachment 166452
> 
> Looks like this thing runs hot enough to bubble the fan hub cover. ?



I think that's some leftover from an old Asus *AREZ* sticker under there...

Once more, all I can say is... credibility... LOW



Candor said:


> View attachment 166418



Is that Jerry? Lol. I can sort of hear his soothing voice 



EarthDog said:


> As far as wattages, I simply used the nameplate values for ease of scope and context.
> 
> You're going a lot further down the wormhole than I ever want to go. Time will tell... but I dont see big navi within 15%. That said, we'll all take that as a win im sure (depending on price).



Big Navi might gain 30%, best case 40%, over the 5700 XT. Best case. Or AMD has gone similarly mental, this whole 3090 BS is true, and they both sport 400+ W cards that I won't ever buy. In that case I'm staying far away from any res higher than 1440p for the foreseeable future and keeping on rolling with sensible pieces of kit... but then I might do that anyway.



EarthDog said:


> 5700XT is a 225W card in reference form. A 2080Ti is 260W in FE (not reference, = 250W) form....a 17% difference. It is totally irrelevant sweetspot/efficiencyversus over extending... it is what it is for each. In fact I would think that supports my thoughts more, no? If AMD is already reaching and over extending themselves to be 45% below 2080Ti (I get it, never intended to compete there, RDNA2 will be a true high-end) and NV isn't....so what if they try the same thing with big navi running it out of the sweetspot again to be closer? I don't think many will have an issue with a 250W BNavi... but it had better be within 20% of flagship Ampere. I think few doubt it will be more efficient but it isn't catching up within 10-15% if I had to guess.
> 
> AMD has a hell of a leap to catch up and be competitive. I think they'll do it.. but they'll be on the outside looking in by at least 10-15%. It will be slower, cheaper and use less power...AMD's motto on the GPU side.



Perhaps the far more interesting question is what AMD is going to offer across the stack below their top-end RDNA 2 part. Because it was AMD itself that once told us that when they did RT, it would be from the midrange on up. Where is it? ... It's starting to smell a lot like late to the party again.


----------



## medi01 (Aug 24, 2020)

Legacy-ZA said:


> I am really curious to see what Big Navi has to offer, I am sure we will start to see leaks from the AMD camp after the 1st of September.



Well, what we can reasonably expect from AMD:
1) 505 mm² (a rumor, but from a source with a good track record) and 80 CUs (over the 5700 XT's 40 CUs); all sounds reasonable
2) The PS5 being able to push its GPU to 2.1 GHz (with some power consumption reservations)
3) RDNA 2 should be a bit faster, not slower, than RDNA 1

Optimistically, on a next-gen, improved fab node, twice the 5700 XT with faster RAM could be about 100% faster.
Taking the 2080 Ti as 45% faster than the 5700 XT, we get:

An RDNA 2, 505 mm², 80 CU part = 2/1.45 = *38% faster than the 2080 Ti*, or somewhat lower (it would, of course, vary drastically between games, but note how optimizing for RDNA 2 becomes unavoidable given AMD's dominance in the console market)
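The estimate spelled out, with every input being a rumor or assumption from this post rather than a measurement:

```python
# All inputs are thread rumors/assumptions, not benchmarks.
big_navi_vs_5700xt = 2.00   # assumed: doubled 5700 XT (80 CUs, faster RAM, better node)
ti_vs_5700xt = 1.45         # assumed: 2080 Ti ~45% faster than the 5700 XT

big_navi_vs_ti = big_navi_vs_5700xt / ti_vs_5700xt
print(f"Big Navi vs 2080 Ti: {big_navi_vs_ti:.2f}x, "
      f"i.e. ~{(big_navi_vs_ti - 1) * 100:.0f}% faster")   # ~38% faster
```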


----------



## kiriakost (Aug 24, 2020)

Chloe Price said:


> Why Asus? It's not the best which comes to coolers. MSI's been hella fine for several years, Asus has insane brand premium for its decent cards.



It's not ASUS per se; it's about ASUS choosing to use a quality cooler on the 1660 Super so as to impress Intel and become their business partner in mini PCs.
They put together a good card packaged with their best-ever cooler; that does not happen every day.





----------



## EarthDog (Aug 24, 2020)

Vayra86 said:


> Big Navi might gain 30, best case 40% over RDNA2.


Oh, I fully believe Big Navi will beat the 2080 Ti... and that means AT LEAST +45%... I think it will land between the 2080 Ti and 3090... I just hope it is closer to the latter, not the former.

I also believe that if Big Navi does that, it will be at least a 225 W GPU... more likely 250 W.


----------



## Valantar (Aug 24, 2020)

Vayra86 said:


> Big Navi might gain 30, best case 40% over RDNA2. Best case. Or AMD has gone similarly mental and this whole 3090 BS is true and they do both sport 400+W cards that I won't ever buy  In that case I'm staying far away from any res higher than 1440p for the foreseeable future and keep rolling with sensible pieces of kit... but then I might do that anyway.


30-40% absolute performance, perf/W, or something else? 30-40% increased absolute performance could theoretically be done just by scaling up RDNA 1 to flagship power draw levels with a wider die, so that seems like a too low bar IMO. 30-40% increased perf/W could make for a potent Ampere competitor - if going from the (least efficient rendition of RDNA1, the) 5700 XT, that would mean +30% performance at ~225W (slightly lower at stock according to TPU's numbers, but let's go by what it says on the tin for now). For the sake of simplicity, let's assume perf/W is flat across the RDNA 2 range - it won't be, but it's not a crazy assumption either - which then puts a 275W RDNA 2 GPU at ~159% the performance of the 5700 XT, matching or beating the 2080 Ti even at 4k where it wins by the highest margin (35/46% for 1080p/1440p), which is admittedly not a high bar in 2020, or a 300W RDNA 2 GPU at 173% of the 5700 XT, soundly beating the 2080Ti overall. That is going by 30% increased overall/average perf/W though, which for me is the minimum reasonable expectation when AMD has said "up to 50%". I'm by no means expecting +50% perf/W overall based on that statement, obviously, but 30% overall seems reasonable based on that.
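The scaling argument above can be sketched numerically, under the assumptions as stated (flat perf/W across the range, +30% perf/W over a nameplate-225 W 5700 XT taken as the performance baseline):

```python
# Hypothetical RDNA 2 scaling under the stated assumptions; not a prediction.
base_power = 225.0                 # 5700 XT nameplate power; its performance = 1.0
perf_per_watt = 1.30 / base_power  # assumed +30% perf/W over the 5700 XT

for watts in (225, 275, 300):
    rel_perf = perf_per_watt * watts
    print(f"{watts} W RDNA 2 card: {rel_perf:.0%} of 5700 XT performance")
```

This reproduces the ~159% and ~173% figures in the paragraph above for 275 W and 300 W cards respectively.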



EarthDog said:


> Oh..I fully believe big navi will beat the 2080Ti... and that is AT LEAST 45%... I think it will land between the 2080Ti and 3090... I just hope it is closer the latter, not the former.
> 
> I also believe if big navi does that, it will be at least a 225W GPU... more likely 250W.


If AMD is going back to competing for the GPU crown, wouldn't the safe assumption be that Big Navi is in the 275-300W range? That's where the flagships tend to live, after all. It would be exceptionally weird for them to aim for flagship performance yet limit themselves to upper midrange power draw levels.


----------



## Vayra86 (Aug 24, 2020)

Valantar said:


> 30-40% absolute performance, perf/W, or something else? 30-40% increased absolute performance could theoretically be done just by scaling up RDNA 1 to flagship power draw levels with a wider die, so that seems like a too low bar IMO. 30-40% increased perf/W could make for a potent Ampere competitor - if going from the (least efficient rendition of RDNA1, the) 5700 XT, that would mean +30% performance at ~225W (slightly lower at stock according to TPU's numbers, but let's go by what it says on the tin for now). For the sake of simplicity, let's assume perf/W is flat across the RDNA 2 range - it won't be, but it's not a crazy assumption either - which then puts a 275W RDNA 2 GPU at ~159% the performance of the 5700 XT, matching or beating the 2080 Ti even at 4k where it wins by the highest margin (35/46% for 1080p/1440p), which is admittedly not a high bar in 2020, or a 300W RDNA 2 GPU at 173% of the 5700 XT, soundly beating the 2080Ti overall. That is going by 30% increased overall/average perf/W though, which for me is the minimum reasonable expectation when AMD has said "up to 50%". I'm by no means expecting +50% perf/W overall based on that statement, obviously, but 30% overall seems reasonable based on that.
> 
> 
> If AMD is going back to competing for the GPU crown, wouldn't the safe assumption be that Big Navi is in the 275-300W range? That's where the flagships tend to live, after all. It would be exceptionally weird for them to aim for flagship performance yet limit themselves to upper midrange power draw levels.



+30-40% perf compared to the 5700 XT, to clarify. Maybe on a good day I'd be optimistic enough to say +50%.

Any more would have me very surprised.


----------



## EarthDog (Aug 24, 2020)

Valantar said:


> If AMD is going back to competing for the GPU crown, wouldn't the safe assumption be that Big Navi is in the 275-300W range? That's where the flagships tend to live, after all. It would be exceptionally weird for them to aim for flagship performance yet limit themselves to upper midrange power draw levels.


It depends on who you ask and what you expect out of them and a minor node tweak. I fully expect it to be over-extended to perform closer to NV's cards and to run into a similar situation as the 5700 XT did. So yeah, going by nameplate values (again, I'm not playing the "this review said XXX W" game right now), I expect it to be 225-250 W in reference form. I'm trying to give them some credit on the arch change and minor node tweak. If Big Navi is any closer than 15%, I'll expect 250+ W out of it for sure.


----------



## Valantar (Aug 24, 2020)

EarthDog said:


> It depends on who you ask and what you expect out of them and a minor node tweak. I fully expect it to be over extended to perform closer to NV cards and run into a similar situation as the 5700XT did. So yeah, nameplate values (again, I don't play this review said XXX W stuff right now), I expect it to be 225-250W in reference form. I'm trying to give them some credit on the arch change and minor node tweak. If big navi is any closer than 15% I''ll expect 250+ out of it for sure.


Again, I think that is a really weird expectation. The 225W rating of the 5700 XT is on the high side but nothing abnormal _for an upper midrange card._ For a flagship GPU in 2020 that kind of power draw (if it is at all competitive) would be revolutionary. The 7970 GHz edition was 300W. The 290X was 290W. The 390X was (admittedly a minor tweak of the 290X, and) 275W. The Fury X was 275W. The Vega 64 was 295W. The VII was 295W. You would need to go back to 2010 and the 6970 to find a single-GPU AMD flagship at 250W, and the 5870 in 2009 at 188W. And Nvidia's flagships have consistently been at or above 250W for more than a decade as well. The 5700 XT never made any claim to being or performing on the level of a flagship GPU.  AMD's current fastest GPU is an upper midrange offering, is explicitly positioned as such, so expecting their well publicized upcoming _flagship _offering to be in the same power range seems to entirely disregard the realities of GPU power draw. Higher end = higher performance = more power draw.



Vayra86 said:


> +30-40% perf compared to 5700XT, to clarify. Maybe if I have a good day I'd be optimistic enough to say +50%.
> 
> Any more would have me very surprised.


That sounds overly pessimistic to me. The 5700 XT was never designed to be anything but upper midrange, and pushed a small die higher than was efficient. As I said above, on paper even RDNA (1) could hit that performance level if scaled up to flagship-level power draws with a matching wide die. AMD is promising significant perf/W gains for RDNA 2, so expecting increases beyond that seems sensible simply from the fact that this time around they'll be designing a die for the high end and not the midrange.


----------



## Vayra86 (Aug 24, 2020)

Valantar said:


> Again, I think that is a really weird expectation. The 225W rating of the 5700 XT is on the high side but nothing abnormal _for an upper midrange card._ For a flagship GPU in 2020 that kind of power draw (if it is at all competitive) would be revolutionary. The 7970 GHz edition was 300W. The 290X was 290W. The 390X was (admittedly a minor tweak of the 290X, and) 275W. The Fury X was 275W. The Vega 64 was 295W. The VII was 295W. You would need to go back to 2010 and the 6970 to find a single-GPU AMD flagship at 250W, and the 5870 in 2009 at 188W. And Nvidia's flagships have consistently been at or above 250W for more than a decade as well. The 5700 XT never made any claim to being or performing on the level of a flagship GPU.  AMD's current fastest GPU is an upper midrange offering, is explicitly positioned as such, so expecting their well publicized upcoming _flagship _offering to be in the same power range seems to entirely disregard the realities of GPU power draw. Higher end = higher performance = more power draw.
> 
> 
> That sounds overly pessimistic to me. The 5700 XT was never designed to be anything but upper midrange, and pushed a small die higher than was efficient. As I said above, on paper even RDNA (1) could hit that performance level if scaled up to flagship-level power draws with a matching wide die. AMD is promising significant perf/W gains for RDNA 2, so expecting increases beyond that seems sensible simply from the fact that this time around they'll be designing a die for the high end and not the midrange.



My pessimism has been on the right track more often than not though, when it comes to these predictions.

So far AMD has not shown us a major perf/w jump on anything GCN-based ever, but now they call it RDNA# and they suddenly can? Please. Tonga was a failure and that is all they wrote. Then came Polaris - more of the same. Now we have RDNA2 and already they've been clocking the 5700XT out of its comfort zone to get the needed performance. And to top it off they felt the need to release vague 14Gbps BIOS updates that nobody really understood, post/during launch. You don't do that if you've got a nicely rounded, future proof product here.

I'm not seeing the upside here, and I don't think we can credit AMD with trustworthy communication surrounding their GPU department. It is 90% left to the masses and the remaining 10% is utterly vague until it hits shelves. 'Up to 50%'... that sounds like Intel's 'Up to' Gigahurtz boost and to me it reads 'you're full of shit'.

Do you see Nvidia market 'up to'? Nope. Not a single time. They give you a base clock and say a boost is not guaranteed... and then we get a slew of GPUs every gen that ALL hit beyond their rated boost speeds. That instills faith. It's just that simple. So far, AMD has not released a single GPU that was free of trickery - either with timed scarcity (and shitty excuses to cover it up, I didn't forget their Vega marketing for a second, it was straight up dishonest in an attempt to feed hype), cherry picked benches (and a horde of fans echoing benchmarks for games nobody plays), supposed OC potential (Fury X) that never materialized, supposed huge benefits from HBM (Fury X again, it fell off faster than the GDDR5-driven 980 Ti which is still relevant with 6GB), the list is virtually endless.

Even in the shitrange they managed to make an oopsie with the 560D. 'Oops'. Wasn't that their core target market? Way to treat your customer base. Of course we both know they don't care at all. Their revenue is in the consoles now. We get whatever falls off the dev train going on there.

Nah, sorry. AMD's GPU division has lost the last sliver of faith a few generations back, over here. I don't see how or why they would suddenly provide us with a paradigm shift. So far, they're still late with RDNA as they always have been - be it version 1, 2 or 3. They still haven't shown us a speck of RT capability, only tech slides. The GPUs they have out lack feature set beyond RT. Etc etc ad infinitum. They've relegated themselves to followers and not leaders. There is absolutely no reason to expect them to leap ahead. Even DX12 Ultimate apparently caught them by surprise... hello? Weren't you best friends with MS for doing their Xboxes? Dafuq happened?

On top of that, they still haven't managed to create a decent stock cooler to save their lives, and they still haven't got the AIBs in line like they should. What could possibly go wrong eh

//end of AMD roast  Sorry for the ninja edits.


----------



## medi01 (Aug 24, 2020)

Unbelievable how one can regularly post on a tech-savvy forum, yet be so ignorant.


----------



## Valantar (Aug 24, 2020)

Vayra86 said:


> My pessimism has been on the right track more often than not though, when it comes to these predictions.
> 
> So far AMD has not shown us a major perf/w jump on anything GCN-based ever, but now they call it RDNA# and they suddenly can? Please. Tonga was a failure and that is all they wrote. Then came Polaris - more of the same. Now we have RDNA2 and already they've been clocking the 5700XT out of its comfort zone to get the needed performance. And to top it off they felt the need to release vague 14Gbps BIOS updates that nobody really understood, post/during launch. You don't do that if you've got a nicely rounded, future proof product here.
> 
> ...


I don't disagree with the majority of what you're saying here, though I think you're ignoring the changing realities behind the past situations you are describing vs. AMD in 2020. AMD PR has absolutely done a lot of shady stuff, has overpromised time and time again, and is generally not to be trusted until we have a significant body of proof to build trust on. Their product positioning and naming in China (like the 560D, various permutations of "580", and so on) is also deeply problematic. But so far I don't think I've seen Dr. Su overpromise or skew results in a significant way - but I might obviously have missed or forgotten something - and the "up to 50% improved perf/W" comes from her. (Sure, you could debate the value of Cinebench as a measure of overall performance - I think it shows AMD in too good a light if seen as broadly representative - but at least it's accurate and representative of some workloads.) And despite the fundamental shadiness of promising maximums ("up to") rather than averages or baselines, there's at least the security that it must in some sense be true for AMD to not be sued by their shareholders. And given how early that was said relative to the launch of the GPUs, I would say any baseline or average promise would be impossible to make.

Beyond that, most of what you describe is during the tenure of Koduri, and while it is obviously wrong to place the blame for this (solely) at his feet, he famously fought tooth and nail for near total autonomy for RTG, with him taking the helm for the products produced and deciding the direction taken with them. He obviously wasn't at fault for the 64 CU architectural limit of GCN, which crippled AMD's GPU progress from the Fury X and onwards, but he was responsible for how the products made both then and since were specified and marketed. And he's no longer around, after all. All signs point towards there having been some serious talking-tos handed out around AMD HQ in the past few years.

But beyond the near-constant game of musical chairs that is tech executive positions, the main change is AMD's fortunes. In 2015 they were near bankrupt, and definitely couldn't afford to splurge on GPU R&D. In 2020, they are riding higher than ever, with Ryzen carrying them to record revenues and profits. In the meantime they've shown with RDNA that even on a relatively slim budget (RDNA development must have started around 2016 or so, picking up speed around 2018 at the latest) they could improve things significantly, and now they're suddenly flush with cash, including infusions from both major high-performance console manufacturers. The last time they had that last part was pre-2013, when they were already struggling financially, and both console makers went (very) conservative in cost and power draw for their designs. That is by no means the case this time around. They can suddenly afford to build as powerful a GPU as they want to within the constraints of their architecture, node and fab capacity.

And as I mentioned, RDNA has shown that AMD has found a way out of the GCN quagmire - while 7nm has obviously been enormously beneficial in allowing them to get close to (5700 XT), match (5700, 5500 XT) or even beat (5600 XT) Nvidia's perf/W, it is by no means the main reason for this, as is easily seen by comparing the efficiency of the Radeon VII vs. even the 5700 XT. And with RDNA being a new architecture with a lot of new designs, it stands to reason that there are more major improvements to be made to it in its second generation than there were to GCN 1.whatever.

As for the Fury X falling behind the 980 Ti: not by much. Sure, the 980 Ti is faster, and by a noticeable percentage (and a higher percentage than at launch), but they're still firmly in the same performance class. The 980 Ti has "aged" better, but by a few percent at best.

So while I'm all for advocating pessimism - you'll find plenty of my posts here doing that, including on this very subject - in this case I think you're being _too _pessimistic. I'm not saying I think AMD will beat or even necessarily match Ampere either in absolute performance or perf/W, but there are reasons to believe AMD has something good coming, just based on firm facts: We know the XSX can deliver a 52CU, 12TF GPU and an 8c16t 3.6GHz Zen2 CPU in a console package, and while we don't know its power draw, I'll be _shocked_ if that SoC consumes more than 300W - console power delivery and cooling, even in the nifty form factor of the XSX, won't be up for that. We also know the XSX runs at a relatively low clock speed with its 52 CUs thanks to the PS5 demonstrating that RDNA 2 can sustain 2.1 GHz even in a more traditional (if enormous) console form factor. We also know that even RDNA 1 can clearly beat Nvidia in perf/W if clocked reasonably (hello, 5600 XT!). What can we ascertain from this? That RDNA 2 at ~1.8GHz is quite efficient; that RDNA 2 is capable of notably higher clock speeds than RDNA 1, and that AMD is entirely capable of building a wider die than the RX 5700 - _even for a cost-sensitive console_.
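For what it's worth, the XSX's 12 TF figure falls straight out of the standard shader math. A back-of-the-envelope sketch (the 64-lane CU width, the FMA = 2 FLOPs factor, and the 1.825 GHz XSX clock are general RDNA/console specs, not figures from this thread):

```python
# Peak FP32 throughput for an RDNA-style GPU:
# each CU has 64 shader ALUs, each retiring one FMA (2 FLOPs) per clock.
def peak_tflops(cus: int, clock_ghz: float) -> float:
    flops_per_clock = cus * 64 * 2           # ALUs per CU x FLOPs per ALU
    return flops_per_clock * clock_ghz / 1000  # GFLOPS -> TFLOPS

# Xbox Series X: 52 CUs at 1.825 GHz -> ~12.15 TF, matching the quoted "12TF"
xsx = peak_tflops(52, 1.825)

# Same 52-CU width at PS5-like ~2.1 GHz clocks -> roughly 14 TF
hypothetical = peak_tflops(52, 2.1)
```

This is also why "wider die at sane clocks" scales so well: doubling CUs doubles the first factor linearly, while chasing clocks instead pushes power superlinearly.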


----------



## Kovoet (Aug 24, 2020)

Can't see me paying those prices. I update every year but now the prices are going to get stupid.


----------



## John Naylor (Aug 25, 2020)

DuxCro said:


> With ZEn 3 around the corner, idk why anyone would buy intel CPU for a new build.



Well, 1, around the corner is not here... and 2, it's always best to choose your tools based upon what you need them to do and what jobs you do. Unfortunately, the apps in which AMD excels are not on a large % of people's PCs. For gamers, the 10400 / 10400F is faster in gaming than anything in AMD's entire lineup... hard to get your hands on one even with vendors selling well over MSRP. When I first looked at the $265 10600K versus the $250 3600X, they were in the same general price tier... now the 10600K is sometimes over $300, while the 3600X has dropped.

While I would agree it's not wise to make a choice without all cards being on the table as yet, we can only make decisions based upon what is real. The only applications that matter are the ones each user is running and how often. If you are rendering or doing game development most of your day... most definitely you'd be better off with AMD. But for what people actually do on an everyday basis... most folks will do better with Intel, even with many workstation-type or advanced apps like photo editing, video editing and software development. And even though many things favor Intel in testing (i.e. office suites), I'm ignoring them, as the test scripts ignore the real bottleneck: user input. Just going to use gaming + video editing; if we want to include an area where more cores is supposed to help, the most common app of this type would be video editing.

$400 Price Point - 10700k ($406) vs 3900X ($429) ..... All data based upon TPU's testing

Gaming:  10700k = 100% / 3900x = 93% (OC'd the 10700K picks up 1.3% / OC'd the 3900x was slower) .... 10700k is 7.5% faster ignoring the OC advantage

Video Editing:  10700k = 236.5 / 3900x  = 252.2 (OC'd the 10700K = 228.6 / OC'd the 3900x was slower) .... 10700k is 6.6% faster ignoring the OC advantage

Stock Temps @ Max Load:   10700k = 61C / 3900X = 79C

$300 Price Point  -  10600K ($278) vs 3700X ($290) ..... All data based upon TPU's testing

Gaming:  10600k = 100% / 3700x = 93.6% (OC'd the 10600K picks up 1.9% / OC'd the 3700x was 0.1% faster) .... 10600k is 6.4% faster ignoring the OC advantage

Video Editing:  10600k = 241.00 / 3700x = 253.40 (OC'd the 10600K = 228.6 / OC'd the 3700x was slower) .... 10600k is 5.1% faster ignoring the OC advantage

Stock Temps @ Max Load:   10600k = 56C / 3700X = 66C

$200 Price Point  -  10400F ($182) vs 3600 ($175) ..... All data based upon TPU's testing

Gaming:  10400F = 100% / 3600 = 93.8%  .... 10400F is 6.2% faster

Video Editing:  10400F = 267.40 / 3600 = 263.00  .... 3600 is 1.7% faster

We were asked to do a build like this two weeks ago. The user didn't know which way he wanted to go, so we asked about frequency of usage and relative importance. It came down to, ***in his case***: what's more important, 6.2% every night of the week versus rendering once or twice a week?

So there are many reasons why someone would do an Intel build... it will depend on what applications are being used. For a build dedicated to gaming (and other incidental stuff where speed is of no significance), it is real hard to make a case for an AMD build, as Intel's $180 CPU is faster in gaming than anything in AMD's entire lineup, extending up to the $475 flagship. And that's "in general"... if you're exclusively into SIM games, you might go the other way.

No one buys a CPU because it's faster in PowerPoint, and the fact that Intel is faster in Word and Excel again isn't going to sway any buying decisions. As an engineer, my primary workday apps are AutoCAD and spreadsheets, both of which favor Intel.

From AutoCAD Workstation Vendor Support
"In the case of AutoCAD, the majority of the software is only single threaded so it is only able to utilize a single core of the CPU. For this reason, our general recommendation when choosing a processor is to get the highest frequency. For current generation CPUs, that is Intel's Core i9 10900K or i7 10700K, both of which can boost to over 5.0GHz with a single core active. "

So the relevant question becomes...

a)  What applications will your box have installed that benefit from one CPU versus another? Looking at TPU's testing, I see about 7 or 8 that favor Intel... and others that are too close to matter.
b)  What applications will your box have installed that benefit from one CPU versus another, that you use on an everyday basis? Looking at TPU, I see about 4 or 5 that favor Intel.
c)  What applications will your box have installed that benefit from one CPU versus another, that you use on an everyday basis, and in which the user is not the primary bottleneck? Looking at TPU, I see just gaming, and that favors Intel.

I do have an engineering colleague who wants me to build him a dual-system box. His idea is to run a Threadripper for doing rendering jobs overnight while he plays games on the Intel side. But every time he's ready to pull the trigger, he hears about another "next big thing" and puts it off.

Saying there's no reason to buy [insert anything here] doesn't fit everybody. The best answer would be a hammer if you are a roofer... but an electrician would likely say a screwdriver, and an auto mechanic a wrench. Picking a tool is best done by picking the tool that best fits the application... and a CPU is just that. For gamers (outside of SIMs and a few others), that's going to be Intel. For renderers and game developers, that's going to be AMD... for CAD users, video editors, photo editors, etc., that's going to be Intel.


----------



## DuxCro (Aug 25, 2020)

John Naylor said:


> Wall of text.


Seriously, you need to get out more.


----------



## Turmania (Aug 25, 2020)

I do not mind or care what I buy. But when I buy it, I want it to work, and to have the peace of mind that if there is a driver issue, I know it will get fixed immediately. This is where Nvidia excels and won me over as a consumer.


----------



## Manoa (Aug 25, 2020)

this is sad, but the bad news is that until AMD decides to stop the "anti-software" policy and stop looking for open source suckers to do their drivers for free, this is not going to change :x
yeah, sure, NV's policies suck, but they deliver, and consistently so, and in the end the user cares about what works, not broken, half-made drivers, even if the hardware is better :x
you can see a similar example in linux vs windows: linux fails every time in games, first it was WINE and now it's Proton, and each time it's the same story: nothing working, only problems
and windows spending millions on QA is, in the end, worth more to the user than the principles of linux


----------



## Parn (Aug 25, 2020)

Holy cr*p! The card is huge. In a small-ish case, this thing is going to split the entire internal space in half and completely block the airflow in between. What's the TDP?


----------



## TheoneandonlyMrK (Aug 25, 2020)

Manoa said:


> this is sad, but the bad news is that until AMD decides to stop the "anti-software" policy and stop looking for open source suckers to do their drivers for free, this is not going to change :x
> yeah, sure, NV's policies suck, but they deliver, and consistently so, and in the end the user cares about what works, not broken, half-made drivers, even if the hardware is better :x
> you can see a similar example in linux vs windows: linux fails every time in games, first it was WINE and now it's Proton, and each time it's the same story: nothing working, only problems
> and windows spending millions on QA is, in the end, worth more to the user than the principles of linux


And a comment on topic? At best you got six words in.

People need to read the title more around here; that might calm some flame wars, wtaf.


----------



## AddSub (Aug 28, 2020)

Preordered x2!


----------

