# NVIDIA GeForce RTX 3090 Looks Huge When Installed



## btarunr (Sep 10, 2020)

Here's the first picture of an NVIDIA GeForce RTX 3090 Founders Edition card installed in a tower case. The triple-slot card measures 31.3 cm in length and is 13.8 cm tall. Its design is essentially an upscaled version of the RTX 3080's. The card still pulls power from a single 12-pin power connector, with an included adapter that converts two 8-pin connectors to the 12-pin input. The typical board power of the card is rated at 350 W. This particular card in the leak, posted on the ChipHell forums, is pre-production, as VideoCardz comments, given that some parts of its metal superstructure lack the chrome finish of the card NVIDIA CEO Jen-Hsun Huang unveiled on September 1. The RTX 3090 launches on September 24.





*View at TechPowerUp Main Site*


----------



## BoboOOZ (Sep 10, 2020)

The only way it's not gonna look huge is when you put it next to a Hummer...


----------



## bonehead123 (Sep 10, 2020)

wHY OH Why did they put the power connector in the middle of the card, thereby creating moar visible cable clutter, especially with the adapter ?

f.A.i.L....


----------



## Rahnak (Sep 10, 2020)

bonehead123 said:


> wHY OH Why did they put the power connector in the middle of the card, thereby creating moar visible cable clutter, especially with the adapter ?
> 
> f.A.i.L....


That’s where the PCB ends.


----------



## DuxCro (Sep 10, 2020)

I'll probably get buried by Nvidia fanboys here, but I've been doing some thinking. If, as some people on the net claim, the RTX 3090 is around 20% faster than the RTX 3080, isn't this card basically an RTX 3080 Ti with a bunch of VRAM slapped on it? But this time, instead of charging $1200 for it, Nvidia had this clever idea to call it a Titan replacement card and charge $300 more, knowing that people who were willing to spend $1200 on the previous top card will spend an extra $300 on this one.

So the RTX 70 segment costs the same $499 again. The RTX 80 segment costs $699 again. But the top segment is now $300 more, with performance gains like those from the 700 to 900 series and from the 900 to 1000 series, which used to cost considerably less. The 1080 Ti was $699.


----------



## xkm1948 (Sep 10, 2020)

As clean as that looks, I am still going with EVGA, for the sake of better warranty.


----------



## GLeader (Sep 10, 2020)

i like how the card is supported by the HDD cage


----------



## john_ (Sep 10, 2020)

LOL.... And yes, this is a perfect PC case for this card. The disk cage also helps to minimize the stress from the weight of the card on the PCIe slot.


----------



## windwhirl (Sep 10, 2020)

I think the PCIE riser market is gonna get a slight boost with this...


----------



## JAB Creations (Sep 10, 2020)

Nvidia pushing its watt-sucking heat into AMD CPUs?

Nah, we can _trust Nvidia_! Right...?


----------



## Assimilator (Sep 10, 2020)

Someone needs to design a long GPU (long enough to hang over the motherboard, like this 3090) and put the power connectors on the bottom, i.e. the same edge that the PCIe connector is on. Then you could just run the needed power cables through the cable-routing holes in the motherboard tray, and no need to worry about hiding those cables.

My MS Paint skills are woeful but this should give you an idea. Power cables come through the hole indicated by the bottom of the arrow, plug in around where the tip of the arrow indicates.


----------



## BoboOOZ (Sep 10, 2020)

JAB Creations said:


> Nvidia pushing its watt-sucking heat into AMD CPUs?
> 
> Nah, we can _trust Nvidia_! Right...?


I actually agree with most of what's been said in that video and I remember well when many of those things happened, and I still don't like it one bit.

But in this case, I think there's no need to call out the conspiracy yet, wait for Gamer's Nexus review of how that design affects thermals of different case setups. I'm pretty sure for many cases the outcome will be positive (improving the airflow in the case, although outputting more heat).


----------



## Vayra86 (Sep 10, 2020)

Cool! For 1499 you don't even need to buy a bracket! You just use your HDD tray.



BoboOOZ said:


> I actually agree with most of what's been said in that video and I remember well when many of those things happened, and I still don't like it one bit.
> 
> But in this case, I think there's no need to call out the conspiracy yet, wait for Gamer's Nexus review of how that design affects thermals of different case setups. I'm pretty sure for many cases the outcome will be positive (improving the airflow in the case, although outputting more heat).



Still completely the wrong time and place, if you ask me. I reported it as LQ, because it's flame bait and nothing else. Do I need to post an AMD equivalent video now? What's the point?


----------



## Mr Bill (Sep 10, 2020)

I'm just an "old" dude that just surfs the web and watches "lots" of YouTube videos, would that be overkill for my IBM 8088?


----------



## P4-630 (Sep 10, 2020)

xkm1948 said:


> As clean as that looks, I am still going with EVGA, for the sake of better warranty.



EVGA only gives a 24 month warranty in my country, while most other brands 36 months.


----------



## Ravenmaster (Sep 10, 2020)

Probably should have removed that drive bay cage; it looks like it's hoisting the end of the card upwards slightly.


----------



## Assimilator (Sep 10, 2020)

Vayra86 said:


> I reported it as LQ, because it's flame bait and nothing else.



Did the same. Tired of fanboys trying to turn every thread into yet another idiotic willy-waving flamewar.


----------



## Xex360 (Sep 10, 2020)

DuxCro said:


> I'll probably get buried by Nvidia fanboys here, but I've been doing some thinking. If, as some people on the net claim, the RTX 3090 is around 20% faster than the RTX 3080, isn't this card basically an RTX 3080 Ti with a bunch of VRAM slapped on it? But this time, instead of charging $1200 for it, Nvidia had this clever idea to call it a Titan replacement card and charge $300 more, knowing that people who were willing to spend $1200 on the previous top card will spend an extra $300 on this one.
> 
> So the RTX 70 segment costs the same $499 again. The RTX 80 segment costs $699 again. But the top segment is now $300 more, with performance gains like those from the 700 to 900 series and from the 900 to 1000 series, which used to cost considerably less. The 1080 Ti was $699.


Yeah, Ampere is a huge improvement over Turing, but it's still expensive; the 1070 cost less than $400 while comprehensively beating the 980 Ti, with more memory.
As for the 3090, I believe it's a niche product. Yields are probably low and the memory is expensive, and maybe it exists just to retain the performance crown against AMD, because judging by the PS5/Series X, a big RDNA2 could end up beating the 3080 (except maybe in RT).


----------



## Caring1 (Sep 10, 2020)

If that top fan is meant to pull the airflow up as previously shown, the blades are designed the wrong way.


----------



## midnightoil (Sep 10, 2020)

DuxCro said:


> I'll probably get buried by Nvidia fanboys here, but I've been doing some thinking. If, as some people on the net claim, the RTX 3090 is around 20% faster than the RTX 3080, isn't this card basically an RTX 3080 Ti with a bunch of VRAM slapped on it? But this time, instead of charging $1200 for it, Nvidia had this clever idea to call it a Titan replacement card and charge $300 more, knowing that people who were willing to spend $1200 on the previous top card will spend an extra $300 on this one.
> 
> So the RTX 70 segment costs the same $499 again. The RTX 80 segment costs $699 again. But the top segment is now $300 more, with performance gains like those from the 700 to 900 series and from the 900 to 1000 series, which used to cost considerably less. The 1080 Ti was $699.



They're going to be virtually unobtainium. Huge die, ultra-low yields, limited wafers initially. Tbh, if they didn't charge this much, the extra margin would just go to gougers.

The only reason you're seeing the 3090 is that NVIDIA are very worried about RDNA2. The 3080 die would have been the 3080 Ti / 3090 if they thought RDNA2 was going to be average / a wet blanket, and the 3090 would not have existed. It's why the whole thing reeks of Frankenstein.


----------



## Vya Domus (Sep 10, 2020)

Caring1 said:


> If that top fan is meant to pull the airflow up as previously shown, the blades are designed the wrong way.



Or it just spins the other way around maybe ?

It's a nonsensical design from the get-go anyway, it's meant to be visually striking. Form over function. You can tell that's the case because despite only dissipating around 20% more power than the 3080, for some reason the heatsink needs to have something like 50% more surface area, which means the airflow through it must be horrid.


----------



## TheLostSwede (Sep 10, 2020)

P4-630 said:


> EVGA only gives a 24 month warranty in my country, while most other brands 36 months.


EVGA isn't worth buying outside of the US, as none of their offers apply outside of the US. They offered their L-shaped power adapter for free if you bought one of their cards. Turns out, if you live outside the US, you have to pay for the postage... and on top of that, they only accepted PayPal, which at the time was impossible to use here. I offered to go to their office here to collect mine, but apparently that wasn't even an option. That's what you call top-notch support. Also, they screwed up the thermal pads on the card I had, so they sent out a replacement kit that you had to swap yourself...  
I will not be buying any more of their products based on my experience with them.


----------



## bug (Sep 10, 2020)

bonehead123 said:


> wHY OH Why did they put the power connector in the middle of the card, thereby creating moar visible cable clutter, especially with the adapter ?
> 
> f.A.i.L....


Because that's where the PCB ends, maybe?


----------



## EarthDog (Sep 10, 2020)

In other news... water is wet.

Silly news article, this... lol


----------



## TheoneandonlyMrK (Sep 10, 2020)

Swoon at the tidy-up of the power cables... Not.


----------



## PerfectWave (Sep 10, 2020)

bonehead123 said:


> wHY OH Why did they put the power connector in the middle of the card, thereby creating moar visible cable clutter, especially with the adapter ?
> 
> f.A.i.L....



IT'S UGLY AS FU .... CK


----------



## phanbuey (Sep 10, 2020)

i like how the drive cage doubles as an anti-sag bracket.  karma win.


----------



## TheLostSwede (Sep 10, 2020)

Caring1 said:


> If that top fan is meant to pull the airflow up as previously shown, the blades are designed the wrong way.


Seems like it's spinning in "reverse" from what one would expect.




https://www.facebook.com/video.php?v=326294165378763


----------



## dgianstefani (Sep 10, 2020)

Vya Domus said:


> Or it just spins the other way around maybe ?
> 
> It's a nonsensical design from the get-go anyway, it's meant to be visually striking. Form over function. You can tell that's the case because despite only dissipating around 20% more power than the 3080, for some reason the heatsink needs to have something like 50% more surface area, which means the airflow through it must be horrid.


Silly comment; the new design is both cooler and quieter while having more heat to dissipate. It isn't like it performs the same.


----------



## Assimilator (Sep 10, 2020)

TheLostSwede said:


> Seems like it's spinning in "reverse" from what one would expect.
> 
> 
> 
> ...



The blade curvature is definitely reversed compared to what we would expect - I would imagine for aesthetics and noise. Remember that a fan pushes and pulls air regardless of its blade orientation.

I'm sure heroes like GN will eventually do an experiment with the blades oriented "the right way" to find out what the impact is.


----------



## milewski1015 (Sep 10, 2020)

dgianstefani said:


> Silly comment, the new design is both cooler and quieter while having more heat to dissipate, it isn't like it performs the same.


According to what, Nvidia's marketing?


----------



## AddSub (Sep 10, 2020)

Kinda like the return of giga-cards. Looks like a 3090 "GTX 295" edition. 

...
..
.


----------



## TheLostSwede (Sep 10, 2020)

AddSub said:


> Kinda like the return of giga-cards. Looks like a 3090 "GTX 295" edition.
> 
> ...
> ..
> .


You getting a full-tower for your next upgrade to go with it?
A classic like this maybe?


http://global.aopen.com/products_detail.aspx?auno=718


----------



## AnarchoPrimitiv (Sep 10, 2020)

Looks like the new Intel NUC extreme card



dgianstefani said:


> Silly comment, the new design is both cooler and quieter while having more heat to dissipate, it isn't like it performs the same.



How do you know that for sure? I'm genuinely asking


----------



## BluesFanUK (Sep 10, 2020)

There should be a wall of shame for all the plebs who bought a 2080 Ti and for the ones considering a 3090. Bonkers money regardless of how flush you are.


----------



## agent_x007 (Sep 10, 2020)

Caring1 said:


> If that top fan is meant to pull the airflow up as previously shown, the blades are designed the wrong way.


The top fan is "scooping" air from the fins (pull), with the blades shaped like "|_" and moving in the "=>" direction. 
It's meant to reduce the chopping sound of the blades.


----------



## Franzen4Real (Sep 10, 2020)

xkm1948 said:


> As clean as that looks, I am still going with EVGA, for the sake of better warranty.


I have only one experience dealing with a GPU RMA, and that was the 2080 Ti FE RAM issue. I can say that the actual process of getting a replacement when dealing directly with Nvidia was well beyond what I had expected. During my phone call to support, they asked me to screenshot the artifacts so that they could confirm. It was about a 10-minute phone call. They then overnighted a brand-new retail-boxed card and a prepaid return shipping label, with instructions to box up my bad card in the same packaging and return it. So there was downtime of about 30 minutes total at most. I would have been happy with just a cross-ship replacement, but they wanted me to wait until the new one arrived before dealing with my old card. For me, defects have been pretty much a non-issue in nearly two decades of PC building, but it is nice to know that in the event of a problem it can be handled with minimal fuss.


----------



## mechtech (Sep 10, 2020)

Maybe the 3090 will come with a new PC case and a 12-pin connector?


----------



## chodaboy19 (Sep 10, 2020)

Are OEMs using a similar PCB? 
I see that a lot of their three-fan cards have that last fan set up as flow-through, similar to the NVIDIA Founders Edition we are seeing today.


----------



## dgianstefani (Sep 10, 2020)

AnarchoPrimitiv said:


> Looks like the new Intel NUC extreme card
> 
> 
> 
> How do you know that for sure? I'm genuinely asking


In the presentation he mentioned that the new cards, despite using more power, run 10°C cooler and I forget how many dB quieter.


----------



## Mysteoa (Sep 10, 2020)

Assimilator said:


> Someone needs to design a long GPU (long enough to hang over the motherboard, like this 3090) and put the power connectors on the bottom, i.e. the same edge that the PCIe connector is on. Then you could just run the needed power cables through the cable-routing holes in the motherboard tray, and no need to worry about hiding those cables.
> 
> My MS Paint skills are woeful but this should give you an idea. Power cables come through the hole indicated by the bottom of the arrow, plug in around where the tip of the arrow indicates.
> 
> View attachment 168243



So you're suggesting they create case-specific GPUs? You know those cable holes are in different places in different cases.


----------



## silkstone (Sep 10, 2020)

Why are they putting it in an ITX board?



Spoiler



J/k


----------



## Chrispy_ (Sep 10, 2020)

Without that hard drive cage to rest it on, that thing would probably just tear the PCIe slot out of the motherboard.


----------



## Fluffmeister (Sep 10, 2020)

Huge and less dusty compared to the competition.


----------



## AddSub (Sep 10, 2020)

TheLostSwede said:


> You getting a full-tower for your next upgrade to go with it?
> A classic like this maybe?
> 
> 
> http://global.aopen.com/products_detail.aspx?auno=718



Got my hands on a Cosmos II 25th Anniversary Edition a month ago, brand new in box, old stock. Haven't used it for anything yet. I have several old Cosmos 1000 cases, but this thing seems much bigger. Might be a base for a 3090 SLI build... not that I play modern games anymore (do user-made levels/WADs for 1993 Doom count? )

...
..
.


----------



## purplekaycee (Sep 10, 2020)

How thick is this card, and what's its breadth?


----------



## TheoneandonlyMrK (Sep 10, 2020)

AddSub said:


> Got my hands on a Cosmos II 25th Anniversary edition month ago, brand new in box, old stock. Haven't used it for anything yet. I have several old Cosmos 1000 cases but this thing seems much bigger. Might be a base for a 3090 SLI build... not that I play modern games anymore (do user made levels/wads for 1993-Doom count? )
> 
> ...
> ..
> .


Good lord, two 3090s to play Doom. I hope you have pro uses for those, or I'm lost for words (that I can actually say; I have many I couldn't).


----------



## AddSub (Sep 10, 2020)

HWbot friends, HWbot. Just took some 3DMark 01 and 03 records the other day. Crazy what you can do with Comet Lake and custom WinXP install tweaked to run on Z490 platform.   

...
..
.


----------



## TheoneandonlyMrK (Sep 10, 2020)

AddSub said:


> HWbot friends, HWbot. Just took some 3DMark 01 and 03 records the other day. Crazy what you can do with Comet Lake and custom WinXP install tweaked to run on Z490 platform.
> 
> ...
> ..
> .


Fair enough. I'm gonna point you to that Folding@Home thing. Get on it, call it stability testing; the team loves new members.


----------



## neatfeatguy (Sep 10, 2020)

_The triple-slot card measures 31.3 cm in length, and is 13.8 cm tall._ 

31.3 cm = 12.32" long
13.8 cm = 5.43" high

The 3090 is about 0.5" shorter in length than my 980 Ti AMP Omega, but stands about 0.2" taller. And she's just as wide: triple slot.

I could drop a 3090 in my CM HAF XB Evo case and I wouldn't have to worry about GPU sag. The case isn't that pretty looking, but I find it great that the motherboard mounts parallel to the ground and my GPU sits on top; no more GPU sag for me.

Then again, the 3090 is a month's car payment, cell phone bill and mortgage... if I didn't have any of those things I could afford one. Who am I kidding? If I didn't have those things the wife would want me to spend my money on other crap we don't need instead.
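For anyone double-checking those numbers, here's a quick sanity check of the cm-to-inch conversions (a throwaway sketch, nothing more):

```python
# Convert the RTX 3090 FE's quoted dimensions from cm to inches.
CM_PER_INCH = 2.54

length_in = 31.3 / CM_PER_INCH
height_in = 13.8 / CM_PER_INCH

print(f"length: {length_in:.2f} in")  # ~12.32 in
print(f"height: {height_in:.2f} in")  # ~5.43 in
```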


----------



## ddarko (Sep 10, 2020)

neatfeatguy said:


> _The triple-slot card measures 31.3 cm in length, and is 13.8 cm tall._
> 
> 31.3 cm = 12.32" long
> 13.8 cm = 5.43" high
> ...



Yeah, the 3090 is a big card, no doubt, but it's not unprecedented. The MSI 1080 Ti Lightning Z that sits in my case is longer (12.6 inches vs 12.3 inches for the 3090) and taller (140 mm/5.5 inches vs 138 mm/5.4 inches). The Lightning Z is not as wide, but though it's spec'ed as a 2-slot card, in reality it's closer to 3 slots.


----------



## halodies (Sep 10, 2020)

but your case is tiny!! what case is that?


----------



## Another-Stin (Sep 10, 2020)

neatfeatguy said:


> Then again, the 3090 is a month's car payment, cell phone bill and mortgage.....if I didn't have any of those things I could afford one. Who am I kidding? If I didn't have those things the wife would want me to spend my money on other crap we don't need instead.



Anyone who has a car payment, student loans, and/or credit card debt should most definitely not be looking at the 3090. They should be getting their finances in order (out of debt and have savings) as their first order of business.  If content creation pays the bills, a case could be made as they could likely expense it for tax purposes. 

It's just sad to see so many people falling into the consumer trap of giving in to marketing rather than thinking for themselves what they actually need and what they can truly afford (not just make the monthly payments like a slave).


----------



## Steevo (Sep 10, 2020)

Rahnak said:


> That’s where the PCB ends.


So the cable couldn't have gone on the other side?


----------



## xman2007 (Sep 10, 2020)

Another-Stin said:


> Anyone who has a car payment, student loans, and/or credit card debt should most definitely not be looking at the 3090. They should be getting their finances in order (out of debt and have savings) as their first order of business.  If content creation pays the bills, a case could be made as they could likely expense it for tax purposes.
> 
> It's just sad to see so many people falling into the consumer trap of giving in to marketing rather than thinking for themselves what they actually need and what they can truly afford (not just make the monthly payments like a slave).


I'm sure he's really grateful for your unwanted/unwarranted financial advice on a tech forum, and as your first post as well... Anything to say in relation to the OP? Or did you just register to share your wisdom on how we should all strive to be? Just an FYI for you: some people work low-paid/minimum-wage jobs; if they didn't, who would be there to work them? Some people have kids to put through school and college, and maybe that's where their savings go, not directly into their back pockets to use as pocket money. Some people can't afford to splash out 30k on a brand new car yet need something more reliable than a 1k tin bucket for their daily commute to work and back, school runs, appointments, days out, etc. But... never mind all of this; his comment was made in jest for the most part. You must be a right barrel of laughs in real life...

What an odd first post....


----------



## Another-Stin (Sep 10, 2020)

xman2007 said:


> What an odd first post....



I wasn't talking to OP specifically, as I don't know OP's situation.
How's posting your opinion any different than me posting mine? 

If someone makes minimum wage and buys a 3090, I just hope they're not going into debt in doing so. It's just so not worth it after experiencing the pain of paying down debts myself.
The fact that you became so defensive on this, though, shows that it must hit home. I was raised in a blue collar family and took on debt to get my Msc in ECE. I understand, from a first-hand perspective, what it's like... And I also understand the freedom of being disciplined enough to live well below your means in order to live a better life tomorrow. I didn't mean to get into a big, philosophical or financial discussion. I've just seen so many posts on "can't wait to pick up a 3090", and I'm just surprised is all. It's not really targeted for gamers, and it seems like most enthusiasts are still on 1440p or 1440p ultrawide anyways, so why buy all that VRAM when there's likely a 3080 (Ti?) 20GB coming next year? Sorry to derail the convo!

Back to tech -- I think the 3090 is an overpriced 3080 Ti, plus I personally wouldn't use all that VRAM. I'm looking forward to tinkering with the 3080, tho!


----------



## GhostRyder (Sep 10, 2020)

Well with a water block I bet it will look tiny


----------



## neatfeatguy (Sep 10, 2020)

xman2007 said:


> I'm sure he's really grateful for your unwanted/unwarrented financial advice on a tech forum and your first post as well...  Anything to say in  relation to the OP? or did you just register to share your wisdom on how we should all strive to be, just an FYI, for you, some people work low paid jobs/minimum wage, if they didn't then who would be there to work them? some people have kids to put through school and college and maybe that's where their savings go and not directly into their back pockets to use as pocket money, some people cant afford to splash out 30k on a brand new car yet need something more reliable than a 1k tin bucket to make their daily commute to work and back, school runs, appointments, days out etc but..... nevermind all of this, his comment was made in jest for the most part, you must be a right barrell of laughs in real life....
> 
> What an odd first post....



If that's how you really feel, then maybe I won't share my secret to financial security. (don't mind the sarcasm, I'm putting it on pretty thick)



Spoiler



Nope, not gonna do it.


Spoiler



Okay, I lied:


----------



## TheoneandonlyMrK (Sep 10, 2020)

GhostRyder said:


> Well with a water block I bet it will look tiny


It looks the same as the 3080, apparently. I can't escape the thought that these coolers were a late decision designed to polish a turd, personally, but reviews will be here soon, and though some will say "just buy it" like they did with the 2080, I will wait until the dust settles.
That middle connector doesn't sit right with me.


----------



## xman2007 (Sep 10, 2020)

Another-Stin said:


> I wasn't talking to OP specifically, as I don't know OP's situation.
> How's posting your opinion any different than me posting mine?
> 
> The fact that you became so defensive on this, though, shows that it hits home. I was raised in a blue collar family and took on debt to get my Msc in ECE. I understand, from a first-hand perspective, what it's like... And I also understand the freedom of being disciplined enough to live well below your means in order to live a better life tomorrow. I didn't mean to get into a big, philosophical or financial discussion. I've just seen so many posts on "can't wait to pick up a 3090", and I'm just surprised is all. It's not really targeted for gamers, and it seems like most enthusiasts are still on 1440p or 1440p ultrawide anyways, so why buy all that VRAM when there's likely a 3080 (Ti?) 20GB coming next year? Sorry to derail the convo!
> ...


No, you quoted someone who was replying to a thread about a GPU. I didn't say you were specifically replying to the OP; I have eyes, I can see who you quoted. I just find it odd that you chose to register on a tech forum and your first post completely ignores the OP; instead you start talking bollocks about a random comment that was made tongue in cheek and make that the focus of your first post, your introduction to TPU. I find it odd, do you not? Maybe it's just me...


----------



## ppn (Sep 10, 2020)

Waiting for Optimum Tech to post his mITX build with a full-cover water block, single slot. Yeah, now that the USB port is gone, the card is a real 1-slot. Beauty. The PCB may look like a square fish, but still.

As for the waste traps, it's a head game. If I spend more than $1 a day on food, it's already too much. Just by staying home I save $10, avoiding the traps, so the 3080 will pay for itself in 2 months.

The more you spend, the more I save. By "you" the CEO means all of us: some of us will spend, but at the same time most of us will save.

And before you know it, the 4070 will be out in 18 months with a smaller die and lower power. The 3090 is just another Titan, to be replaced by a 70-class card 18 months later.
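The payback math there roughly checks out; a back-of-the-envelope sketch, assuming the 3080's $699 MSRP from earlier in the thread and the $10/day savings figure:

```python
# Days of staying-home savings needed to cover a $699 RTX 3080,
# using the $10/day figure from the post above.
card_price = 699
savings_per_day = 10

days = card_price / savings_per_day   # 69.9 days
months = days / 30                    # ~2.3 months

print(f"{days:.0f} days, about {months:.1f} months")
```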


----------



## Another-Stin (Sep 10, 2020)

xman2007 said:


> choose to make that the focus of your first post, your introduction to TPU   I find it odd, do you not?Maybe it's just me...


Maybe you're right... I should care more about my inaugural post to TPU. I hope to climb the ranks of the TPU members and become the most renowned member of all! (/s)

I had a thought, signed up, and posted it. Simple as that. Wish you well!


----------



## GhostRyder (Sep 10, 2020)

theoneandonlymrk said:


> It looks the same as the 3080 apparently, I can't escape the thought these cooler's were a late decision designed to polish a turd personally but reviews will be here soon, and though some will say just buy it, they did a 2080 I will wait until the dust settles.
> That middle connection doesn't sit right with me.


Oh, I agree, I really don't like that either. I wish they'd at least included an adapter that extends it, so if you use the air cooler it's normal, and if you decide to water-cool it you could take it off and have the connector like it is now.

I was just thinking, though: with these PCBs being small, it will be interesting to see underneath.


----------



## xman2007 (Sep 10, 2020)

Another-Stin said:


> Maybe you're right... I should care more about my inaugural post to TPU. I hope to climb the ranks of the TPU members and become the most renowned member of all! (/s)
> 
> I had a thought, signed up, and posted it. Simple as that. Wish you well!


Do you register on car forums and talk about football in a thread about inlet manifolds? I mean, I'm not sure what you don't get, but OK buddy, welcome to TPU.


----------



## Kissamies (Sep 10, 2020)

Nice thing for all the mITX owners. 



Rahnak said:


> That’s where the PCB ends.


That's not a problem if we consider how the reference GTX 1060 had its power connector.


----------



## Another-Stin (Sep 10, 2020)

xman2007 said:


> Do you register on car froums and talk about football on a thread about inlet manifolds?


If the CTR launched at $60k, Supra @ $80k, Mustang GT @ $70k, and people were just like "yeah, can't wait to pick one up!", I'd probably jump in the convo and be like.. "why is the collective cool with this price/performance ratio?" I didn't just jump in here and start talking about a different topic, it was just related to price/perf. Lol


----------



## agent_x007 (Sep 10, 2020)

Chloe Price said:


> That's not a problem if we think how GTX 1060 reference had its power connector.


And we all know how that ended... also, this is a $1,500 card.


----------



## freeagent (Sep 10, 2020)

Looks a tad longer than a triple-slot Fermi.


----------



## Kissamies (Sep 10, 2020)

agent_x007 said:


> And we all know how that ended... also, this is a $1,500 card.


Well, they used a similar solution on the reference RTX 2060(S) and 2070 at least, too.


----------



## bobbybluz (Sep 10, 2020)

It's still smaller than a 6990 with an aftermarket cooler on it.


----------



## AddSub (Sep 10, 2020)

Just pre-ordered three RTX 3090 cards; turns out only two-card SLI is supported, oh well. Kids are crying in the corner because of no food. Car getting repossessed! Am I doing this right? 

...
..
.


----------



## bug (Sep 10, 2020)

AddSub said:


> Just pre-ordered three RTX 3090 cards; turns out only two-card SLI is supported, oh well. Kids are crying in the corner because of no food. Car getting repossessed! Am I doing this right?
> 
> ...
> ..
> .


It depends. You livin' in a shack yet?


----------



## Fluffmeister (Sep 10, 2020)

bug said:


> It depends. You livin' in a shack yet?



I'm just sad to hear his Veyron is getting repossessed.


----------



## bug (Sep 10, 2020)

Fluffmeister said:


> I'm just sad to hear his Veyron is getting repossessed.


It's the Chiron these days. Keep up.


----------



## Rahnak (Sep 10, 2020)

Steevo said:


> So the cable couldn't have gone on the other side?


Too close to the PCIe slot and probably would've been too squished between GPU and motherboard.



Chloe Price said:


> That's not a problem if we think how GTX 1060 reference had its power connector.


I'm sure they considered that option. If they didn't go for it, they probably had a reason to. Don't forget that unlike the 1060, this is a 350W card.


----------



## bug (Sep 10, 2020)

Rahnak said:


> Too close to the PCIe slot and probably would've been too squished between GPU and motherboard.
> 
> 
> I'm sure they considered that option. If they didn't go for it, they probably had a reason to. Don't forget that unlike the 1060, *this is a 350W card*.


I'm actually really looking forward to seeing how much this card actually draws under various conditions (RTX, non-RTX, compute).


----------



## Voidxearo (Sep 10, 2020)

I would be surprised if it looked huge in my Tower 900 build


----------



## xman2007 (Sep 10, 2020)

From the NVIDIA presentation regarding Ampere, it seems there is no IPC or architectural increase compared to Turing; in fact, the "huge" performance increase seems to have come from doubling down on the shader units, not from an architectural standpoint. Obviously there are power consumption benefits, possibly from moving from 14 nm to Samsung 8 nm (aka 10 nm), but aside from that, if Ampere had the same number of shaders as Turing, there would likely be virtually no difference in performance, ray tracing excluded of course.


----------



## bug (Sep 10, 2020)

xman2007 said:


> From the NVIDIA presentation regarding Ampere, it seems there is no IPC or architectural increase compared to Turing; in fact, the "huge" performance increase seems to have come from doubling down on the shader units, not from an architectural standpoint. Obviously there are power consumption benefits, possibly from moving from 14 nm to Samsung 8 nm (aka 10 nm), but aside from that, if Ampere had the same number of shaders as Turing, there would likely be virtually no difference in performance, ray tracing excluded of course.


Is it bad if Ampere is Turing with beefed up RT and tensor cores?


----------



## John Naylor (Sep 11, 2020)

Looks very much like my lower radiator.


----------



## xman2007 (Sep 11, 2020)

bug said:


> Is it bad if Ampere is Turing with beefed up RT and tensor cores?


Not at all, I'm just expressing an opinion. There don't seem to be any IPC or architectural improvements; yes, there is more performance, but that seems to have been brought about by the shader increase and not by a newer/more refined architecture. Still, 2080 SLI performance in a single GPU is a massive deal, and it costs less than half of what it took the previous gen to achieve, so it's still a win. 

In fact, if anything they should be lauded for their power efficiency, as it's Turing IPC and shaders with 1/3 less power consumption, probably down to the node shrink and more conservative core/boost clocks, plus, like you said, "beefed up" RT and tensor cores.


----------



## AddSub (Sep 11, 2020)

What's the ROP count on these?

...
..
.


----------



## windwhirl (Sep 11, 2020)

AddSub said:


> What's the ROP count on these?
> 
> ...
> ..
> .







Ignore the ones that have just "2020" as Release date. Those are just placeholders.


----------



## SIGSEGV (Sep 11, 2020)

looks ugly.


----------



## AsRock (Sep 11, 2020)

bonehead123 said:


> wHY OH Why did they put the power connector in the middle of the card, thereby creating moar visible cable clutter, especially with the adapter ?
> 
> f.A.i.L....





Rahnak said:


> That’s where the PCB ends.



And they did not want to use glue again, even more so on the 3080.


----------



## Cosmocalypse (Sep 11, 2020)

DuxCro said:


> I'll probably get buried by Nvidia fanboys here, but I've been doing some thinking. If by some people on the net RTX 3090 is around 20% faster than RTX 3080, isn't this card basically an RTX 3080Ti with a bunch of VRAM slapped on it?  But this time instead of charging $1200 for it, Nvidia had this clever idea to call it a Titan  replacement card and charge +$300 knowing that people who were willing to spend $1200 on previous top card will spend extra $300 on this card.
> 
> So RTX 70 segment costs the same $499 aagain. RTX 80 segment costs $699 again. But the top segment is now + $300.  Performance gains being as from 700 to 900 series and from 900 to 1000 series. Which used to cost considerably less. 1080Ti was $699



It is a Titan replacement. This is always how they do it. Releasing the Ti version at the same time as the regular card is something that normally doesn't happen (though it did with the 20xx series). In 6 months they'll release a slightly cut-down 3090 with less VRAM and that will be the 3080 Ti. They charge a high price for the latest tech and those capable will buy it as early adopters. Like the Titan, it's really a workstation card. Gamers should go for the 3080.


----------



## lemoncarbonate (Sep 11, 2020)

I don't know; at first glance the card looks clean, but I'm not digging it when it's installed. Maybe the fan and power connector placement ruin the otherwise clean aesthetic. Just my opinion. I'm more into a classic look, with a plain backplate and the fans facing down.

FE cards are never sold in my country anyway, though.


----------



## Rob94hawk (Sep 11, 2020)

bonehead123 said:


> wHY OH Why did they put the power connector in the middle of the card, thereby creating moar visible cable clutter, especially with the adapter ?
> 
> f.A.i.L....



Are you buying the 3090 to look pretty or to give you incredible 4k fps?

I'll take awesome fps in 4k for $1499 Alex.


----------



## Jcguy (Sep 11, 2020)

DuxCro said:


> I'll probably get buried by Nvidia fanboys here, but I've been doing some thinking. If by some people on the net RTX 3090 is around 20% faster than RTX 3080, isn't this card basically an RTX 3080Ti with a bunch of VRAM slapped on it?  But this time instead of charging $1200 for it, Nvidia had this clever idea to call it a Titan  replacement card and charge +$300 knowing that people who were willing to spend $1200 on previous top card will spend extra $300 on this card.
> 
> So RTX 70 segment costs the same $499 aagain. RTX 80 segment costs $699 again. But the top segment is now + $300.  Performance gains being as from 700 to 900 series and from 900 to 1000 series. Which used to cost considerably less. 1080Ti was $699



Why are you worried about what people are willing to spend? If you can't afford, don't buy it. That simple.


----------



## zo0lykas (Sep 11, 2020)

EVGA gives 3 years, and another 2 if you register the item, so 5 in total. 





P4-630 said:


> EVGA only gives a 24 month warranty in my country, while most other brands 36 months.


----------



## agent_x007 (Sep 11, 2020)

xman2007 said:


> From the NVIDIA presentation regarding Ampere, it seems there is no IPC or architectural increase compared to Turing; in fact, the "huge" performance increase seems to have come from doubling down on the shader units, not from an architectural standpoint. Obviously there are power consumption benefits, possibly from moving from 14 nm to Samsung 8 nm (aka 10 nm), but aside from that, if Ampere had the same number of shaders as Turing, there would likely be virtually no difference in performance, ray tracing excluded of course.


Arch changes are kinda like Sandy/Ivy Bridge vs. Haswell (if you like IPC comparisons).
You get more execution hardware (AVX2), with more cache bandwidth to not starve it.


----------



## BlackWater (Sep 11, 2020)

The amount of VRAM on the 3090 made me think on the following:

Let's assume most enthusiasts right now game at 1440p/144 Hz, and that Nvidia is doing a strong push for 4K/144 Hz. OK, so far, so good. But even then, we know that 4K doesn't need 24GB of VRAM. They say the card is capable of "8K", but this is with DLSS upscaling, so we are not talking about actual 8K native resolution rendering. Regardless of IPC improvements or not, I absolutely don't believe we have the processing power to do 8K yet, and even if we did... We're gonna do 8K on what exactly? After all, this is a PC GPU - how many people are going to attach this to a gigantic 8K TV? And let's not even mention ultra-high resolution monitors, the very small amount of them that exist are strictly professional equipment and have 5 figure prices...

So, considering that 1440p is 3.7 Mpixels, 4K is 8.3 Mpixels and 8K is 33.2 Mpixels, perhaps a more realistic application for the 3090 is triple-monitor 1440p/4K @ 144 Hz? 3x 1440p is 11.1 Mpixel, which is slightly above one 4K display's resolution, so it shouldn't have any trouble driving it and with DLSS, triple 4K is about 25 Mpixel, which seems somewhat possible - perhaps then the 24 GB VRAM would come into play? 
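(For what it's worth, the megapixel figures above check out; here's a quick sketch, assuming the standard 2560x1440, 3840x2160 and 7680x4320 resolutions:)

```python
# Quick sanity check of the megapixel figures above, assuming the
# standard 2560x1440 / 3840x2160 / 7680x4320 resolutions.
def megapixels(width, height, monitors=1):
    """Total pixels driven across all monitors, in megapixels."""
    return monitors * width * height / 1e6

print(f"1440p:        {megapixels(2560, 1440):.1f} MP")    # 3.7
print(f"4K:           {megapixels(3840, 2160):.1f} MP")    # 8.3
print(f"8K:           {megapixels(7680, 4320):.1f} MP")    # 33.2
print(f"Triple 1440p: {megapixels(2560, 1440, 3):.1f} MP") # 11.1
print(f"Triple 4K:    {megapixels(3840, 2160, 3):.1f} MP") # 24.9
```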

But even then, where are the 4K monitors? At the moment the choice is very limited, and let's be honest, 4K on a 27" panel makes no sense, and the few monitors at 40+" don't really work for a triple-monitor setup either. So either a wave of new 4K/144 Hz monitors at about 30" is coming or... the 3090 doesn't really make much sense at all. And I'm not even talking about the price here; it's irrelevant. The question is: why does the 3090 actually exist, and what is the actual application for it? Titan replacement or not, Nvidia is strongly pushing the card as a gaming product, which is all fine, but I fail to see the scenario where the 24 GB of VRAM is relevant to gaming. Regardless, in about 2 weeks, benchmarks will tell us all we need to know.


----------



## sk8er (Sep 11, 2020)

I want a $799/$849 3090 with 12 GB, at 3080 FE size with a 2-slot cooler (for an ITX build). With half the VRAM, a cheaper 3090 with 12 gigs would be just enough for me, for a 49" 32:9 1080p or a triple 16:9 1080p/1440p racing setup.

I can wait till Nov/Dec, after all the Navi cards and the 3070 16 GB / 3080 20 GB are released, to decide.


----------



## havox (Sep 11, 2020)

BlackWater said:


> But even then, where are the 4K monitors - at the moment the choice is very limited, and let's be honest, 4K on a 27" panel makes no sense, and there are a few monitors at 40+", which again, for a triple monitor setup doesn't really work either. So, either a wave of new 4K/144 Hz monitors at about 30"


40" is also kind of small for 4K. At the distance I'm sitting 55" hits the right spot for me.
-t owner of LG OLED55C9 TV


----------



## P4-630 (Sep 11, 2020)

zo0lykas said:


> EVGA gives 3 years, and another 2 if you register the item, so 5 in total.


----------



## BlackWater (Sep 11, 2020)

havox said:


> 40" is also kind of small for 4K. At the distance I'm sitting 55" hits the right spot for me.
> -t owner of LG OLED55C9 TV



Yeah, true, but in your use case, do you use the TV more as a TV, or more as a monitor? I'm wondering from the standpoint of someone hooking the 3090 to a monitor or multiple monitors on a desk, in which case viewing distance would be... about 0.5 to 1 meters, perhaps? I currently have a 27" 1440p monitor and I sit about 60-70 cm from it, and then I was looking at monitors like the Asus PG43UQ - great specs on paper, but it's pretty much the size of a TV, and I just can't see myself, or anyone really, hooking up 3 of those on a desk... You'd have to sit quite far so that you would literally have to constantly turn your head left and right, and at that point, you're better off just getting a large TV and using it from a distance, as you'd use a TV normally.

So that's what I was thinking originally, if we assume that the 3090 is targeted towards multi-monitor 4K gaming at 120-144Hz, where are the monitors suitable for that? IMO, the ideal desk implementation of 4K is 30-34", and pretty much all the ones available, that we can call 'gaming' monitors are 40+", and some are even straight up TVs without the TV tuner (the BFGDs)... So I kind of don't really get what exactly the 3090 is supposed to do. If you want to do big screen gaming in the living room, the 3080 can easily do that (allegedly), so then what even is the purpose of the 3090? It's not professional or scientific research for sure, since Nvidia is pretty much pushing it as the top-end gaming card. But to me it seems that if you try to figure out what it's supposed to do, it just comes out as a slightly bigger chip than the 3080 with a strangely large amount of VRAM, just so Nvidia can say "look what we can do, lol".


----------



## havox (Sep 11, 2020)

I'm using it mainly as a PC monitor for games; I don't watch that many movies. 0.5-1 m sounds about right. It's great both in fullscreen mode for shooters and for running other genres like RPGs or MMOs in a 2K window while also having a browser and some other stuff open.

I'm personally fine with Nvidia rebranding big chungus Ampere from Titan to an xx90. I had issues with the Titan series: reference coolers only meant it got outperformed by non-reference-cooler xx80 Ti's at half the price, and it was also available only from the Nvidia store, which doesn't operate in my country.
But here I can just buy a non-reference triple-slot monstrosity and have peace of mind for a couple of years that, for all the unoptimized Ubisoft garbage, I can set all the graphics sliders to the right and get a minimum of 4K 60 FPS, something the 2080 Ti was not capable of. And if it has twice the memory I will ever need for 4K gaming... I can live with that.


----------



## Gungar (Sep 11, 2020)

DuxCro said:


> I'll probably get buried by Nvidia fanboys here, but I've been doing some thinking. If by some people on the net RTX 3090 is around 20% faster than RTX 3080, isn't this card basically an RTX 3080Ti with a bunch of VRAM slapped on it?  But this time instead of charging $1200 for it, Nvidia had this clever idea to call it a Titan  replacement card and charge +$300 knowing that people who were willing to spend $1200 on previous top card will spend extra $300 on this card.
> 
> So RTX 70 segment costs the same $499 aagain. RTX 80 segment costs $699 again. But the top segment is now + $300.  Performance gains being as from 700 to 900 series and from 900 to 1000 series. Which used to cost considerably less. 1080Ti was $699



I don't understand your question. The RTX 3090 is the 3080 Ti; they changed the name to 3090, maybe because they need the Ti moniker for something else.


----------



## Pumper (Sep 11, 2020)

They did not even try to make an unobtrusive cable adapter.


----------



## bug (Sep 11, 2020)

Pumper said:


> They did not even try to make an unobtrusive cable adapter.


What would that look like?


----------



## havox (Sep 11, 2020)

Pumper said:


> They did not even try to make an unobtrusive cable adapter.


It's not a Titan, AIBs got you covered with their 3 fan RGB clown editions xD




----------



## ratirt (Sep 11, 2020)

Not sure why people think this card is a Titan. It is called 3090, not Titan, for one thing, and second, the Titans have always had the full chip enabled and this one doesn't (well, most of them; there was that Titan X / Titan Xp stuff NV pulled in the 1080 Ti era).


----------



## bug (Sep 11, 2020)

ratirt said:


> Not sure why people think this card is a Titan. It is called 3090, not Titan, for one thing, and second, the Titans have always had the full chip enabled and this one doesn't (well, most of them; there was that Titan X / Titan Xp stuff NV pulled in the 1080 Ti era).


Well, it was confusing with Turing, too, with the 2080 Ti being so similar to the Titan (save for the price, but even then, they were both in "crazy" territory).

Neither card serves any purpose, except letting Nvidia tell AMD: see? whatever you can do, we can build two more tiers on top. Halo products through and through.


----------



## EarthDog (Sep 11, 2020)

ratirt said:


> Not sure why people think this card is a Titan. It is called 3090, not Titan, for one thing, and second, the Titans have always had the full chip enabled and this one doesn't (well, most of them; there was that Titan X / Titan Xp stuff NV pulled in the 1080 Ti era).


We've gone over this (in multiple threads) and why some (me) feel this way. There may be a Titan, but NV literally talked about "Titan class" and in the same breath he took the 3090 out of his fancy-ass oven. 

Also, just before that, they state verbally AND on the slide that the 3080 is the "FLAGSHIP". Either one will come out later (doubtful to me - why, considering what the 3090 is?) or they are ditching the Titan name. Why, if the 3090 isn't a Titan replacement (as he implied in the video), is the 3080 the flagship? What is the 3090 then?


----------



## Vayra86 (Sep 11, 2020)

Another-Stin said:


> I wasn't talking to OP specifically, as I don't know OP's situation.
> How's posting your opinion any different than me posting mine?
> 
> If someone makes minimum wage and buys a 3090, I just hope they're not going into debt in doing so. It's just so not worth it after experiencing the pain of paying down debts myself.
> ...



It's a tech forum.

Tech: you get enthusiasts here, so a much higher percentage want the biggest hardware they can find. Useful? Dude, it's big. It's the same reason people pimp cars, ride motorcycles, etc.: for fun.
Forum: you get 90% lies and BS fed to you. The better half saying they'll buy every next product is just that.


EarthDog said:


> We've gone over this (in multiple threads) and why some (me) feel this way. There may be a Titan, but NV literally talked about "Titan class" and in the same breath he took the 3090 out of his fancy-ass oven.
> 
> Also, just before that, they state verbally AND on the slide that the 3080 is the "FLAGSHIP". Either one will come out later (doubtful to me - why, considering what the 3090 is?) or they are ditching the Titan name. Why, if the 3090 isn't a Titan replacement (as he implied in the video), is the 3080 the flagship? What is the 3090 then?



Simple. A graphics card

/thread


----------



## Assimilator (Sep 11, 2020)

BluesFanUK said:


> There should be a wall of shame for all the plebs who bought a 2080ti and for the ones considering a 3090. Bonkers monkey regardless of how flush you are.



Last time I checked, people are allowed to spend their hard-earned money the way they want. Jealousy makes you nasty.



EarthDog said:


> We've gone over this (in multiple threads) and why some (me) feel this way. There may be a Titan, but NV literally talked about "Titan class" and in the same breath he took the 3090 out of his fancy-ass oven.
> 
> Also, just before that, they state verbally AND on the slide that the 3080 is the "FLAGSHIP". Either one will come out later (doubtful to me - why, considering what the 3090 is?) or they are ditching the Titan name. Why, if the 3090 isn't a Titan replacement (as he implied in the video), is the 3080 the flagship? What is the 3090 then?



Why does it matter what it's called, FFS?


----------



## EarthDog (Sep 11, 2020)

Assimilator said:


> Why does it matter what it's called, FFS?


You've confused my simple response to a question with someone who GAF, methinks. 

Ask the others.


----------



## HenrySomeone (Sep 11, 2020)

BoboOOZ said:


> I actually agree with most of what's been said in that video and I remember well when many of those things happened, and I still don't like it one bit.
> 
> But in this case, I think there's no need to call out the conspiracy yet, wait for Gamer's Nexus review of how that design affects thermals of different case setups. I'm pretty sure for many cases the outcome will be positive (improving the airflow in the case, although outputting more heat).


If anyone should be displeased, it would have to be Intel owners because let's face it - that's what 95%+ of buyers of this bad boy are going to pair them with; the same as with 2080Ti, but obviously even more so, due to the much increased performance - you don't want a (notable in many, massive in some cases) cpu bottleneck in every other game...


----------



## TheoneandonlyMrK (Sep 11, 2020)

HenrySomeone said:


> If anyone should be displeased, it would have to be Intel owners because let's face it - that's what 95%+ of buyers of this bad boy are going to pair them with; the same as with 2080Ti, but obviously even more so, due to the much increased performance - you don't want a (notable in many, massive in some cases) cpu bottleneck in every other game...


You realise not everyone would buy a 3090 or 3080 to game at 1080p; 1440p/4K is where these cards are aimed, making the CPU disparity negligible at best.
1080p wins mean absolutely nothing to me, for example, and at 4K most CPUs are equal, ATM.

And why would any of this anger Intel owners? Probably 99% couldn't give a rat's ass; in reality, the 1% of enthusiasts are not the norm.


----------



## HenrySomeone (Sep 11, 2020)

Joke's on you, bud - the first of the above graphs is 1440p... with a 2080 Ti, and everywhere there are already notable differences at that resolution with a top Turing card, there are now going to be some even at 4K with the Ampere champion. The only case where those wouldn't matter much is if you have a 4K 60 Hz display AND don't intend to upgrade to a higher-refresh-rate one anytime soon, but I suspect most future 3090 owners will...


----------



## TheoneandonlyMrK (Sep 11, 2020)

HenrySomeone said:


> Joke's on you bud - the first of the above graphs is 1440p....with a 2080Ti and everywhere there are already notable differences at that resolution with a top Turing card, there are now going to be even at 4k with the Ampere champion. The only case where those wouldn't matter much is if you have a 4k 60Hz display AND don't intend to upgrade to a higher refresh rate one anytime soon, but I suspect most future 3090 owners will...


Yes, the niche will indeed have a niche. They are no more than a small slice of the 1% of enthusiasts.

4K120 is appealing, tbf, but expensive. Do you believe lots of people have a few thousand to spend on a GPU and monitor?

Hardware Unboxed tested the Radeon VII, 1080 Ti and 2080 Ti with present drivers recently; go check it out. Yes, the 2080 Ti wins them all, but there are instances where it gets beaten by the 1080 Ti.

It simply wasn't as good as some hoped, and certainly wasn't anything like the value some wanted.

We all want our games to run and look as good as possible, but that is measured against cost for nearly everyone; the next guy saying £1500 is nothing should note that 98% of readers of that opinion disagree.


----------



## Toxicscream (Sep 11, 2020)

I will put it inside a Corsair 1000D and it will look normal size.


----------



## phanbuey (Sep 11, 2020)

If the fan is meant to pull air through that card, then they mounted the blades backwards. That is a fan oriented to push air through... if it spins the other way to pull air, then it's definitely mounted backwards.


----------



## Caring1 (Sep 11, 2020)

HenrySomeone said:


> If anyone should be displeased, it would have to be Intel owners because let's face it - that's what *95%+ of buyers* of this bad boy are going to pair them with...


I'm calling Troll post.


----------



## HenrySomeone (Sep 11, 2020)

To give you just one example (that you can quickly start checking out) - out of several dozen users on this forum who have a 2080Ti listed as their gpu (or one of their gpus) I think there is only 1 who has it paired with Ryzen of any kind. Now, 3090 will be at least around 50% faster and trust me, people who dish out 1.5 - 1.8k $ for a top of the line gpu don't want to see it being bottlenecked by an inferior cpu, especially not one that is just a couple bucks cheaper than the competing, faster option.


----------



## John Naylor (Sep 11, 2020)

theoneandonlymrk said:


> You realise not everyone would buy a 3090 or 3080 to game at 1080p; 1440p/4K is where these cards are aimed, making the CPU disparity negligible at best.
> 1080p wins mean absolutely nothing to me, for example, and at 4K most CPUs are equal, ATM.



Until now... frankly 4k did not provide the user experience that 1440p did, at least to my eyes.  I didn't see the logic to doing 4k until it can do 1440p in ULMB @ 120 hz or better.  I think we will see a drop on the relative market saturation of the xx60 versus xx70 versus xx80 as more people will drop a tier because they are not gaming at 4k..... only 2.24%  of steam users are at 4k while 2 out of every 3 are still at 1080.  Only 6.6% are at 1440p

1024 x 768    0.35%
1280 x 800    0.55%
1280 x 1024    1.12%
1280 x 720    0.35%
1360 x 768    1.51%
1366 x 768    9.53%
1440 x 900    3.05%
1600 x 900    2.46%
1680 x 1050    1.83%
*1920 x 1080    65.55%*
1920 x 1200    0.77%
*2560 x 1440    6.59%*
2560 x 1080    1.14%
3440 x 1440    0.90%
*3840 x 2160    2.24%*
Other    2.07%

The thing to consider tho is the same that we have always had to consider .... the relative performance depends on what you are measuring.  We've always had the argument whereby both sides proved they were right simply by choosing what to test and which games to test with.

RAM Speed Doesn't Matter: 

Look at average fps and you could show that in most cases it didn't matter, and that was because the GPU was the bottleneck.  But look at other things such as:

a) In certain games, like STRIKER and F1, RAM speed did matter.
b) Go with SLI / CF and RAM speed mattered.
c) Look at min. fps and RAM speed mattered.

This was because the GPU was the bottleneck at average fps ... in other situations it was not.

RAM Quantity Doesn't Matter as long as ya have xx GB: 

Look at average fps and you could show that in most cases it didn't matter, and that was because the GPU was the bottleneck.  But look at other things such as:

a) In certain games, RAM quantity did matter.
b) Go with SLI / CF and RAM quantity mattered.
c) Look at min. fps and RAM quantity mattered.

Again, this was because the GPU was the bottleneck at average fps ... in other situations it was not.


At 1440p / 4K, CPU Performance Doesn't Matter:

Yes, if you are a gamer, there's hardly a reason to buy anything more than the Intel 10400 at any resolution and, as you say... the differences lessen at higher resolutions.  However, that is not universal:



https://tpucdn.com/review/intel-core-i5-10400f/images/relative-performance-games-1920-1080.png

https://tpucdn.com/review/intel-core-i5-10400f/images/relative-performance-games-38410-2160.png


A 9900k paired with a 2080 Ti delivered 49.1 average fps (41.0 @ 99th percentile)  at 1440p in MS Flight Simulator Ultra Settings
A ($500 MSRP) 3900X paired with a 2080 Ti delivered 43.9 average fps (34.5 @ 99th percentile)  at 1440p in MS Flight Simulator Ultra Settings

That's a performance difference of 12% (19%  @ 99th percentile)
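(Those percentages are just the ratio of the two fps figures; a quick sketch of the arithmetic:)

```python
# Reproducing the percentage-difference arithmetic from the
# MS Flight Simulator numbers above (9900K vs. 3900X, 2080 Ti, 1440p Ultra).
def pct_faster(a, b):
    """How much faster `a` is than `b`, in percent."""
    return (a / b - 1) * 100

print(f"average fps:  {pct_faster(49.1, 43.9):.0f}%")  # 12%
print(f"99th pct fps: {pct_faster(41.0, 34.5):.0f}%")  # 19%
```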

Point here being that, while average fps deserves its status as the "go-to" yardstick for relative performance between cards, as the average-to-minimum ratio will most times be proportional, the relative performance of CPUs and RAM can be significantly impacted by other factors.  This is also true with VRAM... at 1080p, you will rarely see a difference between a 3 GB / 4 GB card or a 4 / 8 GB card... but if you're big on Hitman and Tomb Raider... it will be something to consider.  Witcher 3 and most other games showed less than a 5% performance difference, and that can be attributed solely to the 11% shader count difference.

Would love to see TPU include 99th percentile numbers in its reviews... would also like to see overclocked performance.  But since most sites that provide this info only test a handful of games, it's asking a bit much to do this with TPU's 18-23 game test suite.


----------



## TheoneandonlyMrK (Sep 11, 2020)

John Naylor said:


> Until now... frankly 4k did not provide the user experience that 1440p did, at least to my eyes.  I didn't see the logic to doing 4k until it can do 1440p in ULMB @ 120 hz or better.  I think we will see a drop on the relative market saturation of the xx60 versus xx70 versus xx80 as more people will drop a tier because they are not gaming at 4k..... only 2.24%  of steam users are at 4k while 2 out of every 3 are still at 1080.  Only 6.6% are at 1440p
> 
> 1024 x 768    0.35%
> 1280 x 800    0.55%
> ...


Do you have templates?
I'm not overplaying 4K, but you are underplaying it. I have enjoyed both it and 1080p 144 Hz, but plain 1080p less so.


----------



## moproblems99 (Sep 12, 2020)

Nothing is big in a Tower 900.  I fear not.



John Naylor said:


> Other 2.07%



Wtf is other?  640x480?


----------



## bug (Sep 12, 2020)

moproblems99 said:


> Nothing is big in a Tower 900.  I fear not.
> 
> 
> 
> Wtf is other?  640x480?


Probably custom resolutions.
But hey, I was playing Test Drive in CGA and 320x200


----------



## aQi (Sep 12, 2020)

Feeling nostalgia.

Courtesy of Nvidia when they released the 8800 Ultra.


----------



## Vayra86 (Sep 12, 2020)

John Naylor said:


> Until now... frankly 4k did not provide the user experience that 1440p did, at least to my eyes.  I didn't see the logic to doing 4k until it can do 1440p in ULMB @ 120 hz or better.  I think we will see a drop on the relative market saturation of the xx60 versus xx70 versus xx80 as more people will drop a tier because they are not gaming at 4k..... only 2.24%  of steam users are at 4k while 2 out of every 3 are still at 1080.  Only 6.6% are at 1440p
> 
> 1024 x 768    0.35%
> 1280 x 800    0.55%
> ...



I think what's most telling is that despite laptops dying left and right (3-5 years is being generous for midrangers and below) and 720p being very much yesterday's mainstream, *we still have 50% more people on 1366x768 than on 1440p*.

CPU matters. There is no question, and it's apparently still the last bastion for Intel if you want high refresh rates. I mean, we can walk past those 1440p benches as if nothing happened, but realistically... 4K is nothing and 1440p is for gaming, the next "1080p mainstream station"... You can safely ignore 4K for gaming even with a 3090 out, and the tiny subset that uses the resolution can still drop to something lower (monitor sales won't tell the whole story), but it never happens the other way around (1080p owners won't DSR 4K, what's the point). As GPUs get faster, the importance of the fastest possible CPU does increase again. AMD is going to have to keep pushing hard on that to keep pace, and it's still a meaningful difference in many places with Intel.

Will the consoles drive that much-desired push to 4K content? I strongly doubt it. It's a picture on a TV in the end, not any different from 720, 1080 or 1440p. Sit back a bit and you won't even notice the difference.

1080p is as relevant as it's ever been and will remain so for the foreseeable future. Resolution upgrades past this point ALWAYS involve a monitor *diagonal* upgrade to go with them. A large number of gamers just don't have the desire, the space or the need for anything bigger. And the advantages of sticking to this "low" res are clear: superb performance at the highest possible detail levels, with relatively cheap GPUs, and an easy path to a fixed 120 fps, which is absolutely glorious. I'll take a fast monitor over a slow-as-molasses TV with shit color accuracy any day of the week, regardless of diagonals. Once OLED can be transplanted to a monitor with good endurance, I'll start thinking differently. Until then, we're being sold ancient crap with lots of marketing sauce, mostly, and it's all a choice of evils.



theoneandonlymrk said:


> You realise not everyone would buy a 3090 or 3080 to game at 1080p; 1440p/4K is where these cards are aimed, making the CPU disparity negligible at best.
> 1080p wins mean absolutely nothing to me, for example, and at 4K most CPUs are equal, ATM.
> 
> And why would any of this anger Intel owners? Probably 99% couldn't give a rat's ass; in reality, the 1% of enthusiasts are not the norm.





theoneandonlymrk said:


> Yes the niche will indeed have a niche, They are not more than a skint bit of the 1% of enthusiasts.
> 
> 4k120 is appealing tbf, but expensive , do you believe lot's of people have a few thousand to spend on a GPU and monitor.
> 
> ...



Eh? I don't follow. First, 1080p wins mean nothing, and then 1440p wins on a 2080 Ti don't say anything because they only apply to a tiny niche, while 4K120 is too expensive? Those 1440p wins aren't any different at a lower res either when it comes to the limitations of the CPU. The gap will remain, even with much weaker GPUs. But even more importantly, that 2080 Ti performance will soon be available to midrange buyers at 500 bucks.

Are you being honest with yourself here?


----------



## Jism (Sep 12, 2020)

DuxCro said:


> I'll probably get buried by Nvidia fanboys here, but I've been doing some thinking. If by some people on the net RTX 3090 is around 20% faster than RTX 3080, isn't this card basically an RTX 3080Ti with a bunch of VRAM slapped on it?  But this time instead of charging $1200 for it, Nvidia had this clever idea to call it a Titan  replacement card and charge +$300 knowing that people who were willing to spend $1200 on previous top card will spend extra $300 on this card.
> 
> So RTX 70 segment costs the same $499 aagain. RTX 80 segment costs $699 again. But the top segment is now + $300.  Performance gains being as from 700 to 900 series and from 900 to 1000 series. Which used to cost considerably less. 1080Ti was $699



It's called buying the full / binned chip. They can ask a premium for it since there's no competition yet.


----------



## TheoneandonlyMrK (Sep 12, 2020)

Vayra86 said:


> I think what's most telling is that despite laptops dying left and right (3-5 years is being generous for midrangers and below) and 720p being very much yesterday's mainstream, *we still have a 50% higher amount of people on 1366x768 than we have on 1440p*.
> 
> CPU matters. There is no question and its apparently still the last bastion for Intel, if you want high refresh rates. I mean we can walk past those 1440p benches as if nothing happened, but realistically... 4K is nothing and 1440p is for gaming, the next '1080p mainstream station'... You can safely ignore 4K for gaming even with a 3090 out and the tiny subset that uses the resolution can still drop to something lower; (Monitor sales won't tell the whole story) but it never happens the other way around (1080p owners won't DSR 4K, what's the point). As GPUs get faster, the importance of the fastest possible CPU do increase again. AMD is going to have to keep pushing hard on that to keep pace, and its still a meaningful difference in many places, with Intel.
> 
> ...


That it's a stretch to call the 2080 Ti a success was one point.
And that, given perspective, Intel CPUs are not *that* much better for gaming than AMD was another point, made in reply to the Intel-only guy; that's the main point I was getting at. It's hard when a tangential argument gets passed back in an unquoted, sneaky reply.


----------

