# ASUS GeForce RTX 4090 STRIX OC



## W1zzard (Oct 12, 2022)

The ASUS GeForce RTX 4090 STRIX OC is the company's flagship graphics card. It comes with a huge cooler that's all metal, even on the cooler shroud. In terms of performance our review confirms that this is the best air cooler available and out of the box performance is great too, thanks to a default power limit of 500 W.



----------



## ShiningSapphire (Oct 12, 2022)

You are massively bottlenecked by the platform used. The 5800X is simply unable to properly drive this card, even at 4K; that's why your results, and the differences from the previous generation, are significantly lower than those of other reviewers with the 12900K or 5800X3D.


----------



## the54thvoid (Oct 12, 2022)

Oof. $2000.

Hey ASUS, suck my silicon nib.


----------



## clopezi (Oct 12, 2022)

Wonderful card but too expensive. Waiting for the TUF review!


----------



## W1zzard (Oct 12, 2022)

ShiningSapphire said:


> You are massively bottlenecked by the platform used. The 5800X is simply unable to properly drive this card, even at 4K; that's why your results, and the differences from the previous generation, are significantly lower than those of other reviewers with the 12900K or 5800X3D.


RTX 4090 + 13900K coming soon, running right now, don't you worry, the differences (at 4K) will be very small



clopezi said:


> Waiting for the TUF review!


No plans for a TUF review


----------



## clopezi (Oct 12, 2022)

W1zzard said:


> No plans for a TUF review


You broke my heart


----------



## thelawnet (Oct 12, 2022)

hahahaha, $2000.

lolllllll.


----------



## ymdhis (Oct 12, 2022)

Can't we group reviews for the same product/different brand together somehow? So they don't take up several pages of the front page? This has been bugging me forever.


----------



## thewan (Oct 12, 2022)

table shows MSI instead of ASUS


----------



## Wirko (Oct 12, 2022)

ymdhis said:


> Can't we group reviews for the same product/different brand together somehow? So they don't take up several pages of the front page? This has been bugging me forever.


If that's not possible, it would still be nice if W1zzard, or one of the mods, opened a common discussion thread titled "GeForce RTX 4090 General Discussion" as soon as the reviews are out. The comments will be scattered across 7 threads now, and many of them will be about the 4090 in general.


----------



## champsilva (Oct 12, 2022)

W1zzard said:


> RTX 4090 + 13900K coming soon, running right now, don't you worry, the differences (at 4K) will be very small
> 
> 
> No plans for a TUF review



Can you tell us when the embargo lifts for the 13900K?


----------



## W1zzard (Oct 12, 2022)

champsilva said:


> Can you tell us when the embargo lifts for the 13900K?


Soon


----------



## diatribe (Oct 12, 2022)

The performance gain over the FE is minimal; the real advantages are cooling and noise levels. Not worth a $400 premium on an already overpriced GPU.


----------



## Rokugan (Oct 12, 2022)

Seems all models perform exactly the same. Last-gen differences were also very close, but temperatures had a bit more variation, and hot-spot temps were worryingly high.
This time, everything is basically equal, so what's Asus' excuse for demanding such a stupid premium for the Strix?
It basically performs like the Gigabyte OC or the FE on temps/FPS...


----------



## rojo (Oct 12, 2022)

£2400 for this card on Overclockers UK, GULP!!!


----------



## Xex360 (Oct 12, 2022)

Let's take time to salute @W1zzard for all the benchmarking.
I am curious to see how it (any 4090) performs at 8K.


----------



## Night (Oct 12, 2022)

What could possibly justify the $400 increase over FE that's already priced terribly? A few degrees lower temperatures? No. Few more FPS due to higher clocks? No. ROG STRIX written on the card? Yes!


----------



## R0H1T (Oct 12, 2022)

the54thvoid said:


> Oof. $2000.
> 
> Hey ASUS, suck my silicon nib.


Should probably point the nib at JHH ~


----------



## thelawnet (Oct 12, 2022)

R0H1T said:


> Should probably point the nib at JHH ~


For making Asus charge $250 more than the competing liquid cooled card?


----------



## swirl09 (Oct 12, 2022)

clopezi said:


> You broke my heart


HUB are testing the TUF as we speak, video coming in the next day or so.


----------



## L|NK|N (Oct 12, 2022)

@W1zzard I just want to say thank you for the exhaustive work that you do in these reviews. You are definitely underpaid for your incredible work and the amount of detail that you put into your reviews. I have trusted your methodology and numbers for 17 years now and you have my utmost respect.

I say to Nvidia and AMD: Release something actually good and ground breaking in the $300 range and I’ll be impressed.


----------



## Jism (Oct 12, 2022)

ShiningSapphire said:


> You are massively bottlenecked by the platform used. The 5800X is simply unable to properly drive this card, even at 4K; that's why your results, and the differences from the previous generation, are significantly lower than those of other reviewers with the 12900K or 5800X3D.



You do know that at 4K, CPU speed has much less of an impact than at, for example, 720p or 1080p, right?

There's virtually no difference between CPUs at that resolution.


----------



## RandomWan (Oct 12, 2022)

thelawnet said:


> For making Asus charge $250 more than the competing liquid cooled card?


For preventing AIBs from doing anything special with their versions of the cards.  If Nvidia wasn't so draconian with their requirements about modifications, we'd still see creative things coming out of the AIBs for the money that would likely produce something more than a 2% increase in frames.


----------



## kapone32 (Oct 12, 2022)

This card is $2,800 before tax, and with tax it will be over $3,000. Too bad multi-GPU support has been killed by the industry: two 6800 XTs would cost about half of that $3,000, and if AMD did CrossFire like on Polaris (basically enabled by default in the driver), they would blow this card out of the water at comparable power draw.


----------



## JackCarver (Oct 12, 2022)

As you can see on the back of the PCB, they use 4 POSCAPs instead of MLCC capacitors. The FE uses only MLCCs. With RTX 3080 and 3090 cards there were problems with such mixes at higher boost frequencies, which resulted in crashes. The FE seems to be the more stable build in that regard.


----------



## mahanddeem (Oct 12, 2022)

Great reviews. I like the relative performance charts and average fps.
I wish you could also add the 3DMark Time Spy graphics score to graphics card reviews.


----------



## W1zzard (Oct 12, 2022)

mahanddeem said:


> Great reviews. I like the relative performance charts and average fps.


Thanks 



mahanddeem said:


> I wish you could also add the 3DMark Time Spy graphics score to graphics card reviews.


I stopped using synthetic benchmarks a long time ago and focus on actual gameplay now


----------



## ShiningSapphire (Oct 12, 2022)

W1zzard said:


> RTX 4090 + 13900K coming soon, running right now, don't you worry, the differences (at 4K) will be very small


These are simply the facts. There is a 55-60% difference at 4K between the 3090 Ti and the 4090. Of course it depends on the game set, but your mere 30% clearly indicates a CPU bottleneck, not to mention at 1080p/1440p. Also, I have one question regarding testing methodology: are you using built-in benchmarks, or trying to find worst-case, GPU-bound in-game locations for testing?


----------



## W1zzard (Oct 12, 2022)

ShiningSapphire said:


> Also, I have one question regarding testing methodology: are you using built-in benchmarks or trying to find the most GPU-bound in-game locations for testing?


I'm not using integrated benchmarks because they are usually super unrealistic and run too short (most cards slow down quite a bit as they heat up over the first 2 minutes or so, you play for hours, not 30 seconds, so steady state is the perf you should get in reviews so you know what to expect)

I've played through nearly all the games that I benchmark. The scenes that I pick are representative of "demanding, typical" gameplay, not "worst case". I'm also intentionally not publishing the actual locations so driver teams can't just optimize that one spot in a game
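The steady-state point above can be expressed as a tiny post-processing sketch: a hypothetical script over a log of per-second FPS samples that discards a warm-up window before averaging. The 120-second cutoff and the toy numbers are illustrative assumptions, not TPU's actual methodology:

```python
def steady_state_fps(samples, warmup_s=120):
    """Average per-second FPS samples after discarding the warm-up window.

    samples: list of (elapsed_seconds, fps) tuples from a benchmark log.
    warmup_s: seconds to drop while the card heats up and clocks settle.
    """
    steady = [fps for t, fps in samples if t >= warmup_s]
    if not steady:
        raise ValueError("run is shorter than the warm-up window")
    return sum(steady) / len(steady)

# Toy log: clocks sag as the cooler saturates, then hold steady.
log = [(t, 150 - min(t, 120) * 0.1) for t in range(0, 300, 10)]
print(steady_state_fps(log))  # 138.0
```

A 30-second run over the same toy log would have reported closer to 147 FPS, which is exactly the gap between a short built-in benchmark and hours of real play.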


----------



## ShiningSapphire (Oct 12, 2022)

W1zzard said:


> I'm not using integrated benchmarks because they are usually super unrealistic and run too short (most cards slow down quite a bit as they heat up over the first 2 minutes or so, you play for hours, not 30 seconds, so steady state is the perf you should get in reviews so you know what to expect)
> 
> I've played through nearly all the games that I benchmark. The scenes that I pick are representative of "demanding, typical" gameplay, not "worst case". I'm also intentionally not publishing the actual locations so driver teams can't just optimize that one spot in a game


Hmm, in my opinion the methodology should be to look for the worst places; then someone interested in the performance of a particular game knows they will get at least that many fps, and usually more during normal gameplay. Additionally, this reduces the CPU bottleneck.


----------



## Blaazen (Oct 12, 2022)

Founders Edition for $1,600, and other AIB cards for $1,700, for literally 1% higher performance. Here it is $2,000 for 2-3% from ASUS. It seems each extra percent costs $100.
nVidia doesn't give AIBs much room for overclocking their cards. No wonder EVGA left that business.


----------



## DemonicRyzen666 (Oct 12, 2022)

kapone32 said:


> This card is $2,800 before tax, and with tax it will be over $3,000. Too bad multi-GPU support has been killed by the industry: two 6800 XTs would cost about half of that $3,000, and if AMD did CrossFire like on Polaris (basically enabled by default in the driver), they would blow this card out of the water at comparable power draw.


AMD still supports mGPU; all RDNA2 GPUs support it.
They just don't support anything old like CrossFire on DX11 and older anymore.
I have the list of mGPU games if you'd like it.


----------



## Tek-Check (Oct 12, 2022)

2% better for €500 extra? Wow! There is a serious problem with AIB cards.


----------



## R0H1T (Oct 12, 2022)

thelawnet said:


> For making Asus charge $250 more than the competing liquid cooled card?


For the wafer-thin margins on many of these products, or did you miss this by any chance?








EVGA Announces Cancelation of NVIDIA Next-gen Graphics Cards Plans, Officially Terminates NVIDIA Partnership (www.techpowerup.com)
				



I bet Nvidia's still making more on these AIB models than the board partners themselves!


----------



## r9 (Oct 12, 2022)

clopezi said:


> Wonderful card but too expensive. Waiting for the TUF review!


Yeah, it can go 0.1% either way. I'll be on the edge of my seat.



Night said:


> What could possibly justify the $400 increase over FE that's already priced terribly? A few degrees lower temperatures? No. Few more FPS due to higher clocks? No. ROG STRIX written on the card? Yes!


True, but you'll be stepping on your epeen when walking with this card.


----------



## Frick (Oct 12, 2022)

the54thvoid said:


> Oof. $2000.
> 
> Hey ASUS, suck my silicon nib.



They're something like $2600 here.


----------



## webdigo (Oct 12, 2022)

With the energy crisis we have in Europe right now, this power consumption is YIKES.
That's my entire PC under gaming load, minus 150 W.


----------



## GC_PaNzerFIN (Oct 12, 2022)

Will the Asus RTX 4090 Strix or TUF fit the Lian Li O11 Dynamic EVO?

Based on Asus' specs, this card is 150 mm wide, and the EVO has 167 mm of clearance from the motherboard to the glass.
But the new 16-pin PCIe high-power cables are stiff! Wondering if it's enough that the connector is slightly recessed on the PCB?

Wow, I really didn't think I would have to worry about a year-old 1000 W PSU having the right cables, or about a case this size fitting a new graphics card.

Seems the Strix is going for anywhere from 200 to 500 euros more than the TUF non-OC model here.
Why would you do this, Asus?


----------



## Upgrayedd (Oct 12, 2022)

Jism said:


> You do know that at 4K, CPU speed has much less of an impact than at, for example, 720p or 1080p, right?
> 
> There's virtually no difference between CPUs at that resolution.


They've never had a GPU this fast.


----------



## waltc (Oct 12, 2022)

kapone32 said:


> This card is $2,800 before tax, and with tax it will be over $3,000. Too bad multi-GPU support has been killed by the industry: two 6800 XTs would cost about half of that $3,000, and if AMD did CrossFire like on Polaris (basically enabled by default in the driver), they would blow this card out of the water at comparable power draw.


Yes, D3D12 multi-GPU supports it very well, but the only AAA games I know of that support it are Crystal Dynamics' Shadow of the Tomb Raider and, I believe, Rise of the Tomb Raider, which they backported the support to as well. I used it years ago in SotTR between an RX 480 and an RX 590 Fatboy; it worked perfectly in the game! AMD's solution uses the PCIe bus instead of physical connectors, which worked great too. But it seems the game devs aren't interested in supporting D3D12 multi-GPU in their games!


----------



## BorisDG (Oct 12, 2022)

ShiningSapphire said:


> You are massively bottlenecked by the platform used. The 5800X is simply unable to properly drive this card, even at 4K; that's why your results, and the differences from the previous generation, are significantly lower than those of other reviewers with the 12900K or 5800X3D.


I don't believe this at all.


----------



## xorbe (Oct 12, 2022)

Weird how the ASUS card offered on nVidia's site is the TUF and not the Strix.


----------



## kapone32 (Oct 12, 2022)

waltc said:


> Yes, D3D12 multi-GPU supports it very well, but the only AAA games I know of that support it are Crystal Dynamics' Shadow of the Tomb Raider and, I believe, Rise of the Tomb Raider, which they backported the support to as well. I used it years ago in SotTR between an RX 480 and an RX 590 Fatboy; it worked perfectly in the game! AMD's solution uses the PCIe bus instead of physical connectors, which worked great too. But it seems the game devs aren't interested in supporting D3D12 multi-GPU in their games!


I know what you mean, but there are plenty of games that officially support multiple GPUs, Star Wars Jedi: Fallen Order among them. I attached a link that shows every game that supports multi-GPU. The issue is that AMD and Nvidia influenced the industry to abandon multi-GPU, as you could get up to 80% more performance from a current card instead of paying double for a new one. I do feel that Intel is working on a solution to combine the iGPU and dGPU to increase performance. This could have the rest of the industry re-adopt multi-GPU as a technology. I know that we don't have any real information yet, but I could see AMD doing the same thing too.







List of games that support Crossfire - PCGamingWiki (www.pcgamingwiki.com)


----------



## Readlight (Oct 12, 2022)

Looks like good results with 4K, 8x multisample anti-aliasing, high-quality shadows, and ray tracing.


----------



## Prima.Vera (Oct 13, 2022)

No argument in this world justifies paying $2,000, or even more than that in euros, for a stupid gaming card.
This is beyond ridiculous, and Asus is further proof of a greedy, callous company.
I was planning to buy a motherboard from them for my new build, but no thank you. I don't and never will support greedy companies.


----------



## Legacy-ZA (Oct 13, 2022)

Mmmmm, got to give credit where it is due: nVidia made a good all-around cooler this time, and the AIBs are way overcharging (on top of an already ridiculous price) for something that doesn't perform much better.

I wonder if this is why EVGA pulled out? Seems nVidia can just sell them all directly at this point.


----------



## Ferrum Master (Oct 13, 2022)

Prima.Vera said:


> No argument in this world justifies paying $2,000, or even more than that in euros, for a stupid gaming card.
> This is beyond ridiculous, and Asus is further proof of a greedy, callous company.
> I was planning to buy a motherboard from them for my new build, but no thank you. I don't and never will support greedy companies.



Let's be more precise... what an atrocity, ain't it?


----------



## JcRabbit (Oct 13, 2022)

With the current USD vs EUR exchange rate plus the typical 20% EU sales tax, I was already expecting something like this. My God!


----------



## AnarchoPrimitiv (Oct 13, 2022)

"...AMD has to innovate here with their next-gen, or they'll fall behind too much and NVIDIA will win ray tracing."

Yeah... obviously. The problem is that Nvidia has a multi-year head start and a much, much larger R&D budget. Nvidia's 2021 R&D budget was $5.27 billion, while AMD's was only $2 billion. Furthermore, Nvidia is able to spend that $5+ billion almost entirely on graphics; perhaps a fraction goes to ARM CPU development, but the vast majority goes to graphics. AMD, on the other hand, has to split its $2 billion between x86 and graphics, and because x86 makes up the majority of AMD's revenue and has a larger TAM than graphics, we can safely assume that more than half of AMD's R&D budget goes to x86. So in the end, Nvidia probably has $4+ billion for GPU R&D while AMD has less than $1 billion for the same application.

Considering Nvidia is easily spending four times as much on GPU research and development as AMD, I think it's extremely impressive that AMD was able to basically match Nvidia in raster with RDNA2 and make such a strong showing with ray tracing and FSR 2.0. AMD has been able to accomplish more with less than Nvidia or Intel, but I wouldn't expect them, especially considering Nvidia's head start, to match Nvidia on raw ray-tracing performance with RDNA3, though they should close the gap.


----------



## Enterprise24 (Oct 13, 2022)

I miss the days when overclocking a GPU gave a massive performance boost. The 780 Ti and 980 Ti are prime examples: slap a water block on one and overclock to 1.3 GHz (780 Ti) or 1.55 GHz (980 Ti) and you got a 25-30% performance boost over the stock reference card.

Overclocking Ada gives only a minor boost. I believe water-cooling won't change these numbers much.


----------



## SOAREVERSOR (Oct 13, 2022)

The untold thing here is that the best chips will go to the cloud.


----------



## shoman24v (Oct 13, 2022)

Chasing those frames per second at high resolutions is really costing a lot these days.

Great review!


----------



## Saty (Oct 13, 2022)

@W1zzard Doesn't Asus recommend a 1000 W PSU for this model?

Were there any issues running the card on 850 W? Would you say an 850 W PSU is enough for this card plus a high-end CPU: a 12700K, 12900K, or the 13th Gen equivalent, and AMD's latest?


----------



## W1zzard (Oct 13, 2022)

I don't care what various people recommend, I make my own recommendation 

I had zero issues, not even at max power limit, not on any 4090 that I've tested, and I've been running them for over a week, all day. Spikes are minimal as our testing shows. A decent 850 W should be fine


----------



## medi01 (Oct 13, 2022)

ShiningSapphire said:


> You are massively bottlenecked by the platform used. The 5800X is simply unable to properly drive this card, even at 4K; that's why your results, and the differences from the previous generation, are significantly lower than those of other reviewers with the 12900K or 5800X3D.


Computerbase tested with Intel CPU and there is barely any difference.


----------



## mechtech (Oct 14, 2022)

Still waiting for a 5 slot cooler............



ShiningSapphire said:


> You are massively bottlenecked by the platform used. The 5800X is simply unable to properly drive this card, even at 4K; that's why your results, and the differences from the previous generation, are significantly lower than those of other reviewers with the 12900K or 5800X3D.


Just imagine all the people out there pairing it with a CPU that's half the 5800X, lol


----------



## N/A (Oct 14, 2022)

When bottlenecked, it's good to have that valuable information included; for example, 66 FPS (at 66% GPU usage). But we know all that anyway; it's obvious when 1080p and 1440p deliver the same FPS.
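The "same FPS at 1080p and 1440p" tell can be written down as a toy heuristic; the 3% tolerance and the example numbers below are made up for illustration, not taken from the review:

```python
def looks_cpu_bound(fps_by_res, tolerance=0.03):
    """Heuristic: if average FPS barely changes across resolutions,
    the CPU, not the GPU, is likely the limiter.

    fps_by_res: dict mapping resolution label -> average FPS.
    tolerance: max relative spread still counted as "the same FPS".
    """
    values = list(fps_by_res.values())
    spread = (max(values) - min(values)) / max(values)
    return spread <= tolerance

# Hypothetical numbers: near-identical FPS at 1080p and 1440p points
# at the CPU; a big drop at 4K points back at the GPU.
print(looks_cpu_bound({"1080p": 166, "1440p": 164}))  # True
print(looks_cpu_bound({"1440p": 164, "4K": 110}))     # False
```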


----------



## HenrySomeone (Oct 14, 2022)

As someone who has owned 4-5 Strix cards, I'll say this: the card is nice (both in design, and honestly I like big cards), cool and quiet; however, it's NOT worth a $400 premium, not even close.


----------



## BoredErica (Oct 14, 2022)

Why are the temperatures under varying heat output, noise-normalized at 35 dBA, straight lines?








ASUS GeForce RTX 4090 STRIX OC Review (www.techpowerup.com)


----------



## xorbe (Oct 14, 2022)

SOAREVERSOR said:


> The untold thing here is the best chips will go to the cloud.



This. Manufacturers caught on long ago: the best stuff (CPU + GPU) is binned for higher-paying servers, and consumer leftovers are already squeezed with dynamic clocks.


----------



## QUANTUMPHYSICS (Oct 16, 2022)

I have this card.  It's huge and bulky. 
I may trade it for the SUPRIM LIQUID X as soon as I can find one.


----------



## Mr_bubbles (Oct 18, 2022)

QUANTUMPHYSICS said:


> I have this card.  It's huge and bulky.
> I may trade it for the SUPRIM LIQUID X as soon as I can find one.



Why two 2 TB SSDs with similar high-end PCIe Gen 4 specs? A RAID plan?


----------



## W1zzard (Oct 18, 2022)

BoredErica said:


> Why are the temperatures under varying heat output, noise-normalized at 35 dBA, straight lines?
> 
> 
> 
> ...


What else should it be, and why?


----------



## TheDeeGee (Nov 1, 2022)

Shame they went with the forced red and blue; it looks bad.

Guess I'll go with MSI for a 4060 (Ti) then.


----------



## Undertoker (Nov 12, 2022)

Well, I ended up getting one.
It's a beast: it's about 85-100% faster than my Strix 3090 OC at about the same or slightly less power draw, and it runs a lot cooler, at a steady 60-65 °C maximum.
I think I managed to get a decent sample, as it'll overclock and boost to 3075 MHz, maintaining a steady 3030 MHz throughout the run, and again it only peaks at 70 °C overclocked.
I don't regret buying it at all, but my only gripe is the position of the power connector; it's almost as though they employ a guy specifically to choose the most annoying and stupid place to put it. Also, the quality of the 4-into-1 adapter doesn't instill confidence, and frankly they ought to have supplied a decent cable for the price.
Overall it is a beast: going from the Strix 2080 Ti OC to the Strix 3090 OC I saw a 35-40% uplift, and with the Strix 4090 OC there is between 85% and 100% uplift over the Strix 3090 OC.
So it's a big jump, mostly at less power than the 3090; I doubt we will see this kind of uplift again in a single generation.
It's a great card.
I run an NZXT H700 PUBG case and it fits fine, btw, for those wondering about case and size.


----------



## Hankieroseman (Jan 3, 2023)

I want one. Trade ya 2 RTX3090 O24's for a new ROG Strix? 3000 on ebay... Wait for PCIe 5?


----------



## Undertoker (Jan 4, 2023)

In honesty, I could not afford the electricity that two 3090s running in SLI would cost me here in the UK.
I suspect the 4090 would still be a match for 2x 3090s, honestly, and it will of course also run DLSS 3.0, which the 3090s won't, at least not yet.
And it does this at a fraction of the power draw, which is amazing.
Trust me, get yourself an Asus Strix 4090 OC if you can get one; you won't regret it.

You can actually scale the power draw down to 70% using GPU Tweak or any of the other similar programs and you'll lose just 3% of the performance, and this is done at just about 60 °C under load. The 4090 is a truly amazing card, and I can't imagine us seeing a better uplift on a top-tier card again any time soon.
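Taking the poster's figures at face value (70% power limit keeping 97% of stock performance; these are the quoted claims, not measured data), the efficiency gain works out like this:

```python
def perf_per_watt_gain(power_scale, perf_scale):
    """Relative perf-per-watt gain versus stock for a given
    power-limit scale and the resulting performance scale."""
    return perf_scale / power_scale - 1.0

# 70% power limit, 97% of stock performance (figures from the post above):
gain = perf_per_watt_gain(0.70, 0.97)
print(f"{gain:.0%}")  # 39%
```

In other words, if those numbers hold, the power-limited card delivers roughly 39% more performance per watt than stock.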

I certainly would recommend buying a custom power cable rather than using the nVidia adapter, which is frankly garbage.
I went with the CableMod one, and my PSU is an Asus Thor 1200 W.

Oh, and I love fishing.


----------



## Hankieroseman (Jan 4, 2023)

Just a matter of time.


----------



## P4-630 (Monday at 10:06 PM)

If TPU doesn't mention the ambient temperature when measuring temps, it's not very useful: https://www.techpowerup.com/review/asus-geforce-rtx-4090-strix-oc/37.html


----------

