
ASUS GeForce RTX 4090 STRIX OC

Great reviews. I like the relative performance charts and average FPS.
I wish you could also add the 3DMark Time Spy graphics score to graphics card reviews.
 
RTX 4090 + 13900K coming soon; it's running right now, don't you worry. The differences (at 4K) will be very small.
These are simply the facts. There is a 55-60% difference at 4K between the 3090 Ti and the 4090. Of course it depends on the game set, but your mere 30% clearly indicates a CPU bottleneck, not to mention at 1080p/1440p. Also, I have one question regarding testing methodology: are you using built-in benchmarks, or trying to find worst-case GPU-bound in-game locations for testing?
 
Also, I have one question regarding testing methodology: are you using built-in benchmarks, or trying to find the most GPU-bound in-game locations for testing?
I'm not using integrated benchmarks because they are usually super unrealistic and run too short. Most cards slow down quite a bit as they heat up over the first two minutes or so, and you play for hours, not 30 seconds, so steady-state is the performance a review should report; that's what you can actually expect.

I've played through nearly all the games that I benchmark. The scenes I pick are representative of "demanding, typical" gameplay, not "worst case". I'm also intentionally not publishing the actual locations, so driver teams can't just optimize that one spot in a game.
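For anyone curious what that steady-state comparison looks like in practice, here's a minimal sketch; the "frametimes.csv" input is hypothetical (e.g. a PresentMon or CapFrameX export reduced to one frame time in milliseconds per line), and it assumes the run is comfortably longer than the two-minute heat-soak window:

```python
# Compare average FPS during warm-up vs. after the card has heat-soaked.
def avg_fps(frame_times_ms):
    # Average FPS = frames rendered / seconds elapsed
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms) if frame_times_ms else 0.0

def split_at_seconds(frame_times_ms, seconds):
    """Split the log at a wall-clock boundary, not at a frame count."""
    elapsed = 0.0
    for i, ft in enumerate(frame_times_ms):
        elapsed += ft / 1000.0
        if elapsed >= seconds:
            return frame_times_ms[:i], frame_times_ms[i:]
    return frame_times_ms, []

with open("frametimes.csv") as f:  # hypothetical one-column export
    times = [float(line) for line in f if line.strip()]

warmup, steady = split_at_seconds(times, 120)  # ~2 minutes to heat-soak
print(f"first 2 minutes: {avg_fps(warmup):.1f} FPS")
print(f"steady state:    {avg_fps(steady):.1f} FPS")
```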
 
Hmm, in my opinion the methodology should be to look for the worst spots; then someone interested in a particular game's performance knows they will get at least that many FPS, and usually more during normal gameplay. Additionally, this reduces the CPU bottleneck.
 
Founders Edition for $1600 and other AIB cards for $1700, for literally 1% higher performance. Here it's $2000 for 2-3% from ASUS. It seems each extra percent costs $100 :).
NVIDIA doesn't give AIBs much room for overclocking cards. No wonder EVGA left that business.
 
This card is $2800 before tax, and with tax it will be over $3000. Too bad multi-GPU support has been killed by the industry: two 6800 XTs would cost about half of that, and if AMD did CrossFire like on Polaris (basically enabled by default in the driver), they would blow this card out of the water with comparable power draw.
AMD still supports mGPU; all RDNA2 GPUs support it.
They just don't support anything old like CrossFire on DX11 & older anymore.
I have the list of mGPU games if you'd like it.
 
For making Asus charge $250 more than the competing liquid-cooled card?
For the wafer-thin margins on many of these products? Or did you miss that by any chance :rolleyes:
I bet Nvidia is still making more on these AIB models than the board partners themselves!
 
Wonderful card but too expensive. Waiting for the TUF review!
Yeah, it can swing 0.1% either way. I'll be on the edge of my seat.

What could possibly justify the $400 increase over the FE, which is already priced terribly? A few degrees lower temperatures? No. A few more FPS due to higher clocks? No. ROG STRIX written on the card? Yes!
True, but you'll be stepping on your epeen when walking with this card.
 
In this energy crisis we've got in Europe, this power consumption is YIKES.
That's my entire PC under gaming load, minus 150 W.
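For scale, a rough cost sketch; the 450 W draw and the €0.40/kWh tariff are assumptions (crisis-era European prices varied a lot), so adjust for your own card and contract:

```python
# Rough energy-cost estimate for the card alone; all inputs are assumptions.
card_watts    = 450    # assumed average gaming draw for an RTX 4090
hours_per_day = 4      # assumed daily gaming time
eur_per_kwh   = 0.40   # assumed crisis-era European electricity price

kwh_per_day = card_watts / 1000 * hours_per_day
print(f"per day:  EUR {kwh_per_day * eur_per_kwh:.2f}")
print(f"per year: EUR {kwh_per_day * eur_per_kwh * 365:.0f}")  # ~EUR 263
```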
 
Will the Asus RTX 4090 Strix or TUF fit the Lian Li O11 Dynamic EVO?

Based on Asus specs, this card is 150 mm wide, and the EVO has 167 mm of clearance from the motherboard to the glass.
But the new 16-pin PCIe high-power cables are stiff! I'm wondering if it's enough that the connector is slightly recessed on the PCB.
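The fit question mostly reduces to arithmetic. A quick sketch; the card width and case clearance come from this post, while the ~35 mm straight run before the first bend is commonly cited cable guidance, not an Asus spec:

```python
# Clearance check: numbers from the post, except the bend guidance.
card_width_mm     = 150  # Asus spec for the Strix
case_clearance_mm = 167  # O11 Dynamic EVO, motherboard tray to glass
bend_guidance_mm  = 35   # commonly cited straight run for the 16-pin cable

room_mm = case_clearance_mm - card_width_mm
print(f"room above the connector: {room_mm} mm")
if room_mm < bend_guidance_mm:
    print("tight: a 90-degree adapter or a sideways bend may be needed")
```

Even with the connector slightly recessed, 17 mm of room is well short of that guidance, which is why people resort to angled adapters.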

Wow, I really didn't think I would have to worry about a year-old 1000 W PSU having the right cables, or about a case this size fitting a new graphics card.

Seems the Strix is going for anywhere from 200 to 500 euros more than the TUF non-OC model here.
Why would you do this, Asus? :confused:
 
You do know that at 4K, CPU speed has far less of an impact than at, say, 720p or 1080p, right?

There's virtually no difference at such resolutions.
They've never had a GPU this fast.
 
Yes, D3D12 explicit multi-GPU supports it very well, but the only AAA game I know of that supports it is Crystal Dynamics' Shadow of the Tomb Raider, and I believe they backported the support to Rise of the Tomb Raider as well. I used it years ago with SotTR between an RX 480 and an RX 590 Fatboy; it worked perfectly in the game! AMD's solution uses the PCIe bus instead of physical connectors, which worked great too. But it seems game devs just aren't interested in supporting D3D12 multi-GPU in their games!
 
You are massively bottlenecked by the test platform. The 5800X is absolutely unable to properly drive this card, even at 4K; that's why your results and the gaps versus the previous generation are significantly lower than those of other reviewers using the 12900K or 5800X3D.
I don't believe this at all.
 
Weird how the ASUS card offered on NVIDIA's site is the TUF and not the Strix.
 
I know what you mean, but there are plenty of games that officially support multi-GPU, Star Wars Jedi: Fallen Order among them. I attached a link that shows every game that supports multi-GPU. The issue is that AMD and Nvidia influenced the industry to abandon multi-GPU, since you could get up to 80% more performance out of a current card instead of paying double for a new one. I do feel that Intel is working on a solution to combine the iGPU and dGPU to increase performance. This could push the rest of the industry to re-adopt multi-GPU as a technology. I know we don't have any real information yet, but I could see AMD doing the same thing too.


 
Looks like good results at 4K with 8x multisample anti-aliasing, high-quality shadows, and ray tracing.
 
No argument in this world justifies paying $2000, or even more than that in euros, for a stupid gaming card.
This is beyond ridiculous, and Asus is yet more proof of a greedy, callous company.
I was planning to buy a motherboard from them for my new build, but no, thank you. I don't and never will support greedy companies.
 
Mmmmm, got to give credit where it's due: nVidia made a good all-around cooler this time around, and the AIBs are way overcharging (on top of an already ridiculous price) for something that doesn't perform much better.

I wonder if this is why EVGA pulled out? Seems nVidia can just sell them all directly at this point.
 

Let's be more precise... what an atrocity, ain't it?

(attached screenshot of local pricing)
 
With the current USD vs EUR exchange rate plus the typical 20% EU sales tax, I was already expecting something like this. My God!
 
"...AMD has to innovate here with their next-gen, or they'll fall behind too much and NVIDIA will win ray tracing."

Yeah... obviously... the problem is that Nvidia has a multi-year head start and a much, much larger R&D budget. Nvidia's 2021 R&D budget is $5.27 billion while AMD's is only $2 billion. Furthermore, Nvidia is able to spend that $5+ billion almost entirely on graphics; perhaps a fraction goes to ARM CPU development, but the vast majority goes to graphics. AMD, on the other hand, has to split that $2 billion between x86 and graphics, and because x86 makes up the majority of AMD's revenue and has a larger TAM (total addressable market) than graphics, we can safely assume that more than half of AMD's R&D budget goes to x86. So in the end, Nvidia probably has $4+ billion for GPU R&D while AMD has less than $1 billion for the same purpose, as sketched below.
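A back-of-the-envelope version of that argument; the R&D totals come from the post above, but the split fractions are pure assumptions for illustration:

```python
# R&D totals from the post; the GPU-share fractions are assumptions.
nvidia_rd = 5.27e9   # NVIDIA 2021 R&D budget, USD
amd_rd    = 2.00e9   # AMD 2021 R&D budget, USD

nvidia_gpu_share = 0.80  # assume most of NVIDIA's R&D targets graphics
amd_gpu_share    = 0.45  # assume less than half of AMD's does

nvidia_gpu = nvidia_rd * nvidia_gpu_share
amd_gpu    = amd_rd * amd_gpu_share
print(f"NVIDIA GPU R&D ~ ${nvidia_gpu / 1e9:.1f}B")
print(f"AMD GPU R&D    ~ ${amd_gpu / 1e9:.1f}B")
print(f"ratio          ~ {nvidia_gpu / amd_gpu:.1f}x")
```

With these assumed splits the ratio lands around 4.7x, which is roughly where the "four times as much" figure below comes from.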

Considering Nvidia is easily spending four times as much on GPU research and development as AMD, I think it's extremely impressive that AMD was able to basically match Nvidia in raster with RDNA2 and make such a strong showing with ray tracing and FSR 2.0. AMD has been able to accomplish more with less than Nvidia or Intel, but I wouldn't expect them, especially considering Nvidia's head start, to match Nvidia on raw ray-tracing performance with RDNA3, though they should close the gap.
 
I miss the days when overclocking a GPU gave a massive performance boost. The 780 Ti and 980 Ti are prime examples: slap a water block on, overclock to 1.3 GHz (780 Ti) or 1.55 GHz (980 Ti), and you got a 25-30% performance boost over the stock reference card.

Overclocking Ada gives only a minor boost. I believe water-cooling won't change these numbers much.
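To put numbers on that: assuming reference boost clocks of roughly 928 MHz (780 Ti) and 1075 MHz (980 Ti), both ballpark figures, the clock uplift versus the observed gain works out like this:

```python
# OC scaling sketch: stock boost clocks are ballpark assumptions.
cards = {
    "780 Ti": (928, 1300),   # (assumed stock boost MHz, OC MHz from the post)
    "980 Ti": (1075, 1550),
}
observed_perf_gain = 0.28    # midpoint of the 25-30% quoted above

for name, (stock, oc) in cards.items():
    clock_gain = oc / stock - 1
    scaling = observed_perf_gain / clock_gain
    print(f"{name}: +{clock_gain:.0%} clock -> +{observed_perf_gain:.0%} perf "
          f"(scaling ~{scaling:.2f})")
```

A 40%+ clock bump was on the table back then; Ada already boosts near its limit out of the box, so there's far less headroom left for a water block to buy back.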
 