That's the problem. In modern games, especially BRs with large open environments and a lot of structures, even at minimum graphics settings at 1080p, the FPS drops (fluctuates) well below 240 fps. Forget 360 fps and fully utilizing 360 Hz, even on a 3090. If you trick out RTX at native 1080p with shadows, reflections, and global illumination, I wouldn't be surprised if this card could only handle it at 60 fps. Mind you, that's still just 1 bounce per pixel, with de-noising to smooth out light gaps. DLSS isn't free -- it at least costs time for training to ensure no artifacts for a given game, but I'm sure NVIDIA charges for this training. Full-fidelity path-traced graphics isn't as cheap as it's been made out to be.
For those who want 1080p at 360 Hz, isn't it better to just lower the details on a mid-range high-frequency GPU, since they only care about the competitive advantage? Having RTX and all sorts of other detail in the scene just makes it harder to discern the opponent anyway.
Processor | Intel i9 9900K @5GHz w/ Corsair H150i Pro CPU AiO w/Corsair HD120 RGB fan |
---|---|
Motherboard | Asus Z390 Maximus XI Code |
Cooling | 6x120mm Corsair HD120 RGB fans |
Memory | Corsair Vengeance RGB 2x8GB 3600MHz |
Video Card(s) | Asus RTX 3080Ti STRIX OC |
Storage | Samsung 970 EVO Plus 500GB , 970 EVO 1TB, Samsung 850 EVO 1TB SSD, 10TB Synology DS1621+ RAID5 |
Display(s) | Corsair Xeneon 32" 32UHD144 4K |
Case | Corsair 570x RGB Tempered Glass |
Audio Device(s) | Onboard / Corsair Virtuoso XT Wireless RGB |
Power Supply | Corsair HX850w Platinum Series |
Mouse | Logitech G604s |
Keyboard | Corsair K70 Rapidfire |
Software | Windows 11 x64 Professional |
Benchmark Scores | Firestrike - 23520 Heaven - 3670 |
We also did a test run with the power limit maximized to the 480 W setting that ASUS provides. 480 W is much higher than anything else available on any other RTX 3090, so I wondered how much more performance we could get. At 4K resolution it's another 2%, which isn't that much, but it depends on the game, too. Only games that hit the power limit very early due to their rendering design can benefit from the added power headroom.
System Name | [Primary Workstation] |
---|---|
Processor | Intel Core i7-920 Bloomfield @ 3.8GHz/4.55GHz [24-7/Bench] |
Motherboard | EVGA X58 E758-A1 [Tweaked right!] |
Cooling | Cooler Master V8 [stock fan + two 133CFM ULTRA KAZE fans] |
Memory | 12GB [Kingston HyperX] |
Video Card(s) | constantly upgrading/downgrading [prefer nVidia] |
Storage | constantly upgrading/downgrading [prefer Hitachi/Samsung] |
Display(s) | Triple LCD [40 inch primary + 32 & 28 inch auxiliary displays] |
Case | Cooler Master Cosmos 1000 [Mesh Mod, CFM Overload] |
Audio Device(s) | ASUS Xonar D1 + onboard Realtek ALC889A [Logitech Z-5300 Spk., Niko 650-HP 5.1 Hp., X-Bass Hp.] |
Power Supply | Corsair TX950W [aka Reactor] |
Software | This and that... [All software 100% legit and paid for, 0% pirated] |
Benchmark Scores | Ridiculously good scores!!! |
Benchmark Scores | Faster than yours... I'd bet on it. :) |
---|
All games have significant fluctuations. MP can be worse... but that isn't dependent on the GPU really. That's the nature of MP gaming, man.
I doubt Apex Legends is the only MP game that has large FPS fluctuations @1080p even at medium to low settings, and upcoming games will generally be more demanding.
Not to mention 360Hz will only become more common.
System Name | RyzenGtEvo/ Asus strix scar II |
---|---|
Processor | Amd R5 5900X/ Intel 8750H |
Motherboard | Crosshair hero8 impact/Asus |
Cooling | 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK |
Memory | Gskill Trident Z 3900cas18 32Gb in four sticks./16Gb/16GB |
Video Card(s) | Asus tuf RX7900XT /Rtx 2060 |
Storage | Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme |
Display(s) | Samsung UAE28"850R 4k freesync.dell shiter |
Case | Lianli 011 dynamic/strix scar2 |
Audio Device(s) | Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset |
Power Supply | corsair 1200Hxi/Asus stock |
Mouse | Roccat Kova/ Logitech G wireless |
Keyboard | Roccat Aimo 120 |
VR HMD | Oculus rift |
Software | Win 10 Pro |
Benchmark Scores | laptop Timespy 6506 |
It really depends on the use case where a Titan would benefit.
480 W, it is insane
Processor | E5-1680 V2 |
---|---|
Motherboard | Rampage IV black |
Video Card(s) | Asrock 7900 xtx |
Storage | 500 gb sd |
Software | windows 10 64 bit |
Benchmark Scores | 29,433 3dmark06 score |
This card (just like I predicted) is a lot more power efficient than the RTX 3080:
So, your criticism is not totally sincere, as the RTX 3080 must be an even worse card in your opinion. Also, NVIDIA has a higher performance-per-watt ratio for this gen vs. the previous gen cards, so your comparison to Intel isn't valid.
As for me personally, I never buy GPUs with a TDP above 150 W (and CPUs above 95 W) because I simply don't want that much heat under my desk, so I guess I will skip the entire GeForce 30 series, since the RTX 3060 has been rumored to have a TDP around 180-200 W.
"Good lord" is misplaced. Again, this is a Titan class card. Either use it appropriately or forget about it - it does not justify its cost purely from a gaming perspective.
Nah, I'm just saying people shouldn't make blanket statements without acknowledging the benefits that most people don't realize can actually be very noticeable. We all acknowledge the price/performance value statement of a 3080 over a 3090. However, I do not agree with anyone saying the 3090 makes no noticeable difference at 1080p.
All games have significant fluctuations. MP can be worse... but that isn't dependent on the GPU really. That's the nature of MP gaming, man.
360 Hz will become more common, indeed. By that time, the next-gen cards will be out. Like 4K, any decent 360 Hz monitor is quite pricey and overkill for anyone who doesn't have F U money or doesn't play competitively where that stuff matters.
You're fighting for the <1%.
System Name | Cyberline |
---|---|
Processor | Intel Core i7 2600k -> 12600k |
Motherboard | Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Aorus Elite DDR4 |
Cooling | Tuniq Tower 120 -> Custom Watercoolingloop |
Memory | Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz |
Video Card(s) | AMD RX480 -> RX7800XT |
Storage | Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb NVMe SSD |
Display(s) | Philips 32inch LPF5605H (television) -> Dell S3220DGF |
Case | antec 600 -> Thermaltake Tenor HTCP case |
Audio Device(s) | Focusrite 2i4 (USB) |
Power Supply | Seasonic 620watt 80+ Platinum |
Mouse | Elecom EX-G |
Keyboard | Rapoo V700 |
Software | Windows 10 Pro 64bit |
Sorry, this is not a 1080p card, not by a long shot. If you're here to argue about this card's merits at such a low resolution, I don't know what else to say, because I don't have any polite words in my lexicon.
What's next, you're going to argue that an excavator is not a good way of planting seeds? Or maybe you need to play your favourite game at 600fps? Why?
System Name | 2nd AMD puppy |
---|---|
Processor | FX-8350 vishera |
Motherboard | Gigabyte GA-970A-UD3 |
Cooling | Cooler Master Hyper TX2 |
Memory | 16 Gb DDR3:8GB Kingston HyperX Beast + 8Gb G.Skill Sniper(by courtesy of tabascosauz &TPU) |
Video Card(s) | Sapphire RX 580 Nitro+;1450/2000 Mhz |
Storage | SSD :840 pro 128 Gb;Iridium pro 240Gb ; HDD 2xWD-1Tb |
Display(s) | Benq XL2730Z 144 Hz freesync |
Case | NZXT 820 PHANTOM |
Audio Device(s) | Audigy SE with Logitech Z-5500 |
Power Supply | Riotoro Enigma G2 850W |
Mouse | Razer copperhead / Gamdias zeus (by courtesy of sneekypeet & TPU) |
Keyboard | MS Sidewinder x4 |
Software | win10 64bit ltsc |
Benchmark Scores | irrelevant for me |
Well, no need to get upset, it's just a video card. Let's look at the facts and not made-up stats, shall we?
Nvidia themselves pushed for the new 360 Hz screens, so, number one, high refresh rate/frame rate gaming is definitely something they are going for. Now let's look at this review, shall we?
Average FPS at 1080p with this card is about 200 fps... so yeah, we are not even close to touching that newly pushed 360 Hz standard and not even remotely close to your made-up 600 fps (quick frame-time math below)...
sooo yeah.
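For reference, a quick frame-time sanity check on those numbers -- a rough sketch only, where the ~200 fps figure is the 1080p average quoted above and the 360 Hz and 600 fps values are just the targets being argued about, not measurements:

```python
# Frame-time budget for each of the frame rates mentioned in this thread.
# ~200 fps is the quoted 1080p review average; 360 and 600 are the
# refresh/frame-rate targets under discussion, not measured data.
targets_fps = {
    "review 1080p average": 200,
    "360 Hz refresh": 360,
    "claimed 600 fps": 600,
}

for label, fps in targets_fps.items():
    frame_time_ms = 1000.0 / fps  # milliseconds available per frame
    print(f"{label:>21}: {fps:>3} fps -> {frame_time_ms:5.2f} ms per frame")

# ~200 fps leaves about 5.0 ms per frame, while a 360 Hz panel wants a new
# frame every ~2.78 ms -- the card would need to be roughly 1.8x faster.
```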
Processor | Ryzen 5600 |
---|---|
Motherboard | X570 I Aorus Pro |
Cooling | Deepcool AG400 |
Memory | HyperX Fury 2 x 8GB 3200 CL16 |
Video Card(s) | RX 6700 10GB SWFT 309 |
Storage | SX8200 Pro 512 / NV2 512 |
Display(s) | 24G2U |
Case | NR200P |
Power Supply | Ion SFX 650 |
Mouse | G703 (TTC Gold 60M) |
Keyboard | Keychron V1 (Akko Matcha Green) / Apex m500 (Gateron milky yellow) |
Software | W10 |
Benchmark Scores | Faster than yours... I'd bet on it. :) |
---|
I get you... I don't like blanket statements either, but when everything is covered except for the toes....
Nah, I'm just saying people shouldn't make blanket statements without acknowledging the benefits that most people don't realize can actually be very noticeable. We all acknowledge the price/performance value statement of a 3080 over a 3090. However, I do not agree with anyone saying the 3090 makes no noticeable difference at 1080p.
CPU limited at such a low resolution, man. It may be the game devs, it may be CPUs that are too slow for this beast of a GPU at 1080p. Look at how little the card gains over a 3080 at 1080p compared to 4K.
Average FPS at 1080p with this card is about 200 fps... so yeah, we are not even close to touching that newly pushed 360 Hz standard and not even remotely close to your made-up 600 fps...
The flagship is the 3080. This is a Titan replacement.
Only 5 years ago, graphics card flagships were only 500 dollars; today you need 2000 dollars. What a rip off.
System Name | Potato |
---|---|
Processor | Intel core i5 9600 KF |
Motherboard | Asus ROG Strix Gaming F |
Cooling | Corsair H80I Rgb |
Memory | Corsair Vengeance 32 Gb 4x8GB 3600 Mhz |
Video Card(s) | EVGA Geforce GTX 1080 FTW |
Storage | Samsung 500 Gb m2 nvme |
Display(s) | Asus ROG QHD |
Case | Nzxt h440 |
Audio Device(s) | Creative blaster Z |
Power Supply | Fractal design Tesla 1000W |
Mouse | MSI DS 200 RGB |
Keyboard | SteelSeries RGB m800 |
Software | Windows X |
Benchmark Scores | 12000 ish |
Only 5 years ago, graphics card flagships were only 500 dollars; today you need 2000 dollars. What a rip off.
System Name | RyzenGtEvo/ Asus strix scar II |
---|---|
Processor | Amd R5 5900X/ Intel 8750H |
Motherboard | Crosshair hero8 impact/Asus |
Cooling | 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK |
Memory | Gskill Trident Z 3900cas18 32Gb in four sticks./16Gb/16GB |
Video Card(s) | Asus tuf RX7900XT /Rtx 2060 |
Storage | Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme |
Display(s) | Samsung UAE28"850R 4k freesync.dell shiter |
Case | Lianli 011 dynamic/strix scar2 |
Audio Device(s) | Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset |
Power Supply | corsair 1200Hxi/Asus stock |
Mouse | Roccat Kova/ Logitech G wireless |
Keyboard | Roccat Aimo 120 |
VR HMD | Oculus rift |
Software | Win 10 Pro |
Benchmark Scores | laptop Timespy 6506 |
According to one guy on here, these coolers cost £20 (can't recall who, luckily for him, or his name would be here). The silicon cost has gone up, so has Nvidia's die size, and so has the cost of all the other parts, making the 3080 and 70 good buys. Personally, I think the markup on these 3090s is a bit much; they're worth more than a 3080, but not this much.
The BOM has steadily increased too.
Yet the 3090 is geared primarily towards the "toes" or 1%.
I get you... I don't like blanket statements either, but when everything is covered except for the toes....
Think of it this way too... you're going to be heavily CPU limited at low res trying to feed all of those frames in. I think in TPU reviews we see much larger, significant gains at the higher resolutions. To use this on anything less than 4K does it an injustice, IMO. You're so CPU limited, especially at 1080p, that the bottleneck shifts significantly to the CPU with this monster (and the 3080).
Watch the FPS go up with CPU clock increases at that low of a res. Look how poorly AMD does with its lack of clock speed. I'd start off at a static 4.5 GHz, then go 4.7, 4.9, 5.1, 5.3, etc... hell, I'd love to see some single-stage results at 5.5+ just to see if it keeps going, so we know how high it scales. The lows aren't due to the GPU at that point....
These overclocked versions like the Strix are awesome, but, unless you are at 4K+, the 'value' (and I use that term loosely, like hot-dog-down-a-hallway loosely) of this card for gaming is garbage. Similar to the Titan it 'replaces'.
Lucky bastard. I'm assuming you're saying you were able to order the Asus 3090 Strix O24G.
I've ordered one and am hopefully going to receive it before the Cyberpunk 2077 release (I am from Italy).
What to say? It will probably end up being the fastest 3090, together with the Aorus Xtreme and Zotac AMP Extreme. In $/perf the 3080 is for sure better, but you're paying for the fact that this card is the fastest GPU on Earth.
Benchmark Scores | Faster than yours... I'd bet on it. :) |
---|
80% GPU load is a problem. They should be at 99% when they aren't held back.
Snip
System Name | Cyberline |
---|---|
Processor | Intel Core i7 2600k -> 12600k |
Motherboard | Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Aorus Elite DDR4 |
Cooling | Tuniq Tower 120 -> Custom Watercoolingloop |
Memory | Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz |
Video Card(s) | AMD RX480 -> RX7800XT |
Storage | Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb NVMe SSD |
Display(s) | Philips 32inch LPF5605H (television) -> Dell S3220DGF |
Case | antec 600 -> Thermaltake Tenor HTCP case |
Audio Device(s) | Focusrite 2i4 (USB) |
Power Supply | Seasonic 620watt 80+ Platinum |
Mouse | Elecom EX-G |
Keyboard | Rapoo V700 |
Software | Windows 10 Pro 64bit |
So, NVIDIA is to be blamed for extremely poorly optimized games, which are often limited by CPU performance. AMD fans never fail to disappoint with their utmost disrespect towards intelligence and logic.
And according to you, games under RDNA 2.0 will magically scale better. LMAO.
sooo yeah.
Here's some harsh reality for you:
Tell me exactly how NVIDIA is supposed to fix this suckfest.
That's a common misconception ("80% gpu load is a problem. They should be at 99% when they aren't held back"). GPU load, or utilization, is only a certain kind of metric over a sample period of time. Specifically:
utilization.gpu | Percent of time over the past sample period during which one or more kernels was executing on the GPU. The sample period may be between 1 second and 1/6 second depending on the product. |
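If anyone wants to watch that exact counter themselves, nvidia-smi exposes it as a query field. A minimal sketch, assuming nvidia-smi is on the PATH; the once-per-second polling is just an example choice:

```python
import subprocess
import time

def gpu_utilization_percent() -> int:
    """Read the utilization.gpu counter described above via nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    # nvidia-smi prints one line per GPU; take the first GPU only.
    return int(out.stdout.strip().splitlines()[0])

if __name__ == "__main__":
    for _ in range(5):  # sample a few times while a game/benchmark is running
        print(f"GPU load: {gpu_utilization_percent()}%")
        time.sleep(1)
```

Because the value is averaged over a sample window, a "mere" 80% reading can still mean the GPU is the limiter for part of that window, which is the point being made above.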
You are funny, really.
I like that Nvidia built this BFGPU and put it out there. It just showcases their top-tier product and the potential of their technology.
Kind of like cars, M3 vs. 3-series. You pay for those infinitesimal improvements.
LOL, a lot more power efficient than the 3080? For example, this chart compares an AIB 3090 to a reference 3080. If you take the ASUS TUF 3080, for example, it's 2% better in perf/W than the ref 3080. And if you check back-to-back generations, the 2080 Ti was 22% more efficient than the 1080 Ti (which is not a big leap at all), while the 3080 Ti is also around that 22% uplift from the 2080 Ti (assuming a 96-97% reference in the chart). When you check the 980 Ti to 1080 Ti switch, there is a whopping 65% efficiency increase -- three times more than the 1080 Ti-2080 Ti or the 2080 Ti-3080 Ti switch; a rough chain of those numbers is sketched below.
This card (just like I predicted) is a lot more power efficient than the RTX 3080:
So, your criticism is not totally sincere as the RTX 3080 must be an even worse card in your opinion.
So, NVIDIA is to be blamed for extremely poorly optimized games which are often limited by the CPU performance.
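To make the generation-to-generation arithmetic in that comparison explicit, here is a back-of-envelope sketch that only chains the percentages quoted in the post above; the figures are the post's, not new measurements:

```python
# Chain the quoted per-generation perf/W uplifts, normalised to the 980 Ti,
# to compare the size of each jump (numbers come from the post, not new data).
uplifts = [
    ("980 Ti -> 1080 Ti", 0.65),   # "a whopping 65% efficiency increase"
    ("1080 Ti -> 2080 Ti", 0.22),  # "22% more efficient"
    ("2080 Ti -> 3080 Ti", 0.22),  # "also around that 22% uplift"
]

relative = 1.0  # 980 Ti baseline
for step, gain in uplifts:
    relative *= 1.0 + gain
    print(f"{step}: +{gain:.0%} this step, {relative:.2f}x the 980 Ti overall")

# The single Pascal jump (+65%) dwarfs either later jump (+22%); even the two
# later jumps compounded (1.22 * 1.22 ~= 1.49x) fall short of that one step.
```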
System Name | msdos |
---|---|
Processor | 8086 |
Motherboard | mainboard |
Cooling | passive |
Memory | 640KB + 384KB extended |
Video Card(s) | EGA |
Storage | 5.25" |
Display(s) | 80x25 |
Case | plastic |
Audio Device(s) | modchip |
Power Supply | 45 watts |
Mouse | serial |
Keyboard | yes |
Software | disk commander |
Benchmark Scores | still running |