Processor | Ryzen 5700x |
---|---|
Motherboard | Gigabyte X570S Aero G R1.1 BiosF5g |
Cooling | Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm |
Memory | Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A) |
Video Card(s) | AMD RX 6800 - Asus Tuf |
Storage | Kingston KC3000 1TB & 2TB & 4TB Corsair MP600 Pro LPX |
Display(s) | LG 27UL550-W (27" 4k) |
Case | Be Quiet Pure Base 600 (no window) |
Audio Device(s) | Realtek ALC1220-VB |
Power Supply | SuperFlower Leadex V Gold Pro 850W ATX Ver2.52 |
Mouse | Mionix Naos Pro |
Keyboard | Corsair Strafe with browns |
Software | W10 22H2 Pro x64 |
It's amazing you've just signed up to spoil the launch with your very valuable opinion. Also, remember, we live in a free society, so please stop it with "overpriced". Did NVIDIA put a gun to your head and ask you to buy any of their GPUs? No? Then how on Earth are they overpriced? Also, I agree, "10GB VRAM is very low... for 4K". Except this card features 10GB, and we have close to zero games which actually require more than 8GB of VRAM at 4K. Also, read the rest of my comment.
Speaking of "planned obsolescence" due to a lack of VRAM:
See how the GTX 1060 3GB still works relatively OK despite not having had enough VRAM even five years ago. Yes, it's very slow, in fact 33% slower, but not two or three times as slow as its 6GB brother. Also, see how both cards are unusable at this resolution.
Still, you can always buy a "future-proof" GPU from AMD. BTW, do you remember the Radeon R9 Fury? It was released with a paltry 4GB of VRAM, which was, strangely, enough for AMD fans.
- Game developers target the most popular GPUs, and most of them have 8GB of VRAM or even less. Consoles also won't really offer more than 8GB of VRAM, because their memory pool is shared between the OS core, game code and VRAM, and all of that has to fit into 16GB or less (a rough budget sketch follows this list). And both consoles feature uber-fast texture streaming off their storage.
- Also, it's been shown time and again that NVIDIA has superb drivers and their GPUs' performance is not seriously affected even when there's not "enough" VRAM. E.g. Call of Duty: Modern Warfare eats VRAM for breakfast (over 9.5GB of VRAM used at 4K), yet are NVIDIA cards with 6GB of VRAM affected? Not at all. Besides, with RTX IO it all becomes moot.
- Lastly, by the time 10GB of VRAM is not enough, this GPU's performance will be too low anyway, even if it had twice as much VRAM.
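As a rough sanity check on the console side of that argument, here is a small sketch using the publicly stated Xbox Series X memory split; the exact game/VRAM division within the game budget is an assumption that varies per title:

```python
# Rough console memory budget, using the publicly stated Xbox Series X
# split: 16 GB unified pool, ~2.5 GB reserved for the OS, and a 10 GB
# GPU-optimal (560 GB/s) partition. Per-title splits will vary.
TOTAL_GB = 16.0
OS_RESERVED_GB = 2.5
GPU_OPTIMAL_GB = 10.0

game_budget = TOTAL_GB - OS_RESERVED_GB
print(f"Left for game code + data: {game_budget:.1f} GB")          # 13.5 GB
print(f"Of which GPU-optimal ('VRAM-like'): {GPU_OPTIMAL_GB} GB")
```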
System Name | The de-ploughminator Mk-III |
---|---|
Processor | 9800X3D |
Motherboard | Gigabyte X870E Aorus Master |
Cooling | DeepCool AK620 |
Memory | 2x32GB G.Skill 6400MT CL32 |
Video Card(s) | Asus RTX4090 TUF |
Storage | 4TB Samsung 990 Pro |
Display(s) | 48" LG OLED C4 |
Case | Corsair 5000D Air |
Audio Device(s) | KEF LSX II LT speakers + KEF KC62 Subwoofer |
Power Supply | Corsair HX850 |
Mouse | Razor Death Adder v3 |
Keyboard | Razor Huntsman V3 Pro TKL |
Software | win11 |
Vega was also on an inferior node &, admittedly, an inferior uarch, which compounded their problem. But what's clear is that there's still bias against AMD & towards Nvidia: JHH can launch a proverbial turd (remember Fermi?) & still get accolades, while AMD not only has to please the audience but also has to pay them to do so.
System Name | Wut? |
---|---|
Processor | 3900X |
Motherboard | ASRock Taichi X570 |
Cooling | Water |
Memory | 32GB G.Skill CL16 3600MHz |
Video Card(s) | Vega 56 |
Storage | 2 x AData XPG 8200 Pro 1TB |
Display(s) | 3440 x 1440 |
Case | Thermaltake Tower 900 |
Power Supply | Seasonic Prime Ultra Platinum |
That was also why Raja left.
System Name | H7 Flow 2024 |
---|---|
Processor | AMD 5800X3D |
Motherboard | Asus X570 Tough Gaming |
Cooling | Custom liquid |
Memory | 32 GB DDR4 |
Video Card(s) | Intel ARC A750 |
Storage | Crucial P5 Plus 2TB. |
Display(s) | AOC 24" Freesync 1m.s. 75Hz |
Mouse | Lenovo |
Keyboard | Eweadn Mechanical |
Software | W11 Pro 64 bit |
> There's no way I'm buying ANYTHING from someone who wears a leather jacket in the kitchen.

Would you rather he wore only an apron?
Processor | Ryzen 5700x |
---|---|
Motherboard | Gigabyte X570S Aero G R1.1 BiosF5g |
Cooling | Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm |
Memory | Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A) |
Video Card(s) | AMD RX 6800 - Asus Tuf |
Storage | Kingston KC3000 1TB & 2TB & 4TB Corsair MP600 Pro LPX |
Display(s) | LG 27UL550-W (27" 4k) |
Case | Be Quiet Pure Base 600 (no window) |
Audio Device(s) | Realtek ALC1220-VB |
Power Supply | SuperFlower Leadex V Gold Pro 850W ATX Ver2.52 |
Mouse | Mionix Naos Pro |
Keyboard | Corsair Strafe with browns |
Software | W10 22H2 Pro x64 |
> RTG just reaps what they sowed; the market will respond when RTG makes a good product, like Ryzen.

Oh sure, when AMD fucks up it's their fault, and when they don't it's still their fault.

> Well, you can blame AMD for that, because the R&D funding was pooled into CPU development.

For good reason: going EPYC not only helped them stay afloat, they also now lead the top-end CPU market on virtually all platforms.
System Name | msdos |
---|---|
Processor | 8086 |
Motherboard | mainboard |
Cooling | passive |
Memory | 640KB + 384KB extended |
Video Card(s) | EGA |
Storage | 5.25" |
Display(s) | 80x25 |
Case | plastic |
Audio Device(s) | modchip |
Power Supply | 45 watts |
Mouse | serial |
Keyboard | yes |
Software | disk commander |
Benchmark Scores | still running |
Processor | 6700K |
---|---|
Motherboard | M8G |
Cooling | D15S |
Memory | 16GB 3k15 |
Video Card(s) | 2070S |
Storage | 850 Pro |
Display(s) | U2410 |
Case | Core X2 |
Audio Device(s) | ALC1150 |
Power Supply | Seasonic |
Mouse | Razer |
Keyboard | Logitech |
Software | 22H2 |
Processor | Ryzen 7 5700X |
---|---|
Memory | 48 GB |
Video Card(s) | RTX 4080 |
Storage | 2x HDD RAID 1, 3x M.2 NVMe |
Display(s) | 30" 2560x1600 + 19" 1280x1024 |
Software | Windows 10 64-bit |
> Do you think a v-mod, like the shunt mods I have done on many cards, will help with overclocking headroom?

Most certainly.
> Is there any plan to add the Call of Duty series back into reviews any time soon?

No plans. Bnet banned me for trying to benchmark the game, and their support didn't want to admit it and chose to ignore my ticket for 3 months.
System Name | The de-ploughminator Mk-III |
---|---|
Processor | 9800X3D |
Motherboard | Gigabyte X870E Aorus Master |
Cooling | DeepCool AK620 |
Memory | 2x32GB G.Skill 6400MT CL32 |
Video Card(s) | Asus RTX4090 TUF |
Storage | 4TB Samsung 990 Pro |
Display(s) | 48" LG OLED C4 |
Case | Corsair 5000D Air |
Audio Device(s) | KEF LSX II LT speakers + KEF KC62 Subwoofer |
Power Supply | Corsair HX850 |
Mouse | Razer DeathAdder V3 |
Keyboard | Razer Huntsman V3 Pro TKL |
Software | win11 |
The performance is impressive, finally a true 4K card, and now that it's been confirmed to run 4K 120Hz HDR on LG OLEDs, this combination is just phenomenal.
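For context on the 4K120 HDR point: a back-of-the-envelope bandwidth estimate (raw pixel data only, ignoring blanking and link-encoding overhead, so the real requirement is higher still) shows why this combination needed HDMI 2.1:

```python
# Raw bandwidth for 4K 120 Hz 10-bit RGB (HDR). HDMI 2.0 tops out
# around 18 Gbps; HDMI 2.1 raises that to 48 Gbps, which is why
# Ampere + an HDMI 2.1 LG OLED can finally do 4K120 HDR.
width, height, refresh_hz = 3840, 2160, 120
bits_per_pixel = 3 * 10  # RGB, 10 bits per channel

gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"~{gbps:.1f} Gbps raw")  # ~29.9 Gbps, already past HDMI 2.0
```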
But the card is way too expensive. The $700 is just fake (I assume it doesn't include tax); elsewhere it's much more expensive (even in Asia, where the card is built, it can cost up to $1,000 or more). Why not use pricing similar to the consoles' MSRP? The PS5 and Series X cost around the same in Europe, North America and Japan.
At the end of the day: great performance, ruined by stupid pricing (I'm talking about nVidia's MSRP).
System Name | MightyX |
---|---|
Processor | Ryzen 9800X3D |
Motherboard | Gigabyte X650I AX |
Cooling | Scythe Fuma 2 |
Memory | 32GB DDR5 6000 CL30 |
Video Card(s) | Asus TUF RTX3080 Deshrouded |
Storage | WD Black SN850X 2TB |
Display(s) | LG 42C2 4K OLED |
Case | Coolermaster NR200P |
Audio Device(s) | LG SN5Y / Focal Clear |
Power Supply | Corsair SF750 Platinum |
Mouse | Corsair Dark Core RBG Pro SE |
Keyboard | Glorious GMMK Compact w/pudding |
VR HMD | Meta Quest 3 |
Software | case populated with Arctic P12's |
Benchmark Scores | 4k120 OLED Gsync bliss |
> If it has a manual power limit adjustment range of up to 370 W, from a default of 320 W, is it also possible to lower max power consumption, for example to 250 W?

There are a couple of links on page 9; it seems that this is indeed possible and the results are promising. One site brought the power-limit slider down to 83% to match the 270 W draw of a 2080 Ti, but it only resulted in a ~4% performance loss, which is not bad at all! The hypothesis is that Nvidia has had to push the GPU beyond its optimal efficiency sweet spot to hit the performance target they were seeking.
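A quick sanity check of those numbers; the 320 W default, the 83% slider setting and the ~4% performance loss are taken from the post above, the rest is arithmetic:

```python
# Back-of-the-envelope check of the 83% power-limit claim above.
default_w = 320
slider = 0.83
perf_loss = 0.04

limited_w = default_w * slider                  # ~266 W, near a 2080 Ti's 270 W
efficiency_gain = (1 - perf_loss) / slider - 1  # perf/W vs. stock
print(f"Power limit: {limited_w:.0f} W")
print(f"Perf/W gain vs. stock: {efficiency_gain:.1%}")  # ~15.7%
```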
System Name | 2nd AMD puppy |
---|---|
Processor | FX-8350 Vishera |
Motherboard | Gigabyte GA-970A-UD3 |
Cooling | Cooler Master Hyper TX2 |
Memory | 16 GB DDR3: 8 GB Kingston HyperX Beast + 8 GB G.Skill Sniper (by courtesy of tabascosauz & TPU) |
Video Card(s) | Sapphire RX 580 Nitro+; 1450/2000 MHz |
Storage | SSD: 840 Pro 128 GB; Iridium Pro 240 GB; HDD: 2x WD 1 TB |
Display(s) | Benq XL2730Z 144 Hz freesync |
Case | NZXT 820 PHANTOM |
Audio Device(s) | Audigy SE with Logitech Z-5500 |
Power Supply | Riotoro Enigma G2 850W |
Mouse | Razer copperhead / Gamdias zeus (by courtesy of sneekypeet & TPU) |
Keyboard | MS Sidewinder x4 |
Software | win10 64bit ltsc |
Benchmark Scores | irrelevant for me |
Processor | 7800X3D (CO tuned) |
---|---|
Motherboard | Asrock B650m HDV |
Cooling | Peerless Assassin SE |
Memory | 2x16GB DR A-die@6000c30 tuned |
Video Card(s) | Asus 4070 dual OC 2610@915mv |
Storage | WD blue 1TB nvme |
Display(s) | Lenovo G24-10 144Hz |
Case | Corsair D4000 Airflow |
Power Supply | EVGA GQ 650W |
Software | Windows 10 home 64 |
Benchmark Scores | Superposition 8k 5267 Aida64 58.5ns |
> That's because lower resolutions are CPU limited.

Perhaps all GPU power-consumption testing should be done at 4K, so we get a true idea of how much power the card uses when it is not CPU-limited?

Edit: Actually, you're making a great point. I'm using the same 303 W typical power consumption value from the power measurements page for all 3 resolutions, which isn't 100% accurate. Because some games are CPU limited, the power consumption in those games is lower too, which I'm not taking into account.
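To make that caveat concrete, a minimal sketch of how a fixed power figure skews perf/W in CPU-limited results; the 303 W value is from the post above, while the 240 W draw and the FPS value are hypothetical illustrations, not measured data:

```python
# Dividing every result by one fixed "typical gaming" power figure
# understates efficiency in CPU-limited runs, where actual draw drops.
fps_cpu_limited = 150.0
fixed_power_w = 303.0   # single value used for all resolutions
actual_power_w = 240.0  # hypothetical draw when the GPU isn't saturated

print(f"Perf/W with fixed power:  {fps_cpu_limited / fixed_power_w:.2f} FPS/W")  # 0.50
print(f"Perf/W with actual power: {fps_cpu_limited / actual_power_w:.2f} FPS/W") # 0.62
```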
> You are mixing up supply and demand there; retailers just like to jack up the price when stuff is in short supply. Just wait a few months until the price settles; PS5/XBX prices will be inflated when they first launch, too.

No I'm not, this is nVidia's MSRP: for instance, they price the 3080 at $699 in the US but $1,000+ in Japan. The prices are from their website.
> It isn't NVIDIA's fault if you can't read a chart; it's 1.9x at the same performance...

Of course I had noticed it. On Tom's IT I had also pointed it out to everyone, but they still declared 1.5x at the same power draw! See the picture.
Processor | Ryzen 7800X3D |
---|---|
Motherboard | ROG STRIX B650E-F GAMING WIFI |
Memory | 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5) |
Video Card(s) | INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2 |
Storage | 2TB Samsung 980 PRO, 4TB WD Black SN850X |
Display(s) | 42" LG C2 OLED, 27" ASUS PG279Q |
Case | Thermaltake Core P5 |
Power Supply | Fractal Design Ion+ Platinum 760W |
Mouse | Corsair Dark Core RGB Pro SE |
Keyboard | Corsair K100 RGB |
VR HMD | HTC Vive Cosmos |
> No I'm not, this is nVidia's MSRP: for instance, they price the 3080 at $699 in the US but $1,000+ in Japan. The prices are from their website. So why didn't they price the cards at $699 or thereabouts everywhere? The consoles are around the same MSRP in most markets.

Consoles are not quite around the same MSRP either. For example, the recently announced price of the Xbox Series X is $499/€499/£449 and ¥49,980. Compared to the US price, the Japanese price is reduced, as it is a difficult market for Xbox, and the EU/GB prices reflect included taxes.
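For what it's worth, the console prices only look identical because the EU/UK figures bake in VAT; a rough comparison, assuming a 20% VAT rate and treating the exchange rate as roughly 1:1 for illustration:

```python
# Why $499 and 499 euros are "about the same" price: US MSRPs exclude
# sales tax while EU prices include VAT. The 20% VAT rate and ~1:1
# exchange rate are simplifying assumptions.
eu_price_inc_vat = 499.0
vat_rate = 0.20

eu_price_ex_vat = eu_price_inc_vat / (1 + vat_rate)
print(f"EU price before VAT: ~{eu_price_ex_vat:.0f} EUR")  # ~416 EUR
# vs. the $499 US MSRP, which is quoted before sales tax
```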
> 13-24% performance increase over the 2080 Ti, depending on resolution, with 28% higher TDP. Don't bother selling your 2080 Ti.

15-31%, actually. The RTX 3080 is considerably more CPU limited at 1440p.
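Taking the quoted figures at face value, the perf/W comparison is straightforward; the 250 W and 320 W values are the cards' official board power figures, matching the quoted 28% TDP increase:

```python
# Perf-per-watt of the 3080 vs. the 2080 Ti, using the performance
# gains quoted above against the 28% TDP increase (250 W -> 320 W).
tdp_ratio = 320 / 250  # = 1.28

for gain in (0.13, 0.24, 0.31):
    print(f"+{gain:.0%} perf -> perf/W ratio {(1 + gain) / tdp_ratio:.2f}")
# 0.88 at +13%, 0.97 at +24%, 1.02 at +31%: roughly flat efficiency
```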
System Name | Hotbox |
---|---|
Processor | AMD Ryzen 7 5800X, 110/95/110, PBO +150MHz, CO -7,-7,-20(x6) |
Motherboard | ASRock Phantom Gaming B550 ITX/ax |
Cooling | LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14 |
Memory | 32GB G.Skill FlareX 3200c14 @3800c15 |
Video Card(s) | PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W |
Storage | 2TB Adata SX8200 Pro |
Display(s) | Dell U2711 main, AOC 24P2C secondary |
Case | SSUPD Meshlicious |
Audio Device(s) | Optoma Nuforce μDAC 3 |
Power Supply | Corsair SF750 Platinum |
Mouse | Logitech G603 |
Keyboard | Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps |
Software | Windows 10 Pro |
> Just check the relative performance of the 2080 Ti compared to the 3080 across the 3 resolutions and you will see a change:
> 1080p: 87%
> 1440p: 81%
> 4K: 75%
> Might as well use DSR to do some 8K testing on this bad boy. People with 1080p and 1440p screens might as well use DSR if they have the 3080; otherwise you are just wasting all the performance prowess of the 3080.

...if those numbers show anything, it's that there are no CPU bottlenecks at 4K, given that it scales beyond the lower resolutions. Which is exactly what I was pointing out, while @birdie was claiming that this GPU is CPU limited even at 4K.
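Inverting those relative-performance numbers makes the scaling argument explicit; the percentages are the ones quoted above:

```python
# Turning "2080 Ti at X% of a 3080" into the 3080's lead at each
# resolution; the widening gap at 4K is the scaling being discussed.
relative = {"1080p": 0.87, "1440p": 0.81, "4K": 0.75}

for res, share in relative.items():
    print(f"{res}: 3080 is {1 / share - 1:+.0%} faster")
# 1080p: +15%, 1440p: +23%, 4K: +33%
```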
> Quite a lot of games being reviewed are severely CPU limited even at 4K!

Which I then asked for some data demonstrating, as this review fails to show any such bottleneck. And, as GN pointed out in their excellent video review, not all apparent CPU limitations are actual CPU limitations: some examples of poor GPU scaling are down to the architecture or layout of the GPU causing GPU-side bottlenecks that don't show up as 100% load.
> That's a strange way to think about it; if it is possible to exceed that amount, then it inevitably becomes a limitation.

Only if there is a perceptible difference in graphical quality; without that, the difference is entirely theoretical. And that's the entire point: a lot of games have extremely "expensive" Ultra settings tiers with near-imperceptible or even entirely imperceptible differences in quality. If your benchmark for not having a VRAM limitation is the ability to enable all of these, then your benchmark is problematic. If the thing you're worried about for this GPU is that you might at some point in the future need to lower settings by an imperceptible amount to maintain performance, then... what are you worried about, exactly? Stop promoting graphical-quality nocebo effects, please. Because at that point, all you are arguing for is the value/security of knowing your GPU can handle everything set to Ultra, no matter what that actually means for the quality of the game. Which is just silly. Not only is it an impossible target, it's a fundamentally irrational one.
Processor | R5 5600X |
---|---|
Motherboard | Asus TUF Gaming X570-Plus |
Memory | 32 GB 3600 MT/s CL16 |
Video Card(s) | Sapphire Vega 64 |
Storage | 2x 500 GB SSD, 2x 3 TB HDD |
Case | Phanteks P300A |
Software | Manjaro Linux, W10 if I have to |