Processor | Ryzen 7 5700X |
---|---|
Memory | 48 GB |
Video Card(s) | RTX 4080 |
Storage | 2x HDD RAID 1, 3x M.2 NVMe |
Display(s) | 30" 2560x1600 + 19" 1280x1024 |
Software | Windows 10 64-bit |
Indeed. Ohh, these are different performance/watt numbers now. I have to re-read the test and rethink everything.
Processor | i9-10850K @ 125W Power Limit |
---|---|
Motherboard | ASUS TUF Gaming Z590-PLUS |
Cooling | Noctua NH-D15S |
Memory | Kingston KF432C16RBK2/64 |
Video Card(s) | ASUS RTX 3070 TUF GAMING O8G @ 950mV / 2010MHz |
Storage | Samsung 970 EVO Plus 2TB + Kingston KC3000 2TB + Samsung 860 EVO 2TB + Samsung 870 EVO 4TB |
Display(s) | ASUS PB287Q + DELL S2719DGF |
Case | FRACTAL Define 7 Dark TG |
Audio Device(s) | integrated + Microlab FC330 / Audio-Technica ATH-M50s/LE |
Power Supply | Seasonic PRIME TX-650, 80+ Titanium, 650W |
Mouse | SteelSeries Rival 600 |
Keyboard | Corsair K70 RGB TKL – CHERRY MX SPEED |
> This review has been updated with new gaming power consumption numbers for the RTX 2080 Ti, RTX 3070, RTX 3080, RTX 3090, RX 6800, and RX 6800 XT. For these cards I'm now running Metro at 1440p instead of 1080p, to ensure proper loading. The perf/W charts have been updated too, along with the relevant text.

Now, looking at the new numbers, it's even clearer to me what my problem with Ampere is.
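For anyone wanting to sanity-check the updated charts, the perf/W arithmetic itself is simple. Here is a minimal sketch (placeholder card names and numbers, not the review's measurements) of how average FPS and average gaming power combine into the relative efficiency percentages:

```python
# Minimal perf/W sketch. Placeholder data only, not review measurements.

def perf_per_watt(avg_fps: float, avg_power_w: float) -> float:
    """Frames delivered per watt of average gaming power draw."""
    return avg_fps / avg_power_w

# Hypothetical (card, average FPS, average gaming watts) entries.
cards = [
    ("Card A", 100.0, 220.0),
    ("Card B", 120.0, 320.0),
]

baseline = perf_per_watt(cards[0][1], cards[0][2])
for name, fps, watts in cards:
    eff = perf_per_watt(fps, watts)
    print(f"{name}: {eff:.3f} FPS/W ({100 * eff / baseline:.0f}% of {cards[0][0]})")
```

Because the FPS numbers did not change, any movement in the efficiency ranking comes entirely from the revised power measurements, which is why the charts had to be redone.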
System Name | ibuytheusedstuff |
---|---|
Processor | 5960x |
Motherboard | x99 sabertooth |
Cooling | old socket775 cooler |
Memory | 32 Viper |
Video Card(s) | 1080ti on morpheus 1 |
Storage | raptors+ssd |
Display(s) | acer 120hz |
Case | open bench |
Audio Device(s) | onb |
Power Supply | antec 1200 moar power |
Mouse | mx 518 |
Keyboard | roccat arvo |
Indeed. What's the outcome of your reconsideration? Personally, I don't think it changes much.
Processor | Intel Core Ultra 9 285K |
---|---|
Motherboard | MSI MAG Z890 TOMAHAWK WIFI | BIOS 1.A71 |
Cooling | Noctua NH-D15 G2 | ARCTIC MX-6 |
Memory | G.Skill Trident Z5 RGB 96GB (2x48GB) DDR5 6400MHz | CL32-39-39-102-141-701-2T | 1.35V | Gear 2 |
Video Card(s) | MSI GeForce RTX 5080 VANGUARD SOC 16GB |
Storage | Intel Optane 900P 280GB | WD Black 10TB WD101FZBX |
Display(s) | AOC AG274QZM 27" 2560x1440 10-bit 240Hz |
Case | Lian Li O11 Air Mini Black |
Audio Device(s) | Creative Sound Blaster AE-7 | Audio-Technica ATH-A990Z |
Power Supply | be quiet! Dark Power 13 750W |
Mouse | Logitech G MX518 |
Keyboard | Logitech G213 Prodigy |
Software | Windows 11 Pro x64 24H2 |
Processor | Ryzen 7 5700X |
---|---|
Memory | 48 GB |
Video Card(s) | RTX 4080 |
Storage | 2x HDD RAID 1, 3x M.2 NVMe |
Display(s) | 30" 2560x1600 + 19" 1280x1024 |
Software | Windows 10 64-bit |
> I don't know if this is a driver issue or something completely different, but in this review the image quality with ray tracing in Watch Dogs Legion on AMD cards is clearly different from that of NVIDIA cards. I am really curious whether some owner of an RX 6800 or RX 6800 XT could please test whether it's true and, if so, whether it applies to more titles. Thanks for any information on this.

AMD lists this in their "known issues" document; some reviewers chose to repro the issue and report on it. I'm sure there are many driver issues in RT, because all RT so far has been developed for and tested on NV only. I'm actually impressed you can run existing games.
System Name | Legion |
---|---|
Processor | i7-12700KF |
Motherboard | Asus Z690-Plus TUF Gaming WiFi D5 |
Cooling | Arctic Liquid Freezer 2 240mm AIO |
Memory | PNY MAKO DDR5-6000 C36-36-36-76 |
Video Card(s) | PowerColor Hellhound 6700 XT 12GB |
Storage | WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB |
Display(s) | Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440 |
Case | Montech Air X |
Power Supply | Corsair CX750M |
Mouse | Logitech MX Anywhere 2S |
Keyboard | Logitech MX Keys |
Software | Lots |
Processor | Ryzen 5 3600 |
---|---|
Motherboard | ASRock X470 Taichi |
Cooling | Scythe Kotetsu Mark II |
Memory | G.SKILL 32GB DDR4 3200 CL16 |
Video Card(s) | EVGA GeForce RTX 3070 FTW3 Ultra (1980 MHz / 0.968 V) |
Display(s) | Dell P2715Q; BenQ EX3501R; Panasonic TC-P55S60 |
Case | Fractal Design Define R5 |
Audio Device(s) | Sennheiser HD580; 64 Audio 1964-Q |
Power Supply | Seasonic SSR-650TR |
Mouse | Logitech G700s; Logitech G903 |
Keyboard | Cooler Master QuickFire TK; Kinesis Advantage |
VR HMD | Quest 2 |
> I guess this thing is evolving into "what do we want to test for power?". I would definitely throw out "cold card", but the other three results are all valid in their own way. Opinions?

This is super interesting, and reveals something that no other reviewer has noticed.
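To make the "cold card" question concrete: while the card is still warming up, clocks and fan behaviour have not settled, so those samples measure something other than sustained gaming draw. A minimal sketch, assuming a fabricated (second, watts) log and a 60-second settling window, shows how much that window can move the reported average:

```python
# Sketch: how much the warm-up window can skew a reported average.
# The (second, watts) samples are fabricated, not real measurements.

WARMUP_SECONDS = 60  # assumed settling time; varies per card and cooler

samples = [(t, 190.0 if t < 60 else 225.0) for t in range(300)]

overall = sum(w for _, w in samples) / len(samples)
settled = [w for t, w in samples if t >= WARMUP_SECONDS]
steady = sum(settled) / len(settled)

print(f"average including cold start: {overall:.1f} W")
print(f"average after warm-up:        {steady:.1f} W")
```

The other scenarios each answer a legitimate question (worst case, typical case, sustained case), which is why keeping all of them, minus the cold run, seems defensible.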
System Name | RyzenGtEvo/ Asus strix scar II |
---|---|
Processor | Amd R5 5900X/ Intel 8750H |
Motherboard | Crosshair hero8 impact/Asus |
Cooling | 360 EK extreme rad + 360 EK slim, all push; CPU EK suprim, GPU full cover, all EK |
Memory | Gskill Trident Z 3900cas18 32Gb in four sticks./16Gb/16GB |
Video Card(s) | Asus tuf RX7900XT /Rtx 2060 |
Storage | Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme |
Display(s) | Samsung U28E850R 28" 4K FreeSync + Dell (secondary) |
Case | Lianli 011 dynamic/strix scar2 |
Audio Device(s) | Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset |
Power Supply | corsair 1200Hxi/Asus stock |
Mouse | Roccat Kova/ Logitech G wireless |
Keyboard | Roccat Aimo 120 |
VR HMD | Oculus rift |
Software | Win 10 Pro |
Benchmark Scores | laptop Timespy 6506 |
> Your opinion, I disagree with parts of it, but technically ANY card is more attractive at £130 cheaper, is it not?

Interesting product placement, though I don't know that it will find a home priced as it is .... If I was to look at the 3070 ... I don't see anything that would make it worthwhile to move up to the $580 price point ... and if it dies, I'd be more inclined to go up to $650 or $700.
As to the extra VRAM, it's the proverbial "teats on a bull". In every case where multiple VRAM versions of a card have been issued (yes, every single instance of comparison testing on a reliable web site, from the 6xx, 7xx, 9xx, and 10xx series where this has occurred), it has never been shown that the extra VRAM brought anything to the table. By the time you jack up the settings to a point where the VRAM difference is significant, you have outstripped the capabilities of the GPU; a 33% increase in fps is meaningless when that increase is from 15 to 20 fps. While a small number of games, primarily poor console ports, have been exceptions, the exception, as the saying goes, proves the rule.
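To put numbers on that fps point (illustrative values mirroring the example above, not benchmark data):

```python
# The percentage looks impressive; the absolute frame rates do not.
old_fps, new_fps = 15, 20
uplift = 100 * (new_fps - old_fps) / old_fps
print(f"{old_fps} -> {new_fps} fps is a {uplift:.0f}% gain, yet both are unplayable")
```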
GTX 770 4GB vs 2GB Showdown - AlienBabelTech (alienbabeltech.com)
Do you need 4GB of RAM? We tested EVGA's GTX 770 4GB versus Nvidia's GTX 770 2GB version, at 1920x1080, 2560x1600 and 5760x1080.
"There isn’t a lot of difference between the cards at 1920×1080 or at 2560×1600. We only start to see minimal differences at 5760×1080, and even so, there is rarely a frame or two difference. If we start to add even more AA, in most cases, the frame rates will drop to unplayable on both cards. This leaves five games out of 30 where a 4GB GTX 770 gives more than a 1 frame per second difference over the 2GB-equipped GTX 770. And one of them, Metro: Last Light still isn’t even quite a single frame difference.
here is one last thing to note with Max Payne 3: It would not normally allow one to set 4xAA at 5760×1080 with any 2GB card as it claims to require 2750MB. However, when we replaced the 4GB GTX 770 with the 2GB version, the game allowed the setting. And there were no slowdowns, stuttering, nor any performance differences that we could find between the two GTX 770s. "
Is 4GB of VRAM enough? AMD's Fury X faces off with Nvidia's GTX 980 Ti, Titan X (www.extremetech.com)
Is 4GB enough for a high-end GPU? We investigated and tested 15 titles to find out.
"Some games won’t use much VRAM, no matter how much you offer them, while others are more opportunistic. This is critically important for our purposes, because there’s not an automatic link between the amount of VRAM a game is using and the amount of VRAM it actually requires to run. Our first article on the Fury X showed how Shadow of Mordor actually used dramatically more VRAM on the GTX Titan X as compared with the GTX 980 Ti, without offering a higher frame rate.
While we do see some evidence of a 4GB barrier on AMD cards that the NV hardware does not experience, provoking this problem in current-generation titles required us to use settings that rendered the games unplayable on any current GPU."
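That usage-versus-requirement distinction is easy to observe first-hand. Here is a hedged sketch, assuming a single NVIDIA GPU with nvidia-smi on the PATH (the query flags are standard nvidia-smi options): polling allocated VRAM while a game runs shows what the game has grabbed, which, as the article notes, is not the same as what it needs to hold frame rates.

```python
# Poll allocated VRAM once per second. Allocation is opportunistic:
# a high reading here does not by itself prove the game needs that much.
import subprocess
import time

for _ in range(5):
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    used_mib, total_mib = (int(v) for v in out.splitlines()[0].split(","))
    print(f"{used_mib} MiB / {total_mib} MiB allocated")
    time.sleep(1)
```

Run it with a game in the foreground and the same title will often report very different allocations on cards with different VRAM pools, at identical frame rates.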
W1zzard evidently agrees here, echoing what others have said before, when he writes:
"Much has been talked about VRAM size during NVIDIA's Ampere launch. The GeForce RTX 3080 has 10 GB VRAM, which I still think is plenty for today and the near future. Planning many years ahead with hardware purchases makes no sense, especially when it comes to VRAM—you'll run out of shading power long before memory becomes a problem. "
This kinda reminds me of the 960 / 380 era ... the 380X was $230-240ish ... the $60-70 jump to the 970 was just too small to ignore. There's nothing here to complain about; it's a solid card. However, priced between the 3070 and the 6800 XT / 3080, AMD's primary competition here is itself. At $450, it would make a lot more sense.
Processor | AMD Ryzen 5 5600@80W |
---|---|
Motherboard | MSI B550 Tomahawk |
Cooling | ZALMAN CNPS9X OPTIMA |
Memory | 2*8GB PATRIOT PVS416G400C9K@3733MT_C16 |
Video Card(s) | Sapphire Radeon RX 6750 XT Pulse 12GB |
Storage | Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB |
Display(s) | AOC 27G2U/BK IPS 144Hz |
Case | SHARKOON M25-W 7.1 BLACK |
Audio Device(s) | Realtek 7.1 onboard |
Power Supply | Seasonic Core GC 500W |
Mouse | Sharkoon SHARK Force Black |
Keyboard | Trust GXT280 |
Software | Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux |