Processor | Ryzen 7 5700X |
---|---|
Memory | 48 GB |
Video Card(s) | RTX 4080 |
Storage | 2x HDD RAID 1, 3x M.2 NVMe |
Display(s) | 30" 2560x1600 + 19" 1280x1024 |
Software | Windows 10 64-bit |
> Nice Review! But how was the experience with coil whine, if any?

I checked again for you. If I run over 200 FPS and put my ear right next to the card I can hear a little bit of coil whine, but it's minimal, similar to other cards, not worth mentioning.
System Name | M3401 notebook |
---|---|
Processor | 5600H |
Motherboard | NA |
Memory | 16GB |
Video Card(s) | 3050 |
Storage | 500GB SSD |
Display(s) | 14" OLED screen of the laptop |
Software | Windows 10 |
Benchmark Scores | 3050 scores a good 15-20% lower than average, despite ASUS's claims of uber cooling. |
> I really don't get Nvidia's timing this time around. Why show the 3070 a day before the RDNA2 reveal? AMD will surely have an answer for the 3070 in the RDNA2 lineup tomorrow. All it has to do is price it a bit cheaper and pop more VRAM on top to make the 3070 look silly. What am I missing?

The key question is whether this is a real release, or, a la the 3080, a move to trick AMD into pricing its own cards lower while NV shrugs off "oh, no availability, sorry" until it has an actual answer to Big Navi.
> Quite obviously, to steal some thunder. AMD can't 'pop more VRAM' on the card a day before; AMD's card is what it is on that front. Surely clocks are being tweaked, but yeah.

I mean the 3070 is such an easy target for AMD to beat. Adding 4GB of extra GDDR6 costs 20 bucks at most, and as we've seen from reputable leaks, RDNA2 can hit nearly 2.5GHz now. All AMD has to offer is the Xbox Series X's 52CU GPU (less than 300 mm² die size) clocked at 2.2GHz (the PS5's gaming clock) to marginally beat the 3070 in standard rasterization performance. I'd expect Nvidia to wait for the 6700XT announcement and then tweak and release the 3070 accordingly. I think Nvidia will lose the xx70 battle this time around if it can't release a competitive 3070 Ti soon after the 6700XT launches.
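For what it's worth, the paper math behind that claim is easy to check. A rough sketch (peak FLOPS only, ignoring architectural differences; the 2.2GHz RDNA2 part is the hypothetical console-derived GPU from the post above, not an announced product):

```python
def peak_tflops(shaders: int, clock_ghz: float) -> float:
    # Peak FP32 throughput = shaders x 2 ops per clock (FMA) x clock
    return shaders * 2 * clock_ghz / 1000.0

# Hypothetical RDNA2 part from the post: 52 CUs x 64 shaders at 2.2 GHz
print(round(peak_tflops(52 * 64, 2.2), 1))  # ~14.6 TFLOPS
# RTX 3070: 5888 CUDA cores at the ~1.73 GHz rated boost clock
print(round(peak_tflops(5888, 1.73), 1))    # ~20.4 TFLOPS on paper
```

On paper the 3070 is far ahead, but as the SM discussion further down the thread explains, only half of Ampere's FP32 lanes can run while integer work is in flight, which is why the two could land much closer in real rasterization.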
Benchmark Scores | Faster than yours... I'd bet on it. :) |
---|---|
> I mean the 3070 is such an easy target for AMD to beat. Adding 4GB of extra GDDR6 costs 20 bucks at most, and as we've seen from reputable leaks, RDNA2 can hit nearly 2.5GHz now. All AMD has to offer is the Xbox Series X's 52CU GPU (less than 300 mm² die size) clocked at 2.2GHz (the PS5's gaming clock) to marginally beat the 3070 in standard rasterization performance. I'd expect Nvidia to wait for the 6700XT announcement and then tweak and release the 3070 accordingly. I think Nvidia will lose the xx70 battle this time around if it can't release a competitive 3070 Ti soon after the 6700XT launches.

Not the goal I shot at... but ok.
From Guru3d:

> Much like the 3080, the GeForce RTX 3070 does exhibit coil squeal. Is it annoying? It's at a level you can hear it. In a closed chassis, that noise would fade away into the background. However, with an open chassis, you can hear coil whine/squeal. Graphics cards all make this in some form, especially at high framerates; this can be perceived.
> I checked again for you. If I run over 200 FPS and put my ear right next to the card I can hear a little bit of coil whine, but it's minimal, similar to other cards, not worth mentioning.

Guru3d's mention felt like a worse problem. Thanks for your second check, I'll trust you.
System Name | Virtual Reality / Bioinformatics |
---|---|
Processor | Undead CPU |
Motherboard | Undead TUF X99 |
Cooling | Noctua NH-D15 |
Memory | GSkill 128GB DDR4-3000 |
Video Card(s) | EVGA RTX 3090 FTW3 Ultra |
Storage | Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB |
Display(s) | 32'' 4K Dell |
Case | Fractal Design R5 |
Audio Device(s) | BOSE 2.0 |
Power Supply | Seasonic 850 W |
Mouse | Logitech Master MX |
Keyboard | Corsair K70 Cherry MX Blue |
VR HMD | HTC Vive + Oculus Quest 2 |
Software | Windows 10 Pro |
I really don't get Nvidia's timing this time around. Why show the 3070 a day before the RDNA2 reveal? AMD will surely have an answer for the 3070 in the RDNA2 lineup tomorrow. All it has to do is price it a bit cheaper and pop more VRAM on top to make the 3070 look silly. What am I missing?
> @W1zzard: why don't you mention the very high multi-monitor power draw in the cons of your Ampere reviews? Compared to Turing or Pascal it's significantly worse. AMD gets constantly bashed for this metric, but nVidia gets a free pass?

It's half of AMD's?
> Guru3d's mention felt like a worse problem. Thanks for your second check, I'll trust you.

Could be variance between samples.
System Name | My PC |
---|---|
Processor | AMD Ryzen 7 2700X @ 4.1GHz |
Motherboard | Gigabyte X470 Aorus Gaming |
Cooling | Zalman CNPS20X |
Memory | Corsair Vengeance LPX Black 32GB DDR4 |
Video Card(s) | Sapphire Radeon RX 570 PULSE |
Storage | Adata Ultimate SU800 |
Case | Phanteks Eclipse P500A |
Audio Device(s) | Logitech G51 |
Power Supply | Seasonic Focus GX, 80+ Gold, 550W |
Keyboard | Roccat Vulcan 121 |
System Name | Titan |
---|---|
Processor | AMD Ryzen™ 7 7950X3D |
Motherboard | ASRock X870 Taichi Lite |
Cooling | Thermalright Phantom Spirit 120 EVO CPU |
Memory | TEAMGROUP T-Force Delta RGB 2x16GB DDR5-6000 CL30 |
Video Card(s) | ASRock Radeon RX 7900 XTX 24 GB GDDR6 (MBA) |
Storage | Crucial T500 2TB x 3 |
Display(s) | LG 32GS95UE-B, ASUS ROG Swift OLED (PG27AQDP), LG C4 42" (OLED42C4PUA) |
Case | Cooler Master QUBE 500 Flatpack Macaron |
Audio Device(s) | Kanto Audio YU2 and SUB8 Desktop Speakers and Subwoofer, Cloud Alpha Wireless |
Power Supply | Corsair SF1000 |
Mouse | Logitech Pro Superlight 2 (White), G303 Shroud Edition |
Keyboard | Keychron K2 HE Wireless / 8BitDo Retro Mechanical Keyboard (N Edition) / NuPhy Air75 v2 |
VR HMD | Meta Quest 3 512GB |
Software | Windows 11 Pro 64-bit 24H2 Build 26100.2605 |
@W1zzard: why don't you mention the very high multi-monitor power draw in the cons of your Ampere reviews? Compared to Turing or Pascal it's significantly worse. AMD gets constantly bashed for this metric, but nVidia gets a free pass?
> 1. RTRT performance. AMD will be a lot slower.
> 2. DLSS (read: free performance, a tier higher performance than your card is actually capable of). Just missing.

1. RT really doesn't matter for now, and the PS5/Xbox Series X will have RDNA chips in them, so expect AMD's implementation of ray tracing to rule the gaming market, with the only exception being Nvidia-sponsored titles.
> The Ampere SM has 128 CUDA cores, however only 64 of them can execute FP32 operations concurrently with 64 integer operations in a clock cycle, meaning that in the best case the Ampere SM has twice the shading power (128 FP32 per cycle) of a Turing one, and in the worst case it's about as fast (64 FP32). In other words, the SM has some pretty bad constraints making it less efficient per cycle in terms of FP32.

This is exactly what I was looking for, thank you. So indeed, most games will never benefit from the new SM array, it seems, as games seldom run pure FP32 without concurrent integer work, correct?
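To put a rough number on that constraint, here is a toy issue model (my own simplification for illustration; the real warp scheduler is more involved): 64 lanes are FP32-only, the other 64 issue either FP32 or INT each cycle, and effective FP32 throughput then depends on how much integer work the shader mixes in.

```python
def ampere_fp32_per_cycle(int_share: float) -> float:
    """Effective FP32 ops/cycle for one Ampere SM under a toy issue model.

    64 lanes are FP32-only; the other 64 issue either FP32 or INT.
    int_share is the fraction of the instruction stream that is integer math.
    """
    if not 0.0 <= int_share < 1.0:
        raise ValueError("int_share must be in [0.0, 1.0)")
    # Issue is capped at 128 lanes total, and INT work is capped at the
    # 64 shared lanes; whichever bottleneck hits first limits the stream.
    total_issue = 128.0 if int_share == 0 else min(128.0, 64.0 / int_share)
    return (1.0 - int_share) * total_issue

print(ampere_fp32_per_cycle(0.0))             # 128.0: pure-FP32 best case
print(round(ampere_fp32_per_cycle(0.26), 1))  # ~94.7: typical game mix
print(ampere_fp32_per_cycle(0.5))             # 64.0: Turing-like worst case
```

The 0.26 figure comes from NVIDIA's Turing-era claim of roughly 36 integer instructions per 100 FP32 instructions in games, so real shaders tend to land between the two extremes rather than at the 128 headline number.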
System Name | Good enough |
---|---|
Processor | AMD Ryzen 9 7900 - Alphacool Eisblock XPX Aurora Edge |
Motherboard | ASRock B650 Pro RS |
Cooling | 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30 |
Memory | 32GB - FURY Beast RGB 5600 MHz |
Video Card(s) | Sapphire RX 7900 XT - Alphacool Eisblock Aurora |
Storage | 1x Kingston KC3000 1TB, 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB, 1x Samsung 860 EVO 500GB |
Display(s) | LG UltraGear 32GN650-B + 4K Samsung TV |
Case | Phanteks NV7 |
Power Supply | GPS-750C |
This is exactly what I was looking for, thank you. So indeed, most games will never benefit from the new SM array, it seems, as games seldom run pure FP32 without concurrent integer work, correct?
W1zzard has mentioned this, and this has been dispelled a thousand times already:
It's half of AMD's?
> No he has not. Next gen games haven't even been released yet and we are on the edge. He said it was good enough for now, which isn't what we should be looking at, but rather at the next year, as PS5-level games get ported to PC. There is also a difference between "enough" and providing the hardware so that game makers can ship better texture packs, and nVidia is holding PCs back here. I was able to use more than 4GB the day the 1070 released, because game makers released new options and new texture packs.
>
> 4 years from the 770 to the 1070 got us 4x more VRAM. 4 years from the 1070 to the 3070 got us nothing at all. That has never happened before, and I think we shouldn't be so glib about it.

It will be OK to hold your breath... seriously.
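For concreteness, the launch VRAM figures behind the quoted claim (base models only; the GTX 770 also shipped in a 4GB variant):

```python
# Launch VRAM of the base xx70-class cards, in GB (well-documented specs)
vram_gb = {"GTX 770": 2, "GTX 1070": 8, "RTX 3070": 8}
print(vram_gb["GTX 1070"] / vram_gb["GTX 770"])   # 4.0x over that span
print(vram_gb["RTX 3070"] / vram_gb["GTX 1070"])  # 1.0x: no growth at all
```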
A wonderful review and a great card; I just cannot quite accept that it's still priced quite high:
GTX 770 - $399
GTX 970 - $329
GTX 1070 - $449
Lastly, it's the most power-efficient Ampere GPU, but I expected a much higher overall efficiency for this generation; it's only a few percent more efficient than the GTX 1660 Ti, which is based on a much less advanced node. I expect availability will totally suck for at least the next three months: Ampere cards are so good that several generations of AMD/NVIDIA owners are in line to get them. The RTX 3080 is still nowhere to be seen: https://www.nowinstock.net/computers/videocards/nvidia/rtx3080/
There is only ONE benchmark I'm interested in:
MICROSOFT FLIGHT SIMULATOR 2020 at 4K (since no one bothers benchmarking DCS World).
The 3070 is just below the 2080 Ti, at 30 fps (+/- 5 fps).
For $500 vs. $1,200, that's pretty good.
But I have the 3090, and I can't even see 60 fps in that game.