| System Name | Gamey #1 / #3 |
|---|---|
| Processor | Ryzen 7 5800X3D / Ryzen 7 5700X3D |
| Motherboard | ASRock B450M P4 / MSI B450M Pro-VDH |
| Cooling | ID-Cooling SE-226-XT / ID-Cooling SE-224-XTS |
| Memory | 32GB 3200 CL16 / 16GB 3200 CL16 |
| Video Card(s) | PowerColor 6800 XT / Gigabyte RTX 3070 |
| Storage | 4TB Team MP34 / 2TB WD SN570 |
| Display(s) | LG 32GK650F 1440p 144Hz VA |
| Case | Corsair 4000D Airflow / TT Versa H18 |
| Power Supply | EVGA 650 G3 / EVGA BQ 500 |
Yet in actual games that people play on computers (you know: computer games), the numbers look far better than that for AMD cards when using RT. And all that graph shows is that even a $1,200 card can barely crack 30 fps, and it makes every hyped previous-gen GPU from the same company look utterly inadequate. Please explain how $1,200 for 30 fps is good for PC gaming.

You can check hardware RT in Unreal Engine: a 4090 is three times as fast as a 7900 XTX, and no one can claim Unreal is Nvidia-optimized.
This is from Puget Systems' review. But somehow people on this forum will insist the difference in RT is 16%. What can you do, AMD defenders just hate reality.
But this is a fanboy slide of course, we all know that.
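For what it's worth, a "16% average" and a "3x in one engine" can both come from the same data: averaging across a suite where most titles use light RT effects dilutes the one title with a heavy path-traced workload. A toy sketch with entirely made-up per-game ratios (not measured numbers) shows the effect:

```python
import math

# Hypothetical per-game RT performance ratios (faster card / slower card).
# Eight lightly ray-traced titles plus one heavy path-traced outlier.
ratios = [1.10, 1.12, 1.15, 1.08, 1.20, 1.12, 1.10, 1.15, 3.00]

# Geometric mean, the usual way review sites aggregate relative performance.
geomean = math.exp(sum(math.log(r) for r in ratios) / len(ratios))

print(f"suite-average uplift: {geomean - 1:.0%}")   # modest overall figure
print(f"single-title uplift:  {max(ratios) - 1:.0%}")  # the outlier on its own
```

With these invented numbers the suite average lands around 26% even though one title shows a 200% gap; which figure is "reality" depends entirely on which games you weight.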