Says the guy who doesn't understand these are ballpark figures. I'm starting to feel sorry for the people who can't grasp the obvious: it doesn't matter that they don't specify which GPUs they're using (although even a child would notice they're an XTX, a 4090 and a 5090 respectively, given the known difference between the first two). What matters is that they are different GPUs, and that they point to GPU performance as shady marketing for their GDDR technology, because they certainly aren't using the same GPU to isolate the difference the technology is supposed to bring.
Micron is already being hammered by multiple lawsuits, in case you didn't know. I wouldn't be surprised if someone shoved another one up their backside over this.
This is GDDR; you will never see several generations paired with the same GPU so you can judge the actual difference. It hasn't happened so far, and it won't happen in the future.