- Joined: Jul 15, 2006
- Messages: 1,349 (0.20/day)
- Location: Noir York
| Component | Details |
|---|---|
| Processor | AMD Ryzen 7 5700G |
| Motherboard | Gigabyte B450M S2H |
| Cooling | Scythe Kotetsu Mark II |
| Memory | 2 x 16GB SK Hynix CJR OEM DDR4-3200 @ 4000 20-22-20-48 |
| Video Card(s) | Colorful RTX 2060 SUPER 8GB GDDR6 |
| Storage | 250GB WD Black SN750 M.2 + 4TB WD Red Plus + 4TB WD Purple |
| Display(s) | AOpen 27HC5R 27" 1080p 165Hz curved VA |
| Case | AIGO Darkflash C285 |
| Audio Device(s) | Creative Sound Blaster Z + Kurzweil KS-40A bookshelf / Sennheiser HD555 |
| Power Supply | Great Wall GW-EPS1000DA 1kW |
| Mouse | Razer Deathadder Essential |
| Keyboard | Cougar Attack2 Cherry MX Black |
| Software | Windows 10 Pro x64 22H2 |
That's really funny: an RTX 2080 Ti forced down to 1080p just to enable RTRT, and then integer scaling back up to 4K. I just love how suddenly AMD fans praise this, when just a few days earlier Radeon Image Sharpening was so much better than this Nvidia gimmick. Same story as always.
I can't wait for AMD's RTRT launch.
Because these games simply don't offer 4K resolution, so scaling from 1080p is necessary.
That said, the likely reason Nvidia added integer scaling was not retro games, but RTRT: with ray tracing enabled, even many 2080 Ti owners would decide to fall back to 1080p.
And 1080p with RTX and integer scaling looks really good on 4K monitors (what integer scaling actually does is sketched below).
AMD is going to add RTRT in the next generation, so it makes a lot of sense to bundle integer scaling as well.
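For context on why integer scaling pairs so well with an exact 2x jump like 1080p to 4K: each source pixel is simply duplicated into an N×N block, so no interpolation (and none of the blur of bilinear scaling) is introduced. Here is a minimal sketch in Python/NumPy; the function name and arrays are purely illustrative, not any actual driver code:

```python
import numpy as np

def integer_scale(frame: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbor upscale by an exact integer factor.

    Every source pixel becomes a factor x factor block, so edges
    stay pixel-sharp instead of being smeared by interpolation.
    """
    # Repeat rows, then columns; the channel axis is left untouched.
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

# Example: a 1080p RGB frame scaled exactly 2x to 4K (2160 x 3840).
frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)
frame_4k = integer_scale(frame_1080p, 2)
assert frame_4k.shape == (2160, 3840, 3)
```

This also shows why the factor must be an integer: at a non-integer ratio the pixel blocks can't all be the same size, and the driver would have to interpolate somewhere, which is exactly what integer scaling avoids.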
You know why people praise this? When AMD implements something like this, it's highly likely to be supported across all generations of cards instead of being tied to the latest generation.