| Processor | Intel i5-12600k |
|---|---|
| Motherboard | Asus H670 TUF |
| Cooling | Arctic Freezer 34 |
| Memory | 2x16GB DDR4 3600 G.Skill Ripjaws V |
| Video Card(s) | EVGA GTX 1060 SC |
| Storage | 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500 |
| Display(s) | Dell U3219Q + HP ZR24w |
| Case | Raijintek Thetis |
| Audio Device(s) | Audioquest Dragonfly Red :D |
| Power Supply | Seasonic 620W M12 |
| Mouse | Logitech G502 Proteus Core |
| Keyboard | G.Skill KM780R |
| Software | Arch Linux + Win10 |
> Yeah... except one visible artifact in the picture is enough to kill the dream and the ignorance.

True that. Initial attempts to move past supersampling were pretty cringe: the first LOD implementations resulted in those demarcation lines that moved as you moved... I have faith current attempts will be able to overcome their teething problems, too.
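For anyone who never saw that artifact in person, here's a minimal, purely illustrative sketch (not from any particular engine; names and thresholds are made up) of why a single hard distance cutoff gives you a demarcation line that travels with the camera, and how a cross-fade band over the transition hides it:

```cpp
#include <cstdio>

struct Lod { int level; float blend; }; // blend: how far into the cross-fade we are (0 = none)

// Naive selection: one hard cutoff. Every object nearer than `cutoff` uses the
// detailed mesh, everything farther uses the coarse one, so the switch happens
// along a sharp circle around the viewer that moves as the viewer moves.
Lod selectLodHard(float distance, float cutoff) {
    return { distance < cutoff ? 0 : 1, 0.0f };
}

// Common mitigation: instead of switching instantly, fade (alpha/dither) between
// the two meshes across a transition band, so no single visible line exists.
Lod selectLodFaded(float distance, float cutoff, float band) {
    if (distance < cutoff)        return { 0, 0.0f };
    if (distance > cutoff + band) return { 1, 0.0f };
    float t = (distance - cutoff) / band;   // 0..1 across the transition band
    return { 0, t };                        // LOD0 fading out while LOD1 fades in
}

int main() {
    for (float d = 45.0f; d <= 60.0f; d += 5.0f) {
        Lod hard = selectLodHard(d, 50.0f);
        Lod soft = selectLodFaded(d, 50.0f, 10.0f);
        std::printf("d=%.0f  hard: LOD%d  faded: LOD%d (blend %.2f)\n",
                    d, hard.level, soft.level, soft.blend);
    }
}
```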
> To me it all really comes down to very simple principles. It's either convincing, or it's not. There's no in-between; those are just failed attempts. The moment RT and AI-generated frames are convincing to me, I'll be an adopter. So far, it's just a live alpha that you keep paying for. F*ck that.

I feel it's far better than an alpha, but I agree that everybody will adopt (or not) at their own pace. Continuing the example above: I, myself, was all over the control panel trying various AA and trilinear optimization settings for quite a while. These days I can't even tell where those settings are off the top of my head.
> Another aspect that's often forgotten in computer graphics is that the chase to approach 'realism' doesn't always make for a better picture. Cinematography is more than just 'showing things as they are'; it's often the very opposite.

Again, agreed. Realism is just a baseline: do things the default way and you'll get something as close as possible to "real life". From there on, you're free to add any artistic flavor you want.