Thing is, that's taken the sub-optimal, time-degraded image and restored it.

That's just flat out wrong, though. Have you seen image restoration, for example?
4K at 32" definitely needs AA.
If you took a picture of the same subject today with a quality modern camera at the same detail level (pretend, for argument's sake, the original was 6000x4000 pixels, so a camera with the same resolution), then you couldn't improve upon the picture taken without interpolation, and at that point it's no longer the original image.
Image restoration is recreating the best possible picture, or returning it as close to the original as you can, and that's exactly what frame generation/upscaling does: it tries to get as close to the original as it can.
The difference with image restoration is that the original image, when first created (so at its best), might not be as good as what could be produced today.
Mathematically speaking, though, if it was digital and the file wasn't corrupted, you couldn't restore it to be better without interpolating data.
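To make the point concrete, here's a toy sketch (plain Python, nothing to do with any real upscaler): downsample a signal, then upscale it back with linear interpolation. The samples you kept come back exactly, but the discarded ones are only estimates; the lost detail is guessed, not restored.

```python
# Toy sketch: once samples are discarded, interpolation can only
# estimate them, never recover the true values.

original = [0.1, 0.9, 0.2, 0.8, 0.3, 0.7, 0.4, 0.6]  # "full-res" signal
kept = original[::2]                                  # downsample 2x

# Upscale back by linear interpolation between kept samples.
restored = []
for i, v in enumerate(kept):
    restored.append(v)                          # kept samples survive exactly
    if i + 1 < len(kept):
        restored.append((v + kept[i + 1]) / 2)  # in-between values are guessed
restored.append(kept[-1])                       # pad last (no right neighbour)

errors = [abs(r - o) for r, o in zip(restored, original)]
print(errors)  # even indices 0.0 (kept), odd indices non-zero (guessed)
```

Any "restored" image that looks better than the interpolated result is adding invented detail, which is the whole argument above.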