Actually, my question was a bit of both. I never expected the 7600 to use less power than the 6500 XT even in video playback, but I was curious whether it would at least be somewhat comparable, since it's also a monolithic chip, or whether the architectural differences would rule that out.
You can see in the linked chart that even the 4090 uses less power in video playback than the 7600, so it's not only a matter of GPU size and the number of RAM chips. Modern GPUs have dedicated video decoders built in, so I don't see why they should use their shader cores and ramp up clocks and power to play video, but apparently RDNA 3 does just that. It's very strange.
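If anyone with an RDNA 3 card on Linux wants to check this themselves, you can watch the amdgpu sysfs interfaces while a video plays and see whether the shader clock actually leaves its idle DPM state. Here's a minimal sketch; it assumes the GPU is card0 and that the kernel exposes pp_dpm_sclk and an hwmon power1_average file (exact paths and file names vary by card index, driver, and kernel version):

```python
import glob
import time

# Assumed paths for illustration only -- the card index (card0) and hwmon
# number differ per system, and these files are specific to the amdgpu
# driver on Linux.
SCLK_PATH = "/sys/class/drm/card0/device/pp_dpm_sclk"
POWER_GLOB = "/sys/class/drm/card0/device/hwmon/hwmon*/power1_average"

def current_sclk():
    """Return the shader-clock DPM level currently marked active ('*')."""
    with open(SCLK_PATH) as f:
        for line in f:
            if "*" in line:
                return line.strip()
    return "unknown"

def current_power_watts():
    """Read average board power; amdgpu reports it in microwatts."""
    path = glob.glob(POWER_GLOB)[0]  # assumes exactly one matching hwmon file
    with open(path) as f:
        return int(f.read()) / 1_000_000

# Start video playback, then poll: if the shader clock sits at a high DPM
# level instead of idling, the GPU really is ramping up just to play video.
for _ in range(30):
    print(f"sclk: {current_sclk():<20} power: {current_power_watts():.1f} W")
    time.sleep(1)
```

If the decode block were doing all the work, you'd expect the sclk readout to stay at or near the lowest level for the whole run.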
Edit: I was also curious whether newer drivers have improved this at all since the TPU review.