Edit:
Not to sound rude:
"'Increase in watts': I think you mean increased bitrate and channel count; 'reduced due to signal strength': you mean lossy due to compression." -- Yes, you are right, but I did mention that.
"As a side note, Optical with lossy compression still sounds better than 32 bit analogue.
> Can depend on equipment, and DAC's."
If you and I had a receiver that could do 5.1 LPCM over S/PDIF, then I could already do lossless 5.1 over S/PDIF (via my converter).
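As a rough sanity check (my own back-of-envelope figures, assuming 48 kHz/24-bit for the 5.1 LPCM and a 192 kHz/24-bit stereo S/PDIF link, ignoring line-coding overhead), the raw payload of 5.1 LPCM already fits inside what a high-rate stereo S/PDIF link carries, which is why a converter that repacks the frames can pass it losslessly:

```python
# Back-of-envelope payload comparison (assumed figures, not from any spec):
# 5.1 LPCM at 48 kHz / 24-bit vs a 2-channel 192 kHz / 24-bit S/PDIF link.
# Biphase-mark line coding and subframe overhead are ignored.

def payload_kbps(channels: int, sample_rate_hz: int, bits: int) -> float:
    """Raw audio payload in kbit/s."""
    return channels * sample_rate_hz * bits / 1000

lpcm_5_1 = payload_kbps(6, 48_000, 24)    # 6,912 kbit/s
spdif_2ch = payload_kbps(2, 192_000, 24)  # 9,216 kbit/s

print(f"5.1/48k/24 LPCM payload:    {lpcm_5_1:,.0f} kbit/s")
print(f"2ch/192k/24 S/PDIF payload: {spdif_2ch:,.0f} kbit/s")
print("Fits losslessly:", lpcm_5_1 <= spdif_2ch)
```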
----
The idea is to reduce the use of analogue as much as possible; since it introduces loss, it should be minimised.
Excluding THD and THD+N: would you rather amplify a lossy analogue signal, or a lossless digital one?
In my case the Logitech Z906's DAC is the main point of loss; if I used analogue instead, the loss would start at the computer.
-- Ignoring digital compression
====
Lossless digital (optical) right up to each satellite smart speaker, with a PowerDAC say 2 cm from the driver and effectively zero-ohm speaker wiring, should be an unbeatable signal chain.
How many of you cut your speaker cables as short as possible, and why?
----
For perspective: I can ping Google Australia from the UK over fibre-optic broadband. TOSLINK modules are rated in NRZ (see here).
----
If you want to see lossless digital (and losslessly compressed) audio above 2 channels, you need to badger OEMs into updating their S/PDIF implementations and digital converters.
Decided to post some info I found.
----
S/PDIF (tech-faq.com): "Although the SPDIF protocol doesn't specify a max resolution or data rate, the equipment which uses the SPDIF connectors has to determine the data rate..."
S/PDIF - Wikipedia: "...has no defined data rate. Instead, the data is..."
I can't say I find many that do DD+ and DTS-HD HRA, even though 2x the 192 kHz bitrate supports it.
15 x 192 kHz at 24-bit = 69,120 kbit/s, with a 2,880 kHz aggregate sample rate; too much for HDA.
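To spell that arithmetic out (a sketch under my own assumptions: 24-bit samples, and the commonly quoted ~48 Mbit/s of outbound bandwidth on one HDA SDO line; neither figure comes from the posts above):

```python
# 15 x 192 kHz / 24-bit audio vs one HD Audio serial data out (SDO) line.
# Assumptions (mine): 24-bit samples, ~48 Mbit/s per SDO line
# (24 MHz link clock, data clocked on both edges).

CHANNELS = 15
RATE_HZ = 192_000
BITS = 24
HDA_SDO_KBPS = 48_000  # assumed per-line outbound capacity

aggregate_samples = CHANNELS * RATE_HZ             # 2,880,000 samples/s
aggregate_kbps = aggregate_samples * BITS // 1000  # 69,120 kbit/s

print(f"Aggregate sample rate: {aggregate_samples // 1000:,} ksamples/s")
print(f"Aggregate bitrate:     {aggregate_kbps:,} kbit/s")
print("Exceeds one HDA SDO line:", aggregate_kbps > HDA_SDO_KBPS)
```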