AnalogKid Wrote:
The medium is digital, the laser is analog. 20% of the disk is reserved for error correction precisely for this reason. The readback of ALL digital optical media is error strewn.
The laser might be thought of as analog in a sense, I suppose, but the logic it feeds is strictly digital. Either the laser reads a bit or it doesn't. It might read "half a bit," but the logic makes a determination via a threshold comparison: bit or no bit. I understand the point you're making, that the condition of the disc, the quality of the laser and electronics, and so on all affect whether a bit is read correctly. But you're grossly exaggerating the net effect. 20% of the disc might be reserved for error correction, and for precisely that reason Red Book audio is very error resistant at the final output, given a clean disc and reasonable electronics. Readback of optical media at the laser level might be somewhat error strewn, but at the output level it certainly is not. The layers of error correction, taken together, are very robust and easily tolerate single-bit errors. In all but the worst players, only large physical media defects routinely make it through the logic chain.
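To make that concrete, here's a minimal Python sketch of the two steps: slicing a marginal analog read into hard bits, then correcting a single flipped bit downstream. The real scheme on a CD is CIRC (Cross-Interleaved Reed-Solomon Coding), which is far more powerful; I'm substituting a toy Hamming(7,4) code purely to show the principle that one bad bit never reaches the output.

```python
# Toy illustration: analog-level read -> threshold slicer -> single-bit
# error correction. Real CD players use CIRC (Reed-Solomon), not Hamming;
# this just demonstrates the principle.

def slice_bits(levels, threshold=0.5):
    """Turn noisy analog read levels into hard bits via a threshold."""
    return [1 if v >= threshold else 0 for v in levels]

def hamming74_encode(d):
    """Encode 4 data bits into a 7-bit Hamming codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct up to one flipped bit, then return the 4 data bits."""
    p1, p2, d1, p3, d2, d3, d4 = c
    s1 = p1 ^ d1 ^ d2 ^ d4
    s2 = p2 ^ d1 ^ d3 ^ d4
    s3 = p3 ^ d2 ^ d3 ^ d4
    syndrome = s1 + 2 * s2 + 4 * s3  # position of the bad bit, 0 = clean
    if syndrome:
        c = list(c)
        c[syndrome - 1] ^= 1  # flip the corrupted bit back
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
codeword = hamming74_encode(data)

# Simulate a marginal read: one pit comes back at "half a bit" (0.4),
# which the threshold slicer wrongly calls a 0.
levels = [0.9 if b else 0.1 for b in codeword]
levels[2] = 0.4  # weak read on a bit that should be a 1
received = slice_bits(levels)

assert received != codeword                # raw readback really is wrong
assert hamming74_decode(received) == data  # but the output is bit-perfect
```

And CIRC goes well beyond this: interleaving spreads a physical defect across many codewords, so even a burst of bad bits gets broken up into correctable single errors.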
I've captured digital streams from a few different players and compared large segments of the resulting raw data. Even allowing for possible transmission or capture errors (unlikely as they are), the discrepancies I found were almost nonexistent.
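For anyone who wants to repeat that comparison, a sketch like this is all it takes. The filenames are just placeholders, and it assumes the two captures are already aligned to the same starting sample, which real captures usually need a coarse alignment pass to achieve.

```python
# Compare two raw capture files byte-for-byte and report the mismatch rate.
# "player_a.raw" / "player_b.raw" are hypothetical capture files.

def mismatch_rate(path_a, path_b):
    with open(path_a, "rb") as fa, open(path_b, "rb") as fb:
        a, b = fa.read(), fb.read()
    n = min(len(a), len(b))  # compare only the overlapping region
    diffs = sum(1 for i in range(n) if a[i] != b[i])
    return diffs, n

diffs, total = mismatch_rate("player_a.raw", "player_b.raw")
print(f"{diffs} differing bytes out of {total} ({diffs / total:.6%})")
```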
That's the whole point of digital audio and the Red Book standard. If playback had anything even remotely approaching a 1% output error rate, the entire technology and its error correction scheme would be an absolute and utter failure.
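Just to put a number on it, here's the back-of-the-envelope arithmetic for what a 1% output error rate would actually mean at Red Book's 44.1 kHz stereo:

```python
# What a 1% output error rate would mean for Red Book audio.
sample_rate = 44_100   # samples per second, per channel
channels = 2
error_rate = 0.01

bad_samples_per_second = sample_rate * channels * error_rate
print(bad_samples_per_second)  # 882.0 corrupted samples every second
```

Nearly 900 corrupted samples every single second would be plainly audible as constant crackle, which nobody's CD player produces.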
Audiophiles claim to hear differences between various uber-expensive cables, where the actual transmission differences have to be a fraction of a fraction of a fraction of a percent. Put on a blindfold, though, and they can't even tell the difference between a WAV played from an iPod and the same WAV played from an ultra-expensive CD transport.