I know this is a Home Cinema forum and my question falls more into the Hi-Fi category, but I'm sure there are a few of you out there who have a decent Hi-Fi system as well and may have given some thought to the following problem, which I have recently spent quite some time thinking about. The topic is relevant for DVD users as well, so I think I can justify posting it here.
Making a digital copy of a CD should in principle create a perfect copy (e.g. in a dual-deck CD recorder, where the signal is never converted to analog). Yet several people/magazines/companies suggest that this is not the case:
- "The XXX CD-Recorder makes near-perfect digital copies". Why only "near-perfect"?
- "XXX recordable audio discs give significantly better quality copies than YYY". Why?
- "A digital copy created at 4X speed is not as good as one created at normal speed". Why?
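To make the "perfect copy" claim concrete: two bit-identical streams of PCM samples are, by definition, the same recording, and that is trivial to check. Here is a minimal sketch in Python (the byte strings are just stand-ins for extracted audio data, not real rips):

```python
import hashlib

def bitwise_identical(data_a: bytes, data_b: bytes) -> bool:
    """Compare two extractions by hashing the raw PCM bytes."""
    return hashlib.sha256(data_a).hexdigest() == hashlib.sha256(data_b).hexdigest()

# Hypothetical example: two "rips" of the same track.
original = b"\x01\x02\x03\x04" * 1000   # stand-in for extracted PCM samples
copy_ok  = b"\x01\x02\x03\x04" * 1000   # a truly bit-perfect copy
copy_bad = copy_ok[:-1] + b"\x05"       # one byte differs on the copy

print(bitwise_identical(original, copy_ok))   # True: perfect copy matches
print(bitwise_identical(original, copy_bad))  # False: any flipped bit shows up
```

So if a copy really were "near-perfect" rather than perfect, a simple checksum comparison of the extracted data would reveal it immediately.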
So how come digital copies are not always perfect? Even if a few bits here and there were toggled on the copy, the error correction should be able to handle the situation (even a small hole in a CD can be corrected by the error correction on most players...), so why doesn't it work like this in practice?
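The error correction on a CD is CIRC (a cross-interleaved Reed-Solomon code), which is too involved for a forum post, but the principle of repairing a toggled bit can be shown with the much simpler Hamming(7,4) code. This is only an illustration of the idea, not the actual CD scheme:

```python
def hamming74_encode(d):
    """Encode 4 data bits into a 7-bit codeword (parity at positions 1, 2, 4)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """Locate a single flipped bit via the parity syndrome, fix it, return the data."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    error_pos = s1 + 2 * s2 + 4 * s3   # 0 means no error detected
    if error_pos:
        c[error_pos - 1] ^= 1          # flip the offending bit back
    return [c[2], c[4], c[5], c[6]]    # recovered data bits

data = [1, 0, 1, 1]
codeword = hamming74_encode(data)
codeword[4] ^= 1                        # toggle one bit "on the copy"
print(hamming74_correct(codeword))      # [1, 0, 1, 1] — the original data
```

A single toggled bit is recovered exactly, which is why, in theory, occasional read errors on the source disc should not survive into the copy at all.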
Any comments are welcome, but I hope I don't start a very long thread like the ones about cables etc. My next topic will be why digital interconnect cables sound different, but I'll wait a bit with that... ;-)