> A deviation in playback speed of +0.6% or -0.6% corresponds to a pitch
> deviation of about 10 cents. That's audible.
If you play it back on the same hardware you used to record then there
shouldn't be any pitch change :-)
But even if you played back something recorded elsewhere, do you really
think a normal person would notice a pitch that's 0.6% higher or lower
(without the correct reference pitch to compare with at the same time)?
When films (shot at 24 fps) are shown on PAL (25 fps), don't the video
and audio come out about 4% faster and higher in pitch (or do they
resample that)?
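The numbers above are easy to check: pitch difference in cents is 1200 times the base-2 log of the frequency ratio. A minimal sketch (the function name is mine):

```python
import math

def cents(ratio):
    """Pitch difference in cents for a frequency ratio (100 cents = 1 semitone)."""
    return 1200 * math.log2(ratio)

# A 0.6% speed error is about 10 cents:
print(round(cents(1.006), 1))    # ~10.4 cents

# PAL speedup: 24 fps film run at 25 fps plays 25/24 (~4.2%) fast:
print(round(cents(25 / 24), 1))  # ~70.7 cents, about two-thirds of a semitone
```

So the PAL speedup is roughly seven times the 0.6% case, which is why some broadcasters pitch-correct it.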
> (I would have expected the DAC to run at whatever speed is convenient,
> and then resample the signal in software... But what do I know?)
If the software knew the real clock rate with 100% accuracy, there
wouldn't be this discussion in the first place! Anyway, if you were to
resample from (say) 43500 Hz to 44100 Hz you'd introduce a lot of
artefacts and burn a lot of CPU (battery) - better to just play your
43500 Hz recording as if it were 44100 Hz.
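For scale, here's what that hypothetical mislabelling would cost (the 43500 Hz figure is the example rate from above, not a real device's clock):

```python
import math

# Hypothetical: samples captured at a true 43500 Hz but played back
# through a DAC clocked as if the stream were 44100 Hz.
true_rate, assumed_rate = 43500, 44100

speed_error = assumed_rate / true_rate       # ~1.0138, i.e. ~1.4% fast
pitch_shift = 1200 * math.log2(speed_error)  # ~23.7 cents sharp

print(f"{(speed_error - 1) * 100:.2f}% fast, {pitch_shift:.1f} cents sharp")
```

Still only about a quarter of a semitone - and, as above, zero if record and playback share the same (wrong) clock.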
You also have conflicting technical requirements in a device like a
tablet. For example, the audio circuit needs its clock to be as steady
and accurate as possible, whereas other parts of the circuit definitely
do not want a steady clock frequency, to avoid EMC/EMI issues. Guess
which requirement gets forgotten when the EMC tests fail or the bean
counters tighten the purse strings?