My application uses an external USB microphone with a very accurate temperature-compensated crystal oscillator (TCXO). The sampling frequency is 48 kHz. I connect it to the iOS device through the Lightning camera adapter. I use the EZAudio library and everything works fine, except that iOS seems to keep using its own internal clock source for capturing audio instead of my exact 48 kHz.

I have read the CoreAudio documentation, but I could not find anything about the synchronization source used when the audio comes in over Lightning.

Is there any way to choose between an internal and an external synchronization source?

Thanks!
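As far as I can tell, AVAudioSession does not expose any clock-source property, but it does report the active route, so it is at least possible to confirm that the USB microphone is the selected input. A minimal diagnostic sketch (the printActiveInputs name is just illustrative; it assumes the session has already been activated):

import AVFoundation

// Diagnostic only: list the input ports of the current audio route.
// This shows which input iOS has chosen, but does not select a clock source.
func printActiveInputs() {
    let session = AVAudioSession.sharedInstance()
    for input in session.currentRoute.inputs {
        print("Active input: \(input.portName) [\(input.portType)]")
    }
}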
This is my current setup:

var audioFormatIn = AudioStreamBasicDescription(
    mSampleRate:       Float64(48000),                   // nominal rate of the TCXO-clocked microphone
    mFormatID:         AudioFormatID(kAudioFormatLinearPCM),
    mFormatFlags:      AudioFormatFlags(kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked),
    mBytesPerPacket:   2,                                 // 1 frame per packet, 2 bytes per frame
    mFramesPerPacket:  1,
    mBytesPerFrame:    2,
    mChannelsPerFrame: 1,                                 // mono
    mBitsPerChannel:   16,                                // 16-bit signed integer samples
    mReserved:         0)
func initAudio()
{
    let session: AVAudioSession = AVAudioSession.sharedInstance()
    do {
        // PlayAndRecord + Measurement mode: keep iOS signal processing on the input to a minimum
        try session.setCategory(AVAudioSessionCategoryPlayAndRecord)
        try session.setMode(AVAudioSessionModeMeasurement)
        try session.setActive(true)
    }
    catch {
        print("Error configuring AVAudioSession: \(error)")
    }

    self.microphone = EZMicrophone(microphoneDelegate: self, withAudioStreamBasicDescription: audioFormatIn)
}
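The closest related knob I can find in AVAudioSession is the preferred sample rate. A small sketch (the function name is mine) that requests 48 kHz and then logs what the hardware session actually reports; as I understand it, setPreferredSampleRate is only a request, and session.sampleRate is what iOS really runs at:

// Ask the session for 48 kHz and check what it actually delivers.
func requestAndCheckSampleRate() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setPreferredSampleRate(48000)
    } catch {
        print("setPreferredSampleRate failed: \(error)")
    }
    print("Preferred sample rate: \(session.preferredSampleRate) Hz")
    print("Actual hardware rate:  \(session.sampleRate) Hz")
    print("IO buffer duration:    \(session.ioBufferDuration) s")
}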
EDIT:

@Rhythmic Fistman, I compared the audio captured through iOS against the TCXO reference; the results after 7 and 15 minutes are below. Thanks!

After 7 minutes:

[screenshot]

After 15 minutes:

[screenshot]
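For anyone who wants to reproduce the measurement, here is a rough sketch of how the drift can be estimated in code. It uses AVAudioEngine rather than EZAudio (the DriftMonitor class, the 4096-frame buffer size and the assumption that the session is already configured for recording are all mine): it counts the frames delivered by an input tap and compares the implied duration at the nominal 48 kHz against the host clock.

import AVFoundation

// Rough drift estimate: accumulate captured frames and compare the implied
// audio time (frames / nominal 48 kHz) with the elapsed host-clock time.
final class DriftMonitor {
    private let engine = AVAudioEngine()
    private var totalFrames = 0.0
    private var startHostTime: UInt64 = 0

    func start() throws {
        let input = engine.inputNode
        let format = input.inputFormat(forBus: 0)   // whatever rate iOS is actually running

        input.installTap(onBus: 0, bufferSize: 4096, format: format) { [weak self] buffer, time in
            guard let self = self else { return }
            if self.startHostTime == 0 {
                self.startHostTime = time.hostTime   // reference point: first buffer
                return
            }
            self.totalFrames += Double(buffer.frameLength)

            let wallSeconds  = AVAudioTime.seconds(forHostTime: time.hostTime - self.startHostTime)
            let audioSeconds = self.totalFrames / 48000.0   // nominal TCXO rate
            print(String(format: "wall %.2f s, audio %.2f s, drift %.1f ms",
                         wallSeconds, audioSeconds, (audioSeconds - wallSeconds) * 1000.0))
        }
        try engine.start()
    }
}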