ExtAudioFileWrite to m4a/AAC failing on dual-core devices (iPad 2, iPhone 4S)

I wrote a loop to encode the PCM audio data created by my application to AAC using Extended Audio File Services. The encoding runs synchronously on a background thread, not in real time.

Encoding works flawlessly on the iPad 1 and on the iPhone 3GS/4, on both iOS 4 and 5. On dual-core devices (iPhone 4S, iPad 2), however, the third call to ExtAudioFileWrite kills the encoding thread with no stack trace and no error code.

Here is the code in question:

Data formats

    AudioStreamBasicDescription AUCanonicalASBD(Float64 sampleRate, UInt32 channel){
        AudioStreamBasicDescription audioFormat;
        audioFormat.mSampleRate = sampleRate;
        audioFormat.mFormatID = kAudioFormatLinearPCM;
        audioFormat.mFormatFlags = kAudioFormatFlagsAudioUnitCanonical;
        audioFormat.mChannelsPerFrame = channel;
        audioFormat.mBytesPerPacket = sizeof(AudioUnitSampleType);
        audioFormat.mBytesPerFrame = sizeof(AudioUnitSampleType);
        audioFormat.mFramesPerPacket = 1;
        audioFormat.mBitsPerChannel = 8 * sizeof(AudioUnitSampleType);
        audioFormat.mReserved = 0;
        return audioFormat;
    }

    AudioStreamBasicDescription MixdownAAC(void){
        AudioStreamBasicDescription audioFormat;
        audioFormat.mSampleRate = 44100.0;
        audioFormat.mFormatID = kAudioFormatMPEG4AAC;
        audioFormat.mFormatFlags = kMPEG4Object_AAC_Main;
        audioFormat.mChannelsPerFrame = 2;
        audioFormat.mBytesPerPacket = 0;
        audioFormat.mBytesPerFrame = 0;
        audioFormat.mFramesPerPacket = 1024;
        audioFormat.mBitsPerChannel = 0;
        audioFormat.mReserved = 0;
        return audioFormat;
    }

Render loop

    OSStatus err;
    ExtAudioFileRef outFile;

    NSURL *mixdownURL = [NSURL fileURLWithPath:filePath isDirectory:NO];

    // internal data format
    AudioStreamBasicDescription localFormat = AUCanonicalASBD(44100.0, 2);

    // output file format
    AudioStreamBasicDescription mixdownFormat = MixdownAAC();

    err = ExtAudioFileCreateWithURL((CFURLRef)mixdownURL,
                                    kAudioFileM4AType,
                                    &mixdownFormat,
                                    NULL,
                                    kAudioFileFlags_EraseFile,
                                    &outFile);

    err = ExtAudioFileSetProperty(outFile,
                                  kExtAudioFileProperty_ClientDataFormat,
                                  sizeof(AudioStreamBasicDescription),
                                  &localFormat);

    // prep
    AllRenderData *allData = &allRenderData;

    writeBuffer = malloc(sizeof(AudioBufferList) + (2 * sizeof(AudioBuffer)));
    writeBuffer->mNumberBuffers = 2;
    writeBuffer->mBuffers[0].mNumberChannels = 1;
    writeBuffer->mBuffers[0].mDataByteSize = bufferBytes;
    writeBuffer->mBuffers[0].mData = malloc(bufferBytes);
    writeBuffer->mBuffers[1].mNumberChannels = 1;
    writeBuffer->mBuffers[1].mDataByteSize = bufferBytes;
    writeBuffer->mBuffers[1].mData = malloc(bufferBytes);

    memset(writeBuffer->mBuffers[0].mData, 0, bufferBytes);
    memset(writeBuffer->mBuffers[1].mData, 0, bufferBytes);

    UInt32 framesToGet;
    UInt32 frameCount = allData->gLoopStartFrame;
    UInt32 startFrame = allData->gLoopStartFrame;
    UInt32 lastFrame = allData->gLoopEndFrame;

    // write one silent buffer
    ExtAudioFileWrite(outFile, bufferFrames, writeBuffer);

    while (frameCount < lastFrame){

        // how many frames do we need to get
        if (lastFrame - frameCount > bufferFrames)
            framesToGet = bufferFrames;
        else
            framesToGet = lastFrame - frameCount;

        // get dem frames
        err = theBigOlCallback((void*)&allRenderData,
                               NULL, NULL, 1,
                               framesToGet, writeBuffer);

        // write to output file
        ExtAudioFileWrite(outFile, framesToGet, writeBuffer);

        frameCount += framesToGet;
    }

    // write one trailing silent buffer
    memset(writeBuffer->mBuffers[0].mData, 0, bufferBytes);
    memset(writeBuffer->mBuffers[1].mData, 0, bufferBytes);
    processLimiterInPlace8p24(limiter,
                              writeBuffer->mBuffers[0].mData,
                              writeBuffer->mBuffers[1].mData,
                              bufferFrames);
    ExtAudioFileWrite(outFile, bufferFrames, writeBuffer);

    err = ExtAudioFileDispose(outFile);

The PCM frames are created correctly, but ExtAudioFileWrite fails on its second or third call.
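One detail worth noting: the ExtAudioFileWrite calls in the loop above discard their OSStatus. A minimal sketch of keeping it, using a hypothetical CheckStatus helper that is not part of the original project, would at least surface which call fails and with what code:

    // Hypothetical helper: log any failing Core Audio call with its numeric OSStatus.
    static void CheckStatus(OSStatus status, const char *operation) {
        if (status != noErr) {
            NSLog(@"%s failed with OSStatus %d", operation, (int)status);
        }
    }

    // Inside the render loop, keep the status of each write:
    err = ExtAudioFileWrite(outFile, framesToGet, writeBuffer);
    CheckStatus(err, "ExtAudioFileWrite");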

Any ideas? Thanks!

1 answer

I had a very similar problem when I tried to use Extended Audio File Services to stream PCM audio into an m4a file on an iPad 2. Everything worked except that every call to ExtAudioFileWrite returned the error code -66567 (kExtAudioFileError_MaxPacketSizeUnknown). The fix I eventually found was to set the codec manufacturer to the software codec instead of the hardware one. So place

    UInt32 codecManf = kAppleSoftwareAudioCodecManufacturer;
    ExtAudioFileSetProperty(FileToWrite, kExtAudioFileProperty_CodecManufacturer, sizeof(UInt32), &codecManf);

before setting the client data format.
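For context, here is a minimal sketch of the resulting call order, reusing the asker's MixdownAAC and AUCanonicalASBD helpers (variable names are only illustrative):

    ExtAudioFileRef outFile;
    AudioStreamBasicDescription mixdownFormat = MixdownAAC();               // AAC file format
    AudioStreamBasicDescription localFormat = AUCanonicalASBD(44100.0, 2);  // PCM client format

    ExtAudioFileCreateWithURL((CFURLRef)mixdownURL, kAudioFileM4AType,
                              &mixdownFormat, NULL,
                              kAudioFileFlags_EraseFile, &outFile);

    // 1. Force the software AAC encoder first...
    UInt32 codecManf = kAppleSoftwareAudioCodecManufacturer;
    ExtAudioFileSetProperty(outFile, kExtAudioFileProperty_CodecManufacturer,
                            sizeof(UInt32), &codecManf);

    // 2. ...and only then describe the PCM data the app will hand in.
    ExtAudioFileSetProperty(outFile, kExtAudioFileProperty_ClientDataFormat,
                            sizeof(AudioStreamBasicDescription), &localFormat);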

This leads me to think that Apple's hardware codecs support only very specific encodings, while the software codecs will more dependably do what you want. In my case, the software codec's write to m4a takes about 50% longer than writing the same file in LPCM format.

Does anyone know whether Apple documents anywhere what their audio codec hardware is capable of? It seems that software developers are stuck playing an hours-long guessing game, setting the ~20 parameters in the AudioStreamBasicDescription and AudioChannelLayout for the client and for the file to every possible permutation until something works...
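One way to at least probe a given device at runtime is to ask AudioFormat which AAC encoders are registered and who makes them. This is only a sketch of that idea, not an Apple-documented statement of hardware capability:

    #include <AudioToolbox/AudioToolbox.h>

    // List the AAC encoders registered on this device and report whether
    // each one is the hardware or the software implementation.
    UInt32 encoderSpec = kAudioFormatMPEG4AAC;
    UInt32 size = 0;
    AudioFormatGetPropertyInfo(kAudioFormatProperty_Encoders,
                               sizeof(encoderSpec), &encoderSpec, &size);

    UInt32 count = size / sizeof(AudioClassDescription);
    AudioClassDescription *descs = malloc(size);
    AudioFormatGetProperty(kAudioFormatProperty_Encoders,
                           sizeof(encoderSpec), &encoderSpec, &size, descs);

    for (UInt32 i = 0; i < count; i++) {
        Boolean hw = (descs[i].mManufacturer == kAppleHardwareAudioCodecManufacturer);
        NSLog(@"AAC encoder %u: %s codec", (unsigned)i, hw ? "hardware" : "software");
    }
    free(descs);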
