Recording to AAC from RemoteIO: data is being recorded, but files cannot be played

I am trying to record directly to AAC from a RemoteIO unit's renderCallback on iOS 5 (iPad 2). I have seen conflicting information about whether this is possible (in the comments here). My reason for doing this is that recording to PCM takes up so much disk space for any length of recording, even if the file is converted to AAC afterwards.

I'm about ready to give up, though. I have trawled Google, SO, the Core Audio book, and the Apple Core Audio mailing list and forums, and have reached the point where I get no errors and something is written to disk, but the resulting file does not play. This is the case on both the simulator and the device.

So... if anyone has experience with this, I would really appreciate a push in the right direction. The setup is that RemoteIO is playing the output from AUSamplers, and that works fine.

Here is what I do in the code below:

  • Set the AudioStreamBasicDescription format for the RemoteIO unit to kAudioFormatLinearPCM

  • Create and specify a destination format for the ExtAudioFileRef

  • Specify the client format by retrieving it from the RemoteIO unit

  • Specify a renderCallback for the RemoteIO unit

  • In the renderCallback, write data during the kAudioUnitRenderAction_PostRender phase

As I said, I do not get any errors, and the file sizes show that something is being written, but the resulting file does not play. Perhaps I have messed up my formats?

In any case, consider this my message in a bottle and/or a "Here Be Dragons" flag for anyone who dares venture into these dark waters of Core Audio.


// Unhappy msg that I get when I try to play the file:

[screenshot of the playback error dialog]

// RemoteIO configuration

    // Enable IO for recording
    UInt32 flag = 1;
    result = AudioUnitSetProperty(ioUnit,
                                  kAudioOutputUnitProperty_EnableIO,
                                  kAudioUnitScope_Input,
                                  kInputBus,  // == 1
                                  &flag,
                                  sizeof(flag));
    if (noErr != result) {
        [self printErrorMessage:@"Enable IO for recording" withStatus:result];
        return;
    }

    // Describe format
    size_t bytesPerSample = sizeof(AudioUnitSampleType);
    AudioStreamBasicDescription audioFormat;
    memset(&audioFormat, 0, sizeof(audioFormat));
    audioFormat.mSampleRate       = 44100.00;
    audioFormat.mFormatID         = kAudioFormatLinearPCM;
    audioFormat.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    audioFormat.mFramesPerPacket  = 1;
    audioFormat.mChannelsPerFrame = 1;
    audioFormat.mBitsPerChannel   = 16;
    audioFormat.mBytesPerPacket   = 2;
    audioFormat.mBytesPerFrame    = 2;

    result = AudioUnitSetProperty(ioUnit,
                                  kAudioUnitProperty_StreamFormat,
                                  kAudioUnitScope_Output,
                                  kInputBus,  // == 1
                                  &audioFormat,
                                  sizeof(audioFormat));

    result = AudioUnitSetProperty(ioUnit,
                                  kAudioUnitProperty_StreamFormat,
                                  kAudioUnitScope_Input,
                                  kOutputBus,  // == 0
                                  &audioFormat,
                                  sizeof(audioFormat));

// Method that sets up the file and the render callback

    - (void)startRecordingAAC {
        OSStatus result;

        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        NSString *recordFile = [documentsDirectory stringByAppendingPathComponent:@"audio.m4a"];
        CFURLRef destinationURL = CFURLCreateWithFileSystemPath(kCFAllocatorDefault,
                                                                (__bridge CFStringRef)recordFile,
                                                                kCFURLPOSIXPathStyle,
                                                                false);

        AudioStreamBasicDescription destinationFormat;
        memset(&destinationFormat, 0, sizeof(destinationFormat));
        destinationFormat.mChannelsPerFrame = 2;
        destinationFormat.mFormatID = kAudioFormatMPEG4AAC;

        UInt32 size = sizeof(destinationFormat);
        result = AudioFormatGetProperty(kAudioFormatProperty_FormatInfo, 0, NULL, &size, &destinationFormat);
        if (result) printf("AudioFormatGetProperty %ld \n", result);

        result = ExtAudioFileCreateWithURL(destinationURL,
                                           kAudioFileM4AType,
                                           &destinationFormat,
                                           NULL,
                                           kAudioFileFlags_EraseFile,
                                           &extAudioFileRef);
        if (result) printf("ExtAudioFileCreateWithURL %ld \n", result);

        AudioStreamBasicDescription clientFormat;
        memset(&clientFormat, 0, sizeof(clientFormat));
        result = AudioUnitGetProperty(ioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output, 0, &clientFormat, &size);
        if (result) printf("AudioUnitGetProperty %ld \n", result);

        result = ExtAudioFileSetProperty(extAudioFileRef,
                                         kExtAudioFileProperty_ClientDataFormat,
                                         sizeof(clientFormat),
                                         &clientFormat);
        if (result) printf("ExtAudioFileSetProperty %ld \n", result);

        // Prime the async writer with a NULL buffer before the first real write
        result = ExtAudioFileWriteAsync(extAudioFileRef, 0, NULL);
        if (result) { [self printErrorMessage:@"ExtAudioFileWriteAsync error" withStatus:result]; }

        result = AudioUnitAddRenderNotify(ioUnit, renderCallback, (__bridge void *)self);
        if (result) { [self printErrorMessage:@"AudioUnitAddRenderNotify" withStatus:result]; }
    }

// And finally, the render callback

    static OSStatus renderCallback(void *inRefCon,
                                   AudioUnitRenderActionFlags *ioActionFlags,
                                   const AudioTimeStamp *inTimeStamp,
                                   UInt32 inBusNumber,
                                   UInt32 inNumberFrames,
                                   AudioBufferList *ioData)
    {
        OSStatus result;
        // Test the phase bit rather than comparing for equality, since other
        // flags (e.g. kAudioUnitRenderAction_OutputIsSilence) may be set too.
        if (*ioActionFlags & kAudioUnitRenderAction_PostRender) {
            MusicPlayerController *THIS = (__bridge MusicPlayerController *)inRefCon;
            result = ExtAudioFileWriteAsync(THIS->extAudioFileRef, inNumberFrames, ioData);
            if (result) printf("ExtAudioFileWriteAsync %ld \n", result);
        }
        return noErr;
    }
3 answers

So, I finally figured it out! Phew, what a hunt for information.

Anyway, here is the bit in the ExtAudioFile docs that I had missed (quoted below). I had not set this property. The data was being written to my .m4a file, but it was unreadable at playback. So, to summarize: I have a bunch of AUSamplers -> AUMixer -> RemoteIO. The render callback on the RemoteIO instance writes the data to disk in compressed m4a format. So it is possible to create compressed audio "on the fly" (iOS 5 / iPad 2).

Seems pretty solid - I had some printf statements in the render callback, and the recording still worked fine.

Yay

kExtAudioFileProperty_CodecManufacturer
The manufacturer of the codec to be used by the extended audio file object. A read/write UInt32 value. You must specify this property before setting the kExtAudioFileProperty_ClientDataFormat property, which in turn triggers the creation of the codec. Use this property in iOS to choose between a hardware and a software encoder, by specifying kAppleHardwareAudioCodecManufacturer or kAppleSoftwareAudioCodecManufacturer. Available in Mac OS X v10.7 and later. Declared in ExtendedAudioFile.h.

    // specify codec
    UInt32 codec = kAppleHardwareAudioCodecManufacturer;
    size = sizeof(codec);
    result = ExtAudioFileSetProperty(extAudioFileRef,
                                     kExtAudioFileProperty_CodecManufacturer,
                                     size,
                                     &codec);
    if (result) printf("ExtAudioFileSetProperty %ld \n", result);

Are you writing the magic cookie that is needed at the beginning of an MPEG-4 audio file?

You also need to perform at least the first file write outside of the audio unit render callback.

Added:

Did you close the audio file properly at the end (outside the AU callback)?


It seems there is a little more to it than just specifying a codec for the output audio file.

According to this thread: ExtAudioFileWrite for m4a / aac, not working on dual-core devices (ipad 2, iphone 4s) , some device models, such as the iPhone 4S, do not play well with the hardware codec.

From my experience with the 4S, trying to encode an AAC file with the hardware codec either a) sometimes works, b) fails with error -66567 (kExtAudioFileError_MaxPacketSizeUnknown), or c) writes a few samples and then just hangs with no traceable error.

The software codec works fine on the iPhone 4S, albeit at the cost of some performance.

Edit: some people report that the hardware codec does not like a 44.1 kHz sample rate. I have yet to confirm this.

