completionHandler of AVAudioPlayerNode.scheduleFile() is called too early

I am trying to use the new AVAudioEngine in iOS 8.

It looks like the completionHandler of player.scheduleFile() is called before the sound file has finished playing.

I am using a 5-second sound file, and the println() message appears about 1 second before the end of the sound.

Am I doing something wrong, or am I misunderstanding the idea of a completionHandler?

Thanks!


Here is the code:

 class SoundHandler {
     let engine: AVAudioEngine
     let player: AVAudioPlayerNode
     let mainMixer: AVAudioMixerNode

     init() {
         engine = AVAudioEngine()
         player = AVAudioPlayerNode()
         engine.attachNode(player)
         mainMixer = engine.mainMixerNode

         var error: NSError?
         if !engine.startAndReturnError(&error) {
             if let e = error {
                 println("error \(e.localizedDescription)")
             }
         }

         engine.connect(player, to: mainMixer, format: mainMixer.outputFormatForBus(0))
     }

     func playSound() {
         var soundUrl = NSBundle.mainBundle().URLForResource("Test", withExtension: "m4a")
         var soundFile = AVAudioFile(forReading: soundUrl, error: nil)

         player.scheduleFile(soundFile, atTime: nil, completionHandler: {
             println("Finished!")
         })

         player.play()
     }
 }
+5
6 answers

This seems like a bug; we should file a radar! http://bugreport.apple.com

In the meantime, as a workaround, I noticed that if you use scheduleBuffer:atTime:options:completionHandler: instead, the callback fires as expected (after playback finishes).

Code example:

 AVAudioFile *file = [[AVAudioFile alloc] initForReading:_fileURL
                                            commonFormat:AVAudioPCMFormatFloat32
                                             interleaved:NO
                                                   error:nil];
 AVAudioPCMBuffer *buffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:file.processingFormat
                                                          frameCapacity:(AVAudioFrameCount)file.length];
 [file readIntoBuffer:buffer error:&error];

 [_player scheduleBuffer:buffer
                  atTime:nil
                 options:AVAudioPlayerNodeBufferInterrupts
       completionHandler:^{
     // reminder: we're not on the main thread in here
     dispatch_async(dispatch_get_main_queue(), ^{
         NSLog(@"done playing, as expected!");
     });
 }];
+6

I see the same behavior.

From my experimentation, I believe the callback is called once the buffer/segment/file has been "scheduled", not when it has finished playing.

Although the docs explicitly state: "Called after the buffer has completely played or the player is stopped. May be nil."

So I think this is either a bug or incorrect documentation. I don't know which.

+6

You can always compute the future time at which audio playback will complete, using AVAudioTime. The current behavior is useful because it lets you schedule additional buffers/segments/files to play from the callback before the end of the current buffer/segment/file, avoiding a gap in audio playback. This lets you create a simple looping player without much work. Here's an example:

 class Latch {
     var value: Bool = true
 }

 func loopWholeFile(file: AVAudioFile, player: AVAudioPlayerNode) -> Latch {
     let looping = Latch()
     let frames = file.length
     let sampleRate = file.processingFormat.sampleRate

     var segmentTime: AVAudioFramePosition = 0
     var segmentCompletion: AVAudioNodeCompletionHandler!
     segmentCompletion = {
         if looping.value {
             segmentTime += frames
             player.scheduleFile(file,
                 atTime: AVAudioTime(sampleTime: segmentTime, atRate: sampleRate),
                 completionHandler: segmentCompletion)
         }
     }

     player.scheduleFile(file,
         atTime: AVAudioTime(sampleTime: segmentTime, atRate: sampleRate),
         completionHandler: segmentCompletion)
     segmentCompletion()
     player.play()

     return looping
 }

In the code above, the entire file is scheduled twice before player.play() is called. As each segment gets close to finishing, it schedules another whole file in the future, to avoid gaps in playback. To stop the loop, you use the return value, a Latch, like this:

 let looping = loopWholeFile(file, player)
 sleep(1000)
 looping.value = false
 player.stop()
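The gapless trick in this answer is just sample-frame arithmetic: each pass is scheduled to start exactly `frames` samples after the previous one. A minimal sketch of that bookkeeping in plain Swift (no AVFoundation; the frame count is a hypothetical 5-second file at 44.1 kHz):

```swift
// Each pass starts exactly `frames` samples after the previous one, so the
// scheduled segments butt up against each other with no gap in playback.
// Hypothetical numbers: a 5-second file at 44.1 kHz.
let frames: Int64 = 220_500
var segmentTime: Int64 = 0
var startTimes: [Int64] = []

for _ in 0..<3 {              // three back-to-back passes through the file
    startTimes.append(segmentTime)
    segmentTime += frames
}

print(startTimes)             // [0, 220500, 441000]
```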
+4

My bug report for this was closed as "works as intended," but Apple pointed me to new variants of the scheduleFile, scheduleSegment, and scheduleBuffer methods in iOS 11. These add a completionCallbackType argument that you can use to specify that you want the completion callback when playback has actually completed:

 [self.audioUnitPlayer scheduleSegment:self.audioUnitFile
                         startingFrame:sampleTime
                            frameCount:(int)sampleLength
                                atTime:0
                completionCallbackType:AVAudioPlayerNodeCompletionDataPlayedBack
                     completionHandler:^(AVAudioPlayerNodeCompletionCallbackType callbackType) {
     // do something here
 }];

The documentation doesn't say anything about how this works, but I tested it and it works for me.
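For reference, a Swift sketch of the same iOS 11+ API using the scheduleFile variant, assuming a player node and audio file set up elsewhere (the `player` and `file` names are hypothetical):

```swift
import AVFoundation

// Requires iOS 11+. Passing .dataPlayedBack asks for the callback when the
// audio has actually been played back, not merely consumed by the engine.
player.scheduleFile(file,
                    at: nil,
                    completionCallbackType: .dataPlayedBack) { callbackType in
    // Reminder: this is not called on the main thread.
    DispatchQueue.main.async {
        print("done playing")
    }
}
```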

I used this workaround for iOS 8-10:

 - (void)playRecording {
     [self.audioUnitPlayer scheduleSegment:self.audioUnitFile
                             startingFrame:sampleTime
                                frameCount:(int)sampleLength
                                    atTime:0
                         completionHandler:^() {
         float totalTime = [self recordingDuration];
         float elapsedTime = [self recordingCurrentTime];
         float remainingTime = totalTime - elapsedTime;
         [self performSelector:@selector(doSomethingHere) withObject:nil afterDelay:remainingTime];
     }];
 }

 - (float)recordingDuration {
     float duration = self.audioUnitFile.length / self.audioUnitFile.processingFormat.sampleRate;
     if (isnan(duration)) {
         duration = 0;
     }
     return duration;
 }

 - (float)recordingCurrentTime {
     AVAudioTime *nodeTime = self.audioUnitPlayer.lastRenderTime;
     AVAudioTime *playerTime = [self.audioUnitPlayer playerTimeForNodeTime:nodeTime];
     AVAudioFramePosition sampleTime = playerTime.sampleTime;

     // this happens when the player isn't playing
     if (sampleTime == 0) { return self.audioUnitLastKnownTime; }

     // if we trimmed from the start, or changed the location with the location
     // slider, the time before that point won't be included in the player time,
     // so we have to track it ourselves and add it here
     sampleTime += self.audioUnitStartingFrame;

     float time = sampleTime / self.audioUnitFile.processingFormat.sampleRate;
     self.audioUnitLastKnownTime = time;
     return time;
 }
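The remaining-time calculation in this workaround reduces to counting sample frames. A stripped-down Swift version of the same arithmetic (no AVFoundation; the numbers are hypothetical):

```swift
// Remaining playback time = (total frames - frames already played) / sample rate.
// Hypothetical values: a 5-second file at 44.1 kHz, 3 seconds already played.
func remainingSeconds(fileLength: Int64, playedFrames: Int64, sampleRate: Double) -> Double {
    return Double(fileLength - playedFrames) / sampleRate
}

let remaining = remainingSeconds(fileLength: 220_500,
                                 playedFrames: 132_300,
                                 sampleRate: 44_100)
print(remaining)   // seconds left; used as the afterDelay above
```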
+1

Yes, it does get called slightly before the file (or buffer) has completed. If you call [myNode stop] from within the completion handler, the file (or buffer) will not play to completion. However, if you call [myEngine stop], the file (or buffer) will play all the way to the end.

0
 // audioFile here is our original audio
 audioPlayerNode.scheduleFile(audioFile, at: nil, completionHandler: {
     print("scheduleFile Complete")

     var delayInSeconds: Double = 0

     if let lastRenderTime = self.audioPlayerNode.lastRenderTime,
        let playerTime = self.audioPlayerNode.playerTime(forNodeTime: lastRenderTime) {

         if let rate = rate {
             delayInSeconds = Double(audioFile.length - playerTime.sampleTime) / Double(audioFile.processingFormat.sampleRate) / Double(rate)
         } else {
             delayInSeconds = Double(audioFile.length - playerTime.sampleTime) / Double(audioFile.processingFormat.sampleRate)
         }
     }

     // schedule a stop timer for when audio finishes playing
     DispatchTime.executeAfter(seconds: delayInSeconds) {
         audioEngine.mainMixerNode.removeTap(onBus: 0)
         // Playback has completed
     }
 })
0
source
