I implemented an audio player using AVAudioPlayer (not AVPlayer). I handle remote control events in the following way, and it has been working well so far. However, I see two more subtypes for these events: UIEventSubtypeRemoteControlEndSeekingForward and UIEventSubtypeRemoteControlEndSeekingBackward.
```objc
- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    // If it is a remote control event, handle it accordingly.
    if (event.type == UIEventTypeRemoteControl) {
        if (event.subtype == UIEventSubtypeRemoteControlPlay) {
            [self playAudio];
        } else if (event.subtype == UIEventSubtypeRemoteControlPause) {
            [self pauseAudio];
        } else if (event.subtype == UIEventSubtypeRemoteControlTogglePlayPause) {
            [self togglePlayPause];
        } else if (event.subtype == UIEventSubtypeRemoteControlBeginSeekingBackward) {
            [self rewindTheAudio];      // rewinds the audio by 15 seconds
        } else if (event.subtype == UIEventSubtypeRemoteControlBeginSeekingForward) {
            [self fastForwardTheAudio]; // fast-forwards the audio by 15 seconds
        }
    }
}
```
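For reference, this is roughly what handling the two End-seeking subtypes could look like inside the same `if` chain. This is only a sketch under an assumption: `endSeeking` is a hypothetical helper that would stop a repeating seek started by the Begin-seeking handlers (with the current fixed 15-second jumps there may be nothing to stop).

```objc
// Hedged sketch: the two remaining subtypes mentioned above.
// -endSeeking is a hypothetical helper, not part of my current code.
else if (event.subtype == UIEventSubtypeRemoteControlEndSeekingBackward ||
         event.subtype == UIEventSubtypeRemoteControlEndSeekingForward) {
    [self endSeeking]; // would cancel a continuous seek, if one were running
}
```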
So the questions are:
1. For everything to be correct, should I also implement these two subtypes?

2. This method lets me use the rewind, play/pause, and fast-forward controls on the lock screen, but it does not display the file name, artwork image, or duration. How can I display this information using AVAudioPlayer or AVAudioSession (I don't want another library/API for this)?
2-a. During my search I discovered MPNowPlayingInfoCenter, but I know very little about it. Should I use it to implement the things above?
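From what I can tell, setting lock-screen metadata with MPNowPlayingInfoCenter would look something like the sketch below. The title, the "cover" asset name, and the `audioPlayer` property are my assumptions for illustration, not tested code:

```objc
#import <MediaPlayer/MediaPlayer.h>

- (void)updateNowPlayingInfo {
    // "cover" is a hypothetical asset name used here for illustration.
    UIImage *artworkImage = [UIImage imageNamed:@"cover"];
    MPMediaItemArtwork *artwork =
        [[MPMediaItemArtwork alloc] initWithImage:artworkImage];

    // self.audioPlayer is assumed to be the AVAudioPlayer instance.
    NSDictionary *info = @{
        MPMediaItemPropertyTitle: @"Episode 1", // hypothetical title
        MPMediaItemPropertyArtwork: artwork,
        MPMediaItemPropertyPlaybackDuration: @(self.audioPlayer.duration),
        MPNowPlayingInfoPropertyElapsedPlaybackTime: @(self.audioPlayer.currentTime),
        MPNowPlayingInfoPropertyPlaybackRate: @(self.audioPlayer.isPlaying ? 1.0 : 0.0)
    };
    [MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo = info;
}
```

I assume this would need to be called again whenever playback state changes (play, pause, seek) so the lock screen stays in sync, but I'm not sure.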
ios avaudioplayer avaudiosession mpnowplayinginfocenter background-audio
Neeku