In-memory generation of LPCM buffer for AVAudioPlayer initWithData

I want to programmatically create a sound wave and play it using AVAudioPlayer. I have code that encodes my signal as linear PCM: 44,100 Hz, mono, 8 bits per sample.

I don't understand what kind of envelope I need to wrap around this buffer so that AVAudioPlayer recognizes it as PCM.

+6
iphone audio avaudioplayer core-audio
3 answers

PCM is just a digital representation of an analog audio signal. Unfortunately, it carries no metadata about channel count, bit depth, or sample rate, and all of those are needed to interpret the PCM data correctly. I would guess that AVAudioPlayer would accept this PCM data wrapped in an NSData object if you could manually set those variables on the AVAudioPlayer object. Unfortunately, those properties are read-only, so even though the documentation says AVAudioPlayer can handle anything Core Audio can handle, it cannot play raw LPCM data.

As zoul suggests, the easiest approach is to add a WAV header, since it tells AVAudioPlayer the variables listed above. The header is 44 bytes, easy to construct, and well defined; I used the standard definition to implement encoding and decoding of the WAV header. You then simply prepend it to your unchanged LPCM data.
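That idea can be sketched in Swift as follows. The function name and parameter defaults are mine, not from the answer; the field layout follows the canonical 44-byte PCM RIFF/WAVE header, with values matching the question's format (44,100 Hz, mono, 8-bit).

```swift
import AVFoundation

// Sketch: prepend a 44-byte WAV (RIFF) header to raw 8-bit mono LPCM
// samples so AVAudioPlayer can identify the format.
func wavData(fromMono8BitPCM samples: Data, sampleRate: UInt32 = 44_100) -> Data {
    let channels: UInt16 = 1
    let bitsPerSample: UInt16 = 8
    let blockAlign = channels * bitsPerSample / 8          // bytes per frame
    let byteRate = sampleRate * UInt32(blockAlign)         // bytes per second
    let dataSize = UInt32(samples.count)

    var header = Data()
    func append<T: FixedWidthInteger>(_ value: T) {
        withUnsafeBytes(of: value.littleEndian) { header.append(contentsOf: $0) }
    }
    header.append(contentsOf: Array("RIFF".utf8))
    append(36 + dataSize)                                  // total size minus 8
    header.append(contentsOf: Array("WAVE".utf8))
    header.append(contentsOf: Array("fmt ".utf8))
    append(UInt32(16))                                     // fmt chunk size
    append(UInt16(1))                                      // audio format 1 = PCM
    append(channels)
    append(sampleRate)
    append(byteRate)
    append(UInt16(blockAlign))
    append(bitsPerSample)
    header.append(contentsOf: Array("data".utf8))
    append(dataSize)
    return header + samples
}

// Usage: wrap the raw buffer and hand the result to AVAudioPlayer.
// let player = try AVAudioPlayer(data: wavData(fromMono8BitPCM: pcmBuffer))
// player.play()
```

Because the header states the sample rate, channel count, and bit depth explicitly, AVAudioPlayer no longer needs those read-only properties set by hand.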

+5

Maybe adding a WAV header would help?

+4

I published a Swift 5 example (as a GitHub Gist) that converts a sample buffer of audio floats into an in-memory WAV file for use with AVAudioPlayer initWithData, here: https://gist.github.com/hotpaw2/4eb1ca16c138178113816e78b14dde8b

0
