How to convert audio so that it can be streamed between devices

I’ve been looking for an answer to this question for about a month now, so any help is appreciated!

I use AVAudioEngine to record sound. The audio is captured with a tap on the input node:

localInput?.installTap(onBus: 0, bufferSize: 4096, format: localInputFormat) {
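For reference, the recording setup around that tap looks roughly like this (the exact engine configuration may differ; sendBytes(_:) is just a placeholder for whatever writes to the stream):

import AVFoundation

// Rough sketch of the recording side; audioBufferToBytes is defined below.
let audioEngine = AVAudioEngine()
let localInput = audioEngine.inputNode
let localInputFormat = localInput.inputFormat(forBus: 0)

func sendBytes(_ bytes: [UInt8]) {
    // placeholder: write the bytes to the OutputStream here
}

localInput.installTap(onBus: 0, bufferSize: 4096, format: localInputFormat) { buffer, _ in
    sendBytes(audioBufferToBytes(audioBuffer: buffer))
}

audioEngine.prepare()
try? audioEngine.start()    // error handling omitted in this sketch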

The tap delivers an AVAudioPCMBuffer, which needs to be converted to [UInt8] before it can be sent over the stream.

I do this with the following method:

func audioBufferToBytes(audioBuffer: AVAudioPCMBuffer) -> [UInt8] {
    let srcLeft = audioBuffer.floatChannelData![0]   // channel 0 (mono is assumed throughout)
    let bytesPerFrame = audioBuffer.format.streamDescription.pointee.mBytesPerFrame
    let numBytes = Int(bytesPerFrame * audioBuffer.frameLength)

    // allocate a zero-filled byte array and copy the raw Float32 samples into it
    var audioByteArray = [UInt8](repeating: 0, count: numBytes)

    srcLeft.withMemoryRebound(to: UInt8.self, capacity: numBytes) { srcByteData in
        audioByteArray.withUnsafeMutableBufferPointer {
            $0.baseAddress!.initialize(from: srcByteData, count: numBytes)
        }
    }

    return audioByteArray
}
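For completeness, writing those bytes to an already-open OutputStream can look roughly like this (the stream itself, for example from the peer connection, is assumed to be set up elsewhere):

import Foundation

// Sketch: write a chunk of audio bytes to an open OutputStream.
func send(_ bytes: [UInt8], over outputStream: OutputStream) {
    bytes.withUnsafeBufferPointer { pointer in
        guard let base = pointer.baseAddress else { return }
        var written = 0
        // write(_:maxLength:) may accept fewer bytes than offered, so loop
        while written < bytes.count {
            let result = outputStream.write(base + written, maxLength: bytes.count - written)
            if result <= 0 { break }    // stream error or stream closed
            written += result
        }
    }
}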

The bytes are then written to the output stream (roughly as in the sketch above). On the other device, the data must be converted back to an AVAudioPCMBuffer before it can be played. I am using this method:

func bytesToAudioBuffer(_ buf: [UInt8]) -> AVAudioPCMBuffer {

    let fmt = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 44100, channels: 1, interleaved: true)
    let frameLength = UInt32(buf.count) / fmt.streamDescription.pointee.mBytesPerFrame

    let audioBuffer = AVAudioPCMBuffer(pcmFormat: fmt, frameCapacity: frameLength)
    audioBuffer.frameLength = frameLength

    let dstLeft = audioBuffer.floatChannelData![0]

    buf.withUnsafeBufferPointer {
        let src = UnsafeRawPointer($0.baseAddress!).bindMemory(to: Float.self, capacity: Int(frameLength))
        dstLeft.initialize(from: src, count: Int(frameLength))
    }

    return audioBuffer
}
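On the receiving device the reconstructed buffers then have to be scheduled on a player node, roughly like this (the engine and node setup may differ; the connection format has to be compatible with the format the buffers are created with):

import AVFoundation

// Sketch of the playback side: mono, 44100 Hz, Float32.
let playbackEngine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()
let playbackFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                   sampleRate: 44100, channels: 1, interleaved: false)

playbackEngine.attach(playerNode)
playbackEngine.connect(playerNode, to: playbackEngine.mainMixerNode, format: playbackFormat)
try? playbackEngine.start()    // error handling omitted in this sketch
playerNode.play()

// For every chunk of bytes read from the InputStream:
// playerNode.scheduleBuffer(bytesToAudioBuffer(receivedBytes), completionHandler: nil)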

However, there must be something wrong with my logic, because when I play the audio on the other device I can hear something, but it just sounds like static.

Any help is appreciated; as I said, I have been stuck on this problem for a long time.


EDIT

I have also tried a different pair of conversion functions that use Data instead of [UInt8]. From AVAudioPCMBuffer to Data:

func audioBufferToData(audioBuffer: AVAudioPCMBuffer) -> Data {
    let channelCount = 1
    let bufferLength = (audioBuffer.frameCapacity * audioBuffer.format.streamDescription.pointee.mBytesPerFrame)   // note: frameCapacity, not frameLength

    let channels = UnsafeBufferPointer(start: audioBuffer.floatChannelData, count: channelCount)
    let data = Data(bytes: channels[0], count: Int(bufferLength))

    return data
}

And from Data back to AVAudioPCMBuffer:

func dataToAudioBuffer(data: Data) -> AVAudioPCMBuffer {
    let audioFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 8000, channels: 1, interleaved: false)
    let audioBuffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: UInt32(data.count)/2)
    audioBuffer.frameLength = audioBuffer.frameCapacity
    for i in 0..<data.count/2 {
        audioBuffer.floatChannelData?.pointee[i] = Float(Int16(data[i*2+1]) << 8 | Int16(data[i*2]))/Float(INT16_MAX)
    }

    return audioBuffer
}

Unfortunately, this did not solve the problem either...
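One detail worth noting about this pair: audioBufferToData copies raw Float32 samples, while dataToAudioBuffer decodes the incoming bytes as little-endian Int16, so the two functions do not round-trip as written. If 16-bit samples are what should go over the wire, the sending side needs a matching encoder, roughly like this (this helper is not part of the project, it is only an illustration):

import AVFoundation

// Hypothetical counterpart to dataToAudioBuffer above: pack Float32 samples
// from the tap as little-endian Int16 so both ends agree on the wire format.
func audioBufferToInt16Data(_ buffer: AVAudioPCMBuffer) -> Data {
    let frames = Int(buffer.frameLength)
    let samples = buffer.floatChannelData![0]            // channel 0 (mono)
    var data = Data(capacity: frames * MemoryLayout<Int16>.size)
    for i in 0..<frames {
        let clamped = max(-1.0, min(1.0, samples[i]))    // keep within [-1, 1]
        let value = Int16(clamped * Float(Int16.max)).littleEndian
        withUnsafeBytes(of: value) { data.append(contentsOf: $0) }
    }
    return data
}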


EDIT 2

I have put together a sample project that contains the conversion from AVAudioPCMBuffer to Data and back, so the problem can be reproduced:

: https://github.com/Lkember/IntercomTest


EDIT 3



EDIT 4

I have finally found the solution. This question pointed me in the right direction:

NSData OutputStream


Here are the conversion functions that ended up working:

func audioBufferToNSData(PCMBuffer: AVAudioPCMBuffer) -> NSData {
    let channelCount = 1  // given PCMBuffer channel count is 1
    let channels = UnsafeBufferPointer(start: PCMBuffer.floatChannelData, count: channelCount)
    let data = NSData(bytes: channels[0], length:Int(PCMBuffer.frameCapacity * PCMBuffer.format.streamDescription.pointee.mBytesPerFrame))
    return data
}

func dataToAudioBuffer(data: NSData) -> AVAudioPCMBuffer {
    let audioFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 44100, channels: 1, interleaved: false)
    let audioBuffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: UInt32(data.length) / audioFormat.streamDescription.pointee.mBytesPerFrame)
    audioBuffer.frameLength = audioBuffer.frameCapacity
    let channels = UnsafeBufferPointer(start: audioBuffer.floatChannelData, count: Int(audioBuffer.format.channelCount))
    data.getBytes(UnsafeMutableRawPointer(channels[0]) , length: data.length)
    return audioBuffer
}
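Putting the two functions together, the round trip looks roughly like this (the stream and the player node are assumed to be set up as in the earlier sketches):

import AVFoundation

// Sketch of the round trip using the two functions above.
func sendBuffer(_ buffer: AVAudioPCMBuffer, over outputStream: OutputStream) {
    let data = audioBufferToNSData(PCMBuffer: buffer)
    // a real implementation should check the return value and loop on partial writes
    _ = outputStream.write(data.bytes.assumingMemoryBound(to: UInt8.self), maxLength: data.length)
}

func playReceived(_ chunk: NSData, on playerNode: AVAudioPlayerNode) {
    let buffer = dataToAudioBuffer(data: chunk)
    playerNode.scheduleBuffer(buffer, completionHandler: nil)
}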

Since you hear something rather than nothing, the bytes are most likely arriving; they are just being reinterpreted with the wrong format somewhere along the way.

Reading .floatChannelData as UInt8 only copies raw bytes, it does not convert anything, so the sender and the receiver have to agree on the exact sample format.

The same applies to .withMemoryRebound with integer types: rebinding memory reinterprets the existing bytes, it does not convert the samples.

Also check how the receiving audioBuffer is built. The AVAudioPCMBuffer created there must use the same sample rate, channel count and frame length as the buffer that came out of the tap; if any of these differ, playback sounds like static.
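If the format coming out of the tap does not match the format used for streaming (the snippets above mix 44100 Hz, 8000 Hz, Float32 and Int16), the buffer can be converted with AVAudioConverter before it is turned into bytes. A minimal sketch:

import AVFoundation

// Sketch: convert a PCM buffer to a different format (sample rate, channel
// count, sample type) before turning it into bytes for streaming.
func convert(_ buffer: AVAudioPCMBuffer, to format: AVAudioFormat) -> AVAudioPCMBuffer? {
    guard let converter = AVAudioConverter(from: buffer.format, to: format) else { return nil }

    let ratio = format.sampleRate / buffer.format.sampleRate
    let capacity = AVAudioFrameCount(Double(buffer.frameLength) * ratio) + 1
    guard let output = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: capacity) else { return nil }

    var consumed = false
    var error: NSError?
    let status = converter.convert(to: output, error: &error) { _, inputStatus in
        // hand the converter the single input buffer exactly once
        if consumed {
            inputStatus.pointee = .noDataNow
            return nil
        }
        consumed = true
        inputStatus.pointee = .haveData
        return buffer
    }
    return (status != .error && error == nil) ? output : nil
}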


Check out https://www.iis.fraunhofer.de/en/ff/amm/dl/whitepapers.html Using the information here, I did something very similar. Here you will find a detailed PDF file and some sample code.
