Converting a CMSampleBuffer to an AVAudioPCMBuffer to get real-time audio frequencies

Problem description

I am trying to read frequency values from the CMSampleBuffer returned by captureOutput of an AVCaptureAudioDataOutputSampleBufferDelegate.

The idea is to create an AVAudioPCMBuffer so that I can read its floatChannelData, but I am not sure how to pass the buffer's data to it.

I figured I could create it like this:

public func captureOutput(_ output: AVCaptureOutput,
                          didOutput sampleBuffer: CMSampleBuffer,
                          from connection: AVCaptureConnection) {
    guard let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else {
        return
    }
    let length = CMBlockBufferGetDataLength(blockBuffer)

    let audioFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 44100, channels: 1, interleaved: false)
    let pcmBuffer = AVAudioPCMBuffer(pcmFormat: audioFormat!, frameCapacity: AVAudioFrameCount(length))
    pcmBuffer?.frameLength = pcmBuffer!.frameCapacity
}

But how can I fill it with data?

Tags: ios, swift, audio-streaming

Solution


Something along these lines should help:

// Extract the sample buffer's audio data as an AudioBufferList, retaining the
// backing CMBlockBuffer so the underlying memory stays valid.
var asbd = CMSampleBufferGetFormatDescription(sampleBuffer)!.audioStreamBasicDescription!
var audioBufferList = AudioBufferList()
var blockBuffer: CMBlockBuffer?

CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
    sampleBuffer,
    bufferListSizeNeededOut: nil,
    bufferListOut: &audioBufferList,
    bufferListSize: MemoryLayout<AudioBufferList>.size,
    blockBufferAllocator: nil,
    blockBufferMemoryAllocator: nil,
    flags: 0,
    blockBufferOut: &blockBuffer
)

// Point an AVAudioPCMBuffer at the extracted memory. No copy is made: mData
// still points into blockBuffer, so keep blockBuffer alive for as long as
// pcmBuffer is in use.
let mBuffers = audioBufferList.mBuffers
let frameLength = AVAudioFrameCount(Int(mBuffers.mDataByteSize) / MemoryLayout<Float>.size)
let pcmBuffer = AVAudioPCMBuffer(pcmFormat: AVAudioFormat(streamDescription: &asbd)!, frameCapacity: frameLength)!
pcmBuffer.frameLength = frameLength
pcmBuffer.mutableAudioBufferList.pointee.mBuffers = mBuffers
pcmBuffer.mutableAudioBufferList.pointee.mNumberBuffers = 1

This seems to create a valid AVAudioPCMBuffer at the end of the capture session. But for my use case it currently has the wrong frame length, so some further buffering is needed.
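
On that last point, one way to do the further buffering is to accumulate the converted frames until a fixed-size analysis window is full. A rough sketch of that idea follows; the AudioAccumulator type, its 4096-frame window, and the drop-the-leftover behaviour are all hypothetical choices for illustration, and it assumes a deinterleaved Float32 format on both sides.

import AVFoundation

// Hypothetical accumulator: collects incoming frames until a fixed-size
// analysis window is full. The type name and window size are illustrative.
final class AudioAccumulator {
    private static let windowSize: AVAudioFrameCount = 4096
    private var stored: AVAudioFrameCount = 0
    private let accumulator: AVAudioPCMBuffer

    init?(format: AVAudioFormat) {
        guard let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                            frameCapacity: Self.windowSize) else { return nil }
        accumulator = buffer
    }

    // Appends as many frames as still fit (any leftover frames are simply
    // dropped in this sketch); returns the window once it is full. The returned
    // buffer is reused, so consume it before the next call. Assumes the
    // incoming buffer's format matches the accumulator's format.
    func append(_ buffer: AVAudioPCMBuffer) -> AVAudioPCMBuffer? {
        guard let src = buffer.floatChannelData,
              let dst = accumulator.floatChannelData else { return nil }
        let count = min(buffer.frameLength, Self.windowSize - stored)
        for channel in 0..<Int(accumulator.format.channelCount) {
            memcpy(dst[channel] + Int(stored), src[channel],
                   Int(count) * MemoryLayout<Float>.size)
        }
        stored += count
        guard stored == Self.windowSize else { return nil }
        accumulator.frameLength = Self.windowSize
        stored = 0
        return accumulator
    }
}

Each pcmBuffer produced above would be fed to append(_:); a non-nil return value is a full window ready for analysis.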

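Once a full window is available, the original goal of reading frequency values can be tackled over floatChannelData with the Accelerate framework's real FFT. Again, this is only a sketch: the dominantFrequency helper is hypothetical and assumes a single channel of Float32 samples.

import Accelerate
import AVFoundation

// Hypothetical helper: estimates the dominant frequency in a mono Float32
// buffer with Accelerate's radix-2 real FFT.
func dominantFrequency(in buffer: AVAudioPCMBuffer) -> Float? {
    guard let channelData = buffer.floatChannelData else { return nil }
    let frameCount = Int(buffer.frameLength)
    guard frameCount >= 2 else { return nil }

    // Use the largest power-of-two slice of the buffer for the FFT.
    let log2n = vDSP_Length(floor(log2(Float(frameCount))))
    let n = 1 << Int(log2n)
    guard let fftSetup = vDSP_create_fftsetup(log2n, FFTRadix(kFFTRadix2)) else { return nil }
    defer { vDSP_destroy_fftsetup(fftSetup) }

    var realp = [Float](repeating: 0, count: n / 2)
    var imagp = [Float](repeating: 0, count: n / 2)
    var magnitudes = [Float](repeating: 0, count: n / 2)

    realp.withUnsafeMutableBufferPointer { realPtr in
        imagp.withUnsafeMutableBufferPointer { imagPtr in
            var splitComplex = DSPSplitComplex(realp: realPtr.baseAddress!,
                                               imagp: imagPtr.baseAddress!)
            // Repack the real samples into split-complex form, run the forward
            // FFT, then take the squared magnitude of each bin.
            channelData[0].withMemoryRebound(to: DSPComplex.self, capacity: n / 2) { complexPtr in
                vDSP_ctoz(complexPtr, 2, &splitComplex, 1, vDSP_Length(n / 2))
            }
            vDSP_fft_zrip(fftSetup, &splitComplex, 1, log2n, FFTDirection(FFT_FORWARD))
            vDSP_zvmags(&splitComplex, 1, &magnitudes, 1, vDSP_Length(n / 2))
        }
    }

    // The strongest bin (skipping DC at index 0) maps back to a frequency in Hz.
    guard let maxIndex = (1..<(n / 2)).max(by: { magnitudes[$0] < magnitudes[$1] }) else { return nil }
    return Float(maxIndex) * Float(buffer.format.sampleRate) / Float(n)
}

In practice you would usually apply a window function (for example Hann) before the FFT and interpolate around the peak bin, but this shows the basic floatChannelData-to-frequency path.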
