Swift/Cocoa: How to export an AVMutableComposition to a wav or aiff audio file

Question

I have an AVMutableComposition containing only audio, which I want to export to a .wav audio file.

The simplest solution I have found for exporting audio uses AVAssetExportSession, as in this simplified example:

let composition = AVMutableComposition()
// add tracks...

let exportSession = AVAssetExportSession(asset: composition,
                                         presetName: AVAssetExportPresetAppleM4A)!
exportSession.outputFileType = .m4a
exportSession.outputURL = someOutUrl
exportSession.exportAsynchronously {
    // done
}

But it only works for .m4a.
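(For what it's worth, you can confirm this limitation at runtime: AVAssetExportSession exposes a supportedFileTypes property, which for the AppleM4A preset only lists .m4a. A minimal sketch:)

```swift
import AVFoundation

// Sketch: inspect which container formats a given preset can write.
let composition = AVMutableComposition()
if let session = AVAssetExportSession(asset: composition,
                                      presetName: AVAssetExportPresetAppleM4A) {
    // For the AppleM4A preset this list contains only .m4a,
    // which is why a wav export needs a different approach.
    print(session.supportedFileTypes)
}
```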

This post mentions that in order to export to other formats, one has to use AVAssetReader and AVAssetWriter, but unfortunately it does not go into detail.

I tried to implement it, but got stuck along the way.
This is what I have so far (again simplified):

let composition = AVMutableComposition()
let outputSettings: [String : Any] = [
    AVFormatIDKey: kAudioFormatLinearPCM,
    AVLinearPCMIsBigEndianKey: false,
    AVLinearPCMIsFloatKey: false,
    AVLinearPCMBitDepthKey: 32,
    AVLinearPCMIsNonInterleaved: false,
    AVSampleRateKey: 44100.0,
    AVChannelLayoutKey: NSData(),
]

let assetWriter = try! AVAssetWriter(outputURL: someOutUrl, fileType: .wav)
let input = AVAssetWriterInput(mediaType: .audio, outputSettings: outputSettings)
assetWriter.add(input)
assetWriter.startWriting()
assetWriter.startSession(atSourceTime: CMTime.zero)
input.requestMediaDataWhenReady(on: .main) {

    // as I understand, I need to bring in data from my
    // AVMutableComposition here...

    let sampleBuffer: CMSampleBuffer = ???
    input.append(sampleBuffer)
}

assetWriter.finishWriting {
    // done
}

Which brings me to my question:

Can you provide a working example of exporting the audio of an AVMutableComposition to a wav file?

Tags: swift, avfoundation, core-audio, avassetwriter

Solution


After some more research, I came up with the following solution.
The missing piece was the use of AVAssetReader.

(Simplified code)

// composition
let composition = AVMutableComposition()
// add stuff to composition

// reader
guard let assetReader = try? AVAssetReader(asset: composition) else { return }
assetReader.timeRange = CMTimeRange(start: .zero, duration: composition.duration)
let assetReaderAudioMixOutput = AVAssetReaderAudioMixOutput(audioTracks: composition.tracks(withMediaType: .audio), audioSettings: nil)
assetReader.add(assetReaderAudioMixOutput)
guard assetReader.startReading() else { return }

// writer
let outputSettings: [String : Any] = [
    AVFormatIDKey: kAudioFormatLinearPCM,
    AVLinearPCMIsBigEndianKey: false,
    AVLinearPCMIsFloatKey: false,
    AVLinearPCMBitDepthKey: 32,
    AVLinearPCMIsNonInterleaved: false,
    AVSampleRateKey: 44100.0,
    AVChannelLayoutKey: NSData(),
]
guard let assetWriter = try? AVAssetWriter(outputURL: someOutUrl, fileType: .wav) else { return }
let writerInput = AVAssetWriterInput(mediaType: .audio, outputSettings: outputSettings)
assetWriter.add(writerInput)
guard assetWriter.startWriting() else { return }
assetWriter.startSession(atSourceTime: CMTime.zero)

let queue = DispatchQueue(label: "my.queue.id")
writerInput.requestMediaDataWhenReady(on: queue) {

    // capture assetReader in my block to prevent it being released
    let readerOutput = assetReader.outputs.first!

    while writerInput.isReadyForMoreMediaData {
        if let nextSampleBuffer = readerOutput.copyNextSampleBuffer() {
            writerInput.append(nextSampleBuffer)
        } else {
            writerInput.markAsFinished()
            assetWriter.endSession(atSourceTime: composition.duration)
            assetWriter.finishWriting() {
                DispatchQueue.main.async {

                    // done, call my completion
                }
            }
            break
        }
    }
}
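Since the title also asks about .aiff: as far as I can tell, the same reader/writer pipeline works, and only the output settings and file type need to change, because AIFF stores big-endian integer PCM. An untested sketch (someAiffUrl is a placeholder for your own destination URL):

```swift
import AVFoundation

// Sketch of the AIFF variant: AIFF expects big-endian integer samples.
let aiffSettings: [String: Any] = [
    AVFormatIDKey: kAudioFormatLinearPCM,
    AVLinearPCMIsBigEndianKey: true,   // AIFF is big-endian, unlike wav
    AVLinearPCMIsFloatKey: false,
    AVLinearPCMBitDepthKey: 16,
    AVLinearPCMIsNonInterleaved: false,
    AVSampleRateKey: 44100.0,
]
// someAiffUrl is hypothetical; point it at your desired .aiff output file.
let aiffWriter = try AVAssetWriter(outputURL: someAiffUrl, fileType: .aiff)
```

The rest of the reader/writer loop stays the same as above.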


