swift - All black frames when trying to write Metal frames to a QuickTime file with AVFoundation AVAssetWriter
Problem description
I'm using this Swift class (originally shown in the answer to this question: Capture Metal MTKView as Movie in realtime?) to try to record my Metal app's frames to a movie file.
```swift
class MetalVideoRecorder {
    var isRecording = false
    var recordingStartTime = TimeInterval(0)

    private var assetWriter: AVAssetWriter
    private var assetWriterVideoInput: AVAssetWriterInput
    private var assetWriterPixelBufferInput: AVAssetWriterInputPixelBufferAdaptor

    init?(outputURL url: URL, size: CGSize) {
        do {
            assetWriter = try AVAssetWriter(outputURL: url, fileType: AVFileTypeAppleM4V)
        } catch {
            return nil
        }

        let outputSettings: [String: Any] = [ AVVideoCodecKey: AVVideoCodecH264,
                                              AVVideoWidthKey: size.width,
                                              AVVideoHeightKey: size.height ]

        assetWriterVideoInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: outputSettings)
        assetWriterVideoInput.expectsMediaDataInRealTime = true

        let sourcePixelBufferAttributes: [String: Any] = [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
            kCVPixelBufferWidthKey as String: size.width,
            kCVPixelBufferHeightKey as String: size.height ]

        assetWriterPixelBufferInput = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: assetWriterVideoInput,
                                                                           sourcePixelBufferAttributes: sourcePixelBufferAttributes)

        assetWriter.add(assetWriterVideoInput)
    }

    func startRecording() {
        assetWriter.startWriting()
        assetWriter.startSession(atSourceTime: kCMTimeZero)
        recordingStartTime = CACurrentMediaTime()
        isRecording = true
    }

    func endRecording(_ completionHandler: @escaping () -> ()) {
        isRecording = false
        assetWriterVideoInput.markAsFinished()
        assetWriter.finishWriting(completionHandler: completionHandler)
    }

    func writeFrame(forTexture texture: MTLTexture) {
        if !isRecording {
            return
        }

        while !assetWriterVideoInput.isReadyForMoreMediaData {}

        guard let pixelBufferPool = assetWriterPixelBufferInput.pixelBufferPool else {
            print("Pixel buffer asset writer input did not have a pixel buffer pool available; cannot retrieve frame")
            return
        }

        var maybePixelBuffer: CVPixelBuffer? = nil
        let status = CVPixelBufferPoolCreatePixelBuffer(nil, pixelBufferPool, &maybePixelBuffer)
        if status != kCVReturnSuccess {
            print("Could not get pixel buffer from asset writer input; dropping frame...")
            return
        }

        guard let pixelBuffer = maybePixelBuffer else { return }

        CVPixelBufferLockBaseAddress(pixelBuffer, [])
        let pixelBufferBytes = CVPixelBufferGetBaseAddress(pixelBuffer)!

        // Use the bytes per row value from the pixel buffer since its stride may be rounded up to be 16-byte aligned
        let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
        let region = MTLRegionMake2D(0, 0, texture.width, texture.height)
        texture.getBytes(pixelBufferBytes, bytesPerRow: bytesPerRow, from: region, mipmapLevel: 0)

        let frameTime = CACurrentMediaTime() - recordingStartTime
        let presentationTime = CMTimeMakeWithSeconds(frameTime, 240)
        assetWriterPixelBufferInput.append(pixelBuffer, withPresentationTime: presentationTime)

        CVPixelBufferUnlockBaseAddress(pixelBuffer, [])
    }
}
```
I'm not seeing any errors, but the frames in the resulting QuickTime file are all black. The frame size is correct, and my pixel format is correct (bgra8Unorm). Does anyone know why it might not be working?
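For reference, this is roughly how I drive the recorder; `videoURL` and the size here are placeholders, not my real values:

```swift
// Hypothetical wiring for the MetalVideoRecorder class above.
let videoURL = FileManager.default.temporaryDirectory.appendingPathComponent("capture.m4v")
let recorder = MetalVideoRecorder(outputURL: videoURL,
                                  size: CGSize(width: 1920, height: 1080))
recorder?.startRecording()
// ... render frames; writeFrame(forTexture:) is called once per frame ...
recorder?.endRecording {
    print("Finished writing \(videoURL.path)")
}
```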
I'm calling the writeFrame function just before presenting and committing the current drawable, like this:
```swift
if let drawable = view.currentDrawable {
    if BigVideoWriter != nil && BigVideoWriter!.isRecording {
        commandBuffer.addCompletedHandler { commandBuffer in
            BigVideoWriter?.writeFrame(forTexture: drawable.texture)
        }
    }
    commandBuffer.present(drawable)
    commandBuffer.commit()
}
```
Initially I did get an error saying that my MetalKitView's layer was "framebufferOnly". So I set it to false before trying to record. That got rid of the error, but the frames were all black. I also tried setting it to false at the start of the program, but I got the same result.
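In case it matters, this is where I flip the flag; `metalView` is a hypothetical MTKView outlet standing in for my actual view:

```swift
import MetalKit

final class ViewController: NSViewController {  // hypothetical host view controller
    @IBOutlet var metalView: MTKView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // framebufferOnly must be false before drawables are vended, otherwise
        // texture.getBytes(...) on the drawable's texture is not permitted.
        metalView.framebufferOnly = false
    }
}
```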
I also tried using 'addCompletedHandler' instead of 'addScheduledHandler', but that gave me the error "[CAMetalLayerDrawable texture] should not be called after already presenting this drawable. Get a nextDrawable instead."
Thanks for any suggestions!
EDIT: I solved this with the help of @Idogy. Testing showed that the original version worked on iOS, but not on the Mac. He said that because I have an NVIDIA GPU, the framebuffer was private. So I had to add a blit command encoder with a synchronize call on the texture, and then it started working. Like this:
```swift
if let drawable = view.currentDrawable {
    if BigVideoWriter != nil && BigVideoWriter!.isRecording {
        #if ISMAC
        if let blitCommandEncoder = commandBuffer.makeBlitCommandEncoder() {
            blitCommandEncoder.synchronize(resource: drawable.texture)
            blitCommandEncoder.endEncoding()
        }
        #endif
        commandBuffer.addCompletedHandler { commandBuffer in
            BigVideoWriter?.writeFrame(forTexture: drawable.texture)
        }
    }
    commandBuffer.present(drawable)
    commandBuffer.commit()
}
```
Solution
I believe you are writing your frames too early — by calling writeFrame from within your render loop, you are capturing the drawable while it is still empty (the GPU hasn't rendered it yet).

Keep in mind that until you call commandBuffer.commit(), the GPU hasn't even started rendering your frame. You need to wait for the GPU to finish rendering before trying to grab the resulting frame. The ordering is a little confusing because you also call present() before commit(), but that's not the actual order of operations at runtime. The present call merely tells Metal to schedule the frame for presentation to the screen once the GPU has finished rendering it.

You should call writeFrame from a completion handler (using commandBuffer.addCompletedHandler()). That should take care of the problem.
UPDATE: While the answer above is correct, it is only partial. Since the OP is using a discrete GPU with private VRAM, the CPU is unable to see the render target's pixels. The solution to that problem is to add an MTLBlitCommandEncoder and use its synchronize() method to ensure the rendered pixels are copied back from the GPU's VRAM to RAM.
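If you'd rather not rely on a custom compile-time flag like the OP's `#if ISMAC`, you can decide at runtime from the texture's storage mode; this is a sketch of that idea (the helper name is mine, not from the OP's code):

```swift
import Metal

// Hypothetical helper: encode a synchronize blit only when the texture lives
// in .managed storage, i.e. when the GPU copy and the CPU-visible copy can
// diverge (discrete GPUs on macOS). On iOS and Apple-silicon unified memory
// this is unnecessary, and synchronize(resource:) is macOS-only API.
func synchronizeIfNeeded(_ texture: MTLTexture, on commandBuffer: MTLCommandBuffer) {
    #if os(macOS)
    if texture.storageMode == .managed,
       let blit = commandBuffer.makeBlitCommandEncoder() {
        blit.synchronize(resource: texture)
        blit.endEncoding()
    }
    #endif
}
```

The encoder must be ended before the command buffer is committed, and the blit has to be encoded before present(drawable), just as in the OP's working version.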