Converting an AVVideoComposition initializer to NativeScript

Problem Description

Looking for some help porting this Objective-C class method to JS/NativeScript. Every variation I've tried results in TypeError: undefined is not a function...

https://developer.apple.com/documentation/avfoundation/avvideocomposition/1389556-init

In JS, I've tried writing it as:

const videoComp = AVVideoComposition.alloc().initWithAssetApplyingCIFiltersWithHandler(asset, (request) => { ... });

//OR
const videoComp = AVVideoComposition.alloc().initAssetApplyingCIFiltersWithHandler(asset, (request) => { ... });

//OR
const videoComp = AVVideoComposition.alloc().initAssetApplyingCIFiltersWithHandlerApplier(asset, (request) => { ... });

//OR
const videoComp = new AVVideoComposition(asset, (request) => { ... });

to name a few. Essentially, I'm trying to port this code to NativeScript/JS:

let blurRadius = 6.0
let asset = AVAsset(url: streamURL)
let item = AVPlayerItem(asset: asset)
item.videoComposition = AVVideoComposition(asset: asset) { request in
    let blurred = request.sourceImage.clampedToExtent().applyingGaussianBlur(sigma: blurRadius)
    let output = blurred.clampedToRect(request.sourceImage.extent)
    request.finish(with: output, context: nil)
}

Found in this blog post: https://willowtreeapps.com/ideas/how-to-apply-a-filter-to-a-video-stream-in-ios

Tags: javascript, objective-c, nativescript

Solution

In JavaScript/TypeScript it should look like this:

let blurRadius = 6.0;
// Swift's AVAsset(url:) maps to the Objective-C factory method assetWithURL:
let asset = AVAsset.assetWithURL(streamURL);
let item = AVPlayerItem.alloc().initWithAsset(asset);
// The AVVideoComposition(asset:applyingCIFiltersWithHandler:) initializer is
// exposed in NativeScript via the class factory method below
item.videoComposition = AVVideoComposition.videoCompositionWithAssetApplyingCIFiltersWithHandler(asset, request => {
    // CIImage's shortened Swift names map back to their Objective-C selectors
    let blurred = request.sourceImage.imageByClampingToExtent().imageByApplyingGaussianBlurWithSigma(blurRadius);
    let output = blurred.imageByClampingToRect(request.sourceImage.extent);
    // finish(with:context:) becomes finishWithImageContext
    request.finishWithImageContext(output, null);
});

Note: the code is untested; it is a direct translation of the given native code. Use tns-platform-declarations to get IntelliSense support.
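The method names above follow NativeScript's Objective-C marshalling convention: a selector's colon-separated segments are concatenated into one JS identifier, with each segment after the first capitalized. A minimal sketch of that naming rule (the `selectorToJsName` helper is illustrative, not part of NativeScript itself):

```javascript
// Illustrative helper: derive the JS method name NativeScript would expose
// for an Objective-C selector. Segments between colons are concatenated,
// and every segment after the first gets its first letter capitalized.
function selectorToJsName(selector) {
  return selector
    .split(":")
    .filter(part => part.length > 0)
    .map((part, i) => (i === 0 ? part : part[0].toUpperCase() + part.slice(1)))
    .join("");
}

console.log(selectorToJsName("videoCompositionWithAsset:applyingCIFiltersWithHandler:"));
// → videoCompositionWithAssetApplyingCIFiltersWithHandler
console.log(selectorToJsName("finishWithImage:context:"));
// → finishWithImageContext
```

Applying this rule to the documented selector is usually the quickest way to find the right JS name when the "undefined is not a function" error appears.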
