Audio is missing after adding filter effect to captured video in Swift3 iOS

I am working on a video-based application where I need to add a CIFilter to a captured video picked from the device gallery. For this I am using the VideoEffects library below:

https://github.com/FlexMonkey/VideoEffects

Using it I can add a filter to my video, but the problem is that the audio is missing from the final video output. I tried adding the audio asset with the code below, but it doesn't work:

    videoOutputURL = documentDirectory.appendingPathComponent("Output_\(timeDateFormatter.string(from: Date())).mp4")

    do {
      videoWriter = try AVAssetWriter(outputURL: videoOutputURL!, fileType: AVFileTypeMPEG4)
    }
    catch {
      fatalError("** unable to create asset writer **")
    }

    let outputSettings: [String : AnyObject] = [
      AVVideoCodecKey: AVVideoCodecH264 as AnyObject,
      AVVideoWidthKey: currentItem.presentationSize.width as AnyObject,
      AVVideoHeightKey: currentItem.presentationSize.height as AnyObject]

    guard videoWriter!.canApply(outputSettings: outputSettings, forMediaType: AVMediaTypeVideo) else {
      fatalError("** unable to apply video settings ** ")
    }


    videoWriterInput = AVAssetWriterInput(
      mediaType: AVMediaTypeVideo,
      outputSettings: outputSettings)


    //setup audio writer
    let audioOutputSettings: Dictionary<String, AnyObject> = [
        AVFormatIDKey : Int(kAudioFormatMPEG4AAC) as AnyObject,
        AVSampleRateKey:48000.0 as AnyObject,
        AVNumberOfChannelsKey:NSNumber(value: 1),
        AVEncoderBitRateKey : 128000 as AnyObject
    ]

    guard videoWriter!.canApply(outputSettings: audioOutputSettings, forMediaType: AVMediaTypeAudio) else {
        fatalError("** unable to apply Audio settings ** ")
    }

    audioWriterInput = AVAssetWriterInput(
        mediaType: AVMediaTypeAudio,
        outputSettings: audioOutputSettings)


    if videoWriter!.canAdd(videoWriterInput!) {
      videoWriter!.add(videoWriterInput!)
      videoWriter!.add(audioWriterInput!)
    }
    else {
      fatalError ("** unable to add input **")
    }

Is there any other way to add a filter to a video? Please suggest.

I have also tried adding a CIFilter using GPUImage, but that only works for live video, not for an already-captured video.

Starting with iOS 9.0 you can use AVVideoComposition to apply Core Image filters to a video frame by frame.

import AVFoundation
import CoreImage

// `asset` is the AVAsset of the source video, created earlier (not shown)
let filter = CIFilter(name: "CIGaussianBlur")!
let composition = AVVideoComposition(asset: asset, applyingCIFiltersWithHandler: { request in
        // Clamp to avoid blurring transparent pixels at the image edges
        let source = request.sourceImage.clampingToExtent()
        filter.setValue(source, forKey: kCIInputImageKey)

        // Vary filter parameters based on video timing
        let seconds = CMTimeGetSeconds(request.compositionTime)
        filter.setValue(seconds * 10.0, forKey: kCIInputRadiusKey)

        // Crop the blurred output to the bounds of the original image
        let output = filter.outputImage!.cropping(to: request.sourceImage.extent)

        request.finish(with: output, context: nil)
})

Now we can create an AVPlayerItem using the asset created earlier and play it with an AVPlayer:
let playerItem = AVPlayerItem(asset: asset)
playerItem.videoComposition = composition
let player = AVPlayer(playerItem: playerItem)
player.play()

The Core Image filter is applied frame by frame in real time during playback. You can also export the video using the AVAssetExportSession class.
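
As a rough sketch (not from the original answer), an export with AVAssetExportSession could look like the following in Swift 3. Because the video composition only processes the video frames, the source asset's audio track is passed through to the exported file, which is what resolves the missing-audio problem. The output URL, file name, and preset here are illustrative assumptions.

// Minimal export sketch (Swift 3), assuming `asset` and `composition` from above.
let exportURL = URL(fileURLWithPath: NSTemporaryDirectory())
    .appendingPathComponent("filtered_\(Date().timeIntervalSince1970).mp4")

guard let exportSession = AVAssetExportSession(asset: asset,
                                               presetName: AVAssetExportPresetHighestQuality) else {
    fatalError("** unable to create export session **")
}

exportSession.videoComposition = composition   // applies the CIFilter frame by frame
exportSession.outputFileType = AVFileTypeMPEG4 // Swift 3 constant, as in the question
exportSession.outputURL = exportURL

exportSession.exportAsynchronously {
    switch exportSession.status {
    case .completed:
        print("Export finished: \(exportURL)")
    case .failed, .cancelled:
        print("Export failed: \(String(describing: exportSession.error))")
    default:
        break
    }
}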

Here is a great introduction from WWDC 2015: Link