What is the best way to record a video with augmented reality?

3

How do you record a video with augmented reality? (adding text, images, and logos to frames from the iPhone/iPad camera)

Earlier I was working out how to draw text into a CIImage (How can I draw text in a CIImage?) and how to convert the CIImage back into a CMSampleBuffer (Convert CIImage back to CMSampleBuffer).

I got almost everything working; the only remaining problem is recording the video from the new CMSampleBuffer with AVAssetWriterInput.

But this solution is not good at all: it consumes a lot of CPU while converting the CIImage to a CVPixelBuffer with ciContext.render(ciImage!, to: aBuffer).

So I want to stop here and find some other way to record video with augmented reality (for example, dynamically drawing text onto frames while the video is being encoded into an .mp4 file).

Here is what I tried, and what I no longer want to use...

// convert original CMSampleBuffer to CIImage, 
// combine multiple `CIImage`s into one (adding augmented reality -  
// text or some additional images)
let pixelBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
let ciimage : CIImage = CIImage(cvPixelBuffer: pixelBuffer)
var outputImage: CIImage?
let images : Array<CIImage> = [ciimage, ciimageSec!] // add all your CIImages that you'd like to combine
for image in images {
    outputImage = outputImage == nil ? image : image.composited(over: outputImage!)
}

// allocate this class variable once         
if pixelBufferNew == nil {
    CVPixelBufferCreate(kCFAllocatorSystemDefault, CVPixelBufferGetWidth(pixelBuffer),  CVPixelBufferGetHeight(pixelBuffer), kCVPixelFormatType_32BGRA, nil, &pixelBufferNew)
}

// convert CIImage to CVPixelBuffer
// convert CIImage to CVPixelBuffer
// (note: a CIContext is expensive to create -- it should be allocated
// once as a class property and reused, not recreated for every frame)
let ciContext = CIContext(options: nil)
if let aBuffer = pixelBufferNew {
    ciContext.render(outputImage!, to: aBuffer) // >>> IT EATS A LOT OF CPU <<<
}

// convert new CVPixelBuffer to new CMSampleBuffer
var sampleTime = CMSampleTimingInfo()
sampleTime.duration = CMSampleBufferGetDuration(sampleBuffer)
sampleTime.presentationTimeStamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
sampleTime.decodeTimeStamp = CMSampleBufferGetDecodeTimeStamp(sampleBuffer)
var videoInfo: CMVideoFormatDescription? = nil
CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pixelBufferNew!, &videoInfo)
var oBuf: CMSampleBuffer?
CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pixelBufferNew!, true, nil, nil, videoInfo!, &sampleTime, &oBuf)

/*
Then try to append the new CMSampleBuffer to a file (.mp4) using
AVAssetWriter & AVAssetWriterInput... (I ran into errors with it; the original
buffer from func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer:
CMSampleBuffer, from connection: AVCaptureConnection) works fine)
*/
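For reference, here is a minimal sketch of the AVAssetWriter side. This is not the asker's actual code: outputURL, firstBuffer, and newSampleBuffer are placeholder names, and the dimensions are examples. A common cause of append errors with a rebuilt CMSampleBuffer is that the writer session was never started at the first frame's timestamp, or frames are appended while the input is not ready.

import AVFoundation

// Hypothetical writer setup (assumed names, not from the question)
let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
let settings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.h264,
    AVVideoWidthKey: 1920,   // example dimensions
    AVVideoHeightKey: 1080
]
let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
input.expectsMediaDataInRealTime = true
writer.add(input)

writer.startWriting()
// the session must start at the presentation timestamp of the first frame,
// otherwise appended buffers are silently dropped or append fails
writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(firstBuffer))

// later, once per frame (e.g. inside captureOutput):
if input.isReadyForMoreMediaData {
    input.append(newSampleBuffer) // the rebuilt CMSampleBuffer (oBuf above)
}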

Is there a better solution?
1 Answer

0

It is better to use an Objective-C++ class (.mm), where you can use OpenCV and easily/quickly convert a CMSampleBuffer to cv::Mat, and convert it back to a CMSampleBuffer after processing.

We can easily call Objective-C++ functions from Swift.
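To illustrate the bridging pattern the answer describes: the sketch below shows the Swift side only, with the hypothetical Objective-C++ interface described in comments. FrameProcessor and its process: method are assumed names for illustration, not code from the answer.

// Suppose FrameProcessor.mm implements this Objective-C++ class, whose
// header (FrameProcessor.h) is listed in the project's bridging header:
//
//   @interface FrameProcessor : NSObject
//   - (CMSampleBufferRef)process:(CMSampleBufferRef)buffer;
//   @end
//
// Inside process:, the .mm file can lock the pixel buffer, wrap its base
// address in a cv::Mat without copying, draw text/overlays with OpenCV,
// and hand back a CMSampleBuffer. From Swift the call is then just:

let processor = FrameProcessor()
let processed = processor.process(sampleBuffer)

Because Objective-C++ compiles as Objective-C from Swift's point of view, no extra wrapper layer is needed beyond the bridging header entry.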


2
Could you share some relevant code? - Peder Wessel
