AVCapture appendSampleBuffer


This problem is driving me insane; I have searched high and low and tried everything I can think of.

I'm making an iPhone app that uses AVFoundation, specifically to capture video using the iPhone camera.

I need to have a custom image overlaid on the video and included in the recording.

So far I have the AVCapture session set up: I can display the video feed, access the frames, save a frame as a UIImage, and merge the overlay image into it. I then convert this new UIImage into a CVPixelBufferRef. To double-check that the buffer works, I converted it back into a UIImage, and it still displays fine.

The trouble starts when I try to convert the CVPixelBufferRef into a CMSampleBufferRef so I can append it to the AVCaptureSession's assetWriterInput. The CMSampleBufferRef always comes back NULL when I try to create it.

Here is the -(void)captureOutput method:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    UIImage *botImage = [self imageFromSampleBuffer:sampleBuffer];
    UIImage *wheel = [self imageFromView:wheelView];

    UIImage *finalImage = [self overlaidImage:botImage :wheel];
    //[previewImage setImage:finalImage]; <- works -- the image is being merged into one UIImage

    CVPixelBufferRef pixelBuffer = NULL;
    CGImageRef cgImage = CGImageCreateCopy(finalImage.CGImage);
    CFDataRef image = CGDataProviderCopyData(CGImageGetDataProvider(cgImage));
    int status = CVPixelBufferCreateWithBytes(NULL,
                                              self.view.bounds.size.width,
                                              self.view.bounds.size.height,
                                              kCVPixelFormatType_32BGRA,
                                              (void *)CFDataGetBytePtr(image),
                                              CGImageGetBytesPerRow(cgImage),
                                              NULL,
                                              0,
                                              NULL,
                                              &pixelBuffer);
    if (status == 0) {
        OSStatus result = 0;

        CMVideoFormatDescriptionRef videoInfo = NULL;
        result = CMVideoFormatDescriptionCreateForImageBuffer(NULL, pixelBuffer, &videoInfo);
        NSParameterAssert(result == 0 && videoInfo != NULL);

        CMSampleBufferRef myBuffer = NULL;
        result = CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault,
                                                    pixelBuffer, true, NULL, NULL, videoInfo, NULL, &myBuffer);
        NSParameterAssert(result == 0 && myBuffer != NULL); // always null :S

        NSLog(@"Trying to append");

        if (!CMSampleBufferDataIsReady(myBuffer)) {
            NSLog(@"sampleBuffer data is not ready");
            return;
        }

        if (![assetWriterInput isReadyForMoreMediaData]) {
            NSLog(@"Not ready for data :(");
            return;
        }

        if (![assetWriterInput appendSampleBuffer:myBuffer]) {
            NSLog(@"Failed to append pixel buffer");
        }
    }
}

The other solution I keep hearing about is to use an AVAssetWriterInputPixelBufferAdaptor, which removes the need for the messy CMSampleBufferRef wrapping. However, I have searched Stack Overflow, the Apple developer forums, and the documentation, and I cannot find a clear description or example of how to set it up or how to use it. If anyone has a working example of it, please show me or help me sort out the problem above. I have been working on this non-stop for a week and am at the end of my rope.
Let me know if you need any other information.
Thanks in advance,
Michael

Sorry about the poor code formatting; it looked fine in the preview :S - Michael O'Brien
2 Answers

You need to use an AVAssetWriterInputPixelBufferAdaptor. Here is the code to create one:
// Create dictionary for pixel buffer adaptor
NSDictionary *bufferAttributes = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey, nil];

// Create pixel buffer adaptor
m_pixelsBufferAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc] initWithAssetWriterInput:assetWriterInput
                                                                   sourcePixelBufferAttributes:bufferAttributes];

And the code to use it:

// If ready to have more media data
if (m_pixelsBufferAdaptor.assetWriterInput.readyForMoreMediaData) {
    // Create a pixel buffer
    CVPixelBufferRef pixelsBuffer = NULL;
    CVPixelBufferPoolCreatePixelBuffer(NULL, m_pixelsBufferAdaptor.pixelBufferPool, &pixelsBuffer);

    // Lock pixel buffer address
    CVPixelBufferLockBaseAddress(pixelsBuffer, 0);

    // Create your function to set your pixels data in the buffer (in your case, fill with your finalImage data)
    [self yourFunctionToPutDataInPixelBuffer:CVPixelBufferGetBaseAddress(pixelsBuffer)];

    // Unlock pixel buffer address
    CVPixelBufferUnlockBaseAddress(pixelsBuffer, 0);

    // Append pixel buffer (compute currentFrameTime to suit your needs; the simplest approach is to start the frame time at zero and advance it by one frame duration, i.e. the inverse of your frame rate, each time you write a frame)
    [m_pixelsBufferAdaptor appendPixelBuffer:pixelsBuffer withPresentationTime:currentFrameTime];

    // Release pixel buffer
    CVPixelBufferRelease(pixelsBuffer);
}
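To fill the buffer from your merged UIImage, one option is to render its CGImage straight into the pixel buffer's memory with Core Graphics. A minimal sketch, assuming the 32BGRA attributes above; the helper name is mine, and it expects the caller to have already locked the base address, as in the snippet above:

- (void)fillPixelBuffer:(CVPixelBufferRef)pixelBuffer withImage:(UIImage *)image
{
    // Caller has already locked the buffer's base address
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // A BGRA bitmap context backed by the pixel buffer's own memory
    CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(pixelBuffer),
                                                 CVPixelBufferGetWidth(pixelBuffer),
                                                 CVPixelBufferGetHeight(pixelBuffer),
                                                 8,
                                                 CVPixelBufferGetBytesPerRow(pixelBuffer),
                                                 colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

    // Drawing into this context writes directly into the pixel buffer
    CGContextDrawImage(context,
                       CGRectMake(0, 0, CVPixelBufferGetWidth(pixelBuffer), CVPixelBufferGetHeight(pixelBuffer)),
                       image.CGImage);

    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
}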

Don't forget to release your pixel buffer adaptor.
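One thing to watch: the adaptor's pixelBufferPool property stays NULL until -[AVAssetWriter startWriting] has been called, so the adaptor must be attached to a writer that has actually started before you ask the pool for buffers. A rough setup sketch under that assumption; videoURL, width, and height are placeholder names, and H.264 output is assumed:

NSError *error = nil;

// Output settings for the writer input (H.264 assumed here)
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
    AVVideoCodecH264, AVVideoCodecKey,
    [NSNumber numberWithInt:width], AVVideoWidthKey,
    [NSNumber numberWithInt:height], AVVideoHeightKey, nil];

AVAssetWriter *assetWriter = [[AVAssetWriter alloc] initWithURL:videoURL
                                                       fileType:AVFileTypeQuickTimeMovie
                                                          error:&error];
AVAssetWriterInput *assetWriterInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoSettings];
[assetWriter addInput:assetWriterInput];

// Create the adaptor as shown above, then start the session;
// pixelBufferPool is only populated after startWriting
m_pixelsBufferAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc]
       initWithAssetWriterInput:assetWriterInput
    sourcePixelBufferAttributes:bufferAttributes];

[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:kCMTimeZero];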

Does this actually work? I tried creating the pixel buffer manually like the OP did, and I also tried using a pixel buffer pool. Using the pool as defined above works in the simulator, but when running on a device the pixel buffer pool is never allocated. - listing boat

I did it with CMSampleBufferCreateForImageBuffer():
OSStatus ret = 0;
CMSampleBufferRef sample = NULL;
CMVideoFormatDescriptionRef videoInfo = NULL;
CMSampleTimingInfo timingInfo = kCMTimingInfoInvalid;
timingInfo.presentationTimeStamp = pts;
timingInfo.duration = duration;

ret = CMVideoFormatDescriptionCreateForImageBuffer(NULL, pixel, &videoInfo);
if (ret != 0) {
    NSLog(@"CMVideoFormatDescriptionCreateForImageBuffer failed! %d", (int)ret);
    goto done;
}
ret = CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pixel, true, NULL, NULL,
                                         videoInfo, &timingInfo, &sample);
if (ret != 0) {
    NSLog(@"CMSampleBufferCreateForImageBuffer failed! %d", (int)ret);
    goto done;
}
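The pts and duration values are not fixed; they are per-frame timestamps. A minimal sketch of one way to derive them, assuming a constant frame rate (fps and frameNumber are hypothetical variables, not from the original code):

static int64_t frameNumber = 0;   // hypothetical running frame counter
int32_t fps = 30;                 // assumed constant frame rate

CMTime duration = CMTimeMake(1, fps);        // each frame lasts 1/fps seconds
CMTime pts = CMTimeMake(frameNumber, fps);   // frame N is presented at N/fps seconds
frameNumber++;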

Could you explain how you set up timingInfo? I see "pts" and "duration"; are those fixed values, or what are they, and how do you set them? - omarojo
It doesn't work in my case... the error is nil, yet when I get the CMVideoFormatDescriptionRef it returns nil. - Vikesh Prasad
