How to convert a vImage_Buffer to a CVPixelBufferRef


I am recording live video in my iOS app. On another Stack Overflow page I found that I can use vImage_Buffer to work on my frames.

The problem is that I don't know how to get a CVPixelBufferRef back from the resulting vImage_Buffer.

Here is the code given in the other post:

NSInteger cropX0 = 100,
          cropY0 = 100,
          cropHeight = 100,
          cropWidth = 100,
          outWidth = 480,
          outHeight = 480;

CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer, 0);
void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);

// Describe the crop rectangle as a vImage_Buffer that points into the locked pixels.
vImage_Buffer inBuff;
inBuff.height = cropHeight;
inBuff.width = cropWidth;
inBuff.rowBytes = bytesPerRow;

// Offset the base address to the top-left corner of the crop (4 bytes per pixel).
size_t startpos = cropY0 * bytesPerRow + 4 * cropX0;
inBuff.data = (unsigned char *)baseAddress + startpos;

// Destination buffer for the scaled output.
unsigned char *outImg = (unsigned char *)malloc(4 * outWidth * outHeight);
vImage_Buffer outBuff = {outImg, outHeight, outWidth, 4 * outWidth};

vImage_Error err = vImageScale_ARGB8888(&inBuff, &outBuff, NULL, 0);
if (err != kvImageNoError) NSLog(@"error %ld", err);

Now I need to convert outBuff into a CVPixelBufferRef.

I assume I need to use vImageBuffer_CopyToCVPixelBuffer, but I am not sure how.

My first try failed with an EXC_BAD_ACCESS:

CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

CVPixelBufferRef pixelBuffer;
CVPixelBufferCreate(kCFAllocatorSystemDefault, 480, 480, kCVPixelFormatType_32BGRA, NULL, &pixelBuffer);
    
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    
vImage_CGImageFormat format = {
    .bitsPerComponent = 8,
    .bitsPerPixel = 32,
    .bitmapInfo = kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipFirst,  //BGRX8888
    .colorSpace = NULL,  //sRGB
};
    
vImageBuffer_CopyToCVPixelBuffer(&outBuff,
                                 &format,
                                 pixelBuffer,
                                 NULL,
                                 NULL,
                                 kvImageNoFlags);  // Here is the crash!
    
    
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

Any ideas?


Which block of bytes does the EXC_BAD_ACCESS correspond to? Is it vImage_Buffer.data or the CVPixelBufferRef base address? - Ian Ollmann
Run your code in the debugger. When it crashes, you should see something like "EXC_BAD_ACCESS (code=X, address=ADDR)". At that point, use the debugger to look at the values of pixelBuffer and outBuff.data. One of them should be equal to, or close to, ADDR. - Stephen Canon
(Assuming you are actually crashing in CopyToCVPixelBuffer and not in an earlier function; which function are you crashing in?) - Stephen Canon
OK, the debugger stops at vImageBuffer_CopyToCVPixelBuffer, but the addr is actually 0x0, while all the input arguments outBuff, format and pixelBuffer are different from 0x0. The code is 1. - Nils Ziehn
If you have a small reproducible code sample, it seems worth filing a bug. It might be that the NULL color space or background color isn't handled correctly, or something else is going wrong. Maybe the new pixel buffer is doing something unexpected. - Ian Ollmann
2 Answers

2
CVPixelBufferRef pixbuffer = NULL;
CVReturn status;

NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithBool : YES], kCVPixelBufferCGImageCompatibilityKey,
    [NSNumber numberWithBool : YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
    [NSNumber numberWithInt : 480], kCVPixelBufferWidthKey,
    [NSNumber numberWithInt : 480], kCVPixelBufferHeightKey,
    nil];

// Wrap the scaled pixels in a new CVPixelBuffer. The row stride must match
// outBuff.rowBytes (4 * outWidth), not the source frame's bytesPerRow.
status = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                      480,
                                      480,
                                      kCVPixelFormatType_32BGRA,
                                      outImg,
                                      4 * outWidth,
                                      NULL,
                                      NULL,
                                      (__bridge CFDictionaryRef)options,
                                      &pixbuffer);

You should generate a new pixelBuffer like the above.
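One thing to watch with this approach: CVPixelBufferCreateWithBytes only wraps the existing outImg allocation, so that memory must stay valid for the lifetime of the pixel buffer and must be freed afterwards. A rough sketch with a release callback (the callback name is illustrative, not from the original answer) could look like this:

// Sketch only: free the malloc'ed pixels when the CVPixelBuffer is released.
static void releaseOutImg(void *releaseRefCon, const void *baseAddress) {
    free((void *)baseAddress);
}

// ... later, using the same outImg / options / pixbuffer as above ...
status = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                      480,
                                      480,
                                      kCVPixelFormatType_32BGRA,
                                      outImg,
                                      4 * outWidth,      // stride of outBuff
                                      releaseOutImg,     // called when pixbuffer is released
                                      NULL,              // releaseRefCon
                                      (__bridge CFDictionaryRef)options,
                                      &pixbuffer);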

1
  • If you want to embed the cropped live video feed into your interface, use AVPlayerLayer, AVCaptureVideoPreviewLayer and/or other CALayer subclasses, and use the layer bounds, frame and position to map your 100x100 pixel area into a 480x480 pixel area.

Things to note with vImage (your mileage may vary):

  1. CVPixelBufferCreateWithBytes will not work with vImageBuffer_CopyToCVPixelBuffer(), because you need to copy the vImage_Buffer data into a "clean" or "empty" CVPixelBuffer.

  2. No need for locking/unlocking - make sure you know when to lock and when not to lock pixel buffers.

  3. Your inBuff vImage_Buffer just needs to be initialized from the pixel buffer data, not filled in manually (unless you know how to use CGContexts etc. to init the pixel grid).

  4. Use vImageBuffer_InitWithCVPixelBuffer() - see the sketch after this list.

  5. vImageScale_ARGB8888 will scale the entire CVPixel data to a smaller/larger rectangle. It won't SCALE a portion/crop area of the buffer into another buffer.

  6. When you use vImageBuffer_CopyToCVPixelBuffer(), the vImageCVImageFormatRef and vImage_CGImageFormat need to be filled out correctly.

    CGColorSpaceRef dstColorSpace = CGColorSpaceCreateWithName(kCGColorSpaceITUR_709);

    vImage_CGImageFormat format = {
        .bitsPerComponent = 16,
        .bitsPerPixel = 64,
        .bitmapInfo = (CGBitmapInfo)kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder16Big,
        .colorSpace = dstColorSpace
    };
    vImageCVImageFormatRef vformat = vImageCVImageFormat_Create(kCVPixelFormatType_4444AYpCbCr16,
                                                                kvImage_ARGBToYpCbCrMatrix_ITU_R_709_2,
                                                                kCVImageBufferChromaLocation_Center,
                                                                format.colorSpace,
                                                                0);

    CVPixelBufferRef destBuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                          480,
                                          480,
                                          kCVPixelFormatType_4444AYpCbCr16,
                                          NULL,
                                          &destBuffer);

    NSParameterAssert(status == kCVReturnSuccess && destBuffer != NULL);

    vImage_Error err = vImageBuffer_CopyToCVPixelBuffer(&sourceBuffer, &format, destBuffer, vformat, 0, kvImagePrintDiagnosticsToConsole);

    // Release what you create - the color space and the vImageCVImageFormatRef are not managed by ARC.
    vImageCVImageFormat_Release(vformat);
    CGColorSpaceRelease(dstColorSpace);
    

Note: these settings are for 64-bit ProRes with an alpha channel - adjust them for 32-bit accordingly.
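For the 32-bit BGRA case from the question, a rough sketch could look like the following. This is an adaptation, not code from the original answer: the names srcBuffer / dstPixelBuffer / outBuff, the sRGB color space, and the 480x480 output size are assumptions taken over from the question.

    // Sketch, assuming 32BGRA sample buffers as in the question and that
    // <Accelerate/Accelerate.h> is imported.
    CGColorSpaceRef srgb = CGColorSpaceCreateWithName(kCGColorSpaceSRGB);

    vImage_CGImageFormat bgraFormat = {
        .bitsPerComponent = 8,
        .bitsPerPixel = 32,
        .bitmapInfo = kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipFirst,  // BGRX8888
        .colorSpace = srgb
    };

    // Describe the CVPixelBuffer side explicitly, so nothing depends on missing
    // attachments (the YpCbCr matrix and chroma siting only matter for YpCbCr formats).
    vImageCVImageFormatRef vformat = vImageCVImageFormat_Create(kCVPixelFormatType_32BGRA,
                                                                kvImage_ARGBToYpCbCrMatrix_ITU_R_709_2,
                                                                kCVImageBufferChromaLocation_Center,
                                                                srgb,
                                                                0);

    // 1. Let vImage allocate and fill a vImage_Buffer from the camera frame (no manual malloc, no locking).
    vImage_Buffer srcBuffer = { 0 };
    vImage_Error verr = vImageBuffer_InitWithCVPixelBuffer(&srcBuffer, &bgraFormat, imageBuffer,
                                                           vformat, NULL, kvImageNoFlags);

    // 2. ...crop / scale srcBuffer into outBuff here, as in the question...

    // 3. Copy the result into a freshly created 32BGRA CVPixelBuffer.
    CVPixelBufferRef dstPixelBuffer = NULL;
    CVReturn cvret = CVPixelBufferCreate(kCFAllocatorDefault, 480, 480,
                                         kCVPixelFormatType_32BGRA, NULL, &dstPixelBuffer);
    if (verr == kvImageNoError && cvret == kCVReturnSuccess) {
        verr = vImageBuffer_CopyToCVPixelBuffer(&outBuff, &bgraFormat, dstPixelBuffer,
                                                vformat, NULL, kvImagePrintDiagnosticsToConsole);
    }

    // 4. Clean up - none of this is covered by ARC.
    free(srcBuffer.data);                 // allocated by vImageBuffer_InitWithCVPixelBuffer
    vImageCVImageFormat_Release(vformat);
    CGColorSpaceRelease(srgb);
    // CVPixelBufferRelease(dstPixelBuffer) once you are done with it.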


Please don't forget - very important - memory leak city. You must release the CG objects (color spaces etc.) yourself. - Paul-J
You also have to free (whatEver.data) from the buffers. None of these functions are covered by ARC, and if you process many video frames you will grind to a halt quickly. - Paul-J
Hi Paul, could you please take a look at my SO question (https://stackoverflow.com/questions/60904676/need-help-in-screen-recording-a-part-of-the-screen-in-ios)? It is related to this answer and I am not sure how to implement it in Swift. - Felix Marianayagam
