Convert CMSampleBufferRef to UIImage with YUV color space?

8
I am using AVCaptureVideoDataOutput and want to convert the CMSampleBufferRef to a UIImage. Many answers take the same approach, for instance "UIImage created from CMSampleBufferRef not displayed in UIImageView?" and "AVCaptureSession with multiple previews".
It works fine if I set the color space of the VideoDataOutput to BGRA (credit to this answer: "CGBitmapContextCreateImage error"):
NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
[dataOutput setVideoSettings:videoSettings];

Without the videoSettings above, I receive the following errors:
CGBitmapContextCreate: invalid data bytes/row: should be at least 2560 for 8 integer bits/component, 3 components, kCGImageAlphaPremultipliedFirst.
<Error>: CGBitmapContextCreateImage: invalid context 0x0
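(For context: with kCGImageAlphaPremultipliedFirst and 8 bits per component, each pixel holds 3 color components plus alpha, i.e. 4 bytes, so CGBitmapContextCreate requires bytesPerRow to be at least width × 4. The 2560 in the message would correspond to a 640-pixel-wide frame, though the message does not state the width directly. A purely illustrative check, not from the original post:

```c
#include <stddef.h>

/* Minimum row stride CGBitmapContextCreate accepts for a 32-bit
 * premultiplied-first context: 3 color components + alpha = 4 bytes/pixel. */
static size_t min_bytes_per_row_32bpp(size_t width) {
    return width * 4;
}
```

The YUV pixel buffer handed to CGBitmapContextCreate has a smaller row stride than an RGB context of the same width requires, hence the failure.)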

Using BGRA is not a good choice, since there is a conversion overhead from YUV (the default AVCaptureSession color space) to BGRA, as stated by Brad and Codo in "How to get the Y component from CMSampleBuffer resulted from the AVCaptureSession?".
So is there a way to convert a CMSampleBufferRef to a UIImage while keeping the YUV color space?

Any solution? - Adarsh V C
@AdarshVC Why don't you upvote the question to make it more noticeable? - onmyway133
Take a look at this question: https://dev59.com/13A75IYBdhLWcg3wboUz - Basheer_CAD
@entropy I did vote, a long time ago :) - Adarsh V C
2 Answers

9

After a lot of research and reading Apple documentation and Wikipedia, I figured out the answer and it works perfectly for me. So for future readers, here is the code to convert a CMSampleBufferRef to a UIImage when the video pixel type is set to kCVPixelFormatType_420YpCbCr8BiPlanarFullRange:

// Create a UIImage from sample buffer data
// Works only if pixel format is kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
-(UIImage *) imageFromSamplePlanerPixelBuffer:(CMSampleBufferRef) sampleBuffer{

    @autoreleasepool {
        // Get a CMSampleBuffer's Core Video image buffer for the media data
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        // Lock the base address of the pixel buffer
        CVPixelBufferLockBaseAddress(imageBuffer, 0);

        // Get the base address of the Y (luma) plane
        void *baseAddress = CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);

        // Get the number of bytes per row of the Y plane
        size_t bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, 0);
        // Get the pixel buffer width and height
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);

        // Create a device-dependent gray color space
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceGray();

        // Create a bitmap graphics context with the sample buffer data
        CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                     bytesPerRow, colorSpace, kCGImageAlphaNone);
        // Create a Quartz image from the pixel data in the bitmap graphics context
        CGImageRef quartzImage = CGBitmapContextCreateImage(context);
        // Unlock the pixel buffer
        CVPixelBufferUnlockBaseAddress(imageBuffer,0);

        // Free up the context and color space
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);

        // Create an image object from the Quartz image
        UIImage *image = [UIImage imageWithCGImage:quartzImage];

        // Release the Quartz image
        CGImageRelease(quartzImage);

        return (image);
    }
}

What about other colors? - JULIIncognito
@JULIIncognito Change the color space to RGB. - Bluewings
1
I changed the color space to CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); and it doesn't work, @Bluewings... Does it work for you with color images? - lilouch
1
@Bluewings I found the problem. This use of CGBitmapContextCreateImage extracts only the Y channel, so the result is always grayscale. I had to implement a function to manually convert the YUV values to RGB, see: https://dev59.com/Hmox5IYBdhLWcg3w_ZV0#31553521 - Linjie
Best Stack Overflow example. Thanks! - user3335999

-1

// This worked for me. (pixelBuffer is assumed to come from
// CMSampleBufferGetImageBuffer(sampleBuffer); requires VideoToolbox.)

import VideoToolbox

var image: CGImage?
VTCreateCGImageFromCVPixelBuffer(pixelBuffer, options: nil, imageOut: &image)
DispatchQueue.main.async {
    if let image = image {
        self.imageView.image = UIImage(cgImage: image)
    }
}
