This code mostly works, but the resulting data seems to be missing a color channel (or so I think), because when the generated image data is displayed it has a blue tint!

Here is the code:
UIImage *myImage = [UIImage imageNamed:@"sample1.png"];
CGImageRef imageRef = [myImage CGImage];
CVImageBufferRef pixelBuffer = [self pixelBufferFromCGImage:imageRef];
The method pixelBufferFromCGImage was taken from another Stack Overflow post, linked here: How do I export UIImage array as a movie? (although that app is unrelated to what I'm trying to do). It is:
+ (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image
{
    CGSize frameSize = CGSizeMake(CGImageGetWidth(image), CGImageGetHeight(image));
    NSDictionary *options = @{
        (__bridge NSString *)kCVPixelBufferCGImageCompatibilityKey: @(NO),
        (__bridge NSString *)kCVPixelBufferCGBitmapContextCompatibilityKey: @(NO)
    };
    CVPixelBufferRef pixelBuffer;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, frameSize.width,
                                          frameSize.height, kCVPixelFormatType_32ARGB,
                                          (__bridge CFDictionaryRef)options,
                                          &pixelBuffer);
    if (status != kCVReturnSuccess) {
        return NULL;
    }
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    void *data = CVPixelBufferGetBaseAddress(pixelBuffer);
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(data, frameSize.width, frameSize.height,
                                                 8, CVPixelBufferGetBytesPerRow(pixelBuffer),
                                                 rgbColorSpace,
                                                 (CGBitmapInfo)kCGImageAlphaNoneSkipLast);
    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
                                           CGImageGetHeight(image)), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    return pixelBuffer;
}
I think it has something to do with the relationship between kCVPixelFormatType_32ARGB and kCGImageAlphaNoneSkipLast, though I have tried every combination and either get the same result or the app crashes. Once again, this converts UIImage data into a CVImageBufferRef, but when I display the image on screen it appears to lose a color channel and shows up with a blue tint. The image is a png.
Use kCVPixelFormatType_32BGRA, and pass (CGBitmapInfo)(kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst) as the last argument to the context. Also, I think the NSDictionary *options is useless. - Maxi Mus
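
Putting that comment into code, a minimal sketch of the changed parts of pixelBufferFromCGImage might look like the following. This is an illustration of the suggestion, not a verified fix; the surrounding lines of the method stay as in the question. The idea is that 32ARGB stores bytes as A,R,G,B while most display paths expect BGRA, so red and blue end up swapped (hence the blue tint); creating the buffer as BGRA and telling Core Graphics the matching little-endian byte order keeps the channel order consistent end to end.

```objectivec
// Create the buffer as BGRA instead of ARGB, and drop the options
// dictionary as the comment suggests (pass NULL for default attributes).
CVPixelBufferRef pixelBuffer;
CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                      frameSize.width, frameSize.height,
                                      kCVPixelFormatType_32BGRA,  // was kCVPixelFormatType_32ARGB
                                      NULL,                       // was (__bridge CFDictionaryRef)options
                                      &pixelBuffer);

// ... lock the buffer and create the color space as before, then create the
// context with a CGBitmapInfo whose byte order matches the BGRA buffer:
CGContextRef context = CGBitmapContextCreate(data, frameSize.width, frameSize.height,
                                             8, CVPixelBufferGetBytesPerRow(pixelBuffer),
                                             rgbColorSpace,
                                             (CGBitmapInfo)(kCGBitmapByteOrder32Little |
                                                            kCGImageAlphaPremultipliedFirst));
```

In little-endian memory, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst lays pixels out as B,G,R,A bytes, which is exactly what kCVPixelFormatType_32BGRA expects, so the channels no longer get swapped on display.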