Converting an image to grayscale


I am trying to convert an image to grayscale in the following way:

#define bytesPerPixel 4
#define bitsPerComponent 8

-(unsigned char*) getBytesForImage: (UIImage*)pImage
{
    CGImageRef image = [pImage CGImage];
    NSUInteger width = CGImageGetWidth(image);
    NSUInteger height = CGImageGetHeight(image);

    NSUInteger bytesPerRow = bytesPerPixel * width;

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    unsigned char *rawData = malloc(height * width * 4);
    CGContextRef context = CGBitmapContextCreate(rawData, width, height, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);

    CGContextDrawImage(context, CGRectMake(0, 0, width, height), image);
    CGContextRelease(context);

    return rawData;
}

-(UIImage*) processImage: (UIImage*)pImage
{   
    DebugLog(@"processing image");
    unsigned char *rawData = [self getBytesForImage: pImage];

    NSUInteger width = pImage.size.width;
    NSUInteger height = pImage.size.height;

    DebugLog(@"width: %d", width);
    DebugLog(@"height: %d", height);

    NSUInteger bytesPerRow = bytesPerPixel * width;

    for (int xCoordinate = 0; xCoordinate < width; xCoordinate++)
    {
        for (int yCoordinate = 0; yCoordinate < height; yCoordinate++)
        {
            int byteIndex = (bytesPerRow * yCoordinate) + xCoordinate * bytesPerPixel;

            //Getting original colors
            float red = ( rawData[byteIndex] / 255.f );
            float green = ( rawData[byteIndex + 1] / 255.f );
            float blue = ( rawData[byteIndex + 2] / 255.f );

            //Processing pixel data
            float averageColor = (red + green + blue) / 3.0f;

            red = averageColor;
            green = averageColor;
            blue = averageColor;

            //Assigning new color components
            rawData[byteIndex] = (unsigned char) red * 255;
            rawData[byteIndex + 1] = (unsigned char) green * 255;
            rawData[byteIndex + 2] = (unsigned char) blue * 255;


        }
    }

    NSData* newPixelData = [NSData dataWithBytes: rawData length: height * width * 4];
    UIImage* newImage = [UIImage imageWithData: newPixelData];

    free(rawData);

    DebugLog(@"image processed");

    return newImage;

}

So when I want to convert an image, I simply call processImage:

imageToDisplay.image = [self processImage: image];

But imageToDisplay doesn't display anything. What could be the problem?

Thanks.


Which naughty monkey favorited this without upvoting? A total lack of generosity! - P i
12 Answers


I have another solution. It is very efficient, and it handles Retina graphics as well as transparency. It builds on Sargis Gevorgyan's approach:

+ (UIImage*) grayScaleFromImage:(UIImage*)image opaque:(BOOL)opaque
{
    // NSTimeInterval start = [NSDate timeIntervalSinceReferenceDate];

    CGSize size = image.size;
    CGRect bounds = CGRectMake(0, 0, size.width, size.height);

    // Create a bitmap context with the current image size and a grayscale color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceGray();
    size_t bitsPerComponent = 8;
    size_t bytesPerPixel = opaque ? 1 : 2;
    size_t bytesPerRow = bytesPerPixel * size.width * image.scale;
    CGContextRef context = CGBitmapContextCreate(nil, size.width, size.height, bitsPerComponent, bytesPerRow, colorSpace, opaque ? kCGImageAlphaNone : kCGImageAlphaPremultipliedLast);
    CGColorSpaceRelease(colorSpace); // release the color space; the context retains what it needs

    // Draw the source image into the grayscale context and create the result image
    CGContextDrawImage(context, bounds, image.CGImage);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage* result = [[UIImage alloc] initWithCGImage:cgImage scale:image.scale orientation:UIImageOrientationUp];
    CGImageRelease(cgImage);
    CGContextRelease(context);

    // Performance results on an iPhone 6S+ in Release mode.
    // Results are in photo pixels, not device pixels:
    //  ~ 5ms for 500px x 600px
    //  ~ 15ms for 2200px x 600px
    // NSLog(@"generating %d x %d @ %dx grayscale took %f seconds", (int)size.width, (int)size.height, (int)image.scale, [NSDate timeIntervalSinceReferenceDate] - start);

    return result;
}

Using blend modes is an elegant approach, but copying into a grayscale bitmap is more efficient because you only work with one or two color channels instead of four. The opaque boolean is meant to take your UIView's opaque flag, so you can opt out of the alpha channel if you know you won't need it.
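A minimal call-site sketch, assuming the method above lives in a utility class I'm calling ImageUtils here (that class name and the photo/photoView variables are placeholders, not part of the original answer):

    // Hypothetical usage: pass the view's opaque flag through so the alpha
    // channel is skipped whenever the view itself is fully opaque.
    UIImage *gray = [ImageUtils grayScaleFromImage:photo opaque:photoView.opaque];
    photoView.image = gray;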
I haven't tried the Core Image based solutions in this answer thread, but I would be very cautious about using Core Image if performance matters.

How does this compare to Natalia's solution? - meaning-matters


Here's an approach I tried: it converts quickly by drawing directly into a grayscale color space, without enumerating every pixel. It is about 10x faster than the CIImageFilter solutions.

@implementation UIImage (Grayscale)

static UIImage *grayscaleImageFromCIImage(CIImage *image, CGFloat scale)
{
    CIImage *blackAndWhite = [CIFilter filterWithName:@"CIColorControls" keysAndValues:kCIInputImageKey, image, kCIInputBrightnessKey, @0.0, kCIInputContrastKey, @1.1, kCIInputSaturationKey, @0.0, nil].outputImage;
    CIImage *output = [CIFilter filterWithName:@"CIExposureAdjust" keysAndValues:kCIInputImageKey, blackAndWhite, kCIInputEVKey, @0.7, nil].outputImage;
    CGImageRef ref = [[CIContext contextWithOptions:nil] createCGImage:output fromRect:output.extent];
    UIImage *result = [UIImage imageWithCGImage:ref scale:scale orientation:UIImageOrientationUp];
    CGImageRelease(ref);
    return result;
}

static UIImage *grayscaleImageFromCGImage(CGImageRef imageRef, CGFloat scale)
{
    NSInteger width = CGImageGetWidth(imageRef) * scale;
    NSInteger height = CGImageGetHeight(imageRef) * scale;

    NSMutableData *pixels = [NSMutableData dataWithLength:width*height];
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceGray();
    CGContextRef context = CGBitmapContextCreate(pixels.mutableBytes, width, height, 8, width, colorSpace, 0);

    CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
    CGImageRef ref = CGBitmapContextCreateImage(context);
    UIImage *result = [UIImage imageWithCGImage:ref scale:scale orientation:UIImageOrientationUp];

    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CGImageRelease(ref);

    return result;
}

- (UIImage *)grayscaleImage
{
    if (self.CIImage) {
        return grayscaleImageFromCIImage(self.CIImage, self.scale);
    } else if (self.CGImage) {
        return grayscaleImageFromCGImage(self.CGImage, self.scale);
    }

    return nil;
}

@end
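Usage is then a one-liner on any UIImage instance; a quick sketch (the originalPhoto variable is just a placeholder):

    // Hypothetical usage of the category above: it handles both
    // CIImage-backed and CGImage-backed UIImages.
    UIImage *grayPhoto = [originalPhoto grayscaleImage];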

It converts a transparent background to black. - SHN
@SHN It looks like an additional image mask might help: http://incurlybraces.com/convert-transparent-image-to-grayscale-in-ios.html - k06a
