Huge memory spike - CGContextDrawImage

4
I use this code to scale and rotate images taken with the camera. When I run it I see a huge memory spike of roughly 20 MB. Profiling with Instruments shows that this line:

CGContextDrawImage(ctxt, orig, self.CGImage);

takes up the 20 MB. Is that normal for a full-resolution photo? The iPhone 4S copes with it, but older devices crash because of this code.

After resizing the image I need to convert it to NSData, so I use UIImageJPEGRepresentation(). Together the two push the spike even higher: it reaches 70 MB of memory use within a couple of seconds.

And yes, I did read nearly every camera-related iOS question about memory use, but none of the answers solved this.
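For reference, the call sequence is roughly this (a sketch: the wrapper method name and the 1024/0.8 values are placeholders, not my real code). Wrapping both steps in an @autoreleasepool keeps the intermediate images from living until the end of the run loop, though it does not lower the peak of the draw itself:

// hypothetical call site for the category below; 1024.0 and 0.8 are
// placeholder values for the maximum dimension and JPEG quality
- (NSData *)jpegDataForCameraImage:(UIImage *)photo
{
    NSData *jpegData = nil;
    @autoreleasepool
    {
        UIImage *small = [photo rotateAndScaleFromCameraWithMaxSize:1024.0];
        jpegData = UIImageJPEGRepresentation(small, 0.8);
    }
    // only the (much smaller) JPEG data outlives the pool
    return jpegData;
}
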
// WBImage.mm -- extra UIImage methods
// by allen brunson  march 29 2009

#include "WBImage.h"

static inline CGFloat degreesToRadians(CGFloat degrees)
{
    return M_PI * (degrees / 180.0);
}

static inline CGSize swapWidthAndHeight(CGSize size)
{
    CGFloat  swap = size.width;

    size.width  = size.height;
    size.height = swap;

    return size;
}

@implementation UIImage (WBImage)

// rotate an image to any 90-degree orientation, with or without mirroring.
// original code by kevin lohman, heavily modified by yours truly.
// http://blog.logichigh.com/2008/06/05/uiimage-fix/

-(UIImage*)rotate:(UIImageOrientation)orient
{
    CGRect             bnds = CGRectZero;
    UIImage*           copy = nil;
    CGContextRef       ctxt = nil;
    CGRect             rect = CGRectZero;
    CGAffineTransform  tran = CGAffineTransformIdentity;

    bnds.size = self.size;
    rect.size = self.size;

    switch (orient)
    {
    case UIImageOrientationUp:
        return self;

    case UIImageOrientationUpMirrored:
        tran = CGAffineTransformMakeTranslation(rect.size.width, 0.0);
        tran = CGAffineTransformScale(tran, -1.0, 1.0);
        break;

    case UIImageOrientationDown:
        tran = CGAffineTransformMakeTranslation(rect.size.width,
                                                rect.size.height);
        tran = CGAffineTransformRotate(tran, degreesToRadians(180.0));
        break;

    case UIImageOrientationDownMirrored:
        tran = CGAffineTransformMakeTranslation(0.0, rect.size.height);
        tran = CGAffineTransformScale(tran, 1.0, -1.0);
        break;

    case UIImageOrientationLeft:
        bnds.size = swapWidthAndHeight(bnds.size);
        tran = CGAffineTransformMakeTranslation(0.0, rect.size.width);
        tran = CGAffineTransformRotate(tran, degreesToRadians(-90.0));
        break;

    case UIImageOrientationLeftMirrored:
        bnds.size = swapWidthAndHeight(bnds.size);
        tran = CGAffineTransformMakeTranslation(rect.size.height,
                                                rect.size.width);
        tran = CGAffineTransformScale(tran, -1.0, 1.0);
        tran = CGAffineTransformRotate(tran, degreesToRadians(-90.0));
        break;

    case UIImageOrientationRight:
        bnds.size = swapWidthAndHeight(bnds.size);
        tran = CGAffineTransformMakeTranslation(rect.size.height, 0.0);
        tran = CGAffineTransformRotate(tran, degreesToRadians(90.0));
        break;

    case UIImageOrientationRightMirrored:
        bnds.size = swapWidthAndHeight(bnds.size);
        tran = CGAffineTransformMakeScale(-1.0, 1.0);
        tran = CGAffineTransformRotate(tran, degreesToRadians(90.0));
        break;

    default:
        // orientation value supplied is invalid
        assert(false);
        return nil;
    }

    UIGraphicsBeginImageContext(rect.size);
    ctxt = UIGraphicsGetCurrentContext();

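    // compensate for the flipped Core Graphics coordinate system before
    // applying the orientation transform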
    switch (orient)
    {
    case UIImageOrientationLeft:
    case UIImageOrientationLeftMirrored:
    case UIImageOrientationRight:
    case UIImageOrientationRightMirrored:
        CGContextScaleCTM(ctxt, -1.0, 1.0);
        CGContextTranslateCTM(ctxt, -rect.size.height, 0.0);
        break;

    default:
        CGContextScaleCTM(ctxt, 1.0, -1.0);
        CGContextTranslateCTM(ctxt, 0.0, -rect.size.height);
        break;
    }

    CGContextConcatCTM(ctxt, tran);
    CGContextDrawImage(ctxt, bnds, self.CGImage);

    copy = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    return copy;
}

-(UIImage*)rotateAndScaleFromCameraWithMaxSize:(CGFloat)maxSize
{
    UIImage*  imag = self;

    imag = [imag rotate:imag.imageOrientation];
    imag = [imag scaleWithMaxSize:maxSize];

    return imag;
}

-(UIImage*)scaleWithMaxSize:(CGFloat)maxSize
{
    return [self scaleWithMaxSize:maxSize quality:kCGInterpolationHigh];
}

-(UIImage*)scaleWithMaxSize:(CGFloat)maxSize
                    quality:(CGInterpolationQuality)quality
{
    CGRect        bnds = CGRectZero;
    UIImage*      copy = nil;
    CGContextRef  ctxt = nil;
    CGRect        orig = CGRectZero;
    CGFloat       rtio = 0.0;
    CGFloat       scal = 1.0;

    bnds.size = self.size;
    orig.size = self.size;
    rtio = orig.size.width / orig.size.height;

    if ((orig.size.width <= maxSize) && (orig.size.height <= maxSize))
    {
        return self;
    }

    if (rtio > 1.0)
    {
        bnds.size.width  = maxSize;
        bnds.size.height = maxSize / rtio;
    }
    else
    {
        bnds.size.width  = maxSize * rtio;
        bnds.size.height = maxSize;
    }

    UIGraphicsBeginImageContext(bnds.size);
    ctxt = UIGraphicsGetCurrentContext();

    scal = bnds.size.width / orig.size.width;
    CGContextSetInterpolationQuality(ctxt, quality);

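    // scale and flip vertically in one step: Core Graphics' origin is
    // bottom-left while UIKit's is top-left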
    CGContextScaleCTM(ctxt, scal, -scal);
    CGContextTranslateCTM(ctxt, 0.0, -orig.size.height);

    CGContextDrawImage(ctxt, orig, self.CGImage);

    copy = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    return copy;
}

@end
3 Answers

3

In the end I went with ImageIO, and the memory footprint is much smaller!

#import <ImageIO/ImageIO.h>

-(UIImage *)resizeImageToMaxDimension:(float)dimension withPath:(NSString *)path
{
    NSURL *imageUrl = [NSURL fileURLWithPath:path];
    CGImageSourceRef imageSource = CGImageSourceCreateWithURL((__bridge CFURLRef)imageUrl, NULL);

    if (imageSource == NULL)
    {
        return nil;
    }

    NSDictionary *thumbnailOptions = [NSDictionary dictionaryWithObjectsAndKeys:
                                      (__bridge id)kCFBooleanTrue, (__bridge id)kCGImageSourceCreateThumbnailWithTransform,
                                      (__bridge id)kCFBooleanTrue, (__bridge id)kCGImageSourceCreateThumbnailFromImageAlways,
                                      [NSNumber numberWithFloat:dimension], (__bridge id)kCGImageSourceThumbnailMaxPixelSize,
                                      nil];

    // decodes straight to the requested size, so the full-resolution
    // bitmap is never held in memory
    CGImageRef thumbnail = CGImageSourceCreateThumbnailAtIndex(imageSource, 0, (__bridge CFDictionaryRef)thumbnailOptions);
    CFRelease(imageSource);

    if (thumbnail == NULL)
    {
        return nil;
    }

    UIImage *resizedImage = [UIImage imageWithCGImage:thumbnail];
    CGImageRelease(thumbnail);

    return resizedImage;
}
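
A possible call site (a sketch; the temporary-file path is an assumption): since this reads from a file URL, the camera JPEG has to be on disk first, after which the full-resolution bitmap is never decompressed into memory. Note that kCGImageSourceCreateThumbnailWithTransform already applies the EXIF orientation, so the separate rotate: pass becomes unnecessary:

// hypothetical usage: photo.jpg is wherever the camera JPEG was saved
NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"photo.jpg"];
UIImage *resized = [self resizeImageToMaxDimension:1024.0 withPath:path];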

1


That's right, and it comes from the photos you take with the camera. Older devices have lower-resolution cameras, so an image shot on an iPhone 3G is smaller in resolution (and therefore in size) than one from your iPhone 4S. Images are usually stored compressed, but when they are opened in memory for any kind of manipulation they have to be decompressed, and the memory required is much larger than the file on disk: roughly number_of_pixels_per_row * number_of_pixels_per_column * bytes_per_pixel, if I remember correctly. For example, an 8-megapixel 3264 x 2448 photo at 4 bytes per pixel decompresses to about 32 MB, even though the JPEG file itself may only be a couple of megabytes.
Ciao, Andrea


Thanks for your answer. But is there any solution to reduce the memory use? You mentioned compression; maybe you can point me in the right direction. - Melvin
Compression applies to the file; when you work on the image it has to be decompressed. I still don't understand whether you are actually seeing crashes. I suggest you read this article [link](http://www.cocoanetics.com/2011/10/avoiding-image-decompression-sickness/) and download the LargeImageDownsizing sample code from Apple's site. Ciao. - Andrea
One solution could be to memory-map the image onto a file with the mmap UNIX function, but that could be quite hard and potentially slow. - Andrea
Yes, my app crashes because of this. I also looked at the method Apple provides, imageWithCGImage:scale:orientation:, but strangely it only changes the size property without touching the pixels; the byte count stays essentially the same. Doesn't Apple ship a method to scale or resize a UIImage? Maybe one could draw the image in a while loop inside an autorelease pool, reading only 1024 bytes at a time. Still, I think I'm missing the default Apple way, am I not? - Melvin

0
Insert the following at the end of your method, just before return copy;:
CGContextRelease(ctxt);

I just tried it, but it doesn't seem to have any effect. I'm using ARC, so maybe ARC is taking care of it. - Melvin
1
ARC does not manage "old-style" Core Foundation objects. - Andrea
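
For reference, a minimal sketch of the Core Foundation ownership rules at play here (generic Core Graphics calls, not the code above): a reference obtained from a Create/Copy function is owned by the caller and must be released by hand even under ARC, whereas the context returned by UIGraphicsGetCurrentContext() is only borrowed, so releasing it would actually over-release:

// "Create" in the name => we own the context and must release it
// ourselves, ARC or not (ARC only manages Objective-C objects)
CGColorSpaceRef space = CGColorSpaceCreateDeviceRGB();
CGContextRef owned = CGBitmapContextCreate(NULL, 64, 64, 8, 0, space,
                                           kCGImageAlphaPremultipliedLast);
CGColorSpaceRelease(space);
CGContextRelease(owned);

// "Get" => a borrowed reference owned by the UIGraphics stack; calling
// CGContextRelease() on it would over-release rather than free memory
CGContextRef borrowed = UIGraphicsGetCurrentContext();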
