I am trying to resize a UIImage with NYXImageKit, but doing so makes the image's data size larger.
P.S.: The original image's dimensions are 1024 × 1024.
let originalImage = image
// originalImage: 108 KB
let resizedImage = image.scaleToFillSize(CGSize(width: 256, height: 256))
// resizedImage: 620 KB
Any ideas?
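One thing worth checking before anything else (a sketch of the arithmetic, not part of the question): the 108 KB figure is most likely the size of the compressed JPEG/PNG data, while an image drawn through a bitmap context lives as uncompressed ARGB pixels, so comparing the two numbers directly is misleading. The raw footprint of an ARGB bitmap is simply pixel width × pixel height × 4 bytes. The helper name below is hypothetical:

```swift
import Foundation

// Raw (decompressed) ARGB footprint of a bitmap: 4 bytes per pixel.
// `scale` is the UIImage scale factor (e.g. 2 on a Retina screen).
func rawARGBByteCount(pointWidth: Int, pointHeight: Int, scale: Int) -> Int {
    let pixelWidth = pointWidth * scale
    let pixelHeight = pointHeight * scale
    return pixelWidth * pixelHeight * 4
}

// A 1024 x 1024 image decompresses to 4 MB even if its JPEG file is only 108 KB:
print(rawARGBByteCount(pointWidth: 1024, pointHeight: 1024, scale: 1))  // 4194304
// The 256 x 256 result still occupies 256 KB of raw pixels:
print(rawARGBByteCount(pointWidth: 256, pointHeight: 256, scale: 1))    // 262144
```

So whether the resized image's *encoded* data ends up bigger or smaller depends entirely on how (and at what quality) it is re-encoded afterwards, not on the pixel count alone.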
The following code is from NYXImageKit's UIImage+Resizing.m file.
-(UIImage*)scaleToFillSize:(CGSize)newSize
{
	size_t destWidth = (size_t)(newSize.width * self.scale);
	size_t destHeight = (size_t)(newSize.height * self.scale);
	if (self.imageOrientation == UIImageOrientationLeft
		|| self.imageOrientation == UIImageOrientationLeftMirrored
		|| self.imageOrientation == UIImageOrientationRight
		|| self.imageOrientation == UIImageOrientationRightMirrored)
	{
		size_t temp = destWidth;
		destWidth = destHeight;
		destHeight = temp;
	}

	/// Create an ARGB bitmap context
	CGContextRef bmContext = NYXCreateARGBBitmapContext(destWidth, destHeight, destWidth * kNyxNumberOfComponentsPerARBGPixel, NYXImageHasAlpha(self.CGImage));
	if (!bmContext)
		return nil;

	/// Image quality
	CGContextSetShouldAntialias(bmContext, true);
	CGContextSetAllowsAntialiasing(bmContext, true);
	CGContextSetInterpolationQuality(bmContext, kCGInterpolationHigh);

	/// Draw the image in the bitmap context
	UIGraphicsPushContext(bmContext);
	CGContextDrawImage(bmContext, CGRectMake(0.0f, 0.0f, destWidth, destHeight), self.CGImage);
	UIGraphicsPopContext();

	/// Create an image object from the context
	CGImageRef scaledImageRef = CGBitmapContextCreateImage(bmContext);
	UIImage* scaled = [UIImage imageWithCGImage:scaledImageRef scale:self.scale orientation:self.imageOrientation];

	/// Cleanup
	CGImageRelease(scaledImageRef);
	CGContextRelease(bmContext);

	return scaled;
}
Please post the code of scaleToFillSize(CGSize size), and say which type of image you are using? - Blind Ninja

... UIImage(data:), and then its JPEG data became about 600 KB! I checked the point size and the pixel size, and both are correct. I think UIImage(data:) actually performs some smoothing or similar processing that makes the image data larger. - zrfrank
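A follow-up sketch on the comment above (not part of the original thread): if the goal is to keep the encoded data small after resizing, re-encode the scaled UIImage explicitly with UIImage's jpegData(compressionQuality:) rather than relying on whatever encoding produced the 620 KB figure. The helper name below is an assumption; the only API used is UIKit's:

```swift
import UIKit

// Re-encode an image as JPEG with an explicit quality in 0.0...1.0.
// Lower quality yields smaller data; 0.7-0.8 is usually visually acceptable.
func jpegByteCount(of image: UIImage, quality: CGFloat) -> Int? {
    // jpegData(compressionQuality:) is the standard UIKit API (iOS 12+)
    // for controlling the size of the encoded JPEG data.
    return image.jpegData(compressionQuality: quality)?.count
}

// Usage, assuming `resizedImage` is the 256 x 256 result from scaleToFillSize:
// if let bytes = jpegByteCount(of: resizedImage, quality: 0.8) {
//     print("JPEG size: \(bytes) bytes")
// }
```

This makes the size comparison meaningful: both the original and the resized image are measured as JPEG data at a known quality, instead of comparing one compressed file against a freshly re-encoded bitmap.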