Memory leak in CMSampleBufferGetImageBuffer

I'm grabbing a UIImage from a CMSampleBufferRef video buffer every N video frames, like this:

- (void)imageFromVideoBuffer:(void(^)(UIImage* image))completion {
    CMSampleBufferRef sampleBuffer = _myLastSampleBuffer;
    if (sampleBuffer != nil) {
        CFRetain(sampleBuffer);
        CIImage *ciImage = [CIImage imageWithCVPixelBuffer:CMSampleBufferGetImageBuffer(sampleBuffer)];
        _lastAppendedVideoBuffer.sampleBuffer = nil;
        if (_context == nil) {
            _context = [CIContext contextWithOptions:nil];
        }
        CVPixelBufferRef buffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CGImageRef cgImage = [_context createCGImage:ciImage fromRect:
                              CGRectMake(0, 0, CVPixelBufferGetWidth(buffer), CVPixelBufferGetHeight(buffer))];
        __block UIImage *image = [UIImage imageWithCGImage:cgImage];

        CGImageRelease(cgImage);
        CFRelease(sampleBuffer);

        if(completion) completion(image);

        return;
    }
    if(completion) completion(nil);
}

Xcode and Instruments detect a memory leak, but I can't get rid of it. I'm releasing the CGImageRef and the CMSampleBufferRef as usual:

CGImageRelease(cgImage);
CFRelease(sampleBuffer);
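
For reference, a minimal sketch (not from the original code) of the usual Core Foundation ownership pattern when a sample buffer must outlive its delegate callback; processingQueue is a placeholder name:

// Retain before crossing the async boundary; release exactly once when done.
CFRetain(sampleBuffer);
dispatch_async(processingQueue, ^{
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // ... read from pixelBuffer ...
    CFRelease(sampleBuffer); // balances the CFRetain above
});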

[UPDATE] I added the AVCapture output callback that gets the sampleBuffer:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (captureOutput == _videoOutput) {
        _lastVideoBuffer.sampleBuffer = sampleBuffer;
        id<CIImageRenderer> imageRenderer = _CIImageRenderer;

        dispatch_async(dispatch_get_main_queue(), ^{
            @autoreleasepool {
                CIImage *ciImage = nil;
                ciImage = [CIImage imageWithCVPixelBuffer:CMSampleBufferGetImageBuffer(sampleBuffer)];
                if(_context==nil) {
                    _context = [CIContext contextWithOptions:nil];
                }
                CGImageRef processedCGImage = [_context createCGImage:ciImage
                                                             fromRect:[ciImage extent]];
                //UIImage *image=[UIImage imageWithCGImage:processedCGImage];
                CGImageRelease(processedCGImage);
                NSLog(@"Captured image %@", ciImage);
            }
        });
    }
}

The part of the code that leaks is createCGImage:ciImage:

CGImageRef processedCGImage = [_context createCGImage:ciImage
                                             fromRect:[ciImage extent]];

Even with the autoreleasepool, the CGImageRelease, and the CIContext instance property, there is still a memory leak. It seems to be the same problem discussed here: Can't save CIImage to file on iOS without memory leaks
[UPDATE] The leak seems to be due to a bug. The issue is well described in Memory leak on CIContext createCGImage at iOS 9?
A sample project shows how to reproduce this leak: http://www.osamu.co.jp/DataArea/VideoCameraTest.zip
The last comments assure that:

It looks like they fixed it in 9.1b3. If anyone needs a workaround that works on iOS 9.0.x, I was able to get it working with this:

In the test code (Objective-C in this case):
[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)  
    {  
        if (error) return;  

        __block NSString *filePath = [NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"ipdf_pic_%i.jpeg",(int)[NSDate date].timeIntervalSince1970]];  

        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];  
        dispatch_async(dispatch_get_main_queue(), ^  
        {  

            @autoreleasepool  
            {  
                CIImage *enhancedImage = [CIImage imageWithData:imageData];  

                if (!enhancedImage) return;  

                static CIContext *ctx = nil; if (!ctx) ctx = [CIContext contextWithOptions:nil];  

                CGImageRef imageRef = [ctx createCGImage:enhancedImage fromRect:enhancedImage.extent format:kCIFormatBGRA8 colorSpace:nil];  

                UIImage *image = [UIImage imageWithCGImage:imageRef scale:1.0 orientation:UIImageOrientationRight];  

                [[NSFileManager defaultManager] createFileAtPath:filePath contents:UIImageJPEGRepresentation(image, 0.8) attributes:nil];  

                CGImageRelease(imageRef);  
            }  
        });  
    }]; 

And the workaround for iOS 9.0 should be:

extension CIContext {
    // iOS 9.0.x workaround: render into a buffer we allocate ourselves and
    // wrap it in a CGImage, instead of calling the leaking createCGImage.
    // (Swift 2 syntax, matching the iOS 9 era of the bug.)
    func createCGImage_(image: CIImage, fromRect: CGRect) -> CGImage {
        let width = Int(fromRect.width)
        let height = Int(fromRect.height)

        let rawData = UnsafeMutablePointer<UInt8>.alloc(width * height * 4)
        render(image, toBitmap: rawData, rowBytes: width * 4, bounds: fromRect, format: kCIFormatRGBA8, colorSpace: CGColorSpaceCreateDeviceRGB())
        // The data provider's release callback deallocates the buffer once the CGImage is gone.
        let dataProvider = CGDataProviderCreateWithData(nil, rawData, height * width * 4) { info, data, size in
            UnsafeMutablePointer<UInt8>(data).dealloc(size)
        }
        return CGImageCreate(width, height, 8, 32, width * 4, CGColorSpaceCreateDeviceRGB(), CGBitmapInfo(rawValue: CGImageAlphaInfo.PremultipliedLast.rawValue), dataProvider, nil, false, .RenderingIntentDefault)!
    }
}
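
For completeness, since the question's code is Objective-C: a hypothetical Objective-C version of the same render-to-bitmap workaround could look like the sketch below. newCGImageFromCIImage:fromRect: and the _context ivar are illustrative names, not from the answer; the caller owns the returned CGImageRef and must call CGImageRelease on it.

// Frees the bitmap buffer when the CGImage (via its data provider) is destroyed.
static void releaseBitmapData(void *info, const void *data, size_t size) {
    free((void *)data);
}

- (CGImageRef)newCGImageFromCIImage:(CIImage *)image fromRect:(CGRect)fromRect {
    size_t width = (size_t)fromRect.size.width;
    size_t height = (size_t)fromRect.size.height;
    size_t rowBytes = width * 4;

    // Render through CIContext into a buffer we own instead of calling createCGImage:.
    void *rawData = malloc(rowBytes * height);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    [_context render:image
            toBitmap:rawData
            rowBytes:rowBytes
              bounds:fromRect
              format:kCIFormatRGBA8
          colorSpace:colorSpace];

    // Wrap the buffer in a CGImage; the provider's release callback frees it.
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, rawData, rowBytes * height, releaseBitmapData);
    CGImageRef cgImage = CGImageCreate(width, height, 8, 32, rowBytes, colorSpace,
                                       kCGBitmapByteOrderDefault | kCGImageAlphaPremultipliedLast,
                                       provider, NULL, false, kCGRenderingIntentDefault);
    CGDataProviderRelease(provider);
    CGColorSpaceRelease(colorSpace);
    return cgImage;
}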

What does Instruments say is leaking? Where are _myLastSampleBuffer and _lastAppendedVideoBuffer.sampleBuffer set? - ChrisH
@ChrisH See the leaking code above. - loretoparisi
This is not a leak in Apple code; the code works fine as long as you do not call CFRetain(sampleBuffer) on the result returned from copyNextSampleBuffer. - MoDJ
@MoDJ I have to try it; it was a long time ago and I may have done that already, since this was also discussed on the Apple forums. I will let you know. - loretoparisi
2 Answers

We were seeing a similar issue in an app we created, where we process each frame for feature keypoints with OpenCV and send a frame off every couple of seconds. After running for a while we would end up with quite a few memory pressure messages.
We managed to fix this by running our processing code in its own autorelease pool like so (jpegDataFromSampleBufferAndCrop does something similar to what you are doing, with added cropping; see the sketch after the code):
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    @autoreleasepool {

        if ([self.lastFrameSentAt timeIntervalSinceNow] < -kContinuousRateInSeconds) {

            NSData *imageData = [self jpegDataFromSampleBufferAndCrop:sampleBuffer];

            if (imageData) {
                [self processImageData:imageData];
            }

            self.lastFrameSentAt = [NSDate date];

            imageData = nil;
        }
    }
}
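
The answer doesn't show jpegDataFromSampleBufferAndCrop:; based on its description (the same conversion as in the question, plus cropping), a minimal sketch could look like this. The centered crop rect and the _context ivar are illustrative assumptions:

- (NSData *)jpegDataFromSampleBufferAndCrop:(CMSampleBufferRef)sampleBuffer {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) return nil;

    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];

    // Illustrative crop: the centered half of the frame.
    CGRect extent = ciImage.extent;
    CGRect cropRect = CGRectInset(extent, extent.size.width / 4.0, extent.size.height / 4.0);

    if (_context == nil) {
        _context = [CIContext contextWithOptions:nil];
    }
    CGImageRef cgImage = [_context createCGImage:ciImage fromRect:cropRect];
    if (cgImage == NULL) return nil;

    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return UIImageJPEGRepresentation(image, 0.8);
}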

So it looks like the leak is caused by a bug in createCGImage on iOS. See https://forums.developer.apple.com/message/50981#50981 - loretoparisi

I can confirm that this memory leak still exists on iOS 9.2. (I have also posted on the Apple Developer Forum.)
I get the same memory leak on iOS 9.2. I have tested dropping the EAGLContext by using MetalKit and MTLDevice, and tested different CIContext methods like drawImage, createCGImage and render, but nothing seems to work.
It is very clear that this is a bug in iOS 9. You can test it yourself by downloading the example app from Apple (see below), running the same project on a device with iOS 8.4 and then on a device with iOS 9.2, and watching the memory gauge in Xcode.
Download https://developer.apple.com/library/ios/samplecode/AVBasicVideoOutput/Introduction/Intro.html#//apple_ref/doc/uid/DTS40013109 and add this code to APLEAGLView.h:20:
@property (strong, nonatomic) CIContext* ciContext;

Then replace line 118 of APLEAGLView.m with the following:
[EAGLContext setCurrentContext:_context];
 _ciContext = [CIContext contextWithEAGLContext:_context];

And finally replace lines 341-343 of APLEAGLView.m with this:
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);  

    @autoreleasepool  
    {  
        CIImage* sourceImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];  
        CIFilter* filter = [CIFilter filterWithName:@"CIGaussianBlur" keysAndValues:kCIInputImageKey, sourceImage, nil];  
        CIImage* filteredImage = filter.outputImage;  

        [_ciContext render:filteredImage toCVPixelBuffer:pixelBuffer];  
    }  

glBindRenderbuffer(GL_RENDERBUFFER, _colorBufferHandle);  
