How do I obtain an NSImage from the CMSampleBuffer passed by -captureStillImageAsynchronouslyFromConnection:completionHandler:?

I have a Cocoa application intended to capture still images from a USB microscope and then do some post-processing on them before saving them to an image file. At the moment, I'm stuck trying to get from the CMSampleBufferRef passed to my completionHandler block to an NSImage, or to some other representation I can manipulate and save with the familiar Cocoa APIs.
I found the imageFromSampleBuffer() function in the AVFoundation documentation, which purports to convert a CMSampleBuffer to a UIImage (sigh), and modified it accordingly to return an NSImage. It doesn't work in this case, though, because the call to CMSampleBufferGetImageBuffer() returns nil.
Here is a log showing the CMSampleBuffer passed to my completion block:
2012-01-21 19:38:36.293 LabCam[1402:cb0f] CMSampleBuffer 0x100335390 retainCount: 1 allocator: 0x7fff8c78620c
     invalid = NO
     dataReady = YES
     makeDataReadyCallback = 0x0
     makeDataReadyRefcon = 0x0
     buffer-level attachments:
          com.apple.cmio.buffer_attachment.discontinuity_flags(P) = 0
          com.apple.cmio.buffer_attachment.hosttime(P) = 79631546824089
          com.apple.cmio.buffer_attachment.sequence_number(P) = 42
     formatDescription = <CMVideoFormatDescription 0x100335220 [0x7fff782fff40]> {
     mediaType:'vide' 
     mediaSubType:'jpeg' 
     mediaSpecific: {
          codecType: 'jpeg'          dimensions: 640 x 480 
     } 
     extensions: {<CFBasicHash 0x100335160 [0x7fff782fff40]>{type = immutable dict, count = 5,
entries =>
     1 : <CFString 0x7fff773dff48 [0x7fff782fff40]>{contents = "Version"} = <CFNumber 0x183 [0x7fff782fff40]>{value = +1, type = kCFNumberSInt32Type}
     2 : <CFString 0x7fff773dff68 [0x7fff782fff40]>{contents = "RevisionLevel"} = <CFNumber 0x183 [0x7fff782fff40]>{value = +1, type = kCFNumberSInt32Type}
     3 : <CFString 0x7fff7781ab08 [0x7fff782fff40]>{contents = "CVFieldCount"} = <CFNumber 0x183 [0x7fff782fff40]>{value = +1, type = kCFNumberSInt32Type}
     4 : <CFString 0x7fff773dfdc8 [0x7fff782fff40]>{contents = "FormatName"} = <CFString 0x7fff76d35fb0 [0x7fff782fff40]>{contents = "Photo - JPEG"}
     5 : <CFString 0x7fff773dff88 [0x7fff782fff40]>{contents = "Vendor"} = <CFString 0x7fff773dffa8 [0x7fff782fff40]>{contents = "appl"}
}
}
}
     sbufToTrackReadiness = 0x0
     numSamples = 1
     sampleTimingArray[1] = {
          {PTS = {2388943236/30000 = 79631.441, rounded}, DTS = {INVALID}, duration = {3698/30000 = 0.123}},
     }
     sampleSizeArray[1] = {
          sampleSize = 55911,
     }
     dataBuffer = 0x100335300

This clearly contains JPEG data, but how do I get at it? (And preferably keep the associated metadata along for the ride...)


Could you share how you modified the revised imageFromSampleBuffer() appropriately to return an NSImage? - Stan James
1 Answer


I eventually tracked down the answer with the help of another code sample. CMSampleBufferGetImageBuffer only returns a valid result for the uncompressed, native image formats the camera can deliver. So to get my program working, I had to configure the AVCaptureStillImageOutput instance to use k32BGRAPixelFormat instead of its default (JPEG) compressed format:

session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetPhoto;
imageOutput = [[AVCaptureStillImageOutput alloc] init];
// Configure imageOutput for BGRA pixel format [#2].
NSNumber * pixelFormat = [NSNumber numberWithInt:k32BGRAPixelFormat];
[imageOutput setOutputSettings:[NSDictionary dictionaryWithObject:pixelFormat
                                                           forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
[session addOutput:imageOutput];
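
With the output configured this way, CMSampleBufferGetImageBuffer returns a usable CVImageBuffer in the completion handler. A minimal sketch of the rest of the pipeline (the connection lookup and the CIImage/NSCIImageRep conversion are just one illustrative route, not the only way to do it) might look like this:

#import <Cocoa/Cocoa.h>
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>

// ...

AVCaptureConnection *connection = [imageOutput connectionWithMediaType:AVMediaTypeVideo];
[imageOutput captureStillImageAsynchronouslyFromConnection:connection
                                         completionHandler:^(CMSampleBufferRef sampleBuffer, NSError *error) {
    if (sampleBuffer == NULL) {
        NSLog(@"Capture failed: %@", error);
        return;
    }
    // Valid now that the buffer holds uncompressed BGRA pixels rather than JPEG data.
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciImage = [CIImage imageWithCVImageBuffer:imageBuffer];
    NSCIImageRep *rep = [NSCIImageRep imageRepWithCIImage:ciImage];
    NSImage *image = [[NSImage alloc] initWithSize:rep.size];
    [image addRepresentation:rep];
    // ... post-process and save 'image' with the usual Cocoa image APIs ...
}];

From the resulting NSImage you can do the post-processing and write the file out as usual.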

Thanks for sharing. It would have taken me ages to figure that out on my own! - windson
