Converting a CMSampleBufferRef to cv::Mat

I'm trying to convert a CMSampleBufferRef (as part of an AVCaptureVideoDataOutputSampleBufferDelegate in iOS) to an OpenCV Mat, in an attempt to stabilize the output in semi-real time. I'm currently running a test app following this, but I keep running into problems when creating and using the Mat.

Swift controller:
let wrapper : OpenCVWrapper = OpenCVWrapper()
...
func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    self.wrapper.processBuffer(sampleBuffer, self.previewMat)
}

OpenCVWrapper

- (void)processBuffer:(CMSampleBufferRef)buffer :(UIImageView*)previewMat {
    // Convert current buffer to Mat
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(buffer);
    CVPixelBufferLockBaseAddress( pixelBuffer, 0);

    CGFloat bufferWidth = CVPixelBufferGetWidth(pixelBuffer);
    CGFloat bufferHeight = CVPixelBufferGetHeight(pixelBuffer);

    unsigned char *pixel = (unsigned char *)CVPixelBufferGetBaseAddress(pixelBuffer);

    Mat tmp(bufferWidth, bufferHeight, CV_8UC4, pixel);
    Mat cur = tmp.clone();

    dispatch_async(dispatch_get_main_queue(), ^{
        [previewMat setImage:[UIImage imageWithCVMat:cur]];
    });
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
}

On the line Mat cur = tmp.clone() I'm getting an EXC_BAD_ACCESS error.

Any idea what I'm doing wrong here?

I've tried bufferWidth and bufferHeight as both CGFloat and int, and I've swapped them in the Mat constructor, but the problem persists.


Have you tried this: Mat tmp = Mat(bufferHeight, bufferWidth, CV_8UC4, pixel); Mat cur = tmp.clone(); - freshking
Yes, that doesn't work either. In the end I converted the buffer to a UIImage and then converted the UIImage to a Mat. That works, but it doesn't really answer the question. - Richard Poole
3 Answers

An improved solution that fixes the "top 30% only" issue:
- (cv::Mat)matFromBuffer:(CMSampleBufferRef)buffer {
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(buffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    int bufferWidth = (int)CVPixelBufferGetWidth(pixelBuffer);
    int bufferHeight = (int)CVPixelBufferGetHeight(pixelBuffer);
    unsigned char *pixel = (unsigned char *)CVPixelBufferGetBaseAddress(pixelBuffer);

    // Wrap the buffer in a Mat without copying any memory. Passing the
    // bytes-per-row stride is what fixes the "top 30% only" artifact on
    // buffers with row padding.
    cv::Mat mat(bufferHeight, bufferWidth, CV_8UC4, pixel,
                CVPixelBufferGetBytesPerRow(pixelBuffer));

    // Convert (and thereby copy) while the base address is still locked;
    // `mat` points into the pixel buffer. The input is 4-channel BGRA,
    // so use CV_BGRA2GRAY rather than CV_BGR2GRAY.
    cv::Mat matGray;
    cvtColor(mat, matGray, CV_BGRA2GRAY);

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    return matGray;
}

If the CVPixelBuffer comes from the iOS camera, this only works if you've set camera.videoSettings to include [String(kCVPixelBufferPixelFormatTypeKey): kCMPixelFormat_32BGRA]. If you're using the kCMPixelFormat_422YpCbCr8 pixel format, see this SO post: https://dev59.com/AGIk5IYBdhLWcg3wWc_T - Drew H

Maybe this will be useful:
- (void)processBuffer:(CMSampleBufferRef)buffer :(UIImageView*)previewMat {
    CVImageBufferRef imgBuf = CMSampleBufferGetImageBuffer(buffer);

    // lock the buffer
    CVPixelBufferLockBaseAddress(imgBuf, 0);

    // get the address to the image data
    void *imgBufAddr = CVPixelBufferGetBaseAddressOfPlane(imgBuf, 0);

    // get image properties
    int w = (int)CVPixelBufferGetWidth(imgBuf);
    int h = (int)CVPixelBufferGetHeight(imgBuf);

    // create the cv mat
    cv::Mat image;
    image.create(h, w, CV_8UC4);
    // memcpy(image.data, imgBufAddr, w * h); copies only 25% of the image
    memcpy(image.data, imgBufAddr, w * h* 4); // copies all pixels

    // unlock again
    CVPixelBufferUnlockBaseAddress(imgBuf, 0);

    dispatch_async(dispatch_get_main_queue(), ^{
        [previewMat setImage:[UIImage imageWithCVMat:image]];
    });
}

With this code I only get the top third of the image. Example screenshot: http://i.imgur.com/cRQ0OmL.png - Stan James
Edited the code to copy the entire image. The memcpy() call wasn't accounting for the image's 4 channels. - Totoro
For the copy, it's better to use CVPixelBufferGetDataSize(imgBuf) rather than computing the buffer size manually (w * h * 4). - Mohamed Salah
Good answer. However, Yun CHEN's answer achieves the same thing without doing a memcpy, so it should be faster. Also, later iPhone models tend to add padding bytes at the end of each row in the CVPixelBuffer, so if memcpy is your choice, make sure to do memcpy(image.data, imgBufAddr, CVPixelBufferGetBytesPerRow(imgBuf) * h); - Drew H

The image types aren't the same, so you need to apply some kind of conversion:
cvtColor(image, image_copy, CV_BGRA2BGR);

Try conversion codes other than CV_BGRA2BGR as needed.
Hope this helps.
