AVFoundation: adding text to CMSampleBufferRef video frames


I am building an app with AVFoundation.

In the - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection delegate method, before calling [assetWriterInput appendSampleBuffer:sampleBuffer], I manipulate the pixels of the sample buffer (through its pixel buffer) to apply an effect.

The client also wants text (a timestamp and a frame counter) added to the frames, but I have not found a way to do this yet.

I tried converting the sample buffer to a UIImage, drawing the text on the image, and converting the image back into a sample buffer, but then

CMSampleBufferDataIsReady(sampleBuffer)

fails.

Here are my UIImage category methods:

 + (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
    {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Assumes the pixel buffer is BGRA (kCVPixelFormatType_32BGRA).
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

    CGImageRef newImage = CGBitmapContextCreateImage(newContext);

    CGContextRelease(newContext);
    CGColorSpaceRelease(colorSpace);

    // Unlock the base address once we no longer read the buffer's memory.
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    UIImage *newUIImage = [UIImage imageWithCGImage:newImage];

    CFRelease(newImage);

    return newUIImage;
    }

and
 - (CMSampleBufferRef)cmSampleBuffer
    {
        CGImageRef image = self.CGImage;

        NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                                 [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                                 [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                                 nil];
        CVPixelBufferRef pxbuffer = NULL;

        // Create an ARGB pixel buffer the size of the image.
        CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                              self.size.width,
                                              self.size.height,
                                              kCVPixelFormatType_32ARGB,
                                              (__bridge CFDictionaryRef) options,
                                              &pxbuffer);
        NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

        CVPixelBufferLockBaseAddress(pxbuffer, 0);
        void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
        NSParameterAssert(pxdata != NULL);

        // Draw the CGImage into the pixel buffer's memory.
        CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(pxdata, self.size.width,
                                                     self.size.height, 8, 4*self.size.width, rgbColorSpace,
                                                     kCGImageAlphaNoneSkipFirst);
        NSParameterAssert(context);
        CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
        CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
                                               CGImageGetHeight(image)), image);
        CGColorSpaceRelease(rgbColorSpace);
        CGContextRelease(context);
        CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

        // Wrap the pixel buffer in a sample buffer.
        // Note: both the format description (videoInfo) and the timing info are NULL here.
        CMVideoFormatDescriptionRef videoInfo = NULL;
        CMSampleBufferRef sampleBuffer = NULL;
        CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault,
                                           pxbuffer, true, NULL, NULL, videoInfo, NULL, &sampleBuffer);
        return sampleBuffer;
    }
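
For reference, the CMSampleBufferCreateForImageBuffer call above passes NULL for both the format description and the timing info. A minimal sketch of how those two parameters are usually filled in (variable names are illustrative, and the timing values would really be copied from the original frame):

    // Build a format description that matches the pixel buffer.
    CMVideoFormatDescriptionRef videoInfo = NULL;
    CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pxbuffer, &videoInfo);

    // Supply explicit timing; in a capture pipeline this would come from the source sample buffer.
    CMSampleTimingInfo timing = { .duration = kCMTimeInvalid,
                                  .presentationTimeStamp = kCMTimeZero,
                                  .decodeTimeStamp = kCMTimeInvalid };

    CMSampleBufferRef newSampleBuffer = NULL;
    CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pxbuffer, true, NULL, NULL,
                                       videoInfo, &timing, &newSampleBuffer);
    CFRelease(videoInfo);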

Any ideas?

EDIT:

I have modified my code following Tony's answer (thank you!). This code works:

    // Render the text image straight into the frame's pixel buffer with Core Image.
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    // Ideally these two are created once and reused, not per frame.
    EAGLContext *eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    CIContext *ciContext = [CIContext contextWithEAGLContext:eaglContext options:@{kCIContextWorkingColorSpace : [NSNull null]}];

    UIFont *font = [UIFont fontWithName:@"Helvetica" size:40];
    NSDictionary *attributes = @{NSFontAttributeName: font,
                                 NSForegroundColorAttributeName: [UIColor lightTextColor]};

    UIImage *img = [UIImage imageFromText:@"01 - 13/02/2014 15:18:21:654" withAttributes:attributes];
    CIImage *filteredImage = [[CIImage alloc] initWithCGImage:img.CGImage];

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    [ciContext render:filteredImage toCVPixelBuffer:pixelBuffer bounds:[filteredImage extent] colorSpace:rgbColorSpace];
    CGColorSpaceRelease(rgbColorSpace);

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
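
The imageFromText:withAttributes: category method used above is not shown in this post (see the link in the comments below). A minimal sketch of what it could look like, assuming it simply renders the attributed string into a transparent image:

    + (UIImage *)imageFromText:(NSString *)text withAttributes:(NSDictionary *)attributes
    {
        // Measure the string with the supplied font/colour attributes.
        CGSize size = [text sizeWithAttributes:attributes];

        // Draw it into a transparent bitmap at screen scale.
        UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);
        [text drawAtPoint:CGPointZero withAttributes:attributes];
        UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        return image;
    }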

A bit random, but would you mind sharing the source of UIImage *img = [UIImage imageFromText:@"01 - 13/02/2014 15:18:21:654" withAttributes:attributes];? - chrisallick
@chrisallick See here: https://dev59.com/UXE85IYBdhLWcg3wdDSz - Si.
Did you find a solution for drawing text onto a CMSampleBuffer? - user924
1 Answer

You can take a look at Apple's sample code and draw directly into the buffer using the API -(void)render:(CIImage *)image toCVPixelBuffer:(CVPixelBufferRef)buffer bounds:(CGRect)r colorSpace:(CGColorSpaceRef)cs. You can download the sample from WWDC 2013. Create the context:
_eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
_ciContext = [CIContext contextWithEAGLContext:_eaglContext options:@{kCIContextWorkingColorSpace : [NSNull null]} ];

Now render the image:
CVPixelBufferRef renderedOutputPixelBuffer = NULL;
CVReturn err = CVPixelBufferPoolCreatePixelBuffer(nil, self.pixelBufferAdaptor.pixelBufferPool, &renderedOutputPixelBuffer);
[_ciContext render:filteredImage toCVPixelBuffer:renderedOutputPixelBuffer bounds:[filteredImage extent] colorSpace:CGColorSpaceCreateDeviceRGB()];
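
After rendering, the new pixel buffer would typically be appended through the asset writer's pixel buffer adaptor with the original frame's presentation time. A short sketch (self.pixelBufferAdaptor, self.assetWriterInput and sampleBuffer are assumed to be the objects from the question, not something defined in this answer):

// Reuse the source frame's timestamp so the written movie stays in sync.
CMTime presentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);

if (self.assetWriterInput.isReadyForMoreMediaData) {
    [self.pixelBufferAdaptor appendPixelBuffer:renderedOutputPixelBuffer
                          withPresentationTime:presentationTime];
}

// Balance the +1 reference returned by CVPixelBufferPoolCreatePixelBuffer.
CVPixelBufferRelease(renderedOutputPixelBuffer);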

This works perfectly, except that the transparent image ends up inside a black box. Any idea why? :) - Joris Timmerman
@JoriDor, did you ever figure out why it was black? - Vsevolod Kukhelny
