Rendering FFmpeg's YUV video in OpenGL using CVPixelBufferRef and shaders


I'm using the iOS 5.0 method CVOpenGLESTextureCacheCreateTextureFromImage to render YUV frames from FFmpeg.

I'm using it the same way as Apple's GLCameraRipple sample.

The result displayed on my iPhone screen looks like this: [screenshot: iPhone Screen]

I need to know what I'm doing wrong.

I'm including part of my code below to help locate the error.

FFmpeg frame configuration:

ctx->p_sws_ctx = sws_getContext(ctx->p_video_ctx->width,
                                ctx->p_video_ctx->height,
                                ctx->p_video_ctx->pix_fmt,
                                ctx->p_video_ctx->width,
                                ctx->p_video_ctx->height,
                                PIX_FMT_YUV420P, SWS_FAST_BILINEAR, NULL, NULL, NULL);


// Buffer for the converted YUV420P data (despite the "_rgb" names below)
ctx->p_frame_buffer = malloc(avpicture_get_size(PIX_FMT_YUV420P,
                                                ctx->p_video_ctx->width,
                                                ctx->p_video_ctx->height));

avpicture_fill((AVPicture *)ctx->p_picture_rgb, ctx->p_frame_buffer, PIX_FMT_YUV420P,
               ctx->p_video_ctx->width,
               ctx->p_video_ctx->height);
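
For reference, the conversion itself is then driven by a sws_scale call along these lines (a sketch only; p_frame_decoded is a hypothetical name for the decoded AVFrame, which isn't shown above):

// Convert the decoded frame (whatever pix_fmt the codec emitted) into the
// I420 buffer that p_picture_rgb wraps. p_frame_decoded stands in for the
// AVFrame produced by avcodec_decode_video2().
sws_scale(ctx->p_sws_ctx,
          (const uint8_t * const *)p_frame_decoded->data,
          p_frame_decoded->linesize,
          0, ctx->p_video_ctx->height,
          ((AVPicture *)ctx->p_picture_rgb)->data,
          ((AVPicture *)ctx->p_picture_rgb)->linesize);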

My render method:

if (NULL == videoTextureCache) {
    NSLog(@"displayPixelBuffer error");
    return;
}    


CVPixelBufferRef pixelBuffer;
CVPixelBufferCreateWithBytes(kCFAllocatorDefault, mTexW, mTexH,
                             kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
                             buffer, mFrameW * 3, NULL, 0, NULL, &pixelBuffer);



CVReturn err;    
// Y-plane
glActiveTexture(GL_TEXTURE0);
err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, 
                                                   videoTextureCache,
                                                   pixelBuffer,
                                                   NULL,
                                                   GL_TEXTURE_2D,
                                                   GL_RED_EXT,
                                                   mTexW,
                                                   mTexH,
                                                   GL_RED_EXT,
                                                   GL_UNSIGNED_BYTE,
                                                   0,
                                                   &_lumaTexture);
if (err) 
{
    NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err);
}   

glBindTexture(CVOpenGLESTextureGetTarget(_lumaTexture), CVOpenGLESTextureGetName(_lumaTexture));
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);     

// UV-plane
glActiveTexture(GL_TEXTURE1);
err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, 
                                                   videoTextureCache,
                                                   pixelBuffer,
                                                   NULL,
                                                   GL_TEXTURE_2D,
                                                   GL_RG_EXT,
                                                   mTexW/2,
                                                   mTexH/2,
                                                   GL_RG_EXT,
                                                   GL_UNSIGNED_BYTE,
                                                   1,
                                                   &_chromaTexture);
if (err) 
{
    NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err);
}

glBindTexture(CVOpenGLESTextureGetTarget(_chromaTexture), CVOpenGLESTextureGetName(_chromaTexture));
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);     

glBindFramebuffer(GL_FRAMEBUFFER, defaultFramebuffer);

// Set the view port to the entire view
glViewport(0, 0, backingWidth, backingHeight);

static const GLfloat squareVertices[] = {
    1.0f, 1.0f,
    -1.0f, 1.0f,
    1.0f,  -1.0f,
    -1.0f,  -1.0f,
};

GLfloat textureVertices[] = {
    1, 1,
    1, 0,
    0, 1,
    0, 0,
};

// Draw the texture on the screen with OpenGL ES 2
[self renderWithSquareVertices:squareVertices textureVertices:textureVertices];


// Flush the CVOpenGLESTexture cache and release the texture
CVOpenGLESTextureCacheFlush(videoTextureCache, 0);    
CVPixelBufferRelease(pixelBuffer);     

[moviePlayerDelegate bufferDone];

The renderWithSquareVertices method:

- (void)renderWithSquareVertices:(const GLfloat *)squareVertices textureVertices:(const GLfloat *)textureVertices
{
    // Use shader program.
    glUseProgram(shader.program);

    // Update attribute values.
    glVertexAttribPointer(ATTRIB_VERTEX, 2, GL_FLOAT, 0, 0, squareVertices);
    glEnableVertexAttribArray(ATTRIB_VERTEX);
    glVertexAttribPointer(ATTRIB_TEXTUREPOSITON, 2, GL_FLOAT, 0, 0, textureVertices);
    glEnableVertexAttribArray(ATTRIB_TEXTUREPOSITON);

    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

    // Present
    glBindRenderbuffer(GL_RENDERBUFFER, colorRenderbuffer);
    [context presentRenderbuffer:GL_RENDERBUFFER];
}

My fragment shader:

uniform sampler2D SamplerY;
uniform sampler2D SamplerUV;

varying highp vec2 _texcoord;

void main()
{
    mediump vec3 yuv;
    lowp vec3 rgb;

    yuv.x = texture2D(SamplerY, _texcoord).r;
    yuv.yz = texture2D(SamplerUV, _texcoord).rg - vec2(0.5, 0.5);

    // BT.601, the standard for SDTV, is provided as a reference:
    /*
    rgb = mat3(      1,       1,      1,
                     0, -.34413,  1.772,
                 1.402, -.71414,      0) * yuv;
    */

    // Using BT.709, the standard for HDTV:
    rgb = mat3(      1,       1,      1,
                     0, -.18732, 1.8556,
               1.57481, -.46813,      0) * yuv;

    gl_FragColor = vec4(rgb, 1);
}
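
For completeness, SamplerY and SamplerUV are tied to texture units 0 and 1 after the program is linked, roughly like this (a sketch; glGetUniformLocation and glUniform1i are standard GL ES 2.0 calls):

// Bind the shader's samplers to the texture units used in the render method:
// GL_TEXTURE0 carries the luma texture, GL_TEXTURE1 the chroma texture.
glUseProgram(shader.program);
glUniform1i(glGetUniformLocation(shader.program, "SamplerY"), 0);
glUniform1i(glGetUniformLocation(shader.program, "SamplerUV"), 1);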

Thank you very much.

What type of video are you decoding? Are you using FFmpeg's libavcodec or iOS's decoding facilities? - Multimedia Mike
So what exactly is wrong with the application? - karlphillip
Hi resident, I'm trying to do the same thing and I also get a green screen. Did you find a solution? Thanks! - cpprulez
@resident: did you solve your problem? - Asta ni enohpi
1 Answer

My guess is that the problem is that YUV420 (or I420) is a tri-planar image format: an 8-bit Y plane followed by 8-bit, 2x2-subsampled U and V planes. The GLCameraRipple code expects NV12: an 8-bit Y plane followed by a single interleaved U/V plane with 2x2 subsampling. Given that, I think you will need three textures: luma_tex, u_chroma_tex, and v_chroma_tex.
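
A minimal sketch of that three-texture upload, assuming the texture cache is bypassed and the planes of the decoded AVFrame go straight to glTexImage2D (uploadPlane is a hypothetical helper; luma_tex, u_chroma_tex, and v_chroma_tex are GLuints created with glGenTextures elsewhere):

static void uploadPlane(GLuint tex, int unit, int w, int h, const uint8_t *pixels)
{
    glActiveTexture(GL_TEXTURE0 + unit);
    glBindTexture(GL_TEXTURE_2D, tex);
    // NPOT textures on ES 2.0 need clamped wrap and non-mipmapped filtering.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    // GL_LUMINANCE keeps each plane a single-channel texture on ES 2.0.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, w, h,
                 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, pixels);
}

// I420 layout: data[0] = Y (w x h), data[1] = U, data[2] = V (both w/2 x h/2),
// with rows linesize[i] bytes apart. ES 2.0 has no GL_UNPACK_ROW_LENGTH, so
// upload linesize[i] pixels per row and crop the padding via texture coords.
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
uploadPlane(luma_tex,     0, frame->linesize[0], height,     frame->data[0]);
uploadPlane(u_chroma_tex, 1, frame->linesize[1], height / 2, frame->data[1]);
uploadPlane(v_chroma_tex, 2, frame->linesize[2], height / 2, frame->data[2]);

The fragment shader then reads Y, U, and V from three samplers instead of two. Alternatively, you could keep the existing two-texture path and have swscale emit PIX_FMT_NV12 instead of PIX_FMT_YUV420P when creating the context.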
Also note that GLCameraRipple may also expect "video range", i.e. planar values of luma = [16, 235] and chroma = [16, 240].
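
If the frames really are video range, the shader has to expand them before the matrix multiply; a minimal sketch of that adjustment (1.1643 ≈ 255/219, the standard luma expansion; a matching 255/224 chroma scale is often folded into the conversion matrix):

// Expand video-range YUV (luma 16..235, chroma 16..240) toward full range
// before applying the RGB conversion matrix.
yuv.x  = 1.1643 * (texture2D(SamplerY, _texcoord).r - 0.0627);  // 0.0627 = 16/255
yuv.yz = texture2D(SamplerUV, _texcoord).rg - vec2(0.5, 0.5);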

Do you mean that kCVPixelFormatType_420YpCbCr8BiPlanarFullRange is NV12? - onmyway133
