How to play raw h264 produced by the MediaCodec encoder?

5

I'm somewhat new to media codecs, so please correct me if I say something wrong.

I want to play the raw h264 output of MediaCodec with VLC/ffplay. I need this because my end goal is to stream some live video to a computer, and MediaMuxer only produces a file rather than a stream I can send to the desktop with low latency. (I'm open to other solutions, but I haven't found anything else that meets the latency requirement.)

Here is the code I'm using to encode the video and write it to a file. (It's based on the MediaCodec example found here, just with the MediaMuxer parts removed.)

package com.jackos2500.droidtop;

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.opengl.EGL14;
import android.opengl.EGLConfig;
import android.opengl.EGLContext;
import android.opengl.EGLDisplay;
import android.opengl.EGLExt;
import android.opengl.EGLSurface;
import android.opengl.GLES20;
import android.os.Environment;
import android.util.Log;
import android.view.Surface;

import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;

public class StreamH264 {
    private static final String TAG = "StreamH264";
    private static final boolean VERBOSE = true;           // lots of logging

    // where to put the output file (note: /sdcard requires WRITE_EXTERNAL_STORAGE permission)
    private static final File OUTPUT_DIR = Environment.getExternalStorageDirectory();

    public static int MEGABIT = 1000 * 1000;
    private static final int IFRAME_INTERVAL = 10;

    private static final int TEST_R0 = 0;
    private static final int TEST_G0 = 136;
    private static final int TEST_B0 = 0;
    private static final int TEST_R1 = 236;
    private static final int TEST_G1 = 50;
    private static final int TEST_B1 = 186;

    private MediaCodec codec;
    private CodecInputSurface inputSurface;
    private BufferedOutputStream out;

    private MediaCodec.BufferInfo bufferInfo;
    public StreamH264() {

    }

    private void prepareEncoder() throws IOException {
        bufferInfo = new MediaCodec.BufferInfo();

        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2 * MEGABIT);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, IFRAME_INTERVAL);

        codec = MediaCodec.createEncoderByType("video/avc");
        codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        inputSurface = new CodecInputSurface(codec.createInputSurface());
        codec.start();

        File dst = new File(OUTPUT_DIR, "test.264");
        out = new BufferedOutputStream(new FileOutputStream(dst));
    }
    private void releaseEncoder() throws IOException {
        if (VERBOSE) Log.d(TAG, "releasing encoder objects");
        if (codec != null) {
            codec.stop();
            codec.release();
            codec = null;
        }
        if (inputSurface != null) {
            inputSurface.release();
            inputSurface = null;
        }
        if (out != null) {
            out.flush();
            out.close();
            out = null;
        }
    }
    public void stream() throws IOException {
        try {
            prepareEncoder();
            inputSurface.makeCurrent();
            for (int i = 0; i < (30 * 5); i++) {
                // Feed any pending encoder output into the file.
                drainEncoder(false);

                // Generate a new frame of input.
                generateSurfaceFrame(i);
                inputSurface.setPresentationTime(computePresentationTimeNsec(i, 30));

                // Submit it to the encoder.  The eglSwapBuffers call will block if the input
                // is full, which would be bad if it stayed full until we dequeued an output
                // buffer (which we can't do, since we're stuck here).  So long as we fully drain
                // the encoder before supplying additional input, the system guarantees that we
                // can supply another frame without blocking.
                if (VERBOSE) Log.d(TAG, "sending frame " + i + " to encoder");
                inputSurface.swapBuffers();
            }
            // send end-of-stream to encoder, and drain remaining output
            drainEncoder(true);
        } finally {
            // release encoder, muxer, and input Surface
            releaseEncoder();
        }
    }

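    /**
     * Extracts all pending data from the encoder and writes it to the output stream.
     * <p>
     * If endOfStream is not set, this returns when there is no more data to drain.  If it
     * is set, we send EOS to the encoder and then iterate until we see EOS on the output.
     */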
    private void drainEncoder(boolean endOfStream) throws IOException {
        final int TIMEOUT_USEC = 10000;
        if (VERBOSE) Log.d(TAG, "drainEncoder(" + endOfStream + ")");

        if (endOfStream) {
            if (VERBOSE) Log.d(TAG, "sending EOS to encoder");
            codec.signalEndOfInputStream();
        }
        ByteBuffer[] outputBuffers = codec.getOutputBuffers();
        while (true) {
            int encoderStatus = codec.dequeueOutputBuffer(bufferInfo, TIMEOUT_USEC);
            if (encoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
                // no output available yet
                if (!endOfStream) {
                    break;      // out of while
                } else {
                    if (VERBOSE) Log.d(TAG, "no output available, spinning to await EOS");
                }
            } else if (encoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                // not expected for an encoder
                outputBuffers = codec.getOutputBuffers();
            } else if (encoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                // should happen before receiving buffers, and should only happen once
                MediaFormat newFormat = codec.getOutputFormat();
                Log.d(TAG, "encoder output format changed: " + newFormat);
            } else if (encoderStatus < 0) {
                Log.w(TAG, "unexpected result from encoder.dequeueOutputBuffer: " + encoderStatus);
                // let's ignore it
            } else {
                ByteBuffer encodedData = outputBuffers[encoderStatus];
                if (encodedData == null) {
                    throw new RuntimeException("encoderOutputBuffer " + encoderStatus + " was null");
                }

                if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                    // The codec config data was pulled out and fed to the muxer when we got
                    // the INFO_OUTPUT_FORMAT_CHANGED status.  Ignore it.
                    if (VERBOSE) Log.d(TAG, "ignoring BUFFER_FLAG_CODEC_CONFIG");
                    bufferInfo.size = 0;
                }

                if (bufferInfo.size != 0) {
                    // adjust the ByteBuffer values to match BufferInfo (not needed?)
                    encodedData.position(bufferInfo.offset);
                    encodedData.limit(bufferInfo.offset + bufferInfo.size);

                    byte[] data = new byte[bufferInfo.size];
                    encodedData.get(data);
                    out.write(data);
                    if (VERBOSE) Log.d(TAG, "sent " + bufferInfo.size + " bytes to file");
                }

                codec.releaseOutputBuffer(encoderStatus, false);

                if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    if (!endOfStream) {
                        Log.w(TAG, "reached end of stream unexpectedly");
                    } else {
                        if (VERBOSE) Log.d(TAG, "end of stream reached");
                    }
                    break;      // out of while
                }
            }
        }
    }
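    /**
     * Generates a frame of data using GL commands: an eight-frame looping animation
     * that moves a colored rectangle back and forth across the two halves of the frame.
     */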
    private void generateSurfaceFrame(int frameIndex) {
        frameIndex %= 8;

        int startX, startY;
        if (frameIndex < 4) {
            // (0,0) is bottom-left in GL
            startX = frameIndex * (1280 / 4);
            startY = 720 / 2;
        } else {
            startX = (7 - frameIndex) * (1280 / 4);
            startY = 0;
        }

        GLES20.glClearColor(TEST_R0 / 255.0f, TEST_G0 / 255.0f, TEST_B0 / 255.0f, 1.0f);
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);

        GLES20.glEnable(GLES20.GL_SCISSOR_TEST);
        GLES20.glScissor(startX, startY, 1280 / 4, 720 / 2);
        GLES20.glClearColor(TEST_R1 / 255.0f, TEST_G1 / 255.0f, TEST_B1 / 255.0f, 1.0f);
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
        GLES20.glDisable(GLES20.GL_SCISSOR_TEST);
    }
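    /**
     * Generates the presentation time for frame N, in nanoseconds.
     */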
    private static long computePresentationTimeNsec(int frameIndex, int frameRate) {
        final long ONE_BILLION = 1000000000;
        return frameIndex * ONE_BILLION / frameRate;
    }

    /**
     * Holds state associated with a Surface used for MediaCodec encoder input.
     * <p>
     * The constructor takes a Surface obtained from MediaCodec.createInputSurface(), and uses that
     * to create an EGL window surface.  Calls to eglSwapBuffers() cause a frame of data to be sent
     * to the video encoder.
     * <p>
     * This object owns the Surface -- releasing this will release the Surface too.
     */
    private static class CodecInputSurface {
        private static final int EGL_RECORDABLE_ANDROID = 0x3142;

        private EGLDisplay mEGLDisplay = EGL14.EGL_NO_DISPLAY;
        private EGLContext mEGLContext = EGL14.EGL_NO_CONTEXT;
        private EGLSurface mEGLSurface = EGL14.EGL_NO_SURFACE;

        private Surface mSurface;

        /**
         * Creates a CodecInputSurface from a Surface.
         */
        public CodecInputSurface(Surface surface) {
            if (surface == null) {
                throw new NullPointerException();
            }
            mSurface = surface;

            eglSetup();
        }

        /**
         * Prepares EGL.  We want a GLES 2.0 context and a surface that supports recording.
         */
        private void eglSetup() {
            mEGLDisplay = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);
            if (mEGLDisplay == EGL14.EGL_NO_DISPLAY) {
                throw new RuntimeException("unable to get EGL14 display");
            }
            int[] version = new int[2];
            if (!EGL14.eglInitialize(mEGLDisplay, version, 0, version, 1)) {
                throw new RuntimeException("unable to initialize EGL14");
            }

            // Configure EGL for recording and OpenGL ES 2.0.
            int[] attribList = {
                    EGL14.EGL_RED_SIZE, 8,
                    EGL14.EGL_GREEN_SIZE, 8,
                    EGL14.EGL_BLUE_SIZE, 8,
                    EGL14.EGL_ALPHA_SIZE, 8,
                    EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
                    EGL_RECORDABLE_ANDROID, 1,
                    EGL14.EGL_NONE
            };
            EGLConfig[] configs = new EGLConfig[1];
            int[] numConfigs = new int[1];
            EGL14.eglChooseConfig(mEGLDisplay, attribList, 0, configs, 0, configs.length,
                    numConfigs, 0);
            checkEglError("eglCreateContext RGB888+recordable ES2");

            // Configure context for OpenGL ES 2.0.
            int[] attrib_list = {
                    EGL14.EGL_CONTEXT_CLIENT_VERSION, 2,
                    EGL14.EGL_NONE
            };
            mEGLContext = EGL14.eglCreateContext(mEGLDisplay, configs[0], EGL14.EGL_NO_CONTEXT,
                    attrib_list, 0);
            checkEglError("eglCreateContext");

            // Create a window surface, and attach it to the Surface we received.
            int[] surfaceAttribs = {
                    EGL14.EGL_NONE
            };
            mEGLSurface = EGL14.eglCreateWindowSurface(mEGLDisplay, configs[0], mSurface,
                    surfaceAttribs, 0);
            checkEglError("eglCreateWindowSurface");
        }

        /**
         * Discards all resources held by this class, notably the EGL context.  Also releases the
         * Surface that was passed to our constructor.
         */
        public void release() {
            if (mEGLDisplay != EGL14.EGL_NO_DISPLAY) {
                EGL14.eglMakeCurrent(mEGLDisplay, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_SURFACE,
                        EGL14.EGL_NO_CONTEXT);
                EGL14.eglDestroySurface(mEGLDisplay, mEGLSurface);
                EGL14.eglDestroyContext(mEGLDisplay, mEGLContext);
                EGL14.eglReleaseThread();
                EGL14.eglTerminate(mEGLDisplay);
            }

            mSurface.release();

            mEGLDisplay = EGL14.EGL_NO_DISPLAY;
            mEGLContext = EGL14.EGL_NO_CONTEXT;
            mEGLSurface = EGL14.EGL_NO_SURFACE;

            mSurface = null;
        }

        /**
         * Makes our EGL context and surface current.
         */
        public void makeCurrent() {
            EGL14.eglMakeCurrent(mEGLDisplay, mEGLSurface, mEGLSurface, mEGLContext);
            checkEglError("eglMakeCurrent");
        }

        /**
         * Calls eglSwapBuffers.  Use this to "publish" the current frame.
         */
        public boolean swapBuffers() {
            boolean result = EGL14.eglSwapBuffers(mEGLDisplay, mEGLSurface);
            checkEglError("eglSwapBuffers");
            return result;
        }

        /**
         * Sends the presentation time stamp to EGL.  Time is expressed in nanoseconds.
         */
        public void setPresentationTime(long nsecs) {
            EGLExt.eglPresentationTimeANDROID(mEGLDisplay, mEGLSurface, nsecs);
            checkEglError("eglPresentationTimeANDROID");
        }

        /**
         * Checks for EGL errors.  Throws an exception if one is found.
         */
        private void checkEglError(String msg) {
            int error;
            if ((error = EGL14.eglGetError()) != EGL14.EGL_SUCCESS) {
                throw new RuntimeException(msg + ": EGL error: 0x" + Integer.toHexString(error));
            }
        }
    }
}

However, the file produced by this code cannot be played in VLC or ffplay. Can anyone tell me what I'm doing wrong? I believe it's caused by an incorrect (or entirely missing) header format required for playing raw h264, since I have successfully played .264 files downloaded from the internet with ffplay. Also, I'm not sure exactly how I'm going to stream this video to a computer, so if somebody could give me some suggestions as to how I might do that, I would be very grateful! Thanks!
2 Answers

9
You should be able to play back a raw H264 stream (as you wrote, other raw .264 files play back just fine with VLC or ffplay), but you are missing the parameter sets. These are passed in two different ways, and you happen to be missing both. First, they are returned in the MediaFormat when you get MediaCodec.INFO_OUTPUT_FORMAT_CHANGED (which you don't handle; you just log a message about it). Second, they are returned in a buffer with MediaCodec.BUFFER_FLAG_CODEC_CONFIG set (which you ignore by setting the size to 0). The simplest fix here is to remove the special-case handling of MediaCodec.BUFFER_FLAG_CODEC_CONFIG, and it should all work just fine.
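
For example, a minimal sketch of that change to the write path in your drainEncoder() (a fragment, not a full method; out is your existing BufferedOutputStream, and the EOS check that follows is unchanged):

} else {
    ByteBuffer encodedData = outputBuffers[encoderStatus];
    if (encodedData == null) {
        throw new RuntimeException("encoderOutputBuffer " + encoderStatus + " was null");
    }

    // No BUFFER_FLAG_CODEC_CONFIG special case here: the config buffer, which
    // holds the SPS/PPS parameter sets already prefixed with Annex-B start
    // codes, is simply the first thing written to the stream.
    if (bufferInfo.size != 0) {
        encodedData.position(bufferInfo.offset);
        encodedData.limit(bufferInfo.offset + bufferInfo.size);

        byte[] data = new byte[bufferInfo.size];
        encodedData.get(data);
        out.write(data);
    }

    codec.releaseOutputBuffer(encoderStatus, false);
    // ... end-of-stream check unchanged ...
}
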
The code you based this on does things that way in order to test all the different ways of doing things; where you copied it from, the parameter sets were carried in the MediaFormat from MediaCodec.INFO_OUTPUT_FORMAT_CHANGED. If you wanted to use that route instead with your raw H264 byte stream, you could write the byte buffers with keys csd-0 and csd-1 from the MediaFormat and keep ignoring the buffers with MediaCodec.BUFFER_FLAG_CODEC_CONFIG set.
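
A rough sketch of what that INFO_OUTPUT_FORMAT_CHANGED branch could look like (assuming, as with Android's H.264 encoders in byte-stream mode, that csd-0 carries the SPS and csd-1 the PPS, each already prefixed with a 0x00000001 start code):

} else if (encoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
    MediaFormat newFormat = codec.getOutputFormat();
    ByteBuffer sps = newFormat.getByteBuffer("csd-0");   // SPS NAL unit
    ByteBuffer pps = newFormat.getByteBuffer("csd-1");   // PPS NAL unit
    int spsLen = sps.remaining();
    int ppsLen = pps.remaining();
    byte[] config = new byte[spsLen + ppsLen];
    sps.get(config, 0, spsLen);
    pps.get(config, spsLen, ppsLen);
    out.write(config);   // parameter sets must precede the first encoded frame
}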

Sorry for the late reply, but removing the code that ignored those parameter sets by setting bufferInfo.size to 0 fixed the problem perfectly, thanks! - jackos2500

1
You cannot play just raw h264 video directly; it does not carry any information about its format. You can also find some good examples here. For streaming, you need to implement a streaming protocol such as RTSP (for real-time streaming) or the more flexible HLS (if real time is not required).

OK, so if playing the raw video isn't a consideration, how would I go about sending a frame to the computer, decoding it, and displaying it on the screen? Preferably with Java. - jackos2500
You can send a frame over TCP or UDP (describing its size in bytes), and then, if you use an h264 decoder, feed it a frame along with a description of its format, e.g. bps, width/height, etc. Since you're streaming to a computer, you will need your own decoder application, probably using some library to help you decode the video and render it (typically with OpenGL). - Volodymyr Lykhonis
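
As a rough illustration of the framing idea in that comment (the FrameSender class name, host, and port are hypothetical, not from any library), the sending side over TCP could look something like this in Java:

import java.io.DataOutputStream;
import java.io.IOException;
import java.net.Socket;

// Hypothetical sender: prefix each encoded buffer with its length so the
// receiver knows where one frame ends and the next begins.
public class FrameSender {
    private final DataOutputStream netOut;

    public FrameSender(String host, int port) throws IOException {
        netOut = new DataOutputStream(new Socket(host, port).getOutputStream());
    }

    // Writes a 4-byte big-endian length prefix, then the raw Annex-B frame bytes.
    public void sendFrame(byte[] frame) throws IOException {
        netOut.writeInt(frame.length);
        netOut.write(frame);
        netOut.flush();   // flush per frame to keep latency low
    }
}

On the receiving end you would read the 4-byte length, read exactly that many bytes, and hand them to a decoder along with the format details (width/height, etc.) mentioned above.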
