Android camera2 output to ImageReader in YUV_420_888 format is still slow

8
I am trying to run Android camera2 in a background service and then process frames in the ImageReader.OnImageAvailableListener callback. I already use the suggested raw format YUV_420_888 to get the maximum fps, but I only get around 7 fps at a resolution of 640x480. That is even slower than what I got with the old Camera interface (I wanted to upgrade to Camera2 for the higher fps) or with OpenCV's JavaCameraView (which I cannot use, because I need to run the processing in a background service). My service class is below. What am I missing?
My phone is a Redmi Note 3 running Android 5.0.2.
public class Camera2ServiceYUV extends Service {
    protected static final String TAG = "VideoProcessing";
    protected static final int CAMERACHOICE = CameraCharacteristics.LENS_FACING_BACK;
    protected CameraDevice cameraDevice;
    protected CameraCaptureSession captureSession;
    protected ImageReader imageReader;

    // A semaphore to prevent the app from exiting before closing the camera.
    private Semaphore mCameraOpenCloseLock = new Semaphore(1);


    public static final String RESULT_RECEIVER = "resultReceiver";
    private static final int JPEG_COMPRESSION = 90;

    public static final int RESULT_OK = 0;
    public static final int RESULT_DEVICE_NO_CAMERA= 1;
    public static final int RESULT_GET_CAMERA_FAILED = 2;
    public static final int RESULT_ALREADY_RUNNING = 3;
    public static final int RESULT_NOT_RUNNING = 4;

    private static final String START_SERVICE_COMMAND = "startServiceCommands";
    private static final int COMMAND_NONE = -1;
    private static final int COMMAND_START = 0;
    private static final int COMMAND_STOP = 1;

    private boolean mRunning = false;
    public Camera2ServiceYUV() {
    }

    public static void startToStart(Context context, ResultReceiver resultReceiver) {
        Intent intent = new Intent(context, Camera2ServiceYUV.class);
        intent.putExtra(START_SERVICE_COMMAND, COMMAND_START);
        intent.putExtra(RESULT_RECEIVER, resultReceiver);
        context.startService(intent);
    }

    public static void startToStop(Context context, ResultReceiver resultReceiver) {
        Intent intent = new Intent(context, Camera2ServiceYUV.class);
        intent.putExtra(START_SERVICE_COMMAND, COMMAND_STOP);
        intent.putExtra(RESULT_RECEIVER, resultReceiver);
        context.startService(intent);
    }

    // SERVICE INTERFACE
    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        switch (intent.getIntExtra(START_SERVICE_COMMAND, COMMAND_NONE)) {
            case COMMAND_START:
                startCamera(intent);
                break;
            case COMMAND_STOP:
                stopCamera(intent);
                break;
            default:
                throw new UnsupportedOperationException("Cannot start the camera service with an illegal command.");
        }

        return START_STICKY;
    }

    @Override
    public void onDestroy() {
        // The session may never have been created if the camera failed to open.
        if (captureSession != null) {
            try {
                captureSession.abortCaptures();
            } catch (CameraAccessException e) {
                Log.e(TAG, e.getMessage());
            }
            captureSession.close();
        }
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }


    // CAMERA2 INTERFACE
    /**
     * 1. The android CameraManager class is used to manage all the camera devices in our android device
     * Each camera device has a range of properties and settings that describe the device.
     * It can be obtained through the camera characteristics.
     */
    public void startCamera(Intent intent) {

        final ResultReceiver resultReceiver = intent.getParcelableExtra(RESULT_RECEIVER);

        if (mRunning) {
            resultReceiver.send(RESULT_ALREADY_RUNNING, null);
            return;
        }
        mRunning = true;

        CameraManager manager = (CameraManager) getSystemService(CAMERA_SERVICE);
        try {
            if (!mCameraOpenCloseLock.tryAcquire(2500, TimeUnit.MILLISECONDS)) {
                throw new RuntimeException("Time out waiting to lock camera opening.");
            }
            String pickedCamera = getCamera(manager);
            Log.e(TAG,"AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA " + pickedCamera);
            manager.openCamera(pickedCamera, cameraStateCallback, null);
            CameraCharacteristics characteristics = manager.getCameraCharacteristics(pickedCamera);
            Size[] yuvSizes = null;
            if (characteristics != null) {
                yuvSizes = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP).getOutputSizes(ImageFormat.YUV_420_888);
            }
            int width = 640;
            int height = 480;
//            if (yuvSizes != null && 0 < yuvSizes.length) {
//                width = yuvSizes[yuvSizes.length - 1].getWidth();
//                height = yuvSizes[yuvSizes.length - 1].getHeight();
//            }
//            for (Size s : yuvSizes) {
//                Log.e(TAG, "Size = " + s.toString());
//            }


            // DEBUG
            StreamConfigurationMap map = characteristics.get(
                    CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            if (map == null) {
                return;
            }
            Log.e(TAG,"Width = " + width + ", Height = " + height);
            Log.e(TAG,"output stall duration = " + map.getOutputStallDuration(ImageFormat.YUV_420_888, new Size(width,height)) );
            Log.e(TAG,"Min output stall duration = " + map.getOutputMinFrameDuration(ImageFormat.YUV_420_888, new Size(width,height)) );

//            Size[] sizeList = map.getInputSizes(ImageFormat.YUV_420_888);
//            for(Size s : sizeList)
//            {
//                Log.e(TAG,"Size = " + s.toString());
//            }

            imageReader = ImageReader.newInstance(width, height, ImageFormat.YUV_420_888, 2 /* images buffered */);
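            // Note: a null Handler delivers onImageAvailable() on the current (main) thread's Looper;
            // a dedicated background HandlerThread would keep frame delivery off the UI thread.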
            imageReader.setOnImageAvailableListener(onImageAvailableListener, null);
            Log.i(TAG, "imageReader created");
        } catch (CameraAccessException e) {
            Log.e(TAG, e.getMessage());
            resultReceiver.send(RESULT_DEVICE_NO_CAMERA, null);
        } catch (InterruptedException e) {
            resultReceiver.send(RESULT_GET_CAMERA_FAILED, null);
            throw new RuntimeException("Interrupted while trying to lock camera opening.", e);
        } catch (SecurityException se) {
            resultReceiver.send(RESULT_GET_CAMERA_FAILED, null);
            throw new RuntimeException("Security permission exception while trying to open the camera.", se);
        }

        resultReceiver.send(RESULT_OK, null);
    }

    // We can pick the camera being used, i.e. rear camera in this case.
    private String getCamera(CameraManager manager) {
        try {
            for (String cameraId : manager.getCameraIdList()) {
                CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
                int cOrientation = characteristics.get(CameraCharacteristics.LENS_FACING);
                if (cOrientation == CAMERACHOICE) {
                    return cameraId;
                }
            }
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        return null;
    }


    /**
     * 1.1 Callbacks when the camera changes its state - opened, disconnected, or error.
     */
    protected CameraDevice.StateCallback cameraStateCallback = new CameraDevice.StateCallback() {
        @Override
        public void onOpened(@NonNull CameraDevice camera) {
            Log.i(TAG, "CameraDevice.StateCallback onOpened");
            mCameraOpenCloseLock.release();
            cameraDevice = camera;
            createCaptureSession();
        }

        @Override
        public void onDisconnected(@NonNull CameraDevice camera) {
            Log.w(TAG, "CameraDevice.StateCallback onDisconnected");
            mCameraOpenCloseLock.release();
            camera.close();
            cameraDevice = null;
        }

        @Override
        public void onError(@NonNull CameraDevice camera, int error) {
            Log.e(TAG, "CameraDevice.StateCallback onError " + error);
            mCameraOpenCloseLock.release();
            camera.close();
            cameraDevice = null;
        }
    };


    /**
     * 2. To capture or stream images from a camera device, the application must first create
     * a camera capture captureSession.
     * The camera capture needs a surface to output what has been captured, in this case
     * we use ImageReader in order to access the frame data.
     */
    public void createCaptureSession() {
        try {
            cameraDevice.createCaptureSession(Arrays.asList(imageReader.getSurface()), sessionStateCallback, null);
        } catch (CameraAccessException e) {
            Log.e(TAG, e.getMessage());
        }
    }

    protected CameraCaptureSession.StateCallback sessionStateCallback = new CameraCaptureSession.StateCallback() {
        @Override
        public void onConfigured(@NonNull CameraCaptureSession session) {
            Log.i(TAG, "CameraCaptureSession.StateCallback onConfigured");

            // The camera is already closed
            if (null == cameraDevice) {
                return;
            }

            // When the captureSession is ready, we start to grab the frame.
            Camera2ServiceYUV.this.captureSession = session;

            try {
                session.setRepeatingRequest(createCaptureRequest(), null, null);
            } catch (CameraAccessException e) {
                Log.e(TAG, e.getMessage());
            }
        }

        @Override
        public void onConfigureFailed(@NonNull CameraCaptureSession session) {
            Log.e(TAG, "CameraCaptureSession.StateCallback onConfigureFailed");
        }
    };

    /**
     * 3. The application then needs to construct a CaptureRequest, which defines all the capture parameters
     *    needed by a camera device to capture a single image.
     */
    private CaptureRequest createCaptureRequest() {
        try {
            /**
             * Check other templates for further details.
             * TEMPLATE_MANUAL = 6
             * TEMPLATE_PREVIEW = 1
             * TEMPLATE_RECORD = 3
             * TEMPLATE_STILL_CAPTURE = 2
             * TEMPLATE_VIDEO_SNAPSHOT = 4
             * TEMPLATE_ZERO_SHUTTER_LAG = 5
             *
             * TODO: can set camera features like auto focus, auto flash here
             * captureRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE,CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
             */
            CaptureRequest.Builder captureRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
//            captureRequestBuilder.set(CaptureRequest.EDGE_MODE,
//                    CaptureRequest.EDGE_MODE_OFF);
//            captureRequestBuilder.set(
//                    CaptureRequest.LENS_OPTICAL_STABILIZATION_MODE,
//                    CaptureRequest.LENS_OPTICAL_STABILIZATION_MODE_ON);
//            captureRequestBuilder.set(
//                    CaptureRequest.COLOR_CORRECTION_ABERRATION_MODE,
//                    CaptureRequest.COLOR_CORRECTION_ABERRATION_MODE_OFF);
//            captureRequestBuilder.set(CaptureRequest.NOISE_REDUCTION_MODE,
//                    CaptureRequest.NOISE_REDUCTION_MODE_OFF);
//            captureRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER,
//                    CaptureRequest.CONTROL_AF_TRIGGER_CANCEL);
//
//            captureRequestBuilder.set(CaptureRequest.CONTROL_AE_LOCK, true);
//            captureRequestBuilder.set(CaptureRequest.CONTROL_AWB_LOCK, true);

            captureRequestBuilder.addTarget(imageReader.getSurface());
            return captureRequestBuilder.build();
        } catch (CameraAccessException e) {
            Log.e(TAG, e.getMessage());
            return null;
        }
    }


    /**
     * ImageReader provides a surface for the camera to output what has been captured.
     * Upon the image available, call processImage() to process the image as desired.
     */
    private long frameTime = 0;
    private ImageReader.OnImageAvailableListener onImageAvailableListener = new ImageReader.OnImageAvailableListener() {
        @Override
        public void onImageAvailable(ImageReader reader) {
            Log.i(TAG, "called ImageReader.OnImageAvailable");
            Image img = reader.acquireLatestImage();
            if (img != null) {
                if (frameTime != 0) {
                    Log.e(TAG, "fps = " + (float) (1000.0 / (SystemClock.elapsedRealtime() - frameTime)) + " fps");
                }
                frameTime = SystemClock.elapsedRealtime();
                img.close();
            }
        }
    };

    private void processImage(Image image) {
        // imageToMat() packs the planes as I420; convert to RGBA before drawing on the frame.
        Mat yuvImage = imageToMat(image);
        Mat outputImage = new Mat();
        Imgproc.cvtColor(yuvImage, outputImage, Imgproc.COLOR_YUV2RGBA_I420);
        Bitmap bmp = Bitmap.createBitmap(outputImage.cols(), outputImage.rows(), Bitmap.Config.ARGB_8888);
        Point mid = new Point(0, 0);
        Point inEnd = new Point(outputImage.cols(), outputImage.rows());
        Imgproc.line(outputImage, mid, inEnd, new Scalar(255, 0, 0), 2, Core.LINE_AA, 0);
        Utils.matToBitmap(outputImage, bmp);

        Intent broadcast = new Intent();
        broadcast.setAction("your_load_photo_action");
        broadcast.putExtra("BitmapImage", bmp);
        sendBroadcast(broadcast);
    }

    private Mat imageToMat(Image image) {
        ByteBuffer buffer;
        int rowStride;
        int pixelStride;
        int width = image.getWidth();
        int height = image.getHeight();
        int offset = 0;

        Image.Plane[] planes = image.getPlanes();
        byte[] data = new byte[image.getWidth() * image.getHeight() * ImageFormat.getBitsPerPixel(ImageFormat.YUV_420_888) / 8];
        byte[] rowData = new byte[planes[0].getRowStride()];

        for (int i = 0; i < planes.length; i++) {
            buffer = planes[i].getBuffer();
            rowStride = planes[i].getRowStride();
            pixelStride = planes[i].getPixelStride();
            int w = (i == 0) ? width : width / 2;
            int h = (i == 0) ? height : height / 2;
            for (int row = 0; row < h; row++) {
                int bytesPerPixel = ImageFormat.getBitsPerPixel(ImageFormat.YUV_420_888) / 8;
                if (pixelStride == bytesPerPixel) {
                    int length = w * bytesPerPixel;
                    buffer.get(data, offset, length);

                    // Advance buffer the remainder of the row stride, unless on the last row.
                    // Otherwise, this will throw an IllegalArgumentException because the buffer
                    // doesn't include the last padding.
                    if (h - row != 1) {
                        buffer.position(buffer.position() + rowStride - length);
                    }
                    offset += length;
                } else {

                    // On the last row only read the width of the image minus the pixel stride
                    // plus one. Otherwise, this will throw a BufferUnderflowException because the
                    // buffer doesn't include the last padding.
                    if (h - row == 1) {
                        buffer.get(rowData, 0, width - pixelStride + 1);
                    } else {
                        buffer.get(rowData, 0, rowStride);
                    }

                    for (int col = 0; col < w; col++) {
                        data[offset++] = rowData[col * pixelStride];
                    }
                }
            }
        }

        // Finally, create the Mat.
        Mat mat = new Mat(height + height / 2, width, CV_8UC1);
        mat.put(0, 0, data);

        return mat;
    }


    private void stopCamera(Intent intent) {
        ResultReceiver resultReceiver = intent.getParcelableExtra(RESULT_RECEIVER);

        if (!mRunning) {
            resultReceiver.send(RESULT_NOT_RUNNING, null);
            return;
        }

        closeCamera();

        resultReceiver.send(RESULT_OK, null);

        mRunning = false;
        Log.d(TAG, "Service is finished.");
    }

    /**
     * Closes the current {@link CameraDevice}.
     */
    private void closeCamera() {
        try {
            mCameraOpenCloseLock.acquire();
            if (null != captureSession) {
                captureSession.close();
                captureSession = null;
            }
            if (null != cameraDevice) {
                cameraDevice.close();
                cameraDevice = null;
            }
            if (null != imageReader) {
                imageReader.close();
                imageReader = null;
            }
        } catch (InterruptedException e) {
            throw new RuntimeException("Interrupted while trying to lock camera closing.", e);
        } finally {
            mCameraOpenCloseLock.release();
        }
    }
}
4 Answers

3
I recently ran into this when trying to upgrade my AR app from Camera1 to the Camera2 API. I tested on a mid-range device (Meizu S6) with an Exynos 7872 CPU and a Mali-G71 GPU. What I wanted was a steady 30 fps AR experience, but during the migration I found it hard to get a decent preview frame rate out of the Camera2 API.
I configured my capture request with TEMPLATE_PREVIEW:
mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);

I used two surfaces: a 1280x720 SurfaceTexture for the preview, and a 1280x720 ImageReader for image processing.

mImageReader = ImageReader.newInstance(
    mVideoSize.getWidth(),
    mVideoSize.getHeight(),
    ImageFormat.YUV_420_888,
    2);

List<Surface> surfaces = new ArrayList<>();
Surface previewSurface = new Surface(mSurfaceTexture);
surfaces.add(previewSurface);
mPreviewBuilder.addTarget(previewSurface);

Surface frameCaptureSurface = mImageReader.getSurface();
surfaces.add(frameCaptureSurface);
mPreviewBuilder.addTarget(frameCaptureSurface);

mPreviewBuilder.set(CaptureRequest.CONTROL_AF_MODE,
                    CameraMetadata.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
mPreviewSession.setRepeatingRequest(mPreviewBuilder.build(), captureCallback, mBackgroundHandler);

Everything ran as expected: my TextureView got updated and the frame callback was invoked as well. But the problem is... the frame rate was only about 10 fps, even though I had not done any image processing yet.
I tried many Camera2 API settings, including SENSOR_FRAME_DURATION and different ImageFormat and size combinations, but none of them improved the frame rate. Yet if I removed the ImageReader from the output surfaces, the preview easily reached 30 fps!
So the problem is that adding an ImageReader as a Camera2 output surface severely degrades the preview frame rate, at least in my case. So what is the solution?

My solution is glReadPixels.

I know glReadPixels is one of those calls you are supposed to avoid, because it copies bytes from the GPU back to main memory and forces OpenGL to flush its draw commands, so for performance's sake it is best avoided. But surprisingly, glReadPixels is actually very fast, and it gives a better frame rate than ImageReader's YUV_420_888 output.
To reduce the memory overhead, I added another draw call with a smaller 360x640 framebuffer, besides the 720p preview, dedicated to feature detection.
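For reference, a minimal sketch of the readback step (assuming the 360x640 offscreen framebuffer is currently bound and has just been drawn; it uses android.opengl.GLES20 and java.nio.ByteBuffer/ByteOrder):

// On the GL thread, right after rendering the camera frame into the small FBO.
ByteBuffer pixelBuf = ByteBuffer.allocateDirect(360 * 640 * 4)
        .order(ByteOrder.nativeOrder());
GLES20.glReadPixels(0, 0, 360, 640,
        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, pixelBuf);
// pixelBuf now holds the RGBA frame on the CPU side, ready for feature detection.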

Do you have a working example using glReadPixels? - DDukesterman

1

Based on OpenCV's camera2 implementation: I ran into the same problem, then noticed this code in OpenCV's JavaCamera2View. You need to change the CaptureRequest.Builder settings as follows:

CaptureRequest.Builder captureBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
captureBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);

It improved my fps from 10 to about 28-30. In my case there were two target surfaces: one for the preview TextureView, and a second one for the ImageReader.
Surface readerSurface = imageReader.getSurface();
Surface surface = new Surface(surfaceTexture);
captureBuilder.addTarget(surface);
captureBuilder.addTarget(readerSurface);

0

I can't comment (not enough reputation), but I hit the same problem on a Redmi 6.

Previewing the camera output with a TextureView gives about 30 fps, but replacing it with an ImageReader drops it to 8-9 fps. The camera configuration is identical in both cases.

Interestingly, CameraXBasic shows the same problem: updates from the camera are sluggish. But android-Camera2Basic (which uses a TextureView) runs without any issues.

Update 1: tested with the preview size lowered from 1280x720 to 640x480 and, as expected, saw better performance.
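As a sketch, the supported YUV sizes can be listed from the StreamConfigurationMap to pick a smaller one (same characteristics object as in the question):

StreamConfigurationMap map = characteristics.get(
        CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
for (Size s : map.getOutputSizes(ImageFormat.YUV_420_888)) {
    Log.d(TAG, "Supported YUV size: " + s); // e.g. pick 640x480 here
}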


0
Here is what I found after a bit of tweaking: the problem is the ImageReader's maxImages parameter. I changed it from 2 to 3, and even 56, and it changed the fps significantly. I think that while a frame is being handled in ImageReader.OnImageAvailableListener, or an Image has not yet been released, the camera is blocked from saving new frames into the buffer queue; in other words, the camera wants a buffer but there are not enough free ones. So by increasing the ImageReader's maximum number of buffers we leave camera2 room to keep writing images.
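A sketch of that idea (the value 6 and backgroundHandler are illustrative, not from my code):

// More buffers than the default 2 gives the camera room to keep writing
// while the listener still holds the previous frame.
imageReader = ImageReader.newInstance(640, 480, ImageFormat.YUV_420_888, 6);
imageReader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image img = reader.acquireLatestImage(); // jump straight to the newest frame
        if (img == null) return;
        try {
            // ... process the frame ...
        } finally {
            img.close(); // hand the buffer back to the camera as soon as possible
        }
    }
}, backgroundHandler);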
