Crop image from a YV12 or NV12 byte array


While implementing Camera.PreviewCallback I receive the raw image (in YV12 or NV12 format) as a byte array. I am looking for a way to crop a part of that image without converting it to a Bitmap. The cropped part of the image will then be streamed elsewhere, again as a byte array.

Thanks for any help.

public class CameraAccess implements Camera.PreviewCallback, LoaderCallbackInterface {

    private byte[] lastFrame;

    @Override
    public void onPreviewFrame(byte[] frame, Camera arg1) {
        synchronized (this) {
            this.lastFrame = frame;
        }
    }

    public byte[] cropFrame(int x, int y, int width, int height) {
        synchronized (this) {
            // how to crop directly from the byte array?
            return null; // placeholder
        }
    }
}

2 Answers


An image stored as a byte array is simply every pixel of the image laid out in one huge array. It starts at the top-left pixel and moves right to the end of the row, then continues on the next row (back at the left).

So to crop it you just need a couple of for loops that copy the pixels you want into a new byte array:

Rect cropArea = ... // the area to crop
int currentPos = 0;
byte[] croppedOutput = new byte[cropArea.width() * cropArea.height()];
for (int y = 0; y < height; y++) {      // width/height: full frame size
  for (int x = 0; x < width; x++) {
      // here you check whether x and y are within the crop area you want
    if (cropArea.contains(x, y)) {
       croppedOutput[currentPos++] = frame[positionInArrayForXY(x, y)];
    }
  }
}

Inside the positionInArrayForXY method some extra math is needed; essentially the index is y * width + x, taking care with the zero-based values and so on.
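For the luma (Y) plane, a minimal sketch of that helper could look like this (positionInArrayForXY is the hypothetical name from the pseudo-code above; frameWidth is assumed to be the preview width reported by the camera):

```java
public class YuvIndex {
    // Hypothetical helper from the pseudo-code above: maps an (x, y)
    // pixel position to an index within the luma (Y) plane, which is
    // stored row by row, frameWidth bytes per row.
    public static int positionInArrayForXY(int x, int y, int frameWidth) {
        return y * frameWidth + x;
    }

    public static void main(String[] args) {
        // third pixel of the third row in a 1280-wide frame
        System.out.println(positionInArrayForXY(2, 2, 1280)); // prints 2562
    }
}
```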

P.S.: I'm assuming the frame has 1 byte per pixel, but I'm not sure; if it's 2 bytes per pixel some extra math is needed. The idea stays the same though, and you can develop it further from here.

Edit:

To answer your comment:

No, this thing has no header, it really is just the pixels. That's why it always gives you the camera info, so that you can know the size.

And of course my answer as given doesn't quite fit, because I expected YUV to follow the same array ordering as RGB.

I did some extra research, and here you can see a method that performs a YUV-to-RGB conversion. If you look at it closely you'll notice it uses 12 bits per pixel, i.e. 1.5 bytes => 921600 * 1.5 = 1382400.
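Assuming the YUV 4:2:0 layout (12 bits per pixel) discussed above, the expected preview-buffer length can be checked with a one-liner:

```java
public class FrameSize {
    // NV12/YV12 are YUV 4:2:0 formats: 1 byte of Y per pixel plus
    // chroma subsampled 2x2, i.e. 0.5 byte per pixel => 1.5 bytes/pixel.
    public static int bufferSize(int width, int height) {
        return width * height * 3 / 2;
    }

    public static void main(String[] args) {
        // 1280x720 => 921600 pixels => 1382400 bytes
        System.out.println(bufferSize(1280, 720)); // prints 1382400
    }
}
```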

So I can think of a few ways to solve it:

  • (Easiest to implement) Convert the frame to RGB (I know you specified you don't want that, but it makes things much easier), crop it following my answer, then stream it.
  • (Biggest overhead, not very easy) If the receiver of the stream must get it as YUV, do the above, but convert the cropped part back to YUV before streaming, reversing the linked method.
  • (Very tricky to implement, but solves your original question) Based on my example code, the link I posted and the fact that it uses 12 bits per pixel, work out the two for loops that do the crop.
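For the third option, a rough sketch of what such a direct crop could look like for NV12 (Y plane followed by one interleaved UV plane). This is my own illustration, not code from the question, and it assumes the crop offsets and sizes are even so the 2x2 chroma blocks stay aligned:

```java
public class Nv12Crop {
    // Crops a region directly from an NV12 frame without RGB conversion.
    // Assumes x, y, cropW, cropH are all even (chroma alignment).
    public static byte[] crop(byte[] nv12, int srcW, int srcH,
                              int x, int y, int cropW, int cropH) {
        byte[] out = new byte[cropW * cropH * 3 / 2];
        // 1) Y plane: one byte per pixel, row-major.
        for (int row = 0; row < cropH; row++) {
            System.arraycopy(nv12, (y + row) * srcW + x,
                             out, row * cropW, cropW);
        }
        // 2) UV plane: starts at srcW*srcH; one interleaved UV byte pair
        //    per 2x2 pixel block, so srcH/2 rows of srcW bytes each.
        int srcUv = srcW * srcH;
        int dstUv = cropW * cropH;
        for (int row = 0; row < cropH / 2; row++) {
            System.arraycopy(nv12, srcUv + (y / 2 + row) * srcW + x,
                             out, dstUv + row * cropW, cropW);
        }
        return out;
    }

    public static void main(String[] args) {
        byte[] frame = new byte[4 * 4 * 3 / 2]; // tiny 4x4 NV12 frame
        for (int i = 0; i < frame.length; i++) frame[i] = (byte) i;
        byte[] cropped = crop(frame, 4, 4, 2, 0, 2, 2);
        System.out.println(cropped.length); // prints 6
    }
}
```

Unlike the loop in the answer above, this iterates over the crop area only, which addresses the performance concern raised in the comments.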

But usually a header should contain some information about the image format, right? Anyway, my preview is 1280x720, which should result in an array length of 921600 bytes. But the preview frame has a length of 1382399 bytes. That doesn't match your solution, does it? - Matthias
Thanks for the advanced research, great information. I will try to crop directly from the YUV byte array. I want to avoid the RGB conversion on the sending side because it uses too much CPU. - Matthias
Works fine, but I have some concerns about its performance, because it looks at every pixel of the original image. If the start and limit of the for loops were computed, both loops could iterate over the crop area only. - Matthias
I fully agree with you. If you've already figured out how to map an XY position into the YUV array, the next optimization is to iterate over that area only. It adds complexity to the code, but certainly makes for a better final solution. - Budius
Hi @Matthias, can you tell us which of the three options above you went with? If it's the third one, could you please share some code snippets? - Mahendra Chhimwal
Hi @MahendraChhimwal, that's a tricky request since my code dates from 2012 and 2014 and I have no public Github repository. Anyway, I answered my own question with a new answer. You should find whatever you need there. Hope that helps. - Matthias


I was asked for my final solution and some source code. So here is what I did.

Scenario: My project is based on a SoC running Android. I implemented camera handling for a local camera attached to the board, which works like the camera on an Android smartphone. The second one is an IP-based camera that streams its images over the network. That is why the software design might look a bit confusing. Feel free to ask.

Solution: Since OpenCV handling, camera initialization, and color and bitmap conversions are a tricky business, I ended up encapsulating everything into two classes, which saved me from writing the same silly code over and over in my Android projects.

The first class handles the color/bitmap and OpenCV matrix conversions. It is defined as:

import org.opencv.android.CameraBridgeViewBase.CvCameraViewFrame;
import org.opencv.core.Mat;    
import android.graphics.Bitmap;

public interface CameraFrame extends CvCameraViewFrame {
    Bitmap toBitmap();

    @Override
    Mat rgba();

    @Override
    Mat gray();
}

All color and bitmap conversions are done inside the implementation of this interface. The actual conversion is done with the tools that ship with OpenCV for Android. You will notice that I use only a single Bitmap. That is to save resources, since the bitmap conversion is CPU-heavy. All UI components display/render this one bitmap, and the conversion only happens when some component actually requests the bitmap.

private class CameraAccessFrame implements CameraFrame {
    private Mat mYuvFrameData;
    private Mat mRgba;
    private int mWidth;
    private int mHeight;
    private Bitmap mCachedBitmap;
    private boolean mRgbaConverted;
    private boolean mBitmapConverted;

    @Override
    public Mat gray() {
        return mYuvFrameData.submat(0, mHeight, 0, mWidth);
    }

    @Override
    public Mat rgba() {
        if (!mRgbaConverted) {
            Imgproc.cvtColor(mYuvFrameData, mRgba,
                    Imgproc.COLOR_YUV2BGR_NV12, 4);
            mRgbaConverted = true;
        }
        return mRgba;
    }

    // @Override
    // public Mat yuv() {
    // return mYuvFrameData;
    // }

    @Override
    public synchronized Bitmap toBitmap() {
        if (mBitmapConverted)
            return mCachedBitmap;

        Mat rgba = this.rgba();
        Utils.matToBitmap(rgba, mCachedBitmap);

        mBitmapConverted = true;
        return mCachedBitmap;
    }

    public CameraAccessFrame(Mat Yuv420sp, int width, int height) {
        super();
        mWidth = width;
        mHeight = height;
        mYuvFrameData = Yuv420sp;
        mRgba = new Mat();

        this.mCachedBitmap = Bitmap.createBitmap(width, height,
                Bitmap.Config.ARGB_8888);
    }

    public synchronized void put(byte[] frame) {
        mYuvFrameData.put(0, 0, frame);
        invalidate();
    }

    public void release() {
        mRgba.release();
        mCachedBitmap.recycle();
    }

    public void invalidate() {
        mRgbaConverted = false;
        mBitmapConverted = false;
    }
};

The camera handling is encapsulated in two specialized classes explained below. One of them (HardwareCamera implements ICamera) handles camera initialization and shutdown, while the second (CameraAccess) handles OpenCV initialization and notifies other components (CameraCanvasView extends CanvasView implements CameraFrameCallback) that are interested in receiving camera images and displaying them in an Android view (UI). Such components must attach themselves to (register with) that class.
The callback (implemented by any UI component) is defined as follows:
public interface CameraFrameCallback {
    void onCameraInitialized(int frameWidth, int frameHeight);

    void onFrameReceived(CameraFrame frame);

    void onCameraReleased();
}

A UI component implements this interface like this:
import android.content.Context;
import android.util.AttributeSet;
import android.view.SurfaceHolder;
// CameraFrameCallback is the interface nested in CameraAccess (see below)

public class CameraCanvasView extends CanvasView implements CameraFrameCallback {

    private CameraAccess mCamera;
    private int cameraWidth = -1;
    private int cameraHeight = -1;
    private boolean automaticReceive;
    private boolean acceptNextFrame;

    public CameraCanvasView(Context context, AttributeSet attributeSet) {
        super(context, attributeSet);
    }

    public CameraAccess getCamera() {
        return mCamera;
    }

    public boolean getAcceptNextFrame() {
        return acceptNextFrame;
    }

    public void setAcceptNextFrame(boolean value) {
        this.acceptNextFrame = value;
    }

    public void setCamera(CameraAccess camera, boolean automaticReceive) {
        if (camera == null)
            throw new NullPointerException("camera");

        this.mCamera = camera;
        this.mCamera.setAutomaticReceive(automaticReceive);
        this.automaticReceive = automaticReceive;
    }

    @Override
    public void onCameraInitialized(int frameWidth, int frameHeight) {
        cameraWidth = frameWidth;
        cameraHeight = frameHeight;

        setCameraBounds();
    }

    public void setCameraBounds() {

        int width = 0;
        int height = 0;
        if (fixedWidth > 0 && fixedHeight > 0) {
            width = fixedWidth;
            height = fixedHeight;
        } else if (cameraWidth > 0 && cameraHeight > 0) {
            width = cameraWidth;
            height = cameraHeight;
        }

        if (width > 0 && height > 0)
            super.setCameraBounds(width, height, true);
    }

    @Override
    public void onFrameReceived(CameraFrame frame) {
        if (acceptNextFrame || automaticReceive)
            super.setBackground(frame);

        // reset
        acceptNextFrame = false;
    }

    @Override
    public void onCameraReleased() {

        setBackgroundImage(null);
    }

    @Override
    public void surfaceCreated(SurfaceHolder arg0) {
        super.surfaceCreated(arg0);

        if (mCamera != null) {
            mCamera.addCallback(this);

            if (!automaticReceive)
                mCamera.receive(); // we want to get the initial frame
        }
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder arg0) {
        super.surfaceDestroyed(arg0);

        if (mCamera != null)
            mCamera.removeCallback(this);
    }
}

That UI component can be used in an XML layout like this:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical" >

    <eu.level12.graphics.laser.CameraCanvasView
        android:id="@+id/my_camera_view"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        />

</LinearLayout>

The underlying CanvasView, responsible for drawing the camera image/bitmap onto the Android UI surface, is another tricky piece and is therefore encapsulated as well. I'm sorry I can't add the complete solution here, as it would simply be too much code.

Anyway, back to the camera handling. The link between the UI components and the camera is made by the CameraAccess class, which also loads OpenCV when the application starts.

import java.util.ArrayList;
import java.util.List;

import org.opencv.android.InstallCallbackInterface;
import org.opencv.android.LoaderCallbackInterface;
import org.opencv.android.OpenCVLoader;

import android.content.Context;
import android.content.SharedPreferences;
import android.content.SharedPreferences.OnSharedPreferenceChangeListener;
import android.graphics.Rect;
import android.preference.PreferenceManager;
import android.text.TextUtils;
import android.util.Log;

public final class CameraAccess implements OnSharedPreferenceChangeListener,
        LoaderCallbackInterface {

    public static final int CAMERA_INDEX_IP = Integer.MAX_VALUE;
    private static final int CAM_NONE = -1;
    private static final int CAM_DEFAULT = 0;
    private static final String DEFAULT_IP = "127.0.0.1";

    // see http://developer.android.com/guide/topics/media/camera.html for more
    // details

    private final static String TAG = "CameraAccess";
    private Context context;
    private int cameraIndex;
    private String cameraURI;
    private List<CameraFrameCallback> mCallbacks = new ArrayList<CameraFrameCallback>();
    private List<IOpenCVLoadedCallback> mLoadedCallbacks = new ArrayList<IOpenCVLoadedCallback>();
    private SharedPreferences preferences;
    private ICamera camera;
    private int mFrameWidth;
    private int mFrameHeight;
    private boolean mOpenCVloaded;
    private boolean isFixed;
    private boolean isDirty;
    private final Rect roi = new Rect();
    private final ManualResetEvent automaticReceive = new ManualResetEvent(true);
    private final AutoResetEvent doReceive = new AutoResetEvent(true);

    private static CameraAccess mInstance;

    public static CameraAccess getInstance(Context context) {

        if (mInstance != null) {
            if (mInstance.isDirty) {
                if (!mInstance.isFixed) {
                    mInstance.releaseCamera();
                    mInstance.connectCamera();
                }

                mInstance.isDirty = false;
            }

            return mInstance;
        }

        mInstance = new CameraAccess(context);

        mInstance.isFixed = false;
        mInstance.connectCamera();

        return mInstance;
    }

    public static CameraAccess getIPCamera(Context context, String uri) {
        if (mInstance != null
                && Utils.as(NetworkCamera.class, mInstance) == null)
            throw new IllegalStateException(
                    "Camera already initialized as non-network/IP.");

        if (mInstance != null)
            return mInstance;

        mInstance = new CameraAccess(context);
        mInstance.connectIPCamera(uri);
        mInstance.isFixed = true;

        return mInstance;
    }

    private CameraAccess(Context context) {

        this.context = context;
        this.preferences = PreferenceManager
                .getDefaultSharedPreferences(context);
        this.preferences.registerOnSharedPreferenceChangeListener(this);
        this.cameraIndex = getCameraIndex();

        if (!OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_2_4_7, context,
                this)) {
            Log.e(TAG, "Cannot connect to OpenCVManager");
        } else
            Log.d(TAG, "OpenCVManager successfully connected");
    }

    public Context getContext() {
        return context;
    }

    public boolean isOpenCVLoaded() {
        return mOpenCVloaded;
    }

    @Override
    public void onManagerConnected(int status) {
        mOpenCVloaded = true;

        notifyOpenCVLoadedCallbacks();

        if (mCallbacks.size() > 0 && camera != null)
            camera.connect();
    }

    @Override
    public void onPackageInstall(int operation,
            InstallCallbackInterface callback) {
    }

    @Override
    public void onSharedPreferenceChanged(SharedPreferences sharedPreferences,
            String key) {

        String cameraSelectKey = context
                .getString(R.string.settings_select_camera_key);
        String cameraIPKey = context
                .getString(R.string.settings_camera_ip_address_key);

        if (key.equals(cameraIPKey) || key.equals(cameraSelectKey)) {
            this.preferences = sharedPreferences;
            this.cameraIndex = getCameraIndex();

            this.isDirty = true;
        }

    }

    private int getCameraIndex() {
        if (preferences == null || context == null)
            return CAM_NONE;

        String index = preferences.getString(
                context.getString(R.string.settings_select_camera_key), ""
                        + CAM_DEFAULT);

        this.cameraURI = preferences.getString(
                context.getString(R.string.settings_camera_ip_address_key),
                DEFAULT_IP);

        int intIndex;
        try {
            intIndex = Integer.parseInt(index);
            return intIndex;
        } catch (NumberFormatException ex) {
            Log.e(TAG, "Could not parse camera index: " + ex.getMessage());
            return CAM_NONE;
        }
    }

    public synchronized void addCallback(CameraFrameCallback callback) {

        if (callback == null) {
            Log.e(TAG, "Camera frame callback not added because it is null.");
            return;
        }

        // we don't care if the callback is already in the list
        this.mCallbacks.add(callback);

        Log.d(TAG, String.format("Camera frame callback added: %s (count: %d)",
                callback.getClass().getName(), this.mCallbacks.size()));

        if (camera != null) {
            if (camera.isConnected())
                callback.onCameraInitialized(mFrameWidth, mFrameHeight);
            else
                camera.connect();
        }
    }

    public synchronized void removeCallback(CameraFrameCallback callback) {

        // the method itself is synchronized, no extra lock block is needed
        if (callback == null) {
            Log.e(TAG,
                    "Camera frame callback not removed because it is null.");
            return;
        }

        boolean removed = false;
        do {
            // someone might have added the callback multiple times
            removed = this.mCallbacks.remove(callback);

            if (removed) {
                callback.onCameraReleased();

                Log.d(TAG, String.format(
                        "Camera frame callback removed: %s (count: %d)",
                        callback.getClass().getName(),
                        this.mCallbacks.size()));
            }

        } while (removed);

        if (mCallbacks.size() == 0)
            releaseCamera();
    }

    public synchronized void addOpenCVLoadedCallback(
            IOpenCVLoadedCallback callback) {

        if (callback == null) {
            return;
        }

        if (mOpenCVloaded) {
            callback.onOpenCVLoaded();
            return;
        }

        // we don't care if the callback is already in the list
        this.mLoadedCallbacks.add(callback);
    }

    // private synchronized void removeOpenCvCallback(
    // IOpenCVLoadedCallback callback) {
    //
    // if (callback == null)
    // return;
    //
    // boolean removed = false;
    // do {
    // // someone might have added the callback multiple times
    // removed = this.mLoadedCallbacks.remove(callback);
    //
    // } while (removed == true);
    // }

    private synchronized void notifyOpenCVLoadedCallbacks() {
        if (!mOpenCVloaded)
            return;

        for (IOpenCVLoadedCallback callback : mLoadedCallbacks)
            callback.onOpenCVLoaded();

        mLoadedCallbacks.clear();
    }

    public boolean isAutomaticReceive() {
        return automaticReceive.isSet();
    }

    public void setAutomaticReceive(boolean automatic) {
        if (automatic)
            automaticReceive.set();
        else
            automaticReceive.reset();
    }

    public boolean hasRegionOfInterest() {
        return !this.roi.isEmpty() && camera != null
                && camera.supportsRegionOfInterest();
    }

    public Rect getRegionOfInterest() {
        return this.roi;
    }

    public void setRegionOfInterest(Rect roi) {
        if (roi == null)
            this.roi.set(0, 0, 0, 0);
        else
            this.roi.set(roi);
    }

    public void receive() {
        doReceive.set();
    }

    public boolean waitForReceive(long milliseconds) {
        try {
            return doReceive.waitOne(milliseconds);
        } catch (InterruptedException e) {
            return false;
        }
    }

    private void connectCamera() {
        Log.d(TAG, "connect to camera " + cameraIndex);
        if (cameraIndex == CAMERA_INDEX_IP) {
            connectIPCamera(null);
        } else {
            connectLocalCamera();
        }
    }

    private void connectLocalCamera() {
        camera = new HardwareCamera(context, this, cameraIndex);
    }

    private void connectIPCamera(String uri) {

        if (TextUtils.isEmpty(uri))
            uri = cameraURI;

        if (TextUtils.isEmpty(uri))
            throw new NullPointerException(
                    "No URI (IP) for the remote network camera specified.");

        // camera = new NetworkCameraOpenCV(this, uri);
        camera = new NetworkCameraCached(this, uri);
        // camera = new NetworkCamera(this, uri);
        Log.d(TAG, "Connected to network camera: " + uri);
    }

    private synchronized void releaseCamera() {

        if (camera != null) {
            camera.release();

            for (CameraFrameCallback callback : mCallbacks)
                callback.onCameraReleased();
        }
    }

    public synchronized void onPreviewFrame(CameraFrame frame) {
        for (CameraFrameCallback callback : mCallbacks) {
            callback.onFrameReceived(frame);
        }
    }

    public synchronized void onCameraInitialized(int width, int height) {
        this.mFrameWidth = width;
        this.mFrameHeight = height;

        for (CameraFrameCallback callback : mCallbacks) {
            callback.onCameraInitialized(width, height);
        }
    }

    public interface CameraFrameCallback {
        void onCameraInitialized(int frameWidth, int frameHeight);

        void onFrameReceived(CameraFrame frame);

        void onCameraReleased();
    }

    public interface IOpenCVLoadedCallback {
        void onOpenCVLoaded();
    }

    public interface ICamera {

        boolean supportsRegionOfInterest();

        void connect();

        void release();

        boolean isConnected();
    }
}

The class implementing the locally attached camera (which works the same way on an Android smartphone) is HardwareCamera. The member `user` can be seen as the image consumer sitting between the camera and all the UI components.
import java.io.IOException;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;

import org.opencv.android.Utils;
import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.imgproc.Imgproc;

import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.ImageFormat;
import android.graphics.SurfaceTexture;
import android.hardware.Camera;
import android.hardware.Camera.Parameters;
import android.hardware.Camera.Size;
import android.os.Handler;
import android.os.HandlerThread;
import android.util.Log;

public class HardwareCamera implements CameraAccess.ICamera,
        Camera.PreviewCallback {

    // see http://developer.android.com/guide/topics/media/camera.html for more
    // details

    private static final boolean USE_THREAD = true;

    private final static String TAG = "HardwareCamera";
    // private final Context context;
    private final int cameraIndex; // example: CameraInfo.CAMERA_FACING_FRONT or
                                    // -1 for
    // IP_CAM
    private final CameraAccess user;
    private Camera mCamera;
    private int mFrameWidth;
    private int mFrameHeight;
    private CameraAccessFrame mCameraFrame;
    private CameraHandlerThread mThread = null;
    private SurfaceTexture texture = new SurfaceTexture(0);

    // needed to avoid OpenCV error:
    // "queueBuffer: BufferQueue has been abandoned!"
    private byte[] mBuffer;

    public HardwareCamera(Context context, CameraAccess user, int cameraIndex) {
        // this.context = context;
        this.cameraIndex = cameraIndex;
        this.user = user;
    }

    // private boolean checkCameraHardware() {
    // if (context.getPackageManager().hasSystemFeature(
    // PackageManager.FEATURE_CAMERA)) {
    // // this device has a camera
    // return true;
    // } else {
    // // no camera on this device
    // return false;
    // }
    // }

    public static Camera getCameraInstance(int facing) {

        Camera c = null;
        Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
        int cameraCount = Camera.getNumberOfCameras();
        int index = -1;

        for (int camIdx = 0; camIdx < cameraCount; camIdx++) {
            Camera.getCameraInfo(camIdx, cameraInfo);
            if (cameraInfo.facing == facing) {
                try {
                    c = Camera.open(camIdx);
                    index = camIdx;
                    break;
                } catch (RuntimeException e) {
                    Log.e(TAG,
                            String.format(
                                    "Camera is not available (in use or does not exist). Facing: %s Index: %s Error: %s",
                                    facing, camIdx, e.getMessage()));

                    continue;
                }
            }
        }

        if (c != null)
            Log.d(TAG, String.format("Camera opened. Facing: %s Index: %s",
                    facing, index));
        else
            Log.e(TAG, "Could not find any camera matching facing: " + facing);

        // returns null if camera is unavailable
        return c;
    }

    private synchronized void connectLocalCamera() {
        if (!user.isOpenCVLoaded())
            return;

        if (USE_THREAD) {
            if (mThread == null) {
                mThread = new CameraHandlerThread(this);
            }

            synchronized (mThread) {
                mThread.openCamera();
            }
        } else {
            oldConnectCamera();
        }

        user.onCameraInitialized(mFrameWidth, mFrameHeight);
    }

    private/* synchronized */void oldConnectCamera() {
        // synchronized (this) {
        if (true) {// checkCameraHardware()) {
            mCamera = getCameraInstance(cameraIndex);
            if (mCamera == null)
                return;

            Parameters params = mCamera.getParameters();
            List<Camera.Size> sizes = params.getSupportedPreviewSizes();

            // Camera.Size previewSize = sizes.get(0);
            Collections.sort(sizes, new PreviewSizeComparer());
            Camera.Size previewSize = null;
            for (Camera.Size s : sizes) {
                if (s == null)
                    break;

                previewSize = s;
            }

            // List<Integer> formats = params.getSupportedPictureFormats();
            // params.setPreviewFormat(ImageFormat.NV21);

            params.setPreviewSize(previewSize.width, previewSize.height);
            mCamera.setParameters(params);

            params = mCamera.getParameters();

            mFrameWidth = params.getPreviewSize().width;
            mFrameHeight = params.getPreviewSize().height;

            int size = mFrameWidth * mFrameHeight;
            size = size
                    * ImageFormat.getBitsPerPixel(params.getPreviewFormat())
                    / 8;

            this.mBuffer = new byte[size];
            Log.d(TAG, "Created callback buffer of size (bytes): " + size);

            Mat mFrame = new Mat(mFrameHeight + (mFrameHeight / 2),
                    mFrameWidth, CvType.CV_8UC1);
            mCameraFrame = new CameraAccessFrame(mFrame, mFrameWidth,
                    mFrameHeight);

            if (this.texture != null)
                this.texture.release();

            this.texture = new SurfaceTexture(0);

            try {
                mCamera.setPreviewTexture(texture);
                mCamera.addCallbackBuffer(mBuffer);
                mCamera.setPreviewCallbackWithBuffer(this);
                mCamera.startPreview();

                Log.d(TAG,
                        String.format(
                                "Camera preview started with %sx%s. Rendering to SurfaceTexture dummy while receiving preview frames.",
                                mFrameWidth, mFrameHeight));
            } catch (Exception e) {
                Log.d(TAG, "Error starting camera preview: " + e.getMessage());
            }
        }
        // }
    }

    @Override
    public synchronized void onPreviewFrame(byte[] frame, Camera arg1) {
        mCameraFrame.put(frame);

        if (user.isAutomaticReceive() || user.waitForReceive(500))
            user.onPreviewFrame(mCameraFrame);

        if (mCamera != null)
            mCamera.addCallbackBuffer(mBuffer);
    }

    private class CameraAccessFrame implements CameraFrame {
        private Mat mYuvFrameData;
        private Mat mRgba;
        private int mWidth;
        private int mHeight;
        private Bitmap mCachedBitmap;
        private boolean mRgbaConverted;
        private boolean mBitmapConverted;

        @Override
        public Mat gray() {
            return mYuvFrameData.submat(0, mHeight, 0, mWidth);
        }

        @Override
        public Mat rgba() {
            if (!mRgbaConverted) {
                Imgproc.cvtColor(mYuvFrameData, mRgba,
                        Imgproc.COLOR_YUV2BGR_NV12, 4);
                mRgbaConverted = true;
            }
            return mRgba;
        }

        // @Override
        // public Mat yuv() {
        // return mYuvFrameData;
        // }

        @Override
        public synchronized Bitmap toBitmap() {
            if (mBitmapConverted)
                return mCachedBitmap;

            Mat rgba = this.rgba();
            Utils.matToBitmap(rgba, mCachedBitmap);

            mBitmapConverted = true;
            return mCachedBitmap;
        }

        public CameraAccessFrame(Mat Yuv420sp, int width, int height) {
            super();
            mWidth = width;
            mHeight = height;
            mYuvFrameData = Yuv420sp;
            mRgba = new Mat();

            this.mCachedBitmap = Bitmap.createBitmap(width, height,
                    Bitmap.Config.ARGB_8888);
        }

        public synchronized void put(byte[] frame) {
            mYuvFrameData.put(0, 0, frame);
            invalidate();
        }

        public void release() {
            mRgba.release();
            mCachedBitmap.recycle();
        }

        public void invalidate() {
            mRgbaConverted = false;
            mBitmapConverted = false;
        }
    };

    private class PreviewSizeComparer implements Comparator<Camera.Size> {
        @Override
        public int compare(Size arg0, Size arg1) {
            if (arg0 != null && arg1 == null)
                return -1;
            if (arg0 == null && arg1 != null)
                return 1;

            if (arg0.width < arg1.width)
                return -1;
            else if (arg0.width > arg1.width)
                return 1;
            else
                return 0;
        }
    }

    private static class CameraHandlerThread extends HandlerThread {
        Handler mHandler;
        HardwareCamera owner;

        CameraHandlerThread(HardwareCamera owner) {
            super("CameraHandlerThread");

            this.owner = owner;

            start();
            mHandler = new Handler(getLooper());
        }

        synchronized void notifyCameraOpened() {
            notify();
        }

        void openCamera() {
            mHandler.post(new Runnable() {
                @Override
                public void run() {
                    owner.oldConnectCamera();
                    notifyCameraOpened();
                }
            });

            try {
                wait();
            } catch (InterruptedException e) {
                Log.w(TAG, "wait was interrupted");
            }
        }
    }

    @Override
    public boolean supportsRegionOfInterest() {
        return false;
    }

    @Override
    public void connect() {
        connectLocalCamera();
    }

    @Override
    public void release() {
        synchronized (this) {

            if (USE_THREAD) {
                if (mThread != null) {
                    mThread.interrupt();
                    mThread = null;
                }
            }

            if (mCamera != null) {
                mCamera.stopPreview();
                mCamera.setPreviewCallback(null);
                try {
                    mCamera.setPreviewTexture(null);
                } catch (IOException e) {
                    Log.e(TAG, "Could not release preview-texture from camera.");
                }

                mCamera.release();

                Log.d(TAG, "Preview stopped and camera released");
            }
            mCamera = null;

            if (mCameraFrame != null) {
                mCameraFrame.release();
            }

            if (texture != null)
                texture.release();
        }
    }

    @Override
    public boolean isConnected() {
        return mCamera != null;
    }
}

The final step is to link things together. This is done in the onResume method of your activity implementation.
@Override
protected void onResume() {
    super.onResume();

    if (fourPointView != null) {
        cameraAccess = CameraAccess.getInstance(this);
        canvasView.setCamera(cameraAccess, true);
    } else {
        cameraAccess = null;
    }

    if (cameraAccess != null)
        cameraAccess.setAutomaticReceive(true);

    if (cameraAccess != null && fourPointView != null)
        cameraAccess.setRegionOfInterest(RectTools.toRect(canvasView
                .getCamera().getViewport()));
}

@Override
protected void onPause() {
    super.onPause();

    if (cameraAccess != null)
        cameraAccess.setRegionOfInterest(null);
}
Note: I know this is not the complete implementation, but I hope you get the idea. The most interesting part is the color conversion, which is found at the top of this post.

Hi Matthias, did you ever find a way to do this without having to import the massive OpenCV files? - Rafael Sanches
That was too long ago. But I bet there are solutions that work without OpenCV. You could also look into the source code of the OpenCV functions I use and just take the code you need. - Matthias
