OpenCV image processing in an Android Service


My Android app does image processing with OpenCV for Android inside an Activity that hosts a JavaCameraView, and that works fine. Now I want to run the same image processing in the background without showing the user any preview, so I tried using an Android Service.

With the following code I can successfully load OpenCV inside the service:

import org.opencv.android.BaseLoaderCallback;
import org.opencv.android.LoaderCallbackInterface;
import org.opencv.android.OpenCVLoader;

import android.app.Service;
import android.content.Intent;
import android.os.IBinder;
import android.util.Log;

public class CameraService extends Service {

    private static final String TAG = "CameraService";

    private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(CameraService.this) {

        @Override
        public void onManagerConnected(int status) {
            switch (status) {
            case LoaderCallbackInterface.SUCCESS: {
                Log.i(TAG, "OpenCV loaded successfully");
            }
                break;
            default: {
                super.onManagerConnected(status);
            }
                break;
            }
        }
    };

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        if (OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_2_4_4,
                getApplicationContext(), mLoaderCallback)) {
            Log.i(TAG, "Loaded OpenCV");
        } else {
            Log.i(TAG, "Couldn't load OpenCV");
        }
        return super.onStartCommand(intent, flags, startId);
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }
}

But I don't know how to grab frames the way I did in my earlier Activity, e.g. via onCameraFrame(). There I implemented CvCameraViewListener2, but I can't do that in my service because it requires a CameraBridgeViewBase, which I no longer want to display. How can I do the image processing in the background?
Update 2
I added a Runnable to grab frames as you suggested. Loading OpenCV and connecting to the camera now both work, but before any frame is grabbed the app skips frames and aborts because it is doing too much work on the main thread.
Here is my whole camera service:
public final class MyService extends Service {

    private static final String TAG = MyService.class.getSimpleName();
    private boolean mStopThread;
    private Thread mThread;
    private VideoCapture mCamera;
    private int mCameraIndex = -1;

    private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {
        @Override
        public void onManagerConnected(int status) {
            switch (status) {
            case LoaderCallbackInterface.SUCCESS: {
                Log.i(TAG, "OpenCV loaded successfully");

                try {
                    if (!connectCamera(640, 480))
                        Log.e(TAG, "Could not connect camera");
                    else
                        Log.d(TAG, "Camera successfully connected");
                } catch (Exception e) {
                    Log.e(TAG, "MyService.connectCamera throws an exception: " + e.getMessage());
                }
            }
                break;
            default: {
                super.onManagerConnected(status);
            }
                break;
            }
        }
    };

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        if (OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_2_4_4, getApplicationContext(), mLoaderCallback))
            Log.i(TAG, "Loaded OpenCV");
        else
            Log.i(TAG, "Couldn't load OpenCV");
        return super.onStartCommand(intent, flags, startId);
    }

    @Override
    public void onDestroy() {
        this.disconnectCamera();
        Log.d(TAG, "onDestroy");
        super.onDestroy();
    }

    private boolean connectCamera(int width, int height) {
        /* First step - initialize camera connection */
        if (!initializeCamera(width, height)) {
            Log.d(TAG, "initializeCamera failed");
            return false;
        } else {
            Log.d(TAG, "initializeCamera successfully");
            /* start update thread */
            mThread = new Thread(new CameraWorker());
            mThread.start();
            return true;
        }
    }

    private boolean initializeCamera(int width, int height) {
        synchronized (this) {
            if (mCameraIndex == -1)
                mCamera = new VideoCapture(Highgui.CV_CAP_ANDROID);
            else
                mCamera = new VideoCapture(Highgui.CV_CAP_ANDROID + mCameraIndex);

            if (mCamera == null)
                return false;

            if (!mCamera.isOpened())
                return false;

            /* Select the size that fits the surface, considering the maximum size allowed */
            Size frameSize = new Size(width, height);
            mCamera.set(Highgui.CV_CAP_PROP_FRAME_WIDTH, frameSize.width);
            mCamera.set(Highgui.CV_CAP_PROP_FRAME_HEIGHT, frameSize.height);
        }

        return true;
    }

    private void releaseCamera() {
        synchronized (this) {
            if (mCamera != null) {
                mCamera.release();
            }
        }
    }

    private void disconnectCamera() {
        // 1. Stop the thread that is updating the frames
        // 2. Stop the camera and release it
        try {
            mStopThread = true;
            mThread.join();
        } catch (InterruptedException e) {
            e.printStackTrace();
        } finally {
            mThread = null;
            mStopThread = false;
        }
        releaseCamera();
    }

    private class CameraWorker implements Runnable {
        public void run() {
            do {
                if (!mCamera.grab()) {
                    Log.e(TAG, "Camera frame grab failed");
                    break;
                }
                Log.d(TAG, "Camera frame grabbed");
                // img processing
            } while (!mStopThread);
        }
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null; // Not used
    }
}
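As an aside, the stop-flag/join shutdown used in disconnectCamera() only works reliably if the worker thread actually sees the flag change; mStopThread in the code above is not volatile, so that visibility is not guaranteed by the Java memory model. A minimal plain-Java sketch of the pattern, with mCamera.grab() replaced by a hypothetical grabFrame() stub since the real call needs a camera:

```java
// Minimal stand-in for the CameraWorker / disconnectCamera pair above.
// grabFrame() is a hypothetical stub replacing mCamera.grab().
class FrameWorker {
    private volatile boolean stopRequested; // volatile: the worker thread must see the update
    private Thread thread;
    private int framesGrabbed;              // stands in for per-frame processing

    private boolean grabFrame() {           // stub: always "grabs" successfully
        framesGrabbed++;
        return true;
    }

    // mirrors connectCamera(): run the grab loop on a background thread
    void start() {
        thread = new Thread(() -> {
            do {
                if (!grabFrame())
                    break;                  // bail out on a failed grab
            } while (!stopRequested);
        });
        thread.start();
    }

    // mirrors disconnectCamera(): raise the flag, then wait for the loop to exit
    void stop() throws InterruptedException {
        stopRequested = true;
        thread.join();                      // join() also makes framesGrabbed safely readable
        thread = null;
    }

    int getFramesGrabbed() {
        return framesGrabbed;
    }
}
```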

My log:

11-29 12:28:24.370: D/OpenCVManager/Helper(5257): Init finished with status 0
11-29 12:28:24.370: D/OpenCVManager/Helper(5257): Unbind from service
11-29 12:28:24.380: D/OpenCVManager/Helper(5257): Calling using callback
11-29 12:28:24.380: I/(5257): OpenCV loaded successfully
11-29 12:28:24.380: D/OpenCV::camera(5257): CvCapture_Android::CvCapture_Android(0)
11-29 12:28:24.440: D/OpenCV_NativeCamera(5257): Connecting to CameraService v 3D
11-29 12:28:24.670: D/OpenCV_NativeCamera(5257): Instantiated new CameraHandler (0x75e4f29d, 0x71e178b8)
11-29 12:28:24.750: D/OpenCV_NativeCamera(5257): Starting preview
11-29 12:28:25.421: E/OpenCV_NativeCamera(5257): CameraHandler::doCall(void*, size_t): cameraCallback returns false (camera connection will be closed)
11-29 12:28:25.421: E/BufferQueue(5257): [unnamed-5257-0] dequeueBuffer: min undequeued buffer count (2) exceeded (dequeued=11 undequeudCount=0)
11-29 12:28:25.431: E/BufferQueue(5257): [unnamed-5257-0] dequeueBuffer: min undequeued buffer count (2) exceeded (dequeued=10 undequeudCount=1)
11-29 12:28:25.451: D/OpenCV_NativeCamera(5257): Preview started successfully
11-29 12:28:25.451: D/OpenCV_NativeCamera(5257): CameraHandler::setProperty(0, 640.000000)
11-29 12:28:25.451: D/OpenCV_NativeCamera(5257): CameraHandler::setProperty(1, 480.000000)
11-29 12:28:25.451: D/MyService(5257): initializeCamera successfully
11-29 12:28:25.451: D/MyService(5257): Camera successfully connected
11-29 12:28:25.451: I/Choreographer(5257): Skipped 86 frames!  The application may be doing too much work on its main thread.
11-29 12:28:25.471: A/libc(5257): @@@ ABORTING: LIBC: HEAP MEMORY CORRUPTION IN tmalloc_small
11-29 12:28:25.471: A/libc(5257): Fatal signal 11 (SIGSEGV) at 0xdeadbaad (code=1), thread 5257 ()

What is going wrong, and what should I do now?


I always get the error log 11-28 13:19:04.095: E/OpenCV::camera(2931): ||libnative_camera_r2.3.3.so, but it works anyway. Ignore that entry. - user1755546
But then comes D/OpenCV_NativeCamera(2931): Connecting to CameraService v 3D and nothing happens afterwards. The app freezes. - roschulze
This is a native example. How are you getting the frames from the camera? - user1755546
You have this log: @@@ ABORTING: LIBC: HEAP MEMORY CORRUPTION IN tmalloc_small. A Service in Android runs in the next work cycle: if there is a resource problem, Android may stop the service. - user1755546
I am using a thread to grab the frames, so what is going wrong? Can I get the frames some other way? - roschulze
1 Answer

You can do it with a local variable (VideoCapture). CameraBridgeViewBase extends Android.Camera and also cannot run in the background. If you can't find a sample, look at the FaceDetection sample in the OpenCv-2.4.2 Android library.
Update:
You can grab frames from the camera by using the Runnable interface:
private VideoCapture mCamera;

public void run() {
    Log.i(TAG, "Starting processing thread");

    while (true) {
        Bitmap bmp = null;

        synchronized (this) {
            if (mCamera == null)
                break;

            if (!mCamera.grab()) {
                Log.e(TAG, "mCamera.grab() failed");
                break;
            }

            bmp = processFrame(mCamera);
        }

        // work with bmp outside the lock ...
    }
}

