How do I manipulate the camera preview?

39

There are plenty of tutorials explaining how to get a simple camera preview up and running on an Android device. But I couldn't find any example that explains how to manipulate the image before it gets rendered.
What I want to do is implement custom color filters to simulate red and/or green deficiency.


If you could explain with code what you are trying to do, it would be easier to help you. - the100rabh
This is the program I have so far. What I want to do is register some callback (Camera.PreviewCallback?) and use it to manipulate the current frame. - whiskeysierra
The link is broken, but the repository still exists. If anyone is still interested: https://github.com/whiskeysierra/impaired-vision - whiskeysierra
Have you tried GPUImage, as I mentioned in my answer? - jesses.co.tt
3 Answers

56
I did some research into this and put together a working example. Here's what I found out. It's easy to get hold of the raw data coming off the camera; it is returned as a YUV byte array. To be able to modify it, you need to draw it onto a surface manually. To do that you need a SurfaceView that you can manually run draw calls on, and there are a couple of flags you can set to achieve this.
In order to do the draw call manually you have to convert the byte array into a bitmap of some kind. At the moment Bitmap and BitmapDecoder don't seem to handle YUV byte arrays very well. A bug has been filed about this, but I don't know its current status. So people have been trying to decode the byte array into an RGB format themselves.
Doing the decoding by hand seems to be a bit slow, and people have had varying degrees of success with it. Something like this should really be done in native code at the NDK level.
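As an aside (my addition, not part of the original answer): from Froyo (API 8) onward, android.graphics.YuvImage can handle the NV21-to-bitmap conversion by compressing the preview frame to JPEG and decoding it back - convenient rather than fast, since it adds a JPEG round trip per frame. A minimal sketch, assuming the preview format is the default NV21:

import java.io.ByteArrayOutputStream;

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.ImageFormat;
import android.graphics.Rect;
import android.graphics.YuvImage;

// Minimal sketch: convert an NV21 preview frame to a Bitmap via a JPEG
// round trip instead of decoding the YUV data by hand.
public final class PreviewFrameToBitmap {

    public static Bitmap convert(byte[] nv21, int width, int height) {
        YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21, width, height, null);
        ByteArrayOutputStream jpeg = new ByteArrayOutputStream();
        yuv.compressToJpeg(new Rect(0, 0, width, height), 80, jpeg);
        byte[] bytes = jpeg.toByteArray();
        return BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
    }
}

PreviewFrameToBitmap is just an illustrative name; the returned Bitmap can then be drawn onto the locked canvas with an ordinary drawBitmap call.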
Still, it is possible to get it working. Also, my little demo is just me spending a couple of hours hacking things together (I guess this caught my imagination a little too much 😉), so with some tweaking you could greatly improve on what I've managed to get working.
This little code snippet also contains a couple of other gems I found along the way. If all you want is to be able to draw onto the surface, you can override the surface's onDraw function - you could potentially analyze the returned camera image and draw an overlay - that would be much faster than trying to process every frame. Also, I changed SurfaceHolder.SURFACE_TYPE_NORMAL away from what would be needed if you wanted the camera preview to show up, so the code needs a couple of changes - the commented-out code:
//try { mCamera.setPreviewDisplay(holder); } catch (IOException e)
//  { Log.e("Camera", "mCamera.setPreviewDisplay(holder);"); }

and:

SurfaceHolder.SURFACE_TYPE_NORMAL //SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS - for preview to work

should allow you to overlay frames based on the camera preview on top of the real preview.
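As a rough sketch of that overlay idea (my own illustration, not part of the original code): assuming these members replace the onPreviewFrame and onDraw bodies in the class further down, and that the preview size stays at the 240x160 set in surfaceCreated, you can keep the average brightness of the latest frame's Y plane and visualize it from onDraw instead of repainting every pixel.

// Sketch only: in NV21 the first width*height bytes are the Y (luma) plane,
// so a cheap per-frame analysis can drive an overlay drawn in onDraw.
private volatile int mAverageLuma = 0;

public void onPreviewFrame(byte[] data, Camera camera) {
    int pixels = 240 * 160; // must match the preview size set on the camera
    long sum = 0;
    for (int i = 0; i < pixels; i++) {
        sum += data[i] & 0xff;
    }
    mAverageLuma = (int) (sum / pixels);
    postInvalidate(); // schedule an onDraw pass on the UI thread
}

@Override
protected void onDraw(Canvas canvas) {
    // Overlay whose opacity tracks the average scene brightness.
    Paint overlay = new Paint();
    overlay.setARGB(mAverageLuma, 0, 200, 0);
    canvas.drawRect(0, 0, getWidth(), getHeight(), overlay);
}

Because it only draws a translucent rectangle, this avoids converting the whole frame and stays responsive even with the decoding left out.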

Anyway, here is a piece of code that works - it should give you something to start from.

Just put a line like this into one of your views:

<pathtocustomview.MySurfaceView android:id="@+id/surface_camera"
    android:layout_width="fill_parent" android:layout_height="10dip"
    android:layout_weight="1">
</pathtocustomview.MySurfaceView>

And include this class in your source:

package pathtocustomview;

import java.io.IOException;
import java.nio.Buffer;

import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.Canvas;
import android.graphics.Paint;
import android.graphics.Rect;
import android.hardware.Camera;
import android.util.AttributeSet;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceHolder.Callback;
import android.view.SurfaceView;

public class MySurfaceView extends SurfaceView implements Callback,
    Camera.PreviewCallback {

    private SurfaceHolder mHolder;

    private Camera mCamera;
    private boolean isPreviewRunning = false;
    private byte [] rgbbuffer = new byte[256 * 256];
    private int [] rgbints = new int[256 * 256];

    protected final Paint rectanglePaint = new Paint();

    public MySurfaceView(Context context, AttributeSet attrs) {
        super(context, attrs);
        rectanglePaint.setARGB(100, 200, 0, 0);
        rectanglePaint.setStyle(Paint.Style.FILL);
        rectanglePaint.setStrokeWidth(2);

        mHolder = getHolder();
        mHolder.addCallback(this);
        mHolder.setType(SurfaceHolder.SURFACE_TYPE_NORMAL);
    }

    @Override
    protected void onDraw(Canvas canvas) {
        canvas.drawRect(new Rect((int) (Math.random() * 100),
            (int) (Math.random() * 100), 200, 200), rectanglePaint);
        Log.w(this.getClass().getName(), "On Draw Called");
    }

    public void surfaceChanged(SurfaceHolder holder, int format, int width,
            int height) {
    }

    public void surfaceCreated(SurfaceHolder holder) {
        synchronized (this) {
            this.setWillNotDraw(false); // This allows us to make our own draw
                                    // calls to this canvas

            mCamera = Camera.open();

            Camera.Parameters p = mCamera.getParameters();
            p.setPreviewSize(240, 160);
            mCamera.setParameters(p);


            //try { mCamera.setPreviewDisplay(holder); } catch (IOException e)
            //  { Log.e("Camera", "mCamera.setPreviewDisplay(holder);"); }

            mCamera.startPreview();
            mCamera.setPreviewCallback(this);

        }
    }

    public void surfaceDestroyed(SurfaceHolder holder) {
        synchronized (this) {
            try {
                if (mCamera != null) {
                    mCamera.stopPreview();
                    isPreviewRunning = false;
                    mCamera.release();
                }
            } catch (Exception e) {
                Log.e("Camera", e.getMessage());
            }
        }
    }

    public void onPreviewFrame(byte[] data, Camera camera) {
        Log.d("Camera", "Got a camera frame");

        Canvas c = null;

        if(mHolder == null){
            return;
        }

        try {
            synchronized (mHolder) {
                c = mHolder.lockCanvas(null);

                // Do your drawing here
                // So this data value you're getting back is formatted in YUV format and you can't do much
                // with it until you convert it to rgb
                int bwCounter=0;
                int yuvsCounter=0;
                for (int y=0;y<160;y++) {
                    System.arraycopy(data, yuvsCounter, rgbbuffer, bwCounter, 240);
                    yuvsCounter=yuvsCounter+240;
                    bwCounter=bwCounter+256;
                }

                for(int i = 0; i < rgbints.length; i++){
                    rgbints[i] = (int)rgbbuffer[i];
                }

                //decodeYUV(rgbbuffer, data, 100, 100);
                c.drawBitmap(rgbints, 0, 256, 0, 0, 256, 256, false, new Paint());

                Log.d("SOMETHING", "Got Bitmap");

            }
        } finally {
            // do this in a finally so that if an exception is thrown
            // during the above, we don't leave the Surface in an
            // inconsistent state
            if (c != null) {
                mHolder.unlockCanvasAndPost(c);
            }
        }
    }
}

I only get errors when I try your approach. Do you perhaps have a complete working example? Thanks. - goodm
c = mHolder.lockCanvas(null); c is null. 08-05 18:49:23.419: E/SurfaceHolder(13927): Exception locking surface 08-05 18:49:23.419: E/SurfaceHolder(13927): java.lang.IllegalArgumentException 08-05 18:49:23.419: E/SurfaceHolder(13927): at android.view.Surface.nativeLockCanvas(Native Method) 08-05 18:49:23.419: E/SurfaceHolder(13927): at android.view.Surface.lockCanvas(Surface.java:243) 08-05 18:49:23.419: E/SurfaceHolder(13927): at android.view.SurfaceView$4.internalLockCanvas(SurfaceView.java:814) - Ratul Ghosh
Where do you call onPreviewFrame? What is "data"? - Roman Panaget
It looks like the numbers 160 and 240 in the for loop are the image size, but why is 256 used in "bwCounter=bwCounter+256" and in the call to drawBitmap? Surely some of those 256s in drawBitmap should be the width or height, right? - Johannes Brodwall
This is a hard topic to search for; the relevant terms return far too many unrelated results. FX, Live FX, frames... I had almost given up and then found this. I'm not an app developer, but I badly need this in order to add markers to the image before capture (for size and perspective). To stay within Stack Overflow's conventions I'll do my best to phrase it as a question: is there a working example or app that demonstrates the layer/mask frame functionality in the code above? - zzipper72

10
I used walta's solution, but I ran into some problems with the YUV conversion, the size of the camera frame output, and releasing the camera.
In the end the following code worked for me:
public class MySurfaceView extends SurfaceView implements Callback, Camera.PreviewCallback {

private static final String TAG = "MySurfaceView";

private int width;
private int height;

private SurfaceHolder mHolder;

private Camera mCamera;
private int[] rgbints;

private boolean isPreviewRunning = false; 

private int mMultiplyColor;

public MySurfaceView(Context context, AttributeSet attrs) {
    super(context, attrs);

    mHolder = getHolder();
    mHolder.addCallback(this);
    mMultiplyColor = getResources().getColor(R.color.multiply_color);
}

// @Override
// protected void onDraw(Canvas canvas) {
// Log.w(this.getClass().getName(), "On Draw Called");
// }

@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {

}

@Override
public void surfaceCreated(SurfaceHolder holder) {
    synchronized (this) {
        if (isPreviewRunning)
            return;

        this.setWillNotDraw(false); // This allows us to make our own draw calls to this canvas


        mCamera = Camera.open();
        isPreviewRunning = true;
        Camera.Parameters p = mCamera.getParameters();
        Size size = p.getPreviewSize();
        width = size.width;
        height = size.height;
        p.setPreviewFormat(ImageFormat.NV21);
        showSupportedCameraFormats(p);
        mCamera.setParameters(p);

        rgbints = new int[width * height];

        // try { mCamera.setPreviewDisplay(holder); } catch (IOException e)
        // { Log.e("Camera", "mCamera.setPreviewDisplay(holder);"); }

        mCamera.startPreview();
        mCamera.setPreviewCallback(this);

    }
}


@Override
public void surfaceDestroyed(SurfaceHolder holder) {
    synchronized (this) {
        try {
            if (mCamera != null) {
                //mHolder.removeCallback(this);
                mCamera.setPreviewCallback(null);
                mCamera.stopPreview();
                isPreviewRunning  = false;
                mCamera.release();
            }
        } catch (Exception e) {
            Log.e("Camera", e.getMessage());
        }
    }
}

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    // Log.d("Camera", "Got a camera frame");
    if (!isPreviewRunning)
        return;

    Canvas canvas = null;

    if (mHolder == null) {
        return;
    }

    try {
        synchronized (mHolder) {
            canvas = mHolder.lockCanvas(null);
            int canvasWidth = canvas.getWidth();
            int canvasHeight = canvas.getHeight();

            decodeYUV(rgbints, data, width, height);

            // draw the decoded image, centered on canvas
            canvas.drawBitmap(rgbints, 0, width, canvasWidth-((width+canvasWidth)>>1), canvasHeight-((height+canvasHeight)>>1), width, height, false, null);

            // use some color filter
            canvas.drawColor(mMultiplyColor, Mode.MULTIPLY);

        }
    }  catch (Exception e){
        e.printStackTrace();
    } finally {
        // do this in a finally so that if an exception is thrown
        // during the above, we don't leave the Surface in an
        // inconsistent state
        if (canvas != null) {
            mHolder.unlockCanvasAndPost(canvas);
        }
    }
}



/**
 * Decodes a YUV frame into a buffer that can be used to create a bitmap.
 * Use this on OS versions below FROYO, which lack a native YUV decoder.
 * Decodes the Y, U and V values of the YUV 420 buffer described as
 * YCbCr_422_SP by Android.
 * 
 * @param out
 *            the outgoing array of RGB pixels
 * @param fg
 *            the incoming frame bytes
 * @param width
 *            of source frame
 * @param height
 *            of source frame
 * @throws NullPointerException
 * @throws IllegalArgumentException
 */
public void decodeYUV(int[] out, byte[] fg, int width, int height) throws NullPointerException, IllegalArgumentException {
    int sz = width * height;
    if (out == null)
        throw new NullPointerException("buffer out is null");
    if (out.length < sz)
        throw new IllegalArgumentException("buffer out size " + out.length + " < minimum " + sz);
    if (fg == null)
        throw new NullPointerException("buffer 'fg' is null");
    if (fg.length < sz)
        throw new IllegalArgumentException("buffer fg size " + fg.length + " < minimum " + sz * 3 / 2);
    int i, j;
    int Y, Cr = 0, Cb = 0;
    for (j = 0; j < height; j++) {
        int pixPtr = j * width;
        final int jDiv2 = j >> 1;
    for (i = 0; i < width; i++) {
        Y = fg[pixPtr];
        if (Y < 0)
            Y += 255;
        if ((i & 0x1) != 1) {
            final int cOff = sz + jDiv2 * width + (i >> 1) * 2;
            Cb = fg[cOff];
            if (Cb < 0)
                Cb += 127;
            else
                Cb -= 128;
            Cr = fg[cOff + 1];
            if (Cr < 0)
                Cr += 127;
            else
                Cr -= 128;
        }
        int R = Y + Cr + (Cr >> 2) + (Cr >> 3) + (Cr >> 5);
        if (R < 0)
            R = 0;
        else if (R > 255)
            R = 255;
        int G = Y - (Cb >> 2) + (Cb >> 4) + (Cb >> 5) - (Cr >> 1) + (Cr >> 3) + (Cr >> 4) + (Cr >> 5);
        if (G < 0)
            G = 0;
        else if (G > 255)
            G = 255;
        int B = Y + Cb + (Cb >> 1) + (Cb >> 2) + (Cb >> 6);
        if (B < 0)
            B = 0;
        else if (B > 255)
            B = 255;
        out[pixPtr++] = 0xff000000 + (B << 16) + (G << 8) + R;
    }
    }

}

private void showSupportedCameraFormats(Parameters p) {
    List<Integer> supportedPictureFormats = p.getSupportedPreviewFormats();
    Log.d(TAG, "preview format:" + cameraFormatIntToString(p.getPreviewFormat()));
    for (Integer x : supportedPictureFormats) {
        Log.d(TAG, "suppoterd format: " + cameraFormatIntToString(x.intValue()));
    }

}

private String cameraFormatIntToString(int format) {
    switch (format) {
    case PixelFormat.JPEG:
        return "JPEG";
    case PixelFormat.YCbCr_420_SP:
        return "NV21";
    case PixelFormat.YCbCr_422_I:
        return "YUY2";
    case PixelFormat.YCbCr_422_SP:
        return "NV16";
    case PixelFormat.RGB_565:
        return "RGB_565";
    default:
        return "Unknown:" + format;

        }
    }
}

To use it, run the following code from your activity's onCreate:
    SurfaceView surfaceView = new MySurfaceView(this, null);
    RelativeLayout.LayoutParams layoutParams = new RelativeLayout.LayoutParams(RelativeLayout.LayoutParams.MATCH_PARENT, RelativeLayout.LayoutParams.MATCH_PARENT);
    surfaceView.setLayoutParams(layoutParams);
    mRelativeLayout.addView(surfaceView);
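A side note on the original goal of simulating red/green deficiency (my addition, not part of this answer): instead of the MULTIPLY drawColor above, the Paint passed to drawBitmap in onPreviewFrame can carry a ColorMatrixColorFilter. A rough sketch, using a commonly cited protanopia approximation rather than a validated vision model:

import android.graphics.Canvas;
import android.graphics.ColorMatrix;
import android.graphics.ColorMatrixColorFilter;
import android.graphics.Paint;

// Hypothetical helper: draws the decoded RGB pixels through a color matrix
// that roughly simulates protanopia (red deficiency).
public final class DeficiencyFilter {

    private static final ColorMatrix PROTANOPIA_LIKE = new ColorMatrix(new float[] {
            0.567f, 0.433f, 0f,     0f, 0f,  // R' = 0.567R + 0.433G
            0.558f, 0.442f, 0f,     0f, 0f,  // G' = 0.558R + 0.442G
            0f,     0.242f, 0.758f, 0f, 0f,  // B' = 0.242G + 0.758B
            0f,     0f,     0f,     1f, 0f   // alpha unchanged
    });

    public static void draw(Canvas canvas, int[] rgb, int width, int height, int left, int top) {
        Paint paint = new Paint();
        paint.setColorFilter(new ColorMatrixColorFilter(PROTANOPIA_LIKE));
        canvas.drawBitmap(rgb, 0, width, left, top, width, height, false, paint);
    }
}

Calling DeficiencyFilter.draw(canvas, rgbints, width, height, ...) with the same centering offsets computed in onPreviewFrame would replace the drawBitmap/drawColor pair above; DeficiencyFilter is an illustrative name, not part of the answer's code.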

I tried the different solutions proposed in this thread as well as my own. I got the exception below. I think that, at least from Android 4.4 on, it is not possible to lock the canvas from the onPreviewFrame method. E/SurfaceHolder(2872): java.lang.IllegalArgumentException E/SurfaceHolder(2872): at android.view.Surface.nativeLockCanvas(Native Method) E/SurfaceHolder(2872): at android.view.Surface.lockCanvas(Surface.java:243) - Mahesh Renduchintala
I implemented the same solution, but in my case I just get a black SurfaceView without any exception. - Taha Rushain
I only see a black screen; there is no video stream from the camera. I followed your steps exactly. E/AwareLog: AtomicFileUtils: readFileLines file not exist: android.util.AtomicFile@163047 - E/MemoryLeakMonitorManager: MemoryLeakMonitor.jar does not exist! - E/SpannableStringBuilder: SPAN_EXCLUSIVE_EXCLUSIVE spans cannot have a zero length. - Suisse

4

Are you aware of GPUImage?

It started out as an OSX/iOS library made by Brad Larson, which exists as an Objective-C wrapper around OpenGL/ES.

https://github.com/BradLarson/GPUImage

The folks at CyberAgent have made an Android port (which doesn't have complete feature parity); it is a set of Java wrappers on top of OpenGL ES. It is relatively high-level and pretty easy to implement, with a lot of the same functionality described above...

https://github.com/CyberAgent/android-gpuimage
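For illustration (my addition; class and package names are taken from the android-gpuimage versions I've seen and may differ in yours): applying one of the library's built-in filters to a bitmap looks roughly like this, and swapping in a color-matrix style filter would cover the color-deficiency case from the question.

import android.content.Context;
import android.graphics.Bitmap;

import jp.co.cyberagent.android.gpuimage.GPUImage;
import jp.co.cyberagent.android.gpuimage.GPUImageGrayscaleFilter;

// Rough sketch: run a bitmap through one of android-gpuimage's built-in
// filters offscreen and get the filtered result back.
public final class GpuImageExample {

    public static Bitmap applyGrayscale(Context context, Bitmap source) {
        GPUImage gpuImage = new GPUImage(context);
        gpuImage.setImage(source);                          // input frame or photo
        gpuImage.setFilter(new GPUImageGrayscaleFilter());  // any GPUImageFilter works here
        return gpuImage.getBitmapWithFilterApplied();       // rendered via OpenGL ES
    }
}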

