Converting Android OpenGL ES 2.0 screen coordinates to world coordinates


I'm building an Android application that uses OpenGL ES 2.0 and I've run into a wall. I'm trying to convert screen coordinates (where the user touches) to world coordinates. I've tried reading and experimenting with GLU.gluUnProject, but I'm either doing it wrong or just not understanding it.

Here is my attempt....

public void getWorldFromScreen(float x, float y) {
    int viewport[] = { 0, 0, width , height};

    float startY = ((float) (height) - y);
    float[] near = { 0.0f, 0.0f, 0.0f, 0.0f };
    float[] far = { 0.0f, 0.0f, 0.0f, 0.0f };

    float[] mv = new float[16];
    Matrix.multiplyMM(mv, 0, mViewMatrix, 0, mModelMatrix, 0);

    GLU.gluUnProject(x, startY, 0, mv, 0, mProjectionMatrix, 0, viewport, 0, near, 0);
    GLU.gluUnProject(x, startY, 1, mv, 0, mProjectionMatrix, 0, viewport, 0, far, 0);

    float nearX = near[0] / near[3];
    float nearY = near[1] / near[3];
    float nearZ = near[2] / near[3];

    float farX = far[0] / far[3];
    float farY = far[1] / far[3];
    float farZ = far[2] / far[3];
}

The numbers I am getting don't seem right. Is this the correct way to use the method, and does it work with OpenGL ES 2.0? Should I set the model matrix to the identity matrix (Matrix.setIdentityM(mModelMatrix, 0)) before doing these calculations?

As a follow-up, if this is correct, how do I choose the output Z value? Basically, I always know the distance at which the world coordinates should lie, but the Z parameter of GLU.gluUnProject appears to be some kind of interpolation between the near and far clipping planes. Is it just a linear interpolation?
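For what it's worth, winZ is window-space depth: 0 corresponds to the near plane and 1 to the far plane, and with a perspective projection it is not linear in eye-space distance. When the world-space depth of the target is known, a common approach is to unproject the touch at winZ = 0 and winZ = 1 and intersect the resulting ray with the plane at that depth. Below is a minimal sketch along those lines, reusing the fields from the code above (width, height, mViewMatrix, mModelMatrix, mProjectionMatrix); the method name and planeZ are illustrative assumptions:

    public float[] getWorldPointAtZ(float x, float y, float planeZ) {
        int[] viewport = { 0, 0, width, height };
        float winY = height - y;  // flip Android's top-left origin to GL's bottom-left

        // Combined model-view matrix, as in the code above.
        float[] mv = new float[16];
        Matrix.multiplyMM(mv, 0, mViewMatrix, 0, mModelMatrix, 0);

        float[] near = new float[4];
        float[] far = new float[4];

        // winZ = 0 unprojects onto the near plane, winZ = 1 onto the far plane.
        GLU.gluUnProject(x, winY, 0f, mv, 0, mProjectionMatrix, 0, viewport, 0, near, 0);
        GLU.gluUnProject(x, winY, 1f, mv, 0, mProjectionMatrix, 0, viewport, 0, far, 0);

        // Divide by w, as in the code above, since the results are homogeneous.
        for (int i = 0; i < 3; i++) {
            near[i] /= near[3];
            far[i] /= far[3];
        }

        // Intersect the ray from near to far with the plane z = planeZ.
        float t = (planeZ - near[2]) / (far[2] - near[2]);
        return new float[] {
            near[0] + t * (far[0] - near[0]),
            near[1] + t * (far[1] - near[1]),
            planeZ
        };
    }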

Thanks in advance.

3 Answers

/**
 * Calculates the transform from screen coordinate
 * system to world coordinate system coordinates
 * for a specific point, given a camera position.
 *
 * @param touch Vec2 point of screen touch, the
 *        actual position on the physical screen (e.g. 160, 240)
 * @param cam camera object with x, y, z of the
 *        camera and screenWidth and screenHeight of
 *        the device.
 * @return position in WCS.
 */
public Vec2 GetWorldCoords(Vec2 touch, Camera cam)
{
    // Initialize auxiliary variables.
    Vec2 worldPos = new Vec2();

    // SCREEN height & width (e.g. 320 x 480)
    float screenW = cam.GetScreenWidth();
    float screenH = cam.GetScreenHeight();

    // Auxiliary matrix and vectors
    // to deal with ogl.
    float[] invertedMatrix, transformMatrix,
        normalizedInPoint, outPoint;
    invertedMatrix = new float[16];
    transformMatrix = new float[16];
    normalizedInPoint = new float[4];
    outPoint = new float[4];

    // Invert y coordinate, as android uses
    // top-left, and ogl bottom-left.
    int oglTouchY = (int) (screenH - touch.Y());

    /* Transform the screen point to clip
       space in ogl (-1,1) */
    normalizedInPoint[0] =
        (float) ((touch.X()) * 2.0f / screenW - 1.0);
    normalizedInPoint[1] =
        (float) ((oglTouchY) * 2.0f / screenH - 1.0);
    normalizedInPoint[2] = -1.0f;
    normalizedInPoint[3] = 1.0f;

    /* Obtain the transform matrix and
       then the inverse. */
    Print("Proj", getCurrentProjection(gl));
    Print("Model", getCurrentModelView(gl));
    Matrix.multiplyMM(
        transformMatrix, 0,
        getCurrentProjection(gl), 0,
        getCurrentModelView(gl), 0);
    Matrix.invertM(invertedMatrix, 0,
        transformMatrix, 0);

    /* Apply the inverse to the point
       in clip space */
    Matrix.multiplyMV(
        outPoint, 0,
        invertedMatrix, 0,
        normalizedInPoint, 0);

    if (outPoint[3] == 0.0)
    {
        // Avoid a divide-by-zero error.
        Log.e("World coords", "ERROR!");
        return worldPos;
    }

    // Divide by the homogeneous (w) component
    // to recover the real position.
    worldPos.Set(
        outPoint[0] / outPoint[3],
        outPoint[1] / outPoint[3]);

    return worldPos;
}

The algorithm is explained in more detail here.
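As a usage illustration, a touch handler along these lines would feed the raw screen position straight into GetWorldCoords; the mCamera field and the Vec2(x, y) constructor shown here are assumptions that sit alongside the answer's own helper classes:

    // Hypothetical touch handler; mCamera and Vec2(x, y) are assumed to exist
    // next to the GetWorldCoords method above.
    @Override
    public boolean onTouchEvent(MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_DOWN) {
            Vec2 touch = new Vec2(event.getX(), event.getY());
            Vec2 world = GetWorldCoords(touch, mCamera);
            Log.d("Touch", "world: " + world.X() + ", " + world.Y());
        }
        return true;
    }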


My solutions are all written in C++. Just glancing over this answer, it looks correct to me, so I won't post one of my own. :] - TheBuzzSaw


I don't think the function needs to be reimplemented... I tried Erol's solution and it worked, so many thanks to Erol. In addition, I also tried

        Matrix.orthoM(mtrxProjection, 0, left, right, bottom, top, near, far);

and it also works fine in my small OpenGL ES 2.0 project.
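With an orthographic camera like that (and an identity view matrix), the screen-to-world mapping reduces to a linear rescale of the touch position into the [left, right] and [bottom, top] ranges passed to orthoM. A minimal sketch, assuming those same variables:

    // A minimal sketch, assuming the same left/right/bottom/top values passed to
    // Matrix.orthoM above and an identity view matrix.
    public float[] screenToWorldOrtho(float screenX, float screenY,
                                      float screenW, float screenH) {
        // Flip Y: Android's screen origin is top-left, OpenGL's is bottom-left.
        float glY = screenH - screenY;

        // Linearly map [0, screenW] -> [left, right] and [0, screenH] -> [bottom, top].
        float worldX = left + (screenX / screenW) * (right - left);
        float worldY = bottom + (glY / screenH) * (top - bottom);

        return new float[] { worldX, worldY };
    }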

