ARKit - Projecting an ARAnchor into 2D space

I'm trying to project an ARAnchor into 2D space, but I'm running into an orientation problem...
Here is my function for projecting the top-left, top-right, bottom-left and bottom-right corner positions into 2D space:
/// Returns the projection of an `ARImageAnchor` from the 3D world space
/// detected by ARKit into the 2D space of a view rendering the scene.
///
/// - Parameter from: An Anchor instance for projecting.
/// - Returns: An optional `CGRect` corresponding to the `ARImageAnchor` projection.
internal func projection(from anchor: ARImageAnchor,
                         alignment: ARPlaneAnchor.Alignment,
                         debug: Bool = false) -> CGRect? {
    guard let camera = session.currentFrame?.camera else {
        return nil
    }

    let refImg = anchor.referenceImage
    let anchor3DPoint = anchor.transform.columns.3

    let size = view.bounds.size
    let width = Float(refImg.physicalSize.width / 2)
    let height = Float(refImg.physicalSize.height / 2)

    /// Upper left corner point
    let projection = ProjectionHelper.projection(from: anchor3DPoint,
                                              width: width,
                                              height: height,
                                              focusAlignment: alignment)
    let topLeft = projection.0
    let topLeftProjected = camera.projectPoint(topLeft,
                                      orientation: .portrait,
                                      viewportSize: size)

    let topRight: simd_float3 = projection.1
    let topRightProjected = camera.projectPoint(topRight,
                                       orientation: .portrait,
                                       viewportSize: size)

    let bottomLeft = projection.2
    let bottomLeftProjected = camera.projectPoint(bottomLeft,
                                         orientation: .portrait,
                                         viewportSize: size)

    let bottomRight = projection.3
    let bottomRightProjected = camera.projectPoint(bottomRight,
                                          orientation: .portrait,
                                          viewportSize: size)

    let result = CGRect(origin: topLeftProjected,
                        size: CGSize(width: topRightProjected.distance(point: topLeftProjected),
                                     height: bottomRightProjected.distance(point: bottomLeftProjected)))

    return result
}
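The code above calls a `distance(point:)` helper on `CGPoint` that isn't shown in the question. A minimal sketch of such an extension, assuming it means plain Euclidean distance in view space (the name and signature are inferred from the call sites):

```swift
import Foundation

// Hypothetical helper assumed by the projection code above:
// straight-line (Euclidean) distance between two points.
extension CGPoint {
    func distance(point: CGPoint) -> CGFloat {
        let dx = point.x - x
        let dy = point.y - y
        return (dx * dx + dy * dy).squareRoot()
    }
}
```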

This function works perfectly when I'm facing the world origin. However, if I move to the left or right, the corner calculation goes wrong.

[screenshot: good calculation]

[screenshot: bad calculation]


Are you basically trying to draw a frame around the detected image? - BlackMirrorz
@JoshRobbins I'm trying to get the corner points and project them into 2D space. - YMonnier
1 Answer

I found a solution that gets the corner 3D points of the `ARImageAnchor` from `anchor.transform` and projects them into 2D space:
    extension simd_float4 { 
        var vector_float3: vector_float3 { return simd_float3([x, y, z]) } 
    }

    /// Returns the projection of an `ARImageAnchor` from the 3D world space
    /// detected by ARKit into the 2D space of a view rendering the scene.
    ///
    /// - Parameter from: An Anchor instance for projecting.
    /// - Returns: An optional `CGRect` corresponding to the `ARImageAnchor` projection.
    internal func projection(from anchor: ARImageAnchor) -> CGRect? {
        guard let camera = session.currentFrame?.camera else {
            return nil
        }
        
        let refImg = anchor.referenceImage
        let transform = anchor.transform.transpose

        
        let size = view.bounds.size
        let width = Float(refImg.physicalSize.width / 2)
        let height = Float(refImg.physicalSize.height / 2)
        
        // Get corner 3D points
        let pointsWorldSpace = [
            matrix_multiply(simd_float4([width, 0, -height, 1]), transform).vector_float3, // top right
            matrix_multiply(simd_float4([width, 0, height, 1]), transform).vector_float3, // bottom right
            matrix_multiply(simd_float4([-width, 0, -height, 1]), transform).vector_float3, // bottom left
            matrix_multiply(simd_float4([-width, 0, height, 1]), transform).vector_float3 // top left
        ]
        
        // Project 3D point to 2D space
        let pointsViewportSpace = pointsWorldSpace.map { (point) -> CGPoint in
            return camera.projectPoint(
                point,
                orientation: .portrait,
                viewportSize: size
            )
        }
        
        // Create a rectangle shape of the projection
        // to calculate the Intersection Over Union of other `ARImageAnchor`
        let result = CGRect(
           origin: pointsViewportSpace[3],
           size: CGSize(
               width: pointsViewportSpace[0].distance(point: pointsViewportSpace[3]),
               height: pointsViewportSpace[1].distance(point: pointsViewportSpace[2])
           )
        )
        
        
        return result
    }
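The `anchor.transform.transpose` in this answer works because simd's `matrix_multiply(v, M)` treats `v` as a row vector, and vᵀ·Mᵀ = (M·v)ᵀ: multiplying the row-vector corner offset by the transposed anchor transform is equivalent to applying `anchor.transform` to that offset as a column vector, yielding the corner in world space. A small pure-Swift sketch of that identity (a hand-rolled column-major 4×4 instead of `simd_float4x4`, since `simd` is Apple-platform only):

```swift
import Foundation

// Column-major 4x4, matching simd_float4x4's storage: m[column][row].
typealias Vec4 = [Float]
typealias Mat4 = [[Float]]

// M * v with v as a column vector (the usual transform application).
func mulMatVec(_ m: Mat4, _ v: Vec4) -> Vec4 {
    var r: Vec4 = [0, 0, 0, 0]
    for c in 0..<4 {
        for row in 0..<4 {
            r[row] += m[c][row] * v[c]
        }
    }
    return r
}

// v * M with v as a row vector (what matrix_multiply(v, M) computes):
// result[c] = dot(v, column c of M).
func mulVecMat(_ v: Vec4, _ m: Mat4) -> Vec4 {
    return (0..<4).map { c in zip(v, m[c]).map(*).reduce(0, +) }
}

func transpose(_ m: Mat4) -> Mat4 {
    return (0..<4).map { row in (0..<4).map { c in m[c][row] } }
}
```

With these definitions, `mulVecMat(offset, transpose(m))` equals `mulMatVec(m, offset)` for any matrix and vector, which is exactly why the answer can pre-transpose the anchor transform and feed the corner offsets as row vectors.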

Hey, could you help me with a problem I'm having? I tried to use your code sample but I'm getting this error: "Value of type 'simd_float4' (aka 'float4') has no member 'vector_float3'". - Piotr Gawłowski
Hey @PiotrGawłowski, it's just a property that converts a simd_float4 to a vector_float3: extension simd_float4 { var vector_float3: vector_float3 { return simd_float3([x, y, z]) } } - YMonnier
What I'm actually trying to figure out is how to get the corner positions in 3D space. I tried using pointsWorldSpace and placing circle geometry nodes with those vectors. The result: the points are laid out the right way (the shape of the ImageRef) and at the right size (width/height), but at the wrong position, completely off from where the ImageRef sits in space. Any hints on how to handle this? - Piotr Gawłowski
You can get the 3D position from the transform matrix: anchor.transform.columns.3. - YMonnier
Hi @YasinNazlıcan, you have to call the "projection" function when a marker is detected. Specifically, you should call it from the "func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor)" delegate method. Hope this helps ;) - YMonnier
