ARKit SKVideoNode Playing on Render


Main issue:

I am adding this section to clarify the problem. I can pause my video (I do not want it playing on a loop). When my node comes into sight, it plays my video, even if the video is paused. If my video has finished playing and the node comes into sight, it restarts. I want to remove this behavior.

In my app, I use ARKit's .ImageTracking to determine when a specific image is found, and play a video from there. I created an SKVideoNode in 3D space using SCNNode and SCNGeometry objects, containing a video created from AVPlayer(url:). It all works fine, except that every time the AVPlayer comes into sight it decides to play on its own; whenever the ARImageAnchor the SCNNode is attached to comes into view of the camera, the AVPlayer plays. I use

override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    if(keyPath == "rate") {
        print((object as! AVPlayer).rate)
    }
}

to print out the rate; it prints 1, even though the player was paused (rate 0) beforehand.

I added print statements (e.g. print("Play")) to every function that uses player.pause() or player.play(), and none of them are called when the rate changes. How can I find the source of what is changing my player's rate?
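As a side note, the observeValue override above only fires if an observer has actually been registered on the player; a minimal, self-contained sketch of such a rate watcher (the RateWatcher class name is mine, for illustration only):

```swift
import AVFoundation

final class RateWatcher: NSObject {
    let player: AVPlayer

    init(player: AVPlayer) {
        self.player = player
        super.init()
        // .new delivers the changed rate value with each KVO notification.
        player.addObserver(self, forKeyPath: "rate", options: [.new], context: nil)
    }

    override func observeValue(forKeyPath keyPath: String?, of object: Any?,
                               change: [NSKeyValueChangeKey: Any]?,
                               context: UnsafeMutableRawPointer?) {
        if keyPath == "rate", let rate = change?[.newKey] as? Float {
            print("rate changed to \(rate)") // 1.0 = playing, 0.0 = paused
        }
    }

    deinit {
        // Unbalanced observers crash on deallocation, so always remove.
        player.removeObserver(self, forKeyPath: "rate")
    }
}
```

Setting a breakpoint inside the observer and inspecting the backtrace is one way to see which caller is flipping the rate.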

I have also checked the root node, self.sceneView.scene.rootNode.childNodes, to make sure I was not creating extra VideoNodes/SCNNodes/AVPlayers, etc., and it appears there is only one.

Any ideas why the SKVideoNode/AVPlayer is playing every time the SCNNode comes into sight of the ARKit camera? Thanks in advance!

Edit 1:

I made a workaround that determines only when a user has tapped this node:

let tap = UITapGestureRecognizer(target: self, action: #selector(self!.tapGesture))
tap.delegate = self!
tap.name = "MyTap"
self!.sceneView.addGestureRecognizer(tap)

Then inside the following function I put:

@objc func tapGesture(_ gesture:UITapGestureRecognizer) {
    let tappedNodes = self.sceneView.hitTest(gesture.location(in: gesture.view), options: [SCNHitTestOption.searchMode: 1])

    if !tappedNodes.isEmpty {
        for nodes in tappedNodes {
            if nodes.node == videoPlayer3D {
                videoPlayer3D.tappedVideoPlayer = true
                videoPlayer3D.playOrPause()
                break
            }
        }
    }
}

override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    if(keyPath == "rate") {
        print((object as! AVPlayer).rate)
        if(!self.tappedVideoPlayer) {
            self.player.pause() //HERE
        }
    }
}

videoPlayer3D is the SCNNode that contains the SKVideoNode.

However, at the spot labeled "HERE" above, I get the error com.apple.scenekit.scnview-renderer (17): EXC_BAD_ACCESS (code=2, address=0x16d8f7ad0). It seems that the scene view's renderer is trying to alter my video node inside its render loop, yet I am not even using the renderer(_:updateAtTime:) function; I only use
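One possible cause of a crash like this is mutating the player from SceneKit's render thread, since the KVO notification for rate can arrive on that thread. A hedged sketch (not a confirmed fix) of deferring the pause to the main queue, using the same properties as the code above:

```swift
override func observeValue(forKeyPath keyPath: String?, of object: Any?,
                           change: [NSKeyValueChangeKey: Any]?,
                           context: UnsafeMutableRawPointer?) {
    guard keyPath == "rate" else { return }
    // KVO for rate may fire on SceneKit's render thread; hop to the
    // main queue before touching the player to avoid racing the renderer.
    DispatchQueue.main.async { [weak self] in
        guard let self = self, !self.tappedVideoPlayer else { return }
        self.player.pause()
    }
}
```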

func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {

    guard let imageAnchor = anchor as? ARImageAnchor else { return }

    createVideoNode(imageAnchor)

}

to determine when I see an image (i.e. image tracking) and to create the node. Any tips?

Thought 1:

The error suggests that some method is being called on an SCNView object (that is how I read the error), but I do not call anything on the node specifically. I am thinking it may be some default action that gets invoked when the node comes into view, but I am not sure how to access it or determine which method it is. The objects I am using are not SCNView objects, and I do not believe they inherit from SCNView (see the first paragraph for the variables used). I just want to remove the "action" of the node playing every time it is viewed.

Additional note:

For your reference, here is how my video player is created. Let me know if there is anything else you want to see (not sure what else you might want) and thanks for your help.

func createVideoNode(_ anchor:ARImageAnchor, initialPOV:SCNNode) -> My3DPlayer? {

    guard let currentFrame = self.sceneView.session.currentFrame else {
        return nil
    }

    let delegate = UIApplication.shared.delegate as! AppDelegate

    var videoPlayer:My3DPlayer!
    videoPlayer = delegate.testing ? My3DPlayer(data: nil, currentFrame: currentFrame, anchor: anchor) : My3DPlayer(data: self.urlData, currentFrame: currentFrame, anchor: anchor)

    //Create TapGesture
    let tap = UITapGestureRecognizer(target: self, action: #selector(self.tapGesture))
    tap.delegate = self
    tap.name = "MyTap"
    self.sceneView.addGestureRecognizer(tap)

    return videoPlayer
}

The My3DPlayer class:

class My3DPlayer: SCNNode {

    init(geometry: SCNGeometry?) {
        super.init()
        self.geometry = geometry
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    convenience init(data:Data?, currentFrame:ARFrame, anchor:ARImageAnchor) {
        self.init(geometry: nil)
        self.createPlayer(currentFrame, data, anchor)
    }

    private func createPlayer(_ frame:ARFrame, _ data:Data?,_ anchor:ARImageAnchor) {

        let physicalSize = anchor.referenceImage.physicalSize

        print("Init Player W/ physicalSize: \(physicalSize)")

        //Create video
        if((UIApplication.shared.delegate! as! AppDelegate).testing) {
            let path = Bundle.main.path(forResource: "Bear", ofType: "mov")
            self.url = URL(fileURLWithPath: path!)
        }
        else {
            let url = data!.getAVAssetURL(location: "MyLocation")
            self.url = url
        }
        let asset = AVAsset(url: self.url)
        let track = asset.tracks(withMediaType: AVMediaType.video).first!
        let playerItem = AVPlayerItem(asset: asset)
        let player = AVPlayer(playerItem: playerItem)
        self.player = player
        var videoSize = track.naturalSize.applying(track.preferredTransform)

        videoSize = CGSize(width: abs(videoSize.width), height: abs(videoSize.height))
        print("Init Video W/ size: \(videoSize)")

        //Determine if landscape or portrait
        self.landscape = videoSize.width > videoSize.height
        print(self.landscape == true ? "Landscape" : "Portrait")

        //Do something when video ended
        NotificationCenter.default.addObserver(self, selector: #selector(playerDidFinishPlaying(note:)), name: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: nil)

        //Add observer to determine when Player is ready
        player.addObserver(self, forKeyPath: "status", options: [], context: nil)

        //Create video Node
        let videoNode = SKVideoNode(avPlayer: player)

        //Create 2d scene to put 2d player on - SKScene
        videoNode.position = CGPoint(x: videoSize.width/2, y: videoSize.height/2)
        videoNode.size = videoSize


        //Portrait -- //Landscape doesn't need adjustments??
        if(!self.landscape) {
            let width = videoNode.size.width
            videoNode.size.width = videoNode.size.height
            videoNode.size.height = width
            videoNode.position = CGPoint(x: videoNode.size.width/2, y: videoNode.size.height/2)
        }

        let scene = SKScene(size: videoNode.size)

        //Add videoNode to scene
        scene.addChild(videoNode)

        //Create Button-look even though we don't use the button. Just creates the illusion to pressing play and pause
        let image = UIImage(named: "PlayButton")!
        let texture = SKTexture(image: image)
        self.button = SKSpriteNode(texture: texture)
        self.button.position = videoNode.position

        //Makes the button look like a square
        let minimumSize = [videoSize.width, videoSize.height].min()!
        self.button.size = CGSize(width: minimumSize/4, height: minimumSize/4)
        scene.addChild(button)

        //Get ratio difference from physicalsize and video size
        let widthRatio = Float(physicalSize.width)/Float(videoSize.width)
        let heightRatio = Float(physicalSize.height)/Float(videoSize.height)

        let finalRatio = [widthRatio, heightRatio].min()!

        //Create a Plane (SCNPlane) to put the SKScene on
        let plane = SCNPlane(width: scene.size.width, height: scene.size.height)
        plane.firstMaterial?.diffuse.contents = scene
        plane.firstMaterial?.isDoubleSided = true

        //Set Self.geometry = plane
        self.geometry = plane

        //Size the node correctly
        //Find the real scaling variable
        let scale = CGFloat(finalRatio)
        let appearanceAction = SCNAction.scale(to: scale, duration: 0.4)
        appearanceAction.timingMode = .easeOut

        //Set initial scale to 0 then use action to scale up
        self.scale = SCNVector3Make(0, 0, 0)
        self.runAction(appearanceAction)
    }

    @objc func playerDidFinishPlaying(note: Notification) {
        self.player.seek(to: .zero, toleranceBefore: .zero, toleranceAfter: .zero)
        self.setButtonAlpha(alpha: 1)
    }
}
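As a side note on the scaling math in createPlayer above: the node's final scale is the smaller of the two physical-to-pixel ratios, a standard aspect-fit calculation. A minimal, framework-free sketch (the function name aspectFitScale is mine, not from the original code):

```swift
// Aspect-fit scale: the smaller of the width and height ratios, so the
// scaled video plane never exceeds the reference image's physical bounds.
func aspectFitScale(physicalWidth: Double, physicalHeight: Double,
                    videoWidth: Double, videoHeight: Double) -> Double {
    let widthRatio = physicalWidth / videoWidth
    let heightRatio = physicalHeight / videoHeight
    return min(widthRatio, heightRatio)
}
```

With, say, a 0.2 m × 0.1 m reference image and a 1920×1080 video, the height ratio (0.1/1080) is the limiting one, so scaling by it keeps the whole plane within the image.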

Attempt 1:

I have tried stopping the tracking with:

func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {

    guard let imageAnchor = anchor as? ARImageAnchor else { return }

    createVideoNode(imageAnchor)
    self.resetConfiguration(turnOnConfig: true, turnOnImageTracking: false)   
}

func resetConfiguration(turnOnConfig: Bool = true, turnOnImageTracking:Bool = false) {
    let configuration = ARWorldTrackingConfiguration()
    if(turnOnImageTracking) {
        guard let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources", bundle: nil) else {
            fatalError("Missing expected asset catalog resources.")
        }
        configuration.planeDetection = .horizontal
        configuration.detectionImages = referenceImages
    }
    else {
        configuration.planeDetection = []
    }
    if(turnOnConfig) {
        sceneView.session.run(configuration, options: [.resetTracking])
    }
}

Above, I attempt to reset the configuration. This only appears to make it reset the planes, as the video still plays on render. Whether it is paused or finished, it resets and starts over, or continues playing.

Attempt 2:

I have tried:
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {

    guard let imageAnchor = anchor as? ARImageAnchor else { return }

    createVideoNode(imageAnchor)
    self.pauseTracking()  
}

func pauseTracking() {
    self.sceneView.session.pause()
}

This stops everything, so the camera even freezes since nothing is being tracked any more. It is completely useless here.
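A possible middle ground between the two attempts might be to keep the session running but rerun it without any reference images, so the camera feed continues while image detection stops. This is only a sketch, under the untested assumption that clearing detectionImages is acceptable for the app:

```swift
func stopImageDetection() {
    // Keep world tracking (and the camera feed) alive, but stop looking
    // for reference images by running a configuration without them.
    let configuration = ARWorldTrackingConfiguration()
    configuration.detectionImages = []
    // No .resetTracking option here, so existing anchors and the
    // already-placed video node should survive the rerun.
    sceneView.session.run(configuration)
}
```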


How does where I create my node affect it playing every time it comes into view, even when the video is paused? - impression7vx
Not sure, just my guess: if it plays every time the imageAnchor is found, there must be a logic issue somewhere. - Alok Subedi
I just added that for fun. - impression7vx
Where is your ARWorldTrackingConfiguration code used? Would stopping the "detection" once the video starts playing be an option? - Mihai Erős
@MihaiErős, how can I stop the tracking without stopping the camera? - impression7vx
1 Answer

Ok. So here is a workaround; have a look at the renderer(_:updateAtTime:) function.
var player: AVPlayer!
var videoSpriteNode: SKVideoNode!
var play = true

@objc func tap(_ recognizer: UITapGestureRecognizer){
    if play{
        play = false
        player.pause()
    }else{
        play = true
        player.play()
    }
}

func setVideo() -> SKScene{
    let size = CGSize(width: 500, height: 500)
    let skScene = SKScene(size: size)

    let videoURL = Bundle.main.url(forResource: "video.mp4", withExtension: nil)!
    player = AVPlayer(url: videoURL)

    skScene.scaleMode = .aspectFit

    videoSpriteNode = SKVideoNode(avPlayer: player)
    videoSpriteNode.position = CGPoint(x: size.width/2, y: size.height/2)
    videoSpriteNode.size = size
    videoSpriteNode.yScale = -1
    skScene.addChild(videoSpriteNode)

    player.play()

    return skScene
}

func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    if let image = anchor as? ARImageAnchor{
        print("found")

        let planeGeometry = SCNPlane(width: image.referenceImage.physicalSize.width, height: image.referenceImage.physicalSize.height)
        let plane = SCNNode(geometry: planeGeometry)
        planeGeometry.materials.first?.diffuse.contents = setVideo()

        plane.transform = SCNMatrix4MakeRotation(-.pi/2, 1, 0, 0)

        node.addChildNode(plane)
    }
}

func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
    if !play{
        player.pause()
    }
}

Use this idea in your code.


What if the video is paused? That is my issue. I don't want the video playing on a loop; I want to be able to pause it and not have the system play my video every time it comes into view. - impression7vx
If my issue were keeping the video playing, this would be an easy fix, since it already does that. But I want the video to stay paused, and to pause and play it only by tapping the node, which implements a gesture-type action. - impression7vx
Will try it and let you know! - impression7vx
You sly dog. I use renderer(_:updateAtTime:) in my app for all sorts of other activities, but never considered this possibility. Adding if !play { player.pause() } made it work. It's a workaround, when really Apple's natural behavior needs changing, but thank you so much for the help; I had been stuck on this for a while. - impression7vx
