AVAudioPlayerNode.scheduleFile()'s completionHandler is called too early

I am trying to use the new AVAudioEngine in iOS 8.
It looks like the completionHandler of player.scheduleFile() is called before the sound file has finished playing. I am using a sound file that is 5 seconds long, and the println() message appears roughly 1 second before the end of the sound.
Am I doing something wrong, or am I misunderstanding the idea of a completionHandler?
Thanks!
Here is some code:
class SoundHandler {
    let engine:AVAudioEngine
    let player:AVAudioPlayerNode
    let mainMixer:AVAudioMixerNode

    init() {
        engine = AVAudioEngine()
        player = AVAudioPlayerNode()
        engine.attachNode(player)
        mainMixer = engine.mainMixerNode

        var error:NSError?
        if !engine.startAndReturnError(&error) {
            if let e = error {
                println("error \(e.localizedDescription)")
            }
        }

        engine.connect(player, to: mainMixer, format: mainMixer.outputFormatForBus(0))
    }

    func playSound() {
        var soundUrl = NSBundle.mainBundle().URLForResource("Test", withExtension: "m4a")
        var soundFile = AVAudioFile(forReading: soundUrl, error: nil)

        player.scheduleFile(soundFile, atTime: nil, completionHandler: { println("Finished!") })

        player.play()
    }
}
7 Answers


I see the same behavior.

From my experimentation, I believe the callback is invoked once the buffer/segment/file has been "scheduled", not when it has finished playing.

Even though the docs explicitly state:

"Called after the buffer has completely played or the player is stopped. May be nil."

So I think this is either a bug or incorrect documentation. Not sure which.


In the meantime it has been changed to "Called after the player has scheduled the file for playback on the render thread or the player is stopped. May be nil." - not sure whether that includes the case where the player finishes naturally. - bio
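One way to observe the early firing described above is to compare the file's nominal duration against the wall-clock time at which the handler runs. A minimal sketch in modern Swift, assuming `engine` and `player` are already set up and connected as in the question (this is an illustration of the reported behavior, not a fix):

```swift
import AVFoundation

// Assumes `player` is attached to a running AVAudioEngine,
// as in the question's SoundHandler.
func demonstrateEarlyCallback(player: AVAudioPlayerNode, fileURL: URL) throws {
    let file = try AVAudioFile(forReading: fileURL)
    // Nominal duration in seconds: frame count divided by sample rate.
    let duration = Double(file.length) / file.processingFormat.sampleRate
    let startedAt = Date()

    player.scheduleFile(file, at: nil) {
        // On iOS 8-10 this tends to fire while audio is still audible,
        // so `elapsed` comes out noticeably smaller than `duration`.
        let elapsed = Date().timeIntervalSince(startedAt)
        print("handler fired after \(elapsed)s; file duration is \(duration)s")
    }
    player.play()
}
```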

You can use AVAudioTime to compute the future time at which audio playback will complete. The current behavior is actually useful, because it lets you schedule additional buffers/segments/files to play from the callback before the current one ends, avoiding a gap in audio playback. That makes it easy to build a simple loop player. Here is an example:
class Latch {
    var value : Bool = true
}

func loopWholeFile(file : AVAudioFile, player : AVAudioPlayerNode) -> Latch {
    let looping = Latch()
    let frames = file.length

    let sampleRate = file.processingFormat.sampleRate
    var segmentTime : AVAudioFramePosition = 0
    var segmentCompletion : AVAudioNodeCompletionHandler!
    segmentCompletion = {
        if looping.value {
            segmentTime += frames
            player.scheduleFile(file, atTime: AVAudioTime(sampleTime: segmentTime, atRate: sampleRate), completionHandler: segmentCompletion)
        }
    }
    player.scheduleFile(file, atTime: AVAudioTime(sampleTime: segmentTime, atRate: sampleRate), completionHandler: segmentCompletion)
    segmentCompletion()
    player.play()

    return looping
}

The code above schedules the entire file twice before calling player.play(). As each segment nears completion, it schedules another whole file in the future, avoiding gaps in playback. To stop looping, you use the returned Latch, like this:

let looping = loopWholeFile(file, player)
sleep(1000)
looping.value = false
player.stop()


The AVAudioEngine docs from the iOS 8 era may well have been wrong. As a workaround, I found that if you use scheduleBuffer:atTime:options:completionHandler: instead, the callback is fired as expected (after playback finishes).

Example code:

NSError *error = nil;
AVAudioFile *file = [[AVAudioFile alloc] initForReading:_fileURL commonFormat:AVAudioPCMFormatFloat32 interleaved:NO error:&error];
AVAudioPCMBuffer *buffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:file.processingFormat frameCapacity:(AVAudioFrameCount)file.length];
[file readIntoBuffer:buffer error:&error];

[_player scheduleBuffer:buffer atTime:nil options:AVAudioPlayerNodeBufferInterrupts completionHandler:^{
    // reminder: we're not on the main thread in here
    dispatch_async(dispatch_get_main_queue(), ^{
        NSLog(@"done playing, as expected!");
    });
}];

Awesome. Works like a charm! - Next Developer
Actually, after testing it turns out that even with a buffer, the callback is invoked before the player stops. In this example, audioPlayer.scheduleBuffer(audioBuffer) { dispatch_async(dispatch_get_main_queue()) { [unowned self] in if (self.audioPlayer.playing == false) { self.stopButton.hidden = true } } } the condition is never satisfied. - Next Developer
Strangely, my AVAudioPlayerNode produces sound on iOS 9 but fails on some older devices and on devices running iOS 8. - vikzilla
Has anyone filed a bug report for this? I can file one if needed. - lancejabr
@lancejabr I did, but you can too! The more bug reports they receive on an issue, the more likely they are to fix it. - taber


My bug report for this was closed as "works as intended," but Apple pointed me to new variations of the scheduleFile, scheduleSegment, and scheduleBuffer methods in iOS 11. These add a completionCallbackType argument that you can use to specify that you want the completion callback when playback has completed:

[self.audioUnitPlayer
            scheduleSegment:self.audioUnitFile
            startingFrame:sampleTime
            frameCount:(int)sampleLength
            atTime:0
            completionCallbackType:AVAudioPlayerNodeCompletionDataPlayedBack
            completionHandler:^(AVAudioPlayerNodeCompletionCallbackType callbackType) {
    // do something here
}];
The docs don't say anything about how this works, but I tested it and it works for me.
I had been using this workaround for iOS 8-10:
- (void)playRecording {
    [self.audioUnitPlayer scheduleSegment:self.audioUnitFile startingFrame:sampleTime frameCount:(int)sampleLength atTime:0 completionHandler:^() {
        float totalTime = [self recordingDuration];
        float elapsedTime = [self recordingCurrentTime];
        float remainingTime = totalTime - elapsedTime;
        [self performSelector:@selector(doSomethingHere) withObject:nil afterDelay:remainingTime];
    }];
}

- (float)recordingDuration {
    float duration = self.audioUnitFile.length / self.audioUnitFile.processingFormat.sampleRate;
    if (isnan(duration)) {
        duration = 0;
    }
    return duration;
}

- (float)recordingCurrentTime {
    AVAudioTime *nodeTime = self.audioUnitPlayer.lastRenderTime;
    AVAudioTime *playerTime = [self.audioUnitPlayer playerTimeForNodeTime:nodeTime];
    AVAudioFramePosition sampleTime = playerTime.sampleTime;
    if (sampleTime == 0) { return self.audioUnitLastKnownTime; } // this happens when the player isn't playing
    sampleTime += self.audioUnitStartingFrame; // if we trimmed from the start, or changed the location with the location slider, the time before that point won't be included in the player time, so we have to track it ourselves and add it here
    float time = sampleTime / self.audioUnitFile.processingFormat.sampleRate;
    self.audioUnitLastKnownTime = time;
    return time;
}

As of today, in a project with a deployment target of 12.4, on a device running 12.4.1, here is how we found to successfully stop a node upon playback completion:
// audioFile and playerNode created here ...

playerNode.scheduleFile(audioFile, at: nil, completionCallbackType: .dataPlayedBack) { _ in
    os_log(.debug, log: self.log, "%@", "Completing playing sound effect: \(filePath) ...")

    DispatchQueue.main.async {
        os_log(.debug, log: self.log, "%@", "... now actually completed: \(filePath)")

        self.engine.disconnectNodeOutput(playerNode)
        self.engine.detach(playerNode)
    }
}

The main difference with respect to the previous answers is deferring the detaching of the node to the main thread, instead of performing it on the callback thread (which I guess is also the audio render thread?).


// audioFile here is our original audio

audioPlayerNode.scheduleFile(audioFile, at: nil, completionHandler: {
        print("scheduleFile Complete")

        var delayInSeconds: Double = 0

        if let lastRenderTime = self.audioPlayerNode.lastRenderTime, let playerTime = self.audioPlayerNode.playerTime(forNodeTime: lastRenderTime) {

            // `rate` (an optional playback rate) is defined elsewhere in the answerer's code
            if let rate = rate {
                delayInSeconds = Double(audioFile.length - playerTime.sampleTime) / Double(audioFile.processingFormat.sampleRate) / Double(rate)
            } else {
                delayInSeconds = Double(audioFile.length - playerTime.sampleTime) / Double(audioFile.processingFormat.sampleRate)
            }
        }

        // schedule a stop timer for when audio finishes playing
        // (DispatchTime.executeAfter is the answerer's own helper, not a system API)
        DispatchTime.executeAfter(seconds: delayInSeconds) {
            audioEngine.mainMixerNode.removeTap(onBus: 0)
            // Playback has completed
        }

    })


Yes, it does get called slightly before the file (or buffer) has completed. If you call [myNode stop] from within the completion handler, the file (or buffer) will not fully complete. However, if you call [myEngine stop], the file (or buffer) will complete all the way.
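In Swift terms, the distinction this answer draws might be sketched as follows (hypothetical `player`/`engine` names; the engine-vs-node behavior is taken from the answer above, not independently verified):

```swift
import AVFoundation

// Assumes `player` is an AVAudioPlayerNode attached to a running
// AVAudioEngine `engine`, with `file` an AVAudioFile to play.
func scheduleAndStopAfter(player: AVAudioPlayerNode, engine: AVAudioEngine, file: AVAudioFile) {
    player.scheduleFile(file, at: nil) {
        // The handler fires slightly before playback actually finishes, so:
        // player.stop()   // would cut off the tail of the file
        engine.stop()      // per the answer, lets the file play out fully
    }
    player.play()
}
```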


Content provided by Stack Overflow.