How to play audio with an AVAudioPCMBuffer


I can't get sound to play with an AVAudioPCMBuffer (though I can play the same file with AVAudioFile).

I'm getting this error:

Error: AVAudioBuffer.mm:169: -[AVAudioPCMBuffer initWithPCMFormat:frameCapacity:]: required condition is false: isCommonFormat

Here is my code; any help is greatly appreciated.

import UIKit
import AVFoundation

class ViewController: UIViewController {

    let audioEngine: AVAudioEngine = AVAudioEngine()
    let audioFilePlayer: AVAudioPlayerNode = AVAudioPlayerNode()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.

        audioEngine.attachNode(audioFilePlayer)

        let filePath: String = NSBundle.mainBundle().pathForResource("test", ofType: "mp3")!
        let fileURL: NSURL = NSURL(fileURLWithPath: filePath)!
        let audioFile = AVAudioFile(forReading: fileURL, error: nil)
        let audioFormat = audioFile.fileFormat
        let audioFrameCount = UInt32(audioFile.length)
        let audioFileBuffer = AVAudioPCMBuffer(PCMFormat: audioFormat, frameCapacity: audioFrameCount)

        var mainMixer = audioEngine.mainMixerNode
        audioEngine.connect(audioFilePlayer, to: mainMixer, format: audioFileBuffer.format)

        audioFilePlayer.scheduleBuffer(audioFileBuffer, atTime: nil, options: nil, completionHandler: nil)

        var engineError: NSError?
        audioEngine.startAndReturnError(&engineError)

        audioFilePlayer.play()
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }

}
5 Answers


Let me share what I found: this approach works, more or less, though I don't fully understand why.

import UIKit
import AVFoundation

class ViewController: UIViewController {

    var audioEngine: AVAudioEngine = AVAudioEngine()
    var audioFilePlayer: AVAudioPlayerNode = AVAudioPlayerNode()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.

        let filePath: String = NSBundle.mainBundle().pathForResource("test", ofType: "mp3")!
        println("\(filePath)")
        let fileURL: NSURL = NSURL(fileURLWithPath: filePath)!
        let audioFile = AVAudioFile(forReading: fileURL, error: nil)
        let audioFormat = audioFile.processingFormat  // processingFormat, not fileFormat
        let audioFrameCount = UInt32(audioFile.length)
        let audioFileBuffer = AVAudioPCMBuffer(PCMFormat: audioFormat, frameCapacity: audioFrameCount)
        audioFile.readIntoBuffer(audioFileBuffer, error: nil)  // actually fill the buffer from the file

        var mainMixer = audioEngine.mainMixerNode
        audioEngine.attachNode(audioFilePlayer)
        audioEngine.connect(audioFilePlayer, to: mainMixer, format: audioFileBuffer.format)
        audioEngine.startAndReturnError(nil)

        audioFilePlayer.play()
        audioFilePlayer.scheduleBuffer(audioFileBuffer, atTime: nil, options: nil, completionHandler: nil)
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }

}

The problem is that you are creating the PCM buffer with a non-PCM format. You need to create the AVAudioPCMBuffer with the AVAudioFile's processingFormat instead.
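A minimal sketch of that fix in current Swift syntax (`fileURL` is a hypothetical URL pointing at the audio file; error handling reduced to a do/catch):

```swift
import AVFoundation

do {
    let audioFile = try AVAudioFile(forReading: fileURL)

    // fileFormat describes the on-disk data (MP3 here), which is not a
    // common PCM format, so initWithPCMFormat fails its isCommonFormat
    // check. processingFormat is the decoded PCM format the engine uses.
    let buffer = AVAudioPCMBuffer(pcmFormat: audioFile.processingFormat,
                                  frameCapacity: AVAudioFrameCount(audioFile.length))
} catch {
    print(error)
}
```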


With AVAudioPCMBuffer(), you get strange errors if you try to use a PCM format other than mixer.outputFormat(forBus: 0).

It won't accept mono formats: even if you describe your format exactly the same as the mixer's, it complains that the mixer's output format doesn't match yours, and it produces no error that explains the actual problem.
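One way to sidestep that mismatch, as this answer suggests, is to ask the mixer for its actual output format instead of hand-building one. A sketch (the `engine` and `player` names are hypothetical):

```swift
import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attach(player)

// Query the format the main mixer actually outputs and connect with
// that, rather than constructing a (possibly mono) format it may
// silently reject.
let mixerFormat = engine.mainMixerNode.outputFormat(forBus: 0)
engine.connect(player, to: engine.mainMixerNode, format: mixerFormat)
```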



Updating @Bick's code to Swift 5.3.

The logic is easy to follow:

  • First, prepare the data

Create an empty AVAudioPCMBuffer, then fill it with the audio data.

  • Secondly, connect the nodes, and use the data to play

    import UIKit
    import AVFoundation

    class ViewControllerX: UIViewController {

        var audioEngine = AVAudioEngine()
        var audioFilePlayer = AVAudioPlayerNode()

        override func viewDidLoad() {
            super.viewDidLoad()

            // prepare the data
            guard let filePath = Bundle.main.path(forResource: "test", ofType: "mp3") else { return }

            print("\(filePath)")
            let fileURL = URL(fileURLWithPath: filePath)
            do {
                let audioFile = try AVAudioFile(forReading: fileURL)

                let audioFormat = audioFile.processingFormat
                let audioFrameCount = UInt32(audioFile.length)
                guard let audioFileBuffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: audioFrameCount) else { return }
                try audioFile.read(into: audioFileBuffer)

                // connect the nodes, and use the data to play
                let mainMixer = audioEngine.mainMixerNode
                audioEngine.attach(audioFilePlayer)
                audioEngine.connect(audioFilePlayer, to: mainMixer, format: audioFileBuffer.format)

                try audioEngine.start()

                audioFilePlayer.play()
                audioFilePlayer.scheduleBuffer(audioFileBuffer, completionHandler: nil)

            } catch {
                print(error)
            }
        }
    }
    


Don't use audioFile.fileFormat; pass audioFile.processingFormat to the AVAudioPCMBuffer initializer instead.

let buffer = AVAudioPCMBuffer(pcmFormat: audioFile.processingFormat,
                                            frameCapacity: bufferCapacity) 
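The initializer above only allocates capacity; the buffer still has to be filled from the file. A hedged sketch of the surrounding calls (`fileURL` is a hypothetical URL to the audio file):

```swift
import AVFoundation

do {
    let audioFile = try AVAudioFile(forReading: fileURL)
    let bufferCapacity = AVAudioFrameCount(audioFile.length)
    if let buffer = AVAudioPCMBuffer(pcmFormat: audioFile.processingFormat,
                                     frameCapacity: bufferCapacity) {
        // Decode the whole file into the PCM buffer.
        try audioFile.read(into: buffer)
    }
} catch {
    print(error)
}
```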
