Hi. I want to implement a real-time audio application with the new AVAudioEngine in Swift. Does anyone have experience with this new framework? How do real-time applications work with it?
My first idea was to store the (processed) input data in an AVAudioPCMBuffer object and then have an AVAudioPlayerNode play it back, as you can see in my demo class:
import AVFoundation

class AudioIO {
    var audioEngine: AVAudioEngine
    var audioInputNode: AVAudioInputNode
    var audioPlayerNode: AVAudioPlayerNode
    var audioMixerNode: AVAudioMixerNode
    var audioBuffer: AVAudioPCMBuffer

    init() {
        audioEngine = AVAudioEngine()
        audioPlayerNode = AVAudioPlayerNode()
        audioMixerNode = audioEngine.mainMixerNode

        let frameLength = AVAudioFrameCount(256)
        audioBuffer = AVAudioPCMBuffer(pcmFormat: audioPlayerNode.outputFormat(forBus: 0), frameCapacity: frameLength)!
        audioBuffer.frameLength = frameLength

        audioInputNode = audioEngine.inputNode
        // tap the input node and copy its samples into the loop buffer
        audioInputNode.installTap(onBus: 0, bufferSize: frameLength, format: audioInputNode.outputFormat(forBus: 0)) { (buffer, time) in
            let channels = UnsafeBufferPointer(start: buffer.floatChannelData, count: Int(buffer.format.channelCount))
            let floats = UnsafeBufferPointer(start: channels[0], count: Int(buffer.frameLength))
            let channelStride = Int(self.audioMixerNode.outputFormat(forBus: 0).channelCount)
            for i in stride(from: 0, to: Int(self.audioBuffer.frameLength), by: channelStride) {
                // doing my real time stuff
                self.audioBuffer.floatChannelData![0][i] = floats[i]
            }
        }

        // setup audio engine
        audioEngine.attach(audioPlayerNode)
        audioEngine.connect(audioPlayerNode, to: audioMixerNode, format: audioPlayerNode.outputFormat(forBus: 0))
        try? audioEngine.start()

        // play player and loop the buffer
        audioPlayerNode.play()
        audioPlayerNode.scheduleBuffer(audioBuffer, at: nil, options: .loops, completionHandler: nil)
    }
}
But this is far from real-time and not very efficient. Any ideas or experience? It doesn't matter whether you prefer Objective-C or Swift; I'm grateful for any notes, remarks, comments, solutions, etc.
Connect AVAudioEngine.inputNode to AVAudioEngine.outputNode. - bio
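For illustration, a direct connection along the lines of that comment might look like the sketch below. This is my own minimal example, not code from the question or the comment; the function name startPassthrough and the error handling are assumptions.

import AVFoundation

// Minimal passthrough sketch: route the microphone input straight to the
// hardware output, with no intermediate AVAudioPCMBuffer.
func startPassthrough() throws -> AVAudioEngine {
    let engine = AVAudioEngine()
    let input = engine.inputNode
    // Connect input directly to output using the input's hardware format.
    // (You could also route through engine.mainMixerNode instead.)
    engine.connect(input, to: engine.outputNode, format: input.outputFormat(forBus: 0))
    try engine.start()
    return engine
}

With headphones connected (to avoid feedback), calling startPassthrough() should give low-latency monitoring of the input without scheduling buffers yourself.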