Here is what I have managed so far. It is far from perfect, but it works to some extent.
To get all the channels, you need to use AVAudioPCMBuffer and store two channels from the file into each buffer. You also need a separate AVAudioPlayerNode for each channel pair, and then simply connect every player to an AVAudioMixerNode and you are done. Some simple code for 6-channel audio:
AVAudioFile *file = [[AVAudioFile alloc] initForReading:[[NSBundle mainBundle] URLForResource:@"nums6ch" withExtension:@"wav"] error:nil];
AVAudioFormat *outputFormat = [[AVAudioFormat alloc] initWithCommonFormat:AVAudioPCMFormatFloat32 sampleRate:file.processingFormat.sampleRate channels:2 interleaved:false];
AVAudioPCMBuffer *wholeBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:file.processingFormat frameCapacity:(AVAudioFrameCount)file.length];
AVAudioPCMBuffer *buffer1 = [[AVAudioPCMBuffer alloc] initWithPCMFormat:outputFormat frameCapacity:(AVAudioFrameCount)file.length];
AVAudioPCMBuffer *buffer2 = [[AVAudioPCMBuffer alloc] initWithPCMFormat:outputFormat frameCapacity:(AVAudioFrameCount)file.length];
AVAudioPCMBuffer *buffer3 = [[AVAudioPCMBuffer alloc] initWithPCMFormat:outputFormat frameCapacity:(AVAudioFrameCount)file.length];
// Read the whole file into one deinterleaved 6-channel buffer
[file readIntoBuffer:wholeBuffer error:nil];
// Copy each pair of channels into its own stereo buffer
memcpy(buffer1.audioBufferList->mBuffers[0].mData, wholeBuffer.audioBufferList->mBuffers[0].mData, wholeBuffer.audioBufferList->mBuffers[0].mDataByteSize);
memcpy(buffer1.audioBufferList->mBuffers[1].mData, wholeBuffer.audioBufferList->mBuffers[1].mData, wholeBuffer.audioBufferList->mBuffers[1].mDataByteSize);
buffer1.frameLength = wholeBuffer.audioBufferList->mBuffers[0].mDataByteSize/sizeof(float);
memcpy(buffer2.audioBufferList->mBuffers[0].mData, wholeBuffer.audioBufferList->mBuffers[2].mData, wholeBuffer.audioBufferList->mBuffers[2].mDataByteSize);
memcpy(buffer2.audioBufferList->mBuffers[1].mData, wholeBuffer.audioBufferList->mBuffers[3].mData, wholeBuffer.audioBufferList->mBuffers[3].mDataByteSize);
buffer2.frameLength = wholeBuffer.audioBufferList->mBuffers[0].mDataByteSize/sizeof(float);
memcpy(buffer3.audioBufferList->mBuffers[0].mData, wholeBuffer.audioBufferList->mBuffers[4].mData, wholeBuffer.audioBufferList->mBuffers[4].mDataByteSize);
memcpy(buffer3.audioBufferList->mBuffers[1].mData, wholeBuffer.audioBufferList->mBuffers[5].mData, wholeBuffer.audioBufferList->mBuffers[5].mDataByteSize);
buffer3.frameLength = wholeBuffer.audioBufferList->mBuffers[0].mDataByteSize/sizeof(float);
// One player per channel pair, all mixed down to the output
AVAudioEngine *engine = [[AVAudioEngine alloc] init];
AVAudioPlayerNode *player1 = [[AVAudioPlayerNode alloc] init];
AVAudioPlayerNode *player2 = [[AVAudioPlayerNode alloc] init];
AVAudioPlayerNode *player3 = [[AVAudioPlayerNode alloc] init];
AVAudioMixerNode *mixer = [[AVAudioMixerNode alloc] init];
[engine attachNode:player1];
[engine attachNode:player2];
[engine attachNode:player3];
[engine attachNode:mixer];
[engine connect:player1 to:mixer format:outputFormat];
[engine connect:player2 to:mixer format:outputFormat];
[engine connect:player3 to:mixer format:outputFormat];
[engine connect:mixer to:engine.outputNode format:outputFormat];
[engine startAndReturnError:nil];
[player1 scheduleBuffer:buffer1 completionHandler:nil];
[player2 scheduleBuffer:buffer2 completionHandler:nil];
[player3 scheduleBuffer:buffer3 completionHandler:nil];
[player1 play];
[player2 play];
[player3 play];
Now this solution is still far from perfect, because each player's play call happens at a different time, so there is a lag between the channel pairs. I also still cannot play 8-channel audio from my test files (see the link in the OP). The AVAudioFile processing format reports a channel count of 0, and even if I create a custom format with the right channel count and layout, I get an error when reading into the buffer. Note that with AUGraph I can play this file perfectly.
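One way to reduce that start-time lag (a sketch, not tested against the bug mentioned below) is to schedule every buffer at the same future AVAudioTime via scheduleBuffer:atTime:options:completionHandler:, so all three players start on a shared host-time deadline rather than whenever their play calls happen to run. The 0.1 s lead time here is an arbitrary example value:

```objc
#import <mach/mach_time.h>

// Convert seconds to mach host-time ticks
static uint64_t hostTicksForSeconds(double seconds) {
    mach_timebase_info_data_t info;
    mach_timebase_info(&info);
    return (uint64_t)(seconds * 1.0e9 * (double)info.denom / (double)info.numer);
}

// One shared start time, 0.1 s from now, for all three players
uint64_t startHostTime = mach_absolute_time() + hostTicksForSeconds(0.1);
AVAudioTime *startTime = [AVAudioTime timeWithHostTime:startHostTime];

[player1 scheduleBuffer:buffer1 atTime:startTime options:0 completionHandler:nil];
[player2 scheduleBuffer:buffer2 atTime:startTime options:0 completionHandler:nil];
[player3 scheduleBuffer:buffer3 atTime:startTime options:0 completionHandler:nil];

[player1 play];
[player2 play];
[player3 play];
```

In principle this anchors all three players to the same clock; in practice, see the edit below about the synchronization bug.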
So I will wait before accepting this answer; if you have a better solution, please share it.
EDIT
It looks like both my inability to synchronize the nodes and the inability to play this particular 8-channel audio file are bugs (confirmed by Apple developer support).
So a little advice for people working with audio on iOS: while AVAudioEngine is fine for simple things, you should choose AUGraph for anything more complicated, even things that are supposed to work with AVAudioEngine. And if you don't know how to replicate some AVAudioEngine feature in AUGraph (as I didn't), well, tough luck.
Could you use AVAudioPlayerNode's scheduleBuffer:atTime:options:completionHandler:, specifying the same AVAudioTime for all three nodes? – Mark