I would like to use AVAudioSessionCategoryMultiRoute, but unfortunately there are no relevant examples in the Apple Developer Center or on Google. How can I use/implement AVAudioSessionCategoryMultiRoute to define two different routes on an iPhone running iOS 7.0.4?
My goal is to output audio through the speaker and the headphones simultaneously. (I know this was impossible in the past, but I want to try it on iOS 7.)
Thanks for your help.
NSError *sessionError = nil;
[_session setCategory:AVAudioSessionCategoryMultiRoute error:&sessionError];
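Before any routing takes effect, the session also has to be activated. A minimal sketch (assuming the `_session` ivar above holds `[AVAudioSession sharedInstance]`; the error handling is illustrative, not from the original answer):

```objectivec
#import <AVFoundation/AVFoundation.h>

// Sketch: activate the shared audio session after setting the
// multi-route category.
AVAudioSession *session = [AVAudioSession sharedInstance];
NSError *activationError = nil;
if (![session setActive:YES error:&activationError]) {
    NSLog(@"Failed to activate audio session: %@", activationError);
}
```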
Method 1: set the AVAudioPlayer's channel assignments:
// My hardware has 4 output channels
if (_outputPortChannels.count == 4) {
    AVAudioSessionChannelDescription *desiredChannel1 = [_outputPortChannels objectAtIndex:2];
    AVAudioSessionChannelDescription *desiredChannel2 = [_outputPortChannels objectAtIndex:3];
    // Create an array of the desired channels
    NSArray *channelDescriptions = [NSArray arrayWithObjects:desiredChannel1, desiredChannel2, nil];
    // Assign the channels
    _avAudioPlayer1.channelAssignments = channelDescriptions;
    NSLog(@"_player.channelAssignments: %@", _avAudioPlayer1.channelAssignments);
    // Play audio on output channels 3 and 4
    [_avAudioPlayer1 play];
}
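`_outputPortChannels` above has to be populated from the current route. A minimal sketch (the ivar name aside, these are standard AVAudioSession calls; gathering all ports' channels into one flat array is my assumption about how the answer builds it):

```objectivec
#import <AVFoundation/AVFoundation.h>

// Sketch: collect every output channel of the current multi-route
// configuration, in port order (e.g. headphones L/R, then speaker L/R).
NSMutableArray *outputPortChannels = [NSMutableArray array];
AVAudioSessionRouteDescription *route = [[AVAudioSession sharedInstance] currentRoute];
for (AVAudioSessionPortDescription *port in route.outputs) {
    // Each output port contributes its AVAudioSessionChannelDescriptions.
    [outputPortChannels addObjectsFromArray:port.channels];
}
```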
Method 2: a custom channel map
// Get channel map indices based on user specified channelNames
NSMutableArray *channelMapIndices = [self getOutputChannelMapIndices:_inChannelNames];
NSAssert(channelMapIndices && channelMapIndices.count > 0, @"Error getting indices for user specified channel names!");
// AVAudioEngine setup
_engine = [[AVAudioEngine alloc] init];
_output = _engine.outputNode;
_mixer = _engine.mainMixerNode;
_player = [[AVAudioPlayerNode alloc] init];
[_engine attachNode:_player];
// open the file to play
NSString *path1 = [[NSBundle mainBundle] pathForResource:@"yujian" ofType:@"mp3"];
NSURL *songURL1 = [NSURL fileURLWithPath:path1];
NSError *fileError = nil;
_songFile = [[AVAudioFile alloc] initForReading:songURL1 error:&fileError];
NSAssert(_songFile, @"Error opening audio file! %@", fileError);
// create output channel map
SInt32 source1NumChannels = (SInt32)_songFile.processingFormat.channelCount;
// I use constant map
// Play audio to output channel3, channel4
SInt32 outputChannelMap[4] = {-1, -1, 0, 1};
// This will play audio to output channel1, channel2
//SInt32 outputChannelMap[4] = {0, 1, -1, -1};
// set channel map on outputNode AU
UInt32 propSize = (UInt32)sizeof(outputChannelMap);
OSStatus err = AudioUnitSetProperty(_output.audioUnit, kAudioOutputUnitProperty_ChannelMap, kAudioUnitScope_Global, 1, outputChannelMap, propSize);
NSAssert(noErr == err, @"Error setting channel map! %d", (int)err);
// make connections
AVAudioChannelLayout *channel1Layout = [[AVAudioChannelLayout alloc] initWithLayoutTag:kAudioChannelLayoutTag_DiscreteInOrder | (UInt32)source1NumChannels];
AVAudioFormat *format1 = [[AVAudioFormat alloc] initWithStreamDescription:_songFile.processingFormat.streamDescription channelLayout:channel1Layout];
[_engine connect:_player to:_mixer format:format1];
[_engine connect:_mixer to:_output format:format1];
// schedule the file on player
[_player scheduleFile:_songFile atTime:nil completionHandler:nil];
// start engine and player
NSError *startError = nil;
if (!_engine.isRunning && ![_engine startAndReturnError:&startError]) {
    NSLog(@"Error starting engine! %@", startError);
}
[_player play];
It works for me.
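The answer calls a `getOutputChannelMapIndices:` helper without showing its body. A hypothetical sketch of what such a helper could do (this is my reconstruction, not the original code): map user-supplied channel names to their indices within the current route's flattened output channels.

```objectivec
#import <AVFoundation/AVFoundation.h>

// Hypothetical sketch of getOutputChannelMapIndices: -- the original
// answer does not show its implementation. Returns the indices of the
// output channels whose names appear in channelNames.
- (NSMutableArray *)getOutputChannelMapIndices:(NSArray *)channelNames {
    NSMutableArray *indices = [NSMutableArray array];
    NSUInteger index = 0;
    AVAudioSessionRouteDescription *route = [[AVAudioSession sharedInstance] currentRoute];
    for (AVAudioSessionPortDescription *port in route.outputs) {
        for (AVAudioSessionChannelDescription *channel in port.channels) {
            if ([channelNames containsObject:channel.channelName]) {
                [indices addObject:@(index)];
            }
            index++;
        }
    }
    return indices;
}
```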