Can't play voice audio recorded with AVCaptureAudioDataOutputSampleBufferDelegate


I have been searching and researching this for days but can't seem to get it working, and I can't find any solution online.

I am trying to capture my voice with the microphone and then play it back through the speaker.

Here is my code:

import UIKit
import AVFoundation

class ViewController: UIViewController, AVAudioRecorderDelegate, AVCaptureAudioDataOutputSampleBufferDelegate {

var recordingSession: AVAudioSession!
var audioRecorder: AVAudioRecorder!
var captureSession: AVCaptureSession!
var microphone: AVCaptureDevice!
var inputDevice: AVCaptureDeviceInput!
var outputDevice: AVCaptureAudioDataOutput!

override func viewDidLoad() {
    super.viewDidLoad()

    recordingSession = AVAudioSession.sharedInstance()

    do{
        try recordingSession.setCategory(AVAudioSessionCategoryPlayAndRecord)
        try recordingSession.setMode(AVAudioSessionModeVoiceChat)
        try recordingSession.setPreferredSampleRate(44000.00)
        try recordingSession.setPreferredIOBufferDuration(0.2)
        try recordingSession.setActive(true)

        recordingSession.requestRecordPermission() { [unowned self] (allowed: Bool) -> Void in
            DispatchQueue.main.async {
                if allowed {

                    do{
                        self.microphone = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeAudio)
                        self.inputDevice = try AVCaptureDeviceInput(device: self.microphone)

                        self.outputDevice = AVCaptureAudioDataOutput()
                        self.outputDevice.setSampleBufferDelegate(self, queue: DispatchQueue.main)

                        self.captureSession = AVCaptureSession()
                        self.captureSession.addInput(self.inputDevice)
                        self.captureSession.addOutput(self.outputDevice)
                        self.captureSession.startRunning()
                    }
                    catch let error {
                        print(error.localizedDescription)
                    }
                }
            }
        }
    }catch let error{
        print(error.localizedDescription)
    }
}

The callback:

func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {

    var audioBufferList = AudioBufferList(
        mNumberBuffers: 1,
        mBuffers: AudioBuffer(mNumberChannels: 0,
        mDataByteSize: 0,
        mData: nil)
    )

    var blockBuffer: CMBlockBuffer?

    var osStatus = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(

        sampleBuffer,
        nil,
        &audioBufferList,
        MemoryLayout<AudioBufferList>.size,
        nil,
        nil,
        UInt32(kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment),
        &blockBuffer
    )

    do {
        var data: NSMutableData = NSMutableData.init()
        for i in 0..<audioBufferList.mNumberBuffers {

            var audioBuffer = AudioBuffer(
                 mNumberChannels: audioBufferList.mBuffers.mNumberChannels,
                 mDataByteSize: audioBufferList.mBuffers.mDataByteSize,
                 mData: audioBufferList.mBuffers.mData
            )

            let frame = audioBuffer.mData?.load(as: Float32.self)
            data.append(audioBuffer.mData!, length: Int(audioBuffer.mDataByteSize))

        }

        let dataFromNsData = Data(referencing: data)
        let avAudioPlayer = try AVAudioPlayer(data: dataFromNsData)
        avAudioPlayer.prepareToPlay()
        avAudioPlayer.play()
    }
    catch let error {
        print(error.localizedDescription)
        // prints: The operation couldn’t be completed. (OSStatus error 1954115647.)
    }
}
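The OSStatus in that comment is itself informative: CoreAudio statuses are often four-character codes, and 1954115647 decodes to 'typ?' (kAudioFileUnsupportedFileTypeError), meaning `AVAudioPlayer(data:)` was handed bytes that don't look like any audio file format it recognizes. A small standalone decoder (the function name is mine, not from the original post):

```swift
import Foundation

// Decode a CoreAudio OSStatus into its four-character code, if printable.
// The asker's error 1954115647 is 0x7479703F, i.e. 'typ?' --
// kAudioFileUnsupportedFileTypeError.
func fourCharCode(from status: Int32) -> String {
    let n = UInt32(bitPattern: status)
    let bytes: [UInt8] = [UInt8((n >> 24) & 0xFF), UInt8((n >> 16) & 0xFF),
                          UInt8((n >> 8) & 0xFF), UInt8(n & 0xFF)]
    return String(bytes.map { Character(UnicodeScalar($0)) })
}

print(fourCharCode(from: 1954115647)) // prints "typ?"
```

Raw LPCM samples carry no header, so AVAudioPlayer, which expects a complete file such as WAV or CAF, rejects them with exactly this code.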

Any help would be amazing, and it would probably help a lot of other people too, since there are so many incomplete Swift versions of this out there. Thank you.

Do you have this project on GitHub? - Rhythmic Fistman
Yes @RhythmicFistman, I just created one, here is the link: https://github.com/nullforlife/MicToSpeakersIOS/tree/master/MicToSpeakers - nullforlife
1 Answer

You're so close! You are capturing audio in the didOutputSampleBuffer callback, but that is a high-frequency callback, so you're creating a lot of AVAudioPlayers and passing them raw LPCM data, while they only know how to parse CoreAudio file types, and then they go out of scope anyway.

You can very easily play the buffers you capture with AVCaptureSession using AVAudioEngine's AVAudioPlayerNode, but at that point you may as well use AVAudioEngine to record from the microphone too:
import UIKit
import AVFoundation

class ViewController: UIViewController {
    var engine = AVAudioEngine()

    override func viewDidLoad() {
        super.viewDidLoad()

        let input = engine.inputNode!
        let player = AVAudioPlayerNode()
        engine.attach(player)

        let bus = 0
        let inputFormat = input.inputFormat(forBus: bus)
        engine.connect(player, to: engine.mainMixerNode, format: inputFormat)

        input.installTap(onBus: bus, bufferSize: 512, format: inputFormat) { (buffer, time) -> Void in
            player.scheduleBuffer(buffer)
        }

        try! engine.start()
        player.play()
    }
}
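One caveat the answer leaves implicit (the lines below are my addition, not part of the accepted answer): on a real device you generally still need to configure the shared AVAudioSession before calling engine.start(), otherwise input may be silent or output may be routed to the quiet receiver; and with the microphone feeding the speaker directly you will get feedback unless you use headphones. A sketch of the session setup, using the same Swift 3-era API names as the rest of the thread:

```swift
import AVFoundation

// Assumed session setup; run this before engine.start().
let session = AVAudioSession.sharedInstance()
try session.setCategory(AVAudioSessionCategoryPlayAndRecord,
                        with: [.defaultToSpeaker]) // route output to the loudspeaker
try session.setActive(true)
```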

Great, exactly what I was looking for! Thank you so much! - nullforlife
Can you explain the usage of func captureOutput a bit more, and how to use your code? @Rhythmic - Amey
Could you please help with this question? I have written out all the details. Question link: https://stackoverflow.com/questions/50968519/how-to-play-nsdata-or-buffer-on-iphone - Amey
