How to apply audio effects to a file in iOS and write the result to the file system

4

I'm building an app that should let users apply audio filters, such as reverb and gain, to recorded audio.

I can't find any viable source of information on how to apply filters to the file itself, because the processed file later needs to be uploaded to a server.

I'm currently using AudioKit for visualization, and I'm aware it's capable of audio processing, but only for playback. Any suggestions for further research would be appreciated.


2 Answers

9

AudioKit has an offline render node that doesn't require iOS 11. Here's an example. The player.schedule(...) and player.play(at:) parts are required because AKAudioPlayer's underlying AVAudioPlayerNode will block the calling thread waiting for the next render if you start it with player.play().

import UIKit
import AVFoundation
import AudioKit

class ViewController: UIViewController {

    var player: AKAudioPlayer?
    var reverb = AKReverb()
    var boost = AKBooster()
    var offlineRender = AKOfflineRenderNode()

    override func viewDidLoad() {
        super.viewDidLoad()

        guard let url = Bundle.main.url(forResource: "theFunkiestFunkingFunk", withExtension: "mp3") else {
            return
        }
        var audioFile: AKAudioFile?
        do {
            audioFile = try AKAudioFile(forReading: url)
            player = try AKAudioPlayer(file: audioFile!)
        } catch {
            print(error)
            return
        }
        guard let player = player else {
            return
        }

        // Build the signal chain: player -> reverb -> boost -> offline render node
        player >>> reverb >>> boost >>> offlineRender

        AudioKit.output = offlineRender
        AudioKit.start()

        let docs = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
        let dstURL = docs.appendingPathComponent("rendered.caf")

        // Suspend normal (real-time) rendering while rendering offline
        offlineRender.internalRenderEnabled = false
        player.schedule(from: 0, to: player.duration, avTime: nil)

        // Start at an explicit sample time; a plain player.play() would block
        // the calling thread waiting for the next render
        let sampleTimeZero = AVAudioTime(sampleTime: 0, atRate: AudioKit.format.sampleRate)
        player.play(at: sampleTimeZero)
        do {
            try offlineRender.renderToURL(dstURL, seconds: player.duration)
        } catch {
            print(error)
            return
        }
        // Resume normal rendering
        offlineRender.internalRenderEnabled = true

        print("Done! Rendered to " + dstURL.path)
    }
}

Hey Dave, I'm trying this with AudioKit 4.0, which seems to have offline rendering (https://audiokitpro.com/audiokit-4-released/), but AudioKit.mainMixer doesn't seem to be available, so I tried `player >>> reverb >>> boost >>> offlineRender >>> AudioKit.engine.mainMixerNode`, but it didn't work. It produces audio of the same length, but only silence. Thanks! - Juan Giorello
I just tried it again and it's working, but what can I do to avoid hearing it play? If I remove player.play(), the file comes out empty again. I also tried calling player.pause() right after play(), but it crashes. Thanks in advance! - Juan Giorello
1
Stop the player after rendering to the URL, but before setting offlineRender.internalRenderEnabled back to true. Setting offlineRender.internalRenderEnabled = true essentially resumes normal rendering. - dave234
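Putting that fix together, the tail end of the example above becomes (a minimal sketch reusing the same names as the answer's code):

offlineRender.internalRenderEnabled = false
player.schedule(from: 0, to: player.duration, avTime: nil)
player.play(at: AVAudioTime(sampleTime: 0, atRate: AudioKit.format.sampleRate))

try offlineRender.renderToURL(dstURL, seconds: player.duration)

// Stop the player before re-enabling internal rendering,
// otherwise normal (audible) playback resumes.
player.stop()
offlineRender.internalRenderEnabled = true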
Works perfectly! Thanks Dave! - Juan Giorello
This answer is now out of date as of AudioKit 4.0.4 - it now uses AudioKit.renderToFile, which only works on iOS 11+. - sacred0x01
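For those later AudioKit versions, a minimal sketch of the replacement API might look like the following; AudioKit.renderToFile's exact signature has changed between releases, so treat this as an assumption to check against your AudioKit version:

// Assumes iOS 11+ and AudioKit >= 4.0.4; signature may vary by release.
let outputFile = try AKAudioFile(writeIn: .documents, name: "rendered")
try AudioKit.renderToFile(outputFile, duration: player.duration, prerender: {
    player.play()
})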

7
You can use the newly introduced manual rendering mode of AVAudioEngine (see the example below).
If you need to support older macOS/iOS versions, I would be surprised if you couldn't achieve the same thing with AudioKit, even though I haven't tried it myself. For instance, you could use an AKSamplePlayer as the first node (it will read your audio file), then build and connect your effects, and use an AKNodeRecorder as the last node; a rough sketch follows.
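Here is that AudioKit alternative as a rough, untested sketch. It substitutes AKAudioPlayer (used in the other answer) for AKSamplePlayer, and sourceURL and "processed" are placeholder names. Unlike the manual rendering example below, this records the processed signal in real time while the file plays:

import AudioKit

let file = try AKAudioFile(forReading: sourceURL) // sourceURL: your input file
let player = try AKAudioPlayer(file: file)
let reverb = AKReverb(player) // effect chain; add more nodes as needed

// Record the end of the chain into a new file in Documents
let outputFile = try AKAudioFile(writeIn: .documents, name: "processed")
let recorder = try AKNodeRecorder(node: reverb, file: outputFile)

AudioKit.output = reverb
AudioKit.start()

try recorder.record()
player.play()
// ... when playback finishes (after player.duration seconds):
// recorder.stop()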
And here is an example of manual rendering using the new AVAudioEngine feature:
import AVFoundation

//: ## Source File
//: Open the audio file to process
let sourceFile: AVAudioFile
let format: AVAudioFormat
do {
    let sourceFileURL = Bundle.main.url(forResource: "mixLoop", withExtension: "caf")!
    sourceFile = try AVAudioFile(forReading: sourceFileURL)
    format = sourceFile.processingFormat
} catch {
    fatalError("could not open source audio file, \(error)")
}

//: ## Engine Setup
//:    player -> reverb -> mainMixer -> output
//: ### Create and configure the engine and its nodes
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let reverb = AVAudioUnitReverb()

engine.attach(player)
engine.attach(reverb)

// set desired reverb parameters
reverb.loadFactoryPreset(.mediumHall)
reverb.wetDryMix = 50

// make connections
engine.connect(player, to: reverb, format: format)
engine.connect(reverb, to: engine.mainMixerNode, format: format)

// schedule source file
player.scheduleFile(sourceFile, at: nil)
//: ### Enable offline manual rendering mode
do {
    let maxNumberOfFrames: AVAudioFrameCount = 4096 // maximum number of frames the engine will be asked to render in any single render call
    try engine.enableManualRenderingMode(.offline, format: format, maximumFrameCount: maxNumberOfFrames)
} catch {
    fatalError("could not enable manual rendering mode, \(error)")
}
//: ### Start the engine and player
do {
    try engine.start()
    player.play()
} catch {
    fatalError("could not start engine, \(error)")
}
//: ## Offline Render
//: ### Create an output buffer and an output file
//: Output buffer format must be same as engine's manual rendering output format
let outputFile: AVAudioFile
do {
    let documentsPath = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0]
    let outputURL = URL(fileURLWithPath: documentsPath + "/mixLoopProcessed.caf")
    outputFile = try AVAudioFile(forWriting: outputURL, settings: sourceFile.fileFormat.settings)
} catch {
    fatalError("could not open output audio file, \(error)")
}

// buffer to which the engine will render the processed data
let buffer: AVAudioPCMBuffer = AVAudioPCMBuffer(pcmFormat: engine.manualRenderingFormat, frameCapacity: engine.manualRenderingMaximumFrameCount)!
//: ### Render loop
//: Pull the engine for desired number of frames, write the output to the destination file
while engine.manualRenderingSampleTime < sourceFile.length {
    do {
        let framesToRender = min(buffer.frameCapacity, AVAudioFrameCount(sourceFile.length - engine.manualRenderingSampleTime))
        let status = try engine.renderOffline(framesToRender, to: buffer)
        switch status {
        case .success:
            // data rendered successfully
            try outputFile.write(from: buffer)

        case .insufficientDataFromInputNode:
            // applicable only if using the input node as one of the sources
            break

        case .cannotDoInCurrentContext:
            // engine could not render in the current render call, retry in next iteration
            break

        case .error:
            // error occurred while rendering
            fatalError("render failed")
        }
    } catch {
        fatalError("render failed, \(error)")
    }
}

player.stop()
engine.stop()

print("Output \(outputFile.url)")
print("AVAudioEngine offline rendering completed")

You can find more documentation and examples about the AudioUnit updates here.


The problem is that you can't use AudioEngine to process a file without playing it. - matt
That must be new in iOS 11, then. And long overdue. Thanks! - matt
@matt Yes, this is new in iOS 11 / macOS High Sierra. You can watch the corresponding WWDC session at https://developer.apple.com/videos/play/wwdc2017/501/ (the part about offline manual rendering starts around the 8-minute mark and ends around the 15-minute mark). - filaton
