How do I record audio using the Playback Capture API on Android Q?

I am trying to record audio on Android 10 (Q) using the Playback Capture API. Since the Playback Capture API only allows capturing sounds tagged with USAGE_GAME, USAGE_MEDIA, or USAGE_UNKNOWN, I downloaded the UAMP sample and set USAGE_MEDIA on its playback. I also added android:allowAudioPlaybackCapture="true" to its AndroidManifest.xml. I then launched UAMP, started playing a song, and left it running in the background.
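For completeness, the opt-in mentioned above goes on the `<application>` element of the *player* app's manifest (UAMP's, in this experiment). This is a minimal illustrative fragment, not UAMP's actual manifest:

```xml
<!-- Explicit opt-in to playback capture. Setting it explicitly is the safe
     choice: apps targeting API 28 or lower are otherwise excluded from
     capture, while apps targeting API 29+ are capturable by default. -->
<application android:allowAudioPlaybackCapture="true">
    <!-- activities, services, etc. -->
</application>
```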
I built a project named CaptureAudio targeting SDK 29 and installed it on a OnePlus 7 Pro running Android 10. The UI has two buttons to start and stop capturing. When the app starts capturing, the read function fills the buffer with nothing but zeros.
To use playback capture in the project, I set it up as follows:

1. Manifest file:

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    package="com.example.captureaudio">

    <uses-permission android:name="android.permission.RECORD_AUDIO" />
    <uses-permission android:name="android.permission.CAPTURE_AUDIO_OUTPUT" />
    <uses-permission android:name="android.permission.FOREGROUND_SERVICE" />
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />

    <application
        android:allowBackup="false"
        android:icon="@mipmap/ic_launcher"
        android:label="@string/app_name"
        android:roundIcon="@mipmap/ic_launcher_round"
        android:supportsRtl="true"
        android:theme="@style/AppTheme"
        tools:ignore="GoogleAppIndexingWarning">
        <activity android:name=".MainActivity">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />

                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>

        <service
            android:name=".services.MediaProjectionService"
            android:enabled="true"
            android:exported="false"
            android:foregroundServiceType="mediaProjection"
            tools:targetApi="q" />
    </application>

</manifest>

2. MainActivity:

class MainActivity : AppCompatActivity() {

    companion object {
        private const val REQUEST_CODE_CAPTURE_INTENT = 1
        private const val TAG = "CaptureAudio"
        private const val RECORDER_SAMPLE_RATE = 48000
        private const val RECORDER_CHANNELS = AudioFormat.CHANNEL_IN_MONO
        //or AudioFormat.CHANNEL_IN_BACK
        private const val RECORDER_AUDIO_ENCODING = AudioFormat.ENCODING_PCM_16BIT
        //  AudioFormat.ENCODING_PCM_16BIT
    }

    private var audioRecord: AudioRecord? = null
    private val mediaProjectionManager by lazy { (this@MainActivity).getSystemService(Context.MEDIA_PROJECTION_SERVICE) as MediaProjectionManager }
    private val rxPermissions by lazy { RxPermissions(this) }
    private val minBufferSize by lazy {
        AudioRecord.getMinBufferSize(
            RECORDER_SAMPLE_RATE,
            RECORDER_CHANNELS, RECORDER_AUDIO_ENCODING
        )
    }

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)
        val intent = Intent(this, MediaProjectionService::class.java)
        startForegroundService(intent)
        getPermissions()
    }

    private fun getPermissions() {
        rxPermissions
            .request(
                Manifest.permission.RECORD_AUDIO,
                Manifest.permission.FOREGROUND_SERVICE,
                Manifest.permission.WRITE_EXTERNAL_STORAGE
            )
            .subscribe {
                log("Permission result: $it")
                if (it) { // Always true pre-M
                    val captureIntent = mediaProjectionManager.createScreenCaptureIntent()
                    startActivityForResult(captureIntent, REQUEST_CODE_CAPTURE_INTENT)
                } else {
                    getPermissions()
                }
            }
    }

    override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
        super.onActivityResult(requestCode, resultCode, data)
        if (requestCode == REQUEST_CODE_CAPTURE_INTENT && data != null) {
            val mediaProjection = mediaProjectionManager.getMediaProjection(resultCode, data)
            val playbackConfig = AudioPlaybackCaptureConfiguration.Builder(mediaProjection)
                .addMatchingUsage(AudioAttributes.USAGE_MEDIA)
                .addMatchingUsage(AudioAttributes.USAGE_UNKNOWN)
                .addMatchingUsage(AudioAttributes.USAGE_GAME)
                .build()
            audioRecord = AudioRecord.Builder()
                .setAudioPlaybackCaptureConfig(playbackConfig)
                .setBufferSizeInBytes(minBufferSize * 2)
                .setAudioFormat(
                    AudioFormat.Builder()
                        .setEncoding(RECORDER_AUDIO_ENCODING)
                        .setSampleRate(RECORDER_SAMPLE_RATE)
                        .setChannelMask(RECORDER_CHANNELS)
                        .build()
                )
                .build()
        }
    }

    fun startCapture(view: View) {
        audioRecord?.apply {
            startRecording()
            log("Is stopped: $state $recordingState")
            startRecordingIntoFile()
        }
        stopRecBtn.visibility = View.VISIBLE
        startRecBtn.visibility = View.INVISIBLE
    }

    private fun AudioRecord.startRecordingIntoFile() {
        val file = File(
            getExternalFilesDir(Environment.DIRECTORY_MUSIC),
            "temp.wav"
            //System.currentTimeMillis().toString() + ".wav"
        )
        if (!file.exists())
            file.createNewFile()

        GlobalScope.launch {
            val out = file.outputStream()
            audioRecord.apply {
                while (recordingState == AudioRecord.RECORDSTATE_RECORDING) {

                    val buffer = ShortArray(minBufferSize)//ByteBuffer.allocate(MIN_BUFFER_SIZE)
                    val result = read(buffer, 0, minBufferSize)

                    // Checking if I am actually getting something in a buffer
                    val b: Short = 0
                    var nonZeroValueCount = 0
                    for (i in 0 until minBufferSize) {
                        if (buffer[i] != b) {
                            nonZeroValueCount += 1
                            log("Value: ${buffer[i]}")
                        }
                    }
                    if (nonZeroValueCount != 0) {

                        // Record the non-zero values in the file..
                        log("Result $nonZeroValueCount")
                        when (result) {
                            AudioRecord.ERROR -> showToast("ERROR")
                            AudioRecord.ERROR_INVALID_OPERATION -> showToast("ERROR_INVALID_OPERATION")
                            AudioRecord.ERROR_DEAD_OBJECT -> showToast("ERROR_DEAD_OBJECT")
                            AudioRecord.ERROR_BAD_VALUE -> showToast("ERROR_BAD_VALUE")
                            else -> {
                                log("Appending $buffer into ${file.absolutePath}")
                                out.write(shortToByte(buffer))
                            }
                        }
                    }
                }
            }
            out.close()
        }
    }

    private fun shortToByte(shortArray: ShortArray): ByteArray {
        val byteOut = ByteArray(shortArray.size * 2)
        ByteBuffer.wrap(byteOut).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer().put(shortArray)
        return byteOut
    }

    private fun showToast(msg: String) {
        runOnUiThread {
            log("Toast: $msg")
            Toast.makeText(this@MainActivity, msg, Toast.LENGTH_LONG).show()
        }
    }

    fun stopCapture(view: View) {
        audioRecord?.apply {
            stop()
            log("Is stopped: $state $recordingState")
        }
        stopRecBtn.visibility = View.INVISIBLE
        startRecBtn.visibility = View.VISIBLE
    }

    private fun log(msg: String) {
        Log.d(TAG, msg)
    }

    override fun onDestroy() {
        super.onDestroy()
        audioRecord?.stop()
        audioRecord?.release()
        audioRecord = null
    }
}
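As a side note, the shortToByte helper in the activity above can be sanity-checked on a plain JVM. This standalone copy (same body, no Android dependencies) shows that samples are packed little-endian, which is the byte order a WAV/PCM consumer expects:

```kotlin
import java.nio.ByteBuffer
import java.nio.ByteOrder

// Standalone copy of the activity's shortToByte helper: packs 16-bit
// samples into a byte array in little-endian order.
fun shortToByte(shortArray: ShortArray): ByteArray {
    val byteOut = ByteArray(shortArray.size * 2)
    ByteBuffer.wrap(byteOut).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer().put(shortArray)
    return byteOut
}

val packed = shortToByte(shortArrayOf(0x1234, -1))
// 0x1234 -> low byte 0x34 first, then 0x12; -1 -> 0xFF 0xFF
println(packed.joinToString { String.format("%02X", it) })  // 34, 12, FF, FF
```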

3. MediaProjectionService:

class MediaProjectionService : Service() {

    companion object {
        private const val CHANNEL_ID = "ForegroundServiceChannel"
    }

    override fun onBind(intent: Intent?): IBinder? {
        return null
    }

    override fun onStartCommand(intent: Intent?, flags: Int, startId: Int): Int {

        createNotificationChannel()
        val notificationIntent = Intent(this, MainActivity::class.java)
        val pendingIntent = PendingIntent.getActivity(
            this,
            0, notificationIntent, 0
        )

        val notification = NotificationCompat.Builder(this, CHANNEL_ID)
            .setContentTitle("Foreground Service")
            .setContentText("Call Recording Service")
//            .setSmallIcon(R.drawable.ic_stat_name)
            .setContentIntent(pendingIntent)
            .build()

        startForeground(1, notification)
        return START_NOT_STICKY
    }

    private fun createNotificationChannel() {
        val serviceChannel = NotificationChannel(
            CHANNEL_ID,
            "Foreground Service Channel",
            NotificationManager.IMPORTANCE_DEFAULT
        )

        val manager = getSystemService(NotificationManager::class.java)
        manager!!.createNotificationChannel(serviceChannel)
    }
}

The problems are:

1. The file /storage/emulated/0/Android/data/com.example.captureaudio/files/Music/temp.wav is created, but it contains nothing but zeros. I also inspected the file with xxd:
OnePlus7Pro:/sdcard # xxd /storage/emulated/0/Android/data/com.example.captureaudio/files/Music/temp.wav | head
00000000: 0000 0000 0000 0000 0000 0000 0000 0000  ................
00000010: 0000 0000 0000 0000 0000 0000 0000 0000  ................
00000020: 0000 0000 0000 0000 0000 0000 0000 0000  ................
00000030: 0000 0000 0000 0000 0000 0000 0000 0000  ................
00000040: 0000 0000 0000 0000 0000 0000 0000 0000  ................
00000050: 0000 0000 0000 0000 0000 0000 0000 0000  ................
00000060: 0000 0000 0000 0000 0000 0000 0000 0000  ................
00000070: 0000 0000 0000 0000 0000 0000 0000 0000  ................
00000080: 0000 0000 0000 0000 0000 0000 0000 0000  ................
00000090: 0000 0000 0000 0000 0000 0000 0000 0000  ................

2. When playing it on the device, I get the error "Couldn't play the requested track".

Any help or suggestions on what I am missing?


Did you solve this? I am facing the same issue on Android 10. - Bhagyashri
I have stopped working on it. But I do know the media player cannot play the track because the file has no header; if you add a WAV header to the file, the media player will definitely play it. The zeros in the file come from invalid AudioRecord parameters; you could try reading up on the RECORDER_* parameters. Since I haven't solved it myself, I can't help you further. - Vatish Sharma
@VatishSharma You can import the raw audio data into Audacity and play it there. - Ivan Sheihets
1 Answer

I think something goes wrong when you write the audio data into the .wav file.
Here is my sample app, which records audio on Android 10 (Q) using the Playback Capture API. In that app I write the audio data to a .pcm file and then encode it into an .mp3 audio file, which you can listen to and process with any player.
Warning! The QRecorder app integrates the LAME library via the NDK.
If you don't want to spend time importing LAME into your project, you can use the PCM-Decoder library to decode the recorded .pcm file.
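Complementing the answer above: instead of transcoding to .mp3, the captured PCM can also be made playable by prepending a standard 44-byte RIFF/WAVE header, which is exactly what the question's temp.wav is missing. A minimal plain-Kotlin sketch (not from the answer's app), assuming 16-bit little-endian PCM as recorded in the question:

```kotlin
import java.nio.ByteBuffer
import java.nio.ByteOrder

// Prepends a canonical 44-byte WAV header to raw 16-bit little-endian PCM.
fun pcmToWav(pcm: ByteArray, sampleRate: Int, channels: Int): ByteArray {
    val bitsPerSample = 16
    val blockAlign = channels * bitsPerSample / 8
    val header = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN)
    header.put("RIFF".toByteArray())
    header.putInt(36 + pcm.size)              // RIFF chunk size
    header.put("WAVE".toByteArray())
    header.put("fmt ".toByteArray())          // format sub-chunk
    header.putInt(16)                         // sub-chunk size for PCM
    header.putShort(1)                        // audio format 1 = PCM
    header.putShort(channels.toShort())
    header.putInt(sampleRate)
    header.putInt(sampleRate * blockAlign)    // byte rate
    header.putShort(blockAlign.toShort())
    header.putShort(bitsPerSample.toShort())
    header.put("data".toByteArray())          // data sub-chunk
    header.putInt(pcm.size)
    return header.array() + pcm
}
```

Usage would be a one-liner when closing the capture, e.g. `File(path).writeBytes(pcmToWav(rawPcm, 48000, 1))` for the question's 48 kHz mono setup.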

Content provided by Stack Overflow.