HTML5 audio tag playback delay in Safari

27

I'm trying to implement a simple sound-board effect: play an mp3/ogg sound with an HTML tag on click. I'd like it to work in Firefox, Safari, and Safari on the iPad.

I've tried quite a few approaches and settled on the following:

HTML

    <span id="play-blue-note" class="play blue" ></span>
    <span id="play-green-note" class="play green" ></span>


    <audio id="blue-note" style="display:none" controls preload="auto" autobuffer> 
        <source src="blue.mp3" />
        <source src="blue.ogg" />
        <!-- now include flash fall back -->
    </audio>

    <audio id="green-note" style="display:none" controls preload="auto" autobuffer> 
        <source src="green.mp3" />
        <source src="green.ogg" />
    </audio>

JS

function addSource(elem, path) {
    $('<source>').attr('src', path).appendTo(elem);
}

$(document).ready(function() {

    $('body').delegate('.play', 'click touchstart', function() {
        var clicked = $(this).attr('id').split('-')[1];
        $('#' + clicked + '-note').get(0).play();
    });

});

This seems to work fine in Firefox, but in Safari there is a noticeable delay on every click, even when you click repeatedly and the audio file has already loaded. On Safari on the iPad the behavior is almost unpredictable.

Also, Safari's performance seems to improve when I test locally, so I'm guessing Safari is downloading the files on every request. Is that possible? How can I avoid it? Thanks!


Hey, any thoughts on the Safari issue? :S - Ignacio
The current answer is correct and, in spirit, matches earlier answers on similar topics. As the author and the replies in this thread note, the most effective workaround really is to merge all of your sounds into a single file and seek into it at different offsets. That takes extra work, but that's what happens when Apple makes a design decision everyone else is forced to follow. (Let me know if you'd like this posted as an answer.) - MrGomez
@ign Can you play multiple sounds at once in desktop Safari? Unfortunately I'm seeing delay problems in desktop Safari too. FF and Chrome work fine. - trainoasis
13 Answers

32

In desktop Safari, adding an AudioContext fixes the problem:

const AudioContext = window.AudioContext || window.webkitAudioContext;
const audioCtx = new AudioContext();

I found this by accident, so I don't know why it works, but it removed the delay in my app.
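A hedged sketch of why the two lines above may help: Safari suspends an `AudioContext` until a user gesture, and keeping one alive appears to keep the audio pipeline warm. The `makeUnlocker` helper below is an illustrative name, not from the original answer; it simply resumes a suspended context on the first gesture.

```javascript
// Guarded so the sketch also runs outside a browser (window undefined).
var AudioContextClass = (typeof window !== 'undefined')
  ? (window.AudioContext || window.webkitAudioContext)
  : null;

function makeUnlocker(ctx) {
  // Returns a one-shot handler that resumes a suspended AudioContext.
  var unlocked = false;
  return function unlock() {
    if (!unlocked && ctx && ctx.state === 'suspended' &&
        typeof ctx.resume === 'function') {
      ctx.resume();
    }
    unlocked = true;
    return unlocked;
  };
}

if (AudioContextClass) {
  var audioCtx = new AudioContextClass();
  // Resume on the first user gesture, as Safari's autoplay policy requires.
  document.addEventListener('touchstart', makeUnlocker(audioCtx), { once: true });
  document.addEventListener('click', makeUnlocker(audioCtx), { once: true });
}
```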


3
I can vouch that just adding these two lines of code improved audio performance. - Brian Risk
1
Can you give more detail? How do you hook audioCtx up to the actual audio object? It seems to me that just adding these lines, with nothing further connecting them to the actual audio, shouldn't be enough. - SeanRtS
1
What is this answer actually saying? That you should properly use fetch + AudioContext instead of plain Audio, or just add these two lines as a magic trick? Either way, a bundler will treat them as dead code and strip them. - Alexander Zinchuk
This fixed an audio bug in my Chrome extension that I'd been fighting for weeks. Happy little accident. - Tompina
For those wondering why this works: I believe what happens with mobile audio is that only one program at a time can hold the audio system open. Each time it switches back to you, it mutes the first ~100 ms of the sound; if you're playing short sounds, the whole sound gets dropped. The AudioContext grabs the audio system and keeps your program active, so you don't lose the audio channel the moment a sound finishes, and the next sound isn't truncated. This bug drove me crazy for a whole day. It's related to the length of the audio! - Edward De Jong

15

I answered another iOS/<audio> question just a few minutes ago; it applies here as well:

iOS devices disable preloading of <audio> and <video> to save bandwidth.

In Safari on iOS (for all devices, including iPad), where the user may be on a cellular network and charged per data unit, preload and autoplay are disabled. No data is loaded until the user initiates it.

Source: Safari Developer Library
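Given that restriction, a common workaround (sketched here with illustrative names, not taken from the answer itself) is to call `load()` on each `<audio>` element inside the first user gesture, so that later `play()` calls find the data already buffered:

```javascript
// Prime <audio> elements on the first touch: iOS allows the download to
// start because it is user-initiated.
function primeOnFirstTouch(doc, elements) {
  function handler() {
    elements.forEach(function (el) {
      if (typeof el.load === 'function') {
        el.load(); // kick off the (now permitted) download
      }
    });
    doc.removeEventListener('touchstart', handler);
  }
  doc.addEventListener('touchstart', handler);
  return handler; // returned so callers can invoke or remove it directly
}
```

In a page this might be wired up as `primeOnFirstTouch(document, [].slice.call(document.querySelectorAll('audio')))`.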


1
Thanks. So I guess there's no way to get near-real-time feedback on the iPad, right? Any insight on Safari? - Ignacio

5
The problem with Safari is that it makes a request for the audio file every time you play it. You can try creating an HTML5 cache manifest. Unfortunately, in my experience you can only add one audio file at a time to the cache. A workaround might be to merge all of your audio files sequentially into a single file and start playback at a specific position depending on the sound you need. You could create an interval to track the current playback position and pause it once it reaches a certain timestamp.
Read more about creating an HTML5 cache manifest here:

http://www.html5rocks.com/en/tutorials/appcache/beginner/

http://www.whatwg.org/specs/web-apps/current-work/multipage/offline.html
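The single-file approach described above can be sketched roughly as follows. The sprite names and second offsets are made-up placeholders; a real map would describe your own merged file. Instead of an interval, this sketch uses the element's `timeupdate` event to pause at the end mark:

```javascript
// Hypothetical sprite map: each sound's window (in seconds) inside one
// concatenated audio file.
var sprites = {
  blue:  { start: 0.0, end: 1.2 },
  green: { start: 1.2, end: 2.5 }
};

function spriteWindow(map, name) {
  // Look up the [start, end] window for a named sound, or null if unknown.
  return Object.prototype.hasOwnProperty.call(map, name) ? map[name] : null;
}

function playSprite(audioEl, name) {
  var win = spriteWindow(sprites, name);
  if (!win) return;
  audioEl.currentTime = win.start; // seek into the merged file
  function onTick() {
    if (audioEl.currentTime >= win.end) {
      audioEl.pause();
      audioEl.removeEventListener('timeupdate', onTick);
    }
  }
  audioEl.addEventListener('timeupdate', onTick);
  audioEl.play();
}
```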

Hope it helps!

Thanks for the insight. It sounds like too much trouble for the simple task I'm trying to accomplish, though. - Ignacio
I don't think you're entirely right, but I do think your suggestion is a good one: put all the sound effects into a single file and play them at different offsets, as described under "Specifying playback range" in https://developer.mozilla.org/zh-CN/docs/Web/Guide/HTML/Using_HTML5_audio_and_video. I wonder how well that interacts with multiple <audio> elements sharing the same source, or whether it works at all. I'm seeing a similar problem with a game I'm working on: Chrome plays audio perfectly smoothly, but both Firefox and Safari lag noticeably. - Bart Read
1
What this means for rapidly repeated sounds - player shots, bullet impacts, and the like - is that the repeats often don't play at all. Another potential workaround might be to attach several <audio> elements to the same source and play them in round-robin fashion. Again, I don't know whether that works, but it's something I plan to try later. For what it's worth, I'm seeing these problems in desktop browsers, never mind mobile. - Bart Read
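The round-robin idea from the comment above might look like this. It is an untested sketch in the same spirit as the comment; `makePool` and its `createEl` factory are illustrative names. Each rapid repeat starts on a fresh element instead of restarting the busy one:

```javascript
// A small pool of <audio> elements on the same source, cycled round-robin.
function makePool(createEl, src, size) {
  var els = [];
  for (var i = 0; i < size; i++) {
    var el = createEl();
    el.src = src;
    els.push(el);
  }
  var next = 0;
  return {
    play: function () {
      var el = els[next];
      next = (next + 1) % size; // advance the round-robin cursor
      if (typeof el.play === 'function') el.play();
      return el;
    }
  };
}
```

In a browser this might be used as `var shots = makePool(function () { return new Audio(); }, 'shot.mp3', 4);` and then `shots.play()` per event ('shot.mp3' is a placeholder).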

3

HTML5 audio in Safari on iOS has a delay problem when using the <audio> element, but it can be overcome by using AudioContext.

My snippet is based on what I learned from https://lowlag.alienbill.com/.

Please test the functionality on your own iOS device (I tested on iOS 12): https://fiddle.jshell.net/eLya8fxb/51/show/

Snippet on JS Fiddle: https://jsfiddle.net/eLya8fxb/51/

// Requires jQuery 

// Adding:
// Strip down lowLag.js so it only supports audioContext (So no IE11 support (only Edge))
// Add "loop" monkey patch needed for looping audio (my primary usage)
// Add single audio channel - to avoid overlapping audio playback

// Original source: https://lowlag.alienbill.com/lowLag.js

if (!window.console) console = {
  log: function() {}
};

var lowLag = new function() {
  this.someVariable = undefined;
  this.showNeedInit = function() {
    lowLag.msg("lowLag: you must call lowLag.init() first!");
  }
  this.load = this.showNeedInit;
  this.play = this.showNeedInit;
  this.pause = this.showNeedInit;
  this.stop = this.showNeedInit;
  this.switch = this.showNeedInit;
  this.change = this.showNeedInit;
  
  this.audioContext = undefined;
  this.audioContextPendingRequest = {};
  this.audioBuffers = {};
  this.audioBufferSources = {};
  this.currentTag = undefined;
  this.currentPlayingTag = undefined;

  this.init = function() {
    this.msg("init audioContext");
    this.load = this.loadSoundAudioContext;
    this.play = this.playSoundAudioContext;
    this.pause = this.pauseSoundAudioContext;
    this.stop = this.stopSoundAudioContext;
    this.switch = this.switchSoundAudioContext;
    this.change = this.changeSoundAudioContext;

    if (!this.audioContext) {
      this.audioContext = new(window.AudioContext || window.webkitAudioContext)();
    }
  }

  //we'll use the tag they hand us, or else the url as the tag if it's a single tag,
  //or the first url 
  this.getTagFromURL = function(url, tag) {
    if (tag != undefined) return tag;
    return lowLag.getSingleURL(url);
  }
  this.getSingleURL = function(urls) {
    if (typeof(urls) == "string") return urls;
    return urls[0];
  }
  //coerce to be an array
  this.getURLArray = function(urls) {
    if (typeof(urls) == "string") return [urls];
    return urls;
  }

  this.loadSoundAudioContext = function(urls, tag) {
    var url = lowLag.getSingleURL(urls);
    tag = lowLag.getTagFromURL(urls, tag);
    lowLag.msg('webkit/chrome audio loading ' + url + ' as tag ' + tag);
    var request = new XMLHttpRequest();
    request.open('GET', url, true);
    request.responseType = 'arraybuffer';

    // Decode asynchronously
    request.onload = function() {
      // if you want "successLoadAudioFile" to only be called one time, you could try just using Promises (the newer return value for decodeAudioData)
      // Ref: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/decodeAudioData

      //Older callback syntax:
      //baseAudioContext.decodeAudioData(ArrayBuffer, successCallback, errorCallback);
      //Newer promise-based syntax:
      //Promise<decodedData> baseAudioContext.decodeAudioData(ArrayBuffer);


      // ... however you might want to use a pollfil for browsers that support Promises, but does not yet support decodeAudioData returning a Promise.
      // Ref: https://github.com/mohayonao/promise-decode-audio-data
      // Ref: https://caniuse.com/#search=Promise

      // var retVal = lowLag.audioContext.decodeAudioData(request.response);

      // Note: "successLoadAudioFile" is called twice. Once for legacy syntax (success callback), and once for newer syntax (Promise)
      var retVal = lowLag.audioContext.decodeAudioData(request.response, successLoadAudioFile, errorLoadAudioFile);
      //Newer versions of audioContext return a promise, which could throw a DOMException
      if (retVal && typeof retVal.then == 'function') {
        retVal.then(successLoadAudioFile).catch(function(e) {
          errorLoadAudioFile(e);
          urls.shift(); //remove the first url from the array
          if (urls.length > 0) {
            lowLag.loadSoundAudioContext(urls, tag); //try the next url
          }
        });
      }
    };

    request.send();

    function successLoadAudioFile(buffer) {
      lowLag.audioBuffers[tag] = buffer;
      if (lowLag.audioContextPendingRequest[tag]) { //a request might have come in, try playing it now
        lowLag.playSoundAudioContext(tag);
      }
    }

    function errorLoadAudioFile(e) {
      lowLag.msg("Error loading webkit/chrome audio: " + e);
    }
  }

  this.playSoundAudioContext = function(tag) {
    var context = lowLag.audioContext;

    // if some audio is currently active and hasn't been switched, or you are explicitly asking to play audio that is already active... then see if it needs to be unpaused
    // ... if you've switch audio, or are explicitly asking to play new audio (that is not the currently active audio) then skip trying to unpause the audio
    if ((lowLag.currentPlayingTag && lowLag.currentTag && lowLag.currentPlayingTag === lowLag.currentTag) || (tag && lowLag.currentPlayingTag && lowLag.currentPlayingTag === tag)) {
      // find currently paused audio (suspended) and unpause it (resume)
      if (context !== undefined) {
        // ref: https://developer.mozilla.org/en-US/docs/Web/API/AudioContext/suspend
        if (context.state === 'suspended') {
          context.resume().then(function() {
            lowLag.msg("playSoundAudioContext resume " + lowLag.currentPlayingTag);
            return;
          }).catch(function(e) {
            lowLag.msg("playSoundAudioContext resume error for " + lowLag.currentPlayingTag + ". Error: " + e);
          });
          return;
        }
      }
    }
    
    if (tag === undefined) {
      tag = lowLag.currentTag;
    }

    if (lowLag.currentPlayingTag && lowLag.currentPlayingTag === tag) {
      // ignore request to play same sound a second time - it's already playing
      lowLag.msg("playSoundAudioContext already playing " + tag);
      return;
    } else {
      lowLag.msg("playSoundAudioContext " + tag);
    }

    var buffer = lowLag.audioBuffers[tag];
    if (buffer === undefined) { //possibly not loaded; put in a request to play onload
      lowLag.audioContextPendingRequest[tag] = true;
      lowLag.msg("playSoundAudioContext pending request " + tag);
      return;
    }

    // need to create a new AudioBufferSourceNode every time... 
    // you can't call start() on an AudioBufferSourceNode more than once. They're one-time-use only.
    var source;
    source = context.createBufferSource(); // creates a sound source
    source.buffer = buffer; // tell the source which sound to play
    source.connect(context.destination); // connect the source to the context's destination (the speakers)
    source.loop = true;
    lowLag.audioBufferSources[tag] = source;

    // find current playing audio and stop it
    var sourceOld = lowLag.currentPlayingTag ? lowLag.audioBufferSources[lowLag.currentPlayingTag] : undefined;
    if (sourceOld !== undefined) {
      if (typeof(sourceOld.noteOff) == "function") {
        sourceOld.noteOff(0);
      } else {
        sourceOld.stop();
      }
      lowLag.msg("playSoundAudioContext stopped " + lowLag.currentPlayingTag);
      lowLag.audioBufferSources[lowLag.currentPlayingTag] = undefined;
      lowLag.currentPlayingTag = undefined;
    }

    // play the new source audio
    if (typeof(source.noteOn) == "function") {
      source.noteOn(0);
    } else {
      source.start();
    }
    lowLag.currentTag = tag;
    lowLag.currentPlayingTag = tag;
    
    if (context.state === 'running') {
      lowLag.msg("playSoundAudioContext started " + tag);
    } else if (context.state === 'suspended') {
      /// if the audio context is in a suspended state then unpause (resume)
      context.resume().then(function() {
        lowLag.msg("playSoundAudioContext started and then resumed " + tag);
      }).catch(function(e) {
        lowLag.msg("playSoundAudioContext started and then had a resuming error for " + tag + ". Error: " + e);
      });
    } else if (context.state === 'closed') {
      // ignore request to pause sound - it's already closed
      lowLag.msg("playSoundAudioContext failed to start, context closed for " + tag);
    } else {
      lowLag.msg("playSoundAudioContext unknown AudioContext.state for " + tag + ". State: " + context.state);
    }
  }

  this.pauseSoundAudioContext = function() {
    // not passing in a "tag" parameter because we are playing all audio in one channel
    var tag = lowLag.currentPlayingTag;
    var context = lowLag.audioContext;

    if (tag === undefined) {
      // ignore request to pause sound as nothing is currently playing
      lowLag.msg("pauseSoundAudioContext nothing to pause");
      return;
    }

    // find currently playing (running) audio and pause it (suspend)
    if (context !== undefined) {
      // ref: https://developer.mozilla.org/en-US/docs/Web/API/AudioContext/suspend
      if (context.state === 'running') {
       lowLag.msg("pauseSoundAudioContext " + tag);
        context.suspend().then(function() {
          lowLag.msg("pauseSoundAudioContext suspended " + tag);
        }).catch(function(e) {
          lowLag.msg("pauseSoundAudioContext suspend error for " + tag + ". Error: " + e);
        });
      } else if (context.state === 'suspended') {
        // ignore request to pause sound - it's already suspended
        lowLag.msg("pauseSoundAudioContext already suspended " + tag);
      } else if (context.state === 'closed') {
        // ignore request to pause sound - it's already closed
        lowLag.msg("pauseSoundAudioContext already closed " + tag);
      } else {
        lowLag.msg("pauseSoundAudioContext unknown AudioContext.state for " + tag + ". State: " + context.state);
      }
    }
  }

  this.stopSoundAudioContext = function() {
    // not passing in a "tag" parameter because we are playing all audio in one channel
    var tag = lowLag.currentPlayingTag;

    if (tag === undefined) {
      // ignore request to stop sound as nothing is currently playing
      lowLag.msg("stopSoundAudioContext nothing to stop");
      return;
    } else {
      lowLag.msg("stopSoundAudioContext " + tag);
    }

    // find current playing audio and stop it
    var source = lowLag.audioBufferSources[tag];
    if (source !== undefined) {
      if (typeof(source.noteOff) == "function") {
        source.noteOff(0);
      } else {
        source.stop();
      }
      lowLag.msg("stopSoundAudioContext stopped " + tag);
      lowLag.audioBufferSources[tag] = undefined;
      lowLag.currentPlayingTag = undefined;
    }
  }

  this.switchSoundAudioContext = function(autoplay) {
    lowLag.msg("switchSoundAudioContext " + (autoplay ? 'and autoplay' : 'and do not autoplay'));

    if (lowLag.currentTag && lowLag.currentTag == 'audio1') {
      lowLag.currentTag = 'audio2';
    } else {
      lowLag.currentTag = 'audio1';
    }

    if (autoplay) {
      lowLag.playSoundAudioContext();
    }
  }

  this.changeSoundAudioContext = function(tag, autoplay) {
    lowLag.msg("changeSoundAudioContext to tag " + tag + " " + (autoplay ? 'and autoplay' : 'and do not autoplay'));

    if (tag === undefined) {
      lowLag.msg("changeSoundAudioContext tag is undefined");
      return;
    }
    
    lowLag.currentTag = tag;

    if (autoplay) {
      lowLag.playSoundAudioContext();
    }
  }

  this.msg = function(m) {
    m = "-- lowLag " + m;
    console.log(m);
  }
}
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.8.0/jquery.min.js"></script>
<script>
  // AudioContext
  $(document).ready(function() {
    lowLag.init();
    lowLag.load(['https://coubsecure-s.akamaihd.net/get/b86/p/coub/simple/cw_looped_audio/f0dab49f867/083bf409a75db824122cf/med_1550250381_med.mp3'], 'audio1');
    lowLag.load(['https://coubsecure-s.akamaihd.net/get/b173/p/coub/simple/cw_looped_audio/0d5adfff2ee/80432a356484068bb0e15/med_1550254045_med.mp3'], 'audio2');
    // starts with audio1
    lowLag.changeSoundAudioContext('audio1', false);
  });

  // ----------------

  // Audio Element
  $(document).ready(function() {
    var $audioElement = $('#audioElement');
    var audioEl = $audioElement[0];
    var audioSources = {
      "audio1": "https://coubsecure-s.akamaihd.net/get/b86/p/coub/simple/cw_looped_audio/f0dab49f867/083bf409a75db824122cf/med_1550250381_med.mp3",
      "audio2": "https://coubsecure-s.akamaihd.net/get/b173/p/coub/simple/cw_looped_audio/0d5adfff2ee/80432a356484068bb0e15/med_1550254045_med.mp3"
    };
    playAudioElement = function() {
      audioEl.play();
    }
    pauseAudioElement = function() {
      audioEl.pause();
    }
    stopAudioElement = function() {
      audioEl.pause();
      audioEl.currentTime = 0;
    }
    switchAudioElement = function(autoplay) {
      var source = $audioElement.attr('data-source');

      if (source && source == 'audio1') {
        $audioElement.attr('src', audioSources.audio2);
        $audioElement.attr('data-source', 'audio2');
      } else {
        $audioElement.attr('src', audioSources.audio1);
        $audioElement.attr('data-source', 'audio1');
      }

      if (autoplay) {
        audioEl.play();
      }
    }
    changeAudioElement = function(tag, autoplay) {
      var source = $audioElement.attr('data-source');
      
      if(tag === undefined || audioSources[tag] === undefined) {
       return;
      }

      $audioElement.attr('src', audioSources[tag]);
      $audioElement.attr('data-source', tag);

      if (autoplay) {
        audioEl.play();
      }
    }
    changeAudioElement('audio1', false); // starts with audio1
  });

</script>

<h1>
  AudioContext (<a href="https://developer.mozilla.org/en-US/docs/Web/API/AudioContext" target="blank">api</a>)
</h1>
<button onClick="lowLag.play();">Play</button>
<button onClick="lowLag.pause();">Pause</button>
<button onClick="lowLag.stop();">Stop</button>
<button onClick="lowLag.switch(true);">Switch</button>
<button onClick="lowLag.change('audio1', true);">Play 1</button>
<button onClick="lowLag.change('audio2', true);">Play 2</button>

<hr>

<h1>
  Audio Element (<a href="https://developer.mozilla.org/en-US/docs/Web/HTML/Element/audio" target="blank">api</a>)
</h1>
<audio id="audioElement" controls loop preload="auto" src="">
</audio>
<br>
<button onClick="playAudioElement();">Play</button>
<button onClick="pauseAudioElement();">Pause</button>
<button onClick="stopAudioElement();">Stop</button>
<button onClick="switchAudioElement(true);">Switch</button>
<button onClick="changeAudioElement('audio1', true);">Play 1</button>
<button onClick="changeAudioElement('audio2', true);">Play 2</button>



2
Apple decided (to save money) not to preload the <audio> and <video> HTML elements.
From the Safari Developer Library:
In Safari on iOS (for all devices, including iPad), where the user may be on a cellular network and charged per data unit, preload and autoplay are disabled. No data is loaded until the user initiates it. This means the JavaScript play() and load() methods are also inactive until the user initiates playback, unless the play() or load() method is triggered by user action. In other words, a user-initiated Play button works, but an onLoad="play()" event does not.
This plays the movie: <input type="button" value="Play" onClick="document.myMovie.play()"> This does nothing on iOS: <body onLoad="document.myMovie.play()"> I don't think you can get around this restriction, but you might find a way.
Remember: Google is your best friend.
Update: After some experimenting, I found a way to play <audio> with JavaScript:
var vid = document.createElement("iframe");
vid.setAttribute('src', "http://yoursite.com/yourvideooraudio.mp4"); // replace with actual source
vid.setAttribute('width', '1px');
vid.setAttribute('height', '1px');
vid.setAttribute('scrolling', 'no');
vid.style.border = "0px";
document.body.appendChild(vid);

Note: I've only tried it with the <audio> tag.


Update 2: Here's a jsFiddle. It seems to work.


4
The question isn't how to play the audio, it's why there's a delay. - j08691
1
Your answer just repeats what I already posted, plus a snippet for playing an audio file. Am I missing something new or different that you're actually contributing toward answering the question? - j08691
@j08691 I remember an article about Safari downloading the file every time and how to tell Safari to cache it. The solution was a meta tag, but I don't remember the site, and I've cleared my history :( Sorry. - Anish Gupta
Oh, sorry, I genuinely didn't see your answer; I didn't mean to copy it. - Anish Gupta

1

1
Same problem here. I tried preloading the file in different ways. In the end I wrapped my animation logic in the "playing" callback, so the logic should only run once the file has loaded and started playing - yet I could see the animation logic start while the audio playback lagged about 2 seconds behind. It baffled me: if the audio has already fired the "playing" callback, why is there a delay? AudioContext solved my problem. The simplest example I found is at https://developer.mozilla.org/en-US/docs/Web/API/Body/arrayBuffer - use getData to prepare your audio file; then you can play it with source.start(0);.
That link doesn't show how to obtain audioCtx; you can copy it from here: let audioCtx = new (window.AudioContext || window.webkitAudioContext)();

Could you post this as a question, referencing the parent question and mentioning all the versions you're using? - Sree.Bh

0
In my experience, most of these answers still had significant lag, even after changing audio formats and using base64. The only thing that fixed audio lag on mobile for me was replacing new Audio() calls with the Web Audio API.
Here's a quick drop-in function:
// Replace this
const audio = new Audio(url)
audio.play()

// With this
const audioPlay = async url => {
    const context = new AudioContext();
    const source = context.createBufferSource();
    const audioBuffer = await fetch(url)
      .then(res => res.arrayBuffer())
      .then(ArrayBuffer => context.decodeAudioData(ArrayBuffer));
  
    source.buffer = audioBuffer;
    source.connect(context.destination);
    source.start();
};
audioPlay(url)
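One caveat with the snippet above is that it creates a fresh AudioContext and re-fetches the file on every call. A variation (the names below are illustrative, not from the original) keeps one context and caches decoded buffers, so repeated plays of the same url skip the fetch/decode round trip; only the one-shot AudioBufferSourceNode is recreated:

```javascript
// Cache of decoded AudioBuffers, keyed by url.
const bufferCache = new Map();

async function getBuffer(context, url, fetchFn) {
  if (bufferCache.has(url)) return bufferCache.get(url);
  const res = await fetchFn(url);
  const raw = await res.arrayBuffer();
  const decoded = await context.decodeAudioData(raw);
  bufferCache.set(url, decoded);
  return decoded;
}

async function playCached(context, url, fetchFn) {
  const source = context.createBufferSource(); // sources are one-time use
  source.buffer = await getBuffer(context, url, fetchFn);
  source.connect(context.destination);
  source.start();
}
```

In a browser this might be called as `playCached(audioCtx, url, fetch)` with a single long-lived `audioCtx`.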

I didn't write this code; credit goes to whoever posted it on SO. Sorry, I can't find where I originally got it.

0

I'm having the same problem. Strangely, I do preload the files. Playback is fine on WiFi, but on mobile data there's a long delay before it starts. I thought it was related to loading speed, but I don't start playing the scene until all images and audio files have loaded. Any suggestions? (I know this isn't an answer, but I figured it was better than making a duplicate post.)


0

I would simply create the <audio autoplay /> DOM element on click; this works in all major browsers - no need to handle events and trigger playback manually.

If you want to react to audio state changes manually, I'd recommend listening for the play event rather than loadeddata - its behavior is more consistent across browsers.
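The approach above can be sketched as follows; the function name and arguments are illustrative, and the `play` listener covers the state-change suggestion:

```javascript
// Build an autoplaying <audio> element inside a click handler: creation is
// user-initiated, so mobile autoplay policies allow it to start.
function buildClickAudio(doc, src, onPlay) {
  var el = doc.createElement('audio');
  el.autoplay = true;
  el.src = src;
  if (onPlay) el.addEventListener('play', onPlay); // fires when playback starts
  doc.body.appendChild(el);
  return el;
}
```

With the question's markup this might be wired up as `$('.play').on('click', function () { buildClickAudio(document, 'blue.mp3', onStateChange); });` where `onStateChange` is whatever handler you need.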

