Can I play JS-generated audio in HTML?

Problem description

const context = new AudioContext();
let o = null,
    g = null;

function play(){
    o = context.createOscillator();
    g = context.createGain();
    o.type = "sine";
    o.connect(g);
    o.connect(context.destination);
    o.start();
}

function stop(){
    o.stop();
    // DO SOMETHING TO SAVE AUDIO IN AN HTML <AUDIO> TAG
}

When the play() function is called, a sine wave sound is played; the sound is then stopped by calling the stop() function. I would like to send this audio to an HTML <audio> tag. Is that possible?

Tags: javascript, html

Solution


After getting curious about how this could be done, I stumbled upon an MDN article that does this exact thing.

It uses the MediaRecorder interface together with a MediaStreamAudioDestinationNode. To record the sound wave created by the oscillator, you have to pipe the sound into a MediaStreamAudioDestinationNode, which turns it into a stream. That stream is then consumed by a MediaRecorder, which captures the data flowing into the node while the sound is playing. When playback stops, all the data that was sent is converted into a Blob. You can label the blob by setting its type property to the MIME type you want, for example audio/mp3. Keep in mind that this only labels the data; the actual encoding is chosen by the MediaRecorder and is exposed through its mimeType property.
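As a rough sketch of that wiring (assuming an existing AudioContext named ctx and an oscillator named osc; both names are illustrative, not from the original answer):

// Sketch only: route the audio graph into a MediaStream and record it.
const streamDestination = ctx.createMediaStreamDestination();
const recorder = new MediaRecorder(streamDestination.stream);

osc.connect(streamDestination); // the destination node exposes the audio as a stream
recorder.start();               // capture the stream while the sound plays
// Later, recorder.stop() fires a 'dataavailable' event carrying the encoded data.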

You can create a reference to this blob with a URL created by URL.createObjectURL(). That URL can then be used as the src of the <audio> tag. The audio element now has a source to play, namely the sound you recorded.
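In code, that step could look roughly like this (a sketch; chunks and recorder are assumed to exist as in the full example below):

// Sketch only: wrap the recorded chunks in a Blob and point the audio element at it.
const blob = new Blob(chunks, { type: recorder.mimeType });
const url = URL.createObjectURL(blob);
document.querySelector('audio').src = url;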

Below is an example, based on the code from the article, that records your sine wave and lets you replay it in an <audio> element. Note: whenever you re-record, the previous recording is lost.

// Select button and audio elements.
const button = document.querySelector('button');
const audio = document.querySelector('audio');

// Define global variables for the oscillator, gain and recorded source URL.
let oscillator = null;
let gain = null;
let source = null;

// Create context, stream destination and recorder.
const ctx = new AudioContext();
const mediaStreamDestination = ctx.createMediaStreamDestination();
const recorder = new MediaRecorder(mediaStreamDestination.stream);

// Store the chunks of audio data in an array.
let chunks = [];

// Release the previously stored blob URL and clear the chunks array.
// Otherwise, all recorded data would stay in memory until the page is closed.
recorder.addEventListener('start', function(event) {
  if (source !== null) {
    URL.revokeObjectURL(source);
  }
  chunks.length = 0;
});

// When all the sound has been recorded, store the recorded data
// in the chunks array. The chunks will later be converted into
// a workable file for the audio element.
recorder.addEventListener('dataavailable', function(event) {
  const { data } = event;
  chunks.push(data);
});

// Whenever the recorder has stopped recording, create a Blob
// out of the chunks you've recorded, then create an object URL
// for the Blob and assign that URL to the audio element's src property.
recorder.addEventListener('stop', function(event) {
  // Use the MIME type the recorder actually produced (e.g. audio/webm),
  // so the blob's type matches its contents.
  const blob = new Blob(chunks, { type: recorder.mimeType });
  source = URL.createObjectURL(blob);
  audio.src = source;
});

// Click on the button to start and stop the recording.
button.addEventListener('click', function(event) {
  if (recorder.state !== 'recording') {

    // Autoplay policies keep the AudioContext suspended until a user
    // gesture, so resume it inside this click handler.
    if (ctx.state === 'suspended') {
      ctx.resume();
    }

    // Create a new oscillator and gain.
    oscillator = ctx.createOscillator();
    gain = ctx.createGain();

    // Connect the oscillator to the gain, then the gain both to the
    // speakers and to the MediaStreamDestination feeding the recorder.
    oscillator.connect(gain);
    gain.connect(ctx.destination);
    gain.connect(mediaStreamDestination);
    
    // Start recording and playing.
    recorder.start();
    oscillator.start();
    event.target.textContent = 'Stop recording';
    
  } else {
  
    // Stop recording and playing.
    recorder.stop();
    oscillator.stop();
    event.target.textContent = 'Record sine wave';
    
  }
});
<button>Make sine wave</button>
<audio controls></audio>
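If you want a particular container or codec rather than the browser's default, you can check support with the static MediaRecorder.isTypeSupported() method and pass the chosen type to the MediaRecorder constructor. This is only a sketch; which types are supported varies per browser:

// Sketch only: prefer Opus in WebM when the browser can record it.
const preferred = 'audio/webm;codecs=opus';
const options = MediaRecorder.isTypeSupported(preferred) ? { mimeType: preferred } : {};
const recorderWithType = new MediaRecorder(mediaStreamDestination.stream, options);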

If you have any questions about the code above, or if I haven't explained something clearly, let me know.

