🎵 How to record audio using the Web Audio API in JavaScript

Demo video: How to record audio using the Web Audio API in JavaScript

About this article

This article describes how to record audio using the Web Audio API in JavaScript. The related resources are shown below.

Workflow

The workflow is shown below.

  1. Coding preparation
  2. Coding
  3. Operation check

Coding preparation

Run the following commands to prepare for coding.

mkdir javascript-audio-recorder
cd javascript-audio-recorder
npm init -y
npm install --save-dev http-server
touch audio-recorder.js encode-audio.js index.html main.js

Coding

index.html

Open index.html in your editor and enter the following content.

Click to go to index.html

audio-recorder.js

Open audio-recorder.js in your editor and enter the following content.

Click to go to audio-recorder.js

The points are shown below.

  1. Define the isRecording parameter.
  2. Store the samples in the buffer while the isRecording parameter is 1 (a per-sample variant is sketched after this list).
  3. Send a message when the number of elements in the buffer is 1 or more.
  4. Register the Audio Recorder as an Audio Worklet.
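
A note on point 2: the parameter array passed to process() has length 1 when the isRecording value is constant over the whole 128-sample render quantum, and one value per sample when it changes within the quantum. Reading only the first element is enough for this article; a per-sample variant might look like the following sketch (the class name and registration name are illustrative, not part of the gist).

// Sketch only: a variant of AudioRecorder that reads isRecording per sample.
// parameters.isRecording has length 1 when the value is constant over the
// 128-sample render quantum, or 128 when it changes within the quantum.
class PerSampleAudioRecorder extends AudioWorkletProcessor {
  static get parameterDescriptors () {
    return [{ name: 'isRecording', defaultValue: 0, minValue: 0, maxValue: 1 }]
  }

  process (inputs, outputs, parameters) {
    const isRecording = parameters.isRecording
    const input = inputs[0][0] // first input, first channel
    if (!input) {
      return true // no input connected yet
    }
    const buffer = []
    for (let t = 0; t < input.length; t += 1) {
      // Fall back to the single value when the parameter is constant.
      const value = isRecording.length > 1 ? isRecording[t] : isRecording[0]
      if (value === 1) {
        buffer.push(input[t])
      }
    }
    if (buffer.length >= 1) {
      this.port.postMessage({ buffer })
    }
    return true
  }
}

registerProcessor('per-sample-audio-recorder', PerSampleAudioRecorder)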

encode-audio.js

Open encode-audio.js in your editor and enter the following content.

Click to go to encode-audio.js
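
The encodeAudio function builds a 44-byte WAVE (RIFF) header and appends the recorded samples as 16-bit PCM. A minimal usage sketch is shown below; the literal settings values are illustrative, since in main.js they come from track.getSettings().

// Usage sketch (illustrative values; in main.js the settings come from track.getSettings()).
const settings = { sampleRate: 48000, sampleSize: 16 }
const buffers = [new Float32Array([0, 0.5, -0.5])] // chunks received from the worklet
const blob = encodeAudio(buffers, settings)
const url = URL.createObjectURL(blob) // can be assigned to the <audio> element's src attribute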

main.js

Open main.js in your editor and enter the following content.

Click to go to main.js

The points are shown below.

  1. Get the media stream to capture the audio from the microphone.
  2. Get the settings such as resolution and sampling frequency.
  3. Load the module from audio-recorder.js.
  4. Create a node to handle the media stream.
  5. Create a node to use the worklet.
  6. Set the event handler when a message is received from the worklet.
  7. Start receiving messages from the worklet.
  8. Connect the media stream, the worklet, and the speakers. Since the worklet does not write to its output, no sound is played from the speakers (see the sketch after this list).
  9. Set the worklet's isRecording parameter to 1 to start recording.
  10. Set the worklet's isRecording parameter to 0 to stop recording.
  11. Call the encodeAudio function to convert the recorded audio to WAVE format.
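
Regarding point 8, the wiring is microphone → worklet → speakers. Connecting the worklet to the destination keeps it inside the graph that the destination pulls from on every render quantum, and because the processor never writes to its outputs, only silence reaches the speakers. In isolation, using the variable names from main.js:

// Wiring sketch using the names from main.js:
// microphone (mediaStreamSource) -> recorder worklet -> speakers (destination).
// The processor leaves its outputs untouched, so the destination receives silence.
mediaStreamSource.connect(audioRecorder)
audioRecorder.connect(audioContext.destination)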

Operation check

Opening index.html directly in your web browser (from the file system) will cause the call to audioContext.audioWorklet.addModule to fail, so run the following command to start a local web server.

npx http-server -c-1

Go to http://localhost:8080 in your web browser.

You will be asked to allow access to the microphone, so click the "Allow" button.

Click the "Start" button to start recording.

Click the "Stop" button to stop recording.

Click the play button to play the audio.

Conclusion

There is also a way to record audio using the MediaStream Recording API, which is easier. For details on how to use the MediaStream Recording API, see How to record audio using the MediaStream Recording API with JavaScript. I would appreciate it if you could take a look. Thank you for reading!
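
For comparison, a minimal sketch of the MediaStream Recording API approach is shown below; the element ID is illustrative, and the browser picks the output container (for example audio/webm) instead of WAVE.

// Minimal MediaStream Recording API sketch (for comparison; not part of this gist).
// Must run in an async context, e.g. inside an async function or a module.
const stream = await navigator.mediaDevices.getUserMedia({ audio: true })
const recorder = new MediaRecorder(stream)
const chunks = []
recorder.addEventListener('dataavailable', event => chunks.push(event.data))
recorder.addEventListener('stop', () => {
  const blob = new Blob(chunks, { type: recorder.mimeType })
  document.querySelector('#audio').src = URL.createObjectURL(blob)
})
recorder.start()
// Later, for example from a Stop button handler:
// recorder.stop()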

License

MIT

.gitignore

/node_modules/
/package-lock.json
# package-lock.json is ignored only because this is a gist; in a normal project it should be committed.

audio-recorder.js

class AudioRecorder extends AudioWorkletProcessor {
  static get parameterDescriptors () { // <1>
    return [
      {
        name: 'isRecording',
        defaultValue: 0,
        minValue: 0,
        maxValue: 1,
      },
    ]
  }

  process (inputs, outputs, parameters) {
    const buffer = []
    const channel = 0
    for (let t = 0; t < inputs[0][channel].length; t += 1) {
      if (parameters.isRecording[0] === 1) { // <2>
        buffer.push(inputs[0][channel][t])
      }
    }
    if (buffer.length >= 1) {
      this.port.postMessage({buffer}) // <3>
    }
    return true
  }
}

registerProcessor('audio-recorder', AudioRecorder) // <4>

encode-audio.js

function encodeAudio (buffers, settings) {
  const sampleCount = buffers.reduce((memo, buffer) => {
    return memo + buffer.length
  }, 0)
  const bytesPerSample = settings.sampleSize / 8 // assumes 16-bit samples, matching the setInt16 writes below
  const bitsPerByte = 8
  const dataLength = sampleCount * bytesPerSample
  const sampleRate = settings.sampleRate
  const arrayBuffer = new ArrayBuffer(44 + dataLength)
  const dataView = new DataView(arrayBuffer)
  // RIFF chunk descriptor: "RIFF", overall size, "WAVE"
  dataView.setUint8(0, 'R'.charCodeAt(0))
  dataView.setUint8(1, 'I'.charCodeAt(0))
  dataView.setUint8(2, 'F'.charCodeAt(0))
  dataView.setUint8(3, 'F'.charCodeAt(0))
  dataView.setUint32(4, 36 + dataLength, true)
  dataView.setUint8(8, 'W'.charCodeAt(0))
  dataView.setUint8(9, 'A'.charCodeAt(0))
  dataView.setUint8(10, 'V'.charCodeAt(0))
  dataView.setUint8(11, 'E'.charCodeAt(0))
  // "fmt " subchunk: PCM format, mono, sample rate, byte rate, block align, bits per sample
  dataView.setUint8(12, 'f'.charCodeAt(0))
  dataView.setUint8(13, 'm'.charCodeAt(0))
  dataView.setUint8(14, 't'.charCodeAt(0))
  dataView.setUint8(15, ' '.charCodeAt(0))
  dataView.setUint32(16, 16, true)
  dataView.setUint16(20, 1, true)
  dataView.setUint16(22, 1, true)
  dataView.setUint32(24, sampleRate, true)
  dataView.setUint32(28, sampleRate * 2, true) // byte rate (mono, 2 bytes per sample)
  dataView.setUint16(32, bytesPerSample, true)
  dataView.setUint16(34, bitsPerByte * bytesPerSample, true)
  // "data" subchunk header followed by the samples
  dataView.setUint8(36, 'd'.charCodeAt(0))
  dataView.setUint8(37, 'a'.charCodeAt(0))
  dataView.setUint8(38, 't'.charCodeAt(0))
  dataView.setUint8(39, 'a'.charCodeAt(0))
  dataView.setUint32(40, dataLength, true)
  // Convert each float sample in [-1, 1] to a signed 16-bit integer
  let index = 44
  for (const buffer of buffers) {
    for (const value of buffer) {
      dataView.setInt16(index, value * 0x7fff, true)
      index += 2
    }
  }
  return new Blob([dataView], {type: 'audio/wav'})
}

index.html

<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>How to record audio using Web Audio API in JavaScript</title>
  </head>
  <body>
    <h1>How to record audio using Web Audio API in JavaScript</h1>
    <div>
      <button type="button" id="buttonStart">Start</button>
      <button type="button" id="buttonStop" disabled>Stop</button>
    </div>
    <div>
      <audio controls id="audio"></audio>
    </div>
    <script src="encode-audio.js"></script>
    <script src="main.js"></script>
  </body>
</html>

LICENSE

MIT License
Copyright (c) 2024 Tatsuya Sususkida
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

main.js

async function main () {
  try {
    const buttonStart = document.querySelector('#buttonStart')
    const buttonStop = document.querySelector('#buttonStop')
    const audio = document.querySelector('#audio')
    const stream = await navigator.mediaDevices.getUserMedia({ // <1>
      video: false,
      audio: true,
    })
    const [track] = stream.getAudioTracks()
    const settings = track.getSettings() // <2>
    const audioContext = new AudioContext()
    await audioContext.audioWorklet.addModule('audio-recorder.js') // <3>
    const mediaStreamSource = audioContext.createMediaStreamSource(stream) // <4>
    const audioRecorder = new AudioWorkletNode(audioContext, 'audio-recorder') // <5>
    const buffers = []
    audioRecorder.port.addEventListener('message', event => { // <6>
      buffers.push(event.data.buffer)
    })
    audioRecorder.port.start() // <7>
    mediaStreamSource.connect(audioRecorder) // <8>
    audioRecorder.connect(audioContext.destination)
    buttonStart.addEventListener('click', event => {
      buttonStart.setAttribute('disabled', 'disabled')
      buttonStop.removeAttribute('disabled')
      const parameter = audioRecorder.parameters.get('isRecording')
      parameter.setValueAtTime(1, audioContext.currentTime) // <9>
      buffers.splice(0, buffers.length)
    })
    buttonStop.addEventListener('click', event => {
      buttonStop.setAttribute('disabled', 'disabled')
      buttonStart.removeAttribute('disabled')
      const parameter = audioRecorder.parameters.get('isRecording')
      parameter.setValueAtTime(0, audioContext.currentTime) // <10>
      const blob = encodeAudio(buffers, settings) // <11>
      const url = URL.createObjectURL(blob)
      audio.src = url
    })
  } catch (err) {
    console.error(err)
  }
}

main()

package.json

{
  "name": "javascript-media-audio",
  "version": "1.0.0",
  "description": "",
  "main": "main.js",
  "scripts": {
    "dev": "http-server -c-1"
  },
  "keywords": [],
  "author": "",
  "license": "MIT",
  "devDependencies": {
    "http-server": "^14.1.0"
  }
}