Timed Rhythms with Web Audio API and JavaScript

UPDATED ON: December 20, 2014

Timed Rhythms with Web Audio API and JavaScript is a tutorial that details how to create rhythms by triggering buffered audio at specific time intervals. It builds on what we’ve covered in the previous tutorials: Web Audio API Basics, Web Audio API Audio Buffer, Play a Sound with Web Audio API, and Web Audio API BufferLoader. Before you tackle this one, you should have a solid understanding of the Web Audio API, so please refer to those tutorials if you need a refresher. Once you’re up to speed, you should be able to follow along as I explain how to create and play timed rhythms with the Web Audio API and JavaScript.

Timed Rhythms with Web Audio API and JavaScript

With the help of the Web Audio API, musical rhythms can be programmed in JavaScript and executed on the fly in your browser. One of my favorite things about programming music is the moment of fruition when I finally hear what was, until that moment, only in my head. While it’s not the easiest or quickest way to create your backing tracks, it is rewarding in its own way. For this tutorial, we’ll create two different rhythms using the same three buffered audio files: simple kick, snare, and hi-hat drum sounds.

Initialization

First, we declare the variables context and bufferLoader in the global scope. Then we create a function named init that’s called when the page loads. This function defines context as a new AudioContext in compatible browsers; in all other browsers, the warning alert “Web Audio API is not supported in this browser” is displayed instead. Next, our three audio files are loaded into buffers using the BufferLoader class. Buffering the audio assets immediately on page load helps prevent any delay when the user actually attempts to play a rhythm.

The Functions of Rhythm

In order to play a timed rhythmic pattern, we need to create two functions. One will contain the programmed rhythm, and the other will be used to play each individual audio asset (or note) within the rhythm. They can be thought of as the Play Rhythm and Play Sound functions. The playSound function plays a specified buffered audio asset at a specified time. We’ll come back to this in a bit. The startPlayingRhythm1 function is where the real magic happens. This function is passed the contents of bufferList and assigns previously buffered audio assets to the variables kick, snare, and hihat.

Timing

Now we need to deal with the timing. The startTime variable is set to context.currentTime + 0.100. This starts the rhythm playing 100 milliseconds from the moment it is triggered, which gives the user just enough time to release the button that was clicked. The tempo (in BPM, or beats per minute) is assigned to the tempo variable. The quarterNoteTime variable is important and requires a bit of math: we take the number of seconds in a minute (60) and divide it by the tempo. This gives us the exact length of a quarter note in seconds, no matter what the tempo is set to.

// We'll start playing the rhythm 100 milliseconds from "now"
var startTime = context.currentTime + 0.100;
    
var tempo = 120; // BPM (beats per minute)
var quarterNoteTime = 60 / tempo;
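The same division gives us every other note length as well. Here’s a quick sketch (the eighthNoteTime and sixteenthNoteTime variables are my own additions, not part of the tutorial’s code):

```javascript
var tempo = 120;                             // BPM (beats per minute)
var quarterNoteTime = 60 / tempo;            // 0.5 seconds per beat at 120 BPM
var eighthNoteTime = quarterNoteTime / 2;    // 0.25 seconds
var sixteenthNoteTime = quarterNoteTime / 4; // 0.125 seconds
```

The hi-hat part later in this tutorial steps in sixteenths by multiplying quarterNoteTime by 0.25, which is exactly this sixteenthNoteTime value.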

Create a Rhythm

To create a rhythm, we call the playSound function once for each individual note, passing it an audio buffer and a time specification. For our purposes, the buffered audio is either kick, snare, or hihat. The time specification is an expression built from startTime and quarterNoteTime, so we can calculate precisely when each note should play. For example, to play the kick drum on every quarter note, we could use the following code.

playSound(kick, startTime);
playSound(kick, startTime + quarterNoteTime);
playSound(kick, startTime + 2*quarterNoteTime);
playSound(kick, startTime + 3*quarterNoteTime);

We could also achieve the same result with the following code.

for (var i = 0; i < 4; ++i) {
    playSound(kick, startTime + i*quarterNoteTime);
}
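Taking the loop idea one step further, a pattern can be described as a plain list of beat offsets and converted to absolute times in one place. The beatTimes helper below is my own addition (not part of the tutorial’s code), but it computes each start time with the same startTime + beat * quarterNoteTime expression used above:

```javascript
// Convert beat offsets (0 = beat 1, 1 = beat 2, 0.5 = the "and" of beat 1, ...)
// into absolute start times on the audio clock.
function beatTimes(startTime, quarterNoteTime, beats) {
    return beats.map(function (beat) {
        return startTime + beat * quarterNoteTime;
    });
}

// The four-on-the-floor kick pattern from above, as beat offsets:
var kickBeats = [0, 1, 2, 3];
var kickTimes = beatTimes(0.1, 0.5, kickBeats); // startTime 0.1 s, 120 BPM
// Each time would then be scheduled with:
// kickTimes.forEach(function (t) { playSound(kick, t); });
```

Describing a part as data like this makes longer patterns easier to read and edit than a column of playSound calls.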

Trigger Events

To trigger our rhythms, we use inline event binding on input elements with their type attribute set to button, one for each rhythm. When clicked, each button calls the appropriate function and passes it bufferLoader.bufferList.

<h3>Rhythm 1 (120 BPM)</h3>
<input type="button" value="Play Rhythm 1" onclick="startPlayingRhythm1(bufferLoader.bufferList);" />

<h3>Rhythm 2 (80 BPM)</h3>
<input type="button" value="Play Rhythm 2" onclick="startPlayingRhythm2(bufferLoader.bufferList);" />

A Working Example

The function for the second rhythm is similar to the first, but uses a different tempo and, of course, different time specifications for the individual sounds. Here’s the full code to create two different rhythms in JavaScript with the Web Audio API. Change the paths and filenames of the audio files to match yours and you’ve got it!

window.onload = init;

var context;
var bufferLoader;

function init() {
    try {
        // Older WebKit browsers expose the constructor as webkitAudioContext
        context = new (window.AudioContext || window.webkitAudioContext)();
    }
    catch(e) {
        alert("Web Audio API is not supported in this browser");
    }
    
    // Start loading the drum kit.
    bufferLoader = new BufferLoader(
        context,
        [
        "sounds/kick.wav",
        "sounds/snare.wav",
        "sounds/hihat.wav"
        ],
        bufferLoadCompleted  
    );

    bufferLoader.load();
}

// Plays a buffered audio asset at the specified time
function playSound(buffer, time) {
    // Buffer source nodes are single-use, so we create a new one per note
    var source = context.createBufferSource();
    source.buffer = buffer;
    source.connect(context.destination);
    source.start(time);
}

// Plays Rhythm 1
function startPlayingRhythm1(bufferList) {
    var kick = bufferList[0];
    var snare = bufferList[1];
    var hihat = bufferList[2];
    
    // We'll start playing the rhythm 100 milliseconds from "now"
    var startTime = context.currentTime + 0.100;
    
    var tempo = 120; // BPM (beats per minute)
    var quarterNoteTime = 60 / tempo;

    // Play the kick drum on beats 1, 2, 3, 4
    playSound(kick, startTime);
    playSound(kick, startTime + quarterNoteTime);
    playSound(kick, startTime + 2*quarterNoteTime);
    playSound(kick, startTime + 3*quarterNoteTime);

    // Play the snare drum on beats 2, 4
    playSound(snare, startTime + quarterNoteTime);
    playSound(snare, startTime + 3*quarterNoteTime);
    
    // Play the hi-hat every 16th note.
    for (var i = 0; i < 16; ++i) {
        playSound(hihat, startTime + i*0.25*quarterNoteTime);
    }
}

// Plays Rhythm 2
function startPlayingRhythm2(bufferList) {
    var kick = bufferList[0];
    var snare = bufferList[1];
    var hihat = bufferList[2];
    
    // We'll start playing the rhythm 100 milliseconds from "now"
    var startTime = context.currentTime + 0.100;
    
    var tempo = 80; // BPM (beats per minute)
    var quarterNoteTime = 60 / tempo;

    // Play the kick drum on beats 1, 1.5, 2.75, 3, and 3.5
    playSound(kick, startTime);
    playSound(kick, startTime + 0.5*quarterNoteTime);
    playSound(kick, startTime + 1.75*quarterNoteTime);
    playSound(kick, startTime + 2*quarterNoteTime);
    playSound(kick, startTime + 2.5*quarterNoteTime);
	
    // Play the snare drum on beats 2, 4, and 4.75
    playSound(snare, startTime + quarterNoteTime);
    playSound(snare, startTime + 3*quarterNoteTime);
    playSound(snare, startTime + 3.75*quarterNoteTime);
    
    // Play the hi-hat every 16th note...
    for (var i = 0; i < 16; ++i) {
        playSound(hihat, startTime + i*0.25*quarterNoteTime);
    }
    // ...plus one extra 32nd-note hit between beats 4 and 4.25
    playSound(hihat, startTime + 3.125*quarterNoteTime);
}

// Called when all three audio files have finished loading. Nothing to do here;
// the rhythm functions read bufferLoader.bufferList directly when a button is clicked.
function bufferLoadCompleted() {
}

Timed Rhythms with Web Audio API and JavaScript

I’ve shown you how to create two rhythms by triggering three buffered audio files at specific time intervals. Can you think of any ways to use this technique? Please share if you do. Be creative! I’d love to hear what you’ve done with it. How will you use rhythms programmed in JavaScript to make the web awesome?
