10 June 2010
General experience building HTML-based applications is suggested. For more details on getting started with this Quick Start, refer to Building the Quick Start sample applications with HTML.
Adobe AIR includes APIs for playing sound using JavaScript in an HTML-based application.
This article describes how to use sound in a sample HTML-based AIR application.
Note: The SoundInHTML application is an example application provided as is for instructional purposes.
The sample application includes an index.html file and a Morse.js file. The Morse.js file defines a morsePlayString() function, which plays a string as sound in International Morse code.
The application shows how you can use JavaScript to load an MP3 sound file. It also shows how you can use JavaScript to play dynamic sound, based on generated sound sample data.
The application calls the init() method in response to the onload event of the body object. This method instantiates an AIR Sound object. The Sound class includes a load() method, which loads an MP3 file.
function init()
{
    var urlReq = new air.URLRequest("sample.mp3"); // MP3 path assumed; not shown in the excerpt
    sampleMp3 = new air.Sound();
    sampleMp3.load(urlReq);
    sampleMp3.addEventListener(air.Event.COMPLETE, loaded);
}
Note that the function sets up a handler for the complete event. The Sound object dispatches the complete event when the MP3 file is completely loaded. The event handler then enables the Play MP3 button (the DOM element with the ID "button1"):
function loaded(event)
{
    button1 = document.getElementById("button1");
    button1.disabled = false;
}
The event listener for the onclick event of the button is the playSound1() function. The behavior of this function depends on the state of the button (whether its displayed text is Play MP3 or Stop). If the text is Play MP3, the function does three things: it toggles the displayed text of the button to Stop, plays the sound by calling the play() method of the Sound object, and sets up an event listener for when the sound finishes playing. Here is the code:
button1.value = "Stop";
soundChannel1 = sampleMp3.play();
soundChannel1.addEventListener(air.Event.SOUND_COMPLETE, stopSound1);
The result of the play() method is an AIR SoundChannel object. The code assigns this to the soundChannel1 variable. If the button is in Stop mode, the function does two things: it changes the label to Play MP3, and it stops the sound by calling the stop() method of the SoundChannel object:
button1.value = "Play MP3";
soundChannel1.stop();
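Assembled as one function, the playSound1() handler suggested by the two excerpts above might look like the following sketch (the button1, sampleMp3, soundChannel1, and stopSound1 names are assumed to be defined elsewhere in index.html, as in the sample):

```javascript
// Sketch of the complete playSound1() click handler, combining the
// Play MP3 and Stop branches described in the text. The referenced
// globals (button1, sampleMp3, soundChannel1, stopSound1) come from
// elsewhere in index.html.
function playSound1(event)
{
    if (button1.value == "Play MP3")
    {
        button1.value = "Stop";
        soundChannel1 = sampleMp3.play();
        soundChannel1.addEventListener(air.Event.SOUND_COMPLETE, stopSound1);
    }
    else
    {
        button1.value = "Play MP3";
        soundChannel1.stop();
    }
}
```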
The SoundChannel object dispatches the soundComplete event when the sound has completed playing. The event handler for the soundComplete event also toggles the button label:
function stopSound1(event)
{
    button1.value = "Play MP3";
}
The index.html file loads a Morse.js JavaScript file. The Morse.js file defines a morsePlayString() function and other code. The event handler for the onclick event of the Generate Morse Code button calls the morsePlayString() function. It passes the string from the codeText textarea object:
function playMorse(event)
{
    var codeText = document.getElementById("codeText");
    morsePlayString(codeText.value);
}
The morsePlayString() function takes a string as an argument and plays Morse code audio based on the string data. First, the function calls a stringToCode() method, which returns a string of Morse code dots, dashes, and spaces, based on the input string. For example, given an input string "hello", the method returns ".... . .-.. .-.. ---". It does this by looking up characters in an array, mcMap:
function stringToCode(string)
{
    var returnString = "";
    for (var i = 0; i < string.length; i++)
    {
        var char = string.charAt(i).toLowerCase();
        if (mcMap[char] != undefined)
        {
            returnString += mcMap[char] + " ";
        }
        else
        {
            returnString += " ";
        }
    }
    return returnString;
}
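The mcMap lookup table itself is defined in Morse.js and is not shown in the excerpts. A hypothetical minimal version, covering only the letters needed for the "hello" example, might look like this:

```javascript
// Hypothetical minimal mcMap table (the real Morse.js table covers the
// full alphabet); each key is a lowercase character, each value its
// Morse code representation.
var mcMap = {
    e: ".",
    h: "....",
    l: ".-..",
    o: "---"
};

function stringToCode(string)
{
    var returnString = "";
    for (var i = 0; i < string.length; i++)
    {
        var char = string.charAt(i).toLowerCase();
        if (mcMap[char] != undefined)
        {
            returnString += mcMap[char] + " ";
        }
        else
        {
            returnString += " ";
        }
    }
    return returnString;
}
```

Note that because each letter's code is appended with a trailing space, stringToCode("hello") actually returns ".... . .-.. .-.. --- " with a final space.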
The morsePlayString() function assembles a byte array of sound sample data based on the code string. An AIR ByteArray object can represent sound data as an array of floating-point values. Each floating-point value represents the amplitude of the left or right channel (alternating) for the sound, and there are 44,100 samples per second. The Morse.js file defines three ByteArray objects, dotBytes, dashBytes, and silenceBytes, upon start-up:
var dotBytes = sineWaveGenerator(1);
var dashBytes = sineWaveGenerator(3);
var silenceBytes = silenceGenerator(2);
The sineWaveGenerator() function returns a ByteArray object whose length depends on the length parameter passed to it:
function sineWaveGenerator(length)
{
    var bytes = new air.ByteArray();
    for (var i = 0; i < length * 2000; i++)
    {
        var value = Math.sin(i / 8) * 0.5;
        bytes.writeFloat(value); // left channel
        bytes.writeFloat(value); // right channel
    }
    bytes.writeBytes(silenceGenerator(1));
    return bytes;
}
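Outside the AIR runtime, air.ByteArray is not available, but the same waveform logic can be sketched with a plain JavaScript array (one element per float). Since Math.sin(i / 8) advances 1/8 radian per sample, at 44,100 samples per second the tone works out to roughly 44100 / (16π), about 877 Hz. The trailing silence that sineWaveGenerator() appends is left out of this sketch for brevity:

```javascript
// Pure-JavaScript sketch of the sine-wave generator, using an ordinary
// array in place of air.ByteArray. Each stereo sample is written as two
// floats: one for the left channel, one for the right.
function sineWaveSamples(length)
{
    var samples = [];
    for (var i = 0; i < length * 2000; i++)
    {
        var value = Math.sin(i / 8) * 0.5; // half amplitude, ~877 Hz at 44.1 kHz
        samples.push(value); // left channel
        samples.push(value); // right channel
    }
    return samples;
}
```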
The returned ByteArray object contains audio sample data for a sine wave.
Similarly, the silenceGenerator() function returns a ByteArray object whose length depends on the length parameter passed to it:
function silenceGenerator(length)
{
    var bytes = new air.ByteArray();
    for (var i = 0; i < length * 2000; i++)
    {
        bytes.writeFloat(0);
        bytes.writeFloat(0);
    }
    return bytes;
}
In this case, the ByteArray object contains audio data for silence.
Returning to the morsePlayString() function: it calls the codeStringToBytes() function, which populates a soundBytes ByteArray object with audio sample data based on the code string:
function codeStringToBytes(string)
{
    for (var i = 0; i < string.length; i++)
    {
        var char = string.charAt(i);
        switch (char)
        {
            case ".":
                soundBytes.writeBytes(dotBytes);
                break;
            case "-":
                soundBytes.writeBytes(dashBytes);
                break;
            default:
                soundBytes.writeBytes(silenceBytes);
        }
    }
    soundBytes.position = 0;
}
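The size of the assembled buffer follows directly from the generator code: a dot is 2,000 tone samples plus 2,000 samples of appended silence, a dash is 6,000 plus 2,000, and a space is 4,000 samples of silence, all stereo (two floats per sample). A plain-array sketch makes those counts easy to verify:

```javascript
// Plain-array sketch of codeStringToBytes(). symbolSamples() stands in
// for the tone/silence generators; sample counts per symbol follow from
// the generator code above.
function symbolSamples(toneSamples, silenceSamples)
{
    var samples = [];
    for (var i = 0; i < toneSamples; i++)
    {
        var value = Math.sin(i / 8) * 0.5;
        samples.push(value, value); // left, right
    }
    for (var j = 0; j < silenceSamples; j++)
    {
        samples.push(0, 0);
    }
    return samples;
}

var dotSamples = symbolSamples(2000, 2000);  // 8,000 floats
var dashSamples = symbolSamples(6000, 2000); // 16,000 floats
var spaceSamples = symbolSamples(0, 4000);   // 8,000 floats

function codeStringToSamples(string)
{
    var soundSamples = [];
    for (var i = 0; i < string.length; i++)
    {
        switch (string.charAt(i))
        {
            case ".":
                soundSamples = soundSamples.concat(dotSamples);
                break;
            case "-":
                soundSamples = soundSamples.concat(dashSamples);
                break;
            default:
                soundSamples = soundSamples.concat(spaceSamples);
        }
    }
    return soundSamples;
}
```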
The morsePlayString() method then instantiates a Sound object and calls its play() method. Since there is no loaded MP3 data, the Sound object periodically dispatches a sampleData event when it needs sound sample data:
codeSound = new air.Sound();
codeSound.addEventListener(air.SampleDataEvent.SAMPLE_DATA, addSoundBytesToSound);
codeSound.play();
The addSoundBytesToSound() function is the handler for the sampleData event. The sampleData event has a data property, which stores the ByteArray data passed to the Sound object. The function writes up to 8,192 stereo samples (8 bytes per sample, since each sample is two floating-point values) from the Morse code sound data to the data property of the event, or however many samples remain:
function addSoundBytesToSound(event)
{
    sampleBytes.clear();
    var numBytes = Math.min(soundBytes.bytesAvailable, 8 * 8192);
    soundBytes.readBytes(sampleBytes, 0, numBytes);
    event.data.writeBytes(sampleBytes, 0, numBytes);
    air.trace(numBytes);
}
The Sound object then uses that sample data to play the Morse code sound. And it continues dispatching sampleData events (and calling the addSoundBytesToSound() event handler) until there is no more data.
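This draining behavior can be sketched without the AIR runtime: each sampleData event consumes at most 8,192 stereo samples (16,384 floats, or 8 × 8,192 bytes) from the buffer until it is empty. A hypothetical helper counts how many events a buffer of a given size produces:

```javascript
// Sketch of how the sampleData handler drains the sample buffer: each
// simulated event takes at most 8,192 stereo samples (16,384 floats).
function countSampleDataEvents(totalFloats)
{
    var floatsPerEvent = 8192 * 2; // 8,192 stereo samples, 2 floats each
    var remaining = totalFloats;
    var events = 0;
    while (remaining > 0)
    {
        var chunk = Math.min(remaining, floatsPerEvent);
        remaining -= chunk;
        events++;
    }
    return events;
}
```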
For more information on working with sound in HTML-based AIR applications, see the "Working with Sound" chapter of Developing Adobe AIR Applications with HTML and Ajax. Also, see the ID3Info, Sound, SoundChannel, SoundLoaderContext, SoundMixer, and SoundTransform classes in the Adobe AIR Language Reference for HTML Developers.