10 August 2009
This article covers beginning to intermediate concepts in working with audio for the web. General knowledge of the Flash authoring interface and ActionScript fundamentals is advised.
This article gives you a solid understanding of working with audio in Adobe Soundbooth CS4 and Adobe Flash CS4 Professional, along with tutorials and sample files to get you up and running. You'll also learn the basic techniques for working with audio in ActionScript 3.
Adobe Flash has a long history of successfully deploying audio on the web. Flash 4 introduced support for the MP3 format, which opened up the door for using larger files with better performance. Flash 5 introduced the Sound object in ActionScript and the ability to control sound dynamically in runtime-based applications. Flash MX introduced the FLV format and MP3 metadata support for expanded options in synchronization and data management. Flash CS3 added new levels of audio support with ActionScript 3, capable of displaying sound spectrums and performing enhanced error handling.
Flash CS4 now adds support for the ASND format, which allows you to take a snapshot of the original audio so you can revert edits all the way back to the starting point. In addition, Flash CS4 FLA files (and FLV files) created with Adobe Media Encoder CS4 can now contain XMP metadata, which describes the audio related to the file along with other file information.
When you're developing audio for SWF applications, you need to be aware of the capabilities and requirements of audio in Adobe Flash Player. This section covers everything you need to know to get started.
The general workflow for audio production in Flash CS4 is the same as with other non-vector media. In most cases the audio asset is acquired and edited outside the application. When the audio file is prepared, it is then imported into the FLA file or loaded at runtime using ActionScript. This is the first big decision to be made in the process: do you embed the audio, or use audio that is external to the Flash movie?
Using embedded audio allows you to import a range of audio formats and has the benefit of visual authoring in the Adobe Flash interface (you don't need any code to implement it). Embedded audio also has the advantage of visual synchronization with graphic content. The disadvantages are a larger SWF file size and less flexibility for changes and runtime manipulation.
Using external audio is generally the way to go for more complex projects. External audio remains flexible for edits and dynamic, playlist-driven content. It also keeps the audio's file size out of the SWF file. The primary disadvantage is that it requires some ActionScript knowledge to implement. That said, some tasks, like voice synchronization with a character animation, work best if the audio is embedded directly on the Timeline.
The following list is divided into the four facets of production that should be considered: file preparation, working with embedded audio, working with external audio, and editing your audio.
General steps in the preparation workflow include the following:
General steps in the embedded audio workflow include the following:
General steps in the external audio workflow include the following:
General steps in the editing workflow include the following:
Before I go further into production techniques, let's take a look at the types of audio formats that Flash Player supports.
It is important to understand that Flash Player is designed to play audio in a few specific formats. By itself, Flash Player cannot record audio streams. However, the player does have access to the Microphone object on the end viewer's computer and can be combined with the Flash Media Server to record and store sound files on a server or stream the sound to other SWF application instances. The Flash Media Server greatly expands the possibilities of what can be done with Flash audio, but Flash Player by itself works well for streaming sounds for playback.
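As a minimal sketch of that microphone access in ActionScript 3 (the device name printed will vary per system), you can request the default capture device like this:

```actionscript
import flash.media.Microphone;

// Request the default capture device; returns null if no microphone
// is available.
var mic:Microphone = Microphone.getMicrophone();
if (mic != null)
{
    mic.setUseEchoSuppression(true);
    trace("Capturing from: " + mic.name);
}
```

On its own this only attaches to the device; routing the captured stream to a server still requires the Flash Media Server, as noted above.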
The Flash Player supports 8-bit or 16-bit audio at sample rates of 5.512, 11.025, 22.05, or 44.1 kHz. The audio can be converted to a lower sample rate during publishing, but it is recommended that you resample and edit audio files in a dedicated audio editing application outside of Flash CS4. If you want to add effects to sounds in Flash, it's best to use 16-bit sounds. If you have limited RAM on your system, keep your sounds short and use 8-bit audio.
Tip: Your sound needs to be recorded in a format sampled at 44.1 kHz, or an even factor of 44.1, to assure proper playback in the SWF file. Working in an audio production tool built to produce "Flash-friendly" audio can save a lot of headaches dealing with subtle details. Soundbooth CS4 is a solution designed specifically for Adobe Flash audio production and integration with other Adobe tools.
Flash supports a range of audio formats for importing and embedding sounds in a SWF file. You'll need to have QuickTime 4 or higher installed on your computer to take full advantage of the supported formats during authoring (see Table 1).
Table 1. Supported audio formats when importing audio into Flash
| Format | Type | Platform | QuickTime 4 |
|---|---|---|---|
| AIFF | Audio Interchange File Format | Win/Mac | N/A |
| ASND | Adobe Soundbooth | Win/Mac | N/A |
| AU | Sun File Format | Win/Mac | Required |
| MOV | Sound Only QuickTime Movies | Win/Mac | Required |
| MP3 | MPEG Layer 3 | Win/Mac | N/A |
| SD2 | Sound Designer 2 | Mac | Required |
| WAV | Waveform Audio Format | Win/Mac | N/A |
Note: The ASND format is a nondestructive audio file format native to Adobe Soundbooth. ASND files can contain audio data with effects, sound scores, and multitrack compositions that can be modified later.
When you embed the audio, Flash CS4 bundles the sound with the SWF file. In most cases the embedded sound will be compressed along with the rest of the assets in the file during publishing. So in the case of embedded audio, you also have to think about the exported audio format as well (see Table 2).
Table 2. Supported audio formats when exporting embedded audio in a SWF
| Format | Type | Platform | QuickTime 4 |
|---|---|---|---|
| ADPCM | Adaptive differential pulse code modulation compression | Win/Mac | N/A |
| MP3 | MPEG Layer 3 compression | Win/Mac | N/A |
| Raw | Uncompressed | Win/Mac | N/A |
| Speech | ASAO compression | Win/Mac | N/A |
Notice that while QuickTime is required for importing the full range of supported audio formats, it is not needed to export them or to play the published movie. Flash Player handles the playback of the four export formats. The default and most commonly used audio format is MP3. Also notice that all export formats are supported on both Mac and Windows, regardless of whether a Mac was required during the authoring process.
The export audio format can be set globally in the Publish Settings dialog box or set per sound file. To adjust audio settings globally, edit the event and streaming fields in the Publish Settings (File > Publish Settings). To adjust audio settings per sound, right-click the sound in the Library to launch the Sound Settings dialog box (see Figure 1).
When you work with embedded audio that is attached to a timeline, you have to decide whether to handle each sound as "event" audio or "streaming" audio in Flash Player. This is a concept that specifies how the audio relates to its timeline (or not). Streaming audio signals Flash Player to synchronize the audio to the timeline to which it is attached, and to start playing the sound as it downloads. When you stream audio, you attach the sound to a timeline and the audio playback is directly synched to the length of that timeline. This approach is commonly used for synchronization with animated content and for streaming playback of larger content files.
Event audio signals Flash Player to handle the sound's playback without regard to its timeline. If its timeline contains a limited number of frames, such as a button, the sound can play from start to finish. Event audio has to download completely before it can play, and therefore is most commonly used for short sounds, such as button clicks.
Flash Player supports playback of external audio in MP3 and FLV format. The MP3 format has been a mainstay with Flash developers since the Flash 4 era whereas the FLV format became an option in Flash MX (6) when the Flash Media Server implemented the format for Flash video and audio streaming.
Tip: Flash Player 9.0.115 and later supports the HE-AAC audio codec along with H.264 video.
The MP3 format is commonly used because it is familiar to developers from other areas of web production. MP3 formatted files are relatively easy to produce and easy to share with other web-based applications. While this may be the case, there are some advantages to working with the FLV format. FLV formatted files can hold metadata, such as cue points for synchronization. You can also manipulate FLV audio files using the FLVPlayback component, which allows you to load the sound and create a playback interface with little to no ActionScript knowledge. As audio editing tools such as Soundbooth now export source audio to FLV format, FLV has become a viable option for developers working outside of the Flash Media Server environment.
Table 3. Supported audio formats when playing external audio in Flash
| Format | Type | Platform | QuickTime 4 |
|---|---|---|---|
| FLV | Flash Video | Win/Mac | N/A |
| MP3 | MPEG Layer 3 | Win/Mac | N/A |
Flash Player 6 and later supports metadata for both the MP3 and FLV formats. MP3 files support the ID3 v1.0 and v1.1 standard. FLV files support FLV video metadata parameters, including cue points for content synchronization and custom parameter entries. In both cases the metadata can be retrieved at runtime using event handler functions in ActionScript.
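As a sketch of retrieving MP3 metadata at runtime in ActionScript 3 (the file path here is hypothetical), you can listen for the Event.ID3 event on a Sound object:

```actionscript
import flash.media.Sound;
import flash.media.ID3Info;
import flash.net.URLRequest;
import flash.events.Event;

var snd:Sound = new Sound();
snd.addEventListener(Event.ID3, id3Handler);
snd.load(new URLRequest("audio/song1.mp3")); // hypothetical path

// Fires once the ID3 tags have been parsed from the MP3 stream
function id3Handler(event:Event):void
{
    var id3:ID3Info = snd.id3;
    trace("Artist: " + id3.artist + ", Title: " + id3.songName);
}
```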
Tip: Flash CS4 FLA files and FLV files created with Adobe Media Encoder CS4 can also contain XMP metadata, which describes the audio related to the file along with other file information. XMP metadata conforms to W3C standards and can be used by web-based search engines to return meaningful search results about the SWF file and its internal content.
See Working with sound metadata in the Programming ActionScript 3 for Flash online documentation for more information on MP3 metadata. Also check out Using cue points and metadata to learn more about working with FLV metadata.
Flash Lite is a runtime engine used to display SWF files on consumer electronics and mobile devices. This article focuses on implementing audio for playback in Flash Player; however it's interesting to note that Flash Lite supports the playback of device sounds such as MIDI and SMAF, among others. See Import and play a device sound in the Developing Flash Lite 2.x and 3.0 Applications online documentation for more information on working with Flash Lite and deploying audio in mobile devices.
Soundbooth CS4 is an audio editing tool that integrates with Flash CS4 and other software in Adobe Creative Suite 4. Soundbooth provides a range of editing and composition tools in a simple workflow that anyone can use. New to this version are the editable ASND format and the ability to create multitrack compositions, along with a handful of other upgrades.
This section focuses on the basic tasks you'll commonly work through when preparing audio for Flash in Soundbooth. For full documentation, see the Soundbooth CS4 online documentation.
One of the most accessible and least expensive ways to obtain audio files is to record the sound yourself. The Soundbooth environment takes this into account and provides the means to record and clean up the artifacts commonly incurred when recording audio outside of a professional studio.
To prepare Soundbooth for audio recording, follow these steps:
To record audio, follow these steps:
For more best practices on recording techniques, see the following resources:
One of the unique features of Soundbooth is the ability to generate soundtracks by combining audio and video clips with prebuilt audio compositions called scores. Basically, you can create a soundtrack that adds your content on top of professional compositions. You can also use the score without additional content as a quick way to add background audio to your movie. Each score template provides a range of editable parameters that allow you to customize and export the score as a new audio file. You may also save customized score templates for reuse later.
For the purposes of this tutorial, I'll focus on the simple audio clip that you recorded in the previous section. To find more information on working with scores, see Customizing scores in the Using Soundbooth CS4 online documentation.
Tip: Soundbooth ships with two default scores available for immediate use. You can acquire more templates by launching the Resource Central panel from the Windows menu.
In addition to recording your own audio and creating your own scores, you can open prerecorded audio and video sources to work with as well. The process is fairly easy. You open a file and edit it using the range of features in Soundbooth. Or you open a handful of files and copy and paste portions of them to collage together a multitrack sound file. You may also open files to use as reference clips while working with a score template.
To open an audio or video file, follow these steps:
New to Soundbooth CS4 is the ability to save an editable source file in ASND format. The ASND format allows you to take a snapshot of the original audio so you can revert edits all the way back to the starting point. You can save ASND files for future editing and embed them directly in Flash CS4 FLA files.
Before you make edits to your source audio, you should save the file in the ASND format:
Soundbooth has some great cleanup features built in to help you create the best-quality audio from an office environment. This is a fairly common situation for a web developer: you may be using professional equipment, but you're working in an office instead of an actual sound booth. In my studio I often deal with the hum of several large computers or an occasional dog bark that needs to be edited out of an otherwise great track.
To remove noise or clicks and pops from an audio file, follow these steps:
To remove a sound from an audio file, follow these steps:
Another editing best practice is to cut the high frequencies out of the signal before converting the file to a compressed format. Doing so will produce a compressed sound with less of the artifacts that are commonly heard in web audio.
To cut the high frequencies or hiss out of the audio before export, follow these steps:
I find that I run through a standard set of editing steps before moving on toward exporting the file to a compressed format. Usually that includes trimming the audio to the correct length, adding fades and effects, and normalizing the audio if the levels are too low.
To trim the audio file to a specific length, follow these steps:
To apply an effect, such as the Voice Enhancer, follow these steps:
To add a fade in or out of the sound, follow these steps:
To normalize the recording and boost the volume level, follow these steps:
One of the benefits of this new generation of audio software is the ability to easily generate and export timing markers to use for content synchronization.
To create time markers in your audio file, follow these steps:
To export the markers as an XML file, follow these steps:
Tip: It's interesting to note that you can use the File > Import command to import a marker XML file. This feature allows you to create marker definitions for reuse across other media files or allows you to write markers in XML for output in FLV format. You can also import and export marker XML files in Adobe Media Encoder.
Last stop on the audio production side of things is to export the compressed audio from the source audio. Usually you'll be exporting to MP3 or FLV format, although you can work with ASND, WAV, or AIFF if you plan on embedding the audio in your FLA file.
To export an MP3 audio file, follow these steps:
To export an FLV file (with cue point markers bundled with the audio), follow these steps:
Whether you export your source files in MP3 or FLV format, you should end up with two sets of audio files: your master files (used for editing) and your exported files (served from the website).
Note: If you encode the FLV using Adobe Media Encoder outside of Soundbooth, you can embed the cue points directly in the FLV format. F4V video does not support cue points in the same way as the FLV format.
One topic that's often overlooked in web development is the topic of file management. To maintain a fast and easy workflow while handling multiple projects, your best bet is to use a process for organizing your FLA files, assets files, and website files. In addition, the structure of your FLA files, ActionScript coding, and the interface to the web environment should be simple and organized.
This section focuses on file management approaches in handling assets in the file system, while editing in Soundbooth, and implementing functionality in Flash CS4.
The primary means of file management while working in Soundbooth are the File, Scores, and Resource Central panels. You'll work with these panels while performing routine tasks and selecting files.
Adobe Bridge CS4 is used for audio file management across Soundbooth and all CS4 applications. Bridge is a simple, stand-alone application installed with the CS4 products. Included in its feature set is the ability to browse files, add metadata to files, group related assets, and perform automated tasks and batch commands.
To launch the Adobe Bridge CS4 file management tool, follow these steps:
For more information on Adobe Bridge CS4 features, see the Adobe Bridge/Version Cue online documentation.
Throughout this article the topic of embedded audio vs. external audio has come up. In regards to file management, you have to think about the question before you start setting up your file structure and approach. If you're embedding the audio, then your file management will take place in Flash CS4, where you'll use the FLA file's document Library to create an organized collection of movie clip symbols, sound assets, and library folders. If you're using external audio, then you'll manage the collection of files using XML play lists, an organized folder structure in an application folder, and the new Project panel in Flash CS4.
Consider the organization approach best used if you're planning on embedding your audio. In this case, the Library panel and movie clip symbols become your environment for organization. Embedded audio has to be imported through the File menu in Flash CS4 (File > Import). As you import sounds, they appear in the Library panel. It's important to organize your Library using folders. Otherwise, the view can quickly become a long scrolling list of assets that is unwieldy and impractical to navigate. It is a best practice to choose folder patterns and naming conventions for your Library that repeat across projects. This makes it easier for other developers (and even yourself at a later date) to understand where assets are located and where to edit your project (see Figure 12).
In addition to creating an easy-to-navigate series of folders for the Library assets, you have to think about how you will handle the sounds next. If you use the Sound object in ActionScript, then you can create a single reusable component or script to handle the audio in the Library. If you attach the sounds to a timeline using the Property inspector, then you'll either be separating sounds in button symbols, separating sounds in movie clip symbols, or staggering sounds along a single movie clip's timeline.
In the first sample file, three audio files are attached to timelines as event sounds in three separate button symbols. The collection of sound buttons is stored in a "Sound Player" movie clip that encapsulates the controls into a single object on the main Timeline. This type of organization within the FLA file works well for portability and general ease of navigation.
Note: Even though a sound in the Library is not a symbol type, the same concepts apply for reuse. That means you can create as many instances of the sound as you like without adding more file size to the FLA file.
For dynamic applications using sound, the audio files will usually be located external to the SWF file. These types of applications require a different organizational approach. In general, the Flash file should read in a playlist to remain flexible and reusable. Now that you're dealing with separate external files, you should create an organized project folder containing all your assets separated in corresponding folders.
The Project panel is similar to the Dreamweaver Site window in that it allows you to view the file structure of the website while working on a document. Flash CS4 includes an update to the Project panel. You can add FLA files, ActionScript files, and folders to a project. I use the Project panel as a shortcut to quickly open FLA files, ActionScript files, and text files directly from Flash CS4. I also find it useful to create a list of all the files and folders in my "project" folder for visual reference while thinking about or discussing the site (see Figure 13).
XML is a commonly used format for configuring playlists in SWF applications. For example, let's say that you build an MP3 jukebox application that loads a series of MP3 files and plays them in a simple interface. You probably wouldn't want to rebuild the application every time you wanted to switch the list of MP3 files. And what if you wanted an English version and a Spanish version?
Instead of embedding the audio files, you'll use external MP3 files and set up the jukebox application to read an external playlist. This external playlist is simply an XML formatted text file, which can be easily changed without changing the jukebox application. Furthermore, you could supply the path to the playlist file in the HTML parameters that embed the SWF file on the web page; this approach allows you to switch from English to Spanish file references dynamically or by using a separate HTML page for each language version.
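One way to read such an HTML-supplied path in ActionScript 3 is through loaderInfo.parameters on the main timeline; in this sketch, the "playlist" FlashVars parameter name and the fallback path are hypothetical:

```actionscript
// On the main timeline: read FlashVars passed in the HTML embed code.
var params:Object = this.loaderInfo.parameters;

// "playlist" is a hypothetical parameter name; fall back to a default path
// when the HTML page does not supply one.
var playlistPath:String = (params["playlist"] != undefined)
    ? params["playlist"]
    : "settings/audio_playlist_en.xml";

trace("Loading playlist from: " + playlistPath);
```

Switching languages then becomes a matter of pointing the parameter at a different XML file, with no change to the SWF itself.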
The XML format is simple to learn in the context of Flash development. An XML document is made up of open and close tags (<myTag></myTag>) similar to HTML tags. The names of the tags, called XML nodes, are user-defined and are intended to describe the data in-between the open and close tags. The node names and the nesting of nodes create a hierarchy of information that can be consumed as a string in any programming language.
A simple playlist document contains references to the audio file paths as well as track names and any other related information, as shown in the XML example below:
<songs>
<song>
<name>Song 1</name>
<source>audio/song1.mp3</source>
</song>
<song>
<name>Song 2</name>
<source>audio/song2.mp3</source>
</song>
<song>
<name>Song 3</name>
<source>audio/song3.mp3</source>
</song>
</songs>
Tip: It's a best practice to include the folder path in the XML data in addition to the filename. That way you can easily change files without necessarily having to change the filenames; for example, you could change the folder name to route between multiple languages.
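Once loaded into an XML object, a playlist like the one above can be read directly by node name using ActionScript 3's E4X syntax; a minimal sketch:

```actionscript
// Inline XML literal for illustration; normally this arrives via URLLoader.
var playlist:XML =
    <songs>
        <song>
            <name>Song 1</name>
            <source>audio/song1.mp3</source>
        </song>
    </songs>;

// Node names become property lookups under E4X.
trace(playlist.song[0].name);   // Song 1
trace(playlist.song[0].source); // audio/song1.mp3
```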
An XML file can also be used to store timing markers for synchronization of content while the audio is playing. One of the useful things about working with XML is that you can define its granularity as needed. For example, you could create a playlist that stores audio file paths and timing markers for each file—all in one playlist. Alternatively, you could create a simple playlist with file references and then create individual XML files containing timing markers per audio file. In this case you would load the synchronization details as needed. When deciding which route to take, the decision usually is based upon the amount of data being described and when the data is needed. If you can get everything in one file, then you can load that file as the application launches and instantly have access to all the information as the movie plays. If the amount of information is so great that you cannot load it all up front, then you can split the data into separate files and load each file as needed.
The general concept of timing markers (more commonly called cue points in ActionScript) is to provide time-based notifications to the SWF application's interface that something is happening. This strategy can be used for synchronizing animation, signaling to the interface that it's time to do something, or for displaying text captions that visually support the audio. Cue points generally have name and time properties associated with them. However, you can combine caption text or anything else that you need when creating the XML file. Remember, the goal is to create an external playlist that is easy to change so that you can update the movie without editing it. Identify all the information you need to update in your project and add the corresponding data in each XML node.
In the example shown below, the playlist has been updated to add cue points and captions to each audio file:
<songs>
<song>
<name>Song 1</name>
<source>audio/song1.mp3</source>
<cuePoints>
<cuePoint>
<name>Cue 1</name>
<time>2</time>
<caption>Caption 1...</caption>
</cuePoint>
<cuePoint>
<name>Cue 2</name>
<time>4</time>
<caption>Caption 2...</caption>
</cuePoint>
</cuePoints>
</song>
<song>
<name>Song 2</name>
<source>audio/song2.mp3</source>
<cuePoints>
<cuePoint>
<name>Cue 1</name>
<time>5</time>
<caption>Caption 1...</caption>
</cuePoint>
<cuePoint>
<name>Cue 2</name>
<time>10</time>
<caption>Caption 2...</caption>
</cuePoint>
</cuePoints>
</song>
</songs>
For more information while you're getting started with XML, read XML in the real world.
Now that you understand the supported audio formats in Flash Player, have prepared files in an audio editing application, and have organized your project to achieve best results, you're ready to jump in and start implementing the audio. There are a number of approaches you can take that range from easy to intermediate in difficulty. This section lists a range of options available to you and supplies examples for each approach.
The easiest strategy is also one of the oldest approaches for synchronizing sounds to graphic content in your FLA file. It involves attaching embedded audio to a keyframe on the Timeline using the Property inspector. The process is fairly easy and can be accomplished quickly without using code.
To embed an audio file and attach it to a button timeline, follow these steps:
If you are following along using the sample files provided on the first page of this article, open sample_1.fla and double-click the Button – One symbol in the Library to view its timeline (see Figure 14). You can experiment with this working example.
To embed an audio file and set it up for synchronized streaming with the timeline, follow these steps:
Add a stop action at the end of the timeline to keep it from looping and repeating the sound.

The introduction of ActionScript 1.0 in Flash 5 brought with it the Sound object. The Sound object is an easy-to-use code feature that allows you to load a sound from the Library dynamically or load an MP3 file from an external location.
To load an embedded sound from the Library using a button symbol and ActionScript 3, follow these steps:
import flash.events.MouseEvent;
import flash.media.Sound;
function handleClick(event:MouseEvent):void
{
var snd:Sound = new sound1();
snd.play();
}
play_btn.addEventListener(MouseEvent.CLICK,handleClick);
To see a working example of this section, open the sample_2.fla file from the sample files folder. You can examine the project in detail by investigating the code and assets on each layer (see Figure 15).
To load an MP3 file from an external location, follow these steps:
import flash.events.MouseEvent;
import flash.media.*;
function handleClick(event:MouseEvent):void
{
var path:String = "audio/mp3/counting1a.mp3";
var stream:URLRequest = new URLRequest(path);
var snd:Sound = new Sound(stream);
snd.play();
}
play_btn.addEventListener(MouseEvent.CLICK,handleClick);
To see a working example of this section, open the sample_3.fla file from the sample files folder. Examine the project by reviewing the code, which is expanded to three buttons and plays a different external sound as each button is clicked (see Figure 16).
There are two benefits to using FLV audio. First, FLV files can contain embedded cue point metadata that can easily be used for content synchronization without using XML data. Second, the FLV file can be loaded and manipulated using the FLVPlayback component with little to no coding.
Note: The FLV export feature in Soundbooth does not appear to embed cue points or totalTime metadata. You can use the cue point marker file saved from Soundbooth for synchronization, as shown in the next example. For best results, encode your WAV source file to FLV using the stand-alone Adobe Media Encoder utility.
To load an FLV audio file using the FLVPlayback component, follow these steps:
Use the skin parameter to choose one of the default skins, or set the skin parameter to none to either forgo controls or clear the default skins so you can piece together your own controls using the FLVPlayback custom user interface components. For more information on skinning the FLVPlayback components, read Skinning the ActionScript 3 FLVPlayback component.

To synchronize content to an FLV audio file using embedded cue point metadata, follow these steps:
import fl.video.*;
import flash.net.*;
import flash.events.Event;
// Respond to cuePoint events
function handleCuePoint(event:MetadataEvent):void
{
trace("name = "+event.info.name+", time = "+event.info.time);
}
// Parse XML and add ActionScript cue
// points to the FLVPlayback instance.
function addMarkers( xmlData:XML ):void
{
// Convert the XML format to ActionScript
var len:Number = xmlData.CuePoint.length();
for(var i:Number=0; i < len; i++)
{
var curCueTime:Number = Number(xmlData.CuePoint[i].Time)/1000;
var curCueName:String = String(xmlData.CuePoint[i].Name);
// Add cue point
flvAudio.addASCuePoint(curCueTime,curCueName);
}
flvAudio.addEventListener(MetadataEvent.CUE_POINT,handleCuePoint);
}
// Load the XML marker list
function dataHandler(event:Event):void
{
addMarkers(new XML(loader.data));
}
var loader:URLLoader = new URLLoader();
loader.addEventListener(Event.COMPLETE, dataHandler);
loader.load(new URLRequest("settings/counting1_markers.xml"));
Tip: This code loads the markers file, converts the XML into ActionScript cue points, and listens to the cue point event from the video component to change the currently viewed frame and content.
stop();
import fl.video.*;
import flash.net.*;
import flash.events.Event;
function handleCuePoint(event:MetadataEvent):void
{
gotoAndStop(event.info.name);
}
...
Tip: If you're using Navigation or Event cue points already embedded in the FLV file, simply skip the XML loading parts of the code sample and assign the handleCuePoint event handler to the flvAudio instance.
To see a working example of this section, open the sample_4.fla file from the sample files folder. Examine the project by reviewing the code, which loads the FLV audio file and uses its cue points to synchronize the displayed content (see Figure 17).
While the use of FLV files and cue point metadata can be a powerful way to develop synchronized media, it may not always be an option if the audio files have to be usable outside of a SWF environment. In these cases, you can use MP3 files, XML playlists, and cue point lists to create synchronization between audio and other content.
To load an XML playlist file into ActionScript, follow these steps:
import flash.net.*;
import flash.events.Event;
var playlist:XML;
var loader:URLLoader = new URLLoader();
function dataHandler(event:Event):void
{
    playlist = new XML(loader.data);
    trace(playlist);
}
loader.addEventListener(Event.COMPLETE, dataHandler);
loader.load(new URLRequest("settings/audio_playlist.xml"));
<?xml version="1.0" encoding="UTF-8" standalone="no" ?>
<sounds>
    <sound>
        <label>Sound 1</label>
        <data>audio/mp3/counting1.mp3</data>
    </sound>
    <sound>
        <label>Sound 2</label>
        <data>audio/mp3/counting2.mp3</data>
    </sound>
    <sound>
        <label>Sound 3</label>
        <data>audio/mp3/counting3.mp3</data>
    </sound>
</sounds>
To implement an XML playlist in ActionScript 3, follow these steps:
import flash.net.*;
import flash.media.*;
import flash.events.Event;
var playlist:XML;
var loader:URLLoader = new URLLoader();
// Load the XML play list
function dataHandler(event:Event):void
{
    playlist = new XML(loader.data);
    // Assign the play list to a combobox component
    for (var n:String in playlist.sound)
    {
        var child:XML = playlist.sound[n];
        soundList.addItem({label:child.label, data:child.data});
    }
}
loader.addEventListener(Event.COMPLETE, dataHandler);
loader.load(new URLRequest("settings/audio_playlist.xml"));
// Load files from the play list when a combobox selection is made
var snd:Sound;
function changeHandler(event:Event):void
{
    var path:String = event.target.selectedItem.data;
    snd = new Sound(new URLRequest(path));
    snd.play();
}
soundList.addEventListener(Event.CHANGE, changeHandler);
Tip: Working with events and event handler functions is the key to handling interactivity in ActionScript 3. Explore the Flash CS4 Help pages (F1) for more information on using audio events to work with metadata and error handling.
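For example, a sound that fails to load dispatches an IOErrorEvent, which you can catch rather than letting Flash Player throw an unhandled error. A minimal sketch, reusing the MP3 path from the earlier samples:

```
import flash.media.Sound;
import flash.net.URLRequest;
import flash.events.IOErrorEvent;

var snd:Sound = new Sound();
// Register the error listener before starting the load
snd.addEventListener(IOErrorEvent.IO_ERROR, errorHandler);
snd.load(new URLRequest("audio/mp3/counting1.mp3"));

function errorHandler(event:IOErrorEvent):void
{
    trace("Sound failed to load: " + event.text);
}
```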
To see a working example of this section, open the sample_5.fla file from the sample files folder (see Figure 18).
ActionScript 3 introduced the SoundChannel object, which allows you to stop a sound and respond to its completion without closing the stream. This is a handy trick to know when you want to load a sound once and play it multiple times without streaming it down each time.
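The load-once pattern looks like this in outline (a minimal sketch, reusing the MP3 path from the earlier samples):

```
import flash.media.Sound;
import flash.media.SoundChannel;
import flash.net.URLRequest;

// Download the MP3 once...
var snd:Sound = new Sound(new URLRequest("audio/mp3/counting1.mp3"));

// ...then each play() starts a new SoundChannel over the same data,
// with no additional download
var channelA:SoundChannel = snd.play();
var channelB:SoundChannel = snd.play(500); // a second instance, starting at 500 ms
```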
To play a sound and control it with a SoundChannel object, follow these steps:
import flash.media.Sound;
import flash.media.SoundChannel;
import flash.events.Event;
import flash.events.MouseEvent;
import flash.net.URLRequest;
var snd:Sound;
var sndChannel:SoundChannel;
var sndPlaying:Boolean = false;
function playHandler(event:MouseEvent):void
{
    if (sndPlaying)
    {
        stopSound();
    }
    sndPlaying = true;
    snd = new Sound(new URLRequest("audio/mp3/counting1.mp3"));
    sndChannel = snd.play();
    sndChannel.addEventListener(Event.SOUND_COMPLETE, completeHandler);
}
function stopHandler(event:MouseEvent):void
{
    stopSound();
}
function completeHandler(event:Event):void
{
    stopSound();
}
function stopSound():void
{
    sndPlaying = false;
    // Guard against a stop click arriving before the first play
    if (sndChannel != null)
    {
        sndChannel.stop();
        sndChannel.removeEventListener(Event.SOUND_COMPLETE, completeHandler);
    }
}
play_btn.addEventListener(MouseEvent.CLICK, playHandler);
stop_btn.addEventListener(MouseEvent.CLICK, stopHandler);
See sample_6.fla in the sample files for the working example of this section.
ActionScript 3 provides a number of options for sound control beyond the basics discussed here. If you want to take your research further, the next steps will be to look deeper into the SoundChannel object and examine how to control volume, panning, and sound transformations.
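As a starting point, volume and panning are applied to a SoundChannel through a SoundTransform object. A minimal sketch, assuming the sndChannel instance from the previous example:

```
import flash.media.SoundTransform;

// Half volume (range 0.0-1.0), panned fully left (range -1.0 to 1.0)
var transform:SoundTransform = new SoundTransform(0.5, -1);
sndChannel.soundTransform = transform;
```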
Take a look at the Flash CS4 Professional ActionScript 3 sample files and other media sample files to see more sound control examples.

This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 Unported License