Building a visual performance app with
Adobe AIR, Flex, and Flash

by Mike Creighton

By day, I'm a lead developer at Odopod, an interactive design studio in San Francisco. Using technologies such as Adobe Flash, HTML, CSS, and JavaScript, I work with a team of designers and developers to create engaging websites. By night, I consider myself more of a technological artist. I'm interested in dynamic art, generative systems, and, most of all, interactivity. And I have a strong desire to share this art with others in the form of live performance.

Plenty of software exists for performing digital artwork; most of it comes under the category of visual jockey (VJ) software. But most VJ software doesn't work for the dynamic art I'm interested in.

So after seeing Mike Chambers' presentation of Adobe® AIR™ a few months ago, I realized that my experience with web technologies now enabled me to make the application I wanted. In this article, I explain why I needed to create this application and how I brought it to fruition using Adobe AIR, Adobe Flex, and Adobe Flash software.

Flaws in available VJ software

The available VJ software didn't work for me for several reasons, including:

Too much UI management. This is the number-one reason I dislike most VJ software. Often, it sucks you into managing the UI during your performance rather than concentrating on performing. I want to be gestural with my performance. I want to be able to physically say, "Now I want the visuals to blow up," or "Let's start highlighting this sound." Instead, I had to use the keyboard and mouse to manage some UI elements, causing me to miss precious cues. I could use some sort of controller, such as MIDI, OSC, or something hacked together, but that leads me to my next grouse.

Too much focus on clip triggering. I have an interactive media background. I think in dynamics. I think in variables and rule sets. I don't think in terms of linear media. Almost all available VJ software focuses on layering video clips, which I don't like. I could trigger clips using some sort of gestural controller, adjusting layer opacities and filters with physical sliders, but I would still be limited to the linear prerendered footage in my library, which doesn't suit my tastes. So why not use something programmatic, such as Processing, Max/MSP/Jitter, or Touch?

Maintaining visual continuity is difficult. In many programmatic software environments, you can build beautiful, dynamic, and interactive visuals. However, moving from one visual construct to another is no simple task. There's no meta application that lets you string these pieces together in a unified environment (unless you build it yourself). I started doing that for Processing applets, but two things stopped me: a bug in the latest build prevented me from creating a proof of concept, and I didn't have the time to learn Java.

The plan

Knowing the flaws of existing VJ applications, I knew the general feature set and goals for my application — visp, which is short for visual performer. The general logic behind building an application like this — especially one so visually oriented — isn't so different from planning the logic for the types of websites I work on by day. The key hurdles were learning Flex software and the ActionScript 3.0 language, and familiarizing myself with the added features that Adobe AIR brings. Apart from ActionScript 2.0, I had no prior experience with any of these technologies.

But that was the big surprise. Learning those new technologies was hardly a hurdle at all. ActionScript 3.0 is a huge improvement over ActionScript 2.0 in terms of language standardization, ease of use, and clarity. The Flex framework is a great extension of ActionScript, bringing together already-familiar technologies such as XML and CSS. Flex and the Flex Builder plug-in for Eclipse greatly reduced the amount of time it took to get a fully functional interface onscreen for visp. The combination of these things and my web development background let me focus on the architecture and logic of my application, rather than learning foreign technologies.

I simply made a list of the features I needed for my gig, researched the Adobe AIR documentation for the new desktop-specific APIs I needed (the File System API and the Native Windowing API), and began building visp one feature at a time.
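As one illustration of those desktop-specific APIs, the sketch below uses AIR's Native Windowing API to open a second, chromeless window for full-screen visual output. This is not visp's actual code; the `visualsSprite` display object and the window settings are assumptions for the sake of the example.

```actionscript
import flash.display.NativeWindow;
import flash.display.NativeWindowInitOptions;
import flash.display.NativeWindowSystemChrome;
import flash.display.Sprite;
import flash.display.StageDisplayState;
import flash.display.StageScaleMode;

// Hypothetical display object holding the rendered visuals.
var visualsSprite:Sprite = new Sprite();

// Request a window with no operating-system chrome.
var options:NativeWindowInitOptions = new NativeWindowInitOptions();
options.systemChrome = NativeWindowSystemChrome.NONE;
options.transparent = false;

// Every NativeWindow gets its own stage to draw onto.
var outputWindow:NativeWindow = new NativeWindow(options);
outputWindow.stage.scaleMode = StageScaleMode.NO_SCALE;
outputWindow.stage.addChild(visualsSprite);

// Bring the window forward, then take it full screen.
outputWindow.activate();
outputWindow.stage.displayState = StageDisplayState.FULL_SCREEN_INTERACTIVE;
```

Because the output lives on its own stage, the main application window is free to show controls and a preview without interfering with the performance display.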

The application

Within six weeks (working just nights and weekends), I had a fully functioning VJ application that satisfied my performance goals (see Figure 1). I even had time to iterate through three versions of visp in that period. Because Adobe AIR is built on Flash Player 9, I saw significant performance gains over anything I had built for Flash Player 8. An added benefit of building Adobe AIR applications is that they truly are cross-platform. Without modifying any code, I was able to compile and run the application on both Windows and Mac OS X. This makes sharing the application with the VJ and digital artist community so much easier. And because Adobe has released the Flex SDK and ActionScript to the open-source community, I'm able to release visp as a truly open-source application that is compilable across platforms.


Figure 1: VJ application

You can find the latest release version of visp online, along with tutorials for creating your own custom content. For those who want to see the inner workings of the application, I have released the source code under the GNU General Public License (GPL) 2.0 on Google Code. Everything will be kept compatible with the latest version of the Flex 3 SDK through its final release.

The application has an API for creating performable content, custom filter overlays (post-render effects), and custom transitions to help performers make visually seamless transitions between their individual pieces. Also included is a stand-alone Java applet that acts as a MIDI proxy, enabling performers to control visp using a MIDI controller (see Figure 2). This too is open source. Lastly, thanks to Adobe AIR, visp can output the visuals to a separate window entirely — which is a must-have feature for any VJ — while displaying a full-resolution preview window within the interface.
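One common way for an AIR application to talk to a local Java proxy like this is over a socket on localhost. The sketch below is an assumption about how such a bridge could look, not visp's actual protocol: the port number, the XML message format, and the attribute names are all illustrative.

```actionscript
import flash.events.DataEvent;
import flash.net.XMLSocket;

// Connect to a hypothetical MIDI proxy listening locally.
var midiSocket:XMLSocket = new XMLSocket();
midiSocket.addEventListener(DataEvent.DATA, onMidiData);
midiSocket.connect("127.0.0.1", 9000); // illustrative port

function onMidiData(event:DataEvent):void {
    // Assumed message shape, e.g.:
    // <midi status="176" data1="7" data2="64"/>
    var message:XML = new XML(event.data);
    var controller:int = int(message.@data1);
    var value:int = int(message.@data2);
    // Map the incoming controller value onto a performance
    // parameter (layer opacity, filter amount, and so on).
}
```

Pushing MIDI through a proxy like this keeps the hardware-specific code in Java while the performance logic stays in ActionScript.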


Figure 2: MIDI proxy

The future

This initial release presents a base platform for digital artists and performers to use for their projects and gigs. And while I don't have any big plans for additional features beyond general customization and configuration options, I do see an opportunity to leverage the Internet-capable nature of Adobe AIR to build features that support the community of visp users.

I also hope that interest in the application will warrant bringing other input communication options to visp via the new Java to AIR bridge, called Artemis, or even via Processing. This would mean communicating with OSC devices, the Wiimote, the P5 Virtual Reality glove, and any of the readily available microcontroller kits with external sensors.

Making the leap

The ease of building visp on Adobe AIR using Flex and ActionScript 3.0 was completely unexpected, and has encouraged me to start thinking outside the browser for the first time. After this initial tryst with Adobe AIR, I already have a couple of applications under development: a personal productivity application to help keep records of communication, and a media-management tool for popular social networks.

Even if you don't specialize in Flash-based websites, the good news is that you can still use your Ajax and HTML development skills to create cross-platform, Internet-aware desktop applications. With Adobe AIR, it's no longer a matter of wondering if you can build it; it's simply a matter of deciding what to build.

Visit the Adobe AIR Developer Center

Mike Creighton is a lead developer at Odopod, an interactive design studio in San Francisco.