
Advanced Android development with Flex "Hero" and Flash Builder "Burrito"

by Brian Rinaldi

In previous issues of the Edge, I discussed getting started with Adobe AIR for Android using Adobe Flash Professional CS5 as well as creating Android apps with Adobe Flex “Hero” and Flash Builder "Burrito." While both these articles showed how easy developing native Android applications can be using these tools, they only skimmed the surface of what makes developing for mobile devices fun and exciting — things like taking photos and video with the camera, recording audio with the microphone, responding to gestures or accelerometer data, and storing local files and data.

In this article, I show you how to use some of the built-in APIs in ActionScript to add more advanced features to your AIR for Android applications. Specifically, I will walk you through a sample application that enables you to choose existing photos or take new photos and add them to an album. You can then add an audio note to each photo in the album. All of this is stored locally on the file system and in a simple SQLite database. While the actual sample application is built using Flex “Hero” and Flash Builder “Burrito,” the specific ActionScript APIs remain the same if you choose to develop using Flash Professional CS5.

Fig. 1 The view album screen of the finished application.

It’s important to note that when we talk about mobile devices, we aren’t just talking about phones. For example, this sample application was tested on both a Nexus One phone and a Samsung Galaxy Tab tablet. A variety of device form factors run Android, and more are in the pipeline. (Adobe evangelist Paul Trani has a great post on designing for various screen sizes.) The good news is that the same application, without any code changes, ran perfectly on both the phone and the tablet in testing.

Local file system and data storage in SQLite

While it’s admittedly not nearly as much fun as taking photos or recording microphone data, local data and file storage is usually an integral part of any serious mobile application. It’s useful for things like storing user preferences and account information or even caching a local copy of data or images retrieved from a web service. The good news is that AIR for Android supports all the same file system and SQLite APIs that you may already be familiar with from developing for AIR on the desktop.

Creating a SQLite database

Typically, you’ll want to create the database and table structure as soon as the application is opened for the first time. For relatively simple database structures, you create the database file and execute some SQL statements to create the necessary tables, columns, and rows of default data. To do that, you use a method that is called when your application is initialized.

When you create a new Flex mobile project using Flash Builder, it generates the default MXML file as well as the initial view. (In this case, I’ve called my sample project AVAlbum, so the default MXML file is AVAlbum.mxml, and the initial view is AVAlbumHome.mxml.) Since your database creation script should run before you actually enter any view, use the initialize event on the MobileApplication tag within your default MXML file. In the following code, the initialize event calls a method named init():

<s:MobileApplication xmlns:fx="http://ns.adobe.com/mxml/2009"
	xmlns:s="library://ns.adobe.com/flex/spark"
	xmlns:mx="library://ns.adobe.com/flex/mx"
	firstView="views.AVAlbumHome"
 	sessionCachingEnabled="true"
	initialize="init()">

Now add the init() method to your code. In my example, this method calls a createDb() method that creates the database file and table if they do not already exist. It uses the File API to get a reference to the database file in the application storage directory and opens a connection to it. Finally, a SQL statement is built as a string and executed against that connection.

protected function init():void
{
	createDb();
	navigator.firstViewData = model;
}

protected function createDb():void
{
	// Open (or create) the database file in the application storage directory
	sqlConnection = new SQLConnection();
	sqlConnection.open(File.applicationStorageDirectory.resolvePath("AVAlbum.db"));

	// Create the albumItems table the first time the application runs
	var stmt:SQLStatement = new SQLStatement();
	stmt.sqlConnection = sqlConnection;
	stmt.text =
		"CREATE TABLE IF NOT EXISTS albumItems (" +
		"id INTEGER PRIMARY KEY AUTOINCREMENT, " +
		"audioFile STRING, " +
		"photoFile STRING, " +
		"dateAdded DATE)";
	stmt.execute();

	// Keep the connection on the model so views can reuse it
	model.connection = sqlConnection;
}

Persisting data across views

You may notice that I use a variable called “model” here and scattered throughout my sample code. This variable is actually an instance of an ActionScript class I also call “model” that is designed to store data I need to persist across views on my mobile application. In this case, for instance, it allows me to reuse my database connection rather than re-create it on each view. In order to pass this structure down to my first view, I simply need to specify my model variable as the navigator.firstViewData in my init() method.
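
To give a concrete picture, a minimal sketch of such a model class might look like the following. The class and package layout here are my own illustration, not the sample project's actual source; only the connection, albumItems, and selectedItem properties are referenced elsewhere in this article.

package
{
	import flash.data.SQLConnection;

	// Illustrative sketch of the "model" class described above.
	[Bindable]
	public class Model
	{
		// Reusable SQLite connection shared across views
		public var connection:SQLConnection;

		// Records loaded from the albumItems table
		public var albumItems:Array = [];

		// Index of the photo currently being viewed
		public var selectedItem:int = -1;
	}
}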

While using a model class isn’t an officially sanctioned best practice, I’ve found it helpful in my mobile development using Flex. You may recall from the prior tutorial that you can pass any arbitrary structure of data between views. However, I’ve found this makes it difficult to know what data any particular view is receiving, which can lead to difficulties when debugging issues.

Instead, in my application, every view always expects to receive the model class (typically within the viewActivate event handler). However, it is important to remember that you aren’t just passing it forward to the next view, as with the firstViewData above or the pushView() method. You must also make sure it is passed back when a view is popped, whether popView() or popToFirstView() is called or the user hits the Back button on the device. To do this, add the following method to your view components:

override public function createReturnObject():Object {
	return model;
}

Adding this method ensures that the model class (or any data you want to pass back) is returned to the prior or first view when the current view is popped. For more information, check out Passing data on pop methods in mobile Flex on Adobe Cookbooks.
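
Putting the two halves together, here is a hedged sketch of how a view might receive the model when it is activated and pass it forward when navigating. The handler and view names (onViewActivate, showPhoto, PhotoView) are illustrative, not taken from the sample project.

// Called from the view's viewActivate event; handler name is illustrative.
protected function onViewActivate(event:Event):void
{
	// Data passed via firstViewData or pushView() arrives on the view's data property.
	model = data as Model;
}

protected function showPhoto():void
{
	// Pass the same model forward to the next view.
	navigator.pushView(PhotoView, model);
}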

Adding and updating records in SQLite

Now that you’ve established a connection to the database and seen how easy it is to execute SQL statements against this connection, the next step is to add or update records of data. You just need to change the SQL being executed. For instance, the SQL below adds a new user-selected photo in the sample application to the database:

var sqlStatement:SQLStatement = new SQLStatement();
sqlStatement.sqlConnection = model.connection;
sqlStatement.text =
	"INSERT INTO albumItems (photoFile, dateAdded) " +
	"VALUES (:photoFile, :dateAdded)";
sqlStatement.parameters[":photoFile"] = evt.data.file.url;
sqlStatement.parameters[":dateAdded"] = new Date();
sqlStatement.execute();

You’ll notice that I am using a parameterized query whereby the photoFile and dateAdded data are set via SQL parameters. An update query works much the same way. In the following example, I am updating the data record with a new audio description that was added to the photo:

var sqlStatement:SQLStatement = new SQLStatement();
sqlStatement.sqlConnection = model.connection;
sqlStatement.text =
	"UPDATE albumItems " +
	"SET audioFile = :audioFile " +
	"WHERE id = :id";
sqlStatement.parameters[":audioFile"] = model.albumItems[model.selectedItem].id + "note.wav";
sqlStatement.parameters[":id"] = model.albumItems[model.selectedItem].id;
sqlStatement.execute();
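
Reading the stored records back out works the same way. The following is a minimal sketch, not taken from the sample application's code, of loading the album records into the shared model:

// Illustrative sketch: query the albumItems table and store the rows on the model.
var selectStatement:SQLStatement = new SQLStatement();
selectStatement.sqlConnection = model.connection;
selectStatement.text = "SELECT id, photoFile, audioFile, dateAdded FROM albumItems ORDER BY dateAdded";
selectStatement.execute();

var result:SQLResult = selectStatement.getResult();
if (result.data != null)
	model.albumItems = result.data; // an Array of row objects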

As you can see, using the local SQLite database storage is both powerful and easy to use.

Working with the local file system

If you’ve ever worked with the file system API in Adobe AIR for desktop development, the good news is that it doesn’t change when working on AIR for Android. It’s just as easy to access or create files and folders. For instance, in the following lines of code from the sample application, we get a reference to a folder called “audioFiles” on the device, check if it exists, and, if not, create it. Finally, we get a reference to the WAV file that we will play back.

var outputDir:File = File.documentsDirectory.resolvePath("audioFiles");
if (!outputDir.exists) outputDir.createDirectory();
var outputFile:File = outputDir.resolvePath(model.albumItems[model.selectedItem].id + "note.wav");
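
Reading that file's contents back into memory for playback is just as simple. Here is a minimal sketch, not from the sample application, using a FileStream:

// Illustrative sketch: read the saved WAV file into a ByteArray.
var stream:FileStream = new FileStream();
stream.open(outputFile, FileMode.READ);

var savedBytes:ByteArray = new ByteArray();
stream.readBytes(savedBytes); // read the entire file
stream.close();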

Obviously, this is only the beginning of what you can do with the file system API. For example, I have a blog post explaining how you can save a local copy of remote images for offline loading. But as you can see, working with the file system on mobile devices is just as easy in AIR for Android as it is in AIR for the desktop.

Adding photos with the CameraRoll

CameraRoll is an API that enables users to open the image gallery on their device and choose an existing photo. You’ll notice in the snippet below that before using the CameraRoll, we check that the device actually supports browsing for images, which avoids runtime errors on devices that don’t. If it is supported, we create a new instance of CameraRoll and add an event listener that calls a method (in this case onMediaSelect) when a photo is chosen:

if (CameraRoll.supportsBrowseForImage) {
	viewsAC.addItem({label:"Add Existing Photo", icon:fbIcon});	
	roll = new CameraRoll();
	roll.addEventListener(MediaEvent.SELECT, onMediaSelect);
}	

Once this is done, we can launch the CameraRoll with a single line of code that we call when the user chooses the menu item for adding an existing image:

if (viewsList.selectedItem.label=="Add Existing Photo")
	roll.browseForImage();

The onMediaSelect method is called when a photo is selected. The event it receives has a data property containing an instance of the MediaPromise class, which exposes information about the chosen image. In the sample application, this information is inserted into the local database, but for the sake of brevity, the following method simply sets an Image component (with the ID “image”) to the chosen photo’s local URL, thereby displaying the image:

protected function onMediaSelect(event:MediaEvent):void
{	
	var mp:MediaPromise = event.data;				
	image.source = mp.file.url;
}

Taking photos with the camera

Taking photos with the device’s camera in your application is very similar to working with the CameraRoll. First, you check to ensure that the device actually supports it. Then create an instance of CameraUI. Finally, add an event listener to call a method (in this case onCameraComplete) when the photo is taken and confirmed:

if (CameraUI.isSupported) {
	viewsAC.addItem({label:"Take Photo", icon:fbIcon});
	myCam = new CameraUI();
	myCam.addEventListener(MediaEvent.COMPLETE, onCameraComplete);
}

Launching the camera takes just a single line of code, which you call when the user clicks the Take Photo button:

if (viewsList.selectedItem.label=="Take Photo")
	myCam.launch(MediaType.IMAGE);

Finally, in the event listener method, you can use the received MediaEvent to get information about the image the user took. For example, the code below sets an image (also with the ID “image”) to display the taken photo by using the photo file’s local URL:

protected function onCameraComplete(evt:MediaEvent):void
{
	image.source = evt.data.file.url;
}

Recording video really isn’t much different from using the camera. In fact, Koen De Weggheleire has a good tutorial on handling video recording and playback on AIR for Android.
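
As a quick illustration, the same CameraUI instance can be launched for video capture simply by passing a different MediaType (a one-line sketch, assuming the device supports video capture):

// Launch the device camera in video mode instead of still-image mode.
myCam.launch(MediaType.VIDEO);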

Recording audio with the microphone

Your application can enable users to record voice input via the device’s microphone. This is a bit more involved than the photo code above, but it isn’t terribly complex. To do so, you use the Microphone class. In the sample application, we allow the user to add a voice note to any photo in their album. The startRecording() method is called when the Record button is clicked. As you can see below, it opens a connection to the first microphone instance from the available list of microphones on the device. The method then adds an event listener that is called as audio data is received.

protected function startRecording():void
{
	isRecording = true;
	// getMicrophone() takes an index, so request the first available microphone
	microphone = Microphone.getMicrophone(0);
	microphone.rate = 44;   // sample at 44 kHz
	microphone.gain = 100;  // boost the input to maximum
	soundData = new ByteArray();
	microphone.addEventListener(SampleDataEvent.SAMPLE_DATA, onSampleDataReceived);
}

As the data is received, the application writes this data to the soundData variable, which is an instance of a ByteArray:

protected function onSampleDataReceived(event:SampleDataEvent):void
{
	while (event.data.bytesAvailable)
	{
		var sample:Number = event.data.readFloat();
		soundData.writeFloat(sample);
	}
}

Finally, when the user stops the recording by hitting the Record button again, the stopRecording method is called. It removes the event listener from the microphone and passes the soundData ByteArray to a method that writes this out to a WAV file. We could actually play the audio back without writing it out to a file, but for this application, we need to store the audio in some kind of file format.

protected function stopRecording():void
{
	isRecording = false;
	microphone.removeEventListener(SampleDataEvent.SAMPLE_DATA, onSampleDataReceived);
	writeWAV(soundData);
	saveRecording();
}

While a deep explanation of writing (and reading) WAV files is outside the scope of this article, it is relatively simple to write them using the WAVWriter class. This class and the writeWAV() method in the sample application came from Christophe Coenraets’ article and VoiceNotes sample application on his blog. However, I couldn’t get the WAV playback to work and would probably have pulled all my hair out if it weren’t for the sample code and WAVReader class provided by Doug Arthur. If you want to know more about how WAV files are saved and played, look at the code provided in the sample application.
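
If you’d like to experiment with playback before digging into those WAV classes, one generic approach (a sketch of a common technique, not the WAVReader-based code used in the sample application) is to feed the recorded samples to a dynamic Sound object:

// Illustrative sketch: play the recorded mono samples through a dynamic Sound
// object by writing them to the output stream on demand.
protected function playRecording():void
{
	soundData.position = 0;
	var sound:Sound = new Sound();
	sound.addEventListener(SampleDataEvent.SAMPLE_DATA, onPlaybackSampleData);
	sound.play();
}

protected function onPlaybackSampleData(event:SampleDataEvent):void
{
	// Provide up to 8192 samples per request, duplicating each mono sample
	// for the left and right output channels.
	for (var i:int = 0; i < 8192 && soundData.bytesAvailable > 0; i++)
	{
		var sample:Number = soundData.readFloat();
		event.data.writeFloat(sample); // left channel
		event.data.writeFloat(sample); // right channel
	}
}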

AIR Launchpad

If you want to start a new AIR application, you cannot afford to ignore the AIR Launchpad application created by Holly Schinsky for Adobe and available on Adobe Labs. The application walks you through a simple wizard and generates an FXP file that can be imported as a new Flex project to Flash Builder.

Not only does Launchpad handle many of the basic settings for a new Flex mobile application, but it gives you the option to include samples for many of the items I discussed above, including taking or choosing photos, recording and playing back audio from the microphone, creating and storing data in a local database, and much more. In fact, Launchpad includes so many sample code options, I think Holly is secretly trying to put those of us who write tutorials out of business. I highly recommend you simply build Launchpad into your workflow for beginning any Flex-based AIR application — mobile or desktop.

Wrapping it all up

We covered a lot of ground in this article, and due to space constraints, I couldn’t explain every detail. However, you can download the project code and add it to Flash Builder “Burrito.” The application is still a work in progress; for instance, I would like to add key features such as support for multiple albums. In the meantime, the download gives you access to all the code I didn’t have room to cover in this article.

AIR for Android gives you access to the exciting device features and APIs that make native mobile application development fun. By adding photos, audio, and local data and file storage, you can create a rich experience for your users on mobile devices.



Brian Rinaldi is a Content and Community Manager for the Adobe Developer Center team, where he helps drive content strategy for HTML5 and JavaScript developer content. Brian blogs regularly at http://remotesynthesis.com and is an unreformed Twitter addict @remotesynth.