by Joe Berkovitz
Allurent is a software company that offers next-generation e-commerce experiences. As VP of Engineering at Allurent, I am responsible for all technical aspects of product development.
When I joined the company, we were just a fledgling startup. One of our first challenges was to select the client technology platform on which we'd be building these e-commerce products. We selected Adobe Flex and Adobe Flash Player, and we've been happy with that decision. Not only has the Flex 2 platform blossomed, but Adobe is taking the great qualities of Flex and aiming them at the desktop application space with Adobe AIR. Naturally, this has prompted us to think about what Adobe AIR might do for shopping, both online and offline.
In this article, I describe some of our experiences developing the Allurent Desktop Connection, an AIR-based prototype that leverages both the Internet and the power of the local desktop to create a stellar shopping experience. I'll cover our experience with the platform and the tools and look at a few of the challenges we had to solve along the way.
We developed the Allurent Desktop Connection (ADC) in cooperation with Anthropologie, the women's apparel and home goods retailer. With ADC, we will deliver a premium desktop shopping experience to customers. (The ADC pilot launches in 2008; to see a canned demo, please visit allurent.com/adc.) Our goal for this early prototype was to vividly demonstrate the application's key features in working form.
In short, we aimed to present a vision of shopping that exploited the unique capabilities of Adobe AIR to our customers' maximum advantage.
We had about eight weeks to build the ADC prototype in time for its unveiling at Adobe's North American MAX keynote on October 1, 2007. The development team included one designer, three developers, and a QA tester. Fortunately, we began the project with a good idea of what we wanted to build. Of course, we were stocked with a whole lot of useful Flex libraries from Allurent's already-shipping product line. In line with our agile development approach, we decided to build the prototype in two four-week sprints.
One of the singular aspects of the ADC is its style of navigation. We describe this style as "cinematic": transitions don't take place suddenly, but unfold in a visual narrative that cues and prepares the user for what happens next, with a slight dash of suspense. For example, clicking a product image causes the display to smoothly pan and scale until the image occupies a rectangle on the screen with the correct size and position. At that point, additional information about the product fades in, along with controls to enter personal notes on the product or add it to the user's shopping cart.
Figure 1: Product details and notes
A scrolling control also scrolls the main display horizontally, with smoothing applied to its motion so that small jitters in mouse motion don't cause the display to suddenly jerk around (see Figure 1). We applied a blur filter whose effect increases with velocity, providing a realistic sense of motion blurring that softens the impact of the scrolling motion as its rate increases.
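The smoothing and velocity-based blur can be sketched in a few lines. This is an illustrative model only, not the actual ADC code (which applied a Flash BlurFilter to the display object); the class and parameter names here are hypothetical:

```typescript
// Illustrative sketch: smooth a raw scroll target so mouse jitter
// doesn't jerk the display, and derive a blur strength from velocity.
class SmoothedScroller {
  private position = 0;  // current (smoothed) scroll position
  private velocity = 0;  // last frame's movement, drives blur strength

  constructor(private smoothing = 0.15,    // fraction of gap closed per frame
              private blurPerPixel = 0.05) {}

  // Called once per frame with the raw target from the scroll control.
  update(target: number): { position: number; blur: number } {
    const delta = (target - this.position) * this.smoothing;
    this.position += delta;
    this.velocity = delta;
    // Blur grows with speed, simulating motion blur; it fades out
    // naturally as the display settles on its target.
    return { position: this.position,
             blur: Math.abs(this.velocity) * this.blurPerPixel };
  }
}
```

Because each frame closes only a fraction of the remaining gap, small jitters in the target are absorbed, while large moves produce visible (and blurred) motion that decelerates as it arrives.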
Given all these kinds of motion that can be in play simultaneously, and the fact that the user can rapidly change that motion by making additional gestures at any time, we decided to try a new "goal-based" animation approach that treats motion very differently from the built-in Effect and Tween classes in the Flex framework. In our approach, the application controller responds to user gestures by updating the view with a position and scale to which the application should adjust itself. In response, the view smoothly pans and zooms in or out to reach that goal — in an amount of time that feels realistic given the nature and extent of that motion. If the goal changes, the motion changes as well, but always smoothly. That's what it means to be cinematic: we want the user to feel like they're watching a movie with a perfectly paced continuous shot.
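The essence of the goal-based approach can be shown in a small sketch. These names are hypothetical, and the real implementation was ActionScript inside a Flex view, but the core idea is the same: the controller only ever sets a goal, and the per-frame tick chases it, so a goal change mid-flight simply redirects the motion without any discontinuity:

```typescript
// Illustrative goal-based animation. Unlike a fixed-duration Tween,
// there is no precomputed start/end interpolation; each tick moves a
// fraction of the *remaining* distance, so motion stays smooth even
// when the goal changes partway through.
interface ViewState { x: number; scale: number; }

class GoalSeekingView {
  state: ViewState = { x: 0, scale: 1 };
  private goal: ViewState = { x: 0, scale: 1 };

  // The application controller calls this in response to user gestures.
  setGoal(goal: ViewState): void { this.goal = goal; }

  // Called once per frame; ease is the fraction of the gap closed per tick.
  tick(ease = 0.2): void {
    this.state.x += (this.goal.x - this.state.x) * ease;
    this.state.scale += (this.goal.scale - this.state.scale) * ease;
  }
}
```

A fixed-duration Tween interrupted by a new gesture has to be cancelled and restarted, which tends to produce a visible hitch; here the view's current position is always the starting point for whatever the goal becomes next.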
One of the most engaging features in the ADC is its fuzzy, color-matching search against the catalog. The shopper can click any point on a color wheel or any pixel of an image dragged into the application from the desktop or a browser (see Figure 2). When the user does, the application displays a cluster of products with similar colors, grouping similar hues together in the display.
Figure 2: Fuzzy color-matching
Implementing this feature was a lot of fun, and brought together a number of key Adobe AIR and Flash features: high-performance local database queries, access to clipboard and drag-and-drop data, and bitmap manipulation.
Apparel manufacturers typically create tiny "color chip" images that are used to convey color choices on their website. Rather than attempt to analyze the color content of the full product shots, which would require knowing which part of the shot actually contains the product, we take the average values of the pixels in the color chips, transform them into HSV (hue/saturation/value) triples, and insert those triples into our product database, correlated with each available variant of each product. This happens up front, as part of populating the database.
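The indexing step boils down to two standard operations: averaging the chip's pixels and converting the result from RGB to HSV. A minimal sketch, with the pixel representation assumed for illustration:

```typescript
// Sketch of the up-front indexing step: average a color chip's pixels,
// then convert the average RGB value to an HSV triple for the database.
type RGB = { r: number; g: number; b: number };  // each channel 0..255
type HSV = { h: number; s: number; v: number };  // h in degrees, s/v in 0..1

function averageColor(pixels: RGB[]): RGB {
  const sum = pixels.reduce(
    (a, p) => ({ r: a.r + p.r, g: a.g + p.g, b: a.b + p.b }),
    { r: 0, g: 0, b: 0 });
  const n = pixels.length;
  return { r: sum.r / n, g: sum.g / n, b: sum.b / n };
}

// Standard RGB-to-HSV conversion.
function rgbToHsv({ r, g, b }: RGB): HSV {
  const rn = r / 255, gn = g / 255, bn = b / 255;
  const max = Math.max(rn, gn, bn), min = Math.min(rn, gn, bn);
  const d = max - min;
  let h = 0;
  if (d !== 0) {
    if (max === rn)      h = 60 * (((gn - bn) / d) % 6);
    else if (max === gn) h = 60 * ((bn - rn) / d + 2);
    else                 h = 60 * ((rn - gn) / d + 4);
  }
  if (h < 0) h += 360;
  return { h, s: max === 0 ? 0 : d / max, v: max };
}
```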
These color triples are used at run time to filter query results based on a perceptual notion of "color distance," which — given two HSV color values — determines a number roughly indicating how similar two colors look to a human observer. (Trying to figure this out from regular RGB color values is surprisingly unhelpful! For example, dark colors look much more alike than light colors.)
The results are then sorted in a simple way to place similar hues near each other — for instance, in Figure 2, you can see some orange items placed together although the target color was a shade of red.
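One plausible distance function of the kind described above is sketched below. The actual ADC formula isn't published, so treat this as an illustration of the idea: hue is compared around the color wheel (so 350° and 10° count as close), and hue differences are weighted down for dark or washed-out colors, where hue matters less to the eye:

```typescript
// Illustrative perceptual distance between two HSV colors, plus the
// simple hue sort used to group similar hues in the results display.
type HSV = { h: number; s: number; v: number };  // h: 0..360, s/v: 0..1

function colorDistance(a: HSV, b: HSV): number {
  // Shortest angular distance around the hue wheel, normalized to 0..1.
  const dhRaw = Math.abs(a.h - b.h);
  const dh = Math.min(dhRaw, 360 - dhRaw) / 180;
  // Hue is perceptually meaningful only when colors are saturated and
  // bright, so weight the hue term by the weaker of the two.
  const hueWeight = Math.min(a.s * a.v, b.s * b.v);
  const ds = Math.abs(a.s - b.s);
  const dv = Math.abs(a.v - b.v);
  return Math.sqrt(dh * dh * hueWeight + ds * ds + dv * dv);
}

// Placing similar hues near each other is then a one-line sort.
function sortByHue(colors: HSV[]): HSV[] {
  return [...colors].sort((a, b) => a.h - b.h);
}
```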
One of our biggest challenges was to create a local SQL database that contained complete product metadata for a cross-section of the catalog, to support the cool local search capabilities and to save local data such as a customer's personal notes attached to products. Earlier, when Adobe first introduced SQL support into Adobe AIR, we got quite excited and did some quick studies showing that performance was excellent. We had already encoded a sample product catalog into XML for use in our web-based product demos, and it was easy enough to write an import utility to bring that catalog into an AIR SQL database. The schema itself was simple and flat: element names became tables, and attribute names became columns.
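The element-to-table, attribute-to-column mapping is mechanical enough to sketch. In the real utility the generated statements ran against AIR's local SQL database (via SQLConnection/SQLStatement in ActionScript); the names and the minimal XML shape below are assumptions for illustration:

```typescript
// Sketch of the flat import mapping: each XML element name becomes a
// table, and each attribute name becomes a column. We emit DDL plus a
// parameterized INSERT rather than splicing values into the SQL text.
interface XmlElement {
  name: string;
  attributes: Record<string, string>;
}

function createTableSql(el: XmlElement): string {
  const cols = Object.keys(el.attributes).map(c => `${c} TEXT`).join(", ");
  return `CREATE TABLE IF NOT EXISTS ${el.name} (${cols})`;
}

function insertSql(el: XmlElement): { sql: string; params: string[] } {
  const cols = Object.keys(el.attributes);
  const placeholders = cols.map(() => "?").join(", ");
  return {
    sql: `INSERT INTO ${el.name} (${cols.join(", ")}) VALUES (${placeholders})`,
    params: cols.map(c => el.attributes[c]),
  };
}
```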
One of our customers, Anthropologie, is using Allurent Checkout, Details, and Inline Cart products (see Figure 3), so we had ready access to the catalog data and media we'd need to populate the prototype. However, the catalog was only available by querying a copy of Anthropologie's e-commerce database schema, and some of the key product image media were available only online. One of our developers had to create a small suite of scripts and Adobe AIR mini-applications to download, massage, color-index, and clean this data, writing the result of the last step to an XML file in the format our import utility expected. Like many data extraction projects, it held unexpected pitfalls and surprises, but it all worked out in the end!
Figure 3: Search results
We believed from the start that — excepting the inevitable beta bugs and rough edges on new APIs — working with Adobe AIR would seem perfectly straightforward to anyone who understood how to work with Flex. Happily, we weren't wrong! Our main challenges were the ambitious goals we set ourselves for the prototype feature set, not working with Adobe AIR itself. We're very enthusiastic about the step forward that Adobe AIR represents for desktop application developers, and we look forward to the live pilot of this project next year.
Joe Berkovitz is vice president of engineering at Allurent, Inc., a company offering a suite of Flex-based online shopping applications. He has spent 28 years in the software profession as an architect, interface designer, and engineer.