The goal of the project was to simulate the effects of watercolor paint propagation. Traditionally, fluids and gases are simulated on a grid of cells, with flow computed between adjacent cells. This approach can be slow, and we were looking for a faster solution.

Our solution

In this project we developed a novel approach, in which the fluid location is defined by a set of overlapping semitransparent shapes of uniform color. Instead of computing the fluid flow between cells of a grid, our algorithm computes the movement of vertices that specify the edges of each shape.
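As a rough illustration of this representation (all names and values here are invented for illustration; the actual implementation, described in the paper, is in C++), a stroke can be stored as a stack of semitransparent polygons, each a closed loop of vertices with one uniform color:

```python
class Shape:
    """One semitransparent, uniformly colored polygon of a stroke."""
    def __init__(self, vertices, color, alpha, lifespan, wet_time):
        self.vertices = list(vertices)   # [(x, y), ...] polygon outline
        self.color = color               # uniform RGB color for the whole shape
        self.alpha = alpha               # semitransparency used when compositing
        self.lifespan = lifespan         # steps until the shape is retired
        self.wet_time = wet_time         # steps during which it counts as wet
        self.age = 0

    def is_wet(self):
        return self.age < self.wet_time

# A stroke is simply a list of overlapping shapes; where several shapes
# overlap, the composited alpha is higher, reading as denser pigment.
stroke = [
    Shape([(0, 0), (10, 0), (10, 5), (0, 5)], (0.2, 0.3, 0.8), 0.15, 200, 50),
    Shape([(2, 1), (12, 1), (12, 6), (2, 6)], (0.2, 0.3, 0.8), 0.15, 200, 50),
]
```

Because the stroke is defined by polygon outlines rather than pixels, it stays resolution independent and can be rendered at any zoom level.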

Each shape has a lifespan and a period during which it is considered wet. The movement of shape vertices is controlled by several parameters: whether the vertex is over another shape that is still wet, where the shape was located within the area of the initial deposition, and a few random parameters.
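A minimal sketch of one such vertex update step (the function, parameter names, and weights below are hypothetical; the paper gives the actual rules) might combine the wetness of the paint beneath the vertex, the vertex's offset from the center of the initial deposition, and a random perturbation:

```python
import random

def move_vertex(x, y, over_wet_shape, center, rng,
                wet_gain=1.0, dry_gain=0.1, jitter=0.3):
    """Advance one edge vertex of a shape by a single time step."""
    # Vertices over still-wet paint move farther, spreading the pigment.
    gain = wet_gain if over_wet_shape else dry_gain
    # Outward direction from the center of the initial deposition.
    dx, dy = x - center[0], y - center[1]
    norm = max((dx * dx + dy * dy) ** 0.5, 1e-9)
    dx, dy = dx / norm, dy / norm
    # Random perturbation produces the irregular watercolor edge.
    x += gain * dx + rng.uniform(-jitter, jitter)
    y += gain * dy + rng.uniform(-jitter, jitter)
    return x, y

rng = random.Random(0)
x, y = move_vertex(5.0, 0.0, over_wet_shape=True, center=(0.0, 0.0), rng=rng)
```

Moving only the boundary vertices of each shape is far cheaper than updating every cell of a simulation grid, which is what makes the approach fast enough for interactive painting.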

See the paper below for more details.


The initial prototype of the technique was developed by Daichi in Adobe's Deco scripting environment, with the help of Radomir. Aravind and Steve helped to implement the algorithm in C++, improve it, and then deploy it in an iPad application and inside Photoshop, respectively. The iPad application, Eazel, shipped with Photoshop CS5.5.



Figure 1: A comparison of similar strokes made with real watercolor paint (top) versus our algorithm (bottom). Paper texture is added to our results for comparison purposes. The strokes are chosen to showcase a variety of characteristic watercolor behaviors, including edge darkening (A), non-uniform pigment density (B), granulation (C), re-wetting (D), back runs (E), color blending (F), feathering (G), and glazing (H). Strokes exemplifying particular effects have been labeled with the corresponding letter. While our algorithm does not make identical strokes, it exhibits the same range and depth of expressiveness as traditional watercolor.


Figure 2: A vector watercolor painting made by Daichi in our interactive iPad painting application, displaying complex texture and color blending. Insets zoom in to show stroke detail and resolution independence.


Figure 3: Example of a painting made by Daichi using our algorithm.