Requirements

Prerequisite knowledge

An intermediate to advanced understanding of ActionScript 3, object oriented programming, and the Away3D engine will help you make the most of this article.

Additional required products

User level

Intermediate

Away3D is a powerful 3D engine that abstracts the capable yet low-level Stage3D API, making it easier to use. With just a few lines of code, you can create advanced scenes and let the engine take care of the rest. In this tutorial you will tour the Away3D material system as you build a scene depicting our beautiful planet Earth, floating in deep space. You will start with simple geometries and materials, and gradually get more involved as you take advantage of the engine's more powerful material features.

Setting up the scene

To set up the geometries of the scene, you use spheres to create the Earth and the Moon, group them appropriately so you can simulate axis and orbit rotations, and texture them with basic materials. Example 1 shows the basic setup.

Note: The full source code for this example and subsequent examples is available in the sample files for this article.

For each celestial body, you reproduce the same sequence of steps. You create a texture and a geometry, wrap them in a mesh object, add it to a container, and add the container to the scene.

Materials

This example embeds JPG images and uses the Away3D Cast utility class to create BitmapTexture objects. Note that these are not the regular Flash BitmapData objects, but special GPU image objects that Away3D will eventually upload to the GPU via the Stage3D API. The objects are then used within a TextureMaterial instance, which is the engine's main player in any material that makes use of textures for shading. This example barely makes use of this powerful material, but you will take much more advantage of it shortly.

var moonSurfaceTexture:BitmapTexture = Cast.bitmapTexture( MoonSurfaceDiffuse );
var moonSurfaceMaterial:TextureMaterial = new TextureMaterial( moonSurfaceTexture );

Geometries

The celestial bodies are represented by SphereGeometry objects. A geometry object is mainly a buffer holder, which keeps track of a potentially large number of triangles. These triangles can represent a plane, a sphere (as in this case), a torus, or even much more realistic and complex objects such as a detailed car or avatar shape. Away3D offers a basic set of primitive geometries, and this example uses spheres with a significant level of detail—that is, with a high polygon count or segmentation—to make sure the bodies are smoothly rounded.

var moonSurfaceGeometry:SphereGeometry = new SphereGeometry( 50, 100, 50 );

Meshes

Geometries combined with materials form mesh objects, which gather the necessary functionality for actual rendering. The geometry determines the object's shape, and the material determines the object's shader, or how it will be drawn to screen. Meshes also wrap other objects, such as transforms, which are another essential part of the 3D rendering pipeline, enabling you to position, scale, and rotate objects in the scene.

// Mesh.
var moonSurfaceMesh:Mesh = new Mesh( moonSurfaceGeometry, moonSurfaceMaterial );

Containers

The ObjectContainer3D object also has transforms and may contain children. In this example, objects are put in containers to make it easy to perform groupings and transformations on them. For example, by displacing the Moon's position within its container, you can simulate its orbital motion simply by rotating its parent container. After setting them up, you add the ObjectContainer3D objects to the scene:

// Container.
_moon = new ObjectContainer3D();
_moon.rotationY = rand( 0, 360 );
_view.scene.addChild( _moon );
moonSurfaceMesh.x = 1000;
_moon.addChild( moonSurfaceMesh );

Enriching the scene

The scene still looks pretty bare, and there are several aspects you could focus on to make it look better. You could work on the objects' materials, and you will a bit later. First, however, you'll add more elements to the scene to make it more interesting. You will add the Sun, some stars, and deep space graphics.

The code for this example has been compacted relative to the previous example. As you advance in this tutorial, the code is going to get more and more involved, so it makes sense to move previously covered elements of the source out of the current main focus. Example 2 has three new methods: one that creates the Sun, another that creates a star field, and a third that creates the deep space graphics using a SkyBox.

Sprite3D

The Sun and the stars use Sprite3D instead of Mesh. Sprite3D objects don't really own a geometry of their own, but they still have a material, a transform, and so on. Their implicit geometry is a plane that always faces the camera. These are commonly known as billboards in the 3D world, and they are ideal for representing potentially numerous faraway objects or particles. Such objects can be faked with an image attached to a plane, and that is exactly how they are used here. Sprite3D objects use significantly fewer triangles than Mesh objects and are hence computationally cheaper.

In the example code, the method blackToTransparent() is simply a utility method that changes the black background of the images used on the billboards to transparent. It does this by creating a new BitmapData object and copying the image's red channel (anything that is not black) into the alpha channel. It could have used either of the other channels, since the images are mostly black and white, but the red channel will do. Once you have the transparent images, setting the material's alphaBlending property to true notifies the rendering pipeline that this object's texture is to be alpha-blended with whatever objects are behind it:

var bitmapData:BitmapData = blackToTransparent( Cast.bitmapData( StarTexture ) );
var starMaterial:TextureMaterial = new TextureMaterial( new BitmapTexture( bitmapData ) );
starMaterial.alphaBlending = true;
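
For reference, here is a minimal sketch of what such a utility might look like; the actual implementation in the sample files may differ. It uses BitmapData.copyChannel() to reuse the red channel as the alpha channel (it assumes the flash.display and flash.geom imports):

// Hypothetical sketch of blackToTransparent(): black pixels become fully transparent.
private function blackToTransparent( source:BitmapData ):BitmapData {
    // A transparent destination the same size as the source.
    var result:BitmapData = new BitmapData( source.width, source.height, true, 0x00000000 );
    // Copy the original color data.
    result.draw( source );
    // Reuse the red channel as the alpha channel (the images are mostly black and white).
    result.copyChannel( source, source.rect, new Point(), BitmapDataChannel.RED, BitmapDataChannel.ALPHA );
    return result;
}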

The star field is laid out using a for loop that creates each billboard, generates random spherical coordinates, converts them to their equivalent Cartesian coordinates, and assigns the result to the position of each star. Spherical coordinates are used here because they tend to be easier for laying out elements in a spherical distribution.
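
A sketch of such a loop follows, using the starMaterial from above and the same rand() helper used earlier for the Moon container; the star count, billboard size, and radius are illustrative values, not the ones in the sample files:

// Illustrative star field: random spherical coordinates converted to Cartesian positions.
for( var i:uint = 0; i < 500; i++ ) {
    var star:Sprite3D = new Sprite3D( starMaterial, 15, 15 );
    var radius:Number = 5000;                    // distance from the scene's origin
    var elevation:Number = rand( 0, Math.PI );   // polar angle
    var azimuth:Number = rand( 0, 2 * Math.PI ); // azimuthal angle
    star.x = radius * Math.sin( elevation ) * Math.cos( azimuth );
    star.y = radius * Math.cos( elevation );
    star.z = radius * Math.sin( elevation ) * Math.sin( azimuth );
    _view.scene.addChild( star );
}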

SkyBox

A SkyBox is simply an infinitely large cube with images attached to its inner faces. A SkyBox is ideal for simulating environmental elements that are sufficiently far away to not require parallax or perspective effects. The elements are so far away that moving around won't actually make them look very different. For this, you embed six images, one for each face of the cube, and wrap them in a BitmapCubeTexture, which is another type of GPU texture for the Stage3D API. You then wrap this texture in turn in the SkyBox object itself. This is a simple, cheap resource that provides an excellent visual contribution.

// Cube texture.
var cubeTexture:BitmapCubeTexture = new BitmapCubeTexture(
    Cast.bitmapData( PosX ), Cast.bitmapData( NegX ),
    Cast.bitmapData( PosY ), Cast.bitmapData( NegY ),
    Cast.bitmapData( PosZ ), Cast.bitmapData( NegZ )
);

// Skybox geometry.
var skyBox:SkyBox = new SkyBox( cubeTexture );
_view.scene.addChild( skyBox );

Adding lights

Now that you have added objects to the scene to make it a bit richer, it is time to make the celestial bodies look a little more realistic. A fundamental aspect of 3D graphics that you have not yet addressed is lighting. Light sources and light-enabled materials shade objects depending on their configuration. This provides a powerful enhancement to the scene.

Not much has changed in this example. You just add a point light, adjust some settings on it and on the materials, and associate the lights with the material's lightPicker property. Away3D supports different types of lights, including PointLight, which represents a light source that emanates light in all directions from a point in space. In contrast, DirectionalLight emanates light from no particular point in a single direction. Each type of light produces a different effect.

This example uses a point light placed at the same position as the Sun sprite. The light is placed in an array and assigned to the example's StaticLightPicker object. A light picker object is used to group and collect the lights in a scene. This light picker is then assigned to each of the materials that you want illuminated. In this case you apply it to texture materials, but you could also apply it to color materials and other types of materials. Under the hood, the result is that Away3D produces a Phong shading model to illuminate the affected objects.
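
A rough sketch of this setup follows; the _sunLight name and its position are illustrative, while _lightPicker and the materials appear elsewhere in the sample code:

// Point light placed at the Sun's position (the Sun lies along the X axis; the value is illustrative).
_sunLight = new PointLight();
_sunLight.x = 10000;
_view.scene.addChild( _sunLight );

// Group the light in a picker and assign it to every material that should be illuminated.
_lightPicker = new StaticLightPicker( [ _sunLight ] );
earthMaterial.lightPicker = _lightPicker;
moonSurfaceMaterial.lightPicker = _lightPicker;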

Phong shading

Shading via Stage3D uses a rather complicated shading language called AGAL, which looks almost like assembler and is difficult to code. You can use AGAL to produce shader programs, which can be uploaded to the GPU and, together with geometries, textures, transforms, and so on, define how things are drawn to screen. Fortunately, Away3D wraps all these complexities into an intuitive material system in which you can control shading models by simply manipulating parameters and properties.

This example uses the light source's ambient, diffuse, and specular values and the materials' gloss values. The ambient property of the light illuminates objects from no specific origin or direction, simulating a scene's random reflectance with no specifically directed shading. In this case, it is set to 1 so that the dark sides of the celestial bodies are not completely black. The diffuse property controls how strongly light incident on the geometry's surface normals illuminates it; here it is set to 2 to exaggerate the difference between the lit areas and the ones in darkness. The specular property of the light controls the strength of the reflected highlight, which depends on the incident direction of the light, the surface normals, and the view angle. Finally, the gloss property of the material determines how spread out the specular highlight is. Try different values for each to get a better understanding of what each does. If you want to delve deeper into the fantastic world of shading, see GLSL: An Introduction. It uses GLSL instead of AGAL, but the concepts are similar.

Materials, like lights, expose some of the same ambient, diffuse, and specular properties. This can be confusing at first, but the two sets of values are combined in the shader, so you can have, for instance, different ambient illumination on a set of objects affected by the same light source.
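
For instance, a hypothetical snippet like the following gives the Earth and the Moon different ambient and specular responses even though both are lit by the same point light; the values are purely illustrative:

// Per-material illumination values are combined with the light's values in the shader.
earthMaterial.ambient = 1;          // the night side still shows some detail
earthMaterial.specular = 1;
moonSurfaceMaterial.ambient = 0.5;  // a darker unlit side for the Moon
moonSurfaceMaterial.specular = 0.3;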

Additional maps

Lighting improved the look of the celestial bodies, but there is still more you can do. The TextureMaterial object accepts more than the one texture or map already used (see Figure 1). In fact, you will be applying three more maps on it now: a normal map (see Figure 2), a specular map (see Figure 3), and an ambient map (see Figure 4).

TextureMaterial can use these maps for more advanced illumination. The specular map acts as a mask for the specular highlight of the material: wherever the map is white, the specular highlight is allowed to exist; wherever it is black, it is not. Values in between attenuate the highlight. The ambient map is used as a sort of diffuse map wherever no light reaches the geometry. In this case, it paints the dark side of the Earth, which receives no lighting from the Sun. The normal map is a little more involved and produces the illusion of a higher polygon count on the model, using a technique known as normal mapping. A SphereGeometry object has normals at each vertex, and the shader uses these normals to determine the diffuse and specular components of the globe's illumination. In fact, the shader interpolates these normals for every pixel, or fragment, within each triangle, producing a very large number of normals, all pointing away from the center of the sphere. The normal map encodes how much these normals deviate from their original directions by representing the deviations in x, y, and z in the red, green, and blue color components. This gives the globe much more illumination information when lit, while not actually adding any triangles whatsoever. Example 4 shows the result of using all these maps together. Notice how specular highlights occur only in the oceanic areas, how the dark side of the planet shows city night lights, and how normal mapping appears to shade the terrain, while not actually adding terrain elevations near the edges of the sphere.

As you can see in the source code, these rather advanced shading topics are easy to implement with Away3D. The most significant change in this example is the addition of the following lines, which simply assign the mentioned maps into the Earth TextureMaterial:

earthMaterial.normalMap = Cast.bitmapTexture( EarthSurfaceNormals );
earthMaterial.specularMap = Cast.bitmapTexture( EarthSurfaceSpecular );
earthMaterial.ambientTexture = Cast.bitmapTexture( EarthSurfaceNight );

Additionally, notice that the illumination values for each of the bodies have been tweaked independently to achieve the desired aspect of each one.

Fresnel specular method

The specular highlights on the celestial bodies still don't look very realistic. They do make things look more interesting, but they would be more appropriate for billiard balls than for celestial bodies. Planets just aren't that shiny. Also notice how an undesired specular highlight trails off when you look at the bodies from behind.

This example changes the way the TextureMaterial deals with specular highlights. In fact, it uses a completely different specularMethod object. Away3D materials allow you to composite a material using methods; for instance, you can change the ambient, diffuse, and specular methods of a TextureMaterial object. You can think of methods as interchangeable chunks of shader code that the engine combines to produce a complete shader program. This gives you a wide range of shading possibilities that are easy to use.

To improve the highlights, you use FresnelSpecularMethod. This is an illumination technique that sounds complicated, but is actually very simple. The concept is as follows: If the viewer is looking down on the surface producing the specular highlights, they're weak. If instead the viewer is aligned with the surface and looking at it edge on, the highlights are strong. Imagine yourself looking at a pool of water on a sunny day. If you stand by the pool and look straight down, you will see little light reflected on the surface and you will be able to see the bottom of the pool. If instead you're in the pool and looking at the water with the Sun in the background, you will see little of the pool's bottom and see much more light directly reflected from the Sun on the surface of the water. Implementing this on the example produces subtle results, but eliminates the undesired effects discussed before, producing a more realistic simulation.

Once again, the implementation in code is very simple. You simply initialize FresnelSpecularMethod objects and assign them to the materials of the Earth and the Moon:

var earthFresnelSpecularMethod:FresnelSpecularMethod = new FresnelSpecularMethod( true );
earthFresnelSpecularMethod.fresnelPower = 1;
earthFresnelSpecularMethod.normalReflectance = 0.1;
earthFresnelSpecularMethod.shadingModel = SpecularShadingModel.PHONG;
//...
earthMaterial.specularMethod = earthFresnelSpecularMethod;

Adding clouds

To include Earth's rich atmosphere in the simulation, you add a slightly bigger transparent sphere to the Earth container with a texture of clouds. The implementation is basic, but impressively effective.

An alternative approach would be to merge the Earth surface and sky diffuse maps, or layer the materials in some way. This would save triangles, since adding a new sphere adds many of them, but in this scenario you can afford the extra geometry. Besides, if you merged the maps together, the maps you added to the surface would also affect the clouds, which would be unwanted. Instead, you give the cloud layer its own geometry and material, and completely ignore specular highlights on the clouds by setting the material's specular property to 0.

var bitmapData:BitmapData = blackToTransparent( Cast.bitmapData( EarthSkyDiffuse ) );
var earthCloudMaterial:TextureMaterial = new TextureMaterial( new BitmapTexture( bitmapData ) );
earthCloudMaterial.alphaBlending = true;
earthCloudMaterial.lightPicker = _lightPicker;
earthCloudMaterial.specular = 0;
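
The cloud layer itself might be assembled along these lines; the sphere radius and the _earth container (analogous to the _moon container shown earlier) are illustrative assumptions:

// A slightly larger sphere than the Earth's surface, added to the Earth container
// so it rotates along with the planet. Sizes are illustrative.
var earthCloudGeometry:SphereGeometry = new SphereGeometry( 102, 100, 50 );
var earthCloudMesh:Mesh = new Mesh( earthCloudGeometry, earthCloudMaterial );
_earth.addChild( earthCloudMesh );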

Cool! This is starting to look more realistic.

Adding an atmosphere

To continue with the simulation of the atmosphere, you'll need to simulate a thin layer of gas around the Earth's surface. You can do this using a trick similar to the one you used for the clouds–that is, a slightly larger sphere, tweaked to generate the desired effect. In this case, you won't map a texture onto it. Since you just want a fading bluish color, you use a ColorMaterial object instead. You could make this material slightly transparent, but it would be on top of everything you've done so far, making everything bluish and obscuring much of the detail you've implemented. You just want this sphere to be visible on the perimeter of the globe, not covering it. To do this, you use another trick known as inverting the sphere.

Here is the line of code that inverts the sphere:

earthAtmosphere.scaleX = -1;

You could have inverted either of the other two axes with the same result: the sphere's normals point toward its center instead of away from it. As a result, the faces that would normally point towards the viewer (that is, the ones on top of the Earth's surface) are clipped by back-face culling, and the faces that would normally be clipped are visible. Hence you see only triangles that are on the opposite hemisphere of the globe and not occluded by the other spheres, which is exactly the effect you want.

The construction of the material for this geometry is straightforward. Note that you apply lighting to the sphere so you don't see the atmosphere on the dark face of the planet. Also note that you use BlendMode.ADD on it. This blend mode is ideal for this goal because it simply adds color to whatever is rendered behind the material but never subtracts it. It will allow lighter things behind it to pass, but will not occlude them whenever the material is darker.

var atmosphereMaterial:ColorMaterial = new ColorMaterial( 0x1671cc );
atmosphereMaterial.blendMode = BlendMode.ADD;
atmosphereMaterial.lightPicker = _lightPicker;
atmosphereMaterial.gloss = 5;
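
Putting the inverted sphere together might look like this sketch; the radius and the _earth container are again illustrative assumptions:

// An even larger sphere for the atmosphere, inverted so only its far hemisphere is drawn.
var earthAtmosphereGeometry:SphereGeometry = new SphereGeometry( 105, 100, 50 );
var earthAtmosphere:Mesh = new Mesh( earthAtmosphereGeometry, atmosphereMaterial );
earthAtmosphere.scaleX = -1; // invert the sphere, as explained above
_earth.addChild( earthAtmosphere );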

This produces roughly the desired effect, but there is still room for improvement: the atmosphere looks a bit too rough around the edges.

A more advanced atmosphere

To make the atmosphere more realistic, you need to make it fade out radially from the center of the sphere. The real atmosphere's gases are denser nearer to the surface of the planet, and gradually become less dense as the distance from the surface increases. This is a fairly particular requirement, and to implement it, you need to go into a lower, more involved level of the Away3D material system. This will be more complex than the previous sections of this tutorial.

The main part of the code is the CompositeDiffuseMethod object, and its related modulation function. This object is simply a base diffuse method that exposes a bit of its shader code so that you can alter it. You alter the bit of the shader by defining and passing a method with the following signature:

function myModulatingMethod( vo:MethodVO, t:ShaderRegisterElement, regCache:ShaderRegisterCache ):String

Basically, you receive information on previous parts of the shader, and are expected to return the bit you are altering. The MethodVO argument gives information or properties of the shader program on which you are working (the current state of the material you are using), the ShaderRegisterElement argument represents the state of the shader as you receive it, and the ShaderRegisterCache argument is used when you want to use additional registers. This technique requires quite a bit of knowledge on writing shaders, specifically AGAL shaders. If you are not familiar with AGAL shaders, you may not completely understand this part of the tutorial until you've done some further reading on this topic. Still, read on; you'll grasp what is going on at a conceptual level.

Here is the code:

private function modulateDiffuseMethod( vo:MethodVO, t:ShaderRegisterElement, regCache:ShaderRegisterCache ):String {
    var viewDirFragmentReg:ShaderRegisterElement = _atmosphereDiffuseMethod.viewDirFragmentReg;
    var normalFragmentReg:ShaderRegisterElement = _atmosphereDiffuseMethod.normalFragmentReg;
    var temp:ShaderRegisterElement = regCache.getFreeFragmentSingleTemp();
    regCache.addFragmentTempUsages( temp, 1 );
    var code:String =
        "dp3 " + temp + ", " + viewDirFragmentReg + ".xyz, " + normalFragmentReg + ".xyz\n" +
        "mul " + temp + ", " + temp + ", " + temp + "\n" +
        "mul " + t + ".w, " + t + ".w, " + temp + "\n";
    regCache.removeFragmentTempUsage( temp );
    return code;
}

First, you ask the register cache for a temporary register, in which you store the projection of the fragment-to-viewer vector onto the fragment's normal vector. This value is then squared. The projection grows as the view and normal vectors become aligned and shrinks as they tend toward perpendicular. If you think about the normals of the sphere, the projection will be larger near the center of the globe and fall to zero at its edges. You then multiply this value by the w component of the incoming t register element, which contains the diffuse lighting intensity depending on the position of the light and the surface normal, hence taking into account the previously calculated diffuse Phong shading. By storing the result back in the register's w component, which represents the level of transparency of the current color, you make areas near the edge of the sphere more transparent, which is exactly what you wanted.

Post-processing

Of course, you could continue enhancing the example indefinitely. In this tutorial, however, there is just one more thing to do: enhance the graphics with a bit of post-processing. That is, you won't work on the scene itself anymore, but you will add a few rendering effects on top of the current result. Specifically, you will add a bloom effect as a GPU Filter3D and a lens flare effect via the regular Flash display API using Bitmap objects.

Bloom effect

Away3D offers an extensible post-processing kit via the filters package. These filters, such as BloomFilter3D, BlurFilter3D, and DepthOfFieldFilter3D, can be considered shaders, but instead of acting on individual materials and regular 3D geometry, they work on the final output of the scene's rendering and draw the result on a quad that perfectly fits the screen. This lets you use the GPU's vast processing power for global scene rendering effects.

Basically, a bloom filter emulates the overexposure that a camera lens and film experience when pointed at very bright light sources. In this case, you use it when looking directly at the Sun. To set up filters in Away3D, you simply initialize the objects and set them as the view's filters3d array. You can chain multiple filters in this way.

_bloomFilter = new BloomFilter3D( 2, 2, 0.5, 0, 4 );
_view.filters3d = [ _bloomFilter ];

Then, at runtime, you evaluate the strength (the exposure parameter) of the filter by calculating how much the viewer is in front of the Sun. This is done in the updateBloom() method, mainly by projecting the camera position onto the X axis (along which the Sun lies). The same value is also used to slightly scale up the Sun Sprite3D.
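
Conceptually, updateBloom() might look something like the following sketch; the _sunSprite name and the numeric factors are illustrative, and the actual method in the sample files may differ:

// Hypothetical sketch: drive the bloom exposure by how directly the camera faces the Sun.
private function updateBloom():void {
    var cameraDirection:Vector3D = _view.camera.position.clone();
    cameraDirection.normalize();
    // The Sun lies along the positive X axis, so the x component measures
    // how much the viewer is in front of it (1 directly in front, 0 or less otherwise).
    var facingSun:Number = Math.max( 0, cameraDirection.x );
    _bloomFilter.exposure = 10 * facingSun;
    // Slightly scale up the Sun billboard as well.
    _sunSprite.scaleX = _sunSprite.scaleY = 1 + 0.5 * facingSun;
}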

Lens flare effect

The lens flare effect is a classic 2D effect implemented with the regular Flash display API. You simply load a few bitmaps, place them on top of the view, and align them to produce the desired effect. The updateFlares() method evaluates the position and visibility of the flares. The most important part of this method is view.project( vector:Vector3D ), which takes a scene position in 3D space and transforms it to 2D screen space. You evaluate the visibility of the flares by first determining whether the Sun is within the screen bounds and then by checking whether the Sun is occluded by the Earth. Notice that you are not considering occlusion by the Moon, but you could. The code also embeds the mini class FlareObject just to make manipulating the flares a bit easier.
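
The core of the flare placement might be sketched like this, with flareBitmap and sunPosition as illustrative names standing in for the FlareObject instances and the Sun's scene position:

// Project the Sun's 3D scene position into 2D screen space and center a flare bitmap on it.
var screenPosition:Vector3D = _view.project( sunPosition );
flareBitmap.x = screenPosition.x - flareBitmap.width / 2;
flareBitmap.y = screenPosition.y - flareBitmap.height / 2;
// Hide the flare when the Sun is off screen (the full method also checks occlusion by the Earth).
flareBitmap.visible = screenPosition.x >= 0 && screenPosition.x <= _view.width &&
                      screenPosition.y >= 0 && screenPosition.y <= _view.height;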

Where to go from here

To create this planet Earth simulation, you started with a very basic scene in terms of geometry and shading and enhanced its appearance step by step. First, you used a SkyBox and billboards to enrich the scene, and proceeded to enhance the materials of the Earth, eventually getting into coding a bit of AGAL. With all this, you should now be a bit more familiar with the Away3D material system, which you can use to produce some stunning visual effects. Certainly, you could continue to develop this example, but you can also move on now that you have a better understanding of how to harness the power of the GPU to achieve creative effects with Away3D.

To explore the Away3D engine further, visit the growing Away3D tutorials wiki.

The Away3D forum is another great resource with an active community.

Also, be sure to check out the GitHub repository of Away3D examples.