Encoding live video to H.264/AVC with Flash Player 11

Requirements

Prerequisite knowledge

You should have a basic familiarity with ActionScript 3.

 

User level

Beginning


With the release of Flash Player 11, Adobe has introduced some exciting new features, including performance upgrades such as native 64-bit support, and asynchronous bitmap decoding. In addition, Flash Player can now encode live video streams to the H.264/AVC standard. This new feature allows developers to create real-time, high-quality, live video streaming applications for chat, conferencing, and live event broadcasting.

 

H264VideoStreamSettings

 

The heart of Flash Player’s ability to encode video to H.264 lies within a new class called H264VideoStreamSettings. This new class is a subclass of VideoStreamSettings, and it’s what allows you to control the compression settings for video attached to a NetStream. The following code allows you to encode video attached to a NetStream to H.264 in Flash Player, instead of using the default Sorensen Spark codec:

import flash.media.H264Level;
import flash.media.H264Profile;
import flash.media.H264VideoStreamSettings;

var h264Settings:H264VideoStreamSettings = new H264VideoStreamSettings();
h264Settings.setProfileLevel( H264Profile.BASELINE, H264Level.LEVEL_3_1 );

 

This article demonstrates how to take advantage of Flash Player 11.0's new H.264 encoding capabilities by walking you through the development of a video encoding and streaming application. By leveraging the Flex 4.6 SDK and Flash Player 11.0 or higher, you will build an application that does the following:

  • Captures live video from a webcam
  • Establishes a connection to Flash Media Server using NetConnection
  • Publishes video stream from application to Flash Media Server using NetStream
  • Displays outgoing video stream from camera (prior to being encoded) in a Video component within the application
  • Sends encoding parameters to Flash Player to encode the raw webcam video to H.264
  • Displays encoded video’s metadata
  • Streams live, encoded video from Flash Media Server to the application using another instance of  NetStream
  • Displays newly encoded, streamed live video in another Video component within the application

 

Getting started

 

To get the most out of this walkthrough, you will need the following:

  • Flash Builder with the Flex 4.6 SDK
  • Flash Player 11.0 or later
  • Flash Media Server 4.5 or Adobe Media Server 5
  • A webcam attached to your computer
  • The sample files (H264Encoder_START and H264Encoder_COMPLETED)

Setting up the server

 

This walkthrough demonstrates taking a live video feed that has been encoded to H.264/AVC within Flash Player and sending it via RTMP to a Flash Media Server. The walkthrough below assumes you are using either Flash Media Server 4.5 or Adobe Media Server 5. If you do not have a media server set up online, you can download a free copy of Flash Media Development Server 4.5 here.

Beyond a basic install of either Flash Media Server 4.5 or Adobe Media Server 5, nothing more is required to run the example.

If you’re new to Flash Media Server 4.5 or Adobe Media Server 5, and would like some guidance on how to get started with streaming media, please refer to this excellent series by Joseph LaBrecque and Tom Green – Beginning Flash Media Server 4.5.

 

Setting up the project in Flash Builder

 

The example application is a simple ActionScript 3.0 project that runs in Flash Player and relies on features specific to Flash Player 11 or later. Both start and completed versions of the application are provided (H264Encoder_START and H264Encoder_COMPLETED).

  1. Import the H264Encoder_START project into Flash Builder by choosing File -> Import Flash Builder Project.

    In order for this application to work correctly, Flash Builder needs to target Flash Player 11.0 or higher. This happens by default when using the Flex 4.6 SDK, but not when using earlier versions such as Flex 4.5. For instructions on how to set up your project with an SDK earlier than 4.6, please refer to this article.

  2. In Flash Builder with the H264Encoder project selected, choose Project -> Properties -> ActionScript Compiler.
  3. Verify that the compiler is targeting at least Flash Player 11.0 (Fig. 1.2). If it isn't, select the "Use a specific version" radio button, and type "11.0.0" for the value.
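If you're compiling outside Flash Builder, the equivalent command-line settings are sketched below. This assumes the Flex 4.6 SDK's mxmlc compiler is on your path and that the main class lives at src/H264Encoder.as (an illustrative path); SWF version 13 corresponds to Flash Player 11.0:

```
mxmlc -target-player=11.0.0 -swf-version=13 src/H264Encoder.as
```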

 

 

Creating the application

 

First, you’ll modify H264Encoder_START so that it can communicate with an attached webcam. In addition, you’ll add the code necessary for establishing a NetConnection to connect the application to the server, as well as two NetStream instances: one responsible for getting the video from the application into Flash Media Server, and one for bringing it back from the server into the application.

 

Connecting a camera, establishing a NetConnection and NetStreams

 

  1. Directly under the opening class definition statement, but before the constructor method, create a private variable named “nc”, data-typed as NetConnection. Use code hinting to have Flash Builder generate the necessary import statement for you, or import flash.net.NetConnection manually. Your code should appear as follows:

 

package
{
    import flash.display.Sprite;
    import flash.net.NetConnection;

    public class H264Encoder extends Sprite
    {
        private var nc:NetConnection;

        public function H264Encoder()
        {
        }
    }
}
  2. Create two private variables to represent each NetStream. Create one for the stream going from the application to the server (ns_out), and another for the stream coming back into the application from the server (ns_in). Be sure to import flash.net.NetStream.
package
{
    import flash.display.Sprite;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    public class H264Encoder extends Sprite
    {
        private var nc:NetConnection;
        private var ns_out:NetStream;
        private var ns_in:NetStream;

        public function H264Encoder()
        {
        }
    }
}
  3. Next, create a private variable named cam of type Camera, and set its value to Camera.getCamera(). The Camera class is a little different from other classes, in that you don’t call a constructor to instantiate a Camera object. Instead, you call the static getCamera() method of the Camera class. This method returns a Camera instance, or null if no camera is attached to the computer, or if a camera is attached but in use by another application. Be sure to import flash.media.Camera.
private var cam:Camera = Camera.getCamera();
  4. It is now time to add code that will allow the application to connect to the server using an instance of the NetConnection class. Within the provided initConnection() function at about line 44, create a new NetConnection by instantiating the nc:NetConnection variable, which you defined in step 1:
private function initConnection():void
{
    nc = new NetConnection();
}
  5. It’s always a good practice to verify that a NetConnection was successful before any further actions are taken. To do this, add an event listener for events of type NetStatusEvent.NET_STATUS. You will create the onNetStatus() event handler in the next section. Be sure to import flash.events.NetStatusEvent:
import flash.events.NetStatusEvent;

private function initConnection():void
{
    nc = new NetConnection();
    nc.addEventListener( NetStatusEvent.NET_STATUS, onNetStatus );
}
  6. Next, still within the initConnection() function body, tell the NetConnection where to connect by calling the connect() method of the NetConnection class. As an argument to this method, pass the URL of the “live” application on the instance of Flash Media Server or Adobe Media Server you want to connect to. For example, to connect to an instance of Flash Media Server running locally on your machine, you would use “rtmp://localhost/live”.
private function initConnection():void
{
    nc = new NetConnection();
    nc.addEventListener( NetStatusEvent.NET_STATUS, onNetStatus );
    nc.connect( "rtmp://YOUR_SERVER_URL/live" );
}
  7. Finally, tell the NetConnection which object the server should invoke callback methods on by setting the NetConnection’s client property to this. Callback methods are special handler functions invoked by the server once a client application establishes a NetConnection. Later in this example you will work with the onMetaData() and onBWDone() callback methods. Because those callbacks live in the main application class, which is the same object that establishes the NetConnection, the nc instance’s client property should be set to this. The initConnection() function should now appear as follows:
private function initConnection():void
{
    nc = new NetConnection();
    nc.addEventListener( NetStatusEvent.NET_STATUS, onNetStatus );
    nc.connect( "rtmp://YOUR_SERVER_URL/live" );
    nc.client = this;
}
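As a defensive addition that is not part of the original walkthrough, you can guard against getCamera() returning null before connecting. This sketch assumes initConnection() is called from the constructor, as set up later in this walkthrough:

```actionscript
public function H264Encoder()
{
    if ( cam == null )
    {
        // No camera is attached, or another application is using it.
        trace( "No camera available; aborting." );
        return;
    }
    initConnection();
}
```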

 

Verifying a successful NetConnection

 

As mentioned, it’s always a good practice to verify the success of a NetConnection attempt. To do this, locate the function named onNetStatus() on about line 70:

protected function onNetStatus(event:NetStatusEvent):void
{
}
  1. Within the onNetStatus() event body, create a trace statement that outputs the value of event.info.code to the console during debugging. The code property of the event’s info object contains String data indicating the status of the attempted NetConnection, such as “NetConnection.Connect.Success” or “NetConnection.Connect.Failed”. Tracing the value of this property lets you check the status of the NetConnection easily by simply running the application in debug mode.
protected function onNetStatus(event:NetStatusEvent):void
{
    trace( event.info.code );
}
  2. This walkthrough’s example application will attempt to connect to the server and start playing/publishing video automatically when launched. To achieve this, call initConnection() from within the main class’ constructor method:
public function H264Encoder()
{
    initConnection();
}


 

Understanding callback functions

 

The sample application contains two callback functions: onBWDone() and onMetaData(). The onBWDone() callback reports the available bandwidth, which can be useful in applications that need to dynamically switch video assets according to the bandwidth that’s currently available. Although it’s necessary to include these functions in the client code (omitting them will generate a runtime error when the server tries to make the function call), it’s not necessary to actually do anything with them.

This application isn’t concerned with monitoring bandwidth, so onBWDone() can be left as an empty function. The onMetaData() callback function is useful for accessing a video stream’s metadata, and the example application provides code within this callback to do just that. The onMetaData() callback receives a generic Object whose properties represent the video stream’s metadata. Later, you will use those properties to display that information within the UI.
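A minimal sketch of the two callbacks as they might appear in the client class is shown below. The method names must match what the server invokes; the bodies here are illustrative, and the properties read in onMetaData() assume the metadata object sent later in this walkthrough:

```actionscript
// Invoked by the server after the connection is established; left empty here.
public function onBWDone():void
{
}

// Invoked when stream metadata arrives; the info object's properties
// hold the metadata values sent with @setDataFrame.
public function onMetaData( info:Object ):void
{
    trace( "codec: " + info.codec + ", " + info.width + "x" + info.height );
}
```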

Next, you’ll add code that enables the application to read webcam data, encode that webcam data to H.264, and to then stream the encoded video to the server.

 

Setting up H.264 encoding, and publishing to the NetStream

 

In this next section, you will attach your webcam to an instance of the Camera class. You will then encode the webcam input to H.264 using properties of the Camera class and the H264VideoStreamSettings class. Certain encoding parameters can’t yet be set on H264VideoStreamSettings (although support for this will hopefully come soon), so you’ll set those values on the Camera class instead.

Next, you will attach the encoded video to a live video stream, and stream it to the server’s live directory.

Finally, in order to use the provided code that reads the metadata of the newly encoded video stream, you will call the send() method of the NetStream class. As arguments to send(), you will pass “@setDataFrame” (a special handler method within Flash Media Server), the name of the provided onMetaData() callback method that listens for the metadata client-side, and a local variable, metaData, which holds the desired metadata items. First:

  1. Within the onNetStatus() handler body, beneath the existing trace statement, create a conditional statement that compares the value of event.info.code to the String “NetConnection.Connect.Success”. If they match, call three functions that you will create in the next section: one that publishes an outgoing video stream, one that displays the incoming video from the webcam, and one that displays the video stream being sent back to the application from the server. The completed onNetStatus() function should appear as follows:
protected function onNetStatus(event:NetStatusEvent):void
{
    trace( event.info.code );

    if ( event.info.code == "NetConnection.Connect.Success" )
    {
        publishCamera();
        displayPublishingVideo();
        displayPlaybackVideo();
    }
}

At this point, you have included the code necessary to establish a NetConnection, and verify the success or failure of that connection with a trace statement. In addition, you’ve included calls to functions that will eventually handle the publishing and playback of the video from the webcam, as well as the video coming back from the server.

  2. Locate the function named publishCamera(), at around line 85. In the first line of publishCamera(), instantiate the ns_out NetStream object by calling its constructor, passing it the NetConnection instance nc:
protected function publishCamera():void
{
    ns_out = new NetStream( nc );
}
  3. On the next line, attach the Camera instance cam to the outgoing NetStream by calling the attachCamera() method of the NetStream class. Pass this method the cam instance:
ns_out.attachCamera( cam );
  4. Create a new local variable named h264Settings, data typed as H264VideoStreamSettings, and set its initial value to a new H264VideoStreamSettings instance. Be sure to import flash.media.H264VideoStreamSettings:
import flash.media.H264VideoStreamSettings;

var h264Settings:H264VideoStreamSettings = new H264VideoStreamSettings();
  5. Call the setProfileLevel() method of the H264VideoStreamSettings class on the h264Settings instance to encode the video using the BASELINE profile and a level of 3.1. Be sure to import both the H264Level and H264Profile classes:
import flash.media.H264Level;
import flash.media.H264Profile;

h264Settings.setProfileLevel( H264Profile.BASELINE, H264Level.LEVEL_3_1 );
  6. Use the setQuality() method of the Camera class instance to limit the outgoing stream to 90,000 bytes per second (roughly 720 Kbps), with a quality setting of 90. Note that setQuality() takes its bandwidth argument in bytes per second, not bits:
cam.setQuality( 90000, 90 );
  7. Use the setMode() method of the Camera class instance to set the video’s width, height, and frames per second, and to indicate (with the final Boolean argument) whether the camera should favor maintaining its capture size over frame rate when it has no native mode matching the requested parameters:
cam.setMode( 320, 240, 30, true );
  8. Next, use the setKeyFrameInterval() method of the Camera class instance to set the video’s keyframe interval to 15, which at 30 fps yields two keyframes per second:
cam.setKeyFrameInterval( 15 );
  9. To set the outgoing video’s compression settings, assign the h264Settings variable to the videoStreamSettings property of the outbound stream, ns_out:
ns_out.videoStreamSettings = h264Settings;
  10. Call the publish() method of the NetStream class on the outgoing NetStream, passing it a name for the stream (“mp4:webCam.f4v”) and the publishing type “live”, which streams the video without recording it on the server:

Note: FLV streams don’t require a codec prefix, but F4V/MP4 files, MP3 files, and RAW files do. You can find more information about this in the documentation.

ns_out.publish( "mp4:webCam.f4v", "live" );
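To illustrate the note above, here is how publish() calls would look for each container format. The stream names are illustrative, not taken from the walkthrough:

```actionscript
ns_out.publish( "mp4:webCam.f4v", "live" ); // F4V/MP4: mp4: prefix required
ns_out.publish( "mp3:audioOnly", "live" );  // MP3: mp3: prefix required
ns_out.publish( "webCam", "live" );         // FLV: no codec prefix needed
```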

Create the objects that will hold the metadata values of the encoded video you will access at runtime. This isn’t a necessary step for encoding H.264 video with Flash Player, but it’s included nonetheless so that the completed application can utilize the metadata returned from the onMetaData() callback function that’s been provided.

  11. Create a new local variable named metaData, data typed as Object, and set its initial value to new Object():
var metaData:Object = new Object();
  12. Create the following metaData properties and add them to the publishCamera() function:
metaData.codec = ns_out.videoStreamSettings.codec;
metaData.profile = h264Settings.profile;
metaData.level = h264Settings.level;
metaData.fps = cam.fps;
metaData.bandwidth = cam.bandwidth;
metaData.height = cam.height;
metaData.width = cam.width;
metaData.keyFrameInterval = cam.keyFrameInterval;

In order for the application to return the stream’s metadata, a special handler built into Flash Media Server named @setDataFrame needs to be called from the send() method of the NetStream. For more information about @setDataFrame and adding metadata to a live stream, please refer to the Adobe documentation.

  13. Call the send() method of the NetStream class on the ns_out object and pass it the name of the handler method “@setDataFrame”, the callback method name “onMetaData”, and the local variable metaData:
ns_out.send( "@setDataFrame", "onMetaData", metaData );

The completed publishCamera() function should appear as follows:

protected function publishCamera():void
{
    ns_out = new NetStream( nc );
    ns_out.attachCamera( cam );

    var h264Settings:H264VideoStreamSettings = new H264VideoStreamSettings();
    h264Settings.setProfileLevel( H264Profile.BASELINE, H264Level.LEVEL_3_1 );

    cam.setQuality( 90000, 90 );
    cam.setMode( 320, 240, 30, true );
    cam.setKeyFrameInterval( 15 );

    ns_out.videoStreamSettings = h264Settings;
    ns_out.publish( "mp4:webCam.f4v", "live" );

    var metaData:Object = new Object();
    metaData.codec = ns_out.videoStreamSettings.codec;
    metaData.profile = h264Settings.profile;
    metaData.level = h264Settings.level;
    metaData.fps = cam.fps;
    metaData.bandwidth = cam.bandwidth;
    metaData.height = cam.height;
    metaData.width = cam.width;
    metaData.keyFrameInterval = cam.keyFrameInterval;

    ns_out.send( "@setDataFrame", "onMetaData", metaData );
}

 

Displaying and encoding the video from the webcam, and displaying video streamed back from the server

 

The application needs to display both the raw, unencoded incoming video from the webcam, and the inbound streaming video after it has been encoded to H.264 in Flash Player, sent to the server, and streamed back to the application. In addition, the metadata that you defined in the previous section needs to be displayed in the UI to reveal the encoding settings defined in publishCamera().

In this next section, you will work with two functions that have been provided for you; displayPublishingVideo() (line 121), and displayPlaybackVideo() (line 128), to play the streams and display the metadata on screen.

  1. Create a new private instance variable named vid_out, and set its data type to Video. This new instance of the Video class will be used to play back the not-yet-encoded video coming in from the webcam. Be sure to import flash.media.Video:
import flash.media.Video; private var vid_out:Video;
  2. Next, within the initConnection() function, instantiate the vid_out variable by calling the constructor method of the Video class. Provide some layout information by assigning x a value of 300 and y a value of 10. Finally, add vid_out to the stage by passing it as an argument to addChild():
vid_out = new Video();
vid_out.x = 300;
vid_out.y = 10;
addChild( vid_out );
  3. To allow the vid_out component to display video coming from the webcam, call the attachCamera() method of the Video class, and pass that method the instance of the Camera class that represents the webcam:
vid_out.attachCamera( cam );

If you run the application at this point, provided you have a webcam attached to your computer, you should see the Flash Player dialog that asks permission to access your camera. Grant Flash Player permission, and you should now see a live video feed coming from your webcam.
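If you need to react programmatically to the user's choice in that permission dialog (not covered by this walkthrough), the Camera instance dispatches a status event whose code indicates whether access was denied or granted. A sketch:

```actionscript
import flash.events.StatusEvent;

cam.addEventListener( StatusEvent.STATUS, onCameraStatus );

private function onCameraStatus( event:StatusEvent ):void
{
    if ( event.code == "Camera.Muted" )
    {
        trace( "User denied access to the camera." );
    }
    else if ( event.code == "Camera.Unmuted" )
    {
        trace( "User granted access to the camera." );
    }
}
```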

 

Display the playback video stream

 

Next, you’ll bring the video stream back from the server, and display it in another Video object.

  1. Create a new instance variable named vid_in and type it as Video.
private var vid_in:Video;
  2. Next, locate the function named displayPlaybackVideo() at about line 128. In the first line of the function body, instantiate ns_in, the NetStream variable you declared earlier, passing the NetConnection nc as an argument to the constructor:
protected function displayPlaybackVideo():void
{
    ns_in = new NetStream( nc );
}
  3. Still within displayPlaybackVideo(), instead of calling the attachCamera() method as you did for the previous NetStream, set the client property of the new NetStream to this:
ns_in.client = this;
  4. On the next line, call the play() method of the NetStream class, and pass it the String name of the stream. This must match the name used when publishing the outgoing stream:
ns_in.play( "mp4:webCam.f4v" );
  5. Next, within the initConnection() function, instantiate the vid_in variable by calling its constructor. Set some sizing and layout properties so that the new Video object sits properly on the stage. Finally, add vid_in to the display list by passing it as an argument to addChild():
vid_in = new Video();
vid_in.x = vid_out.x + vid_out.width;
vid_in.y = vid_out.y;
addChild( vid_in );
  6. Then, back in the displayPlaybackVideo() function, attach the incoming NetStream to the Video object so that it can play back the video stream:
vid_in.attachNetStream( ns_in );
  7. Save and run the application.

You should now see a dark rectangle appear that displays the video’s encoding settings, and two video streams, side-by-side. The video on the left is the raw video footage coming from the webcam, and the one on the right is the stream coming back from the server.

The application now automatically attaches a webcam, displays the webcam video, encodes that video to H.264, delivers the video to a server via RTMP, and then streams that video from the server to the application.

 

 

HTTP delivery considerations

 

Currently, Flash Player can encode audio using either the Nellymoser or Speex codec. This allows you to stream H.264/AVC video with Nellymoser- or Speex-encoded audio to your server, and from there to web and desktop applications, as well as to devices that can ingest RTMP streams, or HTTP Dynamic Streaming (HDS) and HTTP Live Streaming (HLS) content packaged with Adobe Media Server's live packaging capabilities. Note that at this time Flash Player cannot encode audio to the AAC or MP3 standards, which are required if you want to stream audio to iOS devices using HTTP Live Streaming (HLS).
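Although audio is outside the scope of this walkthrough, selecting Speex for the outgoing audio would look roughly like this. The encode quality value is illustrative, and the sketch assumes the ns_out stream created earlier:

```actionscript
import flash.media.Microphone;
import flash.media.SoundCodec;

var mic:Microphone = Microphone.getMicrophone();
if ( mic != null )
{
    mic.codec = SoundCodec.SPEEX;  // default is SoundCodec.NELLYMOSER
    mic.encodeQuality = 6;         // Speex quality, 0-10 (illustrative)
    ns_out.attachAudio( mic );     // publish audio on the outgoing stream
}
```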

 

Where to go from here

 

Congratulations! You’ve just encoded a live video stream to the H.264 standard, all within the Flash Player. In the past this would’ve involved the use of one or more separate desktop applications to produce the content, as well as a client-side application to ingest the stream. The new capabilities within Flash Player give you some very interesting options for creating high-definition video chat/conferencing applications, and other user-generated, HD video streaming solutions, in a very streamlined fashion.

Thanks to the new H264VideoStreamSettings class, your applications can now utilize the more efficient and higher quality encoding standard of H.264/AVC, instead of the default Sorensen Spark codec.

  • For an in-depth look into working with various H.264 encoding parameters, such as level and profile, check out Encoding options for H.264 video, by Jan Ozer on the Adobe Developer Connection.
  • To learn more about the new features in Flash Player 11 and AIR 3, check out Thibault Imbert and Tom Nguyen’s presentation Changing the Game, from MAX 2011.

 

Creative Commons Attribution-Noncommercial-Share Alike 3.0 Unported License + Adobe Commercial Rights

 

This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 Unported License. Permissions beyond the scope of this license, pertaining to the examples of code included within this work are available at Adobe.

