Encoding live video to H.264/AVC with Flash Player 11

Table of contents
- H264 VideoStreamSettings
- Getting started
- Setting up the server
- Setting up the project in Flash Builder
- Creating the application
- Verifying a successful NetConnection
- Understanding callback functions
- Setting up H.264 encoding, and publishing to the NetStream
- Displaying and encoding the video from the webcam, and displaying video streamed back from the server
- Display the publishing video stream
- HTTP delivery considerations
- Where to go from here
Created
12 November 2012
Requirements
Prerequisite knowledge
You should have a basic familiarity with ActionScript 3.
User level
Beginning
With the release of Flash Player 11, Adobe has introduced some exciting new features, including performance upgrades such as native 64-bit support, and asynchronous bitmap decoding. In addition, Flash Player can now encode live video streams to the H.264/AVC standard. This new feature allows developers to create real-time, high-quality, live video streaming applications for chat, conferencing, and live event broadcasting.
The heart of Flash Player's ability to encode video to H.264 lies within a new class called H264VideoStreamSettings. This new class is a subclass of VideoStreamSettings, and it's what allows you to control the compression settings for video attached to a NetStream. The following code allows you to encode video attached to a NetStream to H.264 in Flash Player, instead of using the default Sorenson Spark codec:
import flash.media.H264VideoStreamSettings;
var h264Settings:H264VideoStreamSettings = new H264VideoStreamSettings();
h264Settings.setProfileLevel( H264Profile.BASELINE, H264Level.LEVEL_3_1 );
This article demonstrates how to take advantage of Flash Player 11.0's new H.264 encoding capabilities by walking you through the development of a video encoding and streaming application. By leveraging the Flex 4.6 SDK, and Flash Player 11.0 or higher, you will build an application that does the following:
- Captures live video from a webcam
- Establishes a connection to Flash Media Server using NetConnection
- Publishes the video stream from the application to Flash Media Server using NetStream
- Displays the outgoing video stream from the camera (prior to being encoded) in a Video component within the application
- Sends encoding parameters to Flash Player to encode the raw webcam video to H.264
- Displays the encoded video's metadata
- Streams the live, encoded video from Flash Media Server back to the application using another instance of NetStream
- Displays the newly encoded, streamed live video in another Video component within the application
To get the most out of this walkthrough, you will need the following:
- Flash Player 11.0 or higher
- Flex 4.6 SDK or higher
- Flash Media Server 4.5, or Adobe Media Server 5
- A video camera attached to your computer
This walkthrough demonstrates taking a live video feed that has been encoded to H.264/AVC within Flash Player and sending it via RTMP to a Flash Media Server. The walkthrough below assumes you are using either Flash Media Server 4.5 or Adobe Media Server 5. If you do not have a media server set up online, you can download a free copy of Flash Media Developer Server 4.5.
Beyond a basic install of either Flash Media Server 4.5 or Adobe Media Server 5, nothing more is required to run the example.
If you’re new to Flash Media Server 4.5 or Adobe Media Server 5, and would like some guidance on how to get started with streaming media, please refer to this excellent series by Joseph LaBrecque and Tom Green – Beginning Flash Media Server 4.5.
The example application is a simple ActionScript 3.0 project that runs in the Flash Player, and utilizes features found in Flash Player versions 11 or later, specifically. Both start and completed versions of the application are provided (H264Encoder_START, and H264Encoder_COMPLETED).
- Import the H264Encoder_START project into Flash Builder by choosing File -> Import Flash Builder Project.
In order for this application to work correctly, Flash Builder needs to target Flash Player 11.0 or higher. This happens by default when using the Flex 4.6 SDK, but not when using earlier versions such as Flex 4.5. For instructions on how to set up your project with an SDK earlier than 4.6, please refer to this article.
- In Flash Builder with the H264Encoder project selected, choose Project -> Properties -> ActionScript Compiler.
- Verify that the compiler is targeting at least Flash Player 11.0 (Fig. 1.2). If it isn't, select the "Use a specific version" radio button, and type "11.0.0" for the value.
First, you'll modify H264Encoder_START so that it can communicate with an attached webcam. In addition, you'll add the code necessary for establishing a NetConnection to connect the application to the server, as well as two NetStream instances: one responsible for getting the video from the application into Flash Media Server, and one for bringing it back from the server into the application.
Connecting a camera, establishing a NetConnection and NetStreams
- Directly under the opening class definition statement, but before the constructor method, create a private variable named nc, data-typed as NetConnection. Use code hinting to have Flash Builder generate the necessary import statement for you, or import flash.net.NetConnection manually. Your code should appear as follows:
package
{
    import flash.display.Sprite;
    import flash.net.NetConnection;

    public class H264Encoder extends Sprite
    {
        private var nc:NetConnection;

        public function H264Encoder()
        {
        }
    }
}
- Create two private variables to represent each NetStream. Create one for the stream going from the application to the server (ns_out), and another for the stream coming back into the application from the server (ns_in). Be sure to import flash.net.NetStream.
package
{
    import flash.display.Sprite;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    public class H264Encoder extends Sprite
    {
        private var nc:NetConnection;
        private var ns_out:NetStream;
        private var ns_in:NetStream;

        public function H264Encoder()
        {
        }
    }
}
- Next, create a private variable named cam of type Camera, and set its value = Camera.getCamera(). The Camera class is a little different from other classes, in that you don't call a constructor to instantiate an object of type Camera. Instead, you call the getCamera() method of the Camera class. This method will return an instance of a Camera object unless there isn't a camera attached to the computer, or if there is, the camera is in use by another application. Be sure to import flash.media.Camera.
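Because getCamera() can return null when no camera is available, a quick defensive check before connecting can save debugging time. The following is an illustrative sketch only — the checkCamera() helper is hypothetical and not part of the START project — assuming the cam variable declared in this step:
// getCamera() returns null if no camera is attached, or if the
// camera is already in use by another application.
private function checkCamera():void
{
    if ( cam == null )
    {
        trace( "No camera available - attach a webcam and restart." );
    }
    else
    {
        trace( "Using camera: " + cam.name );
    }
}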
- It is now time to add code that will allow the application to connect to the server using an instance of the NetConnection class. Within the provided initConnection() function on about line 44, create a new NetConnection by instantiating the nc:NetConnection variable, which you defined in step 1:
private var cam:Camera = Camera.getCamera();
private function initConnection():void
{
nc = new NetConnection();
}
- It's always a good practice to verify that a NetConnection was successful before any further actions are taken. To do this, add an EventListener to listen for an event of type NetStatusEvent.NET_STATUS. You will create the onNetStatus() event handler in the next section. Be sure to import flash.events.NetStatusEvent:
import flash.events.NetStatusEvent;
private function initConnection():void
{
nc = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus);
}
- Next, and still within the initConnection() function body, tell the NetConnection where to connect by calling the connect() method of the NetConnection class. As an argument to this method, add the URL for the location of the "live" folder within the instance of Flash Media Server or Adobe Media Server you want to connect to. For example, to connect to an instance of Flash Media Server running locally on your machine, you would set the URL to "rtmp://localhost/live".
private function initConnection():void
{
nc = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus);
nc.connect("rtmp://YOUR_SERVER_URL/live");
}
- Finally, tell the NetConnection where the server should invoke callback methods by setting the value of the NetConnection's client property to this. Callback methods are special handler functions invoked by the server when a client application establishes a NetConnection. Later on in this example you will work with the onMetaData() and onBWDone() callback methods. You will include these callback methods within the main application class, which is in fact the same object that establishes the NetConnection, and therefore the value of the NetConnection instance's (nc) client property should be set to this. The initConnection() function should now appear as follows:
private function initConnection():void
{
nc = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus);
nc.connect("rtmp://YOUR_SERVER_URL/live");
nc.client = this;
}
As mentioned, it's always a good practice to verify the success of a NetConnection attempt. To do this, locate the function named onNetStatus() on about line 70:
protected function onNetStatus(event:NetStatusEvent):void
{
}
- Within the onNetStatus() event handler body, create a trace statement that outputs the value of event.info.code to the console during debugging. The code property of the info object in the NetStatusEvent will contain String data that indicates the status of the attempted NetConnection, such as "NetConnection.Connect.Success" or "NetConnection.Connect.Failed". Tracing the value of this property allows you to check the status of the NetConnection easily by simply running the application in debug mode.
protected function onNetStatus(event:NetStatusEvent):void
{
trace(event.info.code);
}
- This walkthrough example application will attempt to connect to the server and start playing/publishing video automatically when launched. To achieve this, call initConnection() from within the main class' constructor method:
public function H264Encoder()
{
    initConnection();
}
The sample application contains two callback functions: onBWDone() and onMetaData(). The onBWDone() callback checks for available bandwidth, which can be useful in applications that need to dynamically switch video assets according to the bandwidth that's currently available. Although it's necessary to include these functions in the client code (omitting them will generate a runtime error when the server tries to make the function call), it's not necessary to actually do anything with them.
- This application isn't concerned with monitoring bandwidth, so onBWDone() can be left as an empty function.
The onMetaData() callback function is useful for accessing a video stream's metadata, and the example application provides code within this callback to do just that. The onMetaData() callback receives an object whose properties represent the video stream's metadata. Later, you will use the properties that correspond to various metadata items in order to display that information within the UI.
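Since omitting these callbacks causes a runtime error, even a do-nothing implementation must be present on the client object. A minimal sketch of the two callbacks (the trace output is illustrative; which properties the info object actually carries depends on what the stream sends):
// Invoked by the server after the NetConnection is established;
// safe to leave empty when bandwidth detection isn't needed.
public function onBWDone( ...args ):void
{
}

// Invoked when stream metadata arrives; the info object's
// properties mirror whatever was sent with @setDataFrame.
public function onMetaData( info:Object ):void
{
    trace( "codec: " + info.codec + ", " + info.width + "x" + info.height );
}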
Next, you’ll add code that enables the application to read webcam data, encode that webcam data to H.264, and to then stream the encoded video to the server.
In this next section, you will attach your webcam to an instance of the Camera class. You will then encode the webcam input to H.264 using properties of the Camera class and the H264VideoStreamSettings class. Certain encoding parameters can't be set on H264VideoStreamSettings yet (although support for this is hopefully coming soon), so you'll be setting those values on the Camera class.
Next, you will attach the encoded video to a live video stream, and stream it to the server’s live directory.
Finally, in order to utilize the provided code that will allow you to read the metadata of the newly encoded video stream, you will call the send() method of the NetStream class. As arguments to the send() method, you will include @setDataFrame, a special handler method within Flash Media Server; the provided onMetaData() callback method, to listen for the metadata client-side; and finally, a local variable metaData, which will be used to represent the desired metadata items. First:
- Within the onNetStatus() handler body, and beneath the existing trace statement, create a conditional statement that checks the value of event.info.code and compares it to the String value "NetConnection.Connect.Success". If event.info.code == "NetConnection.Connect.Success", call three functions that you will create in the next section: one that publishes an outgoing video stream, one that displays the incoming video from the webcam, and one that displays the video stream being sent back to the application from the server. The completed onNetStatus() function should appear as follows:
protected function onNetStatus(event:NetStatusEvent):void
{
    trace(event.info.code);
    if(event.info.code == "NetConnection.Connect.Success")
    {
        publishCamera();
        displayPublishingVideo();
        displayPlaybackVideo();
    }
}
At this point, you have included the code necessary to establish a NetConnection, and to verify the success or failure of that connection with a trace statement. In addition, you've included calls to functions that will eventually handle the publishing and playback of the video from the webcam, as well as the video coming back from the server.
- Locate the function named publishCamera() on around line 85. In the first line of publishCamera(), instantiate the ns_out NetStream object by calling its constructor. Pass the constructor the NetConnection instance nc:
protected function publishCamera():void
{
ns_out = new NetStream( nc );
}
- On the next line, attach the Camera instance cam to the outgoing NetStream by calling the attachCamera() method of the NetStream class. Pass this method the cam instance:
ns_out.attachCamera( cam );
- Create a new local variable named h264Settings, data-typed as H264VideoStreamSettings, and set its initial value equal to new H264VideoStreamSettings(). Be sure to import flash.media.H264VideoStreamSettings:
import flash.media.H264VideoStreamSettings;
var h264Settings:H264VideoStreamSettings = new H264VideoStreamSettings();
- Call the setProfileLevel() method of the H264VideoStreamSettings class on the h264Settings instance to encode the video using the BASELINE profile and a level of 3.1. Be sure to import both the H264Level and H264Profile classes:
import flash.media.H264Level;
import flash.media.H264Profile;
h264Settings.setProfileLevel( H264Profile.BASELINE, H264Level.LEVEL_3_1 );
- Use the setQuality() method of the Camera class instance to cap the outgoing video at 90000 bytes per second (720 Kbps), with a quality setting of 90:
cam.setQuality( 90000, 90 );
- Use the setMode() method of the Camera class instance to set the video's width, height, and frames per second, and to specify how the capture size should be adjusted if the camera has no native mode matching these values:
cam.setMode( 320, 240, 30, true );
- Next, use the setKeyFrameInterval() method of the Camera class instance to set the video's keyframe interval to 15 (at 30 fps, two keyframes per second):
cam.setKeyFrameInterval( 15 );
- To set the outgoing video's compression settings, assign the value of the h264Settings variable to the videoStreamSettings property of the outbound stream, ns_out:
ns_out.videoStreamSettings = h264Settings;
- Call the publish() method of the NetStream class on the outgoing NetStream, and pass it parameters to provide a name for the stream ("mp4:webCam.f4v"), as well as the type of publishing ("live"):
ns_out.publish( "mp4:webCam.f4v", "live" );
Note: FLV streams don't require a codec prefix, but F4V/MP4 files, MP3 files, and RAW files do. You can find more information about this in the documentation.
Create the objects that will hold the metadata values of the encoded video you will access at runtime. This isn't a necessary step for encoding H.264 video with Flash Player, but it's included nonetheless so that the completed application can utilize the metadata returned from the onMetaData() callback function that's been provided.
- Create a new local variable named metaData, data-typed as an Object, and set its initial value equal to new Object():
var metaData:Object = new Object();
- Create the following metaData properties and add them to the publishCamera() function:
metaData.codec = ns_out.videoStreamSettings.codec;
metaData.profile = h264Settings.profile;
metaData.level = h264Settings.level;
metaData.fps = cam.fps;
metaData.bandwidth = cam.bandwidth;
metaData.height = cam.height;
metaData.width = cam.width;
metaData.keyFrameInterval = cam.keyFrameInterval;
In order for the application to return the stream's metadata, a special handler built into Flash Media Server named @setDataFrame needs to be called from the send() method of the NetStream. For more information about @setDataFrame and adding metadata to a live stream, please refer to the Adobe documentation.
- Call the send() method of the NetStream class on the ns_out object, and pass it the name of the handler method "@setDataFrame", the callback method "onMetaData", and the local variable metaData:
ns_out.send( "@setDataFrame", "onMetaData", metaData );
The completed publishCamera() function should appear as follows:
protected function publishCamera():void
{
    ns_out = new NetStream( nc );
    ns_out.attachCamera( cam );
    var h264Settings:H264VideoStreamSettings = new H264VideoStreamSettings();
    h264Settings.setProfileLevel( H264Profile.BASELINE, H264Level.LEVEL_3_1 );
    cam.setQuality( 90000, 90 );
    cam.setMode( 320, 240, 30, true );
    cam.setKeyFrameInterval( 15 );
    ns_out.videoStreamSettings = h264Settings;
    ns_out.publish( "mp4:webCam.f4v", "live" );
    var metaData:Object = new Object();
    metaData.codec = ns_out.videoStreamSettings.codec;
    metaData.profile = h264Settings.profile;
    metaData.level = h264Settings.level;
    metaData.fps = cam.fps;
    metaData.bandwidth = cam.bandwidth;
    metaData.height = cam.height;
    metaData.width = cam.width;
    metaData.keyFrameInterval = cam.keyFrameInterval;
    ns_out.send( "@setDataFrame", "onMetaData", metaData );
}
Displaying and encoding the video from the webcam, and displaying video streamed back from the server
The application needs to display both the raw, un-encoded incoming video from the webcam, as well as the inbound streaming video after it has been encoded to H.264 in Flash Player, sent to the server, and then back to the application. In addition, the metadata that you defined in the previous section needs to be displayed in the UI to reveal the encoding settings defined in publishCamera().
In this next section, you will work with two functions that have been provided for you, displayPublishingVideo() (line 121) and displayPlaybackVideo() (line 128), to play the streams and display the metadata on screen.
- Create a new private instance variable named vid_out, and set its data type to Video. This new instance of the Video class will be used to play back the not-yet-encoded video coming in from the webcam. Be sure to import flash.media.Video:
import flash.media.Video;
private var vid_out:Video;
- Next, within the initConnection() function, instantiate the vid_out variable by calling the constructor method of the Video class. Provide some layout information by assigning x a value of 300, and y a value of 10. Finally, add vid_out to the stage by passing it as an argument to addChild():
vid_out = new Video();
vid_out.x = 300;
vid_out.y = 10;
addChild( vid_out );
- To allow the vid_out component to display video coming from the webcam, call the attachCamera() method of the Video class, and pass that method the instance of the Camera class that represents the webcam:
vid_out.attachCamera( cam );
If you run the application at this point, provided you have a webcam attached to your computer, you should see the Flash Player dialog that asks permission to access your camera. Grant Flash Player permission, and you should now see a live video feed coming from your webcam.
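If you'd rather react programmatically when the user allows or denies camera access, you can listen for the camera's StatusEvent. This is a sketch of one common approach, not part of the walkthrough project:
import flash.events.StatusEvent;

// Camera dispatches StatusEvent.STATUS when the user responds
// to the permission dialog.
cam.addEventListener( StatusEvent.STATUS, onCameraStatus );

private function onCameraStatus( event:StatusEvent ):void
{
    if ( event.code == "Camera.Unmuted" )
    {
        trace( "Camera access granted" );
    }
    else if ( event.code == "Camera.Muted" )
    {
        trace( "Camera access denied" );
    }
}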
Next, you'll bring the video stream back from the server, and display it in another Video object.
- Create a new instance variable named vid_in and type it as Video.
private var vid_in:Video;
- Next, locate the function named displayPlaybackVideo() on about line 128. In the first line of the function body, instantiate ns_in, the NetStream variable you declared earlier, and set its initial value equal to new NetStream( nc ), with the NetConnection nc passed as an argument.
protected function displayPlaybackVideo():void
{
ns_in = new NetStream( nc );
}
- Still within displayPlaybackVideo(), instead of calling the attachCamera() method as you did for the previous NetStream, set the client property of the new NetStream to this.
ns_in.client = this;
- On the next line, call the play() method of the NetStream class, and pass it the String value for the name of the stream. This should match the name used to publish the outgoing stream.
ns_in.play( "mp4:webCam.f4v" );
- Next, within the initConnection() function, instantiate the vid_in variable by calling its constructor. Set some sizing and layout properties for the new Video object so that it sits properly on the stage. Finally, add vid_in to the display list by passing it as an argument to addChild():
vid_in = new Video();
vid_in.x = vid_out.x + vid_out.width;
vid_in.y = vid_out.y;
addChild( vid_in );
- Then, back in the displayPlaybackVideo() function, attach the incoming NetStream to the Video object so that it can play back the video stream.
vid_in.attachNetStream( ns_in );
- Save and run the application.
You should now see a dark rectangle appear that displays the video’s encoding settings, and two video streams, side-by-side. The video on the left is the raw video footage coming from the webcam, and the one on the right is the stream coming back from the server.
The application now automatically attaches a webcam, displays the webcam video, encodes that video to H.264, delivers the video to a server via RTMP, and then streams that video from the server to the application.
Currently, Flash Player can encode audio using either the Nellymoser or Speex codec. This allows you to stream H.264/AVC video with either Nellymoser- or Speex-encoded audio to your server, and from there to web and desktop applications, as well as to devices that can ingest RTMP streams, or HTTP Dynamic Streaming (HDS) and HTTP Live Streaming (HLS) content produced with the Adobe Media Server live packaging capabilities. Note that at this time Flash Player cannot encode audio to the AAC or MP3 standards, which are required if you want to stream audio to iOS devices using HTTP Live Streaming (HLS).
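Although this walkthrough is video-only, adding audio to the same outgoing stream takes only a few lines. A minimal sketch, assuming you want Speex rather than the default Nellymoser codec and that ns_out has already been created as in publishCamera():
import flash.media.Microphone;
import flash.media.SoundCodec;

// Attach the default microphone to the outgoing stream,
// selecting Speex instead of the default Nellymoser codec.
var mic:Microphone = Microphone.getMicrophone();
if ( mic != null )
{
    mic.codec = SoundCodec.SPEEX;
    mic.setSilenceLevel( 0 );   // transmit audio continuously
    ns_out.attachAudio( mic );
}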
Congratulations! You’ve just encoded a live video stream to the H.264 standard, all within the Flash Player. In the past this would’ve involved the use of one or more separate desktop applications to produce the content, as well as a client-side application to ingest the stream. The new capabilities within Flash Player give you some very interesting options for creating high-definition video chat/conferencing applications, and other user-generated, HD video streaming solutions, in a very streamlined fashion.
Thanks to the new H264VideoStreamSettings class, your applications can now utilize the more efficient and higher quality encoding standard of H.264/AVC, instead of the default Sorensen Spark codec.
- For an in-depth look into working with various H.264 encoding parameters, such as level and profile, check out Encoding options for H.264 video, by Jan Ozer on Adobe Media Center Developer Center.
- To learn more about the new features in Flash Player 11 and AIR 3, check out Thibault Imbert and Tom Nguyen’s presentation Changing the Game, from MAX 2011.
Creative Commons Attribution-Noncommercial-Share Alike 3.0 Unported License + Adobe Commercial Rights
This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 Unported License. Permissions beyond the scope of this license, pertaining to the examples of code included within this work are available at Adobe.