This article outlines Adobe's recommendations for developing and enhancing the performance of dynamic streaming with Adobe Flash Media Server 3.5 and Adobe Flash Player 10 for streaming broadcast (live) media.
Publishing the ideal number of stream encodings
When you stream broadcast (live) video to end users, it is best to provide the appropriate stream for each customer: a low bit-rate stream for low-bandwidth (dial-up) users, and a high bit-rate stream for high-end broadband customers.
This means maintaining a significant difference between successive encoding bit rates. Keeping too many bit rates close to one another could result in too many stream switches, even with small bandwidth fluctuations. Besides the (slight) overhead of switching, frequent quality fluctuations make for an unpleasant viewing experience. On the other hand, encoding too few streams with a huge gap in bit rates would not provide the best quality, or the optimal stream, for a particular bandwidth environment.
Adobe recommends using the bit rates shown in Table 1 for live streaming. If the intended target users are towards the higher end of the bandwidth spectrum with at least a DSL connection, then the first couple of bit rates could be skipped. The frame rate for videos below a bit rate of 100 Kbps could be set to lower values such as 15 fps, but at bit rates higher than 300 Kbps, a frame rate of at least 25 fps and ideally 30 fps is recommended.
Table 1. Recommended bit rates for live streaming

* Note: Based on the IDC 2008 Consumer Panel Broadband Survey. Each figure represents the percentage of users who have the bandwidth to support the respective total bit rate in that category. For example, 25% of users have bandwidth of at least 1200 Kbps to support the D1 video type but don't have the higher bandwidth needed to support the next higher bit rate of 1800 Kbps.

Here are a few salient points about Table 1:
  • When there is a switch between two streams with different audio bit rates, there may be a slight audible "pop" sound. If you are concerned about this switching artifact, then keep the audio bit rates constant between all the streams to have a completely seamless audio switch.
  • For higher bit rates, the video encoding resolution should not be set higher than the source capture size, as that will not produce higher quality. Keeping the maximum encoding resolution limited to the maximum resolution supported by the camera or the capture interface gives the best quality at a particular bit rate.
  • The video encoding resolution should also not exceed the maximum video resolution on the client's end. If the client's video size is restricted to a particular size, encoding at a higher resolution will be a waste, and the bits may better be conserved for encoding quality rather than size.
  • Keep the encoding audio codec and sampling rates the same across all streams, preferably 44,100 Hz. Switching between streams with different audio sample rates or codecs may not provide a smooth transition and could result in an audible stutter.
  • For lower bit rates, keeping audio set to mono will allow bandwidth savings while keeping the quality the same as the next higher bit-rate streams, providing the smoothest audio transition during a switch with no perceivable difference in audio quality.
  • The audio bit rate values used in Table 1 are for the AAC audio codec supported by Flash Media Live Encoder 3.0. MP3 is only supported for a bit rate of 32 kbps and higher.
  • Unless you're streaming a talking-head video, keep the video frame rate the same at around 24 fps for all streams. For talking-head content, the lower bit-rate streams could be set to publish fewer frames per second, which will help to increase the video quality.
  • Live sporting events and news events involve high-motion video with frequent camera changes, zooms, and other motion effects that require a higher video bit rate. Audio is mostly speech or background noise and could be encoded at lower bit rates to give video more bits, and thus higher quality.
  • Events pertaining to music, concerts or on-demand broadcasts of recorded video may need higher audio bit rates at the cost of slightly lower video quality.
Obtaining client maximum bandwidth capacity
The ActionScript property NetStreamInfo.maxBytesPerSecond is most useful in determining the best bit rate to which the client could subscribe in order to get the best possible quality of service (QoS) without going into buffer underruns. Unlike video-on-demand cases, in live streaming the server keeps a continuous stream of data flowing as it gets it from the publisher. This stream may not utilize the maximum capacity of the client's bandwidth completely, especially if the client is subscribed to a lower bit rate; therefore, the maxBytesPerSecond property will not accurately reflect the maximum bandwidth that the client's connection may be able to support. In video-on-demand cases, the server bursts a few seconds of data together, allowing the maxBytesPerSecond property to reflect the client's maximum bandwidth potential accurately at a given point in time.
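As a sketch of how this property can be used on the client, the following AS3 fragment samples the measured download rate and picks the highest encoding the connection can sustain. The bit-rate table, the 80% safety margin, and the function name are illustrative assumptions, not part of the FMS API:

```actionscript
// Sketch only: sample the client's measured download rate and pick the
// highest encoding it can sustain. The bit-rate table is hypothetical and
// must match the encodings actually being published.
var bitrates:Array = [150, 500, 800, 1500]; // Kbps

function pickBitrate(ns:NetStream):int
{
    // maxBytesPerSecond reports the peak download rate seen by the client;
    // convert to Kbps and keep a safety margin (use only 80% of it here)
    var kbps:Number = ns.info.maxBytesPerSecond * 8 / 1024 * 0.8;
    var best:int = bitrates[0];
    for each (var rate:int in bitrates)
    {
        if (rate <= kbps) best = rate;
    }
    return best;
}
```

Note that, as described above, this measurement is only meaningful for live streams when the server is configured to send data in aggregates.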
In live streaming, the server can also be set up to queue a few seconds of data by setting up a live queue, also known as "live aggregate messages," and then send it as an aggregate, which would allow the client to measure its true bandwidth capacity potential. Starting with FMS 3.5, Application.xml on the server provides the following new configurations:
  • Application/StreamManager/Live/Queue/MaxQueueDelay: The maximum time (in msec) that the live queue can delay transmitting messages
  • Application/StreamManager/Live/Queue/MaxQueueSize: The maximum size (in bytes) to which the live queue is allowed to grow before the messages it contains are transmitted
Here is the relevant section of Application.xml:

<Application>
  <StreamManager>
    <Live>
      <Queue enabled="true">
        <!-- Specifies how often the server will flush the message queue by size -->
        <!-- (in bytes). Setting the value to 0 also disables queuing. Default is 4096. -->
        <MaxQueueSize>4096</MaxQueueSize>
        <!-- Specifies how often the server will flush the message queue by time -->
        <!-- (in milliseconds). The default value is 500 milliseconds. -->
        <MaxQueueDelay>500</MaxQueueDelay>
      </Queue>
    </Live>
  </StreamManager>
</Application>
Set the MaxQueueDelay property to a minimum of 4000 milliseconds. Corresponding to that, you should set the MaxQueueSize property with the maximum size in bytes needed to queue up 4000 milliseconds of data. This would be the size of 4000 milliseconds of data of the stream with the highest bit rate that will be streamed by the publisher; for example, if the highest bit rate is going to be 2.4 Mbps, then the MaxQueueSize should be:
(2.4 × 1024 × 1024 bits/sec ÷ 8 bits/byte) × 4 sec = 1,258,291.2, rounded up to 1,258,292 bytes
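The same calculation, written as a small helper in plain ECMAScript (the syntax family ActionScript belongs to); the 2.4 Mbps and 4-second figures are just the example values above:

```javascript
// Bytes needed to queue `seconds` of a stream at `mbps` (1 Mb = 1024 * 1024 bits)
function maxQueueSizeBytes(mbps, seconds) {
  var bitsPerSecond = mbps * 1024 * 1024;
  return Math.ceil(bitsPerSecond / 8 * seconds); // 8 bits per byte
}

var size = maxQueueSizeBytes(2.4, 4); // 1258292 for the example above
```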
Flash Media Server flushes the queue as soon as either limit is reached, so if MaxQueueDelay and MaxQueueSize conflict, the smaller of the two (the one hit first) takes effect. Hence, for the example above, Application.xml should be set up as follows:
<MaxQueueSize>1258292</MaxQueueSize>
<!-- Specifies how often the server will flush the message queue by time -->
<!-- (in milliseconds). The default value is set to 500 milliseconds. -->
<MaxQueueDelay>4000</MaxQueueDelay>
The live queue could also be set up programmatically through server-side ActionScript. The following new properties of the Stream object are available with FMS 3.5:
  • Stream.maxQueueDelay
  • Stream.maxQueueSize
  • Stream.publishQueryString
The first two match their counterparts in Application.xml, of course; the third, Stream.publishQueryString, specifies the query string appended to the stream path when the stream is published. If the stream is not being published, this will be an empty string.
A publisher can specify the live queue configuration parameters through the query string appended to the publish name, and the server-side ActionScript would be able to parse the string and set the parameters.
Using client-side ActionScript from an application built in Adobe Flash or Adobe AIR, you could configure the parameters as follows:
ns = new NetStream(nc);
// set a queue delay of 4 seconds and a max queue size of
// 1,258,292 bytes (4 seconds of data at 2.4 Mbps)
ns.publish("foo?com.adobe.fms.maxQueueDelay=4000&com.adobe.fms.maxQueueSize=1258292");
Query strings are also supported with Flash Media Live Encoder 3.0 and later, and the same string as above could be used to publish from it and configure the live queue on the server.
The following server-side ActionScript would then parse the publish string and set the respective live queuing properties:
application.onPublish = function(clientObj, streamObj)
{
    trace("queryString : " + streamObj.publishQueryString);
    // the helper function extractQueryStringArg() is defined below
    delay = extractQueryStringArg(streamObj.publishQueryString, "com.adobe.fms.maxQueueDelay");
    size = extractQueryStringArg(streamObj.publishQueryString, "com.adobe.fms.maxQueueSize");
    trace("old maxQueueDelay : " + streamObj.maxQueueDelay);
    streamObj.maxQueueDelay = delay;
    trace("new maxQueueDelay : " + streamObj.maxQueueDelay);
    trace("old maxQueueSize : " + streamObj.maxQueueSize);
    streamObj.maxQueueSize = size;
    trace("new maxQueueSize : " + streamObj.maxQueueSize);
}

function extractQueryStringArg(queryString, arg)
{
    var retVal = "";
    // look for the argument at the very start of the query string first
    temp = arg + "=";
    i = queryString.indexOf(temp);
    if (i != 0)
    {
        // otherwise look for it after a separating "&"
        temp = "&" + arg + "=";
        i = queryString.indexOf(temp);
    }
    if (i != -1)
    {
        retVal = queryString.substr(i + temp.length);
        // trim off any following arguments
        i = retVal.indexOf("&");
        if (i != -1)
        {
            retVal = retVal.substr(0, i);
        }
    }
    return retVal;
}
Considering that the delay is set to four seconds, the client's buffer should be set to a larger value than four seconds. As mentioned later in this document, a client buffer of 10 seconds would be ideal.
Setting the client-side buffer and keyframe interval
The FMS server identifies the right frame at which to switch, keeping the switch smooth without introducing a gap in the live stream. The amount of delay the server introduces is based on the size of the client's playback buffer. For the best user experience, the client-side buffer needs to be larger than the keyframe interval of the live stream—ideally, two times the keyframe interval, which gives the server enough room to find the right frame for the switch.
A keyframe interval of five seconds is recommended. Making them too far apart would require a larger client-side buffer and increase the delay on the client side. Also, it takes longer to switch to a stream that has keyframes at longer intervals. Having them too close to each other, on the other hand, would increase bandwidth usage and hence provide lower quality for a fixed bit rate.
For these recommended keyframe intervals and for smoother video switching, a client-side buffer of at least 10 seconds would be most suitable. In the case of a server-side publish, set the buffer length of the server-side streams so that it is proportional to the keyframe interval. For example, if the streams are encoded with a five-second keyframe interval, set the buffer time to a minimum of 10 seconds with the setBufferTime method.
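For a server-side publish, this can be done in server-side ActionScript; a minimal sketch, in which the stream names are hypothetical:

```actionscript
// Server-side ActionScript sketch: republish an incoming live stream with a
// 10-second buffer (twice the five-second keyframe interval recommended above).
application.onAppStart = function()
{
    var s = Stream.get("liveStream");  // hypothetical output stream name
    s.setBufferTime(10);               // seconds; at least 2x the keyframe interval
    s.play("sourceStream", -1, -1);    // -1, -1 = play a live stream
};
```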
Also, for smoother video switching, keep the various stream encodings time-aligned and the keyframe interval values closer to each other and preferably the same. Flash Media Live Encoder 3.0 conforms to these requirements.
When not to initiate a stream switch
As a good practice, it is recommended that a new switch not be initiated in the following scenarios:
  • While the stream is buffering. This means following a Buffer.Empty event until the Buffer.Full event is received or until the first Buffer.Full event on a new stream.
  • While the server is processing a previous switch call. Wait until the client receives the NetStream.Play.Transition event for the prior switch before making another call.
  • Before the stream is published. This means waiting until a NetStream.Publish.Notify event is received by the client.
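These three rules can be enforced on the client with a few state flags driven by NetStatus events. The following AS3 sketch assumes the code issuing the switch sets transitionPending before each play2() call; the flag and function names are illustrative:

```actionscript
// Sketch: gate switch requests on buffering state, a pending transition,
// and whether the stream has been published yet.
var buffering:Boolean = true;          // no switches while the buffer is filling
var transitionPending:Boolean = false; // set to true when a switch is issued
var published:Boolean = false;         // no switches before NetStream.Publish.Notify

ns.addEventListener(NetStatusEvent.NET_STATUS, function(e:NetStatusEvent):void
{
    switch (e.info.code)
    {
        case "NetStream.Buffer.Empty":    buffering = true;          break;
        case "NetStream.Buffer.Full":     buffering = false;         break;
        case "NetStream.Play.Transition": transitionPending = false; break;
        case "NetStream.Publish.Notify":  published = true;          break;
    }
});

function canSwitch():Boolean
{
    return published && !buffering && !transitionPending;
}
```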
Monitoring other quality of service properties
Besides NetStreamInfo.maxBytesPerSecond, mentioned previously, another property to monitor while playing back a live stream is NetStreamInfo.droppedFrames, which provides the absolute number of video frames dropped due to limited CPU resources on the client's machine. If the bit rate or the resolution of the video is much higher than the client CPU can handle, playback will stutter and drop video frames in order to stay in sync with the audio and keep up with the timeline. If the video is dropping many frames, that should trigger a switch to a lower bit-rate stream to provide a smoother playback experience. Turning off smoothing on the Video object also helps reduce CPU usage, so it can be disabled before switching down to a lower bit rate.
NetStream.bufferLength is also a critical property to monitor when making switch-down decisions. The buffer length will start dropping when the client's bandwidth falls and can no longer sustain the buffer at the current bit rate of the video. If the buffer length drops below a certain threshold, it should trigger a switch down to the next lower bit rate, or to one calculated from the NetStreamInfo.maxBytesPerSecond property.
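A sketch of such a monitoring loop in AS3 follows; the 2-second polling interval, the drop threshold, and the switchDown() helper are illustrative assumptions:

```actionscript
// Sketch: poll QoS every 2 seconds and switch down when the client is
// dropping frames or the buffer is draining. Thresholds are illustrative.
var lastDroppedFrames:Number = 0;

var qosTimer:Timer = new Timer(2000);
qosTimer.addEventListener(TimerEvent.TIMER, function(e:TimerEvent):void
{
    var droppedDelta:Number = ns.info.droppedFrames - lastDroppedFrames;
    lastDroppedFrames = ns.info.droppedFrames;

    if (droppedDelta > 10)
    {
        // CPU-bound: many frames dropped in the last interval
        video.smoothing = false; // cheaper rendering before switching down
        switchDown();            // hypothetical helper that calls ns.play2()
    }
    else if (ns.bufferLength < ns.bufferTime / 2)
    {
        // Bandwidth-bound: buffer draining below half the target
        switchDown();
    }
});
qosTimer.start();
```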
Where to go from here
For more information, please refer to these other articles on dynamic streaming: