Beta Feature: VideoTexture and Stage3D

Authored by Jason Lee

Flash Player 15 Beta introduces the VideoTexture object, which allows hardware-decoded video to be used in Stage3D content. In prior versions of Flash Player, using video in Stage3D required the Video object (which is not hardware accelerated), copying each video frame to a BitmapData object, and uploading the data to the GPU.

With the VideoTexture object introduced in Flash Player 15 Beta, decoding, YUV-to-RGB conversion, and texture upload can be moved entirely to the GPU. The textures provided by the VideoTexture object can be used as rectangular, RGB, non-mipmapped textures when rendering a Stage3D scene, and can be treated as ARGB textures by shaders (that is, AGAL shaders do not need to perform YUV-to-RGB conversion). Thus, standard shaders used with static images can be reused unchanged.
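Because the YUV-to-RGB conversion happens before sampling, a fragment shader treats the video texture like any other 2D texture. As a minimal AGAL fragment shader sketch (assuming the texture is bound to sampler fs0 and UV coordinates arrive in varying register v0):

 tex ft0, v0, fs0 <2d, linear, nomip, clamp> // sample the video frame as plain RGB
 mov oc, ft0                                 // write the sampled color to the output

The sampler flags (2d, nomip, clamp) match the sampling options VideoTexture supports, as listed in the notes below.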

Following are sample scenarios in which VideoTexture would be helpful:

  • Video transitions in a game. Videos are sometimes used to transition between story lines in a game. Without VideoTexture, the best way to do this is to use StageVideo. However, the Stage3D content would then have to be removed so as not to obscure the StageVideo, and thus could not be used as an overlay. VideoTexture allows Stage3D content to be used as an overlay in such a scenario.
  • Using a remote or local camera in games (e.g., chatting with game mates) or applying shader effects to a camera stream. VideoTexture allows camera streams to be embedded within Stage3D content.
  • In-game video streams. Video streams can be used inside the content (for example, on a monitor in the scene), and special effects can be applied to the video (YouTube 3D vision, for example).

ActionScript example of use of VideoTexture

There are three steps to using VideoTexture.

1. Check whether VideoTexture is supported by the Context3D class

Before using the VideoTexture object, a check should be made to confirm that the Context3D class supports VideoTexture, by checking the static property supportsVideoTexture of the Context3D class.

if (stage.stage3Ds.length > 0)
{
    var stage3D:Stage3D = stage.stage3Ds[0];
    stage3D.addEventListener(Event.CONTEXT3D_CREATE, myContext3DHandler);
    stage3D.requestContext3D();
}

function myContext3DHandler(event:Event):void
{
    if (Context3D.supportsVideoTexture)
    {
        // We can create a VideoTexture object with the Context3D object.
    }
}

2. Create a VideoTexture object and attach a NetStream/Camera object to the VideoTexture object.

// Connections
var nc:NetConnection = new NetConnection();
nc.connect(null); // null for local or progressive playback

var ns:NetStream = new NetStream(nc);

var texture:VideoTexture = stage3D.context3D.createVideoTexture();
texture.attachNetStream(ns);
texture.addEventListener(Event.TEXTURE_READY, renderFrame);
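For the camera scenario mentioned earlier, the source is attached with attachCamera instead of attachNetStream. A minimal sketch, assuming a local webcam is available:

 var camera:Camera = Camera.getCamera();
 if (camera != null)
 {
     var camTexture:VideoTexture = stage3D.context3D.createVideoTexture();
     camTexture.attachCamera(camera);
     camTexture.addEventListener(Event.TEXTURE_READY, renderFrame);
 }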

3. Retrieve the currently available video frame from the attached NetStream into the VideoTexture object

The current video frame can be retrieved as a texture from the VideoTexture object when the callback for the Event.TEXTURE_READY event is fired.

Event.TEXTURE_READY is fired whenever a video frame corresponding to the current audio playback is available.  Video frames not in sync with the current audio playback may be dropped.

var context3D:Context3D;

function renderFrame(e:Event):void
{
    // Render on Stage3D with the VideoTexture object.
}
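A fuller handler might look like the following sketch, assuming a shader program and quad geometry (an indexBuffer, with vertex buffers bound) were set up elsewhere:

 function renderFrame(e:Event):void
 {
     context3D.clear(0, 0, 0);
     context3D.setTextureAt(0, texture);   // bind the current video frame to sampler fs0
     context3D.drawTriangles(indexBuffer); // draw the textured quad
     context3D.present();
 }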

Through the VideoTextureEvent object, the VideoTexture feature provides notification of the same events as those provided by the StageVideoEvent object.

texture.addEventListener(VideoTextureEvent.RENDER_STATE, onRenderState);

function onRenderState(event:VideoTextureEvent):void
{
    if (event.status == VideoStatus.SOFTWARE)
    {
        // Software video decoding is in use.
    }
    if (event.status == VideoStatus.ACCELERATED)
    {
        // Hardware-accelerated (GPU) video decoding is in use.
    }
    if (event.status == VideoStatus.UNAVAILABLE)
    {
        // The video decoder is not available.
    }
}

Notes on the Feature

  • The current beta implementation is for Windows AIR only.  We will be adding support for Mac, iOS and Android AIR in an upcoming release.
  • VideoTexture is not supported when Context3D uses software rendering mode.  If the method Context3D.createVideoTexture is called while Context3D uses software rendering mode, it will throw an error: “Texture Creation Failed. Internal error.”
  • There are limitations on texture sampling with the VideoTexture object. The available sampler options are:
    • Texture dimension. Available options: 2d
    • Mip mapping. Available options: nomip (or mipnone)
    • Texture repeat. Available options: clamp
  • A maximum of 4 VideoTexture objects are available per Context3D instance.
  • The VideoTexture object will not provide video frames from DRM encrypted videos.
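The software-rendering limitation above can be guarded against at creation time; a minimal sketch:

 var videoTexture:VideoTexture;
 try
 {
     videoTexture = context3D.createVideoTexture();
 }
 catch (e:Error)
 {
     // Thrown in software rendering mode:
     // "Texture Creation Failed. Internal error."
     trace(e.message);
 }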

API details:

Changes to the Context3D interface

Public properties:
 /* Indicates if Context3D supports video texture on current platform 
    and product. */ 
 static supportsVideoTexture:Boolean; [read-only]
Public methods:
 /* Creates a VideoTexture object.*/
 createVideoTexture( ):VideoTexture;

Video Texture class

class VideoTexture extends TextureBase
Public methods:
 /* Attaches a NetStream video stream to the texture. */
 attachNetStream( netStream:NetStream ):void;
 /* Attaches a webcam to the texture. */
 attachCamera( theCamera:Camera ):void;
Public properties:
 /* Returns the current width of the video in pixels. */
 videoWidth:int; [read-only]
 /* Returns the current height of the video in pixels. */
 videoHeight:int; [read-only]

Events generated by the Video Texture

class VideoTextureEvent extends Event
Public properties:
 /* The status of the VideoTexture object. */
 status:String; [read-only]
 /* The color space used by the video being displayed in the VideoTexture object. */ 
 colorSpace:String; [read-only]
 /* The VideoTextureEvent.RENDER_STATE constant defines the value of the type property 
    of a renderState event object. */
 static const RENDER_STATE:String = "renderState";
