Simple Stand-Alone Video Player

Recently we were asked if we’d ever created an OSMF based desktop player that was capable of switching back and forth between windowed and full-screen mode. The answer was “no”, but re-using some of the sample code for our recent MAX presentation, we thought that building it shouldn’t be too much work – so we set out to create it as a sample.

The resulting very simple ‘proof-of-concept’ AIR-based OSMF Desktop Player is available for download here. The source files are available right here.

Operating Instructions
(please note that the application is not a supported product!)

  • To load a video, double click the player’s chrome.
  • To switch between windowed mode and full-screen, press the ‘f’ key.
  • To quit the player, bring it into focus and press ALT-F4 on Windows, or CMD-Q on OS X.
  • The player doesn’t scrub: the progress bar merely indicates the play-head’s position.
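As a rough sketch, the ‘f’-key full-screen toggle can be implemented with plain Flash player APIs (this is an illustration of the behavior described above, not necessarily the sample’s exact code):

```actionscript
import flash.display.StageDisplayState;
import flash.events.KeyboardEvent;

// Sketch of an 'f'-key full-screen toggle. FULL_SCREEN_INTERACTIVE keeps
// keyboard input enabled while full-screen, which AIR applications allow.
stage.addEventListener(KeyboardEvent.KEY_DOWN, onKeyDown);

function onKeyDown(event:KeyboardEvent):void
{
    if (String.fromCharCode(event.charCode) == "f")
    {
        stage.displayState
            = (stage.displayState == StageDisplayState.NORMAL)
            ? StageDisplayState.FULL_SCREEN_INTERACTIVE
            : StageDisplayState.NORMAL;
    }
}
```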

The remainder of this post goes over the sample’s code. We’ll touch on how to bind a UI to a media element using traits, and how to put a viewable media element on the stage by setting its gateway property to a RegionGateway instance.

Element Loading and Playback

The current iteration of the OSMF framework does not come with any visual player controls. The idea is that (at least for now), developers provide their own buttons, sliders, and other interfaces. OSMF does make it very easy to control media, though: through a class called MediaPlayer, controlling a media element becomes as simple as setting a few properties.

The code from the desktop player illustrates the use of MediaPlayer like so:

mediaPlayer = new MediaPlayer();  // line 118
video = new VideoElement(...);    // line 131
mediaPlayer.element = video;      // line 142

The first line instantiates the MediaPlayer class. Next, a video element gets constructed, and last, mediaPlayer.element gets set to the constructed video. The media player object will automatically attempt to download the media element that it gets assigned. Once the element has finished loading, the media player will attempt to start its playback (this behavior can be switched off by setting the player’s autoPlay property to false).
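For instance, a player that wants to take manual control of playback might disable auto-play first. A sketch along the lines of the code above (video stands for a previously constructed VideoElement):

```actionscript
mediaPlayer = new MediaPlayer();
mediaPlayer.autoPlay = false;  // don't start playback when loading completes
mediaPlayer.element = video;   // assigning the element still triggers loading

// ... later, for example from a play-button handler:
mediaPlayer.play();
```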

Any further operations on the media element can be carried out from the MediaPlayer instance. The sample illustrates this in its button click handlers (line 37):

override protected function onPlayButtonClick(event:MouseEvent):void {
    mediaPlayer.play();
}

override protected function onPauseButtonClick(event:MouseEvent):void {
    mediaPlayer.pause();
}

override protected function onSoundLessButtonClick(event:MouseEvent):void {
    mediaPlayer.volume -= .2;
}

override protected function onSoundMoreButtonClick(event:MouseEvent):void {
    mediaPlayer.volume += .2;
}

The code shows that, using MediaPlayer, there is no need to operate directly on a media element. In general, the methods on MediaPlayer are a bit easier to use, and additionally one can set up a MediaPlayer once and change the actual media element that it targets without having to do any additional work.
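To illustrate that last point, retargeting a configured MediaPlayer could look like this (introElement and mainElement are hypothetical, previously constructed media elements):

```actionscript
// One MediaPlayer, several media elements: settings such as volume and
// autoPlay, and any event listeners on the player, carry over unchanged.
mediaPlayer.volume = 0.6;
mediaPlayer.element = introElement;   // loads and (with autoPlay) plays

// ... later:
mediaPlayer.element = mainElement;    // same player, new target
```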

Staging a MediaElement

MediaPlayer is great for controlling a media element, but it is a non-visual object; it doesn’t show a media element (it is a Controller, to speak in Model View Controller (MVC) terms).

Visualizing a media element can be done by asking the element for its viewable trait. One of the viewable trait’s properties is called view, and is of type DisplayObject. Within your player, you can take this object and put it on the stage using addChild. However, this is the complicated way of doing things: using the framework’s RegionGateway class instead is much easier, as this code shows:

canvas = new RegionGateway();        // line 105
addChildAt(canvas, 0);
video                                // line 131
    = new VideoElement
        ( new NetLoader()
        , new URLResource(new URL("file://" + file.nativePath))
        );
video.gateway = canvas;

On the first line, a RegionGateway gets instantiated. Since a RegionGateway derives from Sprite, it is a DisplayObject that can be staged. This happens on the next line. Last, a media element gets constructed, and its gateway property gets set to canvas, the RegionGateway instance that was just created.
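For comparison, the ‘complicated way’ mentioned earlier, going through the viewable trait by hand, might look roughly like this. This is a sketch: it assumes the IViewable and MediaTraitType names from the sprint releases, and it only works once the element actually has the trait:

```actionscript
// Fetch the viewable trait off the element and stage its view manually.
var viewable:IViewable
    = video.getTrait(MediaTraitType.VIEWABLE) as IViewable;
if (viewable != null)
{
    addChild(viewable.view);
}
```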

Instead of the last line (video.gateway = canvas;), calling the equivalent add method on the RegionGateway instance would have worked just as well. However, the sample uses the gateway property on MediaElement to highlight the property’s existence: every MediaElement can be sent off to somewhere by setting this property. In this case the destination is a RegionGateway, which causes the element to end up on the stage within the player. Another such gateway is HTMLGateway, which can be used to have an element be processed by the HTML document that is hosting the player.

Connecting the UI

Another thing to cover is how the UI controls from the sample know when to be enabled, and when not to be. The chrome holds a couple of controls, and each of them maps to a media element trait:

  • Play button maps to IPlayable,
  • Pause button maps to IPausible,
  • Position bar maps to ITemporal,
  • Volume up / Volume down maps to IAudible.

Media element traits are dynamic objects that can be added to and removed from a media element during its lifetime. For example, before an image has been loaded, it does not have a viewable trait yet. Once it has completed loading successfully, the viewable trait gets added to the element dynamically. If a UI wants to react to media elements changing this way, it will have to monitor for traits coming and going.

The framework tries to make this monitoring easy: again, MediaPlayer comes to the rescue by providing a MediaPlayerCapabilitiesChange event, which fires whenever a trait on its currently assigned media element gets added or removed.

The desktop player sample takes a lower-level approach though: it listens directly at the MediaElement level, where the TraitsChangeEvent gets fired whenever the media element gains or loses a trait. The following snippet (from a base class that the UI controls inherit from) shows the listeners being added:

_mediaElement.addEventListener(TraitsChangeEvent.TRAIT_ADD, onTraitsChange);
_mediaElement.addEventListener(TraitsChangeEvent.TRAIT_REMOVE, onTraitsChange);

The handler method checks to see if the trait that is being added or removed is of interest to the control. In the updateEnablement method (line 111), the type of the added/removed trait is checked against a list of required traits (each subclass or instance adds its required traits using the addRequiredTrait method). If the handler sees that the trait is required, the component’s enabled state is toggled according to whether the trait is present at that time.
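A handler along those lines might be sketched like this (hypothetical code, using IPlayable as the sole required trait; the sample’s updateEnablement is more general):

```actionscript
// Enable the control only while the trait it depends on is present.
private function onTraitsChange(event:TraitsChangeEvent):void
{
    if (event.traitType == MediaTraitType.PLAYABLE)
    {
        enabled = _mediaElement.hasTrait(MediaTraitType.PLAYABLE);
    }
}
```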


As a result of the OSMF framework not using any Flex, AIR, or Flash Authoring-specific libraries, it can be used in all sorts of ActionScript 3 applications. Be aware that when using the framework with AIR (or from the stand-alone Flash player, for that matter), one may run into local-to-network security sandbox issues. For example, dynamically loading OSMF plug-ins from a local application will fail. Since this sample works with local files only, it does not have any such issues.

More …

This concludes the walk-through of the sample’s main topics. For more information on OSMF, this blog is a great resource. Another good source of information is our forums, which are frequently visited by engineers from the OSMF team.
