Author Archive: Brian Riggs

OSMF “Neon” Sprint 3 Release Posted

We’re excited to announce that the first sprint release of OSMF 1.5 “Neon” is available!  We’ve been working closely with the Flash Media Server team to provide client-side support for some of the new FMS features.  Here are the highlights from this sprint release:

  • Multicast Support.  OSMF now supports Multicast, a new feature in FMS 4.0, enabling publishers to reach audiences within and beyond their network without a CDN.  The feature includes support for Native IP Multicast, Peer-to-Peer (P2P) Multicast, and Native IP Multicast with fallback to P2P Multicast (known as Fusion).
  • Stream Reconnect Support.  Stream Reconnect minimizes unnecessary playback interruptions by providing a grace period when the player loses its network connection.  The player will continue to play out the buffer while it attempts to re-establish a network connection.  Stream Reconnect is supported in FMS 3.5.3.

The release also includes a number of bug fixes.  We’ve also posted an updated version of Strobe Media Playback that uses the “Neon” Sprint 3 release. (Updated: The SMP ZIP includes OSMF 1.0.  We’ll be providing an updated version of SMP that uses the “Neon” drop in the near future.) Here are the relevant links:

OSMF Plug-in Developer’s Guide Posted

We’ve posted a draft of the plug-in developer’s guide (PDF) to the Developer Documentation section of the OSMF site.  Feedback is welcome.

Here’s an outline:

1. Overview
        Why Plug-ins?
        What is a Plug-in?

2. Using Plug-ins in a Media Player
        Loading a Plug-in
        Using a Plug-in
        Plug-in Collisions
                Plug-in Metadata
                Plug-in Resolution
        Plug-in Load Failures

3. Building a Plug-in
        Building a Basic Plug-in
        Implementing Static Plug-ins
        Implementing Dynamic Plug-ins
        Initializing Your Plug-in
        Types of Plug-ins
                Standard Plug-ins
                Proxy Plug-ins
                Reference Plug-ins
                Proxy vs. Reference Plug-ins
        Plug-in Versioning

4. Additional Resources

OSMF User Group Presentation Tomorrow

I’ll be presenting to the OSMF user group tomorrow (Wednesday, May 19th) at noon PST about the latest OSMF features, as well as what we’re considering for OSMF 1.5.  We’ll definitely be interested in feedback on the latter.  Hope you can make it!

http://groups.adobe.com/index.cfm?event=post.display&postid=19525

Brian Riggs – OSMF Team

OSMF FC1 released

The feature complete (FC) build of OSMF is now available on osmf.org.  With 1.0 right around the corner, our primary focus has been quality, with over 130 bugs fixed since the last release.  Here are the highlights from the current release:

  • Player Size Reductions – We’ve optimized OSMF for player size, with the minimum player coming in at 35KB.  Lots more details on our player size summary page.
  • Plugin Loading Changes – We’ve made some subtle but important changes to how dynamic plugins are loaded.  Plugin developers should take a close look at the details in the release notes to see if the changes will impact their plugins.
  • MediaPlayerSprite – We heard your feedback, and brought back a streamlined, improved version of this useful display class.  For those just getting started with OSMF, MediaPlayerSprite (as seen in our HelloWorld example) is the easiest-to-use introduction class.
  • Ad & Recommendation Examples – The ExamplePlayer sample app boasts two new examples, demonstrating how to implement a recommendations bumper and how to implement an overlay countdown timer with pre-, post-, or mid-roll ads.  There are now 50 examples in ExamplePlayer, demonstrating a wide variety of OSMF features.

And here are the links:

OSMF v0.93 available

We’re very excited to announce that the Sprint 10 drop, OSMF version 0.93, is now available. And we’re even more excited to announce that our APIs are locked down for 1.0 (more on that below).  If you’ve been waiting for the dust to settle before diving into OSMF, then you should take a look at the v0.93 release.

Although this release has primarily been about API lockdown and stability, there are a few new features worth calling out:

  • DVR Support.  OSMF now supports DVR via Flash Media Server’s server-side DVRCast application.  This feature allows clients to jump back in time while viewing a live stream.
  • RSS + MRSS.  Via a new AS3 library, OSMF can now parse all of the major feed formats, plus the media-centric RSS extensions.
  • Extended HTTP Streaming Support.  OSMF now supports subclips for HTTP-streamed content, and DRM for HTTP-streamed content.

On the API front, we’ve spent the past six weeks reviewing, debating, and refining the public APIs.  We applied a critical eye towards the APIs, renaming those that were not immediately clear and eliminating those for which not enough use cases existed.  Moving forward, we’re setting an extremely high bar for accepting additional API changes prior to 1.0.  See the API Lockdown Status page for details on this process, and on the API changes themselves.  (Note that this API lockdown applies to the framework-level APIs, not to the APIs for individual libraries or plugins, such as VAST or SMIL.)

And now, to the links:

OSMF v0.9 released

The ZIP file for the latest OSMF release is live!  Here’s a high-level summary of the latest changes:

  • OSMF Sample Player with Chrome.  We’ve heard your feedback about the need for a configurable, embeddable sample player.  The new OSMFPlayer supports all OSMF media types, and comes with a nice-looking, dynamic control bar. If you play a progressive video, you’ll have basic playback controls.  If you play a dynamic streaming video, you’ll have an additional set of controls (for monitoring or switching streams).  The sample player uses the new ChromeLibrary, a reference implementation for how to create UI controls with OSMF.  You can see the player in action here.
  • SMIL Support.  A new SMIL plugin allows you to create MediaElements out of SMIL documents, and play them back.  Supported SMIL features include dynamic streaming (via the <switch> tag), parallel and sequential media, and the core media types (video, audio, and image).
  • HTTP Streaming Support.  We’ve added support for HTTP streaming in the latest OSMF release.  If you’d like to test this new functionality, please sign up for our pre-release at: https://www.adobe.com/cfusion/mmform/index.cfm?name=prerelease_interest.  (Note that the prerelease enrollment form at this link won’t be available until approximately 2/12.)
  • Enhanced Plugin Support.  We’ve added a new plugin type (CREATE_ON_LOAD) to allow a plugin’s MediaElement to be created as soon as the plugin is loaded.  This is useful for plugins that monitor the status of other media (e.g. for analytics & reporting).
  • API Lockdown Progress. We continue to make progress on our lockdown of APIs, and we expect the lockdown to be largely complete in the next release at the end of sprint 10. Details of API changes in sprint 9 are included in the release notes.  We have been aggressively working with an internal review board to review the public APIs, and have made sufficient progress now that we consider the API to be functionally complete.  The review process is likely to go on for at least another month, during which the primary focus will be on ensuring that terminology and conventions are consistent with other Flash Platform APIs. 

      For information on overall status of the API lockdown effort, please see our API Lockdown Status wiki page.

Please note:  While we understand that using an API that isn’t fully frozen yet carries some risk of additional refactoring work later, we believe that most of that refactoring will amount to renaming of methods/classes/packages rather than fundamental changes to the workflow, and we encourage you to start developing real world applications based on this build. Your feedback in the next month will really help us ensure that we enable your unique use cases before we truly freeze the API for our final release.

And here are the links:

OSMF v0.8 now available

We’ve posted the ZIP file for the latest stable OSMF release.  New features include:

  • Live Stream Support.  Player developers can now specify the stream type (live vs. recorded vs. either) when connecting to FMS.
  • Subclip Support. This feature allows for playback of subclips, or contiguous sub-sections, of a streamed video, which lays the groundwork for presenting mid-roll ads.
  • Captioning Plugin.  We’ve implemented a new plugin that can handle captioning for OSMF-based players, using the W3C’s DFXP timed text format.
  • Flash Media Manifest Support.  Adobe is proposing a new XML format for representing a Flash Media asset (including MBR information) and its related metadata.  The latest drop supports this through the F4MLoader class.
  • Preassigned Durations.  You can now specify a default duration (via the defaultDuration property) on VideoElement and AudioElement, so that the media can reflect its duration before playback.
  • CompositeElement support for bytesLoaded and bytesTotal.  In a previous sprint, we added support for tracking download progress for a single VideoElement.  Now, you can track the download progress when your video (or other downloadable media) is part of a composition.
  • Improved Metadata Merging Support.  CompositeElements now reflect the metadata of their children.  The Metadata API exposes finer-grained control over how a child’s metadata will surface on its parent.
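
As a quick illustration of the Preassigned Durations feature above (a sketch, not shipping code; the stream URL is a placeholder), presetting a duration looks like this:

```actionscript
// Create a video element for a progressive stream (placeholder URL).
var videoElement:VideoElement = new VideoElement
	( new NetLoader()
	, new URLResource(new URL("http://example.com/video.flv"))
	);

// Seconds; lets the media reflect a duration before playback begins.
videoElement.defaultDuration = 30;
```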

In addition, we’ve made great progress in refactoring and renaming our core APIs to address customer feedback, and to be more consistent with other APIs in the Flash Platform.  In the short term, this means that virtually every player and plugin built on top of previous versions of OSMF will need some code changes to integrate with the new APIs.  (The release notes contain a summary of these changes, as well as some tables with the old and new names, which should be helpful when updating your app.)  In the long term, this moves us one step closer to solidifying our APIs and our 1.0 release.

And now the links:

Cue Point Support in OSMF


Guest post from Charles Newman of Akamai, who designed and implemented OSMF’s new cue point feature.


OSMF v0.7 includes new functionality allowing you to create, inspect, and react to temporal metadata, either embedded in the media at encoding time or added to the media element at run time.

Since cue points are essentially temporal metadata, we decided to provide a generic solution for temporal metadata, rather than limit ourselves (and you) to cue points for video elements.  Therefore, temporal metadata can be applied to any media element in OSMF; you are not limited to cue points on video elements.

Types of Cue Points

Cue points come in three flavors:

  • Event: Meant to trigger some specific action when the player reaches the cue point, such as displaying a caption or controlling an animation.
  • Navigation: Allows seeking to a specific point in the media, such as a chapter or a sequence.  The encoding software creates a key frame at the position of the cue point.
  • ActionScript: External cue points created at run time; requires code to watch for these cue points.

Event and Navigation cue points are added at encoding time.  ActionScript cue points are added at run time.

 

Easing Your Pain

 

This new support for temporal metadata in OSMF solves a few pain points and enables other features to be built on this core functionality, such as closed captioning.  A couple of specific pain points include:

1) F4V encodes created with a CS4 or earlier product do not fire in-stream cue point events; you need to extract the cue point information from your onMetaData() handler, create a timer to watch for cue points, then dispatch a custom event.

2) In order for your player to react to ActionScript cue points, as mentioned in 1) above, you’ve got to write some code, which may not be trivial, depending on whether you want to optimize the timer, support seeking, etc.

F4V files are H.264 encodes with an FLV wrapper.  To react to your embedded event cue points, you need to read the array of cue points in your onMetaData() handler, create a timer, write some code to watch the NetStream time, and dispatch your own event.

For ActionScript cue points, you need to do the same thing, but also make sure the cue points are sorted by time in your internal collection.

The new temporal metadata support in OSMF v0.7 handles all of this for you with a new metadata facet class called TemporalFacet.

If you are unfamiliar with metadata support in OSMF, here is a brief description (for more info, you can read the spec at http://opensource.adobe.com/wiki/display/osmf/Metadata+Support):

 

  • Metadata can be added to a media element or a media resource.
  • All metadata is organized by namespaces, guaranteeing uniqueness and allowing several different types of metadata to be added to a media element or its resource.
  • In addition to a namespace, a metadata instance has a facet type.
  • The facet type describes what the metadata holds.  For example, KeyValueFacet is a concrete class containing a collection of key-value pairs; it allows you to easily add key/value pairs as metadata to a media element or a media resource.

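To make the last point concrete, here is a sketch of attaching key/value metadata to a media element (the namespace URL is a placeholder, and ObjectIdentifier is assumed here to be the key-wrapper class used by KeyValueFacet in this drop; check the ASDocs for the actual signature):

```actionscript
// Attach key/value metadata to a media element under a custom namespace.
var facet:KeyValueFacet = new KeyValueFacet(new URL("http://example.com/myMetadata"));
facet.addValue(new ObjectIdentifier("title"), "A Faery's Tale");
videoElement.metadata.addFacet(facet);
```
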
The New Classes

 

Here are the new classes that implement the new temporal metadata support, along with a brief description of each:

org.osmf.metadata.TemporalIdentifier

This is the base class for temporal metadata; it defines time and duration properties.  The new CuePoint class extends this class.

org.osmf.metadata.TemporalFacetEvent

The TemporalFacet dispatches this event.  There are specific events for “position reached” and “duration reached”.

org.osmf.metadata.TemporalFacet

This class is essentially the temporal metadata manager.  It manages temporal metadata of type TemporalIdentifier associated with a media element, and dispatches events of type TemporalFacetEvent when the playhead position of the media element matches any of the time values in its collection of TemporalIdentifier objects.  Basically, this is the code you would otherwise need to write to handle F4V event cue points and ActionScript cue points in your player.

 

The TemporalFacet class has an optimized algorithm for adding and watching for the time values in its internal collection of TemporalIdentifier objects.  Here are some of the ways the algorithm is optimized:

  • Uses a binary search to insert items into its collection of TemporalIdentifier objects (sorted by time), rather than calling a sort method on each insert.  Inserting items in any order is very fast.
  • Stops the timer when the user pauses playback, and restarts the timer when the user resumes playback.
  • Optimizes the timer interval by looking ahead to the next cue point (there is no reason to keep checking every 100 milliseconds, for example, when the next cue point is 15 seconds away).
  • Keeps track of the last cue point fired so it doesn’t need to look through its entire collection of cue points.
  • If the user seeks, it reliably dispatches the correct TemporalFacetEvent.

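The binary-search insert described above can be sketched generically (this is illustrative code, not OSMF’s actual implementation):

```actionscript
// Insert a time value into an already-sorted Vector, using binary search
// to find the insertion index rather than re-sorting on every insert.
private function insertSorted(times:Vector.<Number>, time:Number):void
{
	var lo:int = 0;
	var hi:int = times.length;
	while (lo < hi)
	{
		var mid:int = (lo + hi) >> 1;
		if (times[mid] < time)
		{
			lo = mid + 1;
		}
		else
		{
			hi = mid;
		}
	}
	times.splice(lo, 0, time);
}
```
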
org.osmf.video.CuePoint

This class extends TemporalIdentifier to provide a more standard cue point model for video cue points.  It contains the properties name, type, and parameters (the parameters property returns an array of key-value pairs added at encode time or run time).

 

The New Cue Point Sample Application

The cue point sample app in the OSMF v0.7 release demonstrates the following:

  • Loads a video and populates a data grid (in the upper right of the sample app) with the embedded Event and Navigation cue points found in the onMetaData() handler.  You can sort this grid by time.  The purpose of the grid is to allow you to navigate using the Navigation cue points, and to verify that the TemporalFacet class is working correctly by showing you the events you should be receiving.
  • Clicking on a Navigation cue point in the grid takes you to that position (key frame) in the video.  This represents what could be chapters or sequences.
  • Shows the ActionScript and Event cue points (in the lower left of the sample app) as received by the player code (the events are dispatched by the TemporalFacet class at run time).
  • Allows you to add ActionScript cue points at run time (in the lower right of the sample app) and see those events being fired.  As you hit the “Add” button, you will see the ActionScript cue point added to the data grid (note you may need to click on the Time column to force a sort).  If you enter a duplicate, only the last one is retained.

How to Listen for Cue Points

 

The first step is to listen for metadata facets being added to your media element:

videoElement = new VideoElement(new NetLoader(), new URLResource(new URL(MY_STREAM)));
videoElement.metadata.addEventListener(MetadataEvent.FACET_ADD, onFacetAdd);

 

When the TemporalFacet is added to your media element, you can start listening for the TemporalFacetEvent.POSITION_REACHED event:

private function onFacetAdd(event:MetadataEvent):void
{
   var facet:TemporalFacet = event.facet as TemporalFacet;
   if (facet)
   {
      facet.addEventListener(TemporalFacetEvent.POSITION_REACHED, onCuePoint);
   }
}
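
The onCuePoint handler itself isn’t shown above; here is a hypothetical version (the event property holding the cue point is an assumption on my part; consult the TemporalFacetEvent ASDocs for the actual name):

```actionscript
private function onCuePoint(event:TemporalFacetEvent):void
{
	// NOTE: "value" is an assumed property name for the TemporalIdentifier
	// carried by the event.
	var cuePoint:CuePoint = event.value as CuePoint;
	if (cuePoint)
	{
		trace("Cue point reached: " + cuePoint.name + " at " + cuePoint.time);
	}
}
```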

 

 

How to Add Cue Points at Run-time

 

Create a new TemporalFacet with your own unique namespace, add the facet to the metadata for the video element, and then add your cue point to the facet:

_temporalFacet = new TemporalFacet(new URL(CUSTOM_NAMESPACE), videoElement);
videoElement.metadata.addFacet(_temporalFacet);

var cuePoint:CuePoint = new CuePoint(CuePointType.ACTIONSCRIPT,
                                     121, // time in seconds
                                     "my test cue point",
                                     null);
_temporalFacet.addValue(cuePoint);

 

When you add the facet to the metadata for the media element, you will get the MetadataEvent.FACET_ADD event, as shown in the previous example.  In that event handler, you can create a unique listener for your namespace, or use one event listener for cue points coming from all namespaces.

 

What’s Next

 

As I mentioned earlier, this new functionality lays the groundwork for more useful and exciting OSMF features and plugins.  Next up: closed captioning.

OSMF v0.7 available

Another month, another OSMF release!  Here are the highlights from the latest drop:

  • Cue Point Support.  OSMF now has support for all types of cue points (event, navigation, and AS).  Cue point support is built on top of our new support for temporal metadata, which can serve as the basis for defining (and responding to) metadata along the media’s timeline.  (We’ll cover this in greater detail in a separate post.)
  • Tracking Download Progress. We’ve introduced a new trait for following the download progress (in terms of bytes) of media.  The relevant properties and events are exposed on the MediaPlayer class, and can be used to represent the portion of the media which is immediately seekable (e.g. the red seek track in YouTube’s player).
  • Package Renaming + Refactorings.  You may notice that the package for the OSMF classes is now “org.osmf”.  (Yes, we finally got around to fixing that!)  As part of this renaming, we took the opportunity to do some general cleanup of code and APIs.
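
As a rough sketch of how a player might surface the new download progress trait (bytesLoaded and bytesTotal are the properties described above; the polling approach, the player setup, and the stream URL are all illustrative, and sidestep the change-event names, which may differ in this drop):

```actionscript
// Poll the MediaPlayer's download progress twice a second.
var player:MediaPlayer = new MediaPlayer();
player.element = new VideoElement
	( new NetLoader()
	, new URLResource(new URL("http://example.com/progressive.flv"))
	);

var timer:Timer = new Timer(500);
timer.addEventListener(TimerEvent.TIMER, onTimer);
timer.start();

private function onTimer(event:TimerEvent):void
{
	if (player.bytesTotal > 0)
	{
		var percent:Number = 100 * player.bytesLoaded / player.bytesTotal;
		trace("Downloaded: " + Math.round(percent) + "%");
	}
}
```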

In addition, this drop contains an initial implementation of some content protection features in support of Flash Access 2.0, Adobe’s Digital Rights Management solution for the Flash platform.  These features require a new version of the Flash Player, version 10.1, which is not yet publicly available.  For information about participating in the Flash Access private prerelease program, please contact Kelly Miller (kelmille@adobe.com).

And now to the links:

Building a HelloWorld app with OSMF

Want to see the simplest possible application you can build with OSMF?  Here is a “HelloWorld” application, which auto-plays a single video:

[SWF(width="640", height="352")]
public class HelloWorld extends Sprite
{
public function HelloWorld()
{
var sprite:MediaPlayerSprite = new MediaPlayerSprite();
addChild(sprite);

sprite.element = new VideoElement
( new NetLoader
, new URLResource(new URL(REMOTE_PROGRESSIVE))
);
}

private static const REMOTE_PROGRESSIVE:String
= "http://mediapm.edgesuite.net/strobe/content/test/AFaerysTale_sylviaApostol_640_500_short.flv";
}

Just three lines of code!  The first line creates an instance of MediaPlayerSprite, a Sprite subclass that wraps up the MediaPlayer class.  The MediaPlayer class is the client class for playing back any type of media.  Once we’ve added the MediaPlayerSprite as a child of the root Sprite, we assign a VideoElement to the MediaPlayerSprite.element property.  Because MediaPlayer’s autoPlay property defaults to true, the video begins to play back immediately.

You can see the HelloWorld application in action here.  If you run it, you’ll notice that the video appears in the upper left corner of the browser window.  Suppose you wanted to center it, and have the SWF take up all available space?  Here’s a second iteration on HelloWorld:

public class HelloWorld2 extends Sprite
{
public function HelloWorld2()
{
stage.scaleMode = StageScaleMode.NO_SCALE;
stage.align = StageAlign.TOP_LEFT;

var sprite:MediaPlayerSprite = new MediaPlayerSprite();
addChild(sprite);

// Set the Sprite's size to match that of the stage, and
// prevent the content from being scaled.
sprite.width = stage.stageWidth;
sprite.height = stage.stageHeight;
sprite.scaleMode = ScaleMode.NONE;

sprite.element = new VideoElement
( new NetLoader
, new URLResource(new URL(REMOTE_PROGRESSIVE))
);
}

private static const REMOTE_PROGRESSIVE:String
= "http://mediapm.edgesuite.net/strobe/content/test/AFaerysTale_sylviaApostol_640_500_short.flv";
}

In the first two lines, we set the Stage’s scaleMode and alignment so that the SWF takes up all available space.  In the next set of new lines, we set the width and height of the MediaPlayerSprite to match that of the stage, and then tell the MediaPlayerSprite not to scale the content.  This has the effect of placing the video directly in the middle of the window.

You can see HelloWorld2 in action here.
 
And now, one last example.  Suppose you wanted to play a sequence of media rather than a single video.  Instead of creating a VideoElement, you could create a SerialElement whose children represent the sequence of media to play:

var serialElement:SerialElement = new SerialElement();

// First child is a progressive video.
serialElement.addChild
( new VideoElement
( new NetLoader
, new URLResource(new URL(REMOTE_PROGRESSIVE))
)
);

// Second child is a SWF that shows for three seconds.
serialElement.addChild
( new TemporalProxyElement
( 3
, new SWFElement
( new SWFLoader()
, new URLResource(new URL(REMOTE_SWF))
)
)
);

// Third child is a progressive video.
serialElement.addChild
( new VideoElement
( new NetLoader
, new URLResource(new URL(REMOTE_STREAM))
)
);


sprite.element = serialElement;

You can see the third iteration of HelloWorld here.  After the first (30 second) video completes, you’ll see an orange box appear for 3 seconds, followed by another 30 second video.

Hopefully these examples shed some light on how to get started with OSMF.  The complete source code and Flex Builder project files for all three variations are checked into the OSMF public repository.