Technical Intro to OSMF

(Cross-posted from the forums.)

There’s a lot about OSMF for newcomers to digest, so I thought I’d share the thinking behind the core design of the media framework.  Let’s start with a use case:  how would you build a video experience like that of mtv.com, Hulu, or some other premium content publisher?  The first thing to acknowledge is that consumer video sites like Hulu are not just about video. They provide multimedia experiences:  video is choreographed with images (companion ads), overlays (SWFs), and even the branding of the underlying HTML page. And the experience is seldom confined to visual media alone; typically, the playback of the media happens in conjunction with CDN requests, ad server interactions, and tracking & reporting calls.  There’s a lot happening under the hood.

So the core of a media framework needs to be a) supportive of any and all media types, and b) flexible in how it can integrate backend server calls that enable and complement the media experience.

With OSMF, we’re attempting to solve this problem by defining three key classes at the heart of the framework.  First, a MediaElement represents a unified media experience.  A MediaElement might represent something as simple as a single video, or it could represent the entire choreographed experience of a web video site.  But given the dynamic nature of a site like MTV’s, if a MediaElement is going to represent the media experience as a whole, it too needs to be very dynamic.  That’s where the second key class in the framework, IMediaTrait, comes in.  A media trait represents a fundamental capability of a piece of media, without making any assumptions about the type of the media or how the media achieves that capability.  Examples of traits include IPlayable, IViewable, IAudible, and ITemporal.  A MediaElement that represents an image (ImageElement) would only contain the IViewable trait, whereas a MediaElement for a video (VideoElement) would have that trait plus IPlayable, IAudible, and ITemporal.  An audio-specific MediaElement (AudioElement) would have IPlayable, IAudible, and ITemporal, but not IViewable (since it has no visual representation).  Traits can come and go dynamically over time, which is the key to representing a unified media experience as a MediaElement.  Taken as a whole, the complete set of traits makes up the vocabulary of the system.  If you can map the behavior of a media type into this vocabulary, then it can leverage all of the functionality of the framework.
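To make the vocabulary concrete, here’s a short sketch of how client code might query a MediaElement for its traits. (The hasTrait/getTrait calls below reflect the MediaElement API of this generation of the framework; treat the specifics as illustrative rather than definitive.)

// Ask the media what it can do, not what type it is.
var video:MediaElement = new VideoElement(new VideoLoader(),new URLResource("http://www.example.com/video1.flv"));

// Traits come and go: a VideoElement typically gains ITemporal only once
// the stream has loaded far enough to know its duration, so check first.
if (video.hasTrait(MediaTraitType.TEMPORAL))
{
    var temporal:ITemporal = video.getTrait(MediaTraitType.TEMPORAL) as ITemporal;
    trace("duration: " + temporal.duration);
}

// An ImageElement, by contrast, never exposes IAudible.
var image:MediaElement = new ImageElement(new ImageLoader(),new URLResource("http://www.example.com/image1.jpg"));
trace("image is audible: " + image.hasTrait(MediaTraitType.AUDIBLE));

Note that client code never asks “is this a video?” — it asks “can this media do X?”, which is what lets new media types plug into the framework without changing the player.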

But what about non-visual media, such as integration of CDNs or tracking servers?  Here’s the key:  the trait-based vocabulary of the system applies to both visual and non-visual media.  In other words, everything that a CDN or ad server needs to do can be expressed through one or more traits.  For example, one of the traits in the framework is ILoadable, which represents the process needed to transform an input (such as a URL) into ready-to-play media — i.e. the load process.  But if a CDN plugin needs to do authentication or other custom logic, all it needs to do is map that custom logic into the ILoadable API.  Under the hood, the load process (as represented by ILoadable) can work with NetConnections and NetStreams, or with Flash’s Loader class, or with the Sound/SoundChannel API, or with custom RTMP or HTTP requests and responses.  In a sense, OSMF is taking all of the idiosyncrasies and incompatibilities of the different media-specific Flash APIs and abstracting them into a common API.
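As a rough sketch of what “mapping custom logic into the ILoadable API” might look like, here’s a hypothetical loader that folds a CDN auth token into the load process. The class and helper names (AuthenticatingVideoLoader, fetchToken) are invented for illustration; only the idea of intercepting the loader’s load step comes from the framework.

// Hypothetical sketch: a CDN plugin injects authentication into the load process.
public class AuthenticatingVideoLoader extends VideoLoader
{
    override public function load(loadable:ILoadable):void
    {
        // Custom CDN logic: decorate the resource with an auth token before
        // handing off to the standard video load machinery. (This assumes the
        // loadable's resource is still writable at this point.)
        var resource:URLResource = loadable.resource as URLResource;
        loadable.resource = new URLResource(resource.url.toString() + "?token=" + fetchToken());
        super.load(loadable);
    }

    // Invented helper: in a real plugin this would call the CDN's auth service.
    private function fetchToken():String
    {
        return "SAMPLE_TOKEN";
    }
}

Because the player only ever talks to ILoadable, it neither knows nor cares that the URL was rewritten along the way.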

Hopefully by now it’s at least somewhat clear how to represent a single piece of media.  But for complex media experiences with many moving parts, we need the third key class in the framework.  A CompositeElement is a MediaElement that represents a composition of multiple MediaElements.  The two specific examples are SerialElement, which represents a set of MediaElements that play in sequence;  and ParallelElement, which represents a set of MediaElements that play simultaneously.  These two classes allow you to build complex media experiences with many different MediaElements.  I could go on about this, but it’s probably more instructive to post a code snippet:

// Create a root-level parallel element.
var parallel:ParallelElement = new ParallelElement();

// Add a sequence of videos to the root.
var videoSequence:SerialElement = new SerialElement();
videoSequence.addChild(new VideoElement(new VideoLoader(),new URLResource("http://www.example.com/video1.flv")));
videoSequence.addChild(new VideoElement(new VideoLoader(),new URLResource("http://www.example.com/ad.flv")));
videoSequence.addChild(new VideoElement(new VideoLoader(),new URLResource("http://www.example.com/video2.flv")));
parallel.addChild(videoSequence);

// Add a sequence of rotating banners in parallel:
// - The first banner doesn't appear until five seconds have passed.
// - Each banner shows for 20 seconds.
// - There is a 15 second delay before a subsequent image shows.
var imageSequence:SerialElement = new SerialElement();
imageSequence.addChild(new TemporalProxyElement(5));
imageSequence.addChild(new TemporalProxyElement(20, new ImageElement(new ImageLoader(),new URLResource("http://www.example.com/image1.jpg"))));
imageSequence.addChild(new TemporalProxyElement(15));
imageSequence.addChild(new TemporalProxyElement(20, new ImageElement(new ImageLoader(),new URLResource("http://www.example.com/image2.jpg"))));
imageSequence.addChild(new TemporalProxyElement(15));
imageSequence.addChild(new TemporalProxyElement(20, new ImageElement(new ImageLoader(),new URLResource("http://www.example.com/image3.jpg"))));
parallel.addChild(imageSequence);

// Create a MediaPlayer and hand it the whole thing.
var player:MediaPlayer = new MediaPlayer();
player.media = parallel;

There, in about twenty lines of code, is your (basic) multimedia experience.  (Note that I haven’t covered everything that’s in the code in this post — check out the developer documentation to learn about the rest.)  Yes, we’re still a far cry from reproducing Hulu or MTV.  But this is just a warmup.  Hope you’ll stick around to see where we’re going with this, and help us get there.

12 Responses to Technical Intro to OSMF

  1. mr_binitie says:

    Thanks a lot for this little blurb – it has definitely expanded my knowledge of the framework. My experiments with the framework suggest that the server needs to be configured in a certain way to stream content to the OSMF media player. I’ve been working with the framework and Flash Media Interactive Server, and so far I’ve found that video can only be streamed via the RTMP(…) protocol if the application is configured in the manner of the VOD application. I feel this is a flaw – the player should be able to stream videos from FMIS and its variants from the folder structure specified by the application developer, and not by a client-enforced (OSMF) setup.

  2. Brian Riggs says:

    You should be able to work with any FMS folder structure. The FMSURL class takes a “useInstance” parameter that dictates whether the instance name should be expected in the URI. Here is the ASDoc entry referring to this feature:

    /**
     * Set the URL this class will work with.
     *
     * @param url The URL this class will use to provide FMS-specific information such as app name and instance name.
     * @param useInstance If true, then the second part of the URL path is considered the instance name,
     * such as rtmp://host/app/foo/bar/stream. In this case the instance name would be ‘foo’ and the stream would
     * be ‘bar/stream’.
     * If false, then the second part of the URL path is considered to be the stream name,
     * such as rtmp://host/app/foo/bar/stream. In this case there is no instance name and the stream would
     * be ‘foo/bar/stream’.
     **/

    If you use this parameter, does that resolve your issue? Or am I misunderstanding the problem?

  3. Kalpesh says:

    Thanks a lot Brian. I have a few queries about the serial element: does it work with FMS only, or can you pass any HTTP progressive download URL as a child? Is there any sample/test page where I can see this in action?

  4. Brian Riggs says:

    The SerialElement can be used with any playable/temporal MediaElement, including a VideoElement that targets an HTTP progressive URL. The Composition API spec has a bunch of examples; see here for details.

  5. Kalpesh says:

    It seems there is no such method as player.media in the latest release of OSMF.

  6. Brian Riggs says:

    MediaPlayer’s “media” property has been renamed to “source”.

  7. Kalpesh says:

    Thanks Brian – I got it working here: http://mediamelon.com/demo/strobe/test.html. I see there is a pause of a few seconds when the VideoElement changes. I was expecting that the transition from one child to another would be smoother.

  8. Brian Riggs says:

    By default, a SerialElement doesn’t load a child MediaElement until the playhead reaches that MediaElement. You can reduce this delay a bit if you preload all children up front and immediately play/pause each one (so that the NetStream is initialized and ready to go). But that approach doesn’t always work in production scenarios (e.g. if you’re tracking player events, you don’t want false positives).

    Typically a transition from video to video will be from main content to a midroll ad, so seamlessness isn’t that important. Do you have a specific use case in mind where you would want that transition to be smoother?

  9. Kalpesh says:

    Hi Brian – we are trying to implement HTTP-based dynamic bitrate switching in the Flash player. We create multiple segments of the video and play them one after another, selecting the appropriate bitrate. We already preload the next segment and the video transitions are very smooth, but the audio transition is not that smooth. You can check this link: http://www.mediamelon.com/MediaMelon/PartnerVideo.do?mediaid=8055. Please let us know if you have any suggestions to improve this.

  10. Brian Riggs says:

    Can you raise this issue on the OSMF forums? I’ve only been peripherally involved in the dynamic streaming implementation, but the developers who worked on it do patrol the forums.

  11. Buck says:

    Thanks for this rundown, Brian. I was wondering if you had a suggestion for how to gain access to the current MediaElement that is being accessed in the SerialElement? My use case: I’m extending an AudioElement so that I can put properties on a subclass with metadata about the file being played. At the moment I can’t see any way to respond to a LoadableStateChangeEvent and get a reference back to my subclass. There is a reference to the URLResource, but it seems inappropriate to subclass that and put the properties there.

  12. Brian Riggs says:

    The omission of a current item getter on SerialElement is intentional, as we don’t want clients having to write custom code based on class type (if x is SerialElement do this, else do that). I’m fairly sure there’s a way to achieve what you’re trying to do without needing to know the current item (although I’d need more detail to propose a solution). Feel free to post some specifics (including code) here or on the OSMF forums. Based on what you’ve described, one possible solution would be to set the metadata on the URLResource (which implements IMediaResource, which has a getter for Metadata). That seems like the most appropriate place for metadata about the file being played (since it’s metadata about the resource), and it would be accessible within your AudioElement subclass.