Adobe Supports OpenStand

On March 9th, at the Open Future reception at SXSW, Adobe announced support for the OpenStand initiative. Our rationale was simple: OpenStand is good for the Web, good for users, and good for Adobe. It increases innovation and openness, and allows greater participation in evolving the Internet.

The Internet is built on standards. These standards come from all sorts of organizations – some formal and supported by governments, some less formal and created by industry associations, and some driven by users who believe in collective action. OpenStand takes a simple position on these organizations: if an organization is open, transparent, and balanced, follows due process in creating its specifications, and operates by broad consensus, then the organization and its specifications are legitimate.

The approach advocated by OpenStand seems intuitively obvious: good technical standards that are accepted and implemented by the industry should be judged not on their origin, but on their utility to the industry. A poor standard with a “proper background and backing” is still a poor standard.

The Internet is ubiquitous – from mobile phones to tablets to desktops – across all form factors and all types of information and design. It is a golden age for the creative display and use of information, all driven by innovation, which is then “standardized” so users can access it and interoperate with complementary services.

Adobe has contributed significantly, and will continue to contribute, to efforts to document and test these innovative activities in conjunction with the W3C, through the Web Platform Docs (WPD) project and Test the Web Forward. We have also, along with Microsoft and Google, supported the HTML5 editor as that standard moves to completion, and we are active in WebKit as well as in formal metadata standards. All of these venues are important – and all build the ability of the market to innovate, grow, and change. That is what OpenStand is all about, and it is why Adobe has chosen to support it.

Carl Cargill
Principal Scientist

Takeaways from the 14th Annual Privacy and Security Conference

I recently presented at the 14th Annual Privacy and Security Conference held in Victoria, British Columbia. There were several things that I took away from the meeting.

The first takeaway is that although the three keynote speakers looked at security and privacy from different perspectives, they all agreed that the level of interest in these two areas is growing as governments begin to recognize that the World Wide Web crosses borders with impunity.

The second takeaway was that standards creation is largely ignored. Most of the discussion was about implementing standards and regulations, not about the act of creation. My presentation – “Whose Internet is it?” – focused on the groups that create basic Internet and telephony standards. Its intent was to convince people that they can (and should) get involved in creating the standards that drive the Web.

The final takeaway was that the distinction between standards and policy is becoming very blurred, and the implications for national governments and for commercial providers are significant. On one hand, a nation has the right (and sometimes the duty) to protect itself and its citizens; for this, there exist the traditional standardization venues. On the other hand, there is a growing realization that these traditional bodies are ill-equipped to deal with the increased pace of technological change in the Information and Communications Technology (ICT) world. Throw open source and IPR, social change, and, increasingly, mobile telecommunications into the mix, and you have a volatile combination.

This is an interesting, challenging, and confusing time for those involved. The collision of regulations, innovation, policy, technology, and a host of other factors necessarily makes the issue complicated and complex – but very relevant to how the Web will evolve. Adobe will continue to follow this issue as it unfolds, and we welcome your perspectives and comments.

Carl Cargill
Principal Scientist

 

Canvas 2D and Real-Time Media Flow Protocol (RTMFP) Standards Contributions

In the last several weeks, Adobe has made two significant announcements about standards contributions. One signaled the submission of a specification for Adobe’s Secure Real-Time Media Flow Protocol (RTMFP) to the Internet Engineering Task Force (IETF); the other was the W3C’s announcement of stable and feature-complete HTML5 and Canvas 2D specifications, which Adobe endorsed (as well as providing Rik Cabanier as a co-editor of the Canvas 2D spec).

The two announcements are joined by a common thread: In both cases, Adobe felt that the market and our customers would benefit from the technology in the specifications. In the case of RTMFP, Adobe made a direct contribution of technology, which we believe has value for developers as the Internet continues to develop new applications and solutions. RTMFP may help solve some of the more vexing problems in real-time and peer-to-peer communications. It was submitted under a royalty-free grant – meaning that Adobe does not stand to profit from the contribution.

In the case of HTML5 and Canvas 2D, Adobe made a significant royalty-free grant of technology to the HTML5 specification as well as to the associated specifications that comprise “big HTML5” (which includes all the elements associated with HTML5, from JavaScript to CSS). Along with that, Adobe (in conjunction with Microsoft and Google) is a major contributor to the W3C editors’ fund, which provides the means to hire full-time W3C editors for the HTML5.1 specification. We’re not sure how the next revision of HTML5 will shake out, but we’re reasonably certain that careful and planned releases of stable and testable technology will help the market (including our customers) achieve fuller benefits from the World Wide Web.

In both cases, Adobe is betting on the future. The technologies being offered are proven: Adobe uses RTMFP in Flash and other products, and Canvas 2D is increasingly being deployed and embraced by the market. What is different is that businesses and developers now have an available, stable specification for implementation and planning. We don’t know where the market will go – but we do know that providing a firm foundation for continued expansion makes it much easier to build for the future.
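
To make the Canvas 2D side of that bet concrete, here is a minimal sketch of the kind of drawing code the now-stable specification describes; the canvas element id is an illustrative assumption, while the API calls (getContext, fillRect, arc, fill) are those the specification defines.

```typescript
// A minimal sketch of drawing with the Canvas 2D API.
// The element id "scene" is an illustrative assumption; the API calls
// are those standardized in the Canvas 2D specification.
const canvas = document.getElementById("scene") as HTMLCanvasElement;
const ctx = canvas.getContext("2d");

if (ctx) {
  ctx.fillStyle = "#336699";            // solid fill color
  ctx.fillRect(10, 10, 120, 80);        // a filled rectangle
  ctx.beginPath();
  ctx.arc(70, 50, 25, 0, Math.PI * 2);  // a full circle inside it
  ctx.fillStyle = "#ffcc00";
  ctx.fill();
}
```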

We’re also willing to bet that the increased transparency offered by standards will help make the Internet and the Web more useful and increase the number of users and developers – and that they, in turn, will see more and more opportunities for further development and use. This grows the market and increases the utility of the Web for everyone.

Carl Cargill
Principal Scientist

Testing: The Third Pillar of Standards

Recently, a series of “Test the Web Forward” events has been scheduled to get the community involved in building test cases for important Web standards. A few months ago, I participated in “Test the Web Forward/Paris.” The next event, “Test the Web Forward/Sydney,” is scheduled for February 8th and 9th in Sydney, Australia. These events, held in various cities around the world, are open to everyone who is passionate about Web standards, and they bring together developers and standards experts.

Why is testing important? When we think about “standards,” we usually think about the two initial components: (1) specifications — written descriptions about how the standards work, and (2) implementations — software that implements the specification. A suite of test cases becomes an essential link between specifications and implementations.

When it comes to standards and standardization, what people care about is compatibility — the ability to use components from multiple sources with the expectation that those components will work together. This connection holds for all kinds of software standards, whether Application Programming Interfaces (APIs), rules for communicating over the network (protocols), computer languages, or the smaller component pieces (protocol elements) used by any of those.

On the Web, the APIs are frequently JavaScript, the protocol is often HTTP, and the languages include HTML, CSS, and JavaScript. URLs, host names, encoding labels, and MIME types are protocol elements.
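
To make the idea of shared test cases concrete, here is a minimal sketch written in the style of the W3C’s testharness.js framework; the host-name assertion is an illustrative example, not one taken from an actual test suite.

```typescript
// A sketch of a testharness.js-style test case. These declarations stand in
// for the globals the real framework provides when loaded in a test page.
declare function test(fn: () => void, name: string): void;
declare function assert_equals(actual: unknown, expected: unknown, description?: string): void;

// Illustrative check: every conforming browser should normalize the host
// name of a URL (a "protocol element") the same way.
test(() => {
  const link = document.createElement("a");
  link.href = "HTTP://Example.COM/path";
  assert_equals(link.hostname, "example.com", "host names are lower-cased");
}, "URL host name normalization");
```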

The “Create the Web” tour demonstrated the relationship between specification and implementation. “Test the Web Forward” brings in test cases to ensure that the promise of compatibility isn’t empty. Building the global information infrastructure requires a focus not only on new developments, but on compatibility, reliability, performance, and security. The challenge of testing is that the technology is complex, the specifications are new, and the testing needed is extensive.

I encourage everyone who is passionate about the Web and Web standards to attend the “Test the Web Forward” event in Sydney or other related events. Get involved and help make the Web a more interoperable place.

Larry Masinter
Principal Scientist

Adobe’s Secure Real-Time Media Flow Protocol

Today Michael Thornburgh, Sr. Computer Scientist at Adobe, submitted a specification for Adobe’s Secure Real-Time Media Flow Protocol (RTMFP) to the Internet Engineering Task Force (IETF) Internet-Drafts repository, along with a disclosure of, and grant of, essential intellectual property rights under royalty-free terms.

RTMFP is a low-level endpoint-to-endpoint data transport protocol designed for real-time and peer-to-peer (P2P) communication. It presents a unified and holistic solution to a number of Internet communication problems, including security, NAT traversal, congestion control and prioritization of parallel media and data flows, variable transmission reliability for messages, and IP address mobility.

RTMFP is the foundation protocol that we use to bring P2P and multicast capabilities to Adobe products such as Adobe Flash and Adobe Media Server, but we believe it has applicability beyond Flash, and that the Internet community can use it in its own innovations for next-generation real-time and P2P applications. Adobe continues to develop technologies using RTMFP within Adobe Flash and Adobe Media Server, including features like Multicast and Groups, which our customers use today to deliver high-quality video experiences across public and corporate networks.

We are excited to continue making contributions to standards organizations such as the IETF that further Internet technologies for developers and users. As a technology leader, Adobe collaborates with stakeholders from industry, academia and government to develop, drive and support standards in existing and emerging technologies, policy areas, and markets, in order to improve our customers’ experience.

We welcome comments and feedback to help us improve the quality, clarity, and accuracy of this specification and we are excited to see what the Internet community creates with it.

Kevin Towes
Sr. Product Manager

(reposted from: Kevin Towes on Online Video at Adobe blog)

Congressman Jack Brooks’ Legacy to the World of Standards

In a brief footnote to history, I note with sadness that Congressman Jack Brooks has died. He had a long and interesting career as a Congressman from Texas and in the process shaped the face of standards and standardization as we now know it. He is mourned by his colleagues and friends.

I never met Jack Brooks, but I owe him – and his ideas – a great deal, and probably my career and interest in standardization. Jack Brooks was the author of Public Law 89-306 (the Brooks Act), dated October 30, 1965 (H.R. 4845), which established the rules and requirements for buying “…any equipment or interconnected system or subsystems of equipment that is used in the automatic acquisition, storage, manipulation, management, movement, control, display, switching, interchange, transmission, or reception of data or information…”

To make sure that what the government bought was “interconnectable” (since interoperability was still just a pipe dream), the Act further required that the National Bureau of Standards promulgate standards and guidelines “necessary to improve the efficiency of operation or security and privacy of Federal computer systems.”

This Act made standards and standardization a central feature of Federal procurement from 1965 on. Not only did prime contractors have to meet these standards, but so did all the participants down the supply chain. This was significant, since the Federal government was the largest single purchaser of information technology in the world. From NASA to the Social Security Administration, from the Department of Education (another Brooks accomplishment) to the National Weather Service, systems began to become “interconnectable.” Proprietary hardware (for that was the emphasis at the time) slowly gave way to interconnected and “plug compatible” systems.

The development of a whole theory of business rationale and strategic planning for standards was a “green field” area, and a rather challenging one – and it ultimately led to where we are today, with a vast and complex set of interrelations between trade, business, politics, economics, jurisprudence, and social planning.

I doubt that Jack Brooks saw this far, or anticipated the extent to which his Act would change the face of computing. He was a hard-headed realist who wanted to save the government money. But by driving standards into the procurement of government systems, Jack Brooks changed the face of the IT industry by making technical standards necessary. And when that happened, all of the ancillary business activity – from legal basis to strategic implications to marketing to social use – followed in its wake. This is one of Jack Brooks’ unsung but tremendously powerful legacies.

Carl Cargill
Principal Scientist

 

Who’s Making the Rules About the Internet?

Governance – who makes the policies and rules (but NOT the technology) for how the Internet runs – is one of the “invisible infrastructure” activities that just happen and keep the Internet from failing. With the explosive growth of the Web and its importance to the world economy (coupled with the fact that the Internet is global and doesn’t recognize national borders), the last decade has seen governments and policy makers look more closely at who actually makes the rules that run the Internet, and wonder whether there isn’t a better way. Taxation, censorship, privacy, intellectual property, libel, economic stability, and financial dealings are all aspects of Internet governance the world is coming to recognize. And governments are loath to grant U.S.-based Non-Governmental Organizations (NGOs) (such as ICANN, ISOC, and the IETF[1]) the right to make the fundamental rules that impact these sovereign rights.

Part of the reason for this is the importance of the Internet to the world’s economic well-being. The global (and globalized) economy depends in large part on the instantaneous communications afforded by the Internet, which are now reaching ever-broader audiences.

However, the major impact of the Internet is on the governance of nations – not just of individuals. The “Arab Spring” movement showed the power of the Internet to galvanize public reaction, which can be a worrisome thing to a government trying to maintain economic or political stability. WikiLeaks likewise illustrated the power of the Internet to disseminate information unfavorable to governments, with real consequences for foreign policy, and malware (e.g. Stuxnet) has become a tool for both industrial and military espionage and sabotage.

But in the international geopolitical arena, governance has expanded to mean more: It also means that all nations have an equal say and stake in the creation of the rules and deployment of the Internet. Note here that the term is “nations” – not individuals – because how the Internet is governed can have a tremendous impact on a nation’s ability to pursue a national growth and governance strategy.

One of the countries most concerned about this area is China, where it has been noted that the non-regulated nature of the Internet, and what could be viewed as favoritism toward the developed countries, poses a large, long-term problem for developing countries. On September 18, 2012, the Chinese government hosted an “emerging nations Internet roundtable,” where issues of Internet governance, content management, and cyber security were discussed. The governments of Russia, Brazil, India, and South Africa all participated, looking together at Internet governance so that the needs of developing nations would be taken into consideration.

Following the meeting, a statement was released saying that the participating governments would continue to meet on a regular basis and that consensus had been reached on the following four major topics:

1. Internet governance must become a governmental mandate, and the impact of social networks on society (good, bad, and otherwise) is of particular concern.

2. Content filtering needs increased regulation and legislation to both protect and promote the interests of developing and emerging nations.

3. The whole area of cyber-security needs increased transnational cooperation and coordination – but this must be balanced with the economic needs of emerging states.

4. Emerging and developing nations offer the greatest opportunity for Internet growth, and these nations must take responsibility for managing this growth.

This conference clearly delineates the debate between those who seek an unfettered Internet (open and based in information technology) and those who would like a more regulated model (in the style of the regulated telecommunications industry).

The issue of who controls a tremendously powerful communications force is, of course, a matter of high interest to nearly everyone. But the essential point is that this policy and governance battle is being fought in the standards arena – that is, over who has the right and duty to make the rules about the standards that will drive the next generation of innovation in the massively connected world. Currently, the International Telecommunication Union (ITU)[2] is proposing to assume an increased role in making rules for the Internet, with support from many of the G-30 nations. ISOC, the IETF, the W3C, and the IEEE are responding with the OpenStand (http://open-stand.org) initiative. Europe is moving to recognize consortia specifications – and the national standards bodies (with the implicit support of their governments) are trying to slow and limit this change. We will see this same type of standards-based activity in policy decisions on privacy, security, accessibility, and more. As the world becomes more and more highly interconnected, control of who creates and who mandates standards – their creation, implementation, testing, and IPR status – will become a major issue in national and international policy. That is the lesson being learned from the Internet governance discussions.

[1] Internet Corporation for Assigned Names and Numbers (ICANN); Internet Society (ISOC); Internet Engineering Task Force (IETF)

[2] The ITU is a specialized U.N. treaty organization responsible for information and communications technology. It creates standards to regulate telecommunications.

Understanding the Updated EU Standardization Regulation

 

After many months of negotiations, the European Union (EU) in September 2012 published its new Standardisation Regulation, which aims to address some of the major systemic challenges in European Commission (EC) standardization.

By creating a mechanism whereby EU legislation can reference, for the first time, standards created in “informal” (i.e. not Commission-sponsored) fora or consortia, the new regulation should address the problems of speed and flexibility that have become apparent, particularly in the always-evolving Information and Communications Technology (ICT) space. At the same time, by ensuring funding for a broad range of organizations that might otherwise lack the means to participate in standardization processes – small and medium-sized enterprises (SMEs), consumer groups, and other civil society representatives – the regulation aims to ensure a balanced representation of interests. There are also a number of other interesting features of the regulation worth examining.

The regulation allows public procurement processes in the EU to reference technical specifications developed by bodies other than the three formal European Standards Bodies (ESBs)[1], where no EU standard exists. Strict criteria in Annex II of the regulation define the type of body whose technical specifications are eligible for adoption in this way. The strong reference to FRAND (fair, reasonable, and non-discriminatory) terms is significant, given the many years of wrangling over the lacklustre support for that concept in the Commission’s own IT procurement processes. It is an important recognition by the Commission of the value of intellectual property rights (IPR) produced in Europe, and that inventors deserve a chance to monetize their creations.

That said, any technical specification produced by a “non-official standards organization” hoping to be referenced in an EU public procurement tender will need to run the gauntlet of a new “Multi-Stakeholder Forum” (MSF) comprising 67 representatives from national governments, trade associations, and assorted industry representatives. It is still too early to tell how this consultative body will work in practice, but observers will be keen to see how the group reaches consensus on whether a technical specification meets the criteria for adoption and should be endorsed by EC legislation. Adopting standards that are actually used in the real world, and whose completeness is evidenced by multiple independent implementations, is a valuable objective, and one that is already part of the process in many standards development organizations.

Thanks to another change, technical specifications referenced in public procurement can refer to the expected interoperability or environmental performance of a product or service. This is in line with planned changes to the EU Public Procurement Directives, but any change to existing criteria will take time for industry to fully understand, particularly where subjective and politically sensitive terms like “interoperability” are concerned.

Another less-remarked development is the more formal advisory role accorded to the EU’s scientific research bodies, to ensure that the standards developed by the ESBs take into account “economic competitiveness… and safety and security concerns.” In an ideal world, scientific advice is, of course, objective and neutral. When linked to standards used to determine market access, that objectivity is even more critical and, potentially, more elusive. Developments in the Cloud Computing space are likely to be an early test of the EU’s ability to adopt a truly global approach.

[1] There are three European Standards Bodies: CEN (European Committee for Standardisation), CENELEC (European Committee for Electrotechnical Standardization), and ETSI (European Telecommunications Standards Institute). These three bodies are Brussels-based organizations which the European Union has recognized as the creators of European standards. As intended, CEN was to be an ISO analogue, CENELEC an IEC analogue, and ETSI an ITU-T analogue.

John Jolliffe
Senior Manager, European Government Affairs

W3C Web Platforms Docs: A Standards Perspective

Recently, Adobe, along with many others in the community, initiated a major effort to build a common suite of developer-oriented documentation of the Open Web Platform, a community project sponsored by the World Wide Web Consortium (W3C).

One of the problems with standards is that they are generally meant more for implementors than for users of the standards (often by design). Those who actually write the standards, and work on the committees that create them, know that they are fragile interface descriptions – and this fragility is what requires such care in their crafting.

Standards specifications are necessarily quite detailed in order to genuinely promote interoperability and ensure things work the same way everywhere. And this is where things get sticky: implementations are based on the standard or the specification, and all standards (well, nearly all) are written in natural language by people who are usually specialists in the technology, as are many of the people who write the developer-oriented documentation.

What’s exciting here is that the Web Platform Docs (WPD) effort is really targeted at the user community to help document the standards in a way that is useful to that community.

But a standard really only gains value when it is implemented and widely deployed. And this is why WPD is so innovative: WPD is about the use and deployment of the standard. It has tutorials on how to use a feature; it has examples of uses. This is the kind of material that a W3C working group values but does not have time to create, and that vendors provide to get their implementations used.

The importance of the site, from a standards point of view, is that it helps build an informed user base. Not at all a simple task.

The Web is evolving – and in its evolution, it is forcing others to change as well. Ten years ago, this type of common activity, open to all (for both contributions and information), would have been, if not unthinkable, at least foreign. With this announcement, the contributors and the W3C have (hopefully) begun to change the way standards are seen – toward an easier and kinder environment. And this is a good thing.

For an Adobe developer’s view, see: What’s one of the biggest things missing from the Web?

Leading the Web Forward: Adobe’s “Create the Web” Event and Open Standards

I recently attended Adobe’s “Create the Web” event in San Francisco on September 24, 2012. One of the things that struck me was the role standards are playing in the tools and technologies announced at that event. Adobe is increasingly delivering standards-based tools to simplify the creation of imaginative content for the Web, as well as contributing technology to the continuing development of the Open Web Platform standards within the World Wide Web Consortium (W3C).
 
Adobe has for many years been one of the primary vendors of tools for creating visual content. Our customers look to us to help them create innovative and effective presentations and content: graphically, textually, and interactively. Originally, these tools involved display vehicles created by Adobe, but increasingly the tools Adobe provides are moving to standards-based platforms such as the Open Web Platform. For example, the recently announced Edge Animate tool makes creating animations with HTML5, CSS, and JavaScript much more natural; a user interacts with a graphical display of the objects being animated, and the tool helps the user write the “code” that places those objects on the user’s Web page.
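
For a sense of what such generated code does, here is a minimal hand-written sketch of a JavaScript-driven animation of the sort a tool like Edge Animate produces for you; the element id is an illustrative assumption, while requestAnimationFrame is the standard animation-loop API.

```typescript
// A hand-written sketch of the kind of JavaScript-driven animation an
// authoring tool generates. The element id "logo" is an illustrative
// assumption; requestAnimationFrame is the standard animation-loop API.
const el = document.getElementById("logo");
const start = performance.now();

function slide(now: number): void {
  const t = Math.min((now - start) / 1000, 1);        // progress over 1s
  if (el) {
    el.style.transform = `translateX(${t * 200}px)`;  // slide 200px right
  }
  if (t < 1) {
    requestAnimationFrame(slide);                      // schedule next frame
  }
}

requestAnimationFrame(slide);
```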
 
As the Web platform standards have become available on mobile as well as desktop devices, creating presentations that scale across these devices has become more challenging. The Edge Reflow tool helps create presentations that adapt the way the same content is displayed on devices of different sizes, using a Cascading Style Sheet (CSS) feature called media queries. PhoneGap Build then allows an author to take a Web platform-based application and package it as a native application that can run on a number of mobile device operating systems.
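
To give a flavor of the media-query mechanism these tools build on, here is a minimal sketch that watches a breakpoint from script using the standard matchMedia API; the 600-pixel breakpoint and the class name are illustrative assumptions.

```typescript
// A sketch of the CSS media-query mechanism, observed from script via
// window.matchMedia. The 600px breakpoint and the "narrow-layout" class
// name are illustrative assumptions.
const narrow = window.matchMedia("(max-width: 600px)");

function applyLayout(mq: { matches: boolean }): void {
  // Toggle a layout class as the viewport crosses the breakpoint.
  document.body.classList.toggle("narrow-layout", mq.matches);
}

applyLayout(narrow);              // apply the initial layout
narrow.addListener(applyLayout);  // re-apply when the viewport changes
```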
 
But the Web of today still lacks many of the features Adobe customers have grown to appreciate and use. For that reason, Adobe is very active in extending the Web standards to include those features. In the area of presentation layout, Adobe has submitted proposals to allow a presentation to be constructed from multiple flows of material, and to have objects on the page exclude other objects or text to achieve layout effects commonly seen in magazines. In the area of graphics, Adobe is helping to standardize the technologies used to create filters that add pizzazz to presentations and to allow various elements to be overlaid transparently. These efforts are accompanied by open-source demonstration implementations that help vendors supporting the Open Web Platform understand the value of, and possibilities around, the features being contributed. Adobe is in active partnerships developing these features to lead the Web forward.
 
Adobe is making a strong statement in support of the Open Web Platform standards. We are developing tools that make it easier to produce content for the Open Web, and we are working to extend that standard to better meet the needs of Adobe customers. Thus, standards are significant in the ways Adobe helps creative professionals, publishers, developers and businesses create, publish, promote and monetize their content anywhere.