Testing: The Third Pillar of Standards

Recently, a series of “Test the Web Forward” events has been organized to get the community involved in building test cases for important Web standards. A few months ago, I participated in the Paris event; the next one, “Test the Web Forward/Sydney,” is scheduled for February 8th and 9th in Sydney, Australia. These events, held in various cities around the world, are open to everyone who is passionate about Web standards, and bring together developers and standards experts.

Why is testing important? When we think about “standards,” we usually think about the first two components: (1) specifications — written descriptions of how the standard works, and (2) implementations — software that implements the specification. A suite of test cases is the third pillar: the essential link between specifications and implementations.

When it comes to standards and standardization, what people care about is compatibility — the ability to use components from multiple sources with the expectation that those components will work together. This holds for all kinds of software standards, whether Application Programming Interfaces (APIs), rules for communicating over the network (protocols), computer languages, or the smaller component pieces (protocol elements) used by any of those.

On the Web, the APIs are frequently JavaScript, the protocol is often HTTP, and the languages include HTML, CSS, and JavaScript. URLs, host names, encoding labels, and MIME types are protocol elements.
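
To make those pieces concrete, here is a minimal sketch of where each element shows up when a page is fetched from JavaScript (the address and values are hypothetical examples, not drawn from any particular test suite):

```javascript
// Sketch: the standards "pieces" named above, as seen from a Web page.
// The address below is a hypothetical example.
var url = new URL("https://www.example.com/index.html?lang=en");
console.log(url.protocol);  // "https:"          - the protocol in use
console.log(url.hostname);  // "www.example.com" - a host name
console.log(url.pathname);  // "/index.html"

// Fetching the URL over HTTP from a JavaScript API; the response carries
// a MIME type and an encoding label in the Content-Type header.
fetch(url).then(function (response) {
  console.log(response.headers.get("Content-Type"));
  // e.g. "text/html; charset=utf-8"
});
```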

The “Create the Web” tour demonstrated the relationship between specification and implementation. “Test the Web Forward” brings in test cases to ensure that the promise of compatibility isn’t empty. Building the global information infrastructure requires a focus not only on new developments, but on compatibility, reliability, performance, and security. The challenge of testing is that the technology is complex, the specifications are new, and the testing needed is extensive.
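
As a sketch of what such a test case can look like in practice, here is a minimal example in the style of testharness.js, the framework commonly used for W3C test suites (the specific assertions are illustrative, not taken from any real suite):

```javascript
// Minimal web-platform tests, assuming testharness.js is loaded in the page.
// Each test checks one specification requirement against the implementation.

test(function () {
  // The DOM spec requires null when no element has the given id.
  assert_equals(document.getElementById("no-such-id"), null,
                "getElementById returns null for an unknown id");
}, "document.getElementById with an unknown id");

test(function () {
  // In HTML documents, createElement lower-cases ASCII tag names.
  var el = document.createElement("DIV");
  assert_equals(el.localName, "div",
                "createElement lower-cases the tag name");
}, "document.createElement normalizes case");
```

Running many implementations against the same small, precise checks like these is how a test suite turns a specification’s prose into verifiable compatibility.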

I encourage everyone who is passionate about the Web and Web standards to attend the “Test the Web Forward” event in Sydney or other related events. Get involved and help make the Web a more interoperable place.

Larry Masinter
Principal Scientist

Adobe’s Secure Real-Time Media Flow Protocol

Today Michael Thornburgh, Sr. Computer Scientist at Adobe, submitted a specification for Adobe’s Secure Real-Time Media Flow Protocol (RTMFP) to the Internet Engineering Task Force (IETF) Internet-Drafts repository, along with a disclosure of, and grant of, essential intellectual property rights under royalty-free terms.

RTMFP is a low-level endpoint-to-endpoint data transport protocol designed for real-time and peer-to-peer (P2P) communication. It presents a unified and holistic solution to a number of Internet communication problems, including security, NAT traversal, congestion control and prioritization of parallel media and data flows, variable transmission reliability for messages, and IP address mobility.

RTMFP is the foundation protocol that we use to bring P2P and multicast capabilities to Adobe products such as Adobe Flash and Adobe Media Server, but we believe it has applicability beyond Flash, and that the Internet community can use it in its own innovations for next-generation real-time and P2P applications. Adobe continues to develop technologies using RTMFP within Adobe Flash and Adobe Media Server, including features like multicast and groups, which our customers use today to deliver high-quality video experiences across public and corporate networks.

We are excited to continue making contributions to standards organizations such as the IETF that further Internet technologies for developers and users. As a technology leader, Adobe collaborates with stakeholders from industry, academia and government to develop, drive and support standards in existing and emerging technologies, policy areas, and markets, in order to improve our customers’ experience.

We welcome comments and feedback to help us improve the quality, clarity, and accuracy of this specification, and we are excited to see what the Internet community creates with it.

Kevin Towes
Sr. Product Manager

(reposted from: Kevin Towes on Online Video at Adobe blog)

Congressman Jack Brooks’ Legacy to the World of Standards

In a brief footnote to history, I note with sadness that Congressman Jack Brooks has died. He had a long and interesting career as a Congressman from Texas and in the process shaped the face of standards and standardization as we now know it. He is mourned by his colleagues and friends.

I never met Jack Brooks, but I owe him – and his ideas – a great deal, and probably my career and interest in standardization. Jack Brooks was the author of Public Law 89-306 (the Brooks Act), dated October 30, 1965, H.R. 4845, which established the rules and requirements for buying “…any equipment or interconnected system or subsystems of equipment that is used in the automatic acquisition, storage, manipulation, management, movement, control, display, switching, interchange, transmission, or reception of data or information…”

To make sure that what the government bought was “interconnectable” (since interoperable was still just a pipe dream), the Act further required that the National Bureau of Standards promulgate standards and guidelines “necessary to improve the efficiency of operation or security and privacy of Federal computer systems.”

This Act made standards and standardization a central feature of Federal procurement from 1965 on. Not only did prime contractors have to meet these standards, but so did all the participants down the supply chain. This was significant, since the Federal government was the largest single purchaser of information technology in the world. From NASA to the Social Security Administration, from the Department of Education (another Brooks accomplishment) to the National Weather Service, systems began to become “interconnectable.” Proprietary hardware (for that was the emphasis at the time) slowly gave way to interconnected and “plug compatible” systems.

Developing a whole theory of the business rationale and strategic planning necessary for standards was a “green field” undertaking, and rather challenging – and it ultimately led to where we are today, with a vast and complex set of interrelations among trade, business, politics, economics, jurisprudence, and social planning.

I doubt that Jack Brooks saw this far or anticipated the extent to which his Act would change the face of computing. He was a hard-headed realist who wanted to save the government money. But by driving standards into the procurement of Government systems, Jack Brooks changed the face of the IT industry by making technical standards necessary. And when that happened, all of the ancillary business activity, from legal basis to strategic implications to marketing to social use, followed in its wake. This is one of Jack Brooks’ unsung but tremendously powerful legacies.

Carl Cargill
Principal Scientist


Who’s Making the Rules About the Internet?

Governance – who makes the policies and rules (but NOT the technology) on how the Internet runs – is one of the “invisible infrastructure” activities that just happen and keep the Internet from failing. With the explosive growth of the Web and its importance to the world economy (coupled with the fact that the Internet is global and doesn’t recognize national borders), the last decade has seen governments and policy makers look more closely at who actually makes the rules that run the Internet, and wonder if perhaps there isn’t a better way. Taxation, censorship, privacy, intellectual property, libel, economic stability, and financial dealings are all aspects of Internet governance the world is coming to recognize. And governments are loath to grant U.S.-based Non-Governmental Organizations (NGOs) (such as ICANN, ISOC, and the IETF[1]) the right to make the fundamental rules that impact these sovereign interests.

Part of the reason for this is the importance of the Internet to the world’s economic well-being. The global (and globalized) economy depends in large part on the instantaneous communications afforded by the Internet, which are now reaching ever-broader audiences.

However, the major impact of the Internet is on the governance of nations – not just of individuals. The “Arab Spring” movement showed the power of the Internet to galvanize public reaction, which can be a worrisome thing to a government trying to maintain economic or political stability. Wikileaks likewise illustrated the power of the Internet to disseminate information unfavorable to governments, with real impact on foreign policy, and malware (e.g., Stuxnet) has become a tool for both industrial and military espionage and sabotage.

But in the international geopolitical arena, governance has expanded to mean more: It also means that all nations have an equal say and stake in the creation of the rules and deployment of the Internet. Note here that the term is “nations” – not individuals – because how the Internet is governed can have a tremendous impact on a nation’s ability to pursue a national growth and governance strategy.

One of the countries most concerned about this area is China, which has noted that the unregulated nature of the Internet, and what could be viewed as favoritism toward developed countries, poses a long-term and large problem for developing countries. On September 18, 2012, the Chinese government hosted an “emerging nations Internet roundtable,” where issues of Internet governance, content management, and cyber security were discussed. The governments of Russia, Brazil, India, and South Africa all participated, looking together at Internet governance so that the needs of developing nations are taken into consideration.

Following the meeting, a statement was released saying that the participating governments would continue to meet on a regular basis and that consensus had been reached on the following four major topics:

1. Internet governance must become a governmental mandate, and the impact of social networks on society (good, bad, and otherwise) is of particular concern.

2. Content filtering needs increased regulation and legislation to both protect and promote the interests of developing and emerging nations.

3. The whole area of cyber-security needs increased transnational cooperation and coordination – but this must be balanced with the economic needs of emerging states.

4. Emerging and developing nations offer the greatest opportunity for Internet growth, and these nations must take responsibility for managing this growth.

This conference clearly delineates the debate between those who seek an unfettered Internet (open, and rooted in the information technology industry) and those who would like a more regulated model (in the style of the regulated telecommunications industry).

The issue of who controls a tremendously powerful communications medium is, of course, a matter of high interest to nearly everyone. But the essential point is that this policy and governance battle is being fought in the standards arena – that is, over who has the right and duty to make the rules about the standards that will drive the next generation of innovation in the massively connected world. Currently, the International Telecommunication Union (ITU)[2] is proposing to assume an increased role in making rules for the Internet, with support from many of the G-30 nations. ISOC, the IETF, the W3C, and the IEEE are responding with the OpenStand (http://open-stand.org) initiative. Europe is moving to recognize consortia specifications – and the national standards bodies (with the implicit support of their governments) are trying to slow and limit this change. We will see the same type of standards-based activity in policy decisions on privacy, security, accessibility, and more. As the world becomes more and more highly interconnected, control of standards – their creation, implementation, testing, and IPR status – will become a major issue in national and international policy. That is the lesson being learned from the Internet governance discussions.

(1) Internet Corporation for Assigned Names and Numbers (ICANN); Internet Society (ISOC); Internet Engineering Task Force (IETF)

(2) The ITU is a specialized U.N. treaty organization responsible for information and communications technology. It creates standards to regulate telecommunications.

Understanding the Updated EU Standardization Regulation


After many months of negotiations, the European Union (EU) in September 2012 published its new Standardisation Regulation, which aims to address some of the major systemic challenges in the European Commission’s (EC) standardization system.

By creating a mechanism whereby EU legislation can reference, for the first time, standards created in “informal” (i.e., non-Commission-sponsored) fora or consortia, the new regulation should address the problems of speed and flexibility that have become apparent, particularly in the always-evolving Information and Communications Technology (ICT) space. At the same time, by ensuring funding for a broad range of organizations that might otherwise lack the means to participate in standardization processes – small and medium-sized enterprises (SMEs), consumer groups and other civil society representatives – the regulation aims to ensure a balanced representation of interests. There are, however, a number of other interesting features of the regulation worth examining.

The regulation allows public procurement processes in the EU to reference technical specifications developed by bodies other than the three formal European Standards Bodies (ESBs)[1], where no EU standard exists. Strict criteria in Annex II of the regulation define the type of body whose technical specifications are eligible for adoption in this way. The strong reference to FRAND (fair, reasonable and non-discriminatory) licensing terms is significant, given the many years of wrangling over the lacklustre support for that concept in the Commission’s own IT procurement processes. It is an important recognition by the Commission of the value of intellectual property rights (IPR) produced in Europe, and of the principle that inventors deserve a chance to monetize their creations.

That said, any technical specification produced by a “non-official standards organization” hoping to be referenced in an EU public procurement tender will need to run the gauntlet of a new “Multi-Stakeholder Forum” (MSF) comprising 67 representatives from national governments, trade associations and assorted industry bodies. It’s still too early to tell how this consultative body will work in practice, but observers will be keen to see how the group reaches consensus on whether a technical specification meets the criteria for adoption and should be endorsed by EC legislation. Adopting standards that are actually used in the real world, and whose completeness is evidenced by multiple independent implementations, is a valuable objective, and one already embraced by many standards development organizations.

Thanks to another change, technical specifications for reference in public procurement can refer to the expected interoperability or environmental performance of a product or service. This is in line with planned changes to EU Public Procurement Directives, but any change to existing criteria will take time for industry to fully understand, particularly where subjective and politically-sensitive terms like “interoperability” are concerned.

Another development that has drawn less comment is the more formal advisory role accorded to the EU’s scientific research bodies, to ensure that the standards developed by the ESBs take into account “economic competitiveness… and safety and security concerns.” In an ideal world, scientific advice is, of course, objective and neutral. When linked to standards used to determine market access, that objectivity is even more critical and, potentially, more elusive. Developments in the cloud computing space are likely to be an early test of the EU’s ability to adopt a truly global approach.

(1) There are three European Standards Bodies: CEN (European Committee for Standardization), CENELEC (European Committee for Electrotechnical Standardization) and ETSI (European Telecommunications Standards Institute). The European Union has recognized these three bodies as the creators of European standards. As intended, CEN was to be an ISO analogue, CENELEC an IEC analogue, and ETSI an ITU-T analogue.

John Jolliffe
Senior Manager, European Government Affairs

W3C Web Platform Docs: A Standards Perspective

Recently, Adobe, along with many others in the community, helped initiate a major effort to build a common suite of developer-oriented documentation for the Open Web Platform, sponsored by the World Wide Web Consortium (W3C).

One of the problems with standards is that they are generally meant more for implementors than for users of the standards (often by design). Those who actually write the standards and serve on the committees that create them know that standards are fragile interface descriptions – and this fragility is what requires such care in their crafting.

Standards specifications are necessarily quite detailed, in order to really promote interoperability and ensure things work the same everywhere. And this is where things get sticky: implementations are based on the standard or the specification, and all standards (well, nearly all) are written in natural language by people who are usually technology specialists, as are many of the people who write the developer-oriented documentation.

What’s exciting here is that the Web Platform Docs (WPD) effort is really targeted at the user community to help document the standards in a way that is useful to that community.

But a standard really only gains value when it is implemented and widely deployed. And this is why the WPD is so innovative: WPD is about the use and deployment of the standard. It has tutorials on how to use a feature; it has examples of uses. This is the kind of material that W3C working groups value but do not have time to create, and that vendors provide to get their implementations used.

The importance of the site, from a standards point of view, is that it helps build an informed user base. Not at all a simple task.

The Web is evolving – and in its evolution, it is forcing others to change as well. Ten years ago, this type of common activity, open to all (for both contributions and information), would have been, if not unthinkable, at least foreign. With this announcement, the contributors and the W3C have (hopefully) begun to change the way standards are seen – toward an easier and kinder environment. And this is a good thing.

For an Adobe developer’s view, see: What’s one of the biggest things missing from the Web?

Leading the Web Forward: Adobe’s “Create the Web” Event and Open Standards

I recently attended Adobe’s “Create the Web” event in San Francisco on September 24, 2012. One of the things that struck me was the role standards are playing in the tools and technologies announced at that event. Adobe is increasingly delivering standards-based tools to simplify the creation of imaginative content for the Web, as well as contributing technology to the continuing development of the Open Web Platform standards within the World Wide Web Consortium (W3C).
 
Adobe has for many years been one of the primary vendors of tools for creating visual content. Our customers look to us to help them create innovative and effective presentations and content: graphically, textually and interactively. Originally, these tools involved display vehicles created by Adobe, but increasingly, the tools Adobe is providing are moving to standards-based platforms such as the Open Web Platform. For example, the recently announced Edge Animate tool makes the creation of animations using HTML5, CSS and JavaScript much more natural; a user interacts with a graphical display of the objects being animated, and the tool helps the user write the “code” for inclusion of the objects on the user’s Web page.
 
As the Web platform standards have become available on mobile as well as desktop devices, creating presentations that scale across these devices has become more challenging. The Edge Reflow tool helps create presentations that adapt the way the same content is displayed on devices of different sizes, using a Cascading Style Sheets (CSS) feature called media queries. PhoneGap Build then allows an author to take a Web-platform-based application and package it as a native application that can run on a number of mobile device operating systems.
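
As an illustration of the media-query mechanism that such tools build on (the 600px breakpoint and the “single-column” class below are hypothetical examples; Edge Reflow itself emits CSS rules rather than script), a page can also react to viewport size from JavaScript:

```javascript
// Sketch: responding to a CSS media query from script via matchMedia.
// The breakpoint (600px) and class name are arbitrary examples.
var narrow = window.matchMedia("(max-width: 600px)");

function applyLayout(mq) {
  // Toggle a class so CSS rules can restyle the page for small screens.
  document.body.classList.toggle("single-column", mq.matches);
}

applyLayout(narrow);             // apply the initial layout
narrow.addListener(applyLayout); // re-apply when the viewport crosses 600px
```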
 
But, the Web of today still lacks many of the features Adobe customers have grown to appreciate and use. For that reason, Adobe is very active in extending the Web standards to include those features. In the area of presentation layout, Adobe has submitted proposals to allow a presentation to be constructed from multiple flows of material and to have objects on the page exclude other objects or text to achieve layout effects commonly seen in magazines. In the area of graphics, Adobe is helping to standardize the technologies used to create filters that add pizazz to presentations and to allow various elements to be overlaid, transparently. These efforts are accompanied by open-source demonstration implementations that help vendors supporting the Open Web Platform understand the value of and possibilities around the features being contributed. Adobe is in active partnerships developing these features to lead the Web forward.
 
Adobe is making a strong statement in support of the Open Web Platform standards. We are developing tools that make it easier to produce content for the Open Web, and we are working to extend that standard to better meet the needs of Adobe customers. Thus, standards are significant in the ways Adobe helps creative professionals, publishers, developers and businesses create, publish, promote and monetize their content anywhere.

Governance and Standards: Publishing and Linking on the Web

Governance is the process by which society defines expectations, grants power, or verifies performance, through laws, regulations, or other means. Societies govern communication, for example, to support copyright or privacy, or to manage defamatory or illegal material. As the Internet becomes increasingly central to the way people communicate, it is also increasingly subject to governance.

Unfortunately, a number of problems commonly arise when dealing with governance of the Internet.

Regulations often don’t match the technology. Ordinarily, we use analogies to talk about technology; for example, we talk about “publishing a page,” but the actual process of putting up a web page is very different from physical publishing by making and distributing printed paper. So a rule like “It’s okay to read this page, but you can’t make a copy of it” fails to acknowledge that, in order to read a page, the bits that make up the page must necessarily be copied to the reader’s computer.

Different goals conflict. Law enforcement might require that a site owner keep records of everyone who posts information, in order to be able to track down those who post illegal or defamatory material, while, at the same time, privacy regulation might insist that the same site owner not keep records.

The Internet is global, but governance is local. The jurisdiction of law, regulation and social values is geographically based, but the Internet has no simple boundaries. The values, regulations, and laws of different jurisdictions are inconsistent, and often conflicting. Is it possible for a single web site to conform to everyone’s norms?

Technology standards can help reduce some of these difficulties by providing appropriate terminology and guidance. For example, W3C standards for accessibility have helped reduce some of the unnecessary variability between accessibility guidelines in various countries. In another example, many countries have created regulations and laws that reference common standards for digital signatures on documents, which in turn helps extend the range of applications that electronic communication can support.

Recently, as a member of the Technical Architecture Group of the World Wide Web Consortium, I’ve been helping produce a First Public Working Draft of a new document called Publishing and Linking on the Web.

This is the first step toward getting community consensus on the document and any recommendations. Your thoughts are welcome! Please review the document, share it, discuss it, make comments. Only through discussion can we develop a common understanding of the alignment of technology and values, and thereby help standards groups, policy makers, and those building new Internet content and services.


Technical Standards during ‘China’s 12th Five Year Plan’


A little over two months ago, the Chinese government’s Ministry of Science & Technology released a document entitled “The Special Planning Document for Technical Standards during China’s 12th Five Year Plan.” For short, I’ll call this the “SPD.” This SPD is one of a series published in recent months by the Chinese government on standards and standardization.

A little background is necessary to understand why the Chinese government is emphasizing standards—especially technical standards. China first emerged as a powerful global economic force largely as a foundry nation. In other words, it produced manufactured goods based on others’ designs – the classic commodity provider. The keys to competitiveness in this area include price, scale and flexibility of manufacturing, and the availability of the right workforce.

Perhaps realizing that China could not become a major player in the international information and communications technology (ICT) arena solely as a foundry nation, the Chinese government now appears to be aiming its policy at moving the country up the value chain to become a designer and creator of its own ICT products. The government has also come to realize that standards, in the era of the Internet and massively connected systems, are essential to market success. It looked at the German use of standards after World War II to promote that country’s industrial export policy; at the use of standards by the EU to foster a single European market; and at the use of standards by United States companies to drive the direction and focus of the ICT sector. In short, the Chinese government has realized that standardization is an important element of national industrial policy. The investment in standardization education, the increased participation by Chinese companies in a wide spectrum of SDOs, and the creation of Chinese consortia all point to an understanding of the value of standards in setting the direction of an industry.

The SPD opens with a bold assertion: “Technical standards are the technical basis of social and economic activities, a strategic resource for national development and a core element for international competitiveness.” This statement is the key to understanding the remainder of the document, which sets forth how technical standards are to be used and considered in planning, R&D, advanced development, testing and certification, intellectual property, and a host of other areas typical of life-cycle planning in technology companies. More importantly, the Chinese government sees standards as encouraging innovation: limiting duplication, encouraging sharing, and making innovative ideas and products more available to other developers. Standards are also seen as coordinative activities, allowing disparate groups to develop solutions – with the added caveat that these solutions can then be applied more easily to social, legal, and economic issues. In total, the document presents an ideal and optimistic vision of standards as a strategic planning mechanism that can be used to spur the economy and various industries, both new and established.

However, it must also be remembered that this is a formal governmental planning document. Planning documents – especially long-term documents – can change, not because the original plan was faulty or incomplete, but because the market conditions upon which the plan was predicated have changed. For the Chinese strategy to succeed, the government will have to be flexible and adaptive in implementing the strategy, as there is no longer a static landscape in the world of ICT standardization.

The ICT standards environment in the United States and Europe is highly dynamic – as illustrated by activities like the rise of WebKit and open source, the appearance of W3C community groups, the proliferation of differing Intellectual Property Rights rules within standards-setting organizations, the constant creation of new consortia, and the appearance of ad hoc standardization (as shown by social media). Over the last five years, ICT standardization has changed dramatically. Because formal standards organizations (ISO, IEC, and the ITU) move slowly, the ICT sector often relies on consortia and ad hoc groups (WebKit, the WHATWG, and the like) for innovation and leadership in standards development. Additionally, planning in a period of significant change is challenging – and planning for standards is a second derivative of ICT planning: it both leads and is led by technology and technological planning, with a healthy dose of marketing and economic and social strategy thrown in.

With that said, the SPD is a fascinating statement of Chinese governmental policy. One only wishes that there were corresponding standards strategies from other countries, recognizing the criticality of standards for national competitiveness, with which to compare that of the Chinese government. There is an old saying in the standards world: “If you don’t play, you can’t complain.” If this document causes others to re-examine their approaches and seek consensus, then it will have served not only China, but also the world.

Internationalized Resource Identifiers: Standards Progress

The idea of a Uniform Resource Locator (URL) is a key Web innovation: the “hyper” of hypertext. URLs function as a combined locator (how to find it) and identifier (how to name it) for reference to other Internet resources within documents (using hypertext, such as the HyperText Markup Language [HTML]), email, and a variety of other Internet protocols (e.g., the HyperText Transfer Protocol [HTTP]).
 
URLs were designed to be portable and easily transcribed, at a time when most computers had very limited support for character sets. As a result, the set of characters allowed in a URL is limited to a subset of “safe” characters that are always available, much like identifiers in most programming languages: the ASCII letters, digits, and a few punctuation characters. However, unlike programming-language identifiers, URLs are frequently visible to users: Web users see and type URLs, and it is common for people to use URLs in advertising, written communication, and spoken announcements.
 
Since most of the world uses languages written with characters not allowed in URLs, there has been considerable interest in developing a kind of URL that allows other (“non-ASCII”) characters drawn from Unicode — the standard for representing the characters of the world’s languages. This new identifier is called an Internationalized Resource Identifier (IRI); its syntax is a superset of the existing URL syntax, based on the idea that some systems might remain URL-only while others allow full IRIs.
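
A small sketch shows the mapping in practice (the addresses are hypothetical; JavaScript’s encodeURI applies the UTF-8 percent-encoding of non-ASCII characters that the IRI specification describes for the path and query):

```javascript
// An IRI containing non-ASCII characters...
var iri = "https://example.com/café?q=naïve";

// ...and its ASCII-only URL form: each non-ASCII character becomes
// the percent-encoding of its UTF-8 bytes (é -> %C3%A9, ï -> %C3%AF).
console.log(encodeURI(iri));
// "https://example.com/caf%C3%A9?q=na%C3%AFve"

// Both spellings identify the same resource - exactly the kind of
// aliasing that causes trouble when software compares identifiers
// byte-for-byte. (Non-ASCII host names are handled differently,
// via the ASCII "Punycode" xn-- encoding.)
console.log(encodeURI(iri) === "https://example.com/caf%C3%A9?q=na%C3%AFve"); // true
```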
 
This was pretty good in theory, but in practice there have been a number of problems. For example, having multiple ways of writing the same identifier can cause security and reliability problems if implementations aren’t uniform. And rather than converging, the standard has come under pressure and diverged because of the wide variety of implementations.
 
Work continues to bring the concerned implementors together to work out the details and ensure that there is a single standard for IRIs in browsers, email, HTML, plain text, and other contexts. The specifications are developed in the World Wide Web Consortium (W3C) and the Internet Engineering Task Force (IETF). Adobe’s Larry Masinter and Roy Fielding continue to work on the related standards as editors, specification authors, and reviewers.

As with most standards, the overall concept is simple; it’s the details that are difficult, given that any changes to the core addressing standards for the Web have significant implications for security, reliability, and compatibility with existing deployed systems.