
The Business of Standards, Part Two: The Catalyst for Change

The proliferation of Standards Setting Organizations (SSOs) began in the mid-1980s as a response to the perceived threat that the Japanese Fifth Generation Computer Systems (FGCS) project posed to the U.S. semiconductor industry. Several major U.S. chip and computer makers decided that a joint research initiative would help them meet the threat to U.S. chip-making dominance. Unfortunately, U.S. law considered joint activities of this type to be anti-competitive and in violation of anti-trust law. The remedy was to pass enabling legislation – the National Cooperative Research Act of 1984 (NCRA), Pub. L. No. 98-462 – which allowed the creation of consortia for joint research and development. Soon afterward, the Microelectronics and Computer Technology Corporation (MCC) was created to engage in joint research in multiple areas of computer and chip design.

It should be noted that – at this time – the formal standards organizations, the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC), had significant standardization activities under way, especially in the Open Systems Interconnection (OSI) arena. Recognizing that IT standardization was becoming more important to business, the leaders of the ISO and IEC IT standards bodies proposed a merger of their committees – creating the first (and so far, only) Joint Technical Committee (ISO/IEC JTC1) to make standardization easier.

However, the formalists failed to provide adequate testing and verification of the very complex Open Systems Interconnection standards, and this need was quickly met by the Corporation for Open Systems (COS), a consortium of suppliers and users founded in 1986 to ensure that OSI implementations were interoperable. Although short-lived (in standards years), COS showed how the NCRA could be used “differently.” It demonstrated that a private organization (a consortium) could accomplish quickly what the formal standards organizations couldn’t – and do it with a highly focused approach that didn’t need all the “international” approvals and compromise.

The late 1980s and early 1990s saw an explosion of similar organizations – all of which were created by companies to “expedite” time to market (as well as the creation of the market, it was hoped). The most successful of these was the Object Management Group (OMG) founded in 1989 to create a heterogeneous distributed object standard. The Manufacturing Automation Protocol/Technical Office Protocol (MAP/TOP), championed by GM and Boeing respectively, came to life during this time, as did the User Alliance for Open Systems. There were also consortia created to push a particular provider’s technology (88open Consortium and SPARC International come to mind).

Of course, these groups began to strain the limits of the 1984 cooperative R&D legislation, so Congress modified the law in 1993, passing the National Cooperative Production Amendments of 1993, Pub. L. No. 103-42, which amended the National Cooperative Research Act of 1984, Pub. L. No. 98-462, and renamed it the National Cooperative Research and Production Act of 1993 (NCRPA).

And it is this Act that most consortia use to legitimize their existence. It provides limited immunity from anti-trust liability, some cover for otherwise anti-competitive behavior, and a basis for an organizational framework upon which to build a consortium. However, this is not the end of the story behind the “the nice thing about standards is that you have so many to choose from” syndrome. While the tools and mechanisms for creating a consortium were now in place, the actual creation takes a little more effort.

The next post will look at how the “business of standards” has grown in the 20 years since the NCRPA was passed – and how consortia have changed standardization in the Information and Communications Technology (ICT) world.

Carl Cargill
Principal Scientist

The Business of Standards, Part 1

Andrew S. Tanenbaum, professor of computer science at the Vrije Universiteit in Amsterdam, once said: “The nice thing about standards is that you have so many to choose from.”

I like this quote because, like so many trite sayings, it covers a rather more complex issue that most in the Information and Communications Technology (ICT) arena prefer to ignore. The issue is, simply, why are there so many standards? Beyond this, where do these standards come from, and who pays for them?

The answers are simple – standards don’t appear magically, and they are often created by the very industries that criticize their proliferation. Industry members invest a lot of time, resources, and energy in creating standards. Case in point: Andy Updegrove – a lawyer who helps create consortia – lists 887 ICT consortia in his tracking guide. All of these consortia are funded by companies and individuals who are busily engaged in writing standards.

So why do these companies support such a vast standards industry? Because the act of standardization, if properly managed, can confer competitive advantage.  Basic to this idea is that a standard is a change agent – its only function is to change the market in some way or another.

Most often, standards are described as being used to “level the playing field.” This is true only in a commodity arena, such as standard wheat or standard cotton. Nearly everything in the ICT industry that is “standardized” has associated differentiators (from performance to speed to cost) that are vital for market share retention and growth.

However, occasionally a company or other entity may find it difficult to create a differentiator to the current standard for extenuating business reasons, such as IPR (intellectual property rights) payments, lack of technical expertise, or even ownership of significant competing technology. In this case, the organization can try to create a competing product that incorporates the (newer/better/more open/other) technology. All the organization needs are enough allies and/or market share to support and embrace this competing offering. If it wants to do this more openly, it can create an organization to help.

This scenario has been played out at least 887 times. Every time it is repeated, at least one new Standards Setting Organization (SSO) is created, which in turn sets about creating standards.

Companies find it to their benefit to claim that their product conforms to a standard – it reassures buyers, builds confidence, and allows markets to be opened. However, this also creates a morass of conflicting standards and standards organizations, thereby limiting the value of all standards – both the good and the bad.

One question is: what is the legal basis for this proliferation of SSOs? Well, it turns out that the doctrine of “unanticipated consequences” is to blame.

The next post will examine the roots for this proliferation and how the business of standards started.

Carl Cargill
Principal Scientist

The Internet, Standards, and Intellectual Property

The Internet Society recently issued a paper on “Intellectual Property on the Internet,” written by Konstantinos Komaitis, a policy advisor at the Internet Society. As the title indicates, the paper focuses on only one policy issue – the need to reshape the role and position of intellectual property. The central thesis of the paper is that “industry-based initiatives focusing on the enforcement of intellectual property rights should be subjected to periodic independent reviews as related to their efficiency and adherence to due process and the rule of law.”

The author cites the August 2012 announcement of “The Modern Paradigm for Standards Development,” which recognizes that “the economics of global markets, fueled by technological advancements, drive global deployment of standards regardless of their formal status. In this paradigm, standards support interoperability, foster global competition, are developed through an open participatory process, and are voluntarily adopted globally.” These “OpenStand” principles were posited by the Institute of Electrical and Electronics Engineers (IEEE), the Internet Engineering Task Force (IETF), the Internet Architecture Board (IAB), the Internet Society, and the World Wide Web Consortium (W3C).

Komaitis conveniently overlooks the nearly 700 other organizations (formal and otherwise) that develop standards. And that nearly all industries depend upon standards. And that governments are aware of the power of standards to create economic policy and drive and sustain economic growth. Instead, the author focuses on one small aspect of standards – intellectual property.

Another issue conveniently overlooked is how to fund standards development. Komaitis asserts that “…industry-based initiatives…should be subjected to periodic independent reviews…” He misses the fact that industry funds nearly all of the standards organizations in existence. Absent industry funding for participants, revenue from the sale of standards, and the acceptance of standards in product creation, the entire standardization arena would become extinct.

The author seems to be arguing for a revision of intellectual property rights (IPR) rules in standardization – when, in fact, there is no real demand from the industry as a whole. Komaitis is really asking for an “intellectual property rights carve out” for standards related to the Internet. Looking at the big picture, the plea that it is necessary to rejigger world-wide IPR rules to prevent putting the State or courts “in the awkward position of having to prioritize intellectual property rights over the Internet’s technical operation…” seems trite and self-serving.

There is a claim that “the Internet Society will continue to advocate for open, multi-participatory and transparent discussions and will be working with all stakeholders in advancing these minimum standards in all intellectual property fora.” Perhaps the Internet Society could look at what already exists in the International Organization for Standardization (ISO), the World Trade Organization (WTO), or perhaps even the International Telecommunication Union (ITU) to see how a majority of the “stakeholders” worldwide already deal with these issues – and then maybe get back to actually solving the technical issues at which the IETF excels.

Carl Cargill
Principal Scientist


Adobe Supports OpenStand

On March 9th, at the Open Future reception at SXSW, Adobe announced support for the OpenStand initiative. Our rationale was simple – OpenStand is good for the Web, good for users, and good for Adobe. It increases innovation and openness, and it allows greater participation in evolving the Internet.

The Internet is built on standards. These standards come from all sorts of organizations – some formal and supported by governments, some less formal and created by industry associations, and some driven by users who believe in collective action. OpenStand takes a simple position on these organizations – if an organization is open, transparent, and balanced, follows due process in creating its specifications, and operates by broad consensus, then the organization and its specifications are legitimate.

The approach advocated by OpenStand seems intuitively obvious: good technical standards that are accepted and implemented by the industry should be judged not on their origin, but rather on their utility to the industry. A poor standard with a “proper background and backing” is still a poor standard.

The Internet is ubiquitous – from mobile phones to tablets to desktops – spanning all form factors and all types of information and design: literally “everything.” It is a golden age for the creative display and use of information – all driven by innovation, which is then “standardized” so users can access it and interoperate with complementary services.

Adobe has contributed significantly, and will continue to contribute, to efforts to document and test these innovative activities in conjunction with the W3C through the Web Platform Docs (WPD) project and Test the Web Forward. We have also (along with Microsoft and Google) supported the HTML5 editor as that standard moves to completion, and we are active in WebKit specs as well as in formal metadata standards. All of these venues are important – and all build the ability of the market to innovate, grow, and change. And that’s what OpenStand is all about, which is why Adobe has chosen to support it.

Carl Cargill
Principal Scientist

Takeaways from the 14th Annual Privacy and Security Conference

I recently presented at the 14th Annual Privacy and Security Conference, held in Victoria, British Columbia. There were several things I took away from the meeting.

The first takeaway is that even though the three keynote speakers looked at security and privacy from different perspectives, they all agreed that the level of interest in these two areas is growing as governments begin to recognize that the World Wide Web crosses borders with impunity.

The second takeaway was that standards creation is largely ignored. Mostly, the discussion was about the implementation of standards and regulations, not the act of creation. My presentation – “Whose Internet is it?” – focused on the groups that create basic Internet and telephony standards. The intent was to convince people that they can (and should) get involved in creating the standards that drive the Web.

The final takeaway was that the distinction between standards and policy is becoming very blurred, and the implications for national governments and commercial providers are significant. On one hand, a nation has the right (and sometimes the duty) to protect itself and its citizens; for this, the traditional standardization venues exist. On the other hand, there is the growing realization that these traditional bodies are ill-equipped to deal with the increasing pace of technology change that the Information and Communications Technology (ICT) world is experiencing. Throw open source and IPR, social change, and, increasingly, mobile telecommunications into the mix, and you have a volatile brew.

This is an interesting, challenging, and confusing time for those involved. The collision of regulations, innovation, policy, technology, and a host of other factors of necessity makes the issue complicated and complex – but very relevant to how the Web will evolve. Adobe will continue to follow this issue as it unfolds, and we welcome your perspectives and comments.

Carl Cargill
Principal Scientist


Canvas 2D and Real-Time Media Flow Protocol (RTMFP) Standards Contributions

In the last several weeks, Adobe has made two significant announcements about standards contributions. One announcement signaled the submission of a specification for Adobe’s Secure Real-Time Media Flow Protocol (RTMFP) to the Internet Engineering Task Force (IETF) and the other was an announcement by the W3C of stable and feature-complete HTML5 and Canvas 2D specifications to which Adobe contributed an endorsement (as well as providing Rik Cabanier as a co-editor of the Canvas 2D spec).

The two announcements are joined by a common thread: In both cases, Adobe felt that the market and our customers would benefit from the technology in the specifications. In the case of RTMFP, Adobe made a direct contribution of technology, which we believe has value for developers as the Internet continues to develop new applications and solutions. RTMFP may help solve some of the more vexing problems in real-time and peer-to-peer communications. It was submitted under a royalty-free grant – meaning that Adobe does not stand to profit from the contribution.

In the case of HTML5 and Canvas 2D, Adobe made a significant royalty-free grant of technology to the HTML5 specification as well as to the associated specifications that comprise “big HTML5” (which includes all the elements associated with HTML5, from JavaScript to CSS). Along with that, Adobe (in conjunction with Microsoft and Google) is a major contributor to the W3C editor’s fund, which provides the means necessary to hire full-time W3C editors for the HTML5.1 specification. We’re not sure how the next revision of HTML5 will shake out, but we’re reasonably certain that careful and planned releases of stable and testable technology will help the market (including our customers) achieve fuller benefits from the World Wide Web.

In both cases, Adobe is betting on the future. The technologies being offered are proven and existing: Adobe uses RTMFP in Flash and other products, and Canvas 2D is increasingly being deployed and embraced by the market. What is different is that businesses and developers now have an available and stable specification for implementation and planning. We don’t know where the market will go – but we do know that providing a firm foundation for continued expansion makes it much easier to build for the future.

We’re also willing to bet that the increased transparency offered by standards will help make the Internet and the Web more useful and increase the numbers of users and developers. And that they, in turn, will see more and more opportunities for further development and use. And this grows the market and increases the utility of the Web for everyone.

Carl Cargill
Principal Scientist

Congressman Jack Brooks’ Legacy to the World of Standards

In a brief footnote to history, I note with sadness that Congressman Jack Brooks has died. He had a long and interesting career as a Congressman from Texas and in the process shaped the face of standards and standardization as we now know it. He is mourned by his colleagues and friends.

I never met Jack Brooks, but I owe him – and his ideas – a great deal and probably, my career and interest in standardization. Jack Brooks was the author of Public Law 89-306 (Brooks Act), dated October 30, 1965, H.R.4845, which established the rules and requirements for buying “…any equipment or interconnected system or subsystems of equipment that is used in the automatic acquisition, storage, manipulation, management, movement, control, display, switching, interchange, transmission, or reception of data or information…”

To make sure that what the government bought was “interconnectable” (since interoperable was still just a pipe dream), the Act further required that the National Bureau of Standards promulgate standards and guidelines “necessary to improve the efficiency of operation or security and privacy of Federal computer systems.”

This Act made standards and standardization a central feature of Federal procurement from 1965 on. Not only did prime contractors have to meet these standards, but all the participants down the supply chain did as well. This was significant, since the Federal government was the largest single purchaser of information technology in the world. From NASA to the Social Security Administration, from the Department of Education (another Brooks accomplishment) to the National Weather Service, systems began to become “interconnectable.” Proprietary hardware (for that was the emphasis at the time) slowly gave way to interconnected and “plug compatible” systems.

The development and creation of a whole theory of business rationale and strategic planning necessary for standards was a “green field” area and was rather challenging – and ultimately led to where we are today – with a vast and complex set of inter-relations between trade, business, politics, economics, jurisprudence, and social planning.

I doubt that Jack Brooks saw this far or anticipated the extent to which his Act would change the face of computing. He was a hard-headed realist who wanted to save the government money. But by driving standards into the procurement of Government systems, Jack Brooks changed the face of the IT industry by making technical standards necessary. And when that happened, all of the ancillary business activity, from legal basis to strategic implications to marketing to social use followed in its wake.  This is one of Jack Brooks’ unsung but tremendously powerful legacies.

Carl Cargill
Principal Scientist


Who’s Making the Rules About the Internet?

Governance – who makes the policies and rules (but NOT the technology) governing how the Internet runs – is one of the “invisible infrastructure” activities that just happen and keep the Internet from failing. With the explosive growth of the Web and its importance to the world economy (coupled with the fact that the Internet is global and doesn’t recognize national borders), the last decade has seen governments and policy makers look more closely at who actually makes the rules that run the Internet and wonder whether there isn’t a better way. Taxation, censorship, privacy, intellectual property, libel, economic stability, and financial dealings are all aspects of Internet governance the world is coming to recognize. And governments are loath to grant U.S.-based Non-Governmental Organizations (NGOs) (such as ICANN, ISOC, and the IETF[1]) the right to make the fundamental rules that impact these sovereign rights.

Part of the reason for this is the importance of the Internet to the world’s economic well-being. The global (and globalized) economy depends in large part on the instantaneous communications afforded by the Internet, which are now reaching ever-broader audiences.

However, the major impact of the Internet is on the governance of nations – not just of the individuals on the Internet. The “Arab Spring” movement showed the power of the Internet to galvanize public reaction – a worrisome thing to a government trying to maintain economic or political stability. Wikileaks likewise illustrated the power of the Internet to disseminate governmentally unfavorable information that affected foreign policy, and malware (e.g., Stuxnet) has become a tool for both industrial and military espionage and sabotage.

But in the international geopolitical arena, governance has expanded to mean more: It also means that all nations have an equal say and stake in the creation of the rules and deployment of the Internet. Note here that the term is “nations” – not individuals – because how the Internet is governed can have a tremendous impact on a nation’s ability to pursue a national growth and governance strategy.

One of the countries most concerned about this area is China. It has been noted that the unregulated nature of the Internet, and what could be viewed as favoritism toward the developed countries, poses a large, long-term problem for developing countries. On September 18, 2012, the Chinese government hosted an “emerging nations Internet roundtable,” where issues of Internet governance, content management, and cyber security were discussed. The governments of Russia, Brazil, India, and South Africa all participated, and together they looked at Internet governance so that the needs of developing nations are taken into consideration.

Following the meeting, a statement was released saying that the participating governments would continue to meet on a regular basis and that consensus had been reached on the following four major topics:

1. Internet governance must become a governmental mandate, and the impact of social networks on society (good, bad, and otherwise) is of particular concern.

2. Content filtering needs increased regulation and legislation to both protect and promote the interests of developing and emerging nations.

3. The whole area of cyber-security needs increased transnational cooperation and coordination – but this must be balanced with the economic needs of emerging states.

4. Emerging and developing nations offer the greatest opportunity for Internet growth, and these nations must take responsibility for managing this growth.

This conference clearly delineates the debate between those who seek an unfettered Internet (open and based in information technology) and those who would like a more regulated model (in the style of the regulated telecommunications industry).

The issue of who controls a tremendously powerful communications force is, of course, a matter of high interest to nearly everyone. But the essential point is that this policy and governance issue is being fought in the standards arena – that is, over who has the right and duty to make the rules about the standards that will drive the next generation of innovation in the massively connected world. Currently, the International Telecommunication Union (ITU)[2] is proposing to assume an increased role in making rules for the Internet, with support from many of the G-30 nations. ISOC, the IETF, the W3C, and the IEEE are responding with the OpenStand (http://open-stand.org) initiative. Europe is moving to recognize consortia specifications – and the national standards bodies (with the implicit support of their governments) are trying to slow and limit this change. And we will see this same type of standards-based activity in policy decisions on privacy, security, accessibility, and more. As the world becomes more and more highly interconnected, control of who creates and who mandates standards – their creation, implementation, testing, and IPR status – will become a major issue in national and international policy. And this is the lesson being learned from the Internet governance discussions.

[1] Internet Corporation for Assigned Names and Numbers (ICANN); Internet Society (ISOC); Internet Engineering Task Force (IETF)

[2] The ITU is a specialized U.N. treaty organization responsible for information and communications technology. It creates standards to regulate telecommunications.

W3C Web Platform Docs: A Standards Perspective

Recently, Adobe, along with many others in the community, helped initiate a major effort to build a common suite of developer-oriented documentation for the Open Web Platform – a community effort sponsored by the World Wide Web Consortium (W3C).

One of the problems with standards is that generally, they are meant more for implementors and less for users of the standards (often by design). Those who actually write the standards and work on the committees that create them know that they are fragile interface descriptions – and this fragility is what requires the care in their crafting.

Standards specifications are necessarily quite detailed in order to really promote interoperability and ensure things work the same. And this is where things get sticky. Implementations are based on the standard or the specification, and nearly all standards are written in natural language by people who are usually specialists in the technology – as are many of the people who write the developer-oriented documentation.

What’s exciting here is that the Web Platform Docs (WPD) effort is really targeted at the user community to help document the standards in a way that is useful to that community.

But a standard really only gains value when it is implemented and widely deployed. And this is why the WPD is so innovative. WPD is about use and deployment of the standard. It has tutorials on how to use a feature; it has examples of uses. This is the kind of material that the W3C working group does not have time to create, but values. It is what the vendors provide to get their implementations used.

The importance of the site, from a standards point of view, is that it helps build an informed user base. Not at all a simple task.

The Web is evolving – and in its evolution, it is forcing others to change as well. Ten years ago, this type of common activity, open to all (for both contributions and information), would have been, if not unthinkable, at least foreign. With this announcement, the contributors and the W3C have (hopefully) begun to change the way standards are seen – toward an easier and kinder environment. And this is a good thing.

For an Adobe developer’s view, see: What’s one of the biggest things missing from the Web?

Technical Standards during ‘China’s 12th Five Year Plan’


A little over two months ago, the Chinese government’s Ministry of Science & Technology released a document entitled “The Special Planning Document for Technical Standards during China’s 12th Five Year Plan.” For short, I’ll call this the “SPD.” This SPD is one of a series published in recent months by the Chinese government on standards and standardization.

A little background is necessary to understand why the Chinese government is emphasizing standards—especially technical standards. China first emerged as a powerful global economic force largely as a foundry nation. In other words, it produced manufactured goods based on others’ designs – the classic commodity provider. The keys to competitiveness in this area include price, scale and flexibility of manufacturing, and the availability of the right workforce.

Perhaps realizing that China could not become a major player in the international information and communications technology (ICT) arena solely as a foundry nation, the Chinese government now appears to be aiming its policy at moving the country up the value chain to become a designer and creator of its own ICT products. It has also come to realize that standards, in the era of the Internet and widely and massively connected systems, are essential to market success. The Chinese government looked at the German use of standards after World War II to promote that country’s industrial export policy; at the use of standards by the EU to foster a single European market; and at the use of standards by United States companies to drive the direction and focus of the information and communications technology sector. In short, it has realized that standardization is an important element of national industrial policy. The investment in standardization education, the increased participation by Chinese companies in a wide spectrum of standards development organizations (SDOs), and the creation of Chinese consortia all point to an understanding of the value of standards in setting the direction of an industry.

The SPD opens with a bold assertion: “Technical standards are the technical basis of social and economic activities, a strategic resource for national development and a core element for international competitiveness.” This statement is the key to understanding the remainder of the document, which sets forth how technical standards are to be used and considered in planning, R&D, advanced development, testing and certification, intellectual property, and a host of other areas typical of life-cycle planning in technology companies. More importantly, the Chinese government sees standards as encouraging innovation by limiting duplication, encouraging sharing, and making innovative ideas and products more available to other developers. Standards are also seen as coordinative activities, allowing disparate groups to develop solutions – with the added caveat that these solutions can then be applied more easily to social, legal, and economic issues. In total, the document presents an ideal and optimistic vision of standards as a strategic planning mechanism that can be used to spur the economy and various industries, both new and established.

However, it must also be remembered that this is a formal governmental planning document. Planning documents – especially long-term documents – can change, not because the original plan was faulty or incomplete, but because the market conditions upon which the plan was predicated have changed. For the Chinese strategy to succeed, the government will have to be flexible and adaptive in implementing the strategy, as there is no longer a static landscape in the world of ICT standardization.

The Information and Communications Technology sector standards environment in the United States and Europe is highly dynamic – as illustrated by activities like the creation of WebKit and open source; the appearance of W3C community groups; the proliferation of different Intellectual Property Rights rules within Standards Setting Organizations; the constant creation of new consortia; and the appearance of ad hoc standardization (as shown by social media). Over the last five years, ICT standardization has changed dramatically. Because formal standards organizations (ISO, IEC, and the ITU-based bodies) move slowly, the ICT sector often relies on consortia and ad hoc groups (WebKit, WHATWG, and the like) for innovation and leadership in standards development. Additionally, planning in a period of significant change is challenging – and planning for standards is a second derivative of ICT planning. It both leads and is led by technology and technological planning, with a healthy dose of marketing and economic and social strategy thrown in.

With that said, the SPD is a fascinating statement of Chinese governmental policy. One only wishes that there were corresponding standards strategies from other countries, recognizing the criticality of standards for national competitiveness, with which to compare that of the Chinese government. There is an old saying in the standards world: “If you don’t play, you can’t complain.” If this document causes others to re-examine their approaches and seek consensus, then it has served not only China, but also the world.