
What do Internet standards bring to the table -- Promise or more confusion

From Higher Intellect Vintage Wiki

We take you through the standards process and map out the main standards categories -- By Robert E. Lee

Abstract - In this second in a series of articles on the emerging Internet model, standards processes and competitive roles are examined. We present a simplified standards model and consider some strong market contenders.


The first article in this Internet series reflected on the physical structure and experiences with the Internet. As you might recall, there are an enormous number of challenges in bringing this technology into the realm of responsive and effective use for the masses. Now we'll take a look at the standards and technologies that are appearing in response to the various issues.

XDSL, ISDN, ATM, HTML, XML, IPX, IP, IPv6, SNMP, RMON, SMTP, POP, SET, CDF, OPS, PPTP, L2TP, LDAP, NDS, JNDI, DMI, ICMPv6, DNS, WINS, NFS, OSI, PPP, PICS, MPPP, SLIP...Are you beginning to get the idea that there are a lot of standards out there?

It helps to understand that there are standards covering content, content delivery, network transport, and systems operation. These standards are driven less by standards bodies anticipating a future requirement and defining the tool than by vendors acting in their own self-interest to create a marketing edge and subsequently dominate the market with a proprietary protocol turned standard. Mind you, there is nothing inherently wrong with proprietary technologies becoming open standards when that is how the technology emerges from the developing company -- after all, good technologies have historically been developed in the private sector and then put into the public domain.


The standards model

For the sake of simplicity, it is useful to view standards in the broad categories shown in Figure 1. Each layer of this oversimplified model is actually composed of a number of technologies and standards that define performance and functionality within the layer and at its interfaces with the layers around it. What complicates this model is that each layer resides in a quasi-autonomous state. For example, the IEEE (Institute of Electrical and Electronics Engineers) and the ITU (International Telecommunication Union) are the primary bodies for standards at the physical layer of this model. At that level, electrical signals and basic transport technologies such as ATM, Frame Relay, ISDN, and xDSL emerge to drive the data across the Internet.

[Figure 1: The standards model -- broad categories of Internet standards]

Transport standards sit under the purview of the IETF (Internet Engineering Task Force) and the IAB (Internet Architecture Board). These standards cover everything from the protocols that route packets through the network to those that support name resolution, electronic mail, and other basic structural components of the data moving across the Internet. It is here that you find IPv6, the next-generation Internet Protocol addressing standard, coming to life, as well as the tunneling protocols, such as PPTP and L2TP, that go beyond standard HTTP to provide private networks across the public domain.

Server, security, and presentation standards are the primary domain of the World Wide Web Consortium (W3C), with involvement from CommerceNet and other groups in areas like the security and protocols required for electronic commerce. It is here that the many competing features being developed by Microsoft and Netscape are shaped into industry-level standards that allow everyone to provide software for the market. The W3C is attempting to stay ahead of the industry leaders in defining the future structure of the Internet presentation features through efforts like Extensible Markup Language (XML), which bridges the gap between the Standard Generalized Markup Language (SGML) and HTML, bringing both to the Internet in a simpler form.


Standards process

Many of the existing standards at the core of the Internet were developed through the standards bodies taking member recommendations and producing what is known as a Request for Comments (RFC). During this phase of standards creation, the public and members of the standards body are invited to comment on the proposed standard. Once the RFC phase is completed, the standard enters a draft phase where additional details are worked out and the standard undergoes trials to test its validity. After testing and additional public comment, the standard is voted upon and either approved or left to die.

Because the market moves so quickly, many standards are implemented prior to approval or are derived from market initiatives. Consider that while HTML 3.0 was being developed, Microsoft and Netscape so outpaced the W3C working group that the HTML 3.0 draft was left to expire unapproved. HTML 3.2 was then born from the W3C's effort to embrace the de facto industry standards that had emerged as Microsoft and Netscape shipped their browsers with so many new proprietary HTML tags.

Another example is what Kodak is doing for images on the Internet with its FlashPix technology. This innovation can deliver dynamic image resolution over the Internet, tailored to the bandwidth available to the end user and the needs of the application. If a user connects to a FlashPix server over a 14.4-Kbps connection, a high-resolution image, perhaps 18 MB in size, is dynamically scaled to the appropriate screen resolution and transmitted in a much smaller form, perhaps as small as 10 KB. Meanwhile, another user connected through a 1.544-Mbps T-1 circuit would receive a much higher-resolution image. The obvious push from the developing partners -- Kodak, Hewlett-Packard, Microsoft, and Live Picture -- is to move this technology into an Internet standard.
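The bandwidth-based selection described above can be sketched roughly as follows. Note that this is only an illustration of the idea, not the actual FlashPix protocol, which stores an image as a pyramid of pre-computed resolutions and serves only the tiles a client requests; the tier thresholds and pixel dimensions here are invented for the example.

```python
def choose_resolution(bandwidth_kbps, full_size=(3072, 2048)):
    """Pick a delivery resolution tier based on link bandwidth.

    The tiers below are illustrative assumptions, not part of the
    FlashPix specification.
    """
    tiers = [
        (64,   (512, 341)),    # dial-up modems: small screen rendition
        (1544, (1024, 683)),   # up to T-1 speeds: mid-resolution
    ]
    for limit_kbps, size in tiers:
        if bandwidth_kbps <= limit_kbps:
            return size
    return full_size  # faster links receive the full-resolution image

# A 14.4-Kbps caller gets the small rendition; a T-1 user a larger one.
print(choose_resolution(14.4))   # → (512, 341)
print(choose_resolution(1544))   # → (1024, 683)
```

In the real system the server holds each rendition pre-computed, so the choice costs a lookup rather than a rescaling pass per request.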


Contention in the standards

Everyone in the industry is excited by the idea of decoupling content development from the implementation of the underlying technologies. When a standard precedes a need in the market, products can then be brought to market that meet these objectives. But the pace of the Internet -- compressing product development cycles from years to months and content projects to days -- has increased the pressure to choose a technology before it is standards based. Increasingly divergent approaches, especially on the client side of the equation, increase the complexity of the solutions being delivered over the Internet.

The net effect of all of this is to increase the cost of implementations, lengthen the time needed to create solutions, and, in many cases, reduce the odds that a solution will serve all of the browser versions and users on the Internet today.

Does that leave everything in a negative state? No, just an irritating one. The promise and reality of standards-based solutions still prevail: projects built on confirmed standards today interoperate and perform as expected. Betting the farm in one direction or another is only an issue for the bleeding-edge adopters -- defined in these times as those who adopt in the first weeks or months after a technology is announced. By the time a solution begins to show up in larger segments of the market, adoption becomes easier, since market momentum will carry the major players' initiatives through.

The online resources listed below point to hundreds of key and emerging Internet standards. Use them to find detailed technical information on each standard of interest to you. Barring that, develop your Web applications to the widely deployed standards and technologies unless you have the need, resources, and patience to take the lead in areas still under development.