
Prelude

Published online by Cambridge University Press:  24 August 2020

Kevin Werbach
Affiliation:
University of Pennsylvania Wharton School of Business

After the Digital Tornado: Networks, Algorithms, Humanity, pp. 11–12
Publisher: Cambridge University Press
Print publication year: 2020
This content is Open Access and distributed under the terms of the Creative Commons Attribution licence CC-BY-NC-ND 4.0 https://creativecommons.org/cclicenses/

Digital Tornado: The Internet and Telecommunications Policy (1997) Selected Excerpts

Kevin Werbach
Federal Communications Commission Office of Plans and Policy Working Paper #29 (March 1997), at www.fcc.gov/Bureaus/OPP/working_papers/oppwp29.pdf.
Background

The Internet, from its roots a quarter-century ago as a military and academic research tool, has become a global resource for millions of people. As it continues to grow, the Internet will generate tremendous benefits for the economy and society. At the same time, the Internet poses significant and difficult questions for policy makers. This working paper examines some of these emerging issues at the intersection of technology, law, economics, and public policy.

The United States federal government has long been involved in the development of the Internet. Through research grants, and by virtue of its status as the largest institutional user of computer services in the country, the federal government played a central role in bringing what we now call the Internet into being. Just as important, the federal government has consistently acted to keep the Internet free of unnecessary regulation and government influence. As the Internet has matured and has grown to support a wide variety of commercial activity, the federal government has transitioned important technical and management functions to the private sector. In the area of telecommunications policy, the Federal Communications Commission (FCC) has explicitly refused to regulate most online information services under the rules that apply to telephone companies.

Limited government intervention is a major reason why the Internet has grown so rapidly in the United States. The federal government’s efforts to avoid burdening the Internet with regulation should be looked upon as a major success, and should be continued. The Telecommunications Act of 1996 (1996 Act) adopts such a position. The 1996 Act states that it is the policy of the United States “to preserve the vibrant and competitive free market that presently exists for the Internet and other interactive computer services, unfettered by Federal or State regulation,”Footnote 1 and the FCC has a responsibility to implement that statute. The draft “Framework for Global Electronic Commerce,” developed by the White House with the involvement of more than a dozen federal agencies, similarly emphasizes the need to avoid unnecessary government interference with the Internet.Footnote 2

This working paper addresses three overlapping telecommunications policy areas that relate to the Internet: law, economics, and public policy. Legal questions arise from the difficulty in applying existing regulatory classifications to Internet-based services. Economic questions arise from the effects of Internet usage on the telecommunications infrastructure, and the effects of the telecommunications infrastructure on the Internet. Public policy questions arise from the need to maximize the public benefits that the Internet brings to society.

The Internet is a fluid, complex entity. It was designed to route around obstacles, such as failures at central points of the network, and it may respond in unexpected ways to pressures placed on it. It has developed largely without any central plan, especially in the past several years as the US government has reduced its management role. It overcomes any boundaries that can be drawn, whether rooted in size, geography, or law. Because the Internet represents an ever-growing interconnected network, no one entity can control or speak for the entire system. The technology of the Internet allows new types of services to be layered on top of existing protocols, often without the involvement or even the knowledge of network providers that transmit those services. Numerous users can share physical facilities, and the mix of traffic through any point changes constantly through the actions of a distributed network of thousands of routers.

The chaotic nature of the Internet may be troubling for governments, which tend to value stability and certainty. However, the uncertainty of the Internet is a strength, not a weakness. With decentralization comes flexibility, and with flexibility comes dynamism. Order may emerge from the complex interactions of many uncoordinated entities, without the need for cumbersome and rigid centralized hierarchies. Because it is not tied to traditional models or regulatory environments, the Internet holds the potential to dramatically change the communications landscape. The Internet creates new forms of competition, valuable services for end users, and benefits to the economy. Government policy approaches toward the Internet should therefore start from two basic principles: avoid unnecessary regulation, and question the applicability of traditional rules.

Beyond these overarching themes, some more specific policy goals can be identified. For the FCC in particular, these include the following.

Promote competition in voice, video, and interactive services.

In passing the 1996 Act, Congress expressed its intent to implement a “pro-competitive deregulatory national communications policy.” The Internet provides both a space for innovative new services and potential competition for existing communications technologies. The FCC’s role will be to ensure that the playing field is level, and that efficiency and market forces drive competition.

Facilitate network investment and technological innovation.

The Internet encourages the deployment of new technologies that will benefit consumers and produce jobs. The Commission should not attempt to pick winners, but should allow the marketplace to decide whether specific technologies become successful. By eliminating regulatory roadblocks and other disincentives to investment, the FCC should encourage both incumbents and new entrants to develop innovative solutions that transcend the capabilities of the existing network.

Allow all citizens to benefit from advanced technologies.

The communications revolution should benefit all Americans. In an age of new and exciting forms of interactive communications, the FCC should ensure that entities such as schools and libraries are not left behind. However, the mechanisms used to achieve this goal should be consistent with the FCC’s broader policies of competition and deregulation.

This working paper is intended to explore issues and to facilitate discussion, not to propose specific government actions. Many proponents of the Internet’s development are wary of any government actions directed toward the Internet. Government, however, has been intimately involved with the Internet since the network’s beginnings. Government decisions – such as the FCC’s directive that Internet service providers not be subject to interstate access charges, and the widespread requirement by state regulators that local calls be available at flat monthly rates – continue to shape Internet development. Moreover, policy decisions are best made with knowledge and comprehension of their potential implications.

The goal of this paper, therefore, is to promote greater understanding, on the part of both government and the private sector, of the unique policy issues the Internet raises for the FCC and similar agencies. The discussion of a topic is not a suggestion that government regulation in that area is necessary or desirable. On the contrary, a fundamental position of this paper is that government should work to avoid unnecessary interference with the Internet’s development.

Government may influence the evolution of the Internet in many ways, including directly regulating, participating in technical standards development, providing funding, restricting anti-competitive behavior by dominant firms, facilitating industry cooperation otherwise prohibited by antitrust laws, promoting new technologies, encouraging cooperation between private parties, representing the United States in international intergovernmental bodies, and large-scale purchasing of services. The FCC and other government entities may also play a useful role simply by raising the profile of issues and stimulating debate. A better understanding of the relationship between the Internet and telecommunications policy will facilitate intelligent decision-making about when and to what extent any of these government actions are appropriate.

The Endless Spiral of Connectivity

Government officials, pundits, and market researchers often compare the Internet to established communications technologies such as telephony and broadcasting. These efforts are understandable. “Traditional” technologies have well-defined usage characteristics, growth patterns, and market behavior. Moreover, the Internet physically “piggybacks” on other networks, in particular the wireline telephone infrastructure.

Drawing analogies between the Internet and traditional media makes it easier to decide whether existing bodies of law or regulation apply to new Internet-based services. Thus, for example, the debate over the constitutionality of the Communications Decency Act (CDA), which seeks to restrict the transmission of indecent material over the Internet, has often boiled down to a conflict of analogies. Opponents of the CDA have compared the Internet to a telephone network, while supporters often describe the Internet as similar to broadcasting. Because telephone carriers are generally not legally responsible for the content routed over their networks, but broadcasters may be subject to fines for transmitting inappropriate material, the choice of analogy can predetermine the legal outcome.

Although such analogies are appealing, most break down upon closer analysis of the unique characteristics of the Internet. The Internet is substitutable for all existing media. In other words, the Internet potentially poses a competitive threat for every provider of telephony, broadcasting, and data communications services. At the same time, Internet-related businesses are substantial customers of existing telephony, broadcasting, and data companies. The Internet creates alternate distribution channels for pre-existing content, but more importantly, it permits delivery of new and hybrid forms of content. The Internet is one of many applications that utilize the existing telephone network. However, from another perspective, the telephone, broadcasting, and cable networks are simply nodes of the larger network that is the Internet.

Thus, the Internet is fundamentally different from other communications technologies. In most cases, simply mapping the rules that apply to other services onto the Internet will produce outcomes that are confusing, perverse, or worse. Any attempt to understand the relationship between the Internet and telecommunications policy must therefore begin with the distinguishing aspects of the Internet.

How the Internet Is Unique

The distinctiveness of the Internet derives in large part from its technical architecture, which is described in greater detail in Section II. The Internet functions as a series of layers, as increasingly complex and specific components are superimposed on but independent from other components.Footnote 3 The technical protocols that form the foundation of the Internet are open and flexible, so that virtually any form of network can connect to and share data with other networks through the Internet. As a result, the services provided through the Internet (such as the World Wide Web) are decoupled from the underlying infrastructure to a much greater extent than with other media. Moreover, new services (such as Internet telephony) can be introduced without necessitating changes in transmission protocols, or in the thousands of routers spread throughout the network.
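This layering can be made concrete with a small sketch in Python. The header strings and function names below are invented for illustration; they are not real protocol formats. The point is that each lower layer treats everything above it as opaque bytes, so a new service can be added without any change to the layers beneath it.

```python
# A toy model of protocol layering. Header formats here are made up
# for illustration, not real IP or TCP wire formats.

def ip_layer(payload: bytes, dst: str) -> bytes:
    """Network layer: prepends a routing header; knows nothing about the payload."""
    return f"IP|dst={dst}|".encode() + payload

def tcp_layer(payload: bytes, port: int) -> bytes:
    """Transport layer: adds a port number so hosts can demultiplex services."""
    return f"TCP|port={port}|".encode() + payload

def send(app_data: str, dst: str, port: int) -> bytes:
    """Applications hand data down the stack; lower layers never inspect it."""
    return ip_layer(tcp_layer(app_data.encode(), port), dst)

# An established service and a brand-new one use the identical stack:
# neither ip_layer nor tcp_layer had to change to carry the new payload.
web_packet   = send("GET /index.html", "192.0.2.1", 80)
voice_packet = send("<audio frame>", "192.0.2.2", 5060)
```

The decoupling in the text falls out directly: introducing the hypothetical voice service required no edits to the two lower-layer functions.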

The architecture of the Internet also breaks down traditional geographic notions, such as the discrete locations of senders and receivers. The Internet uses a connectionless, “adaptive” routing system, which means that a dedicated end-to-end channel need not be established for each communication. Instead, traffic is split into “packets” that are routed dynamically between multiple points based on the most efficient route at any given moment. Many different communications can share the same physical facilities simultaneously. In addition, any “host” computer connected directly to the Internet can communicate with any other host.

A further distinguishing characteristic of the Internet is its fractal nature. Fractals come from the branch of mathematics known as chaos or complexity theory. Fractals exhibit “self-similarity”; in other words, a roughly similar pattern emerges at any chosen level of detail. Internet traffic patterns most clearly demonstrate the Internet’s fractal tendencies. For traditional communications networks (including the telephone network), engineers have over many years developed sophisticated statistical models to predict aggregate usage patterns. Researchers have shown that usage of the Internet follows not the traditional “Poisson” pattern, but rather a fractal distribution.Footnote 4 In other words, the frequency of Internet connections, the distribution between short and long calls, and the pattern of data transmitted through a point in the network tend to look similarly chaotic regardless of the time scale.
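A small simulation, not drawn from the paper and with purely illustrative parameters, shows the contrast. Poisson traffic smooths out as the observation window grows, while traffic with heavy-tailed gaps between arrivals stays bursty at every scale, as measured by the variance-to-mean ratio of per-bin counts.

```python
import random

random.seed(42)

def arrivals(draw_gap, n):
    """Cumulative arrival times from n interarrival-time draws."""
    t, times = 0.0, []
    for _ in range(n):
        t += draw_gap()
        times.append(t)
    return times

def dispersion(times, bin_size, horizon):
    """Variance-to-mean ratio of per-bin counts: ~1 for Poisson at any scale,
    growing with scale for self-similar (bursty) traffic."""
    counts = [0] * int(horizon / bin_size)
    for t in times:
        if t < horizon:
            counts[int(t / bin_size)] += 1
    mean = sum(counts) / len(counts)
    return sum((c - mean) ** 2 for c in counts) / len(counts) / mean

poisson = arrivals(lambda: random.expovariate(1.0), 50_000)    # memoryless gaps
heavy   = arrivals(lambda: random.paretovariate(1.2), 50_000)  # heavy-tailed gaps

results = {s: (dispersion(poisson, s, 10_000), dispersion(heavy, s, 10_000))
           for s in (1, 10, 100)}
for scale, (d_p, d_h) in results.items():
    print(f"bin size {scale:>3}: Poisson {d_p:6.2f}   heavy-tailed {d_h:9.2f}")
```

At coarser time scales the Poisson ratio stays near one while the heavy-tailed ratio climbs, which is the self-similarity the researchers observed in measured Internet traffic.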

The fractal nature of the Internet confounds regulatory and economic models established for other technologies. However, as chaos theorists have shown, fractals have valuable attributes. In a fractal entity, order emerges from below rather than being dictated from above. The fact that the Internet does not have an easily identifiable hierarchy or any clear organizational structure does not mean that all behavior is random. Many small, uncoordinated interactions may produce an aggregate whole that is remarkably persistent and adaptable.

Finally, the Internet has thus far not been regulated to the same extent as other media. The Communications Act of 1934 (Communications Act), which created the Federal Communications Commission to oversee telephony and radio broadcasting, is more than sixty years old. By contrast, Internet service providers, and other companies in the Internet industry, have never been required to gain regulatory approval for their actions.

The Feedback Loop

If the Internet is not like any other established communications technology, what then is it? On one level, the Internet is whatever anyone wants it to be. It is a plastic, decentralized, constantly evolving network. Any simple concept used to describe the Internet will necessarily be incomplete and misleading.Footnote 5 Such conceptual templates are useful, however, to promote greater understanding of aspects of the Internet that may not otherwise be obvious.

For purposes of this paper, I believe it is valuable to understand the Internet as a feedback loop. A feedback loop occurs when the output of a system is directed back into the system as an input. Because the system constantly produces fuel for its own further expansion, a feedback loop can generate explosive growth.Footnote 6 As the system expands, it produces more of the conditions that allow it to expand further. All networks are feedback loops, because they increase in value as more people are connected.Footnote 7 The Internet, however, is driven by a particularly powerful set of self-reinforcing conditions.

Figure 1 describes some of the interrelated factors that build upon each other to foster the growth of the Internet. Some “supply” factors (such as the availability of higher-capacity networks) permit an expansion of demand (for example, by allowing bandwidth-intensive services such as high-resolution video transmission). Like a digital tornado, the vortex continues, as the new level of demand creates the need for additional capacity, and so forth.Footnote 8 The Internet feedback loop is a fundamentally positive force, because it means that more and more services will be available at lower and lower prices. So long as effective self-correcting mechanisms exist, the Internet will overcome obstacles to its future growth.

Figure 1: The Internet Spiral
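The spiral of Figure 1 can be sketched as a simple numerical feedback loop. The growth coefficients below are arbitrary illustrative assumptions, not estimates from the paper; the point is only the structure, in which each round's output becomes the next round's input.

```python
# A minimal numerical sketch of a supply/demand feedback loop.
# Coefficients (0.5, 0.3) are illustrative, not empirical.

def spiral(years: int):
    """Each year: more capacity -> lower unit price -> more users ->
    more revenue to invest in capacity, and the loop repeats."""
    capacity, users, history = 1.0, 1.0, []
    for year in range(years):
        price = 1.0 / capacity                    # supply up, unit price down
        users *= 1.0 + 0.5 / (1.0 + price)        # cheaper access draws users
        capacity *= 1.0 + 0.3 * users / capacity  # demand funds new capacity
        history.append((year, users, capacity))
    return history

for year, users, capacity in spiral(8):
    print(f"year {year}: users x{users:6.2f}   capacity x{capacity:6.2f}")
```

Both quantities grow every iteration, and each one's growth accelerates the other's, which is the self-reinforcing dynamic the text describes.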

Understanding the underpinnings of the Internet feedback loop is necessary to craft policies that facilitate, and do not hinder, its continuation. There are four primary factors that support the growth of the Internet:

Digitalization and “Deep Convergence”

As described above, the Internet exhibits characteristics of several media that had previously been distinct. Networks carry three types of information – voice, video, and data – and those categories are further subdivided into areas such as pre-recorded vs. live or real-time presentation, and still vs. moving images. Historically, these different forms of information have used different delivery vehicles. The telephone network delivered voice, private corporate networks delivered data, and broadcast networks delivered video. Each service was tightly coupled to a specific form of infrastructure – the telephone network used copper wires to reach subscribers, broadcast television used the airwaves, cable television used coaxial cable, and so forth.

“Convergence” means that those lines are blurring. However, convergence is often understood in a shallow manner, as simply the opportunity for owners of one type of delivery system to compete with another type of delivery system, or as the opportunity for content owners to deliver their content using different technologies. In reality, convergence is something far more fundamental. “Deep convergence” is driven by a powerful technological trend – digitalization. Digitalization means that all of the formerly distinct content types are reduced to a stream of binary ones and zeroes, which can be carried by any delivery platform.Footnote 9 In practical terms, this means not only that specific boundaries – between a telephone network and a cable system, for example – are blurred, but also that the very exercise of drawing any such boundaries must be fundamentally reconsidered or abandoned.

Digitalization has been occurring for decades. The long-distance telephone network in the United States is now almost entirely comprised of digital switches and fiber optic transmission links. These digital facilities, however, have been optimized to transport a single service – voice. The Internet, by contrast, can transmit any form of data. Internet protocols are sufficiently flexible to overcome the boundaries between voice and other services. Innovators can develop new services and immediately load them onto the existing Internet infrastructure. Convergence creates new markets, and new efficiencies, because particular services are no longer locked into specific forms of infrastructure.

Moore’s Law and Metcalfe’s Law

As George Gilder has most clearly articulated, the two technological “laws” that most impact the growth of the Internet are Moore’s Law and Metcalfe’s Law.Footnote 10 Moore’s Law holds that the maximum processing power of a microchip, at a given price, doubles roughly every eighteen months. In other words, computers become faster at an explosive rate, or conversely, the price of a given level of computing power decreases at that same dramatic rate. Metcalfe’s Law says that the value of a network is proportional to the square of the number of nodes. In other words, as networks grow, the utility of being connected to the network grows more than proportionally.

Moore’s Law and Metcalfe’s Law intersect on the Internet. Both the computers through which users access the Internet, and the routers that transmit data within the Internet, are subject to the price/performance curve described by Moore’s Law. At the same time, advances in data transmission technology have expanded the capacity of the Internet’s backbone networks. As the bandwidth available through the network continues to grow, Moore’s Law states that the price of obtaining a given level of bandwidth continues to drop, while Metcalfe’s Law dictates that the value of a connection increases more than proportionally. The ratio of the cost of Internet access to the value it provides plummets over time. And as it plummets, connectivity and higher-bandwidth connections become that much more important, generating more usage and more capital to upgrade the network.
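The combined effect can be seen with back-of-the-envelope arithmetic. The starting cost, node count, and yearly doubling rate below are illustrative assumptions, not figures from the paper.

```python
# Illustrative arithmetic for the two "laws":
# Moore: cost of a fixed capability halves every 18 months.
# Metcalfe: network value scales with the square of the number of nodes.

def moore_cost(years: float, start: float = 100.0) -> float:
    """Cost of a fixed unit of processing/bandwidth after `years`."""
    return start * 0.5 ** (years / 1.5)  # one halving per 18 months

def metcalfe_value(nodes: int) -> int:
    """Value of a network with `nodes` connected points."""
    return nodes ** 2

# Assume the network doubles in size yearly (roughly the mid-1990s pace).
for year in range(6):
    nodes = 1_000 * 2 ** year
    ratio = moore_cost(year) / metcalfe_value(nodes)
    print(f"year {year}: unit cost {moore_cost(year):6.2f}, cost/value {ratio:.3e}")
```

Cost falls geometrically while value rises quadratically with network size, so their ratio collapses, which is the "plummeting" the paragraph describes.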

The Magnetism of Money and Minds

Moore’s Law and Metcalfe’s Law describe the technological forces that push the growth of the Internet, but there are also business forces that exert a powerful influence. In a capitalist economy, the “invisible hand” of the market dynamically redirects capital where it is most highly valued, without any direct outside intervention. Companies that demonstrate superior potential for generating future revenues more easily attract investment, and for public companies, see their stock prices rise. Other companies in the same industry sector often see increases in their stock prices as well, as investors seek to repeat the pattern of the first company and to capitalize on economic trends.

As money flows into a “hot” sector, so do talented people seeking to obtain some of that money by founding or working at a company in that sector. The presence of so many top minds further attracts capital, reflecting a synergistic process I call “the magnetism of money and minds.” This trend promotes the availability of financing to spur the future growth of the Internet.

Competition

Competition enables both the dynamic allocation of capital and talent and the constant innovation in technology that leads to deep convergence and falling prices. In a competitive market, companies must constantly invest and innovate, or risk losing out to competitors. Intel CEO Andy Grove has observed that in the computer industry there are only two kinds of companies: the quick and the dead. Even those companies with strong positions must always look over their shoulder, because customer loyalty vanishes in the face of superior alternatives.

The benefits of competition are evident in the computer industry, where companies must constantly improve their products to remain successful. Competition in the Internet context means that many different providers of hardware, software, and services vie for customers. In a competitive market, providers that can offer superior service or prices are more likely to succeed. Technological innovations that lower costs or allow new service options will be valuable to providers and consumers alike.

Threats to the Continued Spiral

If the Internet truly operates like a feedback loop, why is government intervention necessary?

There are many ways the Internet spiral could be derailed. Any of the underlying drivers of Internet growth could be undermined. Moving toward proprietary standards or closed networks would reduce the degree to which new services could leverage the existing infrastructure. The absence of competition in the Internet service provider market, or the telecommunications infrastructure market, could reduce incentives for innovation. Excessive or misguided government intervention could distort the operation of the marketplace, and lead companies to expend valuable resources manipulating the regulatory process.

Insufficient government involvement may also, however, have negative consequences. Some issues may require a degree of central coordination, even if only to establish the initial terms of a distributed, locally controlled system. A “tragedy of the commons” situation may arise when all players find it in their own self-interest to consume limited common resources. The end result, in the absence of collective action, may be an outcome that no one favors. In addition, the failure of the federal government to identify Internet-related areas that should not be subject to regulation leaves open opportunities for state, local, or international bodies to regulate excessively and/or inconsistently.

How Government Should Act

The novel aspects of the Internet require government policies that are sensitive to both the challenges and the opportunities of cyberspace. Three principles should guide such government decision-making.

Scalability, Not Just Stability

Rather than seeking to restrain the growth of the Internet, government should encourage it. As long as the underpinnings of the network support further expansion, and self-correcting mechanisms can operate freely, the Internet should be able to overcome obstacles to further development. Additional capital and innovation will be drawn to any challenge due to the prospect of high returns. In addition, a focus on scalability directs the attention of policy makers to the future of the network, rather than its current configuration. Given the rapid rate at which the Internet is changing, such a forward-looking perspective is essential. The “growth” of the Internet means more than an increase in the number of users. It also means that the network will evolve and change, becoming an ever more ubiquitous part of society.

Nevertheless, stability remains important. The Internet must achieve a sufficient level of reliability to gain the trust of consumers and businesses. However, even such stability requires an architecture that is built to scale upward. Otherwise, periods of calm will inevitably be followed by crashes as the Internet continues to grow.

Swim with the Current

The economic and technological pressures that drive the growth of the Internet should not be obstacles for government. Rather, government should identify ways to use those pressures to support the goals that government hopes to achieve. In telecommunications, this means using the pricing signals of the market to create incentives for efficiency. In a competitive market, prices are based on costs, and the firm that can provide a service for the lowest cost is likely to succeed. Such competitive pressures operate far more effectively, with lower administrative costs, than direct government mandates.

Similarly, government should look for mechanisms that use the Internet itself to rectify problems and create opportunities for future growth. For example, new access technologies may reduce network congestion, as long as companies have proper incentives to deploy those technologies. Filtering systems may address concerns about inappropriate content. Competition from Internet services may pressure monopolies or outdated regulatory structures. Government agencies should also use the Internet themselves to receive and disseminate information to the public.

The Network, not Networks

The Internet is a network, but so are AT&T, TCI, and NBC. The FCC’s goal should not be to foster the development of any one of those networks individually, but to maximize the public benefits that flow from the Network that encompasses all of those networks and many more. With the growth of competition and the elimination of traditional regulatory, technological, and economic boundaries, networks are more likely than ever to be interdependent, and a policy that benefits one network may have a detrimental effect on others. For example, a mandate that Internet service providers be entitled to connect to the telephone network for free might stimulate Internet use, but telephone companies might be forced to increase their rates or offer lower quality service to recover the increased cost of supporting such connections.

Although government should support the growth of the Internet, this support need not involve explicit subsidies that are not independently justified as a matter of public policy and economics. Instead, government should create a truly level playing field, where competition is maximized and regulation minimized.

How the Internet Works
Basic Characteristics

Just as hundreds of millions of people who make telephone calls every day have little conception of how their voice travels almost instantaneously to a distant location, most Internet users have only a vague understanding of how the Internet operates. The fundamental operational characteristics of the Internet are that it is a distributed, interoperable, packet-switched network.

A distributed network has no one central repository of information or control, but is comprised of an interconnected web of “host” computers, each of which can be accessed from virtually any point on the network. Thus, an Internet user can obtain information from a host computer in another state or another country just as easily as obtaining information from across the street, and there is no hierarchy through which the information must flow or be monitored. Instead, routers throughout the network regulate the flow of data at each connection point. By contrast, in a centralized network, all users connect to a single location.Footnote 11 The distributed nature of the Internet gives it robust survivability characteristics, because there is no one point of failure for the network, but it makes measurement and governance difficult.
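A toy sketch illustrates the survivability property. The four-node mesh and its link names are invented; real Internet routing protocols are far more elaborate, but the principle of finding an alternate path after a link fails is the same.

```python
from collections import deque

def shortest_path(links, src, dst):
    """Fewest-hop path over an undirected set of links, by breadth-first search."""
    graph = {}
    for a, b in links:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in graph.get(path[-1], ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # destination unreachable

mesh = [("A", "B"), ("B", "D"), ("A", "C"), ("C", "D")]   # two routes from A to D
degraded = [link for link in mesh if link != ("B", "D")]  # the B-D link fails

print(shortest_path(mesh, "A", "D"))      # a two-hop route, via B or via C
print(shortest_path(degraded, "A", "D"))  # -> ['A', 'C', 'D']: rerouted around the failure
```

No central controller is consulted at any point: the path is recomputed from the surviving links alone, so losing one link degrades nothing but the route taken.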

An interoperable network uses open protocols so that many different types of networks and facilities can be transparently linked together, and allows multiple services to be provided to different users over the same network. The Internet can run over virtually any type of facility that can transmit data, including copper and fiber optic circuits of telephone companies, coaxial cable of cable companies, and various types of wireless connections. The Internet also interconnects users of thousands of different local and regional networks, using many different types of computers. The interoperability of the Internet is made possible by the TCP/IP protocol, which defines a common structure for Internet data and for the routing of that data through the network.

A packet-switched network means that data transmitted over the network is split up into small chunks, or “packets.” Unlike “circuit-switched” networks such as the public switched telephone network (PSTN), a packet-switched network is “connectionless.”Footnote 12 In other words, a dedicated end-to-end transmission path (or circuit) does not need to be opened for each transmission.Footnote 13 Rather, each router calculates the best routing for a packet at a particular moment in time, given current traffic patterns, and sends the packet to the next router. Thus, even two packets from the same message may not travel the same physical path through the network. This mechanism is referred to as “dynamic routing.” When packets arrive at the destination point, they must be reassembled, and packets that do not arrive for whatever reason must generally be re-sent. This system allows network resources to be used more efficiently, as many different communications can be routed simultaneously over the same transmission facilities. On the other hand, under such a “best effort” routing system,Footnote 14 the sending computer cannot ensure that sufficient bandwidth will be available between the two points, which creates difficulties for services that require constant transmission rates, such as streaming video and voice applications.Footnote 15
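The split/reorder/re-send cycle can be sketched in a few lines. The header layout and loss model below are invented for illustration; real TCP behavior (windows, acknowledgments, retransmission timers) is far more sophisticated.

```python
import random

def packetize(message, size=8):
    """Split a message into (sequence number, chunk) packets."""
    return [(seq, message[seq:seq + size]) for seq in range(0, len(message), size)]

_rng = random.Random(7)  # fixed seed so the simulation is repeatable

def deliver(packets, loss_rate=0.2):
    """Best-effort delivery: packets may arrive out of order, some are dropped."""
    arrived = [p for p in packets if _rng.random() > loss_rate]
    _rng.shuffle(arrived)
    return arrived

def reassemble(original, arrived):
    """Reorder by sequence number; anything missing is re-sent from the source."""
    received = dict(arrived)
    for seq, chunk in original:          # re-send lost packets
        received.setdefault(seq, chunk)
    return "".join(received[seq] for seq, _ in sorted(original))

msg = "Even two packets from the same message may travel different paths."
reassembled = reassemble(packetize(msg), deliver(packetize(msg)))
assert reassembled == msg
```

The receiver recovers the exact message despite drops and reordering, at the cost of re-sends, which is why constant-rate services such as live voice fare worse than file transfer under best-effort delivery.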

The Internet Today

As of January 1997 there were over sixteen million host computers on the Internet, more than ten times the number of hosts in January 1992.Footnote 16 Several studies have produced different estimates of the number of people with Internet access, but the numbers are clearly substantial and growing. A recent Intelliquest study pegged the number of subscribers in the United States at 47 million,Footnote 17 and Nielsen Media Research concluded that 50.6 million adults in the United States and Canada accessed the Internet at least once during December 1996 – compared to 18.7 million in spring 1996.Footnote 18 Although the United States is still home to the largest proportion of Internet users and traffic, more than 175 countries are now connected to the Internet.Footnote 19

According to a study by Hambrecht & Quist, the Internet market exceeded one billion dollars in 1995, and is expected to grow to some 23 billion dollars in the year 2000. This market comprises several segments, including network services (such as ISPs); hardware (such as routers, modems, and computers); software (such as server software and other applications); enabling services (such as directory and tracking services); expertise (such as system integrators and business consultants); and content providers (including online entertainment, information, and shopping). The Internet access or “network services” portion of the market is of particular interest to the FCC, because it is this aspect of the Internet that impacts most directly on telecommunications facilities regulated by the Commission. There are now some 3,000 Internet access providers in the United States,Footnote 20 ranging from small start-ups to established players such as Netcom and AT&T to consumer online services such as America Online.

Internet Trends

Perhaps the most confident prediction that can be made about the Internet is that it will continue to grow. The Internet roughly doubled in users during 1995, and this trend appears to be continuing.Footnote 21 Figure 2 shows one projection of the growth in residential and business users over the remainder of the decade. Estimates suggest as many as half a billion people will use the Internet by the year 2000.Footnote 22

Figure 2: Internet Growth Projections

As the Internet grows, methods of accessing the Internet will also expand and fuel further growth. Today, most users access the Internet through either universities, corporate sites, dedicated ISPs, or consumer online services. Telephone companies, whose financial resources and network facilities dwarf those of most existing ISPs, have only just begun to provide Internet access to businesses and residential customers. Cable companies are also testing Internet access services over their coaxial cable networks, and satellite providers have begun to roll out Internet access services. Several different forms of wireless Internet access are also being deployed.

At the same time as these new access technologies are being developed, new Internet clients are also entering the marketplace. Low-cost Internet devices such as WebTV and its competitors allow users to access Internet services through an ordinary television for a unit cost of approximately $300, far less than most personal computers. Various other devices, including “network computers” (NCs) for business users, and Internet-capable video game stations, promise to reduce the up-front costs of Internet access far below current levels. These clients promise to expand greatly the range of potential Internet users. Moreover, as Internet connectivity becomes embedded into ordinary devices (much as computer chips now form the brains of everything from automobiles to microwave ovens), the Internet “market” will expand even more.

Bandwidth will continue to increase to meet this new demand, both within the Internet backbones and out to individual users. There is a tremendous level of pent-up demand for bandwidth in the user community today. Most users today are limited to the maximum speed of analog phone lines, which appears to be close to the 28.8 or 33.6 kbps supported by current analog modems, but new technologies promise tremendous gains in the bandwidth available to the home.Footnote 23 In addition, the backbone circuits of the Internet are now being upgraded to OC-12 (622 Mbps) speeds, with far greater speeds on the horizon.Footnote 24 With more bandwidth will come more services, such as full-motion video applications. Virtually every one of the challenges identified in this paper will become more acute as bandwidth and usage increase, and as the current limitations of the Internet are overcome. Thus, even though some of the questions that the Internet poses are of limited practical significance today, policy makers should not wait to consider the implications of the Internet.Footnote 25
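The gap between today's access speeds and the backbone capacity described above can be made concrete with a back-of-the-envelope transfer-time calculation. The file size and the overhead-free model are illustrative assumptions, not figures from the paper.

```python
def transfer_seconds(size_bytes, rate_bps):
    """Idealized transfer time: payload bits divided by line rate
    (ignores protocol overhead, congestion, and retransmission)."""
    return size_bytes * 8 / rate_bps

ten_mb = 10 * 1024 * 1024  # e.g., a short video clip

modem = transfer_seconds(ten_mb, 28_800)       # 28.8 kbps analog modem
oc12 = transfer_seconds(ten_mb, 622_000_000)   # OC-12 backbone circuit

print(f"28.8 kbps modem: {modem / 60:.1f} minutes")  # roughly 48.5 minutes
print(f"OC-12 backbone: {oc12:.2f} seconds")         # well under a second
```

Even under these generous assumptions, the same clip that ties up an analog line for the better part of an hour crosses an OC-12 circuit in a fraction of a second, which is why services such as full-motion video hinge on last-mile bandwidth rather than backbone capacity.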

Throughout the history of the Internet, seemingly insurmountable obstacles have been overcome. Few people would have expected a network designed for several dozen educational and research institutions to scale to a commercial, educational, and entertainment conduit for tens of millions of users, especially with no means of central coordination and administration. Governments should recognize that the Internet is different from traditional media such as telephony and broadcasting, although lessons can be learned from experience in dealing with those technologies. At the same time, the Internet has always been, and will continue to be, influenced by the decisions of large institutions and governments. The challenge will be to ensure that those decisions reinforce the traditional strengths of the Internet, and tap into the Internet’s own capability for reinvention and problem-solving.

Category Difficulties

The FCC has never directly exercised regulatory jurisdiction over Internet-based services. However, the rapid development of the Internet raises the question of whether the language of the Communications Act of 1934 (as amended by the Telecommunications Act of 1996), or existing FCC regulations, cover particular services offered over the Internet.

Governments act by drawing lines, such as the jurisdictional lines that identify which governmental entity has authority over some activity, or the service classifications that differentiate which body of law should be applied in a particular case. Governments traditionally determine the treatment of new services by drawing analogies to existing services. For example, the FCC regulates long-distance telephony, but does not regulate dial-up remote access to corporate data networks. ISPs almost exclusively receive calls from their subscribers, but so do retailers taking catalog orders or radio stations holding call-in promotions. Dial-up access to the Internet resembles, but differs from, other types of connections.

There are reasons to believe that a simple process of drawing analogies to familiar services will not be appropriate for the Internet. The Internet is simultaneously local, national, and global, and is almost infinitely plastic in terms of the services it can support. As a result, it confounds any attempt at classification. Failure to consider such category difficulties is, however, itself a form of line drawing. As long as some communications services are subject to regulatory constraints, legal boundaries will be necessary. New approaches may therefore be necessary to avoid inefficient or burdensome results from existing legal and regulatory categories.

Toward a Rational Approach

The primary goal of this paper is to identify issues, not to offer specific policy recommendations. It is important to remember that, despite the tremendous attention given to the Internet in the past few years, it remains orders of magnitude smaller in terms of usage and revenues than the voice telephone network in the United States. Many of the questions raised here will answer themselves as service providers fine-tune their business models and as the communications industry evolves. Once competition is sufficiently well-developed, regulation may become largely unnecessary. At some point, companies will be disciplined more strongly by market forces than by the dictates of regulators. Nonetheless, some thoughts about how to address the categorization challenges raised in this section are appropriate.

So long as some services are regulated, a line-drawing process must take place. When Internet services are involved, this line drawing will be inherently messy and imprecise. However, even the premise that Internet services should not be regulated requires a precise assessment of what constitutes an “Internet” service. With the increasing prevalence of hybrid services, joint ventures, and alternative technologies, such distinctions will always be difficult. No matter how sophisticated the regulator, companies in the marketplace will devise clever means of avoiding regulatory restrictions. No matter how well-intentioned the regulator, government intervention in the private sector can have unexpected and unfortunate consequences.

Thus, government should apply blunt instruments that achieve underlying goals, rather than struggling for an elegant or precise solution that will cover every case. Wherever possible, market forces should be harnessed to take the place of direct regulatory intervention. Although new services like Internet telephony and streaming video may create legal headaches, these developments are positive ones that government should encourage. Such new technologies are valuable both because of the new options they represent for consumers and because of the potential competitive pressure they may exert on incumbent providers.

The first task of government policy towards these new Internet-based services should therefore be to identify those areas where regulation is clearly not appropriate. By distinguishing these “easy cases,” government can provide greater certainty to the private sector that regulation will not be extended to the theoretical boundaries of statutory authority. For example, when a company such as Vocaltec sells retail software that allows end users to make voice phone calls through the Internet, and nothing more, it makes little sense to classify that company as a telecommunications carrier subject to federal and state regulation. Such software providers merely enable end users to utilize a functionality through the network, much like companies that sell fax machines. They do not themselves transport telecommunications traffic. Similarly, an ISP should not be classified as a telecommunications carrier simply because some of its users choose to use Internet telephony software to engage in voice calls. By stating that such companies are not subject to the Communications Act, the FCC could eliminate fear and uncertainty, while still leaving room to analyze the harder questions.

The next step should be to identify relatively simple and flexible structures that achieve underlying policy goals. The initial assumption ought to be that new Internet-based services should not be subject to the regulatory constraints of traditional services. Government policy should be sensitive to the fact that technology is changing rapidly, and that the Internet landscape a few years in the future may look very different than it does today. Market forces may lead to the creation of differentiated classes of service, with users paying higher rates for higher quality, thus de facto distinguishing between different types of service offerings, without any intervention by the government.

The analytical process must work in both directions. Government should think not only about the regulatory treatment of new services, but about the implications of those new services for the regulatory treatment of existing services. If a competitive imbalance exists because a new technology is not subject to the same regulatory constraints as a competing older technology, the answer should be reduced regulation of the older technology. Of course, such deregulation should be dependent on the existence of sufficient competition to police the actions of incumbents. The ultimate objective, however, should be less regulation for all, rather than more regulation for some.

Conclusion

This working paper has reviewed many difficult and complex issues that have arisen as the Internet has grown to prominence. I have attempted to identify government policy approaches that would have a positive influence on the development of the Internet. This final section seeks to place the challenges described throughout this paper into a broader context.

The Internet and Competition in Telecommunications

The movement toward deregulation and local competition in telecommunications in the United States may be the single most significant development for the future of the Internet. The decisions that the FCC, state regulators, and companies make about how to create a competitive marketplace will determine the landscape in which the Internet evolves. The shape of local competition will influence what types of companies are able to provide Internet access to what categories of users, under what conditions, and at what price. The removal of barriers between different industries – such as the prohibition on BOCs offering in-region long-distance service – will accelerate the convergence that is already occurring as a result of digitalization and other technological trends.

Internet providers are potentially both substantial customers of circuit-switched voice carriers, and competitors to them. It is ultimately in the interests of both ISPs (who depend on the PSTN to reach their customers) and LECs (who derive significant revenue from ISPs) to have pricing systems that promote efficient network development and utilization. If the costs of Internet access through incumbent LEC networks increase substantially, users will have even stronger incentives to switch to alternatives such as competitive local exchange carriers, cable modems, and wireless access.

Dial-up Internet access today tends to be priced on a flat-rated basis, for both the PSTN portion of the connection and the transmission of packets through Internet backbones. By contrast, interexchange telephone service tends to be charged on a per-minute basis.Footnote 26 However, both networks run largely over the same physical facilities. There is some evidence that Internet and long-distance pricing are beginning to move towards each other.Footnote 27 This paper has discussed some of the arguments about usage pricing for Internet connections through the PSTN; similar debates are occurring among Internet backbone providers in response to congestion within the Internet. With the development of differentiated quality of service mechanisms on Internet backbones, usage pricing seems likely to become more prevalent on the Internet, although usage in this context may be measured by metrics other than minutes.

In the telephone world, flat-rated pricing appears to be gaining ground. The FCC established the subscriber line charge (SLC), because the fixed costs it represented were more efficiently recovered on a flat-rated basis. The Access Reform proceeding raises questions about whether other usage-sensitive charges (such as the Transport Interconnection Charge and the Carrier Common Line Charge) should be replaced with flat-rated charges, and there was substantial debate in the Interconnection proceeding about whether LEC switching capacity should be sold on a flat-rated basis in the form of a “switch platform.” Pressure toward flat-rated pricing is also arising for business reasons – for example, Southwestern Bell has reportedly considered offering a flat-rated regional long-distance plan when it receives interLATA authorization. Customers in the US seem to prefer the certainty of flat-rated pricing even where it winds up costing more for their particular level of usage.
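The consumer preference noted above – choosing flat-rated pricing even when it costs more – can be illustrated with a simple break-even comparison. The rates and function below are hypothetical, chosen only to show the arithmetic, not actual 1997 tariffs.

```python
def cheaper_plan(minutes, flat_monthly=19.95, per_minute=0.10):
    """Compare a hypothetical flat-rate plan against metered pricing
    for a given monthly usage level. Returns the cheaper plan's name
    plus both monthly bills."""
    metered = minutes * per_minute
    cheaper = "flat" if flat_monthly <= metered else "metered"
    return cheaper, flat_monthly, metered

# Break-even falls at flat_monthly / per_minute = 199.5 minutes/month.
assert cheaper_plan(100)[0] == "metered"  # light user: $10.00 metered
assert cheaper_plan(400)[0] == "flat"     # heavy user: $40.00 metered
```

The economics explain only part of the preference: below the break-even point the flat rate is strictly worse, yet customers still tend to choose it for the certainty of a predictable bill.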

There are, of course, important differences in the architectures of the Internet and the public switched telephone network. However, both of these architectures are evolving. There will not be one universal pricing structure for the Internet or the telephone network, for the simple reason that there will not be one homogenous network or one homogenous company running that network. Technology and business models should drive pricing, rather than the reverse.

Today, the vast majority of Internet users and ISPs must depend on incumbent LECs for their connections to the Internet. These incumbent LECs have huge investments in their existing circuit-switched networks, and thus may be reluctant, absent competitive pressure, to explore alternative technologies that involve migrating traffic off those networks. The economics of the Internet are uncertain, since the market is growing and changing so rapidly. Competition will enable companies to explore the true economics and efficiencies of different technologies. The unbundling mandated by the 1996 Act will allow companies to leverage the existing network to provide new high-bandwidth data services.

Competition can lead to instability or confusion, especially during periods of transition. Monopolies provide certainty of returns that, by definition, cannot be achieved in a competitive market. With many potential players, forecasting the future of the industry can be difficult. Companies must choose between different technologies and business models, and those companies that do not choose wisely will see the impact on their bottom lines.

Yet, as the Internet demonstrates, uncertainty can be a virtue. The Internet is dynamic precisely because it is not dominated by monopolies or governments. Competition in the Internet industry, and the computer industry that feeds it, has led to the rapid expansion of the Internet beyond anything that could have been foreseen. Competition in the communications industry will facilitate a similarly dynamic rate of growth and innovation.

The Right Side of History

The legal, economic, and technical underpinnings of the telecommunications infrastructure in the United States have developed over the course of a century, while the Internet as a service for consumers and private businesses is less than a decade old, and the national framework for competition in local telecommunications markets was adopted scarcely more than a year ago. Challenges that seem insurmountable today may simply disappear as the industry and technology evolve.

As significant as the Internet has become, it is still near the beginning of an immense growth curve. America Online, the largest ISP, has grown from under a million subscribers to eight million in roughly four years. But those eight million subscribers represent only a fraction of the eighty million households served by AT&T. The revenues generated by the Internet industry, although growing rapidly, pale in comparison to those generated by traditional telephony. Only about 15 percent of the people in the United States use the Internet today, and less than 40 percent of households even have personal computers. A decade from now, today’s Internet may seem like a tiny niche service. Moreover, as Internet connectivity is built into cellular phones, television sets, and other household items, the potential number of Internet hosts will mushroom beyond comprehension. Computers are now embedded in everything from automobiles to cameras to microwave ovens, and all of these devices may conceivably be networked together. The Internet may exert the greatest influence on society once it becomes mundane and invisible.

The growth potential of the Internet lends itself to both pessimistic and optimistic expectations. The pessimist, having struggled through descriptions of legal uncertainties, competitive concerns, and bandwidth bottlenecks, will be convinced that all these problems can only become worse as the Internet grows. The optimist, on the other hand, recognizes that technology and markets have proven their ability to solve problems even faster than they create them.

The global economy increasingly depends on networked communications, and communications industries are increasingly shifting to digital technologies. Bandwidth is expanding, but so is demand for bandwidth. None of these trends shows signs of diminishing. As long as there is a market for high-speed connections to the Internet, companies will struggle to make those high-speed connections available in an affordable and reliable manner. Once a sufficiently affordable and reliable network is built, new services will emerge to take advantage of it, much as the World Wide Web could take off once the Internet had reached a certain level of development.

Difficulties and confusion may arise along the way, but improvements in communications technology will continue to provide myriad benefits for individuals, businesses, and society. In the long run, the endless spiral of connectivity is more powerful than any government edict.

Footnotes

1 Telecommunications Act of 1996, Pub. L. No. 104–104, 110 Stat. 56, to be codified at 47 U.S.C. §§ 151 et seq. (1996 Act), at § 230(b)(2). Hereinafter, all citations to the 1996 Act will be to the 1996 Act as codified in the United States Code.

2 A Framework for Global Electronic Commerce, available on the World Wide Web, at www.whitehouse.gov.

3 Tony Rutkowski, former Executive Director of the Internet Society, has written a more detailed discussion of the implications of Internet architecture for the development of the network. See Anthony M. Rutkowski, “Internet as Fractal: Technology, Architecture, and Evolution,” in The Internet as Paradigm (Aspen Institute 1997).

4 See Amir Atai and James Gordon, Impacts of Internet Traffic on LEC Networks and Switching Systems (Bellcore 1996); Vadim Antonov, ATM: Another Technological Mirage, available on the World Wide Web, at www.pluris.com/ip_vs_atm/.

5 For a thorough explication of various metaphors for the Internet, including the now well-worn notion of the “Information Superhighway” coined by Vice President Albert Gore, see Mark Stefik, Internet Dreams: Archetypes, Myths, and Metaphors (1996).

6 For an extended discussion of the significance for feedback loops and control mechanisms as they relate to new technologies, see Kevin Kelly, Out of Control: The New Biology of Machines, Social Systems, and the Economic World (1994).

7 See infra section (IV)(B).

8 The tornado metaphor has been used by Paul Saffo, Eric Schmidt, and others to describe the Internet.

9 See Digitization and Competition (Computer Systems Policy Project 1996).

10 See, e.g., George Gilder, “The Bandwidth Tidal Wave,” Forbes ASAP, December 5, 1994.

11 In some cases, centralized networks use regional servers to “cache” frequently accessed data, or otherwise involve some degree of distributed operation.

12 Some newer technologies, such as asynchronous transfer mode (ATM) switching, allow for the creation of “virtual circuits” through the Internet, which allow traffic to follow a defined route through the network. However, information is still transmitted in the form of packets.

13 In actuality, much of the PSTN, especially for long-distance traffic, uses digital multiplexing to increase transmission capacity. Thus, beyond the truly dedicated connection along the subscriber loop to the local switch, the “circuit” tied up for a voice call is a set of time slices or frequency assignments in multiplexing systems that send multiple calls over the same wires and fiber optic circuits.

14 In a “best effort” delivery system, routers are designed to “drop” packets when traffic reaches a certain level. These dropped packets must be resent, which to the end user is manifested in the form of delay in receiving the transmission.

15 “Streaming” voice and video applications are those in which the data available to the receiving user is updated as data packets are received, rather than waiting until an entire image or sound file is downloaded to the recipient’s computer.

16 Network Wizards Internet Domain Survey, January 1997.

17 See “US on-line population reaches 47 million – Intelliquest survey results,” Internet IT Informer, February 19, 1997, available on the World Wide Web, at www.mmp.co.uk/mmp/informer/netnews/HTM/219n1e.htm.

18 See Julia Angwin, “Internet Usage Doubles in a Year,” San Francisco Chronicle, March 13, 1997, at B1.

19 Network Wizards Internet Domain Survey, January 1997, available on the World Wide Web, at www.nw.com/zone/WWW/top.html.

20 Boardwatch Directory of Internet Service Providers (Fall 1996).

21 See “Market Size,” CyberAtlas, available on the World Wide Web, at www.cyberatlas.com/market.html.

22 Paul Taylor, “Internet Users ‘Likely to Reach 500 m by 2000,’” Financial Times, May 13, 1996, at 4.

23 Several manufacturers are beginning to deploy 56kbps modems. See “U.S. Robotics Launches the New Battle – 56kbps Modems,” Boardwatch, January 1997. This technology provides higher downstream transmission rates, but requires ISPs to have digital connections to the local exchange network. The throughput of these modems under real-world conditions will depend on the nature of each user’s connection, and will usually be lower than 56 kbps. In addition, current FCC technical rules governing line power may limit the maximum connection speed of these modems to 53kbps.

24 MCI, for example, currently plans to upgrade its backbone to OC-48 speed (2.5 Gbps) by 1998.

25 Of course, widespread penetration of new, higher-bandwidth services may take far longer than some breathless commentators predict today. See Jonathan Weber, “Internet Video: Idea Whose Time Will Come … Slowly,” Los Angeles Times, May 13, 1996, at B8. Although policy makers and regulators should be aware of the possibilities that the Internet creates, concrete actions should not be taken based on mere speculation about the potential impact of a service.

26 Outside the United States, local telephone service is usually charged on a per-minute basis as well.

27 See generally “Too Cheap to Meter?,” The Economist, October 19, 1996, at 23.

Figure 1: The Internet Spiral
