No one owns it. And no one in particular actually runs it. Yet more than half a billion people rely on it as they do a light switch.
The Internet is a network whose many incarnations -- as obscure academic playpen, information superhighway, vast marketplace, sci-fi-inspired matrix -- have seen it through more than three decades of ceaseless evolution.
In the mid-1990s, a handful of doomsayers predicted that the Internet would melt down under the strain of increased volume. They proved to be false prophets, yet now, as it enters its 33rd year, the Net faces other challenges.
The demands and dangers -- sudden, news-driven traffic, security holes, and a clamor for high-speed access to homes -- are concerns that bear no resemblance to those that preoccupied the Internet's creators. For all their genius, they failed to see what the Net would become once it left the confines of the university and entered the free market.
Those perils are inextricably linked to what experts consider the Internet's biggest promise: evolving into an information utility as ubiquitous and accessible as electricity. That, too, was not foreseen by most of the engineers and computer scientists who built the Net in the 1960s and '70s.
Ten years ago, at the end of 1991, the same year that the World Wide Web was put in place but a good two or three years before the term Web browser became part of everyday speech, the Net was home to some 727,000 hosts, or computers with unique Internet Protocol, or IP, addresses. By the end of 2001, that number had soared to 175 million, according to estimates by Matrix Net Systems, a network measurement business in Austin, Texas.
For all that growth, the Net operates with surprisingly few hiccups, 24 hours a day -- and with few visible signs of who is responsible for keeping it that way. There are no vans with Internet Inc logos at the roadside, no workers in Cyberspace hard hats hovering over manholes.
Such is yet another of the Internet's glorious mysteries. No one really owns the Net, which, as most people know by now, is actually a sprawling collection of networks owned by various telecommunications carriers. The largest, known as backbone providers, include WorldCom, Verizon, Sprint and Cable & Wireless USA.
What, then, is the future of this vital public utility? Who determines it? And who is charged with carrying it out?
For the Internet's first 25 years, the US government ran parts of it, financed network research and in some cases paid companies to build custom equipment to run the network. But in the mid-1990s the Net became a commercial enterprise, and its operation was transferred to private carriers. In the process, most of the government's control evaporated.
Now the network depends on the cooperation and mutual interests of the telecommunications companies. Those so-called backbone providers adhere to what are known as peering arrangements, which are essentially agreements to exchange traffic at no charge.
"Peering fits right in with the overly loose way the Internet is provided," said Scott Bradner, a senior technical consultant at Harvard University, "which is unrelated commercial interests doing their own thing."
Bradner, co-director of the Internet Engineering Task Force, an international self-organized group of network designers, operators and researchers who have set technical standards for the Internet since the late 1980s, said that peering remains a remarkably robust mechanism.
And for now, capacity is not a particularly pressing problem because the backbone providers have been laying high-speed lines at prodigious rates over the last few years.
"We've got a lot of long-distance fiber in the ground, a lot of which isn't being used, but it's available," said Craig Partridge, a chief scientist at BBN Technologies, an engineering company that oversaw the building of the first network switches in the late 1960s and is now owned by Verizon.
Still, the fear that the Net is not up to its unforeseen role gnaws at prognosticators. Consider the gigalapse prediction.
In December 1995, Robert Metcalfe, who invented the office network technology known as Ethernet, wrote in his column in the industry weekly Infoworld that the Internet was in danger of a vast meltdown.
More specifically, Metcalfe predicted what he called a gigalapse, or 1 billion lost user hours resulting from a severed link -- for instance, a ruptured connection between a service provider and the rest of the Internet, a backhoe accidentally cutting a cable or the failure of a router.
The disaster would come by the end of 1996, he said, or he would eat his words.
The gigalapse did not occur, and while delivering the keynote address at an industry conference in 1997, Metcalfe literally ate his column. "I reached under the podium and pulled out a blender, poured a glass of water, and blended it with the column, poured it into a bowl and ate it with a spoon," he recalled recently.
Distributed network
The failure of Metcalfe's prediction apparently stemmed from the success of the Net's basic architecture. It was designed as a distributed network rather than a centralized one, with data taking any number of different paths to its destination. If one link fails, the data finds another one.
That deceptively simple principle has, time and again, saved the network from failure. When a communications line important to the network's operation goes down, as one did last summer when a freight-train fire in Baltimore damaged a fiber-optic loop, data works its way around the trouble.
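That routing-around principle can be sketched as a graph search. The snippet below is an illustrative toy, not the Net's actual routing protocols: a hypothetical four-router network in which severing one link simply forces traffic onto another path.

```python
from collections import deque

# A toy network: nodes stand in for routers, edges for links.
links = {
    "A": {"B", "C"},
    "B": {"A", "D"},
    "C": {"A", "D"},
    "D": {"B", "C"},
}

def find_path(links, start, goal, down=frozenset()):
    """Breadth-first search that skips any link marked as down."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for nxt in links[node]:
            if nxt in seen or frozenset((node, nxt)) in down:
                continue
            seen.add(nxt)
            queue.append(path + [nxt])
    return None  # no route at all

print(find_path(links, "A", "D"))
# Sever the A-B link (the backhoe scenario): traffic reroutes via C.
print(find_path(links, "A", "D", down={frozenset(("A", "B"))}))  # ['A', 'C', 'D']
```

Real backbone routers accomplish the same end with routing protocols such as BGP, which continuously recompute paths as links come and go, but the underlying idea is the one above: as long as any path survives, the data gets through.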
It took a far greater crisis to make the Internet's vulnerabilities clearer.
On Sept. 11, within minutes of the terrorist attacks on the World Trade Center, the question was not whether the Internet could handle the sudden wave of traffic, but whether the servers -- the computers that deliver content to anyone who requests it by clicking on a Web link -- were up to the task.
Executives at CNN.com were among the first to notice the Internet's true Achilles' heel: the communications link to individual sites that become deluged with traffic. CNN.com fixed the problem within a few hours by adding server capacity and moving some of its content to servers operated by Akamai, a company providing distributed network service.
Bradner said that most large companies have active mirror sites to allow quick downloading of the information on their servers. And as with so many things about the Net, responsibility lies with the service provider.
"Whether it's CNN.com or nytimes.com or anyone offering services, they have to design their service to be reliable," he said. "This can never be centralized."
Guidelines can help. Bradner belongs to a Federal Communications Commission advisory group called the Network Reliability and Interoperability Council, which just published a set of recommended practices for service providers, including advice on redundant servers, backup generators and reliable power. "Still, there are no requirements," Bradner said.
If the government is not running things, exactly, at least it is taking a close look.
Partridge of BBN Technologies recently served on a National Research Council committee that published a report on the Internet. One of the committee's biggest concerns was bringing broadband, or high-speed, Internet service to households.
Some 10.7 million of the nation's households now have broadband access, or about 16 percent of all households online, according to the Yankee Group, a research firm.
Only when full high-speed access is established nationwide, Partridge and others say, will the Internet and its multimedia component, the Web, enter the next phase of their evolution.
"We need to make it a normal thing that everyone has high-speed bandwidth," said Brian Carpenter, an engineer at IBM and chairman of the Internet Society, a nonprofit group that coordinates Internet-related projects around the world.
Yet there is no central coordination of broadband deployment. Where, when and how much access is available is up to the individual provider -- typically, the phone or cable company. As a result, the availability of broadband service varies widely.
Control falls to the marketplace. And in light of recent bankruptcies and mergers among providers, like Excite@Home's failure and AT&T Broadband's sale to Comcast late last year, universal broadband deployment may be moving further into the future.
The one prominent element of centralized management in Internet operations -- the assignment of addresses and top domain names, like .com or .edu -- reflects the tricky politics of what is essentially a libertarian arena. That is the task of the Internet Corporation for Assigned Names and Numbers, or Icann, which operates under the auspices of the Commerce Department. Its efforts to establish an open decision-making process became mired in disputes over who the Internet's stakeholders actually were.
And even as Icann and its authorized registrars take over administration of the Internet's naming system, a different problem nags at computer scientists: the finite number of underlying IP addresses.
In the current version of the Internet Protocol, the rules by which routers direct traffic across the network, addresses are 32 bits long, for a theoretical limit of about 4 billion. Some 25 percent are already spoken for.
The solution, Carpenter said, is bigger addresses. "This means rolling out a whole new version of IP," he said.
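The arithmetic behind the bigger addresses is easy to check. The sketch below uses Python's standard ipaddress module; IPv6 is the successor protocol Carpenter alludes to, which widens addresses from 32 bits to 128.

```python
import ipaddress

# IPv4 addresses are 32 bits, so the theoretical ceiling is 2**32.
ipv4_total = 2 ** 32
print(ipv4_total)  # 4294967296 -- roughly 4 billion

# IPv6 widens addresses to 128 bits, an astronomically larger space.
ipv6_total = 2 ** 128
print(ipv6_total)  # about 3.4 x 10**38

# The standard library parses both address families.
v4 = ipaddress.ip_address("192.0.2.1")
v6 = ipaddress.ip_address("2001:db8::1")
print(v4.version, v6.version)  # 4 6
```

The jump from 32 to 128 bits is why rolling out the new protocol, rather than rationing the old addresses, is the long-term fix.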
Although the assignment of IP addresses falls to Icann, inventing a new protocol is essentially a research problem that falls to the Internet Engineering Task Force.
As the Internet continues to grow and sprawl, security is also a nagging concern. The Internet was not built to be secure in the first place: its openness is its core strength and its most conspicuous weakness.
"Security is hard -- not only for the designers, to make sure a system is secure, but for users, because it gets in the way of making things easy," Bradner said.
There is no centralized or even far-flung security management for the Internet. The Computer Emergency Response Team at Carnegie Mellon University is mainly a voluntary clearinghouse for information about security problems in Internet software.
The lack of a central security mechanism "is a mixed bag," Bradner said. A centralized system that could authenticate the origin of all traffic would be useful in tracing the source of an attack, he said.
Striking a balance
That is where a delicate balance must be struck: between the ability to trace traffic and the desire to protect an individual's privacy or a corporation's data. "It's not at all clear that there's a centralizable role, or that there's a role government could play without posing a severe threat to individuals," Bradner said.
Past plans for identity verification have failed because of the complexity of making them work on a global scale, he said.
Such are the challenges that face the Internet as it continues its march.
"The really interesting question to ask is whether we can build a next generation of applications," Carpenter said. "Can we move from what we have now, which is an information source, to a network that's really an information utility, used for entertainment, education and commercial activities? There's tremendous potential here, but we've got a lot of work to do."
As that work progresses, another question centers on what role the government should play. Many carriers who bear the cost of expanding the infrastructure favor federal incentives for carriers to invest in new broadband technology. The Federal Communications Commission is also mulling policy changes, soliciting suggestions from the communications industry for making broadband access more widely available.