From www.dailygrail.com

Researchers say the time has come to rethink the Internet's underlying architecture; they are exploring tearing the Internet apart and rebuilding it to better address security and mobility.
THE BEGINNING OF THE END
After four decades of building the Internet, some university researchers, with the U.S. government's blessing, want to shut it all down and start over.
The idea may seem unthinkable, but many believe a "clean slate" approach is the only way to truly address security, mobility and other challenges that have cropped up since Leonard Kleinrock helped supervise the first exchange of meaningless test data between two machines on Sept. 2, 1969.
"The Internet works well in many situations but it was designed for completely different assumptions," said a Rutgers University professor overseeing three clean-slate projects. "It's sort of a miracle that it continues to work well today."
No longer constrained by slow connections and computer processors and high costs for storage, researchers say the time has come to rethink the Internet's underlying architecture, a move that could mean replacing networking equipment and rewriting software on computers to better channel future traffic over the existing pipes.
Even Vinton Cerf, one of the Internet's founding fathers as co-developer of the key communications techniques, said the exercise was "generally healthy" because the current technology "does not satisfy all needs."
The National Science Foundation wants to build an experimental research network known as the Global Environment for Network Innovations, or GENI, and is funding several projects at universities and elsewhere through Future Internet Network Design, or FIND.

Rutgers, Stanford, Princeton, Carnegie Mellon and the Massachusetts Institute of Technology are among the universities pursuing individual projects. Other government agencies, including the Defense Department, have also been exploring the concept.

The European Union has also backed research on such initiatives through a program known as Future Internet Research and Experimentation, or FIRE. Government officials and researchers met last month in Zurich to discuss early findings and goals.
A new network could run parallel with the current Internet and eventually replace it, or perhaps aspects of the research could go into a major overhaul of the existing architecture.

These clean-slate efforts are still in their early stages, though, and aren't expected to bear fruit for another 10 or 15 years — assuming Congress comes through with funding.
And it could take billions of dollars to replace all the software and hardware deep in the legacy systems.

Clean-slate advocates say the cozy world of researchers in the 1970s and 1980s doesn't necessarily mesh with the realities and needs of the commercial Internet.
THE NEED FOR A NEW BEGINNING
The Internet's early architects built the system on the principle of trust. Researchers largely knew one another, so they kept the shared network open and flexible — qualities that proved key to its rapid growth.

But spammers and hackers arrived as the network expanded and could roam freely because the Internet doesn't have built-in mechanisms for knowing with certainty who sent what.

The network's designers also assumed that computers are in fixed locations and always connected. That's no longer the case with the proliferation of laptops, personal digital assistants and other mobile devices, all hopping from one wireless access point to another, losing their signals here and there.
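As a rough illustration (not from the article), a minimal Python sketch of an IPv4 header shows why the network cannot know with certainty who sent what: the source address is just an ordinary field the sender fills in, with no built-in proof that it is genuine.

```python
import struct

def build_ipv4_header(src_ip: str, dst_ip: str) -> bytes:
    """Pack a minimal 20-byte IPv4 header. The source address is an
    ordinary field chosen by the sender -- the protocol itself offers
    no way to verify it, which is why spoofing is possible."""
    version_ihl = (4 << 4) | 5            # IPv4, 5 x 32-bit header words
    total_length = 20                     # header only, no payload here
    ttl, proto = 64, 6                    # time-to-live, protocol = TCP
    src = bytes(int(o) for o in src_ip.split("."))
    dst = bytes(int(o) for o in dst_ip.split("."))
    return struct.pack("!BBHHHBBH4s4s",
                       version_ihl, 0, total_length,
                       0, 0, ttl, proto, 0, src, dst)

def source_of(header: bytes) -> str:
    """Read the *claimed* source address back out of a raw header."""
    return ".".join(str(b) for b in header[12:16])

# A sender can claim any source address it likes:
spoofed = build_ipv4_header("10.0.0.99", "192.0.2.1")
print(source_of(spoofed))  # prints the claimed, unverified address
```

Authentication has to be layered on elsewhere (TLS, IPsec, application logins), which is part of what clean-slate researchers mean by "bandages."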
Engineers tacked on improvements to support mobility and shore up security, but researchers say all that adds complexity, reduces performance and, in the case of security, amounts at most to bandages in a high-stakes game of cat and mouse.

The Internet will continue to face new challenges as applications require guaranteed transmissions — not the "best effort" approach that works better for e-mail and other tasks with less time sensitivity.
MISSION-CRITICAL WORK
Think of a doctor using teleconferencing to perform surgery remotely, or a customer of an Internet-based phone service needing to make an emergency call. In such cases, even small delays in relaying data can be deadly.

Even if the original designers had the benefit of hindsight, they might not have been able to incorporate these features from the get-go. Computers, for instance, were much slower then, possibly too weak for the computations needed for robust authentication.
Of course, a key question is how to make any transition — and researchers are largely punting for now.
"Let's try to define where we think we should end up, what we think the Internet should look like in 15 years' time, and only then would we decide the path," McKeown said. "We acknowledge it's going to be really hard but I think it will be a mistake to be deterred by that."
Kleinrock, the Internet pioneer at UCLA, questioned the need for a transition at all, but said such efforts are useful for their out-of-the-box thinking.
"A thing called GENI will almost surely not become the Internet, but pieces of it might fold into the Internet as it advances," he said.
Think evolution, not revolution.

Princeton already runs a smaller experimental network called PlanetLab, while Carnegie Mellon has a clean-slate project called 100 x 100.

Construction on GENI could start by 2010 and take about five years to complete. Once operational, it should have a decade-long lifespan.

FIND, meanwhile, funded about two dozen projects last year and is evaluating a second round of grants for research that could ultimately be tested on GENI. These go beyond projects like Internet2 and National LambdaRail, both of which focus on next-generation needs for speed.
Any redesign may incorporate mechanisms, known as virtualization, for multiple networks to operate over the same pipes, making further transitions much easier. Also possible are new structures for data packets and a replacement of Cerf's TCP/IP communications protocols.
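The virtualization idea can be pictured with a toy model (my own sketch, not GENI's actual design): several virtual networks share one physical "pipe" by tagging each packet with a network ID, so an experimental slice and ordinary traffic coexist on the same wires, and either can be swapped out without touching the other.

```python
# Toy model of network virtualization: multiple logical networks
# multiplexed over one shared physical link via per-packet tags.

class SharedPipe:
    def __init__(self):
        self.wire = []                       # the one physical link

    def send(self, net_id, payload):
        self.wire.append((net_id, payload))  # tag marks which slice owns it

    def receive(self, net_id):
        # Each virtual network sees only its own tagged traffic.
        return [p for tag, p in self.wire if tag == net_id]

pipe = SharedPipe()
pipe.send("experimental-net", "new-protocol-frame")
pipe.send("legacy-internet", "ordinary-ip-packet")
pipe.send("experimental-net", "another-frame")

print(pipe.receive("experimental-net"))  # only the experimental slice's traffic
```

Real systems use the same principle at lower layers — VLAN tags, MPLS labels, tunnels — which is why researchers see virtualization as a way to ease future transitions.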
———

Associated Press Business Writer Aoife White in Brussels, Belgium, contributed to this report.