Fixing the Internet
By Wendy Rickard
For those in the business of selling Internet services and connectivity, we are now at a point that conjures up the old Chinese curse: May you live in interesting times. In this brief moment, the customer truly is at a disadvantage, and most users are reluctant to admit their cluelessness. It's not unlike reading a difficult text. No matter how many times you read the same paragraph, you still don't get the message. When finally you realize that there's nothing wrong with your reading comprehension, that the paragraph is simply badly written, it is a moment of revelation, and of power.
How long before the estimated 57 million Internet users have the same revelation?
The recent shutdown of America Online did more than point out that a really good marketing plan combined with an attractive pricing structure still needs to be backed by reliable service. It shattered the illusion of the Internet as a spaceless nirvana. The nuts and bolts of the infrastructure unraveled the mystery of the technology. The so-called ease of point and click turned into endless hours of frustration just trying to stay connected. Worse, customers who believed that a flat $19.95 per month meant unlimited access found out it really meant unlimited hours of teeth gnashing.
It is now accepted as fact that a healthy global economy depends on top-notch technology. As fantastic as the Internet is now, it's far from being a useful driver for a successful global economy. In true back-to-the-future fashion, hope resides within the academic and research communities. The development of Internet 2, backed by serious dollars, compliments of the United States government, is the first sign of a realization that research needs to continue. There are those who question the wisdom of this approach, believing that a government-funded program to advance anything is a model that no longer works.
In the short run, Internet 2 is not aimed at making it possible for end users to download blockbuster movies to their desktops. University researchers, one of the first groups to become dependent on the Internet for peer review, collaboration, and information exchange, have suffered from having to compete with other traffic on the Internet. With the development of Internet 2, academics and researchers will test and develop new Internet technology that will eventually enable everyone to use high-end applications. In a sense, Internet 2 will enable the reality to catch up to the hype.
As Janet Perry points out in her article about Internet 2, "If the history of universities and computing technology is considered, one thing becomes abundantly clear: this community is the bellwether for acceptance of technology in the commercial marketplace." The development of standards such as TCP/IP and Ethernet along with the first Web browsers (Mosaic) and e-mail packages (Eudora) all have roots in the university. More important perhaps is that the higher education community itself has come to a crossroads given increasing demands for services and rising tuitions. Investments in computing and Internet technology are considered by many policy makers and educators to be the only way higher education can survive.
In this issue, Cisco Systems' Paul Ferguson introduces another important element of next-generation Internet technology, one that bears heavily on putting the service back into "Internet service provider": quality of service. As Ferguson points out, "If there is never congestion, there basically is no issue; everyone's traffic gets delivered in a timely manner, and everyone is happy." But congestion, as anyone who has tried to download e-mail at certain times of the day knows, is a problem, and one that's going to seriously affect companies that rely on Internet technology to conduct business. From the economic standpoint of the ISPs, QoS offers a boon.
If QoS can be implemented effectively, and if providers can account and bill for it, it could change the nature of Internet pricing and service. Today, you get lousy service no matter how much you pay. QoS promises a new paradigm for service and systems.
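The idea behind class-of-service delivery can be illustrated with a toy model. The sketch below, a simplification and not any vendor's actual mechanism, shows a strict-priority queue in which packets tagged as a hypothetical "premium" class drain before "best-effort" packets whenever a congested link must choose what to send next; the class names and packet labels are invented for illustration.

```python
import heapq
from itertools import count

# Illustrative service classes: lower number drains first under congestion.
# These names are hypothetical, not drawn from any real ISP's offering.
PRIORITY = {"premium": 0, "best-effort": 1}

class QosQueue:
    """Strict-priority output queue for a congested link (toy model)."""

    def __init__(self):
        self._heap = []
        self._seq = count()  # tie-breaker: preserves FIFO order within a class

    def enqueue(self, service_class, payload):
        heapq.heappush(self._heap,
                       (PRIORITY[service_class], next(self._seq), payload))

    def dequeue(self):
        if not self._heap:
            return None
        _, _, payload = heapq.heappop(self._heap)
        return payload

# Packets arrive interleaved; the link drains premium traffic first,
# and each class keeps its own arrival order.
q = QosQueue()
q.enqueue("best-effort", "web page")
q.enqueue("premium", "video frame 1")
q.enqueue("best-effort", "e-mail")
q.enqueue("premium", "video frame 2")

drained = [q.dequeue() for _ in range(4)]
print(drained)
# → ['video frame 1', 'video frame 2', 'web page', 'e-mail']
```

The billing connection follows directly: if a provider can tag and meter traffic by class this way, it can charge more for the class that jumps the queue, which is exactly the pricing shift described above.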
As a user, I often wish I could wake up 20 years from now, when all of the network, hardware, and software kinks have been worked out. Now, even that time frame seems optimistic. One of the reasons I approve of the university/government model for development is that I understand the business model that says: get the goods out as soon as possible and get the customer to upgrade even sooner. I get excited when I read in my local paper that soon I'll be able to pay my utility bill online, but my enthusiasm dims when I get hung up just trying to send a simple e-mail message. Right now, it's just easier to send a check in the mail.
Bill Graves, at the University of North Carolina at Chapel Hill Institute for Academic Technology, writes about keeping alive the "precompetitive model" of the Internet. The Internet, he points out, emerged as a collaborative effort among competing parties. The IETF works that way: individuals from competing organizations and companies come together to develop standards that can then be adopted, customized, and brought to market. The internetting of the world is far from complete, and we should continue to look carefully at the roots from which the Internet grew. It's a model that would be a shame to abandon.