Last updated: Fri May 5 12:34:23 1995

The World Wide Web and Its Implications in a Democratic Society


This document was last updated on 28 April 1995.

Pattie Doyle

Rita Edwards

Angela Ross


In today's competitive marketplace, increased emphasis has been placed on providing information through new media, including the means to access data in an accurate and precise manner. The government's ability to collect vast amounts of information far exceeds its capability to distribute it. The World Wide Web (WWW) is becoming an indispensable medium for distributing information. This paper examines one government agency's use of WWW technology in successfully meeting customer requirements.

Table of Contents

1.0 Introduction

1.1 Historical Background

1.2 Corporate Information Center (CIC)

2.0 WWW Applications

2.1 SOLDIERS Magazine

2.2 Perdiem Application

2.3 On-Line Training

3.0 System Support

3.1 Compiler Environment

3.2 WinWatch

3.3 Windows to UNIX to Fax

4.0 The Future of Applications

5.0 Limitations

6.0 Federal Government Applications

7.0 Conclusion


Author Information

1.0 Introduction.

The advances provided by the World Wide Web (WWW) allow the average person to play a more direct role in a government established for the people. Too often in today's working bureaucracy, mistakes are made due to the inability to access and use pertinent information. Further, because of the inability to share results and "lessons learned," the same process of trial and error is repeated by different organizations. These inadequacies mean one thing: wasted money. The federal government will strengthen the drive towards developing the National Information Infrastructure by harnessing the power of the WWW and its browsers. Thus, the federal government strives toward achieving the goal established by the National Performance Review: "to create a government that works better and costs less" [6]. Achieving this goal may only be accomplished by components building upon each other, much like a child's game of building blocks. Although the power of the Internet is immense, it is vital to understand that it is still growing. The overall effect of the Internet on organizations, both public and private, is yet to be realized [1, p. 4]. This paper examines current uses of the Internet and the WWW in a government organization.

1.1 Historical Background.

The buzz words reorganization, re-engineering, and reinvention have a multitude of implications and interpretations. Regardless of the perspective, the end result of acting on these words is the same: do more with fewer resources. The real problem appears when dedication and hard work are no longer enough. Technology must compensate for human limitations in two key areas. First, technology must preserve the results of events for historical purposes. Second, technology must provide the medium to access those results. A one-to-one relationship exists between the size of an organization and the importance of the quality of its data management technology: the larger the organization, the greater the amount of information, and the greater the need for technological tools. Government employees typically hoard information in large, gray, pre-World War II filing cabinets. These cabinets cause moving crews to cringe as workers move from office to office. If the cabinets are lost, the data is lost. When employees retire, the data is also "retired," usually in the nearest garbage can, or willed to a co-worker, continuing the life cycle. Although the data is available, its dissemination is difficult. Data is beneficial, but only if someone knows it exists and where to find it. Even if stored electronically, data is useless if the medium is inaccessible. Yes, the information age is upon us. The questions are: What good is it? Where is the technology to communicate the information? Information leads to knowledge. Knowledge is power, but only if it is used correctly.

Government has a responsibility to provide this information to its customers, its people, in the most convenient method at its disposal [8, p. 98]. In addition, a strict requirement is that the chosen methodology be usable by the majority of the people. The WWW and its associated technologies provide the answer. Granted, the Internet has been around in some form since 1969 [14, p. 82], but early, complicated interfaces prevented most people from accessing the information [4, p. 56]. In addition, not many information providers knew how to set up resources on the Internet. Technology aimed at a small group of people yields minimal results. The Internet had to broaden in scope and become approachable by a greater number of users. The WWW and its browsers have pulled the Internet out of an infant stage, limited to education and research use, into its current pre-teen stage [12, p. 895]. It is too soon to predict when the Internet's evolution will be complete [4, p. 60].

1.2 Corporate Information Center (CIC).

The Corporate Information Center (CIC) is a government agency which sees the Internet and the WWW as a medium for providing services to its customers. Designated as a reinvention lab, CIC has looked to new technology for possible solutions to old problems [6, pp. 1,65]. One such problem was to find ways of furnishing adequate support to customers at remote sites. Issues surrounding this support included the following:

  • how to handle time zone differences

  • how to ensure consistent answers to questions

  • how to provide fast response to a problem

  • how to contact developers who are out of the office.

    Any possible answer had to maintain the CIC's requirements to operate in an open systems environment and to support multiple users. An open systems approach focuses on hardware and software applications and their interactions. Flexibility of applications and hardware is the only way of guaranteeing that multiple users can access the data in a consistent fashion.

    One solution was found in the use of the Internet. However, two stumbling blocks were confronted. First, the Internet was not viewed as a tool. Many in decision-making positions held the opinion that, beyond e-mail, the Internet could not serve any productive purpose. In the same category as computer games, it was a toy, not a tool. The WWW only made the Internet a more enticing plaything. The proliferation of WWW tools throughout organizations is still often met with resistance by those who do not understand their potential and implications. Many of these people refuse to see beyond the bottom line: "Who is going to pay for this?" [18, p. 84]. (Users of the Internet only pay for access to the network, not for the number of bytes sent [15, p. 881].) The financial issue revolves around the development and maintenance of the data. Unfortunately, at this point in the game, it is difficult to show a one-to-one correspondence between effort and return on investment. Current returns are more intangible than a set profit margin. For example, how does one weigh the time savings gained from retrieving a software patch from the Internet versus calling a hot-line and waiting on hold for two hours, only to then wait for a diskette to be "FedEx'd"? Second, the Internet's interfaces were too complex for users with little to no experience. Although a gopher server gave access to needed documentation, the customer had to know how to maneuver through the menu system and was forced to search linearly for desired information [1, p. 4]. If the user knew how to access Veronica, that tool could index the gopher menus [11, p. 50]. Regardless, faster response time was available by thumbing through a printed manual. The option of moving to another interface required rewriting applications and did not offer better functionality [9, p. 93]. This was not acceptable. Despite these problems, the Internet could be a valuable tool beyond its mere electronic mail aspect. What was needed was an interface that would take advantage of the tools [2, p. 29].

    The WWW was the catalyst that propelled the CIC into the world of cyberspace. The Hyper-Text Transfer Protocol Daemon (HTTPD) is the key to the WWW's flexibility, simplicity, and success. The WWW serves as the conductor for an information symposium composed of various tools including File Transfer Protocol (FTP), gopher, and Wide Area Information Server (WAIS) [12, p. 897]. Graphical browsers, such as the National Center for Supercomputing Applications' (NCSA) Mosaic and the Netscape Communications Corporation's product Netscape, add the pizzazz necessary for applications to be aesthetically pleasing and easy to use [13, p. 26]. However, these browsers are not complete. They continue to evolve as do the needs of Internet users. A major advantage of the WWW is its ability to integrate these new technical advances into the current suite of tools.

    The use of the WWW falls into two distinct categories, application and system support. Application support includes software designed specifically for use by WWW "surfers". Applications in this category include information pages, information searchers, training courses, and surveys. System support includes using the WWW and its vast resources to support the software development life cycle. Software tools that fall into this category include compilers, editors, and libraries.


    2.0 WWW Applications.

    Applications on the WWW range in complexity from those that simply display text files to interactive programs that store information in databases. Combining hypermedia technology with the power of a relational database system removes the need for expensive graphical user interface (GUI) packages with immense data retrieval protocols. The HyperText Markup Language (HTML) forms provided by the WWW, combined with the relational database, supply a built-in data protocol. The user remains unaware of the complex data manipulation going on behind the scenes. This premise upholds the CIC's commitment to open systems.
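    This division of labor can be illustrated with a minimal Common Gateway Interface (CGI) script. The sketch below, in Bourne shell, is hypothetical rather than an actual CIC application; the field name "location" and the echoed reply stand in for the real database lookup.

```shell
# Hypothetical CGI sketch: an HTML form submits a "location" field;
# the script parses QUERY_STRING and returns an HTML reply. A real
# application would hand the parsed value to the database instead.
QUERY_STRING=${QUERY_STRING:-"location=Huntsville"}   # demo default value
location=`echo "$QUERY_STRING" | sed 's/^location=//'`
echo "Content-type: text/html"
echo ""
echo "<HTML><BODY>Rates for: $location</BODY></HTML>"
```

    The form's METHOD and ACTION attributes determine how the server invokes such a script; the user sees only the returned HTML page.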

    The CIC has chosen to use Oracle's relational database system to store its WWW application data. This product requires that special environment variables be set before trying to access its tables. For example, the TWO_TASK environment variable defines which database instance to employ. Prior to Version 1.3 of NCSA's HTTPD, the TWO_TASK was defined in the crontab resource file; whenever the daemon was restarted, the environment was automatically set. However, Version 1.3 of the daemon does not recognize variables defined outside its executable. The CIC developers found that their scripts would no longer work. After much deliberation, two possible solutions were suggested. First, the daemon code could be rewritten to set this variable. Second, the individual local routines could set the variable during initialization. The first proposal was unacceptable, because this would require updating the daemon code every time a new version was released. Thus, the second proposal was adopted. The TWO_TASK variable is set in each executable by reading the database instance name from a file at run time. Any file input/output time delays are insignificant and this method is the most flexible option. The variable is changed by simply editing the file. Neither the HTTPD code nor the local routines need to be recompiled.
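    The adopted approach can be sketched in a few lines of Bourne shell. The file name db_instance.cfg and the instance name PROD1 are hypothetical stand-ins; the CIC's actual routines perform the equivalent read inside each executable.

```shell
# Hypothetical sketch: read the Oracle instance name from a file at
# run time and export TWO_TASK, so that neither the daemon nor the
# local routines hard-code the database instance.
echo "PROD1" > db_instance.cfg            # demo configuration file
TWO_TASK=`cat db_instance.cfg`
export TWO_TASK
echo "Using database instance: $TWO_TASK"
```

    Changing the instance then requires only editing the file; nothing is recompiled.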

    The following are three examples of applications placed on the WWW by the CIC. The examples demonstrate an evolution from text file manipulation to database interaction.

    2.1 SOLDIERS Magazine.

    Each branch of the United States Armed Forces publishes an information magazine that highlights its special people and events. SOLDIERS, the Army's publication, was the first military magazine on the WWW. The CIC has been placing the magazine on-line since July 1994. Figure 2.1-1, Front Cover, shows the March 1995 SOLDIERS edition currently on the WWW.


    Figure 2.1-1 Front Cover

    The electronic version of the magazine has increased the scope of its audience and has become a powerful recruiting mechanism for the Army. Further, the magazine offers insights into the military way of life. This provides an opportunity to foster public support for defense programs by detailing the diversified missions of the US Army and how they benefit communities at home and abroad. Providing this type of information about the United States Armed Forces to the general public empowers the average citizen to make informed decisions.

    Requirements: The magazine's articles arrived on Macintosh-formatted diskettes. In order to translate the articles into HTML format, two steps had to be taken. First, the articles had to be transferred from the Macintosh to a SCO UNIX system. Second, the HTML format tags and in-line images had to be added.

    Lessons Learned: FTP provided the medium to transfer the files to the SCO UNIX machine. After conversion, each file had to be renamed and its special characters removed. Unfortunately, the special characters were not recognized by the UNIX editor vi, so a global search-and-replace command was not an option. Two AWK programs were written to automate the conversion process. One program handled the global replacement, and the second removed the special characters. Later, the HTML Assistant editor was downloaded to make the file conversion process easier. The HTML Assistant may be downloaded from the FTP site:
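    The two-step cleanup described above can be sketched in Bourne shell. The sample file and the '*' placeholder are illustrative assumptions, not the magazine's actual special characters or the CIC's actual AWK programs.

```shell
# Illustrative sketch of the two conversion steps: a global
# replacement with awk, then removal of non-printable characters.
# The '*' here stands in for a Macintosh special character.
printf 'Title*Line\nBody*text\n' > article.txt       # sample input file
awk '{ gsub(/\*/, " "); print }' article.txt > step1.txt
tr -cd '\11\12\15\40-\176' < step1.txt > article.clean
cat article.clean
```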

    2.2 Perdiem Application.

    The Perdiem application was designed to provide an on-line source for obtaining government perdiem rates for geographical locations. The application provides both the continental United States (CONUS) and outside continental United States (OCONUS) travel rates. This example represents the movement from simple file manipulation to the interactive processing available with hypermedia. Figure 2.2-1, Perdiem Applications, shows the first page of the perdiem application.

                        Figure 2.2-1 Perdiem Applications

    Requirements: The program's source was written in C++, using Open Software Foundation Motif tools for X Window applications. To make the application interactive, it accesses an Oracle database. Further, maintenance of the data is easier since it is stored in the database.

    Lessons Learned: The Perdiem application was developed in a C++ environment. This environment lends both disadvantages and advantages. One disadvantage is the extra overhead that must be maintained to make it function properly. An advantage to using an object-oriented design is that lower-level routines need only be written once. Subsequent applications, like the Perdiem application, simply inherit objects, adding only what is unique for them.

    The perdiem rate data was originally captured through screen dumps of an existing mainframe application. This made maintenance difficult. It was later determined that the data could be pulled directly from the Perdiem Committee in Alexandria, VA via modem. Although the ASCII file must be converted to database format, the data is guaranteed to be correct and up to date.

    2.3 On-Line Training.

    Budgetary constraints mandate a precise balance between well-informed employees and the cost of their training. The abundance of new software and hardware packages has made training a necessity. However, many organizations are faced with the dilemma "Do we pay our employees or do we train them?" The solution is to make training more affordable through automation [17, p. 74]. The WWW provides this medium. The Aircraft Visual Recognition Training Manual is a prototype for an on-line training program that was created to show the functionality of the integrated technologies. Figure 2.3-1, Aircraft Recognition Training, shows the training course developed for the WWW.

                   Figure 2.3-1 Aircraft Recognition Training

    Requirements: The on-line training project had to be interactive. It had to incorporate audio and video clips as well as other hypermedia techniques. The trainee registers when first activating the course. Personal information is stored in the Oracle database and retrieved when a return visit is made. The trainee may also take an exam and receive a grade upon completing the course.

    Lessons Learned: As the CIC's WWW applications became more complex, it became clear that structured techniques had to be incorporated into the development process. For example, the associated directory structure needed to follow a logical format. Each application needs a minimum of three subdirectories. These include the following:

  • db - contains source code for accessing the database

  • images - contains hypermedia files

  • html - contains the HTML formatted documents
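    Under this convention, setting up a new application is mechanical. The sketch below assumes a hypothetical application named "aircraft"; the three subdirectory names come from the list above.

```shell
# Create the minimal directory layout for a new WWW application.
# "aircraft" is a hypothetical application name.
for dir in db images html; do
    mkdir -p "aircraft/$dir"
done
ls aircraft
```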


    3.0 System Support.

    The theory behind government reinvention is to move support services into a more entrepreneurial atmosphere. Ingenuity and fresh insights into problems are needed to achieve legitimate solutions with minimal costs. Having the right tools to do the job is one major step in support of this effort. The old adage of "It's good enough for government work" must be replaced with "Of course it's good, it's government work." In the era of budget cuts, buying off-the-shelf products, which may or may not do the job, is no longer a feasible answer. Employees need a medium to post questions, track possible answers, download software, and research issues. In other words, employees need access to the Internet. The WWW and its browsers are the lifeboats in an endless sea of newsgroups, hot-lines, executables, shareware, and general information. All of these elements play a role in an application's development environment. Below are examples detailing the effects these elements have in such an atmosphere. Figure 3.0-1, FTP Sites, shows some of the anonymous FTP sites that developers visited to obtain needed software.


         Figure 3.0-1 FTP Sites

    3.1 Compiler Environment.

    The working environment can make or break the timeline for a project. If a programmer is forced to deal with antiquated tools, the software development process becomes all the more painful and slow. This problem is compounded when working across heterogeneous networks. Source code may react unpredictably on different platforms. Developers need assurance that their applications will still perform with consistent results, regardless of the platform. Predictability is the cornerstone of open systems. The definition of predictability, as used here, is a set of standards and conventions that support the design and implementation of applications for multiple platforms. An organization need not invest large amounts of capital to acquire tools to solve this problem. The free GNU compiler and its associated utilities provide an environment that supports an open systems approach to software development. Two key advantages result from the use of a single compiler design. First, the source code may be freely ported and recompiled with little to no effort. Second, the constraints normally associated with proprietary software and hardware are greatly curtailed. The GNU utilities are found on many FTP sites. One valid location to retrieve a copy of the compiler environment is:

    3.2 WinWatch.

    Developers agree that one problem they all face is how to optimize runtime speed. As an application grows in functionality and size, its start-up time becomes a critical issue. If an application takes two to three minutes to load, the user will become frustrated. A program's associated resources are typically assumed to be the guilty party. A tool to aid the programmer in finding areas within the application where improvements can be made would be helpful. The WWW provides the medium where such tools can be found. By searching the Washington University FTP Archive site, one will find the software package WinWatch. This shareware program, available for a 30-day evaluation period, monitors Windows resources by detailing such runtime elements as free fixed memory and system memory. This identifies potential memory leaks. Further, the programmer is able to stress a system by defining the maximum sizes of memory modules. In addition, a snapshot of the application and its resources can be captured at runtime, yielding dynamic results. Figure 3.2-1, Main WinWatch Screen, is an example of output from the WinWatch application.

         Figure 3.2-1 Main WinWatch Screen

    By applying these results, the application may be fine tuned, ensuring optimal performance.

    3.3 Windows to UNIX to Fax.

    A third example of the influence of the WWW in the system support arena was a requirement to embed data communications within an existing application. The data had to flow from a DOS/Windows PC, to a SCO UNIX system, to its final destination, a fax machine. To further complicate matters, the developers had only one week to complete the task. Six software packages were discovered on the WWW to assist in this data transfer. These packages include the following:

  • Netdial: Makes the connection between DOS/Windows and UNIX

  • Trumpet Winsock: Provides the standard interface for the Transmission Control Protocol/Internet Protocol (TCP/IP) stack, including Point-to-Point Protocol (PPP)

  • Windows RCP: Provides remote copy protocol for DOS/Windows to UNIX

  • groff: Converts file to postscript format

  • ghostscript: Converts postscript file to fax formatted file

  • mgetty+sendfax: Sends to fax machine

    Figure 3.3-1, Fax Data Flow Chart, shows the methodology used to manage the data flow.

       Figure 3.3-1 Fax Data Flow Chart
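    The UNIX half of the data flow above can be summarized as a three-step pipeline. The commands below are pseudocode: the file names, fax number, and flags are illustrative assumptions, and the exact invocations vary by installation.

```
# 1. Typeset the transferred report into PostScript
groff -Tps report.txt > report.ps
# 2. Render the PostScript into Group 3 fax format
gs -dNOPAUSE -dBATCH -sDEVICE=faxg3 -sOutputFile=report.g3 report.ps
# 3. Queue the fax for delivery via mgetty+sendfax
faxspool 555-0100 report.g3
```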


    4.0 The Future of Applications.

    The typical user accessing the Internet is still a "techie". As other professional groups realize the relatively inexpensive information and services available via the WWW, the user base will grow in variety. Soon, the technology envelope may finally be pushed to its limit. The more varied the users' needs, the more versatile the applications will become. As the WWW and its associated applications expand in functionality, they will be integrated into the lifestyles of people worldwide. In fact, accessing the WWW will be as natural a process as turning on the television to watch CNN's Headline News [16, pp. 118-119]. Users will have a global warehouse of information at their fingertips.


    5.0 Limitations.

    Discussions of the power and potential of an entity such as the WWW are certainly exciting, but the current limitations impede its explosion into a true global vehicle for information exchange. Two illustrations of these limitations are the lack of a stable infrastructure and the unavailability of technology to those who are neither computer professionals nor computer literate.

    A healthy infrastructure is made up of at least three key elements. First, browsers must be able to rely on stable sites that do not disappear without notice. Second, data communication lines must be able to support the increasing traffic load ensuring an accident free virtual trip to the library or to the mall. Third, sites should employ a combination of both graphical and non-graphical WWW browsers to allow everyone access to available data. Currently, most sites have one or two of these elements. A strong infrastructure needs to be built on a foundation composed of all three elements.

    The availability of on-line information is of no use to those individuals with neither the knowledge nor the hardware to retrieve the data. In a recent survey of white collar workers, sixty percent did not know what the Internet was. Ten percent did not know how to access the Internet [10, p. 22]. In an effort to address this issue, the current presidential administration has strongly promoted the continued education and development of the Internet [7, p. 3]. Education is only half the battle. While graphical browsers make it easy for non-technical people to use and to understand the Internet, many people will still be excluded from the technology for financial reasons. Public transportation on the Information Superhighway must be guaranteed. The government must take great pains to ensure the "have-nots" are not excluded from the opportunities the WWW has to offer [3, p. 25].


    6.0 Federal Government Applications.

    As the Federal Government emerges from the dark age of pen and paper, the realization of the importance of disseminating information to the masses has erupted into multiple Internet sites. These sites serve as a reservoir of data ranging from consumer reports to the debates on Capitol Hill. The red tape associated with acquiring vital information has been virtually eliminated by going on-line. The electronic medium is not only fast, it is accurate. The educated voter only exists if there are avenues to the data needed to form opinions. These opinions lead to informed actions at the polls. The new generation of applications is focusing on providing services. The basic premise is to go beyond merely sorting and indexing the information. These services are the concrete for the foundation to make the government a more efficient and economical entity. The Federal Government has many sites available on the WWW. Figure 6.0-1, Federal Sites, lists a small subset of such sites.

              Figure 6.0-1 Federal Sites


    7.0 Conclusion.

    This discussion has examined one government organization's use of WWW technology to serve its customers. Further, it has delved into the broader implications of the WWW's use across a wide spectrum of federal agencies. In order for true democracies to remain free, the people must have unimpeded access to the information warehouses. As Internet technology improves and becomes more affordable, more people will benefit from the information infrastructure. The primary benefit is its potential to enable its users to remain competitive in a highly contested marketplace. Those governments which fail to embrace this technology are condemning their people to be left behind. Democratic governments around the world are realizing they must quickly improve operating procedures in order to succeed and to serve their people. A secure information infrastructure and usable tools will propel nations forward into a global marketplace.



    [1] December, John and Neil Randall. The World Wide Web Unleashed. Indianapolis, Indiana: SAMS Publishing, 1994.

    [2] Baker, Steven. "Digging Around with Gopher." Unix Review. (July 1994), pp 23 - 29.

    [3] Bjerklie, David and Patrick Cole. "A New Divide Between Haves and Have-Nots?" Time. (Spring, 1995), pp 25 - 26.

    [4] Dickman, Steven. "Catching Customers on the Web." Inc. Technology. (Summer 1995), pp. 56-60.

    [5] Gore, Al. "From Red Tape to Results: Creating a Government that Works Better and Costs Less." Report of the National Performance Review. (September 7, 1993).

    [6] Minahan, Tim. "Army, White House seek agency data for the World Wide Web Listing." Government Computer News. (November 7, 1994), pp 1, 65.

    [7] Olsen, Florence, Susan Menke, William Jackson, and Tim Minahan. "Administration Teaches Surfing." Government Computer News. (March 20, 1995), pp 3.

    [8] Power, Kevin. "Gill Opens Internet Doors, then Departs." Government Computer News. (March 20, 1995), pp 98.

    [9] Raucci, Richard. "Hypermedia Internet, PC-Style." Open Computing. (June 1994), pp 93 - 94.

    [10] Romenesko, Jim. "Internet Insider." Online Access. (April 1995), pp 22.

    [11] Savetz, Kevin. "Veronica in Gopher Space." Online Access. (April 1995), pp 50 - 53.

    [12] Schatz, Bruce and Joseph Hardin. "NCSA Mosaic and the World Wide Web: Global Hypermedia Protocols for the Internet." Science, Vol 265. (August 12, 1994), pp 895 - 901.

    [13] Vaughan-Nichols, Steven. "The Web Means Business." Byte. (November, 1994), pp 26 - 27.

    [14] Verity, John and Robert Hof. "The Internet: How it will change the way you do business." Business Week. (November 14, 1994), pp 80 - 88.

    [15] Waldrop, M. Mitchell. "Culture Shock on the Networks." Science, Vol 265. (August 12, 1994), pp 879 - 881.

    [16] Wildstrom, Stephen. "Planet Internet: How the center of the computing universe has shifted." Business Week. (April 3, 1995), pp 118 - 124.

    [17] Williamson, Mickey. "High Tech Training." Byte. (December, 1994), pp 74 - 88.

    [18] Zuckerman, Mortimer. "Now, A Word From Cyberspace." U.S. News & World Report. (April 10, 1995), pp 84.

    Author Information.

    Pattie Doyle obtained a BA in Communications from James Madison University and a BS in Computer Information Systems from Athens State College. She currently holds a Programmer/Analyst position with COLSA Corporation working on a government contract with the Technology Development Lab (TDL) in Huntsville, Alabama. Her primary responsibilities include software development in an object-oriented environment in addition to her role as TDL Webmaster.

    Rita Edwards obtained both a BA in Communications and a BS in Computer Science from the University of Alabama in Huntsville. She currently holds a Senior Programmer position with SESI Corporation working on a government contract with the Technology Development Lab (TDL) in Huntsville, Alabama. Her primary responsibilities include software development in an object-oriented environment in addition to her role as TDL Web Security Manager.

    Angela Ross obtained a BS in Management Information Systems from Jacksonville State University, Jacksonville, Alabama. She currently holds a Computer Specialist position with the United States Army Missile Command's Corporate Information Center (CIC). She is the Project Chief of the Technology Development Lab. Her primary responsibilities involve the project management of approximately ten multi-million dollar software applications. The TDL mailing address is Technology Development Lab, 4946 Research Drive, Huntsville, AL 35806

    Visit the CIC and the TDL at
