Mark León <email@example.com>
NASA Ames Research Center
University of North Dakota
Distance learning is not new. Since radio first embellished our culture, distance learning has taken many forms. With the onset of television, videotape, and satellite linkups, the world of multimedia has become a presence in our remote learning environment. Now, in the "information age," new models for bringing the best education to people throughout the world are in their early stages. Recent technological developments have led to key advances in distance learning through the greater bandwidths available over the Internet and a broader communications infrastructure that extends to classrooms throughout the country and the world. Further, new software compression technology allows audio and video to be communicated over the Internet much more efficiently. Larger amounts of data can be transferred to remote sites at less cost.
The purpose of this paper is to demonstrate the use of state-of-the-art commercial technology in the educational community. The focus is on virtual conferences, virtual instruction, and remote education. The techniques herein have been developed by NASA and the University of North Dakota (UND) through the use of existing software and hardware purchased in the United States. NASA has awarded UND a grant for continued research in this area on the basis of its pioneering effort to date.
NASA has been conducting "virtual conferences" from the Ames Research Center in order to make unique educational opportunities available to participants across the country and internationally. Through the use of this technical approach, hundreds of teachers have been able to participate in events where physical or financial barriers traditionally prevented their attendance. This technique is currently being adopted by industry because of its scaleable merit.
The Department of Space Studies at the University of North Dakota has been conducting classes over the Internet internationally since January 1996. The success of the program is clear. This model has afforded UND the flexibility to place instructors in remote locations without altering class schedules. These classes have used basic Web browser tools with WebChat sessions to ensure interactivity with students. Final exams have been conducted over the Web, and security measures have been taken to yield a degree of authentication for the students.
The newest development of this program is a hybrid that combines virtual conferences with virtual instruction to produce a more complete educational experience. The use of RealAudio, CU-SeeMe, UNIX multicasting, WebChat sessions, e-mail, Web slide presentations, QuickTime video clips, and remote control of laboratory equipment has yielded the most complete educational experience yet offered over the Internet. To date no other attempt at remote education has provided this degree of learning over a 28.8-Kbps (kilobits per second) Internet connection. This project has stretched the limits of technology and has documented its approach. The results from this experimental class will be available over the Web.
To demonstrate the significance of this program, NASA has selected one of its premier telerobotics groups to provide the lectures for a one-unit class. The NASA Ames Research Center in Mountain View, California, is developing planetary and space-based robotic systems for use in space, on lunar and planetary surfaces, and in hazardous environments on Earth. Remotely operated robots have been successfully tested in volcanoes, in simulated Martian environments, and under the ice in Antarctica. NASA Ames has entered into a joint project with the University of North Dakota to present a course based on these technologies entitled "Telerobotics: Live from NASA Ames" that will allow students to use the Internet to receive lectures, communicate questions and comments online, view video clips, play audiotapes, and manipulate a robot remotely.
In summer 1996, a virtual online conference using Internet technologies allowed participants around the world to "attend" a teacher training conference held in Washington, D.C. The technologies used during the conference were designed both to serve the lowest common denominator and to push the envelope in educational technology by meeting the needs of low-end and high-end Internet users. Regardless of their mode of connection to the Internet, users were given the opportunity to watch, listen, and read about the event occurring in Washington. As participants from across the globe had questions, they were invited to send them to the NASA auditorium using CU-SeeMe, WebChat (Web Broadcasting System, Menlo Park, California), and e-mail. The online event proved a tremendous success, involving hundreds of participants in the United States and around the world, at more than 45 sites in 13 countries.
The conference was co-funded by Passport to Knowledge and NASA. The event was planned and coordinated from NASA Ames Research Center and NASA Headquarters in Washington, D.C. Project management and technical support were provided by NASA's K-12 Internet Initiative and the NASA Internet at Ames. Technical support at NASA Headquarters was provided primarily by the Advanced Internet Technology Group, and the technical architecture was configured to take advantage of the higher bandwidths available at the Ames location.
The purposes of the physical workshop were:
All participants, remote and local, were given an opportunity to ask questions and have them answered at the conference.
The following matrix provides an overview of the technologies used and the functionality they delivered:
The proceedings in the NASA auditorium were made available via several broadcast technologies. First, the event was recorded and broadcast live by a NASA TV crew via normal television channels. The TV signal was then made available on the MBone and through CU-SeeMe. In addition, the audio portion of the TV signal was broadcast via a live RealAudio signal. Transcription, live video capture, and e-mail updates were also sent from the auditorium.
Using an SGI Indigo server located at Ames Research Center, the live NASA TV signal was captured from a satellite band and rebroadcast.
The NASA TV signal was also broadcast over CU-SeeMe via two primary reflector sites. From the NASA auditorium, audio and video from the NASA TV A/V switch were captured by a Macintosh Quadra 8500 and sent to an HQ reflector running on an SGI Indigo. A second reflector site was located at Ames Research Center, which took its feed from a reflector site at Lewis Research Center in Cleveland, Ohio.
Because of bandwidth restrictions at the various sites, users with slower connections were encouraged to point their CU-SeeMe client to the NASA headquarters reflector. Users capable of receiving a higher bandwidth feed were encouraged to point to the Ames reflector. The Lewis site and other NASA TV reflector sites were used as backups.
For remote participants unable to view and hear the NASA TV signal, several technologies were in place to help make the experience live. Many users who were not able to view the video signal were able to use RealAudio client software to hear the live audio signal from the auditorium. A Web site was available to provide graphics from the auditorium.
The NASA TV audio signal was encoded in real time and broadcast via RealAudio (Progressive Networks, Seattle, Washington). There were two RealAudio servers configured for the event.
From the NASA auditorium, audio from the NASA TV A/V switch was fed to a pair of Macintosh machines running RealAudio's Live Encoder. A Macintosh 8100 was used to digitize the audio for the 28.8-Kbps feed, and a Macintosh 8500 was required for the 14.4-Kbps feed. A preconference test demonstrated that the only machine available that could digitize a 14.4-Kbps feed in real time was the 8500; the 8100 dropped out every five seconds or so. Compressing audio to such a low bit rate was clearly beyond the CPU power of the 8100, whereas the 8500 encoding resulted in no reported problems.
Both RealAudio signals were then sent via an Ethernet network to an SGI Indigo machine running the RealAudio Live Transport Agent and Server. Through a modification of the configuration listed in the RealAudio documentation, the NASA headquarters support team was able to serve both 28.8-Kbps and 14.4-Kbps modem clients with automatic negotiation of the best feed.
At Ames, a Quadra 8500 was used to run the RealAudio Live Encoder software. The audio signal was taken directly from a NASA TV feed and plugged into the microphone jack of the Macintosh. At this site, only the 28.8 compression option was used and the signal was fed to a SunSparc IPX server.
To provide visuals for users relying on the audio signal, a Web site was established with presentation graphics and periodic video images captured from the live TV signal. The Web site server also hosted a WebChat that provided a live transcription of the event. And to provide users a personal glimpse of the activities in the auditorium, a photo-journal page was established.
Presentation Graphics: For each presentation there was a dedicated Web page containing the graphics used by that particular presenter. Remote participants could follow along visually as the presenter spoke.
Live Video Capture: A periodic capture of the NASA TV signal from the MBone machine was also made available via the Web site. A cron job invoked a shell script that captured the MBone image once per minute. The Ames Web server then retrieved the captured image from a Web server running on the MBone machine. A user could, as often as once a minute, refresh the page and see a new image.
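The once-per-minute publishing step described above can be sketched as follows. This is a minimal illustration, not NASA's actual script: the directory names, the `frame-*.jpg` naming scheme, and the fixed `live.jpg` filename are all assumptions made so the page can keep requesting the same URL.

```python
# Sketch of the once-per-minute publish step a cron job might drive.
# Paths and filenames here are hypothetical, not the actual NASA setup.
import shutil
from pathlib import Path

def publish_latest_frame(capture_dir: Path, web_dir: Path) -> Path:
    """Copy the newest captured frame to a fixed-name file the Web page refreshes."""
    frames = sorted(capture_dir.glob("frame-*.jpg"))
    if not frames:
        raise FileNotFoundError("no captured frames yet")
    latest = frames[-1]          # lexicographic sort works for zero-padded names
    dest = web_dir / "live.jpg"  # viewers always reload the same URL
    shutil.copyfile(latest, dest)
    return dest
```

A crontab entry running such a script every minute (`* * * * * ...`) reproduces the behavior described: each browser reload fetches whatever frame was most recently published.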
WebChat Transcription: The proceedings were transcribed live from the auditorium via a moderated WebChat session. Moderation was used so that the flow of text would not be interrupted by posts from remote participants. The transcription, done by a fast typist, was a paraphrase of the events on the auditorium stage. Attempts were made to convey what information was being delivered and to describe any activities on stage. The WebChat transcription was accessed from the Web site located at Ames.
Photo-Journal: In order to complement the images sent from the NASA TV signal, attendees in the auditorium took QuickTake images and wrote short captions that further described the activities being enjoyed by onsite participants.
For users not able to listen to the audio signal or access the Web site, periodic e-mail updates were sent from the auditorium. Every 15 to 20 minutes an update was sent to remote participants by an attendee in the NASA auditorium. This update mail list was managed by a majordomo server at Ames and was accessed via a Eudora interface configured to point to a POP server also running at Ames. The content of the e-mail ranged from a general synopsis of onstage activities to very rich descriptions of the information being presented.
The workshop schedule allowed for periodic questions from the audience. During these periods equal time was given to the onsite participants and the remote participants. Questions were taken from the WebChat, e-mail list, and CU-SeeMe chat windows and verbally presented to the speakers on stage.
Teachers in attendance at the conference were given an opportunity to select the technology they were most comfortable with or most interested in learning more about. They were then assigned to a specific presentation or a question and answer session. When appropriate, they took comments and questions from the remote participants.
A WebChat session was conducted for remote participants that provided both technical assistance and a forum for comments and questions about workshop content. During the question and answer periods, every attempt was made to answer all questions from the remote participants. Technical questions were answered by technical support staff located on- and off-site.
An e-mail account was also established that allowed remote participants to send their questions to the auditorium. An onsite participant was given the responsibility to take the questions from the account and verbally present them. The e-mail account was established at an Ames POP server; a Macintosh located in the NASA auditorium pointed to the POP server.
Remote participants watching via CU-SeeMe were encouraged to send a video signal back to the NASA headquarters reflector. This allowed onsite participants the ability to view the activities of the remote sites. Occasionally onsite participants would be involved in text-based chat sessions with the CU-SeeMe participants when a question was presented for a response from the auditorium stage.
For remote users familiar with multi-user, object-oriented (MOO) technology, the Diversity University (Houston, Texas) MOO site was supported by a teacher in the auditorium. Remote attendees on the MOO were also able to have their questions asked in the auditorium.
A great deal of effort went into the task of configuring the NASA auditorium to allow the onsite participants to support the remote participants. Several Macintoshes and personal computers were located throughout the auditorium (Figure 1). Teachers physically attending the workshop took turns providing e-mail, WebChat, MOO, and CU-SeeMe support.
In addition, to give the audience in the auditorium a sense of the experiences of the remote participants, backdrop panels were used. The backdrop of the NASA auditorium has three distinct floor-to-ceiling panels. Using video projection screens located behind the stage, different components of the virtual conference were portrayed. On the panel directly behind the presenter on the left-hand side of the stage was the WebChat discussion room, on the center panel were the presenter's graphics, and on the right panel was CU-SeeMe.
Figure 1: Auditorium layout
Planning for the 20 July event officially began on 27 May. A brief timeline is provided in Table 2:
Early efforts were focused on capturing the interest of remote participants and defining the necessary technology. Two types of remote participants were expected, individuals and host sites. Mail lists were established for each group and instruction and information were provided.
Very quickly, efforts shifted to defining the technical architecture and requirements for the event. A great deal of work was required to gather the necessary technology and ensure that all elements were in place and working together.
A Web site was also established to provide background information and technical assistance to remote participants as they made plans for the event. All mail-list activity was archived on the Web site to ensure that participants that joined later in the process had access to critical information. The Web site also introduced the sponsors of the workshop and supplied an agenda.
It was critical that the remote participants had enough information and opportunities to test the configuration they had planned for the event. A critical part of the planning process was the testing of the technologies. An internal test was conducted July 12 and two external tests were conducted July 15 and 17.
Live from Mars is a Passport to Knowledge project conducted in partnership with the NASA K-12 Internet Initiative's "Sharing NASA With The Classroom" project. These projects allow teachers and students to communicate with and learn about the NASA scientists and engineers that make exciting missions a reality. The lessons learned during the virtual conference will be used to promote and support other "Sharing NASA" projects. The technologies and techniques discussed in this article will be used to extend the level and type of interaction that can be experienced by the online education community.
The Department of Space Studies at the University of North Dakota, which combines space science, technology, medicine, commerce, law, and policy, is the only such interdisciplinary graduate program in the United States. Although astronauts, rocket engineers, and scientists will continue to provide the technical expertise to accomplish space goals, they will not decide what those goals should be and how to implement them in NASA, the military, and commercial segments of the U.S. and international space programs. The philosophy of the Space Studies Department is that the next generation of managers who will oversee future space achievements must have a broad overview of all space activities. Space Studies graduates are well prepared to participate in and help guide the developing transition from the exploration of space to its routine use.
Founded in 1987 by David Webb, a member of the 1985-86 Presidential Commission on Space, Space Studies has grown to become one of the largest graduate programs at the University of North Dakota. More than 200 students have been awarded M.S. degrees. Most students have come to North Dakota from other states, or they have been officers stationed at Air Force bases within the state. The faculty of the Department realized, however, that there were many people who could benefit from a Space Studies degree, but could not come to North Dakota. Thus, the idea was born to take the program to where the students live and work.
In designing SPACE.EDU, the distance education version of the Space Studies program, we recognized that our potential students would be professionals already working in space activities in industry, the military, or the government. They would already have received B.S. or M.S. degrees and would be working as scientists, engineers, technicians, and managers. And they would not all live in the United States. In order to reach such busy and widely dispersed students, SPACE.EDU had to deliver the content in a manner that would be convenient and flexible. Live video broadcast via satellites would not work because each student would be required to have access to a ground receiving station and to watch transmissions at a given time. A correspondence-based approach, whether using normal mail or electronic mail, was inappropriate for graduate-level programs where discussion of concepts and issues was required.
The first thing students need is information about their courses and degree requirements. To provide as much information as possible, a World Wide Web home page was constructed to establish a virtual campus. The URL address for that campus is http://www.space.edu. The home page provides information that normally is found in college catalogs, information leaflets, official forms, announcements tacked to bulletin boards, and class handouts. Students can review the schedule of classes for the next two years, register for classes, download journal articles from the library, buy a college sweatshirt, look up the credentials of a faculty member, or find the e-mail address of a classmate. Additionally, there is a substantial study guide for each course and, for one course, an entire online textbook with many colored images and links to relevant sites.
To provide distant students an experience paralleling that of campus students, we devised methods of interaction that maximized the efficiency of learning. We realized that courses have two components that could be offered to the students separately and using different technologies. First, the raw transmission of information was accomplished by videotaping live classes in a state-of-the-art studio. For example, the course "Global Change," taught to Space Studies campus students in the January to May 1996 term, was videotaped. Because of the need to make a legible and professional presentation, traditional hand-drawn vugraphs were replaced by PowerPoint presentation slides. At the end of the semester the videotapes were professionally duplicated and mailed as complete packages to the 24 students who signed up for the SPACE.EDU version of the course. From September to December 1996, the distant students watched the videos and read the assigned textbooks according to a published schedule. Thus, delivery of this segment of the course was via videos and the U.S. post office.
The second component of a course is interaction between student and instructor and between students themselves. For SPACE.EDU classes we use World Chat, an Internet Relay Chat (IRC) program that facilitates live text exchanges among groups of people. During the fall term, one-hour chat sessions were conducted in the evenings to minimize interference with student jobs. During the chats the instructor and students discussed the topic of the week's tapes. It was assumed that students understood the technical issues so that more time could be devoted to policy aspects and recent news about the topic. For example, in considering ozone depletion, the Global Change class discussion focused on ways to get new technology for non-CFC-using refrigerators to China and India. Also discussed was that week's announcement that ozone levels were already recovering because of the Montreal Protocol limiting CFC production. If a student misses a chat session and cannot participate in one on another night, he or she can access a transcription of the missed session on the Web page for the class.
To encourage further discussion, students were required to post news notes about global change activities to the Web home page for the course. Other news items and class announcements were distributed directly to students using e-mail list software. Thus, each week during the semester a student would view a videotape, read the text, engage in a seminar chat, and receive and post additional information about global change.
Two other aspects of SPACE.EDU classes are handled in innovative ways. Traditionally, campus students write a term paper that discusses some aspect of a subject in great detail. For the Global Change class the students were instead required to construct a home page that presented information about the resources and environmental pressures of some country other than the U.S. Additionally, each student's home page was assessed by all the other students, thus increasing everyone's awareness of global issues. At the end of the semester some students reported that learning to make a home page was a practical skill that they would use in their jobs.
The second innovative aspect of SPACE.EDU is examinations. Although we experimented with having each student find a proctor for a traditional closed-book exam, which we faxed to the proctor, this procedure was time-consuming and logistically awkward. Thus, we developed online exams. These are necessarily open-book exams, and therefore must be substantially different from typical classroom exams that test remembered content. A good open-book exam requires the student to apply learned information and concepts to a new scenario or to discuss how they accommodate new data. SPACE.EDU programmers have designed a password-protected virtual exam room that is actually an HTML form. Students write answers to questions in scrolling fields. Instructors can rapidly read and grade the exams because each is typed rather than illegibly handwritten, and students can use normal word processing tools such as cut and paste to revise and edit their answers. The instructor can check a restricted Web page that lists the start and stop time of every student and, by clicking the student's name, go directly to the answers. It is a very convenient and easy-to-use examination system.
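The bookkeeping behind such an exam room can be sketched in a few lines. This is an illustrative model only: SPACE.EDU's actual HTML-form backend is not shown, and the record layout and function names below are invented for the example.

```python
# Minimal sketch of online-exam bookkeeping: record each student's start
# and stop time alongside typed answers, as the restricted instructor page
# described above would display. The data layout is an assumption.
import time

def open_exam(student: str, clock=time.time) -> dict:
    """Create a session record when the student enters the virtual exam room."""
    return {"student": student, "start": clock(), "stop": None, "answers": {}}

def submit_answer(session: dict, question: str, text: str) -> None:
    """Store one typed answer; typed text is easy to read and grade."""
    session["answers"][question] = text

def close_exam(session: dict, clock=time.time) -> dict:
    """Stamp the stop time when the student submits the form."""
    session["stop"] = clock()
    return session
```

With records like these, a restricted page listing every student's start and stop times, linked to their answers, is a straightforward report over the stored sessions.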
In its first year of operation SPACE.EDU registered 135 students from 24 states and 5 countries. Although some students had various computer glitches, the majority have been pleased enough with their educational experiences to enroll in subsequent classes. Faculty have learned that there is substantially more student involvement in chat sessions than is typical in classroom discussions. Also, the discussions are commonly at a higher level than in classrooms because the distant students are already professionally involved in space-related activities and have considerable personal experience to relate.
SPACE.EDU is the first step in a truly distributed education system: students are widely distributed in the U.S. and abroad. The second step occurs in the spring of 1997, when the University of North Dakota will offer academic credit for a course taught to distant students by faculty not in North Dakota. In conjunction with NASA Ames Research Center scientists and engineers, SPACE.EDU is offering a special course: SpSt 570: Tele-Robotics: Live from NASA Ames. The course will build on the capabilities developed for the original SPACE.EDU courses, but explore new methods of presentation. The entire course will be taught live over the Internet using CU-SeeMe and RealAudio software. CU-SeeMe and its commercial version by White Pine Software transform video and sound into packets that are carried over the Internet in near real time. RealAudio similarly packetizes audio data for Internet transmission. We will use both applications for optimum sound and video quality.
Supporting the course will be a Web site that provides a syllabus, information on each speaker, an abstract of each presentation, and still images and videos. As the guest instructor lectures, CU-SeeMe will show some images to establish the setting and what the instructor looks like, but most of the graphics will come from the Web site images and video. Questions from students will come via the IRC software product WebChat. An assistant will pass selected questions on to the instructor, who will respond using CU-SeeMe and RealAudio.
The highlight of the course will occur during the last session when students will drive a robotic all-terrain vehicle similar to ones designed to explore Mars. Control of the vehicle will be through Web-based software. At the end of the course students will take an online examination to review the major points of the lesson.
Following the successful demonstration of telerobotics, SPACE.EDU plans each semester to offer a series of short courses from various NASA centers. Live courses are being considered about Mars, Landsat 7, spaceports, the launch industry, and the launch of a space shuttle.
SPACE.EDU will not be a static program. As Internet capabilities grow, new technologies will be pioneered. In 1997, chat sessions will be partially replaced by RealAudio or similar voice-based discussions, and CU-SeeMe or other video and sound software will later be routinely incorporated into chat sessions. Within the next two years, technology is expected to improve sufficiently to allow on-demand Internet transmission of classroom videos, replacing the costly, time-consuming, and logistically complex mailing of videotapes. Although SPACE.EDU could immediately demonstrate such higher-level capabilities, few of our students could participate in such a program. Our goal is to apply commercial software to the daily task of teaching graduate-level courses. The technology must serve education.
The purpose of the NASA Ames/University of North Dakota (UND) project is to develop and document technologies that will support the continued development of distance learning as well as advancement of new technologies to that end. NASA is working with UND because it has an advanced distance learning program based on the Internet. By building on this platform, NASA is able to make technological improvements to existing techniques to bring the fullest value possible to the distant student. One example is the telerobotics course that will be taught over the Internet from January through March 1997. Another example is the use of Internet software to bring a human factors aviation safety course taught by UND over the network to Ames and Memphis city schools. The purpose of this second project is to replace a costly satellite link with an affordable Internet connection. Our objective is to prototype remote transmission and provision of instructional material and to test the efficiency and intelligibility with which it is received. This project will be attempted next semester.
Two elements are critical for the students to take the course. They must at least have a dedicated 28.8-Kbps link, and they must meet the minimum system requirements below so that their computer can run all required applications in a timely fashion.
Students who are limited to a 28.8-Kbps link have to use SLIP/PPP (Serial Line Internet Protocol or Point-to-Point Protocol) connections and follow the software optimization guides to ensure best performance. Several Internet service providers (ISPs) and national information providers were evaluated. Testing repeatedly demonstrated that a 28.8-Kbps connection through a regional ISP provided adequate performance to take the course intelligibly. When national ISPs were used, numerous problems were encountered; even after technical problems were eliminated, running PPP over these services was slow at best. We advise users to choose local ISPs.
Low-bandwidth users (28.8-56 Kbps) typically use CU-SeeMe initially to get a sense of what the professor looks like and then rely on RealAudio for the majority of the instructor's verbal content. As the students are led through the Web slides, they enter chat windows as instructed and play QuickTime movies as requested. It is recommended that students download the QuickTime movies prior to the lecture. Students are also asked to pre-load their Web browser caches so that they experience no delays during the class.
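The pre-caching step above can be sketched as a simple prefetch script that pulls the lecture's slides and movies into a local directory before class. This is a hedged illustration: the URLs a student would actually fetch come from the course Web site, and the function name here is invented.

```python
# Sketch of pre-caching lecture assets (slides, QuickTime movies) before
# class so the live session needs no large downloads. The URL list would
# come from the course Web site; nothing here is the actual course code.
from urllib.request import urlopen
from pathlib import Path

def prefetch(urls, cache_dir: str) -> list:
    """Download each asset into a local cache directory; return saved paths."""
    cache = Path(cache_dir)
    cache.mkdir(parents=True, exist_ok=True)
    saved = []
    for url in urls:
        name = url.rstrip("/").rsplit("/", 1)[-1] or "index"
        dest = cache / name
        with urlopen(url) as resp:   # handles http:// and file:// URLs
            dest.write_bytes(resp.read())
        saved.append(dest)
    return saved
```

Run before the lecture, this leaves every slide and movie on local disk, so a 28.8-Kbps student spends the class hour on audio rather than downloads.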
Medium-bandwidth users (128 Kbps and above) have the luxury of being able to run CU-SeeMe video and RealAudio throughout the entire lecture. At the end of the lecture, noncritical videos are occasionally played; this group benefits by being able to view them over CU-SeeMe.
Students with access to fractional T1 and full T1 lines (1.544 Mbps) are able to run all applications during the whole class session. Some students have access to local area network (LAN) speeds ranging from 10 to 100 Mbps. Note that although the course originates on a 100-Mbps FDDI LAN at NASA Ames, its entry point to the Internet is limited by a T3 (45 Mbps) link. Subsequent hops will likely be slower as the signal cascades to the users.
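A quick back-of-the-envelope calculation shows why the tiers above behave so differently. The 200-KB slide size is an illustrative figure, not a measured one from the course:

```python
# Illustrative arithmetic for the bandwidth tiers: how long a 200-KB slide
# (a made-up but plausible size) takes to arrive at each connection speed.
def transfer_seconds(size_bytes: int, link_kbps: float) -> float:
    """Time to move size_bytes over a link rated in kilobits per second."""
    return (size_bytes * 8) / (link_kbps * 1000)

for label, kbps in [("28.8-Kbps modem", 28.8), ("ISDN", 128.0), ("T1", 1544.0)]:
    print(f"{label}: {transfer_seconds(200_000, kbps):.1f} s")
# 28.8-Kbps modem: 55.6 s
# ISDN: 12.5 s
# T1: 1.0 s
```

Nearly a minute per slide on a modem versus about a second on a T1 is exactly the gap that pre-caching and RealAudio's low-bit-rate streams are designed to bridge.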
A number of software applications are required to interact successfully on the Web. For students to have a platform that will support them, we recommend a multimedia PC or Mac with a minimum of 8 to 16 MB of RAM and multitasking support to run more than one application at once. System speeds of 66 MHz or better showed the best performance.
This course requires the participants to run at least Netscape Navigator 3.0 or Microsoft Internet Explorer (MSIE) 3.0, RealAudio 2.0, and CU-SeeMe V.83b. Higher versions of these software applications will ensure superior performance. Through Netscape, students have access to e-mail, WebChats, and remote login for controlling a remotely operated vehicle.
Enhanced CU-SeeMe, V.2.1: Significant improvements in video performance can be achieved by ensuring that the codec settings are correct within the video preferences. Most of the options shown below are found under the Configure button on the Preferences screen. Some experimentation may produce further improvements, but the settings shown below should be optimal for most purposes.
|28.8 kb/s modem connection||ISDN (128 kb/s) connection||LAN connection|
|Point-to-point||Working reflector||Point-to-point||Working reflector||Point-to-point||Working reflector|
|Smeared I frame rate||0||255||255||128||128||128|
|Gamma correction factor||33||33||33||33||33||33|
|Apply noise reduction filter||ON||ON||OFF||OFF||OFF||OFF|
|ME search radius||0||0||0||0||0||0|
Adjusting the settings above can have a dramatic effect on performance, and some experimentation of your own may be worthwhile.
During the first 10 minutes, instructors give a brief history of their background in the field of telerobotics. This gives students with 28.8-Kbps links a chance to view the instructor over CU-SeeMe audio and video. Because audio quality is poor at this speed, much information is lost; in fact, on some links users turn off the CU-SeeMe audio so that they can get a sharper image. This phase is important because it creates a visual relationship between the students and the professor. A picture of the instructor is also available on the Web page if the link is not performing well. Following the introduction, the professor begins leading the class through the slides while discussing telerobotics. By this point most students on 28.8-Kbps links have cut over to RealAudio for clear reception of the verbal information. The instructor uses the audio-based learning environment to enhance his slides and leads the students through various Web charts with images and diagrams related to telerobotics.
As the instructor leads the students through his slides, he takes them into WebChats and points to QuickTime videos. Students will be able to submit questions through their chat window. A teaching assistant (TA) or instructor will log questions and sort them according to their commonality and pertinence to class material. The TA will hand the printed questions to the teacher who then answers them. This phase continues until the end of the two-hour class. Students are required to participate in class chat lab sessions once a week, and to submit questions to the instructor through e-mail. At the end of the class a final examination will be given over the Web just as specified in UND's "Final Examination" section. One very exciting component of this course is the lab section where the students are each required to log into a remote computer and drive a vehicle using CU-SeeMe and a Telnet session.
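The triage step — logging chat questions and sorting them by commonality and pertinence — can be pictured in a few lines. The keying heuristic below is entirely our own illustration; the paper leaves the actual grouping procedure to the TA's judgment.

```python
# Illustrative sketch of the TA's triage: group near-duplicate questions by
# a crude topic key and put the most-asked topics first. The keying
# heuristic is our own invention, not part of the course software.
from collections import Counter

def topic_key(question):
    """First word longer than four letters, lowercased, punctuation stripped."""
    for word in question.lower().split():
        w = word.strip("?,.!")
        if len(w) > 4:
            return w
    return question.lower()

def triage(questions):
    counts = Counter(topic_key(q) for q in questions)
    # Most-common topics first; the sort is stable, so arrival order is
    # preserved within a topic.
    return sorted(questions, key=lambda q: -counts[topic_key(q)])

questions = [
    "How much latency does the arm tolerate?",
    "Is the video delayed?",
    "What latency is acceptable for teleoperation?",
]
print(triage(questions))
```

A real TA would of course also weigh pertinence to the class material, which no keyword heuristic captures; the sketch only shows the mechanical part of the sort.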
Both CU-SeeMe video and Multicast video are available over the Internet. White Pines Enhanced Color CU-SeeMe V.2.1 is used for CU-SeeMe. Transmission will be in gray scale so that users of either the White Pines software or the free Cornell University software will be able to decode the data. The Multicast video encoding uses Mrouted 3.8 and will support some earlier versions of NV software.
CU-SeeMe audio, RealAudio, and Multicast audio are supported. RealAudio version 3.0 will be our primary source. Although CU-SeeMe audio will be distributed, it has been our experience that its intelligibility is poor even over high-speed connections. Multicast audio broadcasting using Mrouted 3.8 will provide audio for most VAT versions.
Both Netscape and MSIE support chat windows used by students to submit questions to the instructor in real time. These are reserved for students enrolled in the class. There may be an additional chat window available for nonregistered students in the event that there is time to answer additional questions. Although unenrolled students probably won't have their questions answered during the class, if the questions are not too voluminous they will all be answered through electronic mail. There will be two Chat windows running, one for the class and one for technical problems.
In addition to chat, students use the Web to download three types of multimedia: slides, video, and audio. Each instructor has about 20 slides posted on the Web. Students are responsible for downloading QuickTime video clips in advance so they can play them easily while the teacher talks the class through them. They may also be asked to download audio clips.
E-mail is the last component of the Internet toolset. Students need e-mail, either through Netscape or some other account. Any questions that are not answered during class will be answered via e-mail.
To produce the virtual classroom for the telerobotics project, an infrastructure must be developed that provides acceptable data delivery and transmits intelligible information to students in real time. The objective of this distance learning project is to support 150 enrolled students and several hundred other viewers who have limited access. The software used will be based on delivering a course over the Internet using TCP/IP as the transport protocol. The data delivery system will originate from Ames Research Center, which has a T3 hub (45 Mbps) to the Internet. This enables the large amount of data coming out of Ames to flow freely into the network. Fractional T1 is optimal for reception; however, the design plan is specifically geared toward students with a 28.8-Kbps connection.
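A back-of-the-envelope check shows why the T3 uplink comfortably carries the enrolled class. The T3 rate and class size are from the text; the per-student stream rates (a 14.4-Kbps RealAudio channel plus a 28.8-Kbps-class CU-SeeMe stream) are worst-case assumptions, and protocol overhead and the limited-access viewers are ignored, so this is only a rough sketch.

```python
# Rough outbound-bandwidth budget for 150 enrolled students.
T3_BPS = 45_000_000          # Ames' entry point to the Internet
STUDENTS = 150

audio_bps = STUDENTS * 14_400   # one RealAudio channel per student
video_bps = STUDENTS * 28_800   # worst case: every student also pulls video
total_bps = audio_bps + video_bps

print(f"{total_bps} bps, {total_bps / T3_BPS:.1%} of the T3")
```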
To implement the full services, we use a combination of video, audio, World Wide Web, and Internet tools. A number of products on the market would have yielded better intelligibility; unfortunately, free client software was not available for the students, and this system had to be as inexpensive as possible for the user. The communication design is composed of six distinct systems: Audio/Video Base Band, RealAudio, CU-SeeMe, Multicasting, WWW, and ROV. The following diagrams outline these systems. See http://zeus.arc.nasa.gov/getstarted.html.
This system is the eyes and ears of the whole design. Using poor equipment at this stage will only make the rest of the system less accurate. The performance of the video codec is affected by the performance of the camera. Cameras that generate "noise" in the signal make it difficult to generate high performance, as it is not possible to compress a noisy signal as much as a clean one. The two main factors that contribute to the noise level generated by a camera are the optics and fluctuations due to any "auto white balance" feature. Cameras with high-quality optics (such as most camcorders) and cameras with manual override of the auto white balance will give better performance. Unfortunately we did not have the budget to purchase top-of-the-line cameras. The video component is taken through a high-end desktop camera.
The output of the camera is fed into a video distribution amplifier. The device used was a DA-60. It had one input and four outputs. A manual gain control made it possible to provide a 1-volt peak-to-peak signal to the video test monitor and encoders. From there, video-capturing boards were used for both CU-SeeMe and NV multicasting. We recommend 30 frames per second capture boards with a minimum resolution of 320 x 240 with "true color" or "millions of color" image depth.
The sound is captured by a Shure SM-18 microphone connected through a standard three-wire XLR cable. This baseband signal is fed into a Shure M-67 mixer, which provides multiple 0-dB signals to the encoders, sufficient gain to drive the RealAudio encoder, the CU-SeeMe audio encoder, and the test speaker. It is important to use a good-quality microphone that generates good audio levels; even inexpensive microphones often have built-in amplification that gives good results.
A specific audio tool, RealAudio V.3.0, will be utilized. It is separate from the CU-SeeMe audio and the NV multicasting audio, because the audio on the previously mentioned tools is not always completely intelligible, especially when the user's network is encountering saturation problems. The system is designed to provide RealAudio at 14.4 or 28.8 Kbps. Student accounts are automatically set up for 14.4 Kbps. We will have two licenses running at 150 users each for a total of 300 RealAudio channels available. All channels distribute audio streams at 14.4 Kbps. It must be emphasized that registered students will have access to one CU-SeeMe channel and one RealAudio channel; access for all unregistered viewers will be limited to a first-come, first-served basis.
A Macintosh 8500/150 is used as the audio encoder. RealAudio V.3.0 allows for dual streaming outputs to multiple reflectors. We used a SunSparc Ultra Server 170 to distribute 150 RealAudio channels and a SunSparc 20 to distribute the other 150 channels. Users are allowed to log in using pnm://quest.arc.nasa.gov/live/live.ra.
When building the system we began with a 50-user license. We upgraded the license to accommodate 150 student accounts, then activated a second machine capable of supporting 150 users. Unfortunately, RealAudio Version 2.0 did not support multiple reflectors from a single encoder. Version 3.0 allows us to stream two encoded streams of the same rate (14.4 or 28.8 Kbps) to multiple machines. Our requirement was to provide RealAudio for a total of 300 users, so we needed three machines to accomplish this. The encoder was located several miles from the two separately located reflectors.
The software used to deliver the classes will be White Pines V.2.1.0 Enhanced Color CU-SeeMe. However, the system is designed to support Cornell University B&W CU-SeeMe software V.0.82b through V.0.85b1, which can be acquired free over the Internet (see http://zeus.arc.nasa.gov/getstarted.html). These versions of CU-SeeMe support both PCs and Macs.
There are a number of proprietary desktop software applications available. Some are substantially better but require a much larger investment by the user; the key requirement here is that the software be free to the students. The video component of this instructional system is a very minor point, with the exception of the lab, in which each student is required to drive an ROV over the Internet. CU-SeeMe is used to transmit the video component, confirming to the student that the vehicle has moved. If CU-SeeMe breaks down, a Web application will be available to provide feedback to the ROV driver.
A video and audio card, together with the CU-SeeMe software, acts as the encoder. The Pentium-based PC "Alpha Frog" then distributes the encoded stream to a SunSparc 20 called "Zeus." Zeus has a 50-user license and reflects for 50 users. This provides for 47 users and three streams to feed three more computers: "Quest," "Top Web," and "Explorer." Quest and Explorer each have a 100-user license which, along with Top Web's 50-user license, provides a total of 250 reflected channels. This system can distribute to 297 users.
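The fan-out arithmetic above can be checked in a few lines. The machine names and license sizes come from the text; the helper function itself is illustrative, not part of the deployed system.

```python
# The reflector cascade described above, checked arithmetically.
def viewer_ports(license_size, downstream_feeds=0):
    """Ports left for viewers after reserving feeds to downstream reflectors."""
    return license_size - downstream_feeds

zeus = viewer_ports(50, downstream_feeds=3)   # feeds Quest, Top Web, Explorer
quest = viewer_ports(100)
top_web = viewer_ports(50)
explorer = viewer_ports(100)

total = zeus + quest + top_web + explorer
print(total)  # 297, matching the stated capacity
```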
Packet video will continue to play a big part in the future of our remote instruction. Initially we intended to transmit in both color and gray scale, at the expense of using an additional machine and splitting up our licenses. After testing White Pines Enhanced CU-SeeMe in color, we decided it was not worth transmitting in both formats. The primary reason for transmitting in gray scale was to support the many students using Cornell University CU-SeeMe: when Cornell black-and-white users attempted to receive CU-SeeMe in color, they could not receive any video. We decided that we had a better chance of delivering usable information in gray scale, which also used less of the data pipe. We also found that adjusting the software settings on White Pines Enhanced CU-SeeMe gave the best performance; these settings are listed in this paper.
Multicasting over MBone is in the public domain and can be acquired free (http://zeus.arc.nasa.gov/mbone.html). We will use NV and VAT tools in the multicast domain (Mrouted V.3.8 multicast software) in order to transmit the video and audio signals originating from the Ames class over the Internet.
Multicasting goes out to most of the major universities across the country. It relies on the router protocol called OSPF (Open Shortest Path First). On top of that, specific tunneling must be implemented in order to ensure that the university can receive the multicast transmissions. We used a Sun IPX to encode the audio and video. The existing audio/video card was made by Parallax. The purpose of this system was to create a wider distribution of the course. No students were required to use it.
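At the receiving host, subscribing to a multicast session amounts to a group membership request on a UDP socket; the tunneling described above is handled by the routers, not the end host. The sketch below is illustrative only: the group address and port are hypothetical (real NV/VAT sessions were announced separately), and the join may be refused on hosts without a multicast-capable route.

```python
# Illustrative sketch of joining a multicast group at the socket level.
# GROUP and PORT are hypothetical, not the actual course session.
import socket
import struct

GROUP = "224.2.0.1"   # hypothetical group address
PORT = 4444           # hypothetical session port

# The membership request pairs the group address with the local interface
# (0.0.0.0 lets the kernel choose one).
mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))
try:
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    print("joined", GROUP)
except OSError:
    # Hosts without a multicast-capable route will refuse the join.
    print("no multicast route available")
finally:
    sock.close()
```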
We require students to have access to Netscape 3.0 or Microsoft Internet Explorer. Incorporated in these tools are WebChat and e-mail. The Web page designed to lead the class allows a student to go into a specific chart on a given presentation or a chat window for that class, or send e-mail to the professor giving the class. In addition, there are buttons to launch CU-SeeMe and RealAudio applications.
The WWW system has been developed to survive if one of the two servers fails. The primary system at UND supports the student accounts. In the event that this system fails, students will be able to log into the public site mirrored at Ames. The chat window at the Ames site is intended for use only in the event of a UND failure.
A telephone link feeds a transcription service that streams live text into a chat window. This stenography service, provided by Cheetah Corp., supports two functions: it provides live text for the hearing impaired, and it acts as a backup link in the event of a RealAudio failure.
The system is designed for students to use CU-SeeMe and WWW to drive a remotely operated vehicle. Part of the benefit of this distance learning lab is to give students the experience of controlling a vehicle thousands of kilometers away. Two resources have been designed to produce this result. In one case, students will use CU-SeeMe and Telnet to remotely control a vehicle or mechanical arm. The system is designed to allow a Telnet session to a computer at either the University of California at Berkeley or Ames Research Center. A second system loads scanned TV onto a Web page, which students can refresh to see how they have moved their vehicle.
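The Telnet-driven lab can be pictured as a simple command loop: the student types movement commands, the remote host applies them, and CU-SeeMe (or the refreshed Web snapshot) confirms the motion. Everything below — the command names and the grid model — is a hypothetical sketch; the paper does not document the real ROV protocol.

```python
# Hypothetical sketch of driving a vehicle through a text command channel,
# as in the Telnet-based lab. Command names and the grid model are invented
# for illustration.
MOVES = {
    "forward": (0, 1),
    "back":    (0, -1),
    "left":    (-1, 0),
    "right":   (1, 0),
}

def drive(commands, start=(0, 0)):
    """Apply a sequence of movement commands and return the final position."""
    x, y = start
    for cmd in commands:
        dx, dy = MOVES[cmd]
        x, y = x + dx, y + dy
    return (x, y)

# The student would type commands like these into the Telnet session and
# confirm the motion over CU-SeeMe or the refreshed Web page.
print(drive(["forward", "forward", "right"]))  # (1, 2)
```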
As the course begins, the instructor is seated before a camera and microphone in the lab at Ames. Conventional baseband audio and video recording devices record the full two hours of narration, comments, and facial expressions. The baseband audio and video signal originates from a ground station located in building N240 at NASA ARC at Moffett Field. This tape is replayed over the network once a week for those who missed the class.
The course is slated for completion on 26 March 1997. Students will evaluate it on five characteristics: software usage, intelligibility of information, value, virtual experience, and results of the final exam.
The University of North Dakota is modifying its current evaluation system to include additional elements.
An in-depth survey conducted prior to the beginning of the course indicated that no other live course has been executed with all of the components utilized in the NASA/UND telerobotics course. Furthermore, since UND's commencement of its distance learning program, a number of foreign nationals have benefited from the curriculum, including students from England, Spain, Australia, Malaysia, Brazil, Japan, Canada, and New Zealand. The NASA/UND project includes participants in Brazil, England, and Puerto Rico, with Russia participating informally.
The first demonstration of this course was successfully launched on 22 January 1997 with more than 80 users, and it was seen as one of the most innovative uses of low-bandwidth distance learning to date. The live video is not very useful to 28.8-Kbps users because the quality is poor; however, the audio is superlative. Clearly, new ground has been broken here. Network speeds will continue to increase in urban areas, making this type of low-bandwidth distance learning less necessary there; nevertheless, rural areas will continue to require solutions such as those outlined in this paper for some time to come.
This project has been designed as part of a three-year partnership. Next semester UND will begin remote instruction over MBone to Memphis city schools through the University of Tennessee, providing vocational instruction on aeronautics to students in their first and second years of junior college. The project will leverage a "one touch" system, which allows a student logged onto a special system to initiate a question by entering a code on a touchtone pad connected to a computer. The instructor can then allow that student to articulate his or her question over the network directly to the classroom.
The most important lesson learned is that traditional barriers to reaching remote locations are quickly falling before new technological developments. These applications should be adopted as learning tools in our educational communities to afford the maximum possible growth and development.
Mark León is Program Manager of the Learning Technologies Project (formerly known as the Information Infrastructure Technology and Applications Project [IITA]) at Ames Research Center.
Andrea McCurdy is a Sterling Software contractor with NASA's Information Infrastructure Technology and Applications (IITA) K-12 Internet Initiative.
Charles Wood is with the Dept. of Space Studies, University of North Dakota, Grand Forks, ND. Phone 701-777-3167, Fax 701-777-3711.
Sterling Software, NASA Ames Research Center: Allen Ross, Richard Andrews, Michael Defrenza, Alan Federman.
I-NET, NASA Ames Research Center: Damien Canerot, Allen Johnson, Gloria Houde, Patricia Kaspar.
University of North Dakota: Henry W. Borysewicz, Joanne Gabrynowicz, Thomas Eggebraaten, Jamie Dronen, George Seielsted.