On the basestation, an NCSA HTTP server provides a graphical user interface (GUI) to the mobile robots.
For users to interact effectively with the mobile robots in the lab, the GUI must support the following capabilities: (i) submission of basic commands, e.g., Forward, RaiseGripperLift, (ii) viewing of actuator and sensor states, and (iii) submission of program files. In addition, the following are highly desirable: (iv) video of the lab overview, and (v) real-time animation of sensor readings.
We implement our GUI using Web facilities (i.e., HTTP, an HTTP server, the Common Gateway Interface, HTML, and a Web browser). Web facilities are chosen as the implementation platform because (i) they are widely available and therefore maximize the accessibility of our system, and (ii) they are evolving rapidly, so we expect to be able, in the near future, to implement the entire GUI using Web facilities alone. Figure 2 shows part of the current Web page.
Since current Web facilities cannot yet support all the features our interface needs, we adopt a hybrid approach for now and plan to replace the non-Web-based parts as Web facilities improve.
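The command-submission path through the Common Gateway Interface can be sketched as follows. This is a minimal illustration, not the lab's actual code: the command names are taken from the examples above, but the query-parameter name and the dispatch behavior are assumptions.

```python
# Hypothetical CGI handler for basic robot commands. The parameter name
# "command" and the acknowledgement text are illustrative assumptions.
from urllib.parse import parse_qs

VALID_COMMANDS = {"Forward", "Backward", "RaiseGripperLift", "LowerGripperLift"}

def handle_request(query_string):
    """Parse a CGI query string and return an HTML response body."""
    params = parse_qs(query_string)
    command = params.get("command", [""])[0]
    if command not in VALID_COMMANDS:
        return "<html><body>Unknown command.</body></html>"
    # In the real system the command would be forwarded to the robot
    # controller on the basestation; here we only echo an acknowledgement.
    return f"<html><body>Command {command} accepted.</body></html>"
```

A form on the Web page would submit, e.g., `command=Forward`, and the script would print the returned HTML back to the browser.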
The GUI can be divided into three parts: (i) registration/login, (ii) programming, and (iii) display.
The registration/login part works in a manner similar to that of the Bradford telescope project. We are also adding the capability for users to schedule experiment time slots in our lab.
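The core of such time-slot scheduling is checking a requested slot against existing bookings. The paper does not describe the scheduler, so the following is only a sketch of the overlap test, with slots represented as (start, end) hour pairs for illustration.

```python
# Hypothetical slot-overlap check for experiment scheduling.
# Slots are half-open intervals (start, end); units are illustrative.

def overlaps(a_start, a_end, b_start, b_end):
    """Two half-open intervals overlap iff each starts before the other ends."""
    return a_start < b_end and b_start < a_end

def can_book(requested, booked):
    """A requested slot is grantable if it overlaps no existing booking."""
    start, end = requested
    return all(not overlaps(start, end, s, e) for s, e in booked)
```

For example, with bookings at (8, 9) and (11, 12), a request for (9, 11) would be granted, while a request for (10, 12) would be rejected.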
Programming is supported at different levels:
The display part of the GUI transmits video and sensory data in semi-real-time. In the current setup, we send the video stream via a video conferencing tool called nv (NetVideo). This requires that the client have nv capability, and is therefore restrictive. Sensory data are plotted and displayed as primitive animations using Netscape's server-push method. Finally, the user can request that the complete log of sensor readings be sent to him or her via e-mail.