Control Station Projects
The SSL has worked on several projects to facilitate robust control:
A graphical simulation was developed to allow an operator to visualize the telerobot and worksite in a three-dimensional environment. Several windows are provided to allow simultaneous viewing of multiple views. One stereo and three monoscopic views from over 50 predefined virtual cameras may be selected; views attached to each arm, the vehicle body, and the worksite improve situational awareness by providing several frames of reference from which the world may be observed. Every view is completely reconfigurable, allowing the operator to move freely about the virtual environment and adjust the field of view. Telemetry data, either from a training simulation or from sensors on the vehicle, is used to update the simulation, moving the robotic system and highlighting changed states. Using a keyboard and mouse, the operator uses the windows on the perimeter to change control parameters and monitor the vehicle's state. The operator can also use a variety of input devices to interact directly with the graphical simulation and the vehicle, moving each of the manipulators to perform the specified maintenance task. The graphical simulation can be displayed not only on a standard computer monitor but also in stereo using a head-mounted display or LCD shutter glasses.
Although the graphical simulation replaces actual video during computer simulations, it can also augment live video during robotic operations. The graphical simulation is updated by telemetry data, which can come either from a simulation or from sensors on the vehicle. During vehicle operations, the graphical simulation can provide additional virtual views of the robot. Operators can move to any vantage point within the virtual environment and use it to assist with their task. The virtual environment can also be used to perform simple functions when actual video is unavailable. For example, during one test, the live satellite video feed was lost. However, the telemetry feedback was still streaming, allowing the operators to continue performing basic manipulator checkouts without video.
The capability to display the actual position of the robot within the graphical simulation has proven helpful in many circumstances. The ability to augment, and even replace, live video has improved operators' situational awareness. Moreover, a graphical simulation has the advantage of displaying things that could never be observed from live video images. The graphical simulation can easily highlight items to draw the operator's attention to important information. Telemetry data can warn of impending problems, which can then be displayed within the virtual environment. An error condition can cause a manipulator to flash until the operator addresses the condition. Virtual graphics filters can be used to observe power consumption, temperature values, and global status for each manipulator joint, using a gradient of colors to indicate the various levels.
Lead Engineer: Corde Lane
Graphical User Interface Control System
To keep up with the developmental changes of the Ranger telerobot, a flexible control interface was designed. Over 50 different software processes work together to effectively control the robot's many subsystems. The operator can use one of many types of input devices (hand controllers, forceballs, master arms, and 6-DOF positioners) to directly control Ranger's manipulators. The inputs of these devices are fed to an interface program, shown below, which sends commands to the robot.
The above screenshot shows several independent windows surrounding a graphical simulation display. These windows display the current state of the vehicle. The operator can use this graphical interface to change control modes, update control parameters, run command scripts, or monitor the vehicle's status. Each window focuses on one aspect of the robot. The operator can open, close, and move each window to create a control station layout for a specific task; for example, the operator may use one layout when controlling the manipulators, then switch to another to analyze a system malfunction. The control station is distributed, linking different modules together and allowing several operators, using different computers over the Internet, to work together to effectively control Ranger. If an operator station crashes, the control system can instantly be reconfigured, allowing any other operator to take over where the previous one left off.
The modularity of the system facilitates the modification or addition of new systems on Ranger. If a new control method is developed for the robot, a new control station module with its different windows can easily be dropped into the collective. In fact, control of an entirely different robotic vehicle, SCAMP, was integrated into this system, and the operator can control both vehicles simultaneously. Auxiliary programs can also be absorbed into the system: new input devices, programs that monitor the communication between the control station and the robot, and graphical simulations are all linked in the same way. The control system is therefore not only used during robotic operations; it is also used to test new system designs, analyze system anomalies, and train operators, making it as much a design tool as an operations tool in Ranger's continual development.
Robotic Kinematic and Dynamic Simulations
Neutral buoyancy provides a high-fidelity simulation for development and training. However, there are times when one of the underwater vehicles is either not functioning or not available. A computer simulation fills this void, allowing some level of training and development without the actual vehicle. Even with a relatively low-fidelity computer simulation, much can be accomplished with the system.
The same input devices, control station, and graphical simulation are used during the computer simulations as during TSX and NBV robotic operations. The figure below shows how the software processes are linked together. The Arm Control Simulation is a functional equivalent of the control software used on the actual robot to command its manipulators. The Arm Interaction Simulation is a mathematical process that runs on a computer to mimic the behavior of the actual robot, including manipulator dynamics, collisions with the worksite, and manipulation of maintenance task elements. Data from the arm simulations are streamed to update the status on the control station panels. The graphical simulation is used in place of live video coming back from the robot.
The training simulations have been used to help new operators learn the fundamentals of controlling the robot. Operators learn how to properly use the different input devices, how each of the control station functions is utilized, and what the procedure steps are for performing specific tasks. Using these simulations, novices with no prior experience controlling robots, including young children, have learned enough to pilot neutral buoyancy vehicles within a few minutes. One such vehicle comes from another project, SCAMP, which is designed to fly within the tank and take video. Every year an open house event allows the general public to control a telerobot. People of all ages and backgrounds learn how to use the ground control station to adequately fly the training simulation. After only a few minutes of experimentation with the simulation, they take the controls of the actual underwater vehicle with more confidence.
Although the training simulations have proved successful in quickly reducing operators' learning times, the greater advantage these simulations provide is the capability to develop the robotic system itself. Over five hundred hours of human factors testing have been conducted using the simulations to determine the best strategies for controlling Ranger remotely under different amounts of time delay. The arm control software has been tested extensively over many hours of computer simulation, debugging code, adjusting control parameters, and developing unique control methods before testing with hardware. Much analysis has also used the graphical simulations to test ideas before committing to major design steps. For example, before changing the size and length of the manipulators, the computer simulations were updated and used to determine whether such a design change was required.
Lead Engineer: Craig Carignan
Symbol Table Communications Index
The symbol table is a list of messages that are passed between software processes. Most commonly these messages pass between a control station and a robotic vehicle. However, symbol table communication is also used between utility processes, simulations, other control stations, and monitoring programs. Software scripts are used to generate the required C code to allow these modules to communicate using shared memory or NDDS (Network Data Delivery Service by Real-Time Innovations).
This document is intended to assist a programmer in interfacing with an existing control station project using the symbol table communications code. The symbol table database holds all the information from which the communication C code is derived. This database can be searched and reviewed. Information is also given on how this information is used to write the communications C code. Finally, an explanation of how to use the C code to integrate modules is presented.