Hardware and System Software
Our on-board computer uses a Commell LS-371 3.5 in embedded motherboard with an Intel Core Duo 2.0 GHz processor and 2 GB of RAM. We use a Patriot Memory Warp II 32 GB solid-state drive (SSD) for on-board storage. The SSD provides improved I/O speeds and increased storage over the CompactFlash cards used in the past, in a vibration-resistant package. The computer runs a minimalist installation of Debian GNU/Linux.
The shared memory system is a custom library, based on the POSIX shared memory standard, that provides thread- and process-safe variable updating and notification. It stores the vehicle's state in a collection of type-aware variables. These shared variables are accessible to every component of the software system, which allows simple communication among the various daemons. For example, the serial drivers interact with electronic peripherals on the vehicle, such as the sensors and thrusters, and write data from these peripherals to shared memory. The controller or mission software can read this data and, in turn, send new commands to the serial drivers through shared memory. To keep these variables in sync across multiple computers, we use Statecast, a custom TCP/IP-based protocol. Statecast lets dockside users change shared variables on their own computer and have those changes reflected quickly and reliably on every computer on the network. Statecast has enabled the development of a graphical dockside user interface while eliminating our dependence on Pyro (PYthon Remote Objects).
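The core ideas behind the shared memory system — type-aware variables, thread-safe updates, and change notification — can be sketched in a few lines of Python. This is a minimal, hypothetical illustration; the actual library is a custom POSIX-based implementation with a different API, and all names here are made up:

```python
import threading

class SharedVar:
    """Sketch of a type-aware, thread-safe shared variable with
    change notification. Illustrative only; not the real library."""

    def __init__(self, name, vtype, initial):
        self.name = name
        self.vtype = vtype              # enforced type, e.g. float
        self._value = vtype(initial)
        self._cond = threading.Condition()
        self._version = 0               # bumped on every write

    def set(self, value):
        with self._cond:
            self._value = self.vtype(value)  # type-aware coercion
            self._version += 1
            self._cond.notify_all()          # wake any waiting daemons

    def get(self):
        with self._cond:
            return self._value

    def wait_for_update(self, last_version, timeout=None):
        """Block until the variable changes past last_version."""
        with self._cond:
            while self._version == last_version:
                if not self._cond.wait(timeout):
                    break
            return self._value, self._version

# A serial driver might publish a sensor reading like this:
depth = SharedVar("kalman.depth", float, 0.0)
depth.set(1.5)
```

A daemon waiting on `wait_for_update` is notified as soon as another thread calls `set`, which is the same pattern that lets the controller react to new sensor data without polling.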
A Kalman Filter is used to fuse sensor data in real time on the vehicle. The filter fuses the higher-rate sensor observations of acceleration and angular rate with the vehicle's model-based prediction, and then with the slower, direct observations of velocity and orientation.
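The fast-predict/slow-correct cycle can be illustrated with a toy one-state filter: many high-rate prediction steps driven by acceleration, followed by one correction from a lower-rate velocity observation. All gains and noise values here are invented for illustration; the vehicle's actual filter estimates the full state:

```python
class ScalarKalman:
    """Toy one-state Kalman filter: predict velocity from high-rate
    acceleration, correct with a lower-rate direct velocity fix."""

    def __init__(self, x0=0.0, p0=1.0, q=0.01, r=0.25):
        self.x, self.p = x0, p0   # state estimate and its variance
        self.q, self.r = q, r     # process and measurement noise

    def predict(self, accel, dt):
        self.x += accel * dt      # integrate the acceleration
        self.p += self.q          # uncertainty grows each step

    def update(self, measured_vel):
        k = self.p / (self.p + self.r)         # Kalman gain
        self.x += k * (measured_vel - self.x)  # blend in the observation
        self.p *= (1.0 - k)                    # uncertainty shrinks

kf = ScalarKalman()
for _ in range(10):               # ten fast accelerometer steps
    kf.predict(accel=0.1, dt=0.1)
kf.update(measured_vel=0.12)      # one slow direct velocity observation
```

After the correction the estimate sits between the integrated prediction (0.1) and the measurement (0.12), weighted by the relative uncertainties — the same behavior, per axis, as the vehicle's filter.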
The output of the Kalman Filter is then fed into the vehicle's controller, which was designed using root-locus and Bode techniques. The six thrusters of the vehicle allow us to control five degrees of freedom: surge, sway, heave, pitch, and yaw. Velocity data from the DVL gives us the option of running either open- or closed-loop velocity control in the surge and sway directions.
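The open- versus closed-loop choice amounts to whether DVL feedback is available. The sketch below shows one plausible shape for that switch, using a simple PI loop; the gains and structure are invented for illustration, whereas the real controller was tuned with the root-locus and Bode techniques mentioned above:

```python
class SurgeController:
    """Sketch of open- vs closed-loop surge velocity control.
    All gains are illustrative placeholders."""

    def __init__(self, kp=2.0, ki=0.5, kff=0.8, dt=0.1):
        self.kp, self.ki, self.kff, self.dt = kp, ki, kff, dt
        self.integral = 0.0

    def command(self, desired_vel, dvl_vel=None):
        if dvl_vel is None:                 # open loop: feedforward only
            return self.kff * desired_vel
        err = desired_vel - dvl_vel         # closed loop: PI on DVL error
        self.integral += err * self.dt
        return self.kp * err + self.ki * self.integral

ctrl = SurgeController()
open_cmd = ctrl.command(0.5)                  # no DVL data available
closed_cmd = ctrl.command(0.5, dvl_vel=0.3)   # DVL reports 0.3 m/s
```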
Our vision system is written in C++ and utilizes the open source OpenCV and libdc1394 libraries. The vision system allows the mission software to enable and disable individual vision algorithms whenever visual data is required for a specific mission element.
All of the machine vision algorithms work in a similar manner. The input image is converted from RGB space into HSV space and split into its three component channels: hue, saturation, and value. Each of these channels is segmented through predetermined thresholds, and the three segmented channels are recombined to form a binary image. Contours are detected in the binary image, and these contours are then run through a set of probabilistic filters and moment analyses to determine the location, orientation, and probability of a specific mission element. For reference, output from the shape detection algorithm is shown below (input image, binary segmented image, and detected contours):
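The threshold-and-recombine step at the heart of this pipeline is easy to show in isolation. The sketch below implements just that step in NumPy for clarity (the real system is C++ and would use OpenCV's `cvtColor`, `inRange`, and `findContours` for the full pipeline); the bounds and test image are made up:

```python
import numpy as np

def segment_hsv(hsv, lo, hi):
    """Threshold each HSV channel against predetermined bounds and
    recombine the three masks into one binary image (0/255)."""
    h, s, v = hsv[..., 0], hsv[..., 1], hsv[..., 2]
    masks = [(c >= l) & (c <= u) for c, l, u in zip((h, s, v), lo, hi)]
    binary = masks[0] & masks[1] & masks[2]   # all three channels must pass
    return binary.astype(np.uint8) * 255

# A tiny 4x4 HSV image with a 2x2 "orange" patch (values are invented):
hsv = np.zeros((4, 4, 3), dtype=np.uint8)
hsv[1:3, 1:3] = (30, 200, 200)
mask = segment_hsv(hsv, lo=(20, 100, 100), hi=(40, 255, 255))
```

Contour detection and moment analysis then operate on `mask` to recover location and orientation.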
In the past, each of our vision algorithms was implemented in its own daemon. We now use an integrated vision daemon that combines all of the previously separate processing daemons (such as the pipe daemon, the buoy daemon, etc.) and the camera capture daemon into one streamlined, multithreaded daemon. Each processing daemon becomes a ‘module’ that fits into a framework provided by the camera daemon. These modules are dynamically loaded and unloaded depending on shared variables. A separate thread is used for each module and each capture source. Capture sources can be physical cameras, directories of images, or video files. Images captured by the daemon are no longer saved to disk and passed around as paths stored in shared variables; instead, they are passed in memory through the module framework, resulting in significant performance improvements.
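The in-memory hand-off between a capture source thread and its modules can be sketched as follows. The class names and interfaces here are hypothetical stand-ins for the daemon's actual (C++) framework, and the "detection" is a placeholder:

```python
import queue
import threading

class VisionModule:
    """Hypothetical base class for one processing module."""
    def process(self, image):
        raise NotImplementedError

class BuoyModule(VisionModule):
    def process(self, image):
        return ("buoy", len(image))   # placeholder for real detection

class CaptureSource(threading.Thread):
    """One thread per capture source: hands each frame, in memory,
    to every enabled module -- no disk round-trip."""
    def __init__(self, frames, modules, results):
        super().__init__()
        self.frames, self.modules, self.results = frames, modules, results

    def run(self):
        for frame in self.frames:
            for mod in self.modules:
                self.results.put(mod.process(frame))

results = queue.Queue()
src = CaptureSource(frames=["img0", "img1"],
                    modules=[BuoyModule()], results=results)
src.start()
src.join()
```

Loading and unloading modules then reduces to adding or removing entries from the source's module list in response to shared-variable changes.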
The new simulator utilizes the PyGame and PyOpenGL libraries to render a full 3D world containing the vehicle and all of the mission elements. It writes simulated sensor values and vision data to Shared Memory, which allows missions to be tested off-line with no modification.
Vehicle Abstraction Layer (VAL)
The vehicle abstraction layer (VAL) builds a Python language wrapper around all of the shared variables, creating an abstract Vehicle object in Python. This Vehicle object not only has access to physical sensors which write data to shared memory, but can also create virtual hybrid sensors that combine data from multiple real or simulated sources. This feature allows us to create a new virtual sensor such as ''water depth'' by combining data from the real depth and altitude sensors. It also allows mission code to be tested using data from a simulator which writes data to virtual sensors.
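The hybrid-sensor idea can be shown with a small sketch. The wrapper and helper below are illustrative names, not VAL's actual API, and the sensor readings are stand-in values; the "water depth" example mirrors the one described above:

```python
class Sensor:
    """Sketch of a VAL-style sensor wrapper around a data source."""
    def __init__(self, read_fn):
        self._read = read_fn
    def read(self):
        return self._read()

def hybrid(*sensors, combine):
    """Build a virtual sensor that combines real or simulated sources."""
    return Sensor(lambda: combine(*(s.read() for s in sensors)))

# Stand-in readings; in VAL these would come from shared memory:
depth = Sensor(lambda: 2.0)      # meters below the surface
altitude = Sensor(lambda: 7.5)   # meters above the bottom
water_depth = hybrid(depth, altitude, combine=lambda d, a: d + a)
```

Because mission code only ever calls `read()`, a simulator can feed the same virtual sensors without the mission knowing the difference.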
The mission planner is divided into a planner system, which organizes the task tree, and a set of tasks, which are invoked by the user and by each other, and which the planner builds at runtime into a coherent mission. Given the dynamic nature of the mission, it is highly unlikely that a mission tree will be identical across two different runs: even mission ‘primitives,’ such as ‘GoToHeading’ or ‘GoToDepth,’ are inserted into the task tree at runtime, and the vehicle will usually take a different path on each run.
The mission planner is a multi-threaded, tree-walking program, written in Python for its simplicity, that instantiates each element of the user-given task list, allows tasks to add subtasks, and executes those subtasks when their turn arrives. The planner runs continuously in the background, ready to cull completed tasks and notify tasks further down the line that it is their turn to run. The planner also ensures that ‘exclusive’ tasks, such as movement primitives, run one at a time, and that each task is run at a regular interval, so that tasks can use time-based accounting if desired.
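The tree-walking loop — tasks inserting primitives at runtime, the planner culling completed ones — can be sketched as follows. Class names, the `tick` interface, and the single-threaded loop are simplifications invented for illustration (the real planner is multi-threaded and enforces exclusivity and timing):

```python
class Task:
    """Illustrative task: run() is called on the task's turn; a task
    may populate self.subtasks to grow the tree at runtime."""
    def __init__(self, name):
        self.name, self.subtasks, self.done = name, [], False
    def run(self, planner):
        self.done = True

class Primitive(Task):
    def run(self, planner):
        planner.log.append(self.name)   # stand-in for a movement command
        self.done = True

class DiveThenTurn(Task):
    def run(self, planner):
        # primitives are inserted into the task tree at runtime
        self.subtasks = [Primitive("GoToDepth(2m)"),
                         Primitive("GoToHeading(090)")]
        self.done = True

class Planner:
    def __init__(self, tasks):
        self.tasks, self.log = list(tasks), []
    def tick(self):
        for t in list(self.tasks):
            t.run(self)
            self.tasks.extend(t.subtasks)   # adopt runtime subtasks
            if t.done:
                self.tasks.remove(t)        # cull completed tasks
    def run_to_completion(self):
        while self.tasks:
            self.tick()

p = Planner([DiveThenTurn("mission")])
p.run_to_completion()
```

Because subtasks are created inside `run`, the tree's final shape depends on what each task decides at runtime — which is why no two mission trees are likely to match.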
Dashboard is an application for reviewing log files and viewing live data from the vehicle. It was born of the need for an ergonomic, visual representation of the data coming from the vehicle. Although still under development, Dashboard is already a useful tool, and it should soon replace much of the command-line interface we use to control the submarine.