This disclosure relates to optical navigation and, more particularly, to optical navigation for underwater vehicles.
Underwater navigation presents challenges for vehicles. A typical global positioning system (GPS) cannot operate underwater because the radio frequency signals on which GPS relies are attenuated by water. Therefore, the location of an underwater vehicle may not be known until the vehicle resurfaces for GPS navigation or visual confirmation. Accordingly, a means of tracking location between known points is required for location accuracy. Given the current availability of navigation tools for underwater use, the cost has been prohibitive for many applications. When an underwater vehicle submerges, location references such as GPS and other communication methods are lost. At this point, the underwater vehicle must rely on onboard sensors to maintain location accuracy.
Prior art methods for underwater navigation include using an Inertial Measurement Unit (IMU), a Doppler Velocity Log (DVL), or acoustic communication with surface floats or subsea clumps. These sensors can cost tens of thousands of dollars or more. In addition, these sensors are delicate and subject to damage, and may require active logistics support, via surface or underwater reference locators, to accomplish the task. Typical additional costs when acquiring and adapting the above-mentioned devices include customizing proprietary programming, non-recurring engineering costs associated with feature implementation, and support hardware.
In addition, an IMU is very sensitive to shock and may not be reliable. A DVL works through acoustic means and may be sensitive to fouling because its sensors are exposed to seawater. IMUs and DVLs also do not report position directly; their outputs must be integrated with respect to time, so even the highest-end sensors experience navigation “drift”. Other acoustic methods that rely on known reference sources are limited in range, are noisy (not covert), and require substantial energy.
Computer mouse technology is well proven, accurate for local telemetry, and available at very low cost. Therefore, it should be considered for underwater telemetry. It is very robust, highly reliable, and can be made easily programmable through commonly available means. It works by performing image processing algorithms to determine the offset of features between multiple images taken with the mouse's optical sensor. It typically uses a standard LED or laser in the red-to-infrared spectrum to illuminate a scene. The return images are retrieved through a fixed-focal-length lens. When a surface is within close proximity (approximately 0-6 inches), an LED is sufficient to illuminate the surface and the sensor can achieve high-accuracy tracking.
Though the sensor is capable of taking measurements with ambient light, its accuracy diminishes in lower-light conditions. By using a laser or other light source, the measurement field can be illuminated such that the sensor can more easily detect differences in the images and track movement. Because a laser can focus on a given point on the measured surface (hereafter called “ground”), given the proper lens geometry, the sensor can track telemetry in a manner similar to its more conventional desktop use.
The typical mouse sensor has a near-focus, narrow field-of-view lens that is physically very close to the light source and the ground. This geometry is preserved in the mouse's conventional application because the sensor and light source are always at a constant distance from the ground (i.e., the mouse physically rests on the ground). This, however, is impractical for underwater navigation, as the ground is very seldom flat.
There is a need to incorporate a low-cost mouse sensor into a system for low-cost optical navigation of underwater vehicles. This new system should address the aforementioned shortcomings of using a mouse sensor system designed for a computer.
The present disclosure addresses the needs noted above by providing an underwater vehicle and method for underwater navigation. In accordance with one embodiment of the present disclosure, the underwater vehicle is capable of operating within close proximity to an underwater ground. The underwater vehicle includes an optical navigation system. The optical navigation system comprises a pressure housing that includes, disposed within the pressure housing, a sensor capable of taking images and a light source configured to produce a light beam that is offset from the sensor lens. The light source is further configured to reflect light directly into the field of view of the sensor. The navigation system also includes a processor, operably coupled to the sensor, that is configured to execute instructions. A memory, operably coupled to the processor and sensor, stores processor-executable instructions and images taken with the sensor. When executed, the instructions cause the processor to determine the offset of features between at least two images taken with the optical sensor. The instructions further cause the processor to determine a distance traveled based on the offset between the at least two images.
These, as well as other objects, features and benefits will now become clear from a review of the following detailed description, the illustrative embodiments, and the accompanying drawings.
The accompanying drawings, which are incorporated in and form a part of the specification, illustrate example embodiments and, together with the description, serve to explain the principles of the invention. In the drawings:
The optical navigation system and method disclosed herein achieve two-dimensional (2D) navigation telemetry for underwater vehicles by leveraging open source programming and low cost commercial off-the-shelf (COTS) technology.
Disclosed herein is an underwater vehicle with an optical navigation system that is disposed within a pressure housing. Also disclosed herein is a method for optical navigation for underwater vehicles. The optical navigation system and method include a sensor that takes images of an ocean floor or other underwater ground through a sensor lens. A light source produces a light beam that is offset from the sensor lens. The light source reflects light directly into the field of view of the sensor. The field of view may feature the ocean floor. The sensor takes multiple images, which are received by software that is stored in memory residing within the housing. The software, which may be feature detection software, is executable by a processor. When executed, the software causes the processor to determine the offset of features between at least two images taken with the sensor. In this manner, navigation information may be derived. This navigation information may include a vehicle's two-dimensional position, especially when a compass is used as a fixed reference. In addition, for underwater vehicles, the information could include surge (front-back motion) and sway (side-to-side motion), which may occur as a result of wave motion. The optical navigation system disclosed herein could also be adapted for use with land vehicles.
Referring now to
The optical navigation system 110 may take images of the ocean floor. Based on those images, the system 110 can determine the two-dimensional position of underwater vehicle 120. The optical navigation system 110 can also determine how much surge motion has occurred based on how far forward and/or backward at least one of the images is offset from at least one other image. The optical navigation system 110 can determine how much sway motion has occurred based on how far sideways at least one of the images is offset from at least one other image.
As shown in
Referring now to
Disposed within the pressure body 210 are an optical sensor 230 and a sensor lens 240. The optical sensor 230 is capable of taking images through sensor lens 240, and thus the line of sight of optical sensor 230 should be directed through sensor lens 240. Optical sensor 230 may be a complementary metal-oxide-semiconductor (CMOS) sensor, an N-type metal-oxide-semiconductor (NMOS) sensor, a semiconductor charge coupled device (CCD) sensor, or another sensor capable of taking digital images or capable of converting reflected light into a digital signal.
Still referring to
Still referring to
Still referring to
Still referring to
The ocean floor and other underwater ground areas are very seldom flat. Therefore, it may be desirable for the light source 250 and the optical sensor 230 to be on the same optical path. Ideally, when using a laser, the line of sight of the optical sensor 230 should be on the same axis as the beam path of the laser to eliminate any errors due to parallax. Parallax is a displacement or difference in the apparent position of an object when the object is viewed along two different lines of sight. Parallax may be measured by the angle or semi-angle of inclination between those two lines.
The light from light source 250 may be made to travel directly through the sensor lens 240 (bore sighting), or the light source may be mounted at a minimal offset so that it reflects light directly into the field of view of the optical sensor 230. If the light is made to travel directly through the sensor lens 240, there is zero parallax, so distance is not an issue for alignment; only illuminance is.
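As a simple geometric illustration of that parallax error (the symbols and small-angle approximation below are assumptions added for clarity, not taken from this disclosure), let b be the lateral offset between the beam axis and the sensor's line of sight and h be the distance to the underwater ground:

```latex
\theta \;=\; \arctan\!\left(\frac{b}{h}\right) \;\approx\; \frac{b}{h} \qquad \text{for } b \ll h
```

Bore sighting sets b = 0, so the parallax angle vanishes regardless of h; with a fixed nonzero offset b, the apparent position of the illuminated spot shifts as the height h above the ground varies.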
The sensor lens 240 may have a wider field of view or a larger depth of field to maintain low sensitivity to varying height. Two-dimensional (2D) telemetry is taken with the optical sensor 230 and calibrated through compass readings. A compass (not shown in
Circuit board 260 includes a processor 245 that is operably coupled to the optical sensor 230. Processor 245 may be a digital signal processor. A power source 247, e.g., a battery, may provide power to the optical sensor 230, processor 245, light source 250 and other components needing power. Circuit board 260 also includes a memory 235 that stores processor-executable instructions as well as images taken with the optical sensor 230. Processor 245 should be of sufficient speed to process images and instructions for the optical navigation system 110, so that image offsets can be determined at the rate necessary to accomplish 2D navigation. Images of underwater ground 115 may be captured in continuous succession and compared with each other in order to determine how far the underwater vehicle 120 has moved. Memory 235 or other data storage medium should be of sufficient size to store multiple images over at least the course of a trip for the underwater vehicle. Memory 235 is operably coupled to processor 245. When executed, the instructions in memory 235 cause the processor 245 to determine the offset of features between at least two images taken with the sensor 230. Features may include any identifiable characteristic in the image, including any change in a pixel. The features may include rocks, aquatic plants, changes in elevation, and any other feature that can translate to an identifiable pixel. Features can even be imperceptible to the naked human eye, such as lighter colored grains of sand next to slightly darker colored grains of sand. The features may also include different textures on the underwater ground 115 or sea floor.
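By way of a rough sizing sketch for memory 235 (every numeric value below is an illustrative assumption, not a value specified in this disclosure), the required storage can be estimated from the frame size, frame rate, and trip duration:

```python
# Rough memory-sizing sketch for storing raw frames over a trip.
# All numeric values are illustrative assumptions, not specified by this disclosure.

frame_width_px = 30        # small frame typical of mouse-style optical sensors (assumed)
frame_height_px = 30
bytes_per_pixel = 1        # 8-bit grayscale (assumed)
frames_per_second = 100    # assumed capture rate
trip_hours = 2.0           # assumed trip duration

bytes_per_frame = frame_width_px * frame_height_px * bytes_per_pixel
total_frames = int(frames_per_second * trip_hours * 3600)
total_bytes = bytes_per_frame * total_frames

print(f"Per frame: {bytes_per_frame} B")
print(f"Total frames: {total_frames}")
print(f"Approximate storage needed: {total_bytes / 1e6:.1f} MB")
```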
A window 280 is disposed within the watertight pressure housing. Window 280 is configured to receive light emitted from the light source to the underwater ground 115. The window 280 is further configured to receive light reflected back from the underwater ground 115 to a field of view of the optical sensor 230. Bolts 290 or other securing means may secure the pressure lid 220 to the pressure body 210.
Optical sensor 230 may be chosen, at least in part, based on its frame rate. The frame rate needed for optical sensor 230 may depend on the speed of the vehicle or other body on which the optical sensor 230 is mounted.
The frame rate needed for the optical sensor 230 may be determined according to the following equation:
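The equation itself is not reproduced here. As a hedged sketch of one plausible form (an assumption, not necessarily the relationship used in this disclosure), consecutive frames must overlap by at least a fraction o so that common features can be matched, which bounds the required frame rate f by the vehicle speed v and the width w of the ground footprint imaged by the sensor:

```latex
f \;\ge\; \frac{v}{(1 - o)\,w}
```

For example, at v = 0.5 m/s, w = 0.05 m, and o = 0.5, this gives f ≥ 20 frames per second.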
The return images may be received via sensor lens 240, which may have a set focal length.
Digital image correlation and tracking and/or image processing algorithms may be used to determine the offset of features between multiple images taken with the optical sensor 230. Digital image correlation and tracking is an optical method that uses tracking and image registration techniques for accurate two-dimensional and three-dimensional measurements of changes in images. An example of a digital image correlation technique is cross-correlation to measure shifts in data sets. Another example of a digital image correlation technique is deformation mapping, wherein an image is deformed to match a previous image.
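As a minimal sketch of the cross-correlation approach (the function below is illustrative and is not part of this disclosure; it assumes two equal-sized 2D grayscale frames), an FFT-based phase correlation can estimate the integer-pixel shift between two successive frames:

```python
import numpy as np

def estimate_shift(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Estimate the (row, col) translation that maps frame_a onto frame_b
    using FFT-based phase correlation, a normalized cross-correlation variant."""
    fa = np.fft.fft2(frame_a)
    fb = np.fft.fft2(frame_b)
    cross_power = fb * np.conj(fa)
    cross_power /= np.abs(cross_power) + 1e-12          # normalize; avoid divide-by-zero
    correlation = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(correlation), correlation.shape)

    shift = np.array(peak, dtype=float)
    shape = np.array(frame_a.shape, dtype=float)
    wrap = shift > shape / 2                             # FFT wrap-around -> negative shifts
    shift[wrap] -= shape[wrap]
    return shift                                         # offset in pixels between the frames
```

In practice, windowing and sub-pixel peak interpolation would sharpen the estimate, and the pixel offset would still need to be scaled by the height-dependent ground resolution to yield distance traveled.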
Feature detection algorithms are an example of the type of image processing algorithm that may be used. Feature detection algorithms are known in the art. Examples of feature detection algorithms can be found in the following publication: Jianbo Shi and C. Tomasi, “Good features to track,” Computer Vision and Pattern Recognition, 1994. Proceedings CVPR '94., 1994 IEEE Computer Society Conference on, Seattle, Wash., 1994, pp. 593-600.
Some feature detection algorithms receive an image, divide it into segments and look for features, texture and surfaces as markers. For example, if a camera zooms in to a small square, e.g., a sandy bottom, pixels will show distinctions between portions of the sandy bottom. Markers such as these may be compared in subsequent images to see how far a vehicle has traveled. Memory 235 may also be operably coupled to a compass (not shown in
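A brief, hedged sketch of this marker-tracking approach follows, using OpenCV's implementations of Shi-Tomasi corner detection and pyramidal Lucas-Kanade optical flow; the helper functions, parameter values, scale factor, and axis conventions are illustrative assumptions rather than elements of this disclosure:

```python
import cv2
import numpy as np

def pixel_offset_between_frames(prev_gray: np.ndarray, next_gray: np.ndarray):
    """Detect Shi-Tomasi corners in the previous frame, track them into the next
    frame with Lucas-Kanade optical flow, and return the median (dx, dy) offset
    in pixels, or None if no markers could be tracked."""
    corners = cv2.goodFeaturesToTrack(prev_gray, maxCorners=50,
                                      qualityLevel=0.01, minDistance=5)
    if corners is None:
        return None
    tracked, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray, corners, None)
    good = status.flatten() == 1
    if not good.any():
        return None
    return np.median(tracked[good] - corners[good], axis=0).flatten()  # (dx, dy)

def offset_to_north_east(dx_px, dy_px, heading_rad, meters_per_pixel):
    """Scale a body-frame pixel offset to meters and rotate it into north/east
    components using a compass heading (axis conventions are assumed)."""
    dx_m, dy_m = dx_px * meters_per_pixel, dy_px * meters_per_pixel
    north = dy_m * np.cos(heading_rad) - dx_m * np.sin(heading_rad)
    east = dy_m * np.sin(heading_rad) + dx_m * np.cos(heading_rad)
    return north, east
```

Accumulating such north/east displacements frame to frame yields a dead-reckoned 2D track relative to the last known fix.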
Still referring to
Also by way of example, if the distance traveled for a given image offset is known at heights of six inches (6″) and eight inches (8″) from underwater ground 115, that data may be interpolated to determine the distance traveled at a height of seven inches (7″) from underwater ground 115. Generally, for a given image offset, the closer the sensor is to the water's floor, the less distance the vehicle has actually traveled. Feature detection algorithms, which may be obtained as COTS items, take information such as this into account.
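A minimal sketch of that interpolation follows, assuming the ground scale (distance traveled per pixel of image offset) has been calibrated at a few known heights; the calibration numbers are made-up placeholders, not measured values:

```python
import numpy as np

# Calibrated heights above underwater ground (inches) and the corresponding ground
# distance represented by one pixel of image offset (inches per pixel).
# These numbers are illustrative placeholders, not measured values.
calibration_heights_in = np.array([6.0, 8.0])
inches_per_pixel = np.array([0.010, 0.014])

def distance_traveled(pixel_offset: float, height_in: float) -> float:
    """Interpolate the ground scale at the current height and convert a
    pixel offset into distance traveled (inches)."""
    scale = np.interp(height_in, calibration_heights_in, inches_per_pixel)
    return pixel_offset * scale

# At a height of 7 inches, the scale is interpolated between the 6- and 8-inch values.
print(distance_traveled(pixel_offset=100, height_in=7.0))  # approximately 1.2 inches
```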
Referring now to
Circuit board 260 and light source 250 may be mounted onto the interior of pressure body 210, or otherwise disposed within pressure body 210, using a number of means known in the art, including hard mounting, brackets, and foam. Mounted on circuit board 260 may be the optical sensor 230, memory 235, sensor lens 240, processor 245, and power source 247 (e.g., a battery).
When used underwater, the system is intended to operate where measurements can be taken close to the ground. Because of optical challenges with visibility and backscatter due to turbidity, distances of less than a meter from the ground are expected for subsea use. However, this technology could be adapted as an alternative navigation source for any vehicle traveling over ground at a known distance, such as land vehicles.
Additionally, the system can be used where ambient light can be utilized for image processing and the distance can be taken as optical infinity, such as daytime use for aerial vehicles, or where ground lights can be used as the tracking points during night flight.
The invention can take on alternate embodiments. In this invention's first embodiment, ground refers to the sea floor; however, it is not limited to this. Ship hull inspection, pipeline inspection, and similar applications could also apply. Also, for vehicles that require an operational depth that is not near the ground, the user could modify the vehicle's mission to submerge near the seafloor, establish a 2D position, and then float up to the desired working depth.
Another embodiment could be for land survey or mapping utilizing the high accuracy of this system.
Another embodiment could serve as a cheap alternative for measuring land or air speed, utilizing the low cost of this system; to further reduce cost, the lens of the laser, of the sensor, or of both could be eliminated. Autofocus could be implemented to account for varying measurement distance. Multiple systems could be used in tandem to reduce error in turbid conditions. Different colored lasers or alternative light sources could be used, based on mission conditions, for better performance or covert operations.
The present system incorporates proven, reliable components, such as circuit boards, sensors, and lasers, whose reliability has proven to be very high. The system may be provided using COTS, easy-to-use items. The present system eliminates the requirement for acoustic measurements; therefore, operation can be made active while still maintaining a covert signature to listening devices. Because it does not use acoustic devices, the system has a comparatively lower energy cost.
The foregoing description of various preferred embodiments has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed, and obviously many modifications and variations are possible in light of the above teaching. The example embodiments, as described above, were chosen and described in order to best explain the principles of the invention and its practical application, to thereby enable others skilled in the art to best utilize the invention in various embodiments and with various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the claims appended hereto.
The United States Government has ownership rights in this invention. Licensing inquiries may be directed to Office of Research and Technical Applications, Space and Naval Warfare Systems Center, Pacific, Code 72120, San Diego, Calif., 92152; telephone (619)553-5118; email: ssc.pac.12@navy.mil. Reference Navy Case No. 103,105.