The present subject matter generally relates to a lidar system for underwater vehicles and more specifically to a 360 degree lidar system for underwater vehicles.
Underwater environments can present scenarios where the acquisition of image data through the field of view is limited, which can complicate underwater surveillance when using presently-available imaging systems. For example, the presence of a large number of suspended particles in the field of view may create significant backscattering of light and/or contribute to transmission loss of imaging data and acquisition of that data. Additionally, data acquisition from a direction and dimensional perspective is limited. The underwater environment presents challenges for traditional imaging applications since objects of interest can be located anywhere within the water column. However, state-of-the-art camera-based, laser-based, or acoustically based imaging systems are typically configured to only capture the ocean floor and allow the acquisition of data from one direction rather than from anywhere within the water column.
Recent advances in both acoustics and optical imaging technology, coupled with advances in signal and image processing, have enabled oceanographers to acquire and process images and videos that were unthinkable in years past. Acoustic imaging technologies have been used to gather oceanic images of objects in turbid oceanic environments that are challenging for optical systems. While acoustical systems have a longer imaging range as compared to optical systems, their resolution is significantly lower than that of optical systems. Optical systems are useful in environments that are less turbid since they can have superior resolution and offer contrast based on the optical reflectivity of objects.
The shortcomings of both systems for acquiring and processing high-quality images in underwater environments create the need for the development of technologies that will enable image acquisition and processing that are low power, reasonable in cost, and unobtrusive.
Advanced imaging technologies vastly improve and broaden the utility of unmanned systems. In particular, the application of advancements in underwater laser line scan (LLS) serial imaging sensors and Light Detection and Ranging (LiDAR) have enabled optical imaging in turbid waters where imaging was not previously feasible. Additionally, these platforms have been employed for imaging applications for autonomous underwater vehicles (AUVs) and remotely operated underwater vehicles (ROVs) to enable characterization of underwater structures as well as the seafloor for military and civil applications.
LLS systems have many benefits, including high resolution in some environments where other scanning methods are not feasible, high capture speed, and surface topography scanning, but there are some disadvantages to using these systems for certain applications, particularly under poor water visibility. Pulsed laser-based imaging systems show promise for imaging at extended range due to their ability to reject volume scattering compared to camera-based systems. However, none of these systems offers the promise of omnidirectional imaging.
Aspects and advantages of the invention will be set forth in part in the following description, or may be obvious from the description, or may be learned through practice of the invention.
According to some aspects of the present disclosure, an omnidirectional imaging system includes a housing having at least one transparent portion, a transmitter configured to direct emitted light outward of the housing through the at least one transparent portion at a 360 degree scanning angle measured about an axis, and a receiver configured to receive the emitted light. The receiver is configured to receive the emitted light and generate a cylindrical 3D point cloud centered along a path of the housing as the housing moves in a first direction. The at least one transparent portion may be two transparent portions and/or the housing may be a cylindrical housing.
According to some aspects of the present disclosure, an omnidirectional imaging system includes a housing having at least one transparent portion, a light source, and at least one monogon laser scanner configured to rotate about an axis. The at least one monogon laser scanner is configured to redirect emitted light from the light source outward through the at least one transparent portion to form a 360 degree scanning angle measured about the axis of motion of the system. A plurality of detectors are positioned in a circular arrangement about the axis. The plurality of detectors are configured to receive the emitted light from the light source as the housing is moved in a first direction to generate a cylindrical 3D point cloud centered along the axis of the system as the system moves in the first direction.
According to some aspects of the present disclosure, an omnidirectional imaging system includes a housing having at least one transparent portion, a light source, and a first monogon laser scanner configured to rotate about an axis of motion of the system. The first monogon laser scanner is configured to redirect emitted light from the light source outward through the at least one transparent portion to form a 360 degree scanning angle measured about the first axis. A second synchronized monogon is configured to direct the emitted light to a detector as the housing is moved in a first direction. The detector is configured to generate a cylindrical 3D point cloud centered along the axis of motion of the system as the system moves in the first direction.
These and other features, aspects, and advantages of the present disclosure will become better understood with reference to the following description and appended claims. The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
A full and enabling disclosure of the present disclosure, including the best mode thereof, directed to one of ordinary skill in the art, is set forth in the specification, which makes reference to the appended figures.
Repeat use of reference characters in the present specification and drawings is intended to represent the same or analogous features or elements of the present disclosure.
Reference will now be made in detail to present embodiments of the invention, one or more examples of which are illustrated in the accompanying drawings. The detailed description uses numerical and letter designations to refer to features in the drawings. Like or similar designations in the drawings and description have been used to refer to like or similar parts of the invention.
As used herein, the terms “first,” “second,” and “third” may be used interchangeably to distinguish one component from another and are not intended to signify location or importance of the individual components. The terms “coupled,” “fixed,” “attached to,” and the like refer to both direct coupling, fixing, or attaching, as well as indirect coupling, fixing, or attaching through one or more intermediate components or features, unless otherwise specified herein.
The singular forms “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.
Approximating language, as used herein throughout the specification and claims, is applied to modify any quantitative representation that could permissibly vary without resulting in a change in the basic function to which it is related. Accordingly, a value modified by a term or terms, such as “about,” “approximately,” “generally,” and “substantially,” is not to be limited to the precise value specified. In at least some instances, the approximating language may correspond to the precision of an instrument for measuring the value, or the precision of the methods or apparatus for constructing or manufacturing the components and/or systems. For example, the approximating language may refer to being within a ten percent margin.
Moreover, the technology of the present application will be described with relation to exemplary embodiments. The word “exemplary” is used herein to mean “serving as an example, instance, or illustration.” Any embodiment described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments. Additionally, unless specifically identified otherwise, all embodiments described herein should be considered exemplary.
Here and throughout the specification and claims, range limitations may be combined and/or interchanged; such ranges are identified and include all the sub-ranges contained therein unless context or language indicates otherwise. For example, all ranges disclosed herein are inclusive of the endpoints, and the endpoints are independently combinable with each other.
As used herein, the term “and/or,” when used in a list of two or more items, means that any one of the listed items can be employed by itself, or any combination of two or more of the listed items can be employed. For example, if a composition or assembly is described as containing components A, B, and/or C, the composition or assembly can contain A alone; B alone; C alone; A and B in combination; A and C in combination; B and C in combination; or A, B, and C in combination.
The present disclosure is generally directed to an omnidirectional imaging system 10 configured as a 360 degree pulsed laser line scan imager (PLLS-360°) for underwater and/or surf zone bathymetry and other marine science applications. The omnidirectional imaging system 10 may include a housing 12 including at least one transparent portion 14, 16, a light source 18 configured to produce emitted light 20, a transmitter 22 configured to direct the emitted light 20 outward of the housing 12 and through the at least one transparent portion 14, 16 at a 360 degree scanning angle α measured about an axis of motion x of the system, and a receiver 24 configured to receive the emitted light 20. The receiver 24 is configured to receive the emitted light 20 and to generate a cylindrical 3D point cloud 26 centered along a path of the housing 12 (see arrow P of
Referring now to
Referring now to
The housing 12 may include an outer wall 56 defining at least a first cylindrical portion 50 defining a first cavity 52. The outer wall 56 may have a first transparent portion 14. The first transparent portion 14 is configured to extend about the entire circumference of the first cylindrical portion 50 to provide a full 360 degree view from the first cavity 52. The first transparent portion 14 may extend the full length of the first cylindrical portion 50 or the first transparent portion 14 may extend a portion of the length of the first cylindrical portion 50.
The housing 12 may further include a second cylindrical portion 60 defining a second cavity 62. The second cavity 62 may be at least partially formed by the outer wall 56 of the housing 12. As shown in
The outer wall 56 may have a second transparent portion 16 positioned proximate the second cavity 62. The second transparent portion 16 is configured to extend about the entire circumference of the second cylindrical portion 60 to provide a full 360 degree view from the second cavity 62. The second transparent portion 16 may extend the full length of the second cylindrical portion 60 or the second transparent portion 16 may extend a portion of the length of the second cylindrical portion 60. It is contemplated that the entirety of the outer wall 56 may be transparent. It is further contemplated that the outer wall 56 may have any number of transparent portions without departing from the scope of the present disclosure.
As previously introduced, the imaging system 10 includes a transmitter 22 and a receiver 24. The transmitter 22 and receiver 24 are selected such that the imaging system 10 is capable of detecting reflections from hard surfaces within the 360 degree field of view as well as from the volume scattering throughout the water column. As illustrated herein, the receiver 24 may be positioned within the first cavity 52 and the transmitter 22 may be positioned within the second cavity 62 for a bistatic configuration (i.e., the receiver 24 and the transmitter 22 are in different housings or different hull sections). This spatially separates the receiver 24 and the transmitter 22 and may reduce backscatter of the emitted light 20 from the common water volume formed between the transmitter 22 and the receiver 24 by the propagation of the emitted light 20 and the field of view of the receiver 24. Here common water volume refers to the volume of water that is shared between the emitted light 20 from the transmitter 22 and the field of view of the receiver 24. In addition, configurations with a multitude or plurality of both transmitters 22 and receivers 24 can be realized to reduce shadowing effects in a multistatic configuration. However, it is contemplated that the receiver 24 and the transmitter 22 may be positioned within a single cavity in a monostatic configuration without departing from the scope of the present disclosure. In other words, in various examples, the transmitter 22 and the receiver 24 may be housed in the same portion of the housing 12 (or the same hull section) (e.g., within the first cavity 52) for a monostatic configuration.
As shown in
In various examples, the at least one circular arrangement 70 of individual detectors 72 may include one or more rows of detectors 72 arranged about a cylindrical support 76. The support 76 may be one of a plurality of supports 76 centered about the axis of motion x of the system 10. For example, each row of detectors 72 may be positioned on a separate support 76. Alternatively, each row of detectors 72 may be positioned on a single support 76. Where more than one support 76 is used, it is contemplated that the supports 76 may be operated independently or concurrently without departing from the scope of the present disclosure.
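The circular arrangement 70 described above can be illustrated with a short geometric sketch. This is a hypothetical illustration only, assuming detectors spaced evenly around a ring of a given radius centered on the axis of motion (taken here as the x axis); the function and parameter names are not from the specification.

```python
import math

def detector_positions(n_detectors, support_radius_m, x_offset_m=0.0):
    """(x, y, z) centers of detectors spaced evenly around a cylindrical
    support centered on the axis of motion (the x axis). Names and
    parameters are illustrative, not from the specification."""
    positions = []
    for k in range(n_detectors):
        theta = 2.0 * math.pi * k / n_detectors  # angular slot of detector k
        positions.append((x_offset_m,
                          support_radius_m * math.cos(theta),
                          support_radius_m * math.sin(theta)))
    return positions
```

For example, eight detectors on a ring would sit 45 degrees apart, so a return arriving from any direction about the axis falls near at least one detector.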
Alternatively, as shown in
With continued reference to
As shown in
As shown in
In some configurations, where a need exists for electronic components on both sides of the transmitter, wireless transmission of power and data can be utilized (e.g., via RF or optical link(s), for example). These configurations eliminate the otherwise needed wires, which may shield portions of the 360 degree field of view.
Referring now to
In application, the imaging system 10 will acquire data in all directions and will work by generating a cylindrical 3D point cloud 26 along the path of the housing 12 (see
In various examples, as shown in
Distance or depth information is then determined via the time of flight of individual optical pulses of the emitted light 20 from the light source 18. The information is used to generate the cylindrical 3D point cloud along the path of movement of the housing 12. The movement may be caused by the forward movement of the carrying assembly 40 (e.g., the AUV or similar underwater platform or towfish). It is understood that any other method commonly used in LiDAR systems for determining the propagation distance may also be utilized.
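The time-of-flight ranging and point-cloud generation described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the system's implementation: the refractive index of seawater is taken as approximately 1.34, the platform is assumed to move along the x axis with the 360 degree scan angle measured about that axis, and all function and parameter names are illustrative.

```python
import math

# Assumed nominal constants, not taken from the specification.
C_VACUUM = 299_792_458.0  # speed of light in vacuum, m/s
N_WATER = 1.34            # approximate refractive index of seawater

def range_from_tof(round_trip_s, n=N_WATER):
    """One-way distance to a reflector from the round-trip travel time
    of a single optical pulse, using the speed of light in water."""
    return (C_VACUUM / n) * round_trip_s / 2.0

def point_from_return(x_along_track_m, scan_angle_rad, round_trip_s):
    """Place one return into the cylindrical 3D point cloud: the platform
    moves along the x axis, and the scan angle sweeps about that axis."""
    r = range_from_tof(round_trip_s)
    return (x_along_track_m,
            r * math.cos(scan_angle_rad),
            r * math.sin(scan_angle_rad))
```

Accumulating such points over successive rotations while the housing advances yields the cylindrical point cloud centered on the path of motion.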
Exemplary images of a scanned environment using the imaging system 10 are shown in
Referring now to
Depending on the separation between the transmitter 22 and the receiver 24, and the distance to the target (e.g., the target object 42, 48, the seafloor 44, etc.), the direction of the outgoing emitted light 20 and the field of the receiver 24 may be adjusted to maximize the photon collection efficiency. To overlap the field of view with the position at which the emitted light 20 intersects the target, the angle of the monogons 80, 84 can be adjusted by installing monogon mirrors with face angles other than 90° in the transmitter 22 as well as in the receiver 24 if the receiver 24 is a synchronized monogon 80.
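The face-angle adjustment described above can be approximated with a simple planar-geometry sketch, assuming the transmitter-receiver separation lies along the axis of motion and the target sits at a perpendicular range from that axis. Because a ray reflected off a mirror rotates by twice any mirror rotation, the face-angle offset is half the desired tilt of the look direction. The function and parameter names are illustrative, not from the specification.

```python
import math

def face_angle_offset_deg(separation_m, target_range_m):
    """Offset (in degrees) from the nominal monogon face angle needed so
    the receiver's look direction, nominally perpendicular to the axis,
    tilts toward the transmitter's illuminated spot.

    Simplified planar geometry: the look direction must rotate by
    atan(separation / range); the mirror face rotates by half that,
    since reflection doubles any mirror rotation.
    """
    ray_tilt = math.atan2(separation_m, target_range_m)
    return math.degrees(ray_tilt / 2.0)
```

Under this approximation, a 1 m separation at a 10 m target range calls for a face-angle offset of roughly 2.9° from the nominal face angle.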
The imaging system 10 adds to the capability of a LiDAR system for underwater vehicles by allowing 360 degree illumination and a 360 degree field of view. This is of particular interest in the three-dimensional underwater environment where objects of interest can be found in any direction. The imaging system 10 is well suited to collect bathymetric data in shallow water where pressure changes due to wave action can easily skew the depth sensor of an AUV. As previously discussed with reference to
Furthermore, the imaging system 10 may be used to navigate in and inspect harbor environments where target objects 42, 48 can be at the bottom, on the surface, or suspended in mid-water. Similarly, the imaging system 10 can be used to inspect oil rigs, which again require an omnidirectional imaging system in order to capture the entire structure. Operating in deep water, the imaging system 10 can investigate thin layers of suspended particles that often form in the ocean, whether these layers are above or below the housing 12. Other examples of applications for the imaging system 10 include the mapping of shallow water regions with or without ice cover. In these cases, the imaging system 10 can measure the wave height or the ice/water interface with respect to the seafloor using the same transmitter 22 and receiver 24.
In other words, the imaging system 10 provides a host platform with quantitative awareness of its surrounding environment by measuring the time-of-flight (TOF) of reflections from hard surfaces within its 360° field of view, as well as measuring the time-resolved backscattering throughout the water column. The imaging system 10 has several unique applications for which traditional LiDAR sensors with a finite field of view are not well suited.
This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined by the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they include structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal languages of the claims.
This application claims priority to U.S. Application No. 63/172,481 to Gero Nootz et al. filed on Apr. 8, 2021, the contents of which are incorporated herein by reference in their entirety.
This invention was made with government support under the U.S. Office of Naval Research (ONR) Grant/Contract No. N00014-18-9-001. The government has certain rights in the invention.