Embodiments of the present invention relate generally to a sonar transducer system configured to provide live sonar images with an expanded coverage volume.
Sonar transducer assemblies have been provided that offer sonar coverage for volumes underneath a watercraft. However, to date, the angle of sonar coverage has been limited for these sonar transducer assemblies. Sonar transducer assemblies often provide only a limited sonar coverage angle of 135 degrees or less. Where the sonar coverage angle is 135 degrees, volumes are left on both sides of the sonar coverage that are not covered, with these uncovered volumes totaling 45 degrees or more. For example, where the facing direction of a sonar transducer assembly is directed straight down, uncovered volumes may be left on both sides, each having an angle of 22.5 degrees. Alternatively, the facing direction may be altered so that the edge of the sonar coverage generally aligns with the water surface, but this would still result in an angle of 45 degrees of uncovered volume at the rear of the coverage.
In various example embodiments provided herein, sonar systems are provided having expanded coverage volumes. Multiple sonar transducer assemblies may be used in conjunction with each other, and the sonar systems may be configured to control the facing directions of the sonar transducer assemblies so that an expanded coverage volume may be obtained. While a single sonar transducer assembly is often limited to coverage angles of 135 degrees or less, the use of multiple sonar transducer assemblies may permit expanded coverage angles of 140 degrees or more, 150 degrees or more, or even 180 degrees or more. Sonar transducer assemblies may be oriented so that they possess facing directions that are at least partially in the downward direction. In some embodiments, the sonar transducer assemblies may provide continuous side-to-side downward coverage or continuous front-to-back downward coverage relative to a watercraft. The sonar transducer assemblies may be frequency steered in some cases. Sonar return data from the sonar transducer assemblies may be utilized to form live sonar images, and the live sonar images may be two-dimensional sonar images.
In an example embodiment, a sonar system for generating live sonar images having an expanded coverage angle is provided. The sonar system comprises two or more sonar transducer assemblies including a first sonar transducer assembly that has a first plurality of sonar transducer elements. The first sonar transducer assembly defines a first facing direction. The first plurality of sonar transducer elements are configured to transmit one or more first sonar beams into an underwater environment to form a first coverage volume within the underwater environment. A second sonar transducer assembly has a second plurality of sonar transducer elements. The second sonar transducer assembly defines a second facing direction. The second plurality of sonar transducer elements are configured to transmit one or more second sonar beams into the underwater environment to form a second coverage volume within the underwater environment. The sonar system further includes one or more alignment features, wherein the one or more alignment features are configured to position the first sonar transducer assembly and the second sonar transducer assembly so that the first facing direction and the second facing direction are different and relative to each other so as to create continuous coverage of the underwater environment. The continuous coverage has an overall coverage volume that is greater than either of the first coverage volume or the second coverage volume individually. 
The sonar system further includes at least one processor and a memory including computer program code configured to, when executed, cause the at least one processor to receive first sonar return data from the first plurality of sonar transducer elements; receive second sonar return data from the second plurality of sonar transducer elements; receive first facing direction data regarding the first facing direction of the first sonar transducer assembly; receive second facing direction data regarding the second facing direction of the second sonar transducer assembly; position the first sonar return data based on the first facing direction data to form positioned first sonar return data; position the second sonar return data based on the second facing direction data to form positioned second sonar return data; and generate a live sonar image of the underwater environment using the positioned first sonar return data and the positioned second sonar return data. The first facing direction and the second facing direction are generally outward and downward of a watercraft such that the live sonar image is representative of the underwater environment underneath the watercraft.
In some embodiments, an overlap volume is defined as a volume where the first coverage volume and the second coverage volume overlap. In some embodiments, sonar return data from only one of the first sonar transducer assembly or the second sonar transducer assembly is used in the overlap volume. In some embodiments, sonar return data from both the first sonar transducer assembly and the second sonar transducer assembly is used in the overlap volume.
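By way of illustration only, the two overlap-handling options described above can be sketched as follows. The function name and the data layout (per-cell intensities on a common grid, with None marking cells outside an assembly's coverage) are hypothetical and are not part of any claimed embodiment:

```python
def combine_overlap(first_returns, second_returns, blend=False):
    """Combine per-cell sonar return intensities from two coverage volumes.

    Each argument is a list of intensity values sampled on a common spatial
    grid, with None where that assembly provides no coverage.  In the overlap
    volume (where both assemblies have data), either only the first
    assembly's value is used (blend=False) or the two values are averaged
    (blend=True), mirroring the two embodiments described above.
    """
    combined = []
    for a, b in zip(first_returns, second_returns):
        if a is not None and b is not None:
            combined.append((a + b) / 2 if blend else a)
        else:
            combined.append(a if a is not None else b)
    return combined
```

For example, with `first_returns = [1.0, None, 3.0]` and `second_returns = [5.0, 2.0, None]`, only the first cell lies in the overlap volume, and the two modes differ only there.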
In some embodiments, the first sonar transducer assembly includes a first sensor configured to obtain the first facing direction data regarding the first facing direction of the first sonar transducer assembly relative to the watercraft. The second sonar transducer assembly includes a second sensor configured to obtain the second facing direction data regarding the second facing direction of the second sonar transducer assembly relative to the watercraft. The computer program code is configured to, when executed, cause the at least one processor to: receive the first facing direction data from the first sensor; and receive the second facing direction data from the second sensor.
In some embodiments, the sonar system further comprises at least one bracket, wherein the at least one bracket is configured to position the first sonar transducer assembly and the second sonar transducer assembly, and wherein the one or more alignment features are provided on the at least one bracket. In some embodiments, the at least one bracket includes a first bracket and a second bracket, wherein the first bracket is configured to position the first sonar transducer assembly, and wherein the second bracket is configured to position the second sonar transducer assembly. In some embodiments, the sonar system further comprises a first clamp, wherein the at least one bracket is configured to be attached to the first clamp, and wherein the at least one bracket is configured to be rotated relative to the first clamp when attached to the first clamp. In some embodiments, the sonar system further comprises a second clamp, wherein the at least one bracket includes a first bracket and a second bracket, wherein the first bracket is configured to be attached to the first clamp, and wherein the second bracket is configured to be attached to the second clamp.
In some embodiments, the overall coverage volume includes a downward coverage angle, wherein the downward coverage angle defines an angle that is at least 140 degrees.
In some embodiments, the overall coverage volume includes a downward coverage angle, wherein the downward coverage angle defines an angle that is at least 150 degrees.
In some embodiments, the overall coverage volume includes a downward coverage angle, wherein the downward coverage angle defines an angle that is at least 180 degrees. In some embodiments, the downward coverage angle provides at least 180 degrees of continuous side-to-side downward coverage. In some embodiments, the downward coverage angle provides at least 180 degrees of continuous front-to-back downward coverage.
In some embodiments, the two or more sonar transducer assemblies are rotatably mounted with respect to the watercraft such that the overall coverage angle rotates with respect to the watercraft.
In some embodiments, the sonar system further comprises a display, wherein the memory including computer program code is configured to, when executed, cause the at least one processor to: cause the first plurality of sonar transducer elements and the second plurality of sonar transducer elements to transmit the one or more sonar beams into the underwater environment; and cause, on the display, presentation of the live sonar image. The live sonar image is a two-dimensional live sonar image that is formed of the first sonar return data and second sonar return data. The first sonar return data and the second sonar return data used to form the live sonar image were received at substantially a same time by the first plurality of sonar transducer elements and the second plurality of sonar transducer elements.
In some embodiments, the first facing direction is at least partially in a forward direction relative to the watercraft, wherein the second facing direction is at least partially in a backward direction relative to the watercraft.
In another example embodiment, a sonar system for generating live sonar images having an expanded coverage angle is provided. The sonar system comprises a first sonar transducer assembly having at least one array of a first plurality of sonar transducer elements, wherein the first sonar transducer assembly defines a first facing direction. The sonar system further includes one or more alignment features, wherein the one or more alignment features are configured to position the first sonar transducer assembly to obtain the first facing direction. The sonar system further includes at least one processor and a memory including computer program code configured to, when executed, cause the at least one processor to: receive first sonar return data from the first plurality of sonar transducer elements; receive first facing direction data regarding the first facing direction of the first sonar transducer assembly; and generate a live sonar image of the underwater environment based on the first sonar return data and the first facing direction data. The first plurality of sonar transducer elements are associated with a watercraft on a body of water, wherein the first facing direction is generally outward of the watercraft and is at least partially in a downward direction relative to the watercraft. The first plurality of sonar transducer elements are configured to transmit one or more sonar beams into an underwater environment. The live sonar image provides a continuous front-to-back downward image.
In some embodiments, the continuous front-to-back downward image covers a downward coverage angle that is at least 180 degrees.
In yet another example embodiment, a method of generating a live sonar image of an underwater environment having an expanded coverage volume is provided. The method includes receiving first sonar return data from a first plurality of sonar transducer elements of a first sonar transducer assembly; receiving second sonar return data from a second plurality of sonar transducer elements of a second sonar transducer assembly; receiving first facing direction data regarding a first facing direction of the first sonar transducer assembly; receiving second facing direction data regarding a second facing direction of the second sonar transducer assembly; positioning the first sonar return data based on the first facing direction data to form positioned first sonar return data; positioning the second sonar return data based on the second facing direction data to form positioned second sonar return data; and generating a live sonar image of the underwater environment using the positioned first sonar return data and the positioned second sonar return data.
In some embodiments, the method further comprises causing, on a display, presentation of the live sonar image, wherein the live sonar image provides a continuous front-to-back downward image.
Additional example embodiments of the present invention include methods, systems, apparatuses, and computer program products associated with various embodiments described herein.
Having thus described the invention in general terms, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:
Example embodiments of the present invention now will be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all embodiments of the invention are shown. Indeed, the invention may be embodied in many different forms and should not be construed as limited to the example embodiments set forth herein; rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements. Additionally, any connections or attachments may be direct or indirect connections or attachments unless specifically noted otherwise.
In this regard, the sonar transducer may be formed of one or more active elements (e.g., piezoelectric crystals). Wires are soldered to coatings on the active element and can be attached to a cable which transfers the electrical energy from a transmitter to the active element. As an example, when the frequency of the electrical signal is the same as the mechanical resonant frequency of the active element, the active element moves, creating sound waves at that frequency. The shape of the active element determines both its resonant frequency and the shape of the sonar beam. Further, padding can be used to prevent sonar emissions from certain faces of the active element (e.g., the top and sides), leaving exposed only the emitting faces for which the sonar beam is desired. Frequencies used by sonar devices vary, and some sonar transducers may produce sonar beams at multiple different frequencies. Some example sonar transducers utilize a frequency range from 50 kHz to over 900 kHz depending on application. Some sonar systems vary the frequency within each sonar pulse using “chirp” technology.
Depending on the configuration, the watercraft 100 may include a primary motor 105, which may be a main propulsion motor such as an outboard or inboard motor. Additionally, the watercraft 100 may include a trolling motor 108 configured to propel the watercraft 100 or maintain a position. The one or more sonar transducer assemblies (e.g., 102a, 102b, and/or 102c) may be mounted in various positions and to various portions of the watercraft 100 and/or equipment associated with the watercraft 100. For example, the sonar transducer assembly may be mounted to the transom 106 of the watercraft 100, such as depicted by sonar transducer assembly 102a. The sonar transducer assembly may be mounted to the bottom or side of the hull 104 of the watercraft 100, such as depicted by sonar transducer assembly 102b. The sonar transducer assembly may be mounted to the trolling motor 108, such as depicted by sonar transducer assembly 102c. Other mounting configurations are contemplated also, such as may enable rotation of the sonar transducer assembly (e.g., mechanical and/or manual rotation, such as on a rod or other mounting connection). In this regard, in some embodiments, the sonar transducer assembly 102c may rotate with the trolling motor shaft. In some embodiments, the sonar transducer assemblies 102a, 102b, 102c may be mounted to one or more steering systems to steer (e.g., rotate) the sonar transducer assemblies relative to the watercraft 100.
The watercraft 100 may also include one or more marine electronic devices 160, such as may be utilized by a user to interact with, view, or otherwise control various functionality regarding the watercraft, including, for example, nautical charts and various sonar systems described herein. In the illustrated embodiment, the marine electronic device 160 is positioned proximate the helm (e.g., steering wheel) of the watercraft 100—although other places on the watercraft 100 are contemplated. Likewise, additionally or alternatively, a remote device (such as a user's mobile device) may include functionality of a marine electronic device.
The watercraft 100 may also comprise other components within the one or more marine electronic devices 160 or at the helm. In
Some example embodiments of the present invention utilize sonar transducer assemblies that provide for generating near real-time (e.g., “live”) sonar imagery. In this regard, in some embodiments, the entire sonar image is continuously updated all at once (e.g., as opposed to building up historical slices of sonar data as is typical of conventional downscan or sidescan sonar images). The example sonar transducer assembly described with respect to
In the illustrated embodiment shown in
In some embodiments, the array 220 of transducer elements 208 is configured to operate to transmit one or more sonar beams into the underwater environment. Depending on the configuration and desired operation, different transmission types of sonar beams can occur. For example, in some embodiments, the array 220 may transmit sonar beams according to a frequency sweep (e.g., chirp sonar) so as to provide sonar beams into the underwater environment. In some embodiments, the array 220 may be operated to frequency steer transmitted sonar beams into various volumes of the underwater environment. In some embodiments, the array 220 may be operated to cause a broadband transmit sonar beam to be sent into the underwater environment. Depending on the frequency used and phase shift applied between transducer elements, different volumes of the underwater environment may be targeted.
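The dependence of the targeted volume on frequency and inter-element phase shift can be illustrated with the textbook phased-array relation for a uniform linear array, phase_shift = (2π · spacing / wavelength) · sin(theta). The sketch below is illustrative only; the spacing and phase-shift values, the function name, and the nominal sound speed are assumptions, not parameters of any claimed embodiment:

```python
import math

SPEED_OF_SOUND_WATER = 1500.0  # m/s, a nominal value for illustration


def steer_angle_deg(freq_hz, element_spacing_m, phase_shift_rad):
    """Angle (degrees from the array's facing direction) at which a uniform
    linear array with a fixed inter-element phase shift directs its beam.

    Solving phase_shift = (2 * pi * d / wavelength) * sin(theta) for theta
    shows that, with the phase shift held fixed, changing the operating
    frequency changes the steering angle -- the basis of frequency steering.
    """
    wavelength = SPEED_OF_SOUND_WATER / freq_hz
    s = phase_shift_rad * wavelength / (2 * math.pi * element_spacing_m)
    if abs(s) > 1:
        raise ValueError("no real steering angle at this frequency")
    return math.degrees(math.asin(s))
```

For example, with an assumed 5 mm element spacing and a fixed phase shift of π/2 radians, the beam steers to 30 degrees at 150 kHz and to a shallower angle at 300 kHz, so sweeping the frequency sweeps the beam through a range of angles.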
In some embodiments, the array 220 may be configured to receive sonar return signals. The way the sonar return signals are received and/or processed may vary depending on the desired sonar system configuration.
With further reference to
Without being bound by theory, a simplified explanation of this can be based on considering a single beam shape that is formed by a receipt event of the array. The beam shape is formed of a rather wide main beam lobe, along with at least one relatively small defined side lobe (e.g., the beam 280) that extends outwardly therefrom. By operating at a fixed phase shift and ignoring the main beam lobe, the sonar return signals received within the side lobe may be determined. Further, changing the frequency causes a shifting of the direction of the side lobe among the range of angles (281 or 282). Since the side lobe is symmetrical about the main lobe, there are two ranges of angles that are symmetrical about the facing direction DFD of the emitting face 221 of the array 220.
Further information regarding beamforming, including frequency steered beamforming, can be found, for example, in the following: U.S. Pat. No. RE45,379, entitled “Frequency Division Beamforming for Sonar Arrays”; U.S. Pat. No. 10,114,119, entitled “Sonar Systems using Interferometry and/or Beamforming for 3D Imaging”; U.S. Pat. No. 9,739,884, entitled “Systems and Associated Methods for Producing a 3D Sonar Image”; and U.S. patent application Ser. No. 16/382,639, published as U.S. Publication No. 2019/0265354, and entitled “Sonar Transducer Having Geometric Elements”; the contents of each hereby being incorporated by reference in their entireties.
Depending on various factors, different beam shapes can be achieved and different ranges of angles can be achieved. The following describes some example factors that can be varied to affect the beam shapes and the ranges of angles: the number of transducer elements, the size/shape of the transducer elements, the size/shape of the array, the fixed phase shift, and the frequency range, among other things. An example embodiment produces a first range of angles spanning ~22.5 degrees and a second range of angles spanning ~22.5 degrees with a gap of range of angles of ~45 degrees therebetween. Additionally, sonar return beams of ~0.25 degrees to ~2 degrees are formed. Further, with reference to
In some embodiments, the system may be configured to utilize more than one array, where the arrays are oriented relative to each other to increase coverage volume of the underwater environment. For example, in some embodiments, a second (or more) array(s) can be added and tilted relative to the first array such that the gap within the first array is “covered” by one or more of the range of angles of sonar return beams from such array(s).
Though shown mounted in
Looking first at
In the illustrated embodiment, the first sonar transducer assembly 902A provided at position 102B′ possesses a first facing direction 775A, and this first facing direction 775A is in a downward direction and towards the back of the watercraft 100. The continuous sonar coverage 705A generated by the first sonar transducer assembly 902A may define a coverage angle (A1). This coverage angle (A1) may possess a variety of values. For example, the coverage angle (A1) may be at least 90 degrees, at least 120 degrees, or at least 135 degrees. In the illustrated embodiment, the coverage angle (A1) is 135 degrees.
Looking now at
In the illustrated embodiment of
Looking now at
In some embodiments, the overall coverage angle (AT) of the first sonar transducer assembly and the second sonar transducer assembly may be at least 140 degrees. Additionally, in some embodiments, the overall coverage angle (AT) of the first sonar transducer assembly and the second sonar transducer assembly may be at least 150 degrees. Furthermore, the overall coverage angle (AT) of the first sonar transducer assembly and the second sonar transducer assembly may be at least 180 degrees in some embodiments. In the illustrated embodiment in
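The geometry of combining two coverage fans into an overall coverage angle can be sketched as follows. This is an illustrative calculation, with angles measured in a common vertical plane (e.g., degrees from straight down) and the fans centered on the facing directions; the function name is hypothetical:

```python
def overall_coverage_angle(facing_1_deg, coverage_1_deg, facing_2_deg, coverage_2_deg):
    """Overall downward coverage angle of two assemblies whose coverage fans
    are centered on their facing directions.

    Returns the total continuous coverage angle if the two fans touch or
    overlap; raises ValueError if a gap remains between them (i.e., the
    coverage is not continuous).
    """
    lo1, hi1 = facing_1_deg - coverage_1_deg / 2, facing_1_deg + coverage_1_deg / 2
    lo2, hi2 = facing_2_deg - coverage_2_deg / 2, facing_2_deg + coverage_2_deg / 2
    if max(lo1, lo2) > min(hi1, hi2):
        raise ValueError("coverage fans do not form continuous coverage")
    return max(hi1, hi2) - min(lo1, lo2)
```

For instance, two 135-degree fans whose facing directions are tilted 22.5 degrees to either side of straight down together span 180 degrees of continuous downward coverage.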
As illustrated in
Looking now at
Turning now to
Looking now at
In some embodiments, the overall coverage angle (AT′) of the first sonar transducer assembly 902A and the second sonar transducer assembly 902B may be at least 140 degrees. Additionally, in some embodiments, the overall coverage angle (AT′) of the first sonar transducer assembly 902A and the second sonar transducer assembly 902B may be at least 150 degrees. Furthermore, the overall coverage angle (AT′) of the first sonar transducer assembly 902A and the second sonar transducer assembly 902B may be at least 180 degrees in some embodiments. In the illustrated embodiment in
As illustrated in
Sonar transducer assemblies may be mounted in a variety of ways.
Multiple sonar transducer assemblies may be used in some embodiments to obtain the desired increased sonar coverage volume.
As illustrated in
Various alignment features may be utilized to assist in positioning the sonar transducer assemblies to adjust the facing direction of the sonar transducer assemblies.
Looking at
Further alignment features are illustrated in
Looking first at
Turning now to
First sonar return data and first facing direction data may be obtained for a first sonar transducer assembly (e.g., via a database, user input, one or more sensors, etc.), and second sonar return data and second facing direction data may be obtained for a second sonar transducer assembly (e.g., via a database, user input, one or more sensors, etc.). The first sonar return data may be positioned based on the first facing direction data to form positioned first sonar return data, and the second sonar return data may be positioned based on the second facing direction data to form positioned second sonar return data. By forming positioned first sonar return data and positioned second sonar return data, the first sonar return data and second sonar return data may be positioned appropriately relative to each other and/or the watercraft. One or more sonar images may then be generated of the underwater environment using the positioned first sonar return data and the positioned second sonar return data. The sonar images may be live sonar images, and, in some embodiments, the images may be two-dimensional or three-dimensional live sonar images. Where live sonar images are being created, sonar return data being used to form the live sonar images may be received at substantially the same time by sonar transducer elements in the first sonar transducer assembly and the second sonar transducer assembly.
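The positioning and image-generation flow described above can be sketched as a simple coordinate transform: each return sample, expressed as a range and a beam angle relative to its assembly's facing direction, is rotated by that facing direction into a shared watercraft-relative frame. The sample layout, angle conventions, and function names below are illustrative assumptions, not the claimed implementation:

```python
import math


def position_returns(returns, facing_direction_deg):
    """Place sonar return samples into a common watercraft-relative frame.

    Each sample is (range_m, beam_angle_deg, intensity), where the beam
    angle is measured relative to the assembly's facing direction.  Adding
    the facing direction rotates the sample into the shared frame; results
    are (horizontal_m, depth_m, intensity), with depth positive downward
    and 0 degrees meaning straight down.
    """
    positioned = []
    for range_m, beam_angle_deg, intensity in returns:
        angle = math.radians(beam_angle_deg + facing_direction_deg)
        positioned.append((range_m * math.sin(angle), range_m * math.cos(angle), intensity))
    return positioned


def generate_live_image(first_returns, facing_1_deg, second_returns, facing_2_deg):
    """Combine positioned returns from both assemblies into one point set
    from which a single all-at-once (live) 2D sonar image can be drawn."""
    return (position_returns(first_returns, facing_1_deg)
            + position_returns(second_returns, facing_2_deg))
```

With facing directions of -45 and +45 degrees, identical returns from the two assemblies land symmetrically about the straight-down axis, yielding the expanded side-by-side coverage in one image.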
With reference to
Sonar images may be presented in different ways.
The marine electronic device 1260 may include at least one processor 1268, a memory 1272, a communication interface 1282, a user interface 1276, a display 1274, autopilot 1270, and one or more sensors (e.g. position sensor 1278, direction sensor 1220, other sensors 1280). One or more of the components of the marine electronic device 1260 may be located within a housing or could be separated into multiple different housings (e.g., be remotely located).
The processor(s) 1268 may be any means configured to execute various programmed operations or instructions stored in a memory device (e.g., memory 1272) such as a device or circuitry operating in accordance with software or otherwise embodied in hardware or a combination of hardware and software (e.g. a processor operating under software control or the processor embodied as an application specific integrated circuit (ASIC) or field programmable gate array (FPGA) specifically configured to perform the operations described herein, or a combination thereof) thereby configuring the device or circuitry to perform the corresponding functions of the processor(s) 1268 as described herein. For example, the processor(s) 1268 may be configured to analyze sonar return data for various features/functions described herein (e.g., generate a sonar image, determine an object and/or object position, etc.).
In some embodiments, the processor(s) 1268 may be further configured to implement signal processing. In some embodiments, the processor(s) 1268 may be configured to perform enhancement features to improve the display characteristics of data or images, collect or process additional data, such as time, temperature, GPS information, waypoint designations, or others, or may filter extraneous data to better analyze the collected data. The processor(s) 1268 may further implement notices and alarms, such as those determined or adjusted by a user, to reflect proximity of other objects (e.g., represented in sonar data), to reflect proximity of other vehicles (e.g. watercraft), approaching storms, etc.
In an example embodiment, the memory 1272 may include one or more non-transitory storage or memory devices such as, for example, volatile and/or non-volatile memory that may be either fixed or removable. The memory 1272 may be configured to store instructions, computer program code, sonar data, and additional data such as radar data, chart data, location/position data in a non-transitory computer readable medium for use, such as by the processor(s) 1268 for enabling the marine electronic device 1260 to carry out various functions in accordance with example embodiments of the present invention. For example, the memory 1272 could be configured to buffer input data for processing by the processor(s) 1268. Additionally or alternatively, the memory 1272 could be configured to store instructions for execution by the processor(s) 1268.
The communication interface 1282 may be configured to enable communication to external systems (e.g., an external network 1284). In this manner, the marine electronic device 1260 may retrieve stored data from a remote device 1286 via the external network 1284 in addition to or as an alternative to the onboard memory 1272. Additionally or alternatively, the marine electronic device 1260 may transmit or receive data, such as sonar signal data, sonar return data, sonar image data, or the like to or from a sonar transducer assembly 1202A. In some embodiments, the marine electronic device 1260 may also be configured to communicate with other devices or systems (such as through the external network 1284 or through other communication networks, such as described herein). For example, the marine electronic device 1260 may communicate with a propulsion system of the watercraft 100 (e.g., for autopilot control); a remote device (e.g., a user's mobile device, a handheld remote, etc.); or another system. Using the external network 1284, the marine electronic device may send data to and receive data from external sources, such as a cloud, server, etc. For example, the system may receive weather data, data from other fish locator applications, or alert data, among others. However, this data is not required to be communicated using the external network 1284, and the data may instead be communicated using other approaches, such as through a physical or wireless connection via the communications interface 1282.
The communications interface 1282 of the marine electronic device 1260 may also include one or more communications modules configured to communicate with one another in any of a number of different manners including, for example, via a network. In this regard, the communications interface 1282 may include any of a number of different communication backbones or frameworks including, for example, Ethernet, the NMEA 2000 framework, GPS, cellular, Wi-Fi, or other suitable networks. The network may also support other data sources, including GPS, autopilot, engine data, compass, radar, etc. In this regard, numerous other peripheral devices (including other marine electronic devices or sonar transducer assemblies) may be included in the system 1264.
The position sensor 1278 may be configured to determine the current position and/or location of the marine electronic device 1260 (and/or the watercraft 100). For example, the position sensor 1278 may comprise a GPS, bottom contour, inertial navigation system, such as a micro-electro-mechanical system (MEMS) sensor, a ring laser gyroscope, or other location detection system. Alternatively or in addition to determining the location of the marine electronic device 1260 or the watercraft 100, the position sensor 1278 may also be configured to determine the position and/or orientation of an object outside of the watercraft 100.
The display 1274 (e.g. one or more screens) may be configured to present images and may include or otherwise be in communication with a user interface 1276 configured to receive input from a user. The display 1274 may be, for example, a conventional LCD (liquid crystal display), a touch screen display, mobile device, or any other suitable display known in the art upon which images may be displayed.
In some embodiments, the display 1274 may present one or more sets of data (or images generated from the one or more sets of data). Such data includes chart data, radar data, sonar data, weather data, location data, position data, orientation data, sonar data, or any other type of information relevant to the watercraft. Sonar data may be received from one or more sonar transducer assemblies 1202 or from sonar devices positioned at other locations, such as remote from the watercraft. Additional data may be received from marine devices such as a radar 1216, a primary motor 1205 or an associated sensor, a trolling motor 1208 or an associated sensor, an autopilot 1270, a rudder 1210 or an associated sensor, a position sensor 1278, a direction sensor 1220, other sensors 1280, a remote device 1286, onboard memory 1272 (e.g., stored chart data, historical data, etc.), or other devices.
In some further embodiments, various sets of data, referred to above, may be superimposed or overlaid onto one another. For example, a route may be applied to (or overlaid onto) a chart (e.g. a map or navigational chart). Additionally or alternatively, depth information, weather information, radar information, sonar information, or any other navigation system inputs may be applied to one another.
The user interface 1276 may include, for example, a keyboard, keypad, function keys, mouse, scrolling device, input/output ports, touch screen, or any other mechanism by which a user may interface with the system.
Although the display 1274 of
The marine electronic device 1260 may include one or more other sensors/devices 1280, such as configured to measure or sense various other conditions. The other sensors/devices 1280 may include, for example, an air temperature sensor, a water temperature sensor, a current sensor, a light sensor, a wind sensor, a speed sensor, or the like.
The sonar transducer assemblies 1202 illustrated in
The sonar transducer assemblies 1202 may also include one or more other systems, such as various sensor(s) 1266. For example, the sonar transducer assembly 1202A may include an orientation sensor, such as a gyroscope or other orientation sensor (e.g., accelerometer, MEMS, direction, etc.) that can be configured to determine the relative orientation and/or direction of the sonar transducer assembly 1202A and/or the one or more sonar transducer array(s) and/or element(s) 1267, such as with respect to the watercraft. In some embodiments, additionally or alternatively, other types of sensor(s) are contemplated, such as, for example, a water temperature sensor, a current sensor, a light sensor, a wind sensor, a speed sensor, or the like.
The components presented in
Methods for generating a live sonar image of an underwater environment having an expanded coverage volume are also contemplated.
First sonar return data may be received at operation 1304, and second sonar return data may be received at operation 1306. First sonar return data may be received from a first sonar transducer assembly, and this first sonar transducer assembly may include one or more arrays with one or more sonar transducer elements in each of the arrays. Second sonar return data may be received from a second sonar transducer assembly, and this second sonar transducer assembly may include one or more arrays with one or more sonar transducer elements in each of the arrays.
At operation 1308, first facing direction data may be received, and second facing direction data may be received at operation 1310. First facing direction data and second facing direction data may be received in a variety of ways. For example, the facing direction data may be received from sensors provided in a first sonar transducer assembly and a second sonar transducer assembly; the facing direction data may be manually input by an installer; or default facing direction data may be applied automatically, with it being assumed that the installer has mounted the sonar transducer assemblies in the appropriate orientation. Where sensors are utilized, the sensors may be configured to detect the orientation of the relevant sonar transducer assembly. The facing direction data may also be provided in other ways.
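The fallback ordering described above can be sketched as follows. This is an illustrative sketch only; the function and parameter names are assumptions, not an API prescribed by the embodiments, and angles are expressed in radians measured from straight down.

```python
import math

# Hypothetical sketch: resolve a transducer assembly's facing direction
# from, in order of preference, an orientation sensor reading, an
# installer-entered value, or a default (assembly assumed to be mounted
# in the appropriate orientation). All names are illustrative.
def resolve_facing_direction(sensor_reading=None, installer_input=None,
                             default=0.0):
    """Return the facing direction to use when positioning sonar returns."""
    if sensor_reading is not None:    # sensor provided in the assembly
        return sensor_reading
    if installer_input is not None:   # manually input by an installer
        return installer_input
    return default                    # default facing direction data

# A sensor reading, when available, takes precedence over a manual entry.
angle = resolve_facing_direction(sensor_reading=math.radians(22.5),
                                 installer_input=math.radians(20.0))
```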
At operation 1312, the first sonar return data may be positioned based on the first facing direction data, and the second sonar return data may be positioned based on the second facing direction data at operation 1314. Positioning of the first sonar return data may form positioned first sonar return data, and positioning of the second sonar return data may form positioned second sonar return data.
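As a minimal sketch of the positioning operations, a single sonar return given as a (range, beam angle) pair in the transducer's own frame may be rotated through the assembly's facing direction into a common watercraft frame. The names and coordinate conventions below (angles in radians from straight down, x horizontal, z depth) are assumptions for illustration only.

```python
import math

# Illustrative sketch: position one sonar return based on facing direction
# data, producing a coordinate in a common watercraft frame.
def position_return(range_m, beam_angle, facing_direction):
    total = beam_angle + facing_direction  # angle in the watercraft frame
    x = range_m * math.sin(total)          # horizontal offset from assembly
    z = range_m * math.cos(total)          # depth below the assembly
    return (x, z)

# A return along the beam axis of an assembly facing 22.5 degrees off
# vertical lands 22.5 degrees off vertical in the watercraft frame.
x, z = position_return(10.0, 0.0, math.radians(22.5))
```

Applying the same rotation with each assembly's own facing direction yields positioned first sonar return data and positioned second sonar return data in a shared frame.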
At operation 1316, at least one live sonar image is generated using the positioned first sonar return data and the positioned second sonar return data. In some embodiments, two different live sonar images are created, with a first live sonar image being formed using the positioned first sonar return data and with the second live sonar image being formed using the positioned second sonar return data. However, in other embodiments, a single live sonar image is created based on the positioned first sonar return data and the positioned second sonar return data.
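The single-image case can be sketched by rasterizing positioned returns from both assemblies into one grid. The grid dimensions, cell size, and return format (x, z, strength) below are hypothetical choices for illustration, not parameters taken from the embodiments.

```python
# Illustrative sketch: generate a single live sonar image from positioned
# first and second sonar return data. Names and parameters are assumptions.
def rasterize(positioned_returns, width=200, height=100, cell_m=0.5):
    # image[row][col] accumulates return strength; column width//2 is
    # directly beneath the watercraft, row 0 is at the surface.
    image = [[0.0] * width for _ in range(height)]
    for (x, z, strength) in positioned_returns:
        col = int(x / cell_m) + width // 2
        row = int(z / cell_m)
        if 0 <= col < width and 0 <= row < height:
            image[row][col] += strength
    return image

first = [(-3.0, 10.0, 1.0)]   # positioned first sonar return data
second = [(4.0, 12.0, 0.8)]   # positioned second sonar return data
combined = rasterize(first + second)  # one image from both assemblies
```

In the two-image alternative, `rasterize` would simply be applied to each set of positioned returns separately, producing a first and a second live sonar image.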
At operation 1318, presentation of the live sonar image(s) may be caused on a display. Where multiple live sonar images have been created, the live sonar images may be presented in a split-screen view with the live sonar images positioned adjacent to each other so that the represented sonar coverage in the images extends continuously between the images.
The method 1300 is only one example method for forming a live sonar image, and the method 1300 may be modified in other embodiments. Certain operations may be omitted from the method 1300. For example, operations 1302 and/or 1318 may be omitted in some embodiments. Furthermore, certain operations may be added to method 1300. For example, additional sonar return data may be received where three or more sonar transducer assemblies are utilized. Certain operations of the method may be performed simultaneously (e.g. operations 1306 and 1308 may occur simultaneously), and the order of operations may be changed (e.g. operations 1308 and 1310 may occur before operations 1304 and 1306).
Many modifications and other embodiments of the inventions set forth herein will come to mind to one skilled in the art to which these inventions pertain having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the embodiments of the invention are not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the invention. Moreover, although the foregoing descriptions and the associated drawings describe example embodiments in the context of certain example combinations of elements and/or functions, it should be appreciated that different combinations of elements and/or functions may be provided by alternative embodiments without departing from the scope of the invention. In this regard, for example, different combinations of elements and/or functions than those explicitly described above are also contemplated within the scope of the invention. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.