Embodiments of the present invention relate generally to systems with sonar transducer assemblies, to the presentation of marine data, and to providing improved live sonar imagery.
Sonar (SOund Navigation And Ranging) has long been used to detect waterborne or underwater objects. For example, sonar devices may be used to determine depth and bottom topography, detect fish, locate wreckage, etc. In this regard, due to the extreme limits on visibility underwater, sonar is typically the most accurate way to locate objects underwater. Sonar transducer elements, or simply transducers, may convert electrical energy into sound or vibrations at a particular frequency. A sonar sound beam is transmitted into and through the water and is reflected from objects it encounters (e.g., fish, structure, bottom surface of the water, etc.). The transducer may receive the reflected sound (the “sonar returns”) and convert the sound energy into electrical energy. Based on the known speed of sound, it is possible to determine the distance to and/or location of the waterborne or underwater objects.
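By way of a non-limiting illustration, the distance computation described above may be sketched as follows. The sketch assumes a nominal sound speed of 1,500 m/s in water (actual speed varies with temperature, salinity, and depth), and the function name is illustrative only:

```python
# Sketch of the two-way travel-time range calculation described above.
# The 1500 m/s sound speed is a nominal assumption; real systems may
# adjust for temperature, salinity, and depth.

SPEED_OF_SOUND_WATER_M_S = 1500.0  # nominal speed of sound in water

def range_from_echo(travel_time_s: float) -> float:
    """Return the one-way distance to a target given the round-trip
    travel time of a sonar pulse (transmit to received echo)."""
    # The pulse travels out to the target and back, so halve the product.
    return SPEED_OF_SOUND_WATER_M_S * travel_time_s / 2.0

# Example: an echo received 0.04 s after transmit is ~30 m away.
print(range_from_echo(0.04))  # 30.0
```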
The sonar return signals can also be processed to be presented on a display, giving the user a “picture” or image of the underwater environment.
A display can be used to present marine information (such as sonar images or nautical charts) to a user. Live sonar imagery provides a two-dimensional sonar image that continuously updates all at the same time to provide a “live” sonar image of the underwater environment. However, current systems provide limited coverage volumes. Embodiments of the present invention provide improved sonar imagery that includes increased coverage volume of the underwater environment, particularly for such live sonar imagery. In some embodiments, the live sonar imagery may be provided as an overlay on a nautical chart, such as to give real-world context to the live sonar imagery.
Example embodiments of the present invention provide various sonar systems for imaging an underwater environment. Some example sonar systems provide for generating a live sonar image that represents an image of a volume of the underwater environment that is updating in real-time. In this regard, in some embodiments, the sonar system may include one or more arrays of sonar transducer elements that operate to beamform multiple sonar return beams. The multiple sonar return beams can be filtered, such as based on frequency, to receive sonar returns in sonar beam slices (e.g., beam angles of around 0.25 degrees to 2 degrees). The sonar beam slices build up to form the live sonar image extending across an overall sonar beam angle (e.g., multiple adjacent slices may form an overall coverage angle, such as ˜135 degrees). Because the sonar beam slices update continually, the resulting sonar image also updates continually. Accordingly, the system may be configured to generate a corresponding two-dimensional (2D) near-real time (or “live”) sonar image.
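As a rough, non-limiting sketch of how adjacent beam slices could build up such a continuously updating image, consider the following example; the 1 degree slice width, ˜135 degree coverage, range-bin count, and data layout are assumptions for illustration rather than a required implementation:

```python
import numpy as np

# Illustrative sketch: a live image buffer made of adjacent beam slices.
# Slice width (1 degree), overall coverage (135 degrees), and range-bin
# count are assumptions for the example.
SLICE_DEG = 1.0
COVERAGE_DEG = 135.0
NUM_SLICES = int(COVERAGE_DEG / SLICE_DEG)  # 135 slices
RANGE_BINS = 512

# Each column is one beam slice; each row is a range bin.
live_image = np.zeros((RANGE_BINS, NUM_SLICES))

def update_slice(slice_index: int, returns: np.ndarray) -> None:
    """Overwrite one beam slice with freshly received sonar returns.
    Because every slice is refreshed each ping cycle, the whole image
    updates continually rather than scrolling like downscan history."""
    live_image[:, slice_index] = returns

# One ping cycle refreshes all slices at (substantially) the same time.
for i in range(NUM_SLICES):
    update_slice(i, np.random.rand(RANGE_BINS))  # placeholder returns
```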
Other example embodiments provide systems having sonar transducer assemblies and brackets having alignment features. The alignment features may be used to control the facing direction of sonar transducer assemblies. In some embodiments, a single bracket may be provided that is configured to position two or more sonar transducer assemblies. However, in other embodiments, there may be multiple brackets where each bracket may be configured to position one or more sonar transducer assemblies. In some systems, three sonar transducer assemblies, four sonar transducer assemblies, or a greater number of sonar transducer assemblies may be provided. By using multiple sonar transducer assemblies, the overall coverage volume for sonar may be increased. The overall coverage volume may be defined by a coverage angle in one dimension (e.g., horizontal) that may be increased to 140 degrees, 150 degrees, 180 degrees, 270 degrees, or even 360 degrees.
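The following hedged sketch illustrates, under assumed values, how an overall coverage angle could be estimated as the union of the individual coverage arcs of multiple assemblies; the 135 degree per-assembly coverage and the facing directions are illustrative assumptions:

```python
# Sketch: overall horizontal coverage from multiple assemblies, treated
# as the union of angular intervals. Facing directions and the 135-degree
# per-assembly coverage are illustrative assumptions.

def coverage_intervals(facings_deg, per_assembly_deg=135.0):
    half = per_assembly_deg / 2.0
    return [((f - half) % 360.0, (f + half) % 360.0) for f in facings_deg]

def total_coverage_deg(facings_deg, per_assembly_deg=135.0, step=0.5):
    """Approximate the union of coverage arcs by sampling headings."""
    covered = 0
    for k in range(int(360.0 / step)):
        heading = k * step
        for lo, hi in coverage_intervals(facings_deg, per_assembly_deg):
            # Handle arcs that wrap past 360 degrees.
            inside = (lo <= heading <= hi) if lo <= hi else (heading >= lo or heading <= hi)
            if inside:
                covered += 1
                break
    return covered * step

# Two assemblies faced 120 degrees apart yield continuous ~255-degree coverage.
print(total_coverage_deg([0.0, 120.0]))
```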
In some example embodiments, the array(s) may be oriented such that the facing direction corresponding to the sonar image is generally outward of the watercraft. In such an example, the sonar image may extend in a horizontal plane, such as may correspond with the horizontal plane of a nautical chart. Accordingly, in some embodiments, the system may be configured to cause presentation of the live sonar image in the facing direction on the chart and relative to a representation of the watercraft so as to provide live sonar imagery on the chart to visually provide a relationship between objects within the live sonar imagery and a real-world position of the objects.
In some embodiments, the effective distance of the sonar coverage for the live sonar image may be accounted for during presentation on the chart. In this regard, the size of the sonar image on the chart may dimensionally correspond to the size of the sonar beam coverage within the underwater environment. In such examples, a user can more accurately understand where an object presented in the sonar image is in the real world. This may be useful for casting a fishing line or setting an anchor, among other things.
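As a non-limiting illustration of such real-world correspondence, the following sketch approximates the latitude/longitude of an object seen in the live sonar image from the watercraft position, the sonar facing direction, and the object's range and bearing; it uses a simple flat-earth approximation, and all names and inputs are assumptions for the example:

```python
import math

# Sketch: approximate real-world position of an object seen in the live
# sonar image, given the watercraft position, the sonar facing direction,
# and the object's range/bearing within the image. Uses a small-distance
# flat-earth approximation; names and inputs are illustrative.

EARTH_RADIUS_M = 6371000.0

def object_lat_lon(boat_lat, boat_lon, facing_deg, bearing_in_image_deg, range_m):
    # Absolute bearing of the object (clockwise from true north).
    bearing = math.radians((facing_deg + bearing_in_image_deg) % 360.0)
    d_north = range_m * math.cos(bearing)
    d_east = range_m * math.sin(bearing)
    dlat = math.degrees(d_north / EARTH_RADIUS_M)
    dlon = math.degrees(d_east / (EARTH_RADIUS_M * math.cos(math.radians(boat_lat))))
    return boat_lat + dlat, boat_lon + dlon

# Example: object 20 m out, 10 degrees starboard of a north-facing sonar.
print(object_lat_lon(44.98, -93.27, facing_deg=0.0,
                     bearing_in_image_deg=10.0, range_m=20.0))
```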
In some embodiments, the array may be rotatable with respect to the watercraft. Accordingly, the orientation of the sonar image on the chart with respect to the watercraft may be adjusted based on the current facing direction of the sonar system.
In an example embodiment, a system for generating live sonar images is provided. The system includes a first sonar transducer assembly having a first plurality of sonar transducer elements. The first sonar transducer assembly defines a first facing direction, and the first plurality of sonar transducer elements are configured to transmit one or more first sonar beams into an underwater environment to form a first coverage volume within the underwater environment. The system also includes a second sonar transducer assembly having a second plurality of sonar transducer elements. The second sonar transducer assembly defines a second facing direction, and the second plurality of sonar transducer elements are configured to transmit one or more second sonar beams into the underwater environment to form a second coverage volume within the underwater environment. The system also includes at least one bracket having one or more alignment features. The bracket(s) are configured to mount the first sonar transducer assembly and the second sonar transducer assembly to a watercraft, and the alignment feature(s) are configured to position the first sonar transducer assembly and the second sonar transducer assembly so that the first facing direction and the second facing direction are different and are oriented relative to each other so as to create continuous coverage of the underwater environment. The continuous coverage has an overall coverage volume that is greater than either the first coverage volume or the second coverage volume individually. The first facing direction and the second facing direction are generally outward of the watercraft. Further, sonar return data from the first plurality of sonar transducer elements and the second plurality of sonar transducer elements is used to form a live sonar image representative of sonar returns received from the overall coverage volume.
In some embodiments, the system may also include at least one processor configured to receive first sonar return data from the first plurality of sonar transducer elements and second sonar return data from the second plurality of sonar transducer elements. Further, the processor(s) may be configured to generate the live sonar image based on the first sonar return data and the second sonar return data.
In some embodiments, the bracket(s) may include a first bracket and a second bracket. The first bracket may be configured to mount the first sonar transducer assembly, and the second bracket may be configured to mount the second sonar transducer assembly.
In some embodiments, the bracket(s) may include a first bracket, and the first bracket may be configured to mount both the first sonar transducer assembly and the second sonar transducer assembly. Further, in some embodiments, the first bracket may include a first arm, a second arm, and a connecting arm. The connecting arm may connect the first arm and the second arm, the first arm may be configured to mount the first sonar transducer assembly, and the second arm may be configured to mount the second sonar transducer assembly. In some embodiments, the first arm may be connected to the connecting arm at a first end of the connecting arm, and the second arm may be connected to the connecting arm at a second end of the connecting arm. The connecting arm may extend in a lengthwise direction from the first end to the second end. Further, the first arm may possess a slope that is angularly offset from the lengthwise direction. Additionally, in some embodiments, the first bracket may be oriented such that the lengthwise direction is a vertical direction, and the slope may cause the first sonar transducer assembly to be rotated at a downward angle relative to a horizontal direction when mounted on the first arm. In some embodiments, the connecting arm may define a first surface, the first arm may extend at a first angle that is angularly offset from the first surface, and the second arm may extend at a second angle that is angularly offset from the first surface. Furthermore, in some embodiments, the first angle may be angularly offset from the first surface in a first direction, and the second angle may be angularly offset from the first surface in a second direction that is the opposite of the first direction. In some embodiments, extension of the first arm at the first angle and extension of the second arm at the second angle may reduce an overall footprint of the first sonar transducer assembly and the second sonar transducer assembly when the first sonar transducer assembly and the second sonar transducer assembly are mounted on the first bracket.
In some embodiments, the system may also include a clamp. The clamp may define an internal volume, and the clamp may be configured to be attached to an object by receiving the object in the internal volume. The clamp may be configured to be attached to the bracket(s) to assist in attaching the bracket(s) to the object, and a connecting arm of the bracket(s) may be offset by some distance from a center point of the internal volume when the bracket(s) is attached to the clamp. In some embodiments, the bracket(s) may be attachable to at least one of a pole, a trolling motor, a primary motor, or a hull of the watercraft.
In another example embodiment, a system for generating live sonar images is provided. The system includes at least one sonar transducer assembly having a plurality of arrays of a plurality of sonar transducer elements associated with a watercraft on a body of water and oriented with an emitting face in a facing direction. The facing direction is generally outward of the watercraft, and the plurality of sonar transducer elements are configured to transmit one or more sonar beams into an underwater environment to form an overall coverage volume with a horizontal coverage angle. The horizontal coverage angle defines an angle that is greater than 140 degrees. Further, sonar return data from the plurality of sonar transducer elements is used to form a live sonar image representative of sonar returns received from the overall coverage volume.
In some embodiments, the system may also include at least one processor that is configured to receive the sonar return data from the plurality of sonar transducer elements. The processor(s) may also be configured to generate the live sonar image of the underwater environment based on the sonar return data.
In some embodiments, the horizontal coverage angle may define an angle that is at least 180 degrees. Furthermore, in some embodiments, the horizontal coverage angle may define an angle that is at least 270 degrees. Additionally, in some embodiments, the horizontal coverage angle may define an angle that is at least 360 degrees. In some embodiments, sonar transducer assembl(ies) may include a first sonar transducer assembly having a first facing direction, a second sonar transducer assembly having a second facing direction, and a third sonar transducer assembly having a third facing direction. The first facing direction, the second facing direction, and the third facing direction may be different, and the sonar return data may be used to generate a 360 degree live sonar image. Furthermore, in some embodiments, the system may also include a first bracket and a second bracket. The first bracket may be configured to mount the first sonar transducer assembly and the second sonar transducer assembly, and the second bracket may be configured to mount the third sonar transducer assembly. In addition, in some embodiments, the sonar transducer assembl(ies) may also include a fourth sonar transducer assembly, and the second bracket may be configured to mount the third sonar transducer assembly and the fourth sonar transducer assembly.
In some embodiments, the system may also include a display, at least one processor, and a memory including computer program code. The computer program code may be configured to, when executed, cause the processor(s) to cause the plurality of sonar transducer elements to transmit the sonar beam(s) into the underwater environment, receive the sonar return data, and generate the live sonar image of the underwater environment based on the sonar return data. The live sonar image may be a two-dimensional live sonar image that is formed of the sonar return data, and the sonar return data used to form the live sonar image may be received at substantially a same time by the plurality of sonar transducer elements. The computer program code may also be configured to, when executed, cause the processor(s) to cause, on the display, presentation of the live sonar image.
In another example embodiment, a bracket for positioning sonar transducer assemblies is provided. The bracket includes a first arm having a first alignment feature, a second arm having a second alignment feature, and a connecting arm. The connecting arm connects the first arm and the second arm, and the connecting arm extends in a lengthwise direction between a first end and a second end. The first arm is connected to the connecting arm at the first end, and the second arm is connected to the connecting arm at the second end. The first alignment feature is configured to receive a first sonar transducer assembly and aim the first sonar transducer assembly in a first facing direction, and the second alignment feature is configured to receive a second sonar transducer assembly and aim the second sonar transducer assembly in a second facing direction. The bracket is configured to position the first sonar transducer assembly and the second sonar transducer assembly such that the first sonar transducer assembly and the second sonar transducer assembly together provide an overall coverage volume with a horizontal coverage angle of greater than 140 degrees.
Additional example embodiments of the present invention include methods, systems, apparatuses, and computer program products associated with various embodiments described herein.
Having thus described the invention in general terms, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale.
Exemplary embodiments of the present invention now will be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all embodiments of the invention are shown. Indeed, the invention may be embodied in many different forms and should not be construed as limited to the exemplary embodiments set forth herein; rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements. Like reference numerals refer to like elements throughout.
In this regard, the sonar transducer may be formed of one or more active elements (e.g., piezoelectric crystals). Wires are soldered to coatings on the active element and can be attached to a cable which transfers the electrical energy from a transmitter to the active element. The shape of the active element determines both its resonant frequency and the shape of the sonar beam. Further, padding can be used to prevent sonar emissions from certain faces of the active element (e.g., the top and sides), leaving exposed only the emitting faces for which the sonar beam is desired. Frequencies used by sonar devices vary, and some sonar transducers may produce sonar beams at multiple different frequencies. Some example sonar transducers utilize a frequency range from 50 kHz to over 900 kHz depending on application. Some sonar systems vary the frequency within each sonar pulse using “chirp” technology.
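By way of illustration, a linear chirp pulse of the kind referenced above may be sketched as follows; the 80-160 kHz band, 2 ms pulse duration, and sample rate are assumptions chosen for the example and do not reflect any particular product:

```python
import numpy as np

# Sketch of a linear "chirp" transmit pulse: the frequency is swept
# within a single pulse. Band, duration, and sample rate are
# illustrative assumptions.
SAMPLE_RATE_HZ = 1_000_000
PULSE_S = 0.002
F_START_HZ = 80_000.0
F_END_HZ = 160_000.0

t = np.arange(0, PULSE_S, 1.0 / SAMPLE_RATE_HZ)
# Instantaneous frequency ramps linearly from F_START_HZ to F_END_HZ,
# so the phase is the integral of that ramp.
sweep_rate = (F_END_HZ - F_START_HZ) / PULSE_S
phase = 2.0 * np.pi * (F_START_HZ * t + 0.5 * sweep_rate * t**2)
chirp_pulse = np.sin(phase)
```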
Depending on the configuration, the watercraft 100 may include a primary motor 105, which may be a main propulsion motor such as an outboard or inboard motor. Additionally, the watercraft 100 may include a trolling motor 108 configured to propel the watercraft 100 or maintain a position. The one or more transducer assemblies (e.g., 102a, 102b, and/or 102c) may be mounted in various positions and to various portions of the watercraft 100 and/or equipment associated with the watercraft 100. For example, the transducer assembly may be mounted to the hull (e.g., transom 106) of the watercraft 100, such as depicted by transducer assembly 102a. The transducer assembly may be mounted to the bottom or side of the hull 104 of the watercraft 100, such as depicted by transducer assembly 102b. The transducer assembly may be mounted to the trolling motor 108, such as depicted by transducer assembly 102c. Other mounting configurations are also contemplated, such as configurations that enable rotation of the transducer assembly (e.g., mechanical and/or manual rotation, such as on a rod or other mounting connection).
The watercraft 100 may also include one or more marine electronic devices 160, such as may be utilized by a user to interact with, view, or otherwise control various functionality regarding the watercraft, including, for example, nautical charts and various sonar systems described herein. In the illustrated embodiment, the marine electronic device 160 is positioned proximate the helm (e.g., steering wheel) of the watercraft 100—although other places on the watercraft 100 are contemplated. Likewise, additionally or alternatively, a remote device (such as a user's mobile device) may include functionality of a marine electronic device.
The watercraft 100 may also comprise other components within the one or more marine electronic devices 160 or at the helm.
Some example embodiments of the present invention utilize sonar transducer assemblies that provide for generating near real-time (e.g., “live”) sonar imagery. In this regard, in some embodiments, the entire sonar image is continuously updated all at once (e.g., as opposed to building up historical slices of sonar data as is typical of conventional downscan or sidescan sonar images).
In some embodiments, the array 220 of transducer elements 208 is configured to operate to transmit one or more sonar beams into the underwater environment. Depending on the configuration and desired operation, different transmission types of sonar beams can occur. For example, in some embodiments, the array 220 may transmit sonar beams according to a frequency sweep (e.g., chirp sonar) so as to provide sonar beams into the underwater environment. In some embodiments, the array 220 may be operated to frequency steer transmitted sonar beams into various volumes of the underwater environment. In some embodiments, the array 220 may be operated to cause a broadband transmit sonar beam to be sent into the underwater environment. Depending on the frequency used and phase shift applied between transducer elements, different volumes of the underwater environment may be targeted.
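As a non-limiting sketch of the phase-shift steering described above, the per-element phases for a uniform linear array may be computed as follows; the element count, spacing, and operating frequency are illustrative assumptions:

```python
import numpy as np

# Sketch of the per-element phase shifts used to steer a transmit beam
# from a uniform linear array. Element count, spacing, and frequency are
# illustrative assumptions.
SOUND_SPEED_M_S = 1500.0

def steering_phases(num_elements, spacing_m, frequency_hz, steer_deg):
    """Phase (radians) applied to each element so the wavefronts add
    coherently in the steer_deg direction off the array's facing axis."""
    wavelength = SOUND_SPEED_M_S / frequency_hz
    n = np.arange(num_elements)
    return 2.0 * np.pi * spacing_m * n * np.sin(np.radians(steer_deg)) / wavelength

# Example: steer a 16-element array (half-wavelength spacing at 200 kHz)
# 15 degrees off its facing direction.
wl = SOUND_SPEED_M_S / 200_000.0
print(steering_phases(16, wl / 2.0, 200_000.0, 15.0))
```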
In some embodiments, the array 220 may be configured to receive sonar return signals. The way the sonar return signals are received and/or processed may vary depending on the desired sonar system configuration.
Without being bound by theory, a perhaps simplified explanation of this can be based on considering a single beam shape that is formed by a receipt event of the array. The beam shape is formed of a rather wide main beam lobe, along with at least one relatively small defined side lobe (e.g., the beam 280) that extends outwardly therefrom. By operating at a fixed phase shift and ignoring the main beam lobe, the sonar return signals received within the side lobe can be determined. Further, changing the frequency causes a shifting of the direction of the side lobe among the range of angles (281 or 282). Since the side lobe is symmetrical about the main lobe, there are two ranges of angles that are symmetrical about the facing direction DFD of the emitting face 221 of the array 220.
Further information regarding beamforming, including frequency steered beamforming, can be found, for example, in the following: U.S. Pat. No. RE45,379, entitled “Frequency Division Beamforming for Sonar Arrays”; U.S. Pat. No. 10,114,119, entitled “Sonar Systems using Interferometry and/or Beamforming for 3D Imaging”; U.S. Pat. No. 9,739,884, entitled “Systems and Associated Methods for Producing a 3D Sonar Image”; and U.S. patent application Ser. No. 16/382,639, published as U.S. Publication No. 2019/0265354, and entitled “Sonar Transducer Having Geometric Elements”; the contents of each hereby being incorporated by reference in their entireties.
Depending on various factors, different beam shapes can be achieved and different ranges of angles can be achieved. The following describes some example factors that can be varied to affect the beam shapes and the different ranges of angles: the number of transducer elements, the size/shape of the transducer elements, the size/shape of the array, the fixed phase shift, and the frequency range, among other things. An example embodiment produces a first range of angles spanning ˜22.5 degrees and a second range of angles spanning ˜22.5 degrees with a gap of ˜45 degrees therebetween. Additionally, sonar return beams of ˜0.5 degrees to 1 degree are formed.
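For illustration only, a simplified frequency-to-angle mapping consistent with the example above (two symmetric ˜22.5 degree ranges separated by a ˜45 degree gap) might look like the following; the linear mapping and band edges are assumptions, not the true response of any particular array:

```python
import numpy as np

# Sketch: filtering beamformed returns into per-angle slices by frequency.
# In frequency-steered receive, each frequency corresponds to one
# side-lobe direction; the linear frequency-to-angle mapping and band
# edges here are illustrative assumptions.
F_LOW_HZ, F_HIGH_HZ = 100_000.0, 200_000.0
RANGE_SPAN_DEG = 22.5       # each symmetric range of angles
GAP_DEG = 45.0              # un-steered gap between the two ranges

def slice_angles_for_frequency(freq_hz):
    """Map a frequency in the band to the pair of symmetric slice angles
    (port/starboard of the facing direction) it is steered toward."""
    frac = (freq_hz - F_LOW_HZ) / (F_HIGH_HZ - F_LOW_HZ)
    offset = GAP_DEG / 2.0 + frac * RANGE_SPAN_DEG  # 22.5 to 45 degrees
    return -offset, +offset  # symmetric about the facing direction

for f in np.linspace(F_LOW_HZ, F_HIGH_HZ, 5):
    print(f, slice_angles_for_frequency(f))
```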
In some embodiments, the system may be configured to utilize more than one array, where the arrays are oriented relative to each other to increase coverage volume of the underwater environment. For example, in some embodiments, a second (or more) array(s) can be added and tilted relative to the first array such that the gap within the first array is “covered” by one or more of the range of angles of sonar return beams from such array(s).
In some embodiments, the transducer assembly can be used to form a live (or substantially real-time) two-dimensional (2D) sonar image (e.g., time/distance from the transducer assembly and angle) with a horizontal view.
Due to the overall coverage angle being ˜135 degrees, there are blank spaces in each corner 723a, 723b (as the display is shaped as a rectangle). Notably, the shape of the sonar image may be different depending on the effective coverage provided by the sonar transducer assembly. In this regard, in some embodiments, the live sonar image is shaped to provide imagery of the sonar return data all at once, and that sonar return data is continuously updated such that the imagery is continuously updated.
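As a rough sketch of why such blank corners arise, the following example rasterizes polar sonar data (range and angle) into a rectangular image, leaving pixels outside the fan untouched; the image dimensions and data layout are illustrative assumptions:

```python
import numpy as np

# Sketch: rasterizing live 2D sonar data (range, angle) into a
# rectangular display image. Pixels outside the ~135-degree fan remain
# blank, producing the empty corners noted above. Dimensions are
# illustrative assumptions.
IMG_H, IMG_W = 400, 800
COVERAGE_DEG = 135.0
MAX_RANGE_BINS = 400

def rasterize(polar, img_h=IMG_H, img_w=IMG_W):
    """polar: array of shape (range_bins, num_slices) of return strength."""
    image = np.zeros((img_h, img_w))          # blank (corner) pixels stay 0
    cx = img_w / 2.0                          # transducer at bottom-center
    num_slices = polar.shape[1]
    for y in range(img_h):
        for x in range(img_w):
            dx, dy = x - cx, img_h - 1 - y    # dy grows away from the boat
            r = np.hypot(dx, dy)
            theta = np.degrees(np.arctan2(dx, dy))   # 0 = straight ahead
            if r < img_h and abs(theta) <= COVERAGE_DEG / 2.0:
                r_bin = int(r / img_h * (polar.shape[0] - 1))
                s = int((theta + COVERAGE_DEG / 2.0) / COVERAGE_DEG * (num_slices - 1))
                image[y, x] = polar[r_bin, s]
    return image

fan = rasterize(np.random.rand(MAX_RANGE_BINS, 135))
```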
Whether a user is a novice or an expert, it would be beneficial for the user to be able to quickly and easily visually appreciate the real-world sonar coverage of a sonar image, such as a live sonar image. Indeed, even for experts, it can be difficult (or mentally taxing) to determine the real-world sonar coverage of a sonar transducer of a watercraft, such as figuring out where objects in the sonar imagery actually are in the real world. Some embodiments of the present invention aim to provide useful information that will aid the user in determining and understanding the sonar coverage of the underwater environment, such as by providing live sonar imagery on a chart in the proper location, orientation, and/or dimensional spacing.
The processor 410 may also be configured to receive sonar return data in response to the one or more sonar signals being transmitted into the body of water 101. As discussed above, the processor 410 may be configured to generate one or more sonar images based on the one or more sonar returns. In some embodiments, sonar return data from multiple of the sonar transducer assemblies, such as described herein, may be combined or otherwise integrated to form a sonar image of the overall coverage volume. The processor 410 may determine corresponding facing directions for each sonar transducer assembly and cause relative presentation of the sonar imagery based thereon to form the sonar image of the overall coverage volume. In some embodiments, the sonar transducer assemblies may include one or more sensors that may enable the processor to correlate received sonar return data with a facing direction for use in forming the sonar imagery. In some embodiments, certain sonar transducer assemblies may be assigned or predetermined for their relative orientations. In some embodiments, the resultant sonar image may be a composite of multiple sonar images. In some embodiments, the entire sonar image may be generated together.
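A minimal, non-limiting sketch of such combining, keyed by absolute angle, is shown below; the facing directions and per-slice data layout are assumptions for illustration:

```python
# Sketch: merging sonar return slices from multiple assemblies into one
# composite, keyed by absolute angle. Facing directions and the per-slice
# data layout are illustrative assumptions. Where coverage overlaps,
# later assemblies simply overwrite earlier ones in this simple sketch.

def composite_returns(assemblies):
    """assemblies: list of (facing_deg, {relative_angle_deg: returns}).
    Returns {absolute_angle_deg: returns} spanning the overall volume."""
    merged = {}
    for facing_deg, slices in assemblies:
        for rel_angle, returns in slices.items():
            merged[(facing_deg + rel_angle) % 360.0] = returns
    return merged

# Two assemblies faced 120 degrees apart, each with slices at -60..+60.
a = (0.0,   {ang: f"A{ang}" for ang in range(-60, 61, 10)})
b = (120.0, {ang: f"B{ang}" for ang in range(-60, 61, 10)})
print(sorted(composite_returns([a, b]).keys()))
```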
The processor 410 may determine a location associated with the sonar return data based on location data received by the position sensor 445 at the time in which the sonar returns were received by the one or more transducer assemblies 102a, 102b, 102c (e.g., one or more of sonar transducer assemblies 462, 462′, 462″).
In some embodiments, the system may be configured to cause presentation of a chart (e.g., a nautical chart) on a display, along with a representation of the watercraft at a current location within the chart. The chart may be stored in memory and/or gathered via an external or internal network. The position and/or orientation of the watercraft may be determined via position/orientation data, such as from a global positioning system (GPS) and/or other source(s).
In some embodiments, the system may be configured to operate one or more sonar transducer assemblies associated with the watercraft. For example, the system may be configured to operate one or more arrays of a plurality of sonar transducer elements, such as from the sonar transducer assembly 602.
In some embodiments, the system may be configured to determine the facing direction of the sonar transducer assembly. In some embodiments, direction data (e.g., orientation data, compass data, etc.) may be determined regarding at least one of the watercraft or the sonar transducer assembly. For example, the relative facing direction of the sonar transducer assembly with respect to the watercraft may be known and fixed (e.g., forward, rearward, 10 degrees port of forward, etc.). In that case, the facing direction may be determined by determining the direction the watercraft is facing and then applying the known relative offset of the sonar transducer assembly. In some cases, however, the sonar transducer assembly may have its own sensor for determining the facing direction (e.g., a direction sensor, GPS, orientation sensor, etc.), and the facing direction may be determined based on that data. Alternatively, the facing direction may be determined in other ways, such as being inputted by a user.
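A minimal sketch of this determination, assuming hypothetical inputs for the watercraft heading, a known mount offset, and an optional assembly-mounted direction sensor, might be:

```python
# Sketch: deriving the sonar facing direction. If the mount offset
# relative to the bow is fixed and known, the facing direction follows
# from the watercraft heading; otherwise a direction sensor on the
# assembly may report it directly. Names are illustrative.

def facing_direction_deg(boat_heading_deg, mount_offset_deg=0.0,
                         assembly_sensor_deg=None):
    if assembly_sensor_deg is not None:
        # The assembly's own direction sensor takes precedence (e.g., a
        # rotatable trolling-motor mount).
        return assembly_sensor_deg % 360.0
    return (boat_heading_deg + mount_offset_deg) % 360.0

print(facing_direction_deg(90.0, mount_offset_deg=-10.0))  # 80.0
```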
In some embodiments, the system is configured to cause, on the display, presentation of the sonar image in the facing direction on the chart and relative to the representation of the watercraft. In this regard, the sonar image is presented in the facing direction on the chart so as to provide live sonar imagery on the chart to visually provide a relationship between objects within the live sonar imagery and a real-world position of the objects.
In some embodiments, the radial distance of the sonar image (e.g., radial distance DRSI) may dimensionally correspond to the effective distance of the sonar coverage within the underwater environment, such that the extent of the sonar image on the chart matches the real-world extent of the sonar coverage.
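By way of a non-limiting example, scaling the overlay radius to the chart might be sketched as follows, where the meters-per-pixel chart scale and the sonar range are illustrative assumptions:

```python
# Sketch: sizing the sonar overlay so its radial extent matches the
# chart's scale. The meters-per-pixel chart scale and ranges are
# illustrative assumptions.

def overlay_radius_px(sonar_range_m, chart_meters_per_pixel):
    """Radius, in chart pixels, of the sonar fan so that 30 m of sonar
    coverage spans exactly 30 m of chart."""
    return sonar_range_m / chart_meters_per_pixel

# A 30 m sonar range on a chart drawn at 0.25 m/pixel spans 120 pixels;
# zooming the chart out to 0.5 m/pixel shrinks the fan to 60 pixels.
print(overlay_radius_px(30.0, 0.25), overlay_radius_px(30.0, 0.5))
```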
In some embodiments, the sonar image may be generated and/or presented to remove any unnecessary or unused space (e.g., so as to not detract from the view of the chart itself). For example, the blank corners 723a, 723b of the sonar image 720 may be omitted when the sonar image is presented on the chart.
As noted above, in some embodiments, the sonar image may be a live sonar image. In this regard, in some such embodiments, the sonar image 793 may be updated in real-time while being presented on the chart.
In addition to making it easier to determine real-world positions of objects within the sonar image, utilizing this feature enables a reduction of the number of images that are displayed (e.g., a normal split-screen chart and sonar view may be replaced with a single larger chart view with the sonar image presented thereon). In some embodiments, various navigation and other chart features may be presented along with the sonar image on the chart.
In some embodiments, the relative position of the sonar transducer on the watercraft may be accounted for when forming and/or presenting the sonar image. In this regard, a sonar image from a sonar transducer assembly positioned near the front of the watercraft (e.g., mounted to the front of the watercraft, mounted to a trolling motor positioned on the front of the watercraft, etc.) may extend from a point on the representation of the watercraft near the front. Likewise, a sonar image from a sonar transducer assembly positioned near the rear of the watercraft (e.g., mounted to the rear of the watercraft, mounted to a trolling motor positioned on the rear of the watercraft, etc.) may extend from a point on the representation of the watercraft near the rear. Other relative positions are also contemplated. In some embodiments, position data associated with the sonar transducer assembly may be utilized directly from the sonar transducer assembly (e.g., as opposed to from the watercraft) to determine where to position the sonar image on the chart. In some embodiments, the relative position of the sonar transducer on the watercraft may be known (or inputted), which can be used to position the sonar image on the chart relative to the representation of the watercraft.
In some embodiments, the sonar transducer assembly may be rotatable with respect to the watercraft. For example, the sonar transducer assembly may be mounted to a trolling motor that is rotatable with respect to the watercraft. As another example, the sonar transducer assembly may be mounted to a rod or directly mounted to the watercraft in a manner that enables rotation (e.g., manually and/or mechanically). In some such embodiments, it may be desirable to provide a direction sensor (e.g., direction sensor, orientation sensor, etc.) with the sonar transducer assembly to enable detection of the facing direction of the sonar transducer assembly. Accordingly, in some embodiments, the system may be configured to re-orient the sonar image (such as with respect to the watercraft) based on the current facing direction.
In some embodiments, the system may include one or more additional sonar transducer assemblies or arrays. Such additional sonar transducer assemblies or arrays may be formed of any configuration of sonar transducer elements. For example, the watercraft may include other types of sonar transducer assemblies, such as downscan transducer elements (traditional and/or linear), sidescan transducer elements, or other arrays of transducer elements. In some embodiments, the system may be configured to generate and present corresponding sonar images on the chart, such as in the proper orientation and at the proper location. In some embodiments, multiple sonar images may be presented on the chart simultaneously.
In some embodiments, the additional sonar transducer assemblies or arrays may be aimed in a different facing direction than the first sonar transducer assembly. In some such embodiments, the multiple sonar images may be presented on the chart at the same time, thereby providing a composite sonar image that covers a large section of the chart (and the underwater environment). For example, a first sonar image may define a first coverage area (with a first overall coverage angle) in a horizontal plane extending outwardly from the watercraft and a second sonar image may define a second coverage area (with a second overall coverage angle) in the horizontal plane extending outwardly from the watercraft, where the first coverage area is different than the second coverage area. In some embodiments, the coverage areas may be configured so as to not overlap.
In some embodiments, the multiple sonar transducer assemblies or arrays may be positioned (e.g., and mounted) and aimed to coordinate together to form a desirable coverage volume. For example, two sonar transducer assemblies with a similar configuration may be aimed in different facing directions, but the sonar transducer assemblies may be complements to each other such that the two sonar coverage volumes are positioned to form a continuous composite sonar coverage volume. Such a continuous composite sonar coverage volume may, for example, cover an angle range extending from the watercraft (e.g., 240 degrees, 360 degrees, or some other degree range).
In some embodiments, the sonar system may be designed to provide 360 degrees of coverage around the watercraft. For example, multiple arrays and/or multiple sonar transducer assemblies may be arranged in appropriate facing directions and have appropriate sonar coverage to enable the full 360 degree view. In some examples, the resulting sonar image may include live sonar imagery over the entire coverage area. Additionally or alternatively, in some embodiments, the 360 degree sonar image may be built up as the one or more sonar transducer assemblies or arrays rotate about the watercraft.
Multiple sonar transducer assemblies may be used to increase the sonar coverage volume, and various brackets are contemplated herein that may be used to mount and position one or more sonar transducer assemblies.
The first sonar transducer assembly 1202A and the second sonar transducer assembly 1202B may each be similar to the sonar transducer assembly 602 described herein.
Brackets utilized to receive and position sonar transducer assemblies may be attached to a pole or some other object using a clamp.
Brackets are contemplated herein having the ability to receive and position two different sonar transducer assemblies.
Each of the first arm 1506A and the second arm 1506B may include various alignment features. For example, the first arm 1506A includes a first bar 1517A, a first patterned surface 1519A, protrusions 1525A, and/or a first knob 1527A. Additionally, the second arm 1506B may include alignment features in the form of a second bar 1517B, a second patterned surface 1519B, protrusions 1525B, and/or a second knob 1527B. The first patterned surface 1519A and the second patterned surface 1519B may include a plurality of elevated regions and depressed regions, and the patterned surfaces may be configured to assist with positioning of sonar transducer assemblies, to assist in retaining the sonar transducer assemblies in position, and/or to assist in preventing inadvertent rotation or movement of the sonar transducer assemblies. The first bar 1517A and the second bar 1517B may be received in a sonar transducer assembly to restrain the lateral movement of the respective sonar transducer assemblies. The first bar 1517A may be attached to the first knob 1527A, and rotation of the first knob 1527A may result in rotation of the first bar 1517A and any sonar transducer assembly attached to the first bar 1517A. Similarly, the second bar 1517B may be attached to the second knob 1527B, and rotation of the second knob 1527B may result in rotation of the second bar 1517B and any sonar transducer assembly attached to the second bar 1517B. Furthermore, the protrusions 1525A and protrusions 1525B may be configured to assist a user in positioning a sonar transducer assembly at a desired angular orientation on the bracket 1506. The first arm 1506A possesses a first inclined surface 1533A, and some of the alignment features may be provided on the first inclined surface 1533A. The second arm 1506B possesses a second inclined surface 1533B, and some of the alignment features may be provided on the second inclined surface 1533B.
Using the first arm 1506A, the second arm 1506B, and the alignment features on the arms, sonar transducer assemblies may be positioned as desired. The sonar transducer assemblies may be positioned so that the first facing direction of the first sonar transducer assembly 1202A and the second facing direction of the second sonar transducer assembly 1202B are different, such as to provide continuous coverage of the underwater environment.
The connecting arm 1506C defines a front surface 1537. The connecting arm 1506C possesses one or more holes 1539. These holes 1539 may be provided at the front surface 1537 and may extend through the connecting arm 1506C. These holes 1539 may be threaded holes in some embodiments. The holes 1539 may be used to assist in attaching the bracket 1506 to a clamp 1408.
The first arm 1506A defines a surface 1533A, and the first arm 1506A possesses a slope so that a normal to the surface 1533A is angularly offset from the lengthwise direction of the connecting arm 1506C.
While brackets are described having the ability to receive and position multiple sonar transducer assemblies, other brackets may be provided having only the ability to receive and position a single sonar transducer assembly.
In some embodiments, a bracket may be provided with the ability to receive and position four different sonar transducer assemblies.
In various embodiments, sonar transducer assemblies may be arranged in a manner that is configured to permit the generation of live sonar images having 360 degrees of coverage.
Though various illustrated embodiments show various brackets attached to a shaft of a trolling motor, other mountings are contemplated, such as on a pole separate from a trolling motor shaft (although the pole may be attached to a trolling motor shaft).
The marine electronic device 405 may include at least one processor 410, a memory 420, a communication interface 430, a user interface 435, a display 440, autopilot 450, and one or more sensors (e.g. position sensor 445, direction sensor 448, other sensors 452). One or more of the components of the marine electronic device 405 may be located within a housing or could be separated into multiple different housings (e.g., be remotely located).
The processor(s) 410 may be any means configured to execute various programmed operations or instructions stored in a memory device (e.g., memory 420), such as a device or circuitry operating in accordance with software or otherwise embodied in hardware or a combination of hardware and software (e.g., a processor operating under software control, or the processor embodied as an application-specific integrated circuit (ASIC) or field-programmable gate array (FPGA) specifically configured to perform the operations described herein, or a combination thereof), thereby configuring the device or circuitry to perform the corresponding functions of the at least one processor 410 as described herein. For example, the at least one processor 410 may be configured to analyze sonar return data for various features/functions described herein (e.g., generate a sonar image, determine an object and/or object position, etc.).
In some embodiments, the at least one processor 410 may be further configured to implement signal processing. In some embodiments, the at least one processor 410 may be configured to perform enhancement features to improve the display characteristics of data or images, collect or process additional data, such as time, temperature, GPS information, waypoint designations, or others, or may filter extraneous data to better analyze the collected data. The at least one processor 410 may further implement notices and alarms, such as those determined or adjusted by a user, to reflect proximity of other objects (e.g., represented in sonar data), to reflect proximity of other vehicles (e.g. watercraft), approaching storms, etc.
In an example embodiment, the memory 420 may include one or more non-transitory storage or memory devices such as, for example, volatile and/or non-volatile memory that may be either fixed or removable. The memory 420 may be configured to store instructions, computer program code, sonar data, and additional data such as radar data, chart data, location/position data in a non-transitory computer readable medium for use, such as by the at least one processor 410 for enabling the marine electronic device 405 to carry out various functions in accordance with example embodiments of the present invention. For example, the memory 420 could be configured to buffer input data for processing by the at least one processor 410. Additionally or alternatively, the memory 420 could be configured to store instructions for execution by the at least one processor 410.
The communication interface 430 may be configured to enable communication to external systems (e.g., an external network 402). In this manner, the marine electronic device 405 may retrieve stored data from a remote device 454 via the external network 402 in addition to or as an alternative to the onboard memory 420. Additionally or alternatively, the marine electronic device 405 may transmit or receive data, such as sonar signal data, sonar return data, sonar image data, or the like, to or from a sonar transducer assembly 462. In some embodiments, the marine electronic device 405 may also be configured to communicate with other devices or systems (such as through the external network 402 or through other communication networks, such as described herein). For example, the marine electronic device 405 may communicate with a propulsion system of the watercraft 100 (e.g., for autopilot control); a remote device (e.g., a user's mobile device, a handheld remote, etc.); or another system. Using the external network 402, the marine electronic device may communicate with, and send data to and receive data from, external sources such as a cloud, a server, etc. The marine electronic device may send and receive various types of data. For example, the system may receive weather data, data from other fish locator applications, alert data, and the like. However, this data is not required to be communicated using the external network 402, and the data may instead be communicated using other approaches, such as through a physical or wireless connection via the communications interface 430.
The communications interface 430 of the marine electronic device 405 may also include one or more communications modules configured to communicate with one another in any of a number of different manners including, for example, via a network. In this regard, the communications interface 430 may include any of a number of different communication backbones or frameworks including, for example, Ethernet, the NMEA 2000 framework, GPS, cellular, Wi-Fi, or other suitable networks. The network may also support other data sources, including GPS, autopilot, engine data, compass, radar, etc. In this regard, numerous other peripheral devices (including other marine electronic devices or sonar transducer assemblies) may be included in the system 400.
The position sensor 445 may be configured to determine the current position and/or location of the marine electronic device 405 (and/or the watercraft 100). For example, the position sensor 445 may comprise a GPS, a bottom contour-based system, an inertial navigation system, such as a micro-electro-mechanical system (MEMS) sensor, a ring laser gyroscope, or other location detection system. Alternatively or in addition to determining the location of the marine electronic device 405 or the watercraft 100, the position sensor 445 may also be configured to determine the position and/or orientation of an object outside of the watercraft 100.
The display 440 (e.g. one or more screens) may be configured to present images and may include or otherwise be in communication with a user interface 435 configured to receive input from a user. The display 440 may be, for example, a conventional LCD (liquid crystal display), a touch screen display, mobile device, or any other suitable display known in the art upon which images may be displayed.
In some embodiments, the display 440 may present one or more sets of data (or images generated from the one or more sets of data). Such data includes chart data, radar data, sonar data, weather data, location data, position data, orientation data, or any other type of information relevant to the watercraft. Sonar data may be received from one or more sonar transducer assemblies 462 or from sonar devices positioned at other locations, such as remote from the watercraft. Additional data may be received from marine devices such as a radar 456, a primary motor 458 or an associated sensor, a trolling motor 459 or an associated sensor, an autopilot, a rudder 457 or an associated sensor, a position sensor 445, a direction sensor 448, other sensors 452, a remote device 454, onboard memory 420 (e.g., stored chart data, historical data, etc.), or other devices.
In some further embodiments, various sets of data, referred to above, may be superimposed or overlaid onto one another. For example, a route may be applied to (or overlaid onto) a chart (e.g. a map or navigational chart). Additionally or alternatively, depth information, weather information, radar information, sonar information, or any other navigation system inputs may be applied to one another.
The user interface 435 may include, for example, a keyboard, keypad, function keys, mouse, scrolling device, input/output ports, touch screen, or any other mechanism by which a user may interface with the system.
The marine electronic device 405 may include one or more other sensors/devices 452, such as configured to measure or sense various other conditions. The other sensors/devices 452 may include, for example, an air temperature sensor, a water temperature sensor, a current sensor, a light sensor, a wind sensor, a speed sensor, or the like.
The sonar transducer assemblies 462 may also include one or more other systems, such as various sensor(s) 466. For example, the sonar transducer assembly 462 may include an orientation sensor, such as a gyroscope or other orientation sensor (e.g., accelerometer, MEMS, direction, etc.) that can be configured to determine the relative orientation and/or direction of the sonar transducer assembly 462 and/or the one or more sonar transducer array(s) and/or element(s) 467, such as with respect to the watercraft. In some embodiments, additionally or alternatively, other types of sensor(s) are contemplated, such as, for example, a water temperature sensor, a current sensor, a light sensor, a wind sensor, a speed sensor, or the like.
Some embodiments of the present invention provide methods, apparatus, and computer program products related to the presentation of information according to various embodiments described herein. Various examples of the operations performed in accordance with embodiments of the present invention will now be provided.
At operation 802, the method comprises causing presentation of a chart, including a representation of the watercraft at a current location within the chart. At operation 804, the method comprises operating an array of a plurality of elements of one or more transducer assemblies. At operation 806, the method comprises receiving sonar return data from the one or more transducer assemblies. At operation 808, the method comprises generating a sonar image, such as a live sonar image. Then, at operation 810, the method comprises determining a facing direction corresponding to the one or more transducer assemblies and/or the sonar image. At operation 812, the method comprises causing presentation of the sonar image on the chart in a facing direction at the current location for the watercraft. In some embodiments, the method may include, at operation 814, updating the sonar image. In some embodiments, at operation 816, the method may include causing presentation of an object indicator within the sonar image (which may include determining the object and/or tracking the object).
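As a non-limiting sketch, these operations may be viewed as a processing loop such as the following; every method name here is a hypothetical stand-in for the described behavior, not an actual device API:

```python
# Sketch of operations 802-816 as a processing loop; all function names
# are hypothetical stand-ins for the system behavior described above.

def run_live_sonar_on_chart(system):
    system.present_chart_with_boat_icon()                 # operation 802
    while system.active:
        system.transmit_from_arrays()                     # operation 804
        returns = system.receive_sonar_returns()          # operation 806
        image = system.generate_live_sonar_image(returns) # operation 808
        facing = system.determine_facing_direction()      # operation 810
        system.present_image_on_chart(image, facing)      # operation 812
        # Operations 814-816: refresh the image and, optionally, mark
        # and track detected objects within it.
        system.update_and_mark_objects(image)
```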
Many modifications and other embodiments of the inventions set forth herein will come to mind to one skilled in the art to which these inventions pertain having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the embodiments of the invention are not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the invention. Moreover, although the foregoing descriptions and the associated drawings describe example embodiments in the context of certain example combinations of elements and/or functions, it should be appreciated that different combinations of elements and/or functions may be provided by alternative embodiments without departing from the scope of the invention. In this regard, for example, different combinations of elements and/or functions than those explicitly described above are also contemplated within the scope of the invention. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.
This application claims priority to and is a continuation-in-part of U.S. Non-Provisional application Ser. No. 17/174,415, filed Feb. 12, 2021, entitled “Marine Chart and Sonar Image Presentation Systems and Methods”, which claims priority to and is a continuation-in-part of U.S. Non-Provisional application Ser. No. 17/123,189, filed Dec. 16, 2020, entitled “Marine Electronic Device For Presentment of Nautical Charts and Sonar Images”, which claims priority to and is a continuation of U.S. Non-Provisional application Ser. No. 15/982,362, filed May 17, 2018, entitled “Marine Electronic Device For Presentment of Nautical Charts and Sonar Images”, which issued as U.S. Pat. No. 10,914,810, the contents of each being hereby incorporated by reference in its entirety.
| Number | Name | Date | Kind |
|---|---|---|---|
| 3572837 | Lackey | Mar 1971 | A |
| 4425635 | Yamamoto et al. | Jan 1984 | A |
| 4970700 | Gilmour | Nov 1990 | A |
| 5228008 | Burhanpurkar | Jul 1993 | A |
| 5311095 | Smith et al. | May 1994 | A |
| 5329496 | Smith | Jul 1994 | A |
| 5548564 | Smith | Aug 1996 | A |
| 5744898 | Smith et al. | Apr 1998 | A |
| 5923617 | Thompson et al. | Jul 1999 | A |
| 6520105 | Koda et al. | Feb 2003 | B2 |
| 6678210 | Rowe | Jan 2004 | B2 |
| 7035166 | Zimmerman et al. | Apr 2006 | B2 |
| 7106656 | Lerro et al. | Sep 2006 | B2 |
| 7123546 | Zimmerman et al. | Oct 2006 | B2 |
| 7173879 | Zimmerman et al. | Feb 2007 | B2 |
| 7215598 | Guthmann | May 2007 | B2 |
| 7330399 | Lerro et al. | Feb 2008 | B2 |
| 7355924 | Zimmerman et al. | Apr 2008 | B2 |
| 7453769 | Kirschner et al. | Nov 2008 | B2 |
| 7542376 | Thompson et al. | Jun 2009 | B1 |
| 7542377 | Kirschner et al. | Jun 2009 | B2 |
| 7606114 | Bachelor et al. | Oct 2009 | B2 |
| 7847925 | Vogt | Dec 2010 | B2 |
| 7852709 | Lerro et al. | Dec 2010 | B1 |
| 7889600 | Thompson et al. | Feb 2011 | B2 |
| 7957609 | Lu et al. | Jun 2011 | B2 |
| 8254208 | Vogt | Aug 2012 | B2 |
| 8300499 | Coleman et al. | Oct 2012 | B2 |
| 8345511 | Rikoski | Jan 2013 | B1 |
| 8514659 | Vogt | Aug 2013 | B2 |
| 8605550 | Maguire | Dec 2013 | B2 |
| 8638362 | Thompson et al. | Jan 2014 | B1 |
| 8645012 | Salmon et al. | Feb 2014 | B2 |
| 8717847 | Blake | May 2014 | B2 |
| 8761976 | Salmon et al. | Jun 2014 | B2 |
| 8811120 | Bachelor et al. | Aug 2014 | B2 |
| 8814795 | Derode et al. | Aug 2014 | B2 |
| 8940312 | Hayashi et al. | Jan 2015 | B2 |
| RE45379 | Rowe | Feb 2015 | E |
| 8964507 | Bachelor et al. | Feb 2015 | B2 |
| 9132900 | Salmon et al. | Sep 2015 | B2 |
| 9135731 | Lauenstein et al. | Sep 2015 | B2 |
| 9182486 | Brown et al. | Nov 2015 | B2 |
| RE45823 | Vogt | Dec 2015 | E |
| 9218799 | Stytsenko et al. | Dec 2015 | B2 |
| 9322915 | Betts et al. | Apr 2016 | B2 |
| 9386964 | Bagge | Jul 2016 | B2 |
| 9664783 | Brown et al. | May 2017 | B2 |
| 9739884 | Proctor et al. | Aug 2017 | B2 |
| 9766328 | Black et al. | Sep 2017 | B2 |
| 9784825 | Brown et al. | Oct 2017 | B2 |
| 9784826 | Matson et al. | Oct 2017 | B2 |
| 9784832 | Proctor et al. | Oct 2017 | B2 |
| 9812118 | Matson et al. | Nov 2017 | B2 |
| 9840312 | Clark | Dec 2017 | B1 |
| 9846232 | Thompson et al. | Dec 2017 | B1 |
| 9947309 | Stokes et al. | Apr 2018 | B2 |
| 10012731 | Pelin et al. | Jul 2018 | B2 |
| 10019002 | Harnett et al. | Jul 2018 | B2 |
| 10067228 | Steenstrup et al. | Sep 2018 | B1 |
| 10107908 | Betts et al. | Oct 2018 | B2 |
| 10114119 | Horner et al. | Oct 2018 | B2 |
| 10197674 | Thompson et al. | Feb 2019 | B2 |
| 10215849 | Kozuki | Feb 2019 | B2 |
| 10241200 | Sayer et al. | Mar 2019 | B2 |
| 10247832 | Serafino et al. | Apr 2019 | B2 |
| 10310062 | Coleman et al. | Jun 2019 | B2 |
| 10311715 | Jopling | Jun 2019 | B2 |
| 10365356 | Stokes et al. | Jul 2019 | B2 |
| 10365366 | Lauenstein | Jul 2019 | B2 |
| 10408933 | DeHart et al. | Sep 2019 | B1 |
| 10502820 | Zimmerman et al. | Dec 2019 | B2 |
| 10513322 | Clark | Dec 2019 | B2 |
| 10514451 | Brown et al. | Dec 2019 | B2 |
| 10545226 | Wigh et al. | Jan 2020 | B2 |
| 10545235 | Pelin et al. | Jan 2020 | B2 |
| 10605913 | Coleman et al. | Mar 2020 | B2 |
| 10843781 | Clark | Nov 2020 | B2 |
| 10852429 | Gatland | Dec 2020 | B2 |
| 10890660 | Wigh | Jan 2021 | B2 |
| 10914810 | Laster et al. | Feb 2021 | B2 |
| 11059556 | Ahlgren | Jul 2021 | B2 |
| 11125866 | Sumi et al. | Sep 2021 | B2 |
| 11220317 | Clark | Jan 2022 | B2 |
| 11249176 | Hooper | Feb 2022 | B2 |
| 11367425 | Antao | Jun 2022 | B2 |
| 11435427 | Laster et al. | Sep 2022 | B2 |
| 11500054 | Clark | Nov 2022 | B2 |
| 11585921 | Proctor | Feb 2023 | B2 |
| 11796661 | Caspall | Oct 2023 | B2 |
| 11921200 | Clark | Mar 2024 | B1 |
| 11971478 | Combs | Apr 2024 | B2 |
| 20030235112 | Zimmerman et al. | Dec 2003 | A1 |
| 20050007882 | Bachelor et al. | Jan 2005 | A1 |
| 20070159922 | Zimmerman et al. | Jul 2007 | A1 |
| 20080130413 | Bachelor | Jun 2008 | A1 |
| 20090037040 | Salmon et al. | Feb 2009 | A1 |
| 20100067330 | Collier et al. | Mar 2010 | A1 |
| 20100074057 | Bachelor et al. | Mar 2010 | A1 |
| 20100284248 | Wang et al. | Nov 2010 | A1 |
| 20110013485 | Maguire | Jan 2011 | A1 |
| 20130021876 | Maguire | Jan 2013 | A1 |
| 20140013270 | Thomas et al. | Jan 2014 | A1 |
| 20140013276 | Butterworth | Jan 2014 | A1 |
| 20140050051 | Vogt | Feb 2014 | A1 |
| 20140071059 | Girault | Mar 2014 | A1 |
| 20140092709 | Miller et al. | Apr 2014 | A1 |
| 20140096060 | Thomas et al. | Apr 2014 | A1 |
| 20140258935 | Nishida et al. | Sep 2014 | A1 |
| 20140336854 | Salmon et al. | Nov 2014 | A1 |
| 20150142211 | Shehata et al. | May 2015 | A1 |
| 20160054733 | Hollida et al. | Feb 2016 | A1 |
| 20160061951 | Brown et al. | Mar 2016 | A1 |
| 20160214715 | Meffert | Jul 2016 | A1 |
| 20160259049 | Proctor et al. | Sep 2016 | A1 |
| 20160259050 | Proctor et al. | Sep 2016 | A1 |
| 20160259051 | Proctor et al. | Sep 2016 | A1 |
| 20160259052 | Kirmani | Sep 2016 | A1 |
| 20160306040 | Hunt et al. | Oct 2016 | A1 |
| 20160341827 | Horner et al. | Nov 2016 | A1 |
| 20170031022 | Ivanov | Feb 2017 | A1 |
| 20170031023 | Ivanov | Feb 2017 | A1 |
| 20170038344 | Capus et al. | Feb 2017 | A1 |
| 20170212230 | Wigh | Jul 2017 | A1 |
| 20170235308 | Gordon et al. | Aug 2017 | A1 |
| 20170242113 | Lauenstein | Aug 2017 | A1 |
| 20170363739 | Lauenstein | Dec 2017 | A1 |
| 20170371039 | Clark et al. | Dec 2017 | A1 |
| 20180100922 | Wigh | Apr 2018 | A1 |
| 20180107210 | Harnett et al. | Apr 2018 | A1 |
| 20180224544 | Ivanov | Aug 2018 | A1 |
| 20180275649 | Harnett et al. | Sep 2018 | A1 |
| 20180288990 | Laster et al. | Oct 2018 | A1 |
| 20190079185 | Steenstrup et al. | Mar 2019 | A1 |
| 20190088239 | Antao | Mar 2019 | A1 |
| 20190113619 | Laster | Apr 2019 | A1 |
| 20190176952 | Clark | Jun 2019 | A1 |
| 20190176953 | Clark | Jun 2019 | A1 |
| 20190235075 | Thompson et al. | Aug 2019 | A1 |
| 20190242994 | Wanis et al. | Aug 2019 | A1 |
| 20190265354 | Antao et al. | Aug 2019 | A1 |
| 20190353744 | Laster et al. | Nov 2019 | A1 |
| 20200011965 | Stokes et al. | Jan 2020 | A1 |
| 20200011981 | Stokes et al. | Jan 2020 | A1 |
| 20200070943 | Clark | Mar 2020 | A1 |
| 20200072953 | Wigh | Mar 2020 | A1 |
| 20200088840 | Stokes et al. | Mar 2020 | A1 |
| 20200103512 | Brown et al. | Apr 2020 | A1 |
| 20200116843 | Zimmerman et al. | Apr 2020 | A1 |
| 20200158842 | Wigh et al. | May 2020 | A1 |
| 20200241133 | Laster | Jul 2020 | A1 |
| 20200256967 | Wigh | Aug 2020 | A1 |
| 20200300994 | Matson et al. | Sep 2020 | A1 |
| 20210096244 | Wigh | Apr 2021 | A1 |
| 20210141048 | Laster et al. | May 2021 | A1 |
| 20210165068 | Clark | Jun 2021 | A1 |
| 20210173061 | Fyler et al. | Jun 2021 | A1 |
| 20210263150 | Stokes | Aug 2021 | A1 |
| 20210364636 | Simonton | Nov 2021 | A1 |
| 20210389439 | Sumi et al. | Dec 2021 | A1 |
| 20220035026 | Proctor | Feb 2022 | A1 |
| 20220035027 | Proctor | Feb 2022 | A1 |
| 20220089267 | Clark | Mar 2022 | A1 |
| 20220113393 | Nishimori et al. | Apr 2022 | A1 |
| 20220120882 | Coleman et al. | Apr 2022 | A1 |
| 20220373663 | Caspall | Nov 2022 | A1 |
| 20220373678 | Combs | Nov 2022 | A1 |
| 20220381906 | Combs | Dec 2022 | A1 |
| 20220390542 | Clark | Dec 2022 | A1 |
| 20220404491 | Caspall et al. | Dec 2022 | A1 |
| 20230111196 | Proctor | Apr 2023 | A1 |
| 20230143089 | Pendergraft | May 2023 | A1 |
| 20240061105 | Clark | Feb 2024 | A1 |
| Number | Date | Country |
|---|---|---|
| 2004258175 | Sep 2009 | AU |
| 2010273842 | Feb 2012 | AU |
| 2009283312 | Jun 2015 | AU |
| 2015201220 | Feb 2017 | AU |
| 2019213353 | Aug 2019 | AU |
| 2019203322 | Dec 2019 | AU |
| 2019203322 | Dec 2020 | AU |
| 2019283811 | Sep 2021 | AU |
| 2021229158 | Sep 2021 | AU |
| 2022202065 | Dec 2022 | AU |
| 2022203017 | Dec 2022 | AU |
| 2021229158 | Jun 2023 | AU |
| 2022203017 | Aug 2023 | AU |
| 2023270184 | Dec 2023 | AU |
| 2023263507 | Jul 2024 | AU |
| 2530290 | Nov 2015 | CA |
| 2899119 | Jan 2017 | CA |
| 2928461 | Jan 2017 | CA |
| 2993361 | Feb 2017 | CA |
| 3032163 | Aug 2019 | CA |
| 3042656 | Nov 2019 | CA |
| 3121772 | Sep 2023 | CA |
| 3221275 | Jun 2024 | CA |
| 3223790 | Jun 2024 | CA |
| 105759257 | Jul 2016 | CN |
| 110493698 | Nov 2019 | CN |
| 202010018577 | Nov 2017 | DE |
| 202010018565 | Dec 2017 | DE |
| 1925949 | May 2008 | EP |
| 2294452 | Dec 2011 | EP |
| 2612165 | Jul 2013 | EP |
| 3084467 | Oct 2016 | EP |
| 3144700 | Mar 2017 | EP |
| 1656568 | Dec 2017 | EP |
| 3325997 | May 2018 | EP |
| 3479138 | May 2019 | EP |
| 3572837 | Nov 2019 | EP |
| 2326970 | Oct 2020 | EP |
| 2956796 | Apr 2022 | EP |
| 4009069 | Jun 2022 | EP |
| 4043915 | Aug 2022 | EP |
| 4092444 | Nov 2022 | EP |
| 4386440 | Jun 2024 | EP |
| 4393811 | Jul 2024 | EP |
| 2004-080577 | Mar 2004 | JP |
| 2007-535195 | Nov 2007 | JP |
| 2008-508539 | Mar 2008 | JP |
| 2010-261883 | Nov 2010 | JP |
| 5600678 | Oct 2014 | JP |
| 5688197 | Mar 2015 | JP |
| 2016-510106 | Apr 2016 | JP |
| 2017-227489 | Dec 2017 | JP |
| 6444319 | Dec 2018 | JP |
| 2019-030623 | Feb 2019 | JP |
| 2020-039841 | Mar 2020 | JP |
| 6732274 | Jul 2020 | JP |
| 6737464 | Aug 2020 | JP |
| 2020-141250 | Sep 2020 | JP |
| 2020-155900 | Sep 2020 | JP |
| 200184719 | Jun 2000 | KR |
| 20160121915 | Oct 2016 | KR |
| 133285 | Oct 2013 | RU |
| 9409605 | Apr 1994 | WO |
| 1997004334 | Feb 1997 | WO |
| 2005008272 | Jan 2005 | WO |
| 2006017511 | Feb 2006 | WO |
| 2011008430 | Jan 2011 | WO |
| 2012028896 | Mar 2012 | WO |
| 2013126761 | Aug 2013 | WO |
| 2014126847 | Aug 2014 | WO |
| 2014144471 | Sep 2014 | WO |
| 2016205938 | Dec 2016 | WO |
| 2017015741 | Feb 2017 | WO |
| 2018201097 | Nov 2018 | WO |
| 2018222556 | Dec 2018 | WO |
| 2019050552 | Mar 2019 | WO |
| 2020174640 | Sep 2020 | WO |
| 2021019858 | Feb 2021 | WO |
| 2021127592 | Jun 2021 | WO |
| 2021176726 | Sep 2021 | WO |
| 2021220377 | Nov 2021 | WO |
| Entry |
|---|
| European Examination Report issued in Application No. 19174327.7 dated Mar. 7, 2023. |
| Thompson et al.; “Two Dimensional and Three Dimensional Imaging Results Using Blazed Arrays;” MTS/IEEE Oceans 2001. An Ocean Odyssey. Conference Proceedings (IEEE Cat. No. 01CH37295); Nov. 5-8, 2001; pp. 985-988. |
| SeaBotix—Underwater Remotely Operated Vehicles (ROVs); ADS, Inc.; YouTube; 2014 video (mentioning SmartFlight); retrieved Jul. 29, 2020 from https://www.youtube.com/watch?v=hkqJh5j6eQA. |
| SmartFlight 2.0 video; retrieved Jul. 29, 2020 from http://www.teledynemarine.com/smartflight2-0?ProductLineID=112. |
| “Garmin Marine Webinars: Panoptix LiveScope Installation and Setup;” YouTube; Apr. 6, 2020; retrieved Jan. 12, 2021 from https://www.youtube.com/watch?v=Z2AiSOmX5PA. |
| Nov. 26, 2021 Extended European Search Report issued in European Patent Application No. 21177698.4; 8 pp. |
| RyTek Marine (Apr. 6, 2022). Seeing double isn't always a bad thing . . . ; retrieved Sep. 30, 2022 from https://www.facebook.com/RyTekMarine. |
| “A Review of Acoustic Impedance Matching Techniques for Piezoelectric Sensors and Transducers;” Sensors; vol. 20; no. 14; Jul. 21, 2020; DOI: https://doi.org/10.3390/s20144051. |
| “Active Target ‘Scout Only’ Transducer Mount Combo;” RyTek Marine; retrieved Aug. 10, 2022 from https://rytekmarine.com/collections/lowrance-activetarget/products/active-target-scout-only-transducer-mount-combo. |
| WASSP Multibeam; retrieved May 17, 2018 from <https://wassp.com/video/26/WASSP-S3-Demo-WEB.mp4>. |
| Ellison, Ben; Panbo; The Marine Electronics Hub; “Garmin BlueChart g2 & g2 Vision 2010, lots new?” Mar. 16, 2010; retrieved from <https://www.panbo.com/garmin-bluechart-g2-g2-vision-2010-lots-new>. |
| Ellison, Ben; Panbo; The Marine Electronics Hub; “Maptech i3 fishfinder road trip” Jun. 15, 2005; retrieved from <https://www.panbo.com/maptech-i3-fishfinder-road-trip>. |
| “Teledyne SeaBotix—SmartFlight 2.0”; YouTube; Apr. 13, 2018; retrieved from https://www.youtube.com/watch?v=xFJ2OCKIXwc. |
| “SAMM”; Oceanic Imaging Consultants; retrieved Feb. 12, 2021 from https://www.geomatrix.co.uk/software/oceanographic-and-hydrographic/samm/. |
| ADS, Inc.; “SeaBotix—Underwater Remotely Operated Vehicles (ROVs)”; YouTube, Jul. 16, 2014; retrieved from https://www.youtube.com/watch?v=hkqJh5j6eQA. |
| Teledyne Marine; “SmartFlight 2.0 powered by Greensea”; retrieved Jun. 19, 2019 from http://www.teledynemarine.com/smartflight2-0?ProductLineID=112. |
| “LED Programmable Message Pocket Fan & Rave Toy”; retrieved Jan. 31, 2019 from https://www.amazon.com/LED-Programmable-Message-Pocket-Rave/dp/B002FWOYG2. |
| “AguaDrone—The World's First Drone with a Fish Finder!”; website visited Oct. 25, 2016 (10 pgs.); https://www.aguadrone.com/. |
| “AeroKontiki—Introducing the world's first autopilot fishing drone kontiki”; website visited Oct. 25, 2016 (4 pgs.); http://www.aerokontiki.com/. |
| “DIY Drones—The Leading Community for Personal UAVs—Home”; website visited Oct. 25, 2016 (9 pgs.); www.diydrones.com. |
| “DIY Drones—The Leading Community for Personal UAVs—My Blog: Automated Precision Landing on a (stationary) Boat”; website visited Oct. 25, 2016 (6 pgs.); www.diydrones.com/profiles/blogs/automated-precision-landing-on-a-stationary-boat. |
| Number | Date | Country |
|---|---|---|
| 20220390542 A1 | Dec 2022 | US |
| | Number | Date | Country |
|---|---|---|---|
| Parent | 15982362 | May 2018 | US |
| Child | 17123189 | | US |
| | Number | Date | Country |
|---|---|---|---|
| Parent | 17174415 | Feb 2021 | US |
| Child | 17891354 | | US |
| Parent | 17123189 | Dec 2020 | US |
| Child | 17174415 | | US |