Sonar mapping system

Information

  • Patent Grant
  • 10545235
  • Patent Number
    10,545,235
  • Date Filed
    Tuesday, November 1, 2016
  • Date Issued
    Tuesday, January 28, 2020
Abstract
A sonar mapping system that includes a sonar transducer assembly configured for mounting on a watercraft, and a display configured to show a topographical chart of a body of water. The sonar mapping system further includes a processor coupled to the sonar transducer assembly and display. The processor is configured to create the topographical chart in real time, and to update the topographical chart in real time, based on sonar data provided by the sonar transducer assembly. The processor is configured to render the created or updated topographical chart on the display. The sonar mapping system has memory accessible by the processor and configured to store the topographical chart rendered by the processor, and to store the sonar data provided by the sonar transducer assembly. The sonar data includes information indicative of vegetation present on a lakebed, seabed, or riverbed surface, the information being displayed on the topographical chart.
Description
FIELD OF THE INVENTION

This invention generally relates to a sonar mapping system.


BACKGROUND OF THE INVENTION

Sonar transducer assemblies are sometimes mounted on the hulls of watercraft for various purposes, fish finding for example. U.S. Patent Publication No. 2013/0215719, published on Aug. 22, 2013, discloses a system including a sonar transducer assembly, deployed below the bottom of a boat hull, which provides 360-degree sonar imaging, the entire teachings and disclosure of which are incorporated herein by reference thereto. U.S. Patent Publication No. 2014/0269164, published Sep. 18, 2014, discloses a system including a sonar transducer assembly, which provides sonar imaging for a predetermined sector, the entire teachings and disclosure of which are incorporated herein by reference thereto. Various embodiments of a system for sonar imaging are disclosed in the following patents: U.S. Pat. No. 7,652,952 issued on Jan. 26, 2010 to Betts et al.; U.S. Pat. No. 7,710,825 issued on May 4, 2010 to Betts et al.; U.S. Pat. No. 7,729,203 issued on Jun. 1, 2010 to Betts et al.; and U.S. Pat. No. 7,755,974 issued on Jul. 13, 2010 to Betts et al., the entire teachings and disclosures of which are incorporated herein by reference thereto.


It is often advantageous for anglers to have detailed maps or charts of the lakes, rivers, or other bodies of water in which they fish. Charts showing the topography of the lake bed, river bed, or sea bed may inform the angler as to the best location for catching a particular type of fish. Embodiments of the present invention advance the state of the art with respect to the use of sonar transducers on watercrafts in a way that addresses some of the aforementioned needs of anglers.


These and other advantages of the invention, as well as additional inventive features, will be apparent from the description of the invention provided herein.


BRIEF SUMMARY OF THE INVENTION

In one aspect, embodiments of the invention provide a sonar mapping system that includes a sonar transducer assembly configured for mounting on a watercraft, and a display configured to show a topographical chart of a body of water. The sonar mapping system further includes a processor coupled to the sonar transducer assembly and display. The processor is configured to create the topographical chart in real time, and to update the topographical chart in real time, based on sonar data provided by the sonar transducer assembly. The processor is also configured to render the created or updated topographical chart on the display. The sonar mapping system has memory accessible by the processor and configured to store the topographical chart rendered by the processor, and to store the sonar data provided by the sonar transducer assembly. The sonar data includes information indicative of vegetation present on a lakebed, seabed, or riverbed surface, the information being displayed on the topographical chart.


In a particular embodiment of the invention, the topographical chart includes one or more bathymetric tints indicative of the vegetation. In a more particular embodiment, colors of the one or more bathymetric tints are selectable by a user. In certain embodiments, the processor is integrated into the sonar transducer assembly.


In a particular embodiment, the processor is configured to convert the sonar data in real time into topographical data rendered on the display for one of a lakebed, riverbed, and seabed. The processor may be configured to estimate topographical data to fill in missing portions of topographical data adjacent the topographical data gathered via the sonar transducer assembly. In certain embodiments, the topographical data includes one or more contour lines indicative of a water depth. In alternate embodiments, the topographical data includes bathymetric tints indicative of a water depth. The colors of the bathymetric tints may be selectable by a user.


The topographical data may include bathymetric tints indicative of a hardness of the lakebed, riverbed, or seabed surface. The colors of the hardness-indicating bathymetric tints may be selectable by a user. Similarly, the colors of any topographical chart generated by the processor may be selectable by a user.


In some embodiments, a chart for a body of water is stored in the memory, and the processor updates topographical or bathymetric data for the chart based on the sonar data provided by the sonar transducer assembly. In particular embodiments, the processor is configured to update the topographical chart in real time by overwriting stored topographical data with new topographical data acquired and converted from sonar data in real time.


In a particular embodiment, the processor is configured to generate a 3-D rendering, including the vegetation, based on sonar data collected by the sonar transducer assembly, and wherein the 3-D rendering is shown on the display. In some embodiments, a user can save the 3-D rendering in the memory. Different features of the 3-D rendering may be shown in different colors. The colors of the 3-D rendering may be selectable by a user of the sonar mapping system.


In a particular embodiment, the processor is configured to convert the sonar data in real time into topographical data for one of a lakebed, riverbed, and seabed. The topographical data may include one or more contour lines indicative of a water depth, or, alternatively, may include bathymetric tints indicative of a water depth. The sonar mapping system may include a connection for a portable memory device, wherein the processor is configured to access the portable memory device, the portable memory device including at least one of a USB drive, an SD card, optical storage media, and magnetic storage media.


In another aspect, embodiments of the invention provide a sonar mapping system that includes a sonar transducer assembly configured for mounting on a watercraft, and configured to provide sonar data for a 360-degree area surrounding the watercraft, or for a portion of a 360-degree area using a sector-scanning device, and a display configured to show underwater images based on data from the sonar transducer assembly. The sonar mapping system also includes a processor coupled to the sonar transducer assembly and to the display. The processor is configured to convert sonar data from the sonar transducer assembly into the underwater images rendered on the display. The underwater images include images of vegetation present on a lakebed, seabed, or riverbed surface. The processor is also configured to overlay the underwater images, in real time, onto a previously-stored chart for a body of water, or to create a new chart, in real time, that includes the underwater images. The sonar mapping system also includes memory accessible by the processor. The processor is configured to store, in the memory, the new chart with underwater images or the previously-stored chart with overlaid underwater images.


In a particular embodiment, the underwater images are shown on the display as bathymetric tints in which different features of the underwater images are represented by a plurality of colors. In more particular embodiments, the vegetation is represented by at least one of the plurality of colors, and at least one color of the plurality of colors is selectable by a user of the sonar mapping system.


Other aspects, objectives and advantages of the invention will become more apparent from the following detailed description when taken in conjunction with the accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings incorporated in and forming a part of the specification illustrate several aspects of the present invention and, together with the description, serve to explain the principles of the invention. In the drawings:



FIG. 1 is a plan view of a boat with a transom-mounted sonar transducer assembly, according to an embodiment of the invention;



FIG. 2 is a pictorial illustration of the sonar transducer assembly attached to a trolling motor, according to an embodiment of the invention;



FIG. 3 is a pictorial illustration of the mounting and deployment of the sonar transducer assembly on a trolling motor, according to an embodiment of the invention;



FIG. 4 is an exemplary screenshot of the display for the sonar mapping system, in accordance with an embodiment of the invention;



FIG. 5 is an exemplary screenshot of the display for the sonar mapping system, in accordance with an embodiment of the invention;



FIG. 6 is an exemplary 3-D rendering of an underwater topographical chart, according to an embodiment of the invention; and



FIG. 7 is an exemplary screenshot of the display for the sonar mapping system as it would appear with a 360-degree sonar imaging system, in accordance with an embodiment of the invention.





The accompanying drawings include a number of exemplary charts as they would be displayed on the display of an embodiment of the sonar mapping system.


While the invention will be described in connection with certain preferred embodiments, there is no intent to limit it to those embodiments. On the contrary, the intent is to cover all alternatives, modifications and equivalents as included within the spirit and scope of the invention as defined by the appended claims.


DETAILED DESCRIPTION OF THE INVENTION


FIGS. 1-3 show exemplary embodiments of a sonar transducer system according to embodiments of the invention. FIG. 2 shows a sonar mapping system 200 constructed in accordance with an embodiment of the invention. The sonar mapping system 200 includes a sonar imaging system 100 configured for mounting to various types of watercraft.


The sonar mapping system 200 includes a sonar transducer assembly 106, a control processor 110 and a sonar display. In the embodiment of FIG. 2, the sonar mapping system 200 is installed on a watercraft 104, illustrated as a motorized fishing boat 104. An optional second display 111 may be positioned at one of several different locations on the boat 104. FIG. 2 shows the second display 111 positioned towards the bow of the boat 104. In the embodiment of FIG. 2, the sonar imaging system 100, which includes the sonar transducer assembly 106, is deployed from the rear of the boat 104.


The control processor 110 is coupled to the sonar imaging system 100 and receives sonar data from the sonar transducer assembly 106. The processor is also coupled to a display. In certain embodiments such as illustrated in FIGS. 1-3 and described below, the sonar imaging system 100 can interface with a single control processor 110, or network with multiple control processors 110. The one or more control processors 110 may be integrated into the sonar transducer assembly 106, or may be installed in a control head, or command console, such as shown on the boat 104 of FIG. 2. When integrated in the sonar transducer assembly 106, the modular transducer assembly 106 and control processor 110 may be readily installed in a number of different types of watercraft 104. The sonar imaging system 100 may connect to the processor 110 and display unit via a wired connection, although in other embodiments this communication may take place using wireless technologies, including, but not limited to, Wi-Fi and Bluetooth.


In certain embodiments, the sonar imaging system 100 includes a sonar transducer assembly 106 and one of several possible deployment mechanisms. When the sonar imaging system 100 is connected to the control processor 110, a variety of menus and views can be added to the existing user interface and rendered on the display. While the following describes various embodiments of such a user interface, the examples are provided to demonstrate functionality.



FIG. 1 illustrates a sonar imaging system 100 deployed from the transom 113 of fishing boat 104, in accordance with an embodiment of the invention. In FIG. 1, the sonar imaging system 100 is shown in its retracted state in which the sonar transducer assembly 106 is close to the water line. However, phantom lines are used to show the sonar imaging system 100 in its deployed state, in which the sonar transducer assembly 106 is below the keel 108 of the boat 104. In some embodiments, the depth at which the sonar transducer assembly 106 is deployed is adjustable and set by the user. In the embodiment of FIG. 1, the sonar transducer assembly 106 is attached at the end of a shaft 105 that extends from, and retracts into, a housing 107. The interior of shaft 105 provides a path for cables from the sonar transducer assembly 106 to a control processor 110 (shown in FIG. 2).


The sonar transducer assembly 106 can be deployed in any number of ways, including, but not limited to, automatically based on speed, or locally via user controls on the transducer deployment system. In some embodiments, when the sonar imaging system 100 is in the process of deploying, a message will be displayed stating, for example, “Deploying transducer.” When the sonar transducer assembly 106 reaches the set depth or the current limit, the deploying message will clear.
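
By way of illustration only, the following minimal sketch (in Python) shows one way such deployment logic could be sequenced. The class name, thresholds, and the retract message are hypothetical assumptions and are not drawn from this disclosure.

# Minimal sketch of automatic transducer deployment logic.
# All names, thresholds, and the retract message are illustrative
# assumptions, not part of the patent disclosure.

class TransducerDeployment:
    def __init__(self, set_depth_m=0.5, max_speed_kts=3.0):
        self.set_depth_m = set_depth_m        # user-selected deployment depth
        self.max_speed_kts = max_speed_kts    # retract above this boat speed
        self.state = "retracted"

    def update(self, boat_speed_kts, measured_depth_m, current_limit_hit=False):
        """Advance the deployment state machine one step; return display message."""
        if boat_speed_kts > self.max_speed_kts:
            self.state = "retracting"
            return "Retracting transducer"
        if self.state in ("retracted", "deploying"):
            self.state = "deploying"
            if measured_depth_m >= self.set_depth_m or current_limit_hit:
                self.state = "deployed"
                return ""                     # clear the deploying message
            return "Deploying transducer"
        return ""

deploy = TransducerDeployment(set_depth_m=0.6)
print(deploy.update(boat_speed_kts=1.2, measured_depth_m=0.2))  # Deploying transducer
print(deploy.update(boat_speed_kts=1.2, measured_depth_m=0.6))  # message cleared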



FIG. 3 shows an isolated view of the sonar imaging system 100 attached to the trolling motor 112. The sonar transducer assembly 106 is attached to the end of a shaft 114. In certain embodiments, the shaft 114 for the sonar transducer assembly 106 is coupled to a non-rotating portion of shaft 116 for the trolling motor 112 by a quick connecting clamp 115. In particular embodiments, the position of the sonar transducer assembly 106 is fixed with respect to the trolling motor 112. That is, the sonar transducer assembly 106 does not rotate with the trolling motor shaft 116, instead remaining stationary with respect to the boat. The sonar transducer assembly 106 may be deployed at the bow of the boat 104, from the transom, or through the hull of the boat 104.


In one embodiment of the invention, the sonar imaging system 100 is a sweeping, or scanning, sonar system, also referred to as a 360-degree sonar imaging system. The sweeping/scanning sonar system may be configured to continually rotate the sonar transducer assembly 106 to provide a sonar image that includes a full 360-degree underwater view. Such 360-degree sonar imaging systems may be used to provide a picture-like image of the lake bed, river bed, or sea bed below and around the boat 104. The automatic charting function allows the user to create or update the image for a partial or entire body of water, and to store that image in memory for later recall. In other embodiments, the sonar imaging system 100 uses a sector-scanning device to image a predetermined portion of a 360-degree area.


In alternate embodiments, the sonar imaging system 100 has a stationary transducer arranged to provide 2-D sonar imaging. Though not shown explicitly in the drawings, one of ordinary skill in the art will recognize that the sonar imaging system 100 may be deployed through the hull of the watercraft 104 such that the sonar imaging system 100 extends below the keel of the boat 104 during operation. In some embodiments, this sonar imaging system 100 is designed to extend down from the hull during operation and to retract up against, or into, the hull when not being used.


For example, in particular embodiments, the sonar mapping system 200 may be configured such that the display will show retract and deploy messages. In certain embodiments, all retract and deploy messages are broadcast to any of the one or more control processors 110 that have the sonar transducer assembly 106 selected as one of their sonar sources.


Referring again to FIG. 2, the aforementioned control processor 110 for the sonar mapping system 200 is coupled to the sonar transducer assembly 106, to the display in console 109, and to any additional displays, such as display 111. This coupling could be either wired or wireless depending on the sonar mapping system configuration. The control processor 110 is also coupled to, and able to access, electronic memory (not shown), which is configured to store charts or maps, along with sonar data from the sonar transducer assembly 106. The memory may be located proximate the control processor 110 or the display, in the console 109 for example, or may be located remotely from both the display and the control processor 110. It is envisioned that this memory, accessible to the control processor, includes both fixed and portable forms of memory. Such portable memory includes, but is not limited to, flash memory and solid-state memory, such as thumb drives and SD cards, and may also include optical storage media.



FIGS. 4-5 show exemplary screen shots of the display showing a chart 300 with topographical information provided by the sonar transducer assembly 106 (shown in FIGS. 1-3), in accordance with an embodiment of the invention. These figures also show elements of an exemplary graphical user interface 302 for the sonar mapping system 200 (shown in FIG. 2).



FIG. 4 shows an exemplary illustration of the chart 300 showing the boat 104 location after the automatic mapping function has been initiated. In a particular embodiment, the sonar mapping system 200 is configured to access the chart 300 in memory (not shown). Thus, the user may access, in memory, a desired chart for a body of water 301, for example the body of water on which the user is navigating. When the automatic charting function is operating, the control processor 110 (shown in FIG. 2) is configured to update the chart 300 with topographical data 304 in real time based on sonar data provided by the sonar transducer assembly 106. The control processor 110 is also configured to render the updated chart 300 on the display. The embodiments of FIGS. 4 and 5 show topographical data 304 as it might appear if provided by the sonar imaging system 100 in FIG. 2, for example.


Additionally, the control processor 110 is configured to store these newly created or updated charts 300 in memory for later recall by the user. During each successive use of this chart 300, additional topographical data 304, for instance from an area of the body of water 301 not previously charted, can be added. Furthermore, the topographical data 304 gathered during previous charting sessions can be updated to reflect any changes in the topography of the lake bed, river bed, or sea bed, as the case may be.


In certain embodiments, the chart 300 may include topographical data 304 of the lakebed, seabed, or riverbed of the body of water 301 being navigated. In such a case, the automatic charting feature of the sonar mapping system 200 is configured to update the topographical data 304 in real time. However, it is envisioned that the automatic charting feature would be able to create from scratch a topographical map in real time for the floor of the body of water 301, for example using GPS coordinates, even when there is no available topographical data 304 in memory, or even if there is no chart 300 for the body of water 301 in memory before the automatic charting feature is engaged. Topographical data 304 may be displayed simultaneously, for example overlaid, with sonar imaging data.


The topographical data 304 may be in the form of a bathymetric chart with contour lines 306, as shown in FIGS. 4 and 5, where each contour line 306 indicates the location of a particular water depth for the body of water 301. Alternatively, the topographical data 304 may be in the form of bathymetric tints or shading to indicate various depths in the body of water 301, where the color of the tints changes as the underwater topography progresses from shallow to deep. The display may be configured to show the bathymetric chart with tints and/or contour lines 306 in various colors which are selectable by the user on the graphical user interface 302. Similarly, it is envisioned that the contour lines 306 may be customized via the graphical user interface 302 such that the contour lines 306 shown on the chart 300 indicate those depths selected by the user.
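
As a minimal sketch of how a water depth could be mapped to a user-selected bathymetric tint, consider the following; the depth breakpoints, palette, and function names are hypothetical assumptions rather than values from this disclosure.

# Minimal sketch of depth-to-tint mapping with a user-selectable palette.
# The palette, depth breakpoints, and function names are illustrative
# assumptions, not taken from the patent.
import bisect

def make_tint_lookup(depth_breaks_m, colors):
    """Return a function mapping a depth (m) to an RGB tint.

    depth_breaks_m -- ascending list of depth boundaries, e.g. [2, 5, 10, 20]
    colors         -- len(depth_breaks_m) + 1 RGB tuples, shallow to deep
    """
    def tint(depth_m):
        return colors[bisect.bisect_left(depth_breaks_m, depth_m)]
    return tint

# A hypothetical user-selected palette (light blue to dark blue).
user_palette = [(200, 230, 255), (150, 200, 255), (90, 150, 230),
                (40, 90, 190), (10, 40, 120)]
tint_for = make_tint_lookup([2.0, 5.0, 10.0, 20.0], user_palette)

print(tint_for(1.3))   # shallow tint
print(tint_for(14.0))  # deeper tint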


In particular embodiments, the control processor 110 (shown in FIG. 2) is also configured to use GPS data to show, on the display, the position of the watercraft 104, on the chart 300 for the body of water 301 being navigated, in relation to established landmarks or in relation to the boundaries of the body of water 301.


As stated above, if the chart 300 for the body of water 301 being navigated does not include topographical data 304, the sonar mapping system 200 can immediately create a topographical chart of the lakebed, riverbed, or seabed being navigated. With the automatic charting feature engaged, the sonar data for a portion of the lakebed, riverbed, or seabed is converted into topographical data by the processor. With a sufficient number of passes on the body of water 301, the entire floor of the body of water 301 can be charted. With each pass, the control processor 110 (shown in FIG. 2) performs a real-time update of the chart for the body of water 301 by adding contour lines 306, numerical displays, or bathymetric tints/shading to show water depths on the chart being displayed.
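
One way the real-time conversion could be organized, purely as an illustrative sketch, is to fold each sounding (boat position plus measured depth) into a gridded depth map from which contour lines or tints are then drawn; the grid layout, cell size, averaging rule, and names below are hypothetical assumptions.

# Minimal sketch of real-time automatic charting: each sonar sounding
# updates a gridded depth map. Grid resolution, names, and the averaging
# rule are illustrative assumptions.

class DepthGrid:
    def __init__(self, cell_size_m=2.0):
        self.cell_size_m = cell_size_m
        self.cells = {}          # (ix, iy) -> (sum_of_depths, count)

    def _cell(self, x_m, y_m):
        return (int(x_m // self.cell_size_m), int(y_m // self.cell_size_m))

    def add_sounding(self, x_m, y_m, depth_m):
        """Fold one sounding into the running average for its grid cell."""
        key = self._cell(x_m, y_m)
        total, count = self.cells.get(key, (0.0, 0))
        self.cells[key] = (total + depth_m, count + 1)

    def depth_at(self, x_m, y_m):
        """Charted depth for a location, or None if not yet surveyed."""
        entry = self.cells.get(self._cell(x_m, y_m))
        if entry is None:
            return None
        total, count = entry
        return total / count

grid = DepthGrid()
grid.add_sounding(10.0, 4.0, depth_m=6.2)   # one pass
grid.add_sounding(10.5, 4.8, depth_m=6.6)   # a later pass, same cell
print(grid.depth_at(10.2, 4.5))             # averaged charted depth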



FIG. 5 shows topographical data 304 in two different spots along the path of travel for the boat 104. This may happen when the automatic charting function is paused for one reason or another. However, if the space between the charted areas is not too great, the control processor 110 may perform an interpolation function to estimate the missing topography between the two charted areas. In this manner, the topographical data 304 of FIG. 5 may be rendered as a close approximation of the actual topographical data 304 as shown in FIG. 4, for example.
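
As an illustrative sketch of such an interpolation, the following fills the gap between two charted stretches along the boat's track by linear interpolation; the one-dimensional treatment, the gap limit, and the function name are hypothetical assumptions, and an actual system could use any suitable interpolation method.

# Minimal sketch of estimating missing depths between two charted areas.
# None marks cells where charting was paused; gaps wider than max_gap
# cells are left unfilled. All names are illustrative assumptions.

def interpolate_gaps(track, max_gap=10):
    """track -- ordered list of charted depths (m) along the boat's path."""
    filled = list(track)
    n = len(filled)
    i = 0
    while i < n:
        if filled[i] is None:
            start = i - 1                      # last known depth before the gap
            end = i
            while end < n and filled[end] is None:
                end += 1                       # first known depth after the gap
            if start >= 0 and end < n and (end - start) <= max_gap + 1:
                d0, d1 = filled[start], filled[end]
                span = end - start
                for j in range(i, end):
                    filled[j] = d0 + (d1 - d0) * (j - start) / span
            i = end
        else:
            i += 1
    return filled

# Two charted stretches with a pause in between.
print(interpolate_gaps([6.0, 6.2, None, None, None, 7.0, 7.1]))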


This same method may be employed to map the hardness, rather than the topography, of the lakebed, riverbed, or seabed. Based on the strength of the sonar signal received by the sonar transducer assembly 106, the control processor 110 can create a chart, a color-coded chart for example, where the colors represent a spectrum of hardness for the lakebed, riverbed, or seabed surface. It is envisioned that, in certain embodiments, the graphical user interface 302 will allow the user to select the colors for this function.
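
A minimal sketch of such a mapping from return strength to a hardness category and a user-selected color follows; the thresholds, category labels, and colors are hypothetical assumptions, not values from this disclosure.

# Minimal sketch of mapping normalized sonar return strength to a
# bottom-hardness category and a user-selectable color. Thresholds,
# labels, and colors are illustrative assumptions.

HARDNESS_BANDS = [
    (0.25, "soft (mud/silt)"),
    (0.55, "medium (sand)"),
    (1.01, "hard (rock/gravel)"),
]

def classify_hardness(return_strength, bands=HARDNESS_BANDS):
    """return_strength -- normalized echo strength in [0, 1]."""
    for upper, label in bands:
        if return_strength < upper:
            return label
    return bands[-1][1]

# A hypothetical user-selected color per hardness category.
user_colors = {
    "soft (mud/silt)": (180, 140, 90),
    "medium (sand)": (230, 210, 150),
    "hard (rock/gravel)": (120, 120, 120),
}

label = classify_hardness(0.72)
print(label, user_colors[label])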


In many cases, the lakebed, riverbed, or seabed surface is covered by varying degrees of vegetation 316 (shown in FIG. 6). In particular embodiments, the control processor 110 is configured to process the signals from the sonar transducer assembly 106 such that vegetation 316 on the lakebed, riverbed, or seabed surface can be distinguished from the lakebed, riverbed, or seabed surface itself, and distinguished from fish or other animals in the water. The control processor 110 is further configured to display the vegetation 316 on the chart 300. Furthermore, in particular embodiments, the control processor 110 is configured to show relative densities of vegetation 316 on the lakebed, riverbed, or seabed surface. The vegetation 316 may be shown in bathymetric tints (in one or more colors) so the color of the vegetation 316, as displayed for the user, is distinct from the color of the bottom surface for the body of water. The display may be a 2-D display or a 3-D rendering shown on the display screen of the user's electronic device. In certain embodiments, the user is able to choose the one or more colors of the vegetation 316 as shown on the user's display.
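
As a purely illustrative sketch of how vegetation returns might be separated from fish returns in a single sonar column, the following uses a simple heuristic (vegetation is echo energy contiguous with the bottom; fish are isolated returns higher in the water column); the heuristic and all thresholds are hypothetical assumptions rather than the processing actually used by the system.

# Minimal sketch of separating bottom, vegetation, and fish returns in one
# sonar column (one ping). Heuristic and thresholds are illustrative
# assumptions.

def classify_column(echo, bottom_index, attach_gap=2, echo_threshold=0.3):
    """echo -- echo strengths from the surface (index 0) down to the bottom.

    Returns (vegetation_height_bins, fish_indices).
    """
    # Vegetation: strong returns contiguous with (attached to) the bottom.
    veg_height = 0
    gap = 0
    i = bottom_index - 1
    while i >= 0 and gap <= attach_gap:
        if echo[i] >= echo_threshold:
            veg_height = bottom_index - i
            gap = 0
        else:
            gap += 1
        i -= 1

    # Fish: strong returns clearly above the vegetation canopy.
    canopy_index = bottom_index - veg_height
    fish = [j for j in range(0, max(canopy_index - attach_gap, 0))
            if echo[j] >= echo_threshold]
    return veg_height, fish

# Index 8 is the bottom return; indices 6-7 are weeds; index 2 is a fish.
echo = [0.0, 0.0, 0.7, 0.0, 0.0, 0.0, 0.45, 0.5, 0.9]
print(classify_column(echo, bottom_index=8))   # (2, [2])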


It is envisioned that the control processor 110 can be configured to show the vegetation 316 on the lakebed, riverbed, or seabed surface while also indicating the hardness of that surface using various means, bathymetric tints being only one such means. Color intensity or special symbols may also be used to distinguish between degrees of surface hardness. Similar means may also be used to distinguish between different types of vegetation.


Further, embodiments of the invention are able to generate and display a 3-D topographical map 308 of a body of water in real time based on the sonar data collected by the sonar transducer. FIG. 6 shows an exemplary rendering of the 3-D topographical map 308 along with the position of the boat. With the appropriate sonar transducer assembly 106, particular embodiments of the invention provide the user with the ability to create or update the 3-D topographical map 308 on the display and to save the created or updated 3-D topographical map 308 to memory. As in the previously-discussed embodiments, the graphical user interface 302 may be configured to allow the user to select the colors for the 3-D topographical map 308.
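
One illustrative way to build such a 3-D rendering is to convert the gridded depth map, optionally with vegetation heights, into a surface mesh of vertices and triangles that a renderer could then draw; the grid layout, vegetation offset, and names in the sketch below are hypothetical assumptions.

# Minimal sketch of turning a gridded depth map into a 3-D surface mesh
# (vertices plus triangles). Layout and names are illustrative assumptions.

def grid_to_mesh(depths, cell_size_m=2.0, veg_heights=None):
    """depths -- 2-D list of water depths (m); veg_heights -- optional 2-D
    list of vegetation heights (m) standing on the bottom."""
    rows, cols = len(depths), len(depths[0])
    vertices = []
    for r in range(rows):
        for c in range(cols):
            z = -depths[r][c]                      # bottom lies below the surface
            if veg_heights is not None:
                z += veg_heights[r][c]             # canopy top, if any
            vertices.append((c * cell_size_m, r * cell_size_m, z))

    triangles = []
    for r in range(rows - 1):
        for c in range(cols - 1):
            i = r * cols + c
            triangles.append((i, i + 1, i + cols))             # upper triangle
            triangles.append((i + 1, i + cols + 1, i + cols))  # lower triangle
    return vertices, triangles

depths = [[5.0, 5.5], [6.0, 6.5]]
verts, tris = grid_to_mesh(depths)
print(len(verts), "vertices,", len(tris), "triangles")   # 4 vertices, 2 triangles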


The control processor 110 (shown in FIG. 2) and the graphical user interface 302 are configured such that the user of the sonar mapping system 200 (shown in FIG. 2) can adjust the display in a variety of ways including, but not limited to, transparency level, color, sensitivity, contrast, etc. In a particular embodiment of the invention, the user can select from one of several drawing modes where the sonar data from the sonar transducer assembly 106 overwrites the original chart data, overwrites previously-acquired sonar data, or is blended with original or previously-acquired sonar data. The control processor 110 may also include a feature in which sonar data with greater intensity replaces sonar data with lower intensity.
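
A minimal sketch of these drawing modes expressed as per-cell merge policies follows; the mode names, the blend weight, and the cell layout (one value plus one intensity per chart cell) are hypothetical assumptions.

# Minimal sketch of per-cell merge policies for the drawing modes described
# above. Mode names, blend weight, and data layout are illustrative
# assumptions.

def merge_cell(stored, incoming, mode="blend", blend_weight=0.5):
    """stored / incoming -- (value, intensity) tuples for one chart cell."""
    if stored is None:
        return incoming
    if mode == "overwrite":
        # New sonar data always replaces what is already on the chart.
        return incoming
    if mode == "blend":
        value = (1 - blend_weight) * stored[0] + blend_weight * incoming[0]
        return (value, max(stored[1], incoming[1]))
    if mode == "strongest":
        # Higher-intensity returns take precedence over weaker ones.
        return incoming if incoming[1] > stored[1] else stored
    raise ValueError(f"unknown drawing mode: {mode}")

print(merge_cell((6.0, 0.4), (6.8, 0.9), mode="blend"))      # (6.4, 0.9)
print(merge_cell((6.0, 0.4), (6.8, 0.2), mode="strongest"))  # keeps (6.0, 0.4)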



FIG. 7 is an exemplary screen shot illustrating the look of the display when the automatic charting feature is used with the aforementioned 360-degree sonar imaging system. A chart 310 is created or updated to include 360-degree sonar imaging data 314, which shows underwater images for a 360-degree area surrounding the boat 104 for the body of water 301. In the embodiment of FIG. 7, the 360-degree sonar imaging data 314, which resembles a series of non-concentric circles, may be in the form of bathymetric tints or shading to help indicate various underwater features or objects imaged from the boat 104 in the body of water 301. In some embodiments, the colors of the bathymetric tints are selectable by the user of the sonar mapping system 200 (shown in FIG. 2).


With the automatic charting feature engaged, the sonar data is gathered for some portion of the lakebed, riverbed, or seabed and converted into imaging data by the control processor 110. With a sufficient number of passes on the body of water 301, the entire floor of the body of water 301 can be imaged. With each pass, the control processor 110 (shown in FIG. 2) performs a real-time update of the chart 310. The 360-degree sonar imaging data 314 can be stored in memory for later recall by the user, and can also be displayed simultaneously, for example overlaid, with charts or topographical data previously stored in memory. A similar process could be employed to store sonar imaging data captured using a sector scanning device to capture a portion of a 360-degree area surrounding the boat 104.
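
The overlay step can be pictured, as an illustrative sketch only, as georeferencing each 360-degree sonar sample (bearing and range from the boat) into chart coordinates before writing it onto the stored chart; the flat local coordinate frame and all names below are hypothetical assumptions, and an actual system would work in projected geographic coordinates.

# Minimal sketch of georeferencing 360-degree sonar samples into chart
# coordinates for overlay. Coordinate frame and names are illustrative
# assumptions.
import math

def sample_to_chart(boat_x_m, boat_y_m, boat_heading_deg, bearing_deg, range_m):
    """bearing_deg is measured clockwise from the bow of the boat."""
    absolute_deg = boat_heading_deg + bearing_deg
    # Chart frame: x east, y north; heading measured clockwise from north.
    dx = range_m * math.sin(math.radians(absolute_deg))
    dy = range_m * math.cos(math.radians(absolute_deg))
    return boat_x_m + dx, boat_y_m + dy

def overlay_scan(chart, boat_pose, scan):
    """Write each (bearing, range, intensity) sample into the chart dict."""
    x0, y0, heading = boat_pose
    for bearing_deg, range_m, intensity in scan:
        x, y = sample_to_chart(x0, y0, heading, bearing_deg, range_m)
        chart[(round(x), round(y))] = intensity
    return chart

chart = {}
scan = [(0.0, 10.0, 0.8), (90.0, 5.0, 0.3)]     # dead ahead and to starboard
print(overlay_scan(chart, boat_pose=(100.0, 200.0, 45.0), scan=scan))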


All references, including publications, patent applications, and patents cited herein are hereby incorporated by reference to the same extent as if each reference were individually and specifically indicated to be incorporated by reference and were set forth in its entirety herein.


The use of the terms “a” and “an” and “the” and similar referents in the context of describing the invention (especially in the context of the following claims) is to be construed to cover both the singular and the plural, unless otherwise indicated herein or clearly contradicted by context. The terms “comprising,” “having,” “including,” and “containing” are to be construed as open-ended terms (i.e., meaning “including, but not limited to,”) unless otherwise noted. Recitation of ranges of values herein is merely intended to serve as a shorthand method of referring individually to each separate value falling within the range, unless otherwise indicated herein, and each separate value is incorporated into the specification as if it were individually recited herein. All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g., “such as”) provided herein, is intended merely to better illuminate the invention and does not pose a limitation on the scope of the invention unless otherwise claimed. No language in the specification should be construed as indicating any non-claimed element as essential to the practice of the invention.


Preferred embodiments of this invention are described herein, including the best mode known to the inventors for carrying out the invention. Variations of those preferred embodiments may become apparent to those of ordinary skill in the art upon reading the foregoing description. The inventors expect skilled artisans to employ such variations as appropriate, and the inventors intend for the invention to be practiced otherwise than as specifically described herein. Accordingly, this invention includes all modifications and equivalents of the subject matter recited in the claims appended hereto as permitted by applicable law. Moreover, any combination of the above-described elements in all possible variations thereof is encompassed by the invention unless otherwise indicated herein or otherwise clearly contradicted by context.

Claims
  • 1. A sonar mapping system comprising: a sonar transducer assembly configured for mounting on a watercraft;a display configured to show a topographical chart of a body of water;a processor coupled to the sonar transducer assembly and display, and configured to create the topographical chart in real time, or to update the topographical chart in real time, based on sonar data provided by the sonar transducer assembly, the processor being configured to render the created or updated topographical chart on the display; andmemory accessible by the processor and configured to store the topographical chart rendered by the processor, and to store the sonar data provided by the sonar transducer assembly;wherein the sonar data includes information indicative of vegetation present on a lakebed, seabed, or riverbed surface, the information being displayed on the topographical chart.
  • 2. The sonar mapping system of claim 1, wherein the topographical chart includes one or more bathymetric tints indicative of the vegetation.
  • 3. The sonar mapping system of claim 2, wherein colors of the one or more bathymetric tints are selectable by a user.
  • 4. The sonar mapping system of claim 1, wherein the processor is configured to estimate topographical data to fill in missing portions of topographical data adjacent the topographical data gathered via the sonar transducer assembly.
  • 5. The sonar mapping system of claim 1, wherein the topographical chart includes one or more contour lines indicative of a water depth.
  • 6. The sonar mapping system of claim 1, wherein the topographical chart includes bathymetric tints indicative of a water depth.
  • 7. The sonar mapping system of claim 6, wherein colors of the bathymetric tints are selectable by a user.
  • 8. The sonar mapping system of claim 1, wherein the topographical chart includes bathymetric tints indicative of a hardness of the lakebed, riverbed, or seabed surface.
  • 9. The sonar mapping system of claim 8, wherein colors of the bathymetric tints are selectable by a user.
  • 10. The sonar mapping system of claim 1, wherein a chart for a body of water is stored in the memory; and wherein the processor updates topographical or bathymetric data for the chart based on the sonar data provided by the sonar transducer assembly.
  • 11. The sonar mapping system of claim 1, wherein the processor is configured to generate a 3-D rendering, including the vegetation, based on sonar data collected by the sonar transducer assembly, and wherein the 3-D rendering is shown on the display.
  • 12. The sonar mapping system of claim 11, wherein a user can save the 3-D rendering in the memory.
  • 13. The sonar mapping system of claim 11, wherein different features of the 3-D rendering are shown in different colors.
  • 14. The sonar mapping system of claim 13, wherein colors for the 3-D rendering are selectable by a user of the sonar mapping system.
  • 15. The sonar mapping system of claim 1, further comprising a connection for a portable memory device, wherein the processor is configured to access the portable memory device.
  • 16. The sonar mapping system of claim 1, wherein the processor is configured to update the topographical chart in real time by overwriting stored topographical data with new topographical data acquired and converted from sonar data in real time.
  • 17. The sonar mapping system of claim 1, wherein the processor is integrated into the sonar transducer assembly.
  • 18. A sonar mapping system comprising: a sonar transducer assembly configured for mounting on a watercraft, and configured to provide sonar data for a 360-degree area surrounding the watercraft;a display configured to show underwater images based on data from the sonar transducer assembly;a processor coupled to the sonar transducer assembly and to the display, the processor configured to convert sonar data from the sonar transducer assembly into the underwater images rendered on the display, the underwater images including images of vegetation present on a lakebed, seabed, or riverbed surface, wherein the processor is also configured to overlay the underwater images, in real time, onto a previously-stored chart for a body of water, or to create a new chart, in real time, that includes the underwater images; andmemory accessible by the processor, wherein the processor is configured to store, in the memory, the new chart with underwater images or the previously-stored chart with overlaid underwater images.
  • 19. The sonar mapping system of claim 18, wherein the underwater images are shown on the display as bathymetric tints in which different features of the underwater images are represented by a plurality of colors.
  • 20. The sonar mapping system of claim 19, wherein the vegetation is represented by at least one of the plurality of colors, and wherein at least one of the plurality of colors is selectable by a user of the sonar mapping system.
US Referenced Citations (165)
Number Name Date Kind
3548370 Hoxsie Dec 1970 A
3683324 Hoxsie Aug 1972 A
3721124 Franks Mar 1973 A
3740705 Lowrance Jun 1973 A
3747053 Austin Jul 1973 A
3747413 Barrett et al. Jul 1973 A
3752431 McBride Aug 1973 A
3781777 Lowrance Dec 1973 A
3782170 Cramer Jan 1974 A
3797448 Cramer Mar 1974 A
3808731 Lowrance May 1974 A
3835447 Lowrance Sep 1974 A
3845928 Barrett et al. Nov 1974 A
3946295 Moore Mar 1976 A
D243589 Moore Mar 1977 S
D244434 Moore May 1977 S
4110727 Kriege Aug 1978 A
4186372 Maloy Jan 1980 A
4189702 Maloy Feb 1980 A
4322827 Weber Mar 1982 A
4369508 Weber Jan 1983 A
4420824 Weber Dec 1983 A
4456210 McBride Jun 1984 A
4480809 Healey Nov 1984 A
D278690 Steensland et al. May 1985 S
4590679 Livings et al. May 1986 A
4612633 Weber Sep 1986 A
4809242 Oka Feb 1989 A
4862819 Fawcett Sep 1989 A
4879697 Lowrance et al. Nov 1989 A
4894922 Lovelock Jan 1990 A
4907208 Lowrance et al. Mar 1990 A
4938165 Williams et al. Jul 1990 A
5109364 Stiner Apr 1992 A
D329615 Stiner Sep 1992 S
D329616 Stiner Sep 1992 S
5235927 Singh et al. Aug 1993 A
5313397 Singh et al. May 1994 A
5327398 Wansley et al. Jul 1994 A
5491636 Robertson et al. Feb 1996 A
5537380 Sprankle, Jr. et al. Jul 1996 A
D373568 Bloom et al. Sep 1996 S
5574700 Chapman Nov 1996 A
D380664 Currier et al. Jul 1997 S
5697319 Steensland et al. Dec 1997 A
D390092 Currier et al. Feb 1998 S
5771205 Currier et al. Jun 1998 A
5805528 Hamada Sep 1998 A
5852589 Wilson et al. Dec 1998 A
5865403 Covell Feb 1999 A
5887376 Currier et al. Mar 1999 A
5930200 Kabel Jul 1999 A
6108269 Kabel Aug 2000 A
6335905 Kabel Jan 2002 B1
6345179 Wiegers et al. Feb 2002 B1
6377516 Whiteside et al. Apr 2002 B1
6466514 Kabel Oct 2002 B1
6493894 Whiteside et al. Dec 2002 B1
6529381 Schoenfish Mar 2003 B1
6650884 Wiegers et al. Nov 2003 B1
6678589 Robertson et al. Jan 2004 B2
6687138 Poindexter Feb 2004 B1
6687615 Krull et al. Feb 2004 B1
6693586 Walters et al. Feb 2004 B1
6700787 Beseth et al. Mar 2004 B1
6703998 Kabel et al. Mar 2004 B1
6708112 Beesley et al. Mar 2004 B1
6711478 Hilb Mar 2004 B2
6721651 Minelli Apr 2004 B1
6735542 Burgett et al. May 2004 B1
6745115 Chen et al. Jun 2004 B1
6751552 Minelli Jun 2004 B1
6768450 Walters et al. Jul 2004 B1
6778388 Minelli Aug 2004 B1
6789012 Childs et al. Sep 2004 B1
6795770 Hanshew et al. Sep 2004 B1
6798378 Walters Sep 2004 B1
6798673 Poindexter Sep 2004 B1
6801854 Pemble et al. Oct 2004 B1
6801855 Walters et al. Oct 2004 B1
6809657 Parker et al. Oct 2004 B1
6809940 Poindexter Oct 2004 B1
6810322 Lai Oct 2004 B2
6816782 Walters et al. Nov 2004 B1
6819549 Lammers-Meis et al. Nov 2004 B1
6822402 Poindexter Nov 2004 B1
6833851 Brunk Dec 2004 B1
6839624 Beesley et al. Jan 2005 B1
6839631 Pemble et al. Jan 2005 B1
6844845 Whiteside et al. Jan 2005 B1
6845318 Moore et al. Jan 2005 B1
6845320 Tompkins et al. Jan 2005 B2
6845323 Beason et al. Jan 2005 B1
6847890 Childs et al. Jan 2005 B1
6850188 Lee et al. Feb 2005 B1
6850844 Walters et al. Feb 2005 B1
6853955 Burrell et al. Feb 2005 B1
6856274 Johnson Feb 2005 B1
6856893 Beesley et al. Feb 2005 B2
6856898 Tompkins et al. Feb 2005 B1
6856899 Krull et al. Feb 2005 B2
6856900 Childs et al. Feb 2005 B1
6871138 Minelli Mar 2005 B1
6871144 Lee Mar 2005 B1
6879114 Jales et al. Apr 2005 B2
6882932 Tompkins et al. Apr 2005 B2
6892135 Krull et al. May 2005 B1
6898525 Minelli May 2005 B1
6899562 Ruff et al. May 2005 B1
6909946 Kabel et al. Jun 2005 B1
6927983 Beseth et al. Aug 2005 B1
6934657 Carlson et al. Aug 2005 B1
6943771 Kabel et al. Sep 2005 B2
6950372 Sogaard Sep 2005 B2
D518396 Jopling Apr 2006 S
7062374 Walters et al. Jun 2006 B1
7063297 Jopling Jun 2006 B2
7106657 Sogaard Sep 2006 B2
7230882 Swisher Jun 2007 B2
7236426 Turner et al. Jun 2007 B2
7268703 Kabel et al. Sep 2007 B1
7298320 Whiteside et al. Nov 2007 B1
D565077 Sakamaki et al. Mar 2008 S
D565977 Ross et al. Apr 2008 S
7386374 Orf et al. Jun 2008 B1
7430461 Michaels Sep 2008 B1
7441189 Michaels Oct 2008 B2
7520481 Jopling Apr 2009 B2
7543241 Brunk Jun 2009 B1
7602302 Hokuf et al. Oct 2009 B2
7610148 Walters et al. Oct 2009 B1
7646329 Britton et al. Jan 2010 B2
7652952 Betts et al. Jan 2010 B2
7710825 Betts May 2010 B2
7729203 Betts et al. Jun 2010 B2
7729684 Straub Jun 2010 B1
7755974 Betts et al. Jul 2010 B2
7787857 Peterman Aug 2010 B2
7825858 Blessing et al. Nov 2010 B2
7889085 Downey et al. Feb 2011 B2
7973705 Cunning et al. Jul 2011 B2
8224562 Katzer Jul 2012 B2
8291757 Johnson et al. Oct 2012 B2
8300499 Coleman et al. Oct 2012 B2
8301714 Johnson et al. Oct 2012 B2
8305840 Maguire Nov 2012 B2
8514658 Maguire Aug 2013 B2
8624776 Jales et al. Jan 2014 B2
20060278789 Jopling Dec 2006 A1
20080191935 Tidwell Aug 2008 A1
20080192575 Coleman Aug 2008 A1
20080195242 Tidwell Aug 2008 A1
20080195313 Coleman Aug 2008 A1
20090122647 Betts May 2009 A1
20100117923 Storz May 2010 A1
20100199225 Coleman et al. Aug 2010 A1
20110013484 Coleman et al. Jan 2011 A1
20110013485 Maguire Jan 2011 A1
20130215719 Betts et al. Aug 2013 A1
20140032479 Lauenstein et al. Jan 2014 A1
20140269164 Betts et al. Sep 2014 A1
20150285909 Pelin Oct 2015 A1
20170071017 Klemans Mar 2017 A1
20170123037 Kim May 2017 A1
20180120431 Pelin May 2018 A1
Foreign Referenced Citations (2)
Number Date Country
3315993 May 2018 EP
20170052377 May 2017 KR
Non-Patent Literature Citations (12)
Entry
Wassp Multibeam Navigator, Jan. 2015, pp. 1-38. (Year: 2015).
MaxSea; MaxSea Marine Navigation Software; internet publication printed Feb. 27, 2015; 3 pages; www.maxsea.com/products/modules/pbg.
MaxSea; Getting Started with MaxSea 2D/3D/PBG; installation instructions; 3 pages; known as of the filing date of the application (Apr. 4, 2015).
Furuno; Furuno DFF1-UHD TruEcho CHIRP Fish Finder; internet brochure http://www.furunousa.com/ProductDocuments/DFF1-UHD_Brochure.pdf; known at least as of the filing date of the application (Apr. 2, 2015); 2 pages.
DrDepth—MVP; DrDepth Sea bottom mapping software; pages printed from the internet; 2017; 4 pages; http://mob.drdepth2.se/mvphelp.php.
DrDepth—Nomad; DrDepth Sea bottom mapping software; pages printed from the internet; 2017; 4 pages; http://mob.drdepth2.se/nomadhelp.php.
Download DrDepth Nomad APK 1.9.9—Bypass Region-Lock; DrDepth Nomad; pages printed from the internet; 2017; 6 pages http://m.downloadatoz.com/drdepth-nonnad/com.drdepth.drdepthnomad/.
DrDepth Nomad—Android Informer. DrDepth Nomad is an Extended Version of DrDepth . . . ; DrDepth Nomad; pages printed from the internet; 2017; 2 pages; http://drdepth-nomad.android.informer.com/.
Press release—“WASSP Product Announcement”, WASSP Ltd., Auckland, New Zealand, Jan. 14, 2014, available at http://wassp.com/news-events/wassp-goes-wireless-with-new-remote-mapping-system/.
Video; “Wassp Wireless” uploaded Dec. 22, 2013: https://www.youtube.com/watch?v=J6CTyIHzuFE.
Video; “Humminbird 360 Imaging” uploaded Feb. 23, 2012: https://www.youtube.com/watch?v=bsOGUx7O3nk.
Sarah Mielke, “Aquatic Vegetation Density Mapping—BioBase 2015 Report”, internet article URL: https://www.plslwd.org/wp-content/uploads/2016/07/The-Biobase-Report.pdf, retrieved on Mar. 13, 2018; 39 pages.
Related Publications (1)
Number Date Country
20180120431 A1 May 2018 US