The present invention relates to a system and a method for an augmented reality boating and fishing application, and more particularly to a system and a method for creating the illusion of seeing through water to discern the landscape or topobathy of a lake or ocean to guide anglers and boaters.
Typical marine electronic equipment provides GPS 2D-maps, sonar detectors, weather applications, environmental parameter sensors, position and orientation sensors, and navigation applications, among others. Boaters may also use their mobile phones to access GPS 2D-map data, fish-finding applications, weather applications, environmental parameters, and position and orientation data, among others. However, in spite of the availability of this expensive marine electronic equipment and the existing fishing applications and data, it takes a lot of expertise and effort to combine the existing 2D-maps with the fishing applications and to predict where a good place to fish is. Even after undertaking such a combination of 2D-mapping data with a fishing application, it is still difficult to visualize in 3D where exactly the fish are within the water. Accordingly, there is a need for a fishing application that displays a 3D topographical map receding downwards into a body of water together with the types of fish that can be fished in the specific body of water.
The present invention provides a system and a method for an augmented reality boating and fishing application, and more particularly to a system and a method for creating the illusion of seeing through water to discern the landscape or topobathy of a lake or ocean to guide anglers and boaters.
In general, in one aspect the invention provides a system for an augmented reality boating and fishing application including a client device comprising a client application, a computing system comprising a server-based application, and a database datastore comprising topobathy data that are presented as digital elevation model (DEM) data of a water body. The client application accesses the server-based application and the database datastore via a network connection. The server-based application includes an augmented reality (AR) engine, a computing algorithm, and a rendering engine. The AR engine receives the DEM data of the water body and environmental factor inputs and uses the computing algorithm to calculate fish probability distributions of various types of fish within the water body. The rendering engine fuses the calculated fish probability distributions and the DEM data and generates an AR composite image that is viewed via the client device.
Implementations of this aspect of the invention include the following. The AR composite image is superimposed onto a user's field of view and displayed via a user interface of the client application. The client device includes a camera and the composite image is superimposed onto the user's field of view, as viewed via the camera. The client device may be a tablet, a mobile phone, or smart glasses. The environmental factors may be at least one of terrain gradients, water visibility, water temperature, tide, wind, current, barometric pressure, light intensity, time of day, date, seasonal variations, local noise, and local traffic. The computing algorithm calculates the fish probability distributions of various types of fish within the water body using the environmental factors and set rules and machine-learned rules based on historical data about which species of fish prefer which combinations of the environmental factors. The rendering engine receives external data including instantaneous location GPS data, 5G inputs, orientation compass data, and gyroscope data. The AR composite image includes the topobathy and bathymetry mapping data, the calculated fish probability distributions, fish location markers, water temperature data, suggested cast depth, suggested fishing equipment and techniques, animated flora and fauna simulated under the water surface in 3D, waypoints and navigation paths between waypoints to optimize fish yield, and visualization of boating hazards and navigation dangers. The client application includes a user interface that provides options to drop markers for fishing suggestions, boating hazards, and custom markers within the displayed AR composite image. The client application includes a user interface that provides options to capture digital images, video clips, and audio clips of the AR composite image, fish, hazards, and objects in the water, and to upload these digital images, video clips, and audio clips to an online website.
The client application includes a user interface that provides options to project markers above the surface of the water body within the displayed AR composite image.
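The rule-based portion of such a computing algorithm can be sketched as follows. The species names, the preferred environmental ranges, and the scoring scheme below are hypothetical placeholders for illustration only; they are not the actual set rules or machine-learned rules described above:

```python
# Hypothetical rule table: preferred environmental ranges per fish species.
# (Illustrative stand-ins, not the application's actual rules.)
RULES = {
    "largemouth_bass": {"water_temp_c": (18.0, 27.0), "light_intensity": (0.2, 0.7)},
    "walleye":         {"water_temp_c": (15.0, 22.0), "light_intensity": (0.0, 0.3)},
}

def species_scores(env):
    """Score each species by the fraction of its preferred ranges that the
    current environmental factor readings satisfy."""
    scores = {}
    for species, prefs in RULES.items():
        hits = sum(lo <= env[factor] <= hi
                   for factor, (lo, hi) in prefs.items() if factor in env)
        scores[species] = hits / len(prefs)
    return scores

env = {"water_temp_c": 20.0, "light_intensity": 0.5}
print(species_scores(env))  # bass matches both ranges; walleye only one
```

In practice the machine-learned component would replace the fixed ranges with weights fitted to historical catch data, but the range-check structure conveys the idea of matching conditions to species preferences.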
In general, in another aspect the invention provides a computer-implemented method for an augmented reality boating and fishing application including the following. Providing a client device comprising a client application. Providing a computing system comprising a server-based application. Providing a database datastore comprising topobathy data that are presented as digital elevation model (DEM) data of a water body. The server-based application comprises an augmented reality (AR) engine, a computing algorithm, and a rendering engine. Next, receiving the DEM data of the water body and environmental factor inputs by the AR engine and using the computing algorithm to calculate fish probability distributions of various types of fish within the water body. Next, fusing the calculated fish probability distributions and the DEM data by the rendering engine and generating an AR composite image that is viewed via the client device. The client application accesses the server-based application and the database datastore via a network connection.
Some embodiments of the present invention are illustrated as an example and are not limited by the figures of the accompanying drawings, in which like references may indicate similar elements and in which:
The present invention provides a system and a method for an augmented reality boating and fishing application, and more particularly to a system and a method for creating a composite image that provides an illusion of seeing through water to discern the landscape or topobathy of a lake or ocean to guide anglers and boaters.
In one embodiment, the invention provides a system and a method that takes Lidar generated depth-map data, runs several image processing tools including proprietary algorithms to determine the probability distribution for finding fish of various types, then displays the results scaled and positioned over the surface of the water. The resulting effect is a color-coded topo-map receding downwards into a body of water, with meta-data that is relevant to boaters and anglers superimposed in space.
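The pipeline described above can be sketched as follows. The depth-preference model and the color ramp are toy stand-ins for the proprietary algorithms; the preferred depth and spread values are assumptions made for illustration:

```python
import numpy as np

def fish_probability_map(depth_m, preferred_depth_m=6.0, spread_m=3.0):
    """Toy model: likelihood peaks near a species' preferred depth.
    (A stand-in for the proprietary image-processing algorithms.)"""
    return np.exp(-((depth_m - preferred_depth_m) ** 2) / (2.0 * spread_m ** 2))

def color_code(prob):
    """Map probabilities in [0, 1] to a red-to-green RGB ramp for the overlay."""
    prob = np.clip(prob, 0.0, 1.0)
    rgb = np.stack([1.0 - prob, prob, np.zeros_like(prob)], axis=-1)
    return (rgb * 255).astype(np.uint8)

# Small synthetic Lidar depth map (metres below the surface).
depth = np.array([[2.0, 5.0],
                  [8.0, 12.0]])
prob = fish_probability_map(depth)   # per-cell fish probability
overlay = color_code(prob)           # color-coded topo-map layer to display
```

The resulting `overlay` is what would then be scaled and positioned over the water surface in the user's view.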
Referring to
Database 120 receives topobathy data 110 that are collected via airborne light detection and ranging (LiDAR) systems operating a blue-green wavelength (532 nm) laser, which penetrates through the water column. The same systems map terrestrial landscapes using a near-infrared wavelength (1000 nm-1500 nm) laser, which penetrates the foliage canopy. The resulting data are available from the National Oceanic and Atmospheric Administration (NOAA) and other private company sources in the form of a digital elevation model (DEM) 110. These detailed depth contours 110 provide the size, shape, and distribution of underwater features, including bottom sediment types, for performing scientific, engineering, marine, geophysical, and environmental studies.
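A DEM of this kind is a regular grid of depth samples; querying the depth at an arbitrary point between grid nodes is commonly done by bilinear interpolation. The following is a minimal sketch with a hypothetical 2x2 tile of made-up depth values:

```python
import numpy as np

# Hypothetical 2x2 DEM tile: water depth in metres at regular grid nodes.
dem = np.array([[0.0, 2.0],
                [4.0, 6.0]])

def bilinear_depth(dem, x, y):
    """Estimate depth at fractional grid coordinates (x, y) by bilinear
    interpolation between the four surrounding DEM nodes."""
    x0, y0 = int(x), int(y)
    dx, dy = x - x0, y - y0
    return ((1 - dx) * (1 - dy) * dem[y0, x0]
            + dx * (1 - dy) * dem[y0, x0 + 1]
            + (1 - dx) * dy * dem[y0 + 1, x0]
            + dx * dy * dem[y0 + 1, x0 + 1])

print(bilinear_depth(dem, 0.5, 0.5))  # midpoint of the tile: 3.0 m
```

A production implementation would read the full NOAA DEM raster and convert GPS coordinates to grid coordinates first; this sketch shows only the per-cell lookup.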
The computing system 180 includes an augmented reality (AR) engine 150, a ClearWater Algorithm 140, and a rendering engine 160. The AR engine 150 receives the above-mentioned terrain mapping data 110 and environmental factor inputs 130 and uses the ClearWater Algorithm 140 to calculate probability distributions of various types of fish 155. The environmental factor inputs 130 include terrain gradients, water visibility and temperature, tide, wind, current, and barometric pressure factors, light intensity, time of day, seasonal variations, and local factors like noise or traffic, among others. The ClearWater Algorithm 140 includes set rules and machine-learned rules based on historical data about which species of fish prefer which combinations of the above-mentioned environmental factors. The calculated fish probability distributions 155 are entered into the rendering engine 160 together with external data 135 including instantaneous location GPS data, 5G inputs, and orientation compass and gyroscope data. The rendering engine 160 is capable of computing a display with at least six degrees of freedom using the above-mentioned instantaneous location GPS data, 5G inputs, and orientation compass and gyroscope data. The rendering engine 160 generates an AR composite image that fuses the terrain mapping data 110 and the calculated fish probability distributions 155 and superimposes the composite image directly onto a user's field of view, as viewed via the camera of the tablet 172 or the mobile phone 174, or via the smart glasses 176. The user holds the camera of the tablet 172 or the mobile phone 174 in front of their eyes and views the generated AR composite image, which includes the current field of view of the camera and the superimposed computer-generated layers of the terrain mapping data 110 and the calculated fish probability distributions 155, as shown in
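The final compositing step performed by a rendering engine of this kind can be sketched as a per-pixel alpha blend of the computer-generated layers onto the live camera frame. This is a minimal sketch under stated assumptions; the actual rendering engine 160 additionally handles the full six-degrees-of-freedom pose, which is omitted here:

```python
import numpy as np

def fuse_layers(camera_frame, overlay, alpha):
    """Alpha-blend the computer-generated layers (terrain map and fish
    probabilities) onto the live camera frame. `alpha` is a per-pixel
    opacity mask in [0, 1]; 0 keeps the camera pixel unchanged."""
    a = alpha[..., None]                        # broadcast over RGB channels
    blended = camera_frame * (1.0 - a) + overlay * a
    return blended.astype(camera_frame.dtype)

# Tiny 1x2 RGB example: blend a half-opaque overlay onto a grey frame.
frame = np.full((1, 2, 3), 100, dtype=np.uint8)
layer = np.full((1, 2, 3), 200, dtype=np.uint8)
mask = np.full((1, 2), 0.5)
composite = fuse_layers(frame, layer, mask)     # every channel becomes 150
```

Varying the mask per pixel is what lets the depth overlay appear to recede into the water while the above-surface scene stays untouched.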
Referring to
In the embodiment of
Referring to
Referring to
Other embodiments of the present invention include one or more of the following. A compass map on a gimbal is used for orientation over the water. A reticle is used to reveal the depth using a ray-cast in the center of the user's field of view. A sky dashboard is used to show where the points of interest are at a distance. The distance between the position of the user and the markers is indicated. After catching a fish, a 3D virtual model of the fish is generated and is added to swim in the imaged water as a “Ghost fish” 190, as shown in
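The distance between the user's GPS position and a dropped marker can be computed with the standard haversine formula; the sketch below assumes a spherical-Earth model and is one conventional way to obtain such a distance, not necessarily the application's actual method:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between the user's GPS fix and a
    marker, on a spherical-Earth model (radius 6,371 km)."""
    R = 6_371_000.0
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2.0 * R * asin(sqrt(a))

# One degree of latitude is roughly 111 km anywhere on the sphere.
print(round(haversine_m(44.0, -72.0, 45.0, -72.0)))
```

For the short ranges typical of fishing markers the spherical approximation is well within GPS accuracy.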
Referring to
Computer system 400 may further include one or more memories, such as first memory 430 and second memory 440. First memory 430, second memory 440, or a combination thereof functions as a computer-usable storage medium to store and/or access computer code. The first memory 430 and second memory 440 may be random access memory (RAM), read-only memory (ROM), a mass storage device, or any combination thereof. As shown in
The computer system 400 may further include other means for computer code to be loaded into or removed from the computer system 400, such as the input/output (“I/O”) interface 450 and/or communications interface 460. Both the I/O interface 450 and the communications interface 460 allow computer code to be transferred between the computer system 400 and external devices or webservers, including other computer systems. This transfer may be bi-directional or omni-directional to or from the computer system 400. Computer code transferred by the I/O interface 450 and the communications interface 460 is typically in the form of signals, which may be electronic, electromagnetic, optical, or other signals capable of being sent and/or received by the interfaces. These signals may be transmitted via a variety of modes including wire or cable, fiber optics, a phone line, a cellular phone link, an infrared (“IR”) link, and a radio frequency (“RF”) link, among others.
The I/O interface 450 may be any connection, wired or wireless, that allows the transfer of computer code. In one example, the I/O interface 450 includes an analog or digital audio connection, digital video interface (“DVI”), video graphics adapter (“VGA”), musical instrument digital interface (“MIDI”), parallel connection, PS/2 connection, serial connection, universal serial bus connection (“USB”), IEEE 1394 connection, or PCMCIA slot and card, among others. In certain embodiments the I/O interface connects to an I/O unit 455 such as a user interface, monitor, speaker, printer, or touch screen display, among others. The communications interface 460 may also be used to transfer computer code to the computer system 400. Communications interfaces include a modem, a network interface (such as an Ethernet card), wired or wireless systems (such as Wi-Fi, Bluetooth, and IR), local area networks, wide area networks, and intranets, among others.
The invention is also directed to computer products, otherwise referred to as computer program products, to provide software that includes computer code to the computer system 400. Processor 420 executes the computer code in order to implement the methods of the present invention. In one example, the methods according to the present invention may be implemented using software that includes the computer code that is loaded into the computer system 400 using a memory 430, 440 such as the mass storage drive 443, or through an I/O interface 450, communications interface 460, or any other interface with the computer system 400. The computer code in conjunction with the computer system 400 may perform any one of, or any combination of, the steps of any of the methods presented herein. The methods according to the present invention may also be performed automatically, or may be invoked by some form of manual intervention.
The computer system 400, or network architecture, of
Several embodiments of the present invention have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the invention. Accordingly, other embodiments are within the scope of the following claims.
This application claims the benefit of U.S. provisional application Ser. No. 63/102,840 filed on Jul. 7, 2020 and entitled “CleAR Water: an augmented reality boating and fishing application”, which is commonly assigned and the contents of which are expressly incorporated herein by reference.
Number | Date | Country
---|---|---
63102840 | Jul 2020 | US