This section is intended to provide information to facilitate an understanding of various technologies described herein. As the section's title implies, this is a discussion of related art. That such art is related in no way implies that it is prior art. The related art may or may not be prior art. It should therefore be understood that the statements in this section are to be read in this light, and not as admissions of prior art.
When trolling, sonar may be used to locate fish. Sometimes, an angler's vessel may be equipped with a sonar device to provide an underwater view. Sonar images may be displayed on a marine display.
Described herein are implementations of technologies for a device configured to use motion sensing for controlling a display. The device may include a motion sensing module configured to track orientation of a user on a watercraft and determine a direction of a cast executed by the user based on the tracked orientation of the user. The device may include a display module configured to display one or more sonar images based on the determined direction of the cast executed by the user.
Described herein are also implementations of technologies for an apparatus configured to use motion sensing for controlling a display. The apparatus may include a motion sensor configured to track movement of an angler on a watercraft and determine a direction of a casting gesture performed by the angler based on the tracked movement of the angler. The apparatus may include a display component configured to display one or more sonar images based on the determined direction of the casting gesture performed by the angler.
Described herein are implementations of various technologies for a computer-readable medium having stored thereon a plurality of computer-executable instructions which, when executed by a computer, cause the computer to use motion sensing for controlling a display. The instructions may cause the computer to receive motion capture data from one or more motion sensors configured to track movement of an angler on a watercraft. The instructions may cause the computer to determine a direction or intensity of a cast by the angler based on the motion capture data. The instructions may cause the computer to identify a target area of the cast based on the determined direction of the cast by the angler. The instructions may cause the computer to automatically display one or more sonar images on a display component in relation to the identified target area of the cast by the angler.
Described herein are other implementations of technologies for a computer-readable medium having stored thereon a plurality of computer-executable instructions which, when executed by a computer, cause the computer to use motion sensing for controlling a display. The instructions may cause the computer to receive first data from a first motion sensor configured to determine a direction of a cast by an angler. The instructions may cause the computer to receive second data from a second motion sensor configured to determine an intensity of the cast by the angler. The instructions may cause the computer to identify a target area of the cast based on the determined direction of the cast and the determined intensity of the cast. The instructions may cause the computer to automatically display one or more sonar images on a display component in relation to the identified target area of the cast.
The above referenced summary section is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description section. The summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Moreover, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
Implementations of various techniques are described herein with reference to the accompanying drawings. It should be understood, however, that the accompanying drawings illustrate only the various implementations described herein and are not meant to limit the scope of various techniques described herein.
Various implementations described herein are directed to using motion sensing for controlling a display, including display of sonar images. In one implementation, various techniques described herein refer to using a motion sensing module to track orientation of an angler on a watercraft and to determine a direction of a cast executed by the angler based on the tracked orientation of the angler. In some implementations, the motion sensing module may track movements of the angler on the watercraft and detect intensity of the cast executed by the angler based on the tracked movements of the angler. Further, based on the cast by the angler, a marine display may be configured to automatically display one or more sonar images associated with a targeted area that the angler is presently casting toward. This technique may provide the angler an automatic view of the fish in the water column in relation to the target area of the cast.
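For purposes of illustration only, the following is a minimal sketch, written in Python, of how such a control loop might be organized. The names motion_module, sonar_source, display, and their methods are hypothetical and are not part of any particular implementation described herein.

def update_sonar_display(motion_module, sonar_source, display):
    # Track the angler's orientation and watch for a cast (assumed APIs).
    orientation = motion_module.track_orientation()
    cast = motion_module.detect_cast(orientation)
    if cast is not None:
        # Focus the display on sonar images for the direction of the cast.
        images = sonar_source.images_for_direction(cast.direction)
        display.show(images)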
Various implementations of using motion sensing for controlling a display will now be described with reference to the accompanying figures.
The sonar transducer 110 may be configured to provide various angular ranges of view in various directions, such as, e.g., approximately a 90° vertical view along with approximately a 15° to 30° horizontal view. The various angular ranges of view may include, or be extended to include, vertical views from 0° to more than 90° along with horizontal views from 0° to 180°, or, in some cases, a 360° view. The sonar transducer 110 may be configured to manually or automatically rotate (or pivot or directionally adjust) vertically and/or horizontally so as to rotate the view (i.e., field of view).
During operation, the sonar transducer 120 may be configured to use sonar for imaging various environmental features (e.g., fish, plants, rocks, lures, bait, etc.) in the body of water 102. This imaging may include mapping an underwater environment below the surface 104 of the body of water 102 between the surface 104 and a bottom or floor 106 of the body of water 102. For instance, this imaging may include various images of fish or schools of fish 150 captured beneath the watercraft 140 by the sonar transducer 120 directed in any direction, such as the forward direction 144 with the sonar beam 110, as shown in the accompanying figures.
During trolling, the computing device 122 may be used to display sonar images in relation to a direction of a cast executed/performed by a user or angler 130 (e.g., boat pilot, fisherman, etc.). In one implementation, the computing device 122 may include one or more motion sensing modules 124 configured to track orientation of the angler 130 on the watercraft 140 and determine a direction of a cast executed by the angler 130 based on the tracked orientation of the angler 130. Further, the computing device 122 may be configured to display the one or more sonar images based on the determined direction of the cast executed by the angler 130. In another implementation, the motion sensing module 124 may be configured to track movements of the angler 130 on the watercraft 140 and detect intensity of the cast executed by the angler 130 based on the tracked movements of the angler 130. Further, the computing device 122 may be configured to display one or more sonar images based on the detected intensity of the cast. Further, in some other instances, the computing device 122 may be configured to display one or more sonar images based on the detected intensity of the cast along with the determined direction of the cast.
The motion sensing modules 124 may be referred to as motion sensors, components, or devices. The motion sensing modules 124 may include one or more image capture components including at least one of a video camera, such as, e.g., a 2D and/or a 3D video camera, and a depth sensing camera, such as e.g., an infrared (IR) and/or radio frequency (RF) depth sensing camera.
In some implementations, the computing device 122 may be used to determine a depth of the fish 150 in the body of water 102 near the watercraft 140. Once the depth 108 of the fish 150 is determined, the angler 130 may cast a lure or bait 136 in the body of water 102 at the determined depth 108. For instance, during trolling, the lure or bait 136 may be coupled to a casting device, such as a rod 132 (e.g., fishing rod or pole), via a line 134 (e.g., fishing line). The rod 132 may be configured for casting the lure or bait 136 by the angler 130.
In some implementations, the sonar transducer 120 may be electrically coupled to the computing device 122 via one or more electrical wires or cables (not shown) passing through the watercraft 140. The computing device 122 may be configured to record sonar data received from the sonar transducer 120 via the electrical cables. Further, the computing device 122 may be configured to control operation of the watercraft 140. In some instances, operation of the watercraft 140 may be controlled by the computing device 122 including user interaction with the computing device 122. In some other instances, operation of the watercraft 140 may be controlled via user interaction with a foot-pedal (not shown) positioned on the watercraft 140.
In various implementations, the sonar transducer 120 may be referred to as a forward scanning sonar transducer having a forward spotlight scan transducer. In some other implementations, the sonar transducer 120 may include an array of multiple sonar transducers having, e.g., one or more of a right forward scanning element, a left forward scanning element, a conical sonar element, and a bar downscan sonar element. In this instance, the multiple sonar scanning elements are each capable of generating a separate sonar beam, wherein each sonar beam may include one or more of a conical beam projection and a linear beam projection. Further, each of the sonar beams may include a conical downscan beam projection having a coverage area of a beam produced by a circular downscan transducer. Still further, in some instances, each of the sonar beams may include a linear downscan beam projection having a coverage area of a beam produced by a linear downscan transducer.
In various implementations, each sonar transducer element may be configured to use sonar technology to evaluate attributes of various target objects by interpreting echoes from sound waves. Further, each sonar transducer element may be configured to actively generate low and/or high frequency sound waves and evaluate the echoes received back, thereby measuring the time intervals between sending signals and receiving the corresponding echoes to determine distance to target objects. Each sonar transducer element may be configured to convert energy into sound waves using piezoelectric transducers or capacitive transducers that are configured to convert electrical energy into sound. Each sonar transducer element may use piezoelectric crystals that have the property of changing size when voltage is applied, whereby applying an alternating current (AC) across the piezoelectric crystals may cause oscillations at high frequencies, thereby generating high frequency sound waves. In some instances, the focusing of the sound waves generated by each sonar transducer element may be determined by the area and shape of the element, the sound wave frequency of the element, and the sound velocity of the propagation medium, such as the body of water. In some instances, each sonar transducer element may use piezoelectric crystals configured as transceivers to transmit and detect sound waves, such as by propagating sound waves and receiving the echoing sound waves.
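As a simple illustration of the time-interval measurement described above, the following sketch (Python; the function name is hypothetical) converts a measured round-trip echo time into an approximate distance. The nominal sound speed of 1500 m/s is an assumption, since the actual value varies with water temperature and salinity.

def echo_distance_m(round_trip_time_s, sound_speed_mps=1500.0):
    # The sound wave travels to the target and back, so the one-way
    # distance is half of the speed multiplied by the round-trip time.
    return sound_speed_mps * round_trip_time_s / 2.0

# For instance, a 0.04 s round trip corresponds to roughly 30 m to the target.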
Generally, the term sonar (i.e., SOund Navigation And Ranging) may refer to various techniques for propagating sound underwater to detect objects on or under a surface of a body of water, such as fish, lures, plants, rocks, the sea floor, etc. One type of sonar technology is active sonar, which emits pulses of sound waves and receives the corresponding echoes, a process referred to as pinging. Sonar may be used to determine acoustic locations and/or measurements of echo characteristics for targets and objects in a body of water. Further, acoustic frequencies used in sonar based devices may vary from low frequency (i.e., infrasonic) to high frequency (i.e., ultrasonic).
In some implementations, the sonar transducer 120 may include use of one or more sensors (not shown). For instance, the one or more sensors may include a dedicated sensor (e.g., water sensor) configured for sensing deployment and/or removal of the sonar transducer 120 in and/or from the body of water 102. In this instance, the dedicated sensor may include electrode terminals (not shown) configured to activate (e.g., power-up) the sonar transducer 120 when the watercraft 140 is deployed in water. In some instances, the electrode terminals may be configured to deactivate (e.g., power-down) the sonar transducer 120 when the watercraft 140 is removed or withdrawn from water. Further, the one or more sensors may include one or more environmental sensors, such as a temperature sensor and depth finding sensor.
In various implementations, the motion sensing module 124 may be configured to determine a range of the cast (or casting gesture) based on the detected intensity of the cast (or casting gesture). A range as used herein may refer to how far or what distance the fishing lure or bait travels after casting. The motion sensing module 124 may be configured to identify a target area of the cast (or casting gesture) based on the determined range of the cast (or casting gesture) along with the determined direction of the cast (or casting gesture). The target area of the cast may be an approximate location or area where the fishing lure or bait may hit the water. The motion sensing module 124 may be configured to automatically display the one or more sonar images that may correspond to the identified target area of the cast (or casting gesture). Further, the intensity of the cast (or casting gesture) may refer to a speed of a casting body part of the angler 130 during the cast (or casting gesture). The speed of the cast (or casting gesture) may be determined based on a rate of change of a transitional movement of the casting body part of the angler 130 over a distance from a first position to a second position during a time period of the cast (or casting gesture), wherein the second position is different than the first position.
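The speed and target-area computations described above might be sketched as follows (Python; the coordinate frame, units, and function names are illustrative assumptions only).

import math

def cast_speed(first_pos, second_pos, cast_time_s):
    # Rate of change of the casting body part's position over the time
    # period of the cast (positions in metres, time in seconds).
    dx = second_pos[0] - first_pos[0]
    dy = second_pos[1] - first_pos[1]
    return math.hypot(dx, dy) / cast_time_s

def target_area(boat_pos, direction_deg, range_m):
    # Approximate location where the lure or bait may hit the water:
    # project the estimated range along the determined cast direction.
    rad = math.radians(direction_deg)
    return (boat_pos[0] + range_m * math.cos(rad),
            boat_pos[1] + range_m * math.sin(rad))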
In some implementations, the angler 130 may wear a motion sensing device 240 on a movable body part, such as a wrist, hand, etc., as shown in the accompanying figures.
In various implementations, the processor 242 may include software configured to detect one or more user commands for operating a marine electronics device, such as, e.g., marine display, MFD, etc. The user commands may be detected using one or more buttons 248, using the motion sensor 246, or both. For instance, if a user presses a button 248 on the wearable device 240, the wearable device 240 may transmit a user command (or data related to the user command) to a marine electronics device. Further, in some instances, the wearable device 240 may include a display 250. The display 250 may include various types of light emitting diodes (LEDs) and/or a liquid crystal display (LCD).
In some situations, the processor 242 may include software configured to detect (or determine) an intensity of a cast by the angler 130. For instance, based on the intensity of the cast, the type of lure, and the type of line/reel used, the processor 242 may be able to calculate an approximate distance of the cast. This information may be transmitted to a computing device (e.g., a marine display, MFD, etc.), and this information may be used to determine a range output for the display. For instance, the range may be from 20 ft. for a light cast to 100 ft. for a hard cast.
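A minimal sketch of such a range estimate follows (Python; the normalization of intensity to the range 0 to 1 and the lure/line correction factors are assumptions used only for illustration).

def estimate_cast_range_ft(intensity, lure_factor=1.0, line_factor=1.0):
    # Interpolate between roughly 20 ft for a light cast and 100 ft for a
    # hard cast, then apply hypothetical corrections for lure and line/reel.
    clamped = max(0.0, min(1.0, intensity))
    return (20.0 + 80.0 * clamped) * lure_factor * line_factor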
In various implementations, the wearable device 240 may be configured to use wireless technology, such as, e.g., Bluetooth, Wi-Fi, cellular technology (such as GSM or CDMA), satellite communication, and/or any other type of wireless technology. In some instances, the wearable device 240 may be wirelessly connected to a marine electronics device via a network interface 252. In other instances, the wearable device 240 may be wirelessly connected to any computer system via the network interface 252, including a portable computer system, a smart phone device, a remote computer, a remote server, a cloud server, and the like. Further, the wearable device 240 may be connected to any computing device with a wired or wireless connection via the network interface 252.
In various implementations, the transducer 310 may be configured to rotate and/or pivot to provide multiple fields of horizontal views, such as 360° views along the horizontal (e.g., x-axis). These multiple fields of horizontal views may include forward (fore) facing views (e.g., facing toward a bow of a watercraft), rear (aft) facing views (e.g., facing toward a stern of a watercraft), starboard (right) facing views (e.g., facing toward a starboard side of a watercraft), and port (left) facing views (e.g., facing toward a port side of a watercraft). Further, the transducer 310 may be configured to rotate and/or pivot to provide multiple fields of vertical views at various depths, such as angular views from 0° to 90° along the vertical (e.g., y-axis).
In some implementations, the transducer 310 may implement use of a spotlight transducer array having multiple scanning transducers. In some other implementations, the transducer 310 may include multiple transducer elements having one or more of a right scanning transducer, a left scanning transducer, a down scanning transducer, and a conical down beam transducer.
The motion capture data may include full-body 3D motion capture data in relation to movement of the angler on the watercraft. The motion capture data may also include facial recognition data of the angler to determine a forward facing orientation of the angler. Further, tracking movement of the angler on the watercraft may include tracking one or more gestures of the user on the watercraft including casting gestures, such as, e.g., overhead casting gestures and side casting gestures.
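One way the forward facing orientation might be derived from full-body motion capture data is sketched below (Python; the use of shoulder joint positions, and the watercraft-frame convention of 0° toward the bow with a clockwise-positive angle, are assumptions for illustration only).

import math

def facing_direction_deg(left_shoulder, right_shoulder):
    # Take the angler's facing direction as the perpendicular to the line
    # joining the two shoulder joints (x, y given in the watercraft frame).
    sx = right_shoulder[0] - left_shoulder[0]
    sy = right_shoulder[1] - left_shoulder[1]
    # Rotate the shoulder vector by 90 degrees; which perpendicular is
    # "forward" depends on the assumed sensor and frame conventions.
    fx, fy = sy, -sx
    return math.degrees(math.atan2(fy, fx)) % 360.0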
In some implementations, the computing device 340 may be configured to determine (or detect) an intensity of the cast by the angler based on the motion capture data. The computing device 340 may be configured to identify a target area of the cast based on the determined (or detected) intensity of the cast by the angler. The computing device 340 may be configured to automatically display (or focus the display of) one or more sonar images on the display 370 in relation to the identified target area of the cast by the angler. In this instance, determining the intensity of the cast may include determining a range of the cast based on the determined (or detected) intensity of the cast. In some situations, based on the intensity of the cast, the type of lure, and the type of line/reel used, the computing device 340 may be able to calculate an approximate distance of the cast. This information may be processed by the computing device 340, and the computing device 340 may use this information to determine a range output for the display. For instance, the range may be from 20 ft. for a light cast to 100 ft. for a hard cast.
In some implementations, the computing device 340 may be configured to receive first data from a first motion sensor (e.g., the motion sensor 362) that may be configured to determine (or detect) a direction of a cast by an angler. The computing device 340 may be configured to receive second data from a second motion sensor (e.g., the motion sensor 362 or the intensity sensor 364) configured to determine (or detect) an intensity of the cast by the angler. Further, the computing device 340 may be configured to process the first data and/or the second data to identify a target area of the cast based on the determined (or detected) direction of the cast and/or the determined (or detected) intensity of the cast. The computing device 340 may be configured to automatically display (or focus the display of) one or more sonar images on the display 370 in relation to the identified target area of the cast. Accordingly, the computing device 340 may be configured to use motion sensor data as a filtering mechanism to display sonar images in relation to a side of the watercraft, such as, e.g., front (fore), back (stern or aft), right (starboard side), left (port side), or some combination thereof.
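The filtering mechanism described above could, for instance, map the determined cast direction onto a side of the watercraft, as in the following sketch (Python; angles measured clockwise from the bow are an assumed convention, and the 45° quadrant boundaries are illustrative only).

def side_of_watercraft(direction_deg):
    # Quantize a cast direction into fore, starboard, aft, or port so that
    # only sonar images for that side of the watercraft are displayed.
    d = direction_deg % 360.0
    if d < 45.0 or d >= 315.0:
        return "fore"
    if d < 135.0:
        return "starboard"
    if d < 225.0:
        return "aft"
    return "port"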
The first data and/or second data may include full-body 3D motion capture data in relation to orientation of the angler. The first data and/or second data may also include facial recognition data associated with a forward facing position of the angler. The direction of the cast may be based on the orientation of the angler or the forward facing position of the angler during the cast, or both. Further, determining intensity of the cast by the angler may include determining a range of the cast based on the detected intensity of the cast by the angler. Optionally, receiving second data from the second motion sensor may include receiving acceleration data from a wearable wrist sensor or device 366 worn by the angler during the cast.
In various implementations, the motion sensor 362 may be configured to track multiple movements (and/or orientations and/or casting intensities) of multiple anglers on a watercraft, and the computing device 340 may be configured to process the data and information associated with each of the multiple anglers. Further, the intensity sensor 364 may be configured to receive data and information from multiple anglers, with each angler wearing a wearable device 366, and track an intensity of each cast performed or executed by each angler. As such, motion sensing data from each angler may be used to determine (or detect) multiple target areas for each angler based on the determined (or detected) direction and/or determined (or detected) intensity of each angler's cast. Further, the computing device 340 may be configured to display multiple sonar images on the display 370 for each angler, e.g., in a split-screen mode of operation. Still further, the display 370 may comprise multiple displays positioned throughout the watercraft, and the computing device 340 may be configured to display particular sonar images for particular anglers depending on each angler's tracked position on the watercraft.
The computing device 340 may include the processor 342 and memory 344 having instructions that cause the processor 342 to display images associated with the sonar data on a display component or device 370. Further, the instructions may cause the processor 342 to simultaneously display images associated with current and previous sonar data 312 on the display device 370 in a split screen mode of operation, such as, e.g., left-side screen display of various images associated with current sonar data and right-side screen display of various images associated with previous sonar data. Further, the computing device 340 may be configured to create/generate sonar logs associated with the sonar data 312, including previously recorded sonar data.
In some implementations, the computing device 340 may be configured to store/record sonar data 312 and/or sonar logs in one or more databases (e.g., database 380). The computing device 340 may be configured to upload the sonar data 312 and/or sonar logs to the network server 390, such as, e.g., a cloud server or other network server, via the network interface 360. The computing device 340 may be configured to store/record multiple sonar logs and create/generate a map therefrom. The computing device 340 and/or the network server 390 may be configured to create/generate one or more maps by stitching/combining/joining multiple sonar logs together. The computing device 340 may be configured to receive geo-coordinate data, such as global positioning system data (i.e., GPS data), via a GPS transceiver 350 and associate the received GPS data with the sonar data 312, sonar logs, and/or maps at any time, including prior to upload. The wired or wireless network may include any type of wired or wireless communication network and/or cloud based network.
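Associating received GPS data with sonar data prior to upload might be sketched as follows (Python; the record layout with "time", "lat", and "lon" keys is an assumption for illustration, and at least one GPS fix is assumed to be available).

def tag_sonar_log(sonar_samples, gps_fixes):
    # Pair each sonar sample with the most recent GPS fix taken at or
    # before the sample's timestamp, prior to uploading the sonar log.
    fixes = sorted(gps_fixes, key=lambda f: f["time"])
    tagged = []
    i = 0
    for sample in sorted(sonar_samples, key=lambda s: s["time"]):
        while i + 1 < len(fixes) and fixes[i + 1]["time"] <= sample["time"]:
            i += 1
        tagged.append({**sample, "lat": fixes[i]["lat"], "lon": fixes[i]["lon"]})
    return tagged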
In various implementations, the computing device 340 may be configured as a special purpose machine for interfacing with one or more transducers 310. The computing device 340 may include standard elements and/or components, including the processor 342, the memory 344 (e.g., non-transitory computer-readable storage medium), at least one database 380, power, peripherals, and various other computing elements and/or components that may not be specifically shown in the figures.
In some implementations, method 400 may be performed or executed by various types of computing devices, such as, e.g., the computing device 122 described above.
At block 410, method 400 may track movement of an angler on a watercraft. The tracked movement may include the angler transitioning from a first position on the watercraft to a second position on the watercraft. The second position may be different than the first position. The tracked movement may also include various gestures or body movements of the angler while on the watercraft. The tracked movement may also include turning of the angler's head in a forward facing direction of the angler's face.
At block 420, method 400 may detect an orientation of the angler based on the tracked movement of the angler. The orientation of the angler may refer to a position of the angler in relation to the bow (fore side), the stern (aft side), the port (left side), and/or the starboard (right side) of the watercraft. The orientation of the angler may also refer to a forward facing direction of the angler on the watercraft.
At block 430, method 400 may determine a direction of a cast executed by the angler based on the tracked movement and/or the detected orientation of the angler. The tracked movement may refer to various gestures or body movements of the angler while on the watercraft, including casting gestures performed or executed by the angler. For instance, the casting gestures may include overhead casting gestures and/or side casting gestures performed or executed by the angler. The cast executed by the angler may refer to a casting gesture performed by the angler.
At block 440, method 400 may detect an intensity of the cast executed by the angler based on the tracked movement of the angler. The intensity of the cast may refer to a speed or acceleration of a casting body part of the angler during the cast. The speed may be determined based on a rate of change of a transitional movement of the casting body part of the angler over a distance from a first position to a second position during a time period of the cast. The second position may be different than the first position.
At block 450, method 400 may determine a range of the cast based on the tracked movement of the angler and/or the detected intensity of the cast. This may include identifying a target area of the cast based on the determined range of the cast. The range of the cast may refer to a distance of the cast as calculated based on the intensity of the cast. The intensity may refer to a power, force, or strength of the cast.
At block 460, method 400 may display one or more sonar images on a display component based on the determined direction of the cast and/or the determined range of the cast. Further, this may include automatically focusing the display of the one or more sonar images on the identified target area of the cast. The one or more sonar images may include one or more 2D and/or 3D based sonar images of an underwater environment in a column of water where the watercraft is deployed. In various implementations, the display component may be configured to display the one or more sonar images based solely on the determined direction of the cast, based solely on the detected intensity of the cast, or based on the determined direction of the cast along with the detected intensity of the cast. Accordingly, in various implementations, the one or more sonar images may be displayed based solely on the direction of the cast, based solely on the range of the cast, or based on both the direction and range of the cast.
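The sequence of blocks 410 through 460 might be combined in a single routine such as the following sketch (Python; the representation of the tracked movement as wrist start/end positions and the intensity-to-range scaling are simplifying assumptions for illustration only).

import math

def method_400_sketch(wrist_start, wrist_end, cast_time_s, boat_pos):
    # Blocks 410/420: tracked movement and orientation are represented here
    # simply by the start and end positions of the casting wrist (metres).
    dx = wrist_end[0] - wrist_start[0]
    dy = wrist_end[1] - wrist_start[1]
    # Block 430: direction of the cast.
    direction_deg = math.degrees(math.atan2(dy, dx)) % 360.0
    # Block 440: intensity of the cast as the speed of the casting body part.
    speed_mps = math.hypot(dx, dy) / cast_time_s
    # Block 450: hypothetical mapping of intensity to a cast range, capped.
    range_m = min(30.0, 6.0 + 3.0 * speed_mps)
    rad = math.radians(direction_deg)
    target = (boat_pos[0] + range_m * math.cos(rad),
              boat_pos[1] + range_m * math.sin(rad))
    # Block 460: a display component would focus sonar images on 'target'.
    return direction_deg, range_m, target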
The marine electronics device 500 may be operational with numerous general purpose or special purpose computing system environments and/or configurations. The marine electronics device 500 may include any type of electrical and/or electronics device capable of processing data and information via a computing system. The marine electronics device 500 may include various marine instruments, such that the marine electronics device 500 may use the computing system to display and/or process one or more types of marine electronics data. The device 500 may display marine electronics data 515, such as, e.g., sonar data and images associated with sonar data. The marine electronics data types 515 may include chart data, radar data, sonar data, steering data, dashboard data, navigation data, fishing data, engine data, and the like. The marine electronics device 500 may include a plurality of buttons 520, which may include physical buttons or virtual buttons, or a combination thereof. The marine electronics device 500 may receive input through the screen 505, which may be sensitive to touch, or through the buttons 520.
The marine electronics device 500 may be configured as a computing system having a central processing unit (CPU), a system memory, a graphics processing unit (GPU), and a system bus that couples various system components including the system memory to the CPU. In various implementations, the computing system may include one or more CPUs, which may include a microprocessor, a microcontroller, a processor, a programmable integrated circuit, or a combination thereof. The CPU may include an off-the-shelf processor such as a Reduced Instruction Set Computer (RISC), or a Microprocessor without Interlocked Pipeline Stages (MIPS) processor, or a combination thereof. The CPU may also include a proprietary processor.
The GPU may be a microprocessor specifically designed to manipulate and implement computer graphics. The CPU may offload work to the GPU. The GPU may have its own graphics memory, and/or may have access to a portion of the system memory. As with the CPU, the GPU may include one or more processing units, and each processing unit may include one or more cores.
The CPU may provide output data to a GPU. The GPU may generate graphical user interfaces that present the output data. The GPU may also provide objects, such as menus, in the graphical user interface. A user may provide inputs by interacting with the objects. The GPU may receive the inputs from interaction with the objects and provide the inputs to the CPU. A video adapter may be provided to convert graphical data into signals for a monitor (MFD 500). The monitor (MFD 500) includes a screen 505. In various implementations, the screen 505 may be sensitive to touching by a human finger, and/or the screen 505 may be sensitive to the body heat from a human finger, a stylus, and/or responsive to a mouse.
The system bus may be any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus, also known as Mezzanine bus. The system memory may include a read only memory (ROM) and a random access memory (RAM). A basic input/output system (BIOS), containing the basic routines that help transfer information between elements within the computing system, such as during start-up, may be stored in the ROM.
The computing system may further include a hard disk drive interface for reading from and writing to a hard disk, a memory card reader for reading from and writing to a removable memory card, and an optical disk drive for reading from and writing to a removable optical disk, such as a CD ROM or other optical media. The hard disk, the memory card reader, and the optical disk drive may be connected to the system bus by a hard disk drive interface, a memory card reader interface, and an optical drive interface, respectively. The drives and their associated computer-readable media may provide nonvolatile storage of computer-readable instructions, data structures, program modules and other data for the computing system.
Although the computing system is described herein as having a hard disk, a removable memory card and a removable optical disk, it should be appreciated by those skilled in the art that the computing system may also include other types of computer-readable media that may be accessed by a computer. For instance, such computer-readable media may include computer storage media and communication media. Computer storage media may include volatile and non-volatile, and removable and non-removable media implemented in any method or technology for storage of information, such as computer-readable instructions, data structures, program modules, software modules, or other data. Computer-readable storage media may include non-transitory computer-readable storage media. Computer storage media may further include RAM, ROM, erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other solid state memory technology, CD-ROM, digital versatile disks (DVD), or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the computing system. Communication media may embody computer readable instructions, data structures, program modules or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and may include any information delivery media. The term “modulated data signal” may mean a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared (IR), and other wireless media. The computing system may include a host adapter that connects to a storage device via a small computer system interface (SCSI) bus, Fibre Channel bus, eSATA bus, or using any other applicable computer bus interface.
The computing system can also be connected to a router to establish a wide area network (WAN) with one or more remote computers. The router may be connected to the system bus via a network interface. The remote computers can also include hard disks that store application programs. In another implementation, the computing system may also connect to the remote computers via a local area network (LAN) or the WAN. When using a LAN networking environment, the computing system may be connected to the LAN through the network interface or adapter. The LAN may be implemented via a wired connection or a wireless connection. The LAN may be implemented using Wi-Fi™ technology, cellular technology, Bluetooth™ technology, satellite technology, or any other implementation known to those skilled in the art. The network interface may also utilize remote access technologies (e.g., Remote Access Service (RAS), Virtual Private Networking (VPN), Secure Socket Layer (SSL), Layer 2 Tunneling Protocol (L2TP), or any other suitable protocol). In some instances, these remote access technologies may be implemented in connection with the remote computers. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computer systems may be used.
A number of program modules may be stored on the hard disk, memory card, optical disk, ROM or RAM, including an operating system, one or more application programs, and program data. In certain implementations, the hard disk may store a database system. The database system could include, for instance, recorded points. The application programs may include various mobile applications (“apps”) and other applications configured to perform various methods and techniques described herein. The operating system may be any suitable operating system that may control the operation of a networked personal or server computer.
A user may enter commands and information into the computing system through input devices such as buttons, which may be physical buttons, virtual buttons, or combinations thereof. Other input devices may include a microphone, a mouse, or the like (not shown). These and other input devices may be connected to the CPU through a serial port interface coupled to system bus, but may be connected by other interfaces, such as a parallel port, game port or a universal serial bus (USB).
Certain implementations may be configured to be connected to a global positioning system (GPS) receiver system and/or a marine electronics system. The GPS system and/or marine electronics system may be connected via the network interface. The GPS receiver system may be used to determine position data for the vessel on which the marine electronics device 500 is disposed. The GPS receiver system may then transmit the position data to the marine electronics device 500. In other instances, any positioning system known to those skilled in the art may be used to determine and/or provide the position data for the marine electronics device 500.
The marine electronics system may include one or more components disposed at various locations on the vessel. Such components may include one or more data modules, sensors, instrumentation, and/or any other devices known to those skilled in the art that may transmit various types of data to the marine electronics device 500 for processing and/or display. The various types of data transmitted to the marine electronics device 500 from the marine electronics system may include marine electronics data and/or other data types known to those skilled in the art. The marine electronics data received from the marine electronics system may include chart data, sonar data, structure data, radar data, navigation data, position data, heading data, automatic identification system (AIS) data, Doppler data, speed data, course data, or any other type of data.
The marine electronics device 500 may receive external data via a network, such as a LAN or WAN. In various implementations, external data may relate to data and information not available from the marine electronics system. The external data may be retrieved from the Internet or any other source. The external data may include various environmental data, such as, e.g., atmospheric temperature, tidal data, weather, moon phase, sunrise, sunset, water levels, historic fishing data, and other fishing data.
In one implementation, the marine electronics device 500 may be a multi-function display (MFD) unit, such that the marine electronics device 500 may be capable of displaying and/or processing multiple types of marine electronics data.
The discussion of the present disclosure is directed to certain specific implementations. It should be understood that the discussion of the present disclosure is provided for the purpose of enabling a person with ordinary skill in the art to make and use any subject matter defined herein by the subject matter of the claims.
It should be intended that the subject matter of the claims not be limited to the implementations and illustrations provided herein, but include modified forms of those implementations, including portions of the implementations and combinations of elements of different implementations within the scope of the claims. It should be appreciated that in the development of any such implementation, as in any engineering or design project, numerous implementation-specific decisions should be made to achieve a developer's specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another. Moreover, it should be appreciated that such a development effort may be complex and time consuming, but would nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having benefit of this disclosure. Nothing in this application should be considered critical or essential to the claimed subject matter unless explicitly indicated as being “critical” or “essential.”
Reference has been made in detail to various implementations, instances of which are illustrated in the accompanying drawings and figures. In the following detailed description, numerous specific details are set forth to provide a thorough understanding of the present disclosure. However, the present disclosure may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
It should also be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For instance, a first object or step could be termed a second object or step, and, similarly, a second object or step could be termed a first object or step, without departing from the scope of the invention. The first object or step, and the second object or step, are both objects or steps, respectively, but they are not to be considered the same object or step.
The terminology used in the description of the present disclosure herein is for the purpose of describing particular implementations and is not intended to limit the present disclosure. As used in the description of the present disclosure and appended claims, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. The term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. The terms “includes,” “including,” “comprises,” and/or “comprising,” when used in this specification, specify a presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof.
As used herein, the term “if” may be construed to mean “when” or “upon” or “in response to determining” or “in response to detecting,” depending on the context. Similarly, the phrase “if it is determined” or “if [a stated condition or event] is detected” may be construed to mean “upon determining” or “in response to determining” or “upon detecting [the stated condition or event]” or “in response to detecting [the stated condition or event],” depending on the context. As used herein, the terms “up” and “down”; “upper” and “lower”; “upwardly” and “downwardly”; “below” and “above”; and other similar terms indicating relative positions above or below a given point or element may be used in connection with some implementations of various technologies described herein.
While the foregoing is directed to implementations of various techniques described herein, other and further implementations may be devised without departing from the basic scope thereof, which may be determined by the claims that follow.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.