Embodiments of the present invention relate generally to presentation of marine data and interaction with marine electronic device(s) and, more particularly, to providing an improved user interface and experience for presenting display features on a marine electronic device.
Marine electronic devices are used to present marine data and other types of data to a user while using a marine watercraft. With technological improvements, more types of data are available to be collected and the amount of data collected has increased. Thus, to present all available data, the display capabilities and functionality of marine electronic devices have similarly increased. In order to present the data, menus and other command centers are added onto the display area, effectively causing the functional visual area of the display to decrease in size.
As a consequence of the increased data collection and presentation, the marine electronic devices require a greater amount of energy. In this regard, to increase the functional life of the marine electronic device (e.g., battery life), the marine electronic devices may employ power saving features while on the watercraft. This, of course, should be balanced with desirable access for a user.
Thus, there exists a need to allow effective presentation of data (e.g., in charts and/or images) while maintaining the functionality of utilizing menus and the optionality of switching between multiple display pages to see/use the different options. Further, with the increased functionality, methods to preserve the battery life of the marine electronic device while maintaining the functionality of the device are desired.
As noted above, it can be difficult to present all of the marine data gathered during the course of a trip on the display of a marine electronic device. Even with improvements in display sizes, there is a desire to see the current image, chart, or screen across the whole display. However, the added functionality, including menus, toggle screens, data displays, and other options, may take up space on the display, thereby preventing the user from seeing the entire image on the display. Notably, marine electronic devices are often used as aids by presenting the marine data while a user engages in other marine activities (e.g., fishing, driving, etc.) and, thus, maximizing the functional visual display space is important (such as making the sonar image as large as possible to enable the user to see the fish).
Various embodiments of the present invention use sensors to determine when a user wishes to “interact” with the display. In some such embodiments, an interactive menu (which is used to access the vast functionality of the marine electronic device) can be hidden until the sensor(s) detect a user (at which time the interactive menu can be presented on the display for use thereof). This allows the functional visual display space to be used for presenting the desirable marine data instead, with the interactive menu only presented when needed. Accordingly, various embodiments of the present invention utilize sensor(s) to detect the presence of a user and, in response to detecting the presence of the user, adjust the presentation of the display, such as by adding one or more interactive menus. In some embodiments, a proximity sensor may be used to determine when a user is within a sensing zone of the display. In some embodiments, based on the user's position within the sensing zone, different menus or options may be presented on the display for the user to select. Further, the sensor may be configured to engage or disengage a power save mode based on the position of the user within the sensing zone. In some embodiments, a camera or other sensor type device may be used.
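By way of a non-limiting illustration, the following Python sketch shows one possible form of this detect-then-present logic. The Display model, the sensing zone threshold, and all method names are hypothetical stand-ins introduced only for illustration, not part of this disclosure.

```python
# Minimal sketch: present an interactive menu only while a user is detected.
# All classes, thresholds, and method names here are hypothetical.

SENSING_ZONE_METERS = 1.0  # assumed detection range for the sensing zone


class Display:
    def __init__(self) -> None:
        self.menu_visible = False

    def show_menu(self) -> None:
        if not self.menu_visible:
            self.menu_visible = True
            print("Interactive menu presented over the image.")

    def hide_menu(self) -> None:
        if self.menu_visible:
            self.menu_visible = False
            print("Interactive menu hidden; full image restored.")


def update_display(display: Display, user_distance_m: float | None) -> None:
    """Show the menu while a user is inside the sensing zone; hide it otherwise.

    user_distance_m is None when the sensor reports no user present.
    """
    if user_distance_m is not None and user_distance_m <= SENSING_ZONE_METERS:
        display.show_menu()
    else:
        display.hide_menu()


if __name__ == "__main__":
    display = Display()
    for reading in (None, 2.5, 0.8, 0.4, None):  # simulated sensor readings
        update_display(display, reading)
```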
In an example embodiment, a system for presenting marine data is provided. The system comprises a sensor, a display, at least one processor, and a memory including computer program code. The computer program code is configured to, when executed, cause the at least one processor to cause, on the display, presentation of at least one image. The at least one image covers a first area of the display which extends at least partially across the display. The computer program code is further configured to cause the processor to detect, with the sensor, a presence of a user within a sensing zone, and to cause, on the display, in response to detecting the presence of the user within the sensing zone, an interactive menu to be presented within at least a portion of the first area of the display, wherein the interactive menu was not previously presented within the portion of the display.
In some embodiments, the sensor may be a proximity sensor. In some embodiments, the interactive menu may be presented over the at least one image. In some embodiments, the computer program code may be further configured to, when executed, cause the at least one processor to cause, on the display, presentation of the at least one image to be reduced to a second area, the second area being smaller than the first area. The second area and the at least a portion of the first area of the display presenting the interactive menu may not overlap.
In some embodiments, the at least a portion of the first area may be an edge of the display. In some embodiments, the at least a portion of the first area may be a top edge of the display. In some embodiments, the at least a portion of the first area may be a first side edge and a second side edge of the display.
In some embodiments, the at least one image may be a first image and a second image. The first image may cover a third area of the display, and the second image may cover a fourth area of the display. The interactive menu may be presented within either a first portion of the third area of the display, a second portion of the fourth area of the display, or both the first portion of the third area and the second portion of the fourth area of the display.
In some embodiments, the interactive menu may be a first interactive menu and a second interactive menu. The computer program code may be configured to, when executed, cause the at least one processor to cause, on the display, in response to detecting the presence of the user in a first sensing region, the first interactive menu to be presented within a first portion of the first area. The computer program code may be further configured to cause, on the display, in response to detecting the presence of the user in a second sensing region, the second interactive menu to be presented within a second portion of the first area. The first sensing region may be different than the second sensing region.
In some embodiments, the sensing zone may be a first sensing zone and a second sensing zone, wherein the first sensing zone is a first distance away from the display and the second sensing zone is a second distance away from the display. The first distance and the second distance are different. The computer program code may be further configured to, when executed, cause the processor to cause, on the display, in response to detecting the presence of the user within the first sensing zone, presentation of an interactive menu indication. The computer program code may be further configured to, when executed, cause the processor to cause, on the display, in response to detecting the presence of the user within the second sensing zone, presentation of the interactive menu.
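As a rough illustration of the two-zone behavior summarized above, the short Python sketch below maps a detected distance to one of three presentation states; the specific distances are assumed values chosen only for the example.

```python
# Sketch of the two-zone behavior: a farther zone triggers a menu
# indication, a nearer zone triggers the full interactive menu.
# The distances are illustrative assumptions.

FIRST_ZONE_M = 1.5   # farther from the display: show an indication only
SECOND_ZONE_M = 0.5  # nearer to the display: show the interactive menu


def presentation_state(user_distance_m: float | None) -> str:
    if user_distance_m is None or user_distance_m > FIRST_ZONE_M:
        return "image_only"
    if user_distance_m > SECOND_ZONE_M:
        return "menu_indication"   # user is within the first sensing zone
    return "interactive_menu"      # user is within the second sensing zone


assert presentation_state(None) == "image_only"
assert presentation_state(1.0) == "menu_indication"
assert presentation_state(0.3) == "interactive_menu"
```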
In some embodiments, the presence of the user may comprise a user gesture. In some embodiments, the computer program code may be further configured to, when executed, cause the at least one processor to cause, on the display, in response to detecting non-presence of the user, presentation of the interactive menu to cease within the portion of the first area of the display.
In another example embodiment, a method of presenting marine data is provided. The method comprises causing, on a display, presentation of at least one image which covers a first area of the display. The first area of the display extends at least partially across the display. The method continues by detecting, with a sensor, a presence of a user within a sensing zone. The method continues by causing, on the display, in response to detecting the presence of the user within the sensing zone, an interactive menu to be presented within at least a portion of the first area of the display, wherein the interactive menu was not previously presented within the portion of the first area of the display.
In some embodiments, the sensor may be a proximity sensor. In some embodiments, the interactive menu may be partially transparent. In some embodiments, the presentation of the at least one image may be reduced to a second area upon presentation of the interactive menu on the display in the portion of the first area. The second area and the portion of the first area may be distinct. In some embodiments, the at least a portion of the first area may be an edge of the display.
In some embodiments, the at least one image may be a first image and a second image. The first area may be a third area and a fourth area. The first image may be presented within the third area and the second image may be presented within the fourth area. The interactive menu may be presented in either a first portion of the third area, a second portion of the fourth area, or both the first portion of the third area and the second portion of the fourth area.
In some embodiments, the sensing zone may be a first sensing region and a second sensing region. The method may further comprise causing, on the display, in response to detecting the presence of the user in the first sensing region, presentation of a first interactive menu within a first portion of the first area of the display. The method may further comprise causing, on the display, in response to detecting the presence of the user in the second sensing region, presentation of a second interactive menu within a second portion of the first area of the display. The first portion and the second portion may be distinct.
In yet another example embodiment, a system for presenting marine data is provided. The system comprises a user input device, a display, at least one processor, and a memory including computer program code. The computer program code is configured to, when executed, cause the at least one processor to cause, on the display, presentation of at least one image, which covers a first area of the display, which extends at least partially across the display. The computer program code is further configured to, when executed, cause the at least one processor to receive, from the user input device, an indication of a user engaging the user input device, and cause, on the display, in response to the user engaging the user input device, an interactive menu to be presented within at least a portion of the first area of the display, wherein the interactive menu was not previously presented within the portion of the first area of the display.
In some embodiments, the user input device may be a button. In some embodiments, the user input device may be a remote. In some embodiments, the interactive menu may be presented as an overlay of the at least one image.
In yet another example embodiment, a system for presenting marine data is provided. The system comprises a display, a camera, at least one processor, and a memory including computer program code. The camera is user-facing such that a lens of the camera is oriented outwardly from the display towards a direction to capture a user attempting to interact with the display. The computer program code is configured to, when executed, cause the at least one processor to cause, on the display, presentation of at least one image, which covers a first area of the display, which extends at least partially across the display. The computer program code is further configured to, when executed, cause the at least one processor to detect, with the camera, a presence of the user within a sensing zone, and cause, on the display, in response to detecting the presence of the user within the sensing zone, an interactive menu to be presented within at least a portion of the first area of the display, wherein the interactive menu was not previously presented within the portion of the first area of the display.
In some embodiments, the computer program code may be further configured to, when executed, cause the at least one processor to detect, with the camera, a distance of the user from the display, and cause, on the display, in response to a detected distance of the user, a display action. The display action may be one of: turning off the display, waking up the display, presenting the interactive menu on the display, or locking the display. In some embodiments, the computer program code may be further configured to, when executed, cause the at least one processor to detect, with the camera, a gesture of the user, and cause, on the display, in response to detecting the gesture of the user, an action to be taken on the display. The action may be one of: selecting the interactive menu, selecting an option from the interactive menu, removing the interactive menu, or locking the display. In some embodiments, the camera may be remote from the display.
In yet another embodiment, a method for presenting marine data is provided. The method comprises causing, on a display, presentation of at least one image. The at least one image covers a first area of the display, which extends at least partially across the display. The method further comprises detecting, with a camera, a presence of a user within a sensing zone. The camera is user-facing such that a lens of the camera is oriented outwardly from the display towards a direction to capture a user attempting to interact with the display. The method further comprises causing, on the display, in response to detecting the presence of the user within the sensing zone, an interactive menu to be presented within at least a portion of the first area of the display, wherein the interactive menu was not previously presented within the portion of the first area of the display.
In some embodiments, the method may further comprise detecting, with the camera, a distance of the user from the display, and causing, on the display, in response to a detected distance of the user, a display action. The display action may be one of: turning off the display, waking up the display, presenting the interactive menu on the display, or locking the display.
Reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:
Example embodiments of the present invention now will be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all embodiments of the invention are shown. Indeed, the invention may be embodied in many different forms and should not be construed as limited to the example embodiments set forth herein; rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements. Like reference numerals refer to like elements throughout.
Depending on the configuration, the watercraft 100 may include a primary motor 106, which may be a main propulsion motor such as an outboard or inboard motor. Additionally, the watercraft 100 may include a trolling motor 108 configured to, for example, propel the watercraft 100 and/or maintain a position.
The watercraft 100 may also include one or more marine electronic devices 160, such as may be utilized by a user to interact with, view, or otherwise control various functionality regarding the watercraft, including, for example, nautical charts and various sonar systems. In the illustrated embodiment, the marine electronic device 160 is positioned proximate the helm (e.g., steering wheel) of the watercraft 100—although other places on the watercraft 100 are contemplated. Likewise, additionally or alternatively, a user input device 165 (such as a remote, or a user's mobile device) may include functionality of a marine electronic device. In some embodiments, a camera 180 may be a remote device, while in other embodiments the camera 180 may be integral to the marine electronic device 160.
The watercraft 100 may also comprise other components, such as within the one or more marine electronic devices 160 or at the helm.
In some embodiments, the marine electronic device 260 may include a user input device 206. The user input device 206 may be configured to select options from menus, toggle between screens, or similar. In some embodiments, the user input device 206 may be a button integral to the display 261. In other embodiments, the user input device 206 may be a remote device, for example, a cellular device (e.g., 165).
In some embodiments, the marine electronic device 260 may include a camera 280 associated with (e.g., near, pointed toward, and/or embedded within) the display 261. In some embodiments, the camera 280 may be remote to the marine electronic device 260 while maintaining data communication between the camera 280 and the marine electronic device 260. The camera 280 may be a user-facing camera. In this regard, the camera 280 may face outward from the display 261 in a direction of the user of the marine electronic device 260.
In some embodiments, the display 261 may have a display area 220 including the total useable space (e.g., functional visual display space) for presenting information on the display 261. In the illustrated embodiment, an image 215, marine data 270 (e.g., marine electronic device data, heading and location data, time data, wireless internet status, etc.), an interactive menu indication 241 (grid view, list view, compass, etc.), and an interactive menu 240 (e.g., zoom level) are all illustrated within the display area 220. Each of these options takes up a portion of what would otherwise be useable display area 220 for presenting information that the user may desire to see at all times.
Thus, all of the “extras” (e.g., interactive menus 240, interactive menu indication 241 and marine data 270) take up space on the display 261 which otherwise could be used to show more of the image 215 presented on the display 261. To explain, rather than the at least one image 215 covering the entire display area 220, the image 215 is only viewable in a reduced area 222. Thus, although there is more useable space within the display 261, the display area 220 is not fully utilized by the image 215, but also includes dead (or already covered) space due to the marine data 270, the interactive menu indication 241 and the interactive menu 240. Here, the reduced area 222 is about 20% less than the display area 220, and, therefore, the image 215 only includes about 80% of the relevant image.
In addition to reducing the amount of space about the display area 220, the interactive menu indications 241, the interactive menus 240 and the marine data 270 may cover portions of the image 215 which may be useful to the user. For example, when the image 215 is a sonar image, any of the interactive menus 240, interactive menu indications 241, or the marine data 270 may be presented within the display area 220 and cover a shadow, or an indication of a fish, hazard, or point of interest within the marine environment.
Thus, to increase the amount of space available on the display, the marine electronic device may comprise a sensor to determine the presence of a user and present one or more interactive menus (which may include interactive menu indications or other marine data) on the display.
In some embodiments, at least one image 315 may be presented on the display 361. The at least one image 315 may cover a first area 324 of the display 361, which may encompass the entire display area 320. Further, the first area 324 may cover the entire sensing area 330.
In some embodiments, the interactive menu may be tied to the sensing zone where the user is detected.
In some embodiments, each of the first region 430a, second region 430b, third region 430c and the fourth region 430d may overlap, while in other embodiments each of the regions may be distinct. Therefore, in embodiments wherein the user 410 is positioned within more than one region, each of the corresponding interactive menus may appear. For example, if the user 410 is positioned over the first region 430a and the third region 430c, the first interactive menu 440a and the third interactive menu 440c may appear. Additionally, if the user 410 is positioned within the sensing zone 430 but not in a specific region, each of the interactive menus 440a, 440b, 440c, 440d may appear on the display.
In some embodiments, the interactive menus 440a, 440b, 440c, 440d may appear and disappear based on the location of the user 410. To explain, the first interactive menu 440a may appear while the user 410 is within the first region 430a and may disappear after a time period once the user 410 is no longer in the first region 430a. In some embodiments, the time period may be at least 5 seconds, at least 10 seconds, or at least 15 seconds. In some embodiments, the time period may be adjusted based on user preference.
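One way to realize this appear/disappear timing is sketched below in Python: each region's menu remains visible while the region is occupied and for a grace period afterward. The RegionMenu class, the timeout constant, and the use of a monotonic clock are assumptions made only for illustration.

```python
# Sketch: menu visibility with a grace period after the user leaves a region.
import time

MENU_TIMEOUT_S = 10.0  # e.g., "at least 10 seconds"; adjustable per user preference


class RegionMenu:
    """Tracks visibility of the interactive menu tied to one sensing region."""

    def __init__(self, name: str) -> None:
        self.name = name
        self.last_seen: float | None = None  # when a user last occupied the region

    def user_detected(self) -> None:
        self.last_seen = time.monotonic()

    def is_visible(self) -> bool:
        if self.last_seen is None:
            return False
        return (time.monotonic() - self.last_seen) <= MENU_TIMEOUT_S


menu = RegionMenu("first region")
menu.user_detected()
print(menu.is_visible())  # True while within the timeout window
```

Because visibility is derived from a timestamp rather than toggled directly, a menu naturally remains showing for a time while the user moves to an adjacent region, matching the overlapping behavior described below.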
Similarly, when the user 410 moves from one region to another region, the corresponding interactive menus may appear and disappear. For example, the user 410 may move from the first region 430a to the second region 430b; thus, the first interactive menu 440a may appear while the user 410 is within the first region 430a and may remain showing while the user 410 moves to the second region 430b. Once the user 410 is positioned in the second region 430b, the second interactive menu 440b may appear. In some embodiments, the first interactive menu 440a may remain presented on the display 461 for a time period before disappearing. Thus, when the user 410 moves between two regions, each respective interactive menu may be presented simultaneously.
In some embodiments, the marine electronic device 460 may use the camera 480 to detect the user 410 and/or a gesture of the user 410. In some embodiments, the gesture may be assigned to an interactive menu 440a, 440b, 440c, 440d. For example, pointing to the left may cause presentation of the first interactive menu 440a, while pointing to the right may cause presentation of the second interactive menu 440b. In other embodiments, the gesture of the user 410 may correspond to an action to be taken on the display. The action may be, for example, selecting the interactive menu 440, selecting an option from the interactive menu, removing the interactive menu, or locking the display 461.
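A minimal sketch of such gesture handling, assuming a recognizer elsewhere produces simple string labels, might map each label to a display action as below; the labels and the chosen actions are illustrative only.

```python
# Sketch: dispatch recognized gesture labels to display actions.
# Gesture labels and actions are hypothetical examples.

def show_first_menu() -> None:
    print("First interactive menu presented.")

def show_second_menu() -> None:
    print("Second interactive menu presented.")

def dismiss_menus() -> None:
    print("Interactive menus removed.")

def lock_display() -> None:
    print("Display locked.")

GESTURE_ACTIONS = {
    "point_left": show_first_menu,
    "point_right": show_second_menu,
    "swipe_down": dismiss_menus,
    "closed_fist": lock_display,
}

def handle_gesture(label: str) -> None:
    action = GESTURE_ACTIONS.get(label)
    if action is not None:  # unrecognized gestures are ignored
        action()

handle_gesture("point_left")
```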
In some embodiments, each of the interactive menus 440a, 440b, 440c, 440d may be presented as an overlay on the at least one image 415. In other embodiments, each of the interactive menus 440a, 440b, 440c, 440d may be presented adjacent to the at least one image 415. In this regard, the presentation of the at least one image 415 may shrink such that the interactive menus 440a, 440b, 440c, and 440d are not overlaid on the at least one image 415 but are each adjacent to the at least one image 415.
Marine electronic devices may be configured to present more than one image on the display at a time. In this regard, the display may utilize a split screen configuration. In some embodiments, the split screen configuration may include two images, while in other embodiments the split screen configuration may include three or more images. In this regard, interactive menus corresponding to each of the images may have features that are specific to that image, while other interactive menus may be generic for all or multiples of the images. For example, a sonar image may include menu options to increase the frequency, change the contrast, flip left/right, change the view, change the clarity, etc., while a navigational chart may include menu options to add a waypoint, start or end a route, track a route, etc.
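The sketch below illustrates one way such routing could work, with each pane's image type selecting its own menu options; the option lists echo the examples above, while the shared options and the data structure are assumptions.

```python
# Sketch: per-image menu options plus options shared across all panes.

IMAGE_SPECIFIC_OPTIONS = {
    "sonar": ["frequency", "contrast", "flip left/right", "view", "clarity"],
    "chart": ["add waypoint", "start route", "end route", "track route"],
}
SHARED_OPTIONS = ["zoom", "brightness"]  # assumed generic options


def menu_options_for(image_type: str) -> list[str]:
    # Unknown image types fall back to the shared options only.
    return IMAGE_SPECIFIC_OPTIONS.get(image_type, []) + SHARED_OPTIONS


print(menu_options_for("sonar"))
print(menu_options_for("chart"))
```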
In some embodiments, the display 561 may present a first image 516 within a third area 527 of the display 561 and a second image 517 within a fourth area 528 of the display 561 in a split screen configuration.
In some embodiments, the first interactive menu 540a may be presented in a first portion 547 of the third area 527, the second interactive menu 540b may be presented in a second portion 548 of the fourth area 528, and the third interactive menu 540c may be presented along a top portion 549 of the display 561. In some embodiments, the top portion 549 of the display 561 may be within each of the third area 527 and the fourth area 528. In some embodiments, each of the first interactive menu 540a, the second interactive menu 540b, and the third interactive menu 540c may be overlaid on the first image 516 and/or the second image 517.
As the user 510 moves within the first sensing zone 538, the interactive menu corresponding to the first sensing zone 538 (e.g., the first interactive menu 540a) may be presented on the display 561. Similarly, as the user 510 moves to the second sensing zone 528, the interactive menu corresponding to the second sensing zone 528 (e.g., the second interactive menu 540b) may be presented on the display 561.
Additionally, in some embodiments, the display 561 may highlight the portion of the screen where the sensor is active. To explain, the display may highlight around the active area of the display 561.
In some embodiments, the highlight may be a color border about the sensing zone. In other embodiments, other methods of highlighting may be used, such as, for example, a patterned border, a text indication, an image indication, etc., about or within the sensing zone.
In some embodiments, a distance component may be utilized regarding determining when and/or what to present.
As the user 610 moves closer to the display 661, the display actions and/or the sensitivity of the display 661 (e.g., a touch screen) may change.
In some embodiments, as the user 610 moves closer to the display 661, the interactive menu 640 may be presented on the display 661.
After selection from the interactive menu 640, or upon the user 610 retreating from the display 661 to a fourth distance D4, the interactive menu 640 may retreat such that the at least one image 615 is presented across the entire display area.
In some embodiments, in addition to or in the alternative to presenting the interactive menu 640 on the display 661, the relative distance between the user 610 and the display 661 may result in relative sensitivity of the display 661. To explain, when on a watercraft, the marine electronic device 660 may be exposed to the elements, unknowing passengers, marine life, and even pets. In order to prevent accidental interaction (e.g., with a dog's tail) the marine electronic device 660 may have variable sensitivity. In this regard, when the user 610 is at the first distance D1, the display 661 may be the least sensitive, or may be locked, such that anything that contacts the display 661 may not cause an action to be taken. As the user 610 approaches the marine electronic device 660, the sensitivity may increase. Thus, when the user 610 is closer to the display 661 it may be easier to select and interact with the marine electronic device.
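To make the distance-based progression concrete, the sketch below maps a detected user distance to both a display action and a touch sensitivity level, following the D1 through D4 progression described above; the distance bands and sensitivity values are assumed solely for illustration.

```python
# Sketch: user distance selects a display action and a touch sensitivity.

def display_state(distance_m: float) -> tuple[str, float]:
    """Return (action, touch_sensitivity) for a given user distance.

    A sensitivity of 0.0 means the touch screen is effectively locked,
    so stray contact (e.g., a dog's tail) causes no action.
    """
    if distance_m > 2.0:       # far away: lock against accidental touches
        return "locked", 0.0
    if distance_m > 1.0:       # approaching: wake the display
        return "wake", 0.25
    if distance_m > 0.5:       # near: present the interactive menu
        return "show_menu", 0.6
    return "interactive", 1.0  # within reach: full sensitivity


for d in (3.0, 1.5, 0.7, 0.2):
    print(d, display_state(d))
```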
In some embodiments, the marine electronic device 660 may utilize the camera to determine if the user 610 is an authorized user. In this regard, children, or other people who are not versed in using the marine electronic device 660, may not be able to access and/or control the marine electronic device 660. To explain, the marine electronic device 660 may be configured to use the camera 680 to determine if the user 610 is authorized. If the user 610 is authorized, then the marine electronic device 660 may unlock, and allow the user 610 to interact with the marine electronic device 660. However, if the user 610 is not authorized, the marine electronic device 660 may present the at least one image 615, but not allow selection of any of the interactive menus 640.
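A simplified sketch of this authorization gate follows; because no particular recognition technique is specified, the identification step is abstracted into a hypothetical predicate operating on an example enrollment list.

```python
# Sketch: gate menu interaction on whether the detected user is authorized.
# The identification itself (e.g., from camera imagery) is out of scope here.

AUTHORIZED_USERS = {"alice", "bob"}  # example enrollment list


def is_authorized(identified_user: str | None) -> bool:
    # In practice the identity would come from analysis of camera data;
    # here it is simply a string (or None when unidentified).
    return identified_user in AUTHORIZED_USERS


def on_user_detected(identified_user: str | None) -> str:
    if is_authorized(identified_user):
        return "unlocked: interactive menus selectable"
    return "view only: image presented, menu selection disabled"


print(on_user_detected("alice"))  # unlocked
print(on_user_detected(None))     # unidentified user: view only
```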
In some circumstances, the user may wish to select the options from the interactive menus remotely. Accordingly, in some embodiments, the user may have access to a separate user input device (e.g., 165).
The marine electronic device may require an energy supply for power. In order to preserve the energy supply, the marine electronic device may include power saving features for when not in use.
Although in power save mode, the marine electronic device may be “ON”, and may be sensing the location and/or presence of the user 710 with a sensor. In some embodiments, the sensor may be a proximity sensor, while in other embodiments, the sensor may be a camera 780 built into the display 761 of the marine electronic device 760. In this regard, the camera 780 may be a user-facing camera, such that the lens is facing out the front of the display 761.
In some embodiments, the sensor may be on in the background, determining the position of the user 710. At a second distance DB, as the user 710 approaches the marine electronic device 760, the marine electronic device 760 may exit the power save mode (e.g., wake the display 761).
Similarly, in some embodiments, the power save mode setting may be chosen by the user. In some embodiments, the power save mode may be entered and exited with a remote device, for example, a remote, a phone, or similar. In some embodiments, the power save features may be turned on or off depending on user preference. For example, the power save features may be turned off when the user 710 is fishing and is not directly looking at the display 761. In this regard, the user 710 may be able to glance back at the display 761 and see the display without having to move closer to the display 761. The power save feature may be turned on when the display is located away from the view of the user 710 (e.g., in an engine room). Similarly, the power save feature may be turned on when the marine electronics device 760 is used overnight. In some embodiments, the marine electronics device 760 may switch the power save feature on/off at a certain time. For example, the marine electronics device 760 may turn the power save feature on one hour after sunset, two hours after sunset, or a predetermined time after sunset. Similarly, the marine electronics device 760 may turn the power save feature off one hour before sunrise, two hours before sunrise, or a predetermined time before sunrise.
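As one illustration of this time-based switching, the sketch below enables the power save feature a configurable offset after sunset and disables it a configurable offset before sunrise; the fixed sunset and sunrise times are example values, as in practice they would be derived from position and date information.

```python
# Sketch: switch the power save feature on/off around sunset and sunrise.
from datetime import datetime, timedelta

SUNSET = datetime(2024, 6, 1, 20, 30)   # example sunset time
SUNRISE = datetime(2024, 6, 2, 5, 45)   # example sunrise time (next morning)
ON_OFFSET = timedelta(hours=1)          # e.g., enable one hour after sunset
OFF_OFFSET = timedelta(hours=1)         # e.g., disable one hour before sunrise


def power_save_enabled(now: datetime) -> bool:
    return SUNSET + ON_OFFSET <= now < SUNRISE - OFF_OFFSET


print(power_save_enabled(datetime(2024, 6, 1, 22, 0)))  # True: overnight
print(power_save_enabled(datetime(2024, 6, 2, 5, 30)))  # False: near sunrise
```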
In some embodiments, the user may begin interacting with the display remotely, such as through a remote device (e.g., a phone, a remote control, a foot pedal, etc.). In such situations, one or more of the above functionality related to sensing may occur instead in response to a button press on the remote device. For example, a button press on a remote device may cause one or more interactive menus (or interactive menu indicators) to be presented on the display, such as within the first area 324 of the display 361.
The marine electronic device 860 may include at least one processor 810, a memory 820, a communication interface 830, a user interface 835, a display 861, an autopilot 850, and one or more sensors (e.g., position sensor 845, direction sensor 848, proximity sensor 885, other sensors 852). One or more of the components of the marine electronic device 860 may be located within a housing or could be separated into multiple different housings (e.g., be remotely located).
The processor(s) 810 may be any means configured to execute various programmed operations or instructions stored in a memory device (e.g., memory 820) such as a device or circuitry operating in accordance with software or otherwise embodied in hardware or a combination of hardware and software (e.g., a processor operating under software control or the processor embodied as an application specific integrated circuit (ASIC) or field programmable gate array (FPGA) specifically configured to perform the operations described herein, or a combination thereof) thereby configuring the device or circuitry to perform the corresponding functions of the at least one processor 810 as described herein. For example, the at least one processor 810 may be configured to analyze sonar data, radar data, and/or chart data. As another example, the processor 810 may be configured to analyze sensor data to determine if a user is detected, which may include determining a relative position of the user, such as distance-wise and/or lateral-wise with respect to the display 861.
In some embodiments, the at least one processor 810 may be further configured to implement signal processing. In some embodiments, the at least one processor 810 may be configured to perform enhancement features to improve the display characteristics of data or images, collect or process additional data, such as time, temperature, GPS information, waypoint designations, current, environmental conditions (e.g., wind speed, wind direction) or others, or may filter extraneous data to better analyze the collected data.
In an example embodiment, the memory 820 may include one or more non-transitory storage or memory devices such as, for example, volatile and/or non-volatile memory that may be either fixed or removable. The memory 820 may be configured to store instructions, computer program code, sonar data, radar data, chart data, and additional data such as, bathymetric data, location/position data in a non-transitory computer readable medium for use, such as by the at least one processor 810 for enabling the marine electronic device 860 to carry out various functions in accordance with example embodiments of the present invention. For example, the memory 820 could be configured to buffer input data for processing by the at least one processor 810. Additionally or alternatively, the memory 820 could be configured to store instructions for execution by the at least one processor 810.
The communication interface 830 may be configured to enable communication to external systems (e.g., an external network 890). In this manner, the marine electronic device 860 may retrieve stored data from a remote device 854 via the external network 890 in addition to or as an alternative to the onboard memory 820. Additionally or alternately, the marine electronic device 860 may store marine data locally, for example within the memory 820. Additionally or alternatively, the marine electronic device 860 may transmit or receive data, such as environmental conditions. In some embodiments, the marine electronic device 860 may also be configured to communicate with other devices or systems (such as through the external network 890 or through other communication networks, such as described herein). For example, the marine electronic device 860 may communicate with a propulsion system of the watercraft 100 (e.g., for autopilot control); a remote device (e.g., a user's mobile device, a handheld remote, etc.); or another system. Using the external network 890, the marine electronic device 860 may communicate with and send and receive data with external sources such as a cloud, server, etc. The marine electronic device 860 may send and receive various types of data. For example, the system may receive weather data, tidal data, alert data, current data, among others. However, this data is not required to be communicated using external network 890, and the data may instead be communicated using other approaches, such as through a physical or wireless connection via the communication interface 830.
The communication interface 830 of the marine electronic device 860 may also include one or more communications modules configured to communicate with one another in any of a number of different manners including, for example, via a network. In this regard, the communication interface 830 may include any of a number of different communication backbones or frameworks including, for example, Ethernet, the NMEA 2000 framework, GPS, cellular, Wi-Fi, or other suitable networks. The network may also support other data sources, including GPS, autopilot, engine data, compass, radar, etc. In this regard, numerous other peripheral devices (including other marine electronic devices or sonar transducer assemblies) may be included in the system 800.
The position sensor 845 may be configured to determine the current position and/or location associated with travel of the marine electronic device 860 (and/or the watercraft 100). For example, the position sensor 845 may comprise a GPS, bottom contour, inertial navigation system, such as a micro-electromechanical sensor (MEMS), a ring laser gyroscope, or other location detection system. Additionally or alternatively, the position sensor 845 may be configured to determine the orientation of the watercraft 100. Alternatively or in addition to determining the location of the marine electronic device 860 or the watercraft 100, the position sensor 845 may also be configured to determine the position and/or orientation of an object outside of the watercraft 100. In some embodiments, the position sensor 845 may be configured to determine a location associated with travel of the watercraft. For example, the position sensor 845 may utilize other sensors 852 (e.g., speed sensor, and/or direction sensor 848) to determine a future position of the watercraft 100 and/or a waypoint along the route of travel.
The proximity sensor 885 may be configured to determine the current position and/or location of a user of the watercraft in relation to the marine electronic device 860, such as described herein. In some embodiments, the proximity sensor 885 may be in data communication with the camera 880 to determine the location of the user in the watercraft. For example, the processor 810 may receive sensor data from, for example, the proximity sensor 885 and/or camera 880 and determine if a user is detected. Along these lines, in some embodiments, the processor 810 may be configured to correlate the data from the proximity sensor 885 and/or the camera 880 and determine a corresponding action. For example, the processor 810 may receive a first position from the proximity sensor 885 and determine the user is a distance from the display 861. The processor 810 may determine an action (e.g., present an indication of an interactive menu, present an interactive menu, etc.) and cause the action to be presented on the display 861.
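The correlation step described above might be sketched as follows, with readings from the proximity sensor 885 and the camera 880 fused into a single detection before an action is chosen; the data model, fusion rule, and threshold are assumptions made for illustration.

```python
# Sketch: fuse proximity-sensor and camera detections, then pick an action.
from dataclasses import dataclass


@dataclass
class Detection:
    present: bool
    distance_m: float | None  # None when no distance estimate is available


def fuse(proximity: Detection, camera: Detection) -> Detection:
    # Trust a positive detection from either sensor; prefer the nearer
    # distance estimate when both sensors report one.
    present = proximity.present or camera.present
    distances = [d.distance_m for d in (proximity, camera)
                 if d.distance_m is not None]
    return Detection(present, min(distances) if distances else None)


def choose_action(detection: Detection) -> str:
    if not detection.present:
        return "none"
    if detection.distance_m is not None and detection.distance_m > 1.0:
        return "present_menu_indication"
    return "present_interactive_menu"


fused = fuse(Detection(True, 1.8), Detection(True, 0.9))
print(choose_action(fused))  # present_interactive_menu
```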
The display 861 (e.g., one or more screens) may be configured to present images and may include or otherwise be in communication with a user interface 835 configured to receive input from a user. The display 861 may be, for example, a conventional LCD (liquid crystal display), a touch screen display, mobile device, or any other suitable display known in the art upon which images may be displayed.
In some embodiments, the display 861 may present one or more sets of data (or images generated from the one or more sets of data). Such data includes chart data, radar data, sonar data, weather data, location data, position data, orientation data, environmental data, or any other type of information relevant to the watercraft. Environmental data may be received from the external network 890, retrieved from the other sensors 852, and/or obtained from sensors positioned at other locations, such as remote from the watercraft. Additional data may be received from marine devices such as a radar, a primary motor 805 or an associated sensor, a trolling motor 808 or an associated sensor, an autopilot 850, a rudder 857 or an associated sensor, a position sensor 845, a direction sensor 848, additional sensors 819, a remote device 854, onboard memory 820 (e.g., stored chart data, historical data, stored sonar data, etc.), or other devices.
The user interface 835 may include, for example, a keyboard, keypad, function keys, mouse, scrolling device, input/output ports, touch screen, or any other mechanism by which a user may interface with the system.
Although the display 861 is described herein as being included within the marine electronic device 860, in some embodiments, the display 861 may be remotely located from the other components of the marine electronic device 860.
The marine electronic device 860 may include one or more other sensors/devices 852, such as configured to measure or sense various other conditions. The other sensors/devices 852 may include, for example, an air temperature sensor, a water temperature sensor, a current sensor, a light sensor, a wind sensor, a speed sensor, a tide sensor, or the like.
Embodiments of the present invention provide methods, apparatus, and computer program products for operating according to various embodiments described herein. Various examples of the operations performed in accordance with embodiments of the present invention will now be provided.
Many modifications and other embodiments of the inventions set forth herein will come to mind to one skilled in the art to which these inventions pertain having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the embodiments of the invention are not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the invention. Moreover, although the foregoing descriptions and the associated drawings describe example embodiments in the context of certain example combinations of elements and/or functions, it should be appreciated that different combinations of elements and/or functions may be provided by alternative embodiments without departing from the scope of the invention. In this regard, for example, different combinations of elements and/or functions than those explicitly described above are also contemplated within the scope of the invention. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.