Embodiments relate generally to systems, devices, and methods that are used to automatically identify an offset of installed devices on a watercraft, such as for improved functionality of such devices relative to the watercraft.
When manually installing devices on a watercraft, users are often uncertain how to mount and align the devices with respect to the watercraft. A device may be installed improperly, or it may be installed properly while the user fails to determine the proper offset between the device direction and the watercraft direction to store in memory for proper device functioning with respect to the watercraft. Users frequently attempt installation without the aid of any other tools, leading to inaccuracies; in some cases, equipment such as trolling motors may be misaligned by forty degrees or more. As a result, users must often calibrate devices through repeated trial and error until the devices are correctly installed, which is time consuming and can cause a significant amount of frustration. When attempting to use an improperly calibrated device such as a trolling motor or another motor, a user may repeatedly miss an intended target and instead circle around the target, wasting time and compounding the frustration.
Systems, devices, and methods described in various embodiments herein increase the accuracy of calibrating devices on a watercraft. Various embodiments accurately and automatically identify an offset so that the offset may be utilized to determine relative position and/or orientation. Once the offset has been identified, the identified amount of offset may be saved and used by other systems. Additionally, or alternatively, once the offset has been identified, one or more indicators of the offset may be provided on an image or in the form of text notifications. This may prevent the user from being required to make repeated attempts to install and calibrate devices on the watercraft.
A user may use a camera to view an image of a device after initial installation of the device on a watercraft (e.g., in a live camera view or via a taken image). Image processing techniques may be utilized to identify edges of the device and the watercraft that are represented in the image. These image processing techniques may be formed and/or optimized using machine learning or artificial intelligence. Once edges of the device and the watercraft are identified, relevant lines or directions for the device and the watercraft may be identified, allowing the offset for the device from the appropriate position to be properly determined.
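Although not part of any particular embodiment, the offset determination described above can be illustrated with a short sketch. Assuming the relevant directions have already been extracted from the image as (x, y) vectors in image coordinates (the function name and vector convention here are illustrative assumptions), the signed angular offset between them might be computed as follows:

```python
import math

def angular_offset(watercraft_dir, device_dir):
    """Return the signed angle (degrees) from the watercraft direction to the
    device direction, with each direction given as an (x, y) vector."""
    ax, ay = watercraft_dir
    bx, by = device_dir
    # atan2 of the 2-D cross product and the dot product gives the signed angle
    angle = math.atan2(ax * by - ay * bx, ax * bx + ay * by)
    return math.degrees(angle)

# A device rotated 15 degrees counterclockwise from the watercraft's forward axis:
device = (-math.sin(math.radians(15)), math.cos(math.radians(15)))
print(round(angular_offset((0.0, 1.0), device), 1))  # → 15.0
```

The sign of the result distinguishes clockwise from counterclockwise misalignment, which is what allows a corrective rotation to be stored and applied.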
In an example embodiment, a system is provided for determining an angular offset for a device attached to a watercraft. The system comprises an electronic device including a camera. The system also comprises one or more processors and a memory including computer program code configured to, when executed, cause the one or more processors to perform various tasks. These tasks include determining, based on an image via the camera, a first direction associated with the watercraft. The image includes at least a portion of the watercraft and at least a portion of the device mounted to the watercraft. The tasks also include determining, based on the image via the camera, a second direction associated with the device, determining an angular offset between the first direction and the second direction, and storing an indication of the angular offset in the memory for use with one or more functions associated with the device.
In some embodiments, the computer program code may be configured to, when executed, cause the one or more processors to determine the first direction associated with the watercraft and determine the second direction associated with the device based on a still image taken by the camera.
In some embodiments, determining the first direction associated with the watercraft based on the image may be accomplished using image processing. Additionally, in some embodiments, determining the first direction associated with the watercraft may be performed using a Hough transform. Furthermore, in some embodiments, the Hough transform may use points on the watercraft to determine the first direction associated with the watercraft, and the points on the watercraft may be positioned on a bow of the watercraft. In some embodiments, the one or more processors may be configured to utilize a model when using the image processing to determine the first direction associated with the watercraft based on the image. The model may be formed based on comparisons of historical shape data for a watercraft or a device with historical additional data, and the model may be developed through machine learning utilizing artificial intelligence.
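As a rough illustration of how a Hough transform might identify a dominant direction from points detected on a watercraft (the accumulator resolution and the sample data below are illustrative assumptions, not a specification of any embodiment):

```python
import math
from collections import Counter

def dominant_line_angle(points, theta_steps=180, rho_res=1.0):
    """Estimate the angle (degrees, measured from the image x-axis) of the
    dominant line through a set of (x, y) edge points using a simple
    Hough-transform voting accumulator."""
    votes = Counter()
    for x, y in points:
        for t in range(theta_steps):
            theta = math.pi * t / theta_steps          # normal direction of a candidate line
            rho = x * math.cos(theta) + y * math.sin(theta)
            votes[(round(rho / rho_res), t)] += 1      # vote for the (rho, theta) bin
    (_, best_t), _ = votes.most_common(1)[0]
    normal_deg = 180.0 * best_t / theta_steps
    return (normal_deg + 90.0) % 180.0                 # the line runs perpendicular to its normal

# Points along a vertical edge (e.g., a keel line running fore-aft in the image):
edge_points = [(10, y) for y in range(100)]
print(dominant_line_angle(edge_points))  # → 90.0
```

In practice the points would come from edge detection on the camera image rather than being synthesized, and a production implementation would typically use an optimized library routine rather than this explicit accumulator.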
In some embodiments, the device may be a component of a trolling motor assembly, and the second direction may be a forward direction of the component of the trolling motor assembly extending outwardly from the watercraft.
In some embodiments, the electronic device may comprise a display, and the computer program code may be configured to, when executed, cause the one or more processors to present a representation of the angular offset on the display. Additionally, in some embodiments, presenting the representation of the angular offset on the display may include presenting text to the user indicating the amount of angular offset. Furthermore, in some embodiments, presenting the representation of the angular offset on the display may include presenting one or more indicators on a live image showing the device, and the indicators may indicate a magnitude of the angular offset.
In some embodiments, the electronic device may be at least one of a cell phone, a tablet, a laptop, a smart watch, or smart glasses. In some embodiments, the image may be taken from a position above the device. In some embodiments, the device may be a sonar transducer, the first direction associated with the watercraft may be a rearward direction of the watercraft, and the second direction may be a pointing direction of the sonar transducer extending outwardly from the watercraft.
In another example embodiment, an electronic device for determining an angular offset for a device attached to a watercraft is provided. The electronic device comprises a camera, one or more processors, and a memory including computer program code configured to, when executed, cause the one or more processors to perform various tasks. The tasks include determining, based on an image via the camera, a first direction associated with the watercraft. The image includes at least a portion of the watercraft and at least a portion of the device mounted to the watercraft. The tasks also include determining, based on the image via the camera, a second direction associated with the device, determining an angular offset between the first direction and the second direction, and storing an indication of the angular offset in the memory for use with one or more functions associated with the device.
In some embodiments, the electronic device may be at least one of a cell phone, a smart phone, a tablet, a laptop, a smart watch, or smart glasses.
In some embodiments, the computer program code may be configured to, when executed, cause the one or more processors to determine the first direction associated with the watercraft and determine the second direction associated with the device based on a still image taken by the camera. In some embodiments, determining the first direction associated with the watercraft based on the image may be accomplished using image processing. Furthermore, in some embodiments, determining the first direction associated with the watercraft may be performed using a Hough transform.
In another example embodiment, a method for determining an angular offset for a device attached to a watercraft is provided. The method comprises determining, based on an image via a camera, a first direction associated with the watercraft. The image includes at least a portion of the watercraft and at least a portion of the device mounted to the watercraft. The method also includes determining, based on the image via the camera, a second direction associated with the device. The method also includes determining an angular offset between the first direction and the second direction and storing an indication of the angular offset in a memory for use with one or more functions associated with the device. In some embodiments, the method may also include determining the first direction associated with the watercraft and determining the second direction associated with the device based on a live image from the camera.
Reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:
Example embodiments now will be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all embodiments are shown. For
Depending on the configuration, the watercraft 100 may include a primary motor 105, which may be a main propulsion motor such as an outboard or inboard motor. Additionally, the watercraft 100 may include a trolling motor 108 configured to propel the watercraft 100 or maintain a position. The one or more sonar transducer assemblies (e.g., 102A, 102B, and/or 102C) may be mounted in various positions and to various portions of the watercraft 100 and/or equipment associated with the watercraft 100. For example, the transducer assembly may be mounted proximate to the transom 106 of the watercraft 100, such as depicted by sonar transducer assembly 102A. The transducer assembly may be mounted to the bottom or side of the hull 104 of the watercraft 100, such as depicted by sonar transducer assembly 102B. The transducer assembly may also be mounted to the trolling motor 108, such as depicted by sonar transducer assembly 102C.
The watercraft 100 may also include one or more marine electronic devices 160, such as may be utilized by a user to interact with, view, or otherwise control various aspects of the various sonar systems described herein. In the illustrated embodiment, the marine electronic device 160 is positioned proximate the helm (e.g., steering wheel) of the watercraft 100—although other locations on the watercraft 100 are contemplated. Likewise, additionally or alternatively, a remote device (such as a user's mobile device) may include functionality of a marine electronic device.
The watercraft 100 may also comprise other components within the one or more marine electronic devices 160 or at the helm. In
One or more sonar transducer assemblies may be attached at different locations on a watercraft. One example location where a sonar transducer assembly may be attached is at the transom of a watercraft, and
The orientation of sonar transducer assemblies may be adjusted so that the sonar transducer assemblies maintain optimal performance.
The sonar transducer assemblies should also be oriented in an appropriate manner relative to other axes. For example, the sonar transducer assembly 175D is rotated about the Z-axis, with the sonar transducer assembly 175D defining an offset angle OA4 relative to the surface 139. This offset angle OA4 may be about 15 degrees. The sonar transducer assembly 175E has minimal rotation about the Z-axis, with the sonar transducer assembly 175E defining an offset angle OA5 relative to the surface 139. This offset angle OA5 may be about zero, and sonar transducer assembly 175E may serve as an example of an ideal orientation for sonar transducer assemblies, with the sonar transducer assembly 175E generally extending parallel to the surface 139. The sonar transducer assembly 175F is rotated about the Z-axis, with the sonar transducer assembly 175F defining an offset angle OA6 relative to the surface 139. This offset angle OA6 may be about 15 degrees.
In some embodiments, the offset angles may be optimized for when the watercraft is moving. For example, when the watercraft is moving at trolling speed, the watercraft may be oriented differently than when the watercraft is moving at a maximum speed or when the watercraft is not moving at all. Where the sonar transducer assemblies are oriented improperly, bubbles may be formed beneath the sonar transducer assemblies as the watercraft moves through the water, and this may cause a degradation in the sonar data obtained from the sonar transducer assemblies. For example, the sonar data may not be accurate all the way to the bottom of the body of water due to interference from these bubbles.
As noted previously, equipment may be installed with an offset, and users may not be able to accurately calibrate the equipment based on this offset. These calibration issues may cause the device to work sub-optimally.
Electronic devices may be provided that are configured to calibrate the trolling motor housing 224 or other devices based on the amount of offset (e.g., angular offset as shown, but other types of offset are contemplated, such as linear offset).
An electronic device 332 is illustrated, and the electronic device 332 may include a camera, one or more processors, and memory including computer program code. The electronic device 332 may also include a screen thereon that is configured to present an image in a top pane 333A. The computer program code may be configured to cause the processor(s) to receive images from the camera (e.g., live or taken images) and to determine a first direction A1 associated with the watercraft as indicated by the representation of the first direction A1′ in
The determination of the first direction A1 and/or the second direction B1 associated with the watercraft may be performed using one or more images taken by the camera. These images may be taken (e.g., captured) images or live images. The determination may be accomplished through image processing techniques such as a Hough transform. This determination of these directions A1, B1 may be performed using data regarding points on the watercraft that are positioned on a bow of the watercraft. However, data used for Hough transforms may be different in other embodiments where devices are being positioned at different locations on a watercraft. For example, where a sonar transducer assembly is being mounted at a transom of a watercraft, on a motor of the watercraft, or at another location on the watercraft, the Hough transform may utilize data regarding points positioned at the transom of the watercraft or positioned at other points on the watercraft. Points at edges of features may be the focus of image processing techniques, but other points may be considered as well in some embodiments.
The first direction A1 and/or the second direction B1 may be determined by analyzing data for points at edges of features in the images. For example, as illustrated in
Based on the determined first direction A1 and the determined second direction B1, the computer program code may be configured to cause the processor(s) to determine an angular offset θ1 between the first direction A1 and the second direction B1. A representation of this angular offset θ1′ is represented in
The screen of the electronic device 332 includes a top pane 333A, a bottom pane 334A, and a selection button 336A. In
The electronic device 332 may be positioned proximate to the trolling motor housing 224 of
The electronic device may include a camera in some embodiments. For example,
In some embodiments, a user may use tape or another similar material to assist with image processing. For example, in
The approaches described herein may also be used to assist in positioning other devices and to assist in positioning devices at other locations on a watercraft.
The watercraft 500 may define a first direction A2, and this first direction A2 is a rearward direction of the watercraft 500 in
Electronic devices of various embodiments described herein may be configured to determine the offset for the sonar transducer assembly or other devices.
An electronic device 632 is illustrated, and the electronic device 632 may include a camera, one or more processors, and memory including computer program code. The electronic device 632 may also include a screen thereon that is configured to present an image in a top pane 633A. The computer program code may be configured to cause the processor(s) to determine a first direction A2 associated with the watercraft as indicated by the representation of the first direction A2′ in
The screen of the electronic device 632 includes a bottom pane 634A and a selection button 636A. In
While the electronic device 332 of
While
In some embodiments, the electronic device 632 may be positioned relative to the sonar transducer assembly to identify an offset in the position or orientation of the sonar transducer assembly, and the orientation or position of sonar transducer assemblies may be adjusted so that the sonar transducer assemblies maintain optimal performance. Looking at the sonar transducer assemblies 175A-175C in
Turning now to the sonar transducer assemblies 175D-175F in
The transducer may be mounted at an appropriate height at the rear of the boat and angled up or down. The transducer must be in the water, but not submerged too deeply, and must be pointed in the right direction for the correct angle of view. See the diagram below.
The watercraft may have systems thereon including various electrical components, and
The marine electronic device 760 may include at least one processor 710, a memory 720, a communications interface 778, a user interface 735, a display 740, autopilot 750, and one or more sensors (e.g. position sensor 745, direction sensor 748, other sensors/devices 752). One or more of the components of the marine electronic device 760 may be located within a housing or could be separated into multiple different housings (e.g., be remotely located).
The processor(s) 710 may be any means configured to execute various programmed operations or instructions stored in a memory device (e.g., memory 720) such as a device or circuitry operating in accordance with software or otherwise embodied in hardware or a combination of hardware and software (e.g. a processor operating under software control or the processor embodied as an application specific integrated circuit (ASIC) or field programmable gate array (FPGA) specifically configured to perform the operations described herein, or a combination thereof) thereby configuring the device or circuitry to perform the corresponding functions of the processor(s) 710 as described herein.
In an example embodiment, the memory 720 may include one or more non-transitory storage or memory devices such as, for example, volatile and/or non-volatile memory that may be either fixed or removable. The memory 720 may be configured to store instructions, computer program code, radar data, and additional data such as sonar data, chart data, location/position data in a non-transitory computer readable medium for use, such as by the processor(s) 710 for enabling the marine electronic device 760 to carry out various functions in accordance with example embodiments of the present invention. For example, the memory 720 could be configured to buffer input data for processing by the processor(s) 710. Additionally or alternatively, the memory 720 could be configured to store instructions for execution by the processor(s) 710. The memory 720 may include computer program code that is configured to, when executed, cause the processor(s) 710 to perform various methods described herein. The memory 720 may serve as a non-transitory computer readable medium having stored thereon software instructions that, when executed by a processor, cause methods described herein to be performed.
The communications interface 778 may be configured to enable communication to external systems (e.g. an external network 702). In this manner, the marine electronic device 760 may retrieve stored data from a remote device 754 via the external network 702 in addition to or as an alternative to the onboard memory 720. Additionally or alternatively, the marine electronic device 760 may transmit or receive data, such as radar signal data, radar return data, radar image data, path data or the like to or from a sonar transducer assembly 762. In some embodiments, the marine electronic device 760 may also be configured to communicate with other devices or systems (such as through the external network 702 or through other communication networks, such as described herein). For example, the marine electronic device 760 may communicate with a propulsion system of the watercraft 100 (e.g., for autopilot control); a remote device (e.g., a user's mobile device, a handheld remote, etc.); or another system.
The communications interface 778 of the marine electronic device 760 may also include one or more communications modules configured to communicate with one another in any of a number of different manners including, for example, via a network. In this regard, the communications interface 778 may include any of a number of different communication backbones or frameworks including, for example, Ethernet, the NMEA 2000 framework, GPS, cellular, Wi-Fi, or other suitable networks. The network may also support other data sources, including GPS, autopilot, engine data, compass, radar, etc. In this regard, numerous other peripheral devices (including other marine electronic devices or transducer assemblies) may be included in the system 700A.
The position sensor 745 may be configured to determine the current position and/or location of the marine electronic device 760 (and/or the watercraft 100). For example, the position sensor 745 may comprise a GPS, a bottom contour sensor, or an inertial navigation system, such as a micro-electro-mechanical system (MEMS) sensor, a ring laser gyroscope, or other location detection system. Alternatively or in addition to determining the location of the marine electronic device 760 or the watercraft 100, the position sensor 745 may also be configured to determine the position and/or orientation of an object outside of the watercraft 100.
The display 740 (e.g. one or more screens) may be configured to present images and may include or otherwise be in communication with a user interface 735 configured to receive input from a user. The display 740 may be, for example, a conventional LCD (liquid crystal display), a touch screen display, mobile device, or any other suitable display known in the art upon which images may be displayed.
In some embodiments, the display 740 may present one or more sets of data (or images generated from the one or more sets of data). Such data includes chart data, radar data, sonar data, weather data, location data, position data, orientation data, or any other type of information relevant to the watercraft. Radar data may be received from radar 756A located outside of a marine electronic device 760, radar 756B located in a marine electronic device 760, or from radar devices positioned at other locations, such as remote from the watercraft. Additional data may be received from marine devices such as a sonar transducer assembly 762, a primary motor 705 or an associated sensor, a trolling motor 708 or an associated sensor, a kicker motor 742 or an associated sensor, an autopilot 750, a rudder 757 or an associated sensor, a position sensor 745, a direction sensor 748, other sensors/devices 752, a remote device 754, onboard memory 720 (e.g., stored chart data, historical data, etc.), or other devices.
The user interface 735 may include, for example, a keyboard, keypad, function keys, buttons, a mouse, a scrolling device, input/output ports, a touch screen, or any other mechanism by which a user may interface with the system.
Although the display 740 of
The marine electronic device 760 may include one or more other sensors/devices 752, such as configured to measure or sense various other conditions. The other sensors/devices 752 may include, for example, an air temperature sensor, a water temperature sensor, a current sensor, a light sensor, a wind sensor, a speed sensor, or the like.
A sonar transducer assembly 762 is also provided in the system 700A. The sonar transducer assembly 762 illustrated in
The sonar transducer assembly 762 may also include one or more other systems, such as various sensor(s) 766. For example, the sonar transducer assembly 762 may include an orientation sensor, such as gyroscope or other orientation sensor (e.g., accelerometer, MEMS, etc.) that may be configured to determine the relative orientation of the sonar transducer assembly 762 and/or the one or more sonar transducer element(s) 767—such as with respect to a keel direction of the watercraft. In some embodiments, additionally or alternatively, other types of sensor(s) are contemplated, such as, for example, a water temperature sensor, a current sensor, a light sensor, a wind sensor, a speed sensor, or the like. While only one sonar transducer assembly 762 is illustrated in
An electronic device 768 is also included. The electronic device 768 may be a phone such as a smart phone, a cell phone, smart glasses, a tablet, a computer, a headset, or another electronic device. The electronic device 768 comprises a display 770, with the display 770 having a screen. In some embodiments, the display 770 may be a touch display that is configured to receive input from a user by detecting the user touching the display 770 with a finger. A user interface 772 is also provided in the electronic device 768, and the user interface 772 may include one or more input buttons, a speaker, a microphone, a keypad, and other mechanisms to enable the user to input commands. The electronic device 768 may also comprise a camera 774B to obtain one or more images, which may be live images. The electronic device 768 may also comprise an orientation sensor 776B. The orientation sensor 776B may be configured to determine the orientation at the camera 774B. Alternatively, a camera 774A and an associated orientation sensor 776A may be positioned at another location on the watercraft, with the orientation sensor 776A being configured to determine the orientation at the camera 774A.
The components presented in
At operation 804, a first direction associated with a watercraft is determined. This first direction may be dependent on the type of device being installed and the location where the device is being installed. For example, where a device being installed is a trolling motor having a trolling motor housing that is being installed at a bow of a watercraft, the first direction may be associated with the direction in which the keel of the watercraft extends.
At operation 806, a second direction associated with a device is determined. The device may be a sonar transducer assembly, a trolling motor housing, a kicker motor, a primary motor, a radar device, etc. However, several other devices may be utilized in method 800 to accomplish effective alignment during installation.
In some embodiments, the image received at operation 802 may be used to assist in determining the first direction and the second direction at operations 804 and 806. In some embodiments, the image may be the only material used to determine the second direction, but other material may be utilized to assist in the determination in other embodiments. Image processing techniques such as a Hough transform may be used to assist in determining the first direction and/or the second direction, and these image processing techniques may rely on data obtained for various points on the watercraft (e.g., at the bow of the watercraft, at the transom of the watercraft, etc.). Points at edges of features may be the focus of image processing techniques, but other points may be considered as well in some embodiments. However, in some embodiments, operation 802 may be omitted and the first direction and the second direction may be determined in other ways (e.g., based on inputs received from users on a display) without use of image processing techniques. At operation 808, an angular offset between the first direction and the second direction is determined.
At operation 810, an indication of the angular offset may be stored in memory. This indication may be a numerical value in some embodiments, but the indication may be provided in other forms (e.g., an image, a qualitative classification for the angular offset as high, medium, or low, etc.). As detailed herein, the angular offset stored in the memory may be used by various systems, such as the marine electronic device 760, the trolling motor 708, or other devices, for various functionality such as orientation sensing and navigation, among other functionality.
At operation 812, a representation of the angular offset is presented on a display. The representation of the angular offset may be presented on the display by presenting text to the user indicating the amount of angular offset. Additionally, or alternatively, the representation of the angular offset may be presented on the display by presenting one or more indicators on a live image showing the device, with the indicator(s) indicating the amount of angular offset.
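The flow of operations 802 through 812 might be orchestrated as in the following sketch, where the direction-extraction callables are stand-ins for the image processing described above (all names and the stubbed directions are illustrative assumptions):

```python
import math

def determine_and_store_offset(image, extract_watercraft_dir, extract_device_dir, memory, notify=None):
    """Sketch of operations 802-812: obtain two directions from an image,
    compute their angular offset, store an indication of it, and optionally
    present a representation of it."""
    ax, ay = extract_watercraft_dir(image)   # operation 804: first direction
    bx, by = extract_device_dir(image)       # operation 806: second direction
    # Operation 808: signed angle between the two direction vectors
    offset = math.degrees(math.atan2(ax * by - ay * bx, ax * bx + ay * by))
    memory['angular_offset'] = offset        # operation 810: store for later use
    if notify is not None:                   # operation 812: present a representation
        notify(f"Device is offset by {offset:.1f} degrees")
    return offset

mem = {}
determine_and_store_offset(
    image=None,                                     # a real embodiment would pass camera pixels here
    extract_watercraft_dir=lambda img: (0.0, 1.0),  # stub: keel runs straight "up" in the image
    extract_device_dir=lambda img: (1.0, 0.0),      # stub: device points 90 degrees clockwise
    memory=mem,
    notify=print,
)
print(round(mem['angular_offset'], 1))  # → -90.0
```

Passing the extraction steps in as callables mirrors the point made above that operation 802 may be omitted and the directions determined in other ways, such as from user input.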
The system may also obtain and utilize data regarding the type of watercraft, demographic information about one or more users, the environment around the watercraft, various user actions on a display or a user interface, or other actions. The developed model may assign different weights to the different types of data that are provided.
In some systems, even after the model is deployed, the systems may beneficially improve the developed model by analyzing further data points. By utilizing artificial intelligence, effective image processing techniques may be implemented, and these techniques may be performed with greater accuracy as the accuracy of object recognition and edge recognition is greatly improved.
By receiving several different types of data, the example method 900 may be performed to generate complex models. The example method 900 may find relationships between different types of data that may not have been anticipated. The method 900 may identify relationships that are very difficult or impossible for a human to identify on their own. By detecting relationships between different types of data, the method 900 may generate accurate models even where a limited amount of data is available.
In some embodiments, the model may be continuously improved even after the model has been deployed. Thus, the model may be continuously refined based on changes in the systems or in the environment over time, which provides a benefit as compared with other models that stay the same after being deployed. The example method 900 may also refine the deployed model to fine-tune weights that are provided to various types of data based on subtle changes in the watercraft, devices on the watercraft, the environment, etc. In embodiments where an initial model is formed using data from multiple users and where the model is subsequently refined based on data from a particular user, the data obtained after deployment may be weighted more strongly than the data obtained before the model is deployed.
The method 900 may continuously refine a deployed model to quickly account for the changes and provide a revised model that is accurate. This may be particularly beneficial where certain parts of the watercraft are replaced, modified, or damaged or where there are swift changes in the environment. By contrast, where a model is not continuously refined, changes to the watercraft or the surrounding environment may make the model inaccurate until a new model may be developed and implemented, and implementation of a new model may be very costly, time-consuming, and less accurate than a continuously refined model. Continuous refinement may also be beneficial for novice users who may otherwise be unaware of changes that are occurring.
At operation 902, one or more data points are received. These data points may or may not be the initial data points being received. These data points preferably comprise known data regarding edge locations of objects within images, or some other characteristic that the model may be used to predict. The data points provided at operation 902 will preferably be historical data points with verified values to ensure that the generated model will be accurate. The data points may take the form of discrete data points. However, where the data points are not known at a high confidence level, a calculated data value may be provided, and, in some cases, a standard deviation or uncertainty value may also be provided to assist in determining the weight to be given to the data value in generating a model. In this regard, the model's predicted values may be formed based on historical data.
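One conventional way to use a provided standard deviation to determine the weight given to a data value is inverse-variance weighting. The function and sample values below are an illustrative sketch under that assumption:

```python
# Sketch: combining data points that carry an uncertainty value.
# Inverse-variance weighting is one standard choice; the specific
# values below are illustrative assumptions.

def inverse_variance_mean(points):
    """points: list of (value, standard_deviation) pairs.
    Low-uncertainty points receive more weight in the estimate."""
    weights = [1.0 / (sd * sd) for _, sd in points]
    total = sum(w * v for (v, _), w in zip(points, weights))
    return total / sum(weights)

# A confident measurement (sd = 1) dominates an uncertain one (sd = 10).
combined = inverse_variance_mean([(12.0, 1.0), (20.0, 10.0)])
```

Here the combined estimate lands near 12.1, close to the low-uncertainty measurement.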
For example, the model may be formed based on historical data regarding edge locations of objects within images and additional data regarding the watercraft, devices on the watercraft, the environment, etc. Additional data may be provided from a variety of sources and may, for example, be historical data from existing images where the actual edge locations for objects represented in the images are known. Historical data may also be provided by experts, or it may be obtained in other ways as well. This model may be formed to predict the edge locations for objects represented in images, which may be beneficial in allowing the shape, size, and/or the orientation of the objects to be identified. A processor may be configured to utilize the developed model to perform image processing, and the model may be developed through machine learning utilizing artificial intelligence or through other artificial intelligence techniques. A processor may be configured to apply the model to input images.
At operation 904, a model is improved by minimizing error between predicted data regarding edge locations and actual data regarding edge locations. In some embodiments, an initial model may be provided or selected by a user. The user may provide a hypothesis for an initial model, and the method 900 may improve the initial model. However, in other embodiments, the user may not provide an initial model, and the method 900 may develop the initial model at operation 904, such as during the first iteration of the method 900. The process of minimizing error may be similar to a linear regression analysis on a larger scale where three or more different variables are being analyzed, and various weights may be provided for the variables to develop a model with the highest accuracy possible. Where a certain variable has a high correlation with the accuracy of data, that variable may be given increased weight in the model. In refining the model, the component performing the method 900 may perform a very large number of complex computations. Sufficient refinement results in an accurate model.
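The error-minimization step may be sketched in miniature as gradient descent on a small linear model. The single feature, learning rate, and sample data below are illustrative assumptions; the disclosed method contemplates three or more variables and far larger computations:

```python
# Sketch of operation 904: minimizing error between predicted and
# actual values, here via gradient descent on y ≈ w0 + w1*x.
# The feature, learning rate, and data are illustrative assumptions.

def fit_linear(xs, ys, lr=0.01, steps=5000):
    """Fit y ≈ w0 + w1*x by minimizing mean squared error."""
    w0, w1 = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        grad0 = sum(2 * (w0 + w1 * x - y) for x, y in zip(xs, ys)) / n
        grad1 = sum(2 * (w0 + w1 * x - y) * x for x, y in zip(xs, ys)) / n
        w0 -= lr * grad0
        w1 -= lr * grad1
    return w0, w1

# Noise-free data generated by y = 2x + 1 should be recovered closely.
w0, w1 = fit_linear([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
```

Each step moves the weights against the gradient of the squared error, which is the same minimization principle applied at larger scale in the method.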
In some embodiments, the accuracy of the model may be checked. For example, at operation 906, the accuracy of the model is determined. This may be done by calculating the error between the predicted outputs generated by the model and the actual outputs. In some embodiments, error may also be calculated before operation 904. By calculating the accuracy or the error, the method 900 may determine if the model needs to be refined further or if the model is ready to be deployed. Where the model predicted output is a qualitative value or a categorical value, the accuracy may be assessed based on the number of times the predicted value was correct. Where the model predicted output is a quantitative value, the accuracy may be assessed based on the difference between the actual value and the predicted value.
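The two accuracy checks described above may be sketched as follows, with a hit rate for categorical predictions and a mean absolute error for quantitative ones. The sample predictions are illustrative assumptions:

```python
# Sketch of operation 906: assessing accuracy for categorical outputs
# (fraction correct) and quantitative outputs (average difference).
# The sample predictions below are illustrative assumptions.

def categorical_accuracy(predicted, actual):
    """Fraction of predictions that exactly match the actual value."""
    hits = sum(1 for p, a in zip(predicted, actual) if p == a)
    return hits / len(actual)

def mean_absolute_error(predicted, actual):
    """Average absolute difference between predicted and actual values."""
    return sum(abs(p - a) for p, a in zip(predicted, actual)) / len(actual)

acc = categorical_accuracy(["edge", "edge", "no-edge"],
                           ["edge", "no-edge", "no-edge"])
mae = mean_absolute_error([1.0, 2.0, 4.0], [1.0, 2.0, 3.0])
```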
At operation 908, a determination is made as to whether the calculated error is sufficiently low. A specific threshold value may be provided in some embodiments. For example, where the model is used to predict the edge locations for a watercraft or a device on the watercraft, the error may be evaluated by determining the difference between the actual edge locations and the predicted edge locations. Alternatively, where the model is used to predict the edge locations for a watercraft or a device on the watercraft, the error may be evaluated by determining the percentage of times (e.g., 25%, 50%, 75%, 90%, 95%, etc.) where the model successfully predicts all edge locations of an object within an image within a specified error limit (e.g., 1 millimeter, 5 millimeters, 10 millimeters, etc.). However, other threshold values may be used, and the threshold value may be altered by the user in some embodiments. If the error rate is not sufficiently low, then the method 900 may proceed back to operation 902 so that one or more additional data points may be received. If the error rate is sufficiently low, then the method 900 proceeds to operation 910. Once the error rate is sufficiently low, the training phase for developing the model may be completed, and the implementation phase may begin where the model may be used to make predictions.
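The threshold determination at operation 908 may be sketched as a check that a chosen fraction of predictions falls within a chosen error limit. The 90% fraction and 5-millimeter limit below come from the example values above; the function name and sample data are illustrative assumptions:

```python
# Sketch of operation 908: decide whether the error rate is
# sufficiently low for training to end and implementation to begin.
# The 90% / 5 mm thresholds echo examples from the text; the sample
# measurements are illustrative assumptions.

def error_sufficiently_low(predicted_mm, actual_mm,
                           limit_mm=5.0, required_fraction=0.90):
    """True when enough predictions land within limit_mm of the
    actual edge locations."""
    within = sum(1 for p, a in zip(predicted_mm, actual_mm)
                 if abs(p - a) <= limit_mm)
    return within / len(actual_mm) >= required_fraction

ready = error_sufficiently_low(
    [100, 102, 99, 95, 101, 103, 100, 98, 97, 104],
    [100] * 10)
```

If the check fails, the method returns to operation 902 for more data points; if it passes, the implementation phase may begin.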
By completing operations 902, 904, 906, and 908, a model may be refined through machine learning utilizing artificial intelligence based on the historical comparisons. Notably, example model generation and/or refinement may be accomplished even if the order of these operations is changed, if some operations are removed, or if other operations are added.
After the model has been successfully trained, the model may be implemented, as illustrated at operations 910-912. In some embodiments, the model may be modified (e.g., further refined) based on the received data points, such as at operation 914.
At operation 910, further data points are received. For these further data points, the data points provide actual data regarding edge locations for representations of objects within an image. At operation 912, the model may be used to provide a predicted output data value for the further data points. Thus, the model may be utilized to determine the edge locations for representations of objects within an image.
At operation 914, the model may be modified based on supplementary data points, such as those received during operation 910 and/or other data points. The system may account for data regarding edge locations for representations of objects within an image. By providing supplementary data points, the model may continuously be improved even after the model has been deployed. The supplementary data points may be the further data points received at operation 910, or the supplementary data points may be provided to the processor(s) from some other source. In some embodiments, the processor(s) or the other component performing the method 900 may receive additional data and verify the further data points received at operation 910 using this additional data. By doing this, the method 900 may prevent errors in the further data points from negatively impacting the accuracy of the model.
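The refinement-with-verification step at operation 914 may be sketched as an online update that is applied only when a new data point agrees with corroborating data. The class name, update rate, and tolerance below are illustrative assumptions:

```python
# Sketch of operation 914: refining a deployed model with
# supplementary data points while rejecting points that the
# corroborating data cannot verify. The update rate and tolerance
# are illustrative assumptions.

class RunningOffsetModel:
    def __init__(self, offset, rate=0.2, tolerance=3.0):
        self.offset = offset        # current model estimate (degrees)
        self.rate = rate            # how strongly new points pull the model
        self.tolerance = tolerance  # max disagreement with corroboration

    def refine(self, new_point, corroborating_point):
        """Blend a verified new data point into the model; leave the
        model unchanged for points that fail verification."""
        if abs(new_point - corroborating_point) > self.tolerance:
            return False  # unverified point: model unchanged
        self.offset += self.rate * (new_point - self.offset)
        return True

model = RunningOffsetModel(10.0)
model.refine(12.0, 11.5)   # verified: offset moves toward 12.0
model.refine(40.0, 11.0)   # rejected: disagrees with corroboration
```

Rejecting the outlier keeps an erroneous further data point from degrading the deployed model, as the paragraph above describes.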
In some embodiments, supplementary data points may be provided to the processor from some other source and are utilized to improve the model. For example, supplementary data points may be saved to a memory 720 (see
As indicated above, in some embodiments, operation 914 is not performed and the method proceeds from operation 912 back to operation 910. In other embodiments, operation 914 occurs before operation 912 or simultaneously with operation 912. Upon completion, the method 900 may return to operation 910 and proceed on to the subsequent operations. Supplementary data points may be the further data points received at operation 910 or some other data points.
The methods 800, 900 illustrated in
Many modifications and other embodiments set forth herein will come to mind to one skilled in the art to which these embodiments pertain having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the embodiments are not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the invention. Moreover, although the foregoing descriptions and the associated drawings describe example embodiments in the context of certain example combinations of elements and/or functions, it should be appreciated that different combinations of elements and/or functions may be provided by alternative embodiments without departing from the scope of the invention. In this regard, for example, different combinations of elements and/or functions than those explicitly described above are also contemplated within the scope of the invention. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.