The presently disclosed subject matter relates to the field of the marine environment.
In the marine environment, a marine vessel travels on a route on which it can encounter various situations. Some of these situations can include dangers, e.g. an obstacle to be avoided, zones with dangerous weather, etc.
It is now necessary to provide new methods and systems in order to improve the safety and reliability of marine vessel navigation, improve understanding of the marine environment for marine vessels, and improve control of marine vessels. More generally, it is necessary to develop innovative methods in the marine domain, and in particular in the field of autonomous ships.
In accordance with certain aspects of the presently disclosed subject matter, there is provided a method comprising, by a processor and memory circuitry (PMC), estimating data informative of at least one of a height or an orientation of at least one imaging device of a marine vessel (that is to say, data informative of a height and/or an orientation of the at least one imaging device of the marine vessel), wherein at least one of the height or the orientation (that is to say, the height and/or the orientation) of the at least one imaging device is variable over time, the estimating comprising obtaining first position data informative of a position of first marine objects, wherein the first position data is obtained based on images acquired by the at least one imaging device of the marine vessel, obtaining second position data informative of a position of second marine objects, wherein the second position data is obtained based on data acquired by at least one sensor of the marine vessel, wherein the at least one sensor is different from the imaging device, wherein at least some of the first marine objects are the same as at least some of the second marine objects, and using the first position data and the second position data to estimate data informative of at least one of a height or an orientation of the at least one imaging device of the marine vessel.
In addition to the above features, the method according to this aspect of the presently disclosed subject matter can optionally comprise one or more of features (i) to (xxviii) below, in any technically possible combination or permutation:
In accordance with certain aspects of the presently disclosed subject matter, there is provided a non-transitory storage device readable by a machine, tangibly embodying a program of instructions executable by the machine to perform operations as described with reference to the method above.
In accordance with certain aspects of the presently disclosed subject matter, there is provided a system comprising a processor and memory circuitry (PMC) configured to estimate data informative of at least one of a height or an orientation of at least one imaging device of a marine vessel, wherein at least one of the height and the orientation of the at least one imaging device is variable over time, the estimating comprising: obtaining first position data informative of a position of first marine objects, wherein the first position data is obtained based on images acquired by at least one imaging device of a marine vessel, obtaining second position data informative of a position of second marine objects, wherein the second position data is obtained based on data acquired by at least one sensor of the marine vessel, wherein the at least one sensor is different from the imaging device, wherein at least some of the first marine objects are the same as at least some of the second marine objects, and using the first position data and the second position data to estimate data informative of at least one of a height or an orientation of the imaging device of the marine vessel.
In addition to the above features, the system according to this aspect of the presently disclosed subject matter can optionally comprise (or be configured to implement) one or more of features (i) to (xxviii) as described above.
In accordance with certain aspects of the presently disclosed subject matter, there is provided a marine vessel comprising at least one imaging device, at least one sensor different from the imaging device, and a processor and memory circuitry (PMC) configured to estimate data informative of at least one of a height or an orientation of the at least one imaging device, wherein at least one of the height and the orientation of the at least one imaging device is variable over time, the estimating comprising: obtaining first position data informative of a position of first marine objects, wherein the first position data is obtained based on images acquired by at least one imaging device of a marine vessel, obtaining second position data informative of a position of second marine objects, wherein the second position data is obtained based on data acquired by at least one sensor of the marine vessel, wherein the at least one sensor is different from the imaging device, wherein at least some of the first marine objects are the same as at least some of the second marine objects, and using the first position data and the second position data to estimate data informative of at least one of a height or an orientation of the imaging device of the marine vessel.
In addition to the above features, the marine vessel according to this aspect of the presently disclosed subject matter can optionally comprise (or be configured to implement) one or more of features (i) to (xxviii) as described above.
In accordance with certain aspects of the presently disclosed subject matter, there is provided a system comprising a processor and memory circuitry (PMC) configured to obtain first position data informative of a position of first marine objects, wherein the first position data is obtained based on images acquired by at least one imaging device of a marine vessel, obtain second position data informative of a position of second marine objects, wherein the second position data is obtained based on data acquired by at least one sensor of the marine vessel, wherein the at least one sensor is different from the imaging device, wherein at least some of the first marine objects are the same as at least some of the second marine objects, determine a match between at least some of the first marine objects and at least some of the second marine objects using the first position data and the second position data, and generate a database informative of at least some of the first marine objects and the second marine objects using said match and data informative of the first marine objects and the second marine objects determined based on data provided by at least one of the imaging device and the at least one sensor.
According to some embodiments, the proposed solution proposes an innovative approach in which an imaging device acquires data informative of the environment of a marine vessel, in order to improve knowledge of the environment of the marine vessel and/or knowledge of the position of the marine vessel.
According to some embodiments, the proposed solution relies on a multi-sensor approach, in which a smart combination of the information provided by the various sensors enables determining data informative of marine targets in a more complete and accurate way, and in real time.
According to some embodiments, the proposed solution enables automatically determining height and/or orientation of an imaging device of a marine vessel.
According to some embodiments, the proposed solution enables independence from manual procedures.
According to some embodiments, although the height and/or orientation of a marine vessel can vary drastically during the voyage of the marine vessel (e.g., due to changes in the weight of the freight, variations in the weather conditions, maneuvers of the marine vessel, or other factors), the proposed solution enables repetitively (and automatically) updating an estimation of a height and/or orientation of an imaging device of the marine vessel.
According to some embodiments, the proposed solution takes advantage of the fact that some sensors perform better than others in some situations, whereas in other situations the opposite can hold. In light of the foregoing, the proposed solution proposes to perform a smart aggregation of the data provided by the various sensors depending on the situation, thereby providing an efficient solution. The proposed solution therefore compensates for the drawbacks of each sensor.
According to some embodiments, the proposed solution solves technical challenges which are specific to imaging devices mounted on a marine vessel.
According to some embodiments, the proposed solution proposes to use an imaging device in addition to non-imaging sensors already present in the marine vessel, to provide a robust and complete solution for assisting marine navigation and control. As a consequence, it can be implemented in existing marine vessels in a flexible manner.
According to some embodiments, the proposed solution enables a marine vessel to determine its own parameters and/or parameters of the targets surrounding the marine vessel.
According to some embodiments, the proposed solution enables determining the marine vessel's position without requiring localization systems such as GPS, which are vulnerable to spoofing.
According to some embodiments, the proposed solution enables mapping the marine objects surrounding a marine vessel.
According to some embodiments, the proposed solution enables determining position of marine objects which do not have a localization system.
According to some embodiments, the proposed solution enables converting various data of marine objects from a relative referential of an image to an absolute referential.
According to some embodiments, the proposed solution enables calibrating/recalibrating one or more sensors present on a marine vessel.
According to some embodiments, the proposed solution improves control of the trajectory of marine vessels.
According to some embodiments, the proposed solution reduces the risk of collision of a marine vessel with other marine objects. Safety and reliability of marine vessels are thus improved.
According to some embodiments, the proposed solution enables automatically generating a dataset of labelled images of marine objects, usable for training a neural network.
According to some embodiments, the proposed solution enables generating a comprehensive database informative of marine objects.
In order to understand the invention and to see how it can be carried out in practice, embodiments will be described, by way of non-limiting examples, with reference to the accompanying drawings, in which:
In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the invention. However, it will be understood by those skilled in the art that the presently disclosed subject matter may be practiced without these specific details. In other instances, well-known methods have not been described in detail so as not to obscure the presently disclosed subject matter.
Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as “obtaining”, “using”, “solving”, “determining”, “estimating”, “tracking”, “merging” or the like, refer to the action(s) and/or process(es) of a processor and memory circuitry (PMC) that manipulate and/or transform data into other data, said data represented as physical, such as electronic, quantities and/or said data representing the physical objects.
The term “processor and memory circuitry” (PMC) as disclosed herein should be broadly construed to include any kind of electronic device with data processing circuitry, which includes, for example, a computer processing device operatively connected to a computer memory (e.g. a digital signal processor (DSP), a microcontroller, a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), etc.) capable of executing various data processing operations.
It can encompass a single processor or multiple processors, which may be located in the same geographical zone, or may, at least partially, be located in different zones and may be able to communicate together.
System 100 can be embedded on a marine platform. In particular, the marine platform can be a moving marine platform. The moving marine platform can be e.g. a marine vessel 125. Marine vessels include e.g. ships, boats, hovercraft, etc.
In some embodiments, system 100 can be embedded on a marine platform which can be stationary, or at least temporarily stationary.
Although embodiments will be described with reference to marine vessels 125, it is to be understood that these embodiments apply similarly to a stationary marine platform.
As shown in
Sensors 130 collect data during the voyage of the marine vessel 125. The voyage includes portions of the voyage in which the marine vessel is in motion but can also include portions of the voyage in which the marine vessel is substantially static (e.g., when the marine vessel 125 is moored or docked such as at a harbor).
Sensors 130 can include an imaging device 120 (e.g. a camera).
In some embodiments, the camera includes an infrared camera, a night camera, a day camera, etc.
In some embodiments, sensors 130 include a plurality of imaging devices 120 (which can be distinct).
In addition, sensors 130 include one or more additional sensors 115 (which are not necessarily imaging devices) such as (this list is not limitative) a radar (any type of radar), a LIDAR, automatic identification system (AIS), a transponder communicating with GPS located on other marine objects, a system which includes a laser located on the marine vessel 125 and an optical reflector located on another marine object to be located by the marine vessel 125 (reflection of the laser by the reflector enables localization of the other marine object), etc.
In particular, sensors 115 provide information usable to localize marine objects surrounding the marine vessel 125.
The marine vessel 125 itself can include other sensors, such as a geo-localization system (e.g. GPS), an IMU, velocity and acceleration sensors, a gyro compass, etc.
As explained hereinafter in the specification, system 100 can process data collected by sensors 130.
In some embodiments, data output by system 100 can be transmitted through a remote communication network 140 towards e.g., a central station 150, which can include at least one processor and memory circuitry (PMC).
In some embodiments, the central station 150 can perform at least some of the tasks of PMC 110 located on the marine vessel 125.
The remote communication link can correspond e.g. to a broadband cellular network (e.g. 4G, 5G, LTE, etc.), a satellite communication network, a radio communication network (such as VHF radio, i.e. very high frequency), etc.
Data can be transmitted using a communication system located on the marine vessel 125 which is suitable to transmit data via the remote communication network. The communication system can include e.g., an antenna, an emitter, a transponder, etc.
Attention is now drawn to
During the voyage of the marine vessel 125, data is acquired by one or more of the sensors 130 of the marine vessel 125.
The data is representative of one or more situations encountered by the marine vessel during its voyage.
In particular, during the voyage of the marine vessel 125, various objects (hereinafter marine objects—e.g. icebergs, buoys, other marine vessels, etc.) can be encountered by the marine vessel 125, and data informative of these marine objects can be acquired. The marine objects generally include at least a part which is located above sea level.
As shown in
The first data is obtained based on images acquired by the imaging device 120 of the marine vessel 125. In some embodiments, the first data is obtained based on images acquired by a plurality of imaging devices 120 of the marine vessel 125. The first data includes first position data informative of a position of first marine objects.
Generally, the first position data is expressed in the referential of the imaging device 120. A PMC is configured to detect, in a given image acquired by the imaging device 120, marine objects present in the image. This detection can rely e.g. on an image processing algorithm. In some embodiments, a machine learning module (which implements e.g. a deep neural network) is trained to detect marine objects present in images acquired by an imaging device of a marine vessel. This training can include supervised learning in which a plurality of annotated images comprising marine objects are fed to the machine learning module. This is not limitative, and the training can also include automatic training and/or non-supervised learning.
In some embodiments, the machine learning module can provide information on the type of the object (e.g. marine vessel, type of marine vessel, iceberg, etc.). This can be obtained by performing supervised learning of the machine learning module, in which labelled images comprising marine objects (together with their type, which corresponds to the label) are fed to the machine learning module for its training.
Once a given marine object is detected in an image acquired by the imaging device 120, its position (e.g. pixel position comprising a position along the X axis of the image and a position along the Y axis of the image) in the image can be obtained. Therefore, for each marine object acquired by the imaging device 120, a position in the image can be obtained. The first position data can include, in some embodiments, the position of each given object of the first marine objects in the image in which the given object has been detected.
According to some embodiments, the imaging device 120 acquires a plurality of images at a plurality of periods of time. As a consequence, it is possible to obtain position over time of the first marine objects.
In particular, the method can include obtaining a set of first position data (see operation 2001 in
Indeed, since the marine vessel 125 moves over time and/or at least some of the first marine objects move over time, the position of the first marine objects in the images acquired by the imaging device 120 can change over time.
According to some embodiments, in order to generate the set of first position data, it is possible to track the first marine objects in the plurality of images acquired by the imaging device 120.
A non-limitative example is provided with reference to
Assume that at time t1, an image is acquired by the imaging device 120. In this image, three marine objects are detected. A first marine object is located at position 230, a second marine object is located at position 231, and a third marine object is located at position 232.
At time t2 (different from t1), another image is acquired by the imaging device 120. In this image, three marine objects are detected. A first marine object is located at position 233, a second marine object is located at position 234, and a third marine object is located at position 235.
At time t3 (different from t2), another image is acquired by the imaging device 120. In this image, three marine objects are detected. A first marine object is located at position 236, a second marine object is located at position 237, and a third marine object is located at position 238.
A tracking method can be used to track the various marine objects over the different images. The tracking method can be implemented by a PMC. The tracking method can implement e.g., a Kalman filter, or other adapted tracking methods.
In some embodiments, it can appear that a marine object is present in some of the images and disappear in subsequent images. This can be due to the relative motion between the marine object and the marine vessel 125.
In the example of
Similarly, the tracking method reveals that the marine object located at position 231 at time t1, the marine object located at position 234 at time t2 and the marine object located at position 237 at time t3 correspond to the same marine object at different periods of time. Therefore, the same tracking ID (in this example “(2)”) can be assigned.
Similarly, the tracking method reveals that the marine object located at position 232 at time t1, the marine object located at position 235 at time t2 and the marine object located at position 238 at time t3 correspond to the same marine object. Therefore, the same tracking ID (in this example “(3)”) can be assigned.
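By way of non-limitative illustration only, a simplified tracking loop consistent with the description above could look as follows. This is a minimal Python sketch using nearest-neighbour association with a constant-velocity prediction as a simplified stand-in for the Kalman filter mentioned above; all names and the gating threshold are illustrative assumptions, not taken from the source.

```python
from itertools import count

_track_ids = count(1)  # generator of new tracking IDs

def associate(tracks, detections, gate=50.0):
    """tracks: {track_id: (position, velocity)} from the previous image;
    detections: list of (x, y) pixel positions in the current image.
    Returns the updated {track_id: (position, velocity)} mapping."""
    updated = {}
    unmatched = list(detections)
    for tid, (pos, vel) in tracks.items():
        if not unmatched:
            break
        # constant-velocity prediction of where this track should appear
        pred = (pos[0] + vel[0], pos[1] + vel[1])
        best = min(unmatched,
                   key=lambda d: (d[0] - pred[0]) ** 2 + (d[1] - pred[1]) ** 2)
        if (best[0] - pred[0]) ** 2 + (best[1] - pred[1]) ** 2 <= gate ** 2:
            # same physical object: keep the same tracking ID
            updated[tid] = (best, (best[0] - pos[0], best[1] - pos[1]))
            unmatched.remove(best)
    for det in unmatched:
        # detections matching no existing track open a new track (new ID)
        updated[next(_track_ids)] = (det, (0.0, 0.0))
    return updated
```

In the example above, calling this function on the detections of times t1, t2 and t3 in turn would keep assigning IDs (1), (2) and (3) to the three objects, as long as each object stays within the gate of its predicted position.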
The method includes obtaining (210) second data informative of second marine objects.
Although
The second data is obtained based on data acquired by at least one sensor 115 of the marine vessel 125. Sensor 115 is different from imaging device 120. In some embodiments, sensor 115 is not an imaging device (e.g., not a camera). Various examples have been provided above for sensor 115 (e.g., a radar, AIS, etc.).
The second data includes second position data informative of a position of second marine objects encountered by the marine vessel 125 during its voyage. As explained hereinafter, according to some embodiments, the method projects the positions of the targets detected by the various sensors (which can be expressed in different referentials) into a common referential.
In some embodiments, a plurality of sensors 115 is available, which includes sensors 115 of different types (e.g., a first sensor is a radar, a second sensor is AIS, a third sensor is GPS, etc.). In this case, for each sensor, position data of marine objects detected by this sensor is obtained. As explained hereinafter, each sensor can detect different marine objects, but at least a plurality of the marine objects surrounding the marine vessel 125 is detected by different sensors 115.
At least some of the first marine objects and the second marine objects correspond to the same physical marine objects. For example, the first marine objects include a first given marine vessel, a second given marine vessel, and a buoy. The second marine objects include the first given marine vessel, the second given marine vessel, and an iceberg.
This difference between the first marine objects and the second marine objects can be due to the fact that the imaging device 120 and the other sensors 115 have a different field of view, and/or a different line of sight, and/or different capabilities of detecting objects (e.g., depending on the type of the sensor, its capability to detect objects can depend e.g. on weather conditions, size of the object, type of the object, etc.). In addition, the imaging device 120 and the other sensor(s) 115 can present other differences (for example, the imaging device 120 can be used to classify marine objects, which is not possible for all sensors 115, such as radar).
Distance between the marine vessel 125 and the marine objects can also impact the detection of the marine objects by the sensors of the marine vessel 125. For example, a radar is operative to detect marine objects at medium-long range, but has a blind zone at short range, whereas the imaging device 120 better performs at short range than at long range for detecting marine objects. Therefore, not all marine objects are detected by all sensors of the marine vessel 125.
In some embodiments, the first marine objects and the second marine objects are the same. This means that all sensors 130 have been able to detect the same marine objects.
The second position data is expressed in a referential which can depend on the sensor 115.
For example, if sensor 115 is an AIS, absolute position (latitude, longitude in world coordinates) of the marine objects is obtained.
If sensor 115 is a radar, position of the marine object relative to the marine vessel 125 is obtained (expressed e.g., as a range and an angular position relative to the radar and/or marine vessel 125).
According to some embodiments, the second data can include additional data (in addition to the second position data of the second marine objects).
According to some embodiments, the second data includes identification data of the second marine objects. For example, the AIS provides identification data specific to each object, which enables its identification.
According to some embodiments, the second data includes data informative of the type (e.g. type of marine vessel, etc.) of the marine object (which can be provided by sensor(s) 115 and/or which can be derived from data provided by sensor(s) 115).
For example, the AIS can provide type of the marine object.
According to some embodiments, sensor 115 acquires data at a plurality of periods of time (e.g., while the marine vessel 125 is in motion). As a consequence, it is possible to obtain position over time of the second marine objects. As explained hereinafter, tracking data of the marine objects can be used to improve matching/association between the first and second marine objects. This is however not mandatory.
In particular, the method can include obtaining a set of second position data (operation 2101 in
In order to track the second marine objects over the different acquisitions, several methods can be used.
In some embodiments, if sensor 115 provides identification data specific to each object, it is possible to track the object over the plurality of periods of time, thereby enabling generating the set of second position data. For example, if sensor 115 is an AIS, it is possible to track the position of the objects over time since each object is associated with specific identification data provided by the AIS.
If sensor 115 is a radar, the objects can be tracked over the various radar acquisitions (that is to say, at the plurality of periods of time), using regular radar tracking.
According to some embodiments, the first position data informative of a position of first marine objects (respectively, the set of first position data) corresponds to a position at a first period of time (respectively, at a plurality of first periods of time), and the second position data informative of a position of second marine objects (respectively, the set of second position data) corresponds to a position at a second period of time (respectively, at a plurality of second periods of time).
The first period of time (respectively, first periods of time) and the second period of time (respectively, second periods of time) meet a synchronization criterion. The synchronization criterion ensures that the time difference between the respective first period(s) of time and the respective second period(s) of time is below a threshold. For example, the synchronization criterion can ensure that a time difference between the respective first period(s) of time and the respective second period(s) of time is below 1 sec. This value is however not limitative. In particular, if the respective first period(s) of time and the respective second period(s) of time do not meet the synchronization criterion, it is possible to perform up-sampling of the data provided by the sensors (using e.g. a Kalman filter—this is not limitative). Similarly, if necessary, down-sampling can be performed.
As a consequence, the first period(s) of time and the second period(s) of time are substantially identical.
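By way of non-limitative illustration only, such a synchronization criterion could be checked as follows. This is a minimal Python sketch under the assumption (made here for illustration) that both sensors timestamp their acquisitions; the 1 sec threshold echoes the example given above.

```python
def synchronized_pairs(camera_times, sensor_times, max_dt=1.0):
    """Pair each camera acquisition with the sensor acquisition closest
    in time, keeping only pairs that meet the synchronization criterion."""
    pairs = []
    for t_cam in camera_times:
        t_sen = min(sensor_times, key=lambda t: abs(t - t_cam))
        if abs(t_sen - t_cam) <= max_dt:
            pairs.append((t_cam, t_sen))
    return pairs

# e.g. synchronized_pairs([0.0, 1.0, 2.0], [0.1, 0.9, 3.5])
# returns [(0.0, 0.1), (1.0, 0.9)]; the camera acquisition at t=2.0
# has no sensor acquisition within 1 sec and would require resampling.
```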
The method further includes (operation 220) using the first position data and the second position data to estimate data informative of at least one of a height or an orientation of the imaging device 120. In some embodiments, both data informative of a height and an orientation of the imaging device are estimated, or only part of this data (e.g. because at least some of this data is already known, using e.g. other sensors and/or external input). Generally, at least one of the height and the orientation of the at least one imaging device is variable over time, since the orientation and/or position of the marine vessel evolves over time.
Data informative of an orientation of the imaging device 120 includes at least one of a roll of the imaging device 120, a pitch of the imaging device 120, a yaw of the imaging device 120, etc. This orientation can be expressed similarly as roll/pitch/yaw of a ship (for example, the roll axis is an imaginary line running horizontally through the length of the ship, through its centre of mass, and parallel to the waterline, the pitch axis is an imaginary line running horizontally across the ship and through its centre of mass, and the yaw axis is an imaginary line running vertically through the ship and through its centre of mass).
Data informative of a height (also called altitude or elevation) of the imaging device 120 can also be estimated. The height of the imaging device 120 can be expressed, for example, relative to sea level (also called mean sea level, MSL) or relative to still-water level (SWL).
As explained hereinafter, once one or more parameters of the imaging device 120 have been determined, additional position data (e.g. absolute position of the imaging device) can be determined.
As mentioned above, in some embodiments, position data of the marine objects at a plurality of periods of time is obtained.
According to some embodiments, the method can include using (operation 2201 in
As explained hereinafter, estimating data informative of height and/or orientation of the imaging device 120 can include attempting to match position of the first marine objects and position of the second marine objects (in order to reflect the fact that they correspond to the same marine objects acquired by different sensors), by modifying value of the height and/or orientation of the imaging device 120 (which is to be estimated).
According to some embodiments, a filter (e.g. a probabilistic filter) can be used which predicts the expected variations in orientation and/or height of the imaging device 120 (depending e.g. on the weather conditions). This is useful to filter out estimations of the height and/or orientation of the imaging device 120 which are not realistic and correspond to noise.
The method includes obtaining (operation 200i) first position data informative of a position of first marine objects FIRSTMOBJ1,i to FIRSTMOBJN,i at a first period of time T1,i, wherein the first position data is obtained based on images acquired by the imaging device 120 of the marine vessel 125. Operation 200i is similar to operation 200.
The method includes obtaining (operation 210i) second position data informative of a position of second marine objects SNDMOBJ1,i to SNDMOBJM,i at a second period of time T2,i, wherein the first period of time T1,i and the second period of time T2,i meet a synchronization criterion (see above a possible definition of this criterion). The second position data is obtained based on data acquired by the at least one sensor 115 of the marine vessel 125. Operation 210i is similar to operation 210.
At least some of the first marine objects FIRSTMOBJ1,i to FIRSTMOBJN,i are the same as at least some of the second marine objects SNDMOBJ1,i to SNDMOBJM,i.
The method includes (operation 220i) using the first position data and the second position data to estimate data informative of at least one of a height or an orientation of the imaging device 120 of the marine vessel 125. Operation 220i is similar to operation 220. As a consequence, data informative of at least one of a height or an orientation of the imaging device 120 is estimated at a given period of time T′i, which substantially coincides with the first period of time T1,i and the second period of time T2,i (as mentioned, the first period of time T1,i and the second period of time T2,i substantially coincide since they meet a synchronization criterion). In other words, T′i≈T1,i≈T2,i.
As shown in
Operation 210i is repeated at a different second period of time T2,i+1 (which occurs after T2,i). Therefore, at time T2,i+1, second position data informative of a position of second marine objects SNDMOBJ1,i+1 to SNDMOBJM,i+1 is obtained. It has to be noted that the second marine objects SNDMOBJ1,i+1 to SNDMOBJM,i+1 of time T2,i+1 can differ from the second marine objects SNDMOBJ1,i to SNDMOBJM,i of time T2,i. This is however not mandatory, and depends on the scenario (in some cases, there is a partial overlap).
T1,i+1 and T2,i+1 meet a synchronization criterion.
At least some of the first marine objects FIRSTMOBJ1,i+1 to FIRSTMOBJN,i+1 are the same as at least some of the second marine objects SNDMOBJ1,i+1 to SNDMOBJM,i+1.
Operation 220i is repeated in order to estimate data informative of at least one of a height or an orientation of the at least one imaging device of the marine vessel. As a consequence, data informative of at least one of a height or an orientation of the imaging device 120 is estimated at a given period of time T′i+1, which substantially coincides with the first period of time T1,i+1 and the second period of time T2,i+1 (as mentioned, the first period of time T1,i+1 and the second period of time T2,i+1 substantially coincide since they meet a synchronization criterion). In other words, T′i+1≈T1,i+1≈T2,i+1.
The method therefore enables estimating at least one of a height or an orientation of the imaging device 120 over time.
According to some embodiments, height and/or orientation of the imaging device 120 is estimated in real time or quasi real time (a small delay can be present due to the time for the sensors of the marine vessel to acquire the data and the time for processing this data).
Assume that data informative of at least one of a height or an orientation of the imaging device 120 is estimated at a given period of time (corresponding to a given iteration i of the method of
In some embodiments, and as explained hereinafter, estimation of the height and/or orientation of the imaging device 120 includes determining an association or match between the first marine objects and the second marine objects at a given iteration. The association determined at a given iteration “i” of the method can be reused as an input of the method at a subsequent iteration “i+1” (or more generally at an iteration “j”, with j>i), to improve determination of the association at the subsequent iteration. For example, as explained hereinafter, if two given marine objects have been identified as matching at a previous iteration of the method of
As shown in
Although
The common referential can correspond e.g. to a global/absolute referential such as world coordinates (latitude, longitude). This is not limitative and other referentials can be used. For example, a predefined set of coordinates which share the same plane can be used (for example the set of coordinates is expressed relative to the marine vessel's position, which is selected as the origin of the set of coordinates).
As explained above, the first position data is generally expressed in the referential of the image (referential of the imaging device 120).
In order to convert the first position data into the common referential, an assumption on the data informative of the height and/or orientation of the imaging device 120 can be made.
Based on this assumption, and the known position of the first marine objects in the referential of the image(s) acquired by the imaging device 120, it is possible to project the first position data from the referential of the image into the common referential.
A method of projecting the first position data into the common referential is described hereinafter with reference to
In some embodiments, at least some of the position data is already expressed in the common referential.
For example, an AIS may provide position data in world coordinates.
If position data is provided by a radar, it is possible to convert the position data in world coordinates by using the position of the marine vessel 125. Indeed, since the radar provides relative position (range/bearing), and the position of the marine vessel 125 is known (using e.g. a localization system such as GPS/AIS of the marine vessel 125), it is possible to project the position data into world coordinates (or into another common referential).
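By way of non-limitative illustration only, such a projection could be performed as follows. This is a minimal Python sketch using the standard spherical destination-point formula; the function name, units and Earth model are illustrative assumptions, not taken from the source.

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius, spherical model

def radar_to_world(vessel_lat_deg, vessel_lon_deg, range_m, bearing_deg):
    """Project a radar detection (range in metres, bearing in degrees
    clockwise from true north) into world coordinates (lat, lon in
    degrees), given the vessel's own position (e.g. from GPS/AIS)."""
    lat1 = math.radians(vessel_lat_deg)
    lon1 = math.radians(vessel_lon_deg)
    b = math.radians(bearing_deg)
    d = range_m / EARTH_RADIUS_M  # angular distance
    lat2 = math.asin(math.sin(lat1) * math.cos(d)
                     + math.cos(lat1) * math.sin(d) * math.cos(b))
    lon2 = lon1 + math.atan2(math.sin(b) * math.sin(d) * math.cos(lat1),
                             math.cos(d) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lon2)
```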
Projection of the first position data into the common referential 252 depends inter alia on the height and orientation of the imaging device 120.
At this stage, since the height and/or orientation of the imaging device 120 is unknown (or known with an error), the projection of the first position data is essentially arbitrary, and therefore the position of the first marine objects (depicted as triangles) does not match the position of the second marine objects (depicted as circles).
It has to be noted that this method is not limitative and is provided as an example only.
Assume that a given marine object 299 (depicted in
For example, a bounding box (see
Parameters (i) to (vii) are the input of the method, and parameter (viii) is an output of the method.
The method includes converting (operation 290) the coordinates of the two extremities of the bounding box into a single point, with coordinates (targetx, targety).
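A minimal sketch of operation 290 is given below, under the assumption (made here for illustration only, as the source does not specify the choice of point) that the representative point is the bottom-centre of the bounding box, i.e. approximately where the object meets the water:

```python
def bbox_to_point(x_min, y_min, x_max, y_max):
    """Convert the two extremities of a bounding box into a single
    representative point (target_x, target_y)."""
    target_x = (x_min + x_max) / 2.0
    # image Y grows downward: the larger Y value is the lower edge
    target_y = max(y_min, y_max)
    return target_x, target_y
```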
The method includes converting (291) the coordinates (targetx, targety) of the given marine object into a bearing (noted global_bearing) of the given marine object expressed in an absolute referential (e.g. Earth referential), as detailed hereinafter.
The method further includes (292) determining an artificial horizon line equation in the image (the artificial horizon line corresponds to a reference for which the imaging device 120 has zero roll and zero pitch).
A normalization function old_to_new_value(oldvalue, oldrange, newrange) is defined, in which oldvalue is the value that needs to be normalized, oldrange corresponds to the current range (oldmin, oldmax), newrange corresponds to the expected value range (newmin, newmax) and newvalue corresponds to the output of the function. The function old_to_new_value can be defined as follows:
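The body of the function is not reproduced above; a standard linear rescaling consistent with the stated inputs and outputs would be (illustrative Python):

```python
def old_to_new_value(old_value, old_range, new_range):
    """Linearly map old_value from old_range=(old_min, old_max)
    to new_range=(new_min, new_max)."""
    old_min, old_max = old_range
    new_min, new_max = new_range
    return ((old_value - old_min) / (old_max - old_min)
            * (new_max - new_min) + new_min)

# e.g. old_to_new_value(0, (-90, 90), (0, 180)) returns 90.0,
# consistent with the pitch normalization used in operation 292 below.
```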
Operation 292 can include setting campitch as the output of old_to_new_value(campitch, (−90,90), (0,180)).
Operation 292 can further include calculating the artificial horizon line location in the image in pixels.
This can include determining pixelpitch as follows:
This can further include updating pixelpitch as follows:
This can further include defining (x1, y1) as follows:
(x1, y1) corresponds to a first point of the artificial horizon line (see
This can further include defining (x2, y2) as follows:
(x2, y2) corresponds to a second point of the artificial horizon line (see
The equation of the artificial horizon line can be calculated using (x1, y1) and (x2, y2).
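Since the exact formulas for pixelpitch, (x1, y1) and (x2, y2) are not reproduced above, the following heavily hedged Python sketch shows one plausible reading only: the vertical offset of the horizon from the image centre is taken as proportional to the pitch relative to the vertical field of view, and the roll tilts the line. This is a sketch under stated assumptions, not the source's implementation.

```python
import math

def artificial_horizon(cam_pitch_deg, cam_roll_deg, vert_fov_deg, img_w, img_h):
    """Return (slope, intercept) of the artificial horizon line
    y = slope * x + intercept in pixel coordinates. Assumed geometry
    (illustrative only): zero pitch and zero roll put the horizon at
    the image centre."""
    # vertical shift of the horizon, in pixels, due to camera pitch
    pixel_pitch = (cam_pitch_deg / vert_fov_deg) * img_h
    y_centre = img_h / 2.0 - pixel_pitch
    # camera roll tilts the line around the image centre
    half_span = (img_w / 2.0) * math.tan(math.radians(cam_roll_deg))
    x1, y1 = 0.0, y_centre + half_span           # first point of the line
    x2, y2 = float(img_w), y_centre - half_span  # second point of the line
    slope = (y2 - y1) / (x2 - x1)
    return slope, y1
```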
The method further includes (293) determining an angle (noted angle_to_artificial_horizon) of the given marine object with respect to the artificial horizon line (see
The method further includes (294) determining a distance (Euclidean distance, noted euclideandist) between the marine vessel 125 and the given marine object 299, as detailed hereinafter. Operation 294 can include a preliminary step of determining an ortho distance (see
In some embodiments, orthodist can be corrected to take into account curvature of the Earth (see e.g. https://earthcurvature.com).
Operation 294 can then include performing the computation:
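The computation itself is not reproduced above; one plausible geometry (a sketch only, assuming the marine object lies at sea level and the imaging device is a known height above it, with no Earth-curvature correction) is:

```python
import math

def distances(cam_height_m, angle_to_artificial_horizon_deg):
    """Illustrative reading of operation 294: the ground ('ortho')
    distance follows from the camera height and the angle below the
    artificial horizon; the Euclidean distance then follows from the
    right triangle formed by the camera height and the ortho distance."""
    ortho_dist = cam_height_m / math.tan(
        math.radians(angle_to_artificial_horizon_deg))
    euclidean_dist = math.sqrt(ortho_dist ** 2 + cam_height_m ** 2)
    return ortho_dist, euclidean_dist
```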
The method further includes (295) determining absolute coordinates (latitude, longitude) of the given marine object 299. Operation 295 can include the following computation:
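The computation is not reproduced above; it is consistent with the standard spherical destination-point formula, in which (lat1, lon1) denotes the known position of the imaging device 120:

lat2 = arcsin( sin(lat1)*cos(d/R) + cos(lat1)*sin(d/R)*cos(b) )
lon2 = lon1 + atan2( sin(b)*sin(d/R)*cos(lat1), cos(d/R) - sin(lat1)*sin(lat2) )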
In this equation, d is equal to euclideandist, R is the Earth's radius and b is equal to globalbearing.
The method of
This matching can be according to a criterion (as explained hereinafter, the criterion can define e.g. a number of iterations of the method and/or a minimal value for a loss function).
As mentioned above, modifying the estimation of the height and/or orientation of the imaging device 120 (“Dcamera”) modifies the projection of the position of the first marine objects from the referential of the image (first position data) into the common referential 252. Modification of Dcamera does not affect the projection of the position of the second marine objects acquired by other sensors into the common referential 252.
Solving the optimization problem can include optimizing the estimation Dcamera of the height and/or orientation of the imaging device 120 to optimize a matching between the position of the first marine objects (as recalculated using Dcamera and the first position data) and the position of the second marine objects in the common referential. In particular, it is attempted to match between a position of a first marine object and a position of a second marine object which correspond to the same marine object.
In some embodiments, it is possible to estimate additional parameters of the imaging device 120, such as the field of view (data Dfield_of_view, such as camvert
In this case, solving the optimization problem can also include optimizing the values of the field of view to optimize a matching between the position of the first marine objects (as recalculated using Dcamera, Dfield_of_view and the first position data) and the position of the second marine objects in the common referential.
As visible in
In some embodiments, the convergence criterion depends on the number (e.g., absolute number or ratio) of associations/matchings that have been performed between the first marine objects and the second marine objects. Indeed, the higher the number of associations/matchings between the first marine objects and the second marine objects, the higher the prospects that an optimal solution to the optimization problem has been found (and the better the estimation of the orientation and/or height of the imaging device).
At each iteration, it is attempted to improve the estimation of the height and/or orientation of the imaging device 120, such that the matching between the position of the first marine objects and the position of the second marine objects in the common referential is improved.
Iteration of the method can include repeating operations 221 and 222.
As shown, the first marine object which had an initial position 250 at the first iteration of the method, has an optimized position 255 (after N iterations of the method) which matches the position 251 of a second marine object.
Similarly, a plurality of respective first marine objects has an optimized position which now matches the position of a plurality of the respective second marine objects.
However, there can be one or more first marine objects which do not match any of the second marine objects. In
Attention is drawn to
As explained above (see
Assume that first position data and second position data are obtained for time ti.
The method of
The method further includes solving (operation 225) an optimization problem, in which data Dcamera informative of height and/or orientation of the imaging device 120 (and/or data Dfield_of_view) is estimated to enable matching of position of the first marine objects at time ti (as recalculated using Dcamera and the first position data) and the second position data of the second marine objects at time ti.
Operation 225 is similar to operation 222.
For position data of time ti, the method of
The method can be performed again (see reference 227) at time ti+1 (different from time ti). At time ti+1, position of the first and/or second marine objects may evolve in the common referential.
Estimation of the height and/or orientation of the imaging device 120 of time ti is not necessarily valid for time ti+1, because height and/or orientation of the imaging device 120 can change during the voyage of the marine vessel 125 (due to various factors mentioned above).
Therefore, the method can include performing operations 224 and operations 225 (these operations can be performed iteratively as depicted in reference 226), in order to estimate height and/or orientation of the imaging device 120 at time ti+1.
In some embodiments, it is possible to use tracking of the marine objects over time to improve matching between the first marine objects and the second marine objects. In particular, if it has been determined that there is a matching between two given objects (a given object of the first marine objects and a given object of the second marine objects) at different periods of time, there is a high likelihood that the two given objects correspond to the same marine object. Therefore, at subsequent periods of time during which it is attempted to match the position of the first marine objects and the position of the second marine objects (operation 225), matching of the two given objects should be assigned a high weight in the optimization problem. This can be performed by introducing a term (reward) in the loss function which takes this information into account.
A non-limitative example is provided with reference to
Assume that at time t1 (after e.g. a plurality of iterations of the method—as depicted in reference 226), position 270 of a first marine object with tracking ID (1,1) matches a position 271 of a second marine object with tracking ID (2,1), and position 272 of a first marine object with tracking ID (1,2) matches a position 273 of a second marine object with tracking ID (2,2) in a common referential 252.
Assume that at time t2 (after e.g. a plurality of iterations of the method), position 274 of a first marine object with tracking ID (1,1) matches a position 271 of a second marine object with tracking ID (2,1), and position 276 of a first marine object with tracking ID (1,2) matches a position 277 of a second marine object with tracking ID (2,2) in the common referential 252.
At time t3, at the first iteration of the method, a first marine object has position 278 and tracking ID (1,1), another first marine object has position 280 and tracking ID (1,2), a second marine object has position 279 and tracking ID (2,1) and another second marine object has position 281 and tracking ID (2,2).
When trying to match the positions of the first marine objects and the second marine objects (operation 225), it is possible to take into account the tracking IDs (as illustrated in operations 285 and 286 of
Therefore, at time t3, data informative of the height and/or orientation of the imaging device 120 can be estimated to attempt to match position of the first marine object with tracking ID (1,1) with position of the second marine object with tracking ID (2,1), and to attempt to match position of the first marine object with tracking ID (1,2) with position of the second marine object with tracking ID (2,2) (since there is a high likelihood that these respective positions correspond to the same respective marine object).
Matching of the first and second marine objects relies therefore not only on position, but can rely also on tracking data of the first and second marine objects over time (and/or other parameters as described hereinafter).
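By way of non-limitative illustration, such a reward term could enter a pairwise matching cost as follows (a minimal Python sketch; the names and the weight are illustrative assumptions made here, not taken from the source):

```python
def pair_cost(dist, first_track_id, second_track_id, past_matches,
              reward=100.0):
    """Cost of matching a first marine object with a second marine object:
    the distance in the common referential, reduced by a reward when the
    two tracking IDs were already matched at previous periods of time."""
    cost = dist
    if (first_track_id, second_track_id) in past_matches:
        cost -= reward
    return cost

# e.g. with past_matches = {((1, 1), (2, 1)), ((1, 2), (2, 2))} from
# times t1 and t2, re-matching track (1, 1) with track (2, 1) at time t3
# is strongly favoured by the optimization.
```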
Attention is drawn to
As mentioned above, in some embodiments, the first data informative of the first marine objects and the second data informative of the second marine objects include data informative of a type of the marine objects.
This can be used to improve matching between the first marine objects and the second marine objects.
Indeed, if it is known that two marine objects correspond to the same type of object, a higher weight should be assigned in the optimization problem to matching these two marine objects (at operations 222 or 225). This can be performed by introducing a term (reward) in the loss function which takes this information into account.
On the contrary, if two marine objects correspond to different types of objects, a low weight should be assigned in the optimization problem to matching these two marine objects (at operations 222 or 225). This can be performed by introducing a term (penalty) in the loss function which takes this information into account.
A non-limitative example is shown in
Assume that a first marine object has a position 305 in the common referential 320. The first data includes type of objects and indicates that the first marine object is a marine vessel.
Assume that a second marine object has a position 315 in the common referential 320. The second data indicates that this second marine object is a marine vessel.
Assume that another second marine object has a position 310 in the common referential 320. The second data indicates that this second marine object is a buoy.
Although position 305 of the first marine object is closer to position 310 of a second marine object than to position 315 of another second marine object, the method should estimate data informative of height and/or orientation of the imaging device 120 to improve matching between position 305 of the first marine object and position 315 of a second marine object, since these two marine objects both correspond to a marine vessel (whereas the second marine object with position 310 corresponds to a buoy, which is a different marine object).
This is shown in operations 350 and 360 of
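Continuing the illustrative sketch introduced above for the tracking reward, the type information could be folded into the same pairwise cost (again, names and weights are illustrative assumptions):

```python
def pair_cost_with_type(dist, first_type, second_type,
                        reward=100.0, penalty=100.0):
    """Reward matching objects of the same reported type and penalize
    matching objects of different types, per the reward/penalty terms
    described above."""
    cost = dist
    if first_type is not None and second_type is not None:
        cost += -reward if first_type == second_type else penalty
    return cost

# In the example above, the marine vessel at position 305 is matched with
# the marine vessel at position 315 rather than with the geometrically
# closer buoy at position 310.
```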
More generally, the method can use various parameters or additional input which can assist in improving the matching (or association) between the first marine objects and the second marine objects.
Attention is now drawn to
The method includes obtaining (operation 400) first data including first position data informative of a position of first marine objects derived from images acquired by the imaging device 120. Operation 400 is similar to operation 200.
The method includes obtaining (operation 410) second data including second position data informative of a position of second marine objects provided by another sensor 115. Operation 410 is similar to operation 210. This other sensor is different from the imaging device 120, and is generally a sensor which is not a camera.
As mentioned above, at least some of the first marine objects and the second marine objects correspond to the same marine objects.
The method further includes determining (operation 420) a current state for data informative of height and/or orientation of the imaging device 120.
At the first iteration of the method, the exact height and/or orientation of the imaging device 120 is unknown. Therefore, operation 420 can include generating a random value for the height and/or orientation of the imaging device 120.
In some embodiments, a first estimation of the height and/or orientation of the imaging device 120 can be available. This first estimation can be provided e.g. by an operator and/or a manufacturer who can have prior knowledge of the height and/or orientation of the imaging device 120 (e.g. due to the fact that they installed the imaging device 120 on the marine vessel 125). However, due to the various factors mentioned above, this first estimation is no longer exact during the voyage of the marine vessel 125, and therefore the parameters of the imaging device 120 need to be estimated.
In some embodiments, a first estimation of the height and/or orientation of the imaging device 120 can be provided by an operator located on the marine vessel 125, who measures a first value of the height and/or orientation of the imaging device 120.
Once a current state is available for the height and/or orientation of the imaging device 120, first position data of the first marine objects can be projected (operation 430) into a common referential (e.g. Earth referential—this is however not limitative). An example of this projection is provided in
Similarly, the second position data can be projected into the common referential, as already explained above.
The method further includes (operation 440) determining data informative of at least one of a height or an orientation of the imaging device to optimize a matching between position of at least some of the first marine objects and position of at least some of the second marine objects.
Examples of optimization algorithms and loss functions that can be used include e.g. MSE (Mean Squared Error), gradient descent, MAE (Mean Absolute Error), minimal L2 (Euclidean) distance, etc. These examples are not limitative.
As mentioned above, operation 440 can include using various additional data to improve matching between the first marine objects and the second marine objects, such as type of marine object, tracking data of marine objects, etc.
A loss function can be calculated to reflect the optimization problem. If the loss function does not meet a convergence criterion (e.g. because its value is above a threshold), the method can be repeated, by repeating operation 440, in which it is attempted to improve the estimation of the height and/or orientation of the imaging device 120, in order to bring the loss function closer to meeting the convergence criterion.
When the loss function of the optimization algorithm meets a convergence criterion (e.g. its value is below a threshold and/or a sufficient number of iterations has been performed), the current state (current estimation) of the height and/or orientation of the imaging device 120 can be output (operation 450). Similarly, the matching between the first marine objects and the second marine objects can be also output, for further usage, as explained hereinafter (for example, two marine objects acquired by different sensors can be considered as matching when their position in the common referential, as determined using the estimated height/orientation of the imaging device, is substantially similar, or their distance is below a threshold).
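By way of non-limitative illustration, operation 440 could be sketched as follows in Python. The projection function, the initial state and the loss are assumptions made here; a derivative-free optimizer is used in this sketch, but any of the approaches listed above could serve.

```python
import numpy as np
from scipy.optimize import minimize

def estimate_camera_state(first_pixels, second_positions, project, x0):
    """Search over the camera state (e.g. height, roll, pitch, yaw) for
    the values whose projection of the first marine objects best matches
    the matched second marine objects.

    project(state, first_pixels) -> Nx2 array of positions in the common
    referential; second_positions is the matched Nx2 array."""
    def loss(state):
        projected = project(state, first_pixels)
        # mean squared distance between matched pairs
        return np.mean(np.sum((projected - second_positions) ** 2, axis=1))

    result = minimize(loss, x0, method="Nelder-Mead")  # derivative-free search
    return result.x, result.fun  # estimated state and final loss value
```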
According to some embodiments, the method of
It can happen that a given marine object is detected only by, e.g., the imaging device and the first sensor, whereas another marine object is detected only by the first and the second sensors, or only by the imaging device and the second sensor.
As explained hereinafter, once the height and/or orientation of the imaging device 120 is estimated, this data can be used for different marine applications.
According to some embodiments, and as shown in
In other words, this indicates that this given marine object has been identified as the same marine object acquired both by the imaging device 120 and another sensor 115. Therefore, if the other sensor 115 provides second position data informative of the given marine object (e.g. expressed in a global/absolute referential, such as a world referential—for example, sensor 115 is an AIS), it is possible to determine (operation 460) position of the given marine object acquired by the imaging device 120 using the second position data. Attention is now drawn to
The method includes obtaining (operation 400) first data including first position data informative of a position of first marine objects derived from images acquired by an imaging device 120. Operation 400 is similar to operation 200.
The method includes obtaining (operation 410) second data including second position data informative of a position of second marine objects provided by a first sensor (see reference 115). Operation 410 is similar to operation 210. This first sensor is different from the imaging device 120 and is generally a sensor which is not a camera.
The method includes obtaining (operation 415) third data including third position data informative of a position of third marine objects provided by a second sensor (see reference 115).
The second sensor is different from the first sensor and from the imaging device 120. According to some embodiments, the second sensor is not a camera.
According to some embodiments, the second sensor is of a different type than the first sensor (e.g. the first sensor is an AIS and the second sensor is a radar or a LIDAR—this is not limitative).
The method can include an intermediate operation 416, in which at least some of the second marine objects and at least some of the third marine objects are merged, to obtain an aggregated (unified) set of marine objects. This operation is however not mandatory.
Each marine object of the aggregated set of marine objects is assigned with position data, which can correspond e.g. to the second position data and/or to the third position data.
Operation 416 can be performed by merging marine objects for which a distance between their positions (in a common referential) is below a threshold, and/or is minimal.
Operation 416 can include solving an optimization problem, in which it is attempted to find pairs of marine objects (each pair including a marine object of the second marine objects and a marine object of the third marine objects), such that the distance between marine objects of each pair is minimized. Optimization algorithms mentioned above can be used.
For example, assume that the first sensor is an AIS and the second sensor is a radar. An AIS provides latitude/longitude of the second marine objects, and it is possible to use the relative range/bearing measurements of the radar and position of the marine vessel 125 to determine latitude/longitude of the third marine objects. Therefore, it is possible to merge the second marine objects and the third marine objects into an aggregated set of marine objects.
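By way of non-limitative illustration, operation 416 could be sketched as follows (Python; it is assumed, for illustration only, that both sets of positions have already been converted to planar coordinates in metres, and the threshold is an arbitrary placeholder):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def merge_objects(second_xy, third_xy, max_dist_m=200.0):
    """second_xy: Nx2 array of second-marine-object positions;
    third_xy: Mx2 array of third-marine-object positions.
    Returns index pairs (i, j) of objects considered the same."""
    # pairwise distance matrix between the two sets
    dists = np.linalg.norm(second_xy[:, None, :] - third_xy[None, :, :], axis=2)
    # optimal pairing minimizing the total distance (Hungarian algorithm)
    rows, cols = linear_sum_assignment(dists)
    # merge only pairs whose distance is below the threshold
    return [(i, j) for i, j in zip(rows, cols) if dists[i, j] <= max_dist_m]
```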
Although the method of
The method further includes, similarly to
Using the current state of the data informative of the height and/or orientation of the imaging device, the method includes projecting (operation 430) the first position data into a common referential (e.g. an absolute referential, such as an Earth referential).
Regarding the position data of the aggregated set of marine objects, in some embodiments, this position data is already expressed in the common referential. Indeed, if at least one given sensor (among the first sensor and the second sensor) provides position data in the common referential, then after merging of the second and third marine objects into an aggregated set of marine objects (see operation 416), it is possible to assign, to each object of the aggregated set of marine objects, position data in the common referential, as provided by the given sensor.
The method further includes determining (operation 439) data Dcamera informative of a height and/or an orientation of the imaging device 120 so as to optimize a matching between (i) the position of at least some of the first marine objects, determined using Dcamera and the first position data, and (ii) the position of at least some of the marine objects of the aggregated set of marine objects. Operation 439 is similar to operation 440, but differs in that operation 439 includes matching between the first marine objects and the aggregated set of marine objects (obtained using at least two sensors). In some embodiments, operation 439 can include determining Dfield_of_view.
The method can be iterated (e.g., operation 439 can be repeated to fine-tune the estimation of Dcamera) until a convergence criterion is met (e.g. until the loss function satisfies the convergence criterion).
Once the convergence criterion is met, an estimation of the height and/or orientation of the imaging device 120 can be output (see operation 450).
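By way of non-limiting illustration, the following sketch shows one possible form of operations 439/450 under simplifying assumptions: the camera detections are projected onto the sea surface through a flat-sea pinhole model parameterized by Dcamera = (height, pitch, yaw), and a generic optimizer iterates until its convergence criterion is met. The projection model, the loss function and all names are illustrative assumptions, not the disclosed implementation.

```python
import numpy as np
from scipy.optimize import minimize

def project_to_sea(pixels, d_camera, fx, fy, cx, cy):
    """Project pixel detections (u, v) onto the sea surface (east/north
    metres, vessel at the origin), given camera height (m), pitch and yaw
    (rad); assumes a locally flat sea and detections at the waterline."""
    height, pitch, yaw = d_camera
    u, v = pixels[:, 0], pixels[:, 1]
    depression = pitch + np.arctan2(v - cy, fy)   # angle below the horizon
    dist = height / np.tan(np.clip(depression, 1e-4, None))
    bearing = yaw + np.arctan2(u - cx, fx)
    return np.stack([dist * np.sin(bearing), dist * np.cos(bearing)], axis=-1)

def loss(d_camera, pixels, aggregated_en, intrinsics):
    """Sum of distances from each projected camera detection to its
    nearest object of the aggregated set (the matching criterion)."""
    proj = project_to_sea(pixels, d_camera, *intrinsics)
    dists = np.linalg.norm(proj[:, None, :] - aggregated_en[None, :, :],
                           axis=2)
    return dists.min(axis=1).sum()

def estimate_d_camera(pixels, aggregated_en, intrinsics,
                      x0=(20.0, 0.05, 0.0)):
    """Operations 439/450: iterate until the optimiser's convergence
    criterion is met; returns the estimated (height, pitch, yaw)."""
    res = minimize(loss, x0, args=(pixels, aggregated_en, intrinsics),
                   method="Nelder-Mead")
    return res.x
```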
As explained hereinafter, once the height and/or orientation of the imaging device 120 is estimated, this data can be used for different marine applications.
Attention is now drawn to
As explained in the various embodiments above, a matching can be performed between position of the first marine objects (acquired by the imaging device 120) and position of the second (or even third, or more) marine objects (acquired by other sensors 115).
Once this matching has been performed, it is known that a marine object of the first marine objects and a marine object of the second marine objects correspond to the same given marine object, acquired both by the imaging device 120 and the sensor(s) 115 (since their positions are substantially identical, or differ by less than a threshold). In other words, an association (operation 500) between the first marine objects and the second marine objects has been performed.
In order to augment knowledge of the given marine object, it is possible to use this association and the various parameters provided by the different sensors.
In particular, it is possible to determine (operation 501) a parameter of the given marine object, which cannot be determined using (only) the imaging device 120 but can be determined using another sensor 115 which has detected this given marine object (since it is known that the respective acquisitions by the imaging device and the at least one sensor 115 correspond to the same physical object).
Conversely, it is possible to determine a parameter of the given marine object, which cannot be determined using (only) a sensor 115 but can be determined using the imaging device 120 which has detected this given marine object (since it is known that the respective acquisitions by the imaging device and the at least one sensor 115 correspond to the same physical object).
For example, the AIS provides a type (e.g. “cargo”) of a given marine object. However, this information can be corrupted, since it is provided by the marine object itself. The imaging device can be used to determine the true type of the given marine object (using e.g., a deep neural network which detects the type of the object based on the image).
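By way of non-limiting illustration, such a cross-check of the AIS-reported type could take the following form; `classify_fn` stands for a trained deep neural network (hypothetical here) and is not specified by the present disclosure.

```python
def verify_reported_type(image_crop, ais_type, classify_fn,
                         min_confidence=0.8):
    """Cross-check the AIS-reported type against the image. `classify_fn`
    maps an image crop to (type_label, confidence); it stands for any
    trained classification network. Returns the trusted type and a flag
    raised when the AIS report is contradicted by the visual evidence."""
    visual_type, confidence = classify_fn(image_crop)
    if confidence >= min_confidence and visual_type != ais_type:
        return visual_type, True   # AIS-reported type likely corrupted
    return ais_type, False
```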
A set of augmented data/parameters can be determined for each marine object.
As a consequence, a database of marine objects (with corresponding augmented data/parameters) can be created (see operation 502) and queried, for various purposes, as explained hereinafter.
Assume that it is desired to monitor the detection performance of the imaging device (or of another sensor). For each marine object, it is known whether it was detected or not, and by which sensor (since, for each marine object, after the association process described in the various embodiments above, it is known which data has been acquired by each of the sensors). The performance of each sensor for each marine object can be stored in the database. It is therefore possible to query (operation 503) the database to output the performance of the imaging device (or of another sensor) for specific configurations (e.g.: what is the performance of the imaging device at a distance of more than 4 miles? What is the performance of the radar for a cargo in rough sea? etc.). Based on such a query, it is possible to attempt to improve the performance of the sensor(s) in the specific configurations in which they underperformed. Similarly, it is possible to query the database to obtain images of marine objects with desired parameters (e.g. a specific size, a specific type, etc.).
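By way of non-limiting illustration, operations 502/503 could rely on a conventional relational database; the schema and column names below are illustrative assumptions, and the query corresponds to the first example above.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE detections (
        object_id    INTEGER,   -- identity assigned after association
        sensor       TEXT,      -- 'camera', 'radar', 'ais', ...
        detected     INTEGER,   -- 1 if this sensor acquired the object
        distance_nm  REAL,      -- range at acquisition time
        object_type  TEXT,      -- e.g. 'cargo' (augmented data)
        sea_state    INTEGER    -- e.g. Douglas sea scale
    )""")

# Detection rate of the imaging device at a distance of more than 4 miles:
rate = conn.execute(
    "SELECT AVG(detected) FROM detections "
    "WHERE sensor = 'camera' AND distance_nm > 4").fetchone()[0]
```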
Another usage of this method is depicted in
Assume that a sensor (e.g. an AIS) of the marine vessel provides information on the type of the marine objects (e.g. “cargo”, “fishing vessel”, etc.). Therefore, for each of a plurality of marine objects detected by the imaging device, it is possible to automatically determine the type of the marine object, as provided by the AIS (since it is known that the respective acquisitions by the imaging device and the AIS correspond to the same physical object).
As a consequence, it is possible to automatically generate a set of labelled images, each comprising the image of a marine object and a label indicative of the type of the marine object (or of other/additional parameters, such as the state of the sea, the distance of the marine object, etc.). An automatic labelling (sensor labelling) of marine objects is therefore achieved. The labelled images can be used e.g. for supervised training of a deep neural network configured to detect marine objects in images. A training with a higher granularity is therefore achieved.
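By way of non-limiting illustration, the automatic labelling could be implemented as follows, where each association produced by operation 500 yields one labelled training sample; the field names are illustrative assumptions.

```python
def build_labelled_dataset(associations):
    """`associations`: iterable of (camera_detection, ais_record) pairs
    produced by the association of operation 500. Each camera detection
    inherits the AIS-reported type as its training label; the resulting
    set can feed supervised training of a detection network."""
    dataset = []
    for cam_det, ais_rec in associations:
        dataset.append({
            "image": cam_det["crop"],           # pixel crop of the object
            "label": ais_rec["ship_type"],      # e.g. 'cargo'
            "distance_nm": cam_det.get("distance"),
            "sea_state": cam_det.get("sea_state"),
        })
    return dataset
```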
Attention is now drawn to
According to some embodiments, at least some of the marine objects encountered by the marine vessel 125 during its voyage do not have a localization system which provides their position (such as AIS, GPS, etc.). For example, an iceberg does not have a localization system.
Assume that a given marine object does not have a localization system. The given marine object may be a new object (which was not used in the method of determining height and/or orientation of the imaging device 120).
Once height and/or orientation of the imaging device 120 has been determined (operation 510—using e.g., the various methods described above), it is possible to use (operation 520) an image of the given marine object acquired by the imaging device 120, and the estimated height and/or orientation of the imaging device 120 to determine absolute position (e.g. in world coordinates) of the given marine object.
A method of converting the position of the given marine object from the referential of the image to a global/absolute position (independent from the referential of the image) has been provided e.g., with reference to
Similarly, it is possible to determine a distance between the marine object and the marine vessel (using e.g. the method described with reference to
This is highly beneficial, since some marine objects are detected only by the imaging device 120 (for example, a fishing vessel which is not equipped with an AIS, and which is close to the marine vessel 125, is detected only by the imaging device 120, and not by other sensors of the marine vessel such as the AIS and the radar). This method enables determining their global/absolute position in a desired referential (or other parameters), even though the parameters (Dcamera) of the imaging device 120 evolve over time, as already explained.
It is therefore possible to determine the position of a marine object (and/or the distance between the marine vessel and the marine object) using only an image of the marine object acquired by the imaging device (the position being determined in a referential independent of the image).
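By way of non-limiting illustration, operation 520 could take the following form, reusing the hypothetical project_to_sea helper from the sketch of operation 439 above (flat-sea assumption): a single pixel detection, together with the estimated Dcamera and the own-ship position, yields the latitude/longitude of the object and its range.

```python
import numpy as np

EARTH_RADIUS_M = 6_371_000.0

def pixel_to_latlon(pixel, d_camera, intrinsics, own_lat, own_lon):
    """Convert one (u, v) detection into latitude/longitude and range (m),
    given the estimated camera height/orientation (D_camera) and the
    own-ship position. Relies on the project_to_sea sketch above."""
    east, north = project_to_sea(np.array([pixel]), d_camera, *intrinsics)[0]
    lat = own_lat + np.degrees(north / EARTH_RADIUS_M)
    lon = own_lon + np.degrees(
        east / (EARTH_RADIUS_M * np.cos(np.radians(own_lat))))
    return lat, lon, float(np.hypot(east, north))
```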
Attention is now drawn to
When a marine object is detected over a plurality of images acquired by the imaging device 120, it is possible to track the marine object in the plurality of images, as explained above. This is shown in
At time t2, marine vessel 600 is detected at position Xt2, Yt2 in the referential of the image.
The velocity of the marine vessel 600 relative to the marine vessel 125 can be calculated in the referential of the image (e.g. in terms of pixels per unit of time). In the example of
Since data informative of the height and/or orientation of the imaging device 120 has been estimated (operation 620—using the various methods described above), it is possible to convert (operation 630) the velocity of the marine vessel 600 in the referential of the image into a velocity of the object relative to the marine vessel (e.g. expressed in distance per unit of time). This can use a method similar to
In particular, it is possible to use only the imaging device 120 to determine a velocity of marine objects relative to the marine vessel 125 (expressed in distance per unit of time).
Since position of the marine object can be expressed in a global/absolute referential (e.g. Earth referential), it is possible to determine also direction of motion of the marine object over time in an absolute referential. In other words, a velocity vector can be determined in the absolute referential, using (only) data acquired by the imaging device 120.
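By way of non-limiting illustration, operations 620/630 could be sketched as follows, again reusing the hypothetical project_to_sea helper: the pixel positions of the tracked object at two instants are projected onto the sea surface, their difference gives the relative velocity, and adding the own-ship velocity (e.g. from GPS) gives the velocity vector in the absolute referential.

```python
import numpy as np

def track_to_velocity(pixel_t1, pixel_t2, t1, t2, d_camera, intrinsics,
                      own_velocity_en=(0.0, 0.0)):
    """Relative and absolute east/north velocity (m/s) of an object
    detected at pixel_t1 (time t1) and pixel_t2 (time t2); relies on the
    project_to_sea sketch above."""
    p1 = project_to_sea(np.array([pixel_t1]), d_camera, *intrinsics)[0]
    p2 = project_to_sea(np.array([pixel_t2]), d_camera, *intrinsics)[0]
    v_rel = (p2 - p1) / (t2 - t1)                 # relative to the vessel
    v_abs = v_rel + np.asarray(own_velocity_en)   # absolute referential
    return v_rel, v_abs
```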
Attention is now drawn to
In order to control the trajectory of the marine vessel 125, it is beneficial to determine position of various marine objects surrounding the marine vessel 125, which can be detected by various sensors 130 of the marine vessel 125. In particular, position of the various marine objects should be expressed in a common referential (e.g. an absolute referential such as an Earth referential).
For example, it is beneficial for an operator of the marine vessel to obtain a map enabling a visualization of the position of the different marine objects surrounding the marine vessel 125. This can be helpful to facilitate control of the marine vessel during its voyage.
Similarly, the position of the various marine objects surrounding the marine vessel 125, expressed in a common referential, can be used by an auto-pilot system of the marine vessel, which controls the trajectory of the marine vessel over time.
Once height and/or orientation of the imaging device 120 has been estimated (operation 700—using e.g., the methods of
Regarding other sensors 115 of the marine vessel 125, some of them provide an absolute position (such as AIS, GPS) of the marine objects.
Some of the sensors 115 (which are not imaging devices) provide a relative position (such as a radar), which can be converted into an absolute position, as explained above.
Therefore, position of the marine objects acquired by the various sensors 130 of the marine vessel 125 can be expressed (operation 710) in a common referential (e.g., absolute referential).
A map of the position of the marine objects, expressed in a common referential (absolute referential), can be generated. In some embodiments, the map can include a graphical map in which position of each marine object can be depicted.
In some embodiments, type of the marine objects can be determined (e.g. using the images of the imaging device 120) or obtained (e.g. from an AIS).
The graphical map can therefore depict, for each marine object, a type of the marine object. Navigation is therefore facilitated.
Since a given marine object may be acquired by a plurality of different sensors of the marine vessel 125, in some embodiments it is possible to merge (e.g. in the map) marine objects acquired by different sensors and which have similar position (and, in some embodiments, similar type) into a single marine object with a single position. Two marine objects can be considered to share a similar position e.g. when a distance between the two marine objects is below a threshold.
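By way of non-limiting illustration, this map-level fusion could be implemented with a simple greedy pass over the detections; the data layout, the threshold and the optional type check are illustrative assumptions.

```python
import numpy as np

def fuse_map_objects(objects, max_dist_m=100.0, check_type=True):
    """`objects`: list of dicts with 'pos' (east/north metres), 'type' and
    'sensor'. Detections close enough (and, optionally, of the same type)
    are collapsed into a single map entry listing all contributing
    sensors."""
    fused = []
    for obj in objects:
        pos = np.asarray(obj["pos"], dtype=float)
        for entry in fused:
            close = np.linalg.norm(pos - entry["pos"]) <= max_dist_m
            same_type = (not check_type) or obj["type"] == entry["type"]
            if close and same_type:
                entry["sensors"].add(obj["sensor"])
                entry["pos"] = 0.5 * (entry["pos"] + pos)  # simple averaging
                break
        else:
            fused.append({"pos": pos, "type": obj["type"],
                          "sensors": {obj["sensor"]}})
    return fused
```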
Attention is now drawn to
The marine vessel 125 can include one or more sensors (different from the imaging device 120) which need to be calibrated. For example, a sensor such as an IMU (Inertial Measurement Unit) drifts over time, and therefore needs to be recalibrated over time.
The method of
The method further includes using Dcamera to calibrate (operation 810) another sensor of the marine vessel 125, which is different from the imaging device 120. In some embodiments, the other sensor is not an imaging device, but this is not limitative. Calibration of the other sensor can include providing the “true” orientation (extracted from Dcamera) to the other sensor to calibrate it.
In some embodiments, a plurality of other sensors can be recalibrated. In some embodiments, the calibration can use Dfield_of_view.
In some embodiments, since Dcamera is determined over a plurality of periods of time, it is possible to (re)calibrate the other sensor over time.
For example, the IMU of the marine vessel 125 can be calibrated. An IMU generally measures the orientation (roll, pitch, yaw) and the altitude of the marine vessel 125. The orientation and/or height of the imaging device 120, as estimated using the various methods described above, can be used to calibrate the orientation and/or height measured by the IMU of the marine vessel 125, at a given period of time or at a plurality of periods of time.
For example, assume that the IMU and the imaging device 120 are physically aligned (e.g. they have the same orientation). During the voyage of the marine vessel 125, the orientation and/or height provided by the IMU includes a drift, which needs to be corrected. Since the orientation and/or height of the imaging device 120 can be determined (as explained in the various methods above), this can be used to recalibrate the IMU and cancel the error caused by the drift.
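By way of non-limiting illustration, such a recalibration could estimate the IMU bias as the running offset between the IMU readings and the camera-derived estimates over a plurality of periods of time, assuming the two devices are physically aligned; all names are illustrative assumptions.

```python
import numpy as np

def estimate_imu_bias(imu_readings, camera_estimates):
    """Both arguments: arrays of shape (N, 4) holding (roll, pitch, yaw,
    height) over the same N periods of time; the mean offset is taken as
    the drift-induced bias of the IMU."""
    return np.mean(np.asarray(imu_readings) - np.asarray(camera_estimates),
                   axis=0)

def recalibrate(imu_reading, bias):
    """Drift-corrected IMU output."""
    return np.asarray(imu_reading) - bias
```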
Although an example referring to recalibration of IMU has been provided, this applies similarly to other sensors (e.g. an experimental radar, etc.) present on the marine vessel 125, and which need to be calibrated/recalibrated. The orientation and/or height of the imaging device 120 can be used to perform the calibration/recalibration of the other sensor.
Attention is now drawn to
Assume that orientation and/or height of the imaging device 120 has been estimated as explained in the various embodiments above. In some embodiments, the field of view of the imaging device has also been estimated.
The method of
This can include acquiring an image of a first marine object and determining (operation 900) distance to the first marine object (this can be determined using the equations provided with reference to
This can include acquiring an image of a second marine object (different from the first marine object) and determining (operation 910) distance to the second marine object (this can be determined using the equations provided with reference to
Since the position of the first and second marine objects is known (e.g., because they embed a localization system such as an AIS), and the distance of the marine vessel 125 to each of these marine objects has been determined, the absolute position (latitude/longitude) of the marine vessel 125 can be determined (operation 920—as an intersection of two circles). This is useful since it enables the marine vessel to be independent from localization systems such as GPS. This can also be used to up-sample localization data provided by localization systems such as GPS.
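By way of non-limiting illustration, operation 920 reduces to the classical intersection of two circles in a local east/north frame; the sketch below returns both candidate positions, the ambiguity being resolvable by any coarse prior (e.g. dead reckoning). Names are illustrative assumptions.

```python
import numpy as np

def fix_position(c1, r1, c2, r2):
    """c1, c2: known positions of the two marine objects (east/north, m);
    r1, r2: distances determined from the images (operations 900/910).
    Returns the two candidate positions of the marine vessel 125."""
    c1, c2 = np.asarray(c1, dtype=float), np.asarray(c2, dtype=float)
    d = np.linalg.norm(c2 - c1)
    if d == 0 or d > r1 + r2 or d < abs(r1 - r2):
        raise ValueError("circles do not intersect")
    a = (r1**2 - r2**2 + d**2) / (2 * d)   # offset along the centre line
    h = np.sqrt(max(r1**2 - a**2, 0.0))    # offset perpendicular to it
    mid = c1 + a * (c2 - c1) / d
    perp = np.array([-(c2 - c1)[1], (c2 - c1)[0]]) / d
    return mid + h * perp, mid - h * perp
```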
Embodiments of the presently disclosed subject matter are not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the presently disclosed subject matter as described herein.
The invention contemplates a computer program being readable by a computer for executing one or more methods of the invention. The invention further contemplates a machine-readable memory tangibly embodying a program of instructions executable by the machine for executing one or more methods of the invention.
It is to be noted that the various features described in the various embodiments may be combined according to all possible technical combinations.
It is to be understood that the invention is not limited in its application to the details set forth in the description contained herein or illustrated in the drawings. The invention is capable of other embodiments and of being practiced and carried out in various ways. Hence, it is to be understood that the phraseology and terminology employed herein are for the purpose of description and should not be regarded as limiting. As such, those skilled in the art will appreciate that the conception upon which this disclosure is based may readily be utilized as a basis for designing other structures, methods, and systems for carrying out the several purposes of the presently disclosed subject matter.
Those skilled in the art will readily appreciate that various modifications and changes can be applied to the embodiments of the invention as hereinbefore described without departing from its scope, defined in and by the appended claims.
This application is a national phase filing under 35 U.S.C. § 371 of and claims priority to PCT Patent Application No. PCT/IL2022/050665, filed on Jun. 21, 2022, which claims the priority benefit under 35 U.S.C. § 119 of Israeli Patent Application No. 284251, filed on Jun. 21, 2021, the contents of each of which are hereby incorporated in their entireties by reference.