The disclosure relates to an information processing method, a radar apparatus, and a recording medium.
Navigation radar is an important apparatus that detects objects around one's own ship
and provides information for safe and optimal avoidance actions. Patent Document 1 (Japanese Patent No. 3794641) discloses a technology in which an AIS apparatus of a ship equipped with an ARPA radar apparatus is connected to the ARPA radar apparatus via an interface. The technology continuously compares the dynamic information received from nearby ships with the radar symbol information acquired from the ARPA radar apparatus, treats radar symbol information for which no matching dynamic information is recognized as transmission stop warning information, treats dynamic information for which no matching radar symbol information is recognized as no situation report warning information, and alerts the operator accordingly.
A radar emits radio waves around the own ship and observes the reflected waves to display information on the orientation and distance of an object as a radar echo. However, the radio waves emitted toward an object may also be reflected by another object, generating a false image of something that does not actually exist.
At least one embodiment of the disclosure relates to an information processing method, a radar apparatus, and a recording medium that can recognize a false image on a radar echo.
An information processing method according to an embodiment of the disclosure includes: acquiring image data that includes time-series radar images; inputting one acquired radar image to a first training model that outputs position data of a first false image candidate in response to one radar image being input, to acquire the position data of the first false image candidate;
inputting the acquired time-series radar images to a second training model that outputs position data of a second false image candidate in response to time-series radar images being input, to acquire the position data of the second false image candidate; and detecting a false image of a radar echo based on the acquired position data of the first false image candidate and the second false image candidate.
The information processing method according to an embodiment of the disclosure further includes: acquiring a detection result at a sensor unit that includes at least one of an AIS and an image sensor; and detecting the false image of the radar echo based on the acquired detection result, a probability regarding the position data of the first false image candidate output by the first training model, and a probability regarding the position data of the second false image candidate output by the second training model.
In the information processing method according to an embodiment of the disclosure, the first false image candidate includes a false image caused by at least one of a side lobe, a backwash, and a cloud.
In the information processing method according to an embodiment of the disclosure, the second false image candidate includes a false image caused by at least one of multiple reflections of a detection wave between an own ship and another ship, reflection between another ship and another ship, and reflection between another ship and land.
The information processing method according to an embodiment of the disclosure further includes differentiating a display mode according to whether the radar echo is a false image or not.
The information processing method according to an embodiment of the disclosure further includes differentiating a display mode of the false image based on a false image probability in response to the radar echo being a false image.
A radar apparatus according to an embodiment of the disclosure includes: processing circuitry configured to: acquire image data that includes time-series radar images; input one acquired radar image to a first training model that outputs position data of a first false image candidate in response to one radar image being input, to acquire the position data of the first false image candidate; input the acquired time-series radar images to a second training model that outputs position data of a second false image candidate in response to time-series radar images being input, to acquire the position data of the second false image candidate; and detect a false image of a radar echo based on the acquired position data of the first false image candidate and the second false image candidate.
A non-transitory computer-readable recording medium according to an embodiment of the disclosure records a computer program configured to enable a computer to perform processes of: acquiring image data that includes time-series radar images; inputting one acquired radar image to a first training model that outputs position data of a first false image candidate in response to one radar image being input, to acquire the position data of the first false image candidate; inputting the acquired time-series radar images to a second training model that outputs position data of a second false image candidate in response to time-series radar images being input, to acquire the position data of the second false image candidate; and detecting a false image of a radar echo based on the acquired position data of the first false image candidate and the second false image candidate.
According to the disclosure, it is possible to recognize a false image on a radar echo.
The illustrated embodiments of the subject matter will be best understood by reference to the drawings, wherein like parts are designated by like numerals throughout. The following description is intended only by way of example, and simply illustrates certain selected embodiments of devices, systems, and processes that are consistent with the subject matter as claimed herein.
Embodiments of the disclosure will be described hereinafter.
The antenna 11 is a radar antenna that is capable of transmitting highly directional pulsed
radio waves (transmission waves). The antenna 11 is configured to receive reflected waves from an object such as a target object. The radar apparatus 100 measures the time from when pulsed radio waves are transmitted until when reflected waves are received. Thereby, the radar apparatus 100 can detect the distance to the target object. The antenna 11 is configured to be rotatable through 360° in a horizontal plane. The antenna 11 is configured to repeatedly transmit and receive radio waves at particular intervals while changing the transmission direction of the pulsed radio waves (while changing the antenna angle). Thereby, the radar apparatus 100 can detect target objects over 360° in a plane around the own ship.
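As a concrete illustration of this time-of-flight relationship (not part of the disclosure; the constant and example value below are assumptions), the range to a target follows from half the round-trip travel distance of the pulse:

```python
# Minimal sketch (illustrative only): converting the measured round-trip
# time of a radar pulse into the range to the target object.
C = 299_792_458.0  # speed of light in m/s

def range_from_round_trip(t_seconds: float) -> float:
    """Return the one-way distance to the target in meters.

    The pulse travels to the object and back, so the one-way
    range is half the round-trip distance.
    """
    return C * t_seconds / 2.0

# Example: a 12.5 microsecond round trip corresponds to roughly 1874 m.
print(range_from_round_trip(12.5e-6))
```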
The reception unit 12 detects and amplifies the echo signal obtained from the reflected waves received by the antenna 11, and outputs the amplified echo signal to the A/D conversion unit 13. The A/D conversion unit 13 samples the analog echo signal and converts it into digital data (echo data) composed of multiple bits. Here, the echo data includes data specifying the strength (signal level) of the echo signal obtained from the reflected waves received by the antenna 11. The A/D conversion unit 13 outputs the echo data to the radar image generation unit 20.
The radar image generation unit 20 generates a radar image based on the echo data output from the A/D conversion unit 13.
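The disclosure does not fix a data layout for the radar image; the sketch below assumes an R-θ representation with one row per antenna bearing and one column per range sample (the array sizes and the helper name are hypothetical):

```python
import numpy as np

# Minimal sketch (assumptions, not the disclosed implementation): an R-theta
# radar image stored as a 2-D array with one row per antenna bearing and one
# column per range bin, each cell holding the sampled echo strength.
N_BEARINGS = 360      # one sweep line per degree of antenna rotation (assumed)
N_RANGE_BINS = 1024   # number of A/D samples kept per sweep line (assumed)

def make_radar_image(sweeps):
    """Assemble echo data into an R-theta image.

    `sweeps` is an iterable of (bearing_index, samples) pairs, where
    `samples` is the digitized echo strength for one transmission.
    """
    image = np.zeros((N_BEARINGS, N_RANGE_BINS), dtype=np.float32)
    for bearing_index, samples in sweeps:
        image[bearing_index, : len(samples)] = samples
    return image

# Example: assemble two dummy sweep lines.
sweeps = [(0, np.random.rand(N_RANGE_BINS)), (1, np.random.rand(N_RANGE_BINS))]
img = make_radar_image(sweeps)
```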
Next, the mechanism by which a false image occurs will be explained.
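The figures that illustrate this mechanism are not reproduced here. As a rough, hypothetical illustration of the multiple-reflection case (a detection wave bouncing between the own ship, another ship, and a further reflector, as described later), the ghost echo appears along the bearing of the first reflector at half the total path length, beyond where any real object sits; all distances below are assumed values:

```python
# Rough illustration (hypothetical numbers, not from the disclosure):
# with multiple reflection, the pulse travels own ship -> ship A -> ship B
# -> ship A -> own ship, so the apparent range of the ghost is half the
# total path length, measured along the bearing of ship A.
r_own_to_a = 800.0   # meters, own ship to reflecting ship A (assumed)
r_a_to_b = 300.0     # meters, ship A to ship B (assumed)

total_path = 2 * (r_own_to_a + r_a_to_b)   # out and back
apparent_range = total_path / 2            # what the radar displays
print(apparent_range)  # 1100.0 m: a ghost beyond ship A where nothing exists
```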
False images are also caused by a side lobe, a backwash, a cloud, etc. A side lobe is a leakage emission whose radio wave strength is weaker than that of the main lobe (the direction in which the strength of the radio waves emitted from the antenna is maximum), and it is radiated in a direction offset from the main lobe by, for example, 30 to 45 degrees. A backwash is the wake that forms behind a ship as it sails. Strictly speaking, radar echoes of a backwash or a cloud are not false images, but they are treated here as a type of false image because they are radar echoes that are not subject to avoidance.
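As a toy illustration of the side-lobe case (the offset angle and helper function are hypothetical, not taken from the disclosure), a strong target detected through a side lobe produces ghost echoes at the same range but at bearings offset from the true bearing:

```python
# Toy illustration (hypothetical, not the disclosed method): a strong target
# detected by a side lobe produces a ghost at the same range but at a bearing
# offset from the true target by roughly the side-lobe angle.
def side_lobe_ghost_bearings(true_bearing_deg: float, offset_deg: float = 30.0):
    """Return the two bearings where side-lobe ghosts may appear."""
    return ((true_bearing_deg - offset_deg) % 360.0,
            (true_bearing_deg + offset_deg) % 360.0)

print(side_lobe_ghost_bearings(10.0))  # (340.0, 40.0)
```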
Next, a method for detecting a false image will be explained.
The first training model 31 can be generated, for example, as follows. That is, the first training model 31 may be generated so as to output the position data of a first false image candidate in the case where one radar image displayed in R-θ is input to it. Specifically, for radar images displayed in R-θ, images in which regions definitively confirmed to be false images are annotated, as shown in the drawings, can be used as training data.
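The disclosure does not specify a model architecture; as one minimal sketch, assuming PyTorch and a per-pixel probability map as the form of the position data (both are assumptions), the first model could look like this:

```python
import torch
import torch.nn as nn

# Minimal sketch (architecture assumed, not specified by the disclosure):
# a small convolutional network that takes one R-theta radar image and
# outputs a per-pixel probability of belonging to a false image, serving
# as the position data (with probability) of the first candidate.
class SingleImageFalseEchoNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, kernel_size=1),  # per-pixel logits
        )

    def forward(self, x):  # x: (batch, 1, bearings, range_bins)
        return torch.sigmoid(self.net(x))

model = SingleImageFalseEchoNet()
image = torch.rand(1, 1, 360, 1024)   # one radar image (dummy data)
probability_map = model(image)        # (1, 1, 360, 1024), values in [0, 1]
# Training (not shown) would minimize binary cross-entropy between this
# map and the annotated false-image regions described above.
```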
The second training model 32 can be generated, for example, as follows. That is, the second training model 32 may be generated so as to output the position data of a second false image candidate in the case where time-series radar images displayed in R-θ are input to it. Specifically, for time-series radar images displayed in R-θ, images in which regions definitively confirmed to be false images are annotated, as shown in the drawings, can be used as training data.
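Again as an assumed sketch (the frame count and architecture are not from the disclosure), the second model can consume a short time series by stacking consecutive frames along the channel axis, letting the convolutions react to echoes that move inconsistently between sweeps:

```python
import torch
import torch.nn as nn

# Minimal sketch (assumed design): the second model takes T consecutive
# radar images stacked along the channel axis, so it can pick up echoes
# whose motion between sweeps is implausible (a hallmark of multiple-
# reflection ghosts).
T = 4  # number of consecutive radar images per input (assumed)

class TimeSeriesFalseEchoNet(nn.Module):
    def __init__(self, frames: int = T):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(frames, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, kernel_size=1),
        )

    def forward(self, x):  # x: (batch, T, bearings, range_bins)
        return torch.sigmoid(self.net(x))

model = TimeSeriesFalseEchoNet()
frames = torch.rand(1, T, 360, 1024)   # T consecutive images (dummy data)
probability_map = model(frames)        # probability of the second candidate
```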
As described above, the radar apparatus 100 acquires image data including time-series radar images; inputs one acquired radar image to the first training model 31, which outputs the position data of the first false image candidate in the case where one radar image is input, to acquire the position data of the first false image candidate; inputs the acquired time-series radar images to the second training model 32, which outputs the position data of the second false image candidate in the case where the time-series radar images are input, to acquire the position data of the second false image candidate; and detects a false image of a radar echo based on the acquired position data of the first false image candidate and the second false image candidate, thereby making it possible to recognize a false image on a radar echo.
The AIS 41 is a communication device that can receive AIS data from another ship that exists around the own ship. The AIS data includes data such as identification code, ship name, position, course, ship speed, and destination. The AIS 41 outputs the AIS data to the target object identification unit 40. It should be noted that the disclosure is not limited to AIS, and may use VDES (VHF Data Exchange System).
The image sensor 42 can be configured with a visible light camera or an infrared camera, for example. The image sensor 42 photographs the outside of the ship and generates an image. The image sensor 42 may be installed, for example, on the bridge of the own ship to face toward the bow direction. The image sensor 42 may be configured to pan (swing in the horizontal direction), tilt (swing in the vertical direction), and zoom in response to an operation of the user. The image sensor 42 outputs image data to the target object identification unit 40.
The target object identification unit 40 uses at least one of the AIS data output by the AIS 41 and the image data output by the image sensor 42 to identify a target object which is an object that actually exists (another ship, etc.). The position of the target object can be identified, for example, by the relative position in terms of distance and orientation from the own ship, or the absolute position in terms of latitude and longitude. In the case of identifying a target object based on image data, the target object identification unit 40 may use a training model generated by machine learning. In this case, the training model can use an object detection model such as YOLO (You Only Look Once) or SSD (Single Shot Multibox Detector).
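As an illustrative stand-in for such an object detection model (assuming torchvision's pretrained SSD; the disclosure does not fix a specific library or weights), identifying ship candidates in a camera image could look like this:

```python
import torch
from torchvision.models.detection import ssd300_vgg16, SSD300_VGG16_Weights

# Minimal sketch (a torchvision SSD stands in for the object detection
# model named in the text; this is an assumption, not the disclosed
# implementation). The detector returns boxes, labels, and scores for
# objects visible in a camera image of the area around the ship.
weights = SSD300_VGG16_Weights.DEFAULT
detector = ssd300_vgg16(weights=weights).eval()

frame = torch.rand(3, 300, 300)            # one camera frame (dummy data)
with torch.no_grad():
    detections = detector([frame])[0]      # dict with boxes/labels/scores
confident = detections["boxes"][detections["scores"] > 0.5]
```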
The synthesis processing unit 50 makes a final determination as to whether the false image detected by the false image detection unit 30 is truly a false image or a real image (target object), based on the detection result of the false image detection unit 30 and the identification result of the target object identification unit 40. The determination method performed by the synthesis processing unit 50 will be described below.
As described above, the radar apparatus 100 may acquire the detection result (identification result) at the sensor unit including at least one of the AIS 41 and the image sensor 42, and detect a false image on a radar echo based on the acquired detection result, the probability (accuracy) regarding the position data of the first false image candidate output by the first training model 31, and the probability regarding the position data of the second false image candidate output by the second training model 32, thereby making it possible to further improve the accuracy of detecting a false image.
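One minimal sketch of such a final determination (the rule and threshold are assumptions; the disclosure only states that the two probabilities are combined with the sensor result) is:

```python
# Minimal sketch (thresholds and rule are assumptions): an echo strongly
# flagged by either training model is kept as a false image unless the
# AIS or the camera confirms a real target at that position.
def judge_echo(p_first: float, p_second: float,
               confirmed_by_sensor: bool, threshold: float = 0.5) -> str:
    """Classify one radar echo as 'false image' or 'target object'."""
    flagged = max(p_first, p_second) >= threshold
    if flagged and not confirmed_by_sensor:
        return "false image"
    return "target object"

print(judge_echo(0.9, 0.2, confirmed_by_sensor=False))  # false image
print(judge_echo(0.9, 0.2, confirmed_by_sensor=True))   # target object
```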
The display image generation unit 80 generates a display image to be displayed on the display device 90 based on the radar image generated by the radar image generation unit 20 and the final determination result of the synthesis processing unit 50. The display image includes, for example, an electronic nautical chart.
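As a hypothetical sketch of differentiating the display mode by false image probability, as described later in this embodiment (the color bands below are illustrative, not from the disclosure):

```python
# Minimal sketch (colors and probability bands are hypothetical):
# differentiating the display mode of an echo according to the
# probability that it is a false image.
def false_image_color(probability: float) -> str:
    """Pick a display color for an echo judged to be a false image."""
    if probability >= 0.8:
        return "gray"      # almost certainly false: drawn faint
    if probability >= 0.5:
        return "yellow"    # possibly false: drawn with a caution color
    return "green"         # treated as a normal echo

print(false_image_color(0.9))  # gray
```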
The radar apparatus 100 inputs the time-series radar images to the second training model 32 to acquire the position data of the second false image candidate (S14). The second false image candidate includes a false image caused by at least one of multiple reflections of the detection wave between the own ship and another ship, reflection between another ship and another ship, and reflection between another ship and land. The radar apparatus 100 identifies the target object based on the detection result of the sensor unit (S15).
The radar apparatus 100 detects a false image of a radar echo based on the position data of the first false image candidate, the position data of the second false image candidate, and the identification result of the target object (S16). In step S16, the determination method of the synthesis processing unit 50 described above is used.
All or some of the radar image generation unit 20, the false image detection unit 30, the target object identification unit 40, the synthesis processing unit 50, and the display image generation unit 80 of the radar apparatus 100 can also be implemented using a computer equipped with a CPU (processor), a RAM, etc. Each of the above units of the radar apparatus 100 can be implemented on a computer by loading a computer program (recordable on a recording medium) that defines the processing procedure described above and executing it on the CPU.
The information processing method of this embodiment includes: acquiring image data that includes time-series radar images; inputting one acquired radar image to a first training model that outputs position data of a first false image candidate in response to one radar image being input, to acquire the position data of the first false image candidate; inputting the acquired time-series radar images to a second training model that outputs position data of a second false image candidate in response to time-series radar images being input, to acquire the position data of the second false image candidate; and detecting a false image of a radar echo based on the acquired position data of the first false image candidate and the second false image candidate.
The information processing method of this embodiment further includes: acquiring a detection result at a sensor unit that includes at least one of an AIS and an image sensor; and detecting the false image of the radar echo based on the acquired detection result, a probability regarding the position data of the first false image candidate output by the first training model, and a probability regarding the position data of the second false image candidate output by the second training model.
In the information processing method of this embodiment, the first false image candidate includes a false image caused by at least one of a side lobe, a backwash, and a cloud.
In the information processing method of this embodiment, the second false image candidate includes a false image caused by at least one of multiple reflections of a detection wave between an own ship and another ship, reflection between another ship and another ship, and reflection between another ship and land.
The information processing method of this embodiment further includes differentiating a display mode according to whether the radar echo is a false image or not.
The information processing method of this embodiment further includes differentiating a display mode of the false image based on a false image probability in response to the radar echo being a false image.
The radar apparatus of this embodiment includes: a data acquisition unit which acquires image data that includes time-series radar images; a first acquisition unit which inputs one acquired radar image to a first training model that outputs position data of a first false image candidate in response to one radar image being input, to acquire the position data of the first false image candidate; a second acquisition unit which inputs the acquired time-series radar images to a second training model that outputs position data of a second false image candidate in response to time-series radar images being input, to acquire the position data of the second false image candidate; and a detection unit which detects a false image of a radar echo based on the acquired position data of the first false image candidate and the second false image candidate.
The computer program of this embodiment is configured to enable a computer to perform processes of: acquiring image data that includes time-series radar images; inputting one acquired radar image to a first training model that outputs position data of a first false image candidate in response to one radar image being input, to acquire the position data of the first false image candidate; inputting the acquired time-series radar images to a second training model that outputs position data of a second false image candidate in response to time-series radar images being input, to acquire the position data of the second false image candidate; and detecting a false image of a radar echo based on the acquired position data of the first false image candidate and the second false image candidate.
It is to be understood that not necessarily all objects or advantages may be achieved in accordance with any particular embodiment described herein. Thus, for example, those skilled in the art will recognize that certain embodiments may be configured to operate in a manner that achieves or optimizes one advantage or group of advantages as taught herein without necessarily achieving other objects or advantages as may be taught or suggested herein.
All of the processes described herein may be embodied in, and fully automated via, software code modules executed by a computing system that includes one or more computers or processors. The code modules may be stored in any type of non-transitory computer-readable medium or other computer storage device. Some or all the methods may be embodied in specialized computer hardware.
Many other variations than those described herein will be apparent from this disclosure. For example, depending on the embodiment, certain acts, events, or functions of any of the algorithms described herein can be performed in a different sequence, can be added, merged, or left out altogether (e.g., not all described acts or events are necessary for the practice of the algorithms). Moreover, in certain embodiments, acts or events can be performed concurrently, e.g., through multi-threaded processing, interrupt processing, or multiple processors or processor cores or on other parallel architectures, rather than sequentially. In addition, different tasks or processes can be performed by different machines and/or computing systems that can function together.
The various illustrative logical blocks and modules described in connection with the
embodiments disclosed herein can be implemented or performed by a machine, such as a processor. A processor can be a microprocessor, but in the alternative, the processor can be a controller, microcontroller, or state machine, combinations of the same, or the like. A processor can include electrical circuitry configured to process computer-executable instructions. In another embodiment, a processor includes an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable device that performs logic operations without processing computer-executable instructions. A processor can also be implemented as a combination of computing devices, e.g., a combination of a digital signal processor (DSP) and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Although described herein primarily with respect to digital technology, a processor may also include primarily analog components. For example, some or all of the signal processing algorithms described herein may be implemented in analog circuitry or mixed analog and digital circuitry. A computing environment can include any type of computer system, including, but not limited to, a computer system based on a microprocessor, a mainframe computer, a digital signal processor, a portable computing device, a device controller, or a computational engine within an appliance, to name a few.
Conditional language such as, among others, “can,” “could,” “might” or “may,” unless specifically stated otherwise, is otherwise understood within the context as used in general to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.
Disjunctive language such as the phrase “at least one of X, Y, or Z,” unless specifically
stated otherwise, is otherwise understood within the context as used in general to present that an item, term, etc., may be either X, Y, or Z, or any combination thereof (e.g., X, Y, and/or Z). Thus, such disjunctive language is not generally intended to, and should not, imply that certain embodiments require at least one of X, at least one of Y, or at least one of Z to each be present. Any process descriptions, elements or blocks in the flow diagrams described herein
and/or depicted in the attached figures should be understood as potentially representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or elements in the process. Alternate implementations are included within the scope of the embodiments described herein in which elements or functions may be deleted, executed out of order from that shown, or discussed, including substantially concurrently or in reverse order, depending on the functionality involved as would be understood by those skilled in the art.
Unless otherwise explicitly stated, articles such as “a” or “an” should generally be interpreted to include one or more described items. Accordingly, phrases such as “a device configured to” are intended to include one or more recited devices. Such one or more recited devices can also be collectively configured to carry out the stated recitations. For example, “a processor configured to carry out recitations A, B and C” can include a first processor configured to carry out recitation A working in conjunction with a second processor configured to carry out recitations B and C. The same holds true for the use of definite articles used to introduce embodiment recitations. In addition, even if a specific number of an introduced embodiment recitation is explicitly recited, those skilled in the art will recognize that such recitation should typically be interpreted to mean at least the recited number (e.g., the bare recitation of “two recitations,” without other modifiers, typically means at least two recitations, or two or more recitations).
It will be understood by those within the art that, in general, terms used herein, are generally intended as “open” terms (e.g., the term “including” should be interpreted as “including but not limited to,” the term “having” should be interpreted as “having at least,” the term “includes” should be interpreted as “includes but is not limited to,” etc.).
For expository purposes, the term “horizontal” as used herein is defined as a plane parallel to the plane or surface of the floor of the area in which the system being described is used or the method being described is performed, regardless of its orientation. The term “floor” can be interchanged with the term “ground” or “water surface”. The term “vertical” refers to a direction perpendicular to the horizontal as just defined. Terms such as “above,” “below,” “bottom,” “top,” “side,” “higher,” “lower,” “upper,” “over,” and “under,” are defined with respect to the horizontal plane.
As used herein, the terms “attached,” “connected,” “mated,” and other such relational terms should be construed, unless otherwise noted, to include removable, moveable, fixed, adjustable, and/or releasable connections or attachments. The connections/attachments can include direct connections and/or connections having intermediate structure between the two components discussed.
Numbers preceded by a term such as “approximately”, “about”, and “substantially” as used herein include the recited numbers, and also represent an amount close to the stated amount that still performs a desired function or achieves a desired result. For example, the terms “approximately”, “about”, and “substantially” may refer to an amount that is within less than 10% of the stated amount. Features of embodiments disclosed herein preceded by a term such as “approximately”, “about”, and “substantially” as used herein represent the feature with some variability that still performs a desired function or achieves a desired result for that feature.
It should be emphasized that many variations and modifications may be made to the above-described embodiments, the elements of which are to be understood as being among other acceptable examples. All such modifications and variations are intended to be included herein within the scope of this disclosure and protected by the following claims.
The present application is a continuation of PCT/JP2022/015561, filed on Mar. 29, 2022, and is related to and claims priority from Japanese patent application no. 2021-196344, filed on Dec. 2, 2021. The entire contents of the aforementioned application are hereby incorporated by reference herein.