The present disclosure relates to a collision avoidance assistance system for ships, particularly a system that assists collision avoidance using data sensed by sensors (hereinafter referred to as "sensor data").
Radar devices are known as systems for collision avoidance assistance for ships. A radar device assists safe ship navigation by displaying a bird's eye view in which land, a building, another ship, or the like is viewed from high above, thereby supplementing visual information.
However, obstacles such as seaweed, driftwood, and other floating objects cannot be detected, identified, and evaluated using only the information obtained by radar devices, and thus that information alone is insufficient for safe ship navigation.
Patent Literature 1 discloses a system for assisting safe ship navigation by utilizing various sensors other than the radar devices.
Patent Literature 1 discloses a system that: generates a scenery image viewed from a predetermined position in a predetermined direction based on (i) information outputted by a global positioning system (GPS), a gyroscope device, an attitude sensor device, a radar device, a television (TV) camera device, a night vision device, an echo sounding device, a sonar, a log device, and a steering device, (ii) a database regarding nautical chart information, and (iii) a database regarding information on ports and harbors; and displays the scenery image on an image displaying means. This enables generation and display of a scenery image viewed from any position in any direction, such as a scenery image viewed from the cockpit of a ship in the direction of forward movement of the ship, or a scenery image viewed from a rear portion of the deck of the ship in the backward direction.
Additionally, Patent Literature 1 discloses that the system (i) performs a collision avoidance calculation for predicting a collision risk, a stranding avoidance calculation for predicting a risk that the ship is stranded, a dangerous sea area calculation for calculating a dangerous sea area, and a recommended course calculation for calculating a recommended ship course, and (ii) displays the calculated recommended ship course on the image displaying means, thereby securing the safety of the ship's route even when the information a ship pilot can obtain by visual observation is insufficient.
However, since the system of Patent Literature 1 does not detect, identify, or evaluate an obstacle, the ship pilot must identify the obstacle on the basis of the scenery image displayed as image information.
Patent Literature 2 discloses a ship motion control system including sensors for collecting sea condition data. In the system of Patent Literature 2, analyzer software receives the sea condition data and predicts occurrence of a sea condition event. Calculator software calculates an operation command in preparation for the occurrence of the sea condition event. Interface software transmits the operation command to the ship control system.
However, the sea condition data collected by the system of Patent Literature 2 mainly concern wave conditions, and the system of Patent Literature 2 does not detect or identify obstacles such as seaweed.
Patent Literature 1: Unexamined Japanese Patent Application Kokai Publication No. H06-301897
An objective of the present disclosure is to provide a system for assisting avoidance of collision between a ship and an obstacle by collecting various types of sensor data on a ship travel route using sensors, learning from the collected sensor data, and detecting, identifying, and evaluating an obstacle area as a result of the learning.
In a first aspect of the present disclosure, a collision avoidance assistance system is provided for assisting avoidance of collision between a ship and an obstacle using sensors selected from among a video camera, a color meter, a spectrometer, an infrared sensor, a temperature sensor, a radar, a device for LIDAR, a sonar, an air-speed meter, a ship-speed meter, and a global navigation satellite system (GNSS) receiver. The system is characterized by including (a) three dimensional viewing field acquisition means for acquiring a three dimensional viewing field on a ship travel route by integrating sensor data obtained via sensors selected from among the video camera, the infrared sensor, and the device for LIDAR, and (b) learning means for (i) collecting the sensor data obtained via the sensors in the three dimensional viewing field in which the obstacle is located, and (ii) outputting an evaluation of the obstacle based on the collected sensor data by using a deep learning method in which a pair of the collected sensor data and the evaluation of the obstacle is used as learning data. The evaluation of the obstacle includes identification information about a floating object on the sea, and the system provides, as information for assisting avoidance of collision with the obstacle, an evaluation of a current obstacle that is outputted by the learning means based on the sensor data on a current three dimensional viewing field obtained via the sensors.
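As a purely illustrative sketch of the learning means described above (not part of the original disclosure), the following Python code trains a small neural network on pairs of integrated sensor features and obstacle evaluation labels. The class and label names, the feature layout, and the use of a simple multilayer perceptron are assumptions made for illustration only.

```python
# Minimal, hypothetical sketch of the "learning means": a small classifier
# trained on pairs of (integrated sensor features, obstacle evaluation label).
# All names and the feature layout are illustrative assumptions.
import torch
import torch.nn as nn

# Example obstacle evaluation labels (identification of a floating object).
LABELS = ["seaweed", "driftwood", "other_floating_object", "no_obstacle"]

class ObstacleClassifier(nn.Module):
    def __init__(self, n_features: int = 16, n_classes: int = len(LABELS)):
        super().__init__()
        # A simple multilayer perceptron standing in for the deep learning method.
        self.net = nn.Sequential(
            nn.Linear(n_features, 64),
            nn.ReLU(),
            nn.Linear(64, 32),
            nn.ReLU(),
            nn.Linear(32, n_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

def train(model, samples, labels, epochs: int = 10):
    """Learn from pairs of collected sensor data and obstacle evaluations."""
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        optimizer.zero_grad()
        loss = loss_fn(model(samples), labels)
        loss.backward()
        optimizer.step()
    return model

# Usage with random placeholder data (real input would be integrated sensor data).
model = train(ObstacleClassifier(),
              torch.randn(128, 16),
              torch.randint(0, len(LABELS), (128,)))
```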
The term "LIDAR" means "Light Detection and Ranging", and the device for LIDAR is a sensor that detects light and measures distance.
The sensors may be fixed to poles provided on the ship.
Alternatively, the sensors may be attached to a drone in order to obtain an observation altitude and an observation distance, and the drone may be connected to the ship by a wired or wireless connection.
The evaluation of the obstacle may further include identification information about an object on or in the sea, the object being another ship different from the ship, a buoy, or a hidden rock.
The evaluation may be displayed on at least one selected from among (i) a screen installed on the ship in order to warn a ship pilot, (ii) a head-up display installed on the ship, (iii) a terminal device for the ship pilot, (iv) a head mounted display for the ship pilot, and (v) a visor for the ship pilot.
The evaluation may be transmitted to an autopilot system installed in the ship and used for avoiding potential collisions.
The evaluation may be transmitted by a device for long distance communication and may be shared with a database on a network or a system of another ship sailing near the ship.
The learning data may include (i) a pair of sensor data collected in another collision avoidance assistance system different from the present system and an evaluation of the obstacle, or (ii) a pair of sensor data generated as test data and an evaluation of the obstacle.
The learning means may collect sensor data in a plurality of collision avoidance assistance systems connected to one another via communication and may use, as learning data, a pair of the collected sensor data and an evaluation of the obstacle.
A concrete example of the present disclosure is described below in an embodiment with reference to the drawings.
Operations of the functional components of the embodiment are achieved by (i) a processor of a system circuit executing a pre-installed control program such as firmware and (ii) the processor cooperating with the various devices that are components of the system. The program is stored in a computer-readable recording medium, is read from the recording medium by the processor, and is executed in response to an operation by a user such as a ship pilot or to a signal received from a device included in the system.
Overall View Including a Network
A video camera, a color meter, a spectrometer, and an infrared sensor are mounted on upper portions of the right pole 2002 and the left pole 2003. A radar, a device for LIDAR, an air-speed meter, a thermometer, a ship-speed meter, and a GNSS receiver are mounted on the middle pole 2004.
In the present embodiment, the thermometer is an infrared thermometer enabling measurement of a temperature of an area of an obstacle without contact.
The collision avoidance assistance system 4000 is connected to a sensor group 4001 installed in the ship and a ship handling system 4002 of the ship. Additionally, the collision avoidance assistance system is connected, via a communication network 4003, to (i) a database 4004 about information on ship handling and (ii) a system 4005 of another ship.
Functional components of the collision avoidance assistance system 4000 are a three dimensional viewing field acquirer 4006, a collector 4007, a learning unit 4008, an assisting unit 4009, and a communicator 4010.
The sensor group 4001 represents all of the sensors attached to the ship 1001 and the drone 1008, and a data set obtained by the sensor group is transmitted to the collision avoidance assistance system 4000.
The three dimensional viewing field acquirer 4006 forms data on a three dimensional region in the direction of the navigation route of the ship from (i) image data obtained by the video camera and the infrared sensor that are disposed on the right and left poles and included in the sensor group 4001 and (ii) data on objects in the sea obtained by the sonar, and generates a three dimensional viewing field including a waterway area and an obstacle area other than the waterway area.
In this case, the three dimensional region is a region that includes the two straight lines 1007 and extends in depth, in the Z-direction denoting the direction in which the ship sails, from the four points 5002, 5003, 5004, and 5005 on the XY-plane.
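A minimal geometric sketch of such a viewing field is given below (not part of the original disclosure). It assumes a rectangular corridor bounded on the XY-plane and extending in the Z-direction; the class name, field names, and axis convention are illustrative assumptions.

```python
# Hypothetical sketch of a three dimensional viewing field: a rectangular
# corridor on the XY-plane, bounded by four corner points and extending in
# the Z-direction (the sailing direction).
from dataclasses import dataclass

@dataclass
class ViewingField:
    x_min: float  # port-side boundary on the XY-plane
    x_max: float  # starboard-side boundary
    y_min: float  # lower boundary (e.g. below the waterline, from sonar)
    y_max: float  # upper boundary (e.g. observation altitude)
    z_max: float  # how far ahead of the ship the field extends

    def contains(self, x: float, y: float, z: float) -> bool:
        """True if a detected point lies inside the viewing field corridor."""
        return (self.x_min <= x <= self.x_max
                and self.y_min <= y <= self.y_max
                and 0.0 <= z <= self.z_max)

# Example: a field 30 m wide, from 5 m depth to 20 m altitude, 500 m ahead.
field = ViewingField(-15.0, 15.0, -5.0, 20.0, 500.0)
print(field.contains(2.0, 0.5, 120.0))   # a floating object ahead -> True
print(field.contains(40.0, 0.5, 120.0))  # far off to the side -> False
```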
The collector 4007 collects a pair of (i) a data set regarding the three dimensional viewing field in which an obstacle such as driftwood, seaweed, another ship, a buoy, a hidden rock, or a shoal is located and (ii) the user's operation of the ship handling system 4002 in response to that viewing field, or an evaluation of the viewing field inputted by the user.
In addition to data detected by the sensor group, such as shape, color, temperature, and speed, a data set of the present disclosure may include (i) data on the type and position of a structural object located on the sea as indicated by nautical chart data and/or (ii) data on the type and position of a floating object or the like indicated in sea warning information.
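One possible shape of a record collected by the collector 4007 is sketched below (not part of the original disclosure). It pairs a sensor data set with the corresponding evaluation or ship handling operation; all field names and example values are assumptions for illustration only.

```python
# Illustrative sketch of one collected record: a pair of a sensor data set for
# a viewing field and the corresponding user evaluation or handling operation.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class SensorDataSet:
    shape: str                  # e.g. "elongated", "irregular"
    color_rgb: tuple            # from the video camera / color meter
    temperature_c: float        # from the infrared thermometer
    speed_knots: float          # relative speed of the detected object
    chart_structures: list = field(default_factory=list)  # from nautical chart data
    sea_warnings: list = field(default_factory=list)       # from sea warning information

@dataclass
class CollectedPair:
    data_set: SensorDataSet
    evaluation: Optional[str] = None          # e.g. "driftwood", inputted by the user
    handling_operation: Optional[str] = None  # e.g. "turn_to_port", from the ship handling system

record = CollectedPair(
    data_set=SensorDataSet("elongated", (90, 70, 40), 18.5, 0.2),
    evaluation="driftwood",
    handling_operation="turn_to_port",
)
```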
During assistance of collision avoidance, the data set regarding the current three dimensional viewing field is inputted into the learning unit 4008 in order to generate assistance information, and the learning unit 4008 outputs an evaluation of an obstacle located in the current three dimensional viewing field. The evaluation is outputted to the assisting unit 4009 and is transmitted to the ship handling system as assistance information displayed on the screen disposed in the ship. Although the current evaluation outputted by the learning unit is transmitted as assistance information to be displayed in the present embodiment, the assistance information of the present disclosure is not limited to this, and the evaluation outputted by the learning unit may instead be transmitted unchanged to an autopilot system incorporated into the ship handling system.
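A hypothetical sketch of this assistance flow follows (not part of the original disclosure), reusing the illustrative `model` and `LABELS` from the earlier training sketch. The current data set is evaluated by the learning unit and the result is either displayed for the ship pilot or passed unchanged toward an autopilot; the helper functions are placeholders, not a real ship handling interface.

```python
# Hypothetical sketch of the assistance flow: evaluate the current viewing
# field and route the evaluation to the screen or to an autopilot placeholder.
import torch

def display_on_screen(message: str) -> None:
    print(f"[SCREEN] {message}")               # placeholder for the on-board screen

def send_to_autopilot(evaluation: str) -> None:
    print(f"[AUTOPILOT] avoid: {evaluation}")  # placeholder for the ship handling system

def assist(model, current_features: torch.Tensor, labels, use_autopilot: bool = False) -> str:
    """Evaluate the current viewing field and route the result as assistance information."""
    model.eval()
    with torch.no_grad():
        logits = model(current_features.unsqueeze(0))
    evaluation = labels[int(logits.argmax(dim=1))]
    if use_autopilot:
        send_to_autopilot(evaluation)
    else:
        display_on_screen(f"Obstacle ahead: {evaluation}")
    return evaluation

# Usage with placeholder features for the current three dimensional viewing field.
assist(model, torch.randn(16), LABELS)
```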
In the present disclosure, as a manner of displaying the assistance information on a screen, the three dimensional viewing field may be rendered on a head-up display disposed on a windshield of the cockpit of the ship, and the assistance information may be displayed on that head-up display.
The system of the present embodiment also learns from (i) a pair of the data set collected during the above-described assistance operation and the output on which the assistance information is based, and (ii) the ship handling operation actually selected via the ship handling system. However, in order to efficiently improve the functions of the learning unit 4008, the communicator 4010 may receive a pair of (i) a data set collected by the system 4005 of the other ship and (ii) an output produced in the system 4005, and the received pair of the data set and the output may be used as learning data for the learning unit 4008. Additionally, the system may use, as learning data, a pair of (i) a data set of test data formed by simulating a three dimensional viewing field in which an obstacle is located and (ii) an output expressing an evaluation of the situation indicated by that data set.
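The following short sketch (not part of the original disclosure) illustrates one way such learning data from several sources might be combined before training; the function and argument names are assumptions made for illustration.

```python
# Illustrative sketch of assembling learning data: pairs collected by this
# system, pairs received from the system of another ship via the communicator,
# and simulated test pairs are concatenated into one training set.
def build_training_set(own_pairs, other_ship_pairs=None, simulated_pairs=None):
    """Concatenate (data set, evaluation) pairs from all available sources."""
    training = list(own_pairs)
    if other_ship_pairs:
        training.extend(other_ship_pairs)   # received via long distance communication
    if simulated_pairs:
        training.extend(simulated_pairs)    # test data simulating an obstacle scene
    return training

# Usage: the combined pairs would then be converted to tensors and passed to train().
combined = build_training_set(
    own_pairs=[("sensor data A", "driftwood")],
    other_ship_pairs=[("sensor data B", "seaweed")],
    simulated_pairs=[("simulated data C", "buoy")],
)
print(len(combined))  # 3
```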
The above-described system can thus provide assistance for collision avoidance in a situation in which an obstacle is located on a ship travel route.
This application claims the benefit of Japanese Patent Application No. 2017-155291, the entire disclosure of which is incorporated by reference herein.
The present disclosure is applicable to the ship industry, the ship-handling industry, and industries such as agencies that provide ships with information.
Number | Date | Country | Kind
---|---|---|---
2017-155291 | Aug 2017 | JP | national

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2018/025288 | 7/4/2018 | WO | 00