The present invention relates to an information processing apparatus, a control method of an information processing apparatus, a storage medium, and the like.
In recent years, in line with technological innovations such as autonomous driving mobility and spatial awareness systems around the world, the competition to develop a holistic architecture that connects data and systems among different organizations and members of society (hereinafter, “digital architecture”) is intensifying. In Japan as well, the development of digital architecture has become an urgent task.
By utilizing digital architecture, an autonomous driving mobility and spatial awareness system is able to acquire more information, and address greater challenges in conjunction with external devices and systems other than itself. In order to accomplish this, there is a need for a technology for linking real-world spaces and digital information.
As a technology that connects real-world spaces with digital information, there is a conventional temporal-spatial data management system. In Japanese Patent Application Laid-Open Publication No. 2014-2519, a system is disclosed in which a single processor divides a temporal-spatial area in time and space according to temporal-spatial management data provided by a user to generate a plurality of temporal-spatial divided areas. In addition, Japanese Patent Application Laid-Open Publication No. 2014-2519 discloses that the system assigns an identifier expressed as a one-dimensional integer value, for uniquely identifying each of the plurality of temporal-spatial divided areas, taking into account the temporal and spatial proximity of the temporal-spatial divided areas. Furthermore, Japanese Patent Application Laid-Open Publication No. 2014-2519 discloses that the system determines the arrangement of temporal data so that data of temporal-spatial divided areas with similar identifiers is stored in close proximity to each other on a storage apparatus.
However, Japanese Patent Application Laid-Open Publication No. 2014-2519 makes no reference to a rule for generating a temporal-spatial divided area, and the data related to a generated area can be identified only by an identifier within the processor that generated that temporal-spatial divided area.
In addition, in Japanese Patent Application Laid-Open Publication No. 2014-2519, no reference is made with respect to a specific usage method for different users to use information of a temporal-spatial divided area.
Thus, in order for members of different organizations or of society (hereinafter, "users") to use such data by sharing it, each user must not only understand the structure of the data in advance but also reconstruct their existing system to handle that data structure, which may require a large amount of work.
The present invention provides an information processing apparatus that uses a temporal-spatial format that can share position information and spatial information that is compatible with various devices.
An information processing apparatus according to one aspect of the present invention includes a determination unit that determines that a moving body has approached a collision hazard area, which is an area in which there is a possibility of collision with a collision factor; a list acquisition unit that, in a case in which it has been determined that the moving body has approached the collision hazard area, acquires a unique identifier list in which information of the collision factor is stored in association with a unique identifier linked to a space in which the collision factor exists; and a collision avoidance control unit that controls the moving body so as to avoid a collision with the collision factor, according to the level of collision risk, based on the information of the collision factor stored in the unique identifier list.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Hereinafter, embodiments of the present invention will be described with reference to the drawings. Note that the present invention is not limited by the following embodiments. In each diagram, identical reference signs are applied to identical members or elements, and duplicate description is omitted or simplified. It should be noted that although the embodiments explain an example in which control of an autonomous moving body is applied, the moving body may include at least one operation unit operable by a user with respect to the movement of the moving body. That is, for example, the configuration may be one in which various displays relating to the traveling route and the like are provided to the user, and the user performs part of the driving operation of the moving body with reference to such displays.
In the present embodiment, each of the apparatuses shown in
The system control apparatus 10, the user interface 11, the autonomous moving body 12, the route determination apparatus 13, the conversion information holding apparatus 14, and the sensor node 15 each include an information processing apparatus comprising a CPU as a computer and a ROM, a RAM, a HDD, and the like as storage media. The functions and internal configuration of each apparatus will be explained in detail below.
Next, service application software provided by the autonomous moving body control system will be explained. First, a screen image displayed on the user interface 11 when a user enters position information will be explained by using
It should be noted that in the present explanation, for convenience, a map display is explained in a two-dimensional plane. However, in the present embodiment, the user can specify a three-dimensional position, including “height,” and can also input “height” information. That is, according to the present embodiment, a three-dimensional map can be generated.
First displayed on the web page is an input screen 40 of a departure location, a transit location, and an arrival location for setting a departure location, a transit location, and an arrival location when moving the autonomous moving body 12. The input screen 40 has a list display button 48 for displaying a list of autonomous moving bodies (mobilities) to be used, and when the user presses the list display button 48, a list display screen 47 of mobilities is displayed, as shown in
When the user selects any mobility of M1 to M3 by a click operation or the like, the user is automatically returned to the input screen 40 of
Each time the add transit location button 44 is pressed, an input field 46 is additionally displayed, such as “Transit location 3” or “Transit location 4”, and it is possible to input a plurality of points of additional transit locations. In addition, the user inputs a location set as the arrival location in the input field 43 of “Arrival location”. Although not shown in the figures, when input fields 41 to 43, 46, or the like are clicked, a keyboard or the like for entering characters is temporarily displayed, thereby enabling the desired characters to be input.
Then, it is possible for the user to set the movement route of the autonomous moving body 12 by pressing a decision button 45. In the example of
Furthermore, it is possible for the user to display the latest status by updating the screen display information by pressing a refresh button 57. In addition, it is possible for the user to change the departure location, the transit location, and the arrival location by pressing a transit/arrival location change button 54. That is, it is possible to change a location by input of a desired location to be reset in the input field 51 of the “departure location”, a desired location to be reset in an input field 52 of the “transit location 1”, or a desired location to be reset in an input field 53 of the “arrival location”.
In this manner, the user can easily set a movement route of the autonomous moving body 12 from a predetermined location to a predetermined location by the operation of the user interface 11. It should be noted that such a route setting application can also be applied, for example, to a cab dispatch service or a drone delivery service and the like.
Next, configuration examples and function examples of each of the apparatuses 10 to 15 will be explained in detail by using
The display screen of the user interface 11 shown in
The control unit 11-2 incorporates a CPU as a computer, manages various applications in the user interface 11, performs mode management such as information input and information confirmation, and controls communication processing. Furthermore, the control unit 11-2 controls processing in each unit within the user interface 11.
The information storage unit (memory/HD) 11-4 is a database that holds necessary information, such as a computer program for execution by a CPU and the like. The network connection unit 11-5 controls communication performed via the Internet, a LAN, a wireless LAN, and the like. It should be noted that the user interface 11 may be, for example, a device such as a smartphone, or it may be in a form such as a tablet terminal.
Thus, the user interface 11 of the present embodiment displays an input screen 40 showing the above-described departure location, the transit location, and the arrival location on the browser screen of the system control apparatus 10, and it is possible for the user to input location information such as a departure point, a transit point, and an arrival point. In addition, by displaying the confirmation screen 50 and the map display screen 60 on the browser screen, it is possible to display the current position of the autonomous moving body 12.
In
The map information is three-dimensional map information that includes information such as topography and latitude/longitude/altitude, and also includes regulatory information relating to road traffic laws such as roadways, sidewalks, direction of travel, traffic regulations, and the like. Furthermore, for example, the map information includes time-dependent regulatory information, such as in a case in which one-way traffic is determined by a time period or when a certain road becomes pedestrian-only during specific hours, and incorporates time information thereof. The control unit 13-2 incorporates a CPU as a computer, and controls processing in each unit of the route determination apparatus 13.
In addition to managing the position information of the autonomous moving body acquired via the network connection unit 13-5, the position/route information management unit 13-3 transmits the position information to the map information management unit 13-1, and manages the route information acquired from the map information management unit 13-1 as the search result. The control unit 13-2 converts the route information managed by the position/route information management unit 13-3 into a predetermined data format and transmits the converted route information to an external system in accordance with a request of the external system.
In this manner, in the present embodiment, the route determination apparatus 13 is configured to search for a route in compliance with road traffic laws and the like based on the specified position information, and to output route information in a predetermined data format.
In
The control unit 14-3 converts the position information into a unique identifier that has been specified by the format, based on the position information acquired from the position/route information management unit 14-1 and the information of the format that is managed in the format database 14-4. Then, the control unit 14-3 sends the converted position information to the unique identifier management unit 14-2. The format will be explained in detail below. However, the format assigns an identifier (hereinafter, a “unique identifier”) to a space based on a predetermined position, with the starting point as the origin, and manages the space by the unique identifier. In the present embodiment, it is possible to acquire a corresponding unique identifier and information in the space based on the predetermined position information.
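As an illustrative sketch only (the concrete format, origin, and cell sizes below are hypothetical and are not part of the disclosed format), a conversion from position information to a grid-based unique identifier of the kind described above could be expressed as follows:

```python
# Illustrative sketch: assign a unique identifier to the grid cell, anchored
# at a fixed origin, that contains a given lat/lon/height position.
# All constants are hypothetical values chosen for illustration.

ORIGIN_LAT, ORIGIN_LON, ORIGIN_ALT = 35.0, 139.0, 0.0  # hypothetical origin
CELL_DEG, CELL_M = 0.0001, 10.0  # hypothetical cell sizes (degrees, meters)

def to_unique_identifier(lat, lon, alt):
    """Map a lat/lon/height position to the identifier of its grid cell."""
    ix = int((lat - ORIGIN_LAT) / CELL_DEG)
    iy = int((lon - ORIGIN_LON) / CELL_DEG)
    iz = int((alt - ORIGIN_ALT) / CELL_M)
    # Encode the three cell indices as one identifier string.
    return f"{ix:06d}-{iy:06d}-{iz:03d}"
```

The same identifier is obtained for every position inside a given cell, which is what allows information about the space to be shared by identifier alone.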
The unique identifier management unit 14-2 manages the unique identifier converted by the control unit 14-3 and transmits the converted unique identifier through the network connection unit 14-6. The format database 14-4 manages the information of the format, and transmits the information of the format to the control unit 14-3 according to a request of the control unit 14-3.
In addition, information in the space acquired via the network connection unit 14-6 is managed by using the format. The conversion information holding apparatus 14 manages information related to the space acquired from an external device, apparatus, or network in association with a unique identifier. In addition, the unique identifier and the information related to the space associated therewith are provided to external devices, apparatuses, or networks.
In this manner, the conversion information holding apparatus 14 acquires the unique identifier and information in the space based on the predetermined position information, and manages and provides the information in a state in which the acquired information can be shared by the external device, apparatus, or network connected thereto. In addition, the conversion information holding apparatus 14 converts the position information that has been specified in the system control apparatus 10 into a unique identifier, and provides the converted information to the system control apparatus 10.
In
In addition, the position/route information management unit 10-3 is also capable of dividing the route information at a predetermined interval, and of generating position information such as the latitude/longitude of the divided locations. The unique identifier management unit 10-1 manages information converted from the position information and the route information to the unique identifier. The control unit 10-2 incorporates a CPU as a computer, controls the position, the route information, and a communication function of the unique identifier of the system control apparatus 10, and controls processing in each unit of the system control apparatus 10.
In addition, the control unit 10-2 provides a web page to the user interface 11, and transmits predetermined position information acquired from the web page to the route determination apparatus 13. Furthermore, the control unit 10-2 acquires predetermined route information from the route determination apparatus 13 and transmits each of the position information of the route information to the conversion information holding apparatus 14. Then, the control unit 10-2 transmits the route information that has been converted to the unique identifier acquired from the conversion information holding apparatus 14 to the autonomous moving body 12.
In this manner, the system control apparatus 10 is configured so as to be able to perform acquisition of predetermined position information specified by the user, transmission and reception of position and route information, generation of position information, and transmission and reception of route information that uses a unique identifier.
In addition, the system control apparatus 10 collects the route information necessary to perform autonomous movement of the autonomous moving body 12 based on the position information that has been input to the user interface 11, and provides route information to the autonomous moving body 12 by using a unique identifier. It should be noted that in the present embodiment, the system control apparatus 10, the route determination apparatus 13, and the conversion information holding apparatus 14 each function, for example, as a server.
In
In addition, the detection unit 12-1 includes a self-position detection function such as Global Positioning System (GPS), and, for example, a direction detection function such as a geomagnetic sensor. Furthermore, it is possible for the control unit 12-2 to generate a three-dimensional map of the cyberspace based on the acquired detection information, self-position estimation information, and direction detection information.
Here, a three-dimensional map of a cyberspace is a map in which spatial information equivalent to real-world feature position information can be expressed as digital data. Within this three-dimensional map of the cyberspace, the autonomous moving body 12 that exists in the real world and feature information on the surroundings thereof are held as spatially equivalent digital data. Accordingly, by using the digital data, efficient movement is possible.
Using
In
In addition, the position of a column 99 is specified as the position of a vertex 99-1 based on position information that has been measured in advance. Furthermore, the distance from α0 of the autonomous moving body 12 to the vertex 99-1 may be acquired by the distance measurement function of the autonomous moving body 12. In
In a three-dimensional map of the cyberspace, information acquired in this manner is managed as digital data, and can be reconstructed as spatial information such as in
Specifically, the position P1 of α0 in this space can be calculated from the latitude and longitude of α0 and the latitude and longitude of P0. Similarly, the position of the column 99 can be calculated as P2. In this example, the two entities of the autonomous moving body 12 and the column 99 are represented in a three-dimensional map of the cyberspace. However, it is of course possible to handle many more entities in a similar manner. In this manner, a three-dimensional map maps the self-position and objects of the real world onto a three-dimensional space.
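The calculation of positions such as P1 and P2 relative to the reference point P0 can be sketched as follows, under an assumed equirectangular approximation that is adequate only over short distances (the function and constant names are illustrative, not part of the disclosure):

```python
import math

# Sketch: place real-world lat/lon positions into a local map frame anchored
# at a reference point (such as P0). Equirectangular approximation; the error
# grows with distance from the reference point.

EARTH_R = 6_378_137.0  # WGS84 equatorial radius in meters

def to_local_xy(lat, lon, ref_lat, ref_lon):
    """Convert lat/lon (degrees) to x/y meters relative to a reference point."""
    x = math.radians(lon - ref_lon) * EARTH_R * math.cos(math.radians(ref_lat))
    y = math.radians(lat - ref_lat) * EARTH_R
    return x, y
```

A point 0.001 degrees of latitude north of the reference maps to roughly 111 meters in y, which matches the usual scale of one degree of latitude.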
Returning to
A direction control unit 12-3 changes the movement direction of the autonomous moving body 12 by changing the driving direction of the moving body by a driving unit 12-6. The driving unit 12-6 consists of a driving apparatus, such as a motor, that generates the propulsive force of the autonomous moving body 12. It is possible for the autonomous moving body 12 to reflect the self-position information, detection information, and object detection information in the three-dimensional map, to generate a route that maintains a certain distance from surrounding terrain, buildings, obstacles, and objects, and to perform autonomous driving.
It should be noted that the route determination apparatus 13 performs route generation mainly in consideration of regulatory information related to road traffic laws. In contrast, the autonomous moving body 12 more accurately detects the positions of surrounding obstacles along the route determined by the route determination apparatus 13, and performs route generation so as to move without coming into contact with the obstacles, based on the size of the autonomous moving body 12. In addition, it is also possible to store the mobility format of the autonomous moving body itself in the information storage unit (memory/HD) 12-4 of the autonomous moving body 12. This mobility format is a legally recognized type of moving body, and refers to, for example, a type such as car, bicycle, drone, and the like. Based on this mobility format, it is possible to perform the generation of the format route information described below.
Here, the main body configuration of the autonomous moving body 12 in the present embodiment will be explained by using
In
The direction control unit 12-3 changes the direction of the driving unit 12-6 by rotational driving of a shaft, and the driving unit 12-6 performs forward or backward movement of the autonomous moving body 12 by rotation of the shaft. It should be noted that the configuration explained by using
It should be noted that the autonomous moving body 12 is, for example, a moving body that uses Simultaneous Localization and Mapping (SLAM) technology. Furthermore, the autonomous moving body 12 is configured such that it can autonomously move along a specified predetermined route based on the detection information detected by the detection unit 12-1, or by detection information of an external system acquired via the Internet 16.
The autonomous moving body 12 is capable both of trace movement, in which it precisely traces specified points, and of movement in which it generates route information by itself and moves within the space between roughly set points.
In addition, as described above, the autonomous moving body 12 transmits information related to the movement of the own vehicle, such as the direction, moving speed, and position information of the own vehicle, to the system control apparatus 10 through the network connection unit 12-5. Furthermore, the system control apparatus 10 transmits the information related to the operation of the autonomous moving body 12 received from the autonomous moving body 12 to the conversion information holding apparatus 14 through the network connection unit 10-5. The conversion information holding apparatus 14 stores, in the format database 14-4, the information related to the operation of the autonomous moving body 12, such as the direction, moving speed, and position information of the autonomous moving body 12 received from the system control apparatus 10. In the present embodiment, moving bodies other than the autonomous moving body 12 likewise transmit information related to their own operation, such as direction, movement speed, position information, and the like, to the conversion information holding apparatus 14. Therefore, the format database 14-4 stores the direction, movement speed, and position information of the moving bodies that exist in the spaces managed by unique identifiers. How this information is stored will be described below with reference to
In this manner, it is possible for the autonomous moving body 12 of the present embodiment to perform autonomous movement based on the route information using the unique identifier that has been provided by the system control apparatus 10.
Returning to
The control unit 15-2 incorporates a CPU as a computer and controls the detection, data storage, and data transmission functions of the sensor node 15, and controls processing in each control unit in the sensor node 15. In addition, the detection information acquired by the detection unit 15-1 is stored in the information storage unit (memory/HD) 15-3 and transmitted to the conversion information holding apparatus 14 through the network connection unit 15-4.
In this manner, the sensor node 15 is configured so as to be able to store and communicate detection information, such as image information detected by the detection unit 15-1, feature point information of a detected object, position information, and the like, in the information storage unit 15-3. In addition, the sensor node 15 provides the detection information of an area detectable by the sensor node 15 to the conversion information holding apparatus 14.
Next, the specific hardware configuration of each control unit in
In
The ROM 23 is provided with a program ROM in which base software (OS), which is a system program that performs system control of the information processing apparatus, is recorded, and a data ROM in which information and the like required for operating the system are recorded. Instead of the ROM 23, it is also possible to use a below-described HDD 29. A reference numeral 24 designates a network interface (NETIF), which performs control for data transfer between information processing apparatuses via the Internet 16 and diagnoses a connection status. A reference numeral 25 designates a video RAM (VRAM), and develops an image to be displayed on the screen of an LCD 26 and performs control of the display thereof. A reference numeral 26 designates a display apparatus such as a display (hereinafter referred to as an “LCD”).
A reference numeral 27 designates a controller (hereinafter referred to as a "KBC") for controlling an input signal from an external input apparatus 28. A reference numeral 28 designates an external input device (hereinafter referred to as a "KB") for receiving an operation performed by a user; for example, a keyboard or a pointing device such as a mouse is used. A reference numeral 29 designates a hard disk drive (hereinafter referred to as an "HDD"), which is used to store application programs and various data. An application program in the present embodiment is a software program or the like that executes various processing functions in the present embodiment.
A reference numeral 30 designates a CDD, which is a drive for a removable data storage medium, such as a CD-ROM drive, a DVD drive, a Blu-Ray (registered trademark) disc drive, and the like, and which performs input/output of data to/from a removable medium 31. The CDD 30 is an example of an external input/output apparatus. The CDD 30 is used in a case in which the application program described above is read from a removable medium or the like. A reference numeral 31 designates a removable medium, such as a CD-ROM disc, a DVD, a Blu-Ray disc, or the like, that is read by the CDD 30.
It should be noted that a removable medium may be a magneto-optical recording medium (for example, an MO), a semiconductor recording medium (for example, a memory card), or the like. Application programs and data stored in the HDD 29 can also be utilized after being stored in the removable media 31. A reference numeral 20 designates a transmission bus (address bus, data bus, input/output bus, and control bus) for connecting each of the above-described units with each other.
Next, details of the control operation in the autonomous moving body control system for realizing a route-setting application and the like as was explained in
First, in step S201, the user accesses a web page provided by the system control apparatus 10 by the user interface 11. In step S202, the system control apparatus 10 displays a position input screen as explained in
The position information may be a word that specifies a specific location (hereinafter, "position word"), such as a building name, a station name, or an address, or may be a specific position specified as a point (hereinafter, "point") on the map displayed on the web page.
In step S204, the system control apparatus 10 stores the type information of the selected autonomous moving body 12 and the position information that has been input. At this time, in a case in which the position information is a position word, the position word is saved, and in a case in which the position information is the point, the latitude/longitude corresponding to the point is searched based on the simple map information stored in the position/route information management unit 10-3, and the latitude/longitude is stored.
Next, in step S205, the system control apparatus 10 specifies the type of route of the moving body that can be moved (hereinafter, the “route type”) from the mobility format (type) of the autonomous moving body 12 that has been specified by the user. Then, in step S206, the route type is transmitted to the route determination apparatus 13 together with the position information.
The mobility type is a legally recognized type of moving body, and means, for example, a type such as car, bicycle, drone, and the like. In addition, the type of route is, for example, a public road, a highway, a freeway, or the like for an automobile, or a designated sidewalk, a roadside strip on a public road, a dedicated bicycle lane, or the like for a bicycle.
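The correspondence between the mobility type and the route types it may use could be represented as a simple lookup; the sketch below merely echoes the examples given above, and the drone entry is an assumed placeholder rather than a disclosed category:

```python
# Illustrative sketch: route types usable by each mobility type.
# The concrete categories follow the examples in the description; the
# "drone" entry is a hypothetical placeholder.

ROUTE_TYPES = {
    "car": ["public_road", "highway", "freeway"],
    "bicycle": ["designated_sidewalk", "roadside_strip", "bicycle_lane"],
    "drone": ["airspace_corridor"],  # assumed category for illustration
}

def route_types_for(mobility_type):
    """Return the route types for a mobility type, or [] if unknown."""
    return ROUTE_TYPES.get(mobility_type, [])
```

A lookup of this kind is what allows step S205 to derive the route type from the mobility format before transmitting it to the route determination apparatus 13.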
In step S207, the route determination apparatus 13 inputs the received position information as a departure/transit/arrival point in its owned map information. In a case in which the position information is the position word, map information is searched by position word, and the corresponding latitude/longitude information is used. In a case in which the position information is latitude/longitude information, the position information is used by the direct input thereof into the map information.
Subsequently, in step S208, the route determination apparatus 13 searches for a route from the departure point to the arrival point via a transit point. At this time, the route is searched for in compliance with the route type. Then, in step S209, as a result of the search, the route determination apparatus 13 outputs the route from the departure point to the arrival point via the transit point (hereinafter, "route information") in the GPX format (GPS eXchange Format), and transmits the route information to the system control apparatus 10.
A GPX format file is configured mainly by the three types of information of a waypoint (point information having no order relationship), a route (point information having an order relationship with time information added), and a track (an aggregate of a plurality of point information: a trajectory).
For each piece of point information, latitude/longitude are described as attribute values, and elevation, geoid height, GPS reception status/accuracy, and the like are described as sub-elements. The minimum element required in a GPX file is the latitude/longitude information of a single point, and the description of any other information is optional. The route is output as route information, which is an aggregate of point information consisting of latitude/longitude having an order relationship. It should be noted that the route information may be in any other format that satisfies the above.
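A minimal GPX route of the kind described, an ordered aggregate of points carrying only the mandatory latitude/longitude attributes, could be produced as in the following sketch (the creator string is arbitrary, and optional sub-elements are omitted):

```python
import xml.etree.ElementTree as ET

# Sketch: serialize an ordered list of (lat, lon) points as a GPX route
# ("rte" with "rtept" children), carrying only the mandatory lat/lon
# attributes described above.

def to_gpx_route(points):
    """points: ordered [(lat, lon), ...] -> GPX document as a string."""
    gpx = ET.Element("gpx", version="1.1", creator="example")
    rte = ET.SubElement(gpx, "rte")
    for lat, lon in points:
        ET.SubElement(rte, "rtept", lat=str(lat), lon=str(lon))
    return ET.tostring(gpx, encoding="unicode")
```

Because order is preserved by the sequence of rtept elements, the route carries the order relationship without any explicit index.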
Here, an example of a configuration of a format managed by the format database 14-4 of the conversion information holding apparatus 14 is explained in detail with reference to
In
In
It should be noted that in
In
That is, the conversion information holding apparatus 14 formats spatial information related to the type of object that exists or can enter the three-dimensional space defined by latitude/longitude/height in association with a unique identifier and stores the formatted spatial information in the format database 14-4. The spatial information is updated based on information that has been input by an external system (for example, the sensor node 15) and the like that is communicatively connected to the conversion information holding apparatus 14, and the information is shared with other external systems that have been communicatively connected to the conversion information holding apparatus 14.
As described above, the autonomous moving body 12 transmits information related to the movement of the own vehicle, such as the direction, movement speed, and position information of the own vehicle. How this information is stored will be explained below.
The sensor node 15 is disposed so as to be able to capture an image of the space to which a unique identifier 001 (hereinafter also referred to as “ID001”), a unique identifier 002 (hereinafter also referred to as “ID002”), and a unique identifier 003 (hereinafter also referred to as “ID003”) are assigned.
The sensor node 15 recognizes a bicycle 1202 as a bicycle. The sensor node 15 also recognizes, by the distance measurement function, how far away the bicycle 1202 is, that is, the distance from the sensor node 15 to the bicycle 1202. Furthermore, because the sensor node 15 also holds self-position information and image capture direction information, it is possible to determine, by a calculation using this distance, that the bicycle 1202 exists in the spaces to which ID002 and ID003 are assigned.
Because the sensor node 15 performs object recognition processing for each captured frame, it is possible to calculate the direction and movement speed of a moving body in the captured image from the difference in position from the previous frame. A table 1203 is a table summarizing the information recognized by the sensor node 15. In the present control system, as shown in the table 1203, the items of a car, a motorcycle, a bicycle, and a person are prepared for each unique identifier as recognition objects. In a case in which an object does not exist in the space to which the unique identifier is assigned, "−1" is input. In a case in which an object exists in the space to which the unique identifier is assigned, the direction in which it is facing and its velocity are input. In the table 1203, a direction is expressed as an angle, with north being 0 degrees and one counterclockwise rotation being 360 degrees. In the table 1203, velocity is expressed in meters per second [m/s]. The sensor node 15 performs object recognition in each captured frame, compiles data as shown in the table 1203, and transmits the data to the conversion information holding apparatus 14 through the network connection unit 15-4.
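The per-identifier recognition table of table 1203 could be represented as in the following sketch, assuming the fields described above: "−1" when no object exists, and otherwise a pair of direction in degrees and velocity in m/s. The concrete values for the bicycle 1202 are illustrative:

```python
# Sketch of the table-1203 structure: one row per unique identifier, one
# column per recognition object. -1 marks an absent object; a present object
# is stored as (direction_deg, speed_mps). Values are illustrative.

NO_OBJECT = -1

def make_row(car=NO_OBJECT, motorcycle=NO_OBJECT,
             bicycle=NO_OBJECT, person=NO_OBJECT):
    return {"car": car, "motorcycle": motorcycle,
            "bicycle": bicycle, "person": person}

# Bicycle 1202 present in the ID002/ID003 spaces, assumed heading 90
# degrees at 3.0 m/s; no objects in the ID001 space.
table = {
    "ID001": make_row(),
    "ID002": make_row(bicycle=(90.0, 3.0)),
    "ID003": make_row(bicycle=(90.0, 3.0)),
}
```

A structure of this shape is what the sensor node 15 would compile each frame and transmit to the conversion information holding apparatus 14.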
It should be noted that the sensor node 15 may only perform object recognition to determine what object is at which position, and transmit the object recognition result to the conversion information holding apparatus 14 through the network connection 15-4. In this case, the conversion information holding apparatus 14 may store the received object recognition result in the unique identifier management unit 14-2 in the form of the table 1203 in accordance with the format database 14-4.
In this manner, in the present embodiment, information related to the type and time of an object that exists in or can enter a three-dimensional space defined by latitude/longitude/height (hereinafter, “spatial information”) is formatted in association with a unique identifier and stored in a database. Then, management of time-space is made possible by the formatted spatial information.
Returning to
At this time, in a case in which the interval of the position information is smaller than the interval between the starting point positions of the divided spaces, the system control apparatus 10 uses, as position point group data, the result obtained by thinning out the position information within the route information according to the interval between the starting point positions of the divided spaces. In addition, in a case in which the interval of the position information is greater than the interval between the starting points of the divided spaces, the system control apparatus 10 makes the position information into position point group data by interpolating the position information within a range that does not deviate from the route information.
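The thinning and interpolation described above can be sketched as a single pass over the route points. This is a minimal sketch under simplifying assumptions: positions are planar (x, y) coordinates in meters rather than latitude/longitude, and the interpolation is linear; the function name and parameters are illustrative:

```python
import math

def to_point_group(points, pitch):
    """Reduce route points to (roughly) the grid pitch of the divided spaces.

    points: list of (x, y) positions in meters along the route.
    pitch:  interval between starting points of the divided spaces, in meters.
    Points closer than the pitch are thinned out; gaps wider than the pitch
    are filled by linear interpolation along the segment.
    """
    out = [points[0]]
    for p in points[1:]:
        q = out[-1]
        d = math.hypot(p[0] - q[0], p[1] - q[1])
        if d < pitch:
            continue                       # thin out: too close to last kept point
        steps = int(d // pitch)
        for i in range(1, steps + 1):      # interpolate: fill the wide gap
            t = i * pitch / d
            out.append((q[0] + t * (p[0] - q[0]), q[1] + t * (p[1] - q[1])))
    return out
```

For example, two route points 10 m apart with a 5 m pitch yield an interpolated midpoint, while points closer than 5 m are dropped.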
Next, as shown in step S211 of
In step S214, the system control apparatus 10 arranges the received unique identifiers in the same order as the original position point group data, and stores the received unique identifiers as route information (hereinafter, “format route information”) by using the unique identifiers. Thus, in step S214, the system control apparatus 10 acquires spatial information from the database of the conversion information holding apparatus 14, and generates route information related to the movement route of the moving body based on the acquired spatial information and the type information of the moving body.
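The conversion from position point group data to format route information can be sketched as follows. The square-grid lookup, the grid origin and pitch, and the ID naming scheme are all assumptions for illustration; in the actual system the unique identifiers are returned by the conversion information holding apparatus 14:

```python
# Hedged sketch: convert position point group data into format route
# information by looking up the unique identifier of the divided space
# containing each point, keeping the original point order.

def point_to_uid(point, origin, pitch):
    """Map an (x, y) position to a hypothetical ID on a square grid."""
    col = int((point[0] - origin[0]) // pitch)
    row = int((point[1] - origin[1]) // pitch)
    return f"ID{row:03d}{col:03d}"

def to_format_route(point_group, origin=(0.0, 0.0), pitch=5.0):
    """Identifiers stay in the same order as the point group; consecutive
    duplicates are dropped so each traversed space appears once."""
    route = []
    for p in point_group:
        uid = point_to_uid(p, origin, pitch)
        if not route or route[-1] != uid:
            route.append(uid)
    return route
```

The resulting ordered identifier list is the "format route information" that the system control apparatus 10 stores in step S214.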
Here, a process of generating the position point group data from the route information, and converting the position point group data into route information by using a unique identifier will be explained in detail with reference to
In
In
In
Returning to
Then, in step S216, the system control apparatus 10 converts the spatial information into a format that can be reflected in a three-dimensional map of the cyberspace of the autonomous moving body 12, and creates information that indicates the position of a plurality of objects (obstacles) in the predetermined space (hereinafter, a “cost map”). The cost map may be created initially with respect to the space of all routes of the format route information, or it may be created in a form that is divided into fixed areas and updated sequentially.
Next, in step S217, the system control apparatus 10 stores the format route information and the cost map by associating the format route information and the cost map with a unique identification number assigned to the autonomous moving body 12. At a predetermined time interval, the autonomous moving body 12 monitors (hereinafter, “polls”) its own unique identification number via a network, and in step S218, downloads the associated cost map. In step S219, the autonomous moving body 12 reflects latitude/longitude information of each unique identifier of the format route information as route information on a locally-created three-dimensional map of cyberspace.
Next, in step S220, the autonomous moving body 12 reflects the cost map in a three-dimensional map of the cyberspace as obstacle information on the route. In a case in which the cost map is created in a form in which it is divided at fixed intervals, after moving through the area for which the cost map has been created, the cost map for the next area is downloaded, and the cost map is updated.
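The sequential update scheme for a cost map divided at fixed intervals can be sketched as follows. The class, the area size, and the `download` callable are hypothetical stand-ins for the polling and download interaction between the autonomous moving body 12 and the system control apparatus 10:

```python
# Hedged sketch of sequential cost-map updates: the format route is split
# into fixed-size areas, and the cost map for the next area is downloaded
# only after the current area has been traversed.

def split_into_areas(format_route, area_size):
    """Divide the format route information into fixed-size chunks of IDs."""
    return [format_route[i:i + area_size]
            for i in range(0, len(format_route), area_size)]

class CostMapManager:
    def __init__(self, format_route, area_size, download):
        self.areas = split_into_areas(format_route, area_size)
        self.download = download            # callable: area -> cost map dict
        self.index = 0
        self.cost_map = download(self.areas[0])

    def on_area_traversed(self):
        """Called after moving through the current area: fetch the next one."""
        if self.index + 1 < len(self.areas):
            self.index += 1
            self.cost_map = self.download(self.areas[self.index])
```

This keeps only one area's obstacle information in local memory at a time, at the cost of a download after each traversed area.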
In step S221, the autonomous moving body 12 moves in accordance with the route information while avoiding objects (obstacles) that have been input in the cost map. That is, the autonomous moving body 12 performs movement control based on the cost map. At this time, in step S222, the autonomous moving body 12 moves while performing object detection, and if there is a difference with the cost map, moves while updating the cost map by using object detection information.
Furthermore, in step S223, the autonomous moving body 12 transmits the difference information with the cost map to the system control apparatus 10, together with the corresponding unique identifier. In step S224 of
In step S226, the autonomous moving body 12, which is moving based on the format route information, transmits to the system control apparatus 10 the unique identifier associated with the space currently being traversed at each passage through a divided space linked to each unique identifier. Alternatively, the autonomous moving body 12 may be associated with its own unique identification number at the time of polling. The system control apparatus 10 determines the current position of the autonomous moving body 12 on the format route information, based on the unique identifier information of the space that the system control apparatus 10 receives from the autonomous moving body 12.
By repeating step S226, the system control apparatus 10 can determine the current location of the autonomous moving body 12 within the format route information. It should be noted that, with respect to a unique identifier of the space through which the autonomous moving body 12 has passed, the system control apparatus 10 may cease to hold the unique identifier, thereby reducing the held data capacity of the formatted route.
In step S227, the system control apparatus 10 creates the confirmation screen 50 and the map display screen 60 that were explained in
Meanwhile, in step S228, the sensor node 15 stores the detection information of a detection range, in step S229 abstracts the detection information, and in step S230 transmits the abstracted detection information as spatial information to the conversion information holding apparatus 14. The abstraction is information such as whether or not an object exists, or whether or not there has been a change in an existence state of an object, for example, and is not detailed information related to the object. Detailed information related to an object is stored in a memory in the sensor node 15.
Then, in step S231, the conversion information holding apparatus 14 stores the spatial information, which is abstracted detection information, in association with a unique identifier of a position corresponding to the spatial information. Thus, the spatial information is stored in one unique identifier in the format database.
Furthermore, in a case in which an external system different from the sensor node 15 utilizes the spatial information, the external system utilizes the spatial information by acquiring the detection information in the sensor node 15 via the conversion information holding apparatus 14, based on the spatial information in the conversion information holding apparatus 14. At this time, the conversion information holding apparatus 14 also includes a function to connect the communication standard of the external system and the sensor node 15.
By performing the above-described storage of spatial information not only in the sensor node 15, but between a plurality of devices, the conversion information holding apparatus 14 has a function of connecting data from a plurality of devices with a relatively light data volume. It should be noted that in a case in which the system control apparatus 10 requires detailed object information when creating the cost map in steps S215 and S216, the detailed information can be used by downloading the detailed information from an external system storing the detailed detection information of spatial information.
Here, it is assumed that the sensor node 15 updates the spatial information on the route of the format route information of the autonomous moving body 12. At this time, the sensor node 15 acquires the detection information, generates the abstracted spatial information at step S233, and transmits the abstracted spatial information to the conversion information holding apparatus 14 at step S234. The conversion information holding apparatus 14 stores the spatial information in the format database 14-4 at step S235.
The system control apparatus 10 confirms a change in the spatial information in the managed format route information at a predetermined time interval, and if there is a change, downloads the spatial information at step S236. Then, at step S237, the system control apparatus 10 updates the cost map that has been associated with the unique identification number assigned to the autonomous moving body 12. In step S238, the autonomous moving body 12 recognizes the update of the cost map by polling, and reflects the update in the locally-created three-dimensional map of the cyberspace.
In this manner, by utilizing spatial information that has been shared by a plurality of devices, it is possible for the autonomous moving body 12 to recognize in advance a change on a route that is not recognizable locally, and to respond to that change. In a case in which the above-described series of processing is executed, and the autonomous moving body 12 arrives at the arrival point in step S239, the unique identifier is transmitted at step S240.
Thus, the system control apparatus 10, having recognized the unique identifier, at step S241 displays an arrival display on the user interface 11, and terminates the application. According to the present embodiment, in this manner, it is possible to provide a format of a digital architecture, and an autonomous moving body control system that uses the format.
As explained in
One type of such spatial information is the type information of an object in the space. Here, the type information of an object in a space is information that can be acquired from map information, such as a roadway, a sidewalk, a bicycle path, and the like in a road. In addition, information such as the direction of travel of the mobility and traffic regulations and the like in a roadway can be similarly defined as type information. Furthermore, it is possible to define type information in the space itself.
In the present embodiment, a control system related to blind spot accident reduction using the autonomous moving body control system of the First Embodiment will be explained. Control of this system is realized by the system control apparatus 10 shown in
A front vehicle-mounted camera is mounted on the own vehicle 1301, and the field of view thereof is represented by a field of view 1304. At the time of
To reduce the number of accidents like that of
In the detection of a collision factor, if all of the information around the own vehicle is acquired from the conversion information holding apparatus 14, the information volume becomes so large that time is required to search for the collision factor. Therefore, a range of information that is acquired in the blind spot accident reduction system of the present embodiment is set.
In the state shown in
First, the concept of a setting range of the area 1405 will be explained. The area 1405 is an area primarily for performing the detection of a motorcycle. A motorcycle that may collide with the own vehicle 1401 in the future is currently present further to the left side of the intersection in the opposite lane. Accordingly, the right end of the area 1405 is an intersection. In addition, because motorcycles travel on a road, the vertical width of the area 1405 in
Furthermore, it is possible to set the left end of the area 1405 as below. Because the own vehicle 1401 is still before the intersection (to the right side in
The range from the position of the intersection to the position (Tm [s] × maximum speed of the traveling road [m/s] × 2) [m] on the side before the intersection (left side in FIG. 16). (Formula 1)
Furthermore, in order to detect a motorcycle that may collide with the own vehicle 1401 in the future from the data of the format database 14-4, it is necessary to take into account the update delay of this format database 14-4. An update delay of this format database 14-4 will be explained below.
In order to input information into this format database 14-4, processing is required whereby the sensor node 15 captures an image of the motorcycle, recognizes the motorcycle by object recognition, generates data, and stores the data in the format database 14-4. Therefore, the information stored in the format database 14-4 is older than the information at the current time.
(Ts [s] × maximum speed of the traveling road [m/s] × 2) [m] (Formula 2)
In the present embodiment, the length obtained by extending the range of Formula 1 by the distance of Formula 2 is set as a length 1404 of
(3 [s] × 11.1 [m/s] × 2) + (1 [s] × 11.1 [m/s] × 2) = 88.8 [m] (Formula 3)
In this case, based on Formula 3, the area 1405 is assumed to be from the intersection to a distance of 88.8 meters along the road before the intersection (on the left side in
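Formulas 1 through 3 can be transcribed directly. The function name is illustrative; the values (Tm = 3 s, Ts = 1 s, maximum road speed 40 km/h ≈ 11.1 m/s) and the factor of 2 follow the formulas as given in the text:

```python
# Worked check of Formulas 1-3: the length of area 1405 before the
# intersection combines the own vehicle's predicted arrival time Tm with
# the format database's update delay Ts.

def motorcycle_range_m(tm_s, ts_s, max_speed_mps):
    """Length of the motorcycle detection area per Formulas 1-3:
    (Tm x max speed x 2) + (Ts x max speed x 2), all in meters."""
    return (tm_s * max_speed_mps * 2) + (ts_s * max_speed_mps * 2)

# 40 km/h road -> 11.1 m/s, Tm = 3 s, Ts = 1 s, as in the text.
length_1404 = motorcycle_range_m(3, 1, 11.1)   # 88.8 m, matching Formula 3
```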
Next, a setting method of the area 1410 and the area 1411 will be explained. The area 1410 and the area 1411 are areas in which a bicycle is detected. It is assumed that the maximum speed of a bicycle is approximately 40 km/h. If calculated in a similar manner as that described above, the maximum distance by which a bicycle that may collide with the own vehicle 1401 at the intersection in the future can currently be separated from the intersection is given by Formula 4.
(3 [s] × 11.1 [m/s]) + (1 [s] × 11.1 [m/s]) = 44.4 [m] (Formula 4)
In the present Embodiment, each of a length 1406 and a length 1407 shown in
Next, a setting method of the area 1408 and the area 1409 will be explained. The area 1408 and the area 1409 are areas in which a pedestrian is detected. It is assumed that the running speed of a pedestrian is approximately 100 meters in 10 seconds. Therefore, the maximum distance that a pedestrian that may collide with the own vehicle 1401 at the intersection in the future is currently separated from the intersection is 10 meters. For this reason, in the present embodiment, each of a length 1412 and a length 1413 shown in
In the above description, the concept of an area has been explained. The information that is stored in the information storage unit (memory/HD) 12-4 as an area is each vertex of an area. However, in reality, there are cases in which a road curves. Therefore, in a case in which a list of unique identifiers corresponding to an area is created based on that area, it is preferable to fit a range to a road based on map information stored in the information storage unit (memory/HD) 12-4. This point will be explained below.
As a method for creating a unique identifier list from an area, a method of searching for a unique identifier required in a blind spot accident reduction system using the autonomous moving body control system described above will be explained here.
As shown in Table 1203 of
An area 1801, which is an inner area delineated by solid lines, is a roadway on which vehicles travel, and the area of the outer side thereof is a sidewalk. All of the divided spaces are of the same size and are assigned a unique identifier in a continuous sequence from ID 071 to ID 150 from the bottom right to the top left. In reality, a plurality of spaces, each of which also has a unique identifier, are disposed in the vertical direction, but in
In
Next, an area search method will be explained with reference to
In step S1901, the system control apparatus 10 detects that the own vehicle 1802 has entered an intersection. As a method of detecting that the own vehicle 1802 has entered an intersection, the front camera of the detection unit 12-1 of the own vehicle 1802, which is an autonomous moving body 12, detects that the own vehicle 1802 has entered an intersection, and the system control apparatus 10 may receive notification to this effect. In addition, as a method of detecting that the own vehicle 1802 has entered an intersection, detection may be performed by comparing the self-position information acquired by using the GPS of the detection unit 12-1 of the own vehicle 1802, which is the autonomous moving body 12, with the map information during traveling. The system control apparatus 10 detects that the own vehicle 1802 has entered an intersection by receiving notification from the own vehicle 1802 as the autonomous moving body 12 that the own vehicle 1802 has entered the intersection. In a case in which the system control apparatus 10 has detected that the own vehicle 1802 has entered an intersection, the system control apparatus 10 executes the processing of step S1902. In a case in which the system control apparatus 10 has detected that the own vehicle 1802 has not entered an intersection, the system control apparatus 10 executes the processing of step S1901. The processing of step S1901 is an example of a determination unit that determines that a moving body has approached a collision danger area, which is an area in which there is a possibility of collision with a collision factor.
In step S1902, the system control apparatus 10 calculates the direction of travel in which the own vehicle 1802 has entered the intersection, and represents the direction of travel by a unique identifier. As a method for calculating the direction of travel of the own vehicle 1802, the direction of travel of the own vehicle 1802 can be calculated from the self-position information detected by the detection unit 12-1, or can be calculated by detecting the own vehicle 1802 by the sensor node 15. As a method of expressing the traveling direction with a unique identifier, the control unit 10-2 of the system control apparatus 10 expresses the traveling direction by converting the own vehicle 1802 and the area information that the own vehicle 1802 has passed through before entering the intersection into a unique identifier.
In step S1903, the system control apparatus 10 queries the conversion information holding apparatus 14 through the network connection unit 10-5 as to the presence or absence of dynamic information of the range of the area 1405 from the intersection. The data format and the like that the system control apparatus 10 at this time sends and receives at the time of a query will be explained with reference to
The data format of the data transmitted by the system control apparatus 10 at the time of a query is as follows. The data 1701 stores, at the head position thereof, position information of an intersection converted to a unique identifier. Following the unique identifier, the data 1701 stores, as at least two or more consecutive pieces of data, a unique identifier of the road that the own vehicle 1802 has passed along when entering the intersection. The system control apparatus 10 transmits such data as the data 1701 to the conversion information holding apparatus 14.
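The layout of the data 1701 can be sketched as follows. The function name and the example ID values are hypothetical; only the ordering (intersection identifier first, then two or more identifiers of the road traversed on approach) follows the text:

```python
# Illustrative encoding of the query data 1701: the intersection's unique
# identifier at the head, followed by at least two consecutive unique
# identifiers of the road the own vehicle passed along when entering the
# intersection (their order lets the receiver derive the travel direction).

def build_query(intersection_uid, passed_uids):
    if len(passed_uids) < 2:
        raise ValueError("at least two passed identifiers are required")
    return [intersection_uid] + list(passed_uids)

# Example with hypothetical identifiers.
data_1701 = build_query("ID117", ["ID113", "ID115"])
```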
In step S1904, the conversion information holding apparatus 14 calculates a vector 1810, which is a vector indicating the direction of travel of the own vehicle 1802, from the traveling direction information of the own vehicle 1802 received from the system control apparatus 10.
In step S1905, the conversion information holding apparatus 14 searches for a unique identifier of a space adjacent to the intersection. At this time, when searching for a unique identifier in an adjacent space, the conversion information holding apparatus 14 does not search for a unique identifier in a direction having a vector component opposite to the vector 1810. That is, the conversion information holding apparatus 14 performs a search only in a direction 1811 that has the same vector component as the vector 1810. This is because, at the time that the own vehicle 1802 has reached the intersection, the own vehicle is considered to have already passed through the spaces oriented in the direction opposite to the vector 1810, and therefore to have already passed any dynamic information present in those spaces.
In step S1906, the conversion information holding apparatus 14 confirms whether or not a space indicated by a searched unique identifier is a roadway from the information managed in the unique identifier management unit 14-2. In a case in which the space is not a roadway, the conversion information holding apparatus 14 executes the processing of step S1908, and does not execute a search as to whether there is dynamic information in the data managed by the unique identifier of that space. In a case in which the space is a roadway, the conversion information holding apparatus 14 performs the processing of step S1907.
In step S1907, the conversion information holding apparatus 14 confirms dynamic information included in the data managed by the unique identifier of the space. In addition, in step S1907, the conversion information holding apparatus 14 associates the dynamic information in the data with the unique identifier, and stores the position information of the dynamic information and the unique identifier in the unique identifier list.
In the case of the example of
In step S1908, the conversion information holding apparatus 14 determines whether a search of all necessary areas has been completed. The conversion information holding apparatus 14 executes the processing in S1909 in a case in which the search of all necessary areas has been completed. The conversion information holding apparatus 14 executes the processing in S1905 in a case in which the search of all necessary areas has not been completed.
In step S1909, the conversion information holding apparatus 14 transmits the unique identifier list to the system control apparatus 10. Data 1702 is an example of a unique identifier list. As shown in the data 1702, in a case in which there is dynamic information for a unique identifier, the unique identifier is stored in the list in association with that dynamic information. In a case in which there is no dynamic information, the unique identifier is stored together with 0. In the example of
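The directional search of steps S1905 through S1909 can be sketched as follows. The grid representation, the distance cutoff, and all names are assumptions; only the behaviors (skipping the direction opposite to the vector 1810, skipping non-roadway spaces, pairing each identifier with its dynamic information or 0) follow the text:

```python
# Hedged sketch of the directional search: starting from the intersection,
# only spaces whose offset does not oppose the travel vector 1810 are
# visited, non-roadway spaces are skipped (step S1906), and each visited
# identifier is paired with its dynamic information, or 0 if none.

def search_unique_ids(grid, start, travel_vec, max_dist):
    """grid: {(x, y): {"uid": str, "roadway": bool, "dynamic": object|None}}.
    Returns a unique identifier list in the style of data 1702."""
    vx, vy = travel_vec
    result = []
    for (x, y), cell in sorted(grid.items()):
        dx, dy = x - start[0], y - start[1]
        if dx * vx + dy * vy < 0:
            continue                        # opposite to vector 1810: skip
        if abs(dx) > max_dist or abs(dy) > max_dist:
            continue                        # outside the necessary area
        if not cell["roadway"]:
            continue                        # sidewalk etc.: no search needed
        result.append((cell["uid"], cell["dynamic"] or 0))
    return result
```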
According to the present embodiment, by creating a unique identifier list as described above, it is possible to know the collision factor that may collide with the own vehicle by dynamic information, including a collision factor positioned in a blind spot. Therefore, according to the present embodiment, it is possible to acquire information necessary for determining a collision factor in a blind spot accident reduction system. In addition, according to the present embodiment, because it is possible to omit information in the unique identifier list that is unnecessary in determining a collision factor, it becomes possible to reduce the communication volume and the processing load.
Next, a method in which the control unit 10-2 of the system control apparatus 10 detects a collision factor such as a motorcycle, a bicycle, a pedestrian, and the like that may collide with the own vehicle from a unique identifier list of each area created as described above will be explained.
The collision factor detected differs for each area. In a case of a unique identifier list of the area 1405, a motorcycle is detected. In a case of a unique identifier list of the area 1410 and the area 1411, a bicycle is detected. In a case of a unique identifier list of the area 1408 and the area 1409, a pedestrian is detected.
As described above, a unique identifier list that has been created in response to the entry of the autonomous moving body 12 into the intersection is stored in the information storage unit 10-4 of the system control apparatus 10. Interrupt processing is executed when triggered by an update of this unique identifier list, and the processing shown in the flowchart of
In step S2002, the system control apparatus 10 acquires the direction information of the dynamic information of the collision factor to be searched from within the unique identifier list. As mentioned above, the dynamic information in the table 1203 of
In step S2003, the system control apparatus 10 confirms the direction toward which the collision factor is facing. If the collision factor is facing the direction of the intersection, the system control apparatus 10 executes the processing of step S2004, assuming that there is a possibility that the collision factor enters the intersection. If the collision factor is not facing the direction of the intersection, the system control apparatus 10 executes the processing of step S2012, assuming that the collision factor will not enter the intersection.
In step S2004, the system control apparatus 10 confirms the speed of the collision factor. If the speed of the collision factor is 0, that is, the collision factor is stationary, the system control apparatus 10 assumes that there is no entry of the collision factor into the intersection, and executes the processing of step S2012. If the speed of the collision factor is not zero and the collision factor is in motion, the system control apparatus 10 executes the processing of step S2005, assuming that there is a possibility that the collision factor enters the intersection.
As described above, at the time at which the system control apparatus 10 generates a unique identifier list, the information storage unit 10-4 stores the latitude and longitude, which are the position information of the intersection. In addition, the latitude and longitude position information of the unique identifier that has been assigned to the space in which a collision factor exists (unique identifier of interest) is stored in the unique identifier information of the unique identifier list. In step S2005, the system control apparatus 10 calculates a distance D from the collision factor to the intersection by using the position information of these intersections and the position information of the unique identifier of interest.
In step S2006, the system control apparatus 10 calculates a collision factor prediction time Tf, which is the predicted time it takes for the collision factor to arrive at the intersection, according to Formula 5.
Tf = D/S − Ts (Formula 5)
In Formula 5, D is a distance from the collision factor to the intersection calculated at step S2005, S is a speed of the collision factor acquired from the format database 14-4, and Ts is a time difference between the current time and the update time of the unique identifier. Because the distance D is a value calculated by using information at a time that is a time difference Ts earlier than the current time, a collision factor prediction time Tf is obtained by subtracting Ts from the time obtained by dividing the distance D by the speed S.
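Formula 5 transcribes directly into code. The function name is illustrative; the zero-speed guard reflects the fact that stationary collision factors are filtered out earlier, in step S2004:

```python
# Direct transcription of Formula 5: the predicted time until the collision
# factor reaches the intersection, corrected for the database update delay.

def collision_factor_prediction_time(d_m, s_mps, ts_s):
    """Tf = D / S - Ts, where D is the distance calculated in step S2005,
    S is the collision factor's speed from the format database 14-4, and
    Ts is the age of the unique identifier's data."""
    if s_mps <= 0:
        raise ValueError("stationary factors are filtered out in step S2004")
    return d_m / s_mps - ts_s

# Example: a motorcycle 88.8 m away at 11.1 m/s, with a 1 s update delay.
tf = collision_factor_prediction_time(88.8, 11.1, 1.0)
```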
In step S2007 and step S2008, the system control apparatus 10 determines the relationship between the collision factor prediction time Tf obtained in step S2006 and the own vehicle prediction time Tm, which is a predicted time at which the own vehicle arrives at the intersection. That is, the system control apparatus 10 performs a determination as to whether there is a possibility of a collision between the own vehicle and the collision factor based on the difference between Tf and Tm.
In step S2007, the system control apparatus 10 determines whether the absolute value of the difference between Tf and Tm (|Tf−Tm|) is smaller than a first threshold value T1. In a case in which the absolute value of the difference between Tf and Tm is smaller than the first threshold value T1, the system control apparatus 10 performs the processing of step S2009. In the present embodiment, T1 is set to be, for example, 3 seconds. In a case in which the absolute value of the difference between Tf and Tm is smaller than the first threshold value T1, the system control apparatus 10 determines that there is a high possibility of a collision. In a case in which the absolute value of the difference between Tf and Tm is not smaller than the first threshold value T1, the system control apparatus 10 executes the processing of step S2008.
In step S2009, because there is a high risk that a collision factor that exists in the space to which a current unique identifier is assigned may collide with the autonomous moving body 12, which is the own vehicle, the system control apparatus 10 transmits a warning to the autonomous moving body 12. When the control unit 12-2 of the autonomous moving body 12 receives the warning from the system control apparatus 10, the control unit 12-2 controls a driving unit 12-6 so as to temporarily stop a right turn of the autonomous moving body 12. It should be noted that there may be a case in which, when the control unit 12-2 receives a warning from the system control apparatus 10, the control unit 12-2 executes processing of notifying the driver of the autonomous moving body 12 that a temporary stop of the right turn has been executed. Furthermore, there may also be a case in which the control unit 12-2, upon receiving a warning from the system control apparatus 10, executes processing to strongly warn the driver of the autonomous moving body 12 to temporarily stop the right turn. A warning in this case is executed by a warning display or a warning sound output to the driver.
In step S2008, the system control apparatus 10 determines whether the absolute value of the difference between Tf and Tm (|Tf−Tm|) is smaller than a second threshold value T2. The second threshold T2 is greater than the first threshold T1. In a case in which the absolute value of the difference between Tf and Tm is smaller than the second threshold T2, the system control apparatus 10 performs the processing of step S2010. In the present embodiment, T2 is set to be, for example, 10 seconds. In a case in which the absolute value of the difference between Tf and Tm is smaller than the second threshold value T2, the system control apparatus 10 determines that there is a possibility of a collision. In a case in which the absolute value of the difference between Tf and Tm is not smaller than the second threshold value T2, the system control apparatus 10 executes the processing of step S2011.
In step S2010, because there is a possibility that a collision factor that exists in the space to which a current unique identifier is assigned may collide with the autonomous moving body 12, which is the own vehicle, the system control apparatus 10 transmits a caution to the autonomous moving body 12, which is the own vehicle. When the control unit 12-2 of the autonomous moving body 12 receives the caution from the system control apparatus 10, the control unit 12-2 determines whether or not the autonomous moving body 12 should be stopped according to the detection control of the autonomous moving body 12 and the response performance of the driving unit, and the like. The control unit 12-2 controls a driving unit 12-6, and temporarily stops the right turn of the autonomous moving body 12 in a case in which the performance of the autonomous moving body 12 is not sufficiently good. In a case in which the performance of the autonomous moving body 12 is good, the control unit 12-2 may limit processing to the execution of a caution display and caution sound output to the driver, and there may be a case in which the control unit 12-2 does not execute the processing to temporarily stop the right turn of the autonomous moving body 12. It should be noted that in a case in which the control unit 12-2 has temporarily stopped the right turn of the autonomous moving body 12, there may also be a case in which the control unit 12-2 executes processing to notify the driver of the autonomous moving body 12 that the right turn has been temporarily stopped. Furthermore, there may also be a case in which the control unit 12-2, upon receiving a caution from the system control apparatus 10, executes processing to caution the driver of the autonomous moving body 12 to temporarily stop the right turn. A caution in this case is executed by a caution display or a caution sound output to the driver.
In step S2011, the system control apparatus 10 transmits a notification to the autonomous moving body 12, which is the own vehicle, indicating that a collision factor is approaching. When the control unit 12-2 of the autonomous moving body 12 receives the notification from the system control apparatus 10, the control unit 12-2 determines that the collision factor is approaching but is not yet within collision distance, and controls the driving unit 12-6 so as to cause the autonomous moving body 12 to promptly turn right. It should be noted that, when the control unit 12-2 receives the notification from the system control apparatus 10 indicating that a collision factor is approaching, the control unit 12-2 may execute processing for notifying the driver of the autonomous moving body 12 that the collision factor is approaching. Furthermore, the control unit 12-2 may execute processing for notifying the driver of the autonomous moving body 12 to promptly turn right because the collision factor is approaching. A notification in this case is the execution of a notification display or a notification sound output to the driver.
When the warning by the processing of step S2019, the caution by the processing of step S2010, or the notification by the processing of step S2011 is complete, the system control apparatus 10 executes the processing of step S2012. The warning by the processing of step S2019, the caution by the processing of step S2010, and the notification by the processing of step S2011 are examples of a collision avoidance control unit that performs control so as to avoid a collision between a moving body and a collision factor according to the level of collision risk. The warning by the processing of step S2019, the caution by the processing of step S2010, and the notification by the processing of step S2011 are also examples of performing a notification corresponding to a high collision risk to a moving body.
In step S2012, the system control apparatus 10 confirms whether a unique identifier still remains to be searched for in the unique identifier list. In a case in which there still remains a unique identifier to be searched, the system control apparatus 10 performs the processing of step S2002. The system control apparatus 10 terminates the processing in a case in which there is no unique identifier remaining to be searched.
In the present embodiment, it is assumed that the processing time is several hundred [ms] from the start of the processing to the end of the processing of
In the Second Embodiment described above, a unique identifier list was generated from a predetermined area, and a collision factor was determined from the unique identifier list. However, when the own vehicle is moving at a high speed, or when there is a delay in the recognition by the front vehicle-mounted camera, the detection of the collision factor may be delayed. Furthermore, in the Second Embodiment, when creating the unique identifier list, the dynamic information of a unique identifier was acquired starting from an intersection. In a Third Embodiment of the present invention, a method of quickly searching for a collision factor by changing the creation starting point of a unique identifier list will be explained. In the Third Embodiment, a method is provided to detect a collision factor earlier than in the Second Embodiment by changing the starting point for creating the unique identifier list according to the speed information of an oncoming vehicle.
In step S2117, the autonomous moving body 12, which is the own vehicle, simultaneously measures the traveling speed of an oncoming vehicle when the processing of
In step S2118, the system control apparatus 10 calculates the difference between the update time of a unique identifier and the current time. Specifically, first, the control unit 10-2 of the system control apparatus 10 accesses the format database 14-4 of the conversion information holding apparatus 14 through the network control unit 10-5. By this access, the control unit 10-2 acquires the update time of the dynamic information around the own vehicle, which is stored in the format database 14-4. The control unit 10-2 calculates the time difference Ts, which is the difference between the acquired update time and the current time, and stores the time difference Ts in the information storage unit 10-4.
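The calculation of the time difference Ts in step S2118 can be sketched as follows. This is an illustrative sketch; the function name and the use of `datetime` values are assumptions, not part of the embodiment.

```python
from datetime import datetime

def time_difference_ts(update_time: datetime, current_time: datetime) -> float:
    """Return Ts [s], the difference between the update time of the dynamic
    information stored in the format database 14-4 and the current time
    (step S2118). Assumes both timestamps are in the same time base."""
    return (current_time - update_time).total_seconds()
```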
The autonomous moving body 12, which is the own vehicle, recognizes whether an intersection exists in an image captured by the front vehicle-mounted camera. In step S2101, the system control apparatus 10 determines whether the front vehicle-mounted camera recognizes an intersection by receiving, from the own vehicle, information as to whether the own vehicle is recognizing the intersection. It should be noted that the autonomous moving body 12, which is the own vehicle, may transmit the image captured by the front vehicle-mounted camera to the system control apparatus 10, and the system control apparatus 10 may execute processing for recognizing whether an intersection exists in the received image. In the recognition of an intersection, a map stored in the information storage unit 10-4 and GPS information from the detection unit 12-1 may be used, and the intersection may be recognized when the intersection comes to be 100 meters ahead of the own vehicle. The system control apparatus 10 terminates the processing if the front vehicle-mounted camera does not recognize an intersection. The system control apparatus 10 executes the processing of step S2102 if the front vehicle-mounted camera recognizes an intersection.
In step S2102, the system control apparatus 10 calculates the latitude and longitude of a recognized intersection. Specifically, the distance to an intersection can be acquired by using the front vehicle-mounted camera of the autonomous moving body 12, which is the own vehicle, and the latitude and longitude of the intersection can be determined from the own vehicle position. Similar to when the speed of an oncoming vehicle was acquired, the distance to the intersection can be measured by a technique that applies image phase detection AF of the front vehicle-mounted camera. In addition, as an alternative method, the position coordinates of the next intersection may be calculated based on the latitude and longitude position information of the own vehicle calculated by the GPS held by the detection unit 12-1 of the autonomous moving body 12, which is the own vehicle, and a map that has been stored in the information storage unit 10-4.
In step S2103, the system control apparatus 10 determines the arrival prediction time Tm, which is the predicted time it takes for the own vehicle to arrive at the intersection. The system control apparatus 10 can determine the arrival prediction time Tm from the distance to the intersection obtained in step S2102 and the traveling speed of the own vehicle.
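The determination of the arrival prediction time Tm in step S2103 can be sketched as follows. This is an illustrative sketch; the function name and units are assumptions, and the traveling speed is assumed to be roughly constant and non-zero.

```python
def arrival_prediction_time(distance_m: float, speed_mps: float) -> float:
    """Tm [s], the predicted time for the own vehicle to arrive at the
    intersection (step S2103): distance to the intersection obtained in
    step S2102 divided by the traveling speed of the own vehicle."""
    if speed_mps <= 0.0:
        raise ValueError("traveling speed must be positive")
    return distance_m / speed_mps
```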
According to the present embodiment, in particular, a high-speed search for a collision factor is realized by the configuration explained below. In step S2104, the system control apparatus 10 acquires the search start point from the traveling speed of the oncoming vehicle obtained in step S2117. In the Second Embodiment, the search was started from the space of the position of the intersection. In the Third Embodiment, a search is started from the space in which a collision factor with a high possibility of colliding with the own vehicle is estimated to currently exist.
In the present embodiment, a space in which a collision factor with a high possibility of colliding with the own vehicle is estimated to currently exist is determined by using the position coordinates of the intersection, the time difference Ts between the update time of the dynamic information and the current time, the arrival prediction time Tm, which is the predicted time it takes for the own vehicle to arrive at the intersection, and the traveling speed St of the oncoming vehicle. The coordinates of this estimated space are the coordinates of a space in which the collision factor arriving at the intersection after Tm has elapsed is considered to be currently traveling, and serve as the search start point. The position coordinates of the intersection were obtained in the processing of step S2102. The time difference Ts between the update time of the dynamic information and the current time was obtained in the processing of step S2118. The arrival prediction time Tm, which is the time it takes for the own vehicle to arrive at the intersection, was obtained in step S2103. The traveling speed St of the oncoming vehicle was obtained in step S2117.
Specifically, the search start point is set to a position separated by a distance shown by Formula 6 along the path of the oncoming lane from the position coordinates of the intersection.
(Tm+Ts)×St (Formula 6)
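The offset of Formula 6 can be sketched as follows. This is an illustrative sketch; the function name is an assumption, and placing the start point along the path of the oncoming lane is done by the system control apparatus 10 as described above.

```python
def search_start_distance(tm: float, ts: float, st: float) -> float:
    """Distance [m] from the position coordinates of the intersection, along
    the path of the oncoming lane, to the search start point (Formula 6).

    tm: arrival prediction time Tm [s] of the own vehicle (step S2103).
    ts: time difference Ts [s] of the dynamic information (step S2118).
    st: traveling speed St [m/s] of the oncoming vehicle (step S2117).
    """
    return (tm + ts) * st
```

Intuitively, a collision factor that would reach the intersection in Tm seconds, observed with data that is Ts seconds old, currently lies about (Tm + Ts) × St meters before the intersection.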
In step S2119, the system control apparatus 10 creates a list A of unique identifiers (hereinafter, “unique identifier list A”) toward the intersection from the search start point acquired in step S2104. Because the creation method of the list is similar to that of the Second Embodiment, a detailed explanation thereof will be omitted.
In step S2120, the system control apparatus 10 creates a list B of unique identifiers (hereinafter “unique identifier list B”) in the opposite direction of the intersection from the search start point acquired in step S2104. Because the creation method of the list is similar to that of the Second Embodiment, a detailed explanation thereof will be omitted.
The range of the unique identifier list A created in step S2119, and the range of the unique identifier list B created in step S2120 will be explained with reference to
The system control apparatus 10 detects dynamic information within a unique identifier in the direction of arrow 2205 from the search start point of mark 2202 and stores the dynamic information in the unique identifier list A. The unique identifier list A is a first-in-first-out (FIFO) structure, such that the first stored unique identifier is the first to be read out. As a result, the dynamic information of a unique identifier in the area of the area 2203 is listed in the unique identifier list A.
The system control apparatus 10 detects dynamic information within a unique identifier in the direction of an arrow 2206 from the search start point of mark 2202 and stores the dynamic information in the unique identifier list B. The unique identifier list B is a first-in-first-out (FIFO) structure, such that the first stored unique identifier is the first to be read out. As a result, the dynamic information of the unique identifier in the area of area 2204 is listed in the unique identifier list B. That the unique identifier list A and the unique identifier list B have a FIFO structure is an example of performing control so as to avoid a collision based on information of the collision factor in sequential order from information of the collision factor that has been stored first in the unique identifier list, according to a level of collision risk.
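The construction of the FIFO unique identifier lists A and B in steps S2119 and S2120 can be sketched as follows. This is an illustrative sketch: `detect_dynamic_info` is a hypothetical stand-in for the system control apparatus 10 querying the dynamic information of one space, and the function name is an assumption.

```python
from collections import deque

def build_fifo_list(identifiers, detect_dynamic_info):
    """Build a unique identifier list (A or B) as a FIFO queue.

    identifiers: unique identifiers ordered outward from the search start
    point (toward the intersection for list A, away from it for list B).
    detect_dynamic_info: hypothetical lookup returning the dynamic
    information of a space, or None if the space holds none.
    """
    fifo = deque()
    for uid in identifiers:
        info = detect_dynamic_info(uid)
        if info is not None:
            fifo.append(info)  # the first stored entry is the first read out
    return fifo
```

Because the identifiers are enumerated outward from the search start point, the FIFO order means the spaces nearest the start point, where a collision factor most likely exists, are examined first.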
The explanation of
In step S2105, the system control apparatus 10 confirms whether there is an unconfirmed unique identifier in the unique identifier list A and the unique identifier list B. The system control apparatus 10 terminates the processing in a case in which all of the moving body information of the unique identifiers listed in both lists has been confirmed. The system control apparatus 10 executes the processing of step S2121 in a case in which not all of the moving body information of the unique identifiers listed in both lists has been confirmed.
In step S2121, the system control apparatus 10 acquires unique identifier data so as to process the data in the unique identifier list A alternately with the data in the unique identifier list B. In a case in which the system control apparatus 10 has acquired the data of the unique identifier of the head of the unique identifier list A in the previous step S2121, the system control apparatus 10 acquires the data of the unique identifier of the head of the unique identifier list B in the current step S2121. In a case in which the system control apparatus 10 has acquired the data of the unique identifier of the head of the unique identifier list B in the previous step S2121, the system control apparatus 10 acquires the data of the second unique identifier from the head of the unique identifier list A in the current step S2121. By alternating between the confirmation of unique identifier list A and unique identifier list B, the system control apparatus 10 can confirm from the unique identifier closest to the mark 2202, thereby reducing the time required to identify a collision factor.
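The alternating acquisition of step S2121 can be sketched as follows. This is an illustrative sketch; the generator name is an assumption, and the two arguments correspond to unique identifier list A and unique identifier list B.

```python
from collections import deque

def alternate_search(list_a: deque, list_b: deque):
    """Yield entries alternately from unique identifier list A and
    unique identifier list B (step S2121), consuming each as a FIFO so
    that spaces closest to the search start point are confirmed first."""
    queues = [list_a, list_b]
    turn = 0
    while list_a or list_b:
        q = queues[turn % 2]
        if q:
            yield q.popleft()  # head of the list, stored first, read first
        turn += 1
```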
According to the Second Embodiment and Third Embodiment explained above, it is possible to provide a blind spot accident reduction system by using an autonomous moving body control system.
As explained above, according to each of the embodiments of the present invention, the format of the digital architecture, and an autonomous moving body control system using the same, can be provided more efficiently while taking safety into consideration.
It should be noted that in each of the embodiments described above, an example in which a control system is applied to an autonomous moving body has been explained. However, the present invention is not limited to an autonomous moving body such as an Automatic Guided Vehicle (AGV) or Autonomous Mobile Robot (AMR). For example, an autonomous moving body may be any moving apparatus that moves, such as an automobile, a train, a ship, an airplane, a robot, a drone, and the like. In addition, the control system of the present invention may be partially mounted thereon, or may not be mounted thereon. Furthermore, the present invention can also be applied in a case in which a moving body is controlled remotely.
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2022-145035, filed Sep. 13, 2022, which is hereby incorporated by reference herein in its entirety.