This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2021-028504 filed on Feb. 25, 2021, the content of which is incorporated herein by reference.
This invention relates to a map generation apparatus configured to generate a map around a subject vehicle.
Conventionally, there is a known apparatus in which white lines of a lane and a parking lot frame are recognized using an image captured by a camera mounted on a vehicle, and the recognition results of the white lines are used for vehicle driving control and parking support. Such an apparatus is described, for example, in Japanese Unexamined Patent Publication No. 2014-104853 (JP2014-104853A). In the apparatus disclosed in JP2014-104853A, edge points at which a change in luminance in the captured image is equal to or greater than a threshold are extracted, and the white lines are recognized based on the edge points.
In the apparatus described in JP2014-104853A, a white line is recognized only for a lane on which the subject vehicle has actually traveled. Therefore, in order to generate a map including position information of white lines, the subject vehicle must actually travel in each lane, and it is difficult to generate the map efficiently.
An aspect of the present invention is a map generation apparatus including a detection device that detects an external situation around a subject vehicle, and an electronic control unit including a microprocessor and a memory connected to the microprocessor. The microprocessor is configured to simultaneously generate a first map for a current lane on which the subject vehicle travels and a second map for an opposite lane opposite to the current lane, based on the external situation detected by the detection device.
The objects, features, and advantages of the present invention will become clearer from the following description of embodiments in relation to the attached drawings, in which:
Hereinafter, an embodiment of the present invention is explained with reference to the accompanying drawings.
First, the general configuration of the subject vehicle for self-driving will be explained.
The term external sensor group 1 herein is a collective designation encompassing multiple sensors (external sensors) for detecting external circumstances constituting subject vehicle ambience data. For example, the external sensor group 1 includes, inter alia, a LIDAR (Light Detection and Ranging) for measuring distance from the subject vehicle to ambient obstacles by measuring scattered light produced by laser light radiated from the subject vehicle in every direction, a RADAR (Radio Detection and Ranging) for detecting other vehicles and obstacles around the subject vehicle by radiating electromagnetic waves and detecting reflected waves, and on-board cameras equipped with CCD, CMOS, or other image sensors for imaging the subject vehicle ambience (forward, rearward and sideways).
The term internal sensor group 2 herein is a collective designation encompassing multiple sensors (internal sensors) for detecting the driving state of the subject vehicle. For example, the internal sensor group 2 includes, inter alia, a vehicle speed sensor for detecting the vehicle speed of the subject vehicle, acceleration sensors for detecting forward-rearward direction acceleration and lateral acceleration of the subject vehicle, respectively, a rotational speed sensor for detecting the rotational speed of the travel drive source, a yaw rate sensor for detecting rotational angular speed around a vertical axis passing through the center of gravity of the subject vehicle, and the like. The internal sensor group 2 also includes sensors for detecting driver driving operations in manual drive mode, including, for example, accelerator pedal operations, brake pedal operations, steering wheel operations and the like.
The term input/output device 3 is used herein as a collective designation encompassing apparatuses receiving instructions input by the driver and outputting information to the driver. The input/output device 3 includes, inter alia, switches which the driver uses to input various instructions, a microphone which the driver uses to input voice instructions, a display for presenting information to the driver via displayed images, and a speaker for presenting information to the driver by voice.
The position measurement unit (GNSS unit) 4 includes a position measurement sensor for receiving signals from positioning satellites to measure the location of the subject vehicle. The positioning satellites are satellites such as GPS satellites and Quasi-Zenith satellites. The position measurement unit 4 measures the absolute position (latitude, longitude and the like) of the subject vehicle based on the signals received by the position measurement sensor.
The map database 5 is a unit storing general map data used by the navigation unit 6 and is, for example, implemented using a magnetic disk or semiconductor element. The map data include road position data and road shape (curvature etc.) data, along with intersection and road branch position data. The map data stored in the map database 5 are different from high-accuracy map data stored in a memory unit 12 of the controller 10.
The navigation unit 6 retrieves target road routes to destinations input by the driver and performs guidance along selected target routes. Destination input and target route guidance are performed through the input/output device 3. Target routes are computed based on the current position of the subject vehicle measured by the position measurement unit 4 and map data stored in the map database 5. Alternatively, the current position of the subject vehicle can be measured using values detected by the external sensor group 1, and the target route may be calculated on the basis of this current position and the high-accuracy map data stored in the memory unit 12.
The communication unit 7 communicates through networks including the Internet and other wireless communication networks to access servers (not shown in the drawings) to acquire map data, travel history information, traffic data and the like, periodically or at arbitrary times. In addition to acquiring such information, the communication unit 7 may transmit travel history information of the subject vehicle to the server. The networks include not only public wireless communications networks, but also closed communications networks, such as wireless LAN, Wi-Fi and Bluetooth, which are established for a predetermined administrative area. Acquired map data are output to the map database 5 and/or memory unit 12 via the controller 10 to update their stored map data.
The actuators AC are actuators for traveling of the subject vehicle. If the travel drive source is the engine, the actuators AC include a throttle actuator for adjusting the opening angle of the throttle valve of the engine (throttle opening angle). If the travel drive source is the travel motor, the actuators AC include the travel motor. The actuators AC also include a brake actuator for operating a braking device and a turning actuator for turning the front wheels FW.
The controller 10 is constituted by an electronic control unit (ECU). More specifically, the controller 10 incorporates a computer including a CPU or other processing unit (a microprocessor) 11 for executing processing related to travel control, the memory unit (a memory) 12 of RAM, ROM and the like, and an input/output interface or other peripheral circuits not shown in the drawings.
The memory unit 12 stores high-accuracy detailed road map data (road map information) for self-driving. The road map information includes information on road position, information on road shape (curvature, etc.), information on gradient of the road, information on position of intersections and branches, information on type and position of division lines such as white lines, information on the number of lanes, information on the width of each lane and the position of each lane (center position of lane and boundary line of lane), information on position of landmarks (traffic lights, signs, buildings, etc.) serving as marks on the map, and information on the road surface profile such as unevenness of the road surface. The map information stored in the memory unit 12 includes map information (referred to as external map information) acquired from outside the subject vehicle through the communication unit 7, and map information (referred to as internal map information) created by the subject vehicle itself using the detection values of the external sensor group 1, or the detection values of the external sensor group 1 and the internal sensor group 2.
The external map information is, for example, information of a map (called a cloud map) acquired through a cloud server, and the internal map information is information of a map (called an environmental map) consisting of point cloud data generated by mapping using a technique such as SLAM (Simultaneous Localization and Mapping). The external map information is shared by the subject vehicle and other vehicles, whereas the internal map information is unique map information of the subject vehicle (e.g., map information that the subject vehicle has alone). In an area in which no external map information exists, such as a newly established road, an environmental map is created by the subject vehicle itself. The internal map information may be provided to the server or another vehicle via the communication unit 7. The memory unit 12 also stores information such as programs for various controls, and thresholds used in the programs.
As functional configurations in relation to mainly self-driving, the processing unit 11 includes a subject vehicle position recognition unit 13, an external environment recognition unit 14, an action plan generation unit 15, a driving control unit 16, and a map generation unit 17.
The subject vehicle position recognition unit 13 recognizes the position of the subject vehicle (subject vehicle position) on the map based on position information of the subject vehicle calculated by the position measurement unit 4 and map information stored in the map database 5. Optionally, the subject vehicle position can be recognized using the map information stored in the memory unit 12 and ambience data of the subject vehicle detected by the external sensor group 1, whereby the subject vehicle position can be recognized with high accuracy. Alternatively, movement information (movement direction, movement distance) of the subject vehicle can be calculated from the detection values of the internal sensor group 2, and the subject vehicle position can be recognized from this movement information. Optionally, when the subject vehicle position can be measured by sensors installed on or beside the road, the subject vehicle position can be recognized with high accuracy by communicating with such sensors through the communication unit 7.
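For illustration only, the following is a minimal sketch of such dead reckoning from the internal sensor values; the function and its parameter names are hypothetical, and a constant sampling interval is assumed.

```python
import math

def dead_reckon(x, y, heading, speed, yaw_rate, dt):
    """Advance a 2D pose estimate by one sample of the internal sensors.

    x, y:     current position [m]
    heading:  current movement direction [rad]
    speed:    value from the vehicle speed sensor [m/s]
    yaw_rate: value from the yaw rate sensor [rad/s]
    dt:       sampling interval [s] (assumed constant for this sketch)
    """
    heading += yaw_rate * dt                # update the movement direction
    x += speed * dt * math.cos(heading)     # movement distance along heading
    y += speed * dt * math.sin(heading)
    return x, y, heading
```

Accumulated over successive samples, the movement direction and movement distance yield the subject vehicle position relative to a known starting point.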
The external environment recognition unit 14 recognizes external circumstances around the subject vehicle based on signals from cameras, LIDARs, RADARs and the like of the external sensor group 1. For example, it recognizes the position, speed and acceleration of nearby vehicles (forward vehicle or rearward vehicle) driving in the vicinity of the subject vehicle, the position of vehicles stopped or parked in the vicinity of the subject vehicle, and the position and state of other objects. Other objects include traffic signs, traffic lights, roads, buildings, guardrails, power poles, commercial signs, pedestrians, bicycles, and the like. The other objects (road) also include road division lines (white lines, etc.) and stop lines. Recognized states of other objects include, for example, traffic light color (red, green or yellow) and the moving speed and direction of pedestrians and bicycles. Some of the stationary objects among these constitute landmarks serving as indices of position on the map, and the external environment recognition unit 14 also recognizes the position and type of each landmark.
The action plan generation unit 15 generates a driving path (target path) of the subject vehicle from the present time point to a certain time ahead based on, for example, the target route computed by the navigation unit 6, map information stored in the memory unit 12, the subject vehicle position recognized by the subject vehicle position recognition unit 13, and the external circumstances recognized by the external environment recognition unit 14. When multiple paths are available on the target route as target path candidates, the action plan generation unit 15 selects from among them the path that optimally satisfies legal compliance, safe and efficient driving, and other criteria, and defines the selected path as the target path. The action plan generation unit 15 then generates an action plan matched to the generated target path. An action plan is also called a “travel plan”. The action plan generation unit 15 generates various kinds of action plans corresponding to, for example, overtake traveling for overtaking the forward vehicle, lane-change traveling to move from one traffic lane to another, following traveling to follow the preceding vehicle, lane-keep traveling to maintain the same lane, and deceleration or acceleration traveling. When generating a target path, the action plan generation unit 15 first decides a drive mode and generates the target path in line with the drive mode.
In self-drive mode, the driving control unit 16 controls the actuators AC to drive the subject vehicle along the target path generated by the action plan generation unit 15. More specifically, the driving control unit 16 calculates the required driving force for achieving the target acceleration calculated by the action plan generation unit 15 for each unit time, taking running resistance caused by road gradient and the like into account. The driving control unit 16 then feedback-controls the actuators AC to bring the actual acceleration detected by the internal sensor group 2 into coincidence with the target acceleration. In other words, the driving control unit 16 controls the actuators AC so that the subject vehicle travels at the target speed and target acceleration. On the other hand, in manual drive mode, the driving control unit 16 controls the actuators AC in accordance with driving instructions by the driver (steering operation and the like) acquired from the internal sensor group 2.
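As a rough sketch of this feedback control, assuming a simple proportional law and illustrative parameter values (the embodiment does not prescribe a specific control law):

```python
def driving_force_command(mass, target_accel, actual_accel,
                          running_resistance, kp=500.0):
    """Required driving force with proportional feedback on acceleration.

    mass:               vehicle mass [kg]
    target_accel:       target acceleration for the current unit time [m/s^2]
    actual_accel:       acceleration detected by the internal sensor group [m/s^2]
    running_resistance: resistance force due to road gradient etc. [N]
    kp:                 illustrative feedback gain [N/(m/s^2)], not from the source
    """
    feedforward = mass * target_accel + running_resistance  # required force
    feedback = kp * (target_accel - actual_accel)           # correct the error
    return feedforward + feedback
```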
The map generation unit 17 generates the environment map constituted by three-dimensional point cloud data using detection values detected by the external sensor group 1 during traveling in the manual drive mode. Specifically, an edge indicating an outline of an object is extracted from a camera image acquired by the camera based on luminance and color information for each pixel, and a feature point is extracted using the edge information. The feature point is, for example, an intersection of the edges, and corresponds to a road division line, a corner of a building, a corner of a road sign, or the like. The map generation unit 17 calculates the distance to the extracted feature point and sequentially plots the feature point on the environment map, thereby generating the environment map around the road on which the subject vehicle has traveled. The environment map may be generated by extracting the feature point of an object around the subject vehicle using data acquired by radar or LIDAR instead of the camera.
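The edge and feature point extraction can be sketched as follows. This sketch uses generic OpenCV operators (Canny edges and a corner detector) as stand-ins, since the embodiment does not prescribe specific operators; all thresholds are illustrative.

```python
import cv2
import numpy as np

def extract_feature_points(image_bgr):
    """Extract object outlines (edges) and corner-like feature points.

    The corners roughly correspond to intersections of edges, such as
    division lines, corners of buildings, and corners of road signs.
    """
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    # Edges: pixels where luminance changes sharply (object outlines).
    edges = cv2.Canny(gray, threshold1=100, threshold2=200)
    # Corner-like feature points as a stand-in for edge intersections.
    corners = cv2.goodFeaturesToTrack(
        gray, maxCorners=500, qualityLevel=0.01, minDistance=5)
    points = (corners.reshape(-1, 2) if corners is not None
              else np.empty((0, 2)))
    return edges, points
```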
The subject vehicle position recognition unit 13 performs subject vehicle position estimation processing in parallel with map creation processing by the map generation unit 17. That is, the position of the subject vehicle is estimated based on a change in the position of the feature point over time. The map creation processing and the position estimation processing are simultaneously performed, for example, according to an algorithm of SLAM using signals from the camera or LIDAR. The map generation unit 17 can generate the environment map not only when the vehicle travels in the manual drive mode but also when the vehicle travels in the self-drive mode. If the environment map has already been generated and stored in the memory unit 12, the map generation unit 17 may update the environment map with a newly obtained feature point.
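Position estimation from the change in feature point positions over time can be illustrated by the classical rigid-alignment step below (a least-squares fit via SVD). This is one standard ingredient of SLAM-style processing, offered as an assumption rather than the specific algorithm of the embodiment.

```python
import numpy as np

def estimate_motion(prev_pts, curr_pts):
    """Fit a 2D rotation R and translation t mapping prev_pts to curr_pts.

    prev_pts, curr_pts: (N, 2) arrays of the same feature points observed
    at consecutive time steps. The inverse transform corresponds to the
    movement of the subject vehicle between the two observations.
    """
    p0 = prev_pts.mean(axis=0)
    c0 = curr_pts.mean(axis=0)
    H = (prev_pts - p0).T @ (curr_pts - c0)   # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                  # avoid a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = c0 - R @ p0
    return R, t
```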
A configuration of a map generation apparatus according to the present embodiment will be described.
As illustrated in the drawings, the map generation apparatus 50 according to the present embodiment includes a camera 1a, a sensor 2a, and the controller 10.
In the driving scene assumed here, the subject vehicle 101 travels on a current lane (first lane LN1) while an opposite lane (second lane LN2) for oncoming traffic runs alongside it, and AR1 denotes the area detectable by the camera 1a.
In such a driving scene, by extracting edge points from a camera image acquired while the subject vehicle 101 travels on the current lane, it is possible to generate a map of the current lane (first lane LN1) included in the detectable area AR1. However, if the subject vehicle 101 must actually travel on the opposite lane in order to generate a map of the opposite lane (second lane LN2), the map cannot be generated efficiently. Therefore, in order to efficiently generate maps of both the current lane and the opposite lane, the present embodiment configures the map generation apparatus as follows.
The camera 1a is a monocular camera having an imaging element (image sensor) such as a CCD or a CMOS, and constitutes a part of the external sensor group 1.
The sensor 2a is a detection part used to calculate a movement amount and a movement direction of the subject vehicle 101. The sensor 2a is a part of the internal sensor group 2, and includes, for example, a vehicle speed sensor and a yaw rate sensor. That is, the controller 10 (for example, the subject vehicle position recognition unit 13) calculates the movement amount and the movement direction of the subject vehicle 101 on the basis of detection values of the sensor 2a.
The controller 10 includes, as functional configurations related to map generation, an accuracy determination unit 141 and the map generation unit 17.
The accuracy determination unit 141 determines whether or not the detection accuracy of the second lane LN2 by the camera 1a is a predetermined value a or more on the basis of the camera image acquired by the camera 1a at the time of traveling on the first lane LN1. This determination is a determination as to whether or not the division lines L2 and L3 of the second lane LN2 are included in the detectable area AR1 of the camera 1a. For example, when both of the division lines L2 and L3 are included in the detectable area AR1, it is determined that the detection accuracy is the predetermined value a or more.
On the other hand, when at least one of the division lines L2 and L3 of the second lane LN2 is not included in the detectable area AR1, it is determined that the detection accuracy is less than the predetermined value a.
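This determination can be pictured as a simple geometric test. In the sketch below, the detectable area AR1 is modeled, purely as an assumption, as a wedge defined by a maximum range and a horizontal field of view in vehicle coordinates; the function names and thresholds are illustrative.

```python
import math

def inside_ar1(points, max_range=50.0, fov_deg=90.0):
    """True if every sampled point of a division line lies inside AR1,
    modeled as a wedge in vehicle coordinates (x forward, y left)."""
    half_fov = math.radians(fov_deg) / 2.0
    return all(math.hypot(x, y) <= max_range and
               abs(math.atan2(y, x)) <= half_fov
               for x, y in points)

def accuracy_is_sufficient(l2_points, l3_points):
    """Treat the detection accuracy as the predetermined value a or more
    only when both division lines L2 and L3 fall inside AR1."""
    return inside_ar1(l2_points) and inside_ar1(l3_points)
```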
The map generation unit 17 has an outward path map generation unit 171 that generates an environmental map (outward path map) of the outward path, which is the first lane LN1, and a return path map generation unit 172 that generates an environmental map (return path map) of the return path, which is the second lane LN2. At the time of traveling on the outward path in the manual drive mode, the outward path map generation unit 171 extracts feature points of objects (the building, the division lines L1 and L2, and the like) around the subject vehicle 101 on the basis of the camera image acquired by the camera 1a, and estimates the position of the subject vehicle using the sensor 2a, thereby generating the environmental map of the outward path. The generated outward path map is stored in the memory unit 12. At this time, the outward path map generation unit 171 recognizes the positions of the division lines L1 and L2 from the camera image and includes them in the map information of the outward path.
When it is determined by the accuracy determination unit 141 that the detection accuracy of the second lane LN2 by the camera 1a is the predetermined value a or more at the time of traveling on the outward path in the manual drive mode, the return path map generation unit 172 generates the environmental map of the return path. That is, in this case, the division lines L2 and L3 of the second lane LN2 are included in the detectable area AR1, so the return path map generation unit 172 extracts their feature points from the same camera image and generates the environmental map of the return path in parallel with the generation of the outward path map. The generated return path map is stored in the memory unit 12.
On the other hand, when it is determined by the accuracy determination unit 141 that the detection accuracy is less than the predetermined value a at the time of traveling on the outward path in the manual drive mode, the return path map generation unit 172 generates the return path map as follows. First, a boundary line L0 between the first lane LN1 and the second lane LN2 is set on the basis of the camera image. Next, the environmental map of the outward path is moved symmetrically with the boundary line L0 as a symmetry axis; that is, the outward path map is inverted in a line-symmetric manner by mirroring. As a result, a map corresponding to the second lane LN2 is obtained on the opposite side of the boundary line L0.
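The mirroring itself is a line-symmetric reflection of map points across L0. A minimal sketch, assuming the map is held as 2D feature points and L0 is given by two points on it:

```python
import numpy as np

def mirror_map(points, p0, p1):
    """Reflect outward path map points across the boundary line L0.

    points: (N, 2) feature points of the outward path map
    p0, p1: two points defining the boundary line L0
    Returns the reflected points, i.e. the temporary return path map.
    """
    p0 = np.asarray(p0, dtype=float)
    d = np.asarray(p1, dtype=float) - p0
    d /= np.linalg.norm(d)                 # unit direction of L0
    M = 2.0 * np.outer(d, d) - np.eye(2)   # reflection matrix about L0
    return (np.asarray(points, dtype=float) - p0) @ M.T + p0
```

For example, if L0 is the x-axis (p0 = (0, 0), p1 = (1, 0)), a point (2, 3) is mapped to (2, -3), as expected for a line-symmetric inversion.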
The return path map obtained by the mirroring is not obtained by actually imaging the second lane LN2, but is a map predicted on the assumption that the first lane LN1 and the second lane LN2 are symmetric. Therefore, the obtained return path map is a simple map and corresponds to a temporary map. After generating the temporary map, the return path map generation unit 172 updates the map information of the temporary map with the camera image obtained when the subject vehicle 101 travels on the return path in the manual drive mode, for example. That is, the feature points newly extracted from the camera image at the time of traveling on the return path are used to correct the predicted positions in the temporary map.
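Updating the temporary map can be sketched as replacing predicted points by actual observations where the two overlap; the matching radius below is an illustrative tuning value, not taken from the embodiment.

```python
import numpy as np

def update_temporary_map(temp_points, observed_points, radius=0.5):
    """Merge return-path observations into the mirrored temporary map.

    A temporary (predicted) point lying within `radius` [m] of an
    observed feature point is discarded in favor of the observation;
    points with no nearby observation are kept as predictions.
    """
    observed_points = np.asarray(observed_points, dtype=float).reshape(-1, 2)
    kept = [p for p in np.asarray(temp_points, dtype=float)
            if observed_points.size == 0
            or np.linalg.norm(observed_points - p, axis=1).min() > radius]
    kept = np.asarray(kept, dtype=float).reshape(-1, 2)
    return np.vstack([kept, observed_points])
```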
The updated map corresponds to the environmental map obtained from the camera image at the time of traveling on the return path, and is the complete environmental map of the return path. Because the temporary map of the return path has been generated in advance, it is not necessary to generate the return path map from scratch at the time of traveling on the return path. Therefore, the return path map can be generated efficiently, and the processing load of the controller 10 can be reduced. In this way, in the present embodiment, when the environmental map of the outward path is generated, the environmental map of the return path is generated simultaneously.
The action plan generation unit 15 sets a target route for traveling on the return path using the map information of the environmental map of the return path (second lane LN2) obtained at the time of traveling on the outward path (first lane LN1). The driving control unit 16 then controls the actuators AC so that the subject vehicle 101 travels on the return path in the self-drive mode along the set target route.
As illustrated in the flowchart of the processing executed by the controller 10, first, in S1, the camera image acquired by the camera 1a is read. Next, in S2, an environmental map (outward path map) of the outward path (first lane LN1) is generated on the basis of the read camera image, and in S3, it is determined whether or not the detection accuracy of the second lane LN2 by the camera 1a is the predetermined value a or more.
In S4, an environmental map (return path map) of the return path (second lane LN2) is generated on the basis of the camera image read in S1. For example, when the detection accuracy determined in S3 is the predetermined value a or more, the return path map is generated from the feature points of the second lane LN2 extracted from the camera image; when the detection accuracy is less than the predetermined value a, the return path map (temporary map) is generated by mirroring the outward path map generated in S2.
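Putting the steps together, the branch taken in S4 can be summarized by the sketch below. The function and parameter names are hypothetical, the inputs are assumed to have been produced by the processing of S1 to S3, and `mirror_map` is the reflection sketch shown earlier.

```python
import numpy as np

def generate_return_map(outward_points, lane2_points, accuracy,
                        threshold_a, boundary_p0, boundary_p1):
    """Illustrative decision of S4: how the return path map is generated.

    outward_points: feature points of the outward path map (from S2)
    lane2_points:   feature points observed for the second lane, if any
    accuracy:       detection accuracy of the second lane (from S3)
    threshold_a:    the predetermined value a
    boundary_p0/p1: two points defining the boundary line L0
    """
    if accuracy >= threshold_a:
        # Return path map from the actually imaged second lane.
        return np.asarray(lane2_points, dtype=float)
    # Temporary map by mirroring the outward path map across L0.
    return mirror_map(outward_points, boundary_p0, boundary_p1)
```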
The operation of the map generation apparatus 50 according to the present embodiment is summarized as follows. When the subject vehicle 101 travels on the outward path (first lane LN1) in the manual drive mode, the outward path map is generated on the basis of the camera image. When the division lines L2 and L3 of the second lane LN2 are included in the detectable area AR1 of the camera 1a, that is, when the detection accuracy is the predetermined value a or more, the return path map is generated simultaneously from the same camera image.
On the other hand, when the detection accuracy of the second lane LN2 is less than the predetermined value a, the return path map (temporary map) is generated by mirroring the outward path map with the boundary line L0 between the lanes as a symmetry axis.
As a result, it is possible to generate the return path map even before the vehicle actually travels on the return path in the manual drive mode. Therefore, it is possible to set the target route when the vehicle travels on the return path in the self-drive mode on the basis of the return path map, and the vehicle can travel on the return path in the self-drive mode. In this case, when the detection accuracy of the second lane LN2 by the camera 1a is the predetermined value a or more, the map generation is performed using the actual camera image in preference to the map generation by the mirroring, so that the return path map can be created with high accuracy.
According to the present embodiment, the following functions and effects can be achieved.
(1) The map generation apparatus 50 includes a camera 1a that detects an external situation around the subject vehicle 101, and a map generation unit 17 that simultaneously generates an outward path map for a current lane (first lane LN1) on which the subject vehicle 101 travels and a return path map for an opposite lane (second lane LN2) opposite to the current lane, on the basis of the external situation detected by the camera 1a. This makes it possible to generate a map of a lane on which the subject vehicle 101 has not yet traveled, so that maps can be generated efficiently.
(2) The map generation apparatus 50 further includes an accuracy determination unit 141 that determines whether or not the detection accuracy of the external situation for the second lane LN2 detected by the camera 1a is a predetermined value a or more. When the detection accuracy is determined to be the predetermined value a or more, the return path map generation unit 172 generates the return path map on the basis of the camera image acquired at the time of traveling on the first lane LN1.
(3) When it is determined by the accuracy determination unit 141 that the detection accuracy is less than the predetermined value a, the return path map generation unit 172 generates the return path map by symmetrically moving (moving in a line-symmetric manner) the outward path map with the boundary line L0 between the first lane LN1 and the second lane LN2 as a symmetry axis. As a result, a return path map can be obtained even when the second lane LN2 cannot be detected with sufficient accuracy.
(4) The return path map generation unit 172 simultaneously generates the outward path map and the return path map while the subject vehicle 101 travels on the first lane LN1, on the basis of the external situation detected by the camera 1a when the subject vehicle 101 travels on the first lane LN1. The two maps can therefore be obtained in a single traverse of the first lane LN1.
(5) The map generation apparatus 50 further includes an action plan generation unit 15 (a route setting unit) that sets a target route for traveling on the second lane LN2 on the basis of the return path map generated by the map generation unit 17. This enables the subject vehicle 101 to travel on the second lane LN2 in the self-drive mode using a map generated in advance.
The above embodiment may be modified into various forms. Some modifications will be described below. In the above embodiment, the external sensor group 1, which is an in-vehicle detector such as the camera 1a, detects the external situation around the subject vehicle 101. However, the external situation may be detected using an in-vehicle detector other than the camera 1a, such as a LIDAR, or a detection device other than an in-vehicle detector. For example, information from an in-vehicle detector (camera or the like) mounted on an oncoming vehicle traveling on the opposite lane may be acquired via the communication unit 7 and used to generate the outward path map or the return path map. In the above embodiment, the outward path map generation unit 171 generates the outward path map (a first map) on the basis of the external situation detected by the camera 1a when the subject vehicle 101 travels on the first lane LN1 (current lane), and the return path map (a second map) is generated in different modes depending on the determination result of the accuracy determination unit 141. However, the map generation unit may have any configuration as long as the first map and the second map are generated simultaneously.
In the above embodiment, when it is determined that the detection accuracy for the opposite lane detected by the camera 1a is less than the predetermined value a, the return path map is generated by symmetrically moving (moving in a line-symmetric manner) the outward path map with the boundary line L0 between the current lane and the opposite lane as a symmetry axis. However, the inversion mode of the outward path map is not limited to line symmetry about the boundary line. In the above embodiment, when it is determined that the detection accuracy for the opposite lane detected by the camera 1a is less than the predetermined value a, the action plan generation unit 15 as a route setting unit sets the target route for self-driving on the return path using the map (temporary map) generated by the mirroring. However, the target route for self-driving may instead be set using the complete return path map rather than the temporary map.
In the above embodiment, the outward path map and the return path map obtained when the subject vehicle 101 travels on the outward path are stored in the memory unit 12. However, these pieces of map information may be further transmitted to a server via the communication unit 7. In a case where there is another vehicle having a map generation function similar to that of the present embodiment, that is, another vehicle that generates a map on the basis of an external situation detected by a detection device, map information may be transmitted to and received from that vehicle via the communication unit 7.
In the above embodiment, the example in which the map generation apparatus is applied to a self-driving vehicle, that is, in which the self-driving vehicle generates the environmental map, has been described. However, the present invention can be similarly applied to a case where a manual driving vehicle, with or without a driving support function, generates the environmental map.
The present invention can also be used as a map generation method including detecting an external situation around a subject vehicle 101 by a detection device such as a camera 1a, and simultaneously generating a first map for a current lane LN1 on which the subject vehicle 101 travels and a second map for an opposite lane LN2 opposite to the current lane LN1, based on the detected external situation.
The above embodiment can be combined as desired with one or more of the above modifications. The modifications can also be combined with one another.
According to the present invention, it is possible to generate a map for a lane on which a subject vehicle has not traveled yet, and map generation can be performed efficiently.
Above, while the present invention has been described with reference to the preferred embodiments thereof, it will be understood, by those skilled in the art, that various changes and modifications may be made thereto without departing from the scope of the appended claims.