An embodiment of the present disclosure relates to a vehicle position estimation device and a vehicle control device.
Technologies for achieving automated valet parking, which includes automated parking and automated retrieval, are now being studied. In the automated parking, after an occupant gets out of a vehicle in a predetermined drop-off area within a parking lot, the vehicle autonomously moves from the drop-off area to an available parking space and parks itself there in response to a predetermined instruction. In the automated retrieval, after the automated parking is completed, the vehicle autonomously moves out of the parking space to a predetermined pick-up area and stops itself there in response to a predetermined call.
Patent Document 1: Japanese Unexamined Patent Application Publication No. 2015-41348 (JP 2015-41348 A)
In technologies based on autonomous travel, such as the automated valet parking described above, it is important to accurately find the current position of a vehicle during the autonomous travel. In this regard, one conventional method (what is called odometry) estimates the current position of a vehicle using a value detected by a wheel speed sensor or the like. However, this method may not always accurately find the current position of a vehicle because an error in the estimation result accumulates as the distance traveled by the vehicle increases.
Therefore, a purpose of an embodiment is to provide a vehicle position estimation device and a vehicle control device that are capable of accurately finding the current position of a vehicle.
A vehicle position estimation device according to an example of an embodiment includes the following: a parking lot data acquisition unit that acquires parking lot data capable of identifying an absolute orientation and an absolute position of a road surface marking provided on a road surface of a parking lot, an image data acquisition unit that acquires image data obtained by an on-board camera that captures a situation around a vehicle, and a position estimation unit that calculates a relative orientation and a relative position of the road surface marking with respect to the vehicle on the image data by detecting road surface marking data related to the road surface marking from the image data, and that estimates an actual orientation and an actual position of the vehicle on the basis of the calculated relative orientation, the calculated relative position, and the parking lot data.
The vehicle position estimation device described above is capable of accurately finding the current position (the actual position) and the current orientation (the actual orientation) of the vehicle by taking into account deviations of the theoretical position and the theoretical orientation of the road surface marking that are identified using the relative position and the relative orientation calculated on the basis of the image data from the (normal) absolute position and the (normal) absolute orientation of the road surface marking that are identified on the basis of the parking lot data.
In the vehicle position estimation device according to the example described above, the position estimation unit calculates the relative orientation and the relative position of the road surface marking that is located on either a left side or a right side of the vehicle by detecting the road surface marking data from side image data that is the image data representative of the situation on either the left side or the right side of the vehicle. This structure is capable of easily calculating the relative orientation and the relative position of the road surface marking by using the side image data that tends to capture the road surface marking.
Further, in the vehicle position estimation device according to the example described above, the parking lot data acquisition unit acquires, as the parking lot data, boundary line data capable of identifying the absolute orientation and the absolute position of a boundary line that is the road surface marking indicative of a boundary of a parking space pre-provided in the parking lot, and the position estimation unit calculates the relative orientation and the relative position of the boundary line by detecting, as the road surface marking data, a position of an end portion of the boundary line and an orientation of the boundary line on the image data, and estimates the actual orientation and the actual position of the vehicle on the basis of the calculated relative orientation, the calculated relative position, and the boundary line data. This structure is capable of easily estimating the actual orientation and the actual position of the vehicle by using the boundary line that is commonly provided as the road surface marking indicative of the boundary of the parking space.
In the above structure using the boundary line, of the boundary line on the image data, the position estimation unit detects, as the road surface marking data, the position of the end portion that is located on an opening portion side of the parking space that is delineated by the boundary line in such a manner as to have an opening portion, and a direction of extension of the boundary line including the end portion. This structure is capable of easily estimating the actual orientation and the actual position of the vehicle by using the position of the end portion of the boundary line that is located on the opening portion side of the parking space and the direction of extension of the boundary line including the end portion.
In the above structure using the boundary line, of the boundary line on the image data, the position estimation unit detects, as the road surface marking data, a position of a central point of the end portion and a direction of extension of the boundary line including the end portion. This structure is capable of easily estimating the actual orientation and the actual position of the vehicle by using the position of the central point of the end portion of the boundary line and the direction of extension of the boundary line including the end portion.
Further, in the vehicle position estimation device according to the example described above, the parking lot data acquisition unit acquires, as the parking lot data, first marker data capable of identifying the absolute orientation and the absolute position of a first marker that includes a first line segment and that is the road surface marking pre-provided around a route along which the vehicle travels in the parking lot, and the position estimation unit calculates the relative orientation and the relative position of the first marker with respect to the vehicle by detecting, as the road surface marking data, a position of the first marker and an orientation of the first line segment included in the first marker on the image data, and estimates the actual orientation and the actual position of the vehicle on the basis of the calculated relative orientation, the calculated relative position, and the first marker data. This structure is capable of easily estimating the actual orientation and the actual position of the vehicle by using the first marker.
Further, in the vehicle position estimation device according to the example described above, the parking lot data acquisition unit acquires, as the parking lot data, at least one of boundary line data and second marker data. The boundary line data is capable of identifying the absolute orientation and the absolute position of a boundary line that is the road surface marking indicative of a boundary of a parking space pre-provided in the parking lot. The second marker data is capable of identifying the absolute orientation and the absolute position of a second marker. The second marker includes a second line segment and is the road surface marking that is provided in an area around the boundary line and on the inside of a route along which the vehicle makes a turn in the parking lot. In this structure, when the vehicle makes the turn, the position estimation unit calculates the relative orientation and the relative position of at least one of the boundary line and the second marker that are located on the inside of the turn of the vehicle by detecting, as the road surface marking data, data related to the at least one of the boundary line and the second marker from inside image data that is the image data representative of the situation on the inside of the turn of the vehicle, and estimates the actual orientation and the actual position of the vehicle on the basis of the calculated relative orientation, the calculated relative position, and the at least one of the boundary line data and the second marker data corresponding to the detected road surface marking data. This structure is capable of accurately estimating the actual orientation and position of the vehicle during the turn by using at least one of the boundary line and the second marker.
Further, in the vehicle position estimation device according to the example described above, the position estimation unit detects, as the road surface marking data, a first value indicative of an orientation and a position of the road surface marking in a first coordinate system on the image data, converts the first value into a second value in a second coordinate system associated with the on-board camera, and converts the second value into a third value in a third coordinate system associated with the vehicle so as to calculate the third value as the relative orientation and the relative position of the road surface marking with respect to the vehicle. This structure is capable of easily calculating the relative orientation and the relative position of the road surface marking with respect to the vehicle by coordinate transformation.
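By way of illustration only, such a chain of coordinate transformations may be sketched in Python as follows. The names, the planar geometry, the scale parameter, and the camera mounting parameters are assumptions made for this sketch and do not limit the embodiment.

    import math

    def image_to_vehicle(px, py, m_per_dot, cam_offset, cam_yaw):
        # First value (pixel coordinates in the first coordinate system on
        # the image data) -> second value (meters in the second coordinate
        # system associated with the on-board camera).
        cx = px * m_per_dot
        cy = py * m_per_dot
        # Second value -> third value (the third coordinate system
        # associated with the vehicle): rotate by the camera mounting
        # angle and translate by the camera position in the vehicle frame.
        vx = cam_offset[0] + cx * math.cos(cam_yaw) - cy * math.sin(cam_yaw)
        vy = cam_offset[1] + cx * math.sin(cam_yaw) + cy * math.cos(cam_yaw)
        return vx, vy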
Further, in the vehicle position estimation device according to the example described above, the position estimation unit calculates the theoretical absolute orientation and the theoretical absolute position of the road surface marking on the basis of estimation values of the actual orientation and the actual position of the vehicle and on the basis of the relative orientation and the relative position of the road surface marking. The estimation values are based on previous estimation results of the actual orientation and the actual position of the vehicle and based on the amounts of change in the actual orientation and the actual position of the vehicle that are based on odometry. Then, the position estimation unit extracts, from the parking lot data, partial data corresponding to an area around the theoretical absolute position, corrects the estimation values of the actual orientation and the actual position of the vehicle on the basis of differences of the theoretical absolute orientation and the theoretical absolute position from the absolute orientation and the absolute position that are based on the partial data, and estimates the actual orientation and the actual position of the vehicle on the basis of the corrected values. This structure is capable of easily estimating the actual orientation and position of the vehicle by using the partial data, not using all the parking lot data.
In this case, after correcting the estimation value of the actual orientation such that the theoretical absolute orientation coincides with the absolute orientation that is based on the partial data, the position estimation unit corrects the estimation value of the actual position such that the theoretical absolute position coincides with the absolute position that is based on the partial data. This structure is capable of easily correcting the estimation values of the actual orientation and the actual position of the vehicle in a stepwise manner.
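By way of illustration only, this stepwise correction may be sketched in Python as follows, assuming the theoretical pose and the map-based pose of one road surface marking are already available; the function and variable names and the planar model are hypothetical.

    import math

    def stepwise_correction(est_x, est_y, est_theta,
                            theo_pos, theo_ori, map_pos, map_ori):
        # Step 1: rotate the estimated heading so the theoretical absolute
        # orientation of the marking coincides with the map orientation.
        d_theta = map_ori - theo_ori
        est_theta += d_theta
        # The heading correction also rotates the theoretical marking
        # position about the estimated vehicle position.
        dx = theo_pos[0] - est_x
        dy = theo_pos[1] - est_y
        rx = est_x + dx * math.cos(d_theta) - dy * math.sin(d_theta)
        ry = est_y + dx * math.sin(d_theta) + dy * math.cos(d_theta)
        # Step 2: shift the estimated position so the (rotated) theoretical
        # absolute position coincides with the map position.
        est_x += map_pos[0] - rx
        est_y += map_pos[1] - ry
        return est_x, est_y, est_theta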
A vehicle position estimation device according to another example of the embodiment includes the following: a parking lot data acquisition unit that acquires parking lot data including information on an absolute position of each of multiple road surface markings that are provided on a road surface of a parking lot; an image data acquisition unit that acquires image data on an image captured by an on-board camera that captures a situation around a vehicle; and a position estimation unit that calculates relative positions of at least two of the multiple road surface markings with respect to the vehicle on the image data by detecting road surface marking data related to the at least two road surface markings from the image data, and that estimates an actual position of the vehicle on the basis of the calculated relative positions and the parking lot data.
The vehicle position estimation device described above is capable of accurately finding the current position (the actual position) of the vehicle by taking into account deviations of the theoretical positions of the at least two road surface markings (and the positional relationship therebetween) that are identified using the relative positions calculated on the basis of the image data from the (normal) absolute positions of the at least two road surface markings (and the positional relationship therebetween) that are identified on the basis of the parking lot data.
In the vehicle position estimation device according to the other example, the multiple road surface markings include at least one first road surface marking located on the left side of the vehicle and at least one second road surface marking located on the right side of the vehicle, and the position estimation unit calculates the relative positions of the first road surface marking and the second road surface marking by detecting, as the road surface marking data, a first position of the first road surface marking from left side image data that is the image data representative of the situation on the left side of the vehicle and by detecting, as the road surface marking data, a second position of the second road surface marking from right side image data that is the image data representative of the situation on the right side of the vehicle. This structure is capable of accurately calculating the relative positions of at least two road surface markings by using two images of different types (the left side image data and the right side image data).
Further, in the vehicle position estimation device according to the other example, the position estimation unit calculates the relative positions of at least two road surface markings that are located on either the left side or the right side of the vehicle by detecting, as the road surface marking data, a position of each of the at least two road surface markings from side image data that is the image data representative of the situation on either the left side or the right side of the vehicle. This structure is capable of easily calculating the relative positions of at least two road surface markings by using an image of one type (the side image data) only.
Further, in the vehicle position estimation device according to the other example, the parking lot data acquisition unit acquires, as the parking lot data, boundary line data capable of identifying the absolute positions of end portions of multiple boundary lines that are the road surface markings indicative of a boundary of a parking space pre-provided in the parking lot, and the position estimation unit calculates the relative positions of at least two of the multiple boundary lines by detecting, as the road surface marking data, positions of the end portions of the at least two boundary lines on the image data, and estimates the actual position of the vehicle on the basis of the calculated relative positions and the boundary line data. This structure is capable of easily estimating the actual position of the vehicle by using at least two of the multiple boundary lines that are commonly provided as the road surface markings indicative of the boundary of the parking space.
In the above structure using at least two boundary lines, of the at least two boundary lines on the image data, the position estimation unit detects, as the road surface marking data, the positions of the end portions that are located on an opening portion side of the parking space that is delineated by the boundary lines in such a manner as to have an opening portion. This structure is capable of easily estimating the actual position of the vehicle by using the positions of the end portions of the at least two boundary lines that are located on the opening portion side of the parking space.
Further, in the above structure using at least two boundary lines, the position estimation unit detects, as the road surface marking data, positions of central points of the end portions of the at least two boundary lines on the image data. This structure is capable of easily estimating the actual position of the vehicle by using the positions of the central points of the end portions of the at least two boundary lines.
Further, in the vehicle position estimation device according to the other example, the parking lot data acquisition unit acquires, as the parking lot data, boundary line data and marker data. The boundary line data is capable of identifying the absolute positions of end portions of multiple boundary lines that are the road surface markings indicative of a boundary of a parking space pre-provided in the parking lot. The marker data is capable of identifying the absolute positions of multiple markers that are pre-provided around a route along which the vehicle travels in the parking lot, and the position estimation unit estimates the actual position of the vehicle by detecting the road surface marking data that is related to at least two of the multiple boundary lines, at least two of the multiple markers, or both at least one of the multiple boundary lines and at least one of the multiple markers. This structure is capable of easily estimating the actual position of the vehicle on the basis of a combination of any two or more of the multiple boundary lines and the multiple markers.
Further, in the vehicle position estimation device according to the other example, the position estimation unit detects, as the road surface marking data, first values indicative of positions of the at least two road surface markings in a first coordinate system on the image data, converts the first values into second values in a second coordinate system that is associated with the on-board camera, and converts the second values into third values in a third coordinate system that is associated with the vehicle so as to calculate the third values as the relative positions of the at least two road surface markings with respect to the vehicle. This structure is capable of easily calculating the relative positions of at least two road surface markings with respect to the vehicle by coordinate transformation.
Further, in the vehicle position estimation device according to the other example, the position estimation unit calculates theoretical absolute positions of the at least two road surface markings on the basis of an estimation value of the actual position of the vehicle and on the basis of the relative positions of the at least two road surface markings. The estimation value is based on a previous estimation result of the actual position of the vehicle and based on the amount of change in the actual position of the vehicle that is based on odometry. Then, the position estimation unit extracts, from the parking lot data, partial data corresponding to an area around the theoretical absolute positions, corrects the estimation value of the actual position of the vehicle on the basis of differences of the theoretical absolute positions from the absolute positions that are based on the partial data, and estimates the actual position of the vehicle on the basis of the corrected value. This structure is capable of easily estimating the actual position of the vehicle by using the partial data, not using all the parking lot data.
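By way of illustration only, the extraction of partial data around the theoretical absolute positions may be sketched in Python as follows; the data layout and the search radius are assumptions made for this sketch.

    def extract_partial_data(parking_lot_data, theo_positions, radius=1.0):
        # parking_lot_data: iterable of (abs_x, abs_y) marking positions.
        # theo_positions: theoretical absolute positions of the detected
        # markings. Only map entries near each theoretical position are
        # kept, so the whole parking lot data need not be searched.
        partial = []
        for tx, ty in theo_positions:
            nearby = [(ax, ay) for ax, ay in parking_lot_data
                      if (ax - tx) ** 2 + (ay - ty) ** 2 <= radius ** 2]
            if nearby:
                partial.append(min(nearby, key=lambda p:
                                   (p[0] - tx) ** 2 + (p[1] - ty) ** 2))
        return partial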
A vehicle control device according to yet another example of the embodiment is configured to be mounted on a vehicle and includes the following: a travel control unit that controls a traveling state of the vehicle to achieve autonomous travel in a parking lot; a parking lot data acquisition unit that acquires parking lot data capable of identifying an absolute orientation and an absolute position of a road surface marking provided on a road surface of the parking lot; an image data acquisition unit that acquires image data obtained by an on-board camera that captures a situation around the vehicle; and a position estimation unit that calculates a relative orientation and a relative position of the road surface marking with respect to the vehicle on the image data during the autonomous travel by detecting road surface marking data related to the road surface marking from the image data, and that estimates an actual orientation and an actual position of the vehicle on the basis of the calculated relative orientation, the calculated relative position, and the parking lot data.
The vehicle control device described above is capable of accurately finding the current position (the actual position) and the current orientation (the actual orientation) of the vehicle during the autonomous travel by taking into account deviations of the theoretical position and the theoretical orientation of the road surface marking that are identified using the relative position and the relative orientation calculated on the basis of the image data from the (normal) absolute position and the (normal) absolute orientation of the road surface marking that are identified on the basis of the parking lot data.
A vehicle control device according to still another example of the embodiment is configured to be mounted on a vehicle and includes the following: a travel control unit that controls a traveling state of the vehicle to achieve autonomous travel in a parking lot; a parking lot data acquisition unit that acquires parking lot data including information on an absolute position of each of multiple road surface markings that are provided on a road surface of the parking lot; an image data acquisition unit that acquires image data on an image captured by an on-board camera that captures a situation around the vehicle; and a position estimation unit that calculates relative positions of at least two of the multiple road surface markings with respect to the vehicle on the image data during the autonomous travel by detecting road surface marking data related to the at least two road surface markings from the image data, and that estimates an actual position of the vehicle including an actual orientation thereof on the basis of the calculated relative positions and the parking lot data.
The vehicle control device described above is capable of accurately finding the current position (the actual position) of the vehicle during the autonomous travel by taking into account deviations of the theoretical positions of the at least two road surface markings (and the positional relationship therebetween) that are identified using the relative positions calculated on the basis of the image data from the (normal) absolute positions of the at least two road surface markings (and the positional relationship therebetween) that are identified on the basis of the parking lot data.
An embodiment is described below with reference to the drawings. The structures of the embodiment described below, and the effects and results (advantages) provided by the structures, are merely examples and are not limited to those described below.
First, with reference to
As illustrated in
Further, as illustrated in
The management device 101 is structured to monitor the situation in the parking lot P by receiving image data obtained from at least one monitoring camera 103 that captures images of the situation in the parking lot P and by receiving data output from various sensors (not illustrated) or the like installed in the parking lot P, and is structured to manage the parking spaces R on the basis of the monitoring result. Information that the management device 101 receives to monitor the situation in the parking lot P may be hereinafter sometimes referred to collectively as sensor data.
In the embodiment, the number and the arrangement of drop-off areas P1, pick-up areas P2, and parking spaces R in the parking lot P are not limited to the example illustrated in
With reference to
First, the hardware structure of the management device 101 according to the embodiment is described with reference to
In the example illustrated in
The CPU 301 is a hardware processor and exercises control over the management device 101. The CPU 301 reads out various control programs (computer programs) stored, for example, in the ROM 302, and implements various functions in accordance with instructions defined in the various control programs.
The ROM 302 is a nonvolatile primary storage device and stores parameters or the like necessary to execute the various control programs.
The RAM 303 is a volatile primary storage device and provides a working area for the CPU 301.
The communication interface 304 is an interface that implements communication between the management device 101 and an external device. For example, the communication interface 304 implements transmission and reception of signals by wireless communication between the management device 101 and the vehicle V (the vehicle control system 102).
The input-output interface 305 is an interface that implements connection between the management device 101 and an external device. Examples of the external device may include an input-output device that is used by an operator of the management device 101.
The SSD 306 is a rewritable nonvolatile secondary storage device. The management device 101 according to the embodiment may include a hard disk drive (HDD) as a secondary storage device, instead of the SSD 306 (or in addition to the SSD 306).
Next, the system structure of the vehicle control system 102 according to the embodiment is described with reference to
The braking system 401 controls deceleration of the vehicle V. The braking system 401 includes a braking unit 401a, a braking control unit 401b, and a braking unit sensor 401c.
The braking unit 401a is a device for decelerating the vehicle V and includes, for example, a brake pedal.
The braking control unit 401b is an electronic control unit (ECU) and is structured from, for example, a computer having a hardware processor such as a CPU. The braking control unit 401b actuates the braking unit 401a by driving an actuator (not illustrated) on the basis of an instruction from the vehicle control device 410, thereby controlling the degree of deceleration of the vehicle V.
The braking unit sensor 401c is a device for detecting the state of the braking unit 401a. For example, when the braking unit 401a includes a brake pedal, the braking unit sensor 401c detects, as the state of the braking unit 401a, the position of the brake pedal or a pressure acting on the brake pedal. The braking unit sensor 401c outputs the detected state of the braking unit 401a to the on-board network 450.
The acceleration system 402 controls acceleration of the vehicle V. The acceleration system 402 includes an acceleration unit 402a, an acceleration control unit 402b, and an acceleration unit sensor 402c.
The acceleration unit 402a is a device for accelerating the vehicle V and includes, for example, an accelerator pedal.
The acceleration control unit 402b is an ECU and is structured from, for example, a computer having a hardware processor such as a CPU. The acceleration control unit 402b actuates the acceleration unit 402a by driving an actuator (not illustrated) on the basis of an instruction from the vehicle control device 410, thereby controlling the degree of acceleration of the vehicle V.
The acceleration unit sensor 402c is a device for detecting the state of the acceleration unit 402a. For example, when the acceleration unit 402a includes an accelerator pedal, the acceleration unit sensor 402c detects the position of the accelerator pedal or a pressure acting on the accelerator pedal. The acceleration unit sensor 402c outputs the detected state of the acceleration unit 402a to the on-board network 450.
The steering system 403 controls the direction of travel of the vehicle V. The steering system 403 includes a steering unit 403a, a steering control unit 403b, and a steering unit sensor 403c.
The steering unit 403a is a device for turning steerable wheels of the vehicle V and includes, for example, a steering wheel or a handle.
The steering control unit 403b is an ECU and is structured from, for example, a computer having a hardware processor such as a CPU. The steering control unit 403b actuates the steering unit 403a by driving an actuator (not illustrated) on the basis of an instruction from the vehicle control device 410, thereby controlling the direction of travel of the vehicle V.
The steering unit sensor 403c is a device for detecting the state of the steering unit 403a. For example, when the steering unit 403a includes a steering wheel, the steering unit sensor 403c detects the position of the steering wheel or the rotation angle of the steering wheel. On the other hand, when the steering unit 403a includes a handle, the steering unit sensor 403c may detect the position of the handle or a pressure acting on the handle. The steering unit sensor 403c outputs the detected state of the steering unit 403a to the on-board network 450.
The shifting system 404 controls the speed ratio of the vehicle V. The shifting system 404 includes a shifting unit 404a, a shifting control unit 404b, and a shifting unit sensor 404c.
The shifting unit 404a is a device for changing the speed ratio of the vehicle V and includes, for example, a shift lever.
The shifting control unit 404b is an ECU and is structured from, for example, a computer having a hardware processor such as a CPU. The shifting control unit 404b actuates the shifting unit 404a by driving an actuator (not illustrated) on the basis of an instruction from the vehicle control device 410, thereby controlling the speed ratio of the vehicle V.
The shifting unit sensor 404c is a device for detecting the state of the shifting unit 404a. For example, when the shifting unit 404a includes a shift lever, the shifting unit sensor 404c detects the position of the shift lever or a pressure acting on the shift lever. The shifting unit sensor 404c outputs the detected state of the shifting unit 404a to the on-board network 450.
The obstacle sensor 405 is a device for detecting information on an obstacle that may be located around the vehicle V. The obstacle sensor 405 includes a distance measurement sensor, such as a sonar, for detecting the distance to an obstacle. The obstacle sensor 405 outputs the detected information to the on-board network 450.
The traveling state sensor 406 is a device for detecting the traveling state of the vehicle V. For example, the traveling state sensor 406 includes the following: a wheel speed sensor that detects the wheel speed of the vehicle V; an acceleration sensor that detects longitudinal or lateral acceleration of the vehicle V; and a gyroscope sensor that detects the turning speed (angular velocity) of the vehicle V. The traveling state sensor 406 outputs the detected traveling state to the on-board network 450.
The communication interface 407 is an interface that implements communication between the vehicle control system 102 and an external device. For example, the communication interface 407 implements transmission and reception of signals by wireless communication between the vehicle control system 102 and the management device 101, and also implements transmission and reception of signals by wireless communication between the vehicle control system 102 and the terminal device T.
The on-board camera 408 is a device for capturing images of the situation around the vehicle V. For example, multiple on-board cameras 408 are provided to capture images of areas including road surfaces in front of, behind, and beside (on both the right and left sides of) the vehicle V. The image data obtained by the on-board camera 408 is used to monitor the situation around the vehicle V (including detecting an obstacle). The on-board camera 408 outputs the obtained image data to the vehicle control device 410. The image data obtained from the on-board camera 408 and data obtained from the various sensors included in the vehicle control system 102 may be hereinafter sometimes referred to collectively as sensor data.
The monitor device 409 is mounted, for example, on a dashboard in the cabin of the vehicle V. The monitor device 409 includes a display unit 409a, a voice output unit 409b, and an operation input unit 409c.
The display unit 409a is a device for displaying an image in accordance with an instruction from the vehicle control device 410. The display unit 409a is structured from, for example, a liquid crystal display (LCD) or an organic electroluminescent display (OELD).
The voice output unit 409b is a device for producing a voice output in accordance with an instruction from the vehicle control device 410. The voice output unit 409b is structured from, for example, a speaker.
The operation input unit 409c is a device for receiving an input from an occupant in the vehicle V. The operation input unit 409c is structured from, for example, a touch screen provided on a display screen of the display unit 409a, or a physical operation switch. The operation input unit 409c outputs the received input to the on-board network 450.
The vehicle control device 410 is a device for exercising control over the vehicle control system 102. The vehicle control device 410 is an ECU and has computer resources including a CPU 410a, a ROM 410b, and a RAM 410c.
More specifically, the vehicle control device 410 includes the CPU 410a, the ROM 410b, the RAM 410c, an SSD 410d, a display control unit 410e, and a voice control unit 410f.
The CPU 410a is a hardware processor and exercises control over the vehicle control device 410. The CPU 410a reads out various control programs (computer programs) stored, for example, in the ROM 410b, and implements various functions in accordance with instructions defined in the various control programs.
The ROM 410b is a nonvolatile primary storage device and stores parameters or the like necessary to execute the various control programs.
The RAM 410c is a volatile primary storage device and provides a working area for the CPU 410a.
The SSD 410d is a rewritable nonvolatile secondary storage device. The vehicle control device 410 according to the embodiment may include an HDD as a secondary storage device, instead of the SSD 410d (or in addition to the SSD 410d).
The display control unit 410e mainly governs the following, among various processes that are executed by the vehicle control device 410: image processing on image data obtained from the on-board camera 408; and generation of image data to be output to the display unit 409a of the monitor device 409.
The voice control unit 410f mainly governs the following, among various processes that are executed by the vehicle control device 410: generation of voice data to be output to the voice output unit 409b of the monitor device 409.
The on-board network 450 connects the braking system 401, the acceleration system 402, the steering system 403, the shifting system 404, the obstacle sensor 405, the traveling state sensor 406, the communication interface 407, the operation input unit 409c of the monitor device 409, and the vehicle control device 410 together in such a manner as to enable communication therebetween.
In order to achieve autonomous travel, such as the automated parking and the automated retrieval in the automated valet parking system, it is important to accurately find the current position of the vehicle V during the autonomous travel. In this regard, one conventional method (what is called odometry) estimates the current position of the vehicle V using a value detected by a wheel speed sensor or the like. However, this method may not always accurately find the current position of the vehicle V because an error in the estimation result accumulates as the distance traveled by the vehicle V increases.
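By way of illustration only, the dead-reckoning computation referred to as odometry may be sketched in Python as follows, assuming planar motion and inputs such as a wheel speed and a yaw rate (as detectable by the traveling state sensor 406). The names and the model are illustrative and are not part of the embodiment; the sketch merely shows why the estimation error accumulates with the distance traveled.

    import math

    def odometry_step(x, y, theta, wheel_speed, yaw_rate, dt):
        # Advance a planar pose estimate by dead reckoning. Any small
        # error in wheel_speed or yaw_rate is integrated into the pose,
        # so the estimation error grows as the travel distance grows.
        x += wheel_speed * math.cos(theta) * dt
        y += wheel_speed * math.sin(theta) * dt
        theta += yaw_rate * dt
        return x, y, theta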
Therefore, according to the embodiment, the vehicle control device 410 is provided with functions described below to accurately find the current position of the vehicle V during the autonomous travel in the automated parking and in the automated retrieval. That is, according to the embodiment, the vehicle control device 410 is an example of a “vehicle position estimation device”.
As illustrated in
The communication control unit 511 controls wireless communication with the vehicle control device 410. For example, the communication control unit 511 performs the following: authenticates the vehicle control device 410 by transmitting and receiving predetermined data to and from the vehicle control device 410; receives predetermined completion notifications that are output from the vehicle control device 410 when the automated parking and the automated retrieval are completed; and transmits, as needed, map data on the parking lot P and a navigation route that are described later.
The sensor data acquisition unit 512 acquires the sensor data described above from the monitoring camera 103 and various sensors (not illustrated) or the like installed in the parking lot P. The sensor data acquired by the sensor data acquisition unit 512 (in particular, image data obtained from the monitoring camera 103) may be used, for example, to check availability of the parking spaces R.
The parking lot data administration unit 513 manages data (information) on the parking lot P. For example, the parking lot data administration unit 513 manages map data on the parking lot P and availability of the parking spaces R. For example, when the automated parking is performed, the parking lot data administration unit 513 selects one parking space R among the parking spaces R that are available, and designates the selected one parking space R as a target parking space to which the vehicle V is to be moved in the automated parking. Further, if the parking space R is changed because the vehicle V moves again after the completion of the automated parking, the parking lot data administration unit 513 identifies the changed parking space R on the basis of sensor data acquired from the sensor data acquisition unit 512.
The navigation route generation unit 514 generates navigation routes to be directed to the vehicle control device 410 when the automated parking and the automated retrieval are performed. More specifically, the navigation route generation unit 514 generates, as the navigation route, a rough path from the drop-off area P1 to the target parking space when the automated parking is performed, and generates, as the navigation route, a rough path from the target parking space (the parking space R where the vehicle V is currently parked if the vehicle V has been moved after the automated parking) to the pick-up area P2 when the automated retrieval is performed.
On the other hand, as illustrated in
The communication control unit 521 controls wireless communication with the management device 101. For example, the communication control unit 521 performs the following: authenticates the vehicle control device 410 by transmitting and receiving predetermined data to and from the management device 101; transmits predetermined completion notifications to the management device 101 when the automated parking and the automated retrieval are completed; and receives, as needed, map data on the parking lot P and the navigation route from the management device 101. Thus, the communication control unit 521 functions as a map data acquisition unit that acquires map data on the parking lot P.
In the embodiment, for example, the map data includes information used to identify the absolute positions of various road surface markings (concrete examples are described later) that may be placed on a road surface of the parking lot P. The absolute position, as used herein, is a concept including the orientation (the absolute orientation) of the road surface marking. That is, when a road surface marking includes a linear marking having a predetermined direction (orientation), not only the absolute position of the road surface marking, but also the absolute orientation indicated by the linear marking included in the road surface marking are identifiable from the map data.
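By way of illustration only, the map data may be organized as sketched below; the field names and values are hypothetical, and any layout from which the absolute positions and absolute orientations are identifiable may be used.

    # Hypothetical layout of the map data; values are illustrative.
    map_data = {
        "boundary_lines": [
            # Two absolute end points per boundary line; the absolute
            # orientation of the line is identifiable from the positional
            # relationship between the two end points.
            {"id": "L1", "end_a": (10.0, 5.0), "end_b": (10.0, 10.0)},
        ],
        "markers": [
            # A marker including a line segment also carries an absolute
            # orientation (angle in radians, about pi/2 here).
            {"id": "M1", "position": (20.0, 5.0), "segment_angle": 1.5708},
        ],
    }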
The sensor data acquisition unit 522 is an example of an image data acquisition unit that acquires image data obtained by the on-board camera 408, and acquires sensor data including the image data and data output from various sensors provided in the vehicle control system 102. The sensor data acquired by the sensor data acquisition unit 522 may be used for various types of traveling control of the vehicle V to be performed by the travel control unit 523 described below, such as generating an actual travel route (including a parking route and a retrieval route) based on the navigation route received from the management device 101, and setting various parameters (vehicle speed, steering angle, direction of travel, etc.) that are necessary for the vehicle V to actually travel along the travel route.
The travel control unit 523 controls the braking system 401, the acceleration system 402, the steering system 403, the shifting system 404, etc. and thereby controls the traveling state of the vehicle V to perform various types of traveling control for achieving the automated parking and the automated retrieval. Examples of the various types of traveling control include start control from the drop-off area P1, travel control from the drop-off area P1 to the parking space R (including parking control), travel control from the parking space R to the pick-up area P2 (including retrieval control), and stop control into the pick-up area P2.
The position estimation unit 524 estimates the current position of the vehicle V by the odometry described above while the vehicle V is autonomously traveling in the automated parking and in the automated retrieval. Then, the position estimation unit 524 estimates the current position (the actual position) of the vehicle V by correcting the result estimated by odometry, on the basis of image data acquired by the sensor data acquisition unit 522, in such a manner as to cancel the cumulative errors of odometry. The actual position, as used herein, is a concept including the orientation (the actual orientation) of the vehicle V.
That is, according to the embodiment, during the autonomous travel, the position estimation unit 524 first detects, from image data acquired by the sensor data acquisition unit 522, road surface marking data related to a road surface marking located around the vehicle V and thus calculates the relative position of the road surface marking with respect to the vehicle V on the image data. Then, the position estimation unit 524 corrects the estimation result based on odometry, on the basis of the difference between a theoretical absolute position of the road surface marking that is identified on the basis of the relative position of the road surface marking, and a normal absolute position of the road surface marking that is based on map data acquired by the communication control unit 521. The position estimation unit 524 sets the corrected value as a normal estimation value of the current position (the actual position) of the vehicle V. The relative position, as used herein, is a concept including a relative orientation of the road surface marking with respect to the vehicle V.
For example, according to the embodiment, when the vehicle V travels in a direction crossing the boundary lines L during the autonomous travel as in an example described later, the position estimation unit 524 detects road surface marking data on the basis of side image data that is image data representative of the situation in an area beside the vehicle V. The road surface marking data is related to the positions of end portions E of the boundary lines L closer to the vehicle V (closer to opening portions of the parking spaces R) and is also related to the orientations of the boundary lines L. Then, on the basis of the detected road surface marking data, the position estimation unit 524 calculates relative positions indicating the positions of the end portions E of the boundary lines L with respect to the vehicle V and calculates relative orientations indicating the orientations of the boundary lines L with respect to the vehicle V. Then, on the basis of the calculated relative positions and the relative orientations of the boundary lines L and on the basis of the absolute positions and the absolute orientations of the boundary lines L that are based on map data on the parking lot P, the position estimation unit 524 corrects the estimation results that are based on odometry and thus estimates the current position (the actual position and the actual orientation) of the vehicle V.
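By way of illustration only, the theoretical absolute position (and orientation) of a road surface marking may be derived from the odometry-based estimate and the calculated relative values as sketched below in Python; the names and the planar model are hypothetical.

    import math

    def marking_absolute_pose(vehicle_pose, rel_pos, rel_ori):
        # vehicle_pose: (x, y, theta) of the vehicle estimated by odometry.
        # rel_pos, rel_ori: relative position and orientation of the
        # marking with respect to the vehicle, from the image data.
        x, y, th = vehicle_pose
        ax = x + rel_pos[0] * math.cos(th) - rel_pos[1] * math.sin(th)
        ay = y + rel_pos[0] * math.sin(th) + rel_pos[1] * math.cos(th)
        return ax, ay, th + rel_ori

Comparing the pose returned by this sketch with the normal absolute pose identified from the map data yields the difference that corresponds to the cumulative errors of odometry, as described above.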
In the example illustrated in
The theoretical absolute position (and absolute orientation) of the boundary line L62 is identified using the estimation result based on odometry as described above, and therefore may be affected by cumulative errors due to odometry. In contrast, as already described, since map data on the parking lot P managed by the management device 101 includes information for identifying the normal absolute positions (and absolute orientations) of the road surface markings, the map data includes boundary line data for identifying the normal absolute position (and absolute orientation) of the boundary line L62 as the road surface marking.
For this reason, according to the embodiment, the communication control unit 521 acquires the boundary line data as the map data from the management device 101. Further, the position estimation unit 524 evaluates a difference between the theoretical absolute position of the boundary line L62 that is identified on the basis of the relative position (including the relative orientation) described above, and the normal absolute position (including the absolute orientation) of the boundary line L62 that is identified on the basis of the boundary line data. Then, on the basis of the difference, the position estimation unit 524 corrects the deviation of the estimation result that is based on odometry, and estimates the corrected value as the actual position (including the actual orientation) of the vehicle V. This correction that takes account of both the relative position and the relative orientation is described in detail later with reference to the drawings and is therefore not repeated here.
In the example illustrated in
In the example illustrated in
Therefore, in the example illustrated in
road surface marking data related to the boundary lines L72 and L76 is detectable by performing image recognition processing, such as white-line detection, on a set of side image data obtained by the two on-board cameras 408. The relative position of the boundary line L72 (more specifically, the relative position of the end portion E72 of the boundary line L72) and the relative position of the boundary line L76 (more specifically, the relative position of the end portion E76 of the boundary line L76), with respect to the vehicle V, are calculable using the detected road surface marking data. Further, theoretical absolute positions of the end portions E72 and E76 of the boundary lines L72 and L76 are identifiable using the calculated relative positions and using the estimation results based on odometry.
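By way of illustration only, the white-line detection may be sketched with OpenCV as follows; the embodiment does not prescribe a specific image recognition algorithm, and the pipeline and thresholds below are assumptions made for this sketch.

    import cv2
    import numpy as np

    def detect_line_segments(side_image):
        # Bright paint against asphalt: threshold, then edge detection.
        gray = cv2.cvtColor(side_image, cv2.COLOR_BGR2GRAY)
        _, bright = cv2.threshold(gray, 180, 255, cv2.THRESH_BINARY)
        edges = cv2.Canny(bright, 50, 150)
        # The probabilistic Hough transform returns candidate segments
        # whose end points can serve as the end portions E of the
        # boundary lines L on the image data.
        segments = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=40,
                                   minLineLength=30, maxLineGap=5)
        return [] if segments is None else [tuple(s[0]) for s in segments]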
The position estimation unit 524 corrects deviations of the estimation results of the position and direction (orientation) of the vehicle V that are based on odometry, by checking the theoretical absolute positions of the end portions E72 and E76 of the boundary lines L72 and L76 identified on the basis of the relative positions against normal absolute positions of the end portions E72 and E76 of the boundary lines L72 and L76 identified on the basis of map data (boundary line data). In the example illustrated in
As described above, in the example illustrated in
In the examples illustrated in
In the example illustrated in
The position estimation unit 524 corrects deviations of the estimation results of the position and direction (orientation) of the vehicle V that are produced by odometry, by checking the theoretical absolute positions of the end portions E81 and E82 of the boundary lines L81 and L82 identified on the basis of the relative positions against normal absolute positions of the end portions E81 and E82 of the boundary lines L81 and L82 identified on the basis of map data (boundary line data). Although the example illustrated in
In the example illustrated in
In the example illustrated in
In the example illustrated in
As in the examples illustrated in
For this reason, according to the embodiment, the communication control unit 521 acquires the marker data as the map data from the management device 101. Further, the position estimation unit 524 evaluates a difference between the theoretical absolute positions of the markers M91 and M92 that are identified on the basis of the relative positions (including the relative orientations) described above, and the normal absolute positions (including the absolute orientations) of the markers M91 and M92 that are identified on the basis of the marker data. Then, on the basis of the difference, the position estimation unit 524 corrects the deviation of the estimation result that is based on odometry, and estimates the corrected value as the actual position (including the actual orientation) of the vehicle V.
In the examples illustrated in
In the example illustrated in
Then, on the basis of the calculated relative positions and the estimation results of the position and orientation of the vehicle V that are based on odometry, the position estimation unit 524 identifies theoretical absolute positions of the boundary lines L101 and L102 and the markers M101 to M103. Further, on the basis of the difference between the identified theoretical absolute positions and the normal absolute positions that are identified from the boundary line data and the marker data, the position estimation unit 524 corrects the deviation of the estimation results that are based on odometry, and estimates the corrected values as the actual position (including the actual orientation) of the vehicle V.
In the example illustrated in
In the examples illustrated in
Details of the correction that may be performed in
First, details of correction that may be performed in
In
As illustrated in
Then, on the basis of a preset parameter, the position estimation unit 524 converts the coordinates (X1, Y1) into the dimension of actual distance. The parameter, as used herein, is a parameter (in units of m/dot) indicating the actual distance in meters that corresponds to one dot of the image data. The central point P10 after conversion is hereinafter sometimes referred to as a point P20, and coordinates of the point P20 are hereinafter sometimes referred to as (X2, Y2).
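For example, with a hypothetical parameter of 0.02 m/dot, the conversion is as sketched below.

    M_PER_DOT = 0.02            # hypothetical scale: one dot is 0.02 m
    X1, Y1 = 150, 80            # central point P10 in dots (illustrative)
    X2 = X1 * M_PER_DOT         # 3.0 m
    Y2 = Y1 * M_PER_DOT         # 1.6 m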
As described below, the position estimation unit 524 changes the origin of the X-Y coordinate system as appropriate and thus calculates the relative position (and the relative orientation) of the boundary line L1 with respect to the vehicle V.
As illustrated in
The distance between the center C2 of the on-board camera 408 and the center C1 of the area R1 corresponding to the capture area of the on-board camera 408 is predetermined according to factors including the specifications of the on-board camera 408. Therefore, in the example illustrated in
the position estimation unit 524 changes the origin of the X-Y coordinate system from the center C1 to the center C2 and calculates coordinates (X3, Y3) of a point P30 that corresponds to the point P20.
Upon completion of calculation of the coordinates (X3, Y3), the position estimation unit 524 further changes the origin of the X-Y coordinate system from the center C2 of the on-board camera 408 to a center C3 of the vehicle V and further calculates coordinates of a point corresponding to the point P30, and a value indicative of the direction D1. The relationship between the centers C2 and C3 is predetermined according to factors including the specifications of the vehicle V. The coordinate values calculated in this way represent the relative position and the relative orientation of the boundary line L1 with respect to (the center C3 of) the vehicle V.
Upon completion of calculation of the relative position, the position estimation unit 524 identifies a theoretical absolute position and a theoretical absolute orientation of the boundary line L1 on the basis of the calculated relative position and relative orientation and on the basis of the position and orientation of (the center C3 of) the vehicle V that are estimated by odometry.
On the other hand, the position estimation unit 524 extracts, from map data acquired by the communication control unit 521, boundary line data related to the boundary line L around the position of the vehicle V estimated by odometry (i.e., the boundary line L1). The boundary line data includes, for example, the (normal) absolute positions of both end points of the boundary line L1. An absolute orientation indicative of the direction D1 of extension of the boundary line L1 is identifiable by taking into account the positional relationship between the two end points. Thus, on the basis of the boundary line data extracted from the map data, the position estimation unit 524 identifies both the absolute position of the end portion E1 of the boundary line L1 and the absolute orientation representative of the direction of extension of the boundary line L1.
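By way of illustration only, the absolute orientation may be derived from the two end points as sketched below; the data layout is hypothetical.

    import math

    def line_orientation(end_a, end_b):
        # Absolute orientation of a boundary line identified from the
        # positional relationship between its two end points.
        return math.atan2(end_b[1] - end_a[1], end_b[0] - end_a[0])

    # Example: a line from (10.0, 5.0) to (10.0, 10.0) extends along the
    # Y axis, so its absolute orientation is pi/2.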
Then, the position estimation unit 524 evaluates a difference between the theoretical absolute position (and orientation) of the boundary line L1 that is identified on the basis of the image data and the normal absolute position (and orientation) of the boundary line L1 that is identified on the basis of the map data (the boundary line data). This difference corresponds to cumulative errors of estimation results of the position and orientation of the vehicle V that are produced by odometry. Therefore, the position estimation unit 524 corrects the estimation results of the position and orientation of the vehicle V produced by odometry to cancel the cumulative errors, and sets the corrected values as the normal current position (actual position and orientation) of the vehicle V.
Next, details of correction that may be performed in
In
As illustrated in
Then, on the basis of a preset parameter (that is the same as that already described), the position estimation unit 524 converts the coordinates (X11, Y11) and (X12, Y12) into the dimension of actual distance. The central points P11 and P12 after conversion are hereinafter sometimes referred to respectively as points P21 and P22, and coordinates of the points P21 and P22 are hereinafter sometimes referred to respectively as (X21, Y21) and (X22, Y22).
As described below, the position estimation unit 524 changes the origin of the X-Y coordinate system as appropriate and thus calculates the relative positions of the boundary lines L11 and L12 with respect to the vehicle V.
As illustrated in
the position estimation unit 524 changes the origin of the X-Y coordinate system to a center C12 of the on-board camera 408 and calculates coordinates (X31, Y31) and (X32, Y32) of points P31 and P32 that correspond to the points P21 and P22.
Upon completion of calculation of the coordinates (X31, Y31) and (X32, Y32) of the points P31 and P32, the position estimation unit 524 further changes the origin of the X-Y coordinate system from the center C12 of the on-board camera 408 to a center C13 of the vehicle V and further calculates coordinates of points corresponding to the points P31 and P32 with respect to the changed X-Y coordinate system. The coordinate values calculated in this way represent the relative positions of the boundary lines L11 and L12 with respect to (the center C13 of) the vehicle V.
Upon completion of calculation of the relative positions, the position estimation unit 524 identifies theoretical absolute positions and theoretical absolute orientations of the boundary lines L11 and L12 on the basis of the relative positions and the position of (the center C13 of) the vehicle V that is estimated by odometry.
The position estimation unit 524 evaluates a difference between the theoretical absolute positions of the boundary lines L11 and L12 that are identified on the basis of the image data and the normal absolute positions of the boundary lines L11 and L12 that are identified on the basis of the map data (the boundary line data). Then, the position estimation unit 524 corrects the estimation result of the position of the vehicle V produced by odometry. As long as the deviations of the theoretical absolute positions of the boundary lines L11 and L12 from the normal absolute positions thereof are found, the deviation of the orientation of the vehicle V is also found on the basis of the positional relationship among the three points including the vehicle V. Therefore, the estimation result of the orientation of the vehicle V produced by odometry is also correctable on the basis of the difference. Then, the position estimation unit 524 sets the corrected value as the normal current position (actual position and orientation) of the vehicle V.
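By way of illustration only, the derivation of the orientation deviation from the positional relationship may be sketched as follows. The names are hypothetical, and the sketch is simplified in that the position deviation is taken at the midpoint of the two end portions; a full correction would also rotate the estimate about the vehicle position as described above.

    import math

    def correction_from_two_points(theo_a, theo_b, map_a, map_b):
        # theo_a, theo_b: theoretical absolute positions of two end
        # portions; map_a, map_b: normal absolute positions of the same
        # end portions. The bearing of the segment joining the two end
        # portions reveals the orientation error even though only
        # positions were detected.
        theo_bearing = math.atan2(theo_b[1] - theo_a[1],
                                  theo_b[0] - theo_a[0])
        map_bearing = math.atan2(map_b[1] - map_a[1],
                                 map_b[0] - map_a[0])
        d_theta = map_bearing - theo_bearing
        theo_mid = ((theo_a[0] + theo_b[0]) / 2, (theo_a[1] + theo_b[1]) / 2)
        map_mid = ((map_a[0] + map_b[0]) / 2, (map_a[1] + map_b[1]) / 2)
        # Returns (dx, dy, d_theta) to apply to the odometry estimate.
        return (map_mid[0] - theo_mid[0], map_mid[1] - theo_mid[1], d_theta)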
Although the above description illustrates that the actual position of the vehicle V is estimated using the results of image recognition processing on the two boundary lines L (L11 and L12), three or more road surface markings may be subjected to image recognition processing. The road surface markings used to estimate the actual position of the vehicle V are not limited to the boundary lines L. For example, in the parking lot P where both the boundary lines L and the markers M are provided as road surface markings, the actual position of the vehicle V may be estimated using the results of image recognition processing on at least two markers M, or the actual position of the vehicle V may be estimated using both at least one boundary line L and at least one marker M.
Next, with reference to the drawings, a description is given of the flow of processes executed in the automated valet parking.
The process sequence for the automated parking starts in S1101, in which communication is established between the vehicle control device 410 and the management device 101.
When the communication is established in S1101, the management device 101 transmits map data on the parking lot P to the vehicle control device 410 in S1102.
Then, in S1103, the management device 101 checks available parking spaces R and designates one of the available parking spaces R as a target parking space to be assigned to the vehicle V.
Then, in S1104, the management device 101 generates a (rough) navigation route from the drop-off area P1 to the target parking space designated in S1103.
Then, in S1105, the management device 101 transmits the navigation route generated in S1104 to the vehicle control device 410.
On the other hand, after receiving the map data transmitted in S1102 from the management device 101, the vehicle control device 410 estimates in S1106 an initial position within the drop-off area P1. The initial position is the current position of the vehicle V within the drop-off area P1 and is used as a starting point to start from the drop-off area P1. The initial position is estimated by a method that uses image data obtained by the on-board cameras 408, as with the current position estimation methods already described.
After estimating the initial position in S1106 and receiving the navigation route transmitted in S1105 from the management device 101, the vehicle control device 410 generates, in S1107 on the basis of elements including the initial position estimated in S1106, a travel route that is to be actually traveled during the automated parking and that is more accurate than the navigation route.
Then, in S1108, the vehicle control device 410 performs control to start from the drop-off area P1.
Then, in S1109, the vehicle control device 410 performs control to travel along the travel route generated in S1107. This traveling control is performed while estimating the current position by a method that uses image data like the one described above. The flow of processes executed to estimate the current position is described in detail later, and therefore is not repeated here.
Then, in S1110, the vehicle control device 410 performs control to park in the target parking space.
Then, when the parking control in S1110 is completed, the vehicle control device 410 transmits a parking completion notification to the management device 101 in S1111.
In this way, the automated parking in the automated valet parking is achieved.
The process sequence for the automated retrieval starts in S1201, in which communication is established between the vehicle control device 410 and the management device 101.
When the communication is established in S1201, the management device 101 transmits map data on the parking lot P to the vehicle control device 410 in S1202.
Then, in S1203, the management device 101 checks the parking space R where the vehicle V equipped with the vehicle control device 410 communicating therewith is currently located. In the embodiment, the procedure of S1203 is executed on the basis of image data obtained by the monitoring camera 103 and other appropriate data.
Then, in S1204, the management device 101 generates a (rough) navigation route from the parking space R checked in S1203 to the pick-up area P2.
Then, in S1205, the management device 101 transmits the navigation route generated in S1204 to the vehicle control device 410.
On the other hand, after receiving the map data transmitted in S1202 from the management device 101, the vehicle control device 410 estimates in S1206 a retrieval position within the parking space R where the vehicle V is currently located. The retrieval position refers to the current position of the vehicle V within the parking space R and is used as a starting point to leave the parking space R. Methods similar to the current position estimation methods already described (methods that use map data and predetermined road surface marking data that is detected from image data by image recognition processing) may be used to estimate the retrieval position.
After estimating the retrieval position in S1206 and receiving the navigation route transmitted in S1205 from the management device 101, the vehicle control device 410 generates, in S1207 on the basis of elements including the retrieval position estimated in S1206, a travel route that is to be actually traveled during the automated retrieval and that is more accurate than the navigation route.
Then, in S1208, the vehicle control device 410 performs control to leave the parking space R.
Then, in S1209, the vehicle control device 410 performs control to travel along the travel route generated in S1207. This traveling control is performed while estimating the current position by a method (details are described later) that uses image data like the one described above, as with the traveling control performed in S1109.
Then, in S1210, the vehicle control device 410 performs control to stop in the pick-up area P2.
Then, when the stop control in S1210 is completed, the vehicle control device 410 transmits a retrieval completion notification to the management device 101 in S1211.
In this way, the automated retrieval in the automated valet parking is achieved.
The process flow for estimating the current position of the vehicle V during the traveling control starts in S1301, in which the vehicle control device 410 acquires image data obtained by the on-board cameras 408.
Then, in S1302, the vehicle control device 410 extracts, from the image data acquired in S1301, road surface marking data related to road surface markings on the image data by predetermined image recognition processing. In S1302, processing is performed in accordance with, for example, the process flow described next.
The process flow for extracting the road surface marking data starts in S1401, in which the vehicle control device 410 performs distortion correction processing on the image data acquired in S1301.
Then, in S1402, the vehicle control device 410 performs white color extraction processing on the image data that has undergone the distortion correction processing in S1401. Since road surface markings, such as the boundary lines L and the markers M, are commonly drawn with white color, the procedure of S1402 makes it possible to extract a white region including the road surface markings (the boundary lines L) from the image data that has undergone the distortion correction processing.
Then, in S1403, the vehicle control device 410 performs faintness improvement processing to improve a faint portion that may be included in the white region extracted in S1402.
Then, in S1404, the vehicle control device 410 performs a Hough transform on the image data that has undergone the faintness improvement processing in S1403, thereby extracting, from the image data, linear regions as candidates for the road surface markings (the boundary lines L).
Then, in S1405, the vehicle control device 410 selects the candidates for the road surface markings (the boundary lines L) extracted in S1404, on the basis of predetermined criteria.
Then, in S1406, the vehicle control device 410 applies a projective transformation to the image data including the candidates selected in S1405, thereby generating image data corresponding to an area representing the capture area of the on-board camera 408 in plan view.
Then, in S1407, the vehicle control device 410 further selects candidates for the road surface markings (the boundary lines L) included in the image data that has undergone the projective transformation, on the basis of predetermined criteria.
Then, in S1408, the vehicle control device 410 calculates, as road surface marking data, the relative positions (possibly including the relative orientations) of the candidates selected in S1407.
When the procedures of S1401 to S1408 described above are completed, the process proceeds to S1303.
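A condensed sketch of S1401 to S1404 using OpenCV, under the assumptions that the camera's intrinsic and distortion parameters are available and that white markings can be isolated by a simple threshold; the faintness improvement of S1403 is approximated here by a morphological closing, and all threshold values are illustrative:

    import cv2
    import numpy as np

    def extract_line_candidates(img, camera_matrix, dist_coeffs):
        # S1401: distortion correction.
        undistorted = cv2.undistort(img, camera_matrix, dist_coeffs)
        # S1402: white color extraction (illustrative threshold).
        gray = cv2.cvtColor(undistorted, cv2.COLOR_BGR2GRAY)
        _, white = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
        # S1403: faintness improvement, approximated by morphological closing.
        kernel = np.ones((5, 5), np.uint8)
        closed = cv2.morphologyEx(white, cv2.MORPH_CLOSE, kernel)
        # S1404: Hough transform to extract linear regions.
        return cv2.HoughLinesP(closed, 1, np.pi / 180, threshold=80,
                               minLineLength=40, maxLineGap=10)

The candidate selection and conversion of S1405 to S1408 would then operate on the line segments returned by the Hough transform.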
The process flow for estimating the current position during the autonomous travel starts in S1501, in which the vehicle control device 410 calculates the current position of the vehicle V by odometry on the basis of sensor data.
Then, in S1502, on the basis of the road surface marking data calculated by the process flow described above, the vehicle control device 410 calculates the relative positions (possibly including the relative orientations) of the road surface markings with respect to the vehicle V.
Then, in S1503, the vehicle control device 410 identifies the absolute positions (possibly including the absolute orientations) of the road surface markings on the basis of map data acquired by the communication control unit 521. More specifically, the vehicle control device 410 extracts, from the absolute positions of all the road surface markings included in the map data, the ones that are close to the theoretical absolute positions of the road surface markings identified using the results calculated in S1502 (this extraction may be referred to as extraction of partial data corresponding to an area around the theoretical absolute positions), thereby identifying the normal absolute positions of the road surface markings that are compared in the next S1504 with the theoretical absolute positions so as to evaluate differences therebetween. For example, when the image data used to calculate the road surface marking data is the left side image data, the vehicle control device 410 extracts the absolute positions that are close to the theoretical absolute positions by extracting, from the absolute positions of all the road surface markings included in the map data, the absolute positions corresponding to the left side of the current position of the vehicle V that is based on odometry. When the road surface markings are boundary lines, the boundary lines are commonly spaced at intervals of about 2.5 meters, and this interval is greater than the error expected in odometry. Therefore, in the embodiment, there is hardly any possibility that the normal absolute positions of the road surface markings are incorrectly identified by the procedure of S1503.
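A minimal sketch of this extraction, assuming hypothetical map data held as a list of absolute marking positions; the 1.25 m search radius (half the typical 2.5 m spacing) is an illustrative choice:

    def match_to_map(theoretical_pos, map_positions, max_dist=1.25):
        # Returns the map-based (normal) absolute position closest to the
        # theoretical absolute position, or None when nothing lies within
        # the search radius (half the typical 2.5 m boundary line spacing).
        best, best_d2 = None, max_dist ** 2
        for pos in map_positions:
            d2 = ((pos[0] - theoretical_pos[0]) ** 2
                  + (pos[1] - theoretical_pos[1]) ** 2)
            if d2 < best_d2:
                best, best_d2 = pos, d2
        return best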
Then, in S1504, the vehicle control device 410 evaluates differences between the theoretical absolute positions of the road surface markings identified on the basis of the results calculated in S1502, and the normal absolute positions of the road surface markings identified in S1503, and corrects the value calculated in S1501, i.e., the value of the current position of the vehicle V calculated by odometry, on the basis of the differences.
Then, in S1505, the vehicle control device 410 estimates the value corrected in S1504 as the normal current position of the vehicle V. In the embodiment, various parameters (vehicle speed, steering angle, direction of travel, etc.) necessary for autonomous travel of the vehicle V are set on the basis of the results estimated in S1505.
More specifically, the vehicle control device 410 according to the embodiment may execute the procedures of S1501 to S1505 described above in the following manner.
Specifically, in S1501, the vehicle control device 410 first calculates the actual orientation of the vehicle V that is based on odometry, by adding the amount of change in orientation based on sensor data, i.e., the amount of change in orientation of the vehicle V estimated by odometry, to the previous estimation value related to the current orientation (the actual orientation) of the vehicle V. Then, the vehicle control device 410 calculates the current position of the vehicle V that is based on odometry, by adding the amount of change in position based on sensor data, i.e., the amount of change in position of the vehicle V estimated by odometry, to the previous estimation value related to the current position (the actual position) of the vehicle V in the actual orientation of the vehicle V based on odometry.
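A minimal sketch of this two-stage update, assuming the sensor data has already been reduced to a traveled distance and a heading change per cycle; the names and the reduction itself are illustrative assumptions:

    import math

    def odometry_update(x, y, heading, d_dist, d_heading):
        # Update the orientation first, then advance the position along
        # the updated orientation, as described above.
        heading = heading + d_heading
        x = x + d_dist * math.cos(heading)
        y = y + d_dist * math.sin(heading)
        return x, y, heading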
In the embodiment, if there is no previous estimation value related to the actual orientation and the actual position in S1501 yet, the actual orientation and the actual position of the vehicle V that are based on odometry may be used, without correction, in S1502 and the subsequent procedures.
Then, in S1502, on the basis of the road surface marking data calculated by the process flow described above, the vehicle control device 410 calculates the relative position and the relative orientation of the road surface marking with respect to the vehicle V, for example, in the following manner.
As illustrated in the drawing, the position estimation unit 524 first sets, on the image data, an X-Y coordinate system with an origin at a center C21 of an area R21 corresponding to the capture area of the on-board camera 408.
Then, the position estimation unit 524 calculates the coordinates (X21, Y21) of a central point P21 of an end portion E21 of the boundary line L21 and the coordinates (X22, Y22) of an end point P22 located at the opposite end of the boundary line L21. Then, on the basis of these two coordinates, the position estimation unit 524 calculates, using an arctangent function or the like, a slope representative of the direction D21 of extension of the boundary line L21, for example, as a counterclockwise angle with respect to the X-axis.
Then, on the basis of a preset parameter, the position estimation unit 524 converts the distance from the coordinates (X21, Y21) of the central point P21 of the end portion E21 of the boundary line L21 to the center C21 of the area R21 into the dimension of actual distance. The parameter, as used herein, is a parameter (in units of m/dot) indicating the actual distance in meters that corresponds to one dot of the image data.
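A minimal sketch of the conversion and the slope calculation described above, with an illustrative value for the preset m/dot parameter (the real value depends on the camera and is an assumption here):

    import math

    METERS_PER_DOT = 0.02  # illustrative value of the preset parameter (m/dot)

    def to_meters(point, center):
        # Converts image coordinates (dots), taken relative to the center
        # C21 of the area R21, into the dimension of actual distance (m).
        return ((point[0] - center[0]) * METERS_PER_DOT,
                (point[1] - center[1]) * METERS_PER_DOT)

    def direction_of_extension(p21, p22):
        # Counterclockwise angle of the boundary line L21 with respect to
        # the X-axis, obtained with an arctangent function.
        return math.atan2(p22[1] - p21[1], p22[0] - p21[0])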
Then, as illustrated in the next drawing, the position estimation unit 524 changes the origin of the X-Y coordinate system from the center C21 of the area R21 to a center C22 of the on-board camera 408.
As illustrated in the drawing, the position estimation unit 524 calculates, with respect to the changed X-Y coordinate system, the coordinates of a point P33 corresponding to the central point P21 and a value indicative of the direction D21 of extension of the boundary line L21.
It is noted that when the orientations of the X-axis and the Y-axis are kept unchanged before and after the coordinate transformation, the value indicative of the direction D21 remains constant before and after the coordinate transformation. In this case, since the positional relationship between the center C22 of the on-board camera 408 and the center C21 of the area R21 corresponding to the capture area of the on-board camera 408 is predetermined according to factors including the specifications of the on-board camera 408, the coordinate transformation is achievable by adjusting the Y-axis component only.
Upon completion of calculation of the values on the X-Y coordinate system with the origin at the center C22, the position estimation unit 524 further changes the origin of the X-Y coordinate system from the center C22 to the center C23, on the basis of the positional relationship between the center C22 of the on-board camera 408 and the center C23 of the vehicle V that is predetermined according to factors including the specifications of the vehicle V. Then, the position estimation unit 524 calculates the coordinates of a point corresponding to the point P33 and a value indicative of the direction D21 on the X-Y coordinate system with the changed origin, respectively, as the relative position and the relative orientation of the boundary line L21 with respect to (the center C23 of) the vehicle V.
Then, the position estimation unit 524 identifies the theoretical absolute position of the end portion E21 of the boundary line L21 and the theoretical absolute orientation indicative of the direction D21 of extension of the boundary line L21, on the basis of the relative position and the relative orientation of the boundary line L21 with respect to (the center C23 of) the vehicle V that are acquired by the method described above, and the actual position and the actual orientation of the vehicle V that are calculated in S1501.
The terms “absolute position” and “absolute orientation” as used in the present disclosure may be indicated by values specified in a geographic coordinate system that has the same meaning all over the world, such as latitude and longitude, or may be indicated by values specified in a given coordinate system that makes sense only in the parking lot P.
Returning to the process flow, in S1503, the vehicle control device 410 identifies the normal absolute positions and the normal absolute orientations of the road surface markings on the basis of the map data acquired by the communication control unit 521.
In the embodiment, if none of the absolute positions of all the road surface markings included in the map data is sufficiently close to the theoretical absolute positions of the road surface markings identified using the results calculated in S1502, S1504 and the subsequent procedures may not be executed.
In S1504, the vehicle control device 410 evaluates differences of the theoretical absolute positions and the theoretical absolute orientations of the road surface markings that are identified on the basis of the results calculated in S1502, respectively from the normal absolute positions and the normal absolute orientations of the road surface markings that are identified in S1503. Then, on the basis of these differences, the vehicle control device 410 corrects the estimation values of the actual position and the actual orientation of the vehicle V calculated in S1501.
More specifically, the position estimation unit 524 of the vehicle control device 410 corrects the actual orientation of the vehicle V such that the theoretical and normal absolute orientations of the road surface markings coincide with each other, in the manner described below.
As illustrated in the drawing, the position estimation unit 524 rotates the estimation value of the actual orientation of the vehicle V calculated in S1501 by an amount corresponding to the difference between the theoretical absolute orientation and the normal absolute orientation of the road surface marking.
Then, upon completion of the correction of the actual orientation of the vehicle V, the position estimation unit 524 of the vehicle control device 410 corrects the actual position of the vehicle V such that the theoretical absolute position of the road surface marking coincides with the normal absolute position thereof, in the manner described below.
As illustrated in the drawing, the position estimation unit 524 translates the estimation value of the actual position of the vehicle V by an amount corresponding to the difference, remaining after the orientation correction, between the theoretical absolute position and the normal absolute position of the road surface marking.
As described above, the position estimation unit 524 of the vehicle control device 410 according to the embodiment is capable of correcting the actual position of the vehicle V on the basis of the difference between the theoretical and normal absolute positions of the road surface marking after correcting the actual orientation of the vehicle V on the basis of the difference between the theoretical and normal absolute orientations of the road surface marking.
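A minimal sketch of this stepwise correction, assuming a single road surface marking whose theoretical and normal absolute orientation and position are already known; all names are illustrative, and the sketch is not the device's exact computation:

    import math

    def correct_pose(est_x, est_y, est_heading,
                     theo_pos, theo_orient, normal_pos, normal_orient):
        # Step 1: rotate the estimated pose so that the theoretical and
        # normal absolute orientations of the marking coincide.
        d_theta = normal_orient - theo_orient
        heading = est_heading + d_theta
        # Rotating the pose also swings the marking's theoretical position
        # about the vehicle; recompute it before correcting the position.
        dx, dy = theo_pos[0] - est_x, theo_pos[1] - est_y
        cos_t, sin_t = math.cos(d_theta), math.sin(d_theta)
        rotated = (est_x + cos_t * dx - sin_t * dy,
                   est_y + sin_t * dx + cos_t * dy)
        # Step 2: translate the pose so that the theoretical and normal
        # absolute positions of the marking coincide.
        x = est_x + (normal_pos[0] - rotated[0])
        y = est_y + (normal_pos[1] - rotated[1])
        return x, y, heading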
Returning to the process flow, in S1505, the vehicle control device 410 estimates the values corrected in S1504 as the normal current position (the actual position and the actual orientation) of the vehicle V.
As described so far, a vehicle control device 410 according to the embodiment includes a travel control unit 523 that controls a traveling state of a vehicle V to achieve autonomous travel in a parking lot P. Further, the vehicle control device 410 includes the following: a communication control unit 521 that acquires parking lot data capable of identifying the absolute position of a road surface marking, including the absolute orientation thereof, provided on a road surface of the parking lot P; a sensor data acquisition unit 522 that acquires image data obtained by an on-board camera 408; and a position estimation unit 524 that calculates the relative position of the road surface marking with respect to the vehicle V, including the relative orientation thereof, on the image data during the autonomous travel by detecting road surface marking data related to the road surface marking from the image data, and that estimates the actual position of the vehicle V, including the actual orientation thereof, on the basis of the calculated relative position and the parking lot data.
On the basis of the above structure, the embodiment is capable of accurately finding the current position (the actual position including the actual orientation) of the vehicle V during the autonomous travel by taking into account deviations between the theoretical position (and orientation) of the road surface marking identified using the relative position calculated on the basis of the image data, and the normal absolute position (and the absolute orientation) of the road surface marking identified on the basis of the parking lot data.
According to the embodiment, the position estimation unit 524 may calculate the relative position (including the relative orientation) of the road surface marking located on either the left or right side of the vehicle V by detecting the road surface marking data from side image data that is the image data representative of the situation on either the left or right side of the vehicle V. This structure is capable of easily calculating the relative position (including the relative orientation) of the road surface marking by using the side image data that tends to capture the road surface marking.
Further, according to the embodiment, the communication control unit 521 may acquire, as the parking lot data, boundary line data capable of identifying the absolute position of a boundary line L that is the road surface marking indicative of a boundary of a parking space R that is pre-provided in the parking lot P, and the position estimation unit 524 may calculate the relative position (including the relative orientation) of the boundary line L by detecting, as the road surface marking data, the position of an end portion E of the boundary line L and the orientation of the boundary line L on the image data, and may estimate the actual position (including the actual orientation) of the vehicle V on the basis of the calculated relative position and the boundary line data. This structure is capable of easily estimating the actual position of the vehicle V by using the boundary line L that is commonly provided as the road surface marking indicative of the boundary of the parking space R.
In a structure like the one described above using the boundary line L, the following of the boundary line L on the image data are detected as the road surface marking data by the position estimation unit 524: the position of the end portion E located on an opening portion side of the parking space R that is delineated by the boundary line L in such a manner as to have an opening portion (an entrance and exit for the vehicle V); and the direction of extension of the boundary line L including the end portion E (the longitudinal direction of the parking space R). This structure is capable of easily estimating the actual orientation and the actual position of the vehicle V by using the position of the end portion E of the boundary line L that is located on the opening portion side of the parking space R and the direction of extension of the boundary line L including the end portion E.
In a structure like the one described above using the boundary line L, the following of the boundary line L on the image data are detected as the road surface marking data by the position estimation unit 524: the position of the central point of the end portion E; and the direction of extension of the boundary line L including the end portion E (the longitudinal direction of the parking space R). This structure is capable of easily estimating the actual orientation and the actual position of the vehicle V by using the position of the central point of the end portion E of the boundary line L and the direction of extension of the boundary line L including the end portion E.
Further, according to the embodiment, the communication control unit 521 may acquire, as the parking lot data, marker data that is capable of identifying the absolute position (including the absolute orientation) of a first marker (for example, the markers M91 and M92 including the line segments LS91 and LS92). In this case, the position estimation unit 524 may calculate the relative position (including the relative orientation) of the first marker by detecting the road surface marking data related to the first marker from the image data, and may estimate the actual position (including the actual orientation) of the vehicle V on the basis of the calculated relative position and the marker data. This structure is capable of easily estimating the actual position of the vehicle V by using the first marker provided in the parking lot P.
Further, according to the embodiment, the communication control unit 521 may acquire, as the parking lot data, boundary line data and marker data. The boundary line data is capable of identifying the absolute position of a boundary line L pre-provided in the parking lot P. The marker data is capable of identifying the absolute position of a second marker (for example, the markers M101 to M103 including the line segments LS101 to LS103). In this case, the position estimation unit 524 may estimate the actual position (including the actual orientation) of the vehicle V by detecting, as the road surface marking data, data related to at least one of the boundary line L and the second marker from the image data. This structure is capable of easily estimating the actual position of the vehicle V on the basis of a combination of the boundary line L and the second marker.
Further, in the embodiment, the position estimation unit 524 first detects, as the road surface marking data, a first value indicative of the orientation and position of the road surface marking in a first coordinate system on the image data (for example, a coordinate system with an origin at the center of the image data). Then, the position estimation unit 524 converts the first value into a second value indicative of the orientation and position of the road surface marking in a second coordinate system with an origin at the center of the vehicle V, and calculates the second value as the relative orientation and the relative position of the road surface marking. This structure is capable of easily calculating the relative orientation and the relative position of the road surface marking through coordinate transformation on the image data.
Further, in the embodiment, the position estimation unit 524 calculates the theoretical absolute orientation and the theoretical absolute position of the road surface marking on the basis of estimation values of the actual orientation and the actual position of the vehicle V and on the basis of the relative orientation and the relative position of the road surface marking. The estimation values of the actual orientation and the actual position of the vehicle V are based on previous estimation results of the actual orientation and the actual position of the vehicle V and based on the amounts of change in the actual orientation and the actual position of the vehicle V that are based on odometry. Then, the position estimation unit 524 extracts, from the parking lot data, partial data corresponding to an area around the theoretical absolute position, corrects the estimation values of the actual orientation and the actual position of the vehicle V on the basis of differences of the theoretical absolute position from the absolute orientation and the absolute position that are based on the partial data, and estimates the actual orientation and the actual position of the vehicle V on the basis of the corrected values. This structure is capable of easily estimating the actual orientation and the actual position of the vehicle V by using the partial data, not using all the parking lot data.
In this case, after correcting the estimation value of the actual orientation such that the theoretical absolute orientation coincides with the absolute orientation that is based on the partial data, the position estimation unit 524 corrects the estimation value of the actual position such that the theoretical absolute position coincides with the absolute position that is based on the partial data. This structure is capable of easily correcting the actual orientation and the actual position of the vehicle V in a stepwise manner.
In the embodiment, the communication control unit 521 may acquire parking lot data including information on the absolute position of each of multiple road surface markings. During the autonomous travel, the position estimation unit 524 may calculate the relative positions (not including relative orientations) of at least two of the multiple road surface markings with respect to the vehicle V on the image data by detecting road surface marking data related to the at least two road surface markings from the image data, and may estimate the actual position of the vehicle V including the actual orientation thereof on the basis of the calculated relative positions and the parking lot data. This structure is capable of accurately finding the current position (the actual position including the actual orientation) of the vehicle during the autonomous travel by taking into account deviations of the theoretical positions of the at least two road surface markings (and the positional relationship therebetween) that are identified using the relative positions calculated on the basis of the image data from the (normal) absolute positions of the at least two road surface markings (and the positional relationship therebetween) that are identified on the basis of the parking lot data, without taking into account the relative positions of the road surface markings.
In the above structure that calculates the relative positions only of multiple road surface markings, the position estimation unit 524 may calculate the relative position of at least one first road surface marking and the relative position of at least one second road surface marking by detecting, as the road surface marking data, a first position of the first road surface marking (for example, the position of the end portion E72 of the boundary line L72) and a second position of the second road surface marking (for example, the position of the end portion E76 of the boundary line L76) on the image data. This structure is capable of easily estimating the actual position of the vehicle V by using the positions of the first road surface marking and the second road surface marking.
Further, in the above structure that calculates the relative positions only of multiple road surface markings, the position estimation unit 524 may calculate the relative positions of at least two road surface markings that are located on either the left side or the right side of the vehicle V by detecting, as the road surface marking data, the position of each of the at least two road surface markings (for example, the positions of the end portions E81 and E82 of the boundary lines L81 and L82) from side image data that is the image data representative of the situation on either the left side or the right side of the vehicle V. This structure is capable of easily calculating the relative positions of the at least two road surface markings by using the side image data that tends to capture the road surface markings.
In this case, the position estimation unit 524 detects, as the road surface marking data, the positions of end portions E of at least two boundary lines L on the image data, and the end portions E are located on an opening portion side (an entrance and exit for the vehicle V) of a parking space that is delineated by the boundary lines L in such a manner as to have the opening portion. This structure is capable of easily estimating the actual position of the vehicle V by using the positions of the end portions E of the at least two boundary lines L that are located on the opening portion side of the parking space R.
Further, in this structure, the position estimation unit 524 detects, as the road surface marking data, the positions of the central points of the end portions E of the at least two boundary lines L on the image data. This structure is capable of easily estimating the actual position of the vehicle V by using the positions of the central points of the end portions E of the at least two boundary lines L.
Further, in the above structure that calculates the relative positions only of multiple road surface markings, the communication control unit 521 may acquire, as the parking lot data, boundary line data and marker data. The boundary line data is capable of identifying the absolute positions of end portions E of multiple boundary lines L. The marker data is capable of identifying the absolute positions of multiple markers M. In this case, the position estimation unit 524 may estimate the actual position of the vehicle V by detecting the road surface marking data that is related to at least two of the multiple boundary lines L, at least two of the multiple markers M, or both at least one of the multiple boundary lines L and at least one of the multiple markers M. This structure is capable of easily estimating the actual position of the vehicle V on the basis of a combination of any two or more of the multiple boundary lines L and the multiple markers M.
Further, in the above structure that calculates the relative positions only of multiple road surface markings, the position estimation unit 524 may first detect, as the road surface marking data, first values indicative of the positions of at least two road surface markings in a first coordinate system on the image data (for example, a coordinate system with an origin at the center of the image data). Then, the position estimation unit 524 may convert the first values into second values indicative of the positions of the at least two road surface markings in a second coordinate system with an origin at the center of the vehicle V, and may calculate the second values as the relative positions of the at least two road surface markings. This structure is capable of easily calculating the relative positions of the at least two road surface markings through coordinate transformation on the image data.
In the vehicle position estimation device according to another example, the position estimation unit 524 may calculate theoretical absolute positions of at least two road surface markings on the basis of an estimation value of the actual position of the vehicle V and on the basis of the relative positions of the at least two road surface markings. The estimation value of the actual position of the vehicle is based on a previous estimation result of the actual position of the vehicle V and based on an amount of change in the actual position of the vehicle V that is based on odometry. Then, the position estimation unit 524 may extract, from the parking lot data, partial data corresponding to an area around the theoretical absolute positions, may correct the estimation value of the actual position of the vehicle V on the basis of differences of the theoretical absolute positions from the absolute positions that are based on the partial data, and may estimate the actual position of the vehicle V on the basis of the corrected value. This structure is capable of easily estimating the actual position of the vehicle V by using the partial data, not using all the parking lot data.
The embodiment described above illustrates that the technology of the preferred embodiment is applied to automated valet parking systems. However, the technology of the preferred embodiment is applicable to parking systems other than automated valet parking systems, as long as appropriate road surface markings are provided in a parking lot, and the parking systems are capable of acquiring data related to the absolute positions of the road surface markings.
The embodiment described above illustrates that the vehicle control device provided as a vehicle position estimation device includes the travel control unit, in addition to the communication control unit as a parking lot data acquisition unit, the sensor data acquisition unit as an image data acquisition unit, and the position estimation unit. However, in the embodiment, a device other than the vehicle control device and not including the travel control unit may be provided as the vehicle position estimation device, as long as the other device includes at least the parking lot data acquisition unit, the image data acquisition unit, and the position estimation unit described above.
Although embodiments of the preferred embodiment have been described above, the embodiments are merely given by way of example and are not intended to limit the scope of the invention. The novel embodiments described above may be implemented in various forms, and various omissions, substitutions, and changes may be made without departing from the spirit of the invention. The embodiments and modifications thereof fall within the scope and spirit of the invention, as defined by the claims and equivalents thereof.
408: ON-BOARD CAMERA
410: VEHICLE CONTROL DEVICE (VEHICLE POSITION ESTIMATION DEVICE)
521: COMMUNICATION CONTROL UNIT (PARKING LOT DATA ACQUISITION UNIT)
522: SENSOR DATA ACQUISITION UNIT (IMAGE DATA ACQUISITION UNIT)
523: TRAVEL CONTROL UNIT
524: POSITION ESTIMATION UNIT
E, E1, E11, E12, E21, E51, E62, E72, E76, E81, E82: END PORTION
L, L1, L11, L12, L21, L51, L52, L61-L63, L71-L76, L81-L83, L91-L93, L101-L102: BOUNDARY LINE
M, M91, M92, M101-M103: MARKER
P: PARKING LOT
R: PARKING SPACE
V: VEHICLE
Priority application: Number 2017-221899 | Date: Nov 2017 | Country: JP | Kind: national
International filing: Document PCT/JP2018/042586 | Filing Date: 11/16/2018 | Country: WO | Kind: 00