The present invention relates to a vehicle control system, a vehicle control method, and a vehicle control program.
Priority is claimed on Japanese Patent Application No. 2016-108527, filed on May 31, 2016, the content of which is incorporated herein by reference.
Conventionally, devices that determine a steering angle of a subject vehicle on the basis of a running locus of a preceding vehicle are known (for example, Patent Literature 1). This following system sets a perpendicular point at which a perpendicular line extending from the position of the subject vehicle toward the running locus of the preceding vehicle intersects the running locus. In addition, the following system calculates a predicted position reached when the subject vehicle runs from the perpendicular point at its current speed for a predetermined time and performs steering control on the basis of a radius of curvature of the running locus of the preceding vehicle at the predicted position.
[Patent Literature 1]
Japanese Unexamined Patent Application, First Publication No. H10-100738
In a case in which the running locus of the preceding vehicle at the predicted position has a small radius of curvature and forms a sharp curve, there are cases in which the steering angle of the subject vehicle changes greatly.
An aspect according to the present invention has been made in consideration of such situations, and one object thereof is to provide a vehicle control system, a vehicle control method, and a vehicle control program capable of realizing smoother steering control.
(1) According to one aspect of the present invention, there is provided a vehicle control system including: a position recognizing unit configured to recognize a position of a vehicle; a locus generating unit configured to generate a target locus of the vehicle; and a running control unit configured to set a reference position with respect to the position of the vehicle recognized by the position recognizing unit on the target locus generated by the locus generating unit and control steering of the vehicle on the basis of a circular arc that has a tangent along a travelling direction of the vehicle and passes through the reference position and the position of the vehicle.
(2) In the aspect (1) described above, the running control unit may set a position on the target locus of the vehicle in a case in which the vehicle is assumed to have run on the target locus for a predetermined time or over a predetermined distance from a position on the target locus that is the closest to the position of the vehicle recognized by the position recognizing unit as the reference position.
(3) In the aspect (1) or (2) described above, the running control unit may derive a first index value based on the circular arc and a second index value for increasing control of steering of the vehicle as a deviation between the reference position and the position of the vehicle in a direction orthogonal to the travelling direction of the vehicle becomes larger and control the steering of the vehicle on the basis of the first index value and the second index value.
(4) In the aspect (3) described above, in a case in which the deviation is equal to or greater than a first predetermined value, the running control unit may limit the control of the steering of the vehicle.
(5) In any one of the aspects (1) to (4) described above, in a case in which a curvature of the circular arc exceeds a second predetermined value, the running control unit may limit the control of the steering of the vehicle.
(6) In the aspect (3) described above, the running control unit may control the steering of the vehicle on the basis of a position of the vehicle on the circular arc in a case in which the vehicle runs on the circular arc for a time shorter than the predetermined time used for acquiring the reference position and the position of the vehicle recognized by the position recognizing unit.
(7) According to one aspect of the present invention, there is provided a vehicle control method using an in-vehicle computer, the vehicle control method including: generating a future target locus of the vehicle; and setting a reference position with respect to a position of the vehicle recognized by a position recognizing unit recognizing the position of the vehicle on the generated future target locus and controlling steering of the vehicle on the basis of a circular arc that has a tangent along a travelling direction of the vehicle and passes through the reference position and the position of the vehicle.
(8) According to one aspect of the present invention, there is provided a vehicle control program causing an in-vehicle computer to execute: generating a future target locus of the vehicle; and setting a reference position with respect to a position of the vehicle recognized by a position recognizing unit recognizing the position of the vehicle on the generated future target locus and controlling steering of the vehicle on the basis of a circular arc that has a tangent along a travelling direction of the vehicle and passes through the reference position and the position of the vehicle.
According to the aspects (1), (2), and (4) to (8) described above, the steering of a vehicle is controlled on the basis of a circular arc passing through a reference position and a position of the vehicle, and accordingly, smoother steering control can be realized.
According to the aspect (3) described above, the running control unit controls the steering of a vehicle on the basis of a first index value based on a circular arc and a second index value for increasing control of the steering of the vehicle as a deviation between the reference position and the position of the vehicle in a direction orthogonal to the travelling direction of the vehicle becomes larger, whereby steering can be controlled such that the vehicle is closer to the target locus.
Hereinafter, a vehicle control system, a vehicle control method, and a vehicle control program according to embodiments of the present invention will be described with reference to the drawings.
As illustrated in
Each of the finders 20-1 to 20-7, for example, is a light detection and ranging or a laser imaging detection and ranging (LIDAR) device measuring a distance to a target by measuring scattered light from emitted light. For example, the finder 20-1 is mounted on a front grille or the like, and the finders 20-2 and 20-3 are mounted on side faces of a vehicle body, door mirrors, inside of head lights, on near side lights, or the like. The finder 20-4 is mounted in a trunk lid or the like, and the finders 20-5 and 20-6 are mounted on side faces of the vehicle body, inside of tail lamps or the like. Each of the finders 20-1 to 20-6 described above, for example, has a detection area of about 150 degrees with respect to a horizontal direction. In addition, the finder 20-7 is mounted on a roof or the like.
For example, the finder 20-7 has a detection area of 360 degrees with respect to a horizontal direction. The radars 30-1 and 30-4, for example, are long-distance millimeter wave radars having a wider detection area in a depth direction than that of the other radars. In addition, the radars 30-2, 30-3, 30-5, and 30-6 are middle-distance millimeter wave radars having a narrower detection area in a depth direction than that of the radars 30-1 and 30-4.
Hereinafter, in a case in which the finders 20-1 to 20-7 are not particularly distinguished from each other, one thereof will be simply referred to as a “finder 20,” and, in a case in which the radars 30-1 to 30-6 are not particularly distinguished from each other, one thereof will be simply referred to as a “radar 30.” The radar 30, for example, detects an object using a frequency modulated continuous wave (FM-CW) system.
The camera 40, for example, is a digital camera using a solid-state imaging device such as a charge coupled device (CCD), a complementary metal oxide semiconductor (CMOS), or the like. The camera 40 is mounted in an upper part of a front windshield, on a rear face of an interior mirror, or the like. The camera 40, for example, periodically images the area in front of the subject vehicle M. The camera 40 may be a stereo camera including a plurality of cameras.
The configuration illustrated in
The navigation device 50 includes a global navigation satellite system (GNSS) receiver, map information (navigation map), a touch panel-type display device functioning as a user interface, a speaker, a microphone, and the like. The navigation device 50 identifies a location of the subject vehicle M using the GNSS receiver and derives a route from the location to a destination designated by a user. The route derived by the navigation device 50 is provided to the target lane determining unit 110 of the vehicle control system 100. The location of the subject vehicle M may be identified or complemented by an inertial navigation system (INS) using an output of the vehicle sensor 60. In addition, when the vehicle control system 100 implements a manual driving mode, the navigation device 50 performs guidance using speech or a navigation display for a route to the destination. Components used for identifying the location of the subject vehicle M may be disposed to be independent from the navigation device 50. In addition, the navigation device 50, for example, may be realized by a function of a terminal device such as a smartphone, a tablet terminal, or the like held by a user or the like. In such a case, information is transmitted and received using wireless or wired communication between the terminal device and the vehicle control system 100.
The communication device 55, for example, performs radio communication using a cellular network, a Wi-Fi network, Bluetooth (registered trademark), dedicated short range communication (DSRC), or the like.
The vehicle sensor 60 includes a vehicle speed sensor detecting a vehicle speed, an acceleration sensor detecting an acceleration, a yaw rate sensor detecting an angular velocity around a vertical axis, an azimuth sensor detecting the azimuth of the subject vehicle M, and the like.
The display device 62 displays information as an image. The display device 62, for example, includes a liquid crystal display (LCD), an organic electroluminescence (EL) display device, or the like. In this embodiment, the display device 62 will be described as being a head-up display that displays an image inside the field of view of a vehicle occupant by reflecting the image onto a front window of the subject vehicle M. In addition, the display device 62 may be a display device included in the navigation device 50 or a display device of an instrument panel that displays the state (a speed and the like) of the subject vehicle M. The speaker 64 outputs information as speech.
The operation device 70, for example, includes an accelerator pedal, a steering wheel, a brake pedal, a shift lever, and the like. An operation detecting sensor 72 that detects the presence/absence of a driver's operation and the amount of the operation is mounted in the operation device 70. The operation detecting sensor 72, for example, includes an accelerator opening degree sensor, a steering torque sensor, a brake sensor, a shift position sensor, and the like. The operation detecting sensor 72 outputs an accelerator opening degree, a steering torque, a brake depression amount, a shift position, and the like to the running control unit 160 as detection results. Instead of this, a detection result acquired by the operation detecting sensor 72 may be directly output to the running driving force output device 200, the steering device 210, or the brake device 220.
The changeover switch 80 is a switch that is operated by a driver or the like. The changeover switch 80 accepts an operation of a driver or the like, generates a control mode designation signal designating the control mode of the running control unit 160 as either the automatic driving mode or the manual driving mode, and outputs the generated control mode designation signal to the switching control unit 150. The automatic driving mode, as described above, is a driving mode in which running is performed in a state in which a driver does not perform an operation (or the amount of operation is smaller, or the operation frequency is lower, than in the manual driving mode) and, more specifically, is a driving mode in which some or all of the running driving force output device 200, the steering device 210, and the brake device 220 are controlled on the basis of an action plan. In addition, the changeover switch 80 may accept various operations in addition to the operation of switching the automatic driving mode.
Before description of the vehicle control system 100, the running driving force output device 200, the steering device 210, and the brake device 220 will be described.
The running driving force output device 200 outputs a running driving force (torque) used for running the vehicle to driving wheels. For example, the running driving force output device 200 includes an engine, a transmission, and an engine control unit (ECU) controlling the engine in a case in which the subject vehicle M is an automobile having an internal combustion engine as its power source, includes a running motor and a motor ECU controlling the running motor in a case in which the subject vehicle M is an electric vehicle having a motor as its power source, and includes an engine, a transmission, an engine ECU, a running motor, and a motor ECU in a case in which the subject vehicle M is a hybrid vehicle. In a case in which the running driving force output device 200 includes only an engine, the engine ECU adjusts a throttle opening degree, a shift level, and the like of the engine in accordance with information input from a running control unit 160 to be described later. On the other hand, in a case in which the running driving force output device 200 includes only a running motor, the motor ECU adjusts a duty ratio of a PWM signal given to the running motor in accordance with information input from the running control unit 160. In a case in which the running driving force output device 200 includes an engine and a running motor, an engine ECU and a motor ECU control a running driving force in cooperation with each other in accordance with information input from the running control unit 160.
The steering device 210, for example, includes a steering ECU and an electric motor.
The electric motor, for example, changes the direction of steered wheels by applying a force to a rack and pinion mechanism. The steering ECU changes the direction of the steered wheels by driving the electric motor in accordance with information input from the vehicle control system 100 or input information of a steering angle or a steering torque.
The brake device 220, for example, is an electric servo brake device including a brake caliper, a cylinder delivering hydraulic pressure to the brake caliper, an electric motor generating hydraulic pressure in the cylinder, and a brake control unit. The brake control unit of the electric servo brake device performs control of the electric motor in accordance with information input from the running control unit 160 such that a brake torque according to a braking operation is output to each vehicle wheel. The electric servo brake device may include a mechanism delivering hydraulic pressure generated by an operation of the brake pedal to the cylinder through a master cylinder as a backup. In addition, the brake device 220 is not limited to the electric servo brake device described above and may be an electronic control-type hydraulic brake device. The electronic control-type hydraulic brake device delivers hydraulic pressure of the master cylinder to the cylinder by controlling an actuator in accordance with information input from the running control unit 160. In addition, the brake device 220 may include a regenerative brake using the running motor which can be included in the running driving force output device 200.
Hereinafter, the vehicle control system 100 will be described. The vehicle control system 100, for example, is realized by one or more processors or hardware having functions equivalent thereto. The vehicle control system 100 may be configured by combining an electronic control unit (ECU), a micro-processing unit (MPU), or the like in which a processor such as a CPU, a storage device, and a communication interface are interconnected through an internal bus.
Referring to
In the storage unit 180, for example, information such as high-accuracy map information 182, target lane information 184, action plan information 186, and the like is stored. The storage unit 180 is realized by a read only memory (ROM), a random access memory (RAM), a hard disk drive (HDD), a flash memory, or the like. A program executed by the processor may be stored in the storage unit 180 in advance or may be downloaded from an external device through in-vehicle internet facilities or the like. In addition, a program may be installed in the storage unit 180 by mounting a portable-type storage medium storing the program in a drive device not illustrated in the drawing. Furthermore, the vehicle control system 100 may be distributed using a plurality of computer devices.
The target lane determining unit 110, for example, is realized by an MPU. The target lane determining unit 110 divides a route provided from the navigation device 50 into a plurality of blocks (for example, divides the route at every 100 [m] in the vehicle travelling direction) and determines a target lane for each block by referring to the high-accuracy map information 182. The target lane determining unit 110, for example, determines the lane in which the subject vehicle is to run, the lane being represented as a position counted from the left side. For example, in a case in which a branching point, a merging point, or the like is present in the route, the target lane determining unit 110 determines a target lane such that the subject vehicle M can run on a rational running route for advancing to the branching destination. The target lane determined by the target lane determining unit 110 is stored in the storage unit 180 as the target lane information 184.
The high-accuracy map information 182 is map information having a higher accuracy than the navigation map included in the navigation device 50. The high-accuracy map information 182, for example, includes information of the center of a lane, information of the boundaries of a lane, and the like. In addition, the high-accuracy map information 182 may include road information, traffic regulations information, address information (an address and a zip code), facilities information, telephone number information, and the like. The road information includes information representing the type of road, such as an expressway, a toll road, a national road, or a prefectural road, and information such as the number of lanes of a road, the width of each lane, the gradient of a road, the position of a road (three-dimensional coordinates including longitude, latitude, and height), the curvature of the curve of a lane, the locations of merging and branching points of lanes, signs installed on a road, and the like. The traffic regulations information includes information of the closure of a lane due to roadwork, traffic accidents, congestion, or the like.
The automatic driving mode control unit 130 determines an automatic driving mode executed by the automatic driving control unit 120. The following modes are included in the automatic driving mode according to this embodiment. The following are merely examples, and the number and types of the automatic driving mode may be arbitrarily determined.
A mode A is a mode of which the degree of automatic driving is the highest. In a case in which the mode A is executed, all vehicle control, including complicated control such as merging, is performed automatically, and accordingly, a vehicle occupant does not need to monitor the vicinity or the state of the subject vehicle M.
A mode B is a mode of which the degree of automatic driving is the second highest after the mode A. In a case in which the mode B is executed, all vehicle control is generally performed automatically, but the driving operation of the subject vehicle M may be handed over to a vehicle occupant in accordance with situations. For this reason, the vehicle occupant needs to monitor the vicinity and the state of the subject vehicle M.
A mode C is a mode of which the degree of automatic driving is the third highest after the mode B. In a case in which the mode C is executed, a vehicle occupant needs to perform a confirmation operation on the changeover switch 80 in accordance with situations. In the mode C, for example, in a case in which a timing for a lane change is notified to a vehicle occupant and the vehicle occupant performs an operation instructing a lane change on the changeover switch 80, an automatic lane change is performed. For this reason, the vehicle occupant needs to monitor the vicinity and the state of the subject vehicle M.
The automatic driving mode control unit 130 determines an automatic driving mode on the basis of a vehicle occupant's operation on the changeover switch 80, an event determined by the action plan generating unit 144, a running mode determined by the locus generating unit 146, and the like. In the automatic driving mode, a limit according to the performance and the like of the detection device DD of the subject vehicle M may be set. For example, in a case in which the performance of the detection device DD is low, the mode A may not be executed. In any one of the modes, switching to the manual driving mode (overriding) can be made by performing an operation on the components of the driving operation system or on the changeover switch 80.
The subject vehicle position recognizing unit 140 of the automatic driving control unit 120 recognizes a lane (running lane) in which the subject vehicle M is running and a relative position of the subject vehicle M with respect to the running lane on the basis of the high-accuracy map information 182 stored in the storage unit 180 and information input from the finder 20, the radar 30, the camera 40, the navigation device 50, or the vehicle sensor 60.
For example, the subject vehicle position recognizing unit 140 compares a pattern of road partition lines recognized in the high-accuracy map information 182 (for example, an array of solid lines and broken lines) with a pattern of road partition lines in the vicinity of the subject vehicle M that has been recognized in an image captured by the camera 40, thereby recognizing a running lane.
In the recognition, the position of the subject vehicle M acquired from the navigation device 50 or a result of the process executed by an INS may be additionally taken into account.
The external system recognizing unit 142 recognizes the state of each surrounding vehicle, such as its position, speed, and acceleration, on the basis of information input from the finder 20, the radar 30, the camera 40, and the like. For example, a surrounding vehicle is a vehicle running in the vicinity of the subject vehicle M in the same direction as the subject vehicle M. The position of a surrounding vehicle may be represented by a representative point of the other vehicle, such as the center of gravity or a corner, or by an area represented by the contour of the other vehicle. The “state” of a surrounding vehicle may include the acceleration of the surrounding vehicle and whether or not a lane is being changed (or whether or not a lane is to be changed), acquired on the basis of the information of the various devices described above. In addition, the external system recognizing unit 142 may recognize the positions of a guard rail, a telegraph pole, a parked vehicle, a pedestrian, and other objects in addition to surrounding vehicles.
The action plan generating unit 144 sets a start point of automatic driving and/or a destination of automatic driving. The start point of automatic driving may be the current position of the subject vehicle M or a point at which an operation instructing automatic driving is performed. The action plan generating unit 144 generates an action plan for a section between the start point and a destination of the automatic driving. The section is not limited thereto, and the action plan generating unit 144 may generate an action plan for an arbitrary section.
The action plan, for example, is composed of a plurality of events that are sequentially executed. The events, for example, include a deceleration event of decelerating the subject vehicle M, an acceleration event of accelerating the subject vehicle M, a lane keeping event of causing the subject vehicle M to run without deviating from a running lane, a lane changing event of changing a running lane, an overtaking event of causing the subject vehicle M to overtake a preceding vehicle, a branching event of changing lanes to a desired lane at a branching point or causing the subject vehicle M to run without deviating from the current running lane, a merging event of accelerating/decelerating the subject vehicle M in a merging lane and changing the running lane so as to merge into a main lane, and a handover event of transitioning from the manual driving mode to the automatic driving mode at a start point of automatic driving or transitioning from the automatic driving mode to the manual driving mode at a planned end point of automatic driving. The action plan generating unit 144 sets a lane changing event, a branching event, or a merging event at a place at which a target lane determined by the target lane determining unit 110 is changed. Information representing the action plan generated by the action plan generating unit 144 is stored in the storage unit 180 as the action plan information 186.
For example, when a lane keeping event is executed, the running mode determining unit 146A may determine one running mode among constant-speed running, following running, low-speed following running, decelerating running, curve running, obstacle avoidance running, and the like. In this case, in a case in which another vehicle is not present in front of the subject vehicle M, the running mode determining unit 146A may determine constant-speed running as the running mode. In addition, in a case in which following running for a preceding vehicle is to be executed, the running mode determining unit 146A may determine following running as the running mode. In addition, in the case of congestion or the like, the running mode determining unit 146A may determine low-speed following running as the running mode. Furthermore, in a case in which deceleration of a preceding vehicle is recognized by the external system recognizing unit 142 or in a case in which an event of stopping, parking, or the like is to be executed, the running mode determining unit 146A may determine decelerating running as the running mode. In addition, in a case in which the subject vehicle M is recognized to have reached a curved road by the external system recognizing unit 142, the running mode determining unit 146A may determine curve running as the running mode. Furthermore, in a case in which an obstacle is recognized in front of the subject vehicle M by the external system recognizing unit 142, the running mode determining unit 146A may determine obstacle avoidance running as the running mode. In addition, in a case in which a lane changing event, an overtaking event, a branching event, a merging event, a handover event, or the like is executed, the running mode determining unit 146A may determine a running mode corresponding to each event.
The locus candidate generating unit 146B generates candidates for a locus on the basis of the running mode determined by the running mode determining unit 146A.
The locus candidate generating unit 146B, for example, determines loci as illustrated in
In this way, since the locus points K include a speed component, the locus candidate generating unit 146B needs to give a target speed to each of the locus points K. The target speed is determined in accordance with the running mode determined by the running mode determining unit 146A.
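As an illustration only (the embodiment does not specify a data structure), the following is a minimal sketch of a locus point that carries a position together with the target speed given to it; the class and field names are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class LocusPoint:
    """A point of the target locus: a position the subject vehicle should reach
    at a sampled future time, together with the target speed given to it."""
    x: float              # longitudinal position [m]
    y: float              # lateral position [m]
    target_speed: float   # [m/s], determined from the running mode

# A short constant-speed locus sampled at regular intervals at 10 m/s.
locus = [LocusPoint(1.0 * i, 0.0, 10.0) for i in range(5)]
print(locus[2])
```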
Here, a technique for determining a target speed in a case in which a lane change (including branching) is performed will be described.
The locus candidate generating unit 146B, first, sets a lane change target position (or a merging target position). The lane change target position is set as a relative position with respect to a surrounding vehicle and is for determining “surrounding vehicles between which a lane change is performed.” The locus candidate generating unit 146B determines a target speed of a case in which a lane change is performed focusing on three surrounding vehicles using the lane change target position as a reference.
In the drawing, an own lane L1 and an adjacent lane L2 are illustrated. Here, in the same lane as the subject vehicle M, a surrounding vehicle running immediately in front of the subject vehicle M will be defined as a vehicle mA running ahead, a surrounding vehicle running immediately in front of the lane change target position TA will be defined as a front reference vehicle mB, and a surrounding vehicle running immediately behind the lane change target position TA will be defined as a rear reference vehicle mC. While the subject vehicle M needs to accelerate or decelerate to move to the lateral side of the lane change target position TA, it must at the same time avoid overtaking the vehicle mA running ahead. For this reason, the locus candidate generating unit 146B predicts the future states of the three surrounding vehicles and sets a target speed such that the subject vehicle M does not interfere with any of them.
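As an illustration, the following is a minimal constant-speed sketch of the interference check implied here; the embodiment only states that the future states of the three surrounding vehicles are predicted and a target speed without interference is set, so the function, margin, and numbers below are assumptions:

```python
def lane_change_feasible(s_subject, v_subject,
                         s_ahead, v_ahead,
                         s_front_ref, v_front_ref,
                         s_rear_ref, v_rear_ref,
                         t_plan, margin=5.0):
    """Check, under a constant-speed prediction, that after t_plan seconds the
    subject vehicle sits between the rear and front reference vehicles of the
    adjacent lane without having overtaken the vehicle running ahead.
    Positions are longitudinal coordinates along the road [m]."""
    s_m = s_subject + v_subject * t_plan
    s_a = s_ahead + v_ahead * t_plan
    s_f = s_front_ref + v_front_ref * t_plan
    s_r = s_rear_ref + v_rear_ref * t_plan
    no_overtake = s_m <= s_a - margin                  # do not pass the vehicle ahead
    fits_in_gap = (s_r + margin) <= s_m <= (s_f - margin)
    return no_overtake and fits_in_gap

# Candidate target speeds can be screened with this predicate.
print(lane_change_feasible(0.0, 22.0, 30.0, 20.0, 40.0, 21.0, -10.0, 21.0, t_plan=5.0))
```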
The evaluation/selection unit 146C evaluates the locus candidates generated by the locus candidate generating unit 146B, for example, from the two viewpoints of planning and safety, and selects a target locus to be output to the running control unit 160. From the viewpoint of planning, for example, a locus is evaluated highly in a case in which it follows a plan that has already been generated (for example, an action plan) well and in which the total length of the locus is short. For example, in a case in which a lane change to the right side is desirable, a locus in which a lane change to the left side is temporarily performed and the subject vehicle then returns is evaluated low. From the viewpoint of safety, for example, a locus is evaluated highly in a case in which, at each locus point, the distance between the subject vehicle M and an object (a surrounding vehicle or the like) is long and the amounts of change in the acceleration/deceleration and the steering angle are small.
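A toy sketch of such a two-viewpoint evaluation, under the assumption that each candidate has already been summarized into scalar features; the weights and feature names are illustrative, not taken from the embodiment:

```python
def evaluate_candidate(length, plan_conformity, min_obstacle_gap,
                       max_accel_change, max_steer_change,
                       w=(1.0, 1.0, 1.0, 1.0, 1.0)):
    """Toy score combining the planning viewpoint (short, plan-conforming locus)
    and the safety viewpoint (large obstacle gap, small acceleration/steering changes)."""
    planning = w[0] * plan_conformity - w[1] * length
    safety = w[2] * min_obstacle_gap - w[3] * max_accel_change - w[4] * max_steer_change
    return planning + safety

candidates = {
    "keep_right": evaluate_candidate(120.0, 1.0, 8.0, 0.5, 0.02),
    "detour_left": evaluate_candidate(150.0, 0.2, 8.0, 0.8, 0.05),
}
print(max(candidates, key=candidates.get))  # the plan-conforming, shorter locus wins
```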
The switching control unit 150 performs switching between the automatic driving mode and the manual driving mode on the basis of a signal input from the changeover switch 80. In addition, the switching control unit 150 switches the driving mode from the automatic driving mode to the manual driving mode on the basis of an operation instructing acceleration, deceleration, or steering on the operation device 70. For example, in a case in which a state in which the amount of operation represented by a signal input from the operation device 70 exceeds a threshold continues for a reference time or more, the switching control unit 150 may switch the driving mode from the automatic driving mode to the manual driving mode (overriding). In addition, in a case in which an operation on the operation device 70 has not been detected for a predetermined time after the switching to the manual driving mode according to overriding, the switching control unit 150 may return the driving mode to the automatic driving mode.
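A minimal sketch of the overriding condition described above, assuming one operation-amount sample per control period; the threshold, sampling period, and reference time are illustrative values:

```python
def should_override(operation_amounts, threshold, sample_time, reference_time):
    """Switch to manual driving when the operation amount stays above the
    threshold for the reference time or more."""
    elapsed = 0.0
    for amount in operation_amounts:                       # one sample per control period
        elapsed = elapsed + sample_time if amount > threshold else 0.0
        if elapsed >= reference_time:
            return True
    return False

# A 0.1 s control period; the steering torque exceeds the threshold for 0.3 s.
print(should_override([0.0, 0.5, 0.6, 0.7, 0.2], threshold=0.4,
                      sample_time=0.1, reference_time=0.3))
```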
The running control unit 160, for example, as illustrated in
The gazing position deriving unit 170 derives a gazing position (reference position) of the subject vehicle M. The gazing position deriving unit 170 sets, as the gazing position, a position on the target locus of the subject vehicle M that the subject vehicle M is assumed to reach when running on the target locus for a predetermined time from the position on the target locus that is closest to the position of the subject vehicle M.
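A minimal sketch of this gazing-position derivation, assuming the target locus is given as a polyline of (x, y) locus points; the function name and the linear interpolation inside a segment are assumptions for illustration:

```python
import math

def derive_gazing_position(trajectory, vehicle_xy, speed, t_ref):
    """Advance along the target locus by speed * t_ref, starting from the locus
    point closest to the vehicle, and return the position reached."""
    # Find the locus point closest to the current vehicle position.
    start_idx = min(range(len(trajectory)),
                    key=lambda i: math.dist(trajectory[i], vehicle_xy))
    # Walk along the locus until the assumed running distance is consumed.
    remaining = speed * t_ref
    for i in range(start_idx, len(trajectory) - 1):
        p, q = trajectory[i], trajectory[i + 1]
        seg = math.dist(p, q)
        if remaining <= seg:
            r = remaining / seg  # interpolate inside this segment
            return (p[0] + r * (q[0] - p[0]), p[1] + r * (q[1] - p[1]))
        remaining -= seg
    return trajectory[-1]  # locus shorter than the look-ahead distance

# Example: straight locus, vehicle slightly off the locus, 0.5 s look-ahead at 10 m/s.
locus = [(float(x), 0.0) for x in range(0, 50)]
print(derive_gazing_position(locus, (2.0, 0.8), speed=10.0, t_ref=0.5))
```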
The first steering angle deriving unit 172 controls the steering of the subject vehicle M on the basis of a virtual circular arc that has a tangent along a travelling direction of the subject vehicle M and passes through the gazing position and the position of the subject vehicle M. Here, the travelling direction of the subject vehicle M may be a direction of a center axis of the vehicle, a direction in which a velocity vector of the subject vehicle M at the moment is directed, or one of directions acquired by performing correction based on a yaw rate for these.
For example, the first steering angle deriving unit 172 derives the position of the subject vehicle M at a time t (the current position; x0, y0), the position of the subject vehicle M at a time t+1 (x1, y1), and the position of the subject vehicle M at a time t+2 (x2, y2) on the target locus. The first steering angle deriving unit 172 derives the curvature of a regular circle by assuming that the subject vehicle M moves, at these times, on a regular circle passing through these three points. The first steering angle deriving unit 172 then derives a steering angle of the subject vehicle M on the basis of the following Equation (1) by assuming that the subject vehicle M moves on the regular circle in a steady state. In the following Equation (1), δ represents a steering angle (steering wheel angle), k represents the curvature of the regular circle, A represents a stability factor, V represents a vehicle speed, L represents an inter-axial distance (wheelbase), and n represents a gear ratio. The steering angle, for example, is represented as an absolute value; the same applies in the following description.
δ = k × (1 + A × V²) × L × n  (1)
In addition, the first steering angle deriving unit 172 may derive the position of the subject vehicle M at a time t (the current position; x0, y0), the position of the subject vehicle M at a time t−1 (−x1, −y1), and the position of the subject vehicle M at a time t+1 (x1, y1) on the target locus and derive the curvature using a regular circle passing through these three points.
In addition, in a case in which the curvature of the circular arc exceeds a predetermined value (a second predetermined value), the first steering angle deriving unit 172 may limit the control of the steering of the subject vehicle M by correcting the curvature of the circular arc to a predetermined value or less. The circular arc is a part of the circumference of the regular circle.
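A minimal sketch of the three-point curvature derivation, Equation (1), and the curvature limit of this paragraph; the stability factor, wheelbase, gear ratio, and curvature limit below are illustrative values, not values from the embodiment:

```python
import math

def regular_circle_curvature(p0, p1, p2):
    """Curvature of the regular circle passing through three points of the target locus."""
    a = math.dist(p0, p1)
    b = math.dist(p1, p2)
    c = math.dist(p2, p0)
    # Twice the signed triangle area via the cross product.
    cross = (p1[0] - p0[0]) * (p2[1] - p0[1]) - (p1[1] - p0[1]) * (p2[0] - p0[0])
    if a * b * c == 0.0:
        return 0.0  # degenerate (collinear or coincident points): treat as a straight line
    return 2.0 * abs(cross) / (a * b * c)  # k = 4 * area / (a * b * c)

def first_steering_angle(k, v, stability_factor, wheelbase, gear_ratio, k_max=0.2):
    """Steering wheel angle from Equation (1); the curvature is clamped to a
    hypothetical upper limit k_max to limit the steering control."""
    k = min(k, k_max)
    return k * (1.0 + stability_factor * v ** 2) * wheelbase * gear_ratio

# Positions of the subject vehicle at times t, t+1, and t+2 on the target locus.
k = regular_circle_curvature((0.0, 0.0), (5.0, 0.3), (10.0, 1.2))
print(first_steering_angle(k, v=10.0, stability_factor=0.002,
                           wheelbase=2.7, gear_ratio=15.0))
```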
The second steering angle deriving unit 174 derives a second steering angle for increasing the control of the steering of the subject vehicle M as a deviation between the gazing position and the position of the subject vehicle M in a direction orthogonal to the travelling direction of the subject vehicle M increases.
The integration unit 176 derives a steering angle to be output to the steering device 210 by integrating the first steering angle and the second steering angle. The integration unit 176 may change weighting factors of the first steering angle and the second steering angle in accordance with the vehicle speed. More specifically, the integration unit 176 sets a weighting factor of the first steering angle to be larger than the weighting factor of the second steering angle in the case of a low vehicle speed (for example, the vehicle speed is equal to or less than a first predetermined speed). The reason for this is that the first steering angle derived on the basis of the circular arc has low error at a low speed. On the other hand, by setting the weighting factor of the second steering angle to be larger than the weighting factor of the first steering angle at a high speed (equal to or higher than a second predetermined speed), a deviation of the first steering angle can be compensated.
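A minimal sketch of such a speed-dependent integration; the embodiment only states that the second steering angle increases with the deviation and that the weighting changes with the vehicle speed, so the proportional gain, weighting factors, and speed thresholds below are assumptions:

```python
def second_steering_angle(lateral_deviation, gain=0.05):
    """Steering angle that grows with the deviation, orthogonal to the travelling
    direction, between the gazing position and the vehicle position."""
    return gain * lateral_deviation

def integrate_steering(delta_arc, delta_dev, v, v_low=30.0 / 3.6, v_high=80.0 / 3.6):
    """Blend the first (arc-based) and second (deviation-based) steering angles
    with speed-dependent weighting factors."""
    if v <= v_low:        # low speed: the arc-based angle has low error, weight it more
        w_arc = 0.8
    elif v >= v_high:     # high speed: compensate with the deviation-based angle
        w_arc = 0.2
    else:                 # blend linearly in between
        w_arc = 0.8 - 0.6 * (v - v_low) / (v_high - v_low)
    return w_arc * delta_arc + (1.0 - w_arc) * delta_dev

# 0.8 m lateral deviation, 15 m/s vehicle speed.
delta2 = second_steering_angle(0.8)
print(integrate_steering(delta_arc=0.10, delta_dev=delta2, v=15.0))
```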
First, the gazing position deriving unit 170 of the steering angle control unit 164 sets a position on the target locus that is close to the subject vehicle M (Step S100). Next, the steering angle control unit 164 derives a gazing position of the subject vehicle M after a predetermined time on the basis of the set position and the vehicle speed of the subject vehicle M (Step S102).
The predetermined time Tref used for acquiring the gazing position OB described above is a long time relative to one sampling time Ts for which the running control unit 160 executes the process. For example, in a case in which the processing period of the running control unit 160 is 0.1 seconds, the predetermined time Tref is 0.5 seconds. In such a case, the gazing position OB is a position assumed to be a position at which the subject vehicle M is located after 0.5 seconds.
Next, the first steering angle deriving unit 172 derives a circular arc joining the current position of the subject vehicle M and the gazing position OB (Step S104). Next, the first steering angle deriving unit 172 derives a first steering angle for running on the derived circular arc (Step S106).
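For Steps S104 and S106, one geometric way to obtain the curvature of an arc that is tangent to the travelling direction at the current position and passes through the gazing position OB is sketched below; this closed form is an assumption for illustration (the embodiment also describes fitting a regular circle through three locus points):

```python
import math

def arc_curvature(vehicle_xy, heading, gaze_xy):
    """Curvature of the circle that is tangent to the travelling direction at the
    vehicle position and passes through the gazing position."""
    dx = gaze_xy[0] - vehicle_xy[0]
    dy = gaze_xy[1] - vehicle_xy[1]
    # Express the gazing position in the vehicle frame (x forward, y to the left).
    x = math.cos(heading) * dx + math.sin(heading) * dy
    y = -math.sin(heading) * dx + math.cos(heading) * dy
    d2 = x * x + y * y
    return 0.0 if d2 == 0.0 else 2.0 * y / d2  # signed curvature

# A gazing position 5 m ahead and 0.5 m to the left yields a gentle left arc.
print(arc_curvature((0.0, 0.0), 0.0, (5.0, 0.5)))
```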
Next, the second steering angle deriving unit 174 derives a second steering angle on the basis of a deviation between the subject vehicle M and the gazing position OB in the horizontal direction (Step S108).
Next, the integration unit 176 derives a steering angle used for control by integrating the first steering angle and the second steering angle (Step S110). The integration unit 176 may derive the steering angle by adding the first steering angle and the second steering angle or may derive the steering angle by acquiring a weighted sum of the first steering angle and the second steering angle using weighting factors. In addition, in a case in which the derived steering angle is equal to or greater than a predetermined angle, the integration unit 176 may limit the steering angle to the predetermined angle or to an angle less than the predetermined angle. In this way, the process of this flowchart ends.
In a case in which some or all of the first steering angle, the second steering angle, and the steering angle derived by the integration unit 176 are equal to or greater than a predetermined angle, the steering angle control unit 164 may prompt a vehicle occupant to perform handover before the steering angle is controlled to the predetermined angle or more. In such a case, for example, the steering angle control unit 164 outputs a notification prompting handover to the speaker 64 or the display device 62. In this way, controlling the steering angle to the predetermined angle or more during automatic driving is suppressed. In addition, a vehicle occupant is prevented from taking over driving without recognizing that steering at the predetermined angle or more is required.
Views in which the subject vehicle M is controlled will be described with reference to
In the processing period (1), a gazing position OB of the subject vehicle M for the next processing period (2) is assumed to be derived. As illustrated in
As illustrated in
As illustrated in
In this way, a gazing position OB is derived for every processing period, and a circular arc AR connecting the gazing position OB and the subject vehicle M is derived. The steering angle control unit 164 derives a steering angle on the basis of the derived circular arc AR. As a result, the subject vehicle M can come close to the target locus in a smooth locus.
In addition, the subject vehicle M can enter the target locus to follow the target locus. As a result, smoother steering control can be realized.
In the description presented above, as an example, the case in which the position of the subject vehicle M deviates from the target locus by a predetermined distance or more has been described. Here, the process in the case in which the position of the subject vehicle M does not deviate from the target locus by the predetermined distance or more and in the case in which the position of the subject vehicle M coincides with the target locus will be described.
The gazing position deriving unit 170 derives a gazing position of the subject vehicle M in accordance with the curvature of the target locus.
For example, the gazing position deriving unit 170 derives a gazing position closer to the subject vehicle M as the curvature of the target locus becomes larger and derives a gazing position farther away from the subject vehicle M as the curvature of the target locus becomes closer to zero (a straight line). The first steering angle deriving unit 172, for example, derives a circular arc joining the current position of the subject vehicle M and the gazing position and derives a first steering angle for running on the derived circular arc.
In this way, since the gazing position of the subject vehicle M is derived in accordance with the curvature of the target locus, in a case in which the curvature of the target locus is large, the radius of the circular arc becomes small, and the subject vehicle M is controlled such that its ability to follow the target locus is good. As a result, a deviation between the target locus having a large curvature and the position of the subject vehicle M is suppressed. In addition, in a case in which the curvature of the target locus is close to zero, the radius of the circular arc becomes large and the arc becomes close to a straight line, and accordingly, the running stability of the subject vehicle M is improved.
In addition, the gazing position deriving unit 170 derives the gazing position of the subject vehicle M in accordance with the target speed given to each locus point K of the target locus. For example, as the target speed becomes higher, the gazing position deriving unit 170 derives a gazing position farther away to improve the running stability. On the other hand, as the target speed becomes lower, the gazing position deriving unit 170 derives a gazing position nearer so that the subject vehicle is controlled to have a good ability to follow the target locus.
For example, the target locus is a locus on which the subject vehicle M can run with the gravitational acceleration in the horizontal direction (lateral G) being equal to or less than a predetermined value. On a curved road, for example, the target speed is set to be equal to or less than a predetermined speed such that the lateral G does not exceed the predetermined value. Accordingly, a gazing position on a curved road becomes closer to the subject vehicle M than a gazing position on a straight road.
As described above, in a case in which the curvature of the target locus is small or in a case in which the target vehicle speed is high, the gazing position is set farther away from the subject vehicle M, and accordingly, the behavior of the subject vehicle M becomes stable. On the other hand, in a case in which the curvature of the target locus is large or in a case in which the target vehicle speed is low, the gazing position is set closer to the subject vehicle M, and accordingly, the subject vehicle M is controlled to have a good ability to follow the target locus, and a deviation between the target locus and the subject vehicle M is suppressed.
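A minimal sketch of a gaze-distance heuristic consistent with this description, together with a curve target speed bounded by the lateral G; all constants and function names are illustrative assumptions, not values from the embodiment:

```python
import math

def gaze_distance(curvature, target_speed,
                  d_min=5.0, d_max=40.0, k_ref=0.02, t_head=1.5):
    """Gazing position distance: farther at a higher target speed, nearer as the
    curvature of the target locus grows."""
    d = target_speed * t_head                  # time-headway based base distance
    d /= 1.0 + abs(curvature) / k_ref          # shrink the distance on sharper curves
    return max(d_min, min(d_max, d))

def curve_target_speed(curvature, lateral_g_max=2.0, v_max=33.0):
    """Target speed such that the lateral acceleration v^2 * |k| stays at or
    below the permitted lateral G (illustrative limit of 2.0 m/s^2)."""
    if abs(curvature) < 1e-9:
        return v_max
    return min(v_max, math.sqrt(lateral_g_max / abs(curvature)))

v_curve = curve_target_speed(0.05)                       # about 6.3 m/s on a sharp curve
print(gaze_distance(0.0, 25.0), gaze_distance(0.05, v_curve))
```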
In addition, the second steering angle deriving unit 174 derives a second steering angle on the basis of a deviation between the subject vehicle M and the gazing position OB in the horizontal direction. The integration unit 176 derives a steering angle in which a relation between the position of the subject vehicle M and the target locus is taken into account by integrating the first steering angle and the second steering angle.
According to the first embodiment described above, the vehicle control system 100 controls the steering of the subject vehicle M on the basis of the first steering angle, which is derived on the basis of the circular arc AR that has the tangent TL along the travelling direction of the subject vehicle M and passes through the gazing position OB and the position of the subject vehicle M, and the second steering angle for increasing the control of the steering of the subject vehicle M as the deviation between the gazing position OB and the position of the subject vehicle M in the direction orthogonal to the travelling direction of the subject vehicle M becomes larger, whereby smoother steering control can be realized.
Hereinafter, a second embodiment will be described.
First, the gazing position deriving unit 164Aa of the steering angle control unit 164A sets a position on the target locus that is close to the subject vehicle M (Step S200). Next, the steering angle control unit 164A derives a gazing position of the subject vehicle M after a predetermined time on the basis of the set position and the vehicle speed of the subject vehicle M (Step S202).
Next, the steering angle deriving unit 164Ab derives a circular arc joining the current position of the subject vehicle M and the gazing position (Step S204). Next, the steering angle deriving unit 164Ab derives a steering angle for running on the derived circular arc (Step S206).
Next, the steering angle deriving unit 164Ab derives a steering angle on the basis of the vehicle speed and the first steering angle (Step S208). In this way, the process of this flowchart ends. For example, the steering angle deriving unit 164Ab derives the steering angle by referring to a steering angle map MP in which the vehicle speed is associated with a maximum steering angle. By referring to the steering angle map MP, the steering angle deriving unit 164Ab derives the steering angle such that it is limited to a predetermined angle or less.
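A minimal sketch of such a steering angle map MP and the limiting step; the breakpoints and maximum angles are illustrative values, not values from the embodiment:

```python
import bisect

# Hypothetical steering angle map MP: vehicle speed [m/s] -> maximum steering angle [rad].
SPEED_BREAKPOINTS = [0.0, 10.0, 20.0, 30.0]
MAX_STEERING_ANGLE = [0.60, 0.30, 0.15, 0.08]

def limit_steering_angle(delta, v):
    """Clamp the derived steering angle to the map value for the current speed,
    with linear interpolation between breakpoints."""
    if v <= SPEED_BREAKPOINTS[0]:
        limit = MAX_STEERING_ANGLE[0]
    elif v >= SPEED_BREAKPOINTS[-1]:
        limit = MAX_STEERING_ANGLE[-1]
    else:
        i = bisect.bisect_right(SPEED_BREAKPOINTS, v)
        r = (v - SPEED_BREAKPOINTS[i - 1]) / (SPEED_BREAKPOINTS[i] - SPEED_BREAKPOINTS[i - 1])
        limit = MAX_STEERING_ANGLE[i - 1] + r * (MAX_STEERING_ANGLE[i] - MAX_STEERING_ANGLE[i - 1])
    return max(-limit, min(limit, delta))

print(limit_steering_angle(0.5, v=15.0))  # limited to about 0.225 rad
```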
According to the second embodiment described above, the vehicle control system 100 controls the steering of the subject vehicle M on the basis of the first steering angle derived on the basis of the circular arc AR that has a tangent TL along the travelling direction of the subject vehicle M and passes through the gazing position OB and the position of the subject vehicle M, and accordingly, the subject vehicle M can be controlled such that it smoothly returns to the target locus while reducing the processing load.
Hereinafter, a third embodiment will be described. A vehicle control system 100A according to the third embodiment derives a steering angle in a case in which manual driving is executed, instead of deriving the steering angle in a case in which automatic driving is executed, which is different from the first embodiment. Hereinafter, the description will focus on these differences.
The curve determining unit 147 determines whether or not a road on which the subject vehicle M is running or is planned to run is a curved road on the basis of a result of a collation between a position of the subject vehicle M recognized by the subject vehicle position recognizing unit 140 and the high-accuracy map information 182.
In a case in which it is determined by the curve determining unit 147 that the subject vehicle M is running or is planned to run on a curved road, the target locus setting unit 148 generates a target locus on the curved road.
The target locus on the curved road, for example, is a locus in which center points on the curved road are connected.
The steering angle control unit 164 derives a steering angle on the basis of the target locus set by the target locus setting unit 148. In this embodiment, the steering angle control unit 164 will be described as deriving the steering angle in a case in which the position of the subject vehicle M is deviating, or has deviated, from the target locus on a curved road. Here, deviating or having deviated means that a “predetermined position” such as the center of gravity of the subject vehicle M is separated from the position on the target locus that is closest to the “predetermined position” by a predetermined distance or more. In a case in which the position of the subject vehicle M is deviating or has deviated from the target locus, the steering angle control unit 164 derives a steering angle such that the subject vehicle M runs on the target locus. The steering angle control unit 164 outputs the derived steering angle to the steering device 210, thereby assisting the manual driving of a vehicle occupant. In addition, this assisting function may be switched on or off by operating the changeover switch 80.
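A minimal sketch of the deviation test described here, assuming the target locus is given as a list of points and the reference point is the center of gravity of the subject vehicle; the threshold value is illustrative:

```python
import math

def deviates_from_target_locus(trajectory, reference_xy, threshold):
    """Return True when the vehicle's reference point (e.g. the center of gravity)
    is the predetermined distance or more from the closest point of the target locus."""
    closest = min(math.dist(p, reference_xy) for p in trajectory)
    return closest >= threshold

locus = [(float(x), 0.0) for x in range(0, 30)]
print(deviates_from_target_locus(locus, (12.0, 0.9), threshold=0.5))  # True: assist steering
```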
For example, in a case in which the position of the subject vehicle M deviates from the target locus due to an erroneous operation of a vehicle occupant in a state in which the assisting function is set to the on state, the subject vehicle M is controlled on the basis of the steering angle derived by the steering angle control unit 164. Accordingly, the subject vehicle M is controlled such that it runs on the target locus.
According to the third embodiment described above, in a case in which manual driving is executed, when the subject vehicle M deviates from the target locus, the vehicle control system 100A assists manual driving such that the subject vehicle M runs on the target locus, whereby the running stability of the subject vehicle M can be improved.
According to the embodiment described above, by including a position recognizing unit that recognizes a position of a vehicle, a locus generating unit that generates a future target locus of the vehicle, and a running control unit that sets a reference position on the target locus with respect to the position of the vehicle recognized by the position recognizing unit and controls steering of the vehicle on the basis of a circular arc that has a tangent along the travelling direction of the vehicle and passes through the reference position and the position of the vehicle, smoother steering control can be realized.
As above, while embodiments of the present invention have been described, the present invention is not limited to such embodiments at all, and various modifications and substitutions may be made in a range not departing from the concept of the present invention.