This disclosure relates generally to a vehicle vision system for a vehicle and, more particularly, to a vehicle vision system that utilizes one or more cameras at a vehicle.
Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935 and/or 5,550,677, which are hereby incorporated herein by reference in their entireties.
Implementations herein provide a driving assistance system or vision system or imaging system for a vehicle that utilizes one or more cameras (preferably one or more CMOS cameras) to capture image data representative of images exterior of the vehicle, and provides an electronic control unit (ECU) that includes electronic circuitry and associated software. The electronic circuitry of the ECU includes an image processor for processing image data captured by the camera to detect presence of objects in the field of view of the camera. The ECU, responsive to processing by the image processor of image data captured by the camera and as the vehicle travels along a current traffic lane, determines a current lateral position of the vehicle within the current traffic lane. The ECU, responsive to processing by the image processor of image data captured by the camera and as the vehicle travels along the current traffic lane, determines a desired or target lateral position of the vehicle within the current traffic lane. The ECU, responsive to determining the current lateral position of the vehicle and the target lateral position of the vehicle, determines a steering command to guide the vehicle from the current lateral position to the target lateral position. The ECU, responsive to guiding the vehicle to the target lateral position within the current traffic lane, determines a steering angle offset required to maintain the target lateral position within the current traffic lane and the ECU stores the determined steering angle offset in nonvolatile memory disposed at the vehicle. The ECU, after storing the determined steering angle offset and on a subsequent trip, determines a second target lateral position of the vehicle within the current traffic lane. The ECU, responsive to determining the second target lateral position of the vehicle, applies the stored steering angle offset to maintain the second target lateral position within the current traffic lane.
These and other objects, advantages, purposes and features of these implementations will become apparent upon review of the following specification in conjunction with the drawings.
A vehicle vision system and/or driver or driving assist system and/or object detection system and/or alert system operates to capture images exterior of the vehicle and may process the captured image data to display images and to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle in a rearward direction. The vision system includes an image processor or image processing system that is operable to receive image data from one or more cameras and provide an output to a display device for displaying images representative of the captured image data. Optionally, the vision system may provide a display, such as a rearview display or a top down or bird's eye or surround view display or the like.
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes an imaging system or vision system 12 that includes at least one exterior viewing imaging sensor or camera, such as a rearward viewing imaging sensor or camera 14a (and the system may optionally include multiple exterior viewing imaging sensors or cameras, such as a forward viewing camera 14b at the front (or at the windshield) of the vehicle, and a sideward/rearward viewing camera 14c, 14d at respective sides of the vehicle), which captures images exterior of the vehicle, with the camera having a lens for focusing images at or onto an imaging array or imaging plane or imager of the camera.
Advanced driver assistance systems (ADAS) often obtain information about the surrounding environment through various sensors such as cameras, radar, LiDAR, etc. This information is used by various features (e.g., adaptive cruise control, lane centering, lane keeping assist, etc.) to assist the driver while driving or operating a vehicle. These features (such as lane keeping assist or lane centering) often use data captured by the camera (e.g., lane-marking information) to control lateral motion of the vehicle.
For lateral control of the vehicle, the desired lateral position of the equipped vehicle within the traffic lane along which the vehicle is traveling is calculated and compared against the actual position of the equipped vehicle in the lane. A lateral controller controls the lateral motion of the vehicle to minimize the error between the actual lateral position of the vehicle within the traffic lane and the desired lateral position of the vehicle within the traffic lane. The output of the lateral controller is typically either a steering angle command or a steering motor torque command sent to a steering system of the vehicle (i.e., to cause the vehicle to move laterally within the lane toward the desired lateral position).
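For illustration only, a minimal sketch of this error comparison follows; the function name, units, and gain are hypothetical and not part of the disclosed system:

```python
def lateral_steering_command(desired_lateral_pos_m: float,
                             actual_lateral_pos_m: float,
                             k_p: float = 0.1) -> float:
    """Return a steering angle command (rad) that reduces the lateral error.

    desired_lateral_pos_m: target offset from the lane center (meters).
    actual_lateral_pos_m:  measured offset from the lane center (meters).
    k_p:                   proportional gain (hypothetical tuning value).
    """
    lateral_error_m = desired_lateral_pos_m - actual_lateral_pos_m
    # A simple proportional law; a production lateral controller would add
    # feedforward, integral action, and vehicle-dynamics terms.
    return k_p * lateral_error_m
```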
There are several vehicle-related phenomena that lead to a steering torque pull or steering angle bias while driving on a straight road. These phenomena include, for example, asymmetric tire alignment, steering column bias, road banking, tire wear, etc. Electric Power Steering (EPS) systems have methods to detect and compensate for torque pull experienced by the driver during manual operation of the vehicle. However, ADAS functions also need to calculate an angle and/or torque bias so that a compensated command can be sent to the EPS (i.e., the angle and/or torque command is compensated for the bias).
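As a hedged illustration of what such a compensated command could look like (the function and variable names are hypothetical):

```python
def compensate_steering_command(commanded_angle_rad: float,
                                learned_angle_offset_rad: float) -> float:
    """Add a learned steering-angle bias to the nominal ADAS command.

    On a vehicle with, e.g., asymmetric tire alignment, holding a straight
    path requires a small nonzero wheel angle; including the learned offset
    means the command sent to the EPS already accounts for that bias.
    """
    return commanded_angle_rad + learned_angle_offset_rad
```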
Implementations herein include a vehicular driving assistance system that learns or determines the steering angle or torque offset (i.e., bias) present (if any) in the equipped vehicle so that an appropriate steering angle/torque compensation can be applied while performing robust lateral motion control of the equipped vehicle. That is, the system may learn and remember the angle/torque offset in the steering system so that an appropriate steering angle/torque command that takes the offset into account can be generated for lateral control of the vehicle (i.e., left or right movement of the vehicle within the traffic lane or across traffic lanes). When active, the system attempts to minimize the error between the desired or target lateral position of the vehicle within the traffic lane the vehicle is traveling along and the actual lateral position of the vehicle within the traffic lane. The system includes a controller that may accumulate instantaneous error over time. If and when any steering angle and/or torque offset or bias is present in the steering system (e.g., from a misalignment of the vehicle), the controller requires additional time to accumulate the error and reach the desired value. Typically, when the system is activated, the accumulation of the error begins at zero (i.e., no bias or offset), so the controller requires more time to reach the desired value. To reduce this additional time taken by the controller to achieve the desired value, the system learns the steering angle/torque offset.
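A minimal sketch of why an integrator that starts from zero takes longer to reach a nonzero bias (class name, gain, and values are hypothetical):

```python
class ErrorIntegrator:
    """Accumulates instantaneous lateral/curvature error each control cycle."""

    def __init__(self, k_i: float = 0.02, initial_value: float = 0.0):
        self.k_i = k_i                    # integral gain (hypothetical)
        self.accumulated = initial_value  # zero unless seeded with a learned offset

    def update(self, error: float, dt_s: float) -> float:
        self.accumulated += self.k_i * error * dt_s
        return self.accumulated

# Starting from zero, the integrator needs many cycles before its output
# equals the steering bias; seeding it with a learned value removes that delay.
```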
The driver assistance system learns and stores the last valid accumulated error from when the system was last active (e.g., from a previous trip with the vehicle or a previous usage of the system during the same trip). For example, when the system is deactivated (via actuation of a user input, shutting the vehicle off, etc.), the system stores the most recent error accumulation in non-volatile storage or memory. Whenever the feature activates again (e.g., via actuation of a user input, starting the vehicle, etc.), the controller begins the accumulation of the error signal starting from the learned/stored value instead of starting from zero again. Thus, the controller can achieve the desired value in less time, ultimately decreasing the response time of the system in achieving the desired lateral position within the traffic lane the equipped vehicle is traveling in.
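One way such save/restore logic could be sketched (a local file stands in for the vehicle's non-volatile memory; the path and function names are hypothetical):

```python
import json
from pathlib import Path

OFFSET_STORE = Path("steer_offset.json")  # stand-in for non-volatile memory

def save_learned_offset(accumulated_error: float) -> None:
    """Persist the most recent valid accumulated error when the feature deactivates."""
    OFFSET_STORE.write_text(json.dumps({"steer_offset": accumulated_error}))

def load_learned_offset(default: float = 0.0) -> float:
    """Return the stored offset, or the default (zero) if nothing was saved."""
    if OFFSET_STORE.exists():
        return json.loads(OFFSET_STORE.read_text()).get("steer_offset", default)
    return default

# On reactivation, the error accumulation is seeded with the stored value, e.g.:
# integrator = ErrorIntegrator(initial_value=load_learned_offset())
```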
Referring now to
The system includes a lateral controller 220 or control portion with a path generator module 230, a motion planner module 240, and a motion control module 300. The path generator generates a target path along the current traffic lane for the equipped vehicle to follow. The motion planner calculates, based on the target path generated by the path generator, a desired curvature for the target path of the equipped vehicle. The motion control module uses the desired curvature calculated by the motion planner and the current vehicle states (i.e., as captured by the vehicle sensors) to generate a steering angle or torque command to guide the vehicle along the desired trajectory of the target path. The block diagram 200 also includes a steering system 250. The steering system includes both hardware and software that executes the steering torque or angle command generated by the motion control module in order to move the vehicle toward the desired path along the traffic lane.
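A schematic sketch of how these modules could be chained in one control cycle (the module interfaces shown are illustrative, not the disclosed implementation):

```python
def lateral_control_cycle(lane_data, vehicle_state,
                          path_generator, motion_planner, motion_control):
    """One cycle of the lateral controller 220 (illustrative interfaces)."""
    target_path = path_generator.generate(lane_data)             # path generator 230
    desired_curvature = motion_planner.plan(target_path)         # motion planner 240
    steering_command = motion_control.compute(desired_curvature,
                                              vehicle_state)     # motion control 300
    return steering_command  # forwarded to the steering system 250
```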
Referring now to
The adaptive learning module receives the integrated error from the controller and filters out large transients (e.g., sudden jumps) in the error values to learn the steering and/or torque command offset. The controller, based on the error between the desired and actual curvature of the path of the vehicle (e.g., determined during execution of a steering command and/or after execution of a steering command), attempts to achieve and maintain the desired steering angle/torque that best allows the vehicle to follow the desired path along the traffic lane. Optionally, the controller is a feedforward and integrator type controller, although other equivalent controllers can be used as well. For example, the controller may be a feedforward and proportional-integral-derivative (PID) controller, a sliding mode controller, etc. When an integrator is used and the vehicle is in a steady-state condition, the output of the integrator typically corresponds to an additional steering command that compensates for the offset in steering angle or torque.
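A hedged sketch of the transient filtering and of a feedforward-plus-integrator command (all names, gains, and thresholds are hypothetical; the integrator is the ErrorIntegrator sketched earlier):

```python
class AdaptiveOffsetLearner:
    """Learns a steering offset from the integrator output while ignoring large jumps."""

    def __init__(self, max_step: float = 0.05, alpha: float = 0.01):
        self.max_step = max_step   # reject transients larger than this (hypothetical)
        self.alpha = alpha         # low-pass blend factor (hypothetical)
        self.learned_offset = 0.0

    def update(self, integrator_output: float) -> float:
        step = integrator_output - self.learned_offset
        if abs(step) <= self.max_step:       # filter out sudden jumps in the error value
            self.learned_offset += self.alpha * step
        return self.learned_offset

def motion_control_command(desired_curvature, actual_curvature,
                           feedforward_gain, integrator, dt_s=0.01):
    """Feedforward on desired curvature plus integral action on the curvature error."""
    curvature_error = desired_curvature - actual_curvature
    return feedforward_gain * desired_curvature + integrator.update(curvature_error, dt_s)
```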
Thus, the vehicular driving assistance system includes an ADAS lateral control system that uses ADAS sensors to control lateral motion of the vehicle by sending a steering command to the steering system. The ADAS lateral control system includes one or more of a path generator module, a motion planner module, and a motion control module. The motion control module includes an adaptive learning module that learns and updates the steering command offset and saves the learned steering command offset when vehicle power is off or the system is disabled. The motion control module also includes an enable steer offset learning module that enables offset learning based on various conditions, such as when the vehicle speed is greater than a defined threshold, the vehicle yaw rate is less than a defined threshold, the lateral feature system has been enabled for a certain period, and/or the vehicle roll angle is less than a defined threshold. The motion control module also includes a controller that calculates a steering command to control lateral motion of the vehicle and applies the learned steering command offset while calculating the steering command. The controller applies the learned steering command offset when initializing the integrator. The learned value can be saved in a nonvolatile memory disposed at the vehicle and can subsequently be retrieved and provided to the controller whenever the vehicle powers on or when the ADAS system is enabled.
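A simple sketch of how such enable conditions could be checked together (the threshold values are hypothetical placeholders for the defined thresholds mentioned above):

```python
def offset_learning_enabled(speed_mps: float, yaw_rate_rps: float,
                            feature_active_s: float, roll_angle_rad: float) -> bool:
    """Enable steering-offset learning only under steady, near-straight driving."""
    return (speed_mps > 15.0                # vehicle speed above a defined threshold
            and abs(yaw_rate_rps) < 0.02    # yaw rate below a defined threshold
            and feature_active_s > 10.0     # lateral feature enabled for a certain period
            and abs(roll_angle_rad) < 0.03) # roll angle below a defined threshold
```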
The vehicular control system may utilize aspects of the systems described in U.S. Pat. Nos. 10,315,651; 9,340,227; 9,180,908 and/or 6,882,287, and/or U.S. patent applications, Ser. No. 17/452,419, filed Oct. 27, 2021, Ser. No. 17/457,767, filed Dec. 6, 2021, Ser. No. 17/445,198, filed Aug. 17, 2021, Ser. No. 17/445,199, filed Aug. 17, 2021, and/or Ser. No. 17/445,200, filed Aug. 17, 2021, which are all hereby incorporated herein by reference in their entireties.
The camera or sensor may comprise any suitable camera or sensor. Optionally, the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in U.S. Pat. No. 10,099,614 and/or 10,071,687, which are hereby incorporated herein by reference in their entireties.
The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an image processing chip selected from the EYEQ family of image processing chips available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.
The vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ultrasonic sensors or the like. The imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, a two dimensional array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640×480 imaging array, such as a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. Preferably, the imaging array has at least 300,000 photosensor elements or pixels, more preferably at least 500,000 photosensor elements or pixels and more preferably at least one million photosensor elements or pixels. The imaging array may capture color image data, such as via spectral filtering at the array, such as via an RGB (red, green and blue) filter or via a red/red complement filter or such as via an RCC (red, clear, clear) filter or the like. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data.
For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 9,233,641; 9,146,898; 9,174,574; 9,090,234; 9,077,098; 8,818,042; 8,886,401; 9,077,962; 9,068,390; 9,140,789; 9,092,986; 9,205,776; 8,917,169; 8,694,224; 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or U.S. Publication Nos. US-2014-0340510; US-2014-0313339; US-2014-0347486; US-2014-0320658; US-2014-0336876; US-2014-0307095; US-2014-0327774; US-2014-0327772; US-2014-0320636; US-2014-0293057; US-2014-0309884; US-2014-0226012; US-2014-0293042; US-2014-0218535; US-2014-0218535; US-2014-0247354; US-2014-0247355; US-2014-0247352; US-2014-0232869; US-2014-0211009; US-2014-0160276; US-2014-0168437; US-2014-0168415; US-2014-0160291; US-2014-0152825; US-2014-0139676; US-2014-0138140; US-2014-0104426; US-2014-0098229; US-2014-0085472; US-2014-0067206; US-2014-0049646; US-2014-0052340; US-2014-0025240; US-2014-0028852; US-2014-005907; US-2013-0314503; US-2013-0298866; US-2013-0222593; US-2013-0300869; US-2013-0278769; US-2013-0258077; US-2013-0258077; US-2013-0242099; US-2013-0215271; US-2013-0141578 and/or US-2013-0002873, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in U.S. Pat. Nos. 10,071,687; 9,900,490; 9,126,525 and/or 9,036,026, which are hereby incorporated herein by reference in their entireties.
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of these implementations, which are intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application claims the filing benefits of U.S. provisional application Ser. No. 63/201,724, filed May 11, 2021, which is hereby incorporated herein by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
5550677 | Schofield et al. | Aug 1996 | A |
5670935 | Schofield et al. | Sep 1997 | A |
5949331 | Schofield et al. | Sep 1999 | A |
6580986 | Uenuma et al. | Jun 2003 | B1 |
6690268 | Schofield et al. | Feb 2004 | B2 |
6824281 | Schofield et al. | Nov 2004 | B2 |
6882287 | Schofield | Apr 2005 | B2 |
7038577 | Pawlicki et al. | May 2006 | B2 |
7480149 | DeWard et al. | Jan 2009 | B2 |
7720580 | Higgins-Luthman | May 2010 | B2 |
7855755 | Weller et al. | Dec 2010 | B2 |
8256821 | Lawlor et al. | Sep 2012 | B2 |
9180908 | Van Dan Elzen et al. | Nov 2015 | B2 |
9340227 | Bajpai | May 2016 | B2 |
9487159 | Achenbach | Nov 2016 | B2 |
9596387 | Achenbach et al. | Mar 2017 | B2 |
9871971 | Wang et al. | Jan 2018 | B2 |
9896039 | Achenbach et al. | Feb 2018 | B2 |
9988047 | Johnson et al. | Jun 2018 | B2 |
10032369 | Koravadi | Jul 2018 | B2 |
10055651 | Chundrlik, Jr. et al. | Aug 2018 | B2 |
10071687 | Ihlenburg et al. | Sep 2018 | B2 |
10099614 | Diessner | Oct 2018 | B2 |
10268904 | Gupta | Apr 2019 | B2 |
10315651 | Fiaschetti et al. | Jun 2019 | B2 |
10449899 | Gupta et al. | Oct 2019 | B2 |
10571923 | Tamboli | Feb 2020 | B2 |
11014569 | Ghasemalizadeh | May 2021 | B2 |
11345400 | Funke | May 2022 | B2 |
11414127 | Funke | Aug 2022 | B2 |
11685431 | Al Assad | Jun 2023 | B2 |
20090295181 | Lawlor et al. | Dec 2009 | A1 |
20110010054 | Wilson-Jones et al. | Jan 2011 | A1 |
20140160284 | Achenbach et al. | Jun 2014 | A1 |
20140226012 | Achenbach | Aug 2014 | A1 |
20150015713 | Wang et al. | Jan 2015 | A1 |
20150327398 | Achenbach et al. | Nov 2015 | A1 |
20160159394 | Ryu et al. | Jun 2016 | A1 |
20210263518 | Sheng | Aug 2021 | A1 |
20220048504 | Prasad Challa et al. | Feb 2022 | A1 |
20220048509 | Prasad Challa | Feb 2022 | A1 |
20220048566 | Prasad Challa et al. | Feb 2022 | A1 |
20220135030 | Varunjikar | May 2022 | A1 |
20220176960 | Awathe et al. | Jun 2022 | A1 |
20220363250 | Varunjikar | Nov 2022 | A1 |
20230134480 | Varunjikar | May 2023 | A1 |
20230415734 | Zhu | Dec 2023 | A1 |
Number | Date | Country |
---|---|---|
3360746 | Aug 2018 | EP |
Entry |
---|
Snider J.M., “Automatic Steering Methods for Autonomous Automobile Path Tracking”, Feb. 2009, CMU thesis. |
Werling et al., Invariant Trajectory Tracking With a Full-Size Autonomous Road Vehicle, IEEE, vol. 26, No. 4, Aug. 2010. |
Werling et al., Optimal trajectories for time-critical street scenarios using discretized terminal manifolds, The International Journal of Robotics Research, Mar. 2012. |
Number | Date | Country | |
---|---|---|---|
20220363250 A1 | Nov 2022 | US |
Number | Date | Country | |
---|---|---|---|
63201724 | May 2021 | US |