The present disclosure generally relates to autonomous vehicle control and more particularly relates to use in vehicle control of road angle data to compensate for gravity influence in measured acceleration data.
This section provides background information related to the present disclosure which is not necessarily prior art.
Modern vehicles include various autonomous control features. These features assist the driver in, for example, braking, steering and engine power control by using sensed data from a variety of sources as part of complex control algorithms. Vehicles in development allow ever less involvement of the driver in operation of the vehicle.
Such autonomous control functions are reliant on accuracy of sensed data. One source of sensed data in many vehicles is an inertial measurement unit, which provides data on various components of vehicle acceleration. Automated control processes in the vehicle rely on the acceleration data.
Accordingly, it is desirable to account for any unintended false influences on acceleration data from the inertial measurement unit. In addition, it is desirable to control automated vehicle functions based on accurate sensor data. Furthermore, other desirable features and characteristics of the present invention will become apparent from the subsequent detailed description and the appended claims, taken in conjunction with the accompanying drawings and the foregoing technical field and background.
A vehicle is provided having a controlled vehicle function. In one embodiment, the vehicle includes a vehicle control system. The vehicle control system includes an inertial measurement unit including a sensor for measuring a measured acceleration component of a vehicle. A processor is configured to obtain one or both of road gradient and road bank angle of a road being travelled by the vehicle. The processor is configured to control the vehicle function responsive to the acceleration component of the vehicle and the one or both of road gradient and bank angle.
A vehicle control system is provided for controlling a vehicle function. In one embodiment, the vehicle control system includes an inertial measurement unit including a sensor for measuring a measured acceleration component of a vehicle. A processor is configured to obtain one or both of road gradient and road bank angle of a road being travelled by the vehicle. The processor is configured to control the vehicle function responsive to the acceleration component of the vehicle and the one or both of road gradient and bank angle.
A method is provided for controlling a function of a vehicle. In one embodiment, the method includes measuring a measured acceleration component of a vehicle. The method includes obtaining one or both of road gradient and road bank angle of a road being travelled by the vehicle. The method includes controlling the vehicle function responsive to the acceleration component of the vehicle and the one or both of road gradient and bank angle.
The exemplary embodiments will hereinafter be described in conjunction with the following drawing figures, wherein like numerals denote like elements, and wherein:
The following detailed description is merely exemplary in nature and is not intended to limit the application and uses. Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding technical field, background, brief summary or the following detailed description. As used herein, the term module refers to an application specific integrated circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and memory that executes one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
As described in greater detail further below, and according to an exemplary embodiment, the vehicle 100 includes various cameras 101, 103 and/or other sensors 167 from which road angle can be derived, as well as a vehicle control system 102 for determining at least one acceleration offset and controlling at least one vehicle function based on the at least one acceleration offset. In the depicted embodiment, the cameras include visual cameras 103 and lidar cameras 101 distributed around the vehicle, including at the front, rear and both sides of the vehicle 100. Imaging devices other than visual and lidar cameras may be utilized. The cameras may obtain video data, i.e. images obtained at a high frame rate, or images at a lower frequency. It will be appreciated that the number and/or location of cameras 101, 103 may vary in different embodiments. The other sensors 167 may include at least one level sensor arranged to measure longitudinal and/or lateral road angle, i.e. longitudinal road gradient and/or road bank angle.
Also as discussed further below, the vehicle control system 102 includes a controller 106. In various embodiments, the vehicle control system 102 provides determination of acceleration offset and autonomous vehicle control functions based thereon, as set forth in greater detail further below in connection with the discussion of
In one embodiment depicted in
In the exemplary embodiment illustrated in
It will be appreciated that in other embodiments, the actuator assembly 120 may include one or more other types of engines and/or motors, such as an electric motor/generator, instead of or in addition to the combustion engine. In certain embodiments, the electronic system 118 comprises an engine system that controls the engine 130 and/or one or more other systems of the vehicle 100.
Still referring to
The steering system 150 is mounted on the chassis 112, and controls steering of the wheels 116. In one embodiment, the steering system 150 may include a non-depicted steering wheel and a steering column. In various embodiments, the steering wheel receives inputs from a driver of the vehicle 100, and the steering column results in desired steering angles for the wheels 116 via the drive shafts 134 based on the inputs from the driver. In certain embodiments, an autonomous vehicle may utilize steering commands for the steering system 150 that are generated by the vehicle control system 102, with no involvement from the driver. In other embodiments, the steering system 150 receives commands from both the user and the vehicle control system 102 in a semi-autonomous implementation. In other embodiments, the vehicle control system 102 controls at least one function of the steering system 150 responsive to acceleration and at least one of road gradient and bank angle, as will be described further herein. For example, the vehicle control system 102 may generate a steering control command using offset acceleration, where offset acceleration is determined as described further herein.
The braking system 155 is mounted on the chassis 112, and provides braking for the vehicle 100. In an embodiment, the braking system 155 receives inputs from the driver via a non-depicted brake pedal, and provides appropriate braking via brake units (not depicted). In certain embodiments, an autonomous vehicle may utilize braking commands for the braking system 155 that are generated by the vehicle control system 102, with no involvement from the driver. In other embodiments, the braking system 155 receives commands from both the user and the vehicle control system 102 in a semi-autonomous implementation. In other embodiments, the vehicle control system 102 controls at least one function of the braking system 155 responsive to acceleration and at least one of road gradient and bank angle, as will be described further herein. For example, the vehicle control system 102 may generate a braking control command using offset acceleration, where offset acceleration is determined as described further herein.
The power system 160 is mounted on the chassis 112, and provides power control of the vehicle 100, with the set power representative of a desired speed or acceleration of the vehicle 100. The power system 160 communicates with the powertrain 129 in order to control power delivered to the drive shafts 134. For example, the power system 160 may include an acceleration input system comprising an accelerator pedal 161 that is engaged by a driver, with the engagement representative of a desired speed or acceleration of the vehicle 100. In certain embodiments, an autonomous vehicle may utilize power commands for the power system 160 that are generated by the vehicle control system 102, with no involvement from the driver, so as to provide automated speed and acceleration control. In other embodiments, the power system 160 receives commands from both the user and the vehicle control system 102 in a semi-autonomous implementation. In other embodiments, the vehicle control system 102 controls at least one function of the power system 160 responsive to acceleration and at least one of road gradient and bank angle, as will be described further herein.
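To illustrate why the compensated reading matters for power control, the following sketch uses a hypothetical proportional law (the function name, gain and control law are assumptions for illustration, not part of the disclosure): on a grade, a raw IMU reading would misreport gravity leakage as real acceleration, so the controller is fed the offset acceleration instead.

```python
def power_command(offset_ax: float, target_ax: float, gain: float = 0.5) -> float:
    """Hypothetical proportional sketch: request a torque change
    proportional to the error between the desired longitudinal
    acceleration and the gravity-compensated (offset) acceleration a'x.
    Feeding the raw reading here would, on a grade, command the wrong torque."""
    return gain * (target_ax - offset_ax)

# A stationary vehicle on a 5-degree upgrade shows ~0.86 m/s^2 of gravity
# leakage on the raw longitudinal channel; after compensation the
# controller sees the true error relative to the 1.0 m/s^2 target.
print(power_command(offset_ax=0.0, target_ax=1.0))  # -> 0.5
```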
As noted above and depicted in
The plurality of cameras 101, 103 obtain images with respect to various different locations of the vehicle 100. In addition, in various embodiments, the cameras 101, 103 also obtain images with respect to surroundings, including objects, in proximity to the vehicle 100, surrounding roads, and surrounding road features such as buildings, curbs, roadside banks, etc. As depicted in one embodiment, cameras 101, 103 are included within or proximate each of the rear view mirror 140, side mirrors 142, front grill 144, and rear region 146. In one embodiment, the cameras 101, 103 comprise video cameras controlled via the controller 106. In various embodiments, the cameras 103 may also be disposed in or proximate one or more other locations of the vehicle 100. The cameras 101 represent LIDAR cameras or sensors, in the present embodiment. The cameras 103 represent visual cameras, e.g. cameras operating in the visible, infrared or ultraviolet ranges using ambient light. Imaging devices other than LIDAR and visual cameras are possible.
The sensor array 104 includes various sensors (also referred to herein as sensor units) that are used for providing measurements and/or data for use by the controller 106. In embodiments, the sensor array 104 includes at least one level sensor 167 that is able to measure, electronically, the longitudinal and/or lateral angle of the vehicle relative to horizontal. Exemplary implementations of the at least one level sensor (also known as an inclinometer) include an electrolytic tilt sensor, an accelerometer, a liquid capacitive device, a gas-bubble-in-liquid device, a pendulum device, a micro-electro-mechanical systems (MEMS) tilt sensor, etc. In embodiments, a two-axis digital inclinometer is included so that both lateral and longitudinal road incline relative to horizontal can be measured.
In exemplary embodiments, the sensor array 104 includes an inertial measurement unit 166 including at least one accelerometer as a sensor for measuring acceleration of the vehicle. The inertial measurement unit 166 is configured to obtain various acceleration readings including longitudinal, vertical and lateral acceleration. In various embodiments, the inertial measurement unit is a self-contained system that measures linear and angular motion, usually with a triad of gyroscopes and a triad of accelerometers. The inertial measurement unit can be gimballed or strapdown and is configured for outputting quantities of angular velocity and acceleration of the vehicle 100. The vehicle control system 102 is configured to autonomously control various vehicle functions, such as steering, braking and power as described above with respect to the steering, braking and power systems 150, 155, 160, based, at least in part, on acceleration measurements from the inertial measurement unit 166. That is, at least one vehicle command may be generated based on a control algorithm or calculation that incorporates acceleration readings from the inertial measurement unit 166. Road slope in the lateral and longitudinal directions can falsely affect the acceleration readings, which can thus result in false control maneuvers. Embodiments of the present disclosure obtain the road angle for road gradient and/or road bank and utilize this information in controlling at least one vehicle function, thereby alleviating any false control maneuvers that might otherwise have occurred.
In exemplary embodiments, the sensor array 104 includes a GPS navigation device or GPS receiver 168. The GPS receiver 168 is a device that is capable of receiving information from GPS satellites. Based on the GPS information, the receiver 168 is capable of calculating its geographical location. The GPS receiver may use assisted GPS (A-GPS) technology by which telecommunications base stations and/or cell towers provide device location tracking capability. The GPS receiver 168 is configured for providing global positioning data for use in locating the vehicle with respect to an enhanced digital map 184 as described further below.
In various embodiments, the sensors of the sensor array 104 comprise one or more detection sensors 162, interface sensors 163, gear sensors 164, and/or wheel speed sensors 165. The detection sensors 162 (e.g. radar, lidar, sonar, machine vision, Hall Effect, and/or other sensors) detect objects in proximity to the vehicle 100. The interface sensors 163 detect a user's engagement of an interface of the vehicle 100 (e.g. a button, a knob, a display screen, and/or one or more other interfaces). The gear sensors 164 detect a gear or transmission state of the vehicle 100 (e.g. park, drive, neutral, or reverse). The wheel speed sensors 165 measure a speed of one or more of the wheels 116 of the vehicle 100. In various embodiments, the sensor array 104 provides the measured information to the controller 106 for processing, including for determining acceleration offset based on road angle in accordance with the steps of the methods and systems described with respect to
As depicted in
In the depicted embodiment, the computer system of the controller 106 includes a processor 172, a memory 174, an interface 176, a storage device 178, and a bus 180. The processor 172 performs the computation and control functions of the controller 106, and may comprise any type of processor or multiple processors, single integrated circuits such as a microprocessor, or any suitable number of integrated circuit devices and/or circuit boards working in cooperation to accomplish the functions of a processing unit. During operation, the processor 172 executes one or more programs 182 contained within the memory 174 and, as such, controls the general operation of the controller 106 and the computer system of the controller 106, generally in executing the methods and systems described further below in connection with
The memory 174 can be any type of suitable memory. For example, the memory 174 may include various types of dynamic random access memory (DRAM) such as SDRAM, the various types of static RAM (SRAM), and the various types of non-volatile memory (PROM, EPROM, and flash). In certain examples, the memory 174 is located on and/or co-located on the same computer chip as the processor 172. In the depicted embodiment, the memory 174 stores the above-referenced program 182 along with one or more stored maps 184. In certain embodiments, the stored maps 184 are enhanced digital maps including a collection of data compiled and formatted into a virtual image. The enhanced digital maps provide representations of a particular area, detailing roads, terrain encompassing the surrounding area and other points of interest. The enhanced digital map 184 allows the calculation of distances from one place to another. The enhanced digital map 184 is used with the Global Positioning System, or GPS satellite network, as part of an automotive navigation system. The enhanced digital map may also include traffic updates, service locations and other enhancement data for the user. Further, the enhanced digital map 184 includes, in embodiments, a layer of road angle data representing the angle of longitudinal and/or lateral road inclination (e.g. road gradient and road bank angle). In other embodiments, the enhanced digital map 184 includes a layer of road images from which road angle data can be derived through image analysis. The enhanced digital map 184 may include data sets for virtual maps, satellite (aerial) views, and hybrid (a combination of virtual map and aerial) views. The enhanced digital maps 184 may be defined in a GIS file format, which is a standard for encoding geographical information into a computer file.
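A road angle layer of the kind described can be pictured as sampled (gradient, bank) values keyed by position. The sketch below is purely illustrative: the dictionary layer, the nearest-point search and the distance threshold are assumptions, since the disclosure does not specify the GIS encoding of map 184.

```python
import math

# Toy stand-in for the road angle layer of the enhanced digital map:
# each entry maps a (lat, lon) sample point to (gradient, bank) in radians.
ROAD_ANGLE_LAYER = {
    (48.1000, 11.5000): (0.052, 0.000),   # ~3 degree upgrade, level bank
    (48.1010, 11.5005): (0.035, 0.017),   # easing grade, slight bank
}

def lookup_road_angle(lat, lon, max_dist=0.002):
    """Return the (gradient, bank) tuple of the nearest sampled map point,
    or None (playing the role of the 'no data flag') when no sample lies
    within max_dist degrees of the GPS fix."""
    best, best_d = None, max_dist
    for (plat, plon), angles in ROAD_ANGLE_LAYER.items():
        d = math.hypot(lat - plat, lon - plon)
        if d < best_d:
            best, best_d = angles, d
    return best

print(lookup_road_angle(48.1001, 11.5001))  # -> (0.052, 0.0)
print(lookup_road_angle(50.0, 8.0))         # -> None (no data flag)
```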
The enhanced digital map 184 may be accessed by the vehicle control system for various functions including extracting road angle data as described further herein, and for satellite navigation. The enhanced digital map 184, and a satellite navigation system computer program, of the vehicle control system 102 may be stored in the memory 174 located in the vehicle 100 or in cloud storage. Cloud computing may be utilized as part of the vehicle control system 102 for various functions described herein, including obtaining road angle data from the enhanced digital maps 184 and satellite navigation.
The bus 180 serves to transmit programs, data, status and other information or signals between the various components of the computer system of the controller 106. The interface 176 allows communication to the computer system of the controller 106, for example from a system driver and/or another computer system, and can be implemented using any suitable method and apparatus. In one embodiment, the interface 176 obtains the various data from the sensors of the sensor array 104. The interface 176 can include one or more network interfaces to communicate with other systems or components. The interface 176 may also include one or more network interfaces to communicate with technicians, and/or one or more storage interfaces to connect to storage apparatuses, such as the storage device 178.
The storage device 178 can be any suitable type of storage apparatus, including direct access storage devices such as hard disk drives, flash systems, floppy disk drives and optical disk drives. In one exemplary embodiment, the storage device 178 comprises a program product from which memory 174 can receive a program 182 that executes one or more embodiments of one or more processes and systems of the present disclosure, such as the features described further below in connection with
The bus 180 can be any suitable physical or logical means of connecting computer systems and components. This includes, but is not limited to, direct hard-wired connections, fiber optics, infrared and wireless bus technologies. During operation, the program 182 is stored in the memory 174 and executed by the processor 172.
It will be appreciated that while this exemplary embodiment is described in the context of a fully functioning computer system, those skilled in the art will recognize that the mechanisms of the present disclosure are capable of being distributed as a program product with one or more types of non-transitory computer-readable signal bearing media used to store the program and the instructions thereof and carry out the distribution thereof, such as a non-transitory computer readable medium bearing the program and containing computer instructions stored therein for causing a computer processor (such as the processor 172) to perform and execute the program. Such a program product may take a variety of forms, and the present disclosure applies equally regardless of the particular type of computer-readable signal bearing media used to carry out the distribution. Examples of signal bearing media include: recordable media such as floppy disks, hard drives, memory cards and optical disks, and transmission media such as digital and analog communication links. It will be appreciated that cloud-based storage and/or other techniques may also be utilized in certain embodiments. It will similarly be appreciated that the computer system of the controller 106 may also otherwise differ from the embodiment depicted in
In
a′x = ax - g sin θx (equation 1)
a′z = az - g cos θx (equation 2)
where a′x and a′z represent offset or compensated acceleration components in the longitudinal and vertical directions x and z.
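A minimal numeric check of equations 1 and 2, assuming the standard convention that the gradient θx is measured from horizontal (so gravity leaks g·sin θx into the longitudinal channel and g·cos θx into the vertical channel); the function names are illustrative, not from the disclosure:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def offset_longitudinal(a_x: float, theta_x: float) -> float:
    """Equation 1: remove the gravity component g*sin(theta_x) that a
    road gradient injects into the measured longitudinal acceleration."""
    return a_x - G * math.sin(theta_x)

def offset_vertical(a_z: float, theta_x: float) -> float:
    """Equation 2: remove the gravity component g*cos(theta_x) from the
    measured vertical acceleration."""
    return a_z - G * math.cos(theta_x)

# A vehicle parked on a 5-degree upgrade is not accelerating, yet the
# IMU's longitudinal channel reads g*sin(5 deg), about 0.86 m/s^2.
theta = math.radians(5.0)
measured_ax = G * math.sin(theta)   # gravity leakage only
measured_az = G * math.cos(theta)

print(offset_longitudinal(measured_ax, theta))  # -> 0.0 (leakage removed)
print(offset_vertical(measured_az, theta))      # -> 0.0
```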
In
where a′y and a′z represent offset or compensated acceleration components in the lateral and vertical directions y and z.
The present disclosure proposes to determine at least one of the road angle components θx and θy, to determine at least one offset acceleration component a′x, a′y and/or a′z based on the road angle components and at least one measured acceleration component ax, ay and/or az, and to control at least one vehicle function based on the at least one offset acceleration component. Systems and methods are described herein, particularly with reference to
In the exemplary embodiment of
Continuing to refer to the exemplary embodiment of
The exemplary embodiment of
The road angle data θ⃗ may be extracted from the enhanced digital map 184 in one embodiment. The road angle data θ⃗ can be extracted from the enhanced digital map 184 as data representative of road angles or as image data that can be image processed as described below (particularly with reference to
In an additional or alternative embodiment, the road angle data θ⃗ can be obtained through image or video data from the cameras 101, 103. In an alternative to the image or video data being obtained from the cameras 101, 103, it can be obtained by GPS interrogation of the enhanced digital map 184. In such embodiments, the vehicle control system 102 comprises a road image analysis module 304 and a road angle extraction module 302. In embodiments, the road image analysis module 304 comprises an input interface for receiving the image or video data and a processor operating an image analysis engine. In various embodiments, the image analysis engine is configured to determine at least one horizontal reference marker in the image data and at least one road angle marker representing gradient and/or banking of the road. The image analysis engine may operate at least one image filter and at least one segmentation process to identify the horizontal reference and road angle markers. Exemplary horizontal markers can include horizontal roadside features including buildings and road infrastructure. For example, roadside walls, windows, balconies, etc. are representative of horizontal features that can be identified and marked by the analysis engine. Road angle markers can be determined based on curbs, e.g. curb tops, the curb-road interface, the pavement-building interface, road markings, and other road or roadside features. The road image analysis module 304 may include an output interface for communicating a result of image analysis to other modules, particularly an image including the horizontal reference markers and the road angle markers. In various embodiments, the road image analysis module 304 is configured to iteratively perform image analysis when enabled to allow iteratively updated acceleration data a⃗ to be determined.
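Once a horizontal reference marker and a road angle marker have been located, the road angle follows from the angle between the two marker lines. This sketch assumes markers are delivered as pixel-endpoint pairs (the endpoints below are invented) and ignores camera calibration, which a real system would need to map pixel angles to world angles:

```python
import math

def line_angle(p0, p1):
    """Angle of the line through p0 -> p1, in radians, relative to the
    image x-axis (note: image y grows downward)."""
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    return math.atan2(dy, dx)

def road_angle_from_markers(horizontal_marker, road_marker):
    """Angle in degrees between a horizontal reference marker (e.g. a
    window sill) and a road angle marker (e.g. a curb top), each given
    as a pair of (x, y) pixel endpoints. The sign indicates the
    direction of the slope or bank."""
    return math.degrees(line_angle(*road_marker) - line_angle(*horizontal_marker))

# A curb line rising relative to a level window sill (image y decreases
# toward the top of the frame, hence the negative result for a rise):
sill = ((100, 400), (600, 400))   # level horizontal reference
curb = ((100, 520), (600, 490))   # curb top rising across the frame
print(round(road_angle_from_markers(sill, curb), 1))  # -> -3.4
```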
An example result of image analysis by the road image analysis module 304 is shown in
Another example result of image analysis by the road image analysis module 304 is shown in
Referring back to the exemplary embodiment of
In the exemplary embodiment of
Referring to
The dataflow diagram includes a process 402 of obtaining road angle data θ⃗ according to various exemplary possibilities described with reference to
In a process 414, acceleration data a⃗ is obtained by reading such data from the inertial measurement unit 166.
The road angle data θ⃗ obtained in process 402 and the acceleration data a⃗ obtained in process 414 are used in a process 404 of determining offset acceleration a⃗′. In particular, the road angle data θ⃗ and the acceleration data a⃗ are used as inputs to a calculation for compensating the influence of road gradient angle and/or road banking angle in lateral, longitudinal and/or vertical acceleration readings obtained from the inertial measurement unit 166. Exemplary calculations are shown by equations 1 to 4 described above.
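Process 404 can be sketched as a single function from (θ⃗, a⃗) to a⃗′. The longitudinal term follows equation 1; the lateral term and the combined-tilt vertical term below are assumptions by analogy (equations 3 and 4 are not reproduced in this text and may differ):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def offset_acceleration(a, theta):
    """Sketch of process 404: compensate the measured acceleration
    vector a = (ax, ay, az) for gravity leakage caused by road gradient
    theta_x and road bank theta_y (radians). The lateral and vertical
    terms are illustrative analogues of equation 1, not the disclosure's
    exact equations 3 and 4."""
    ax, ay, az = a
    theta_x, theta_y = theta
    a_prime_x = ax - G * math.sin(theta_x)                      # equation 1
    a_prime_y = ay - G * math.sin(theta_y)                      # lateral analogue
    a_prime_z = az - G * math.cos(theta_x) * math.cos(theta_y)  # vertical, combined tilt
    return (a_prime_x, a_prime_y, a_prime_z)

# Stationary vehicle on a 5-degree grade, no bank: the offset vector
# should be (approximately) zero in all three axes.
t = math.radians(5.0)
print(offset_acceleration((G * math.sin(t), 0.0, G * math.cos(t)), (t, 0.0)))
```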
In a process 406, the offset acceleration data a⃗′ determined in process 404 is used as an input for controlling at least one vehicle function. In particular, the offset acceleration data a⃗′ is used to determine at least one control command for the steering, braking and/or power system 150, 155, 160.
In an embodiment making use of road image data to derive road angle data θ⃗, processes 408, 410 are included. In process 408, image analysis processing is performed on road image or video data from the cameras 101, 103 or from images obtained by GPS based interrogation of the enhanced digital map 184. The image analysis processing identifies one or more horizontal reference features and one or more features indicative of road angle. Reference and road markers may be embedded in the road image data based on the identified one or more horizontal reference features and one or more features indicative of road angle, as described above.
In process 410, road angle data θ⃗ is calculated based on the road image data that has been image processed in process 408. In particular, an angle is calculated between one or more horizontal reference markers and one or more road angle markers in the processed image data.
In some embodiments, process 412 may be included whereby calculated road angle data θ⃗ obtained through processes 408 and 410 is stored in the enhanced digital map at a location identified by GPS data obtained from the GPS receiver 168.
The method 500 includes a step 502 of reading GPS data from the GPS receiver 168, in accordance with one embodiment. The GPS data serves as an input for a step 504 of interrogation of the enhanced digital map 184. The enhanced digital map 184 and associated processor implemented search engine returns either road angle data θ⃗ or a no data flag indicating that no road angle data θ⃗ is available for the map location corresponding to the GPS data.
In embodiments, step 506 determines whether road angle data θ⃗ is available based on whether the no data flag or road angle data θ⃗ is returned by interrogating the enhanced digital map 184 in step 504. If road angle data θ⃗ is available, it is used in a step 518 of obtaining road angle data θ⃗ for subsequent processing. If no road angle data θ⃗ is available from the enhanced digital map 184, road image analysis steps 508 to 516 are performed.
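The map-first, image-fallback logic of steps 502 to 518 can be sketched as a simple fallback chain. The callables `map_lookup` and `analyze_road_images` below are hypothetical stand-ins for the map search engine and the road image analysis pipeline:

```python
def obtain_road_angle(gps_fix, map_lookup, analyze_road_images):
    """Steps 502-518 as a fallback chain: interrogate the enhanced
    digital map first; fall back to camera image analysis when the map
    returns no data (None plays the role of the no data flag)."""
    road_angle = map_lookup(gps_fix)       # steps 502-504
    if road_angle is not None:             # step 506
        return road_angle                  # step 518: map data available
    return analyze_road_images()           # steps 508-516: image fallback

# Usage with stub data sources (dict.get returns None on a miss):
lookup = {(48.1, 11.5): (0.05, 0.01)}.get
print(obtain_road_angle((48.1, 11.5), lookup, lambda: (0.0, 0.0)))  # -> (0.05, 0.01)
print(obtain_road_angle((0.0, 0.0), lookup, lambda: (0.02, 0.0)))   # -> (0.02, 0.0)
```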
In step 508, road image or video data is read from the cameras 101, 103. In step 510, road image analysis is performed to simplify the image data for subsequent road angle data extraction steps 512, 514. In particular, image analysis step 510 may entail image filtering and segmentation processes. In step 512, horizontal reference markers and road angle markers are identified in the processed image data from step 510 as described above with reference to
In various embodiments of the method 500, acceleration data a⃗ is read from the inertial measurement unit 166 in step 520. In step 522, the acceleration data a⃗ and the road angle data θ⃗ serve as inputs for determining offset acceleration data a⃗′ according to processes described in the foregoing. The offset acceleration data a⃗′ is operable to control at least one autonomous function of the vehicle 100 in step 524.
The exemplary method 500 of
While at least one exemplary embodiment has been presented in the foregoing detailed description, it should be appreciated that a vast number of variations exist. It should also be appreciated that the exemplary embodiment or exemplary embodiments are only examples, and are not intended to limit the scope, applicability, or configuration of the disclosure in any way. Rather, the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing the exemplary embodiment or exemplary embodiments. It should be understood that various changes can be made in the function and arrangement of elements without departing from the scope of the disclosure as set forth in the appended claims and the legal equivalents thereof.