The present disclosure relates to a Light Detection and Ranging (LiDAR) sensor alignment system, and more particularly, to a LiDAR sensor alignment system of a tracking system for automated vehicles.
The operation of modern vehicles is becoming increasingly autonomous, with a corresponding decrease in driver intervention. The various control features are becoming increasingly complex, while vehicle accuracy, efficiency, and reliability must be at least maintained. The complex nature of such automated systems may require a large number of sensors. Such sensors may become misaligned and, if not corrected, the misalignment may degrade vehicle performance.
In one, non-limiting, exemplary embodiment of the present disclosure, a Light Detection and Ranging (LiDAR) sensor alignment system includes first and second LiDAR sensors, and a controller. The first and second LiDAR sensors are each configured to monitor respective first and second regions and output respective first and second LiDAR signals associated with the regions. The controller is configured to receive the signals, recognize a target detected by both the first and second LiDAR sensors, utilize a first coordinate map associated with the first region to determine a first mapped location of the target, utilize a second coordinate map associated with the second region to determine a second mapped location of the target, and associate the first and second mapped locations to determine if the first and second LiDAR sensors are aligned.
In another, non-limiting, embodiment, an automated vehicle includes a first LiDAR sensor, a second LiDAR sensor, and a controller. The first LiDAR sensor is configured to monitor a first region and output a first LiDAR signal associated with the first region. The second LiDAR sensor is configured to monitor a second region and output a second LiDAR signal associated with the second region. A first segment of the first region completely overlaps a second segment of the second region when both the first and second LiDAR sensors are aligned. The controller includes a processor and an electronic storage medium. The processor is configured to receive and process the first and second LiDAR signals, recognize a target detected by both the first and second LiDAR sensors, utilize a first coordinate map associated with the first region and stored in the electronic storage medium to determine a first mapped location of the target, determine a second hypothetical location of the target associated with a second coordinate map orientated in a preprogrammed alignment configuration with the first coordinate map, utilize the second coordinate map associated with the second region and stored in the electronic storage medium to determine a second mapped location of the target, and compare the second hypothetical location to the second mapped location to determine if the first and second LiDAR sensors are aligned.
In another, non-limiting, embodiment, a computer software product is executed by a controller of an automated vehicle that includes first and second LiDAR sensors configured to output respective first and second LiDAR signals associated with respective first and second regions. The computer software product includes a preprogrammed database, a recognition module, a location assignment module, and a comparison module. The preprogrammed database includes preprogrammed first and second coordinate maps associated with the respective first and second regions, and an alignment model indicative of the first and second coordinate maps being aligned. The recognition module is configured to receive the first and second LiDAR signals and recognize a target detected by both the first and second LiDAR sensors. The location assignment module is configured to assign a first mapped location of the target relative to the first coordinate map, assign a modeled second location of the detected target relative to the first mapped location and the second coordinate map when not associated with the second region and when utilizing the alignment model, and assign a true second mapped location of the detected target when utilizing the second coordinate map relative to the second region and regardless of whether the second LiDAR sensor is aligned or misaligned. The comparison module is configured to compare the modeled second location to the true second mapped location to determine if the first and second LiDAR sensors are aligned.
These and other advantages and features will become more apparent from the following description taken in conjunction with the drawings.
The subject matter which is regarded as the invention is particularly pointed out and distinctly claimed in the claims at the conclusion of the specification. The foregoing and other features and advantages of the invention are apparent from the following detailed description taken in conjunction with the accompanying drawings in which:
A target 24 that may be, for example, a corner of another vehicle, may be generally located forward of the host vehicle 20. The various components and/or systems of the host vehicle 20, which contribute toward automated operational aspects of the host vehicle, may generally detect, sense, and/or image the target 24 in order to effect a desired response by the host vehicle 20.
The host vehicle 20 includes a Light Detection and Ranging (LiDAR) sensor alignment system 22. The LiDAR sensor alignment system 22 may include a first LiDAR sensor 26, a second LiDAR sensor 28, at least one mount device 30, and a controller 32.
The mount device 30 may be attached to, and may extend between, the second LiDAR sensor 28 and a body 34 of the host vehicle 20. The mount device 30 may be adapted to adjust the positioning of the second LiDAR sensor 28. In one embodiment, this adjustment may be conducted manually, and in another embodiment, the mount device 30 may include an electric alignment drive, or motor, constructed to automatically align the second LiDAR sensor 28.
Each LiDAR sensor 26, 28 includes a respective field of view, or region, 36, 38 generally established to monitor for objects or targets 24 within one or both regions 36, 38. For example, each LiDAR sensor 26, 28 may include a field of view 36, 38 generally associated with a respective viewing angle of about forty-five degrees (see arrows 40, 42). In the present example, the LiDAR sensors 26, 28 may be separated from one another and mounted on opposite forward corners of the host vehicle 20. With this configuration, the aligned LiDAR sensors 26, 28, working together, can view a greater area than a single sensor alone. It is further illustrated that the regions 36, 38 may overlap, beginning at a distance forward of the vehicle 20. Therefore, each region 36, 38 may include a respective overlap segment 40, 42, and the overlap segments completely overlap one another. The general area of this overlap changes as one of the LiDAR sensors 26, 28 moves from the aligned position to the misaligned position.
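Purely by way of illustration, the overlap of the two regions may be sketched as follows. The mounting positions, the forward-facing headings, the planar geometry, and all numeric values are assumptions chosen for this example and are not disclosed values.

    # Illustrative sketch only: testing whether a point lies within both regions,
    # i.e., within the overlap segments. Mounting positions and headings are assumed.
    import math

    SENSOR_POSITIONS = {
        "first": (-0.8, 0.0),   # left forward corner of the vehicle (assumed)
        "second": (0.8, 0.0),   # right forward corner of the vehicle (assumed)
    }
    FIELD_OF_VIEW_DEG = 45.0    # viewing angle of each sensor

    def in_field_of_view(sensor_pos: tuple[float, float], point: tuple[float, float]) -> bool:
        # True when the point lies within the forward-facing field of view of a
        # sensor mounted at sensor_pos; y is the forward direction (assumed).
        dx = point[0] - sensor_pos[0]
        dy = point[1] - sensor_pos[1]
        bearing_deg = math.degrees(math.atan2(dx, dy))
        return dy > 0.0 and abs(bearing_deg) <= FIELD_OF_VIEW_DEG / 2.0

    def in_overlap_segment(point: tuple[float, float]) -> bool:
        return all(in_field_of_view(pos, point) for pos in SENSOR_POSITIONS.values())

    # A target several meters forward of the vehicle falls within both regions,
    # whereas a target immediately ahead of one corner does not, since the
    # overlap begins at a distance forward of the vehicle.
    print(in_overlap_segment((0.0, 5.0)), in_overlap_segment((-0.8, 1.0)))  # True False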
The LiDAR sensor alignment system 22 may further include communication pathways 44, 46, 47 that may be wired or wireless. The first pathway 44 may extend between the controller 32 and the first LiDAR sensor 26. The second pathway 46 may extend between the controller 32 and the second LiDAR sensor 28, and the third pathway 47 may extend between the controller 32 and the mount device 30.
In the example of a semi-autonomous host vehicle 20, the host vehicle may typically be driven by an operator 48. In this case, an automation system (not shown) may provide assistance to the operator 48. This assistance may be the mere activation of a warning device 50.
The LiDAR sensors 26, 28 are generally known to one having skill in the art and, when in an aligned position, are configured to at least assist in the detection and monitoring of the object 24. More specifically, each LiDAR sensor 26, 28 may include a large array of individual light or laser beams that are pulsed at a predetermined frequency. Sensor(s) included as part of the LiDAR sensors 26, 28 are configured to detect the reflected, or returned, light. The time between the initial pulsing of the light and the sensed light return is used to calculate the distance to the reflecting object surface. The rapid pulsing of the LiDAR sensors 26, 28 and the information obtained can be processed to determine movement of the detected object 24. Similarly, at any given moment in time, the location of the object 24 may be determined within a three-dimensional space.
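Purely by way of illustration, the time-of-flight relationship described above may be sketched as a short routine; the function and constant names are illustrative only and are not part of the disclosed embodiments.

    # Illustrative sketch: converting a LiDAR pulse round-trip time to range.
    SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

    def range_from_round_trip(round_trip_time_s: float) -> float:
        # The light travels out to the reflecting surface and back, so the
        # one-way range is half of the round-trip distance.
        return 0.5 * SPEED_OF_LIGHT_M_PER_S * round_trip_time_s

    # Example: a return sensed 200 nanoseconds after the pulse is emitted
    # corresponds to a surface roughly 30 meters away.
    print(range_from_round_trip(200e-9))  # ~29.98 m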
The application 60 may include a database 62, a recognition module 64, a location assignment module 66, and a comparison module 68. The recognition module 64 is configured to receive first and second LiDAR signals (see arrows 70, 72) from the respective LiDAR sensors 26, 28 over the respective pathways 44, 46. Once received, the recognition module 64 processes the signals 70, 72 to recognize the target. For purposes of sensor alignment, the recognition module 64 may recognize a target 24 that is detected by both sensors 26, 28 at a given, singular moment in time.
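Purely by way of illustration, the recognition step may be sketched as pairing detections reported by both sensors at approximately the same instant; the Detection structure, its field names, and the time-skew threshold are assumptions made for this example.

    # Illustrative sketch only: recognize a target seen by both sensors at
    # essentially the same singular moment in time.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Detection:
        timestamp_s: float   # time at which the target was sensed
        range_m: float       # measured distance to the target
        azimuth_deg: float   # bearing of the target within the sensor's region

    def recognize_common_target(first_signal: list[Detection],
                                second_signal: list[Detection],
                                max_time_skew_s: float = 0.01) -> Optional[tuple[Detection, Detection]]:
        # Return one detection from each sensor taken at essentially the same
        # instant, or None if the target was not seen by both sensors.
        for a in first_signal:
            for b in second_signal:
                if abs(a.timestamp_s - b.timestamp_s) <= max_time_skew_s:
                    return a, b
        return None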
The location assignment module 66 is configured to utilize a first coordinate map 74 stored in the database 62 to determine and assign a first mapped location 75 of the target 24. Similarly, the location assignment module 66 is configured to utilize a second coordinate map 76, also stored in the database 62 and associated with the second region 38, to determine and assign a true second mapped location 78 of the target 24, regardless of whether the second LiDAR sensor 28 is aligned or misaligned.
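Purely by way of illustration, the assignment of a mapped location may be sketched as converting a detection's range and azimuth into planar coordinates on the sensor's own coordinate map; the planar model and the example numeric values are assumptions.

    # Illustrative sketch only: assigning a mapped location on a sensor's
    # coordinate map from a detection's range and azimuth.
    import math

    def mapped_location(range_m: float, azimuth_deg: float) -> tuple[float, float]:
        # (x, y) on the sensor's coordinate map, with x along the sensor
        # boresight and y to the left of the boresight.
        azimuth_rad = math.radians(azimuth_deg)
        return (range_m * math.cos(azimuth_rad), range_m * math.sin(azimuth_rad))

    first_mapped_location = mapped_location(20.0, 10.0)          # e.g., location 75
    true_second_mapped_location = mapped_location(21.5, -12.0)   # e.g., location 78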
In the present example, the first LiDAR sensor 26 may be assumed to be aligned and is thus used as a reference to determine alignment of the second LiDAR sensor 28. To assist in this execution, the location assignment module 66 may further utilize an alignment model 80 preprogrammed into the database 62 of the electronic storage medium 58. The alignment model 80 is generally a reference that orientates the first and second coordinate maps 74, 76 with respect to one another, assuming that the first and second LiDAR sensors 26, 28 are aligned. Utilizing the first mapped location 75 and the model 80, the location assignment module 66 may determine a modeled second location 82 of the target 24 relative to the second coordinate map 76.
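Purely by way of illustration, the alignment model may be sketched as a preprogrammed rigid transform (rotation plus translation) relating the first coordinate map to the second when both sensors are aligned; the key names and numeric values of ALIGNMENT_MODEL are placeholders, not disclosed values.

    # Illustrative sketch only: predict where the target should appear on the
    # second coordinate map, given its first mapped location and the model.
    import math

    ALIGNMENT_MODEL = {
        "yaw_deg": 0.0,       # assumed relative heading between the two sensors
        "offset_x_m": 0.0,    # assumed lever arm between the two mountings
        "offset_y_m": -1.6,
    }

    def modeled_second_location(first_mapped: tuple[float, float]) -> tuple[float, float]:
        yaw = math.radians(ALIGNMENT_MODEL["yaw_deg"])
        # Translate into the second sensor's mounting position, then rotate
        # into the second coordinate map's orientation.
        dx = first_mapped[0] - ALIGNMENT_MODEL["offset_x_m"]
        dy = first_mapped[1] - ALIGNMENT_MODEL["offset_y_m"]
        return (dx * math.cos(yaw) + dy * math.sin(yaw),
                -dx * math.sin(yaw) + dy * math.cos(yaw))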
The comparison module 68 is configured to compare the true second mapped location 78 to the modeled second location 82. If the two locations 78, 82 generally match, the application 60 may determine that the LiDAR sensors 26, 28 are aligned. If the two locations 78, 82 do not match, the application 60 may utilize the coordinates to determine how far, and in which direction, the second LiDAR sensor 28 is out of alignment.
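Purely by way of illustration, the comparison step may be sketched as follows; the match tolerance is an assumed value.

    # Illustrative sketch only: if the modeled and true mapped locations
    # essentially match, the sensors are treated as aligned; otherwise the
    # residual yields the misalignment magnitude and direction.
    import math

    def compare_locations(modeled: tuple[float, float],
                          true_mapped: tuple[float, float],
                          tolerance_m: float = 0.2) -> tuple[bool, float, float]:
        dx = true_mapped[0] - modeled[0]
        dy = true_mapped[1] - modeled[1]
        magnitude_m = math.hypot(dx, dy)                    # how far out of alignment
        direction_deg = math.degrees(math.atan2(dy, dx))    # in which direction
        return magnitude_m <= tolerance_m, magnitude_m, direction_deg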
If the second LiDAR sensor 28 is misaligned, the controller 32 may initiate an action by sending a command signal (see arrow 84) to the mount device 30 over the pathway 47, which may cause the mount device 30 to realign the LiDAR sensor 28 by a magnitude and direction that may be determined by the comparison of the locations 78, 82. In another embodiment, a command signal (see arrow 86) may be sent to the warning device 50, as the action, to notify the operator 48 of the misalignment.
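Purely by way of illustration, the resulting action may be sketched as follows; the mount_device and warning_device interfaces (realign, notify) are hypothetical and are shown solely to indicate the flow of the command signals 84 and 86.

    # Illustrative sketch only: realign the second sensor via the mount device,
    # or notify the operator of the misalignment.
    def take_action(aligned: bool, magnitude_m: float, direction_deg: float,
                    mount_device=None, warning_device=None) -> None:
        if aligned:
            return
        if mount_device is not None:
            # Command signal 84: drive the mount by the computed correction.
            mount_device.realign(magnitude_m, direction_deg)
        elif warning_device is not None:
            # Command signal 86: notify the operator of the misalignment.
            warning_device.notify("LiDAR sensor misaligned")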
Accordingly, the LiDAR sensor alignment system 22 for automated operation of the host vehicle 20 advances the automated vehicle arts by enabling a system, application, or controller to perform self-diagnostics, thereby improving overall vehicle accuracy, efficiency, and reliability.
The various functions described above may be implemented or supported by a computer program that is formed from computer readable program codes, and that is embodied in a computer readable medium. Computer readable program codes may include source codes, object codes, executable codes, and others. Computer readable media may be any type of media capable of being accessed by a computer, and may include Read Only Memory (ROM), Random Access Memory (RAM), a hard disk drive, a compact disc (CD), a digital video disc (DVD), or other forms.
Terms used herein such as component, application, module, system, and the like are intended to refer to a computer-related entity, either hardware, a combination of hardware and software, or software in execution. By way of example, an application may be, but is not limited to, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer. It is understood that both an application running on a server and the server itself may be a component. One or more applications may reside within a process and/or thread of execution, and an application may be localized on one computer and/or distributed between two or more computers.
While the invention has been described in detail in connection with only a limited number of embodiments, it should be readily understood that the invention is not limited to such disclosed embodiments. Rather, the invention can be modified to incorporate any number of variations, alterations, substitutions or equivalent arrangements not heretofore described, but which are commensurate with the spirit and scope of the invention. Additionally, while various embodiments of the invention have been described, it is to be understood that aspects of the invention may include only some of the described embodiments. Accordingly, the invention is not to be seen as limited by the foregoing description.