The present disclosure relates to automated vehicles, and more particularly, to Light Detection and Ranging (LiDAR) tracking systems of the automated vehicles.
The operation of modern vehicles is becoming increasingly autonomous, reducing the need for driver intervention. A control feature of such a modern vehicle may cause the vehicle to recognize a moving object (e.g., another vehicle) and react accordingly. The recognition of such a moving object may include determinations of its dimensions, speed, travel direction, and distance at any given moment in time. Unfortunately, in some applications, the moving object may, at least momentarily, move behind an obstructing object, thereby interrupting the recognition of the moving object and potentially delaying the reaction, or causing the modern vehicle to react overly conservatively.
In one, non-limiting, exemplary embodiment, a tracking system for at least partial automated operation of a host vehicle is configured to detect and monitor a moving object that may be at least momentarily, and at least partially, obstructed by an obstruction. The tracking system includes an object device and a controller. The object device is configured to detect the object with respect to the obstruction by monitoring for the object and the obstruction at a prescribed frequency, and output a plurality of object signals at the prescribed frequency. The controller is configured to receive and process the plurality of object signals to recognize the object, determine a reference point of the object, and utilize the reference point to determine a true speed of the object as the object is increasingly or decreasingly obstructed by the obstruction.
In another, non-limiting, embodiment, an automated vehicle includes a controller and a tracking system. The controller includes a processor and an electronic storage medium. The tracking system includes a LiDAR device configured to detect a moving object and send a plurality of object signals to the controller. An application is stored in the electronic storage medium and executed by the processor to determine a reference point of the moving object based at least in-part on the plurality of object signals. The application utilizes the reference point to determine a true speed of the moving object as the moving object is at least momentarily increasingly or decreasingly obstructed by an obstruction.
In another, non-limiting, embodiment, a computer software product is executed by a controller of a host vehicle, and is configured to receive an object signal associated with a moving object and receive a positioning signal associated with the host vehicle to effect an automated reaction of the host vehicle based on at least a true speed of the moving object. The true speed of the moving object is determined as the moving object is increasingly or decreasingly obstructed by an obstruction. The computer software product includes an object recognition module and a determination module. The object recognition module is configured to receive the object signal to recognize the moving object. The determination module is configured to assign a fixed reference point upon the moving object once recognized to determine the true speed of the moving object.
These and other advantages and features will become more apparent from the following description taken in conjunction with the drawings.
The subject matter which is regarded as the invention is particularly pointed out and distinctly claimed in the claims at the conclusion of the specification. The foregoing and other features, and advantages of the invention are apparent from the following detailed description taken in conjunction with the accompanying drawings in which:
In the example of the moving object 28 being a vehicle, the vehicle 28 may include a front face 42, a rear face 44, a passenger side 46, and an operator side 48 all spanning between respective vehicle corners 50, 52, 54, 56. For tracking purposes, the vehicle 28 may further include a reference point 58 that may be a center point. In the illustrated example, the center point 58 is generally the center of a ‘footprint’ of the vehicle 28. In another example, the center point 58 may be the center of one of the sides 46, 48 being, at least in-part, viewed by the tracking system 22.
In at least the example of the tracking system 22 being a LiDAR tracking system, the system 22 is adapted to generally recognize the shape and size of at least a portion of the object or vehicle 28 within the unobstructed view of the tracking system 22. As is generally known in the art of LiDAR tracking systems, the system 22 is further configured to recognize the direction of motion 30 and speed of the moving object 28. Once the moving object 28 is recognized, the tracking system 22 is configured to timely initiate and/or coordinate an appropriate response, or reaction, by the host vehicle 20. That is, in the example of an autonomous vehicle 20 (i.e., fully automated), the tracking system may control (or effect the control of) the speed, steering, brakes, and other aspects of the host vehicle operation generally needed for the host vehicle 20 to travel upon the roadway 26 without interaction from an occupant, or operator 60 (see
Referring to
Referring to
The object device 68 of the system 22 may be at least one LiDAR device as is generally known to one having skill in the art, and is configured to detect and monitor the moving object 28 and the obstruction 36. More specifically, the LiDAR device 68 may include a large array of individual light or laser beams that are pulsed at a predetermined frequency. Sensor(s) included as part of the LiDAR device 68 are configured to detect the reflected, or returned, light. The time between the initial pulsing of the light and the sensed light return is used to calculate the distance of the reflecting object surface. The rapid pulsing of the device 68 and the information obtained can be processed to determine movement of the detected object 28 and/or obstruction 36.
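The time-of-flight calculation described above may be sketched as follows. This is an illustrative sketch assumed for this description, not code from the disclosure; the function name and units are hypothetical.

```python
# Illustrative time-of-flight range calculation, as performed for each
# pulsed beam of a LiDAR device such as device 68 (assumed sketch).
C = 299_792_458.0  # speed of light in a vacuum, m/s

def range_from_return(t_emit_s: float, t_return_s: float) -> float:
    """Distance to the reflecting surface from one pulse round trip.

    The pulse travels out to the surface and back, so the one-way
    distance is half the round-trip time multiplied by c.
    """
    round_trip_s = t_return_s - t_emit_s
    return C * round_trip_s / 2.0

# A return sensed 200 ns after emission corresponds to roughly 30 m.
print(range_from_return(0.0, 200e-9))
```

Repeating this calculation at the prescribed pulse frequency yields the successive range measurements from which movement of the object 28 and obstruction 36 may be determined.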
The object device 68 may be mounted toward the front of the host vehicle 20. Alternatively, the object device 68 may be a plurality of LiDAR devices with each device mounted proximate to a respective corner 50, 52, 54, 56 of the host vehicle 20. In yet another example, the LiDAR device 68 may include the capability of rotating at a known frequency to capture a three-hundred-and-sixty-degree scene. The application 80 may include an integration module 82, an object recognition module 84, a center point determination module 86, and an object database 88.
In at least the example of multiple LiDAR devices 68, the integration module 82 may be configured to integrate the multiple signals 90 received from the multiple LiDAR devices 68. The object recognition module 84 may be configured to receive the processed, integrated form of the multiple signals 90 from the integration module 82 when multiple LiDAR devices 68 are utilized. In the example of a single LiDAR device 68, the object recognition module 84 may receive the signal 90 directly from the LiDAR device 68.
The positioning device 70 of the tracking system 22 may be configured to determine a relative position, speed, and direction of the host vehicle 20. This positioning data may be sent to the application 80, executed by the controller 74, as a signal 92 and is generally coordinated with the signal or data 90 sent from the object device 68 in order for the application 80 to determine a desired host vehicle reaction. The positioning device 70 may be, or may include, a motion sensor 70A, a geographic navigation device 70B (e.g., global positioning system (GPS)), a speed sensor 70C, and/or other devices configured to determine the position, speed, and direction of movement of the host vehicle 20. The positioning device 70 may be mounted at the front of the host vehicle 20, but other locations such as on the roof of the host vehicle 20, or within the occupant compartment and looking through the windshield of the host vehicle 20 are also contemplated.
During operation of more typical automated vehicles, traditional LiDAR tracking systems may sense a moving object, and from what is clearly viewable, may determine a location of the moving object and a speed. In a scenario where the same traditional tracking system is sensing the moving object that begins to move behind an obstruction (i.e., an object that prevents the tracking system from sensing the entire moving object), the tracking system may incorrectly determine that the moving object is slowing down, or moving slower than it actually is, and may not be capable of recognizing the ‘true’ location of a forward portion of the moving object. Similarly, in a scenario where the moving object is emerging from behind an obstruction, the traditional tracking system may determine that the moving object is moving faster than it actually is.
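The speed error described above can be illustrated with a simplified one-dimensional sketch. This example is an assumption of this description (the geometry and numbers are hypothetical, not from the disclosure): a traditional tracker that takes the centroid of only the visible extent of the vehicle will report a falling speed as the vehicle's front moves behind an occluder, even though the vehicle has not slowed.

```python
# Sketch of the occlusion bias of a traditional tracker (assumed example).
def visible_centroid(front_x: float, rear_x: float, occluder_x: float) -> float:
    """Centroid of the visible span of a vehicle moving in +x,
    clipped where the occluder hides everything beyond occluder_x."""
    visible_front = min(front_x, occluder_x)
    return (visible_front + rear_x) / 2.0

# Vehicle of length 4 m moving at a true 2 m/s; occluder edge at x = 10 m.
speed_mps = 2.0
positions = [(6.0 + speed_mps * t, 2.0 + speed_mps * t) for t in range(4)]
centroids = [visible_centroid(front, rear, 10.0) for front, rear in positions]
apparent_speeds = [b - a for a, b in zip(centroids, centroids[1:])]

# Once the front passes the occluder edge, the apparent per-second speed
# drops below the true 2 m/s even though the vehicle has not slowed.
print(apparent_speeds)
```

The mirror-image error occurs as the vehicle emerges: the visible span grows faster than the vehicle moves, and the apparent speed is too high.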
Referring to
As the moving object 28 begins to move behind the obstruction 36, the reference point 58 remains fixed, and the application 80 continues to determine the object speed via the reference point 58, and not via the entire portion of the moving object 28 that remains viewable (i.e., that portion not yet behind the obstruction 36). When the reference point 58 is behind the obstruction 36, the application 80 may utilize the last calculated speed of the moving object 28 stored in the electronic storage medium 78 of the controller 74. In an example where the moving object 28 is decelerating or accelerating, the application 80 may also use the last recorded deceleration or acceleration rate of the moving object 28. Similarly, in an example where the moving object 28 is turning or changing direction, the application 80 may utilize the last recorded positioning vectors.
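The propagation of the last stored speed and acceleration while the reference point is hidden can be sketched as a simple dead-reckoning step. This is a minimal sketch under assumed names and units, not code from the disclosure; the `TrackState` structure and `propagate_occluded` helper are hypothetical.

```python
# Minimal dead-reckoning sketch (assumed): while reference point 58 is
# hidden, propagate the last speed and acceleration values described as
# being stored in the electronic storage medium 78.
from dataclasses import dataclass

@dataclass
class TrackState:
    position_m: float   # last known position of the reference point
    speed_mps: float    # last calculated speed
    accel_mps2: float   # last calculated acceleration (or deceleration)

def propagate_occluded(state: TrackState, dt_s: float) -> TrackState:
    """Estimate the hidden reference point after dt_s seconds, using
    constant-acceleration kinematics from the last stored values."""
    new_speed = state.speed_mps + state.accel_mps2 * dt_s
    new_pos = (state.position_m
               + state.speed_mps * dt_s
               + 0.5 * state.accel_mps2 * dt_s ** 2)
    return TrackState(new_pos, new_speed, state.accel_mps2)

# Vehicle last seen at 12 m, 10 m/s, braking at -1 m/s^2; 0.5 s later:
print(propagate_occluded(TrackState(12.0, 10.0, -1.0), 0.5))
```

A turning object would be handled analogously by propagating the last recorded positioning vectors in two dimensions.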
Referring to
Referring to
In another embodiment, recognition of the moving object 28 may be more, or different, than the measurement between the forward and rearward corners 52, 54. That is, the signal 90 sent by the LiDAR device 68 may contain enough information for the recognition module 84 to determine what the object is by accessing prescribed data in the object database 88. For example, the LiDAR device 68 may sense only a portion of a side 46 of the moving object 28. However, this limited information may be used to search for, as one example, a vehicle type or model. Once the vehicle model is known, the length and/or width of the vehicle may also be accessed.
In another embodiment, the reference point 58 may be the center of a ‘footprint’ of the moving object 28. If the moving object 28 is a vehicle, and if the length and width of the vehicle is determined, as discussed above, the reference point 58 of the ‘footprint’ (i.e., center of an area) may also be determined.
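Placing the reference point at the footprint center from a sensed corner and known dimensions can be sketched as follows. The helper name and the unit-heading parameterization are assumptions of this description, not from the disclosure.

```python
# Sketch (assumed helper): locating reference point 58 at the center of
# the vehicle 'footprint' once length and width are known, e.g., from a
# model looked up in object database 88.
def footprint_center(corner_x: float, corner_y: float,
                     heading_cos: float, heading_sin: float,
                     length_m: float, width_m: float) -> tuple:
    """Center of a rectangular footprint given one sensed corner, the
    unit heading (cos, sin) along the vehicle length, and the known
    dimensions. The lateral direction is the heading rotated +90 degrees."""
    cx = corner_x + 0.5 * (length_m * heading_cos - width_m * heading_sin)
    cy = corner_y + 0.5 * (length_m * heading_sin + width_m * heading_cos)
    return (cx, cy)

# Axis-aligned example: corner at the origin, heading along +x,
# a 4 m by 2 m footprint.
print(footprint_center(0.0, 0.0, 1.0, 0.0, 4.0, 2.0))
```

Because the footprint center is fixed in the vehicle frame, it serves the same role as the corner-based reference point 58 as the vehicle 28 is progressively obstructed.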
Accordingly, the tracking system 22 for automated operation of the host vehicle 20 advances the automated vehicle arts by enabling a system, application, or controller to react more quickly, efficiently, and/or accurately to a moving object that may be at least partially blocked by an obstruction at any given moment in time. In addition, the present disclosure provides a tracking system capable of determining a true moving object speed and a true object location as the moving object moves behind, or emerges from, an obstruction.
The various functions described above may be implemented or supported by a computer program that is formed from computer readable program code, and that is embodied in a computer readable medium. Computer readable program code may include source code, object code, executable code, and others. A computer readable medium may be any type of medium capable of being accessed by a computer, and may include Read Only Memory (ROM), Random Access Memory (RAM), a hard disk drive, a compact disc (CD), a digital video disc (DVD), or other forms.
Terms used herein such as component, application, module, system, and the like are intended to refer to a computer-related entity, either hardware, a combination of hardware and software, or software in execution. By way of example, an application may be, but is not limited to, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer. It is understood that an application running on a server, and the server itself, may each be a component. One or more applications may reside within a process and/or thread of execution, and an application may be localized on one computer and/or distributed between two or more computers.
While the invention has been described in detail in connection with only a limited number of embodiments, it should be readily understood that the invention is not limited to such disclosed embodiments. Rather, the invention can be modified to incorporate any number of variations, alterations, substitutions or equivalent arrangements not heretofore described, but which are commensurate with the spirit and scope of the invention. Additionally, while various embodiments of the invention have been described, it is to be understood that aspects of the invention may include only some of the described embodiments. Accordingly, the invention is not to be seen as limited by the foregoing description.