The field of the disclosure relates generally to autonomous vehicles and, more specifically, to lane positioning in regions of irregular lane width.
Autonomous vehicles continuously sense the environment and the road on which they are travelling, collect large amounts of data from sensors and remote sources, and process the collected data to generate detailed information about the environment. An autonomy system of the autonomous vehicle processes the data and uses the results to make decisions about future operations of the autonomous vehicle.
A foundational aspect of autonomous operation of a vehicle on a roadway is maintaining the autonomous vehicle in its lane, e.g., between lane lines. At least some known autonomy systems detect lane markers on either side of the vehicle and operate roughly equidistant from both, i.e., in the center of the lane. However, merging lanes and obscured or missing lane markings can be perceived by the autonomous vehicle as sudden changes in lane width and can introduce vehicle behaviors that appear unexpected and anomalous to passengers and to nearby drivers.
Therefore, a need exists for a simplified and efficient method of lane positioning in situations where the lane width may become irregular with respect to the rest of the road (e.g., when a lane ends or in construction zones).
This section is intended to introduce the reader to various aspects of art that may be related to various aspects of the present disclosure described or claimed below. This description is believed to be helpful in providing the reader with background information to facilitate a better understanding of the various aspects of the present disclosure. Accordingly, it should be understood that these statements are to be read in this light and not as admissions of prior art.
In one aspect, an autonomy computing system for an autonomous vehicle traveling through a lane width irregularity includes at least one memory storing executable instructions defining a lane positioning module and at least one processor. The processor, upon executing the lane positioning module, is configured to: detect the lane width irregularity; select a lane marking to be used as a guide through the lane width irregularity; and maintain a predefined distance from the lane marking through the lane width irregularity.
In another aspect, a method of lane positioning of an autonomous vehicle is provided. The autonomous vehicle is travelling through a lane width irregularity. The method includes detecting the lane width irregularity. The method includes selecting a lane marking to be used as a guide through the lane width irregularity. The method includes maintaining a predefined distance from the lane marking through the lane width irregularity.
In yet another aspect, an autonomous vehicle includes a plurality of sensors configured to detect a first lane marking and a second lane marking on the roadway, at least one memory configured to store executable instructions defining a lane positioning module, and at least one processor coupled to the at least one memory. The processor is configured, upon executing the executable instructions, to: detect a lane width irregularity based on the first lane marking and the second lane marking; select a lane marking among the first lane marking and the second lane marking to be used as a guide through the lane width irregularity; and maintain a predefined distance from the lane marking throughout the lane width irregularity.
Various refinements exist of the features noted in relation to the above-mentioned aspects. Further features may also be incorporated in the above-mentioned aspects as well. These refinements and additional features may exist individually or in any combination. For instance, various features discussed below in relation to any of the illustrated examples may be incorporated into any of the above-described aspects, alone or in any combination.
The following drawings form part of the present specification and are included to further demonstrate certain aspects of the present disclosure. The disclosure may be better understood by reference to one or more of these drawings in combination with the detailed description of specific embodiments presented herein.
Corresponding reference characters indicate corresponding parts throughout the several views of the drawings. Although specific features of various examples may be shown in some drawings and not in others, this is for convenience only. Any feature of any drawing may be referenced or claimed in combination with any feature of any other drawing.
The following detailed description and examples set forth preferred materials, components, and procedures used in accordance with the present disclosure. This description and these examples, however, are provided by way of illustration only, and nothing therein shall be deemed to be a limitation upon the overall scope of the present disclosure.
When an autonomous vehicle encounters a sudden change in lane width, for example, when a lane ends or lane markings are ambiguous or obscured, the autonomy system detects, at least momentarily, that the lane width roughly doubles where two lanes join and are no longer divided by a lane marking. This can result in a sudden steering “correction” to center the autonomous vehicle in what it detects is the new lane width, e.g., the center of the two merging lanes. Likewise, when a single lane divides into multiple lanes, new lane markings appear, resulting in a sudden steering correction to center the autonomous vehicle in one of the newly divided lanes. The behavior is often considered “jerky” by passengers and observers.
Lane width irregularities exist in myriad scenarios, including lane mergers and divisions and worn, obscured, or missing lane markings, which can be common in construction zones, for example, on temporary roadways or newly paved roads. At least some of these scenarios are often accompanied by signage indicating the upcoming road condition. For example, lane mergers are often preceded by signage, and road construction commonly uses signage for safety reasons. However, signage is not uniform, may itself be obscured, worn, or fallen, and may be ambiguously perceived by an autonomous vehicle, rendering the autonomous vehicle incapable of correlating the signage to the autonomous vehicle's lane position. In other words, signage may not reliably translate, when perceived by the autonomous vehicle, to a smooth lane positioning maneuver by the autonomous vehicle.
The disclosed systems include an autonomy computing system that achieves smooth lane positioning regardless of signage. The disclosed lane positioning methods identify the left lane marking, the right lane marking, or both, and control the autonomous vehicle to position it a set distance from the identified (or selected) lane marking.
In the example embodiment, sensors 202 may include various sensors such as, for example, radio detection and ranging (RADAR) sensors 210, light detection and ranging (LiDAR) sensors 212, cameras 214, acoustic sensors 216, temperature sensors 218, or inertial navigation system (INS) 220, which may include one or more global navigation satellite system (GNSS) receivers 222 and one or more inertial measurement units (IMU) 224. Other sensors 202 not shown in the figures may also be included.
Cameras 214 are configured to capture images of the environment surrounding autonomous vehicle 100 in any aspect or field of view (FOV). The FOV can have any angle or aspect such that images of the areas ahead of, to the side, behind, above, or below autonomous vehicle 100 may be captured. In some embodiments, the FOV may be limited to particular areas around autonomous vehicle 100 (e.g., forward of autonomous vehicle 100, to the sides of autonomous vehicle 100, etc.) or may surround 360 degrees of autonomous vehicle 100. In some embodiments, autonomous vehicle 100 includes multiple cameras 214, and the images from each of the multiple cameras 214 may be stitched or combined to generate a visual representation of the multiple cameras' FOVs, which may be used to, for example, generate a bird's eye view of the environment surrounding autonomous vehicle 100. In some embodiments, the image data generated by cameras 214 may be sent to autonomy computing system 200 or other aspects of autonomous vehicle 100, and this image data may include autonomous vehicle 100 or a generated representation of autonomous vehicle 100. In some embodiments, one or more systems or components of autonomy computing system 200 may overlay labels on the features depicted in the image data, such as on a raster layer or other semantic layer of a high-definition (HD) map.
LiDAR sensors 212 generally include a laser generator and a detector that send and receive a LiDAR signal such that LiDAR point clouds (or “LiDAR images”) of the areas ahead of, to the side of, behind, above, or below autonomous vehicle 100 can be captured. RADAR sensors 210 may include short-range RADAR (SRR), mid-range RADAR (MRR), long-range RADAR (LRR), or ground-penetrating RADAR (GPR). One or more sensors may emit radio waves, and a processor may process received reflected data (e.g., raw radar sensor data) from the emitted radio waves. In some embodiments, the system inputs from cameras 214, RADAR sensors 210, or LiDAR sensors 212 may be fused or used in combination to determine conditions (e.g., locations of other objects) around autonomous vehicle 100.
GNSS receiver 222 is positioned on autonomous vehicle 100 and may be configured to determine a location of autonomous vehicle 100, which may be embodied as GNSS data, as described herein. GNSS receiver 222 may be configured to receive one or more signals from a global navigation satellite system (e.g., Global Positioning System (GPS) constellation) to localize autonomous vehicle 100 via geolocation. In some embodiments, GNSS receiver 222 may provide an input to or be configured to interact with, update, or otherwise utilize one or more digital maps, such as an HD map (e.g., in a raster layer or other semantic map). In some embodiments, GNSS receiver 222 may provide direct velocity measurement via inspection of the Doppler effect on the signal carrier wave. Multiple GNSS receivers 222 may also provide direct measurements of the orientation of autonomous vehicle 100. For example, with two GNSS receivers 222, two attitude angles (e.g., roll and yaw) may be measured or determined. In some embodiments, autonomous vehicle 100 is configured to receive updates from an external network (e.g., a cellular network). The updates may include one or more of position data (e.g., serving as an alternative or supplement to GNSS data), speed/direction data, orientation or attitude data, traffic data, weather data, or other types of data about autonomous vehicle 100 and its environment.
IMU 224 is a micro-electro-mechanical systems (MEMS) device that measures and reports one or more features regarding the motion of autonomous vehicle 100, although other implementations are contemplated, such as mechanical, fiber-optic gyro (FOG), or FOG-on-chip (SiFOG) devices. IMU 224 may measure an acceleration, an angular rate, and/or an orientation of autonomous vehicle 100 or one or more of its individual components using a combination of accelerometers, gyroscopes, or magnetometers. IMU 224 may detect linear acceleration using one or more accelerometers, rotational rate using one or more gyroscopes, and attitude information using one or more magnetometers. In some embodiments, IMU 224 may be communicatively coupled to one or more other systems, for example, GNSS receiver 222, and may provide input to and receive output from GNSS receiver 222 such that autonomy computing system 200 is able to determine the motive characteristics (acceleration, speed/direction, orientation/attitude, etc.) of autonomous vehicle 100.
In the example embodiment, autonomy computing system 200 employs vehicle interface 204 to send commands to the various aspects of autonomous vehicle 100 that actually control the motion of autonomous vehicle 100 (e.g., engine, throttle, steering wheel, brakes, etc.) and to receive input data from one or more sensors 202 (e.g., internal sensors). External interfaces 206 are configured to enable autonomous vehicle 100 to communicate with an external network via, for example, a wired or wireless connection, such as Wi-Fi 226 or other radios 228. In embodiments including a wireless connection, the connection may be a wireless communication signal (e.g., Wi-Fi, cellular, LTE, 5G, Bluetooth, etc.).
In some embodiments, external interfaces 206 may be configured to communicate with an external network via a wired connection 244, such as, for example, during testing of autonomous vehicle 100 or when downloading mission data after completion of a trip. The connection(s) may be used to download and install various lines of code in the form of digital files (e.g., HD maps), executable programs (e.g., navigation programs), and other computer-readable code that may be used by autonomous vehicle 100 to navigate or otherwise operate, either autonomously or semi-autonomously. The digital files, executable programs, and other computer readable code may be stored locally or remotely and may be routinely updated (e.g., automatically or manually) via external interfaces 206 or updated on demand. In some embodiments, autonomous vehicle 100 may deploy with all of the data it needs to complete a mission (e.g., perception, localization, and mission planning) and may not utilize a wireless connection or other connection while underway.
In the example embodiment, autonomy computing system 200 is implemented by one or more processors and memory devices of autonomous vehicle 100. Autonomy computing system 200 includes modules, which may be hardware components (e.g., processors or other circuits) or software components (e.g., computer applications or processes executable by autonomy computing system 200), configured to generate outputs, such as control signals, based on inputs received from, for example, sensors 202. These modules may include, for example, a calibration module 230, a mapping module 232, a motion estimation module 234, a perception and understanding module 236, a behaviors and planning module 238, a control module or controller 240, and a lane positioning module 242. Lane positioning module 242, for example, may be embodied within another module, such as behaviors and planning module 238, or separately. These modules may be implemented in dedicated hardware such as, for example, an application specific integrated circuit (ASIC), field programmable gate array (FPGA), or microprocessor, or implemented as executable software modules, or firmware, written to memory and executed on one or more processors onboard autonomous vehicle 100.
Lane positioning module 242 maintains proper lane position for autonomous vehicle 100 in all conditions, e.g., regardless of signage for given road conditions. Lane positioning module 242 receives, for example, positions of left or right lane markings from perception and understanding module 236 and computes a lane position offset from the identified lane marking. Where both left and right lane markings are detected by perception and understanding module 236, in combination with sensors 202, lane positioning module 242 selects one lane marking from which lane positioning is derived.
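For illustration only, the following Python sketch shows one way such an offset computation could be performed. The point-based lane marking representation, the function names, and the 1.85 m default offset (half of a nominal 3.7 m lane) are assumptions made for this example and are not prescribed by this disclosure.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]  # (x, y) in the vehicle frame, meters; x forward, y left


def lateral_offset_to_marking(marking: List[Point]) -> float:
    """Signed lateral distance from the vehicle origin to the nearest sampled
    point of a detected lane marking (positive = marking is to the left)."""
    nearest = min(marking, key=lambda p: math.hypot(p[0], p[1]))
    return nearest[1]


def lateral_correction(marking: List[Point], predefined_offset_m: float = 1.85) -> float:
    """How far the vehicle should shift laterally so that it sits exactly the
    predefined distance from the selected guide marking (positive = shift left)."""
    offset = lateral_offset_to_marking(marking)
    desired = math.copysign(predefined_offset_m, offset)  # stay on the current side
    return offset - desired
```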
Autonomy computing system 200 of autonomous vehicle 100 may be completely autonomous (fully autonomous) or semi-autonomous. In one example, autonomy computing system 200 can operate under Level 5 autonomy (e.g., full driving automation), Level 4 autonomy (e.g., high driving automation), or Level 3 autonomy (e.g., conditional driving automation). As used herein, the term “autonomous” includes both fully autonomous and semi-autonomous.
Conventionally, autonomous vehicle 350 maintains a centered position within lane 302. In other words, autonomous vehicle 350 monitors the first outer lane marking 308 and first lane marking 310 and centers itself between these two markings. However, as autonomous vehicle 350 approaches irregularity 316, lane marking 310 disappears. As lane 302 narrows and lane marking 310 is removed, autonomous vehicle 350 centers itself between second lane marking 312 and first outer lane marking 308. In irregularity 316, autonomous vehicle 350 centers itself based upon the transition lane widths 322a, 322b, 322c, and 322d.
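To illustrate why this conventional centering behavior produces an abrupt lateral correction, consider the following back-of-the-envelope calculation, where the perceived lane momentarily spans roughly two lane widths once the dividing marking drops out. The 3.7 m nominal lane width is an assumed value used only for this example.

```python
# Illustrative arithmetic only; the 3.7 m nominal lane width is an assumption.
nominal_width_m = 3.7
merged_width_m = 2 * nominal_width_m      # dividing marking momentarily absent

center_before = nominal_width_m / 2       # 1.85 m from the outer marking
center_during = merged_width_m / 2        # 3.70 m from the outer marking

lateral_jump_m = center_during - center_before
print(f"Centering target jumps by {lateral_jump_m:.2f} m")  # ~1.85 m
```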
A planned path 330 is shown in the drawings.
A planned path 430 is shown in the drawings.
In the embodiment shown in the drawings, lane positioning module 242 selects the lane marking to be used as the guide through irregularity 416 based on the curvature of each detected lane marking.
A total absolute curvature is computed by integrating the absolute curvature |κ(s)| over the path length s. Lane positioning module 242 utilizes a “lookahead” distance to define the path length for determining curvature. The lookahead distance is at most the effective range of forward-looking sensors on autonomous vehicle 100. For example, sensors 202 may have an effective range of 100 meters or 200 meters. Left and right lane markings, for example, first outer lane marking 408 and first lane marking 410, are detected and represented as splines extending out to the lookahead distance. The total absolute curvature of each spline is then computed by lane positioning module 242 over that lookahead distance, i.e., the path length. The lane marking having the lower total absolute curvature is selected, and autonomous vehicle 100 maintains a predetermined offset from the selected lane marking.
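Expressed as a formula, the total absolute curvature over a lookahead distance L is ∫ from 0 to L of |κ(s)| ds, where κ(s) is the curvature of the lane marking spline at arc length s. The Python sketch below approximates this integral from sampled spline points (the accumulated magnitude of heading change) and selects the guide marking accordingly; the sampling-based approximation and the 100 m default lookahead are illustrative assumptions, not requirements of the disclosure.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]  # (x, y) sampled along a lane-marking spline, meters


def total_absolute_curvature(points: List[Point], lookahead_m: float = 100.0) -> float:
    """Discrete approximation of the integral of |kappa(s)| ds over the lookahead
    distance, using the heading change between successive polyline segments."""
    total, travelled = 0.0, 0.0
    for i in range(1, len(points) - 1):
        (x0, y0), (x1, y1), (x2, y2) = points[i - 1], points[i], points[i + 1]
        ds = math.hypot(x1 - x0, y1 - y0)
        travelled += ds
        if travelled > lookahead_m:
            break
        heading_in = math.atan2(y1 - y0, x1 - x0)
        heading_out = math.atan2(y2 - y1, x2 - x1)
        dtheta = (heading_out - heading_in + math.pi) % (2 * math.pi) - math.pi
        total += abs(dtheta)  # |kappa| * ds ~= |heading change| per segment
    return total


def select_guide_marking(left: List[Point], right: List[Point]) -> str:
    """Pick the marking with the lower total absolute curvature."""
    if total_absolute_curvature(left) <= total_absolute_curvature(right):
        return "left"
    return "right"
```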
Lane positioning module 242 determines when irregularity 416 ends by continuing to monitor the curvature of the lane markings. When the curvatures of the lane markings match within a predetermined threshold, the lane width is no longer irregular and is effectively constant. Lane positioning module 242, upon detecting the constant lane width, resumes operating autonomous vehicle 100 to center between the left and right lane markings, e.g., between first outer lane marking 408 and second lane marking 412.
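Continuing the sketch above (and reusing total_absolute_curvature and Point from it), one possible check for the end of the irregularity compares the accumulated curvature of the two markings against a threshold; the 0.02 rad threshold is an assumed value, not one specified by the disclosure.

```python
# Continues the previous sketch, reusing total_absolute_curvature and Point.
from typing import List


def irregularity_ended(left: List[Point], right: List[Point],
                       match_threshold: float = 0.02) -> bool:
    """True when the two markings curve alike within a threshold, i.e. the lane
    width is again effectively constant and centering between them can resume."""
    difference = abs(total_absolute_curvature(left) - total_absolute_curvature(right))
    return difference <= match_threshold
```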
One lane marking is selected 504 by lane positioning module 242 to be used as a guide through lane width irregularity 416. For example, lane positioning module 242, in certain embodiments, selects the lane marking exhibiting a lower curvature. Lane positioning module 242 then maintains 506 autonomous vehicle 100 at a predefined distance from the selected lane marking through lane width irregularity 416. For example, lane positioning module 242 may select first outer lane marking 408 and, more specifically, angled portion 418, to use as a guide. Autonomous vehicle 100 maintains planned path 430, which maintains the predefined distance from first outer lane marking 408 until the end of lane width irregularity 416 is reached.
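The detecting (502), selecting (504), and maintaining (506) operations can be tied together in a single control cycle. The sketch below is one possible arrangement, combining the helpers from the earlier sketches (lateral_offset_to_marking and select_guide_marking); the width-based detection heuristic and all numeric thresholds are assumptions for illustration only.

```python
# Combines the helpers from the sketches above; thresholds are illustrative.
import math
from typing import List


def lane_positioning_step(left: List[Point], right: List[Point],
                          nominal_width_m: float = 3.7,
                          width_tolerance_m: float = 0.5,
                          predefined_offset_m: float = 1.85) -> float:
    """One cycle of the detect (502) / select (504) / maintain (506) flow.
    Returns a lateral target in the vehicle frame (positive = steer left)."""
    y_left = lateral_offset_to_marking(left)
    y_right = lateral_offset_to_marking(right)

    # 502: detect a lane width irregularity from the apparent width at the vehicle.
    if abs(abs(y_left - y_right) - nominal_width_m) <= width_tolerance_m:
        return (y_left + y_right) / 2.0   # regular lane: center between markings

    # 504: select the lower-curvature marking as the guide.
    y_guide = y_left if select_guide_marking(left, right) == "left" else y_right

    # 506: hold the predefined distance from the guide through the irregularity.
    return y_guide - math.copysign(predefined_offset_m, y_guide)
```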
Methods described herein may be implemented, for example, on autonomy computing system 200 shown in the drawings and described above.
In the example embodiment, the memory device 604 includes one or more devices that enable information, such as executable instructions or other data (e.g., sensor data), to be stored and retrieved. Moreover, the memory device 604 includes one or more computer readable media, such as, without limitation, dynamic random access memory (DRAM), static random access memory (SRAM), a solid state disk, or a hard disk. In the example embodiment, the memory device 604 stores, without limitation, application source code, application object code, configuration data, additional input events, application states, assertion statements, validation results, or any other type of data. The computing device 600, in the example embodiment, may also include a communication interface 606 that is coupled to the processor 602 via system bus 608. Moreover, the communication interface 606 is communicatively coupled to data acquisition devices.
In the example embodiment, processor 602 may be programmed by encoding an operation using one or more executable instructions and providing the executable instructions in the memory device 604. In the example embodiment, the processor 602 is programmed to select a plurality of measurements that are received from data acquisition devices.
In operation, a computer executes computer-executable instructions embodied in one or more computer-executable components stored on one or more computer-readable media to implement aspects of the disclosure described or illustrated herein. The order of execution or performance of the operations in embodiments of the disclosure illustrated and described herein is not essential, unless otherwise specified. That is, the operations may be performed in any order, unless otherwise specified, and embodiments of the disclosure may include additional or fewer operations than those disclosed herein. For example, it is contemplated that executing or performing a particular operation before, contemporaneously with, or after another operation is within the scope of aspects of the disclosure.
An example technical effect of the methods, systems, and apparatus described herein includes at least one of: (a) identifying a lane ending without indicating signage; (b) determining a smooth path for merging; or (c) returning to normal lane positioning algorithms after performing a merge due to a lane ending.
Some embodiments involve the use of one or more electronic processing or computing devices. As used herein, the terms “processor” and “computer” and related terms, e.g., “processing device,” and “computing device” are not limited to just those integrated circuits referred to in the art as a computer, but broadly refer to one or more processors, processing devices or systems, general purpose central processing units (CPUs), graphics processing units (GPUs), microcontrollers, microcomputers, programmable logic controllers (PLCs), reduced instruction set computer (RISC) processors, field programmable gate arrays (FPGAs), digital signal processors (DSPs), application specific integrated circuits (ASICs), and other programmable circuits or processing devices capable of executing the functions described herein, and these terms are used interchangeably herein. These processing devices are generally “configured” to execute functions by programming or being programmed, or by the provisioning of instructions for execution. The above examples are not intended to limit in any way the definition or meaning of the terms processor, processing device, and related terms.
The various aspects illustrated by logical blocks, modules, circuits, processes, algorithms, and algorithm steps described above may be implemented as electronic hardware, software, or combinations of both. Certain disclosed components, blocks, modules, circuits, and steps are described in terms of their functionality, illustrating the interchangeability of their implementation in electronic hardware or software. The implementation of such functionality varies among different applications given varying system architectures and design constraints. Although such implementations may vary from application to application, they do not constitute a departure from the scope of this disclosure.
Aspects of embodiments implemented in software may be implemented in program code, application software, application programming interfaces (APIs), firmware, middleware, microcode, hardware description languages (HDLs), or any combination thereof. A code segment or machine-executable instruction may represent a procedure, a function, a subprogram, a routine, a subroutine, a module, a software package, a class, or any combination of instructions, data structures, or program statements. A code segment may be coupled to, or integrated with, another code segment or electronic hardware by passing or receiving information, data, arguments, parameters, memory contents, or memory locations. Information, arguments, parameters, data, etc. may be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, token passing, network transmission, etc.
The actual software code or specialized control hardware used to implement these systems and methods is not limiting of the claimed features or this disclosure. Thus, the operation and behavior of the systems and methods were described without reference to the specific software code, it being understood that software and control hardware can be designed to implement the systems and methods based on the description herein.
When implemented in software, the disclosed functions may be embodied, or stored, as one or more instructions or code on or in memory. In the embodiments described herein, memory may include, but is not limited to, a non-transitory computer-readable medium, such as flash memory, a random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), and non-volatile RAM (NVRAM). As used herein, the term “non-transitory computer-readable media” is intended to be representative of any tangible, computer-readable media, including, without limitation, non-transitory computer storage devices, including, without limitation, volatile and non-volatile media, and removable and non-removable media such as firmware, physical and virtual storage, CD-ROM, DVD, and any other digital source such as a network, a server, cloud system, or the Internet, as well as yet to be developed digital means, with the sole exception being a transitory propagating signal. The methods described herein may be embodied as executable instructions, e.g., “software” and “firmware,” in a non-transitory computer-readable medium.
As used herein, the terms “software” and “firmware” are interchangeable, and include any computer program stored in memory for execution by personal computers, workstations, clients, and servers. Such instructions, when executed by a processor, configure the processor to perform at least a portion of the disclosed methods.
As used herein, an element or step recited in the singular and preceded by the word “a” or “an” should be understood as not excluding plural elements or steps unless such exclusion is explicitly recited. Furthermore, references to “one embodiment” of the disclosure or an “exemplary embodiment” are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features. Likewise, limitations associated with “one embodiment” or “an embodiment” should not be interpreted as limiting to all embodiments unless explicitly recited.
Disjunctive language such as the phrase “at least one of X, Y, or Z,” unless specifically stated otherwise, is generally intended, within the context presented, to disclose that an item, term, etc. may be either X, Y, or Z, or any combination thereof (e.g., X, Y, and/or Z). Likewise, conjunctive language such as the phrase “at least one of X, Y, and Z,” unless specifically stated otherwise, is generally intended, within the context presented, to disclose at least one of X, at least one of Y, and at least one of Z.
The disclosed systems and methods are not limited to the specific embodiments described herein. Rather, components of the systems or steps of the methods may be utilized independently and separately from other described components or steps.
This written description uses examples to disclose various embodiments, which include the best mode, to enable any person skilled in the art to practice those embodiments, including making and using any devices or systems and performing any incorporated methods. The patentable scope is defined by the claims and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.