The subject disclosure relates to display systems and, more particularly, to a system integrating autonomous driving information into a head-up display.
Drivers use many technologies to aid in the driving experience.
Current autonomous driving (AD) methodologies monitor the road and then check whether an issue is present. Some may provide warnings of an imminent or potential collision, but such warnings are of limited help in complex driving situations. Moreover, the computer vision used in AD systems is not trained to recognize some hazards and traffic signs.
Some drivers use GPS to help them navigate their trip. A GPS view on a phone display or a standalone device screen is hard to read and takes the driver's focus away from the road. In addition, scale is not intuitive on device-based GPS maps, so it is hard to tell whether something is 5 miles away or 500 feet away.
As can be seen, there is a need to improve on driver assist technologies.
In one aspect of the disclosure, a vehicle driving assist system is disclosed. The system includes a head-up display (HUD). An augmented reality (AR) engine is connected to the HUD. Sensors positioned on the vehicle detect an environment surrounding the vehicle. A processor is connected to the sensors and to the AR engine. Environmental data detected by the sensors is provided to the processor. The processor is configured to determine a presence of physical objects and a position of respective physical objects relative to the vehicle based on the environmental data. The AR engine displays on the HUD an augmented reality scene that includes one or more virtual objects associated with one of the physical objects.
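By way of illustration only, the following Python sketch models the data flow just described: sensors feed environmental data to a processor that determines physical objects and their positions relative to the vehicle, and an AR engine associates virtual objects with those physical objects for display on the HUD. The class names, field names, and coordinate conventions are assumptions made for exposition, not part of the disclosure.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class PhysicalObject:
    kind: str    # e.g. "vehicle", "lane_marker", "debris"
    x_m: float   # lateral offset from the subject vehicle, meters (assumed convention)
    y_m: float   # longitudinal distance ahead of the subject vehicle, meters

@dataclass
class VirtualObject:
    label: str
    anchor: PhysicalObject  # the physical object this overlay is associated with

def detect_objects(frames: List[dict]) -> List[PhysicalObject]:
    # Stand-in for the processor stage that determines the presence and
    # relative position of physical objects from the environmental data.
    return [PhysicalObject(f["kind"], f["x_m"], f["y_m"]) for f in frames]

def build_ar_scene(frames: List[dict]) -> List[VirtualObject]:
    # Sensors -> processor -> AR engine -> HUD, mirroring the summary above.
    return [VirtualObject(label=p.kind, anchor=p) for p in detect_objects(frames)]

scene = build_ar_scene([{"kind": "vehicle", "x_m": -1.5, "y_m": 42.0}])
```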
It is understood that other configurations of the subject technology will become readily apparent to those skilled in the art from the following detailed description, wherein various configurations of the subject technology are shown and described by way of illustration. As will be realized, the subject technology is capable of other and different configurations and its several details are capable of modification in various other respects, all without departing from the scope of the subject technology. Accordingly, the drawings and detailed description are to be regarded as illustrative in nature and not as restrictive.
The detailed description set forth below is intended as a description of various configurations of the subject technology and is not intended to represent the only configurations in which the subject technology may be practiced. The appended drawings are incorporated herein and constitute a part of the detailed description. The detailed description includes specific details for the purpose of providing a thorough understanding of the subject technology. However, it will be apparent to those skilled in the art that the subject technology may be practiced without these specific details. Like or similar components are labeled with identical element numbers for ease of understanding.
Referring to the Figures in general, embodiments of the subject technology provide a head-up display (“HUD”) system in a vehicle in which autonomous driving information is used and shown on the display in real-time. Information from sensors built into the vehicle is used for visual enhancement of the driving experience through the head-up display. In addition, some embodiments include an Autonomous Driving (“AD”) system for controlling the course of the vehicle automatically based on the information from the sensors. AD-based information may be displayed in the HUD and presented in various forms including, for example, a replica of the vehicle's surrounding driving environment (for example, simulated roads, vehicles, obstacles, and other road-related elements). Auxiliary information including vehicle speed, distance to objects, roadway lanes, maps and map directions/routes, potential vehicle collisions, and recommended vehicle positioning may be displayed.
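A minimal sketch of the kind of frame the HUD might receive, combining the simulated replica environment with the auxiliary readouts listed above, could look as follows; the field names and units are illustrative assumptions:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class AuxiliaryInfo:
    speed_kph: float
    distance_to_lead_m: Optional[float] = None
    route_hint: Optional[str] = None            # e.g. "exit in 500 m"
    collision_warning: bool = False
    recommended_position: Optional[str] = None  # e.g. "move to left lane"

@dataclass
class HudFrame:
    # Replica of the surrounding driving environment (simulated roads,
    # vehicles, obstacles, and other road-related elements).
    simulated_scene: List[dict] = field(default_factory=list)
    auxiliary: Optional[AuxiliaryInfo] = None

frame = HudFrame(
    simulated_scene=[{"type": "road"}, {"type": "vehicle", "range_m": 42.0}],
    auxiliary=AuxiliaryInfo(speed_kph=88.0, distance_to_lead_m=42.0),
)
```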
The environmental information of the current environment surrounding the vehicle may be obtained from sensors 130 positioned on various parts of the vehicle. Embodiments may position a plurality of sensors 130 so that environmental data is detected in multiple directions, or in as many directions as possible, and may be replicated onto the HUD 110. The sensors 130 may be, for example, cameras, forward looking infrared (FLIR) sensors, thermal detectors, or ultrasonic detectors. The sensors 130 may detect physical objects near the vehicle, approaching the vehicle, or far off from the vehicle. Physical objects may include, for example, other vehicles, lane markers, road boundaries such as guardrails, K-rails, and impact devices, terrain, poles, signs, lane dividers, and debris. Physical objects may be moving or still.
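As a non-limiting sketch, a detection from one of the sensors 130 might be reduced to a bearing and range, converted into a position relative to the vehicle, and classified as moving or still by comparing successive frames. The coordinate convention and speed threshold below are assumptions for exposition:

```python
import math
from typing import Tuple

# One detection from a sensor 130: bearing in radians (0 = straight ahead,
# positive = to the right) and range in meters. Conventions are illustrative.
Reading = Tuple[float, float]

def relative_position(reading: Reading) -> Tuple[float, float]:
    """Convert a bearing/range detection into (lateral, forward)
    coordinates relative to the vehicle."""
    bearing, rng = reading
    return (rng * math.sin(bearing), rng * math.cos(bearing))

def is_moving(prev: Reading, curr: Reading, dt_s: float,
              threshold_mps: float = 0.5) -> bool:
    """Classify a physical object as moving or still, relative to the
    vehicle, from two successive detections."""
    x0, y0 = relative_position(prev)
    x1, y1 = relative_position(curr)
    return math.hypot(x1 - x0, y1 - y0) / dt_s > threshold_mps

print(is_moving((0.0, 50.0), (0.0, 48.0), dt_s=0.1))  # True: closing quickly
```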
The HUD 110 may be a standalone electronic display positioned in front of or on top of a windshield (not shown). In some embodiments, the HUD 110 includes a layer of fluorinated ethylene propylene (FEP) that may be installed onto a substrate as a standalone display structure or applied to a windshield/window structure as a film. In some embodiments, the HUD 110 may be integrated into the windshield. Integrated embodiments include wiring within the glass that produces an electronic image. In some embodiments, the HUD 110 may comprise more than one display section. For example, the HUD 110 may include a front or central display area/section 111. The HUD 110 may include a driver side window display area/section 115. The HUD 110 may include a passenger side window display area/section 120. The front or central display area/section 111 may span a substantial length of the windshield or may span from approximately a left end of the dashboard (not shown) to approximately a center line of the vehicle. The driver side window display area/section 115 and the passenger side window display area/section 120 may cover a substantial portion of their respective side windows (for example, more than 50% of the window).
In some embodiments, the HUD 110 may be projected onto an existing surface including, for example, the windshield or the layer of FEP. The system 100 may include a projector 150. The projector 150 may be positioned in the cabin interior 105 and disposed to project AR imagery or a virtual simulated environment onto the HUD 110. In some embodiments, the projector 150 may be a triple projector with multiple sub-projectors 155 disposed to project onto the front or central display area/section 111, the driver side window display area/section 115, and the passenger side window display area/section 120. Some embodiments include a selectable feature that displays an AR scene onto the display area/section of the user's choice.
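The selectable-section feature might be modeled, purely for illustration, as routing an AR scene to the sub-projector assigned to the chosen display area/section; the addressing scheme shown is hypothetical:

```python
from enum import Enum

class Section(Enum):
    # Display areas/sections of the HUD 110 described above.
    CENTRAL = 111         # front or central display area/section
    DRIVER_SIDE = 115     # driver side window display area/section
    PASSENGER_SIDE = 120  # passenger side window display area/section

def route_scene(scene: dict, selected: Section) -> dict:
    # "sub_projector_id" is a hypothetical addressing scheme for the
    # sub-projectors 155; the actual interface is not specified here.
    return {"sub_projector_id": selected.value, "scene": scene}

command = route_scene({"type": "lane_overlay"}, Section.DRIVER_SIDE)
```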
Elements of the system 100 that may not be visible in the view of the cabin interior 105 are described below.
In an exemplary embodiment, the simulation engine 1445 is run at an accelerated pace to analyze the statistical probability of road conditions and the behavior of other traffic. The simulated scenarios may be used by the AD system to determine whether a collision may be imminent and whether the AD system should automatically steer the vehicle away from its current trajectory into an optimal position that avoids the collision and/or presents a better path for continued driving.
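One hedged way to realize such accelerated-pace analysis is a Monte Carlo style simulation that replays the current scenario many times faster than real time under randomized traffic behavior and reports the fraction of runs ending in a collision. The noise model, time step, and parameters below are illustrative assumptions, not the disclosed simulation engine 1445 itself:

```python
import random

def collision_probability(lead_gap_m: float, closing_speed_mps: float,
                          n_runs: int = 10_000, horizon_s: float = 3.0,
                          dt_s: float = 0.1) -> float:
    """Roll the current scenario forward many times and count the runs
    in which the gap to the lead object closes to zero."""
    hits = 0
    for _ in range(n_runs):
        gap, speed, t = lead_gap_m, closing_speed_mps, 0.0
        while t < horizon_s and gap > 0.0:
            speed += random.gauss(0.0, 0.3)  # random variation in traffic
            gap -= speed * dt_s
            t += dt_s
        if gap <= 0.0:
            hits += 1
    return hits / n_runs

p = collision_probability(lead_gap_m=15.0, closing_speed_mps=4.0)
```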
In some embodiments, the system 100 may include a vehicle control system 1420 that includes an autonomous driving control engine 1430. In AD embodiments, the vehicle control system 1420 may automatically control the course and speed of the vehicle using the autonomous driving control engine 1430. The autonomous driving control engine 1430 may direct vehicle control based on data from an on-board navigation system 1460 and the data from the sensors 130, the collision detection engine 1444, and the simulation engine 1445. The vehicle control system 1420 may automatically take control of the vehicle (if the vehicle was not already being driven autonomously) in the event of an imminent or potential collision and steer the vehicle onto a safer alternate course. In some embodiments, control of the vehicle is engaged to achieve an optimized vehicle position (for example, increased distance from a vehicle ahead of the subject vehicle, a different lane with better traffic flow up ahead, or an easier position for an upcoming lane change or merge). In some embodiments, the alternate course is shown in the HUD 110.
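Purely as a sketch of the decision logic described above, and under the assumption that the simulation engine supplies a collision probability, the engagement policy might be expressed as follows; the threshold value and action names are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Assessment:
    collision_probability: float  # e.g. from the simulation engine 1445
    better_lane_available: bool   # e.g. improved traffic flow up ahead

def choose_action(a: Assessment, ad_engaged: bool,
                  p_threshold: float = 0.2) -> str:
    """Take control on an imminent/potential collision; otherwise, if AD
    is engaged, reposition when an optimized vehicle position exists."""
    if a.collision_probability > p_threshold:
        return "take_control_and_steer_to_safer_course"
    if ad_engaged and a.better_lane_available:
        return "reposition_for_optimized_position"
    return "maintain_course"

print(choose_action(Assessment(0.35, False), ad_engaged=False))
```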
In some embodiments, the on-board navigation system 1460 may generate a digital map in the HUD 110.
Referring now to
Referring now to
In some embodiments, the AR display may be delivered through a worn device, and the system may transmit the AD information to the worn device. Some embodiments may incorporate voice control, speech-to-text, and text-to-speech features whose input/output may be included on the HUD 110/900.
Those of skill in the art would appreciate that various components and blocks may be arranged differently (e.g., arranged in a different order, or partitioned in a different way) all without departing from the scope of the subject technology.
The previous description is provided to enable any person skilled in the art to practice the various aspects described herein. The previous description provides various examples of the subject technology, and the subject technology is not limited to these examples. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects.
Aspects of the disclosed invention are described above with reference to block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each block of the block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to the processor 1410, which executes them to create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks in the figures. In some embodiments, the instructions may be a software program or software application, sometimes referred to colloquially as an “app”.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the call flow process and/or block diagram block or blocks.
The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present disclosure. In this regard, each block in the call flow process or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the blocks may occur out of the order noted in the Figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or call flow illustration, and combinations of blocks in the block diagrams and/or call flow illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
Thus, the claims are not intended to be limited to the aspects shown herein, but are to be accorded the full scope consistent with the language of the claims, wherein reference to an element in the singular is not intended to mean “one and only one” unless specifically so stated, but rather “one or more.” Unless specifically stated otherwise, the term “some” refers to one or more. Pronouns in the masculine (e.g., his) include the feminine and neuter gender (e.g., her and its) and vice versa. Headings and subheadings, if any, are used for convenience only and do not limit the invention.
Terms such as “top,” “bottom,” “front,” “rear,” “above,” “below” and the like as used in this disclosure should be understood as referring to an arbitrary frame of reference, rather than to the ordinary gravitational frame of reference. Thus, a top surface, a bottom surface, a front surface, and a rear surface may extend upwardly, downwardly, diagonally, or horizontally in a gravitational frame of reference. Similarly, an item disposed above another item may be located above or below the other item along a vertical, horizontal or diagonal direction; and an item disposed below another item may be located below or above the other item along a vertical, horizontal or diagonal direction.
A phrase such as an “aspect” does not imply that such aspect is essential to the subject technology or that such aspect applies to all configurations of the subject technology. A disclosure relating to an aspect may apply to all configurations, or one or more configurations. An aspect may provide one or more examples. A phrase such as an aspect may refer to one or more aspects and vice versa. A phrase such as an “embodiment” does not imply that such embodiment is essential to the subject technology or that such embodiment applies to all configurations of the subject technology. A disclosure relating to an embodiment may apply to all embodiments, or one or more embodiments. An embodiment may provide one or more examples. A phrase such as an embodiment may refer to one or more embodiments and vice versa. A phrase such as a “configuration” does not imply that such configuration is essential to the subject technology or that such configuration applies to all configurations of the subject technology. A disclosure relating to a configuration may apply to all configurations, or one or more configurations. A configuration may provide one or more examples. A phrase such as a configuration may refer to one or more configurations and vice versa.
The word “exemplary” is used herein to mean “serving as an example or illustration.” Any aspect or design described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other aspects or designs.
All structural and functional equivalents to the elements of the various aspects described throughout this disclosure that are known or later come to be known to those of ordinary skill in the art are expressly incorporated herein by reference and are intended to be encompassed by the claims. Moreover, nothing disclosed herein is intended to be dedicated to the public regardless of whether such disclosure is explicitly recited in the claims. No claim element is to be construed under the provisions of 35 U.S.C. § 112, sixth paragraph, unless the element is expressly recited using the phrase “means for” or, in the case of a method claim, the element is recited using the phrase “step for.” Furthermore, to the extent that the term “include,” “have,” or the like is used in the description or the claims, such term is intended to be inclusive in a manner similar to the term “comprise” as “comprise” is interpreted when employed as a transitional word in a claim.
This application claims benefit under 35 U.S.C. § 119(e) of U.S. Provisional Application having Ser. No. 63/077,384 filed Sep. 11, 2020, which is hereby incorporated by reference herein in its entirety.