This disclosure relates generally to augmented reality (AR) and more specifically to AR driving glasses designed for use in a vehicle.
Vehicles, especially automobiles, increasingly include heads-up displays (HUDs) for displaying information at a location closer to the driver's line of sight than, for example, typical instrument clusters and dashboards. In some examples, HUDs are incorporated into the front windshield of the vehicle. HUDs can display information of use to the driver, such as vehicle speed, navigation directions, notifications, and other information. However, due to the relatively high cost of HUDs, current HUDs are small, which limits their usefulness. For example, a HUD can be limited to a small portion of the windshield, which prevents the display of information at locations on the windshield outside the HUD.
The present invention is directed to augmented reality (AR) driving methods and systems, such as glasses for use in a vehicle. In some embodiments, the AR driving glasses include one or more lenses having displays included therein. The displays present one or more images related to operation of the vehicle, such as indications of hazards, navigation directions, and/or information about the vehicle. The AR driving glasses receive information from the vehicle for generating the displayed images; both wired and wireless communication are possible. Wireless AR driving glasses include rechargeable batteries to provide power while in use, while wired configurations can be powered over a power cable. The size and location of each image are adjusted by the AR driving glasses based on data from one or more sensors (e.g., gyroscopes and/or cameras) included in the AR driving glasses and/or data from the vehicle (e.g., speedometer data). In accordance with certain embodiments, the lenses further include variable focal points, allowing wearers who use corrective lenses to use the AR driving glasses without their corrective eyewear. In some embodiments, the AR driving glasses include an iris scanner for identifying a user and updating one or more settings, such as focal point, in accordance with the identity of the user of the AR driving glasses. In some embodiments, the lenses further provide variable darkness (e.g., using electrochromic material within the lenses).
In the following description, references are made to the accompanying drawings that form a part hereof, and in which it is shown by way of illustration specific examples that can be practiced. It is to be understood that other examples can be used and structural changes can be made without departing from the scope of the disclosed examples. Further, in the context of this disclosure, “autonomous driving” (or the like) can refer to autonomous driving, partially autonomous driving, and/or driver assistance systems.
Vehicle control system 100 further includes an on-board computer 110 that is coupled to the cameras 106, sensors 107, GNSS receiver 108, map information interface 105, and communication system 150, and that is capable of receiving outputs from the sensors 107, the GNSS receiver 108, the map information interface 105, and the communication system 150. The on-board computer 110 is capable of transmitting information to the AR driving glasses to cause the AR driving glasses to display one or more images, generate one or more tactile alerts, change lens tint, and/or change lens focus. Additional functions of the AR glasses controlled by the on-board computer 110 are possible and are contemplated within the scope of this disclosure. On-board computer 110 includes one or more of storage 112, memory 116, and a processor 114. Processor 114 can perform the methods described below.
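By way of non-limiting illustration only, the following Python sketch shows one way this command path from the on-board computer 110 to the AR driving glasses could be represented; the command names, message shape, and the make_command helper are assumptions introduced for illustration, not part of this disclosure.

```python
# Hypothetical sketch of commands the on-board computer 110 might send
# to the AR driving glasses; names and message shape are assumptions.
from enum import Enum, auto

class GlassesCommand(Enum):
    DISPLAY_IMAGE = auto()    # cause the glasses to display an image
    TACTILE_ALERT = auto()    # cause the glasses to vibrate
    SET_LENS_TINT = auto()    # change lens darkness
    SET_LENS_FOCUS = auto()   # change lens focal point

def make_command(kind: GlassesCommand, **params) -> dict:
    """Build one command message for transmission to the glasses."""
    return {"command": kind.name, "params": params}

# Example: ask the glasses to darken the lenses to 60%.
msg = make_command(GlassesCommand.SET_LENS_TINT, darkness=0.6)
assert msg == {"command": "SET_LENS_TINT", "params": {"darkness": 0.6}}
```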
In some embodiments, the vehicle control system 100 is connected to (e.g., via controller 120) one or more actuator systems 130 in the vehicle and one or more indicator systems 140 in the vehicle. The one or more actuator systems 130 can include, but are not limited to, a motor 131 or engine 132, battery system 133, transmission gearing 134, suspension setup 135, brakes 136, steering system 137 and door system 138. The vehicle control system 100 controls, via controller 120, one or more of these actuator systems 130 during vehicle operation; for example, to control the vehicle during fully or partially autonomous driving operations, using the motor 131 or engine 132, battery system 133, transmission gearing 134, suspension setup 135, brakes 136 and/or steering system 137, etc. Actuator systems 130 can also include sensors that send dead reckoning information (e.g., steering information, speed information, etc.) to on-board computer 110 (e.g., via controller 120) to determine the vehicle's location and orientation. The one or more indicator systems 140 can include, but are not limited to, one or more speakers 141 in the vehicle (e.g., as part of an entertainment system in the vehicle), one or more lights 142 in the vehicle, one or more displays 143 in the vehicle (e.g., as part of a control or entertainment system in the vehicle) and one or more tactile actuators 144 in the vehicle (e.g., as part of a steering wheel or seat in the vehicle). The vehicle control system 100 controls, via controller 120, one or more of these indicator systems 140 to provide visual and/or audio indications, such as an indication that a driver will need to take control of the vehicle, for example.
In some embodiments, system 200 includes one or more sensors 220. The sensors 220 can include one or more gyroscopes 222, one or more cameras 224, an ambient light sensor 226, and/or one or more biometric sensors 228; additional sensors are possible. Gyroscopes 222 sense the orientation and movement of the AR driving glasses incorporating system 200. In some embodiments, the gyroscope 222 data are used to determine the location of one or more images displayed by system 200. Cameras 224 can include cameras directed in the direction the wearer of the glasses is looking and/or directed at the eyes of the wearer. Cameras 224 can capture images of the surroundings of system 200 to determine which images to display. Images captured of the wearer's eyes can be used to detect where the wearer is looking for the purpose of modifying the location of one or more displayed images. Cameras 224 can also be used to detect a level of ambient light to control the darkness of the lenses. Additionally or alternatively, system 200 includes an ambient light sensor 226 separate from the cameras 224 for determining the level of ambient light. In some embodiments, system 200 includes one or more biometric sensors 228 (e.g., an iris scanner) for identifying the wearer of the glasses. System 200 can personalize one or more settings of the glasses, such as variable focus 262 or other features, based on the identity of the wearer. In some embodiments, biometric sensors 228 are used to authenticate an authorized driver/user of the vehicle. When the wearer of the AR driving glasses is determined to be an authorized user of the vehicle, the AR glasses display (e.g., on display 266) an image confirming successful authentication. Optionally, successful authentication can cause the vehicle to power on, unlock, or provide some other level of vehicle access. When the wearer of the AR driving glasses is determined not to be an authorized user of the vehicle, the AR glasses display (e.g., on display 266) an image indicating authentication failure (e.g., access denied). Optionally, failed authentication can cause the vehicle to power off, lock, or deny some other level of vehicle access.
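As a minimal sketch of the authentication flow just described, the following Python snippet shows one possible decision path; the UserProfile fields, the matcher, and the glasses and vehicle interfaces are hypothetical assumptions, not part of this disclosure.

```python
# Minimal sketch of the biometric authentication flow described above.
# UserProfile, the matcher, and the glasses/vehicle interfaces are
# hypothetical; a real system would use the actual sensor 228 APIs.
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class UserProfile:
    user_id: str
    iris_template: bytes
    focal_point_diopters: float  # stored per-user lens focus setting

def authenticate_wearer(scan: bytes, profiles: list,
                        match: Callable) -> Optional[UserProfile]:
    """Return the matching profile, or None if the wearer is unknown."""
    for profile in profiles:
        if match(scan, profile.iris_template):
            return profile
    return None

def on_iris_scan(scan, profiles, match, glasses, vehicle):
    profile = authenticate_wearer(scan, profiles, match)
    if profile is not None:
        glasses.display_image("authentication_success")  # confirm on display 266
        glasses.set_focal_point(profile.focal_point_diopters)
        vehicle.unlock()   # optional: grant some level of vehicle access
    else:
        glasses.display_image("access_denied")
        vehicle.lock()     # optional: deny some level of vehicle access
```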
System 200 further includes one or more communication systems 230. Communication systems 230 can be used to communicate with the vehicle the wearer is driving or riding in (e.g., a vehicle incorporating system 100) and/or one or more electronic devices within the vehicle (e.g., a smartphone, tablet, or other consumer electronic device). In some embodiments, system 200 includes a wireless transceiver 232 configured to communicate using a wireless connection (e.g., Bluetooth, cellular, Wi-Fi, or some other wireless protocol). Additionally or alternatively, system 200 includes a wired connection 234 to one or more other systems with which it communicates. The AR driving glasses can send information, such as data from sensors 220, tactile control 250 status, and/or other information, using communication systems 230. Communication systems 230 can also receive information, such as sensor data from other devices, images for display by the AR driving glasses, alerts, and/or other information.
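A minimal sketch of what one glasses-to-vehicle message could look like follows; the JSON encoding and the field names are assumptions chosen purely for illustration.

```python
# Hypothetical telemetry message from the glasses to the vehicle over
# wireless transceiver 232 or wired connection 234; fields are assumptions.
import json
from dataclasses import dataclass, asdict

@dataclass
class GlassesTelemetry:
    gyro_rad_s: tuple      # angular rates about x, y, z, in rad/s
    gaze_norm: tuple       # normalized gaze point on the lens, (x, y)
    control_pressed: bool  # tactile control 250 state

def encode_telemetry(sample: GlassesTelemetry) -> bytes:
    """Serialize one telemetry sample for the communication link."""
    return json.dumps(asdict(sample)).encode("utf-8")

sample = GlassesTelemetry(gyro_rad_s=(0.01, -0.02, 0.0),
                          gaze_norm=(0.45, 0.52),
                          control_pressed=False)
payload = encode_telemetry(sample)  # ready to send via 232 or 234
```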
System 200 further includes one or more power systems 240. In some embodiments, system 200 includes a battery 242, which can be a rechargeable battery or a single-use battery. Additionally or alternatively, the power systems 240 include a power cable 244 that can recharge a rechargeable battery 242 or directly power system 200.
System 200 optionally includes tactile control 250. Tactile control 250 can include one or more of a tab, button, switch, knob, dial, or other control feature operable by the wearer of the AR driving glasses. In some embodiments, the tactile control 250 can be used to answer a phone call transmitted to a vehicle (e.g., by way of a mobile phone) in communication with the AR driving glasses. For example, operating the tactile control can cause a call to be answered or terminated. Tactile control 250 can optionally function to dismiss one or more alerts communicated by the AR driving glasses. Other uses of tactile control 250 are possible.
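A sketch of one possible single-control behavior follows: a press answers or ends a call if one is pending or active, and otherwise dismisses the oldest alert. The state dictionary and event names are hypothetical.

```python
# Hypothetical handling of one press of tactile control 250.

def on_tactile_control(state: dict) -> dict:
    """Advance the glasses' state on one press of the control."""
    if state.get("incoming_call"):
        state["incoming_call"] = False
        state["call_active"] = True      # answer the call
    elif state.get("call_active"):
        state["call_active"] = False     # terminate the call
    elif state.get("alerts"):
        state["alerts"].pop(0)           # dismiss the oldest alert
    return state

state = {"incoming_call": True, "alerts": ["hazard ahead"]}
state = on_tactile_control(state)
assert state["call_active"] and state["alerts"] == ["hazard ahead"]
```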
System 200 further includes a plurality of lenses 260 of the AR driving glasses. Lenses 260 can include one or more lenses in front of the wearer's eyes and/or in the wearer's periphery, as will be illustrated below.
System 200 optionally includes tactile feedback 270. In some embodiments, the AR driving glasses can include a vibrating mechanism that generates vibrations in association with one or more alerts displayed using display 266 or played using a speaker system of the vehicle (e.g., speaker 141). For example, tactile feedback 270 alerts the wearer when the vehicle is in a dangerous situation (e.g., a hazard is detected). In some embodiments, system 200 includes multiple tactile feedback mechanisms 270, allowing the AR driving glasses to produce directional tactile feedback. For example, when there is a hazard to the left of the vehicle, a tactile feedback mechanism 270 on the left side of AR driving glasses provides tactile feedback to the user.
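The directional mapping can be as simple as the following sketch, in which a hazard bearing selects which actuator(s) to fire; the bearing convention and the 15-degree dead-ahead band are assumptions introduced for illustration.

```python
# Sketch of directional tactile feedback 270: vibrate the actuator on the
# same side as the hazard. Bearing convention and threshold are assumptions.

def select_actuators(hazard_bearing_deg: float) -> list:
    """Map a hazard bearing (0 = straight ahead, positive = right of the
    vehicle, negative = left, in degrees) to the actuators to fire."""
    if abs(hazard_bearing_deg) < 15.0:
        return ["left", "right"]   # dead ahead: vibrate both sides
    return ["right"] if hazard_bearing_deg > 0 else ["left"]

assert select_actuators(-90.0) == ["left"]          # hazard to the left
assert select_actuators(5.0) == ["left", "right"]   # hazard straight ahead
```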
In some embodiments, system 200 includes computer 210. Computer 210 includes one or more controllers 212, memory 214, and one or more processors 216. Computer 210 controls one or more operations executed by systems of the AR driving glasses.
During operation, AR driving glasses 300 can present information to the wearer in the form of images and/or tactile feedback. In some embodiments, the AR driving glasses 300 receive data from the vehicle to control the information presented to the wearer, such as navigation information, hazard alerts, and other information that the computer 210 of the AR driving glasses can use to generate one or more images to be displayed on the lenses 322 and/or 324. In some embodiments, the vehicle generates and transmits the images to the AR driving glasses 300 to display. The location and size of the displayed images can be determined based on the vehicle's speed and surroundings, the wearer's head position, where the wearer is looking, and other factors.
In some embodiments, navigation image 632 is associated with navigation directions provided by the vehicle and/or a mobile device operatively coupled to the vehicle and/or to the AR driving glasses 600.
In some embodiments, warning image 634 is associated with a driver assistance system of the vehicle.
There are a number of different ways that navigation image 632 and/or warning image 634 can be generated by the system. In some embodiments, the vehicle and/or mobile device transmits to the AR driving glasses 600 information about the navigation instructions (e.g., that a right turn is the next direction) or about the other vehicle (e.g., the location of the other vehicle) and the computer 210 on the AR driving glasses 600 generates the images 632 and/or 634 using that information. In this way, the amount of data being transmitted between the AR driving glasses 600 and the vehicle and/or mobile device is relatively small while the amount of processing performed by the AR driving glasses is relatively large. In some embodiments, the AR driving glasses 600 transmit the sensor data for sizing and positioning one or more of the images 632 and 634 to the vehicle and/or mobile device and receive the navigation image 632 and/or warning image 634 to be displayed. In this way, the amount of data being transmitted between the AR driving glasses 600 and the vehicle and/or mobile device is relatively large while the amount of processing performed by the AR driving glasses is relatively small.
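The following sketch contrasts the two splits as two hypothetical receive loops; the link, renderer, and sensors interfaces are placeholders for whatever communication systems 230 and computer 210 actually provide.

```python
# Two hypothetical processing splits for producing images 632/634.

def glasses_render_mode(link, renderer):
    """Small payload in, heavy processing on the glasses (computer 210)."""
    facts = link.receive()   # e.g., {"next_turn": "right", "distance_m": 120}
    return renderer.draw_navigation(facts)   # glasses rasterize the image

def vehicle_render_mode(link, sensors):
    """Large payload out/in, light processing on the glasses."""
    link.send(sensors.sample())   # upload head pose, gaze, etc.
    return link.receive()         # receive the pre-rendered image
```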
At step 702, the AR driving glasses receive information from the vehicle. The information can include information that the AR driving glasses use to generate one or more images (e.g., navigation instructions, a type and location of a hazard, vehicle information such as speed, fuel level, climate control settings, infotainment settings, etc.) or an image to be displayed (i.e., the vehicle generates the image). In some embodiments, vehicle information images are displayed such that they are positioned over the corresponding systems of the vehicle. For example, when the user changes which air vents of the climate control system are in use (e.g., upper vents, foot vents, or defrost vents), an image is displayed that superimposes arrows near the newly activated vents. Likewise, the color of the image can correspond to a set point or a change of the set point of the climate control system. In some embodiments, the AR driving glasses can display images when the user changes settings of the vehicle's sound system, as in the sketch below. For example, when the sound balance is changed, one or more images are displayed over the locations of the speakers indicating the change in balance (e.g., when the balance is moved to the right, the sound indicator images over the right speakers increase in size while the sound indicator images over the left speakers decrease in size). Information can be received via a wireless transceiver 232 or a wired connection 234 to the vehicle. In some embodiments, the AR driving glasses can additionally or alternatively receive information from a mobile device in communication with the vehicle.
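As a sketch of the sound-balance example above, the rule below grows the right-side indicator images and shrinks the left-side ones as the balance moves right; the linear scaling rule and its constants are assumptions.

```python
# Hypothetical sizing rule for the sound-balance indicator images.

def balance_indicator_scales(balance: float, base: float = 1.0,
                             gain: float = 0.5) -> tuple:
    """balance in [-1.0 (full left), +1.0 (full right)]; returns
    (left_scale, right_scale) for the speaker indicator images."""
    left = base * (1.0 - gain * balance)
    right = base * (1.0 + gain * balance)
    return left, right

# Balance moved halfway right: right indicators enlarge, left ones shrink.
assert balance_indicator_scales(0.5) == (0.75, 1.25)
```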
At step 704, the AR driving glasses generate an image. In embodiments where the vehicle transmits an image to the AR driving glasses, generating the image includes receiving the image. Alternatively, the computer 210 of the AR driving glasses generates the image for display based on information received in step 702.
At step 706, the AR driving glasses measure the head pose (location and orientation) of the wearer. Head pose is measured based on one or more sensors 220 of the AR driving glasses, such as gyroscopes 222 and cameras 224.
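A minimal head-pose sketch follows: it integrates gyroscope 222 angular rates into an orientation estimate. A real implementation would fuse camera 224 data to correct drift; this is illustration only.

```python
# Illustrative head-pose update from one gyroscope 222 sample: integrate
# angular rate over the sample interval. Drift correction is omitted.

def integrate_orientation(orientation_deg: list,
                          gyro_deg_per_s: tuple, dt_s: float) -> list:
    """Update (yaw, pitch, roll) in degrees from one gyro sample."""
    return [angle + rate * dt_s
            for angle, rate in zip(orientation_deg, gyro_deg_per_s)]

pose = [0.0, 0.0, 0.0]
pose = integrate_orientation(pose, (10.0, 0.0, 0.0), 0.01)  # one 100 Hz sample
assert pose == [0.1, 0.0, 0.0]
```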
At step 708, the AR driving glasses measure the gaze of the wearer. Gaze is measured with one or more cameras 224 of the AR driving glasses. The cameras 224 capture one or more images of the wearer's eyes to determine where the user is looking.
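One simple gaze estimate, sketched below, maps the pupil center's offset from a calibrated rest position to a gaze angle; the linear mapping and its constants are assumptions.

```python
# Hypothetical gaze estimate from an eye image captured by cameras 224:
# pupil offset from a calibrated rest position maps linearly to gaze angle.

def gaze_angle_deg(pupil_px: tuple,
                   rest_px: tuple = (320.0, 240.0),
                   deg_per_px: float = 0.05) -> tuple:
    """Return (horizontal, vertical) gaze angles in degrees."""
    return ((pupil_px[0] - rest_px[0]) * deg_per_px,
            (pupil_px[1] - rest_px[1]) * deg_per_px)

assert gaze_angle_deg((420.0, 240.0)) == (5.0, 0.0)  # looking 5 deg right
```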
At step 710, the AR driving glasses set the image size. An image that is meant to be displayed as though it is at a particular location outside of the vehicle (e.g., navigation image 632 being displayed as though it is on the road) is sized according to the distance at which the image is supposed to appear to be located. For example, when the navigation turn is far away, the navigation image 632 is small and as the vehicle moves closer to the navigation turn, the navigation image 632 increases in size. Vehicle speed, head pose, and user gaze can also be used to determine the appropriate image size.
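The sizing rule can follow ordinary perspective projection, as in the sketch below (apparent size proportional to 1/distance); the pinhole-camera model and focal constant are assumptions.

```python
# Sketch of step 710: a world-anchored image shrinks with distance,
# per a pinhole-camera approximation. The focal constant is an assumption.

def image_size_px(real_width_m: float, distance_m: float,
                  focal_px: float = 800.0) -> float:
    """Approximate on-lens image width in pixels for a world-anchored image."""
    return focal_px * real_width_m / max(distance_m, 1.0)

# A 3 m wide turn arrow grows as the vehicle approaches the turn.
far = image_size_px(3.0, 120.0)   # turn far away: small image
near = image_size_px(3.0, 30.0)   # turn close: 4x larger image
assert near == 4 * far
```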
At step 712, the AR driving glasses set the image location. Image location is based on where the image is supposed to appear to be located (e.g., as described with reference to the navigation image 632 and warning images 634, 674, and 676), the user gaze, and the user head pose.
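A placement sketch follows: the target's bearing and elevation, offset by the current head pose, map to a pixel offset on the lens. The small-angle linearization and pixels-per-degree constant are assumptions.

```python
# Sketch of step 712: project a world-anchored target into lens
# coordinates, compensating for head yaw/pitch. Small-angle math only.

def project_to_lens(bearing_deg: float, elevation_deg: float,
                    head_yaw_deg: float, head_pitch_deg: float,
                    px_per_deg: float = 20.0) -> tuple:
    """Return (x, y) pixel offset from the lens center for the target."""
    x = (bearing_deg - head_yaw_deg) * px_per_deg
    y = (elevation_deg - head_pitch_deg) * px_per_deg
    return x, y

# Turn arrow 5 degrees right of dead ahead, head turned 2 degrees left:
assert project_to_lens(5.0, 0.0, -2.0, 0.0) == (140.0, 0.0)
```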
At step 714, the AR driving glasses display the image. The image is displayed on one or more displays 266 incorporated into the AR driving glasses lenses (e.g., lenses 260, 322, 324, 422, 424, 522, 524, 622, 624, and/or 672). The displays 266 can be controlled by the AR driving glasses' computer 210.
At step 716, the AR driving glasses optionally generate tactile feedback. The tactile feedback can be generated for one or more of the images described herein. For example, tactile feedback can be generated to notify the wearer of an upcoming navigation direction or an emerging hazard (e.g., a nearby vehicle, pedestrian, or red light).
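Tying steps 702-716 together, the following loop body is a hypothetical end-to-end sketch; every interface (link, sensors, renderer, display, haptics) is a placeholder, and only the ordering mirrors the steps described above.

```python
# Hypothetical single pass through steps 702-716.

def display_cycle(link, sensors, renderer, display, haptics):
    info = link.receive()                      # 702: receive vehicle info
    image = renderer.generate(info)            # 704: generate (or receive) image
    head_pose = sensors.head_pose()            # 706: measure head pose
    gaze = sensors.gaze()                      # 708: measure gaze
    image.size = renderer.size_for(info, head_pose, gaze)          # 710
    image.position = renderer.position_for(info, head_pose, gaze)  # 712
    display.show(image)                        # 714: display the image
    hazard = info.get("hazard")                # 716: optional tactile feedback
    if hazard:
        haptics.vibrate(side=hazard.get("side", "both"))
```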
Thus, the disclosure above describes AR driving glasses and methods of their use.
Therefore, according to the above, some examples of the disclosure are related to an augmented-reality system having an eyewear apparatus comprising: a frame; a first lens connected to the frame, the first lens comprising a display; one or more sensors; one or more processors operatively coupled to the one or more sensors; and a memory including instructions, which when executed by the one or more processors, cause the one or more processors to perform a method comprising the steps of: generating one or more images to be displayed on the display based at least on data from the one or more sensors. Additionally or alternatively, in some examples the first lens is located such that it is in front of the eyes of a wearer of the eyewear apparatus, the eyewear apparatus further comprises: a second lens, the second lens comprising a display, the second lens located such that it is at a periphery of an eye of the wearer of the eyewear apparatus, and a transceiver operatively coupled to a vehicle, the vehicle includes a side mirror camera, and the method performed by the processors further comprises the step of generating an image of the one or more images on the second lens based on data from the side mirror camera, the data received from the vehicle at the transceiver of the eyewear apparatus. Additionally or alternatively, in some examples the first lens has a variable focal point, the one or more sensors comprise an iris scanner, and the method further comprises the steps of: receiving, from the iris scanner, biometric data, matching the received biometric data to a stored user profile, and controlling the variable focal point of the first lens to become a stored focal point associated with the stored user profile. Additionally or alternatively, in some examples the one or more sensors comprise a gyroscope, and the method further comprises the steps of: receiving, from the gyroscope, one or more of motion and orientation data, and determining one or more of a location and size of the one or more images to be displayed in accordance with the one or more of motion and orientation data. Additionally or alternatively, in some examples the system further comprises one or more cameras directed towards eyes of a wearer of the eyewear apparatus, wherein the method further comprises the steps of: receiving, from the one or more cameras, one or more captured images including the eyes of the wearer, and determining one or more of a location and size of the one or more images to be displayed in accordance with the one or more captured images. Additionally or alternatively, in some examples the system further comprises a vibrating mechanism, wherein the method further comprises the steps of: detecting a hazard, generating one or more image notifications of the hazard, and while displaying the one or more image notifications of the hazard, causing the vibrating mechanism to vibrate. Additionally or alternatively, in some examples the system further comprises a connector cable couplable to a vehicle, the connector cable configured to receive power from the vehicle, and transmit information to and from the vehicle.
Additionally or alternatively, in some examples the system further comprises a wireless transceiver, wherein the method further comprises the steps of: receiving, at the wireless transceiver, vehicle speed data from a vehicle operatively coupled to the eyewear apparatus, and determining one or more of a location and size of the one or more images to be displayed in accordance with the vehicle speed data. Additionally or alternatively, in some examples the system further comprises a wireless transceiver, wherein the method further comprises the steps of: receiving, at the wireless transceiver, vehicle settings data including an indication of a change in a vehicle setting from a vehicle operatively coupled to the eyewear apparatus, generating a vehicle settings image based on the vehicle settings data, and displaying the vehicle settings image at a location on the first lens corresponding to a component of the vehicle associated with the vehicle setting.
Some examples of the disclosure are related to a method of displaying an image on an eyewear apparatus, the method comprising: receiving data from one or more sensors included in the eyewear apparatus; and generating an image for display on a display included in a first lens included in the eyewear apparatus, the image generated based at least on the data from the one or more sensors. Additionally or alternatively, in some examples the first lens is located such that it is in front of the eyes of a wearer of the eyewear apparatus, the eyewear apparatus further comprises: a second lens, the second lens comprising a display, the second lens located such that it is at a periphery of an eye of the wearer of the eyewear apparatus, and a transceiver operatively coupled to a vehicle, the vehicle includes a side mirror camera, and the method further comprises the step of generating an image of the one or more images on the second lens based on data from the side mirror camera, the data received from the vehicle at the transceiver of the eyewear apparatus. Additionally or alternatively, in some examples the one or more sensors comprise a gyroscope, and the method further comprises the steps of: receiving, from the gyroscope, one or more of motion and orientation data, and determining one or more of a location and size of the one or more images to be displayed in accordance with the one or more of motion and orientation data. Additionally or alternatively, in some examples the eyewear apparatus further includes one or more cameras directed towards eyes of a wearer of the eyewear apparatus, and the method further comprises the steps of: receiving, from the one or more cameras, one or more captured images including the eyes of the wearer, and determining one or more of a location and size of the one or more images to be displayed in accordance with the one or more captured images. Additionally or alternatively, in some examples the eyewear apparatus further comprises a vibrating mechanism, and the method further comprises the steps of: detecting a hazard, generating one or more image notifications of the hazard, and while displaying the one or more image notifications of the hazard, causing the vibrating mechanism to vibrate. Additionally or alternatively, in some examples the eyewear apparatus further comprises a wireless transceiver, and the method further comprises the steps of: receiving, at the wireless transceiver, vehicle speed data from a vehicle operatively coupled to the eyewear apparatus, and determining one or more of a location and size of the one or more images to be displayed in accordance with the vehicle speed data. Additionally or alternatively, in some examples the eyewear apparatus further comprises a wireless transceiver, and the method further comprises the steps of: receiving, at the wireless transceiver, vehicle settings data including an indication of a change in a vehicle setting from a vehicle operatively coupled to the eyewear apparatus, generating a vehicle settings image based on the vehicle settings data, and displaying the vehicle settings image at a location on the first lens corresponding to a component of the vehicle associated with the vehicle setting.
Some examples of the disclosure are related to a non-transitory computer-readable medium including instructions, which when executed by one or more processors of an eyewear apparatus, cause the one or more processors to perform a method comprising: receiving data from one or more sensors included in the eyewear apparatus; and generating an image for display on a display included in a first lens included in the eyewear apparatus, the image generated based at least on the data from the one or more sensors. Additionally or alternatively, in some examples the first lens is located such that it is in front of the eyes of a wearer of the eyewear apparatus, the eyewear apparatus further comprises: a second lens, the second lens comprising a display, the second lens located such that it is at a periphery of an eye of the wearer of the eyewear apparatus, and a transceiver operatively coupled to a vehicle, the vehicle includes a side mirror camera, and the method further comprises the step of generating an image of the one or more images on the second lens based on data from the side mirror camera, the data received from the vehicle at the transceiver of the eyewear apparatus. Additionally or alternatively, in some examples the eyewear apparatus further comprises a wireless transceiver, and the method further comprises the steps of: receiving, at the wireless transceiver, vehicle speed data from a vehicle operatively coupled to the eyewear apparatus, and determining one or more of a location and size of the one or more images to be displayed in accordance with the vehicle speed data. Additionally or alternatively, in some examples the eyewear apparatus further comprises a wireless transceiver, and the method further comprises the steps of: receiving, at the wireless transceiver, vehicle settings data including an indication of a change in a vehicle setting from a vehicle operatively coupled to the eyewear apparatus, generating a vehicle settings image based on the vehicle settings data, and displaying the vehicle settings image at a location on the first lens corresponding to a component of the vehicle associated with the vehicle setting.
Although examples have been fully described with reference to the accompanying drawings, it is to be noted that various changes and modifications will become apparent to those skilled in the art. Such changes and modifications are to be understood as being included within the scope of examples of this disclosure as defined by the appended claims.