This relates generally to electronic devices and, more particularly, to electronic devices with displays.
Electronic devices such as head-mounted devices have displays. Head-mounted devices may be used to provide a user with virtual content. In some arrangements, computer-generated content may be overlaid on top of real-world content. Cellular telephones and other portable devices have displays for presenting a user with text message content, web pages, and other images when the device is held in the user's hand.
These types of devices may be relatively inflexible and may not be able to provide a user with desired content in a variety of situations.
An electronic device may have back-to-back displays. A first of the displays may be used to display content for a user. A second of the displays may display publicly viewable images while the electronic device is being worn on the head of the user.
The first display may be operated in one or more modes. For example, the first display may be operated in a first mode in which a user may view images directly on the display while the device is being held in the hand of the user. In a second mode, the user may be presented with images such as virtual reality images while the electronic device is being worn on the head of the user and while lenses in the display are used to direct the images into an eye box. The second display may be used to display images for the user (e.g., when the user is holding the device in the first mode) or publicly viewable images (e.g., when the device is being worn on the user's head in the second mode).
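As a rough sketch (not part of the original disclosure), the mode-dependent roles of the two displays described above can be expressed as follows; the DeviceMode names and route_display_content function are hypothetical:

```python
from enum import Enum, auto

class DeviceMode(Enum):
    HANDHELD = auto()      # first mode: device held in the user's hand
    HEAD_MOUNTED = auto()  # second mode: device worn on the user's head

def route_display_content(mode: DeviceMode) -> dict:
    """Return the role of each display in the given mode (illustrative)."""
    if mode is DeviceMode.HANDHELD:
        # Both displays can show directly viewable images to the user.
        return {"first_display": "direct-view images",
                "second_display": "direct-view images"}
    # Worn on the head: lenses direct first-display images into an eye
    # box while the second display faces outward toward nearby people.
    return {"first_display": "virtual reality images via lenses",
            "second_display": "publicly viewable images"}

print(route_display_content(DeviceMode.HEAD_MOUNTED))
```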
The first display may have microlenses, tunable lens structures, holograms, lasers, and other structures for displaying images in multiple selectable eye boxes while the electronic device is being worn on the head of a user. These structures may include tunable lenses or other optical components that allow control circuitry in the device to adjust the first display to accommodate vision defects of the user such as nearsightedness.
In some configurations, a switchable diffuser may be incorporated into the second display. In the second mode of operation, the switchable diffuser may allow microlenses of the pixels of the second display to produce collimated light that is focused by a user's eyes to form virtual images. The virtual images may be overlaid on images of the real world that are captured by a front-facing camera on the electronic device while the electronic device is being worn by the user. In the first mode of operation, the switchable diffuser may diffuse light from the pixels so that the second display may be used while the device is being held in the hand of the user.
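A minimal sketch of the corresponding diffuser control rule, assuming a simple boolean mode flag (the function name and state strings are illustrative, not from the text):

```python
def diffuser_state(head_mounted: bool) -> str:
    """Select the switchable-diffuser state for the second display.

    In the second (head-mounted) mode the diffuser is clear so pixel
    microlenses emit collimated light that the eye focuses into virtual
    images; in the first (handheld) mode it diffuses pixel light for
    ordinary direct viewing.
    """
    return "clear" if head_mounted else "diffusing"

print(diffuser_state(True), diffuser_state(False))
```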
A schematic diagram of an illustrative electronic device such as a head-mounted device is shown in
Electronic device 10 may include communications circuitry for operating with external devices such as external equipment 30 over wired and/or wireless communications links such as communications link 32. Electronic device 10 may, for example, include wireless circuitry 14. Wireless circuitry 14 may include wireless communications circuitry. The wireless communications circuitry may include one or more antennas and radio-frequency transceiver circuitry for transmitting and receiving wireless signals over wireless links such as illustrative wireless link 32 with external equipment 30. If desired, external equipment 30 may be coupled to device 10 using wired connections in addition to or instead of using wireless communications. External equipment 30 may be a peer device (e.g., another device having the components of device 10 of
Wireless communications circuitry in device 10 (e.g., circuitry in wireless circuitry 14) may be used in communicating with wireless local area network equipment (e.g., WiFi® equipment in equipment 30). Wireless communications circuitry in device 10 may also communicate using cellular telephone frequencies, using near-field communications, and/or using other wireless communications bands and protocols. If desired, wireless communications circuitry or other wireless circuitry 14 in device 10 may be used to detect and/or identify electronic devices (e.g., equipment 30) associated with people in the vicinity of device 10. For example, equipment 30 may be a portable electronic device associated with an acquaintance of the user of device 10. Equipment 30 may broadcast local wireless signals that identify equipment 30 as belonging to the acquaintance of the user (e.g., short-range signals having a range of 0-10 m, at least 1 m, at least 2 m, less than 20 m, etc.). In this type of arrangement, device 10 can use wireless circuitry 14 to detect the broadcast wireless signals and thereby detect when the acquaintance of the user is in the vicinity of device 10 and the user. Other techniques for identifying nearby individuals may also be used by device 10, if desired.
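The broadcast-matching behavior might look like the following sketch, which assumes identifier/signal-strength pairs from a short-range scan; KNOWN_DEVICES, the RSSI floor, and all identifiers are made-up stand-ins for the 0-10 m range described above:

```python
# Known device identifiers and their owners (values are made up).
KNOWN_DEVICES = {"f0:9f:c2:11:22:33": "Alice (acquaintance)"}

def scan_for_acquaintances(broadcasts, rssi_floor_dbm=-70):
    """Match received short-range broadcasts against known devices.

    `broadcasts` is an iterable of (device_id, rssi_dbm) tuples; the
    RSSI floor is a crude proxy for the short range limit in the text.
    """
    nearby = []
    for device_id, rssi_dbm in broadcasts:
        if rssi_dbm >= rssi_floor_dbm and device_id in KNOWN_DEVICES:
            nearby.append(KNOWN_DEVICES[device_id])
    return nearby

print(scan_for_acquaintances([("f0:9f:c2:11:22:33", -55),
                              ("aa:bb:cc:dd:ee:ff", -40)]))
```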
Device 10 may also include input-output circuitry 16. Input-output circuitry 16 includes user input devices 18. User input devices 18 may include electrical components that allow a user of device 10 to supply control circuitry 12 with user input. For example, user input devices 18 may include buttons, joysticks, track pads, force-sensitive buttons, keyboards, gesture recognition sensors (e.g., sensors based on image sensors and/or other sensors that detect user gestures such as hand wave gestures, etc.), microphones for gathering voice commands, and/or other circuitry for gathering commands and other input from a user. If desired, devices 18 may include virtual reality gloves that track a user's hand motions and finger motions and that use these motions in controlling device 10.
Device 10 may also include environmental sensors 20. Environmental sensors 20 may include devices such as ambient light sensors, temperature sensors, humidity sensors, moisture sensors, air particulate sensors, carbon dioxide sensors and other gas concentration sensors, barometric pressure sensors and other air pressure sensors, magnetic sensors, cameras (e.g., one or more cameras that capture real-time images of the real-world environment currently surrounding device 10 so that these images may be presented in real time on a user viewable display and/or for recording images), gaze detection components (e.g., to detect a gaze of an external person in the vicinity of device 10), and/or other sensors that can gather readings on the environment surrounding the user of device 10.
User monitoring sensors 22 may be used to monitor the user of device 10. For example, sensors 22 may include image sensors (cameras) for gathering images of a user's face and other portions of a user. In some configurations, user monitoring sensors 22 may include cameras (digital image sensors) and other components that form part of a gaze tracking system. The camera(s) or other components of the gaze tracking system may face a user's eyes and may track the user's gaze (e.g., images and other information captured by the gaze tracking system may be analyzed by the circuitry of device 10 such as control circuitry 12 to determine the direction in which the user's eyes are oriented). This gaze information may be used to determine the location on a user-facing display in device 10 where the user's eyes are directed (sometimes referred to as the point of gaze of the user). If desired, the gaze tracking system may also gather information on the focus of the user's eyes and other information such as eye movement information. The gaze tracking system of user monitoring sensors 22 may sometimes be referred to as a gaze detection system, eye tracking system, or eye monitoring system. If desired, components other than cameras (e.g., infrared and/or visible light-emitting diodes and light detectors, etc.) may be used in monitoring a user's gaze.
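One way to compute the point of gaze is to intersect the tracked gaze ray with the display plane; this sketch assumes a simple millimeter coordinate system with the display at a fixed z, and is illustrative only:

```python
def point_of_gaze(eye_pos_mm, gaze_dir, display_z_mm):
    """Intersect a gaze ray with the display plane z = display_z_mm.

    eye_pos_mm: (x, y, z) tracked eye position; gaze_dir: (dx, dy, dz)
    direction toward the display. Returns (x, y) on the display in mm.
    """
    ex, ey, ez = eye_pos_mm
    dx, dy, dz = gaze_dir
    if dz == 0:
        raise ValueError("gaze ray is parallel to the display plane")
    t = (display_z_mm - ez) / dz
    return (ex + t * dx, ey + t * dy)

# Eye 40 mm from the display, looking slightly right and down.
print(point_of_gaze((0.0, 0.0, 0.0), (0.1, -0.05, 1.0), 40.0))  # (4.0, -2.0)
```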
User monitoring sensors 22 may also include heart rate sensors (e.g., optical heart rate sensors that emit light and process detected reflected light signals, pressure-based heart rate sensors, etc.), blood oxygen level sensors, perspiration sensors (e.g., sensors based on image sensors and/or moisture sensors that detect user skin moisture levels), blood pressure sensors, electrocardiogram sensors, accelerometers to measure body movements, other physiological sensors, and/or other sensors that can measure attributes associated with a user. If desired, user monitoring sensors 22 may include motion sensors that measure the motion of device 10 and the user. The motion sensors may be inertial measurement units based on components such as accelerometers, gyroscopes, and/or compasses, and/or may include other circuitry that measures motion. A motion sensor in sensors 22 may, for example, determine whether a user is sitting or is otherwise at rest or is walking, running, riding a bicycle, or is otherwise in motion and/or engaged in a physical activity.
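A toy version of such an activity classifier, assuming raw accelerometer samples in units of g and uncalibrated, purely illustrative thresholds:

```python
import math

def classify_activity(accel_samples_g):
    """Rough activity guess from accelerometer magnitude variation.

    accel_samples_g: list of (x, y, z) readings in g. The thresholds
    below are illustrative assumptions, not values from the text.
    """
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in accel_samples_g]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    if var < 0.01:
        return "at rest"
    return "walking" if var < 0.5 else "running or vigorous activity"

print(classify_activity([(0.0, 0.0, 1.0), (0.01, 0.0, 1.02), (0.0, 0.01, 0.99)]))
```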
Output devices 24 may include devices such as displays 26 and other visual output devices. In some configurations, status indicators may be used to present visual information. A status indicator or other non-display visual output device may include a light-emitting diode or other light-emitting component to convey information (e.g., a component that produces illumination using a fixed color, using multiple colors, using a time-varying light pattern, etc.). For example, a status indicator formed from a pair of light-emitting diodes of different colors may emit light of a first color when the user is busy and viewing content and may emit light of a second color when the user is not busy and is available for social interactions. In other configurations, non-status-indicator visual output devices may be used in presenting visual information such as images. Non-status-indicator visual output devices may include devices for presenting adjustable text, devices for presenting still and/or moving graphics, and displays (e.g., displays with pixel arrays having at least 1000 pixels, at least 10,000 pixels, fewer than a million pixels, or other suitable number of pixels for presenting images).
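The two-color status indicator behavior reduces to a small rule; the specific colors below are assumptions, since the text only refers to a first and a second color:

```python
def status_indicator_color(busy_viewing_content: bool) -> str:
    """Map the user's availability to one of two LED colors (assumed)."""
    return ("amber (busy, viewing content)" if busy_viewing_content
            else "green (available for interaction)")

print(status_indicator_color(True))
```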
In general, displays and other light-emitting components that emit light (e.g., light-emitting diodes, lasers such as vertical cavity surface emitting laser diodes, lamps, status indicator lights formed from multiple light sources such as these, backlit low-resolution output components such as backlit electrophoretic components, backlit patterned ink symbols, etc.) may be used to present any suitable visual information (e.g., icons, icons that flash with predetermined patterns or that have predetermined colors to convey information about the state of the user, whether content is being presented to the user, and/or other status information). Non-display components may have relatively few adjustable light-emitting components (e.g., 2-10 light-emitting diodes, fewer than 15 light-emitting diodes, at least one light-emitting diode, etc.). Displays 26, which generally include thousands of pixels or more, may be liquid crystal displays, liquid crystal-on-silicon displays, microelectromechanical systems displays, electrophoretic displays, light-emitting diode displays (e.g., organic light-emitting diode displays, displays based on pixels formed from crystalline semiconductor dies, sometimes referred to as micro-light-emitting diodes or microLEDs, etc.), or displays based on other display technologies. Displays 26 may include touch sensitive displays (e.g., displays with two-dimensional touch sensors formed from two-dimensional capacitive touch sensor electrode arrays) or may be insensitive to touch.
As shown in
Displays 26 may include one or more inwardly facing displays such as display 46 that are visible to a user of head-mounted device 10 such as user 50, who is viewing display 46 in direction 52, and may include one or more outwardly facing displays such as display 44 that are visible to people in the vicinity of user 50 such as person 54, who is viewing display 44 in direction 56. Display 46 and display 44 may include pixels P that are configured to display images. In some configurations, device 10 may be operated while being held in the hand of the user. In this operating scenario, display 46 can be adjusted to display images directly on the face of display 46 for viewing by the user.
Inwardly facing displays such as display 46, which may sometimes be referred to as user viewable displays or internal display assemblies, may have display surfaces (pixel arrays) that are oriented towards a user's eyes when device 10 is worn on a user's head. In this scenario, display 46 may be hidden from view by individuals other than the user. Outwardly facing displays, which may sometimes be referred to as publicly viewable displays or external display assemblies, may have display surfaces that are oriented away from the user. Outwardly facing displays will be visible to people in the vicinity of a user of device 10 such as person 54 but will not generally be visible to the user of device 10 such as user 50. An inwardly facing display may have the same resolution as an outwardly facing display or, if desired, the inwardly facing display may have a higher resolution than the outwardly facing display to enhance display quality for the user.
Outwardly facing displays can provide information that enables outward interactions of the user with the real world (e.g., people in the vicinity of the user). Outwardly facing displays may, for example, display information about the content that a user is viewing, information on the identity of the user, information on whether a user is occupied or is available for social interactions, and other information on the state of the user. If desired, an eye tracking system may be used to track a user's eyes and an outwardly facing display may be used to display virtual eyes that change appearance based on information on the state of the user's eyes gathered using the eye tracking system or other components. Facial features such as virtual eyebrows, emojis, and/or other content representative of the user's current emotional state may also be displayed on the outwardly facing display. An outwardly facing display may be used in forming a graphical user interface for people in the vicinity of the user (e.g., selectable on-screen items when the outwardly facing display is a touch screen or displays information responsive to voice commands from people in the vicinity of the user, etc.). When device 10 is not being worn by user 50, support structures 34 (e.g., hinged members, detachable members, etc.) may be stowed, allowing device 10 to be used as a cellular telephone, other portable electronic device, or other equipment that is not mounted on a user's head. In some configurations, portions of support structures 34 may be decoupled from display 46 (e.g., to help make device 10 more compact and pocketable).
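A sketch of how control circuitry might compose an outward-facing frame from user state; every field name here is hypothetical and the structure is illustrative only:

```python
def outward_display_frame(busy: bool, gaze_dir=None, emotion: str = "neutral") -> dict:
    """Compose the publicly viewable frame for the outward display.

    The text only says the display may show status information, virtual
    eyes driven by the eye tracker, and emotion cues; the field names
    here are assumptions.
    """
    frame = {"status": "occupied" if busy else "open to interaction",
             "emotion_overlay": emotion}
    if gaze_dir is not None:
        # Render virtual eyes that mirror the tracked gaze direction.
        frame["virtual_eyes"] = {"gaze_direction": gaze_dir}
    return frame

print(outward_display_frame(busy=True, gaze_dir=(0.1, -0.05), emotion="focused"))
```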
Displays 26, which may sometimes be referred to as back-to-back displays or oppositely oriented displays, may be attached back-to-back on opposing sides of an opaque support structure such as support structure substrate 34P of
Optical system 60 may be used to help a nearby user focus on images produced by display 46 (e.g., when the user's eyes are within 10 cm or so of display 26 while device 10 is on the user's head). With one illustrative arrangement, optical system 60 includes lenses 64 on transparent spacer 62. Lenses 64 may include left and right lenses corresponding to the left and right eyes of user 50. Lenses 64 may be Fresnel lenses, holographic lenses, or other lenses. Spacer 62 may be formed from transparent polymer, clear glass, or other transparent material.
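The benefit of lenses 64 can be seen from the thin-lens equation; the 5 cm and 6 cm figures below are illustrative assumptions (the text only places the display within roughly 10 cm of the eyes):

```python
def image_distance_cm(object_cm: float, focal_cm: float) -> float:
    """Thin-lens equation 1/f = 1/d_o + 1/d_i, solved for d_i.

    A negative result is a virtual image on the viewer's side of the
    lens, which is what a display placed inside the focal length gives.
    """
    return 1.0 / (1.0 / focal_cm - 1.0 / object_cm)

# A display 5 cm from a 6 cm focal-length lens yields d_i = -30 cm: the
# user focuses on a virtual image ~30 cm away rather than the 5 cm panel.
print(image_distance_cm(5.0, 6.0))  # -30.0
```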
In the illustrative configuration of
If desired, display 46 may have light directing structures associated with pixels P. The light directing structures, which may sometimes be referred to as light steering structures, beam steering structures, or light redirecting structures, may collimate and steer the light emitted from pixels P. Each pixel may, for example, have a prism, a microlens, a holographic microlens (sometimes referred to as a holographic light conditioning and directing device) or other structure that directs light in a desired direction. Configurations in which display 46 has pixels with microlenses may sometimes be described herein as an example. Prisms or other light directing structures may also be used.
As shown in
In each pixel P of
As device 10 shifts relative to the eyes of the user, a user's eyes may potentially move out of a fixed eye box. Accordingly, display 46 may be configured to display images in multiple selectable eye boxes. During operation, gaze tracking sensor circuitry in device 10 can be used by control circuitry 12 to determine the current location of the user's eyes and an appropriate matching eye box into which images are directed can be selected dynamically.
An illustrative configuration that allows display 46 to project images into multiple selectable eye boxes is shown in
Additional subsets of pixels P may be used to produce images in one or more additional laterally offset eye boxes, if desired. The positions of the eye boxes may be arranged to overlap slightly at the eye box edges, to provide complete coverage without gaps in the event that a user's eyes move to a different eye box position during use of device 10. The pixels P that produce images in each of the laterally offset eye boxes may use any suitable light redirecting structures for orienting emitted light in desired directions (e.g., a prism, a hologram, a microlens, etc.). If desired, the hologram overlapping each light source 72 may be a transmission hologram. Transmission holograms may not be 100% efficient, so some light that is emitted perpendicular to the surface of the display by light source 72 may leak through the transmission hologram rather than being redirected at a desired angle towards a desired eye box location. There is a risk that this leaked light, which may sometimes be referred to as zero order light, might result in undesirable image ghosts and contrast reduction. To reduce undesired effects such as these, an optical wedge may be interposed between light source 72 and the overlapping hologram. Consider, as an example, the arrangement of
Control circuitry 12 can dynamically select the set of pixels P to use in presenting images to the user (e.g., control circuitry 12 can dynamically select which eye box to use in presenting images to the user) based on gaze tracking sensor output or other information on the current position of the user's eyes relative to support structures 34 and other portions of device 10. In addition to accommodating shifting movement of device 10 relative to the head of a user, dynamic eye box control operations can be used to accommodate different user interpupillary distances.
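A minimal sketch of this dynamic eye box selection, assuming the gaze tracker reports lateral eye positions in millimeters and that the eye box centers are known design values; matching each eye independently also handles differing interpupillary distances:

```python
def select_eye_boxes(left_eye_x_mm, right_eye_x_mm, eye_box_centers_mm):
    """Pick the eye box nearest each tracked eye position.

    eye_box_centers_mm lists the lateral centers of the selectable eye
    boxes; nearest-center selection plus the slight edge overlap
    described in the text avoids gaps as device 10 shifts on the head.
    """
    def nearest(x):
        return min(range(len(eye_box_centers_mm)),
                   key=lambda i: abs(eye_box_centers_mm[i] - x))
    return nearest(left_eye_x_mm), nearest(right_eye_x_mm)

# Illustrative eye box centers (mm); tracked eyes at -31 mm and +33 mm.
centers = [-40, -32, -24, 24, 32, 40]
print(select_eye_boxes(-31.0, 33.0, centers))  # (1, 4)
```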
If desired, display 46 may have holographic structures that help expand and collimate beams of light 74 from light sources 72 (e.g., to help increase pixels per inch and image resolution). This type of arrangement is shown in
If desired, display 46 may have an array of lenses such as microlenses 70 that are formed from adjustable lens structures. These lenses exhibit variable lens power and may be tuned dynamically by control circuitry 12 to correct for a user's vision defects (nearsightedness, etc.) and/or to dynamically move an eye box (e.g., by dynamically shifting the center of the lens laterally). Variable lenses may be formed using mechanical deformation (e.g., using piezoelectric actuators, electrostatic actuators, etc.), using electro-optical refractive index adjustment structures, using tunable liquid crystal lens structures, using adjustable and/or fixed holograms, etc.
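Because thin lenses in contact add optical powers, a vision correction can be folded into the tunable lens as a simple offset; the base power below is an assumed design value, not a figure from the text:

```python
def tuned_lens_power_diopters(base_power_d: float, prescription_d: float) -> float:
    """Combine the display lens power with a user's eyeglass prescription.

    Thin lenses in contact add powers, so a nearsighted user with a
    -2.0 D prescription needs the tunable lens weakened by 2.0 D.
    """
    return base_power_d + prescription_d

# Assumed 20 D base optic adjusted for a -2.0 D (nearsighted) user.
print(tuned_lens_power_diopters(20.0, -2.0))  # 18.0
```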
An illustrative tunable liquid crystal lens 90 is shown in
As shown in
The foregoing is merely illustrative and various modifications can be made to the described embodiments. The foregoing embodiments may be implemented individually or in any combination.
This application claims the benefit of provisional patent application No. 62/618,502, filed Jan. 17, 2018, which is hereby incorporated by reference herein in its entirety.