This application is a continuation of U.S. patent application Ser. No. 16/460,272, filed Jul. 2, 2019, which claims priority to U.S. provisional patent application No. 62/732,993 filed Sep. 18, 2018, which are hereby incorporated by reference herein in their entireties.
This relates generally to electronic devices, and, more particularly, to electronic devices such as head-mounted devices.
Electronic devices such as head-mounted devices may have optical components such as displays and cameras.
It can be challenging to incorporate components such as displays and cameras into a head-mounted device. Space in a head-mounted device can be limited, making it difficult to mount components in desired locations. As a result, some components may not operate as well as desired.
An electronic device such as a head-mounted device may have a user display for displaying an image to a user. The head-mounted device may have a housing with head-mounted support structures. The display and lenses may be mounted in the housing. Images from the user display may be displayed in eye boxes after passing through the lenses.
A forward-facing component such as a forward-facing camera and/or an externally viewable display may be supported on a front side of the housing. The user display and the lenses may be interposed between the forward-facing component and the eye boxes. This causes the forward-facing component to be spatially separated from the eye boxes and from the face of a user.
The forward-facing component may be overlapped by an optical system. The optical system may have a double-folded light path that virtually shifts the position of the forward-facing component. The optical system may have a reflective polarizer and a partially reflective mirror that are separated by a given distance. The optical system may virtually shift the forward-facing component toward the eye box by twice the given distance.
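The doubling relationship can be illustrated numerically. The sketch below is illustrative only (the helper name and the example gap value are not from the specification); it expresses the fact that a double-folded path makes light cross the mirror gap two extra times:

```python
# Illustrative sketch of the double-folded light path geometry.
# A reflective polarizer and a partially reflective mirror separated by a
# gap S fold the light path twice, so light traverses the gap two extra
# times and the overlapped component appears shifted toward the eye boxes
# by 2 * S. Names and numbers here are hypothetical examples.

def virtual_shift(gap_mm: float) -> float:
    """Apparent shift of the overlapped component for a mirror gap of gap_mm."""
    return 2.0 * gap_mm

# e.g., a 10 mm separation between the reflective polarizer and the
# partially reflective mirror yields a 20 mm virtual shift.
print(virtual_shift(10.0))
```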
An electronic device such as a head-mounted device may be provided with head-mounted support structures that allow the electronic device to be worn on a user's head. While being worn on the user's head, a display in the electronic device may present an image for viewing by the user.
The image may include computer-generated content. The image may also include images of real-world objects captured with a forward-facing camera. An optical system with a double-folded light path may overlap the forward-facing camera. The optical system may virtually shift the viewpoint of the camera so that the viewpoint coincides with eye boxes at which the user's eyes are located. This reduces motion parallax when using the forward-facing camera to display real-world images on the display.
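To see why collocating the camera viewpoint with the eye boxes matters, the angular disparity between the camera's view and the eye's view of an object can be estimated with a simple small-geometry model. The offsets and distances below are hypothetical examples, not values from the specification:

```python
import math

def parallax_angle_deg(viewpoint_offset_m: float, object_distance_m: float) -> float:
    """Angular disparity between camera and eye views of an object at the
    given distance, for a camera displaced from the eye by the given offset."""
    return math.degrees(math.atan2(viewpoint_offset_m, object_distance_m))

# A camera mounted some 40 mm in front of the eyes sees a nearby object
# (0.5 m away) along a noticeably different angle than the eye does,
# while the error shrinks for distant objects and vanishes entirely when
# the viewpoint is virtually shifted into the eye box (offset 0).
near = parallax_angle_deg(0.04, 0.5)   # roughly 4.6 degrees
far = parallax_angle_deg(0.04, 5.0)    # roughly 0.5 degrees
zero = parallax_angle_deg(0.0, 0.5)    # 0 degrees
print(near, far, zero)
```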
If desired, the electronic device may have a forward-facing display that is viewable by people in the vicinity of the user. The display may display any suitable content. With one illustrative arrangement, the forward-facing display may display real-world images of the user's face and/or may display an image of the user's face that contains computer-generated facial features (e.g., an avatar, computer-generated content overlaid on a real-world face image, etc.). An optical system with a double-folded light path may overlap the forward-facing display, so that face images that are displayed on the forward-facing display are virtually shifted to a location that lies in a common plane with the surface of the user's face and the eye boxes. This allows images on the forward-facing display to be displayed without parallax due to changes in an observer's angular orientation when viewing the image.
An illustrative system of the type that may include an electronic device with an optical system having a double-folded light path is shown in
As shown in
Housing 12 and associated support structure 22 may be formed from metal, polymer, glass, ceramic, crystalline material such as sapphire, soft materials such as fabric and foam, wood or other natural materials, other materials, and/or combinations of these materials. Housing 12 may be configured to form an enclosure and supporting structures for eyeglasses, goggles, a helmet, a hat, a visor, and/or other head-mounted electronic device.
Display 14 may be mounted in housing 12. During operation, display 14, which may sometimes be referred to as a user display, may display an image that is viewable by a user through lenses such as lens 16 (e.g., an optical system that allows the image on display 14 to be viewed by the user). Display 14 may contain a single pixel array that spans left and right lenses and/or may have separate left and right portions associated respectively with left and right lenses. Light from display 14 is directed through the lenses in direction 18 toward left and right eye boxes 20. When electronic device 10 is being worn on user's head 24, the eyes of the user will be located in left and right eye boxes 20 and the image on display 14 can be viewed by the user. If desired, left and right portions of display 14 may be mounted in movable left and right lens modules each of which includes a corresponding lens 16. The positions of these lens module structures may, if desired, be adjusted to help focus the display image and/or to accommodate a user's head (e.g., to accommodate a user's interpupillary distance).
Device 10 may also include a forward-facing camera that captures images of real-world objects such as real-world object 34 in real time. This allows captured images of real-world objects in the user's surroundings such as object 34 to be displayed for the user in real time. If desired, computer-generated content can be overlaid on the real-world images (e.g., display 14 may display both real-world content and overlaid computer-generated content). Device 10 may also be operated in a mode in which only computer-generated content is displayed.
Device 10 may have exterior surfaces that are planar, that have a curved cross-sectional profile, and/or that have other suitable shapes. In the example of
In some configurations, device 10 may be a stand-alone device. In other situations, device 10 may be coupled to external equipment 32 using a wired or wireless link. As shown in
There may, in general, be any suitable number of external devices (equipment 32) in system 8. For example, device 10 may be a stand-alone device that operates with one or more accessory devices such as wireless controllers. As another example, device 10 may serve as an input-output accessory that interacts with a host computing device (e.g., device 10 can provide user input to equipment 32 for controlling equipment 32). The host device can optionally receive additional input from wireless controllers or other devices in system 8. During operation, a host device may supply content to device 10 for displaying to a user and/or others. The user may interact with the system by supplying input (and receiving output) using device 10 and/or using optional additional input-output devices such as wireless controllers.
Device 10 may include an optical component such as a forward-facing camera and/or a forward-facing display (sometimes referred to as an external display or externally viewable display) as shown by illustrative forward-facing optical component(s) 70. An optical system such as optical system 56 may overlap forward-facing optical component 70 (e.g., optical system 56 may overlap component 70 on front side F of device 10). Optical system 56 may have a double-folded light path that eliminates parallax associated with using optical component 70.
Input-output circuitry in device 10 such as input-output devices 42 may be used to allow data to be supplied to device 10 and to allow data to be provided from device 10 to external devices. Input-output devices 42 may include input devices that gather user input and other input and may include output devices that supply visual output, audible output, or other output. These devices may include buttons, joysticks, scrolling wheels, touch pads (e.g., capacitive touch sensors and/or other touch sensors), key pads, keyboards, microphones, speakers, tone generators, vibrators and other haptic output devices, light-emitting diodes and other status indicators, etc.
Input-output devices 42 may include one or more displays such as user display 14 (see, e.g.,
Display 14 and/or display 44 may each be, for example, an organic light-emitting diode (OLED) display with an array of thin-film organic light-emitting diode pixels, a liquid crystal display with an array of liquid crystal display pixels and an optional backlight unit, a display having an array of pixels formed from respective crystalline light-emitting diodes each of which has a respective crystalline semiconductor light-emitting diode die (sometimes referred to as microLEDs or μLEDs), a display based on a digital micromirror device or other microelectromechanical systems device (e.g., a scanning mirror), a liquid-crystal-on-silicon display, and/or another type of display. Camera 54 may be a color camera that captures real-world images of the environment surrounding the user (see, e.g., real-world object 34 of
Input-output devices 42 may include sensors 50. Sensors 50 may be mounted on external surfaces of device 10, may operate through windows or other portions of the housing for device 10, and/or may be mounted in one or more interior regions of device 10.
Sensors 50 may include force sensors (e.g., strain gauges, capacitive force sensors, resistive force sensors, etc.), audio sensors such as microphones, touch and/or proximity sensors such as capacitive sensors (e.g., a two-dimensional capacitive touch sensor integrated into a display, a two-dimensional capacitive touch sensor overlapping a display, and/or a touch sensor that forms a button, trackpad or other two-dimensional touch sensor, or other input device not associated with a display), and other sensors. If desired, sensors 50 may include optical sensors such as optical sensors that emit and detect light, ultrasonic sensors, optical touch sensors, optical proximity sensors, and/or other touch sensors and/or proximity sensors, monochromatic and color ambient light sensors, image sensors, fingerprint sensors, temperature sensors, sensors for measuring three-dimensional non-contact gestures (“air gestures”), pressure sensors, sensors for detecting position, orientation, and/or motion (e.g., accelerometers, magnetic sensors such as compass sensors, gyroscopes, and/or inertial measurement units that contain some or all of these sensors), health sensors, radio-frequency sensors (e.g., sensors that gather position information, three-dimensional radio-frequency images, and/or other information using radar principles or other radio-frequency sensing), depth sensors (e.g., structured light sensors and/or depth sensors based on stereo imaging devices), optical sensors such as self-mixing sensors and light detection and ranging (lidar) sensors that gather time-of-flight measurements, humidity sensors, moisture sensors, gaze tracking sensors, three-dimensional sensors (e.g., pairs of two-dimensional image sensors that gather three-dimensional images using binocular vision, three-dimensional structured light sensors that emit an array of infrared light beams or other structured light using arrays of lasers or other light emitters and associated optical components and that 
capture images of the spots created as the beams illuminate target objects, and/or other three-dimensional image sensors), facial recognition sensors based on three-dimensional image sensors, visual odometry sensors based on image sensors, and/or other sensors. In some arrangements, device 10 may use sensors 50 and/or other input-output devices to gather user input (e.g., buttons may be used to gather button press input, touch sensors overlapping displays can be used for gathering user touch screen input, touch pads may be used in gathering touch input, microphones may be used for gathering audio input, etc.).
If desired, electronic device 10 may include additional components 52. Components 52 may include haptic output devices, audio output devices such as speakers, light sources such as light-emitting diodes (e.g., crystalline semiconductor light-emitting diodes for status indicators and/or components), other optical output devices, and/or other circuitry for gathering input and/or providing output. Device 10 may also include an optional battery or other energy storage device, connector ports such as port 28 of
Optical system 56′ may use any suitable components for creating a double-folded light path for real-world image light passing to camera 54. In the example of
During operation, real-world image light ray 92 passes through circular polarizer 86 and becomes circularly polarized. This circularly polarized ray is transmitted through partially reflective mirror 84 as circularly polarized ray 94 and reaches quarter wave plate 82. Quarter wave plate 82 converts circularly polarized ray 94 to a linear polarization state aligned with the Y axis of
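The polarization bookkeeping in a double-folded ("pancake") path of this general type can be sketched with Jones calculus. The model below is a simplified illustration under stated conventions, not the specification's exact geometry: fields are tracked in a fixed transverse basis, each mirror reflection is modeled by a diag(1, -1) operator, the quarter-wave-plate fast axis is taken at +45° for forward passes and -45° for the backward pass, and the reflective polarizer is assumed to reflect x and transmit y:

```python
import numpy as np

def rot(theta):
    """2-D rotation matrix."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def qwp(theta):
    """Quarter-wave plate with fast axis at angle theta (global phase dropped)."""
    return rot(theta) @ np.diag([1, 1j]) @ rot(-theta)

FLIP = np.diag([1.0, -1.0])         # simplified handedness flip at each reflection
REFL_POL = np.diag([1.0, 0.0])      # reflective polarizer: reflects x ...
TRANS_POL = np.diag([0.0, 1.0])     # ... and transmits y

t = 1 / np.sqrt(2)                  # amplitude transmission of a 50/50 mirror
v = np.array([1, 1j]) / np.sqrt(2)  # circular light after the circular polarizer

v = t * v                           # transmit through the partially reflective mirror
v = qwp(np.pi / 4) @ v              # circular -> linear (x)
v = FLIP @ (REFL_POL @ v)           # reflect from the reflective polarizer
v = qwp(-np.pi / 4) @ v             # linear -> circular (backward pass)
v = FLIP @ (t * v)                  # reflect from the 50/50 mirror; handedness flips
v = qwp(np.pi / 4) @ v              # circular -> orthogonal linear (y)
v = TRANS_POL @ v                   # transmitted through the reflective polarizer

intensity = float(np.vdot(v, v).real)
print(round(intensity, 3))          # 0.25: two encounters with a 50/50 mirror
```

The 25% throughput is the expected cost of traversing a 50/50 partially reflective mirror twice, and the output emerges linearly polarized along the transmission axis of the reflective polarizer, so it exits the folded cavity instead of being reflected a third time.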
As shown in
In the example of
As with optical system 56′ of
Mirrors 80 and 84 are separated by distance S. Light from an image displayed by external display 44 reflects from mirrors 84 and 80 before exiting optical system 56″ as light 116 for observation by an observer who is looking at device 10 in direction 106. Because of these double reflections, the position of external display 44 is virtually shifted by distance 2S to plane 104. The location of plane 104 is shared with the surface of the user's face on head 24 and eye boxes 20. By virtually shifting the position of display 44 so that the apparent position of display 44 to external observers is aligned with the user's face and with eye boxes 20, parallax in the displayed face image or other image on display 44 due to changes in the angle of view of the external viewer is eliminated (i.e., the image on display 44 can contain facial features and other visual elements that appear to rest directly at the outer surface of the user's head, rather than appearing at a distance 2S in front of the user's head). This can make the displayed face image more lifelike and pleasing to view, because it appears as if the displayed face image is part of the user's real face.
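The parallax-elimination geometry can be checked with a small calculation. If the face image appears a distance d in front of the plane of the user's face, an observer viewing from angle θ sees the image laterally offset from the face by roughly d·tan(θ); shifting the image into the face plane (d = 0) removes the offset at every viewing angle. The numbers below are hypothetical examples, not values from the specification:

```python
import math

def apparent_offset_mm(depth_mismatch_mm: float, view_angle_deg: float) -> float:
    """Lateral offset between the displayed face image and the real face
    plane, for an observer viewing off-axis by view_angle_deg."""
    return depth_mismatch_mm * math.tan(math.radians(view_angle_deg))

# Without the optical system: suppose the display sits 2*S = 30 mm in
# front of the face, so a 30-degree off-axis observer sees the face
# image slide about 17 mm sideways relative to the real face.
print(round(apparent_offset_mm(30.0, 30.0), 1))

# With the double-folded optical system, the image is virtually shifted
# into the face plane (depth mismatch 0), so the offset vanishes.
print(apparent_offset_mm(0.0, 30.0))
```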
As described above, one aspect of the present technology is the gathering and use of information such as sensor information. The present disclosure contemplates that in some instances, data may be gathered that includes personal information data that uniquely identifies or can be used to contact or locate a specific person. Such personal information data can include demographic data, location-based data, telephone numbers, email addresses, twitter ID's, home addresses, data or records relating to a user's health or level of fitness (e.g., vital signs measurements, medication information, exercise information), date of birth, username, password, biometric information, or any other identifying or personal information.
The present disclosure recognizes that such personal information can be used, in the present technology, to the benefit of users. For example, the personal information data can be used to deliver targeted content that is of greater interest to the user. Accordingly, use of such personal information data enables users to exercise calculated control over the delivered content. Further, other uses for personal information data that benefit the user are also contemplated by the present disclosure. For instance, health and fitness data may be used to provide insights into a user's general wellness, or may be used as positive feedback to individuals using technology to pursue wellness goals.
The present disclosure contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information data will comply with well-established privacy policies and/or privacy practices. In particular, such entities should implement and consistently use privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining personal information data private and secure. Such policies should be easily accessible by users, and should be updated as the collection and/or use of data changes. Personal information from users should be collected for legitimate and reasonable uses of the entity and not shared or sold outside of those legitimate uses. Further, such collection/sharing should occur after receiving the informed consent of the users. Additionally, such entities should consider taking any needed steps for safeguarding and securing access to such personal information data and ensuring that others with access to the personal information data adhere to their privacy policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices. In addition, policies and practices should be adapted for the particular types of personal information data being collected and/or accessed and adapted to applicable laws and standards, including jurisdiction-specific considerations. For instance, in the United States, collection of or access to certain health data may be governed by federal and/or state laws, such as the Health Insurance Portability and Accountability Act (HIPAA), whereas health data in other countries may be subject to other regulations and policies and should be handled accordingly. Hence different privacy practices should be maintained for different personal data types in each country.
Despite the foregoing, the present disclosure also contemplates embodiments in which users selectively block the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data. For example, the present technology can be configured to allow users to select to “opt in” or “opt out” of participation in the collection of personal information data during registration for services or anytime thereafter. In another example, users can select not to provide certain types of user data. In yet another example, users can select to limit the length of time user-specific data is maintained. In addition to providing “opt in” and “opt out” options, the present disclosure contemplates providing notifications relating to the access or use of personal information. For instance, a user may be notified upon downloading an application (“app”) that their personal information data will be accessed and then reminded again just before personal information data is accessed by the app.
Moreover, it is the intent of the present disclosure that personal information data should be managed and handled in a way to minimize risks of unintentional or unauthorized access or use. Risk can be minimized by limiting the collection of data and deleting data once it is no longer needed. In addition, and when applicable, including in certain health related applications, data de-identification can be used to protect a user's privacy. De-identification may be facilitated, when appropriate, by removing specific identifiers (e.g., date of birth, etc.), controlling the amount or specificity of data stored (e.g., collecting location data at a city level rather than at an address level), controlling how data is stored (e.g., aggregating data across users), and/or other methods.
Therefore, although the present disclosure broadly covers use of information that may include personal information data to implement one or more various disclosed embodiments, the present disclosure also contemplates that the various embodiments can also be implemented without the need for accessing personal information data. That is, the various embodiments of the present technology are not rendered inoperable due to the lack of all or a portion of such personal information data.
The foregoing is merely illustrative and various modifications can be made to the described embodiments. The foregoing embodiments may be implemented individually or in any combination.
Number | Name | Date | Kind |
---|---|---|---|
5933279 | Yamazaki | Aug 1999 | A |
6646811 | Inoguchi | Nov 2003 | B2 |
6980363 | Takagi et al. | Dec 2005 | B1 |
8120717 | Daly | Feb 2012 | B2 |
8982471 | Starner et al. | Mar 2015 | B1 |
9046999 | Teller et al. | Jun 2015 | B1 |
9740282 | McInerny | Aug 2017 | B1 |
9939650 | Smith et al. | Apr 2018 | B2 |
9995936 | Macannuco et al. | Jun 2018 | B1 |
10789777 | Sheikh | Sep 2020 | B1 |
10861240 | Wheelwright et al. | Dec 2020 | B1 |
20040070839 | Yagi et al. | Apr 2004 | A1 |
20040140949 | Takagi | Jul 2004 | A1 |
20050068314 | Aso et al. | Mar 2005 | A1 |
20070285338 | Yanagisawa | Dec 2007 | A1 |
20110169928 | Gassel et al. | Jul 2011 | A1 |
20110194029 | Herrmann et al. | Aug 2011 | A1 |
20140266990 | Makino et al. | Sep 2014 | A1 |
20150067580 | Um et al. | Mar 2015 | A1 |
20150219897 | Mukawa et al. | Aug 2015 | A1 |
20150253573 | Sako et al. | Sep 2015 | A1 |
20160054565 | Izumihara | Feb 2016 | A1 |
20160217621 | Raghoebardajal et al. | Jul 2016 | A1 |
20160313790 | Clement et al. | Oct 2016 | A1 |
20160327795 | Jarvenpaa | Nov 2016 | A1 |
20160341959 | Gibbs et al. | Nov 2016 | A1 |
20170212352 | Cobb et al. | Jul 2017 | A1 |
20170255015 | Geng et al. | Sep 2017 | A1 |
20180004478 | Chen | Jan 2018 | A1 |
20180096533 | Osman et al. | Apr 2018 | A1 |
20180101020 | Gollier et al. | Apr 2018 | A1 |
20180284539 | Zha | Oct 2018 | A1 |
20190286406 | Chen | Sep 2019 | A1 |
20200012107 | Greenberg | Jan 2020 | A1 |
20210231259 | Ma | Jul 2021 | A1 |
Entry |
---|
Gugenheimer et al., “FaceDisplay: Towards Asymmetric Multi-User Interaction for Nomadic Virtual Reality”, CHI 2018 (Apr. 21-26, 2018), Montreal, QC, Canada. |
Mai et al., “TransparentHMD: Revealing the HMD User's Face to Bystanders”, 16th International Conference on Mobile and Ubiquitous Multimedia (MUM 2017), Nov. 26-29, 2017, Stuttgart, Germany. |
Pohl et al., “See what I see: concepts to improve the social acceptance of HMDs,” IEEE VR 2016, 2 pages (Mar. 19-23, 2016). |
Number | Date | Country
---|---|---
62732993 | Sep 2018 | US
Relation | Number | Date | Country
---|---|---|---
Parent | 16460272 | Jul 2019 | US
Child | 17539921 | | US