This relates generally to headbands, and, more particularly, to adjustable headbands for electronic devices.
Electronic devices such as head-mounted devices may have displays for displaying images. The displays may be housed in a head-mounted support structure.
A head-mounted device may have a head-mounted housing containing rear-facing displays that display images for a user when the head-mounted housing is worn by the user. The head-mounted housing may be coupled to the user's head using an upper headband and a lower headband coupled to the head-mounted housing using a support member.
The upper headband and/or the lower headband may rotate relative to the support member to adjust the position of the headband on the user's head. For example, the adjustable headband(s) may be movable between a front portion of the user's head and a rear portion of the user's head.
The adjustable headband(s) may rotate about a friction hinge, in a recess of the support member, or about a rotating member of the support member. The adjustable headband(s) may lock into multiple positions as the adjustable headband(s) are rotated relative to the support member.
Additionally or alternatively, the adjustable headband(s) may be coupled to an attachment on the support member that can slide along a portion of the support member. The adjustable headband(s) may be further moved on the user's head by sliding the attachment on the support member.
The adjustable headband may include a rigid portion that is coupled to the support member and a fabric portion that extends from the rigid portion. The fabric portion may include stiffeners embedded in the fabric.
Head-mounted devices include head-mounted support structures that allow the devices to be worn on the heads of users. The head-mounted support structures may include device housings for housing components such as displays that are used for presenting a user with visual content. The head-mounted support structures for a head-mounted device may also include headbands and other structures that help hold a device housing on the face of a user. The headbands of a head-mounted device may be adjustable.
In some embodiments, it may be desirable to incorporate headbands that can contact the user's head in multiple locations while the head-mounted device is worn. The headbands may include a lower headband that is in contact with a portion of the back of the user's head and an upper headband. The upper and/or lower headbands may be adjustable. For example, the upper and/or lower headbands may pivot so that the upper and lower headbands can contact different portions of the user's head. In an illustrative example, the lower headband may contact a lower rear portion of the user's head, while the upper headband is movable between a front portion of the user's head (e.g., the user's forehead) and an upper rear portion of the user's head. To allow the upper and/or lower headbands to be adjusted in this way, the headbands may be attached to a head-mounted support structure via a pivoting structure, such as a friction hinge.
The friction hinge and/or the adjustable headband(s) may include tooth features or other features to provide mechanical haptics as the headband(s) are adjusted. Alternatively or additionally, the friction hinge and/or the adjustable headband(s) may include an encoder. The head-mounted device may receive signals from the encoder that reflect the position of the adjustable headband(s). In this way, the position of the adjustable headband(s) may be known when the adjustments are made.
The adjustable headband(s) may also be movable closer and farther from the front of the user's head. In particular, the pivot point around which the adjustable headband(s) rotate may slide forward and backward within the head-mounted support structure. By allowing for the adjustment of one or more headbands, the head-mounted device may be supported against the face of the user, while providing adjustments for user comfort and/or accommodations.
To present a user with images for viewing from eye boxes (e.g., eye boxes in which the user's eyes are located when device 10 is being worn on the users' head such as head 22 of
If desired, housing 12 may have forward-facing components such as cameras, other sensors, and/or a display on front F for gathering sensor measurements/other input and/or displaying information on front F. Housing 12 may have a soft cushion on an opposing rear side of housing 12. The rear of housing 12 may have openings that allow the user to view images from the left and right optical systems (e.g., when the rear of housing 12 is resting on front surface 20F of the user's head 22).
Device 10 may have headbands 33 and 34, and may have other structures (e.g., an optional over-the-head strap) to help hold housing 12 on head 22. Headbands 33 and 34 may have first and second ends coupled, respectively, to the left and right sides of housing 12. In the example of
Headbands 33 and 34 may have a soft flexible portion such as central portion 30. Portions 30 may be formed between two stiffer portions such as end portions 28 on the left and right ends of headbands 33 and 34. Portions 28 may be stiffened using embedded polymer stiffeners (e.g., single-layer or multilayer polymer stiffening strips) and/or other stiffening members.
Portions 30 may be formed from a stretchable material such as stretchy fabric. Portions 30 may, as an example, be formed from a band of flat knit fabric that includes stretchable strands of material (e.g., elastomeric strands) and/or which uses a stretchable fabric construction (e.g., a stretchable knit construction). Alternatively, portions 30 may be formed from a band of woven fabric, which may include stretchable strands of material and/or may use a stretchable fabric construction. Narrowed end portions of the band of knit fabric may, if desired, extend over stiffening members in end portions 28 (e.g., to ensure that headbands 33 and 34 have a uniform external appearance).
The stretchability of headband portions 30 (and therefore headbands 33 and 34) allows headbands 33 and 34 to be stretched along their lengths. This allows the length of headbands 33 and 34 to be temporarily increased to help a user to place headbands 33 and 34 over the user's head when a user is donning device 10. When headbands 33 and 34 are released, the stretchiness and elastic nature of portions 30 of headbands 33 and 34 will help shorten headbands 33 and 34 and pull headbands 33 and 34 against the user's head.
Although not shown in
In the illustrative example of
Although headbands 33 and 34 may support device 10 against the face of the user when in the positions shown in
As shown in
Alternatively, headband 33 may be moved to a top portion of head 22, as shown by location 33′. For example, headband 33 may pivot around protrusions 24P (or otherwise pivot relative to members 24) from rear 20R to the top of head 22.
The illustrative positions of headband 33 shown in
In the examples of
As shown in
Headband 33 may be coupled to headband 34 using friction hinge 37. In particular, friction hinge portion 36 may be coupled to headband 34, such as by molding friction hinge portion 36 to headband 34, adhesively attaching friction hinge portion 36 to headband 34, or otherwise attaching friction hinge portion 36 to headband 34. Similarly, friction hinge portion 38 may be coupled to headband 33, such as by molding friction hinge portion 38 to headband 33, adhesively attaching friction hinge portion 38 to headband 33, or otherwise attaching friction hinge portion 38 to headband 33. Friction hinge portions 36 and 38 may be formed from metal, plastic, and/or other suitable material(s).
Friction hinge portion 38 may be removably attached to friction hinge portion 36. For example, hinge portion 38 may be friction fit into friction hinge portion 36. In other words, the opening in friction hinge portion 36 may be just wide enough to accommodate friction hinge portion 38, so that when friction hinge portion 38 is pushed into the opening, friction hinge portion 38 remains removably attached to friction hinge portion 36. Alternatively or additionally, friction hinge portion 36 may have recesses and friction hinge portion 38 may have protrusions that engage with the recesses in friction hinge portion 36. In general, however, friction hinge portions 36 and 38 may be removably attached to each other in any suitable manner.
To adjust the position of headband 33 relative to member 24 (e.g., as shown in
Although not shown in
If desired, friction hinge 57 may include tooth features or other features that provide mechanical haptics. The tooth features may provide locations at which an adjustable headband may lock into place (e.g., require additional force to move further) with respect to member 24. An illustrative example of friction hinge portions with such features is shown in
As shown in
By including features 56 and 58 on friction hinge portions 26 and 28, friction hinge 27 may have discrete locations at which friction hinge 27 will lock into place as headband 33 is rotated relative to member 24. The locations may correspond to the locations of headband 33 in
The example of forming locking locations using teeth/protrusions on corresponding friction hinge portions is merely illustrative. In general, an adjustable headband may lock into place using any suitable mechanism. For example, a screw may be used to tighten the hinge around which the adjustable headband rotates to lock it into place, a secondary member may be clipped onto the hinge to lock the adjustable headband into place, or any other suitable mechanism may be used to lock the adjustable headband into place.
Although
In some embodiments, features 56 and/or 58 may alternatively or additionally include an encoder that is electrically connected to device 10. By including an encoder in friction hinge 27, device 10 may determine the position of the adjustable headband on the user's head. Device 10 may then alert a user to adjust the position of the headband. For example, device 10 may determine that there is an inadequate seal to the user's face, and may suggest adjusting the headband to a specific position to remedy the issue. Alternatively, device 10 may suggest adjusting the headband to provide a tighter or looser fit based on the information being displayed by device 10. For example, if a movie or video game is being displayed by device 10, device 10 may recommend tightening the headband. However, if other information, such as a webpage, is being displayed by device 10, device 10 may recommend loosening the headband to improve user comfort. However, these examples are merely illustrative. In general, device 10 may provide any suitable recommendation to a user regarding the position and/or fit of the adjustable headband.
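The encoder-based recommendation behavior described above can be sketched in software. The following is a minimal, purely illustrative sketch: the function names, angle-conversion factor, content categories, and suggestion strings are assumptions introduced for illustration and are not part of this disclosure.

```python
# Hypothetical sketch of the headband-fit recommendation logic described
# above. All names, thresholds, and content categories are illustrative
# assumptions, not actual device firmware.

def read_headband_angle(encoder_counts, counts_per_degree=4):
    """Convert raw encoder counts from the friction hinge to degrees."""
    return encoder_counts / counts_per_degree

def recommend_fit(angle_degrees, content_type, seal_ok=True):
    """Return a fit suggestion based on headband position, the type of
    content being displayed, and whether the device seals to the face."""
    if not seal_ok:
        # Inadequate seal: suggest moving the headband to a specific position.
        return "Inadequate seal detected: rotate the headband toward the forehead."
    if content_type in ("movie", "game"):
        # Immersive content: recommend a tighter fit.
        return "Consider tightening the headband for a more secure fit."
    if content_type in ("webpage", "reading"):
        # Casual content: recommend a looser fit for comfort.
        return "Consider loosening the headband for comfort."
    return "Headband position OK (angle: %.1f degrees)." % angle_degrees

print(recommend_fit(read_headband_angle(180), "movie"))
```

In practice, such logic would run on the device's control circuitry, with the encoder signal sampled as the user rotates the headband about the friction hinge.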
Moreover, although headband 33 is shown as attaching to headband 34 in
As shown in
Member 24 may include optional recess 24P′ that may receive a post from another headband, such as headband 34, to removably couple the headband to member 24. Alternatively, headband 34 may be coupled to a lower surface of headband 33 (e.g., the opposite surface from friction hinge portion 38). If desired, a friction hinge may be formed between headband 33 and headband 34 to allow headband 34 to rotate/pivot relative to headband 33 and member 24.
Although
As shown in
Although not shown in
As an alternative to the friction hinges of
As shown in
Member 24 may have opening or recess 44. Headband 33 may be coupled to member 24 within opening or recess 44. For example, headband 33 may be coupled to a pivoting member within opening/recess 44 using a latch, snap, clasp, or other suitable mechanism. Headband 33 may pivot within recess 44 along direction 46 about the pivoting member in recess 44. In particular, as shown in
Although not shown in
As another example, an upper adjustable headband may be coupled between a housing support member and a lower headband. An illustrative example is shown in
As shown in
Headband 33 may have protrusion 33P, which may have the same or different design as protrusion 24P. For example, protrusion 33P may be a rectangular protrusion with rounded edges (or any other suitable shape) and may have one or more indentations or other features to which a latch in headband 34 may be attached. Headband 34 may rotate/pivot relative to headband 33, if desired, such as by incorporating a rotating member on which protrusion 33P is mounted (as in
The arrangement of
Instead of, or in addition to, allowing one or more headbands to rotate to different portions of a user's head, the headbands may be coupled to a support member portion with an adjustable position. An illustrative example is shown in
As shown in
Additionally, the position of attachment 49 (e.g., the position of attachment of headbands 33 and 34 on member 24) may be adjusted. In particular, member 24 may have opening 48 in which attachment 49 is mounted. Opening 48 may be, for example, an opening with a recessed portion, and attachment 49 may have protrusions that extend into the recessed portion to keep attachment 49 within opening 48. In an illustrative example, attachment 49 may include a circular metal member, such as member 40 of
Attachment 49 may be slidably moved in directions 50. For example, attachment 49 may be moved between its original position (at the rightmost edge of opening 48 in
Although not shown in
Regardless of the attachment and/or rotational mechanism with which an adjustable headband is attached to a housing member, the adjustable headband may include reinforced portions. An illustrative example of an adjustable headband is shown in
As shown in
Portion 30 may be a soft portion, such as a woven or knit portion formed from fabric 50. If desired, stiffeners 52 may optionally be included in fabric 50. Stiffeners 52 may be formed from a cord, such as a braided cord, or a flexible strip of polymer (e.g., an elastomer such as thermoplastic polyurethane). Stiffeners 52 may be sufficiently flexible to permit the headband to bend and twist, but may not stretch substantially along their lengths and may therefore sometimes be referred to as non-stretchable stiffeners, non-stretchable members, non-stretchable stiffening structures, etc. Stiffeners 52 may be significantly less stretchy and less soft than fabric 50 and may serve to increase the stiffness and decrease (or eliminate) stretchiness at desired portions along headband 33. At the same time, the flexibility of stiffeners 52 may allow headband 33 to bend around the curvature of a user's head. Stiffeners 52 may be inserted into selected portions of headband 33 to selectively stiffen headband 33 at desired locations along its length, if desired.
However, the use of fabric 50 to form portion 30 of headband 33 is merely illustrative. Portion 30 may be formed from a rigid material, such as plastic, metal, or other material, and may include a cushion or pad on the rigid material to rest on the user's head and provide additional comfort. By forming portion 30 from rigid material (or by incorporating stiffeners 52 into fabric 50), it may be easier to rotate/pivot or otherwise adjust headband 33 relative to a housing support member.
In addition to providing comfort and adjustability when a head-mounted device is worn, the use of one or more adjustable headbands may allow for more compact storage and easier donning of the device and headbands. An illustrative example is shown in
As shown in
Additionally, by adjusting headbands 33 and 34 to be against device 10, a user may don device 10 by first putting their face against device 10 and then rotating headbands 33 and 34 over their head and into their desired positions (e.g., the positions in
As described above, one aspect of the present technology is the gathering and use of information such as information from input-output devices. The present disclosure contemplates that in some instances, data may be gathered that includes personal information data that uniquely identifies or can be used to contact or locate a specific person. Such personal information data can include demographic data, location-based data, telephone numbers, email addresses, Twitter IDs, home addresses, data or records relating to a user's health or level of fitness (e.g., vital signs measurements, medication information, exercise information), date of birth, username, password, biometric information, or any other identifying or personal information.
The present disclosure recognizes that the use of such personal information, in the present technology, can be used to the benefit of users. For example, the personal information data can be used to deliver targeted content that is of greater interest to the user. Accordingly, use of such personal information data enables users to have control of the delivered content. Further, other uses for personal information data that benefit the user are also contemplated by the present disclosure. For instance, health and fitness data may be used to provide insights into a user's general wellness, or may be used as positive feedback to individuals using technology to pursue wellness goals.
The present disclosure contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information data will comply with well-established privacy policies and/or privacy practices. In particular, such entities should implement and consistently use privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining personal information data private and secure. Such policies should be easily accessible by users, and should be updated as the collection and/or use of data changes. Personal information from users should be collected for legitimate and reasonable uses of the entity and not shared or sold outside of those legitimate uses. Further, such collection/sharing should occur after receiving the informed consent of the users. Additionally, such entities should consider taking any needed steps for safeguarding and securing access to such personal information data and ensuring that others with access to the personal information data adhere to their privacy policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices. In addition, policies and practices should be adapted for the particular types of personal information data being collected and/or accessed and adapted to applicable laws and standards, including jurisdiction-specific considerations. For instance, in the United States, collection of or access to certain health data may be governed by federal and/or state laws, such as the Health Insurance Portability and Accountability Act (HIPAA), whereas health data in other countries may be subject to other regulations and policies and should be handled accordingly. Hence different privacy practices should be maintained for different personal data types in each country.
Despite the foregoing, the present disclosure also contemplates embodiments in which users selectively block the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data. For example, the present technology can be configured to allow users to select to “opt in” or “opt out” of participation in the collection of personal information data during registration for services or anytime thereafter. In another example, users can select not to provide certain types of user data. In yet another example, users can select to limit the length of time user-specific data is maintained. In addition to providing “opt in” and “opt out” options, the present disclosure contemplates providing notifications relating to the access or use of personal information. For instance, a user may be notified upon downloading an application (“app”) that their personal information data will be accessed and then reminded again just before personal information data is accessed by the app.
Moreover, it is the intent of the present disclosure that personal information data should be managed and handled in a way to minimize risks of unintentional or unauthorized access or use. Risk can be minimized by limiting the collection of data and deleting data once it is no longer needed. In addition, and when applicable, including in certain health related applications, data de-identification can be used to protect a user's privacy. De-identification may be facilitated, when appropriate, by removing specific identifiers (e.g., date of birth, etc.), controlling the amount or specificity of data stored (e.g., collecting location data at a city level rather than at an address level), controlling how data is stored (e.g., aggregating data across users), and/or other methods.
Therefore, although the present disclosure broadly covers use of information that may include personal information data to implement one or more various disclosed embodiments, the present disclosure also contemplates that the various embodiments can also be implemented without the need for accessing personal information data. That is, the various embodiments of the present technology are not rendered inoperable due to the lack of all or a portion of such personal information data.
Physical environment: A physical environment refers to a physical world that people can sense and/or interact with without aid of electronic systems. Physical environments, such as a physical park, include physical articles, such as physical trees, physical buildings, and physical people. People can directly sense and/or interact with the physical environment, such as through sight, touch, hearing, taste, and smell.
Computer-generated reality: In contrast, a computer-generated reality (CGR) environment refers to a wholly or partially simulated environment that people sense and/or interact with via an electronic system. In CGR, a subset of a person's physical motions, or representations thereof, are tracked, and, in response, one or more characteristics of one or more virtual objects simulated in the CGR environment are adjusted in a manner that comports with at least one law of physics. For example, a CGR system may detect a person's head turning and, in response, adjust graphical content and an acoustic field presented to the person in a manner similar to how such views and sounds would change in a physical environment. In some situations (e.g., for accessibility reasons), adjustments to characteristic(s) of virtual object(s) in a CGR environment may be made in response to representations of physical motions (e.g., vocal commands). A person may sense and/or interact with a CGR object using any one of their senses, including sight, sound, touch, taste, and smell. For example, a person may sense and/or interact with audio objects that create a 3D or spatial audio environment that provides the perception of point audio sources in 3D space. In another example, audio objects may enable audio transparency, which selectively incorporates ambient sounds from the physical environment with or without computer-generated audio. In some CGR environments, a person may sense and/or interact only with audio objects. Examples of CGR include virtual reality and mixed reality.
Virtual reality: A virtual reality (VR) environment refers to a simulated environment that is designed to be based entirely on computer-generated sensory inputs for one or more senses. A VR environment comprises a plurality of virtual objects with which a person may sense and/or interact. For example, computer-generated imagery of trees, buildings, and avatars representing people are examples of virtual objects. A person may sense and/or interact with virtual objects in the VR environment through a simulation of the person's presence within the computer-generated environment, and/or through a simulation of a subset of the person's physical movements within the computer-generated environment.
Mixed reality: In contrast to a VR environment, which is designed to be based entirely on computer-generated sensory inputs, a mixed reality (MR) environment refers to a simulated environment that is designed to incorporate sensory inputs from the physical environment, or a representation thereof, in addition to including computer-generated sensory inputs (e.g., virtual objects). On a virtuality continuum, a mixed reality environment is anywhere between, but not including, a wholly physical environment at one end and a virtual reality environment at the other end. In some MR environments, computer-generated sensory inputs may respond to changes in sensory inputs from the physical environment. Also, some electronic systems for presenting an MR environment may track location and/or orientation with respect to the physical environment to enable virtual objects to interact with real objects (that is, physical articles from the physical environment or representations thereof). For example, a system may account for movements so that a virtual tree appears stationary with respect to the physical ground. Examples of mixed realities include augmented reality and augmented virtuality.

Augmented reality: An augmented reality (AR) environment refers to a simulated environment in which one or more virtual objects are superimposed over a physical environment, or a representation thereof. For example, an electronic system for presenting an AR environment may have a transparent or translucent display through which a person may directly view the physical environment. The system may be configured to present virtual objects on the transparent or translucent display, so that a person, using the system, perceives the virtual objects superimposed over the physical environment. Alternatively, a system may have an opaque display and one or more imaging sensors that capture images or video of the physical environment, which are representations of the physical environment.
The system composites the images or video with virtual objects, and presents the composition on the opaque display. A person, using the system, indirectly views the physical environment by way of the images or video of the physical environment, and perceives the virtual objects superimposed over the physical environment. As used herein, a video of the physical environment shown on an opaque display is called “pass-through video,” meaning a system uses one or more image sensor(s) to capture images of the physical environment, and uses those images in presenting the AR environment on the opaque display. Further alternatively, a system may have a projection system that projects virtual objects into the physical environment, for example, as a hologram or on a physical surface, so that a person, using the system, perceives the virtual objects superimposed over the physical environment. An augmented reality environment also refers to a simulated environment in which a representation of a physical environment is transformed by computer-generated sensory information. For example, in providing pass-through video, a system may transform one or more sensor images to impose a select perspective (e.g., viewpoint) different than the perspective captured by the imaging sensors. As another example, a representation of a physical environment may be transformed by graphically modifying (e.g., enlarging) portions thereof, such that the modified portion may be representative but not photorealistic versions of the originally captured images. As a further example, a representation of a physical environment may be transformed by graphically eliminating or obfuscating portions thereof.

Augmented virtuality: An augmented virtuality (AV) environment refers to a simulated environment in which a virtual or computer-generated environment incorporates one or more sensory inputs from the physical environment.
The sensory inputs may be representations of one or more characteristics of the physical environment. For example, an AV park may have virtual trees and virtual buildings, but people with faces photorealistically reproduced from images taken of physical people. As another example, a virtual object may adopt a shape or color of a physical article imaged by one or more imaging sensors. As a further example, a virtual object may adopt shadows consistent with the position of the sun in the physical environment.
Hardware: There are many different types of electronic systems that enable a person to sense and/or interact with various CGR environments. Examples include head mounted systems, projection-based systems, heads-up displays (HUDs), vehicle windshields having integrated display capability, windows having integrated display capability, displays formed as lenses designed to be placed on a person's eyes (e.g., similar to contact lenses), headphones/earphones, speaker arrays, input systems (e.g., wearable or handheld controllers with or without haptic feedback), smartphones, tablets, and desktop/laptop computers. A head mounted system may have one or more speaker(s) and an integrated opaque display. Alternatively, a head mounted system may be configured to accept an external opaque display (e.g., a smartphone). The head mounted system may incorporate one or more imaging sensors to capture images or video of the physical environment, and/or one or more microphones to capture audio of the physical environment. Rather than an opaque display, a head mounted system may have a transparent or translucent display. The transparent or translucent display may have a medium through which light representative of images is directed to a person's eyes. The display may utilize digital light projection, OLEDs, LEDs, uLEDs, liquid crystal on silicon, laser scanning light sources, or any combination of these technologies. The medium may be an optical waveguide, a hologram medium, an optical combiner, an optical reflector, or any combination thereof. In one embodiment, the transparent or translucent display may be configured to become opaque selectively. Projection-based systems may employ retinal projection technology that projects graphical images onto a person's retina. Projection systems also may be configured to project virtual objects into the physical environment, for example, as a hologram or on a physical surface.
The foregoing is merely illustrative and various modifications can be made to the described embodiments. The foregoing embodiments may be implemented individually or in any combination.