This relates generally to electronic devices and, more particularly, to wearable electronic device systems.
Electronic devices are sometimes configured to be worn by users. For example, head-mounted devices are provided with head-mounted structures that allow the devices to be worn on users' heads. The head-mounted devices may include optical systems with lenses. The lenses allow displays in the devices to present visual content to users.
Users have faces of different shapes and sizes. This can pose challenges when a head-mounted device is to be used by multiple users. If care is not taken, a head-mounted device may not fit well for certain users.
A head-mounted device may have a display that displays content for a user. Head-mounted support structures in the device support the display on the head of the user.
The head-mounted device may have lenses in lens assemblies (also referred to as lens modules herein). A left positioner may be used to position a left lens assembly. A right positioner may be used to position a right lens assembly. The left and right lens modules may have respective left and right lenses and respective left and right portions of a display.
To accommodate users with different interpupillary distances, the left and right lens assemblies may be moved towards or away from each other. A user may supply the user's interpupillary distance to the head-mounted device, an image sensor or other device may be used to measure the interpupillary distance and provide it to the head-mounted device, and/or gaze tracking sensors in the head-mounted device may measure the interpupillary distance of the user while the head-mounted device is being worn on the head of the user. Other sensing arrangements may be used to measure lens assembly positions relative to the user's nose, if desired.
To avoid excessive pressure on a user's nose, sensing circuitry such as force sensing circuitry may be used to detect the pressure applied to the user's nose by the left and right lens assemblies. Control circuitry may stop the positioners from moving the lens assemblies closer to the user's nose in response to determining that a force threshold has been reached, and may reposition the lens assemblies further from the nose, if desired. In this way, the lens assemblies may be adjusted to match the user's interpupillary distance or to get as close as possible to the user's interpupillary distance without applying too much force to the user's nose.
Electronic devices may include displays and other components for presenting content to users. The electronic devices may be wearable electronic devices. A wearable electronic device such as a head-mounted device may have head-mounted support structures that allow the head-mounted device to be worn on a user's head.
A head-mounted device may contain a display formed from one or more display panels (displays) for displaying visual content to a user. A lens system may be used to allow the user to focus on the display and view the visual content. The lens system may have a left lens that is aligned with a user's left eye and a right lens that is aligned with a user's right eye.
Not all users have eyes that are separated by the same interpupillary distance. To ensure that a wide range of users are able to comfortably view content on the display, the head-mounted device may be provided with lens positioners. The lens positioners may be used in adjusting the lens-to-lens spacing between the left and right lenses to match the interpupillary distance of the user.
To prevent excessive pressure on the surface of the user's nose, force sensors can be used to determine how much pressure is applied to the user's nose by the lenses as the lens-to-lens spacing is changed. Control circuitry in the head-mounted device may adjust the left and right lenses to match the user's interpupillary distance, unless the lenses apply too much pressure to the user's nose (e.g., the pressure measured by the force sensors exceeds a threshold). In some situations, the left and right lenses may be spaced so that the lens-to-lens spacing between the left and right lenses matches the user's interpupillary distance. In other situations, the lens-to-lens spacing between the left and right lenses will be slightly larger than the user's interpupillary distance to ensure that the lenses do not press excessively against the user's nose. Sensor circuitry such as force sensing circuitry may be used to provide the control circuitry with real-time feedback on the pressure applied by the lenses to the user's nose, thereby ensuring that the positions of the left and right lenses are adjusted satisfactorily.
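The following Python sketch is merely illustrative of this type of closed-loop adjustment; the helper callables (e.g., read_spacing_mm, read_nose_force_n) and the numeric limits are hypothetical placeholders rather than details of the disclosed control circuitry.

```python
# Merely illustrative sketch of the force-limited spacing adjustment.
# The helper callables and numeric limits are hypothetical assumptions.

FORCE_THRESHOLD_N = 0.5  # assumed comfort limit on nose force, in newtons
STEP_MM = 0.1            # assumed positioner step size, in millimeters

def adjust_lens_spacing(target_spacing_mm, read_spacing_mm,
                        read_nose_force_n, step_spacing_mm):
    """Reduce lens-to-lens spacing toward the user's IPD, stopping early
    if the measured nose force reaches the comfort threshold."""
    while read_spacing_mm() > target_spacing_mm:
        if read_nose_force_n() >= FORCE_THRESHOLD_N:
            break  # stop: spacing stays slightly larger than the IPD
        step_spacing_mm(-STEP_MM)  # move left and right modules together
    return read_spacing_mm()
```

In practice, helper routines of this type would be supplied by the control circuitry, positioners, and force sensing circuitry described below.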
A schematic diagram of an illustrative system having an electronic device with sensor circuitry that ensures satisfactory placement of lenses relative to a user's facial features is shown in
As shown in
During operation, the communications circuitry of the devices in system 8 (e.g., the communications circuitry of control circuitry 12 of device 10) may be used to support communication between the electronic devices. For example, one electronic device may transmit video and/or audio data to another electronic device in system 8. Electronic devices in system 8 may use wired and/or wireless communications circuitry to communicate through one or more communications networks (e.g., the internet, local area networks, etc.). The communications circuitry may be used to allow data to be received by device 10 from external equipment (e.g., a tethered computer, a portable device such as a handheld device or laptop computer, online computing equipment such as a remote server or other remote computing equipment, or other electrical equipment) and/or to provide data to external equipment.
Device 10 may include input-output devices 22. Input-output devices 22 may be used to allow a user to provide device 10 with user input. Input-output devices 22 may also be used to gather information on the environment in which device 10 is operating. Output components in devices 22 may allow device 10 to provide a user with output and may be used to communicate with external electrical equipment.
As shown in
Display 14 may be used to display images. The visual content that is displayed on display 14 may be viewed by a user of device 10. Displays in device 10 such as display 14 may be organic light-emitting diode displays or other displays based on arrays of light-emitting diodes, liquid crystal displays, liquid-crystal-on-silicon displays, projectors or displays based on projecting light beams on a surface directly or indirectly through specialized optics (e.g., digital micromirror devices), electrophoretic displays, plasma displays, electrowetting displays, microLED displays, or any other suitable displays.
Display 14 may present computer-generated content such as virtual reality content and mixed reality content to a user. Virtual reality content may be displayed in the absence of real-world content. Mixed reality content, which may sometimes be referred to as augmented reality content, may include computer-generated images that are overlaid on real-world images. The real-world images may be captured by a camera (e.g., a forward-facing camera) and merged with overlaid computer-generated content or an optical coupling system may be used to allow computer-generated content to be overlaid on top of real-world images. As an example, a pair of mixed reality glasses or other augmented reality head-mounted display may include a display device that provides images to a user through a beam splitter, prism, holographic coupler, or other optical coupler. Configurations in which display 14 is used to display virtual reality content to a user through lenses are described herein as an example.
Input-output devices 22 may include sensors 16. Sensors 16 may include, for example, three-dimensional sensors (e.g., three-dimensional image sensors such as structured light sensors that emit beams of light and that use two-dimensional digital image sensors to gather image data for three-dimensional images from light spots that are produced when a target is illuminated by the beams of light, binocular three-dimensional image sensors that gather three-dimensional images using two or more cameras in a binocular imaging arrangement, three-dimensional lidar (light detection and ranging) sensors, three-dimensional radio-frequency sensors, or other sensors that gather three-dimensional image data), cameras (e.g., infrared and/or visible digital image sensors), gaze tracking sensors (e.g., a gaze tracking system based on an image sensor and, if desired, a light source that emits one or more beams of light that are tracked using the image sensor after reflecting from a user's eyes), touch sensors, buttons, force sensors, sensors such as contact sensors based on switches, gas sensors, pressure sensors, moisture sensors, magnetic sensors, audio sensors (microphones), ambient light sensors, microphones for gathering voice commands and other audio input, sensors that are configured to gather information on motion, position, and/or orientation (e.g., accelerometers, gyroscopes, compasses, and/or inertial measurement units that include all of these sensors or a subset of one or two of these sensors), fingerprint sensors and other biometric sensors, optical position sensors (optical encoders), and/or other position sensors such as linear position sensors, and/or other sensors. As shown in
User input and other information may be gathered using sensors and other input devices in input-output devices 22. If desired, input-output devices 22 may include other devices 24 such as haptic output devices (e.g., vibrating components), light-emitting diodes and other light sources, speakers such as ear speakers for producing audio output, and other electrical components. Device 10 may include circuits for receiving wireless power, circuits for transmitting power wirelessly to other devices, batteries and other energy storage devices (e.g., capacitors), joysticks, buttons, and/or other components.
Electronic device 10 may have housing structures (e.g., housing walls, straps, etc.), as shown by illustrative support structures 26 of
Display 14 may include left and right display panels (e.g., left and right pixel arrays, sometimes referred to as left and right displays or left and right display portions) that are mounted respectively in left and right display modules 70 corresponding respectively to a user's left eye (and left eye box 60) and right eye (and right eye box). Modules 70, which may sometimes be referred to as lens support structures, lens assemblies, lens housings, or lens and display housings, may be individually positioned relative to the housing wall structures of main unit 26-2 and relative to the user's eyes using positioning circuitry such as respective left and right positioners 58. Positioners 58 may include stepper motors, piezoelectric actuators, motors, linear electromagnetic actuators, and/or other electronic components for adjusting lens assembly positions. Positioners 58 may be controlled by control circuitry 12 during operation of device 10. For example, positioners 58 may be used to adjust the spacing between modules 70 (and therefore the lens-to-lens spacing between the left and right lenses of modules 70) to match the interpupillary distance (IPD) of a user's eyes. This allows the user to view the left and right display portions of display 14 in the left and right lens modules. In some cases, however, the lenses may apply excess pressure to the user's nose if adjusted to the user's IPD. Therefore, sensors may be incorporated into device 10 to monitor the pressure on the user's nose. An illustrative arrangement that includes force sensors to monitor the pressure applied to the user's nose is shown in
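As a merely illustrative aid, the IPD-matching adjustment may be thought of as mapping a measured IPD to symmetric left and right module setpoints, as in the following Python sketch; the centerline convention and function names are assumptions for illustration only.

```python
# Merely illustrative: symmetric left/right module setpoints about the
# device centerline so that lens-to-lens spacing equals the IPD.

def module_setpoints_mm(ipd_mm):
    """Return (left, right) lens-center offsets from the device
    centerline, in millimeters."""
    half = ipd_mm / 2.0
    return (-half, +half)

left_mm, right_mm = module_setpoints_mm(63.0)  # 63 mm: a typical adult IPD
```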
As shown in
To ensure that display 14 is viewable by the user when the user's eyes are located in eye boxes 60 (
In scenarios in which the user's nose is small, there may be ample room available to align lens centers LC with eye centers PC. In scenarios in which the user's nose is larger, control circuitry 12 may position modules 70 as shown in
In operation, positioners 58 may move lens modules 70 (
Force sensor 20 may be incorporated between nasal flap 29 and each lens assembly 70. Nasal flap 29 may be a fabric, polymer, or other material component that allows for a comfortable fit for a user of device 10. For example, nasal flap 29 may be interposed between each lens assembly 70 and nose 40. In some embodiments, nasal flap 29 may extend along both sides and over the top of nose 40 (e.g., over at least a portion of a bridge of nose 40). However, this is merely illustrative. Nasal flap 29 may have two separate portions, one between left lens assembly 70 and nose 40 and another between right lens assembly 70 and nose 40, or may be omitted from device 10 if desired.
In embodiments in which nasal flap 29 is included in device 10, force sensors 20 may be incorporated between each lens assembly 70 (i.e., left lens assembly 70 and right lens assembly 70) and nasal flap 29. However, this location of force sensors 20 is merely illustrative. In general, force sensors 20 may be included in any desired location within device 10. For example, force sensors 20 may be formed between nasal flap 29 and nose 40 as shown by position 20′, may be formed within nasal flap 29 or be integral with nasal flap 29 as shown by position 20″, or may be formed within lens assembly 70 as shown by position 20‴.
Although
Regardless of where force sensors 20 are formed within device 10, force sensors 20 may monitor the amount of force applied to nose 40 by lens modules 70 to ensure that excess pressure is not applied to nose 40. Force sensors 20 may continuously monitor the applied force, or may flag control circuitry 12 when the applied force surpasses a threshold. Force sensors 20 may be any desired type of sensor that monitors the amount of force/pressure applied to nose 40. Some examples of force sensors 20 are shown in
In some examples, force sensor(s) 20 may be direct force sensors, such as force sensor 20 of
In other examples, force sensor(s) 20 may be formed from conductive strands, yarn, or fibers and incorporated into a fabric in device 10. For example, force sensor(s) 20 may be formed from smart fabrics or other fabrics that include conductive strands. The conductive strands may be arranged to form a force sensor. An example of this arrangement is shown in
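Reading such a fabric sensor may be sketched as follows in merely illustrative Python, under the assumption of a resistive sensing mechanism whose resistance falls as the conductive strands are compressed; the calibration table and helper name are hypothetical.

```python
# Merely illustrative: interpolate force from fabric resistance, assuming
# a resistive-type sensor. Calibration pairs are hypothetical values,
# listed as (resistance in ohms, force in newtons), descending resistance.

import bisect

CALIBRATION = [(10_000.0, 0.0), (6_000.0, 0.2), (3_000.0, 0.5), (1_500.0, 1.0)]

def fabric_force_n(resistance_ohms):
    """Linearly interpolate force from a measured fabric resistance."""
    ascending = [-r for r, _ in CALIBRATION]  # negate so list ascends
    i = bisect.bisect_left(ascending, -resistance_ohms)
    if i == 0:
        return CALIBRATION[0][1]
    if i >= len(CALIBRATION):
        return CALIBRATION[-1][1]
    (r0, f0), (r1, f1) = CALIBRATION[i - 1], CALIBRATION[i]
    t = (r0 - resistance_ohms) / (r0 - r1)
    return f0 + t * (f1 - f0)
```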
As shown in
Although conductive strands 28 are shown as straight strands in
Non-conductive strands 33 may be interspersed between at least some of conductive strands 28, if desired. As shown in
Fabric 31 may form some or all of nasal flap 29, or may otherwise cover nasal flap 29. Because nasal flap 29 is between nose 40 and lens assembly 70 when device 10 is worn by a user (
Alternatively or additionally, fabric 31 may be otherwise incorporated into device 10. For example, fabric 31 may form a curtain fabric that is incorporated into device 10 between support structure 26-2 and lens modules 70 (
In other examples, force sensor(s) 20 may be incorporated into an interior region of nasal flap 29. An example of an arrangement in which a force sensor is located in the interior of nasal flap 29 is shown in
As shown in
Although
As an alternative to the sensors shown in
As shown in
Although the previous embodiments have included a dedicated force sensor 20 to ensure that excessive force is not applied to a user's nose, this is merely illustrative. Device 10 may use other sensors (such as sensors 16) to determine if the force applied to nose 40 exceeds a threshold. An example of this arrangement is shown in
As shown in
When assembly 70 moves toward nasal flap 29 (and therefore nose 40), it will eventually contact and push against nasal flap 29 and nose 40, causing flexible members 34 to be moved in direction 38. If flexible members 34 move far enough (i.e., the amount of force applied to nose 40 meets or exceeds a threshold), flexible members 34 may block light-emitting component 36. As a result, light-detecting component 42 may stop detecting light emitted by light-emitting component 36. Based on the changed signal of light-detecting component 42, control circuitry 12 may stop positioners 58 from moving lens assembly 70 further toward the user's nose.
Although flexible members 34 are shown as covering light-emitting component 36, this is merely illustrative. Flexible members 34 may cover light-detecting component 42, or may merely move between light-emitting component 36 and light-detecting component 42. Additionally, any desired number of flexible members 34 may be used.
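A merely illustrative Python sketch of this light-gate check follows; the normalized cutoff level and reader callable are hypothetical placeholders.

```python
# Merely illustrative: the detector reading drops once a flexible member
# blocks the emitter, indicating the force threshold has been reached.

LIGHT_BLOCKED_LEVEL = 0.1  # assumed normalized detector reading at cutoff

def nose_force_threshold_met(read_detector_level):
    """True once the light path between emitter and detector is blocked."""
    return read_detector_level() < LIGHT_BLOCKED_LEVEL
```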
In some embodiments, light-detecting component 42 may be used without intervening flexible members 34 to determine the position of the user's nose. For example, light-detecting component 42 may be a camera (e.g., a camera sensitive to visible light) that determines how close lens assembly 70 is to nose 40. Control circuitry, such as control circuitry 12 (
Alternatively or additionally, a three-dimensional scan may be performed on a user of device 10 to ensure a proper fitting. The three-dimensional scan may be performed by three-dimensional sensors (such as cameras, infrared dot projectors, infrared sensors, etc.) in device 10 or external to device 10. The three-dimensional scan may be used to capture the topology of a user's face. Control circuitry 12 may then use the topology of the user's face to ensure that positioners 58 do not move lens assemblies 70 too close to nose 40 or apply too much force to nose 40. For example, control circuitry 12 may use the topology information to calculate a maximum distance that lens assemblies can move without applying too much force to nose 40 and limit movement to that maximum distance.
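One merely illustrative way to compute such a travel budget from the scanned topology is sketched below in Python; the nose half-width lookup and clearance margin are assumptions for illustration.

```python
# Merely illustrative: travel budget for one lens module, derived from a
# three-dimensional face scan. Inputs are hypothetical scan-derived values.

CLEARANCE_MM = 0.5  # assumed safety margin before contact

def max_inward_travel_mm(module_offset_mm, nose_halfwidth_mm):
    """Distance (mm) a module may move toward the centerline before it
    would press on the nose, given the module's current offset from the
    centerline and the nose half-width at the module's height."""
    return max(0.0, module_offset_mm - nose_halfwidth_mm - CLEARANCE_MM)
```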
Alternatively or additionally, device 10 may include a manual button that a user may press to indicate discomfort from too much force on nose 40. In response to detecting a press of the manual button, positioners 58 may stop moving lens assemblies 70 toward nose 40. If desired, positioners 58 may also reverse to back lens assemblies 70 off of nose 40 in response to the press of the manual button and/or excessive force detection. For example, positioners 58 may move lens modules 70 off of nose 40 by a desired gap (e.g., a gap G of at least 0.1 mm, at least 0.2 mm, at least 1 mm, at least 2 mm, less than 5 mm, or other suitable spacing).
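A merely illustrative Python sketch of this manual-override path follows; the callables are hypothetical placeholders, and the chosen gap is one example value from the range quoted above.

```python
# Merely illustrative: on a discomfort button press, halt the positioners
# and widen the lens-to-lens spacing by an assumed gap G.

BACKOFF_GAP_MM = 0.2  # an example gap G within the range quoted above

def on_discomfort_button(stop_positioners, step_spacing_mm):
    stop_positioners()                # halt movement toward the nose
    step_spacing_mm(+BACKOFF_GAP_MM)  # back the modules off of the nose
```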
If desired, the position of lens modules 70 relative to the corresponding surfaces of nose 40 may be measured using feedback from motors in positioners 58 as lens modules 70 are moved into contact with the surfaces of nose 40. An illustrative control circuit for a positioner such as positioner 58 is shown in
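One merely illustrative way to use motor feedback for this purpose is sketched below in Python, under the assumption that drive current rises sharply when a module meets the resistance of the nose; the current threshold and callables are hypothetical, not details of the disclosed control circuit.

```python
# Merely illustrative: infer nose contact from positioner motor feedback,
# assuming drive current rises when the module meets resistance.

CONTACT_CURRENT_A = 0.15  # assumed current level indicating contact

def find_contact_position_mm(read_motor_current_a, read_position_mm,
                             step_inward_mm):
    """Step a module inward until motor current indicates contact; return
    the position (mm) at which contact occurred."""
    while read_motor_current_a() < CONTACT_CURRENT_A:
        step_inward_mm(0.05)  # small inward step, in millimeters
    return read_position_mm()
```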
Illustrative operations involved in operating device 10 in system 8 are shown in
During the operations of block 100, information on the distance between the user's eyes (interpupillary distance IPD, sometimes referred to as pupillary distance) may be gathered. With one illustrative arrangement, device 10 or other equipment in system 8 gathers the user's interpupillary distance from the user by prompting the user to type the interpupillary distance into a data entry box on display 14 or a display in other equipment in system 8. The user may also supply the user's interpupillary distance using voice input or other user input arrangements. With another illustrative arrangement, a sensor in device 10 or a sensor in a stand-alone computer, portable device, or other equipment in system 8 may measure the user's interpupillary distance. For example, a sensor such as a two-dimensional or three-dimensional image sensor may gather an image of the user's face to measure the value of interpupillary distance IPD. After the measurement of the interpupillary distance has been made, the interpupillary distance may be provided to device 10 (e.g., over a wired or wireless communications path). If desired, gaze trackers may measure the locations of the user's eye centers PC and thereby determine IPD from direct measurement as a user is wearing device 10 on the user's head.
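As a merely illustrative example of the image-based arrangement, IPD may be computed from the pixel distance between detected pupil centers and a millimeter-per-pixel scale factor (e.g., obtained from a depth sensor or a reference feature of known size); the pupil-detection step and scale factor below are assumptions.

```python
# Merely illustrative: IPD from two pupil centers located in a face image.

import math

def ipd_mm(left_pupil_px, right_pupil_px, mm_per_px):
    """Euclidean pixel distance between pupil centers, scaled to mm."""
    dx = right_pupil_px[0] - left_pupil_px[0]
    dy = right_pupil_px[1] - left_pupil_px[1]
    return math.hypot(dx, dy) * mm_per_px

print(ipd_mm((412, 300), (608, 304), 0.32))  # ~62.7 mm with these inputs
```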
After gathering interpupillary distance IPD, control circuitry 12 of device 10 may, during the operations of block 102, use positioners 58 to adjust the lens-to-lens spacing between lens centers LC so that this distance matches interpupillary distance IPD and so that the centers of lenses 72 are aligned with respective eye centers PC. While positioners 58 are moving lens modules 70 and lenses 72 (e.g., while the lens-to-lens spacing is being reduced to move modules 70 towards adjacent surfaces of the user's nose), control circuitry 12 uses force sensing circuitry (e.g., force sensor(s) 20) to monitor the force applied by lens modules 70 on nose 40. In some situations, the user's nose 40 may prevent lenses 72 from being brought sufficiently close to each other to allow the lens-to-lens spacing to exactly match IPD without creating a risk of discomfort for the user.
In other words, force sensor(s) 20 may indicate that too much force is being applied to nose 40 by lens modules 70, or that the force being applied to nose 40 by lens modules 70 has reached a threshold. Alternatively or additionally, device 10 may include a manual button that a user may press to indicate discomfort from too much force on nose 40. In response to the excessive force measured by force sensor(s) 20 or the press of the button by a user, control circuitry 12 may then stop positioners 58 from moving lens modules 70 further toward nose 40. If desired, positioners 58 may move lens modules 70 off of nose 40 by a desired gap (e.g., a gap G of at least 0.1 mm, at least 0.2 mm, at least 1 mm, at least 2 mm, less than 5 mm, or other suitable spacing).
Following the positioning of modules 70 at desired locations relative to nose 40 to ensure user comfort while wearing device 10, control circuitry 12 may use display 14 to present visual content to the user through lenses 72 (block 104).
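The overall sequence of blocks 100, 102, and 104 may be summarized with the following merely illustrative Python sketch, which reuses the hypothetical helpers sketched earlier and is not the disclosure's actual control code.

```python
# Merely illustrative summary of blocks 100, 102, and 104 as a sequence.

def run_fit_sequence(gather_ipd_mm, adjust_spacing_mm, present_content):
    ipd = gather_ipd_mm()           # block 100: user entry, image sensor,
                                    # or gaze-tracker measurement
    final = adjust_spacing_mm(ipd)  # block 102: close spacing under force
                                    # monitoring, backing off if needed
    present_content()               # block 104: present visual content
    return final
```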
Instead of, or in addition to, mounting force sensors on the nasal flap, force sensors may be coupled to the lens modules. An illustrative example of force sensors coupled to a lens module is shown in
As shown in
Blades 92 may extend in the x-direction between intermediate trim piece 71 and trim ring 90. Blades 94 may extend in the y-direction between optical portion 69 and intermediate trim piece 71. To measure the force applied to lens module 70, such as when lens module 70 contacts the user's nose when making adjustments based on IPD, strain gauges 96A and 96B may be coupled to blades 92 and 94. In the example of
When lens module 70 contacts the user's nose, trim ring 90 and/or intermediate trim piece 71 may deflect, causing deflections of blades 92 and/or blades 94. Strain gauges 96 may measure these deflections, which may be proportional to the force applied to the nose. In this way, strain gauges 96 may monitor the force applied to the nose by lens module 70. If the force is too great, control circuitry, such as control circuitry 12, may stop the lens modules from moving toward the user's nose and, if desired, may reverse the lens modules away from the user's nose by a set distance. Strain gauges on blades 92 and/or 94 thus allow the force applied to a user's nose to be monitored to avoid excessive pressure on the nose.
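A merely illustrative conversion from a strain-gauge reading to force is sketched below in Python, under the assumption of a quarter-bridge readout and a linear blade stiffness; the gauge factor and calibration constant are illustrative values, not parameters of the disclosed gauges.

```python
# Merely illustrative: force from a strain-gauge bridge reading, assuming
# a Wheatstone quarter-bridge and linear blade stiffness.

GAUGE_FACTOR = 2.0         # typical metallic foil gauge factor
N_PER_UNIT_STRAIN = 800.0  # assumed blade calibration (newtons per strain)

def blade_force_n(bridge_ratio):
    """bridge_ratio = Vout/Vexc from a quarter-bridge readout."""
    strain = 4.0 * bridge_ratio / GAUGE_FACTOR  # quarter-bridge relation
    return strain * N_PER_UNIT_STRAIN
```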
Although
Although
Instead of, or in addition to, having strain gauges within lens module 70, strain gauges may be coupled to an attachment mechanism between lens module 70 and head-mounted housing 26. An illustrative example of this arrangement is shown in
As shown in
Coupler 98 may also include flexible structures 108 (also referred to as blades 108 herein) that extend between optical module 70 and attachment 106. Flexible structures 108 may be formed from metal, plastic, or other material that flexes when force is applied to lens module 70 (e.g., because attachment 106 remains fixed). For example, blades 108 may be formed from steel or other metal with a low thickness, such as less than 300 microns, less than 260 microns, less than 250 microns, or other suitable thickness.
Strain gauges 110 may be mounted on at least one of blades 108. Strain gauges 110 may measure the strain of blades 108 when force is applied to lens module 70, which is proportional to the force on lens module 70. In this way, the force applied on a nose by lens module 70 may be monitored, and lens module 70 may be stopped from pressing against the nose if there is excessive force applied.
Although strain gauges 110 are shown on only one of blades 108, this is merely illustrative. In general, strain gauges 110 may be applied to any or all of blades 108.
Instead of having strain gauges 110 mounted on blades 108 in coupler 98, coupler 98 may include spring structures and load cells or position sensors to determine how much force is being applied to a nose by module 70. Illustrative arrangements of springs and force sensing components are shown in
As shown in
Instead of using a tension spring as in
As another example, a spring interferometer may be used, as shown in
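In these spring-based arrangements, a merely illustrative readout applies Hooke's law to the measured spring deflection, as in the following Python sketch; the spring constant is an assumed example value.

```python
# Merely illustrative: force from spring deflection via Hooke's law, with
# deflection measured by a position sensor or interferometer.

SPRING_K_N_PER_MM = 0.8  # assumed spring stiffness

def spring_force_n(deflection_mm):
    return SPRING_K_N_PER_MM * deflection_mm
```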
A physical environment refers to a physical world that people can sense and/or interact with without aid of electronic devices. The physical environment may include physical features such as a physical surface or a physical object. For example, the physical environment corresponds to a physical park that includes physical trees, physical buildings, and physical people. People can directly sense and/or interact with the physical environment such as through sight, touch, hearing, taste, and smell. In contrast, an extended reality (XR) environment refers to a wholly or partially simulated environment that people sense and/or interact with via an electronic device. For example, the XR environment may include augmented reality (AR) content, mixed reality (MR) content, virtual reality (VR) content, and/or the like. With an XR system, a subset of a person's physical motions, or representations thereof, are tracked, and, in response, one or more characteristics of one or more virtual objects simulated in the XR environment are adjusted in a manner that comports with at least one law of physics. As one example, the XR system may detect head movement and, in response, adjust graphical content and an acoustic field presented to the person in a manner similar to how such views and sounds would change in a physical environment. As another example, the XR system may detect movement of the electronic device presenting the XR environment (e.g., a mobile phone, a tablet, a laptop, or the like) and, in response, adjust graphical content and an acoustic field presented to the person in a manner similar to how such views and sounds would change in a physical environment. In some situations (e.g., for accessibility reasons), the XR system may adjust characteristic(s) of graphical content in the XR environment in response to representations of physical motions (e.g., vocal commands).
There are many different types of electronic systems that enable a person to sense and/or interact with various XR environments. Examples include head mountable systems, projection-based systems, heads-up displays (HUDs), vehicle windshields having integrated display capability, windows having integrated display capability, displays formed as lenses designed to be placed on a person's eyes (e.g., similar to contact lenses), headphones/earphones, speaker arrays, input systems (e.g., wearable or handheld controllers with or without haptic feedback), smartphones, tablets, and desktop/laptop computers. A head mountable system may have one or more speaker(s) and an integrated opaque display. Alternatively, a head mountable system may be configured to accept an external opaque display (e.g., a smartphone). The head mountable system may incorporate one or more imaging sensors to capture images or video of the physical environment, and/or one or more microphones to capture audio of the physical environment. Rather than an opaque display, a head mountable system may have a transparent or translucent display. The transparent or translucent display may have a medium through which light representative of images is directed to a person's eyes. The display may utilize digital light projection, OLEDs, LEDs, uLEDs, liquid crystal on silicon, laser scanning light source, or any combination of these technologies. The medium may be an optical waveguide, a hologram medium, an optical combiner, an optical reflector, or any combination thereof. In some implementations, the transparent or translucent display may be configured to become opaque selectively. Projection-based systems may employ retinal projection technology that projects graphical images onto a person's retina. Projection systems also may be configured to project virtual objects into the physical environment, for example, as a hologram or on a physical surface.
As described above, one aspect of the present technology is the gathering and use of information such as sensor information. The present disclosure contemplates that in some instances, data may be gathered that includes personal information data that uniquely identifies or can be used to contact or locate a specific person. Such personal information data can include demographic data, location-based data, telephone numbers, email addresses, Twitter IDs, home addresses, data or records relating to a user's health or level of fitness (e.g., vital signs measurements, medication information, exercise information), date of birth, username, password, biometric information, or any other identifying or personal information.
The present disclosure recognizes that the use of such personal information, in the present technology, can be used to the benefit of users. For example, the personal information data can be used to deliver targeted content that is of greater interest to the user. Accordingly, use of such personal information data enables users to have control of the delivered content. Further, other uses for personal information data that benefit the user are also contemplated by the present disclosure. For instance, health and fitness data may be used to provide insights into a user's general wellness, or may be used as positive feedback to individuals using technology to pursue wellness goals.
The present disclosure contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information data will comply with well-established privacy policies and/or privacy practices. In particular, such entities should implement and consistently use privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining personal information data private and secure. Such policies should be easily accessible by users, and should be updated as the collection and/or use of data changes. Personal information from users should be collected for legitimate and reasonable uses of the entity and not shared or sold outside of those legitimate uses. Further, such collection/sharing should occur after receiving the informed consent of the users. Additionally, such entities should consider taking any needed steps for safeguarding and securing access to such personal information data and ensuring that others with access to the personal information data adhere to their privacy policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices. In addition, policies and practices should be adapted for the particular types of personal information data being collected and/or accessed and adapted to applicable laws and standards, including jurisdiction-specific considerations. For instance, in the United States, collection of or access to certain health data may be governed by federal and/or state laws, such as the Health Insurance Portability and Accountability Act (HIPAA), whereas health data in other countries may be subject to other regulations and policies and should be handled accordingly. Hence different privacy practices should be maintained for different personal data types in each country.
Despite the foregoing, the present disclosure also contemplates embodiments in which users selectively block the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data. For example, the present technology can be configured to allow users to select to “opt in” or “opt out” of participation in the collection of personal information data during registration for services or anytime thereafter. In another example, users can select not to provide certain types of user data. In yet another example, users can select to limit the length of time user-specific data is maintained. In addition to providing “opt in” and “opt out” options, the present disclosure contemplates providing notifications relating to the access or use of personal information. For instance, a user may be notified upon downloading an application (“app”) that their personal information data will be accessed and then reminded again just before personal information data is accessed by the app.
Moreover, it is the intent of the present disclosure that personal information data should be managed and handled in a way to minimize risks of unintentional or unauthorized access or use. Risk can be minimized by limiting the collection of data and deleting data once it is no longer needed. In addition, and when applicable, including in certain health related applications, data de-identification can be used to protect a user's privacy. De-identification may be facilitated, when appropriate, by removing specific identifiers (e.g., date of birth, etc.), controlling the amount or specificity of data stored (e.g., collecting location data at a city level rather than at an address level), controlling how data is stored (e.g., aggregating data across users), and/or other methods.
Therefore, although the present disclosure broadly covers use of information that may include personal information data to implement one or more various disclosed embodiments, the present disclosure also contemplates that the various embodiments can also be implemented without the need for accessing personal information data. That is, the various embodiments of the present technology are not rendered inoperable due to the lack of all or a portion of such personal information data.
The foregoing is merely illustrative and various modifications can be made to the described embodiments. The foregoing embodiments may be implemented individually or in any combination.
This application claims the benefit of U.S. provisional patent application No. 63/355,990, filed Jun. 27, 2022, and U.S. provisional patent application No. 63/431,514, filed Dec. 9, 2022, which are hereby incorporated by reference herein in their entireties.