This relates generally to electronic devices, and, more particularly, to electronic devices with displays.
Electronic devices often include displays. For example, an electronic device may have an organic light-emitting diode (OLED) display based on organic light-emitting diode pixels. In this type of display, each pixel includes a light-emitting diode and thin-film transistors for controlling application of a signal to the light-emitting diode to produce light. The light-emitting diodes may include OLED layers positioned between an anode and a cathode.
There is a trend towards borderless electronic devices with a full-face display. These devices, however, may still need to include sensors such as cameras, ambient light sensors, and proximity sensors to provide other device capabilities. Since the display now covers the entire front face of the electronic device, the sensors will have to be placed under the display stack. In practice, however, the amount of light transmission through the display stack is very low (e.g., the transmission might be less than 20% in the visible spectrum), which severely limits the sensing performance under the display.
It is within this context that the embodiments herein arise.
An electronic device may include a display and an optical sensor formed underneath the display. The electronic device may include a plurality of non-pixel regions that overlap the optical sensor. Each non-pixel region may be devoid of thin-film transistors and other display components. The plurality of non-pixel regions is configured to increase the transmittance of light through the display to the sensor. The non-pixel regions may therefore be referred to as transparent windows in the display.
The resolution of the display panel may be reduced in some areas due to the presence of the transparent windows. To prevent a visible border between the reduced resolution areas of the display and full resolution areas of the display, uniformity compensation circuitry may be used to compensate pixel data.
Uniformity compensation circuitry may receive input pixel data having gray levels for each pixel in the display. The uniformity compensation circuitry may output compensated pixel data for the display. The uniformity compensation circuitry may use one or more compensation maps that include compensation factors associated with pixel locations. The uniformity compensation circuitry may also use region-specific gamma look-up tables to apply different gamma curves to pixels in different regions of the display.
The uniformity compensation circuitry may also be used to form a transition region adjacent to a boundary between a pixel removal region of the display and a full pixel density region of the display. The maximum luminance of pixels may gradually be changed across the transition region.
An illustrative electronic device of the type that may be provided with a display is shown in
As shown in
Input-output circuitry in device 10 such as input-output devices 12 may be used to allow data to be supplied to device 10 and to allow data to be provided from device 10 to external devices. Input-output devices 12 may include buttons, joysticks, scrolling wheels, touch pads, key pads, keyboards, microphones, speakers, tone generators, vibrators, cameras, sensors, light-emitting diodes and other status indicators, data ports, etc. A user can control the operation of device 10 by supplying commands through input resources of input-output devices 12 and may receive status information and other output from device 10 using the output resources of input-output devices 12.
Input-output devices 12 may include one or more displays such as display 14. Display 14 may be a touch screen display that includes a touch sensor for gathering touch input from a user, or display 14 may be insensitive to touch (i.e., the touch sensor may be omitted). A touch sensor for display 14 may be based on an array of capacitive touch sensor electrodes, acoustic touch sensor structures, resistive touch components, force-based touch sensor structures, a light-based touch sensor, or other suitable touch sensor arrangements. A touch sensor for display 14 may be formed from electrodes formed on a common display substrate with the display pixels of display 14 or may be formed from a separate touch sensor panel that overlaps the pixels of display 14. Display 14 in electronic device 10 may be a head-up display that can be viewed without requiring users to look away from a typical viewpoint or may be a head-mounted display that is incorporated into a device that is worn on a user's head. If desired, display 14 may also be a holographic display used to display holograms.
Control circuitry 16 may be used to run software on device 10 such as operating system code and applications. During operation of device 10, the software running on control circuitry 16 may display images on display 14.
Input-output devices 12 may also include one or more sensors 13 such as force sensors (e.g., strain gauges, capacitive force sensors, resistive force sensors, etc.), audio sensors such as microphones, touch and/or proximity sensors such as capacitive sensors (e.g., a two-dimensional capacitive touch sensor associated with a display and/or a touch sensor that forms a button, trackpad, or other input device not associated with a display), and other sensors. In accordance with some embodiments, sensors 13 may include optical sensors such as optical sensors that emit and detect light (e.g., optical proximity sensors such as transreflective optical proximity structures), ultrasonic sensors, and/or other touch and/or proximity sensors, monochromatic and color ambient light sensors, image sensors, fingerprint sensors, temperature sensors, proximity sensors and other sensors for measuring three-dimensional non-contact gestures (“air gestures”), pressure sensors, sensors for detecting position, orientation, and/or motion (e.g., accelerometers, magnetic sensors such as compass sensors, gyroscopes, and/or inertial measurement units that contain some or all of these sensors), health sensors, radio-frequency sensors, depth sensors (e.g., structured light sensors and/or depth sensors based on stereo imaging devices), optical sensors such as self-mixing sensors and light detection and ranging (lidar) sensors that gather time-of-flight measurements, humidity sensors, moisture sensors, gaze tracking sensors, and/or other sensors. 
In some arrangements, device 10 may use sensors 13 and/or other input-output devices to gather user input (e.g., buttons may be used to gather button press input, touch sensors overlapping displays can be used for gathering user touch screen input, touch pads may be used in gathering touch input, microphones may be used for gathering audio input, accelerometers may be used in monitoring when a finger contacts an input surface and may therefore be used to gather finger press input, etc.).
Display 14 may be an organic light-emitting diode display or may be a display based on other types of display technology (e.g., liquid crystal displays). Device configurations in which display 14 is an organic light-emitting diode display are sometimes described herein as an example. This is, however, merely illustrative. Any suitable type of display may be used, if desired. In general, display 14 may have a rectangular shape (i.e., display 14 may have a rectangular footprint and a rectangular peripheral edge that runs around the rectangular footprint) or may have other suitable shapes. Display 14 may be planar or may have a curved profile.
A top view of a portion of display 14 is shown in
Display driver circuitry may be used to control the operation of pixels 22. The display driver circuitry may be formed from integrated circuits, thin-film transistor circuits, or other suitable circuitry. Display driver circuitry 30 of
To display the images on display pixels 22, display driver circuitry 30 may supply image data to data lines D while issuing clock signals and other control signals to supporting display driver circuitry such as gate driver circuitry 34 over path 38. If desired, display driver circuitry 30 may also supply clock signals and other control signals to gate driver circuitry 34 on an opposing edge of display 14.
Gate driver circuitry 34 (sometimes referred to as row control circuitry) may be implemented as part of an integrated circuit and/or may be implemented using thin-film transistor circuitry. Horizontal control lines G in display 14 may carry gate line signals such as scan line signals, emission enable control signals, and other horizontal control signals for controlling the display pixels 22 of each row. There may be any suitable number of horizontal control signals per row of pixels 22 (e.g., one or more row control signals, two or more row control signals, three or more row control signals, four or more row control signals, etc.).
The region on display 14 where the display pixels 22 are formed may sometimes be referred to herein as the active area. Electronic device 10 has an external housing with a peripheral edge. The region surrounding the active area and within the peripheral edge of device 10 is the border region. Images can only be displayed to a user of the device in the active area. It is generally desirable to minimize the border region of device 10. For example, device 10 may be provided with a full-face display 14 that extends across the entire front face of the device. If desired, display 14 may also wrap around over the edge of the front face so that at least part of the lateral edges or at least part of the back surface of device 10 is used for display purposes.
Device 10 may include a sensor 13 mounted behind display 14 (e.g., behind the active area of the display).
Thin-film transistor (TFT) layers 304 may be formed over inorganic buffer layers 303 and organic substrates 302 and 300. The TFT layers 304 may include thin-film transistor circuitry such as thin-film transistors, thin-film capacitors, associated routing circuitry, and other thin-film structures formed within multiple metal routing layers and dielectric layers. Organic light-emitting diode (OLED) layers 306 may be formed over the TFT layers 304. The OLED layers 306 may include a diode cathode layer, a diode anode layer, and emissive material interposed between the cathode and anode layers. The OLED layers may include a pixel definition layer that defines the light-emitting area of each pixel. The TFT circuitry in layer 304 may be used to control an array of display pixels formed by the OLED layers 306.
Circuitry formed in the TFT layers 304 and the OLED layers 306 may be protected by encapsulation layers 308. As an example, encapsulation layers 308 may include a first inorganic encapsulation layer, an organic encapsulation layer formed on the first inorganic encapsulation layer, and a second inorganic encapsulation layer formed on the organic encapsulation layer. Encapsulation layers 308 formed in this way can help prevent moisture and other potential contaminants from damaging the conductive circuitry that is covered by layers 308. Substrate 300, polyimide layers 302, buffer layers 303, TFT layers 304, OLED layers 306, and encapsulation layers 308 may be collectively referred to as a display panel.
One or more polarizer films 312 may be formed over the encapsulation layers 308 using adhesive 310. Adhesive 310 may be implemented using optically clear adhesive (OCA) material that offers high light transmittance. One or more touch layers 316 that implement the touch sensor functions of touch-screen display 14 may be formed over polarizer films 312 using adhesive 314 (e.g., OCA material). For example, touch layers 316 may include horizontal touch sensor electrodes and vertical touch sensor electrodes collectively forming an array of capacitive touch sensor electrodes. Lastly, the display stack may be topped off with a cover glass layer 320 (sometimes referred to as a display cover layer 320) that is formed over the touch layers 316 using additional adhesive 318 (e.g., OCA material). Display cover layer 320 may be a transparent layer (e.g., transparent plastic or glass) that serves as an outer protective layer for display 14. The outer surface of display cover layer 320 may form an exterior surface of the display and the electronic device that includes the display.
Still referring to
Each of the multitude of layers in the display stack contributes to the degraded light transmission to sensor 13. In particular, the dense thin-film transistors and associated routing structures in TFT layers 304 of the display stack contribute substantially to the low transmission. In accordance with an embodiment, at least some of the display pixels may be selectively removed in regions of the display stack located directly over sensor(s) 13. Regions of display 14 that at least partially cover or overlap with sensor(s) 13 in which at least a portion of the display pixels have been removed are sometimes referred to as pixel removal regions or pixel free regions. Removing display pixels (e.g., removing transistors and/or capacitors associated with one or more sub-pixels) in the pixel free regions can drastically increase transmission and improve the performance of the under-display sensor 13. In addition to removing display pixels, portions of additional layers such as polyimide layers 302 and/or substrate 300 may be removed for additional transmission improvement. Polarizer 312 may also be bleached for additional transmission improvement.
In display window 324, anode 306-1 and emissive material 306-2 may be omitted. Without the display window, an additional pixel may be formed in area 324 adjacent to the pixel in area 322 (according to the pixel pattern). However, to increase the transmittance of light to sensor 13 under the display, the pixel(s) in area 324 are removed. The absence of emissive material 306-2 and anode 306-1 may increase the transmittance through the display stack. Additional circuitry within thin-film transistor layer 304 may also be omitted in the pixel removal area to increase transmittance.
Additional transmission improvements through the display stack may be obtained by selectively removing additional components from the display stack in high-transmittance area 324. As shown in
Polyimide layers 302 may be removed in high-transmittance area 324 in addition to cathode layer 306-3. The removal of the polyimide layers 302 results in an opening 328 in the pixel removal region. Said another way, the polyimide layer may have polyimide material that defines an opening 328 in the pixel removal region. The polyimide layers may be removed via etching (e.g., laser etching or plasma etching). Alternatively, the polyimide layers may be patterned to have an opening in high-transmittance area 324 during the original polyimide formation steps. Removing the polyimide layer 302 in high-transmittance area 324 may result in additional transmittance of light to sensor 13 in high-transmittance area 324.
Substrate 300 may be removed in high-transmittance area 324 in addition to cathode layer 306-3 and polyimide layer 302. The removal of the substrate 300 results in an opening 330 in the pixel removal region. Said another way, the substrate 300 may have material (e.g., PET, PEN, etc.) that defines an opening 330 in the pixel removal region. The substrate may be removed via etching (e.g., with a laser). Alternatively, the substrate may be patterned to have an opening in high-transmittance area 324 during the original substrate formation steps. Removing the substrate 300 in high-transmittance area 324 may result in additional transmittance of light to sensor 13 in high-transmittance area 324. The polyimide opening 328 and substrate opening 330 may be considered to form a single unitary opening. When removing portions of polyimide layer 302 and/or substrate 300, inorganic buffer layers 303 may serve as an etch stop for the etching step. Openings 328 and 330 may be filled with air or another desired transparent filler.
In addition to having openings in cathode 306-3, polyimide layers 302, and/or substrate 300, the polarizer 312 in the display may be bleached for additional transmittance in the pixel removal region.
As shown in
The pattern of pixels (322) and transparent openings (324) in
In general, the display subpixels may be partially removed from any region(s) of display 14.
The example of
Because pixel removal region 332 includes both pixel regions 322 and transparent windows 324, the pixel density (e.g., the number of sub-pixels per unit area) in pixel region 332 is reduced relative to full pixel density region 334. In general, the pixel density in pixel removal region 332 may be reduced by any desired amount relative to the pixel density in region 334 (e.g., reduced by 5% or more, reduced by 10% or more, reduced by 25% or more, reduced by 50% or more, reduced by between 10% and 60%, reduced by between 30% and 60%, reduced by between 40% and 60%, etc.). In
For the border between pixel removal region 332 and full pixel density region 334 to be imperceptible to the viewer, the pixels in pixel removal region 332 may be brighter than the pixels in full pixel density region 334. Consider the example where all the pixels in
Ultimately, content generation circuitry 202 may output pixel data for a given frame. The pixel data may include a target luminance value (gray level) for each sub-pixel in the display for the frame.
As previously discussed, without additional compensation, unmodified pixel data displayed on the display may result in undesired borders or differing appearances between pixel removal region 332 and normal region 334. Therefore, device 10 includes uniformity compensation circuitry 204 that is configured to adjust the pixel data to account for the varying pixel density between the pixel removal region 332 and the normal pixel region 334. Uniformity compensation circuitry 204 may match the uniformity and pixel gamma between regions 332 and 334 and use a content transition to make the boundary between regions 332 and 334 imperceptible to the viewer.
First, uniformity compensation circuitry 204 may include region-specific compensation maps 206. The compensation maps 206 may determine a corrected gray level for a given sub-pixel based on the location of that sub-pixel and other possible factors. For example, a first pixel at the boundary between pixel removal region 332 and normal pixel region 334 may have an associated first compensation value, a second pixel in pixel removal region 332 that is not adjacent to the boundary may have an associated second compensation value, and a third pixel in normal region 334 that is not adjacent to the boundary may have an associated third compensation value.
The compensation maps are therefore used to adjust a gray level for a given sub-pixel based on the position of the sub-pixel. The compensation maps may optionally use the gray levels of one or more adjacent sub-pixels in determining compensation for a given sub-pixel. The compensation maps may take other factors into account such as the ambient light level (e.g., obtained by an ambient light sensor in the electronic device), a display brightness setting (e.g., set by the user or based on other considerations such as power consumption considerations), a temperature (e.g., obtained by a temperature sensor in the electronic device), etc. Based on these inputs, the initial gray levels from content generation circuitry 202, and the location of the corresponding sub-pixel, compensation maps 206 may output a compensated gray level for each sub-pixel.
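As a rough illustration of how such a compensation map might adjust gray levels, the following sketch applies a location-dependent compensation factor to an input gray level. The factor values, map keys, and the simple multiplicative model are hypothetical assumptions for illustration only, not taken from this description:

```python
# Hypothetical sketch: per-location gray-level compensation. A real map would
# be calibrated per panel and may also depend on ambient light, brightness
# setting, and temperature as described above.

def compensate_gray_level(gray, factor, max_level=255):
    """Scale an input gray level by a location-specific compensation factor."""
    return min(max_level, round(gray * factor))

# Hypothetical compensation factors for the three example pixel locations:
compensation_map = {
    "boundary_pixel": 1.30,        # first pixel, at the region boundary
    "removal_region_pixel": 1.25,  # second pixel, inside the pixel removal region
    "normal_region_pixel": 1.00,   # third pixel, in the normal region
}

# Compensate the same input gray level (128) at each location.
compensated = {loc: compensate_gray_level(128, f)
               for loc, f in compensation_map.items()}
```

The pixel in the normal region is left unchanged, while pixels in and near the pixel removal region are driven to higher gray levels to offset the reduced pixel density.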
The compensation maps may include binning of compensation values to reduce memory requirements. The granularity (resolution) of the compensation maps may vary or may be constant across the compensation map. In general, granularity may increase as the pixel location nears the boundary 338 between regions 332 and 334. Sub-pixels at (or very close to) boundary 338 may have the highest granularity within the compensation map. For example, each sub-pixel may have a unique compensation value in this area. In contrast, compensation values may be binned for sub-pixels further from the boundary. A single compensation value may be stored in the compensation map to apply to a binned group of two sub-pixels, four sub-pixels, sixteen sub-pixels, more than two sub-pixels, more than four sub-pixels, more than eight sub-pixels, more than ten sub-pixels, more than twenty sub-pixels, more than forty sub-pixels, etc. The binned groups of sub-pixels may include 2×2 groups of sub-pixels, 4×4 groups of sub-pixels, or groups of any other desired dimensions.
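The binning scheme described above can be sketched as follows. The bin size, the distance threshold, and the key layout are hypothetical choices used only to illustrate the idea of storing one value per sub-pixel near the boundary and one value per group elsewhere:

```python
# Hypothetical sketch: variable-granularity keys for a compensation map.
# Sub-pixels near the boundary get a unique entry; sub-pixels far from the
# boundary share one entry per 4x4 group, reducing stored values ~16x there.

def bin_index(x, y, boundary_x, bin_size=4):
    """Map a sub-pixel coordinate to a compensation-map key."""
    if abs(x - boundary_x) < bin_size:
        return ("full", x, y)                         # unique entry per sub-pixel
    return ("binned", x // bin_size, y // bin_size)   # one shared entry per group
```

Two sub-pixels far from the boundary that fall in the same 4×4 group resolve to the same key and therefore share a single stored compensation value.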
It should be noted that a single compensation map may be used for the entire display or multiple compensation maps may be used for different portions of the display. For example, pixel removal region 332 may have a compensation map, full pixel density region 334 may have a compensation map, and a boundary region between pixel removal region 332 and full pixel density 334 may have a compensation map. Multiple compensation maps may also be included to account for the additional factors mentioned above (ambient light level, display brightness setting, temperature, etc.).
In the example of
After compensation map 206 compensates the gray levels, region-specific gamma look-up tables 208 may be used to obtain values that will ultimately be provided to the display. The region-specific look-up tables 208 may include a plurality of look-up tables representing gamma curves. Gamma curves are used to map luminance levels (e.g., gray levels) for each pixel to corresponding voltage levels (e.g., voltages applied to the pixels using display driver circuitry). Gamma curves may account for the non-linear manner in which viewers perceive light and color. A gamma look-up table may include a table of output voltages that each correspond to a particular input (e.g., gray level). The gamma look-up table may output a voltage for a given sub-pixel based on the input gray level for that sub-pixel.
Due to the different gamma behavior of pixels in pixel removal region 332 and full pixel density region 334, pixels in different regions may have different associated gamma look-up tables. For example, a first gamma look-up table (representing a first gamma curve) may be used for pixels in pixel removal region 332. A second, different gamma look-up table (representing a second gamma curve that is different than the first gamma curve) may be used for pixels in full pixel density region 334. The same gray level input may have different associated outputs from the first and second gamma look-up tables. If desired, additional look-up tables may be used for additional regions of the display (e.g., different pixel removal regions may have different associated look-up tables, a boundary region may have a specific look-up table, etc.).
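One way to picture the region-specific tables is as two look-up tables built from different gamma curves, so the same input gray level yields different drive voltages in the two regions. The analytic power-law curves, exponents, and voltage range below are hypothetical; a real panel would use calibrated per-region tables rather than a simple formula:

```python
# Hypothetical sketch: two gamma look-up tables, one per display region.

def build_gamma_lut(gamma, v_max=5.0, levels=256):
    """Precompute a gray-level -> drive-voltage table for one display region."""
    return [v_max * (g / (levels - 1)) ** gamma for g in range(levels)]

lut_full_density = build_gamma_lut(gamma=2.2)   # full pixel density region 334
lut_pixel_removal = build_gamma_lut(gamma=2.4)  # pixel removal region 332

gray = 128
v_full = lut_full_density[gray]
v_removed = lut_pixel_removal[gray]
# The same input gray level maps to different voltages in the two regions.
```

Because the two regions use different tables, pixels at the same commanded gray level can be driven to match in perceived luminance despite their different gamma behavior.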
The uniformity compensation circuitry 204 may use region-specific compensation maps 206 and region-specific gamma look-up tables 208 to produce compensated pixel data in any desired manner. For example, the use of region-specific compensation maps 206 and region-specific gamma look-up tables 208 may be sequential. In one example, region-specific compensation maps 206 may be used to compensate the gray levels of the pixel data and region-specific gamma look-up tables 208 may subsequently be used to convert the compensated gray levels into voltages for the compensated pixel data. In another example, region-specific gamma look-up tables 208 may convert the gray levels from the pixel data into voltages which are then subsequently compensated using region-specific compensation maps 206 to produce the compensated pixel data. In yet another example, the region-specific compensation maps 206 and region-specific gamma look-up tables 208 may be used in parallel (e.g., the region-specific compensation maps 206 may be used to compensate the region-specific gamma look-up tables 208 and the pixel data is converted to compensated pixel data in one step).
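The first (sequential) ordering described above, in which gray levels are compensated and then converted to voltages, might be sketched as follows. The function names, compensation factor, and gamma model are hypothetical:

```python
# Hypothetical sketch: sequential ordering — compensation map first, then a
# region-specific gamma look-up table converts the result to a voltage.

def build_gamma_lut(gamma, v_max=5.0, levels=256):
    """Precompute a gray-level -> voltage table (hypothetical analytic curve)."""
    return [v_max * (g / (levels - 1)) ** gamma for g in range(levels)]

def compensate_then_convert(gray, factor, lut):
    """Apply the compensation factor, clamp, then index the gamma table."""
    compensated = min(len(lut) - 1, round(gray * factor))
    return lut[compensated]

lut = build_gamma_lut(gamma=2.2)
voltage = compensate_then_convert(100, factor=1.25, lut=lut)
```

The alternative orderings described above would simply move the compensation step after the table look-up, or fold the compensation into the table itself.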
Ultimately, uniformity compensation circuitry 204 outputs compensated pixel data for the display pixels based on input pixel data (e.g., gray levels). The uniformity compensation circuitry may use one or more region-specific compensation maps and one or more region-specific gamma look-up tables to convert the input pixel data into compensated pixel data. The uniformity compensation circuitry may use other inputs (e.g., ambient light level, display brightness setting, temperature, gray levels of sub-pixels adjacent to a target sub-pixel, etc.) to convert the input pixel data into compensated pixel data. The compensated pixel data may be provided by uniformity compensation circuitry 204 to display driver circuitry 30. The display driver circuitry 30 provides the compensated pixel data to the display pixels 22, which emit light based on the received compensated pixel data.
If desired, a brightness transition may be used to seamlessly blend the boundary between pixel removal region 332 and full pixel density region 334 (thus mitigating the visibility of the boundary).
As shown in
Profile 254 shows the average luminance of the odd rows as a function of position within the display. Profile 258 shows the average luminance of the even rows as a function of position within the display. Profile 256 shows the average total luminance of both the even rows and odd rows as a function of position within the display. It should be noted that luminance as discussed in connection with
Since the even rows in region 332 are removed and cannot emit light, the even rows have an average luminance of L1 (e.g., 0 or off) in pixel removal region 332 (as shown by profile 258). The odd rows have an average luminance of L2 in pixel removal region 332. The total average luminance in pixel removal region 332 is therefore L3 (e.g., approximately half of L2).
This luminance scheme may be held throughout pixel removal region 332, even as the pixels approach boundary 338 between pixel removal region 332 and full pixel density region 334. At boundary 338, full pixel density region 334 begins. At this point (e.g., to the right of boundary 338 in
Consider region 340 of full pixel density region 334. Region 340 (sometimes referred to as normal display region 340, uniform luminance region 340, etc.) may be separated from boundary 338 by a given distance. Region 340 may display content ‘normally.’ In other words, both the odd rows and even rows may operate with the same average luminance (e.g., L3 in
Full pixel density region 334 has full pixel density up to the boundary 338. Therefore, the normal display scheme in region 340 could, if desired, be used in all of full pixel density region 334 (including immediately adjacent to boundary 338). However, this may result in a perceptible border between the full pixel density region 334 and pixel removal region 332. The display may therefore include a transition region 342 between boundary 338 and normal region 340.
Transition region 342 may be used to gradually transition the luminance distribution from fully one-sided (e.g., 100% of the luminance comes from odd rows) in region 332 to fully mixed (e.g., 50% of the luminance comes from odd rows and 50% from even rows) in region 340. This gradual transition in the distribution of luminance between odd rows (e.g., the first portion of the pattern) and even rows (e.g., the second portion of the pattern) may mitigate the visibility of the boundary between the pixel removal region 332 and full pixel density region 334. The gradual luminance transition effectively imitates a gradual transition in pixel density between the reduced pixel density of region 332 and the full pixel density of region 334. For this reason, region 342 may sometimes be referred to as a pixel density transition region.
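The gradual redistribution of luminance across the transition region can be sketched as a ramp in the fraction of light carried by the even rows, from zero on the pixel removal side to an even split on the full density side. The linear ramp shape and the column positions below are hypothetical; any monotonic blend could be used:

```python
# Hypothetical sketch: ramp the even-row share of total luminance from 0%
# (pixel removal side) to 50% (normal display side) across transition region
# 342, keeping the total luminance constant at every column.

def even_row_share(column, transition_start, transition_width):
    """Fraction of total luminance emitted by even rows at a given column."""
    if column < transition_start:
        return 0.0                                   # one-sided: odd rows only
    if column >= transition_start + transition_width:
        return 0.5                                   # fully mixed: even split
    progress = (column - transition_start) / transition_width
    return 0.5 * progress                            # linear blend in between

def odd_row_share(column, transition_start, transition_width):
    """Odd rows carry the remainder, so total luminance stays constant."""
    return 1.0 - even_row_share(column, transition_start, transition_width)
```

For example, at the midpoint of a hypothetical 40-column transition starting at column 20, the even rows would carry 25% of the luminance and the odd rows 75%, imitating an intermediate pixel density.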
As shown in
The display may also include buffer region 336 between boundary 338 and transition region 342. In buffer region 336, the luminance profile of the pixel removal region 332 is maintained (e.g., 100% of the luminance comes from odd rows) even though the display has the physical capability to emit light in the even rows in this region. The buffer region may reduce the perceptibility of the boundary between pixel removal region 332 and full pixel density region 334. In some cases, however, the buffer region may be omitted. In these cases, transition region 342 may start immediately at the boundary between regions 332 and 334.
The width of buffer region 336 may be less than thirty columns (of sub-pixels), less than twenty-five columns, less than twenty columns, less than ten columns, less than five columns, zero columns (when the buffer region is omitted entirely), more than five columns, more than ten columns, more than twenty columns, between twenty and forty columns, etc. The width of transition region 342 may be less than one hundred columns, less than seventy-five columns, less than sixty columns, less than fifty columns, less than forty columns, less than thirty columns, less than twenty columns, less than ten columns, less than five columns, more than five columns, more than ten columns, more than twenty columns, more than forty columns, more than fifty columns, more than sixty columns, between thirty and one hundred columns, etc.
Uniformity compensation circuitry 204 in
It should be noted that content generating circuitry 202, uniformity compensation circuitry 204, and display driver circuitry 30 may be implemented using one or more microprocessors, microcontrollers, digital signal processors, graphics processing units, application-specific integrated circuits, and other integrated circuits. Content generating circuitry 202, uniformity compensation circuitry 204, and display driver circuitry 30 may sometimes be referred to as part of display 14 and/or may sometimes be referred to as control circuitry (e.g., part of control circuitry 16 in
As described above, one aspect of the present technology is the gathering and use of information such as information from input-output devices. The present disclosure contemplates that in some instances, data may be gathered that includes personal information data that uniquely identifies or can be used to contact or locate a specific person. Such personal information data can include demographic data, location-based data, telephone numbers, email addresses, Twitter IDs, home addresses, data or records relating to a user's health or level of fitness (e.g., vital signs measurements, medication information, exercise information), date of birth, username, password, biometric information, or any other identifying or personal information.
The present disclosure recognizes that the use of such personal information, in the present technology, can be used to the benefit of users. For example, the personal information data can be used to deliver targeted content that is of greater interest to the user. Accordingly, use of such personal information data enables users to have calculated control of the delivered content. Further, other uses for personal information data that benefit the user are also contemplated by the present disclosure. For instance, health and fitness data may be used to provide insights into a user's general wellness, or may be used as positive feedback to individuals using technology to pursue wellness goals.
The present disclosure contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information data will comply with well-established privacy policies and/or privacy practices. In particular, such entities should implement and consistently use privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for keeping personal information data private and secure. Such policies should be easily accessible by users, and should be updated as the collection and/or use of data changes. Personal information from users should be collected for legitimate and reasonable uses of the entity and not shared or sold outside of those legitimate uses. Further, such collection/sharing should occur after receiving the informed consent of the users. Additionally, such entities should consider taking any needed steps for safeguarding and securing access to such personal information data and ensuring that others with access to the personal information data adhere to their privacy policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices. In addition, policies and practices should be adapted for the particular types of personal information data being collected and/or accessed and adapted to applicable laws and standards, including jurisdiction-specific considerations. For instance, in the United States, collection of or access to certain health data may be governed by federal and/or state laws, such as the Health Insurance Portability and Accountability Act (HIPAA), whereas health data in other countries may be subject to other regulations and policies and should be handled accordingly. Hence different privacy practices should be maintained for different personal data types in each country.
Despite the foregoing, the present disclosure also contemplates embodiments in which users selectively block the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data. For example, the present technology can be configured to allow users to select to “opt in” or “opt out” of participation in the collection of personal information data during registration for services or anytime thereafter. In another example, users can select not to provide certain types of user data. In yet another example, users can select to limit the length of time user-specific data is maintained. In addition to providing “opt in” and “opt out” options, the present disclosure contemplates providing notifications relating to the access or use of personal information. For instance, a user may be notified upon downloading an application (“app”) that their personal information data will be accessed and then reminded again just before personal information data is accessed by the app.
Moreover, it is the intent of the present disclosure that personal information data should be managed and handled in a way to minimize risks of unintentional or unauthorized access or use. Risk can be minimized by limiting the collection of data and deleting data once it is no longer needed. In addition, and when applicable, including in certain health related applications, data de-identification can be used to protect a user's privacy. De-identification may be facilitated, when appropriate, by removing specific identifiers (e.g., date of birth, etc.), controlling the amount or specificity of data stored (e.g., collecting location data at a city level rather than at an address level), controlling how data is stored (e.g., aggregating data across users), and/or other methods.
Therefore, although the present disclosure broadly covers use of information that may include personal information data to implement one or more various disclosed embodiments, the present disclosure also contemplates that the various embodiments can also be implemented without the need for accessing personal information data. That is, the various embodiments of the present technology are not rendered inoperable due to the lack of all or a portion of such personal information data.
The foregoing is merely illustrative and various modifications can be made by those skilled in the art without departing from the scope and spirit of the described embodiments. The foregoing embodiments may be implemented individually or in any combination.
This application is a continuation of non-provisional patent application Ser. No. 17/368,548, filed Jul. 6, 2021, which claims the benefit of provisional patent application No. 63/062,097, filed Aug. 6, 2020, which are hereby incorporated by reference herein in their entireties.
References Cited — U.S. Patent Documents

Number | Name | Date | Kind
---|---|---|---
7355606 | Paquette | Apr 2008 | B2
9098136 | Kim | Aug 2015 | B2
10268884 | Jones et al. | Apr 2019 | B2
20140375704 | Bi et al. | Dec 2014 | A1
20170116934 | Tien et al. | Apr 2017 | A1
20200234634 | Li | Jul 2020 | A1
20200294450 | Kim et al. | Sep 2020 | A1
20200394964 | Hyun et al. | Dec 2020 | A1
20210241671 | Lee | Aug 2021 | A1
20210343222 | Hei | Nov 2021 | A1
20210358379 | Li et al. | Nov 2021 | A1
Prior Publication Data

Number | Date | Country
---|---|---
20240038158 A1 | Feb 2024 | US
Provisional Applications

Number | Date | Country
---|---|---
63062097 | Aug 2020 | US
Related Parent Applications

Relation | Number | Date | Country
---|---|---|---
Parent | 17368548 | Jul 2021 | US
Child | 18481119 | | US