This relates generally to electronic devices, and, more particularly, to electronic devices with displays.
Electronic devices often include displays. Displays such as organic light-emitting diode displays have pixels with light-emitting diodes. During normal operation, the pixels are illuminated to display images for a user.
In some situations, it may be desirable to provide non-image illumination with the pixels. If care is not taken, this illumination may not have the desired attributes.
It would therefore be desirable to be able to provide improved electronic devices and display arrangements for accommodating the use of pixels to provide non-image illumination.
An electronic device may have a display. The display may have an array of pixels such as an array of pixels with organic light-emitting diodes or other light-emitting diodes. The device may have an array of electrical components mounted under the display. The electrical components may be an array of light sensors for capturing fingerprints from a user or for gathering information on other external objects. The light sensors in the array may gather light readings through an array of corresponding transparent windows in the display.
A capacitive touch sensor, proximity sensor, light detector, strain gauge sensor or other force sensor, or other sensor may be used by control circuitry in the device to monitor for the presence of a user's finger or other object over the array of light sensors. In response to detecting the user's finger, the control circuitry can direct the display to illuminate a portion of the display or all of the display with uniform light. For example, in a configuration in which a light sensor array occupies a portion of a display, a subset of the pixels that overlaps the light sensor may be illuminated.
The illuminated subset of pixels can produce a flash of illumination or may otherwise be adjusted in brightness independently from pixels in the rest of the display. The flash may be relatively brief. For example, the length of the flash may be equal to one frame time (e.g., 1/60 s in a display in which the rate at which image frames are displayed is 60 Hz). The flash may illuminate a user's finger that is adjacent to the subset of pixels and the light sensor array. Reflected light from the user's finger may illuminate the array of light sensors for a fingerprint capture operation. Illuminating the light sensors with a flash of light from the subset of the pixels overlapping the light sensor array (i.e., a flash region) may help ensure that fingerprint capture operations are performed satisfactorily.
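The frame-time arithmetic above can be sketched as follows. This is a hypothetical Python illustration; the function name `flash_duration_s` is an assumption for illustration and is not part of the circuitry described herein.

```python
def flash_duration_s(refresh_rate_hz: float, n_frames: int = 1) -> float:
    """Length of a flash lasting a whole number of frame times."""
    return n_frames / refresh_rate_hz

# At a 60 Hz frame rate, a one-frame flash lasts 1/60 s (about 16.7 ms).
duration = flash_duration_s(60.0)
```

A flash spanning multiple frames would simply be an integer multiple of the single-frame duration.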
The display may have display driver circuitry that facilitates the momentary illumination of the subset of pixels with uniform flash data while image data or other suitable data is displayed in other portions of the display. The display driver circuitry may have multiplexer circuitry that selectively routes either image data or flash data to a set of pixels in a fixed flash region on the display or may have multiplexer circuitry that can be dynamically configured to place the flash region at a desired location on the display.
Further features will be more apparent from the accompanying drawings and the following detailed description.
An illustrative electronic device of the type that may be provided with a display is shown in
Input-output circuitry in device 10 such as input-output devices 12 may be used to allow data to be supplied to device 10 and to allow data to be provided from device 10 to external devices. Input-output devices 12 may include buttons, joysticks, scrolling wheels, touch pads, key pads, keyboards, microphones, speakers, tone generators, vibrators, cameras, sensors (e.g., light-based proximity sensors such as infrared proximity sensors, capacitive touch sensors, force sensors such as capacitive force sensors and strain gauge force sensors, light detectors, etc.), light-emitting diodes and other status indicators, data ports, and other electrical components. A user can control the operation of device 10 by supplying commands through input-output devices 12 and may receive status information and other output from device 10 using the output resources of input-output devices 12.
Input-output devices 12 may include one or more displays such as display 14. Display 14 may be a touch screen display that includes a touch sensor for gathering touch input from a user or display 14 may be insensitive to touch. A touch sensor for display 14 may be based on an array of capacitive touch sensor electrodes, acoustic touch sensor structures, resistive touch components, force-based touch sensor structures, a light-based touch sensor, or other suitable touch sensor arrangements.
Control circuitry 16 may be used to run software on device 10 such as operating system code and applications. During operation of device 10, the software running on control circuitry 16 may display images on display 14 using an array of pixels in display 14.
When it is desired to produce illumination with the pixels of display 14, the software running on control circuitry 16 may use display 14 to illuminate a region of the pixels on display 14. The region may, for example, be a rectangular portion of display 14 or a region with another shape that serves as flash illumination for a photograph, flash illumination for a fingerprint capture operation, illumination for document scanning operations, or illumination for other operations in which an object external to device 10 is to be illuminated.
The illuminated region, which may sometimes be referred to as a flash region or flash area, may be white or may have other colors. The color of the flash area (e.g., the color temperature of a white flash area) may be adjusted to provide illumination with desired color characteristics (e.g., to satisfy aesthetic requirements, to enhance the warmth of a photograph, to ensure that a fingerprint capture operation is performed satisfactorily, etc.). The brightness of the flash area may also be adjusted. Uniform flash illumination is generally appropriate, but non-uniform patterns of illumination may be provided, if desired.
Device 10 may be a tablet computer, a laptop computer, a desktop computer, a display, a cellular telephone, a media player, a wristwatch device or other wearable electronic equipment, or other suitable electronic device.
Display 14 may be an organic light-emitting diode display or may be a display based on other types of display technology. Configurations in which display 14 is an organic light-emitting diode display are sometimes described herein as an example. This is, however, merely illustrative. Any suitable type of display may be used, if desired.
Display 14 may have a rectangular shape (i.e., display 14 may have a rectangular footprint and a rectangular peripheral edge that runs around the rectangular footprint) or may have other suitable shapes. Display 14 may be planar or may have a curved profile.
A top view of a portion of display 14 is shown in
Display driver circuitry may be used to control the operation of pixels 22. The display driver circuitry may be formed from integrated circuits, thin-film transistor circuits, or other suitable circuitry. Display driver circuitry 30 of
To display the images on display pixels 22, display driver circuitry 30 may supply image data to data lines D while issuing clock signals and other control signals to supporting display driver circuitry such as gate driver circuitry 34 over path 38. Gate driver circuitry 34 can assert appropriate gate signals (e.g., gate signals in successive rows may be asserted in sequence to load each frame of data). If desired, circuitry 30 may also supply clock signals and other control signals to gate driver circuitry on an opposing edge of display 14.
Gate driver circuitry 34 (sometimes referred to as horizontal control line control circuitry) may be implemented as part of an integrated circuit and/or may be implemented using thin-film transistor circuitry. Horizontal control lines G in display 14 may carry gate line signals (e.g., scan line signals, emission enable control signals, and other horizontal control signals) for controlling the pixels of each row. There may be any suitable number of horizontal control signals per row of pixels 22 (e.g., one or more, two or more, three or more, four or more, etc.).
It may be desirable to incorporate electrical components into display 14 and/or device 10. As shown in
Electrical components 84 may be audio components (e.g., microphones, speakers, etc.), radio-frequency components, haptic components (e.g., piezoelectric structures, vibrators, etc.), may be capacitive touch sensor components or other touch sensor structures, may be temperature sensors, pressure sensors, magnetic sensors, or other sensors, or may be any other suitable type of electrical component. With one suitable arrangement, which may sometimes be described herein as an example, electrical components 84 may be light-based components (e.g., components that emit and/or detect visible light, infrared light, and/or ultraviolet light).
Light-based components 84 may emit and/or detect light that passes through transparent windows 76 in display 14. Windows 76 may be formed in regions located between pixels 22 and may include transparent materials (e.g., clear plastic, glass, etc.) and/or holes (e.g., air-filled openings or openings filled with transparent material that pass partly or fully through substrate 36 and other display layers 74 of display 14 such as thin-film layers forming thin-film transistors and organic light-emitting diodes).
There may be a window 76 between each pair of pixels 22 or, more preferably, blocks of pixels 22 (e.g., blocks of tens, hundreds, or thousands of pixels) may be associated with windows 76 and electrical components 84.
Examples of light-based components 84 that emit light include light-emitting diodes (e.g., organic light-emitting diodes, discrete crystalline light-emitting diode dies, etc.), lasers, and lamps. Examples of light-based components that detect light include light detectors such as photodiodes and phototransistors. Some components may, if desired, include both light emitters and detectors. For example, components 84 may emit infrared light and may include light detector structures for detecting a portion of the emitted light that has reflected from nearby objects such as object 86. Components of this type may be used to implement a proximity sensor. In configurations in which components 84 include light sensors, an array of components 84 may form a light-based fingerprint sensor (e.g., when object 86 is the finger of a user) or other light-based sensor (e.g., a light sensor that detects the presence or absence of a finger or other external object by determining when components 84 have been shadowed by object 86 so that ambient light at components 84 is reduced). The presence of a user's finger or other external object 86 over a given portion of display 14 (e.g., over a region that includes an array of components 84) may, if desired, be detected using a touch sensor formed from capacitive touch sensor electrodes in display 14, a force sensor (e.g., a capacitive force sensor that measures force by detecting capacitance changes as a user presses on a portion of display 14, a strain gauge that measures force on display 14, or other force sensing structures), a light detector (e.g., a light detector that detects the user's finger by measuring shadowing of ambient light), an infrared proximity sensor or array of infrared proximity sensors or other light-based sensors, etc.
If desired, light-based sensors such as these may sense fingerprints while object 86 is illuminated with light 24 from one or more of pixels 22. This light may be produced by placing a region of display 14 (i.e., a “flash region”) in a flash mode. When operating normally, the pixels of the flash region may be used in displaying images for a user on display 14. In the flash mode, pixels 22 may produce a block of solid white light or other illumination to briefly illuminate object 86. Pixels 22 may, for example, produce a flash of white light that lasts for the duration of one frame of image data on display 14. The flash region of display 14 may be aligned with a portion of display 14 that includes an array of windows 76 (as an example). Finger sensing components such as a force sensor, capacitive touch sensor, proximity sensor, or other detector may also overlap this portion of display 14 to detect when a user's finger is present and flash illumination is appropriate.
An illustrative display with a flash region is shown in
During normal image data loading operations, data lines D0 . . . DF may be used to load image data into display 14. Rows of pixels may be loaded in sequence by issuing control signals over gate lines G0 . . . GF.
Data line voltages suitable for operating the pixels of region 100 in flash mode may be supplied to the pixels of region 100 using data lines DN . . . DM while issuing a sequence of control signals on gate lines GK . . . GL.
Illustrative display driver circuitry (see, e.g., the display driver circuitry of
Gamma block 108 may receive voltage Vreg2 from output 106 and may generate a set of voltages V255 . . . V0 at outputs 112 (e.g., using a voltage divider formed from a resistor tree and other circuitry). The values of V255 . . . V0 may be used in establishing a desired mapping between digital image data values (e.g., 0 . . . 255 or other suitable range of values) and analog voltage levels for use as analog image data signals for the pixels of display 14. To display images on display 14, image buffer 118 may supply digital image data to gamma multiplexer 110 via path 116. Gamma multiplexer 110 may supply a desired voltage from one of lines 112 to line 114 to use as data signal D in response to the digital image data signal received from image buffer 118 on path 116. The gamma block circuitry and gamma multiplexer circuitry of display 14 may be used to supply signals to multiple data lines. The display driver circuitry of display 14 may, for example, include gamma block circuitry and gamma multiplexer circuitry that implement the functions of gamma block 108 and gamma multiplexer 110 of
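The tap-selection behavior of the gamma block and gamma multiplexer described above can be modeled as follows. This is a hedged sketch: a real gamma block shapes the resistor-tree taps non-linearly to match the display's gamma curve, so the evenly spaced taps below are a simplifying assumption, and the function names are illustrative only.

```python
def gamma_taps(v_high: float, v_low: float, levels: int = 256):
    """Model the resistor-tree outputs V255 ... V0 as evenly spaced
    taps between two reference voltages (a linear simplification)."""
    step = (v_high - v_low) / (levels - 1)
    return [v_low + i * step for i in range(levels)]

def gamma_mux(taps, code: int) -> float:
    """Route the tap selected by an 8-bit image data value (0..255),
    as gamma multiplexer 110 does in response to data on path 116."""
    return taps[code]

taps = gamma_taps(v_high=5.0, v_low=0.0)
analog_data = gamma_mux(taps, 255)  # full-scale code selects the top tap
```

Each column's data line would be driven with the analog voltage produced by such a selection for that column's current pixel value.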
Display 14 may contain subpixels of different colors. For example, display 14 may contain red pixels (subpixels), green pixels (subpixels), and blue pixels (subpixels). Data signals D may be demultiplexed onto corresponding subpixel data lines 136 using data line demultiplexer circuitry such as data line demultiplexer 134. There may be a demultiplexer such as demultiplexer 134 associated with each column of red, green, and blue pixels. During operation, the voltage on line 114 may be placed in a state appropriate for a red subpixel while control signal MUXR is taken high to direct demultiplexer 134 to route the voltage on line 114 to red subpixel data line R. Control signals MUXG and MUXB may likewise be asserted to demultiplex the signal on line 114 onto data lines G and B.
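The time-multiplexed routing performed by demultiplexer 134 can be sketched as below. The dictionary-based representation is an assumption made for illustration; in the actual circuitry, the MUXR, MUXG, and MUXB control signals are asserted one at a time while line 114 holds the voltage for the corresponding subpixel.

```python
def demux_data_line(voltage: float, mux_controls: dict) -> dict:
    """Route the data-line voltage to whichever subpixel line (R, G,
    or B) has its control signal asserted; unselected lines float."""
    return {color: (voltage if asserted else None)
            for color, asserted in mux_controls.items()}

# Red phase: MUXR high while line 114 carries the red subpixel voltage.
red_phase = demux_data_line(3.1, {"R": True, "G": False, "B": False})
```

The green and blue phases would follow in sequence, with MUXG and then MUXB asserted while line 114 is driven to the appropriate voltage for each.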
The circuitry of
Mode selection control signal MODE_SELECT may be deasserted whenever it is desired to route normal image data to subpixel data lines in the columns of display 14 associated with data lines DN . . . DM of
Mode selection control signal MODE_SELECT may be asserted whenever it is desired to route flash data Df from line 128 to line 132 for loading into the pixels of flash region 100 (e.g., when loading signals into region 100 using data lines DN . . . DM and using gate lines GK . . . GL during flash mode operations in the example of
Flash data signals Df may be generated by flash digital-to-analog converter 122 based on a digital flash setting signal that control circuitry 16 supplies to converter 122 at control input 120. Converter 122 may produce different values of Df for different flash brightness levels. For example, converter 122 may produce a relatively large voltage Vf for use as flash data Df when the flash setting on input 120 is set to a “high” setting, may produce a relatively low voltage Vf when the flash setting on input 120 is set to a “low” setting, and may produce an intermediate voltage Vf when the flash setting on input 120 is set to a “medium” setting. When high data values Df are loaded into the pixels of flash region 100, the pixels of region 100 will produce bright output. The use of medium or low data values Df will result in corresponding medium or low output light levels from region 100. The use of three different brightness settings is merely illustrative. Converter 122 may support more than three different brightness levels or fewer than three different levels. Converter 122 may also produce data values Df that are different for the subpixels of different colors in region 100. This allows the color temperature or other color attributes of the output light produced by flash region 100 to be adjusted. Color adjustments may be made independently of brightness level adjustments or different colors may be associated with different brightness levels. Flash data Df is generally uniform across region 100 (i.e., all of pixels 22 in region 100 receive the same data: the same red subpixel value, the same green subpixel value, and the same blue subpixel value). If desired, data Df can be varied within region 100 to create flash illumination with a non-uniform intensity pattern.
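The behavior of flash digital-to-analog converter 122 described above may be sketched as follows. The specific voltages and the three-level setting names mirror the example in the text, but the numeric values and per-color scaling factors are illustrative assumptions, not values from this disclosure.

```python
# Hypothetical mapping from a digital flash setting to a voltage Vf.
FLASH_SETTINGS = {"low": 1.0, "medium": 2.5, "high": 4.0}

def flash_dac(setting: str, per_color_scale=None) -> dict:
    """Produce per-subpixel flash data values Df. A per-color scale
    allows the color temperature of the flash output to be adjusted
    independently of the overall brightness level."""
    vf = FLASH_SETTINGS[setting]
    scale = per_color_scale or {"R": 1.0, "G": 1.0, "B": 1.0}
    return {color: vf * s for color, s in scale.items()}

# A warmer flash: blue and green subpixels driven slightly lower.
warm_flash = flash_dac("high", {"R": 1.0, "G": 0.9, "B": 0.8})
```

With uniform flash data, every pixel in region 100 would receive the same dictionary of subpixel values; a non-uniform pattern would vary these values across the region.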
As shown in
During the operations of step 140, the display driver circuitry of
In situations in which no flash data is to be presented (e.g., in situations in which flash region 100 is being used to display an image and is not being used to produce flash illuminations), the operations of step 140 may be used to load data into all rows of display 14 (e.g., rows GK . . . GF) while image data is presented to all data lines D0 . . . DF. Once an entire frame of image data has been loaded into display 14 and displayed for a user, processing may loop back to step 140, as indicated by line 142, so that another frame of image data may be processed.
In situations in which flash data is to be presented in flash region 100, signal MODE_SELECT may be asserted (step 144). During step 144, the gate lines of display 14 that overlap flash region 100 may be asserted in sequence. At the same time, image data may be presented to the data lines that do not overlap the flash region while flash data is simultaneously presented to the data lines that do overlap the flash region. The remainder of the pixels in display 14 (i.e., the pixels in rows below the flash region, if any) may then be loaded with image data by deasserting MODE_SELECT and processing initiated for a fresh frame (step 140).
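The row-by-row loading sequence described in the two steps above can be sketched as a simulation. This is a hypothetical model: the row/column indexing and function name are assumptions, but the logic follows the text, with MODE_SELECT asserted only while gate lines overlapping the flash region are active, and flash data substituted only in the overlapping columns.

```python
def load_frame(rows, flash_rows, flash_cols, image, flash_value):
    """Load one frame row by row. While a gate line overlapping the
    flash region is asserted, MODE_SELECT routes flash data to the
    overlapping data lines; all other pixels receive image data."""
    frame = []
    for r in range(rows):
        mode_select = r in flash_rows  # asserted only for flash rows
        row = [flash_value if (mode_select and c in flash_cols)
               else image[r][c]
               for c in range(len(image[r]))]
        frame.append(row)
    return frame

image = [[0] * 4 for _ in range(4)]
frame = load_frame(4, flash_rows={1, 2}, flash_cols={2, 3},
                   image=image, flash_value=255)
```

Pixels outside the flash rows and columns retain their image data, so the rest of the display continues to show the current frame while the flash region is illuminated.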
If desired, the display driver circuitry for display 14 may be configured to allow the position of flash region 100 to be adjusted by control circuitry 16. This approach may be used, for example, to allow a fingerprint(s) to be captured at a number of different locations on display 14.
Consider, as an example, the illustrative display driver circuitry of
The flash mode selection input for the gamma multiplexer circuitry may be used to adjust the position of the flash region. The inputs for each of gamma multiplexers 110′ may all be independent or groups of two or more of these inputs may be connected together to conserve circuit resources. In situations in which the FLASH_MODE signal in each column is independently adjustable by control circuitry 16, control circuitry 16 can select a pattern of asserted and deasserted FLASH_MODE signals to adjust the horizontal position of flash region 100 to any desired location within display 14. In situations in which there are fewer independently adjustable FLASH_MODE signals, the available horizontal positions for region 100 will be correspondingly restricted, but fewer different control lines will be required. Vertical positioning of region 100 may be implemented by asserting FLASH_MODE in appropriate columns while a set of gate lines that overlap the desired position of region 100 are being asserted.
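The trade-off between independently adjustable FLASH_MODE signals and shared control lines can be illustrated as follows. The function name and grouping scheme are assumptions for illustration; the text specifies only that grouping inputs restricts the available horizontal positions while reducing the number of control lines.

```python
def flash_mode_pattern(n_cols, region_start, region_width, group_size=1):
    """Compute a per-column pattern of FLASH_MODE control signals.

    With group_size > 1, adjacent columns share one control line:
    fewer independent signals are needed, but the flash region can
    only be positioned with group-level horizontal granularity.
    """
    pattern = []
    for first in range(0, n_cols, group_size):
        group = list(range(first, min(first + group_size, n_cols)))
        # A shared line is asserted if any column in its group overlaps
        # the desired flash region.
        asserted = any(region_start <= c < region_start + region_width
                       for c in group)
        pattern.extend([asserted] * len(group))
    return pattern

# Fully independent columns: the region lands exactly where requested.
independent = flash_mode_pattern(8, region_start=2, region_width=3)
```

With larger groups, the asserted span widens to cover whole groups, so the illuminated region may extend somewhat beyond the requested columns.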
During manufacturing, display 14 may be calibrated. For example, test image data may be displayed on display 14 while image calibration measurements are made and test flash data may be displayed in a test flash region of display 14 while flash calibration measurements are made. Resulting calibration data for display 14 (e.g., global image calibration data, region-specific image calibration data, pixel-by-pixel image calibration data, and flash region calibration data for all pixels, blocks of pixels, or each pixel in flash region 100) can then be stored in display 14 and used in producing calibrated image data with gamma block 108 and in producing calibrated flash illumination.
Control circuitry 16 may monitor sensors and other input-output devices 12 to determine when to initiate flash mode operations. For example, circuitry 16 may monitor input from a capacitive touch sensor to determine when a user's finger has been placed over flash region 100. The flash region can then be illuminated so that an array of light detectors 84 in this region can be used to capture a fingerprint (as an example). If desired, other types of sensor input can be processed by control circuitry 16 to determine when a user's finger or other object is in flash region 100 for fingerprint capture. For example, components 84 or other components in device 10 may include infrared emitters and sensors that form light-based proximity sensors. When a proximity sensor reading indicates that a user's finger is present, control circuitry 16 can illuminate region 100 and gather sensor readings from an array of light sensor components 84. In some situations, components 84 (e.g., light sensors) may output signals with a given level during normal ambient lighting conditions and may exhibit output signals with a temporarily reduced level when normal ambient lighting conditions are still present but a finger or other object is shadowing components 84. Force sensors (capacitive sensors, strain gauges, etc.) may be used to detect the presence of a user's finger. In general, proximity sensor measurements, capacitive touch sensor measurements, ambient light sensor shadow detection, actuation of a force sensor (e.g., a strain gauge, etc.), actuation of a switch under region 100, or other suitable arrangements may be used in determining when to activate flash region 100 and capture a fingerprint. The foregoing examples are merely illustrative.
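The sensor-triggered capture sequence described above can be sketched as a small control routine. The callback-based structure and names are hypothetical; the text states only that any of the listed detection modalities may be used to decide when to illuminate region 100 and read the light sensor array.

```python
def maybe_capture_fingerprint(sensor_readings, flash_on, read_light_sensors):
    """If any monitored modality (touch, proximity, ambient-light
    shadowing, force) reports a finger over flash region 100, briefly
    illuminate the region and read the light sensor array."""
    if any(sensor_readings.values()):
        flash_on()                    # illuminate flash region 100
        return read_light_sensors()   # capture reflected-light readings
    return None                       # no finger detected; do nothing

events = []
result = maybe_capture_fingerprint(
    {"touch": True, "proximity": False, "force": False},
    flash_on=lambda: events.append("flash"),
    read_light_sensors=lambda: "sensor-frame")
```

In practice the flash and the sensor readout would be synchronized so that the readings are taken while the finger is illuminated by the flash region.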
In operations such as fingerprint capture operations, it may be desirable for the illumination provided by the pixels of flash region 100 to be uniform. Accordingly, each of the pixels in this region may be provided with the same flash data Df. The color of the light produced in region 100 can be adjusted by adjusting the relative magnitude of the output produced by the red, green, and blue subpixels (or subpixels of other suitable colors) within this uniform data for region 100. If desired, different portions of region 100 can be provided with correspondingly different values of data Df (e.g., to produce patterned flash illumination, graded flash illumination, or flash region output with other non-uniform characteristics). Moreover, the non-flash region of display 14 may be used to display output with a particular brightness (normal, higher than normal, or lower than normal), a particular color (blue, green, red, or other colors), may be used to display a pattern of non-image data, may be used to display modified image data, or may be used to display other desired output during the use of flash region 100 to produce flash output. The use of the non-flash regions of display 14 to display normal image data while flash region 100 supplies uniform flash output is merely illustrative.
If desired, display 14 may be provided with a first region (e.g., region 100 of
As shown in
Mapping circuitry 150 may include 8-bit to 10-bit mapping circuits 156 and 158. Circuit 158 may receive local brightness control signals on brightness control input 154 for a region such as region 100 and circuit 156 may receive brightness control signals on brightness control input 152 for the rest of display 14. Image data for local region 100 of display 14 may be provided to mapping circuit 158 from image buffer 118 at input 116A. Image data for the rest of display 14 may be provided to mapping circuit 156 from image buffer 118 at input 116B. The image data at inputs 116A and 116B may be 8-bit data or may have any other suitable bit size. The corresponding output data on path 116′ may be 10-bit data or may have any other suitable size larger than the input data.
Mapping circuitry 150 may produce an output that is based on the image data and brightness control data presented to the inputs of mapping circuitry 150. Mapping circuit 158 may, for example, produce an output that is equal to the product of the grey level (image data) presented at input 116A and the brightness control signal for region 100 that is presented at input 154, whereas mapping circuit 156 may produce an output that is equal to the product of the grey level (image data) presented at input 116B and the brightness control signal for the region of display 14 other than region 100 that is presented at input 152.
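The 8-bit to 10-bit product computed by mapping circuits 156 and 158 can be sketched as below. The text specifies only that the output is the product of the grey level and the brightness control signal; the 4x stretch used here to place 8-bit data within the 10-bit range, and the 0.0 to 1.0 brightness scale, are assumptions for illustration.

```python
def map_8_to_10(grey_8bit: int, brightness: float) -> int:
    """Scale an 8-bit grey level into the 10-bit range, weighted by a
    per-region brightness control signal (assumed 0.0 .. 1.0).

    The extra headroom in the 10-bit output allows a region's
    brightness to be momentarily boosted without clipping."""
    value = round(grey_8bit * 4 * brightness)  # stretch 0..255 to 0..1020
    return max(0, min(1023, value))            # clamp to 10-bit range

# Region 100 at full brightness vs. the rest of the display dimmed.
flash_region_code = map_8_to_10(255, 1.0)
dimmed_code = map_8_to_10(128, 0.5)
```

Using two such circuits, one fed by input 154 for region 100 and one fed by input 152 for the rest of the display, yields independent brightness control over the two areas while both draw image data from the same buffer.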
There may be two separate sets of gamma mappings for region 100 and the rest of display 14 (i.e., two corresponding sets of curves relating input digital data values to the analog voltage levels produced by circuitry 110 for pixels 22). Consider, as an example, the gamma curves of
In the illustrative configuration of
The brightness of the content in region 100 can be adjusted using brightness setting 154 independently of the brightness of the content in the rest of display 14, which is adjusted using brightness setting 152. Region 100 can have a momentarily enhanced brightness (e.g., to produce a flash of illumination in configurations in which region 100 contains an array of light sensors 84) or can be provided with enhanced brightness for longer periods of time. While the brightness setting for region 100 is being momentarily enhanced, the digital image data corresponding to region 100 can be provided with a single value (e.g., to produce a block of solid white illumination) or may correspond to a pattern or part of an image. If desired, the independence of the brightness adjustments for region 100 and the rest of display 14 may be used to reduce the brightness of the pixels in region 100 relative to the pixels in the rest of display 14. The use of separate brightness adjustments for region 100 and the rest of display 14 to produce a momentarily enlarged brightness in region 100 is merely illustrative.
The foregoing is merely illustrative and various modifications can be made by those skilled in the art without departing from the scope and spirit of the described embodiments. The foregoing embodiments may be implemented individually or in any combination.
This application is a continuation of patent application Ser. No. 17/216,376, filed Mar. 29, 2021, which is a continuation of patent application Ser. No. 16/584,807, filed Sep. 26, 2019, now U.S. Pat. No. 10,984,752, which is a continuation of patent application Ser. No. 16/222,492, filed Dec. 17, 2018, now U.S. Pat. No. 10,467,985, which is a continuation of patent application Ser. No. 15/257,448, filed Sep. 6, 2016, now U.S. Pat. No. 10,157,590, which claims the benefit of provisional patent application No. 62/267,537, filed Dec. 15, 2015, all of which are incorporated by reference herein in their entireties.
This application, Ser. No. 17/741,138, was published as US 2022/0270569 A1 in August 2022.