This relates to sensors and, more particularly, to ambient light sensors for electronic devices.
Cellular telephones and other portable devices with displays, such as tablet computers, sometimes contain ambient light sensors. An ambient light sensor can detect when a portable device is in a bright light environment. For example, an ambient light sensor can detect when a portable device is exposed to direct sunlight. When bright light is detected, the portable device can automatically increase the brightness level of the display to ensure that images on the display remain visible and are not obscured by the presence of the bright light. In dark surroundings, the display brightness level can be reduced to save power and provide a comfortable reading environment.
With conventional devices, ambient light sensors are implemented using first and second silicon photosensors to receive ambient light. The first photosensor is used to detect an amount of infrared light in the ambient light, whereas the second photosensor is used to detect an amount of visible light and infrared light in the ambient light. The first photosensor's reading is subtracted from the second photosensor's reading to obtain a corresponding visible light level. This visible light level is then used to increase or decrease the display brightness level accordingly.
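As a rough illustration of the conventional approach just described, the Python sketch below subtracts an infrared-only reading from a combined visible-plus-infrared reading; the function name and the example readings are illustrative and are not taken from this document.

```python
def conventional_visible_light(ir_reading: float, vis_plus_ir_reading: float) -> float:
    """Estimate visible light by subtracting the IR-only reading from the
    combined visible-plus-IR reading (the conventional two-photosensor method)."""
    visible = vis_plus_ir_reading - ir_reading
    return max(visible, 0.0)  # a sensitivity mismatch between the sensors can otherwise drive this negative

# Illustrative readings in arbitrary sensor units.
print(conventional_visible_light(ir_reading=40.0, vis_plus_ir_reading=100.0))  # prints 60.0
```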
Computing visible light levels in this way, however, requires the infrared sensing capabilities of the first and second photosensors to be accurate and consistent relative to one another. Any mismatch in infrared sensitivity can cause the visible light reading to be erroneous. Designing and manufacturing two silicon photosensors that are well-matched in performance may be challenging.
It would therefore be desirable to be able to provide improved ambient light sensor systems for electronic devices.
An electronic device may have a display with a brightness that is adjusted based on ambient light data from one or more ambient light sensors. The electronic device may be operated in an environment in which the electronic device is exposed to ambient light from at least a given one of a plurality of light sources each producing a different respective ratio of infrared light to visible light. For example, the ambient light may be light-emitting-diode (LED) light, fluorescent light, solar light, incandescent light, tungsten light, a mix of some of these, and/or another type of light.
The ambient light sensor may be used to characterize the ambient light to determine which of the plurality of light sources produced the ambient light. The electronic device may also include control circuitry for adjusting the display brightness based at least partly on which of the plurality of light sources produced the ambient light. In particular, the electronic device may include at least one silicon-based photosensor that can be used to measure a total ambient light level and to generate a corresponding raw sensor output.
The control circuitry may be used to identify the lighting type of the ambient light (i.e., may determine which of the plurality of light sources produced the ambient light) by computing a ratio of signals generated by at least two silicon-based photosensors on the electronic device, identifying a color temperature of the ambient light, identifying a modulation frequency of the ambient light, using a combination of these techniques, or by obtaining other parameters associated with the ambient light. The control circuitry may then select a suitable compensation factor based on the identified light type by referring to a lookup table stored on the control circuitry. The control circuitry may then output a visible light reading indicative of how much visible light is contained in the ambient light (e.g., by taking the product of the raw sensor reading and the selected compensation factor).
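A minimal sketch of this flow, assuming hypothetical light-source labels and compensation factors (the text does not specify numeric values here), might look as follows; it simply classifies the ambient light, looks up a factor K, and multiplies the raw reading by K.

```python
# Hypothetical light-source labels and compensation factors (placeholders only).
COMPENSATION_TABLE = {
    "led": 0.95,
    "fluorescent": 0.90,
    "solar": 0.60,
    "incandescent": 0.45,
    "tungsten": 0.40,
}

def compensated_output(raw_sensor_output: float, light_type: str) -> float:
    """Return a visible-light reading Sout = K * Sraw for the identified source type."""
    k = COMPENSATION_TABLE.get(light_type, 1.0)  # default to no compensation if unknown
    return k * raw_sensor_output

# The same raw reading is scaled differently under different light sources.
print(compensated_output(500.0, "led"))           # 475.0
print(compensated_output(500.0, "incandescent"))  # 225.0
```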
Further features of the present invention, its nature and various advantages will be more apparent from the accompanying drawings and the following detailed description.
Electronic devices such as device 10 of
Device 10 of
Device 10 may include a housing such as housing 12. Housing 12, which may sometimes be referred to as a case, may be formed of plastic, glass, ceramics, fiber composites, metal (e.g., stainless steel, aluminum, etc.), other suitable materials, or a combination of these materials.
Housing 12 may be formed using a unibody configuration in which some or all of housing 12 is machined or molded as a single structure or may be formed using multiple structures (e.g., an internal frame structure, one or more structures that form exterior housing surfaces, etc.).
In some configurations, housing 12 may be formed using front and rear housing structures that are substantially planar. For example, the rear of device 10 may be formed from a planar housing structure such as a planar glass member, a planar plastic member, a planar metal structure, or other substantially planar structure. The edges (sidewalls) of housing 12 may be straight (vertical) or may be curved (e.g., housing 12 may be provided with sidewalls formed from rounded extensions of a rear planar housing wall).
As shown in
Display 14 may, for example, be a touch screen that incorporates capacitive touch electrodes or a touch sensor formed using other types of touch technology (e.g., resistive touch, light-based touch, acoustic touch, force-sensor-based touch, etc.). Display 14 may include image pixels formed from light-emitting diodes (LEDs), organic LEDs (OLEDs), plasma cells, electronic ink elements, liquid crystal display (LCD) components, or other suitable image pixel structures.
Display 14 may have an active region and an inactive region. Active region 22 of display 14 may lie within rectangular boundary 24. Within active region 22, display pixels such as liquid crystal display pixels or organic light-emitting diode display pixels may display images for a user of device 10. Active display region 22 may be surrounded by an inactive region such as inactive region 26. Inactive region 26 may have the shape of a rectangular ring surrounding active region 22 and rectangular boundary 24 (as an example). To prevent a user from viewing internal device structures under inactive region 26, the underside of the cover layer for display 14 may be coated with an opaque masking layer in inactive region 26. The opaque masking layer may be formed from a layer of ink (e.g., black or white ink or ink of other colors), a layer of plastic, or other suitable opaque masking material.
Device 10 may include input-output ports, buttons, sensors, status indicator lights, speakers, microphones, and other input-output components. As shown in
Ambient light sensors may be mounted at any locations within device 10 that are potentially exposed to ambient light. For example, one or more ambient light sensors may be mounted behind openings or other windows in housing 12 (e.g., clear windows or openings in a metal housing, clear windows or openings in a plastic housing, etc.). With one suitable arrangement, one or more ambient light sensors may be formed in device 10 on portions of display 14. For example, one or more ambient light sensors may be mounted to a thin-film transistor layer or other display layer that is located under a display cover layer in inactive region 26 of display 14, as shown by illustrative ambient light sensor locations 18 in
Ambient light sensors may be mounted under ambient light sensor windows in the opaque masking layer in inactive region 26 or may be mounted in other locations in device 10 that are exposed to ambient light. In configurations in which ambient light sensors are mounted under region 26 of display 14, ambient light sensor windows for the ambient light sensors may be formed by creating circular holes or other openings in the opaque masking layer in region 26. Ambient light sensor windows may also be formed by creating localized regions of material that are less opaque than the remaining opaque masking material or that otherwise are configured to allow sufficiently strong ambient light signals to be detected. For example, ambient light sensor windows may be created by locally thinning portions of an opaque masking layer or by depositing material in the ambient light sensor windows that is partly transparent. During operation, ambient light from the exterior of device 10 may pass through the ambient light sensor windows to reach associated ambient light sensors in the interior of device 10.
The ambient light sensors that are used in device 10 may be formed from silicon or other semiconductors. Ambient light sensors may be mounted on one or more substrates within device 10. With one suitable arrangement, ambient light sensors are formed from a semiconductor such as silicon and are mounted on a substrate layer that is formed from one of the layers in display 14. Other types of ambient light sensors and/or mounting arrangements may be used if desired. The use of silicon ambient light sensors that are mounted on a display substrate layer is merely illustrative.
A schematic diagram of an illustrative electronic device such as electronic device 10 of
Storage and processing circuitry 30 may be used to run software on device 10 such as internet browsing applications, voice-over-internet-protocol (VOIP) telephone call applications, email applications, media playback applications, operating system functions, etc. The software may be used to implement control operations such as real-time display brightness adjustments or other actions taken in response to measured ambient light data. Circuitry 30 may, for example, be configured to implement a control algorithm that controls the gathering and use of ambient light sensor data from ambient light sensors located in regions such as regions 18 of
Input-output circuitry 42 may be used to allow data to be supplied to device 10 and to allow data to be provided from device 10 to external devices. Input-output circuitry 42 may include sensors 32 and at least one camera module 34. Sensors 32 may include ambient light sensors, proximity sensors, touch sensors (e.g., capacitive touch sensors that are part of a touch screen display or that are implemented using stand-alone touch sensor structures), accelerometers, and other sensors. Camera module 34 may include an image sensor, a corresponding lens system, and an associated flash unit that can be used to acquire images for a user during operation of device 10.
Input-output circuitry 42 may also include one or more displays such as display 14. Display 14 may be a liquid crystal display, an organic light-emitting diode display, an electronic ink display, a plasma display, a display that uses other display technologies, or a display that uses any two or more of these display configurations. Display 14 may include an array of touch sensors (i.e., display 14 may be a touch screen). The touch sensors may be capacitive touch sensors formed from an array of transparent touch sensor electrodes such as indium tin oxide (ITO) electrodes or may be touch sensors formed using other touch technologies (e.g., acoustic touch, pressure-sensitive touch, resistive touch, etc.).
Audio components 36 may be used to provide device 10 with audio input and output capabilities. Examples of audio components that may be included in device 10 include speakers, microphones, buzzers, tone generators, and other components for producing and detecting sound.
Communications circuitry 38 may be used to provide device 10 with the ability to communicate with external equipment. Communications circuitry 38 may include analog and digital input-output port circuitry and wireless circuitry based on radio-frequency signals and/or light.
Device 10 may also include a battery, power management circuitry, and other input-output devices 40. Input-output devices 40 may include buttons, joysticks, click wheels, scrolling wheels, touch pads, key pads, keyboards, cameras, light-emitting diodes and other status indicators, etc.
A user can control the operation of device 10 by supplying commands through input-output circuitry 42 and may receive status information and other output from device 10 using the output resources of input-output circuitry 42. Using ambient light sensor readings from one or more ambient light sensors in sensors 32, storage and processing circuitry 30 can automatically take actions in real time such as adjusting the brightness of display 14, adjusting the brightness of status indicator light-emitting diodes in devices 40, adjusting the colors or contrast of display 14 or status indicator lights, etc.
Display structures that are used in forming images for display 14 may be mounted under active region 22 of display 14. In the example of
The display structures of display 14 may include a touch sensor array such as touch sensor array 51 for providing display 14 with the ability to sense input from an external object such as external object 76 when external object 76 is in the vicinity of a touch sensor on array 51. With one suitable arrangement, touch sensor array 51 may be implemented on a clear dielectric substrate such as a layer of glass or plastic and may include an array of indium tin oxide electrodes or other clear electrodes such as electrodes 50. The electrodes may be used in making capacitive touch sensor measurements.
Display 14 may include a backlight unit such as backlight unit 70 for providing backlight 72 that travels vertically upwards in dimension Z through the other layers of display 14. The display structures may also include upper and lower polarizers such as lower polarizer 68 and upper polarizer 64. Color filter layer 66 and thin-film transistor layer 60 may be interposed between polarizers 68 and 64. A layer of liquid crystal material may be placed between color filter layer 66 and thin-film transistor layer 60.
Color filter layer 66 may contain a pattern of colored elements for providing display 14 with the ability to display colored images. Thin-film transistor layer 60 may include pixel structures for applying localized electric fields to the liquid crystal layer. The localized electric fields may be generated using thin-film transistors and associated electrodes that are formed on a clear substrate such as a glass or plastic substrate. The electrodes and other conductive structures on thin-film transistor layer 60 may be formed from metal (e.g., aluminum) and transparent conductive material such as indium tin oxide. In the
One or more ambient light sensors 52 may be provided in device 10. As shown in
Indium tin oxide traces or other conductive patterned traces that are formed on thin-film transistor layer 60 may form electrical paths that are connected to leads in ambient light sensors 52. For example, one or more contacts such as gold pads or pads formed from other metals may be attached to indium tin oxide traces or metal traces using anisotropic conductive film (ACF) or other conductive adhesive. Solder connections, welds, connections formed using connectors, and other electrical interconnect techniques may be used to mount ambient light sensors 52 to thin-film transistor layer 60 if desired.
An opaque masking layer such as opaque masking layer 46 may be provided in inactive region 26. The opaque masking layer may be used to block internal device components from view by a user through peripheral edge portions of clear display cover layer 44. The opaque masking layer may be formed from black ink, black plastic, plastic or ink of other colors, metal, or other opaque substances. Ambient light sensor windows such as windows 48 may be formed in opaque masking layer 46. For example, circular holes or openings with other shapes may be formed in layer 46 to serve as ambient light sensor windows 48. Ambient light sensor windows 48 may, if desired, be formed in locations such as locations 18 of
If desired, a flexible printed circuit (“flex circuit”) cable such as cable 90 may be used to interconnect traces 62 on thin-film transistor layer 60 to additional circuitry in device 10 (e.g., storage and processing circuitry 30 of
During operation of device 10, ambient light 74 may pass through ambient light sensor windows 48 and may be detected using ambient light sensors 52. Signals from ambient light sensors 52 may be routed to analog-to-digital converter circuitry that is implemented within the silicon substrates from which ambient light sensors 52 are formed, to analog-to-digital converter circuitry that is formed on thin-film-transistor layer 60 or that is formed in an integrated circuit that is mounted to thin-film transistor layer 60, or to analog-to-digital converter circuitry and/or other control circuitry located elsewhere in device 10 such as one or more integrated circuits in storage and processing circuitry 30 of
If desired, an ambient light sensor may be implemented as part of a silicon device that has additional circuitry (i.e., ambient light sensors 52 may be implemented as integrated circuits). An ambient light sensor with this type of configuration may be provided with built-in analog-to-digital converter circuitry and communications circuitry so that digital light sensor signals can be routed to a processor using a serial interface or other digital communications path.
At step 904, the first output signal is subtracted from the second output signal to obtain a human eye response signal (i.e., a signal substantially proportional to the amount of visible light in the ambient environment). Computing the amount of ambient visible light using this conventional approach, however, requires the infrared sensing capabilities of the first and second sensor elements to be accurate and consistent relative to one another. Any mismatch in infrared sensitivity can cause the visible light reading to be faulty. Because designing and manufacturing two sensors that are well-matched in performance is challenging, it would be desirable to provide improved ways of computing visible light level information.
A silicon photosensor may, as an example, be more sensitive to light outside the visible spectrum than the human eye. As shown in the plot of
The x-axis of
In general, it is desirable for ambient light sensors to detect light in the visible spectrum, since humans are only capable of perceiving light in that particular range. In other words, it may be advantageous for ambient light sensors to detect ambient light levels based only on the amount of visible light present and not on the amount of IR light or other light with wavelengths outside the visible spectrum (e.g., it may be desirable to remove the IR content from the raw silicon photosensor output so that a corresponding final sensor output is representative of only the visible light that is present in the ambient environment).
This can be accomplished by compensating the silicon photosensor output values (e.g., by adjusting/normalizing the responses with non-negligible IR content to match the response of line 110 having negligible IR content). For example, sensor output values associated with line 112 may be adjusted using a first compensation factor, whereas sensor output values associated with line 114 may be adjusted using a second compensation factor that is different than the first compensation factor so as to effectively shift lines 112 and 114 towards line 110 (as indicated by arrow 116 in
In response to receiving ambient light, first photosensor 200 may generate a corresponding signal S1 while second photosensor 202 may generate a corresponding signal S2. Signal S1 may be proportional to the amount of infrared light that is present in the ambient environment, whereas signal S2 may be proportional to the amount of visible and infrared (VIS+IR) light that is present in the ambient environment. Signals S1 and S2 generated in this way may be routed to signal processing circuitry 204 for data conversion, processing, and/or storage. In general, signal processing circuitry 204 may be implemented on the silicon substrate from which the first and second photosensors are formed, on thin-film-transistor layer 60, on an integrated circuit that is mounted to thin-film transistor layer 60, or as part of control circuitry located elsewhere in device 10 such as one or more integrated circuits in storage and processing circuitry 30 of
Signal processing circuitry 204 of
Signal processing circuitry 204 may then refer to a compensation factor lookup table 206 that is stored on circuitry 204 (as an example). Different ratio levels may each correspond to a respective entry in lookup table 206 listing a desired sensor output compensation factor that should be used when calculating the final sensor output value (see, e.g.,
A final compensated sensor output value may be computed using equation 1 as follows:
Sout = K * Sraw (1)
As shown in equation 1, final compensated sensor output signal Sout may be calculated by taking the product of compensation factor K and a raw (uncompensated) sensor output value Sraw. Raw sensor output value Sraw may include visible light and infrared light content (e.g., Sraw may represent the total ambient light level). In the example of
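A sketch of this ratio-based selection, under stated assumptions, is shown below; the ratio thresholds, the compensation factors, and the choice of S2 as the raw total Sraw are illustrative and are not taken from lookup table 206.

```python
# Illustrative ratio thresholds and compensation factors (not from lookup table 206).
RATIO_TO_K = [
    (0.15, 0.95),  # low IR fraction, e.g. LED- or fluorescent-like light
    (0.40, 0.70),  # moderate IR fraction, e.g. solar-like light
    (1.00, 0.45),  # high IR fraction, e.g. incandescent- or tungsten-like light
]

def select_k_from_ratio(s1: float, s2: float) -> float:
    """Pick compensation factor K from the ratio S1/S2 (S1 ~ IR, S2 ~ visible + IR)."""
    ratio = s1 / s2 if s2 else 0.0
    for upper_bound, k in RATIO_TO_K:
        if ratio <= upper_bound:
            return k
    return RATIO_TO_K[-1][1]

def compensated_reading(s1: float, s2: float) -> float:
    """Equation (1): Sout = K * Sraw, here treating S2 (visible + IR) as the raw total."""
    return select_k_from_ratio(s1, s2) * s2

# A higher IR fraction selects a smaller K, removing more IR content from the reading.
print(compensated_reading(s1=10.0, s2=100.0))  # ratio 0.10 -> K = 0.95 -> 95.0
print(compensated_reading(s1=60.0, s2=100.0))  # ratio 0.60 -> K = 0.45 -> 45.0
```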
As with other embodiments described herein, signal processing circuitry 304 may be implemented on the silicon substrate from which photosensor 300 is formed, on thin-film-transistor layer 60, on an integrated circuit that is mounted to thin-film transistor layer 60, or as part of control circuitry located elsewhere in device 10 such as one or more integrated circuits in storage and processing circuitry 30 of
Signal processing circuitry 304 of
As shown in table 306 of
The final compensated sensor output value may also be computed using equation 1. In the example of
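A comparable sketch for the color-temperature-based selection follows; the color temperature bands and compensation factors are hypothetical stand-ins for the entries of table 306, which are not reproduced in this text.

```python
# Hypothetical color-temperature bands (in kelvin) and compensation factors.
CCT_BANDS_TO_K = [
    (3000.0, 0.45),  # warm sources (incandescent- or tungsten-like)
    (4500.0, 0.80),  # mid-range sources (fluorescent-like)
    (7000.0, 0.95),  # cool sources (LED- or daylight-like)
]

def select_k_from_cct(color_temperature_k: float) -> float:
    """Map an estimated correlated color temperature to a compensation factor K."""
    for upper_bound, k in CCT_BANDS_TO_K:
        if color_temperature_k <= upper_bound:
            return k
    return CCT_BANDS_TO_K[-1][1]

def compensated_reading_cct(raw_total: float, color_temperature_k: float) -> float:
    """Equation (1): Sout = K * Sraw with K chosen from the color-temperature table."""
    return select_k_from_cct(color_temperature_k) * raw_total

# A warm (low color temperature, IR-rich) source gets stronger compensation.
print(compensated_reading_cct(500.0, 2700.0))  # K = 0.45 -> 225.0
print(compensated_reading_cct(500.0, 6500.0))  # K = 0.95 -> 475.0
```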
As with other embodiments described herein, signal processing circuitry 404 may be implemented on the silicon substrate from which photosensor 400 is formed, on thin-film-transistor layer 60, on an integrated circuit that is mounted to thin-film transistor layer 60, or as part of control circuitry located elsewhere in device 10 such as one or more integrated circuits in storage and processing circuitry 30 of
Signal processing circuitry 404 of
As shown in table 406 of
The final compensated sensor output value may also be computed using equation 1. In the example of
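A similar sketch for the modulation-frequency-based selection is given below; the frequency bands, source labels, and factors are assumptions for illustration (fluorescent lamps typically flicker at roughly twice the AC mains frequency, while sunlight shows essentially no flicker), and table 406 itself is not reproduced here.

```python
# Illustrative flicker-frequency classes and compensation factors (not from table 406).
MODULATION_CLASS_TO_K = {
    "solar": 0.60,
    "fluorescent": 0.90,
    "led": 0.95,
}

def classify_by_modulation(flicker_hz: float) -> str:
    """Hypothetical classification of the dominant source by its flicker frequency.
    A real implementation would likely distinguish more cases, possibly using
    modulation depth in addition to frequency."""
    if flicker_hz < 5.0:
        return "solar"  # essentially unmodulated light
    if 90.0 <= flicker_hz <= 130.0:
        return "fluorescent"  # roughly twice the AC mains frequency (100 Hz or 120 Hz)
    return "led"  # PWM-driven LEDs commonly flicker at other frequencies

def compensated_reading_modulation(raw_total: float, flicker_hz: float) -> float:
    """Equation (1): Sout = K * Sraw with K chosen from the modulation-frequency table."""
    return MODULATION_CLASS_TO_K[classify_by_modulation(flicker_hz)] * raw_total

# A 120 Hz flicker is treated as fluorescent-like light in this sketch.
print(compensated_reading_modulation(500.0, 120.0))  # K = 0.90 -> 450.0
```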
Methods of compensating raw sensor output values as described in connection with
If desired, other combinations of techniques may be employed to determine whether the predominant light source in the ambient environment is an LED light source, a fluorescent light source, a solar light source, an incandescent light source, a tungsten light source, or another type of light source. In other suitable embodiments, ambient light sensor 52 may be operated in an environment with mixed lighting. In such scenarios, processing circuitry 204, 304, and 404 (sometimes referred to as control circuitry) may identify the lighting type based on the dominant light source that is present in the ambient environment.
At step 504, a signal processor (e.g., control circuitry 204, 304, and/or 404) may receive the raw sensor reading and select an appropriate compensation factor K by referring to a lookup table based on a computed ratio (as described in connection with
At step 506, a compensated sensor reading Sout may be obtained by computing the product of Sraw and the selected compensation factor (see equation 1). The compensated sensor reading Sout, which is representative of the ambient brightness level according to the human eye response (without any infrared content), may then be used in controlling the brightness level of display 14 during normal operation of device 10 (step 508).
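Putting the steps together, the sketch below chains the compensation and brightness-control stages into one update routine; the mapping from the compensated reading to a backlight level, including the full-scale value and minimum level, is an illustrative choice rather than a mapping specified here.

```python
def display_brightness_percent(s_out: float, full_scale: float = 1000.0) -> float:
    """Map the compensated visible-light reading to a backlight level in percent.
    Both the full-scale value and the minimum level are illustrative choices."""
    level = 100.0 * min(max(s_out, 0.0), full_scale) / full_scale
    return max(level, 5.0)  # keep some minimum brightness in dark surroundings

def ambient_light_update(s_raw: float, k: float) -> float:
    """Steps 504-508 in outline: compensate the raw reading, then set the brightness."""
    s_out = k * s_raw  # step 506: Sout = K * Sraw (equation 1)
    return display_brightness_percent(s_out)  # step 508: adjust display 14 accordingly

# Example: Sraw = 900 with K = 0.45 yields Sout = 405 and a 40.5 percent backlight level.
print(ambient_light_update(s_raw=900.0, k=0.45))
```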
The foregoing is merely illustrative of the principles of this invention and various modifications can be made by those skilled in the art without departing from the scope and spirit of the invention. The foregoing embodiments may be implemented individually or in any combination.