The disclosed technology relates to an electronic device configured to use the device's own display to provide an illumination source for front-facing image sensors. Aspects are also directed to methods of using the same.
Many digital devices are equipped with a front-facing image sensor for capturing self-images of a user. However, most of the devices equipped with front-facing image sensors lack a dedicated illumination source to provide additional illumination for capturing the self-image using the front-facing image sensor in a low-light environment. In many cases, the benefit of adding such an illumination source does not outweigh the added technological complexity and the associated cost of having a dedicated illumination source for the digital devices having front-facing image sensors.
In one aspect, a mobile device for capturing one or more digital images of a subject is disclosed. The mobile device includes an image sensor, a display device, a memory, and a processor coupled to the memory. The processor is configured to receive a command to capture the one or more digital images, determine that the one or more digital images include the subject, cause the display device to output, in a dynamic illumination mode and based on a preexisting illumination condition, an illumination image in response to the command to capture the one or more digital images of the subject, and cause the image sensor to receive the one or more digital images of the subject while the subject is illuminated by the illumination image. The subject is not depicted by the display device while the illumination image is output by the display device.
In another aspect, a method is disclosed for capturing one or more digital images of a subject using a mobile device including an image sensor and a display device. The method includes receiving a command to capture the one or more digital images, determining that the one or more digital images include the subject, causing the display device to output, in a dynamic illumination mode and based on a preexisting illumination condition, an illumination image in response to the command to capture the one or more digital images of the subject, wherein the subject is not depicted by the display device while the illumination image is output by the display device, and capturing the one or more digital images of the subject using the image sensor while the subject is illuminated by the illumination image.
In another aspect, a non-transitory computer-readable medium is disclosed comprising instructions that, when executed, cause a mobile device including an image sensor and a display device to perform steps including receiving a command to capture one or more digital images, determining that the one or more digital images include a subject, causing the display device to output, in a dynamic illumination mode and based on a preexisting illumination condition, an illumination image in response to the command to capture the one or more digital images of the subject, wherein the subject is not depicted by the display device while the illumination image is output by the display device, and capturing the one or more digital images of the subject using the image sensor while the subject is illuminated by the illumination image.
Many digital devices come with a front-facing image sensor for capturing self-images of a user. The captured self-image may be a static image such as a photograph, or may be a dynamic image such as a video. However, most if not all devices with front-facing cameras lack a dedicated illumination source (e.g., a flash or LED light for capturing still images or video). As a result, when using the front-facing image sensor in a low-light environment, the illumination from ambient light may not be sufficient to provide adequate illumination for the image sensor. While adding a flash or an LED source can provide a solution, the benefit of adding such an illumination source does not outweigh the added technological complexity and the associated cost of having a dedicated illumination source for a front-facing camera on the digital devices. Thus, there is a need for a cost-effective illumination source for capturing images using the front-facing image sensor of a digital device.
The disclosure is directed to an electronic device having a front-facing image sensor and a digital display, where the electronic device is configured to use the digital display as an illumination source for the front-facing image sensor. Aspects are also directed to methods of using the same. One advantage of the system described herein is that it improves the low-light performance of the front-facing image sensor of the electronic device without incurring the added costs or complexity of an additional illumination source.
Thus, one embodiment is an electronic device that is configured to illuminate the digital display as an image is being captured by a front facing camera. The user may activate the front facing camera to capture an image, and this would cause the digital display to flash a bright white color while the image is being captured. In another aspect, the digital display may brighten to a predefined brightness, or to a predefined color, as the image is being captured. This feature may allow the user to choose how the digital display is used to improve a low-light capture of images from the front camera.
The following disclosure may describe the features of various embodiments of a digital device having a front-facing image sensor and a display device configured as an illumination source in the context of one type of device (e.g., a smart phone). However, it is to be understood that other embodiments are possible, including any suitable electronic device that can be configured to have a front-facing image sensor and a display device configured as an illumination source. Such devices include, for example, mobile phones, tablet computers, notebook computers, desktop computers, video cameras, and portable music players, among others. In addition, display devices that may provide this function include LED, LCD, OLED, AMOLED, or other similar types of displays that can be configured as an illumination source for a front-facing image sensor of a digital device.
As described herein, an “image” may refer not only to a still digital image but also to a video comprising instantaneous frames of many images. In addition, an image can refer to images displayed on the display device 104, or images that exist in a memory device or storage device of the digital device 102 but are not displayed on the display device 104.
As shown, the user 120 would begin an image capture mode with the digital device 102 wherein an illumination image 106 would be displayed. The user 120 could then activate a shutter button to capture the image at a particular point. As the shutter button is activated, the digital device 102 would instruct the display device 104 to flash a bright white color that would better illuminate the user 120. This would improve the image being captured by adding additional light onto the user 120.
During use, such as a video call, the first display device 104A may be configured to brighten while the first user 120A is on the call. This brightening would allow the system to transmit a higher quality image to the second user 120B. Similarly, the second display device 104B could be configured to brighten while the second user 120B is on a video call.
A digital device with a front-facing image sensor such as the first and second digital devices 102A and 102B of
The illustrated digital device 200 includes the central processing module 250 configured to control the overall operation of the digital device 200 and may include a suitable microprocessor configured to perform processing functions of the digital device 200. In some embodiments, the central processing module 250 includes specialized sub-processing modules such as a graphics processing module.
The digital device 200 further includes the command input module 210 configured to receive various modes of command input from a user. In some embodiments, the command input module 210 can include any number of suitable input devices such as a voice recognition device, a gesture recognition device, a motion sensing device, a touch screen device, a keyboard device, and an auxiliary input/output (I/O) device, among others. The command input module can also include supporting circuitry to transform physical input signals such as a voice wave or a motion into digital signals.
The digital device 200 further includes the illumination sensing module 220 configured to determine an illumination condition. The illumination sensing module 220 comprises the front-facing image sensor and an image sensor controller. The image sensor includes a plurality of pixels configured to convert incident photons into electrical signals, which are transferred to the central processing module to be processed. In a typical image sensor, each pixel includes a photosensitive area, which is configured to absorb incident photons of light. In some embodiments, incident photons may be directed by a micro lens over each pixel to enhance the quantum efficiency of photon collection. The absorbed photons are converted into electrons, whose number may depend on the energy of the incident photon. The electrons are in turn converted to a voltage signal.
In some embodiments, the image sensor includes a charge-coupled device (CCD) image sensor. A CCD image sensor comprises a color filter array and a pixel array. The color filter array comprises a pattern of red, green, and blue filters, with one color filter disposed over each pixel. In one example, the color filters may be arranged in a Bayer filter pattern having a 2×2 checkerboard color filter pattern. The 2×2 checkerboard pattern of a Bayer filter includes one red and one blue filter disposed diagonally to one another and two green filters disposed diagonally to one another. The filtered photons passing through the different color filters are then absorbed by a photodiode within the pixel array. The photodiode converts the absorbed photons into a charge, and the charge is moved to a single location by applying different voltages to the pixels, in a process called charge-coupling. Because the charge in a pixel is moved by applying different voltages, CCD image sensors are supported by external voltage generators.
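By way of illustration only, the 2×2 Bayer tiling described above can be sketched in a few lines of Python. The RGGB orientation of the tile is an assumption for illustration; commercial sensors use several rotations of the same pattern.

```python
def bayer_pattern(rows, cols):
    """Return a rows x cols color-filter-array layout built from the 2x2
    Bayer tile described above: two green filters on one diagonal, one red
    and one blue filter on the other (RGGB orientation assumed here)."""
    tile = [["R", "G"],
            ["G", "B"]]
    return [[tile[r % 2][c % 2] for c in range(cols)] for r in range(rows)]
```

For any even-sized array, half of the filters are green, and a quarter each are red and blue, which mirrors the Bayer pattern's bias toward the eye's green sensitivity.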
In some embodiments, the image sensor includes a complementary metal oxide semiconductor (CMOS) image sensor. Like CCD image sensors, CMOS image sensors include an array of photo-sensitive diodes, one diode within each pixel. Unlike CCDs, however, each pixel in a CMOS imager has its own individual integrated amplifier. In addition, each pixel in a CMOS imager can be read directly in an x-y coordinate system, rather than through movement of a charge. Thus, a CMOS image sensor pixel detects a photon directly and converts it to a voltage, which is outputted.
The illumination sensing module 220 includes additional circuitry for converting the outputted voltages resulting from an incident photon into digital information, which may be processed by the central processing module 250. The illumination sensing module 220 further includes an image sensor controller configured to control the image sensor in response to various commands from the central processing module 250.
The illumination adjustment module 230 may be configured to adjust the illumination conditions of the display device between an imaging illumination condition and a normal viewing illumination condition in response to a command received from a user. The illumination adjustment module includes the display device and a display controller. In one embodiment, the display device can include an active matrix organic light-emitting diode (AMOLED) display comprising an active matrix of organic light-emitting diode (OLED) pixels that generate light upon electrical activation. The OLED pixels can be integrated onto a thin film transistor (TFT) array, which functions as a series of switches to control the current flowing to each individual pixel. Other embodiments of the display device are possible, including an LED, LCD, OLED, AMOLED, or any other similar type of display that can be configured as an illumination source for a front-facing image sensor of the digital device 200.
The light emission intensity and therefore the luminance of each pixel within a display can be adjusted by the current supplied to an emitting element, such as a light-emitting diode (LED). In one implementation, the display is an active matrix display such as an AMOLED, whose pixels comprise two transistors and a capacitor. A first transistor whose drain is connected to a light emitting diode (e.g., OLED) is configured to control the amount of current flowing through the diode and therefore the light emission intensity by controlling a gate-source voltage of the first transistor. The gate-source voltage is in turn maintained by the capacitor connected between the gate and the source of the first transistor. The gate-source voltage can be modified by controlling the amount of charge stored in the capacitor through controlling a second transistor, whose gate is connected to a row select line and whose source is connected to a data line. Thus, by controlling various voltages such as the row select line voltage and the data line voltage to control the second transistor, which in turn controls the current delivered to the light emitting diode through the first transistor, the luminance value of each pixel in the display device can be adjusted to provide varying degrees of illumination for the front-facing image sensor.
The front-facing image sensor module 240 is configured to capture the digital image through the front-facing image sensor under the imaging illumination condition. The front-facing image sensor module can include and share similar hardware devices as the illumination sensing module. For example, the front-facing image sensor module 240 comprises the front-facing image sensor and an image sensor controller, whose functions and operations are substantially the same as those of the illumination sensing module 220. In addition, the illumination adjustment module 230 performs the calculations necessary to determine the various illumination conditions for the display device.
The digital device 200 further includes the memory module 260 configured to store information while the digital device 200 is powered on. The memory module 260 can include memory devices such as a static random access memory (SRAM) and a dynamic random access memory (DRAM). The memory devices can be configured as different levels of cache memory communicatively coupled to the central processing module 250 through a memory bus that provides a data path for flow of data to and from the memory devices and the microprocessor. In particular, the memory module may hold image information at various stages of the operation of the digital device to provide illumination for the front-facing image sensor using the display device.
The digital device 200 further includes the storage module 270 configured to store media such as photo and video files, as well as software codes. In some embodiments, the storage module 270 is configured to permanently store media even when the digital device 200 is powered off. In some implementations, the storage module 270 includes storage media, such as a hard disk, a nonvolatile memory such as flash memory, read-only memory (ROM), among others.
The digital device 200 further includes the communication subsystem 280 configured to communicatively connect the digital device 200 to the network 290. The communication subsystem 280 includes circuitry configured for wireless communication. For example, the communication subsystem 280 may enable Wi-Fi® communication between the digital device 200 and the network 290 using one of the 802.11 standards. The communication subsystem 280 may additionally enable standards such as BLUETOOTH®, Code Division Multiple Access® (CDMA), and Global System for Mobile Communication® (GSM), among others.
The digital device of the illustrated embodiments in
The method 300 of using a digital device with a front-facing image sensor and a display device configured as an illumination source begins at a start state 310 and moves to a state 320 to receive a command to capture a digital image using the front-facing image sensor. In one aspect, the command may be received in any suitable form that can be processed by the command input module 210, including a voice command processed by a voice recognition device, a gesture command processed by a gesture recognition device, a touch command processed by a touch screen device, a keyboard command processed by a keyboard device, a motion command processed by a motion sensing device, among other suitable forms of a user command.
After receiving the command to capture the digital image at the state 320, the method 300 moves to a state 330 and activates the front-facing image sensor. In one aspect, activating the front-facing image sensor at state 330 can include, for example, providing an access voltage to the access lines of the image sensor and providing Vcc to an image sensor controller of the illumination sensing module 220.
An illumination condition provided by the display device can be defined by many parameters, including luminance and chrominance values of the pixels of the display device. As a person having ordinary skill in the art will understand, the actual values of luminance and chrominance depend on the color space being used to describe them. For example, in RGB or sRGB color spaces, each pixel can have a relative luminance Y represented by the equation Y=rR+gG+bB, where R, G, and B represent the color components red, green, and blue, and r, g, and b are constants. For the sRGB space, the constants r, g, and b have the values 0.2126, 0.7152, and 0.0722, respectively. In the Y′UV color space, for example, Y′ represents a luma value and U and V represent two color components. The RGB space and the Y′UV space are related by the well-known transformational relationships Y′=0.299R+0.587G+0.114B, U≈0.492(B−Y′), and V≈0.877(R−Y′).
In addition, a person skilled in the art will also understand that any suitable color space representation, such as YUV, YCbCr, or YPbPr, can be used to represent an illumination condition of the pixels of the display device. In the description herein, the term “luminance” is used generally to refer to an overall intensity of the light, and the term “chrominance” is used generally to refer to a color component.
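By way of illustration only, the relative-luminance relationship described above can be sketched as follows, using the sRGB constants from the text. This is a simplified model (it ignores gamma encoding and assumes linear RGB values in the range 0 to 1).

```python
# sRGB relative-luminance coefficients, as given in the text: Y = rR + gG + bB
R_COEF, G_COEF, B_COEF = 0.2126, 0.7152, 0.0722

def relative_luminance(r, g, b):
    """Relative luminance Y of a single linear-RGB pixel (components 0-1)."""
    return R_COEF * r + G_COEF * g + B_COEF * b

def average_luminance(pixels):
    """Average relative luminance over an iterable of (r, g, b) tuples,
    e.g. the pixels of a metering region in a test frame."""
    values = [relative_luminance(r, g, b) for r, g, b in pixels]
    return sum(values) / len(values)
```

A fully white pixel (1, 1, 1) has luminance 1.0 because the three coefficients sum to one; a pure green pixel contributes far more luminance than a pure blue one, reflecting the eye's sensitivity.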
According to one embodiment, the method 300 of using the digital device with a front-facing image sensor includes providing a dynamic illumination mode, which can be selected by the user. When activated by the user, the dynamic illumination mode allows for an optimization of the illumination condition provided by the display device based on a pre-existing illumination condition determined by the illumination sensing module. When the dynamic illumination mode is not activated by the user, a predetermined default illumination condition is provided by the display device, irrespective of the pre-existing illumination condition. Details of the illumination modes will be more apparent in the discussions that follow. After activating the front-facing image sensor at the state 330, the method 300 moves to a decision state 340 to determine whether or not a dynamic illumination mode has been activated.
When a determination is made at the decision state 340 that the dynamic illumination mode is not activated, the method 300 adjusts the display device to a default imaging illumination condition at a process state 350. Additional details on the steps performed to adjust the display device at the state 350 are discussed below with reference to
However, when a determination is made at the decision state 340 that the dynamic illumination mode is activated, the method 300 moves to a process state 360 to determine a preexisting illumination condition. Additional information on how to determine a preexisting illumination condition can be found with reference to
Once the preexisting illumination condition has been determined at the process state 360, the method 300 moves to a decision state 370 to determine whether additional illumination is needed. This determination may be based on the computed difference between an average luminance value of the subject and a stored luminance criteria corresponding to that subject. If the computed difference exceeds a certain threshold percentage value, the method 300 may proceed to a process state 380 to adjust the display device to an optimized imaging illumination condition. However, if the computed difference does not exceed a certain threshold percentage value, the method 300 proceeds to the process state 350 to adjust the display device to a default imaging illumination condition as discussed above.
By way of an example only, the stored target luminance criteria for a human face may include 18% in gray scale of the luminance curve. In an 8-bit luminance curve, there may be 2^8=256 levels of luminance values, such that 18% in gray scale corresponds to the 46th gray level. If the human face captured in the test frame has an average luminance value corresponding to, for example, 10% in gray scale, corresponding to the 26th gray level in an 8-bit luminance curve, the computed difference would be 8%. Whether the method 300 proceeds to adjusting the display device to an optimized imaging illumination condition or to a default imaging illumination condition may depend on whether or not the computed difference of 8% exceeds the threshold value in one embodiment.
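The gray-level arithmetic in the example above can be checked with a short sketch; the 18% face target and the 8-bit depth are the example's own values.

```python
def gray_level(percent, bits=8):
    """Map a gray-scale percentage onto the nearest level of a 2**bits
    luminance curve (256 levels for an 8-bit curve)."""
    levels = 2 ** bits
    return round(percent / 100.0 * levels)

target = gray_level(18)    # stored criterion for a human face -> 46th level
measured = gray_level(10)  # face's average luminance in the test frame -> 26th
difference_pct = 18 - 10   # 8% difference, to be compared with the threshold
```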
After adjusting the display device to an optimized imaging illumination condition at the process state 380, the method 300 moves to the state 390 to activate the shutter. The method 300 then moves to a state 392 wherein the image or the video frame is captured while the illumination image is displayed on the display device. The method 300 then moves to a state 394 wherein the shutter is deactivated. Finally, the method 300 moves to a state 396 wherein the display device is returned to normal illumination condition.
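The overall state sequence of method 300, from receiving the command through restoring the display, can be summarized in schematic form. The sketch below is illustrative only: the 5% threshold is an assumed placeholder, and the measurement and display-control steps are reduced to state labels.

```python
def capture_with_display_illumination(dynamic_mode, luminance_diff_pct,
                                      threshold_pct=5.0):
    """Schematic of method 300: pick default vs. optimized illumination
    (states 340-380), then shutter, capture, and restore (states 390-396).
    Returns the sequence of states visited, for illustration."""
    states = ["receive_command", "activate_sensor"]          # states 320, 330
    if dynamic_mode and luminance_diff_pct > threshold_pct:  # states 340-370
        states.append("optimized_illumination")              # state 380
    else:
        states.append("default_illumination")                # state 350
    states += ["activate_shutter", "capture_frame",          # states 390, 392
               "deactivate_shutter", "restore_display"]      # states 394, 396
    return states
```

With the dynamic mode off, or with a luminance difference below the threshold, the default imaging illumination condition is used; in every path the display is returned to its normal condition last.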
According to one implementation, the process 360 for determining the preexisting illumination condition includes capturing a test frame at the state 362. A test frame may be a frame captured using a set of fixed test frame imaging conditions, including an f-number and an exposure time. In some implementations, the test frame imaging conditions include a relatively low f-number and a relatively short exposure time compared to actual imaging conditions in order to maximize speed. In other implementations, the test frame imaging conditions include an f-number and an exposure time that are similar to actual imaging conditions.
Still referring to
Determining the metering region may include determining a rectangular area comprising a fixed percentage of the total display area of the test frame to be the metering region. By way of an example only, the metering region may be a rectangular region having a width equal to about 75% of the test frame width and a length equal to about 75% of the test frame length. Other embodiments are possible, where the metering region may include a non-rectangular area and/or a rectangular area occupying different percentages of the length and/or width of the test frame.
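A centered rectangular metering region of the kind described, covering a fixed fraction of each dimension, might be computed as follows. The 75% fraction comes from the example above; centering the rectangle within the frame is an assumption for illustration.

```python
def metering_region(frame_width, frame_height, fraction=0.75):
    """Return (left, top, width, height) of a rectangle centered in the
    test frame and covering `fraction` of its width and height."""
    w = int(frame_width * fraction)
    h = int(frame_height * fraction)
    left = (frame_width - w) // 2
    top = (frame_height - h) // 2
    return left, top, w, h
```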
In another aspect, determining the subject to be imaged may be based on any suitable number of subject determination criteria. In some implementations, the subject determination criteria may include determining a fraction of the total test frame area a potential subject occupies. In other implementations, the subject determination criteria may include an average luminance value of the potential subject compared to an overall average of luminance of the total test frame. In yet other implementations, the subject determination criteria may include other criteria such as an average value of the color components of the potential subject compared to an average value of the color components of the total test frame. Using one or more of the subject determination criteria and comparing against a reference list stored in the storage module, a subject of the test frame can be determined.
In another aspect, determining the subject to be imaged may include determining that the subject includes a human face. Determining that the subject is a human may invoke any one or more of face-detection algorithms known in the art. For example, the determination of a human face can be made based on any number of suitable factors, such as the ovular nature of the subject and minimum and maximum distances between the center point and the outer boundaries of the subject.
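Two of the subject-determination criteria described above — the fraction of the test frame a potential subject occupies and its average luminance relative to the whole frame — can be combined into a simple check, sketched here. The threshold values are illustrative assumptions, not values from the disclosure.

```python
def is_candidate_subject(region_area, frame_area,
                         region_avg_luma, frame_avg_luma,
                         min_area_fraction=0.1, min_luma_ratio=1.05):
    """Apply two subject-determination criteria: the potential subject must
    occupy at least `min_area_fraction` of the test frame, and its average
    luminance must stand out from the frame's overall average by at least
    `min_luma_ratio`. Both thresholds are illustrative placeholders."""
    area_ok = (region_area / frame_area) >= min_area_fraction
    luma_ok = (region_avg_luma / frame_avg_luma) >= min_luma_ratio
    return area_ok and luma_ok
```

In a fuller implementation, the result would be compared against the reference list in the storage module, and face-detection criteria such as shape could be added as further tests.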
Still referring to
In some embodiments, the calculated additional illumination in the state 382 may be linearly or nonlinearly proportional to the computed difference between the average luminance value of the subject and the stored luminance criteria corresponding to that subject in the state 366 in
In other embodiments, the additional illumination may be calculated based on a difference between an average chrominance value of the subject and the stored chrominance criteria corresponding to that subject in the state 382. In this embodiment, color components having relatively low average values in the subject of the test frame may be calculated to be over-compensated by the display device while other color components having relatively high average values in the subject of the test frame may be calculated to be under-compensated so as to preferentially compensate color components in order to produce an aesthetically more pleasing image.
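A linear version of the additional-illumination calculation described above might look like the following. The proportionality gain and the clamping to a displayable maximum are assumptions for illustration; a nonlinear mapping, or the chrominance-based variant just described, could be substituted.

```python
def additional_illumination(subject_avg_luma, target_luma, gain=1.0):
    """Linear sketch of the calculation in state 382: the display's extra
    luminance is proportional to the shortfall between the subject's
    average luminance and the stored target. All values are fractions
    in the range 0-1; `gain` is an illustrative constant."""
    shortfall = max(0.0, target_luma - subject_avg_luma)
    return min(1.0, gain * shortfall)
```

Using the earlier worked example, a face at 10% gray against an 18% target yields an 8% shortfall, so the display would be asked for 8% additional luminance at unit gain.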
Still referring to
In one implementation, the illumination image may be an image that was being displayed before receiving the command to capture the digital image in state 320 in
The illumination image may be selected in the state 384 based on the additional illumination calculated in the state 382. For example, a suitable illumination image may be the one capable of providing the calculated additional illumination at the state 382. However, not all available illumination images may be capable of providing the calculated additional illumination at the state 382. As an illustrative example, a first illumination image may have pixels arranged to provide 1-5% of additional luminance, whereas a second illumination image may have pixels arranged to provide 5-10% of additional luminance. In this illustrative example, if the required additional luminance based on the calculated additional illumination at the state 382 exceeds 5%, the second illumination image would be selected over the first illumination image at the state 384.
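The selection between illumination images by their luminance ranges can be sketched as follows; the two ranges mirror the 1-5% and 5-10% illustrative example above, and the dictionary representation is an assumption for illustration.

```python
# Each candidate illumination image advertises the range of additional
# luminance (in percent) that its pixel arrangement can provide.
ILLUMINATION_IMAGES = [
    {"name": "first", "min_pct": 1.0, "max_pct": 5.0},
    {"name": "second", "min_pct": 5.0, "max_pct": 10.0},
]

def select_illumination_image(required_pct, images=ILLUMINATION_IMAGES):
    """Pick the first image whose range covers the required additional
    luminance computed in state 382; return None if none qualifies."""
    for image in images:
        if image["min_pct"] <= required_pct <= image["max_pct"]:
            return image["name"]
    return None
```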
Still referring to
Still referring to
Referring now to
Referring to
Still referring to
Still referring to
The illumination image 404 may be any suitable image displayed on the display device of the digital device 400 for providing adequate illumination for the front-facing image sensor 402. In some implementations, the illumination image 404 may be an image that was being displayed prior to receiving a command from the user to capture a digital image. One implementation of such an illumination image is depicted in
In some implementations, the illumination image may include one or more illumination regions configured such that pixels included in the illumination regions emit white light. A pixel may be configured to emit white light when the intensities of the individual color components (e.g., R, G, and B of the RGB color space) are balanced to have substantially the same values, such that a human eye perceives the resulting light as neutral, without a color preference.
The illumination image 404 of the digital device 400 in
The illumination image 404 of the digital device 400 in
The illumination image 404 of the digital device 400 in
In some implementations, the illumination image 404 may include one or more illumination regions configured such that pixels included in the illumination regions preferentially emit colored light of a color component (e.g., R, G, or B in RGB space). The pixels may be configured to preferentially emit colored light when the intensity of one of the color components is enhanced while the intensities of the other color components are suppressed, such that a human eye perceives the resulting light as having a color. For example, to preferentially emit red light, the subpixels corresponding to green and blue light may be suppressed such that the color component R has a relatively high value in comparison to the color components G and B.
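Balancing or biasing the color components as described — equal components for neutral white, one enhanced component for colored light — can be sketched as follows. The suppression factor is an illustrative assumption.

```python
def white_pixel(intensity):
    """A pixel perceived as neutral white: all color components equal."""
    return (intensity, intensity, intensity)

def tinted_pixel(intensity, boost_channel=0, suppression=0.2):
    """Preferentially emit one color component (0=R, 1=G, 2=B) by
    suppressing the other two; `suppression` is an illustrative factor."""
    pixel = [intensity * suppression] * 3
    pixel[boost_channel] = intensity
    return tuple(pixel)
```

For example, `tinted_pixel(1.0, 0)` keeps the red component at full intensity while reducing green and blue, so the illumination region casts a predominantly red light on the subject.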
The illumination image 404 of the digital device 400 in
The illumination image 404 of the digital device 400 in
The illumination image 404 of the digital device 400 in
The illumination image 404 of the digital device 400 in
In some implementations, the illumination image may include an image captured by the front-facing image sensor. In some implementations, the image captured by the front-facing image sensor may be a preview image for a still image. In other implementations, the image captured by the front-facing camera may be a real-time frame being captured in a video.
One implementation using the image captured by the front-facing camera itself as an illumination image is depicted in
Another implementation using the image captured by the front-facing camera itself as an illumination image is depicted in
Another implementation using the image captured by the front-facing camera itself as an illumination image is depicted in
The previous description of the disclosed implementations is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these implementations will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other implementations without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the implementations shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Implementations disclosed herein provide systems, methods and apparatus for using the device's own display to provide an illumination source for front-facing image sensors. One skilled in the art will recognize that these embodiments may be implemented in hardware, software, firmware, or any combination thereof.
In the description, specific details are given to provide a thorough understanding of the examples. However, it will be understood by one of ordinary skill in the art that the examples may be practiced without these specific details. For example, electrical components/devices may be shown in block diagrams in order not to obscure the examples in unnecessary detail. In other instances, such components, other structures and techniques may be shown in detail to further explain the examples.
Headings are included herein for reference and to aid in locating various sections. These headings are not intended to limit the scope of the concepts described with respect thereto. Such concepts may have applicability throughout the entire specification.
It is also noted that the examples may be described as a process, which is depicted as a flowchart, a flow diagram, a finite state diagram, a structure diagram, or a block diagram. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel, or concurrently, and the process can be repeated. In addition, the order of the operations may be re-arranged. A process is terminated when its operations are completed. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a software function, its termination corresponds to a return of the function to the calling function or the main function.
This application is a continuation application of, and claims priority under 35 U.S.C. § 120 to co-pending and commonly owned U.S. patent application Ser. No. 15/694,715 entitled “DISPLAY DEVICE CONFIGURED AS AN ILLUMINATION SOURCE” filed on Sep. 1, 2017, which is a continuation application of, and claims priority under 35 U.S.C. § 120 to commonly owned U.S. patent application Ser. No. 15/362,595 entitled “DISPLAY DEVICE CONFIGURED AS AN ILLUMINATION SOURCE” filed on Nov. 28, 2016, which is a continuation application of, and claims priority under 35 U.S.C. § 120 to commonly owned U.S. patent application Ser. No. 13/932,844 entitled “DISPLAY DEVICE CONFIGURED AS AN ILLUMINATION SOURCE” filed on Jul. 1, 2013, now granted as U.S. Pat. No. 9,525,811, the entireties of which are incorporated by reference herein.
U.S. Patent Documents

Number | Name | Date | Kind |
---|---|---|---|
6970580 | Kies | Nov 2005 | B2 |
7663691 | Ciudad et al. | Feb 2010 | B2 |
7733383 | Kato | Jun 2010 | B2 |
7738032 | Kollias et al. | Jun 2010 | B2 |
8139122 | Rolston | Mar 2012 | B2 |
8238968 | Frydman | Aug 2012 | B1 |
8248519 | Liao | Aug 2012 | B2 |
8400519 | Choi | Mar 2013 | B2 |
8553103 | Samadani | Oct 2013 | B1 |
8625023 | Rolston | Jan 2014 | B2 |
8695610 | Samain et al. | Apr 2014 | B2 |
8730384 | Guo et al. | May 2014 | B2 |
8941775 | Guo et al. | Jan 2015 | B2 |
9075975 | Bud | Jul 2015 | B2 |
9525811 | Moskovchenko | Dec 2016 | B2 |
9565410 | Huai | Feb 2017 | B2 |
9635255 | Baldwin | Apr 2017 | B1 |
9781321 | Moskovchenko | Oct 2017 | B2 |
9843707 | Hirakata et al. | Dec 2017 | B2 |
11070710 | Moskovchenko | Jul 2021 | B2 |
20010013897 | Kowno et al. | Aug 2001 | A1 |
20020071246 | Stewart | Jun 2002 | A1 |
20030045916 | Anderson et al. | Mar 2003 | A1 |
20030086703 | Kollias et al. | May 2003 | A1 |
20030098922 | Barkan | May 2003 | A1 |
20030189665 | Yamada | Oct 2003 | A1 |
20040075645 | Taylor | Apr 2004 | A1 |
20040125996 | Eddowes et al. | Jul 2004 | A1 |
20040146290 | Kollias et al. | Jul 2004 | A1 |
20040239799 | Suzuki et al. | Dec 2004 | A1 |
20050146863 | Mullani | Jul 2005 | A1 |
20050190288 | Yamada | Sep 2005 | A1 |
20060092184 | Nam | May 2006 | A1 |
20060097978 | Ng et al. | May 2006 | A1 |
20070248342 | Tamminen et al. | Oct 2007 | A1 |
20070279427 | Marks | Dec 2007 | A1 |
20080150878 | Kang | Jun 2008 | A1 |
20080231742 | Kurase | Sep 2008 | A1 |
20080252749 | Fujiwara | Oct 2008 | A1 |
20080259067 | Wang et al. | Oct 2008 | A1 |
20080260242 | MacKinnon | Oct 2008 | A1 |
20080307307 | Ciudad et al. | Dec 2008 | A1 |
20090078852 | Lin et al. | Mar 2009 | A1 |
20090175555 | Mahowald | Jul 2009 | A1 |
20090273661 | Mauchly | Nov 2009 | A1 |
20090322889 | Kujawa | Dec 2009 | A1 |
20100013943 | Thorn et al. | Jan 2010 | A1 |
20100073497 | Katsumata et al. | Mar 2010 | A1 |
20100118179 | Ciudad | May 2010 | A1 |
20100149398 | Gayer | Jun 2010 | A1 |
20100164920 | Shimoharada | Jul 2010 | A1 |
20100194961 | Patel | Aug 2010 | A1 |
20100265228 | Kimura et al. | Oct 2010 | A1 |
20110115833 | Shimoyama | May 2011 | A1 |
20110117959 | Rolston | May 2011 | A1 |
20110205395 | Levy | Aug 2011 | A1 |
20110317988 | Lee | Dec 2011 | A1 |
20120013779 | Hattery et al. | Jan 2012 | A1 |
20120162481 | Kim | Jun 2012 | A1 |
20120243200 | Sutton et al. | Sep 2012 | A1 |
20120249855 | Ciudad et al. | Oct 2012 | A1 |
20120294600 | Osawa | Nov 2012 | A1 |
20130015946 | Lau et al. | Jan 2013 | A1 |
20130038771 | Brunner et al. | Feb 2013 | A1 |
20130050233 | Hirsch | Feb 2013 | A1 |
20130083222 | Matsuzawa | Apr 2013 | A1 |
20130135508 | Inaba | May 2013 | A1 |
20130148002 | Kim et al. | Jun 2013 | A1 |
20130162862 | Zhao | Jun 2013 | A1 |
20130170743 | Finlayson et al. | Jul 2013 | A1 |
20130201653 | Shoemake et al. | Aug 2013 | A1 |
20130314581 | Kido | Nov 2013 | A1 |
20140055978 | Gantz et al. | Feb 2014 | A1 |
20140160314 | Schatvet et al. | Jun 2014 | A1 |
20140225980 | Patel et al. | Aug 2014 | A1 |
20140285699 | Kato | Sep 2014 | A1 |
20140289534 | Parry et al. | Sep 2014 | A1 |
20140293134 | Hung et al. | Oct 2014 | A1 |
20140313303 | Davis et al. | Oct 2014 | A1 |
20140337930 | Hoyos et al. | Nov 2014 | A1 |
20150015721 | Fan et al. | Jan 2015 | A1 |
20150181101 | Ciudad et al. | Jun 2015 | A1 |
20160037042 | Zhang et al. | Feb 2016 | A1 |
20160092724 | Jeong | Mar 2016 | A1 |
20170201664 | Mahowald | Jul 2017 | A1 |
20170366715 | Moskovchenko | Dec 2017 | A1 |
Foreign Patent Documents

Number | Date | Country |
---|---|---|
1574851 | Feb 2005 | CN |
1726696 | Jan 2006 | CN |
101022564 | Aug 2007 | CN |
102053360 | May 2011 | CN |
102232290 | Nov 2011 | CN |
102573239 | Jul 2012 | CN |
103024146 | Apr 2013 | CN |
103152523 | Jun 2013 | CN |
1775939 | Apr 2007 | EP |
2565602 | Mar 2013 | EP |
2905955 | Aug 2015 | EP |
2963913 | Jan 2016 | EP |
2007110717 | Apr 2007 | JP |
2010161783 | Jul 2010 | JP |
2011109483 | Jun 2011 | JP |
100773584 | Nov 2007 | KR |
100839093 | Jun 2008 | KR |
20080058820 | Jun 2008 | KR |
2012099505 | Jul 2012 | WO |
2012103554 | Aug 2012 | WO |
2015002699 | Jan 2015 | WO |
Other Publications

Entry |
---|
European Search Report—EP20163256—Search Authority—Munich—dated Jun. 17, 2020. |
European Search Report—EP18194908—Search Authority—Munich—dated Nov. 7, 2018. |
International Search Report and Written Opinion—PCT/US2014/038991—ISA/EPO—dated Aug. 25, 2014. |
Ma T-Y. et al., “Automatic Brightness Control of the Handheld Device Display with Low Illumination”, Proceedings of the 2012 IEEE International Conference on Computer Science and Automation Engineering (CSAE), Zhangjiajie, China, May 25-27, 2012, vol. 2, pp. 382-385, XP002728351, IEEE, Piscataway, NJ, USA, ISBN: 978-1-4673-0089-6. |
Prior Publication Data

Number | Date | Country | |
---|---|---|---|
20210329152 A1 | Oct 2021 | US |
Related U.S. Application Data

Number | Date | Country | |
---|---|---|---|
Parent | 15694715 | Sep 2017 | US |
Child | 17305240 | US | |
Parent | 15362595 | Nov 2016 | US |
Child | 15694715 | US | |
Parent | 13932844 | Jul 2013 | US |
Child | 15362595 | US |