Dynamic color profile management for electronic devices

Information

  • Patent Grant
  • Patent Number
    9,508,318
  • Date Filed
    Monday, December 31, 2012
  • Date Issued
    Tuesday, November 29, 2016
Abstract
Dynamic white point management techniques include determining a white point of ambient light proximate to a display. A color profile adjustment is determined based upon the determined white point and intensity of the ambient light. The image color space is transformed to a display color space for rendering on the display based on the determined adjustment to the color profile.
Description
BACKGROUND OF THE INVENTION

Electronic devices have made significant contributions toward the advancement of modern society and are utilized in a number of applications to achieve advantageous results. Numerous devices, such as desktop personal computers (PCs), laptop PCs, tablet PCs, netbooks, smart phones, game consoles, servers, and the like have facilitated increased productivity and reduced costs in communicating and analyzing data in most areas of entertainment, education, business, and science. One common aspect of such electronic devices is the display. The display may be utilized to control the operation of the device, output content to the user, and the like.


The display of a number of electronic devices may be subject to changing environments, particularly for mobile electronic devices such as tablet PCs, smart phones, personal game consoles, and the like. The changing environment commonly impacts the clarity of the display for the user. Accordingly, there is a continuing need for improved display technology.


SUMMARY OF THE INVENTION

The present technology may best be understood by referring to the following description and accompanying drawings that are used to illustrate embodiments of the present technology directed toward techniques for dynamically adjusting the color gamut subset of a display to match the current ambient light.


In one embodiment, the method of dynamically managing the white point (e.g., color profile) of a display includes determining a white point and intensity of ambient light proximate a display. A color profile adjustment of a display is determined based upon the determined white point and intensity of the ambient light. The image color space is transformed to a display color space based on the determined color profile adjustment of the display.


In another embodiment, an electronic device including one or more dynamic white point (e.g., color profile) managed displays includes one or more light sensors and one or more processing units. The one or more processing units determine a white point and intensity of the ambient light sensed by the light sensor. The one or more processing units also determine a color profile adjustment based upon the determined white point and intensity of the ambient light. The one or more processing units further transform an image color space to a display color space based on the determined color profile adjustment and present content in the display color space on the display.


This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.





BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments of the present technology are illustrated by way of example and not by way of limitation, in the figures of the accompanying drawings and in which like reference numerals refer to similar elements and in which:



FIG. 1 shows a block diagram of an electronic device including one or more dynamic white point (e.g., color profile) managed displays, in accordance with one embodiment of the present technology.



FIG. 2 shows a flow diagram of a method of dynamically managing the white point (e.g., color profile) of a display, in accordance with one embodiment of the present technology.





DETAILED DESCRIPTION OF THE INVENTION

Reference will now be made in detail to the embodiments of the present technology, examples of which are illustrated in the accompanying drawings. While the present technology will be described in conjunction with these embodiments, it will be understood that they are not intended to limit the invention to these embodiments. On the contrary, the invention is intended to cover alternatives, modifications and equivalents, which may be included within the scope of the invention as defined by the appended claims. Furthermore, in the following detailed description of the present technology, numerous specific details are set forth in order to provide a thorough understanding of the present technology. However, it is understood that the present technology may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail as not to unnecessarily obscure aspects of the present technology.


Some embodiments of the present technology which follow are presented in terms of routines, modules, logic blocks, and other symbolic representations of operations on data within one or more electronic devices. The descriptions and representations are the means used by those skilled in the art to most effectively convey the substance of their work to others skilled in the art. A routine, module, logic block and/or the like, is herein, and generally, conceived to be a self-consistent sequence of processes or instructions leading to a desired result. The processes are those including physical manipulations of physical quantities. Usually, though not necessarily, these physical manipulations take the form of electric or magnetic signals capable of being stored, transferred, compared and otherwise manipulated in an electronic device. For reasons of convenience, and with reference to common usage, these signals are referred to as data, bits, values, elements, symbols, characters, terms, numbers, strings, and/or the like with reference to embodiments of the present technology.


It should be borne in mind, however, that all of these terms are to be interpreted as referencing physical manipulations and quantities and are merely convenient labels and are to be interpreted further in view of terms commonly used in the art. Unless specifically stated otherwise as apparent from the following discussion, it is understood that through discussions of the present technology, discussions utilizing the terms such as “receiving,” and/or the like, refer to the actions and processes of an electronic device such as an electronic computing device, that manipulates and transforms data. The data is represented as physical (e.g., electronic) quantities within the electronic device's logic circuits, registers, memories and/or the like, and is transformed into other data similarly represented as physical quantities within the electronic device.


In this application, the use of the disjunctive is intended to include the conjunctive. The use of definite or indefinite articles is not intended to indicate cardinality. In particular, a reference to “the” object or “a” object is intended to denote also one of a possible plurality of such objects. It is also to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting.


Displays typically have a larger possible color gamut than is used at any one time. The subset of the gamut that is active can be adjusted to remap input colors based on a color profile. In embodiments of the present technology, the color gamut subset used for rendering an image on a display is dynamically adjusted so that the white point of the display matches the current ambient lighting in which the display is utilized.


Referring now to FIG. 1, an electronic device including one or more dynamic white point (e.g., color profile) managed displays, in accordance with one embodiment of the present technology, is shown. The electronic device 110 includes one or more light sensors 120, one or more processing units 130, one or more displays 140 and any number of other subsystems for implementing functionalities of the electronic device. For example, the electronic device may include one or more communication interfaces, a keyboard, speakers, a microphone and the like for implementing devices such as laptop PCs, tablet PCs, netbooks, smartphones and/or the like. However, these additional subsystems are not necessary to an understanding of embodiments of the present technology and therefore are not discussed further.


In one implementation, the light sensor may be one or more photo diodes, such as a red, a green and a blue wide angle photo diode. In another implementation, the light sensor may be a camera. In one implementation, the one or more processing units may include an image processor 132 and a display processor 134. The one or more processing units may also include a central processing unit, a graphics processing unit, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), combinational logic, and/or the like. The one or more processing units may be implemented in any combination of hardware, firmware and/or software. In one implementation, the display may be a basic display. In another implementation, the display may be a touch screen display.


Operation of the electronic device will be further explained with reference to FIG. 2, which shows a method of dynamically managing the white point (e.g., color profile) of a display. The method may be implemented in hardware, firmware, as computing device-executable instructions (e.g., computer program) that are stored in computing device-readable media (e.g., computer memory) and executed by a computing device (e.g., processor) or any combination thereof.


The method may begin with determining a white point and optionally an intensity of the ambient light proximate a display, at 220. In one implementation, the white point is determined by estimating a set of tristimulus values (e.g., XYZ, sRGB) or chromaticity (e.g., xy, uv) for the color white of the ambient light. In another implementation, the white point is determined by estimating a set of chromaticity coordinates (e.g., YUV) for the color white of the ambient light. In one implementation, the one or more light sensors in combination with the one or more processing units periodically determine the white point and intensity of the ambient light proximate the electronic device. In one implementation, a camera of the electronic device captures image frames and estimates the white chromaticity and intensity using auto exposure and auto white balance components of the image processing stack. In another implementation, a combinational logic circuit, central processor or the like may determine the white point and intensity of the light captured by a set of wide angle photo diodes (e.g., red, green and blue).
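The photodiode-based estimate at 220 can be sketched as follows. This is an illustrative sketch only: `sensor_rgb_to_xy` is a hypothetical helper name, and the sensor-to-XYZ matrix shown is the standard sRGB matrix standing in for the per-unit sensor calibration data the description contemplates.

```python
def sensor_rgb_to_xy(r, g, b):
    """Estimate the ambient white point (CIE xy chromaticity) and
    intensity (Y) from wide-angle red/green/blue photodiode readings."""
    # Hypothetical sensor-to-XYZ calibration matrix; the sRGB D65 matrix
    # is shown purely for illustration. A real device would substitute
    # per-unit factory calibration data for this particular sensor.
    m = [
        (0.4124, 0.3576, 0.1805),
        (0.2126, 0.7152, 0.0722),
        (0.0193, 0.1192, 0.9505),
    ]
    cap_x = m[0][0] * r + m[0][1] * g + m[0][2] * b
    cap_y = m[1][0] * r + m[1][1] * g + m[1][2] * b
    cap_z = m[2][0] * r + m[2][1] * g + m[2][2] * b
    total = cap_x + cap_y + cap_z
    if total == 0:
        # No measurable light: fall back to the D65 white point.
        return (0.3127, 0.3290, 0.0)
    # Project XYZ onto the chromaticity plane; Y doubles as intensity.
    return (cap_x / total, cap_y / total, cap_y)
```

With equal sensor readings this matrix yields a chromaticity near D65, which is the expected behavior for a sensor calibrated to that matrix.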


At 230, an adjustment to a color profile of the display is determined based upon the determined white point and optionally the intensity of the ambient light proximate the display. In one implementation, a central processor, ASIC, display processor or the like may determine a gamut subset of the display that has a white point and optionally an intensity that substantially matches the determined white point and optionally the intensity of the ambient light. In one implementation, the adjustment may be determined from a data structure that correlates the relative color response of the light sensor to the chromaticity setting of the display. In another implementation, the adjustment may be determined on a per-unit basis from a data structure that correlates the relative color response of the particular light sensor to one or more known lights and the measured chromaticity at one or more settings of the particular display. The per-unit measurements of the sensor and display may be utilized to overcome manufacturing variations. In addition, both measurements may also be combined with general characteristics of the light sensor and/or display based on more extensive measurements than are appropriate for a manufacturing line.
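A minimal sketch of the data-structure lookup described at 230, assuming a hypothetical calibration table (`CALIBRATION`) with made-up values; a real table would come from factory measurements of the particular sensor and display:

```python
# Hypothetical per-unit calibration table correlating a measured ambient
# chromaticity x coordinate to a display white-point setting (x, y).
# The values are illustrative only, not measured data.
CALIBRATION = [
    # (ambient x, display white x, display white y)
    (0.29, 0.295, 0.305),
    (0.31, 0.313, 0.329),
    (0.35, 0.346, 0.359),
    (0.45, 0.440, 0.403),
]

def profile_adjustment(ambient_x):
    """Linearly interpolate the display white-point setting that best
    matches the measured ambient white point."""
    pts = CALIBRATION
    if ambient_x <= pts[0][0]:
        return pts[0][1:]  # clamp below the table range
    for (x0, dx0, dy0), (x1, dx1, dy1) in zip(pts, pts[1:]):
        if ambient_x <= x1:
            t = (ambient_x - x0) / (x1 - x0)
            return (dx0 + t * (dx1 - dx0), dy0 + t * (dy1 - dy0))
    return pts[-1][1:]  # clamp above the table range
```

Clamping at both ends of the table keeps the adjustment inside the calibrated range, one simple way to honor limits on the display gamut.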


At 240, a correction to the determined color profile adjustment may optionally be performed based on one or more factors. In one implementation, the selected subset of the display color space profile may be adjusted based on a bezel size of the display. In another implementation, the selected profile may be adjusted based on the relative and absolute brightness of the screen and/or the ambient light. In yet another implementation, the selected profile may be adjusted based on limits on the display gamut.


At 250, the color profile adjustment may optionally be dampened, limited or the like. In one implementation, the central processor, ASIC, display processor or the like may dampen or limit the rate of change in the color profile adjustment to provide a smooth change in display color, thus avoiding disruption to the user experience.
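The dampening at 250 might, for example, clamp the per-update step of each white-point coordinate, as in the following sketch; `max_step` is an assumed tuning parameter, not a value from the description:

```python
def dampen(previous, target, max_step=0.002):
    """Limit the per-update change in each white-point coordinate so the
    display drifts smoothly toward the target instead of jumping."""
    out = []
    for p, t in zip(previous, target):
        # Clamp the step to [-max_step, +max_step] per update.
        delta = max(-max_step, min(max_step, t - p))
        out.append(p + delta)
    return tuple(out)
```

Applied once per measurement cycle, repeated calls converge on the target white point while the visible change per frame stays bounded.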


At 260, for an image presented on the display, the color space of the image is transformed to a color space of the display based on the determined, corrected, dampened and/or limited color profile adjustment. In one implementation, the one or more processing units transform the image color space to the display color space based upon the determined color profile adjustment. In one implementation, the transform is determined from a color correction matrix, lookup table, or the like.
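The color correction matrix transform mentioned at 260 can be sketched per pixel as follows; `WARM_CCM` is a made-up von Kries-style diagonal matrix for illustration, not a matrix specified by the description:

```python
def apply_ccm(pixel, ccm):
    """Transform one linear RGB pixel through a 3x3 color correction
    matrix, clamping each channel to the displayable [0, 1] range."""
    return tuple(
        min(1.0, max(0.0, sum(ccm[i][j] * pixel[j] for j in range(3))))
        for i in range(3)
    )

# Illustrative matrix that warms the rendered image toward a lower
# ambient color temperature by attenuating green and blue (a diagonal,
# von Kries-style white point shift). The scale factors are assumptions.
WARM_CCM = [
    (1.00, 0.00, 0.00),
    (0.00, 0.97, 0.00),
    (0.00, 0.00, 0.85),
]
```

In practice the display processor would apply such a matrix (or an equivalent lookup table) in hardware to every pixel of the frame.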


The processes of 220-260 may be iteratively repeated. The processes may be iteratively repeated periodically, when a change in the ambient light is detected, or the like.


A predetermined image color space to display color space transform may optionally be applied at one or more times, at 210. In one implementation, the processing unit may apply a default image color space to display color space transform before the processes of 220-260 are performed. In another implementation, the processing unit may apply a previously determined image color space to display color space transform. The choice of applying a default or previously determined image color space to display color space transform may be based upon a state of the device, timing information and/or the like. For example, if the display is turned off and turned back on again within a few minutes, the odds are good that the last white point could be used again when the display is turned back on. In such an example, the previously determined image color space to display color space transform may be applied. In another example, the default image color space to display color space transform may be applied when the device has been turned back on after several days. In such a case, the odds are good that the ambient light has completely changed and therefore the default image color space to display color space transform may be as good as any until processes 220-260 can be performed.
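The choice at 210 between a previously determined and a default transform, keyed on how long the display was off, might look like the following sketch (the five-minute `reuse_window` is an assumed threshold; the description says only "a few minutes"):

```python
def startup_transform(off_seconds, previous, default, reuse_window=300):
    """Pick the image-to-display transform to apply at power-on, before
    a fresh ambient measurement (processes 220-260) is available."""
    # A short power-off interval suggests the ambient light is unchanged,
    # so the previously determined transform is likely still valid.
    if previous is not None and off_seconds <= reuse_window:
        return previous
    # After a long interval the ambient light has probably changed, so
    # the default transform is as good a starting point as any.
    return default
```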


For electronic devices that already include a camera, a plurality of wide angle photo diodes, or other similar light sensor, dynamic white point adjustment in response to changes in the ambient light may advantageously be implemented without added cost to the bill of materials for the electronic device. The white point adjustment technique may be implemented transparently to the end user of the electronic device. Automatically adjusting the white point of the display according to the current ambient lighting can advantageously reduce eyestrain and provide a better viewing experience for the user. Furthermore, embodiments of the present technology may coexist with manual color profile adjustment, and may include optional controls, such as enable-disable, manual white point selection, and limits on adjustment range and speed, exposed to end users and/or applications.


The foregoing descriptions of specific embodiments of the present technology have been presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the invention to the precise forms disclosed, and obviously many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the present technology and its practical application, to thereby enable others skilled in the art to best utilize the present technology and various embodiments with various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the claims appended hereto and their equivalents.

Claims
  • 1. A method comprising: periodically determining a white point of ambient light proximate a display; determining a color profile adjustment for the color space of the display based upon the determined current white point of the ambient light; dampening a current color profile adjustment based upon one or more previously determined color profile adjustments; and transforming an image color space to the display color space based on the dampened color profile adjustment of the display so that the white point of the display substantially matches the current white point of the ambient light proximate the display, wherein the color gamut of the display color space is greater than the color gamut of the image color space.
  • 2. The method according to claim 1, wherein determining the white point of the ambient light comprises estimating a set of tristimulus values or chromaticity coordinates of a color white of the ambient light.
  • 3. The method according to claim 1, wherein determining the color profile adjustment comprises substantially determining a gamut subset of the display having a white point that substantially matches the determined white point of the ambient light.
  • 4. The method according to claim 1, wherein transforming the image color space to the display color space comprises transforming a white point of an encoded image to a rendered image substantially matching the white point of the ambient light.
  • 5. The method according to claim 1, wherein determining the white point of the ambient light, determining the color profile adjustment, dampening the current color profile adjustment and transforming the image color space to the display color space is iteratively performed on a periodic basis.
  • 6. The method according to claim 1, wherein determining the color profile adjustment, dampening the current color profile adjustment and transforming the image color space to the display color space is iteratively performed each time a change in the white point of the ambient light proximate to the display is determined.
  • 7. The method according to claim 1, further comprising: applying a correction to the determined color profile adjustment based upon one or more parameters selected from a group consisting of a bezel size of the display, a relative brightness of the display, an absolute brightness of the display and a limit on the display gamut; and further transforming the image color space to the display color space based on the dampened corrected color profile adjustment.
  • 8. The method according to claim 1, further comprising applying a predetermined image color space to display color space transform before the processes of determining the white point of the ambient light, determining the color profile adjustment, dampening the current color profile and transforming the image color space to the display color space is performed.
  • 9. A computing device comprising: a means for determining a current white point of ambient light proximate to a display; a means for determining a color profile adjustment for the color space of the display based upon the determined white point of the ambient light; and a means for transforming an image color space to a display color space based on the determined color profile adjustment so that the white point of the display substantially matches the current white point of the ambient light proximate the display, wherein the color gamut of the display color space is greater than the color gamut of the image color space.
  • 10. The computing device of claim 9, further comprising a means for applying a correction to the color profile adjustment based upon one or more parameters selected from a group consisting of a bezel size of the display, a relative brightness of the display, an absolute brightness of the display and a limit on the display gamut.
  • 11. The computing device of claim 9, further comprising a means for applying a predetermined image color space to display color space transform before determining the white point of the ambient light, determining the color space adjustment and transforming the image color space to the display color space is performed.
  • 12. The computing device of claim 9, further comprising a means for iteratively determining the white point of the ambient light, determining the color space adjustment and transforming the image color space to the display color space on a periodic basis.
  • 13. The computing device of claim 9, further comprising a means for iteratively determining the color space adjustment and transforming the image color space to the display color space each time a change in the white point of the ambient light is determined.
  • 14. A computing device comprising: a display; a light sensor to sense ambient light; and a processing unit to determine a white point of the ambient light sensed by the light sensor, to determine a color profile adjustment for the color space of the display based upon the determined white point of the ambient light, to transform an image color space to a display color space based on the determined color profile adjustment and to present content in the display color space on the display so that the white point of the display substantially matches the white point of the ambient light, wherein the color gamut of the display color space is greater than the color gamut of the image color space.
  • 15. The computing device of claim 14, wherein the light sensor comprises a camera.
  • 16. The computing device of claim 14, wherein the light sensor comprises a plurality of photo diodes.
  • 17. The computing device of claim 14, wherein the light sensor senses the ambient light behind the display.
  • 18. The computing device of claim 14, wherein the light sensor senses the ambient light in front of the display.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of U.S. Provisional Patent Application No. 61/700,824 filed Sep. 13, 2012, which is incorporated herein by reference.

Related Publications (1)
Number Date Country
20140071102 A1 Mar 2014 US
Provisional Applications (1)
Number Date Country
61700824 Sep 2012 US