Color enhancement is a known art in the field of consumer electronics in which the appearance of an image (still or video) is made more vibrant by artificially shifting the colors corresponding to real-life objects towards what human observers commonly associate with beauty. For example, a field of grass or a piece of foliage naturally appearing as pale green may be artificially shifted to a more saturated green to make the field or foliage appear fresher and more verdant. A pale blue sky may be artificially shifted towards a more saturated blue to make the sky appear more vibrant and clear. Similarly, pallid human skin may be artificially shifted to a more reddish brown, causing the human skin to appear to have a healthier complexion. Accordingly, circuitry has been developed to detect programmable regions of blue, green, and skin color and to perform a programmable shift when those regions are detected.
Blue, green and skin enhancements are the usual color enhancements performed in the industry. In conventional techniques, images may be encoded as a plurality of pixels, each pixel having a color. In order to perform the color enhancement of an image, the colors of the pixels comprising the image must be detected. Specifically, a determination must be made whether a given pixel in the image has the color of interest (e.g., blue, green and “skin color”). After a pixel having a color of interest is detected, the color value of that pixel is multiplied and/or shifted by a certain amount.
The detection and the shift are usually performed in the YCbCr color space. A YCbCr space is a three-dimensional space in which Y is the monochrome component pertaining to the brightness or luminance of the image, and the Cb-Cr plane corresponds to the color components of the image for a particular value of luminance. Typically, the Cb-Cr color plane comprises a vertical axis (Cr) and a horizontal axis (Cb). For many luminance values, the color green can largely be detected if the value of a pixel's color component falls in the third quadrant (Cb<0, Cr<0). Similarly, the color blue is largely detected in the fourth quadrant (Cb>0, Cr<0). Likewise, skin color is usually detected in the second quadrant (Cb<0, Cr>0).
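As an illustration only, the quadrant heuristic above may be sketched as follows. This is a minimal sketch assuming 8-bit Cb/Cr values centered at 128; the function name and returned labels are hypothetical and not part of any described embodiment.

```python
def classify_quadrant(cb, cr):
    """Classify an 8-bit Cb/Cr pair per the rough quadrant heuristic:
    green in the 3rd quadrant, blue in the 4th, skin tones in the 2nd.
    Values are re-centered around 128 so the axes cross at zero."""
    cb_c, cr_c = cb - 128, cr - 128  # center the 0..255 range on the origin
    if cb_c < 0 and cr_c < 0:
        return "green-ish (3rd quadrant)"
    if cb_c > 0 and cr_c < 0:
        return "blue-ish (4th quadrant)"
    if cb_c < 0 and cr_c > 0:
        return "skin-tone range (2nd quadrant)"
    return "other"
```

The mapping is only a coarse first-pass heuristic; actual detection regions occupy bounded sub-areas of these quadrants, as described below.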
According to conventional methods, a region (typically a triangle for green or blue, and a trapezoid for skin) is defined in a Cb-Cr color plane as a region of interest, and a second, corresponding region (of the same shape as the region of interest) is defined in the same Cb-Cr color plane as the shift region. Any pixel which is detected in the region of interest is thus shifted to a corresponding position in the shift region. As regions of interest and shift regions may overlap in some portions, a pixel may be shifted to another position still within the region of interest. Shifts may be executed as a vector shift, such that every position in a region of interest is shifted by the same vector, i.e., by the same magnitude and in the same direction.
The programmable parameters for blue and green enhancement typically include: (i) the regions of interest (e.g., “detection regions”) based on the side lengths of the triangle and the offset from the origin (O), and (ii) the shift-out vector towards a more lively green or blue. For skin, the detection is based on parameters such as the shift from the origin, the length of the sides of the trapezoid, and the angle of location with respect to the vertical (Cr) axis. Enhancement for skin is a vector that either specifies an inward squeeze of that trapezoidal area (e.g., to make it conform to a narrower range of widely preferred skin hues) or a shift towards red (e.g., to give the skin a more lively, healthy appearance).
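For illustration, a triangular detection region with a programmable shift-out vector may be sketched as below. The same-side point-in-triangle test and the tuple-based interface are assumptions made for this sketch, not a description of any particular circuit.

```python
def _sign(p, a, b):
    # Signed area test: which side of edge a->b does point p fall on?
    return (p[0] - b[0]) * (a[1] - b[1]) - (a[0] - b[0]) * (p[1] - b[1])

def in_triangle(p, tri):
    """Same-side test: p is inside the triangle if it does not lie on
    strictly opposite sides of any two edges."""
    d1 = _sign(p, tri[0], tri[1])
    d2 = _sign(p, tri[1], tri[2])
    d3 = _sign(p, tri[2], tri[0])
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)

def enhance(p, tri, shift_vec):
    """If the (Cb, Cr) point falls inside the triangular detection region,
    apply the programmable shift vector; otherwise leave it untouched."""
    if in_triangle(p, tri):
        return (p[0] + shift_vec[0], p[1] + shift_vec[1])
    return p
```

For example, a triangular green region anchored at the origin in the third quadrant, with a shift vector pointing further into that quadrant, moves a detected pale green towards a more saturated green while leaving out-of-region pixels unchanged.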
For a given set of values for the parameters, conventional methods of detection and shift are performed independently of Y (luminance). In other words, the detection region and the accompanying shift region will not vary along the luminance axis. Specifically, the same detection region and corresponding shift region (according to the same shift vector) will appear in the same relative positions in each Cb-Cr plane for each Y along the luminance axis. However, the positions of colors on the Cb-Cr planes vary along the luminance axis. For example, along the luminance axis, a color region does not always remain restricted to a fixed point, or even a fixed quadrant. Also, the shape of the color region of interest (to be enhanced) grows and shrinks along the luminance axis, and different colors are distributed dissimilarly in Cb-Cr planes along the luminance axis.
Therefore, a color shade that occupies a certain region of the Cb-Cr plane for one value of luminance on the luminance axis may occupy a different region in the Cb-Cr plane at a different luminance value. The color intensity also changes along the luminance axis, so that a color (e.g., green) which moves from dark (green) to light (green) along the luminance axis occupies varying regions on the Cb-Cr plane for varying luminance values. Accordingly, a region of interest which includes the position of a color in a Cb-Cr plane for one luminance may not include the position of the same color in a Cb-Cr plane for another luminance. Thus, a detection region for one luminance that would detect a color and perform a shift for pixels pertaining to one color may not detect the color for another value of the luminance. Conversely, an unintended shift may be performed for a color which was outside the detection region for the original value of luminance, but whose position lies within the detection region at the new value of luminance.
Furthermore, conventional methods are often restricted by several limitations which adversely affect their efficacy. For example, current methods for color enhancement are restricted to blue, green and skin enhancement. Color enhancement for other colors (e.g., red) is not available through conventional color enhancement techniques. Moreover, the shape of the detection regions and corresponding shift regions is typically invariable, and may also be invariable in size along the Y (luminance) axis. These limitations further exacerbate the issue of having undetected enhancement candidates and improper enhancements.
Embodiments of the present invention are directed to providing a method and system for enhancing the display of color input in graphical display devices, such as image display devices and video display devices. A method is provided which allows for the construction of a variable detection volume and a variable shift volume along a luminance axis in a three dimensional color space. Color detection and color shifts therefore advantageously vary with luminance.
One novel method enables a re-positioning of detection regions comprised in the detection volume to account for shifts of a color region. Another novel method provides the ability to adjust the size and orientation of a detection region and corresponding shift region. Yet another novel method allows for the selection and usage of an assortment of shapes for more flexible and precise detection and shift schemes.
Each of the above novel methods provides parameters that vary depending on the luminance of the image, thereby providing advantageous color enhancement in the resultant display. In short, color enhancement is more accurately specified based on the brightness of the color.
The accompanying drawings, which are incorporated in and form a part of this specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention:
Reference will now be made in detail to several embodiments. While the subject matter will be described in conjunction with the alternative embodiments, it will be understood that they are not intended to limit the claimed subject matter to these embodiments. On the contrary, the claimed subject matter is intended to cover alternatives, modifications, and equivalents, which may be included within the spirit and scope of the claimed subject matter as defined by the appended claims.
Furthermore, in the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the claimed subject matter. However, it will be recognized by one skilled in the art that embodiments may be practiced without these specific details or with equivalents thereof. In other instances, well-known processes, procedures, components, and circuits have not been described in detail as not to unnecessarily obscure aspects and features of the subject matter.
Portions of the detailed description that follow are presented and discussed in terms of a process. Although steps and sequencing thereof are disclosed in a figure herein (e.g.,
Some portions of the detailed description are presented in terms of procedures, steps, logic blocks, processing, and other symbolic representations of operations on data bits that can be performed on computer memory. These descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. A procedure, computer-executed step, logic block, process, etc., is here, and generally, conceived to be a self-consistent sequence of steps or instructions leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated in a computer system. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussions, it is appreciated that throughout, discussions utilizing terms such as “accessing,” “writing,” “including,” “storing,” “transmitting,” “traversing,” “associating,” “identifying” or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
While the following exemplary configurations are shown as incorporating specific, enumerated features and elements, it is understood that such depiction is exemplary. Accordingly, embodiments are well suited to applications involving different, additional, or fewer elements, features, or arrangements.
Exemplary Color Enhancement Color Space
With reference now to
In one embodiment, color enhancement color space 100 is an implementation of a component in a color image pipeline. Color enhancement color space 100 may be, for example, one of the components commonly used between an image source (e.g., a camera, scanner, or the rendering engine in a computer game), and an image renderer (e.g., a television set, computer screen, computer printer or cinema screen), for performing any intermediate digital image processing consisting of two or more separate processing blocks. An image/video pipeline may be implemented as computer software, in a digital signal processor, on a field-programmable gate array (FPGA) or as a fixed-function application-specific integrated circuit (ASIC). In addition, analog circuits can be used to perform many of the same functions.
In one embodiment, a color coordinate plane may comprise, for example, a Cb-Cr color space for encoding color information. In a typical embodiment, a color space comprises a plurality of discrete positions in a coordinate plane 101, 103, 105 and 107, each position, when coupled to the associated luminance value, corresponding to a specific color. In further embodiments, each of the color coordinate planes 101, 103, 105 and 107 includes at least one detection region (e.g., detection regions 111, 113, 115, 117). Each detection region 111, 113, 115 and 117 comprises a bounded area of a color coordinate plane 101, 103, 105 and 107 comprising a plurality of positions in the color coordinate plane 101, 103, 105 and 107.
In one embodiment, each detection region 111, 113, 115 and 117 further corresponds to one or more shades in a family of colors for which color enhancement is desired. In another embodiment, a detection region may be separately defined for each color coordinate plane 101, 103, 105 and 107 along the luminance axis 199 throughout the detection volume 121 for each of the families of colors (e.g., red, blue, yellow and green). In still further embodiments, a detection region may be separately defined for each color coordinate plane 101, 103, 105 and 107 along the luminance axis 199 throughout the detection volume 121 comprising a combination of different colors (e.g., a mixture of variable amounts of red, blue, green and yellow).
As depicted in
In a further embodiment, the combination of detection regions 111, 113, 115 and 117 along the luminance axis 199 forms a detection volume 121. In one embodiment, each detection region 111, 113, 115 and 117 may be independently defined based on its luminance. In alternate embodiments, a detection volume 121 may be linearly interpolated from two or more defined detection regions 111, 113, 115 and 117. For example, a detection region defined in one color coordinate plane may be linearly coupled to the detection region defined in another color coordinate plane in the detection volume 121 having an alternate luminance value. The line segments extending from each vertex and traversing the three dimensional color space between the defined color coordinate planes thus bound the detection regions for the color coordinate planes corresponding to the luminance values between the luminance values of the defined detection regions. In alternate embodiments, when more than two detection regions are defined, interpolation may be performed between each defined detection region and the most proximate defined detection regions at luminance values both greater than and less than it along the luminance axis 199. In still further embodiments, interpolation may be avoided by defining as many planes on the luminance axis as there are possible luminance values, e.g., 256 planes in a system with an 8-bit luminance value.
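The linear interpolation between two defined detection regions may be sketched as follows, assuming each region is given as a vertex list with vertices in corresponding order. The function name and interface are hypothetical, chosen only for this sketch.

```python
def interpolate_region(verts_lo, verts_hi, y_lo, y_hi, y):
    """Linearly interpolate the vertices of a detection region for an
    intermediate luminance y, given regions defined at luminances y_lo
    and y_hi. Vertex lists must be in corresponding order, so each
    interpolated vertex lies on the line segment joining its two
    defining vertices across the three dimensional color space."""
    t = (y - y_lo) / (y_hi - y_lo)  # 0 at y_lo, 1 at y_hi
    return [(
        (1 - t) * cb0 + t * cb1,
        (1 - t) * cr0 + t * cr1,
    ) for (cb0, cr0), (cb1, cr1) in zip(verts_lo, verts_hi)]
```

For example, interpolating halfway between a small region at low luminance and a larger region at high luminance yields a region of intermediate size and position, consistent with the bounding line segments described above.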
In still further embodiments, input (e.g., a pixel) received is compared to the detection volume 121. If the color of the pixel corresponds to a position within a detection region 111, 113, 115 and 117 of a color coordinate plane 101, 103, 105 and 107 for the pixel's luminance value, the pixel becomes a candidate for color enhancement, e.g., shifting within its color coordinate plane by some defined amount.
With reference to
In one embodiment, each color coordinate plane of the plurality of color coordinate planes 201, 203, and 205 is a two dimensional plane comprising four quadrants, designated according to a typical Cartesian coordinate system, and separated by two intersecting axes. In one embodiment, each set of quadrants in a color coordinate plane corresponds to the color quadrants of a Cb-Cr color plane. As depicted in
As presented, color enhancement space 200 includes a plurality of detection volumes. Color enhancement space 200 comprises detection volume 271, with detection regions (e.g., 221, 241, 261) disposed in the third quadrant of the plurality of color coordinate planes 201, 203 and 205 in color enhancement space 200; and detection volume 275, with detection regions (e.g., 225, 245, 265) disposed in the first quadrant of the plurality of color coordinate planes 201, 203 and 205. Each detection volume may, for example, correspond to a specific color or a group of related colors (e.g., shades or hues within the same family of color) for which enhancement is desired (e.g., green, blue, red, etc).
As presented, each detection volume 271, 275 is comprised of a plurality of detection regions (e.g., detection regions 221, 225, 241, 245, 261 and 265), disposed in color coordinate planes 201, 203 and 205, respectively, and corresponding to the luminance value of the appropriate color coordinate plane 201, 203 and 205. Each detection volume 271, 275 also has a corresponding shift volume 273, 277 comprising a plurality of shift regions (e.g., shift regions 223, 227, 243, 247, 263 and 267). In one embodiment, the relative position of a detection region may vary by luminance. Furthermore, each detection region comprised in a detection volume 271, 275 further corresponds to a shift region in the same color coordinate plane, 201, 203 and 205, for the same luminance value. In further embodiments, each of the plurality of positions bounded by a detection region 221, 225, 241, 245, 261 and 265 has a corresponding position in the associated shift region 223, 227, 243, 247, 263 and 267, respectively. For example, each position in detection region 221 may be pre-mapped to an alternate position in color coordinate plane 201 comprised in shift region 223, and may thus provide, in some embodiments, for shift variance by luminance.
In one embodiment, input (such as a pixel) comprising a luminance value and a chromatic value is translated into a coordinate position in a color coordinate plane. The resultant position is compared to a detection volume 271, 275 in color enhancement space 200. If the position and luminance value correspond to a position in the detection volume, the coordinate position of the pixel may be shifted to a pre-mapped position in the shift region corresponding to the specific detection region having the luminance value of the input. For example, a position detected in detection volume 271 may be shifted to a corresponding, pre-mapped position in shift volume 273 based on luminance. An exemplary shift is indicated by the dotted directed line segments, indicating a vector shift from a detection region to the corresponding shift region (e.g., 241 to 243). Likewise, a position detected in detection volume 275 may be shifted to a corresponding, pre-mapped position in shift volume 277. In alternate embodiments, a color enhancement color space 200 may include additional detection volumes and corresponding shift volumes corresponding to separate colors.
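A luminance-dependent detect-and-shift lookup may be sketched as below. The table layout (axis-aligned rectangular regions keyed by integer luminance, with a single shift vector per plane) is a simplification assumed for this sketch; embodiments may use arbitrary region shapes and per-position pre-mapped shifts.

```python
def detect_and_shift(y, cb, cr, volume):
    """Look up the luminance-specific detection region and its
    pre-programmed shift, applying the shift only if the pixel's chroma
    falls inside the region for its luminance. `volume` maps a luminance
    value to {"region": (cb_min, cb_max, cr_min, cr_max),
    "shift": (dcb, dcr)} -- a hypothetical layout for this sketch."""
    entry = volume.get(y)
    if entry is None:
        return cb, cr  # no detection region defined at this luminance
    cb_min, cb_max, cr_min, cr_max = entry["region"]
    if cb_min <= cb <= cb_max and cr_min <= cr <= cr_max:
        dcb, dcr = entry["shift"]
        return cb + dcb, cr + dcr
    return cb, cr  # outside the detection region: pixel unmodified
```

Because the table is keyed by luminance, the same (Cb, Cr) position may be detected and shifted at one luminance yet pass through unchanged at another, which is the luminance variance the embodiments describe.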
While detection regions 221, 225, 241, 245, 261 and 265 and corresponding shift regions 223, 227, 243, 247, 263 and 267 have been presented as being disposed entirely in one quadrant, such depiction is exemplary. Accordingly, embodiments are well suited to include a detection region and/or shift region each occupying portions of a plurality of quadrants.
With reference now to
According to one embodiment, the combination of detection regions 311, 313, and 315 along the luminance axis 399 forms a detection volume 321. In one embodiment, each detection region 311, 313, and 315 may be independently defined, based on luminance. In alternate embodiments, a detection volume 321 may be linearly interpolated from two or more defined detection regions 311, 313, and 315. For example, a detection region defined in one color coordinate plane may be linearly coupled to the detection region defined in another color coordinate plane having an alternate luminance value. The line segments extending from each point on the circumference (or bounding edge for detection regions of other geometric shapes) and traversing the three dimensional color space between the defined color coordinate planes thus form the circumference (or boundaries) of the detection regions for the color coordinate planes corresponding to the luminance values between the luminance values of the defined detection regions.
In alternate embodiments, when more than two detection regions are defined, interpolation may be performed between each defined detection region and the proximate defined detection regions at luminance values both greater than and less than it along the luminance axis 399. For example, with reference to
In one embodiment, each detection region 311, 313 and 315 may be variable along the luminance axis 399. A detection region 311, 313 and 315 may be variable in, for example, the size of the detection region and/or shift region for different coordinate planes along the luminance axis. For example, the colors comprised in a detection region (e.g., detection region 311) of one color coordinate plane (e.g., color coordinate plane 301) for one luminance value may have a different position in a color coordinate plane (e.g., color coordinate plane 303, 305) of a different luminance value. Accordingly, to effectively “capture” the same colors during detection for color enhancement may require a re-positioning (or other like adjustment) of the detection regions for other luminance values. In one embodiment, therefore, a detection region 311, 313, and 315 may have a position, relative to the origin in the color coordinate plane 301, 303 and 305, which is different for one or more other luminance values in the three dimensional color space 300.
In further embodiments, the size of a detection region 311, 313 and 315 may also vary within the plurality of color coordinate planes 301, 303 and 305 based on the luminance value along the luminance axis 399. As depicted, detection region 313 comprises an area less than that of detection region 311 and 315. Consequently, detection volume 321 exhibits an interpolation consistent with the variance in size. In still further embodiments, the position and size of the shift regions comprising a shift volume (not shown) corresponding to said detection regions 311, 313 and 315 may also vary in size and position with respect to other shift regions in the shift volume along the luminance axis 399. In yet further embodiments, the position and size of the shift regions comprising a shift volume corresponding to said detection regions 311, 313 and 315 may also vary in size and position relative to the respective corresponding detection regions 311, 313 and 315 along the luminance axis 399.
With reference now to
In some embodiments, the orientation of a detection region 411, 413 may vary within the plurality of color coordinate planes 401, 403 along the luminance axis 499. For example, a detection region (e.g., detection region 413) may be rotated about a separate axis relative to another detection region (e.g., detection region 411) for the same color or group of colors for a plurality of color coordinate planes 401, 403 along the luminance axis 499. As depicted, detection region 411 comprises a trapezoid having four sides, enumerated a, b, c, and d. Detection region 413 depicts an exemplary rotation with corresponding sides. Consequently, detection volume 421, when interpolated from detection region 411 and 413, exhibits a torsion consistent with the variance in orientation. In further embodiments, the rotation of a detection region relative to another detection region for the same color or group may accompany a re-location and/or adjustment to the area of the detection region.
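The per-luminance rotation of a detection region may be sketched as a rotation of its vertices about the region's centroid. The centroid pivot and the function name are assumptions for this sketch; an embodiment may rotate about any chosen axis.

```python
import math

def rotate_region(verts, degrees):
    """Rotate a detection region's vertex list about the region's
    centroid, illustrating the per-luminance orientation change that
    produces the torsion in the interpolated detection volume."""
    cx = sum(v[0] for v in verts) / len(verts)
    cy = sum(v[1] for v in verts) / len(verts)
    rad = math.radians(degrees)
    cos_t, sin_t = math.cos(rad), math.sin(rad)
    # Standard 2-D rotation of each vertex about (cx, cy).
    return [(
        cx + (x - cx) * cos_t - (y - cy) * sin_t,
        cy + (x - cx) * sin_t + (y - cy) * cos_t,
    ) for x, y in verts]
```

Interpolating between an unrotated region at one luminance and a rotated copy at another then yields the twisted volume depicted, with intermediate planes rotated by intermediate angles.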
Exemplary Color Enhancement Process
With reference to
At step 501, color data is received for one or more pixels. The pixels may comprise, for example, the pixels of an image frame or still frame of a video. In one embodiment, the color data for each pixel includes the luminance value of the pixel, and a set of chromatic values. In further embodiments, the color space is a Cb-Cr color space.
At step 503, the set of chromatic values comprising the color data received in step 501 is translated into coordinates representing the color of the pixel as a first position in a color coordinate plane having the luminance received as input in a color space.
At step 505, the color data for the pixels received in step 501 and translated in step 503 is compared to a detection volume. Comparing the color data for the pixels received in step 501 may comprise, for example, determining the luminance-specific detection region in a detection volume and comparing the position of the pixel against that luminance-specific detection region. A color is “detected” if the position of the pixel's color (e.g., the first position) lies within the area bounded by the luminance-specific detection region corresponding to the luminance value of the pixel. In one embodiment, each pixel of the plurality of pixels may be compared to the luminance-specific detection region in the detection volume corresponding to the luminance of the pixel. A pixel having an undetected color (e.g., a pixel having a position in the color space outside the detection volume) is unmodified and may be displayed without alteration. A pixel whose color data corresponds to a position in the color space within the detection volume proceeds to step 507.
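The detection test of step 505 may be sketched, for an arbitrary polygonal detection region, with a standard ray-casting point-in-polygon test. The function name and the vertex-list representation of the region are assumptions for this sketch.

```python
def point_in_region(cb, cr, polygon):
    """Ray-casting test: does the pixel's (cb, cr) position fall inside
    the luminance-specific detection region, given as a list of
    vertices? A horizontal ray is cast from the point; an odd number of
    edge crossings means the point is inside."""
    inside = False
    n = len(polygon)
    for i in range(n):
        (x1, y1), (x2, y2) = polygon[i], polygon[(i + 1) % n]
        if (y1 > cr) != (y2 > cr):  # edge straddles the ray's height
            x_cross = x1 + (cr - y1) * (x2 - x1) / (y2 - y1)
            if cb < x_cross:
                inside = not inside
    return inside
```

Because the test takes the region as data, the same routine serves any of the shapes contemplated above (triangle, trapezoid, or other polygons), with a different vertex list selected per luminance value.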
In one embodiment, the detection volume is constructed along a luminance axis for a three dimensional color space. A detection volume may be constructed by, for example, independently defining a specific detection region comprising the detection volume for each luminance value in the luminance axis in the three dimensional color space. Alternatively, a detection volume may be interpolated from two or more luminance-specific detection regions defined for two or more luminance values in the luminance axis. For example, a detection volume may be interpolated from a first defined detection region in a first luminance-specific color coordinate plane corresponding to a first luminance value and a second defined detection region in a second luminance-specific color coordinate plane corresponding to a second luminance value. The plurality of points along the perimeter of the first detection region in the first luminance-specific color coordinate plane may be linearly coupled to corresponding points along the perimeter of a second detection region in a second luminance-specific color coordinate plane, the resulting volume having the first and second detection regions as a top and bottom base.
Accordingly, a plurality of cross-sections of the resulting volume may be used to define a plurality of detection regions, each detection region being disposed in a distinct coordinate space and specific to a discrete luminance between the first and second luminance values in the luminance axis. In one embodiment, the relative position, size and/or orientation of a detection region with respect to the other detection regions comprising the detection volume may be variable along the luminance axis.
At step 507, a pixel having a color corresponding to a position in the detection volume (as determined in step 505) is shifted to a second position to enhance the color of the pixel when displayed. The color data of the pixel is shifted such that the coordinates representing the color of the pixel as a position in the color coordinate plane are modified to correspond to an alternate position in the color coordinate plane. In one embodiment, the alternate position is a pre-defined position in a shift volume. For example, a pixel having a position within a detection region will have its coordinates modified to represent the position, in a shift region associated with the detection region, which corresponds to the specific position in the detection region.
In one embodiment, a shift volume corresponding to the detection volume is constructed along the same luminance axis for the same three dimensional color space. The shift volume may be interpolated from a first defined shift region in the first luminance-specific color coordinate plane and a second defined shift region in the second luminance-specific color coordinate plane. The shift volume may be interpolated by linearly coupling a plurality of points along the perimeter of the first shift region and the second shift region, wherein the resulting volume, bounded by the first and second shift regions, forms the shift volume.
A plurality of luminance-specific shift regions may be thus defined from cross-sections of the resulting shift volume for the plurality of luminance values between the first and second luminance values in the luminance axis. In one embodiment, the relative position, size and/or orientation of a shift region with respect to the other shift regions comprising the shift volume may be variable along the luminance axis. In further embodiments, the relative position, size and/or orientation of a shift region with respect to the corresponding detection region may be variable along the luminance axis.
In one embodiment, each detection region in a detection volume has a corresponding shift region in a shift volume. Specifically, each discrete position in a detection region corresponds to a specific discrete position in the corresponding shift region. In further embodiments, each discrete position in a detection region is pre-mapped to another, luminance-specific position in a shift region. A discrete position in a detection region may be pre-mapped to a position in a corresponding shift region by, for example, correlating the position in the detection region with respect to the entire detection region to a position in the shift region having the same relative position with respect to the shift region. In further embodiments, a shift region corresponding to a detection region is disposed in the same luminance-specific color coordinate plane wherein the detection region is disposed. In still further embodiments, the magnitude and direction of the resultant “shift” from a position in the detection region to the corresponding position in the shift region may also be luminance-specific, and variable for detection regions and shift regions disposed in color-coordinate planes specific to other luminance values in the luminance axis.
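The relative-position pre-mapping described above may be sketched as below for axis-aligned rectangular regions; the box representation (cb_min, cr_min, cb_max, cr_max) is a simplification assumed only for this sketch.

```python
def map_to_shift(cb, cr, det_box, shift_box):
    """Map a position in the detection region to the position with the
    same relative coordinates in the shift region: a point halfway
    across the detection box lands halfway across the shift box. Boxes
    are axis-aligned (cb_min, cr_min, cb_max, cr_max)."""
    d0, d1, d2, d3 = det_box
    s0, s1, s2, s3 = shift_box
    u = (cb - d0) / (d2 - d0)  # relative position inside the detection box
    v = (cr - d1) / (d3 - d1)
    return s0 + u * (s2 - s0), s1 + v * (s3 - s1)
```

With a different detection box and shift box per luminance value, both the magnitude and the direction of the resulting shift become luminance-specific, as the embodiment describes.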
At step 509, the pixel of the frame (e.g., image frame or still frame of a video) is displayed as the color corresponding to the color data of the pixel. The color data may be displayed as modified according to step 507, or, if undetected in step 505, the color data may be displayed according to the originally received color data.
With reference to
The specific detection region of a detection volume, wherein the color data of a pixel is detected, is determined at step 601. In one embodiment, the detection region is disposed in a color coordinate plane corresponding to the discrete luminance value included in the color data of the pixel. In some embodiments, determining a detection region comprises referencing the detection region in a color coordinate plane corresponding to the given luminance value. For example, the detection region may be determined by determining the cross-section of the detection volume disposed in the color-coordinate plane corresponding to the given luminance value.
At step 603, the position (a “first position”) of the pixel in the detection region is determined. The location in the detection region may comprise, for example, the position in the color coordinate plane corresponding to the set of coordinates included in the color data of the pixel.
At step 605, the position (a “second position”) of the pixel in the shift region corresponding to the first position in the detection region is determined. Thus, a pixel translated to have a position equal to the first position will be shifted (e.g., by adjusting the chromatic values comprising the color data of the pixel) to the second position. In one embodiment, the position in the shift region may be pre-mapped. In alternate embodiments, the position in the shift region may be determined dynamically by selecting the position in the shift region whose location relative to the shift region matches the location of the first position relative to the detection region. In some embodiments, the shift region may comprise a bounded area in the same color coordinate plane as the detection region. In further embodiments, the relative displacement of the second position from the first position may be luminance-specific, and variable for other luminance values in the luminance axis.
At step 607, the coordinates of the color data of the pixel are modified to correspond to the second position, the modification comprising a displacement from the original, first position of the color data to a desired color-enhanced position.
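Steps 603 through 607 can be illustrated by a minimal sketch that maps a first position to the second position occupying the same relative location in the shift region. The sketch assumes, purely for simplicity, that both regions are axis-aligned rectangles in the Cb-Cr plane; actual embodiments may use regions of any shape, and the function name is illustrative.

```python
def map_position(first, det_rect, shift_rect):
    """Map a point in the detection region to the point with the same
    relative position inside the shift region (steps 603-607).
    Rectangles are given as (cb_min, cr_min, cb_max, cr_max)."""
    cb, cr = first
    d0, d1, d2, d3 = det_rect
    s0, s1, s2, s3 = shift_rect
    u = (cb - d0) / (d2 - d0)   # relative Cb position in [0, 1]
    v = (cr - d1) / (d3 - d1)   # relative Cr position in [0, 1]
    # Step 607: the displaced, color-enhanced coordinates.
    return (s0 + u * (s2 - s0), s1 + v * (s3 - s1))
```

Under this mapping the center of the detection rectangle is displaced to the center of the shift rectangle, and points on the boundary of one map to the boundary of the other.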
Volume Construction
With reference to
At step 701, a first detection area in a first luminance-specific color coordinate plane is received. The first detection area may be pre-defined and retrieved from a storage component, or dynamically defined and received as input from an external source (e.g., a user). In one embodiment, the first detection area is a bounded region in a color coordinate plane specific to a first luminance in a color space. In further embodiments, the color space is a YCbCr color space. In still further embodiments, the bounded region assumes a geometric shape.
At step 703, a second detection area in a second luminance-specific color coordinate plane is received specific to a second luminance in the color space.
At step 705, a plurality of detection regions is interpolated from the first detection area and the second detection area. The plurality of detection regions may be interpolated by, for example, linearly interpolating a plurality of detection regions disposed in a plurality of luminance-specific color coordinate planes comprising the intervening color space between the first luminance-specific color-coordinate plane and the second luminance-specific color coordinate plane. The plurality of detection regions is subsequently combined to form a detection volume.
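The linear interpolation of step 705 can be sketched as follows, under the simplifying assumption that each detection area is represented as an ordered list of (Cb, Cr) vertices, with both key areas having the same vertex count and ordering; the function names are illustrative rather than part of any embodiment.

```python
def interpolate_region(area_lo, area_hi, y_lo, y_hi, y):
    """Linearly interpolate the detection region for luminance y between
    two key areas defined at luminances y_lo and y_hi (step 705).
    Areas are vertex lists of (cb, cr) with matching ordering."""
    t = (y - y_lo) / (y_hi - y_lo)
    return [((1 - t) * cb0 + t * cb1, (1 - t) * cr0 + t * cr1)
            for (cb0, cr0), (cb1, cr1) in zip(area_lo, area_hi)]

def build_volume(area_lo, area_hi, y_lo, y_hi):
    """Combine the luminance-specific regions into a detection volume,
    keyed here by discrete luminance values along the Y axis."""
    return {y: interpolate_region(area_lo, area_hi, y_lo, y_hi, y)
            for y in range(y_lo, y_hi + 1)}
```

The same interpolation, applied to a pair of shift areas, yields the shift volume of step 711.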
At step 707, a first shift area is defined in the same luminance-specific color coordinate plane comprising the first detection area. The first shift area corresponds to the first detection area and may be pre-mapped to the first detection area and retrieved from a storage component, or dynamically defined and mapped from input from an external source (e.g., a user). In one embodiment, the first shift area is a bounded region corresponding to the first detection area in the luminance-specific color coordinate plane specific to the first luminance in the color space. In one embodiment, the first shift area assumes a geometric shape similar to the shape of the first detection area. In further embodiments, the size, orientation, and position of the first shift area relative to the first detection area may be adjusted.
At step 709, a second shift area is defined in the same luminance-specific color coordinate plane comprising the second detection area. The second shift area corresponds to the second detection area.
At step 711, a plurality of shift regions is interpolated from the first shift area and the second shift area. The plurality of shift regions may be interpolated by, for example, linearly interpolating a plurality of shift regions disposed in the plurality of luminance-specific color coordinate planes comprising the intervening color space between the first shift area and the second shift area. The plurality of shift regions is subsequently combined to form a shift volume which corresponds to the detection volume. Subsequently received input detected in a detection region of the detection volume constructed at step 705 will be shifted (e.g., a displacement in the color coordinate plane will be executed) into the corresponding shift region comprised in the shift volume constructed at step 711.
In one embodiment, the detection volume and/or the shift volume is variable along the luminance axis. Thus, subsequent modifications (including additions) to either a luminance-specific detection region in the detection volume or a luminance-specific shift region in the shift volume may be automatically extrapolated to each of the other luminance-specific regions (e.g., detection or shift) in the affected volume.
Color Enhancement System
With reference to
At step 801, a detection volume in a color space is displayed. In one embodiment, the detection volume displayed in the color space may correspond to a default set of values. Alternatively, the detection volume may comprise a set of values previously stored by a user. The detection volume may be displayed in, for example, a graphical user interface in an application for providing color enhancement functionality. In one embodiment, the detection volume may be displayed as a three dimensional object in a color space formed from the combination of a plurality of two dimensional shapes along a luminance axis, functioning as the third dimensional component of the three dimensional volume. In a further embodiment, each of the two dimensional color-coordinate planes is specific to a luminance value in the luminance axis.
In alternate embodiments, a specific luminance in the luminance axis may be selected, and the color coordinate plane and the detection region disposed in that color coordinate plane may be displayed independently of the rest of the detection volume. In further embodiments, the detection volume may be displayed as a graph (e.g., line graph, bar graph, etc.) displaying the position of a detection region in a luminance-specific color coordinate plane relative to detection regions in the detection volume specific to alternate luminance values.
At step 803, a shift volume corresponding to the detection volume in a color space is displayed. In one embodiment, the shift volume may be displayed in the same display or interface and according to the same representation (e.g., three dimensional color space, or as a series of two dimensional color-coordinate planes) as the detection volume. In one embodiment, the shift volume displayed in the color space may correspond to a default set of values. Alternatively, the shift volume may comprise a set of values previously stored by a user. In alternate embodiments, the shift volume may be displayed in any like fashion described above with reference to the display of the detection volume. In some embodiments, step 803 may be performed simultaneously with step 801.
At step 805, user input is received from an interface on the display. The user input may comprise, for example, a modification to the luminance-specific detection region in the detection volume displayed in step 801, or a modification to the luminance-specific shift region in the shift volume displayed in step 803. A modification may comprise, for example, adjusting a size, shape, orientation, or location in the luminance-specific color coordinate plane of a detection region or a shift region.
At step 807, the volume (e.g., detection volume and/or shift volume), comprising the region (e.g., detection region or shift region) modified in response to user input in step 805, is adjusted to correspond to the user input received. Adjusting a volume may comprise, for example, re-interpolating the luminance-specific regions comprising the volume, including the modified region. Thus, an adjusted volume may vary along the luminance axis, with the corresponding detection and shift functionality, where appropriate, likewise variable along that axis. After the adjustment is performed, the display of the adjusted volume is also modified to display the modification.
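The adjustment of step 807 can be sketched by applying a user modification to one key area and re-interpolating every luminance-specific region so the change propagates along the luminance axis. The vertex-list representation and the translation edit below are hypothetical simplifications, not limitations of any embodiment.

```python
def translate_area(area, dcb, dcr):
    """A hypothetical user edit (step 805): move a key area in the
    Cb-Cr plane by (dcb, dcr)."""
    return [(cb + dcb, cr + dcr) for cb, cr in area]

def adjust_volume(area_lo, area_hi, y_lo, y_hi, edit):
    """Step 807 sketch: apply the edit to the low-luminance key area,
    then re-interpolate each luminance-specific region of the volume."""
    new_lo = edit(area_lo)
    t = lambda y: (y - y_lo) / (y_hi - y_lo)
    return {y: [((1 - t(y)) * cb0 + t(y) * cb1,
                 (1 - t(y)) * cr0 + t(y) * cr1)
                for (cb0, cr0), (cb1, cr1) in zip(new_lo, area_hi)]
            for y in range(y_lo, y_hi + 1)}
```

Because the whole volume is rebuilt from the edited key area, a modification made at one luminance is automatically reflected, in proportion, at every other luminance.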
At step 809, the user input modification and the resultant modified volume are stored in a storage component, such as a memory, coupled to the graphical user interface. In one embodiment, subsequent graphical inputs (e.g., image frames, still frames of a video, etc.) are compared to the detection volume and shifted into the shift volume according to the luminance-specific shift parameters, including any modifications made thereto.
Exemplary Computing Device
With reference to
It is understood that embodiments can be practiced on many different types of computer system 900. Examples include, but are not limited to, desktop computers, workstations, servers, media servers, laptops, gaming consoles, digital televisions, PVRs, and personal digital assistants (PDAs), as well as other electronic devices with computing and data storage capabilities, such as wireless telephones, media center computers, digital video recorders, digital cameras, and digital audio playback or recording devices.
As presented in
Computing system 900 may also have additional features/functionality. For example, computing system 900 may include additional storage (removable and/or non-removable) including, but not limited to, magnetic or optical disks or tape. Such additional storage is illustrated in
Computer system 900 also comprises an optional alphanumeric input device 906, an optional cursor control or directing device 907, and one or more signal communication interfaces (input/output devices, e.g., a network interface card) 908. Optional alphanumeric input device 906 can communicate information and command selections to central processor 901. Optional cursor control or directing device 907 is coupled to bus 909 for communicating user input information and command selections to central processor 901. Signal communication interface (input/output device) 908, which is also coupled to bus 909, can be a serial port. Communication interface 908 may also include wireless communication mechanisms. Using communication interface 908, computer system 900 can be communicatively coupled to other computer systems over a communication network such as the Internet or an intranet (e.g., a local area network), or can receive data (e.g., a digital television signal).
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
Number | Date | Country | |
---|---|---|---|
20100141671 A1 | Jun 2010 | US |