In current digital camera technology, a user can select a border frame to surround an image in a picture. A current digital camera typically has a border feature that lets the user preview the picture in order to see how the picture appears with a selected border frame color. In one current solution, the user is limited to selecting the border frame color from among a fixed set of colors in a color palette (a selection of colors or a color set). In another current solution, the digital camera automatically picks the border frame color from a color palette with a fixed, limited number of color values. Current solutions also use the color palette to pick a border frame color for grayscale (black and white) images. Because the color palette is used to determine the border frame color of a grayscale image, the current solutions perform the unnecessary step of analyzing inappropriate non-grayscale color choices for the border frame. Therefore, the current technology is limited in its capabilities and suffers from at least the above constraints and deficiencies.
Non-limiting and non-exhaustive embodiments of the present invention are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various views unless otherwise specified.
In the description herein, numerous specific details are provided, such as examples of components and/or methods, to provide a thorough understanding of embodiments of the invention. One skilled in the relevant art will recognize, however, that an embodiment of the invention can be practiced without one or more of the specific details, or with other apparatus, systems, methods, components, materials, parts, and/or the like. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring aspects of embodiments of the invention.
An image sensor stage 130 can be formed by, for example, an array of CCD (charge-coupled device) sensors, an array of CMOS (complementary metal-oxide-semiconductor) sensors, or other suitable types of sensors that may be developed as sensor technology continues to advance. Each sensor in the array is commonly referred to as a “pixel,” and an image scene 107 that is sampled by the image sensor stage 130 is treated as an array of pixel samples that have color values stored in the memory 172.
Light 106 from an image scene 107 is received by the lens 105 and is transmitted through the aperture 110 and the shutter 115. In response to the light 106, the image sensor stage 130 generates a set of pixel samples 135, which are electrical signals. The pixel samples 135 are converted from analog electrical signals to digital electrical signals by an analog-to-digital (A/D) converter 140. Typically, a gain stage 145 is provided between the image sensor stage 130 and the A/D converter 140. In the embodiment of
The A/D converter 140 digitizes the pixel samples 135 into the corresponding digitized pixel samples 150. Each digitized pixel sample 150 is a digital value that indicates a charge amplitude from a corresponding sensor in the image sensor stage 130. The A/D converter 140 provides the corresponding digitized pixel samples 150 to the image processor 120.
A display 155 permits a user 160 to view an image 107a of the scene 107. The display 155 may be, for example, a liquid crystal display (LCD) or other types of screens.
The user 160 can also use a user interface 165 to control various operations in the apparatus 100. For example, the user interface 165 includes buttons or other types of actuators to permit control of camera functions. The user 160 can also use the interface 165 to control the movement and position of an area selector 170. As an example, the area selector 170 can be cross-hairs 170. Note that the area selector 170 can have other shapes and forms. For example, the area selector 170 can instead be a square, circle, rectangle, cursor, or other pre-defined shaped area that is imposed on the display 155. For purposes of clarity, in the examples discussed below, the selector 170 is assumed to be cross-hairs.
The apparatus 100 also includes a memory 172 and storage medium 175. The apparatus 100 also includes known camera components that are not shown in
The memory 172 can store a border frame color picker engine 180 in accordance with an embodiment of the invention. The engine 180 is typically implemented in software code. The software code can be implemented by, for example, use of known programming languages (e.g., C, C++, or other suitable known languages). The memory 172 can also store a standard operating system 182, which permits the management of the operations in the apparatus 100.
In another embodiment of the invention, the engine 180 is included in processor hardware 121 that is included in or coupled to the image processor 120. A saturation control engine 260, which performs operations that are discussed below, is also typically implemented in software code. In another embodiment of the invention, the saturation control engine 260 is included in the processor hardware 121 instead of being embodied in software code. Therefore, in other embodiments of the invention, the processor hardware 121 can perform the functions of the engine 180 and/or the functions of the engine 260.
The storage medium 175 can also store the images of scenes 107 that are captured via the lens 105 and that are to be produced as pictures or photographs. The storage medium 175 can be a built-in memory device in the apparatus 100 or can be a removable memory device.
In accordance with an embodiment of the invention, the border frame color picker engine 180 determines a border frame color from a color set 185. The border frame color is in a border of a picture to be produced by the apparatus 100. The engine 180 selects the border frame color based upon the position of the selector 170 on the image 107a. For example, the engine 180 selects the border frame color by evaluating the color of pixels or selecting the colors of particular pixels in the image 107a. Various methods for using the selector 170 for selecting the particular pixels in the image 107a are discussed below with reference to the block diagrams in
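By way of illustration only, the following is a minimal sketch of one way the pixel color under the selector position could be read out as a candidate border frame color. The type and function names (Pixel, pickBorderColor) and the row-major pixel layout are assumptions for this example, not part of the described apparatus.

```cpp
#include <cstdint>
#include <vector>

// Hypothetical RGB pixel; the real pixel data in memory 172 may use another layout.
struct Pixel { uint8_t r, g, b; };

// Minimal sketch: return the color of the pixel under the selector (cross-hairs)
// position as a candidate border frame color 205.
// 'image' is assumed to be stored row-major with the given width and height.
Pixel pickBorderColor(const std::vector<Pixel>& image,
                      int width, int height,
                      int selectorX, int selectorY) {
    // Clamp the selector position so it always maps to a valid pixel.
    if (selectorX < 0) selectorX = 0;
    if (selectorX >= width)  selectorX = width - 1;
    if (selectorY < 0) selectorY = 0;
    if (selectorY >= height) selectorY = height - 1;
    return image[static_cast<size_t>(selectorY) * width + selectorX];
}
```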
In accordance with another embodiment of the invention, the engine 180 determines a proper color set (e.g., the non-grayscale color set 185, the grayscale set 189, or the sepia set 191) by evaluating the pixels in the image 107a by use of various methods, as discussed below in further detail. For example, the color set 185 contains non-grayscale colors, while the grayscale set 189 contains only the grayscale colors. Other color sets may be included as well for selection by the engine 180. For example, the engine 180 may select the sepia set 191, which contains sepia-related colors (e.g., brown, grayish brown, or olive brown similar to that of sepia ink), or other different color sets. The color set 185, grayscale set 189, and sepia set 191 are typically stored in a memory device such as, for example, the memory 172.
The engine 180 makes use of metadata or file data of the image 107a, which is typically stored in the memory 172 after a camera captures a photographic shot of the scene 107. This metadata or file data contains information that indicates if the picture of image 107a was taken as sepia, black and white (grayscale), or non-grayscale color. Therefore, the engine 180 reads the metadata or file data in order to more accurately determine if the image 107a is sepia, grayscale, or non-grayscale color. When the engine 180 has selected the color set (e.g., color set 185, grayscale set 189, or sepia set 191), the engine 180 then selects the border frame color in the selected color set by evaluating or selecting the pixels in the image 107a based on the position of the selector 170 on the image 107a, as discussed below with reference to
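As a hedged illustration of the metadata-based selection described above, the sketch below maps a recorded capture mode to one of the three color sets. The CaptureMode tag and the enumeration names are hypothetical; actual image file formats record this information in their own ways.

```cpp
// Hypothetical capture-mode tag assumed to be carried by the metadata or
// file data of image 107a.
enum class CaptureMode { Color, Grayscale, Sepia };

enum class ColorSetId { NonGrayscale185, Grayscale189, Sepia191 };

// Minimal sketch: map the recorded capture mode to the color set that the
// engine 180 would search for a border frame color.
ColorSetId selectColorSet(CaptureMode mode) {
    switch (mode) {
        case CaptureMode::Grayscale: return ColorSetId::Grayscale189;
        case CaptureMode::Sepia:     return ColorSetId::Sepia191;
        default:                     return ColorSetId::NonGrayscale185;
    }
}
```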
In an embodiment of the invention, the engine 180 selects the border frame color 205 from a color set 185. The color set 185 may be, for example, a set of colors arranged as a palette of colors.
An object 210 in a scene 107 is captured as an object image 210a and stored in the memory 172. The object image 210a and the border frame color 205 form a picture (or photograph) 215 to be produced by the apparatus 100. The user 160 can also view the picture 215 in the display 155. The image processor 120 stores the picture 215 in the memory 172.
In the example of
The speed of the operations described in the block diagrams in
In another embodiment of the invention, a color set (e.g., color set 185, grayscale set 189, or sepia set 191) is first selected based upon the evaluation of a color (or colors) with respect to the selector 170, and a color (from the color set) is then selected for the border frame color 205 based upon the evaluation of the color (or colors) with respect to the selector 170.
Typically, the selector 170 may be composed of two (or more) colors so that the selector 170 does not disappear or blend into a color of the object image 210a, although another embodiment may use a selector 170 that is composed of only one color. As another example, the engine 180 outlines the colors of the selector 170 in gray bars (or other colors) and has the selector 170 outline the current color of the object image 210a in bright green bars (or other colors that differ from the selector 170 colors), although another embodiment is not required to perform this optional feature. Other combinations of example colors can be used as well.
Typically, the user 160 can drive and locate the selector 170 over any location on the object image 210a (or picture 215) by actuating a controller 167 (e.g., four-way buttons or other actuator types) in the user interface 165. The user 160 can drive and locate the selector 170 over locations on the object image 210a by other techniques that become available as user interface technology advances.
In an embodiment of the invention, the engine 180 selects a border frame color 205 that matches the color value of a pixel 220a (
Alternatively or additionally, the engine 180 selects a color 230 (
Note that the LCD resolution of a camera is typically on the order of approximately 1/60 of the width by 1/60 of the height of the original image stored in memory. This typical LCD image is called a “screennail,” which differs from a “thumbnail” image. The screennail is approximately 320×240 pixels in size. Typically, the screennail is created by an averaging technique in which the color values of many pixels in an original image 107a are averaged together to obtain the screennail. Alternatively, the screennail can be created by simply selecting, for example, every 60th pixel in the original image stored in memory, although this technique does not provide as desirable a picture as the averaging technique. The averaging technique creates a color that is more representative of a general area of pixels in the original image, rather than a match for every single pixel in the original image. In other words, when a screennail image is shown in the camera display 155, the engine 180 is not required to average the color values of, for example, the red pixels in the original image in order to determine a border color 205. The screennail already contains the average color values of the original image, which may be one million or more pixels, and is not the original image stored in the memory. Therefore, when the display 155 provides a screennail, the selector 170 can be placed at a location in the screennail, and the color values of the pixels at that location are already average color values from the original image and can be used as border colors 205.
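The averaging technique described above can be sketched as a simple block-average downsample. The following example (with illustrative names such as makeScreennail and an assumed reduction factor of 60) is only one possible way to produce a screennail and is not taken from any particular camera implementation.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

struct Pixel { uint8_t r, g, b; };

// Minimal sketch of screennail creation by averaging: each output pixel is the
// mean of a 'factor' x 'factor' block of the original image (e.g., factor = 60).
std::vector<Pixel> makeScreennail(const std::vector<Pixel>& src,
                                  int width, int height, int factor) {
    const int outW = width / factor;
    const int outH = height / factor;
    std::vector<Pixel> out(static_cast<std::size_t>(outW) * outH);
    for (int oy = 0; oy < outH; ++oy) {
        for (int ox = 0; ox < outW; ++ox) {
            uint32_t sumR = 0, sumG = 0, sumB = 0;
            // Accumulate the color values of every pixel in the source block.
            for (int dy = 0; dy < factor; ++dy) {
                for (int dx = 0; dx < factor; ++dx) {
                    const Pixel& p =
                        src[static_cast<std::size_t>(oy * factor + dy) * width +
                            (ox * factor + dx)];
                    sumR += p.r; sumG += p.g; sumB += p.b;
                }
            }
            const uint32_t n = static_cast<uint32_t>(factor) * factor;
            out[static_cast<std::size_t>(oy) * outW + ox] = {
                static_cast<uint8_t>(sumR / n),
                static_cast<uint8_t>(sumG / n),
                static_cast<uint8_t>(sumB / n) };
        }
    }
    return out;
}
```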
Alternatively or additionally, the engine 180 selects a color 235 that is at or near the opposite side of the color wheel from the color value of a pixel 220a. Note that the engine 180 can be programmed to select other colors in the color set 185 based on the pixels that are overlaid by the selector 170. For example, the color selected from the color set 185 may be near the color value of the color 225. As mentioned above, the selector 170 (which can be, e.g., circular or another shape) can be resized and moved to various positions in the image 210a (as shown in
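For the color 235 on the opposite side of the color wheel, one simple way to compute such a complement is to flip the hue by 180 degrees while preserving lightness; the sketch below does this with the (max + min) − channel identity. It is an illustration only, and the engine 180 could then snap the result to the nearest color in the selected color set.

```cpp
#include <algorithm>
#include <cstdint>

struct Pixel { uint8_t r, g, b; };

// Minimal sketch: pick the color on the opposite side of the color wheel
// (hue shifted by 180 degrees) from the pixel under the selector.
// Using (max + min) - channel flips the hue while keeping the same channel
// extremes, so saturation and value are preserved.
Pixel oppositeWheelColor(Pixel p) {
    const int maxC = std::max({ int(p.r), int(p.g), int(p.b) });
    const int minC = std::min({ int(p.r), int(p.g), int(p.b) });
    const int sum = maxC + minC;
    return { static_cast<uint8_t>(sum - p.r),
             static_cast<uint8_t>(sum - p.g),
             static_cast<uint8_t>(sum - p.b) };
}
```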
Alternatively, the image 210a area within the selector 170 can be magnified (zoomed) as shown in
In another embodiment of the invention, the engine 180 provides the colors 225, 230, and 235 as a set of potential colors that the user 160 can select for the border frame color 205. Additionally, the engine 180 can permit the user 160 to select the color values near or in between the colors 225, 230, and 235 as possible choices for the border frame color 205. Therefore, the user 160 may have the option of fine tuning the color value to be used for the border frame color 205. The user 160 may use the user interface 165 in order to permit selection of the frame color 205 and for fine tuning of the frame color 205. As noted above, the image 210a that is seen on the display 155 is typically a screennail which is not the original image pixel data.
Note that the number of pixels that are overlaid by the selector 170, in the example of
Additionally, the number of colors in the color set 185 may vary in number. An advantage provided by embodiments of the invention is that the number of colors that can be provided in the color set 185 can now be increased and are no longer limited to a fixed number of colors of prior systems, and the border frame color picker engine 180 advantageously selects a color in the color set 185 for the border frame color 205. The engine 180 then displays the border frame color 205 in the picture 215 as shown on the display 155.
Note that when the user 160 moves the selector 170 to another location (e.g., location 240) in the object image 210a, the engine 180 determines and displays a potentially different color value for the border frame color 205. Therefore, as the user 160 drives the selector 170 to different locations in the object image 210a, the border frame color 205 may change because other locations in the object image 210a may have different color values. The engine 180 displays the border color 205, typically on the edge of the actual picture 215, as the user 160 drives the selector 170 over different locations on the object image 210a. As mentioned above, this location 240 could also be magnified so that the particular color value at the pixel in position 240 is used as the border color 205.
Additionally or alternatively, an embodiment of the invention can use the saliency mapping method in order to determine the color value for the border frame color 205. For example, the engine 180 can detect the important features in the picture 215 by use of saliency mapping, which detects the significant features of the image by detecting the edges 250 of the object image 210a, determining the focus area of the picture 215, and determining the location of the object image 210a in the picture 215. As an example, the focus area is typically the position of the selector 170 in the image 210a. Saliency mapping methods are performed in various digital camera products that are commercially available from HEWLETT-PACKARD COMPANY, Palo Alto, Calif.
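The saliency mapping used in commercial products is not detailed here. Purely as a rough illustration of edge-based saliency, the following sketch scores each pixel by local gradient energy so that the edges 250 of the object image receive high values; all names in it are assumptions for this example.

```cpp
#include <cstdint>
#include <cstdlib>
#include <vector>

// Crude illustrative sketch only: "saliency" is approximated here by local
// gradient energy on a grayscale copy of the image, so strong edges score high.
std::vector<uint16_t> edgeEnergyMap(const std::vector<uint8_t>& gray,
                                    int width, int height) {
    std::vector<uint16_t> energy(gray.size(), 0);
    for (int y = 1; y < height - 1; ++y) {
        for (int x = 1; x < width - 1; ++x) {
            const int i = y * width + x;
            // Horizontal and vertical differences of neighboring pixels.
            const int gx = std::abs(int(gray[i + 1]) - int(gray[i - 1]));
            const int gy = std::abs(int(gray[i + width]) - int(gray[i - width]));
            energy[i] = static_cast<uint16_t>(gx + gy);
        }
    }
    return energy;
}
```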
The image of the border frame color 205 can time out (disappear from view in the display 155) after a given time frame has passed.
In another embodiment of the invention, a saturation control engine 260 selects a saturation level for the border frame color 205. The saturation control engine 260 provides a fixed number of saturation levels (e.g., 5 levels of saturation). The number of saturation levels can vary. As known to those skilled in the art, each saturation level provides a level of vividness and contains a certain mix of colors. For example, for the main colors in the color wheel (e.g., red, green, yellow, blue), each saturation level indicates certain mix levels of colors. As another example, for a grayscale color, the gray level in the grayscale color varies in amount for each saturation level.
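As a hedged sketch of discrete saturation levels, the example below blends a border color toward a neutral gray in five steps. The blending rule and the number of levels are illustrative assumptions, not the specific behavior of the saturation control engine 260.

```cpp
#include <cstdint>

struct Pixel { uint8_t r, g, b; };

// Minimal sketch of a fixed set of saturation levels (here 5). Level 1 pulls
// the border color toward a neutral gray; level 5 keeps the full color.
Pixel applySaturationLevel(Pixel c, int level /* 1..5 */) {
    if (level < 1) level = 1;
    if (level > 5) level = 5;
    const int gray = (c.r + c.g + c.b) / 3;   // neutral reference value
    auto blend = [&](int ch) {
        // Blend weight of level/5 between the gray reference and the channel.
        return static_cast<uint8_t>(gray + (ch - gray) * level / 5);
    };
    return { blend(c.r), blend(c.g), blend(c.b) };
}
```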
Advantages of embodiments of the invention include increased ease of use of the camera by a user, a match between the needed functionality and a simple interface mechanism in current cameras, and the ability of the selected border frame color in a color palette to match the image. From an artistic viewpoint, embodiments of the invention advantageously provide a cohesive user interface that gives an artist or user numerous options for selecting border colors, while also providing ease of product use, including for users who are inexperienced with digital camera use.
Reference is now made to
In an embodiment of the invention, the border frame color picker engine 180 can magnify (enlarge) the image 210a stored in memory 172 by performing standard image magnification or image expansion techniques. When the image 210a is magnified, the pixels 305 become larger in area. For example, in
By magnifying the image 210a, the resolution of the color selection for the border 216 is increased because the selector 170 can select only the colors of the pixel (or pixels) that is contained in the selector 170. Therefore, the selector 170 can select more specific pixel colors in the image 210a for use as the border color 205. In contrast, in
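A minimal sketch of why magnification increases the color-selection resolution is shown below: with an integer zoom factor, a selector position on the display maps back to exactly one source pixel. The parameter names (viewOriginX, viewOriginY, zoom) are hypothetical.

```cpp
// Minimal sketch: when the image 210a is magnified by an integer 'zoom'
// factor, a selector position on the display maps back to exactly one
// source pixel, so the color resolution of the selection increases.
struct SourcePixelIndex { int x, y; };

SourcePixelIndex displayToSourcePixel(int displayX, int displayY,
                                      int viewOriginX, int viewOriginY,
                                      int zoom) {
    // Each source pixel covers a zoom x zoom block on the display, so
    // integer division recovers the single pixel under the selector.
    return { viewOriginX + displayX / zoom,
             viewOriginY + displayY / zoom };
}
```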
As another example, the size and position of the selector 170 may be adjusted as shown
In
In
Additionally, decreasing the size of the selector 170 permits the user to select the granularity (increments) of the movements of the selector 170. For example, in
In the example of
Alternatively, the user can view, in the display 155, the original image stored in memory 172, instead of viewing a screennail in the display 155. In this alternative approach, it is advantageous to provide a method to fine tune the viewing of the millions of pixels of the original image by, for example, magnifying the selected pixels (e.g.,
In another embodiment, the engine 180 uses saliency mapping to detect the significant features of the image 210a in the selector 170 area, and then evaluates the pixel color values of these significant features in the selector 170 area. The engine 180 selects the color set (e.g., set 185, set 189, or set 191) by evaluating the color (or colors) in a salient area 450 or 455, and then selects a color from the selected color set for the border color 205 by evaluating the color (or colors) in a salient area. The color evaluation methods discussed above with reference to
In another embodiment, the engine 180 moves the selector 170 from one salient area to another salient area. For example, if the selector 170 was in the salient area 450, then when the user attempts to move the selector 170 away from the salient area 450, the engine 180 would move the selector 170 to another salient area 455. Additionally or alternatively, a button or actuator 167 in the user interface 165 can permit the user to move the selector 170 to the various salient areas. Therefore, the selector 170 jumps to and from the salient areas. The user can then select the specific areas of a salient area to evaluate a color value or color values by use of the fine-tuning color selection methods described above with reference to
If the object image is a color image (i.e., a non-grayscale color image), then the engine 180 selects the color set 185 to provide possible non-grayscale color values for the border color 205. If the image is a grayscale image, then the engine 180 selects the grayscale set 189 to provide possible grayscale color values for the border color 205. The colors in the grayscale set 189 are neutral colors such as, for example, tan, ivory, white, beige, black, and/or other grayscale colors. If the image is a sepia image, then the engine 180 selects the sepia set 191 to provide possible sepia color values for the border color 205. The sepia set 191 may be a sepia palette which contains color values ranging from brown to grayish brown to olive brown, similar to that of sepia ink.
The engine 180 may also select other color sets for providing the border color 205. For example, the engine 180 may select a color set based on the evaluation of the pixels that are overlaid by the selector 170 or pixels that are located with respect to the position of the selector 170 as shown in
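As an illustration of selecting a color set by evaluating pixels (rather than metadata), the sketch below classifies an image as grayscale, sepia-like, or color from simple channel statistics. The thresholds are arbitrary assumptions used only for this example.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

struct Pixel { uint8_t r, g, b; };
enum class ImageKind { Color, Grayscale, Sepia };

// Illustrative sketch only: classify an image so the matching palette
// (color set 185, grayscale set 189, or sepia set 191) can be offered
// for the border color 205.
ImageKind classifyImage(const std::vector<Pixel>& img) {
    std::size_t neutral = 0, warmTint = 0;
    for (const Pixel& p : img) {
        const int spread = std::max({ int(p.r), int(p.g), int(p.b) }) -
                           std::min({ int(p.r), int(p.g), int(p.b) });
        if (spread <= 8) {
            ++neutral;                                   // r ~= g ~= b: grayscale pixel
        } else if (p.r >= p.g && p.g >= p.b && spread <= 80) {
            ++warmTint;                                  // brownish ordering: sepia-like
        }
    }
    if (neutral > img.size() * 9 / 10)            return ImageKind::Grayscale;
    if (neutral + warmTint > img.size() * 9 / 10) return ImageKind::Sepia;
    return ImageKind::Color;
}
```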
Advantages of embodiments of the invention include the following. The engine 180 advantageously reduces or narrows down the number of color value selections for use as the border frame color. As a result, color values in a color palette that are not useful are eliminated from consideration as a border frame color. The engine 180 automatically determines the possible border frame colors and, as a result, the user is not required to perform numerous button selections or presses in the user interface 165. In this manner, the user is able to more quickly scan through the black and white palette for potential grayscale values for the border frame color, or scan through a palette with non-grayscale color values or with sepia color values.
The black and white palette provides a separate palette that is dedicated for a grayscale image. As a result, more flexibility is provided to select a frame color for a grayscale image.
Currently available features in digital cameras may also be used to help the user 160 select among the potential border frame color values that are identified by the engine 180. For example, the user 160 can use the known “live view” mode, which permits the user 160 to look at the image scene 107 in the display 155 while the camera captures the image scene 107 for a picture 215. As another example, the user 160 can use the known playback mode, which stores the captured image scene 107 as a scene image in the memory 172. The user 160 can then use the user interface 165 to view a larger pixel sample or smaller pixel sample of the scene image. Increasing or decreasing the pixel sample of the object image 210a changes the number of pixels that are overlaid by a selector 170 or are contained within a selector 170, depending on the shape of the selector 170. As a result, the color value determined by the engine 180 for the border color 205 may potentially differ if the number of pixels overlaid by the selector 170 is increased or decreased.
It is also within the scope of the present invention to implement a program or code that can be stored in a machine-readable or computer-readable medium to permit a computer to perform any of the inventive techniques described above, or a program or code that can be stored in an article of manufacture that includes a computer readable medium on which computer-readable instructions for carrying out embodiments of the inventive techniques are stored. Other variations and modifications of the above-described embodiments and methods are possible in light of the teaching discussed herein.
The above description of illustrated embodiments of the invention, including what is described in the Abstract, is not intended to be exhaustive or to limit the invention to the precise forms disclosed. While specific embodiments of, and examples for, the invention are described herein for illustrative purposes, various equivalent modifications are possible within the scope of the invention, as those skilled in the relevant art will recognize.
These modifications can be made to the invention in light of the above detailed description. The terms used in the following claims should not be construed to limit the invention to the specific embodiments disclosed in the specification and the claims. Rather, the scope of the invention is to be determined entirely by the following claims, which are to be construed in accordance with established doctrines of claim interpretation.