The invention relates generally to device color profiles used by image color management systems.
One goal of image color management technology is to ensure that a color image, graphic, or text object (hereinafter collectively referred to as graphical objects) is rendered as close as possible to its original intent on any device, despite differences in imaging technologies and color capabilities between devices. To achieve this goal, color characteristics of devices such as scanners, printers, and display monitors may be determined a priori and encapsulated in a device profile.
A device profile is a file that contains information about how to convert colors in the color space of a specific input device (e.g., a scanner) into a device-independent color space, or from a device-independent color space into an output device's color space. Typical input and output device color spaces include the red-green-blue (RGB) and cyan-magenta-yellow-black (CMYK) color spaces. One illustrative device-independent or profile color space (PCS) is the Commission Internationale de l'Éclairage (CIE) XYZ color space. (See Commission Internationale de l'Éclairage Publication 15.2-1986, “Colorimetry, Second Edition.”)
Referring to
Input and output device profiles 112 are typically created by device manufacturers or third-party vendors and may comprise one or a few different profiles, where each profile may accommodate different input and output color spaces. In an environment in which one, or at most a few, profiles for each device can be determined a priori, the above-described color management scheme may work reasonably well. In an environment in which graphical objects may be subject to a variety of different capture environments (such as images generated by a digital camera), however, a single a priori device profile cannot provide good color reproduction for the different capture environments. Thus, it would be beneficial to provide a technique (apparatus and method) to generate color profiles for graphical objects in a dynamic or automatic fashion.
In one embodiment, the invention provides a method and apparatus to dynamically generate device profiles. These embodiments may include receiving a graphical object having associated profile information, generating a profile based on the associated profile information, and identifying the profile to a color management system. The method may include generating a new graphical object, from the received graphical object, that has had its profile information removed. The act of identifying the profile may include associating a filename with the profile and communicating the filename to the color management system, perhaps via an application programming interface function call. The method may further include communicating the graphical object to the color management system.
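As a minimal sketch of the flow just described, the following illustrates receiving profile information, generating a profile under a unique filename, and identifying that filename to a color management system. All names here (`build_profile`, `register_dynamic_profile`, the serialized file format, and the callback standing in for a CMS API call) are hypothetical and are not part of any actual color management API:

```python
import os
import tempfile

def build_profile(profile_info):
    # Hypothetical builder: serialize the image's tag table into a
    # profile file body (a real implementation would emit ICC binary data).
    body = "".join(f"{tag}={value}\n" for tag, value in sorted(profile_info.items()))
    return body.encode("utf-8")

def register_dynamic_profile(profile_info, cms_callback):
    """Generate a profile from an image's embedded profile information,
    write it under a unique filename, and identify that filename to a
    color management system via a caller-supplied API callback."""
    with tempfile.NamedTemporaryFile(suffix=".icc", delete=False) as f:
        f.write(build_profile(profile_info))
        filename = f.name
    cms_callback(filename)   # e.g., an API function call into the CMM
    return filename
```

The callback keeps the sketch decoupled from any particular CMS interface; in practice, the filename would be passed to whatever profile-registration call the color management system exposes.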
In another embodiment, a method and apparatus are provided to distinguish between a newly received graphical object's profile information and previously received profile information. In these embodiments, if the newly received graphical object's associated profile information is not equivalent to the previously received profile information, a new profile is created and identified to the color management system. Equivalence may be determined by comparing profile attribute values such as measurement tag values, illuminant tag values, and/or rendering intent values.
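An equivalence test of this kind can be sketched as a tag-by-tag comparison. The tag names below follow the ICC profile format referenced in this description, but the comparison itself is illustrative only, not an actual CMS API:

```python
# Tags compared to decide whether two sets of profile information are
# equivalent: measurement tags, illuminant tags, and rendering intent tags.
EQUIVALENCE_TAGS = (
    "redColorantTag", "greenColorantTag", "blueColorantTag",   # measurement
    "mediaWhitePointTag", "viewingConditionsTag",              # illuminant
    "redTRCTag", "greenTRCTag", "blueTRCTag",                  # rendering intent
)

def profiles_equivalent(new_info, prior_info):
    """Return True only if every compared tag value matches."""
    return all(new_info.get(t) == prior_info.get(t) for t in EQUIVALENCE_TAGS)
```

A mismatch in any single compared tag is enough to trigger generation of a new profile.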
Methods in accordance with the invention may be stored in any media that are readable and executable by a computer system. Illustrative media include: semiconductor memory devices such as EPROM, EEPROM, and flash devices; magnetic disks (fixed, floppy, and removable); other magnetic media such as tape; and optical media such as CD-ROM disks. Further, methods in accordance with the invention may be embodied in a hardware device such as a printed circuit board comprising discrete logic, integrated circuits, or specially designed application specific integrated circuits (ASICs).
Techniques (including methods and devices) to dynamically generate device profiles are described. The following embodiments of this inventive concept are illustrative only and are not to be considered limiting in any respect.
Referring to
Bridge circuit 208 provides an interface for system random access memory (RAM) 210, accelerated graphics port (AGP) 212 devices such as display unit 214, and one or more expansion slots 216. Expansion slots 216 may be personal computer memory card international association (PCMCIA) slots.
Bridge circuit 218 may also couple system bus 206 to secondary bus 220, while also providing universal serial bus (USB) 222 and integrated device electronics (IDE) 224 interfaces. Common IDE devices include magnetic and optical storage units 226. Also coupled to secondary bus 220 may be system read only memory (ROM) 228, keyboard controller (KYBD) 230, audio device 232, and input-output (I/O) circuit 234. One illustrative bridge circuit 218 is the 82371AB PCI-to-ISA/IDE controller manufactured by Intel Corporation. Input-output circuit 234 may provide an interface for parallel 236 and serial 238 ports, floppy disk drives 240, and infrared ports 242.
Camera 202 may associate (e.g., store) profile information with each image at the time the image is captured. The associated profile information may include profile information in accordance with the International Color Consortium's (ICC's) profile format specification, version 3.4, August 1997. Thus, an image file generated by camera 202 may have the structure shown in
Measurement tags redColorantTag, greenColorantTag, and blueColorantTag represent the relative XYZ values of the input device's (e.g., camera 202's) red, green, and blue colorants. Rendering intent information such as the red, green, and blue tone reproduction curve (TRC) tags or attributes may be used by a color management module (CMM) to linearize RGB input and may be ignored if the input data is already linear. Illuminant tag information such as the mediaWhitePointTag may be used to record the XYZ (e.g., PCS color space) values of the capture media's (e.g., digital “film”) white point. Another illuminant tag that may be recorded by camera 202 and included in an image's profile information 300 is the viewingConditionsType attribute and its associated tag value. The viewingConditionsType attribute may record the illuminant condition under which an image is captured, such as whether it was taken under daylight, tungsten, or fluorescent lighting conditions. (In one embodiment of the ICC profile format, the mediaWhitePointTag value is used for generating absolute colorimetry and is referenced to the PCS, so that the media white point as represented in the PCS is equivalent to this tag value.)

Because each image captured by camera 202 may be subject to a different illumination condition, it is, in general, not possible to generate a color profile a priori that provides good color reproduction of the captured image. This is one distinguishing feature between a digital camera and other image capture devices, such as digital scanners, which have a substantially constant capture environment. Without a means of generating a device profile based on the image itself, this uncertainty about an image's illuminant condition substantially limits a color management system's ability to render the image as close as possible to its original intent on any device, despite differences in imaging technologies and color capabilities between devices.
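To make the roles of these tags concrete, the following sketch shows how a CMM might use them to convert device RGB to PCS XYZ: each channel is first linearized through its TRC (modeled here as a simple gamma, one common TRC encoding), then the linear channels are weighted by the colorant XYZ values. The specific gamma model and the numeric colorant values are illustrative assumptions, not values from any actual profile:

```python
def rgb_to_xyz(rgb, trc_gammas, colorants):
    """Convert device RGB to PCS XYZ.

    trc_gammas: per-channel gamma exponents standing in for the TRC tags.
    colorants:  three XYZ triples standing in for redColorantTag,
                greenColorantTag, and blueColorantTag.
    """
    # Linearize each channel with its tone reproduction curve.
    linear = [c ** g for c, g in zip(rgb, trc_gammas)]
    # Weight the linearized channels by the colorant XYZ values and sum.
    return tuple(
        sum(linear[i] * colorants[i][j] for i in range(3)) for j in range(3)
    )
```

Note that device white (1, 1, 1) maps to the sum of the three colorant triples, which is why the colorant values and the mediaWhitePointTag must be mutually consistent in a well-formed profile.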
One method to dynamically generate a device profile is illustrated in
In another embodiment, referred to herein as the “live” mode of operation, camera 202 is coupled to computer system 200 during image capture, periodically transferring captured images in an automated manner. By way of example, camera 202 may be coupled to computer system 200 via electrical cable, radio frequency or infrared transmission channels, and may transfer images to computer system 200 at a rate of up to approximately 30 images per second. The transfer may be initiated by computer system 200 or by camera 202. When camera 202 is coupled to computer system 200, the probability of successive images having different taking conditions is relatively small. Thus, when operating in the live mode, it may not be necessary to create a new profile for every image that is transferred from camera 202 to computer system 200.
Referring to
If there is no match (the ‘no’ prong of step 504), a new profile is generated as described above and assigned a unique filename (step 506). The newly created profile may be indexed in a manner that allows its use with another image (step 508), and the CMM is notified of the new profile's filename via an appropriate applications programming interface (API) call (step 510). In one embodiment, a profile is generated for each unique set of tag table data, and a list/table of the filenames and associated profile information is kept available so that each incoming (i.e., current) image's profile data may be compared to prior unique profile data. In another embodiment, a new profile is generated (step 506) only when the current image's profile information differs from previous profile data by a specified amount. For example, a new profile may be created when the current image's mediaWhitePointTag value differs from a previous profile's mediaWhitePointTag value by a first specified percentage, or when its viewingConditionsTag value differs from a previous profile's viewingConditionsTag value by a second specified percentage.
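A percentage-based comparison of this kind can be sketched as follows, treating a tag value as a tuple of components (e.g., a mediaWhitePointTag XYZ triple). The function name and component-wise rule are illustrative assumptions; the description does not fix how the percentage is applied:

```python
def exceeds_tolerance(new_value, old_value, pct):
    """Return True if any component of the new tag value differs from the
    corresponding prior component by more than pct percent of that
    prior component (triggering generation of a new profile)."""
    return any(
        abs(n - o) > abs(o) * pct / 100.0 for n, o in zip(new_value, old_value)
    )
```

Under this rule, small fluctuations in the capture environment reuse an existing profile, while a shift beyond the specified percentage forces step 506 to run.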
If there is a match between the current image's profile data and profile data associated with a previous image (the ‘yes’ prong of step 504), the filename associated with the matching profile's data is determined (step 512) and provided to the CMM through an appropriate API call (step 510). If the live mode session is complete (the ‘yes’ prong of step 514), processing is terminated (step 516). If the live mode session is not complete (the ‘no’ prong of step 514), processing continues at step 500.
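The match/no-match loop of steps 500 through 514 amounts to caching profiles keyed by their tag data, so each unique capture condition produces exactly one profile. A minimal sketch, in which the profile-creation step is supplied as a callable (a hypothetical stand-in for steps 506 through 510):

```python
class ProfileIndex:
    """Index of previously generated profiles, keyed by profile tag data."""

    def __init__(self, create_profile):
        self._create = create_profile   # callable: tag dict -> profile filename
        self._table = {}                # tag-data key -> profile filename

    def filename_for(self, tags):
        # Hashable key built from the image's profile tag data.
        key = tuple(sorted(tags.items()))
        if key not in self._table:
            # 'no' prong: generate a new profile and index it.
            self._table[key] = self._create(tags)
        # 'yes' prong: reuse the filename of the matching profile.
        return self._table[key]
```

The returned filename is what would be handed to the CMM through the API call of step 510, whether the profile is new or reused.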
In another embodiment, the ability to distinguish between live mode and non-live mode operations may be provided in a single application (comprising one or more computer programs) as shown in
If the imaging device is operating in the live mode (the ‘yes’ prong of step 604), the received image's profile information is compared with existing (i.e., previously generated and stored) profile data that is subject to change with the image's capture environment, such as illuminant tag values (e.g., mediaWhitePointTag and viewingConditionsTag values) or measurement tag values (e.g., redColorantTag, greenColorantTag, and blueColorantTag values) (step 618). If there is no match (the ‘no’ prong of step 620), a new profile is generated, assigned a unique filename, and indexed as described above and shown in
If there is a match between the current image's profile information and previous profile data (the ‘yes’ prong of step 620), that profile associated with the matching profile data is determined (step 628) and processing continues at step 608.
One advantage of dynamically generating device profiles is that each graphical object (e.g., an image) may be rendered as close as possible to its original intent on any device, despite differences in the imaging technologies and color capabilities between the device that captured the graphical object and the device displaying the graphical object. Another advantage of dynamically generated device profiles in accordance with one embodiment of the invention is that existing color management application programs are not required to be modified—they may interact with dynamically generated device profiles via a standard application programming interface. Yet another advantage of dynamically generated profiles is that in live mode the number of profiles needed to accurately process a large number of images may be small. (This is because a camera's capture environment is not likely to change frequently when coupled to a computer system.) In these cases, only a few unique profiles are created (see
Various changes in the materials, components, and circuit elements, as well as in the details of the illustrated operational methods, are possible without departing from the scope of the claims. For example, an image capture device may be a digital camera or any other device capable of providing an image containing device profile information. Thus, previously captured image files may be provided from computer storage such as magnetic and optical disks, magnetic tape, and flash memory devices. In one embodiment, an image capture device may be coupled to computer system 200 through expansion slots 216 or through I/O circuit 234.
Method steps of
Number | Name | Date | Kind |
---|---|---|---|
5298993 | Edgar et al. | Mar 1994 | A |
5550646 | Hassan et al. | Aug 1996 | A |
5694227 | Starkweather | Dec 1997 | A |
5739809 | McLaughlin et al. | Apr 1998 | A |
5806081 | Swen et al. | Sep 1998 | A |
5818525 | Elabd | Oct 1998 | A |
5881209 | Stokes | Mar 1999 | A |
5956044 | Giorgianni et al. | Sep 1999 | A |
5982416 | Ishii et al. | Nov 1999 | A |
6075888 | Schwartz | Jun 2000 | A |
6088038 | Edge et al. | Jul 2000 | A |
6151134 | Deppa et al. | Nov 2000 | A |
6226011 | Sakuyama et al. | May 2001 | B1 |
6236464 | Kohtani et al. | May 2001 | B1 |
6268930 | Ohta et al. | Jul 2001 | B1 |
6273535 | Inoue et al. | Aug 2001 | B1 |
6283858 | Hayes, Jr. et al. | Sep 2001 | B1 |
6307961 | Balonon-Rosen et al. | Oct 2001 | B1 |
6337922 | Kumada | Jan 2002 | B1 |
6445461 | Ozawa et al. | Sep 2002 | B1 |
6477318 | Ishii | Nov 2002 | B1 |
6493028 | Anderson et al. | Dec 2002 | B1 |
6504950 | Murashita et al. | Jan 2003 | B1 |