Video based color sensing device for a printing press control system

Information

  • Patent Grant
  • Patent Number
    5,767,980
  • Date Filed
    Tuesday, June 20, 1995
  • Date Issued
    Tuesday, June 16, 1998
Abstract
A color sensing device of a printing press control system, having a plurality of lamp fixtures (100 and 102) for providing light in the visible region and the near infrared region of the spectrum to illuminate a viewing area (104), a camera assembly (108), the camera assembly having multiple channels to capture images in the visible region and the near infrared region, and at least one lens for generating the images, a calibration target (106) with a uniform light reflectance, a device for adjusting the distribution of the light so that the image captured from said calibration target in each channel of the camera assembly is as even as possible, a device for applying a position related compensation process in order to obtain an image which corresponds to a position-invariant viewing condition, and a device for applying a camera value related compensation process in order to obtain an image under a standard viewing condition.
Description

BACKGROUND OF THE INVENTION
The present invention relates to control systems for a printing press.
In the past, four process inks (cyan, magenta, yellow and black) have been used on a printing press to produce copies with a gamut of colors. To improve trapping and reduce ink cost, various undercolor removal (UCR) and gray component replacement (GCR) techniques have been used in color separation processing. The UCR and GCR techniques remove a certain amount of the cyan, magenta and yellow ink from some printing areas and replace it with a certain amount of the black ink. Thus, the black ink has been used to generate not only the text but also the color image, reducing the total volume of ink used to print. Different color separation equipment manufacturers offer different UCR and GCR techniques to determine when this black ink substitution will take place and what amount of ink will be substituted.
In the past, the press room color reproduction quality control process has been divided into two categories: "control by target" and "control by image."
In the "control by target" method, a set of color control targets is printed in a margin. Instruments, such as densitometers, are used to monitor the color attributes, such as the optical density, of these targets. The printing press is then adjusted based on the measured deviation of these control targets from a predefined attribute value. The application of this method for quality control creates waste and consumes resources in that an additional process is required to cut off this target from the final product. It also requires tight material control for paper color and porosity, ink color, and other printing parameters so that the desired image color is maintained.
In the "control by image" method, the print image on a production copy is compared with the printed image on a reference copy, called a proof. The press is then adjusted based on the difference between the production image and the reference image. This system is more versatile because it does not require an additional target to be printed. The "control by image" method is also more accurate than the "control by target" method because in some situations although the measured attributes of control targets on the production and reference images are the same, the two images will look different. Conventionally, both the image comparing task and the press adjusting task are performed by a press operator. To improve the productivity and the color consistency, several automatic printing quality inspection systems have been reported recently. These systems use opto-electronic sensor devices, such as a spectrophotometer, or CCD color cameras, to measure the color reproduction quality. Currently, the bandwidth of these sensor devices is limited to the visible region of 400 nm through 700 nm in wavelength of the electromagnetic spectrum. However, within the visible region, it is not possible for these devices to reliably distinguish the black ink from the process black made by the combination of cyan, magenta, and yellow inks, or to determine whether the black ink or all cyan, magenta, and yellow inks should be adjusted. Although these devices, such as spectrophotometers, might be able to measure the printed color accurately, it is difficult to use the measured color information to achieve the automatic control for a four-color press without a target due to the involvement of the UCR and GCR techniques. A control method without targets could require selecting the points in the image to be measured or a large number of measurements would have to be acquired. A camera system can acquire a large number of measurements simultaneously, giving it an advantage when targets are not printed.
Since the quality of control can be attributed, in part, to the consistency of measurement, it becomes necessary to provide the means to ensure this consistency. In order to control the printing press accurately, there are two fundamental requirements for this camera based color sensing system. These two requirements are position-invariant and time-invariant. The position-invariant requirement ensures that consistent measurements can be obtained from a sample regardless where this sample is positioned in the camera field of view. The time-invariant requirement ensures that repeatable measurements can be obtained from a sample over a long period of time.
However, many components used in a camera measurement system are not position-invariant. For example, a lens transmits less light at its border region than it does in its center region. Normally, the relative illumination of a lens is proportional to the fourth power of the cosine of the viewing angle. This means that at a 30-degree viewing angle, the relative illumination is only about 56% of that along the optical axis of the lens. At a 45-degree viewing angle, the relative illumination is further reduced to 25%. Thus, an image obtained from a uniformly illuminated area will have darker corners, especially when the viewing angle is large. Depending upon the type of glass and surface coatings used, this dark corner problem may also be wavelength related. Therefore, certain camera channels may exhibit the dark corner problem more strongly than other camera channels. To overcome this dark corner problem, maintain a high dynamic range, and enable a uniform target to be viewed by the camera as uniform, more light is needed in the corner regions of the camera field of view.
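For reference, this falloff can be computed directly from the cosine-fourth law stated above. The short Python sketch below (the function name is illustrative only) evaluates the relative illumination at a few viewing angles.

```python
import math

def relative_illumination(view_angle_deg: float) -> float:
    """Relative illumination of a simple lens at an off-axis viewing angle,
    per the cosine-fourth falloff law."""
    return math.cos(math.radians(view_angle_deg)) ** 4

for angle in (0, 15, 30, 45):
    print(f"{angle:2d} deg: {relative_illumination(angle):.2f}")
# 0 deg: 1.00, 15 deg: 0.87, 30 deg: 0.56, 45 deg: 0.25
```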
Many components are not time-invariant. For example, the output of a lamp may vary based on the variation of the supplied voltage and ambient temperature. The characteristics of the camera preamplifier and analog-to-digital conversion circuit may also change from time to time. The camera lens iris setting may also be changed by vibration. All of these factors decrease the system repeatability.
To achieve and maintain the position-invariant and time-invariant requirements, a standard viewing condition is needed in order to compensate for these variables.
SUMMARY OF THE INVENTION
A principal feature of the present invention is the provision of an improved lighting system for a control system of a printing press.
A color sensing device for a printing press control system comprising, a plurality of lamp fixtures for providing light in the visible region and the near infrared region of the spectrum to illuminate a viewing area, a camera assembly, said camera assembly comprising multiple channels to capture images in the visible region and the near infrared region, and at least one lens for generating said images, a calibration target with a uniform light reflectance, means for adjusting the distribution of said light so that images captured from said calibration target in each channel of said camera assembly are as uniform as possible, means for applying a position related compensation process in order to obtain an image which corresponds to a position-invariant viewing condition, and means for applying a camera value related compensation process in order to obtain an image which corresponds to a standard viewing condition.
A feature of the present invention is the provision of means for providing a light compensation.
Another feature of the invention is that the device obtains an image which corresponds to a uniform lighting condition.
Thus, a feature of the invention is that the device calibrates the lighting system, and provides a perceived uniform lighting condition which provides position independent measurements for the control system of the printing press.
Further features will become more fully apparent in the following description of the embodiments of the invention, and from the appended claims.





DESCRIPTION OF THE DRAWINGS
In the drawings:
FIG. 1 is a block diagram of a control system for a printing press of the present invention;
FIG. 2 is a diagrammatic view of the system of FIG. 1;
FIG. 3 is a block diagram of the control system of FIG. 1;
FIG. 4 is a diagrammatic view of a camera or sensor for the control system of the present invention;
FIG. 5 is a diagrammatic view of another embodiment of the camera or sensor for the control system for the present invention;
FIG. 6 is a diagrammatic view of a further embodiment of a camera or sensor for the control system of the present invention;
FIG. 7 is a chart plotting the normalized percentage of IR reflection against the percentage of dot area in a printed sheet;
FIG. 8 is a diagrammatic view of a spectrum of electromagnetic waves including the visible spectrum and the infrared spectrum;
FIG. 9 is a diagrammatic view of a set of elements for a sensor space and an ink space;
FIG. 10 is a block diagram of the sensor space and ink space in conjunction with the control system of the present invention;
FIG. 11 is a block diagram of the control system for adjusting the printing press;
FIG. 12 is a diagrammatic view of a lighting arrangement for the control system of the printing press and FIG. 12a is a diagrammatic view of a calibration target positioned in the field of view of a camera device;
FIG. 13 is a chart showing the intensity of the output of two groups of lamps in the lighting arrangement;
FIG. 14 is a chart showing percentage of transmittance of two filters used with the lamps;
FIG. 15 is a diagrammatic view of a multi-step calibration target; and
FIG. 16 is a chart showing a mapping between measured camera values and desired camera values.





DESCRIPTION OF THE PREFERRED EMBODIMENTS
Referring now to FIG. 1, there is shown a control system generally designated 10 for a printing press 11 of the present invention.
The control system 10 has a 4 channel sensor 21, a data converter 23 for processing information from the sensor 21, and a device 25 for controlling ink for the press 11. As will be seen below, the 4 channel sensor 21 detects the energy reflected from a paper surface, such as the paper web for the press 11, in both the visible region and the infrared region of the electromagnetic spectrum. As shown in FIG. 8, electromagnetic waves in the infrared region have longer wavelengths than those in the visible spectrum, with the wavelengths of the electromagnetic waves in the region of visible light being approximately 400 to 700 nanometers (nm), and the wavelengths of the electromagnetic waves in the infrared region, including near infrared, being equal to or greater than 800 nm.
As shown in FIG. 2, the control system 10 has a support 12 for placement of a sheet of paper 14 with image or indicia 16 on the sheet 14 in a configuration beneath a pair of opposed lights 18 and 20 for illuminating the sheet 14. The system 10 has a first color video camera or sensor 22 having three channels for detecting attributes of the inks from the sheet 14 in the visible region of the electromagnetic spectrum, such as red, green and blue, or cyan, magenta, and yellow, and for sending the sensed information over separate lines or leads 24, 26, and 28 to a suitable digital computer 30 or central processing unit (CPU) having a random access memory (RAM) and a read only memory (ROM), with the computer or CPU 30 having a suitable display 32. Thus, the three distinct color attributes of the inks are sensed by the camera 22 from the sheet 14, and are received in the memory of the computer 30 for storage and processing in the computer 30.
The system 10 also has a black/white second video camera or sensor 34 having a filter 50 such that it senses the attributes of the inks in the infrared region of the electromagnetic spectrum, having a wavelength greater than the wavelengths of the electromagnetic waves in the visible region of light. The camera or sensor 34 thus senses infrared information from the sheet 14, and transmits the sensed information over a lead 36 to the computer 30, such that the infrared information is stored in and processed by the computer 30.
The normalized percentage of infrared (IR) reflection vs. the percentage of dot area is shown in the chart of FIG. 7. It will be seen that the infrared reflectance of the cyan, magenta, and yellow inks shows no significant change as a function of percentage of dot area. However, the normalized infrared reflectance of the black ink displays a significant change as a function of percentage of dot area, and changes from a normalized value of 100% IR reflection for 0% dot area to approximately 18% IR reflection corresponding to 100% dot area. Hence, the black ink may be easily sensed and distinguished from the other color inks in the infrared region of the electromagnetic waves.
As shown in FIG. 2, the sheet 14 may contain a printed image or indicia 16 which is obtained from a current press run of the press 11, termed a production or current copy. In addition, a sheet 38 containing a printed image or indicia 40, termed a reference copy, from a previous reference press run may be placed on the support 12 beneath the cameras 22 and 34 in order to sense the energy reflected from the sheet 38, and send the sensed information to the memory of the computer 30 for storage and processing in the computer 30, as will be described below. Thus, the cameras or sensors 22 and 34 may be used to sense both the current copy or sheet 14 and the reference copy or sheet 38. The information supplied by the cameras 22 and 34 is formed into digital information by a suitable analog to digital converter in a frame grabber board on the computer 30. Thus, the computer 30 operates on the digital information which is stored in its memory corresponding to the information sensed from the sheets 14 and 38 by the cameras or sensors 22 and 34.
Referring now to FIG. 3, there is shown a block diagram of the control system 10 for the printing press 11 of the present invention. As shown, the four inks (cyan, magenta, yellow, and black) of the four-color printing press 11 are first preset, after which a print is made by the press 11 with a current ink setting, thus producing a production or current printed copy, as shown. The color and black/white video cameras or sensors 22 and 34 of FIG. 2 serve as a four channel sensor 21 to capture an image of the current printed copy, and then place this information into the memory of the computer 30 after it has been formed into digital information.
Next, an "Ink Separation Process" 23 is used to convert the red, green, blue and IR images captured by the four channel sensor 21 into four separated cyan, magenta, yellow and black ink images, which represent the amount of corresponding ink presented on the live copy. The "Ink Separation Precess" 23 may utilize mathematic formulas, data look up tables or other suitable means to perform the data conversion task.
Similar processes are also applied to the reference copy. First, the four channel sensor 21 is used to capture the red, green, blue and IR images from the reference copy. Then, the "Ink Separation Process" 23 is utilized to obtain the cyan, magenta, yellow and black ink images, which represent the amount of corresponding ink presented on the reference copy.
As shown, the ink images of the production copy are compared with the ink images of the reference copy by the computer 30 to detect the variation of ink distribution for each of the cyan, magenta, yellow and black inks.
The determined differences in ink distribution are then processed by the computer 30 in order to obtain an indication for controlling the keys or other devices of the press 11 in an ink control process, and thus provide an indication of an ink adjustment to the press to obtain further copies which will have a closer match to the reference copy. The indication of ink changes may be automatically supplied to the press 11, or the operator may utilize the indications of ink color attributes to set the press 11, such as adjustments to ink input rate by using the keys.
In the past, four process inks (cyan, magenta, yellow, and black) have been used on a printing press to produce copies with a gamut of colors. In these systems, the black ink has been used to generate not only the text but also the color image. In a control by image system, the print image of a production copy is compared with the printed image on a reference copy, termed a proof, and the press is adjusted based on the difference between the production image and the reference image. However, within the visible region, it is not possible to reliably distinguish the black ink from the process black made by the combination of cyan, magenta, and yellow inks, or to determine whether the black ink or all of the cyan, magenta, and yellow inks should be adjusted.
The four channel sensor 21 is utilized to sense attributes in three channels of the visible region, and its fourth channel senses an attribute in the infrared region in order to determine the correct amount of inks, including black ink, to correctly reproduce the proof. The printing press control system uses the four channel detector or sensor 21 to detect the energy reflected from a paper surface, such as the sheets 14 and 38, or the paper web of the press 11, with three channels being in the visible region and one channel being in the infrared region of the electromagnetic spectrum. The control system 10 has a device 23 for converting the output of the sensing device 21 to a set of variables which represent the amount of ink presented on the paper for any of the cyan, magenta, yellow, and black inks, and a device 25 responsive to the converting device 23 for adjusting the four-color printing press 11 to maintain the color consistency.
In a preferred form, the bandwidth of the infrared channel may be between 800 nm and 1100 nm, which is a portion of the near infrared region, and which is compatible with a regular silicon detector, although the working wavelength of the infrared channel may be longer than 1100 nm. At least three distinct channels are utilized in the visible region, which may correspond to red, green, and blue (RGB), or cyan, magenta, and yellow (CMY), or other colors. The bandwidth of each channel in the visible region may be less than 70 nm, more than 100 nm, or any value in between; channels having multiple peaks in their passbands, such as magenta, are also included.
The sensor device 21 may be constructed from a single element detector, a one-dimensional (linear) detector, a two-dimensional (area) detector, or other suitable detector structure, as will be seen below. The sensor device may also be constructed by adding an infrared channel to an existing device, such as an RGB color camera or a densitometer, or by extending the working band of an existing device into the infrared region, e.g., adding infrared capability to a spectrophotometer. The light sources 18 and 20 provide sufficient radiated energy in both the visible region and the infrared region, depending upon the sensor working band and sensitivity.
All possible values which are output from the sensor device 21 may be used to form a vector space. For example, all possible values output from the sensor device 21 with red, green, blue and infrared channels form a four dimensional vector space R-G-B-IR, with the vector space being termed a sensor space S.sub.1, with each output from the sensor device 21 being termed a vector in the sensor space S.sub.1, and with the minimum number of dimensions required by the sensor structure being 4. Thus, as shown in FIG. 9, a set S.sub.1 of elements e.sub.11 and e.sub.12 is given, with the elements e.sub.11 of the set S.sub.1 being the vectors v.sub.11 corresponding to the output of the sensor device 21 when sensing a production or current printed copy, and with the elements e.sub.12 of the set S.sub.1 being the vectors v.sub.12 corresponding to the output of the sensor device 21 when sensing a reference printed copy. In accordance with the present invention, the printed image on a production or current copy may be compared with the printed image on a reference copy in the sensor space, and if the difference between the live copy L.C..sub.s and the reference copy R.C..sub.s is within a predefined tolerance level delta, at least for all the channels in the visible region of the sensor space, such that |L.C..sub.s -R.C..sub.s| < delta, the production or current copy is said to be acceptable by definition.
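Expressed as a minimal sketch (assuming H x W x 4 arrays of camera values in channel order R, G, B, IR, and a scalar tolerance delta chosen for the press), the acceptance test for the visible channels could look like the following.

```python
import numpy as np

def copy_is_acceptable(live: np.ndarray, ref: np.ndarray, delta: float) -> bool:
    """True when every pixel of every visible channel (R, G, B) of the live
    copy lies within delta of the reference copy, i.e. |L.C. - R.C.| < delta."""
    diff = np.abs(live[..., :3].astype(int) - ref[..., :3].astype(int))
    return bool((diff < delta).all())
```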
A set of variables may be defined to represent the amount of ink presented in a given area. For example, a set of variables C, M, Y, and K can be defined to represent or be a function of the amount of cyan, magenta, yellow, and black ink in a given area. This set of variables may correspond to the ink volume, average ink film thickness, dot size, or other quantities related to the amount of ink in a given area on the paper surface. The vector space formed by this set of variables is termed an ink space S.sub.2, with the ink space S.sub.2 having a dimension of 4 for a four color printing press 11. Thus, with reference to FIG. 9, a set S.sub.2 of elements d.sub.11 and d.sub.12 is given, with the elements d.sub.11 of the set S.sub.2 being the vectors v.sub.j1 corresponding to the variables associated with the production or current copy in the ink space S.sub.2, and with the elements d.sub.12 of the set S.sub.2 being the vectors v.sub.j2 corresponding to the variables associated with the reference copy in the ink space S.sub.2.
With reference to FIG. 9, there exists at least one transfer function or transformation phi which can map the elements d.sub.11 and d.sub.12 of the set S.sub.2, or the four dimensional ink space, into the elements e.sub.11 and e.sub.12 of the set S.sub.1, or the four dimensional sensor space, with the transformation phi being termed a forward transfer function, as shown in FIGS. 9 and 10. It is noted that the subsets in each set S.sub.1 and S.sub.2 may overlap or may be the same.
The forward transfer function may be used in a soft proof system which can generate a proof image which can be stored in the system as a reference or can be displayed on a CRT screen.
With further reference to FIG. 9, there exists at least one transfer function or reverse transformation phi.sup.-1 which can map the elements e.sub.11 and e.sub.12 of the set S.sub.1 of the four dimensional sensor space into the elements d.sub.11 and d.sub.12 of the set S.sub.2 of the four dimensional ink space, with the transfer function being termed a reverse transfer function. Thus, both the production image and the reference image in the sensor space or set S.sub.1 can be mapped into the ink space or set S.sub.2 by applying the reverse transfer function phi.sup.-1 point by point as shown in FIGS. 9 and 10.
The difference between the production image and the reference image in the ink space S.sub.2 thus represents the difference of the ink distribution for each of the cyan, magenta, yellow, and black inks, as shown in FIG. 11. The difference between the live and reference images in the ink space S.sub.2 indicates which printing unit should be adjusted, which direction, up or down, it should be adjusted, and the amount of ink which should be adjusted. A suitable press control formula may be developed to adjust press parameters, such as ink input rate in lithographic or letterpresses, ink consistency in flexographic or gravure presses, water input rate in lithographic presses, or temperature in any of the above, based on the differences between the production and the reference image in the ink space S.sub.2.
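No particular control formula is disclosed here; as a purely illustrative sketch, a proportional rule applied over ink-key zones might be written as below, where the zone layout, gain, and function name are assumptions rather than the press control formula of the invention.

```python
import numpy as np

def key_adjustments(live_ink: np.ndarray, ref_ink: np.ndarray,
                    n_zones: int, gain: float = 1.0) -> np.ndarray:
    """live_ink, ref_ink: H x W x 4 ink-space images (C, M, Y, K amounts).
    Returns an (n_zones, 4) array of signed corrections per ink-key zone:
    positive values suggest increasing the ink feed, negative values decreasing it."""
    zones = np.array_split(np.arange(live_ink.shape[1]), n_zones)  # zones across the web width
    out = np.zeros((n_zones, 4))
    for i, cols in enumerate(zones):
        out[i] = gain * (ref_ink[:, cols, :].mean(axis=(0, 1)) -
                         live_ink[:, cols, :].mean(axis=(0, 1)))
    return out
```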
In accordance with the present invention, the press adjustments can be achieved by the automatic control system 10, by the press operator alone, or by the interaction between the automatic control system 10 and the press operator. Also, the sensor device 21 may be used to monitor the printing web of the press 11 directly, i.e., on press sensing, or to monitor the prints collected from the folder of the press, i.e., off press sensing. If the digital images from the color separation processing, or the film/plate images, are available, the image of the reference copy in the sensor device 21 can be generated electronically by the forward transfer function phi. The electronically generated reference may be used to set up the press 11 in order to reduce the make ready time.
The color reproduction quality can be maintained through the entire press run, through different press runs on different presses, or at different times. Thus, a closed loop automatic color reproduction control system may be formed without an additional color control target. The variation of ink, paper, and other press parameters can be compensated such that the printed copies have the highest possible overall results in matching the reference copy.
As shown in FIG. 4, the camera or sensor 22 may be associated with a rotating filter member 52 having filters which only transmit the desired colors F.sub.1, F.sub.2, and F.sub.3, such as red, green, and blue during rotation, such that the camera or sensor 22 senses and records the colors F.sub.1, F.sub.2, and F.sub.3 sequentially or separately from the printed material, which may be taken either from the current press run or from the reference press run. In addition, the filter member 52 may have an infrared (IR) filter F.sub.4 in order to sense and record the energy reflected from the printed material in the infrared region. The information received by the camera or sensor 22 from the filters may be recorded in the computer or CPU for use in forming the desired data to control the inks, as previously discussed.
In another form as shown in FIG. 5, the camera or sensor 22 may comprise a charge coupled device (CCD) with built in filters which converts light energy reflected from the printed material into electric energy in a video camera, i.e. F.sub.1, F.sub.2, F.sub.3, and F.sub.4, (IR), such as the distinct colors red, green, and blue in the visible region, and the near infrared energy in the infrared region, in order to supply the information to the computer 30 for storage and processing, as previously discussed.
Another embodiment of the camera or sensor 22 of the present invention is illustrated in FIG. 6, in which like reference numerals designate like parts. In this embodiment, the camera or sensor 22 has a beam splitter in order to separate the incoming light reflected from the printed material into an infrared beam for a first CCD 1, F.sub.1 such as red for a second CCD 2, F.sub.2 such as green for a third CCD 3, and F.sub.3 such as blue for a fourth CCD. In this embodiment, suitable prisms, lenses, or mirrors may be utilized to accomplish the beam splitting of light in order to obtain the desired color attributes in the various charge coupled devices to supply the information to the computer 30 for storage and processing in the computer 30, in a manner as previously described. Of course, any other suitable camera or sensing device may be utilized to obtain the desired colors.
Thus, a control system 10 for a printing press 11 is provided which ascertains three distinct attributes, such as colors, in the visible region of electromagnetic waves and an attribute in the infrared region of the electromagnetic spectrum for the printed inks. The control system 10 utilizes these four attributes in a four channel device to indicate and control the ink colors for use in the press 11.
Thus, the colors may be sensed from a sheet taken during a current press run, and from a sheet taken during a reference press run, after which the sensed information is utilized in order to modify ink settings of a press 11 in order to obtain repeatability of the same colors from the reference run to the current press run. In this manner, a consistent quality of colors may be maintained by the printing press 11 irrespective of the number of runs after the reference run has been made, and may be continuously used during a press run if desired.
A camera based color sensing device for a printing press control system usually comprises a set of lamp fixtures and a camera assembly. In order to accurately control the printing process, this color sensing device should provide a position-invariant and time-invariant measurement.
However, many factors will affect the consistency and repeatability of the system. The lens has an uneven light transmittance from the center to the border. The amount of light produced by the lamp fixtures varies from time to time. The sensitivity of the image sensor may also drift due to temperature variation and aging. A device and calibration procedures are needed to provide a standard viewing condition for this camera based color sensing system.
As shown in FIG. 12, a four channel camera assembly 108 is used for capturing images. However, an integrated four channel camera, such as that shown in FIG. 5 or FIG. 6, is not yet commercially available at the present time. The two-camera approach shown in FIG. 2 provides a convenient way to construct this four channel camera 108. In this embodiment, a color camera is used for capturing red, green and blue images and a monochrome camera for capturing near infrared images. Each of these four camera channels normally comprises a Charge Coupled Device (CCD) image sensor. The working wavelength range of this camera assembly is from 400 nm to about 1000 nm, about twice the range of the visible light spectrum. As with any other optical component, the light transmitting characteristics of the lens are wavelength related. A special lighting arrangement is often needed to ensure that a standard viewing condition can be established for each of these four camera channels, even if two cameras and two lenses are used. This standard viewing condition is also needed to maintain measurement consistency between two different color sensing systems.
As shown in FIGS. 12 and 13, the preferred light source comprises first and second groups of lamps 100 and 102, respectively, to provide light in both the visible region (400-700 nm) and the near infrared region (700-1000 nm). At least one of the two groups of lamps 100 or 102 operates only in a single region, either the visible or the near infrared region, but not in both. For example, the first group of lamps 100 has an output in both the visible and infrared regions, covering the entire 400-1000 nm spectrum. The second group of lamps 102 has an output in the infrared region (700-1000 nm) only.
A halogen lamp is rich in energy in the desired 400-1000 nanometer spectrum and can be used in the two lamp groups 100 and 102. Some halogen lamps have filters to reduce the undesirable energy output at wavelengths longer than 1000 nm. An MR16 lamp with a Constant Color Coating, sold by General Electric, is an example of one such lamp.
As shown in FIG. 14, energy output can be constrained to the desired spectral region by using optical filters. A tempered color temperature compensation filter, such as a SCHOTT FG3 filter with a proper thickness, is used in front of the first lamp group 100 to provide a standard D50 light source with energy extended into the near infrared region. Lamps in the second group 102 can be fitted with a tempered filter, such as a SCHOTT LP78 filter, to block visible light while passing infrared light longer than 780 nm. In order to reduce the ripple component in the light output, a DC power supply can be used to drive these halogen lamps.
Other light sources, such as Xenon lamps, can be used, as long as they provide enough energy in both the visible and near infrared regions. It is not necessary that the lamp be small; lamps with large physical dimensions can also be used. Linear lamps are an example of a device whose light output is spread over a large area.
As shown in FIGS. 12 and 12a, a calibration target 106 with a uniform light reflectance in the visible and the near infrared region is positioned under a rectangular camera viewing area 104.
A blank sheet of paper can be used as the calibration target 106 if it remains flat and smooth, and its material content is homogeneous without granularity. Since this type of paper is not prevalent and the quality is difficult to maintain, a special calibration target can be constructed. A uniform gray calibration target can be made with various paints and surface modifying agents so as to have a flat spectral curve from 400-1000 nm. The gloss of this target is similar to that of a blank sheet of paper used to print a reference or production copy.
As shown in FIGS. 12 and 12a, the calibration target 106 is positioned in the field of view 104 of a four channel camera 108 so that the target surface is nearly perpendicular to the optical axis of the camera 108. The light source is mounted at 45 degrees with respect to the camera optical axis to reduce the direct reflection from the target. All remaining surfaces outside the viewing area are painted black with a matte finish.
A display lookup table is created to cause certain pixel values to become more prominent as viewed on a color monitor. This allows the operator to distinguish small changes in camera values so that the lamps can be adjusted to make the light over the target surface appear more uniform. Using the above viewing method with a lookup table, the first group of lamps 100 is adjusted to minimize the unevenness in the green image. This can be done by aiming the lamps 100 at a different position, readjusting the lamp reflectors if present, or altering the light distribution pattern by using a mesh screen material or neutral density filters. The unevenness is then checked in the red and blue images. If the light distribution patterns in the red and blue images are substantially different from that in the green image, the spectral output of the individual lamps and filters should be checked and corrected if necessary. While keeping the first group of lamps 100 unchanged, the second group of lamps 102 is adjusted so that the unevenness of the infrared image is also minimized. Statistics for each image, such as the standard deviation and average value, can be used to assist this operation.
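As a small sketch of such statistics (assuming an H x W x 4 calibration-target image with channel order red, green, blue, infrared), the per-channel mean, standard deviation, and their ratio can be computed as follows; a smaller ratio indicates a more even light distribution.

```python
import numpy as np

def uniformity_stats(image: np.ndarray) -> dict:
    """Per-channel statistics of a calibration-target image used to judge
    how evenly the lamps illuminate the viewing area."""
    stats = {}
    for ch, name in enumerate(("red", "green", "blue", "infrared")):
        pixels = image[..., ch].astype(float)
        mean, std = float(pixels.mean()), float(pixels.std())
        stats[name] = {"mean": mean, "std": std, "cv": std / mean if mean else 0.0}
    return stats
```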
Multiple images are captured from the calibration target 106 under this lighting condition. The images are averaged to remove individual pixel noise. A neighborhood averaging technique may be used to remove any high spatial frequency noise. The highest pixel value is found within each averaged image. An intermediate image is created by dividing this value by each of the pixel values in the averaged image. Each pixel in the intermediate image is then multiplied by a constant gain factor, e.g., 128 for an 8-bit image. This will create a light compensation image for each of the four channels.
The compensation process can be started by multiplying an image of interest with the light compensation image. The result of this multiplication is then divided by the constant gain factor. The purpose of this operation is to raise pixel values in the darker areas to a level equal to those in the brightest area. The resulting image corresponds to the image of interest as if it had been viewed under a uniform light condition.
The above compensation goal also can be achieved by lowering the pixel values in the brightest areas to a level equal to those in the darkest areas.
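A minimal sketch of the procedure described above, assuming 8-bit single-channel images, the constant gain factor of 128, and an illustrative 5 x 5 neighborhood averaging window, is given below.

```python
import numpy as np
from scipy.ndimage import uniform_filter

GAIN = 128  # constant gain factor for 8-bit images

def build_light_compensation(frames: list[np.ndarray]) -> np.ndarray:
    """frames: repeated captures of the uniform calibration target for one
    camera channel. Returns the light compensation image for that channel."""
    averaged = np.mean(np.stack(frames), axis=0)   # remove individual pixel noise
    averaged = uniform_filter(averaged, size=5)    # neighborhood averaging
    intermediate = averaged.max() / averaged       # highest value divided by each pixel
    return intermediate * GAIN

def apply_light_compensation(image: np.ndarray, comp: np.ndarray) -> np.ndarray:
    """Raise pixel values in darker areas toward the level of the brightest
    area, as if the image of interest had been captured under uniform light."""
    corrected = image.astype(float) * comp / GAIN
    return np.clip(corrected, 0, 255).astype(np.uint8)
```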
Applying the above position related compensation process to an image captured from the calibration target 106 will cause the resultant image to become uniform. When this compensation process is applied to any other captured image, it provides pixel values for the image as if the target was illuminated by a perceived uniform lighting condition. This implies that as the target is moved within the field of view, image features will maintain consistent pixel values. Thus, this position related compensation provides a position-invariant viewing condition to this color sensing system.
In order to reduce the variation caused by the drifting of the lamp and electronics, a gray scale calibration target can be used. As shown in FIG. 15, a gray scale calibration target 110 consists of 12 steps, each with different darkness. The darkest and lightest steps should represent the highest density encountered during the printing process and the whitest paper used, respectively. The number of steps included in this gray scale is based on the accuracy required. Normally, 10 through 30 steps should be sufficient. The material used to make this target should have a flat spectral curve.
After creating this multi-step target, measure the light reflectance from each step over the wavelength range from 400 nm through 1000 nm. Then calculate the averaged reflectance within the bandwidth of each camera channel for each step.
The next step is to determine a desired camera value for the lightest step. This value should be chosen high enough to provide a wide dynamic range, but low enough to prevent camera saturation under typical viewing conditions. Normally, the sensing device has a known relation between the light input and the signal output, such as a linear or a logarithmic relation. Thus, desired camera values for the other steps can be calculated accordingly. Representative data showing averaged reflectance and desired camera values of a 12 step target are provided in Table 1.
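Assuming the linear relation mentioned above, the desired values for the remaining steps follow directly from the averaged reflectances; a short sketch (with the lightest-step value as a parameter) is shown below. The numbers in the comment are taken from the green channel of Table 1.

```python
def desired_camera_values(avg_reflectance: list[float], lightest_value: float) -> list[float]:
    """avg_reflectance: averaged reflectance of each gray-scale step within one
    camera channel's bandwidth, ordered from lightest to darkest. A sensor with
    a logarithmic response would need the corresponding relation instead."""
    lightest = avg_reflectance[0]
    return [lightest_value * r / lightest for r in avg_reflectance]

# desired_camera_values([0.919, 0.872, 0.716], 226) gives approximately
# [226, 214, 176], matching the tabulated desired values for those steps.
```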
During the system setup, adjust iris or camera gains so that the camera value obtained from the lightest step is as close to its desired value as possible. Lock the iris or camera gain settings to prevent any possible changes.
The following paragraphs show a compensation procedure utilizing this multi-step target to eliminate any effect caused by component drifting.
Capture an image from this gray scale target. To reduce any stray light, a black background should be used behind the gray target. Apply the position related compensation to this image. Obtain camera values for each channel and each step as shown in Table 1. Put the desired camera value and the measured camera value in a graph for each camera channel. An example of blue channel data is shown in FIG. 16. Each data value represents a point in the graph in FIG. 16. A mapping can be created by connecting these points on the graph. A thin dotted straight line is also included in FIG. 16 to show the linear relationship. This mapping can be easily implemented by a data lookup table. Mappings for other channels can be generated in a similar way.
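One way to realize such a mapping is a 256-entry lookup table built by piecewise-linear interpolation between the measured/desired data points; the sketch below assumes this approach (the helper name and table size are illustrative).

```python
import numpy as np

def build_correction_lut(measured: list[float], desired: list[float]) -> np.ndarray:
    """Return a 256-entry table mapping measured camera values to desired
    camera values for one channel, by connecting the gray-scale data points
    with straight line segments (end values are held constant beyond them)."""
    order = np.argsort(measured)
    m = np.asarray(measured, dtype=float)[order]
    d = np.asarray(desired, dtype=float)[order]
    lut = np.interp(np.arange(256), m, d)
    return np.clip(np.round(lut), 0, 255).astype(np.uint8)

# Applying the mapping to a captured (and position-compensated) channel image:
# corrected = build_correction_lut(measured_vals, desired_vals)[channel_image]
```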
The above procedure should be performed periodically to compensate for any possible component drift. Thus, this camera value related compensation provides a time-invariant viewing condition and greatly improves the system repeatability.
By applying the position related compensation and then the camera value related compensation, a standard viewing condition can be established. The position-invariant and time-invariant requirements are satisfied.
Thus, in accordance with the present invention a standard viewing condition is provided for the camera based color sensing system to provide improved results in the control system of the printing press.
The foregoing detailed description has been given for clearness of understanding only, and no unnecessary limitations should be understood therefrom, as modifications will be obvious to those skilled in the art.
TABLE 1

                 Averaged Reflectance              Desired Camera Value         Measured Camera Value
Step     Red      Green    Blue     NIR         Red   Green  Blue  NIR       Red   Green  Blue  NIR
 1       0.888    0.919    0.922    0.8908      218   226    227   226       224   234    230   220
 2       0.842    0.872    0.866    0.8212      207   214    213   208       215   223    218   201
 3       0.698    0.716    0.716    0.6854      172   176    176   174       184   187    185   169
 4       0.582    0.603    0.604    0.584       143   148    149   148       159   161    160   144
 5       0.495    0.51     0.512    0.493       122   125    126   125       138   138    137   123
 6       0.39     0.402    0.404    0.3884       96    99     99    99       113   110    109    98
 7       0.294    0.302    0.301    0.2864       72    74     74    73        88    83     83    73
 8       0.199    0.205    0.204    0.1958       49    50     50    50        63    58     57    52
 9       0.148    0.152    0.151    0.1444       36    37     37    37        48    43     43    39
10       0.075    0.074    0.072    0.068        18    18     18    17        25    21     21    22
11       0.039    0.037    0.036    0.0356       10     9      9     9        14    10     11    13
12       0.013    0.012    0.012    0.012         3     3      3     3         6     4      4     7
Claims
  • 1. A device to provide a substantially uniform lighting condition as perceived by a color sensing device for a control system in a printing press, comprising:
  • a first lamp for generating light in at least a visible region of a light spectrum;
  • a second lamp for generating light in only an infrared region of the light spectrum;
  • a calibration target; and
  • means for capturing images in the visible and the infrared regions;
  • wherein the light output by the first lamp is adjustable to reduce unevenness in a first image captured by the capturing means in the visible region, and the light output by the second lamp is adjustable to reduce unevenness in a second image captured by the capturing means in the infrared region to thereby develop a substantially uniform lighting condition as perceived by a color sensing device.
  • 2. A device as defined in claim 1 further comprising position compensation means for applying a position related compensation process to images captured by the capturing means to produce a position-invariant viewing condition.
  • 3. A device as defined in claim 2 wherein the position compensation means generates a compensation image from at least one image captured by the capturing means from the calibration target, and the position compensation means applies the compensation image to subsequent images captured by the capturing means to provide the position-invariant viewing condition.
  • 4. A device as defined in claim 2 wherein the position compensation means comprises a central processing unit.
  • 5. A device as defined in claim 1 wherein the capturing means comprises a camera, and further comprising camera value compensation means for applying at least one camera value related compensation process to images captured by the capturing means to produce a time-invariant viewing condition.
  • 6. A device as defined in claim 5 wherein the camera value related compensation means comprises a central processing unit.
  • 7. A device as defined in claim 5 wherein the at least one camera value related compensation process is implemented through a lookup table.
  • 8. A device as defined in claim 5 wherein the at least one camera value related compensation process is developed from captured images of a gray scale.
  • 9. A device as defined in claim 1 wherein the capturing means comprises a camera assembly having four channels.
  • 10. A device as defined in claim 9 wherein the four channels comprise red, green, blue, and infrared channels.
  • 11. A device as defined in claim 10 wherein the camera assembly comprises a color camera and a monochrome camera, the color camera providing the red, green and blue channels and the monochrome camera providing the infrared channel, the color camera having a lens and the monochrome camera having a lens.
  • 12. A device as defined in claim 10 wherein the camera assembly comprises an integrated four channel camera having a single lens.
  • 13. A device as defined in claim 9 wherein each channel of the camera assembly comprises a Charge Coupled Device image sensor.
  • 14. A device as defined in claim 1 wherein the capturing means has an associated optical axis, the optical axis being substantially perpendicular to a surface of a viewing area.
  • 15. A device as defined in claim 14 wherein the first lamp is positioned to emit light at an approximately 45 degree angle to the optical axis.
  • 16. A device as defined in claim 14 wherein the second lamp is positioned to emit light at an approximately 45 degree angle to the optical axis.
  • 17. A device as defined in claim 1 wherein the calibration target comprises a blank sheet of paper.
  • 18. A device as defined in claim 1 wherein the calibration target includes a painted working surface having a glossiness and lightness which is substantially similar to glossiness and lightness of a blank sheet of paper.
  • 19. A device as defined in claim 1 wherein the calibration target has a substantially flat spectral reflectance curve at least in a wavelength range from approximately 400 nm to 1000 nm.
  • 20. A device as defined in claim 1 wherein the light output by the first lamp is adjusted via a mesh screen.
  • 21. A device as defined in claim 1 wherein the light output by the first lamp is adjusted via a neutral density filter.
  • 22. A device as defined in claim 1 wherein the light output by the first lamp is adjusted by changing an orientation or position of the first lamp.
  • 23. A device as defined in claim 1 wherein the output of the second lamp is adjusted via a mesh screen.
  • 24. A device as defined in claim 1 wherein the output of the second lamp is adjusted via a neutral density filter.
  • 25. A device as defined in claim 1 wherein the output of the second lamp is adjusted by changing an orientation or position of the second lamp.
  • 26. A device as defined in claim 1 further comprising a display for viewing the images obtained by the capturing means and a programmable display lookup table for making image intensity variation appear more prominent on the display.
  • 27. A device as defined in claim 1 wherein the first image is a green image.
  • 28. A device as defined in claim 1 wherein a third image and a fourth image captured by the capturing means in the visible region are checked for unevenness to detect a need for correcting spectral output of the first lamp, and wherein the first image is a green image, the third image is a red image, and the fourth image is a blue image.
  • 29. A device as defined in claim 1 wherein the first lamp comprises a set of lamps.
  • 30. A device as defined in claim 1 wherein the second lamp comprises a set of lamps.
  • 31. A device as defined in claim 1 wherein the first lamp generates light in the visible and the infrared regions of the spectrum.
  • 32. A device to provide a substantially uniform lighting condition as perceived by a color sensing device for a control system in a printing press, comprising:
  • a first lamp for generating light in only a visible region of a light spectrum;
  • a second lamp for generating light in at least an infrared region of the light spectrum;
  • a calibration target; and
  • means for capturing images in the visible and the infrared regions;
  • wherein the light output by the first lamp is adjustable to reduce unevenness in a first image captured by the capturing means in the visible region, and the light output by the second lamp is adjustable to reduce unevenness in a second image captured by the capturing means in the infrared region to thereby develop a substantially uniform lighting condition as perceived by a color sensing device.
  • 33. A device as defined in claim 32 further comprising position compensation means for applying a position related compensation process to images captured by the capturing means to produce a position-invariant viewing condition.
  • 34. A device as defined in claim 33 wherein the position compensation means generates a compensation image from at least one image captured by the capturing means from the calibration target, and the position compensation means applies the compensation image to subsequent images captured by the capturing means to provide the position-invariant viewing condition.
  • 35. A device as defined in claim 33 wherein the position compensation means comprises a central processing unit.
  • 36. A device as defined in claim 32 wherein the capturing means comprises a camera, and further comprising camera value related compensation means for applying at least one camera value related compensation process to images captured by the capturing means to produce a time-invariant viewing condition.
  • 37. A device as defined in claim 36 wherein the camera value related compensation means comprises a central processing unit.
  • 38. A device as defined in claim 36 wherein the at least one camera value related compensation process is implemented through a lookup table.
  • 39. A device as defined in claim 36 wherein the at least one camera value related compensation process is developed from captured images of a gray scale.
  • 40. A device as defined in claim 32 wherein the capturing means comprises a camera assembly having four channels.
  • 41. A device as defined in claim 40 wherein the four channels comprise red, green, blue, and infrared channels.
  • 42. A device as defined in claim 41 wherein the camera assembly comprises a color camera and a monochrome camera, the color camera providing the red, green and blue channels and the monochrome camera providing the infrared channel, the color camera having a lens and the monochrome camera having a lens.
  • 43. A device as defined in claim 41 wherein the camera assembly comprises an integrated four channel camera having a single lens.
  • 44. A device as defined in claim 40 wherein each channel of the camera assembly comprises a Charge Coupled Device image sensor.
  • 45. A device as defined in claim 32 wherein the capturing means has an associated optical axis, the optical axis being substantially perpendicular to a surface of a viewing area.
  • 46. A device as defined in claim 45 wherein the first lamp is positioned to emit light at an approximately 45 degree angle to the optical axis.
  • 47. A device as defined in claim 45 wherein the second lamp is positioned to emit light at an approximately 45 degree angle to the optical axis.
  • 48. A device as defined in claim 32 wherein the calibration target comprises a blank sheet of paper.
  • 49. A device as defined in claim 32 wherein the calibration target includes a painted working surface having a glossiness and lightness which is substantially similar to glossiness and lightness of a blank sheet of paper.
  • 50. A device as defined in claim 32 wherein the calibration target has a substantially flat spectral reflectance curve at least in a wavelength range from approximately 400 nm to 1000 nm.
  • 51. A device as defined in claim 32 wherein the light output by the second lamp is adjusted via a mesh screen.
  • 52. A device as defined in claim 32 wherein the light output by the second lamp is adjusted via a neutral density filter.
  • 53. A device as defined in claim 32 wherein the light output by the first lamp is adjusted by changing an orientation or position of the first lamp.
  • 54. A device as defined in claim 32 wherein the output of the second lamp is adjusted via a mesh screen.
  • 55. A device as defined in claim 32 wherein the output of the second lamp is adjusted via a neutral density filter.
  • 56. A device as defined in claim 32 wherein the output of the second lamp is adjusted by changing an orientation or position of the second lamp.
  • 57. A device as defined in claim 32 further comprising a display for viewing the images obtained by the capturing means and a programmable display lookup table for making image intensity variation appear more prominent on the display.
  • 58. A device as defined in claim 32 wherein the first image is a green image.
  • 59. A device as defined in claim 32 wherein a third image and a fourth image captured by the capturing means in the visible region are checked for unevenness to detect a need for correcting spectral output of the second lamp, and wherein the first image is a green image, the third image is a red image, and the fourth image is a blue image.
  • 60. A device as defined in claim 32 wherein the first lamp comprises a set of lamps.
  • 61. A device as defined in claim 32 wherein the second lamp comprises a set of lamps.
  • 62. A device as defined in claim 32 wherein the second lamp generates light in the visible and the infrared regions of the spectrum.
  • 63. A method of providing a substantially uniform lighting condition as perceived by a color sensing device for a control system in a printing press, comprising the steps of:
  • providing first and second lamps, the first lamp producing light in at least a visible region of a light spectrum and the second lamp producing light in only an infrared region of the light spectrum;
  • providing a camera for viewing images on at least two channels, at least one of the channels being in the infrared region and at least one of the channels being in the visible region;
  • providing a calibration target;
  • viewing a first image of the calibration target in a visible region of the light spectrum with the camera;
  • reducing unevenness in the first image by adjusting the first lamp;
  • viewing a second image of the calibration target in the infrared region of the light spectrum with the camera; and
  • reducing unevenness in the second image by adjusting the second lamp.
  • 64. A method as defined in claim 63 further comprising the step of viewing third and fourth images of the calibration target in the visible region of the camera for unevenness to check the spectral output of the first lamp, wherein the first image is a green image, the third image is a red image, and the fourth image is a blue image.
  • 65. A method as defined in claim 63 further comprising the steps of:
  • capturing multiple images of the calibration target on each channel of the camera;
  • developing an averaged image for each of the channels by averaging corresponding pixels in the multiple images captured on each channel;
  • identifying a highest pixel value in each of the averaged images;
  • developing an intermediate compensation image for each channel by dividing the highest pixel value captured for each channel by every pixel in the averaged image of the corresponding channel;
  • capturing a channel image to be processed on each channel of the camera; and
  • multiplying pixels in each of the channel images to be processed with corresponding pixels in the intermediate compensation image for the corresponding channel.
  • 66. A method as defined in claim 63 further comprising the steps of:
  • providing a gray scale calibration target having a plurality of steps with different darkness characteristics;
  • measuring light reflectance for the plurality of steps on each channel of the camera;
  • calculating an average light reflectance over the bandwidth of each camera channel for each step in the plurality;
  • determining desired camera values for the plurality of steps in the gray scale calibration target;
  • adjusting the camera such that a measured camera value obtained from a lightest step on the gray scale calibration target is substantially equal to the desired camera value for the lightest step on the gray scale calibration target; and
  • mapping the measured camera values to the desired camera values for the plurality of steps in the gray scale calibration target for each channel of the camera.
  • 67. A method of providing a substantially uniform lighting condition as perceived by a color sensing device for a control system in a printing press, comprising the steps of:
  • providing first and second lamps, the first lamp producing light in only a visible region of a light spectrum, the second lamp producing light in at least an infrared region of the light spectrum;
  • providing a camera for viewing images on at least two channels, at least one of the channels being in the infrared region and at least one of the channels being in the visible region;
  • providing a calibration target;
  • viewing a first image of the calibration target in a visible region of the light spectrum with the camera;
  • reducing unevenness in the first image by adjusting the first lamp;
  • viewing a second image of the calibration target in the infrared region of the light spectrum with the camera; and
  • reducing unevenness in the second image by adjusting the second lamp.
  • 68. A method as defined in claim 67 further comprising the step of viewing third and fourth images of the calibration target in the visible region of the camera for unevenness to check the spectral output of the second lamp, wherein the first image is a green image, the third image is a red image, and the fourth image is a blue image.
  • 69. A method as defined in claim 67 further comprising the steps of:
  • capturing multiple images of the calibration target on each channel of the camera;
  • developing an averaged image for each of the channels by averaging corresponding pixels in the multiple images captured on each channel;
  • identifying a highest pixel value in each of the averaged images;
  • developing an intermediate compensation image for each channel by dividing the highest pixel value captured on each channel by every pixel in the averaged image of the corresponding channel;
  • capturing a channel image to be processed on each channel of the camera; and
  • multiplying pixels in each of the channel images to be processed with corresponding pixels in the intermediate compensation image for the corresponding channel.
  • 70. A method as defined in claim 67 further comprising the steps of:
  • providing a gray scale calibration target having a plurality of steps with different darkness characteristics;
  • measuring light reflectance for the plurality of steps on each channel of the camera;
  • calculating an average light reflectance over the bandwidth of each camera channel for each step in the plurality;
  • determining desired camera values for the plurality of steps in the gray scale calibration target;
  • adjusting the camera such that a measured camera value obtained from a lightest step on the gray scale calibration target is substantially equal to the desired camera value for the lightest step on the gray scale calibration target; and
  • mapping the measured camera values to the desired camera values for the plurality of steps in the gray scale calibration target for each channel of the camera.
US Referenced Citations (162)
Number Name Date Kind
2968988 Fothergill Jan 1961
3376426 Frommer et al. Apr 1968
3612753 Korman Oct 1971
3778541 Bowker Dec 1973
3806633 Coleman Apr 1974
3958509 Murray et al. May 1976
4249217 Korte et al. Feb 1981
4308553 Roetling Dec 1981
4393399 Gast et al. Jul 1983
4408231 Bushaw et al. Oct 1983
4441206 Kuniyoshi et al. Apr 1984
4468692 Yamada et al. Aug 1984
4472736 Ushio et al. Sep 1984
4476487 Klie et al. Oct 1984
4481532 Clark et al. Nov 1984
4482917 Gaulke et al. Nov 1984
4486772 Klie et al. Dec 1984
4494875 Schramm et al. Jan 1985
4505589 Ott et al. Mar 1985
4520504 Walker et al. May 1985
4539647 Kaneko et al. Sep 1985
4561103 Horiguchi et al. Dec 1985
4564859 Knop et al. Jan 1986
4583186 Davis et al. Apr 1986
4590515 Wellendorf May 1986
4631578 Sasaki et al. Dec 1986
4631579 Hoffrichter et al. Dec 1986
4636081 Saitoh et al. Jan 1987
4643563 Sayanagi Feb 1987
4649500 Yamada et al. Mar 1987
4649502 Keller et al. Mar 1987
4649566 Tsunoda et al. Mar 1987
4666307 Matsumoto et al. May 1987
4667227 Ikeda May 1987
4678336 Tsunoda Jul 1987
4681455 Jeschke et al. Jul 1987
4685139 Masuda et al. Aug 1987
4713684 Kawamura et al. Dec 1987
4716456 Hosaka Dec 1987
4731661 Nagano Mar 1988
4752822 Kawamura Jun 1988
4758885 Sasaki et al. Jul 1988
4790022 Dennis Dec 1988
4794382 Lai et al. Dec 1988
4794648 Ayata et al. Dec 1988
4802107 Yamamoto et al. Jan 1989
4809061 Suzuki Feb 1989
4830501 Terashita May 1989
4837711 Suzuki Jun 1989
4839719 Hirota et al. Jun 1989
4839721 Abdulwahab et al. Jun 1989
4855765 Suzuki et al. Aug 1989
4879594 Stansfield et al. Nov 1989
4884130 Huntsman Nov 1989
4891690 Hasegawa et al. Jan 1990
4899214 Robbins et al. Feb 1990
4907076 Ohsawa Mar 1990
4908712 Uchiyama et al. Mar 1990
4910593 Weil Mar 1990
4922337 Hunt et al. May 1990
4926254 Nakatsuka et al. May 1990
4941038 Walowit Jul 1990
4947348 Van Arsdel Aug 1990
4949172 Hunt et al. Aug 1990
4949284 Watanabe Aug 1990
4956703 Uzuda et al. Sep 1990
4958221 Tsuboi et al. Sep 1990
4959790 Morgan Sep 1990
4962421 Murai Oct 1990
4967264 Parulski et al. Oct 1990
4967379 Ott Oct 1990
4970584 Sato et al. Nov 1990
4975769 Aizu et al. Dec 1990
4975862 Keller et al. Dec 1990
4977448 Murata et al. Dec 1990
5003494 Ng Mar 1991
5018008 Asada May 1991
5029107 Lee Jul 1991
5045937 Myrick Sep 1991
5047842 Bouman, Jr. et al. Sep 1991
5053866 Johnson Oct 1991
5068810 Ott Nov 1991
5081527 Naito Jan 1992
5084758 Danzuka et al. Jan 1992
5087126 Pochieh Feb 1992
5089977 Pflasterer et al. Feb 1992
5101448 Kawachiya et al. Mar 1992
5105466 Tsujiuchi et al. Apr 1992
5107332 Chan Apr 1992
5120624 Takanashi et al. Jun 1992
5121196 Hung Jun 1992
5122977 Pfeiffer Jun 1992
5125037 Lehtonen et al. Jun 1992
5126839 Sugiura Jun 1992
5128748 Murakami et al. Jul 1992
5130935 Takiguchi Jul 1992
5142356 Usami et al. Aug 1992
5148288 Hannah Sep 1992
5157483 Shoji et al. Oct 1992
5157506 Hannah Oct 1992
5162899 Naka et al. Nov 1992
5163012 Wuhrl et al. Nov 1992
5166755 Gat Nov 1992
5166789 Myrick Nov 1992
5170441 Mimura et al. Dec 1992
5172224 Collette et al. Dec 1992
5175772 Kahn et al. Dec 1992
5181081 Suhan Jan 1993
5181257 Steiner et al. Jan 1993
5182571 Creagh et al. Jan 1993
5182721 Kipphan et al. Jan 1993
5191361 Abe Mar 1993
5200817 Birnbaum Apr 1993
5206707 Ott Apr 1993
5216498 Matsunawa Jun 1993
5216504 Webb et al. Jun 1993
5224421 Doherty Jul 1993
5272518 Vincent Dec 1993
5282064 Yamada Jan 1994
5282671 Stewart et al. Feb 1994
5295003 Lee Mar 1994
5299034 Kanno et al. Mar 1994
5302833 Hamar et al. Apr 1994
5303028 Milch Apr 1994
5317425 Spence et al. May 1994
5325217 Nagler et al. Jun 1994
5329383 Collette Jul 1994
5345320 Hirota Sep 1994
5357448 Stanford Oct 1994
5359677 Katsurada et al. Oct 1994
5363318 McCauley Nov 1994
5384621 Hatch et al. Jan 1995
5386299 Wilson et al. Jan 1995
5392360 Weindelmayer et al. Feb 1995
5404156 Yamada et al. Apr 1995
5404158 Carlotta et al. Apr 1995
5412577 Sainio et al. May 1995
5416613 Rolleston et al. May 1995
5420945 Concannon et al. May 1995
5424553 Morton Jun 1995
5452112 Wan et al. Sep 1995
5459678 Feasey Oct 1995
5463469 Funada et al. Oct 1995
5467412 Capitant et al. Nov 1995
5479189 Chesauage et al. Dec 1995
5481380 Bestmann Jan 1996
5483359 Yumida et al. Jan 1996
5483360 Rolleston et al. Jan 1996
5488492 Abe Jan 1996
5491568 Wan Feb 1996
5493518 Keating Feb 1996
5508810 Sato Apr 1996
5509086 Edgar et al. Apr 1996
5509115 Butterfield et al. Apr 1996
5521722 Colvill et al. May 1996
5528377 Hutcheson Jun 1996
5530239 Konishi et al. Jun 1996
5530656 Six Jun 1996
5543940 Sherman Aug 1996
5574664 Feasey Nov 1996
5604586 Bahr et al. Feb 1997
5673336 Edgar et al. Sep 1997
Foreign Referenced Citations (10)
Number Date Country
0 142 470 B1 Oct 1984 EPX
408 507 Jul 1990 EPX
0 601 259 A1 Dec 1992 EPX
35 33 549 Oct 1986 DEX
40 23 320 Jan 1992 DEX
43 21 177 Jan 1995 DEX
60-115820 Nov 1983 JPX
2-110566 Apr 1990 JPX
649 842 Jun 1985 CHX
2 282 565 Apr 1995 GBX
Non-Patent Literature Citations (4)
Entry
Graphic Microsystems, Inc., Advertisement for Autosmart™ Software.
Graphic Microsystems, Inc., Autosmart II Version 10.0 User's Manual, pp. 1-2.
Heidelberg, Technical Series . . . 2, Stop Guessing About Color.
European search report issued in European patent application No. 96109381.2, dated Apr. 29, 1997.