The present invention relates to digitally enhancing a live image of an object using chrominance and/or luminance values received from a CMOS- or CCD-based video camera; and more specifically to digitally enhancing live images viewed through any optical or scanning inspection device such as, but not limited to, microscopes (dark- or bright-field), macroscopes, PCB inspection and re-work stations, medical grossing stations, telescopes, electron microscopes, and Atomic Force (AFM) or Scanning Probe (SPM) microscopes, and to methods of staining or highlighting live video images for use in digital microscopy and spectroscopy.
Micro- and macroscopic inspection has long been plagued by poor contrast or a lack of definition in the object being viewed. While lenses and lighting techniques have improved greatly over the past 50 years and have helped with the clarity and contrast of the subject matter, many doctors and scientists have also relied on "staining" the subject matter with fluorescent dyes and other chemistries that respond to specific light wavelengths. This technique has improved inspection in some microscopy fields, but only with still photography. It is also irreversible.
In fact, present digital microscopy and spectroscopy image enhancement and staining are limited to applying a chemical stain to a given slide and then taking separate pictures under several different light sources. After each picture is taken, each must be composited over the others so that all can be realized within the final photograph. The process can take several hours, only for the user to find that the wrong color of light or stain was used during the build.
Further, in conventional RGB to YUV conversion systems, the red, green and blue data in the original pixel data is interpolated in order to project color values for pixels in the sensor array that are not sensitive to that color. From the interpolated red, green and blue data, luma and chroma values are generated. However, these methods do not take into account the different filtering and resolution requirements for luma and chroma data, and they do not optimize the filtering or interpolation process based on those data. Thus, real-time viewing and broadcasting of stained objects or specimens is not possible with any technology presently available or known in the art.
Therefore, a need exists for a staining method that allows all or discrete parts of an object or specimen under inspection to be stained without permanently altering it. Further, a need exists for the staining to permit real-time viewing and broadcasting of the stained object rather than stained snap-shots or a recorded version.
The present invention includes a method of digitally staining an object comprising: viewing a live digital image of an object, wherein the object includes a first element and a second element, and wherein the live digital image is comprised of a plurality of pixels; and modifying the values of a plurality of pixels in the image, wherein the values are selected from a group consisting of chrominance values and luminance values, and wherein the modification results in a digitally stained image in which the first element is stained a first color and the second element is stained a second color.
The method further includes modification of the chrominance values of the pixels using parametric controls, wherein the chrominance value of a first pixel that falls into a first calculated chrominance range is modified to reflect the chrominance mean of a first 9Bloc. The method further includes modifying the chrominance value of a second pixel that falls into a second calculated chrominance range to reflect the chrominance mean of a second 9Bloc.
The method further comprises determining an edge between the first element and the second element by comparing the high and low chrominance values of the 16 pixels surrounding the 9Bloc with the mean of the 9Bloc, wherein when the chrominance value of one of the surrounding pixels falls above or below a pre-calculated high or low threshold, an edge is demarcated. The method further comprises digitally staining a microscopic slide and digitally inverting the image to simulate a dark-field environment.
The present invention also includes a chrominance enhancing method or technique comprising digitally changing the chrominance and/or luminance value(s) of either pre- or post-processed individual pixel information of a CCD or CMOS imaging sensor through software and/or firmware calls. The method applies to real-time video that is either monochromatic or polychromatic. The present invention also includes a method of enhancing a live video image with respect to the image's individual R, G and B pixel values, thereby obtaining a modified outline of a subject displayed on a computer monitor.
The present invention also includes a method of transcoding RGB values into YUV color space for the purpose of controlling the luminance and chrominance values independently, by selecting the high and low chroma values based on a single selected pixel. This method also includes using the luminance, chrominance and alpha information of the image's YUV color space, increasing or decreasing their values with parametric-type controls to simulate a chemical stain. Finally, the present invention includes a method of applying a minimum of six digital stains to a live digital video source for purposes of spectroscopic observation and study.
The present disclosure describes a chrominance or luminance enhancing method or technique comprising digitally changing the chrominance and/or luminance values of either pre- or post-processed "live" individual pixel information of a CCD or CMOS imaging sensor through software or firmware. This can also be described as a method of enhancing a live video image with respect to the image's individual R, G, and B (Red, Green, Blue) pixel values, thereby obtaining a modified outline of the subject displayed on a computer monitor or other image viewing devices known in the art.
Digital staining device 10 is capable of live, stained inspection in applications including semiconductors, printed circuit boards, electronics, tab and wire bonding, hybrid circuits, metal works, quality control and textiles. Digital staining device 10 can also be any optical or scanning inspection device such as, but not limited to, microscopes (dark- or bright-field), macroscopes, printed circuit board inspection and re-work stations, medical grossing stations, telescopes, and electron, Atomic Force (AFM) or Scanning Probe (SPM) microscopes, with methods of staining or highlighting live video images for use in digital microscopy, histogroscopy and spectroscopy.
According to this invention, a chemical, fluorescent or other stain can be simulated in YUV color space by increasing or decreasing the luminance, chrominance and alpha values based on pre-calculated parametric controls. This invention can further be used to digitally stain a microscope slide and then digitally invert the image to highlight a region of interest, or to turn deselected pixels completely black in order to simulate a dark-field environment.
Digital staining device 10 is also capable of producing "live" or real-time staining of moving objects such as small organisms, single-celled organisms, cell tissue and other biological specimens. Specifically, the present invention discloses a method of digitally staining an object comprising: viewing a live digital image of an object, wherein the object includes a first element, a second element, and possibly more, and wherein the live digital image is comprised of a plurality of pixels; and modifying the values of a plurality of pixels in the image, wherein the values are selected from a group consisting of chrominance values and luminance values, and wherein the modification results in a digitally stained image, wherein the first element is stained a first color, the second element is stained a second color, a third element is stained a third color, and so on.
The present invention is also useful in detecting embedded digital signatures within a photograph, enhancing a fingerprint in a forensics laboratory, or highlighting a particular person or figure during security monitoring. According to the present invention, the method described above will hereinafter be referred to as Chroma-Photon Staining or CPS. It should be noted that the following explanation uses 8-bit values for the RGB and YUV color components, by way of example only. However, the CPS technique is not limited to 8-bit values.
The imaging sensors, such as camera 14, are usually arranged in Red, Green, Blue (RGB) format, and therefore data is obtained from these video sensors in RGB format. However, RGB format alone is inadequate for carrying out the method according to the present disclosure, because it does not permit separating the chrominance and luminance properties. Therefore, the present invention ultimately utilizes the YUV color space format, which allows the chrominance and luminance properties of the RGB data to be separated. Thus, according to the invention, the RGB values are transcoded into YUV color space using an algorithm for the purpose of controlling the chrominance and luminance values independently. This is accomplished by selecting the high and low chroma values based on the 9Bloc (defined below) of a single selected pixel.
In one embodiment, the method further demarcates an edge between the first element and the second element by comparing the high and low chrominance values of the 16 pixels surrounding the 9Bloc (in other words, the outer edge of a pixel block that is 25 pixels, five high and five wide, hereinafter denoted the 25Bloc) with the mean of the 9Bloc (the new value of the reference pixel). When the chrominance value of one of the surrounding pixels rises above or falls below a pre-calculated high or low threshold relative to the mean of the 9Bloc, an edge is demarcated.
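The edge test described above can be expressed as a short Python sketch. The function name, the 2-D list indexing and the threshold parameters are illustrative assumptions, not part of the disclosure; the sketch operates on a single chrominance channel and assumes the reference pixel is at least two pixels from the image border.

```python
def demarcate_edge(chroma, row, col, low_thresh, high_thresh):
    """Illustrative sketch of the 9Bloc/25Bloc edge test: compare each
    of the 16 ring pixels of the 5x5 25Bloc against the mean of the
    central 3x3 9Bloc; an edge is demarcated when any ring pixel
    deviates past a pre-calculated threshold."""
    # Mean of the 3x3 9Bloc centered on the reference pixel.
    bloc9 = [chroma[r][c] for r in range(row - 1, row + 2)
                          for c in range(col - 1, col + 2)]
    mean9 = sum(bloc9) / 9.0
    # The 16 ring pixels: the 5x5 25Bloc minus its 3x3 interior.
    ring = [chroma[r][c] for r in range(row - 2, row + 3)
                         for c in range(col - 2, col + 3)
            if abs(r - row) == 2 or abs(c - col) == 2]
    return any(p > mean9 + high_thresh or p < mean9 - low_thresh
               for p in ring)
```

A flat region produces no edge; a ring pixel that deviates past the threshold relative to the 9Bloc mean does.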
CPS allows the spectroscopic stain maker to work in real-time with the live image which may or may not be chemically stained. Controlling the lighting environment is important for the CPS technique to have favorable results. Keeping a consistent “flood” of light and light temperature assists in obtaining consistent staining.
To better control the color conversion of the data from a camera sensor, the present method converts, or "transcodes," the Red, Green and Blue (RGB) data into YUV 4:4:4 color space.
Instead of each pixel having three color values, RGB, the color information is transcoded to CbCr color, which supplies the U and V values. According to the present disclosure:
U=Cblue [1]
V=Cred [2]
The YUV conversion is accomplished according to the following equations:
Y=0.257R+0.504G+0.098B+16 [3]
U=−0.148R−0.291G+0.439B+128 [4]
V=0.439R−0.368G−0.071B+128 [5]
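Equations [3] to [5] can be expressed directly in code. The following Python sketch (the function names are illustrative; the coefficients are the disclosure's own) transcodes one 8-bit RGB pixel and saturates the results to the 8-bit range:

```python
def _clamp8(x):
    """Saturate a value to the 8-bit range 0..255."""
    return max(0, min(255, int(round(x))))

def rgb_to_yuv(r, g, b):
    """Transcode one 8-bit RGB pixel to YUV per equations [3]-[5]."""
    y = 0.257 * r + 0.504 * g + 0.098 * b + 16
    u = -0.148 * r - 0.291 * g + 0.439 * b + 128
    v = 0.439 * r - 0.368 * g - 0.071 * b + 128
    return _clamp8(y), _clamp8(u), _clamp8(v)
```

For example, black (0, 0, 0) maps to Y = 16 with neutral chroma U = V = 128, and white (255, 255, 255) maps to Y = 235 with the same neutral chroma, consistent with studio-range YUV.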
According to the present disclosure, Y is the luma value. In one embodiment of the present disclosure, the user controls this feature independently from the color values, so the complete representation of a pixel pairs the luma with the two chroma components:

YUV=(Y, Cb, Cr) [6]
Green color is calculated by subtracting Cr from Cb, and the equation is:
Cg=Cb−Cr [7]
All notations are hex values of FF(h) or less for 8-bit camera sensors, and less than 400(h) for 10-bit camera sensors. The CPS technique does not involve any sub-sampling; thus, there is no color loss during the transcoding. Further, there is no compression.
Another issue with camera sensors and the CPS technique is that its accuracy is subject to the data received. High-grade CCDs have a much higher dynamic range and signal-to-noise ratio (SNR) than consumer-grade CCDs or CMOS sensors. Sensors with 8-bit outputs will have far less contrast and dynamic range than 10- or 12-bit sensors. Other sensor issues, such as temporal noise, fixed-pattern noise, dark current and low-pass filtering, also come into play with the pre-processed sensor data.
With this in mind, a low-grade camera is less preferred than a high-grade one for carrying out the CPS technique. However, the present disclosure envisions taking the particular conditions of the camera into consideration when using the CPS method, and its implementation envisions using a high-grade CCD and a 10- or 12-bit sensor for optimal results. Dynamic Range (DR) quantifies the ability of a sensor to adequately image both highlights and dark shadows in a scene; it is defined as the ratio of the largest non-saturating input signal to the smallest detectable input signal. DR is a major factor in contrast and depth of field.
Modification or filtering of the 9Bloc of pixels is accomplished by averaging the four Green and four Blue pixel values with the one Red value, arriving at an averaged value, here denoted "A." Therefore:
A=mean 9Bloc=mean(4G and 4B and 1R) [8]
Thus, A is also the new value of the reference pixel. The mean of the 16 ring pixels of the 25Bloc is then computed as:
B=mean of the outside 16 pixels of the 25Bloc=mean(8G and 8R) [9]
The modification of the 25Bloc is then accomplished by the following equation:
C=mean(A and B) [10]
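Equations [8] to [10] reduce to simple averaging. A minimal Python sketch (the function name and the flat-list inputs are illustrative assumptions) takes the nine 9Bloc values and the 16 ring values and returns A, B and the filtered value C:

```python
def filter_25bloc(values_9bloc, values_ring16):
    """Sketch of equations [8]-[10]: A is the mean of the nine 9Bloc
    pixel values (4 green, 4 blue, 1 red reference), B the mean of
    the 16 ring pixels of the 25Bloc (8 green, 8 red), and the
    filtered value C the mean of A and B."""
    a = sum(values_9bloc) / len(values_9bloc)      # [8], new reference value
    b = sum(values_ring16) / len(values_ring16)   # [9]
    c = (a + b) / 2.0                             # [10]
    return a, b, c
```

For instance, a uniform 9Bloc of 10 surrounded by a uniform ring of 30 yields A = 10, B = 30, C = 20.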
The reference pixel contains three 8-bit values, ranging from 0 to 255, for the red, green and blue components. These RGB values are then transformed into YUV color space using the equations:
Y=0.257R+0.504G+0.098B+16 [11]
U=−0.148R−0.291G+0.439B+128 [12]
V=0.439R−0.368G−0.071B+128 [13]
The final 8-bit YUV component values represent the key pixel that is then used as the mean for the current bandwidth ranges. The bandwidth is an 8-bit value that represents the deviation above and below a component key pixel value, determining the bandwidth range for that color component. There are two bandwidth values used by the CPS technique: the first is applied to the luminance component (Y) of the key pixel, while the second is applied to both chrominance components (U and V). These values are saturated to the 0 and 255 levels to avoid overflow and underflow wrap-around problems. Thus, for a key-pixel component value K and bandwidth b, the acceptance range for that component runs from max(K - b, 0) to min(K + b, 255).
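The saturated range computation can be sketched as follows; the function name is illustrative, and the same helper would be applied once to Y with the luminance bandwidth and once to each of U and V with the chrominance bandwidth:

```python
def bandwidth_range(key_value, bandwidth):
    """Derive the low/high acceptance range for one YUV component of
    the key pixel, saturated to 0..255 so the 8-bit values cannot
    wrap around (no overflow or underflow)."""
    low = max(0, key_value - bandwidth)
    high = min(255, key_value + bandwidth)
    return low, high
```

A key value near the top of the range clips at 255, and one near the bottom clips at 0, rather than wrapping.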
Next, the CPS technique is applied in step 114. In step 114, each YUV component of each pixel in the copied video frame is checked against the high and low bandwidth ranges calculated above. In step 114, if all YUV components of a pixel fall within the bandwidth ranges, then the corresponding pixel in the original RGB frame is stained. The stain color is an RGB value that is alpha blended with the RGB value of the pixel being stained.
The alpha blend value ranges from 0.0 to 1.0. The alpha blending formula is the standard one used by most production switchers and video mixers known in the art. Thus, alpha blending is accomplished according to the following: for each of the R, G and B components, the stained output value equals alpha times the stain color plus (1 - alpha) times the original pixel value.
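The in-range check of step 114 and the alpha blend can be combined in one Python sketch. The function and parameter names are illustrative assumptions; `ranges` holds the three per-component (low, high) bandwidth ranges described above:

```python
def apply_stain(pixel_rgb, pixel_yuv, ranges, stain_rgb, alpha):
    """Sketch of the staining step: a pixel is stained only when all
    three of its YUV components fall inside the precomputed bandwidth
    ranges; the RGB stain color is then alpha-blended with the
    pixel's original RGB value."""
    in_band = all(low <= comp <= high
                  for comp, (low, high) in zip(pixel_yuv, ranges))
    if not in_band:
        return pixel_rgb  # outside the bands: pixel passes unchanged
    # Standard alpha blend: out = alpha*stain + (1 - alpha)*original.
    return tuple(int(round(alpha * s + (1.0 - alpha) * p))
                 for s, p in zip(stain_rgb, pixel_rgb))
```

A pixel whose YUV components fall outside any band is returned untouched; one inside all three bands is blended toward the stain color in proportion to alpha.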
In step 116, the stained RGB pixels enter the RGB frame buffer, and in step 118, the stained RGB image is produced.
Finally, multiple stains, each with their own key pixels, bandwidths and stain colors, may be applied to the same video frame in order to demarcate elements of the target object.
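Applying multiple stains to one frame can be sketched as a loop over stain definitions. The names and the frame layout (nested lists of RGB and YUV tuples) are illustrative assumptions, not the disclosure's implementation:

```python
def stain_frame(rgb_frame, yuv_frame, stains):
    """Sketch: apply several stains to one frame. Each stain is a
    (ranges, stain_rgb, alpha) triple holding per-component YUV
    bandwidth ranges, an RGB stain color, and an alpha blend value.
    Pixels matching a stain's bands are alpha-blended toward its color."""
    out = []
    for row_rgb, row_yuv in zip(rgb_frame, yuv_frame):
        out_row = []
        for rgb, yuv in zip(row_rgb, row_yuv):
            px = rgb
            for ranges, stain_rgb, alpha in stains:
                if all(lo <= c <= hi for c, (lo, hi) in zip(yuv, ranges)):
                    # Standard alpha blend against the (possibly already
                    # stained) pixel value.
                    px = tuple(int(round(alpha * s + (1 - alpha) * p))
                               for s, p in zip(stain_rgb, px))
            out_row.append(px)
        out.append(out_row)
    return out
```

Each stain carries its own key-pixel-derived bands, color and alpha, so different elements of the target object can be demarcated in different colors in a single pass.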
Therefore, the present disclosure and invention provide an advantageous staining method that allows all or discrete parts of an object or specimen under inspection to be stained without permanently altering it. Further, the present disclosure and invention permit real-time viewing and broadcasting of the stained object, which is not possible with any technology presently available or known in the art. The ability of the present method to allow real-time viewing and broadcasting, versus the snapshots or video recordings that are the only options currently available, provides a superior ability to manipulate the staining of the object or specimen, including the ability for two viewers in geographically remote areas to manipulate the staining and viewing of the object or specimen in real-time. This novel method thus provides users an enhanced ability to exchange ideas and to communicate more efficiently and effectively about the object or specimen that is the subject of the chroma-photon staining.
Various embodiments of the invention are described above in the Detailed Description. While these descriptions directly describe the above embodiments, it is understood that those skilled in the art may conceive modifications and/or variations to the specific embodiments shown and described herein. Any such modifications or variations that fall within the purview of this description are intended to be included therein as well. Unless specifically noted, it is the intention of the inventors that the words and phrases in the specification and claims be given the ordinary and accustomed meanings to those of ordinary skill in the applicable art(s).
The foregoing description of a preferred embodiment and best mode of the invention known to the applicant at the time of filing the application has been presented for the purposes of illustration and description. It is not intended to be exhaustive nor to limit the invention to the precise form disclosed, and many modifications and variations are possible in light of the above teachings. The embodiment was chosen and described in order to best explain the principles of the invention and its practical application, and to enable others skilled in the art to best utilize the invention in various embodiments and with various modifications as are suited to the particular use contemplated. Therefore, it is intended that the invention not be limited to the particular embodiments disclosed for carrying out the invention.
This application claims the benefit of priority of U.S. Provisional Patent Application Ser. No. 60/698,657, filed Jul. 12, 2005.