1. Field of Invention
The present invention relates to machine vision and, more specifically, to a system and method for improving image processing using a reflective background.
2. Description of Prior Art
Machine vision plays an important role in automated and robotic systems, such as assembly line manufacturing, quality control inspection, and sample processing. Conventional systems generally comprise an optical imager, such as a charge-coupled device (CCD) or similar device using digital imaging technology, that is positioned to capture images of objects that pass in front of it. In low-light or enclosed applications, machine vision systems may include an illumination source, such as a bank of light emitting diodes (LEDs), positioned proximately to the imager. The images are subsequently processed to decode information contained in the resulting two-dimensional image, such as 1D linear codes, 2D stacked/matrix codes, OCR fonts, and postal codes. The image captured by the machine vision system may also be subjected to more advanced processing, such as shape recognition or detection algorithms, that provide information about the object of interest in the image. However, the characteristics of digital images taken by machine vision systems, such as the contrast of the image, often limit the processing techniques that may be employed and adversely affect the accuracy of the results obtained from the processing of the image contents.
Some attempts to improve the quality of images obtained by machine vision systems involve the use of sophisticated lighting systems to improve the digital image captured by the system. For example, the illumination source may comprise multiple banks or arrays of LEDs that completely encircle the targeted object. While such a system is useful for solitary, immobile objects, it is not as effective for illuminating objects in motion and requires a plethora of expensive components.
Other attempts to more completely capture the digital image of an object in a machine vision system include the addition of a second light source or second imager for illuminating the object. For example, a second light source positioned adjacent to the imager and the associated primary light source, or a second light source positioned on the opposite side of the object to be imaged will increase the amount of light reflected by the object, thereby improving the quality of a digital image taken of the object. Imaging systems using sophisticated illumination banks or arrays require additional components that increase the cost of the machine vision system, need additional circuitry for controlling the dual illumination sources or imagers, and require a large footprint in the assembly line or process where they are used.
It is a principal object and advantage of the present invention to provide a system and method for improving the contrast of an image captured by a machine vision system.
It is an additional object and advantage of the present invention to provide a system and method for reducing the costs associated with machine vision systems.
It is a further object and advantage of the present invention to provide a system and method for reducing the elements required by machine vision systems.
Other objects and advantages of the present invention will in part be obvious, and in part appear hereinafter.
In accordance with the foregoing objects and advantages, the present invention comprises a system for obtaining an image of an object comprising an optical imager that includes an illumination source positioned on one side of the object to be imaged and a reflective background positioned on the other side of the object. In an alternative embodiment, the present invention comprises at least one imager positioned to capture images of objects moving along an assembly line or process and a reflective background behind the row of samples. The system of the present invention may be implemented where there are significant space restrictions, thereby providing advanced imaging capabilities that were previously unavailable, and may be used to replace multiple elements, thereby reducing cost. The imager is programmed to perform decoding of information contained within the image, such as any barcodes or recognizable symbology, as well as more advanced image processing, such as pattern matching and shape detection.
The present invention will be more fully understood and appreciated by reading the following Detailed Description in conjunction with the accompanying drawings, in which:
Referring now to the drawings, wherein like numerals refer to like parts throughout, there is seen in
Imager 12 preferably comprises a complementary metal oxide semiconductor (CMOS) image sensor and is capable of reading and interpreting two-dimensional images, such as 1D linear codes, 2D stacked/matrix codes, OCR fonts, RSS (Reduced Space Symbology) codes, and postal codes, and provides image capture for use in a wide range of applications, such as image and shape recognition, signature capture, image capture, and optical character recognition (OCR).
As seen in
Imager 12 may comprise an IT4×10/80 SR/SF or IT5×10/80 series imager available from Hand Held Products, Inc. of Skaneateles Falls, N.Y. that is capable of scanning and decoding most standard barcodes including linear, stacked linear, matrix, OCR, and postal codes. The IT5×10/80 series imager is a CMOS-based decoded output engine that can read 2D codes and has image capture capabilities. Imager 12 obtains an optical image of the field of view and, using preprogrammed algorithms, deciphers the context of the image to determine the presence of any decodable barcodes, linear codes, matrix codes, and the like. As will be explained hereinafter, imager 12 may further be programmed to perform other image processing algorithms, such as shape recognition, culling, match filtering, statistical analysis, and other high-level processing techniques, in addition to barcode detection.
Reflective background 16 comprises a thin film or sheet having reflective properties that is aligned to reflect all or a portion of light emitted from illumination source 18 back to imager 12. Reflective background 16 preferably includes retroreflective characteristics. Positioning of reflective material 16 saturates the background, thus improving the contrast of the image taken by imager 12 and allowing for the use of advanced processing techniques without the need for additional illumination sources or sophisticated illumination control circuitry. Preferably, reflective background 16 comprises seven mil retro-reflective sheeting. Such sheeting generally comprises a layer of glossy mylar bonded to a liner by an adhesive, such as a layer of permanent acrylic. The layer of mylar and the layer of adhesive are preferably one mil thick each, and the liner may comprise 90# polyethylene coated paper, resulting in a reflective sheeting of approximately seven mils in thickness. An acceptable reflective sheeting is the Series 680 Reflective Sheeting available from 3M of St. Paul, Minn.
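The contrast benefit described above can be sketched in code (this sketch is illustrative only and not part of the patented system; the threshold value and pixel data are invented for the example). Because the retroreflective background saturates near the sensor maximum, separating an object from the background reduces to a single fixed intensity threshold rather than an adaptive segmentation step:

```python
# Illustrative sketch: with a retroreflective background, background pixels
# saturate near the sensor maximum, so the object silhouette can be recovered
# with one fixed threshold. Threshold and pixel values are hypothetical.

SATURATION_THRESHOLD = 240  # pixels at or above this read as background


def silhouette(image):
    """Return a binary mask: True where an object blocks the background."""
    return [[pixel < SATURATION_THRESHOLD for pixel in row] for row in image]


# A toy 4x6 grayscale frame: a dark object against a saturated field.
frame = [
    [250, 251, 249, 252, 250, 251],
    [250,  40,  35, 251, 250, 249],
    [249,  38,  42, 250, 251, 250],
    [252, 250, 251, 249, 250, 252],
]

mask = silhouette(frame)
```

With an unlit or matte background the same frame would need per-scene threshold tuning; saturation is what makes a constant threshold safe.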
Referring to
Referring to
Microcontroller 30 is electrically connected to an imaging engine 36 for driving the optical imaging of a target object and receiving image data. Microcontroller 30 is also connected to an illumination engine 38 for controlling the timing of illumination source 18. Optionally, imaging engine 36 and illumination engine 38 may be provided in a single unit interconnected to microcontroller 30. Microcontroller 30 may comprise a MC9328MXL VH15 microprocessor, available from Freescale Semiconductor, Inc. of Chandler, Ariz., that is programmed prior to implementation in imager 12, or programmed at any time thereafter, such as by using interface 34 to upgrade the firmware used to control microcontroller 30.
Host device 32 controls imaging of objects 14 and reflective background 16 based on host commands received via host interface 34. Similarly, microcontroller 30 is capable of providing data to host device 32 via interface 34. As will be explained in more detail hereinafter, microcontroller 30 may be programmed to perform legacy barcode interpretation as well as advanced image processing, thereby reducing or eliminating the need for sophisticated or time-consuming communication of data to host device 32. For example, microcontroller 30 may be associated with a barcode interpretation submodule 40 for reading and interpreting two-dimensional images, such as 1D linear codes, 2D stacked/matrix codes, OCR fonts, and postal codes. Microcontroller 30 may also be associated with an image processing submodule 42 that is programmed to perform more advanced image analysis techniques, such as pattern matching. Although barcode interpretation submodule 40 and image processing submodule 42 are shown in
Referring to
For optimum performance, object 14 should be properly oriented and centered relative to the images taken by imager 12. Calibration, when performed, is accomplished prior to imaging by determining the pixel location of the image start region, the width of the image region, and the offset dimension of the target area relative to the image region (i.e., the location of the object). These values are stored in flash memory in imager 12 for subsequent use. The difference value, the pixel location of the image start location, and the width of the retro-reflective region provide information to the host for proper horizontal and vertical alignment of imager 12. In addition, there is a known pixel location of the retro-reflective start line that must be adjusted. Handler 44 queries imager 12 for the difference values, which will enable rack 46 to be moved to the correct horizontal position.
Calibration may be accomplished by defining a set of host commands for use by handler 44, such as commands that select the processing mode (such as the direct facing positioning seen in
Handler 44 may further benefit from the designation of control commands governing operation of imager 12. For example, predefined commands that trigger imager 12 and define what image analysis will be performed (i.e., barcode and/or image analysis) are useful. Commands controlling the amount of time system 10 will attempt to process images and the time period for timeouts may also be defined. Finally, commands governing image processing and selecting the data to be provided to handler 44 by imager 12 may be defined, such as the identification of rack 46, object 14, object type, the presence of a lid 22, the style of lid 22, and the type of compression (if any) to be used when transmitting information regarding the images (or the images themselves), to handler 44.
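A command set of this kind might be organized as sketched below. The command names, payloads, and parser are invented for this illustration; the patent defines only the categories of commands (mode selection, trigger, timeout, and data selection), not a concrete protocol:

```python
# Illustrative host-command sketch for a handler/imager protocol like the one
# described above. Command names and payload formats are hypothetical.

COMMANDS = {
    "MODE":    "select processing mode (barcode, image analysis, or both)",
    "TRIGGER": "capture an image and run the selected analysis",
    "TIMEOUT": "set the maximum processing time, in milliseconds",
    "REPORT":  "select data to return (rack id, object type, lid style, ...)",
}


def parse_command(line):
    """Split 'NAME arg1 arg2 ...' into (name, args); reject unknown names."""
    name, *args = line.strip().split()
    if name not in COMMANDS:
        raise ValueError(f"unknown command: {name}")
    return name, args


cmd, args = parse_command("TIMEOUT 500")
```

Keeping the command vocabulary small and fixed lets the handler drive the imager without the sophisticated host-side image transfer the patent seeks to avoid.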
The general process 50 used by system 10 when configured to perform multiple image processing steps in connection with a host device, such as handler 44, is seen in
As seen in
As seen in
As seen in
Using the foregoing calibration and processing techniques, as well as other known processing procedures, the high degree of contrast achieved by system 10 may be used to determine the shape of an object, to detect whether an object is properly configured (e.g., whether lid 22 is positioned on container 20), to determine whether a transparent object has a certain level of fluid (e.g., whether container 20 has a particular fluid level 26), to measure the clarity of fluid in a transparent object, to detect clumps or bubbles in a fluid, or to detect shapes on an object (such as a trademark or logo).
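One of these applications, fluid level detection, can be sketched briefly (an illustrative assumption, not the patented procedure: the threshold, image values, and function name are invented). Against a saturated background, fluid in a transparent container reads as a darker band, so its level can be estimated by scanning rows for the first one whose mean intensity falls below a threshold:

```python
# Hypothetical sketch of fluid level detection against a saturated background:
# scan image rows top-down for the first row whose mean intensity drops below
# a threshold. Threshold and pixel values are invented for illustration.

FLUID_THRESHOLD = 128


def fluid_level_row(image):
    """Return the index of the topmost row read as fluid, or None if none."""
    for i, row in enumerate(image):
        if sum(row) / len(row) < FLUID_THRESHOLD:
            return i
    return None


# Toy image strip: bright (saturated) rows above, dark fluid from row 2 down.
strip = [
    [250, 248, 251],
    [249, 250, 247],
    [60, 55, 58],
    [52, 57, 54],
]

level = fluid_level_row(strip)
```

Comparing the detected row against a calibrated reference row would then decide whether container 20 holds the expected fluid level 26.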
Published as US 20070286495 A1, Dec. 2007 (United States).