Optical reader for classifying an image

Information

  • Patent Grant
  • 7413127
  • Patent Number
    7,413,127
  • Date Filed
    Friday, November 3, 2006
    18 years ago
  • Date Issued
    Tuesday, August 19, 2008
    16 years ago
Abstract
An optical reader that is configured for classifying an image. The optical reader includes an image analysis component that is configured for classifying the image by evaluating a result of attempting to locate and process a portion of the image as graphical symbol, and configured for classifying the image as an image including a graphic symbol if and when attempting to locate and process a portion of the image as graphical symbol is successful, and configured for classifying the image as an image excluding a graphic symbol if and when attempting to locate and process a portion of the image as graphical symbol is unsuccessful.
Description
FIELD OF THE INVENTION

The present invention relates generally to optical readers, and particularly to optical readers employing color imagers.


BACKGROUND OF THE INVENTION

Optical indicia readers equipped to read one-dimensional or two-dimensional bar code symbols are well known in the art. There are a number of optical character recognition systems on the market as well. In addition, many financial institutions today employ computer-driven signature capture systems. Many of these systems employ monochrome imagers because monochrome imagers are well-suited to read graphical symbols, such as bar codes, OCR symbols, or signatures.


On the other hand, the ability to provide image capture functionality along with indicia reading in one device is very appealing. Currently, optical readers having image capture functionality use monochrome imagers that provide gray scale images. While such devices are useful, gray scale images are less desirable than color images for viewing purposes. The public has come to expect color imaging. Further, monochrome images are often less distinct and not as informative as color images.


Unfortunately, there are problems associated with using color imaging systems to read graphical symbols. The first problem relates to the difficulty of distinguishing bi-tonal indicia in a color image. Because color imagers provide more information that bi-tonal indicia readers can use, color imaging data is often confusing to graphical symbol indicia readers. One way to solve this problem is to convert the color imaging data into gray-scale data. However, commercially available methods for converting color images to gray-scale are too slow for high-volume scanning. Thus, an optical reader employing a color imager with a gray scale converter would be slower and more expensive than an optical reader using monochrome imager because of the additional processing required.


Thus, a need exists for an inexpensive optical reader that is capable of performing color photography and evaluating graphical symbols. This optical reader must be capable of automatically determining whether an image includes a graphical symbol or is merely a color photographic image, and process the acquired color imaging data based on that determination. A need also exists for an optical reader that is able to associate an acquired color image with any subsequent acquired color image.


SUMMARY OF THE INVENTION

The invention provides for an optical reader that is configured for classifying an image. The optical reader includes a color imaging assembly that is configured for obtaining a representation of an image in a digital format. The optical reader also includes an image analysis component that is configured for classifying the image by evaluating a result of attempting to locate and process a portion of the image as graphical symbol, and configured for classifying the image as an image including a graphic symbol if and when attempting to locate and process a portion of the image as graphical symbol is successful, and configured for classifying the image as an image excluding a graphic symbol if and when attempting to locate and process a portion of the image as graphical symbol is unsuccessful.


In some embodiments, the optical reader is configured to attempt to locate and process a portion of the image as graphical symbol by searching for high energy regions within the image. High energy regions of the image can include black-white transitions.


In some embodiments, the optical reader is configured to attempt to locate and process a portion of the image as graphical symbol by attempting to decode the portion of the image as a bar code symbol. Optionally, if the bar code symbol represents a menu, and if attempting to decode the bar code symbol is successful, then a menu that corresponds to a result of decoding of the bar code symbol is executed. Optionally, if the bar code symbol represents data, and if attempting to decode the bar code symbol is successful, then data that corresponds to a result of decoding the bar code symbol is output to a display.


In some embodiments, if attempting to decode the bar code symbol is unsuccessful, then attempting to process a portion of the image as a graphical symbol further includes attempting to process the portion of the image as an optical character recognition (OCR) symbol.


In some embodiments, if attempting to process the portion of the image as an optical character recognition symbol is successful, then a representation of the processed optical character recognition symbol is stored into a memory.


In some embodiments, if attempting to process the portion of the image as an optical character recognition symbol is unsuccessful, then attempting to process a portion of the image as a graphical symbol further includes attempting to recognize the portion of the image as text. If attempting to recognize the portion of the image as text is successful, then the portion of the image is cropped and stored. Optionally, the portion of the image is further compressed after being cropped and before being stored.


In some embodiments, if attempting to recognize the portion of the image as text is unsuccessful, then attempting to process a portion of the image as a graphical symbol further includes attempting to recognize the portion of the image as a signature. If attempting to recognize the portion of the image as a signature is successful, then the portion of the image is cropped and stored. Optionally, the portion of the image is further compressed after being cropped and before being stored. Optionally, attempting to process a portion of the image as a graphical symbol further includes attempting to verify the signature.


In some embodiments, if attempting to recognize the portion of the image as a signature is unsuccessful, after attempts to decode a bar code symbol, process OCR text and recognize text are unsuccessful, then the image is classified as excluding a graphical symbol.


In some embodiments, attempting to locate and process a portion of the image as graphical symbol is successful if at least one of the steps of attempting to decode a bar code symbol, attempting to process a OCR symbol, attempting to recognize text and attempting to recognize a signature, are successful. Optionally, the image including a graphical symbol, is associated with another image that is subsequently captured. In some embodiments, if attempting to process a portion of the image as graphical symbol is unsuccessful, then the image is classified as an image excluding a graphic symbol.


In another aspect, the invention provides for a method for employing an optical reader to classify an image. In some embodiments, the method includes the steps of providing an optical reader, configuring the optical reader to obtain a representation of an image in a digital format, configuring the optical reader to classify the image by attempting to locate and process a portion of the image as graphical symbol. The method further includes the steps deciding if attempting to locate and process a portion of the image as graphical symbol is successful, then classifying the image as an image including a graphic symbol; and deciding if attempting to locate and process a portion of the image as graphical symbol is unsuccessful, then classifying the image as an image excluding a graphic symbol.





BRIEF DESCRIPTION OF THE DRAWINGS


FIGS. 1A-1D are perspective views of various embodiments of the optical reader of the present invention;



FIG. 2 is a block diagram of the electro-optical assembly of the optical reader of the present invention;



FIG. 3 is an example of a graphical user interface display in accordance with the present invention;



FIG. 4 is a flow chart showing the processing flow for an automatic mode in accordance with another embodiment of the present invention;



FIG. 5A-5B is a flow chart showing the processing flow for a semi-automatic mode in accordance with another embodiment of the present invention;



FIG. 6A-6C are graphical depictions of the menu symbol used in the bar code processing flows depicted in FIG. 4 and FIG. 5;



FIG. 7 is a flow chart showing a method for reading a bar code in accordance with yet another embodiment of the present invention;



FIG. 8 is a flow chart showing a method for 1D autodiscrimination in accordance with the method depicted in FIG. 7;



FIG. 9 is a flow chart showing a method for 2D autodiscrimination in accordance with the method depicted in FIG. 7;



FIG. 10 is a flow chart showing a method for reading text in accordance with yet another embodiment of the present invention;



FIG. 11 is a flow chart showing a method for performing OCR in accordance with yet another embodiment of the present invention;



FIG. 12 is a flow chart showing a method for associating consecutive images taken with the color optical reader of the present invention;



FIG. 13 is an example of image association in accordance with the present invention;



FIG. 14 is a perspective view of a wireless color optical reader in accordance with yet another embodiment of the present invention;



FIG. 15 is a flow chart showing a method for transmitting packetized data from a color optical reader to a base station;



FIGS. 16A and 16B are diagrammatic depictions of packet formats in accordance with yet another embodiment of the present invention;



FIG. 17 is a flow chart showing a method for performing signature verification in accordance with yet another embodiment of the present invention; and



FIG. 18 is a diagrammatic depiction of color optical reader network applications in accordance with the present invention.





DETAILED DESCRIPTION

Reference will now be made in detail to the present exemplary embodiments of the invention, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts. An exemplary embodiment of the optical reader of the present invention is shown in FIG. 1, and is designated generally throughout by reference numeral 10.


In accordance with the invention, the present invention for an optical reader includes a color imaging assembly for acquiring color imaging data. An image analysis circuit determines if the acquired image includes at least one graphical symbol. A processing circuit processes the imaging data based on the determination of whether the image includes at least one graphical symbol. The present invention allows a user to read graphical symbols, such as bar codes, text, OCR characters or signatures using a color imager. The color optical reader of the present invention is configured to automatically determine whether a color image includes a graphical symbol, or is merely a color photographic image. The optical reader of the present invention also is operative to associate one acquired image with at least one subsequently acquired image.


As embodied herein, and depicted in FIGS. 1A-1D, perspective views of the optical reader in accordance with various embodiments of the present invention are disclosed. FIG. 1A shows the underside of hand held wireless optical reader 10. FIG. 1B shows the top of the optical reader depicted in FIG. 1A. Optical reader 10 includes housing 100, antenna 102, window 104 and trigger 12. Window 104 accommodates illumination assembly 20 and imaging assembly 30. As shown in FIG. 1B, the top side of reader 10 includes function keys 14, alphanumeric key pad 16, and display 60. In one embodiment, function keys 14 include an enter key and up and down cursor keys. FIG. 1C is also a hand held wireless optical reader 10. Reader 10 includes function keys 14, alphanumeric key pad 16, writing stylus 18, display 60, and signature block 62. Stylus 18 is employed by a user to write his signature in signature block 62. FIG. 1D shows yet another embodiment of optical reader 10 of the present invention. In this embodiment, reader 10 includes a gun-shaped housing 100. Display 60 and keypad 16 are disposed on a top portion of gun-shaped housing 100, whereas trigger 12 is disposed on the underside of the top portion of housing 100. Housing 100 also includes window 104 that accommodates illumination assembly 20 and imaging assembly 30. Wire 106 is disposed at the butt-end of housing 100. Wire 106 provides optical reader 10 with a hard wired communication link for external devices such as a host processor or other data collection devices.


As embodied herein and depicted in FIG. 2, a block diagram of the electro-optical assembly of optical reader 10 of the present invention is disclosed. Optical reader 10 includes illumination assembly 20 and color imaging assembly 30, connected to processor 40. Illumination assembly 20 includes illumination optics 22 coupled to light source 24. Light source 24 is coupled to ASIC/FPGA 44. ASIC/FPGA 44 is programmed to drive light source 24. Imaging assembly 30 includes imaging optics 32 and color imager 34. Imaging optics 32 focuses the illumination light reflected from target T onto color imager 34. Color imager 34 provides color imaging data to ASIC/FPGA 44. Color imager 34 performs several functions. Color imager 34 generates analog color image signals using an imaging array color filter. The array color filter pattern is a Bayer-pattern. The analog color imaging data is converted into a digital format using an internal A/D converter which also functions as a quantizer. An 8-bit system provides 256 brightness levels, whereas a 12-bit converter provides over 4,000 brightness levels. Digital color imaging data is transmitted from imager 34 to ASIC/FPGA 44 and processor 42.


Optical reader 10 also includes processor 40. In the embodiment depicted in FIG. 2, processor 40 includes microprocessor 42 and ASIC 44. System bus 52 couples microprocessor 40, RAM 46, EROM 48, I/O circuit 50 and display 60.


Illumination optics 22 may be of any suitable type, but there is shown by way of example a lens system for directing light from light source 24 towards target T. It will be apparent to those of ordinary skill in the pertinent art that modifications and variations can be made to illumination optics 22 of the present invention depending on the complexity of the target illumination. For example, illumination optics 22 may include one or more lenses, diffusers, wedges, reflectors or a combination of these elements. In one embodiment, illumination optics 22 produces an aiming pattern on target T.


Light source 24 may be of any suitable type, but there is shown by way of example a plurality of white LEDs. It will be apparent to those of ordinary skill in the pertinent art that modifications and variations can be made to light source 24 of the present invention depending on the application. For example, illumination assembly 20 may be eliminated altogether if it is certain that the ambient light level will be high enough to obtain high quality color images. In another embodiment, red LEDs are employed instead of the white LEDs.


Color imager 34 may be of any suitable type, but there is shown by way of example, a CMOS color imager having an 640×480 pixel resolution. It will be apparent to those of ordinary skill in the pertinent art that modifications and variations can be made to color imager 34 of the present invention depending on cost and the resolution required by optical reader 10. In another embodiment, color imager 34 has 800×600 pixels. A typical VGA resolution of 640×480 pixels is adequate for displaying color images on a LCD or a computer monitor. In one megapixel embodiment, color imager 34 has 1156×864 pixels (almost 1-million pixels). In yet another embodiment, color imager 34 includes 1536×1024 pixels. One of ordinary skill in the art will recognize that as the resolution of imager 34 increases, so will the cost. In another embodiment, color imager 34 is implemented by scanning a linear CCD array. In other embodiments, color imager 34 is implemented using an area CCD solid state image sensor.


Processor 40 may be of any suitable type, but there is shown by way of example a processor which includes microprocessor 42 and ASIC 44 coupled to system bus 52. In one embodiment, microprocessor 42 and ASIC are programmable control devices that receive, process, and output data in accordance with an embedded program stored in EROM 48. As discussed above, microprocessor 42 and ASIC 44 are connected to system bus 52, which includes address, data, and control lines.


In the embodiment depicted in FIG. 2, microprocessor 42 is an off-the-shelf VLSI integrated circuit (IC) microprocessor. Microprocessor 42 is tasked with the over-all control of the electro-optics shown in FIG. 2. Processor 42 controls menu operations, command and data received from I/O circuit 50, data written to display 60, and operating system functions. I/O circuit 50 controls the information received from keypad 14 and keypad 16. Microprocessor 42 is also tasked with processing and decoding imaging data stored in RAM 46 in accordance with the programming instructions stored in EROM 48. Thus, microprocessor 42 performs bar code decoding, optical character recognition, signature verification, and color image processing.


In the embodiment depicted in FIG. 2, ASIC 44 is implemented using a programmable logic array (PLA) device. In a similar embodiment, ASIC 44 is implemented using a field programmable gate array (FPGA) device. ASIC 44 is tasked with controlling the image acquisition process, and the storage of image data. As part of the image acquisition process, ASIC 44 performs various timing and control functions including control of light source 24, control of color imager 34, and control of external interface 56.


It will be apparent to those of ordinary skill in the pertinent art that modifications and variations can be made to processor 40 of the present invention depending on the cost, availability, and performance of off-the-shelf microprocessors, and the type of color imager used. In one embodiment, microprocessor 42 and ASIC 44 are replaced by a single microprocessor 40. In one embodiment, microprocessor 40 is implemented using a single RISC processor. In yet another embodiment, microprocessor 40 is implemented using a RISC and DSP hybrid processor.


It will be apparent to those of ordinary skill in the pertinent art that modifications and variations can be made to the memory configuration of the present invention depending on cost and flexibility considerations. For example, in one embodiment, EROM 48 is implemented using EPROMs or E2PROMs. In yet another embodiment, FLASH memory is employed. RAM 46 typically includes at least one volatile memory device, and in some embodiments includes one or more long term non-volatile memory devices.


It will be apparent to those of ordinary skill in the pertinent art that modifications and variations can be made to I/O unit 50 of the present invention depending on the application and work environment. Embodiments of I/O unit 50 include an RS-232 interface, a LAN interface, PAN interface, a serial bus such as USB, an internet interface, and a wireless interface.


External interface 56 is used to transmit a discrete signal to control a peripheral device. Typically, the peripheral is an external illuminator. The external illuminator is used in place of light source 24.


It will be apparent to those of ordinary skill in the pertinent art that modifications and variations can be made to the operating system employed by optical reader 10 depending on the applications and desired operating environment. In one embodiment, a WindowsCE operating system is employed. In other embodiments, LINUX or PalmOS operating systems are employed. As a non-limiting example, application programs can be written using C, C++, Visual Basic, or Visual C++. Other languages can be used as well, depending on the application program. In other embodiments, optical reader 10 does not employ an operating system. For example, the simple reader depicted in FIG. 1D does not require a complex operating system.


As embodied herein and depicted in FIG. 3, an example of a graphical user interface in accordance with the present invention is disclosed. Display 60 provides a plurality of application program icons displayed on graphical user interface (GUI) 650. Selections are made by the user via arrow 652. For example, GUI 650 allows a user to select the automatic image capture mode by clicking on automatic mode icon 654. GUI 650 also includes semi-automatic image capture icon 656, bar-code scanning icon 658, OCR/text capture icon 660, signature capture mode icon 662, color photography mode icon 664, association mode icon 668, and additional application program icons 666. The application program icon 666 may allow the user to collect other biometric information such as finger and voice prints. In the WindowsCE environment, start button icon 670 and tool bars may also be displayed on GUI 650. GUI 650 also displays current application program data 672.


In the Automatic imaging mode, processor 40 is programmed to analyze the color imaging data to determine if an acquired image includes a graphical symbol or is merely a color photographic image. If it makes the determination that the color image includes a graphical symbol, it further analyzes the acquired image and classifies it as a bar code, OCR symbol, text, or a signature. Based on the classification, optical reader 10 jumps to the appropriate routine in EROM 48. The semi-automatic mode is similar. Thus, in the automatic or semi-automatic modes, the bar code scanning mode, the OCR/text mode, the signature capture mode, the color photography mode, and the association mode are controlled by the application program, not by the user.


However, the user may manually select any of the above listed modes. If the user clicks on bar code scanning icon 658, the bar code scanning application program will run. In this application program, the user may select between a 1D bar code mode, 2D bar code mode or an autodiscrimination mode. Further, the user can manually select and de-select the types of bar codes optical reader 10 is enabled to read or not read.


The user may also click on OCR/Text icon 660. Clicking icon 660 provides the user with a check validation mode, a text scanning mode, or a bi-tonal image capture mode. The check validation mode is performed in conjunction with network services.


Clicking on icon 662 provides the user with a signature capture mode. In one embodiment, this mode includes a signature verification program wherein the user may select between a static verification or a dynamic verification. In the static mode, the user captures the image of a signature. The captured image is compared with a reference image stored in a remote database. In the dynamic mode, optical reader 10 uses the stylus and signature block to capture the signature. In this mode, signature block 62 measures unique dynamic parameters, such as applied pressure, direction and timing of movements, or a combination of these parameters. One of ordinary skill in the art will recognize that this list is not meant to be all-inclusive, but rather, is a representative example. The captured dynamic parameters are compared with a reference data stored in a remote database.


The user selects the color photography mode by clicking on icon 664. This mode allows the user to select an automatic imaging mode wherein optical reader 10 makes the imaging adjustments (e.g., exposure, etc.) or a manual mode that allows the user to adjust imager settings as he pleases.


In another embodiment, display 60 provides the user with a menu listing the main modes of optical reader 10. The user employs keypad 16 to select the desired mode. A cursor key is employed to highlight any of the modes listed above. Upon pressing the enter key, processor 40 jumps to the appropriate routine stored in EROM 48. As discussed above, a user may select between an Automatic Imaging mode, a Semi-Automatic Imaging mode, a bar code scanning mode, an OCR/text mode, a signature capture mode, a color photography mode, or an association mode.


As embodied herein and depicted in FIG. 4, a flow chart showing the processing flow for the automatic imaging mode in accordance with another embodiment of the present invention is disclosed. After the user pulls the trigger in step 400, processor reads the selected mode. In this case the automatic mode has been selected by the user. The processor initializes optical reader 10 hardware, defines image data memory space, and initializes software mode settings. In step 408, optical reader 10 captures the image by obtaining color imaging data. In some embodiments, processor 40 may display the acquired image on display 60 during this step. In step 410, processor 40 determines if the captured image includes a graphical symbol. In one embodiment, processor 40 uses only a portion of the color imaging data to make this determination. Because there are more green pixels than either red or blue pixels in the Bayer-Pattern, processor 40 uses the green pixels to look for high energy regions in the acquired image. High energy, e.g. black-white transitions are a good indicator for the presence of a graphical symbol, such as a bar code symbol. A black and white bi-tonal image will consist of green pixels that are in one of two possible value ranges. One narrow range of values is representative of white portions of the image, whereas the other narrow range of values is representative of black portions of the image.


In another embodiment, step 410 is performed by considering all of the pixel values. However, the interpretation of the pixel's value is adjusted based on whether it is a red, green, or blue pixel. In another embodiment, processor 40 creates a gray-scale image to determine whether the image includes a graphical symbol.


If in step 410 processor 40 determines that there is no graphical symbol present in the image, the user is asked in step 432 if he desires to store the image. If so, the color photographic image is stored in memory in step 434. If processor 40 determines that the image includes a graphical symbol, the process flow moves on to step 418. In this step, processor 40 strikes scanning lines to locate bar code symbol identifiers. If processor 40 determines that the graphical symbol is a bar code symbol it attempts to decode the symbol in step 436. If the decoding is successful, the symbol may be a menu symbol or a data symbol. If it is a data symbol, the decoded value of the bar code symbol is output to the display. If it is a menu symbol, a menuing routine is executed. The menu symbol is discussed in more detail below.


If processor 40 does not locate a bar code symbol it moves onto step 420 and looks for OCR-A or OCR-B characters. If it finds these characters it performs optical character recognition in step 422. If it does not, processor evaluates the image for the presence of text. If text is located, the image is cropped, and the text is compressed and stored in steps 428 and 430. If the image does not include text, processor 40 evaluates the image for the presence of a signature. If one is present, the image is cropped, and the data is compressed and stored in steps 428 and 430. In another embodiment, optical reader 10 is networked, and processor 40 communicates with remote network resources to provide signature verification services. If processor 40 cannot detect a bar code symbol, OCR symbols, text, or a signature, the user is asked in step 432 if he desires to store the image. If he does, the color photographic image is stored in memory in step 434.


As embodied herein and depicted in FIG. 5, a flow chart showing the processing flow for the semi-automatic mode is disclosed. After the user pulls the trigger in step 500, processor reads the selected mode, initializes optical reader 10 hardware, defines image data memory space, and initializes software mode settings. In step 508, optical reader 10 captures and displays the image.


In step 510, processor 40 determines if the captured image includes a graphical symbol. Step 510 in the semi-automatic mode is identical to step 410 in the automatic mode. If processor 40 determines that the captured image does not include a graphical symbol, processor 40 asks the user if she wants to store the color image. If so, the color image is stored in step 514. In step 516, a prompt asks the user if he desires to associate the color image with another image. This step is not performed in the automatic mode. In step 518, if the user answers in the affirmative, the association is made and the processing flow returns to step 508.


In steps 520, 522, 526, and 532, the user is given the opportunity to select the type of graphical imaging that is to be performed. The method for performing OCR, text capture, and signature capture and/or verification are discussed above in the automatic mode description with one difference. In the semi-automatic mode, the user is asked in step 538 if he desires to associate the processed image with a subsequent captured image. If so, process flow is directed back to step 508 and another image is captured and displayed. The association feature can be used several times to associate multiple images.


If the user indicates that it is a bar code, an attempt is made to decode the symbol in step 540. Referring back to step 540, if the decoding attempt is successful, processor 40 determines in step 544 if the symbol is a menu symbol. If it is not a menu symbol, processor 40 displays the decoded bar code information on display 60. If it is a menu symbol, processor 40 executes the appropriate menu routine in step 546. In steps 552 to 564, processor 40 may continue to capture images if the trigger is continuously pulled. In step 562, the user is asked if he desires to associate the decoded bar-code with another image. If so, the program flow is directed back to step 508 and another image is captured and displayed. Processor 40 links this image to the decoded bar code information.


As embodied herein and depicted in FIG. 6A-6C, graphical depictions of the menu symbol used in the bar code processing flows depicted in FIG. 4 and FIG. 5 are disclosed. A decoded menu symbol includes menu word 600 which has the format depicted in FIG. 6A. Menu word 600 includes a one byte product ID code 600-1, that identifies the type and model of the optical reader. Field 600-2 of word 600 specifies the op-code. The op-codes are depicted in FIG. 6C. Op-code 0, refers to vector processing operations that are listed as A1-A4 in FIG. 6C. Vector processing allows the user to download, enabled codes, the parameter table, or current software to an external device. Op-codes 1-7 allow a user to modify a specific portion of the parameter table. These op-codes are used in conjunction with the offset field 600-3 and data fields 600-4 to 600-7. Offset field 600-3 is an index relative to the base address of the parameter table in memory that specifies the exact location in the parameter table. The data fields 600-4 to 600-7 are used to specify a bit mask that indicates which bits are to be modified. FIG. 6B depicts a second important group of options. For example, reader operating modes are included in F1-F6. These options are identical to the icons displayed on GUI 650 in FIG. 3. Offset field 600-3 accommodates other optical reader 10 options as shown.


As embodied herein and depicted in FIG. 7, a flow chart showing a method for reading a bar code in accordance with yet another embodiment of the present invention is disclosed. In step 700, processor 40 refers to a parameter table stored in EROM 48. Specifically, processor 40 determines if the parameter table is programmed to perform 1D decoding. If the parameter table has enabled 1D processing, 1D autodiscrimination is performed. The parameter table specifies the values of the parameters that define the operational mode of the reader. Examples of these parameters include the size and frame rate of the color imager, codes that are enabled during bar code decoding, I/O communications protocols, OCR options, and others. If 1D decoding is successful, the decoded data is stored or displayed, in accordance with the parameter table settings. If 1D codes are disabled or if 1D decoding is unsuccessful, processor moves on to step 708. In this step, processor 40 determines if any 2D codes are enabled. If the parameter table has all of the 2D codes disabled, processor 40 exits the bar code decoding routine. If 2D codes are enabled, 2D autodiscrimination is performed in step 710. If decoding is successful, the decoded data is either stored or output, depending on the parameters stored in the parameter table. If decoding is unsuccessful, processor exits the routine.


As embodied herein and depicted in FIG. 8, a flow chart showing a method for performing the 1D autodiscrimination of step 702 in FIG. 7 is disclosed. In step 800 processor 40 calculates the activities of selected image data elements. The activity is defined as a measure of the rate of change of the image data over a small two-dimensional portion of the region surrounding the selected data element. In one embodiment, the activity is calculated along any two arbitrarily selected directions which are orthogonal one to the other. Two mutually perpendicular directions are used because the orientation of the symbol is unknown. In step 802, processor 40 looks for “high activity” regions. These high activity regions are referred to as candidate symbol regions(CSRs). A high activity region indicates a transition from a black region to a white region, or vice-versa. If there is more than one CSR, it may indicate the presence of more than one bar code symbol. In step 804, processor 40 selects the largest CSR. In step 806, processor 40 calculates the centroid of the largest CSR. Subsequently, processor 40 finds the direction of the highest activity in the largest CSR. In a 1D bar code, this will be the direction perpendicular to the direction of the bars. In steps 810 and 812, processor defines the initial scan line (SC=0), as being the scan line bisecting the centroid of the bar code. Processor calculates the brightness values of sampling points along the initial scan line. These brightness values are converted to digital data in step 816. In decoding step 818, processor 40 applies one 1D decoding program after another. If decoding is unsuccessful, processor 40 checks if the entire CSR has been scanned. If not, it establishes a new scan line, and repeats the decoding process. If in step 822, the entire CSR has been scanned, and there are no CSRs remaining to be decoded, processor 40 exits the routine. If in step 820, 1D decoding is successful, processor 40 determines if the symbol is a 1D stacked symbol. If it is a 1D stacked symbol, processor 40 scans and decodes the remaining CSRs in the stacked symbol. If it is not a stacked symbol, the decoded 1D data is stored or output to display 60 in step 830. In step 838, processor 40 determines if there any unexamined regions. If there are unexamined regions, the decoding process is repeated. Otherwise, processor 40 exits the routine.


As embodied herein and depicted in FIG. 9, a flow chart showing a method for 2D autodiscrimination is disclosed. In step 900, processor 40 converts the image data into a two-state binarized format. In step 902, processor 40 locates all 2D finder patterns and identifies them by type. Pattern types include bulls-eye type patterns, waistband type patterns peripheral patterns, and others. If the number of finder patterns equals zero, processor 40 exits the routine. If there are finder patterns, processor 40 locates the finder pattern closest to the center of the field of view in one embodiment of the invention. The closest-to-the-center option has an advantage in that a centrally located image is likely to be a symbol. In step 908, processor 40 attempts to decode the symbol in accordance with the finder type. For example, the Aztec 2D matrix symbol employs a bulls-eye finder pattern. The DataMatrix symbology employs a peripheral finder pattern. If the decoding is successful, the decoded data is either stored or displayed. In step 914, processor 40 determines if there are any other unused finder patterns. If so, the symbols corresponding to those unused patterns are decoded, and the previously described steps are repeated. Otherwise, processor 40 exits the routine.


As embodied herein and depicted in FIG. 10, a flow chart showing a method for reading text in accordance with yet another embodiment of the present invention is disclosed. This routine can be accessed in a number of ways as described above. In step 1000, a bit-map image of the page is produced. In step 1002, the bit mapped image is sampled. In one embodiment, this is performed by analyzing every Nth scan line of the bit mapped image. The value of integer N is dependent on the resolution of the scanned image. In one embodiment the image is sampled every 1/40th of an inch. This provides sufficient resolution to locate and classify the various regions on the page. By sampling every 1/40th of an inch instead of every scan line, the processing and memory requirements of reader 10 are substantially reduced. In step 1004, processor 40 identifies the page features. Processor 40 analyzes the page and divides it into blank and non-blank portions. The non-blank portions are analyzed to distinguish text regions from non-text regions. After determining the layout of the page, processor 40 uses black-to-white transitions to determine degrees of skew. In step 1008, horizontal white spaces are identified to separate lines of text. In step 1010, vertical white spaces are identified within each line of text to thereby separate individual words and characters from each other. In step 1014, a character recognition algorithm is used in an attempt to recognize each individual character. Finally, in step 1016, processor 40 formats the recovered text before storing the text in memory.


As embodied herein and depicted in FIG. 11, a flow chart showing a method for performing OCR in accordance with yet another embodiment of the present invention is disclosed. In step 1100, reader 10 produces a bit-mapped image of the page. Subsequently, processor 40 finds lines of text in the image, locates the white spaces in each line, and isolates the characters. In step 1108, processor 40 performs character recognition, either OCR-A or OCR-B, as desired. The decoded characters are stored in memory.


As embodied herein and depicted in FIG. 12, a flow chart showing a method for associating consecutive images taken with the color optical reader of the present invention is disclosed. This method corresponds to icon 668 displayed on GUI 650 in FIG. 3. If icon 668 is not clicked on, processor 40 assumes that reader 10 is not operating in association mode. Thus, processor 40 will process a single image. If reader 10 is in association mode processor 40 initializes counter CNTR. In step 1206 processor 40 processes the first captured image. In step 1208, if CNTR is less than or equal to two, processor 40 processes image N, and links image N to the first image. In step 1216, CNTR is incremented by one. If CNTR is greater than two (step 1208), meaning that at least two images have already been linked, processor 40 asks the user if she desires to link another image. If so, the processing flow returns to step 1212. If not, processor 40 exits the routine.


As embodied herein and depicted in FIG. 13, an example of image association in accordance with the present invention is disclosed. One or ordinary skill in the art will recognize that associated images 1300 can be disposed on paper, displayed electronically on display 60, or displayed electronically sing other electronic means, such as a computer monitor. In this example, the first image captured is color photograph 1302 which shows a damaged parcel. The second image captured is bar code 1304 affixed to the side of the damaged parcel. Processor 40 decodes bar code 1304 and associates decoded bar code data 1306 with color photograph 1302. In this example, the user elected to associate a third image, signature 1308. Thus, personnel viewing record 1300 may reasonably conclude that a damaged parcel was delivered to Company XYZ, and that the person signing for the parcel delivery was someone named John W. Smith.


As embodied herein and depicted in FIG. 14, a perspective view of a wireless color optical reader network 1400 in accordance with another embodiment of the present invention is disclosed. Network 1400 includes N-cordless optical scanners 10 coupled to base terminal 202 by means of radio link 18. Base terminal 202 is connected to host computer 206 by communications link 204. Cordless optical reader 10 is of the type described above. It includes antenna 102, keypads 14 and 16, and display 60. A radio controller is included in both the optical scanner 10 and the base terminal 202. It will be apparent to those of ordinary skill in the pertinent art that radio controller may be of any suitable type, but by way of example, radio controller 30 provides frequency hopping spread spectrum communications (FHSS) between scanner 10 and base terminal 202. FHSS is a form of spread spectrum radio transmission that produces a narrow band signal that hops among a plurality of frequencies in a prearranged pattern. FHSS is often used in commercial environments because of its ability to minimize errors due to interference or jamming. However, those of ordinary skill in the art will recognize that optical scanner 10 and base terminal 202 may communicate using other wireless schemes and other modulation formats based on user requirements and environmental factors. Base terminal 202 includes antenna 208, which is used to transmit and receive messages from optical scanner 10. Antenna 208 is connected to a radio controller disposed inside terminal 202. Base terminal 202 also includes an I/O card, a base terminal processor, and a base terminal memory. The I/O card in base terminal 202 is coupled to the radio controller and communications link 204.


As embodied herein and depicted in FIG. 15, a flow chart showing a method for transmitting packetized data from a color optical reader to a base station is disclosed. In steps 1500 and 1502, optical reader 10 captures an image and processes the image as described above. In step 1504, the processed image, whether it be a color image, decoded bar codes, a text file, or signature verification information, is assembled into packets. In steps 1506 and 1508, a loop is created wherein packets are sent to the base terminal one-by-one until all packets are sent.


As embodied herein and depicted in FIG. 16A and FIG. 16B, diagrammatic depictions of packet formats in accordance with the present invention are disclosed. In one embodiment of the present invention, each packet can accommodate approximately 200 bytes of decoded data in a 256 byte packet. This is merely a representative example, and one of ordinary skill in the art will recognize that the scope of the present invention should not be limited to data packets of a certain size or format. FIG. 16A shows data packet 1600 which is used to transmit decoded data from an optical reader to a base terminal when only one data packet is required. Packet 1600 includes an optical reader address field, sequence number field, a packet length field, an image type field, image data, and an error check field. The optical reader address identifies a particular optical reader. Each packet includes a sequence number disposed in the second field. The next field contains the length of the image data field. After this, the packet contains a field identifying the type of image that was processed. After the image type, the image data payload of the packet is inserted. Finally, packet 200 includes an error checking field.



FIG. 16B shows header packet 1602 and data packet 1604 used to transmit decoded data from an optical scanner to a base terminal when more than one data packet is required. When more than one packet is required, reader 10 first transmits header packet 1602. After base terminal 202 acknowledges that it can process the remaining packets, reader 10 transmits remaining packets 1604. If base terminal 202 cannot process the remaining packets 1604, or if there is another problem, base terminal 202 will transmit an application packet to scanner 10 indicating the error. The definitions of the scanner address field, the sequence number field, symbol type, length, symbol data, and error check field were described above, and hence, will not be repeated. Header packet 1602 also includes a header identification field, which identifies the packet as a header packet. In the next field, packet 1602 includes a total length field, which includes the total length of the data contained in the decoded symbol. The next field includes the total number of packets in the message. The second-to-last field is the packet number. In the header packet, this number is designated as packet number “one.” The remaining packets 1604 also include a packet number field, which are incremented from 2 to N, depending on the total number of packets being transmitted in the message.


Packet 1600, packet 1602, and packet 1604 as described above may be of any suitable type, and are representative examples representing one embodiment of the present invention. One of ordinary skill in the art will recognize that the packets may be implemented in a variety of ways.


As embodied herein and depicted in FIG. 17, a flow chart showing a method for performing signature verification is disclosed. In step 1700, optical reader 10 captures the image of the document to thereby generate a bit-map of the image. One of ordinary skill in the art will recognize that in the automatic mode or semi-automatic mode, processor 40 determines that the image object is a graphical symbol in a subsequent step. Step 1202 is similar to steps 1002 and 1004 of FIG. 10. The image is sampled by analyzing every Nth scan line of the bit mapped image. As discussed above, the image must be scanned in such a way so as to provide sufficient resolution to locate and classify the various regions on the document. In the case of a check, the location of the various fields on the instrument are relatively standard. Check sizes may differ somewhat, but the check number, bank code, account number, date, signature block, and etc. are in the same relative locations from check to check. In step 1704, document data such as the name, check number, bank code, account number, and date, are extracted from the document using any OCR program and stored in memory. In step 1706, the image of the hand writing in the signature block is captured.


Steps 1708 and 1710 are performed using the wireless system 1400 described above. In other embodiments these steps are performed by a wireline system. For example, in one embodiment, optical reader 10 is coupled to a host computer via an RS-232 or USB link. In another embodiment, optical reader 10 is connected to a host computer via a LAN. One of ordinary skill in the art will recognize that the present invention should not be construed as being limited by these examples.


In steps 1712 and 1714, processor 40 initializes a counter and begins waiting for a reply from the host computer. In steps 1714-1718, if the reply is not received within time limit TL, the counter CNTR is incremented and the message is re-transmitted. After several attempts, if CNTR>N (N being an integer), processor 40 outputs a fault message. If the reply message is received within time limit TL, processor interprets the reply in step 1722. If the extracted data and the signature match information stored in the database accessible by the host computer, an approval message is displayed. If the extracted data and the signature do not match information stored in the database accessible by the host computer, a disapproval message is displayed. The dynamic signature verification embodiment is similar to the static embodiment described immediately above. In the dynamic version, the user provides his signature using stylus 18 and signature block 62, as shown in FIG. 1C. Signature block 62 provides processor 40 with the dynamic parameters recorded during signature. The dynamic parameters are transmitted to a host processor, as described above.


As embodied herein and depicted in FIG. 18, an example of a color optical reader network 1800 in accordance with the present invention is disclosed. Network 1800 includes wireless system 1400, personal computer 1802, optical reader 10, LAN 1820, network servicing center 1830, and personal area network (PAN) coupled together via network 1810.


One of ordinary skill in the art will recognize that network 1810 may be of any suitable type depending on the application, but there is shown by way of example the Internet. However, the present invention should not be construed as being limited to this example. In another embodiment, network 1810 is a private network. Those of ordinary skill in the art will also recognize that network 1810 is a wireline network in one embodiment, and a wireless network in another embodiment. Network 1810 may include circuit switched networks, IP networks, or both.


LAN 1820 includes server 1822, computer 1824, database 1826, and a plurality of optical readers 10. Database 1826 is used to store associated images along with other data fields. For example, it would be rather useful to store additional information with the associated images shown in FIG. 13. One may want to associate the delivery means, route, driver, and other related information for subsequent analysis. Network 1810 allows reader 10, PAN 1850, and wireless system 1400 a way to store such data in database 1826. System analysts can access this information via personal computer 1802 connected to network 1810. In one embodiment, LAN 1820 includes an Internet website. In this embodiment, users are authenticated before gaining access to database 1826.


Network servicing center 1830 is coupled to network 1810 via interface 1844. Center 1830 also includes server 1832, computer 1834, database 1836, signature verification module 1838, authentication module 1840, coupled together via a LAN. Center 1830 accommodates any number of useful applications programs 1842.


PAN 1850 includes at least one color optical reader 10 coupled to point-of-sale (POS) terminal 1854. POS terminal 1854 is coupled to network 1810 via interface 182. POS terminal 1854 includes a credit card reader and a signature capture block. In the scenario depicted in FIG. 18, a merchant user of POS terminal 1854 transmits an associated customer credit card number, signature, and in one embodiment, a color image of the customer, to Center 1830. Authentication module 1840 is used to authenticate the credit card and signature verification module is used to authenticate the signature. In another embodiment, database 1836 is used to store the customer's image, credit card number, and signature for verification purposes.


It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit and scope of the invention. Thus, it is intended that the present invention cover the modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.

Claims
  • 1. An optical reader that is configured for classifying an image, comprising: a color imaging assembly that is configured for obtaining a representation of an image in a digital format; andan image analysis component that isconfigured for classifying the image by evaluating a result of attempting to locate and process a portion of the image as graphical symbol, andconfigured for classifying said image as an image including a graphic symbol if attempting to locate and process a portion of the image as graphical symbol is successful, andconfigured for classifying said image as an image excluding a graphic symbol if attempting to locate and process a portion of the image as graphical symbol is unsuccessful.
  • 2. The optical reader of claim 1, wherein classifying by attempting to locate and process a portion of the image as graphical symbol includes searching for high energy regions in the image.
  • 3. The optical reader of claim 2, wherein high energy regions include black-white transitions.
  • 4. The optical reader of claim 1, wherein attempting to locate and process a portion of the image as graphical symbol includes attempting to decode said portion of the image as a bar code symbol.
  • 5. The optical reader of claim 4, wherein if said bar code symbol represents a menu, then if attempting to decode said bar code symbol is successful, then a menu that corresponds to a result of decoding of said bar code symbol is executed.
  • 6. The optical reader of claim 4, wherein if said bar code symbol represents data, then if attempting to decode said bar code symbol is successful, then data that corresponds to a result of decoding said bar code symbol is output to a display.
  • 7. The optical reader of claim 4, wherein if attempting to decode said bar code symbol is unsuccessful, then attempting to process a portion of the image as a graphical symbol further includes attempting to process said portion of said image as an optical character recognition (OCR) symbol.
  • 8. The optical reader of claim 7, wherein if attempting to process said portion of said image as an optical character recognition symbol is successful, then a representation of said processed optical character recognition symbol is stored into a memory.
  • 9. The optical reader of claim 7, wherein if attempting to process said portion of said image as an optical character recognition symbol is unsuccessful, then attempting to process a portion of the image as a graphical symbol further includes attempting to recognize said portion of said image as text.
  • 10. The optical reader of claim 9, wherein attempting to recognize said portion of said image as text is successful, then said portion of said image is cropped and stored.
  • 11. The optical reader of claim 10, wherein said portion of said image is further compressed after being cropped and before being stored.
  • 12. The optical reader of claim 9, wherein attempting to recognize said portion of said image as text is unsuccessful, then attempting to process a portion of said image as a graphical symbol further includes attempting to recognize said portion of said image as a signature.
  • 13. The optical reader of claim 12, wherein if attempting to recognize said portion of said image as a signature is successful, then said portion of said image is cropped and stored.
  • 14. The optical reader of claim 12, wherein said portion of said image is further compressed after being cropped and before being stored.
  • 15. The optical reader of claim 12, wherein if attempting to recognize said portion of said image as a signature is unsuccessful, then said image is classified as excluding a graphical symbol.
  • 16. The optical reader of claim 13, wherein attempting to process a portion of the image as a graphical symbol further includes attempting to verify said signature.
  • 17. The optical reader of claim 1 wherein attempting to process a portion of the image as graphical symbol is unsuccessful, then said image is classified as an image excluding a graphic symbol.
  • 18. The optical reader of claim 1 wherein attempting to locate and process a portion of the image as graphical symbol is successful if at least one of attempting to decode a bar code symbol, attempting to process a OCR symbol, attempting to recognize text and attempting to recognize a signature, are successful.
  • 19. The optical reader of claim 16, wherein said image is associated with another image that is subsequently captured.
  • 20. A method for employing an optical reader to classify an image, the method comprising the steps of: providing an optical reader; configuring the optical reader to obtain a representation of an image in a digital format; configuring the optical reader to classify the image by attempting to locate and process a portion of the image as a graphical symbol; and wherein, if attempting to locate and process a portion of the image as a graphical symbol is successful, then classifying the image as an image including a graphic symbol; and wherein, if attempting to locate and process a portion of the image as a graphical symbol is unsuccessful, then classifying the image as an image excluding a graphic symbol. (A code sketch of this classification cascade follows these claims.)
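Taken together, claims 2 through 19 describe a fall-through pipeline: locate candidate high-energy regions, attempt to decode a bar code, then fall back to OCR, text recognition, and finally signature recognition, cropping (and optionally compressing) any recognized portion before storage, and classifying the image as excluding a graphical symbol only when every stage fails. The patent specifies no implementation, so the following Python is a minimal sketch under that reading; every name in it (StageResult, classify_image, count_bw_transitions, and the stage callables) is a hypothetical placeholder, not an API from the patent or from any particular decoder library.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Callable, List, Optional, Sequence, Tuple

Region = Tuple[int, int, int, int]  # hypothetical (x, y, width, height) box


class Classification(Enum):
    INCLUDES_GRAPHICAL_SYMBOL = auto()
    EXCLUDES_GRAPHICAL_SYMBOL = auto()


@dataclass
class StageResult:
    kind: str                 # "barcode", "ocr", "text", or "signature"
    region: Optional[Region]  # portion of the image that matched, if any
    payload: object           # decoded data, recognized characters, etc.


def count_bw_transitions(row: Sequence[int], threshold: int = 128) -> int:
    """Claims 2-3: a high-energy region exhibits many black-white
    transitions; count crossings of a binarization threshold along a row."""
    bits = [1 if px >= threshold else 0 for px in row]
    return sum(1 for a, b in zip(bits, bits[1:]) if a != b)


def classify_image(
    image: object,
    stages: List[Callable[[object], Optional[StageResult]]],
    store: Callable[[StageResult], None],
) -> Classification:
    """Run the cascade; the first stage to succeed classifies the image."""
    for stage in stages:
        result = stage(image)
        if result is not None:
            # Claims 8 and 10-14: the matched portion would be cropped,
            # optionally compressed, and stored; the store callback
            # stands in for those steps here.
            store(result)
            return Classification.INCLUDES_GRAPHICAL_SYMBOL
    # Claims 15 and 17: every attempt failed, so the image is treated
    # as an ordinary color photograph with no graphical symbol.
    return Classification.EXCLUDES_GRAPHICAL_SYMBOL


# Example: only the final (signature) stage succeeds, so the image is
# classified as including a graphical symbol, as in claim 12.
stages = [
    lambda img: None,  # bar code decoding fails (claim 4)
    lambda img: None,  # OCR processing fails (claim 7)
    lambda img: None,  # text recognition fails (claim 9)
    lambda img: StageResult("signature", (10, 20, 200, 60), payload=None),
]
print(classify_image(object(), stages, store=print))
```

The stage ordering mirrors the claim dependencies: bar code decoding is attempted first, and each later recognizer runs only after the previous one fails. In a real reader, the black-white-transition count of claims 2 and 3 would typically be applied up front to select the candidate portions handed to the decoders, since it is far cheaper than running any recognizer.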
CROSS-REFERENCE TO RELATED APPLICATIONS

This patent application is a divisional of U.S. patent application Ser. No. 10/764,741, filed on Jan. 26, 2004, which is a continuation of U.S. patent application Ser. No. 09/904,697, filed Jul. 13, 2001 and issued as U.S. Pat. No. 6,722,569 on Apr. 20, 2004. The benefit of priority to the aforementioned patent applications under 35 U.S.C. §120 is hereby claimed. The contents of the aforementioned applications are relied upon and are incorporated herein by reference in their entirety.

Related Publications (1)
Number Date Country
20070051814 A1 Mar 2007 US
Divisions (1)
Number Date Country
Parent 10764741 Jan 2004 US
Child 11592636 US
Continuations (1)
Number Date Country
Parent 09904697 Jul 2001 US
Child 10764741 US