1. Field of the Invention
The present invention relates generally to optical readers, and particularly to optical readers employing imagers.
2. Technical Background
Optical indicia readers equipped to read one-dimensional or two-dimensional bar code symbols are well known in the art. There are a number of optical character recognition systems on the market as well. In addition, many financial institutions today employ computer-driven signature capture systems. Many of these systems employ monochrome imagers because monochrome imagers are well-suited to read graphical symbols, such as bar codes, OCR symbols, or signatures.
On the other hand, the ability to provide image capture functionality along with indicia reading in one device is very appealing. Currently, optical readers having image capture functionality use monochrome imagers that provide gray scale images. While such devices are useful, gray scale images are less desirable than color images for viewing purposes. The public has come to expect color imaging. Further, monochrome images are often less distinct and not as informative as color images.
Unfortunately, there are problems associated with using color imaging systems to read graphical symbols. The first problem relates to the difficulty of distinguishing bi-tonal indicia in a color image. Because color imagers provide more information than bi-tonal indicia readers can use, color imaging data is often confusing to graphical symbol readers. One way to solve this problem is to convert the color imaging data into gray-scale data. However, commercially available methods for converting color images to gray scale are too slow for high-volume scanning. Thus, an optical reader employing a color imager with a gray scale converter would be slower and more expensive than an optical reader using a monochrome imager because of the additional processing required.
Thus, a need exists for an inexpensive optical reader that is capable of performing color photography and evaluating graphical symbols. This optical reader must be capable of automatically determining whether an image includes a graphical symbol or is merely a color photographic image, and process the acquired color imaging data based on that determination. A need also exists for an optical reader that is able to associate an acquired color image with any subsequent acquired color image.
There is described a device having a two dimensional imager; the device can be a hand held device. Imaging optics can be provided for focusing light reflected from a target onto the two dimensional imager. An image including imaging data can be obtained utilizing the hand held device. The image can include a representation of a signature.
The present invention addresses the needs identified above. The present invention is directed to an inexpensive optical reader that is configured to perform color photography or evaluate graphical symbols. The optical reader of the present invention automatically, or through manual selection, determines whether a captured image is a color photographic image or a color image that includes a graphical symbol. Subsequently, the optical reader of the present invention processes the acquired imaging data in accordance with that determination. The optical reader of the present invention is operative to acquire and associate a plurality of acquired images.
One aspect of the present invention is an optical reader. The optical reader includes a color imaging assembly for acquiring an image of an object, the color imaging assembly generating imaging data corresponding to the image. An image analysis circuit is coupled to the color imaging assembly. The image analysis circuit is configured to determine if the color imaging data includes at least one graphical symbol. The image is classified as a graphical symbol image if the color imaging data includes at least one graphical symbol, or as a color photograph if it does not. A processing circuit is coupled to the image analysis circuit. The processing circuit is operative to process the imaging data based on the determination.
In another aspect, the present invention includes an optical reader for capturing an image of an object. The optical reader includes a color imaging assembly for converting the image of the object into color digital data corresponding to the image.
An automatic mode selection circuit is coupled to the color imaging assembly. The mode selection circuit uses at least a portion of the color digital data to select one of a plurality of operational modes of the optical reader. The operational modes include at least a graphical symbol mode and a color photography mode. A processing circuit is coupled to the mode selection circuit. The processing circuit is configured to process the color digital data based on the selected operational mode.
In another aspect, the present invention includes an optical reader for capturing an image of an object. The optical reader includes a color imaging assembly for capturing the image as color imaging data. A classification circuit is coupled to the color imaging assembly, the classification circuit being configured to process at least a portion of the color imaging data to thereby select one of a plurality of classifications, whereby the image is classified as a color photographic image, or as an image that includes at least one graphical symbol. An automatic mode selector is coupled to the classification circuit, the automatic mode selector being configured to select an optical reader mode in accordance with the selected classification. A processor is coupled to the classification circuit, the processor being programmed to process the color imaging data in accordance with the optical reader mode selected by the automatic mode selector.
In another aspect, the present invention includes an optical reader for capturing an image of an object. The optical reader includes a color imaging assembly for capturing the image as color imaging data. A user mode selector is coupled to the color imaging assembly, the user mode selector being switchable between at least one automatic user mode and a manual user mode for manually selecting one of a plurality of imaging modes of the optical reader, whereby the plurality of imaging modes includes at least one graphical symbol mode and a color photography mode. An automatic imaging mode selector is coupled to the user mode selector and the color imaging assembly, the automatic imaging mode selector being operative to automatically select one of the plurality of imaging modes when in the automatic user mode. A processing circuit is coupled to the user mode selector and the automatic imaging mode selector, the processing circuit being programmed to process the color imaging data based on the selected one of the plurality of imaging modes.
In another aspect, the present invention includes a method for acquiring an image of an object with an optical reader. The method includes: acquiring first color imaging data representing the image; analyzing the color imaging data to provide an image classification, whereby the image is classified as a color photograph, or as including at least one graphical symbol; and processing the color imaging data in accordance with the image classification.
In another aspect, the present invention includes a computer readable medium having computer-executable instructions for performing a method including: acquiring color imaging data; analyzing the color imaging data to provide an image classification, whereby the image is classified as a color photograph, or the image is classified as including at least one graphical symbol; and processing the color imaging data in accordance with the image classification.
In another aspect, the present invention includes an optical reader having a color imaging assembly for acquiring color imaging data, and a graphical user interface including a display and a selection device. In the optical reader, a method for selecting at least one optical reader operating mode includes: displaying at least one icon on the graphical user interface, the at least one icon corresponding to the at least one optical reader operating mode; clicking on the at least one icon with the selection device to thereby select the at least one optical reader operating mode corresponding to the selected at least one icon; and processing the color imaging data based on the selected at least one icon, whereby the color imaging data is processed as a color photographic image, or as an image that includes at least one graphical symbol.
In another aspect, the present invention includes an optical reader having a color imaging assembly for acquiring color imaging data, and a graphical user interface including a display and a selection device. In the optical reader, a method of providing and selecting from a menu on the display includes: retrieving a set of menu entries for the menu, each of the menu entries representing at least one operational mode of the optical reader; displaying the set of menu entries on the display; selecting a menu entry; emitting a menu selection signal indicative of a selected operational mode; and processing the imaging data based on the selected menu entry, whereby the imaging data is processed as a color photographic image or as an image that includes at least one graphical symbol.
In another aspect, the present invention includes a method for acquiring an image of an object with an optical reader. The method includes: providing a color imaging assembly; converting the image into color imaging data; classifying the image as either a color photograph, or as a color image that includes at least one graphical symbol; and processing the color imaging data in accordance with the step of classifying.
In another aspect, the present invention includes a method for acquiring an image of an object with an optical reader. The optical reader has a plurality of imaging modes including at least one graphical symbol mode, and a color photography mode. The method includes: capturing the image by acquiring color imaging data; analyzing at least a portion of the color imaging data to provide an image classification, whereby the image classification includes at least one graphical symbol classification and a color photography classification; automatically selecting one of a plurality of image processing modes based on the image classification provided in the step of analyzing; and processing the color imaging data based on the selected one of the plurality of image processing modes.
In another aspect, the present invention includes a method for acquiring an image of an object with an optical reader. The optical reader has a plurality of imaging modes including at least one graphical symbol mode, and a color photography mode. The method includes: capturing the image by acquiring color imaging data; automatically selecting one of the plurality of imaging modes based on an analysis of the color imaging data; and processing the color imaging data in accordance with a selected one of the plurality of imaging modes.
In another aspect, the present invention includes a system for processing at least one image. The system includes at least one network element. The system includes an optical reader including a color imager and a processor. The color imager is configured to capture the at least one image by generating color imaging data corresponding to the at least one image. The processor is configured to provide a classification of the color imaging data based on whether the color imaging data includes at least one graphical symbol. The processor is programmed to process the color imaging data in accordance with the classification. A network is coupled to the color optical reader and the at least one network element, whereby processed image data is transmitted between the network and the at least one network element.
Additional features and advantages of the invention will be set forth in the detailed description which follows, and in part will be readily apparent to those skilled in the art from that description, the claims, and the appended drawings, or will be recognized by practicing the invention as described herein.
It is to be understood that the description herein is merely exemplary of the invention, and is intended to provide an overview or framework for understanding the nature and character of the invention as it is claimed. The accompanying drawings are included to provide a further understanding of the invention, and are incorporated in and constitute a part of this specification. The drawings illustrate various embodiments of the invention, and together with the description serve to explain the principles and operation of the invention.
Reference will now be made in detail to the present exemplary embodiments of the invention, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts. An exemplary embodiment of the optical reader of the present invention is shown in
In accordance with the invention, the optical reader of the present invention includes a color imaging assembly for acquiring color imaging data. An image analysis circuit determines if the acquired image includes at least one graphical symbol. A processing circuit processes the imaging data based on the determination of whether the image includes at least one graphical symbol. The present invention allows a user to read graphical symbols, such as bar codes, text, OCR characters or signatures using a color imager. The color optical reader of the present invention is configured to automatically determine whether a color image includes a graphical symbol, or is merely a color photographic image. The optical reader of the present invention also is operative to associate one acquired image with at least one subsequently acquired image.
As embodied herein, and depicted in
As embodied herein and depicted in
Optical reader 10 also includes processor 40. In the embodiment depicted in
Illumination optics 22 may be of any suitable type, but there is shown by way of example a lens system for directing light from light source 24 towards target T. It will be apparent to those of ordinary skill in the pertinent art that modifications and variations can be made to illumination optics 22 of the present invention depending on the complexity of the target illumination. For example, illumination optics 22 may include one or more lenses, diffusers, wedges, reflectors or a combination of these elements. In one embodiment, illumination optics 22 produces an aiming pattern on target T.
Light source 24 may be of any suitable type, but there is shown by way of example a plurality of white LEDs. It will be apparent to those of ordinary skill in the pertinent art that modifications and variations can be made to light source 24 of the present invention depending on the application. For example, illumination assembly 20 may be eliminated altogether if it is certain that the ambient light level will be high enough to obtain high quality color images. In another embodiment, red LEDs are employed instead of the white LEDs.
Color imager 34 may be of any suitable type, but there is shown by way of example, a CMOS color imager having a 640×480 pixel resolution. It will be apparent to those of ordinary skill in the pertinent art that modifications and variations can be made to color imager 34 of the present invention depending on cost and the resolution required by optical reader 10. In another embodiment, color imager 34 has 800×600 pixels. A typical VGA resolution of 640×480 pixels is adequate for displaying color images on an LCD or a computer monitor. In one megapixel embodiment, color imager 34 has 1156×864 pixels (almost 1-million pixels). In yet another embodiment, color imager 34 includes 1536×1024 pixels. One of ordinary skill in the art will recognize that as the resolution of imager 34 increases, so will the cost. In another embodiment, color imager 34 is implemented by scanning a linear CCD array. In other embodiments, color imager 34 is implemented using an area CCD solid state image sensor.
Processor 40 may be of any suitable type, but there is shown by way of example a processor which includes microprocessor 42 and ASIC 44 coupled to system bus 52. In one embodiment, microprocessor 42 and ASIC 44 are programmable control devices that receive, process, and output data in accordance with an embedded program stored in EROM 48. As discussed above, microprocessor 42 and ASIC 44 are connected to system bus 52, which includes address, data, and control lines.
In the embodiment depicted in
In the embodiment depicted in
It will be apparent to those of ordinary skill in the pertinent art that modifications and variations can be made to the memory configuration of the present invention depending on cost and flexibility considerations. For example, in one embodiment, EROM 48 is implemented using EPROMs or E2PROMs. In yet another embodiment, FLASH memory is employed. RAM 46 typically includes at least one volatile memory device, and in some embodiments includes one or more long term non-volatile memory devices.
It will be apparent to those of ordinary skill in the pertinent art that modifications and variations can be made to I/O unit 50 of the present invention depending on the application and work environment. Embodiments of I/O unit 50 include an RS-232 interface, a LAN interface, a PAN interface, a serial bus such as USB, an internet interface, and a wireless interface.
External interface 56 is used to transmit a discrete signal to control a peripheral device. Typically, the peripheral is an external illuminator. The external illuminator is used in place of light source 24.
It will be apparent to those of ordinary skill in the pertinent art that modifications and variations can be made to the operating system employed by optical reader 10 depending on the applications and desired operating environment. In one embodiment, a WindowsCE operating system is employed. In other embodiments, LINUX or PalmOS operating systems are employed. As a non-limiting example, application programs can be written using C, C++, Visual Basic, or Visual C++. Other languages can be used as well, depending on the application program. In other embodiments, optical reader 10 does not employ an operating system. For example, the simple reader depicted in
As embodied herein and depicted in
In the automatic imaging mode, processor 40 is programmed to analyze the color imaging data to determine if an acquired image includes a graphical symbol or is merely a color photographic image. If it makes the determination that the color image includes a graphical symbol, it further analyzes the acquired image and classifies it as a bar code, OCR symbol, text, or a signature. Based on the classification, optical reader 10 jumps to the appropriate routine in EROM 48. The semi-automatic mode is similar. Thus, in the automatic or semi-automatic modes, the bar code scanning mode, the OCR/text mode, the signature capture mode, the color photography mode, and the association mode are controlled by the application program, not by the user.
However, the user may manually select any of the above listed modes. If the user clicks on bar code scanning icon 658, the bar code scanning application program will run. In this application program, the user may select between a 1D bar code mode, a 2D bar code mode, or an autodiscrimination mode. Further, the user can manually select and de-select the types of bar codes that optical reader 10 is enabled to read.
The user may also click on OCR/Text icon 660. Clicking icon 660 provides the user with a check validation mode, a text scanning mode, or a bi-tonal image capture mode. The check validation mode is performed in conjunction with network services.
Clicking on icon 662 provides the user with a signature capture mode. In one embodiment, this mode includes a signature verification program wherein the user may select between a static verification or a dynamic verification. In the static mode, the user captures the image of a signature. The captured image is compared with a reference image stored in a remote database. In the dynamic mode, optical reader 10 uses the stylus and signature block to capture the signature. In this mode, signature block 62 measures unique dynamic parameters, such as applied pressure, direction and timing of movements, or a combination of these parameters. One of ordinary skill in the art will recognize that this list is not meant to be all-inclusive, but rather, is a representative example. The captured dynamic parameters are compared with reference data stored in a remote database.
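As a hedged illustration of the dynamic comparison (the specification does not give a matching algorithm; the function name and tolerance below are hypothetical), the captured pressure and timing samples could be compared against stored reference samples within a relative tolerance:

```python
def dynamic_match(captured, reference, tolerance=0.1):
    # Compare captured dynamic parameters (e.g., applied pressure or timing
    # samples) against a stored reference, allowing a relative tolerance.
    # The sample counts must agree for a meaningful comparison.
    if len(captured) != len(reference):
        return False
    return all(abs(c - r) <= tolerance * max(abs(r), 1e-9)
               for c, r in zip(captured, reference))
```

In a networked embodiment, such a comparison would run on the remote host rather than on the reader itself; the sketch only illustrates the matching step.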
The user selects the color photography mode by clicking on icon 664. This mode allows the user to select an automatic imaging mode wherein optical reader 10 makes the imaging adjustments (e.g., exposure, etc.) or a manual mode that allows the user to adjust imager settings as he pleases.
In another embodiment, display 60 provides the user with a menu listing the main modes of optical reader 10. The user employs keypad 16 to select the desired mode. A cursor key is employed to highlight any of the modes listed above. Upon pressing the enter key, processor 40 jumps to the appropriate routine stored in EROM 48. As discussed above, a user may select between an Automatic Imaging mode, a Semi-Automatic Imaging mode, a bar code scanning mode, an OCR/text mode, a signature capture mode, a color photography mode, or an association mode.
As embodied herein and depicted in
In another embodiment, step 410 is performed by considering all of the pixel values. However, the interpretation of the pixel's value is adjusted based on whether it is a red, green, or blue pixel. In another embodiment, processor 40 creates a gray-scale image to determine whether the image includes a graphical symbol.
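As an illustrative sketch only (the specification does not prescribe an algorithm, and all function names and thresholds here are hypothetical), the pre-classification techniques described above, namely evaluating only green pixel values, aggregating each red/green/blue triplet into a super-pixel, or converting to gray scale, and then testing whether the result is essentially bi-tonal, might look like the following:

```python
def to_gray(pixels):
    # Aggregate each (R, G, B) triplet into a single "super-pixel" value.
    return [(r + g + b) / 3.0 for (r, g, b) in pixels]

def green_only(pixels):
    # Use only the green channel, which carries most luminance information.
    return [g for (_, g, _) in pixels]

def looks_like_graphical_symbol(values, low=64, high=192, bitonal_fraction=0.9):
    # Bar codes, OCR characters, text, and signatures are essentially
    # bi-tonal, so most samples should lie near black or near white.
    bitonal = sum(1 for v in values if v < low or v > high)
    return bitonal / len(values) >= bitonal_fraction
```

A bar-code-like image yields mostly extreme values and would be routed to the graphical symbol branch, while a continuous-tone photograph would not. The thresholds shown are placeholders, not values taken from the specification.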
If in step 410 processor 40 determines that there is no graphical symbol present in the image, the user is asked in step 432 if he desires to store the image. If so, the color photographic image is stored in memory in step 434. If processor 40 determines that the image includes a graphical symbol, the process flow moves on to step 418. In this step, processor 40 strikes scanning lines to locate bar code symbol identifiers. If processor 40 determines that the graphical symbol is a bar code symbol it attempts to decode the symbol in step 436. If the decoding is successful, the symbol may be a menu symbol or a data symbol. If it is a data symbol, the decoded value of the bar code symbol is output to the display. If it is a menu symbol, a menuing routine is executed. The menu symbol is discussed in more detail below.
If processor 40 does not locate a bar code symbol, it moves on to step 420 and looks for OCR-A or OCR-B characters. If it finds these characters, it performs optical character recognition in step 422. If it does not, processor 40 evaluates the image for the presence of text. If text is located, the image is cropped, and the text is compressed and stored in steps 428 and 430. If the image does not include text, processor 40 evaluates the image for the presence of a signature. If one is present, the image is cropped, and the data is compressed and stored in steps 428 and 430. In another embodiment, optical reader 10 is networked, and processor 40 communicates with remote network resources to provide signature verification services. If processor 40 cannot detect a bar code symbol, OCR symbols, text, or a signature, the user is asked in step 432 if he desires to store the image. If he does, the color photographic image is stored in memory in step 434.
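The decision cascade above amounts to trying each graphical-symbol interpretation in a fixed order and falling back to color photography. A minimal sketch, with detector predicates supplied by the caller (the function name and any predicates used with it are hypothetical, not the decoding methods of the invention):

```python
def classify_image(image, detectors):
    # detectors: ordered (label, predicate) pairs, tried in the same order
    # as the flow above: bar code, then OCR, then text, then signature.
    for label, predicate in detectors:
        if predicate(image):
            return label
    # Nothing graphical was found; treat the image as a color photograph.
    return "color_photograph"
```

The ordering matters: an image containing both a bar code and text is handled by the first detector that fires, mirroring the step-by-step flow of the automatic mode.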
As embodied herein and depicted in
In step 510, processor 40 determines if the captured image includes a graphical symbol. Step 510 in the semi-automatic mode is identical to step 410 in the automatic mode. If processor 40 determines that the captured image does not include a graphical symbol, processor 40 asks the user whether to store the color image. If so, the color image is stored in step 514. In step 516, a prompt asks the user whether to associate the color image with another image. This step is not performed in the automatic mode. In step 518, if the user answers in the affirmative, the association is made and the processing flow returns to step 508.
In steps 520, 522, 526, and 532, the user is given the opportunity to select the type of graphical imaging that is to be performed. The method for performing OCR, text capture, and signature capture and/or verification are discussed above in the automatic mode description with one difference. In the semi-automatic mode, the user is asked in step 538 if he desires to associate the processed image with a subsequent captured image. If so, process flow is directed back to step 508 and another image is captured and displayed. The association feature can be used several times to associate multiple images.
If the user indicates that it is a bar code, an attempt is made to decode the symbol in step 540. If the decoding attempt is successful, processor 40 determines in step 544 if the symbol is a menu symbol. If it is not a menu symbol, processor 40 displays the decoded bar code information on display 60. If it is a menu symbol, processor 40 executes the appropriate menu routine in step 546. In steps 552 to 564, processor 40 may continue to capture images if the trigger is continuously pulled. In step 562, the user is asked if he desires to associate the decoded bar code with another image. If so, the program flow is directed back to step 508 and another image is captured and displayed. Processor 40 links this image to the decoded bar code information.
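The association feature can be thought of as maintaining linked groups of captured items, where one processed item is joined by any number of subsequently captured images. A hypothetical sketch (the specification does not define a data structure for associations):

```python
class AssociationLog:
    # Each group links one processed item (e.g., a decoded bar code) with
    # any number of subsequently captured images.
    def __init__(self):
        self.groups = []

    def start_group(self, item):
        # Begin a new associated group and return its identifier.
        self.groups.append([item])
        return len(self.groups) - 1

    def associate(self, group_id, item):
        # Link a subsequently captured item into an existing group.
        self.groups[group_id].append(item)
```

Because the user may answer the association prompt several times, one decoded symbol can accumulate several linked images, matching the repeated use of the feature described above.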
As embodied herein and depicted in
As embodied herein and depicted in
As embodied herein and depicted in
As embodied herein and depicted in
As embodied herein and depicted in
As embodied herein and depicted in
As embodied herein and depicted in
As embodied herein and depicted in
As embodied herein and depicted in
As embodied herein and depicted in
As embodied herein and depicted in
Packet 1600, packet 1602, and packet 1604 as described above may be of any suitable type, and are representative examples of one embodiment of the present invention. One of ordinary skill in the art will recognize that the packets may be implemented in a variety of ways.
As embodied herein and depicted in
Steps 1708 and 1710 are performed using the wireless system 1400 described above. In other embodiments these steps are performed by a wireline system. For example, in one embodiment, optical reader 10 is coupled to a host computer via an RS-232 or USB link. In another embodiment, optical reader 10 is connected to a host computer via a LAN. One of ordinary skill in the art will recognize that the present invention should not be construed as being limited by these examples.
In steps 1712 and 1714, processor 40 initializes a counter and begins waiting for a reply from the host computer. In steps 1714-1718, if the reply is not received within time limit TL, the counter CNTR is incremented and the message is re-transmitted. After several attempts, if CNTR>N (N being an integer), processor 40 outputs a fault message. If the reply message is received within time limit TL, processor 40 interprets the reply in step 1722. If the extracted data and the signature match information stored in the database accessible by the host computer, an approval message is displayed. If the extracted data and the signature do not match information stored in the database accessible by the host computer, a disapproval message is displayed. The dynamic signature verification embodiment is similar to the static embodiment described immediately above. In the dynamic version, the user provides his signature using stylus 18 and signature block 62, as shown in
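The retry logic described above amounts to a bounded re-transmission loop. A sketch under stated assumptions (function names hypothetical; the reply check and fault handling are simplified):

```python
def transmit_with_retry(send, wait_for_reply, time_limit, max_attempts):
    # send: transmits the message to the host.
    # wait_for_reply: returns the reply, or None if time_limit (TL) expires.
    # The loop index plays the role of CNTR, and max_attempts the role of N.
    for _attempt in range(max_attempts):
        send()
        reply = wait_for_reply(time_limit)
        if reply is not None:
            return reply
    return "FAULT"  # no reply after max_attempts tries
```

A real implementation would distinguish approval and disapproval replies and drive the display accordingly; the sketch shows only the counter and timeout mechanics.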
As embodied herein and depicted in
One of ordinary skill in the art will recognize that network 1810 may be of any suitable type depending on the application, but there is shown by way of example the Internet. However, the present invention should not be construed as being limited to this example. In another embodiment, network 1810 is a private network. Those of ordinary skill in the art will also recognize that network 1810 is a wireline network in one embodiment, and a wireless network in another embodiment. Network 1810 may include circuit switched networks, IP networks, or both.
LAN 1820 includes server 1822, computer 1824, database 1826, and a plurality of optical readers 10. Database 1826 is used to store associated images along with other data fields. For example, it would be rather useful to store additional information with the associated images shown in
Network servicing center 1830 is coupled to network 1810 via interface 1844. Center 1830 also includes server 1832, computer 1834, database 1836, signature verification module 1838, and authentication module 1840, coupled together via a LAN. Center 1830 accommodates any number of useful application programs 1842.
PAN 1850 includes at least one color optical reader 10 coupled to point-of-sale (POS) terminal 1854. POS terminal 1854 is coupled to network 1810 via interface 182. POS terminal 1854 includes a credit card reader and a signature capture block. In the scenario depicted in
The present invention relates to an optical reader that includes a color imaging assembly that generates color imaging data. An image analysis circuit determines if the acquired image should be characterized as a color photograph or as including a graphical symbol. A processing circuit processes the imaging data based on the image analysis circuit's determination of whether the image is a graphical symbol or a color photograph. The present invention allows a user to acquire and process both color images and graphical symbols, such as bar codes, text, OCR symbols or signatures. The optical reader of the present invention is also configured to associate an acquired image with at least one other acquired image.
A small sampling of the systems, methods, and apparatus that are described and defined herein is as follows:
A1. An optical reader comprising: a color imaging assembly for acquiring an image of an object, the color imaging assembly generating color imaging data corresponding to the image; an image analysis circuit coupled to the color imaging assembly, the image analysis circuit being configured to determine if the color imaging data includes at least one graphical symbol, whereby the image is classified as a graphical symbol image if the color imaging data includes at least one graphical symbol, or the image is classified as a color photograph if the color imaging data does not include at least one graphical symbol; and a processing circuit coupled to the image analysis circuit, the processing circuit being operative to process the color imaging data based on the classification of the image. A2. The optical reader of claim A1, wherein the processing circuit decodes a 1D bar code symbol based on the classification. A3. The optical reader of claim A1, wherein the processing circuit decodes a 2D bar code symbol based on the classification. A4. The optical reader of claim A1, wherein the processing circuit performs optical character recognition based on the classification. A5. The optical reader of claim A1, wherein the processing circuit performs a signature capture based on the classification. A6. The optical reader of claim A1, wherein the processing circuit stores a color image based on the classification. A7. The optical reader of claim A1, wherein a portion of the color imaging data is processed by evaluating only green pixel values in the color imaging data. A8. The optical reader of claim A1, wherein the image analysis circuit aggregates values of a red, blue and green triplet to form a super-pixel in the process of classifying the image. A9. The optical reader of claim A1, wherein the color imaging data is converted into a gray scale image in the process of classifying the image. A10. The optical reader of claim A1, further comprising an illumination light source including white LEDs. A11. The optical reader of claim A1, further comprising an illumination light source including red LEDs.
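Paragraphs A7 through A9 name three ways of reducing color imaging data before the graphical-symbol determination: evaluating only green pixels, aggregating each red/green/blue triplet into a super-pixel, and converting to gray scale. The document specifies no algorithm, so the following is only a rough Python sketch under those readings; the bimodality test `looks_bitonal` is a hypothetical stand-in for the classification step, assuming 8-bit RGB data:

```python
import numpy as np

def green_channel_gray(rgb):
    """Gray estimate using only the green pixel values (cf. A7)."""
    return rgb[..., 1]

def super_pixel_gray(rgb):
    """Aggregate each red, green, and blue triplet into one
    super-pixel value (cf. A8)."""
    # Sum in uint16 to avoid 8-bit overflow, then average the triplet.
    return (rgb.astype(np.uint16).sum(axis=-1) // 3).astype(np.uint8)

def looks_bitonal(gray, spread=100):
    """Hypothetical test: bi-tonal indicia (bar codes, OCR characters,
    signatures) tend to span the tonal range, while many photographs
    cluster more narrowly."""
    lo, hi = np.percentile(gray, [5, 95])
    return (hi - lo) >= spread
```

Either reduction yields a single-channel image cheap enough to test before committing to full decoding or photographic storage.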
B1. An optical reader for capturing an image of an object, the optical reader comprising: a color imaging assembly for converting the image of the object into color digital data corresponding to the image; an automatic mode selection circuit coupled to the color imaging assembly, the mode selection circuit using at least a portion of the color digital data to select one of a plurality of operational modes of the optical reader, the operational modes including at least a graphical symbol mode and a color photography mode; and a processing circuit coupled to the mode selection circuit, the processing circuit being configured to process the color digital data based on the selected operational mode. B2. The optical reader of claim B1, wherein the at least one graphical symbol mode includes decoding a 1D bar code. B3. The optical reader of claim B1, wherein the at least one graphical symbol mode includes decoding a 2D bar code. B4. The optical reader of claim B1, wherein the at least one graphical symbol mode includes optical character recognition. B5. The optical reader of claim B1, wherein the at least one graphical symbol mode includes capturing a signature. B6. The optical reader of claim B1, wherein the color photography mode includes storing a color photographic image in a computer-readable medium. B7. The optical reader of claim B1, further comprising an illumination light source including white LEDs. B8. The optical reader of claim B1, further comprising an illumination light source including red LEDs.
C1. An optical reader for capturing an image of an object, the optical reader comprising: a color imaging assembly for capturing the image as color imaging data; a classification circuit coupled to the color imaging assembly, the classification circuit being configured to process at least a portion of the color imaging data to thereby select one of a plurality of classifications, whereby the image is classified as a color photographic image, or as an image that includes at least one graphical symbol; an automatic mode selector coupled to the classification circuit, the automatic mode selector being configured to select an optical reader mode in accordance with the selected classification; and a processor coupled to the classification circuit, the processor being programmed to process the color imaging data in accordance with the optical reader mode selected by the automatic mode selector. C2. The optical reader of claim C1, wherein the portion of the color imaging data is processed by evaluating only green pixel values in the color imaging data. C3. The optical reader of claim C1, wherein the classification circuit aggregates values of a red, blue and green triplet to form a super-pixel in the process of selecting one of a plurality of classifications. C4. The optical reader of claim C1, wherein the color imaging data is converted into a gray scale image in the process of selecting one of a plurality of classifications. C5. The optical reader of claim C1, wherein the processor decodes a 1D bar code symbol. C6. The optical reader of claim C1, wherein the processor decodes a 2D bar code symbol. C7. The optical reader of claim C1, wherein the processor performs an optical character recognition process. C8. The optical reader of claim C1, wherein the processor performs a signature capture process. C9. The optical reader of claim C1, wherein the processor stores a color image in a computer-readable medium.
D1. An optical reader for capturing an image of an object, the optical reader comprising: a color imaging assembly for capturing the image as color imaging data; a user mode selector coupled to the color imaging assembly, the user mode selector being switchable between at least one automatic user mode, or a manual user mode for manually selecting one of a plurality of imaging modes of the optical reader, whereby the plurality of imaging modes includes at least one graphical symbol mode and a color photography mode; an automatic imaging mode selector coupled to the user mode selector and the color imaging assembly, the automatic imaging mode selector being operative to automatically select one of the plurality of imaging modes when in the automatic user mode; and a processing circuit coupled to the user mode selector and the automatic imaging mode selector, the processing circuit being programmed to process the color imaging data based on the selected one of the plurality of imaging modes. D2. The optical reader of claim D1, wherein the plurality of imaging modes includes a 1D bar code decoding mode. D3. The optical reader of claim D1, wherein the plurality of imaging modes includes a 2D bar code decoding mode. D4. The optical reader of claim D1, wherein the plurality of imaging modes includes an optical character recognition mode. D5. The optical reader of claim D1, wherein the plurality of imaging modes includes a signature capture mode. D6. The optical reader of claim D1, wherein the plurality of imaging modes includes a color photography mode.
E1. A method for acquiring an image of an object with an optical reader, the method comprising: acquiring first color imaging data representing the image; analyzing the color imaging data to provide an image classification, whereby the image is classified as a color photograph, or as including at least one graphical symbol; and processing the color imaging data in accordance with the image classification. E2. The method of claim E1, wherein the step of processing includes decoding a 1D bar code. E3. The method of claim E1, wherein the step of processing includes decoding a 2D bar code. E4. The method of claim E1, wherein the step of processing includes an optical character recognition process. E5. The method of claim E1, wherein the step of processing includes capturing a signature. E6. The method of claim E1, wherein the step of processing includes storing a color photographic image in a computer-readable medium. E7. The method of claim E1, wherein the step of analyzing includes an analysis of only one color of the color imaging data during the step of providing an image classification. E8. The method of claim E1, further comprising: acquiring at least one second color imaging data representing at least one second image; analyzing the at least one second color imaging data to provide at least one second image classification, whereby the at least one second image is classified as a color photograph, or as an image including at least one graphical symbol; processing the at least one second color imaging data in accordance with the at least one second image classification; and associating the at least one second color imaging data with the first color imaging data. E9. The method of claim E8, wherein the step of associating includes displaying the at least one second color imaging data with the first color imaging data. E10. The method of claim E9, wherein the step of associating includes electronically displaying the at least one second color imaging data with the first color imaging data. E11. The method of claim E8, wherein the step of associating includes printing the at least one second color imaging data with the first color imaging data. E12. The method of claim E8, wherein the step of associating includes linking the at least one second color imaging data with the first color imaging data in memory. E13. The method of claim E8, wherein the step of associating includes storing the at least one second color imaging data with the first color imaging data as a record in a database.
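Paragraphs E12 and E13 describe associating a first and second acquisition by linking them in memory or storing them as one database record. A minimal sketch of the database variant, assuming an SQLite store; the table and column names are illustrative only and do not come from this document:

```python
import sqlite3

def store_associated(db, first_image, second_image, classification):
    """Store a first and a second acquisition as one linked record
    (cf. E12/E13). Schema is illustrative, not from the document."""
    db.execute(
        "CREATE TABLE IF NOT EXISTS captures ("
        "id INTEGER PRIMARY KEY, "
        "first_image BLOB, second_image BLOB, classification TEXT)"
    )
    cur = db.execute(
        "INSERT INTO captures (first_image, second_image, classification) "
        "VALUES (?, ?, ?)",
        (first_image, second_image, classification),
    )
    db.commit()
    return cur.lastrowid
```

Keeping both acquisitions in a single row is one simple way to make the association durable, so that, for example, a color photograph of a package can later be retrieved alongside its decoded bar code.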
F1. A computer readable medium having computer-executable instructions for performing a method comprising: acquiring color imaging data; analyzing the color imaging data to provide an image classification, whereby the image is classified as a color photograph, or the image is classified as including at least one graphical symbol; and processing the color imaging data in accordance with the image classification.
G1. In an optical reader having a color imaging assembly for acquiring color imaging data, and a graphical user interface including a display and a selection device, a method for selecting at least one optical reader operating mode, the method comprising: displaying at least one icon on the graphical user interface, the at least one icon corresponding to the at least one optical reader operating mode; clicking on the at least one icon with the selection device to thereby select the at least one optical reader operating mode corresponding to the selected at least one icon; and processing the color imaging data based on the selected at least one icon, whereby the color imaging data is processed as a color photographic image, or as an image that includes at least one graphical symbol.
H1. In an optical reader having a color imaging assembly for acquiring color imaging data, and a graphical user interface including a display and a selection device, a method of providing and selecting from a menu on the display, the method comprising: retrieving a set of menu entries for the menu, each of the menu entries representing at least one operational mode of the optical reader; displaying the set of menu entries on the display; selecting a menu entry; emitting a menu selection signal indicative of a selected operational mode; and processing the color imaging data based on the selected menu entry, whereby the color imaging data is processed as a color photographic image or as an image that includes at least one graphical symbol.
I1. A method for acquiring an image of an object with an optical reader, the method comprising: providing a color imaging assembly; converting the image into color imaging data; classifying the image as either a color photograph, or as a color image that includes at least one graphical symbol; and processing the color imaging data in accordance with the step of classifying.
J1. A method for acquiring an image of an object with an optical reader, the optical reader having a plurality of imaging modes including at least one graphical symbol mode, and a color photography mode, the method comprising: capturing the image by acquiring color imaging data; analyzing at least a portion of the color imaging data to provide an image classification, whereby the image classification includes at least one graphical symbol classification and a color photography classification; automatically selecting one of a plurality of image processing modes based on the image classification provided in the step of analyzing; and processing the color imaging data based on the selected one of the plurality of image processing modes.
K1. A method for acquiring an image of an object with an optical reader, the optical reader having a plurality of imaging modes including at least one graphical symbol mode, and a color photography mode, the method comprising: capturing the image by acquiring color imaging data; automatically selecting one of the plurality of imaging modes based on an analysis of the color imaging data; and processing the color imaging data in accordance with a selected one of the plurality of imaging modes.
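The automatic selection recited in J1 and K1 amounts to classify-then-dispatch: the classification of the acquired data picks the imaging mode, and the mode determines the processing. A minimal sketch under that reading; the mode names and handler functions are hypothetical, since the document names the modes but no dispatch mechanism:

```python
from typing import Callable, Dict

# Hypothetical per-mode handlers standing in for decoding and storage.
def decode_symbol(data: bytes):
    return ("graphical_symbol", data)

def store_photograph(data: bytes):
    return ("color_photography", data)

MODE_HANDLERS: Dict[str, Callable] = {
    "graphical_symbol": decode_symbol,
    "color_photography": store_photograph,
}

def process(color_imaging_data: bytes, classify: Callable[[bytes], str]):
    """Automatically select an imaging mode from the classification of the
    acquired data, then process the data in that mode (cf. J1/K1)."""
    mode = classify(color_imaging_data)
    return MODE_HANDLERS[mode](color_imaging_data)
```

A table of handlers keyed by classification keeps the mode set extensible, mirroring how D2 through D6 enumerate additional imaging modes without changing the selection step.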
L1. A system for processing at least one image, the system including at least one network element, the system comprising: an optical reader including a color imager and a processor, the color imager being configured to capture the at least one image by generating color imaging data corresponding to the at least one image, the processor being configured to provide a classification of the color imaging data based on whether the color imaging data includes at least one graphical symbol, the processor being programmed to process the color imaging data in accordance with the classification; and a network coupled to the optical reader and the at least one network element, whereby processed image data is transmitted between the network and the at least one network element. L2. The system of claim L1, wherein the network includes the Internet. L3. The system of claim L1, wherein the network includes a wireless network. L4. The system of claim L1, wherein the network includes a circuit switched network. L5. The system of claim L1, wherein the network includes an IP network. L6. The system of claim L1, wherein the network includes a private network. L7. The system of claim L1, wherein the network element includes a LAN. L8. The system of claim L7, wherein the LAN further comprises: a server coupled to the network; and at least one optical reader coupled to the server. L9. The system of claim L8, wherein the at least one optical reader includes a color imager. L10. The system of claim L7, wherein the LAN includes a database, the database being configured to store a plurality of associated processed images. L11. The system of claim L10, wherein the plurality of associated processed images includes a color photographic image associated with decoded bar code data. L12. The system of claim L10, wherein the plurality of associated processed images includes a color photographic image associated with decoded OCR data. L13. The system of claim L10, wherein the plurality of associated processed images includes a color photographic image associated with decoded text data. L14. The system of claim L10, wherein the plurality of associated processed images includes a color photographic image associated with a captured signature. L15. The system of claim L10, wherein the plurality of associated processed images includes decoded bar code data. L16. The system of claim L10, wherein the plurality of associated processed images includes decoded OCR data. L17. The system of claim L10, wherein the plurality of associated processed images includes decoded text data. L18. The system of claim L10, wherein the plurality of associated processed images includes a captured signature. L19. The system of claim L7, wherein the LAN includes a POS terminal. L20. The system of claim L7, wherein the LAN includes a credit card authentication module. L21. The system of claim L7, wherein the LAN includes a signature verification module. L22. The system of claim L1, wherein the network element includes a PAN, the PAN having at least one optical reader coupled thereto. L23. The system of claim L22, wherein the at least one optical reader includes a color imager. L24. The system of claim L22, wherein the PAN includes a POS terminal. L25. The system of claim L1, wherein the network element further comprises: a wireless base station coupled to the network, the wireless base station being configured to transmit and receive processed image data to and from the network; and at least one wireless optical reader coupled to the wireless base station via an RF communications link. L26. The system of claim L25, wherein the at least one wireless optical reader includes a color imager. L27.
The system of claim L1, wherein the processor further comprises an image analysis circuit coupled to the color imager, the image analysis circuit being configured to determine if the color imaging data includes at least one graphical symbol, whereby the image is classified as a graphical symbol image if the color imaging data includes at least one graphical symbol, or the image is classified as a color photograph if the color imaging data does not include at least one graphical symbol. L28. The system of claim L1, wherein the processor further comprises an automatic mode selection circuit coupled to the color imager, the automatic mode selection circuit using at least a portion of the color imaging data to select one of a plurality of operational modes of the optical reader, the operational modes including at least a graphical symbol mode and a color photography mode. L29. The system of claim L1, wherein the processor further comprises: a classification circuit coupled to the color imager, the classification circuit being configured to process at least a portion of the color imaging data to thereby select one of a plurality of classifications, whereby the image is classified as a color photographic image, or as an image that includes at least one graphical symbol; and an automatic mode selector coupled to the classification circuit, the automatic mode selector being configured to select an optical reader mode in accordance with the selected one of a plurality of classifications. L30. The system of claim L1, wherein the optical reader further comprises: a user mode selector coupled to the color imager, the user mode selector being switchable between at least one automatic user mode, or a manual user mode for manually selecting one of a plurality of imaging modes of the optical reader, whereby the plurality of imaging modes includes at least one graphical symbol mode and a color photography mode; and an automatic imaging mode selector coupled to the user mode selector and the color imager, the automatic imaging mode selector being operative to automatically select one of the plurality of imaging modes when in the automatic user mode.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit and scope of the invention. Thus, it is intended that the present invention cover the modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.
The present application is a continuation application of U.S. patent application Ser. No. 12/188,804 filed Aug. 8, 2008, which is a divisional application of U.S. patent application Ser. No. 11/592,636 filed Nov. 3, 2006 and issued as U.S. Pat. No. 7,413,127 on Aug. 19, 2008, which is a divisional of U.S. patent application Ser. No. 10/764,741 filed on Jan. 26, 2004 and issued as U.S. Pat. No. 7,287,697 on Oct. 30, 2007, which is a continuation of U.S. patent application Ser. No. 09/904,697 filed Jul. 13, 2001 and issued as U.S. Pat. No. 6,722,569 on Apr. 20, 2004. The benefit of priority under 35 U.S.C. §120 of each of the above applications is claimed and each of the above applications is incorporated herein by reference in its entirety.
Number | Date | Country
---|---|---
20100176201 A1 | Jul 2010 | US

Relation | Number | Date | Country
---|---|---|---
Parent | 12188804 | Aug 2008 | US
Child | 12748076 | | US
Parent | 11592636 | Nov 2006 | US
Child | 12188804 | | US
Parent | 10764741 | Jan 2004 | US
Child | 11592636 | | US
Parent | 09904697 | Jul 2001 | US
Child | 10764741 | | US