The invention relates to an image capture apparatus and method.
Currently available image sensor based optical readers include circuitry which (1) captures a frame of image data into a decoding buffer memory location, (2) attempts to decode a bar code symbol or OCR decodable text message represented in the frame of image data, and (3) outputs a decoded-out message corresponding to a decodable indicia represented in the frame of image data.
In these readers there is no further attempt to decode a message encoded in symbol or text characters represented in the frame of image data. When decoding fails using such a device, the reader captures another frame of image data, attempts to decode it, and continues capturing frames of image data and attempting to decode image data until a trigger of the reader is released or until a symbol is successfully decoded. If the symbol or text string is otherwise decodable but the reader is not configured to read the symbol or OCR text string in the field of view of the reader, another optical reader must be utilized to decode the decodable symbol or text string. Decodable symbols and decodable text characters are referred to generically herein as “decodable indicia.”
Another problem noted with use of optical readers is fraud. Bar code symbols are now used for identifying a wide range of products and other items including retail items, shipping containers, U.S. patents and personal identification cards. The increased use of bar code symbols and decodable text characters has made decodable symbols and text characters the target of fraud perpetrators. A common fraud scheme perpetrated in connection with decodable indicia is transposition. In a transposition fraud scheme a decodable indicia is taken from one item (such as a retail product of lower value) and transposed on another item (such as an item of higher value). Unfortunately, presently available optical readers are not equipped to detect when such transposition fraud schemes have taken place. Especially in environments where the decoding of symbols and text characters is highly automated, transposition and other fraud schemes related to bar code use go undetected.
There is a need for an optical reader which is better equipped to read obscure or otherwise hard to read symbols or text characters and which is better equipped for detecting fraud.
The preferred embodiment of the invention will now be described, by way of example only, with reference to the accompanying figures wherein like members bear like reference numerals and wherein:
FIGS. 1a-1b show a reader according to the invention;
FIGS. 2a-2d show alternative embodiments of optical reading imaging devices in which the invention may be incorporated;
FIGS. 3a-3e show alternative electronic hardware for optical readers and reader communication systems for the invention;
a shows architecture for a program memory of an optical reader according to the invention.
a shows a printed image representation corresponding to a frame of image data having a window comprising an image representation of a decoded message;
b is a diagram illustrating a typical architecture of an image file;
There is provided an optical reading imaging device which is highly useful for reading obscure or hard to read symbols or OCR decodable text characters, which is highly useful for detecting fraud, and which is also highly useful for creating an easily searchable database of indexed image files.
Preferably, a reader according to the invention is in communication with or operating under the control of a powerful processor system or a network of powerful processor systems.
A reader according to the invention in one embodiment is operable in four user-selected modes of operation. The modes may be selected from a decoding option menu driver which is called up by selecting a decoding function of the optical reading device, from a set of possible device functions. The decode function may be selected from a function menu driver which is made available to a user when a reader according to the invention is first powered up.
The user selectable modes of operation are: (1) “message only;” (2) “image only;” (3) “message and image;” and (4) “two-step message and image.”
In the first user selectable mode of operation, the “message only” mode, a reader according to the invention operates in accordance with the operation of a reader of the prior art discussed in the background herein. That is, when the first user-selected decoding mode of operation is selected, the reader captures a frame of image data into a decoding buffer memory location, attempts to decode any decodable indicia in the captured frame, and stores the decoded message in a memory location dedicated for storing the message information without storing into a designated frame storage memory location the frame of image data from which the decoded message was decoded.
When operating in the second user-selected decoding mode of operation, the “image only” mode, a reader according to the invention stores a frame of image data in a designated frame storage memory location where it is made available for transmitting to another memory location. It may be desirable to transfer the frame of image data to another memory location, for example, so that the image data can be subjected to a bar code or OCR decoding operation by a processor system other than the one responsible for the original image capture. The second mode of operation is highly useful in decoding environments where it is known that the decodable indicia is decodable but is of a type that cannot be decoded by the reader capturing the frame including the indicia as presently configured. For example, the reader reading the indicia may be capable of symbol decoding only whereas the decodable indicia of a captured image may comprise OCR characters. The second mode also conveniently allows a user to capture an image for any purpose which may be unrelated to decoding during the course of operating reader 10 in accordance with a decoding function of reader 10.
When operating in the third user-selected mode of operation, the “message and image” mode, a reader according to the invention stores to a designated frame storage memory location a frame of image data and stores to the same and/or another memory location a decoded message corresponding to the decodable indicia represented in the image.
In a fourth mode, the “two-step message and image mode”, a reader according to the invention may store into a designated frame storage memory location both a frame of image data and a decoded message associated with the frame of image data as in the third mode. However, in the fourth mode, the decoded message is not decoded from a decodable indicia represented in the stored frame of image data. A user captures two separate images during the course of operating the reader in the fourth mode. One of the captured images is stored in a dedicated memory space and the other of the captured images is subjected to decoding for developing a decoded-out message which is associated with the memory stored captured image.
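The four user-selectable behaviors can be pictured with the minimal sketch below. It is only an illustration of the mode descriptions above, assuming hypothetical helper routines `capture_frame` and `decode_indicia` and simple in-memory stand-ins for the decode buffer, the designated frame storage location, and the decoded-message location; none of these names or structures come from the specification.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Optional


class Mode(Enum):
    MESSAGE_ONLY = 1            # mode 1: decode, keep the message, discard the frame
    IMAGE_ONLY = 2              # mode 2: keep the frame, make no decode attempt
    MESSAGE_AND_IMAGE = 3       # mode 3: decode and keep both frame and message
    TWO_STEP_MESSAGE_IMAGE = 4  # mode 4: keep one frame, decode a second frame


@dataclass
class ReaderMemory:
    decode_buffer: Optional[bytes] = None
    frame_storage: list = field(default_factory=list)    # designated frame storage location
    message_storage: list = field(default_factory=list)  # designated decoded-message location


def capture_frame() -> bytes:
    """Placeholder for reading a frame of image data from the image sensor."""
    return b"frame-data"


def decode_indicia(frame: bytes) -> Optional[str]:
    """Placeholder for the bar code / OCR decoding algorithm."""
    return "DECODED-MESSAGE"


def on_trigger(mode: Mode, mem: ReaderMemory) -> None:
    if mode is Mode.MESSAGE_ONLY:
        mem.decode_buffer = capture_frame()
        msg = decode_indicia(mem.decode_buffer)
        if msg is not None:
            mem.message_storage.append(msg)         # the frame itself is not retained
    elif mode is Mode.IMAGE_ONLY:
        mem.frame_storage.append(capture_frame())   # stored without any decode attempt
    elif mode is Mode.MESSAGE_AND_IMAGE:
        mem.decode_buffer = capture_frame()
        msg = decode_indicia(mem.decode_buffer)
        mem.frame_storage.append(mem.decode_buffer)
        if msg is not None:
            mem.message_storage.append(msg)
    elif mode is Mode.TWO_STEP_MESSAGE_IMAGE:
        stored_frame = capture_frame()               # first capture: the image to keep
        mem.frame_storage.append(stored_frame)
        second_frame = capture_frame()               # second capture: decoded only
        msg = decode_indicia(second_frame)
        if msg is not None:
            mem.message_storage.append(msg)          # associated with stored_frame
```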
In both the third and fourth modes, message data is associated with image data. The message data can be associated with image data in a number of different ways. For example, the reader may convert the decoded-out message into an image representation of the characters of the message data, and stitch the image representation of the message into a section of the frame of stored image data. The message data may also be stored in a memory location separate from the frame storage memory location, where it is retained as message data and not converted to image data. The message data may also be stored in a header byte location of a header associated with the image file encoding the stored frame of image data.
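As one illustration of the header-byte approach just mentioned, the sketch below writes a fixed-length decoded-message field ahead of a raw image payload. The 64-byte field size and the overall file layout are assumptions made for illustration only, not a format defined by the specification.

```python
MESSAGE_FIELD_LEN = 64  # assumed fixed-size header field holding the decoded-out message


def write_image_with_message(path: str, frame: bytes, message: str) -> None:
    """Store a frame of image data with the decoded-out message in a header field."""
    header = message.encode("ascii", errors="replace")[:MESSAGE_FIELD_LEN]
    header = header.ljust(MESSAGE_FIELD_LEN, b"\x00")
    with open(path, "wb") as f:
        f.write(header)  # header byte location carrying the message
        f.write(frame)   # image data follows the header


def read_message_from_header(path: str) -> str:
    """Recover the decoded-out message without touching the image data."""
    with open(path, "rb") as f:
        header = f.read(MESSAGE_FIELD_LEN)
    return header.rstrip(b"\x00").decode("ascii", errors="replace")
```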
The third and fourth modes are highly useful for fraud detection. That is, by selecting the third or fourth modes a user has the capacity to view an image side-by-side with a decoded-out message-image. If the image comprises a representation of a package or item on which the bar code is located, a user can determine whether the bar code or package has been tampered with by viewing the image in connection with the decoded message.
The third and fourth modes are also highly useful for providing secondary decoding functions. The message associated with an image in the third or fourth modes is decoded from a decodable indicia in or associated with the scene corresponding to the stored frame of image data. However, the scene represented by the stored frame of image data may include additional decodable indicia which was not subjected to decoding, or which was of a type that could not be decoded by the as-configured reader at the time the reader captured the frame of image data stored in the designated image frame storage location. The third and fourth modes allow this secondary decodable indicia to be decoded at a later time, after decoding of the indicia yielding the decoded-out message stored in a designated memory location during the third or fourth modes.
Still further, the third and fourth modes are highly useful for image indexing applications. Incorporating message data in a specific header location of several memory-stored image files creates a database of image files, wherein each image file is indexed by the message associated with the image, as determined by the decodable indicia yielding the decoded-out message. When such a database is created, any one image file in the database can be accessed by searching for a particular decoded-out message in the particular header byte location of the various image files.
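The indexing application might then look like the following sketch, which scans a directory of image files laid out as in the earlier header-field sketch and returns those whose header message matches a query. The directory layout, field size, and function name are illustrative assumptions.

```python
import os

MESSAGE_FIELD_LEN = 64  # must match the assumed header field size used when writing


def find_images_by_message(directory: str, query: str) -> list:
    """Return paths of image files whose header message field matches the query."""
    matches = []
    for name in sorted(os.listdir(directory)):
        path = os.path.join(directory, name)
        if not os.path.isfile(path):
            continue
        with open(path, "rb") as f:
            header = f.read(MESSAGE_FIELD_LEN)
        message = header.rstrip(b"\x00").decode("ascii", errors="replace")
        if message == query:
            matches.append(path)
    return matches


# Example: locate every stored image indexed under a particular decoded-out message.
# hits = find_images_by_message("/data/frames", "0123456789012")
```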
The invention is first described briefly with reference to
In the “message only” mode, reader 10 stores to a designated memory location a decoded-out data message. In an “image only” mode, a reader according to the invention stores to a designated frame storage memory location a frame of image data without attempting to decode decodable indicia represented in the image. In a “message and image” mode, a reader according to the invention stores to a designated memory location a frame of image data and, in addition, a decoded-out message associated with the frame of image data to the frame storage memory location and/or to another designated memory location. In the two-step message and image mode, a reader according to the invention stores into a designated memory location or locations both a frame of image data and a decoded-out message associated with the frame of image data as in the third mode. However, in the fourth mode, the decoded message is not decoded from a decodable indicia represented in the stored frame of image data. A user captures two separate images during the course of operating the reader in the fourth mode. One of the captured images is stored in a dedicated memory space and the other of the captured images is subjected to decoding for developing a decoded message which is then associated with the memory stored captured image.
Shown in the embodiment of
As is indicated in the specific embodiment of
Reader 10-1 may be equipped with a graphical user interface for aiding in the menu selection of one of the four operational modes. While the menu driver in the embodiment of
The availability of multiple operational modes of the reader described herein allows the operation of the reader to be optimized depending on the particular decoding environment. In the case that the snappiest operation is desired, the expected indicia to be decoded is common and readily decoded, and there is little likelihood of fraudulent bar code use, the first mode is commonly selected. In the case that a captured symbol representation includes a decodable indicia but the reader as presently configured is not configured to read the symbol, it is desirable to select the second mode. The third and fourth modes are highly useful where a scene includes at least one decodable indicia that can be decoded by the image capturing reader as presently configured, but also comprises other decodable indicia which cannot be decoded by the reader 10 as presently configured.
The third and fourth modes are also highly useful in the case there is a substantial likelihood of indicia transposition fraud. Still further, the third and fourth modes are also highly useful in the case it is desired to file several images in an easily searchable indexed database of stored image files.
Block diagrams illustrating various types of electronic hardware configurations for optical readers in which the invention may be incorporated and communication systems comprising at least one optical reader are shown in
Reader processor assembly 30 includes an illumination assembly 21 for illuminating a target object T, such as a substrate bearing a 1D or 2D bar code symbol or a text string, and an imaging assembly 33 for receiving an image of object T and generating an electrical output signal indicative of the data optically encoded therein. Illumination assembly 21 may, for example, include an illumination source assembly 22, together with an illuminating optics assembly 24, such as one or more lenses, diffusers, wedges, reflectors or a combination of such elements, for directing light from light source 22 in the direction of a target object T. Illumination assembly 21 may comprise, for example, laser or light emitting diodes (LEDs) such as white LEDs or red LEDs. Illumination assembly 21 may include target illumination and optics for projecting an aiming pattern on target T. Illumination assembly 21 may be eliminated if ambient light levels are certain to be high enough to allow high quality images of object T to be taken. Illumination assembly 21 may also be located remote from reader housing 11, at a location so as to eliminate or reduce specular reflections. Imaging assembly 33 may include an image sensor 32, such as a color or monochrome 1D or 2D CCD, CMOS, NMOS, PMOS, CID or CMD solid state image sensor, together with an imaging optics assembly 34 for receiving and focusing an image of object T onto image sensor 32. The array-based imaging assembly shown in
Reader processor assembly 30 of the embodiment of
More particularly, processor 42 is preferably a general purpose, off-the-shelf VLSI integrated circuit microprocessor which has overall control of the circuitry of
The actual division of labor between processor 42 and ASIC 44 will naturally depend on the type of off-the-shelf microprocessors that are available, the type of image sensor which is used, the rate at which image data is output by imaging assembly 33, etc. There is nothing in principle, however, that requires that any particular division of labor be made between processors 42 and 44, or even that such a division be made at all.
With processor architectures of the type shown in
ASIC 44 is preferably devoted primarily to controlling the image acquisition process, the A/D conversion process and the storage of image data, including the ability to access memories 46-1 and 47-1 via a DMA channel. ASIC 44 may also perform many timing and communication operations. ASIC 44 may, for example, control the illumination of LEDs 22, the timing of image sensor 32 and an analog-to-digital (A/D) converter 36-1, the transmission and reception of data to and from a processor system external to assembly 30, through an RS-232, a network such as an Ethernet, a serial bus such as USB, a wireless communication link (or other) compatible I/O interface as is indicated by interface 37-2. ASIC 44 may also control the outputting of user perceptible data via an output device, such as aural output device 14a, a good read LED 14g and/or a display monitor which may be provided by a liquid crystal display such as display 14d. Control of output, display and I/O functions may also be shared between processors 42 and 44, as suggested by bus driver I/O interface 37-3 or duplicated, as suggested by microprocessor serial I/O interface 37-1 and interface 37-2. As explained earlier, the specifics of this division of labor are of no significance to the present invention.
b shows a block diagram exemplary of an optical reader which is adapted to easily receive user-input control instructions resulting in a change in an operating program of a reader. In addition to having the elements of single state reader circuit of
An operator operating optical reader 10b can reprogram reader 10b in a variety of different ways. In one method for reprogramming reader 10b, an operator actuates a control button of keyboard 13k which has been pre-configured to result in the reprogramming of reader 10b. In another method for reprogramming reader 10b, an operator actuates control of a processor system not integral with reader 10b to transmit an instruction to reprogram reader 10b. According to another method for reprogramming reader 10b, an operator moves reader 10b so that a “menu symbol” is in the field of view of image sensor 32 and then activates trigger 13t of reader 10b to capture an image representation of the menu symbol. A menu symbol is a specially designed bar code symbol which, when read by an appropriately configured optical reader, results in a reader being programmed. The reprogramming of an optical reader with use of a menu symbol is described in detail in commonly assigned U.S. Pat. No. 5,965,863, incorporated herein by reference. Because the second and third of the above methodologies do not require actuation of a reader control button of keyboard 13k but nevertheless result in a reader being reprogrammed, it is seen that reader 10 may be keyboardless but nevertheless reprogrammable. It will be seen that the second or third of the above methodologies can be adapted for selecting one of the reader operating modes described herein.
A typical software architecture for an application operating program typically executed by an optical reader as shown in
Referring now to specific aspects of the software architecture of an operating program 60, program 60 includes an instruction section 62 and a parameter section 64. Further, instruction section 62 may include a selectable routine section 62s. Instructions of instruction section 62 control the overall flow of operations of reader 10. Some instructions of instruction section 62 reference a parameter from a parameter table of parameter section 64. An instruction of instruction section 62 may state in pseudocode, for example, “set illumination to level determined by [value in parameter row x].” When executing such an instruction of instruction section 62, control circuit 40 may read the value of parameter row 64x. An instruction of instruction section 62 may also cause to be executed a selectable routine that is selected depending on the status of a parameter value of parameter section 64. For example, if the application program is a bar code decoding algorithm, then an instruction of instruction section 62 may state in pseudocode, for example, “launch Maxicode decoding if Maxicode parameter of parameter row 64y is set to ‘on’.” When executing such an instruction, control circuit 40 polls the contents of row 64y of parameter section 64 to determine whether to execute the routine called for by the instruction. If the parameter value indicates that the selectable routine is activated, control circuit 40 executes the appropriate instructions of routine instruction section 62s to execute the instruction routine.
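A minimal sketch of this parameter-driven control flow is given below, with an invented parameter table keyed by labels standing in for parameter rows 64x and 64y. The routine names and table contents are illustrative assumptions, not the operating program itself.

```python
# Parameter section (64): a table of values the instruction section consults at run time.
parameter_table = {
    "illumination_level": 5,   # e.g. the value read from "parameter row x"
    "maxicode_enabled": True,  # e.g. the on/off flag of "parameter row y"
}


def set_illumination(level: int) -> None:
    print(f"illumination set to level {level}")


def decode_maxicode(frame: bytes) -> str:
    return "MAXICODE-MESSAGE"  # stand-in for a selectable decoding routine (62s)


# Instruction section (62): overall flow that references the parameter table.
def run_decode_cycle(frame: bytes) -> None:
    # "set illumination to level determined by [value in parameter row x]"
    set_illumination(parameter_table["illumination_level"])
    # "launch Maxicode decoding if Maxicode parameter ... is set to 'on'"
    if parameter_table["maxicode_enabled"]:
        print(decode_maxicode(frame))


# Reprogramming then amounts to editing the parameter table only; the
# instruction-section code itself is left unchanged:
parameter_table["maxicode_enabled"] = False
```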
It is seen, therefore, that the above described software architecture facilitates simplified reprogramming of reader 10. Reader 10 can be reprogrammed simply by changing a parameter of parameter section 64 of program 60, without changing the subroutine instruction section 62s or any other code of the instruction section 62. A parameter value of parameter section 64 can be changed by appropriate user control entered via keyboard 13k, by reading a menu symbol configured to result in a change in parameter section 64, or by downloading a new parameter value or table via a processor system other than system 40 as shown in
Another architecture typical of an optical reader which may be configured in accordance with the invention is shown in
In architectures shown in
Referring to further aspects of readers 10a, 10b, and 10c, at least one I/O interface, e.g. interface 37-1, 37-2, and 37-3, facilitates local “wired” digital communication such as RS-232, Ethernet, serial bus including Universal Serial Bus (USB), or local wireless communication technology including “Bluetooth” communication technology. At least one I/O interface, e.g. interface 37-3, meanwhile, facilitates digital communication with remote processor assembly 88-1 in one of the available remote communication technologies including dial-up, ISDN, DSL, cellular or other RF, and cable. Remote processor assembly 88-1 may be part of a network 88N of processor systems, as suggested by assemblies 88-2, 88-3, and 88-4, links 88L, and hub 88H, e.g. a personal computer or mainframe computer connected to a network, or a computer that is in communication with reader 10c only and is not part of a network. The network 88N to which assembly 88-1 belongs may be part of the Internet. Further, assembly 88-1 may be a server of the network and may incorporate web pages for viewing by the remaining processor assemblies of the network. In addition to being in communication with reader 10c, assembly 88-1 may be in communication with a plurality of additional readers 10′ and 10″. Reader 10c may be part of a local area network (LAN). Reader 10 may communicate with system 88-1 via an I/O interface associated with system 88-1 or via an I/O interface 88I of network 88N such as a bridge or router. Further, a processor system external to processor system 40, such as processor system 70s, may be included in the communication link between reader 10 and assembly 88-1. While the components of readers 10a, 10b, and 10c are represented in
Furthermore, the number of processors of reader 10 is normally of no fundamental significance to the present invention. In fact, if processor 42 is made fast enough and powerful enough, special purpose ASIC processor 44 can be eliminated. Likewise, referring to reader 10c, a single fast and powerful processor can be provided to carry out all of the functions contemplated by processors 40hp, 42, and 44, as is indicated by the architecture of reader 10e of
Referring to the embodiment of
The control circuit 40 as shown in the embodiment of
The reader communication system of
As described in U.S. Pat. No. 5,965,863, incorporated herein by reference, one function typically provided by nonintegrated local host processor system 70s is to create operating programs for downloading into reader 10. Processor system 70s typically has an operating system incorporated therein, such as WINDOWS, which enables an operator to develop operating programs using a graphical user interface. Nonintegrated local processor system 70s also can be configured to receive messages and/or image data from more than one reader, possibly in a keyboard wedge configuration as described in U.S. Pat. No. 6,161,760, incorporated herein by reference. It is also convenient to employ processor system 70s for data processing. For example, a spreadsheet program can be incorporated in system 70s which is useful for analyzing data messages from reader 10e. An image processing application can be loaded into system 70s which is useful for editing, storing, or viewing electronic images received from reader 10e. It is also convenient to configure reader 10e to coordinate communication of data to and from a remote processor assembly such as assembly 88-1. Accordingly, processor assembly 68 typically includes I/O interface 74-2 which facilitates remote communication with a remote processor assembly, e.g. assembly 88-1 as shown in
The various modes of operation of the device are now described in greater detail. A user may actuate the “message only” mode using one of the possible menu driver systems as previously explained. When trigger 13T is actuated with reader 10 in the first mode, control circuit 40 captures a frame of image data into a decoding buffer memory location, typically located within RAM 46, subjects the frame of image data within the buffer memory location to a decoding algorithm to generate a decoded-out message, and then stores in a designated decoded-out message memory location of memory 45 the decoded-out message determined from application of the decoding algorithm.
The first mode is referred to as a “message only” mode despite there being a capture of a frame of image data into a buffer memory because in the first mode there is no writing of image data into a dedicated image frame storage memory location of memory 45 after the initial capturing of the image data into a buffer memory location. A designated image frame storage memory location of a memory 45 of an imaging device 10 is a memory location that is specifically designated for later access either by imaging device 10 or by an external processor system such as the processor system 70s of a host PC 68 as shown in the example of
It should be noted that control circuit 40 can be of a type that does not capture each new frame of image data into a single buffer memory location during the course of capturing images for decoding purposes. Instead, control circuit 40, during the course of a decodable indicia decoding session, may capture each newly captured image into a separate memory location of memory 45 and may attach a designation flag (such as in an allocated open byte of the image file) in the case that the frame of image data is to be designated for further processing after decoding is complete. Control circuits 40 that do not utilize a decode buffer memory location during decoding attach designation flags to images captured during execution of the second, third, and fourth modes herein. That is, where a control circuit 40 that does not utilize a decode buffer captures a frame of image data into memory 45 while operating in accordance with the third mode, for example, control circuit 40 decodes image data represented in the frame of image data and attaches a designation flag to the captured frame of image data to indicate that the image data is to be subjected to further processing in addition to the decoding processing (such as uploading to processor system 70s, for example). Control circuit 40 thereby develops a “linked list” of image files having designation flags attached thereto to designate that the frame of image data is to be subjected to further processing. The phrase “storing a frame of image data into a designated frame storage memory location” should be understood therefore to refer to both the situation where control circuit 40 transfers or copies an image frame from a decode buffer memory location of memory 45 and the situation where a control circuit 40 attaches a designation flag to a captured frame of image data captured into a memory location of memory 45 that is not a decode buffer memory location.
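One way to picture this designation-flag bookkeeping is the sketch below, where each captured frame record carries a flag and flagged records are chained into a simple linked list of frames awaiting further processing. The record layout and class names are assumptions for illustration, not the specification's file format.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class FrameRecord:
    image: bytes
    flagged: bool = False                          # designation flag in an open byte
    next_flagged: Optional["FrameRecord"] = None   # link to the next flagged frame


class CaptureMemory:
    """Each new frame gets its own record; flagged records form a linked list."""

    def __init__(self) -> None:
        self.records = []
        self.link_list_head: Optional[FrameRecord] = None
        self._last_flagged: Optional[FrameRecord] = None

    def store(self, image: bytes, needs_further_processing: bool) -> FrameRecord:
        rec = FrameRecord(image=image, flagged=needs_further_processing)
        self.records.append(rec)
        if needs_further_processing:
            if self._last_flagged is None:
                self.link_list_head = rec            # first flagged frame
            else:
                self._last_flagged.next_flagged = rec
            self._last_flagged = rec
        return rec
```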
Various methods for decoding decodable indicia, including 1D symbols, 2D symbols, and text characters represented in captured image data are known. As has been indicated herein, reader 10 according to the invention attempts to decode decodable indicia represented in a captured frame of image data when executing the first, third, and fourth modes described herein. Specific features of algorithms for decoding decodable indicia represented in a captured frame of image data are described with reference to
As embodied herein and depicted in
As embodied herein and depicted in
As embodied herein and depicted in
As embodied herein and depicted in
Referring again to modes of operation of a reader 10 according to the invention, the second, “image only” mode will now be described. A user selects the second mode as described previously by making a selection of an indicator corresponding to the second mode using a mode selector menu driver. When the second, “image only” mode is selected and trigger 13t is actuated, control circuit 40 captures or writes a frame of image data corresponding to the scene presently in the field of view of reader 10 into a designated image frame storage location of memory 45 without attempting to decode decodable indicia represented in the frame of image data. In the second mode, control circuit 40 may capture a frame of image data into a decoding buffer memory location of memory 45 and then write the frame of image data into a designated frame storage location of memory 45 without attempting to decode decodable indicia represented therein, or else control circuit 40 may bypass the decoding buffer memory location entirely and capture the frame of image data directly into a non-decoding buffer frame storage location of memory 45 as has been described herein.
The second mode is highly useful in a variety of commonly encountered decoding applications. For example, if a scene includes indicia that is decodable by way of available decoding technologies but not by reader 10 as presently configured, it is useful to select the second “image only” mode so that (1) the frame of image data corresponding to the scene can be shipped to an external processor system equipped to decode the decodable indicia, or so that (2) the reader can be reprogrammed so that it has the capacity to decode the particular type of decodable indicia in the captured image representation. Of course, the second mode is highly useful since it allows a user to easily capture images for any purpose which may be unrelated to decoding during the course of operating reader 10 in accordance with a decode function.
Frame image capture functionality is available in imaging devices that are not normally equipped with decoding functionality. For example, digital cameras as depicted by the example of
Furthermore, in the second mode, control circuit 40 may be made to execute steps in furtherance of decoding decodable indicia after executing the step of storing a frame of image data into a designated frame storage memory location of memory 45. For example, after storing a frame of image data in memory 45 in the second mode, control circuit 40 of reader 10 may transmit an instruction to an external processor system, e.g. processor system 70s or the processor system of assembly 88-1, so that the external processor system transmits to reader 10 a new operating program which results in the reader having the capacity to decode the decodable indicia represented in the frame of image data just written to the designated frame storage memory location of memory 45. Control circuit 40, as reprogrammed, may be configured to automatically decode, or may later be controlled to decode, decodable indicia represented in the frame of image data stored in the designated frame storage memory location of memory 45. In addition, as part of the second mode, control circuit 40, after writing a frame of image data into a designated frame storage memory location of memory 45, may be made to transmit the stored frame of image data, or a copy thereof, to an external processor system such as host processor system 70s or a processor system of remote processor assembly 88-1, together with complementary instructions instructing the external processor system, e.g. system 70s or a system of assembly 88-1, to decode any decodable indicia in the frame of image data and to transmit a decoded-output message yielded by such decoding back to reader 10. When receiving the frame of image data and complementary instructions, the external processor system, e.g. system 70s or system of assembly 88-1, may then automatically decode the decodable indicia represented in the frame of image data and transmit the decoded-output message back to reader 10.
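A hedged sketch of such a request/response exchange from the reader side is given below, using a plain TCP socket and a tiny ad hoc framing. The port, command byte, and framing are illustrative assumptions rather than a protocol defined by the specification.

```python
import socket
import struct

DECODE_REQUEST = 0x01  # assumed command byte: "decode this frame and return the message"


def request_external_decode(host: str, port: int, frame: bytes) -> str:
    """Ship a stored frame to an external processor system and wait for the decoded-out message."""
    with socket.create_connection((host, port)) as sock:
        # Assumed framing: 1 command byte, 4-byte big-endian length, then the image data.
        sock.sendall(struct.pack(">BI", DECODE_REQUEST, len(frame)) + frame)
        # Assumed reply: 4-byte big-endian length followed by the decoded-out message.
        (msg_len,) = struct.unpack(">I", _recv_exact(sock, 4))
        return _recv_exact(sock, msg_len).decode("utf-8")


def _recv_exact(sock: socket.socket, n: int) -> bytes:
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("connection closed before full reply received")
        buf += chunk
    return buf
```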
Referring now to the third, “image and message” mode, the third mode may be actuated by selecting a menu option out of a series of menu options as in the first and second modes. When operating in the third mode, actuation of trigger 13T results in control circuit 40 capturing a frame of image data corresponding to a scene presently in the field of view of the reader into a buffer memory location of memory 45, attempting to decode decodable indicia represented in the frame, and storing both the frame of image data and its associated decoded-out message into a designated frame storage location of memory 45 or into a set of designated image and message storage locations of memory 45.
The fourth mode of operation, the “two-step message and image” mode, is similar to the third mode except that the fourth mode involves two image capturing steps instead of one. The fourth mode may be actuated, as in the first, second, and third modes, by selecting an appropriate menu option out of a series of menu options as explained with reference to
It will be seen that the frame of image data that is associated with decoded-output message data in the fourth mode will not necessarily comprise an image representation of the decodable indicia from which the decoded-output message is generated. In the fourth mode, the frame of image data associated with a decoded-output message can be any image a user wishes to associate with a decoded-output message. For example, a user may actuate the fourth mode a first time to associate a decoded-output message with a large field image representation of a front side of a package, which may or may not comprise a representation of the indicia corresponding to the decoded-output message (at a resolution that is insufficient for decoding), and actuate the fourth mode a second time to associate a decoded-output message with a large field view of a back side of a package that does not comprise a representation of a decodable indicia. The user may then actuate the fourth mode a third time to associate a decoded-output message with yet another image representation such as an image representation of the trailer box on which the package was loaded.
Furthermore, it will be seen that a reader according to the invention can have other modes of operations wherein more than one frame of image data is associated with a decoded-out message, wherein more than one decoded-out message is associated with a frame of image data written to designated frame storage location of memory 45, or wherein multiple decoded messages are associated with multiple frames of image data. The other modes of operation may require more than the pair of image capture steps required in the fourth mode of operation. Of course, useful embodiments of the invention can also have less than all of the four modes of operation explained in detail herein.
In both the third and fourth modes, decoded-out message data is associated with image data. Message data can be associated with image data in a number of useful ways in the third and fourth modes.
For example, according to a first method for associating image and message data as shown by the example of
According to another method for associating decoded-out message data and image data, control circuit 40 stores decoded-out message data into a designated message data storage location of memory 45 separate from a designated image data storage location of memory 45, without converting the message data into an image representation of message data.
Another highly useful method for associating decoded-out message data with image data is described with reference to
According to the method of the message and image association described with reference to
The message-image association method described with reference to
The third and fourth modes of operation as described herein are highly useful in the case supplementary decoding may be desired. With the number of decodable types of indicia ever expanding, the situation is a common one where a scene includes multiple types of decodable indicia but the reader capturing an image representation of the scene is configured to read only some of them. For example, reader 10 may be presently equipped to decode 1D symbols only and a scene may include 1D symbols, 2D symbols and decodable text strings. The third or fourth modes of operation as explained herein can be actuated in the case that a scene includes at least one decodable indicia that can be decoded by reader 10 as presently configured and at least one decodable indicia that cannot be decoded by reader 10 as presently configured. The third or fourth modes can be actuated so that reader 10 decodes the presently decodable indicia of the scene and associates an image representation with the decoded-out message. As explained in the description of the second mode herein, control circuit 40, either automatically or upon a user request, may then transmit the image representation to an external processor assembly, e.g. assembly 68 or 88-1 for decoding, which may transmit the decoded message back to reader 10, or may transmit a request to an external processor assembly e.g. assembly 68 or 88-1 to reprogram reader 10 so that reader 10 is reconfigured for reading the previously unreadable indicia.
The third and fourth modes of operation as described herein are also highly useful for fraud detection. It is seen from
In one method of fraud involving decodable indicia, known as transposition fraud, decodable symbols are lifted from or copied from a first item and placed on a second item.
The first item may be, for example, a retail product of lower value and the second item may be a retail product of higher value. The first item may also be, for example, an identification card owned by a person over 21 and the second item may be an identification card owned by a person under 21. Because it is common to associate narrow classes of bar code decoded-out messages with narrow categories of items, the third and fourth modes allow a user of reader 10, or of a processor system in communication with reader 10, in many cases to quickly determine whether there has been a transposition fraud by simultaneous observation of a scene image representation and a decoded-out message.
The third and fourth modes also allow convenient observation of indicia bearing or transaction associated objects for such purposes as detecting product or package tampering or damage. It is seen that the second, “image only” mode of operation described herein is also useful for the purposes of fraud, tamper, or damage detection explained with reference to the third and fourth modes.
Particularly when the message-image association method described with reference to
The utility of such an indexing function and further aspects of the image indexing function are illustrated by way of example. Referring to
While the present invention has been explained with reference to the structure disclosed herein, it is not confined to the details set forth and this invention is intended to cover any modifications and changes as may come within the scope of the following claims.
This application is a divisional of U.S. patent application Ser. No. 11/442,662 (now U.S. Pat. No. 7,543,747), filed May 25, 2006, which is a divisional of U.S. patent application Ser. No. 09/858,163, filed on May 15, 2001 (now U.S. Pat. No. 7,111,787). This application is also related to U.S. patent application Ser. No. 10/143,158 (now U.S. Pat. No. 6,942,151). Each of the above applications is incorporated herein by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
3801775 | Acker | Apr 1974 | A |
3902047 | Tyler et al. | Aug 1975 | A |
4020463 | Himmel | Apr 1977 | A |
4091270 | Musch et al. | May 1978 | A |
4286255 | Siy | Aug 1981 | A |
4335303 | Call | Jun 1982 | A |
4387298 | Petersen et al. | Jun 1983 | A |
4499372 | Nakano | Feb 1985 | A |
4530584 | Schmidt | Jul 1985 | A |
4544064 | Felder | Oct 1985 | A |
4562592 | Chainer et al. | Dec 1985 | A |
4581762 | Lapidus et al. | Apr 1986 | A |
4588211 | Greene | May 1986 | A |
4656591 | Goldberg | Apr 1987 | A |
4760248 | Swartz et al. | Jul 1988 | A |
4776464 | Miller et al. | Oct 1988 | A |
4794239 | Allais | Dec 1988 | A |
4825058 | Poland | Apr 1989 | A |
4832204 | Handy et al. | May 1989 | A |
4835372 | Gombrich et al. | May 1989 | A |
4855842 | Hayes et al. | Aug 1989 | A |
4858000 | Lu | Aug 1989 | A |
4868375 | Blanford | Sep 1989 | A |
4868757 | Gil | Sep 1989 | A |
4873426 | Sarna et al. | Oct 1989 | A |
4877948 | Krueger | Oct 1989 | A |
4877949 | Danielson et al. | Oct 1989 | A |
4924078 | Sant'Anselmo et al. | May 1990 | A |
4948955 | Lee et al. | Aug 1990 | A |
4972494 | White et al. | Nov 1990 | A |
4983818 | Knowles | Jan 1991 | A |
5010241 | Butterworth | Apr 1991 | A |
5019699 | Koenck | May 1991 | A |
5039391 | Reid et al. | Aug 1991 | A |
5039847 | Morii et al. | Aug 1991 | A |
5043908 | Manduley et al. | Aug 1991 | A |
5050223 | Sumi et al. | Sep 1991 | A |
5054102 | Gaborski | Oct 1991 | A |
5070805 | Plante | Dec 1991 | A |
5091945 | Kleijn | Feb 1992 | A |
5102110 | Reynolds | Apr 1992 | A |
5103489 | Miette et al. | Apr 1992 | A |
5108612 | Flaig et al. | Apr 1992 | A |
5120940 | Willsie | Jun 1992 | A |
5134669 | Keogh et al. | Jul 1992 | A |
5138140 | Siemiatkowski et al. | Aug 1992 | A |
5138141 | Blanford et al. | Aug 1992 | A |
5180904 | Shepard et al. | Jan 1993 | A |
5199081 | Saito et al. | Mar 1993 | A |
5199084 | Kishi et al. | Mar 1993 | A |
5212777 | Gove et al. | May 1993 | A |
5237161 | Grodevant | Aug 1993 | A |
5237625 | Yamashita et al. | Aug 1993 | A |
5243655 | Wang | Sep 1993 | A |
5260554 | Grodevant | Nov 1993 | A |
5262623 | Batterman et al. | Nov 1993 | A |
5262871 | Wilder et al. | Nov 1993 | A |
5278399 | Sano et al. | Jan 1994 | A |
5291243 | Heckman et al. | Mar 1994 | A |
5296689 | Reddersen et al. | Mar 1994 | A |
5296690 | Chandler et al. | Mar 1994 | A |
5296960 | Ellingson et al. | Mar 1994 | A |
5299116 | Owens et al. | Mar 1994 | A |
5301243 | Olschafskie et al. | Apr 1994 | A |
5304423 | Niknafs et al. | Apr 1994 | A |
5304786 | Pavlidis et al. | Apr 1994 | A |
5307423 | Gupta et al. | Apr 1994 | A |
5313051 | Brigida et al. | May 1994 | A |
5317388 | Surka et al. | May 1994 | A |
5331151 | Cochran et al. | Jul 1994 | A |
5331176 | Sant'Anselmo et al. | Jul 1994 | A |
5337361 | Wang et al. | Aug 1994 | A |
5354977 | Roustaei | Oct 1994 | A |
5365048 | Komiya et al. | Nov 1994 | A |
5375226 | Sano et al. | Dec 1994 | A |
5378883 | Batterman et al. | Jan 1995 | A |
5392447 | Schlack et al. | Feb 1995 | A |
5396054 | Krichever et al. | Mar 1995 | A |
5399846 | Pavlidis et al. | Mar 1995 | A |
5410141 | Koenck et al. | Apr 1995 | A |
5413383 | Laurash et al. | May 1995 | A |
5414251 | Durbin | May 1995 | A |
5420403 | Allum et al. | May 1995 | A |
5420943 | Mak | May 1995 | A |
5421778 | Kouramanis et al. | Jun 1995 | A |
5422470 | Kubo et al. | Jun 1995 | A |
5428211 | Zheng et al. | Jun 1995 | A |
5428212 | Tani et al. | Jun 1995 | A |
5448375 | Cooper et al. | Sep 1995 | A |
5449201 | Miller et al. | Sep 1995 | A |
5467411 | Tanaka et al. | Nov 1995 | A |
5471533 | Wang et al. | Nov 1995 | A |
5489158 | Wang et al. | Feb 1996 | A |
5489769 | Kubo et al. | Feb 1996 | A |
5496992 | Madan et al. | Mar 1996 | A |
5504322 | Pavlidis et al. | Apr 1996 | A |
5504367 | Arackellian et al. | Apr 1996 | A |
5506697 | Li et al. | Apr 1996 | A |
5508818 | Hamma | Apr 1996 | A |
5513017 | Knodt et al. | Apr 1996 | A |
5513264 | Wang et al. | Apr 1996 | A |
5521366 | Wang et al. | May 1996 | A |
5521523 | Kimura et al. | May 1996 | A |
5550364 | Rudeen | Aug 1996 | A |
5550366 | Roustaei | Aug 1996 | A |
5557091 | Krummel | Sep 1996 | A |
5557095 | Clark et al. | Sep 1996 | A |
5557519 | Morita et al. | Sep 1996 | A |
5570135 | Gove et al. | Oct 1996 | A |
5574519 | Manico et al. | Nov 1996 | A |
5581636 | Skinger | Dec 1996 | A |
5591955 | Laser | Jan 1997 | A |
5591956 | Longacre, Jr. et al. | Jan 1997 | A |
5594778 | Schaupp, Jr. et al. | Jan 1997 | A |
5598007 | Bunce et al. | Jan 1997 | A |
5602382 | Ulvr et al. | Feb 1997 | A |
5607187 | Salive et al. | Mar 1997 | A |
5617481 | Nakamura et al. | Apr 1997 | A |
5627915 | Rosser et al. | May 1997 | A |
5635694 | Tuhro | Jun 1997 | A |
5635697 | Shellhammer et al. | Jun 1997 | A |
5640202 | Kondo et al. | Jun 1997 | A |
5642442 | Morton et al. | Jun 1997 | A |
5644408 | Li et al. | Jul 1997 | A |
5646390 | Wang et al. | Jul 1997 | A |
5659167 | Wang et al. | Aug 1997 | A |
5668803 | Tymes et al. | Sep 1997 | A |
5684290 | Arackellian et al. | Nov 1997 | A |
5691527 | Hara et al. | Nov 1997 | A |
5697504 | Hiramatsu et al. | Dec 1997 | A |
5702059 | Chu et al. | Dec 1997 | A |
5703349 | Meyerson et al. | Dec 1997 | A |
5708680 | Gollnick et al. | Jan 1998 | A |
5710419 | Wang et al. | Jan 1998 | A |
5714745 | Ju et al. | Feb 1998 | A |
5723868 | Hammond, Jr. et al. | Mar 1998 | A |
5726981 | Ylitervo et al. | Mar 1998 | A |
5726984 | Kubler et al. | Mar 1998 | A |
5734153 | Swartz et al. | Mar 1998 | A |
5756981 | Roustaei et al. | May 1998 | A |
5760382 | Li et al. | Jun 1998 | A |
5761686 | Bloomberg | Jun 1998 | A |
5770841 | Moed et al. | Jun 1998 | A |
5773806 | Longacre, Jr. | Jun 1998 | A |
5773810 | Hussey et al. | Jun 1998 | A |
5777315 | Wilz et al. | Jul 1998 | A |
5780834 | Havens et al. | Jul 1998 | A |
5783811 | Feng et al. | Jul 1998 | A |
5786586 | Pidhirny et al. | Jul 1998 | A |
5793033 | Feng et al. | Aug 1998 | A |
5796090 | Pavlidis et al. | Aug 1998 | A |
5801371 | Kahn et al. | Sep 1998 | A |
5804805 | Koenck et al. | Sep 1998 | A |
5818028 | Meyerson et al. | Oct 1998 | A |
5818528 | Roth et al. | Oct 1998 | A |
5821518 | Sussmeier et al. | Oct 1998 | A |
5821523 | Bunte et al. | Oct 1998 | A |
5825002 | Roslak | Oct 1998 | A |
5835754 | Nakanishi | Nov 1998 | A |
5837986 | Barile et al. | Nov 1998 | A |
5841121 | Koenck | Nov 1998 | A |
5844227 | Schmidt et al. | Dec 1998 | A |
5848064 | Cowan | Dec 1998 | A |
5857029 | Patel | Jan 1999 | A |
5859828 | Ishibashi et al. | Jan 1999 | A |
5867595 | Cymbalski | Feb 1999 | A |
5869828 | Braginsky | Feb 1999 | A |
5880453 | Wang et al. | Mar 1999 | A |
5886338 | Arackellian et al. | Mar 1999 | A |
5892824 | Beatson et al. | Apr 1999 | A |
5903548 | Delamater | May 1999 | A |
5914476 | Gerst, III et al. | Jun 1999 | A |
5917925 | Moore | Jun 1999 | A |
5917945 | Cymbalski | Jun 1999 | A |
5920056 | Bonnet | Jul 1999 | A |
5929418 | Ehrhart et al. | Jul 1999 | A |
5942743 | Schmidt et al. | Aug 1999 | A |
5945661 | Nukui et al. | Aug 1999 | A |
5949052 | Longacre, Jr. et al. | Sep 1999 | A |
5949053 | Zlotnick et al. | Sep 1999 | A |
5949057 | Feng | Sep 1999 | A |
5965863 | Parker et al. | Oct 1999 | A |
5974202 | Wang et al. | Oct 1999 | A |
5990744 | Nagaraj | Nov 1999 | A |
5992744 | Smith et al. | Nov 1999 | A |
5992753 | Xu | Nov 1999 | A |
6000612 | Xu | Dec 1999 | A |
6002491 | Li et al. | Dec 1999 | A |
6011873 | Desai et al. | Jan 2000 | A |
6015088 | Parker et al. | Jan 2000 | A |
6019286 | Li et al. | Feb 2000 | A |
6024284 | Schmid et al. | Feb 2000 | A |
6036095 | Seo et al. | Mar 2000 | A |
6055552 | Curry | Apr 2000 | A |
6060722 | Havens et al. | May 2000 | A |
6062475 | Feng | May 2000 | A |
6070805 | Kaufman et al. | Jun 2000 | A |
6076731 | Terrell | Jun 2000 | A |
6076733 | Wilz, Sr. et al. | Jun 2000 | A |
6076738 | Bloomberg et al. | Jun 2000 | A |
6081827 | Reber et al. | Jun 2000 | A |
6089455 | Yagita et al. | Jul 2000 | A |
6094509 | Zheng et al. | Jul 2000 | A |
6095418 | Swartz et al. | Aug 2000 | A |
6098887 | Figarella et al. | Aug 2000 | A |
6101487 | Yeung et al. | Aug 2000 | A |
6105871 | Campo et al. | Aug 2000 | A |
6108612 | Vescovi et al. | Aug 2000 | A |
6115513 | Miyazaki et al. | Sep 2000 | A |
6119179 | Whitridge et al. | Sep 2000 | A |
6122410 | Zheng et al. | Sep 2000 | A |
6123261 | Roustaei | Sep 2000 | A |
6129278 | Wang et al. | Oct 2000 | A |
6131048 | Sudo et al. | Oct 2000 | A |
6133951 | Miyadera et al. | Oct 2000 | A |
6149063 | Reynolds et al. | Nov 2000 | A |
6157618 | Boss et al. | Dec 2000 | A |
6189796 | Itoh et al. | Feb 2001 | B1 |
6195122 | Vincent | Feb 2001 | B1 |
6198948 | Sudo et al. | Mar 2001 | B1 |
6212504 | Hayosh | Apr 2001 | B1 |
6220509 | Byford et al. | Apr 2001 | B1 |
6223988 | Batterman et al. | May 2001 | B1 |
6243447 | Swartz et al. | Jun 2001 | B1 |
6269336 | Ladd et al. | Jul 2001 | B1 |
6283375 | Wilz, Sr. et al. | Sep 2001 | B1 |
6285916 | Kadaba et al. | Sep 2001 | B1 |
6286760 | Schmidt et al. | Sep 2001 | B1 |
6288760 | Sawayama | Sep 2001 | B1 |
6290132 | Dickson et al. | Sep 2001 | B1 |
6292181 | Banerjee et al. | Sep 2001 | B1 |
6321989 | Wilz, Sr. et al. | Nov 2001 | B1 |
6321992 | Knowles et al. | Nov 2001 | B1 |
6330244 | Swartz et al. | Dec 2001 | B1 |
6330975 | Bunte et al. | Dec 2001 | B1 |
6336587 | He et al. | Jan 2002 | B1 |
6338587 | Kuo | Jan 2002 | B1 |
6340114 | Correa et al. | Jan 2002 | B1 |
6347163 | Roustaei | Feb 2002 | B2 |
6347743 | Wilz, Sr. et al. | Feb 2002 | B2 |
6357662 | Helton et al. | Mar 2002 | B1 |
6366771 | Angle et al. | Apr 2002 | B1 |
6373507 | Camara et al. | Apr 2002 | B1 |
6375075 | Ackley et al. | Apr 2002 | B1 |
6384907 | Gooch et al. | May 2002 | B1 |
6389010 | Kubler et al. | May 2002 | B1 |
6398112 | Li et al. | Jun 2002 | B1 |
6404764 | Jones et al. | Jun 2002 | B1 |
6404772 | Beach et al. | Jun 2002 | B1 |
6418325 | Reber et al. | Jul 2002 | B1 |
6424830 | O'Hagan et al. | Jul 2002 | B1 |
6478223 | Ackley | Nov 2002 | B1 |
6494375 | Ishibashi et al. | Dec 2002 | B1 |
6507856 | Chen et al. | Jan 2003 | B1 |
6512218 | Canini et al. | Jan 2003 | B1 |
6512541 | Dunton et al. | Jan 2003 | B2 |
6530523 | Oakeson et al. | Mar 2003 | B1 |
6539360 | Kadaba | Mar 2003 | B1 |
6539422 | Hunt et al. | Mar 2003 | B1 |
6540142 | Alleshouse | Apr 2003 | B1 |
6556242 | Dunton et al. | Apr 2003 | B1 |
6568596 | Shaw | May 2003 | B1 |
6572020 | Barkan | Jun 2003 | B2 |
6600734 | Gernert et al. | Jul 2003 | B1 |
6629642 | Swartz et al. | Oct 2003 | B1 |
6637658 | Barber et al. | Oct 2003 | B2 |
6641046 | Durbin | Nov 2003 | B2 |
6651060 | Harper et al. | Nov 2003 | B1 |
6655593 | Alleshouse | Dec 2003 | B2 |
6678425 | Flores et al. | Jan 2004 | B1 |
6681121 | Preston et al. | Jan 2004 | B1 |
6681994 | Koenck | Jan 2004 | B1 |
6688523 | Koenck | Feb 2004 | B1 |
6694366 | Gernert et al. | Feb 2004 | B1 |
6697805 | Choquier et al. | Feb 2004 | B1 |
6722569 | Ehrhart et al. | Apr 2004 | B2 |
6736322 | Gobburu et al. | May 2004 | B2 |
6738092 | Nakagawa et al. | May 2004 | B1 |
6746164 | Albright et al. | Jun 2004 | B1 |
6758403 | Keys et al. | Jul 2004 | B1 |
6763226 | McZeal, Jr. | Jul 2004 | B1 |
6764009 | Melick et al. | Jul 2004 | B2 |
6772947 | Shaw | Aug 2004 | B2 |
6772949 | Wilz, Sr. et al. | Aug 2004 | B2 |
6776342 | Thuries et al. | Aug 2004 | B1 |
6783069 | Hecht et al. | Aug 2004 | B1 |
6786069 | Ochi et al. | Sep 2004 | B2 |
6792452 | Philyaw | Sep 2004 | B1 |
6811088 | Lanzaro et al. | Nov 2004 | B2 |
6827273 | Wilz, Sr. et al. | Dec 2004 | B2 |
6834807 | Ehrhart et al. | Dec 2004 | B2 |
6845092 | Vassilovski et al. | Jan 2005 | B2 |
6847632 | Lee et al. | Jan 2005 | B1 |
6850510 | Kubler et al. | Feb 2005 | B2 |
6859134 | Heiman et al. | Feb 2005 | B1 |
6870827 | Voit et al. | Mar 2005 | B1 |
6902114 | Hashimoto et al. | Jun 2005 | B2 |
6908034 | Alleshouse | Jun 2005 | B2 |
6928289 | Cho et al. | Aug 2005 | B1 |
6942151 | Ehrhart | Sep 2005 | B2 |
7111787 | Ehrhart | Sep 2006 | B2 |
7263699 | Jacquemot et al. | Aug 2007 | B2 |
7287697 | Ehrhart et al. | Oct 2007 | B2 |
7543747 | Ehrhart | Jun 2009 | B2 |
20010055422 | Roustaei | Dec 2001 | A1 |
20020039099 | Harper | Apr 2002 | A1 |
20020111924 | Lewis | Aug 2002 | A1 |
20020170970 | Ehrhart | Nov 2002 | A1 |
20020171745 | Ehrhart | Nov 2002 | A1 |
20030046192 | Eguchi et al. | Mar 2003 | A1 |
20030136841 | Alleshouse | Jul 2003 | A1 |
20030197062 | Shaw | Oct 2003 | A1 |
20040003388 | Jacquemot et al. | Jan 2004 | A1 |
20040149826 | Alleshouse | Aug 2004 | A1 |
20040155110 | Ehrhart et al. | Aug 2004 | A1 |
20050008263 | Nagahashi et al. | Jan 2005 | A1 |
20070051814 | Ehrhart et al. | Mar 2007 | A1 |
20120076297 | Koziol et al. | Mar 2012 | A1 |
Number | Date | Country |
---|---|---|
0350933 | Jan 1990 | EP |
0392159 | Oct 1990 | EP |
0439682 | Aug 1991 | EP |
0733991 | Sep 1996 | EP |
0910032 | Apr 1999 | EP |
0998147 | May 2000 | EP |
0999514 | May 2000 | EP |
63-311474 | Dec 1988 | JP |
1-216486 | Aug 1989 | JP |
03-020058 | Jan 1991 | JP |
4257844 | Sep 1992 | JP |
05054152 | Mar 1993 | JP |
6-250775 | Sep 1994 | JP |
10224773 | Aug 1998 | JP |
WO-9202371 | Feb 1992 | WO |
WO-9217861 | Oct 1992 | WO |
WO-9513196 | May 1995 | WO |
WO-9524278 | Sep 1995 | WO |
WO-9534043 | Dec 1995 | WO |
WO-9639676 | Dec 1996 | WO |
WO-9708647 | Mar 1997 | WO |
WO-9950736 | Oct 1999 | WO |
WO-0072246 | Nov 2000 | WO |
WO-0122358 | Mar 2001 | WO |
WO-02080520 | Oct 2002 | WO |
Entry |
---|
Gonzalez; Woods, “Digital Image Processing,” Library of Congress, 10 pg article, pp. 417-425 and pp. 452-457; Copyright 1992 by Addison Wesley Publishing Company, Inc. |
International Search Report for International Patent Application No. PCT/US02/15095 dated Feb. 12, 2002. |
Report Summarizing references of record on U.S. Appl. No. 09/858,163, filed May 15, 2001. |
Notice for Grounds of Rejection, Japanese Patent Application No. 2002-590059, dated Oct. 21, 2008, 2 pages, and 3 pages of English translation of the Grounds for Rejection. |
Jan. 7, 2013 Response filed in U.S. Appl. No. 12/889,764. |
Number | Date | Country | |
---|---|---|---|
20100001073 A1 | Jan 2010 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 11442662 | May 2006 | US |
Child | 12480474 | US | |
Parent | 09858163 | May 2001 | US |
Child | 11442662 | US |