The subject matter of the present disclosure relates to optical based registers, and particularly, to image sensor based indicia reading terminals.
Indicia reading terminals and scanners (collectively, “terminals”) are available in multiple varieties. These terminals read and decode information encoded in decodable or information bearing indicia. Such decodable indicia are utilized widely, from encoding shipping and tracking information for packages to patient identification in hospitals, retail applications, and use on any number of forms and documents including, but not limited to, tax forms, order forms, transaction forms, survey forms, delivery forms, prescriptions, receipts, newspapers, product documents, reports, and the like.
Improvements in terminals are needed; for example, there is a need for a terminal with improved decoding of high density decodable indicia.
Terminals of the present disclosure incorporate lens elements that have a non-uniform magnification. This feature allows decoding of decodable indicia that exhibit different characteristics. These characteristics may affect the ability of the terminal to identify and decode the information stored therein. In one embodiment, the terminal comprises a lens element with at least two different levels of magnification. One of the levels of magnification is adequate to acquire information from decodable indicia with higher density than other decodable indicia.
So that the manner in which the above recited features of the present invention can be understood in detail, a more particular description of the invention briefly summarized above may be had by reference to the embodiments, some of which are illustrated in the accompanying drawings. It is to be noted, however, that the appended drawings illustrate only typical embodiments of this invention and are therefore not to be considered limiting of its scope, for the invention may admit to other equally effective embodiments. Moreover, the drawings are not necessarily to scale, emphasis generally being placed upon illustrating the principles of certain embodiments of the invention.
Thus, for further understanding of the concepts of the invention, reference can be made to the following detailed description, read in connection with the drawings in which:
Where applicable, like numerals identify like components among the various views.
At a relatively high level, embodiments of the terminal 100 read and decode decodable indicia of varying densities. These embodiments utilize non-uniform magnification of an image of a decodable indicia to improve processing time and overall terminal performance. For example, non-uniform magnification can facilitate decoding of decodable indicia of different relative densities without the need to physically alter the location of the terminal 100 relative to the decodable indicia. In one aspect, the terminal 100 can correct for image distortion resulting from non-uniform magnification. Moreover, the inventors contemplate configurations of the terminal 100 that identify the presence of decodable indicia of higher relative density, thereby focusing processing on the information to which the high-density decodable indicia directly relates. The discussion below highlights the features of these embodiments.
For example,
The terminal 100 also comprises an actuation device 128. This feature permits operation of the terminal 100 by an end user (not shown). For example, actuation of the actuation device 128 initiates decoding of information stored in the decodable indicia 114 such as by capturing an image that corresponds to the imaging area 120. There are of course other components and hardware that facilitate capturing and decoding of the decodable indicia 114, some of which are discussed in more detail in connection with the optical reader illustrated in
In one embodiment, the terminal 100 can be part of a system 2000. Here the system 2000 has a local server 2250, a remote server 2550, and a network 2750 that couples the local server 2250 and the remote server 2550. This configuration of the system 2000 can process the captured image data, and in one configuration one or more of the local server 2250 and the remote server 2550 entirely process the captured image data and operate the terminal 100 in a manner consistent with the disclosure below. In one embodiment, one or more of the processing module 106 and the storage module 108, or complementary ones thereof, can be located outside of the terminal 100. This configuration permits data and information captured by the terminal 100 to be transferred from the terminal 100, e.g., to a corresponding external storage module 108 for immediate and/or further processing of the captured image data. In another embodiment, image processing steps and other processes can be distributed as between the terminal 100, the local server 2250, and the remote server 2550, with still other embodiments being configured for the image processing and other processes to be executed entirely by the terminal 100.
The lens element 118 can comprise optics that receives light that reflects from decodable indicia 114 and focuses the reflected light onto the image sensor 116. “Optics” as the term is used herein can include configurations of the lens element 118 with a single optical component (or element), e.g., a single lens. The inventors also understand, however, that the lens element 118 can combine multiple types of optical components for the features and function that the present disclosure contemplates. Exemplary optical components can include lenses that comprise glass and suitable composites, as well as varying layers of materials in the form of, e.g., optical coatings.
The lens element 118 exhibits certain non-uniform magnification properties that facilitate decoding of decodable indicia 114. These properties may apply across the lens element 118 such as where the magnification provided by the lens element 118 is highest at the center of a field of view (FOV) and decreases outward toward the periphery of the field of view.
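By way of illustration only, the center-weighted behavior described above can be sketched as a simple radial profile. The Gaussian fall-off shape, sample values, and function names below are assumptions for illustration; the disclosure specifies only that magnification is highest at the center of the FOV and decreases toward the periphery.

```python
import math

def magnification(r, m_center=2.0, m_edge=1.0, falloff=0.5):
    """Illustrative non-uniform magnification profile.

    r is the normalized radial distance from the FOV center (0.0 at the
    center, 1.0 at the periphery).  The Gaussian fall-off is an assumed
    shape; only the center-high, periphery-low trend comes from the
    disclosure.
    """
    return m_edge + (m_center - m_edge) * math.exp(-(r / falloff) ** 2)

# Magnification is highest at the center and decreases outward.
center = magnification(0.0)   # peak magnification at the FOV center
edge = magnification(1.0)     # reduced magnification at the periphery
```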
As
In the present example, the level of magnification permits decoding of decodable indicia of higher density relative to other decodable indicia. Specifically, by increasing a magnification of an indicia, a captured frame of image data can comprise greater resolution (each pixel position of an image representing a smaller portion of an indicia than would be the result without the increased magnification), thereby increasing a likelihood of decoding, even for higher density decodable indicia. Although the magnification gradually decreases, the inventors contemplate that certain areas of the FOV have a level of magnification that is sufficient for decoding of high-density decodable indicia. These areas can vary in dimensions. The example of
With continued reference to
The code has data or information encoded therein. Information respecting various reference decode algorithms is available from various published standards, such as from the International Standards Organization (“ISO”). Examples may comprise one dimensional (or linear) symbologies, stacked symbologies, matrix symbologies, composite symbologies, or other machine readable indicia. One dimensional (or linear) symbologies, which may range from very large to ultra-small, may include Code 128, Interleaved 2 of 5, Codabar, Code 93, Code 11, Code 39, UPC, EAN, MSI, or other linear symbologies. Stacked symbologies may include PDF, Code 16K, Code 49, or other stacked symbologies. Matrix symbologies may include Aztec, Datamatrix, Maxicode, QR Code, or other 2D symbologies. Composite symbologies may include linear symbologies combined with stacked symbologies. Other symbology examples may comprise OCR-A, OCR-B, and MICR types of symbologies. UPC/EAN symbologies or barcodes are standardly used to mark retail products throughout North America, Europe, and several other countries throughout the world.
A number of decodable indicia consist of a series or a pattern of light and dark areas of varying widths. In common 1D decodable indicia, the dark areas may comprise elongated parallel “bars” separated by light areas or “spaces.” Certain 2D decodable indicia may encode information in, e.g., a checkerboard pattern. The arrangement of the dark areas and light areas defines the information encoded in the resulting decodable indicia. For decodable indicia considered to be high density, the relative size, spacing, and other characteristics of the dark areas and the light areas may be smaller and more tightly defined. Decoding of high density decodable indicia therefore often requires that the image data the terminal 100 gathers is of higher quality than for decodable indicia of relatively lower density.
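The resolution argument above (each pixel representing a smaller portion of the indicia under magnification) can be made concrete with a back-of-the-envelope calculation. All numbers and names below are hypothetical; the disclosure does not specify sensor resolution or element ("module") sizes.

```python
def pixels_per_module(sensor_pixels_per_mm, module_size_mm, magnification):
    """Pixels spanning one narrow element ("module") of an indicia.

    With higher magnification each module is imaged across more pixels,
    so each pixel represents a smaller portion of the indicia.  The
    parameter values used below are illustrative, not from the
    disclosure.
    """
    return sensor_pixels_per_mm * module_size_mm * magnification

# A high-density code with 0.1 mm modules, imaged by optics resolving
# 10 px/mm at unit magnification:
low = pixels_per_module(10, 0.1, 1.0)   # 1 px per module: hard to decode
high = pixels_per_module(10, 0.1, 2.0)  # 2 px per module: better odds
```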
The relative quality may determine the characteristics (e.g., size and shape) of the area HD and/or the shape of the magnification curve 202. As
The image sensor 116 is sensitive to light that passes through the lens element and onto the surface of the image sensor 116. Generally, the image sensor 116 comprises a pixel array that has pixels arranged in rows and columns. The pixels generate signals from which image information is resolved by, e.g., the decoding module 104 and/or the processing module 106. As discussed above, the lens element directs light that reflects from the decodable indicia onto the image sensor 116.
Embodiments of the terminal 100 can correct for image distortion resulting from use of lens element 118 having magnification curve 202. In one embodiment, the terminal 100 addresses image distortion with a distortion correction. The distortion correction has one or more known or discernable values. These values can be programmed, such as in software or firmware, hardwired during manufacturing, or implemented via user-selected configurations. The value of the distortion correction may be based on the manufacture and design of the lens element. In one example, the value reflects the variable magnification of the lens element that the magnification curve 202 describes and the present disclosure sets forth above. In other examples, the terminal 100 calculates the value according to an algorithm or other set of executable instructions that are stored in, e.g., the storage module 108.
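As one sketch of how a known or discernable correction value might be applied, the following assumes a single-coefficient radial model; the disclosure does not fix a particular correction formula, and the function and parameter names are illustrative.

```python
def correct_radial_distortion(point, k):
    """Apply a simple one-parameter radial correction to a coordinate.

    point is an (x, y) pixel coordinate normalized about the image
    center; k plays the role of the known or discernable correction
    value described above.  The polynomial model is an assumption for
    illustration, not a formula from the disclosure.
    """
    x, y = point
    r2 = x * x + y * y
    scale = 1.0 + k * r2          # undo barrel/pincushion-style scaling
    return (x * scale, y * scale)
```

A point at the image center is unchanged, while off-center points are remapped outward (or inward for negative k), compensating for the variable magnification of the lens element.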
Use of the image capture area 326 permits a terminal (e.g., terminal 100) to focus directly on, e.g., decoding a high density decodable indicia or, in one example, determining whether there is a high density decodable indicia present in the image area (e.g., image area 120). For example, if the image capture area 326 only covers the first area 322 (or a portion thereof), then the terminal can first process the captured image data that corresponds only to the portion of the optics that exhibit higher levels of magnification relative to other portions of the optics. The higher prescribed levels of magnification may more likely lead to successful decoding of the decodable indicia in the first area 322, particularly if the decodable indicia exhibits higher density or other properties that may make reading and decoding more difficult.
In one embodiment, the terminal can associate the configurations of the image capture area 326 with the prescribed levels of magnification of the optics. That is, the terminal can size the image capture area 326 to accommodate the relative dimensions, positions, and other features of the optics such as the portions of the optics that are ascribed the varying prescribed levels of magnification in accordance with the magnification curve 202 of
In other embodiments, a succession of frames of image data that can be captured and subject to the described processing can be full frames (including pixel values corresponding to each pixel of the pixel array 302 or a maximum number of pixels read out from the pixel array 302 during operation of the terminal). A succession of frames of image data that can be captured and subject to the described processing can also be “windowed frames” comprising pixel values corresponding to less than a full frame of pixels of the pixel array 302. A succession of frames of image data that can be captured and subject to the described processing can also comprise a combination of full frames (e.g., the entirety of the pixel array 302) and windowed frames (e.g., one or more of the first area 322 and the second area 324). A full frame can be read out for capture by selectively addressing pixels of image sensor 300 having the pixel array 302 corresponding to the full frame. A windowed frame can be read out for capture by selectively addressing pixels of image sensor 300 having the pixel array 302 corresponding to the windowed frame. In one embodiment, a number of pixels subject to addressing and read out determine a picture size of a frame. Accordingly, a full frame can be regarded as having a first relatively larger picture size and a windowed frame can be regarded as having a relatively smaller picture size relative to a picture size of a full frame. A picture size of a windowed frame can vary depending on the number of pixels subject to addressing and readout for capture of a windowed frame.
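The full-frame versus windowed-frame distinction above can be sketched as follows. The row/column tuple layout, function names, and list-of-rows representation are assumptions for illustration.

```python
def read_out_frame(pixel_array, window=None):
    """Read out a full frame or a windowed frame from a pixel array.

    pixel_array is a list of rows of pixel values.  window, when given,
    is (row0, row1, col0, col1) selecting the addressed pixels; when
    omitted, the full frame is read out.
    """
    if window is None:
        return [row[:] for row in pixel_array]          # full frame
    r0, r1, c0, c1 = window
    return [row[c0:c1] for row in pixel_array[r0:r1]]   # windowed frame

def picture_size(frame):
    """Picture size = number of pixels addressed and read out."""
    return sum(len(row) for row in frame)
```

A 4x4 array read out as a full frame has a picture size of 16 pixels, while a 2x2 window of the same array has a picture size of 4, consistent with a windowed frame having a relatively smaller picture size.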
Terminals of the present disclosure can capture frames of image data at a rate known as a frame rate. A typical frame rate is 60 frames per second (FPS) which translates to a frame time (frame period) of 16.6 ms. Another typical frame rate is 30 frames per second (FPS) which translates to a frame time (frame period) of 33.3 ms per frame. A frame rate of these terminals can be increased (and frame time decreased) by decreasing of a frame picture size.
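The frame rate and frame time figures above follow from a simple reciprocal relationship, sketched here for completeness:

```python
def frame_time_ms(frames_per_second):
    """Frame period in milliseconds for a given frame rate."""
    return 1000.0 / frames_per_second

# The two typical rates cited above: 60 FPS yields roughly a 16.6 ms
# frame period, and 30 FPS yields roughly 33.3 ms per frame.
t60 = frame_time_ms(60)
t30 = frame_time_ms(30)
```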
Various configurations of the pixel array 302 are possible. For example, the pixel array 302 may be a hybrid monochrome and color array, which can include a first subset of monochrome pixels 328 devoid of color filter elements and a second subset of color pixels 330 including color filter elements. The majority of pixels of the image sensor array can be monochrome pixels of the first subset. Color sensitive pixels of the second subset are at spaced apart positions and can be uniformly or substantially uniformly distributed throughout the image sensor array. Color sensitive pixels may be distributed in the array in a specific pattern of uniform distribution such as a period of P=4 where, for every fourth row of pixels of the array, every fourth pixel is a color sensitive pixel as shown in
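The P=4 uniform distribution described above can be sketched as a simple predicate over pixel coordinates. The zero phase offset (pattern starting at row 0, column 0) is an assumption for illustration.

```python
def is_color_pixel(row, col, period=4):
    """True when (row, col) holds a color-sensitive pixel under the
    P=4 distribution described above: in every fourth row, every
    fourth pixel is color sensitive (phase offsets assumed zero).
    """
    return row % period == 0 and col % period == 0

# In a 16x16 array, 1 pixel in 16 is color sensitive; the remaining
# majority are monochrome pixels of the first subset.
color_count = sum(
    is_color_pixel(r, c) for r in range(16) for c in range(16)
)
```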
In one embodiment, the spaced apart color pixels of the image sensor 300 can follow a pattern according to a Bayer pattern. As
The aiming bank 410 can comprise various devices. These devices generate light beams, and in one example, the light beams are in the form of laser light and/or light of sufficient coherence to project long distances outward from the imaging module 400. For one embodiment, the aiming bank 410 emits light that can impinge on objects at a distance of at least about 5 m away. The light may facilitate alignment of the optics in the optical imaging assembly 402 with the target decodable indicia. In particular, focused light of, e.g., the laser, may facilitate alignment of the portions of the optics with the higher magnification (e.g., higher relative prescribed levels of magnification). In one example, configuration of the imaging module 400 associates the position of the aiming bank 410 with the position (and optical properties) of the optics in the optical imaging assembly 402. This association permits proper alignment of the optics with the decodable indicia when the end user directs the light that the aiming bank 410 generates onto the target decodable indicia. For example, positioning the light on the decodable indicia may ensure that the decodable indicia is in position with the portion of the FOV that provides higher or better magnification.
The imaging module 400 is found in terminals and registers such as the terminal 100 (
Exemplary devices that can be used for devices of the user input interface 504 are generally discussed immediately below. Each of these is implemented as part of, and often integrated into, the hand held housing 502 so as to permit an operator to input one or more operator initiated commands. These commands may specify and/or activate certain functions of the indicia reading terminal. They may also initiate certain ones of the applications, drivers, and other executable instructions so as to cause the indicia reading terminal 500 to operate in an operating mode.
Devices that are used for the point controller 506 are generally configured so as to translate the operator initiated command into motion of a virtual pointer provided by a graphical user interface (“GUI”) of the operating system of the indicia reading terminal 500. The point controller 506 can include devices such as a thumbwheel, a roller ball, and a touch pad. In some other configurations, the devices may also include a mouse or other auxiliary device that is connected to the indicia reading terminal 500 by way of, e.g., wired or wireless communication technology.
Implementation of the keyboard 508 can be provided using one or more buttons, which are presented to the operator on the hand held housing 502. The touch panel 510 may supplement, or replace the buttons of the keyboard 508. For example, one of the GUIs of the operating system may be configured to provide one or more virtual icons for display on, e.g., the display 516, or as part of another display device on, or connected to the indicia reading terminal 500. Such virtual icons (e.g., buttons, and slide bars) are configured so that the operator can select them, e.g., by pressing or selecting the virtual icon with a stylus (not shown) or a finger (not shown).
The virtual icons can also be used to implement the trigger 512. On the other hand, other devices for use as the trigger 512 may be supported within, or as part of the hand held housing 502. These include, but are not limited to, a button, a switch, or a similar type of actionable hardware that can be incorporated into the embodiments of the indicia reading terminal 500. These can be used to activate one or more of the devices of the portable data terminal, such as the bar code reader discussed below.
Displays of the type suited for use on the indicia reading terminal 500 are generally configured to display images, data, and GUIs associated with the operating system and/or software (and related applications) of the indicia reading terminal 500. The displays can include, but are not limited to, LCD displays, plasma displays, and LED displays, among many others and combinations thereof. Although preferred construction of the indicia reading terminal 500 will include devices that display data (e.g., images, and text) in color, the display that is selected for the display 516 may also display this data in monochrome (e.g., grayscale). It may also be desirable that the display 516 is configured to display the GUI, and in particular configurations of the indicia reading terminal 500, the display 516 may have an associated interactive overlay, like a touch screen overlay on touch panel 510. This permits the display 516 to be used as part of the GUI so as to permit the operator to interact with the virtual icons, the buttons, and other implements of the GUI to initiate the operator initiated commands, e.g., by pressing on the display 516 and/or the touch panel 510 with the stylus (not shown) or a finger (not shown).
The hand held housing 502 can be constructed so that it has a form, or “form factor” that can accommodate some, or all of the hardware and devices mentioned above, and discussed below. The form factor defines the overall configuration of the hand held housing 502. Suitable form factors that can be used for the hand held housing 502 include, but are not limited to, cell phones, mobile telephones, personal digital assistants (“PDA”), as well as other form factors that are sized and shaped to be held, cradled, and supported by the operator, e.g., in the operator's hand(s) as a gun-shaped device. One exemplary form factor is illustrated in the embodiment of the indicia reading terminal 500 that is illustrated in the present
The noted circuit components 602, 610, 612, and 614 can be packaged into an image sensor integrated circuit 616. In one example, image sensor integrated circuit 616 can be provided by an MT9V022 image sensor integrated circuit available from Micron Technology, Inc. In another example, image sensor integrated circuit 616 can incorporate a Bayer pattern filter. In such an embodiment, CPU 618, prior to subjecting a frame to further processing, can interpolate pixel values at positions intermediate of green pixel positions for development of a monochrome frame of image data. In other embodiments, red and/or blue pixel values can be utilized for the monochrome image data.
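A deliberately simple sketch of deriving a monochrome frame from the green pixels of a Bayer-filtered frame follows. The horizontal-mean interpolation and the GRBG phase are assumptions for illustration; CPU 618 as described above may use any interpolation scheme.

```python
def monochrome_from_green(bayer, pattern=("GR", "BG")):
    """Build a monochrome frame from the green pixels of a Bayer frame.

    Non-green positions are filled with the mean of their horizontal
    green neighbours -- a deliberately simple interpolation.  bayer is
    a list of rows of pixel values; pattern gives the assumed 2x2
    Bayer phase (green at even-even and odd-odd positions here).
    """
    rows, cols = len(bayer), len(bayer[0])

    def is_green(r, c):
        return pattern[r % 2][c % 2] == "G"

    out = []
    for r in range(rows):
        row = []
        for c in range(cols):
            if is_green(r, c):
                row.append(float(bayer[r][c]))   # keep green value as-is
            else:
                neighbours = [
                    bayer[r][c2] for c2 in (c - 1, c + 1)
                    if 0 <= c2 < cols and is_green(r, c2)
                ]
                row.append(sum(neighbours) / len(neighbours))
        out.append(row)
    return out
```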
In the course of operation of terminal 600, image signals can be read out of image sensor 602, converted, and stored in a system memory such as RAM 620. A memory 622 of terminal 600 can include RAM 620, a nonvolatile memory such as EPROM 624, and a storage memory device 626 such as may be provided by a flash memory or a hard drive memory. In one embodiment, terminal 600 can include CPU 618, which can be adapted to read out image data stored in memory 622 and subject such image data to various image processing algorithms. Terminal 600 can include a direct memory access unit (DMA) 628 for routing image information read out from image sensor 602 that has been subject to conversion to RAM 620. In another embodiment, terminal 600 can employ a system bus providing for a bus arbitration mechanism (e.g., a PCI bus), thus eliminating the need for a central DMA controller. A skilled artisan would appreciate that other embodiments of the system bus architecture and/or direct memory access components providing for efficient data transfer between the image sensor 602 and RAM 620 are within the scope and the spirit of the invention.
Referring to further aspects of terminal 600, terminal 600 can include an imaging lens assembly 630 for focusing an image of a form barcode 632 located within a field of view 634 on a substrate 636 onto image sensor array 604. Imaging light rays can be transmitted about an optical axis 640. The imaging lens assembly 630 can be adapted to be capable of multiple focal lengths and/or multiple best focus distances.
Terminal 600 can also include an illumination pattern light source bank 642 for generating an illumination pattern 644 substantially corresponding to the field of view 634 of terminal 600, and an aiming pattern light source bank 646 for generating an aiming pattern 648 on substrate 636. In use, terminal 600 can be oriented by an operator with respect to a substrate 636 bearing the form barcode 632 in such manner that aiming pattern 648 is projected on the form barcode 632. In the example of
Each of illumination pattern light source bank 642 and aiming pattern light source bank 646 can include one or more light sources. The imaging lens assembly 630 can be controlled with use of lens assembly control circuit 650 and the illumination assembly comprising illumination pattern light source bank 642 and aiming pattern light source bank 646 can be controlled with use of illumination assembly control circuit 652. Lens assembly control circuit 650 can send signals to the imaging lens assembly 630, e.g., for changing a focal length and/or a best focus distance of imaging lens assembly 630. This can include for example providing a signal to the piezoelectric actuator to change the position of the variable position element of the focus element discussed above. Illumination assembly control circuit 652 can send signals to illumination pattern light source bank 642, e.g., for changing a level of illumination output by illumination pattern light source bank 642.
Terminal 600 can also include a number of peripheral devices such as display 654 for displaying such information as image frames captured with use of terminal 600, keyboard 656, pointing device 658, and trigger 660 which may be used to make active signals for activating frame readout and/or certain decoding processes. Terminal 600 can be adapted so that activation of trigger 660 activates one such signal and initiates a decode attempt of the form barcode 632.
Terminal 600 can include various interface circuits for coupling several of the peripheral devices to system address/data bus (system bus) 662, for communication with CPU 618 also coupled to system bus 662. Terminal 600 can include interface circuit 664 for coupling image sensor timing and control circuit 614 to system bus 662, interface circuit 668 for coupling the lens assembly control circuit 650 to system bus 662, interface circuit 670 for coupling the illumination assembly control circuit 652 to system bus 662, interface circuit 672 for coupling the display 654 to system bus 662, and interface circuit 676 for coupling the keyboard 656, pointing device 658, and trigger 660 to system bus 662.
In a further aspect, terminal 600 can include one or more I/O interfaces 673, 680 for providing communication with external devices (e.g., a cash register server, a store server, an inventory facility server, a peer terminal, a local area network base station, a cellular base station, etc.). I/O interfaces 673, 680 can be interfaces of any combination of known computer interfaces, e.g., Ethernet (IEEE 802.3), USB, IEEE 802.11, Bluetooth, CDMA, GSM, IEEE 1394, RS232 or any other computer interface.
Referring now to
Another exemplary embodiment of a method 800 is found in
The method 800 can also comprise, at block 812, determining whether the decoding of the decodable indicia was successful. If the decode is not successful, the method 800 may continue, at block 814, applying a distortion correction and thereafter, at block 810 (if a timeout is not satisfied at block 815), attempting to decode again. On the other hand, if the decode is successful, the method 800 may continue, at block 816, performing a secondary operation, which may include additional processing steps, data transmission steps, and/or other steps that are consistent with exemplary terminals and devices the present disclosure describes.
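The decode-retry flow of blocks 810 through 816 can be sketched as a simple loop. The callables and the attempt counter (standing in for the timeout test of block 815) are illustrative assumptions; the disclosure does not prescribe a particular control structure.

```python
def attempt_decode_with_correction(frame, decode, correct, max_attempts=3):
    """Sketch of the loop over blocks 810-816 described above.

    decode(frame) returns decoded data or None; correct(frame) applies
    the distortion correction.  A bounded attempt counter stands in
    for the timeout test of block 815.
    """
    for _ in range(max_attempts):
        result = decode(frame)       # block 810: attempt to decode
        if result is not None:       # block 812: decode successful?
            return result            # block 816: secondary operation follows
        frame = correct(frame)       # block 814: apply distortion correction
    return None                      # block 815: timeout satisfied
```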
The search for the decodable indicia can include, in one example, a highest-priority search for decodable indicia in an area of a captured frame corresponding to the area of greatest magnification, e.g., a center of a frame of image data. As the disclosure explains above, this area often corresponds to the position where data respecting high-density decodable indicia is found. Thus, in one embodiment, processing of data improves because the processing can be prioritized to initially identify and decode representations of decodable indicia of a certain region, e.g., a center of a frame of image data. Windowing and other techniques to interrogate the pixels of the image sensor may help to determine if the decodable indicia is present. When the decodable indicia is not located during this initial step, however, the terminal can continue with windowing or selective processing of pixels and/or areas of the pixel array. For example, the terminal can change the location and/or position of the window, process the data corresponding to those windowed pixels, and so on. These techniques can save valuable processing time. That is, rather than processing all of the data associated with all of the pixels of the pixel array, windowing processes a fraction or finite set of data at one or more locations corresponding to the pixel array.
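The prioritized, center-first search described above can be sketched as iterating over candidate windows in priority order. The window tuples and callable are illustrative assumptions; any window geometry and search routine could be substituted.

```python
def prioritized_search(frame, windows, find_indicia):
    """Search candidate windows in priority order, center region first.

    frame is a list of rows of pixel values; windows is an ordered list
    of (row0, row1, col0, col1) tuples, highest priority first, e.g.,
    the center of the frame where magnification is greatest;
    find_indicia(subimage) returns a decode result or None.
    """
    for r0, r1, c0, c1 in windows:
        sub = [row[c0:c1] for row in frame[r0:r1]]
        hit = find_indicia(sub)
        if hit is not None:
            return hit      # found without processing remaining windows
    return None
```

Only when the highest-priority (center) window yields nothing does the search move on to other windows, so only a fraction of the pixel data is processed in the common case.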
In view of the foregoing, terminals of the present disclosure have variable and/or non-uniform magnification. The difference in magnification may correspond to different prescribed levels of magnification such as according to a known curve (e.g., curve 202) or other known manner. The prescribed levels accommodate high density decodable indicia and other types of symbology that may be difficult to image and process. The terminal is operative to address image distortion that may exist as a result of the variable magnification designed into the lens system.
Where applicable it is contemplated that numerical values, as well as other values that are recited herein are modified by the term “about”, whether expressly stated or inherently derived by the discussion of the present disclosure. As used herein, the term “about” defines the numerical boundaries of the modified values so as to include, but not be limited to, tolerances and values up to, and including the numerical value so modified. That is, numerical values can include the actual value that is expressly stated, as well as other values that are, or can be, the decimal, fractional, or other multiple of the actual value indicated, and/or described in the disclosure.
While the present invention has been particularly shown and described with reference to certain exemplary embodiments, it will be understood by one skilled in the art that various changes in detail may be effected therein without departing from the spirit and scope of the invention as defined by claims that can be supported by the written description and drawings. Further, where exemplary embodiments are described with reference to a certain number of elements it will be understood that the exemplary embodiments can be practiced utilizing either less than or more than the certain number of elements.