1. Field of Invention
The present invention relates generally to digital image capturing and processing systems capable of reading bar code symbols and other graphical indicia in retail point-of-sale (POS) and other demanding environments.
2. Brief Description of the State of Knowledge in the Art
The use of bar code symbols for product and article identification is well known in the art. Presently, various types of bar code symbol scanners have been developed for reading bar code symbols at retail points of sale (POS). In general, these bar code symbol readers can be classified into two (2) distinct classes.
The first class of bar code symbol reader uses a focused light beam, typically a focused laser beam, to sequentially scan the bars and spaces of a bar code symbol to be read. This type of bar code symbol scanner is commonly called a “flying spot” scanner as the focused laser beam appears as “a spot of light that flies” across the bar code symbol being read. In general, laser bar code symbol scanners are sub-classified further by the type of mechanism used to focus and scan the laser beam across bar code symbols.
The second class of bar code symbol readers simultaneously illuminate all of the bars and spaces of a bar code symbol with light of a specific wavelength(s) in order to capture an image thereof for recognition and decoding purposes.
The majority of laser scanners in the first class employ lenses and moving (i.e. rotating or oscillating) mirrors and/or other optical elements in order to focus and scan laser beams across bar code symbols during code symbol reading operations. Examples of hand-held laser scanning bar code readers are described in U.S. Pat. Nos. 7,007,849 and 7,028,904, each incorporated herein by reference in its entirety. Examples of laser scanning presentation bar code readers are described in U.S. Pat. No. 5,557,093, incorporated herein by reference in its entirety. Other examples of bar code symbol readers using multiple laser scanning mechanisms are described in U.S. Pat. No. 5,019,714, incorporated herein by reference in its entirety.
In demanding retail environments, such as supermarkets and high-volume department stores, where high checkout throughput is critical to achieving store profitability and customer satisfaction, it is common for laser scanning bar code reading systems to have both bottom and side-scanning windows to enable highly aggressive scanner performance. In such systems, the cashier need only drag a bar coded product past these scanning windows for the bar code thereon to be automatically read with minimal assistance of the cashier or checkout personnel. Such dual scanning window systems are typically referred to as “bioptical” laser scanning systems as such systems employ two sets of optics disposed behind the bottom and side-scanning windows thereof. Examples of polygon-based bioptical laser scanning systems are disclosed in U.S. Pat. Nos. 4,229,588; 4,652,732 and 6,814,292; each incorporated herein by reference in its entirety.
Commercial examples of bioptical laser scanners include: the PSC 8500 (6-sided) and PSC 8100/8200 (5-sided) laser-based scanning systems by PSC Inc.; the NCR 7876 (6-sided) and NCR 7872 (5-sided) laser-based scanning systems by NCR; and the MS232x Stratos® H and MS2122 Stratos® E (6-sided) and MS2200 Stratos® S (5-sided) laser-based scanning systems by Metrologic Instruments, Inc.
In general, prior art bioptical laser scanning systems are more aggressive than conventional single scanning window systems. However, while they represent a technological advance over most single scanning window systems, prior art bioptical scanning systems suffer from various shortcomings and drawbacks. In particular, their scanning coverage and performance are not optimized, and they require cashier-assisted operation. These systems are also expensive to manufacture by virtue of the large number of optical components presently required to construct such laser scanning systems. Further, they require heavy and expensive motors which consume significant amounts of electrical power and generate significant amounts of heat.
In the second class of bar code symbol readers, early forms of linear imaging scanners were commonly known as CCD scanners because they used CCD image detectors to detect images of the bar code symbols being read. Examples of such scanners are disclosed in U.S. Pat. Nos. 4,282,425, and 4,570,057; each incorporated herein by reference in its entirety.
In Applicants' WIPO Publication No. WO 2005/050390, entitled “Hand-Supportable Imaging-Based Bar Code Symbol Reader Supporting Narrow-Area And Wide-Area Modes Of Illumination And Image Capture”, incorporated herein by reference, a detailed history of hand-held imaging-based bar code symbol readers is provided, explaining the many problems that had to be overcome to make imaging-based scanners competitive against laser-scanning based bar code readers. Metrologic Instruments' Focus® Hand-Held Imager is representative of an advance in the art which has overcome such historical problems. An advantage of 2D imaging-based bar code symbol readers is that they are omni-directional, by nature of image capturing and processing based decode processing software that is commercially available from various vendors.
U.S. Pat. No. 6,766,954 to Barkan et al. proposes a combination of linear image sensing arrays in a hand-held unit to form an omni-directional imaging-based bar code symbol reader. However, this hand-held imager has limited application to 1D bar code symbols, and is extremely challenged in reading 2D bar code symbologies at POS applications.
WIPO Publication No. WO 2005/050390 by Metrologic Instruments Inc., incorporated herein by reference, discloses POS-based digital imaging systems that are triggered to illuminate objects with fields of visible illumination from LED arrays upon the automatic detection of objects within the field of view of such systems using IR-based object detection techniques, and then to capture and process digital images thereof so as to read bar code symbols graphically represented in the captured images.
US Patent Publication No. 2006/0180670 to PSC Scanning, Inc. discloses digital imaging systems for use at the point of sale (POS), which are triggered to illuminate objects with visible illumination upon the detection thereof using IR-based object detection techniques.
U.S. Pat. No. 7,036,735 to Hepworth et al. discloses an imaging-based bar code reader, in which both visible (i.e. red) and invisible (i.e. IR) light emitting diodes (LEDs) are driven at different illumination intensity levels during object illumination and image capture operations so as to achieve a desired brightness in captured images, while seeking to avoid discomfort to the user of the bar code reader.
Also, US Patent Publication No. 2006/0113386 to PSC Scanning, Inc. discloses methods of illuminating bar coded objects using pulses of LED-based illumination at a rate in excess of the human flicker fusion frequency, synchronized with the exposures of a digital imager, and even at different wavelengths during sequential frame exposures of the imager. Similarly, the purpose of this approach is to be able to read bar code symbols printed on substrates having different kinds of surface reflectivity characteristics, with the added benefit of being less visible to the human eye.
However, despite the increasing popularity of area-type hand-held and presentation type imaging-based bar code symbol reading systems, and even with such proposed techniques for improved LED-based illumination of objects at POS and like imaging environments, such prior art systems still cannot compete with the performance characteristics of conventional laser scanning bi-optical bar code symbol readers at POS environments. Also, the very nature of digital imaging presents other problems which make the use of this technique very challenging in many applications.
For example, in high-speed image acquisition applications, as would be the case at a retail supermarket, a short exposure time would be desired to avoid motion blurring at the POS station. One known way of reducing the exposure time of the digital image detection array is to increase the intensity level of the illumination beam used to illuminate the object during illumination and imaging operations. However, at POS environments, the use of high intensity illumination levels is not preferred from the point of view of customers and cashiers alike, because high brightness levels typically cause discomfort and fatigue due to the nature of the human vision system and human perception processes.
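The exposure/blur trade-off described above can be quantified: the blur in a captured image, measured in pixels, is the speed at which the object's image sweeps across the sensor multiplied by the exposure time. The following sketch computes the longest permissible exposure for a given blur budget; the belt speed, pixel pitch, and magnification values are hypothetical and used only for illustration.

```python
def max_exposure_s(belt_speed_m_s, pixel_pitch_m, magnification, max_blur_px=1.0):
    """Longest exposure (seconds) that keeps motion blur below max_blur_px pixels."""
    # Speed of the object's image across the sensor, in pixels per second.
    image_speed_px_s = belt_speed_m_s * magnification / pixel_pitch_m
    return max_blur_px / image_speed_px_s

# Example: 0.3 m/s belt, 5 um pixels, 0.1x optical magnification.
t = max_exposure_s(0.3, 5e-6, 0.1)   # roughly 167 microseconds
```

Shortening the exposure below this bound then forces a proportional increase in illumination intensity to maintain image brightness, which is precisely the source of the discomfort problem noted above.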
And while it is known that IR illumination can be used to form and detect digital images of bar coded labels, the use of infrared illumination degrades the image contrast quality when bar codes are printed on thermal printing paper. Consequently, low contrast images significantly slow down imaging-based barcode decoding operations, making such operations very challenging, if not impossible, at times.
In Applicants' WIPO Publication No. WO 2002/043195, entitled “Planar Laser Illumination And Imaging (PLIIM) Systems With Integrated Despeckling Mechanisms Provided Therein”, incorporated herein by reference, Applicants address the issues of using laser illumination in diverse kinds of digital imaging barcode reading systems, including PLIIM-based digital imaging tunnel systems, namely, the inherent problem of optical noise generated by laser speckles in detected digital images. Such speckle pattern noise, as it is often called, is caused by random interferences generated by a rough paper surface, ultimately producing signal variations of the order of size of the bars and spaces of the barcode, resulting in inaccurate imaging and poor decoding. Reduction of this noise is highly desirable.
In WIPO Publication No. WO 2008/011067 entitled “Digital Image Capture And Processing Systems For Supporting 3D Imaging Volumes In Retail Point-Of-Sale Environments”, incorporated herein by reference, Applicants disclose a variety of digital image capture and processing systems and methods for generating and projecting coplanar illumination and imaging planes and/or coextensive area-type illumination and imaging zones, through one or more imaging windows, and into a 3D imaging volume in retail POS environments. Also, Applicants disclose the use of automatic object motion and/or velocity detection, real-time image analysis and other techniques to capture and process high-quality digital images of objects passing through the 3D imaging volume, and to intelligently control and/or manage the use of visible and invisible forms of illumination, during object illumination and imaging operations, that might otherwise annoy or disturb human operators and/or customers working and/or shopping in such retail environments.
U.S. Pat. No. 7,161,688 to Bonner, et al. discloses a mass-transport type of image-based package identification and dimensioning system that provides dimensioning information about, and machine readable codes (i.e. identification information) from, packages, either singulated or non-singulated, passing along a conveyor belt across a data capture point. As disclosed, the resulting data can be used to determine, for example, package dimensions, package coordinates, dimension confidence, package classification, and content and coordinates of the machine readable code. The dimensioning information is correlated with the machine readable code to form one record. Subsequent processes can access the record from all or part of the captured machine readable information to retrieve package dimension information.
Also, U.S. Pat. No. 6,330,973 to Bridgelall, et al. discloses a tunnel scanner employing a plurality of imaging or scanning modules pointed in various directions toward a target volume, seeking to increase the likelihood that a code symbol on an arbitrarily oriented object in the target volume will be read.
However, while prior digital imaging-based tunnel systems are known in the art, it has not been known how they might be designed to meet the particular needs of retail store environments, while enabling high throughput, minimizing illumination striking the eyes of cashiers, store employees and customers, providing a relatively small form factor to meet the spatial requirements of POS environments, and supporting retail self-checkout and cashier-assisted checkout operations, and the like.
Thus, there is a great need in the art for improved retail-oriented digital imaging-based tunnel systems that are capable of competing with conventional laser scanning bar code readers and high-speed POS-based imaging systems employed in demanding POS environments, and of providing the many advantages offered by imaging-based bar code symbol readers, while avoiding the shortcomings and drawbacks of such prior art systems and methodologies.
Accordingly, a primary object of the present invention is to provide an improved digital image capturing and processing apparatus for use in POS environments, which is free of the shortcomings and drawbacks of prior art laser scanning and digital imaging systems and methodologies.
Another object of the present invention is to provide such a digital image capturing and processing apparatus in the form of an omni-directional tunnel-type digital imaging-based system that employs advanced coplanar illumination and imaging, and package identification, dimensioning and weighing technologies, to support automated self-checkout and cashier-assisted checkout operations in demanding retail store environments.
Another object of the present invention is to provide such a tunnel-type digital imaging-based system, comprising a plurality of coplanar illumination and imaging subsystems (i.e. subsystems), generating a plurality of coplanar light illumination beams and fields of view (FOVs), that are projected through and intersect above an imaging window to generate a complex of linear-imaging planes within a 3D imaging volume for omni-directional imaging of objects passed therethrough.
Another object of the present invention is to provide such a tunnel-type digital imaging-based system in the form of a tunnel-type digital imaging-based system for use in retail point-of-sale environments, having omni-directional 3D imaging capabilities for automatically identifying objects such as consumer products, during self-checkout and cashier-assisted checkout operations.
Another object of the present invention is to provide such a tunnel-type digital imaging-based system, wherein each said coplanar illumination and imaging subsystem (i.e. subsystem) comprises a linear digital imaging engine, having independent near and far field of view (FOV) light collection optics focused onto separate segmented regions of a linear image sensing array, so as to improve the field of view and depth of field of each coplanar illumination and imaging subsystem.
Another object of the present invention is to provide such a tunnel-type digital imaging-based system, comprising a plurality of coplanar illuminating and linear imaging modules, having dual-FOV light collection optics, arranged about and supporting a 3D imaging volume above a conveyor belt surface at a retail checkout station.
Another object of the present invention is to provide such a tunnel-type digital imaging-based system having integrated automatic package profiling/dimensioning and weighing capabilities, to accurately determine package identification and ensure proper purchase at self-checkout counters in retail store environments.
Another object of the present invention is to provide such a tunnel-type digital imaging-based system which is integrated with a checkout computer system having a magnetic-stripe or RF-ID card reader, visual display, keyboard, printer, and cash/coin handling subsystem, in a compact housing that mounts about a conveyor belt system under the control of the self-checkout system of the present invention.
Another object of the present invention is to provide a tunnel-type digital imaging-based system capable of generating and projecting coplanar illumination and imaging planes into a 3D imaging volume within a tunnel structure.
Another object of the present invention is to provide such a tunnel-type digital imaging-based system, wherein automatic package identification, profiling/dimensioning, weighing and tracking techniques are employed during self-checkout operations, to reduce checkout inaccuracies and possible theft during checkout operations.
Another object of the present invention is to provide such a tunnel-type digital imaging-based system, wherein the plurality of coplanar light illumination beams can be generated by an array of coherent or incoherent light sources.
Another object of the present invention is to provide such a tunnel-type digital imaging-based system, wherein the array of coherent light sources comprises an array of visible laser diodes (VLDs).
Another object of the present invention is to provide such a tunnel-type digital imaging-based system, wherein the array of incoherent light sources comprises an array of light emitting diodes (LEDs).
Another object of the present invention is to provide such a tunnel-type digital imaging-based system, which is capable of reading (i) bar code symbols having bar code elements (i.e., ladder type bar code symbols) that are oriented substantially horizontal with respect to the imaging window, as well as (ii) bar code symbols having bar code elements (i.e., picket-fence type bar code symbols) that are oriented substantially vertical with respect to the imaging window.
Another object of the present invention is to provide such a tunnel-type digital imaging-based system, which comprises a plurality of coplanar illumination and imaging subsystems (i.e. subsystems), each of which produces a coplanar PLIB/FOV within predetermined regions of space contained within a 3-D imaging volume defined above the conveyor belt structure passing through the tunnel-type system.
Another object of the present invention is to provide such a tunnel-type digital imaging-based system, wherein each coplanar illumination and imaging subsystem comprises a planar light illumination module (PLIM) that generates a planar light illumination beam (PLIB) and a linear image sensing array and field of view (FOV) forming optics for generating a planar FOV which is coplanar with its respective PLIB.
Another object of the present invention is to provide such a tunnel-type digital imaging-based system, comprising a plurality of coplanar illumination and imaging subsystems, each employing a linear array of laser light emitting devices configured together, with a linear imaging array with substantially planar FOV forming optics, producing a substantially planar beam of laser illumination which extends in substantially the same plane as the field of view of the linear array of the subsystem, within the working distance of the 3D imaging volume.
Another object of the present invention is to provide such a tunnel-type digital imaging-based system, having an electronic weigh scale integrated with the system housing.
Another object of the present invention is to provide such a tunnel-type digital imaging-based system comprising a plurality of coplanar illumination and imaging subsystems, each employing an array of planar laser illumination modules (PLIMs).
Another object of the present invention is to provide such a tunnel-type digital imaging-based system, wherein such intelligent object presence detection, motion and trajectory detection includes the use of an imaging-based motion sensor, at each coplanar illumination and imaging subsystem, and having a field of view that is spatially aligned with at least a portion of the field of view of the linear image sensing array employed in the coplanar illumination and imaging subsystem.
Another object of the present invention is to provide such a tunnel-type digital imaging-based system, wherein the imaging-based motion sensor is used to determine the velocity of objects moving through the field of view (FOV) of a particular coplanar illumination and imaging subsystem, and automatically control the frequency at which pixel data, associated with captured linear images, is transferred out of the linear image sensing array and into buffer memory.
Another object of the present invention is to provide a tunnel-type digital imaging-based system employing a plurality of coplanar illumination and imaging subsystems, wherein each such subsystem includes a linear imaging module realized as an array of electronic image detection cells which is segmented into a first region onto which a near field of view (FOV) is focused by way of a near-type FOV optics, and a second region onto which a far field of view (FOV) is focused by way of a far-type FOV optics, to extend the field of view and depth of field of each such illumination and imaging subsystem.
Another object of the present invention is to provide such a tunnel-type digital imaging-based system employing a plurality of coplanar illumination and imaging subsystems, wherein each such subsystem includes a linear imaging module realized as an array of electronic image detection cells (e.g. CCD) having programmable integration time settings, responsive to the automatically detected velocity of an object being imaged, while moving along a conveyor belt structure, for enabling high-speed image capture operations.
Another object of the present invention is to provide a tunnel-type digital imaging-based system employing a plurality of coplanar illumination and imaging subsystems, wherein each such subsystem supports an independent image generation and processing channel that receives frames of linear (1D) images from the linear image sensing array and automatically buffers these linear images in video memory and automatically assembles these linear images to construct 2D images of the object taken along the field of view of the coplanar illumination and imaging plane associated with the subsystem, and then processes these images using exposure quality analysis algorithms, bar code decoding algorithms, and the like.
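By way of illustration, the assembly of buffered linear (1D) images into a 2D image, as described in the object above, amounts to stacking successive scan lines (one per increment of belt travel) as the rows of a 2D array. The sketch below shows this in minimal form; it is an illustrative model only, not the system's actual image-processing channel implementation.

```python
import numpy as np

def assemble_2d(linear_frames):
    """Stack successive 1D line images, captured as the object moves through
    the imaging plane, into a single 2D image (one scan line per row)."""
    return np.vstack([np.asarray(f, dtype=np.uint8).reshape(1, -1)
                      for f in linear_frames])

# Four 4-pixel scan lines become a 4x4 digital image ready for decode-processing.
img = assemble_2d([[0, 255, 0, 255]] * 4)
```

The resulting 2D array can then be handed to exposure quality analysis and 1D/2D bar code decoding routines operating on whole images.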
Another object of the present invention is to provide a tunnel-type digital imaging-based system capable of reading PDF and 2D bar codes on produce, eliminating keyboard entry and yielding productivity gains.
Another object of the present invention is to provide such a tunnel-type digital imaging-based system, wherein the 2D images produced from the multiple image generation and processing channels are managed by an image processing management processor programmed to optimize image processing flows.
Another object of the present invention is to provide such a tunnel-type digital imaging-based system which supports intelligent image-based object recognition processes that can be used to automate the recognition of objects such as produce and fruit in supermarket environments.
Another object of the present invention is to provide a tunnel-type digital imaging-based system having an integrated electronic weight scale, an RFID module, and modular support of wireless technology (e.g. BlueTooth and IEEE 802.11(g)).
Another object of the present invention is to provide a tunnel-type digital imaging-based system capable of reading bar code symbologies independent of bar code orientation.
Another object of the present invention is to provide a tunnel-type digital imaging-based system having a 5 mil read capability.
Another object of the present invention is to provide a tunnel-type digital imaging-based system having an integrated Sensormatic® RFID tag deactivation device, and an integrated Checkpoint® EAS antenna, for automatically deactivating RFID tags on packages as they are transported through and exit the tunnel system.
Another object of the present invention is to provide a tunnel-type digital imaging-based system that can address the needs of the supermarket/hypermarket and grocery store market segment.
Another object of the present invention is to provide a tunnel-type digital imaging-based system having a performance advantage that leads to quicker customer checkout times and productivity gains that cannot be matched by conventional high-speed bi-optic laser scanners.
Another object of the present invention is to provide such a tunnel-type digital imaging-based system, which may also employ one or more coextensive area-type illumination and imaging subsystems, each generating an area-type illumination beam and field of view (FOV), which forms a coextensive illumination and imaging zone that is projected through and intersects above the conveyor belt structure, within a 3D imaging volume for digital imaging of objects passed therethrough.
Another object of the present invention is to provide such a POS-centric tunnel-type digital imaging-based system, which further comprises a plurality of area-type illumination and imaging subsystems, an image processing subsystem, a control subsystem, an I/O subsystem, an object recognition subsystem, a cashier's sales terminal and a customer transaction terminal.
Another object of the present invention is to provide such a POS-centric tunnel-type digital imaging-based system, having a tunnel housing architecture allowing more open and aesthetically pleasing industrial designs required by particular retail store environments, and the like.
These and other objects of the present invention will become apparent hereinafter and in the Claims to Invention.
In order to more fully understand the Objects of the Present Invention, the following Detailed Description of the Illustrative Embodiments should be read in conjunction with the accompanying figure Drawings in which:
FIG. 6A1 is a perspective view showing the LADAR-based detection/profiling/dimensioning subsystems, that are integrated within the upper DIP, generating a pair of AM-laser beams at the input and output ports of the tunnel structure, for object profiling/dimensioning purposes;
FIG. 6A2 is a schematic representation of a spatial height (profile) map captured, at time instant t=T1, by each laser-based object detection/profiling/dimensioning subsystem of FIG. 6A1, disposed above the conveyor belt of the tunnel system in
FIG. 6B1 is a schematic representation of the digital tunnel system of the present invention having a triangulation-based detection/profiling/dimensioning subsystem integrated into its upper DIP, in lieu of each LADAR-based detection/profiling/dimensioning subsystem of FIG. 6A1 employed in the illustrative embodiment of
FIG. 6B2 is a flow chart describing the triangulation-based image processing method employed in the triangulation-based detection/profiling/dimensioning subsystem of FIG. 6B1;
Referring to the figures in the accompanying Drawings, the various illustrative embodiments of the illumination and imaging apparatus and the methodologies of the present invention will be described in greater detail, wherein like elements will be indicated using like reference numerals.
In the illustrative embodiments, the illumination and imaging apparatus of the present invention is realized in the form of an advanced, omni-directional tunnel-type digital image capturing and processing system 1 that can be deployed in various application environments, including but not limited to retail point of sale (POS) subsystems 1, as shown in
In general, the complex of coplanar illumination and imaging subsystems 4A through 4F are arranged about the conveyor belt structure subsystem 24B in the tunnel system to capture digital linear (1D) or narrow-area images along the fields of view (FOVs) of their respective coplanar illumination and imaging planes, using laser or LED-based illumination, depending on the tunnel system design and implementation. These captured digital images are then buffered and decode-processed using linear (1D) type image capturing and processing based bar code reading algorithms, or can be assembled together to reconstruct 2D images for decode-processing using 1D/2D image processing based bar code reading techniques, as taught in Applicants' U.S. Pat. No. 7,028,899 B2, incorporated herein by reference.
Referring to
As shown in
As shown in
As shown in
As shown in
The object detection/profiling/dimensioning subsystem in the upper DIP can be implemented in a variety of different ways.
In FIGS. 6A1 and 6A2, the object detection/profiling/dimensioning beam is an AM-laser beam functioning in a LADAR-based package profiling and dimensioning subsystem shown and described in International Publication No. WO 02/43195 A2, incorporated herein by reference in its entirety. In this embodiment, the LADAR-based detection/profiling/dimensioning subsystems 20′ are integrated within the upper DIP and generate a pair of AM-laser beams at the input and output ports of the tunnel structure, for object profiling/dimensioning purposes. As indicated in FIG. 6A2, these subsystems automatically generate a spatial height (profile) map captured at time instant t=T1. Notably, these spatial height values correspond to the height profile of object(s) supported on the conveyor belt during transport through the tunnel system, and are used to compute object dimensions through real-time computation within the object detection/profiling/dimensioning subsystem, or other suitably programmed processor in the tunnel system.
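The ranging principle underlying an AM-laser (phase-shift) LADAR subsystem of the kind referenced above can be sketched as follows: the amplitude-modulated beam returns from the target with a phase shift proportional to the round-trip distance, and the object's height is the overhead sensor height minus the measured range. This is a generic illustration of phase-shift ranging, not the specific implementation of WO 02/43195 A2; the modulation frequency and mounting height are hypothetical.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def range_from_phase(phase_rad, mod_freq_hz):
    """One-way range from the round-trip AM phase shift.
    Unambiguous only within c / (2 * mod_freq_hz)."""
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

def object_height(sensor_height_m, phase_rad, mod_freq_hz):
    """Height of the object above the belt, for a downward-looking sensor."""
    return sensor_height_m - range_from_phase(phase_rad, mod_freq_hz)
```

Sampling such height values across the beam's sweep, as the belt advances, yields the spatial height (profile) map of FIG. 6A2.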
In FIGS. 6B1 and 6B2, the object detection/profiling/dimensioning beam is a planar light illumination beam (e.g. structured light generated from one or more VLDs or LEDs) functioning in a triangulation-based package profiling/dimensioning subsystem. As indicated in FIG. 6B1, a triangulation-based detection/profiling/dimensioning subsystem 20″ is integrated into the upper DIP 7C, in lieu of each LADAR-based detection/profiling/dimensioning subsystem of FIG. 6A1. In this illustrative embodiment, the triangulation-based detection/profiling/dimensioning subsystem comprises: (i) a planar illumination module (PLIM) 21 employing one or more VLDs or LEDs, for generating and projecting a planar light illumination beam (PLIB), i.e. a plane of structured light, towards the conveyor belt carrying one or more objects into the tunnel system, as illustrated in FIG. 6A1; (ii) area-type 2D imaging engine (i.e. camera) 22 for capturing digital 2D images of objects being transported through the tunnel by the conveyor belt; and (iii) a digital image processor 23 for processing sequences of digital images in order to compute height profile and dimension information about each such object transported through the tunnel system, using the triangulation-based calculation method described in FIG. 6B2.
As indicated in FIG. 6B2, the method employed in the triangulation-based detection/profiling/dimensioning subsystem of FIG. 6B1 comprises a number of primary steps: (a) supplying the digital processor associated with the profiling and dimensioning subsystem with the following input parameters: specifications on the FOV of the 2D imaging engine (i.e. camera), the position of the camera, the relative position of the planar illumination beam (i.e. light curtain) and the camera, and the conveyor belt speed (which should be maintained relatively constant); (b) projecting a bright planar illumination beam (PLIB) onto one or more objects as the objects are being transported through the tunnel system; (c) capturing and buffering 2D digital images of the illuminated objects during object transport; (d) processing the buffered digital images and tracking the image of the bright planar illumination beam (PLIB) projected onto the objects, and calculating the height, width and depth of the objects being transported through the tunnel system; and (e) analyzing consecutive digital images, recognizing the outline of objects graphically represented in the digital images, and then combining acquired geometrical information to compute 3D volumetric information regarding objects, as they are being transported through the tunnel system.
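The height calculation at step (d) above rests on standard laser-line triangulation geometry: when the projected light plane is tilted relative to the camera's line of sight, an object of nonzero height shifts the imaged stripe sideways by an amount proportional to that height. The sketch below illustrates this relation under simplifying assumptions (camera looking straight down, light plane tilted from vertical); all parameter values are hypothetical, and the actual subsystem geometry may differ.

```python
import math

def height_from_stripe_shift(shift_px, pixel_pitch_m, magnification, beam_angle_rad):
    """Laser-line triangulation: convert the stripe's shift in the image
    into object height. Assumes the camera looks straight down and the
    light plane is tilted beam_angle_rad from vertical."""
    # Back-project the image-plane shift onto the belt plane.
    shift_on_belt_m = shift_px * pixel_pitch_m / magnification
    # The stripe moves tan(angle) meters per meter of object height.
    return shift_on_belt_m / math.tan(beam_angle_rad)

# A 100-pixel stripe shift, 5 um pixels, 0.1x magnification, 45-degree beam.
h = height_from_stripe_shift(100, 5e-6, 0.1, math.pi / 4)   # 5 mm of height
```

Repeating this per image column per frame, with the belt speed supplied at step (a) providing the along-belt coordinate, yields the height profile from which width, depth and 3D volumetric information are derived at step (e).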
Referring now to
As shown in
As shown in
When coherent illumination sources, such as VLDs, are used to implement a linear illumination array, despeckling techniques as taught in WIPO Publication No. 2002/43195 A2 and WIPO Publication No. 2008/011067, both incorporated herein by reference, can be practiced to reduce the spatial and/or temporal coherence of such illumination sources.
Also, the high-speed motion/velocity detection subsystem 80 can be realized employing any of the motion/velocity detection techniques detailed hereinabove so as to provide real-time motion and velocity data to the local control subsystem 75 for processing and automatic generation of control data that is used to control the illumination and exposure parameters of the linear image formation and detection system within the subsystem. Alternatively, the motion/velocity detection subsystem 80 can be deployed outside of the illumination and imaging subsystem, in a globally positioned arrangement.
During tunnel system operation, the local control subsystem (i.e. microcontroller) 75 receives object velocity data from either a conveyor belt tachometer 27 or other data source, and generates control data for optimally controlling the planar illumination arrays 71A, 71B and/or the clock frequency in the linear image sensing array 76 within the coplanar image formation and detection subsystem.
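The relationship between belt velocity and the controlled illumination/exposure parameters can be sketched as follows. This is a simplified model, not the disclosed control law; the one-pixel blur budget and all names are assumptions for illustration:

```python
def exposure_for_velocity(belt_speed_mm_s, pixel_size_mm, max_blur_px=1.0):
    """Longest exposure (seconds) that keeps motion blur under
    max_blur_px for an object moving at belt_speed_mm_s across pixels
    spanning pixel_size_mm on the belt plane."""
    return max_blur_px * pixel_size_mm / belt_speed_mm_s

def line_clock_for_velocity(belt_speed_mm_s, pixel_size_mm):
    """Linear-array line rate (Hz) so that consecutive captures are
    spaced one pixel apart along the belt, yielding square pixels in
    the reconstructed 2D image."""
    return belt_speed_mm_s / pixel_size_mm
```

For example, at 400 mm/s over 0.5 mm pixels, the sketch yields a 1.25 ms exposure ceiling and an 800 Hz line clock; this is why the tachometer feed lets the local control subsystem retune both illumination duration and sensor clock as the belt speed drifts.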
Referring to
As shown in
As shown in
As shown in the state diagram of
In an illustrative embodiment, the Main Task would carry out the basic object detection, management, tracking and correlation operations supported within the 3D imaging volume by each object detecting/profiling/dimensioning subsystem, and would be called and instantiated whenever one or more objects have been detected as entering the tunnel system by the object detecting/profiling/dimensioning subsystems supported in the upper DIP. The kinds of functions to be performed by the Main Task during the Active State are reflected in the package identification and dimension data element management, tracking, and correlation subsystem 60 schematically represented in
In
In
In the illustrative embodiment, the software-based data element management, tracking, and correlation subsystem 60 can be constructed in a manner similar to the data element management, tracking, and correlation subsystem (3950) shown in
Once the batch of products has been scanned through the retail tunnel system, the output from subsystem 60 (e.g. {Product ID, Dimensions, Weight}) is supplied to the software-based checkout subsystem 62 which has access to either a local or remote RDBMS storing retail price information about each UPC or UPC/EAN coded product, as well as information about each product's dimensions and weight. Also, the checkout subsystem 62 includes output displays such as a touchscreen LCD 40, hard copy printers 41, and electronic and cash payment systems 42.
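The price lookup performed by checkout subsystem 62 against its RDBMS can be sketched as below, using an in-memory SQLite database as a stand-in for the local or remote RDBMS. The table layout, column names, and sample data are hypothetical, chosen only for this sketch:

```python
import sqlite3

# In-memory stand-in for the retail price RDBMS (schema is illustrative).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE products (upc TEXT PRIMARY KEY, name TEXT, price REAL)")
db.execute("INSERT INTO products VALUES ('012345678905', 'Cereal', 4.99)")

def price_scanned_items(scanned):
    """scanned: list of {ProductID, Dimensions, Weight} dicts, as supplied
    by subsystem 60.  Returns (itemized (name, price) lines, total bill)."""
    lines, total = [], 0.0
    for item in scanned:
        row = db.execute("SELECT name, price FROM products WHERE upc = ?",
                         (item["ProductID"],)).fetchone()
        if row is None:
            continue  # an unresolved ID would be routed to exception handling
        name, price = row
        lines.append((name, price))
        total += price
    return lines, total
```

The same query shape extends naturally to the dimension and weight columns mentioned in the text; only the price path is shown here.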
In
As indicated at Block A in
Then, as indicated at Block B, the automatic package detecting/profiling/dimensioning subsystem at the input port automatically detects, profiles and dimensions each product as it enters the input port of the retail tunnel system, the data element management, tracking and correlation subsystem 60 generates a time-stamped package detection/dimension data element for each detected product, and then buffers the data element in the data element queues of the data element management, tracking and correlating subsystem.
As indicated at Block C, the in-motion package weighing subsystem 14 automatically detects the spatial-pressure distribution of each product as it is being transported through the retail tunnel system along the conveyor belt, and computes its equivalent weight value; the data element management, tracking and correlation subsystem 60 then generates a time-stamped product weight data element for each weighed product, and buffers the data element in the data element queues of the data element management, tracking and correlating subsystem.
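Converting a sampled spatial-pressure distribution into an equivalent weight value reduces to integrating pressure over the sensed area. A minimal sketch, assuming a uniform sensor-cell grid (names and units are illustrative):

```python
G = 9.81  # standard gravity, m/s^2

def weight_from_pressure_map(pressure_pa, cell_area_m2):
    """Equivalent mass (kg) of a product from its spatial-pressure
    distribution: total force = sum of (pressure * cell area) over all
    sensor cells; mass = force / g."""
    force_n = sum(pressure_pa) * cell_area_m2
    return force_n / G
```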
As indicated at Block D in
As indicated at Block E, the automatic package detecting/profiling/dimensioning subsystem 14 at the exit port automatically detects (and optionally, profiles and dimensions again) each product as it exits the retail tunnel system, the data element management, tracking and correlation subsystem 60 generates a time-stamped product detection/dimension data element for each redetected product, and the data element management, tracking and correlation subsystem buffers the data element in data queues of the data element management, tracking and correlating subsystem.
As indicated at Block F, the data element management, tracking and correlating subsystem 60 automatically analyzes and processes the data elements buffered in its data element queues, so as to correlate each identified product with its corresponding dimensions and weight, and generates a combined {product ID/dimensions/weight} data set for each product being scanned through the retail tunnel system.
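The queue-and-correlate operation of subsystem 60 can be sketched as follows. The data-element structure, the fixed time-window matching strategy, and all names below are assumptions for illustration; the subsystem's actual tracking and correlation logic may differ:

```python
from dataclasses import dataclass

@dataclass
class DataElement:
    kind: str         # "detect", "weight", or "id"
    timestamp: float  # seconds on the tunnel system clock
    payload: dict     # e.g. dimensions, weight value, or product ID

def correlate(queue, window_s=2.0):
    """For each buffered detection event, gather the weight and ID data
    elements time-stamped within window_s of it, and merge their payloads
    into one combined {product ID / dimensions / weight} data set."""
    combined = []
    for det in (e for e in queue if e.kind == "detect"):
        record = dict(det.payload)
        for e in queue:
            if e.kind != "detect" and abs(e.timestamp - det.timestamp) <= window_s:
                record.update(e.payload)
        combined.append(record)
    return combined
```

A fixed window is the simplest strategy; a deployed system would more likely propagate each detection along the belt using the measured conveyor velocity before matching.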
As indicated at Block G, the checkout subsystem 62 automatically compiles and displays the list of products scanned through the retail tunnel system, accesses retail product price information in a local or remote RDBMS, and computes a total bill for the products to be purchased, including itemized prices for the batch of products being checked out.
As indicated at Block H of
As indicated at Block I, in the event that the total weight of products/goods measured at Block H does correspond with the total weight of products measured by the retail tunnel system, then the consumer is provided the opportunity to make payment for the bill of products being checked out, and upon making payment, the checkout subsystem 62 generates a sales receipt as proof of full payment for the purchased bill of goods/products.
As indicated at Block J, in the event that the total weight of products/goods measured at Block H does not correspond with the total weight of products measured by the retail tunnel system, then the checkout subsystem 62 automatically generates an alarm or signal advising a retail store supervisor about such weight discrepancies.
As indicated at Block K, the retail store supervisor takes appropriate measures to rectify discrepancies in the measured weights of the batch of products during tunnel scanning and package weighing operations.
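The weight-reconciliation branch of Blocks H through J reduces to a single decision function. A minimal sketch, where the tolerance value and all names are assumptions for illustration:

```python
def reconcile_weights(tunnel_total_kg, checkout_total_kg, tolerance_kg=0.05):
    """Compare the total weight measured by the tunnel system against the
    total weight measured at the checkout station: accept payment when the
    two agree within tolerance (Block I), otherwise raise a supervisor
    alarm (Block J)."""
    if abs(tunnel_total_kg - checkout_total_kg) <= tolerance_kg:
        return "proceed_to_payment"
    return "alert_supervisor"
```

In practice the tolerance would scale with the number of items and the rated accuracy of the in-motion weighing subsystem, rather than being a fixed constant.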
The above described method of tunnel system operation is just one illustrative embodiment of how it can be programmed to operate to carry out diverse kinds of business objectives in demanding retail store environments.
In another mode of operation, the tunnel system can be used to transport batches of produce items through the tunnel system, and automatically recognize the type of produce being transported, weigh the produce batch, and compute the retail price thereof based on the current retail price list for produce items in the retail store.
In yet another illustrative embodiment, the tunnel system of the present invention can be provided with an external video camera trained on the customer during self-checkout operations, to capture video streams which can be watched remotely by retail store supervisors, security guards and the like.
In the first and second illustrative embodiments of the tunnel system of the present invention, the tunnel housing was shown to be of a substantially closed architecture, made from light opaque materials shielding internal illumination from being transmitted to the eyes of human cashiers and customers during checkout operations. Consequently, the tunnel housing generally appears like a shell or tunnel like structure having an input and an output port, with a conveyor belt structure passing therebetween. However, in such retail environments, it might be desired for the tunnel housing structure to be minimized, making it appear more “open”, while still supporting its basic components (e.g. PLIIM-based package identification subsystems, package weighing subsystems, and package detection/profiling/dimensioning subsystems) in arrangements that achieve automated package identification, dimensioning, weighing, tracking and correlation functions, in accordance with the principles of the present invention.
In such open-type tunnel housing architectures, without illumination shielding provided by the tunnel housing/enclosure, there is typically the need to either intelligently control illumination within the tunnel system, and/or use a combination of visible and invisible (i.e. IR) spectral illumination during tunnel operations. Various techniques for intelligently controlling illumination and spectral mixing are disclosed in great detail in Applicants' WIPO Publication No. WO 2008/011067, incorporated herein by reference. Also, techniques can be practiced to intelligently control the ratio of visible and invisible VLD and/or LED sources of illumination so as to maximize the projected illumination falling incident on the surface of the object, and thus minimize the illumination of customers at the POS.
When using coherent illumination sources such as VLDs, despeckling techniques as taught in WIPO Publication Nos. WO 2002/43195 and WO 2008/011067, incorporated herein by reference, can be practiced to reduce the spatial and/or temporal coherence of such illumination sources.
Techniques can be practiced to employ coplanar and/or coextensive illuminating and imaging subsystems, constructed using (i) VLD-based and/or LED-based illumination arrays and linear- and/or area-type image sensing arrays, and (ii) real-time object motion/velocity detection technology embedded within the system architecture so as to enable: (1) intelligent automatic illumination control within the 3D imaging volume of the system; and (2) automatic image formation and capture along each coplanar illumination and imaging plane therewithin. Also, advanced automatic image processing operations can be practiced to support diverse kinds of value-added information-based services delivered in diverse end-user environments, including retail POS environments as well as industrial environments.
Modifications that Come to Mind
In the illustrative embodiments described above, the multi-channel digital image processing subsystem 26 has been provided as a centralized processing system servicing the image processing needs of each dual-FOV PLIIM-based illumination and imaging subsystem in the system. It is understood, however, that in alternative embodiments, each subsystem 4A through 4F can be provided with its own local image processing subsystem for servicing its local image processing needs.
The tunnel-type digital imaging-based system can also be provided with one or more coextensive area-type illumination and imaging subsystems, each generating an area-type illumination beam and field of view (FOV) which form a coextensive illumination and imaging zone that is projected through and intersects above the conveyor belt structure, within the 3D imaging volume, for even more aggressive digital imaging of objects passed therethrough.
Also, the tunnel-type digital-imaging systems of the present invention, disclosed herein, provide full support for (i) dynamically and adaptively controlling system control parameters in the digital image capture and processing system, as disclosed and taught in Applicants' PCT Application Serial No. PCT/US2007/009763, as well as (ii) permitting modification and/or extension of system features and functions, as disclosed and taught in WIPO Publication No. WO 2007/075519, supra.
Several modifications to the illustrative embodiments have been described above. It is understood, however, that various other modifications to the illustrative embodiment of the present invention will readily occur to persons with ordinary skill in the art. All such modifications and variations are deemed to be within the scope and spirit of the present invention as defined by the accompanying Claims to Invention.
This is a Continuation-in-Part (CIP) of the following Applications: U.S. patent application Ser. No. 11/900,651 filed Sep. 12, 2007; which is a CIP of: U.S. application Ser. No. 11/880,087 filed Jul. 19, 2007; International Application No. PCT/US2007/016298 filed Jul. 19, 2007; U.S. application Ser. No. 11/820,497 filed Jun. 19, 2007; U.S. application Ser. No. 11/820,010 filed Jun. 15, 2007; U.S. application Ser. No. 11/809,173 filed May 31, 2007; U.S. application Ser. No. 11/809,174 filed May 31, 2007; U.S. application Ser. No. 11/809,240 filed May 31, 2007; U.S. application Ser. No. 11/809,238 filed May 31, 2007; U.S. application Ser. No. 11/788,769 filed Apr. 20, 2007; International Application No. PCT/US07/09763 filed Apr. 20, 2007; U.S. application Ser. No. 11/731,866 filed Mar. 30, 2007; U.S. application Ser. No. 11/731,905 filed Mar. 30, 2007; U.S. application Ser. No. 11/729,959 filed Mar. 29, 2007; U.S. application Ser. No. 11/729,525 filed Mar. 29, 2007; U.S. application Ser. No. 11/729,945 filed Mar. 29, 2007; U.S. application Ser. No. 11/729,659 filed Mar. 29, 2007; U.S. application Ser. No. 11/729,954 filed Mar. 29, 2007; U.S. application Ser. No. 11/810,437 filed Mar. 29, 2007; U.S. application Ser. No. 11/713,535 filed Mar. 2, 2007; U.S. application Ser. No. 11/811,652 filed Mar. 2, 2007; U.S. application Ser. No. 11/713,785 filed Mar. 2, 2007; U.S. application Ser. No. 11/712,588 filed Feb. 28, 2007; U.S. application Ser. No. 11/712,605 filed Feb. 28, 2007; U.S. application Ser. No. 11/711,869 filed Feb. 27, 2007; U.S. application Ser. No. 11/711,870 filed Feb. 27, 2007; U.S. application Ser. No. 11/711,859 filed Feb. 27, 2007; U.S. application Ser. No. 11/711,857 filed Feb. 27, 2007; U.S. application Ser. No. 11/711,906 filed Feb. 27, 2007; U.S. application Ser. No. 11/711,907 filed Feb. 27, 2007; U.S. application Ser. No. 11/711,858 filed Feb. 27, 2007; U.S. application Ser. No. 11/711,871 filed Feb. 27, 2007; U.S. application Ser. No.
11/640,814 filed Dec. 18, 2006; International Application No. PCT/US06/48148 filed Dec. 18, 2006; U.S. application Ser. No. 11/489,259 filed Jul. 19, 2006; U.S. application Ser. No. 11/408,268 filed Apr. 20, 2006; U.S. application Ser. No. 11/305,895 filed Dec. 16, 2005; U.S. application Ser. No. 10/989,220 filed Nov. 15, 2004; U.S. application Ser. No. 10/712,787 filed Nov. 13, 2003, now U.S. Pat. No. 7,128,266; U.S. application Ser. No. 10/186,320 filed Jun. 27, 2002, now U.S. Pat. No. 7,164,810; Ser. No. 10/186,268 filed Jun. 27, 2002, now U.S. Pat. No. 7,077,319; International Application No. PCT/US2004/0389389 filed Nov. 15, 2004, and published as WIPO Publication No. WO 2005/050390; U.S. application Ser. No. 09/990,585 filed Nov. 21, 2001, now U.S. Pat. No. 7,028,899 B2; U.S. application Ser. No. 09/781,665 filed Feb. 12, 2001, now U.S. Pat. No. 6,742,707; U.S. application Ser. No. 09/780,027 filed Feb. 9, 2001, now U.S. Pat. No. 6,629,641 B2; and U.S. application Ser. No. 09/721,885 filed Nov. 24, 2000, now U.S. Pat. No. 6,631,842 B1; wherein each said application is commonly owned by Assignee, Metrologic Instruments, Inc., of Blackwood, N.J., and is incorporated herein by reference as if fully set forth herein in its entirety.
Relation | Number | Date | Country |
---|---|---|---|
Parent | 11900651 | Sep 2007 | US |
Child | 12283439 | US | |
Parent | 11880087 | Jul 2007 | US |
Child | 11900651 | US | |
Parent | 11820497 | Jun 2007 | US |
Child | 11880087 | US | |
Parent | 11820010 | Jun 2007 | US |
Child | 11820497 | US | |
Parent | 11809173 | May 2007 | US |
Child | 11820010 | US | |
Parent | 11809174 | May 2007 | US |
Child | 11809173 | US | |
Parent | 11809240 | May 2007 | US |
Child | 11809174 | US | |
Parent | 11809238 | May 2007 | US |
Child | 11809240 | US | |
Parent | 11788769 | Apr 2007 | US |
Child | 11809238 | US | |
Parent | PCT/US07/09763 | Apr 2007 | US |
Child | 11788769 | US | |
Parent | 11731866 | Mar 2007 | US |
Child | PCT/US07/09763 | US | |
Parent | 11731905 | Mar 2007 | US |
Child | 11731866 | US | |
Parent | 11729959 | Mar 2007 | US |
Child | 11731905 | US | |
Parent | 11729525 | Mar 2007 | US |
Child | 11729959 | US | |
Parent | 11729945 | Mar 2007 | US |
Child | 11729525 | US | |
Parent | 11729659 | Mar 2007 | US |
Child | 11729945 | US | |
Parent | 11729954 | Mar 2007 | US |
Child | 11729659 | US | |
Parent | 11810437 | Mar 2007 | US |
Child | 11729954 | US | |
Parent | 11713535 | Mar 2007 | US |
Child | 11810437 | US | |
Parent | 11811652 | Mar 2007 | US |
Child | 11713535 | US | |
Parent | 11713785 | Mar 2007 | US |
Child | 11811652 | US | |
Parent | 11712588 | Feb 2007 | US |
Child | 11713785 | US | |
Parent | 11712605 | Feb 2007 | US |
Child | 11712588 | US | |
Parent | 11711869 | Feb 2007 | US |
Child | 11712605 | US | |
Parent | 11711870 | Feb 2007 | US |
Child | 11711869 | US | |
Parent | 11711859 | Feb 2007 | US |
Child | 11711870 | US | |
Parent | 11711857 | Feb 2007 | US |
Child | 11711859 | US | |
Parent | 11711906 | Feb 2007 | US |
Child | 11711857 | US | |
Parent | 11711907 | Feb 2007 | US |
Child | 11711906 | US | |
Parent | 11711858 | Feb 2007 | US |
Child | 11711907 | US | |
Parent | 11711871 | Feb 2007 | US |
Child | 11711858 | US | |
Parent | 11640814 | Dec 2006 | US |
Child | 11711871 | US | |
Parent | PCT/US06/48148 | Dec 2006 | US |
Child | 11640814 | US | |
Parent | 11489259 | Jul 2006 | US |
Child | PCT/US06/48148 | US | |
Parent | 11408268 | Apr 2006 | US |
Child | 11489259 | US | |
Parent | 11305895 | Dec 2005 | US |
Child | 11408268 | US | |
Parent | 10989220 | Nov 2004 | US |
Child | 11305895 | US | |
Parent | 10712787 | Nov 2003 | US |
Child | 10989220 | US | |
Parent | 10186320 | Jun 2002 | US |
Child | 10712787 | US | |
Parent | 10186268 | Jun 2002 | US |
Child | 10186320 | US | |
Parent | PCT/US04/89389 | Nov 2004 | US |
Child | 10186268 | US | |
Parent | 09990585 | Nov 2001 | US |
Child | PCT/US04/89389 | US | |
Parent | 09781665 | Feb 2001 | US |
Child | 09990585 | US | |
Parent | 09780027 | Feb 2001 | US |
Child | 09781665 | US | |
Parent | 09721885 | Nov 2000 | US |
Child | 09780027 | US |