The present invention relates to optical validation and more specifically, to a validation device that may be used alone or integrated with a system for validating documents, merchandise, or currency. The present invention also relates to methods for optical validation using the validation device and to methods for improving the quality and consistency of data captured by the validation device.
Counterfeiting documents, merchandise, and currency is a growing problem, and validating these items (especially currency) is important. While currency validation (CVAL) systems exist, these systems are too slow, costly, intrusive, and/or bulky to be routinely used at common transaction locations (e.g., store checkouts, ATMs, banks, etc.). Therefore, a need exists for a low-cost CVAL device that may function alone (e.g., handheld, kiosk, etc.) or as part of a larger system (e.g., a point of sale system), and which may be operated to validate items (especially currency) in an easy (e.g., handheld) and unobtrusive (e.g., inconspicuous) way.
Accordingly, in one aspect, the present invention embraces a currency validation (CVAL) device. The CVAL device includes an imaging subsystem, which includes a high-resolution image sensor and optics for capturing digital images of items in a field of view. The CVAL device also includes an illumination subsystem that has one or more illumination sources and optics for illuminating items in the field of view. The CVAL device also includes a processor (also referred to herein as “processing circuitry”) that is configured by software to synchronize and control the imaging and illumination subsystems. The subsystems are communicatively coupled so as to exchange signals and information.
When the CVAL device is triggered (e.g., by the movement of a switch, spoken command, signal from a point of sale system, etc.) to perform a validation process, the processor activates the illumination sources, individually or in combination (i.e., multiplexed), to sequentially illuminate the item in the field of view with light having various (e.g., different) spectral profiles, wherein the wavelengths in a spectral profile may include visible (e.g., red, blue, green, etc.) and/or invisible (e.g., near infrared, near ultraviolet) light. For each illumination, the CVAL system captures an image (or images) of the item using the image sensor. The processor is also configured to process the image or images (e.g., crop, align, resize, segment the item, recognize the item, etc.) to put them in a condition for analysis. The processor is also configured to control (e.g., activate, deactivate, switch, etc.) and synchronize the illumination and image capturing processes. In a possible embodiment, the processor is further configured to analyze the captured images and, based on the analysis, validate or invalidate the currency item (e.g., detect a counterfeit). In some possible embodiments, however, the validation may be performed by a computing device (e.g., as part of a point of sale system) communicatively coupled to the CVAL device.
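By way of non-limiting illustration, the triggered illuminate-then-capture sequence described above may be sketched as follows. The function names, spectral-profile labels, and dummy data structures below are hypothetical placeholders standing in for the device's firmware, not part of the claimed invention:

```python
# Hypothetical sketch of the CVAL capture loop: one synchronized
# image capture per spectral profile of the illumination subsystem.

SPECTRAL_PROFILES = ["red", "green", "blue", "near_ir", "near_uv"]

def activate_illumination(profile):
    # Stand-in for driving the illumination sources (LED arrays).
    return {"profile": profile, "active": True}

def capture_image(profile):
    # Stand-in for the image sensor; returns a tagged dummy frame.
    return {"profile": profile, "pixels": [[0]]}

def run_validation_sequence(profiles=SPECTRAL_PROFILES):
    """Sequentially illuminate the field of view with each spectral
    profile and capture one image per illumination."""
    images = []
    for profile in profiles:
        activate_illumination(profile)          # illuminate the item
        images.append(capture_image(profile))   # synchronized capture
    return images

frames = run_validation_sequence()
print(len(frames))  # one captured frame per spectral profile
```

The captured frames would then be handed to the processing steps (crop, align, segment, analyze) described above.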
The validation device may be used alone or as part of a larger system (e.g., a point of sale system, a kiosk, etc.), and in various embodiments, may perform several functions. For example, a dual-purpose, handheld imager may be incorporated with a point of sale system to perform both checkout operations and currency validation. In a first mode (i.e., indicia-reading mode), the handheld imager operates as a typical imaging barcode scanner. In a second mode (i.e., CVAL mode), the handheld imager operates as a currency validator. Changing between the first and second modes of operation may be accomplished either automatically (e.g., set by the point of sale system in response to a transaction, set in response to a scanned barcode, etc.) or manually (e.g., set by an operator).
Capturing multiple images of an item (e.g., banknote) illuminated with various spectral profiles is an important aspect of the optical validation embraced by the present invention. As a result, various embodiments for providing and controlling the illumination are envisioned.
In various embodiments, the illumination subsystem may include multiple light emitting diode (LED) arrays, each configured to radiate light in a particular spectral band (i.e., each having a particular spectral profile). Each LED array may be controlled by the processor to illuminate the field of view with a particular intensity and/or duration. Likewise, multiple LED arrays may be simultaneously activated to illuminate the field of view with a spectral bandwidth that is the combination of each individual LED array. In some possible embodiments, the light from the LED is sensed (i.e., sampled). The sensing provides feedback that, when interpreted by the processor, may be used to control the illumination exposure. This feedback control may be necessary to compensate for device temperature (e.g., LED temperature) or to minimize device variations (i.e., calibration).
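A minimal sketch of such feedback control follows. The proportional gain, the temperature-droop model of the LED, and all numeric values are illustrative assumptions only; an actual implementation would use the device's calibrated response:

```python
def adjust_drive(current_ma, sensed, target, gain=0.5, lo=10.0, hi=1000.0):
    """One step of proportional feedback: nudge the LED drive current
    so the sensed light level approaches the calibrated target."""
    error = target - sensed
    new = current_ma + gain * error
    return max(lo, min(hi, new))     # clamp to safe drive limits

def led_output(current_ma, temp_c):
    # Hypothetical LED model whose output droops ~0.4% per degree C
    # above 25 C -- stands in for the sampled light sensor reading.
    derate = 1.0 - 0.004 * (temp_c - 25.0)
    return current_ma * derate

target = 350.0    # calibrated light level (arbitrary units)
current = 350.0
for _ in range(50):                  # let the loop settle at 60 C
    sensed = led_output(current, temp_c=60.0)
    current = adjust_drive(current, sensed, target)

print(round(led_output(current, 60.0), 1))  # holds near the target
```

The loop raises the drive current until the sensed output matches the calibrated value, which is the temperature compensation behavior described above.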
In various embodiments, the spectral profiles may be controlled via optical filters placed between the light sources and the item (i.e., in the transmit path) or placed between the item and the image sensor (i.e., in the receive path). To produce images of the item under various spectral conditions, different filters (or combinations of filters) may be mechanically moved in/out of the transmit/receive paths. The filters may be absorptive type filters (e.g., colored glass) or interference type filters (e.g., layers of thin films on a substrate) and may be mechanically mounted on a filter wheel that can be rotated to adjust the position of the filters.
The validation device may also include means for providing feedback to a user. This feedback may (i) help a user position the item/validation-device and/or (ii) may provide the results of the validation to a user.
In various embodiments, the validation device may include an aiming subsystem to project a pattern into the handheld imager's field of view that helps a user position the item and/or the validation device (i.e., for handheld embodiments). As a result, the aiming subsystem typically includes a visible light source (e.g., laser, LED, etc.), an image-forming element (e.g., an aperture, a diffractive optical element, etc.), and a projection lens (or lenses). Further, the aiming subsystem may project two distinctly different (e.g., different in size, shape, color, flashing, etc.) targeting patterns, wherein each targeting pattern corresponds to one of the two modes of operation (e.g., indicia reading, CVAL, etc.).
In various embodiments, the validation device may include at least one positioning indicator to help a user position the currency item and/or the validation device (i.e., handheld imager) for validation. Here, the processor may generate real-time indicator signals that activate the (at least one) indicator to guide the repositioning of handheld imager and/or currency item toward an optimal position for validation. The indicator signals may also indicate that an optimal position has been achieved. The indicator signals may also indicate that a portion of the currency item is obscured. The (at least one) indicator may transmit audio, visual, or haptic (e.g., vibration) signals to a user. In various embodiments, the indicator signals may be sent to the aiming subsystem to cause a change (e.g., flashing, color change, etc.) in the targeting pattern based on the determination.
The validation device may provide (to a user) the results of the validation process via interface circuitry 57 and indicators/display, either integrated with the validation device or communicatively coupled to the validation device. The indicators may provide visual, audible, and/or tactile messages to a user based on the validation results. These messages may also include instructions for the user regarding the next steps that should be taken in the validation process.
The validation device may be powered by a battery or via a cable connected to a power supply (e.g., a USB power supply). For cases in which the power supplied by the power supply is insufficient, an additional storage element (e.g., a battery, super capacitor, etc.) may be used to provide additional power. In these cases, the storage element may be integrated with the cable.
High quality images of the item (e.g., currency item) improve the validation process. To this end, various components or systems may be integrated with the CVAL device to improve image quality.
In various embodiments, the validation device includes a set of crossed polarizers to remove specular reflections from the item. Here, a first polarizer may be positioned in front of the illumination subsystem's light sources and a second polarizer may be positioned in front of the imaging subsystem's image sensor.
In various embodiments, a banknote (i.e., bill, currency, etc.) holder may be used with the validation device to facilitate the imaging of the currency item. The banknote holder typically has a substrate with a reflective surface (e.g., metallic mirror, dichroic mirror, etc.) onto which a banknote may be placed for verification. When placed on the banknote holder and illuminated by the validation device, a portion of the light from the validation device passes through the banknote and is reflected back through the banknote to the image sensor of the validation device. In this way, features such as watermarks may be imaged. In various embodiments, the banknote holder may itself include one or more illumination sources/illumination devices.
In another aspect, the present invention embraces methods (i.e., processes) for currency validation. In various embodiments, a validation device is provided (e.g., a handheld CVAL device, a fixedly mounted CVAL device, a CVAL device integrated with a point of sale system, etc.). The validation device is capable of illuminating a field of view with light having different spectral profiles, while synchronously capturing at least one digital image of the field of view for each illumination. A currency item is positioned within the imaging device's field of view and the device is triggered to begin operation. The currency item is then illuminated and imaged in accordance with the previously mentioned capabilities of the validation device to obtain a plurality of digital images of the currency item in different spectral conditions. Then, the digital images are processed and the currency item is validated based on the results of the processing. The validation may include determining if an item is authentic or counterfeit. In various embodiments, validation may include determining if a currency item is fit for use.
In various embodiments, the processing includes recognizing characters or features on the currency item, and then comparing these recognized features to one or more comparison standards retrieved from a computer readable memory (e.g., on the device, on a network, etc.). In some possible embodiments, the recognized characters and/or features may be used to identify the banknote (e.g., to help retrieve a comparison standard) or may be stored to memory as part of a record of the validation. In some cases, these records may be turned over to agencies (e.g., law enforcement, manufacturers, store security, etc.) to help a counterfeit investigation.
In various embodiments, the validation process includes identifying one or more regions of interest on the currency item within each digital image. Then, (i) comparing the pixel levels from the one or more regions of interest to one or more comparison standards, (ii) comparing the pixel levels from a particular region of interest within an image to another region of interest within the same image, or (iii) comparing the pixel levels from a particular region of interest within a first image to another region of interest within one or more other images.
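The region-of-interest comparisons (i) and (ii) above may be sketched as follows; comparison (iii) is the same computation applied across two images. The image data, ROI coordinates, and tolerance are hypothetical illustrations only:

```python
def roi_mean(image, roi):
    """Mean pixel level inside a rectangular region of interest.
    image: 2-D list of gray levels; roi: (row0, row1, col0, col1)."""
    r0, r1, c0, c1 = roi
    vals = [image[r][c] for r in range(r0, r1) for c in range(c0, c1)]
    return sum(vals) / len(vals)

def within_tolerance(measured, standard, tol=0.1):
    """Comparison (i): ROI level vs. a stored comparison standard."""
    return abs(measured - standard) <= tol * standard

# Hypothetical 4x4 image captured under one spectral profile:
img = [
    [200, 200, 10, 10],
    [200, 200, 10, 10],
    [ 50,  50, 50, 50],
    [ 50,  50, 50, 50],
]
bright = roi_mean(img, (0, 2, 0, 2))   # 200.0
dark   = roi_mean(img, (0, 2, 2, 4))   # 10.0

print(within_tolerance(bright, 200.0))  # comparison (i): True
print(round(bright / dark, 1))          # comparison (ii): intra-image ratio
```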
Various forms of reporting the results of the verification are embraced by the present invention, and in some cases, the results of the validation may trigger additional process steps. For example, the results of the validation may cause audible, tactile, or visual feedback to indicate if a currency item is valid or counterfeit. In another example, the results of the validation may cause a digital image of the customer to be captured by a camera (e.g., a security camera) at a point of sale.
In various embodiments, the validation process may include steps for providing feedback (e.g., audible, visual, or tactile) to help align the currency item and/or the CVAL device and/or to determine if the currency item is obscured in the digital images.
In various embodiments, the validation device may operate in two modes (e.g., indicia reading mode, CVAL mode). In this case, the validation process may include steps for adjusting the mode of operation based on an analysis of the captured digital image or images.
In various embodiments, the present invention embraces methods for improving the quality of the data (e.g., digital images) acquired for validation. In various embodiments, the validation method (i.e., validation process) may include steps to sense the authenticity of an item (e.g., merchandise, currency, etc.) by applying a chemical substance to the item before illuminating and imaging the item with different spectral profiles. In various embodiments, image-processing steps may be applied to remove artifacts from the captured digital images.
In various embodiments, the present invention embraces methods for improving the repeatability of validation. In various embodiments, the validation process includes steps to calibrate the validation device. In various embodiments, the validation process may include steps for capturing and analyzing a calibration target to determine the optimal illumination and/or image sensor settings. In various embodiments, the validation process may include steps for capturing a portion of the light from the illumination subsystem and then adjusting the exposure/illumination of the sensor/light-sources to match a calibrated value. In some cases, the calibrated value may be based on a known temperature response of the light sources.
The foregoing illustrative summary, as well as other exemplary objectives and/or advantages of the invention, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying drawings.
Section I: Validation Device 10
At point of sale (POS), multiple devices are often needed to scan barcodes and to determine the authenticity of items at checkout (e.g., currency, merchandise, stamps/labels, driver licenses, etc.). Using multiple devices can slow down checkout and is not cost/space efficient.
Referring now to
In order for paper currency to continue to be widely accepted for commercial transactions, there needs to be high confidence that the bills being presented at the point-of-sale (and elsewhere) are genuine (and are not counterfeit or forged). An image-based currency evaluation system can provide a higher degree of confidence in a bill's authenticity. Often, such an imaging-based system uses various colored light sources 16 (or light sources 16 with different spectral profiles) in order to detect wavelength dependent variations in the reflectance from the bills (i.e., banknotes) to determine authenticity.
Filters 24 (
A validation device 10 may use filters 24 of different construction and/or composition. For example, colored plastic (or glass) filters may be used, or multilayer interference filters may be used. Colored plastic (or glass) filters are relatively insensitive to angular orientation, whereas interference filters may be highly sensitive to angular orientation.
Control of the illumination's spectral profile (e.g., color) may be accomplished by controlling the filters 24 and/or the light sources 16 in the validation device 10. In various embodiments, a filter (or filters) 24 may be positioned in front of a light source 16 and mechanically moved in and out of position to change the spectral profile of the illumination. In various embodiments, a multilayer filter 24 may be positioned in front of a light source 16 and mechanically rotated to change the spectral profile of the illumination. This filter-tuning approach is especially useful if very narrow changes in peak emission wavelengths are needed for validation. In various embodiments, diffractive optical elements (e.g., gratings) may be used to produce illumination having different spectral profiles. In various embodiments, multiple light sources 16 (e.g.,
As noted previously, in order for paper currency to continue to be widely accepted for commercial transactions, there needs to be a high degree of confidence that the bills presented at the point-of-sale (and elsewhere) are genuine (i.e., not counterfeit or forgeries). Using an image-based currency evaluation device (i.e., validation device 10) provides a higher confidence in a bill's authenticity. The validation device 10 embraced by the present invention captures a plurality of images of an item, wherein each image of the item represents the item's spectral response (e.g., reflectivity, fluorescence, etc.) to a particular wavelength and/or spectral profile (i.e., collection of wavelengths). In some cases, discriminating features used for validation may appear in images of the item for a particular spectral profile, while not appearing in other spectral-profile images. A valid banknote and counterfeit banknote illuminated and imaged using various spectral profiles are shown in
In various embodiments of the validation device 10 embraced by the present invention, the various images are obtained using optical filters 26 positioned in front of the imaging subsystem's image sensor 28 (i.e., in the return path). A benefit to using filters in this way is that the spectral profile of the light reaching the image sensor 28 is controlled, even if ambient light levels vary (e.g., vary in intensity, color, etc.).
The filters 26 used in the return path (i.e., receive path) of the imaging subsystem 12 may be of various constructions and/or compositions. For example, colored (i.e., dyed) plastic, colored glass, or interference (i.e., multilayer, dichroic, etc.) filters may be used. Colored plastic and glass filters are relatively insensitive to angular orientation, whereas interference filters may be highly sensitive to angular orientation.
In various embodiments, multiple filters 26 may be placed in the return path and may be mechanically moved in and out of position to change the spectral profile of the light reaching the image sensor 28. In various embodiments, the angular orientation of an interference filter in front of the image sensor 28 may be changed to tune the spectral profile precisely. Similarly, diffractive optical elements (e.g., gratings) may be used to filter the light reaching the image sensor.
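The angle-tuning behavior of an interference filter follows the standard blue-shift relation, in which the center wavelength shortens as the filter is tilted away from normal incidence. The sketch below illustrates this relation; the effective index and wavelengths are illustrative values, not parameters of the invention:

```python
import math

def tuned_wavelength(lambda0_nm, theta_deg, n_eff=2.0):
    """Center wavelength of an interference filter tilted by theta_deg
    from normal incidence: lambda(theta) = lambda0 * sqrt(1 - (sin(theta)/n_eff)^2).
    n_eff is the filter's effective refractive index (assumed here)."""
    s = math.sin(math.radians(theta_deg)) / n_eff
    return lambda0_nm * math.sqrt(1.0 - s * s)

print(round(tuned_wavelength(650.0, 0.0), 1))   # 650.0 at normal incidence
print(round(tuned_wavelength(650.0, 20.0), 1))  # shifted to a shorter wavelength
```

Because the shift over modest tilt angles is only a few nanometers, rotating the filter provides exactly the kind of precise, narrow tuning of the spectral profile described above.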
Increasing evidence of counterfeiting indicates that currency validation/authentication is a growing need in many parts of the world. Multispectral illumination and imaging for validation is embraced by the present invention to address this problem. The images acquired by a verification (i.e., validation) device 10 may contain artifacts (e.g., shadows, glare, fibers, dirt, etc.). These artifacts do not contain valuable information and introduce spatial noise, thereby making validation difficult. The present invention embraces mitigating this spatial noise to improve the quality of the images provided for validation.
Surfaces typically reflect light in two ways: specular and diffuse. Diffuse reflections from a currency item (e.g., banknotes 18) are generally weaker than specular reflections from the currency item but contain the information necessary for validation. Specular reflections contain no valuable information about a printed surface, and as a result, minimizing their intensity is helpful for validation. The present invention embraces minimizing specular reflections from a currency item by controlling polarization of the illumination light and the light detected by the image sensor 28. Specifically, the illumination light may be polarized in a particular direction and the light captured by the image sensor is polarized in a direction orthogonal to the particular direction (if polarizers 30 and 32 are used). In this way, the light reflected from the currency item is filtered (i.e., by its polarization) to remove the polarization of the illuminating light. As diffuse reflected light is largely unpolarized, a portion of the diffuse reflected light will reach the image sensor 28. As the specular reflected light is largely polarized in the direction of the illumination, the specular reflected light will be substantially blocked. In various embodiments, a linear polarizer is positioned in front of the illumination subsystem and a crossed polarizer is positioned in front of the image sensor. In this way, very little light from the illuminator or from specular reflection is detected by the image sensor.
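The rejection of specular reflection by crossed polarizers 30 and 32 can be illustrated with Malus's law, which gives the fraction of polarized light passing an analyzer as the squared cosine of the angle between the light's polarization and the analyzer axis. The sketch below is illustrative physics, not a description of the device's components:

```python
import math

def through_analyzer(relative_angle_deg):
    """Malus's law: fraction of linearly polarized light transmitted by
    an analyzer oriented at the given angle to the polarization."""
    return math.cos(math.radians(relative_angle_deg)) ** 2

# Specular reflection largely retains the illumination polarization; the
# analyzer in front of the image sensor is crossed at 90 degrees, so the
# specular component is substantially blocked:
specular = through_analyzer(90.0)   # effectively zero

# Diffuse reflection is largely unpolarized, so roughly half of it passes
# any linear analyzer regardless of orientation:
diffuse = 0.5

print(round(specular, 6), diffuse)  # 0.0 0.5
```

This ~0% versus ~50% contrast is why the crossed-polarizer arrangement preserves the information-bearing diffuse reflection while suppressing glare.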
The validation device 10 further comprises a processor (also referred to herein as processing circuitry) communicatively coupled to the imaging subsystem 12 and the illumination subsystem 14. The processor 36 is configured by software 38 (stored, for example, in a storage device 42 or memory 44 of the validation device 10) to activate one or more of the light sources 16 in the illumination subsystem 14 to illuminate a currency item, capture an image of the illuminated currency item, repeat the activating and capturing until a plurality of digital images of the currency item have been captured, and process the plurality of images to validate the currency item. The storage device 42 of
Barcode scanners are ubiquitous at retail checkouts, and the ability to detect counterfeit currency is a growing need. The present invention embraces combining these functions into a single validation device 10, in which the mode of operation is indicated to avoid confusion.
In various embodiments, the validation device 10 includes an aiming subsystem 40 capable of projecting two different targeting patterns, one for each of the two modes of operation. In a first mode, one light pattern will be projected into the field of view of the device. If the mode of operation is changed, a different pattern will be projected. The targeting pattern will alert the operator of the mode and/or the mode change. The aiming subsystem 40 may be communicatively coupled to the mode-selection switch and has one or more aiming-light sources 41 and optics 42 for projecting (i) a first targeting pattern into the field of view when the CVAL device is in indicia reading mode and (ii) a second targeting pattern into the field of view when the CVAL device is in CVAL mode. The aiming system's one or more aiming-light sources 41 may include a first laser for radiating light for the first targeting pattern and a second laser for radiating light for the second targeting pattern.
The aiming subsystem 40 may project the targeting pattern into the field of view using a variety of technologies (e.g., aperture, diffractive optical element (DOE), shaping optics, etc. (referred to collectively as projection optics 42 (
The validation device 10 envisioned by the present invention requires significant energy to provide the high-intensity illumination and fast image-capture necessary for operation. As a result, the current consumption required by the validation device may exceed the current limits (e.g., 500 milliamps) of a typical power source 62 (e.g., USB). For example, current consumption of the illumination subsystem may exceed the current limits of a USB connector if multiple illuminations/image-captures are required.
The validation device 10 may store energy in a storage element during periods of rest (i.e., nonoperation) and then use the stored energy for illumination, when high current is required. In various embodiments, the storage element is at least one super-capacitor capable of supplying the illumination subsystem with energy without depleting the energy necessary for other operations (e.g., scanning). A typical super-capacitor has enough energy capacity for a sequence of illuminations (i.e., “flashes”) before charging is required. In various embodiments, the storage element may be a rechargeable battery. The battery may be charged when validation is not required and then may be used to provide energy for the sequences of “flashes” during validation.
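A back-of-the-envelope sizing of such a storage capacitor follows from the charge deficit during a flash burst, C = I_deficit * t / dV. All of the numbers below are hypothetical illustrations, not specifications of the invention:

```python
def required_capacitance(flash_current_a, supply_limit_a,
                         flash_s, n_flashes, max_droop_v):
    """Size a storage capacitor to cover the current deficit during a
    burst of illumination flashes: C = I_deficit * t_total / dV."""
    deficit = flash_current_a - supply_limit_a   # current beyond the supply
    return deficit * flash_s * n_flashes / max_droop_v

# Illustrative numbers: 2 A flashes against a 0.5 A USB limit, ten
# 20 ms flashes per validation, 0.5 V of allowed rail droop.
c_farads = required_capacitance(2.0, 0.5, 0.020, 10, 0.5)
print(round(c_farads, 2))  # 0.6 F, well within super-capacitor range
```

The result, a fraction of a farad, is impractical for ordinary capacitors but routine for super-capacitors, which is consistent with the embodiment described above.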
The present invention also embraces integrating the storage element (or elements) 50 outside the housing 20 of the validation device 10. For example, the storage element 50 may be incorporated inside the validation device's power/data cable. In this case, efficient charging may be accomplished using a current limiting resistor directly from the power source. The storage element may also be distributed along the cable, using the length of the cable and multiple layers to create a “cable battery” or “cable capacitor”.
While various components of an exemplary validation device are depicted in
Section II: Validation Systems
To combat counterfeiting, banknotes need to be recognized, removed from circulation, and in some cases, reported to authorities for data collection and analysis. Detecting counterfeits immediately at transaction locations offers advantages to law enforcement and commerce. The validation device 10 embraced by the present invention may be combined with other systems to create a verification kiosk 200 (
In various exemplary implementations, banks may use a kiosk 200 (
Still referring to
In various embodiments of automatic mode initiation, image capture is initiated resulting in one or more captured images after which the handheld imager's firmware searches the captured digital images for items (e.g., barcodes, characters, features, artwork, banknotes, etc.) and, upon recognition, changes the mode of operation appropriately. If a barcode is detected (i.e., recognized), barcode decoding mode is initiated and the image or images are processed. If a currency item (rather than a barcode) is detected (i.e., recognized), more images may be needed to obtain the full multispectral set of images, after which the images are processed for authentication. In various exemplary implementations of automatic mode initiation, a point of sale computer controls the mode of operation based on a point in a transaction process (e.g., barcode scanning is complete, payment is necessary, etc.). Regardless of the payment form, the validation device 10 comprising a dual- or multi-mode CVAL device can be changed to the currency validation mode and used for currency validation. When the POS system 100 indicates that the transaction process is complete, the dual- or multi-mode CVAL device is returned to barcode scanning mode (i.e., indicia reading mode). In various embodiments, the handheld imager (the CVAL device) supports barcode scanning as the primary function, by default. In this case, the handheld imager's decoding process attempts to decode barcodes. If a barcode is detected in a captured image, then the handheld imager does not switch to a new mode. If, however, no barcode is detected, then the handheld imager begins a process to determine if the mode of operation should be changed. As noted previously, such a process could include capturing additional images and/or additional image processing to identify currency features.
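The barcode-first decision flow described above may be sketched as follows. The detector functions are hypothetical stand-ins for the imager's recognition firmware:

```python
# Hypothetical sketch of automatic mode selection: barcode scanning is
# the primary mode; CVAL mode is entered only when no barcode is found
# but a currency item is recognized in the captured image.

def contains_barcode(image):
    return image.get("kind") == "barcode"

def contains_currency(image):
    return image.get("kind") == "banknote"

def select_mode(image, default="indicia_reading"):
    """Barcode decoding stays primary; switch to CVAL only when no
    barcode is detected but a currency item is recognized."""
    if contains_barcode(image):
        return "indicia_reading"
    if contains_currency(image):
        return "cval"
    return default       # nothing recognized: stay in the default mode

print(select_mode({"kind": "barcode"}))   # indicia_reading
print(select_mode({"kind": "banknote"}))  # cval
print(select_mode({"kind": "unknown"}))   # indicia_reading (default)
```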
The handheld imager's operator may manually initiate the mode of operation (e.g., barcode scanning, CVAL, merchandise validation, etc.). In various embodiments, the operator initiates a mode of operation by activating a dedicated trigger switch 34 (e.g., pressing once, repeatedly, or in a pattern). The trigger switch may be a mechanical, optical, magnetic, or capacitive switch. This trigger may also control various functions within a single mode. For example, pressing the trigger may initiate an aiming subsystem 40 to project a targeting pattern to guide the placement of a currency item, and releasing the trigger switch may initiate a verification process.
Other means to change modes are envisioned by the present invention. For example, a special barcode or symbol may be scanned, when the device is in barcode scanning mode, to initiate the authentication mode of operation. After validation is complete, the validation device 10 could return to barcode scanning mode after a set period of inactivity (e.g., a few seconds). In another example, a voice command may be used to initiate, change, or terminate the mode of operation, such as through a microphone (e.g., microphone 59 in
For commercial transactions, confidence in the validity of banknotes is needed. Using a point of sale system 100 configured for validation can provide this confidence by recognizing forgeries through the detection of security features on banknotes (i.e., bills). As many security features are located on the front of, on the back of, and within the banknote, it is useful to evaluate the transmission as well as the reflectance characteristics of the bill.
In various embodiments, in the point of sale system 100 embraced by the present invention, a banknote holder 102 (i.e., bill holder) may be used to obtain the optical transmission characteristics of the bill. The banknote holder 102 includes a highly reflective surface. When the back (or front) surface of the banknote 18 is placed on the reflective surface and the front (or back) surface of the banknote is illuminated, the light reflected from the reflective surface reveals the transmission characteristics of the banknote. The reflective surface may be a broadband reflective surface (e.g., metallic mirror) or may be a reflective surface with a particular spectral profile that is customized to reflect only specific wavelengths (e.g., infrared (IR), ultraviolet (UV), and/or portions of the visible spectrum). In various embodiments, the banknote holder 102 may itself include one or more illumination sources/illumination devices.
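Because the illumination crosses the banknote, reflects from the holder's surface, and crosses the banknote again, the returned signal scales roughly as the square of the note's transmittance times the surface reflectance. The sketch below illustrates why a watermark (a higher-transmittance region) appears brighter in such an image; the transmittance and reflectance values are illustrative assumptions only:

```python
def double_pass_signal(transmittance, mirror_reflectance=0.95):
    """Fraction of illumination returned to the sensor when a banknote
    lies on the holder's reflective surface: the light crosses the note,
    reflects, and crosses it again, so the signal scales as T^2 * R."""
    return transmittance ** 2 * mirror_reflectance

# A watermark region transmits more light than the surrounding paper,
# so it returns a stronger double-pass signal (illustrative numbers):
paper = double_pass_signal(0.30)
watermark = double_pass_signal(0.45)
print(watermark > paper)  # True: watermark appears brighter
```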
Section III: Methods for Validation
Still referring to
According to various embodiments, methods 500 through 600 and methods 800 through 1200 may be standalone methods as respectively depicted in
Counterfeiting is a major issue and many POS systems are not configured to validate currency items easily. Current validation methods are slow, expensive, intrusive, and bulky. There is a need for validation at the point of sale that mitigates or solves these problems. The present invention embraces a low-cost, handheld validation device 10 that uses multi-spectral imaging and that can validate currency by comparing images (or image portions) of currency items to those of known authenticity as part of validation, according to various embodiments of the present invention.
Regions of interest on a currency item may be identified and compared to comparison standards for each region of interest. Alternatively, ratios between regions of interest may be used as part of validation. Some advantages of multi-spectral imaging for validation include the creation of a large data set for analysis (i.e., gathering more data than existing systems or a human can) and the identification of features at dimensions beyond what the human eye can verify.
The validation device 10 embraced by the present invention includes a camera system (the imaging subsystem 12) used in combination with multiplexed light sources 16 to sequentially capture the reflected and luminescent images of value documents (i.e., banknotes, identification labels, etc.). After the multispectral digital images are captured (i.e., after step 440), one or more regions of interest are identified in the digital images in steps 450 and/or 460. Gray levels (reflectance/luminescence intensity levels) in the regions of interest are then compared to control regions in the image or to one or more other regions of interest in the image. Next, signal ratios (i.e., values) for one or more regions of interest are computed. The signal ratios for one or more regions of interest are then compared to “gold standard” (i.e., comparison standard 52) signal ratios, stored in a database or lookup table. Validation (step 460) includes determining whether the compared signal ratios for the one or more regions of interest meet a predetermined acceptable value. The validation results are then provided by the validation device 10 (or by a system connected to the validation device 10) to a user, an actuator, a system, and/or a recording device as feedback in step 470.
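The signal-ratio comparison described above may be sketched as follows. The stored "gold standard" ratios, the tolerance, and the gray-level data are hypothetical illustrations, not values from the invention:

```python
# Hypothetical sketch of ratio-based validation: compute ROI/control
# gray-level ratios and compare them to stored comparison standards.

COMPARISON_STANDARDS = {          # "gold standard" ratios per ROI pair
    ("roi_1", "control"): 2.4,
    ("roi_2", "control"): 0.8,
}

def signal_ratio(gray_levels, roi, control):
    """Ratio of mean gray level in a region of interest to that of a
    control region within the same image."""
    def mean(key):
        return sum(gray_levels[key]) / len(gray_levels[key])
    return mean(roi) / mean(control)

def validate(gray_levels, standards=COMPARISON_STANDARDS, tol=0.15):
    """Valid only if every ROI ratio is within tolerance of its
    stored comparison standard."""
    for (roi, control), expected in standards.items():
        ratio = signal_ratio(gray_levels, roi, control)
        if abs(ratio - expected) > tol * expected:
            return False
    return True

# Gray levels sampled from a hypothetical genuine banknote image:
genuine = {"roi_1": [120, 124], "roi_2": [40, 42], "control": [50, 52]}
print(validate(genuine))  # True: ratios match the stored standards
```

The boolean result corresponds to the validation determination of step 460, which is then reported as feedback in step 470.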
The validation device 10 embraced by the present invention may also scan other items (e.g., barcodes, serial numbers, etc.). The validation device 10 may include various components (e.g., color filters, cameras, etc.). The stored comparison standards 52 may be updated in various ways (e.g., electronically, via web-link, etc.). In some possible embodiments, the validation device 10 is not handheld, but rather integrated with a supermarket slot scanner, fixedly mounted on a countertop, or configured as an overhead document imager.
The present invention also embraces the multispectral imaging of currency items. Multiple light sources 16 and/or filters 24 may be used to provide illumination having various spectral profiles. For each illumination, the imaging subsystem 12 may be controlled (i.e., exposure control) to capture digital images. The present invention embraces different methods for controlling multiple illumination devices (i.e., strings of LEDs, LED arrays, etc.), each having a different spectral profile.
Referring now to
The control method may also be used to control the illumination for other applications. For example, barcodes of poor quality may be imaged using multiple spectral profiles to improve scanning. In another example, documents, currencies, or products may be verified. In still another example, imaging using multiple spectral profiles could be used for crime scene evidence collection (e.g., UV, IR images).
Referring again to
As noted previously, the validation device 10 may be capable of operating in either an indicia-reading mode or a CVAL mode. Referring now to
In a possible implementation, activating a trigger 34 (e.g., by pushing a button or touching a specific area on the validation device 10 (i.e., handheld imager)) initiates the validation device 10 to capture images and search for a barcode within the captured images (i.e., the processor activates the validation device in step 410). If there is a one- or two-dimensional barcode in the captured images, the validation device 10 will scan the barcode. If there is no barcode present in the captured images, the validation device 10 performs steps 440 through 470 (of
Referring now to
One possible method for providing positioning feedback embraced by the present invention is as follows. First, in step 610, information corresponding to a currency item's position and orientation is derived. Next, in step 620, the positioning feedback is generated and communicated to indicators that inform a user how to reposition the currency item. This process iterates until the currency item is in the proper position/orientation.
The positioning feedback/indicators may be embodied in a variety of ways. For example, an indication of good/bad positioning may be conveyed via dedicated colors and/or lights. Indicator lights may also specify the direction the currency item should be moved (e.g., left/right, up, down, closer/further, and/or various forms of rotation). Positioning feedback/indicators may inform a user of an obstructed view. Positioning feedback/indicators may be audible or tactile. Audio indicators may be any combination of sounds, tones, “grunts”, or spoken words. The positioning feedback/indicators may include visual text/images projected into the validation device field of view. This feedback may visually indicate where a currency item should be located (at least initially). The various types of feedback/indicators may be combined.
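The iterative feedback loop above can be sketched as a simple directional hint generator. The coordinate convention, tolerance, and function names below are illustrative assumptions, not part of the disclosed device.

```python
# Hypothetical sketch of positioning feedback (step 620). Assumes image
# x grows rightward and y grows downward, so an item appearing right of
# center should be moved left by the user. Tolerance is an assumption.

def positioning_feedback(item_center, fov_center, tolerance=10):
    """Derive directional hints telling a user how to move a currency item
    so its center aligns with the center of the field of view."""
    dx = item_center[0] - fov_center[0]
    dy = item_center[1] - fov_center[1]
    hints = []
    if dx > tolerance:
        hints.append("move left")
    elif dx < -tolerance:
        hints.append("move right")
    if dy > tolerance:
        hints.append("move up")
    elif dy < -tolerance:
        hints.append("move down")
    return hints or ["position ok"]
```

In practice the hints would drive indicator lights, audio prompts, or projected text as described above, repeating until "position ok" is reached.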
Referring now to
In various embodiments, the validation device 10 provides a unique visual indication only to the operator (i.e., cashier, user, etc.) that a counterfeit has been detected, without alerting the customer or bystanders in any way. The customer is asked to provide valid photo identification (e.g., a driver's license), and the photo identification (i.e., photo-ID) will be scanned and recorded. The image of the counterfeit item, a transaction record, and the photo-ID image will be stored and made available to law enforcement agencies or product manufacturers.
In various embodiments, the CVAL device 10 includes a multi-color illuminator, which could be used as an indication of authenticity (e.g., GREEN=valid, RED=counterfeit). The illumination of an item using a specific color (or color combination) indicates authenticity. When a counterfeit is detected, an image of the counterfeit item is automatically stored with the transaction. The operator may repeat the verification process or use a secondary validation method (e.g., use a chemical pen to enhance the validation process).
In another possible embodiment, a customer is notified that the item is not authentic and his/her photo ID is requested to be imaged/recorded. If the customer is compliant, then he/she does not incur a loss. If the customer refuses to provide ID, a security camera 300 in communication with the validation device 10 may be used to surreptitiously capture an image of the customer. The information gathered may be stored (step 720) for future investigations or could be used with other stored information to facilitate “global” tracing of the counterfeit money or merchandise.
The main priority for any validation method is accuracy. There is, however, a substantial cost for highly accurate performance. The present invention embraces introducing a controlled chemical/substance to currency items, documents, or merchandise to improve the accuracy of the multi-spectrum validation device (i.e., before performing method 400 of
In various embodiments, the unique chemical/substance is applied to the currency, document, or merchandise immediately before authentication. In another possible embodiment, the unique chemical/substance may be applied to a printed document, label, or packaging (e.g., a chemical in the ink used to print the document) during fabrication. In operation, the multispectral images may be analyzed to detect a unique feature or characteristic corresponding to the chemical/substance. When illuminated with a particular spectral profile, the chemical/substance may reveal an imprinted pattern, text, or number.
Referring now to
In various embodiments of a method for determining fitness as depicted in
In various embodiments, a currency item is illuminated and imaged using the multi-spectrum validation device 10. The captured images may be analyzed (e.g., reflectance measured) in processing step 850 to determine the soiling and print quality of the currency banknote. If unacceptable, fitness feedback may be provided in step 870 to a user instructing him/her to remove the currency item from circulation.
In various embodiments of method 800 for determining the fitness of a banknote, a currency item is illuminated and imaged using the multi-spectrum validation device 10 (steps 820 through 840). The captured digital images are analyzed (e.g., reflectance measured) to determine the shape and to detect tears, holes, tape, and/or missing portions (step 850). If unacceptable, fitness feedback may be provided to a user instructing him/her to remove the currency item from circulation (step 870).
Section IV: Imaging Improvement Methods for Validation
Validation using multispectral illumination/imaging requires control of the illumination and image acquisition parameters in order to optimize image quality, optimize repeatability (e.g., two different devices operate similarly), and allow for similar processing of images captured under different conditions. The present invention embraces calibration methods to “normalize” images to one another and between devices so they can be processed similarly.
Referring now to
Calibration may be achieved using a variety of techniques (e.g., automatic gain control, calibrations, etc.) applied either individually or in combination to normalize/equalize the captured digital images. In various embodiments, automatic or preset exposures are assigned for each spectral profile illumination. In various embodiments, automatic or preset illumination intensities may be assigned for each spectral profile illumination. In various embodiments, automatic or preset image gains are assigned for each spectral profile illumination. In these cases, the assignment of the parameters (i.e., calibration) may occur when the device is fabricated or installed. Alternatively, the calibration may occur as a periodic (e.g., scheduled) adjustment or when the application/environment is changed.
In various embodiments, the calibration includes using a multi-spectral validation device 10 to capture multiple spectral profile images of a calibration target that has particular (i.e., known) color/gray-scale values. The captured images are then analyzed and compared to the known values of the calibration target. Next, the illumination/imaging parameters are adjusted (e.g., illumination strength, exposure time, image gain) for each of the spectral profiles. The adjustment may be automatic or manual, and the parameters are adjusted until normalization is achieved. The final parameter settings are then saved on the validation device and used by the validation device for subsequent illumination/imaging.
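The iterative adjustment described above can be sketched as a simple feedback loop for one spectral profile. The `measure_target` callback, tolerance, and proportionality assumption (captured gray level roughly proportional to exposure time) are illustrative assumptions.

```python
# Minimal sketch of the calibration loop for one spectral profile.
# measure_target(exposure_ms) is a hypothetical callback that images the
# known calibration target and returns its mean gray level.

def calibrate_exposure(measure_target, known_value, exposure_ms,
                       tol=1.0, max_iters=20):
    """Adjust exposure time until the imaged calibration target matches
    its known gray-scale value; return the final setting to be saved."""
    for _ in range(max_iters):
        measured = measure_target(exposure_ms)
        if abs(known_value - measured) <= tol:
            break
        # Assume the captured level scales roughly with exposure time.
        exposure_ms *= known_value / max(measured, 1e-6)
    return exposure_ms
```

The same loop could equally drive illumination strength or image gain; the saved result corresponds to the final parameter settings stored on the device.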
Referring now to
Validation using the multi-spectral illumination/imaging relies on subtle differences in image levels. As a result, variances in illumination/imaging affect the intended results. Such variances may result from variations associated with the light sources or the image sensor (e.g., thermal variations, operating-characteristic variations, exposure-time variations, etc.). The present invention embraces real-time control of the illumination/imaging settings (i.e., parameters) to counteract these variances and maintain control of the image levels.
There are multiple embodiments for the real-time control, all using feedback from the validation device's light sources 16 to monitor the rate at which illuminance is being delivered (i.e., intensity). As exposure is proportional to illuminance multiplied by exposure time, exposure times can be adjusted to achieve the desired total exposure ratio between all spectral profiles.
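Because exposure is the product of illuminance and exposure time, the adjustment can be sketched directly. The function names and units below are illustrative assumptions.

```python
# Sketch of the exposure-time adjustment described above, assuming
# exposure = illuminance x time (units are illustrative).

def adjusted_exposure_time(target_exposure, measured_illuminance):
    """Given the monitored illuminance of a light source, compute the
    exposure time that yields the desired total exposure."""
    return target_exposure / measured_illuminance

def exposure_times_for_ratio(target_exposures, illuminances):
    """Per-spectral-profile exposure times that hold the desired total
    exposure ratio between all spectral profiles."""
    return {profile: adjusted_exposure_time(target_exposures[profile],
                                            illuminances[profile])
            for profile in target_exposures}
```

If one light source dims (e.g., thermally), its measured illuminance drops and its exposure time lengthens proportionally, preserving the target ratio.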
In all of these embodiments, the validation device 10 is factory calibrated to determine the optimal exposure ratio. This optimal ratio, known as the target ratio, is recorded in the validation device's firmware. During operation, an initial exposure level is determined by taking an initial exposure (likely using a combination of LEDs, or perhaps IR only) to get a sense of the imaged item. The initial exposure produces a first image, which is used to estimate the appropriate overall exposure level required.
In various embodiments, a small reference target is attached to the validation device (e.g., positioned in the periphery of the field of view, occupying the perimeter of the field of view, etc.). This reference target may include a white portion and/or multi-colored portions. The target reflects a small amount of light from the light sources (during the initial exposure) back to the image sensor. The signal captured on the image sensor 28 corresponding to the reference target is used to adjust the exposure times of the light sources. In some cases, the adjustment also uses the light source's response to temperature. For example, after the initial exposure time ratios are set, a string of exposures would generate captured images, one for each spectral profile. The signal captured on the image sensor 28 corresponding to the reference target for these exposures is then used to further adjust the exposure ratio (e.g., closer to the desired exposure ratio).
In various embodiments, a light source sensing subsystem 70 may be used to monitor exposure levels. The light source sensing subsystem 70 may include a photodiode 72 that is positioned to pick off a small amount of light every time a light source 16 is activated. The signal from this photodiode 72 provides reference information used for adjusting the intensity of the light sources 16. Alternatively, the signal from the photodiode 72 may be used as a trigger for the sensor shutter. In this case, a trigger signal would simultaneously turn on a light source 16 and open the image sensor's 28 shutter. An integrating circuit may be used to integrate the signal from the photodiode 72 until a desired level is reached. When the desired level is reached, the image sensor's 28 shutter would be closed. The real-time integration of the sensor signal ensures the proper exposure ratios. This process may be repeated for each spectral profile.
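The integrate-until-threshold behavior can be simulated in software. The sampling interval, threshold, and function names below are illustrative assumptions; the disclosed embodiment uses an analog integrating circuit.

```python
# Simulation sketch of integrate-and-close shutter control: the photodiode
# signal is accumulated until a desired exposure level is reached, at which
# point the shutter closes. Sampling interval dt is an assumption.

def shutter_open_duration(photodiode_samples, desired_level, dt=1e-4):
    """Integrate the photodiode signal sample by sample; return the shutter
    open duration in seconds, or None if the level is never reached."""
    accumulated = 0.0
    for i, sample in enumerate(photodiode_samples, start=1):
        accumulated += sample * dt
        if accumulated >= desired_level:
            return i * dt  # shutter closes here
    return None
```

A brighter light source reaches the threshold sooner, so the effective exposure is held constant regardless of intensity drift, which is the point of the real-time integration.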
In various embodiments, a light source sensing subsystem 70 is used to generate real-time feedback and integration. This embodiment is similar to the previous one, except that rather than using a photodiode 72, a portion of the image sensor 28 serves as the sensing subsystem, and the feedback signal is generated by the image sensor's response to light reflected from a reference target.
In various embodiments, the light sources (e.g., light emitting diode, LED) 16 are controlled to prevent temperature drifts in intensity. Temperature affects LED efficiency. When an LED is turned on, its efficiency can change until a thermal equilibrium is reached. To prevent thermal drift, the LEDs are activated and allowed to reach thermal equilibrium. The activation time (i.e., burn-in time) may be set using an ambient temperature sensor, or set during fabrication as a result of testing. After the burn-in time, the image sensor's shutter would be opened to capture an image. Calibration of the collected image may then be based on the sensed temperature or cataloged data.
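Selecting a burn-in time from sensed ambient temperature could be sketched as a simple lookup. The table values below are invented placeholders for illustration, not measured LED data.

```python
# Illustrative lookup of LED burn-in time from ambient temperature.
# Entries are (minimum temperature in C, burn-in time in ms); the values
# are hypothetical, standing in for data set during fabrication testing.

BURN_IN_MS = [(0, 40), (20, 25), (40, 10)]

def burn_in_time_ms(ambient_temp_c):
    """Cooler ambient temperatures need a longer burn-in before the LED
    reaches thermal equilibrium and the shutter can be opened."""
    for threshold, ms in reversed(BURN_IN_MS):
        if ambient_temp_c >= threshold:
            return ms
    return BURN_IN_MS[0][1]  # below table range: use the longest burn-in
```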
In various embodiments, each light source (e.g., LED) is assigned a particular initial exposure. The initial exposure is followed by a second exposure, which is fine-tuned for processing. This method could be used with or without the initial multi-color single exposure. If the single exposure is used, it serves as a basis for determining the initial exposures for the spectral profiles. If the single exposure is not used, then the initial spectral profile exposures could be used to estimate the desired exposure of the next initial color exposure. Results from each initial color exposure are processed to determine exposure refinement for the second exposures. The reference target described above could be used for feedback, or a predetermined quiet region, identified in an image during the initial color exposures, could be used to refine the secondary exposures to optimize the exposure ratios.
When using multi-spectral illumination/imaging for currency validation, a fold in a currency item may cause shadows in each spectral profile illumination. As a result, the shadow pattern in the images captured from the different illuminations may be used to reduce noise and non-uniformities. This approach can also suppress some features, like serial numbers and magnetic strips.
Referring now to
In various embodiments, an illumination having a spectral profile in the near infrared (NIR) band would be used to minimize illumination artifacts (e.g., shadows) by normalizing validation images to an NIR image. Colored inks often have no contrast in NIR (i.e., are invisible in an NIR image). As the colored ink forms the information measured during validation, that information remains relatively intact after normalizing the validation images to the NIR image. For currency items that use a significant amount of black ink (i.e., ink that is visible in the NIR image), the black ink provides no significant benefit to spectral analysis, as it looks black at all colors. As a result, the NIR normalization process is not significantly affected by the information formed by the black ink.
In various embodiments, other spectral profiles may be used for normalization. For example, a plurality of images may be captured for each spectral profile in rapid succession (i.e., to minimize any motion that might occur between frames). The images are then processed to identify the banknote, spatially bin the reflectivity data, and adjust for exposure differences between images. Next, the average NIR return over the banknote in the images is calculated. Then each image is normalized by the ratio of the NIR data set to its average NIR value. At this point, further normalization algorithms may be applied before validating the banknote.
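The NIR normalization pass can be sketched in pure Python. This is an illustrative sketch assuming each image is a list of rows of gray levels already cropped and registered to the banknote; the function names are assumptions.

```python
# Sketch of NIR normalization: dividing a validation image pixel-wise by
# the mean-scaled NIR image cancels illumination artifacts (e.g., fold
# shadows) shared between the two captures, while colored-ink contrast,
# absent in NIR, is largely preserved. Images are lists of rows (floats).

def mean_level(image):
    """Average gray level over the whole (banknote-cropped) image."""
    return sum(sum(row) for row in image) / sum(len(row) for row in image)

def normalize_to_nir(image, nir_image):
    """Normalize a validation image by the NIR image's ratio to its mean."""
    nir_mean = mean_level(nir_image)
    return [[px / (nir_px / nir_mean) if nir_px else px
             for px, nir_px in zip(row, nir_row)]
            for row, nir_row in zip(image, nir_image)]
```

In the example test, a shadow pattern present in both the color and NIR captures is flattened to a uniform level, which is the intended artifact suppression.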
While validation of banknotes has been described, other currency items such as coins may be validated by the validation device by the same or similar methods according to various embodiments.
To supplement the present disclosure, this application incorporates entirely by reference the following commonly assigned patents, patent application publications, and patent applications:
In the specification and/or figures, typical embodiments of the invention have been disclosed. The present invention is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.
This application is a continuation of and claiming the benefit of priority to U.S. application Ser. No. 16/414,477 entitled “DEVICES, SYSTEMS, AND METHODS FOR OPTICAL VALIDATION” filed on May 16, 2019, which is a non-provisional application claiming the benefit of priority to U.S. application Ser. No. 15/388,082 entitled “DEVICES, SYSTEMS, AND METHODS FOR OPTICAL VALIDATION” filed on Dec. 22, 2016, which is a non-provisional application claiming the benefit of priority to U.S. Provisional Application Ser. No. 62/273,493, filed on Dec. 31, 2015, the contents of each of which are incorporated by reference herein in their entirety.
20140191644 | Chaney | Jul 2014 | A1 |
20140191913 | Ge et al. | Jul 2014 | A1 |
20140197238 | Liu et al. | Jul 2014 | A1 |
20140197239 | Havens et al. | Jul 2014 | A1 |
20140197304 | Feng et al. | Jul 2014 | A1 |
20140203087 | Smith et al. | Jul 2014 | A1 |
20140204268 | Grunow et al. | Jul 2014 | A1 |
20140214631 | Hansen | Jul 2014 | A1 |
20140217166 | Berthiaume et al. | Aug 2014 | A1 |
20140217180 | Liu | Aug 2014 | A1 |
20140231500 | Ehrhart et al. | Aug 2014 | A1 |
20140232930 | Anderson | Aug 2014 | A1 |
20140247315 | Marty et al. | Sep 2014 | A1 |
20140263493 | Amurgis et al. | Sep 2014 | A1 |
20140263645 | Smith et al. | Sep 2014 | A1 |
20140270196 | Braho et al. | Sep 2014 | A1 |
20140270229 | Braho | Sep 2014 | A1 |
20140278387 | DiGregorio | Sep 2014 | A1 |
20140282210 | Bianconi | Sep 2014 | A1 |
20140284384 | Lu et al. | Sep 2014 | A1 |
20140288933 | Braho et al. | Sep 2014 | A1 |
20140297058 | Barker et al. | Oct 2014 | A1 |
20140299665 | Barber et al. | Oct 2014 | A1 |
20140312121 | Lu et al. | Oct 2014 | A1 |
20140319220 | Coyle | Oct 2014 | A1 |
20140319221 | Oberpriller et al. | Oct 2014 | A1 |
20140326787 | Barten | Nov 2014 | A1 |
20140332590 | Wang et al. | Nov 2014 | A1 |
20140344943 | Todeschini et al. | Nov 2014 | A1 |
20140346233 | Liu et al. | Nov 2014 | A1 |
20140351317 | Smith et al. | Nov 2014 | A1 |
20140353373 | Van et al. | Dec 2014 | A1 |
20140361073 | Qu et al. | Dec 2014 | A1 |
20140361082 | Xian et al. | Dec 2014 | A1 |
20140362184 | Jovanovski et al. | Dec 2014 | A1 |
20140363015 | Braho | Dec 2014 | A1 |
20140369511 | Sheerin et al. | Dec 2014 | A1 |
20140374483 | Lu | Dec 2014 | A1 |
20140374485 | Xian et al. | Dec 2014 | A1 |
20150001301 | Ouyang | Jan 2015 | A1 |
20150001304 | Todeschini | Jan 2015 | A1 |
20150003673 | Fletcher | Jan 2015 | A1 |
20150009338 | Laffargue et al. | Jan 2015 | A1 |
20150009610 | London et al. | Jan 2015 | A1 |
20150014416 | Kotlarsky et al. | Jan 2015 | A1 |
20150021397 | Rueblinger et al. | Jan 2015 | A1 |
20150028102 | Ren et al. | Jan 2015 | A1 |
20150028103 | Jiang | Jan 2015 | A1 |
20150028104 | Ma et al. | Jan 2015 | A1 |
20150029002 | Yeakley et al. | Jan 2015 | A1 |
20150032709 | Maloy et al. | Jan 2015 | A1 |
20150039309 | Braho et al. | Feb 2015 | A1 |
20150040378 | Saber et al. | Feb 2015 | A1 |
20150048168 | Fritz et al. | Feb 2015 | A1 |
20150049347 | Laffargue et al. | Feb 2015 | A1 |
20150051992 | Smith | Feb 2015 | A1 |
20150053766 | Havens et al. | Feb 2015 | A1 |
20150053768 | Wang et al. | Feb 2015 | A1 |
20150053769 | Thuries et al. | Feb 2015 | A1 |
20150062366 | Liu et al. | Mar 2015 | A1 |
20150063215 | Wang | Mar 2015 | A1 |
20150063676 | Lloyd et al. | Mar 2015 | A1 |
20150069130 | Gannon | Mar 2015 | A1 |
20150071819 | Todeschini | Mar 2015 | A1 |
20150083800 | Li et al. | Mar 2015 | A1 |
20150086114 | Todeschini | Mar 2015 | A1 |
20150088522 | Hendrickson et al. | Mar 2015 | A1 |
20150096872 | Woodburn | Apr 2015 | A1 |
20150099557 | Pettinelli et al. | Apr 2015 | A1 |
20150100196 | Hollifield | Apr 2015 | A1 |
20150102109 | Huck | Apr 2015 | A1 |
20150109643 | Auger | Apr 2015 | A1 |
20150115035 | Meier et al. | Apr 2015 | A1 |
20150127791 | Kosecki et al. | May 2015 | A1 |
20150128116 | Chen et al. | May 2015 | A1 |
20150129659 | Feng et al. | May 2015 | A1 |
20150133047 | Smith et al. | May 2015 | A1 |
20150134470 | Hejl et al. | May 2015 | A1 |
20150136851 | Harding et al. | May 2015 | A1 |
20150136854 | Lu et al. | May 2015 | A1 |
20150142492 | Kumar | May 2015 | A1 |
20150144692 | Hejl | May 2015 | A1 |
20150144698 | Teng et al. | May 2015 | A1 |
20150144701 | Xian et al. | May 2015 | A1 |
20150149946 | Benos et al. | May 2015 | A1 |
20150161429 | Xian | Jun 2015 | A1 |
20150169925 | Chen et al. | Jun 2015 | A1 |
20150169929 | Williams et al. | Jun 2015 | A1 |
20150186703 | Chen et al. | Jul 2015 | A1 |
20150193644 | Kearney et al. | Jul 2015 | A1 |
20150193645 | Colavito et al. | Jul 2015 | A1 |
20150199957 | Funyak et al. | Jul 2015 | A1 |
20150204671 | Showering | Jul 2015 | A1 |
20150210199 | Payne | Jul 2015 | A1 |
20150220753 | Zhu et al. | Aug 2015 | A1 |
20150254485 | Feng et al. | Sep 2015 | A1 |
20150327012 | Bian et al. | Nov 2015 | A1 |
20150348350 | Collins et al. | Dec 2015 | A1 |
20160014251 | Hejl | Jan 2016 | A1 |
20160040982 | Li et al. | Feb 2016 | A1 |
20160042241 | Todeschini | Feb 2016 | A1 |
20160057230 | Todeschini et al. | Feb 2016 | A1 |
20160109219 | Ackley et al. | Apr 2016 | A1 |
20160109220 | Laffargue et al. | Apr 2016 | A1 |
20160109224 | Thuries et al. | Apr 2016 | A1 |
20160112631 | Ackley et al. | Apr 2016 | A1 |
20160112643 | Laffargue et al. | Apr 2016 | A1 |
20160124516 | Schoon et al. | May 2016 | A1 |
20160125217 | Todeschini | May 2016 | A1 |
20160125342 | Miller et al. | May 2016 | A1 |
20160125873 | Braho et al. | May 2016 | A1 |
20160133253 | Braho et al. | May 2016 | A1 |
20160163142 | Auger | Jun 2016 | A1 |
20160171720 | Todeschini | Jun 2016 | A1 |
20160178479 | Goldsmith | Jun 2016 | A1 |
20160180678 | Ackley et al. | Jun 2016 | A1 |
20160189087 | Morton et al. | Jun 2016 | A1 |
20160227912 | Oberpriller et al. | Aug 2016 | A1 |
20160232891 | Pecorari | Aug 2016 | A1 |
20160292477 | Bidwell | Oct 2016 | A1 |
20160294779 | Yeakley et al. | Oct 2016 | A1 |
20160306769 | Kohtz et al. | Oct 2016 | A1 |
20160307035 | Schilling et al. | Oct 2016 | A1 |
20160314276 | Wilz et al. | Oct 2016 | A1 |
20160314294 | Kubler et al. | Oct 2016 | A1 |
20160377414 | Thuries et al. | Dec 2016 | A1 |
20190272696 | Van Horn et al. | Sep 2019 | A1 |
Number | Date | Country |
---|---|---|
2927840 | Oct 2015 | EP |
10-1117359 | Mar 2012 | KR |
2013163789 | Nov 2013 | WO |
2013173985 | Nov 2013 | WO |
2014019130 | Feb 2014 | WO |
2014110495 | Jul 2014 | WO |
2014207415 | Dec 2014 | WO |
2015021476 | Feb 2015 | WO |
2015082332 | Jun 2015 | WO |
Entry |
---|
Santhanam, Kamesh, et al. “Counterfeit currency detection technique using image processing, polarization principle and holographic technique.” 2013 Fifth International Conference on Computational Intelligence, Modelling and Simulation. IEEE, 2013. (Year: 2013). |
Thirumalai R, Mukhopadhyay RD, Praveen VK, Ajayaghosh A. A slippery molecular assembly allows water as a self-erasable security marker. Scientific Reports. May 5, 2015;5(1):1-1. (Year: 2015). |
Canmax CM-2D202 2D Handheld Barcode Scanner, downloaded from http://www.canmax.com.tw/product/view/CM-2D202 on Oct. 25, 2018, Copyrighted 2010, 2 pages. |
Canmax CM-890K10 Light Weight Android Barcode Reader, downloaded from http://www.canmax.com.tw/product/view/CM-890K10 on Oct. 25, 2018, Copyrighted 2010, 2 pages. |
Cap-XX Inc., “Using Supercapacitors to Solve LED Flash Power Issues for High Resolution Camera Phones”, downloaded from www.cap-xx.com website Mar. 21, 2017, 3 pages. |
Examiner initiated interview summary dated Feb. 19, 2019 for U.S. Appl. No. 15/388,082. |
Examiner initiated interview summary dated Jul. 30, 2018 for U.S. Appl. No. 15/388,082. |
Examiner Interview Summary Record dated Jul. 23, 2021 for U.S. Appl. No. 16/414,477. |
Final Rejection dated May 3, 2021 for U.S. Appl. No. 16/414,477. |
Jayan Thomas, University of Central Florida, "Charged Up"; published in Pegasus, the Magazine of the University of Central Florida, Spring 2015, [Downloaded from https://www.ucf.edu/pegasus/charged-up/ on Mar. 21, 2017], 12 pages. |
Kanwal, Navjot Kaur, Divya Jat, and Manish Malhotra. “Spectral analysis of various security features in the Indian currency note of highest denomination using Video Spectral Comparator-40.” International Journal of Innovative Science Engineering and Technology 2.11 (2015): 823-842. (Year: 2015). |
Nanomatrix, Calibrated Taggants Detection Systems, downloaded from https://www.nanomatrixsecure.com/en/security-products/inspection-systems/security-taggant-detection, on Oct. 25, 2018, 11 pages. |
Non-Final Rejection dated Jul. 30, 2018 for U.S. Appl. No. 15/388,082. |
Non-Final Rejection dated Oct. 30, 2020 for U.S. Appl. No. 16/414,477. |
Notice of Allowance and Fees Due dated Feb. 19, 2019 for U.S. Appl. No. 15/388,082. |
Notice of Allowance received for U.S. Appl. No. 16/414,477, dated Jul. 23, 2021, 11 pages. |
Notice of Allowance received for U.S. Appl. No. 16/414,477, dated Nov. 18, 2021, 11 pages. |
Search Report in related European Application No. 16207454.6 dated May 30, 2017, pp. 1-8. |
U.S. Appl. No. 13/367,978, filed Feb. 7, 2012, (Feng et al.); now abandoned. |
U.S. Patent Application for Terminal Having Illumination and Focus Control filed May 21, 2014 (Liu et al.); 31 pages; now abandoned, U.S. Appl. No. 14/283,282. |
U.S. Appl. No. 14/277,337 for Multipurpose Optical Reader, filed May 14, 2014 (Jovanovski et al.); 59 pages; now abandoned. |
U.S. Appl. No. 14/446,391 for Multifunction Point of Sale Apparatus With Optical Signature Capture filed Jul. 30, 2014 (Good et al.); 37 pages; now abandoned. |
U.S. Appl. No. 61/807,825 for a Wearable Barcode Scanner filed Apr. 3, 2013 (Wang). |
U.S. Appl. No. 62/043,728 for Gesture-Controlled Computer System filed Aug. 29, 2014 (Bouchat et al.). |
U.S. Appl. No. 62/056,327 for System and Method for Workflow Management filed Sep. 26, 2014 (Geisler et al.). |
U.S. Appl. No. 62/062,175 for System and Methods for Dimensioning filed Oct. 10, 2014 (McCloskey et al.). |
U.S. Appl. No. 62/083,566 for Gesture-Controlled Computer System filed Nov. 24, 2014 (Bouchat et al.). |
U.S. Appl. No. 62/092,141 for Information Augmented Product Guide filed Dec. 15, 2014 (Todeschini et al.). |
U.S. Appl. No. 62/092,147 for Augmented Reality Virtual Product for Display filed Dec. 15, 2014 (Todeschini). |
U.S. Appl. No. 62/092,156 for Augmented Reality Asset Locator filed Dec. 15, 2014 (Todeschini et al.). |
U.S. Appl. No. 62/093,448 for Location Based Forklift Collision Warning, Prediction and Avoidance filed Dec. 18, 2014 (Bernhardt et al.). |
U.S. Appl. No. 62/093,501 for Active Exit Sign filed Dec. 18, 2014 (McMahan et al.). |
U.S. Appl. No. 62/093,535 for Flip Open Wearable Computer filed Dec. 18, 2014 (Harr). |
U.S. Appl. No. 62/093,806 for Method of Identifying a Bad Battery in an Electronic Device filed Dec. 18, 2014 (Young et al.). |
U.S. Appl. No. 62/093,859 for Method to Identify Bad Touch Panel With Intermittent Field Failures filed Dec. 18, 2014 (Young et al.). |
U.S. Appl. No. 62/094,344 for Host Controllable Pop-Up Soft Keypads filed Dec. 19, 2014 (Roeder). |
U.S. Appl. No. 62/094,442 for Intelligent Small Screen Layout and Pop-Up Keypads for Screen-Only Devices filed Dec. 19, 2014 (Roeder). |
U.S. Appl. No. 62/095,089 for Conformable Hand Mount for a Mobile Scanner filed Dec. 22, 2014 (Oberpriller et al.). |
U.S. Appl. No. 62/095,453 for Augmented Display and User Input System filed Dec. 22, 2014 (Todeschini). |
U.S. Appl. No. 62/095,470 for Delayed Trim of Managed Nand Flash Memory in Computing Devices filed Dec. 2014 (Redondo et al.). |
U.S. Appl. No. 62/095,808 for Method of Barcode Templating for Enhanced Decoding Performance filed Dec. 23, 2014 (Meier et al.). |
U.S. Appl. No. 62/095,822 for Tablet Computer With Interface Channels filed Dec. 23, 2014 (Bidwell et al.). |
U.S. Appl. No. 62/096,910 for Scanning Improvements for Saturated Signals Using Automatic and Fixed Gain Control Methods filed Dec. 26, 2014 (Hejl et al.). |
U.S. Appl. No. 62/096,982 for Product and Location Management via Voice Recognition filed Dec. 26, 2014 (Pecorari et al.). |
U.S. Appl. No. 62/097,054 for Power Configurable Headband filed Dec. 27, 2014 (DiPiazza et al.). |
U.S. Appl. No. 62/097,056 for Acceleration-Based Motion Tolerance and Predictive Decoding filed Dec. 27, 2014 (Todeschini et al.). |
U.S. Appl. No. 62/097,091 for Remote Monitoring of Vehicle Diagnostic Information filed Dec. 28, 2014 (Carrasco). |
U.S. Appl. No. 62/097,097 for Dynamic Check Digit Utilization via Electronic Tag filed Dec. 28, 2014 (Pecorari et al.). |
U.S. Appl. No. 62/097,356 for Symbol Based Location Identification filed Dec. 29, 2014 (Pecorari et al.). |
U.S. Appl. No. 62/097,367 for Interleaving Surprise Activities in Workflow, filed Dec. 29, 2014 (Murawski et al.). |
U.S. Appl. No. 62/097,411 for Confirming Product Location Using a Subset of a Product Identifier filed Dec. 29, 2014 (Mellott et al.). |
U.S. Appl. No. 62/097,480 for Distributed Headset With Electronics Module filed Dec. 29, 2014 (DePiazza et al.). |
U.S. Appl. No. 62/097,632 for Method of Simulating a Virtual Out-of-Box Experience of a Packaged Product filed Dec. 30, 2014 (Todeschini et al.). |
U.S. Appl. No. 62/098,012 for Method and System for Improving Barcode Scanner Performance filed Dec. 30, 2014 (Au et al.). |
U.S. Appl. No. 62/098,072 for Real-Time Adjustable Window Feature for Barcode Scanning and Process of Scanning Barcode With Adjustable Window Feature filed Dec. 29, 2014 (Todeschini et al.). |
U.S. Appl. No. 62/098,110 for Point-of-Sale (POS) Code Sensing Apparatus filed Dec. 30, 2014 (Good et al.). |
U.S. Appl. No. 62/098,150 for Augmented Reality Vision Barcode Scanning System and Method filed Dec. 30, 2014 (Franz). |
U.S. Appl. No. 62/098,201 for Visual Feedback for Code Readers filed Dec. 30, 2014 (Sailors et al.). |
U.S. Appl. No. 62/098,458 for Method of User Authentication via Virtual Object Manipulation filed Dec. 31, 2014 (Todeschini). |
U.S. Appl. No. 62/098,540 for Speed-Limit-Compliance System and Method filed Dec. 31, 2014 (Chamberlin). |
U.S. Appl. No. 62/098,643 for Industrial Vehicle Positioning System and Method filed Dec. 31, 2014 (Chamberlin et al.). |
U.S. Appl. No. 62/098,676 for Reclosable Strap Assembly filed Dec. 31, 2014 (Oberpriller et al.). |
U.S. Appl. No. 62/098,708 for System and Method for Monitoring an Industrial Vehicle filed Dec. 31, 2014 (Smith). |
U.S. Appl. No. 62/101,156 for Multiple Primary Use Interfaces filed Jan. 8, 2015 (Pike et al.). |
U.S. Appl. No. 62/101,170 for Stack Handling Using Multiple Primary User Interfaces filed Jan. 8, 2015 (Pike et al.). |
U.S. Appl. No. 62/101,178 for Portable Dialogue Engine filed Jan. 8, 2015 (Pike et al.). |
U.S. Appl. No. 62/101,203 for Application Development Using Multiple Primary User Interfaces filed Jan. 8, 2015 (Zabel et al.). |
U.S. Appl. No. 62/101,216 for Voice Mode Asset Retrieval filed Jan. 8, 2015 (Zabel et al.). |
U.S. Appl. No. 62/101,221 for Facilitating Workflow Application Development filed Jan. 8, 2015 (Doubleday et al.). |
U.S. Appl. No. 62/101,227 for Charger With Storage Element filed Jan. 8, 2015 (Miraglia et al.). |
U.S. Appl. No. 62/101,235 for Charge Limit Selection for Variable Power Supply Configuration filed Jan. 8, 2015 (Haggerty et al.). |
U.S. Appl. No. 62/101,242 for Power Source Pack Detection filed Jan. 8, 2015 (Allen et al.). |
U.S. Appl. No. 62/101,564 for Visual Graphic Aided Location Identification filed Jan. 9, 2015 (Pecorari et al.). |
U.S. Appl. No. 62/101,568 for Tag Mounted Electronics Module for Distributed Headset filed Jan. 9, 2015 (Di Piazza et al.). |
U.S. Appl. No. 62/101,673 for Restocking Workflow Prioritization filed Jan. 9, 2015 (Mellott et al.). |
U.S. Appl. No. 62/150,352 for Systems and Methods for Imaging filed Apr. 21, 2015 (McCloskey et al.). |
U.S. Appl. No. 62/174,875 for System for Controlling Lighting in an Augmented Reality Environment filed Jun. 12, 2015 (Todeschini). |
U.S. Appl. No. 62/181,233 for Customizable Headset filed Jun. 18, 2015 (Vargo et al.). |
U.S. Appl. No. 62/183,385 for Gesture-Controlled Computer System filed Jun. 23, 2015 (Bouchat et al.). |
EP Office Action dated Feb. 22, 2023 for EP Application No. 16207454. |
Number | Date | Country | |
---|---|---|---|
20220051504 A1 | Feb 2022 | US |
Number | Date | Country | |
---|---|---|---|
62273493 | Dec 2015 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16414477 | May 2019 | US |
Child | 17452861 | US | |
Parent | 15388082 | Dec 2016 | US |
Child | 16414477 | US |